
Cell mechanics research is making chemotherapy friendlier

image: PhD student Andrzej Kubiak at the atomic force microscope.

Image: IFJ PAN

Malignant tumour cells undergo mechanical deformation more easily than normal cells, allowing them to migrate throughout the body. The mechanical properties of prostate cancer cells treated with the most commonly used anti-cancer drugs have been investigated at the Institute of Nuclear Physics of the Polish Academy of Sciences in Cracow. According to the researchers, current drugs can be used more effectively and at lower doses.

In cancer, a key factor contributing to the formation of metastasis is the ability of the neoplastic cells to undergo mechanical deformation. At the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow, research on the mechanical properties of cells has been conducted for a quarter of a century. The latest study, carried out in cooperation with the Department of Medical Biochemistry of the Jagiellonian University Medical College, concerned several drugs currently used in prostate cancer chemotherapy, and specifically their impact on the mechanical properties of cancer cells. The results are optimistic: everything indicates that the doses of some drugs can be reduced without the risk of reducing their effectiveness.

Chemotherapy is an extremely brutal attack not only on the patient's cancer cells but on all the cells in the body. In administering it, doctors hope that the more sensitive tumour cells will die before the healthy ones do. It is therefore crucial to know how to choose the optimal drug in a given case and how to determine its minimum dose: one that guarantees the effectiveness of the treatment while minimizing the adverse effects of the therapy.

As early as 1999, physicists from the IFJ PAN showed that cancer cells deform mechanically more easily. In practice, this fact means that they can squeeze through the narrow vessels of the circulatory and/or lymphatic systems with greater efficiency.

"The mechanical properties of a cell are determined by elements of its cytoskeleton such as the microtubules we examine, built of tubulin (a protein), actin filaments and intermediate filaments made of proteins such as keratin or vimentin," says Prof. Malgorzata Lekka from the Department of Biophysical Microstructures at IFJ PAN, adding: "Biomechanical measurements of cells are carried out using an atomic force microscope. Depending on the needs, we can press the probe more or less onto the cell, and in this way we obtain a mechanical response coming from structures lying either at its surface, i.e. at the cell membrane, or deeper, even at the cell nucleus. However, in order to obtain information about the effects of a drug, we must evaluate what contribution each type of cytoskeleton fibre makes to the mechanical properties of the cell."
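Force curves like the ones Prof. Lekka describes are conventionally interpreted by fitting a contact-mechanics model to the force-indentation data. The sketch below is a minimal, hypothetical illustration of the standard Hertz-model fit for a spherical probe; the tip radius, Poisson ratio and synthetic data are invented for illustration and this is not the authors' actual analysis pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hertz contact model for a spherical AFM probe indenting a soft sample:
#   F = (4/3) * E / (1 - nu^2) * sqrt(R) * delta^(3/2)
# E: apparent Young's modulus (Pa), nu: Poisson ratio,
# R: tip radius (m), delta: indentation depth (m)

NU = 0.5       # cells are commonly treated as incompressible (assumption)
R_TIP = 5e-6   # 5 um spherical probe (assumed value)

def hertz_force(delta, young_modulus):
    return (4.0 / 3.0) * young_modulus / (1.0 - NU**2) * np.sqrt(R_TIP) * delta**1.5

# Synthetic force curve: a "soft" cell of E = 1 kPa plus ~10 pN measurement noise
rng = np.random.default_rng(0)
indentation = np.linspace(0, 1e-6, 50)              # 0 to 1 um indentation
force = hertz_force(indentation, 1000.0)
force += rng.normal(0, 1e-11, force.size)

# Fit the model to recover the apparent Young's modulus from the curve
(E_fit,), _ = curve_fit(hertz_force, indentation, force, p0=[500.0])
print(f"fitted Young's modulus: {E_fit:.0f} Pa")
```

A stiffer cell yields a larger fitted modulus, so drug-induced changes in rigidity of the kind discussed here show up as shifts in this apparent Young's modulus.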

In the currently reported results, the Cracow-based physicists presented experiments using the commercially available DU145 human prostate cancer cell line. This line was chosen for its drug resistance. Under long-term drug exposure, these cells become resistant over time: not only do they not die, they even begin to divide.

"We focused on the effects of three commonly used drugs: vinflunine, colchicine and docetaxel. They all act on the microtubules, which is desirable since these fibres are essential for cell division. Docetaxel stabilizes the microtubules and therefore also increases the rigidity of the tumour cells and makes it difficult for them to migrate throughout the body. The other two drugs destabilize the microtubules, so cancer cells can migrate, but due to the disturbed functions of the cytoskeleton, they are unable to divide," says PhD student Andrzej Kubiak, the first author of the article published in the prestigious journal Nanoscale.

The researchers from Cracow analysed the viability and mechanical properties of cells 24, 48 and 72 hours after drug treatment, and it turned out that the greatest changes were observed three days after drug exposure. This allowed them to determine two concentrations of drugs: one higher, which destroyed cells, and one lower, at which although cells survived, their mechanical properties were found to be altered. For obvious reasons, what happened to the cells in the latter case was of particular interest. The precise interpretation of some of the results required several tools, such as a confocal microscope and flow cytometry. Their use was possible thanks to cooperation with the Institute of Pharmacology of the Polish Academy of Sciences in Cracow, the Department of Cell Biology at the Faculty of Biochemistry, Biophysics and Biotechnology of the Jagiellonian University and the University of Milan (Department of Physics, Universita degli Studi di Milano).
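Finding a lethal and a sub-lethal concentration is the kind of result a dose-response analysis formalizes. As a hedged illustration only (the four-parameter Hill model is standard for viability assays, but every number below is invented and unrelated to the study's data), a curve can be fitted to viability measurements to estimate the half-maximal inhibitory concentration (IC50):

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter Hill (log-logistic) model commonly used for viability assays:
#   viability(c) = bottom + (top - bottom) / (1 + (c / ic50)^hill_slope)
def hill(conc, top, bottom, ic50, hill_slope):
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill_slope)

# Invented example: viability (%) after 72 h at increasing drug concentration (nM)
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300], dtype=float)
viability = np.array([99, 97, 92, 78, 52, 25, 10, 6], dtype=float)

# Fit the curve; p0 is a rough initial guess for (top, bottom, ic50, slope)
params, _ = curve_fit(hill, conc, viability, p0=[100, 0, 10, 1])
top, bottom, ic50, slope = params
print(f"IC50 ~ {ic50:.1f} nM")
```

Concentrations well below the fitted IC50 correspond to the "lower" regime the researchers examined, where cells survive but may still show altered mechanics.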

"It has been known for some time that when microtubules are damaged, some of their functions are taken over by actin filaments. The combination of measurements of the mechanical properties of cells with images from confocal and fluorescence microscopes allowed us to observe this effect. We were able to accurately determine the areas in the cell affected by a given drug and understand how its impact changes over time," emphasised PhD student Kubiak.

Practical conclusions can be drawn from the research of the Cracow physicists. For example, the effect of vinflunine is clearly visible in the nuclear region but is compensated by the actin filaments. As a result, the cell remains rigid enough to continue to multiply. On the other hand, 48 hours after the administration of the drug, the effects of docetaxel are most visible, mainly at the cell periphery. This fact also alerts us to the increased role of actin filaments and means that the therapy should be supported with a drug that acts on these filaments.

"Until now, there has been little research into the effectiveness of low concentrations of anti-cancer drugs. We show that the issue is really worth taking an interest in. For if we understand the mechanisms of action of individual drugs, we can maintain - and sometimes even increase - their current effectiveness while at the same time reducing the side effects of chemotherapy. In this way, chemotherapy can become more patient-friendly, which should affect not only the patient's physical health but also their mental attitude which is so necessary in the fight against cancer," concludes Prof. Lekka.

Credit: 
The Henryk Niewodniczanski Institute of Nuclear Physics Polish Academy of Sciences

Vaccine target for devastating livestock disease could change lives of millions

image: Cattle head out to graze in Serengeti District Tanzania

Image: Harriet Auty, University of Glasgow

The first ever vaccine target for trypanosomes, a family of parasites that cause devastating disease in animals and humans, has been discovered by scientists at the Wellcome Sanger Institute. By targeting a protein on the cell surface of the parasite Trypanosoma vivax, researchers were able to confer long-lasting protection against animal African trypanosomiasis (AAT) infection in mice.

The study, published today (26 May 2021) in Nature, is the first successful attempt to induce apparently sterile immunity against a trypanosome parasite. A vaccine was long thought impossible due to the sophisticated ability of the parasites to evade the host immune system. As well as a strong vaccine target for AAT, the findings raise the possibility of identifying vaccine targets for trypanosome species that cause the deadly human infections sleeping sickness and Chagas' disease.

Animal African trypanosomiasis (AAT) is a disease affecting livestock in Africa and, more recently, South America. It is caused by several species of Trypanosoma parasite, which are transmitted by tsetse flies, causing animals to suffer from fever, weakness, lethargy and anaemia. The resulting weight loss, low fertility and reduced milk yields have a huge economic impact on the people who depend on these animals. The disease has been said to lie at the heart of poverty in Africa [1].

In humans, a disease called sleeping sickness is caused by infection with another trypanosome species, Trypanosoma brucei. Although control efforts have reduced the number of infections each year considerably, 65 million people remain at risk. In South America, the potentially life-threatening infection Chagas' disease is caused by Trypanosoma cruzi and affects at least 6 million people living in endemic areas [2].

All trypanosome species have developed sophisticated anti-immune mechanisms that allow the parasites to thrive in their host. For example, African trypanosomes display a protein on their surface that constantly changes and prevents host antibodies from recognising the pathogen. Until now, it was thought impossible to vaccinate against trypanosome infection for this reason.

In this study, scientists at the Wellcome Sanger Institute analysed the genome of T. vivax to identify 60 cell surface proteins that could be viable vaccine targets. Each protein was produced using mammalian cell lines and then used to vaccinate mice to determine if the host immune system had been instructed to identify and destroy the T. vivax parasite.

One cell surface protein, named 'invariant flagellum antigen from T. vivax' (IFX), was observed to confer immunity against infection in almost all vaccinated mice for at least 170 days after experimental challenge with T. vivax parasites.

Dr Delphine Autheman, first author of the study from the Wellcome Sanger Institute, said: "Scientists have been searching for a way to vaccinate against animal African trypanosomiasis (AAT) since the parasite and vector were first discovered in the early 20th century. We've heard a lot about vaccines recently, but compared to a virus, protozoan parasites have a huge number of proteins, making it very difficult to identify the right targets. Several of the 60 targets we tested elicited a partial immune response, but only one conferred the long-lasting protection that makes it a promising vaccine candidate."

Though drugs exist to prevent or treat AAT, many communities that require them live in remote locations that are difficult to access. Reliance on a handful of drugs, and a lack of professional expertise in their administration, are thought to be contributing to increased drug resistance in the parasites [3]. An effective vaccine would help to overcome some of these practical barriers.

Dr Andrew Jackson, a senior author of the study from the University of Liverpool, said: "It was considered impossible to vaccinate against trypanosome parasites because of the sophisticated immune-protective mechanisms they have evolved, so I'm delighted that we have been able to demonstrate that this can be done. Beyond the obvious benefit of a strong vaccine candidate for animal trypanosomiasis, the genome-led vaccine approach we outline in this study is one that could potentially be applied to other trypanosome species and other parasite families."

The next step will be to validate the results using a cattle model. If successful, work could begin on developing a vaccine for AAT that would be an important tool for tackling poverty in affected regions.

Dr Gavin Wright, a senior author of the study from the Wellcome Sanger Institute and the University of York, said: "This study is an important first step toward relieving the burden of animal African trypanosomiasis (AAT) on both animals and humans in Africa and South America. The protective effect of the vaccine target we identified will first need to be replicated in a cattle model, but I think we can be cautiously optimistic that in a few years' time we will have made substantial progress against this devastating disease."

Michael Pearce, AAT Programme Officer at livestock vaccine organisation GALVmed, said: "Trypanosomiasis remains a major disease challenge for livestock and farmers in Asia, Africa and South America, and is a significant human health problem in Africa and South America. Options for control and treatment of trypanosomiasis are very limited and resistance to currently available medicines is an increasing problem. These novel results from the Sanger Institute are a very important and welcome development, opening up the possibility of successful vaccine development for the prevention and control of trypanosomiasis in both humans and animals."

Credit: 
Wellcome Trust Sanger Institute

Magnetized threads weave spectacular galactic tapestry

image: A panorama of the Galactic Center builds on previous surveys from Chandra and other telescopes. This latest version expands Chandra's high-energy view farther above and below the plane of the galaxy - that is, the disk where most of the galaxy's stars reside - than previous imaging campaigns. In the first two images, X-rays from Chandra are orange, green, and purple, showing different X-ray energies, and the radio data from MeerKAT are gray.

Image: X-ray: NASA/CXC/UMass/Q.D. Wang; Radio: NRF/SARAO/MeerKAT

Threads of superheated gas and magnetic fields are weaving a tapestry of energy at the center of the Milky Way galaxy. A new image of this cosmic masterpiece was made using a giant mosaic of data from NASA's Chandra X-ray Observatory and the MeerKAT radio telescope in South Africa.

The new panorama of the Galactic Center builds on previous surveys from Chandra and other telescopes. This latest version expands Chandra's high-energy view farther above and below the plane of the Galaxy - that is, the disk where most of the Galaxy's stars reside - than previous imaging campaigns. In the image featured in our main graphic, X-rays from Chandra are orange, green, blue and purple, showing different X-ray energies, and the radio data from MeerKAT are shown in lilac and gray. The main features in the image are shown in a labeled version.

One thread is particularly intriguing because it has X-ray and radio emission intertwined. It points perpendicular to the plane of the galaxy and is about 20 light-years long but only one-hundredth that size in width.

A new study of the X-ray and radio properties of this thread by Q. Daniel Wang of the University of Massachusetts at Amherst suggests these features are bound together by thin strips of magnetic fields. This is similar to what was observed in a previously studied thread. (Both threads are labeled in the image. The newly studied one is in the lower left and is much farther away from the plane of the Galaxy.) Such strips may have formed when magnetic fields aligned in different directions, collided, and became twisted around each other in a process called magnetic reconnection. This is similar to the phenomenon that drives energetic particles away from the Sun and is responsible for the space weather that sometimes affects Earth.

A detailed study of these threads teaches us more about the Galactic space weather astronomers have witnessed throughout the region. This weather is driven by volatile phenomena such as supernova explosions, close-quartered stars blowing off hot gas, and outbursts of matter from regions near Sagittarius A*, our Galaxy's supermassive black hole.

In addition to the threads, the new panorama reveals other wonders in the Galactic Center. For example, Wang's paper reports large plumes of hot gas, which extend for about 700 light-years above and below the plane of the galaxy, seen here in greater detail than ever before. (They are much smaller than the Fermi Bubbles which extend for about 25,000 light-years above and below the plane of the galaxy.) These plumes may represent galactic-scale outflows, analogous to the particles driven away from the Sun. The gas is likely heated by supernova explosions and many recent magnetic reconnections occurring near the center of the galaxy. Such reconnection events in the Galaxy are normally not sufficiently energetic to be detected in X-rays, except for the most energetic ones at the center of the Galaxy, where the interstellar magnetic field is much stronger.

Magnetic reconnection events may play a major role in heating the gas existing between stars (the interstellar medium). This process may also be responsible for accelerating particles to produce cosmic rays like those observed on Earth and driving turbulence in the interstellar medium that triggers new generations of star birth.

The image shows that the magnetic threads tend to occur at the outer boundaries of the large plumes of hot gas. This suggests that the gas in the plumes is driving magnetic fields that collide to create the threads.

Credit: 
Center for Astrophysics | Harvard & Smithsonian

Lactate reveals all about its antidepressant prowess

image: Nascent cells in red, neurons in green. The cells in pink are new neurons to be integrated into a neural network

Image: © UNIL-CHUV- Carron / Toni

Depression is the leading cause of disability worldwide. Neuroscientists from Synapsy - the Swiss National Centre of Competence in Research into Mental Illness - based at Lausanne University Hospital (CHUV) and Lausanne University (UNIL) have recently demonstrated that lactate, a molecule produced by the body during exercise, has an antidepressant effect in mice. Lactate is best known for the pivotal role it plays in the nutrition of neurons inside the brain. Yet it can also counter the inhibition of the survival and proliferation of new neurons, a loss seen in patients suffering from depression and in stressed animals. Furthermore, the research team pinpointed NADH as a vital component in the mechanism: this is a molecule with antioxidant properties that is derived from the metabolism of lactate. The findings, published in the scientific journal Molecular Psychiatry, provide a better understanding of the physiological mechanisms that underpin physical activity, which should lead to an improvement in the way depression is treated in the future.

WHO recognises depression - which affects nearly 264 million people - as the leading cause of disability worldwide. Treatments based on antidepressants and psychotherapy are available to help people suffering from the disorder. Yet, as Jean-Luc Martin, senior lecturer and researcher at CHUV's Centre for Psychiatric Neurosciences (CNP) and UNIL, Synapsy member and co-director of the study together with Professor Pierre Magistretti, points out: "Around 30% of people with depression don't respond to antidepressants." At the same time, the antidepressant effects of physical activity have been known for many years, even though the scientific community has struggled to figure out the molecular mechanisms involved.

Exercise and lactate: united against depression

During its previous investigations, the laboratory led by Dr Martin focused on lactate - a molecule produced during physical exercise - in an attempt to explain the benefits of sport. The researchers observed the antidepressant action of lactate when administered to mice at doses comparable to those found during physical activity. As the Vaud-based neuroscientist continues: "Lactate decreases anhedonia in particular, one of the main symptoms of depression, which involves losing interest or pleasure in all those activities which, prior to depression, were considered enjoyable".

Giving birth to new neurons

The CNP team was keen to delve deeper and understand how lactate acts on the brain to counter depression. They focused on adult neurogenesis in the hippocampus, a region of the brain that plays a role in memory and depression. "Adult neurogenesis is the term used for the production of new neurons in adulthood from brain stem cells", points out Dr Martin. "Its core purpose is to replace neurons, and it's known to be impaired in depressive patients, where it contributes to the reduction in the volume of the hippocampus observed in some individuals". With the help of his fellow researchers, Dr Martin was able to show that lactate restores neurogenesis and lowers depressive behaviour in mice. Conversely, without neurogenesis, lactate loses its antidepressant power, indicating that the two are intimately linked.

A key trio

But this does not tell us anything about the mechanism by which lactate regulates neurogenesis. Accordingly, the researchers studied its metabolism: in other words, all the cellular chemical reactions relating to it. Lactate is largely derived from the breakdown of glucose from food, and is then oxidised to pyruvate. Anthony Carrard, a biologist at CNP and the study's lead investigator, explains: "We logically tested pyruvate on neurogenesis, without success. So, we said to ourselves that the answer had to be found in the conversion of lactate to pyruvate".

During conversion of lactate to pyruvate, cells produce a molecule with antioxidant potential, known as NADH. As Dr Carrard continues: "It's NADH and its antioxidant properties that protect neurogenesis during a depressive episode - or at least during a modelling of some of these symptoms in animals". In conclusion, the researcher adds: "This mechanism could explain the link between sport and depression, though further experiments are still needed to demonstrate it. Importantly, it offers potential targets for devising future treatments. To do this, we're first going to identify the proteins on which the NADH factor acts".

Credit: 
National Center of Competence in Research Synapsy

When cancer cells "put all their eggs in one basket"

Normal cells usually have multiple solutions for fixing problems. For example, when DNA becomes damaged, healthy white blood cells can use several different strategies to make repairs. But cancer cells may "put all their eggs in one basket," getting rid of all backup plans and depending on just one pathway to mend their DNA. Cold Spring Harbor Laboratory (CSHL) Professor Christopher Vakoc focuses on probing cancers to figure out if they have any unique dependencies. His lab was surprised to discover that a single DNA repair method remained in acute myeloid leukemia (AML), an aggressive cancer that originates in bone marrow. They discovered that if they shut down that pathway in cells grown in the laboratory, they could kill the cancer cells while leaving normal cells unharmed.

Cancer cells may unintentionally remove multiple methods for fixing problems as they change their DNA to grow and spread quickly. But developing a dependency on just one repair pathway means that they have no backup plans if it fails. Vakoc explains:

"Sometimes cancer cells, to become 'super cells,' they had to get rid of stuff that they thought they didn't need. You get rid of what you don't need, you kind of spring clean maybe a little too much, then you realize: 'Shoot!' You threw away something you actually do need."

In normal cells, a particular type of DNA damage can be solved with two different methods: the ALDH2 gene and the Fanconi anemia (FA) pathway. AML cells have inactivated ALDH2 and are dependent on the FA proteins to perform this DNA repair. The researchers showed that if they shut down the FA pathway, it resulted in cancer cell death.

The team hopes their findings will lead to clinical treatments that eliminate cancer cells without harming other cells in the body. Vakoc says:

"The reality is, there aren't that many differences between cancer cells and normal cells with regard to dependencies. So this is one of the most striking things we've found, which is the kind of win-win for us, to discover a dependency that can be modified with a drug is, we think, the way to make new cancer medicines that are safer and more effective."

Credit: 
Cold Spring Harbor Laboratory

Astonishing quantum experiment in Science raises questions

image: Artistic image of the experiment

Image: Enrique Sahagún, Scixel

Quantum systems are considered extremely fragile. Even the smallest interactions with the environment can result in the loss of sensitive quantum effects. In the renowned journal Science, however, researchers from TU Delft, RWTH Aachen University and Forschungszentrum Jülich now present an experiment in which a quantum system consisting of two coupled atoms proves surprisingly stable under electron bombardment. The experiment provides an indication that special quantum states might be realised in a quantum computer more easily than previously thought.

So-called decoherence is one of the greatest enemies of the quantum physicist. The term refers to the decay of quantum states, which inevitably occurs when the system interacts with its environment. In the macroscopic world, this exchange is unavoidable, which is why quantum effects rarely occur in daily life. The quantum systems used in research, such as individual atoms, electrons or photons, are better shielded, but are fundamentally similarly sensitive.

"Systems subject to quantum physics, unlike classical objects, are not sharply defined in all their properties. Instead, they can occupy several states at once. This is called superposition," Markus Ternes explains. "A famous example is Schrödinger's thought experiment with the cat, which is temporarily dead and alive at the same time. However, the superposition breaks down as soon as the system is disturbed or measured. What is left then is only a single state, which is the measured value," says the quantum physicist from Forschungszentrum Jülich and RWTH Aachen University.

Given this context, the experiment that researchers at TU Delft have now carried out seems all the more astonishing. Using a new method, they succeeded for the first time in observing in real time how two coupled atoms freely exchange quantum information, switching back and forth between different states in a flip-flop interaction.

"Each atom carries a small magnetic moment called spin. These spins influence each other, like compass needles do when you bring them close. If you give one of them a push, they will start moving together in a very specific way," explains Sander Otte, head of the Delft team that performed the experiment.
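The flip-flop exchange Otte describes can be illustrated with a toy two-spin simulation. The coupling strength and units below are arbitrary placeholders, not the actual parameters of the titanium atoms: starting with the spins in opposite directions, the excitation oscillates coherently between the two sites.

```python
import numpy as np
from scipy.linalg import expm

# Two spin-1/2 moments coupled by Heisenberg exchange (hbar = 1, J = 1, both assumed):
#   H = (J/4) * (sx⊗sx + sy⊗sy + sz⊗sz), written in the basis {uu, ud, du, dd}
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
J = 1.0
H = (J / 4) * (np.kron(sx, sx) + np.kron(sy, sy) + np.kron(sz, sz))

# Start with the spins in opposite directions: |up, down>
psi0 = np.array([0, 1, 0, 0], dtype=complex)

# Evolve in time and watch the excitation "flip-flop" between the two atoms
for t in np.linspace(0, np.pi, 5):
    psi = expm(-1j * H * t) @ psi0
    p_ud = abs(psi[1]) ** 2   # probability of |up, down>
    p_du = abs(psi[2]) ** 2   # probability of |down, up>
    print(f"t={t:.2f}  P(up,down)={p_ud:.2f}  P(down,up)={p_du:.2f}")
```

The probabilities trade places periodically (P(down,up) follows sin²(Jt/2)), which is the "moving together in a very specific way" of the compass-needle analogy.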

On a large scale, this kind of information exchange between atoms can lead to fascinating phenomena. Various forms of quantum technologies are based on these. A classical example is superconductivity: the effect where some materials lose all electrical resistivity below a critical temperature.

Unconventional approach

To observe this interaction between atoms, Otte and his team chose a rather direct way: Using a scanning tunnelling microscope, they placed two titanium atoms next to each other at a distance of just over one nanometre - one millionth of a millimetre. At that distance, the atoms are just able to feel each other's spin. If you now twist one of the two spins, the conversation starts by itself.

Usually, this twist is performed by sending very precise radio signals to the atoms. This so-called spin resonance technique - which is quite reminiscent of the working principle of an MRI scanner found in hospitals - is used successfully in research on quantum bits. Among other things, this is how quantum bits are programmed in certain types of quantum computers. However, the method has a disadvantage. "It is simply too slow," says PhD student Lukas Veldman, lead author on the Science publication. "You have barely started twisting the one spin before the other starts to rotate along. This way you can never investigate what happens upon placing the two spins in opposite directions."

So the researchers tried something unorthodox: they rapidly inverted the spin of one of the two atoms with a sudden burst of electric current. To their surprise, this drastic approach resulted in a beautiful quantum interaction, exactly by the book. During the pulse, electrons collide with the atom, causing its spin to rotate. Otte: "But we always assumed that during this process, the delicate quantum information - the so-called coherence - was lost. After all, the electrons that you send are incoherent: the history of each electron prior to the collision is slightly different and this chaos is transferred to the atom's spin, destroying any coherence."

The fact that this now seems not to be true was cause for some debate. Apparently, each random electron, regardless of its past, can initiate a superposition: a specific combination of elementary quantum states which is fully known and which forms the basis for almost any form of quantum technology. The fact that these electrons are still connected to their environment via their history is apparently irrelevant. What is at stake here, then, is the violation of a principle of quantum physics, according to which every measurement irretrievably destroys the superposition of quantum states.

"The crux is that it depends on the perspective," argues Markus Ternes, co-author of the Science paper. "The electron inverts the spin of one atom causing it to point, say, to the left. You could view this as a measurement, erasing all quantum memory. But from the point of view of the combined system comprising both atoms, the resulting situation is not so mundane at all. For the two atoms together, the new state constitutes a perfect superposition, enabling the exchange of information between them. Crucial for this to happen is that both spins become entangled: a particular quantum state in which they share more information about each other than is classically possible."

The discovery could have far-reaching consequences for the development of quantum computers, whose function is based on the entanglement and superposition of quantum states. If one follows the findings, one could get away with being slightly less careful when initializing quantum states than previously thought. For Otte and his team at TU Delft, however, the result is above all the starting point of further exciting experiments. Veldman: "Here we used two atoms, but what happens if you use three? Or ten, or a thousand? Nobody can predict that, because the computing power [for simulating such numbers] is not sufficient."

Credit: 
Forschungszentrum Juelich

Some forams could thrive with climate change, metabolism study finds

image: Light micrograph of the benthic foraminifer Nonionella stella, which thrives in anoxic sulfidic sediments far below the euphotic zone. Individuals are ~225 microns in diameter.

Image: J.M. Bernhard

Woods Hole, Mass. (May 27, 2021) - With the expansion of oxygen-depleted waters in the oceans due to climate change, some species of foraminifera (forams, a type of protist or single-celled eukaryote) that thrive in those conditions could be big winners, biologically speaking.

A new paper that examines two foram species found that they demonstrate great metabolic versatility, flourishing in hypoxic and anoxic sediments where there is little or no dissolved oxygen, suggesting that the forams' contribution to the marine ecosystem will increase with the expansion of oxygen-depleted habitats.

In addition, the paper found that the multiple metabolic strategies that these forams exhibit to adapt to low and no oxygen conditions are changing the classical view about the evolution and diversity of eukaryotes. That classical view hypothesizes that the rise of oxygen in Earth's system led to the acquisition of oxygen-respiring mitochondria, the part of a cell that generates most of the chemical energy that powers a cell's biochemical reactions. The forams in the study represent "typical" mitochondrial-bearing eukaryotes. However, these two forams respire nitrate and produce energy in the absence of oxygen, with one colonizing an anoxic environment, often with high levels of hydrogen sulfide, a chemical compound typically toxic to eukaryotes.

"Benthic foraminifera represent truly successful microbial eukaryotes with diverse and sophisticated metabolic adaptive strategies" that scientists are just beginning to discover, the authors noted in the paper, "Multiple integrated metabolic strategies allow foraminiferal protists to thrive in anoxic marine sediments," appearing in Science Advances.

This is important because scientists have studied forams extensively for interpreting past oceanographic and climate conditions. Scientists largely have assumed that forams evolved after oxygen was on the planet and likely require oxygen to survive. However, finding that forams can perform the processes described "throws a whole new wrench in interpretations of past environmental conditions on Earth, driven by the foram fossil record," said co-author and project leader Joan Bernhard, senior scientist in the Geology and Geophysics Department at the Woods Hole Oceanographic Institution (WHOI).

Bernhard said that over the past several decades she has worked to establish that forams can live where there is little or no oxygen. "We never knew exactly why forams can live where there isn't any oxygen until molecular methods got good enough that we could really start to ask some of these questions. This is our first paper that's coming out with some of these insights," she said. Bernhard added that with thousands of foram species living today, and with hundreds of thousands extinct, it is likely that this is "the tip of the iceberg" in terms of possibly discovering other metabolic strategies invoked by these forams.

Specific insights from the paper pertain to two highly successful benthic foraminiferal species that inhabit hypoxic or anoxic sediments in the Santa Barbara Basin, a sort of natural laboratory off the coast of California for studying the impact of oxygen depletion in the ocean.

Through gene expression analysis of the two species--Nonionella stella and Bolivina argentea--the scientists identified distinct metabolic adaptations that allow the forams to thrive in oxygen-depleted marine sediments, along with candidate genes involved in anaerobic respiration and energy metabolism.

N. stella is a sort of kleptomaniac, using a technique to steal chloroplasts--the structures in a cell where photosynthesis occurs--from a particular diatom genus. What makes this particularly interesting is that N. stella lives well below the zone where photosynthesis is considered possible. The authors noted that the literature has questioned whether these kleptoplasts are functional in Santa Barbara Basin N. stella, but the new results show that they are indeed functional, although the exact metabolic details remain elusive.

In addition, the scientists found that the two foram species in the study use different metabolic pathways to incorporate ammonium into organic nitrogen in the form of glutamate, a metabolic strategy that was not previously known to be performed by these organisms.

"The metabolic variety suggests that at least some species of this diverse protistan group will withstand severe deoxygenation and likely play major roles in oceans affected by climate change," the authors wrote.

The study "gives the scientific community a new direction for research," said lead author Fatma Gomaa, who, at the time of the study, was a postdoctoral investigator at the Geology and Geophysics Department at WHOI. "We are now starting to learn that there are microeukaryotes living in habitats similar to those in Earth's early history that are performing very interesting biological functions. Learning about these forams is very intriguing and will shed light on how early eukaryotes evolved."

Credit: 
Woods Hole Oceanographic Institution

Reaping the benefits of noise

image: Two mirrors with a drop of oil in between form a non-linear optical cavity, in which stochastic resonance was observed. By modulating the position of one of the mirrors, the laser light (approaching from the left) is turned into a signal (right). An optimum amount of noise amplifies this signal when the conditions of stochastic resonance have been met.

Image: 
AMOLF Interacting Photons group

Signals can be amplified by an optimum amount of noise, but this so-called stochastic resonance is a rather fragile phenomenon. Researchers at AMOLF were the first to investigate the role of memory in this phenomenon, using an oil-filled optical microcavity. The effects of slow non-linearity (i.e. memory) on stochastic resonance had never been considered before, and these experiments suggest that stochastic resonance becomes robust to variations in the signal frequency when a system has memory. This has implications in many fields of physics and energy technology. In particular, the scientists show numerically that introducing slow non-linearity into a mechanical oscillator harvesting energy from noise can increase its efficiency tenfold. They publish their findings in Physical Review Letters on May 27th.

It is not easy to concentrate on a difficult task when two people are having a loud discussion right next to you. However, complete silence is often not the best alternative. Whether it is some soft music, remote traffic noise or the hum of people chatting in the distance, for many people, an optimum amount of noise enables them to concentrate better. "This is the human equivalent of stochastic resonance", says AMOLF group leader Said Rodriguez. "In our scientific labs stochastic resonance happens in non-linear systems that are bistable. This means that, for a given input, the output can switch between two possible values. When the input is a periodic signal, the response of a non-linear system can be amplified by an optimum amount of noise using the stochastic resonance condition."
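The bistable picture Rodriguez describes can be sketched numerically. Below is a hedged toy model, not the AMOLF cavity: an overdamped particle in a double-well potential, driven by a weak periodic signal plus tunable noise. The function name `simulate` and all parameter values are illustrative assumptions.

```python
import math
import random

def simulate(noise, steps=200_000, dt=1e-3, A=0.12,
             omega=2 * math.pi * 0.1, seed=1):
    """Overdamped particle in the double well V(x) = -x^2/2 + x^4/4,
    weakly driven at frequency omega, with Gaussian white noise of
    strength `noise`. Returns the response amplitude at the drive
    frequency, a crude measure of signal amplification."""
    rng = random.Random(seed)
    x = 1.0          # start in the right-hand well
    c = s = 0.0      # running Fourier sums at the drive frequency
    for n in range(steps):
        t = n * dt
        drift = x - x**3 + A * math.cos(omega * t)
        x += drift * dt + math.sqrt(2 * noise * dt) * rng.gauss(0.0, 1.0)
        c += x * math.cos(omega * t) * dt
        s += x * math.sin(omega * t) * dt
    T = steps * dt
    return 2 * math.hypot(c, s) / T

# Too little noise: the particle stays in one well; too much: random hopping.
for D in (0.02, 0.12, 0.6):
    print(f"noise D={D}: response amplitude {simulate(D):.3f}")
```

With the drive amplitude below the hopping threshold, the response is typically largest at an intermediate noise level; that optimum is the stochastic resonance condition.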

Ice ages

In the 1980s, stochastic resonance was proposed as an explanation for the recurrence of ice ages. Since then, it has been observed in many natural and technological systems, but this widespread observation poses a puzzle to scientists. Rodriguez: "Theory suggests that stochastic resonance can only occur at a very specific signal frequency. However, many noise-embracing systems live in environments where signal frequencies fluctuate. For example, it has been shown that certain fish prey on plankton by detecting a signal they emit, and that an optimum amount of noise enhances the fish's ability to detect that signal through the phenomenon of stochastic resonance. But how can this effect survive fluctuations in the signal frequency occurring in such complex environments?"

Memory effects

Rodriguez and his PhD student Kevin Peters, the paper's first author, were the first to demonstrate that memory effects must be taken into account to solve this puzzle. "The theory of stochastic resonance assumes that non-linear systems respond instantaneously to an input signal. However, in reality most systems respond to their environment with a certain delay, and their response depends on all that happened before," he says. Such memory effects are difficult to describe theoretically and to control experimentally, but the Interacting Photons group at AMOLF has now managed both. Rodriguez: "We added a controlled amount of noise to a beam of laser light and shone it on a tiny oil-filled cavity, which is a non-linear system. The light causes the temperature of the oil to rise and its optical properties to change, but not immediately. It takes about ten microseconds, so the system is non-instantaneous as well. In our experiments, we have shown for the first time that stochastic resonance can occur over a broad range of signal frequencies when memory effects are present."
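The delayed response can be caricatured by letting the nonlinear term of a double-well oscillator relax with its own time constant, loosely mimicking the roughly ten-microsecond thermal response of the oil. This is an illustrative sketch under stated assumptions, not the cavity equations from the paper; the function name `hop_response` and all numbers are hypothetical.

```python
import math
import random

def hop_response(noise, tau=0.0, steps=100_000, dt=1e-3,
                 A=0.12, omega=2 * math.pi * 0.1, seed=2):
    """Double-well oscillator whose nonlinear 'stiffness' y relaxes toward
    x with time constant tau; tau=0 recovers the instantaneous, memoryless
    case. Returns the response amplitude at the drive frequency."""
    rng = random.Random(seed)
    x = y = 1.0
    c = s = 0.0
    for n in range(steps):
        t = n * dt
        if tau > 0:
            y += (x - y) * dt / tau   # slow (memory) variable lags behind x
        else:
            y = x                      # instantaneous nonlinearity
        drift = x - y**2 * x + A * math.cos(omega * t)
        x += drift * dt + math.sqrt(2 * noise * dt) * rng.gauss(0.0, 1.0)
        c += x * math.cos(omega * t) * dt
        s += x * math.sin(omega * t) * dt
    return 2 * math.hypot(c, s) / (steps * dt)
```

Sweeping the drive frequency for tau = 0 versus tau > 0 in such a toy is one way to probe, qualitatively, how memory can broaden the range over which noise helps.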

Energy harvesting

Having thus shown that the widespread occurrence of stochastic resonance may be due to as-yet-unnoticed memory dynamics, the researchers hope that their results will inspire colleagues in other fields of science to search for memory effects in their own systems. To extend the impact of their findings, Rodriguez and his team have theoretically investigated the effects of non-instantaneous response on mechanical systems for energy harvesting. "Small piezo-electric devices that harvest energy from vibrations are useful when battery replacement is difficult, for example in pacemakers or other biomedical devices," he explains. "We found a tenfold increase in the amount of energy that could be harvested from environmental vibrations when memory effects were incorporated."

The obvious next step for the group is to expand their system with several connected oil-filled cavities and investigate collective behavior emerging from noise. Rodriguez does not fear stepping outside his scientific comfort zone. He says: "It would be great if we could team up with researchers that have expertise in mechanical oscillators. If we can implement our memory effects in those systems, the impact on energy technology will be enormous."

Credit: 
AMOLF

Comprehensive electronic-structure methods review featured in Nature Materials

image: A large number of candidate materials are chosen from experimental or computational databases, and a sequence of screening calculations reduces their number down to a small set of candidates with the most promising properties.

Image: 
© Nicola Marzari

Over the past 20 years, first-principles simulations have become powerful, widely used tools in many, diverse fields of science and engineering. From nanotechnology to planetary science, from metallurgy to quantum materials, they have accelerated the identification, characterization, and optimization of materials enormously. They have led to astonishing predictions--from ultrafast thermal transport to electron-phonon mediated superconductivity in hydrides to the emergence of flat bands in twisted-bilayer graphene-- that have gone on to inspire remarkable experiments.

The current push to complement experiments with simulations, the continued rapid growth in computing capacity, the ability of machine learning and artificial intelligence to accelerate materials discovery, and the promise of disruptive accelerators such as quantum computing for exponentially expensive tasks all make it apparent that these methods will only become more relevant with time. It is therefore an appropriate moment to review the capabilities as well as the limitations of the electronic-structure methods underlying these simulations. Marzari, Ferretti and Wolverton address this task in the paper "Electronic-structure methods for materials design," just published in Nature Materials.

"Simulations do not fail in spectacular ways but can subtly shift from being invaluable to barely good enough to just useless," the authors said in the paper. "The reasons for failure are manifold, from stretching the capabilities of the methods to forsaking the complexity of real materials. But simulations are also irreplaceable: they can assess materials at conditions of pressure and temperature so extreme that no experiment on earth is able to replicate, they can explore with ever-increasing nimbleness the vast space of materials phases and compositions in the search for that elusive materials breakthrough, and they can directly identify the microscopic causes and origin of a macroscopic property. Last, they share with all branches of computational science a key element of research: they can be made reproducible and open and shareable in ways that no physical infrastructure will ever be."

The authors first look at the framework of density-functional theory (DFT) and give an overview of the increasingly complex approaches that can improve accuracy or extend the scope of simulations. They then discuss the capabilities that computational materials science has developed to exploit this toolbox and deliver predictions for the properties of materials under realistic conditions of ever-increasing complexity. Finally, they highlight how physics- or data-driven approaches can provide rational, high-throughput, or artificial-intelligence avenues to materials discovery, and explain how such efforts are changing the entire research ecosystem.
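The screening funnel described in the image caption, where a sequence of calculations whittles a large candidate pool down to a few promising materials, can be illustrated with a toy example. The candidate entries, property names, and thresholds below are hypothetical, not data or criteria from the paper.

```python
# Toy high-throughput screening "funnel": each stage discards candidates,
# mimicking successive (and successively more expensive) calculations.
candidates = [
    {"name": "A", "band_gap_eV": 1.4, "hull_dist_eV": 0.01, "contains_pb": False},
    {"name": "B", "band_gap_eV": 0.0, "hull_dist_eV": 0.00, "contains_pb": False},
    {"name": "C", "band_gap_eV": 1.6, "hull_dist_eV": 0.30, "contains_pb": False},
    {"name": "D", "band_gap_eV": 1.2, "hull_dist_eV": 0.02, "contains_pb": True},
]

stages = [
    ("stable", lambda m: m["hull_dist_eV"] < 0.05),          # thermodynamic stability
    ("semiconductor", lambda m: 0.5 < m["band_gap_eV"] < 2.5),
    ("lead-free", lambda m: not m["contains_pb"]),
]

survivors = candidates
for label, keep in stages:
    survivors = [m for m in survivors if keep(m)]
    print(f"after '{label}' screen: {[m['name'] for m in survivors]}")
# Final survivors: ['A']
```

In practice each stage would be a first-principles calculation or a machine-learned surrogate rather than a one-line predicate, but the funnel structure is the same.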

Looking ahead, the authors say that developing methods that can assess the thermodynamic stability, synthesis conditions, manufacturability, and tolerance of the predicted properties to intrinsic and extrinsic defects in novel materials will be a significant challenge. Researchers may need to augment DFT estimates by more advanced electronic-structure methods or machine learning algorithms to improve accuracy, and use computational methods to address realistic conditions such as vibrational entropies, the concentration of defects and applied electrochemical potentials.

Finally, given the extended role that such methods are likely to play in the coming decades, the authors note that support and planning for the needed computational infrastructures--widely used scientific software, the verification of codes and validation of theories, the dissemination and curation of computational data, tools and workflows as well as the associated career models these entail and require--are only just beginning to emerge.

Credit: 
National Centre of Competence in Research (NCCR) MARVEL

The robot smiled back

image: The Robot Smiles Back: Eva mimics human facial expressions in real time from a live camera stream. The entire system is learned without human labels. Eva learns two essential capabilities: 1) anticipating what it would look like if it were making an observed facial expression, known as a self-image; 2) mapping its imagined face to physical actions.

Image: 
Creative Machines Lab/Columbia Engineering

New York, NY--May 27, 2021--While our facial expressions play a huge role in building trust, most robots still sport the blank and static visage of a professional poker player. With the increasing use of robots in locations where robots and humans need to work closely together, from nursing homes to warehouses and factories, the need for a more responsive, facially realistic robot is growing more urgent.

Long interested in the interactions between robots and humans, researchers in the Creative Machines Lab at Columbia Engineering have been working for five years to create EVA, a new autonomous robot with a soft and expressive face that responds to match the expressions of nearby humans. The research will be presented at the ICRA conference on May 30, 2021, and the robot blueprints are open-sourced on HardwareX (April 2021).

"The idea for EVA took shape a few years ago, when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes," said Hod Lipson, James and Sally Scapa Professor of Innovation (Mechanical Engineering) and director of the Creative Machines Lab.

Lipson observed a similar trend in the grocery store, where he encountered restocking robots wearing name badges, and in one case, decked out in a cozy, hand-knit cap. "People seemed to be humanizing their robotic colleagues by giving them eyes, an identity, or a name," he said. "This made us wonder, if eyes and clothing work, why not make a robot that has a super-expressive and responsive human face?"

While this sounds simple, creating a convincing robotic face has been a formidable challenge for roboticists. For decades, robotic body parts have been made of metal or hard plastic, materials that were too stiff to flow and move the way human tissue does. Robotic hardware has been similarly crude and difficult to work with--circuits, sensors, and motors are heavy, power-intensive, and bulky.

VIDEO: https://youtu.be/1vBLI-q04kM
PROJECT WEBSITE: http://www.cs.columbia.edu/~bchen/aiface/

The first phase of the project began in Lipson's lab several years ago when undergraduate student Zanwar Faraj led a team of students in building the robot's physical "machinery." They constructed EVA as a disembodied bust that bears a strong resemblance to the silent but facially animated performers of the Blue Man Group. EVA can express the six basic emotions of anger, disgust, fear, joy, sadness, and surprise, as well as an array of more nuanced emotions, by using artificial "muscles" (i.e. cables and motors) that pull on specific points on EVA's face, mimicking the movements of the more than 42 tiny muscles attached at various points to the skin and bones of human faces.

"The greatest challenge in creating EVA was designing a system that was compact enough to fit inside the confines of a human skull while still being functional enough to produce a wide range of facial expressions," Faraj noted.

To overcome this challenge, the team relied heavily on 3D printing to manufacture parts with complex shapes that integrated seamlessly and efficiently with EVA's skull. After weeks of tugging cables to make EVA smile, frown, or look upset, the team noticed that EVA's blue, disembodied face could elicit emotional responses from their lab mates. "I was minding my own business one day when EVA suddenly gave me a big, friendly smile," Lipson recalled. "I knew it was purely mechanical, but I found myself reflexively smiling back."

Once the team was satisfied with EVA's "mechanics," they began to address the project's second major phase: programming the artificial intelligence that would guide EVA's facial movements. While lifelike animatronic robots have been in use at theme parks and in movie studios for years, Lipson's team made two technological advances. EVA uses deep learning artificial intelligence to "read" and then mirror the expressions on nearby human faces. And EVA's ability to mimic a wide range of different human facial expressions is learned by trial and error from watching videos of itself.

The most difficult human activities to automate involve non-repetitive physical movements that take place in complicated social settings. Boyuan Chen, Lipson's PhD student who led the software phase of the project, quickly realized that EVA's facial movements were too complex a process to be governed by pre-defined sets of rules. To tackle this challenge, Chen and a second team of students created EVA's brain using several Deep Learning neural networks. The robot's brain needed to master two capabilities: First, to learn to use its own complex system of mechanical muscles to generate any particular facial expression, and, second, to know which faces to make by "reading" the faces of humans.

To teach EVA what its own face looked like, Chen and team filmed hours of footage of EVA making a series of random faces. Then, like a human watching herself on Zoom, EVA's internal neural networks learned to pair muscle motion with the video footage of its own face. Now that EVA had a primitive sense of how its own face worked (known as a "self-image"), it used a second network to match its own self-image with the image of a human face captured on its video camera. After several refinements and iterations, EVA acquired the ability to read human face gestures from a camera, and to respond by mirroring that human's facial expression.
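The two-stage loop above can be caricatured in a few lines of code. This is a deliberately simplified sketch: a toy function stands in for EVA's real face, random "motor babbling" builds the self-image, and a nearest-neighbor lookup stands in for the deep networks the team actually used. All names and the three-motor face are hypothetical.

```python
import math
import random

# Toy stand-in for the robot's face: motor commands -> appearance features.
# (Purely illustrative; EVA pairs real camera footage with deep networks.)
def face_appearance(motors):
    return [math.tanh(motors[0] + 0.5 * motors[1]),
            math.tanh(motors[1] - 0.3 * motors[2]),
            math.tanh(motors[2] * motors[0])]

# Stage 1: "motor babbling" -- record random motor commands alongside the
# appearance they produce, building a primitive self-image.
rng = random.Random(0)
self_image = [([rng.uniform(-1, 1) for _ in range(3)], None) for _ in range(2000)]
self_image = [(m, face_appearance(m)) for m, _ in self_image]

# Stage 2: given an observed expression, choose the motor command whose
# imagined appearance is closest to it (nearest neighbor replaces the
# second network that maps the imagined face to physical actions).
def mimic(observed):
    sq_dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(self_image, key=lambda rec: sq_dist(rec[1], observed))[0]

target = face_appearance([0.4, -0.2, 0.7])  # an "observed" expression
motors = mimic(target)                       # motor command that reproduces it
```

The point of the sketch is the structure, learn a forward self-model first, then invert it to imitate, rather than the particular lookup used here.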

The researchers note that EVA is a laboratory experiment, and mimicry alone is still a far cry from the complex ways in which humans communicate using facial expressions. But such enabling technologies could someday have beneficial, real-world applications. For example, robots capable of responding to a wide variety of human body language would be useful in workplaces, hospitals, schools, and homes.

"There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers," said Lipson. "Our brains seem to respond well to robots that have some kind of recognizable physical presence."

Added Chen, "Robots are intertwined in our lives in a growing number of ways, so building trust between humans and machines is increasingly important."

Credit: 
Columbia University School of Engineering and Applied Science

Shedding new light: A new type of immunosensor for immunoassay tests

image: The enzyme "Nluc" is added to the Q-body, forming a BRET Q-body. Luminescence is observed upon antigen binding. To the right are emission spectra, recorded in the presence of the substrate, of the BRET Q-body labeled with the fluorescent dye TAMRA-C5-mal. The inset shows the change in color observed on antigen binding.

Image: 
Tokyo Tech

Immunosensors are widely used in immunoassays to detect antigens. One such immunosensor is a quenchbody (Q-body), which contains a modified antibody fragment with a quenched fluorescent dye. When an antigen binds to the Q-body, the dye leaves the antibody and the fluorescence intensifies. The change in fluorescence intensity is easy to measure, making Q-body-based antigen detection systems incredibly simple. However, this method requires an external light source to excite the electrons in the fluorescent dye to produce luminescence.

One way to solve this is to induce luminescence by an alternative method. To achieve this, researchers from Tokyo Tech, Japan, have developed a novel immunosensor. They used a modified luciferase enzyme called "NanoLuc" (Nluc) that is originally responsible for bioluminescence in shrimp and fused it to the Q-body. This immunosensor, termed "BRET Q-body", works on the bioluminescence resonance energy transfer (BRET) principle. Here, a luminescent substrate is added to the fused Q-body. The substrate reacts with the enzyme and this reaction provides the energy required by the dye to induce fluorescence. This type of luminescence is advantageous, as Professor Hiroshi Ueda, who leads the team of researchers, explains, "The BRET Q-body system can be used to visualize the presence or the absence of an antigen as a change in the emission color without any instrument." Their findings were recently published in the journal Analytical Chemistry.

To prepare the BRET Q-body, the researchers used a single-chain antibody fragment which binds to the antigen BGP, a protein found in bone. The antibody fragment was then labeled with a fluorescent dye. They then tested the fluorescence intensity of the resulting BRET Q-bodies and observed that the addition of the antigen increased the intensity of the fluorescence.

After these initial results, the researchers tested the antigen dependency of the BRET Q-body. They compared the fluorescence obtained by the new BRET-based method against the conventional irradiation-based method. The fluorescence intensity of the BRET Q-body was initially measured with excitation light and later in the presence of the luminescent substrate. They found that the antigen-binding brought the Nluc enzyme and the dye closer together, resulting in higher fluorescence intensity levels when the substrate was used. As luminescence from the BRET Q-body is obtained initially from the enzyme and then from the fluorescent dye upon antigen binding, a simple color change indicates the presence of an antigen.
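In general BRET practice, this kind of readout is quantified as the ratio of acceptor (dye) emission to donor (luciferase) emission. Below is a minimal sketch of that generic definition, with hypothetical intensity values rather than the paper's actual measurements.

```python
def bret_ratio(acceptor_intensity, donor_intensity):
    """Generic BRET quantification: acceptor emission divided by donor
    emission. Antigen binding that brings dye and enzyme closer raises
    this ratio. (Textbook definition, not Tokyo Tech's exact readout.)"""
    return acceptor_intensity / donor_intensity

# Hypothetical readings: emission at the dye's wavelength rises on binding.
before = bret_ratio(acceptor_intensity=120.0, donor_intensity=400.0)  # 0.3
after = bret_ratio(acceptor_intensity=360.0, donor_intensity=400.0)   # 0.9
print(f"BRET ratio: {before:.1f} -> {after:.1f}")
```

A shift in this ratio moves the overall emission color, which is why the presence of antigen can be judged by eye.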

This study paves the way forward for a new class of bioluminescent sensors that do not require an external excitation light source. The novel BRET Q-bodies are expected to make immunoassay tests much simpler and more accurate. As Prof. Ueda explains the advantages and potential applications of their findings, "The detection of the BRET signal does not need a light source, allows visual observation of the color change, and easier integration to a smartphone-based device. Therefore, we expect that BRET Q-bodies will be a promising tool for diagnosis, food safety, environmental preservation, and biological research."

Credit: 
Tokyo Institute of Technology

Partners in crime: Agricultural pest that relies on bacteria to overcome plant defenses

image: Levels of damage to A. thaliana leaves after exposure to S. litura larvae raised under conditions that did or did not sterilize their oral secretions. The asterisk indicates a statistically significant difference between the damage levels under the different conditions.

Image: 
Professor Gen-ichiro Arimura, Tokyo University of Science

Although insect larvae may seem harmless to humans, they can be extremely dangerous to the plant species that many of them feed on, some of which are important agricultural crops. Plants cannot simply flee from danger as animals typically would, but many have nonetheless evolved ingenious strategies to defend themselves from herbivores. Herbivorous insect larvae commonly smear digestive proteins from their mouths onto the plants they eat. When plants detect chemicals commonly found in these oral secretions, they respond to the injury by producing defensive molecules of their own, including proteins and specialized metabolites, that inactivate the insect's digestive proteins and thus prevent the insect from obtaining nutrients from the plant.

Of course, the existence of such chemical defense mechanisms in plants is a problem, which herbivorous insects must counter. One way that insects have evolved to overcome these problems is by forming partnerships with bacteria. For example, the digestive oral secretions of the Colorado potato beetle (Leptinotarsa decemlineata) include bacteria that can suppress the defense mechanisms of the tomato plants that the beetle commonly feeds on. The beetle and the bacteria have thus achieved "symbiosis," which is a term that biologists use to describe a mutually beneficial partnership: the beetle provides the bacteria with a comfortable environment inside its mouth and other secretory organs, and the bacteria help the beetle consume nutrients from tomato plants.

To Prof. Gen-ichiro Arimura of Tokyo University of Science, this is a fascinating result: "Although it is well known that symbiotic microorganisms in animals (especially bacteria in the intestines of herbivores such as pandas and cows) affect biological activities such as digestion and reproduction, the fact that they affect the prey (i.e., the plants) is not so well known." In other words, the fact that the insect's bacterial partners work to alter biochemical processes within the living plant before it is eaten is a matter of considerable interest to scientists.

Prof. Arimura and his research team, in collaboration with Okayama University, wondered whether such partnerships with bacteria might apply in the case of the insect Spodoptera litura, whose larvae are major pests that commonly damage crops in Asia. In an article recently published in the journal New Phytologist, Prof. Arimura's research team experimented with applying the oral secretions of S. litura larvae to mechanically damaged leaves of the thale cress plant (Arabidopsis thaliana). When the researchers sterilized the oral secretions to kill or remove any bacteria present in them, they found that applying these secretions to the plant leaves stimulated the expression of defense-related genes and the production of oxylipins that play important roles in defending A. thaliana cells from digestion. However, when the researchers applied oral secretions that had not been sterilized, the bacteria within the secretions prevented the expression of defense-related genes and the production of oxylipins. Instead, the bacteria stimulated the production of salicylic acid and abscisic acid, two chemicals that act to suppress the production of oxylipins.

These findings are compelling evidence that bacteria in the oral secretions of S. litura assist the larvae in overcoming plant defense mechanisms, and the researchers wanted to identify the bacteria responsible. Tests of the larvae's oral secretions revealed the presence of a bacterium called Staphylococcus epidermidis, and further experiments confirmed that S. epidermidis acted to suppress plant defense mechanisms.

These results provide important insights into how S. litura counteracts the defense mechanisms of the plants that it feeds on, and Prof. Arimura hopes that knowing more about the relationship between the larvae and the bacteria will help crop scientists develop techniques to protect important crop species from S. litura. Such techniques may help farmers reduce their use of environmentally harmful pesticides, and Prof. Arimura expresses optimism that his research will thus "contribute to the creation of a safe and secure food supply and a rich environment."

Credit: 
Tokyo University of Science

Engineered defects in crystalline material boost electrical performance

image: Xiaoli Tan and a team of campus collaborators used this transmission electron microscope at the Ames Laboratory's Sensitive Instrument Facility to study the effects of engineering defects into certain materials.

Image: 
Photo by Christopher Gannon/Iowa State University

AMES, Iowa - Materials engineers don't like to see line defects in functional materials.

These structural flaws along a one-dimensional line of atoms generally degrade the performance of electrical materials. So, as a research paper published today by the journal Science reports, such linear defects, or dislocations, "are usually avoided at all costs."

But as a team of researchers from Europe, Iowa State University and the U.S. Department of Energy's Ames Laboratory reports in that paper, deliberately engineering those defects into some oxide crystals can actually increase electrical performance.

The research team - led by Jürgen Rödel and Jurij Koruza of the Technical University of Darmstadt in Germany - found certain defects produce significant improvements in two key measurements of electrical performance in barium titanate, a crystalline ceramic material.

"By introducing these defects into the material, we can change, modify or improve the material's functional properties," said Xiaoli Tan, an Iowa State professor of materials science and engineering and a longtime research collaborator with Rödel.

In this case, the engineered defects led to a five-fold increase in dielectric properties (which restrict the flow of current) and a 19-fold increase in piezoelectric properties (which generate an internal electric field when the material is subjected to mechanical stress), Tan said.

Special tools for special measurements

In addition to Tan, two other Iowa State researchers helped the project's international research team explore fundamental materials questions: Lin Zhou, a scientist in materials science and engineering and the U.S. Department of Energy's Ames Laboratory; and Binzhi Liu, a doctoral student in materials science and engineering.

With support from the National Science Foundation, the three contributed their expertise in transmission electron microscopy - technology that can show the structures and features of materials by shooting a beam of electrons through thin samples and recording an image. The images have much higher resolution than light microscopy and can show fine details down to the scale of individual atoms.

Key to the project was the Ames Laboratory's Sensitive Instrument Facility, built in cooperation with Iowa State in 2015 with nearly $10 million from the Department of Energy. It provides a vibration- and static-free environment for electron microscopy at the highest possible resolutions.

"It's a state-of-the-art electron microscopy facility," Zhou said. "It provides an ultra-stable environment so we can achieve atom-level images of material and at the same time acquire chemical information.

"It's a great platform for research and educating the next generation of materials scientists."

A better material for capacitors?

For this project, the electron microscopy team quantified the evidence that line defects in a crystalline material can boost electrical performance, Liu said.

The numbers showed that "the dislocations can significantly alter the behavior of other fine features in the material," Liu said.

Tan said the finding could have big implications for the electrical capacitor industry.

There are hundreds of capacitors in your cell phone, and the market for them is huge, Tan said. The ceramic material tested in this project is widely used in capacitors, and the defect-induced boost in electrical performance could make it better. It is also lead-free and less toxic than other material options.

And so, the researchers wrote, these engineered line defects could turn into "a different suite of tools to tailor functional materials." And this "functional harvesting" could be good for our electronics, and even our environment and health.

Credit: 
Iowa State University

Driving in the snow is a team effort for AI sensors

image: Pavement can be hard to find on some winter roads. Sensor technology and image processing could help autonomous vehicles better navigate snowy conditions.

Image: 
Sarah Atkinson/Michigan Tech

Nobody likes driving in a blizzard, including autonomous vehicles. To make self-driving cars safer on snowy roads, engineers look at the problem from the car's point of view.

A major challenge for fully autonomous vehicles is navigating bad weather. Snow especially confounds crucial sensor data that helps a vehicle gauge depth, find obstacles and keep on the correct side of the yellow line, assuming it is visible. Averaging more than 200 inches of snow every winter, Michigan's Keweenaw Peninsula is the perfect place to push autonomous vehicle tech to its limits. In two papers presented at SPIE Defense + Commercial Sensing 2021, researchers from Michigan Technological University discuss solutions for snowy driving scenarios that could help bring self-driving options to snowy cities like Chicago, Detroit, Minneapolis and Toronto.

Just like the weather at times, autonomy is not a sunny or snowy yes-no designation. Autonomous vehicles cover a spectrum of levels, from cars already on the market with blind spot warnings or braking assistance, to vehicles that can switch in and out of self-driving modes, to others that can navigate entirely on their own. Major automakers and research universities are still tweaking self-driving technology and algorithms. Occasionally accidents occur, either due to a misjudgment by the car's artificial intelligence (AI) or a human driver's misuse of self-driving features.

Humans have sensors, too: our scanning eyes, our sense of balance and movement, and the processing power of our brain help us understand our environment. These seemingly basic inputs allow us to drive in virtually every scenario, even if it is new to us, because human brains are good at generalizing novel experiences. In autonomous vehicles, two cameras mounted on gimbals scan and perceive depth using stereo vision to mimic human vision, while balance and motion can be gauged using an inertial measurement unit. But, computers can only react to scenarios they have encountered before or been programmed to recognize.

Since artificial brains aren't around yet, task-specific AI algorithms must take the wheel -- which means autonomous vehicles must rely on multiple sensors. Fisheye cameras widen the view while other cameras act much like the human eye. Infrared picks up heat signatures. Radar can see through fog and rain. Light detection and ranging (lidar) pierces through the dark and weaves a neon tapestry of laser beam threads.

"Every sensor has limitations, and every sensor covers another one's back," said Nathir Rawashdeh, assistant professor of computing in Michigan Tech's College of Computing and one of the study's lead researchers. He works on bringing the sensors' data together through an AI process called sensor fusion.

"Sensor fusion uses multiple sensors of different modalities to understand a scene," he said. "You cannot exhaustively program for every detail when the inputs have difficult patterns. That's why we need AI."

Rawashdeh's Michigan Tech collaborators include Nader Abu-Alrub, his doctoral student in electrical and computer engineering, and Jeremy Bos, assistant professor of electrical and computer engineering, along with master's degree students and graduates from Bos' lab: Akhil Kurup, Derek Chopp and Zach Jeffries. Bos explains that lidar, infrared and other sensors on their own are like the hammer in an old adage. "'To a hammer, everything looks like a nail,'" quoted Bos. "Well, if you have a screwdriver and a rivet gun, then you have more options."

Most autonomous sensors and self-driving algorithms are being developed in sunny, clear landscapes. Knowing that the rest of the world is not like Arizona or southern California, Bos' lab began collecting local data in a Michigan Tech autonomous vehicle (safely driven by a human) during heavy snowfall. Rawashdeh's team, notably Abu-Alrub, pored over more than 1,000 frames of lidar, radar and image data from snowy roads in Germany and Norway to start teaching their AI program what snow looks like and how to see past it.

"All snow is not created equal," Bos said, pointing out that the variety of snow makes sensor detection a challenge. Rawashdeh added that pre-processing the data and labeling it accurately is an important step for both accuracy and safety: "AI is like a chef -- if you have good ingredients, there will be an excellent meal," he said. "Give the AI learning network dirty sensor data and you'll get a bad result."

Low-quality data is one problem and so is actual dirt. Much like road grime, snow buildup on the sensors is a solvable but bothersome issue. Once the view is clear, autonomous vehicle sensors are still not always in agreement about detecting obstacles. Bos mentioned a great example of discovering a deer while cleaning up locally gathered data. Lidar said that blob was nothing (30% chance of an obstacle), the camera saw it like a sleepy human at the wheel (50% chance), and the infrared sensor shouted WHOA (90% sure that is a deer).

Getting the sensors and their risk assessments to talk and learn from each other is like the Indian parable of three blind men who find an elephant: each touches a different part of the elephant -- the creature's ear, trunk and leg -- and comes to a different conclusion about what kind of animal it is. Using sensor fusion, Rawashdeh and Bos want autonomous sensors to collectively figure out the answer -- be it elephant, deer or snowbank. As Bos puts it, "Rather than strictly voting, by using sensor fusion we will come up with a new estimate."
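The "new estimate" Bos describes can be illustrated with a textbook log-odds fusion rule, which assumes a uniform prior and conditionally independent sensors. This is a minimal sketch for intuition only, not the method developed in the Michigan Tech papers; the probabilities are the deer example from above.

```python
import math

def fuse(probabilities):
    """Fuse independent per-sensor obstacle probabilities via log-odds.

    Assumes a uniform prior and conditionally independent sensors --
    a textbook simplification, not the researchers' actual pipeline.
    """
    logit = sum(math.log(p / (1.0 - p)) for p in probabilities)
    return 1.0 / (1.0 + math.exp(-logit))

# The deer example: lidar 30%, camera 50%, infrared 90%
fused = fuse([0.30, 0.50, 0.90])
print(round(fused, 2))  # prints 0.79
```

Note that the fused estimate (about 79%) is neither a majority vote nor a simple average: the confident infrared reading pulls the combined belief well above 50%, exactly the behavior Bos contrasts with "strictly voting."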

While navigating a Keweenaw blizzard is a ways out for autonomous vehicles, their sensors can get better at learning about bad weather and, with advances like sensor fusion, will be able to drive safely on snowy roads one day.

Credit: 
Michigan Technological University

Measuring the effects of radiotherapy on cancer may open up avenues for treatment

Ionizing radiation is used for treating nearly half of all cancer patients. Radiotherapy works by damaging the DNA of cancer cells, and cells sustaining so much DNA damage that they cannot sufficiently repair it will soon cease to replicate and die. It's an effective strategy overall, and radiotherapy is a common frontline cancer treatment option. Unfortunately, many cancers have subsets of cells that are able to survive initial radiotherapeutic regimens by developing mechanisms that are able to repair the DNA damage. This often results in resistance to further radiation as cancerous growth recurs. But until recently, little was known about exactly what happens in the genomes of cancer cells following radiotherapy.

To probe the traits of post-radiotherapy cancer genomics, Jackson Laboratory (JAX) Professor Roel Verhaak, Ph.D., led a team that analyzed gliomas, a brain cancer, both when they first arose and after they recurred in 190 patients. They also looked at data from nearly 3,700 other post-treatment metastatic tumors. In "Radiotherapy is associated with a deletion signature that contributes to poor outcomes in cancer patients," published in Nature Genetics, the researchers present findings showing that radiotherapy is associated with consistent genomic damage in the form of a large number of DNA deletions. The study also implicates an error-prone DNA damage repair mechanism as being an important contributor to the deletion signature. Targeting this DNA repair mechanism may help maximize the efficacy of radiotherapy. The study was co-led by Floris Barthel, M.D., a senior fellow in the group, and first author Emre Kocakavuk, M.D., a visiting student from University Hospital Essen in Germany.

Analyzing the pre- and post-treatment glioma datasets as well as the post-treatment metastatic tumor dataset, Verhaak and his team identified a significant increase of small (five- to 15-base-pair) deletions and chromosome or chromosome-arm level deletions in response to radiation. Furthermore, the post-treatment small deletion pattern had distinct characteristics that implicate canonical non-homologous end joining (c-NHEJ) as the preferred repair pathway for radiation-induced DNA damage.
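The kind of comparison described above can be sketched as a simple tally: what fraction of a tumor's deletions fall in the five- to 15-base-pair window the study highlights? The deletion-size lists below are hypothetical, for illustration only; the actual analysis in the Nature Genetics paper is far more involved.

```python
def small_deletion_fraction(deletion_sizes, lo=5, hi=15):
    """Fraction of deletions falling in the small (5-15 bp) window.

    `deletion_sizes` is a list of deletion lengths in base pairs.
    Returns 0.0 for an empty list.
    """
    if not deletion_sizes:
        return 0.0
    small = sum(1 for size in deletion_sizes if lo <= size <= hi)
    return small / len(deletion_sizes)

# Hypothetical deletion sizes (bp), invented for this sketch
pre_treatment = [2, 3, 8, 40, 120, 6]
post_treatment = [5, 7, 9, 11, 14, 3, 200, 10]

print(small_deletion_fraction(pre_treatment))   # 2 of 6 deletions are small
print(small_deletion_fraction(post_treatment))  # 6 of 8 deletions are small
```

A post-treatment shift toward a higher small-deletion fraction, aggregated across many patients, is the kind of signal the team quantified as a deletion signature.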

"c-NHEJ is an error-prone mechanism for DNA repair that cancer cells need to use to mitigate the damage done by radiation. We discovered that small deletions commonly happen as a consequence," says Verhaak.

The research also reveals that patients whose tumors acquire the high small-deletion burden have significantly shorter survival than others post-radiotherapy. The finding implies that the deletion signature is associated with a loss of sensitivity to subsequent radiotherapy and suggests that the presence of many small deletions may serve as a biomarker for radiotherapy response. Somewhat paradoxically, tumors that acquire the high-deletion-burden signature appear to become robust, possessing sufficient, if not precise, DNA repair to survive mutation-inducing treatments. Therefore, even if initial radiotherapy is effective, recurrent tumors that exhibit this genomic trait are unlikely to respond to more.

On the other hand, the study supports the hypothesis that DNA repair inhibitors likely represent an effective therapeutic strategy and could improve cancer cell response to radiotherapy. At this point, research has indeed indicated that drugs known as PARP inhibitors are effective for treating various cancer types, and preliminary work is being done to evaluate other candidate DNA repair mechanism inhibitors.

"Finding a robust signature of c-NHEJ in post-irradiated tumors is very exciting because it suggests that slowing this repair process during radiotherapy could potentially increase the effectiveness of treatment," says Barthel.

Credit: 
Jackson Laboratory