Tech

UCI engineers reveal molecular secrets of cephalopod powers

Irvine, Calif., Dec. 17, 2020 -- Reflectins, the unique structural proteins that give squids and octopuses the ability to change colors and blend in with their surroundings, are thought to have great potential for innovations in areas as diverse as electronics, optics and medicine. Scientists and inventors have been stymied in their attempts to fully utilize the powers of these biomolecules due to their atypical chemical composition and high sensitivity to subtle environmental changes.

In a study published recently in the Proceedings of the National Academy of Sciences, University of California, Irvine researchers have revealed the structure of a reflectin variant at the molecular level, and they have demonstrated a method for mechanically controlling the hierarchical assembly and optical properties of the protein. These findings are seen as key steps in exploiting many of the potentially useful attributes of the reflectin family.

"My laboratory at UCI has for a long time worked to mimic the light-scattering and light-reflecting powers of cephalopods with the goal of inventing new classes of adaptive thermoregulatory fabrics and other everyday technologies," said co-author Alon Gorodetsky, UCI associate professor of chemical and biomolecular engineering. "With this research, we have focused on developing a detailed fundamental understanding of how reflectins function at a molecular level."

Gorodetsky said scientists are attracted to reflectins because, similar to other protein-based materials, they offer many advantageous attributes such as controllable self-assembly, stimuli-responsiveness, customizable functionality and compatibility with other biological systems. The model biomaterials have also shown their usefulness for modifying the refractive index of human cells and supporting the growth of neural stem cells.

In their laboratory in UCI's Henry Samueli School of Engineering, Gorodetsky and his collaborators used bioinformatics predictions to select a reflectin variant, produced the protein in bacteria and developed solution conditions for maintaining it in a stable state.

The researchers then used a variety of tools for analysis of the protein and its solutions, including molecular dynamics simulations, small-angle x-ray scattering, and nuclear magnetic resonance spectroscopy. They also probed the assembled multimeric protein ensembles with techniques such as atomic force microscopy and three-dimensional holotomographic microscopy. These methods enabled the team to assess a full range of qualities and properties for the reflectin variant.

"Through our synergistic computational and experimental approaches, we were able to elucidate the three-dimensional structure of the reflectin variant, thereby establishing a direct correlation between the protein's structural characteristics and intrinsic optical properties," said Gorodetsky. "This research can be viewed as a valuable conceptual framework for using this class of proteins in bioengineering applications."

Gorodetsky said his team's work will enable new techniques for processing reflectin-based materials and points to new avenues for custom tailoring films of the protein at the nano- and micro-meter scales, which would be beneficial for biophotonic and bioelectronic applications as well as for inspiring the design of polymeric materials with sophisticated light-scattering capabilities. He also said the approach used in this project could help better understand the mechanisms underpinning cephalopods' ability to change color.

Credit: 
University of California - Irvine

Seizure risk forecasted days in advance with brain implant data

image: Cover image

Image: 
Mélanie Proix

Patterns of brain activity can be used to forecast seizure risk in epilepsy patients several days in advance, according to a new analysis of data obtained from clinically approved brain implants by neuroscientists at UC San Francisco, the University of Bern, and the University of Geneva.

"For forty years, efforts to predict seizures have focused on developing early warning systems, which at best could give patients warnings just a few seconds or minutes in advance of a seizure. This is the first time anyone has been able to forecast seizures reliably several days in advance, which could really allow people to start planning their lives around when they're at high or low risk," said Vikram Rao, MD, PhD, a neurologist at the UCSF Epilepsy Center, part of the UCSF Helen Diller Medical Center at Parnassus Heights. Rao was co-senior author of the new study, which was published December 17, 2020 in The Lancet Neurology.

Epilepsy is a chronic disease characterized by recurrent seizures - brief storms of electrical activity in the brain that can cause convulsions, hallucinations, or loss of consciousness. For decades, epilepsy researchers around the world have been working to identify patterns of electrical activity in the brain that signal an oncoming seizure, but with limited success. In part, study authors say, this is because technology has limited the field to recording brain activity for days to weeks at most, and in artificial inpatient settings.

At the UCSF Epilepsy Center, a major referral center for patients throughout the Western United States, Rao has pioneered the use of an implanted brain stimulation device that can quickly halt seizures by precisely stimulating a patient's brain at the first signs of an imminent seizure. This device, called the NeuroPace RNS System, has also made it possible for Rao's team to study seizure-related brain activity recorded over many months or even years in patients as they go about their normal lives -- typically unheard-of in neuroscience.

By analyzing this data, Rao and Maxime Baud, MD, PhD, a former UCSF neurology resident who is now an epileptologist at the University of Bern and the Wyss Center for Bio- and Neurotechnology in Geneva, recently discovered that seizures are less random than they appear, identifying weekly-to-monthly cycles of "brain irritability" that predict higher likelihood of having a seizure.

In their new study, Rao and Baud set out to test whether these regular patterns could be used to create clinically reliable forecasts of seizure risk.

"Currently, the perceived threat of seizures is constant for people with epilepsy, because no methods exist to identify times of high versus low risk," said Baud, who was co-senior author on the new study. "This has very broad consequences for daily activities, including avoiding potentially dangerous situations, like bathing, cooking on a hot stove and participating in sports."

Led by Timothée Proix, PhD, of the University of Geneva, the researchers built statistical models matching patterns of recorded brain activity to subsequent seizures in 18 epilepsy patients with implanted NeuroPace devices being followed at UCSF and California Pacific Medical Center in San Francisco. They then tested these forecasting algorithms using data from 157 patients who took part in the multi-center Long-Term Treatment trial of the RNS System between 2004 and 2018.

Looking back at the trial data, the researchers were able to identify periods of time when patients were nearly 10 times more likely to have a seizure than at baseline, and in some patients, signs of these periods of heightened risk could be detected several days in advance.
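
For readers curious how such a forecast can work in principle, here is a minimal sketch, not the study's actual pipeline, and with entirely invented numbers: smooth a daily measure of interictal brain activity to expose a slow cycle, flag the top of the cycle as high-risk days, and compare seizure rates on flagged versus baseline days.

```python
# Toy illustration of multiday-cycle seizure risk forecasting.
# All quantities are simulated; this is not the study's method or data.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)

# Hypothetical daily counts of interictal epileptiform activity (IEA),
# modulated by a roughly 20-day "brain irritability" cycle plus noise.
cycle = 1.0 + 0.5 * np.sin(2 * np.pi * days / 20)
iea = rng.poisson(50 * cycle)

# Hypothetical seizures, more likely near the cycle's peak.
seizures = rng.random(365) < 0.02 * cycle ** 3

# Forecast: smooth the IEA counts with a centered 7-day moving average and
# call the top quintile of smoothed values "high-risk" days.
smoothed = np.convolve(iea, np.ones(7) / 7, mode="same")
high_risk = smoothed > np.quantile(smoothed, 0.8)

rate_high = seizures[high_risk].mean()
rate_base = seizures[~high_risk].mean()
ratio = rate_high / rate_base if rate_base > 0 else float("inf")
print(f"seizure rate on high-risk days: {rate_high:.3f}")
print(f"baseline seizure rate: {rate_base:.3f} (about {ratio:.1f}x)")
```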

Of course, elevated risk of a seizure does not necessarily mean a seizure will occur. Epileptologists still do not fully understand what causes a seizure to happen at a particular moment in time, though many individuals report reliable triggers such as stress, alcohol, missed medication doses, or lack of sleep. Rao likens the system to the predictive models used by weather forecasters, which we frequently use to make decisions about what clothes to wear and whether to bring an umbrella when going out.

"I don't think I'm ever going to be able to tell a patient that she is going to have a seizure at precisely 3:17 pm tomorrow -- that's like predicting when lightning will strike," said Rao, who is Ernest Gallo Foundation Distinguished Professor of Neurology in the UCSF Weill Institute for Neurosciences. "But our findings in this study give me hope that I may someday be able to tell her that, based on her brain activity, she has a 90 percent chance of a seizure tomorrow, so she should consider avoiding triggers like alcohol and refrain from high-risk activities like driving."

Having accurate advance forecasts of seizure risk could also potentially allow neurologists to adjust patients' medication dosage accordingly, the researchers say, keeping doses low most of the time to minimize side effects and only raising dosage during times of higher seizure risk.

The researchers found significant variability in how well future seizure risk could be predicted from study participants' brain activity. While risk could be forecasted several days in advance in 40 percent of RNS System trial participants, other participants' brain data only predicted the following day's risk, and still others didn't exhibit the activity cycles needed for reliable predictions at all.

More research is needed to interpret this variability, Rao says. The RNS System itself is designed to detect and avert imminent seizures, not for advance seizure prediction, so it's possible that purpose-built devices could detect predictive fluctuations in brain activity in a broader spectrum of patients. Or it could be that epilepsy patients simply vary, as they do in many other respects, in the predictability of their risk cycles.

"It is worth remembering that, currently, patients have absolutely no information about the future--which is like having no idea what the weather tomorrow might be--and we think our results could help significantly reduce that uncertainty for many people," Rao said. "Truly determining the utility of these forecasts, and which patients will benefit most, will require a prospective trial, which is the next step."

Credit: 
University of California - San Francisco

Finding a personalized approach to treating chronic rejection after lung transplantation

ANN ARBOR, Mich. - Two new papers examine the processes of lung scarring and chronic rejection of the organ after transplantation, and potential therapies to stop the graft, or transplanted organ and its tissue, from failing.

"Chronic graft failure due to progressive scarring is the number one worry of all transplant physicians with very few means available to stop it," says Vibha Lama, M.D., a lung transplantation physician and professor and vice chair of basic and translational research in internal medicine at Michigan Medicine. "We realize that the way for us to stop it is by discovering why, and how, this scarring develops so that we can offer novel personalized therapies for patients."

Lama is the senior author of the two papers that demonstrate the link between antibodies targeting the donor lung and a particularly aggressive form of chronic rejection after lung transplantation called restrictive allograft syndrome, or RAS.

"It's a very devastating diagnosis because the prognosis is very poor with a high six-month mortality rate and no current therapeutic options," Lama says.

In the study published in JCI Insight, the research team created a mouse lung transplant model that emulates the human disease seen in RAS and showed that this specific form of chronic rejection is dependent upon B cells, a type of white blood cell that secretes antibodies. Lama says she then took the mouse model used in that study and applied it to another, published in the American Journal of Transplantation. In that paper, a group of scientists led by Lama studied interleukin 6, an inflammatory protein, and its role in chronic rejection after lung transplant.

Along with demonstrating how interleukin 6 activates cells that cause lung scarring, they found that targeting this cytokine can be beneficial.

"We show in the mouse model that if the recipient is deficient in interleukin 6, we see much less fibrosis and less chronic rejection," Lama says.

She notes that more research needs to take place, but these studies, along with her previous work, continue to shed additional light on rejection in lung transplant patients.

"The most exciting part is that therapies are available to target both B cells and interleukin 6," Lama says. "The important part for us to understand is which patients with chronic rejection will benefit from them, and these studies moves us forward in that direction. Every transplant patient is unique and we want to have therapy options available that are specific to them."

Credit: 
Michigan Medicine - University of Michigan

Satellite tracking supports whale survival

image: Among baleen whales, blue whales (Balaenoptera musculus) are endangered due to twentieth century whaling, with different subspecies impacted at various levels.

Image: 
Pixabay

Extensive satellite tracking has revealed important new knowledge about the little known pygmy blue whale population of Southern Australia.

Marine biologists have extensively tracked the movements of foraging and migrating pygmy blue whales (Balaenoptera musculus brevicauda) along the Australian continental shelf on their journey towards breeding grounds in Indonesia, as part of conservation efforts for the endangered subspecies.

A team of researchers led by Flinders University Associate Professor Luciana Möller tracked the movements of 13 individuals of the blue whale subspecies to determine important environmental habitats along foraging grounds and migratory routes in Southern and Western Australia, which incorporate major shipping and fishing routes and areas targeted for oil and gas exploration, all activities known to negatively impact whale behaviour.

The research team travelled 4236 kilometres deploying tagging equipment and recording photos of individual pygmy blue whales before tracking them up to 382 days as they travelled as much as 15,120 km during the study.

Published in Scientific Reports, the results shed light on the movements and distribution of the pygmy blue whale population to facilitate future conservation efforts for the endangered species.

Senior author and whale expert at Flinders University, Associate Professor Luciana Möller, says the study sheds light for the first time on the movements and occupancy patterns of pygmy blue whales along Southern Australia's foraging grounds and migration routes, building an understanding of potential impacts on their behaviour.

"Our tracking results provide new information and highlight the importance of understanding the movements and behaviour of pygmy blue whales in their migratory routes from Southern Australian foraging grounds to a Western Australian migratory corridor, and towards breeding grounds in Indonesia," says marine biologist Associate Professor Möller, who leads the Cetacean Ecology, Behaviour and Evolution Lab (CEBEL) and Molecular Ecology Lab at Flinders University.

"When combined with previous movement data, this information could be used to predict future whale presence and behaviour based on the forecasted effects of climate change, including in coastal and upwelling systems.

"More importantly, the ecological data can help mitigate the potential impacts of human activities such as oil and gas exploration on the little known pygmy blue whale population."

Associate Professor Möller says the tagging information reveals for the first time the importance of foraging grounds in the Great Southern Australian Coastal Upwelling System, identifying the significance of the Bonney Upwelling region and other smaller upwelling centres in Southern Australia.

"This new information, along with acoustic, sighting, genetic and past catch data, will substantially expand knowledge about the spatial distribution of this recovering blue whale population and its potential exposure to impacts from human activities throughout its travels.

"The data can contribute positively to various conservation management decisions for policymakers to consider in Australian, West Timor and Indonesian environmental legislation and forward planning, and for the development of international government collaborations to protect this little known subspecies of blue whales."

Credit: 
Flinders University

The most commonly consumed mussel species contain microplastics all around the world

"If you eat mussels, you eat microplastics." This was already known to a limited extent about mussels from individual ocean regions. A new study by the University of Bayreuth, led by Prof. Dr. Christian Laforsch, reveals that this claim holds true globally. The Bayreuth team investigated the microplastic load of four mussel species which are particularly often sold as food in supermarkets from twelve countries around the world. The scientists now present their research results in the journal Environmental Pollution.

All the samples analyzed contained microplastic particles, and the researchers detected a total of nine different types of plastic. Polypropylene (PP) and polyethylene terephthalate (PET) were the most common types of plastic. Both are plastics ubiquitous to people's everyday lives all over the world. To make the analyses of different sized mussels comparable, one gram of mussel meat was used as a fixed reference. According to the study, one gram of mussel meat contained between 0.13 and 2.45 microplastic particles. Mussel samples from the North Atlantic and South Pacific were the most contaminated. Because mussels filter out microplastic particles from the water in addition to food particles, a microplastic investigation of the mussels allows indirect conclusions to be drawn about pollution in their respective areas of origin.

The four species of mussels sampled were the European blue mussel, the greenshell mussel, the undulate venus, and the Pacific venus clam. All of the mussels sampled were purchased from grocery stores. Some of them had been farmed while some were wild catch from the North Sea, the Mediterranean Sea, the Atlantic Ocean, the South Pacific Ocean, the South China Sea, and the Gulf of Thailand.

The microplastic particles detected in the mussels ranged in size from 3 to 5,000 micrometres, i.e. from 0.003 to 5 millimetres. Special enzymatic purification was followed by chemical analysis of the particles via micro-Fourier transform infrared spectrometry (micro-FTIR) and Raman spectroscopy. "To analyze the types of microplastic, we used so-called random forest algorithms for the first time in this study, both for the immensely large micro-FTIR data sets and for the Raman measurement data. These enabled us to evaluate data quickly, automatically, and reliably," says Dr. Martin Löder, head of the plastics working group at the chair of Prof. Dr. Christian Laforsch.
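
As an illustration of this kind of automated spectral classification, here is a hedged sketch using scikit-learn's random forest on synthetic stand-in "spectra"; the peak positions, class labels and all numbers are invented and are not the Bayreuth team's real micro-FTIR data.

```python
# Toy random-forest classification of polymer "spectra" (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
wavenumbers = np.linspace(400, 4000, 200)

def fake_spectrum(peak_centers):
    """Sum of Gaussian peaks plus noise, standing in for a polymer spectrum."""
    s = sum(np.exp(-((wavenumbers - c) / 40.0) ** 2) for c in peak_centers)
    return s + rng.normal(0.0, 0.05, wavenumbers.size)

# Two invented polymer classes with different characteristic peak positions.
classes = {"PP": [1450, 2900], "PET": [1100, 1720, 2950]}
X, y = [], []
for label, peaks in classes.items():
    for _ in range(100):
        X.append(fake_spectrum(peaks))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```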

Indeed, the contamination of different organisms with microplastics has been investigated in earlier research. However, the results available to date can only be compared with each other to a very limited extent because often different analytical methods were used in the studies. "Our new study represents an important advance in terms of methodology. We have combined the latest technologies and procedures in sample preparation, measurement, and analysis of microplastic contamination in such a way that comparable results can be obtained on this basis in the future. Such methodological harmonization is an indispensable prerequisite for correctly assessing and evaluating risks potentially emanating from the spread of microplastics in the environment," says Prof. Dr. Christian Laforsch, spokesperson for the "Microplastics" Collaborative Research Centre at the University of Bayreuth, and Chair of Animal Ecology I.

Credit: 
Universität Bayreuth

2D material controls frequency-doubled light

image: The SH light produced in the MoS2 double layer has the same phase in each layer, so the overlaid second-harmonic waves have linear polarization. However, in a heterobilayer of two different stacked materials, the phase of the second-harmonic waves produced by each material differs, so the superimposed waves are elliptically polarized.

Image: 
POSTECH

Since the invention of the world's first laser - the ruby laser - in 1960, the human desire to control light has spread to various industries, including telecommunications, medicine, GPS, optical sensors and optical computers. Recently, a POSTECH research team has taken a step closer to this goal by identifying nonlinear optical phenomena occurring in heterobilayers composed of two-dimensional materials.

A nonlinear optical phenomenon is one in which the output light intensity does not simply double when the input intensity is doubled, and the output contains frequencies different from those of the input. The phenomenon is easily understood if you think of electrons and nuclei as oscillators connected by springs. When the spring is driven at a constant cycle, light is generated by the oscillation of the electrons and nuclei. If the driving force is small, only light with the same frequency as the applied force is produced, but when a strong force is exerted, light with multiple frequencies is produced. Among these, light with twice the input frequency is called second-harmonic generation (SHG). SHG can occur only in substances that are not point-symmetric, and it has recently been discovered that its efficiency is high in 2D semiconductor crystals such as molybdenum disulfide (MoS2) and tungsten disulfide (WS2).
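
For the mathematically inclined, the textbook version of this picture is the nonlinear polarization expansion; this is standard optics, not a formula specific to the paper:

```latex
% Induced polarization P responds nonlinearly to the driving field E:
\[
  P(t) = \varepsilon_0\left(\chi^{(1)} E(t) + \chi^{(2)} E(t)^2 + \cdots\right),
  \qquad E(t) = E_0 \cos(\omega t).
\]
% The quadratic term contains a component at twice the input frequency:
\[
  \chi^{(2)} E_0^2 \cos^2(\omega t)
  = \tfrac{1}{2}\,\chi^{(2)} E_0^2 \left(1 + \cos(2\omega t)\right),
\]
% i.e. second-harmonic light at 2*omega. Because chi^(2) vanishes in
% point-symmetric (centrosymmetric) media, SHG requires crystals without
% inversion symmetry, such as MoS2 or WS2 monolayers.
```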

A research team led by Professor Sunmin Ryu and Wontaek Kim, in the MS/PhD integrated program in the Department of Chemistry at POSTECH, noted that the second harmonic produced by a heterobilayer material (MoS2/WS2) could not be explained by the existing model, and confirmed that this was caused by interference between SHG waves of different phases. The team inferred the phase difference from polarization spectroscopy of the heterobilayers, which showed elliptically polarized SHG light. The phase difference measured directly with a second-harmonic interferometer was quantitatively consistent with the polarization results, proving the hypothesis. In addition, density functional theory (DFT) calculations supported these results.
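
Schematically, the elliptical polarization arises from superposing the two layers' second-harmonic fields; a minimal two-component picture (with amplitudes and phase treated abstractly, not taken from the paper) looks like this:

```latex
% Superposition of the two layers' SHG fields with relative phase Δφ:
\[
  \mathbf{E}_{2\omega}(t) =
  A_{1}\,\hat{\mathbf{e}}_{1}\cos(2\omega t)
  + A_{2}\,\hat{\mathbf{e}}_{2}\cos(2\omega t + \Delta\varphi).
\]
% For Δφ = 0 or π the two components rise and fall together, so the total
% field is linearly polarized; for any other Δφ the field vector traces an
% ellipse, i.e. elliptically polarized SHG, as observed from MoS2/WS2.
```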

So far, SHG studies of 2D materials have mostly been limited to intensity; this is the first time the SHG phase has been measured and a phase difference between two materials demonstrated. The research thus shows that the phase of SHG can be controlled.

"The conventional research was biased toward identifying the orientation of 2D crystal samples using SHG intensity and controlling it through external stimuli," remarked Professor Sunmin Ryu who led the study. He added, "This study not only broadened our understanding of nonlinear optical phenomena of 2D materials, but also opened new possibilities for nonlinear spectroscopic control methods." He concluded, "The research results are expected to greatly contribute to the control of nonlinear optical phenomena by using 2D materials to produce new photons with twice the frequency of vibration and controlled phase."

Credit: 
Pohang University of Science & Technology (POSTECH)

New discovery opens novel pathway for high-titer production of drop-in biofuels

image: The lab-setup of the light incubator showing two different experimental blue light setups.

Image: 
Photo: Jingbo Li, MIT

Using an unusual, light-dependent enzyme and a newly discovered enzymatic mechanism, researchers from Aarhus University and the Massachusetts Institute of Technology have enabled high-yield, industrially relevant biological production of climate-neutral drop-in fuels from biowaste.

A special light-dependent enzyme, first discovered about three years ago, is the focal point of a new scientific discovery that enables high-yield production of drop-in biofuels from biomass.

In a study now published in Nature Communications, engineers from Aarhus University and the Massachusetts Institute of Technology have shown that the original assumption about the enzymatic process in this biomass-to-biofuel conversion is actually wrong.

The findings have allowed the researchers to successfully biosynthesize green fuels at close to industrially relevant levels of 1.47 grams per liter from glucose.

The light-dependent enzyme, which originates from microalgae, has the particular characteristic that it can decarboxylate fatty acids into alkanes (thus converting cellulosic biomass into drop-in biofuels) using blue light as the only source of energy.

The researchers artificially insert the enzyme into cells of the oleaginous yeast Yarrowia lipolytica, thereby engineering its metabolism. The yeast converts glucose originating from biomass into lipids (specifically, free fatty acids and fatty acyl-CoAs), which the enzyme, fatty acid photodecarboxylase (FAP), then converts into alkanes.

Ever since the discovery of the enzyme, however, it has been assumed that free fatty acids are its preferred reactant in the FAP process, and that an abundance of free fatty acids would therefore result in higher biofuel yields.

That assumption turns out to be wrong.

"In our study, we have proved that fatty acyl-CoA - and not free fatty acid - is the preferred reactant for the light-dependent enzyme. This finding has been successfully used in our study to metabolize 89 per cent of fatty acyl-CoA into alkanes, reaching titers of 1.47 g/l from glucose," says Bekir Engin Eser, an assistant professor at Aarhus University.

Most oleochemical-based drop-in fuels today are made by converting 'conventional' oleochemicals such as vegetable oils, used cooking oils, tallow, and other lipids into hydrocarbons (mainly alkanes) using energy-intensive chemical treatment methods.

However, sourcing large quantities of more or less sustainable lipid feedstocks at a low enough cost for profitable drop-in biofuel production remains a challenge that severely limits the expansion of this production platform. Furthermore, this production competes with the food supply.

Biosynthesis constitutes a cheap and sustainable solution, where the production is instead based on the conversion of cellulosic biomass - the most abundant renewable natural biological resource available on Earth.

Biological synthesis of alkanes from fatty acids is not a native, preferred metabolic pathway for the yeast, however, since alkanes are toxic to its cells. The researchers therefore introduce genes encoding specialized enzymes into the yeast's cells for this purpose.

The new discovery is a possible breakthrough in biosynthesis of drop-in fuels, since the researchers - for the first time ever using this process - have utilized the new knowledge to synthesize green fuels at a level that's relevant for future industrial production:

"Previous metabolic engineering studies would target maximizing the concentration of free fatty acids in the cells that are being engineered. But now, with this discovery, we know that it is fatty acyl-CoA that needs to be maximized. This is important news for synthetic biology applications, and we can now begin to maximize the flux of the fatty acyl-coA into this engineered metabolic pathway to reach even higher titers in the future," says Associate Professor Zheng Guo from Aarhus University.

Credit: 
Aarhus University

Big data will analyze the mystery of Beethoven's metronome

Data science and physics researchers at the Universidad Carlos III de Madrid (UC3M) and UNED have analysed a long-running controversy over Beethoven's annotations of the tempo (the playing speed) of his works, marks that many performers consider too fast. In the study, published in the journal PLOS ONE, they note that this deviation could be explained by the composer reading the metronome incorrectly when using it to measure the beat of his symphonies.

Ludwig van Beethoven (1770-1827) was one of the first composers to use a metronome, a device patented by Johann Nepomuk Maelzel in 1815. From that time, he began to publish his works with numerical metronome marks. Doubts about the validity of these marks date back to the 19th century, and during the 20th century many musicological analyses were carried out, some of which already pointed to the hypothesis that the metronome was broken, an assumption that could never be verified. In any case, most orchestra conductors, in the Romantic tradition, have ignored these marks, considering them too fast, whereas since the 1980s conductors of the Historicist movement have used them to play Beethoven. However, music critics and the public have described those concerts as frantic and even unpleasant.

Previous scientific research, such as Sture Forsén's study in 2013, has pointed to several defects that may have affected the metronome, causing it to run slower, which would have led the composer from Bonn to choose faster marks than those he actually intended. In order to validate this explanation, researchers from UC3M and UNED systematically compared the metronome marks with contemporary interpretations. This required physics to model the metronome mathematically, along with skills in data analysis, computing, usability and, of course, music. Overall, they analysed the tempo and its variations for each movement of the symphonies in 36 recordings by 36 different conductors, a total of 169 hours of music.

"Our study has revealed that conductors tend to play slower than Beethoven indicated. Even those who aim to follow his directions to the letter! The tempi indicated by the composer are, in general, too fast, to the point that, collectively, musicians tend to slow them down," says Iñaki Ucar, one of the authors of this research, data scientist at the UC3M's Big Data Institute, and clarinetist. This slowing down follows, on average, a systematic deviation, so it is not random, but conductors tend to play consistently below Beethoven's marks. "This deviation could be explained by the composer reading the scale of the apparatus in the wrong place, for example, under the weight instead of above. Ultimately, this would be a problem caused by using new technology," says Almudena Martín Castro, the other author of the study, user experience designer and pianist, who carried out this research within the framework of her Bachelor Thesis for her Degree in Physics at UNED.

In this study, the researchers developed a mathematical model of the metronome based on a double pendulum, refined with three types of corrections that take into account the amplitude of its oscillation, the friction of its mechanism, the impulse force, and the mass of its rod, an aspect that had not been considered in previous work. "With the help of this model, we developed a methodology for estimating the original parameters of Beethoven's metronome from available photographs and the patent outline," the authors explain. In addition, they dismantled a modern metronome to measure it and used it to validate both the mathematical model and the methodology.
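
To see why the weight position sets the tempo, consider a metronome idealized as a compound pendulum; this is a minimal textbook sketch that ignores the paper's corrections for amplitude, friction, impulse and rod mass:

```latex
% Fixed lower mass M at distance R below the pivot, sliding upper weight m
% at distance r above it. For small oscillations,
\[
  \omega^{2} = \frac{g\,(M R - m r)}{M R^{2} + m r^{2}},
  \qquad \text{tempo} \propto \omega .
\]
% Sliding the upper weight up (larger r) reduces omega and slows the beat,
% which is why the BPM scale runs from fast at the bottom to slow at the
% top. Misreading the scale by the height of the weight therefore shifts
% every mark in the same direction, consistent with the homogeneous
% deviation the authors describe.
```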

The researchers tried to identify a "break" in the metronome that could give rise to the slow tempi usually followed by musicians. They tried changing the metronome's mass (it may have been damaged and a piece may have fallen off), moving the mass along the rod, increasing the friction (the metronome may have been poorly lubricated), and even tested the assumption that the apparatus may have stood tilted, leaning on the piano while the composer was creating his music. "None of the hypotheses matched what the data told us, which is a homogeneous slowdown of the tempi across the entire scale. Finally, we noticed that the deviation matches the size of the metronome's weight exactly, and we also found the annotation '108 or 120' on the first page of the manuscript of his Ninth Symphony, which indicates that the composer doubted where to read at least once. Suddenly, it all made sense: Beethoven could have written down many of these marks by reading the tempo in the wrong place," they explain.

This methodology could be applied when investigating the work of other classical composers, as they are able to extract the tempo from a musical recording and clean up the data so they can be compared. "Studying the relationship between the tempo played and marks from other composers would be very interesting, or even looking for the 'correct tempo' for composers who did not leave any metronomic marks. Is it possible that there is an average tempo at which people usually interpret Bach's fugues, for example?" they ask.

Credit: 
Universidad Carlos III de Madrid

LSU health research suggests new mechanism to balance emotional behavior

New Orleans, LA - Research led by Si-Qiong June Liu, MD, PhD, Professor of Cell Biology and Anatomy at LSU Health New Orleans School of Medicine, discovered a surprising reciprocal interaction between chemicals in the brain resulting in accelerated loss of molecules that regulate brain cell communication. The research team's findings are published online in Nature Communications.

Working in a rodent model, the researchers showed that the release of Gamma-Aminobutyric acid (GABA), an amino acid that acts as a neurotransmitter, hastens the breakdown of endocannabinoids in the brain. Endocannabinoids are naturally produced molecules that regulate how brain cells communicate, and their dysfunction can lead to neurological disorders. Endocannabinoids are produced "on-demand" and are removed when they are no longer needed. The researchers found that GABA upsets this delicate balance. Endocannabinoids are critically involved in several aspects of emotional memory processing, and the researchers found that memory formation through fear conditioning selectively speeds up their decline in the cerebellum. The findings reveal a potential therapeutic target to regulate the rate of degradation of endocannabinoids and provide an effective way to alter behavior.

"Endocannabinoids control emotional behavior," notes Dr. Liu. "Learning increased the release of the inhibitory neurotransmitter, GABA, and this was responsible for driving the change in endocannabinoid degradation. This form of plasticity is responsible for the formation of fear memory. Our findings suggest a novel mechanism for the physiological regulation of endocannabinoid signaling and for modulating emotional behavior."

Credit: 
Louisiana State University Health Sciences Center

HSE researchers use neural networks to study DNA

image: Maria Poptsova, Head of the Laboratory of Bioinformatics (HSE Faculty of Computer Science)

Image: 
Maria Poptsova

HSE scientists have proposed a way to improve the accuracy of finding Z-DNA, or DNA regions that are twisted to the left instead of to the right. To do this, they used neural networks and a dataset of more than 30,000 experiments conducted by different laboratories around the world. Details of the study are published in Scientific Reports.

Over the 67 years that have passed since the discovery of the structure of DNA, scientists have found many structural variations of this molecule. Sometimes DNA structures do not at all resemble the usual double helix, which is called B-DNA: they can differ from B-DNA by the number of chains (from two to four), chain density and thickness, the way in which the nitrogenous bases are joined, and the direction of the twist of the helix.

One of the structures, Z-DNA, is composed of a double helix, twisted differently - to the left instead of to the right. It is known that regions of Z-DNA are found in the cells of various organisms (from bacteria to humans), arise under certain conditions (for example, in supercoiled DNA or high salt concentration), and can be combined with other DNA structures in one molecule. For example, if, for some reason, the B-DNA molecule is supercoiled to the extent that it complicates transcription (synthesis of RNA based on DNA), some of its sections can twist in the opposite direction, thereby relieving unnecessary 'stress'. Scientists also suggest that Z-DNA can regulate transcription and increase the likelihood of mutations. Some research suggests that the formation of Z-DNA may be associated with certain diseases such as cancer, diabetes, and Alzheimer's. Recently, more and more studies have appeared that show the role of Z-DNA in the innate immune response--the reaction to viruses and other pathogens within the cell itself.

To learn more about the conditions of formation and the biological role of Z-DNA regions, it is necessary to have methods to find their location in the genome. The first genetic map with the markup of Z-DNA sites was compiled back in 1997, based on experimental data on the structural binding of consecutive nucleotides. In recent years, methods have emerged in which the location of regions other than B-DNA is predicted using computer algorithms. Advances in machine learning have made it possible to use another powerful tool for this task--neural networks. Unlike most methods, neural networks can take many factors into account and do not require scientists to preselect the few factors most likely to be influential. But even for neural networks, the search for Z-DNA remains a difficult task, since there is not enough experimental data: Z-DNA appears and disappears, and an experiment records only a small fraction of these regions. The researchers decided to test whether the accuracy of the neural networks increases with the addition of omics data, that is, information on how gene activity and protein synthesis in cells are regulated.

The scientists began by comparing how three types of neural networks - convolutional, recurrent, and a combination of the two - handle the task. A convolutional neural network is most often used for image processing, while a recurrent neural network is most often used to analyze texts. All three types of neural networks have already been tested on problems related to the study of the genome. In total, the authors of the study trained and evaluated 151 models on the DNA dataset augmented with omics data. One of the recurrent neural networks, which the authors named DeepZ, yielded the best results, and they used it to predict novel Z-DNA regions in the human genome. Its accuracy significantly exceeds that of the existing algorithm, Z-Hunt.
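
For readers who want a concrete picture, a toy version of such a per-nucleotide recurrent tagger might look as follows (it requires PyTorch); the architecture, layer sizes and feature counts are invented for illustration and are not the published DeepZ code.

```python
# Toy DeepZ-style recurrent model: per-nucleotide Z-DNA probability from
# one-hot sequence plus omics feature tracks. All sizes are invented.
import torch
import torch.nn as nn

class ZDNATagger(nn.Module):
    def __init__(self, n_omics_tracks=10, hidden=64):
        super().__init__()
        # 4 channels for one-hot A/C/G/T plus the omics tracks.
        self.rnn = nn.GRU(4 + n_omics_tracks, hidden,
                          batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):          # x: (batch, seq_len, 4 + n_omics_tracks)
        h, _ = self.rnn(x)
        return torch.sigmoid(self.head(h)).squeeze(-1)  # (batch, seq_len)

model = ZDNATagger()
dummy = torch.randn(2, 500, 14)    # two windows of 500 nt, 14 features each
probs = model(dummy)               # per-nucleotide Z-DNA probabilities
print(probs.shape)                 # torch.Size([2, 500])
```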

With the help of DeepZ, the scientists mapped the entire sequence of the human genome, determining for each nucleotide the likelihood that it will end up inside a Z-DNA region. A sequence of several nucleotides for which the probability exceeded a certain threshold value was marked as a potential target site.
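
The thresholding step itself is simple to sketch: turn per-nucleotide probabilities into candidate regions as contiguous runs above a cutoff. The cutoff value and probabilities below are arbitrary illustrations.

```python
# Turn per-nucleotide probabilities into candidate Z-DNA intervals.
import numpy as np

def calls_to_regions(probs, threshold=0.5):
    """Return (start, end) half-open intervals where probs > threshold."""
    above = np.concatenate(([False], probs > threshold, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    return [(int(a), int(b)) for a, b in zip(edges[::2], edges[1::2])]

probs = np.array([0.1, 0.7, 0.9, 0.8, 0.2, 0.6, 0.9, 0.1])
print(calls_to_regions(probs))  # [(1, 4), (5, 7)]
```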

'The results of this study are important because, with the help of neural networks, we were not only able to replicate the experiments, but also predict the potential sites of Z-DNA formation in the genome,' said Maria Poptsova, leader of the study and Head of the Laboratory of Bioinformatics at the Faculty of Computer Science at HSE University. 'The abundance of Z-DNA signals suggests that they are actively used to turn genes on and off. This is a faster signal than the genomic motifs. For example, a study by a group of scientists from Australia has shown that Z-DNA serves as a signal in training to suppress fear. Apparently, Z-DNA evolutionarily appeared in cases when a quick reaction to events was required. We plan to initiate joint projects with experimental groups to test the predictions.'

The authors demonstrated a novel approach to predicting Z-DNA regions using omics data and deep learning methods. The neural-network-generated genome markup will help scientists conduct experiments to detect Z-DNA, the full spectrum of which is just beginning to emerge.

Credit: 
National Research University Higher School of Economics

A phantom training program may help acclimate heifers to an automatic milking system

image: The phantom of a milking robot is built similarly to the Lely Astronaut A4.

Image: 
Lely Industries NV, Maassluis, the Netherlands

Philadelphia, December 17, 2020 - A new study appearing in the Journal of Dairy Science indicates that heifers that participated in a training program using a phantom before introduction to an automated milking system (AMS) visited the actual AMS more frequently, thereby potentially increasing milk yield. Acclimating the herd to an unfamiliar milking robot in advance is a potential solution to decrease stress for animals and farm employees.

"Overall, training on the phantom provided the animals with the necessary amount of experience to perform well with the actual milking robot and to achieve a higher number of voluntary milking visits," said Almuth Einspanier, PhD, Institute of Physiological Chemistry, Leipzig University, Leipzig, Germany. "Therefore, training on an AMS phantom is a good alternative to a training program directly at the AMS, with some important advantages, and can be an important contribution to improving animal welfare in dairy farming."

Since their market launch 28 years ago, AMS have been gaining popularity as a way to increase daily milk yield through increased milking frequency and to allow cows to decide individually when to be milked. These benefits, however, depend in large part on the animals' acceptance of the AMS and on them visiting it voluntarily and without human assistance for milking.

The authors of this study, from Leipzig University, Martin Luther University Halle-Wittenberg, Agricultural Society Ruppendorf AG, and MAP Meißener Agrarprodukte AG, randomly assigned 77 Holstein-Friesian heifers to either a control group or phantom group. The phantom group was given free access to the phantom for four weeks before calving so that they could explore it and be positively conditioned by feeding concentrate; the control group had no contact with the phantom or the AMS before the first milking at the AMS. Fecal cortisol concentrations and rumination times of the animals were measured to assess their stress level.

The heifers trained on the phantom showed a significantly higher number of milking visits, leading to the conclusion that they were familiar with the AMS and therefore entered the milking robot more often. In addition, the proportion of trained heifers that had to be driven into the AMS was significantly lower than in the control group, indicating that they accepted the AMS readily and were in a better position to implement regular and voluntary milking visits.

The study had hypothesized that animals previously trained on a phantom would undergo less stress at the actual milking robot; in fact, fecal cortisol concentrations did not significantly increase when the animals were introduced into the AMS, and there was no significant difference between the two experimental groups. In addition, the results of the study suggest that training on a phantom had no significant effect on rumination time or lactation performance.

The study further illustrated that training on a phantom may facilitate heifers' start into early lactation.

Professor Einspanier added, "The increased number of milking visits and the reduced proportion of animals that had to be fetched into the AMS for milking indicate that training on the phantom prepares the animals well for being milked in the AMS."

Credit: 
Elsevier

Fertilizer runoff in streams and rivers can have cascading effects, analysis shows

Fertilizer pollution can have significant ripple effects in the food webs of streams and rivers, according to a new analysis of global data. The researchers also found some detection methods could miss pollution in certain types of streams.

The analysis, published in Biological Reviews, combined the results of 184 studies, comprising 885 individual experiments around the globe, that investigated the effects of adding nitrogen and phosphorus, the main components of fertilizer, to streams and rivers. While the analysis only included studies where scientists added nitrogen and phosphorus experimentally, nitrogen and phosphorus pollution can run off from farms into streams, lakes, and rivers - as well as from wastewater discharge. At high levels, fertilizer pollution can cause harmful algal blooms and can lead to fish kills.

"Overall, we found that high levels of nutrients affect streams and rivers everywhere," said the study's lead author Marcelo Ardón, associate professor of forestry and environmental resources at North Carolina State University. "Wherever we looked, we saw increases in the abundance and biomass of organisms that live in streams, and also the speeding up of processes that happen in streams - how fast algae grow, how fast leaves decompose, and how fast organisms grow that feed on them."

Across the studies, the researchers saw that nitrogen and phosphorus led to increased growth across the food web, such as in algae, the insects that eat the algae and the fish that eat the insects. In shaded streams where algae don't grow, they reported, nitrogen and phosphorus sped up the decomposition of leaves and boosted the growth of organisms that feed on them.

"We saw an average 48 percent increase overall in biomass abundance and activity in all levels of the food web," Ard?n said. "We also found that the food webs responded most strongly when both nitrogen and phosphorus were added together."

While experts already use the presence of a specific type of chlorophyll - chlorophyll a - in water to detect algae growth, researchers said using that method could miss pollution in waterways where algae do not grow, and where decomposition of leaves or other plant matter is the primary source of food for other organisms.

"The food webs in those streams don't depend on algae - the trees shade out the algae," Ardón said. "The streams there depend on leaves that fall in and decompose, which is what the insects, such as caddisflies and stoneflies, are eating. In those detrital-based streams, we found similar responses to increases in nitrogen and phosphorus as has been found in algae."

Another finding was that factors such as light, temperature, and baseline concentrations of nitrogen and phosphorus impacted the response to increases in the two nutrients.

"All of those things will determine how much of a response you get to increased nitrogen and phosphorus," said study co-author Ryan Utz of Chatham University.

The findings have implications for environmental policy, Ardón said.

"The EPA has been asking states to come up with ways to reduce runoff of nitrogen and phosphorus into streams, because we know they can cause these really big problems," said Ardón. "We know that at a big scale, and we don't really know the details. A lot of states that are coming up with criteria to reduce the amount of nutrients in the water focus only on algal responses. Our study suggests regulators should expand their view."

The study, "Experimental nitrogen and phosphorus enrichment stimulates multiple trophic levels of algal and detrital-based food webs: a global meta-analysis from streams and rivers," was published Dec. 17 in Biological Reviews. The study was authored by Marcelo Ardón, Lydia H. Zeglin, Ryan M. Utz, Scott D. Cooper, Walter K. Dodds, Rebecca J. Bixby, Ayesha S. Burdett, Jennifer Follstad Shah, Natalie A. Griffiths, Tamara K. Harms, Sherri L. Johnson, Jeremy B. Jones, John S. Kominoski, William H. McDowell, Amy D. Rosemond, Matt T. Trentman, David Van Horn and Amelia Ward. The study was funded by the National Science Foundation under grant DEB-0832653 through the Long Term Ecological Research Network Office. Individual authors were funded by the National Science Foundation under grant DEB-1713502, and the Department of Energy's Office of Science, Biological and Environmental Research.

Credit: 
North Carolina State University

Interventional radiology associated with an increased risk for preventable adverse events

(Boston)--Medical errors pose a serious threat to patient safety and are estimated to account for more than 250,000 deaths in the U.S. each year. While few studies have analyzed the frequency and nature of adverse events in interventional radiology (minimally invasive, image-guided procedures), published data suggest that many adverse events in this field are preventable and frequently involve technical mishaps such as improper device positioning and device misuse or malfunction.

In a review article in the journal Radiology, researchers from Boston University School of Medicine (BUSM) suggest there is a critical need to renew understanding of adverse events and complications within interventional radiology. They also call for a robust recommitment to patient safety and quality assurance in clinical practice, continuing medical education and graduate medical education.

According to the researchers, most interventional radiology procedures are successful and the majority of patients undergo no adverse events or minor complications. "When complications do occur, however, they can be associated with considerable morbidity and treatment of these complications can lead to more invasive correctional procedures, thus exposing the patient to even greater cumulative risk of harm," explained corresponding author Mikhail C.S.S. Higgins, MD, MPH, assistant professor of radiology at BUSM.

While interventional radiology is associated with numerous systemic factors that carry an increased risk of medical error and may result in preventable patient harm, Higgins believes these risks may be lessened by further acknowledging the unique factors that facilitate adverse events in the specialty and continuing the development of safety practices that address and strengthen known areas of weakness.

Higgins stresses that widespread cultural changes that encourage blame-free error reporting, along with education initiatives focused on prevention and management of procedural complications, can help improve safety practices in interventional radiology. "With an improved understanding of the causes of medical error and the nature of complications, physicians may begin to take the precautions necessary to mitigate the deleterious impact of medico-legal exposure," adds Higgins, who also serves as associate director for the Early Specialization in Interventional Radiology program at Boston Medical Center and the Founding Chair of the Radiology Interventions Safety, Quality and Complications Symposium (RISQCS).

"By choosing to embrace a reinvigorated commitment to patient safety and quality assurance in interventional radiology practice and education, the specialty can continue its steadfast evolution on a progressive trajectory that ensures a continued and more optimized quality of care for its patients."

Credit: 
Boston University School of Medicine

Computational model reveals how the brain manages short-term memories

image: From left: Terrence Sejnowski and Robert Kim.

Image: 
L: Salk Institute; R: Courtesy of Robert Kim

LA JOLLA--(December 17, 2020) If you've ever forgotten something mere seconds after it was at the forefront of your mind--the name of a dish you were about to order at a restaurant, for instance--then you know how important working memory is. This type of short-term recall is how people retain information for a matter of seconds or minutes to solve a problem or carry out a task, like the next step in a series of instructions. But, although it's critical in our day-to-day lives, exactly how the brain manages working memory has been a mystery.

Now, Salk scientists have developed a new computational model showing how the brain maintains information short-term using specific types of neurons. Their findings, published in Nature Neuroscience on December 7, 2020, could help shed light on why working memory is impaired in a broad range of neuropsychiatric disorders, including schizophrenia, as well as in normal aging.

"Most research on working memory focuses on the excitatory neurons in the cortex, which are numerous and broadly connected, rather than the inhibitory neurons, which are locally connected and more diverse," says Terrence Sejnowski, head of Salk's Computational Neurobiology Laboratory and senior author of the new work. "However, a recurrent neural network model that we taught to perform a working memory task surprised us by using inhibitory neurons to make correct decisions after a delay."

In the new paper, Sejnowski and Robert Kim, a Salk and UC San Diego MD/PhD student, developed a computer model of the prefrontal cortex, an area of the brain known to manage working memory. The researchers used learning algorithms to teach their model to carry out a test typically used to gauge working memory in primates--the animals must determine whether a pattern of colored squares on a screen matches one that was seen several seconds earlier.

Sejnowski and Kim analyzed how their model was able to perform this task with high accuracy, and then compared it to existing data on the patterns of brain activity seen in monkeys carrying out the task. In both tests, the real and simulated neurons involved in working memory operated on a slower timescale than other neurons.

Kim and Sejnowski found that good working memory required both that long-timescale neurons be prevalent, and that connections between inhibitory neurons--which suppress brain activity--be strong. When they altered the strength of connections between these inhibitory neurons in their model, the researchers could change how well the model performed on the working memory test as well as the timescale of the pertinent neurons.
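
A toy simulation conveys the flavor of such a model; this is not the published network, and every parameter below is illustrative. Each unit is fixed as excitatory or inhibitory (Dale's principle), and a single knob, here called g_ii, scales the inhibitory-to-inhibitory connection strength, analogous to the connection strengths the researchers varied.

```python
# Toy rate network with fixed excitatory/inhibitory unit types and a knob
# g_ii scaling the inhibitory-to-inhibitory weights. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n_exc, n_inh = 40, 10
n = n_exc + n_inh
sign = np.array([1.0] * n_exc + [-1.0] * n_inh)   # Dale's law: fixed sign

# One random base connectivity; column j's sign is the presynaptic unit's type.
W0 = np.abs(rng.normal(0.0, 0.3, (n, n))) * sign[None, :]

def simulate(g_ii, steps=200, dt=0.1, tau=1.0):
    """Run the rate network with the I-to-I block scaled by g_ii."""
    W = W0.copy()
    W[n_exc:, n_exc:] *= g_ii          # scale inhibitory-inhibitory strength
    r = np.zeros(n)
    stim = np.zeros(n)
    stim[:5] = 1.0                     # brief cue to a few excitatory units
    for t in range(steps):
        inp = stim if t < 20 else 0.0  # stimulus only at the start
        r = r + dt / tau * (-r + np.tanh(W @ r + inp))
    return r[:n_exc].mean()            # mean excitatory activity after delay

for g in (0.5, 2.0):
    print(f"g_ii = {g}: delay-period activity = {simulate(g):.3f}")
```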

The new observations point toward the importance of inhibitory neurons, and could inspire future research on the role of these cells in working memory, the researchers say. They also could inform studies on why some people with neuropsychiatric disorders, including schizophrenia and autism, struggle with working memory.

"Working memory impairment is common in neuropsychiatric disorders, including schizophrenia and autism spectrum disorders," says Kim. "If we can elucidate the mechanism of working memory, that's a step toward understanding how working memory deficits arise in these disorders."

Credit: 
Salk Institute

Crops near Chernobyl still contaminated

Crops grown near Chernobyl are still contaminated due to the 1986 nuclear accident, new research shows.

Scientists analysed grains including wheat, rye, oats and barley and found concentrations of radioactive isotopes - strontium 90 and/or caesium 137 - above Ukraine's official safe limits in almost half of samples.

The researchers also examined wood samples and found three quarters contained strontium 90 concentrations above Ukrainian limits for firewood.

The study was carried out by the Greenpeace Research Laboratories at the University of Exeter and the Ukrainian Institute of Agricultural Radiology.

"We focussed on strontium 90 because it is known to be currently present in soil mostly in bioavailable form, meaning it can be taken up by plants," said lead author Dr Iryna Labunska, of the Greenpeace Research Laboratories at the University of Exeter.

"Ukrainian government monitoring of goods containing strontium 90 ended in 2013, but our study clearly shows this needs to continue.

"People need to be aware of the ongoing contamination of soil and plants, and they need to be advised on the safest agricultural and remediation methods.

"We found very high levels of strontium 90 in wood ash - yet many people still use ash from their fires as a crop fertiliser."

Dr David Santillo, also of the Greenpeace Research Laboratories, added: "Our findings point to ongoing contamination and human exposure, compounded by lack of official routine monitoring.

"This research also highlights the potential for Chernobyl-derived radiation to be spread more widely again as more and more wood is used for power generation in the region."

The study analysed 116 grain samples collected during 2011-19 from fields in 13 settlements in the Ivankiv district of Ukraine - about 50km south of the power plant and outside its "exclusion zone".

Wood samples - mostly pine - were collected from 12 locations in the same district during 2015-19.

The study found:

- 45% of grain samples from the north-east part of the Ivankiv district contained strontium 90 at above permissible levels for human consumption. This situation will likely persist for at least another decade.

- Taking both strontium 90 and caesium 137 into account, combined activity concentrations of these isotopes were above permissible levels in 48% of grain samples.

- Nevertheless, modelled data show that the greater part of the Ivankiv district could produce grains containing strontium 90 below corresponding Ukrainian permissible levels.

- In the case of wood, it is estimated that levels of strontium 90 could exceed permissible levels for firewood in forest woods from vast areas in the north-east of the Ivankiv district.

- Wood from these territories may still contain strontium 90 above permissible levels by the end of this century.

- In one sample of ash from a domestic wood-burning oven, strontium 90 was found at a level 25 times higher than in the most contaminated wood sample collected in this study.
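
The persistence estimates above follow from simple radioactive decay arithmetic; here is a sketch with an invented sample activity and limit, where the only physical input is the roughly 28.8-year half-life of strontium 90.

```python
# Years until a sample's strontium-90 activity falls below a permissible
# level, by exponential decay. Starting activity and limit are invented.
import math

half_life = 28.8                    # years, approximate Sr-90 half-life
activity0 = 40.0                    # Bq/kg in a hypothetical grain sample
limit = 20.0                        # hypothetical permissible level, Bq/kg

years = half_life * math.log(activity0 / limit) / math.log(2)
print(f"falls below the limit after ~{years:.0f} years")  # ~29 years
```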

The authors recommend:

- Reinstating environmental and food monitoring programmes, and ensuring these are properly financed into the future.

- Government-led agricultural policies such as liming and using organic fertilisers, which could reduce strontium 90 concentrations by about half.

- Decreasing or eliminating the use of radioactively contaminated wood in fires.

- Establishing a programme to monitor radioactive contamination of ash in both households and at the local thermal power plant (TPP).

- Providing the population with information on safe handling of radioactively contaminated ash, and establishing a centralised disposal service for such ash.

Professor Valery Kashparov, Director of the Ukrainian Institute of Agricultural Radiology, added: "Contamination of grain and wood grown in the Ivankiv district remains of major concern and deserves further urgent investigation.

"Similarly, further research is urgently needed to assess the effects of the Ivankiv TPP on the environment and local residents, which still remain mostly unknown."

In a previous study, the researchers found that milk in parts of Ukraine had radioactivity levels up to five times over the country's official safe limit.

Credit: 
University of Exeter