Earth

6,000-8,000 km round-trip flight of migratory wading birds tracked

image: Little ringed plover. Their eggs are speckled, allowing for camouflage on gravel.

Image: 
Satoe Kasahara Ph.D., Suwa Hydrobiological Station, Faculty of Science, and Institute of Mountain Science, Shinshu University

The next time you eat a bowl of rice, consider that the paddy fields that produced it may have allowed an inland migratory bird to refuel on insects during a monumental journey of thousands of kilometers across oceans.

Scientists in Japan set out to track the journeys of migratory birds using GPS tracking devices, which provide detailed data on their routes. The little ringed plover (Charadrius dubius) is an inland freshwater wader. The birds in this study migrate along the East Asian-Australasian flyway, although little ringed plovers are found in many parts of the world, including Europe. Unlike the flyways of coastal migratory species, theirs had not been studied in depth.

The research group led by Satoe Kasahara of Shinshu University began studying the birds at the gravel bank of the Chikuma River in Nagano City in the early summer of 2017. This location is very close to where Shinkansen bullet trains were submerged during Typhoon Hagibis in October 2019. In 2017, at the time of the study, the river had also flooded (to a lesser extent), which shortened the breeding period of the little ringed plover. Despite the unfavorable odds, uncertainties and obstacles, the research team successfully completed the study, which relied on the birds returning to the same site a year later, a likelihood estimated at about 30%.

"I have nothing but gratitude for the plovers," Assistant Professor Kasahara reiterated; without them, the researchers could not have elucidated the routes and habitat of plovers. By understanding their favored grounds and annual life cycle activities, conservation efforts and breeding could be more successful.

The team was lucky that some of the birds returned the following year, allowing Assistant Professor Kasahara to breathe a sigh of relief at the reunion. Her team recaptured three males and three females at the same breeding site along the Chikuma River in Nagano. Two birds returned with complete data sets, including the wintering site and the autumn and spring migration routes. Two had incomplete spring migration data because the GPS stopped working. The remaining two birds only had migration data until mid or late October, also because the GPS stopped working or the antenna for position fixing was lost. All were healthy and uninjured, and all were released after data retrieval.

The birds travelled 3,108 to 4,226 kilometers over 32 to 136 days to their wintering sites. To put this into perspective, a flight from New York to Los Angeles is a little less than 4,000 km, and a flight from New York to London is about 5,600 km. Wintering areas were defined as places where plovers stayed for more than two months without a long-distance flight (more than 50 km). The accuracy of the data points allowed the researchers to learn that rice paddy fields were the plovers' preferred place to stay. The plovers used rice paddies more in the non-breeding season, when the birds are farther south in places such as Taiwan and the Philippines. These locations have year-round rice cultivation, giving the birds dependable access to insects in wet habitats.
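The wintering-area rule above (a stay of more than two months with no flight longer than 50 km) is mechanical enough to apply directly to a GPS track. Below is a minimal Python sketch under that definition; the track data, function names, and keyword defaults are illustrative, not from the study:

```python
from datetime import date
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    R = 6371.0  # mean Earth radius, km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def find_wintering_site(fixes, max_jump_km=50, min_stay_days=60):
    """Return the first (start, end) period in which every fix stays within
    max_jump_km of the period's first fix for at least min_stay_days.
    fixes: chronologically sorted list of (date, lat, lon) tuples."""
    n = len(fixes)
    for i in range(n):
        d0, lat0, lon0 = fixes[i]
        j = i
        while j + 1 < n and haversine_km(lat0, lon0, fixes[j + 1][1], fixes[j + 1][2]) <= max_jump_km:
            j += 1
        if (fixes[j][0] - d0).days >= min_stay_days:
            return fixes[i][0], fixes[j][0]
    return None

# Hypothetical track: breeding site in Nagano, a stopover, then a long stay in Luzon.
track = [(date(2017, 8, 1), 36.65, 138.19),
         (date(2017, 9, 10), 24.0, 121.5),
         (date(2017, 10, 5), 15.5, 120.8),
         (date(2017, 12, 20), 15.6, 120.9),
         (date(2018, 2, 1), 15.5, 120.7)]
print(find_wintering_site(track))  # -> (datetime.date(2017, 10, 5), datetime.date(2018, 2, 1))
```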

In the spring, the plovers travelled faster northbound, perhaps to improve their breeding success. Little ringed plovers breed on open gravel grounds. If open gravel grounds near freshwater are maintained in Japan, plovers will likely continue to breed successfully, thanks to their ingrained tendency to return to previously occupied sites.

Migratory birds are declining worldwide as human activity increases and habitats shrink. Understanding how these migratory wading birds live is crucial for their conservation. Changes in the management of rice paddy fields may have reduced their value to wading birds, as new drainage systems decreased the availability of insects.

The Kochi-dori, as the plovers are called in Japan, are truly amazing birds, displaying a range of intriguing characteristic behaviors such as injury feigning and the rodent run, which mimics the way small rodents run in order to confuse potential predators. Assistant Professor Kasahara hopes to continue studying the little ringed plover, whose routes may differ among breeding sites; studying birds from different breeding sites may reveal additional crucial habitats. Floods ended the 2017 breeding season prematurely; had they not, the plovers might have stayed longer in Japan. Kasahara also hopes to continue the research to elucidate year-to-year differences.

Credit: 
Shinshu University

Tracking adeno-associated virus capsid evolution

image: Human Gene Therapy, the Official Journal of the European Society of Gene and Cell Therapy and eight other international gene therapy societies, was the first peer-reviewed journal in the field and provides all-inclusive access to the critical pillars of human gene therapy: research, methods, and clinical applications.

Image: 
Mary Ann Liebert Inc., publishers

New Rochelle, NY, March 18, 2020--Researchers have used high-throughput screening of adeno-associated viral (AAV) vector capsid libraries to maximize the likelihood of obtaining AAV variants with desired properties. These experiments yielded some unexpected insights, reported in an article published in Human Gene Therapy, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. The full-text article is freely available on the Human Gene Therapy website through April 18, 2020.

Mark Kay and colleagues from Stanford University (Stanford, CA) coauthored the article entitled "Tracking Adeno-Associated Virus Capsid Evolution by High-Throughput Sequencing." The researchers used high-throughput screening of barcoded AAV capsid libraries to track directed AAV capsid evolution. The ultimate goal is to more quickly identify improved recombinant AAV vectors for use in clinical gene therapy trials.

Among the most important findings was that multiple rounds of selection are not essential and may in fact be counterproductive: functional and efficient AAV variants were obtained after only one round of selection. Additionally, infection at a high multiplicity of infection (MOI) is preferable to infection at a low MOI, as low MOIs produce more variation between screens and are not optimal for selecting the most desired capsids. Furthermore, competition can take place between AAVs with specific capsids in cells that have been infected with different AAVs. Other key findings are outlined in the article.

"This cutting-edge work by Dr. Kay and his Stanford colleagues is helping to make directed evolution of AAV capsids less of a 'black box'," says Editor-in-Chief Terence R. Flotte, MD, Celia and Isaac Haidak Professor of Medical Education and Dean, Provost, and Executive Deputy Chancellor, University of Massachusetts Medical School, Worcester, MA. "His insights are likely to result in the discovery of important novel capsids that might otherwise be overlooked."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Precision mirrors poised to improve sensitivity of gravitational wave detectors

image: The illustration shows the cross-section of a thermal bimorph mirror and its constituents. Controlling the temperature of the mirror changes the curvature of the reflected wavefront. Overlaid on the cross-section is the simulated radial stress, showing a concentration of stress at the boundary of the two layers, where the adhesive holds the structure together.

Image: 
Huy Tuong Cao, University of Adelaide

WASHINGTON -- Researchers have developed a new type of deformable mirror that could increase the sensitivity of ground-based gravitational wave detectors such as the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO). Advanced LIGO measures faint ripples in space time called gravitational waves, which are caused by distant events such as collisions between black holes or neutron stars.

"In addition to improving today's gravitational wave detectors, these new mirrors will also be useful for increasing sensitivity in next generation detectors and allow detection of new sources of gravitational waves," said research team leader Huy Tuong Cao from the University of Adelaide node of the Australian Center of Excellence for Gravitational Waves Discovery (OzGrav).

Deformable mirrors, which are used to shape and control laser light, have a surface made of tiny mirrors that can each be moved, or actuated, to change the overall shape of the mirror. As detailed in The Optical Society's (OSA) journal Applied Optics, Cao and colleagues have, for the first time, made a deformable mirror based on the bimetallic effect in which a temperature change is used to achieve mechanical displacement.

"Our new mirror provides a large actuation range with great precision," said Cao. "The simplicity of the design means it can turn commercially available optics into a deformable mirror without any complicated or expensive equipment. This makes it useful for any system where precise control of beam shape is crucial."

The new technology was conceived by Cao and Aidan Brooks of LIGO as part of a visitor program between the University of Adelaide and LIGO Laboratory, funded by the Australian Research Council and National Science Foundation.

Building a better mirror

Ground-based gravitational wave detectors use laser light traveling back and forth down an interferometer's two arms to monitor the distance between mirrors at each arm's end. Gravitational waves cause a slight but detectable variation in the distance between the mirrors.

Detecting this tiny change requires extremely precise laser beam steering and shaping, which is accomplished with a deformable mirror.

"We are reaching a point where the precision needed to improve the sensitivity of gravitational wave detectors is beyond what can be accomplished with the fabrication techniques used to make deformable mirrors," said Cao.

Most deformable mirrors use thin mirrors to achieve a large actuation range, but thin mirrors are hard to polish and can produce undesirable scattering. The researchers instead designed a deformable mirror that exploits the bimetallic effect by attaching a piece of metal to a glass mirror. When the two are heated together, the metal expands more than the glass, causing the mirror to bend.
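The bending can be estimated from the classical bimetallic-strip result (Timoshenko, 1925); this is textbook background, not the equation the team reports. For layer thicknesses t1 and t2 (total h = t1 + t2), Young's moduli E1 and E2, and thermal expansion coefficients alpha1 and alpha2, a uniform temperature change Delta T induces a curvature:

```latex
\kappa \;=\; \frac{6\,(\alpha_2-\alpha_1)\,\Delta T\,(1+m)^2}
{h\left[\,3(1+m)^2 + (1+mn)\left(m^2 + \tfrac{1}{mn}\right)\right]},
\qquad m = \frac{t_1}{t_2},\quad n = \frac{E_1}{E_2}
```

Since aluminum's expansion coefficient (about 23 x 10^-6 per kelvin) is tens of times that of fused silica (about 0.5 x 10^-6 per kelvin), even mild, controlled heating produces a measurable, reversible change in the mirror's curvature.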

The new design not only creates a large amount of precise actuation but is also compact and requires minimum modifications to existing systems. Both the fused silica mirrors and aluminum plates used to create the deformable mirror are commercially available. To attach the two layers, the researchers carefully selected a bonding adhesive that would maximize actuation.

"Importantly, the new design has fewer optical surfaces for the laser beam to travel through, said Cao. "This reduces light loss caused by scattering or absorption of coatings."

Precision characterization

Creating a highly precise mirror requires precision characterization techniques. The researchers developed and built a highly sensitive Hartmann wavefront sensor to measure how the mirror's deformations changed the shape of laser light.

"This sensor was crucial to our experiment and is also used in gravitational detectors to measure minute changes in the core optics of the interferometer," said Cao. "We used it to characterize the performance of our mirrors and found that the mirrors were highly stable and have a very linear response to changes in temperature."

The tests also showed that the adhesive is the main limiting factor for the mirrors' actuation range. The researchers are currently working to overcome the limitation caused by the adhesive and will perform more tests to verify compatibility before incorporating the mirrors into Advanced LIGO.

Credit: 
Optica

How 'pioneer' protein turns stem cells into organs

PHILADELPHIA -- Early in development, in each cell, a critical protein known as FoxA2 simultaneously binds to both the chromosomal proteins and the DNA, opening the floodgates for gene activation, according to a new study led by researchers in the Perelman School of Medicine at the University of Pennsylvania. The discovery, published in Nature Genetics, helps untangle the mystery of how embryonic stem cells develop into organs.

Molecular signals begin dictating what organs an embryo's stem cells will give rise to in the body--such as the liver or pancreas--within the first two weeks of development. It's an intricate process guided by these so-called "pioneer" transcription factors that gain access to the tightly packed DNA inside each cell so other specialized proteins can get in and activate the necessary genes. However, until now, it's been unclear how these pioneer factors open the DNA.

"We now understand that this pioneer factor, FoxA2, grabs the chromosomal proteins, known as histones, and exposes the DNA region," said the study's corresponding author Kenneth S. Zaret, PhD, the Joseph Leidy Professor in the Department of Cell and Developmental Biology and Director of Penn's Institute for Regenerative Medicine (IRM). "That opening allows other specialized, regulatory proteins to access the DNA and activate a network of silent genes that leads to the formation of internal organs."

For decades, researchers in Penn's IRM have been pulling back the curtain on this process as they work toward developing new cells for transplantation and tissue repair as part of treatment for common problems like liver or heart disease. Knowing how regulatory gene proteins work during this early stage can help the field better understand how to control the process of cell development for both clinical research and therapeutic purposes.

Zaret's lab discovered pioneer factors in 2002 and has been working to better understand their function and role in early embryonic development. In this latest study, the team of researchers, co-led by Makiko Iwafuchi, PhD, who performed the work while at Penn and is now at the University of Cincinnati College of Medicine, first used in vitro genetic techniques to investigate how FoxA interacts with chromosomal proteins at the same time that it interacts with DNA. They found that a small region of the FoxA2 protein--just 10 amino acids out of more than 460--was necessary for the protein to make an opening in the chromatin fiber.

Next, the team translated those findings into a mouse model, deleting the same sequences in mice to see how those changes would affect embryonic development. Removing those key amino acids significantly impaired embryonic development, caused deformities in organs--including the brain and the heart--and resulted in death in the mice.

"This very small deletion in the protein had a profound effect that mirrored what we had seen in vitro, which surprised us," Zaret said. "We originally thought it would be a broadly acting phenomenon that would be hard to pinpoint, but we nailed it down. To see this biochemistry approach, which others were skeptical of, so clearly illuminate a facet of developmental biology was a real thrill."

Zaret's lab continues to investigate FoxA and other pioneer factors to learn how they may open up the chromatin and interact with chromosomal proteins, similar to FoxA or perhaps in other ways. The current findings serve as a road map.

"Now that we have these results, we are emboldened to investigate diverse other proteins that behave this way," Zaret said. "We know that FoxA2 doesn't act alone in turning on the endoderm program to make organs, and we're currently working to better understand how the different factors play in role in that development."

Credit: 
University of Pennsylvania School of Medicine

Ethylene sensor could help monitor plant health

To control flowering and fruit ripening, plants release the gaseous hormone ethylene. Environmental conditions, including drought, salinity and pathogens, can also cause levels of the hormone to fluctuate. Therefore, monitoring ethylene's release in real time could provide a farmer with important information about a plant's development and health. Now, researchers reporting in ACS Central Science have developed an easy-to-use, robust sensor that can do just that.

Because of the key role ethylene plays in plant health, the agricultural industry is interested in monitoring the hormone. Early detection of changes in the release of this gas could allow farmers to take preventative actions that restore plant health, reducing crop losses. However, existing sensors have limitations that make them impractical for use in the field. Timothy Swager, Darryl Fong and colleagues at the Massachusetts Institute of Technology wanted to make a sensor that could sensitively detect changes in ethylene levels.

The new sensor contains a network of single-walled carbon nanotubes (SWCNTs) on a piece of glass, sandwiched between gold electrodes. The researchers placed a catalytic mixture containing palladium on top of the SWCNTs. In a chemical reaction known as Wacker oxidation, the palladium catalyst converted ethylene gas to acetaldehyde. During this reaction, palladium changed its oxidation state and interactions with the SWCNTs, altering their electrical conductance. In this way, the researchers could monitor changes in ethylene gas levels over time. To demonstrate the sensor, the team placed carnations or lisianthus flowers in a chamber with the device and observed fluctuations in ethylene production as the flowers bloomed and faded. The device can detect parts-per-billion concentrations of the gas within the chamber, and with this sensitivity it could potentially be used to monitor plants in the field, the researchers say.
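The readout principle, chemiresistive sensing, boils down to tracking a relative conductance change in the SWCNT network and mapping it to a concentration through a calibration curve. A minimal sketch follows, with entirely hypothetical calibration constants; the paper's actual calibration is not reproduced here:

```python
def ethylene_ppb(G, G0, sensitivity=2.5e-4):
    """Estimate ethylene concentration from SWCNT network conductance.

    G: conductance with analyte present (siemens)
    G0: baseline conductance in clean air (siemens)
    sensitivity: hypothetical fractional conductance change per ppb,
                 assuming a linear response at low concentrations.
    """
    delta = (G - G0) / G0      # relative conductance change
    return delta / sensitivity  # ppb, under the linearity assumption

# Example: a 0.5% conductance change maps to ~20 ppb with these constants.
print(round(ethylene_ppb(G=1.005e-5, G0=1.0e-5), 1))
```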

Credit: 
American Chemical Society

Artificial intelligence helps prevent disruptions in fusion devices

image: Physicist Yichen Fu.

Image: 
Photo and collage by Elle Starkman/PPPL Office of Communications.

An international team of scientists led by a graduate student at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) has demonstrated the use of artificial intelligence (AI) -- the same computing concept that will empower self-driving cars -- to predict and avoid disruptions: the sudden release of energy stored in the plasma that fuels fusion reactions, which can halt the reactions and severely damage fusion facilities.

Risk of disruptions

Fusion devices called tokamaks run an increased risk of disruptions as researchers, aiming to maximize fusion power to create on Earth the fusion that powers the sun and stars, bump up against the operational limits of the facilities. Scientists must therefore be able to boost fusion power without hitting those limits. This capability will be crucial for ITER, the large international tokamak under construction in France to demonstrate the practicality of fusion energy.

Fusion reactions combine light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei that makes up 99 percent of the visible universe -- to generate massive amounts of energy. Scientists around the world are seeking to create fusion for a virtually inexhaustible supply of safe and clean power to generate electricity.

The researchers trained an AI machine learning algorithm, or set of rules, on thousands of previous experiments on the DIII-D National Fusion Facility that General Atomics operates for the DOE. Scientists then applied the rules in real-time to ongoing DIII-D experiments and found the algorithm capable of forecasting the likelihood of disruptions and initiating actions that averted the onset of disruptions.
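The article does not name the algorithm, but the workflow it describes -- train on labeled time slices from past shots, then score live diagnostics and trigger a control action above a risk threshold -- can be sketched generically. A toy illustration with synthetic data and made-up feature names, not the DIII-D model:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical training set: per-time-slice plasma diagnostics from past shots
# (e.g., normalized beta, locked-mode amplitude, density relative to the
# Greenwald limit), labeled 1 if a disruption followed within some horizon.
X_train = rng.normal(size=(5000, 3))
y_train = (X_train @ np.array([0.8, 1.5, 1.0]) + rng.normal(scale=0.5, size=5000)) > 1.0

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

def disruption_risk(beta_n, locked_mode, greenwald_frac):
    """Score one incoming time slice; a real-time control system could act
    (e.g., lower the heating power) when the risk crosses a threshold."""
    return model.predict_proba([[beta_n, locked_mode, greenwald_frac]])[0, 1]

if disruption_risk(0.5, 2.0, 1.2) > 0.7:
    print("forecast: likely disruption -- request avoidance action")
```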

Relatively simple model

"It's fascinating to see that a relatively simple machine learning model could accurately predict the complicated behavior of fusion plasma," said Yichen Fu, a graduate student in the Princeton Program in Plasma Physics at PPPL and lead author of a paper describing the findings (link is external) in Physics of Plasmas and showcased in a featured American Institute of Physics publication called "SciLight." "It's great to see students leading multi-institutional teams and making a real impact on the development of machine learning methods for the control of fusion plasmas," said PPPL physicist Egemen Kolemen, supervisor of Yichen's work and an assistant professor of Mechanical and Aerospace Engineering at Princeton University.

The results mark another step toward preventing disruptions in ITER and next-generation facilities, said physicist Raffi Nazikian, head of the ITER and Tokamak department at PPPL. "This work represents significant progress in the use of machine learning to develop a disruption prediction and avoidance method in fusion devices," Nazikian said. "However, a great deal of R&D is still required to improve the accuracy of the predictions and to develop fail-safe control methods to avoid disruptions in ITER and future reactors."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Rethinking mortality and how we plan for old age

Many people dream of comfortably living out their golden years. A new IIASA study, however, shows that older Europeans, and especially women, frequently underestimate how many years they have left, which could lead to costly decisions in planning for their remaining life course.

Older people have to make important decisions about their remaining life years, such as how to invest savings and manage properties, changes in employment status and retirement, living arrangements, and matters related to their health. Their personal evaluation of the length of their remaining life is therefore crucial, because decisions can be biased if this expected personal length of life differs significantly from the actual number of remaining life years, leading to negative consequences like financial strife and increased anxiety or depression.

In their study published in the journal PLOS ONE, the researchers used data from the Survey of Health, Ageing and Retirement in Europe (SHARE) for nine European countries (Austria, Belgium, France, Germany, Greece, Italy, Sweden, Spain, and Switzerland), gathered in 2004 and 2015, to estimate subjective life expectancies from age 60 to 90 for men and women. They compared these results with the actual observed life expectancies in the countries studied. The study was also the first to examine how these gaps differed between 2004 and 2015, between countries, and between men and women.

One finding dominates across countries, time, and genders: people believe they have fewer years left to live than they actually do. Interestingly, this downward bias was considerably larger for women than for men: close to five years in 2004 and more than three years in 2015.

One of the more surprising findings was that women's and men's subjective expectations of length of life are about equal - around 19 years in 2004 and 21 years in 2015 - even though women's actual length of life is usually longer. Previous studies have made similar unexpected observations for healthy and unhealthy life expectancies, with women reporting a higher proportion of unhealthy life than men despite living longer. This similarity indicates that health plays a primary role in the formation of personal perceptions about length of life.
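Combining the two reported figures for 2004 gives a rough sense of scale for women (a back-of-the-envelope inference from the numbers above, not a value stated in the study):

```latex
e_{\mathrm{actual}} \;\approx\; e_{\mathrm{subjective}} + \mathrm{bias} \;\approx\; 19 + 5 \;=\; 24 \ \text{remaining years}
```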

Comparing the 2004 and 2015 surveys, the findings indicate that the gaps between subjective and actual life expectancies declined over time for both men and women. In the 11 years between 2004 and 2015, gender differences remained unchanged while underestimation decreased for both genders, with subjective life expectancies increasing at a higher pace than actual ones. For men the difference between subjective and actual life expectancy in fact became very small - in 2015, it was only four months. According to the study, this could be due to an increased focus on healthy lifestyles with good diets, a decline in smoking and alcohol consumption, or other factors related to active aging. The authors point out that it is important to see how this tendency develops in the future, as it may hold implications for social and economic policies related to the life course of the elderly.

"The issues we highlight in this paper imply a need for adequate policies that will lead to a decrease in the downward bias people have in terms of their self-perceived life expectancy. These policies could be directed towards further improvement of information about health-related issues, so individuals will be able to construct realistic views about their health status and hence gain a more realistic view on their remaining life span. Since women have a larger bias than men, it might even be appropriate to consider gender-related policy aspects," concludes study author Dimiter Philipov, a guest researcher in the IIASA World Population Program.

Credit: 
International Institute for Applied Systems Analysis

Tropical Cyclone Herold's eye opens further on NASA satellite imagery

image: On March 17, the MODIS instrument aboard NASA's Terra satellite captured this image of Tropical Cyclone Herold, showing a well-developed, hurricane-strength storm maintaining a clearer eye.

Image: 
NASA Worldview

As Tropical Cyclone Herold intensified, its eye appeared more defined in imagery taken by NASA's Terra satellite.

A Tropical Cyclone Warning Class 3 was in force for Rodrigues Island on March 17. Rodrigues, an autonomous outer island of the Republic of Mauritius in the Southern Indian Ocean, covers about 42 square miles (108 square kilometers) and lies about 350 miles (560 kilometers) east of Mauritius.

On March 17 at 5 a.m. EDT (0900 UTC), forecasters at the Joint Typhoon Warning Center (JTWC) noted "Infrared satellite imagery indicates a sharp intensification trend over the past 12 hours as the system formed an irregular 13-nautical mile wide eye."

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite confirmed that intensification. MODIS provided forecasters with a visible image of Tropical Cyclone Herold that revealed a much clearer eye than on March 16. The eye was no longer obscured by high clouds, and the Terra view was able to see down to the ocean's surface through the eye. Powerful bands of thunderstorms circled the eye.

At 5 a.m. EDT (0900 UTC) on March 17, the center of Tropical Cyclone Herold was located near latitude 19.1 degrees south and longitude 60.4 degrees east, about 185 nautical miles east-southeast of Port Louis, Mauritius. Maximum sustained winds had increased to 100 knots (115 mph/185 kph).

The JTWC noted that Herold is moving southeast and is at peak intensity, while it is passing just west of Rodrigues. The storm is now expected to begin weakening before it becomes subtropical.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Plant water saving system works like clockwork, it transpires

Plants, just like humans, have circadian clocks that allow them to tell the time. In humans this cellular clockwork influences when we wake and sleep.

Plants are so dependent on daylight that circadian clocks are even more influential, regulating the rate of photosynthesis, gas exchange, and transpiration, which is the flow of water through the stem and evaporation from leaves.

Now researchers have discovered that these biological clocks play a critical role in the consumption of water, allowing plants to use this precious resource more efficiently.

They carried out a series of experiments with model laboratory plants in which the genes encoding circadian rhythms had been changed.

Some changes made plants use more water in relation to growth but, unexpectedly, the experiments revealed that other changes to circadian rhythms allowed plants to grow strong and healthy whilst using less water. The study reveals that the whole circadian system affects water use efficiency, not just a specific part of it.

The research opens an opportunity for the tuning of crops to use water more efficiently: losing less water through transpiration whilst still growing.

Agriculture accounts for around 80% of freshwater used worldwide. So, understanding processes in plants that affect the amount of water they use is vitally important to develop crops that are productive but use less water.

Plants transpire water with a daily rhythm because the stomata, tiny pores on the surface of leaves, generally open only in the day. Previous studies showed that daily opening is regulated by circadian rhythms.

"We reasoned that circadian rhythms might have a big impact upon the amount of water that plants use. And our experiments show this to be the case," explains Dr Antony Dodd of the John Innes Centre, who is the senior author of the study.

"The overarching goal of the work lies in reducing the amount of water that is used in crop irrigation to improve the sustainability of agricultural food production."

The study reveals that the altered circadian clock genes affect water use efficiency in several ways. Along with adjusting the process of transpiration, the altered clock influences how big leaves grow, which affects how much water the plant uses. Together with other changes, these account for the improvements in water use efficiency the researchers observed.

The next steps of the study will be to discover the cellular mechanisms that explain how circadian rhythms regulate plant water loss and establish the importance of the findings in key crops, using the knowledge from the model plants used in this study. Further work could involve investigating the role of temperature in how the clock affects water use efficiency.

The research was funded by BBSRC (the GEN ISP at JIC and the SWBio PhD programme in Bristol), and was in collaboration with Prof Alistair Hetherington (University of Bristol).

"The circadian clock contributes to the long-term water efficiency of Arabidopsis" appears in the journal Plant Physiology.

Credit: 
John Innes Centre

New research reveals that scents alter how memories are processed in the brain

image: Pyramidal cells in the mouse prelimbic cortex (shown in blue) that were active during the formation of a discrete memory (shown in green). These cells have traditionally been thought to be less active upon initial memory formation (or shortly after, e.g., one day later) and to become more active with the passage of time (when accessed at more remote time points, e.g., 21 days later).

Image: 
Dr. Stephanie Grella (Ramirez Lab)

In a new paper published in Learning and Memory, researchers from Boston University's Center for Systems Neuroscience reveal just how much power scents have in triggering the memory of past experiences--and the potential for odor to be used as a tool to treat memory-related mood disorders.

"If odor could be used to elicit the rich recollection of a memory--even of a traumatic experience--we could take advantage of that [therapeutically]," says BU neuroscientist Steve Ramirez, assistant professor of psychology and brain sciences and senior author of the study.

Until now, the scent-memory connection has been something of an enigma. In fact, even the mechanisms that underlie memory formation in general have been debated in recent years. The traditional theory--systems consolidation theory--suggests that our memories start out being processed by a small, horseshoe-shaped brain area called the hippocampus, which infuses them with rich details. Over time, especially when we sleep, the set of brain cells that holds onto a particular memory reactivates and reorganizes. The memory then becomes processed by the front of the brain--the prefrontal cortex--instead of the hippocampus, and many of the details become lost in the shuffle.

This theory has its merits. For starters, it would explain why our memories tend to get a bit fuzzy as time passes. It also helps explain why people with hippocampal damage are often unable to form new memories while their ability to retrieve old, prefrontal cortex-stored memories remains perfectly intact. In contrast, those with prefrontal cortex damage often exhibit the flavor of amnesia we see in soap operas: an inability to remember the past.

However, critics of the systems consolidation theory maintain that it doesn't tell the whole story. If memories slip out of the hippocampus and become stripped of their details over time, then why do many people retain vivid recollections of an event even years later--particularly people with post-traumatic stress disorder (PTSD)? And why do scents, which are processed in the hippocampus, sometimes trigger seemingly dormant memories?

To answer these questions, Ramirez and members of his lab created fear memories in mice by giving them a series of harmless but startling electric shocks inside a special container. During the shocks, half of the mice were exposed to the scent of almond extract, while the other half were not exposed to any scent. The next day, the researchers returned the mice to the same container to prompt them to recall their newly formed memories. Once again, the mice in the odor group got a whiff of almond extract during their session, while the no-odor group was not exposed to any scent. But this time, neither group received any new electric shocks. Consistent with the systems consolidation theory, both groups exhibited significant activation of the hippocampus during this early recall session, indicating they remembered receiving the shocks from the day before.

However, during the next recall session 20 days later, the researchers were in for a shock of their own. As expected, in the no-odor group, processing of the fear memory had shifted to the prefrontal cortex--but the odor group still had significant brain activity in the hippocampus.

"[This finding suggests] that we can bias the hippocampus to come back online at a timepoint when we wouldn't expect it to be online anymore because the memory is too old," Ramirez says. "Odor can act as a cue to reinvigorate or reenergize that memory with detail."

Ramirez adds that we still aren't sure about odor's exact role in memory processing. Perhaps odors delay a memory's shift to relying on the prefrontal cortex, thereby preserving the details for longer. If this is the case, an odor needs only to be present during memory formation for a memory to retain its vividness. Alternatively, it's possible that the prefrontal-cortex shift still occurs in an odor-associated memory, but that if the same odor emerges again later on, the hippocampus becomes reactivated and the memory regains the details it had lost.

Regardless of the specifics, Ramirez says that this research provides us with a "blueprint" of memory processing in nonhuman animals, and this information might one day lead to breakthroughs in the treatment of mental health conditions in humans, such as PTSD.

Many psychotherapy- and drug-based treatments for PTSD involve trying to suppress or dampen traumatic memories, but this process can only be carried out effectively when people actively recall the memories first.

"Now that we know that odor can shift memories to become more hippocampus dependent, we could potentially develop strategies that engage or disengage the hippocampus. And then we could integrate some behavioral or drug-based approaches to bring the hippocampus back offline if our goal is to permanently suppress a fear memory," Ramirez says.

In other words, the scents that spark our memories may be more powerful than we realize. Today, they serve as the triggers for our nostalgia and our anxiety--but tomorrow, they could be our treatments.

"We can potentially view memory as its own kind of drug--as an antidepressant or [anxiety reducer]," Ramirez says. "And [odor] could be an experimentally controllable factor that we could deliver to people. It may be a very powerful tool."

Credit: 
Boston University

AI-powered shoes unlock the secrets of your sole

Researchers at Stevens Institute of Technology have developed an AI-powered, smart insole that instantly turns any shoe into a portable gait-analysis laboratory.

The work, reported in the January 2020 issue of IEEE Transactions on Neural Systems and Rehabilitation Engineering, could benefit clinical researchers by providing a new way to precisely measure walking function in patients with movement disorders or musculoskeletal injuries in their living environments. The technology could also lead to significant advances for athletes by helping them improve their running technique.

"From a practical standpoint, that's invaluable," said Damiano Zanotto, lead author and director of Stevens Wearable Robotic Systems Lab. "We're now able to accurately analyze a person's gait in real time, in real-world environments."

Taking a single step might seem simple, but capturing reliable information about a person's gait in real-life environments remains a major challenge for researchers. Gold-standard gait-analysis technologies, such as camera-based motion-capture systems and force plates, are expensive and can only be used inside laboratories, so they offer few insights into how people walk around in the real world. Emerging wearable technologies such as smart shoes, pods, and insoles can potentially overcome this limitation, but the existing products cannot provide accurate gait data.

In their work, Zanotto and his team show that their smart insole can deliver real-time data on the length, speed, and power of a wearer's stride with better accuracy than existing foot-worn technologies - and at a fraction of the cost of traditional laboratory equipment. (Zanotto and his team are seeking two patents relating to the SportSole, and several companies and professional sporting franchises are closely following the team's work.)

The team's SportSole technology uses accelerometers and gyroscopes to monitor its own movement and orientation in space, and an array of force sensors to detect plantar pressure, allowing it to capture 500 readings per second -- around a fivefold improvement over smart pedometers and other wearable gait-analysis tools.

The real magic, however, happens outside the shoe. Wearable motion sensors are inherently noisy. To overcome that challenge, Zanotto's system distills those 500 measurements per second down to just a few key features, then feeds the results into an AI algorithm capable of rapidly extracting gait parameters that are accurate to within a couple of percentage points.

That's a big improvement over other AI gait-analysis tools, which are computationally intensive and require data to be recorded for later analysis. The Stevens system is far more efficient, allowing it to be baked into a microcontroller capable of delivering real-time gait analysis.
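That two-stage pipeline -- compress the raw 500 Hz samples into a handful of stride features, then feed them to a lightweight learned model -- can be illustrated in a few lines. The feature choices, threshold, and weights below are hypothetical stand-ins, not the SportSole's:

```python
import numpy as np

FS = 500  # sampling rate (Hz), matching the 500 readings/s in the article

def stride_features(accel_z, pressure):
    """Compress one stride of raw samples into a few summary features.
    accel_z: vertical acceleration (m/s^2); pressure: total plantar force (N).
    Feature choices here are illustrative only."""
    contact = pressure > 50.0            # crude stance-phase detection
    stance_time = contact.sum() / FS     # seconds on the ground
    swing_time = (~contact).sum() / FS   # seconds in the air
    return np.array([stance_time, swing_time,
                     accel_z.max(), accel_z.std()])

# A tiny linear model standing in for the trained AI regressor:
# stride_length_m ~ w . features + b, with made-up weights.
w, b = np.array([-0.4, 0.9, 0.02, 0.05]), 0.6

accel_z = np.random.default_rng(1).normal(9.8, 2.0, FS)  # ~1 s of samples
pressure = np.concatenate([np.full(300, 400.0), np.zeros(200)])
print("estimated stride length (m):", stride_features(accel_z, pressure) @ w + b)
```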

It also works regardless of whether the wearer is walking or running, and generates accurate results without requiring calibration or customization for individual users. Preliminary testing suggests the SportSole even works with children as young as three years of age and with elderly people with vestibular disorders, whose gait patterns are very different from those of healthy adults.

Such consistent accuracy is impressive because most gait researchers use high-end sensors costing $1,000 or more in a bid to reduce errors. By contrast, Zanotto and his team used off-the-shelf sensors costing around $100, relying on AI to extract reliable data. "We're achieving the same or better results at a far lower cost, and that's a big deal when it comes to scaling this technology," said Zanotto.

For now, though, the team is focusing on testing the SportSole for clinical use. An unobtrusive, wearable gait monitor could help researchers optimize treatments for people with movement disorders, allow remote monitoring of vulnerable populations, or offer important insights into the safety and efficacy of new treatments that might affect gait and balance.

Credit: 
Stevens Institute of Technology

Medical radiation exposure fell in the US from 2006 to 2016

OAK BROOK, Ill. - Medical radiation exposure to patients in the U.S. fell by 20% between 2006 and 2016, reversing a quarter century-long trend of increasing exposure, according to a study appearing in the journal Radiology.

The use of medical imaging has grown rapidly in recent decades, raising concerns about the exposure of patients to ionizing radiation. A landmark report published in 2008 found that per capita radiation exposure in the U.S. increased six-fold between 1980 and 2006.

"The radiation dose to the U.S. population went up dramatically because of medical exposure, mostly from CT scanning and nuclear medicine, and that woke everybody up to the problem," said study senior author Fred A. Mettler Jr., M.D., radiologist from the Department of Radiology at the University of New Mexico in Albuquerque.

In the wake of the report, medical societies and organizations enacted initiatives to increase awareness of exposure while equipment manufacturers developed more refined dose modulation technology. The effects of these and other efforts in the years since 2006 have remained largely unknown.

Armed with a grant from the Centers for Disease Control and Prevention, Dr. Mettler assembled a group of experts in medical imaging and physics and set out to determine the change in per capita radiation exposure in the U.S. from 2006 to 2016.

The results showed that the number of diagnostic and interventional radiology examinations performed remained largely unchanged over the 10-year period, even though the U.S. population increased by about 23 million. Estimated annual individual dose from diagnostic and interventional medical procedures fell from 2.9 millisieverts (mSv) in 2006 to 2.3 mSv in 2016, a decrease of approximately 20%.
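The approximately 20% decrease follows directly from the two reported per-capita doses:

```latex
\frac{2.9\ \mathrm{mSv} - 2.3\ \mathrm{mSv}}{2.9\ \mathrm{mSv}} \;\approx\; 0.21 \;\approx\; 20\%
```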

"The overall trend stabilized, and the total dose to the U.S. population dropped a bit," Dr. Mettler said.

A key factor in the reduction was a substantial decrease in the number of nuclear medicine procedures, from 17 million in 2006 to 13.5 million in 2016. The decline was particularly notable in cardiology after a cut in Medicare reimbursement drove many cardiologists away from nuclear medicine-based procedures to stress echocardiography, a test that relies on ultrasound instead of ionizing radiation.

"Nuclear medicine basically fell off a cliff, possibly due to cardiologists discovering that reimbursement was way down and that, for cardiac ischemia and other indications, stress echocardiography is essentially as accurate as the nuclear medicine procedures," Dr. Mettler said.

CT scans, a major driver of medical radiation exposure, increased from 67 million to 84 million scans over the 10-year period. However, the average individual effective dose from CT procedures dropped by 6%, thanks to several factors, according to study co-author Mahadevappa Mahesh, M.S., Ph.D., professor of radiology and cardiology at Johns Hopkins University School of Medicine in Baltimore.

"One important factor is the dose modulation techniques available on most CT scanners in the country," he said. "The second factor is that overall CT detectors are becoming very efficient in the sense that they can utilize less radiation to create the same quality images."

Dr. Mahesh also credited campaigns such as Image Wisely, a joint effort of the Radiological Society of North America, the American College of Radiology, American Society of Radiologic Technologists and American Association of Physicists in Medicine, aimed at optimizing dose and reducing unnecessary imaging examinations in the adult population, and Image Gently, which does the same in the pediatric population. He also pointed to accreditation requirements that are contributing towards dose stabilization.

Now that U.S. medical radiation exposure levels appear to have stabilized, the researchers stressed the importance of sustained vigilance to keep the trend from reversing.

"We don't want people to become complacent," Dr. Mahesh said. "Things are generally going in a good direction, but we need to continue on that path."

Dr. Mahesh and colleagues plan to publish a report next year on trends in medical radiation exposure worldwide.

Credit: 
Radiological Society of North America

How horses can save the permafrost

image: Herds of herbivores preserve the permafrost -- even under strong global warming.

Image: 
Pleistocene Park

Permafrost soils in the Arctic are thawing. As they do, large additional quantities of greenhouse gases could be released, accelerating climate change. In Russia, experiments are now being conducted in which herds of horses, bison and reindeer are being used to combat this effect. A study from Universität Hamburg, just released in the Nature journal Scientific Reports, now shows for the first time that this method could indeed significantly slow the loss of permafrost soils.

Theoretically speaking, 80 percent of all permafrost soils around the globe could be preserved until the year 2100, as has now been demonstrated by Prof. Christian Beer from Universität Hamburg's Center for Earth System Research and Sustainability (CEN), an expert on the permanently frozen soils found throughout the Northern Hemisphere. If no action is taken to prevent it, half of the world's permafrost will thaw by 2100. The new study explores a somewhat unconventional countermeasure: resettling massive herds of large herbivores.

The inspiration came from Pleistocene Park in Chersky, a city in northeast Russia. The Russian scientists Sergey and Nikita Zimov resettled herds of bison, wisents, reindeer and horses there more than 20 years ago and have been observing the effects on the soil ever since. In winter the permafrost in Chersky is roughly minus 10 degrees Celsius, while the air above it, at temperatures down to minus 40 degrees Celsius, is far colder. Thanks to ample snowfall, a thick layer of snow cover insulates the ground from the frigid air, keeping it "warm." When the grazing animals' stamping hooves scatter and compress the snow cover, its insulating effect is dramatically reduced, intensifying the freezing of the permafrost. "This type of natural manipulation in ecosystems that are especially relevant for the climate system has barely been researched to date - but holds tremendous potential," Beer says.

The long-term experiments conducted in Russia show that, when 100 animals are resettled in a 1 km2 area, they cut the mean snow cover height in half. Christian Beer and his colleagues wanted to determine what effect this could produce when applied to all Arctic permafrost soils as a whole. Could the animals' influence, at least in theory, even be enough to mitigate intensive warming of the atmosphere and stop the thawing of the permafrost?

For the purposes of his study, Beer used a special climate model that can simulate such temperature processes on the land surface over the course of an entire year. The results show: if emissions continue to rise unchecked (scenario RCP 8.5 in the latest IPCC Assessment Report), we can expect to see a 3.8-degree Celsius increase in permafrost temperatures, which would cause half of all permafrost to thaw. In contrast, with animal herds the ground would only warm by ca. 2.1 degrees - 44 percent less, which would be enough to preserve 80 percent of the current soils, as the model shows.
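The winter mechanism can be caricatured with a steady-state, two-layer conduction model: halving the snow depth doubles the snow layer's thermal conductance and pulls the soil surface closer to the frigid air temperature. This is a deliberately simplified sketch with typical textbook conductivities, not the climate model used in the study:

```python
def soil_surface_temp(t_air, t_deep, d_snow, d_soil=1.0,
                      k_snow=0.2, k_soil=1.5):
    """Steady-state temperature at the snow/soil interface (deg C).

    Flux continuity through a snow layer (depth d_snow, conductivity k_snow,
    W/m/K) and a soil column above a depth d_soil held at t_deep:
        k_snow*(T - t_air)/d_snow = k_soil*(t_deep - T)/d_soil
    """
    g_snow, g_soil = k_snow / d_snow, k_soil / d_soil  # layer conductances
    return (g_snow * t_air + g_soil * t_deep) / (g_snow + g_soil)

# Winter in Chersky: air at -40 C, deeper ground near -10 C.
print(soil_surface_temp(-40, -10, d_snow=0.5))   # thick snow: interface near -16 C
print(soil_surface_temp(-40, -10, d_snow=0.25))  # grazed, half the snow: near -20 C
```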

"It may be utopian to imaging resettling wild animal herds in all the permafrost regions of the Northern Hemisphere," the Earth system expert concedes. "But the results indicate that using fewer animals would still produce a cooling effect. What we've shown here is a promising method for slowing the loss of our permanently frozen soils, and with it, the decomposition and release of the enormous carbon stockpiles they contain."

Beer and his team also considered potential side effects of the approach. For example, in summer the animals destroy the cooling moss layer on the ground, which additionally warms it. This aspect was also taken into account in the simulations, but the positive impact of the snow effect in winter is several times greater. As a next step, Beer plans to collaborate with biologists to investigate how the animals would actually spread across the landscape.

Credit: 
University of Hamburg

Changes in cellular degradation hubs can lead to cancer

Cancer cells grow and divide in an uncontrolled manner. A new study from Uppsala University now shows how alterations in a cell's degradation hubs, called lysosomes, can cause abnormal cell growth. The results are published today in the scientific journal Nature Communications.

Normal cells have several control mechanisms that prevent them from growing uncontrollably. During the last few years, it has become increasingly clear that these regulatory processes are taking place on the surface of lysosomes, which are small membrane-encapsulated vesicles that function as degradation centres of all cells. A cell can have hundreds of lysosomes that are organised into complex networks. Cancer cells frequently have alterations in the organisation of their lysosome networks, although it remains unclear to what degree this contributes to tumour progression.

In the present study, scientists from Uppsala University and Weill Cornell Medicine, USA, found that the number of lysosomes in a lysosomal network affects cellular growth through the activation of a protein called mTOR.

"We saw that when the number of lysosomes increased, mTOR molecules on the lysosomal surface became hyperactivated. Since mTOR is a central stimulator of cellular growth, this leads to increased growth, says Anders Mutvei, researcher at the Department of Immunology, Genetics and Pathology, Uppsala University, who led the study together with John Blenis at Weill Cornell Medicine.

The scientists have also identified another protein, Rap1, that regulates both the number of lysosomes in the lysosomal network and its organisation.

"Although this study is in an early phase, it demonstrates that lysosomes play a central role in cellular growth control. We need more knowledge about how changes in a lysosomal network contribute to cancer, which is something we are about to test in models of human cancers," says Anders Mutvei.

Credit: 
Uppsala University

Vitamin D boosts chances of walking after hip fracture

image: Senior citizens who are not vitamin D deficient have a better chance of walking after hip fracture surgery, according to a Rutgers-led study.

Image: 
Sue Shapses/Rutgers University-New Brunswick

Senior citizens who are not vitamin D deficient have a better chance of walking after hip fracture surgery, according to a Rutgers-led study.

The findings in The American Journal of Clinical Nutrition suggest that vitamin D deficiency could limit mobility in older adults, said senior author Sue Shapses, a professor in the Department of Nutritional Sciences at the School of Environmental and Biological Sciences at Rutgers University-New Brunswick.

Shapses suggests that older adults take 800 international units (IU), equivalent to 20 micrograms, of vitamin D daily to prevent deficiency. Vitamin D is important for bone health, and people get it through some foods, exposure to the sun and vitamin pills.

"An important next step is learning how vitamin D affects mobility," said Shapses, who is also an adjunct professor in the Department of Medicine at Rutgers Robert Wood Johnson Medical School and director of the Center for Human Nutrition, Exercise and Metabolism at Rutgers' New Jersey Institute for Food, Nutrition, and Health. "For example, it is not clear if severe vitamin D deficiency is associated with direct effects on muscle, cognition and/or other organ systems."

A broken hip - among the most serious fall injuries - is hard to recover from, with many people unable to live on their own afterward. In the United States, more than 300,000 people 65 or older are hospitalized for hip fractures annually, and falls cause more than 95 percent of these fractures. Women fall more frequently than men and experience three-quarters of hip fractures, and the number of fractures is likely to rise as the population ages, according to the U.S. Centers for Disease Control and Prevention.

Regaining mobility after a hip fracture is important for full recovery and to reduce the risk of death. But vitamin D deficiency is associated with reduced mobility after surgery to repair a hip fracture.

The multi-site study of patients 65 or older in the United States and Canada examined the influence of vitamin D levels in blood serum and nutrition on mobility. The study focused on death rate or inability to walk 10 feet (or across a room) without someone's help after surgery.

The findings showed that vitamin D levels greater than 12 nanograms per milliliter (12 parts per billion) in blood serum were associated with a higher rate of walking at 30 and 60 days after hip fracture surgery. Poor nutrition was associated with reduced mobility 30 days after surgery, but that association was not statistically significant. Still, in patients with high levels of parathyroid hormone, which leads to high levels of calcium in the blood, mobility was reduced if their nutritional status was poor.

"This matters because vitamin D deficiency and malnutrition are common disorders in elderly patients with hip fractures and often occur together since both are complications of poor nutrition," Shapses said.

Previous studies have shown that taking 800 IU of vitamin D a day can prevent falls and fractures. A Rutgers-led study published last year indicated that a high vitamin D intake (4,000 IU a day), compared with 600 IU a day, may slow reaction time, potentially boosting the risk of falls and fractures. The recommended dietary allowance for vitamin D is 600 IU daily for people from 1 to 70 years old and 800 IU for people over 70.

"These studies suggest that too much or too little vitamin D will affect mobility and falls in the elderly," Shapses said.

The lead author is Lihong Hao, a post-doctoral associate in the Department of Nutritional Sciences. Co-authors include Jeffrey L. Carson, provost, New Brunswick at Rutgers Biomedical and Health Sciences and Distinguished Professor of Medicine and Richard C. Reynolds M.D. chair in General Internal Medicine at Rutgers Robert Wood Johnson Medical School; Yvette Schlussel, research scientist and statistician in the Department of Nutritional Sciences; and Helaine Noveck at Rutgers Robert Wood Johnson Medical School.

Credit: 
Rutgers University