Earth

Plant provenance influences pollinators

image: Earth bumblebee covered with pollen from field scabious.

Image: 
WWU - Peter Lessmann

Insect decline is one of the greatest challenges facing our society. As a result of the destruction of many natural habitats, bees, bumblebees, butterflies, beetles and the like find less and less food. As a consequence, they are barely able to fulfil their role as pollinators of wild and cultivated plants. This trend is particularly noticeable in agricultural regions.

Researchers at the University of Münster have now taken a more detailed look at how the choice of seeds in restoration measures - i.e. the restoration of natural habitats on degraded land - affects how insects benefit from these measures. Here, not only the plant species plays an important role but also the geographical provenance of the seeds used, because provenance influences both insect diversity and how often the pollinators visit flowers. The results of the study have been published in the "Journal of Applied Ecology".

Background and methodology

Insects are indispensable for the functioning of ecosystems - and for human survival. They are necessary, for example, for the pollination of many cultivated plants which are, in turn, an essential source of nutrition for humans. In regions characterized by agriculture or in built-up areas with settlements and cities, fewer resources are available to pollinators. To support them in their pollination work, flower-rich habitats are created in the landscape, often in the form of wildflower strips.

When flower strips or other habitats are created, however, it should be taken into account that plant species are not homogeneous entities, as their populations differ genetically. This differentiation often arises as populations adapt to their local environments. A brown knapweed, for example, which grows near the sea - where frost is rare - will be less frost-resistant than a brown knapweed which grows in the mountains, where frost is common. The differences can be seen in many plant traits, and some of these differences can influence pollinators, for example the number of flowers or the time when they flower. "Depending on the provenance, some populations flower earlier than others," explains Dr. Anna Lampei Bucharová from Münster University's Institute of Landscape Ecology, who led the study. "When setting up habitats for pollinators, these within-species differences have so far often been neglected," she adds, "and the plants are mostly selected regardless of their provenance. This is why we tested whether the provenance of the plants influences pollinators."

The geographical provenance of the seeds plays a key role in this context. In a field experiment, the researchers formed small experimental plant communities which had exactly the same species composition but different provenances. The populations came from the Münster region, from the area around Munich and from greater Frankfurt an der Oder. They then recorded flowering data, observed the pollinators visiting these communities, and compared the frequency and diversity of the pollinators in communities with different provenances.

The researchers discovered that a plant's provenance influences pollinators - both how often the pollinators visit flowers and also the diversity of the insect species. "The effect can be considerable," says Dr. David Ott, co-author of the study. "We observed twice as many visits by pollinators at flowers of one provenance as at flowers of another provenance. The most important parameter driving this is the phenology of the plant's flowering - in other words, the temporal sequence of flowering," he adds. The researchers conclude that plants from some provenances started to flower earlier and more intensively than others, and so they presented more flowers and, as a result, interacted more frequently with pollinators.

The results are important both for scientists and for ecological restoration. The researchers are confident that Germany provides good conditions for implementing provenance-based restoration strategies, because regional ecotypes of many species are readily available in the so-called "Regiosaatgut" ("regional seeds") system. This system provides regional seeds for many species for up to 22 regions in Germany. Thus, by selecting the appropriate plant origins, resources for pollinators could be sustainably improved.

Credit: 
University of Münster

Unlocking Australia's biodiversity, one dataset at a time

image: An ALA Staff member

Image: 
CSIRO

Australia's unique and highly endemic flora and fauna are threatened by rapid losses in biodiversity and ecosystem health, caused by human influence and environmental challenges. To monitor and respond to these trends, scientists and policy-makers need reliable data.

Biodiversity researchers and managers often don't have the necessary information, or access to it, to tackle some of the greatest environmental challenges facing society, such as biodiversity loss or climate change. Data can be a powerful tool for the development of science and decision-making, which is where the Atlas of Living Australia (ALA) comes in.

ALA - Australia's national biodiversity database - uses cutting-edge digital tools which enable people to share, access and analyse data about local plants, animals and fungi. It brings together millions of sightings as well as environmental data like rainfall and temperature in one place to be searched and analysed. All data are made publicly available - ALA was established in line with open-access principles and uses an open-source code base.

The impressive set of databases on Australia's biodiversity includes information on species occurrence, animal tracking, specimens, biodiversity projects, and Australia's Natural History Collections. The ALA also manages a wide range of other data, including information on spatial layers, indigenous ecological knowledge, taxonomic profiles and biodiversity literature. Together with its partner tools, the ALA has radically enhanced ease of access to biodiversity data. A forum paper recently published with the open-access, peer-reviewed Biodiversity Data Journal details its history, current state and future directions.

Established in 2010 under the Australian Government's National Collaborative Research Infrastructure Strategy (NCRIS) to support the research sector with trusted biodiversity data, the ALA now delivers data and related services to more than 80,000 users every year, helping scientists, policy makers, environmental planners, industry, and the general public to work more efficiently. It also supports the international community as the Australian node of the Global Biodiversity Information Facility and provides the code base for the successful international Living Atlases community.

With thousands of records being added daily, the ALA currently contains nearly 95 million occurrence records of over 111,000 species, the earliest of them dating from the late 1600s. Among them, 1.7 million are observation records harvested by computer algorithms, and their share is expected to keep growing.

Recognising the potential of citizen science for contributing valuable information to Australia's biodiversity, the ALA became a member of the iNaturalist Network in 2019 and established an Australian iNaturalist node to encourage people to submit their species observations. Projects like DigiVol and BioCollect were also born from ALA's interest in empowering citizen science.

The ALA BioCollect platform supports biodiversity-related projects by capturing both descriptive metadata and raw primary field data. BioCollect has a strong citizen science emphasis, with 524 citizen science projects that are open to involvement by anyone. The platform also provides information on projects related to ecoscience and natural resource management activities.

Hosted by the Australian Museum, DigiVol is a volunteer portal where over 6,000 public volunteers have transcribed over 800,000 specimen labels and 124,000 pages of field notes. Harnessing the power and passion of volunteers, the tool makes more information available to science by digitising specimens, images, field notes and archives from collections all over the world.

Built on a decade of partnerships with biodiversity data partners, government departments, community and citizen science organisations, the ALA provides a robust suite of services, including a range of data systems and software applications that support both the research sector and decision makers. Well regarded both domestically and internationally, it has built a national community that is working to improve the availability and accessibility of biodiversity data.

Credit: 
Pensoft Publishers

Researchers design micro-sized capsules for targeted drug delivery -- inspired by Russian pelmeni

An international team led by a Skoltech researcher has developed a method for fabricating biodegradable polymer microcapsules, made more efficient by turning to an unusual source of inspiration - the making of traditional Russian dumplings, or pelmeni. The two papers were published in Materials and Design and ACS Applied Materials and Interfaces.

Micro-sized capsules, which can be tailored to a variety of purposes, have proven very useful in the targeted delivery of drugs and other bioactive compounds. To ensure optimal functioning, the capsules have to be designed and manufactured with precision and in particular shapes, as non-spherical capsules have turned out to be more efficient and effective than spherical ones.

"Non-spherical capsules could have side directed release as one side could degrade first and let the cargo release, they also could be navigated in flow with magnetic field. But the most important advantage is that biological cells more readily internalize non-spherical objects, however, this phenomenon is not yet understood," Gleb Sukhorukov of Skoltech and Queen Mary University of London, the papers' lead author, explains.

In the two papers, Sukhorukov and his colleagues describe a way to create micrometer-sized pyramid-, rectangular- and torpedo-shaped capsules by using soft lithography. In this method, a template is coated with a polymer, then cargo (a drug, for instance) is loaded onto the polymer and sealed by a top polymer layer, ending up sandwiched between the two layers. The capsules are then printed onto gelatin and harvested by dissolving the gelatin in water.

"The approach is not only inspired by Russian pelmeni making process, but in fact really reproduces on a microstructure level the trick that allows us to wrap various components inside, like proteins (meat in proper pelmeni) or natural healthy components (like berries or mashed potatoes in case of vareniki, a similar product)," Sukhorukov notes.

In the first paper, the team demonstrated two approaches, based on polyelectrolyte multilayer and polylactic acid, that resulted in 7-micrometer-long torpedo-shaped capsules. These had a high loading capacity, retained hydrophilic molecules well and were internalized by cells without causing toxic effects. "The proposed method offers great flexibility for the choice of active substances, regardless of their solubility and molecular weight," the authors write.

In the second paper, the researchers described pyramid and rectangular capsules made of polylactic acid, which are respectively about 1 and 11 micrometers in size. These capsules proved sufficiently stable to encapsulate small water-soluble molecules and to retain them for several days for subsequent intracellular delivery, and/or to serve as a depot for controlled release.

"So far we created the capsules from polylactic acid, and we plan to explore these principles with other polymers which undergo degradation and hence release of cargo under specific conditions such as temperature, enzymes, pH and so on," Sukhorukov says.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

XYZeq: A better map of cell diversity

image: Alex Marson, director of the Gladstone-UCSF Institute of Genomic Immunology, is part of a team of researchers who developed a new technique to map the diversity of individual cells within a tissue or tumor.

Image: 
Anastasiia Sapon

SAN FRANCISCO, CA--April 21, 2021--Not all cancer cells within a tumor are created equal; nor do all immune cells (or all liver or brain cells) in your body have the same job. Much of their function depends on their location. Now, researchers at Gladstone Institutes, UC San Francisco (UCSF), and UC Berkeley have developed a more efficient method than ever before to simultaneously map the specialized diversity and spatial location of individual cells within a tissue or a tumor.

The technique, called XYZeq, was described online this week in the journal Science Advances. It involves segmenting a tissue into a microscopic grid before analyzing RNA from intact cells in each square of the grid, in order to gain a clear understanding of how each particular cell is functioning within its spatial location. This offers new insight into the organization of tissues and the interplay between different cell types during disease, including in cancers.

"What we have built is essentially a way to combine microscopy and single-cell analysis by sequencing," says Chun Jimmie Ye, PhD, associate professor of medicine at UCSF and lead author of the paper. "This technology gets us closer to being able to create a modern-day atlas of the human body. It lets us see not only what cells are included in a specific tissue, but where they are located within that tissue, what their relationships are, and how that changes with disease."

"I think we're actually taking a step toward this being the way tissues are analyzed to diagnose, characterize, or study disease; this is the pathology of the future," says Alex Marson, MD, PhD, one of the study's senior authors who is the director of the Gladstone-UCSF Institute of Genomic Immunology, and associate professor of medicine at UCSF.

Over the last decade, the advent of single-cell sequencing--which lets researchers analyze (or sequence) all the DNA or RNA contained in one cell at the same time--has shed new light on the diversity of cells within tissues. However, experimentally capturing where each of those cells sits within the tissue remains challenging with currently available methods. So, while researchers could see that a tissue contained a great diversity of cells, they didn't know how that diverse mixture of cells was arranged.

"What most people had been doing until now was taking a whole tissue, grinding it up, and then getting single-cell data from that mixture," says Youjin Lee, a postdoctoral scholar at Gladstone and co-first author of the new study. "Once the tissue is ground up, all information about the cells' spatial relationships is lost. With this new approach, we retain information about where each cell came from."

In XYZeq, a slice of tissue is placed on a slide that divides the tissue into hundreds of "microwells," each about the size of a grain of salt. Each cell in the tissue gets tagged with a molecular barcode--a short stretch of DNA--that's unique to the microwell it's contained in, like a neighborhood zip code. The cells are then mixed up and assigned a second barcode to ensure that each cell within a given square has its own unique identifier, like a street address within the neighborhood. Finally, the RNA from each cell is sequenced, and the results retain both barcodes to tell the researchers exactly where in the tissue it came from.
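Conceptually, the two barcodes work like a nested address: the first identifies the microwell (the "zip code"), the second distinguishes individual cells within that well (the "street address"). The sketch below is a simplified illustration in Python, not the actual XYZeq pipeline; the barcode sequences and grid coordinates are invented for the example.

```python
# Illustrative sketch of dual spatial/cell barcoding (not the actual XYZeq software).
# Each microwell on the grid gets a "zip code" barcode; each cell then receives a
# second "street address" barcode, so (well_barcode, cell_barcode) is unique.

from collections import defaultdict

# Hypothetical mapping from well barcode to (x, y) position on the tissue grid.
WELL_COORDS = {
    "AACGT": (0, 0),
    "TTGCA": (0, 1),
    "GGATC": (1, 0),
}

def decode_read(read_barcodes):
    """Given (well_barcode, cell_barcode) from a sequenced transcript,
    return the grid position and a unique cell identifier."""
    well_bc, cell_bc = read_barcodes
    xy = WELL_COORDS.get(well_bc)
    if xy is None:
        return None  # barcode not recognized; discard the read
    return {"grid_xy": xy, "cell_id": f"{well_bc}-{cell_bc}"}

# Group transcripts per cell while retaining spatial origin.
reads = [("AACGT", "C01"), ("AACGT", "C01"), ("GGATC", "C07")]
cells = defaultdict(int)
for r in reads:
    info = decode_read(r)
    if info:
        cells[(info["cell_id"], info["grid_xy"])] += 1

print(cells)  # counts of reads per (cell, grid position)
```

The key design point the zip-code analogy captures is that neither barcode alone is enough: many cells share a well barcode, and cell barcodes can repeat across wells, but the pair is unique.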

To test the utility and effectiveness of XYZeq, the team sampled tissue from mice with liver and spleen tumors. Their new approach let them visualize how cancer cells and healthy cells were arranged next to each other in the tissue samples.

However, that basic division--which could have been seen using other methods--wasn't all they could see. The team also found that some cell types located in the vicinity of the liver tumor weren't evenly spaced out; immune cells and specific types of stem cells were clustered in certain regions of the tumor. Moreover, certain stem cells had different levels of some RNA molecules depending on how far they resided from the tumor.

"This is a pattern we never would have been able to see without the spatial information that XYZeq conveys," says Derek Bogdanoff, graduate student at UCSF and co-first author of the study.

The researchers aren't sure yet what this pattern means, but they hint at the possibility that molecular signals generated by or near the tumor affect what nearby cells do.

This is the kind of spatial information that XYZeq was created to show. And the approach not only has promise in helping researchers untangle the roles of cells in the complex environment around a cancerous tumor, but in revealing cellular patterns in other organs and diseases. The brain, for instance, contains diverse cells whose physical arrangement influences how they communicate and store information.

To develop the XYZeq system, the researchers had to build each individual component, which required a large investment in time and machinery that is not available to all research groups. So, the team is now working on ways to scale up the technology to make it more accessible to other scientists.

"This technology generates spatially localized single-cell data that can be applied to tissues from different diseases," says Eric Chow, PhD, an assistant professor of biochemistry at UCSF and a senior author of the paper. "Eventually, that will help us move toward being able to use it in clinical settings as well."

Credit: 
Gladstone Institutes

Fighting harmful bacteria with nanoparticles

image: Deadly contact: Researchers at Empa and ETH have developed nanoparticles (red) that can kill resistant bacteria (yellow).

Image: 
Empa

In the arms race "mankind against bacteria", bacteria are currently ahead of us. Our former miracle weapons, antibiotics, are failing more and more frequently when germs use tricky maneuvers to protect themselves from the effects of these drugs. Some species even retreat into the inside of human cells, where they remain "invisible" to the immune system. These particularly dreaded pathogens include multi-resistant staphylococci (MRSA), which can cause life-threatening diseases such as sepsis or pneumonia.

In order to track down the germs in their hideouts and eliminate them, a team of researchers from Empa and ETH Zurich is now developing nanoparticles that use a completely different mode of action from conventional antibiotics: While antibiotics have difficulty in penetrating human cells, these nanoparticles, due to their small size and structure, can penetrate the membrane of affected cells. Once there, they can fight the bacteria.

Bioglass and metal

The team of Inge Herrmann and Tino Matter has used cerium oxide, a material with antibacterial and anti-inflammatory properties in its nanoparticle form. The researchers combined the nanoparticles with a bioactive ceramic material known as bioglass. Bioglass is of interest in the medical field because it has versatile regenerative properties and is used, for example, for the reconstruction of bones and soft tissues.

They then used flame synthesis to produce hybrid nanoparticles made of cerium oxide and bioglass. The particles have already been successfully used as wound adhesives (https://www.empa.ch/de/web/s604/empa-innovation-award-2020), whereby several interesting properties can be utilized simultaneously: Thanks to the nanoparticles, bleeding can be stopped, inflammation can be dampened and wound healing can be accelerated. In addition, the novel particles show a significant effectiveness against bacteria, while the treatment is well tolerated by human cells.

Recently, the new technology was successfully patented. The team has now published its results in the scientific journal Nanoscale in the "Emerging Investigator Collection 2021".

Destruction of germs

The researchers were able to show the interactions between the hybrid nanoparticles, the human cells and the germs using electron microscopy, among other methods. If infected cells were treated with the nanoparticles, the bacteria inside the cells began to dissolve. However, if the researchers specifically blocked the uptake of the hybrid particles, the antibacterial effect was gone.

The particles' exact mode of action is not yet fully understood. It has been shown that other metals also have antimicrobial effects. However, cerium is less toxic to human cells than, for instance, silver. Scientists currently assume that the nanoparticles affect the cell membrane of the bacteria, creating reactive oxygen species that lead to the destruction of the germs. Since the membrane of human cells is structurally different, our cells are not affected by this process.

The researchers think that resistance is less likely to develop against a mechanism of this kind. "What's more, the cerium particles regenerate over time, so that the oxidative effect of the nanoparticles on the bacteria can start all over again," says Empa researcher Tino Matter. In this way, the cerium particles could have a long-lasting effect.

Next, the researchers want to analyze the interactions of the particles in the infection process in more detail in order to further optimize the structure and composition of the nanoparticles. The goal is to develop a simple, robust antibacterial agent that is effective inside infected cells.

Tricky germs

Among bacteria, there are some particularly devious pathogens that penetrate into cells and are thus invisible to the immune system. This is how they survive times when the body's defense is on alert. This phenomenon is also known for staphylococci. They can retreat into cells of the skin, connective tissue, bones and even the immune system. The mechanism of this persistence is not yet fully understood.

Staphylococci are mostly harmless germs that can be found on the skin and mucous membranes. Under certain conditions, however, the bacteria flood the body and cause severe inflammation, or even lead to toxic shock and sepsis. This makes staphylococci the leading cause of death from infections caused by a single type of pathogen.

The increasing number of staphylococcal infections that no longer respond to treatment with antibiotics is particularly precarious. MRSA, multi-resistant germs, are particularly feared in hospitals where, as nosocomial pathogens, they cause poorly treatable wound infections or colonize catheters and other medical equipment. In total, around 75,000 hospital infections occur in Switzerland every year, 12,000 of which are fatal.

Cerium: Jack-of-all-trades among the chemical elements

The chemical element cerium is named after the dwarf planet Ceres, but there is nothing diminutive about its uses: the silvery metal is currently making a big splash. As cerium oxide, it is incorporated into car catalytic converters, and it is also used in the manufacture of products as diverse as self-cleaning ovens, windscreens and light-emitting diodes (LEDs). Its antimicrobial and anti-inflammatory properties also make it interesting for medical applications.

Credit: 
Swiss Federal Laboratories for Materials Science and Technology (EMPA)

California's wildfire season has lengthened, and its peak is now earlier in the year

Irvine, Calif., April 22, 2021 -- California's wildfire problem, fueled by a concurrence of climate change and a heightened risk of human-caused ignitions in once uninhabited areas, has been getting worse with each passing year of the 21st century.

Researchers in the Department of Civil & Environmental Engineering at the University of California, Irvine have conducted a thorough analysis of California Department of Forestry and Fire Protection wildfire statistics from 2000 to 2019, comparing them with data from 1920 to 1999. They learned that the annual burn season has lengthened in the past two decades and that the yearly peak has shifted from August to July. The team's findings are the subject of a study published today in the open-access journal Scientific Reports.

The study is a focused examination of fire frequency, burned area and myriad drivers of the catastrophically destructive events. The team found that the number of hot spots - places with severe fire risk - has grown significantly in recent years, fueled by higher annual mean temperatures, greater vapor pressure deficit (lack of air moisture), drought, and an elevated chance of blazes being sparked through such human causes as power line disruptions, construction, transportation, campfires, discarded cigarettes and fireworks.

"CALFIRE data show that each new year of the 21st century has been a record breaker in terms of wildfire damage in California," said co-author Tirtha Banerjee, UCI assistant professor of civil & environmental engineering. "We also have seen that about 80 percent of the total number of the state's wildfires over the past few decades have been small, measuring less that 500 acres. But when fires get large, their deadliness greatly increases."

Banerjee said that to gain a proper understanding of the growth of fire risk in California, it's important to put large and small incidents into separate buckets. By doing this, the team learned that 1,247 out of 6,336 wildfires, about 20 percent, accounted for 97 percent of the total burned area in the 2000 to 2020 period.
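As an illustration of this kind of "bucketing" analysis, the short Python sketch below separates a list of fire sizes at a 500-acre threshold and reports what share of the fires, and of the total burned area, falls into the large-fire bucket. It is a toy example with placeholder data, not the UCI team's actual analysis of the CAL FIRE records.

```python
# Toy illustration of splitting fires into small/large buckets (not the study's code).

def large_fire_share(fire_acres, threshold=500):
    """Return (fraction of fires, fraction of burned area) for fires >= threshold acres."""
    large = [a for a in fire_acres if a >= threshold]
    n_frac = len(large) / len(fire_acres)
    area_frac = sum(large) / sum(fire_acres)
    return n_frac, area_frac

# Placeholder sizes in acres; real input would be the CAL FIRE incident records.
example_fires = [12, 40, 310, 2_500, 18_000, 95, 7, 60_000]
n_frac, area_frac = large_fire_share(example_fires)
print(f"{n_frac:.0%} of fires account for {area_frac:.0%} of burned area")
```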

"And more than nine-tenths of the casualties and property losses can be attributed to fires exceeding the 500-acre threshold," Banerjee said.

He added that over the past two decades, there has been a significant increase in "extreme" wildfires scorching more than 10,000 acres. Coinciding with that has been a rapid uptick in the frequency of small, human-caused blazes.

One of the most alarming findings of the study, according to lead author Shu Li, a Ph.D. student in Banerjee's laboratory, is the substantial spatial growth of fire risk throughout the state. From 1920 to 1999, California's only hot spot with "very high wildfire density" was Los Angeles County. In the past 20 years, that designation has expanded greatly in Southern California to include Ventura County and portions of Riverside, San Diego and San Bernardino counties.

Even in northern California, areas known by fire managers as the Nevada-Yuba-Placer Unit and the Tuolumne-Calaveras Unit are newly emerged as high-density wildfire regions.

"Before 2000, there were almost no human-caused wildfires along California's Pacific coastline, but now nearly every coastal county is experiencing increased risk, and the San Benito-Monterey Unit and the San Luis Obispo Unit have even become new hot spots," said Li.

Many of the major fires in the northern part of the state are naturally occurring, predominantly ignited by lightning. But the majority of the increase in fire probability in recent years can be blamed on an expansion of the wildland-urban interface. As people move into previously unpopulated areas, they bring their bad fire management habits with them.

"The concurrence of human-caused climate change, which is drying out our forests and grasslands and creating longer stretches of hot weather, and a steady influx of people into remote areas is creating conditions for the perfect fire storm," said Banerjee. "But there is some good news in all of this; human-caused fire risk can be mitigated by better fire management practices by humans."

He said he hoped the study and the near real-time analysis of fire risks in California's natural environment it provides can be used by government agencies and public policy officials to both prevent and combat costly blazes.

Credit: 
University of California - Irvine

Blacks, Hispanics, impoverished have worse survival rates among teens, adults under 40 with cancer

image: Caitlin Murphy, Ph.D.

Image: 
UT Southwestern Medical Center

DALLAS - April 22, 2021 - Being Black or Hispanic, living in high-poverty neighborhoods, and having Medicaid or no insurance coverage are associated with higher mortality in men and women under 40 with cancer, a review by UT Southwestern Medical Center researchers found.

"Survival is not different because of biology. It's not different because of patient-level factors," says Caitlin Murphy, Ph.D., lead author of the study and an assistant professor of population and data sciences and internal medicine at UT Southwestern. "No matter which way we looked at the data, we still saw consistent and alarming differences in survival by race - and these are teens and young adults."

Other findings based on an analysis of Texas Cancer Registry data from 1995 to 2016 showed:

- Black men with non-Hodgkin lymphoma had a 57 percent survival rate compared with 75 percent for white men.

- The survival rate for Black patients with testicular cancer was 88.7 percent, compared with 96.6 percent for white patients.

- Survival decreased as poverty increased for these highly treatable cancers among all race and ethnic groups.

- Men with private insurance had survival rates 20 percent higher for testicular, colorectal, and kidney cancers and non-Hodgkin lymphoma than men with no insurance or with Medicaid.

"By far the strongest predictor or association was race. In particular, the Black race was consistently associated with lower survival, even if patients are not poor and have insurance," Murphy says.

The study, published in the Journal of the National Cancer Institute, included 55,000 female and more than 32,000 male cancer patients ages 15 to 39, a population in which few studies have been conducted as the average age of most cancer patients is 66. The Texas Cancer Registry, established in 1995 by the Texas Department of State Health Services, is one of the largest cancer registries in the United States.

Senior author Sandi Pruitt, Ph.D., associate professor of population and data sciences, says the numbers show a need for greater investments in health care coverage and neighborhood revitalization.

"Where Black teens and young adults are with cancer survival today is worse than it was for white kids about 10 years ago. It's unreal," Pruitt says. "I think it is underappreciated how much the conditions in which we are born and live impact our health, and, in this case, the health of a very special and underserved population - teens and young adults with cancer. Persistent poverty and racism, combined with the low rate of health insurance which is common in Texas, are part of the context that leads to the worse survival for certain population groups we've observed in this study."

The authors call for future research and interventions to address the disparities, including better health insurance coverage, greater inclusion of teens and young adults in clinical trials, more collection of biospecimens from underserved teens and young adults, programs and policies designed to be anti-racist, and additional studies comparing risk factors, treatment, and outcomes.

Credit: 
UT Southwestern Medical Center

Life satisfaction among young people linked to collectivism

An international group of scientists from Italy, the USA, China and Russia has studied the relationship between collectivism, individualism and life satisfaction among young people aged 18-25 in four countries. They found that the higher a country's index of individualistic values, the higher young people's life satisfaction. At the individual level, however, collectivism was more significant for young people: in all countries there was a positive association between collectivism, particularly with regard to family ties, and life satisfaction. This somewhat contradicts and at the same time clarifies the results of previous studies. Russia was represented in the research group by Sofya Nartova-Bochaver (https://www.hse.ru/en/org/persons/143572312), Professor at HSE University's School of Psychology. The results of the study have been published in the journal Applied Psychology: Health and Well-Being.

https://iaap-journals.onlinelibrary.wiley.com/doi/abs/10.1111/aphw.12259

What is this about?

Research shows that cultural factors play a significant role in explaining differences in indicators of subjective well-being and, in particular, life satisfaction.

Life satisfaction is one component of subjective well-being. It is an individual's assessment of how their living conditions match their standards: a sense of correspondence between desires and needs on the one hand, and achievements and resources on the other.

Cultural factors include the values of individualism or collectivism. In general, an understanding of individualism is based on the assumption that people are independent of each other. It is a worldview centred on personal goals, uniqueness and control. Collectivism, on the other hand, assumes the importance of connections with others and mutual obligations.

Scientists distinguish between collectivism and individualism both at the cultural level (as part of the national culture) and at the individual level (as the individual's worldview). Within the approach taken by the American psychologist Harry Triandis, individualism and collectivism can each be considered in two dimensions -- horizontal and vertical:

Vertical Individualism (VI) is characterized by a desire to be outstanding and gain status through competition with others.

Horizontal Individualism (HI) is related to the desire to be unique, different from the group and able to rely on oneself.

Vertical Collectivism (VC) is characteristic of people who emphasize the integrity of their group and maintain competition with outgroups (a group of people to which the individual feels no sense of identity or belonging), as well as the possible subordination of their desires to authority.

Horizontal Collectivism (HC) is related to the desire to be like others, to follow common values, and to live interdependently without having to submit to authority.

The study's authors set out to discover how different dimensions of collectivism and individualism relate to life satisfaction in young people during early adulthood.

How was it studied?

The study involved 1,760 young men and women aged 18-25 from China, Italy, Russia and the USA -- countries that differ greatly in their individualistic values index. The average age of the respondents was around 20 years old. All of them were university students, studying primarily social and behavioural sciences.

According to Hofstede's model, Italy and the United States are individualist cultures, while China and Russia are collectivist.

The study used established questionnaires to measure individual levels of collectivism and individualism -- the Horizontal and Vertical Individualism and Collectivism Scale (INDCOL) -- as well as the level of life satisfaction -- the Satisfaction With Life Scale (SWLS). The influence of gender, age and cultural differences on life satisfaction was taken into account.

What were the findings?

At the country level, it was confirmed that individualism is closely linked to the degree of life satisfaction among young people. The higher the country's index of individualistic values, the more satisfied respondents are with their lives. Americans fare best in this regard, as the USA has the highest individualism index, followed by Italians in second place and Russians and Chinese in third and fourth place, respectively.

At an individual level, the results were different -- life satisfaction showed a positive correlation with the two collectivist dimensions (vertical and horizontal) regardless of the type of culture. However, no significant correlations were found with either vertical or horizontal individualism.

The study showed that the degree of life satisfaction among young people is related to interdependence and social communication in different types of cultures. The researchers cite the example of Russians and Italians. For both, although one group lives in a collectivist country and the other in an individualist one, life satisfaction is positively related to the successful fulfilment of social roles and obligations. This is perhaps to be expected: the transition to adulthood in Italy, as the authors note, is strongly intertwined with family relationships.

Previous research on American samples had not shown a relationship between life satisfaction and mutual social commitment. This study did, for both dimensions of collectivism.

Overall, the fact that vertical collectivism, namely family ties and the obligation to take care of one's family, even at the expense of one's own needs, contributes positively to life satisfaction is unexpected and noteworthy, say the researchers. At the same time, the findings are consistent with a recent study showing that family and social relations are important basic components of happiness in different countries, regardless of gender and age.

Why is this needed?

Early adulthood is a period when there are still few social obligations and more opportunities to live out individualistic values. The original hypothesis of the study was that levels of life satisfaction are positively related to individualistic values at a personal level. Confirming this would have supported the results of much previous work. However, the results turned out to be the opposite.

The authors note that this study is more age-restricted than previous ones and also looks at the relationship between life satisfaction and different dimensions of individualism and collectivism. The new findings suggest that further research in this area is needed to clarify the particular influence of individualist and collectivist values on different aspects of subjective well-being.

Here, however, the researchers make it clear that the country-level result may arise not only because Americans and Italians are more satisfied with life thanks to their countries' individualistic culture, but also because of differences in social inequality, the availability of opportunities and future life prospects.

Credit: 
National Research University Higher School of Economics

Improving survival in pancreatic cancer

image: Kondo and his team found TUG1 overexpression in some pancreatic cancer patients leads to increased release of an enzyme (DPD), which breaks down the chemotherapeutic, 5-FU, into a compound that can't kill cancer cells. Targeting this pathway reduced chemotherapy resistance in mice.

Image: 
Yutaka Kondo

Nagoya University researchers and colleagues in Japan have uncovered a molecular pathway that enhances chemotherapy resistance in some pancreatic cancer patients. Targeting an RNA to interrupt its activity could improve patient response to therapy and increase their overall survival.

"Pancreatic cancer is one of the most aggressive human malignancies, with an overall median survival that is less than five months," says cancer biologist Yutaka Kondo of Nagoya University Graduate School of Medicine. "This poor prognosis is partially due to a lack of potent therapeutic strategies against pancreatic cancer, so more effective treatments are urgently needed."

Kondo and his colleagues focused their attention on a long noncoding RNA (lncRNA) called taurine upregulated gene 1 (TUG1). lncRNAs are gene regulators, several of which have recently been found to help some cancers resist chemotherapy. TUG1 is already known to be overexpressed in gastrointestinal cancers that have poor prognosis and are resistant to chemotherapy.

The researchers found TUG1 was overexpressed in a group of patients with pancreatic ductal adenocarcinoma. These patients were resistant to the standard chemotherapy treatment 5-fluorouracil (5-FU) and died much sooner than patients with low TUG1 expression levels.

Further laboratory tests showed TUG1 counteracts a specific microRNA, leading to increased activity of an enzyme, called dihydropyrimidine dehydrogenase, which breaks down 5-FU into a compound that can't kill cancer cells.

Kondo and his team found they could suppress TUG1 during 5-FU treatment of mice with pancreatic cancer by using antisense oligonucleotides attached to a specially designed cancer-targeting drug delivery system. Antisense oligonucleotides interfere with gene expression.

"Our data provides evidence that our therapeutic approach against pancreatic cancer could be promising," says Kondo.

The team now plans to conduct further laboratory investigations to test the effectiveness of their therapeutic strategy.

Credit: 
Nagoya University

Freeze! Executioner protein caught in the act

image: A cell in yellow is shown dying by necroptosis, a process requiring the protein MLKL

Image: 
WEHI Australia

A new molecular 'freeze frame' technique has allowed WEHI researchers to see key steps in how the protein MLKL kills cells.

Small proteins called 'monobodies' were used to freeze MLKL at different stages as it moved from a dormant to an activated state, a key process that enables an inflammatory form of cell death called necroptosis. The team were able to map how the three-dimensional structure of MLKL changed, revealing sites that might be targeted by drugs - a potential new approach to blocking necroptosis as a treatment for inflammatory diseases.

The research, which was published in Nature Communications, was led by Associate Professor James Murphy and PhD students Ms Sarah Garnish and Mr Yanxiang Meng, in collaboration with Assistant Professor Akiko Koide and Professor Shohei Koide from New York University, US.

At a glance

The 'executioner' protein MLKL kills cells through an inflammatory process called necroptosis.

Monobody technology has enabled WEHI researchers to capture different forms of MLKL as it becomes activated and moves to kill the cell.

Understanding how the three-dimensional shape of MLKL changes may lead to the development of drugs that prevent necroptosis, as a treatment for inflammatory diseases.

Key steps in necroptosis

MLKL is a key protein in necroptosis, being the 'executioner' that kills cells by making irreparable holes in their exterior cell membrane. This allows the cell contents to leak out and triggers inflammation - alerting nearby cells to a threat, such as an infection.

Ms Garnish said MLKL was activated within a protein complex called a 'necrosome' which responded to external signals.

"While we know which proteins activate MLKL, and that this involves protein phosphorylation, nobody had been able to observe any detail about how this changes MLKL at the structural level. It happens so fast that it's essentially a 'molecular blur'," she said.

A new technology - monobodies - developed by Professor Koide's team, was key to revealing how MLKL changed.

Monobodies that specifically bound to different 'shapes' of MLKL were used to capture these within cells, Mr Meng said.

"These monobodies prevented MLKL from moving out of these shapes - so we could freeze MLKL into its different shapes," he said.

"We then used structural biology to generate three-dimensional maps of these shapes which could be compared. This revealed that MLKL passed through distinct shape changes as it transitioned from being activated through to breaking the cell membrane."

An important step

Associate Professor Murphy said the structures provided the first formal evidence for how MLKL changed its shape after it was activated.

"Until now, we've speculated that this happens, but it was only with monobodies that we could actually prove there are distinct steps in MLKL activation," he said.

"Necroptosis is an important contributor to inflammatory conditions such as inflammatory bowel disease. There is intense interest in MLKL as a key regulator of necroptosis - and how it could be blocked by drugs as a potential new anti-inflammatory therapy."

The research was supported by the Australian Government National Health and Medical Research Council and Department of Education, Skills and Employment, a Melbourne Research Scholarship, the Wendy Dowsett Scholarship, an Australian Institute of Nuclear Science and Engineering Postgraduate Research Award, the Australian Cancer Research Foundation, the US National Institutes of Health and the Victorian Government.

The Australian Synchrotron's MX beamlines were critical infrastructure for the project.

Credit: 
Walter and Eliza Hall Institute

Landscape-induced back-building thunderstorm lines along the mei-yu front

image: Torrential floodwaters rushing down from the mountains surrounding a river basin in the Jiangxi Province of China.

Image: 
Cover design by AAS

Thunderstorm development is not always dependent on atmospheric physics alone. Often, the surrounding landscape can influence convection, especially in regions with dramatic elevation changes. The Yangtze river basin in China's Jiangxi Province, which is surrounded by the Nanling Mountains, often experiences mesoscale convective systems (MCSs), or squall-line thunderstorms, during the summer. These MCSs develop along the persistent mei-yu front and often exhibit quickly developing parallel back-building, or training, thunderstorms, resulting in torrential flooding. A research team led by Dr. Zhemin Tan, Professor at the School of Atmospheric Sciences of Nanjing University, analyzed the influences of the regional landscape that lead to consistent MCS back-building in the Yangtze river basin.

"Parallel back-building convective lines are often observed along the mei-yu front in China, and they can quickly develop into a stronger convective group of echoes, resulting in locally heavy rainfall within the mei-yu front rainband." said Dr. Tan

. "Mesoscale convective systems evolving along the mei-yu front induced cold outflow centered over the eastern side of the basin, which pushed the leading edge of the mei-yu front toward the mountains on the southeast side of the basin."

To better understand what initiates back-building convective lines, Dr. Tan and a group of researchers from the Key Laboratory of Mesoscale Severe Weather of Nanjing University, performed a high-resolution model simulation of a typical MCS event during 27-28 June 2013. Simulation results show that new convection along the convective lines is forced by intermittent interaction between the cold MCS outflow and the warm southerly airflow ahead of the mei-yu front. This process is enhanced by nearby terrain, especially the Nanling Mountains.

"The mountains along the way played a crucial role in supporting the rapid development of the convective lines to include torrential flood." said Dr. Tan. He, along with the study's coauthors, submitted their findings in Advances in Atmospheric Sciences. The journal published the noteworthy research as a cover article.

This mei-yu front MCS evolved from the western side of the basin. As it moved east, cold outflow centered over the eastern part of the basin. Strong southwest airflow ahead of the front passed the Nanling Mountains, merging with the cold outflow within the basin, sparking the erratic first stage of parallel convective line formation (Fig. 1a). Then, low mountains along the airmass boundary enhanced uneven storm development within the MCS.

"Knowledge of the effects of the mountains on the convective line formation can help to understand and predict the heavy precipitation events over the basin region during the mei-yu season in China," believes Dr. Tan.

In this case, the MCS quickly grew upscale from the first stage convective lines, resulting in apparent precipitation cooling. This process enhanced the cold outflow, shifting it southward (Fig 1b). Stronger cold outflow then pushed the warm airflow farther south, impacting the mountains on the southeast side of the basin. Mountain valleys, or terrain gaps in the southeastern basin are then roughly parallel to the outflow and play a controlling role in a second stage formation of parallel convective lines.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Recreating the earliest stages of life

image: Gladstone investigator Kiichiro Tomoda and his colleagues use stem cells to generate synthetic mouse embryos containing the three fundamental cell types normally found in pre-implantation embryos.

Image: 
Gladstone Institutes

SAN FRANCISCO, CA--April 22, 2021--In their effort to understand the very earliest stages of life and how they can go wrong, scientists are confronted with ethical issues surrounding the use of human embryos. The use of animal embryos is also subject to restrictions rooted in ethical considerations. To overcome these limitations, scientists have been trying to recreate early embryos using stem cells.

One of the challenges in creating these so-called synthetic embryos is to generate all the cell types normally found in a young embryo before it implants into the wall of the uterus. Some of these cells eventually give rise to the placenta. Others become the amniotic sac in which the fetus grows. Both the placenta and the amniotic sac are crucial for the survival of the fetus, and defects in these embryo components are major causes of early pregnancy loss.

A group of scientists from Gladstone Institutes, the Center for iPS Cell Research and Application (CiRA) at Kyoto University, and the RIKEN Center for Biosystems Dynamics Research in Kobe, Japan, has now demonstrated the presence of precursors of the placenta and the amniotic sac in synthetic embryos they created from mouse stem cells.

"Our findings provide strong evidence that our system is a good model for studying the early, pre-implantation stages of embryo development," says Kiichiro Tomoda, PhD, research investigator at the recently opened iPS Cell Research Center at Gladstone and first author of the study published in the journal Stem Cell Reports. "Using this model, we will be able to dissect the molecular events that take place during these early stages, and the signals that the different embryonic cells send to each other."

Ultimately, this knowledge might help scientists develop strategies to decrease infertility due to early embryonic development gone awry.

The new findings could also shed light on a defining property of the earliest embryo cells that has been difficult to capture in the lab: their ability to produce all the cell types found in the embryo and, ultimately, the whole body. Scientists refer to this property as "totipotency."

"Totipotency is a very unique and short-lived property of early embryonic cells," says Cody Kime, PhD, an investigator at the RIKEN Center for Biosystems Dynamics Research and the study's senior author.

"It has been much harder to harness in the lab than pluripotency," he adds, referring to the ability of some cells to give rise to several--but not all--cell types. "A very exciting prospect of our work is the ability to understand how we can reprogram cells in the lab to achieve totipotency."

Growing the Fundamental Components of Early Embryos in the Lab

To generate synthetic embryos, the scientists started from mouse pluripotent stem cells that normally give rise to the fetus only--not the placenta or amniotic sac. They can grow these cells, called epiblast stem cells, and multiply them indefinitely in the lab.

In previous work, the team had discovered a combination of nutrients and chemicals that could make epiblast stem cells assemble into small cell structures that closely resemble pre-implantation embryos. In fact, the structures could even reach the implantation stage when transferred into female mice, though they degenerated shortly thereafter.

"This meant that we might successfully reprogram the epiblast cells to revert to an earlier stage, when embryonic cells are totipotent, and provided a clue to how we might generate both the fetus and the tissues that support its implantation," explains Tomoda, who is also a program-specific research center associate professor at CiRA.

To build on that work and better understand the reprogramming process, the scientists needed molecular resolution. In their new study, they turned to single-cell RNA sequencing, a technique that allows scientists to study individual cells based on the genes they turn on or off.

After analyzing thousands of individual cells reprogrammed from epiblast stem cells, and sifting the data through computer-powered analyses, they confirmed that, after 5 days of reprogramming, some cells closely resembled all three precursors of the fetus, the placenta, and the amniotic sac.

Moreover, as they were grown in the lab for a few more days, the three cell types displayed more distinct molecular profiles with striking similarity to real embryonic model cells. This is the same as would be expected during the growth of a normal embryo, when the three tissues acquire distinct physical properties and biological functions.

"Our single-cell RNA-sequencing analysis confirms the emergence in our synthetic embryo system of the cell types that lead to the three fundamental components of an early mammalian embryo," says Kime. "In addition, it unveils in amazing detail the genes and biological pathways involved in the development of these precursors and their maturation into specific tissues."

This knowledge provides a comprehensive backdrop against which to understand the mechanisms of early embryo development and the possible causes of its failure.

For now, the scientists plan to work on ways to increase the efficiency of their reprogramming process, so as to reliably produce large amounts of pre-implantation-like synthetic embryos for further studies. This would allow them to carry out experiments that were up to now unthinkable, such as large-scale screens for gene mutations that disrupt early embryos. And it may shed light on the causes of pregnancy loss due to early embryo failure.

They also want to better understand the molecular steps involved in reprogramming. In particular, they plan to look earlier than 5 days into the reprogramming process, with the hope of pinpointing truly totipotent cells at the origin of their synthetic embryos.

"The discovery that we could reprogram cells to adopt earlier, more pluripotent states revolutionized developmental biology 15 years ago," says Tomoda, referring to the discovery of induced pluripotent stem cells by his and Kime's mentor, Nobel Laureate Shinya Yamanaka.

"In the last few years, the field of synthetic embryology utilizing stem cells has seen a true explosion," he says. "Our method of generating synthetic embryos is simpler than others, and quite efficient. We think it will be a great resource for many labs."

Credit: 
Gladstone Institutes

New study shows people with a high Omega-3 index less likely to die prematurely

A new research paper examining the relationship between the Omega-3 Index and risk for death from any and all causes has been published in Nature Communications. It showed that those people with higher omega-3 EPA and DHA blood levels (i.e., Omega-3 Index) lived longer than those with lower levels. In other words, those people who died with relatively low omega-3 levels died prematurely, i.e., all else being equal, they might have lived longer had their levels been higher.

Numerous studies have investigated the link between omega-3s and diseases affecting the heart, brain, eyes and joints, but few studies have examined their possible effects on lifespan.

In Japan, omega-3 intakes and blood levels are higher than in most other countries in the world, and the Japanese happen to live longer than most. Coincidence? Possibly, or maybe a high Omega-3 Index is part of the explanation.

Studies reporting estimated dietary fish or omega-3 intake have reported benefits on risk for death from all causes, but "diet record" studies carry little weight because of the imprecision in getting at true EPA and DHA intakes. Studies using biomarkers - i.e., blood levels - of omega-3 are much more believable because the "exposure" variable is objective.

This new paper is from the FORCE - Fatty Acids & Outcomes Research - Consortium. FORCE comprises researchers around the world who have gathered data on blood fatty acid levels in large groups of study subjects (or cohorts) and have followed those individuals over many years to determine what diseases they develop. These data are then pooled to get a clearer picture of these relationships than a single cohort can provide. The current study focused on omega-3 levels and the risk for death during the follow-up period, and it is the largest study yet to do so.

Specifically, this report is a prospective analysis of pooled data from 17 separate cohorts from around the world, including 42,466 people followed for 16 years on average during which time 15,720 people died. When FORCE researchers examined the risk for death from any cause, the people who had the highest EPA+DHA levels (i.e., at the 90th percentile) had a statistically significant, 13% lower risk for death than people with EPA+DHA levels in the 10th percentile. When they looked at three major causes of death - cardiovascular disease, cancer and all other causes combined - they found statistically significant risk reductions (again comparing the 90th vs 10th percentile) of 15%, 11%, and 13%, respectively.

In terms of red blood cell membrane omega-3 levels (i.e., the Omega-3 Index), the range between the 10th and 90th percentiles for EPA+DHA was about 3.5% to 7.6%. Other research suggests that an optimal Omega-3 Index is 8% or higher.
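
To make the percentile comparison concrete, the short calculation below converts the reported 13% lower risk across the roughly 3.5%-to-7.6% gap into an implied hazard ratio per one-point increase in the Omega-3 Index, assuming a simple log-linear (Cox-type) relationship. That modeling assumption is ours for illustration; the paper's adjusted analysis is more involved.

```python
import math

# Reported comparison: 90th percentile (Omega-3 Index ~7.6%) vs 10th (~3.5%),
# with a 13% lower risk of death from any cause.
hr_90_vs_10 = 1.0 - 0.13      # hazard ratio implied by a 13% risk reduction
index_gap = 7.6 - 3.5         # percentage-point gap between the percentiles

# Under a log-linear model, the implied hazard ratio per one-point higher
# Omega-3 Index spreads the 90th-vs-10th ratio evenly across the gap.
hr_per_point = math.exp(math.log(hr_90_vs_10) / index_gap)
print(f"Implied hazard ratio per 1-point higher Omega-3 Index: {hr_per_point:.3f}")
```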

In the new paper, the authors noted that these findings suggest that omega-3 fatty acids may beneficially affect overall health and thus slow the aging process, and that their benefits are not limited to heart disease.

"Since all of these analyses were statistically adjusted for multiple personal and medical factors (i.e., age, sex, weight, smoking, diabetes, blood pressure, etc., plus blood omega-6 fatty acid levels), we believe that these are the strongest data published to date supporting the view that over the long-term, having higher blood omega-3 levels can help maintain better overall health," said Dr. Bill Harris, Founder of the Fatty Acid Research Institute (FARI), and lead author on this paper.

Dr. Harris co-developed the Omega-3 Index 17 years ago as an objective measure of the body's omega-3 status. Measuring omega-3s in red blood cell membranes offers an accurate picture of one's overall omega-3 intake during the last four to six months. To date, the Omega-3 Index has been featured in more than 200 research studies.

"This comprehensive look at observational studies of circulating omega-3 fatty acids indicates that the long chain omega-3s EPA, DPA, and DHA, usually obtained from seafood, are strongly associated with all-cause mortality, while levels of the plant omega-3 alpha-linolenic acid (ALA) are less so," said Tom Brenna, PhD, Professor of Pediatrics, Human Nutrition, and Chemistry, Dell Medical School of the University of Texas at Austin.

Credit: 
Wright On Marketing & Communications

Climate has shifted the axis of the Earth

image: Melting of glaciers in Alaska, Greenland, the Southern Andes, Antarctica, the Caucasus and the Middle East accelerated in the mid-90s, becoming the main driver pushing Earth's poles into a sudden and rapid drift toward 26°E at a rate of 3.28 millimeters (0.129 inches) per year.

Color intensity on the map shows where changes in water stored on land (mostly as ice) had the strongest effect on the movement of the poles from April 2004 to June 2020. Inset graphs plot the change in glacier mass (black) and the calculated change in water on land (blue) in the regions of largest influence.

Image: 
Deng et al (2021) Geophysical Research Letters/AGU

WASHINGTON-- Glacial melting due to global warming is likely the cause of a shift in the movement of the poles that occurred in the 1990s.

The locations of the North and South poles aren't static, unchanging spots on our planet. The axis Earth spins around--or, more specifically, the points on the surface where that invisible line emerges--is always moving due to processes scientists don't completely understand. The way water is distributed on Earth's surface is one factor that drives the drift.

Melting glaciers redistributed enough water to cause the direction of polar wander to turn and accelerate eastward during the mid-1990s, according to a new study in Geophysical Research Letters, AGU's journal for high-impact, short-format reports with immediate implications spanning all Earth and space sciences.

"The faster ice melting under global warming was the most likely cause of the directional change of the polar drift in the 1990s," said Shanshan Deng, a researcher at the Institute of Geographic Sciences and Natural Resources Research at the Chinese Academy of Sciences, the University of the Chinese Academy of Sciences and an author of the new study.

The Earth spins around an axis, much like a spinning top, explains Vincent Humphrey, a climate scientist at the University of Zurich who was not involved in this research. If the weight of a top is shifted, it starts to lean and wobble as its rotational axis changes. The same thing happens to the Earth as mass is shifted from one area to another.

Researchers have been able to determine the causes of polar drift from 2002 onward based on data from the Gravity Recovery and Climate Experiment (GRACE), a joint mission by NASA and the German Aerospace Center that launched twin satellites that year, with a follow-up mission in 2018. The mission gathered information on how mass is distributed around the planet by measuring uneven changes in gravity at different points.

Previous studies based on GRACE mission data revealed some of the reasons for later changes in direction. For example, research has attributed more recent movements of the North Pole away from Canada and toward Russia to factors such as molten iron in the Earth's outer core. Other shifts were caused in part by what's called terrestrial water storage change, the process by which all the water on land--including frozen water in glaciers and groundwater stored under our continents--is being lost through melting and groundwater pumping.

The authors of the new study believed that this water loss on land contributed to the shifts in the polar drift in the past two decades by changing the way mass is distributed around the world. In particular, they wanted to see if it could also explain changes that occurred in the mid-1990s.

In 1995, the direction of polar drift shifted from southward to eastward. The average drift speed from 1995 to 2020 was also about 17 times the average speed recorded from 1981 to 1995.

Now researchers have found a way to wind modern pole tracking analysis backward in time to learn why this drift occurred. The new research calculates the total land water loss in the 1990s before the GRACE mission started.

"The findings offer a clue for studying past climate-driven polar motion," said Suxia Liu, a hydrologist at the Institute of Geographic Sciences and Natural Resources Research at the Chinese Academy of Sciences, the University of the Chinese Academy of Sciences and the corresponding author of the new study. "The goal of this project, funded by the Ministry of Science and Technology of China is to explore the relationship between the water and polar motion."

Water loss and polar drift

Using data on glacier loss and estimates of groundwater pumping, Liu and her colleagues calculated how the water stored on land changed. They found that water loss from the polar regions is the main driver of polar drift, with additional contributions from water loss in nonpolar regions. Together, all this water loss explained the eastward change in polar drift.
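
As a simplified illustration of that kind of bookkeeping, the sketch below sums hypothetical regional land-water mass changes into a single pole-displacement direction, weighting each mass anomaly by its size and position on the globe. The regions, numbers, and resulting direction are assumptions for illustration, not the study's calculation.

```python
import math

# Hypothetical regional land-water mass changes (gigatonnes per year); the
# values are made up, so the computed direction is illustrative only.
mass_changes = [
    {"name": "Greenland",   "lat": 72.0,  "lon": -40.0,  "dM": -280.0},
    {"name": "Alaska",      "lat": 61.0,  "lon": -150.0, "dM": -75.0},
    {"name": "Antarctica",  "lat": -80.0, "lon": 0.0,    "dM": -150.0},
    {"name": "North India", "lat": 28.0,  "lon": 78.0,   "dM": -30.0},
]

def excitation_direction(changes):
    """Weight each mass anomaly at colatitude theta and longitude lam by
    dM * sin(theta) * cos(theta) * (cos(lam), sin(lam)); the summed vector
    gives the direction (degrees east of Greenwich) toward which the mean
    pole is nudged."""
    chi1 = chi2 = 0.0
    for c in changes:
        theta = math.radians(90.0 - c["lat"])   # colatitude
        lam = math.radians(c["lon"])
        w = c["dM"] * math.sin(theta) * math.cos(theta)
        chi1 += w * math.cos(lam)
        chi2 += w * math.sin(lam)
    return math.degrees(math.atan2(chi2, chi1))

print(f"Net excitation direction: {excitation_direction(mass_changes):.1f} deg E")
```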

"I think it brings an interesting piece of evidence to this question," said Humphrey. "It tells you how strong this mass change is--it's so big that it can change the axis of the Earth."

Humphrey said the change to the Earth's axis isn't large enough that it would affect daily life. It could change the length of day we experience, but only by milliseconds.

The faster ice melting couldn't entirely explain the shift, Deng said. While they didn't analyze this specifically, she speculated that the slight gap might be due to activities involving land water storage in non-polar regions, such as unsustainable groundwater pumping for agriculture.

Humphrey said this evidence reveals how much of an impact direct human activity can have on changes to the mass of water stored on land. The team's analysis revealed large changes in water mass in areas like California, northern Texas, the region around Beijing and northern India, for example--all areas that have been pumping large amounts of groundwater for agricultural use.

"The ground water contribution is also an important one," Humphrey said. "Here you have a local water management problem that is picked up by this type of analysis."

Liu said the research has larger implications for our understanding of land water storage earlier in the 20th century. Researchers have 176 years of data on polar drift. Using some of the methods highlighted by her and her colleagues, those changes in direction and speed could be used to estimate how much land water was lost in past years.

Credit: 
American Geophysical Union

Ankle exoskeleton enables faster walking

image: Engineers at Stanford University have tested how well a prototype exoskeleton system they have developed increased the self-selected walking speed of people in an experimental setting. The mode that was optimized for speed was created through a human-in-the-loop process.

Image: 
Farrin Abbott

Being unable to walk quickly can be frustrating and problematic, but it is a common issue, especially as people age. Noting the pervasiveness of slower-than-desired walking, engineers at Stanford University have tested how well a prototype exoskeleton system they have developed - which attaches around the shin and into a running shoe - increased the self-selected walking speed of people in an experimental setting.

The exoskeleton is externally powered by motors and controlled by an algorithm. When the researchers optimized it for speed, participants walked, on average, 42 percent faster than when they were wearing normal shoes and no exoskeleton. The results of this study were published April 20 in IEEE Transactions on Neural Systems and Rehabilitation Engineering.

"We were hoping that we could increase walking speed with exoskeleton assistance, but we were really surprised to find such a large improvement," said Steve Collins, associate professor of mechanical engineering at Stanford and senior author of the paper. "Forty percent is huge."

For this initial set of experiments, the participants were young, healthy adults. Given their impressive results, the researchers plan to run future tests with older adults and to look at other ways the exoskeleton design can be improved. They also hope to eventually create an exoskeleton that can work outside the lab, though that goal is still a ways off.

"My research mission is to understand the science of biomechanics and motor control behind human locomotion and apply that to enhance the physical performance of humans in daily life," said Seungmoon Song, a postdoctoral fellow in mechanical engineering and lead author of the paper. "I think exoskeletons are very promising tools that could achieve that enhancement in physical quality of life."

Walking in the loop

The ankle exoskeleton system tested in this research is an experimental emulator that serves as a testbed for trying out different designs. It has a frame that fastens around the upper shin and into an integrated running shoe that the participant wears. It is attached to large motors that sit beside the walking surface and pull a tether that runs up the length of the back of the exoskeleton. Controlled by an algorithm, the tether tugs the wearer's heel upward, helping them point their toe down as they push off the ground.

For this study, the researchers had 10 participants walk in five different modes of operation. They walked in normal shoes without the exoskeleton, with the exoskeleton turned off, and with the exoskeleton turned on in three different modes: optimized for speed, optimized for energy use, and a placebo mode adjusted to make them walk more slowly. In all of the tests, participants walked on a treadmill that adapts to their speed.

The mode that was optimized for speed - which resulted in the 42 percent increase in walking pace - was created through a human-in-the-loop process. An algorithm repeatedly adjusted the exoskeleton settings while the user walked, with the goal of improving the user's speed with each adjustment. Finding the speed-optimized mode of operation took about 150 rounds of adjustment and two hours per person.
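
The article describes this human-in-the-loop optimization only at a high level; the sketch below shows the general shape of such a loop as a simple hill-climbing search over two made-up assistance parameters, with a synthetic stand-in for the speed measurement. The parameter names, the objective, and the hill-climbing rule are assumptions for illustration; the actual controller and optimizer differ.

```python
import random

def measure_walking_speed(settings):
    """Stand-in for the real measurement: in the experiment, a participant
    walks on a self-paced treadmill with assistance defined by `settings`
    and their self-selected speed is recorded. Here, a synthetic objective
    plus noise lets the sketch run on its own."""
    peak_torque, timing = settings
    return 1.3 + 0.5 * peak_torque - (timing - 0.53) ** 2 + random.gauss(0, 0.02)

def human_in_the_loop_optimize(rounds=150):
    """Propose new assistance settings each round and keep whichever settings
    have yielded the fastest measured walking speed so far."""
    best_settings = (0.5, 0.5)  # normalized torque magnitude and timing (hypothetical)
    best_speed = measure_walking_speed(best_settings)
    for _ in range(rounds):
        candidate = tuple(
            min(1.0, max(0.0, s + random.gauss(0, 0.05))) for s in best_settings
        )
        speed = measure_walking_speed(candidate)
        if speed > best_speed:
            best_settings, best_speed = candidate, speed
    return best_settings, best_speed

settings, speed = human_in_the_loop_optimize()
print(f"Best settings: {settings}, measured speed: {speed:.2f} m/s")
```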

In addition to greatly increasing walking speed, the speed-optimized mode also reduced the energy used per meter traveled, by about 2 percent. However, that result varied widely from person to person, which is somewhat expected, given that saving energy was not an intentional feature of that exoskeleton mode.
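
For context, energy use per meter traveled is metabolic power divided by walking speed, so a mode that raises speed proportionally more than it raises power lowers the figure. The numbers below are hypothetical, chosen only so the change comes out near the reported 2 percent; they are not taken from the study.

```python
def energy_per_meter(metabolic_power_w, speed_m_per_s):
    """Energy used per meter traveled (J/m): metabolic power divided by speed."""
    return metabolic_power_w / speed_m_per_s

# Hypothetical values for illustration only.
baseline = energy_per_meter(300.0, 1.30)   # normal shoes
optimized = energy_per_meter(418.0, 1.85)  # speed-optimized assistance
print(f"Change in energy per meter: {100 * (optimized - baseline) / baseline:.1f}%")
```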

"The study was designed to specifically answer the scientific question about increasing walking speed," Song said. "We didn't care too much about the other performance measures, like comfort or energy. However, seven out of 10 participants not only walked faster but consumed less energy, which really shows how much potential exoskeletons have for helping people in an efficient way."

The settings that were optimized specifically for energy use were borrowed from a previous experiment. In the current study, this mode decreased energy use more than the speed-optimized settings but did not increase speed as much. As intended, the placebo mode both slowed down participants and boosted their energy use.

Better, faster, stronger

Now that the researchers have attained such significant speed assistance, they plan to focus future versions of the ankle exoskeleton emulator on reducing energy use consistently across users, while also being more comfortable.

In considering older adults specifically, Collins and his lab wonder whether future designs could reduce pain caused by weight on joints or improve balance. They plan to conduct similar walking tests with older adults and hope those provide encouraging results as well.

"A 40 percent increase in speed is more than the difference between younger adults and older adults," said Collins. "So, it's possible that devices like this could not only restore but enhance self-selected walking speed for older individuals and that's something that we're excited to test next."

Credit: 
Stanford University