Tech

Older adults most likely to make the effort to help others

Older adults are more willing to make an effort to help others than younger adults, according to new research from the University of Birmingham.

The study, led by researchers in the University's School of Psychology, is the first to show how effortful 'prosocial' behaviour - intended to benefit others - changes as people get older. In particular, it focused on people's willingness to exert physical effort, rather than to give money or time, since attitudes to both these are known to change with age. The research results are published in Psychological Science.

In the study, the research team tested a group of 95 adults aged between 18 and 36, and a group of 92 adults aged 55-85. Each participant made 150 choices about whether or not to grip a handheld dynamometer - a device for measuring grip strength or force - at six different levels of required effort. Before the experiment, the researchers measured each person's maximum grip strength, so they could make sure that the effort people had to put in was the same for everyone relative to their own strength, and not affected by how strong they were.
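
To illustrate how this kind of calibration works in practice (the percentages below are purely illustrative, not the study's actual settings), each effort level can be expressed as a fraction of an individual's own maximum grip force, so a weaker and a stronger participant face the same relative demand. A minimal Python sketch:

    # Hedged sketch: calibrate effort levels to each participant's own maximum grip.
    # The six fractions below are illustrative assumptions, not the study's values.
    def effort_targets(max_grip_newtons, levels=(0.3, 0.4, 0.5, 0.6, 0.7, 0.8)):
        """Absolute force (in newtons) required at each effort level for one person."""
        return [round(max_grip_newtons * f, 1) for f in levels]

    print(effort_targets(300.0))   # a participant with a 300 N maximum grip
    print(effort_targets(180.0))   # a weaker participant works at the same relative effort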

For each decision, participants were told whether they would be working to gain money for themselves, or for another person. First they were asked to decide whether they would be willing to put in effort to gain money or not. If they accepted the offer they had to grip hard enough to get the money.

The results showed that when the task was easy, young and older adults were equally willing to work for others, but when the task was more effortful, older adults were more willing to work to help others. In contrast, younger adults were more selfish and were much more likely to put in higher levels of effort to benefit themselves.

The team also noticed a correlation between the willingness to put effort into tasks that benefited other people and positive feelings towards other people. But it was only in younger people that this 'warm glow' feeling also related to them completing tasks for themselves.

Senior author, Dr Matthew Apps, explained: "Past research had suggested that older adults were more prosocial than young adults because they donate more money to charity. But the amount of money or time people have available changes a lot as we get older, so older adults might simply appear more prosocial. We wanted to focus simply on people's willingness to exert effort on behalf of someone else, as this shouldn't depend on your wealth or the time you have available. Our results showed very clearly that participants in our older age group were more likely to work harder for others, even though they would gain no significant financial reward for themselves."

Lead author Patricia Lockwood added: "A lot of research has focused on the negative changes that happen as people get older. We show that there are positive benefits to getting older too, in particular older adults seem to be more willing to put in effort to help others. These 'prosocial behaviours' are really important for social cohesion. Understanding how prosocial behaviour changes as people get older is critical as we predict the impact of an ageing society."

Credit: 
University of Birmingham

Thermoelectric material discovery sets stage for new forms of electric power in the future

image: Jian He is an associate professor in Clemson University's Department of Physics and Astronomy.

Image: 
Clemson University College of Science

Thermoelectrics directly convert heat into electricity and power a wide array of items -- from NASA's Perseverance rover currently exploring Mars to travel coolers that chill beverages.

A Clemson University physicist has joined forces with collaborators from China and Denmark to create a new and potentially paradigm-shifting high-performance thermoelectric compound.

A material's atomic structure, which is how atoms arrange themselves in space and time, determines its properties. Typically, solids are crystalline or amorphous. In crystals, atoms are in an orderly and symmetrical pattern. Amorphous materials have randomly distributed atoms.

Clemson researcher Jian He and the international team created a new hybrid compound in which the crystalline and amorphous sublattices are intertwined in a one-of-a-kind crystal-amorphicity duality.

"Our material is a unique hybrid atomic structure with half being crystalline and half amorphous," said He, an associate professor in the College of Science's Department of Physics and Astronomy. "If you have a unique or peculiar atomic structure, you would expect to see very unusual properties because properties follow structure."

The high-profile energy research journal Joule published their findings in a paper titled "Thermoelectric materials with crystal-amorphicity duality induced by large atomic size mismatch," which appeared online on April 16 ahead of the May 19 issue.

The researchers created their hybrid material by intentionally mixing elements in the same group on the periodic table but with different atomic sizes. Here, they used the atomic size mismatches between sulfur and tellurium and between copper and silver to create a new compound (Cu1-xAgx)2(Te1-ySy) in which the crystalline and amorphous sublattices intertwine into a one-of-a-kind crystal-amorphicity duality. The new compound exhibited excellent thermoelectric performance.

While this discovery doesn't directly affect applications now, it is likely to lead to better thermoelectrics in the future.

"The new material performs well, but more important than that is how it achieves that level of performance," He said. "Traditionally, thermoelectric materials are crystals. Our material is not pure crystal, and we show we can achieve the same level of performance with a material with a new atomic structure."

He said he expects the new material will begin affecting applications in 10 to 20 years.

"They definitely can do something current thermoelectric materials cannot do, but not now," He said. "However, the future of this research is bright."

In addition to He, the research involved scientists from Shanghai Jiaotong University, Shanghai Institute of Ceramics and SUSTech in China, and Aarhus University in Denmark.

Credit: 
Clemson University

A rich marine algal ecosystem 600 million years earlier than previously thought

image: The Xiamaling Formation in China, which contains fossilised algae from primeval times.

Image: 
© Don E. Canfield/University of Southern Denmark

The first photosynthetic oxygen-producing organisms on Earth were cyanobacteria. Their evolution dramatically changed the Earth, allowing oxygen to accumulate in the atmosphere for the first time and enabling the subsequent evolution of oxygen-utilizing organisms, including eukaryotes. Eukaryotes include animals, but also algae, a broad group of photosynthetic oxygen-producing organisms that now dominate photosynthesis in the modern oceans. When, however, did algae begin to occupy marine ecosystems and compete with cyanobacteria as important phototrophic organisms?

In a new study, Zhang et al. use the molecular remains of ancient algae (so-called biomarkers) to show that algae occupied an important role in marine ecosystems 1400 million years ago, some 600 million years earlier than previously recognized.

The specific biomarkers explored by Zhang et al. are a group of sterane molecules derived from sterols, which are prominent components of cell membranes in eukaryotic organisms. A particular difficulty in analyzing ancient steranes is that samples are easily contaminated with steranes from other sources. The sources of contamination range from steranes introduced during the sampling, transport and processing of the samples, to geological contamination as fluids have flowed through the rocks.

Zhang et al. carefully controlled for each of these sources of contamination and found, as have others, that no steranes were liberated when using standard protocols to extract biomarkers from such ancient rocks, in this case the 1400 million-year-old Xiamaling Formation in North China.

However, Shuichang Zhang, the lead author of the study, speculated: "There is some fossil evidence for eukaryotic algae 1400 million years ago, or even earlier, so we wondered whether any steranes in these rocks might be more tightly bound to the kerogens and not easily released during standard biomarker extraction." Therefore, Zhang et al. used a stepwise heating protocol in which samples were slowly heated in gold tubes in nine steps from 300°C to 490°C. The organic molecules released at each step were extracted, and steranes indicating the presence of both red and green algae were recovered, especially at the higher temperatures.

Zhang continues "Many will be concerned that the steranes we found were a product of some kind of contamination. We were also worried about this, but we ran in parallel samples that have been heated to high temperatures during their geologic history and that, therefore, contained no biomarkers. We found no steranes in these. This means that our protocols were clean, and we are therefore confident that the steranes we found were indigenous to the rock".

It's still not completely clear why the steranes were so tightly bound to the kerogen and not released during standard protocols. But the findings of Zhang et al. show that both green and red algal groups were present in marine ecosystems by 1400 million years ago. This is 600 million years earlier than evident from previous biomarker studies. This work shows that the red and green algal lineages had certainly evolved by 1400 million years ago, and this should be a useful constraint in timing the overall history of eukaryote evolution. It also shows that at least some ancient marine ecosystems functioned more similarly to modern ecosystems than previously thought, at least with respect to the types of photosynthetic organisms producing organic matter. This means, furthermore, that there were sufficient nutrients and oxygen available to support algae-containing ecosystems.

Professor Don Canfield, Nordic Center for Earth Evolution, University of Southern Denmark, a co-author on the study adds: "We hope that our study will inspire others to utilize similar techniques to better unravel the full history of eukaryote evolution through geologic time".

Credit: 
University of Southern Denmark

Are our oil and gas pipelines safe during an earthquake?

image: The new research shows that current methods used for calculating stress received by the underground pipelines during an earthquake are incorrect.

Image: 
Peter the Great St.Petersburg Polytechnic University

Underground pipelines that transport oil and gas are critical pieces of engineering infrastructure worldwide. Some of these underground lines are built and operated in earthquake-prone areas.

The seismic safety, or seismic stability, of underground pipelines has been studied intensively since the 1950s.

Since then, a number of methodologies have been proposed for calculating the stress experienced by an underground pipeline during an earthquake. The purpose of these methodologies is to predict accurately the structural stress a pipeline will receive during an earthquake, which in turn determines how resilient the pipeline must be made in the first place. It is important to find the right balance between pipeline cost and structural resilience.

Turns out, we were wrong.

New research has been conducted by scientists from the Institute of Mechanics and Seismic Stability of Structures of the Academy of Sciences of the Republic of Uzbekistan and Peter the Great St. Petersburg Polytechnic University (SPbPU) in Russia. The research shows that the current methods used for calculating the stress received by underground pipelines during an earthquake are incorrect.

"The existing methods do not consider seismic pressure, i.e., the stress normal to the pipeline's outer surface arising from the propagation of a seismic wave in the soil. This leads to an incorrect determination of the pipeline's stress; the longitudinal stress calculations lead to errors of 100% or more," said Nikolai Vatin, professor of Institute of Civil Engineering SPbPU.

Disregarding this seismic pressure means that all underground oil and gas pipelines are more susceptible to earthquakes than was initially expected.

Researchers have developed a new theory of the seismic wave propagation process in an underground pipeline and surrounding soil to address this problem.

The theoretical material explaining the mathematical derivations of the new theory, and its benefits compared with the predecessor methodologies, can be found in the article "Wave Theory of Seismic Resistance of Underground Pipelines."

"It is very admirable indeed that science moves forwards, and we get better theories and find our errors. Yet the ominous concern is that all our current pipelines are at risk will stay for some long time until they are replaced and modernized" mentioned Karim Sultanov from Institute of Mechanics and Seismic Stability of Structures Academy of Sciences of the Republic of Uzbekistan.

The full research results are presented in that article.

Credit: 
Peter the Great Saint-Petersburg Polytechnic University

Virtual humans are equal to real ones in helping people practice new leadership skills

image: Virtual Humans (VH) were developed with Adobe Fuse CC

Image: 
Gonzalo Suárez, Sungchul Jung, Robert W. Lindeman

A virtual human can be as good as a flesh-and-blood one when it comes to helping people practice new leadership skills. That's the conclusion from new research published in the journal Frontiers in Virtual Reality that evaluated the effectiveness of computer-generated characters in a training scenario compared to real human role-players in a conventional setting.

Practice-based training techniques, including role-playing, are sometimes used to help improve training outcomes. However, these methods can be expensive to implement, and often require specialized knowledge and even professional actors to create realistic training environments. In addition, some trainees can feel intimidated in these role-playing scenarios.

Researchers at the Human Interface Technology Lab New Zealand at the University of Canterbury wanted to find out if computer-generated role-players in virtual and mixed reality settings could provide similar levels of effectiveness to address some of the drawbacks to traditional training techniques.

In VR, or virtual reality, participants are completely immersed in a digital world. In mixed reality (MR), elements of the digital world are overlaid onto the physical world.

For the study, the researchers designed eight virtual humans, as well as realistic VR and MR environments using commercially available software and hardware. They recruited 30 people, split into three groups, who would undergo training using a well-known leadership model.

One group involved interactions between leadership trainees and two human role-players who acted as subordinates. The second group interacted with virtual human subordinates in a VR world, while the last group met in an MR setting where participants could see virtual humans in a real office space.

Trainees were scored on how well their leadership style matched each situation, based on predefined criteria, before and after they received coaching. The result: all three groups improved their performance between the pre- and post-training sessions, and the MR cohort showed a statistically significant mean increase.

"The most remarkable finding is that virtual human role-players have been shown to be as effective as real human role-players to support the practice of leadership skills," said Gonzalo Suarez, lead author of the paper.

Suarez said one possible reason for the better results was that the MR setting blended reality and virtuality, allowing participants to perform in a safe learning environment with known features.

"Participants were able to perceive their real bodies and characteristics of the physical room where the experiment was conducted while interacting with virtual humans," he noted. "On the other hand, the experience provided by the VR scenario was completely new for the participants."

Previous studies of extended reality (XR) technologies - a term that refers to all environments using computer-generated graphics and wearables - have shown that they can provide both effective technical training and development of social skills, according to Suarez. One advantage is that XR allows learners to practice their skills and knowledge in multiple scenarios that may be too dangerous or expensive to replicate in the real world.

The current pandemic is another example of how XR technologies could be applied. "Institutions such as schools and universities can greatly benefit by using these technologies," Suarez noted.

However, there are still barriers to widespread adoption of the technology. For example, the equipment to render high-quality XR experiences can be expensive, and development could require the expertise of several different types of professionals.

Suarez said he believes those obstacles will eventually be overcome. "More sophisticated and automated content creation tools will arise over time, and their implementation will only strengthen the adoption of XR technologies for the creation of more effective and engaging learning experiences," he said. "In fact, many of these topics are currently being worked on in projects within the HIT Lab NZ."

Credit: 
Frontiers

Long-term consequences of CO2 emissions

image: Vertical section of zonally averaged oxygen changes in the simulation with historical CO2 emissions and zero emissions from 1 January 2021 onwards. Left: Year 2020 relative to 1800. Right: Year 2650 relative to 2020.

Image: 
C. Kersten, modified from A. Oschlies, 2021, GEOMAR.

The life of almost all animals in the ocean depends on the availability of oxygen, which is dissolved as a gas in seawater. However, the ocean has been continuously losing oxygen for several decades. Over the last 50 years, the global loss amounts to about 2% of the total oxygen inventory (regionally sometimes significantly more). The main reason for this is global warming, which reduces the solubility of gases, and thus of oxygen, and also slows down ocean circulation and vertical mixing. A new study published today in the scientific journal Nature Communications shows that this process will continue for centuries, even if all CO2 emissions, and thus warming at the Earth's surface, were stopped immediately.

"In the study, a model of the Earth system was used to assess what would happen in the ocean in the long term if all CO2 emissions would be stopped immediately", explains the author, Professor Andreas Oschlies from GEOMAR Helmholtz Centre for Ocean Research Kiel. "The results show that even in this extreme scenario, the oxygen depletion will continue for centuries, more than quadrupling the oxygen loss we have seen to date in the ocean ", Oschlies continues.

The long-term decrease in oxygen takes place primarily in deeper layers. According to Prof. Oschlies, this also has an impact on marine ecosystems. A so-called 'metabolic index', which measures the maximum possible activity of oxygen-breathing organisms, shows a widespread decline of up to 25%, especially in the deep sea (below 2000 metres). This is likely to lead to major shifts in this habitat, which was previously considered to be very stable, explains the oceanographer. These changes have already been set in motion by our historical CO2 emissions and are now on their way into the deep ocean. He recommends a comprehensive investigation of the deep-ocean habitat, which so far has been studied only sporadically, before this environment, deemed to have been stable for many millennia, changes significantly due to the now expected decrease in oxygen.

In the upper layers of the ocean, the model shows a much faster response to climate action. There, a further expansion of the relatively near-surface oxygen minimum zones could be stopped within a few years if emissions were stopped. An ambitious climate policy can therefore help to prevent at least the near-surface ecosystems from being put under further pressure by a progressive decrease in oxygen.

Credit: 
Helmholtz Centre for Ocean Research Kiel (GEOMAR)

Egg and sperm cell size evolved from competition

In most living animals, egg cells are vastly larger than sperm cells. In humans, for example, a single egg is 10 million times the volume of a sperm cell.

In a new study, Northwestern University researchers found that competition and natural selection drove this curious size discrepancy.

Using mathematical modeling, the researchers considered a time very early in evolution when primordial species reproduced using external fertilization. In the model, bigger reproductive cells, or gametes, presented a competitive edge because they could hold more nutrients for a potential zygote. Smaller gametes, however, required fewer resources to make, which put less stress on the parent.

"Organisms either needed to produce the biggest gametes with the most provisions or the smallest gametes to use the least resources," said Northwestern's Daniel Abrams, the study's senior author. "We believe this size difference is almost inevitable, based on plausible assumptions about how sexual reproduction works and how natural selection works."

The research was published online last night (April 15) in the Journal of Theoretical Biology.

Abrams is a professor of engineering sciences and applied mathematics at Northwestern's McCormick School of Engineering. Joseph Johnson, a Ph.D. candidate in Abrams' laboratory, is the paper's first author. Nathan White and Alain Kangabire, undergraduate students in Abrams' lab, coauthored the paper.

The Northwestern team's model begins with isogamy, a primordial state in which all gametes were roughly the same size and distinct sexes did not yet exist. The team then developed and applied a simple mathematical model to show how isogamy transitioned to anisogamy, a state where the gametes either became very small or quite large -- precursors to sperm and eggs associated with biological sexes today.

In the model, anisogamy emerged from competition to survive in an environment with limited resources. Gametes were more likely to survive if they had a size edge over their neighbors, leading to an "arms race" favoring larger and larger gametes. But organisms could not keep producing ever-larger gametes without needing more and more resources themselves. They could, however, save their resources by producing a lot of tiny gametes.
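
The basic trade-off can be made concrete with a toy calculation in the spirit of classic gamete-competition theory; the survival curve, resource budget and gamete sizes below are assumptions chosen for illustration only, not the Northwestern model or its parameters.

    # Hedged toy calculation: number-versus-provisioning trade-off for gametes.
    # Survival curve, budget and sizes are illustrative assumptions only.
    BUDGET = 1.0                      # resources a parent invests in gametes

    def survival(combined_mass):      # assumed saturating zygote-survival curve
        return combined_mass**2 / (1.0 + combined_mass**2)

    def expected_offspring(own_size, partner_size):
        n_gametes = BUDGET / own_size
        return n_gametes * survival(own_size + partner_size)

    for partner in (1.0, 0.01):                    # pairing with a large vs. a tiny gamete
        print(f"partner gamete size {partner}:")
        for own in (0.01, 0.5, 1.0):               # tiny, intermediate, large strategy
            print(f"  own size {own}: {expected_offspring(own, partner):.2f} expected surviving zygotes")

Under these assumed forms, tiny gametes do best against well-provisioned partners and large gametes do best against tiny partners, while intermediate sizes lose in both pairings - the kind of disruptive pressure that pushes a population toward the small/large split the model formalises.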

"Early in evolution when sexual reproduction emerged, gametes were symmetrical. But this is where that symmetry breaks," Abrams said. "We end up with some organisms specializing in large gametes and others specializing in small gametes."

Abrams said one remaining mystery is why some isogamous species still exist today. Some types of algae and fungi, for example, reproduce either asexually or with symmetrical mating types.

"There have been different theories about how anisogamy emerged, going all the way back to Charles Darwin," Abrams said. "Issues in evolutionary biology are very hard to test because we can only study species that are around today. We can't see what they looked like billions of years ago. Using mathematical models can yield new insight and understanding."

Credit: 
Northwestern University

Alpine plants are losing their white "protective coat"

image: There is a clear trend towards earlier snowmelt at elevations between 1,000 and 2,500 meters. A first few spots at 2,500 meters are already snow-free in April 2020.

Image: 
Photo: Lawrence Blem

Snow cover in the Alps has been melting almost three days earlier per decade since the 1960s. This trend is temperature-related and cannot be compensated by heavier snowfall. By the end of the century, snow cover at 2,500 meters could disappear a month earlier than today, as simulations by environmental scientists at the University of Basel demonstrate.

Global warming demands huge adjustments in tourism, hydropower generation and agriculture in alpine areas. But the fauna and flora also have to adapt to rising temperatures. By the end of the century, continuous snow cover for 30 days below 1,600 meters is expected to be a rare occurrence. "Snow cover protects alpine plants from frost and the growing season begins after the snowmelt. Changes in the snowmelt have a very strong influence on this period," explains Dr. Maria Vorkauf of the Department of Environmental Sciences at the University of Basel. She researched alpine plant physiology intensively for her doctoral dissertation.

New measurements at high elevations

In a new study, Vorkauf and colleagues at the University of Basel and the Institute for Snow and Avalanche Research investigated how the date of snowmelt has changed in recent decades and what shifts can be expected by the end of the 21st century. For a long time, only a few measurement series of snow cover at high elevations were available, as measurements were usually only made near inhabited regions below 2,000 meters. This changed with the IMIS measurement network, which went into operation in 2000. It automatically records the snow depth between 2,000 and 3,000 meters every half hour. The researchers combined this data with measurement series from 23 lower-lying stations with manual measurements going back to at least 1958.

Analysis of the data showed that between 1958 and 2019, snow cover between 1,000 and 2,500 meters melted an average of 2.8 days earlier every decade. This shift was not linear, but was particularly strong in the late 1980s and early 1990s. This corresponds to strong temperature increases in this time period that have been verified by climate research.

Simulation of the alpine future

Based on the analyzed measurements, the researchers developed a model that makes it possible to forecast the future development of alpine snow cover. They combined their data with the latest climate scenarios for Switzerland. If greenhouse gas emissions continue to rise as they have so far, without consistent climate protection measures, the date of snowmelt in the last third of the 21st century is likely to move forward by six days per decade. This means that by the end of the century, snowmelt at 2,500 meters elevation would occur about one month earlier than today.

The research also showed that the earlier snowmelt at high elevations cannot be compensated for by greater precipitation in the winter, as is predicted by climate models for Switzerland. "As soon as the three-week running mean of daily air temperatures exceeds 5 °C, snow melts relatively quickly," explains Vorkauf. "At high elevations in particular, temperature is much more important than the depth of the snow cover."
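
As a rough illustration of that rule (the daily temperatures below are invented, and the study's actual snowmelt modelling is more detailed), the threshold check amounts to a 21-day running mean:

    # Hedged sketch of the "three-week running mean above 5 degC" rule quoted above.
    # Daily temperatures are invented; the study's melt modelling is more detailed.
    def running_mean(values, window=21):
        return [sum(values[i - window + 1:i + 1]) / window
                for i in range(window - 1, len(values))]

    daily_temps = [-2.0 + 0.25 * day for day in range(60)]   # hypothetical steady spring warming

    means = running_mean(daily_temps)
    first = next((i + 21 for i, m in enumerate(means) if m > 5.0), None)
    print(f"Rapid melt expected from day {first} of this series onward")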

Earlier flowering and higher frost risk

In the future, the early snowmelt could extend the growing season of alpine plants by about a third. As is known from studies of other alpine plant species, an earlier start to the growing season leads to fewer flowers, less leaf growth and a lower survival rate due to the higher risk of frost. "Some species such as the Alpine sedge, which is typical of alpine grasslands, will grow and flower earlier because of the early snowmelt," says Vorkauf.

Although temperatures in alpine areas are rising faster, alpine plant species are not more strongly affected by climate change than those at other elevations. "The topography and exposure of alpine terrain creates very diverse microclimates on a small scale. Within these, plants can retreat over short distances at the same elevation," Vorkauf explains. As a result, alpine plant species do not have to "flee" to the heights, as is often assumed.

Credit: 
University of Basel

An antibody-drug combo to combat cancer

image: A novel therapeutic strategy targeting cMoP for leukemias and tumors.

Image: 
Department of Biodefense Research,TMDU

Tokyo, Japan - Leukemias are debilitating cancers of the hematopoietic, or blood-forming, cells of the bone marrow. Now, researchers at Tokyo Medical and Dental University (TMDU) describe an ingenious strategy against chronic myelomonocytic leukemia (CMML): an antibody-drug conjugate (ADC), comprising a cytotoxic drug payload linked to an antibody that selectively targets specific cell lineages, which effectively blocks malignant cell proliferation at its source.

Hematopoietic stem and progenitor cells (HSPCs) continually differentiate into the entire panoply of blood cells, as many as 500 billion per day in the average human. CMML results from genetic mutations in HSPCs and is characterized by an increase of monocytes and immature abnormal cells in the peripheral blood and bone marrow. This disordered hematopoiesis results in intractable anemia, infections and bleeding disorders. Stem cell transplantation is the only established cure, but it requires invasive preconditioning and risks graft-versus-host disease (GvHD) and intractable infections, especially in the elderly age group most often affected. Conventional drugs may induce remission and reduce tumor burden, but responses fluctuate between unresponsiveness and fatal marrow suppression.

HSPCs, being multipotent, can replenish all blood cell types and can self-renew. Targeting them might seem like a solution to cancers and other immune diseases, but doing so can also disrupt normal cell lineages, causing anemia through red cell deficiency and increasing the risk of infection through white cell dysfunction. It is therefore desirable to identify and specifically target monopotent progenitors - cells that are 'committed' to producing one particular cell line.

"We had earlier identified monocyte progenitors and pre-monocytes which express the monocyte marker CD64," first author Yuta Izumi explains. "We developed an ADC combining anti-CD64 antibody with a cytotoxic agent dimeric pyrrolobenzodiazepine (dPBD) that induces apoptosis of multiplying human monocyte-restricted progenitors but not of stable mature monocytes."

Co-first author Masashi Kanayama elaborates, "We found that anti-CD64-dPBD killed proliferating monocytic leukemia cells and blocked their generation from bone marrow progenitors in a patient-derived CMML xenograft experimental mouse model. Moreover, other types of hematopoietic cells including HSPCs, neutrophils, lymphocytes and platelets were unaffected. Additionally, by depleting the source of monocytes, our ADC eliminated tumor-associated macrophages and significantly reduced tumor size in humanized mice bearing solid tumors."

"Selectively targeting proliferating monocyte progenitors and leukemia cells with our double-barreled ADC strategy causes minimal collateral damage to other cell lineages," claims senior author Toshiaki Ohteki. "It is therefore a very promising therapeutic tool against monocytic leukemias, solid tumors and monocyte-related inflammatory and auto-immune diseases."

Credit: 
Tokyo Medical and Dental University

Combining news media and AI to rapidly identify flooded buildings

image: Combining News Media and AI to Rapidly Identify Flooded Buildings

Image: 
MLIT, Shikoku Regional Development Bureau

Artificial intelligence (AI) has sped up the process of detecting flooded buildings immediately after a large-scale flood, allowing emergency personnel to direct their efforts efficiently. Now, a research group from Tohoku University has created a machine learning (ML) model that uses news media photos to identify flooded buildings accurately within 24 hours of the disaster.

Their research was published in the journal Remote Sensing on April 5, 2021.

"Our model demonstrates how the rapid reporting of news media can speed up and increase the accuracy of damage mapping activities, accelerating disaster relief and response decisions, said Shunichi Koshimura of Tohoku University's International Research Institute of Disaster Science and co-author of the study.

ML and deep learning algorithms are tailored to classify objects through image analysis. For AI and ML to be effective, data is needed to train the model - flood data in the current case.

Although flood data can be collected from previous events, using it inevitably leads to problems because every event is different and subject to the local characteristics of the flooded area. Onsite information therefore has higher reliability.

News crews and media teams are often the first on the scene of a disaster to broadcast images to viewers at home, and the research team recognized that this information too could be used in AI algorithms.

They applied their model to Mabi-cho, Kurashiki city in Okayama Prefecture, which was affected by the heavy rains across western Japan in 2018.

First, the researchers identified press photos and geolocated them based on landmarks and other clues appearing in each photo. Next, they used synthetic aperture radar (SAR) PALSAR-2 images provided by JAXA to classify the flooded and non-flooded conditions of areas whose status was unknown.

Here, SAR images can be used to map water bodies, since microwaves scatter differently from wet and dry surfaces. A support vector machine (SVM), one of several machine learning techniques, was then used to classify buildings as surrounded by floodwaters or within non-flooded areas.
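
The classification step can be sketched generically as follows; the feature values, labels and backscatter numbers are hypothetical, and this is not the Tohoku team's actual pipeline, only an example of an SVM trained on photo-derived labels and SAR-derived features.

    # Hedged sketch: SVM classification of buildings from SAR-derived features,
    # trained on labels taken from geolocated news photos. All numbers are hypothetical.
    import numpy as np
    from sklearn.svm import SVC

    # Each row: [post-event SAR backscatter (dB), change vs. pre-event image (dB)]
    X_train = np.array([[-18.0, -6.5], [-17.2, -5.9], [-9.5, -0.4], [-10.1, -0.8]])
    y_train = np.array([1, 1, 0, 0])          # 1 = flooded (seen in news photos), 0 = not flooded

    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

    X_unknown = np.array([[-16.8, -5.2], [-10.6, -1.1]])   # buildings of unknown status
    print(clf.predict(X_unknown))             # expected: flooded, then not flooded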

"The performance of our model resulted in an 80% estimation accuracy," added Koshimura.

Looking ahead, the research group will explore the applicability of news media databases from past events as training datasets for AI models applied to present-day situations, to increase the accuracy and speed of classification.

Credit: 
Tohoku University

Quality and quantity of enrichments influence well-being of aquaculture fishes

image: The research demonstrates that stone enrichments that have been previously conditioned in lake water significantly improve survival of aquaculture fish compared to clean stones.

Image: 
Pekka Hyvärinen/Luke

Collaborative research by the University of Jyväskylä and the Natural Resources Institute Finland presents new evidence of the effects of enriched rearing on the well-being of aquaculture fishes. The research demonstrates that stone enrichments that have been previously conditioned in lake water significantly improve the survival of fish compared to clean stones. A higher number of stones has a similar positive effect. The results have practical implications for the prevention of aquaculture diseases. The study was published in Antibiotics in March 2021.

The volume of aquaculture is continuously increasing. Parasitic diseases represent a significant threat to farmed fishes and ecological solutions to minimize use of medication are being sought.

Enriched rearing, where rearing tanks are equipped with different types of structures, has been shown to reduce the impact of diseases. However, the mechanisms underlying these effects are unknown.

"This study investigates the effect of quality and quantity of enrichments on survival of young brown trout and landlocked Atlantic salmon from a common disease of aquaculture fishes, the flavobacterial infection. Enriched tanks with just a handful of stones previously conditioned in lake water had a higher fish survival compared to tanks with clean stones or those without stones. In addition, a higher number of stones has a similar positive effect", says Senior Lecturer Anssi Karvonen from the University of Jyväskyla.

"The results suggest that the conditioning forms a beneficial microbial community on the stones, which is capable in preventing harmful bacteria. We are currently conducting further investigations on the structure of the microbial communities", says Karvonen.

"Disease prevention might be possible with precision use of enrichments"

The recently published research provides information on the dynamics between enrichments and diseases. When and how the enrichments are applied can influence the outbreak and severity of a disease.

The results have significance for the health of aquaculture fishes, as well as costs associated with the rearing.

"The results suggest that the use of certain type of enrichments just before the anticipated disease outbreak can reduce the impact of a disease and possibly minimize the need for medication. This type of precision enrichment could also decrease the workload and costs associated with the cleaning of the tanks. Enrichments are already being applied in production-scale rearing of salmonid fishes intended for stocking in collaboration with aquaculture companies. For example, tests of new and more user-friendly structures of enrichments for young Atlantic salmon are currently being run in facilities of Voimalohi Ltd", says principal scientist Pekka Hyvärinen from the Natural Resources Institute Finland.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto

Heart health of shift workers linked to body clock

Sophia Antipolis - 16 April 2021: Working hours that deviate from an individual's natural body clock are associated with greater cardiovascular risk, according to research presented at ESC Preventive Cardiology 2021, an online scientific congress of the European Society of Cardiology (ESC).1

"Our study found that for each hour the work schedule was out of sync with an employee's body clock, the risk of heart disease got worse," said study author Dr. Sara Gamboa Madeira of the University of Lisbon, Portugal.

At least 20% of European employees work atypical hours or shifts,2 and growing scientific evidence associates these with deleterious cardiovascular outcomes.3 A number of explanations have been proposed, including sleep disruption and unhealthy behaviours. This study focused on the role of circadian misalignment, which is the difference between the "social clock" (e.g. work schedules) and the individual "biological clock".

Dr. Gamboa Madeira explained: "We all have an internal biological clock which ranges from morning types (larks), who feel alert and productive in the early morning and sleepy in the evening, to late types (owls), for whom the opposite is true - with most of the population falling in between. Circadian misalignment occurs when there is a mismatch between what your body wants (e.g. to fall asleep at 10pm) and what your social obligations impose on you (e.g. work until midnight)."

The study included 301 blue collar workers, all performing manual picking activity in the distribution warehouses of a retail company in Portugal. Staff always worked either early morning (6am-3pm), late evening (3pm-midnight), or night (9pm-6am) shifts. Participants completed a questionnaire on sociodemographic factors (age, sex, education), occupational factors (work schedule, seniority), and lifestyle factors and had their blood pressure and cholesterol measured.

The Munich ChronoType Questionnaire was used to assess sleep duration, and to estimate each individual's internal biological clock (also called chronotype). It was also used to quantify the amount of circadian misalignment (i.e. the mismatch between an individual's biological clock and working hours) - referred to as social jetlag. Participants were divided into three groups according to hours of social jetlag: 2 hours or less, 2-4 hours, 4 hours or more.

The researchers used the European relative risk SCORE chart which incorporates smoking, blood pressure and cholesterol to calculate relative cardiovascular risk. Relative risk ranges from 1 (non-smoker with healthy blood pressure and cholesterol) to 12 (smoker with very high blood pressure and cholesterol). In this study, a relative risk of 3 or more was considered "high cardiovascular risk". The researchers then investigated the association between social jetlag and high cardiovascular risk.

The average age of participants was 33 years and 56% were men. Just over half (51%) were smokers, 49% had high cholesterol, and 10% had hypertension. One in five (20%) were classified as high cardiovascular risk. Some 40% had a short sleep duration on workdays (6 hours or less). The average social jetlag was nearly 2 hours. In most workers (59%), social jetlag was 2 hours or less, while for 33% of staff it was 2-4 hours, and in 8% it was 4 hours or more.

A higher level of social jetlag was significantly associated with greater odds of being in the high cardiovascular risk group. The odds of being classified high cardiovascular risk increased by 31% for each additional hour of social jetlag, even after adjusting for sociodemographic, occupational, lifestyle, and sleep characteristics and body mass index.
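
To see what a 31% increase in odds per hour means in absolute terms, here is a small worked example; the baseline odds are assumed for illustration, and only the 1.31 odds ratio comes from the reported result.

    # Hedged sketch: compounding a 31%-per-hour odds ratio. Baseline odds are assumed;
    # only the odds ratio of 1.31 is taken from the reported result.
    import math

    odds_ratio_per_hour = 1.31
    baseline_odds = 0.25              # assumed odds of high cardiovascular risk at zero jetlag

    for hours in (0, 1, 2, 4):
        odds = baseline_odds * odds_ratio_per_hour ** hours
        prob = odds / (1 + odds)
        print(f"{hours} h social jetlag: odds {odds:.2f}, probability {prob:.1%}")

    print("equivalent logistic-regression coefficient:", round(math.log(odds_ratio_per_hour), 3))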

Dr. Gamboa Madeira said: "These results add to the growing evidence that circadian misalignment may explain, at least in part, the association found between shift work and detrimental health outcomes. The findings suggest that staff with atypical work schedules may need closer monitoring for heart health. Longitudinal studies are needed to investigate whether late chronotypes cope better with late/night shifts, and earlier chronotypes with early morning schedules, both psychologically and physiologically."

Credit: 
European Society of Cardiology

A new drought monitoring approach: Vector Projection Analysis (VPA)

image: The illustration of VPA with a 3-dimensional case.

Image: 
UNIST

A team of researchers affiliated with UNIST has proposed a satellite-aided drought monitoring method that can adequately represent complex drought conditions in a single integrated drought index. The newly proposed drought index has attracted considerable attention as a new method for monitoring and forecasting drought hazards, thanks to its accuracy and freedom from space-time constraints.

Drought is one of the most complex natural disasters. Unlike most other natural disasters, it is usually difficult to define the onset of a drought or to declare one. For this reason, various drought indices - characterizing, for example, drought severity, the area affected, duration, and timing - are used to monitor drought and manage its risk. Existing drought indices tend to be specific to particular types of drought indicators. Therefore, in order to investigate drought disasters comprehensively, a more generalized index that is suitable for varied drought conditions is needed.

Professor Jungho Im and his research team in the Department of Urban and Environmental Engineering at UNIST proposed a new drought monitoring approach with an adaptive index -- Vector Projection Analysis (VPA) and Vector Projection Index of Drought (VPID). The new approach is said to consider multiple drought indicators in various climate zones across the Contiguous United States (CONUS) and East Asia. A major advantage of VPA, according to the research team, is that it uses multiple dependent variables (i.e., surface-based drought indices) and multiple independent variables (i.e., satellite-derived drought factors) to capture varied climate and environmental characteristics.
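
The paper's exact formulation is not reproduced here, but the core idea of projecting several satellite-derived factors onto a reference direction can be sketched generically; the factor values and weights below are hypothetical, not the VPA/VPID formulation itself.

    # Hedged, generic sketch of a vector-projection drought index.
    # Factor values and reference weights are hypothetical, not the VPID formulation.
    import numpy as np

    # Normalised satellite-derived factors for one grid cell, e.g.
    # [precipitation anomaly, soil-moisture anomaly, vegetation-stress anomaly]
    factors = np.array([-1.2, -0.8, -0.5])

    # Reference direction representing the surface-based drought indices (assumed weights)
    reference = np.array([0.6, 0.5, 0.4])
    unit_ref = reference / np.linalg.norm(reference)

    drought_index = factors @ unit_ref        # scalar projection onto the reference direction
    print(round(float(drought_index), 3))     # more negative = more severe, by this convention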

In this study, three schemes of VPID with different combinations of variables focused on integrated-, short-, and long-term drought (VPIDinte, VPIDshort, and VPIDlong, respectively) were also evaluated over the CONUS and East Asia. According to the research team, all three schemes showed good agreement with both multi-timescale surface-based indices and drought references (USDM and EM-DAT) across the study areas at continental scales.

"This implies that VPA can capture the general trends of multi-drought indices across a wide area with heterogeneous characteristics. Since VPA is a simple method to determine the weight of its calculation under the consideration of multi-dependent variables (i.e., drought indices), it will be helpful to monitor the general characteristics of complex drought conditions in a simple way," noted the research team.

Credit: 
Ulsan National Institute of Science and Technology(UNIST)

Sunlight to solve the world's clean water crisis

image: UniSA's photothermal evaporator is a cost-effective water purification technique that could deliver safe drinking water to millions of vulnerable people using cheap, sustainable materials and sunlight.

Image: 
University of South Australia

Researchers at UniSA have developed a cost-effective technique that could deliver safe drinking water to millions of vulnerable people using cheap, sustainable materials and sunlight.

Less than 3 per cent of the world's water is fresh, and due to the pressures of climate change, pollution, and shifting population patterns, in many areas this already scarce resource is becoming scarcer.

Currently, 1.42 billion people - including 450 million children - live in areas of high, or extremely high, water vulnerability, and that figure is expected to grow in coming decades.

Researchers at UniSA's Future Industries Institute have developed a promising new process that could eliminate water stress for millions of people, including those living in many of the planet's most vulnerable and disadvantaged communities.

A team led by Associate Professor Haolan Xu has refined a technique to derive freshwater from seawater, brackish water, or contaminated water, through highly efficient solar evaporation, delivering enough daily fresh drinking water for a family of four from just one square metre of source water.

"In recent years, there has been a lot of attention on using solar evaporation to create fresh drinking water, but previous techniques have been too inefficient to be practically useful," Assoc Prof Xu says.

"We have overcome those inefficiencies, and our technology can now deliver enough fresh water to support many practical needs at a fraction of the cost of existing technologies like reverse osmosis."

At the heart of the system is a highly efficient photothermal structure that sits on the surface of a water source and converts sunlight to heat, focusing energy precisely on the surface to rapidly evaporate the uppermost portion of the liquid.

While other researchers have explored similar technology, previous efforts have been hampered by energy loss, with heat passing into the source water and dissipating into the air above.

"Previously many of the experimental photothermal evaporators were basically two dimensional; they were just a flat surface, and they could lose 10 to 20 per cent of solar energy to the bulk water and the surrounding environment," Dr Xu says.

"We have developed a technique that not only prevents any loss of solar energy, but actually draws additional energy from the bulk water and surrounding environment, meaning the system operates at 100 per cent efficiency for the solar input and draws up to another 170 per cent energy from the water and environment."

In contrast to the two-dimensional structures used by other researchers, Assoc Prof Xu and his team developed a three-dimensional, fin-shaped, heatsink-like evaporator.

Their design shifts surplus heat away from the evaporator's top surfaces (i.e. solar evaporation surface), distributing heat to the fin surface for water evaporation, thus cooling the top evaporation surface and realising zero energy loss during solar evaporation.

This heatsink technique means all surfaces of the evaporator remain at a lower temperature than the surrounding water and air, so additional energy flows from the higher-energy external environment into the lower-energy evaporator.

"We are the first researchers in the world to extract energy from the bulk water during solar evaporation and use it for evaporation, and this has helped our process become efficient enough to deliver between 10 and 20 litres of fresh water per square metre per day."

In addition to its efficiency, the practicality of the system is enhanced by the fact it is built entirely from simple, everyday materials that are low cost, sustainable and easily obtainable.

"One of the main aims with our research was to deliver for practical applications, so the materials we used were just sourced from the hardware store or supermarket," Assoc Prof Xu says.

"The only exception is the photothermal materials, but even there we are using a very simple and cost-effective process, and the real advances we have made are with the system design and energy nexus optimisation, not the materials."

In addition to being easy to construct and easy to deploy, the system is also very easy to maintain, as the design of the photothermal structure prevents salt and other contaminants building up on the evaporator surface.

Together, the low cost and easy upkeep mean the system developed by Assoc Prof Xu and his team could be deployed in situations where other desalination and purification systems would be financially and operationally unviable.

"For instance, in remote communities with small populations, the infrastructure cost of systems like reverse osmosis is simply too great to ever justify, but our technique could deliver a very low cost alterative that would be easy to set up and basically free to run," Assoc Prof Xu says.

"Also, because it is so simple and requires virtually no maintenance, there is no technical expertise needed to keep it running and upkeep costs are minimal.

"This technology really has the potential to provide a long-term clean water solution to people and communities who can't afford other options, and these are the places such solutions are most needed."

In addition to drinking water applications, Assoc Prof Xu says his team is currently exploring a range of other uses for the technology, including treating wastewater in industrial operations.

"There are a lot of potential ways to adapt the same technology, so we are really at the beginning of a very exciting journey," he says.

Credit: 
University of South Australia

Oregon scientists create mechanism to precisely control soundwaves in metamaterials

EUGENE, Ore. -- April 16, 2021 -- University of Oregon physicists have developed a new method to manipulate sound -- stop it, reverse it, store it and even use it later -- in synthetic composite structures known as metamaterials.

The discovery was made using theoretical and computational analysis of the mechanical vibrations of thin elastic plates, which serve as the building blocks for the proposed design. The physicists, Pragalv Karki and Jayson Paulose, also developed a simpler minimal model consisting of springs and masses demonstrating the same signal manipulation ability.

"There have been a lot of mechanisms that can guide or block the transmission of sound waves through a metamaterial, but our design is the first to dynamically stop and reverse a sound pulse," said Karki, a postdoctoral researcher in the UO's Department of Physics and Institute for Fundamental Science.

The interplay between bending stiffness and the global tension -- two physical parameters governing sound transmission in thin plates -- is at the heart of their signal-manipulation mechanism. While bending stiffness is a material property, global tension is an externally controllable parameter in their system.
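
For readers who want the underlying physics, the textbook dispersion relation for a thin plate under tension, omega(k) = sqrt((D k^4 + T k^2) / rho_h), makes the tunability explicit: raising or lowering the tension T shifts the whole curve. The parameter values below are arbitrary, and this is only a generic sketch, not the paper's specific design.

    # Hedged sketch: textbook dispersion of a thin elastic plate under tension.
    # Parameter values are arbitrary; this is not the paper's specific design.
    import numpy as np

    rho_h = 1.0                      # areal mass density (kg/m^2), assumed
    D = 1.0e-3                       # bending stiffness (N*m), assumed
    k = np.linspace(0.1, 50.0, 200)  # wavenumbers (1/m)

    def omega(k, tension):
        return np.sqrt((D * k**4 + tension * k**2) / rho_h)

    for T in (0.0, 1.0, 10.0):       # externally tunable global tension (N/m)
        w = omega(k, T)
        v_group = np.gradient(w, k)  # group velocity d(omega)/dk
        print(f"T = {T:5.1f} N/m: omega(k=10) = {omega(10.0, T):7.2f} rad/s, "
              f"max group velocity = {v_group.max():.2f} m/s")

Because the frequency at each wavenumber depends on the externally set tension, changing the tension while a pulse is travelling changes how, and whether, it propagates, which is the kind of real-time control the authors describe.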

Karki and Paulose, an assistant professor of physics and member of the Institute for Fundamental Science, described their new mechanism, which they call dynamic dispersion tuning, in a paper published online March 29 in the journal Physical Review Applied.

"If you throw a stone onto a pond, you see the ripples," Karki said. "But what if you threw the stone and instead of seeing ripples propagating outward you just see the displacement of the water going up and down at the point of impact? That's similar to what happens in our system."

The ability to manipulate sound, light or any other waves in artificially made metamaterials is an active area of research, Karki said.

Optical or photonic metamaterials, which exhibit properties such as a negative refractive index not possible with conventional materials, were initially developed to control light in ways that could be used to create invisibility cloaks and super lenses.

Their use is being explored in diverse applications such as aerospace and defense, consumer electronics, medical devices and energy harvesting.

Acoustic metamaterials are usually static and unchangeable once produced, and dynamically tuning their properties is an ongoing challenge, Karki said. Other research groups have proposed several strategies for tuning acoustic transmission, ranging from origami-inspired designs to magnetic switching.

"In our case, the tunability comes from the ability to change the tension of the drum-like membranes in real time," Karki said.

Additional inspiration, Karki and Paulose noted, came from research in the UO lab of physicist Benjamín Alemán. In Nature Communications in 2019, Alemán's group unveiled a graphene nanomechanical bolometer, a drum-like membrane that can detect colors of light at high speeds and high temperatures. The approach exploits a change in global tension.

While the mechanism in the new paper was identified theoretically and needs to be proven in lab experiments, Karki said, he is confident the approach will work.

"Our mechanism of dynamic dispersion tuning is independent of whether you are using acoustic, light or electronic waves," Karki said. "This opens up the possibility of manipulating signals in photonic and electronic systems as well."

Possibilities, he said, include improved acoustic signal processing and computation. Designing acoustic metamaterials based on graphene, such as those developed in Alemán's lab, could lead to a variety of uses, such as wave-based computing, micromechanical transistors and logic devices, waveguides and ultra-sensitive sensors.

"Our design could be built at the microscale with graphene and at large scales using drum-like membrane sheets," Karki said. "You strike the chain of drums, creating a particular pattern of sound that moves in one direction, but by tuning the tension of the drums, we can stop the sound and store it for future use. It can be reversed or manipulated into any number of other patterns."

Credit: 
University of Oregon