Earth

Living environment affects the microbiota and health of both dogs and their owners

image: Dogs and their owners seemed to share microbes on their skin, but not in their gut.

Image: 
Emma Hakanen

In urban environments, allergic diseases are more common among dogs and their owners compared to those living in rural areas. Simultaneous allergic traits appear to be associated with the microbes found in the environment, but microbes relevant to health differ between dogs and humans.

In a joint research project known as DogEnvi, researchers from the University of Helsinki, the Finnish Environment Institute and the Finnish Institute for Health and Welfare have previously observed that dogs are more likely to have allergies when their owners suffer from allergic symptoms. In a new study, the researchers investigated whether such simultaneous presence of allergic traits is associated with gut or skin microbes shared by dogs and their owners. A total of 168 dog-owner pairs living in rural and urban environments participated in the study.

"Research shows that dogs and owners living in rural areas have a lower risk of developing an allergic disease compared to urban areas. We assumed that in rural areas both dogs and owners are exposed to health-promoting microbes. We found that the microbial exposure of both was different in rural and urban environments. For instance, the skin microbiota varied more between individuals in rural areas compared to their urban counterparts. A diverse and varying microbial exposure may be precisely what provides the associated health benefit," says Senior Researcher Jenni Lehtimäki, PhD, from the Finnish Environment Institute.

Dogs and their owners seemed to share microbes on their skin, but not in their gut. The study demonstrated that the living environment had a markedly more significant effect on the skin microbiota than on that of the gut in both dogs and humans. Dogs living in urban areas had more microbes typically found on human skin, which may be caused by the previously observed accumulation of human-associated microbes indoors and in urban areas.

In a study conducted earlier, the researchers noticed that both the living environment and living habits affected the canine skin microbiota.

"The same was now observed in humans. For both dogs and humans, the risk of developing allergic diseases was at its lowest when the skin microbiota was shaped by a rural environment and a lifestyle that promotes microbial abundance. Such a lifestyle was associated with a number of different animals in the family, as well as larger family size," says Professor Hannes Lohi from the University of Helsinki.

While the living environment appeared to alter the species of the skin microbiota as well as the risk of allergic diseases in both dogs and their owners, no single shared microbe in the environment had a link to allergies in both dogs and humans.

"We detected microbes associated with allergies in urban dogs, as well as microbes connected to health in rural dogs and humans, but these microbes were different in dogs and humans. It appears that the microbes in the living environment are important for the health of both dogs and humans, but due to the physiological differences of the species, the microbes that are relevant can vary," Lehtimäki sums up.

Credit: 
University of Helsinki

New solutions for addressing systemic risks

Systemic risks like climate change, cybersecurity threats and pandemics are characterised by high complexity, uncertainty, ambiguity, and effects beyond the system in which they originate. That's why novel research approaches and regulatory measures are indispensable for the evaluation and management of these risks. An interdisciplinary team recently published a paper on this subject, which appears as the first article in a special issue of the journal "Risk Analysis," edited by Ortwin Renn and Pia-Johanna Schweizer from the Institute for Advanced Sustainability Studies.

The negative effects of systemic risks are often felt far beyond the areas where the most obvious damage occurs - the coronavirus crisis is a case in point. The authors of the article "Systemic Risks from Different Perspectives" analyse the challenges presented by systemic risks from the perspective of different disciplines, ranging from mathematics to complexity science, engineering, biology, ecology, and the social sciences. They highlight the particular insights of each scientific perspective and combine them to produce an interdisciplinary understanding of systemic risks and effective risk governance measures.

The researchers recommend integrating modelling tools and empirical data in order to gain a comprehensive understanding of systemic risks that can feed into policy advice. "Human reactions and other unanticipated stress factors also have to be considered here, as well as the ripple effects by which risk spreads from one system to another," explains co-author Pia-Johanna Schweizer. To be both effective and socially acceptable, governance of systemic risks requires interdisciplinary and cross-sectoral cooperation, a close monitoring system, and the engagement of scientists, regulators and stakeholders. In their study, the authors outline a step-by-step procedure for dealing with these challenges.

They also emphasise the importance of public and stakeholder participation and risk communication. In democratic societies, successful risk governance is more than simply a matter of reducing risks: risk management also has to be democratically legitimised. The ultimate aim is to ensure adaptable and inclusive governance of systemic risks.

Special issue on conceptual questions and case studies

The special issue in which the article appears offers theoretically substantiated and empirically grounded insights into the concept of systemic risks and makes practical suggestions for managing them. It is the culmination of a workshop series that the IASS organised in cooperation with the Berlin-Brandenburg Academy of Sciences and Humanities (BBAW). The issue is divided into two parts: the first is devoted to conceptual issues, the second to case studies. A study on governance requirements in the case of systemic risks rounds off the special issue. The individual articles will be published online in the coming months, ahead of the publication of the entire special issue.

Credit: 
Research Institute for Sustainability (RIFS) – Helmholtz Centre Potsdam

Power boost thanks to gold lamellae

image: Ultra-thin gold lamellae drastically amplify the incoming terahertz pulses (red) in the underlying graphene layer, enabling efficient frequency multiplication.

Image: 
HZDR/Werkstatt X

On the electromagnetic spectrum, terahertz light is located between infrared radiation and microwaves. It holds enormous potential for tomorrow's technologies: Among other things, it might succeed 5G by enabling extremely fast mobile communications connections and wireless networks. The bottleneck in the transition from gigahertz to terahertz frequencies has been caused by insufficiently efficient sources and converters. A German-Spanish research team with the participation of the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) has now developed a material system to generate terahertz pulses much more effectively than before. It is based on graphene, i.e., a super-thin carbon sheet, coated with a metallic lamellar structure. The research group presented its results in the journal ACS Nano (DOI: 10.1021/acsnano.0c08106).

Some time ago, a team of experts working at the HZDR accelerator ELBE were able to show that graphene can act as a frequency multiplier: when the two-dimensional carbon is irradiated with light pulses in the low terahertz frequency range, these are converted to higher frequencies. Until now, the problem has been that extremely strong input signals, which in turn could only be produced by a full-scale particle accelerator, were required to generate such terahertz pulses efficiently. "This is obviously impractical for future technical applications," explains the study's primary author Jan-Christoph Deinert of the Institute of Radiation Physics at HZDR. "So, we looked for a material system that also works with a much less violent input, i.e., with lower field strengths."

For this purpose, HZDR scientists, together with colleagues from the Catalan Institute of Nanoscience and Nanotechnology (ICN2), the Institute of Photonic Sciences (ICFO), the University of Bielefeld, TU Berlin and the Mainz-based Max Planck Institute for Polymer Research, came up with a new idea: the frequency conversion could be enhanced enormously by coating the graphene with tiny gold lamellae, which possess a fascinating property: "They act like antennas that significantly amplify the incoming terahertz radiation in graphene," explains project coordinator Klaas-Jan Tielrooij from ICN2. "As a result, we get very strong fields where the graphene is exposed between the lamellae. This allows us to generate terahertz pulses very efficiently."

Surprisingly effective frequency multiplication

To test the idea, team members from ICN2 in Barcelona produced samples: First, they applied a single graphene layer to a glass carrier. On top, they vapor-deposited an ultra-thin insulating layer of aluminum oxide, followed by a lattice of gold strips. The samples were then taken to the TELBE terahertz facility in Dresden-Rossendorf, where they were hit with light pulses in the low terahertz range (0.3 to 0.7 THz). During this process, the experts used special detectors to analyze how effectively the graphene coated with gold lamellae can multiply the frequency of the incident radiation.

"It worked very well," Sergey Kovalev is happy to report. He is responsible for the TELBE facility at HZDR. "Compared to untreated graphene, much weaker input signals sufficed to produce a frequency-multiplied signal." Expressed in numbers, just one-tenth of the originally required field strength was enough to observe the frequency multiplication. And at technologically relevant low field strengths, the power of the converted terahertz pulses is more than a thousand times stronger thanks to the new material system. The wider the individual lamellae and the smaller the areas of graphene that are left exposed, the more pronounced the phenomenon. Initially, the experts were able to triple the incoming frequencies. Later, they attained even larger effects - fivefold, sevenfold, and even ninefold increases in the input frequency.

Compatible with chip technology

This offers a very interesting prospect, because until now, scientists have needed large, complex devices such as accelerators or large lasers to generate terahertz waves. Thanks to the new material, it might also be possible to achieve the leap from gigahertz to terahertz purely with electrical input signals, i.e., with much less effort. "Our graphene-based metamaterial would be quite compatible with current semiconductor technology," Deinert emphasizes. "In principle, it could be integrated into ordinary chips." He and his team have proven the feasibility of the new process - now implementation in specific assemblies may become possible.

The potential applications could be vast: Since terahertz waves have higher frequencies than the gigahertz mobile communications frequencies used today, they could be used to transmit significantly more wireless data - 5G would become 6G. But the terahertz range is also of interest to other fields - from quality control in industry and security scanners at airports to a wide variety of scientific applications in materials research, for example.

Credit: 
Helmholtz-Zentrum Dresden-Rossendorf

COVID-19 as leading cause of death in US

What The Viewpoint Says: This Viewpoint uses Centers for Disease Control and Prevention data to compare the COVID-19 mortality rate in 2020 with prior leading causes of death (heart disease, cancer, lung disease and injury) to put the infection's toll in lives lost in the United States into context.

Authors: Steven H. Woolf, M.D., M.P.H., of the Virginia Commonwealth University School of Medicine in Richmond, Virginia, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jama.2020.24865)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Scientists unlock promising key to preventing cancer relapse after immunotherapy

New York, NY (December 17, 2020) -- Mount Sinai researchers have solved one of the enduring mysteries of cancer immunotherapy: Why does it completely eliminate tumors in many patients, even when not all the cells in those tumors have the molecular target that the therapy is aimed at?

The answer involves a protein called Fas, and regulating Fas may be a route to preventing cancer relapse, the researchers reported in a study published in Cancer Discovery in December.

Cancer immunotherapies target antigens, or proteins, on the surface of tumor cells. One common example is a protein called CD19. But even when most cells in a tumor express CD19 on their surface, some do not. And tumors are constantly evolving and often experience "antigen escape," meaning that the target is no longer expressed, which can make the immunotherapy fail and the cancer relapse.

The researchers discovered that cancer immunotherapies that make use of immune system cells such as T cells and CAR-T cells kill not only tumor cells that express the drugs' target, but also adjacent tumor cells that lack the target, because of the presence of Fas. This process, known as bystander killing, can be made more effective by adding therapeutics that turn off the regulation of Fas proteins, the researchers said.

"This study should engender many clinical trials solving the common weakness of immunotherapies--antigen escape and relapse," said Joshua Brody, MD, Director of the Lymphoma Immunotherapy Program at The Tisch Cancer Institute at Mount Sinai. "Specifically, by combining immunotherapies with small molecule inhibitors that increase fas-signaling, which are already being used in the clinic, bystander tumor cell killing may be potentiated and eliminate antigen-loss variants from heterogenous tumors."

T cell-based immunotherapies--including CAR-T, bispecific antibodies, and anti-PD1 antibodies--have revolutionized cancer treatment. However, even with the remarkably high response rates of CAR-T-treated patients, most either progress or relapse within one year.

In this study, Mount Sinai researchers looked at tumors from patients in a large clinical trial studying CAR-T's effectiveness in patients with non-Hodgkin's lymphoma and found for the first time that the level of Fas present in the tumors predicted the patients' response to the drug and their long-term survival. Those with significantly elevated Fas in their tumors had longer-lasting positive responses to the therapy.

Based on this, the researchers tested small-molecule therapies that increased the function of Fas in tumor cells, in turn increasing the targeted and bystander tumor cell killing induced by T cells, CAR-T cells, and bispecific antibodies.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

How scientists are using declassified military photographs to analyse historical ecological change

image: Photo of bomb craters converted into fish ponds in Vietnam, 2015

Image: 
Photo by Mihai Daniel Nita

Researchers are using Cold War spy satellite images to explore changes in the environment, including deforestation in Romania, marmot decline in Kazakhstan and ecological damage from bombs in Vietnam.

Ecologists have harnessed new advances in image processing to improve the analysis of declassified US military intelligence photographs and detect previously unseen changes in the environment. Dr Catalina Munteanu, of Humboldt University, and Dr Mihai Daniel Nita, of Transilvania University of Brașov, present new findings from the US Geological Survey's declassified satellite imagery.

The main data source for the analyses is Cold War spy satellite imagery, collected by the US from 1960 onwards, initially to monitor the Sino-Soviet bloc. Eight satellites took pictures on rolls of film, which were then parachuted back through the atmosphere, where perfectly timed US military planes snatched them mid-air before they could be intercepted.

The researchers obtained the photos through the U.S. Geological Survey's Earth Resources Observation and Science (EROS) Centre after the images were declassified in 1995 under an executive order by President Bill Clinton.

This type of film data has been given an upgrade by employing drone image processing software, using a rectification technique known as structure from motion.

Mihai Daniel Nita, a co-author and a pioneer of this method, commented: "The mathematical procedure behind the drone image processing software is structure from motion. This approach allows us to process historical aerial or satellite images faster and more precisely than the traditional approach." This essentially creates a historical, Google Earth-style map from satellite imagery taken as far back as the Cold War.
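
A rough illustration of what such software automates is sketched below: matching a scanned film frame against a modern georeferenced image and warping it into alignment with OpenCV. The file names are placeholders, and a full structure-from-motion pipeline additionally reconstructs the camera geometry via bundle adjustment; this sketch shows only the feature-matching and rectification step.

```python
# Minimal sketch: align a scanned historical frame to a modern basemap.
# File names are placeholders; real pipelines also solve camera geometry.
import cv2
import numpy as np

historical = cv2.imread("corona_frame.png", cv2.IMREAD_GRAYSCALE)
modern = cv2.imread("modern_basemap.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(5000)                        # detect local image features
kp1, des1 = orb.detectAndCompute(historical, None)
kp2, des2 = orb.detectAndCompute(modern, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC discards mismatches; H maps historical pixels onto the modern image
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
aligned = cv2.warpPerspective(historical, H, modern.shape[::-1])
cv2.imwrite("corona_frame_aligned.png", aligned)
```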

Previously, the same research group had used the CORONA dataset to analyse photos of agricultural landscapes in Kazakhstan from the 1960s and 1970s. These photos were used to identify population declines in steppe marmots through a reduction in burrow numbers. On the marmots, Catalina Munteanu says, "Marmot populations declined over the past 50 years in Kazakhstan - a decline that we might have missed if only looking at the short time periods of 10-15 years for which modern data are available."

The authors present several new findings in addition to the published research on marmots in Kazakhstan. One involves revealing the extent of large-scale deforestation in Romania in the aftermath of the Second World War.

"The extent and location?of these historical clear-cuts were?previously unknown - this data revealed where most of these harvests were located. Many of the forest harvested then?were old?forest, of?high?ecological value, and?some areas were?planted with spruce monocultures that are ecologically much less resilient and diverse", says lead researcher Catalina Munteanu.?

Pictures from the 1960s revealed that the watershed had been completely clear-cut by Soviet-Romanian companies as war reparations. A 2015 Google Earth image of the same area shows the secondary forest regrowth after 60 years.

Interestingly, new examination of photos from the Vietnam War has revealed the extensive ecological damage caused by explosions. In a separate piece of work, Mihai Daniel Nita has assessed the expansion of agricultural land into previously ravaged forests, as well as craters from bomb explosions that have been transformed into fish-farming ponds.

Spy satellite imagery can be used to map warfare-induced deforestation and changing agricultural practices in Vietnam.

"With this data we can not only map the extent of this damage with help of these images, but also explore how landscapes have changed later in response to the war. For example, some of the bomb craters are now filled with water and are used as fishponds", commented Mihai.

This work demonstrates that often our choice of baselines is dictated by data availability, and that by using different data sources, we may shift the baselines against which we quantify change. The interpretation of environmental change will depend heavily on the reference points we choose.

Catalina and the co-authors caution that "this is a reminder to be very careful in our interpretations of environmental change. All data sources have their limitations, and a good idea is to consider integrating across multiple data sources whenever possible."

There may be many more applications of the data, such as mapping the development of cities and built infrastructure.

Catalina commented, "Photos of this nature can also be a direct source of information (e.g., a penguin colony detected on an ice shelf) or be an indication of species or their habitat (e.g., previous work on the burrows of marmots in Kazakhstan)."

It is expected that large-scale applications of historical satellite imagery data, as seen here, can set an example for expanding the use of these data into other disciplines relating to human-environment interactions.

Future work could involve investigating ecological shock events such as war, and how the modified landscapes have influenced subsequent land-use change.

Catalina Munteanu's talk will be available on demand from 14-18 December 2020 at the Festival of Ecology. Parts of this work are unpublished and have not yet been through the peer-review process. This online conference will bring together 1,400 ecologists from more than 50 countries to discuss the most recent breakthroughs in ecology.

Credit: 
British Ecological Society

Lithuanian researchers propose combination of methods to improve anticancer drug delivery

image: "We believe that our research results will add to the development of global knowledge in this critical topic", says Professor Dr Renaldas Raišutis, Head of Numerical Simulation Laboratory of Ultrasound Research Institute at Kaunas University of Technology (KTU).

Image: 
KTU

Application of low-intensity pulsed ultrasound in combination with microbubbles might enhance the delivery of chemotherapy medication used for treating cancers. In their study, a team of Lithuanian researchers from three universities - KTU, LSMU and VMU - claim that microbubble survival time is the best indicator for determining the efficiency of sonoporation, i.e. the ultrasound-induced perforation of the cancer cell membrane.

Poor drug delivery into cancer cells is one of the greatest concerns in anticancer therapy. Insufficient drug concentration in the tumour limits the medication's therapeutic efficacy, and therefore researchers all over the world are looking for solutions that would enhance its transport into cancer cells. One method that has recently been getting increased attention is ultrasound-induced sonoporation.

There are two main mechanisms by which ultrasound may affect drug transport into cells. Firstly, under the effect of ultrasound, the surrounding fluid starts moving in an oscillatory fashion, which may enhance the diffusion of molecules through the cell membrane. The second mechanism involves using microbubbles to disrupt the structure of cells. These tiny (1-4 μm in diameter) gas-filled spheres, coated with a shell made of lipids, polymers or proteins, are normally used to enhance ultrasound imaging. It has been observed that ultrasound causes microbubbles to rapidly expand and contract - a phenomenon called cavitation. Cavitation drives a flow of fluid around the microbubble, and the resulting shear forces may create small pores in cell membranes.
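
A back-of-the-envelope calculation shows why bubbles of this size respond so strongly to medical ultrasound. The sketch below evaluates the classical Minnaert resonance formula for a free gas bubble; the numbers are illustrative and ignore the shell, which stiffens real contrast agents and shifts the resonance.

```python
# Minnaert resonance of a free gas bubble: f0 = sqrt(3*gamma*p0/rho) / (2*pi*R)
# Illustrative physics only -- not taken from the study.
import math

gamma = 1.4      # polytropic exponent of the gas core (air)
p0 = 101_325.0   # ambient pressure, Pa
rho = 998.0      # density of water, kg/m^3

for diameter_um in (1, 2, 3, 4):
    radius_m = diameter_um * 1e-6 / 2
    f0 = math.sqrt(3 * gamma * p0 / rho) / (2 * math.pi * radius_m)
    print(f"{diameter_um} um bubble: resonance ~ {f0 / 1e6:.1f} MHz")
```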

"The method where ultrasound together with microbubbles is used to improve the anticancer medication delivery into cancer cells is widely researched. Although mainly research is conducted in a laboratory environment during in vitro studies, first clinical trials have already started. We believe that our research results will add to the development of global knowledge in this critical topic", says Professor Dr Renaldas Raisutis, Head of Numerical Simulation Laboratory of Ultrasound Research Institute at Kaunas University of Technology (KTU).

Researching sonoporation as a way to improve the efficiency of anticancer medication, a multidisciplinary team from three Lithuanian universities - Lithuanian University of Health Sciences (LSMU), Vytautas Magnus University (VMU) and KTU - worked together. After numerous studies, in which the researchers analysed ultrasound penetration signals under different acoustic pressures and durations of exposure, they improved the method of predicting the efficiency of sonoporation. The research conducted by the Lithuanian scientists reveals that microbubble survival time is a universal estimate that defines the optimal duration of exposure for sonoporation.

"Our main finding is that only time-dependent estimate, the rate of microbubble survival time is needed to predict the efficiency of sonoporation. Therefore, our metrics are more advanced than the currently used inertia cavitation control variable", explains Prof Raisutis.

Moreover, while studying the delivery of the anticancer medication doxorubicin into spherical (3D) cancer cell cultures (an experimental model that more closely mimics conditions in vivo), KTU researchers together with the team from LSMU determined the optimal combination of microbubbles and ultrasound for enhancing the transport of the drug into tumour tissue.

During the research, which took place over several years, the scientists from KTU, LSMU and VMU solved numerous problems related to the development of the ultrasound technology needed for delivering anticancer medication, the determination of ultrasound exposure parameters and the analysis of ultrasound penetration signals. According to Prof Raišutis, all the partners actively participated and contributed to the research according to their fields of expertise.

Credit: 
Kaunas University of Technology

A new means of neuronal communication discovered in the human brain

In a new study published in Nature Communications, research groups of Professor J. Matias Palva and Research Director Satu Palva at the Neuroscience Centre of the University of Helsinki and Aalto University, in collaboration with the University of Glasgow and the University of Genoa, have identified a novel coupling mechanism linking neuronal networks by using human intracerebral recordings.

Neuronal oscillations are an essential part of the functioning of the human brain. They regulate the communication between neural networks and the processing of information carried out by the brain by pacing neuronal groups and synchronising brain regions.

High-frequency oscillations with frequencies over 100 Hertz are known to indicate the activity of small neuronal populations. However, up to now, they have been considered to be exclusively a local phenomenon.

The findings of the European research project demonstrate that high-frequency oscillations above 100 Hertz also synchronise across several brain regions. This important finding reveals that strictly timed communication between brain regions can be achieved by high-frequency oscillations.

The researchers observed that high-frequency oscillations were synchronised between neuronal groups located in similar brain structures across subjects, but in frequency bands that were individual to each subject. Carrying out a visual task resulted in the synchronisation of high-frequency oscillations in the specific brain regions responsible for executing the task.

These observations suggest that high-frequency oscillations convey 'information packages' within the brain from one small neuronal group to another.

The discovery of high-frequency oscillations synchronised between brain regions is the first evidence of the transmission and reception of such information packages in a context broader than individual locations in the brain. The finding also helps to understand how the healthy brain processes information and how this processing is altered in brain diseases.
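
For context, synchronisation between two recording sites is commonly quantified with a phase-locking value, as in the generic sketch below (synthetic signals; not necessarily the authors' exact pipeline): band-pass both channels above 100 Hertz, extract instantaneous phases with the Hilbert transform, and average the phase differences on the unit circle.

```python
# Generic phase-locking value (PLV) between two channels in a >100 Hz band.
# Synthetic data; illustrates the measure, not the study's actual pipeline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                                        # sampling rate, Hz
b, a = butter(4, [100, 150], btype="band", fs=fs)

def plv(x, y):
    px = np.angle(hilbert(filtfilt(b, a, x)))    # instantaneous phase of x
    py = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (px - py))))

rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / fs)
common = np.sin(2 * np.pi * 120 * t)             # shared 120 Hz oscillation
x = common + rng.normal(0, 1, t.size)
y = common + rng.normal(0, 1, t.size)
print(f"PLV: {plv(x, y):.2f}")                   # near 1 = strong phase locking
```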

Credit: 
University of Helsinki

The latest magnesium studies pave the way for new biomedical materials

image: Surface mechanical attrition treatment (SMAT) of magnesium improves its strength and corrosion resistance. (Source: IFJ PAN)

Image: 
Source: IFJ PAN

Materials used in biomedicine must be characterized by controlled biodegradability, sufficient strength and a total absence of toxicity to the human body. The search for such materials is, therefore, not a simple task. In this context, scientists have been interested in magnesium for a long time. Recently, using techniques such as positron annihilation spectroscopy, researchers were able to demonstrate that magnesium subjected to surface mechanical attrition treatment acquires the properties necessary for a biocompatible material.

Materials showing a controlled corrosion rate are gaining more and more interest. This applies in particular to biomedicine, where implants made of natural or synthetic polymers are used. Their advantage is that the rate of decomposition can be easily adjusted under physiological conditions. On the other hand, the mechanical properties of these materials deteriorate in the environment of the human body, making them unsuitable for high-stress applications. For this reason, metallic implants based on magnesium, which is entirely harmless to the human body, seem to be a good option.

Magnesium is the lightest metal that can be used in structural applications. Due to its mechanical, thermal and electrical properties as well as its biodegradability and controlled rate of corrosion, it sparks great interest among researchers working on biocompatible implants. Despite these advantages, using magnesium as a biomaterial for the production of implants has not been easy due to its relatively high corrosion rate in the environment of the human body. However, this problem can be overcome by using appropriate coatings.

In many years of research, it was noticed that the fine-grained microstructure of materials not only improves their mechanical properties but can also significantly increase the corrosion resistance. That is why an international research team led by Prof. Ewa Dryzek from the Institute of Nuclear Physics of the Polish Academy of Sciences in Krakow set the goal of quantifying the impact of the surface mechanical attrition treatment (SMAT) of commercial-grade magnesium on its corrosion resistance. In this method, a large number of stainless steel balls a few millimetres in diameter hit the surface of the target material, causing plastic deformation of the subsurface layer. Plastic deformation is accompanied by the production of a large number of crystal lattice defects.

Typical research techniques such as light and electron microscopy, X-ray diffraction (XRD), electron backscatter diffraction (EBSD), and microhardness measurements were used to describe the microstructure.

"Microscopic examination revealed a gradually changing microstructure of the material's surface layer, formed during SMAT processing. We observed considerable grain refinement close to the treated surface. Deformation twins were visible deeper, the density of which decreased with increasing distance from this surface," explains Prof. Dryzek.

As part of this work, positron annihilation spectroscopy (PAS) was used for the first time. The technique is non-destructive and allows lattice defects to be identified at the atomic level. When positrons are implanted into a material sample and meet their antiparticles, i.e. electrons, they annihilate and turn into photons that can be registered. A positron that encounters an open-volume defect in the crystal lattice can be trapped in it, which extends the time until it annihilates. Measuring the lifetimes of positrons therefore gives researchers a picture of the sample's structure at the atomic level.

The purpose of using this method was, in particular, to obtain information on the distribution of crystal lattice defects in the surface layer produced by the SMAT treatment. It was also employed to study a material layer a few micrometres thick, lying just below the treated surface, and to link the resulting information with corrosion properties. This matters because lattice defects determine key properties of materials, a fact exploited, for example, in metallurgy and semiconductor technology.

"The mean lifetime of the positrons in the 200-micrometre layer obtained from the 120-second SMAT treatment shows a high constant value of 244 picoseconds. This means that all positrons emitted from the source reaching this layer annihilate in structure defects, i.e. missing atoms in the sites of the crystal lattice called vacancies, which in this case are associated with dislocations. This layer corresponds to a strongly deformed area with fine grains. Deeper, the mean lifetime of positrons decreases, which indicates a decreasing concentration of defects, reaching at a distance of about 1 millimetre from the surface the value characteristic for well-annealed magnesium with a relatively low density of structural defects, which was our reference material," PhD student Konrad Skowron, the lead author of the article and originator of the studies, describes the details of the work.

The SMAT process significantly influenced the behaviour of magnesium samples during electrochemical corrosion tests. Structural changes caused by SMAT increased the susceptibility of magnesium to anodic oxidation, intensifying the formation of a hydroxide film on the surface and consequently leading to better corrosion resistance. This is confirmed by the results obtained with the use of a positron beam at the Joint Institute for Nuclear Research in Dubna, Russia. The results show that besides the grain and subgrain boundaries present on the surface, also other crystal defects such as dislocations and vacancies can play an essential role in the corrosive behaviour of magnesium.

"We are currently conducting a similar study for titanium. Titanium is a metal widely used in the aerospace, automotive, energy and chemical industries. It is also applied as a material for the production of biomedical devices and implants. An economically acceptable method that allows obtaining pure titanium with a gradient microstructure with nanometric grains in layers adjacent to the surface may open wider prospects for the use of titanium in products important for the global economy and for improving the comfort of human life," says Prof. Dryzek.

Credit: 
The Henryk Niewodniczanski Institute of Nuclear Physics Polish Academy of Sciences

Weddell sea: Whale song reveals behavioral patterns

Whale Song Reveals Behavioural Patterns of Antarctic Minke Whales and Humpback Whales in the Weddell Sea

The AWI's underwater recordings confirm: minke whales prefer the shelter of sea ice, while humpback whales avoid it

Bremerhaven/GERMANY, 17 December 2020. Until recently, what we knew about the lives of baleen whales in the Southern Ocean was chiefly based on research conducted during the Antarctic summer. The reason: in the winter, there were virtually no biologists on site to watch for the animals. Experts from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) have now used permanently installed underwater microphones, which have been recording for the past nine years, to successfully gather and analyse whale observation data from the Weddell Sea. The audio recordings offer unique insights into the lives of humpback whales and Antarctic minke whales. They show, for example, that there are most likely two humpback whale populations in the Weddell Sea, both of which avoid the sea ice and call or sing most frequently in the autumn. In contrast, Antarctic minke whales primarily live in ice-covered regions and produce their characteristic quacking sounds in the winter, as the researchers report in two studies recently published in the online journal Royal Society Open Science. Their goal: for the new findings to help improve protective measures for these baleen whales and their main food source, the Antarctic krill.

Antarctic minke whales (Balaenoptera bonaerensis) are still a mystery to marine biologists, who don't know how many of these whales there are, or where exactly they live, mate and give birth to their calves. A few years ago, however, it was discovered that Antarctic minke whales produce certain characteristic sounds. These calls, which often sound a bit like the quacking of a duck, provide incontrovertible proof of the presence of the small whales, which measure up to eleven metres in length.

AWI biologist Diego Filun and his team are now using these sounds in the first-ever comprehensive, long-term observation of Antarctic minke whales in the Weddell Sea. "We've been monitoring our underwater microphones for nine years. They were deployed at 21 points throughout the Weddell Sea and along the prime meridian, allowing us to record the whales' acoustic activities in regions where research vessels rarely venture. Thanks to the recordings, we now finally understand in what parts of the Weddell Sea the minke whales prefer to be at different times of year, and know that at least some of them stay there for the winter and don't migrate to warmer waters," Filun explains.
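
Nine years of audio cannot be screened by ear, so recordings like these are typically scanned automatically. The toy sketch below (synthetic noise in place of real audio, and a hypothetical call band; this is not the AWI team's detector) flags time windows where energy in a chosen frequency band stands out above the background.

```python
# Toy energy detector for passive acoustic monitoring. Synthetic noise stands
# in for hydrophone audio; band limits are hypothetical, not the AWI values.
import numpy as np
from scipy.signal import spectrogram

fs = 4000                                        # assumed sampling rate, Hz
rng = np.random.default_rng(2)
audio = rng.normal(0, 1, fs * 60)                # one minute of stand-in audio

f, t, sxx = spectrogram(audio, fs=fs, nperseg=1024)
band = (f >= 100) & (f <= 300)                   # hypothetical call band
band_energy = sxx[band].sum(axis=0)

threshold = band_energy.mean() + 3 * band_energy.std()
detections = t[band_energy > threshold]
print(f"{detections.size} candidate call windows flagged for review")
```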

The recordings from 2008 to 2016 show that, in summer and winter alike, Antarctic minke whales tend to stay in those regions of the Weddell Sea that are covered with sea ice. Yet the frequency of their calls appears to change with the season: they can be heard far more often in the autumn and winter months (April to October) than in the summer months (December to March). In addition, the acoustic observations call into question certain previous assumptions: "On aerial survey flights over the Weddell Sea in the summer, minke whales were primarily sighted near the sea-ice edge and less frequently in areas with thick sea ice. But our audio recordings showed just the opposite: the minke whales were rarely found in the marginal ice zone, and much more often under thick ice - most likely in an attempt to avoid their archenemies, killer whales," Filun reports.

Humpbacks avoid the sea ice

In contrast, the humpback whales (Megaptera novaeangliae) of the Weddell Sea don't seek shelter below the ice. On the contrary! As the second hydroacoustic study, led by AWI biologist Elena Schall, determined, the baleen whales avoid ice-covered regions. Instead, they venture to the north of the ice edge on the hunt for Antarctic krill, which can be found in especially large swarms in the Weddell Sea and waters north of it.

"Our audio recordings from 2013 indicate that at least two humpback whale populations come to the Weddell Sea in summer to build up their fat reserves. Whales from South Africa seem to go hunting at the eastern edge, near the prime meridian. But humpbacks from South America tend to stay in the northern coastal waters of the Antarctic Peninsula, and can be heard until later in the year than their counterparts to the east," explains first author Elena Schall.

In both groups, some of the animals don't begin the long trek to the north at summer's end, and instead overwinter in the ice-free regions of the Weddell Sea. At the same time, the recordings indicate that, in the summer, the humpbacks move southward as the ice retreats, but only go as far as is absolutely necessary to find sufficient food.

Essential information for successful marine protection in the Antarctic

"If we want to protect the unique biotic communities of the Weddell Sea in the long term, we need to know as precisely as possible how many baleen whales come to the Atlantic sector of the Southern Ocean in search of food, what regions they hunt in, whether they overwinter there, and how much krill is needed for the whales to find sufficient food. In this regard, long-term acoustic observations are a vital tool, because they offer us a far more detailed picture of life below the water than the handful of scientific whale sightings alone," says Dr Ilse van Opzeeland, an AWI biologist and co-author of both studies.

The team now hopes that the findings from the new studies will be taken into account in future discussions on the establishment of a marine protected area in the Weddell Sea, especially in terms of limiting krill fishing to ensure that there is still enough food for all marine fauna.

In the meantime, the experts will continue to analyse their wealth of underwater recordings. First of all, they'll seek to determine the purpose of Antarctic minke whales' unusual 'quacking'; in addition, there are initial indications that the humpback whales' recorded songs and calls could be used to discover why they leave the Weddell Sea much earlier or later than normal in certain summers, or the conditions under which they sometimes don't return to the Weddell Sea at all when winter ends. Understanding these behavioural patterns would mean a major step forward for the AWI's marine biologists.

Credit: 
Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research

Change in global precipitation patterns as a result of climate change

image: Climate archives like this drill core taken from cave flowstone (speleothem) enable climate scientists to reconstruct past climate changes and better understand our climate system.

Image: 
photo/©: Michael Deininger

The Earth's climate system is largely determined by the differences in temperature between the tropics and the poles. Global warming is likely to cause global atmospheric circulation to change and progressively revert to a situation similar to that of 5,000 to 10,000 years ago. This is the conclusion of a study undertaken by a research team led by Dr. Michael Deininger, the results of which have been published in Nature Communications.

At the Institute of Geosciences at Johannes Gutenberg University Mainz (JGU), Deininger investigated how regional climate systems have changed since the beginning of the current interglacial period some 10,000 years ago and what conclusions can be drawn from this. To do this, the paleoclimatologist looked at data for rainfall time series recorded in various climate archives. "We were able to accurately reconstruct summer precipitation in the monsoon regions in Africa and South America, compare this data with changes in precipitation in the northern mid-latitudes, and relate this to changes in temperature," Deininger explained. The study also involved scientists from Australia, Brazil, Mexico, Ireland, Austria, and South Africa.

Synchroneity in the development of precipitation patterns in the various regions over the past 10,000 years

As the Earth is heated more strongly at the equator than at the poles due to the uneven distribution of solar radiation, a temperature gradient develops which, to put it in simple terms, causes atmospheric circulation to transport energy toward the poles. Changes to this solar radiation-related temperature difference will in turn influence the atmospheric circulation and thus also regional precipitation patterns.

The new study shows that over the past 10,000 years, changes to regional precipitation in the northern latitudes, Africa, and South America have more or less been synchronous. "We argue that these regional climate variations are connected and that they are mainly caused by alterations to solar radiation and the associated temperature differences between the tropics and polar regions," stated Deininger.

Learning from the past to benefit the future

The researchers involved in the study were particularly interested in the question of whether it is possible to learn from the past to benefit the future. With the current level of global warming, the temperature gradient between the equator and the poles is being reduced, especially because warming is particularly marked in the Arctic. This can weaken the westerly winds in the mid-latitudes of the Northern Hemisphere, cause a weaker South American monsoon and a stronger African monsoon, and at the same time lead to lower precipitation levels in the summer rainfall zone of Southeast Africa. The consequences could be shifts in regional rainfall patterns, potentially causing droughts in some areas and flooding in others. "In future, we need to recognize the fundamental role the variation in temperature difference plays in controlling our climate system," concluded Dr. Michael Deininger.

Credit: 
Johannes Gutenberg Universitaet Mainz

Improving multi-sectoral ocean management to achieve the Sustainable Development Goals

image: In Seychelles, artisanal and industrial fisheries coexist.

Image: 
© IRD - Thibaut Vergoz

Researchers have shown that certain multi-sectoral mechanisms, such as marine protected areas, are the most effective in reconciling the ecological, economic and social dimensions of Sustainable Development Goal 14 (SDG 14). These results were published in the journal Nature Sustainability on 14 December 2020, and will make it possible to improve operational guidelines for the preservation of the oceans.

In 2015, the United Nations adopted 17 Sustainable Development Goals (SDGs), calling on States to act on the environmental, social and economic aspects of development. SDG 14, "Life below water", aims for the conservation and sustainable use of the oceans, seas and marine resources. Through 7 targets, this objective addresses multiple challenges: reducing marine pollution, restoring marine ecosystems, reducing ocean acidification, allowing sustainable fisheries, conserving marine and coastal areas, ending harmful fisheries subsidies, and increasing the economic benefits of the sustainable use of marine resources for small island developing states and least developed countries.

To meet these challenges, decision-makers make use of "spatial management tools" that regulate uses in a given area. Some tools regulate the activities of a single sector, such as fishing or maritime traffic: this is the case of Gear Restricted Areas (GRAs), Fishing Closures (FCs), Territorial Use Rights in Fisheries (TURFs) and Particularly Sensitive Sea Areas (PSSAs). Other tools are multi-sectoral, such as marine Fully Protected Areas (FPAs), Partially Protected Areas (PPAs) and Locally Managed Marine Areas (LMMAs).

Assessing the level of confidence in the evidence

In this study, the researchers looked at the proven effectiveness of these management tools in achieving the targets of SDG 14, in its ecological (increasing the size and abundance of marine organisms and species diversity, ecosystem resilience, etc.), and economic and social (equitable access to resources, improved income, maintenance of traditions and customs, etc.) dimensions.

To do this, they analysed the scientific literature, favouring articles that provided an overview of previous studies (177 articles), and conducted surveys among 75 international experts specialised in the oceans. Following an approach similar to that of the expert group of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), the researchers determined the "level of confidence" in the tools' capacity to produce certain results. They then developed a scoring system that linked spatial management tools to the targets of SDG 14, based on the relative contributions of results to the targets.
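
The paper's exact scoring rules are not reproduced in this summary, but the general idea can be sketched as two matrices multiplied together: confidence-weighted evidence linking each tool to outcomes, and the contribution of each outcome to each target. All numbers below are hypothetical placeholders.

```python
# Hypothetical scoring sketch: tools x outcomes (confidence-weighted evidence)
# times outcomes x targets (relative contributions) gives tools x targets.
import numpy as np

tools = ["GRA", "FC", "TURF", "FPA", "PPA", "LMMA"]
targets = ["restoration", "sustainable fisheries", "conservation"]

tool_outcome = np.array([       # columns: [ecological benefit, social benefit]
    [0.6, 0.2],                 # GRA
    [0.7, 0.1],                 # FC
    [0.4, 0.6],                 # TURF
    [0.9, 0.3],                 # FPA
    [0.7, 0.4],                 # PPA
    [0.5, 0.8],                 # LMMA
])
outcome_target = np.array([     # how much each outcome advances each target
    [0.8, 0.9, 0.7],
    [0.2, 0.5, 0.3],
])

scores = tool_outcome @ outcome_target
for tool, row in zip(tools, scores):
    print(tool, dict(zip(targets, np.round(row, 2))))
```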

Five of the seven targets of SDG 14 are achievable

Using this methodology, the authors first found that the spatialised management tools they had evaluated had the potential to contribute to five of the seven targets of SDG 14: restoration of marine ecosystems, sustainable fisheries, conservation of maritime and coastal areas, reduction of harmful subsidies and increasing the income of small island developing states.

"Our results confirm the inability of the tools evaluated to effectively reduce marine pollution and the impacts of ocean acidification", says Rodolphe Devillers, a Geographer at IRD who coordinated the study. "Solutions for these aspects will require a reduction in pollution from the earth and a drastic reduction in greenhouse gas emissions", adds Joachim Claudet, an ecologist at the CNRS and co-author of the study.

The scientists' second finding is that some single-sector tools, such as GRAs and Fishing Closures (FCs), are useful in the sector they regulate, but not very effective for the other targets of SDG 14. On the other hand, multi-sectoral tools - such as Fully (FPA) and Partially Protected Areas (PPA), as well as Locally Managed Marine Areas (LMMA) - are more likely to facilitate the achievement of a wide range of targets because of their proven ecological and socio-economic benefits.

For Rodolphe Devillers, "our results constitute a scientific contribution to the United Nations Decade of Ocean Science for Sustainable Development, which begins in 2021. They highlight the complexity of the problem and the need to change our management approaches to achieve all the targets of SDG 14".

Furthermore, "holistic approaches to planning and management of the land-sea interface, such as integrated coastal zone management, are likely to be important for integrating land-based regulations in spatial management tools in order to achieve SDGs," stresses Joachim Claudet.

Finally, the authors of the study point out that to attain their full potential, these tools must be designed with local needs in mind, be well managed and their regulations well enforced.

Credit: 
Institut de recherche pour le développement

Can water saving traits help wine survive climate change?

Climate change is expected to make many grape-growing regions too hot and dry to produce high-quality wine from traditional varieties. But scientists at the University of California, Davis, have found that wine grape varieties from regions that are more prone to stress have traits that could help them cope with climate change.

The study, published in the Journal of Experimental Botany, finds that varieties that produce their best wines in warmer, drier regions have traits that conserve water, helping the vines extend their water resources to last over the growing season.

"The relationships between grape varieties and regions have historically been based on wine, without considering traits that affect drought or heat tolerance," said lead author Megan Bartlett, an assistant professor in the Department of Viticulture and Enology. "These findings show these varieties could be more resilient to climate change than expected."

THE TRADE-OFF

The study examined how grapevines regulate their stomata -- tiny pores found on the surface of leaves that allow plants to take in carbon dioxide for photosynthesis and expel oxygen. The regulation of these stomata affects how much CO2 is available for photosynthesis and how much water evaporates from the leaves. Grapevines must choose between opening their stomata to take in CO2 to produce sugars for growth and ripening or closing the stomata to reduce evaporation and water stress.
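
This trade-off can be made concrete with the standard leaf gas-exchange relations, sketched below with assumed values (a textbook illustration, not the study's measurements): both carbon gain and water loss rise with stomatal conductance, so a variety that keeps its stomata more closed saves water only by giving up some photosynthesis.

```python
# Textbook gas-exchange sketch (assumed values, not the study's data):
# CO2 uptake A and transpiration E both scale with stomatal conductance g_s.
ca, ci = 410e-6, 280e-6   # assumed ambient and leaf-internal CO2, mol/mol
vpd = 0.02                # assumed leaf-to-air water vapour gradient, mol/mol

for gs in (0.05, 0.15, 0.30):          # stomatal conductance, mol m^-2 s^-1
    A = gs / 1.6 * (ca - ci)           # 1.6 = diffusivity ratio of H2O to CO2
    E = gs * vpd
    print(f"g_s = {gs:.2f}: A = {A * 1e6:4.1f} umol CO2/m2/s, "
          f"E = {E * 1e3:4.1f} mmol H2O/m2/s")
```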

A little water stress improves wine by concentrating the flavors and aromas in the grapes. But too much will prevent grapes from achieving their ideal balance of sugars, acids and tannins, creating flat, uninteresting wines.

The researchers examined traits for 34 varieties and used a global database of planting areas in different wine regions to define the associations between varieties and regions. The study focused on European regions, where irrigation is banned or restricted, to directly capture the stress imposed by the local climate.

The study found that the varieties grown in regions more likely to experience water stress, such as Italy's Sangiovese and Montepulciano, kept their stomata more closed than varieties like Sauvignon Blanc from cooler, more humid regions.

"This strategy would help these varieties save water," said study co-author Gabriela Sinclair, a research technician in the Bartlett lab.

Bartlett cautions that these traits may have unintended consequences as heatwaves become more extreme. Grapevines use evaporation to cool the leaves, the same way we cool ourselves by sweating. Restricting evaporation too tightly could allow leaves to reach damaging temperatures, reducing their future photosynthesis and limiting the sugars available for ripening.

"We have more work to do to understand how these traits will affect grapevines as the climate reaches new extremes," said Bartlett. "These findings show that traits will be important to consider when we predict what will happen to different wine regions."

Credit: 
University of California - Davis

Fish oil supplements don't raise bad cholesterol

image: Fatty Acid Research Institute studies the connection between omega-3s EPA and DHA and health

Image: 
Fatty Acid Research Institute

The Fatty Acid Research Institute (FARI) has published a new research paper, in conjunction with The Cooper Institute, on the relationship between the omega-3s EPA and DHA in fish oil and low-density lipoprotein cholesterol (LDL-C).

Omega-3 fatty acids have a long history of being "heart healthy," and are well-known for lowering blood levels of triglycerides (but typically not cholesterol). Recent questions have been raised, however, about one of the two "fish oil" omega-3 fatty acids -- DHA (docosahexaenoic acid) -- and the possibility that it might actually raise levels of LDL-C, the "bad" cholesterol.

There is good evidence that people with very high serum triglyceride levels (>500 mg/dL) who are treated with high doses of omega-3, i.e., 4 g/day of EPA (eicosapentaenoic acid) and DHA, commonly see a rise in LDL-C. Whether this also occurs in the "real world" with generally healthy people taking fish oil supplements for cardioprotection is not clear.

A recent study from the Cooper Center Longitudinal Study (CCLS) and FARI sheds new light on this question.

The investigators utilized data from 9,253 healthy men and women who had at least two preventive medical examinations at Cooper Clinic in Dallas over a 10-year period. These examinations routinely included both blood cholesterol testing and measurement of the Omega-3 Index (i.e., red blood cell (RBC) EPA+DHA levels, from OmegaQuant Analytics). Information about current use of fish oil supplements was also collected.

With this information, the researchers then asked 2 questions: 1) did people who started taking fish oil supplements between visits experience a rise in LDL-C levels, and 2) did LDL-C levels rise in people whose RBC DHA levels increased between visits?

It turns out that the answer to both of these questions was "no." In fact, a 1-unit rise in RBC DHA levels was associated with a small (1-2 mg/dL) but statistically significant decrease in LDL-C. This analysis took into account concurrent changes in background use of cholesterol-lowering drugs like statins. The small decrease in LDL-C is not clinically relevant, but the study shows that fish oil supplement use in the general population does not adversely affect LDL-C.
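
For readers curious about the shape of such an analysis, the sketch below runs the kind of adjusted regression the second question calls for, on synthetic data wired to reproduce the reported 1-2 mg/dL effect. It is not the study's model or data.

```python
# Synthetic re-creation of the adjusted analysis described above: regress the
# between-visit change in LDL-C on the change in RBC DHA, adjusting for
# starting a statin between visits. The effect size is planted, not estimated
# from real data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 9253
delta_dha = rng.normal(0, 1.0, n)                # change in RBC DHA, units
started_statin = rng.binomial(1, 0.1, n)
delta_ldl = -1.5 * delta_dha - 30 * started_statin + rng.normal(0, 20, n)

X = sm.add_constant(np.column_stack([delta_dha, started_statin]))
fit = sm.OLS(delta_ldl, X).fit()
print(dict(zip(["const", "delta_DHA", "statin"], fit.params.round(2))))
```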

Dr. William Harris, President of FARI and co-inventor of the Omega-3 Index, was the lead author on this study. In his view, "these new findings from the CCLS clearly show that people who take fish oil supplements need not worry about adversely affecting their cholesterol levels as some have proposed."

He also noted that these results also harmonize well with the conclusions of a recent American Heart Association Advisory on the use of omega-3 fatty acids in the treatment of high triglyceride levels. This major review found there is "no strong evidence that DHA-containing prescription omega-3 fatty acid agents used alone or in combination with statins raise LDL-C in patients with high triglyceride levels.1"

Commenting on this paper, Dr. Carl Lavie, a cardiologist and Medical Director of the Cardiac Rehabilitation and Prevention Program at the John Ochsner Heart and Vascular Institute in New Orleans, LA, said, "This large study from the Cooper Clinic indicates that RBC DHA levels are not associated with higher LDL-cholesterol levels (actually with lower), and adding omega-3 supplements was also not associated with increases in LDL-C."

Dr. Lavie and colleagues recently published data from 40 studies in over 135,000 participants in the Mayo Clinic Proceedings indicating that the combined EPA and DHA dose predicted reductions in major cardiovascular outcomes2. "These new data from the Cooper Institute add to the cumulative evidence of the safety and efficacy of omega-3 from dietary sources and supplements, including the combination of EPA and DHA," he said.

Credit: 
Wright On Marketing & Communications

Green revolution saved over 100 million infant lives in developing world

image: Modern crop varieties have substantially reduced infant mortality, especially for male babies and among poor households.

Image: 
baona

New research from the University of California San Diego shows that since modern crop varieties were introduced in the developing world starting in 1961, they have substantially reduced infant mortality, especially for male babies and among poor households.

The study assessed mortality rates of more than 600,000 children across 37 developing countries, revealing that the global diffusion of agricultural technology reduced infant mortality by 2.4 to 5.3 percentage points. This translates to around 3 to 6 million infant deaths averted per year by the year 2000.
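
A quick back-of-the-envelope check connects the two figures, assuming roughly 120 million births per year in the developing world around 2000 (our assumption for illustration, not a number from the study):

```python
# Sanity check: percentage-point mortality reductions times annual births.
# The 120 million births/year figure is an assumption for illustration.
births_per_year = 120e6

for pp in (2.4, 5.3):     # infant mortality reduction, percentage points
    averted = births_per_year * pp / 100
    print(f"{pp} pp of {births_per_year / 1e6:.0f}M births "
          f"-> {averted / 1e6:.1f}M infant deaths averted per year")
```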

The global scale of the study--the most sweeping to measure the green revolution's impact on child health--is critical because, while the green revolution represents one of the most important technological transformations in modern history, it did not reach all parts of the world equally.

"If the green revolution had spread to sub-Saharan Africa like it did to South Asia, our estimates imply that infant mortality rates would improve by 31 percent," said Gordon McCord, study co-author and associate teaching professor of economics at UC San Diego's School of Global Policy and Strategy.

In the course of the past 60 years, the green revolution catalyzed the spread of modern crop varieties for staple crops such as wheat, maize and rice throughout the developing world. It also exemplifies successful U.S. international cooperation--the Rockefeller and Ford foundations were the initial funders of the green revolution in the 1950s and 1960s, followed by the governments of wealthy countries, including the United States.

Developed by dozens of national agriculture programs with the support of international agricultural research centers, the crop varieties offer high yield potential along with resistance to stress, pests and disease, and improved quality of the harvested material. The increase in agricultural production worldwide has been credited with saving over a billion people from starvation.

In the paper, published in the Journal of Health Economics, McCord and co-authors combined geospatial crop data with child-level data of over 600,000 children across 21,604 locations in 37 developing countries between 1961 and 2000. Their findings imply that a substantial part of the infant mortality reduction observed in the developing world during the second half of the 20th century is due to diffusion of agricultural technologies and inputs. By the year 2000, more than three million infant lives were saved per year as a result.

The child-level data were provided by geo-located public health surveys of women aged 15-49 regarding their fertility history, generating records for around three million children. McCord and co-authors culled that information down to focus on rural areas and on mothers who had never migrated. This data set was spatially merged with crop distribution data, allowing for an analysis at high spatial resolution.

Modern crop varieties proved to have a positive effect on all infants; however, the impact was greater among male than female babies. The researchers found an impact on female infants only in countries with more gender parity, suggesting the larger impact on male babies is partly due to discrimination by sex in the allocation of resources to children. Additionally, infant mortality rates declined more sharply among poorer households.

"The health benefits of broad-based increases in agricultural productivity should not be overlooked," McCord said. "From the policy perspective, government support for inputs leading to a green revolution as well as investments in extension and R&D programs are important."

At the global level, the researchers' estimates suggest that an increase in modern crop adoption from 0 to 50 percent leads to a decline in infant mortality by 33-38 deaths per 1,000 children.

The authors conclude their work speaks to the importance of improving productivity in agriculture as a means of improving lives in developing countries, including the lives of the poor in rural areas.

"It is reasonable to view with some alarm the steady decline in funding for cereal crop improvement over the last few decades in sub-Saharan Africa, the continent with the least modern crop varieties," they write. "As such, our research can inform the recent debate about whether investing in increased smallholder agricultural productivity is an effective strategy for economic development, health improvement and poverty alleviation in sub-Saharan Africa."

Credit: 
University of California - San Diego