Tech

Pressure-regulated excitonic feature enhances photocurrent of all-inorganic 2D perovskite

Image: Pressure-reduced exciton binding energy and enhanced photoconductivity in 2D Cs2PbI2Cl2. (Credit: Xujie Lü)

HPSTAR scientists Dr. Songhao Guo and Dr. Xujie Lü report a three-order-of-magnitude increase in the photoconductivity of Cs2PbI2Cl2 at the industrially achievable pressure of 2 GPa, achieved through pressure regulation. Impressively, pressure-regulating the 2D perovskite's excitonic features gives it the characteristics of 3D compounds without diminishing its own advantages, making it a more promising material for photovoltaic and photodetector applications. Their study is published as a cover article in the latest issue of the Journal of the American Chemical Society.

Two-dimensional (2D) halide perovskites have recently emerged for photovoltaic and optoelectronic applications due to their unique and tunable properties as well as high stability. Despite substantial progress in developing these materials, how structural regulation affects their excitonic features, which govern their optoelectronic properties, has remained unknown until now.

Comprehensive in situ experimental characterization and first-principles calculations reveal that lattice compression effectively regulates the excitonic features of Cs2PbI2Cl2, reducing the exciton binding energy from 133 meV at ambient conditions to 78 meV at 2 GPa. Notably, this reduced exciton binding energy of the 2D perovskite is comparable to the values typical of 3D halide perovskites, facilitating the dissociation of photo-excited excitons into free carriers and thus enhancing the photoconductivity. Further pressurization leads to a layer-sliding-induced phase transition and an anomalous negative linear compression, which has never been observed in other halide perovskites.
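To put those binding energies in context, a back-of-the-envelope Boltzmann estimate (an illustrative sketch, not a calculation from the study) compares the thermal-dissociation weight of an exciton at room temperature before and after compression:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K
T = 300.0       # room temperature, K

def dissociation_factor(eb_mev: float) -> float:
    """Boltzmann factor exp(-E_b / kT), a rough weight for thermal
    dissociation of an exciton with binding energy eb_mev (in meV)."""
    return math.exp(-(eb_mev / 1000.0) / (K_B * T))

ambient = dissociation_factor(133)    # ambient-pressure Cs2PbI2Cl2
compressed = dissociation_factor(78)  # at 2 GPa

# A lower binding energy gives an exponentially larger free-carrier weight.
print(f"relative enhancement ~ {compressed / ambient:.1f}x")
```

This simple factor ignores entropy and density-of-states effects; it only illustrates why a drop from 133 meV to 78 meV matters at room temperature, where kT is about 26 meV.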

This work reveals pressure-enhanced photocurrents in a 2D halide perovskite for the first time and provides deeper insights into the relationship between excitonic features and optoelectronic properties, furthering our understanding of the fundamental mechanisms at play.

Credit: 
Center for High Pressure Science & Technology Advanced Research

How are universities planning to tackle emissions associated with food and flying?

New research from The University of Manchester has identified various ways in which UK higher education institutions are beginning to tackle emissions associated with business travel and catering. These are two substantial contributors to emissions in this sector, and both are difficult to decarbonise. The findings suggest a need for further sector-wide efforts to tackle the planet's most pressing issue.

This new study, from The University of Manchester's Tyndall Centre for Climate Change and the Centre for Climate Change and Social Transformations (CAST), analysed publicly available policies of 66 UK universities to identify strategies related to long-distance business travel and catering. For each university, documents including Carbon Management Plans and Annual Reports, Travel Plans and Sustainable Food Policies were downloaded, catalogued and reviewed.

Long-distance business travel and catering (particularly meat-based meals) are substantial contributors to the carbon footprint of universities (and many other organisations), but are typically under-accounted for in carbon management planning. The collaborative research team set out to understand the extent to which university plans and actions in these areas are commensurate with climate emergency declarations, and to make recommendations to support setting sufficiently ambitious targets and actions.

The research, published today in Climate Policy, demonstrates that action on climate change in universities is extending beyond the familiar focus on energy related emissions to engage in more complex workplace practices, including long-distance business travel and catering. However, increasing sector-wide effort is unavoidable if universities are to fulfil their climate emergency declarations and align emissions reduction strategies with the UK Government's net zero ambitions.

Lead author Professor Claire Hoolohan, of The University of Manchester, said: "Many universities omit, or only partially account for, business travel and food within their carbon management reporting. However, the importance of emissions in these areas is widely recognised and there is evidence of pioneer institutions setting targets and taking action to reduce emissions in these areas.

"Across the sector more action is required to reduce emissions. To support sector-wide action, this briefing note focusses on targets and actions that should be implemented to rapidly and substantially reduce emissions in these two areas, and contribute towards a low-carbon workplace culture."

The UK's Committee on Climate Change recognises aviation and agriculture as sectors where it is very challenging to reduce emissions. Mobility scholars have shown that aeromobility is deeply embedded in the institutional culture of Higher Education, with individual career progression and institutional standing linked to international mobility.

Similarly, for meat-eating, coordinated developments across production-consumption systems sustain meat-heavy diets, and this is no less true in workplace cafeterias and catering. Subsequently, reducing emissions requires the reconfiguration of professional practices and institutional policies to enable low-carbon transformation.

The research finds that many universities are planning to reduce emissions in these areas, but few have robust targets to support decarbonisation. Further, it is actions, not plans or targets, that reduce emissions, and few universities have actions in place across both areas.

That said, there were examples of good practice in both areas, and future action could focus on the following:

Positive actions on flying and food for universities:

Review and define 'essential travel' to support staff in avoiding travel as much as possible.

Maximise the number of engagements per trip, reduce the distance and frequency.

Make train travel the default for journeys within a specified distance, with additional time and funding for long-distance rail travel.

Focus on reducing trips of frequent fliers and recognise the differentiated travel needs of staff with children, care commitments and medical needs.

Review university policies for contradictions that encourage flying.

Reduce meat and replace it with plant-based alternatives.

Make plant-based event catering the default to spark conversation and enable staff to try new meals.

Experiment at sub-organisation level, then share learning and scale up.

Professor Alice Larkin, Head of Engineering at The University of Manchester, said: "Higher education's response to the COVID-19 pandemic has demonstrated that rapid, deep and widespread changes are possible. The shifts in our academic activities that we've all experienced, as well as changes to how we've started to operate in new ways, present significant opportunities to establish alternative, more sustainable, practices."

Credit: 
University of Manchester

Immunotherapy drug delays onset of Type 1 diabetes in at-risk group

More than five years after receiving an experimental immunotherapy drug, half of a group of people at high risk of developing Type 1 diabetes remained disease-free compared with 22% of those who received a placebo, according to a new trial overseen by Yale School of Medicine researchers.

And those who developed diabetes did so on average about five years after receiving the new drug, called teplizumab, compared with 27 months for those who received the placebo.

The study, which was done in collaboration with researchers from Indiana University, was published March 3 in the journal Science Translational Medicine.

"If approved for use, this will be the first drug to delay or prevent Type 1 diabetes," said Kevan Herold, the C.N.H. Long Professor of Immunobiology and of Medicine (Endocrinology) at Yale and co-senior author of the paper.

The drug, developed by biotechnology company Provention, has been awarded breakthrough status by the U.S. Food and Drug Administration and could be approved for general use by summer, Herold said.

In the trial, an analysis of the 76 subjects showed reduced levels of damage caused by T cells in response to the drug and improved functioning of insulin-producing beta cells in those who received teplizumab.

The subjects in the trial had a median age of 13 years and relatives with Type 1 diabetes.

The new study is the result of 30 years of work by Herold's lab to find new treatments for Type 1 diabetes. The findings are a follow-up to another clinical study organized by TrialNet, an international coalition dedicated to the study of the disease. That study, which was published in 2019, showed a delay in the onset of Type 1 diabetes among those who received teplizumab.

Type 1 diabetes is an autoimmune disease in which a person's own T cells attack insulin-producing beta cells in the pancreas. Those diagnosed require lifelong insulin treatment and face higher risk of death and diseases affecting the heart, kidneys, and vision. Diagnosis of the disease often occurs during childhood or adolescence.

Herold stressed it is not known whether some of the subjects who received teplizumab will never develop Type 1 diabetes. But delaying onset of disease could have a big impact on the development of those at high risk.

"Any time without diabetes is important, but particularly so for those children who might have a chance to grow up without it," he said.

Credit: 
Yale University

Tackling tumors with two types of virus

An international research group led by the University of Basel has developed a promising strategy for therapeutic cancer vaccines. Using two different viruses as vehicles, they administered specific tumor components in experiments on mice with cancer in order to stimulate their immune system to attack the tumor. The approach is now being tested in clinical studies.

Making use of the immune system as an ally in the fight against cancer forms the basis of a wide range of modern cancer therapies. One of these is therapeutic cancer vaccination: following diagnosis, specialists set about determining which components of the tumor could function as an identifying feature for the immune system. The patient is then administered exactly these components by means of vaccination, with a view to triggering the strongest possible immune response against the tumor.

Viruses that have been rendered harmless are used as vehicles for delivering the characteristic tumor molecules into the body. In the past, however, many attempts at creating this kind of cancer therapy failed due to an insufficient immune response. One of the hurdles is that the tumor is made up of the body's own cells, and the immune system takes safety precautions in order to avoid attacking such cells. In addition, the immune cells often end up attacking the "foreign" virus vehicle more aggressively than the body's own cargo. With almost all cancer therapies of this kind developed so far, therefore, the desired effect on the tumor has failed to materialize. Finding the appropriate vehicle is just as relevant in terms of effectiveness as the choice of tumor component as the point of attack.

Arenaviruses as vehicles

The research group led by Professor Daniel Pinschewer of the University of Basel had already discovered in previous studies that viruses from the arenavirus family are highly suitable as vehicles for triggering a strong immune response. The group now reports in the journal Cell Reports Medicine that the combination of two different arenaviruses produced promising results in animal experiments.

The researchers focused on two distantly related viruses called Pichinde virus and Lymphocytic choriomeningitis virus, which they adapted via molecular biological methods for use as vaccine vectors. When they took the approach of administering the selected tumor component first with the one virus and then, at a later point, with the other, the immune system shifted its attack away from the vehicle and more towards the cargo. "By using two different viruses, one after the other, we focus the triggered immune response on the actual target, the tumor molecule," explains Pinschewer.

Tumor eliminated or slowed down

In experiments with mice, the researchers were able to measure a potent activation of killer T cells that eliminated the cancer cells. In 20% to 40% of the animals, depending on the type of cancer, the tumor disappeared, while in other cases the rate of tumor growth was at least temporarily slowed.

"We can't say anything about the efficacy of our approach in humans as yet," Pinschewer points out. However, ongoing studies with a cancer therapy based on a single arenavirus have already shown promising results. The effects on tumors in animal experiments cannot be assumed to translate directly into the effect on corresponding cancer types in humans. "However, since the therapy with two different viruses works better in mice than the therapy with only one virus, our research results make me optimistic," Pinschewer adds.

The biotech company Hookipa Pharma, of which Pinschewer is one of the founders, is now investigating the efficacy of this novel approach to cancer therapy in humans. "We are currently exploring what our approach by itself can actually achieve," the researcher says. "If it proves successful, a wide range of combinations with existing therapies could be envisaged, in which the respective mechanisms would join forces to eliminate tumors even better."

Credit: 
University of Basel

More extreme short-duration thunderstorms likely in the future due to global warming

Climate experts have revealed that rising temperatures will intensify future rainfall extremes at a much greater rate than average rainfall, with the largest increases in short-duration thunderstorms.

New research by Newcastle University has shown that warming temperatures in some regions of the UK are the main drivers of increases in extreme short-duration rainfall intensities, which tend to occur in summer and cause dangerous flash flooding.

These intensities are increasing at significantly higher rates than for winter storms. A study, led by Professor Hayley Fowler, of Newcastle University's School of Engineering, highlights the urgent need for climate change adaptation measures as heavier short-term rainfall increases the risk of flash flooding and extreme rainfall globally.

Publishing their findings in the journal Nature Reviews Earth & Environment, the international team analysed data from observational, theoretical and modelling studies to examine the intensification of rainfall extremes, what drives these extremes and the impact on flash flooding.

The scientists found that rainfall extremes intensify with warming, generally at a rate consistent with increasing atmospheric moisture, which is what would be expected. However, the study has shown that temperature increases in some regions affect short-duration heavy rainfall extremes more than the increase in atmospheric moisture alone, with local feedbacks in convective cloud systems providing part of the answer to this puzzle.
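The "rate consistent with increasing atmospheric moisture" refers to the Clausius-Clapeyron relation, under which the atmosphere holds roughly 7% more water vapour per degree Celsius of warming; short-duration convective extremes can exceed this ("super-CC" scaling). A small illustrative calculation, where the ~7% rate and the doubled super-CC rate are commonly quoted values rather than figures taken from this study:

```python
def scaled_intensity(base_mm_h: float, warming_c: float,
                     rate_per_c: float = 0.07) -> float:
    """Compound intensity scaling: rainfall intensity grows by
    rate_per_c per degree C of warming (Clausius-Clapeyron ~7%/C)."""
    return base_mm_h * (1.0 + rate_per_c) ** warming_c

base = 50.0  # hypothetical peak intensity of a storm, mm/h

cc_scaled = scaled_intensity(base, warming_c=3)                  # ~7%/C
super_cc = scaled_intensity(base, warming_c=3, rate_per_c=0.14)  # ~2x CC

print(f"CC: {cc_scaled:.1f} mm/h, super-CC: {super_cc:.1f} mm/h")
```

Under 3 degrees of warming, the super-CC case turns the same storm into a markedly heavier event than moisture scaling alone would suggest.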

Professor Fowler said: "We know that climate change is bringing us hotter, drier summers and warmer, wetter winters. But, in the past, we have struggled to capture the detail in extreme rainfall events as these can be highly localised and occur in a matter of hours or even minutes.

"Thanks to our new research, we now know more about how really heavy rainfall might be affected by climate change. Because warmer air holds more moisture, rainfall intensity increases as temperatures rise.

"This new work shows that the increase in intensity is even greater for short and heavy events, meaning localised flash flooding is likely to be a more prominent feature of our future climate."

The findings are also highlighted in a Philosophical Transactions A issue on "Intensification of short-duration rainfall extremes and implications for flash flood risks" by the Royal Society, which was published on 1 March.

It is unclear whether storm size will increase or decrease with warming. However, the researchers warn that increases in rainfall intensity and the footprint of a storm can compound to substantially increase the total rainfall during an event.

In recent years, short but significantly heavy rainfall events have caused much disruption across the UK. Recent examples include severe flooding and landslides in August 2020 and damage to the Toddbrook Reservoir, in the Peak District, in August 2019.

Information about current and future rainfall intensity is critical for the management of surface water flooding, as well as our guidance for surface water management on new developments and sewer design.

Credit: 
Newcastle University

Chemists develop a new technology to prevent lithium-ion batteries from catching fire

Video: During five hours of overcharging, the unprotected battery on the left sporadically swells, while the chemically modified battery on the right suppresses swelling as the 'chemical fuse' blocks the side processes. (Credit: SPbU)

Lithium-ion battery fire hazards are extensive worldwide, and such failures can have severe implications for both smartphones and electric cars, says the head of the group, Oleg Levin, Professor in the Department of Electrochemistry at St Petersburg University. 'From 2012 to 2018, 25,000 cases of devices catching fire were reported in the USA alone. Earlier, from 1999 to 2012, only 1,013 cases were reported. The number of fire incidents is increasing along with the number of batteries in use,' he said.

The main reasons lithium-ion batteries catch fire or explode include overcharging and short circuits. As a result, the battery overheats and the cell goes into thermal runaway. A temperature rise to 70-90°C can trigger hazardous chemical reactions that raise the temperature further and, consequently, cause fire or explosion. To keep batteries from catching fire, an adjacent device such as an electronic microcircuit can be used: it tracks the battery's parameters and can switch the battery off in an emergency. Yet most fire incidents have been due to failures of these microcircuits caused by manufacturing defects.

'This is why it was particularly important to develop a battery safety strategy based on chemical reactions that block the flow of electric current inside the battery pack. To this end, we propose to use a special polymer whose electrical conductivity adjusts to voltage fluctuations in the battery. If the battery works normally, the polymer does not prevent the electric current from flowing. If the battery is overcharged, there is a short circuit, or the battery voltage drops below normal operating levels, the polymer switches into a so-called isolator, or circuit-breaker, mode,' said Professor Levin.

There are polymers that can change resistance when heated, says Professor Levin. The problem with this technology, which has been tried by companies in St Petersburg among others, is that by the time the polymer starts working as an isolator, the battery is already overheating, and the hazardous processes under way cannot be stopped merely by breaking the electric circuit. This makes the technology far from effective. Yet such advances spurred the search for new approaches, including a polymer able to respond to voltage before the battery starts to overheat.

'I collaborated with Evgenii Beletskii, my postgraduate student at the Department of Electrochemistry, who had worked in industry. He has extensive experience in developing battery safety systems, which helped us a lot in carrying out the experimental part of the project, focused on how the polymer worked. Anna Fedorova, a postgraduate student at the Department of Electrochemistry, also worked in industry. In the project, she was mainly concerned with calculating the physical and chemical properties of the material,' said Oleg Levin.

The project lasted two years. During the six years before it started, the scientists had carried out fundamental research on the physical and chemical properties of a wide range of polymers. They discovered a class of polymers that change resistance with voltage, and it was this class the scientists focused on.

'The most difficult part in developing the "chemical fuse" was to find an active polymer. We knew a great variety of polymers of this class. Yet choosing the one that would be suitable to create a prototype was a hard nut to crack,' said Oleg Levin. 'Moreover, we had to advance the technology by developing an industrial version to show that we had come up with an idea of effective battery safety strategy. Thus, we had to purchase a lot of new equipment for prototyping and adjusting techniques to work with lithium-ion batteries.'

What makes this safety technology different is its high scalability. The size of a traditional protective circuit, for example, depends on how powerful the battery is, so the protection scheme for the traction batteries of electric cars will be both big and costly. Scaling the 'chemical fuse' is simple, as it is applied over the entire surface of the inner current collector.

'Lithium-ion batteries use different types of cathodes, i.e. the positively charged electrodes by which electrons enter an electrical device, and these have different working voltages, so a safety polymer must react accordingly. We have managed to find a polymer suitable for only one type of battery, the lithium iron phosphate battery. Changing the structure of the polymer might change its conductivity to make it suitable for the other types of cathodes on the market today. We also have some thoughts on how to make this safety strategy more universal, by adding a component that lets the polymer adjust to changes in the battery's temperature. This is expected to eliminate all fire risks associated with the batteries,' said Oleg Levin.

Before publishing the article, St Petersburg University received a patent for this technology. The scientists are currently preparing a real-size model of protected batteries to demonstrate them to potential investors.

Credit: 
St. Petersburg State University

Will this solve the mystery of the expansion of the universe?

The universe began with a giant bang, the Big Bang, 13.8 billion years ago, and then it started to expand. The expansion is ongoing: the universe is still being stretched out in all directions, like a balloon being inflated.

Physicists agree on this much, but something is wrong. Measuring the expansion rate of the universe in different ways leads to different results.

So, is something wrong with the methods of measurement? Or is something going on in the universe that physicists have not yet discovered and therefore have not taken into account?

It could very well be the latter, according to several physicists, among them Martin S. Sloth, Professor of Cosmology at the University of Southern Denmark (SDU).

In a new scientific article, he and his SDU colleague, postdoc Florian Niedermann, propose the existence of a new type of dark energy in the universe. If it is included in the various calculations of the expansion of the universe, the results agree much more closely.

- A new type of dark energy can solve the problem of the conflicting calculations, says Martin S. Sloth.

Conflicting measurements

When physicists calculate the expansion rate of the universe, they base the calculation on the assumption that the universe is made up of dark energy, dark matter and ordinary matter. Until recently, all types of observations fitted in with such a model of the universe's composition of matter and energy, but this is no longer the case.

Conflicting results arise when looking at the latest data from measurements of supernovae and the cosmic microwave background radiation; the two methods quite simply lead to different results for the expansion rate.

- In our model, we find that if there was a new type of extra dark energy in the early universe, it would explain both the background radiation and the supernova measurements simultaneously and without contradiction, says Martin S. Sloth.

From one phase to another

- We believe that in the early universe, dark energy existed in a different phase. You can compare it to when water is cooled and it undergoes a phase transition to ice with a lower density, he explains and continues:

- In the same way, dark energy in our model undergoes a transition to a new phase with a lower energy density, thereby changing the effect of the dark energy on the expansion of the universe.

According to Sloth and Niedermann's calculations, the results add up if you imagine that dark energy thus underwent a phase transition triggered by the expansion of the universe.

A very violent process

- It is a phase transition where many bubbles of the new phase suddenly appear, and when these bubbles expand and collide, the phase transition is complete. On a cosmic scale, it is a very violent quantum mechanical process, explains Martin S. Sloth.

Today we know approx. 20 per cent of the matter that the universe is made of: it is the matter that you and I, planets and galaxies are made of. The universe also consists of dark matter, whose nature no one knows.

In addition, there is dark energy in the universe: the energy that causes the universe to expand, making up approx. 70 per cent of the universe's energy density.
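In broad strokes, the expansion rate in such a model follows the Friedmann equation, with an extra early dark energy component that decays away after the phase transition; the higher early-time expansion rate is what can bring the two measurements into agreement. A schematic sketch, where the symbols and the post-transition dilution law are illustrative rather than the authors' full model:

```latex
H^2(a) = \frac{8\pi G}{3}\left[\rho_m a^{-3} + \rho_r a^{-4} + \rho_\Lambda
       + \rho_{\mathrm{NEDE}}(a)\right],
\qquad
\rho_{\mathrm{NEDE}}(a) \approx
\begin{cases}
  \rho_\ast, & a < a_\ast,\\[4pt]
  \rho_\ast \left(\dfrac{a_\ast}{a}\right)^{3(1+w)}, & a \ge a_\ast,
\end{cases}
```

where $a$ is the scale factor, $a_\ast$ marks the phase transition, and $w > 1/3$ makes the new component dilute away faster than radiation, leaving late-time cosmology essentially unchanged.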

Credit: 
University of Southern Denmark

New search engine for single cell atlases

A new software tool allows researchers to quickly query datasets generated from single-cell sequencing. Users can identify which cell types any combination of genes is active in. Published in Nature Methods on 1st March, the open-access 'scfind' software enables swift analysis of multiple datasets containing millions of cells by a wide range of users, on a standard computer.

Processing times for such datasets are just a few seconds, saving time and computing costs. The tool, developed by researchers at the Wellcome Sanger Institute, can be used much like a search engine, as users can input free text as well as gene names.

Techniques to sequence the genetic material from an individual cell have advanced rapidly over the last 10 years. Single-cell RNA sequencing (scRNAseq), used to assess which genes are active in individual cells, can be used on millions of cells at once and generates vast amounts of data (2.2 GB for the Human Kidney Atlas). Projects including the Human Cell Atlas and the Malaria Cell Atlas are using such techniques to uncover and characterise all of the cell types present in an organism or population. Data must be easy to access and query, by a wide range of researchers, to get the most value from them.

To allow for fast and efficient access, a new software tool called scfind uses a two-step strategy to compress data ~100-fold. Efficient decompression makes it possible to quickly query the data. Developed by researchers at the Wellcome Sanger Institute, scfind can perform large scale analysis of datasets involving millions of cells on a standard computer without special hardware. Queries that used to take days to return a result, now take seconds.
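scfind's actual compression and query scheme is described in the paper; conceptually, fast multi-gene queries resemble an inverted index that maps each gene to the set of cells expressing it, so a query reduces to intersecting small sets rather than scanning the full expression matrix. A minimal conceptual sketch (the toy data and function names are invented for illustration, not scfind's API):

```python
from collections import defaultdict

def build_index(expression: dict) -> dict:
    """Invert a cell -> expressed-genes mapping into gene -> cells."""
    index = defaultdict(set)
    for cell, genes in expression.items():
        for gene in genes:
            index[gene].add(cell)
    return index

# Toy dataset: which genes are active in which cells.
cells = {
    "cell_1": {"INS", "GCG"},
    "cell_2": {"INS", "PDX1"},
    "cell_3": {"GCG"},
}
index = build_index(cells)

def query(index: dict, genes: list) -> set:
    """Cells expressing ALL queried genes: intersect the posting lists."""
    posting_lists = [index.get(g, set()) for g in genes]
    return set.intersection(*posting_lists) if posting_lists else set()

print(sorted(query(index, ["INS"])))          # every cell expressing INS
print(sorted(query(index, ["INS", "PDX1"])))  # cells expressing both genes
```

The real tool adds heavy compression of these per-gene cell sets and free-text query handling on top, which is what lets million-cell atlases be searched in seconds on a standard computer.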

The new tool can also be used for analyses of multi-omics data, for example by combining single-cell ATAC-seq data, which measures epigenetic activity, with scRNAseq data.

Dr Jimmy Lee, Postdoctoral Fellow at the Wellcome Sanger Institute, and lead author of the research, said: "The advances of multiomics methods have opened up an unprecedented opportunity to appreciate the landscape and dynamics of gene regulatory networks. Scfind will help us identify the genomic regions that regulate gene activity - even if those regions are distant from their targets."

Scfind can also be used to identify new genetic markers that are associated with, or define, a cell type. The researchers show that scfind is a more accurate and precise method to do this, compared with manually curated databases or other computational methods available.

To make scfind more user friendly, it incorporates techniques from natural language processing to allow for arbitrary queries.

Dr Martin Hemberg, former Group Leader at the Wellcome Sanger Institute, now at Harvard Medical School and Brigham and Women's Hospital, said: "Analysis of single-cell datasets usually requires basic programming skills and expertise in genetics and genomics. To ensure that large single-cell datasets can be accessed by a wide range of users, we developed a tool that can function like a search engine - allowing users to input any query and find relevant cell types."

Dr Jonah Cool, Science Program Officer at the Chan Zuckerberg Initiative, said: "New, faster analysis methods are crucial for finding promising insights in single-cell data, including in the Human Cell Atlas. User-friendly tools like scfind are accelerating the pace of science and the ability of researchers to build off of each other's work, and the Chan Zuckerberg Initiative is proud to support the team that developed this technology."

Credit: 
Wellcome Trust Sanger Institute

Diversity of fish species supplies endangered killer whale diet throughout the year

Image: Endangered Southern Resident killer whales prey on salmon throughout the year, diversifying their diet when salmon presence declines. (Credit: Candice Emmons/NOAA Fisheries, NMFS permit number 16163)

Endangered Southern Resident killer whales prey on a diversity of Chinook and other salmon. The stocks come from an enormous geographic range as far north as Alaska and as far south as California's Central Valley, a new analysis shows.

The diverse salmon stocks each have their own migration patterns and timing. They combine to provide the whales with a "portfolio" of prey that supports them across the entire year. The catch is that many of the salmon stocks are at risk themselves.

"If returns to the Fraser River are in trouble, and Columbia River returns are strong, then prey availability to the whales potentially balances out as the whales have evolved to move rapidly throughout their range," said NOAA Fisheries wildlife biologist Brad Hanson, lead author of the new research. "But if most of the stocks throughout their range are reduced then this could spell trouble for the whales."

The researchers examined more than 150 prey and fecal samples collected from the whales from 2004 to 2017. This produced the most comprehensive picture yet of Southern Resident prey over the course of the year.

"When so many prey species are endangered, then they lose some of that diversity," Hanson said. "The question for managers is how do you support and improve this diverse portfolio of species and stocks."

The analysis also revealed that the whales prey almost exclusively on Chinook salmon when they are available in the summer. However, they diversify their diet the rest of the year to include species such as skates, halibut, and lingcod, as well as steelhead, chum, and coho salmon. Most of the salmon the whales consume in winter and spring come from three large river systems: the Columbia, Sacramento, and rivers entering Puget Sound.

Whales Rely on Hatchery Fish

The researchers note that increased production of hatchery fish could help support the 75 remaining whales, but that this strategy is not without risks.

Many hatchery fish are already available at certain times of the year, said Robin Baird, a research biologist at Cascadia Research Collective and a coauthor of the study. Increasing the diversity of hatchery stocks to include stocks that inhabit the whales' range in winter, when they appear to have less food, may be most helpful.

"We don't need more cookie-cutter fish that all come back during the time when Chinook are most abundant; we need to diversify and increase availability at other times of the year," Baird said.

Funding provided through the federal Pacific Salmon Commission and the Washington State Legislature paid for the release of more than 11.6 million additional hatchery-origin Chinook salmon in 2020, compared to previous years. It will also pay for more than 18.3 million additional fish in 2021.

In line with the findings, the hatchery production will help maintain the portfolio of stocks, including those that overlap with the killer whales during the lean times of winter. Biologists also manage the production to avoid risk to naturally produced salmon.

The Southern Residents have historically spent much of the summer in the inland waters of the Salish Sea. There they feed almost exclusively on Chinook salmon from the Fraser River and the Skagit, Snohomish, and other rivers entering Puget Sound. In the late summer, fall and early winter, they also turn to coho and chum salmon as Chinook decline in number.

Two of the three pods--K and L pods--typically move to the outer coast in fall and winter. There they spend much of their time feeding off the Washington coast, with forays south as far as Monterey Bay in California. They likely concentrate near the Columbia River area because of the large number of returning salmon, heavily supplemented by hatchery fish, the scientists said.

They also diversify their diet with salmon from potentially as far away as the Taku River in Alaska and other species. Those fish comprise a larger share of the whales' diet than scientists first thought based on observations of the whales feeding at the surface. Biologists collected fecal samples from the whales while tracking them along the coast. The samples revealed the more complete picture, including the ocean species they consume underwater and out of sight.

The third pod, J pod, spends more time in inland waters while also traveling along the west coast of Vancouver Island. There they access a mix of salmon traveling a kind of superhighway south to West Coast rivers.

Northern Residents Pose Competition

The research also explores the Southern Residents' potential competition for prey with Northern Resident killer whales. These whales are a separate population that primarily frequents Canadian waters off Vancouver Island and north. The Northern Residents have increased in number to about 300 as the Southern Residents have declined. These opposing trends may reflect the Northern Residents' greater access to prey.

Salmon returning south from waters off Alaska and British Columbia to West Coast rivers pass through Northern Residents' range before reaching the Southern Residents. In comparing their data to results from an earlier Canadian study of Northern Residents, the researchers found that the Northern Residents consumed larger and older salmon. The largest and oldest class of salmon consumed by the Northern Residents were absent from the Southern Residents' diet.

The Southern Resident J-pod that forages to the north of the other two pods may also benefit from the same earlier access. J-pod has had a greater reproductive rate than the other pods, the study notes.

Chinook salmon are also shrinking over time along the entire West Coast, and other predators may also play a role.

"The net result is that the consistent consumption of these smaller fish, which have a lower caloric value, may pose an additional challenge to the [Southern Resident killer whale] population's ability to meet their energetic needs," the scientists wrote.

NOAA Fisheries has designated the Southern Residents one of nine national Species in the Spotlight. These species are highly endangered and can benefit from focused recovery efforts, including conservation of their prey. NOAA Fisheries West Coast Region has already applied the prey study findings in its proposal to expand critical habitat for the Southern Residents along the West Coast.

"Knowing what these whales eat throughout the year and across their various habitats helps us focus recovery efforts for both the Southern Residents and the salmon they rely on," says Lynne Barre, Recovery Coordinator for the Southern Residents at NOAA Fisheries West Coast Region.

Credit: 
NOAA Fisheries West Coast Region

Serious new COVID-related smoking threat discovered by Ben-Gurion University researchers

BEER-SHEVA, ISRAEL...March 3, 2021 - Ben-Gurion University of the Negev (BGU) researchers have found for the first time that cigarette smoke toxicity damages the protective biofilm in the lungs, a finding that is particularly concerning when paired with COVID-19 respiratory complications.

Though the health risks of smoking are well documented, little is known about the overall toxicity of its many ingredients. The researchers developed a new smoke-testing system, a bacterial panel of genetically modified bioluminescent bacteria, that measures the complex molecular mixture of both filtered and unfiltered cigarette smoke.

According to the new study, published in the journal Talanta, cigarette smoke disrupts communication between bacteria, which can affect microorganisms in the body and impair the formation of the biofilm that protects lung bacterial colonies. The study examined 12 distinct types of commercial cigarettes of varying prices bought at local Israeli stores and revealed that filters helped somewhat in lowering toxicity.

"The experiment proved that the filter is a crucial element in reducing the harm of smoking so therefore, new filters need to be developed to reduce toxicity," explains Prof. Robert Marks, head of the BGU Avram and Stella Goldstein-Goren Department of Biotechnology Engineering.

Prof. Robert Marks is a leading expert in the study of genetically engineered bacteria. His work focuses on finding the specific mechanisms of toxins in a variety of materials and their impact on the environment.

Tobacco companies, research organizations, and academics can use the bacterial panel and its accompanying system to cost-effectively monitor the overall toxicity of various commercial cigarettes and test their filter effectiveness.

"The recently developed smoke testing system, based on our bacterial panel, is a new system for researchers that need to analyze toxicity of smoke at a reasonable cost," says Prof. Marks.

Credit: 
American Associates, Ben-Gurion University of the Negev

Ghosts of past pesticide use can haunt organic farms for decades

Although the use of pesticides in agriculture is increasing, some farms have transitioned to organic practices and avoid applying them. But it is uncertain whether chemicals applied to land decades ago continue to influence soil health after the switch to organic management. Now, researchers reporting in ACS' Environmental Science & Technology have identified pesticide residues at 100 Swiss farms, including all of the organic fields studied, and found that the abundance of beneficial soil microbes was lower where residues occurred.

Fungicides, herbicides and insecticides protect crops by repelling or destroying organisms that harm the plants. In contrast, organic agriculture management strategies avoid adding synthetic substances, instead relying on a presumably healthy existing soil ecosystem. However, some organic farms are operating on land treated with pesticides in the past. Yet it's unclear whether pesticides persist in organically managed fields, and what the consequences are for soil life, specifically microbes and beneficial soil fungi, years after their application. So, Judith Riedo, Thomas Bucheli, Florian Walder, Marcel van der Heijden and colleagues wanted to examine pesticide levels and their impact on soil health on farms managed with conventional versus organic practices, as well as on farms converted to organic methods.

The researchers measured surface soil characteristics and the concentrations of 46 regularly used pesticides and their breakdown products in samples taken from 100 fields managed with either conventional or organic practices. Surprisingly, they found pesticide residues at all of the sites, including organic farms converted more than 20 years prior. Multiple herbicides and one fungicide remained in the surface soil after the conversion to organic practices, though the total number of synthetic chemicals and their concentrations decreased significantly the longer the fields were under organic management. Alternatively, the researchers note, some of the pesticides could have reached the organic fields through the air, water or soil from nearby conventional fields. In addition, the team observed lower microbial abundance and decreased levels of a beneficial microbe in fields with higher numbers of pesticides, suggesting that the presence of these substances can decrease soil health. The researchers say future work should examine the synergistic effects of pesticide residues and other environmental stressors on soil health.

Credit: 
American Chemical Society

Color blindness-correcting contact lenses

image: Rose-tinted contact lenses (about 10 mm in diameter) containing gold nanoparticles filter out problematic colors for people with red-green color blindness.

Image: 
Adapted from ACS Nano 2021, DOI: 10.1021/acsnano.0c09657

Imagine seeing the world in muted shades -- gray sky, gray grass. Some people with color blindness see everything this way, though most simply have trouble distinguishing specific colors. Tinted glasses can help, but they cannot correct blurry vision, and the dyed contact lenses currently in development for the condition are potentially harmful and unstable. Now, in ACS Nano, researchers report infusing contact lenses with gold nanoparticles, creating a safer way to see colors.

Some daily activities, such as determining if a banana is ripe, selecting matching clothes or stopping at a red light, can be difficult for those with color blindness. Most people with this genetic disorder have trouble discriminating red and green shades, and red-tinted glasses can make those colors more prominent and easier to see. However, these lenses are bulky and the lens material cannot be made to fix vision problems. Thus, researchers have shifted to the development of special tinted contact lenses. Although the prototype hot-pink dyed lenses improved red-green color perception in clinical trials, they leached dye, which led to concerns about their safety. Gold nanocomposites are nontoxic and have been used for centuries to produce "cranberry glass" because of the way they scatter light. So, Ahmed Salih, Haider Butt and colleagues wanted to see whether incorporating gold nanoparticles into contact lens material instead of dye could improve red-green contrast safely and effectively.

To make the contact lenses, the researchers evenly mixed gold nanoparticles into a hydrogel polymer, producing rose-tinted gels that filtered light within 520-580 nm, the wavelengths where red and green overlap. The most effective contact lenses were those with 40 nm-wide gold nanoparticles, because in tests, these particles did not clump or filter more color than necessary. In addition, these lenses had water-retention properties similar to those of commercial ones and were not toxic to cells growing in petri dishes in the lab. Finally, the researchers directly compared their new material to two commercially available pairs of tinted glasses, and their previously developed hot-pink dyed contact lens. The gold nanocomposite lenses were more selective in the wavelengths they blocked than the glasses. The new lenses matched the wavelength range of the dyed contact lenses, suggesting the gold nanocomposite ones would be suitable for people with red-green color issues without the potential safety concerns. The researchers say that the next step is to conduct clinical trials with human patients to assess comfort.
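As a toy illustration of why a notch over 520-580 nm helps: the lens can be modeled as a simple band-block filter that attenuates green light near 550 nm relative to red and blue, which is what boosts red-green contrast for the wearer. The transmission values below are assumptions for the sketch, not measured properties of the reported lenses.

```python
# Toy band-block model of the tinted lens. The 520-580 nm band comes from
# the article; the transmission fractions are invented for illustration.
BLOCK_LO_NM, BLOCK_HI_NM = 520, 580  # red-green overlap band

def transmission(wavelength_nm: float) -> float:
    """Fraction of light passed by the lens at a given wavelength (toy model)."""
    if BLOCK_LO_NM <= wavelength_nm <= BLOCK_HI_NM:
        return 0.15   # strongly attenuated inside the notch (assumed)
    return 0.95       # mostly transparent elsewhere (assumed)

# Green (~550 nm) is suppressed relative to blue (~460 nm) and red (~650 nm).
for wl in (460, 550, 650):
    print(wl, transmission(wl))
```

A real nanocomposite has a smooth absorption band rather than a hard-edged notch, but the selectivity the researchers measured plays the same role as this step function.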

Credit: 
American Chemical Society

Bioinspired materials from dandelions

video: This video shows the reversible air-trapping compliant mechanism. The compliant mechanism discussed in the study offers new bioinspired insights toward technological solutions for underwater air trapping and air-bubble transportation.

Image: 
University of Trento

Fields are covered with dandelions in spring, a common plant with golden-yellow flowers and toothed leaves. When they wither, the flowers turn into fluffy white seed heads that, like tiny parachutes, are scattered by the wind. Taraxacum officinale, as the plant is known scientifically, has inspired legends and poems and has been used for centuries as a natural remedy for many ailments.

Now, thanks to a study conducted at the University of Trento, dandelions will inspire new engineered materials. The air trapping capacity of dandelion clocks submerged in water has been measured in the lab for the first time. The discovery paves the way for the development of new and advanced devices and technologies that could be used in a broad range of applications, for example, to create devices or materials that retain air bubbles under water.

The study was coordinated by Nicola Pugno, professor of the University of Trento and coordinator of the Laboratory of Bio-inspired, Bionic, Nano, Meta Materials & Mechanics at the Department of Civil, Environmental and Mechanical Engineering.

The discovery was given international prominence by "Materials Today Bio", a multidisciplinary journal focused on the interface between biology and materials science, chemistry, physics, engineering, and medicine.

Nicola Pugno outlined how the research unfolded: "Diego Misseroni and I started to work on a discovery that my daughter made, in her first year in high school. She noticed that dandelion clocks, when submerged by water, turn silvery because they trap air. We have quantified this discovery. For the first time, we have measured the air trapping capacity of dandelion clocks in a laboratory setting. This paper demonstrated that kids and young adults can make significant discoveries by observing nature".

When submerged in water, the research team observed, the soft seed heads turn silver in color, become thinner and take on a rhombus-like shape. The team then developed an analytical model to measure the mechanical properties of the flower, in order to mimic them and create re-engineered dandelion-like materials.

Bioinspired engineering can explore different opportunities thanks to this discovery, such as miniaturized parachute-like elements to develop innovative devices and advanced, light and low-cost technological solutions to trap and transport air bubbles underwater. These materials could be used, for example, in underwater operations.

Credit: 
Università di Trento

Temperature and aridity fluctuations over the past century linked to flower color changes

image: Clemson University graduate student Cierra Sullivan used herbarium specimen data in her research linking temperature and aridity changes to flower color changes over the past 124 years.

Image: 
Clemson University College of Science

CLEMSON, South Carolina - Clemson University scientists have linked climatic fluctuations over the past 124 years with changes in flower color.

Researchers combined descriptions of flower color from museum flower specimens dating back to 1895 with longitudinal- and latitudinal-specific climate data to link changes in temperature and aridity with color change in the human-visible spectrum (white to purple).

The study, which was published in the journal Proceedings of the Royal Society B, showed the change varied across taxa.

"Species experiencing larger increases in temperature tended to decline in pigmentation, but those experiencing larger increases in aridity tended to increase in pigmentation," said Cierra Sullivan, a graduate student in the College of Science's Department of Biological Sciences and lead author of the paper titled "The effects of climate change on floral anthocyanin polymorphisms."

Matthew Koski, an assistant professor of biological sciences, co-authored the paper.

Previous research by Koski and his team, including Sullivan, showed that the ultraviolet-absorbing pigmentation of flowers increased globally over the past 75 years in response to a rapidly degrading ozone layer. That study discussed how flower color changes could influence the behavior of pollinators, which have UV photoreceptors that let them detect patterns invisible to human eyes. The new study examines plant color change that is visible to humans.

"Although we see these changes in flower color, that doesn't inherently mean it's doomsday because the forest, plants and animals naturally respond to what's going on in their environment," Sullivan said. "Seeing changes is not necessarily bad, but it's something to which we should pay attention."

Researchers selected 12 species with reported floral color polymorphisms in North America, representing eight families and 10 genera.

Sullivan obtained herbarium specimen data from the Southeast Regional Network of Expertise and Collections (SERNEC), the Consortium of Pacific Northwest Herbaria, the Consortium of California Herbaria and the Consortium of Northeastern Herbaria. She also checked Clemson University Herbarium's physical collection for specimens not already represented in SERNEC.

After researchers retrieved the date of specimen collection and latitudinal and longitudinal coordinates, they obtained historical bioclimatic data from the year and month that the plant was collected. That data included monthly precipitation, minimum, maximum and mean temperature, minimum and maximum vapor pressure deficit (VPD), and dew point temperature. Vapor pressure deficit is the difference between how much moisture is in the air and the amount of moisture that can be held when the air is saturated. It has implications for drought stress in plants -- higher VPD means more water loss from plants.
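VPD itself is straightforward to compute from temperature and humidity; a common sketch uses the Tetens approximation for saturation vapor pressure. This is a generic textbook formula, not the study's actual data pipeline, and the example inputs are invented.

```python
import math

def saturation_vp_kpa(temp_c: float) -> float:
    """Saturation vapor pressure (kPa) via the Tetens approximation."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vpd_kpa(temp_c: float, rel_humidity_pct: float) -> float:
    """Vapor pressure deficit: saturation VP minus actual VP of the air."""
    return saturation_vp_kpa(temp_c) * (1.0 - rel_humidity_pct / 100.0)

# Warmer, drier air leaves a larger deficit, i.e. more water loss from plants.
print(round(vpd_kpa(25.0, 50.0), 2))  # -> 1.58 kPa
print(round(vpd_kpa(35.0, 20.0), 2))  # -> 4.5 kPa
```

The second, hotter and drier case shows why rising temperature and aridity push VPD, and hence drought stress, upward together.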

Researchers were able to get complete data sets for 1,944 herbarium specimens.

They found variation among the 12 species. Some increased in pigmentation, while others declined in color over the past century.

"It was all tightly linked to how much climatic variation they experienced over time across their range," Koski said.

Two of the species that tended to get lighter in pigmentation are found in the western parts of North America that experienced more dramatic temperature changes than the species in the eastern United States, which had more moderate temperature increases.

"This study documents that flower color that is visually more obvious to humans is also responding to global change but is responding to different factors such as temperature and drought," Koski said.

He said such flower color changes are likely to affect plant-pollinator and plant-herbivore interactions and warrant further study.

"Continued research will help give insight into how species will respond to the various aspects of climate change and which species are the most vulnerable to future climate projections," he said.

Credit: 
Clemson University

Conquering the timing jitters

image: Artistic depiction of XFEL measurement with neon gas. The inherent delay between the emission of photoelectrons and Auger electrons leads to a characteristic ellipse in the analyzed data. In principle, the position of individual data points around the ellipse can be read like the hands of a clock to reveal the precise timing of decay processes.

Image: 
(Image by Daniel Haynes and Jörg Harms/Max Planck Institute for the Structure and Dynamics of Matter.)

Breakthrough greatly enhances the ultrafast resolution achievable with X-ray free-electron lasers.

A large international team of scientists from various research organizations, including the U.S. Department of Energy’s (DOE) Argonne National Laboratory, has developed a method that dramatically improves the already ultrafast time resolution achievable with X-ray free-electron lasers (XFELs). It could lead to breakthroughs on how to design new materials and more efficient chemical processes.

An XFEL device is a powerful combination of particle accelerator and laser technology producing extremely brilliant and ultrashort pulses of X-rays for scientific research. “With this technology, scientists can now track processes that occur within millionths of a billionth of a second (femtoseconds) at sizes down to the atomic scale,” said Gilles Doumy, a physicist in Argonne’s Chemical Sciences and Engineering division. “Our method makes it possible to do this for even faster times.”

One of the most promising applications of XFELs has been in the biological sciences. In such research, scientists can capture how biological processes fundamental to life change over time, even before the radiation from the laser’s X-rays destroys the samples. In physics and chemistry, these X-rays can also shed light on the fastest processes occurring in nature with a shutter speed lasting only a femtosecond. Such processes include the making and breaking of chemical bonds and the vibrations of atoms on thin film surfaces.

For over a decade, XFELs have delivered intense, femtosecond X-ray pulses, with recent forays into the sub-femtosecond (attosecond) regime. However, on these minuscule time scales, it is difficult to synchronize the X-ray pulse that sparks a reaction in the sample with the laser pulse that “observes” it. This problem is called timing jitter.

Lead author Dan Haynes, a doctoral student at the Max Planck Institute for the Structure and Dynamics of Matter, said, “It’s like trying to photograph the end of a race when the camera shutter might activate at any moment in the final ten seconds.”

To circumvent the jitter problem, the research team came up with a pioneering, highly precise approach dubbed “self-referenced attosecond streaking.” The team demonstrated their method by measuring a fundamental decay process in neon gas at the Linac Coherent Light Source, a DOE Office of Science User Facility at SLAC National Accelerator Laboratory.

Doumy and his advisor at the time, Ohio State University Professor Louis DiMauro, had first proposed the measurement in 2012.

In the decay process, called Auger decay, an X-ray pulse catapults atomic core electrons in the sample out of their place. This leads to their replacement by electrons in outer atomic shells. As these outer electrons relax, they release energy. That process can induce the emission of another electron, known as an Auger electron. Radiation damage occurs due to both the intense X-rays and the continued emission of Auger electrons, which can rapidly degrade the sample. Upon X-ray exposure, the neon atoms also emit electrons, called photoelectrons.

After exposing both types of electrons to an external “streaking” laser pulse, the researchers determined their final energy in each of tens of thousands of individual measurements.

“From those measurements, we can follow Auger decay in time with sub-femtosecond precision, even though the timing jitter was a hundred times larger,” said Doumy. “The technique relies on the fact that Auger electrons are emitted slightly later than the photoelectrons and thus interact with a different part of the streaking laser pulse.”

This factor forms the foundation of the technique. By combining so many individual observations, the team was able to construct a detailed map of the physical decay process. From that information, they could determine the characteristic time delay between the photoelectron and Auger electron emission.
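The "clock-hands" idea can be illustrated with a toy model: two electrons emitted a fixed delay apart sample different phases of the same streaking field, so their shot-by-shot energy shifts trace an ellipse whose opening angle encodes the delay. All numbers below are invented for illustration; this is not the experiment's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters only -- not values from the experiment.
period_fs = 10.0                  # streaking-field period (assumed)
omega = 2 * np.pi / period_fs
tau = 1.2                         # true photoelectron-to-Auger delay, fs (assumed)
amp = 1.0                         # streaking amplitude, arbitrary energy units

# Timing jitter: each shot arrives at an unknown time within the field cycle.
t = rng.uniform(0.0, period_fs, 50_000)

# Streaked energy shifts of the photoelectron (emitted at t)
# and the Auger electron (emitted slightly later, at t + tau).
photo = amp * np.sin(omega * t)
auger = amp * np.sin(omega * (t + tau))

# The points (photo, auger) trace an ellipse. For uniform phase coverage,
# <photo * auger> = (amp**2 / 2) * cos(omega * tau), so the delay can be
# read back out without knowing any individual shot's arrival time.
dphi = np.arccos(np.mean(photo * auger) /
                 np.sqrt(np.mean(photo**2) * np.mean(auger**2)))
recovered_tau = dphi / omega
print(recovered_tau)  # close to 1.2 fs, despite ~10 fs of jitter
```

The real analysis maps the full distribution of points around the ellipse rather than a single statistical moment, but the principle is the same: position around the ellipse plays the role of a clock hand marking emission time.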

The researchers are hopeful that self-referenced streaking will have a broad impact in the field of ultrafast science. Essentially, the technique enables traditional attosecond streaking spectroscopy to be extended to XFELs worldwide as they approach the attosecond frontier. In this way, self-referenced streaking may facilitate a new class of experiments benefitting from the flexibility and extreme intensity of XFELs without compromising on time resolution.

Credit: 
DOE/Argonne National Laboratory