
T cells linked to myelin implicated in MS-like disease in monkeys

Scientists have uncovered new clues implicating a type of herpes virus as the cause of a central nervous system disease in monkeys that's similar to multiple sclerosis in people.

The findings, published in the Annals of Clinical and Translational Neurology, expand on previous work to understand the cause of the disease and potentially develop antiviral therapies. The work was led by scientists at Oregon Health & Science University.

"This gives us a better understanding of the model," said Scott Wong, Ph.D., senior author of the study and a scientist at the OHSU Vaccine and Gene Therapy Institute and the Oregon National Primate Research Center. "It draws more parallels to MS in people."

The new study reveals the presence of two kinds of T cells, a type of white blood cell that's a critical part of the body's immune system. In this case, scientists determined the T cells were associated with an immune response involving the loss of myelin, the protective sheath that covers nerve fibers.

Myelin and nerve fibers become damaged in multiple sclerosis, which slows or blocks electrical signals required for us to see, move our muscles, feel sensations and think.

"We found that some of the T cell epitopes targeting myelin in these animals are identical to those found in humans with MS," Wong said.

By linking these specific T cells to the loss of myelin, scientists say the new study opens the possibility of developing an antiviral therapy that could be especially useful for newly diagnosed cases of multiple sclerosis.

"If we found a unique virus that we believed was causing MS, then you could in theory come up with a vaccine against that virus," said co-author Dennis Bourdette, M.D., professor emeritus and former chair of neurology in the OHSU School of Medicine.

The work builds on a chance discovery in the colony of Japanese macaques at the primate center.

In 2011, scientists at OHSU published research identifying a group of monkeys at the primate center with a naturally occurring disease known as Japanese macaque encephalomyelitis. Since then, scientists have been working to understand the cause and progression of the disease in the macaques with an eye toward applying possible therapies in people.

The latest study points toward developing strategies to combat the disease leveraging the body's immune response.

"If we can understand how it's doing it, we may be able to test vaccine strategies," Wong said. "I'm not sure we can prevent virus infection, but we may be able to prevent virus-associated disease."

Credit: 
Oregon Health & Science University

Conductive nature in crystal structures revealed at magnification of 10 million times

image: University of Minnesota Professor K. Andre Mkhoyan and his team used analytical scanning transmission electron microscopy (STEM), which combines imaging with spectroscopy, to observe metallic properties in the perovskite crystal barium stannate (BaSnO3). The atomic-resolution STEM image, with a BaSnO3 crystal structure (on the left), shows an irregular arrangement of atoms identified as the metallic line defect core.

Image: 
Mkhoyan Group, University of Minnesota

In groundbreaking materials research, a team led by University of Minnesota Professor K. Andre Mkhoyan has made a discovery that blends the best of two sought-after qualities for touchscreens and smart windows--transparency and conductivity.

The researchers are the first to observe metallic lines in a perovskite crystal. Perovskites are abundant in the Earth's mantle, and barium stannate (BaSnO3) is one such crystal. However, it had not been studied extensively for metallic properties because more conductive materials, such as metals and semiconductors, are readily available. The finding was made using advanced transmission electron microscopy (TEM), a technique that can form images at magnifications of up to 10 million times.

The research is published in Science Advances, a peer-reviewed scientific journal published by the American Association for the Advancement of Science (AAAS).

"The conductive nature and preferential direction of these metallic line defects mean we can make a material that is transparent like glass and at the same time very nicely directionally conductive like a metal," said Mkhoyan, a TEM expert and the Ray D. and Mary T. Johnson/Mayon Plastics Chair in the Department of Chemical Engineering and Materials Science at the University of Minnesota's College of Science and Engineering. "This gives us the best of two worlds. We can make windows or new types of touch screens transparent and at the same time conductive. This is very exciting."

Defects, or imperfections, are common in crystals--and line defects, the most common of which is the dislocation, are rows of atoms that deviate from the normal order. Because dislocations have the same elemental composition as the host crystal, the changes in electronic band structure at the dislocation core, caused by symmetry reduction and strain, are often only slightly different from those of the host. The researchers needed to look beyond dislocations to find the metallic line defect, whose composition and resulting atomic structure are vastly different.

"We easily spotted these line defects in the high-resolution scanning transmission electron microscopy images of these BaSnO3 thin films because of their unique atomic configuration and we only saw them in the plan view," said Hwanhui Yun, a graduate student in the Department of Chemical Engineering and Materials Science and a lead author of the study.

For this study, BaSnO3 films were grown by molecular beam epitaxy (MBE)--a technique for fabricating high-quality crystals--in a lab at the University of Minnesota Twin Cities. The metallic line defects observed in these BaSnO3 films propagate along the film growth direction, which means researchers can potentially control how or where line defects appear--and engineer them as needed in touchscreens, smart windows, and other future technologies that demand a combination of transparency and conductivity.

"We had to be creative to grow high-quality BaSnO3 thin films using MBE. It was exciting when these new line defects came into light in the microscope," said Bharat Jalan, associate professor and Shell Chair in the Department of Chemical Engineering and Materials Science, who heads up the lab that grows a variety of perovskite oxide films by MBE.

Perovskite crystals (ABX3) contain three elements in the unit cell. This gives them freedom for structural alterations, such as changes in composition and crystal symmetry, and the ability to host a variety of defects. Because the atoms at the line defect core have different coordination and bonding angles, new electronic states are introduced and the electronic band structure is modified locally in such a dramatic way that the line defect turns into a metal.

"It was fascinating how theory and experiment agreed with each other here," said Turan Birol, assistant professor in the Department of Chemical Engineering and Materials Science and an expert in density functional theory (DFT). "We could verify the experimental observations of the atomic structure and electronic properties of this line defect with first principles DFT calculations."

Credit: 
University of Minnesota

Scientists offer road map to improve environmental observations in the Indian Ocean

image: Artist's illustration of the Indian Ocean Observing System and its societal applications. IndOOS data support research to advance scientific knowledge about the Indian Ocean circulation, climate variability and change, and biogeochemistry, as well as societal applications due to its contribution to operational analyses and forecasts.
Citation: Bulletin of the American Meteorological Society 101, 11; 10.1175/BAMS-D-19-0209.1

Image: 
JAMSTEC.

MIAMI--A group of more than 60 scientists has provided recommendations to improve the Indian Ocean Observing System (IndOOS), a basin-wide monitoring system to better understand the impacts of human-caused climate change in a region that has been warming faster than any other ocean.

The group, led by Lisa Beal, professor of ocean sciences at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science, provides a road map for an enhanced IndOOS to better meet the scientific and societal needs for more reliable environmental forecasts in the next decade. The 136 actionable recommendations from the three-year, internationally coordinated review were published in the Bulletin of the American Meteorological Society.

The scientists call for four major improvements to the current observing system: 1) more chemical and biological measurements in at-risk ecosystems and fisheries; 2) expansion into the western tropics to improve understanding of the monsoon; 3) better-resolved upper ocean processes to improve predictions of rainfall, drought, and heat waves; and 4) expansion into key coastal regions and the deep ocean to better constrain the basin-wide energy budget.

Although the smallest of the major oceans on Earth, the Indian Ocean is home to roughly one-third of the global population living among the 22 countries that border its rim. Many of these countries are developing or emerging economies vulnerable to climate changes such as sea level rise and more extreme weather events. The Indian Ocean also influences climate globally and is thought to have played a key role in regulating global mean surface temperatures.

The Indian Ocean Observing System, established in 2006, is a multinational network of sustained oceanic measurements that underpin understanding and forecasting of weather and climate for the Indian Ocean region and beyond. IndOOS is part of the Global Ocean Observing System (GOOS) which is coordinated through the World Meteorological Organization and the Intergovernmental Oceanographic Commission of the United Nations.

IndOOS-2 will require new agreements and partnerships with and among Indian Ocean rim countries, creating opportunities for them to enhance their monitoring and forecasting capacity, said the authors.

Credit: 
University of Miami Rosenstiel School of Marine, Atmospheric, and Earth Science

Dairy product purchasing differs in households with and without children

image: A new study published in JDS Communications found that households with children reported purchasing larger quantities and higher-fat dairy products compared to households without children.

Image: 
JDS Communications

Champaign, IL, January 15, 2021 - American dairy consumers are influenced by a variety of factors that can affect their buying habits, including taste, preference, government information, cultural background, social media, and the news. In an article appearing in JDS Communications, researchers found that households that frequently bought food for children were interested in dairy as part of their diet and purchased larger quantities of fluid milk, as well as fluid milk with a higher fat content.

To assess the purchasing habits of households that purchase food for children versus those that do not, researchers from Purdue University and Oklahoma State University collected data through an online survey tool, Qualtrics. US residents aged 18 years or older were asked a variety of questions covering demographic information and dairy product purchasing behavior. Participants were recruited through Kantar, an online opt-in panel database. "The sample was targeted to be representative of the US population in terms of sex, age, income, education, and geographical region of residence as defined by the US Census Bureau (2016)," said author Mario Ortez, PhD student at Purdue University in West Lafayette, IN, USA.

The survey received a total of 1,440 responses. Of these, 511 respondents indicated they frequently purchased food specifically for children, whereas 929 indicated they did not; 521 indicated that they had at least one child in the household, and 912 indicated they did not have children in their household. The study found that households that frequently purchased food for children generally purchased larger quantities of fluid milk and chose fluid milk with a higher fat content. Households with children also bought yogurt more frequently than other households.

Other findings from the survey indicated that cheese and milk are most often purchased for part of a meal, and yogurt is bought most frequently as a snack. The survey also found that households largely reported reviewing product attributes of price, expiration date, and nutritional information (in that order) on egg, milk, and meat labels.

"This study demonstrates the continued belief among American consumers that dairy products are an important part of a healthy diet fed to children. The popularity of whole milk, cheese, and yogurt within these households suggests that children enjoy the taste of dairy products and are happy to have them served during regular meals and at snack time," said Matthew Lucy, PhD, Editor-in-Chief of JDS Communications, University of Missouri, Columbia, MO, USA. These findings can influence product marketing efforts and stakeholder decisions in the dairy industry.

"Future studies can build on this work by evaluating whether there is a spillover effect from purchasing specifically for children and the general dairy and protein product purchasing habits of those households," said Dr. Courtney Bir, PhD, coauthor of the study and assistant professor, Oklahoma State University, Stillwater, OK, USA.

Policy makers and companies can use this information to help inform product labeling and better target necessary segments to increase product awareness and better the dairy industry as a whole.

Credit: 
Elsevier

Changing resilience of oceans to climate change

Oxygen levels in the ancient oceans were surprisingly resilient to climate change, new research suggests.

Scientists used geological samples to estimate ocean oxygen during a period of global warming 56 million years ago - and found "limited expansion" of seafloor anoxia (absence of oxygen).

Global warming - both past and present - depletes ocean oxygen, but the new study suggests warming of 5°C during the Paleocene-Eocene Thermal Maximum (PETM) led to anoxia covering no more than 2% of the global seafloor.

However, conditions are different today to the PETM - today's rate of carbon emissions is much faster, and we are adding nutrient pollution to the oceans - both of which could drive more rapid and expansive oxygen loss.

The study was carried out by an international team including researchers from ETH Zurich, the University of Exeter and Royal Holloway, University of London.

"The good news from our study is that the Earth system was resilient to seafloor deoxygenation 56 million years ago despite pronounced global warming," said lead author Dr Matthew Clarkson, of ETH Zurich.

"However, there are reasons why things are different today.

"In particular, we think the Paleocene had higher atmospheric oxygen than today, which would have made anoxia less likely.

"Additionally, human activity is putting more nutrients into the ocean through fertilisers and pollution, which can drive oxygen loss and accelerate environmental deterioration."

To estimate ocean oxygen levels during the PETM, the researchers analysed the isotopic composition of uranium in ocean sediments, which tracks oxygen concentrations.

Surprisingly, these barely changed during the PETM.

This sets an upper limit on how much ocean oxygen levels could have changed.

Computer simulations based on the results suggest a maximum ten-fold increase in the area of seafloor devoid of oxygen - taking the total to no more than 2% of the global seafloor.

This is still significant, at around ten times the modern area of anoxia, and there were clearly detrimental impacts and extinctions of marine life in some parts of the ocean.

Co-author Professor Tim Lenton, Director of Exeter's Global Systems Institute, notes: "This study shows how the resilience of the Earth's climate system has changed over time.

"The order of mammals we belong to - the primates - originated in the PETM. Unfortunately, as we primates have been evolving for the last 56 million years, it looks like the oceans have been getting less resilient."

Professor Lenton added: "Although the oceans were more resilient than we thought at this time in the past, nothing should distract us from the urgent need to reduce emissions and tackle the climate crisis today."

Credit: 
University of Exeter

Divergences between scientific and Indigenous and Local Knowledge can be helpful

image: Discussion about carnivore species with Daasanach pastoralists.

Image: 
ANDRÉ BRAGA JUNQUEIRA

Divergences between scientific and Indigenous and Local Knowledge can provide a better understanding of why local pastoralists may be willing, or not, to participate in conservation initiatives for carnivores, a study from University of Helsinki suggests.

Carnivore conservation has historically been based primarily on scientific knowledge using a wide range of sampling methods, such as camera trapping and track surveys. However, the estimates from these common ecological sampling methods can be quite uncertain and can depend on accessibility and geology, as is the case in many remote areas such as Sibiloi National Park in Northern Kenya. For this reason, the inclusion of local communities that share land with carnivore species has been encouraged to enhance conservation.

"I remember at the beginning of the project, many local pastoralists told me that they see cheetahs running at full speed. However, despite their accurate descriptions, I was very sceptical about it, as I did not have any image from my camera traps. A year later, I got a photograph of a cheetah holding a grant gazelle. On that day, I recognized that local pastoralists were correct all along. I acknowledged the importance of complementing scientific knowledge with Indigenous and Local Knowledge of local pastoralists that share their day-by-day lives next to those carnivores," describes Miquel Torrents-Ticó, a PhD student from the Faculty of Biological and Environmental Sciences at the University of Helsinki, and the lead author of the study.

Indigenous and Local Knowledge is held by the local community and handed down through generations of continuous interactions with their environment. Traditionally, there has been a tendency to compare Indigenous and Local Knowledge with scientific knowledge looking especially for convergences to enhance wildlife management. However, there are also divergences between scientific and Indigenous and Local Knowledge that have been less studied because they can indicate stakeholders' conflicts and can create challenges in effectively implementing conservation actions.

"The idea of this study was to complement information derived from camera trapping, track surveys and semi-structured interviews of local pastoralists acknowledging the divergences and explaining the unknown status of carnivores of the remote area of Sibiloi," explains Miquel Torrents-Ticó.

Biodiversity loss of carnivores in Sibiloi is dramatic

Although Sibiloi National Park in Northern Kenya had a rich mammalian fauna in the past, large animal populations have since declined sharply. Surprisingly, researchers know very little about this process of biodiversity loss due to Sibiloi's isolated location, and it is not clear which species remain and how threatened they are.

"Many local Daasanach remember with joy the time when as children they used to see lions, leopards and cheetahs in Sibiloi," says Torrents-Ticó. "However, our study shows a new reality of Sibiloi with some carnivore species already gone, such as lions, and many others present in very low numbers, such as cheetahs and striped hyaenas."

In view of these results, the authors call for an urgent need for conservation actions before Sibiloi reaches a dramatic point of no return, where all carnivores will be extinct from the area.

The potential of complementation: Inclusive conservation with space for divergences

This study highlights that carnivore abundances obtained by complementing scientific and Indigenous and Local Knowledge can diverge, which can provide a wider picture of human-carnivore relationships and conservation contexts. For instance, carnivore species that have a high impact on local pastoralists' livelihoods and safety can be perceived as more abundant by local pastoralists, even if the abundances obtained from ecological scientific methods are low.

"Divergences between scientific and Indigenous and Local Knowledge due to high perceptions of risk and damage should be considered in order to take into account local pastoralists views, and in this way, make conservation more inclusive," emphasizes Torrents-Ticó.

Credit: 
University of Helsinki

A new tool to facilitate quicker, error-free software design

Any building project requires the formulation of a series of initial plans prior to starting construction to serve as a basis and guide for the whole process. A similar procedure is followed in software development, with the inclusion of a specific step known as modelling. "The process is equivalent to the production of a set of plans for a building before its construction," explained Robert Clarisó, professor at the Universitat Oberta de Catalunya (UOC) Faculty of Computer Science, Multimedia and Telecommunications and member of the SOM Research Lab group at the Internet Interdisciplinary Institute (IN3).

Engineers use modelling to describe a software system from a specific perspective, such as the data it will use, its components or the way they expect it to function. Going back to the building project example, the plans would be the 'models', which can be used for guidance during the development stage, as well as for carrying out simulations and tests.

According to the researcher, "The model type most frequently used is the UML (Unified Modelling Language) class diagram notation, which is used to describe the structure of a software system." The advantage of working with these models is that they are more abstract than source code, which contains a lot of specific details about the technology being used. In the words of Clarisó, "Models can be more concise, easier to produce and understand."

As such, modelling would serve more as a preliminary step rather than an alternative to source code. The models make it easier to understand the system being developed and can also be used to generate certain implementation elements, automating the most repetitive parts of the programming process.

The role of verification tools

Engineers use verification tools to prevent errors that could affect the code itself and, therefore, the final implementation of the system. Emphasizing the importance of this process, Clarisó said, "We need to ensure the models are correct in order to minimize possible errors in the software that could occur as a result."

As part of a study published in the Journal of Object Technology, Clarisó, along with his colleagues Carlos A. González (Gran Telescopio Canarias researcher) and Jordi Cabot (ICREA researcher), has come up with a new verification technique for UML/OCL models that solves a common problem. Every time a designer makes a change to the model - such as adding, deleting, or modifying information - the whole system has to be re-analysed, which is why verification is usually only carried out once a definitive model has been produced at the end of the process.

As Clarisó explained, "Our article outlines the application of incremental methods of verification, that is, we make it easier to verify a model any time changes are made." Rather than only being able to verify the model at the end of the process, as is currently the case, this permits it to be verified during construction, without having to start from scratch, which facilitates the early detection of errors.

An active community, both nationally and globally

This method is also innovative with regard to its use of certificates, the examples that illustrate the correct operation of the model. As the researcher pointed out, "When we modify a model, having a new certificate would remove the need for its verification. It's far less costly to adapt a certificate than it is to rerun the verification process." In other words, rather than verifying the new model from scratch, the authors propose adapting a certificate from the original model to the new one. The biggest challenge they now face is integrating these techniques into existing software modelling tools and environments.
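To make the certificate idea concrete, the following Python sketch shows incremental checking under heavily simplified assumptions: a "model" is reduced to a list of constraint functions, a "certificate" is a concrete example instance that satisfies them, and an exhaustive search stands in for solver-based verification. The names, data and structure are illustrative only; they are not the authors' tooling or the UML/OCL formalism.

def full_verification(constraints, domain):
    """Expensive step: search the whole domain for an instance satisfying
    every constraint (a stand-in for solver-based model verification)."""
    for candidate in domain:
        if all(check(candidate) for check in constraints):
            return candidate          # this instance doubles as a certificate
    return None                       # model is unsatisfiable (inconsistent)

def incremental_verification(constraints, certificate, domain):
    """Cheap step first: re-check the old certificate against the modified
    constraints; only fall back to full verification if it no longer holds."""
    if certificate is not None and all(check(certificate) for check in constraints):
        return certificate            # certificate adapted "for free"
    return full_verification(constraints, domain)

# Toy example: a model instance is a (students, teachers) pair.
domain = [(s, t) for s in range(50) for t in range(50)]
constraints = [lambda m: m[0] >= 10,           # at least 10 students
               lambda m: m[1] >= 1,            # at least 1 teacher
               lambda m: m[0] <= 30 * m[1]]    # class-size limit

cert = full_verification(constraints, domain)       # verify the initial model
print("initial certificate:", cert)

# First edit: tighten the class-size limit; the old certificate still works.
constraints[2] = lambda m: m[0] <= 20 * m[1]
cert = incremental_verification(constraints, cert, domain)
print("after tightening the limit:", cert)

# Second edit: require more students; the certificate fails, so the
# expensive search is rerun only in this case.
constraints[0] = lambda m: m[0] >= 25
cert = incremental_verification(constraints, cert, domain)
print("after requiring more students:", cert)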

Commenting on the sector in general, Clarisó, who acted as the coordinator of the Spanish Network of Excellence in Model-driven Software Engineering, stressed that "the Spanish modelling community is still very active and participates in a variety of national and international research projects".

In fact, although the network is no longer operational, the community still works together and collaborates as part of the annual Software and Database Engineering Conference, which has a space dedicated to Model-driven Software Engineering.

Credit: 
Universitat Oberta de Catalunya (UOC)

Low cost chlorine dispensing device improves tap water safety in low-resource regions

image: The chlorine treatment device requires little maintenance and no change in collection of water from the tap

Image: 
Amy Pickering

MEDFORD/SOMERVILLE (January 14, 2021) - A team of researchers led by engineers at Tufts University's School of Engineering and Stanford University's Program on Water, Health and Development have developed a novel and inexpensive chlorine dispensing device that can improve the safety of drinking water in regions of the world that lack financial resources and adequate infrastructure. With no moving parts, no need for electricity, and little need for maintenance, the device releases measured quantities of chlorine into the water just before it exits the tap. It provides a quick and easy way to eliminate water-borne pathogens and reduce the spread of high mortality diseases such as cholera, typhoid fever and diarrhea.

According to the CDC, more than 1.6 million people die from diarrheal diseases every year and half of those are children. The authors suggest that the solution to this problem could be relatively simple.

In communities and regions that do not have the resources to build water treatment plants and distribution infrastructure, the researchers found that the device can provide an effective, alternative means of water treatment at the point of collection. The device was installed and tested at several water collection stations, or kiosks, across rural areas in Kenya.

The study, which also looks at the economic feasibility and local demand for the system, was published today in the journal NPJ Clean Water.

"The idea we pursued was to minimize the user burden by automating water treatment at the point of collection," said Amy J. Pickering, former professor of civil and environmental engineering at Tufts (now at Stanford) and corresponding author of the study. "Clean water is central to improving human health and alleviating poverty. Our goal was to design a chlorine doser that could fit onto any tap, allowing for wide-scale implementation and increasing accessibility to a higher-level of safe water service."

Water is a simple substance, but a complex global health issue in both its availability and quality. Although it has long been a focus of the World Health Organization and other NGOs, 2.1 billion people still lack access to safe water at home (WHO). In areas of the world where finances and infrastructure are scarce, water may be delivered to communities by pipe, boreholes or tube wells, dug wells, and springs. Unfortunately, 29 percent of the global population uses a source that fails to meet the Sustainable Development Goal (SDG) criteria for safely managed water - accessible and available when needed, and free from fecal and chemical contamination. In many places, access to safe water is out of reach due to the lack of available funds to create and support water treatment facilities.

The device works on the principle of a physical phenomenon in fluid dynamics called the Venturi effect, in which a non-compressible fluid flows at a faster rate when it runs from a wider to a narrower passage. In the device, the water passes through a so-called pinch valve. The fast-moving water stream draws in chlorine from a tube attached to the pinch valve. A needle valve controls the rate and thus amount of chlorine flowing into the water stream. The simple design could allow the device to be manufactured for $35 USD at scale.
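To illustrate the principle, the short Python sketch below estimates the suction generated at a Venturi constriction from the continuity and Bernoulli relations. The flow rate and passage diameters are made-up illustrative values, not the dimensions of the device described in the study.

import math

RHO_WATER = 1000.0   # kg/m^3, density of water (treated as incompressible)
Q = 0.15e-3          # m^3/s, assumed tap flow rate (0.15 L/s)
D_WIDE = 0.012       # m, assumed diameter of the wide passage
D_NARROW = 0.004     # m, assumed diameter of the constriction

def area(diameter):
    """Cross-sectional area of a circular passage."""
    return math.pi * diameter ** 2 / 4.0

# Continuity: the same volumetric flow through a smaller area means faster flow.
v_wide = Q / area(D_WIDE)
v_narrow = Q / area(D_NARROW)

# Bernoulli (ideal, lossless): faster flow at the constriction lowers the local
# pressure; this suction is what draws chlorine in through the side tube.
pressure_drop = 0.5 * RHO_WATER * (v_narrow ** 2 - v_wide ** 2)

print(f"velocity in wide passage: {v_wide:.2f} m/s")
print(f"velocity at constriction: {v_narrow:.2f} m/s")
print(f"suction at constriction:  {pressure_drop / 1000:.1f} kPa")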

"Rather than just assume we made something that was easier to use, we conducted user surveys and tracked the performance of the devices over time," said study co-author Jenna Davis, a professor of civil and environmental engineering at Stanford, director of Stanford's Program on Water, Health and Development, and co-PI of the Lotus Water project. This research is an extension of Lotus Water, which aims to provide reliable and affordable disinfection services for communities most at risk of waterborne illness.

A six-month evaluation in Kenya revealed stable operation of six of seven installed devices; one malfunctioned due to accumulation of iron deposits, a problem likely solvable with a pre-filter. Six of the seven sites were able to maintain payment for and upkeep of the device, and 86.2 percent of 167 samples taken from the devices throughout the period showed chlorine above the WHO recommended minimum level to ensure safe water, and below a threshold determined for acceptable taste. Technical adjustments were required in less than 5 percent of visits by managers of the kiosks. In a survey, more than 90 percent of users said they were satisfied with the quality of the water and operation of the device.

"Other devices and methods have been used to treat water at the point of collection," said Julie Powers, PhD student at Tufts School of Engineering and first author of the study. "but the Venturi has several advantages. Perhaps most importantly, it doesn't change the way people collect their water or how long it takes - there's no need for users to determine the correct dosing or spend extra time- just turn on the tap. Our hope is the low cost and high convenience will encourage widespread adoption that can lead to improved public health."

Future work examining the effect of the in-line chlorination device on diarrhea, enteric infections, and child mortality could further catalyze investment and scaling up this technology, said Powers.

Credit: 
Tufts University

A climate in crisis calls for investment in direct air capture, new research finds

image: Massive deployment of Direct Air Capture could reverse the rise in global temperature well before 2100, but only with immediate and sustained investments from governments and firms to scale up the new technology.

Image: 
acilo/iStock

There is a growing consensus among scientists as well as national and local governments representing hundreds of millions of people, that humanity faces a climate crisis that demands a crisis response. New research from the University of California San Diego explores one possible mode of response: a massively funded program to deploy direct air capture (DAC) systems that remove CO2 directly from the ambient air and sequester it safely underground.

The findings reveal such a program could reverse the rise in global temperature well before 2100, but only with immediate and sustained investments from governments and firms to scale up the new technology.

Despite the enormous undertaking explored in the study, the research also reveals that governments would, at the same time, need to adopt policies that achieve deep cuts in CO2 emissions. The scale of the effort needed just to achieve the Paris Agreement goal of holding average global temperature rise below 2 degrees Celsius is massive.

The study, published in Nature Communications, assesses how crisis-level government funding on direct air capture--on par with government spending on wars or pandemics--would lead to deployment of a fleet of DAC plants that would collectively remove CO2 from the atmosphere.

"DAC is substantially more expensive than many conventional mitigation measures, but costs could fall as firms gain experience with the technology," said first-author Ryan Hanna, assistant research scientist at UC San Diego. "If that happens, politicians could turn to the technology in response to public pressure if conventional mitigation proves politically or economically difficult."

Co-author David G. Victor, professor of industrial innovation at UC San Diego's School of Global Policy and Strategy, added that atmospheric CO2 concentrations are such that meeting climate goals requires not just preventing new emissions through extensive decarbonization of the energy system, but also finding ways to remove historical emissions already in the atmosphere.

"Current pledges to cut global emissions put us on track for about 3 degrees C of warming," Victor said. "This reality calls for research and action around the politics of emergency response. In times of crisis, such as war or pandemics, many barriers to policy expenditure and implementation are eclipsed by the need to mobilize aggressively."

Emergency deployment of direct air capture

The study calculates the funding, net CO2 removal, and climate impacts of a large and sustained program to deploy direct air capture technology.

The authors find that if an emergency direct air capture program were to commence in 2025 and receive investment of 1.2-1.9% of global GDP annually, it would remove 2.2-2.3 gigatons of CO2 per year by 2050 and 13-20 gigatons per year by 2075. Cumulatively, the program would remove 570-840 gigatons of CO2 from 2025-2100, which falls within the range of CO2 removals that IPCC scenarios suggest will be needed to meet Paris targets.

Even with such a massive program, the globe would see temperature rise of 2.4-2.5ºC in the year 2100 without further cuts in global emissions below current trajectories.

Exploring the reality of a fleet of CO2 scrubbers in the sky

According to the authors, DAC has attributes that could prove attractive to policymakers if political pressure to act on climate change continues to mount while deep cuts in emissions remain insurmountable.

"Policymakers might see value in the installation of a fleet of CO2 scrubbers: deployments would be highly controllable by the governments and firms that invest in them, their carbon removals are verifiable, and they do not threaten the economic competitiveness of existing industries," said Hanna.

From the Civil War to Operation Warp Speed, the authors estimate the financial resources that might be available for emergency deployment of direct air capture--in excess of one trillion dollars per year--based on previous spending the U.S. has made in times of crisis.

The authors then built a bottom-up deployment model that constructs, operates and retires successive vintages of DAC scrubbers, given available funds and the rates at which direct air capture technologies might improve with time. They linked the technological and economic modeling to climate models that calculate the effects of these deployments on atmospheric CO2 concentrations and global mean surface temperature.
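As a rough illustration of what such a bottom-up vintage model involves, the Python sketch below builds plants each year from a fixed budget, lets capital costs fall along a learning curve, caps how fast the fleet can grow, retires plants at the end of their life, and tallies annual and cumulative removals. Every number in it (budget, plant size, costs, learning rate, growth limit, lifetime) is a placeholder assumption rather than a value from the study, so its outputs are not the study's results.

import math

BUDGET = 1.0e12        # USD available per year to build and run plants (assumed)
CAPEX0 = 1.0e9         # USD capital cost of the first plant (assumed)
OPEX = 0.3e9           # USD per plant per year to operate (assumed)
CAPACITY = 1.0e6       # tonnes of CO2 removed per plant per year (assumed)
LIFETIME = 25          # years before a plant is retired (assumed)
MAX_GROWTH = 0.20      # fleet can grow at most 20% per year: the scale-up limit
LEARNING_EXP = math.log2(1 - 0.10)   # ~10% capex drop per doubling of builds

def unit_capex(n_built):
    """Capital cost of the next plant, falling along a learning curve."""
    return CAPEX0 * max(n_built, 1) ** LEARNING_EXP

fleet = []             # remaining lifetime (years) of each operating plant
n_built = 0
cumulative = 0.0       # total tonnes of CO2 removed so far

for year in range(2025, 2101):
    fleet = [life - 1 for life in fleet if life > 1]     # retire old plants
    budget_left = BUDGET - OPEX * len(fleet)             # pay operating costs first
    max_new = max(10, int(MAX_GROWTH * len(fleet)))      # industry scale-up cap
    built = 0
    while built < max_new and budget_left >= unit_capex(n_built):
        budget_left -= unit_capex(n_built)
        n_built += 1
        built += 1
        fleet.append(LIFETIME)
    removal = CAPACITY * len(fleet)                      # tonnes removed this year
    cumulative += removal
    if year in (2050, 2075, 2100):
        print(f"{year}: {removal / 1e9:.2f} GtCO2/yr, "
              f"cumulative {cumulative / 1e9:.0f} GtCO2")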

With massive financial resources committed to DAC, the study finds that the ability of the DAC industry to scale up is the main factor limiting CO2 removal from the atmosphere. The authors point to the ongoing pandemic as an analog: even though the FDA has authorized use of coronavirus vaccines, there is still a huge logistical challenge to scaling up production, transporting, and distributing the new therapies quickly and efficiently to vast segments of the public.

Conventional mitigation is still needed, even with wartime spending combating climate change

"Crisis deployment of direct air capture, even at the extreme of what is technically feasible, is not a substitute for conventional mitigation," the authors write.

Nevertheless, they note that the long-term vision for combating climate requires taking negative emissions seriously.

"For policymakers, one implication of this finding is the high value of near-term direct air capture deployments--even if societies today are not yet treating climate change as a crisis--because near term deployments enhance future scalability," they write. "Rather than avoiding direct air capture deployments because of high near-term costs, the right policy approach is the opposite."

Additionally, they note that such a large program would grow a new economic sector, producing a substantial number of new jobs.

The authors conclude it is time to extend research on direct air capture systems to real-world conditions and constraints that accompany deployment--especially in the context of acute political pressures that will arise as climate change becomes viewed as a crisis.

Credit: 
University of California - San Diego

A scanning transmission X-ray microscope for analysis of chemical states of lithium

image: Fig. 1. Schematic of an optical system of a scanning transmission soft X-ray microscope (STXM).

Image: 
NINS/IMS

Lithium-ion batteries (LIBs) are widely used in everyday products such as hybrid cars and cell phones, but their charge/discharge process is not yet fully understood. Understanding the process requires revealing the behavior of lithium ions: their distribution, chemical composition and chemical state. A research group at the Institute for Molecular Science turned to the scanning transmission X-ray microscope (STXM, shown in Fig. 1) as a powerful technique for performing X-ray absorption spectroscopy (XAS) with high spatial resolution. By using the absorption edge of a specific element, a two-dimensional map of the chemical state of a sample can be obtained. For lithium, however, the low energy of the Li K absorption edge (55 eV) makes XAS difficult to measure, owing to the lack of a suitable optical element and to the intense higher-order harmonics from the monochromator that contaminate the spectra. The group therefore developed a low-pass filtering zone plate (LPFZP), a focusing optical element for the STXM, to overcome these issues. The LPFZP uses 200-nm-thick silicon as the substrate of the zone plate, and the substrate acts as a low-pass filter above 100 eV through absorption at the Si L2,3 edges. This hybrid optic suppresses the higher-order harmonics without adding an extra optical component to the STXM. As a result, the STXM with the LPFZP suppresses higher-order harmonics to 0.1% of their original intensity and makes it possible to measure XAS spectra at the Li K-edge. The spatial resolution was estimated to be 72 nm.

A thin-section sample of a test electrode of the LIB was analyzed. The sample, made of Li2CO3, was prepared by a focused ion-beam process. The STXM image at 70 eV and the Li K-edge XAS spectra are shown in Fig. 2(a) and 2(b), respectively. XAS spectra were successfully obtained from the regions indicated by circles in Fig. 2(a).

Understanding the behavior of lithium in LIBs is necessary for improving their performance, and the STXM with the LPFZP will be a helpful tool for analyzing it with high spatial resolution.

Credit: 
National Institutes of Natural Sciences

A highly sensitive technique for measuring the state of a cytoskeleton

image: Optical blur was artificially added to microscopic images of reticular and bundled cytoskeletons, and the difference between the existing and proposed methods was assessed. In the figure on the right, higher values on the vertical axis indicate a clearer distinction between the two. The proposed method distinguishes the reticular and bundled cytoskeleton images more accurately than the existing method, even in degraded images.

Image: 
Associate Professor Takumi Higaki

A research group from Kumamoto University, Japan has developed a highly sensitive technique to quantitatively evaluate the extent of cytoskeleton bundling from microscopic images. Until now, analysis of cytoskeleton organization was generally made by manually checking microscopic images. The new method uses microscopic image analysis techniques to automatically measure cytoskeleton organization. The researchers expect it to dramatically improve our understanding of various cellular phenomena related to cytoskeleton bundling.

The cytoskeleton is a fibrous structure inside the cell made of proteins. It forms higher-order structures called networks and bundles, which maintain or change the shape of the cell depending on its state. An accurate understanding of the structures woven by the cytoskeleton makes it possible to estimate the state of a cell. In the past, analysis of higher-order cytoskeleton structures was generally done by visual observation of the stained cytoskeleton under a microscope by an expert. However, these conventional methods are based on the subjective judgment of the researcher, and thus lack objectivity. In addition, as the number of specimens to be analyzed increases, the personnel costs for experts also grow.

To counter these issues, Associate Professor Takumi Higaki of Kumamoto University has been developing a quantitative method to automatically evaluate the characteristics of complex cytoskeleton structures using microscope image analysis technology. About 10 years ago, he reported that the degree of cytoskeleton bundling could be evaluated from fluorescently stained microscopic images by a numerical index he called the "skewness of the intensity distribution". This technique is now widely used, but it has a problem: bundle conditions cannot be accurately evaluated when bundling is excessive or when the microscopic image contains a lot of optical blur.

Therefore, Dr. Higaki and a new collaborative research group developed a quantitative evaluation technique for cytoskeletal bundles that is both more sensitive and more versatile than the methods described above. Through graphical computer simulations of cytoskeletal bundling, they found that the coefficient of variation of intensities in cytoskeleton pixels nicely reflected the bundle state. Using cytoskeleton microscopic images, they performed a comparative analysis between existing methods and the new method and found that the proposed method was more sensitive at detecting bundle states than the others. They also found that it could be applied to a multitude of biological samples and microscopes. Furthermore, they studied the effect of optical blur, a major cause of image degradation in microscope images, on the proposed method and found that it remained sufficient for quantitative evaluation of the bundle state even in unclear images.
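As a rough sketch of the two indices discussed here, the Python snippet below computes the skewness of the intensity distribution (the earlier index) and the coefficient of variation of intensities (the index proposed in this study) over putative cytoskeleton pixels of a fluorescence image. The simple intensity threshold and the synthetic test images are illustrative simplifications, not the authors' actual pipeline.

import numpy as np

def bundling_indices(image, threshold=None):
    """Return (skewness, coefficient of variation) of intensities over pixels
    treated as cytoskeleton. The crude mean-plus-one-sigma threshold used to
    pick those pixels is an illustrative choice, not the study's segmentation."""
    img = np.asarray(image, dtype=float)
    if threshold is None:
        threshold = img.mean() + img.std()
    pixels = img[img > threshold]
    mean, std = pixels.mean(), pixels.std()
    skewness = np.mean(((pixels - mean) / std) ** 3)   # earlier index
    cv = std / mean                                    # proposed index
    return skewness, cv

# Tiny synthetic example: an even mesh of filaments versus the same mesh with
# several filaments fused into one bright bundle.
rng = np.random.default_rng(0)
background = rng.normal(10, 2, (128, 128))
mesh = background.copy()
mesh[::8, :] += 30                  # many dim, evenly bright filaments
bundled = mesh.copy()
bundled[60:64, :] += 90             # a thick, much brighter bundle

for name, img in [("mesh", mesh), ("bundled", bundled)]:
    skew, cv = bundling_indices(img)
    print(f"{name}: skewness = {skew:.2f}, CV = {cv:.2f}")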

"This technology will enable the quantitative evaluation of the state of cytoskeletal bundles from a more diverse set of microscopic images. We expect that it will dramatically advance our understanding of cells by furthering our understanding of higher-order structures of the cytoskeleton," said Dr. Higaki. "Since cytoskeletal bundling can now be accurately measured, even from unclear images acquired with inexpensive microscopic equipment, new insights may be obtained by reanalyzing the vast amount of microscopic image data that had not been fully utilized in the past."

Credit: 
Kumamoto University

Giving the hydrogen economy an acid test

image: Image photo of graphene

Image: 
University of Tsukuba

Scientists at the University of Tsukuba show that using a layer of graphene just one atom thick improves the catalytic activity of nickel or copper when generating hydrogen gas, which may lead to cheaper fuel for zero-emission automobiles.

Tsukuba, Japan - A team of researchers led by the Institute of Applied Physics at the University of Tsukuba has demonstrated a method for producing acid-resistant catalysts by covering them with layers of graphene. They show that using only a few layers allows for greater proton penetration during the hydrogen evolution reaction, which is crucial for maximizing efficiency when producing H2 as fuel. This work may lead to industrial-scale manufacturing of hydrogen as a completely renewable energy source for vehicles that do not contribute to climate change.

The dream of hydrogen-powered cars has excited many people as a solution to the huge amount of carbon dioxide that fossil-fuel-burning vehicles emit into the atmosphere daily. However, the production of hydrogen gas has been slowed by the lack of cheap catalysts required to split water efficiently. In this process, hydrogen nuclei, called protons, need to combine to form hydrogen gas, H2. Nickel and nickel-based alloys are seen as promising cheap alternatives to platinum, but these metals corrode easily when exposed to the acidic conditions of the reaction. One solution is to use graphene, a single sheet of carbon atoms arranged in a honeycomb lattice, to protect the catalyst. However, the mechanism by which the reaction takes place remained poorly understood.

Now, an international research collaboration led by the University of Tsukuba has shown that using three to five layers of graphene can efficiently prevent corrosion while still partly allowing protons to combine at the catalyst through defects in the honeycomb structure. In addition, they found that the catalytic efficiency decreased linearly as more layers of graphene were added. "This result allowed us to conclude that protons must penetrate through the graphene layers in order to react at the surface of the metal," says Dr. Kailong Hu, senior author on the study. The alternative explanation, that electrons travel up from the metal so the protons can react at the outer surface of the graphene, was not supported by the experiments as a major reaction process. Future work will focus on optimizing the number of graphene layers to balance corrosion resistance with catalytic activity.

"Hydrogen fuel is particularly eco-friendly because it produces zero greenhouse gasses, and still has a larger energy density than gasoline," Professor Yoshikazu Ito explains. "So, we may soon be able to step on the accelerator without leaving a carbon footprint." The work is published in Nature Communications as "Catalytic Activity of Graphene-Covered Non-Noble Metals Governed by Proton Penetration in Electrochemical Hydrogen Evolution Reaction" (DOI: 10.1038/s41467-020-20503-7).

Credit: 
University of Tsukuba

Pillarene hybrid material shows enhanced tunable multicolor luminescence and sensing ability

image: Schematic illustration of the synthesis of PHM, the proposed fluorescence tuning mechanisms, and the tunable luminescent responses of PHM toward different external stimuli.

Image: 
©Science China Press

Organic luminescent materials have been highlighted as an exciting research topic owing to their prominent potential in light-emitting diodes, fluorescent sensors, optoelectronic devices, in vivo imaging, anti-counterfeiting, data storage, and information encryption. However, applications of tunable fluorescent materials in the solid state have been largely hampered because these luminescent systems generally require time-consuming organic synthesis and suffer from reduced photoluminescence (PL) owing to the notorious aggregation-caused quenching. Aggregation-induced emission (AIE) has proved a powerful antidote to this universal self-quenching problem in the field of organic luminescent materials. Inspired by the enhanced luminescence achieved by AIE, supramolecular approaches have been proposed and have facilitated the fabrication of a myriad of smart luminescent materials.

Pillararenes (pillarenes), an important class of macrocyclic hosts in supramolecular chemistry, are composed of hydroquinone units bridged by methylene groups at the para positions, possessing rigid pillar-like molecular structures and electron-rich hydrophobic cavities. Remarkably, supramolecular fluorescent systems involving fluorescent polymeric materials and supramolecular assemblies have been constructed using pillarenes as seminal building blocks owing to their versatile functionalization and unique host-guest properties. However, beyond merely providing host-guest binding sites, pillarenes might also serve directly as modulators for tuning the luminescent properties of chromophores.

In a research article published in the Beijing-based National Science Review, scientists at Jilin University in Changchun, China, report the successful fabrication of a pillar[5]arene-based multicolor hybrid material (PHM), which possesses enhanced emission in both the solid state and solution as well as stimuli-responsive luminescent properties (Scheme 1). A linear π-conjugated chromophore, oligo(phenylenevinylene) (OPV), was chosen as the fluorescent ligand, while the pillar[5]arene rings serve as the rigid macrocyclic skeleton. With the two ligands integrated via coordination with Cd(II), the resulting hybrid material shows desirably optimized fluorescence and well-tuned emission in response to external stimuli including solvents, particular ions, and acid, whereby three emission colors of PHM are obtained: greenish yellow (pristine), mazarine (solvent variation), and cyan (ions and acid).

The enhanced luminescence of PHM can be explained from two viewpoints, the molecular packing mode and the changes in the electronic distributions (studied by density functional theory (DFT) calculations), which verified that the pillar[5]arenes largely minimize the quenching caused by stacking of the OPV moieties and facilitate the radiative decay pathway. The color tuning in response to solvents can be attributed to altered intermolecular interactions among the OPV ligands. Additionally, the luminescent responses toward Fe3+ and acid are rationally ascribed to protonation of the pyridine nitrogen of the ligands, which intercepts the charge transfer (CT) processes among the ligands.

This study has paved a new way of adjusting and optimizing fluorescent features of planar dyes via integration with the rigid pillarene rings through coordination, providing a feasible approach for developing solid-state luminescent materials with tunable emission and desired stimuli-responsiveness.

Credit: 
Science China Press

Study the boundary between bulk, nano and molecule scale of gold plasmonic physics

As an elementary type of collective excitation, plasmons have been found to dominate the optical properties of metals. The collective behavior of electrons in plasmons reflects an important difference between condensed matter and molecule-like systems. It is therefore of great significance to study the evolution of the plasmonic response and find out where the boundary lies.

Controversy exists over such interesting questions as the division between nanoparticles and molecules, and the physics of the mesoscopic-to-microscopic plasmonic evolution. A unified understanding covering the small- and large-size limits, namely the macro, meso and micro scales, with atomic precision is thus required. Clusters, as the transition from atomic molecules to condensed matter, are the ideal candidates for studying the evolution of plasmons.

In a new overview published in the Beijing-based National Science Review, a joint team from Nanjing University, Southern University of Science and Technology, National University of Defense Technology, the Institute of Physics of the Chinese Academy of Sciences, and the National Center for Nanoscience and Technology presents research on the evolution of plasmons in mass-selected gold clusters. In this work, the scientists push the limit to the atomic scale and present a complete picture of size-dependent plasmon physics.

Gold clusters with precise atomic numbers ranging from 70,000 down to 100 were prepared in Fengqi Song's group using a time-of-flight mass-selected magnetron-sputtering gas-phase-condensation cluster beam source. The mass resolution M/ΔM was about 50. The scientists then successfully measured the plasmonic responses of a series of atomically precise individual gold particles with atom numbers (N) of 100-70,000 using carefully collected high-resolution electron energy loss spectroscopy (EELS), with the great help of Jiaqing He's group and the Pico Center at SUSTech.

The scientists found three characteristic regimes. In regime 3 (N~887-70,000), the surface plasmon (SP) peak position exhibits a very slight redshift with decreasing N, while the bulk plasmon (BP) peak remains unchanged and the full width at half maximum (FWHM) stays at a high value of about 0.36 eV. In regime 2 (N~300-887), the SP peak position exhibits a steady blueshift with decreasing N, while the BP peak disappears completely and the FWHM stays at a high value of about 0.26 eV. In regime 1 (N~100-300), the SP peak is replaced by three fine features with a much smaller FWHM, close to the FWHM of the EELS zero-loss peak.

Based on a basic physical model, the physics of the three regimes is explained as follows. For large clusters with bulk-like electronic structures, the redshift of the SP in regime 3 is explained by an electron-boundary-scattering-modified classical plasmon (Nanoscale. 2017; 9: 3188-95). The reduction in size only introduces extra boundary scattering for the free electrons in the metal, in addition to the Coulomb scattering between electrons. Classical model calculations performed by Jianing Chen's group show good agreement with the experimental results. With a continued decrease in cluster size, the monotonic blueshift of the SP in regime 2 is caused by well-studied quantum confinement effects (Nature. 2012; 483: 421-7; Nat Phys. 2019; 15: 275-80). The classical Drude model for the dielectric function becomes invalid and gives way to a quantum description, in which the total permittivity ε is the sum of the permittivity of free-electron transitions in the quantized conduction band and the frequency-dependent permittivity εinter of interband transitions between the d bands and the higher conduction bands. When the bulk-like electronic structure is finally destroyed at even smaller atom numbers (Nat Commun. 2016; 7: 13240; J Am Chem Soc. 2018; 140: 5691-5), superimposed transitions between quantized molecule-like electronic structures occur, and the traditional plasmonic peak degenerates into fine structures (molecular plasmon, regime 1). With the help of Jiayu Dai's group, rt-TDDFT calculations show that after strong laser excitation, a collective charge-density oscillation can be found at the core of the cluster, which is quite different from the case of a weak laser field; the collective behavior of the electrons is a superposition of single-electron transitions between quantized molecular energy levels (Nat Commun. 2015; 6: 10107). Thus three regimes with distinct physics are observed in the plasmonic evolution, namely the classical plasmon (N=887-70,000), the quantum-confinement-corrected plasmon (N=300-887) and the molecule-related plasmon (N=100-300). This work paves the way for new developments in physics and for future applications of nanoplasmonics, as the authors note: "Au887 is very small, indicating hardly any quantum effect for most gold nanoparticles since they are larger than Au887, and most nanoplasmonic designs in industrial nanofabrication can currently be tackled based on classical electromagnetism and the dielectric function."
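For orientation, the boundary-scattering correction invoked above for regime 3 is commonly written in a standard textbook form, shown here in LaTeX notation as a generic formula that may differ in detail from the model actually used in the study:

\varepsilon(\omega) = \varepsilon_{\mathrm{inter}}(\omega) + 1 - \frac{\omega_p^2}{\omega^2 + i\,\gamma(R)\,\omega}, \qquad \gamma(R) = \gamma_{\mathrm{bulk}} + A\,\frac{v_F}{R}

Here \omega_p is the free-electron plasma frequency, \gamma_{\mathrm{bulk}} the bulk damping rate, v_F the Fermi velocity, R the cluster radius and A a constant of order unity. The extra A v_F/R term represents the additional scattering of free electrons off the cluster boundary; it broadens the surface plasmon and slightly shifts its peak as the particle shrinks, while leaving the classical Drude picture otherwise intact.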

Credit: 
Science China Press

Research reveals new insight into why breastfed babies have improved immune systems

image: Generic image of a baby breastfeeding

Image: 
University of Birmingham

Research led by the University of Birmingham and Birmingham Women's and Children's NHS Foundation Trust has revealed new insight into the biological mechanisms of the long-term positive health effects of breastfeeding in preventing disorders of the immune system in later life.

Breastfeeding is known to be associated with better health outcomes in infancy and throughout adulthood, and previous research has shown that babies receiving breastmilk are less likely to develop asthma, obesity, and autoimmune diseases later in life compared to those who are exclusively formula fed.

However, up until now, the immunological mechanisms responsible for these effects have been very poorly understood. In this new study, researchers have for the first time discovered that a specific type of immune cells - called regulatory T cells - expand in the first three weeks of life in breastfed human babies and are nearly twice as abundant as in formula fed babies. These cells also control the baby's immune response against maternal cells transferred with breastmilk and help reduce inflammation.

Moreover, the research - supported by the National Institute for Health Research's Surgical Reconstruction and Microbiology Research Centre (NIHR SRMRC) - showed that specific bacteria, called Veillonella and Gemella, which support the function of regulatory T cells, are more abundant in the gut of breastfed babies.

The results of the study, published in Allergy, emphasise the importance of breastfeeding, say the researchers.

Senior author Gergely Toldi, researcher at the University of Birmingham and consultant neonatologist at Birmingham Women's and Children's NHS Foundation Trust, said: "The influence of the type of milk received on the development of the immune response has not previously been studied in the first few weeks of life.

"Prior to our research the outstanding importance and the early involvement of this specific cell type in breastfed babies was unknown.

"We hope this invaluable new insight will lead to an increase in rates of breastfeeding and will see more babies benefit from the advantages of receiving breastmilk.

"Furthermore, we hope for those babies who are formula fed, these results will contribute to optimising the composition of formula milk in order to exploit these immunological mechanisms.

"We are very grateful for the mums and babies who contributed to this special project."

The study is the culmination of a unique three-year research project analysing data from 38 healthy mothers and their healthy babies. Small amounts of blood and stool samples were collected at birth at Birmingham Women's Hospital and then again later during home visits when the babies were three weeks old. Sixteen out of the 38 babies (42%) were exclusively breastfed for the duration of the study, while nine babies received mixed feeding, and 13 babies were exclusively formula-fed.

The researchers hope to now further study this biological mechanism in sick and pre-term newborn babies who have developed inflammatory complications.

The research was carried out by a team working across the University of Birmingham's Institutes of Immunology and Immunotherapy; Cancer and Genomic Studies; Microbiology and Infection; and Metabolism and Systems Research, as well as the Department of Neonatology at Birmingham Women's and Children's NHS Foundation Trust, and NIHR SRMRC based at University Hospitals Birmingham NHS Foundation Trust.

Credit: 
University of Birmingham