Tech

Coronavirus makes changes that cause cells not to recognize it

Image: Yogesh K. Gupta, PhD, and colleagues at UT Health San Antonio discovered the mechanism by which the novel coronavirus is able to enter cells without encountering immune system resistance. (Image: UT Health San Antonio)

With an alarm code, we can enter a building without bells going off. It turns out that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has the same advantage entering cells: it possesses the code to waltz right in.

On July 24 in Nature Communications, researchers at The University of Texas Health Science Center at San Antonio (UT Health San Antonio) reported how the coronavirus achieves this.

The scientists resolved the structure of an enzyme called nsp16, which the virus produces and then uses to modify its messenger RNA cap, said Yogesh Gupta, PhD, the study lead author from the Joe R. and Teresa Lozano Long School of Medicine at UT Health San Antonio.

"It's a camouflage," Dr. Gupta said. "Because of the modifications, which fool the cell, the resulting viral messenger RNA is now considered as part of the cell's own code and not foreign."

Deciphering the 3D structure of nsp16 paves the way for rational design of antiviral drugs for COVID-19 and other emerging coronavirus infections, Dr. Gupta said. The drugs, new small molecules, would inhibit nsp16 from making the modifications. The immune system would then pounce on the invading virus, recognizing it as foreign.

"Yogesh's work discovered the 3D structure of a key enzyme of the COVID-19 virus required for its replication and found a pocket in it that can be targeted to inhibit that enzyme. This is a fundamental advance in our understanding of the virus," said study coauthor Robert Hromas, MD, professor and dean of the Long School of Medicine.

Dr. Gupta is an assistant professor in the Department of Biochemistry and Structural Biology at UT Health San Antonio and is a member of the university's Greehey Children's Cancer Research Institute.

In lay terms, messenger RNA can be described as a deliverer of genetic code to worksites that produce proteins.

Credit: 
University of Texas Health Science Center at San Antonio

Discovery of a rare human gene mutation that causes MAIT cells to disappear

Image: A man gazes at the complex protein structures of his own immune system, and the rare genetic mutations that caused his illness, leading scientists to a dramatic discovery. (Image: © Erica Tandori)

A collaboration between Monash Health, the Australian Genomics Health Alliance (AGHA) and researchers at the Monash Biomedicine Discovery Institute has led to the discovery of a rare single gene mutation in a patient that eliminates an immune cell population, namely MAIT cells.

This study's journey began with a patient of Dr Samar Ojaimi from Monash Health who presented with a mild primary immunodeficiency with no known cause. He was identified as a candidate for the Genetic Immunology Flagship of AGHA led by Professor Matthew Cook from the Centre for Personalised Immunology at the Australian National University, which focuses on identifying genetic causes for immunological diseases.

The genome sequencing by the AGHA team identified a rare mutation in the gene encoding a protein called MR1, which normally helps initiate an inflammatory response from an immune cell population called mucosal-associated invariant T (MAIT) cells. However, upon further investigation, a research team from Monash BDI led by Dr Lauren Howson uncovered a complete loss of this immune cell population, while the rest of the immune system remained intact.

"We studied the patient's MR1 protein structure and found that the mutation prevented MR1 from being able to bind the vitamin metabolite it normally presents in order to activate MAIT cells. This led us to look at the patient's immune system to see what effect the mutation had on the MAIT cell population and we were surprised to find it completely gone," Dr Howson said.

Published in Science Immunology, the study demonstrates the power of interdisciplinary collaboration to uncover the impact of a single gene mutation and aid in diagnosis of rare immune disorders.

It not only advances the MR1 and MAIT cell biology research fields, but also demonstrates the substantial impact that discovery-based research can have when combined with clinical and genetic research, creating an avenue for advanced personalised medicine for rare genetic and immune disorders.

"This occurrence of a single immune cell population loss in a person gives us invaluable insight into the important role that this cell type plays in human immune responses," Dr Howson said.

Professor Cook said: "Human genomics is a powerful method for advancing our understanding of the complexity of immunity.

"Genome sequencing has emerged as a crucial tool for both diagnosis and discovery of immune-mediated disease."

Credit: 
Monash University

Solving the jigsaw puzzle of regional carbon budgets

Image: Carbon storage change from inventories (ΔC, in red) and lateral fluxes from trade and riverine carbon export to the ocean for different regions of the globe (in blue) for the 2000s. The resulting bottom-up net land-atmosphere C flux, the sum of ΔC and the lateral fluxes, is given in dark green. An atmospheric convention is used, so a negative ΔC denotes an increase in land carbon stocks, and a negative net flux is also a net uptake of atmospheric CO2. The upper stacked bars on the left show the sub-components of the net land-atmosphere carbon balance of each region and the resulting imbalance between net primary production and soil heterotrophic respiration, a negative value indicating that soil heterotrophic respiration is smaller than net primary production. (Image: ©Science China Press)

Accurate regional carbon budgets are critical for informing land-based mitigation policy and for tracking mitigation progress. For nine regions covering the globe, inventory estimates of carbon stock changes were collected and complemented by satellite estimates of biomass changes where inventory data were missing. The net land-atmosphere carbon exchange was then calculated as the sum of the carbon stock changes measured by inventories and the lateral carbon fluxes from crop and wood trade and from riverine carbon export to the ocean.

Summing the estimates from all regions yields the first global 'bottom-up' terrestrial carbon budget of anthropogenic CO2 uptake: a net sink of -2.2 ± 0.6 Pg C yr-1 (one Pg C being one billion tons of carbon) during the period from 2000 to 2009, consistent with the independent top-down budget derived in previous IPCC assessment reports from observations of the CO2 growth rate and fossil fuel emissions. This new estimate sets an important milestone for global carbon cycle studies.

By decomposing the net land-atmosphere carbon flux of each region into incoming and outgoing fluxes, the authors showed that a significant part of the carbon fixed in terrestrial ecosystems by plant productivity is removed by harvest and export to rivers, or lost through fires and through emissions of reduced biogenic carbon species such as biogenic volatile organic compounds and methane. In other words, the carbon delivered to the soil as litter and available as a substrate for microbial decomposition is a smaller fraction of productivity than previously estimated by carbon models that ignore these lateral export processes.

The implication is that global soil heterotrophic respiration, the amount of CO2 released annually by soil microbial processes, is only 39 Pg C yr-1, compared with net primary productivity inputs of 50 Pg C yr-1. In the study, Ciais and co-workers examined the consequences of this smaller heterotrophic respiration for the residence time of carbon and for future projections of the carbon cycle and temperature. They found a positive feedback of 15 parts per million in the future CO2 concentration, that is, an additional warming of 0.1 °C.
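The bookkeeping behind these figures can be sketched with the round numbers quoted above. This is only an illustration of the arithmetic, not the paper's actual calculation, which aggregates nine regional inventory-based budgets; the split of the non-respired carbon among harvest, rivers, fires, and reduced-carbon emissions is not resolved here.

```python
# Illustrative arithmetic with the round numbers quoted in this article.

npp = 50.0        # global net primary productivity, Pg C per year
soil_resp = 39.0  # global soil heterotrophic respiration, Pg C per year
net_sink = 2.2    # bottom-up net land sink, Pg C per year
                  # (quoted as -2.2 under the atmospheric sign convention)

# Carbon fixed by plants that never returns to the atmosphere via soil
# microbial respiration: part accumulates as the net land sink, and the
# rest is removed laterally (harvest, trade, riverine export) or lost to
# fires and reduced biogenic carbon emissions.
gap = npp - soil_resp
non_respired_losses = gap - net_sink

print(f"NPP minus heterotrophic respiration: {gap:.1f} Pg C/yr")
print(f"of which lateral and fire losses:    {non_respired_losses:.1f} Pg C/yr")
```

Under this reading, roughly a fifth of productivity bypasses soil decomposition, which is why models that route all litter to the soil overestimate heterotrophic respiration.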

Credit: 
Science China Press

Pizza study shows body copes surprisingly well with one-off calorie indulgence

A new study, which involved participants eating pizza well after feeling 'full' in order to test what immediate effects this had on the body, finds that our metabolism is surprisingly good at coping with over-indulgence.

Researchers with the Centre for Nutrition, Exercise and Metabolism at the University of Bath compared the effects of normal eating (i.e. 'eat until you are comfortably full') with maximal eating (i.e. 'eat until you cannot manage another bite').

They found that the young, healthy men (aged 22-37) who volunteered for the trial consumed almost twice as much pizza when pushing beyond their usual limits, doubling their calorie intake, yet, remarkably, managed to keep the amount of nutrients in the bloodstream within the normal range.

This, say the researchers, shows that if an otherwise healthy person overindulges occasionally there are no immediate, negative consequences in terms of losing metabolic control. However, they caution about the risks of prolonged over-eating.

Lead researcher Aaron Hengist explained: "We all know the long-term risks of over-indulgence with food when it comes to obesity, type II diabetes and cardiovascular disease, but we know much less about some of the immediate effects 'all you can eat' places on the body. Our findings show that the body actually copes remarkably well when faced with a massive and sudden calorie excess. Healthy humans can eat twice as much as 'full' and deal effectively with this huge initial energy surplus."

In the study, the average calorie intake in the all-you-can-eat trial was over 3000 kcal, roughly one and a half large pizzas. However, this varied a lot, with some individuals able to consume up to two and a half large pizzas in one go.

This is well beyond standard adult guidelines for calorie intake in one day (let alone one meal) - and is even more calories than US Olympic swimmer Michael Phelps famously reported eating for breakfast.

Results show that after eating maximally:

- Blood sugar (glucose) levels were no higher than after a normal meal.

- The amount of insulin in the blood was 50% higher than normal (this hormone is released by the body to control blood sugar levels).

- Blood lipids (triglycerides and non-esterified fatty acids) were only slightly higher, despite participants having consumed over twice as much fat. This is interesting because previous research had shown that blood lipids increase proportionally when low-to-moderate amounts of fat are consumed.

- Hormones released by the gut to stimulate insulin secretion and increase feelings of fullness (e.g. GLP-1 and peptide YY) were changed the most by overeating.

The study also looked at appetite and mood throughout the trials:

- Four hours after eating maximally, the participants felt sleepy/lethargic and reported no desire to eat anything else, including sweet foods. This was surprising because reward centres in the brain are usually food specific, so eating pizza might not be expected to change the desire for sweet food - which may be why you always have room for dessert.

Professor James Betts, who oversaw the work, added: "We know that people often eat beyond their needs, which is why so many of us struggle to manage our body weight. It is therefore surprising that no previous research had measured the maximal capacity for eating at a single meal in order to understand how the human body responds to that challenge.

"This study reveals that humans are capable of eating twice as much food as is needed to make us feel 'full', but that our bodies are well adapted to an excessive delivery of dietary nutrients at one huge meal. Specifically, those tested in this study were able to efficiently use or store the nutrients they ingested during the pizza-eating challenge, such that the levels of sugar and fats in their blood were not much higher than when they ate half as much food.

"The main problem with overeating is that it adds more stored energy to our bodies (in the form of fat), which can culminate in obesity if you overeat day after day. However, this study shows that if an otherwise healthy person overindulges occasionally, for example eating a large buffet meal or Christmas lunch, then there are no immediate negative consequences in terms of losing metabolic control."

The researchers acknowledge that their study involved healthy young men, so they plan to investigate whether similar effects are apparent in women, and for overweight and older populations.

Credit: 
University of Bath

Sci-fi foretold social media, Uber and augmented reality, and offers insights into the future

Science fiction authors foresaw augmented reality video games, the rise of social media and trends of hyper-consumption, and can help predict future consumer patterns.

New research in the Journal of Consumer Culture, by Lancaster University Management School's Dr Mike Ryder, highlights many parallels between the futures created by sci-fi pioneer Frederik Pohl in the 1950s and the modern world.

These include patterns of hyper-consumption, ecological disaster, and products where producer and consumer are blurred - the 'prosumer' - so that consumers also play a part in a product's creation, as with social media and augmented reality games.

Dr Ryder's research highlights the worlds imagined in Pohl's works, where advertising firms are in charge, exploiting customers for profit and priding themselves on their ability to shape human desire, where social status and consumption are intrinsically entwined, and where characters become hyper-consumers, threatening the stability of the local area. An overarching theme is a concern that the boundaries between humans and machine are blurring.

"Pohl's work highlights the ability of science fiction to provide a better understanding of possible futures and the lasting impacts of modern and emerging technologies, allowing people to see what the world may become in a way easily understood by a mass audience," says Dr Ryder.

"Science fiction is an important tool for testing 'what-if' scenarios, speculating on what the future might bring. Frederik Pohl's worlds of hyper-consumption, robot workers and ecological disaster are even more relevant today than they were in the 1950s.

"Pohl envisages worlds with robot prosumers, hyper-consumption and ecological disaster, all of which are particularly relevant today given the social and political climate and the rise of 'surveillance capitalism'.

"Science fiction is a vital resource for the imagining of possible futures and the lasting impacts of consumption, allowing us to see what the world may become. Social theorists and policy-makers need to take science fiction far more seriously to help prepare for the world of tomorrow. The challenges that arise from new technologies should be considered before they happen."

Pohl's prediction that consumption might one day become a social institution has been borne out with the rise of social media, creating and sustaining a never-ending cycle of robotic prosumption, where users gain social status by sharing their latest purchases alongside holidays and other symbols of 'success'.

"Social media users are perhaps the best example of modern-day robotic prosumption, mindlessly producing and consuming content, while social media firms sell their data and target them with ads that feed back into the cycle. Users struggle to break free because of a 'fear of missing out'," added Dr Ryder.

In The Space Merchants (1952, with CM Kornbluth), protagonist Mitchell Courtenay is forced to spend his low wages on goods to help make his work bearable, which creates an ongoing cycle of debt. His behaviour becomes more like that of a machine, a producing-consuming robotic prosumer, unaware he is trapped in a cycle.

This pre-empted the real-world criticism of Vance Packard, who depicted a dystopia where marketers use psychological techniques to influence behaviour to the point consumers do not realise they are being influenced. The similarities of fiction and fact were such that the two worlds blurred together.

"Pohl's works blur the boundaries between human and machine, and question the roles of both in production and consumption," said Dr Ryder. "Humans become more and more like machines, such that consumption itself becomes a mechanical process, creating a dystopian world where the only freedom is the freedom not to consume, one limited to the very rich."

Pohl's work highlights the ever-increasing robotisation of consumers, with ad agencies and marketeers effectively programming and manipulating their behaviour, and predicts workers who produce the very content they consume, much as at many modern companies.

Airbnb acts as a broker for users who are both producers and consumers of goods, paying to rent rooms while making money by renting out their own; Uber lets drivers and passengers rate each other, making the consumers a sort of product; and augmented reality video games, such as Ingress and Pokémon GO, see players become part of the product, appearing in each other's games and competing for the same objectives. Players are not paid as employees, but rather provide a form of free labour and conform to expected patterns of behaviour.

Dr Ryder said: "This raises the question of whether players are dehumanised while using these apps as even the act of consumption becomes a robotic act and their behaviour while playing is standardised.

"The choice offered by standardised models of consumption is really no choice at all - it is prescription packaged as choice."

Credit: 
Lancaster University

Proposed framework for integrating chatbots into health care

While the technology for developing artificial intelligence-powered chatbots has existed for some time, a new viewpoint piece in JAMA lays out the clinical, ethical, and legal aspects that must be considered before applying them in healthcare. And while the emergence of COVID-19 and the social distancing that accompanies it has prompted more health systems to explore and apply automated chatbots, the authors still urge caution and thoughtfulness before proceeding.

"We need to recognize that this is relatively new technology and even for the older systems that were in place, the data are limited," said the viewpoint's lead author, John D. McGreevey III, MD, an associate professor of Medicine in the Perelman School of Medicine at the University of Pennsylvania. "Any efforts also need to realize that much of the data we have comes from research, not widespread clinical implementation. Knowing that, evaluation of these systems must be robust when they enter the clinical space, and those operating them should be nimble enough to adapt quickly to feedback."

McGreevey, joined by C. William Hanson III, MD, chief medical information officer at Penn Medicine, and Ross Koppel, PhD, FACMI, a senior fellow at the Leonard Davis Institute of Healthcare Economics at Penn and professor of Medical Informatics, wrote "Clinical, Legal, and Ethical Aspects of AI-Assisted Conversational Agents." In it, the authors lay out 12 different focus areas that should be considered when planning to implement a chatbot, or, more formally, "conversational agent," in clinical care.

Chatbots are a tool used to communicate with patients via text message or voice. Many chatbots are powered by artificial intelligence (AI). This paper specifically discusses chatbots that use natural language processing, an AI process that seeks to "understand" language used in conversations and draws threads and connections from them to provide meaningful and useful answers.

In health care, those messages, and people's reactions to them, are extremely important and carry tangible consequences.

"We are increasingly in direct communication with our patients through electronic medical records, giving them direct access to their test results, diagnoses and doctors' notes," Hanson said. "Chatbots have the ability to enhance the value of those communications on the one hand, or cause confusion or even harm, on the other."

For instance, how a chatbot handles someone telling it something as serious as "I want to hurt myself" has many different implications.

In the self-harm example, several of the focus areas laid out by the authors apply. It touches first and foremost on the "Patient Safety" category: Who monitors the chatbot, and how often? It also touches on "Trust and Transparency": Would this patient actually take a response from a known chatbot seriously? It also, unfortunately, raises questions in the paper's "Legal & Licensing" category: Who is accountable if the chatbot fails in its task? And a question under the "Scope" category may apply here, too: Is this a task best suited for a chatbot, or is it something that should still be entirely human-operated?

Within their viewpoint, the team believes they have laid out key considerations that can inform a framework for decision-making when it comes to implementing chatbots in health care. Their considerations should apply even when rapid implementation is required to respond to events like the spread of COVID-19.

"To what extent should chatbots be extending the capabilities of clinicians, which we'd call augmented intelligence, or replacing them through totally artificial intelligence?" Koppel said. "Likewise, we need to determine the limits of chatbot authority in different clinical scenarios. For example, when a patient indicates that they have a cough, should the chatbot respond only by letting a nurse know, or should it dig in further: 'Can you tell me more about your cough?'"

Chatbots have the opportunity to significantly improve health outcomes and lower health systems' operating costs, but evaluation and research will be key to that: both to ensure smooth operation and to keep the trust of both patients and health care workers.

"It's our belief that the work is not done when the conversational agent is deployed," McGreevey said. "These are going to be increasingly impactful technologies that deserve to be monitored not just before they are launched, but continuously throughout the life cycle of their work with patients."

Credit: 
University of Pennsylvania School of Medicine

NASA water vapor data reveals Tropical Storm Gonzalo's soaking capability

Image: On July 24 at 1:35 a.m. EDT (0535 UTC), NASA's Aqua satellite passed over Tropical Storm Gonzalo in the central Atlantic Ocean. Aqua found the highest concentrations of water vapor (brown) and the coldest cloud top temperatures around the center. (Image: NASA/NRL)

When NASA's Aqua satellite passed over the North Atlantic Ocean, it gathered water vapor data on Tropical Storm Gonzalo as tropical storm warnings, a tropical storm watch, and a hurricane watch were posted.

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid. That liquid becomes clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be. The higher the cloud tops, the colder and stronger the storms.

NASA's Aqua satellite passed over Tropical Storm Gonzalo on July 24 at 1:35 a.m. EDT (0535 UTC) and the Moderate Resolution Imaging Spectroradiometer or MODIS instrument gathered water vapor content and temperature information. The MODIS image showed highest concentrations of water vapor and coldest cloud top temperatures were around the center of circulation.

MODIS data also showed coldest cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in those storms. Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.
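As a quick sanity check on the quoted threshold, the standard Fahrenheit-to-Celsius conversion gives about -56.7 °C for -70 °F (the release rounds this to -56.6 °C). A minimal sketch:

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

# The MODIS cloud-top threshold quoted above.
print(round(fahrenheit_to_celsius(-70.0), 1))  # -56.7
```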

The National Hurricane Center (NHC) noted that Gonzalo is expected to produce total rain accumulations of 2 to 5 inches, with isolated maximum amounts of 7 inches in Barbados and the Windward Islands tonight (July 24) through Sunday night (July 26). Gonzalo is also expected to produce total rain accumulations of 1 to 2 inches in Trinidad and Tobago. Rainfall in Barbados and the Windward Islands could lead to life-threatening flash floods.

On July 24, a Tropical Storm Warning is in effect for St. Lucia, Barbados, St. Vincent and the Grenadines. A Hurricane Watch is in effect for Barbados, St. Vincent and the Grenadines and a Tropical Storm Watch is in effect for Tobago, Grenada and its dependencies.

At 8 a.m. EDT (1200 UTC), NHC said satellite data indicated that the center of Tropical Storm Gonzalo was located near latitude 10.0 degrees north and longitude 52.8 degrees west. Gonzalo is moving toward the west near 15 mph (24 kph). A westward to west-northwestward motion with an increase in forward speed is expected through the weekend. The estimated minimum central pressure is 1000 millibars. Maximum sustained winds are near 60 mph (95 kph) with higher gusts. Gonzalo is a small tropical cyclone; tropical-storm-force winds extend outward up to 25 miles (35 km) from the center.

Some strengthening is forecast during the next day or two, and there is still a chance that Gonzalo could become a hurricane before reaching the Windward Islands. On the forecast track, the center of Gonzalo will approach the southern Windward Islands tonight and then move across the islands on Saturday (July 25) and over the eastern Caribbean Sea on Sunday (July 26).

NASA's Aqua satellite is one in a fleet of NASA satellites that provide data for hurricane research.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For updated forecasts, visit: http://www.nhc.noaa.gov.

Credit: 
NASA/Goddard Space Flight Center

Sputum testing provides higher rate of COVID-19 detection

Boston, MA -- Early and accurate detection is critical for preventing the spread of COVID-19 and providing appropriate care for patients. Nasopharyngeal (NP) swabs, which require inserting a long shaft into the nasal cavity to collect a sample from the back of the nose and throat, are currently the gold standard for collecting a specimen for diagnosis. But the procedure is technically challenging, often uncomfortable for patients, and requires personal protective equipment that may be in short supply. Other approaches to collecting specimens -- including oropharyngeal swabs and sputum -- have been tested in small studies, but there is uncertainty about which method is best for detecting the virus.

In a new study published in EBioMedicine, investigators from Brigham and Women's Hospital conducted a systematic review and meta-analysis, analyzing data from more than 3,000 specimens to compare the three approaches. The team found that sputum testing detected the RNA of the virus that causes COVID-19 at a significantly higher rate, while oropharyngeal swab testing had the lowest rate of the three. Regardless of the collection method, the earlier samples were collected after symptoms began, the higher the detection rate.

"The accurate diagnosis of COVID-19 has implications for health care, return-to-work, infection control and public health," said corresponding author Jonathan Li, MD, a faculty member in the Division of Infectious Diseases at the Brigham. "Our gold standard in and out of the hospital is the nasopharyngeal swab, but there's a lot of confusion about which sampling modality is best and most sensitive. Our study shows that sputum testing resulted in significantly higher rates of SARS-CoV-2 detection and supports the use of this type of testing as a valuable method for the diagnosis and monitoring of COVID-19 patients."

Li and his colleagues scoured the literature -- both preprints and published papers -- for studies that assessed at least two respiratory sampling sites using an NP swab, oropharyngeal swab or sputum. From more than 1,000 studies, they identified 11 that met their criteria. These studies included results from a total of 3,442 respiratory tract specimens.

The team examined how often each collection method produced a positive result. For NP swabs, the rate was 54 percent; for oropharyngeal swabs, 43 percent; for sputum, 71 percent. The rate of viral detection was significantly higher in sputum than either oropharyngeal swabs or NP swabs. Detection rates were highest within one week of symptom onset for all three tests.
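The pooled percentages above are detection rates, i.e. positives divided by specimens tested per sampling method. The counts below are hypothetical, chosen only to reproduce the quoted percentages; the meta-analysis pooled 3,442 real specimens with different denominators per method.

```python
# Hypothetical (positive, total) specimen counts per sampling method,
# chosen to match the percentages quoted in the article.
results = {
    "nasopharyngeal swab": (540, 1000),
    "oropharyngeal swab": (430, 1000),
    "sputum": (710, 1000),
}

for method, (positives, total) in results.items():
    rate = 100.0 * positives / total
    print(f"{method}: {rate:.0f}% of specimens positive")
```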

"When it comes to testing, the earlier the better, as diagnostic accuracy is improved earlier after symptom onset, regardless of the sampling site," said Li. "Unlike antibody testing, it's very rare to have a false positive qPCR test when diagnosing COVID-19 early in the course of the disease using these methods."

Nasopharyngeal swabs are collected through the nasal cavity; oropharyngeal swabs are collected by inserting a shaft through the mouth; and sputum samples are generally collected by having a patient cough deeply to produce and expel phlegm. Not all patients are able to produce a sputum sample; for such patients, a nasopharyngeal swab may be the best collection method. The meta-analysis included only studies conducted on hospitalized individuals -- additional study will be needed of patients who are asymptomatic or have mild symptoms. The current study did not assess alternative testing methods, such as saliva or anterior nasal swabs (taken from the front of the nose). Li and his colleagues at the Brigham are currently working on a project, funded by the Massachusetts Consortium on Pathogen Readiness, to collect and process multiple kinds of samples from patients with COVID-19 to create a resource for researchers.

"The holy grail will be to find a test that is readily acceptable by patients, easy to collect, and highly sensitive," said Li.

This study was funded in part by the National Institutes of Health (U01AI106701) and the Harvard University Center for AIDS Research (NIAID 5P30AI060354). Li reports personal fees from Abbvie and from Jan Biotech, outside the submitted work. A co-author reports personal fees from the Harvard TH Chan School of Public Health, during the conduct of the study, as well as grants from NIH/NIAID, outside the submitted work.

Credit: 
Brigham and Women's Hospital

Discovery of disordered nanolayers in intermetallic alloys

Image: (A) Atom maps reconstructed using 3D-APT show the distribution of each element. Iron (Fe), cobalt (Co), and boron (B) are enriched (darker in colour) at the nanolayer, whereas nickel (Ni), aluminum (Al), and titanium (Ti) are correspondingly depleted (lighter in colour). (B) and (C) show the same results. (Image source: DOI 10.1126/science.abb6830)

Intermetallic alloys potentially have high strength in a high-temperature environment. But they generally suffer poor ductility at ambient and low temperatures, hence limiting their applications in aerospace and other engineering fields. Yet, a research team led by scientists of City University of Hong Kong (CityU) has recently discovered the disordered nanoscale layers at grain boundaries in the ordered intermetallic alloys. The nanolayers can not only resolve the irreconcilable conflict between strength and ductility effectively, but also maintain the alloy's strength with an excellent thermal stability at high temperatures. Designing similar nanolayers may open a pathway for the design of new structural materials with optimal alloy properties.

This research was led by Professor Liu Chain-tsuan, CityU's University Distinguished Professor and Senior Fellow of the Hong Kong Institute for Advanced Study (HKIAS). The findings were just published in the prestigious scientific journal Science, titled "Ultrahigh-strength and ductile superlattice alloys with nanoscale disordered interfaces".

Just like metals, the inner structure of intermetallic alloys is made of individual crystalline areas known as "grains". The usual brittleness of intermetallic alloys is generally ascribed to cracking along their grain boundaries during tensile deformation. Adding the element boron to intermetallic alloys has been one of the traditional approaches to overcoming this brittleness. Professor Liu was in fact one of those who studied this approach 30 years ago. At that time, he found that adding boron to binary intermetallic alloys (consisting of two elements, such as Ni3Al) enhances grain-boundary cohesion, thereby improving their overall ductility.

A surprising experimental result

In recent years, Professor Liu has made major advances in developing bulk intermetallic alloys (also called superlattice alloys, built from a long-range, atomically close-packed ordered structure). Their good strength makes these materials highly attractive for high-temperature structural applications, but they generally suffer from serious brittleness at ambient temperatures, as well as rapid grain coarsening (growth in grain size) and softening at high temperatures. This time, Professor Liu and his team developed a novel "interfacial nanoscale disordering" strategy for multi-element intermetallic alloys that delivers high strength and large ductility at room temperature, along with excellent thermal stability at elevated temperatures.

"What we originally tried to do was enhance the grain-boundary cohesion by optimizing the amount of boron," said Dr Yang Tao, a postdoctoral research fellow at CityU's Department of Mechanical Engineering (MNE) and HKIAS, and one of the co-first authors of the paper. "We expected that, as we increased the amount of boron, the alloy would retain ultrahigh strength owing to its multi-element constituents."

According to conventional wisdom, adding trace amounts of boron (0.1 to 0.5 atomic percent (at.%)) substantially improves the tensile ductility of such alloys by increasing grain-boundary cohesion, while adding excessive amounts makes the traditional approach fail. "But when we added excessive amounts of boron to the present multicomponent intermetallic alloys, we obtained completely different results. At one point I wondered whether something had gone wrong during the experiments," Dr Yang recalled.

To the team's surprise, when boron was increased to as high as 1.5 to 2.5 at.%, the boron-doped alloys became both very strong and very ductile. Experimental results revealed that the intermetallic alloys with 2.5 at.% boron have an ultrahigh yield strength of 1.6 gigapascals with a tensile ductility of 25% at ambient temperatures.

Using different transmission electron microscopy techniques, the team discovered that when the boron concentration ranged from 1.5 to 2.5 at.%, a distinctive nanolayer formed between adjacent ordered grains. Each grain was encapsulated within this ultrathin nanolayer, about 5 nm thick, which itself has a disordered atomic structure. "This special phenomenon had never been discovered and reported before," said Professor Liu.

Their tensile tests showed that the nanolayer serves as a buffer zone between adjacent grains, which enables plastic-deformation at grain boundaries, resulting in the large tensile ductility at an ultrahigh yield strength level.

Why is the disordered nanolayer formed?

The team found that the further increase in boron substantially enhanced "multi-element co-segregation", the partitioning of multiple elements along the grain boundaries. Using the advanced three-dimensional atom probe tomography (3D-APT) facility at CityU, the only one of its kind in Hong Kong and southern China, they observed a high concentration of boron, iron and cobalt atoms within the nanolayers, whereas nickel, aluminium and titanium were largely depleted there. This unique elemental partitioning induced the nanoscale disordering within the nanolayer, which effectively suppresses fracture along the grain boundaries and enhances ductility.

Moreover, when evaluating the thermal response of the alloy, the team found that the increase in grain size was negligible even after 120 hours of annealing at a high temperature of 1050°C. This surprised the team again, because most structural materials show rapid grain growth at high temperature, resulting in a rapid decrease in strength.

A new pathway for developing structural materials for high-temperature uses

The team believes the nanolayer is pivotal in suppressing grain growth and maintaining strength at high temperature, and that the thermal stability of the disordered nanolayer will render this type of alloy suitable for high-temperature structural applications.

"The discovery of this disordered nanolayer in the alloy will have a significant impact on the development of high-strength materials in the future. In particular, this approach can be applied to structural materials for high-temperature settings such as aerospace, automotive, nuclear power, and chemical engineering," said Professor Liu.

Credit: 
City University of Hong Kong

Photochromic bismuth complexes show great promise for optical memory elements

image: Working routine in Center for Energy Science and Technology.

Image: 
Timur Sabirov / Skoltech

Russian chemists obtained a new photochromic complex composed of bismuth(III) and viologen cations and used the new compound to create optical memory elements that proved highly efficient and stable. The outcomes of the study may serve to expand the range of microelectronics components in the future. The research was published in the journal Chemical Communications.

Modern memory devices, such as memory cards and SSD drives, are based on electrical switches known as transistors, which can form two quasi-stable electrical states thanks to additional components capable of accumulating and storing electrical charge. The value of this charge enables or disables electric current through the transistor at a certain read voltage. In memory elements, the high-current (“open”) and low-current (“closed”) states correspond to logic 1 and logic 0, respectively, or vice versa. To write or erase one bit of information, the transistor must switch from one state to the other. In photochromic materials, i.e. materials that change color when exposed to light, the switching requires a pulse of light and, quite often, an applied electric field as well.
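The read operation described above boils down to comparing the device current against a threshold. A minimal sketch in Python; the read voltage and the current threshold separating the two states are hypothetical placeholder values for illustration, not figures from the study:

```python
# Hypothetical device parameters, chosen only for illustration.
READ_VOLTAGE_V = 1.0        # assumed read voltage applied to the transistor
CURRENT_THRESHOLD_A = 1e-7  # assumed current separating "open" from "closed"

def read_bit(drain_current_a):
    """Map the transistor's read current to a logic level: high current -> 1, low -> 0."""
    return 1 if drain_current_a >= CURRENT_THRESHOLD_A else 0

open_state = read_bit(1e-5)    # high-current ("open") state reads as logic 1
closed_state = read_bit(1e-9)  # low-current ("closed") state reads as logic 0
```

With more than two distinguishable current levels, the same comparison generalizes to multiple thresholds, which is what makes multi-bit storage per transistor possible.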

Viologen cations consist of two linked aromatic pyridine rings, (C10H8N2R2)2+, with two substituents (R) at the nitrogen atoms. Some metal halide-viologen complexes, i.e. those containing halogens (F, Cl, Br, and I, group 17 of the Periodic Table), can change color when exposed to light. These compounds have not yet found application in electronics despite their highly appealing optoelectronic characteristics. For the first time ever, a group of scientists from the Skolkovo Institute of Science and Technology (Moscow), the Institute of Problems of Chemical Physics of RAS (Chernogolovka) and the Nikolaev Institute of Inorganic Chemistry of SB RAS (Novosibirsk), led by Skoltech professor Pavel Troshin, succeeded in designing a photosensitive bismuth complex with optimal properties and demonstrated that it can serve as an advanced optically triggered material for memory devices.

“Earlier, we showed the prospects of using organic photochromic materials in photoswitchable field-effect transistors and optical memory elements. Recently, we looked into a series of dihetarylethene derivatives and established very important correlations between their structure and properties. In the current study, we have made a step forward along this avenue of research by using metal compounds in optical switches and memory elements,” explains Lyubov Frolova, a senior research scientist at Skoltech.

The researchers assembled organic field-effect transistors with an additional photosensitive layer made of the bismuth complex with viologen cations. As an intermediate device fabrication step, the complex was crystallized as a film from solution on a dielectric aluminum oxide layer. The scientists found that the device can be “programmed” by a simultaneously applied light pulse and electric bias between the device electrodes, which switches the device between two or more quasi-stable electrical states. Having multiple states in the transistor opens up great prospects for creating multi-bit memory elements for high-density data recording.

The current running through the transistor channel can be modulated by a factor of 100 in half a second, and by a factor of 10,000 over several tens of seconds of “programming”. These figures point to the high efficiency of the devices, which matches the characteristics of the best organic photosensitive field-effect transistors known to date. The authors expect their devices to offer long-term data storage and to withstand many write-read-erase cycles; the present study demonstrated stable operation over more than 200 cycles.

Journal

Chemical Communications

DOI

10.1039/D0CC03732J

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

NASA's tracking Hawaii-bound Major Hurricane Douglas

image: On July 24 at 6:30 a.m. EDT (1030 UTC), the MODIS instrument aboard NASA's Aqua satellite gathered temperature information about Hurricane Douglas' cloud tops. MODIS found the most powerful thunderstorms (red) were in the eyewall, where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

Hurricane Douglas is a major hurricane tracking through the Central Pacific Ocean on a forecast track toward Hawaii. NASA's Aqua satellite used infrared light to identify the strongest storms and coldest cloud top temperatures, and found them surrounding the eyewall of the powerful hurricane. In addition, images from NASA-NOAA's Suomi NPP satellite were used to generate an animated track of Douglas' movement and intensification over four days.

Infrared Data Reveals Powerful Storms

On July 24 at 6:30 a.m. EDT (1030 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite gathered temperature information about Hurricane Douglas' cloud tops. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

MODIS found the most powerful thunderstorms were in the eyewall, where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.
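The Fahrenheit-to-Celsius conversion quoted for the cloud-top threshold can be checked with a couple of lines (a minimal sketch):

```python
def f_to_c(deg_f):
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5 / 9

cloud_top_c = f_to_c(-70)  # the eyewall cloud-top threshold quoted above, in Celsius
```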

Andrew Latto, hurricane specialist at NOAA's National Hurricane Center noted, "Douglas continues to look impressive in satellite images, with a clear eye and symmetric convection in all quadrants."

NASA Animates Douglas Through Time

At NASA's Goddard Space Flight Center in Greenbelt, Md., an animation showing Douglas over four days was created using the NASA Worldview platform. Built from visible imagery taken by the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NASA-NOAA's Suomi NPP satellite, the animation shows the intensification and movement of Hurricane Douglas from July 20 to July 24 in the Eastern Pacific Ocean. Douglas was a Category 4 hurricane on July 24.

Douglas' Status on Friday, July 24, 2020

At 5 a.m. EDT (0900 UTC), the center of Hurricane Douglas was located near latitude 15.7 degrees north and longitude 140.3 degrees west. That is about 1,010 miles (1,630 km) east-southeast of Hilo, Hawaii.
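The quoted distance from the storm center to Hilo can be sanity-checked with the standard haversine great-circle formula; Hilo's coordinates below are approximate:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in degrees, in kilometers."""
    earth_radius_km = 6371.0  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

# Douglas' center (15.7 N, 140.3 W) to Hilo, Hawaii (approx. 19.72 N, 155.09 W)
dist_km = haversine_km(15.7, -140.3, 19.72, -155.09)
```

The result lands within a few kilometers of the advisory's 1,630 km figure.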

Douglas is moving toward the west-northwest near 18 mph (30 kph), and this motion is expected to continue for the next few days with a gradual decrease in forward speed and a slight turn toward the west.

Maximum sustained winds are near 130 mph (215 kph) with higher gusts. Douglas is a category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale. Hurricane-force winds extend outward up to 30 miles (45 km) from the center and tropical-storm-force winds extend outward up to 90 miles (150 km). The estimated minimum central pressure is 954 millibars.
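The category follows directly from the sustained wind speed. A minimal sketch using the published Saffir-Simpson thresholds (in mph):

```python
def saffir_simpson_category(wind_mph):
    """Return the Saffir-Simpson category (1-5), or 0 below hurricane strength."""
    # Lower bound of each category's sustained-wind range, per the NHC scale.
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for lower_bound, category in thresholds:
        if wind_mph >= lower_bound:
            return category
    return 0

douglas_category = saffir_simpson_category(130)  # Douglas' sustained winds at this advisory
```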

Gradual weakening is expected to begin today, July 24, and continue through the weekend.

NHC Key Messages

The National Hurricane Center's key message about Douglas is that the storm is expected to move near or over portions of the Hawaiian Islands this weekend, and there is an increasing chance that strong winds, dangerous surf, and heavy rainfall could affect portions of the state beginning Saturday night or Sunday.

About NASA's Worldview

NASA's Earth Observing System Data and Information System (EOSDIS) Worldview application provides the capability to interactively browse over 700 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks "right now."

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For updated forecasts, visit: http://www.nhc.noaa.gov

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Manipulating non-magnetic atoms in a chromium halide enables tuning of magnetic properties

image: The atomic landscape of chromium halides are illustrated. The magnetic chromium atoms appear as gray spheres and the non-magnetic ligand atoms as green (chlorine), orange (bromine), and magenta (iodine) spheres.

Image: 
Fazel Tafti

Chestnut Hill, Mass. (7/24/2020) - The magnetic properties of a chromium halide can be tuned by manipulating the non-magnetic atoms in the material, a team led by Boston College researchers reports in the most recent edition of Science Advances.

The seemingly counter-intuitive method is based on a mechanism known as an indirect exchange interaction, according to Boston College Assistant Professor of Physics Fazel Tafti, a lead author of the report.

An indirect interaction is mediated between two magnetic atoms via a non-magnetic atom known as the ligand. The Tafti Lab findings show that by changing the composition of these ligand atoms, all the magnetic properties can be easily tuned.

"We addressed a fundamental question: is it possible to control the magnetic properties of a material by changing its non-magnetic elements?" said Tafti. "The idea and the methodology we report are unprecedented. Our findings demonstrate a new approach to creating synthetic layered magnets with an unprecedented level of control over their magnetic properties."

Magnetic materials are the backbone of most current technology, such as the magnetic memory in our mobile devices. It is common practice to tune the magnetic properties by modifying the magnetic atoms in a material. For example, one magnetic element, such as chromium, can be replaced with another one, such as iron.

The team studied ways to experimentally control the magnetic properties of inorganic magnetic materials, specifically chromium halides. These materials are made of one chromium atom and three halide atoms: chlorine, bromine, or iodine.

The central finding illustrates a new method of controlling the magnetic interactions in layered materials through a special interaction known as ligand spin-orbit coupling. Spin-orbit coupling is the property of an atom that re-orients the direction of spins - the tiny magnets on the electrons - in line with the orbital motion of the electrons around the atom.

This interaction controls the direction and magnitude of magnetism. Scientists have been familiar with the spin-orbit coupling of the magnetic atoms, but they did not know that the spin-orbit coupling of the non-magnetic atoms could also be utilized to re-orient the spins and tune the magnetic properties, according to Tafti.
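In its textbook form, atomic spin-orbit coupling contributes a term to the Hamiltonian of the schematic shape below, where the coupling strength λ grows rapidly with atomic number; this is why heavy ligands such as iodine can dominate the effect even though they are non-magnetic:

```latex
H_{\mathrm{SO}} = \lambda \, \mathbf{L} \cdot \mathbf{S}
```

Here L and S are the orbital and spin angular momentum operators of the electron.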

The team was surprised that they could generate an entire phase diagram by modifying the non-magnetic atoms in a compound, said Tafti, who co-authored the report with fellow BC physicists Ying Ran and Kenneth Burch, post-doctoral researchers Joseph Tang and Mykola Abramchuk, graduate student Faranak Bahrami, and undergraduate students Thomas Tartaglia and Meaghan Doyle. Julia Chan and Gregory McCandless of the University of Texas, Dallas, and Jose Lado of Finland's Aalto University, were also part of the team.

"This finding puts forward a novel procedure to control magnetism in layered materials, opening up a pathway to create new synthetic magnets with exotic properties," Tafti said. "Moreover, we found strong signatures of a potentially exotic quantum state associated with magnetic frustration, an unexpected discovery that could lead to an exciting new research direction."

Tafti said the next step is to use these materials in innovative technologies such as magneto-optical devices or the new generation of magnetic memories.

Credit: 
Boston College

NASA animation tracks Tropical Storm Hanna's progression

image: This animation of visible imagery from NASA's Aqua satellite shows the development of Tropical Storm Hanna from July 20 to 23. Clouds associated with the low-pressure area were near south Florida and moved west over the Gulf of Mexico, where the system formed into a depression and then into a tropical storm.

Image: 
Courtesy: NASA Worldview, Earth Observing System Data and Information System (EOSDIS).

NASA's Aqua satellite obtained visible imagery as Tropical Storm Hanna formed in the Gulf of Mexico and continued to organize. A new animation from NASA shows how Hanna developed and intensified as it heads toward landfall in Texas this weekend.

NASA Satellite View: Hanna's Organization

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite captured a visible image of Tropical Storm Hanna on July 23 at 1:30 p.m. EDT. The image showed that the storm appeared more organized and more rounded than it had the previous couple of days, an indication that the storm was consolidating, organizing and strengthening.

Satellite imagery from the two previous days was combined with the July 23 image and made into an animation using NASA's Worldview product at NASA's Goddard Space Flight Center in Greenbelt, Md. The animation showed clouds associated with the low-pressure area near south Florida moving west over the Gulf of Mexico, where the system formed into a depression and then into a tropical storm.

Warnings Posted

On July 24, 2020, the National Hurricane Center posted a Tropical Storm Warning from the mouth of the Rio Grande to San Luis Pass, Texas.

Tropical Storm Hanna on July 24

At 11 a.m. EDT (1500 UTC) on July 24, the National Hurricane Center (NHC) noted the center of Tropical Storm Hanna was located by NOAA reconnaissance aircraft near latitude 27.2 degrees north and longitude 93.2 degrees west. That is 260 miles (420 km) east of Corpus Christi, Texas.

Hanna is moving toward the west-northwest near 9 mph (15 kph), and this motion should continue today. Maximum sustained winds have increased to near 45 mph (75 kph) with higher gusts. Reports from the NOAA reconnaissance aircraft indicate that the minimum central pressure is 1000 millibars.

Hanna's Forecast Track

The NHC forecast calls for a turn toward the west tonight, followed by a westward to west-southwestward motion through the weekend. Gradual strengthening is expected until the tropical cyclone makes landfall. On the forecast track, the center of Hanna should make landfall along the Texas coast within the warning area Saturday afternoon or evening. Steady weakening is expected after Hanna moves inland.

Credit: 
NASA/Goddard Space Flight Center

Citizen science at heart of new study showing COVID-19 seismic noise reduction

image: Locations of the 268 global seismic stations used in the study. Lockdown effects are observed (red) at 185 of 268 stations. Symbol size is scaled by the inverse of population density to emphasize stations located in remote areas.

Image: 
Reprinted with permission from T. Lecocq et al., Science 10.1126/science.abd2438 (2020).

Research published in the journal Science, using a mix of professional and Raspberry Shake citizen seismic data, finds that lockdown measures to slow the spread of COVID-19 reduced seismic noise by up to 50% worldwide.

By analyzing months- to years-long datasets from over 300 seismic stations in 78 countries, including 65 Raspberry Shake seismographs, the authors were able to demonstrate that ambient seismic noise levels dropped in many countries and regions around the world, making it possible to visualize the resulting “wave” starting in China, then moving to Italy and on to the rest of the world. This seismic noise reduction reflects the combined effects of physical/social distancing measures, reduced economic and industrial activity, and drops in tourism and travel. The 2020 seismic noise quiet period is the longest and most prominent global anthropogenic seismic noise reduction on record.

The study was spawned after the lead author, Dr. Thomas Lecocq, decided that the best way to tackle the problem of analyzing data from all around the globe was to share his method with the seismological community. This started a unique collaboration involving 76 authors from 66 institutions in 27 countries. The study’s lead authors are based in Belgium, the United Kingdom, New Zealand and Mexico.

Seismometers are sensitive scientific instruments that record vibrations traveling through the ground, known as seismic waves. Traditionally, seismology focuses on measuring seismic waves arising from earthquakes. Seismic records from natural sources, however, are contaminated by high-frequency vibrations (“buzz”) from humans at the surface: walking around, driving cars, and taking the train all create unique seismic signatures in the subsurface. Heavy industry and construction work also generate seismic waves that are recorded on seismometers.
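A noise reduction like the one reported is quantified as a drop in the root-mean-square (RMS) amplitude of this high-frequency buzz. A toy sketch, using a synthetic sinusoidal “buzz” whose amplitude halves (this is illustrative only, not the study's actual pipeline, which bandpass-filters real seismograms):

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a signal segment."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

n = 1000  # samples per segment, covering whole periods of the sinusoid
# Synthetic ambient "buzz": same frequency, but the amplitude halves under lockdown.
pre_lockdown = [1.0 * math.sin(2 * math.pi * 10 * i / n) for i in range(n)]
lockdown     = [0.5 * math.sin(2 * math.pi * 10 * i / n) for i in range(n)]

reduction = 1 - rms(lockdown) / rms(pre_lockdown)  # fraction of noise removed
```

Halving the amplitude halves the RMS, i.e. a 50% noise reduction in this toy case.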

There are many thousands of seismic monitoring stations around the world, and it took a team effort to download, process, and analyze the many terabytes of available data. Data came from high-end seismic monitoring networks as well as from Raspberry Shake citizen seismic sensors, whose data are shared with a global community. Raspberry Shake operates the largest single network of real-time seismographs in the world, used in various applications including research, professional vibration monitoring, and hobbyist projects. The research involved major collaboration between academic and citizen scientists using this network.

“This is a great example of the type of role citizen seismology can play in contributing to the scientific record,” Raspberry Shake chief scientist Ian Nesbitt said in a statement. “We are very proud of our community’s involvement in this unique study.”

While 2020 has not seen a reduction in earthquakes, the drop in the anthropogenic “buzz” has been unprecedented. The strongest seismic noise reductions were found in urban areas, but the study also found signatures of the lockdown on sensors buried hundreds of meters into the ground and in more remote areas, such as in Sub-Saharan Africa.

The study found a strong match between seismic noise reductions and human mobility datasets drawn from mapping apps on mobile phones and made publicly available by Google and Apple. This correlation allows open seismic data to be used as a broad proxy for tracking human activity in near-real-time, and to understand the effects of pandemic lockdowns and recoveries without impinging on potential privacy issues.
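Matching seismic noise against mobility data comes down to correlating two time series. A minimal Pearson-correlation sketch with made-up illustrative numbers, not data from the study:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical weekly indices (1.0 = pre-lockdown baseline), for illustration only.
noise_index    = [1.00, 0.95, 0.70, 0.55, 0.50, 0.52, 0.60]
mobility_index = [1.00, 0.90, 0.65, 0.50, 0.45, 0.50, 0.58]
r = pearson_r(noise_index, mobility_index)  # close to 1 when the series track each other
```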

The environmental effects of the pandemic lockdowns are wide and varied, including reduced emissions in the atmosphere and reduced traffic and noise pollution affecting wildlife. This period of time has been dubbed the “anthropause”. This new study is the first global investigation of the impact of the anthropause on the solid Earth beneath our feet.

Will the 2020 seismic noise quiet period allow new types of signals to be detected? The study has shown the first evidence that previously concealed earthquake signals, especially during daytime, appeared much clearer on seismic sensors in urban areas during lockdown. The study’s authors hope that their work will spawn further research on the seismic effects of lockdown. Finding previously hidden signals from earthquakes and volcanoes will be one key aim.

With growing urbanization and increasing populations globally, more people will be living in geologically hazardous areas. Therefore it will become more important than ever—especially with the rising popularity of citizen seismology—to characterize the anthropogenic noise humans cause so that seismologists can better listen to the Earth, especially in cities, and monitor the ground movements beneath our feet.

Full details of the study can be found in the report.

Journal

Science

DOI

10.1126/science.abd2438

Credit: 
Raspberry Shake

USTC makes breakthrough in Sb2(S,Se)3 solar cell efficiency

A research group led by Prof. CHEN Tao and Prof. ZHU Changfei, together with their collaborator Prof. HAO Xiaojing at UNSW, developed a hydrothermal deposition method for synthesizing antimony selenosulfide for solar cell applications. With this absorber material, the solar cell breaks the 10% benchmark efficiency barrier. The result has been published in Nature Energy under the title "Hydrothermal deposition of antimony selenosulfide thin films enables solar cells with 10% efficiency".

Antimony selenosulfide, Sb2(S,Se)3, an RoHS-compliant and earth-abundant light-harvesting material, has received increasing interest over the past few years. The band gap of Sb2(S,Se)3 is tunable in the range of 1.1-1.7 eV, satisfying the requirement for optimal sunlight harvesting. In addition, Sb2(S,Se)3 possesses a high extinction coefficient, so a film thickness of about 500 nanometers can absorb sufficient light. With these advantages, Sb2(S,Se)3 is a promising energy material for light-weight and portable electricity-generation devices.
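The link between band gap and absorption onset follows from the photon-energy relation lambda = hc/E. A quick sketch (the helper name is ours, not from the paper):

```python
def absorption_edge_nm(band_gap_ev):
    """Wavelength (nm) of the absorption edge for a given band gap, via lambda = hc / E."""
    hc_ev_nm = 1239.84  # Planck constant times speed of light, in eV*nm
    return hc_ev_nm / band_gap_ev

# The 1.1-1.7 eV tunable range corresponds to absorption onsets spanning
# roughly the near-infrared to the red end of the visible spectrum.
edges = {eg: round(absorption_edge_nm(eg)) for eg in (1.1, 1.7)}
```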

Considering that Sb2(S,Se)3 consists of earth-abundant elements and has excellent stability, breaking the 10% benchmark efficiency lays the groundwork for a path to commercialization. In this study, the authors found that hydrothermal deposition under supercritical conditions generates a compact, flat film with a homogeneous lateral element distribution. These characteristics enable efficient carrier transport and suppress detrimental recombination. With further optimization of the band gap, cation/anion ratio, crystal orientation and defect properties, the device achieved a record power conversion efficiency.

The reviewers of this paper praised the work, commenting that "This paper presents a landmark efficiency value for Sb2(S,Se)3 solar cells breaking the 10% barrier" and "This achievement sheds new light on the investigation and application of Sb2(S,Se)3 ...".

The co-first authors of this article are Dr. RONG Feng, Dr. WANG Xiaomin and LIAN Weitao, from the School of Chemistry and Materials Science of the University of Science and Technology of China. The co-corresponding authors are ZHU Changfei (USTC), HAO Xiaojing (UNSW) and CHEN Tao (USTC). Collaborators also include Prof. YANG Shangfeng of USTC, Prof. XING Guichuan of the University of Macau, and Prof. CHEN Shiyou of East China Normal University, among others.

Credit: 
University of Science and Technology of China