Tech

Lower than expected risk of bone density decline with Truvada PrEP

image: AIDS Research and Human Retroviruses, published monthly online with open access options and in print, presents papers, reviews, and case studies documenting the latest developments and research advances in the molecular biology of HIV and SIV and innovative approaches to HIV vaccine and therapeutic drug research, including the development of antiretroviral agents and immune-restorative therapies.

Image: 
(c) 2019, Mary Ann Liebert, Inc., publishers

New Rochelle, NY, July 15, 2019--Researchers have shown that among users of tenofovir-containing pre-exposure prophylaxis (PrEP; Truvada) to prevent HIV infection, those with daily use - very high adherence - had only about a 1% average decrease in bone mineral density in the spine and a 0.5% decline in the hip. The study findings, and the implications of these lower-than-expected declines for broader use of Truvada in PrEP and in HIV treatment, are explored in an article published in AIDS Research and Human Retroviruses, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. Click here to read the full-text article free on the AIDS Research and Human Retroviruses website through August 15, 2019.

Matthew Spinelli of the University of California, San Francisco, and a large team of researchers coauthored the article entitled "Impact of Estimated Pre-Exposure Prophylaxis (PrEP) Adherence Patterns on Bone Mineral Density in a Large PrEP Demonstration Project." Declines in bone mineral density with regular tenofovir use are an ongoing concern and may be limiting PrEP uptake. The researchers in this study combined estimated PrEP adherence data with bone density measurements obtained by dual-energy X-ray absorptiometry (DXA). The data were gathered over a median of 24 weeks.

Thomas Hope, PhD, Editor-in-Chief of AIDS Research and Human Retroviruses and Professor of Cell and Molecular Biology at Northwestern University, Feinberg School of Medicine, Chicago, IL, states: "Correct use of Truvada-based PrEP has been shown to significantly protect individuals from HIV acquisition. However, previous studies in individuals using Truvada for treatment raised possible concerns about decreases in bone density. Such concerns could decrease the number of individuals using PrEP. In the study presented here, only minor changes in bone density were observed indicating that young healthy individuals using Truvada for PrEP have only a minimal risk of decreased bone density and increased fractures. However, individuals at highest risk for fracture may consider alternative PrEP formulations such as DESCOVY® (TAF/FTC). This new knowledge should facilitate increased PrEP utilization in high risk populations."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Next generation metagenomics: Exploring the opportunities and challenges

image: OMICS: A Journal of Integrative Biology is an authoritative and highly innovative peer-reviewed interdisciplinary journal published monthly online, addressing the latest advances at the intersection of postgenomics medicine, biotechnology and global society, including the integration of multi-omics knowledge, data analyses and modeling, and applications of high-throughput approaches to study complex biological and societal problems.

Image: 
(c) 2019 Mary Ann Liebert, Inc., publishers

New Rochelle, NY, July 8, 2019--A new expert review highlights the opportunities and methodological challenges at this critical juncture in the growth of the field of metagenomics. With important implications and applications in clinical medicine, public health, biology, and ecology, metagenomics is benefitting from advances in high-throughput techniques and technology, while facing the challenges of big data storage and analysis, according to the review article published in OMICS: A Journal of Integrative Biology, the peer-reviewed interdisciplinary journal published by Mary Ann Liebert, Inc., publishers. Click here to read the full-text article free on the OMICS: A Journal of Integrative Biology website until August 8, 2019.

Ilaria Laudadio, Valerio Fulci, Laura Stronati, and Claudia Carissimi, at Sapienza University of Rome, Italy, coauthored the article entitled "Next Generation Metagenomics: Methodological Challenges and Opportunities." Metagenomics provides a view into the genetic composition of microbial communities, whether from environmental, human, or other types of samples. The authors identify the major bottlenecks in current metagenomic experimental design and data reporting and analysis. They discuss the differences between earlier shotgun metagenomics approaches and more recent technological developments such as single-cell metagenomics. They also focus on advances in the intriguing field of functional metagenomics and identify the need for greater standardization to allow for the proper comparison of data produced by different research groups.

Vural Özdemir, MD, PhD, DABCP, Editor-in-Chief of OMICS: A Journal of Integrative Biology states: "Metagenomics is a sophisticated example of what omics and systems sciences offer to both human and planetary health. Metagenomics is of interest not only to cell biologists and medical and environmental scientists, but also to physicians and healthcare specialists in need of new approaches to medical diagnostics and therapeutics. Dr. Carissimi and coauthors highlight the actionable targets for metagenomics, as well as what the future holds in this new frontier of systems sciences. For readers seeking to rapidly grasp the nuances of metagenomics, this concise expert review is a thoughtful and timely resource."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

How expectation influences perception

CAMBRIDGE, MA -- For decades, research has shown that our perception of the world is influenced by our expectations. These expectations, also called "prior beliefs," help us make sense of what we are perceiving in the present, based on similar past experiences. Consider, for instance, how a shadow on a patient's X-ray image, easily missed by a less experienced intern, jumps out at a seasoned physician. The physician's prior experience helps her arrive at the most probable interpretation of a weak signal.

The process of combining prior knowledge with uncertain evidence is known as Bayesian integration and is believed to broadly influence our perceptions, thoughts, and actions. Now, MIT neuroscientists have discovered distinctive brain signals that encode these prior beliefs. They have also found how the brain uses these signals to make judicious decisions in the face of uncertainty.

"How these beliefs come to influence brain activity and bias our perceptions was the question we wanted to answer," says Mehrdad Jazayeri, the Robert A. Swanson Career Development Professor of Life Sciences, a member of MIT's McGovern Institute for Brain Research, and the senior author of the study.

The researchers trained animals to perform a timing task in which they had to reproduce different time intervals. Performing this task is challenging because our sense of time is imperfect and can go too fast or too slow. However, when intervals are consistently within a fixed range, the best strategy is to bias responses toward the middle of the range. This is exactly what animals did. Moreover, recording from neurons in the frontal cortex revealed a simple mechanism for Bayesian integration: Prior experience warped the representation of time in the brain so that patterns of neural activity associated with different intervals were biased toward those that were within the expected range.

MIT postdoc Hansem Sohn, former postdoc Devika Narain, and graduate student Nicolas Meirhaeghe are the lead authors of the study, which appears in the July 15 issue of Neuron.

Ready, set, go

Statisticians have known for centuries that Bayesian integration is the optimal strategy for handling uncertain information. When we are uncertain about something, we automatically rely on our prior experiences to optimize behavior.

"If you can't quite tell what something is, but from your prior experience you have some expectation of what it ought to be, then you will use that information to guide your judgment," Jazayeri says. "We do this all the time."

In this new study, Jazayeri and his team wanted to understand how the brain encodes prior beliefs, and put those beliefs to use in the control of behavior. To that end, the researchers trained animals to reproduce a time interval, using a task called "ready-set-go." In this task, animals measure the time between two flashes of light ("ready" and "set") and then generate a "go" signal by making a delayed response after the same amount of time has elapsed.

They trained the animals to perform this task in two contexts. In the "Short" scenario, intervals varied between 480 and 800 milliseconds, and in the "Long" context, intervals were between 800 and 1,200 milliseconds. At the beginning of the task, the animals were given the information about the context (via a visual cue), and therefore knew to expect intervals from either the shorter or longer range.

Jazayeri had previously shown that humans performing this task tend to bias their responses toward the middle of the range. Here, they found that animals do the same. For example, if animals believed the interval would be short, and were given an interval of 800 milliseconds, the interval they produced was a little shorter than 800 milliseconds. Conversely, if they believed it would be longer, and were given the same 800-millisecond interval, they produced an interval a bit longer than 800 milliseconds.
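This bias toward the middle of the range is exactly what an optimal Bayesian observer predicts. As a minimal sketch (assuming a Gaussian prior over intervals and Gaussian measurement noise; the study's actual model may differ), the best estimate is a precision-weighted average of the noisy measurement and the prior mean, so the same 800-millisecond measurement is pulled in opposite directions by the two contexts:

```python
# Sketch of Bayesian integration in the ready-set-go task.
# Prior means and noise widths below are illustrative, not the study's fitted values.

def posterior_mean(measured_ms, prior_mean_ms, prior_sd_ms, noise_sd_ms):
    """Precision-weighted average of the noisy measurement and the prior mean."""
    w_prior = 1.0 / prior_sd_ms**2   # precision (1/variance) of the prior
    w_meas = 1.0 / noise_sd_ms**2    # precision of the measurement
    return (w_prior * prior_mean_ms + w_meas * measured_ms) / (w_prior + w_meas)

# "Short" context: intervals span 480-800 ms, so the prior mean is ~640 ms.
short_estimate = posterior_mean(800, prior_mean_ms=640, prior_sd_ms=90, noise_sd_ms=80)
# "Long" context: intervals span 800-1,200 ms, so the prior mean is ~1,000 ms.
long_estimate = posterior_mean(800, prior_mean_ms=1000, prior_sd_ms=110, noise_sd_ms=80)

print(round(short_estimate), round(long_estimate))  # prints 729 869
```

Identical input, different belief, different output: in the Short context the estimate lands below 800 ms, and in the Long context above it, matching the animals' behavior.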

"Trials that were identical in almost every possible way, except for the animal's belief, led to different behaviors," Jazayeri says. "That was compelling experimental evidence that the animal is relying on its own belief."

Once they had established that the animals relied on their prior beliefs, the researchers set out to find how the brain encodes prior beliefs to guide behavior. They recorded activity from about 1,400 neurons in a region of the frontal cortex, which they have previously shown is involved in timing.

During the "ready-set" epoch, the activity profile of each neuron evolved in its own way, and about 60 percent of the neurons had different activity patterns depending on the context (Short versus Long). To make sense of these signals, the researchers analyzed the evolution of neural activity across the entire population over time, and found that prior beliefs bias behavioral responses by warping the neural representation of time toward the middle of the expected range.

Embedded knowledge

Researchers believe that prior experiences change the strength of connections between neurons. The strength of these connections, also known as synapses, determines how neurons act upon one another and constrains the patterns of activity that a network of interconnected neurons can generate. The finding that prior experiences warp the patterns of neural activity provides a window onto how experience alters synaptic connections. "The brain seems to embed prior experiences into synaptic connections so that patterns of brain activity are appropriately biased," Jazayeri says.

As an independent test of these ideas, the researchers developed a computer model consisting of a network of neurons that could perform the same ready-set-go task. Using techniques borrowed from machine learning, they were able to modify the synaptic connections and create a model that behaved like the animals.

These models are extremely valuable as they provide a substrate for the detailed analysis of the underlying mechanisms, a procedure that is known as "reverse-engineering." Remarkably, reverse-engineering the model revealed that it solved the task the same way the monkeys' brains did. The model also had a warped representation of time according to prior experience.

The researchers used the computer model to further dissect the underlying mechanisms using perturbation experiments that are currently impossible to do in the brain. Using this approach, they were able to show that unwarping the neural representations removes the bias in the behavior. This important finding validated the critical role of warping in Bayesian integration of prior knowledge.

The researchers now plan to study how the brain builds up and slowly fine-tunes the synaptic connections that encode prior beliefs as an animal is learning to perform the timing task.

Credit: 
Massachusetts Institute of Technology

Model development is crucial in understanding climate change

image: The earth grids indicate the dynamical core of the atmospheric model component in FGOALS-f3-L, while the clouds and associated precipitation indicate the key physical scheme in the atmospheric model -- the Resolving Convective Precipitation (RCP) scheme -- which makes the model scale-aware and computationally fast. Based on the 'Tianhe 2' supercomputer, as shown below the gridded globe, the authors completed the CMIP6 AMIP experiments, which will contribute greatly to our understanding of extreme climate events such as typhoons, floods, drought, and snowstorms. Moreover, these datasets will also contribute to the benchmark of current model behaviors for the desired continuity of CMIP.

Image: 
Advances in Atmospheric Sciences

Numerical models are a key tool for climate scientists in understanding past, present and future climate change, whether arising from natural, unforced variability or in response to external forcing, according to Dr Qing Bao, Research Fellow at the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics (LASG), Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS), and the corresponding author of a recently published study.

"Climate changes, such as global warming, substantially influence human society in all aspects, and climate prediction is a constant hot topic in the climate science community," says Dr Bao. "The Coupled Model Intercomparison Project [CMIP], organized under the auspices of the World Climate Research Programme's Working Group on Coupled Modelling, uses state-of-the-art climate models to provide a physical evidence base for policymakers, such as the IPCC [Intergovernmental Panel on Climate Change]".

Dr Bao and his model team--a group of researchers from LASG/IAP--are in charge of the development of the atmospheric model of CAS' FGOALS-f3-L climate model. They recently completed the AMIP (Atmospheric Model Intercomparison Project) simulations in the sixth phase of CMIP and published their datasets on the ESGF (Earth System Grid Federation) nodes, described in a data description paper in Advances in Atmospheric Sciences.

The Finite-volume Atmospheric Model (FAMIL) in FGOALS-f3-L, the new-generation AGCM (atmospheric general circulation model) succeeding the Spectral Atmosphere Model of LASG (SAMIL), was finalized for the CMIP6 experiments in 2017. In this version, the dynamical core and model physics parameterization scheme have been substantially updated. The new model completes huge computing tasks quickly and overcomes some biases related to climate sensitivity and cloud microphysics in the previous version. The current version shows good ability not only in capturing large-scale patterns of climatological mean precipitation and surface temperature, but also in reflecting intraseasonal events like the MJO (Madden-Julian Oscillation) and typhoons, which were a challenge for the CMIP5 models, according to Dr He, the first author of the paper.

Following the design of the AMIP experiments, three ensemble simulations were carried out over the period 1979-2014, which were forced by monthly mean observed sea surface temperature and sea ice, as recommended by the CMIP6 projects. The model outputs contain a total of 37 variables and include the required three-hourly mean, six-hourly transient, daily and monthly mean datasets.

"Preliminary evaluation suggests that FGOALS-f3-L can capture the basic patterns of atmospheric circulation and precipitation well, and these datasets could contribute to the benchmark of current model behaviors for the desired continuity of CMIP," Dr Bao explains. "Analysis of these datasets will also be helpful in understanding the sources of model biases and be of benefit to the development of climate forecast systems."

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

NASA sees heavy rainfall potential in strengthening Tropical Storm Barry

image: At 3:15 a.m. EDT (0715 UTC) on July 13, the MODIS instrument aboard NASA's Aqua satellite looked at Tropical Storm Barry in infrared light. MODIS found coldest cloud tops (light green) had temperatures near minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius) around the center of the tropical storm which was offshore from south central Louisiana.

Image: 
NASA/NRL

Tropical Storm Barry continued to linger in the Gulf of Mexico, generating a lot of heavy rainfall on Saturday, July 13, 2019. Barry was just under the threshold of being classified as a Category 1 hurricane and is expected to become one before landfall. NASA's Aqua satellite analyzed cloud top temperatures in the storm, which gave an indication of the storm's strength.

At 3:15 a.m. EDT (0715 UTC) on July 13, the MODIS or Moderate Resolution Imaging Spectroradiometer instrument aboard NASA's Aqua satellite looked at Tropical Storm Barry in infrared light. MODIS found the coldest cloud tops had temperatures near minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius), located south and east of a slightly more rounded center of the tropical storm. Cloud-top temperatures that cold are indicative of strong storms that have been shown to have the capability to generate heavy rainfall.

The satellite image revealed a large area of strong thunderstorms with cloud tops that cold, surrounded by an even larger area of thunderstorms with cloud tops just slightly less cold. At the time of the image, the largest area of strong storms still appeared to be mostly south of the center of circulation and over the Gulf of Mexico. That's because wind shear from the north-northwest is still battering the storm and pushing the strongest thunderstorms south and southeast of the center.

Barry's Status on July 13, 2019

Barry is a strong tropical storm, just below the threshold of a hurricane, and is forecast to reach hurricane status before making landfall. At 8 a.m. EDT (1200 UTC), the National Hurricane Center or NHC said the center of Tropical Storm Barry was located near latitude 29.3 North and longitude 91.9 West. Barry is moving toward the northwest near 5 mph (7 km/h), and a turn toward the north is expected tonight or Sunday.

Maximum sustained winds have increased to near 70 mph (115 km/h) with higher gusts. Tropical-storm-force winds extend outward up to 175 miles (280 km) from the center.

Additional strengthening is forecast before landfall, and Barry is expected to be a hurricane when the center reaches the Louisiana coast during the next several hours.

Steady weakening is expected after Barry moves inland.

The estimated minimum central pressure based on surface observations is 991 millibars (29.26 inches).
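The unit conversions quoted in these advisories follow the standard formulas; a quick sanity check (the helper names here are illustrative):

```python
# Verify the paired units quoted in the advisory text above.

def f_to_c(f):
    """Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

def mb_to_inhg(mb):
    """Millibars to inches of mercury (1 mb = 0.0295300 inHg)."""
    return mb * 0.0295300

print(round(f_to_c(-80), 1))      # prints -62.2, matching the cloud-top figure
print(round(mb_to_inhg(991), 2))  # prints 29.26, matching the central pressure
```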

Warnings and Watches in Effect

The NHC posted many warnings and watches for Barry on July 13. A Hurricane Warning is in effect for Intracoastal City to Grand Isle, Louisiana. A Tropical Storm Warning is in effect from the mouth of the Pearl River to Grand Isle, Lake Pontchartrain and Lake Maurepas including metropolitan New Orleans, and from Intracoastal City to Cameron, La.

A Storm Surge Warning is in effect from Intracoastal City, La. to Biloxi, Mississippi and for Lake Pontchartrain. A Storm Surge Watch is in effect from Biloxi, Miss. to the Mississippi/Alabama border. A Hurricane Watch is in effect for the mouth of the Mississippi River to Grand Isle and from Intracoastal City to Cameron, La. A Tropical Storm Watch is in effect from east of the mouth of the Pearl River to the Mississippi/Alabama border.

Landfall in South-Central Louisiana Today

On the forecast track, the National Hurricane Center said the center of Barry will make landfall along the south-central Louisiana coast during the next several hours (after 8 a.m. EDT). After landfall, Barry is expected to move generally northward through the Mississippi Valley through Sunday night.

Credit: 
NASA/Goddard Space Flight Center

New e-cigarette laws could drive some users to smoke more cigarettes

image: Efforts by the FDA and some cities to limit the availability and appeal of e-cigarettes to young users could drive some existing users to smoke more tobacco cigarettes to get their fix, according to new research from Duke Health published July 15, 2019, in the journal Substance Use & Misuse.

Image: 
Duke Health

DURHAM, N.C. -- Efforts by the FDA and some cities to limit the availability and appeal of e-cigarettes to young users could drive some existing users to smoke more tobacco cigarettes to get their fix, according to new research from Duke Health.

The findings, from a survey of 240 young U.S. adults who use both e-cigarettes and traditional tobacco cigarettes, are scheduled to be published July 15 in the journal Substance Use & Misuse.

"The FDA now has regulatory authority over all tobacco products, including e-cigarettes, and we know that some communities have taken action to ban flavored e-cigarette products," said Lauren Pacek, Ph.D., the study's lead author and an assistant professor in psychiatry and behavioral sciences at Duke.

"We wanted to take a first pass at seeing what users' anticipated responses to new regulations might be," Pacek said. "Our findings suggest that while some regulations, such as banning certain flavors to limit appeal to adolescents, might improve outcomes for those young users, the new regulations might have unintended consequences with other portions of the population."

The online survey asked participants aged 18 to 29 to predict their use of two products they already used -- e-cigarettes and traditional tobacco cigarettes -- in response to hypothetical regulations to limit e-cigarette flavors, limit the customizability of e-cigarettes or eliminate the nicotine in e-cigarettes.

About 47 percent of respondents said if regulations eliminated the nicotine in e-cigarettes, they wouldn't use e-cigs as much and would increase their use of traditional cigarettes.

About 22 percent said if regulations limited the customizability of devices, such as features allowing users to adjust nicotine dose or vapor temperature, they would use e-cigs less and smoke more tobacco cigarettes.

About 17 percent said if e-cigarettes were to be limited to tobacco and menthol flavors, they wouldn't use e-cigs as much and they would smoke more tobacco cigarettes.

According to other research on e-cigarette use, about a third of people who use e-cigarettes also use other tobacco products, Pacek said. For instance, some smokers might use an e-cigarette where tobacco smoking is not allowed, such as at work or a restaurant.

The survey was small and not designed to predict the behavior of e-cigarette users across the U.S., Pacek said. However, the data suggest that when considering changes to e-cigarettes, such as limiting fruity flavors proven to appeal to youth, regulators should also consider the downstream effects of new regulations on other users.

"It's likely some potential new regulations on e-cigarettes will result in a net good for the whole population, such as limiting flavors that might entice young users, improving safety standards, or mandating that liquids come in child-proof containers," Pacek said. "However, our findings suggest that there should also be thoughtful consideration to potential unintended consequences that could affect other subsets of users of e-cigarettes and other tobacco products."

Credit: 
Duke University Medical Center

2D perovskite materials found to have unique, conductive edge states

image: Topographical view of the surface of the perovskite layer (l) and electrical current image of the same layer showing the conductive edges.

Image: 
Shashank Priya lab, Penn State

A new class of 2D perovskite materials with edges that are conductive like metals and cores that are insulating was found by researchers who said these unique properties have applications in solar cells and nanoelectronics.

"This observation of the metal-like conductive states at the layer edges of these 2D perovskite materials provides a new way to improve the performance of next-generation optoelectronics and develop innovative nanoelectronics," said Kai Wang, assistant research professor in materials science and engineering at Penn State and lead author on the study.

Wang and a team of Penn State researchers made the discovery while synthesizing lead halide perovskite materials for use in next generation solar cells. Perovskites, materials with a crystal structure good at absorbing visible light, are an area of focus in developing both rigid and flexible solar cells that can compete commercially with traditional cells made with silicon. These 2D perovskite materials are cheaper to create than silicon and have the potential to be equally efficient at absorbing sunlight.

The findings, reported in Science Advances, provide new insights into the charge and energy flow in perovskite materials, important for the continued advancement of the technology, the scientists said.

"I think the beauty of this work is that we found a material that has completely different properties along the edges compared to the core," said Shashank Priya, professor of materials science and engineering and associate vice president for research at Penn State. "It's very unusual that the current can flow around the edges and not in the center of a material, and this has huge implications for the design of solar cell architectures."

The 2D perovskite materials consist of thin, alternately stacked organic and inorganic layers. The organic layers protect the inorganic layers of lead halide crystals from moisture that can degrade 3D versions of the material. This layered structure results in a large variation in conductivity along perpendicular and parallel directions.

Using scanning and mapping techniques, the researchers found that sharp edges of the 2D single crystals exhibited extraordinarily large free charge carrier density.

"This work reveals the distinct differences in optoelectronic properties between the crystal layer edge and the core region, which can give a hint toward answering other important questions raised in the field of optoelectronics about these 2D perovskite materials," Wang said.

Researchers said the findings could boost performance of solar cells and LED technology by providing additional charge pathways within the devices. The findings also open the door for the development of innovative one-dimensional electrical conduction in nanoelectronics.

"Across the length of these materials, you have a junction between metal and semiconductor, and there are a lot of hypothetical devices proposed based on that junction," Priya said.

Because of the strong current found at the edges, 2D perovskite crystals may also be a good candidate for a triboelectric nanogenerator, the researchers said.

Nanogenerators convert motion into electric power, which could lead to wearable technology that charges phones and other devices using both light and mechanical energy as inputs.

Credit: 
Penn State

Study gives insight into sun-induced DNA damage and cell repair

image: Baylor University biochemist Jung-Hyun Min, Ph.D., led a study that provides better understanding of how sunlight-induced DNA damage is initially recognized for repair in cells.

Image: 
Robert Rogers/Baylor University

A team led by a Baylor University researcher has published a breakthrough article that provides a better understanding of the dynamic process by which sunlight-induced DNA damage is recognized as needing repair by the molecular repair machinery in cells.

Ultraviolet light from the sun is a ubiquitous carcinogen that can inflict structural damage to the cellular DNA. As DNA carries important blueprints for cellular functions, failure in removing and restoring damaged parts of DNA in a timely fashion can have detrimental outcomes and lead to skin cancers in humans, said lead author Jung-Hyun Min, Ph.D., associate professor of chemistry and biochemistry in Baylor's College of Arts & Sciences.

Min and her team showed how the repair protein Rad4/XPC would bind to one such UV-induced DNA damage--6-4 photoproduct -- to mark the damaged site along the DNA in preparation for the rest of the nucleotide excision repair (NER) process in cells.

The study -- "Structure and mechanism of pyrimidine-pyrimidone (6-4) photoproduct recognition by the Rad4/XPC nucleotide excision repair complex" -- is published in the journal Nucleic Acids Research (NAR) as a "breakthrough article."

Breakthrough articles present high-impact studies answering long-standing questions in the field of nucleic acids research and/or opening new areas and mechanistic hypotheses for investigation. They represent the very best papers published in NAR, constituting 1 to 2 percent of the manuscripts received by the journal.

UV light threatens the integrity of the genome by generating cellular DNA damage known as intra-strand crosslink damage, Min said. Two major types of these lesions are cyclobutane pyrimidine dimer (CPD), which makes up about 70 percent of such damage; and 6-4 photoproduct (6-4PP), which constitutes about 30 percent.

The cellular DNA repair system (NER), which is responsible for clearing these lesions, works much faster for 6-4PP than CPD, Min said. This is because a DNA damage-sensing protein (called Rad4/XPC) that initiates NER is more efficient at recognizing 6-4PP than it is at recognizing CPD.

Once a lesion is bound by Rad4/XPC, it can be removed by the NER pathway. NER works in all organisms, ranging from yeast to humans. How the Rad4/XPC protein recognizes the lesions and what leads to the differences in the recognition efficiencies remains unclear, Min said.

The team first determined a 3D structure of the Rad4 protein bound to a DNA substrate containing a 6-4PP lesion, using a technique called X-ray crystallography. The structure showed that the protein flips the portions of the DNA containing the 6-4PP outward and thus "opens" up the DNA double helix. This was accompanied by severe untwisting and bending of the DNA strands.

However, it was not the damaged portion of the DNA that the protein directly contacted, Min said.

Instead, the protein bound specifically to the healthy bits of the DNA opposite the lesion. This shows that the protein could in principle bind to the CPD as well as other environmentally induced DNA lesions that are known to be recognized by Rad4/XPC. But it could not directly explain why the recognition efficiencies among the lesions may be different.

To address this, Min then collaborated with Suse Broyde, Ph.D., at New York University and used molecular dynamics to computationally simulate the process by which Rad4 initially may latch on to the DNA containing either 6-4PP or CPD.

The simulation studies showed that the protein readily engages with 6-4PP to untwist, bend and partly "open" the DNA at the lesion site. But remarkably, the CPD-containing DNA resisted the untwisting and bending that was readily happening with 6-4PP.

Altogether, the team was able to assemble a 3D molecular trajectory that depicts the key steps during the DNA "opening" carried out by Rad4/XPC and unveils the reasons behind the different recognition of 6-4PP and CPD.

Min believes the discovery of these mechanics of nucleotide excision repair could provide benefits beyond understanding UV-induced damage, as NER also is an important pathway that repairs much of environmentally induced DNA damage -- including that caused by industrial pollutants, cigarette smoke and even some chemotherapeutic drugs.

"The hallmark of NER is that it repairs a very broad range of DNA damage. That is quite important in terms of how our genomes are protected from environmentally caused DNA damage," Min said.

"While it has been known for many decades that this Rad4/XPC protein can recognize 6-4PP very efficiently, there has been no structure to show how it really binds to the lesion and why the recognition is so efficient compared with lesions such as CPD," she said. "Basically, our study nicely fills this missing gap and details what that mechanism must be."

While this research showed how Rad4/XPC can bind to damage in a DNA duplex, it is still unknown how the protein finds such damage on DNA that is compactly packaged, as it is in cells, into a structure called chromatin.

Min said most DNA in chromatin is spooled around proteins called histones, and how Rad4/XPC gets around them to find a lesion is another mystery.

Also, she said it is unknown how Rad4/XPC would recruit the next player of the repair pathway, called Transcription Factor II H complex (TFIIH), which is important in verifying the damage before other proteins come and actually cut out the damaged portion.

"We hope the knowledge we uncover can be helpful in solving major problems in human health," Min said. "This is how we imagine we can help -- by understanding how things work with full 3-D structural detail."

Credit: 
Baylor University

Healthy lifestyle may offset genetic risk of dementia

Living a healthy lifestyle may help offset a person's genetic risk of dementia, according to new research.

The study, led by the University of Exeter, was simultaneously published today in JAMA and presented at the Alzheimer's Association International Conference 2019 in Los Angeles. The research found that the risk of dementia was 32 per cent lower in people with a high genetic risk who followed a healthy lifestyle, compared with those who had an unhealthy lifestyle.

Participants with high genetic risk and an unfavourable lifestyle were almost three times more likely to develop dementia compared to those with a low genetic risk and favourable lifestyle.

Joint lead author Dr Elżbieta Kuźma, at the University of Exeter Medical School, said: "This is the first study to analyse the extent to which you may offset your genetic risk of dementia by living a healthy lifestyle. Our findings are exciting as they show that we can take action to try to offset our genetic risk for dementia. Sticking to a healthy lifestyle was associated with a reduced risk of dementia, regardless of the genetic risk."

The study analysed data from 196,383 adults of European ancestry aged 60 and older from UK Biobank. The researchers identified 1,769 cases of dementia over a follow-up period of eight years. The team grouped the participants into those with high, intermediate and low genetic risk for dementia.

To assess genetic risk, the researchers looked at previously published data and identified all known genetic risk factors for Alzheimer's disease. Each genetic risk factor was weighted according to the strength of its association with Alzheimer's disease.
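The weighting scheme described above amounts to a weighted sum of risk-allele counts. A minimal sketch of the idea follows; the variant names and effect weights are invented for illustration and are not the study's actual data:

```python
# Sketch of a weighted polygenic risk score. Variants and effect
# sizes (log odds ratios) are hypothetical, for illustration only.
risk_weights = {"APOE_e4": 1.20, "SNP_A": 0.15, "SNP_B": 0.08}

def polygenic_risk_score(allele_counts):
    """Sum risk-allele counts (0, 1 or 2 per variant) weighted by effect size."""
    return sum(risk_weights[variant] * n for variant, n in allele_counts.items())

# One copy of APOE_e4 and two copies of SNP_A give a score of about 1.5.
print(polygenic_risk_score({"APOE_e4": 1, "SNP_A": 2, "SNP_B": 0}))
```

Participants can then be grouped into high, intermediate and low genetic risk by where their score falls in the overall distribution.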

To assess lifestyle, researchers grouped participants into favourable, intermediate and unfavourable categories based on their self-reported diet, physical activity, smoking and alcohol consumption. The researchers considered no current smoking, regular physical activity, healthy diet and moderate alcohol consumption as healthy behaviours. The team found that living a healthy lifestyle was associated with a reduced dementia risk across all genetic risk groups.

Joint lead author Dr David Llewellyn, from the University of Exeter Medical School and the Alan Turing Institute, said: "This research delivers a really important message that undermines a fatalistic view of dementia. Some people believe it's inevitable they'll develop dementia because of their genetics. However it appears that you may be able to substantially reduce your dementia risk by living a healthy lifestyle."

Credit: 
University of Exeter

Artificial intelligence (AI) designs metamaterials used in the invisibility cloak

image: Schematics of an artificial neural network that can design structural parameters and materials simultaneously. When the desired optical properties (electric/magnetic dipole spectrum) are input, the thickness and material type of each layer of the three-layer core-shell nanoparticle are provided as output.

Image: 
POSTECH

Metamaterials are artificial materials engineered to have properties not found in naturally occurring materials, and they are best known as the material of the 'invisibility cloak' often featured in science fiction novels or games. By precisely designing artificial atoms smaller than the wavelength of light and controlling the polarization and spin of light, optical properties not found in nature can be created. However, the current design process requires numerous trials and errors until the right material is obtained, which is not only time consuming but also compromises efficiency. AI is expected to provide a solution to this problem.

The research group of Prof. Junsuk Rho, Sunae So and Jungho Mun of the Department of Mechanical Engineering and Department of Chemical Engineering at POSTECH developed a design method with a higher degree of freedom, one that allows materials to be chosen and photonic structures to be designed arbitrarily using deep learning. Their findings are published in several renowned journals, including Applied Materials and Interfaces, Nanophotonics, Microsystems & Nanoengineering, Optics Express, and Scientific Reports. Having published their findings in five journals within a month, they have attracted tremendous attention in academia.

Properties of metamaterials depend on the way they are designed. The conventional intuitive-based and labor-intensive design has been suggested as a problem due to repetitive trial and error process. However, Prof. Rho and his team suggested a new data-based design method by utilizing AI.

AI can be trained with a vast amount of data, learning the designs of various metamaterials and the correlation between photonic structures and their optical properties. From this training, it can produce a photonic structure with the desired optical properties. Once trained, it can provide a suitable design promptly and efficiently. Similar approaches have already been researched at various institutions in the U.S. such as MIT, Stanford University, and the Georgia Institute of Technology. However, the previous studies required the materials and structural parameters to be specified as inputs beforehand, with the photonic structures adjusted afterwards.

Prof. Rho and his group trained the AI to design arbitrary photonic structures and gave the design an additional degree of freedom by categorizing types of materials and adding them as a design factor, which made it possible to select appropriate materials for the desired optical properties. Analysis of metamaterials obtained through this design method revealed that they had optical properties identical to those input into the artificial neural network.
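An inverse-design network of this kind maps a desired spectrum to both continuous outputs (layer thicknesses) and categorical outputs (material type per layer). The toy sketch below illustrates that input/output structure only; the dimensions, single hidden layer, and random weights are our assumptions, not POSTECH's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inverse-design network: desired spectrum in, three shell
# thicknesses plus a material class per layer out. Untrained random
# weights stand in for a trained model, purely to show the data flow.
N_SPECTRUM, N_HIDDEN, N_LAYERS, N_MATERIALS = 64, 32, 3, 4

W1 = rng.normal(0, 0.1, (N_HIDDEN, N_SPECTRUM))
W_thick = rng.normal(0, 0.1, (N_LAYERS, N_HIDDEN))
W_mat = rng.normal(0, 0.1, (N_LAYERS * N_MATERIALS, N_HIDDEN))

def design(spectrum):
    h = np.tanh(W1 @ spectrum)                        # shared hidden features
    thicknesses = np.abs(W_thick @ h) * 100           # nm, kept positive
    logits = (W_mat @ h).reshape(N_LAYERS, N_MATERIALS)
    materials = logits.argmax(axis=1)                 # one material id per layer
    return thicknesses, materials

t, m = design(rng.normal(size=N_SPECTRUM))
print(t, m)  # three thicknesses in nm, three material ids
```

Training such a network on simulated structure/spectrum pairs is what lets it invert the mapping from optical response back to design parameters.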

The research team, who have published various findings on metamaterial design and optics theory, put enormous effort into this research, teaching themselves Python, the programming language needed for deep learning, through massive open online courses (MOOCs).

Their new design method is revolutionary in many ways. First of all, it significantly reduces the time needed to design photonic structures. It also enables many new metamaterial designs, because scientists are no longer limited to empirical design to obtain results. Metamaterials can be used in applications such as displays, security, and military technology, but all of these remain at the development stage for now. In this regard, the introduction of AI to the design process is expected to make an important contribution to the technological development of metamaterials.

Prof. Junsuk Rho, the lead researcher of the team, commented: "Our research succeeded in bringing a higher degree of freedom to the design, but the new method still requires users to input certain problem settings at the beginning. It sometimes produces wrong designs, which makes it impossible to produce the desired metamaterials. So, I'd like to take our findings a step further by developing a complete AI-based design method for metamaterials. I'd also like to create innovative and practical metamaterials by training the AI with reviews of designs constructed in consideration of final products."

Credit: 
Pohang University of Science & Technology (POSTECH)

NASA finds an asymmetric Tropical Storm Barry

image: On July 12, 2019 at 4:10 a.m. EDT (0810 UTC) the MODIS instrument that flies aboard NASA's Aqua satellite showed strongest storms in Tropical Storm Barry were south of the elongated center where cloud top temperatures were as cold as minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

Infrared imagery from NASA's Aqua satellite shows that Tropical Storm Barry doesn't look like a typical strong tropical cyclone. The imagery revealed that Barry is elongated and that the strongest storms were south of its stretched-out center of circulation.

Warnings and Watches

At 8 a.m. EDT (1200 UTC) on Friday, July 12, NOAA's National Hurricane Center (NHC) in Miami, Florida said that Barry is moving slowly to the west-northwest in the Gulf of Mexico, and south of the coast of southeastern Louisiana. NHC warns of dangerous storm surge, heavy rains, and wind conditions expected across the north-central Gulf coast.

Many warnings and watches are in effect as Barry hugs the northern Gulf coast, hammering the region. A Hurricane Warning is in effect from Intracoastal City to Grand Isle, Louisiana. A Tropical Storm Warning is in effect from the mouth of the Pearl River to Grand Isle, La. and for Lake Pontchartrain and Lake Maurepas including metropolitan New Orleans, and from Intracoastal City, Louisiana to Cameron, Louisiana.

A Storm Surge Warning is in effect from Intracoastal City to Shell Beach, Louisiana. A Storm Surge Watch is in effect from Shell Beach to the Mississippi/Alabama border and for Lake Pontchartrain. A Hurricane Watch is in effect from the mouth of the Mississippi River to Grand Isle, La. and for Intracoastal City to Cameron, La. A Tropical Storm Watch is in effect from east of the Mouth of the Pearl River to the Mississippi/Alabama border.

Satellite Imagery

NASA's Aqua satellite used infrared light to analyze the strength of storms and found the bulk of them in the southern quadrant. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 12 at 4:10 a.m. EDT (0810 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite gathered infrared data on Tropical Storm Barry. The strongest thunderstorms had cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall. Those strongest storms were south and southeast of the center of the elongated circulation.
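Applying a cold-cloud-top threshold like this to brightness-temperature data is a simple filtering step. The sketch below shows how; only the minus 70 degrees Fahrenheit threshold comes from the text, and the pixel values are invented for illustration:

```python
# Flag "strong storm" pixels whose infrared cloud-top brightness
# temperatures fall at or below the -70 F (about -56.7 C) threshold.

def f_to_c(temp_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32) * 5 / 9

THRESHOLD_C = f_to_c(-70)  # about -56.7 C

# Hypothetical cloud-top temperatures (C) from an infrared scan.
cloud_top_temps_c = [-32.0, -58.5, -61.2, -40.1, -57.0]
strong = [t for t in cloud_top_temps_c if t <= THRESHOLD_C]
print(strong)  # [-58.5, -61.2, -57.0]
```

Colder cloud tops mean higher, more vigorous thunderstorms, which is why the coldest pixels mark the heaviest rainfall potential.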

The NHC said, "Barry does not have the typical presentation of a tropical cyclone on satellite imagery at this time. The cloud pattern consists of a cyclonically curved convective band on the southern semicircle, and the system is devoid of an inner convective core near the center. Barry is an asymmetric storm with most of the tropical-storm-force winds occurring in the eastern semicircle. Tropical-storm-force winds extend outward up to 175 miles (280 km) to the east of the center."

Barry's Status on July 12, 2019 at 8 a.m. EDT

On July 12 at 8 a.m. EDT (1200 UTC), the National Hurricane Center (NHC) said the broad circulation center of Tropical Storm Barry was located near latitude 28.2 degrees north and longitude 90.3 degrees west. The minimum central pressure based on the Hurricane Hunter aircraft data is 998 millibars (29.47 inches).

Barry is moving toward the west-northwest near 5 mph (7 kph). A track toward the northwest is expected to begin later in the day on Friday, July 12, followed by a turn toward the north on Saturday, July 13.

Reports from NOAA and Air Force Reserve Hurricane Hunter aircraft indicate that the maximum sustained winds remain near 50 mph (85 kph) with higher gusts. Some strengthening is expected during the next day or so, and Barry could become a hurricane tonight or early on July 13 when the center is near the Louisiana coast. After landfall, weakening is expected after Barry moves inland.

Barry's Path Forward

On the NHC forecast track, the center of Barry will be near or over the central or southeastern coast of Louisiana tonight or Saturday, July 13 and then move inland over the Lower Mississippi Valley on Sunday, July 14.

Key Messages from the National Hurricane Center

There is a danger of life-threatening storm surge inundation along the coast of southern and southeastern Louisiana where a Storm Surge Warning is in effect. The highest storm surge inundation is expected between Intracoastal City and Shell Beach. Residents in these areas should listen to any advice given by local officials.

The slow movement of Barry will result in a long-duration heavy rainfall and flood threat along the central Gulf Coast and inland through the lower Mississippi Valley through the weekend into early next week. Flash flooding and river flooding will become increasingly likely, some of which may be significant, especially along and east of the track of the system.

Hurricane conditions are expected along a portion of the coast of Louisiana, where a Hurricane Warning has been issued. Residents in these areas should rush their preparations to completion, as tropical storm conditions are expected to arrive in the warning area by Friday morning.

Credit: 
NASA/Goddard Space Flight Center

Rice device channels heat into light

image: Rice University graduate student Xinwei Li, left, and postdoctoral researcher Weilu Gao used carbon nanotube films Gao helped develop to create a device to recycle waste heat. It could ultimately enhance solar cell output and increase the efficiency of industrial waste-heat recovery.

Image: 
Jeff Fitlow/Rice University

HOUSTON - (July 12, 2019) - The ever-more-humble carbon nanotube may be just the device to make solar panels - and anything else that loses energy through heat - far more efficient.

Rice University scientists are designing arrays of aligned single-wall carbon nanotubes to channel mid-infrared radiation (aka heat) and greatly raise the efficiency of solar energy systems.

Gururaj Naik and Junichiro Kono of Rice's Brown School of Engineering introduced their technology in ACS Photonics.

Their invention is a hyperbolic thermal emitter that can absorb intense heat that would otherwise be spewed into the atmosphere, squeeze it into a narrow bandwidth and emit it as light that can be turned into electricity.

The discovery rests on another by Kono's group in 2016 when it found a simple method to make highly aligned, wafer-scale films of closely packed nanotubes.

Discussions with Naik, who joined Rice in 2016, led the pair to see if the films could be used to direct "thermal photons."

"Thermal photons are just photons emitted from a hot body," Kono said. "If you look at something hot with an infrared camera, you see it glow. The camera is capturing these thermally excited photons."

Infrared radiation is a component of sunlight that delivers heat to the planet, but it's only a small part of the electromagnetic spectrum. "Any hot surface emits light as thermal radiation," Naik said. "The problem is that thermal radiation is broadband, while the conversion of light to electricity is efficient only if the emission is in a narrow band.

"The challenge was to squeeze broadband photons into a narrow band," he said.

The nanotube films presented an opportunity to isolate mid-infrared photons that would otherwise be wasted. "That's the motivation," Naik said. "A study by (co-lead author and Rice graduate student) Chloe Doiron found that about 20% of our industrial energy consumption is waste heat. That's about three years of electricity just for the state of Texas. That's a lot of energy being wasted.

"The most efficient way to turn heat into electricity now is to use turbines, and steam or some other liquid to drive them," he said. "They can give you nearly 50% conversion efficiency. Nothing else gets us close to that, but those systems are not easy to implement." Naik and his colleagues aim to simplify the task with a compact system that has no moving parts.

The aligned nanotube films are conduits that absorb waste heat and turn it into narrow-bandwidth photons. Because electrons in nanotubes can only travel in one direction, the aligned films are metallic in that direction while insulating in the perpendicular direction, an effect Naik called hyperbolic dispersion. Thermal photons can strike the film from any direction, but can only leave via one.

"Instead of going from heat directly to electricity, we go from heat to light to electricity," Naik said. "It seems like two stages would be more efficient than three, but here, that's not the case."

Naik said adding the emitters to standard solar cells could boost their efficiency from the current peak of about 22%. "By squeezing all the wasted thermal energy into a small spectral region, we can turn it into electricity very efficiently," he said. "The theoretical prediction is that we can get 80% efficiency."

Nanotube films suit the task because they stand up to temperatures as high as 1,700 degrees Celsius (3,092 degrees Fahrenheit). Naik's team built proof-of-concept devices that allowed them to operate at up to 700 C (1,292 F) and confirm their narrow-band output. To make them, the team patterned arrays of submicron-scale cavities into the chip-sized films.

"There's an array of such resonators, and each one of them emits thermal photons in just this narrow spectral window," Naik said. "We aim to collect them using a photovoltaic cell and convert it to energy, and show that we can do it with high efficiency."

Rice postdoctoral researcher Weilu Gao is co-lead author and graduate student Xinwei Li is co-author. Kono is a professor of electrical and computer engineering, of physics and astronomy and of materials science and nanoengineering. Naik is an assistant professor of electrical and computer engineering.
The Basic Energy Science program of the Department of Energy, the National Science Foundation and the Robert A. Welch Foundation supported the research.

Credit: 
Rice University

Rush unveils quality composite rank

image: Rush University Medical Center's chief analytics officer Dr. Bala Hota is the lead author of 'Disagreement Between Hospital Rating Systems: Measuring the Correlation of Multiple Benchmarks and Developing a Quality Composite Rank'.

Image: 
Rush Production Group

Rush University Medical Center researchers have proposed a rating system that standardizes and combines data from five leading hospital rating systems into an easy-to-understand composite score from one to 10 that will help guide consumers' choice of hospital.

In a paper published July 2 in the American Journal of Medical Quality, the authors first cited research showing that despite almost two decades of public reporting of quality metrics, consumers have found hospital rating systems "to be limited and lacking in personalization or relevance for individual consumers." This lack of consumer engagement, the authors suggest, is likely driven by the substantial variability that exists between the ranking of top performing hospitals in different ranking systems: The U.S. News & World Report Best Hospitals List, the Vizient Quality and Accountability Study, the Centers for Medicare & Medicaid Services (CMS) Star Rating, the Leapfrog Hospital Safety Grade, and the Truven Top 100 Hospitals list.

Lead author Dr. Bala Hota, the Medical Center's chief analytics officer, noted that while each of the rating organizations provides valuable data and insight that help drive hospital quality improvement efforts, their complexity and variability have made them difficult for consumers to use.

"The science behind each rating systems is very complex and measures different outcomes, domains and even time periods," Hota said. "And while this wealth of data supporting the ratings is vital to hospitals, consumers are confused when the ratings disagree."

Thus, nearly two years ago, Hota and his Rush colleagues began gathering the data and assembling an objective framework needed to assess the overall similarity of the rating systems to one another. The paper, "Disagreement Between Hospital Rating Systems: Measuring the Correlation of Multiple Benchmarks and Developing a Quality Composite Rank," details how they aggregated scoring data from multiple hospital ranking systems to generate a single measure, the Quality Composite Rank (QCR).

For the study, the scores for 70 high-performing hospitals ranked by the various ranking systems were combined into a core data set of ten performance measures. Using a series of statistical correlation approaches that accounted for differences and similarities in what each rating organization measured, researchers were able to better identify variations and ultimately generate a single-digit composite score that rewards hospitals for consistency across rating systems.
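The general standardize-and-combine idea can be sketched as follows; the hospitals, scores, and simple z-score averaging below are invented for illustration and are not the paper's actual QCR methodology:

```python
import statistics

# Hypothetical hospitals scored by three rating systems on different
# scales. Each system is standardized (z-scored) so the scales become
# comparable, then the z-scores are averaged and hospitals ranked.
scores = {
    "Hospital A": [88, 4.5, 95],
    "Hospital B": [75, 3.0, 80],
    "Hospital C": [92, 4.0, 90],
}

def zscores(values):
    """Standardize a list of values to mean 0, population SD 1."""
    mu, sd = statistics.mean(values), statistics.pstdev(values)
    return [(v - mu) / sd for v in values]

# Standardize per rating system (columns), then average per hospital.
per_system = [zscores(col) for col in zip(*scores.values())]
composite = {name: statistics.mean(z) for name, z in zip(scores, zip(*per_system))}

ranked = sorted(composite, key=composite.get, reverse=True)
print(ranked)  # ['Hospital A', 'Hospital C', 'Hospital B']
```

Hospital A ranks first here because it scores consistently well across all three systems, which mirrors the QCR's stated goal of rewarding cross-system consistency.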

"Standardizing what is measured more objectively identifies hospitals that do well in multiple measurement systems. Hospitals with the best QCR scores had higher quality scores across more areas and measured by more scoring systems. We believe that suggests a more sustained and institutional commitment to quality care," Hota said.

More importantly, the authors believe a single-digit QCR composite score built from the various ratings systems will benefit patients.

"The most important metrics are those that help patients navigate the health system. But publicly-reported quality measures that the public does not understand defeats their purpose," said Omar Lateef, DO, Rush University Medical Center chief executive officer and paper co-author.

"When patients see conflicting ratings, they must then reconcile that information in their mind. What we've done is to develop a measure that quantitatively does that reconciliation."

Credit: 
Rush University Medical Center

Thwack! Insects feel chronic pain after injury

image: First genetic evidence that insects experience chronic pain.

Image: 
Viroreanu Laurentiu: Pixabay

Scientists have known insects experience something like pain since 2003, but new research published today from Associate Professor Greg Neely and colleagues at the University of Sydney proves for the first time that insects also experience chronic pain that lasts long after an initial injury has healed.

The study in the peer-reviewed journal Science Advances offers the first genetic evidence of what causes chronic pain in Drosophila (fruit flies) and there is good evidence that similar changes also drive chronic pain in humans. Ongoing research into these mechanisms could lead to the development of treatments that, for the first time, target the cause and not just the symptoms of chronic pain.

"If we can develop drugs or new stem cell therapies that can target and repair the underlying cause, instead of the symptoms, this might help a lot of people," said Associate Professor Neely, whose team of researchers is studying pain at the Charles Perkins Centre with the goal of developing non-opioid solutions for pain management.

Pain and insects

"People don't really think of insects as feeling any kind of pain," said Associate Professor Neely. "But it's already been shown in lots of different invertebrate animals that they can sense and avoid dangerous stimuli that we perceive as painful. In non-humans, we call this sense 'nociception', the sense that detects potentially harmful stimuli like heat, cold, or physical injury, but for simplicity we can refer to what insects experience as 'pain'."

"So we knew that insects could sense 'pain', but what we didn't know is that an injury could lead to long lasting hypersensitivity to normally non-painful stimuli in a similar way to human patients' experiences."

What is chronic pain?

Chronic pain is defined as persistent pain that continues after the original injury has healed. It comes in two forms: inflammatory pain and neuropathic pain.

The study of fruit flies looked at neuropathic 'pain', which occurs after damage to the nervous system and, in humans, is usually described as a burning or shooting pain. Neuropathic pain can occur in human conditions such as sciatica, a pinched nerve, spinal cord injuries, postherpetic neuralgia (shingles), diabetic neuropathy, cancer bone pain, and in accidental injuries.

Testing pain in fruit flies

In the study, Associate Professor Neely and lead author Dr Thang Khuong, from the University's Charles Perkins Centre, damaged a nerve in one leg of the fly. The injury was then allowed to fully heal. After the injury healed, they found the fly's other legs had become hypersensitive. "After the animal is hurt once badly, they are hypersensitive and try to protect themselves for the rest of their lives," said Associate Professor Neely. "That's kind of cool and intuitive."

Next, the team genetically dissected exactly how that works.

"The fly is receiving 'pain' messages from its body that then go through sensory neurons to the ventral nerve cord, the fly's version of our spinal cord. In this nerve cord are inhibitory neurons that act like a 'gate' to allow or block pain perception based on the context," Associate Professor Neely said. "After the injury, the injured nerve dumps all its cargo in the nerve cord and kills all the brakes, forever. Then the rest of the animal doesn't have brakes on its 'pain'. The 'pain' threshold changes and now they are hypervigilant."

"Animals need to lose the 'pain' brakes to survive in dangerous situations but when humans lose those brakes it makes our lives miserable. We need to get the brakes back to live a comfortable and non-painful existence."

In humans, chronic pain is presumed to develop through either peripheral sensitisation or central disinhibition, said Associate Professor Neely. "From our unbiased genomic dissection of neuropathic 'pain' in the fly, all our data points to central disinhibition as the critical and underlying cause for chronic neuropathic pain."

"Importantly now we know the critical step causing neuropathic 'pain' in flies, mice and probably humans, is the loss of the pain brakes in the central nervous system, we are focused on making new stem cell therapies or drugs that target the underlying cause and stop pain for good."

Credit: 
University of Sydney

Weyl fermions discovered in another class of materials

image: The 3 PSI researchers Junzhang Ma, Ming Shi and Jasmin Jandke (from left to right) at the Swiss Light Source SLS, where they succeeded in proving the existence of Weyl fermions in paramagnetic material.

Image: 
Paul Scherrer Institute/Markus Fischer

A particular kind of elementary particle, the Weyl fermion, was first discovered a few years ago. Its specialty: Weyl fermions move through a material in a well-ordered manner that practically never lets them collide with each other, making them very energy efficient. This opens up intriguing possibilities for the electronics of the future. Until now, Weyl fermions had only been found in certain non-magnetic materials. Now, for the very first time, scientists at the Paul Scherrer Institute PSI have experimentally proved their existence in another type of material: a paramagnet with intrinsic slow magnetic fluctuations. This finding also shows that it is possible to manipulate the Weyl fermions with small magnetic fields, potentially enabling their use in spintronics, a promising development in electronics for novel computer technology. The researchers have now published their findings in the scientific journal Science Advances.

Amongst the approaches that could pave the way to energy-efficient electronics of the future, Weyl fermions could play an intriguing role. Found experimentally only inside materials as so-called quasiparticles, they behave like particles with no mass. Although predicted theoretically as early as 1929 by the mathematician Hermann Weyl, they were only discovered experimentally, by scientists at PSI among others, in 2015. So far, Weyl fermions had only been observed in certain non-magnetic materials. Now, however, a team of scientists at PSI, together with researchers in the USA, China, Germany and Austria, has also found them in a specific paramagnetic material. This discovery could bring the potential use of Weyl fermions in future computer technology one step closer.

Searching for slow magnetic fluctuations

"The difficult part," says Junzhang Ma, postdoctoral researcher at PSI and first author of the new study, "was to identify a suitable magnetic material in which to look for these Weyl fermions." For years, although the accepted theoretical assumption had been that in certain magnetic materials Weyl fermions could exist by themselves, experimental proof of this was still missing despite considerable effort from several research groups worldwide. The team of scientists at PSI then had the idea to turn their attention to a specific group of magnetic materials: paramagnets with slow magnetic fluctuations.

"In specific paramagnetic materials, these intrinsic magnetic fluctuations could suffice to create a pair of Weyl fermions," says Ming Shi, who is a professor in the same research group as Ma: the Spectroscopy of Novel Materials Group. "But we understood that the fluctuations had to be slow enough in order for the Weyl fermions to appear. From this point on, identifying which material could have sufficiently slow magnetic fluctuations became our primary challenge."

Since the characteristic time of the magnetic fluctuations is not a property that can be looked up in a reference work for every material, it took the researchers some time and effort to find a suitable material for their experiment. Model analysis in theoretical physics, also done at PSI, helped them identify a promising candidate with slow magnetic fluctuations: the material with the chemical formula EuCd2As2, europium cadmium arsenide. And indeed, in this paramagnetic material, the scientists were able to experimentally prove the existence of Weyl fermions.

Measurements with Muons and X-rays

The scientists used two of PSI's large research facilities for their experiments: first, they employed the Swiss Muon Source SμS to measure and better characterise the magnetic fluctuations of their material. Subsequently, they visualized the Weyl fermions with an X-ray spectroscopy method at the Swiss Light Source SLS.

"What we have proven here is that Weyl fermions can exist in a wider range of materials than previously thought", says Junzhang Ma. The scientists' research thus significantly broadens the range of materials considered viable in the search for materials suitable for the electronics of the future. Within an area of development called spintronics, Weyl fermions could be used to transport information with much higher efficiency than that achieved by electrons in today's technology.

Credit: 
Paul Scherrer Institute