
A hidden electronic transition 'S0 → Tn' in heavy-atom-containing molecules

image: (a) Absorption and emission properties of hypervalent iodine compound II; the vertical axis is normalized. (b) Absorption spectra of II under various conditions.

Image: 
Masaya Nakajima

Researchers in Japan have discovered S0 → Tn, a previously overlooked electronic transition behind photoreactions in heavy-atom-containing molecules exposed to visible light. The study was published online in Angewandte Chemie International Edition on March 9, 2020.

In recent years, researchers have paid much attention to techniques for processing materials using visible light because this is safer and easier to handle than using other light sources, such as UV light. The key in such technologies is to control the photoreaction in order to achieve the intended structural changes in the material.

"We've been studying the mechanism of photoreactions in basic synthetic organic chemistry, but there has always been a mystery; photoreactions are promoted in molecules containing iodine during irradiation with light from the non-absorbing region. To fully understand and be able to control photoreactions in materials, we had to solve this mystery," says Tetsuhiro Nemoto, a professor in the Graduate School of Pharmaceutical Sciences at Chiba University in Japan.

The research group led by Nemoto and Masaya Nakajima, an assistant professor, investigated the optical properties of iodine-containing molecules, including absorption wavelength, fluorescence, and phosphorescence. The excitation wavelengths that produced phosphorescence at 550 nm ranged from 230 to 410 nm, even though almost no absorption band was observed at wavelengths above 320 nm (Fig. 1).

Furthermore, when the absorption spectra of high-concentration samples were measured using a 10-cm cell, 10 times longer than a standard one, a weak S0 → Tn absorption band was observed. With these spectroscopic results, the research group succeeded in proving the existence of the S0 → Tn transition in iodine-containing molecules.
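The logic of the 10-cm-cell measurement follows the Beer-Lambert law, A = ε·c·l: a spin-forbidden band with a very small molar absorptivity ε gives a measurable absorbance A only when the concentration c and the path length l are pushed up. The short Python sketch below illustrates the effect; the ε and concentration values are illustrative assumptions, not figures from the paper.

    # Beer-Lambert law: absorbance A = epsilon * c * l
    # epsilon: molar absorptivity (M^-1 cm^-1), c: concentration (M), l: path length (cm)
    def absorbance(epsilon, conc, path):
        return epsilon * conc * path

    weak_eps = 0.5  # illustrative value for a very weak, spin-forbidden band
    print(absorbance(weak_eps, 1e-4, 1.0))   # dilute sample, 1-cm cell: ~5e-5, lost in the noise
    print(absorbance(weak_eps, 1e-2, 10.0))  # concentrated sample, 10-cm cell: ~5e-2, measurable

A thousand-fold gain in ε·c·l is what turns an invisible band into a detectable one, which is why both the concentration and the cell length were increased.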

Interestingly, the group also found that the radical reactions triggered specifically by visible light are a common chemical phenomenon, occurring not only in molecules containing iodine (I) but also in those containing other heavy atoms such as bromine (Br) and bismuth (Bi).

Based on these physical and chemical findings, the researchers argue that our understanding of S0 → Tn needs to be revised: according to current textbooks, this transition does not play a main role in the mechanisms behind photoreactions.

"In the future, we expect that this mechanism of photoreaction will lead to the design of new molecules and reactions in various research fields," Nakajima says.

Credit: 
Chiba University

Sorry, Einstein: Hard workers may make better role models than geniuses

UNIVERSITY PARK, Pa. -- Role models are important for aspiring scientists, but new research suggests that scientists who are known for their hard work -- like Thomas Edison -- are more motivating than scientists who are viewed as naturally brilliant, like Albert Einstein.

In a series of studies, researchers found that young people were more motivated by scientists whose success was associated with effort than those whose success was attributed to innate, exceptional intelligence, even if that scientist was Albert Einstein.

Danfei Hu, a doctoral student at Penn State, and Janet N. Ahn, an assistant professor of psychology at William Paterson University, said the findings -- recently published in Basic and Applied Social Psychology -- will help dispel certain myths about what it takes to succeed in science.

"There's a misleading message out there that says you have to be a genius in order to be a scientist," Hu said. "This just isn't true and may be a big factor in deterring people from pursuing science and missing out on a great career. Struggling is a normal part of doing science and exceptional talent is not the sole prerequisite for succeeding in science. It's important we help spread this message in science education."

According to the researchers, there is concern in the science community about the number of students who pursue careers in science during school only to drop out of those career paths once they graduate from college. Researchers have dubbed this phenomenon the "leaking STEM pipeline."

To help solve the problem, Hu and Ahn wanted to research role modeling, which gives aspiring scientists specific goals, behaviors or strategies they can mimic. But while previous studies have examined qualities that make role models effective, Hu and Ahn were curious about whether the aspiring scientists' own beliefs about potential role models had an effect on their motivation.

"The attributions people make of others' success are important because those views could significantly impact whether they believe they, too, can succeed," Ahn said. "We were curious about whether aspiring scientists' beliefs about what contributed to the success of established scientists would influence their own motivation."

The researchers performed three studies with 176, 162 and 288 participants, respectively. In the first study, all participants read the same story about common struggles a scientist encountered in their science career. However, half were told the story was about Einstein, while half believed it was about Thomas Edison.

Despite the stories being the same, participants were more likely to believe natural brilliance was the reason for Einstein's success. Additionally, the participants who believed the story was about Edison were more motivated to complete a series of math problems.

"This confirmed that people generally seem to view Einstein as a genius, with his success commonly linked to extraordinary talent," Hu said. "Edison, on the other hand, is known for failing more than 1,000 times when trying to create the light bulb, and his success is usually linked to his persistence and diligence."

In the second study, participants once again read a story about a struggling scientist, but while one half of the sample was told it was about Einstein, the other half was told it was about a fabricated scientist whose name -- Mark Johnson -- was previously unfamiliar to them. Compared to those believing they were reading about Einstein, participants who read about Mark Johnson were less likely to think exceptional talent was necessary for success and more likely to perform better on a series of math problems.

Finally, the researchers wanted to perform a final study to see if people simply felt demotivated in comparison to Einstein or if Edison and an unknown scientist could boost participants' motivation.

In the third study, the researchers followed the same procedure as the previous two experiments with one change: The participants were randomly assigned to read a story about an unknown scientist, Einstein, or Edison. Compared to the unknown scientist, Edison motivated participants while Einstein demotivated them.

"The combined results suggest that when you assume that someone's success is linked to effort, that is more motivating than hearing about a genius's predestined success story," Hu said. "Knowing that something great can be achieved through hard work and effort, that message is much more inspiring."

Hu and Ahn both believe that in addition to providing insight for how to enhance scientists' effectiveness as role models, the findings can also be used to help optimize science education for students of all ages.

"This information can help shape the language we use in textbooks and lesson plans and the public discourse regarding what it takes to succeed in science," Hu said. "Young people are always trying to find inspiration from and mimic the people around them. If we can send the message that struggling for success is normal, that could be incredibly beneficial."

Credit: 
Penn State

A novel biofuel system for hydrogen production from biomass

image: Schematic diagram of byproduct production and hydrogen evolution through lignin decomposition.

Image: 
UNIST

A novel technology has been developed for producing hydrogen using the electrons released during the decomposition of biomass, such as waste wood. The decomposition also yields high value-added compounds, making this a two-birds-with-one-stone technology that improves the efficiency of hydrogen production.

A research team, led by Professor Jungki Ryu in the School of Energy and Chemical Engineering at UNIST has presented a new biofuel system that uses lignin found in biomass for the production of hydrogen. The system decomposes lignin with a molybdenum (Mo) catalyst to produce high value-added compounds, and the electrons extracted in the process effectively produce hydrogen.

An eco-friendly way of producing hydrogen is the electrolysis of water (H₂O): a voltage is applied to water to produce hydrogen and oxygen at the same time. However, in the technology reported so far, the oxygen evolution reaction (OER) is slow and complicated, and hydrogen production efficiency is low. This is because hydrogen gas (H₂) is produced when hydrogen ions (H⁺) accept electrons, and those electrons must come from the sluggish oxygen evolution reaction.

Through the study, Professor Ryu and his research team have developed a new biofuel system that uses lignin as an electron donor, bypassing the inefficient oxygen evolution reaction (OER). An inexpensive molybdenum-based metal catalyst (PMA) breaks down lignin at low temperatures, and the electrons released in the process are extracted to produce hydrogen. The new device is designed to move electrons from the lignin, along a wire, to the electrode where the hydrogen evolution reaction (HER) occurs.

"With this new system, we can produce hydrogen with less energy (overvoltage) than conventional water electrolysis, as there is no need for oxygen reactions, requiring high energy and precious metal catalysts," says Hyeonmyeong Oh (Combined M.S/Ph.D. of Energy and Chemical Engineering, UNIST), the first author of the study. "Conventional methods require more than 1.5 volts, but the new system was capable of producing hydrogen at a much lower potential (0.95 volts)."

In addition, vanillin and carbon monoxide (CO), which are produced via lignin breakdown, are very useful substances for various industrial processes. "Lignin, the second most naturally abundant biomass, is difficult to decompose. However, using molybdenum-based catalysts (PMA), it was easily degraded at low temperatures," says Research Assistant Professor Yuri Choi, the co-author of the study.

"The new biofuel system is a technology that produces hydrogen and valuable chemicals using cheap catalysts and low voltages instead of expensive catalysts such as platinum (Pt)," says Professor Ryu. "Our work is also significant, as it presents a new way to replace oxygen-producing reactions in the electrolysis of water."

Credit: 
Ulsan National Institute of Science and Technology(UNIST)

Approximating a kernel of truth

image: The researchers used a process of error estimation and mathematical approximation to prove that their approximate kernel remains consistent with the accurate kernel.

Image: 
© 2020 Ding et al.

By using an approximate rather than explicit "kernel" function to extract relationships in very large data sets, KAUST researchers have been able to dramatically accelerate the speed of machine learning. The approach promises to greatly improve the speed of artificial intelligence (AI) in the era of big data.

When AI is exposed to a large unknown data set, it needs to analyze the data and develop a model or function that describes the relationships in the set. The calculation of this function, or kernel, is a computationally intensive task that increases in complexity cubically (to the power of three) with the size of the data set. In the era of big data and increasing reliance on AI for analysis, this presents a real problem where kernel selection can become impractically time consuming.

Under the supervision of Xin Gao, Lizhong Ding and his colleagues have been working on methods to speed up kernel selection using statistics.

"The computational complexity of accurate kernel selection is usually cubic with the number of samples," says Ding. "This kind of cubic scaling is prohibitive for big data. We have instead proposed an approximation approach for kernel selection, which significantly improves the efficiency of kernel selection without sacrificing predictive performance."

The true, or accurate, kernel provides an exact description of the relationships in the data set. What the researchers found is that statistics can be used to derive an approximate kernel that is almost as good as the accurate version but can be computed many times faster, scaling linearly, rather than cubically, with the size of the data set.
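The news release does not detail the team's specific construction, but the general idea of a linear-scaling approximate kernel can be illustrated with a standard technique, random Fourier features, sketched below for an RBF kernel. The data, the bandwidth gamma and the feature dimension D are arbitrary choices for the example, and this is a generic stand-in rather than the authors' method.

    import numpy as np

    def exact_rbf_kernel(X, gamma=1.0):
        # Exact n x n Gram matrix: O(n^2 d) to build, with O(n^3) downstream selection steps.
        sq = np.sum(X**2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
        return np.exp(-gamma * d2)

    def random_fourier_features(X, gamma=1.0, D=2000, seed=0):
        # Approximates exp(-gamma * ||x - y||^2) by z(x) . z(y); cost grows linearly with n.
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
        b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        return np.sqrt(2.0 / D) * np.cos(X @ W + b)

    X = np.random.default_rng(1).normal(size=(400, 10))
    K_exact = exact_rbf_kernel(X, gamma=0.5)
    Z = random_fourier_features(X, gamma=0.5)
    print("mean absolute error:", np.abs(K_exact - Z @ Z.T).mean())

Because downstream quantities can be computed on the n-by-D feature matrix Z instead of the n-by-n kernel matrix, the cost of evaluating a candidate kernel grows with the number of samples rather than with its cube.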

To develop the approach, the team had to construct specifically designed kernel matrices, or mathematical arrays, that could be computed quickly. They also had to establish the rules and theoretical bounds for selection of the approximate kernel that would still guarantee learning performance.

"The main challenge was that we needed to design new algorithms satisfying these two points at the same time," says Ding.

Combining a process of error estimation and mathematical approximation, the researchers were able to prove that their approximate kernel remains consistent with the accurate kernel and then demonstrated its performance in real examples.

"We have shown that approximate methods, such as our computing framework, provide sufficient accuracy for solving a kernel-based learning method, without the impractical computational burden of accurate methods," says Ding. "This provides an effective and efficient solution for problems in data mining and bioinformatics that require scalability."

Credit: 
King Abdullah University of Science & Technology (KAUST)

Common feed ingredient tested safe in bulls

image: University of Illinois researchers tested the effects of distillers grains on Simmental x Angus bull development and reproductive capacity.

Image: 
Dan Shike, University of Illinois

URBANA, Ill. - Cattle feeders choose distillers grains in feedlot diets as an inexpensive alternative to corn and soybean meal. But until now, no one had studied the effects of the common feed ingredient on bull development and fertility. With bull fertility to blame for a significant portion of reproductive failures in cow-calf operations, University of Illinois researchers decided it was worth a look.

"We get questions occasionally about the effects of distillers grains on bulls, and a recent study showed some negative effects in rams. Even though there have been hundreds of experiments done on distillers grains with growing and finishing cattle, there's very limited bull development research from a breeding standpoint," says Daniel Shike, associate professor in the Department of Animal Sciences at U of I and co-author on a new study in Translational Animal Science.

In the study, eight-month-old Simmental × Angus bulls consumed either a 40% distillers-grain diet or a standard corn-based diet for 140 days, then they all switched to a common low-energy diet for 70 days.

"We chose the 40% inclusion rate, which is kind of on the high end, thinking there's really no reason to have distillers inclusion greater than that in most practical situations," Shike says. "Then we collected data on just about everything under the sun."

Throughout the study, the researchers measured growth performance, body condition, and hoof development, as well as a wide array of reproductive metrics. Shike and his team wanted to make the most of the opportunity to take a detailed look at bull development, citing a lack of basic information in the scientific literature.

At the end of the initial 140-day experimental treatment, only one reproductive metric differed between the groups of bulls. Those that ate distillers grains had a higher percentage of sperm with "proximal droplets," tiny fluid-filled sacs near the head of the sperm. Normally, these droplets migrate down the flagellum as the sperm moves and are eventually flung away. But when they're retained close to the head, it's considered a major defect that could affect reproductive capacity. However, after 70 days on the low-energy diet, the problem resolved.

Similarly, bulls that consumed distillers grains had greater body weight and fat at a couple of time points during the experiment, but, by the end of the study, the researchers could only detect a slight difference in body fat between the two groups of animals.

"We wanted the study to be typical of what bulls experience around here. The distillers grain is a little higher in fat and energy, so they're really being developed on a higher plane of nutrition to get ready for sale time. When people go to bull sales, they like to see well-developed heavyweight bulls. But then we've got to cool them back down again, back into their working condition. So that's why we evaluated them at the end of that low-energy common diet period as well," Shike says.

The fact that the two groups of bulls were essentially indistinguishable at the end of the study is good news for feeders utilizing distillers grains. However, Shike notes that the source of distillers grains used in the study was relatively low in sulfur; the experimental diet contained 0.23% sulfur as formulated, coming in below the 0.3% threshold for potential toxicity.

"That's a little bit of a disclaimer. If your distillers had elevated sulfur content, I would recommend a lower inclusion than 40%," he says. "Sources vary, and that's a potential problem. It's getting better, though. They're getting more predictable and they're getting lower across the board.

"The bottom line is, assuming sulfur content is not at a toxic level, we can utilize distillers in bull development rations, with very similar results as if they were developed on a corn-based diet."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Is federal rental assistance associated with childhood asthma outcomes?

What The Study Did: National survey data and housing assistance records were used to examine whether participation in U.S. Department of Housing and Urban Development rental assistance programs was associated with childhood asthma outcomes, including ever being diagnosed with asthma, history of asthma attack, and treatment in the emergency department for asthma.

Author: Michel Boudreaux, Ph.D., of the University of Maryland School of Public Health in College Park, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/ 

(doi:10.1001/jamapediatrics.2019.6242)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Uncovering novel relationships between SLCs and cytotoxic drugs in human cells

image: Solute Carriers (SLCs) affect the uptake, metabolism and mechanism of action (MoA) of cytotoxic drugs.

Image: 
© Enrico Girardi / CeMM

Solute carriers (SLCs) represent the largest family of transmembrane transporters in the human genome, with over 400 members arranged into 65 families. They have a key role in determining cellular metabolism and control essential physiological functions, including nutrient uptake, ion transport and waste removal. SLCs are vital for maintaining a stable internal state of the human body (known as homeostasis), but genetic variation (polymorphisms) in SLCs is associated with several diseases, such as gout or diabetes, while gene mutations are associated with hundreds of disorders, including many metabolic deficiencies and orphan diseases.

Solute carriers have been shown to act as drug targets, as well as to constitute routes for drug absorption into specific organs. However, despite decades of studies, there is still a lack of systematic surveys of transporter-drug relationships in human cells. Uncovering how particular drugs enter human cells and how cell metabolism affects them is key to gaining a better understanding of the side effects and limitations of current drugs, and thus to developing more effective drug therapies in the future.

Expanding on a previous study (Winter et al., Nat Chem Biol, 10, pp 768-773, 2014), which uncovered how a single solute carrier (SLC35F2) is necessary for the uptake of the cytotoxic compound YM-155, Giulio Superti-Furga and his group at CeMM have now performed a more systematic investigation of the role of solute carriers in determining the activity of a large and diverse set of cytotoxic compounds. Their goal was to survey how often, and how, SLC transporters affect the activity of a given drug.

In their study, CeMM researchers built a focused CRISPR/Cas9 library targeting 394 solute carriers and applied it to identify SLCs affecting the activity of a panel of 60 chemically diverse, mostly clinically approved, cytotoxic compounds. They determined that approximately 80% of the screened drugs show a dependency on at least one solute carrier. The scientists then individually validated a subset of SLC-drug interactions and employed uptake assays and transcriptomics approaches to investigate how some of the SLCs affected drug uptake and activity. "The use of a custom-made, SLC-focused library was instrumental in allowing us to screen a large number of compounds, revealing hundreds of SLC-drug associations and providing many novel insights into SLC biology and drug mechanisms", says Enrico Girardi, CeMM senior postdoctoral fellow and first author of the study.

The present study is the result of a cross-disciplinary collaboration with researchers from the Pharmacoinformatics Research Group of Gerhard Ecker at the University of Vienna as well as the group of Stefan Kubicek at CeMM. It provides not only a strongly validated argument for the requirement of solute carriers in cellular drug uptake and activity, but also an experimentally validated set of SLC-drug associations for several clinically relevant compounds. The evidence provided by CeMM researchers opens the path to further investigations of the genetic determinants of drug activity, and especially drug uptake, in human cells. "This study raises strong doubts about the generally accepted idea that most drugs can enter cells by simply diffusing through the membrane, and highlights the increasingly appreciated need to systematically study the biological roles of solute carriers", says Giulio Superti-Furga, CeMM Scientific Director and last author of the study. Gaining further insights into how the transporters affect the uptake and activity of drugs in tumors and tissues allows scientists to predict and counteract resistance mechanisms and to design more effective precision therapies. Furthermore, understanding the relationship between the expression of SLCs, cellular and organismal metabolism, and nutrition is likely to open novel therapeutic avenues in the future.

Credit: 
CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences

Groovy key to nanotubes in 2D

image: Rice University graduate student Natsumi Komatsu was the first to notice that the alignment of 2D carbon nanotube films corresponds to grooves in the filter paper beneath the films.

Image: 
Jeff Fitlow/Rice University

HOUSTON -- (March 9, 2020) -- Ultrathin carbon nanotube crystals could have wondrous uses, like converting waste heat into electricity with near-perfect efficiency, and Rice University engineers have taken a big step toward that goal.

The latest step continues a story that began in 2013, when Rice's Junichiro Kono and his students discovered a breakthrough method for making carbon nanotubes line up in thin films on a filter membrane.

Nanotubes are long, hollow and notoriously tangle-prone. Imagine a garden hose that's dozens of miles long, then shrink the diameter of the hose to the width of a few atoms.

Anyone who's ever fought with a knotted hose can appreciate Kono's feat: He and his students had turned a mob of unruly nanotubes into a well-ordered collective. Of their own accord, and by the billions, nanotubes were willingly lying down side by side, like dry spaghetti in a box.

The problem? Kono and his students had no idea why it was happening.

"It was magical. I mean, really mysterious," said Kono, an electrical engineer, applied physicist and materials scientist who has studied carbon nanotubes for more than two decades. "We had no idea what was really happening on a microscopic scale. And most importantly, we did not even know in which direction that nanotubes would align."

He and his team published their findings in 2016, and the field weighed in with possible explanations. The answer, as described in a new paper by Kono's team and collaborators in Japan, is both unexpected and simple: Tiny parallel grooves in the filter paper -- an artifact of the paper's production process -- cause the nanotube alignment. The research is available online in the American Chemical Society journal Nano Letters.

Kono said a graduate student in his lab, study lead author Natsumi Komatsu, was the first to notice the grooves and associate them with nanotube alignment.

"I found that any commercially purchased filter membrane paper used for this technique has these grooves," Komatsu said. "The density of grooves varies from batch to batch. But there are always grooves."

To form the 2D crystalline films, researchers first suspend a mixture of nanotubes in a water-surfactant solution. The soaplike surfactant coats the nanotubes and acts as a detangler. In 2013, Kono's students were using vacuum filtration to draw these mixtures through membrane filter paper. The liquid passed through the paper membrane, leaving a film of aligned nanotubes on top.

In an exhaustive set of experiments, Komatsu and colleagues, including Kono group postdoctoral researcher Saunab Ghosh, showed that the alignment of nanotubes in these films corresponded to parallel, submicroscopic grooves on the paper. The grooves likely form when the filter paper is pulled onto rolls at the factory, Kono said.

Komatsu examined dozens of samples of filter paper and used scanning electron microscopes and atomic force microscopes to characterize grooves and patterns of grooves. She cut filters into pieces, reassembled the pieces with grooves facing different directions and showed they produced films with matching alignments.

Komatsu and colleagues also used heat and pressure to remove the grooves from filter paper, using the same principles involved in ironing wrinkles from clothing. They showed that films made with groove-free paper had nanotubes aligned in several directions.

Finally, starting with groove-free paper, they showed they could use a very fine reflective grating with periodic grooves to create their own patterns of grooves and that corresponding nanotube films followed those patterns.

Kono said the method is exciting because it brings a needed level of predictability to the production of 2D crystalline nanotube films.

"If the nanotubes are randomly oriented, you lose all of the one-dimensional properties," Kono said. "Being one-dimensional is key. It leads to all of the unusual but important properties."

While the Kono group's films are essentially 2D -- as much as an inch in diameter but just a few billionths of a meter thick -- the individual nanotubes behave like 1D materials, especially in terms of their optical and electronic properties.

The extraordinary optical and electronic properties of carbon nanotubes depend on their diameter and structure, or chirality. Some chiralities act like metals and others like semiconductors, and researchers have struggled for decades to find a way to make large, macroscopic objects like a wire or one of Kono's 1-inch diameter films purely of nanotubes with one diameter and chirality.

"That's obviously the next step," Ghosh said. "In this study, we still used a mixture of metallic and semiconducting carbon nanotubes with a diameter distribution. The next step is to apply this new method based on intentional groove-making using a grating to achieve total control of the alignment direction."

Kono said his team has made highly aligned 2D crystals from solutions with a diverse mixture of nanotubes.

"But when we go to a single-chirality solution, we were never satisfied with the alignment," he said. "Now, with this knowledge of grooves, we are confident we can improve the degree of alignment in the case of single-chirality carbon nanotube films."

Single-chirality films could open the door to applications with mind-boggling potential -- for example, sheets of pure carbon that convert heat into light with almost perfect efficiency. Marrying such a sheet to a photovoltaic material could provide a way to turn heat into electric power very efficiently, creating the possibility of radiators that both cool motors and electronics while also powering them.

Kono's lab and the research group of Rice's Gururaj Naik demonstrated the concept for this in a 2019 paper about hyperbolic carbon nanotube films.

Single-chirality crystalline films could also be used to study new states of matter, such as exciton polaritons and Bose-Einstein condensates, and for applications that haven't yet been envisioned, Kono said.

"At this moment, only a small number of groups in the world can make these aligned, highly dense, heavily packed carbon nanotube films," he said. "And the work we just finished, the groove-assisted work, offers more control. This will lead to better films, new applications and new science. We are very excited."

Credit: 
Rice University

Researchers map protein motion

ITHACA, N.Y - Cornell structural biologists took a new approach to using a classic method of X-ray analysis to capture something the conventional method had never accounted for: the collective motion of proteins. And they did so by creating software to painstakingly stitch together the scraps of data that are usually disregarded in the process.

Their paper, "Diffuse X-ray Scattering from Correlated Motions in a Protein Crystal," published March 9 in Nature Communications.

As a structural biologist, Nozomi Ando, M.S. '04, Ph.D. '08, assistant professor of chemistry and chemical biology, is interested in charting the motion of proteins, and their internal parts, to better understand protein function. This type of movement is well known but has been difficult to document because the standard technique for imaging proteins is X-ray crystallography, which produces essentially static snapshots.

"Because we're studying really challenging biological systems, the group often has to pioneer new structural methods as well," said postdoctoral researcher Steve Meisburger, Ph.D. '14, the paper's lead author. "One of the questions that we have been interested in since the beginning is how a protein's subtle breathing motions direct biochemical function."

The researchers brought their project to the Cornell High Energy Synchrotron Source (CHESS), where they took advantage of the facility's Pilatus 6M pixel-array detector, which enabled them to make very high-resolution images.

For this work, as in regular crystallography, X-rays were beamed at a sample crystal. The pixel-array detector recorded the intensity of the X-rays that were diffracted by the crystal's proteins, thus encoding the atomic structure. Any disorder - i.e., motion - inside the crystal caused additional photons to bounce out, creating a very weak background signal called diffuse scattering. This information has been traditionally discarded during data processing.

"The photons go everywhere, and the signal appears extremely weak because it's spread out," said Ando, the paper's senior author. "For decades, people couldn't measure it accurately, and they didn't know how to interpret it."

Meisburger created software to process the approximately 50 million unique data points, resulting in a high-quality three-dimensional map. Much to the researchers' surprise, the map revealed that a significant component of this diffuse scattering pattern was actually a result of the protein lattice vibrating. This jiggling movement was so dominant, it seemed to obscure any motion inside the proteins, which was initially a disappointment for the researchers.

But after accounting for these lattice vibrations in simulations, the researchers identified internal protein motions as well. These motions included the opening and closing of the active site of the protein.

"Imagine the crystal being like a row of people trying to walk together while holding hands, but at the same time, each individual might be doing something slightly different," Ando said. "The signal from everybody moving together is dominant, so we couldn't discern the subtle signal that was coming from the individuals. That was something that had never been accounted for."

This new approach to diffuse scattering could help researchers get a clearer picture of protein structure and dynamics and, ultimately, a better understanding of biochemical reactions.

"We really want to push this in a direction where lots of people can use the technique and learn something new about their protein," Meisburger said. "One great thing about it is you get diffuse scattering for free anytime you do a regular crystallography experiment. This technique really adds information to what you would normally get."

Credit: 
Cornell University

New study unveils the mechanism of DNA high-order structure formation

image: Molecular structures of Abo1 in different energy states (left); demonstration of Abo1-assisted histone loading onto DNA by the DNA curtain assay.

Image: 
UNIST

A joint research team, led by Professor Ja Yil Lee (School of Life Sciences, UNIST) and Professor Ji-Joon Song (Department of Biological Sciences, KAIST) has unveiled the structure and mechanism of proteins that are highly overexpressed in various cancers and associated with poor patient prognoses. Such research findings could speed up the discovery and development of new cancer drugs.

DNA, the genetic material responsible for inheritance in humans, exists in a high-order structure. Such structure, known as chromatin, consists of DNA wrapped around certain proteins, known as histones. The function of chromatin is to efficiently package DNA into a small volume to fit into the nucleus of a cell and protect the DNA structure and sequence.

Regulation of histone proteins allows the DNA strands to become more tightly or loosely coiled during the processes of DNA replication and gene expression. However, problems may arise when histones clump together or when DNA strands intertwine. Indeed, the misregulation of chromatin structures can result in aberrant gene expression and ultimately lead to developmental disorders or cancers.

Histone chaperones are the proteins responsible for adding and removing specific histones at the right time and place during the DNA packaging process. Thus, they play a key role in the assembly and disassembly of chromatin.

The study focused on ATAD2 (also termed ANCCA), a histone chaperone that has been implicated in nucleosome density regulation through histone H3-H4 loading or removal. It is highly overexpressed in various cancers and associated with poor patient prognoses. As a result, there has been demand for therapeutic agents targeting the ATAD2 protein, and some clinical trials are already underway. Yet, to date, little specific information about the structure and function of ATAD2 has been reported.

Through the use of cryo-electron microscopy (cryo-EM), which allows direct observation of proteins in native and near-native states in atomic detail, the research team identified the structural details of Abo1, an ATAD2-family ATPase. They presented cryo-EM structures of the protein at atomic resolution in three different nucleotide states, revealing unique structural features required for histone loading on DNA, and directly visualized the transition of Abo1 from an asymmetric spiral (ATP state) to a symmetric ring (ADP and apo states) using high-speed atomic force microscopy (HS-AFM).

Additionally, they found that the acidic pore of ATP-bound Abo1 binds a peptide substrate suggestive of a histone tail. Based on these results, the researchers propose a model whereby Abo1 facilitates H3-H4 loading by utilizing ATP.

"This study is meaningful, as it reveals the structure and mechanism of histone chaperones protein through the use of cutting-edge techniques in biophysical, such as Cryo-EM," says Professor Lee. "This will accelerate the development of drug candidates, targeting ATAD2."

Credit: 
Ulsan National Institute of Science and Technology(UNIST)

Healthcare innovators focus on 'quality as a business strategy' -- update from Journal for Healthcare Quality

March 6, 2020 - Despite two decades of effort - targeting care processes, outcomes, and most recently the value of care - progress has been slow in closing the gap between quality and cost in the US healthcare system. It's time for a new approach focusing on healthcare quality as a business strategy, according to a special issue of the Journal for Healthcare Quality (JHQ), the peer-reviewed journal of the National Association for Healthcare Quality (NAHQ). The journal is published in the Lippincott portfolio by Wolters Kluwer.

Commissioned by NAHQ, the special issue shares insights from healthcare organizations leading the way toward a focus on quality as a business strategy. "For the past 20 years, the health care industry has improved and innovated, but it hasn't come far enough or fast enough," commented Stephanie Mercado, CAE, CEO of the National Association for Healthcare Quality. "As you can see from the papers published in this issue, when quality is leveraged as a business strategy and the people doing the work have the competencies to deliver on quality and safety, we can accelerate our progress and achieve real improvement."

Quality as a Business Strategy for Improving Healthcare: Six Innovative Approaches

The special issue was assembled by Guest Editors Cathy E. Duquette, PhD, RN, NEA-BC, CPHQ, FNAHQ, and Nidia S. Williams, PhD, MBB, CPHQ, FNAHQ. It presents six invited papers illustrating the wide range of programs being implemented by interprofessional teams across the health continuum to improve the quality of care:

Improving Documentation for Stroke Patients. At Memorial Hermann Health System in Houston, Randi Toumbs, DNP, RN, MS, AGACNP-BC, and colleagues used standardized templates and trainee education to increase the accuracy of clinical documentation for stroke care. In addition to improving performance on stroke metrics, the program improved case mix index and expected length of stay - two factors with an impact on reimbursement.

Assessing Costs of Pressure Injury. Shea Polancich, PhD, RN, of the University of Alabama at Birmingham and colleagues tested a method to estimate the costs of hospital-acquired pressure injuries, a common and preventable complication. Despite challenges, the pilot study suggested that assessing the costs of adverse events can help in validating the return on investment of efforts to improve the quality of care.

Bundled Payments for Joint Replacement. A study led by Tobin Lassen, MBA, MPH, of Cedar Gate, a value-based care performance company in Houston, found that mandatory participation in a Medicare bundled payment program for joint replacement does not improve patient outcomes. The researchers suggest that further study is needed to explore differences in the effects of bundled payment programs for patients in different settings.

Vertical Integration in Skilled Nursing Facilities. Research led by Tory H. Hogan, PhD, of the Ohio State University College of Public Health, Columbus, found that vertical integration of hospitals into skilled nursing facilities reduced hospital readmissions for pneumonia - but not for heart failure. These mixed results suggest that vertical integration of care may have differing effects in different types of hospitals.

Reducing Unnecessary Phlebotomy. Valerie L. Strockbine, DNP, RN, CPHQ, of The Johns Hopkins Hospital, Baltimore, and colleagues evaluated the use of a clinical decision support system (CDSS) to help reduce the rate of non-evidence-based phlebotomy testing. They found that the CDSS reduced the number of unnecessary tests and lowered costs, with the potential for further savings with more widespread introduction of the system.

Value Transformation Framework. Cheryl Modica, PhD, MPH, BSN, of the National Association of Community Health Centers (NACHC) and colleagues report on the development of a Value Transformation framework to provide an "actionable pathway" toward a value-based care approach at federally qualified health centers. The NACHC provides tools to help guide systems change toward the "Quadruple Aim" goals of improved healthcare outcomes, improved patient experience, improved staff experience, and reduced costs.

"As a leader in healthcare quality, I know first-hand the value of leveraging quality as a business strategy. I've worked with countless individuals and together we've made patient care better, and saved millions of dollars," commented Carole Guinane, RN, BA, MBA, CPHQ, President of the NAHQ Board of Directors. "There is no question, a vital component of leveraging quality as a business strategy is the healthcare workforce that is ready to deliver on value."

Credit: 
Wolters Kluwer Health

Radar and ice could help detect an elusive subatomic particle

COLUMBUS, Ohio - One of the greatest mysteries in astrophysics these days is a tiny subatomic particle called a neutrino, so small that it passes through matter - the atmosphere, our bodies, the very Earth - without detection.

Physicists around the world have for decades been trying to detect neutrinos, which are constantly bombarding our planet and which are lighter than any other known massive subatomic particle. Scientists hope that by capturing neutrinos, they can study them and, hopefully, understand where they come from and what they do.

But existing attempts are often expensive, and miss an entire class of high-energy neutrinos from some of the furthest reaches of space.

A new study published today in the journal Physical Review Letters shows, for the first time, an experiment that could detect that class of neutrinos using radar echoes.

"These neutrinos are fundamental particles that we don't understand," said Steven Prohira, lead author of the study and a researcher at The Ohio State University Center for Cosmology and Astroparticle Physics. "And ultra-high-energy neutrinos can tell us about huge parts of the universe that we can't really access in any other way. We need to figure out how to study them, and that's what this experiment tries to do."

The study relies on a phenomenon known as a cascade. Scientists think neutrinos move through the Earth at almost the speed of light - billions of them are passing through you now, as you read this.

Higher-energy neutrinos are more likely to collide with atoms. Those collisions cause a cascade of charged particles - "like a giant spray," Prohira said. And the cascades are important: If researchers can detect the cascade, they can detect a neutrino. Ultra-high-energy neutrinos are so rare that scientists so far have not been able to detect them.

Scientists have figured out that the best places to detect neutrinos are in large sheets of remote ice: The longest-running and most successful neutrino experiments are in Antarctica. But those experiments so far have not been able to detect neutrinos with higher energies.

That's where Prohira's research comes in: His team showed, in a laboratory, that it is possible to detect the cascade that happens when a neutrino hits an atom by bouncing radio waves off of the trail of charged particles left by the cascade.

For this study, they went to the SLAC National Accelerator Laboratory in California, set up a 4-meter-long plastic target to simulate ice in Antarctica, and blasted the target with a billion electrons packed into a tiny bunch to simulate neutrinos. (The total energy of that electron bunch, Prohira said, is similar to the total energy of a high-energy neutrino.) Then they transmitted radio waves at the plastic target to see if the waves would indeed detect a cascade. They did.
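To see why a bunch of electrons can stand in for a single neutrino-induced cascade, multiply the particle count by a per-particle energy. The article gives only the count (about a billion electrons); the 10 GeV per-electron beam energy used below is an assumed, illustrative figure rather than a number from the study.

    # Order-of-magnitude check on the bunch energy used as a neutrino stand-in
    electrons_per_bunch = 1e9       # figure quoted in the article
    energy_per_electron_ev = 10e9   # assumed ~10 GeV per electron (illustrative, not from the article)
    total_ev = electrons_per_bunch * energy_per_electron_ev
    print(f"total bunch energy ~ {total_ev:.0e} eV")  # ~1e19 eV, the ultra-high-energy scale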

Prohira said the next step is to take the experiment to Antarctica, to see if it can detect neutrinos over a wide volume of remote ice there.

Radio waves are the cheapest known technology for detecting neutrinos, he said, "which is part of why this is so exciting." Radio waves have been used in the search for the highest-energy neutrinos for about 20 years, Prohira said. This radar technique could be one more tool in the radio wave toolbox for scientists hoping to study ultra-high-energy neutrinos.

And having a greater understanding of neutrinos could help us understand more about our galaxy and the rest of the universe.

"Neutrinos are the only known particles that travel in straight lines -- they go right through things," he said. "There aren't any other particles that do that: Light gets blocked. Other charged particles get deflected in magnetic fields."

When a neutrino is created somewhere in the universe, it travels in a straight line, unaltered.

"It points straight back to the thing that produced it," Prohira said. "So, it's a way for us to identify and learn more about these extremely energetic processes in the universe."

Credit: 
Ohio State University

An ultimate one-dimensional electronic channel in hexagonal boron nitride

image: A layer of hexagonal boron nitride has the shape of chicken wire and is formed by the alternation of boron (B, pink) and nitrogen (N, blue). Depending on how the layers are piled up together, the material assumes different arrangements: AA, AB, AC, AA′, AB′, and AC′. The team achieved and studied an AA′/AB stacking boundary for the first time.

Image: 
IBS

In the field of 2D electronics, the norm used to be that graphene is the main protagonist and hexagonal boron nitride (hBN) is its insulating passive support. Researchers of the Center for Multidimensional Carbon Materials (CMCM) within the Institute for Basic Science (IBS, South Korea) made a discovery that might change the role of hBN. They have reported that stacking of ultrathin sheets of hBN in a particular way creates a conducting boundary with zero bandgap. In other words, the same material could block the flow of electrons, as a good insulator, and also conduct electricity in a specific location. Published in the journal Science Advances, this result is expected to raise interest in hBN by giving it a more active part in 2D electronics.

Similarly to graphene, hBN is a 2D material with high chemical, mechanical and thermal stability. hBN sheets resemble chicken wire and are made of hexagonal rings of alternating boron and nitrogen atoms, strongly bound together. However, unlike graphene, hBN is an insulator with a large bandgap of more than five electronvolts, which limits its applications.
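A quick way to see why such a wide gap keeps hBN electronically inert is to convert the gap into the photon wavelength needed to bridge it, using the textbook relation λ(nm) ≈ 1240 / E(eV). The snippet below applies that conversion to the "more than five electronvolts" figure quoted above.

    # Photon wavelength corresponding to a bandgap: lambda (nm) ~ 1239.84 / E (eV)
    def bandgap_wavelength_nm(e_ev):
        return 1239.84 / e_ev

    print(bandgap_wavelength_nm(5.0))  # ~248 nm, deep ultraviolet
    # Visible photons (~400-700 nm, roughly 1.8-3.1 eV) cannot bridge a >5 eV gap,
    # which is why pristine hBN behaves as a transparent, insulating support.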

"In contrast to the wide spectrum of proposed applications for graphene, hexagonal boron nitride is often regarded as an inert material, largely confined as substrate or electron barrier for 2D material-based devices. When we began this research, we were convinced that reducing the bandgap of hBN could give to this material the versatility of graphene," says the first author, Hyo Ju Park.

Several attempts to lower the bandgap of hBN have been largely ineffective because of its strong covalent boron-nitrogen bonds and chemical inertness. IBS researchers, in collaboration with colleagues at Ulsan National Institute of Science and Technology (UNIST) and Sejong University in Korea and Nanyang Technological University in Singapore, managed to produce in few-layer hBN a particular stacking boundary having a bandgap of zero electronvolts.

Depending on how the hBN sheets are piled up, the material can assume different configurations. For example, in the so-called AA′ arrangement, the atoms in one layer are aligned directly on top of the atoms in another layer, but successive layers are rotated such that boron is located on nitrogen and nitrogen on boron atoms. In another type of layout, known as AB, half of the atoms of one layer lie directly over the center of the hexagonal rings of the lower sheet, and the other atoms overlap with the atoms underneath.

For the first time, the team has reported atomically sharp AA′/AB stacking boundaries formed in few-layer hBN grown by chemical vapor deposition. Characterized by a line of oblong hexagonal rings, this specific boundary has zero bandgap. To confirm this result, the researchers performed several simulations and tests via transmission electron microscopy, density functional theory calculations, and ab initio molecular dynamics simulations.

"An atomic conducting channel expands the application range of boron nitride infinitely, and opens new possibilities for all-hBN or all 2D nanoelectronic devices," points out the corresponding author Zonghoon Lee.

Credit: 
Institute for Basic Science

Young teachers happier but say hard work is unrewarded

Newly qualified teachers report higher levels of wellbeing and life satisfaction compared to other graduates, but are more likely to say hard work in Britain is unrewarded, according to UCL research.

The study, published today in the British Journal of Educational Studies and funded by the Nuffield Foundation, also shows that newly qualified teachers work, on average, nine hours more a week compared to graduates in other professions.

Researchers from the UCL Institute of Education analysed data of around 16,000 people born in 1989-90 from the Next Steps cohort study. The study began in 2004 and has continued to track individuals into their twenties. By age 26 the final sample of teachers who had been in the job for up to three years was 291.

Teachers were asked questions about their wellbeing, health, working and social lives as well as whether they believed that hard work in Britain is rewarded.

Professor John Jerrim (UCL Institute of Education), lead author, said, "We are currently seeing a shortage of appropriately qualified teachers, particularly in secondary schools, and we wanted to find out why so many are leaving the profession.

"This is of particular concern because not only are teachers feeling undervalued, many school teachers and heads are saying this is directly affecting and harming the quality of education pupils receive."

Overall, teachers reported higher levels of life satisfaction and showed no evidence of worse mental health or less active social lives compared to others in the cohort. For example, 37% of teachers said they were 'very satisfied' at both ages 20 and 26, whereas 34% of those in office jobs (for example) said they were 'very satisfied' at age 20, compared to 25% at age 26.

Teachers were, however, less likely than their peers to believe that Britain is a place where hard work gets rewarded. Around 30% of teachers agreed or strongly agreed that hard work is rewarded, compared to around 40% of health workers and lower-managerial workers, 45% of all graduates, and over half of all office workers.

The findings also showed that compared to all graduates, teachers are paid around £22 more per week. However, teachers received £54 per week less than their peers working in health and £71 less than those in office jobs.

Professor Jerrim added: "If teachers are expected to work long hours, often for little extra pay - but do not feel that this effort is appreciated - it is little wonder why many end up choosing to leave the profession.

"More work needs to be done to understand exactly why young teachers in England feel this way, and education policymakers and school leaders need to make greater efforts to show junior teachers that their hard work and dedication to the job is highly valued and sincerely appreciated."

Cheryl Lloyd, Education Programme Head at the Nuffield Foundation, said: "It is reassuring that the early career teachers in this study reported relatively good life satisfaction and similar mental health to their peers.

"However, given the ongoing teacher supply crisis we must not be complacent, as less experienced teachers are more likely to leave the profession. New, returning and more experienced teachers have a vital role to play in education and it is important that we build a better understanding of how we can better attract and retain teachers."

Credit: 
Taylor & Francis Group

One step closer to understanding the human brain

image: Jan Mulder, researcher at the Department of Neuroscience, Karolinska Institutet, Sweden. Photo: Jan Mulder

Image: 
Jan Mulder

An international team of scientists led by researchers at Karolinska Institutet in Sweden has launched a comprehensive overview of all proteins expressed in the brain, published today in the journal Science. The open-access database offers medical researchers an unprecedented resource to deepen their understanding of neurobiology and develop new, more effective therapies and diagnostics targeting psychiatric and neurological diseases.

The brain is the most complex organ of our body, both in structure and function. The new Brain Atlas resource is based on the analysis of nearly 1,900 brain samples covering 27 brain regions, combining data from the human brain with corresponding information from the brains of the pig and mouse. It is the latest database released by the Human Protein Atlas (HPA) program which is based at the Science for Life Laboratory (SciLifeLab) in Sweden, a joint research centre aligned with KTH Royal Institute of Technology, Karolinska Institutet, Stockholm University and Uppsala University. The project is a collaboration with the BGI research centre in Shenzhen and Qingdao in China and Aarhus University in Denmark.

"As expected the blueprint for the brain is shared among mammals, but the new map also reveals interesting differences between human, pig and mouse brains," says Mathias Uhlén, Professor at the Department of Protein Science at KTH Royal Institute of Technology, Visiting professor at the Department of Neuroscience at Karolinska Institutet and Director of the Human Protein Atlas effort.

The cerebellum emerged in the study as the most distinct region of the brain. Many proteins with elevated expression levels were found in this region, including several associated with psychiatric disorders, supporting a role for the cerebellum in the processing of emotions.

"Another interesting finding is that the different cell types of the brain share specialised proteins with peripheral organs," says Dr. Evelina Sjöstedt, researcher at the Department of Neuroscience at Karolinska Institutet and first author on the paper. "For example, astrocytes, the cells that 'filter' the extracellular environment in the brain share a lot of transporters and metabolic enzymes with cells in the liver that filter the blood."

When comparing the neurotransmitter systems, responsible for the communication between neurons, some clear differences between the species could be identified.

"Several molecular components of neurotransmitter systems, especially receptors that respond to released neurotransmitters and neuropeptides, show a different pattern in humans and mice," says Dr. Jan Mulder, group leader of the Human Protein Atlas brain profiling group and researcher at the Department of Neuroscience at Karolinska Institutet. "This means that caution should be taken when selecting animals as models for human mental and neurological disorders."

For selected genes/proteins, the Brain Atlas also contains microscopic images showing the protein distribution in human brain samples and detailed, zoomable maps of protein distribution in the mouse brain.

The Human Protein Atlas started in 2003 with the aim to map all of the human proteins in cells, tissues and organs (the proteome). All the data in the knowledge resource is open access allowing scientists both in academia and industry to freely use the data for the exploration of the human proteome.

Credit: 
Karolinska Institutet