Brain

Advanced memory from advanced materials

image: Diagram to show how Weyl points are controlled.

Image: 
© 2020 Higo et al.

Researchers successfully demonstrated a method to switch a novel material between two different nonvolatile states at very high speed and with great accuracy. The device in question is also highly robust against external influences such as magnetic fields. Together, these properties mean a high-speed, high-capacity memory device could be created - one that would also be extremely energy efficient.

In 1929, theoretical physicist Hermann Weyl was exploring the newly derived Dirac equation, which describes many phenomena in particle physics and led to the discovery of antimatter. He noticed that the equation implied the existence of a massless particle, which became known as the Weyl fermion. For a time, the neutrino was believed to be this elementary particle. Almost a century later, in 2015, the Weyl fermion was finally observed experimentally, and in the years since, physicists have begun not only to understand it but also to find potential uses for it. A team including researchers from the laboratory led by Professor Satoru Nakatsuji at the Institute for Solid State Physics and Department of Physics at the University of Tokyo has now found a way to use Weyl fermions to make advanced memory devices.

"Spintronics is a word likely to excite those interested in the future of technology. Broadly, it is something that could supersede and replace many electronic functions in present-day devices," explained Research Associate Tomoya Higo. "For a while now, ferromagnetic materials, magnets that behave in a familiar way, have been used to explore spintronic phenomena. But there is a better class of magnetic materials for this purpose called antiferromagnetic materials, which seem harder to work with but have many advantages."

Antiferromagnets are interesting materials because they offer many of the useful properties of ferromagnets while being far less susceptible to external magnetic fields, thanks to the unique arrangement of their constituent magnetic moments. This is a benefit when working towards memory devices, as accuracy and robustness are important, but the same arrangement also makes the material harder to manipulate.

"It was not at all obvious whether you can control an antiferromagnetic state with a simple electrical pulse as you can a ferromagnetic one," said Nakatsuji.

This is where the aforementioned Weyl fermions come in.

"In our sample (antiferromagnetic manganese-tin alloy Mn3Sn), Weyl fermions exist at Weyl points in momentum space (not a physical space but a mathematical way of representing momentums of particles in a system). These Weyl points have two possible states which could represent binary digits," explained Postdoctoral Research Fellow Hanshen Tsai. "Our breakthrough finding is that we can switch a Weyl point between these states with an external electrical current applied to neighboring thin layers of Mn3Sn and either platinum or tungsten. This method is called spin-orbit torque switching."

"Our discovery indicates the massless Weyl fermion pursued by physicists has been found in our magnet, and moreover can be electrically manipulated," added Nakatsuji.

Thanks to a very large electrical signal produced by Weyl fermions in Mn3Sn, the detection of spin-orbit torque switching is possible. The switching rate, which corresponds to how fast memory based on such technology could be written to or read from, is in the region of trillions of times a second, or terahertz. Current high-end computer memory switches a few billion times a second, or gigahertz. So, when realized, the technology could deliver quite a jump in performance, but there is still a way to go.
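
As a rough, back-of-envelope illustration (the specific rates below are assumed round numbers, not figures from the study), the headline speedup can be computed directly:

    # Comparing switching rates (illustrative round numbers only).
    ghz_rate = 5e9   # assumed: a few billion switches per second (current memory)
    thz_rate = 1e12  # assumed: ~1 THz switching for a Weyl-point-based device

    speedup = thz_rate / ghz_rate
    print(f"approximate speedup: {speedup:.0f}x")  # -> 200x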

"There were two big challenges in our study. One was optimizing the synthesis of Mn3Sn thin films. The other was figuring out the switching mechanism," said Higo. "We are excited not only because we found some interesting phenomena, but because we can expect our findings may have important applications in the future. By creating new materials, we discover new phenomena which can lead to new devices. Our research is full of dreams."

Credit: 
University of Tokyo

Study describes cocktail of pharmaceuticals in waters in Bangladesh

image: To study pharmaceuticals in water, UB scientists used the system pictured to isolate chemical compounds from water samples.

Image: 
Meredith Forrest Kulwicki

BUFFALO, N.Y. -- In spring of 2019, researchers set out to investigate what chemicals could be found in the waters of Bangladesh.

The scientists -- from the University at Buffalo and icddr,b, a leading global health research institute in Bangladesh -- tested a lake, a canal and a river in Dhaka, Bangladesh's capital and the nation's largest city. The team also sampled water from ditches, ponds and drinking wells in a rural area known as Matlab.

In the lab, an analysis revealed that the waters held a cocktail of pharmaceuticals and other compounds, including antibiotics, antifungals, anticonvulsants, anesthetics, antihypertensive drugs, pesticides, flame retardants and more.

Not all of these chemicals were found at every location, and the amounts detected were sometimes low.

But the ubiquity of contamination is concerning, says lead scientist Diana Aga, an environmental chemist at UB.

"When we analyzed all these samples of water from Bangladesh, we found fungicides and a lot of antibiotics we weren't looking for," says Aga, PhD, Henry M. Woodburn Professor of Chemistry in the UB College of Arts and Sciences. "This kind of pollution is a problem because it can contribute to the development of bacteria and fungi that are resistant to the medicines we have for treating human infection."

The study appears in the April 10 issue of the journal Science of the Total Environment, and was published online in December 2019. The research, funded by the UB Community of Excellence in Global Health Equity, was a partnership between UB and icddr,b.

Compounds the team found at every sampling site included the antifungal agent carbendazim, flame retardants and the insect repellent DEET.

The canal and river in Dhaka contained a medley of chemicals. Of note, scientists discovered multiple varieties of antibiotics at these two sites, along with antifungals. While researchers generally found fewer antimicrobials at the rural test locations, some antibiotics were found at certain sites, and antifungal agents were common.

"The fact that we found so many different types of chemicals is really concerning," Aga says. "I recently saw a paper, a lab study, that showed exposure to antidepressants put pressure on bacteria in a way that caused them to become resistant to multiple antibiotics. So it's possible that even chemicals that are not antibiotics could increase antibacterial resistance."

Aga's team included first author Luisa F. Angeles, a PhD candidate in UB's Department of Chemistry, who traveled to Bangladesh to sample water and train scientists there on sample collection and preparation techniques.

Afterward, Aga, Angeles and colleagues studied the water in their Buffalo laboratory using state-of-the-art analytical methods.

In the past, technological limitations meant scientists could only test samples for specific targeted chemicals. Aga's team was able to employ a more advanced form of analysis that screens samples for a huge variety of pollutants -- checking for more than 1,000 potential compounds in this case, including ones the researchers did not anticipate finding.

The discovery of antimicrobials in urban areas was not surprising, as these chemicals are often found in human urine and later in wastewater that's released into rivers, Aga says. At the rural sites, she suspects the antibiotics and antifungals in the water stem from people using these chemicals to protect food crops and farm animals.

"It's important to note that antimicrobial contamination of the environment is not unique to Bangladesh, but expected in many countries throughout the world where antimicrobial use is poorly regulated in both human medicine and agriculture, which is generally the case in lower-middle income countries of Asia," says study co-author Shamim Islam, MD, clinical associate professor of pediatrics in the Jacobs School of Medicine and Biomedical Sciences at UB.

Islam adds that, "As undertaken in this study, we feel analyzing and characterizing such environmental antimicrobial contamination is a critically important component of global antimicrobial resistance surveillance and mitigation efforts."

Credit: 
University at Buffalo

Cell biology: Your number's up!

mRNAs program the synthesis of proteins in cells, and their functional lifetimes are dynamically regulated. Researchers from Ludwig-Maximilians-Universitaet (LMU) in Munich have now shown why blueprints that are more difficult to decipher have shorter lifetimes than others.

The control of gene expression is a fundamental component of living systems. The term refers to the suite of mechanisms that determines how the hereditary information encoded in the DNA genome of every cell is selectively transcribed into messenger RNAs (mRNAs), and then translated by ribosomes into proteins. Because the set of proteins synthesized in a cell defines its structure and biochemical capacities, every step in the process must be tightly regulated. One of the modules of this regulatory system is dedicated to the timely destruction of mRNAs in response to changing conditions. An international team led by Professor Roland Beckmann at LMU's Gene Center, in collaboration with Jeff Coller (Case Western Reserve University, Cleveland, USA) and Toshifumi Inada (Tohoku University, Sendai, Japan) has now worked out the detailed structure of a protein complex that is involved in mRNA degradation, and dissected its mode of action. The results of the new study, which appears in the leading journal Science, explain how and why the lifetime of an mRNA molecule is linked to the rate of synthesis of the protein it encodes.

"Statistical data had already revealed that the lifetime of an mRNA is correlated with speed of the ribosome during synthesis of its protein product," says Robert Buschauer, a PhD student in Beckmann's group and lead author of the new paper. "But the molecular basis for this relationship was completely unknown." 

The efficiency of protein synthesis largely depends on how well the ribosome can read the instructions encoded in the nucleotide sequences of mRNAs. These programs are written in the language of the genetic code. Triplet sequences of nucleotides ('codons') specify the order in which the different amino acids that make up the protein are linked together. Each of the required amino acids is delivered to the ribosome by an adaptor molecule called a tRNA. Each tRNA is also equipped with a nucleotide triplet (an 'anticodon') that recognizes its counterpart in the mRNA, and this interaction enables its amino-acid cargo to be slotted into the correct position in the growing protein.

The genetic code is redundant: virtually all amino acids are specified by several nucleotide triplets, which are read with varying efficiencies by the ribosome. If a given triplet is difficult to read, the ribosome takes longer to select the appropriate tRNA with the required amino acid. The new study identifies the mechanistic basis for the link between this delay and the degradation of mRNAs. "With the aid of cryo-electron microscopy, we were able to show that a key protein complex that is required for mRNA degradation can interact with the ribosome only if the tRNA binding site is not occupied." This finding explains why the probability that an mRNA molecule will be degraded on the ribosome rises with the fraction of inefficiently decoded codons it contains.
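
The logic of that link can be captured in a toy model. The sketch below assumes an invented per-codon decoding-speed table (real codon-specific rates would have to come from experiments); it simply scores an mRNA by its fraction of slowly read codons:

    # Toy model: mRNAs enriched in slowly decoded codons spend more time with
    # an empty ribosomal tRNA site and are thus more likely to be degraded.
    # The per-codon "speeds" below are invented for illustration.
    codon_speed = {"GAA": 1.0, "GAG": 0.4, "CGU": 0.9, "CGG": 0.2}

    def slow_codon_fraction(mrna, threshold=0.5):
        codons = [mrna[i:i + 3] for i in range(0, len(mrna) - 2, 3)]
        slow = [c for c in codons if codon_speed.get(c, 1.0) < threshold]
        return len(slow) / len(codons)

    # Two of the three codons here are "slow" -> a higher degradation score.
    print(f"{slow_codon_fraction('GAGCGGGAA'):.2f}")  # -> 0.67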

"This newly discovered interaction is in fact crucial for the coupling between mRNA degradation and ribosomal efficiency," says Beckmann.  The destruction of mRNAs is an essential process, whose central components differ very little between yeast and human cells. Any errors that occur can give rise to neurodegenerative diseases, cancers or other serious disorders. A better understanding of the underlying mechanisms is therefore a prerequisite for the development of more effective therapies.

Credit: 
Ludwig-Maximilians-Universität München

Co-delivery of IL-10 and NT-3 to enhance spinal cord injury repair

image: Journal brings together scientific and medical experts in the fields of biomedical engineering, material science, molecular and cellular biology, and genetic engineering.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, April 17, 2020--Spinal cord injury (SCI) creates a complex microenvironment that is not conducive to repair; growth factors are in short supply, whereas factors that inhibit regeneration are plentiful. In a new report, researchers have developed a structural bridge material that simultaneously stimulates IL-10 and NT-3 expression using a single bi-cistronic vector to alter the microenvironment and enhance repair. The article is reported in Tissue Engineering, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers.

In "Polycistronic Delivery of IL-10 and NT-3 Promotes Oligodendrocyte Myelination and Functional Recovery in a Mouse Spinal Cord Injury Model," Lonnie D. Shea, PhD, University of Michigan, and coauthors report the development of a new poly(lactide-co-glycolide) (PLG) bridge with an incorporated polycistronic IL-3/NT-3 lentiviral construct. This material was used to stimulate repair in a mouse SCI model. IL-10 was included to successfully stimulate a regenerative phenotype in recruited macrophages, while NT-3 was used to promote axonal survival and elongation. The combined expression was successful; axonal density and myelination were increased, and locomotor functional recovery in mice was improved.

"Inflammation plays a vital role in tissue repair and regeneration, and the use of a PLG bridge to take advantage of the inflammatory response to promote SCI repair is an elegant way to take advantage of these natural processes to improve SCI healing," says Tissue Engineering Co-Editor-in-Chief Antonios G. Mikos, PhD, Louis Calder Professor at Rice University, Houston, TX.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Meeting multiple management goals to maximize coral reef health

While management strategies can be effective at achieving reef fisheries' conservation goals, a new study reveals how increased human pressure makes conservation of coral reef biodiversity truly difficult to achieve. Earth's coral reefs are not only home to some of the planet's richest collections of biodiversity, they also provide food and ecosystem services to millions of people worldwide. Yet, these fragile environments are highly susceptible to the impacts of climate change and increasing human pressure. As a result, many reefs across the globe are in steep states of decline. In order to sustain coral reef ecosystems and the livelihoods of the people who depend on them, there is a need for management tools capable of achieving multiple social and ecological goals related to ecosystem functioning and biodiversity, and to fishery health. However, little is known about how best to implement this type of ecosystem-based management approach, where multiple goals are simultaneously evaluated and pursued. Joshua Cinner and colleagues compiled data from roughly 1,800 tropical reef sites worldwide to better understand the conditions under which reef ecosystems can simultaneously support three key ecological metrics representing reef health and use: reef fisheries, ecological function and biodiversity. Specifically, Cinner et al. evaluated each of these key traits by measuring fish biomass, parrotfish grazing activity and overall fish trait diversity, respectively. The findings suggest that when human use is low, all three traits can be maximized through high conservation levels. However, with increased human use and pressure, it becomes far more challenging to ensure biodiversity conservation - even at the highest levels of protection.
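
A minimal sketch of that kind of multi-goal screening, with an invented data layout and invented thresholds (the study's actual criteria and dataset are not reproduced here):

    import pandas as pd

    # Hypothetical layout: one row per reef site, one column per metric.
    sites = pd.DataFrame({
        "fish_biomass":       [1200, 300, 950, 80],    # kg/ha, invented values
        "parrotfish_grazing": [0.7, 0.2, 0.6, 0.1],    # invented index
        "trait_diversity":    [0.8, 0.5, 0.9, 0.3],    # invented index
    })

    # Invented thresholds for "meeting" each goal.
    goals = {"fish_biomass": 900, "parrotfish_grazing": 0.5, "trait_diversity": 0.7}

    meets_all = (sites[list(goals)] >= pd.Series(goals)).all(axis=1)
    print(f"{meets_all.mean():.0%} of sites meet all three goals")  # -> 50% here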

Credit: 
American Association for the Advancement of Science (AAAS)

Can coral reefs 'have it all'?

image: A blacktip reef shark (Carcharhinus melanopterus) swims in the shallows in the Cocos-Keeling Islands, Indian Ocean.

Image: 
Tane Sinclair-Taylor

Though coral reefs are in sharp decline across the world, scientists say some reefs can still thrive with plentiful fish stocks, high fish biodiversity, and well-preserved ecosystem functions.

An international team, led by Professor Josh Cinner from the ARC Centre of Excellence for Coral Reef Studies at James Cook University (Coral CoE at JCU), assessed around 1,800 tropical reefs from 41 countries across the globe.

"Only five percent of the reefs were simultaneously able to meet the combined goals of providing enough fishing stocks, maintaining biodiversity and a working ecosystem," Prof Cinner said.

"These are like the Hollywood A-listers of coral reefs. They have it all, but they're also rare and live in exclusive areas--remote locations with little human pressure. Our study shows how to help other coral reefs get on that A-list."

The research team assessed if no-fishing marine reserves and other fisheries restrictions helped reefs to meet multiple goals. The study found that implementing such local efforts helped, "but only if the management efforts are in the right locations," Prof Cinner said.

"It's all about location, location, location," he said. "Marine reserves placed in areas with low human pressures had the best results for helping reefs get on the A-list."

"We also had a B-list of reefs, which met all the goals, but to a lesser degree. Reserves in areas with intermediate human pressure made the biggest difference to getting reefs on our B-list. Quite simply, they occurred in less exclusive locations than our A-listers."

However, marine reserves made little difference in areas where the environment was so severely degraded that only wider seascape conservation could help.

Co-author Jessica Zamborain-Mason, a Coral CoE and JCU PhD candidate, says coral reefs worldwide are facing intense degradation due to numerous anthropogenic drivers, such as overfishing, pollution, and climate change.

"There is an increasing need to manage coral reefs to meet multiple goals simultaneously," she said.

"Our findings provide guidance on where to strategically place local management to achieve the greatest benefits."

Co-author Professor Nick Graham from Lancaster University says the study uses data to show what works.

"Coral reef science and management is often focussed on meeting just a single goal," Prof Graham said.

"Managing for just one goal at a time is common, but what if you want it all? The multiple goals of biodiversity, fisheries and functioning ecosystems are often required at any given location, yet the science to understand when and how this can be achieved has been lacking."

"We looked at the fish communities, not the coral communities, and these are affected by different drivers--overfishing really drives the former and climate change the latter."

"The study not only has important implications for the placement of new marine reserves, but is also relevant to future socioeconomic changes, such as how infrastructure development and population growth may impact the efficacy of reef conservation," Prof Cinner said.

"We show where managers will be able to maximise multiple goals, and likewise, where they will be wasting their time."

The study concludes that, while international action on climate change is crucial for ensuring a future for coral-dominated reefs, effective management is also critical to sustaining reefs--and the millions of people whose livelihoods depend on them.

Credit: 
ARC Centre of Excellence for Coral Reef Studies

Flatter graphene, faster electrons

image: Corrugations in graphene slow down the pace of travelling electrons. By pulling on the graphene sheet on two opposite sides, it is flattened and smoothed, and electron transport is improved.

Image: 
Swiss Nanoscience Institute

Bumps on a road slow down our pace, and corrugations in graphene do the same to travelling electrons. By flattening the corrugations out, we help electrons move effectively faster through a graphene sheet.

Limited by microscopic distortions

The sample quality of graphene has improved significantly since the material's discovery. One factor limiting further improvement has so far not been investigated directly: corrugations in the graphene sheet, i.e. microscopic distortions that form even when graphene is placed on atomically flat surfaces. Such corrugations scatter electrons as they move through an electronic device.

The team of Professor Christian Schönenberger at the Swiss Nanoscience Institute and the Department of Physics at the University of Basel has developed a technique to pull the graphene sheet at two opposite sides, thereby flattening and smoothing it. "It is similar to pulling on a piece of crumpled paper, which irons out wrinkles and folds," says Dr. Lujun Wang, first author of the study. "After this process, the electrons travel effectively faster through the graphene sheet; their 'mobility' increases, demonstrating an improved sample quality," adds his supervisor, Dr. Andreas Baumgartner.
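
To make "mobility" concrete, here is a back-of-envelope Drude-model estimate; all numbers are invented, illustrative values, not measurements from the study:

    # Drude-model estimate of carrier mobility: mu = sigma / (n * e).
    # The conductivity and carrier density below are illustrative values only.
    e = 1.602e-19   # elementary charge, C
    sigma = 2.0e-3  # assumed sheet conductivity, S per square
    n = 1.0e16      # assumed carrier density, 1/m^2

    mu = sigma / (n * e)  # in m^2/(V s)
    print(f"mobility ~ {mu * 1e4:.0f} cm^2/(V s)")  # flatter sheet -> higher mu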

These findings not only deepen our understanding of electron transport in graphene but also provide guidance for studying other two-dimensional materials.

Credit: 
Swiss Nanoscience Institute, University of Basel

New 'toolbox' for urological cancer detection

image: Electron microscopy image of the urinary extracellular vesicles studied.

Image: 
Pekka Rappu, University of Turku

Researchers from Ghent University, Belgium, together with researchers from the University of Turku, Finland, have developed a new method for biomarker discovery in urological cancers, which include prostate, bladder and kidney cancers. The method could enable more timely diagnosis and treatment of cancer.

Biomarkers are biological signatures in the body that can indicate the presence of cancer. A promising source of new biomarkers is extracellular vesicles: microscopic vesicles that are released by cancer cells into biological fluids, such as urine.

- Detecting and examining these vesicles in urine has an enormous potential for developing new tests for early detection of urological cancers. However, research related to this is still in its infancy, says Bert Dhondt from Ghent University.

To date, no sufficiently effective method exists for separating extracellular vesicles from urine. Such a method would be essential for investigating these vesicles and using them in patient diagnostics and treatment. This means that extensive laboratory research into these promising biomarkers has not yet been translated into new urine tests that can help patients. The recently published study addresses this problem in several ways.

New 'Toolbox' Helps Map the Composition of Extracellular Vesicles

Researchers concluded that the currently used methods for separating extracellular vesicles from urine are not optimal for detecting new cancer biomarkers. Therefore, they developed a new 'toolbox' to map the composition of urinary extracellular vesicles.

This 'toolbox' consists of a novel method, developed at Ghent University, to separate extracellular vesicles from urine with high purity. In addition, researchers at the University of Turku were involved in developing a method for determining the protein composition of the vesicles.

- We have the know-how and the world's top equipment here at the University of Turku for determining the protein composition of biological samples, whereas the researchers at Ghent University represent the very top in extracellular vesicle research. Therefore, the distribution of work was very clear from the beginning, notes Docent Pekka Rappu from the Department of Biochemistry at the University of Turku.

Researchers applied this new method to urine samples from patients with prostate, bladder and kidney cancer. They established that extracellular vesicles in urine carry protein signatures specific to the various urological cancers. Using this new toolbox, the researchers were also able to map the protein composition of urinary extracellular vesicles in unprecedented detail.

Results Can Accelerate the Development of New Tests

Extracellular vesicles are increasingly being recognized as promising cancer biomarkers. Thanks to this recent research, scientists now have access to a new 'toolbox' that brings us one step closer to the development of promising new urine tests.

- In the future, the results of the study can aid patients with urological cancers through faster diagnosis and timely treatment, concludes Bert Dhondt.

Credit: 
University of Turku

Innovating the peer-review research process

image: Using machine learning and implementing a feedback mechanism can improve the peer-review process for academics.

Image: 
Michigan State University

A team of scientists led by a Michigan State University astronomer has found that a new process of evaluating proposed scientific research projects is as effective - if not more so - than the traditional peer-review method.

Normally, when a researcher submits a proposal, the funding agency asks a number of researchers in that particular field to evaluate it and make funding recommendations - a system that can be a bit bulky and slow, and not quite an exact science.

"As in all human endeavors, this one has it flaws," said Wolfgang Kerzendorf, an assistant professor in MSU's departments of Physics and Astronomy, and Computational Mathematics, Science and Engineering.

In a study detailed in the journal Nature Astronomy, Kerzendorf and colleagues tested a new system that distributes the workload of reviewing project proposals among the proposers, known as the "distributed peer review" approach.

However, the team enhanced it with two other novel features: machine learning to match reviewers with proposals, and a feedback mechanism on the reviews.

Essentially, the system consists of three features designed to improve the peer-review process.

First, when scientists submit a proposal for evaluation, they are asked to review several of their competitors' proposals in return - distributing the workload so that each person has fewer reviews to complete.

"If you lower the number of reviews that every person has to do, they may spend a little more time with each one of the proposals," Kerzendorf said.

Second, by using computers - machine learning - funding agencies can match reviewers with proposals in fields where they are experts. This process can take human bias out of the equation, resulting in more accurate reviews.

"We essentially look at the papers that potential readers have written and then give these people proposals they are probably good at judging," Kerzendorf said. "Instead of a reviewer self-reporting their expertise, the computer does the work."

And third, the team introduced a feedback system in which the person who submitted the proposal can judge if the feedback they received was helpful. Ultimately, this might help the community reward scientists that consistently provide constructive criticism.

"This part of the process is not unimportant," Kerzendorf said. "A good, constructive review is a bit of a bonus, a reward for the work you put in reviewing other proposals."

To do the experiment, Kerzendorf and his team considered 172 submitted proposals, each requesting observing time on telescopes of the European Southern Observatory, a 16-nation ground-based astronomy organization headquartered in Germany.

The proposals were reviewed both in the traditional manner and using distributed peer review. The results? From a statistical standpoint, the two approaches were seemingly indistinguishable.

However, Kerzendorf said this was a novel experiment testing a new approach to peer review, one that could make a difference in the scientific world.

"While we think very critically about science, we sometimes do not take the time to think critically about improving the process of allocating resources in science," he said. "This is an attempt to do this."

Credit: 
Michigan State University

Study estimates revenue produced by top college football players

COLUMBUS, Ohio - The most elite players in college football increase revenue for their school football programs by an average of $650,000 a year, a first-of-its-kind study suggests.

This is the money brought in by the highest-rated recruits coming out of high school - those given five stars by Rivals, a recruiting news service, according to researchers at The Ohio State University.

Four-star recruits generated about $350,000 a year and three-star recruits increased revenue by about $150,000, while two-star recruits actually reduced revenue by about $13,000 a year for college football programs, the study found.

Amid the continuing national debate about compensation for college athletes, this study offers the first solid numbers on the financial impact of players in the highest-revenue college sport, said Trevon Logan, co-author of the study and professor of economics at Ohio State.

"There have been a lot of numbers put out there about how much college athletes should get under various compensation proposals," Logan said.

"But it's hard to do that when you don't know how players affect the bottom line. That's what we're trying to do here."

Logan conducted the study with Stephen Bergman, a former undergraduate student at Ohio State. The study has been accepted for publication in the Journal of Sports Economics.

For the study, the researchers collected a unique dataset from the federal Office of Postsecondary Education that included annual football-specific revenue and expenses from 2002 to 2012 for all Football Bowl Subdivision (FBS) schools - the top level in the sport.

To evaluate the quality of football players, the researchers used the high-school rankings of the players from Rivals. Using these rankings is the best way to rate college players for several reasons, Logan said.

One of the most important is that the service rates both defensive and offensive players the same way. Without access to this type of ranking, it would be nearly impossible for researchers to develop their own method to rate a defensive player's impact on the field on a scale comparable to an offensive player's, he said.

The researchers then calculated the effect of recruit quality on team performance, including wins and college bowl appearances. They then estimated the effects of team performance on total revenue.
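
A stylized sketch of that two-stage logic (recruit quality -> team performance -> revenue). All numbers are fabricated, and the model is deliberately simplified relative to the paper's actual specification:

    import numpy as np

    # Stage 1: regress team performance (wins) on recruit counts by star rating.
    # Stage 2: regress revenue on predicted performance.
    # All data below are fabricated for illustration.
    rng = np.random.default_rng(0)
    n = 200
    stars5 = rng.poisson(1.0, n)  # five-star recruits per team-year
    stars4 = rng.poisson(4.0, n)  # four-star recruits per team-year
    wins = 5 + 0.8 * stars5 + 0.4 * stars4 + rng.normal(0, 1, n)
    revenue = 20 + 1.5 * wins + rng.normal(0, 2, n)  # $ millions

    X1 = np.column_stack([np.ones(n), stars5, stars4])
    b1, *_ = np.linalg.lstsq(X1, wins, rcond=None)

    X2 = np.column_stack([np.ones(n), X1 @ b1])  # intercept + predicted wins
    b2, *_ = np.linalg.lstsq(X2, revenue, rcond=None)

    # Implied revenue per five-star recruit = (wins per recruit) * ($ per win).
    print(f"~${b1[1] * b2[1]:.2f}M per five-star recruit")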

The calculations were completed just before the current college football playoff system was introduced in 2014.

Results showed that five-star recruits had no statistically significant effect on the likelihood of their team getting to a bowl game. This was probably because teams didn't need the best players to get to just any bowl, Logan said.

But a five-star recruit increased the probability of appearing in a Bowl Championship Series (BCS) game - the elite bowls that helped determine a national championship - by more than 4 percent if they played for one of the top schools.

"The best recruits had a significant impact on team performance and their ability to appear in the most lucrative postseason bowls," Logan said.

The study estimated that $650,000 was generated by five-star recruits because of the wins, bowl appearances, BCS bowl appearances and premier bowls that they helped their schools achieve - all of which bring additional revenue to their schools.

For some analyses, the researchers controlled for the fact that football powerhouses like Alabama or Ohio State tend to attract more of the highest-rated players than other schools.

That means that the revenue value of any individual elite player at a top school wouldn't be as high as it would be at other schools.

But the value would still be high, Logan said. When the school effects were taken into account, each five-star recruit still increased revenue by nearly $200,000 a year, while four-star recruits were responsible for nearly $90,000 a year.

The conferences that schools participated in also affected revenue, because many conferences share money earned with all their members, regardless of performance. The researchers also took this into account in their analyses.

Logan said it isn't possible to come up with clear compensation policy recommendations based just on the results of this paper.

One important issue is that the revenue from football supports many other college sports that don't make money, he said.

"If you pay players, especially based on how much they generate, you will also have to reduce the number of other sports available," Logan said.

"What our study can do is bring some hard data to the discussions about compensation."

Credit: 
Ohio State University

Insight into the synapses

image: The distribution of the glutamate receptor mGluR4 and other proteins in the presynaptic membrane. Left, a super-resolution dSTORM image; right, the result obtained with conventional fluorescence microscopy, in which molecular details are not visible.

Image: 
(Picture: Chair Markus Sauer / University of Würzburg)

When people think of glutamate, the first thing that comes to mind is the flavour enhancer often used in Asian cuisine. But glutamate is also an important messenger substance in the human nervous system, where it plays a role in learning processes and memory. Some Alzheimer's drugs, for example, slow down the progression of the disease by inhibiting the effect of glutamate.

In the nervous system, glutamate acts as a signal transmitter at the synapses. There, it binds to specific receptors of which there are several types. The metabotropic glutamate receptor of type 4 (mGluR4) plays a decisive role in this system.

Direct contact to other proteins

Until now, not much was known about the distribution of this receptor in the active zones of synapses. It is now clear that the majority of mGluR4 receptors are located in groups of one to two units on average in the presynaptic membrane. There they are often in direct contact with calcium channels and the protein Munc-18-1, which is important for the release of messengers.

This is reported in the journal Science Advances by a research team led by Professor Markus Sauer from the Biocenter of Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany, and Professor Davide Calebiro from the University of Birmingham in England. "Our data indicate that the direct contact of mGluR4 receptors with other key proteins plays a major role in the regulation of synapse activity," says Professor Sauer.

Active zones are densely packed

The new knowledge was gained with the super-resolution microscopy method dSTORM (direct stochastic optical reconstruction microscopy). The method was developed by Sauer's team in 2008. It enables individual molecules to be located even in the very small and densely packed active zones of synapses. This is not possible with conventional light microscopy because of the diffraction limit of 200 nanometers.
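
The 200-nanometer figure follows from the Abbe diffraction criterion; a quick plug-in with typical visible-light values (illustrative, not specific to this study) reproduces it:

    # Abbe diffraction limit: d = wavelength / (2 * NA).
    # Typical visible-light microscopy values (illustrative).
    wavelength_nm = 550       # green light
    numerical_aperture = 1.4  # high-end oil-immersion objective

    d = wavelength_nm / (2 * numerical_aperture)
    print(f"resolution limit ~ {d:.0f} nm")  # -> ~196 nm, i.e. roughly 200 nm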

"For the first time we now have insights into the molecular organisation of the complex protein machines that control the signal transmission at the synapses of our brain," says Professor Calebiro. Only with this knowledge will we be able to understand how the brain functions and how it processes information on different time scales.

The research teams will now use dSTORM to find out how all the proteins are distributed in the active synaptic zone. It is generally assumed that more than 100 proteins are involved in signal transmission in the active zones.

Credit: 
University of Würzburg

A new tool to predict volcanic eruptions

image: Geysers in Yellowstone National Park attest to the presence of a supervolcano, which is currently dormant. An eruption of this explosive volcano would impact the entire planet.

Image: 
© P.H. Barry

Earth's atmosphere is made up of 78% nitrogen and 21% oxygen, a mixture that is unique in the Solar System. The oxygen was produced by some of the first living organisms. But where did the nitrogen come from? Did it escape from Earth's mantle through volcanic activity? To try to answer these questions, Jabrane Labidi, a CNRS researcher at the Institut de Physique du Globe de Paris (CNRS/IPGP/IGN), and his colleagues collected samples of gas from several volcanic sites on our planet. Their study, published on 16 March 2020 in the journal Nature, shows that nitrogen from magma formed within the mantle does not have the same isotopic composition as atmospheric nitrogen, implying that the latter does not come from degassing of the mantle. However, the team was able to use these measurements to identify, in geysers, fumaroles and other phenomena involving volcanic gases, the contribution of the atmosphere (in the form of heated rainwater) and that of Earth's mantle (magmatic gas): for instance, small amounts of magmatic gas were detected in geysers in Yellowstone National Park, indicating renewed activity. This highly precise data could therefore help to predict future volcanic eruptions. Samples continue to be collected at Yellowstone, and more sampling will be carried out in fumaroles on the Mayotte islands, near which a new submarine volcano recently emerged. As for the origin of atmospheric nitrogen, it remains a mystery...for now.
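
The atmosphere-versus-mantle attribution rests on two-end-member isotope mixing. A minimal sketch with invented delta-15N values (the study's actual end members and measurements are not reproduced here, and real calculations also weight by nitrogen concentration):

    # Two-component mixing: a gas sample's nitrogen isotope signature is a
    # blend of atmospheric and magmatic (mantle) end members.
    # All delta-15N values below are invented, in per mil.
    d15N_air = 0.0      # atmospheric end member
    d15N_mantle = -5.0  # assumed mantle end member
    d15N_sample = -0.5  # hypothetical geyser-gas measurement

    # Linear mixing: d_sample = f * d_mantle + (1 - f) * d_air
    f_magmatic = (d15N_sample - d15N_air) / (d15N_mantle - d15N_air)
    print(f"magmatic fraction ~ {f_magmatic:.0%}")  # -> 10%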

Credit: 
CNRS

N-doped porous carbon supported Fe single atom catalysts for highly efficient ORR

image: Schematic illustration of the preparation, characterization and electrocatalytic performance testing of nitrogen-doped porous carbon supported Fe single atom catalysts.

Image: 
©Science China Press

Noble metals (e.g., platinum) are often used as catalysts in the oxygen reduction reaction (ORR) at fuel cell cathodes. However, drawbacks such as high cost, susceptibility to CO poisoning and poor stability limit their industrialization and application. It is therefore urgent to develop new types of oxygen reduction catalysts to replace platinum.

Nitrogen-doped porous carbon supported single atom catalysts (SACs), especially Fe and Co SACs, have become one of the most promising alternatives to precious metal catalysts for the ORR due to their unique geometric/electronic structures and outstanding performance. However, most synthesis routes involve tedious pre- and/or post-treatments, especially those derived from porphyrin-based materials, which increase operational difficulty and can even obscure the relationship between the structures and activities of the catalysts.

Therefore, the rational design of synthesis routes, the achievement of high efficiency in electrocatalytic reactions, and the exploration of catalytic mechanisms and active sites have become central research focuses for SACs in fuel cells.

Very recently, the group of Professor Hongbing Ji and Dr. Xiaohui He at the Fine Chemical Industry Research Institute of Sun Yat-sen University demonstrated a facile precursor-dilution strategy to prepare nitrogen-doped porous carbon supported Fe SACs through a Schiff-base reaction, via co-polycondensation of amino-porphyrin materials followed by pyrolysis at high temperature.

Aberration-corrected high-angle annular dark-field scanning transmission electron microscopy and synchrotron radiation measurements determined that the Fe atoms are atomically dispersed in the support, forming an FeN4O-like structure. The catalyst is superior to commercial 20 wt% Pt/C in terms of ORR activity, stability and methanol resistance under alkaline conditions, and shows moderate ORR activity under acidic conditions.

The structure-activity relationship and catalytic mechanism of the catalyst were further verified by KSCN poisoning, CO poisoning and comparisons of catalytic activity with reference samples (pure carbon supports without metal loading, iron nanoclusters and iron nanoparticles), which confirmed that the active centers for electrocatalytic oxygen reduction are the atomically dispersed Fe species.

Credit: 
Science China Press

Two is better than one

image: The team of scientists worked together with Eli Stavitski (left) and Yonghua Du (right) to "see" the lighter elements in their catalyst at the Tender Energy X-ray Absorption Spectroscopy (TES) beamline at the National Synchrotron Light Source II (NSLS-II).

Image: 
Brookhaven National Laboratory

UPTON, NY - A collaboration of scientists from the National Synchrotron Light Source II (NSLS-II)--a U.S. Department of Energy (DOE) Office of Science user facility at DOE's Brookhaven National Laboratory--Yale University, and Arizona State University has designed and tested a new two-dimensional (2-D) catalyst that can be used to improve water purification using hydrogen peroxide. While water treatment with hydrogen peroxide is environmentally friendly, the two-part chemical process that drives it is not very efficient. So far, scientists have struggled to improve the efficiency of the process through catalysis because each part of the reaction needs its own catalyst--called a co-catalyst--and the co-catalysts can't be next to each other.

"Our overarching goal is to develop a material that increases the efficiency of the process so that no additional chemical treatment of the water would be necessary. This would be particularly useful for systems that are off-the-grid and far away from urban centers," said Jaehong Kim, Henry P. Becton Sr. Professor of Engineering and Chair of Department of Chemical and Environmental Engineering at Yale University. Kim is also a member of Nanosystems Engineering Research Center for Nanotechnology-Enabled Water Treatment (NEWT), which partly supported this research.

In their recent paper, published on March 11 in Proceedings of the National Academy of Sciences (PNAS), the team presented the design for the new 2-D catalyst and revealed its structure through measurements at NSLS-II. The trick of their new design is that the scientists managed to place two co-catalysts--one for each part of the reaction--onto two different locations on a thin nanosheet.

"Many processes need two reactions in one. This means that you need two co-catalysts. However, the challenge is that the two co-catalysts have to stay separated, otherwise they'll interact with each other and create a negative effect on the efficiency of the whole process," said Eli Stavitski, a chemist and beamline scientist at NSLS-II.

In many cases, catalysts are made from a large number of atoms to form a catalytic nanomaterial, which may seem small to a human but, in the world of chemical reactions, are still fairly large. Therefore, placing two of these materials next to each other without them interacting is quite challenging. To solve this challenge, the team took a different route.

"We used a thin nanosheet to co-host two co-catalysts for the different parts of the reaction. The beauty is in its simplicity: one of the co-catalysts--a single cobalt (Co) atom--sits in the center of the sheet, whereas the other one, a molecule called anthraquinone, is placed around the edges. This would not be possible with catalysts made of nanomaterials- since they would be 'too big' for this purpose," said Kim.

Kim and his team at Yale synthesized this new 2-D catalyst in their lab following a precise series of chemical reactions, heating, and separating steps.

After the scientists synthesized the new two-in-one catalyst, they needed to figure out if the co-catalysts would stay separated during an actual reaction and how well this new 2-D catalyst would perform. However, to really 'see' the atomic structure and chemical properties of their two-in-one catalyst in action, the scientists needed two different kinds of x-rays: hard x-rays and tender x-rays. Just like visible light, x-rays come in different colors--or wavelengths--and instead of calling them blue or red, they are called hard, tender, or soft.
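
For orientation, "hard" and "tender" refer to photon energy bands; the conversion between photon energy and wavelength is a one-liner. The sample energies below are generic illustrations, not the beamlines' actual operating ranges:

    # Photon wavelength from energy: lambda[Angstrom] = 12.398 / E[keV].
    # Roughly: tender x-rays ~1-5 keV, hard x-rays above ~5 keV.
    for e_kev in (2.0, 10.0):  # a tender and a hard x-ray energy
        wavelength = 12.398 / e_kev
        print(f"{e_kev:>5.1f} keV -> {wavelength:.2f} Angstrom")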

"Human eyes cannot see ultraviolet or infrared light and we need special cameras to see them. Our instruments are not able to 'see' both hard and tender x-rays at the same time. So, we needed two different instruments--or beamlines--to investigate the catalyst's materials using different x-rays," said Stavitski.

The scientists started their investigation at NSLS-II's hard x-ray Inner Shell Spectroscopy (ISS) beamline using a technique called x-ray absorption spectroscopy. This technique helped the team to learn more about the local structure of the new 2-D catalyst. Specifically, they found out how many neighboring atoms each co-catalyst has, how far away these neighbors are, and how they are connected to each other.

The next stop in the investigation was NSLS-II's Tender Energy X-ray Absorption Spectroscopy (TES) beamline.

"By using the same technique at TES with tender x-rays instead of hard x-rays, we could see the light elements clearly. Traditionally, many catalysts are made from heavy elements such as cobalt, nickel, or platinum, which we can study using hard x-rays, however our 2-D catalyst also includes important lighter elements such as phosphorous. So, to learn more about the role of this lighter element in our two-in-one catalyst, we also needed tender x-rays," said Yonghua Du, a physicist and TES beamline scientist.

NSLS-II's TES beamline is one of the few instruments in the U.S. that can complement hard x-ray capabilities by offering tender x-ray imaging and spectroscopy.

After their experiments, the scientists wanted to be sure that they understood how the catalyst worked and decided to simulate different candidate structures and their properties.

"We used an approach called density functional theory to understand the structures and the mechanisms that controls the efficiency of the reaction. Based on what we learned through the experiments and what we know about how atoms interact with each other, we simulated several candidate structures to determine which one was most plausible," said Christopher Muhich, assistant professor of chemical engineering at Arizona State University and also a member of NEWT.

Only by combining their expertise in synthesis, analytical experimentation, and theoretical simulation could the team create their new 2-D catalyst and demonstrate its efficiency. The team agrees that collaboration was the key to their success, and they will continue searching for the next generation of catalysts for various environmental applications.

Credit: 
DOE/Brookhaven National Laboratory

Turned-down temperatures boost crops' penchant for production

URBANA, Ill. - Drought and heat put stress on plants and reduce grain yield. For some farmers, irrigation is the answer. Many of us assume the practice boosts crop yields by delivering soil water, but it turns out irrigation's cooling effect on crops is important in its own right.

In a recent U.S.-based study, a research team led by University of Illinois scientists discovered 16% of the yield increase from irrigation is attributable to cooling alone.

"This study highlights the non-negligible contribution of cooling to the yield benefits of irrigation. Such an effect may become more important in the future with continued warming and more frequent droughts," says Yan Li, lead author on the Global Change Biology study and a former postdoctoral associate in the Department of Natural Resources and Environmental Sciences at Illinois. Yan is now an associate professor at Beijing Normal University.

Irrigation cools crops through the combined effects of transpiration - water loss through tiny holes in leaves called stomata - and evaporation from soil. Transpiration can only occur when there is a sufficient soil water supply; when roots sense dry soil, plants close their stomata to prevent water loss. When this happens for long enough, plants heat up and suffer drought stress, sometimes leading to yield reductions.

With soil water supply and temperature so inextricably linked, the researchers had to develop an innovative approach to separate out the cooling effect on yield. The team analyzed satellite-derived crop temperature and biomass data, irrigation maps, and county-level corn yield in Nebraska between 2003 and 2016. By comparing irrigated and rainfed sites, they found irrigation bumped yield by 81%, of which 16% was attributable to cooling and 84% to water supply. Irrigation also lowered July land surface temperatures by 1.63 degrees Celsius relative to rainfed sites.
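
The decomposition reported above can be restated as simple arithmetic (the percentages come from the article; converting them to percentage points of yield is my own restatement):

    # Decomposing the irrigation yield benefit (figures from the article).
    total_yield_bump = 0.81  # irrigation raised yield by 81%
    cooling_share = 0.16     # 16% of the benefit attributed to cooling
    water_share = 0.84       # 84% attributed to soil water supply

    cooling_pts = total_yield_bump * cooling_share  # ~13 points of yield
    water_pts = total_yield_bump * water_share      # ~68 points of yield
    print(f"cooling: {cooling_pts:.1%}, water supply: {water_pts:.1%}")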

"Crops grown in the irrigated part of the U.S. Corn Belt, mostly in Nebraska and some other western states in the Midwest, all feel the cooling benefit," says Kaiyu Guan, principal investigator of the project and Blue Waters professor in the Department of Natural Resources and Environmental Sciences and the National Center for Supercomputing Applications at Illinois. "Crops feel happier when they are cooler."

According to Li, teasing out irrigation's dual contributions to yield will allow better yield forecasting into the future. The cooling effect has largely been neglected in previous crop models.

"When yield forecasters develop their models, they should realize cooling is another important benefit for irrigated cropland, and they need to take that into account. Otherwise, they may underestimate the irrigation benefit on yield," he says.

Guan adds, "This matters not only for currently irrigated cropland like Nebraska. Under a warming climate, we envision that crops will need more water to grow the same biomass, and parts of the Corn Belt that are currently rainfed, like Iowa and Illinois, may also need irrigation."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences