Tech

Early modern humans cooked starchy food in South Africa, 170,000 years ago

image: Hypoxis angustifolia growth habit.

Image: 
Prof. Lyn Wadley/Wits University

"The inhabitants of the Border Cave in the Lebombo Mountains on the KwaZulu-Natal/eSwatini border were cooking starchy plants 170 thousand years ago," says Professor Lyn Wadley, a scientist from the Wits Evolutionary Studies Institute at the University of the Witwatersrand, South Africa (Wits ESI). "This discovery is much older than earlier reports for cooking similar plants and it provides a fascinating insight into the behavioural practices of early modern humans in southern Africa. It also implies that they shared food and used wooden sticks to extract plants from the ground."

"It is extraordinary that such fragile plant remains have survived for so long," says Dr Christine Sievers, a scientist from the University of the Witwatersrand, who completed the archaeobotanical work with Wadley. The underground food plants were uncovered during excavations at Border Cave in the Lebombo Mountains (on the border of KwaZulu-Natal Province, South Africa, and eSwatini [formerly Swaziland]), where the team has been digging since 2015. During the excavation, Wadley and Sievers recognised the small, charred cylinders as rhizomes. All appear to belong to the same species, and 55 charred, whole rhizomes were identified as Hypoxis, commonly called the Yellow Star flower. "The most likely of the species growing in KwaZulu-Natal today is the slender-leafed Hypoxis angustifolia that is favoured as food," adds Sievers. "It has small rhizomes with white flesh that is more palatable than the bitter, orange flesh of rhizomes from the better known medicinal Hypoxis species (incorrectly called African Potato)."

The Border Cave plant identifications were made on the size and shape of the rhizomes and on the vascular structure examined under a scanning electron microscope. Modern Hypoxis rhizomes and their ancient counterparts have similar cellular structures and the same inclusions of microscopic crystal bundles, called raphides. The features are still recognisable even in the charred specimens. Over a four-year period, Wadley and Sievers made a collection of modern rhizomes and geophytes from the Lebombo area. "We compared the botanical features of the modern geophytes and the ancient charred specimens, in order to identify them," explains Sievers.

Hypoxis rhizomes are nutritious and carbohydrate-rich, with an energy value of approximately 500 kJ/100 g. While they are edible raw, the rhizomes are fibrous and have high fracture toughness until they are cooked. The rhizomes are rich in starch and would have been an ideal staple plant food. "Cooking the fibre-rich rhizomes would have made them easier to peel and to digest, so more of them could be consumed and the nutritional benefits would be greater," says Wadley.

Wooden digging sticks used to extract the plants from the ground

"The discovery also implies the use of wooden digging sticks to extract the rhizomes from the ground. One of these tools was found at Border Cave and is directly dated at circa 40,000 years ago," says co-author of the paper and co-director of the excavation, Professor Francesco d'Errico, (Centre National de la Recherche Scientifique (CNRS), Université de Bordeaux, France and University of Bergen, Norway). Dr Lucinda Backwell (Instituto Superior de Estudios Sociales, ISES-CONICET, Tucumán, Argentina) also co-authored the paper and was a co-director of the excavation.

The plants were cooked and shared

The Hypoxis rhizomes were mostly recovered from fireplaces and ash dumps rather than from the surrounding sediment. "The Border Cave inhabitants would have dug Hypoxis rhizomes from the hillside near the cave, and carried them back to the cave to cook them in the ashes of fireplaces," says Wadley. "The fact that they were brought back to the cave rather than cooked in the field suggests that food was shared at the home base. The rhizomes were most likely roasted in the ashes and, in the process, some were lost. While the evidence for cooking is circumstantial, it is nonetheless compelling."

Discoveries at Border Cave

This new discovery adds to the long list of important finds at Border Cave. The site has been repeatedly excavated since Raymond Dart first worked there in 1934. Amongst earlier discoveries were the burial of a baby with a Conus seashell, dated to 74,000 years ago, a variety of bone tools, an ancient counting device, ostrich eggshell beads, resin, and poison that may once have been used on hunting weapons.

The Border Cave Heritage Site

Border Cave is a heritage site with a small site museum. The cave and museum are open to the public, though bookings are essential [Olga Vilane (+27) (0) 72 180 4332]. Wadley and her colleagues hope that the Border Cave discovery will emphasise the importance of the site as an irreplaceable cultural resource for South Africa and the rest of the world.

About Hypoxis angustifolia

Hypoxis angustifolia is evergreen, so it is visible year-round, unlike the more common deciduous Hypoxis species. It thrives in a variety of modern habitats and is thus likely to have had as wide a distribution in the past as it does today. It occurs in sub-Saharan Africa, South Sudan, some Indian Ocean islands, and as far afield as Yemen. Its presence in Yemen may imply an even wider distribution of this Hypoxis plant during previous humid conditions. Hypoxis angustifolia rhizomes grow in clumps, so many can be harvested at once. "All of the rhizome's attributes imply that it could have provided a reliable, familiar food source for early humans trekking within Africa, or even out of Africa," says Wadley. Hunter-gatherers tend to be highly mobile, so the wide distribution of a potential staple plant food would have ensured food security.

Credit: 
University of the Witwatersrand

REE mineral-bearing rocks found in eastern Mojave Desert

Boulder, Colo., USA: Scientists from the U.S. Geological Survey (USGS) have mapped a rare earth element deposit of magmatic carbonatite located in the Mountain Pass region of the eastern Mojave Desert. The new report details the geophysical and geological setting of the deposit, including a map of the deposit's subsurface extent, to help land-use managers evaluate sites for further exploration. The report was recently published in the Geological Society of America's online journal, Geosphere.

Rare earth elements (REEs) are critical to emerging industrial technologies, including strategic defense, scientific and medical instrumentation, automotive and transportation systems, and civilian electronics. However, large, economically viable REE deposits are rare worldwide. International concerns about increasing demand and global supply vulnerability have prompted many countries, including the U.S., to explore and assess domestic REE resources. Increased efforts to characterize geologic processes related to REE deposits in the U.S. have focused attention on the world-class Mountain Pass, California, deposit located approximately 60 miles southwest of Las Vegas, Nevada.

In their study, K.M. Denton and USGS colleagues use geophysical and geological techniques to image geologic structures related to REE mineral-bearing rocks at depth. Their work suggests REE minerals occur along a fault zone or geologic contact near the eastern edge of the Mescal Range. These findings could serve as a useful guide for future exploration efforts.

Credit: 
Geological Society of America

A quantum breakthrough brings a technique from astronomy to the nano-scale

image: The discovery of multi-messenger nanoprobes allows scientists to simultaneously probe multiple properties of quantum materials at nanometer-scale spatial resolutions.

Image: 
Ella Maru Studio

Researchers at Columbia University and University of California, San Diego, have introduced a novel "multi-messenger" approach to quantum physics that signifies a technological leap in how scientists can explore quantum materials.

The findings appear in a recent article published in Nature Materials, led by A. S. McLeod, postdoctoral researcher, Columbia Nano Initiative, with co-authors Dmitri Basov and A. J. Millis at Columbia and R.A. Averitt at UC San Diego.

"We have brought a technique from the inter-galactic scale down to the realm of the ultra-small," said Basov, Higgins Professor of Physics and Director of the Energy Frontier Research Center at Columbia. "Equipped with multi-modal nanoscience tools, we can now routinely go places no one thought would be possible as recently as five years ago."

The work was inspired by "multi-messenger" astrophysics, which emerged during the last decade as a revolutionary technique for the study of distant phenomena like black hole mergers. Simultaneous measurements from instruments including infrared, optical, X-ray and gravitational-wave telescopes can, taken together, deliver a physical picture greater than the sum of their individual parts.

The search is on for new materials that can supplement the current reliance on electronic semiconductors. Control over material properties using light can offer improved functionality, speed, flexibility and energy efficiency for next-generation computing platforms.

Experimental papers on quantum materials have typically reported results obtained by using only one type of spectroscopy. The researchers have shown the power of using a combination of measurement techniques to simultaneously examine electrical and optical properties.

The researchers performed their experiment by focusing laser light onto the sharp tip of a needle probe coated with magnetic material. When thin films of metal oxide are subject to a unique strain, ultra-fast light pulses can trigger the material to switch into an unexplored phase of nanometer-scale domains, and the change is reversible.

By scanning the probe over the surface of their thin film sample, the researchers were able to trigger the change locally and simultaneously manipulate and record the electrical, magnetic and optical properties of these light-triggered domains with nanometer-scale precision.

The study reveals how unanticipated properties can emerge in long-studied quantum materials at ultra-small scales when scientists tune them by strain.

"It is relatively common to study these nano-phase materials with scanning probes. But this is the first time an optical nano-probe has been combined with simultaneous magnetic nano-imaging, and all at the very low temperatures where quantum materials show their merits," McLeod said. "Now, investigation of quantum materials by multi-modal nanoscience offers a means to close the loop on programs to engineer them."

Credit: 
Columbia University

Tests measure solar panel performance beyond established standards

image: The PV field-testing facility located at the Max Planck Institute for Chemical Energy Conversion (MPI-CEC), located in Mülheim an der Ruhr, Germany. The rooftop installation includes five different inorganic PV technologies.

Image: 
Thomas Hobirk, MPI-CEC

WASHINGTON, D.C., January 2, 2020 -- Photovoltaics used in solar panels are sensitive to environmental factors and often suffer degradation over time. International Electrotechnical Commission standards for accelerated degradation do not include field tests. While some testing facilities have made data available, much of the data needed to make business decisions for PV is not available publicly.

In testing solar panels, the sun's intensity, the spectral composition and the angle of light are important factors in understanding why certain panels are successful and others degrade more quickly. Tests must also include many parameters beyond just temperature.

To address the knowledge gap in degradation mechanisms for various PV types, researchers performed tests over five years in which they collected weather data and panel performance information. These data points were processed using aggregation and regression algorithms and filtering masks to understand the change over time. The results are published in the Journal of Renewable and Sustainable Energy, from AIP Publishing.

"Our study highlights that one of the proposed methods of tackling this problem, i.e., applying the irradiance mask, might add bias to the data without decreasing the spread," author Peter Kraus said. "What we were surprised by was that a simple data aggregation to a longer time interval, coupled with the year-on-year method for calculating degradation rates, yielded reasonable results that were validated when the pyranometer data was excluded."
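The year-on-year calculation the authors mention can be sketched in a few lines. This is an illustrative reconstruction, not the published analysis code, and the example performance-ratio series is invented:

```python
# Sketch of the year-on-year (YOY) method for estimating PV degradation:
# each monthly performance-ratio value is compared with the value exactly
# one year later, and the median of those yearly changes is taken as the
# degradation rate. Pairing same-calendar months cancels seasonal effects,
# and the median keeps outliers from skewing the estimate.

def yoy_degradation(monthly_pr):
    """monthly_pr: monthly performance ratios, oldest first."""
    rates = []
    for i in range(len(monthly_pr) - 12):
        a, b = monthly_pr[i], monthly_pr[i + 12]
        rates.append((b - a) / a)  # fractional change per year
    rates.sort()
    mid = len(rates) // 2
    if len(rates) % 2:
        return rates[mid]
    return (rates[mid - 1] + rates[mid]) / 2

# Invented example: a panel losing about 1% of performance per year
pr_series = [0.80 * (1 - 0.01) ** (m / 12) for m in range(36)]
print(round(yoy_degradation(pr_series), 4))  # -0.01
```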

The tests were performed at a PV power plant installed at the Max Planck Institute for Chemical Energy Conversion in Mülheim an der Ruhr, Germany. The rooftop installation includes five inorganic PV technologies: micromorph thin-film silicon, cadmium telluride, copper-indium-gallium-selenide, polycrystalline silicon and amorphous silicon.

Pyranometers are sensors used to measure sunlight irradiance, but they are prone to errors and malfunction. As a result, they must be regularly checked and calibrated.

To handle this problem, the researchers employed an open-source testing methodology created by the National Renewable Energy Laboratory and SunPower Corporation that is based on clear-sky irradiance: the solar irradiance expected at a given location under ideal, cloud-free conditions.

They compared performance ratios based on measured real-world data and data modeled using clear sky irradiance to show the difference between datasets, highlight data inconsistencies and report accurate performance over time.
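The performance-ratio comparison can be illustrated with the standard definition. This sketch uses the common IEC 61724-style formula and invented numbers; the authors' actual pipeline is more involved:

```python
# The performance ratio (PR) normalizes measured energy yield by the
# yield expected from the irradiation the array actually received, so
# degradation can be compared across technologies and sites. Computing
# PR twice -- once with pyranometer data, once with modeled clear-sky
# irradiance -- exposes sensor drift and data inconsistencies.

STC_IRRADIANCE_KW_M2 = 1.0  # standard test conditions: 1000 W/m^2

def performance_ratio(energy_kwh, rated_kw, irradiation_kwh_per_m2):
    """PR = actual yield / yield expected at the array's rated power."""
    expected_kwh = rated_kw * irradiation_kwh_per_m2 / STC_IRRADIANCE_KW_M2
    return energy_kwh / expected_kwh

# Hypothetical month: a 5 kW array produces 600 kWh
pr_measured = performance_ratio(600.0, 5.0, 150.0)  # pyranometer irradiation
pr_modeled = performance_ratio(600.0, 5.0, 160.0)   # clear-sky model
print(round(pr_measured, 2), round(pr_modeled, 2))  # 0.8 0.75
```

A persistent gap between the two PR values, as in this made-up month, is the kind of inconsistency the comparison is designed to highlight.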

The authors plan to continue to produce detailed data on the PV plant to expand the dataset over longer periods of time and bring raw performance data into the open where it can be used to improve the technology.

Credit: 
American Institute of Physics

Study finds dopamine, biological clock link to snacking, overeating and obesity

During the years 1976 through 1980, 15% of U.S. adults were obese. Today, about 40% of adults are obese. Another 33% are overweight.

Coinciding with this increase in weight are ever-rising rates of heart disease, diabetes, cancer and health complications caused by obesity, such as hypertension. Even Alzheimer's disease may be partly attributable to obesity and physical inactivity.

"The diet in the U.S. and other nations has changed dramatically in the last 50 years or so, with highly processed foods readily and cheaply available at any time of the day or night," Ali Güler, a professor of biology at the University of Virginia, said. "Many of these foods are high in sugars, carbohydrates and calories, which makes for an unhealthy diet when consumed regularly over many years."

In a study published Thursday in the journal Current Biology, Güler and his colleagues demonstrate that the pleasure center of the brain that produces the chemical dopamine, and the brain's separate biological clock that regulates daily physiological rhythms, are linked, and that high-calorie foods - which bring pleasure - disrupt normal feeding schedules, resulting in overconsumption. Using mice as study models, the researchers mimicked the 24/7 availability of a high-fat diet, and showed that anytime snacking eventually results in obesity and related health problems.

Güler's team found that mice fed a diet comparable to a wild diet in calories and fats maintained normal eating and exercise schedules and proper weight. But mice fed high-calorie diets laden with fats and sugars began "snacking" at all hours and became obese.

Additionally, so-called "knockout" mice that had their dopamine signaling disrupted - meaning they didn't seek the rewarding pleasure of the high-fat diet - maintained a normal eating schedule and did not become obese, even when presented with the 24/7 availability of high-calorie feeds.

"We've shown that dopamine signaling in the brain governs circadian biology and leads to consumption of energy-dense foods between meals and during odd hours," Güler said.

Other studies have shown, Güler said, that when mice feed on high-fat foods between meals or during what should be normal resting hours, the excess calories are stored as fat much more readily than the same number of calories consumed only during normal feeding periods. This eventually results in obesity and obesity-related diseases, such as diabetes.

Speaking of the modern human diet, Güler said, "The calories of a full meal may now be packed into a small volume, such as a brownie or a super-size soda. It is very easy for people to over-consume calories and gain excessive weight, often resulting in obesity and a lifetime of related health problems.

"Half of the diseases that affect humans are worsened by obesity. And this results in the need for more medical care and higher health care costs for individuals, and society."

Güler said the human body, through thousands of years of evolution, is hard-wired to consume as much food as possible as long as it's available. He said this comes from a long earlier history when people hunted or gathered food and had brief periods of plenty, such as after a kill, and then potentially lengthy periods of famine. Humans also were potential prey to large animals and so actively sought food during the day, and sheltered and rested at night.

"We evolved under pressures we no longer have," Güler said. "It is natural for our bodies as organisms to want to consume as much as possible, to store fat, because the body doesn't know when the next meal is coming.

"But, of course, food is now abundant, and our next meal is as close as the kitchen, or the nearest fast-food drive-through, or right here on our desk. Often, these foods are high in fats, sugars, and therefore calories, and that's why they taste good. It's easy to overconsume, and, over time, this takes a toll on our health."

Additionally, Güler said, prior to the advent of our electricity-powered society, people started the day at dawn, worked all day, often doing manual labor, and then went to sleep with the setting of the sun. Human activity, therefore, was synchronized to day and night. Today, we are working, playing, staying connected - and eating - day and night. This, Güler said, affects our body clocks, which evolved to operate on a sleep-wake cycle timed to daytime activity, moderate eating and nighttime rest.

"This lights-on-all-the-time, eat-at-any-time lifestyle recasts eating patterns and affects how the body utilizes energy," he said. "It alters metabolism - as our study shows - and leads to obesity, which causes disease. We're learning that when we eat is just as important as how much we eat. A calorie is not just a calorie. Calories consumed between meals or at odd hours become stored as fat, and that is the recipe for poor health."

Credit: 
University of Virginia

ORNL researchers advance performance benchmark for quantum computers

image: An ORNL research team is developing a universal benchmark for the accuracy and performance of quantum computers based on quantum chemistry simulations. The benchmark will help the community evaluate and develop new quantum processors. (Below left: schematic of one of the quantum circuits used to test the RbH molecule. Top left: molecular orbitals used. Top right: actual results obtained using the bottom left circuit for RbH).

Image: 
Oak Ridge National Laboratory

Researchers at the Department of Energy's Oak Ridge National Laboratory have developed a quantum chemistry simulation benchmark to evaluate the performance of quantum devices and guide the development of applications for future quantum computers.

Their findings were published in npj Quantum Information.

Quantum computers use the laws of quantum mechanics and units known as qubits to greatly expand the amount of information that can be encoded and processed. Whereas a traditional "bit" has a value of either 0 or 1, a qubit can be placed in a superposition of 0 and 1 - any combination of the two at once - allowing a vast number of possibilities for storing data.
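A toy sketch (no quantum hardware involved, purely to illustrate the idea above): a single qubit's state can be written as a pair of amplitudes whose squared magnitudes give the probabilities of reading out 0 or 1.

```python
# Toy model of a qubit: a pair of complex amplitudes (a, b) with
# |a|^2 + |b|^2 = 1, where |a|^2 and |b|^2 are the probabilities of
# measuring 0 and 1. n qubits require 2**n amplitudes, which is where
# the "vast number of possibilities" for storing data comes from.
import math

def measure_probs(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

zero = (1.0, 0.0)                             # behaves like a classical 0
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))   # equal superposition of 0 and 1

print(measure_probs(zero))   # (1.0, 0.0)
print(measure_probs(plus))   # roughly (0.5, 0.5): both outcomes equally likely
```

The exponential growth of the amplitude list with qubit count is the informal reason a classical simulation of many qubits quickly becomes intractable.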

While still in their early stages, quantum systems have the potential to be exponentially more powerful than today's leading classical computing systems and promise to revolutionize research in materials, chemistry, high-energy physics, and across the scientific spectrum.

But because these systems are in their relative infancy, understanding what applications are well suited to their unique architectures is considered an important field of research.

"We are currently running fairly simple scientific problems that represent the sort of problems we believe these systems will help us to solve in the future," said ORNL's Raphael Pooser, principal investigator of the Quantum Testbed Pathfinder project. "These benchmarks give us an idea of how future quantum systems will perform when tackling similar, though exponentially more complex, simulations."

Pooser and his colleagues calculated the bound state energy of alkali hydride molecules on 20-qubit IBM Tokyo and 16-qubit Rigetti Aspen processors. These molecules are simple and their energies well understood, allowing the researchers to effectively test the performance of the quantum computers.

By tuning the quantum computer as a function of a few parameters, the team calculated these molecules' bound states with chemical accuracy, which was obtained using simulations on a classical computer. Of equal importance is the fact that the quantum calculations also included systematic error mitigation, illuminating the shortcomings in current quantum hardware.
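The "tuning as a function of a few parameters" described above is the variational principle at work. A purely classical toy version, with an invented 2x2 Hamiltonian standing in for the molecular problem (the ORNL team ran variational circuits on real quantum processors; nothing here reflects their actual code), looks like this:

```python
# Classical toy sketch of variational energy minimization: the energy of
# a parameterized trial state is always an upper bound on the true
# ground-state energy, so scanning the parameter and keeping the minimum
# converges toward the exact answer. The Hamiltonian and numbers are
# invented for illustration.
import math

e0, e1, c = -1.0, 1.0, 0.5  # toy Hamiltonian [[e0, c], [c, e1]]

def energy(theta):
    # Trial state |psi(theta)> = (cos theta, sin theta)
    a, b = math.cos(theta), math.sin(theta)
    return e0 * a * a + e1 * b * b + 2 * c * a * b

best = min(energy(k * math.pi / 1000) for k in range(1000))
exact = 0.5 * (e0 + e1) - math.sqrt((0.5 * (e0 - e1)) ** 2 + c ** 2)
print(round(best, 3), round(exact, 3))  # variational minimum matches exact
```

On real hardware the energy evaluation is done by the quantum processor, and noise shifts the landscape, which is why the error mitigation mentioned above matters.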

Systematic error occurs when the "noise" inherent in current quantum architectures affects their operation. Quantum computers are extremely delicate; the qubits used by the ORNL team, for instance, are kept in a dilution refrigerator at around 20 millikelvin (more than 450 degrees below zero Fahrenheit), because temperature fluctuations and vibrations from their surrounding environment can create instabilities that throw off their accuracy. Such noise may cause a qubit to rotate 21 degrees instead of the desired 20, greatly affecting a calculation's outcome.
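Toy numbers make it easy to see why a 1-degree rotation error matters (this sketch uses the standard single-qubit readout formula, not anything specific to the ORNL hardware): a single miscalibrated gate barely changes the outcome, but the error compounds over a sequence of gates.

```python
# For a qubit rotated from |0> by a total angle t (in degrees), the
# probability of reading out 1 is sin^2(t/2). Nine perfect 20-degree
# gates should sum to 180 degrees and give outcome 1 with certainty;
# nine 21-degree gates overshoot by 9 degrees.
import math

def prob_one(total_degrees):
    return math.sin(math.radians(total_degrees) / 2) ** 2

ideal = prob_one(9 * 20)   # nine perfect 20-degree gates
noisy = prob_one(9 * 21)   # nine miscalibrated 21-degree gates
print(round(ideal, 4), round(noisy, 4))  # 1.0 0.9938
```

Over deeper circuits the accumulated over-rotation grows linearly, which is one reason benchmarks that characterize such systematic errors are valuable.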

"This new benchmark characterizes the 'mixed state,' or how the environment and machine interact, very well," Pooser said. "This work is a critical step toward a universal benchmark to measure the performance of quantum computers, much like the LINPACK metric is used to judge the fastest classical computers in the world."

While the calculations were fairly simple compared to what is possible on leading classical systems such as ORNL's Summit, currently ranked as the world's most powerful computer, quantum chemistry, along with nuclear physics and quantum field theory, is considered a quantum "killer app." In other words, it is believed that as quantum computers evolve, they will be able to perform a wide swath of chemistry-related calculations more accurately and more efficiently than any classical computer currently in operation, including Summit.

"The current benchmark is a first step towards a comprehensive suite of benchmarks and metrics that govern the performance of quantum processors for different science domains," said ORNL quantum chemist Jacek Jakowski. "We expect it to evolve with time as the quantum computing hardware improves. ORNL's vast expertise in domain sciences, computer science and high-performance computing make it the perfect venue for the creation of this benchmark suite."

ORNL has been planning for paradigm-shifting platforms such as quantum for more than a decade via dedicated research programs in quantum computing, networking, sensing and quantum materials. These efforts aim to accelerate the understanding of how near-term quantum computing resources can help tackle today's most daunting scientific challenges and support the recently announced National Quantum Initiative, a federal effort to ensure American leadership in quantum sciences, particularly computing.

Such leadership will require systems like Summit to ensure the steady march from devices such as those used by the ORNL team to larger-scale quantum systems exponentially more powerful than anything in operation today.

Access to the IBM and Rigetti processors was provided by the Quantum Computing User Program at the Oak Ridge Leadership Computing Facility, which provides early access to existing, commercial quantum computing systems while supporting the development of future quantum programmers through educational outreach and internship programs. Support for the research came from DOE's Office of Science Advanced Scientific Computing Research program.

"This project helps DOE better understand what will work and what won't work as they forge ahead in their mission to realize the potential of quantum computing in solving today's biggest science and national security challenges," Pooser said.

Next, the team plans to calculate the exponentially more complex excited states of these molecules, which will help them devise further novel error mitigation schemes and bring the possibility of practical quantum computing one step closer to reality.

Credit: 
DOE/Oak Ridge National Laboratory

Switching tracks: Reversing electrons' course through nature's solar cells

Think of a train coming down the tracks to a switch point where it could go either to the right or the left -- and it always goes to the right.

Photosynthetic organisms have a similar switch point. After sunlight is absorbed, energy transfers rapidly to a protein called the reaction center. From this point, the electrons could move either to an A-branch (or "right-track") set of molecules, or to a B-branch ("left-track") set of identical molecules.

New research from Washington University in St. Louis and Argonne National Laboratory coaxes electrons down the track that they typically don't travel -- advancing understanding of the earliest light-driven events of photosynthesis. The findings were published Dec. 31 in the Proceedings of the National Academy of Sciences (PNAS).

"In the bacterial reaction center, an electron goes to the A-branch of molecules 100% of the time. We have made it go to the B-branch molecules 90% of the time," said Christine Kirmaier, research professor of chemistry in Arts & Sciences.

"After all, if you think you understand how the train and the tracks work, why shouldn't you be able to make the train go to the left rather than the right? That's essentially what we've done," Kirmaier said.

"Why two tracks have evolved is still an open question, but the ability to control which track is utilized is exciting," said Philip D. Laible, a biophysicist in the biosciences division at Argonne National Laboratory and another lead author on the paper.

"We would like to make the switching between them a more well-understood phenomenon so that we could readily conduct electrons (pardon the pun) to any destination in a biological process," he said. "Right now, we are controlling features that allow electrons to traverse a biological membrane -- the first step in making energy from sunlight in this organism."

Re-engineering a pathway

Plants, algae and photosynthetic bacteria convert the energy of sunlight into charge-separated units that they use to power life processes on Earth. And they do it in a very specific way: The reaction centers in these organisms feature two mirror image-like arrangements of protein and pigment cofactors, the A and B sides. Only one of these chains is active -- the A side -- while the B side is silent.

Kirmaier, with collaborator Dewey Holten, professor of chemistry at Washington University, and the team at Argonne National Laboratory have designed many iterations of photosynthetic mutants with the goal of achieving charge separation using the B branch instead. The new research re-engineers a pathway in a purple photosynthetic bacterium, one of nature's solar cells.

"Using molecular biology, we've been changing the amino acids around the pigments to try and find the magic combination to make the B branch work," she said.

The game was to make structural changes that de-tune, or make less optimal, electron transfers along the A side or normal path -- and then, at the same time, speed up the reactions along the B side.

The researchers were able to step up this trial-and-error process by testing all possible amino acids at a specific target site on the A or B side, finding one or more that improve the B-side yield. They then carried that "hit" forward in the mutant background to probe the next target site, and so on.

"It was unexpected," Kirmaier said. "We picked a site, and in one of our best mutant backgrounds, placed all 20 amino acids there -- and one of them gave us a 90% yield."

"This is a breakthrough achievement and something that [everyone in] the field has been actively trying to figure out for decades -- ever since we first set eyes on the two tracks in a high-profile structural study in Nature nearly 35 years ago," said Deborah K. Hanson of the biosciences division, Argonne National Laboratory, another lead author of the PNAS paper.

Rethinking the history of photosynthesis

The new work illuminates basic structure-function principles that govern efficient, light-induced electron transfer.

This knowledge can aid design of biohybrid and bioinspired systems for energy conversion and storage, the researchers said. The findings also will provoke additional experiments and analysis.

"The results raise lots of questions about what is required to get unidirectional charge separation," Holten said.

In nature, purple bacteria do initial charge separation with a two-step process that takes place in several trillionths of a second. But the team's new B-branch solution gets almost the same yield, even though it uses a tandem one-step process that takes 5-10 times longer.

"In the early history of photosynthesis, maybe such a combination of fast two-step and slower one-step processes gave an 80 or 90% yield -- and then, over time, it optimized," Holten said.

Credit: 
Washington University in St. Louis

Alzheimer's 'tau' protein far surpasses amyloid in predicting toll on brain tissue

Brain imaging of pathological tau-protein "tangles" reliably predicts the location of future brain atrophy in Alzheimer's patients a year or more in advance, according to a new study by scientists at the UC San Francisco Memory and Aging Center. In contrast, the location of amyloid "plaques," which have been the focus of Alzheimer's research and drug development for decades, was found to be of little utility in predicting how damage would unfold as the disease progressed.

The results, published January 1, 2020 in Science Translational Medicine, support researchers' growing recognition that tau drives brain degeneration in Alzheimer's disease more directly than amyloid protein, and at the same time demonstrate the potential of recently developed tau-based PET (positron emission tomography) brain imaging technology to accelerate Alzheimer's clinical trials and improve individualized patient care.

"The match between the spread of tau and what happened to the brain in the following year was really striking," said neurologist Gil Rabinovici, MD, the Edward Fein and Pearl Landrith Distinguished Professor in Memory and Aging and leader of the PET imaging program at the UCSF Memory and Aging Center. "Tau PET imaging predicted not only how much atrophy we would see, but also where it would happen. These predictions were much more powerful than anything we've been able to do with other imaging tools, and add to evidence that tau is a major driver of the disease."

Interest in Tau Growing as Amyloid-Based Therapies Stumble

Alzheimer's researchers have long debated the relative importance of amyloid plaques and tau tangles -- two kinds of misfolded protein clusters seen in postmortem studies of patients' brains, both first identified by Alois Alzheimer in the early 20th century. For decades, the "amyloid camp" has dominated, leading to multiple high-profile efforts to slow Alzheimer's with amyloid-targeting drugs, all with disappointing or mixed results.

Many researchers are now taking a second look at tau protein, once dismissed as simply a "tombstone" marking dying cells, and investigating whether tau may in fact be an important biological driver of the disease. In contrast to amyloid, which accumulates widely across the brain, sometimes even in people with no symptoms, autopsies of Alzheimer's patients have revealed that tau is concentrated precisely where brain atrophy is most severe, and in locations that help explain differences in patients' symptoms (in language-related areas vs. memory-related regions, for example).

"No one doubts that amyloid plays a role in Alzheimer's disease, but more and more tau findings are beginning to shift how people think about what is actually driving the disease," explained Renaud La Joie, PhD, a postdoctoral researcher in Rabinovici's In Vivo Molecular Neuroimaging Lab, and lead author of the new study. "Still, just looking at postmortem brain tissue, it has been hard to prove that tau tangles cause brain degeneration and not the other way around. One of our group's key goals has been to develop non-invasive brain imaging tools that would let us see whether the location of tau buildup early in the disease predicts later brain degeneration."

Tau PET Scans Predict Locations of Future Brain Atrophy in Individual Patients

Despite early misgivings that tau might be impossible to measure in the living brain, scientists recently developed an injectable molecule called flortaucipir -- currently under review by the FDA -- which binds to misfolded tau in the brain and emits a mild radioactive signal that can be picked up by PET scans.

Rabinovici and collaborator William Jagust, MD, of UC Berkeley and Lawrence Berkeley National Laboratory, have been among the first to adopt tau PET imaging to study the distribution of tau tangles in the normally aging brain and in a smaller cross-sectional study of Alzheimer's patients. Their new study represents the first attempt to test whether tau levels in Alzheimer's patients can predict future brain degeneration.

La Joie recruited 32 participants with early clinical stage Alzheimer's disease through the UCSF Memory and Aging Center, all of whom received PET scans using two different tracers to measure levels of amyloid protein and tau protein in their brains. The participants also received MRI scans to measure their brains' structural integrity, both at the start of the study and again in follow-up visits one to two years later.

The researchers found that overall tau levels in participants' brains at the start of the study predicted how much degeneration would occur by the time of their follow-up visit (on average 15 months later). Moreover, local patterns of tau buildup predicted subsequent atrophy in the same locations with more than 40 percent accuracy. In contrast, baseline amyloid-PET scans correctly predicted only 3 percent of future brain degeneration.
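The study's region-level comparison can be illustrated with a toy calculation. This sketch uses made-up regional values (none of these numbers come from the paper): for each tracer, it correlates a baseline PET signal with later atrophy across brain regions and reports the variance explained (R²), which is one common way to express "how well does the baseline scan predict where damage occurs."

```python
# Toy illustration (hypothetical numbers, NOT the study's data):
# how much of the regional variation in later atrophy is
# "explained" by a baseline PET signal, measured as R^2.

def r_squared(x, y):
    """Squared Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)

# Hypothetical baseline tracer uptake and follow-up atrophy per region.
tau_baseline     = [1.8, 1.2, 2.5, 1.0, 2.1, 1.5]
amyloid_baseline = [1.5, 1.4, 1.5, 1.6, 1.4, 1.5]
atrophy_followup = [0.9, 0.5, 1.3, 0.4, 1.0, 0.7]

print(f"tau R^2:     {r_squared(tau_baseline, atrophy_followup):.2f}")
print(f"amyloid R^2: {r_squared(amyloid_baseline, atrophy_followup):.2f}")
```

In this contrived example the tau signal tracks atrophy region by region while the amyloid signal is nearly flat, mirroring the qualitative contrast the study reports.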

"Seeing that tau buildup predicts where degeneration will occur supports our hypothesis that tau is a key driver of neurodegeneration in Alzheimer's disease," La Joie said.

Notably, PET scans revealed that younger study participants had higher overall levels of tau in their brains, as well as a stronger link between baseline tau and subsequent brain atrophy, compared to older participants. This suggests that other factors -- likely other abnormal proteins or vascular injuries -- may play a larger role in late-onset Alzheimer's, the researchers say.

Ability to Predict Brain Atrophy a 'Valuable Precision Medicine Tool'

The results add to hopes that tau-targeting drugs currently under study at the UCSF Memory and Aging Center and elsewhere may provide clinical benefits to patients by blocking this key driver of neurodegeneration in the disease. At the same time, the ability to use tau PET to predict later brain degeneration could enable more personalized dementia care and speed ongoing clinical trials, the authors say.

"One of the first things people want to know when they hear a diagnosis of Alzheimer's disease is simply what the future holds for themselves or their loved ones. Will it be a long fading of memory, or a quick decline into dementia? How long will the patient be able to live independently? Will they lose the ability to speak or get around on their own? These are questions we can't currently answer, except in the most general terms," Rabinovici said. "Now, for the first time, this tool could let us give patients a sense of what to expect by revealing the biological process underlying their disease."

Rabinovici and his team also anticipate that the ability to predict future brain atrophy based on tau PET imaging will allow Alzheimer's clinical trials to quickly assess whether an experimental treatment can alter the specific trajectory predicted for an individual patient. That is currently impossible because of the wide variability in how the disease progresses from individual to individual. Such insights could make it possible to adjust dosage or switch to a different experimental compound if the first treatment is not affecting tau levels or altering a patient's predicted trajectory of brain atrophy.

"Tau PET could be an extremely valuable precision medicine tool for future clinical trials," Rabinovici said. "The ability to sensitively track tau accumulation in living patients would for the first time let clinical researchers seek out treatments that can slow down or even prevent the specific pattern of brain atrophy predicted for each patient."

Credit: 
University of California - San Francisco

Tumor DNA platform scopes out and classifies colorectal cancer

video: A brief introduction to the study of ctDNA methylation in colorectal cancer. This material relates to a paper that appeared in the Jan. 1, 2020, issue of Science Translational Medicine, published by AAAS. The paper, by H. Luo of the Collaborative Innovation Center for Cancer Medicine in Guangzhou, P.R. China, and colleagues, was titled "Circulating tumor DNA methylation profiles enable early diagnosis, prognosis prediction, and screening for colorectal cancer."

Image: 
Rui-hua Xu

A new machine learning platform can identify patients with colorectal cancer and help predict their disease severity and survival, according to a study involving samples from thousands of subjects. The noninvasive method adds to recent advances in technologies that analyze circulating tumor DNA (ctDNA) and could help spot colorectal cancers in at-risk patients at earlier stages. Like many other malignancies, colorectal cancers are most treatable if they are detected before they have metastasized to other tissues. Colonoscopies are the "gold standard" for diagnosis, but they are uncomfortable and invasive and can lead to complications, which leaves patients less willing to undergo screening. Huiyan Luo and colleagues leveraged machine learning techniques to develop a less invasive diagnostic method that can detect colorectal cancer in at-risk patients. Their technology works by screening for methylation markers, DNA modifications that are frequently found in tumors. The scientists first created a diagnostic model based on nine methylation markers associated with colorectal cancer, which they identified by studying plasma samples from 801 patients with colorectal cancer and 1,021 controls. This model distinguished patients from healthy individuals with a sensitivity of 87.5% and a specificity of 89.9%, and outperformed the clinically available blood test for carcinoembryonic antigen (CEA). Furthermore, a modified prognostic model helped predict the patients' risk of death over a follow-up period averaging 26.6 months, especially when combined with established clinical characteristics such as tumor location. One methylation marker was particularly useful: screening for it alone spotted cases of colorectal cancer and precancerous lesions in a prospective study of 1,493 at-risk individuals. Luo et al. conclude that studies with longer follow-up periods will be needed to further assess their model's reliability for clinicians and patients.
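The sensitivity and specificity figures quoted above follow the standard definitions. A minimal sketch of that calculation, using made-up predictions rather than the study's data, shows where numbers like 87.5% and 89.9% come from:

```python
# Sketch of the sensitivity/specificity calculation behind figures
# like "87.5% and 89.9%" (illustrative labels, NOT the study's data).

def sensitivity_specificity(y_true, y_pred):
    """y_true / y_pred: 1 = cancer, 0 = healthy control."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    # Sensitivity: fraction of true cancers detected.
    # Specificity: fraction of healthy controls correctly cleared.
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical labels for 8 patients and 10 controls.
y_true = [1] * 8 + [0] * 10
y_pred = [1, 1, 1, 1, 1, 1, 1, 0,                 # 7 of 8 patients detected
          0, 0, 0, 0, 0, 0, 0, 0, 0, 1]           # 1 of 10 controls flagged

sens, spec = sensitivity_specificity(y_true, y_pred)
print(f"sensitivity = {sens:.3f}, specificity = {spec:.3f}")
```

Here 7 of 8 patients detected gives a sensitivity of 0.875 and 9 of 10 controls cleared gives a specificity of 0.900; the study's model makes the analogous trade-off over hundreds of plasma samples.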

Credit: 
American Association for the Advancement of Science (AAAS)

Bone analysis suggests small T. rexes were not a separate genus; they were kids

Settling a decades-long debate about whether small Tyrannosaurus rex specimens represent a separate genus or rather just "kids" of their kind, a new examination of thinly sliced bones from two specimens at the Burpee Museum of Natural History in Illinois suggests the latter. The specimens were juveniles that had not yet experienced a major growth spurt before they died, the authors say. "That's even cooler [than their being a separate genus]," said co-author Scott Williams in a related video, "because that tells us they go through a drastic change when they grow up, from these sleek, slender, fleet-footed T. rexes with these wonderful knife-like teeth to the big, monster, plodding, crushing tyrannosaurs that we are familiar with. It also tells us these animals probably dominated their ecosystems at all ages," even as juveniles. The results support the hypothesis that T. rex experienced a period of exponential growth late in its development. They also support the idea that a skull specimen at the Cleveland Museum of Natural History, classified as a separate genus (Nanotyrannus) in 1988, is actually a young T. rex. Although most specialists now reject the idea that the specimen belongs to a separate classification, previous studies used characteristics of one of the Burpee Museum skulls to justify it. To assess the age and growth rate of the T. rex specimens, Holly Woodward et al. compared the organization of bone fibers and other microstructures in the two Burpee specimens, finding that the animals were still growing, as evidenced by growth rings spaced out in a pattern not typically seen in adults. The bones also lacked the closely spaced series of lines, present in adults, that signals growth is complete. The researchers estimated the specimens' ages at death by counting their cyclical growth marks, a series of lines in the femur and tibia that, like tree rings, record periods of development.
Woodward and colleagues suggest reaching full size after a period of prolonged adolescence may have meant that juveniles and adults fulfilled different roles in the ecosystem, such as feeding on different prey.

Credit: 
American Association for the Advancement of Science (AAAS)

Novel combination of antibodies leads to significant improvement in cancer immunotherapy

The simultaneous use of antibodies based on two differing mechanisms of action leads to a more effective destruction of tumors. This has been demonstrated by a study in animal models by medical oncologists and scientists at the University of Basel that has been published in the scientific journal PNAS. Patients who do not respond to current immunotherapy options could benefit most from this new treatment.

In recent years, immunotherapies against cancer have raised great hopes. These novel therapies recruit the body's immune system to destroy cancerous tissue. An antibody that activates the CD40 receptor on the surface of immune cells and thus stimulates the production of natural killer T-cells showed a promising effect in preclinical studies.

However, in subsequent clinical trials, the success of the CD40 antibody fell far short of expectations - less than 20% of patients responded. The research group Cancer Immunology at the University of Basel has now shown in animal models that the effect of the anti-CD40 antibody can be increased significantly by combining it with two other antibodies that attach to tumor blood vessels.

Open the way to the tumor

The starting point for the study was the observation that the administration of anti-CD40 antibodies leads to an increase in killer T-cells as intended - but these can then only be detected in the peripheral areas and not in the interior of the tumor. The researchers suspected that this was due to the nature of the tumor's blood vessels.

"Normally, the blood vessels of a tumor are leaky or stunted. Therefore, there is no good way for killer T-cells to get inside," says study leader Dr. Abhishek Kashyap. "Our hypothesis was that the killer cells are able to invade the tumor and destroy it only if there are enough healthy blood vessels."

Therefore, they combined the anti-CD40 antibody with two other anti-angiogenic antibodies that are able to stabilize the tumor blood vessels. One of the anti-angiogenic antibodies is already approved for cancer therapy under the name Avastin, while the other is still in clinical development. All antibodies were provided by Roche.

New combination destroys tumor tissue

The researchers then tested this new combination of antibodies in several animal models for different types of cancer, such as colorectal, breast and skin cancer. As expected, the combination of the three antibodies significantly improved tumor tissue destruction in all cancers.

A more detailed analysis also showed that this success was based on the predicted mechanism: the addition of the two anti-angiogenic antibodies ensured the tumors had more intact blood vessels. Unexpectedly, however, the investigations also showed that the antibody combination very effectively strengthens the immune system in several ways; for example, through better penetration of the tumor by killer cells and by promoting a tumor-hostile inflammatory reaction in the tumor microenvironment.

"Our results illustrate how important it is to understand the biology of tumors," says Kashyap. He believes that patients with 'cold' tumors - tumors that do not respond well to immunotherapy - could benefit most from this new combination. "The anti-angiogenic antibodies may make the 'cold' tumors 'hot', so that immunotherapy functions better." In the meantime, several early clinical trials of similar therapies in humans are underway.

Cooperation strengthens results

According to Kashyap, the strength of the study lies not only in the large effects measured, but also in the fact that several different laboratories achieved the same results. The experiments were carried out at the University Hospital of Basel, EPFL and the Roche Innovation Center Zurich.

This is also confirmed by Alfred Zippelius, Professor of Translational Oncology at the University of Basel and senior author of the study: "The innovative and translational potential of this work is the result of a close and excellent collaboration between applied and basic research, between the University of Basel and EPFL, and between academia and industry."

Credit: 
University of Basel

Samara Polytech scientists have developed a new concept of mathematical modeling

image: Mathematical modeling of locally nonequilibrium transfer processes and methods

Image: 
@SamaraPolytech

A team of scientists from the Research Center "Fundamental Problems of Thermophysics and Mechanics" at Samara Polytech is constructing new mathematical models, and developing methods to study them, for a wide range of locally nonequilibrium transport processes in various physical systems. The team's recently developed approach is based on a modern version of third-generation thermodynamics. Their project, "Development, theoretical research and experimental verification of mathematical models of oscillatory processes, heat and mass transfer and thermomechanics with two- and multiphase delays," was among the winners of the RFBR (Russian Foundation for Basic Research) contest. Recent research results are published in the journal Physica A: Statistical Mechanics and its Applications.

Interest in locally nonequilibrium processes, which take into account the specifics of transport at the molecular level (the mean free path of a molecule, the momentum transfer rate, relaxation time, etc.), is driven by the need to conduct physical processes under extreme conditions: femtosecond exposure of matter to concentrated energy flows, ultra-low and ultra-high temperatures and pressures, shock waves, and so on. Such processes are widely used to create new technologies for producing nanomaterials and coatings with unique physicochemical properties that cannot be obtained by traditional methods (binary and multicomponent metal alloys, ceramics, polymeric materials, metal and semiconductor glasses, nanofilms, graphene, composite nanomaterials, etc.).
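The "delays" and relaxation times mentioned here refer to the finite time a system needs to respond to a disturbance. A standard textbook illustration of such a model (not necessarily the team's specific formulation) is the Maxwell-Cattaneo modification of Fourier's law:

```latex
% Maxwell-Cattaneo law: the heat flux q relaxes toward the Fourier
% value with a finite relaxation time \tau_r, rather than instantly.
\mathbf{q} + \tau_r \, \frac{\partial \mathbf{q}}{\partial t} = -k \, \nabla T

% Combining this with energy conservation replaces the parabolic
% Fourier heat equation with a hyperbolic (wave-like) one, where
% a = k / (\rho c) is the thermal diffusivity:
\tau_r \, \frac{\partial^2 T}{\partial t^2} + \frac{\partial T}{\partial t} = a \, \nabla^2 T
```

With the relaxation term, temperature disturbances propagate at a finite speed instead of instantaneously, which is exactly the kind of behavior that matters under femtosecond heating and other extreme conditions.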

"Classical thermodynamics is not suitable for describing processes that occur under locally nonequilibrium conditions, since it is based on the principle of local equilibrium. Our project is important both for fundamental science and for practical applications," explains the project manager, Professor Igor Kudinov. "To accomplish these tasks, we plan to create a new, unparalleled software package for 3D modeling of high-speed, locally nonequilibrium processes of heat, mass and momentum transfer. Our method thus opens up wide possibilities for studying processes of practical significance for modern nanotechnology."

Credit: 
Samara Polytech (Samara State Technical University)

Findings strengthen link between vitamin E acetate and vaping-associated lung injuries

COLUMBUS, Ohio - New research reported in the New England Journal of Medicine by the Centers for Disease Control and Prevention (CDC) in collaboration with The Ohio State University Comprehensive Cancer Center - Arthur G. James Cancer Hospital and Richard J. Solove Research Institute (OSUCCC - James) strengthens prior findings on the link between vitamin E acetate and EVALI (E-cigarette or vaping product use-associated lung injury).

In this new study, the CDC analyzed bronchoalveolar lavage (BAL) fluid from 51 EVALI patients from 16 states and compared it to BAL fluid from 99 healthy individuals. Vitamin E acetate, also found in product samples tested by the Food and Drug Administration (FDA) and state laboratories, was identified in BAL fluid from 48 of 51 EVALI patients but was not found in any BAL fluid from healthy people. No other toxicants were found in BAL fluid from either group, except coconut oil and limonene (one EVALI patient each).

For this study, BAL samples were collected by the CDC from public health laboratories and health departments across the United States. These samples were received from hospital clinical teams that had collected the samples to guide clinical management decisions.

A team of scientists led by Peter Shields, MD, deputy director of the OSUCCC and thoracic oncologist at The James, provided BAL fluid samples from 99 healthy comparison subjects collected between 2015 and 2019 as part of a tobacco product study unrelated to the ongoing CDC investigation of EVALI.

"These findings support the conclusion that vitamin E acetate is a potential causative agent of EVALI, and that is an important discovery as decisions are made about how to best regulate the rapidly evolving e-cig industry," says Shields, who leads numerous e-cigarette research studies at the OSUCCC - James, including a bronchoscopy study to look at how e-cigarettes impact the lung microenvironment.

In October 2019, Shields and colleagues reported in the medical journal Cancer Prevention Research the first evidence that even short-term vaping causes concerning inflammation in the lungs. Additional data, reported online ahead of print in the medical journal Cancer Epidemiology on Dec. 17, 2019, found that smoking-related lung damage in e-cigarette users is much less than in smokers and more similar to that in never-smokers.

Credit: 
Ohio State University Wexner Medical Center

NASA tracks Tropical Storm Sarai moving away from Fiji

image: On Dec. 30, 2019, the MODIS instrument that flies aboard NASA's Terra satellite provided a visible image of Tropical Storm Sarai moving away from Fiji.

Image: 
NASA Worldview

NASA's Terra satellite passed over the Southern Pacific Ocean on Dec. 30 and found that Tropical Storm Sarai continued to move further away from Fiji and toward Tonga.

On Dec. 30, 2019, the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard NASA's Terra satellite provided a visible image of Sarai showing that the storm had flaring convection, with the strongest thunderstorms around the low-level center. The storm also appeared elongated, indicating it was weakening.

On Dec. 30 at 10 a.m. EST (1500 UTC), the Joint Typhoon Warning Center noted that Tropical Cyclone Sarai was located near 22.1 degrees south latitude and 176.4 degrees west longitude, about 403 nautical miles west-southwest of Niue. Maximum sustained winds were near 45 knots (52 mph), and the storm was weakening.
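The wind figure in the advisory is a simple unit conversion from knots (nautical miles per hour) to statute miles per hour, using the standard factor of about 1.15078 mph per knot:

```python
# Convert sustained winds from knots to mph, as in "45 knots (52 mph)".
# 1 knot = 1 nautical mile (1852 m) per hour ~= 1.15078 statute mph.
KNOT_TO_MPH = 1.15078

def knots_to_mph(knots):
    return knots * KNOT_TO_MPH

print(round(knots_to_mph(45)))  # 45 knots rounds to the advisory's 52 mph
```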

Sarai is forecast to curve to the northeast and pass just north of Tonga and Niue over the next several days. Both islands can expect rough surf, tropical-storm-force winds and heavy rain. As the storm continues tracking east, the Joint Typhoon Warning Center expects vertical wind shear, or outside winds, to increase, leading to a weakening trend.

Credit: 
NASA/Goddard Space Flight Center

Bioelectric stimulation to clear skin lesions

image: The only peer-reviewed journal dedicated to the field of bioelectricity, publishing ground-breaking research and advances in a multitude of related disciplines.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, December 30, 2019--The delivery of ultrashort pulses of electrical energy represents a promising nonthermal, nonscarring method of inducing regulated cell death in common skin lesions. This and other novel approaches to applying electrical impulses to treat disease are published in Practical Applications of Bioelectric Stimulation, a special issue of Bioelectricity, a peer-reviewed journal from Mary Ann Liebert, Inc. The issue is available to read free on the Bioelectricity website through January 30, 2020.

Guest Editor of the special issue Richard Nuccitelli, PhD, Pulse Biosciences, Hayward, CA, contributed the article entitled "Nano-Pulse Stimulation Therapy for the Treatment of Skin Lesions." Nano-Pulse Stimulation (NPS) delivers nanosecond pulsed electric fields to cells and tissues. NPS specifically targets cells, generating nanometer-wide pores that allow small ions to enter, altering the flow of sodium, potassium and calcium ions in and out of the cells. It can induce cell death in epidermal or dermal lesions, but because it does not affect the dermal collagen, it does not cause scarring. Dr. Nuccitelli discusses the characteristics of NPS and its effects on normal skin, on epidermal lesions such as seborrheic keratosis, on dermal lesions, and on warts caused by human papillomavirus.

Also of interest in this special issue is the article entitled "Preventing Ethanol-Induced Brain and Eye Morphology Defects Using Optogenetics," by Vaibhav Pai, Tufts University, Medford, MA, and Dany Spencer Adams, Tufts University and Ion Diagnostics, Watertown, MA. Exposure of a fetus to alcohol can lead to defects in brain and eye morphology as part of fetal alcohol spectrum disorder (FASD). This is also true in the developing tadpole. Pai and Adams used this model system to test whether optogenetics, the use of light to regulate ion channel function and ion fluxes, could rescue tadpoles from the effects of alcohol. Using controlled modulation of membrane voltage, the researchers were able to rescue the ethanol-induced brain and eye defects in the tadpoles. The hyperpolarization effect was required for the full duration of the ethanol exposure. Furthermore, the rescue effect acted at a distance, suggesting that bioelectric modulation to treat ethanol-induced brain and eye defects in human embryos might be possible using existing ion channel drugs.

Research reported in this publication was supported by the National Institutes of Health under Award Number R01HD081326. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News