Earth

Study: Scant evidence that 'wood overuse' at Cahokia caused collapse

image: Archaeologists at Washington University in St. Louis found scant evidence that 'wood overuse' at Cahokia caused local flooding and subsequent collapse.

Image: 
Joe Angeles / Washington University

Whatever ultimately caused inhabitants to abandon Cahokia, it was not because they cut down too many trees, according to new research from Washington University in St. Louis.

Archaeologists from Arts & Sciences excavated around earthen mounds and analyzed sediment cores to test a persistent theory about the collapse of Cahokia, the pre-Columbian Native American city in southwestern Illinois that was once home to more than 15,000 people.

No one knows for sure why people left Cahokia, though many environmental and social explanations have been proposed. One oft-repeated theory is tied to resource exploitation: specifically, that Native Americans from densely populated Cahokia deforested the area, an environmental misstep that could have resulted in erosion and localized flooding.

But such musings about self-inflicted disaster are outdated -- and they're not supported by physical evidence of flooding issues, Washington University scientists said.

"There's a really common narrative about land use practices that lead to erosion and sedimentation and contribute to all of these environmental consequences," said Caitlin Rankin, an assistant research scientist at the University of Illinois at Urbana-Champaign who conducted this work as part of her graduate studies at Washington University.

"When we actually revisit this, we're not seeing evidence of the flooding," Rankin said.

"The notion of looming ecocide is embedded in a lot of thinking about current and future environmental trajectories," said Tristram R. "T.R." Kidder, the Edward S. and Tedi Macias Professor of Anthropology in Arts & Sciences at Washington University. "With a growing population and more mouths to feed, overconsumption of all resources is a real risk.

"Inevitably, people turn to the past for models of what has happened. If we are to understand what caused changes at sites like Cahokia, and if we are to use these as models for understanding current possibilities, we need to do the hard slogging that critically evaluates different ideas," added Kidder, who leads an ongoing archaeological research program at the Cahokia Mounds State Historic Site. "Such work allows us to sift through possibilities so we can aim for those variables that do help us to explain what happened in the past -- and explore if this has a lesson to tell us about the future."

No indications of self-inflicted harm

Writing in the journal Geoarchaeology, Rankin and colleagues at Bryn Mawr College and Northern Illinois University described their recent excavations around a Mississippian Period (AD 1050-1400) earthen mound in the Cahokia Creek floodplain.

Their new archaeological work, completed while Rankin was at Washington University, shows that the ground surface on which the mound was constructed remained stable until industrial development.

The presence of a stable ground surface from Mississippian occupation to the mid-1800s does not support the expectations of the so-called "wood-overuse" hypothesis, the researchers said.

This hypothesis, first proposed in 1993, suggests that tree clearance in the uplands surrounding Cahokia led to erosion, causing increasingly frequent and unpredictable floods of the local creek drainages in the floodplain where Cahokia was constructed.

Rankin noted that archaeologists have broadly applied narratives of ecocide -- the idea that societies fail because people overuse or irrevocably damage the natural resources they rely on -- to help explain the collapse of past civilizations around the world.

Although many researchers have moved beyond the classic narratives of ecocide made popular in the 1990s and early 2000s, Cahokia is one major archaeological site where such untested hypotheses have persisted.

"We need to be careful about the assumptions that we build into these narratives," Rankin said.

"In this case, there was evidence of heavy wood use," she said. "But that doesn't factor in the fact that people can reuse materials -- much as you might recycle. We should not automatically assume that deforestation was happening, or that deforestation caused this event."

Kidder said: "This research demonstrates conclusively that the over-exploitation hypothesis simply isn't tenable. This conclusion is important because the hypothesis at Cahokia -- and elsewhere -- is sensible on its face. The people who constructed this remarkable site had an effect on their environment. We know they cut down tens of thousands of trees to make the palisades -- and this isn't a wild estimate, because we can count the number of trees used to build and re-build this feature. Wood depletion could have been an issue."

"The hypothesis came to be accepted as truth without any testing," Kidder said. "Caitlin's study is important because she did the hard work -- and I do mean hard, and I do mean work -- to test the hypothesis, and in doing so has falsified the claim. I'd argue that this is the exciting part; it's basic and fundamental science. By eliminating this possibility, it moves us toward other explanations and requires we pursue other avenues of research."

Credit: 
Washington University in St. Louis

New Forest Service assessment delivers research on invasive species

image: "Invasive Species in Forests and Rangelands of the United States: A Comprehensive Science Synthesis for the United States Forest Sector," was recently released by the USDA Forest Service as an open-access book by Springer Nature. Authors include 115 Forest Service scientists and managers, other federal and state agencies, and university, non-government organization, and tribal land partners. It presents the latest research on a wide range of natural science and social science fields that explore the ecology, impacts, and practical tools for management of invasive species.

Image: 
USDA Forest Service

MADISON, WI, April 8, 2021 - USDA Forest Service scientists have delivered a new comprehensive assessment of the invasive species that confront America's forests and grasslands, from new arrivals to some that invaded so long ago that people are surprised to learn they are invasive.

The assessment, titled "Invasive Species in Forests and Rangelands of the United States: A Comprehensive Science Synthesis for the United States Forest Sector," serves as a one-stop resource for land managers who are looking for information on the invasive species that are already affecting the landscape, the species that may threaten the landscape, and what is known about control of invasive species.

"Understanding the ecology of invasive species, their dynamics and complex ecological, economic, and societal interactions is critical to improving management strategies and reducing impacts to native ecosystems," said Cynthia West, director of the Forest Service's Northern Research Station and the Forest Products Laboratory. "Tracking the science can be daunting, so Forest Service scientists and many partners have assembled a science synthesis that puts the latest research at land managers' fingertips."

The assessment covers invasive species of all taxonomic groups, from insects and pathogens to plants, vertebrates, and aquatic organisms, that impact a diversity of habitats in forests, rangelands and grasslands of the United States. It presents the latest research on a wide range of natural science and social science fields that explore the ecology, impacts, and practical tools for management of invasive species. The scientific synthesis provides the cultural, economic, regulatory, and social context for addressing environmental challenges posed by invasive species. Geographically focused regional summaries highlight the most important invasive species and issues impacting all regions of the country.

Credit: 
USDA Forest Service - Northern Research Station

Third of Antarctic ice shelf area at risk of collapse as planet warms

More than a third of the Antarctic's ice shelf area could be at risk of collapsing into the sea if global temperatures reach 4°C above pre-industrial levels, new research has shown.

The University of Reading led the most detailed study yet forecasting how vulnerable the vast floating platforms of ice surrounding Antarctica will become to dramatic collapse events caused by melting and runoff, as climate change forces temperatures to rise.

It found that 34% of the area of all Antarctic ice shelves - around half a million square kilometres - including 67% of ice shelf area on the Antarctic Peninsula, would be at risk of destabilisation under 4°C of warming. Limiting temperature rise to 2°C rather than 4°C would halve the area at risk and potentially avoid significant sea level rise.

The researchers also identified Larsen C - the largest remaining ice shelf on the peninsula, which split to form the enormous A68 iceberg in 2017 - as one of four ice shelves that would be particularly threatened in a warmer climate.

Dr Ella Gilbert, a research scientist in the University of Reading's Department of Meteorology, said: "Ice shelves are important buffers preventing glaciers on land from flowing freely into the ocean and contributing to sea level rise. When they collapse, it's like a giant cork being removed from a bottle, allowing unimaginable amounts of water from glaciers to pour into the sea.

"We know that when melted ice accumulates on the surface of ice shelves, it can make them fracture and collapse spectacularly. Previous research has given us the bigger picture in terms of predicting Antarctic ice shelf decline, but our new study uses the latest modelling techniques to fill in the finer detail and provide more precise projections.

"The findings highlight the importance of limiting global temperature increases as set out in the Paris Agreement if we are to avoid the worst consequences of climate change, including sea level rise."

The new study, published in the Geophysical Research Letters journal, used state-of-the-art, high-resolution regional climate modelling to predict in more detail than before the impact of increased melting and water runoff on ice shelf stability.

Ice shelf vulnerability from this fracturing process was forecast under 1.5°C, 2°C and 4°C global warming scenarios, which are all possible this century.

Ice shelves are permanent floating platforms of ice attached to areas of the coastline and are formed where glaciers flowing off the land meet the sea.

Every summer, ice at the surface of the ice shelf melts and trickles down into small air gaps in the snow layer below, where it refreezes. However, in years when there is a lot of melting but little snowfall, the water pools on the surface or flows into crevasses, deepening and widening them until the ice shelf eventually fractures and collapses into the sea. If there is water collecting on the surface of the ice shelf, that suggests it could be vulnerable to collapse in this way.
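
As a toy illustration of that melt-versus-snowfall balance (not the study's actual model, which used high-resolution regional climate simulations), the vulnerability idea can be sketched as a simple threshold; the absorption factor and input numbers below are assumptions chosen purely for illustration.

```python
# Toy sketch of the melt-vs-snowfall idea described above: if summer melt
# exceeds what the snow layer (firn) can absorb and refreeze, water pools
# on the surface and the shelf is flagged as potentially vulnerable.
def ponding_risk(melt_mm_we, snowfall_mm_we, absorption_factor=0.7):
    """Return True if annual melt likely exceeds the firn's refreezing
    capacity; both inputs in mm water equivalent per year (hypothetical)."""
    return melt_mm_we > absorption_factor * snowfall_mm_we

print(ponding_risk(400, 800))  # False: the firn soaks up the meltwater
print(ponding_risk(900, 600))  # True: ponding, and fracture risk, likely
```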

This is what happened to the Larsen B ice shelf in 2002, which fractured following several years of warm summer temperatures. Its collapse caused the glaciers behind the ice shelf to speed up, losing billions of tonnes of ice to the sea.

The researchers identified the Larsen C, Shackleton, Pine Island and Wilkins ice shelves as most at-risk under 4°C of warming, due to their geography and the significant runoff predicted in those areas.

Dr Gilbert said: "If temperatures continue to rise at current rates, we may lose more Antarctic ice shelves in the coming decades.

"Limiting warming will not just be good for Antarctica - preserving ice shelves means less global sea level rise, and that's good for us all."

Credit: 
University of Reading

All-in-one device uses microwave power for defense, medicine

WEST LAFAYETTE, Ind. - An invention from Purdue University innovators may provide a new option to use directed energy for biomedical and defense applications.

The Purdue invention uses composite-based nonlinear transmission lines (NLTLs) for a complete high-power microwave system, eliminating the need for multiple auxiliary systems. Interest in NLTLs has increased in the past few decades because they offer an effective solid-state alternative to conventional vacuum-based, high-power microwave generators that require large and expensive external systems, such as cryogenic electromagnets and high-voltage nanosecond pulse generators.

NLTLs have proven effective for applications in the defense and biomedical fields. They create directed high-power microwaves that can be used to disrupt or destroy adversary electronic equipment at a distance. The same technology also can be used for biomedical devices for sterilization and noninvasive medical treatments.

"We created a new NLTL device that reduces the bulkiness of current options and offers new opportunities to protect our country and help patients in a man-portable form factor," said Andrew Fairbanks, a Ph.D. student and graduate research assistant in Purdue's College of Engineering. "In engineering, we are concerned about size, weight, power and cost. Our invention helps address all of these."

Allen Garner, an associate professor of nuclear engineering, led the Purdue team. The researchers created a novel device using composite-based NLTLs as complete high-power microwave systems, encompassing high-voltage pulse and high-power microwave formation. The Purdue device combines the elements of traditional NLTLs into a composite-based system and eliminates typical bulky auxiliary equipment.

The system is charged using a DC high-voltage supply and discharged using a high-voltage, gas-based switch. The system eliminates the need for external pulse generation and is more rugged due to the solid-state construction.
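
For readers unfamiliar with how an NLTL shapes a pulse, the sketch below simulates a generic lumped LC ladder whose capacitance falls with voltage, the basic mechanism by which such lines steepen pulse edges and seed radio-frequency output. All component values are assumptions chosen for illustration; this is a minimal sketch of the general technique, not a model of the Purdue device.

```python
# Toy simulation of pulse sharpening on a lumped-element nonlinear
# transmission line (NLTL): capacitance falls with voltage, so the crest
# of a pulse travels faster than its foot and the rising edge steepens.
import numpy as np
from scipy.integrate import solve_ivp

N = 60          # number of LC sections
L = 10e-9       # inductance per section (H), assumed
C0 = 1e-12      # zero-bias capacitance per section (F), assumed
V0 = 5.0        # capacitance roll-off scale (V), assumed

def C(v):
    # Voltage-dependent capacitance, as in varactor-loaded NLTLs
    return C0 / (1.0 + np.abs(v) / V0)

def source(t):
    # Smooth 20 V input step with a ~1 ns rise, standing in for the
    # high-voltage charging and switching stage described above
    return 20.0 * 0.5 * (1.0 + np.tanh((t - 2e-9) / 1e-9))

def rhs(t, y):
    v, i = y[:N], y[N:]                     # node voltages, inductor currents
    v_prev = np.concatenate(([source(t)], v[:-1]))
    di = (v_prev - v) / L                   # inductor equations
    i_out = np.concatenate((i[1:], [v[-1] / 50.0]))  # 50-ohm termination
    dv = (i - i_out) / C(v)                 # capacitor equations
    return np.concatenate((dv, di))

sol = solve_ivp(rhs, (0.0, 40e-9), np.zeros(2 * N), max_step=5e-12)
# Comparing the waveform at early vs late nodes in sol.y should show a
# progressively steeper edge, the seed of RF/microwave generation.
```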

This work was supported by the Office of Naval Research (Grant No. N00014-18-1-2341).

Fairbanks presented the Purdue technology in March at the Directed Energy Professional Society annual Science and Technology Symposium.

The innovators worked with the Purdue Research Foundation Office of Technology Commercialization to patent their technology.

"It has been very beneficial to our team and our advancement of this technology to have OTC here at Purdue," Fairbanks said. "OTC and other resources from Purdue Research Foundation provide support to take our technology and push it out to the world through patenting and commercialization."

Credit: 
Purdue University

Reversing a genetic cause of poor stress responses in mice

image: Deleting the Ophn1 gene causes mice to respond to stressful situations with an inappropriate helpless behavior. CSHL Professor Linda Van Aelst and her lab wanted to know the exact location in the mouse brain affected by the lack of Ophn1 that leads to this helpless/depressive behavior. In this image of a mouse brain, the green color shows the prelimbic region of the medial prefrontal cortex, where the researchers injected a virus to delete Ophn1. Ophn1 (red) is still present in other parts of the brain. The researchers discovered that deleting the gene in only this part of the brain caused the observed failure in stress adaptation. Human brains are organized similarly, so their findings in mice may be applicable to helping human patients who experience an inability to deal with stressful situations.

Image: 
Minghui Wang/Van Aelst Lab, CSHL/2021

Everyone faces stress occasionally, whether in school, at work, or during a global pandemic. However, some cannot cope as well as others. In a few cases, the cause is genetic. In humans, mutations in the OPHN1 gene cause a rare X-linked disease that includes poor stress tolerance. Cold Spring Harbor Laboratory (CSHL) Professor Linda Van Aelst seeks to understand factors that cause specific individuals to respond poorly to stress. She and her lab studied the mouse gene Ophn1, an analog of the human gene, which plays a critical role in developing brain cell connections, memories, and stress tolerance. When Ophn1 was removed in a specific part of the brain, mice expressed depression-like helpless behaviors. The researchers found three ways to reverse this effect.

To test for stress, the researchers put mice into a two-room cage with a door in between. Normal mice escape from the room that gives them a light shock on their feet. But animals lacking Ophn1 sit helplessly in that room without trying to leave. Van Aelst wanted to figure out why.

Her lab developed a way to delete the Ophn1 gene in different brain regions. They found that removing Ophn1 from the prelimbic region of the medial prefrontal cortex (mPFC), an area known to influence behavioral responses and emotion, induced the helpless phenotype. Then the team figured out which brain circuit was disrupted by deleting Ophn1, creating overactivity in the brain region and ultimately the helpless phenotype.

Understanding the circuit

Pyramidal neurons are central to this brain circuit. If they fire too much, the mouse becomes helpless.

Another cell, an interneuron, regulates the pyramidal neuron activity, making sure it does not fire too much.

These two cells feed back to each other, creating a loop.

Ophn1 controls a particular protein, RhoA kinase, within this feedback loop, which helps regulate and balance activity.
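
To make the loop concrete, here is a minimal two-population firing-rate sketch, with parameters invented for illustration rather than fitted to the study: weakening the interneuron-to-pyramidal inhibition raises the pyramidal population's steady activity, loosely mimicking the overactivity described for Ophn1 loss.

```python
# Minimal excitatory-inhibitory rate model of the loop described above.
# Lowering w_ei (interneuron -> pyramidal inhibition) raises pyramidal
# activity. All parameters are illustrative assumptions.
def pyramidal_steady_state(w_ei, w_ie=1.0, w_ee=0.5, drive=1.0,
                           dt=0.01, steps=20000):
    e = i = 0.0
    relu = lambda x: max(x, 0.0)          # simple rectifying nonlinearity
    for _ in range(steps):                # Euler integration of the loop
        e += dt * (-e + relu(w_ee * e - w_ei * i + drive))
        i += dt * (-i + relu(w_ie * e))
    return e

print(pyramidal_steady_state(w_ei=1.0))   # intact inhibition: ~0.67
print(pyramidal_steady_state(w_ei=0.2))   # weakened inhibition: ~1.43
```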

Van Aelst found three agents that reversed the helpless phenotype. Fasudil, an inhibitor specific for RhoA kinase, mimicked the effect of the missing Ophn1. A second drug dampens excess pyramidal neuron activity. A third drug wakes up the interneurons to inhibit pyramidal neurons. Van Aelst says:

"So bottom line, if you can restore the proper activity in the medial prefrontal cortex, then you could rescue the phenotype. So that was actually very exciting. You should be open to anything. You never know. Everything is surprising."

Van Aelst hopes that understanding the complex feedback loop behind Ophn1-related stress responses will lead to better treatments for stress in humans.

Credit: 
Cold Spring Harbor Laboratory

New proposal for the management of low back pain with a proprioceptive approach

image: Researchers made each participant stand on a balance board, attached fasteners with vibrators to their body and legs, and then generated vibration signals on a PC, which they amplified and applied to the vibrators to induce stimulation.

Image: 
Image courtesy: Yoshifumi Morita from Nagoya Institute of Technology

Ever since the early humans learned to walk upright, they have suffered, as an unfortunate consequence of their erect posture, from low back pain. The modern understanding is that low back pain, in particular, is caused by postural instability resulting from poor "proprioception", the perception of the position of our own body parts in space. In fact, our trunk and lower legs are key to maintaining postural stability due to the presence of "proprioceptors"--sensory receptors responding to position and movement--in those areas.

Elderly people suffering from low back pain tend to have poorly performing proprioceptors, which is thought to affect their "proprioceptive control strategy"--a postural control strategy in response to vibratory stimulations as proprioceptive input. Interestingly, studies have suggested that local vibratory stimulation can, in fact, improve proprioceptive function. However, previous studies by other researchers leave its effect on postural control unclear. Moreover, those studies make no distinction between poor and healthy proprioceptors and do not take into account the fact that each proprioceptor has a natural vibration response frequency.

To address these issues, a team of researchers from Japan recently conducted a study in which they explored the effect of local vibratory stimulations on the proprioceptive control strategy when applied to a poor proprioceptor. Prof. Yoshifumi Morita from Nagoya Institute of Technology, Japan, who was part of the study, published in Electronics, lays down the research question: "For elderly people with low back pain, can proprioceptive function be improved? Will it cure the low back pain?"

The researchers carried out their study over a period of 3 months, recruiting six elderly individuals, all of whom were patients with low back pain. They made each participant stand on a balance board to assess their standing balance and attached fasteners with vibrators to their legs as well as both sides of their trunk. They then generated vibration signals using a PC, amplified them, and delivered them through the vibrators as mechanical vibratory stimulation. Furthermore, they allowed the frequency of stimulation to vary with time, from an initial 20 Hz (cycles/second) up to 300 Hz, to gauge the postural response as a function of the applied frequency. Finally, they compared the proprioceptive control strategy in each patient before and after applying the stimulations to an impaired proprioceptor.
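
A swept vibration signal of the kind described, rising from 20 Hz to 300 Hz, can be sketched in a few lines; the sweep duration and sampling rate below are assumptions for illustration, not the study's exact protocol.

```python
# Sketch of a 20 -> 300 Hz swept vibration signal like the one described
# above. Duration and sample rate are hypothetical.
import numpy as np
from scipy.signal import chirp

fs = 2000                                    # samples/s, well above 2 x 300 Hz
t = np.linspace(0, 30, 30 * fs, endpoint=False)   # 30 s sweep (assumed)
signal = chirp(t, f0=20, t1=t[-1], f1=300, method='linear')
# 'signal' would then be scaled, amplified, and fed to the vibrators.
```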

Three patients showed an improvement in their proprioceptive control strategy after their impaired muscle spindles (proprioceptors that detect stretch in muscles) responded to higher frequencies, an observation suggesting that low back pain could be alleviated by activating impaired proprioceptors with vibratory stimulation. Furthermore, the treatment device and protocol could be used across multiple frequency ranges, allowing for both the diagnosis and the activation of a poor proprioceptor.

Given the results, the researchers look forward to conducting a clinical trial for a larger group of patients. "The clinical trial is scheduled to start in April this year and will be conducted for the next three years. We plan to verify whether the improved proprioceptive sensation can be maintained for a long time, thus relieving elderly people of low back pain," comments an excited Prof. Morita.

The team hopes that the trial's findings will soon lead to the commercialization of their device, which will allow elderly patients with low back pain to finally breathe a huge sigh of relief!

Credit: 
Nagoya Institute of Technology

Scientists develop eco-friendly pollen sponge to tackle water contaminants

video: Scientists led by NTU Singapore have developed an eco-friendly pollen sponge to tackle water contaminants, making it a promising alternative for tackling marine oil spills.

Image: 
NTU Singapore

A team of scientists led by Nanyang Technological University, Singapore (NTU Singapore) has created a reusable, biodegradable sponge that can readily soak up oil and other organic solvents from contaminated water sources, making it a promising alternative for tackling marine oil spills.

Made of sunflower pollen, the sponge is hydrophobic - it repels water - thanks to a coat of natural fatty acid on the sponge. In lab experiments, the scientists showed the sponge's ability to absorb oil contaminants of various densities, such as gasoline and motor oil, at a rate comparable to that of commercial oil absorbents.

Oil spills are difficult to clean up, and result in severe long-lasting damage to the marine ecosystem. Conventional clean-up methods, including using chemical dispersants to break oil down into very small droplets, or absorbing it with expensive, unrecyclable materials, may worsen the damage. 

So far, the researchers have engineered sponges that measure 5 cm in diameter. The research team, made up of scientists from NTU Singapore and Sungkyunkwan University in South Korea, believes that these sponges, when scaled up, could be an eco-friendly alternative to tackle marine oil spills.

Professor Cho Nam-Joon from the NTU School of Materials Science and Engineering, who led the study, said: "By finetuning the material properties of pollen, our team successfully developed a sponge that can selectively target oil in contaminated water sources and absorb it. Using a material that is found abundantly in nature also makes the sponge affordable, biodegradable, and eco-friendly."

This study builds on NTU's body of work on finding new uses for pollen, known as the diamond of the plant kingdom for its hard exterior, by transforming its tough shell into microgel particles. This soft, gel-like material is then used as a building block for a new category of environmentally sustainable materials. 

Last year, Prof Cho, together with NTU President Professor Subra Suresh, led a research team to create a paper-like material from pollen as a greener alternative to paper created from trees. This 'pollen paper' also bends and curls in response to changing levels of environmental humidity, a trait that could be useful for soft robots, sensors, and artificial muscles. 

Prof Cho, who also holds the Materials Research Society of Singapore Chair in Materials Science and Engineering, added: "Pollen that is not used for plant pollination is often considered biological waste. Through our work, we try to find new uses for this 'waste' and turn it into a natural resource that is renewable, affordable, and biodegradable. Pollen is also biocompatible. It does not cause an immunological, allergic or toxic reaction when exposed to body tissues, making it potentially suitable for applications such as wound dressing, prosthetics, and implantable electronics."

The findings were published in the scientific journal Advanced Functional Materials in March.

Building a sponge from pollen

To form the sponge, the NTU team first transformed the ultra-tough pollen grains from sunflowers into a pliable, gel-like material through a chemical process akin to conventional soap-making. 

This process includes removing the sticky oil-based pollen cement that coats the grain's surface, before incubating the pollen in alkaline conditions for three days. The resulting gel-like material was then freeze-dried.

These processes resulted in the formation of pollen sponges with 3D porous architectures. The sponges were briefly heated to 200°C - a step that makes their form and structure stable after repeatedly absorbing and releasing liquids. Heating also led to a two-fold improvement in the sponge's resistance to deformation, the scientists found.

To make sure the sponge selectively targets oil and does not absorb water, the scientists coated it with a layer of stearic acid, a type of fatty acid found commonly in animal and vegetable fat. This renders the sponge hydrophobic while maintaining its structural integrity.

The scientists performed oil-absorption tests on the pollen sponge with oils and organic solvents of varying densities, such as gasoline, pump oil, and n-hexane (a chemical found in crude oil). 

They found that the sponge had an absorption capacity in the range of 9.7 to over 29.3 g/g. This is comparable to commercial polypropylene absorbents, which are petroleum derivatives and have an absorption capacity range of 8.1 to 24.6 g/g.

They also tested the sponge for its durability and reusability by repeatedly soaking it in silicone oil, then squeezing the oil out. They found that this process could go on for at least 10 cycles. 

In a final proof-of-concept experiment, the team tested the ability of a sponge 1.5 cm in diameter and 5 mm in height to absorb motor oil from a contaminated water sample. The sponge readily absorbed the motor oil in less than 2 minutes.

"Collectively, these results demonstrate that the pollen sponge can selectively absorb and release oil contaminants and has similar performance levels to commercial oil absorbents while demonstrating compelling properties such as low cost, biocompatibility, and sustainable production," said Prof Cho, the corresponding author of this study. 

Going forward, the researchers plan to scale up the size of pollen sponges to meet industry needs. They are also looking to collaborate with non-governmental organisations and international partners to conduct pilot tests with pollen sponges in real-life environments. 

"We hope our innovative pollen materials can one day replace widely-used plastics and help to curb the global issue of plastic pollution," said Prof Cho.

Credit: 
Nanyang Technological University

Study demonstrates the need to monitor the bit area of event horses

image: The object of the current study was event horses, that is horses which compete in events composed of three phases: show jumping, dressage and a cross-country test.

Image: 
TheOtherKev / Pixabay

In a study conducted at the Faculty of Veterinary Medicine, bit-related lesions were observed in half of the event horses examined after competitions. Since most oral lesions are not accompanied by bleeding outside the mouth, the bit area should be monitored.

The study found that event horses wearing thin or thick bits had a greater risk of moderate or severe oral lesions than horses wearing medium-sized bits, while straight bits were associated with lesions in the bars of the horse's mouth.

"Our recommendation is to use a jointed bit of moderate thickness, that is 14 to 17 millimetres, if the size of the mouth is not known, paying particular attention to the handling of mares and both warmblood and coldblood event horses. They were seen to have a greater risk of mouth lesions compared to geldings and ponies," says doctoral student and veterinarian Kati Tuomola from the Faculty of Veterinary Medicine, University of Helsinki.

"Since most mouth lesions are not evidenced as bleeding outside the mouth, the bit area should be monitored, and event organisers should carry out systematic oral inspections," Tuomola adds.

The research group has previously investigated oral lesions in trotters. The object of the current study was event horses, that is horses which compete in events composed of three phases: show jumping, dressage and a cross-country test. The study was conducted in cooperation with the Equestrian Federation of Finland, and participation in it was voluntary.

"As many as 95% of the equestrians invited for the study wanted their horse to participate, for which we are thankful," says a pleased Tuomola.

For the study, 208 randomly selected horses which were 4 to 19 years old were examined in eight events organised in western Finland in 2018 and 2019. The front part of the horses' mouths was examined after a cross-country test, the last phase of the events. A total of 127 of the horses were warmbloods, 52 coldbloods and 29 ponies. Of the 208 horses examined, 52% had lesions in the bit area. Bruises were found in 39% of the horses and wounds in 19%. Among the group, 48% had no acute lesions, while 22% had mild, 26% moderate and 4% severe lesions. Blood was observed inside the mouth of one horse.

Overall, event horses had fewer and less severe lesions than the trotters studied earlier.

"The field scoring system developed by Kati Tuomola in her doctoral thesis makes it possible to conduct comparative studies of different groups of horses, something that was previously impossible," says Professor Anna Valros from the Faculty of Veterinary Medicine, University of Helsinki.

Credit: 
University of Helsinki

Entropy measurements reveal exotic effect in "magic-angle" graphene

image: Pomeranchuk effect in magic angle graphene, revealing an exotic transition between two phases: A (Fermi) liquid phase, where the spatial positions of electrons are disordered but their magnetic moments (arrows) are perfectly aligned, and a solid-like phase where the electrons are ordered in space but their magnetic moments are fluctuating freely. Counterintuitively, the liquid phase transforms to the solid-like phase upon heating.

Image: 
Weizmann Institute of Science

Most materials go from being solids to liquids when they are heated. One rare counter-example is helium-3, which can solidify upon heating. This counterintuitive and exotic effect, known as the Pomeranchuk effect, may now have found its electronic analogue in a material known as magic-angle graphene, says a team of researchers from the Weizmann Institute of Science led by Prof. Shahal Ilani, in collaboration with Prof. Pablo Jarillo-Herrero's group at the Massachusetts Institute of Technology (MIT).

This result, published today in Nature, comes thanks to the first-ever measurement of electronic entropy in an atomically thin two-dimensional material. "Entropy describes the level of disorder in a material and determines which of its phases is stable at different temperatures," explains Ilani. "Our team set out to measure the electronic entropy in magic angle graphene to resolve some of its outstanding mysteries, but discovered another surprise."

Giant magnetic entropy

Entropy is a basic physical quantity that is not easy to grasp or measure directly. At low temperatures, most of the degrees of freedom in a conducting material freeze out, and only the electrons contribute to the entropy. In bulk materials, there is an abundance of electrons, and thus it is possible to measure their heat capacity and from that deduce the entropy. In an atomically thin two-dimensional material, the small number of electrons makes such a measurement extremely challenging. Until now, no experiment had succeeded in measuring the entropy in such systems.

To measure the entropy, the Weizmann team used a unique scanning microscope comprising a carbon nanotube single-electron transistor positioned at the edge of a scanning probe cantilever. This instrument can spatially image the electrostatic potential produced by electrons in a material with unprecedented sensitivity. Based on Maxwell's relations, which connect the different thermodynamic properties of a material, one can use these electrostatic measurements to directly probe the entropy of the electrons.
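
Schematically, the thermodynamic step relies on a Maxwell relation of the following form, where \(\mu\) is the chemical potential measured electrostatically, \(n\) the electron density, \(T\) the temperature, and \(s\) the entropy density (notation chosen here for illustration):

```latex
\left(\frac{\partial \mu}{\partial T}\right)_{n} = -\left(\frac{\partial s}{\partial n}\right)_{T}
\qquad\Longrightarrow\qquad
s(n) = -\int_{0}^{n} \left(\frac{\partial \mu}{\partial T}\right)_{n'} \mathrm{d}n'
```

Measuring how the electrostatic potential shifts with temperature at fixed density thus yields the electronic entropy without a heat-capacity measurement.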

"When we performed the measurements at high magnetic fields, the entropy looked absolutely normal, following the expected behaviour of a conventional (Fermi) liquid of electrons, which is the most standard state in which electrons exist at low temperatures. Surprisingly, however, at zero magnetic field, the electrons exhibited giant excess entropy, whose presence was very mysterious." says Ilani. This giant entropy emerged when the number of electrons in the system was about one per each site of the artificial "superlattice" formed in magic angle graphene.

Artificial "superlattice" in twisted layers of graphene

Graphene is a one atom thick crystal of carbon atoms arranged in a hexagonal lattice. When two graphene sheets are placed on top of each other with a small and special, or "magic", misalignment angle, a periodic moiré pattern appears that acts as an artificial "superlattice" for the electrons in the material. Moiré patterns are a familiar effect in fabrics and emerge wherever one mesh overlays another at a slight angle.

In magic angle graphene, the electrons come in four flavours: spin "up" or spin "down", and two "valleys". Each moiré site can thus hold up to four electrons, one of each flavour.

Researchers already knew that this system behaves as a simple insulator when all moiré sites are completely full (four electrons per site). In 2018, however, Prof. Jarillo-Herrero and colleagues discovered to their surprise that it can be insulating at other integer fillings (two or three electrons per moiré site), which could only be explained if a correlated state of electrons is formed. However, near a filling of one electron per moiré site, the vast majority of transport measurements indicated that the system is quite simple, behaving as an ordinary metal. This is exactly where the entropy measurements by the Weizmann-MIT team found the most surprising results.

"In contrast to the behaviour seen in transport near a filling of one electron per moiré site, which is quite featureless, our measurements indicated that thermodynamically, the most dramatic phase transition occurs at this filling", says Dr. Asaf Rozen, a lead author in this work. "We realized that near this filling, upon heating the material, a rather conventional Fermi liquid transforms into a correlated metal with a giant magnetic entropy. This giant entropy (of about 1 Boltzmann constant per lattice site) could only be explained if each moiré site has a degree of freedom that is completely free to fluctuate".

An electronic analogue of the Pomeranchuk effect

"This unusual excess entropy reminded us of an exotic effect that was discovered about 70 years ago in helium-3", says Weizmann theorist Prof. Erez Berg. "Most materials, when heated up, transform from a solid to a liquid. This is because a liquid always has more entropy than the solid, as the atoms move more erratically in the liquid than in the solid." In helium-3, however, in a small part of the phase diagram, the material behaves completely oppositely, and the higher temperature phase is the solid. This behaviour, predicted by Soviet theoretical physicist Isaak Pomeranchuk in the 1950s, can only be explained by the existence of another "hidden" source of entropy in the system. In the case of helium-3, this entropy comes from the freely rotating nuclear spins. "Each atom has a spin in its nucleus (an 'arrow' that can point in any direction)," explains Berg. "In liquid helium-3, due to the Pauli exclusion principle, exactly half of the spins must point up and half must point down, so spins cannot freely rotate. In the solid phase, however, the atoms are localized and never come close to each other, so their nuclear spins can freely rotate."

"The giant excess entropy that we observed in the correlated state with one electron per moiré site is analogous to the entropy in solid helium-3, but instead of atoms and nuclear spins, in the case of magic angle graphene we have electrons and electronic spins (or valley magnetic moments)", he says.

The magnetic phase diagram

To establish the relation with the Pomeranchuk effect further, the team performed detailed measurements of the phase diagram. This was done by measuring the "compressibility" of the electrons in the system - that is, how hard it is to squeeze additional electrons into a given lattice site (such a measurement was demonstrated in twisted bilayer graphene in the team's previous work). This measurement revealed two distinct phases separated by a sharp drop in the compressibility: a low-entropy, electronic liquid-like phase, and a high-entropy solid-like phase with free magnetic moments. By following the drop in the compressibility, the researchers mapped the boundary between the two phases as a function of temperature and magnetic field, demonstrating that the phase boundary behaves precisely as expected from the Pomeranchuk effect.

"This new result challenges our understanding of magic angle graphene," says Berg. "We imagined that the phases in this material were simple - either conducting or insulating, and expected that at such low temperatures, all the electronic fluctuations are frozen out. This turns out not to be the case, as the giant magnetic entropy shows".

"The new findings will provide fresh insights into the physics of strongly correlated electron systems and perhaps even help explain how such fluctuating spins affect superconductivity," he adds.

The researchers acknowledge that they do not yet know how to explain the Pomeranchuk effect in magic angle graphene. Is it exactly as in helium-3, in that the electrons in the solid-like phase remain at a great distance from each other, allowing their magnetic moments to stay completely free? "We are not sure," admits Ilani, "since the phase we have observed has a 'split personality' - some of its properties are associated with itinerant electrons while others can only be explained by thinking of the electrons as being localized on a lattice".

Credit: 
Weizmann Institute of Science

Unraveling the mysteries of sleep disorders in multiple system atrophy

image: Sleep disorders are an important but often overlooked symptom in multiple system atrophy that could help researchers understand the biological underpinnings of a rare neurodegenerative disease.

Image: 
Pexels

Unusual diseases are medical mysteries that fascinate us, and one such disease is multiple system atrophy, or MSA. This rare neurological disorder causes failures in the proper functioning of the body's autonomic system (processes that are not under our conscious control, such as blood pressure, breathing, and involuntary movement). The resulting symptoms can look like those of two other types of neurodegenerative disease: Parkinson's disease and cerebellar ataxia. In fact, MSA can be separated into a parkinsonism subtype or a cerebellar subtype based on whether the resultant movement-related symptoms bear greater similarity to one or the other. However, MSA also has other symptoms, with sleep disorders being common but under-researched. As a result, we understand very little about the factors that influence the presence of sleep disorders in patients with MSA. Being unable to sleep well makes life harder for these patients, who already suffer from the other symptoms of the condition; as such, attention to sleep disturbances is important for addressing patient needs.

With this in mind, researchers led by Dr. Hui-Fang Shang of Sichuan University in China set out to investigate three specific sleep disorders (Parkinson's disease-related sleep problems [PD-SP], excessive daytime sleepiness [EDS], and rapid eye movement sleep behavior disorder [RBD]) in patients with MSA. Dr. Shang explains, "Our goal was to determine the frequency of these three sleep disturbances in MSA, including in both subtypes. We also wanted to know whether sleep disorders affected how severe MSA was." Their findings have been published in Chinese Medical Journal.

After screening for MSA and excluding other neurological disorders, the researchers examined 165 patients using questionnaires to determine the presence of sleep-related symptoms and MSA severity. Dr. Shang and colleagues found that PD-SP occurred in 18.8% of patients, EDS in 27.3%, and RBD in 49.7%, whereas all three coexisted in 7.3% of patients. They also showed that PD-SP and EDS, but not RBD, were more common in the parkinsonism subtype than in the cerebellar subtype. Their analysis adjusted for patient age, duration of MSA, and usage of drug treatment for MSA, meaning these three factors did not help explain the differences in sleep disorders across the two MSA subtypes. They also found that male sex was associated with EDS and RBD in patients with MSA.
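
The adjusted comparison described above is the kind of analysis a logistic regression handles; the sketch below, with entirely synthetic data and invented variable names, shows the general shape of such a model (it is not the authors' code).

```python
# Hypothetical sketch of an adjusted analysis: logistic regression of a
# sleep symptom on MSA subtype, controlling for age, disease duration,
# and drug treatment. All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "subtype_p": rng.integers(0, 2, n),           # 1 = parkinsonism subtype
    "age":       rng.normal(62, 8, n).round(1),
    "duration":  rng.uniform(1, 8, n).round(1),   # years since onset
    "treated":   rng.integers(0, 2, n),
})
# Synthetic outcome: make EDS more likely in the parkinsonism subtype
logit_p = -1.0 + 1.2 * df["subtype_p"] + 0.02 * (df["age"] - 62)
df["eds"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

model = smf.logit("eds ~ subtype_p + age + duration + treated", data=df).fit()
print(model.params)   # the subtype coefficient should come out positive
```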

Most importantly, the greater the number of these sleep-related symptoms a patient had, the more severe their MSA was.

When the researchers combined their findings with results from previous brain imaging studies that looked at MSA and sleep disorders, they concluded that sleep disorders are associated with MSA-induced damage to certain regions of the brain. The location and distribution of degenerating neurons are different between the two subtypes, with a wider range of brain areas affected in the parkinsonism form. This could explain why patients with the parkinsonism subtype have more sleep disorders. Additionally, the connection between MSA severity and increased number of sleep-related symptoms could reflect the amount of neuronal damage. Because sleep disorders are associated with the loss of both dopaminergic neurons (that is, neurons producing the neurotransmitter dopamine) and non-dopaminergic neurons, the results shed some light on the underlying biological causes of MSA.

"For me, this research emphasizes the need to focus more on sleep disorders when treating patients with MSA," says Dr. Shang , explaining the scientific and clinical contributions of their work, "We are the first to perform such a systematic analysis of sleep-related symptoms in MSA, and any future research can build on what we did here to better understand this serious condition. Currently, MSA has no cure, so a greater emphasis on sleep disorders will help do two things: address the lower quality of life in these patients due to disturbed sleep, and provide scientific data for developing effective treatments."

The findings provide hope for a more holistic approach to improving patients' quality of life.

Credit: 
Cactus Communications

Parkinson's, cancer, type 2 diabetes share a key element that drives disease

image: Parkin protein (green signal) is in a different part of the cell than the mitochondria (red signal) at time 0 (left image) but then co-localizes with the mitochondria after 60 minutes (right image).

Image: 
Salk Institute

LA JOLLA--(April 7, 2021) When cells are stressed, chemical alarms go off, setting in motion a flurry of activity that protects the cell's most important players. During the rush, a protein called Parkin hurries to protect the mitochondria, the power stations that generate energy for the cell. Now Salk researchers have discovered a direct link between a master sensor of cell stress and Parkin itself. The same pathway is also tied to type 2 diabetes and cancer, which could open a new avenue for treating all three diseases.

"Our findings represent the earliest step in Parkin's alarm response that anyone's ever found by a long shot. All the other known biochemical events happen at one hour; we've now found something that happens within five minutes," says Professor Reuben Shaw, director of the NCI-designated Salk Cancer Center and senior author of the new work, detailed in Science Advances on April 7, 2021. "Decoding this major step in the way cells dispose of defective mitochondria has implications for a number of diseases."

Parkin's job is to clear away mitochondria that have been damaged by cellular stress so that new ones can take their place, a process called mitophagy. However, Parkin is mutated in familial Parkinson's disease, making the protein unable to clear away damaged mitochondria. While scientists have known for some time that Parkin somehow senses mitochondrial stress and initiates the process of mitophagy, no one understood exactly how Parkin was first sensing problems with the mitochondria--Parkin somehow knew to migrate to the mitochondria after mitochondrial damage, but there was no known signal to Parkin until after it arrived there.

Shaw's lab, which is well known for its work in the fields of metabolism and cancer, spent years intensely researching how the cell regulates a more general process of cellular cleaning and recycling called autophagy. About ten years ago, they discovered that an enzyme called AMPK, which is highly sensitive to cellular stress of many kinds, including mitochondrial damage, controls autophagy by activating an enzyme called ULK1.

Following that discovery, Shaw and graduate student Portia Lombardo began searching for autophagy-related proteins directly activated by ULK1. They screened about 50 different proteins, expecting about 10 percent to fit. They were shocked when Parkin topped the list. Biochemical pathways are usually very convoluted, involving up to 50 participants, each activating the next. Finding that a process as important as mitophagy is initiated by only three participants--first AMPK, then ULK1, then Parkin--was so surprising that Shaw could scarcely believe it.

To confirm the findings were correct, the team used mass spectrometry to reveal precisely where ULK1 was attaching a phosphate group to Parkin. They found that it landed in a new region other researchers had recently found to be critical for Parkin activation but hadn't known why. A postdoctoral fellow in Shaw's lab, Chien-Min Hung, then did precise biochemical studies to prove each aspect of the timeline and delineated which proteins were doing what, and where. Shaw's research now begins to explain this key first step in Parkin activation, which Shaw hypothesizes may serve as a "heads-up" signal from AMPK down the chain of command through ULK1 to Parkin to go check out the mitochondria after a first wave of incoming damage, and, if necessary, trigger destruction of those mitochondria that are too gravely damaged to regain function.

The findings have wide-ranging implications. AMPK, the central sensor of the cell's metabolism, is itself activated by a tumor suppressor protein called LKB1 that is involved in a number of cancers, as established by Shaw in prior work, and it is activated by a type 2 diabetes drug called metformin. Meanwhile, numerous studies show that diabetes patients taking metformin exhibit lower risks of both cancer and aging comorbidities. Indeed, metformin is currently being pursued as one of the first ever "anti-aging" therapeutics in clinical trials.

"The big takeaway for me is that metabolism and changes in the health of your mitochondria are critical in cancer, they're critical in diabetes, and they're critical in neurodegenerative diseases," says Shaw, who holds the William R. Brody Chair. "Our finding says that a diabetes drug that activates AMPK, which we previously showed can suppress cancer, may also help restore function in patients with neurodegenerative disease. That's because the general mechanisms that underpin the health of the cells in our bodies are way more integrated than anyone could have ever imagined."

Credit: 
Salk Institute

Scientists at IRB Barcelona identify a potential target to treat lung cancer

In cancer, personalised medicine takes advantage of the unique genetic changes in an individual tumour to find its vulnerabilities and fight it. Many tumours have a higher number of mutations due to an antiviral defence mechanism, the APOBEC system, which can accidentally damage DNA and cause mutations.

Researchers at IRB Barcelona led by Dr. Travis Stracker and Dr. Fran Supek have found the HMCES enzyme to be the Achilles heel of some lung tumours, specifically those with a higher number of mutations caused by the APOBEC system.

"We have discovered that blocking HMCES is very damaging to cells with an activated APOBEC system (which are many lung cancer cells), but much less so for those in which it is not activated (as is often the case in healthy cells)," explains Dr. Supek, ICREA researcher and head of the Genome Data Science lab at IRB Barcelona.

"Besides showing specificity to cancer cells, HMCES is potentially targetable by drugs, which makes it a great candidate for future lung cancer treatments," adds Dr. Stracker, former group leader at IRB Barcelona, now working at the National Cancer Institute (NIH/NCI) in the USA.

A multidisciplinary approach to double-check the best strategy

The study, a collaboration between two research groups working in different disciplines, combined computational and experimental approaches. Genetic screening experiments were performed using CRISPR/Cas9 on several types of human lung adenocarcinoma cell lines. "These experiments can interrogate the effects of removing each gene individually from the cancer cells and they allow us to see whether the cancer can tolerate this change," say Josep Biayna and Isabel Garcia, IRB Barcelona researchers and first and second authors of the article. Previous data from CRISPR genetic screens performed by other labs were also statistically analysed and confirmed the experimental results.

A mutation fog caused by a defence system

When cells sense a mismatch in their DNA, they undergo a DNA repair reaction to preserve genetic information. Remarkably, this reaction can become coupled to the APOBEC enzymes, which human cells typically use to defend against viruses and which play an important role in fighting hepatitis and HIV. This mechanism was previously described by the Genome Data Science lab and indicates that, in some cases, when the APOBEC enzymes and the DNA repair process are simultaneously active, APOBEC hijacks DNA repair, thus generating the mutation fog.

Outsmarting cancer evolution

Late-stage cancers can accumulate a high number of mutations in their DNA, which can cause the cancer to become more aggressive or to better resist drugs. Many of these mutations may be caused by APOBEC, which accelerates tumour evolution. Therefore, killing cancer cells that activate APOBEC should slow down tumour evolution and prevent the cancer from gaining new dangerous mutations.

Credit: 
Institute for Research in Biomedicine (IRB Barcelona)

We don't know how most mammals will respond to climate change, warn scientists

A new scientific review has found there are significant gaps in our knowledge of how mammal populations are responding to climate change, particularly in regions most sensitive to climate change. The findings are published in the British Ecological Society's Journal of Animal Ecology.

Nearly 25% of mammal species are threatened with extinction, a risk that is exacerbated by climate change. But the ways climate change is impacting animals now, and is projected to in the future, are known to be complex. Different environmental changes have multiple, and potentially contrasting, effects on different aspects of animals' lives, such as reproduction and survival (known as demographic rates).

A new review by a global team of researchers from 15 different institutions has found that most studies on terrestrial mammals only looked at one of these demographic rates at a time, potentially not showing the full picture of climate change impacts.

In a search of 5,728 terrestrial mammal species, the researchers found only 106 studies that looked at both survival and reproduction at the same time. This covered 87 species and constitutes less than 1% of all terrestrial mammals.

"Researchers often publish results on the effects of climate on survival or on reproduction - and not both. But only in rare cases does a climatic variable (say, temperature) consistently negatively or positively affect all studied rates of survival and reproduction." said Dr Maria Paniw from the University of Zurich and lead author of the review.

For example, higher temperatures could decrease the number of offspring, but if the offspring have a better chance of survival because of less competition, the population size won't necessarily be affected. On the other hand, if higher temperatures decrease both reproduction and survival, a study of only one of these could underestimate the effects on a population.
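
That compensation argument can be made concrete with a toy annual life-cycle model in which the population growth rate is adult survival plus fecundity times juvenile survival; the numbers below are invented purely for illustration.

```python
# Toy life-cycle model for the reasoning above:
# lambda = adult survival + fecundity * juvenile survival.
def growth_rate(adult_survival, fecundity, juvenile_survival):
    return adult_survival + fecundity * juvenile_survival

print(growth_rate(0.80, 2.0, 0.30))  # baseline:                          1.40
print(growth_rate(0.80, 1.5, 0.40))  # fewer offspring, better survival:  1.40
print(growth_rate(0.80, 1.5, 0.25))  # both rates decline:               ~1.18
```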

The review also found a mismatch in the regions where studies on climate change impacts on mammals were taking place and regions recognised as being the most vulnerable to climate change, meaning that we know very little about the complex climate impacts in the most climate-vulnerable regions of the globe.

"We were surprised by the lack of data on high-altitude (alpine) mammals. Climate change is expected to be very pronounced in higher elevations." said Dr Paniw. "In our review, we had a few alpine species, such as yellow-bellied marmots and plateau pikas, but I was expecting a study or two on iconic species such as snow leopards.

The review highlights the need for more research on mammal populations that account for multiple demographic responses across entire lifecycles.

"To inform evidence-based conservation, we need to prioritize more holistic approaches in data collection and integration to understand the mechanisms that drive population persistence." said Dr Paniw.

"There are many reasons why this data isn't being captured. An important aspect is that collecting such data requires long-term investment without immediate returns, which has not been favoured by many funding agencies and is also logistically challenging. These challenges are compounded in climate-vulnerable regions, which include many countries with underfunded infrastructure for long-term ecological research."

The review raises concerns that there are even bigger data gaps for animal groups that are less well studied than terrestrial mammals, such as insects and amphibians. This data is urgently needed to inform which species are most vulnerable to climate-driven extinction.

In this study, the researchers performed a literature review, using the species names of 5,728 terrestrial mammals to search databases of scientific papers for studies that quantified the relationship between demographic rates and climate variables, such as rainfall and temperature.

The researchers only included studies that linked at least two demographic rates, such as survival and reproduction. They also recorded where the studies that did so were distributed globally.
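
In code terms, that inclusion step amounts to a simple filter over the assembled study records; the record structure and entries below are invented for illustration.

```python
# Hypothetical sketch of the inclusion step described above: keep only
# studies linking climate to at least two demographic rates.
studies = [
    {"species": "Marmota flaviventer", "rates": {"survival", "reproduction"}},
    {"species": "Ochotona curzoniae",  "rates": {"survival"}},
    {"species": "Panthera uncia",      "rates": set()},
]

eligible = [s for s in studies if len(s["rates"]) >= 2]
print(f"{len(eligible)} of {len(studies)} studies link two or more rates")
```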

The researchers are now looking to perform similar reviews on less well studied animal groups. Dr Paniw added: "I would like to foster collaborations that will jumpstart new research and 'repurpose' existing data in climate-vulnerable areas of the globe to fill the knowledge gaps we identified in our work".

Credit: 
British Ecological Society

Are early treatments for cerebral palsy effective?

Symptoms of cerebral palsy, a neurological disorder that affects a person's ability to move and maintain balance and posture, appear early during childhood. A new analysis examines the effectiveness of therapies initiated from birth until 3 years of age for children with or at risk for cerebral palsy.

The analysis, which is published in Developmental Medicine & Child Neurology, included all systematic reviews from 2009-2020 that assessed the results of relevant published studies.

Investigators concluded that research has generated limited supportive data and cannot yet confirm a greater benefit from early versus later interventions; however, earlier, more targeted research efforts are emerging that may provide novel insights.

"With earlier identification now possible, our new imperative is to translate neuroscience evidence on early brain recovery in animals into transforming developmental motor outcomes in children with cerebral palsy," said co-author Diane L. Damiano, PhD, of the National Institutes of Health.

Credit: 
Wiley

New formulation of existing medicines proves highly effective against drug-resistant fungus

image: Mahmoud Ghannoum / Case Western Reserve University

Image: 
CWRU

CLEVELAND--A team of researchers from Case Western Reserve University has discovered a formulation of existing medicines that can significantly reduce the presence of the fungus Candida auris (C. auris) on skin, controlling its spread and potentially keeping it from forming infections that have a high mortality rate.

By using a proprietary formulation of the topical medications terbinafine or clotrimazole, researchers prevented the growth and spread of the fungus on the skin of a host; the findings appear in the most recent issue of the journal Antimicrobial Agents and Chemotherapy.

"It's a very difficult fungus to kill because it is highly resistant and opportunistic--generally taking hold in those whose immune systems are already battling other threats," said Mahmoud Ghannoum, who led the research as director of the Center for Medical Mycology at the Case Western Reserve School of Medicine and University Hospitals Cleveland Medical Center. "It's promising that we were able to decolonize--get rid of the fungus from the skin--with a unique formulation of medicines that have already been approved and are available."

The U.S. Centers for Disease Control and Prevention (CDC) has identified the fungus as a serious threat to public health because of its resistance to treatment by existing drugs. The fungus was first found in Japan in 2009 and has since been detected in India, the United States and other countries. The mortality rate from infection is about 60%, according to previously published research.

C. auris infections often occur in hospitals--with the fungus living undetected, even on the skin of recovering patients, as well as on clothing, bedding and other surfaces. Many who get infected are immunocompromised, including patients on antibiotics, which can suppress the good bacteria that help fight the fungus.

Identifying the fungus and diagnosing the infections it causes have been difficult. But emerging molecular-testing methods that can definitively detect the organism are becoming more widespread, Ghannoum said.

"Our results could prove to be one leg of the stool in stopping this unique threat," he said. "While there are many types of bacteria resistant to medicines, C. auris stands alone in this respect among fungi--creating significant challenges for treatment or eradication where it's taken hold."

In this study, researchers tested the proprietary formulation of medicines--a novel transdermal mixture of 1% terbinafine or 1% clotrimazole created with an emerging method known as "Advanced Penetration Technology"--on the skin of mice. Researchers are next seeking approval to test the blend of medications on human volunteers.

"As the only fungus that is multidrug resistant, we still have a lot to learn, especially in how widespread C. auris is in our health care settings," said Ghannoum, who is also the Marti D. and Jeffrey S. Davis Family Master Clinician in Cancer Innovation. "We feel encouraged that our findings are a piece of the puzzle in solving this serious health threat worldwide."

Credit: 
Case Western Reserve University