Tech

Engineered bacteria produce a coral antibiotic

image: Sea whips of the species Antillogorgia elisabethae produce antibiotic natural products. A research team at TUM has successfully produced one of these substances in the laboratory.

Image: 
Thomas Brück / TUM

Thomas Brück saw the sea whip Antillogorgia elisabethae for the first time 17 years ago while diving on a research trip to the Bahamas. He still vividly remembers the encounter, which took place 18 meters below the water's surface: "Their polyp-covered, violet branchlets moved gently in the current. A fascinating living organism!" Because the soft coral also contains various biologically active compounds, the biochemist has since studied its natural product biosynthesis.

Sea whips are protected, yet their survival is still threatened. The collection and sale of dried corals is a lucrative business, as they contain various active agents, including an anti-inflammatory molecule called pseudopterosin, which has been used in the cosmetics industry for years.

"Coral reefs fix and store the greenhouse gas carbon dioxide and are biodiversity hotspots. If we want to protect the world's reefs, we have to generate such biologically active natural products, via sustainable processes," says Brück.

Natural antibiotic from the biotechnological laboratory

Together with his team at the Werner Siemens Chair of Synthetic Biotechnology, he has now managed for the first time to produce one of the sea whip's active agents in the laboratory - without the need for a single reef inhabitant. The molecule "erogorgiaene" is an antibiotic. Initial bioactivity tests show that it is suitable for fighting multi-resistant tuberculosis pathogens.

Until now, using the active agent was almost unthinkable: The sea whip contains only extremely small quantities of erogorgiaene and is, moreover, a protected species - using it as a raw material source would be neither financially feasible nor ecologically responsible. Although production via conventional chemical synthesis is possible, it is complex and generates toxic waste. A kilo of the active agent would cost around EUR 21,000.

Sustainable biotechnology reduces the production costs

"However, with biotechnological methods, a consolidated erogorgiaene production is feasible, in a more environmentally friendly manner and much cheaper. With this method, the production costs per kilo would only be around EUR 9,000," emphasizes Brück.

The new method, which he has developed together with colleagues from Berlin, Canada, and Australia, consists of only two steps: The main work is done by genetically optimized bacteria that feed on glycerin - a residual substance from biodiesel production.

The bacteria generate a molecule that can then be converted into the desired active agent in a highly selective enzymatic step. No waste is produced in the process, as all ancillary products can be reused in a circular manner. A patent has been filed for the innovative production method.

Bioactive product development along the lines of nature

"The new technology platform for the production of bioactive natural products via biotechnological methodologies complies with all 12 criteria of Green Chemistry," says Thomas Brück. "In addition, it fulfills four of the UN Sustainability Goals: a healthy life for all, combating climate change and its effects, preservation and sustainable use of the oceans and maritime resources, and preservation of life on land."

The research team is now working on the biotechnological production of another coral active agent: Using nature as a model, the molecule erogorgiaene is to be converted into the active agent pseudopterosin in the laboratory.

Medical professionals place great hope in the latter: Clinical studies have shown that pseudopterosin inhibits inflammation through a new mechanism of action. It is therefore a potential therapeutic candidate for controlling excessive inflammatory reactions, for example during viral infections such as Covid-19, or in age-related chronic inflammation.

Credit: 
Technical University of Munich (TUM)

New superlattice material for future energy efficient devices

image: A team of international physicists including Jennifer Cano, PhD, of Stony Brook University, has created a new material in which two structures are layered to form a superlattice that at a high temperature is a super-efficient insulator, conducting current without dissipation or energy loss. The finding is detailed in a paper published in Nature Physics.

Image: 
Stony Brook University

STONY BROOK, NY, August 17, 2020 - A team of international physicists including Jennifer Cano, PhD, of Stony Brook University, has created a new material in which two structures are layered to form a superlattice that at a high temperature is a super-efficient insulator, conducting current without dissipation or energy loss. The finding, detailed in a paper published in Nature Physics, could be the basis of research leading to new, more energy-efficient electrical conductors.

The material is created and developed in a laboratory chamber. Over time, atoms attach to it and the material appears to grow - similar to the way rock candy forms. Surprisingly, it forms a novel ordered superlattice, which the researchers then tested for quantized electrical transport.

The research centers on the Quantum Anomalous Hall Effect (QAHE), which describes an insulator that conducts dissipationless current in discrete channels on its surfaces. Because QAHE current does not lose energy as it travels, it is similar to a superconducting current and, if industrialized, has the potential to improve energy-efficient technologies.
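The "discrete channels" mentioned above correspond to the textbook quantization of the Hall conductance; the relation below is the standard form of that statement, not an equation taken from the Nature Physics paper.

```latex
% Standard quantized (anomalous) Hall conductance: C is the integer Chern number
% counting the dissipationless edge channels, e the electron charge, h Planck's constant.
\sigma_{xy} = C \, \frac{e^{2}}{h}, \qquad \sigma_{xx} = 0
```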

"The main advance of this work is a higher temperature QAHE in a superlattice, and we show that this superlattice is highly tunable through electron irradiation and thermal vacancy distribution, thus presenting a tunable and more robust platform for the QAHE," says Cano, Assistant Professor in the Department of Physics and Astronomy in the College of Arts and Sciences at Stony Brook University and also an Affiliate Associate Research Scientist at the Flatiron Institute's Center for Computational Quantum Physics.

Cano and colleagues say they can advance this platform to other topological magnets. The ultimate goal would be to help transform future quantum electronics with the material.

The collaborative research is led by City College of New York under the direction of Lia Krusin-Elbaum, PhD. The research is supported in part by the National Science Foundation (grant numbers DMR-1420634 and HRD-1547830).

Credit: 
Stony Brook University

Flies and mosquitoes beware, here comes the slingshot spider

image: A slingshot spider is ready to launch its cone-shaped web at a flying insect. To do so, the spider will release a bundle of silk, allowing the tension line to release and catapult both the spider and the web.

Image: 
Lawrence E. Reeves

Running into an unseen spiderweb in the woods can be scary enough, but what if you had to worry about a spiderweb - and the spider - being catapulted at you? That's what happens to insects in the Amazon rain forests of Peru, where a tiny slingshot spider launches a web - and itself - to catch unsuspecting flies and mosquitoes.

Researchers at the Georgia Institute of Technology have produced what may be the first kinematic study of how this amazing arachnid stores enough energy to produce an acceleration of 1,300 meters per second squared - 100 times the acceleration of a cheetah. That acceleration produces velocities of 4 meters per second and subjects the spider to forces of approximately 130 Gs, more than 10 times what fighter pilots can withstand without blacking out.
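For readers who want to check those figures, the short sketch below rederives the g-force and launch timescale from the reported acceleration and peak velocity; it is simple kinematic arithmetic under a constant-acceleration assumption, not additional data from the study.

```python
# Back-of-the-envelope check of the kinematics quoted above (acceleration and
# peak velocity are from the article; everything derived here is arithmetic).

G = 9.81                 # standard gravity, m/s^2
acceleration = 1300.0    # m/s^2, reported for the slingshot spider
peak_velocity = 4.0      # m/s, reported launch velocity

g_force = acceleration / G                               # ~132 g, matching "approximately 130 Gs"
time_to_peak = peak_velocity / acceleration              # ~0.003 s, assuming constant acceleration
launch_distance = peak_velocity**2 / (2 * acceleration)  # ~6 mm travelled under that assumption

print(f"g-force: {g_force:.0f} g")
print(f"time to reach 4 m/s: {time_to_peak * 1000:.1f} ms")
print(f"distance covered: {launch_distance * 1000:.1f} mm")
```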

The Peruvian spider and its cousins stand out among arachnids for their ability to make external tools - in this case, their webs - and use them as springs to create ultrafast motion. Their ability to hold a ready-to-launch spring for hours while waiting for an approaching mosquito suggests yet another amazing tool: a latch mechanism to release the spring.

"Unlike frogs, crickets, or grasshoppers, the slingshot spider is not relying on its muscles to jump really quickly," said Saad Bhamla, an assistant professor in Georgia Tech's School of Chemical and Biomolecular Engineering who studies ultrafast organisms. "When it weaves a new web every night, the spider creates a complex, three-dimensional spring. If you compare this natural silk spring to carbon nanotubes or other human-made materials in terms of power density or energy density, it is orders of magnitude more powerful."

The study, supported by the National Science Foundation and National Geographic Society Foundation, was published August 17 in the journal Current Biology. Understanding how web silk stores energy could potentially provide new sources of power for tiny robots and other devices, and lead to new applications for the robust material, the researchers say.

Slingshot spiders, which belong to the family Theridiosomatidae, build three-dimensional conical webs with a tension line attached to the center. The Peruvian member of that spider family, which is about 1 millimeter in length, pulls the tension line with its front legs to stretch the structure while holding on to the web with its rear legs. When it senses a meal within range, the spider launches the web and itself toward a fly or mosquito.

If the launch is successful, the spider quickly wraps its meal in silk. If the spider misses, it simply pulls the tension line to reset the web for the next opportunity.

"We think this approach probably gives the spider the advantage of speed and surprise, and perhaps even the effect of stunning the prey," noted Symone Alexander, a postdoctoral researcher in Bhamla's lab. "The spiders are tiny, and they are going after fast-flying insects that are larger than they are. To catch one, you must be much, much faster than they are."

Slingshot spiders were described in a 1932 publication, and more recently by Jonathan Coddington, now a senior research entomologist at the Smithsonian Institution. Bhamla has an interest in fast-moving but small organisms, so he and Alexander arranged a trip to study the catapulting creature using ultrafast cameras to measure and record the movement.

"We wanted to understand these ultrafast movements because they can force our perspective to change from thinking about cheetahs and falcons as the only fast animals," Bhamla said. "There are many very small invertebrates that can achieve fast movement through unusual structures. We really wanted to understand how these spiders achieve that amazing acceleration."

The researchers traveled six hours by boat from Puerto Maldonado to the Tambopata Research Center. There is no electricity in the area, so nights are very dark. "We looked up and saw a tiny red dot," Bhamla recalled. "We were so far away from the nearest light that the dot turned out to be the planet Mars. We could also see the Milky Way so clearly."

The intense darkness raises the question of how the spider senses its prey and determines where to aim itself. Bhamla believes it must be using an acoustic sensing technique, a theory supported by the way the researchers tricked the spider into launching its web: They simply snapped their fingers.

Beyond sensing in the dark, the researchers also wondered how the spider triggers release of the web. "If an insect gets within range, the spider releases a small bundle of silk that it has created by crawling along the tension line," Alexander said. "Releasing the bundle controls how far the web flies. Both the spider and web are moving backward."

Another mystery is how the spider patiently holds the web while waiting for food to fly by. Alexander and Bhamla estimated that stretching the web requires a force of at least 200 dynes, a tremendous amount for a tiny spider to generate. Holding that tension for hours could waste a lot of energy.

"Generating 200 dynes would produce tremendous forces on the tiny legs of the spider," Bhamla said. "If the reward is a mosquito at the end of three hours, is that worth it? We think the spider must be using some kind of trick to lock its muscles like a latch so it doesn't need to consume energy while waiting for hours."

Beyond curiosity, why travel to Peru to study the creature? "The slingshot spider offers an example of active hunting instead of the passive 'wait for an insect to collide into the web' strategy, revealing a further new functionality of spider silk," Bhamla said. "Before this, we hadn't thought about using silk as a really powerful spring."

Another unintended benefit is changing attitudes toward spiders. Prior to the study, Alexander admits she had a fear of spiders. Being surrounded by slingshot spiders in the Peruvian jungle - and seeing the amazing things they do - changed that.

"In the rainforest at night, if you shine your flashlight, you quickly see that you are completely surrounded by spiders," she said. "In my house, we don't kill spiders anymore. If they happen to be scary and in in the wrong place, we safely move them to another location."

Alexander and Bhamla had hoped to return to Peru this summer, but those plans were cut short by the coronavirus. They're eager to continue learning from the spider.

"Nature does a lot of things better than humans can do, and nature has been doing them for much longer," she said. "Being out in the field gives you a different perspective, not only about what nature is doing, but also why that is necessary."

Credit: 
Georgia Institute of Technology

NASA satellite catches the end of Post-tropical Storm Kyle

image: NASA's Terra satellite provided a visible image of an elongated, weakening Post-tropical Storm Kyle as it continued moving away from New England on Sunday, Aug. 16, 2020.

Image: 
NASA Worldview

NASA's Terra satellite provided a visible image of the end of Post-tropical Storm Kyle in the North Atlantic Ocean on Aug. 16.

Kyle was a tropical storm for only one day after forming a couple of hundred miles off the coast of Rhode Island on Aug. 15. The next day, Aug. 16, Kyle became a post-tropical storm.

NHC defines a post-tropical cyclone as a former tropical cyclone. This generic term describes a cyclone that no longer possesses sufficient tropical characteristics to be considered a tropical cyclone. Post-tropical cyclones can continue carrying heavy rains and high winds. Two classes of post-tropical cyclones include extratropical and remnant low pressure areas.

On Sunday, Aug. 16, at 5 a.m. EDT (0900 UTC), the National Hurricane Center issued the final advisory on Kyle. The center of Post-Tropical Cyclone Kyle was located near latitude 40.0 degrees north and longitude 58.9 degrees west. The post-tropical cyclone was moving toward the east near 20 mph (31 kph). The estimated minimum central pressure was 1003 millibars. Maximum sustained winds had decreased to near 40 mph (65 kph) with higher gusts.

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite captured a visible image of Kyle at 2:30 p.m. EDT on Aug. 16, by which time the storm's circulation had become very elongated and its center ill-defined.

Model analyses and satellite imagery suggested on Aug. 16 that the low-pressure area became attached to a prominent warm/stationary front to its east and a weaker trailing cold front to its southwest. As a result, Kyle had become an extratropical low-pressure area.

By Monday morning, Aug. 17, Kyle had dissipated.

Credit: 
NASA/Goddard Space Flight Center

Contextual engineering adds deeper perspective to local projects

image: Ann-Perry Witmer, University of Illinois, developed contextual engineering to combine technical expertise with understanding of local conditions.

Image: 
College of ACES, University of Illinois

URBANA, Ill. - When engineers develop drinking water systems, they often expect their technology and expertise to work in any context. But project success depends as much on the people and place as on technical design, says Ann-Perry Witmer, lecturer in the Department of Agricultural and Biological Engineering (ABE) and research scientist at the Applied Research Institute at University of Illinois.

Contextual engineering is a novel approach combining technological expertise with deep understanding of cultural and societal conditions. Witmer developed the concept based on her years of experience in water design engineering. She outlines the practice in a recent article, published in Journal AWWA (American Water Works Association), the leading journal for water engineers in the U.S.

Witmer worked extensively as a consultant in water engineering before obtaining her doctorate and joining the ABE faculty. Her experience with both U.S. and international projects showed her that engineers often need to better understand the interplay of culture and technology.

"When you apply a technical process, you have to think about who you are doing it for and how it's going to be used," Witmer explains. "There's a predisposition when you're a design engineer or infrastructure manager to say, 'this is the way I know how to do things so this is the way we're going to do it,' rather than looking at the needs and capabilities of who's going to actually be using it and working from that point of view.

"This kind of 'off the shelf' behavior in designing and delivering ends up costing us more and just creating paths for failure, when things are not being used the way they're intended," she adds.

Instead, she recommends engineers consider the "why" of an operation before they develop the "how" of design and construction.

International engineering projects in particular highlight the shortcomings of a top-down approach to technology. Witmer gives an example from Ecuador, where U.S. engineers built a state-of-the-art water treatment plant to provide clean water to local communities. However, when the drinking water was analyzed in people's homes, it was found to be heavily contaminated with coliform bacteria. The problem turned out to be the water bottling plant, which wasn't following sanitary practices.

"This rural community was paying more for water because it was coming from the plant, but it wasn't being handled in a sanitary way," she says. "Rather than paying for treated water that ends up not treated, it would have been as simple as encouraging the community to boil their water before they drink it. In that case, they could get it from tank water at a fraction of the cost. And for the context that was much more effective at providing them with clean water."

Witmer notes it may be tempting to assume the water bottling plant needed standards and education, but that's another example of a perspective that isn't sensitive to local conditions.

"It gets really dicey when you are saying, 'all we have to do is educate people,' because it's an engineering predisposition to assume that people just don't know," she says. "Well, a lot of times they know a lot.

The importance of contextual understanding doesn't just pertain to projects in developing countries; it can be equally relevant within the U.S. For example, Witmer explains, a water plant in Wisconsin was developed by an engineering firm that normally operated in the Southwest. Because Wisconsin is colder, the water was more viscous and flowed more slowly, so the plant had much less capacity than anticipated. Nobody had thought to question whether the expertise of the engineering firm translated to a different environment.
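A minimal sketch of the physics behind the Wisconsin example: for laminar flow at a fixed pressure drop, the Hagen-Poiseuille relation makes flow rate inversely proportional to viscosity, so colder, more viscous water cuts capacity. The temperatures and viscosity values below are illustrative handbook numbers for pure water, not figures from the actual plant.

```python
# Flow rate at fixed pressure drop scales as 1/viscosity (Hagen-Poiseuille),
# so colder water moves more slowly through identical piping.

visc_warm = 1.00e-3   # Pa*s, water near 20 C (illustrative "Southwest" design case)
visc_cold = 1.52e-3   # Pa*s, water near 5 C (illustrative Wisconsin winter case)

capacity_ratio = visc_warm / visc_cold   # Q_cold / Q_warm for the same piping and pressure
print(f"cold-water capacity is about {capacity_ratio:.0%} of the design value")
# -> roughly 66%, i.e. about a third of the expected throughput is lost to viscosity alone
```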

Contextual engineering is inspired by social studies and follows a process of analyzing multiple components including people and stakeholders, local conditions, global influences, and the engineering design process. The approach outlines five key domains - political, cultural, economic, educational, and mechanical - that should be considered in the analysis.

The analysis does not have to be complicated but it does require immersion into the local culture and active listening, Witmer notes.

"Technical professionals have typically been trained to employ a rigorous engineering problem-solving approach, trusting in the power of their technology and expertise to work in any situation," she says. "It requires a shift in perspective to listen and absorb local conditions but doing so can greatly enhance the success of engineering projects that may otherwise fail due to various environmental factors."

The process starts with analysis at a global-level perspective, then moves to immersion at the local level. Pay attention to the different objectives of everyone involved, from the person who turns on the water tap in their house, to the utility manager, the state regulators, and the engineers, Witmer says. They each have their own perspectives and motivations, and it's important to understand and reconcile all of those.

Witmer recommends contextual engineering be part of the curriculum for engineering students, so cultural awareness becomes an integral part of their knowledge and practice. She's doing her part to make that happen through research, teaching, and workshops at Illinois.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Why young and female patients don't respond as well to cancer immunotherapy

image: Due to a process known as immuno-editing, younger and female patients with cancer tend to have cancer-causing genetic mutations that are less visible to the immune system, according to a new Nature Communications study by Hannah Carter, PhD, and team at UC San Diego School of Medicine.

Image: 
UC San Diego Health Sciences

Cancer immunotherapy -- empowering a patient's own immune system to clear away tumors on its own -- holds great promise for some patients. But for other patients, immunotherapy just doesn't work.

Researchers at University of California San Diego School of Medicine have found evidence that helps explain why patients who are young and/or female have especially low response rates to some types of cancer immunotherapy.

Their findings suggest that since the typically robust immune systems of young and female patients are better at getting rid of tumor cells, the cells left behind are not as readily visible to the immune system to begin with, rendering some types of immunotherapy ineffective.

The study is published August 17, 2020, in Nature Communications.

"Now that we know why some patients don't respond as well to immunotherapy, we can begin developing more informed approaches to treatment decisions -- for instance, developing predictive algorithms to determine a person's likely response before initiating immunotherapies that may have a high probability of not working or working poorly for them," said senior author Hannah Carter, PhD, associate professor of medicine at UC San Diego School of Medicine.

Cancerous or infected cells wave molecular flags that tell the immune system to clear them away before the problem gets out of control. The flag poles -- molecules of the Major Histocompatibility Complexes (MHC) -- are displayed at the surface of most cells in the body. MHCs hold up antigen flags -- bits of just about everything from inside the cells -- and display them to immune cell surveyors that are constantly checking for damaged or infected cells. Since tumor cells carry a lot of mutations, they show up frequently among these flags, allowing the immune system to detect and eliminate them.

But some tumor cells evade the immune system by also throwing up a stop sign molecule that keeps the immune system from recognizing the MHC flags. And here's where immune checkpoint inhibitors come in: This type of cancer immunotherapy uses antibodies to make the tumor cell once again visible to the patient's immune system.

So why would a person's age or sex influence how well immune checkpoint inhibitors work?

Sex and age differences have long been observed when it comes to immune response. For example, females have twice the antibody response to flu vaccines and are far more susceptible to autoimmune diseases. Similarly, human immune systems tend to weaken as we age. But if females and younger people have stronger immune responses in most cases, you might expect cancer immunotherapy to work better for them, not worse.

To get to the bottom of this conundrum, Carter's team looked at genomic information for nearly 10,000 patients with cancer available from the National Institutes of Health's The Cancer Genome Atlas, and another 342 patients with other tumor types available from the International Cancer Genome Consortium database and published studies. They found no age- or sex-related differences in MHC function.

What they did find was that, compared to older and male patients with cancer, younger and female patients tend to accumulate more cancer-causing genetic mutations of the sort that MHCs can't present to the immune system as efficiently. Carter said this is likely because robust immune systems of the young and female are better at getting rid of cells displaying well-presented mutant self-antigens, leaving behind tumor cells that rely more heavily on the poorly presented mutations. This selective pressure is known as immuno-editing.
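A hypothetical sketch of the kind of group comparison described above: score how well each patient's driver mutations are presented by MHC, then test whether the score distributions differ between groups (here, younger versus older patients). The scores below are random placeholders; the actual scoring pipeline used in the study is not reproduced here.

```python
# Placeholder group comparison: each patient gets a hypothetical "presentation
# score" for their driver mutations, and the two groups are compared with a
# nonparametric test. The data are simulated, not from the study.

import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
presentation_young = rng.normal(loc=0.45, scale=0.1, size=200)  # placeholder scores
presentation_old = rng.normal(loc=0.50, scale=0.1, size=200)    # placeholder scores

stat, p_value = mannwhitneyu(presentation_young, presentation_old,
                             alternative="less")  # test: younger patients' mutations less well presented
print(f"Mann-Whitney U = {stat:.0f}, p = {p_value:.2g}")
```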

"So if a tumor cell doesn't present highly visible, mutated self antigens to begin with, checkpoint inhibitor drugs can't help reveal them to the immune system," she said.

"This shows an important thing, that the interplay between the cancer genome and the adaptive arm of the immune system is not a static one," said co-author Maurizio Zanetti, MD, professor of medicine at UC San Diego School of Medicine and head of the Laboratory of Immunology at UC San Diego Moores Cancer Center. "Two simple but important variables, age and sex, influence this interplay. The study also emphasizes the master role of the MHC in dictating the outcome of this interplay, reaffirming its central role in the evolution of disease, cancer included, at the level of the individual and population."

Carter cautions that their findings for "younger" patients don't necessarily apply to children since, genetically speaking, pediatric tumors are very different from adult tumors. In addition, she noted that, like most genomics databases, those used in this study contain data primarily from people of Caucasian descent, and more diversity is needed to confirm that the findings can be generalized to all populations.

"Cancer isn't just one disease, and so the way we treat it can't be one-size-fits-all," she said. "All checkpoint inhibitors can do is remove the generic block that tumors put up to hide from the immune system. The more we learn about how interactions between tumors and immune systems might vary, the better positioned we are to tailor treatments to each person's situation."

Credit: 
University of California - San Diego

NASA researchers track slowly splitting 'dent' in Earth's magnetic field

image: This stereoscopic visualization shows a simple model of the Earth's magnetic field. The magnetic field partially shields the Earth from harmful charged particles emanating from the Sun.

Image: 
NASA's Goddard Space Flight Center

A small but evolving dent in Earth's magnetic field can cause big headaches for satellites.

Earth's magnetic field acts like a protective shield around the planet, repelling and trapping charged particles from the Sun. But over South America and the southern Atlantic Ocean, an unusually weak spot in the field - called the South Atlantic Anomaly, or SAA - allows these particles to dip closer to the surface than normal. Particle radiation in this region can knock out onboard computers and interfere with the data collection of satellites that pass through it - a key reason why NASA scientists want to track and study the anomaly.

The South Atlantic Anomaly is also of interest to NASA's Earth scientists who monitor the changes in magnetic field strength there, both for how such changes affect Earth's atmosphere and as an indicator of what's happening to Earth's magnetic fields, deep inside the globe.

Currently, the SAA creates no visible impacts on daily life on the surface. However, recent observations and forecasts show that the region is expanding westward and continuing to weaken in intensity. It is also splitting - recent data shows the anomaly's valley, or region of minimum field strength, has split into two lobes, creating additional challenges for satellite missions.

A host of NASA scientists in geomagnetic, geophysics, and heliophysics research groups observe and model the SAA, to monitor and predict future changes - and help prepare for future challenges to satellites and humans in space.

It's what's inside that counts

The South Atlantic Anomaly arises from two features of Earth's core: The tilt of its magnetic axis, and the flow of molten metals within its outer core.

Earth is a bit like a bar magnet, with north and south poles that represent opposing magnetic polarities and invisible magnetic field lines encircling the planet between them. But unlike a bar magnet, the core magnetic field is not perfectly aligned through the globe, nor is it perfectly stable. That's because the field originates from Earth's outer core: molten, iron-rich and in vigorous motion 1800 miles below the surface. These churning metals act like a massive generator, called the geodynamo, creating electric currents that produce the magnetic field.

As the core motion changes over time, due to complex geodynamic conditions within the core and at the boundary with the solid mantle up above, the magnetic field fluctuates in space and time too. These dynamical processes in the core ripple outward to the magnetic field surrounding the planet, generating the SAA and other features in the near-Earth environment - including the tilt and drift of the magnetic poles, which are moving over time. These evolutions in the field, which happen on a similar time scale to the convection of metals in the outer core, provide scientists with new clues to help them unravel the core dynamics that drive the geodynamo.

"The magnetic field is actually a superposition of fields from many current sources," said Terry Sabaka, a geophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. Regions outside of the solid Earth also contribute to the observed magnetic field. However, he said, the bulk of the field comes from the core.

The forces in the core and the tilt of the magnetic axis together produce the anomaly, the area of weaker magnetism - allowing charged particles trapped in Earth's magnetic field to dip closer to the surface.

The Sun expels a constant outflow of particles and magnetic fields known as the solar wind and vast clouds of hot plasma and radiation called coronal mass ejections. When this solar material streams across space and strikes Earth's magnetosphere, the space occupied by Earth's magnetic field, it can become trapped and held in two donut-shaped belts around the planet called the Van Allen Belts. The belts confine the particles to travel along Earth's magnetic field lines, continually bouncing back and forth from pole to pole. The innermost belt begins about 400 miles from the surface of Earth, which keeps its particle radiation a healthy distance from Earth and its orbiting satellites.

However, when a particularly strong storm of particles from the Sun reaches Earth, the Van Allen belts can become highly energized and the magnetic field can be deformed, allowing the charged particles to penetrate the atmosphere.

"The observed SAA can be also interpreted as a consequence of weakening dominance of the dipole field in the region," said Weijia Kuang, a geophysicist and mathematician in Goddard's Geodesy and Geophysics Laboratory. "More specifically, a localized field with reversed polarity grows strongly in the SAA region, thus making the field intensity very weak, weaker than that of the surrounding regions."

A pothole in space

Although the South Atlantic Anomaly arises from processes inside Earth, it has effects that reach far beyond Earth's surface. The region can be hazardous for low-Earth orbit satellites that travel through it. If a satellite is hit by a high-energy proton, it can short-circuit and cause an event called a single-event upset, or SEU. This can cause the satellite's function to glitch temporarily or can cause permanent damage if a key component is hit. In order to avoid losing instruments or an entire satellite, operators commonly shut down non-essential components as they pass through the SAA. Indeed, NASA's Ionospheric Connection Explorer regularly travels through the region and so the mission keeps constant tabs on the SAA's position.

The International Space Station, which is in low-Earth orbit, also passes through the SAA. It is well protected, and astronauts are safe from harm while inside. However, the ISS has other passengers affected by the higher radiation levels: Instruments like the Global Ecosystem Dynamics Investigation mission, or GEDI, collect data from various positions on the outside of the ISS. The SAA causes "blips" on GEDI's detectors and resets the instrument's power boards about once a month, said Bryan Blair, the mission's deputy principal investigator and instrument scientist, and a lidar instrument scientist at Goddard.

"These events cause no harm to GEDI," Blair said. "The detector blips are rare compared to the number of laser shots - about one blip in a million shots - and the reset line event causes a couple of hours of lost data, but it only happens every month or so."

In addition to measuring the SAA's magnetic field strength, NASA scientists have also studied the particle radiation in the region with the Solar, Anomalous, and Magnetospheric Particle Explorer, or SAMPEX - the first of NASA's Small Explorer missions, launched in 1992 and providing observations until 2012. One study, led by NASA heliophysicist Ashley Greeley as part of her doctoral thesis, used two decades of data from SAMPEX to show that the SAA is slowly but steadily drifting in a northwesterly direction. The results helped confirm models created from geomagnetic measurements and showed how the SAA's location changes as the geomagnetic field evolves.
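The drift estimate described above amounts to fitting a trend line to the anomaly's position over time. The sketch below illustrates that idea with a simple linear fit; the yearly positions are made-up placeholders, not SAMPEX measurements.

```python
# Illustrative drift estimate: fit a straight line to anomaly-center longitude
# versus year. The positions are simulated placeholders, not SAMPEX data.

import numpy as np

years = np.arange(1993, 2013)   # two decades of (hypothetical) yearly observations
rng = np.random.default_rng(1)
lon = -45.0 - 0.3 * (years - years[0]) + rng.normal(0, 0.2, years.size)  # westward drift + noise

drift_rate, intercept = np.polyfit(years, lon, deg=1)  # degrees of longitude per year
print(f"estimated westward drift: {abs(drift_rate):.2f} deg/yr")
```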

"These particles are intimately associated with the magnetic field, which guides their motions," said Shri Kanekal, a researcher in the Heliospheric Physics Laboratory at NASA Goddard. "Therefore, any knowledge of particles gives you information on the geomagnetic field as well."

Greeley's results, published in the journal Space Weather, were also able to provide a clear picture of the type and amount of particle radiation satellites receive when passing through the SAA, which emphasized the need for continuing monitoring in the region.

The information Greeley and her collaborators garnered from SAMPEX's in-situ measurements has also been useful for satellite design. Engineers for the Low-Earth Orbit, or LEO, satellite used the results to design systems that would prevent a latch-up event from causing failure or loss of the spacecraft.

Modeling a safer future for satellites

In order to understand how the SAA is changing and to prepare for future threats to satellites and instruments, Sabaka, Kuang and their colleagues use observations and physics to contribute to global models of Earth's magnetic field.

The team assesses the current state of the magnetic field using data from the European Space Agency's Swarm constellation, previous missions from agencies around the world, and ground measurements. Sabaka's team teases apart the observational data to separate out its sources before passing it on to Kuang's team. They combine the sorted data from Sabaka's team with their core dynamics model to forecast geomagnetic secular variation (the continuous change of the magnetic field over time) into the future.

The geodynamo models are unique in their ability to use core physics to create near-future forecasts, said Andrew Tangborn, a mathematician in Goddard's Planetary Geodynamics Laboratory.

"This is similar to how weather forecasts are produced, but we are working with much longer time scales," he said. "This is the fundamental difference between what we do at Goddard and most other research groups modeling changes in Earth's magnetic field."

One such application that Sabaka and Kuang have contributed to is the International Geomagnetic Reference Field, or IGRF. Used for a variety of research from the core to the boundaries of the atmosphere, the IGRF is a collection of candidate models made by worldwide research teams that describe Earth's magnetic field and track how it changes in time.

"Even though the SAA is slow-moving, it is going through some change in morphology, so it's also important that we keep observing it by having continued missions," Sabaka said. "Because that's what helps us make models and predictions."

The changing SAA provides researchers new opportunities to understand Earth's core, and how its dynamics influence other aspects of the Earth system, said Kuang. By tracking this slowly evolving "dent" in the magnetic field, researchers can better understand the way our planet is changing and help prepare for a safer future for satellites.

Credit: 
NASA/Goddard Space Flight Center

Climate change mitigation not the primary motivator in regenerative ranching

CORVALLIS, Ore. - Regenerative ranching, a holistic approach to managing grazing lands, enhances ranchers' adaptive capacity and socioeconomic well-being while also providing an opportunity to mitigate climate change, a new study from Oregon State University has found.

Regenerative ranching practices rebuild ecological processes, allowing ranchers to reduce reliance on products such as chemical herbicides, pesticides and fertilizers, which are significant sources of greenhouse gas emissions.

While some science suggests that regenerative ranching can result in climate change mitigation through carbon drawdown into soils, that is not usually the driving factor behind ranchers' decision to adopt the practice, said the study's lead author, Hannah Gosnell, an OSU geographer who studies the human dimensions of climate change.

Understanding what motivates ranchers to adopt carbon-friendly practices will play an important role in efforts to expand the use of managed grazing systems to reduce climate change impacts, said Gosnell, a professor in Oregon State's College of Earth, Ocean, and Atmospheric Sciences.

"What we found is that ranchers manage regeneratively for all these other benefits, and if there's some measureable soil carbon sequestration and it contributes to climate change mitigation, then that's icing on the cake," she said.

The findings were just published in the Royal Society journal Interface Focus as part of a special issue on carbon dioxide removal. Co-authors are Susan Charnley of the U.S. Forest Service and Paige Stanley of the University of California, Berkeley.

More than a third of the Earth's ice-free land surface is used for livestock grazing. Livestock production, while important to livelihoods across the world, is a significant source of greenhouse gas emissions, a key contributor to climate change, Gosnell said.

Regenerative ranching is drawing increased interest as a potential climate change solution. Previous studies have suggested that these practices boost soil carbon sequestration, a process by which carbon dioxide from the atmosphere is transferred into and stored in soil through vegetation, and increase resilience to drought, which help ranchers both mitigate and adapt to the effects of climate change, Gosnell said.

To better understand ranchers' motivations and interest in regenerative agriculture practices, Gosnell interviewed ranchers in the United States and Australia about the perceived benefits and challenges of adopting the practices.

She and her colleagues found that the transition to regenerative ranching is often difficult because the practices require a thorough understanding of the fundamental ecosystem processes involved. They also found that offering incentives such as cash payments is not the most promising way to convince ranchers to make the switch, since the practice requires a paradigm shift in thinking along with a new set of practices.

"It's hard to transition to regenerative ranching because it requires such a deep commitment," Gosnell said. "If you want ranchers to make the switch, paying them is likely not motivation enough."

The most common benefit of regenerative agriculture mentioned by the ranchers interviewed was the increase in deep ground cover, which increases soil carbon sequestration and leads to increased forage for livestock and greater resilience to stressors such as droughts, floods or freezing temperatures. Because ranchers using regenerative practices were not dependent on expensive chemicals, they also were less vulnerable to financial shocks and stressors, which in turn increased their resilience, Gosnell said.

Improved water retention, increased soil fertility and other benefits from regenerative ranching motivate ranchers to continue using the approach once they adopt it, through a process of self-amplifying positive feedbacks, she said.

"As a result of their new practices, ranchers see less bare ground, more native perennials, more biodiversity and more forage for their cattle, all without use of chemicals," she said. "This inspires them to continue with regenerative practices, which then leads to more ecological improvement, better economic returns and more positive feedback for the rancher."

There are few opportunities for ranchers to be paid through carbon markets, a trading program where those who emit carbon purchase "offsets" or credits from an entity that is reducing its carbon footprint or increasing carbon sequestration. Also, because the approach takes tremendous dedication, cash incentives alone may not suffice, Gosnell said.

"Putting a price on carbon and incentivizing practices with payments is probably necessary, but certainly not sufficient for the approach to scale up," Gosnell said. "A broader shift in practices will likely require a 'bottom-up' approach involving networks of like-minded individuals contributing to cultural change within agriculture and the cultivation of new markets for regenerative products."

Research, outreach and education are also needed to help ranchers develop a deep understanding of the ecological processes that make the switch to regenerative ranching effective, she said.

"This is a low-cost, low-tech, natural climate solution, and it can be a really effective and important one," she said. "But it is hard for ranchers to transition to because it requires a deep understanding of fundamental ecological processes and adoption of a new set of management tools."

Credit: 
Oregon State University

Machine learning reveals role of culture in shaping meanings of words

image: A new algorithm analyzes words across many languages. First it translates the semantic associates of a particular word into another language, and then repeats the process the other way around. In this example, the semantic neighbors of "beautiful" were translated into French and then the semantic neighbors of "beau" were translated into English. The respective lists were substantially different because of different cultural associations. Image courtesy the researchers.

Image: 
Thompson et al.

What do we mean by the word beautiful? It depends not only on whom you ask, but in what language you ask them. According to a machine learning analysis of dozens of languages conducted at Princeton University, the meaning of words does not necessarily refer to an intrinsic, essential constant. Instead, it is significantly shaped by culture, history and geography. This finding held true even for some concepts that would seem to be universal, such as emotions, landscape features and body parts.

"Even for every day words that you would think mean the same thing to everybody, there's all this variability out there," said William Thompson, a postdoctoral researcher in computer science at Princeton University, and lead author of the findings, published in Nature Human Behavior Aug. 10. "We've provided the first data-driven evidence that the way we interpret the world through words is part of our culture inheritance."

Language is the prism through which we conceptualize and understand the world, and linguists and anthropologists have long sought to untangle the complex forces that shape these critical communication systems. But studies attempting to address those questions can be difficult to conduct and time consuming, often involving long, careful interviews with bilingual speakers who evaluate the quality of translations. "It might take years and years to document a specific pair of languages and the differences between them," Thompson said. "But machine learning models have recently emerged that allow us to ask these questions with a new level of precision."

In their new paper, Thompson and his colleagues Seán Roberts of the University of Bristol, U.K., and Gary Lupyan of the University of Wisconsin, Madison, harnessed the power of those models to analyze over 1,000 words in 41 languages.

Instead of attempting to define the words, the large-scale method uses the concept of "semantic associations," or simply words that have a meaningful relationship to each other, which linguists find to be one of the best ways to go about defining a word and comparing it to another. Semantic associates of "beautiful," for example, include "colorful," "love," "precious" and "delicate."

The researchers built an algorithm that examined neural networks trained on various languages to compare millions of semantic associations. The algorithm translated the semantic associates of a particular word into another language, and then repeated the process the other way around. For example, the algorithm translated the semantic associates of "beautiful" into French and then translated the semantic associates of "beau" into English. The algorithm's final similarity score for a word's meaning came from quantifying how closely the semantics aligned in both directions of the translation.
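A toy sketch of that bidirectional scoring idea is shown below, using tiny hand-written word lists and dictionaries in place of the neural-network association data from the study; only the translate-both-ways, overlap-and-average logic is illustrated.

```python
# Toy version of the bidirectional alignment score: translate a word's semantic
# neighbours into the other language, measure overlap with the translation's
# neighbours, repeat in the other direction, and average. All word lists and
# dictionaries below are tiny hypothetical stand-ins.

neighbours_en = {"beautiful": {"colorful", "love", "precious", "delicate"}}
neighbours_fr = {"beau": {"magnifique", "amour", "charmant", "délicat"}}
en_to_fr = {"colorful": "coloré", "love": "amour", "precious": "précieux",
            "delicate": "délicat", "beautiful": "beau"}
fr_to_en = {"magnifique": "magnificent", "amour": "love", "charmant": "charming",
            "délicat": "delicate", "beau": "beautiful"}


def overlap(source_neigh, target_neigh, dictionary):
    """Fraction of translated source neighbours found among the target neighbours."""
    translated = {dictionary[w] for w in source_neigh if w in dictionary}
    return len(translated & target_neigh) / max(len(translated), 1)


forward = overlap(neighbours_en["beautiful"], neighbours_fr["beau"], en_to_fr)
backward = overlap(neighbours_fr["beau"], neighbours_en["beautiful"], fr_to_en)
alignment = (forward + backward) / 2   # final similarity score for the word pair
print(f"alignment('beautiful', 'beau') = {alignment:.2f}")
```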

"One way to look at what we've done is a data-driven way of quantifying which words are most translatable," Thompson said.

The findings revealed that there are some nearly universally-translatable words, primarily those that refer to numbers, professions, quantities, calendar dates and kinship. Many other word types, however, including those that referred to animals, food and emotions, were much less well matched in meaning.

In one final step, the researchers applied another algorithm that compared how similar the cultures that produced the two languages are, based on an anthropological dataset comparing things like marriage practices, legal systems and political organization of a given language's speakers.

The researchers found that their algorithm could correctly predict how easily two languages could be translated based on how similar the two cultures that speak them are. This shows that variability in word meaning is not just random. Culture plays a strong role in shaping languages -- a hypothesis long predicted by theory, but one that researchers lacked quantitative data to support.

"This is an extremely nice paper that provides a principled quantification to issues that have been central to the study of lexical semantics," said Damián Blasi, a language scientist at Harvard University, who was not involved in the new research. While the paper does not provide a definitive answer for all the forces that shape the differences in word meaning, the methods the authors established are sound, Blasi said, and the use of multiple, diverse data sources "is a positive change in a field that has systematically disregarded the role of culture in favor of mental or cognitive universals."

Thompson agreed that he and his colleagues' findings emphasize the value of "curating unlikely sets of data that are not normally seen in the same circumstances." The machine learning algorithms he and his colleagues used were originally trained by computer scientists, while the datasets they fed into the models to analyze were created by 20th century anthropologists as well as more recent linguistic and psychological studies. As Thompson said, "Behind these fancy new methods, there's a whole history of people in multiple fields collecting data that we're bringing together and looking at in a whole new way."

Credit: 
Princeton University, Engineering School

Quantum-mechanical interaction of two time crystals has been experimentally demonstrated

Quantum time crystals are systems characterised by spontaneously emerging periodic order in the time domain. In a regular crystal, atoms form periodic order in space, while in a time crystal, a periodic process in time (like oscillation, rotation etc.) spontaneously emerges.

While a phase of broken time translation symmetry was originally just a theoretical exercise, a few practical realisations of time crystals have since been reported, such as in previous Aalto research on time crystals.

However, the dynamics and interactions between time crystals had not been investigated experimentally until now.

Aalto Senior Scientist, Vladimir Eltsov, explains, 'We have demonstrated the flow of particles between two time crystals as predicted by the famous Josephson effect in quantum mechanics, while coherent time evolution, which is an essence of a time crystal, remains intact'.

Time crystals keep their coherence over time and are resistant to environmental noise. This is an essential property for building quantum devices (like qubits in a quantum computer). The new finding offers the possibility of manipulating the quantum state precisely.

Eltsov adds, 'In an approach to quantum information processing adopted by the major players (like Google), qubits are based on superconducting quantum electronics: Here islands of a superconductor are interacting by the Josephson effect. In the Josephson effect, coherent electrons flow back and forth between islands in a very specific way, depending on their quantum states'.

For this experiment, a liquid of helium-3 (3He) atoms was cooled to a very low temperature (below 200 microkelvin), where it becomes superfluid. The 3He atoms have a magnetic moment, and at those low temperatures all of these tiny magnets carried by each atom start to play together. The time-crystalline periodic process in this system is the continuous rotation of the total magnetic moment. The constituent particles exchanged in the Josephson effect are magnetic quantum excitations called magnons.
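For reference, the generic textbook Josephson relations invoked above take the form below, with magnons playing the role of the exchanged particles; these are standard equations for two weakly coupled coherent condensates, not the specific expressions from the Aalto experiment.

```latex
% Textbook Josephson relations for two weakly coupled condensates: I_c is the
% critical current, \Delta\varphi the phase difference between the condensates,
% and \Delta\mu the chemical-potential difference driving the oscillation.
I = I_c \sin(\Delta\varphi), \qquad
\hbar \, \frac{d(\Delta\varphi)}{dt} = \Delta\mu
```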

Eltsov concludes, 'There is a materials development effort at some other labs to build similar time crystals based on coherent magnetic phenomena which are robust even at room temperature. Thus, in the future, it might enable the construction of a quantum computer operating at room temperature'.

Credit: 
Aalto University

No limit yet for carbon nanotube fibers

image: The cross-section of a fiber produced at Rice University contains tens of millions of carbon nanotubes. The lab continually improves its method to make fibers, which tests show are now stronger than Kevlar.

Image: 
Pasquali Research Group/Rice University

HOUSTON - (Aug. 17, 2020) - Carbon nanotube fibers made at Rice University are now stronger than Kevlar and are inching up on the conductivity of copper.

The Rice lab of chemical and biomolecular engineer Matteo Pasquali reported in Carbon it has developed its strongest and most conductive fibers yet, made of long carbon nanotubes through a wet spinning process.

In the new study led by Rice graduate students Lauren Taylor and Oliver Dewey, the researchers noted that wet-spun carbon nanotube fibers, which could lead to breakthroughs in a host of medical and materials applications, have doubled in strength and conductivity every three years, a trend that spans almost two decades.

While that may never mimic Moore's Law, which set a benchmark for computer chip advances for decades, Pasquali and his team are doing their part to advance the method they pioneered to make carbon nanotube fibers.
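The doubling trend quoted above compounds quickly; the quick arithmetic below shows the implied cumulative gain over the roughly two-decade span, with the starting value normalized to 1.

```python
# Rough arithmetic behind the doubling trend: a property that doubles every
# three years grows by 2**(t/3) after t years. The span is the "almost two
# decades" quoted in the article; only the relative gain is computed.

doubling_period_years = 3
span_years = 18          # "almost two decades"

doublings = span_years / doubling_period_years
overall_gain = 2 ** doublings
print(f"{doublings:.0f} doublings -> roughly {overall_gain:.0f}x improvement")
# -> 6 doublings, about a 64-fold improvement in strength and conductivity
```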

The lab's threadlike fibers, with tens of millions of nanotubes in cross section, are being studied for use as bridges to repair damaged hearts, as electrical interfaces with the brain, for use in cochlear implants, as flexible antennas and for automotive and aerospace applications.

They are also part of the Carbon Hub, a multiuniversity research initiative launched in 2019 by Rice with support from Shell, Prysmian and Mitsubishi to create a zero-emissions future.

"Carbon nanotube fibers have long been touted for their potential superior properties," Pasquali said. "Two decades of research at Rice and elsewhere have made this potential a reality. Now we need a worldwide effort to increase production efficiency so these materials could be made with zero carbon dioxide emissions and potentially with concurrent production of clean hydrogen."

"The goal of this paper is to put forth the record properties of the fibers produced in our lab," Taylor said. "These improvements mean we're now surpassing Kevlar in terms of strength, which for us is a really big achievement. With just another doubling, we would surpass the strongest fibers on the market."

The flexible Rice fibers have a tensile strength of 4.2 gigapascals (GPa), compared to 3.6 GPa for Kevlar fibers. The fibers require long nanotubes with high crystallinity; that is, regular arrays of carbon-atom rings with few defects. The acidic solution used in the Rice process also helps reduce impurities that can interfere with fiber strength and enhances the nanotubes' metallic properties through residual doping, Dewey said.

"The length, or aspect ratio, of the nanotubes is the defining characteristic that drives the properties in our fibers," he said, noting the surface area of the 12-micrometer nanotubes used in Rice fiber facilitates better van der Waals bonds. "It also helps that the collaborators who grow our nanotubes optimize for solution processing by controlling the number of metallic impurities from the catalyst and what we call amorphous carbon impurities."

The researchers said the fibers' conductivity has improved to 10.9 megasiemens (million siemens) per meter. "This is the first time a carbon nanotube fiber has passed the 10 megasiemens threshold, so we've achieved a new order of magnitude for nanotube fibers," Dewey said. Normalized for weight, he said the Rice fibers achieve about 80% of the conductivity of copper.

"But we're surpassing platinum wire, which is a big achievement for us," Taylor said, "and the fiber thermal conductivity is better than any metal and any synthetic fibers, except for pitch graphite fibers."

The lab's goal is to make the production of superior fibers efficient and inexpensive enough to be incorporated by industry on a large scale, Dewey said. Solution processing is common in the production of other kinds of fibers, including Kevlar, so factories could use familiar processes without major retooling.

"The benefit of our method is that it's essentially plug-and-play," he said. "It's inherently scalable and fits in with the way synthetic fibers are already made."

"There's a notion that carbon nanotubes are never going to be able to obtain all the properties that people have been hyping now for decades," Taylor said. "But we're making good gains year over year. It's not easy, but we still do believe this technology is going to change the world."

Co-authors of the paper are Rice alumnus Robert Headrick; graduate students Natsumi Komatsu and Nicolas Marquez Peraca; Geoff Wehmeyer, an assistant professor of mechanical engineering; and Junichiro Kono, the Karl F. Hasselmann Professor in Engineering and a professor of electrical and computer engineering, of physics and astronomy, and of materials science and nanoengineering. Pasquali is the A.J. Hartsook Professor of Chemical and Biomolecular Engineering, of chemistry and of materials science and nanoengineering.

The U.S. Air Force Office of Scientific Research, the Robert A. Welch Foundation, the Department of Energy's Advanced Manufacturing Office and the Advanced Research Projects Agency-Energy supported the research.

Credit: 
Rice University

Ultra-low voltage proven effective at killing bacteria, study finds

image: Yong Wang, University of Arkansas

Image: 
Russell Cothren

FAYETTEVILLE, Ark. - Ultra-low voltage electricity is effective at killing bacteria because it causes membranes that surround bacteria to leak, according to a new study by University of Arkansas researchers. The research advances work to fight drug-resistant bacteria.

Using E. coli bacteria, the team demonstrated that ultra-low voltage applied for 30 minutes created holes in the cell's membrane that allowed leakage of small molecules, ions and proteins both in and out of the cell, killing the bacterium.

While the antimicrobial property of electricity has long been known, it was not completely understood how ultra-low voltages damage and ultimately kill bacteria until this new finding, said Yong Wang, assistant professor of physics and part of the team that published the findings in the journal Applied and Environmental Microbiology. "The electric power we used is very low," said Wang. "A household battery can provide enough power. So can a one-centimeter square solar panel."
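
Wang's point that a one-centimeter-square solar panel could supply the needed power can be sanity-checked with generic numbers; the irradiance and cell efficiency below are standard textbook assumptions, not values from the study:

```python
# Rough order-of-magnitude estimate of the power from a 1 cm^2 solar cell.
# Irradiance and efficiency are generic textbook assumptions, not study values.
irradiance_w_per_m2 = 1000.0   # full sunlight, ~1 kW/m^2
cell_area_m2 = 1e-4            # 1 cm^2
efficiency = 0.20              # typical silicon cell efficiency

power_w = irradiance_w_per_m2 * cell_area_m2 * efficiency
print(f"Available power: about {power_w * 1000:.0f} mW")   # ~20 mW
```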

Such low voltage could, for example, be used to sterilize a doorknob or other high-touch surfaces that harbor bacteria without causing any harm to users, said Wang. It could also be used to hinder biofilm formation in water purification and storage applications, he added.

Credit: 
University of Arkansas

NASA infrared data shows Genevieve strengthening into a hurricane

image: On Aug. 17 at 1:15 a.m. EDT (0515 UTC), the MODIS instrument aboard NASA's Terra satellite gathered temperature information about Genevieve's cloud tops. MODIS found the most powerful thunderstorms (red) were in the developing eyewall, where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

NASA's Terra satellite used infrared light to identify strongest storms and coldest cloud top temperatures and found them surrounding a developing eyewall around Genevieve as it was strengthening into a hurricane.

Genevieve formed on Sunday by 11 a.m. EDT (1500 UTC) as Tropical Depression 12E. Six hours later, by 5 p.m. EDT, it had strengthened into a tropical storm and was named Tropical Storm Genevieve. The storm continued to intensify rapidly, and by 11 a.m. EDT on Monday, Aug. 17, it had strengthened into a hurricane.

Infrared Data Reveals Powerful Storms

On Aug. 17 at 1:15 a.m. EDT (0515 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Terra satellite gathered temperature information about Genevieve's cloud tops. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

MODIS found the most powerful thunderstorms were in the eyewall, where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius) around the center of circulation and in thick, fragmented bands south and west of the center. Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.
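
The Fahrenheit/Celsius pairs quoted throughout these satellite summaries follow the standard conversion; a one-line helper reproduces them (small differences from the quoted values are rounding):

```python
def f_to_c(temp_f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

print(round(f_to_c(-70), 1))   # -56.7, quoted as minus 56.6 C for Genevieve
print(round(f_to_c(-50), 1))   # -45.6, quoted as minus 45.5 C for Fausto
```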

Genevieve's Status

At 11 a.m. EDT (1500 UTC) on Aug. 17, NOAA's National Hurricane Center reported the center of Hurricane Genevieve was located near latitude 14.3 degrees north and longitude 103.0 degrees west. The storm was about 250 miles (405 km) south-southwest of Zihuatanejo, Mexico.

Genevieve is moving toward the west-northwest near 18 mph (30 kph), and this motion is expected to continue through tonight. Maximum sustained winds are near 75 mph (120 kph) with higher gusts.

Forecast Track

NHC forecasters said, "A turn to the northwest and a decrease in forward speed is forecast to occur on Tuesday and continue through at least early Thursday. Rapid strengthening is forecast to continue over the next day or so, and Genevieve is expected to become a major hurricane on Tuesday. A weakening trend should begin on Wednesday. On the forecast track, the center of Genevieve is expected to move parallel to but well offshore of the coast of southwestern Mexico during the next couple of days."

Genevieve Causing Dangerous Ocean Swells Near Mexico

Large swells produced by Genevieve will begin affecting portions of the southern coast of Mexico today and will spread northward along the southwestern and west-central coast of Mexico to the Baja California peninsula through Wednesday. These swells are likely to cause life-threatening surf and rip current conditions.

NASA Researches Tropical Cyclones

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

NASA finds short-lived Fausto faded fast

image: On Aug. 17 at 3 a.m. EDT (0700 UTC), the MODIS instrument that flies aboard NASA's Terra satellite showed Fausto devoid of strong storms. Coldest cloud top temperatures were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius) in fragmented bands north and south of the center.

Image: 
NASA/NRL

Post-Tropical Storm Fausto faded fast in the Eastern Pacific Ocean. NASA's Terra satellite provided an infrared look at the storm, which showed no areas of heavy rainfall, and the storm was classified as a remnant low-pressure area.

Fausto developed from Tropical Depression 11E, which formed by 11 p.m. EDT on Saturday, Aug. 15. Twelve hours later, at 11 a.m. EDT on Sunday, Aug. 16, it strengthened into Tropical Storm Fausto. Half a day after that, it had weakened back to a tropical depression, and by 11 a.m. EDT on Monday, Aug. 17, it had weakened further to a post-tropical cyclone, a remnant low-pressure area.

NASA's Terra satellite uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on Fausto. Data showed Fausto devoid of strong storms. The coldest cloud top temperatures were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius) in fragmented bands north and south of the center.

The National Hurricane Center (NHC) noted at 11 a.m. EDT on Aug. 17, "Fausto has been absent of deep convection for about 12 hours, and with the system over sea surface temperatures below 23 degrees Celsius (73.4 degrees Fahrenheit; tropical cyclones need at least 26.6 degrees Celsius/80 degrees Fahrenheit to maintain intensity), it is unlikely organized deep convection will return. Therefore, Fausto has become a remnant low, and this will be the final NHC advisory on this system."

Credit: 
NASA/Goddard Space Flight Center

Cashew shell compound appears to mend damaged nerves

image: Senior author Subramaniam Sriram, MBBS, William C. Weaver III Professor of Neurology and chief of the Division of Neuroimmunology at Vanderbilt University Medical Center.

Image: 
Vanderbilt University Medical Center

In laboratory experiments, a chemical compound found in the shell of the cashew nut promotes the repair of myelin, a team from Vanderbilt University Medical Center reports today in the Proceedings of the National Academy of Sciences.

Myelin is a protective sheath surrounding nerves. Damage to this covering -- demyelination -- is a hallmark of multiple sclerosis and related diseases of the central nervous system.

"We see this as an exciting finding, suggesting a new avenue in the search for therapies to correct the ravages of MS and other demyelinating diseases," said the paper's senior author, Subramaniam Sriram, MBBS, William C. Weaver III Professor of Neurology and chief of the Division of Neuroimmunology.

Previous work led by Sriram showed that a protein called interleukin 33, or IL-33, induced myelin formation. IL-33 is, among other things, an immune response regulator, and multiple sclerosis is an autoimmune disorder.

The cashew shell compound is called anacardic acid. Sriram and his team became interested in it because it is known to inhibit an enzyme involved in gene expression called histone acetyltransferase, or HAT, and the team had discovered that inhibiting HAT induces production of IL-33.

The report includes a range of new findings that point to potential therapeutic use of anacardic acid for demyelinating diseases:

In vitro, the addition of the compound to rat cells most responsible for myelination -- oligodendrocyte precursor cells, or OPCs -- spurred induction of IL-33 and rapidly increased the expression of myelin genes and proteins, including dose-dependent increases in myelin basic protein;

In two animal models of demyelination, treatment with the compound increased the relative presence of IL-33-expressing OPCs and led to reduced paralysis;

In an animal model of demyelination treated with the compound, dissection and electron microscopy showed dose-dependent increases in myelination.

"These are striking results that clearly urge further study of anarcardic acid for demyelinating diseases," Sriram said.

Credit: 
Vanderbilt University Medical Center