
Stanford scientists pry apart party drug's therapeutic, addictive qualities

Stanford University School of Medicine investigators have succeeded in distinguishing the molecular pathway responsible for an illicit drug's abuse potential from the one behind its propensity to make people feel sociable.

The discovery, described in a study to be published Dec. 11 in Science Translational Medicine, could lead to novel treatments for psychiatric disorders marked by social awkwardness and withdrawal.

The findings were made in mouse experiments.

Methylenedioxy-methamphetamine -- better known by its acronym, MDMA, or its street name, ecstasy -- is a mind-altering drug used by 3 million Americans annually. MDMA is especially popular as a party drug because it gives people who take it a sense of well-being and makes them extremely sociable -- even instilling feelings of unguarded empathy for strangers. That makes MDMA a natural fit for raves, dance parties featuring lots of densely packed, sweaty bodies and unfamiliar faces.

It may also make MDMA a good medicine for psychiatry. It's now in late-stage, multicenter clinical trials as an adjunct to psychotherapy for post-traumatic stress disorder. The goal is to harness MDMA's prosocial effects to strengthen the bond between patient and therapist, so that people who have experienced trauma may feel comfortable enough to relive it through guided therapy.

Some 25 million people in the United States who suffer from PTSD could benefit from a drug capable of establishing, with a single dose in a therapist's office, a trust level that typically takes months or years to achieve, said Boris Heifets, MD, PhD, assistant professor of anesthesiology, perioperative and pain medicine, the study's lead author.

But MDMA can be addictive. Taken in the wrong settings or in repeated or oversized doses, it can have life-threatening consequences.

"We've figured out how MDMA promotes social interaction and showed that's distinct from how it generates abuse potential among its users," said the study's senior author, Robert Malenka, MD, PhD, the Nancy Friend Pritzker Professor in Psychiatry and Behavioral Sciences.

Stimulation of reward circuitry

MDMA's abuse potential stems from its capacity to stimulate the brain's reward circuitry, Malenka said. "The brain's reward circuitry tells us something is good for our survival and propagation. It evolved to tell us food is good when we're hungry, water is good when we're thirsty, and warmth is good when we're cold. For most of us, hanging out with friends is fun, because over the course of our evolution it's promoted our survival."

A crucial connection in the reward circuitry is between nerve cells, or neurons, projecting from one midbrain structure, the ventral tegmental area, to another, the nucleus accumbens. When those neurons release a chemical called dopamine, the nucleus accumbens forwards signals throughout the brain that induce a sense of reward.

"Drugs of abuse trick our brains by causing an unnatural dopamine surge in the nucleus accumbens," Malenka said. "This massive increase is much higher and more rapid than the one you get from eating ice cream or having sex."

Like all addictive drugs, MDMA triggers dopamine release in the nucleus accumbens. That explains its abuse potential but leaves open the question of why the prosocial effect dwarfs that of most other abused drugs.

In the study, the Stanford researchers showed that a different brain chemical, serotonin, is responsible for this. Serotonin-releasing neurons in a brain structure called the dorsal raphe nucleus send projections to the same part of the nucleus accumbens that the dopamine-releasing neurons do. Neuroscientists had previously shown that MDMA in fact triggers the release of far more serotonin than dopamine.

Drawing inferences from mice

The researchers performed a number of experimental manipulations to implicate serotonin as the signaling substance responsible for promoting social behavior in mice. The scientists got the same results with both male and female mice. The same effects would probably be seen in humans, Malenka said, because the midbrain areas in question have been remarkably conserved among mammalian species over evolutionary time.

"You can't ask mice how they're feeling about other mice," he said. "But you can infer it from their behavior."

The researchers tested whether an "explorer" mouse given a relatively low dose of MDMA or, alternatively, a saline solution prefers to spend time in a chamber holding another mouse under an upside-down mesh cup (to keep that mouse from moving about) or in an otherwise identical chamber with a cup but no mouse. They found, consistently, that saline-treated explorer mice get bored after 10 minutes with another mouse. But an explorer mouse given MDMA sustains its social curiosity for at least 30 minutes.

"Giving MDMA to both mice enhanced the effect even further," Heifets said. "It makes you wonder if maybe the therapist should also be taking MDMA."

Like humans, mice like to return to places where they've had a good time. Chalk it up to the brain's reward circuitry. To determine MDMA's addictive potential, the researchers gave mice an MDMA dose equal to the one in the first experiment, but only when the mice were in a particular room of a two-room structure. The next day, the mice showed no preference for either room -- evidence that at this dose, the drug hadn't noticeably triggered the reward circuitry.

But mice given a higher MDMA dose exhibited both its social and abuse-potential effects. Further tests determined that the secretion of dopamine triggered by MDMA is not necessary for promoting sociability. Serotonin release, triggered by the low MDMA dose, was all it took.

The scientists were able to induce MDMA's trademark sociability by infusing the drug only into the mice's nucleus accumbens, proving this is where serotonin, whose release MDMA triggers, exerts its sociability-inducing effect.

"Where, exactly, in the brain that's happening hadn't been proven," Heifets said. "If you don't know where something's happening, you're going to have a hell of a time figuring out how it's happening."

That's what the scientists discovered next. Blocking a specific subtype of serotonin receptor that abounds in the nucleus accumbens fully inhibited MDMA's prosocial effect. Furthermore, giving the mice a different serotonin-releasing drug that does not cause dopamine release mimicked the prosocial effects of MDMA but didn't cause any addictive, or rewarding, effects. The drug, fenfluramine, is the "fen" in a once-popular diet pill called fen/phen, a two-drug combination developed in the 1960s. Fen/phen was pulled off the market in the 1990s after 30% of patients taking it were found to show signs of heart disease, including pulmonary hypertension, a life-threatening condition.

Owing to their long-term cardiovascular and neurotoxic effects, neither MDMA nor fenfluramine would be suitable for any indications requiring daily use, the researchers cautioned.

But those nasty effects of chronic use would be highly unlikely to occur in the one or two sessions that would be required for patient-therapist bonding in a psychiatric setting, Heifets said.

Credit: 
Stanford Medicine

Information technology can save police lives, according to a new study

image: Paul A. Pavlou, dean of the C.T. Bauer College of Business, said the use of information technology could reduce violence against law enforcement by as much as 50%.

Image: 
C.T. Bauer College of Business

Police officers face well-documented risks, with more than 50,000 a year assaulted on the job in the United States.

But new research has found that the use of information technology by law enforcement agencies can significantly cut the number of police killed or injured in the line of duty, reducing violence as much as 50%.

"The use of IT by police increases the occupational safety of police officers in the field and reduces deaths and assaults against police officers," said Paul A. Pavlou, dean of the C.T. Bauer College of Business at the University of Houston and co-author of the paper, which was published in the journal Decision Support Systems.

Pavlou and Min-Seok Pang of Temple University used data from the FBI, the federal Bureau of Justice Statistics and the U.S. Census to build a dataset correlating IT use and reported violence against law enforcement from 4,325 U.S. police departments over a six-year period.

People haven't previously known much about the impact of IT on police safety, Pavlou said, both because relatively few departments used it until recently and because there hasn't been much research on the topic.

Their analysis determined that extensive use of IT by police could cut violence against law enforcement by 42% to 50%, amounting to six to seven fewer assaults or deaths for an average-sized police department.

For large urban departments serving more than 1 million people, relying on information technology could mean up to 199 fewer assaults or deaths.

The dataset focused on the use of information technology in three areas:

Crime intelligence, or the use of technology for tasks ranging from gathering information to writing reports from the field, while the information gathered is still fresh

Crime prediction, analyzing digitized data on past crimes and geographically visualizing past crimes to better understand crime patterns

Crime investigation, identifying suspects, discovering their whereabouts and gathering evidence for conviction

The use of IT to learn more about potential suspects improves the likelihood that police can make an arrest without violence, the researchers said. Discovering that a suspect is likely to be armed, for example, can lead police to don protective gear.

The dataset was collected in the early 2000s, when only about one-third of police departments had high use of IT in all three areas, Pavlou said. That's likely grown in recent years, he said.

The researchers said the finding is also applicable to other types of workplace safety, including those involving factory workers, chemical plant employees, truck drivers and other high-risk occupations.

Credit: 
University of Houston

Study supports long-term benefits of non-drug therapies for pain

image: Capt. Israel Orengo is evaluated by physical therapist Tyler Snow at Madigan Army Medical Center in 2015. Snow was one of the researchers on an Army study that looked at incorporating alternative methods such as acupuncture and biofeedback into pain management. A new study based on VA records suggests long-term benefits from service members' non-drug pain treatment.

Image: 
John Liston

A new study based on Veterans Affairs health records finds that non-drug therapies given to military service members with chronic pain may reduce the risk of long-term adverse outcomes, such as alcohol and drug use disorders and self-inflicted injuries, including suicide attempts.

The findings appeared online Oct. 28, 2019, in the Journal of General Internal Medicine.

The researchers concluded that service members with chronic pain who received non-drug therapies while in the military, such as massage or acupuncture, had a "significantly lower" risk in VA of new onset alcohol or drug disorder; poisoning with opioids and related narcotics, barbiturates, or sedatives; and suicidal thoughts and attempts. The research team did not study death by suicide.

Dr. Esther Meerwijk, a statistician and suicide researcher at the VA Palo Alto Health Care System in California, was the lead author. Her team reviewed the VA health records of more than 140,000 Army soldiers who reported chronic pain following their deployment to Iraq or Afghanistan from 2008 to 2014. The most common types of chronic pain were joint discomfort, back and neck issues, and other problems involving muscles or bones.

"Chronic pain is associated with adverse outcomes, such as substance use and suicidal thoughts and behavior," Meerwijk says. "It made sense that if non-drug treatments are good at managing pain, their effect would go beyond only pain relief. However, I was surprised that the results of our analyses held, despite our attempts to prove them wrong. Often enough in research, significant results disappear once you start controlling for variables that can possibly affect the outcome of the study."

The researchers controlled for length of a service member's care in VA, whether the Veteran had been exposed to non-drug therapies in VA, and the number of days a VA patient received opioids. They also tested to see if service members who received non-drug treatments were healthier to begin with and if more Veterans who received non-drug therapies died before any of the adverse outcomes occurred.

It's possible, Meerwijk explains, that soldiers who received non-drug therapies didn't have to rely on opioids as much for their chronic pain and are therefore at lower risk for adverse outcomes. "We may also be seeing a genuine effect of non-drug therapies that occurs regardless of whether soldiers use opioids or not," she says. "If non-drug treatments make chronic pain more bearable, people may be more likely to have positive experiences in life. That makes them less likely to have thoughts of suicide or to turn to drugs."

Meerwijk's research is part of the Substance Use and Psychological Injury Combat Study (SUPIC), the largest and longest observational study to date of pain management and behavioral health conditions in Army service members returning from Iraq and Afghanistan. VA has participated in the study, which is led by Dr. Mary Jo Larson of Brandeis University in Massachusetts. Meerwijk became part of the study in 2016.

"When I joined the team, one of the goals was to study long-term effects of non-drug treatments for chronic pain received in the military," Meerwijk says. "Given my research interests in suicide and suicide prevention, it was suggested we look at suicidal thoughts and attempts as outcomes. Given SUPIC's interest in substance use, specifically opioids, we broadened the analysis to serious adverse events related to opioid use and chronic pain."

Chronic pain is often managed with prescription opioids. Especially at higher doses and over longer periods of use, opioids have been linked to a greater risk of substance use disorder and self-inflicted injuries, such as opioid overdose and suicide attempts.

While in service, the soldiers received non-drug therapies that included acupuncture, dry needling, biofeedback, chiropractic care, massage, exercise therapy, cold laser therapy, osteopathic spinal manipulation, electrical nerve stimulation, ultrasonography, superficial heat treatment, traction, and lumbar supports. Ultrasonography is a technique that uses echoes of ultrasound pulses to pinpoint objects or areas of different density in the body.

In the study, the researchers compared service members with chronic pain who did or didn't receive non-drug therapies and described the links between such treatments in the military and long-term adverse outcomes. They determined that soldiers who received non-drug therapies were at lower risk of being diagnosed with drug use disorders and self-inflicted injuries, such as accidental poisoning and suicidal ideation--which is the thought of taking one's own life.

The largest difference was seen with regard to accidental poisoning with opioids or other pain drugs: Those who received non-drug therapies were 35% less likely to injure themselves than those who didn't receive such therapies while in the service. Service members who received non-drug treatments were also at lower risk down the road for these adverse outcomes:

Self-inflicted injuries, including suicide attempts: 17% less likely.

Suicidal ideation: 12% less likely.

Alcohol or drug use disorders: 8% less likely.

The results supported the researchers' hypothesis that use of non-drug therapies in the military would be linked to fewer negative outcomes for patients in the VA system.

The median age of the cohort was 26, and the median length of deployment was a little more than a year. The researchers focused on outcomes that are tied to chronic pain and opioid use. Alcohol or drug use disorders were the most frequent adverse outcomes, followed by suicidal ideation and self-inflicted injuries including suicide attempts. Poisoning with opioids, related narcotics, barbiturates, or sedatives was least frequent.

Because the study was observational, based on past treatment data rather than a randomized clinical trial, it shows only an association, not cause and effect. The researchers did use a method called propensity matching, which allowed them to carefully analyze differences and similarities between soldiers who received non-drug therapies for pain and those who did not, to tease out the effect of that variable.

"We aimed statistically to create groups that, with the exception of receiving non-drug therapies, were as similar as possible," Meerwijk says. "But we were limited to the observational data we had. That means that the groups may have been different in ways that we didn't measure and, as a consequence, we don't know about. We cannot rule out that one of those ways explains why we found what we found."
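The matching idea described above can be illustrated in miniature. The sketch below is a generic greedy 1:1 nearest-neighbor match within a caliper, a common form of propensity matching; the subject IDs, scores, and caliper are invented for illustration and are not the study's actual implementation or data:

```python
def greedy_caliper_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor propensity-score matching.

    treated, control: dicts mapping subject id -> propensity score
    (here, the estimated probability of receiving non-drug therapy).
    Each treated subject is paired with the closest unused control,
    provided the score gap is within the caliper; unmatched subjects
    are dropped, which is how matching builds comparable groups.
    """
    matches, used = {}, set()
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        best_id, best_gap = None, caliper
        for c_id, c_score in control.items():
            gap = abs(t_score - c_score)
            if c_id not in used and gap <= best_gap:
                best_id, best_gap = c_id, gap
        if best_id is not None:
            matches[t_id] = best_id
            used.add(best_id)
    return matches

# Hypothetical scores for illustration only.
treated = {"t1": 0.30, "t2": 0.52}
control = {"c1": 0.31, "c2": 0.50, "c3": 0.90}
print(greedy_caliper_match(treated, control))  # {'t1': 'c1', 't2': 'c2'}
```

Note that "c3" goes unmatched: its score is too far from any treated subject, exactly the kind of dissimilar case matching is designed to exclude.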

Another limitation of the study is that the researchers didn't look at specific non-drug therapies to gauge the extent to which they may have contributed--or not--to the overall finding.

"Another thing to keep in mind is we didn't look at effects of individual non-drug therapies. We treated them as one," says Meerwijk. "Most likely, only some of the therapies that we included are responsible for the effect that we reported, whereas others may have had no effect at all, assuming there's no other variable that explains our findings."

Other researchers have found that service members who use non-drug therapies may be healthier to start with than those who do not and, as such, may be at lower risk for poor outcomes. Meerwijk's team did not find that to be the case. They documented that service members who received non-drug treatments in the military were more often hospitalized and had longer inpatient stays, for example, than their peers who had not received such therapies. They were also more likely to be diagnosed with mental disorders, except alcohol use disorder.

Credit: 
Veterans Affairs Research Communications

Azteca ant colonies move the same way leopards' spots form

ANN ARBOR--What could Azteca ants in coffee farms in Mexico have in common with leopards' spots and zebras' stripes?

After two decades of analyzing the rise, spread and collapse of Azteca ant colonies on a coffee farm in Mexico, University of Michigan researchers have shown that the ant distributions follow a pattern named after mathematician Alan Turing, who first described it in 1952 and whose work is said to explain leopards' spots and other patterns in nature.

"The same equations that Turing used for chemistry, we can use in ecology," said John Vandermeer, a professor in the U-M Department of Ecology and Evolutionary Biology and first author of a study in the December issue of BioScience. "Those equations say you should get spots of predators and spots of prey in a system, and we've proven you do."

The finding, he says, helps shed light on the complex agroecological system of coffee farms, where "control from above" of pests is more complicated than a simple predator-prey relationship. The system includes a complex community of predators, parasites and diseases that interact with each other in complicated ways, eventually generating a self-organized system that exerts effective control over the herbivore.

"This is an important finding because it shows how organisms in nature are embedded within a complex web of interactions and, therefore, the simplistic pest management approach of 'one pest, one natural enemy' may not be the most appropriate one for pest management," said co-author Ivette Perfecto, the George W. Pack Professor of Ecology, Natural Resources and Environment at U-M's School for the Environment and Sustainability.

"Rather, a complex systems approach that accounts for nonlinearities and networks of interactions is what is needed."

Turing pattern

Turing explained the creation of nonrandom patterns in chemistry by modeling chemical reactions and how they are destabilized. A chemical reaction is stabilized by the balance of an activation process and a repression process. Then there's diffusion: a drop of ink in water eventually diffuses until it can't be separated from the water. But, Turing showed, if the repressing force diffuses at a greater rate than the activating force, a nonrandom pattern develops.

"Turing figured that if you took this reaction process, you can put together two forces that are stabilizing themselves and you put them together and that destabilizes the whole system, forming patterns," Vandermeer said. "Something very similar happens in ecological systems: the predator is eating the prey and the prey's population goes down, and then the predators' population goes down, that's a regulating thing. When you have diffusion--in biology we call it migration--the predator moves in space and the prey moves, too."
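Turing's destabilization argument can be written down concretely. The sketch below is a generic linear-stability check for a two-species activator-inhibitor system, not the equations from the BioScience paper: it tests whether a steady state that is stable without movement starts to grow spatial patterns once the repressor (the fly-like predator) spreads faster than the activator (the ant-like prey). The example Jacobian values are illustrative assumptions:

```python
import numpy as np

def turing_unstable(J, d_act, d_inh, k2_max=10.0, n=2000):
    """Check for a diffusion-driven (Turing) instability.

    J     : 2x2 Jacobian of the reaction kinetics at the steady state,
            rows/cols ordered (activator, inhibitor).
    d_act : diffusion rate of the activator (the slow mover).
    d_inh : diffusion rate of the inhibitor (the fast mover).

    Returns True if the uniform steady state is stable without
    diffusion but some spatial wavenumber k grows once diffusion
    is switched on -- Turing's recipe for spontaneous patterns.
    """
    J = np.asarray(J, dtype=float)
    tr, det = np.trace(J), np.linalg.det(J)
    if tr >= 0 or det <= 0:   # already unstable without diffusion
        return False
    # Perturbations ~exp(ikx) grow when
    # h(k^2) = d_act*d_inh*k^4 - (d_inh*J[0,0] + d_act*J[1,1])*k^2 + det
    # dips below zero for some wavenumber k.
    k2 = np.linspace(0.0, k2_max, n)
    h = d_act * d_inh * k2**2 - (d_inh * J[0, 0] + d_act * J[1, 1]) * k2 + det
    return bool(h.min() < 0)

# Illustrative activator-inhibitor kinetics: the activator promotes
# itself and the inhibitor; the inhibitor suppresses both.
J = [[1.0, -1.0],
     [2.0, -1.5]]

print(turing_unstable(J, d_act=1.0, d_inh=1.0))   # equal diffusion: False
print(turing_unstable(J, d_act=1.0, d_inh=10.0))  # fast inhibitor: True
```

With equal diffusion rates nothing happens; make the inhibitor diffuse ten times faster and the uniform state breaks into spots, which is the analogy to slow-moving ant nests repressed by far-ranging phorid flies.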

Vandermeer and Perfecto looked at data they collected on an organic coffee farm in Chiapas, Mexico, mapping the distribution of shade trees containing nests of Azteca sericeasur ants. While there are between 7,000 and 11,000 shade trees in the plot, depending on the year, only about 700 have Aztecas on them. Each nest has anywhere from 10 to 20 queens.

Phorid flies find these clusters and parasitize the ants by laying an egg on an ant's head. A larva develops until the ant's head falls off and a new fly emerges, repeating the cycle. As the local population of ant nests builds up, spatial clusters form and grow, and so does the population of phorid flies, which in turn acts as a repressor in the system, producing patchy distributions of ants similar to a leopard's spots.

"Turing was talking about his chemicals and we're talking about ants and flies," Vandermeer said. "We predicted that our ants and our flies should form these little clusters and we found that they do."

The complexity of farming coffee

In their study, the researchers explored the complex relationships between predators, prey and their environment, including the green coffee scale, a relatively benign coffee pest that rarely reaches pest status; the Azya orbigera, a predatory beetle that feasts on the scale; and the Azteca ant, which protects the scale from the beetle.

Under protection from the ants, the scales effectively have a refuge from the beetle predators and they increase dramatically in numbers. With such high local population density, a fungal disease takes over and the scale insects decline rapidly. Thus the combination of a predator, a refuge in the clusters of ant nests and a fungal disease keeps the scale insects under control.

And then there's the coffee rust, which has decimated coffee farms across Latin America but remains fairly controlled in Puerto Rico. The rust is spread by spores in the wind, but the same fungus that causes the disease in the scale insects is also an antagonist of the rust, complicating the situation considerably.

The researchers warn about the temptation of providing simple answers to farmers seeking solutions to perceived problems in their farms--for example, by getting rid of the shade trees that house the ants.

"If we get rid of the shade trees, then the ants would go away. If the ants go away, there's no place to have a refuge for the scale insects to escape the predator. So the beetle would eat all of the scale insects and then itself die of starvation," Vandermeer said. "And when the next season arrives, the scale insects would come back without any predators to stop them. So if you get rid of the shade trees you get rid of the control."

The same can be said when it comes to rust, he says. Coffee rust spreads by spores carried on the wind, and a canopy of shade trees above the coffee acts as a windbreak.

"So if you take the shade trees out of the system, you get the wind in the system, and with the wind brings the spores," Vandermeer said. "Since it's a complex system, it requires a more holistic approach to understand and manage, and there's more potential for surprise."

Credit: 
University of Michigan

Researchers discover brain circuit linked to food impulsivity

image: Emily Noble was the lead author on the research paper

Image: 
Cal Powell

You're on a diet, but the aroma of popcorn in the movie theater lobby triggers a seemingly irresistible craving.

Within seconds, you've ordered a tub of the stuff and have eaten several handfuls.

Impulsivity, or responding without thinking about the consequences of an action, has been linked to excessive food intake, binge eating, weight gain and obesity, along with several psychiatric disorders including drug addiction and excessive gambling.

A team of researchers that includes a faculty member at the University of Georgia has now identified a specific circuit in the brain that alters food impulsivity, creating the possibility scientists can someday develop therapeutics to address overeating.

The team's findings were published recently in the journal Nature Communications.

"There's underlying physiology in your brain that is regulating your capacity to say no to (impulsive eating)," said Emily Noble, an assistant professor in the UGA College of Family and Consumer Sciences who served as lead author on the paper. "In experimental models, you can activate that circuitry and get a specific behavioral response."

Using a rat model, researchers focused on a subset of brain cells that produce a type of transmitter in the hypothalamus called melanin concentrating hormone (MCH).

While previous research has shown that elevating MCH levels in the brain can increase food intake, this study is the first to show that MCH also plays a role in impulsive behavior, Noble said.

"We found that when we activate the cells in the brain that produce MCH, animals become more impulsive in their behavior around food," Noble said.

To test impulsivity, researchers trained rats to press a lever to receive a "delicious, high-fat, high-sugar" pellet, Noble said. However, the rat had to wait 20 seconds between lever presses. If the rat pressed the lever too soon, it had to wait an additional 20 seconds.

Researchers then used advanced techniques to activate a specific MCH neural pathway from the hypothalamus to the hippocampus, a part of the brain involved with learning and memory function.

Results indicated MCH doesn't affect how much the animals liked the food or how hard they were willing to work for the food. Rather, the circuit acted on the animals' inhibitory control, or their ability to stop themselves from trying to get the food.

"Activating this specific pathway of MCH neurons increased impulsive behavior without affecting normal eating for caloric need or motivation to consume delicious food," Noble said. "Understanding that this circuit, which selectively affects food impulsivity, exists opens the door to the possibility that one day we might be able to develop therapeutics for overeating that help people stick to a diet without reducing normal appetite or making delicious foods less delicious."

Credit: 
University of Georgia

Punching holes in opaque solar cells turns them transparent

video: This video is a demonstration of a neutral-colored transparent solar cell that can integrate with windows and mobile devices.

Image: 
Ulsan National Institute of Science and Technology (UNIST)

Researchers in Korea have found an effective and inexpensive strategy for transforming solar cells from opaque to transparent. Existing transparent solar cells tend to have a reddish hue and lower efficiency, but by punching holes around 100 μm in diameter (comparable to the width of a human hair) into crystalline silicon wafers, the researchers let light pass through without coloring. The holes are strategically spaced so that the human eye is unable to "see" the pattern. The work appears December 11 in the journal Joule.
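The trade-off behind the hole pattern is simple geometry: the fraction of the wafer removed sets roughly how much light can pass straight through uncolored, at the cost of active silicon area. The sketch below assumes a square grid of circular holes; the 200-μm pitch is an illustrative value, not a figure from the paper:

```python
import math

def open_area_fraction(hole_diameter_um, pitch_um):
    """Fraction of wafer surface removed when circular holes of the
    given diameter are punched on a square grid with the given pitch
    (center-to-center spacing). That open fraction approximates the
    share of light transmitted directly, and also the share of
    light-collecting silicon given up."""
    hole_area = math.pi * (hole_diameter_um / 2) ** 2
    cell_area = pitch_um ** 2
    return hole_area / cell_area

# 100-um holes (about a hair's width) on a hypothetical 200-um grid:
frac = open_area_fraction(100, 200)
print(f"{frac:.1%} of the surface is open")
```

Shrinking the pitch raises transparency but removes more silicon, which is why the reported 12.2 percent efficiency sits below the 20-plus percent of opaque cells.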

Making transparent solar cells out of naturally opaque crystalline silicon is one of the most challenging problems in the solar energy field. Most solar cells sacrifice their transparency to maximize their efficiency; the best cells on the market have efficiencies of over 20 percent. The transparent, neutral-colored solar cell the research team developed demonstrated long-term stability with a power conversion efficiency of 12.2 percent.

"My team members concluded that crystalline silicon is the best material to develop the glass-like, high-efficiency, high-stability, and neutral-colored solar cell," says Kwanyong Seo, of the Ulsan National Institute of Science and Technology (UNIST), co-senior author on the paper along with Seungwoo Lee of Korea University. "At first thought, it was a crazy idea for all of us. The problem was that crystalline silicon is not transparent, so before us, nobody tried to make transparent crystalline silicon with neutral colors."

Seo says that the see-through solar cell is an ideal material to turn windows into solar panels. "Current solar cells need space. On the ground or enough space on the roof," he says. "But the roof ratio is getting smaller and smaller compared to the window area."

Furthermore, most windows are vertical, so light often hits them at a low angle. Under low-angle light, the electrical current in conventional cells drops nearly 30 percent, while the drop in the transparent cells is less than 4 percent, allowing them to use solar energy more efficiently.

"We want to replace current windows," says Seo. "There are many things we have to overcome, such as the regulations by law. We also need to have the mechanical stability and strength to apply our device to replace the current window in the building."

However, prospects for commercializing the transparent crystalline silicon are promising. Besides the patterning of the wafers, the fabrication process is similar to that of conventional solar cells in the industry. The next step for the team is to scale the device up to 25 cm2 (3.88 in2) and increase the efficiency to 15 percent.

"Silicon substrate is a very popular material in the semiconductor industry," says Seo. "We believe that this vision can apply to many different applications, such as transparent electronics. It can also be applied to mobile devices as an energy source."

Credit: 
Cell Press

Fiber-optic cables capture thunderquake rumbles

video: Tieyuan Zhu, assistant professor of geophysics, used underground fiber optic cables to track movement above ground at Penn State. Seismic data created by events such as lightning and vehicle traffic registered as movement within the fiber optic cable.

Image: 
David Kubarek, Penn State

Underground fiber-optic cables, like those that connect the world through phone and internet service, hold untapped potential for monitoring severe weather, according to scientists at Penn State.

Researchers turned miles of cables under the University Park campus into thousands of virtual sensors capable of detecting tiny seismic events caused by thunder echoing down from the sky during a storm in April.

"Severe weather has strong interactions with the ground, but we haven't had the capability to study the coupling between the atmosphere and the solid Earth," said Tieyuan Zhu, assistant professor of geophysics at Penn State and lead author on the study. "With this new technology, we can utilize existing fiber-optics networks to clearly see how thunderstorm energy passed through campus."

The findings, published today (Dec. 11) in the Journal of Geophysical Research: Atmospheres, mark the first time scientists have recorded thunder-induced seismic events, or thunderquakes, using the kind of fiber-optic network buried under most major cities in the United States, the scientists said.

"The ability to use fiber-optic cables to detect the source of thunder provides yet another way to track thunderstorms and help with public safety and emergency response, especially in urban areas," said David Stensrud, head of the Department of Meteorology and Atmospheric Science at Penn State and co-author on the study. "Every new data source helps to improve our storm-tracking ability."

Using recent technology called a distributed acoustic sensing (DAS) array, scientists found they could track the direction of the storm based on the intensity of the thunderquake events, and that their findings matched up with the location of lightning recorded by the U.S. National Lightning Detection Network.

The DAS array sends a laser down one of the hair-thin glass fibers contained inside the cables and can detect small changes caused by pressure as slight as a human walking, the scientists said. The array takes measurements every six-and-a-half feet, meaning the several miles of continuous cable under the University Park campus acts like a network of 2,000 sensors.
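The quoted sensor count follows directly from the measurement spacing. As a rough check (the 4 km total cable length here is an assumption consistent with "several miles," not a figure from the study):

```python
# Back-of-the-envelope check of the virtual-sensor count, assuming ~4 km
# of continuous buried fiber and the reported ~2 m (six-and-a-half foot)
# measurement spacing of the DAS interrogator.
cable_length_m = 4000          # assumed total fiber length under campus
gauge_spacing_m = 2            # one measurement point every ~6.5 ft
virtual_sensors = cable_length_m // gauge_spacing_m
print(virtual_sensors)         # → 2000
```

At that spacing, every extra kilometer of fiber adds another 500 virtual sensors to the array.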

"If there is any change in the external energy on the ground above, even walking steps, you will have a very small change that's going to stretch or compress the fiber," Zhu said. "The laser is very sensitive and can detect these small changes."

During storms, thunder creates acoustic pressure miles above the Earth that travels down, hits the ground and spreads like waves in a pond. Humans cannot hear or feel these thunderquakes, but the slight movements are captured by the fiber-optic cables, the scientists said.

The array can provide important new information about the Earth's interior. Because there are so few earthquakes on the East Coast, scientists lack local earthquake data that can help image the Earth's crust and mantle, the researchers said.

The method has the potential to be used more broadly for other natural hazards like earthquakes, hurricanes and flooding, as preexisting fiber-optic cable networks exist in urban areas throughout the country, according to the scientists.

"This research is an example of taking an existing technology and using it to serve another purpose," Stensrud said. "Having technologies that are multifunction maximizes the benefits to society."

Credit: 
Penn State

The right mouse model is crucial for Huntington's disease drug development

Scientists evaluated mouse models used for developing new treatments for mood disorders associated with Huntington's disease and recommend which are most relevant and have greater potential for success. They report their results in the Journal of Huntington's Disease.

Amsterdam, NL, December 11, 2019 - Huntington's disease (HD) is an incurable and fatal hereditary disease. Developing disease-modifying drugs for patients with HD depends on testing candidate treatments in animal models. Scientists evaluated the mouse models used for developing new treatments for mood disorders in HD and recommended which of these models are most relevant to such studies. Their findings are published in the Journal of Huntington's Disease.

"Patients with HD commonly suffer from debilitating mood disorders, but developing disease-modifying drugs for HD has proved to be extremely challenging," explained lead investigator Robert M. Friedlander, MD, MA, Chair and Walter E. Dandy Professor of Neurosurgery and Neurobiology at University of Pittsburgh School of Medicine, and Neuroapoptosis Laboratory, Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA. "One reason could be that we are not using the right animal models in research."

The amount of serotonin transporter (SERT), a molecule commonly targeted by medications to regulate mood, is increased in the brain of patients with HD. Investigators compared SERT levels in normal and Huntington's brain tissue samples, as well as brain tissues from two different mouse models of the disease. They found that the amount of SERT was increased in HD patients compared with controls, and that this finding was modeled in one type of mouse model, but not another.

Investigators found that HD patients have significantly increased SERT protein levels in one region of the brain, the striatum, which coordinates multiple aspects of cognition, but not in the globus pallidus, which regulates voluntary movement. The neurodegenerative process commences in and most severely affects the striatum, spreading to other brain regions as the disease progresses, which likely contributes to the mood disorders prevalent in HD. Increased SERT levels were demonstrated in the brains of CAG140 mice, a full-length knock-in mouse model of the disease, but not in the striatum of the R6/2 fragment murine model.

Based on this parameter, the investigators recommend the CAG140 huntingtin knock-in mouse model over the R6/2 model for the study of serotonergic pathway pathology in HD. This matters because researchers should choose the disease model appropriate to the question when designing experiments. However, they conclude, no mouse model completely recapitulates human HD. Each mouse model has its strengths and weaknesses, so it is important to use complementary models and always relate findings back to what is known from human tissue.

"We found that not all HD mouse models are the same, so researchers need to use the models that are most relevant to their studies," commented Dr. Friedlander. "This advance in our understanding of the mood-related symptoms in HD is key to designing treatments that will improve the quality of life for patients with this disease."

HD is a fatal genetic neurodegenerative disease characterized by atrophy of certain regions of the brain. It causes the progressive breakdown of nerve cells in the brain. HD patients experience behavioral changes and uncontrolled movements. Symptoms include personality changes, mood swings and depression, forgetfulness and impaired judgment, and unsteady gait and involuntary movements (chorea). Every child of an HD parent has a fifty percent chance of inheriting the gene. Patients usually survive ten to twenty years after diagnosis.

Credit: 
IOS Press

New spray gel could help take the bite out of frostbite

Mountaineers and winter sports enthusiasts know the dangers of frostbite -- the tissue damage that can occur when extremities, such as the nose, ears, fingers and toes, are exposed to very cold temperatures. However, it can be difficult to get treated quickly in remote, snowbound areas. Now, researchers reporting in ACS Biomaterials Science & Engineering have developed a convenient gel that could be sprayed onto frostbite injuries when they occur, helping wounds heal.

Frostbite causes fluids in the skin and underlying tissues to freeze and crystallize, resulting in inflammation, decreased blood flow and cell death. Extremities are the most affected areas because they are farthest from the body's core and already have reduced blood flow. If frostbite is not treated soon after the injury, it can lead to gangrene and amputation of the affected parts. Conventional treatments include immersing the body part in warm water, applying topical antibiotic creams or administering vasodilators and anti-inflammatory drugs, but many of these are unavailable in isolated snowy areas, like mountaintops, and others, such as topical medications, can themselves freeze. Rahul Verma and colleagues at the Institute of Nano Science and Technology wanted to develop a cold-stable spray gel that could be administered on-site for the immediate treatment of frostbite injuries.

To develop their spray, the researchers packaged heparin, an anticoagulant that improves blood flow by reducing clotting and aiding in blood vessel repair, into liposomes. These lipid carriers helped deliver heparin deep inside the skin. They embedded the heparin-loaded liposomes in a sprayable hydrogel that also contained ibuprofen (a painkiller and anti-inflammatory drug) and propylene glycol, which helped keep the spray from freezing at very low temperatures. When the researchers tested the spray gel on rats with frostbite, they found that the treatment completely healed the injuries within 14 days, whereas untreated injuries were only about 40% healed, and wounds treated with an antibiotic cream were about 80% healed. The spray reduced levels of inflammatory cytokines at the wound site and in the blood circulation, which likely accelerated healing, the researchers say.

Credit: 
American Chemical Society

Uncovering how endangered pangolins, or 'scaly anteaters,' digest food

The endangered Sunda pangolin, or "scaly anteater," is a widely trafficked mammal, prized in some cultures for its meat and scales. Little is known about these animals, and raising rescued pangolins is tricky. In the wild, they eat termites and ants, but diets provided in captivity often make them sick. Now, a study in ACS Omega reports that pangolins lack some common digestive enzymes, which could explain why some diets don't work well for them.

Pangolins use their long, sticky tongues to extract termites and ants from their nests for a tasty meal. Most of the protein content in these insects is locked up in their exoskeletons, which are made of chitin, a substance that is difficult for most animals to digest. There is little information about how pangolins get energy and nutrients from ants and termites. While those who rescue pangolins work to feed them diets they think are nutritious, pangolins do not take well to artificial food sources and often develop gastrointestinal diseases in captivity. Shibao Wu and colleagues wanted to find out whether the pangolin has a special digestive physiology.

The subject of the study was a single female Sunda pangolin, who died shortly after arriving at a rescue facility. Using mass spectrometry to detect the various digestive enzymes in her saliva and intestinal fluid, the researchers found higher levels of chitinase, the enzyme that breaks down the chitin of insect exoskeletons and frees the protein locked inside, than are typically found in other animals. They also observed that the pangolin lacked some key enzymes that would have allowed her to digest food other than ants and termites. Although the study reported findings from only one animal, the results strongly suggest that pangolins have very specialized dietary needs and provide a better understanding of what foods these creatures should be given when raised in captivity.

Credit: 
American Chemical Society

NASA finds Tropical Storm Belna's heavy rainfall potential shrinks

image: On Dec. 10 at 6:47 a.m. EST (1047 UTC) NASA's Aqua satellite analyzed Tropical Storm Belna using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found coldest cloud top temperatures (purple) as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius) in a small area over northwestern Madagascar.

Image: 
NASA JPL/Heidar Thrastarson

Tropical Storm Belna weakened after it made landfall in northwestern Madagascar, and infrared imagery from NASA showed that the area of strong storms within it had diminished. Cold cloud top temperatures tell forecasters whether a tropical cyclone has the potential to generate heavy rainfall, and NASA's Aqua satellite found such temperatures on Dec. 10 over a much smaller area than on Dec. 9.

The AIRS instrument aboard NASA's Aqua satellite captured a look at cloud top temperatures in Belna which gave insight into the storm's strength. Cloud top temperatures provide information to forecasters about where the strongest storms are located within a tropical cyclone. Tropical cyclones do not always have uniform strength, and some sides are stronger than others. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud temperatures are.

On Dec. 10 at 6:47 a.m. EST (1047 UTC) NASA's Aqua satellite analyzed the storm using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found the strongest storms with coldest cloud top temperatures as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius) in a small area over northwestern Madagascar. NASA research has shown that cloud top temperatures that cold indicate strong storms that have the capability to create heavy rain.

By Dec. 11 at 4 a.m. EST (0900 UTC), Belna had become devoid of all heavy rainfall, and the forecasters at the Joint Typhoon Warning Center (JTWC) issued their final bulletin on the storm. Belna had weakened to a tropical depression and had maximum sustained winds near 30 knots (34.5 mph/55.5 kph).

Belna was located near latitude 21.3 degrees south and longitude 45.4 degrees east, over land and just six nautical miles southwest of Antananarivo, Madagascar. Belna was moving south and was expected to dissipate within a day or two.

Tropical cyclones and hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

The AIRS instrument is one of six instruments flying on board NASA's Aqua satellite, launched on May 4, 2002.

Credit: 
NASA/Goddard Space Flight Center

A vaccine against chronic inflammatory diseases

Chronic inflammatory bowel diseases, such as Crohn's disease and ulcerative colitis, are linked to abnormalities of the gut microbiota in humans and in animals. Patients generally present reduced bacterial diversity in their intestinal flora along with excessive levels of bacteria expressing a protein called flagellin, which enhances their motility. This enables them to penetrate the layer of mucus that covers the intestinal wall and is normally sterile. This layer forms a bacteria-resistant barrier between the contents of the digestive tract and the rest of the body, protecting it from the inflammation risk posed by the billions of bacteria of the intestinal flora.

Previous research had already shown that antibodies are naturally found within this mucous layer, some of which are directed against flagellin. This means that the body spontaneously develops immune protection against flagellin, making it possible to control the presence of the bacteria that express it.

Seeking to reduce the risk of chronic inflammation, Inserm researcher Benoit Chassaing and his colleagues had the idea of stimulating this anti-flagellin antibody production in order to reduce the presence of the bacteria that express flagellin in the gut microbiota.

As described in their study published in Nature Communications, the researchers administered flagellin to mice intraperitoneally, thereby inducing a marked increase in the anti-flagellin antibodies, particularly in the intestinal mucosa. The researchers then applied a protocol in order to induce chronic intestinal inflammation, whereupon they observed that immunizing against flagellin gave the animals significant protection from intestinal inflammation. In addition, detailed analysis of their microbiota and intestines revealed not just a reduction in the levels of bacteria that strongly express flagellin but also their absence in the intestinal mucosa, as opposed to the unvaccinated group.

Given that excess flagellin in the gut microbiota has also been linked to metabolic disorders such as diabetes and obesity, the researchers tested their vaccine strategy in mice exposed to a high-fat diet. Whereas the unvaccinated animals developed obesity, the vaccinated animals were protected.

"This vaccine strategy can be envisaged in humans, because such abnormalities of the microbiota have been observed in patients with inflammatory and metabolic diseases. With this in mind, we are currently working on a means of locally administering flagellin to the intestinal mucosa", explains Chassaing. The researchers are considering the possibility of developing ingestible flagellin-filled nanoparticles. Finally, in addition to the preventive aspect, they now wish to test this vaccination for treatment purposes, in animals already presenting chronic inflammatory disease or metabolic disorders.

Credit: 
INSERM (Institut national de la santé et de la recherche médicale)

Thunderquakes make underground fiber optic telecommunications cables hum (audio available)

SAN FRANCISCO--Telecommunications lines designed for carrying internet and phone service can pick up the rumble of thunder underground, potentially providing scientists with a new way of detecting environmental hazards and imaging deep inside the Earth. [Listen to a thunderquake]

The new research being presented today at AGU's Fall Meeting and published in AGU's Journal of Geophysical Research: Atmospheres marks the first time thunder has been heard underground by a telecommunications fiber optic array, according to the study's authors.

The new study used The Pennsylvania State University's existing fiber network for internet and phone service as a distributed sensor array to observe the progress of thunderstorms as they crossed the campus.

Traditional seismometers have recorded ground motions evoked by thunder, called thunderquakes, vibrating in the infrasound frequency range, below 20 Hertz, which is inaudible to the human ear. The fiber array, which is buried 1 meter (3 feet) underground, picked up a wider range of the frequencies heard in a peal of thunder. The bandwidth detected, from 20 to 130 Hertz, is consistent with microphone recordings of thunder and provides more information about the event, the study found.

Penn State geophysicist Tieyuan Zhu and meteorologist David Stensrud gained access to the university's telecommunication fiber optic cable in April 2019. They were listening for subtle vibrations from a variety of environmental effects, including sinkhole formation and flooding.

"Once we set up, we found a lot of very strong events in our fiber optic data, so I was very curious, what's the cause of these signals?" said Zhu. The researchers found a match when they synchronized their results with data from the U.S. National Lightning Detection Network. "We thought, yeah, this is exactly the thunderstorm data, actually recorded by our fiber array."

The passage of lightning heats the air so fast it creates a shockwave we hear as thunder. Vibrations from loud events like lightning, meteor explosions and aircraft sonic booms pass from the air to Earth's surface, shaking the ground.

Fiber optic cables carry telecommunications information in bursts of laser light conducted by strands of transparent glass about as thick as a human hair. Vibrations in the Earth such as those created by thunderstorms, earthquakes or hurricanes stretch or compress the glass fibers, causing a slight change in light intensity and the time the laser pulse takes to travel to its destination. The researchers tracked these aberrations to monitor ground motion, converting the laser pulses back to acoustic signals.

"The laser is very sensitive. If there is a subtle underground perturbation, the laser can detect that change," said Zhu.

Several kilometers of continuous fiber underlie Penn State's campus, which means the array can act like a network of more than 2,000 seismometers emplaced every two meters along the cable path. With this high density of sensors, the researchers can calculate the location where the thunder originated, potentially distinguishing between cloud-to-ground and cloud-to-cloud lightning.

"Compared to the seismometers, the fiber optic array can provide fabulous spatial, and also temporal, resolution," said Zhu. "We can track the thunderstorm source movement."

The researchers said the new study demonstrates fiber optic networks under urban areas are an untapped resource for monitoring environmental hazards. They also hold potential for studying the crust and deep structures of the Earth, which cannot be measured directly.

Scientists learn about the inside of the planet by observing the way seismic waves from earthquakes are altered as they pass through it. Ground motions induced by thunderstorms, which are much more frequent than earthquakes on the east coast of North America, could help reveal the hidden shapes of Earth's interior, Zhu said.

Credit: 
American Geophysical Union

Deciphering the equations of life

Research led by the University of Arizona has resulted in a set of equations that describes and predicts commonalities across life despite its enormous diversity.

"Our study develops a general theory to study the extraordinary diversity of life using simple rules common to all species. We can further apply these rules to predict specific traits of species that we might know a lot less about," said Joseph 'Robbie' Burger, an ecology and evolutionary biology postdoctoral fellow in the Institute of the Environment and the Bridging Biodiversity and Conservation Science program at the University of Arizona. These specific traits include the timing of an organism's reproduction and death, known as its life history.

Burger is lead author on the paper, published in the journal Proceedings of the National Academy of Sciences, which also included collaborators from Missouri University of Science and Technology and the University of New Mexico.

"When thinking about the enormous varieties of lifeforms, it would seem that life is very complicated and wouldn't be that predictable," he said. But whether you're as large as a whale or as tiny as plankton, or whether you're a giant clam that lays millions of eggs at once or an elephant that births a few 250-pound calves in a lifetime, all species have evolved to reproduce, grow, survive and replace within universal biophysical constraints.

"If you impose these constraints on the mathematical model, then certain unifying patterns fall out," Burger said.

One of the constraints is demography. Regardless of the number of offspring produced in a lifetime, on average, only two survive to replace the parents. Another constraint is mass-energy balance. Living things allocate energy to bodily maintenance, growth and reproduction, all of which must balance in a lifecycle.

Imposing these constraints explains two fundamental tradeoffs in how organisms reproduce: the tradeoff between number and size of offspring, and between parental investment in offspring and offspring growth.
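The two constraints described above can be written schematically; the notation below is illustrative, not the paper's:

```latex
% Demographic constraint: of the N offspring an individual produces in a
% lifetime, with mean survival probability S, on average only two survive
% to replace the parents in a stable population.
N \, S = 2
% Mass-energy balance: assimilated energy A is fully partitioned among
% maintenance, growth and reproduction over the lifecycle.
A = E_{\text{maint}} + E_{\text{growth}} + E_{\text{repro}}
```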

"What's so cool about these equations is that to solve them, all you have to know are two values - the size of the offspring at independence and the adult size," Burger said. "If you plug those into the equations, you get the number of offspring an organism will produce in a lifetime and myriad other life history characteristics."

To arrive at this new understanding of how organisms allocate energy to growth, reproduction and survival, Burger and his colleagues compiled published data on the life histories of a diverse array of wild animals in stable populations.

Their new theory refines old understandings about life history tradeoffs. Past assumptions were that offspring size and number increased or decreased at the same rate. For example, elephants have relatively large calves, so they have few in a lifetime, while tuna produce millions of tiny eggs. It turns out that the relationship is not so straightforward, a realization that inspired Burger's work.

"We now need to put these equations into practice by developing user-friendly programming tools, collaborating with field scientists to refine ecosystem models, and informing management decisions," he said.

Credit: 
University of Arizona

Why polar bears at sea have higher pollution levels than those staying on land

As the climate changes, myriad animal populations are being impacted. In particular, Arctic sea-ice is in decline, causing polar bears in the Barents Sea region to alter their feeding and hunting habits. Bears that follow sea-ice to offshore areas have higher pollutant levels than those staying on land -- but why? A new study in ACS' Environmental Science & Technology reports the likely reasons.

Barents Sea polar bears fall into two categories: pelagic, which migrate annually to hunt at sea, and coastal, which stay on land to fast or hunt. Changes in sea-ice availability have forced both types of bears to adjust how they find food. In recent decades, pelagic bears have shifted northward as southern ice has receded. Pelagic bears now have farther to migrate, while longer periods without ice have led coastal bears to feed on land-based prey or rely on their fat reserves. Previous studies have shown that pelagic bears have higher levels of pollutants, such as persistent organic pollutants (POPs), in their bodies, but little is known about why that difference exists. To solve this mystery, Pierre Blévin at the Norwegian Polar Institute and colleagues collected an array of data that paints a clearer picture of how climate change affects polar bears.

In their study, the researchers gathered data on feeding habits, migration patterns, energy expenditure and geography to determine how the two polar bear types differed. They also measured pollutant levels in the prey that polar bears typically consume. The results indicated that several factors cause pelagic bears to accumulate more pollutants than those that stay on land. Compared with land-based hunters, sea-based bears feed on a higher proportion of marine life, especially species higher up the food chain, so the pollutants in their diet have already accumulated through several trophic levels. In addition, sea hunters have higher energy requirements, which in turn cause them to consume more prey. Pelagic bears also feed on prey located closer to pollutant sources and transport pathways. These combined factors highlight the unique pollution exposure mechanisms that polar bears face in this region, and how increased sea hunting could enhance pollutant accumulation in these animals as the ice recedes, the researchers say.
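The food-chain effect described above compounds multiplicatively: each trophic step concentrates a persistent organic pollutant (POP) further. A minimal sketch, using made-up round numbers rather than values from the study:

```python
# Illustrative biomagnification sketch: a POP concentration multiplies by a
# biomagnification factor (BMF) at each trophic step. Both numbers below are
# assumptions for illustration, not measurements from the study.
bmf = 5.0              # assumed concentration multiplier per trophic step
base_conc_ng_g = 1.0   # assumed POP concentration in zooplankton (ng/g)

# zooplankton -> fish -> seal -> polar bear: three trophic steps
trophic_steps = 3
bear_conc_ng_g = base_conc_ng_g * bmf ** trophic_steps
print(bear_conc_ng_g)  # → 125.0
```

This is why a diet shifted even slightly toward higher-trophic-level marine prey can raise a bear's pollutant burden substantially.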

Credit: 
American Chemical Society