Understanding the link between nicotine use and misuse of 'benzos'

WASHINGTON -- Studies have found a correlation between smoking or vaping nicotine and misuse of other substances, such as alcohol and prescription drugs. Lately, misuse of prescription benzodiazepines (such as alprazolam, or Xanax, and diazepam, or Valium) has also been linked to nicotine use. These connections have all been statistically derived -- researchers did not directly study human interaction with these drugs.

Now, however, evidence of how nicotine "sets up" a craving for benzodiazepines -- often called "benzos" -- in animal laboratory studies has been published in the open access journal eNeuro. Georgetown University Medical Center investigator Alexey Ostroumov, PhD, led the study, which he conducted with colleagues at the University of Pennsylvania before joining Georgetown where he continues this research focus.

"Our findings in rats show that nicotine increases the use of another drug," says Ostroumov, an assistant professor in the department of pharmacology & physiology. "Rats exposed to nicotine drank more of a solution that contained benzodiazepine. Rats without nicotine exposure were less interested in the benzodiazepine solution." Benzodiazepines, prescribed for severe anxiety and other conditions, also include the tradenames Ativan, Klonopin, Tranxene, Versed, among many others.

To investigate the mechanisms of increased benzodiazepine consumption, the research team focused on the brain pathway that is critical for use of all addictive substances, including nicotine and benzodiazepines.

This pathway originates in the ventral tegmental area (VTA), whose dopaminergic neurons project throughout the brain. The VTA is considered an integral part of the brain's reward system, which reinforces behavior, and is therefore thought to play a major role in motivation, reward and addiction.

Ostroumov emphasizes that this pathway is probably most important in establishing addiction to other substances in nicotine users, but not in maintaining ongoing addiction. "We believe that nicotine-induced adaptations in this brain pathway increase the risk for future misuse of other addictive substances, including benzodiazepines," he says.

In their experiments, investigators observed that nicotine-treated rats had reduced dopamine neuron activity in response to benzodiazepine. Similar results have frequently been observed in humans, where blunted dopamine signaling reliably predicts elevated use of various addictive substances.

"This association between blunted dopamine responses and nicotine-induced benzo intake supports the idea that when the brain doesn't sense enough gratification from a substance, people may take more of it to compensate," Ostroumov explains.

By looking into the mechanisms of this VTA modification, the researchers found a protein, KCC2, that is downregulated by nicotine. Upregulating KCC2 therefore appeared to be a potential way to mitigate the effect of nicotine on benzodiazepine consumption. The researchers boosted KCC2 in the brain with a synthesized chemical (CLP290) and found that, indeed, CLP290 restored normal functioning of VTA neurons. In rats treated with nicotine, CLP290 decreased consumption of the benzo solution to pre-nicotine levels.

Ostroumov says there is much more research to do to define the therapeutic potential of KCC2 in substance misuse. "But rat studies provide evidence that human smoking and vaping can likely contribute to the misuse of drugs, including benzodiazepines."

Credit: 
Georgetown University Medical Center

Motley crew: Rust and light a possible answer to the conundrum of hydrogen fuel production

image: Crafting a new and efficient way of producing hydrogen from organic waste.

Image: 
Tokyo University of Science

In today's narrative of climate change, pollution, and diminishing resources, one fuel could be a game-changer within the energy industry: hydrogen. When burned in a combustion engine or in an electrical power plant, hydrogen fuel produces only water, making it far cleaner than our current fossil fuels. With no toxic gas production, no contribution to climate change, and no smog, hydrogen may be the answer to a future of cleaner energy, so why is it not more widely used?

There are two reasons for this. First, hydrogen is highly flammable and leaks very easily from storage tanks, causing potential explosion hazards during storage and transport. Second, although pure hydrogen occurs naturally on Earth, it is not found in quantities sufficient for cost-effective utilization. Hydrogen atoms must be extracted from molecules like methane or water, which requires a large amount of energy. Although several techniques exist to produce hydrogen fuel, scientists are yet to make this process "efficient" enough to make hydrogen a commercially competitive fuel on the energy market. Until this is achieved, fossil fuels will probably continue to dominate the industry.

For decades, scientists have been working towards a cheap, efficient, and safe way to produce hydrogen fuel. One of the most promising routes is through solar-driven processes, using light to speed up (or "catalyze") the reaction that splits water molecules into oxygen and hydrogen gas. In the 1970s, two scientists described the Honda-Fujishima effect, which uses titanium dioxide as a photocatalyst in hydrogen production. Building on this research, a team of Japanese researchers led by Prof Ken-ichi Katsumata of Tokyo University of Science sought to use a cheaper, more readily available semiconductor catalyst for this reaction, hoping to increase its efficiency even further and thereby reduce the production costs and improve the safety of hydrogen fuel. Their study, published in Chemistry: A European Journal, indicates that by using a form of rust called α-FeOOH, hydrogen production under Hg-Xe lamp irradiation can be 25 times higher than with a titanium dioxide catalyst under the same light.

The experiment conducted by Prof Katsumata and colleagues aimed to address common challenges encountered when using semiconductor catalysts in solar-driven hydrogen production. The authors describe three major obstacles. The first is the need for a catalyst material that can make efficient use of light energy. The second is that most photocatalysts currently used require rare or "noble" metals as cocatalysts, which are expensive and difficult to obtain. The last problem arises from the actual production of the hydrogen and oxygen gases: if not separated straight away, the mixture of these two gases can at best reduce the hydrogen fuel output and, at worst, cause an explosion. The team therefore aimed to find a solution that would not only increase the reaction's efficiency but also prevent hydrogen and oxygen from recombining and creating a potential hazard.

The team identified a promising candidate catalyst in α-FeOOH (or rust) and set up an experiment to evaluate its efficiency for hydrogen production and the optimal conditions for its activation. "We were really surprised at the generation of hydrogen using this catalyst," states Prof Katsumata, "because most iron oxides are not known to reduce water to hydrogen. Subsequently, we searched for the conditions that activate α-FeOOH and found that oxygen was an indispensable factor, which was the second surprise, because many studies have shown that oxygen suppresses hydrogen production by capturing the excited electrons." The team confirmed the mechanism of hydrogen production from a water-methanol solution using gas chromatography-mass spectrometry, showing that α-FeOOH was 25 times more active than the titanium dioxide catalyst used in previous research and supported stable hydrogen production for more than 400 hours.

More research will be required to optimize this process. Prof Katsumata elaborates: "The specific function of the oxygen in activating light-induced α-FeOOH has not been unveiled yet. Therefore, exploring the mechanism is the next challenge." For now, these findings of Katsumata and his colleagues represent new advancements in the production of a clean, zero-emissions energy source that will be central to the sustainable societies of the future!

Credit: 
Tokyo University of Science

Eat or be eaten

image: Ecosystems with 60 plant species contained, on average, twice the amount of standing biomass in comparison to plant monocultures.

Image: 
Alexandra Weigelt

For the first time, the researchers did not just investigate a single feeding type, such as herbivores, but the integrated feeding relationships across an entire ecosystem. Previous research examining the effects of biodiversity on the functioning of ecosystems focused mainly on single feeding levels (trophic levels) or simplified food chains.

"We have analyzed an entire feeding network - in other words, multitrophic interactions - above- and belowground. This is indispensable for understanding the effects resulting from global species extinction," explained Dr. Sebastian T. Meyer, a researcher at the Chair for Terrestrial Ecology at the Technical University of Munich (TUM) and lead author of the study.

A network of energy

An aboveground food chain could extend from grasses to grasshoppers and on to spiders, for example. The research group examined how much energy flows into the system, how much remains in the system - that is, how much biomass is present - and finally, how much energy leaves the system. The main insight: the entire ecosystem's efficiency rises across all feeding levels when plant diversity increases.
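To make that bookkeeping concrete, here is a minimal, purely illustrative sketch of the kind of accounting involved - energy entering a feeding level, biomass standing in it, and energy passed on - using invented numbers and a made-up TrophicLevel structure; it is not the Jena Experiment's actual model.

```python
from dataclasses import dataclass

@dataclass
class TrophicLevel:
    name: str
    energy_in: float          # energy flowing into this level (arbitrary units)
    standing_biomass: float   # energy currently stored as biomass at this level
    energy_out: float         # energy passed on by feeding or leaving the system

def transfer_efficiency(level: TrophicLevel) -> float:
    """Fraction of the energy entering a level that is passed on rather than lost."""
    return level.energy_out / level.energy_in

# Invented values for a grass -> grasshopper -> spider chain (illustration only).
chain = [
    TrophicLevel("plants",     energy_in=1000.0, standing_biomass=500.0, energy_out=100.0),
    TrophicLevel("herbivores", energy_in=100.0,  standing_biomass=20.0,  energy_out=10.0),
    TrophicLevel("carnivores", energy_in=10.0,   standing_biomass=2.0,   energy_out=1.0),
]

for level in chain:
    print(f"{level.name}: {transfer_efficiency(level):.0%} of incoming energy passed on")
```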

"Seeing positive effects on one level does not imply that there cannot be simultaneous positive effects on other feeding levels," said Dr. Meyer. When a grasshopper feeds on grasses until it is saturated, this does not necessarily result in negative effects on the plant level - with a high level of biodiversity, the system keeps itself in a balance.

Unique database from a grassland biodiversity experiment

The group worked with data gathered through the Jena Experiment, a large-scale grassland biodiversity experiment that has been running since 2002. The research environment provided by the experiment is unique in the world and allows for the synthesis of large amounts of data.

For each of the 80 plots of the Jena Experiment, the researchers assembled trophic network models of the grassland ecosystem. These contain the standing biomass on every feeding level and the flow of energy through feeding interactions between the trophic levels. In addition to plants, the study also covers herbivores, carnivores, omnivores, soil microbes, dead organic material aboveground and in the soil, and the decomposers that feed on these sources of organic matter.

More efficient energy use in ecosystems with higher plant diversity

"The study shows that higher plant diversity leads to more energy stored, greater energy flow and higher energy-use efficiency in the entire trophic network, therefore across all trophic levels," explained Dr. Oksana Buzhdygan from Freie Universitaet Berlin, another lead author of the study.

Ecosystems with 60 plant species contained, on average, twice the standing biomass of plant monocultures, which means that the total amount of resources used and recovered by the plant and animal community rose with increasing plant diversity.

Biodiversity as insurance against environmental fluctuations

"An enhanced ecosystem functionality on all levels can contribute to an increased insurance effect of biodiversity on ecosystem functions when environmental fluctuations occur; it also enhances the system's robustness in case of perturbations," Prof. Jana Petermann from the University of Salzburg concluded. She is the senior author of the study.

This research paper highlights the importance of biodiversity for the functions of, and services provided by, ecosystems. For instance, agricultural land use that aims at yielding a wide range of goods and services should maintain high plant diversity - for example by planting mixed crops - in order to avoid losing ecosystem resources.

Credit: 
Technical University of Munich (TUM)

Freshwater flowing into the North Pacific plays key role in North America's climate

CORVALLIS, Ore. - Massive freshwater river flows stemming from glacier-fed flooding at the end of the last ice age surged across eastern Washington to the Columbia River and out to the North Pacific Ocean, where they triggered climate changes throughout the northern hemisphere, new research published today in Science Advances shows.

The findings provide new insight into the role the North Pacific Ocean plays in the planet's climate, said Alan Mix, an oceanographer and paleoclimatologist in Oregon State University's College of Earth, Ocean, and Atmospheric Sciences and one of the study's authors.

"We look to the past to give us context for what might happen in the future," Mix said. "We didn't know before this research that the increase in freshwater flows was going to trigger widespread changes. It tells us this system is sensitive to these kinds of changes."

The lead author of the study is Summer Praetorius, a research geologist at the U.S. Geological Survey who first started constructing records involved in the project as a doctoral student at Oregon State more than a decade ago.

Praetorius, Mix and OSU co-authors Maureen Walczak, Jennifer McKay and Jianghui Du collected data and analyzed records from across the Northeast Pacific region. They also examined an aggregate of data spanning several decades that was collected by scientists around the globe. Alan Condron, a modeling expert from the Woods Hole Oceanographic Institution, also contributed to the analysis.

The researchers used computer modeling to simulate the rapid movement of floodwaters during deglaciation between 10,000 and 20,000 years ago, showing the flow from the Columbia River traveling along the coastline north to the Gulf of Alaska, across the Bering Strait and to Japan, as well as northward into the Bering Sea and the Arctic Ocean.

Freshwater is less dense than saltwater, so it sits on top of the saltwater like a blanket and mixes downward only slowly, Mix said. This layering can change how heat moves around in the ocean, weakening the ocean's moderating effect on climate.

The Columbia River runs along the dividing line between the subtropical region of the Pacific to the south and the subpolar region to the north. While some of the freshwater flowing out during the floods traveled south and dissipated, more of the water went north, where it flowed like a river along the coastline.

"This spread of floodwaters along the Alaskan coast was a big surprise," said Condron. "The model showed that water from the Columbia River can impact most of the North Pacific and might even leak across the Arctic Ocean and into the Atlantic."

Marine geologists from Oregon State then worked like crime scene investigators, tracing the impacts of the floodwaters through time using chemical "fingerprints" left in fossil shells that were alive during the flooding but sank and accumulated in muddy sediments on the ocean floor along the floodwater's path.

Mix led an expedition to collect sediment cores along the path of the floodwaters in 2004. The cores were then stored in OSU's Marine and Geology Core Repository while the research was underway.

"The expedition yielded a treasure trove of mud from places nobody had thought to examine," he said. "It has taken more than a decade of painstaking work sifting through the mud we retrieved looking for fossil shells that could help tell the story of the floodwater's impact."

Praetorius used the data from the shells and the modeling to show how the repeated flooding over 1,000 years cooled the ocean, which in turn impacted the climate across North America.

"Our findings suggest that freshwater flows into the North Pacific can have far-reaching impacts, changing ocean temperatures and steering winds and storm tracks in North America," she said. "What happens in the North Pacific won't stay in the North Pacific, but instead will cause changes far and wide."

The warming underway today is the opposite of what occurred at the end of the last ice age, Mix said, but understanding circulation patterns in the North Pacific gives researchers insight into what might happen as more warm water flows into the North Pacific as the planet warms.

"What we expect in the future is lower river flows and warmer water in the North Pacific - ¬the opposite of what we saw at the end of the last ice age," Mix said. "But the past is still informative, because it tells us how the circulation system of the North Pacific works."

Credit: 
Oregon State University

How low can you go? Lower than ever before

image: To create electric charges in silicon, researchers shine pulsed laser light onto a sample. One-photon tests using visible light only penetrate a tiny way into a silicon sample -- on the order of micrometers (millionths of a meter) or smaller. But the new two-photon tests using near infrared light penetrate much, much deeper into silicon, on the order of millimeters (thousandths of a meter) or longer. The one-photon tests create a lot of electric charge (shown here as pluses and minuses) in a relatively small volume. By contrast, the two-photon test creates far fewer electric charges in a much larger volume.

Image: 
S. Kelley/NIST

Silicon, the best-known semiconductor, is ubiquitous in electronic devices including cellphones, laptops and the electronics in cars. Now, researchers at the National Institute of Standards and Technology (NIST) have made the most sensitive measurements to date of how quickly electric charge moves in silicon, a gauge of its performance as a semiconductor. Using a novel method, they have discovered how silicon performs under circumstances beyond anything scientists could test before -- specifically, at ultralow levels of electric charge. The new results may suggest ways to further improve semiconductor materials and their applications, including solar cells and next-generation high-speed cellular networks. The NIST scientists report their results today in Optics Express.

Unlike previous techniques, the new method does not require physical contact with the silicon sample and allows researchers to easily test relatively thick specimens, which enable the most accurate measurements of semiconductor properties.

The NIST researchers had previously done a proof-of-principle test of this method using other semiconductors. But this latest study is the first time researchers have pitted the new light-based technique against the conventional contact-based method for silicon.

It's too soon to say exactly how this work might be used someday by industry. But the new findings could be a foundation for future work focused on making better semiconducting materials for a variety of applications, including potentially improving efficiency in solar cells, single-photon light detectors, LEDs and more. For example, the NIST team's ultrafast measurements are well-suited to tests of high-speed nanoscale electronics such as those used in fifth-generation (5G) wireless technology, the newest digital cellular networks. In addition, the low-intensity pulsed light used in this study simulates the kind of low-intensity light a solar cell would receive from the Sun.

"The light we use in this experiment is similar to the intensity of light that a solar cell might absorb on a sunny spring day," said NIST's Tim Magnanelli. "So the work could potentially find applications someday in improving solar-cell efficiency."

The new technique is also arguably the best way to get a fundamental understanding of how the movement of charge in silicon is affected by doping, a process common in light sensor cells that involves adulterating the material with another substance (called a "dopant") that increases conductivity.

Digging Deep

When researchers want to determine how well a material will perform as a semiconductor, they assess its conductivity. One way to gauge conductivity is by measuring its "charge carrier mobility," the term for how quickly electric charges move around within a material. Negative charge carriers are electrons; positive carriers are referred to as "holes" and are places where an electron is missing.
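As a rough illustration of how these quantities fit together (a standard textbook relation, not a formula taken from the NIST paper), conductivity is the elementary charge times the sum of each carrier density multiplied by its mobility; all numbers below are placeholders.

```python
E_CHARGE = 1.602e-19  # elementary charge in coulombs

def conductivity(n_electrons_cm3: float, mu_electrons: float,
                 n_holes_cm3: float, mu_holes: float) -> float:
    """Conductivity in S/cm from carrier densities (cm^-3) and mobilities (cm^2/V/s).

    Textbook relation: sigma = e * (n_e * mu_e + n_h * mu_h).
    """
    return E_CHARGE * (n_electrons_cm3 * mu_electrons + n_holes_cm3 * mu_holes)

# Placeholder values: ~1e13 carriers per cm^3 (roughly the regime discussed in the
# article) and typical room-temperature silicon mobilities from textbooks.
sigma = conductivity(n_electrons_cm3=1e13, mu_electrons=1400.0,
                     n_holes_cm3=1e13, mu_holes=450.0)
print(f"Illustrative conductivity: {sigma:.2e} S/cm")
```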

The conventional technique for testing charge carrier mobility is called the Hall method. This involves soldering contacts onto the sample and passing electricity through those contacts in a magnetic field. But this contact-based method has drawbacks: The results can be skewed by surface impurities or defects, or even problems with the contacts themselves.

To get around these challenges, NIST researchers have been experimenting with a method that uses terahertz (THz) radiation.

NIST's THz measurement method is a rapid, noncontact way to measure conductivity that relies on two kinds of light. First, ultrashort pulses of visible light create freely moving electrons and holes within a sample -- a process called "photodoping" the silicon. Then, THz pulses, with wavelengths much longer than the human eye can see, in the far infrared to microwave range, shine on the sample.

Unlike visible light, THz light can penetrate even opaque materials such as silicon semiconductor samples. How much of that light penetrates or is absorbed by the sample depends on how many charge carriers are freely moving. The more freely moving charge carriers, the higher the material's conductivity.
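One common textbook way to quantify that dependence for a thin photoexcited layer is the Tinkham thin-film formula, which relates the photoinduced sheet conductivity to the drop in the transmitted THz field. The sketch below is generic: it is not necessarily the analysis used in the Optics Express paper, and the sheet-conductivity values are placeholders.

```python
Z0 = 376.73  # impedance of free space in ohms

def relative_transmission(sigma_sheet: float, n_substrate: float = 3.42) -> float:
    """Ratio of the THz field transmitted through a photoexcited thin layer to the
    field transmitted without excitation (Tinkham thin-film formula):

        t_pump / t_ref = (1 + n) / (1 + n + Z0 * sigma_sheet)

    n_substrate = 3.42 is the commonly quoted THz refractive index of silicon.
    """
    return (1 + n_substrate) / (1 + n_substrate + Z0 * sigma_sheet)

# More free carriers -> higher sheet conductivity -> less THz light gets through.
for sigma_sheet in (1e-5, 1e-4, 1e-3):  # siemens per square, illustrative only
    print(f"sigma_sheet = {sigma_sheet:.0e} S/sq -> "
          f"relative transmission {relative_transmission(sigma_sheet):.3f}")
```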

"No contacts are needed for this measurement," said NIST chemist Ted Heilweil. "Everything we do is just with light."

Finding the Sweet Spot

In the past, researchers performed the photodoping process using single photons of visible or ultraviolet light.

The problem with using only one photon for doping, though, is that it typically penetrates only a small way through the sample. And since the THz light completely penetrates the sample, researchers can effectively use this method to study only very skinny silicon samples -- on the order of 10 to 100 billionths of a meter thick (10 to 100 nanometers), about 10,000 times thinner than a human hair.

However, if the sample is that thin, researchers are stuck with some of the same issues as with the conventional Hall technique -- namely, surface defects can skew the results. The thinner the sample, the bigger the impact of surface defects.

The researchers were torn between two objectives: Increase the thickness of the silicon samples, or increase the sensitivity they get from using single photons of light.

The solution? Illuminate the sample with two photons at once instead of one at a time.

By shining two near-infrared photons on the silicon, scientists are still only using a small amount of light. But it's enough to get through much thicker samples while still creating the fewest possible electrons and holes per cubic centimeter.

"With two photons being absorbed at once, we can get deeper into the material and we can see a lot fewer electrons and holes generated," Magnanelli said.

Using a two-photon measurement means the researchers can keep the power levels as low as possible, but still fully penetrate the sample. A conventional measurement can resolve no fewer than one hundred trillion carriers per cubic centimeter. Using its new method, the NIST team resolved a mere 10 trillion, at least 10 times more sensitivity -- a lower threshold for measurement.

The silicon samples studied so far are about half a millimeter thick -- thick enough to avoid surface-defect issues.

And in lowering the threshold for measuring free holes and electrons, the NIST researchers found a couple of surprising results:

Other methods had shown that as researchers create fewer and fewer electrons and holes, their instruments measure higher and higher carrier mobility in the sample -- but only up to a point, after which the carrier density gets so low that the mobility plateaus. By using their noncontact method, NIST researchers found that the plateau occurs at a lower carrier density than previously thought, and that the mobilities are 50% higher than measured before.

"An unexpected result like this shows us things we didn't know about silicon before," Heilweil said. "And though this is fundamental science, learning more about how silicon works could help device makers use it more effectively. For example, some semiconductors may work better at lower doping levels than currently used."

The researchers also used this technique on gallium arsenide (GaAs), another popular light-sensitive semiconductor, to demonstrate that their results are not unique to silicon. In GaAs, they found that the carrier mobility continues to increase with lower charge carrier density, about 100 times lower than the conventionally accepted limit.

Future NIST work might focus on applying different photodoping techniques to samples, as well as varying the samples' temperature. Experimenting with thicker samples may provide even more surprising results in semiconductors. "When we use the two-photon method on thicker samples we may produce even lower carrier densities that we can then probe with the THz pulses," Heilweil said.

Credit: 
National Institute of Standards and Technology (NIST)

Isotope movement holds key to the power of fusion reactions

image: Isotope non-mixing and isotope mixing profiles.

Image: 
Katsumi Ida, the National Institute for Fusion Science and the Graduate University for Advanced Studies

Fusion may be the future of clean energy. In the same way that the sun forces reactions between light elements, such as hydrogen, to produce heavier elements and heat, fusion on Earth could generate electricity by harnessing the power of those elemental reactions. The problem is controlling the uniformity of the hydrogen isotope density ratio in the fusion plasma -- the soup of elements that will fuse and produce energy.

A research team in Japan has reached a key understanding of this process that may aid the future development and use of fusion plasma.

They published their results on Jan. 14 in Physical Review Letters, a journal of the American Physical Society.

The researchers focused on a ratio of hydrogen isotopes, or weight-varied versions of hydrogen, in plasma produced in the Large Helical Device (LHD) at the National Institute for Fusion Science (NIFS). The plasma consisted of hydrogen and deuterium, which weighs twice as much as hydrogen. By understanding how this plasma mixes, the researchers can begin to predict how future plasma consisting of deuterium and tritium, which weighs three times as much as hydrogen, may behave. 

"In the core of fusion plasma , it is most desirable to have an even split between deuterium and tritium because it gives the highest fusion power," said paper author Katsumi Ida, a professor with both the National Institute for Fusion Science and the Graduate University for Advanced Studies. "However, we can only control the isotope ratio at the edge of the plasma, not in the core. We set out to investigate if the isotope ratio is uniform throughout the mixture. If it's not, can we make it uniform?"

Ida and his team found that the uniformity is determined by how the isotopes move, which depends on the turbulent state of the plasma: isotopes affected by ion temperature gradient (ITG) turbulence mixed far more uniformly than isotopes undergoing trapped-electron mode (TEM) turbulence.

"The ITG-dominant state is far more favorable in fusion plasma," Ida said. "We saw the formation of a non-mixing profile and its transition to a uniform isotope state in the plasma, associated with the increase of turbulence propagating along the ion temperature gradient."

ITG turbulence involves a temperature gradient matched to the magnetic fields confining the fusion plasma. The isotopes move more if they are on the hotter end, allowing them to mix more evenly. According to Ida, this understanding could help researchers control plasma uniformity and increase the power of fusion plasma isotope mixtures.

The researchers plan to study uniformity in other ions, including in helium--an element produced by the fusion reaction between deuterium and tritium.

Credit: 
National Institutes of Natural Sciences

International group of scientists found new regulators of blood supply to the brain

image: Professor Dr. Sergey Kasparov.

Image: 
Immanuel Kant Baltic Federal University

Only about ten years ago did theories begin to emerge that astrocytes, along with neurons, are actively involved in processing information and in supporting the activity of the brain.

Recently, an international group of scientists published an article in the journal Nature Communications demonstrating that astrocytes regulate blood flow in the vessels of the brain, a process called "perfusion".

Dr. Sergey Kasparov, a professor at the University of Bristol and the Immanuel Kant Baltic Federal University, is one of the article's authors.

Prof. Dr. Kasparov said:

"Dr. Alexander Gurin from the University College London has played the leading role in the research of ours. The hypothesis of the research was that there is a signaling system in the brain showing it whether it is sufficiently supplied with blood or not. Astrocytes whose membranes have amazing mechanosensitivity are the key element of the system. These cells physically envelop the vessels with their protuberances, that serve as a kind of mechanosensors of the brain. If they detect the "system pressure" dropping, they activate a mechanism rising systemic blood pressure and heart rate thus improving the brain perfusion.

"But the astrocytes do not control the whole process all on their own. The function of the heart and vascular tone are, of course, regulated by neurons, whose activity the astrocytes modulate in a certain way. So far we do not know all the intricacies of this complex mechanism. In particular, it is unclear how the mechanical influences are perceived and what signals the astrocytes send. There is still a lot of work ahead, a lot of experiments to conduct. But it is now more or less clear that this new knowledge can be used in medicine. Previously, only the pressure receptors located on peripheral vessels were known. Now that the function of astrocytes has become clear, new possibilities for regulating blood pressure have opened up for specialists. This, for example, gives hope to those suffering from high blood pressure that occurs without an obvious cause - and there are many people around the world with this problem."

Credit: 
Immanuel Kant Baltic Federal University

Comparing PFAS exposures in female firefighters and office workers

Firefighters have higher rates of some cancers than the general population, which might not be surprising given the many potential carcinogens they encounter while battling blazes. However, previous studies of chemical exposures in this occupation have focused almost exclusively on men. Now, researchers reporting in ACS' Environmental Science & Technology have compared per- and polyfluoroalkyl substances (PFAS) in the serum of female firefighters and female office workers, finding higher levels of three compounds in the firefighters.

Manufacturers apply PFAS to many consumer products, such as fabrics, furniture and food containers, to make the items stain-, water- and grease-resistant. Firefighters likely have additional exposures through their equipment, including PFAS-containing protective gear and firefighting foam. Particularly relevant to women firefighters, some PFAS are endocrine disruptors that affect mammary gland development, possibly impairing lactation or increasing susceptibility to breast cancer. To find out if women firefighters are exposed to elevated levels of these substances, Rachel Morello-Frosch and colleagues with the Women Firefighters Biomonitoring Collaborative compared serum levels of PFAS in female firefighters and female office workers.

The researchers collected blood serum samples from 86 women firefighters and 84 female office workers in San Francisco. The team analyzed levels of 12 different PFAS in the samples using liquid chromatography-mass spectrometry. They detected a total of eight PFAS in the samples, four of which were present in all study participants. Firefighters had modestly higher levels (up to twice as much) of three compounds -- PFNA, PFUnDA and PFHxS -- in their serum than office workers. Among firefighters, those who reported using firefighting foam in the year prior to sample collection had higher PFAS concentrations than those who had not. These results could help guide efforts to reduce workplace exposure to PFAS in female firefighters, the researchers say.

Credit: 
American Chemical Society

Trials show new drug can ease symptoms of chronic cough

Two trials of a new drug have shown that at low doses, it can ease the often distressing symptoms of chronic cough with minimal side effects.

Principal researcher Jacky Smith, a Professor of Respiratory Medicine at The University of Manchester and a consultant at Wythenshawe Hospital, says Gefapixant has the potential to have a significant impact on the lives of thousands of sufferers.

Higher doses can reduce the sense of taste, though at 50mg, the effect is much reduced, say the research team.

The drug is being developed in collaboration with the pharmaceutical company MSD, which funded the trials.

The study, published in Lancet Respiratory Medicine today, shows that in a 12-week trial of 253 patients - the largest of its kind - 80% of patients had a clinically significant response to a dose of 50mg.

A dose of 7.5mg reduced the coughing by 52%, 20mg by 52% and 50mg by 67% from baseline. Around a quarter did not respond to the drug.

And another 16-day study describing a 57-patient trial, also published this week in the European Respiratory Journal, showed that as little as 30mg of the drug could be effective - much lower than previously thought.

Both studies were randomised and double blind, in which neither the participants nor the experimenters knew who received the treatment.

The drug is now in two larger global phase 3 trials, carried out to confirm and expand on the safety and effectiveness results from the previous research.

Chronic coughing is thought to affect between 4 and 10% of the population, some of whom cough thousands of times a day over many years.

While many patients improve with treatment of associated conditions such as asthma, gastroesophageal reflux disease and nasal disease, many do not.

The condition can cause abdominal pain, urinary incontinence in women, as well as anxiety, depression and difficulty sleeping.

Professor Smith said: "This drug has exciting prospects for patients who suffer from the often distressing condition of chronic cough.

"Effective treatments for cough are a significant unmet clinical need and no new therapies approved in over 50 years.

"Billions of pounds are spent annually on over-the-counter cough and cold medicines despite a lack of evidence to support their efficacy, concerns about the potential for abuse and risk of harm in overdose."

Gefapixant targets P2X3 receptors in the nerves which control coughing, and the team monitored the impact of the drug using a special cough-monitoring device they developed, which counts coughs.

The drug was initially developed as a pain killer, until the researchers discovered it had a significant impact on chronic cough.

Some unlicensed drugs have also been shown to improve chronic cough, but their use is limited by unpleasant side effects.

It is thought a chemical called adenosine triphosphate (ATP), released as a response to inflammation in airways, may be an important mechanism for patients with chronic cough.

Professor Smith added: "We can't yet say when or if this drug will be available on prescription, however, if the phase 3 trial is successful then it would certainly be a major step towards everyday use.

"Though it's fair to say the drug is not a cure for chronic cough, it can and often does reduce the frequency of coughing substantially"

"That could make a big difference to patients who often struggle with this condition which can make such a big impact on their lives."

Retired journalist Nick Peake, from Warrington, who was a television director at ITV and the BBC, has been suffering from chronic cough for 25 years.

He said: "Coughing has blighted my life : every day without fail I cough for the first two hours, soon after I wake up often every 30 seconds. It wears me out.

"It comes and goes through the day: usually after a meal, or when I have a change of atmosphere - out of warm into cold, or if I exercise too hard.

"It often stops me getting to sleep at night, but then I might wake up at 3 or 4 in the morning and start coughing.

He added: "The coughing interferes with conversations, sometimes it stop me singing which I love to do. It's embarrassing when I'm with people - I find myself apologising a lot, and I have no control over it.

"So I'm often in despair about it and it can make me miserable. How my wife has put up with it all this time I don't know.

"It's been going on for so long and I'm thoroughly fed up with it, and desperate for a cure to be found."

Credit: 
University of Manchester

Using social media to understand the vaccine debate in China

image: Researchers used social media to better understand the vaccine debate in China

Image: 
qimono; Pixabay

THE SITUATION

Vaccine acceptance is a crucial public health issue, which has been exacerbated by the use of social media to spread content expressing vaccine hesitancy. Studies have shown that social media can provide new information regarding the dynamics of vaccine communication online, potentially affecting real-world vaccine behaviors.

A team of United States-based researchers observed an example of this in 2018 related to the Changchun Changsheng Biotechnology vaccine incident in China. The researchers found:

Expressions of distrust in government pertaining to vaccines increased significantly during and immediately after the incident.

Self-reports of vaccination changed from positive endorsements of vaccination to concerns about vaccine harms.

Expressed support for vaccine acceptance in China may be decreasing.

FROM THE RESEARCHER:

"The World Health Organization identified vaccine hesitancy as one of their top 10 challenges of 2019. When combined with virulent illnesses, such as COVID-19 or influenza, small changes in vaccination rates could spell the difference between smaller, contained outbreaks and a worldwide pandemic. Governments and public health agencies around the world need to prioritize health communication efforts. Even the safest and most effective vaccine is useless if people refuse to take it."
-- David Broniatowski, associate professor of engineering management and systems engineering at the George Washington University.

CONCLUSIONS

The new study, "Chinese Social Media Suggest Decreased Vaccine Acceptance in China: An Observational Study on Weibo Following the 2018 Changchun Changsheng Vaccine Incident," highlights the dangers of public perception of even a single vaccine safety incident, according to the researchers.

The team also believes the possible emergence of vaccine opposition in China is a potential cause for concern, especially considering the density of several large Chinese population centers.

2018 VACCINE INCIDENT IN CHINA

In July 2018, Chinese government inspectors determined that Changchun Changsheng Biotechnology, a prominent manufacturer of vaccines in China, had violated national regulations and standards when producing 250,000 rabies vaccine doses. The violation might have undermined the effectiveness of the involved vaccines. News began slowly escalating on Chinese social media platforms not long after the incident.

Credit: 
George Washington University

Stanford research maps a faster, easier way to build diamond

image: A sample of diamond crystals synthesized from triamantane, a type of diamondoid.

Image: 
Sulgiye Park

It sounds like alchemy: take a clump of white dust, squeeze it in a diamond-studded pressure chamber, then blast it with a laser. Open the chamber and find a new microscopic speck of pure diamond inside.

A new study from Stanford University and SLAC National Accelerator Laboratory reveals how, with careful tuning of heat and pressure, that recipe can produce diamonds from a type of hydrogen and carbon molecule found in crude oil and natural gas.

"What's exciting about this paper is it shows a way of cheating the thermodynamics of what's typically required for diamond formation," said Stanford geologist Rodney Ewing, a co-author on the paper, published Feb. 21 in the journal Science Advances.

Scientists have synthesized diamonds from other materials for more than 60 years, but the transformation typically requires inordinate amounts of energy, time or the addition of a catalyst - often a metal - that tends to diminish the quality of the final product. "We wanted to see just a clean system, in which a single substance transforms into pure diamond - without a catalyst," said the study's lead author, Sulgiye Park, a postdoctoral research fellow at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).

Understanding the mechanisms for this transformation will be important for applications beyond jewelry. Diamond's physical properties - extreme hardness, optical transparency, chemical stability, high thermal conductivity - make it a valuable material for medicine, industry, quantum computing technologies and biological sensing.

"If you can make even small amounts of this pure diamond, then you can dope it in controlled ways for specific applications," said study senior author Yu Lin, a staff scientist in the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC National Accelerator Laboratory.

A natural recipe

Natural diamonds crystallize from carbon hundreds of miles beneath Earth's surface, where temperatures reach thousands of degrees Fahrenheit. Most natural diamonds unearthed to date rocketed upward in volcanic eruptions millions of years ago, carrying ancient minerals from Earth's deep interior with them.

As a result, diamonds can provide insight into the conditions and materials that exist in the planet's interior. "Diamonds are vessels for bringing back samples from the deepest parts of the Earth," said Stanford mineral physicist Wendy Mao, who leads the lab where Park performed most of the study's experiments.

To synthesize diamonds, the research team began with three types of powder refined from tankers full of petroleum. "It's a tiny amount," said Mao. "We use a needle to pick up a little bit to get it under a microscope for our experiments."

At a glance, the odorless, slightly sticky powders resemble rock salt. But a trained eye peering through a powerful microscope can distinguish atoms arranged in the same spatial pattern as the atoms that make up diamond crystal. It's as if the intricate lattice of diamond had been chopped up into smaller units composed of one, two or three cages.

Unlike diamond, which is pure carbon, the powders - known as diamondoids - also contain hydrogen. "Starting with these building blocks," Mao said, "you can make diamond more quickly and easily, and you can also learn about the process in a more complete, thoughtful way than if you just mimic the high pressure and high temperature found in the part of the Earth where diamond forms naturally."

Diamondoids under pressure

The researchers loaded the diamondoid samples into a plum-sized pressure chamber called a diamond anvil cell, which presses the powder between two polished diamonds. With just a simple hand turn of a screw, the device can create the kind of pressure you might find at the center of the Earth.

Next, they heated the samples with a laser, examined the results with a battery of tests, and ran computer models to help explain how the transformation had unfolded. "A fundamental question we tried to answer is whether the structure or number of cages affects how diamondoids transform into diamond," Lin said. They found that the three-cage diamondoid, called triamantane, can reorganize itself into diamond with surprisingly little energy.

At 900 Kelvin - which is roughly 1160 degrees Fahrenheit, or the temperature of red-hot lava - and 20 gigapascals, a pressure hundreds of thousands of times greater than Earth's atmosphere, triamantane's carbon atoms snap into alignment and its hydrogen scatters or falls away.
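The conversions behind those figures are routine unit arithmetic; a quick sketch (standard conversions, nothing specific to the study's data):

```python
def kelvin_to_fahrenheit(t_kelvin: float) -> float:
    """Standard conversion: F = (K - 273.15) * 9/5 + 32."""
    return (t_kelvin - 273.15) * 9.0 / 5.0 + 32.0

def gigapascals_to_atmospheres(p_gpa: float) -> float:
    """1 standard atmosphere = 101325 Pa."""
    return p_gpa * 1e9 / 101325.0

print(f"900 K  is about {kelvin_to_fahrenheit(900):.0f} degrees Fahrenheit")  # ~1160
print(f"20 GPa is about {gigapascals_to_atmospheres(20):,.0f} atmospheres")   # ~197,000
```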

The transformation unfolds in the slimmest fractions of a second. It's also direct: the atoms do not pass through another form of carbon, such as graphite, on their way to making diamond.

The minute sample size inside a diamond anvil cell makes this approach impractical for synthesizing much more than the specks of diamond that the Stanford team produced in the lab, Mao said. "But now we know a little bit more about the keys to making pure diamonds."

Credit: 
Stanford's School of Earth, Energy & Environmental Sciences

MicroRNA regulates process vital to placenta growth in early pregnancy

image: In preparation for pregnancy, fetal trophoblast cells (brown) from which the placenta arises invade maternal decidual cells (pink) in the uterus lining.

Image: 
Image courtesy of Hana Totary-Jain of USF Health, originally published in Scientific Reports: doi.org/10.1038/s41598-020-59812-8

TAMPA, Fla (Feb. 25, 2020) -- Abnormal formation and growth of the placenta is considered an underlying cause of various pregnancy complications such as miscarriages, preeclampsia and fetal growth restriction. Yet, much remains to be learned about molecular mechanisms regulating this blood-vessel rich organ vital to the health of a pregnant woman and her growing fetus.

A new study by University of South Florida Health (USF Health) Morsani College of Medicine researchers has discovered how a very large human non-protein coding gene regulates epithelial-to-mesenchymal transition (EMT) - a process that contributes to placental implantation during early pregnancy as well as cancer progression and spread.

The USF Health researchers used a powerful genome editing technology called CRISPR (specifically, CRISPR-dCas9) to activate the entire chromosome 19 microRNA cluster (known as C19MC), so they could study the gene's function in early pregnancy. C19MC -- one of the largest microRNA gene clusters in the human genome -- is normally turned off and becomes expressed only in the placenta, embryonic stem cells and certain cancers.

In their cell model study, published Feb. 20 in Scientific Reports, a Nature research journal, the USF Health team showed that robustly activating C19MC inhibited EMT.

But when cells from which the placenta arises (trophoblasts) were exposed to hypoxia - a lack of oxygen like that occurring in early placental development -- C19MC expression was significantly reduced, the researchers found. This loss of C19MC function then freed the trophoblasts to differentiate from stem-like epithelial cells into mesenchymal cells that can migrate and invade much like metastatic tumors.

"We were the first to use CRISPR to efficiently activate the entire gene, not just a few regions of this huge gene, in human cell lines," said the paper's senior author Hana Totary-Jain, PhD, an associate professor in the Department of Molecular Pharmacology and Physiology, USF Health Morsani College of Medicine. "Our study indicates C19MC plays a key role in regulating many genes important in early implantation and placental development and function. The regulation of these genes are critical for proper fetal growth."

Dr. Totary-Jain and others in her department collaborated with colleagues in the medical college's Department of Obstetrics and Gynecology on the project.

The USF Health study offers new insight into how trophoblasts interact with the maternal uterine environment to become more invasive or less invasive in the formation of the placenta, said co-author Umit Kayisli, PhD, a USF Health professor of obstetrics and gynecology. "More research on microRNA expression and how it inhibits the epithelial-to-mesenchymal transition may help us better understand and control preeclampsia and fetal growth restriction, which account for 5-to-10 percent of all pregnancy complications and premature births."

EMT happens early in the formation of the placenta, an organ which attaches to the lining of the uterus during pregnancy and supplies oxygen and nutrients from mother to the growing fetus. During the first trimester, fetal trophoblasts penetrate the maternal uterine lining and modify its blood vessels. This remodeling of the mother's spiral arteries allows oxygenated blood to flow from the mother to fetus.

However, the trophoblast invasion prompted by EMT is a tightly coordinated balancing act. If the invasion is too shallow to adequately remodel the maternal blood vessels, preeclampsia and fetal growth restriction can occur. Invasion that progresses too deeply -- beyond normal anchoring of the placenta to the uterine wall -- leads to placenta accreta, a rare condition that can cause dangerous bleeding and often requires pregnancy termination.

"You need the EMT process, but at some point it needs to stop to prevent adverse pregnancy outcomes," Dr. Totary-Jain said. "You really need a balance between not enough invasion and too much invasion, and C19MC is important in maintaining that balance."

Investigating the effects of altered C19MC expression on cell differentiation and trophoblast invasion has implications not only for a better understanding of normal and abnormal placenta development, but also for cancer and stem cell research, Dr. Totary-Jain added.

Credit: 
University of South Florida (USF Health)

Study finds gender disparities in hematology research success

Hematologists who complete a mentored training program experience greater levels of academic success than those who do not; however, a study published today in Blood Advances suggests a slight discrepancy in success levels between male and female hematologists. The study, which examined the effect of caregiving responsibilities on academic success, identified that, on average, men had one more first- or senior-authored publication than women, and almost twice as many total publications. Surprisingly, the study found that self-identification as a caregiver was associated with decreased productivity for men but not women.

The American Society of Hematology (ASH) Clinical Research Training Institute (CRTI) aims to improve hematologists' research training and increase their academic success. Each year, 20 participants - typically fellows or faculty members with an intended career in hematology - are chosen to participate in the year-long education and mentoring program. The CRTI begins with a week-long workshop made up of lectures, group discussions, and interactive sessions. With the guidance of faculty and peers, participants work to develop individual clinical research projects over the course of the year.

Previous evaluations of the CRTI have indicated that the program increased academic success for all graduates. However, female participants have shown slightly fewer markers of success than their male counterparts. Researchers in the present study hypothesized that caring for a child or elderly relative could factor into that disparity, so they asked CRTI alumni from 2003 to 2016 to complete surveys about their caregiving duties. Respondents also submitted CVs, which researchers analyzed for the number of first- or senior-author publications, the total number of publications, the percent effort in research, and the number of instances of being a principal investigator on federal grants.

Of the 258 CRTI alumni who responded, 66% said they had caregiving responsibilities, and more than half of all respondents said they felt those responsibilities had affected their career success. The study found that hematologists with caregiving responsibilities had fewer first- or senior-author publications and less percent effort in research compared to those without.

Yet when analyzed by gender, the caregiving role most negatively affected men's academic productivity. Overall, men had more first- or senior-author publications and more total publications than women. Men who self-identified as caregivers had lower levels of success in academic markers compared to men who did not. Women had similar first- or senior-author publications, total publications, and percent effort in research regardless of whether they reported caregiving responsibilities.

The finding that the discrepancy in academic success between women and men cannot be explained by caregiving duties alone highlights a need for additional research into possible reasons. "We have to figure out why female physician scientists tend to have lower levels of productivity," said Allison King, MD, of Washington University School of Medicine and the study's lead author. "I hope this will prompt people to think about how they can level the playing field for women and caregivers."

Dr. King and her team noted that a possible limitation of the study is that it did not quantify the amount of time spent caregiving. The researchers indicated future evaluations will ask respondents to estimate the number of hours spent on caregiving to better assess its effect on career success.

They also hope their findings will not diminish what hematologists have achieved through the CRTI program. "What we don't want to lose in this important discussion of gender," Dr. King said, "is that everyone has a high level of success. Our female graduates are productive, but there is still a gap that needs to be addressed."

Credit: 
American Society of Hematology

Treatment to reset immune cells markedly improves TBI symptoms

image: Use of experimental CSF-1R inhibitor drug reduced inflammation in microglia cells after treatment (right) compared to before treatment (left).

Image: 
University of Maryland School of Medicine

Researchers at the University of Maryland School of Medicine (UMSOM) found that targeting overactive immune cells in the brain with an experimental drug could limit brain cell loss and reverse cognitive and motor difficulties caused by traumatic brain injury (TBI). The findings, published Monday in the Journal of Neuroscience, suggest a potential new treatment for TBI and possibly other brain injuries.

The UMSOM scientists administered a CSF-1R inhibitor for one week to mice, beginning one month after TBI - a time at which the animals have brain inflammation and neurological deficits. The researchers found the drug depleted more than 95 percent of the brain's overactive immune cells (microglia), which are known to cause neurotoxic inflammation. Several weeks following the treatment, the cells had regenerated, and the new cells were more similar to normal microglia, with fewer inflammatory features.

More importantly, the mice recovered markedly better than the control group that didn't receive treatment, showing less loss of tissue and neurons with significantly better motor and cognitive performance.

"We were surprised to see that the extent to which such late treatment could reverse the inflammatory state and the cognitive effects of experimental TBI," said study co-author Rebecca Henry, PhD, Research Associate in Anesthesiology at UMSOM. "This was a proof of concept study that depletion and subsequent repopulation of microglia cells after injury has a strong protective effect, but we clearly need more research to better understand this process before clinical translation."

Both human and animal brains have specialized immune cells, called microglia, that protect against bacteria or viruses that enter the brain.

Evidence from previous clinical studies has shown an increase in microglia activation in patients who have suffered a moderate or severe traumatic brain injury. These changes appear to be correlated with neurological deficits in such patients, including cognitive function.

"These preclinical studies suggest that the consequences of TBI on brain degeneration and related neurological impairment may be modifiable quite long after injury," said study co-author Alan Faden, MD, the David S. Brown Professor in Trauma at UMSOM. "We can potentially alter these effects by even highly delayed targeting of inflammatory pathways, a finding at odds with widely accepted views about treating head injury."

The next step for the researchers is to isolate the microglia and use RNA sequencing techniques to learn more about which genes drive inflammation and overactivation, in order to better understand the mechanism identified in this study.

"This is an intriguing finding that points to an important role that inflammation plays in chronic debilitation from brain injuries," said UMSOM Dean E. Albert Reece, MD, PhD, MBA, University Executive Vice President for Medical Affairs and the John Z. and Akiko K. Bowers Distinguished Professor. "Future studies will hopefully lead to new treatments for severe TBI that destroys a patient's quality of life."

Credit: 
University of Maryland School of Medicine

Insulin signaling suppressed by decoys

image: The nematode roundworm C. elegans.

Image: 
Gill lab/Scripps Research

JUPITER, FL--FEB. 25, 2020--In a discovery that may further the understanding of diabetes and human longevity, scientists at Scripps Research have found a new biological mechanism of insulin signaling. Their study, involving the roundworm C. elegans, reveals that a "decoy" receptor is at work, binding insulin molecules and keeping them from triggering the usual insulin signals.

The study appears in the journal eLife. It describes a new player in the insulin signaling system, one that may offer insights into insulin resistance, a feature of type 2 diabetes. The scientists are now assessing whether a similar decoy exists in humans. If so, it could present a new target for diabetes treatment and prevention research.

"This truncated, 'decoy' receptor that we've found adds yet another layer of complexity to our understanding of insulin signaling," says lead author Matthew Gill, PhD, associate professor in the Department of Molecular Medicine at Scripps Research in Florida.

In an associated commentary, Princeton University geneticist Coleen Murphy, PhD, writes that the discovery is surprising given how well studied insulin signaling is.

"It would be hard to overstate the importance of a receptor called DAF-2 to our understanding of aging and longevity," Murphy writes. "The discovery...raises new questions and will change how we think about DAF-2's role in insulin signaling regulation of aging and longevity."

Insulin is a hormone of ancient and fundamental importance to animals, and insulin-like proteins are found even in simpler organisms such as bacteria, fungi and worms. In humans, it acts as a signal to key cell types, directing them to pull in glucose from the blood. This helps maintain cellular energy stores and keeps blood sugar within a safe range. Type 2 diabetes, which is estimated to affect more than 30 million people in the United States, features a failure of insulin signaling to reduce blood glucose levels.

Since the 1990s researchers have recognized that insulin signaling is also an important regulator of longevity. For example, mutations in the gene that encodes the C. elegans insulin receptor DAF-2 can more than double the worm's lifespan.

Gill and his colleagues focused on a variant form of the C. elegans receptor known as DAF-2B. It is a truncated version that contains the usual binding site for insulin but, unlike the normal receptor, does not respond to binding by generating a signal inside the cell.

The team confirmed that the gene for DAF-2B is active throughout the worm lifespan, and they used CRISPR gene-editing technology to tag the receptor with a fluorescent molecule and thus track its location in the worm body. From these experiments it became apparent that DAF-2B is secreted from the cells that produce it into the space surrounding the tissues of the worm, acting as a decoy to capture insulin molecules and thereby reduce insulin signaling.

"Normally insulin molecules float around and interact with insulin receptors to create insulin signals, but when they bind to these decoy receptors, they generate no signal, so producing these decoys appears to be a way to modulate insulin signaling," Gill says.

The scientists found that overproducing DAF-2B could tip worms into a semi-dormant state that normally occurs when food is scarce and insulin signaling is low. Overproduction of DAF-2B increased worm lifespan as well.

Although the discovery of this mechanism for regulating insulin signaling is a significant basic-science advance, it also suggests a new way of thinking about diabetes and even aging. The precise causes of the insulin resistance that underlies diabetes and is also seen to some extent with normal aging have never been fully illuminated.

"One possibility is that insulin resistance is caused by the abnormal overproduction of a truncated, 'decoy' insulin receptor like the one we've found," Gill says.

DAF-2B is produced from the same gene as the DAF-2 receptor, and results when the RNA transcript that is copied out from the gene is sliced and re-spliced in an alternative form. This alternative splicing process is known to occur for many genes, but Gill notes that it is often dysregulated with aging or certain kinds of disease.

"You can imagine that in the prime of life, splicing and expression of this truncated isoform, DAF-2B, is tightly regulated, but then with a broader change in the splicing system due to disease or aging it becomes dysregulated and leads to insulin resistance," Gill says.

If so, and if humans also have a decoy insulin receptor like DAF-2B, then reversing its dysregulation in people who have insulin resistance might be a new strategy for better metabolic health.

Credit: 
Scripps Research Institute