
Declines in lung cancer death rates among US women have lagged in 2 hot spots

Bottom Line: While lung cancer death rates among women in most of the United States have declined substantially in recent years, progress has lagged in two hot spots: one covering central Appalachia and southern parts of the Midwest, and another in the northern Midwest.

Journal in Which the Study was Published: Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research.

Author: Katherine Ross, MPH, a graduate student in the Department of Epidemiology of the Rollins School of Public Health at Emory University, Atlanta.

Background: Ross explained that nationally, lung cancer death rates have been declining steadily among women since the mid-2000s. "We wanted to see if there were geographic differences in this decline so that we could identify places in the United States where women might benefit from targeted tobacco control and smoking cessation programs, and other interventions aimed at reducing the burden of lung cancer," she said.

How the Study Was Conducted and Results: To conduct the study, Ross and her colleagues used data on the number of lung cancer deaths among women obtained from the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) program database to calculate age-standardized lung cancer death rates for each county in the contiguous United States for 1990 to 1999 and for 2006 to 2015. They then calculated the absolute change and the relative change in the death rates between the two periods for each county.
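The period-over-period comparison the researchers describe boils down to simple arithmetic. A minimal sketch, using hypothetical rates per 100,000 women rather than the study's actual data:

```python
def absolute_change(rate_early, rate_late):
    """Absolute change in an age-standardized death rate between two periods."""
    return rate_late - rate_early

def relative_change(rate_early, rate_late):
    """Relative (percent) change between two periods."""
    return (rate_late - rate_early) / rate_early * 100.0

# Hypothetical county: rate per 100,000 women in 1990-1999 vs. 2006-2015
early, late = 40.0, 45.2
print(absolute_change(early, late))   # difference in deaths per 100,000
print(relative_change(early, late))   # roughly a 13 percent increase
```

A positive relative change like the 13 percent reported for the Appalachian hot spot means the late-period rate exceeds the early-period rate; the 6 percent decline elsewhere corresponds to a negative value.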

The researchers used a software tool called ArcGIS to identify clusters of counties with increases or small decreases in lung cancer death rates between the two periods, called hot spots.

They found that from 1990-1999 to 2006-2015, lung cancer death rates among women rose by 13 percent in a hot spot encompassing 669 counties in 21 states in central Appalachia and southern parts of the Midwest. During the same period, in a second hot spot that encompasses 81 counties in four states in the northern Midwest, lung cancer death rates among women rose by 7 percent. In the remainder of the contiguous United States, lung cancer death rates among women fell by 6 percent.

The researchers also compared lung cancer death rates among women in each hot spot with those among women in the remainder of the United States. In 1990, the death rate for the largest hot spot was 4 percent lower than the death rate for non-hot spot regions, but in 2015, it was 28 percent higher. For the second hot spot, the death rate was 18 percent lower than the non-hot spot death rate in 1990 but equivalent to the non-hot spot death rate in 2015.

Author Comment: "We know that Midwestern and Appalachian states have the highest prevalence of smoking among women and the lowest percent declines in smoking in recent years, so it is perhaps not surprising that we found that women in these areas experienced a disparity in lung cancer death rates," said Ross. "This geographic disparity may widen unless we specifically aim to reduce tobacco use among women in these hot spots.

"There are several effective tobacco control policies available, such as increased excise taxes on tobacco and comprehensive smoke-free air laws that ban smoking in the workplace, restaurants, and bars," continued Ross. "However, many states in our identified hot spots either do not have these measures in place, or they are comparatively weak and could be strengthened."

Limitations: According to Ross, the main limitation of the study is that the researchers are not able to draw any conclusions about whether or not differences in tobacco control are responsible for the trends they observed. "In addition, in some counties there was not enough information about rates available to draw conclusions about their trends in lung cancer mortality among women, which means that the results of the study might not apply to those counties," Ross added.

Funding & Disclosures: Ross declares no conflicts of interest.

Credit: 
American Association for Cancer Research

Nanoscale alloys from elements thought to be incapable of mixing

image: This is Huang Zhennan (left) and Shahbazian-Yassar.

Image: 
UIC/Jenny Fontaine

A multi-institutional team of scientists describes a new technique that can meld ions from up to eight different elements to form what are known as high entropy alloyed nanoparticles. The atoms of the elements that make up these particles are distributed evenly throughout and form a single, solid-state crystalline structure -- a feat that has never been achieved before with more than three elements. The nanoparticles could have broad applications as catalysts. The findings are published in the journal Science.

Traditionally, materials scientists have not made serious attempts to create materials that contain more than three elements because of the tendency of each element's atoms to clump together. Scientists also assumed that such multi-element materials wouldn't have any valuable real-world applications.

But now, using advanced transmission electron microscopy, researchers at the University of Illinois at Chicago have proven that multiple elements never before thought capable of forming a single material can do just that. The researchers showed that up to eight elements were able to form nanoparticles -- tiny spheres no more than 100 nanometers wide -- with a homogeneous crystal structure.

"This will really change the way people think about materials that were previously thought to be immiscible," said Reza Shahbazian-Yassar, associate professor of mechanical and industrial engineering in the UIC College of Engineering and an author on the paper.

Materials scientists at the University of Maryland, College Park, led by Liangbing Hu, produced the unique nanoparticles, known as high entropy alloys. "The novel high-entropy nanoparticles could be used in a broad range of applications, particularly as catalysts in emerging energy and environmental technologies," said Hu, associate professor of materials science and engineering.

The Maryland scientists used a two-step process that included a brief heat 'shock' followed by rapid cooling to get ions of various elements that normally wouldn't form alloys to mix and stabilize in crystalline nanoparticles. During the rapid cool-down phase, these ions form a single solid crystal that is a uniform, homogeneous mixture of multiple elements. "At the atomic scale, the various ions are found one next to another," said Shahbazian-Yassar. "So, there would be, for example, a gold atom next to a nickel atom, next to a copper atom, next to a platinum atom -- resulting in a homogeneous, mixed single-state nanoparticle that looks like a single unique material."

To confirm the homogeneity of the nanoparticles, Zhennan Huang, a doctoral student in the UIC College of Engineering, and Anmin Nie, a former postdoctoral researcher in Shahbazian-Yassar's lab, used advanced scanning transmission electron microscopy to image the crystals and identify individual atoms. They were able to determine that at the atomic level, their nanoparticles were made of homogeneous mixtures of different combinations of platinum, cobalt, nickel, copper, iron, palladium and gold. "We were able to provide definitive proof that these nanoparticles didn't have lumps of a single element, but that each of the component elements were distributed equally throughout the nanoparticle," said Huang.

Scientists at Johns Hopkins University were able to demonstrate one potential use of the nanoparticles. They used them as advanced catalysts for ammonia oxidation, which is a key step in the production of nitric acid. They were able to achieve 100 percent oxidation of ammonia with the particles, proving their ability as useful catalysts.

"But in reality, we really don't know all the ways these nanoparticles might be used because we've never been able to make them before at nanoscale," said Shahbazian-Yassar. "Materials science textbooks only discuss alloys of maybe three different elements at most, so we are really in novel territory here."

Credit: 
University of Illinois Chicago

Anti-aging protein alpha Klotho's molecular structure revealed

image: This is Dr. Orson Moe.

Image: 
UT Southwestern Medical Center

DALLAS - March 29, 2018 - Researchers from UT Southwestern's Charles and Jane Pak Center for Mineral Metabolism and Clinical Research and Internal Medicine's Division of Nephrology recently published work in Nature that reveals the molecular structure of the so-called "anti-aging" protein alpha Klotho (a-Klotho) and how it transmits a hormonal signal that controls a variety of biologic processes. The investigation was performed in collaboration with scientists from New York University School of Medicine and Wenzhou Medical University in China.

Studies at UTSW two decades ago by Dr. Makoto Kuro-o, Professor of Pathology, demonstrated that mice lacking either a-Klotho or the hormone FGF23 suffered from premature and multiple organ failure as well as other conditions, including early onset cardiovascular disease, cancer, and cognitive decline. Because defects in a-Klotho lead to symptoms seen in aging, researchers inferred that a-Klotho suppresses aging, leading to great interest in how the a-Klotho protein might work together with the hormone FGF23 to fulfill their roles.

a-Klotho can exist on the surface of a cell or can be released from the cell and circulate in body fluids, including the blood, as soluble a-Klotho. The cell-attached form and the circulating form of a-Klotho were previously and universally believed to serve completely different functions.

"The a-Klotho gene [then called Klotho] was cloned by Dr. Kuro-o in 1997 shortly before he was recruited here, and during his tenure at UT Southwestern he has carried out the most seminal work in this field," said Pak Center Director Dr. Orson Moe. "The gene protects against many diseases, including cardiovascular disease, cancer, diabetes, aging, neurodegeneration, and kidney disease. The structure of the a-Klotho protein and how the protein functions, however, largely remained a mystery until this current work."

By providing a first look at the structure of the protein complex that includes FGF23 and its co-receptors, the FGF receptor and a-Klotho, the most recent study challenges the long-accepted belief that only the cell-attached form of a-Klotho can serve as a receptor for FGF23, and hence that FGF23 action is restricted to tissues having the cell-attached form.

Study authors include Dr. Moe, Professor of Internal Medicine and Physiology, and Dr. Ming Chang Hu, Associate Professor of Internal Medicine and Pediatrics. Dr. Moe holds The Charles Pak Distinguished Chair in Mineral Metabolism, and the Donald W. Seldin Professorship in Clinical Investigation. Dr. Hu holds the Makoto Kuro-o Professorship in Bone and Kidney Research.

One of the major, paradigm-changing findings revealed by solving the protein complex structure is that the circulating form of soluble a-Klotho can actually serve as a co-receptor for FGF23. The soluble form of a-Klotho can therefore travel to any cell in the body and act as a co-receptor for FGF23, rendering every cell a possible target of the hormone.

"a-Klotho researchers in cancer, aging, neurologic, cardiovascular, and kidney disease will benefit from this research," Dr. Moe said. "The knowledge of the structure of the protein, along with its molecular binding partners, will enable us to greatly advance the understanding of how a-Klotho works and also how to best design therapeutic strategies and novel agents that can either activate or block FGF23-a-Klotho interaction and signaling as needed."

Collaboratively led by NYU School of Medicine structural biologist Dr. Moosa Mohammadi, the investigation included researchers from UTSW, the Rockefeller University-based New York Structural Biology Center, and Wenzhou Medical University.

The study provides evidence for how FGF23 signals to cells by forming a complex with a-Klotho and the two other molecular partners. Made by bone cells, the FGF23 hormone travels via the bloodstream to cells in all organs, where it regulates many aspects of mineral metabolism. Abnormal FGF23 levels are found in many disease states. In chronic kidney disease, for example, high FGF23 levels are believed to cause many of the disease's complications and fatalities.

The researchers say their findings also shed new light on how kidney disease leads to an abnormal thickening of heart muscle tissue called hypertrophy, which is a leading cause of death in people with kidney disease caused by high blood pressure, diabetes, and other illnesses. When damaged kidney tubules can no longer eliminate phosphate in the urine, FGF23 rises, initially as an effort to keep blood phosphate in check. With time, FGF23 can rise to harmful levels.

A prevailing hypothesis has been that very high levels of FGF23 cause hypertrophy in the heart. But the theory remained controversial because heart tissue does not have a-Klotho, which must be present if FGF23 is to signal. The latest findings indicate that a-Klotho can be "delivered" through the bloodstream to organs where it is not normally present. This could potentially launch drug development programs for kidney disease, the researchers said.

"The solution of this protein structure will guide many future studies," Dr. Moe said. "There are numerous diseases that involve a-Klotho deficiency. Replenishment of a-Klotho by either recombinant protein injection or drugs that increase a patient's own a-Klotho will have potential therapeutic implications for neurologic, metabolic, cardiovascular and kidney disease, and cancer."

Credit: 
UT Southwestern Medical Center

Rapid emissions reductions would keep CO2 removal and costs in check

Rapid greenhouse-gas emissions reductions are needed if governments want to keep in check both the costs of the transition toward climate stabilization and the amount of already-emitted CO2 that must be removed from the atmosphere. To this end, emissions in 2030 would need to be at least 20 percent below what countries have pledged under the Paris climate agreement, a new study finds - an insight directly relevant for the global stocktake scheduled for the UN climate summit in Poland later this year. Removing CO2 from the atmosphere through technical methods - including carbon capture and underground storage (CCS) or increased use of plants to take up CO2 - comes with a number of risks and uncertainties, hence the interest in limiting it.

"Emissions reduction efforts in the next decade pledged by governments under the Paris climate agreement are by far not sufficient to attain the explicit aim of the agreement - they will not keep warming below the 2-degrees-limit," says Jessica Strefler from the Potsdam Institute for Climate Impact Research (PIK), lead-author of the analysis published in Environmental Research Letters. "To stabilize the climate before warming crosses the Paris threshold, we either have to undertake the huge effort of halving emissions until 2030 and achieving emission neutrality by 2050 - or the emissions reductions would have to be complemented by CO2 removal technologies. In our study, we for the first time try to identify the minimum CO2 removal requirements - and how these requirements can be reduced with increased short-term climate action."

At least 5 billion tons of CO2 removal per year throughout the second half of the century

It turns out that, according to the scientists' computer simulations, the challenges of keeping warming below the threshold agreed in Paris would increase sharply if CO2 removal from the atmosphere were restricted to less than 5 billion tons of CO2 per year throughout the second half of the century. This is substantial: it would mean, for instance, building up a carbon capture and storage industry that moves masses comparable to today's global petroleum industry. Still, 5 billion tons of CO2 removal is modest compared with the tens of billions of tons that some scenarios used in climate policy debates assume. Current CO2 emissions worldwide are more than 35 billion tons per year.
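For a sense of scale, the figures quoted here can be related with back-of-envelope arithmetic (a sketch of the proportions only, not the study's model):

```python
# Figures from the text: current global CO2 emissions exceed 35 billion tons
# per year; the scenarios examined require at least 5 billion tons of removal
# per year in the second half of the century.
current_emissions_gt = 35.0  # Gt CO2 per year today
cdr_floor_gt = 5.0           # Gt CO2 removal per year, post-2050

share = cdr_floor_gt / current_emissions_gt
print(f"The removal floor is about {share:.0%} of today's annual emissions")
```

So even the "modest" 5-billion-ton floor amounts to roughly one-seventh of everything the world currently emits each year.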

"Less than 5 billion tons of CO2 removal could drastically drive up the challenges of climate stabilization", says co-author Nico Bauer from PIK. "If for instance this amount of carbon dioxide removal (CDR) was halved, then the annual CO2 reduction rates between 2030 and 2050 would have to be doubled to still achieve 2 degrees Celsius. In addition, short-term emissions reductions would also have to be increased as the emissions reductions pledged so far by the signatories of the Paris Agreement are not sufficient to keep warming below 2 degrees if they're not combined with CO2 removal from the atmosphere."

"It is all about short-term entry points, like rapidly phasing out coal"

More CO2 removal could in principle reduce costs since, on paper, implementing the relevant technologies to compensate for residual emissions in industry and transport is cheaper than pushing emissions reduction from 90 percent to 100 percent. However, CO2 removal technologies are subject to three types of uncertainties and risks. First, neither their technical feasibility nor their costs are well known so far. Second, they might have negative effects on sustainability; a massive scale-up of bioenergy production, for instance, could trigger land-use conflicts and come at the expense of food production and ecosystem protection. Third, their political feasibility is by no means given. In Germany, fears expressed by parts of the population led the government to stop even small-scale carbon capture and storage implementation.

"This gives important information to governments - first, rapid short-term emissions reductions are the most robust way of preventing climate damages, and second, large-scale deployment of CDR technologies can only be avoided when reliable CO2 prices are introduced as soon as possible," says Ottmar Edenhofer, co-author of the study and PIK's chief economist. "Ramping up climate policy ambition for 2030 to reduce emissions by 20 percent is economically feasible. It is all about short-term entry points: rapidly phasing out coal in developed countries such as Germany and introducing minimum prices for CO2 in pioneer coalitions in Europe and China makes sense almost irrespective of the climate target you aim for. In contrast, our research shows that delaying action makes costs and risks skyrocket. People as well as businesses want stability, and this is what policy-makers can provide - if they act rapidly."

Credit: 
Potsdam Institute for Climate Impact Research (PIK)

A moveable feast: Antibiotics give C. diff a nutrient-rich environment, no competition

image: This is C. difficile sporulation seen through a phase contrast microscope at a magnification of 1000X.

Image: 
Dr. Rajani Thanissery

Using a mouse model, researchers from North Carolina State University have found that antibiotic use creates a "banquet" for Clostridium difficile (C. diff), by altering the native gut bacteria that would normally compete with C. diff for nutrients. The findings could lead to the development of probiotics and other strategies for preventing C. diff infection.

C. diff is a harmful bacterium that can cause severe, recurrent and sometimes fatal infections in the gut. Although the bacteria are commonly found throughout our environment, C. diff infections primarily occur in patients who are taking, or who have recently finished taking, antibiotics.

"We know that antibiotics are major risk factors for C. diff infection because they alter the gut microbiota, or composition of bacteria in the gut, by eliminating the bacteria that are normally there," says Casey Theriot, assistant professor of infectious disease at NC State and corresponding author of a paper describing the research. "Our latest work suggests that the microbiota may provide natural resistance to C. diff colonization by competing with C. diff for nutrients in that environment; specifically, for an amino acid called proline."

Theriot and postdoctoral fellow Joshua Fletcher introduced C. diff to antibiotic-treated mice and monitored their gut environment at four intervals: 0, 12, 24, and 30 hours after introduction. They conducted metabolomic and RNA sequencing analysis of the gut contents and the C. diff at these time points to find out which nutrients the bacteria were "eating." Metabolomics allowed the team to trace the abundance of the nutrients in the gut, and RNA analysis indicated which genes in the C. diff were active in metabolizing nutrients.

The researchers found that the amount of proline in the gut decreased as the population of C. diff increased. The amount of a proline byproduct called 5-aminovalerate increased at the same time, indicating that C. diff was metabolizing the proline. The RNA analysis further confirmed C. diff's use of proline: expression of genes related to proline metabolism in C. diff increased during the early stages of colonization, when proline was abundant.

"We've been able to show that in the absence of competition C. diff is metabolizing proline and other amino acids in the mouse model, using it as fuel to survive and thrive," Theriot says. "Hopefully this information could lead to the development of better probiotics, or 'good' bacteria that can outcompete C. diff for nutrients in the gut. The ultimate goal is to control these bacteria in ways that don't rely solely on antibiotics."

Credit: 
North Carolina State University

A paperlike LCD -- thin, flexible, tough and cheap

image: This is a combined flexible blue optically rewritable LCD.

Image: 
Zhang et al.

WASHINGTON, D.C., March 28, 2018 -- Optoelectronic engineers in China and Hong Kong have manufactured a special type of liquid crystal display (LCD) that is paper-thin, flexible, light and tough. With this, a daily newspaper could be uploaded onto a flexible paperlike display that could be updated as fast as the news cycles. It sounds like something from the future, but scientists estimate it will be cheap to produce, perhaps only costing $5 for a 5-inch screen. The new optically rewritable LCD design was reported this week in Applied Physics Letters, from AIP Publishing.

The team focused on two key innovations for achieving highly flexible designs. The first is the recent development of optically rewritable LCDs. Like conventional LCDs, the display is structured like a sandwich, with a liquid crystal filling between two plates. But unlike conventional LCDs, where electrical connections on the plates create the fields required to switch individual pixels from light to dark, optically rewritable LCDs coat the plates with special molecules that realign in the presence of polarized light and switch the pixels. This removes the need for traditional electrodes, reduces the structure's bulk and allows more choices in the type and thickness of plates. Consequently, optically rewritable LCDs are thinner than traditional LCDs, at less than half a millimeter thick, can be made from flexible plastic, and weigh only a few grams. "It's only a little thicker than paper," said Jiatong Sun, a co-author from Donghua University in China.

Optically rewritable LCDs are durable and cheap to manufacture because of their simple structure. Moreover, like an electronic paper screen in an e-book, energy is only required to switch display images or text. Therefore, running costs are low because these new LCDs don't need power to sustain an image once it is written on the screen.

The second innovation involves the spacers that maintain the separation of the plastic or glass plates. "We put spacers between glass layers to keep the liquid crystal layer uniform," Sun said. Spacers are used in all LCDs to set the thickness of the liquid crystal layer. A constant thickness is necessary for good contrast ratio, response time and viewing angle. However, when the plates bend, the liquid crystal is forced away from the impact site, leaving sections of the screen blank, so alterations in spacer design are critical to keep the liquid crystal in flexible LCDs from moving excessively. Developing a flexible design that overcomes this barrier has proven challenging.

The researchers tried three different spacer designs and found that a meshlike spacer prevented liquid crystal from flowing when their LCD was bent or hit. This innovation enabled them to create the first flexible optically rewritable LCD.

An additional innovation involved improved color rendering. The scientists report that until this study, optically rewritable LCDs had only been able to display two colors at a time. Now, their optically rewritable LCD simultaneously displays the three primary colors. They achieved this by placing a special type of liquid crystal behind the LCD, which reflected red, blue and green. To make this into a commercial product, Sun wants to improve the resolution of the flexible optically rewritable LCD.

"Now we have three colours but for full colour we need to make the pixels too small for human eyes to see," Sun said.

Credit: 
American Institute of Physics

Monterey Bay Aquarium study finds sea turtles use flippers to manipulate food

image: A green turtle swiping the stinging jellyfish (Cyanea barkeri) in the water column at Hook Island, Queensland, Australia, taken June 2017.

Image: 
Copyright Fujii et al. shared under Creative Commons CC BY

Sea turtles use their flippers to handle prey despite the limbs having evolved for locomotion, Monterey Bay Aquarium researchers report in a study published today in PeerJ.

The in-depth examination of the phenomenon - "Limb-use By Foraging Sea Turtles, an Evolutionary Perspective," by Jessica Fujii, Dr. Kyle Van Houtan, and others - reveals that a behavior thought to be unlikely in marine tetrapods is actually widespread, and that this type of exaptation of flippers may have been occurring 70 million years earlier than previously thought.

"Sea turtles don't have a developed frontal cortex, independent articulating digits or any social learning," says Van Houtan, Director of Science at Monterey Bay Aquarium. "And yet here we have them 'licking their fingers' just like a kid who does have all those tools. It shows an important aspect of evolution - that opportunities can shape adaptations."

Lead author Jessica Fujii is part of the Aquarium's sea otter research team where she specializes in ecomorphology--the intersection of evolution, behavior and body form. Fujii's expertise in sea otter foraging and tool use behavior has influenced her recent examination of sea turtles and how they have evolved to use their limbs in novel ways.

Analysis by Fujii and Van Houtan using crowd-sourced photos and videos finds widespread examples of behaviors such as a green turtle holding a jelly, a loggerhead rolling a scallop on the seafloor and a hawksbill pushing against a reef for leverage to rip an anemone loose.

Similar behaviors have been documented in marine mammals from walruses to seals to manatees - but not, until now, in sea turtles. The paper shows that sea turtles are similar to these other groups in using flippers for a variety of foraging tasks (holding, bracing, corralling).

"Sea turtles' limbs have evolved mostly for locomotion, not for manipulating prey," Fujii says. "But that they're doing it anyway suggests that, even if it's not the most efficient or effective way, it's better than not using them at all."

The finding came as a surprise to the authors, given sea turtles' ancient lineage and the fact that the reptiles are considered to have simple brains and simple flippers. The results also offer an insight into the evolution of four-limbed ocean creatures that raises questions about which traits are learned and which are hardwired.

"We expect these things to happen with a highly intelligent, adaptive social animal," Van Houtan says. "With sea turtles, it's different; they never meet their parents," Kyle says. "They're never trained to forage by their mom. It's amazing that they're figuring out how to do this without any apprenticing, and with flippers that aren't well adapted for these tasks."

The study may also help inform the aquarium's ongoing sea otter research. How developmental biology predisposes animals to adopt dining strategies is of particular interest, given the aquarium's efforts to raise stranded sea otter pups and prepare them for a return to the wild. Rearing and releasing stranded pups contributes to the aquarium's work to recover California's threatened sea otter population.

Before they're released, ecologically naïve pups have to be taught foraging behaviors, be it for crabs or abalone, by adult female sea otters at the aquarium, which serve as surrogate mothers to the pups.

"What we're trying to understand is how to have the best sea otter surrogacy program," Kyle says. "This is kind of one end of the spectrum of that--the opposite end of the spectrum."

Credit: 
PeerJ

Mental health issues linked to risky driving in newly licensed teens

March 27, 2018 - Mental health symptoms related to attention deficit-hyperactivity disorder (ADHD) and conduct disorder are associated with increased errors in a driving simulator and self-reported risky driving behaviors in adolescents, according to a study in Nursing Research, published by Wolters Kluwer.

"Inattention is associated with more errors in the driving simulator, and self-reported symptoms of hyperactivity and conduct disorder are independently associated with self-reported risky driving behaviors," comments lead author Catherine C. McDonald, PhD, RN, FAAN, of the University of Pennsylvania School of Nursing and the Center for Injury Research and Prevention at Children's Hospital of Philadelphia. The findings suggest that mental health issues might contribute to the risk of crashes in newly licensed adolescent drivers.

Symptoms of ADHD and Conduct Disorder Affect Teens' Driving Performance

The study included 60 young drivers, aged 16 to 17, who had received their license within the past 90 days. The teens were tested using a high-fidelity driving simulator, which assessed their responses to various types of common but avoidable crash scenarios. They also completed a self-report questionnaire on risky driving behaviors--for example, speeding, not wearing a seat belt, or nighttime driving with other teens as passengers.

The adolescents were also assessed on a measure of mental health symptoms, focusing on three conditions potentially associated with risky driving: ADHD, conduct disorder, and depression. Lastly, parents filled out a questionnaire about their teens' mental health.

A teen's self-report of inattention was the only mental health symptom to be related to errors on the driving simulator assessment. The higher the score for inattention, as rated by the teens themselves, the higher the rate of driving performance errors. Unexpectedly, teens with higher scores for depression symptoms made fewer errors in the simulator.

Teens with higher self-rated scores for hyperactivity/impulsivity and conduct disorder also scored higher for risky driving behaviors. Overall, parents' reports of the adolescents' mental health symptoms were not related to the teens' self-reported symptoms or their risky driving behaviors (self-reported or in the simulator).

The researchers emphasize that it was symptoms of hyperactivity and impulsivity--not necessarily the diagnosis of ADHD--that were associated with risky driving behaviors. However, teens who met the cutoff points for clinical follow-up for ADHD and conduct disorder had higher scores for risky driving.

Motor vehicle crashes are the leading cause of death in adolescents, but little is known about the mental health factors affecting crash risk. Mental health may be especially important for newly licensed teen drivers, who are at higher risk of crashes. The new study suggests that risky driving behaviors during this time might be related to symptoms of inattention, hyperactivity, and conduct disorder.

The link between these mental health symptoms and self-reported risky driving behavior is consistent with the nature of the symptoms involved, according to Dr. McDonald and colleagues. "With hyperactivity-impulsivity, rule violations may stem from inherent problems with self-control," they write. "With conduct disorder, rule violations may be an attempt to take advantage of a situation or express hostility."

Understanding the role of mental health factors might help in reducing risky driving behaviors in novice teen drivers, with the goal of lowering crash risk in this vulnerable population. Dr. McDonald and coauthors conclude, "Nurses are well-positioned in a variety of clinical settings to counsel adolescents, addressing the multidimensional nature of risks associated with mental health and risk behaviors."

Credit: 
Wolters Kluwer Health

Abnormal brain connections seen in preschoolers with autism

OAK BROOK, Ill. - Preschoolers with autism spectrum disorder, or ASD, have abnormal connections between certain networks of their brains that can be seen using a special MRI technique, according to a study published online in the journal Radiology. Researchers said the findings may one day help guide treatments for ASD.

ASD refers to a group of developmental disorders characterized by communication difficulties, repetitive behaviors and limited interests or activities. Young children with ASD can usually be diagnosed within the first few years of life. Early diagnosis and intervention are important because younger patients typically benefit most from treatments and services to improve their symptoms and ability to function.

While developments in brain imaging have enabled the discovery of abnormal brain connectivity in younger children with ASD, the phenomenon has not yet been fully investigated at the brain network level. Brain networks are areas of the brain connected by white matter tracts that interact to perform different functions.

For the new study, researchers looked for differences in brain connectivity in children with ASD using an MRI technique called diffusion tensor imaging (DTI). The technique provides important information on the state of the brain's white matter.

Researchers compared DTI results from 21 preschool boys and girls with ASD (mean age, 4-and-a-half years) with those of 21 similarly aged, typically developing children. They applied graph theory to the DTI results to learn more about the level of connectivity between brain networks; graph analysis lets researchers measure relationships among highly connected and complex data, such as the network of connections that forms the human brain.
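As a rough illustration of the graph-theory step (using a made-up connectivity matrix, not the study's data), each brain region becomes a node and each white-matter connection a weighted edge, from which standard network measures can be computed:

```python
import numpy as np
import networkx as nx

# Toy structural connectivity matrix for six brain regions, where entry
# (i, j) stands in for white-matter connection strength. Illustrative only.
rng = np.random.default_rng(0)
W = rng.random((6, 6))
W = (W + W.T) / 2                 # symmetrize: structural links are undirected
np.fill_diagonal(W, 0)            # no self-connections

G = nx.from_numpy_array(W)

# Measures commonly reported in connectome studies:
strength = dict(G.degree(weight="weight"))              # nodal strength
efficiency = nx.global_efficiency(G)                    # network integration
clustering = nx.average_clustering(G, weight="weight")  # local segregation

print(strength[0], efficiency, clustering)
```

Group differences like those reported here would then be tested by comparing such measures between the ASD and typically developing cohorts.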

Compared with the typically developing group, children with ASD demonstrated significant differences in components of the basal ganglia network, a brain system that plays a crucial role in behavior. Differences were also found in the paralimbic-limbic network, another important system in regulating behavior.

"Altered brain connectivity may be a key pathophysiological feature of ASD," said study co-author Lin Ma, M.D., from the Department of Radiology at Chinese PLA General Hospital in Beijing. "This altered connectivity is visualized in our findings, thus providing a further step in understanding ASD."

The results suggest that these altered patterns may underlie the abnormal brain development in preschool children with ASD and contribute to the brain and nervous system mechanisms involved in the disorder. In addition, the identification of altered structural connectivity in these networks may point toward potential imaging biomarkers for preschool children with ASD.

"The imaging finding of those 'targets' may be a clue for future diagnosis and even for therapeutic intervention in preschool children with ASD," Dr. Ma said.

For instance, Dr. Ma said, in the future this type of brain imaging might aid in the delivery of ASD therapies for children like repetitive transcranial magnetic stimulation, or TMS, and transcranial direct current stimulation, or tDCS. TMS involves using a magnet to target and stimulate certain areas of the brain, while tDCS relies on electrical currents to deliver therapy. Both are being investigated as possible treatments for ASD.

Credit: 
Radiological Society of North America

New research shows how submarine groundwater affects coral reef growth

image: This is Maunalua Bay, Oahu, Hawaii.

Image: 
Florybeth La Valle, HIMB/ UH SOEST

Groundwater that seeps into the coastal zone beneath the ocean's surface--termed submarine groundwater discharge (SGD)--is an important source of fresh water and nutrients to nearshore coral reefs throughout the globe. Although submarine groundwater is natural, it can act as a conduit for highly polluted water to shorelines. A recently published study, led by researchers at the University of Hawai'i at Mānoa's School of Ocean and Earth Science and Technology (SOEST), sheds light on the ways SGD affects coral reef growth.

"SGD is common on nearshore coral reefs, especially in Hawai'i, so we set out to test how SGD affects coral reef growth in Maunalua Bay, O'ahu," said Megan Donahue, associate researcher at the Hawai'i Institute of Marine Biology (HIMB) in SOEST and senior author of the study.

Two processes contribute to the overall growth of coral reefs: coral growth and bioerosion, the natural breakdown of coral reefs by reef organisms. To determine how SGD affects these processes, the research team outplanted small pieces of lobe coral on the reef flat in areas with a range of SGD and measured the changes in size over a six-month period. They also put out blocks of dead coral skeleton across the same SGD gradients for one year to measure bioerosion rates. The blocks were scanned before and after the deployment with a micro-CT scanner, similar to a hospital CT scanner, to determine the amount of coral skeleton removed by bioeroding organisms in three dimensions.
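The before-and-after scan comparison amounts to differencing two registered 3D volumes. A minimal sketch with toy binary arrays (the array size and voxel volume are assumptions for illustration, not values from the study):

```python
import numpy as np

# Toy stand-in for registered before/after micro-CT scans of one coral block:
# boolean voxel grids where True marks coral skeleton.
voxel_volume_mm3 = 0.001                      # assume 0.1 mm cubic voxels
before = np.ones((50, 50, 50), dtype=bool)    # intact skeleton at deployment
after = before.copy()
after[:10, :, :] = False                      # pretend borers removed a layer

# Voxels present before but gone after = material removed by bioeroders.
eroded_voxels = np.count_nonzero(before & ~after)
eroded_volume_mm3 = eroded_voxels * voxel_volume_mm3
print(f"skeleton removed: {eroded_volume_mm3:.1f} mm^3")
```

Dividing the removed volume by deployment time and block surface area would give a bioerosion rate comparable across the SGD gradient.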

Areas with high levels of SGD dealt coral reefs a double whammy. Corals right next to SGD seeps performed poorly, likely due to the stress of too much fresh water.

"Additionally, we found that marine organisms responsible for bioerosion broke down the skeletal reef framework very quickly when exposed to high amounts of SGD," said lead author Katie Lubarsky, who completed this research as part of her graduate degree in Marine Biology at UH Mānoa. "Many bioeroding organisms are filter feeders that perform better in high nutrient environments, so the high nutrient groundwater likely enhanced bioeroder activity. This indicates that high inputs of nutrient polluted SGD could favor reef breakdown and substantially slow down overall reef growth."

To the researchers' surprise, SGD actually enhanced coral growth when the nutrient enrichment and fresh water from the groundwater were at low levels.

"Our results indicate that corals can thrive on SGD-impacted reefs if isolated from secondary stressors such as competition from seaweeds and sedimentation," said Donahue. "Maunalua Bay is situated in a highly urbanized area, and the coral reefs in the bay have become degraded as the population has boomed over the last 50 years. But active management to reduce invasive algae and limit fine sediments could allow coral recovery in Maunalua Bay."

"While the current study found that corals grow faster when exposed to low levels of SGD nutrient enrichment, coral cover remains extremely low on the Maunalua Bay reef flats," said Nyssa Silbiger, study co-author and assistant professor at California State University, Northridge. "Our next studies will focus on how SGD and herbivory from fishes impact coral-algal competition, coral recruitment rates, and bioerosion rates."

Credit: 
University of Hawaii at Manoa

Gene therapy may help brain heal from stroke, other injuries

image: Reactive astrocytes surround the lesion site in the injured spinal cord. A new mouse study shows that triggering a gene inside astrocytes activates the star-shaped cells and may improve the brain's ability to heal from a range of debilitating conditions, from strokes to concussions and spine injuries. Credit: Meifan Amy Chen, Ph.D.

Image: 
UT Southwestern

DALLAS - March 27, 2018 - Scientists have found a genetic trigger that may improve the brain's ability to heal from a range of debilitating conditions, from strokes to concussions and spinal cord injuries.

A new study in mice from UT Southwestern's O'Donnell Brain Institute shows that turning on a gene inside cells called astrocytes results in a smaller scar and - potentially - a more effective recovery from injury.

The research examined spinal injuries but likely has implications for treating a number of brain conditions through gene therapy targeting astrocytes, said Dr. Mark Goldberg, Chairman of Neurology & Neurotherapeutics at UT Southwestern.

"We've known that astrocytes can help the brain and spinal cord recover from injury, but we didn't fully understand the trigger that activates these cells," Dr. Goldberg said. "Now we'll be able to look at whether turning on the switch we identified can help in the healing process."

The study published in Cell Reports found that the LZK gene of astrocytes can be turned on to prompt a recovery response called astrogliosis, in which these star-shaped cells proliferate around injured neurons and form a scar.

Scientists deleted the LZK gene in astrocytes of one group of injured mice, which decreased the cells' injury response and resulted in a larger wound on the spinal cord. They overexpressed the gene in other injured mice, which stimulated the cells' injury response and resulted in a smaller scar. Overexpressing the gene in uninjured mice also activated the astrocytes, confirming LZK as a trigger for astrogliosis.

Dr. Goldberg said a smaller scar likely aids the healing process by isolating the injured neurons, similar to how isolating a spreading infection can improve recovery. "But we don't know under what circumstances this hypothesis is true because until now we didn't have an easy way to turn the astrocyte reactivity on and off," he said.

Further study is needed to determine whether compact scar tissue indeed improves recovery and how this process affects the neurons' ability to re-form connections with each other.

Dr. Goldberg's lab will conduct more research to examine the effects of astrogliosis in stroke and spinal cord injuries. The researchers will determine whether turning up LZK in mice in advance of an injury affects its severity. They will then measure how the formation of the compact scar helps or hinders recovery.

"It has been a big mystery whether increasing astrocyte reactivity would be beneficial," said Dr. Meifan Amy Chen, the study's lead author and Instructor of Neurology at the Peter O'Donnell Jr. Brain Institute. "The discovery of LZK as an on switch now offers a molecular tool to answer this question."

Credit: 
UT Southwestern Medical Center

Malaria Cell Atlas launched: Parasite development mapped in unprecedented detail

New single-cell technology has allowed scientists to study malaria parasites at the highest resolution to date. By investigating the genes in individual malaria parasites, scientists from the Wellcome Sanger Institute are beginning to understand the genetic processes each parasite undergoes as it moves through its complicated lifecycle.

The results, published today (27 March) in eLife are the first step towards developing the Malaria Cell Atlas, a data resource that will provide gene activity profiles of individual malaria parasites throughout their lifecycle. The Malaria Cell Atlas will allow researchers to identify weak points in the parasite's lifecycle for intervention with drugs, and will help transform research into the disease.

Nearly half of the world's population is at risk of malaria and more than 200 million people are infected each year. The disease caused the deaths of almost half a million people globally in 2015*.

Malaria is caused by Plasmodium parasites that are spread to people through the bites of infected mosquitoes. Plasmodium parasites themselves are tiny, single-celled organisms that have a complex lifecycle, including many stages. Their small size and complicated lifecycle make the parasites incredibly difficult to study.

Traditionally used to study mammals, single-cell sequencing allows scientists to investigate individual cells and realise the true diversity of cell states within the same tissue or organ.

In a new study, scientists from the Sanger Institute applied single-cell sequencing to individual malaria parasites and achieved the highest resolution view of malaria parasites to date.

The team were able to uncover hidden patterns in the way different individual parasites use their genes during an infection. These differences are not visible when a large pool of parasites is studied.

The results are the first steps toward developing the Malaria Cell Atlas -- a reference map for understanding how malaria parasites move through the lifecycle and how much individual-to-individual variation there is among the parasite stages that are most vulnerable to drugs and vaccines. The map, which will first be created for the rodent malaria parasite, will provide a visual representation of all the genes switched on and off in individual malaria parasites across their complete lifecycle, including within the mosquito.

The Malaria Cell Atlas will eventually include single cell data from the host tissues the parasite must colonise to complete its lifecycle, including the mosquito gut cells and the mammalian host liver cells. These tissues also play a very important role in the progression and transmission of the disease.

In this study, scientists analysed more than 500 individual parasites of both rodent malaria (Plasmodium berghei) and the most deadly human malaria parasite (Plasmodium falciparum) during the blood stage of the parasite's lifecycle.

By zooming in on individual P. berghei parasites, the team were able to detect the activity of over 4,500 genes in total across all cells: over 90 per cent of the genes in the P. berghei genome. At the level of individual cells, researchers detected the activity of nearly 2,000 genes on average, which is the most ever seen in single malaria parasites.
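Counting "genes detected" in single-cell data reduces to counting nonzero entries in a cells-by-genes matrix. A toy sketch (simulated counts, not the study's data):

```python
import numpy as np

# Toy single-cell expression matrix: rows are individual parasites (cells),
# columns are genes, entries are read counts. Simulated, not study data.
rng = np.random.default_rng(1)
counts = rng.poisson(0.5, size=(500, 5000))   # 500 cells x 5,000 genes

detected = counts > 0
genes_per_cell = detected.sum(axis=1)               # genes active in each cell
genes_overall = int(detected.any(axis=0).sum())     # genes seen in any cell

print(f"mean genes per cell: {genes_per_cell.mean():.0f}, "
      f"total detected: {genes_overall}")
```

The study's figures -- roughly 2,000 genes per cell versus 4,500 across all cells -- reflect exactly this distinction between per-cell and pooled detection.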

Dr Adam Reid, joint first author from the Wellcome Sanger Institute, said: "We now have the most high-resolution image of malaria parasites. By studying the activity of each gene in individual parasites, we have uncovered previously hidden ways in which genes are used for the development of malaria parasites in the blood. This study paves the way for studying malaria parasites during a natural infection in people."

Dr Arthur Talman, joint first author from the Wellcome Sanger Institute, said: "We have gained a better understanding of which genes are important for the parasite to transmit between human and mosquito during the spread of a malaria infection. We also found that the parasite's development in the blood is regulated by groups of genes that are switched on and off in unison. Knowing how the parasite's lifecycle is controlled by particular genes, we have a stronger chance of interfering with it using drugs."

Previously, malaria parasites have been pooled and studied together, so subtle differences between individual parasites may have been missed. A single parasite may have a distinct advantage for infection and now researchers have a tool to investigate these subtle differences.

Dr Mara Lawniczak, lead author from the Wellcome Sanger Institute, said: "Single-cell technology will revolutionise how we study single-celled organisms, such as malaria parasites. We can now begin to truly understand diversity between individual parasites, even within the same human infection. This is the first step towards creating the Malaria Cell Atlas, a data resource that we hope will be valuable to the important global community of malaria researchers working to eliminate this devastating disease."

Credit: 
Wellcome Trust Sanger Institute

Important development could reduce numbers of fish required in toxicology research

image: Immunofluorescence of fish intestine organoid: long term culture of intestinal trout tissue shows co-expression of ZO-1 (red), E-cadherin (green) and cell nuclei (blue).

Image: 
University of Plymouth

Scientists have developed a new technique to examine the effects of chemicals on digestive systems of fish and support research into gut related conditions.

The technique also has potential to reduce the number of animal experiments, in line with the principles of the 3Rs (Reduce, Refine and Replace).

There is growing scientific, public and regulatory concern about dietary uptake of chemicals. But implementing legislation to assess the toxicity of some chemicals requires thousands of fish, and there are currently no reliable alternative methods to assess their accumulation and toxic effects in the gut without using live animals.

Now researchers at the University of Plymouth, working in partnership with pharmaceutical company AstraZeneca, have for the first time successfully cultured and maintained cells from the guts of rainbow trout, a recommended fish species for toxicological studies.

They then demonstrated that, under laboratory conditions, they could maintain the cells' function for extended periods while also growing new cells that could be used in future tests to assess the impact of environmental pollutants, such as chemicals or plastics, that enter the body via the diet.

The results of the study, led by Postdoctoral Research Fellow Dr Laura Langan and Professor of Genetic Toxicology & Ecotoxicology Awadhesh Jha, are published in Biology Open.

Professor Jha said: "This is a significant step for the 3Rs approach and our model can reduce the total number of fish required, potentially replace the in vivo studies and offer the refinement that live fish are not exposed to potentially toxic chemicals. Intestinal permeability, microbiota and immunology play an important role in maintaining the gut environment. But the intestine is also an important route of exposure to potentially harmful chemicals, and they can easily accumulate in tissue where they become toxic to their surroundings. If this environment is harmed by chemicals, such as through damage to gut cells, it could impact the health of the organisms and would lead to a number of fish diseases but this technique will enable us to increase the tests we can carry out and improve our understanding of how to preserve gut health."

Dr Stewart Owen, Principal Environmental Scientist at AstraZeneca, added: "We are working to develop sustainable tools to help us better understand the impact of pollutants; and we must do this using the best science available. Laura's success in developing this technique is creating a fantastic opportunity for many other fields of research to build on."

This builds on previous collaborations between the University and AstraZeneca, through which researchers have isolated and grown trout liver and gill cells.

Taken together, these methods enable the researchers to address key questions about how chemicals in the environment can enter a fish, and to better understand the implications. The study was conducted with funding from the Biotechnology and Biological Sciences Research Council (BBSRC), the Natural Environment Research Council (NERC) and AstraZeneca.

The researchers say that these cell cultures are already enabling tissue-specific investigations into gene expression and response, enhancing basic understanding of these fundamental organs and, in turn, providing information to help better protect the environment.

Credit: 
University of Plymouth

Improving human-data interaction to speed nanomaterials innovation

video: Lehigh University's Nano/Human Interface Presidential Engineering Research Initiative aims to help scientists visualize and interpret the vast amounts of data generated by research.

Image: 
Stephanie Veto, Lehigh University

Data is only as good as humans' ability to analyze and make use of it.

In materials research, the ability to analyze massive amounts of data--often generated at the nanoscale--in order to compare materials' properties is key to discovery and to achieving industrial use. Jeffrey M. Rickman, a professor of materials science and physics at Lehigh University, likens this process to candy manufacturing:

"If you are looking to create a candy that has, say, the ideal level of sweetness, you have to be able to compare different potential ingredients and their impact on sweetness in order to make the ideal final candy," says Rickman.

For several decades, nanomaterials--matter that is so small it is measured in nanometers (one nanometer = one-billionth of a meter) and can be manipulated at the atomic scale--have outperformed conventional materials in strength, conductivity and other key attributes. One obstacle to scaling up production is the fact that scientists lack the tools to fully make use of data--often in the terabytes, or trillions of bytes--to help them characterize the materials--a necessary step toward achieving "the ideal final candy."

What if such data could be easily accessed and manipulated by scientists in order to find real-time answers to research questions?

The promise of materials like DNA-wrapped single-walled carbon nanotubes could be realized. Carbon nanotubes are a tube-shaped material which can measure as small as one-billionth of a meter, or about 10,000 times smaller than a human hair. This material could revolutionize drug delivery and medical sensing with its unique ability to penetrate living cells.

A new paper takes a step toward realizing the promise of such materials. Authored by Rickman, the article describes a new way to map material properties relationships that are highly multidimensional in nature. Rickman employs methods of data analytics in combination with a visualization strategy called parallel coordinates to better represent multidimensional materials data and to extract useful relationships among properties. The article, "Data analytics and parallel-coordinate materials property charts," has been published in npj Computational Materials, a Nature Research journal.

"In the paper," says Rickman, "we illustrate the utility of this approach by providing a quantitative way to compare metallic and ceramic properties--though the approach could be applied to any materials you want to compare."

It is the first paper to come out of Lehigh's Nano/Human Interface Presidential Engineering Research Initiative, a multidisciplinary research initiative that proposes to develop a human-machine interface to improve the ability of scientists to visualize and interpret the vast amounts of data that are generated by scientific research. It was kickstarted by a $3-million institutional investment announced last year.

The leader of the initiative is Martin P. Harmer, professor of materials science and engineering. In addition to Rickman, other senior faculty members include Anand Jagota, department chair of bioengineering; Daniel P. Lopresti, department chair of computer science and engineering and director of Lehigh's Data X Initiative; and Catherine M. Arrington, associate professor of psychology.

"Several research universities are making major investments in big data," says Rickman. "Our initiative brings in a relatively new aspect: the human element."

According to Arrington, the Nano/Human Interface initiative emphasizes the human because the successful development of new tools for data visualization and manipulation must necessarily include a consideration of the cognitive strengths and limitations of the scientist.

"The behavioral and cognitive science aspects of the Nano/Human Interface initiative are twofold," says Arrington. "First, a human-factors research model allows for analysis of the current work environment and clear recommendations to the team for the development of new tools for scientific inquiry. Second, a cognitive psychology approach is needed to conduct basic science research on the mental representations and operations that may be uniquely challenged in the investigation of nanomaterials."

Rickman's proposed method uses parallel coordinates, which is a method of visualizing data that makes it possible to spot outliers or patterns based on related metric factors. Parallel coordinates charts can help tease out those patterns.
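A minimal parallel-coordinates sketch, using pandas' built-in plotting helper on a hypothetical, normalized materials table (the property names and values here are illustrative, not from the paper):

```python
import matplotlib
matplotlib.use("Agg")           # headless backend so this runs in scripts
import matplotlib.pyplot as plt
import pandas as pd
from pandas.plotting import parallel_coordinates

# Hypothetical normalized property chart for a few materials.
df = pd.DataFrame({
    "class":           ["metal", "metal", "ceramic", "ceramic"],
    "density":         [0.90, 0.80, 0.40, 0.35],
    "melting_point":   [0.50, 0.60, 0.90, 0.95],
    "compressibility": [0.70, 0.65, 0.20, 0.25],
})

# One polyline per material; each vertical axis is one property.
# Materials of the same class trace similar paths, making clusters visible.
ax = parallel_coordinates(df, class_column="class", colormap="viridis")
ax.set_ylabel("normalized property value")
plt.savefig("property_chart.png")
```

With many properties, each vertical axis is one dimension, so the chart scales to the high-dimensional comparisons the paper describes in a way a 2D scatter plot cannot.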

The challenge, says Rickman, lies in interpreting what you see.

"If plotting points in two dimensions using X and Y axes, you might see clusters of points and that would tell you something or provide a clue that the materials might share some attributes," he explains. "But, what if the clusters are in 100 dimensions?"

According to Rickman, there are tools that can help cut down on numbers of dimensions and eliminate non-relevant dimensions to help one better identify these patterns. In this work, he applies such tools to materials with success.

"The different dimensions or axes describe different aspects of the materials, such as compressibility and melting point," he says.

The charts described in the paper simplify the description of high-dimensional geometry, enable dimensional reduction and the identification of significant property correlations and underline distinctions among different materials classes.

From the paper: "In this work, we illustrated the utility of combining the methods of data analytics with a parallel coordinates representation to construct and interpret multidimensional materials property charts. This construction, along with associated materials analytics, permits the identification of important property correlations, quantifies the role of property clustering, highlights the efficacy of dimensional reduction strategies, provides a framework for the visualization of materials class envelopes and facilitates materials selection by displaying multidimensional property constraints. Given these capabilities, this approach constitutes a powerful tool for exploring complex property interrelationships that can guide materials selection."

Returning to the candy manufacturing metaphor, Rickman says: "We are looking for the best methods of putting the candies together to make what we want and this method may be one way of doing that."

New frontier, new approaches

Creating a roadmap to finding the best methods is the aim of a 2½-day, international workshop called "Workshop on the Convergence of Materials Research and Multi-Sensory Data Science" that is being hosted by Lehigh University in partnership with The Ohio State University.

The workshop--which will take place at Bear Creek Mountain Resort in Macungie, PA from June 11-13, 2018--will bring together scientists from allied disciplines in the basic and social sciences and engineering to address many issues involved in multi-sensory data science as applied to problems in materials research.

"We hope that one outcome of the workshop will be the forging of ongoing partnerships to help develop a roadmap to establishing a common language and framework for continued dialogue to move this effort of promoting multi-sensory data science forward," says Rickman, who is Principal Investigator on a National Science Foundation (NSF) grant, awarded by the Division of Materials Research in support of the workshop.

Co-Principal Investigator, Nancy Carlisle, assistant professor in Lehigh's Department of Psychology, says the conference will bring together complementary areas of expertise to allow for new perspectives and ways forward.

"When humans are processing data, it's important to recognize limitations in the humans as well as the data," says Carlisle. "Gathering information from cognitive science can help refine the ways that we present data to humans and help them form better representations of the information contained in the data. Cognitive scientists are trained to understand the limits of human mental processing -- it's what we do! Taking into account these limitations when devising new ways to present data is critical to success."

Adds Rickman: "We are at a new frontier in materials research, which calls for new approaches and partners to chart the way forward."

Credit: 
Lehigh University

LC-MS/MS Identification and characterization of biodegradation products of Nitroproston

image: This is the chemical structure of Nitroproston.

Image: 
Dr. Natalia Vladimirovna Mesonzhnik et al., Bentham Science Publishers

Nitroproston (11(S),15(S)-dihydroxy-9-keto-5Z,13E-prostadienoic acid 1′,3′-dinitroglycerol ester) is a novel prostaglandin-based compound with potential application in obstructive respiratory diseases such as asthma and obstructive bronchitis. Its pharmacological activity arises from combined multi-target action on prostanoid EP4 receptors and soluble guanylyl cyclase. Nitroproston bears a prostaglandin E2 (PGE2) moiety modified with an additional NO-donating fragment of glycerol-1,3-dinitrate (1,3-GDN) via an ester bond, and can be considered a nitrated derivative of the glycerol ester of PGE2, the natural COX-2 metabolite of the endogenous cannabinoid-like molecule 2-arachidonoyl glycerol. The NO-donating fragment profoundly changes the pharmacological properties of PGE2: Nitroproston is more than 20-fold as active as prostaglandin E2 in relaxing respiratory muscles. Owing to this enhanced myorelaxant activity, Nitroproston is well tolerated by asthmatic subjects and is a first-in-class pharmaceutical candidate for the treatment of asthma attacks that engages both prostanoid and NO receptors.

Although Nitroproston has been extensively studied in various pharmacological models, its biological stability was still unknown. The main aim of the present study was therefore to evaluate Nitroproston's stability in vitro and to identify and characterize its biodegradation products. The principal in vitro biodegradation products of Nitroproston were identified using liquid chromatography/ion trap-time-of-flight mass spectrometry (LC-HRMS/MS), and the postulated metabolite structures were confirmed using authentic reference standards. Rat, rabbit and human plasma, as well as human whole blood, were used for a comparative in vitro degradation study. Nitroproston and its biodegradation products in biological samples were measured by targeted liquid chromatography/triple-stage quadrupole mass spectrometry (LC-MS/MS).

LC-HRMS/MS analysis of spiked rat plasma samples clearly indicated the presence of two main metabolites of Nitroproston, 1,3-GDN and PGE2; the latter can undergo dehydration to cyclopentenone prostaglandins. The LC-HRMS/MS screening method did not reveal biodegradation products of Nitroproston retaining a single nitro group, nor the PGE2 glycerol ester. We assume that nitrate esters are more resistant to enzymatic hydrolysis in rat plasma than carboxyl ester moieties.

Targeted LC-MS/MS quantitative analysis was used to measure Nitroproston and its major biodegradation products in rodent plasma. Degradation was fastest in rat plasma, where only 5% of parent Nitroproston remained at the first moment of incubation. A similar pattern was observed in rabbit plasma, where the half-life (T1/2) of Nitroproston was about 2.0 minutes. Additionally, whole human blood and plasma samples were used for a stability and blood-cell distribution study. Nitroproston biodegradation was slowest in human plasma (T1/2 = 2.1 h) among the tested species, but occurred more rapidly in whole blood (T1/2 = 14.8 min). Nitroproston was distributed between human RBCs and plasma with a partition ratio of 0.82. These data suggest that metabolism of the drug candidate in human whole blood is mainly associated with enzymes located in the RBC fraction. The observed interspecies variability highlights the need to select a suitable animal model for follow-up PK/PD studies of Nitroproston. Our findings do not exclude that Nitroproston may be relatively stable in humans after inhalation and may exert its therapeutic actions either as the intact drug molecule or, through its active metabolites, as a prodrug of prostaglandin E2 and nitric oxide.
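Half-lives like those reported here are typically estimated by fitting first-order decay to concentration-time data. A sketch with synthetic data (values chosen to mimic a roughly 2-minute half-life, not the study's measurements):

```python
import numpy as np

# First-order decay: C(t) = C0 * exp(-k t), so ln C is linear in t and
# T1/2 = ln(2) / k. Synthetic, illustrative concentration data.
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])            # minutes
conc = 100.0 * np.exp(-np.log(2) / 2.0 * t)        # percent remaining

slope, _ = np.polyfit(t, np.log(conc), 1)          # fit ln C = ln C0 - k t
k = -slope                                         # first-order rate constant
half_life_min = np.log(2) / k
print(f"T1/2 = {half_life_min:.2f} min")
```

The same log-linear fit applied to measured concentrations in each matrix (rat, rabbit or human plasma, whole blood) yields the species-specific half-lives compared in the study.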

Nitroproston is being developed as a drug candidate for the relief of bronchial asthma. The key principle of its action is not only the effect on two targets with similar pharmacological activity toward bronchial smooth muscle, but also the synchronization of that activity: the prostaglandin E2 moiety drives the NO-donating part to release NO at the sites of activity. This achieves a powerful synergy of pharmacological effect and sharply reduces the required dose of PGE2, opening the way for new and more successful attempts to use NO-donating PGE2 derivatives for the relief of bronchial asthma.

For more information about the research, please visit: http://www.eurekaselect.com/160373/article

Credit: 
Bentham Science Publishers