Spit and polish: The beauty of saliva for epigenetic studies

Biomarkers in saliva can reveal changes that affect the body, according to research published in Current Genomics. The paper details the benefits and drawbacks of using saliva as a tissue sample for studying epigenetic changes shaped by experience. It begins with a detailed review introducing epigenetics and summarizing previous studies, then presents strong examples of how analysis of saliva can be applied to vulnerable populations, such as children, to identify changes correlated with experience and behavior.

Examples of epigenetic changes in the saliva of children who experienced pediatric trauma and of children with autism spectrum disorder (ASD) are presented. Pediatric trauma is a leading cause of death for children aged 1-14 in the United States (CDC, 2016), accounting for nearly 40% of all childhood fatalities. Adverse childhood experiences (ACEs), such as traumatic injury, lead to long-term health and behavioral outcomes, including post-traumatic stress disorder (PTSD). ACEs also include abuse, neglect, and community violence in addition to physical harm, all of which result in long-term effects. Early detection of trauma is crucial to preventing abuse-related deaths and to addressing the health and behavioral consequences of ACEs early. Early detection is also critical for children with ASD. In 2012, 1 in 68 American children (CDC, 2012) had ASD. Scientists have not found a single unifying genetic explanation for ASD, and as such, behavioral diagnosis at or after 18 months is the standard. Most children are not diagnosed until 3 years and 10 months (CDC, 2012), over two years later. The earlier children are diagnosed, the more effective interventions can be.

In this paper, scientists investigated the impact of ACEs on children aged four to eight over the course of six to eight months by collecting saliva at two time points from children who had experienced traumatic events and from those who had not. Similarly, saliva was collected at a single time point from children diagnosed with ASD and from typically developing children. Across the two studies, 45 children were sampled. The saliva was assessed for a form of epigenetic modification called DNA methylation -- a reversible biochemical process through which methyl groups are added to the DNA, changing gene expression by turning genes "on" or "off." Using a bead-chip array technology, over 425,000 methylation sites were assayed, providing insight into each child's methylation pattern.

Initial analysis showed that the children's methylation patterns did not cluster by diagnosis, experience, age, sex, or ethnicity. The scientists, Dr. Elaine L. Bearer and Brianna S. Mulligan of the University of New Mexico, discovered that the children grouped most closely by the cellular composition of their saliva, that is, whether it contained primarily cheek cells or white blood cells. After correcting for cell-type composition using methylation data from purified cheek and blood cells, they found sites that differed significantly according to experience. These sites are involved in brain development rather than in the differences between cheek and blood cell differentiation. The scientists are now exploring markers within the corrected datasets for both cohorts of children. Preliminary results described in the paper, once verified, could be used to flag young survivors of abuse or to diagnose ASD earlier.
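
For readers curious about what such a cell-composition correction can look like in practice, the following is a minimal sketch only, not the authors' pipeline: it assumes a methylation beta-value matrix (CpG sites by samples) and, for each sample, an estimated fraction of cheek (buccal) cells derived from reference profiles of purified cell types, and it simply regresses that composition out of every site before group comparisons. All variable names and numbers are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
n_sites, n_samples = 1000, 45                         # toy dimensions; the study assayed >425,000 sites
betas = rng.uniform(0.0, 1.0, (n_sites, n_samples))   # hypothetical beta values (fraction methylated)
buccal_fraction = rng.uniform(0.0, 1.0, n_samples)    # hypothetical per-sample cheek-cell fraction

# Regress each site's methylation on cell composition and keep the residuals,
# so later comparisons by experience are not driven by cheek-vs-blood differences.
X = np.column_stack([np.ones(n_samples), buccal_fraction])   # intercept + cell fraction
coef, _, _, _ = np.linalg.lstsq(X, betas.T, rcond=None)      # fit all sites at once
adjusted = betas - (X @ coef).T                              # composition-adjusted methylation values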

Credit: 
Bentham Science Publishers

Vitamin D deficiency linked to greater risk of diabetes

image: This is Cedric Garland, DrPH, UC San Diego School of Medicine.

Image: 
UC San Diego Health

An epidemiological study conducted by researchers at University of California San Diego School of Medicine and Seoul National University suggests that persons deficient in vitamin D may be at much greater risk of developing diabetes.

The findings are reported in the April 19, 2018 online issue of PLOS One.

The scientists studied a cohort of 903 healthy adults (mean age: 74) with no indications of either pre-diabetes or diabetes during clinic visits from 1997 to 1999, and then followed the participants through 2009. Vitamin D levels in blood were measured during these visits, along with fasting plasma glucose and oral glucose tolerance.

Over the course of time, there were 47 new cases of diabetes and 337 new cases of pre-diabetes, in which blood sugar levels are higher than normal but not yet high enough to be categorized as type 2 diabetes.

For the study, the researchers identified the minimum healthy level of 25-hydroxyvitamin D in blood plasma to be 30 nanograms per milliliter. This is 10 ng/ml above the level recommended in 2010 by the Institute of Medicine, now part of The National Academies, a health advisory group to the federal government. Many groups, however, have argued for higher blood serum levels of vitamin D, as much as 50 ng/ml. The matter remains hotly debated.

"We found that participants with blood levels of 25-hydroxyvitamin D that were above 30 ng/ml had one-third of the risk of diabetes and those with levels above 50 ng/ml had one-fifth of the risk of developing diabetes," said first author Sue K. Park, MD, in the Department of Preventive Medicine at Seoul National University College of Medicine in South Korea.

Study co-author Cedric F. Garland, DrPH, adjunct professor in the UC San Diego School of Medicine Department of Family Medicine and Public Health, said persons with 25-hydroxyvitamin D levels below 30 ng/ml were considered vitamin D deficient. These persons, the researchers found, were at up to five times greater risk of developing diabetes than people with levels above 50 ng/ml.
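
As a purely illustrative restatement of the thresholds and relative risks quoted above (not code from the study), the cut-offs can be written as a simple classification; the function name and labels below are our own:

def vitamin_d_category(level_ng_ml):
    """Classify a 25-hydroxyvitamin D blood level (ng/ml) using the cut-offs reported here."""
    if level_ng_ml < 30:
        return "deficient: up to five times the diabetes risk of those above 50 ng/ml"
    elif level_ng_ml < 50:
        return "above 30 ng/ml: about one-third the risk of diabetes"
    else:
        return "above 50 ng/ml: about one-fifth the risk of diabetes"

for level in (22, 35, 55):            # hypothetical blood levels
    print(level, "ng/ml ->", vitamin_d_category(level))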

Garland, who has previously investigated connections between vitamin D levels and various types of cancer, said the study builds upon previous epidemiological research linking vitamin D deficiency to a higher risk of diabetes. Epidemiological studies analyze the distribution and determinants of health and disease conditions. They do not necessarily prove cause-and-effect.

"Further research is needed on whether high 25-hydroxyvitamin D levels might prevent type 2 diabetes or the transition from pre-diabetes to diabetes," said Garland. "But this paper and past research indicate there is a strong association."

Garland and others have long advocated the health benefits of vitamin D. In 1980, he and his late brother Frank C. Garland, also an epidemiologist, published an influential paper that posited vitamin D (produced by the body through exposure to sunshine) and calcium (which vitamin D helps the body absorb) together reduced the risk of colon cancer. The Garlands and colleagues subsequently found associations with breast, lung and bladder cancers.

Garland said that reaching 25-hydroxyvitamin D levels of 30 ng/ml would require dietary supplements of 3,000 to 5,000 international units (IU) per day, or less with the addition of moderate daily sun exposure with minimal clothing (approximately 10-15 minutes per day outdoors at noon).

The current recommended average daily amount of vitamin D is 400 IU for children up to 1 year; 600 IU for ages 1 to 70 years, including women who are pregnant or breastfeeding; and 800 IU for persons over 70, according to the National Institutes of Health. Higher daily amounts of vitamin D are generally considered safe, but blood serum levels exceeding 125 ng/ml have been linked to adverse side effects, such as nausea, constipation, weight loss, heart rhythm problems and kidney damage.

Credit: 
University of California - San Diego

Elderly less likely to benefit from simultaneous radio- & chemotherapy for lung cancer

Barcelona, Spain: An analysis of elderly patients treated in a phase II trial of radiotherapy combined with chemotherapy in non-small cell lung cancer (NSCLC) has shown that they were less likely to benefit than younger patients if the two treatments were given at the same time.

Previous research has shown that for NSCLC patients with locally advanced disease (disease that has spread to the lymph nodes), radiotherapy given simultaneously with chemotherapy (concurrent chemo-radiotherapy) gives the best chance of survival, compared to giving chemotherapy followed by radiotherapy (sequential chemo-radiotherapy). However, it is a more intensive treatment and can lead to more frequent and more severe side effects. Until now, it was unknown whether concurrent chemo-radiotherapy also improved survival in patients aged 75 or older, and how well they would tolerate the treatment.

The trial, which is to be presented at the ESTRO 37 conference tomorrow (Saturday), investigated intensity-modulated radiation therapy (IMRT) - a sophisticated form of radiotherapy that precisely targets the cancer and adapts the beam's shape to that of the tumour, avoiding or reducing exposure of nearby healthy tissues - combined with chemotherapy. The radiation dose was personalised to each patient, so that the maximum possible total dose was delivered to the tumour, while sparing nearby organs that could be harmed by radiation. This is known as the isotoxic principle, and the researchers wanted to see if it was possible to give higher tumour doses without increasing the risk of side effects, and if it improved survival.

A total of 300 patients took part in the trial at the MAASTRO Clinic (Maastricht, The Netherlands) between May 2009 and April 2012, of whom 76 (25.3%) were aged 75 or older. The patients received IMRT alone, concurrent chemo-radiotherapy, or sequential chemo-radiotherapy. The researchers reviewed the patients' overall survival in October 2017.

Dr Judith van Loon, a radiation oncologist at the MAASTRO Clinic, said: "We found that elderly patients who were treated with concurrent chemo-radiotherapy had a worse survival than younger patients. They also did worse than the elderly patients treated with sequential chemo-radiotherapy or radiotherapy alone. Furthermore, it was not possible to increase the dose to the tumour without increasing the chance of side effects."

Among patients aged 75 or older, 32% received concurrent chemo-radiotherapy, 29% sequential chemo-radiotherapy and 39% radiotherapy alone. The total dose of radiation that could be delivered in the concurrent chemo-radiotherapy group was an average of 66.2 Gy, and in the sequential chemo-radiotherapy group it was an average of 66.7 Gy.

Overall survival (time to death from any cause) was significantly worse in the elderly patients receiving concurrent chemo-radiotherapy, even though the majority of them (96%) were assessed at the beginning of the study as having a World Health Organisation (WHO) performance score of 1 or less, meaning they were fully active or only slightly restricted in strenuous activity.

Average overall survival was 15.5 months in the elderly versus 19.8 months in younger patients, and after five years 13.2% of elderly patients were alive compared with 24.1% of younger patients. While overall survival was worse for the elderly patients treated with concurrent chemo-radiotherapy, survival among the elderly treated with sequential chemo-radiotherapy or radiotherapy alone was similar to that of younger patients. Overall, there was no difference in adverse effects between elderly and younger patients.

"These results indicate that the standard treatment for lung cancer patients may not result in the best outcomes for elderly patients," said Dr van Loon. "Most importantly, they show that selecting elderly patients for concurrent chemo-radiotherapy on the basis of their performance score is not sufficient. Physicians need to take care when deciding whether or not to administer concurrent chemo-radiotherapy outside the conditions of a study.

"These findings underscore the need for prospective studies that incorporate geriatric assessment in this understudied group of elderly cancer patients, as this enables us to identify predictive factors for treatment outcome. Also, we should look not only at the chance of cure but also quality of life and patient-reported outcome measures. This can help physicians to select the best treatment for individual patients.

"A multicentre trial is currently investigating the value of a geriatric assessment in elderly patients with locally advanced lung cancer. Within this trial, patients that are assessed as fit enough to be treated with chemo-radiotherapy are randomly assigned to concurrent or sequential chemo-radiotherapy. Results from this trial are expected in 2022." [1]

President of ESTRO, Professor Yolande Lievens, head of the department of radiation oncology at Ghent University Hospital, Belgium, said: "This analysis demonstrates that one size does not fit all when it comes to treating cancer patients of different ages. Elderly people, even when they are otherwise fit and healthy, respond differently to treatments than their younger counterparts. More research needs to be performed to define the most effective treatment strategies for these patients: strategies that do not reduce their quality of life without also improving survival. We look forward to the results from the multi-centre trial that is investigating a more personalised and tailored approach and assesses the value of a more intensive treatment strategy in the elderly patient population with non-small cell lung cancer."

Credit: 
European Society for Radiotherapy and Oncology (ESTRO)

How can medical marijuana benefit older adults?

Managing symptoms such as pain, nausea, and psychiatric illness can be challenging as people age. A new Journal of the American Geriatrics Society review highlights what's currently known about the indications and risks of medical marijuana use for older adults.

The review notes that medical marijuana appears useful for the treatment of pain (particularly neuropathic pain) and chemotherapy-induced nausea and vomiting. It has neuropsychiatric side effects, but even when smoked it does not appear to increase the risk of lung cancer.

Importantly, however, medical marijuana's positive and negative effects have not been thoroughly studied specifically in older adults.

"There is a dearth of evidence supporting the use of cannabinoids for medical indications in older adults. Common sense practices are applicable here though, including performing a thorough assessment for side effects and expecting that lower doses will have a greater impact," said lead author Dr. Joshua Briscoe, of the Duke University Medical Center. "As younger generations age, it is also important to expect that they have experience using marijuana in recreational contexts, which will affect their approach to its use in a medical setting."

Credit: 
Wiley

Mayo Clinic study finds no evidence that anesthesia in young children lowers intelligence

ROCHESTER, Minn. - A Mayo Clinic study finds no evidence that children given anesthesia before their third birthdays have lower IQs than those who did not have it. A more complex picture emerges among people who had anesthesia several times as small children: Although their intelligence is comparable, they score modestly lower on tests measuring fine motor skills, and their parents are more likely to report behavioral and learning problems. The findings are published in Anesthesiology.

The U.S. Food and Drug Administration warned in 2016 that prolonged or repeated sedation before age 3 may affect brain development. The warning was based largely on data from animals, which may or may not apply to children.

Mayo researchers studied 997 people born from 1994 through 2007 in Olmsted County, Minnesota, the home of Mayo Clinic's Rochester campus. They were grouped according to the anesthesia exposures they had before their third birthdays: 206 had two or more; 380 had one; and 411 had none. Ear, nose and throat procedures were the most common surgeries.

The researchers used the Rochester Epidemiology Project medical records database, brain function testing at ages 8-12 or 15-20, and parent reports to assess behavior and brain function. Beyond their anesthesia exposure, the three groups of patients were matched to be as similar as possible.

Intelligence, memory, and several other measures of brain function were similar among the groups.

However, those with multiple exposures to anesthesia had modest declines in fine motor skills, such as the ability to draw figures with a pencil, and how quickly they processed information when reading. Their parents reported more learning and behavioral problems, such as difficulty reading; behaviors consistent with attention deficit hyperactivity disorder; breaking rules; or displaying aggression, anxiety or social withdrawal.

Parents whose children had anesthesia once under age 3 reported more problems with mental skills known as executive functions - skills that help with memory, impulse control, planning and flexibility - but not with other behaviors.

"For the majority of kids undergoing surgery, the results overall are reassuring," says lead author David Warner, M.D., a pediatric anesthesiologist at Mayo Clinic Children's Center. "About 80 percent of kids who need surgery under age 3 only need one, and it's relatively brief."

Several other studies also show little evidence that a single anesthetic is associated with significant harm.

"Although we do have some concerns about the children who are receiving multiple anesthetics, it's important to note that our results don't allow us to conclude that anesthesia itself is causing problems," Dr. Warner says, adding that other factors, such as the conditions that make surgery necessary, could contribute. "However, the fact that we found some problems in some of these children means that research in this area needs to continue, including further analysis of our data."

In the meantime, in most cases the benefit of surgery outweighs any risk, Dr. Warner says. However, the potential for problems may need to be part of the decision-making process when parents and surgeons discuss surgery, he adds.

Credit: 
Mayo Clinic

Peptide induces chirality evolution in a single gold nanoparticle

video: This video shows how the chiral gold nanoparticles are created with amino acids and peptides, and how they are applied in displays.

Image: 
Ki Tae Nam Research Group, Seoul National University

For the first time, scientists have successfully created optically active, chiral gold nanoparticles using amino acids and peptides. Many chemicals significant to life have mirror-image twins (left-handed and right-handed structures), a characteristic that is conventionally called chirality. This study describes how chirality, which is typically observed in organic molecules, can be extended to three-dimensional metallic nanostructures. This newly discovered synthesis method was described in Nature (April 19th) and was featured on the cover.

The Korean research teams at Seoul National University (SNU), Pohang University of Science and Technology (POSTECH) and LG Display (LGD) demonstrated the direct transfer of peptide handedness to nanoparticles' morphology during their growth. Mirror-image peptide twins induced the opposite twist of the chiral nanoparticles, which are further tunable with sequence variation. The chiral gold nanoparticles with different handedness interacted differently with circularly polarized visible light, displaying extensive color modulation. As a result, color change is possible by controlling the light polarization, which has potential applications in future displays.

In the newly synthesized gold nanoparticles, chiral elements are arranged on cube-like structures with a side length of only about 100 nm. The particles can be easily dispersed in solution and deposited on substrates while maintaining high chiro-optical activity.

"Based on our understanding of the interface between peptides and inorganic materials, we have built a new platform technology to control the crystallographic asymmetry," explains Professor Ki Tae Nam at SNU, who led this collaborative project. He added, "This finding can make a direct and immediate impact on optical devices and could be further applied for the development of enantioselective bioinspired catalysts in the near future."

"The potential applications include active color displays, holography, chirality sensors and all-angle negative refractive index materials," explained Professor Junsuk Rho at POSTECH, the co-corresponding author.

Credit: 
Seoul National University

Green digitization: Botanical collections data answer real-world questions

IMAGE: Example of Symbiota's Image Scoring Tool, which filters images and applies a phenological score -- making available data on flowering and fruiting times to help understand global environmental change.

Image: 
Yost, J. M., P. W. Sweeney, E. Gilbert, G. Nelson, R. Guralnick, A. S. Gallinat, E. R. Ellwood, et al. 2018. Digitization protocol for scoring reproductive phenology from herbarium specimens...

Even as botany has moved firmly into the era of "big data," some of the most valuable botanical information remains inaccessible for computational analysis, locked in physical form in the orderly stacks of herbaria and museums. Herbarium specimens are plant samples collected from the field that are dried and stored with labels describing species, date and location of collection, along with various other information including habitat descriptions. The detailed historical record these specimens keep of species occurrence, morphology, and even DNA provides an unparalleled data source to address a variety of morphological, ecological, phenological, and taxonomic questions. Now efforts are underway to digitize these data, and make them easily accessible for analysis.

Two symposia were convened to discuss the possibilities and promise of digitizing these data--at the Botanical Society of America's 2017 annual meeting in Fort Worth, Texas, and again at the XIX International Botanical Congress in Shenzhen, China. The proceedings of those symposia have been published as a special issue of Applications in Plant Sciences; the articles discuss a range of methods and remaining challenges for extracting data from botanical collections, as well as applications for collections data once digitized. Many of the authors contributing to the issue are involved in iDigBio (Integrated Digitized Biocollections), a new "national coordinating center for the facilitation and mobilization of biodiversity specimen data," as described by Dr. Gil Nelson, a botanist at Florida State University and coeditor of this issue.

iDigBio is funded by the U.S. National Science Foundation's Advancing Digitization of Biodiversity Collections initiative, and has already digitized about 50 million herbarium specimens. According to Dr. Nelson, "A primary significance has been community building among biodiversity scientists, curators, and collections managers, and developing and disseminating recommended practices and technical skills for getting these jobs done." The challenges of digitizing these data are formidable, said Dr. Nelson, and include "developing computer vision techniques for making species determinations and scoring phenological traits, and developing effective natural language processing algorithms for parsing label data."

But as the papers in this issue show, steady progress is being made in developing methods to address these challenges. Nelson et al. (2018) and Contreras (2018) address more nuts-and-bolts issues of data management, the former discussing the need for globally unique IDs for herbarium specimens, and the latter providing a workflow for digitizing new fossil leaf collections. Botella et al. (2018) review and discuss the prospects for "computer vision" aided by deep-learning neural networks that, while in their infancy, could eventually identify species from variable images. Yost et al. (2018) offer a protocol for digitizing data on phenology (the timing of events such as flowering or fruiting) from herbarium specimens.

These digitization methods can help unlock valuable herbarium data to address a range of questions. James et al. (2018) discuss how digitized herbarium specimens can be used to show how plant species have responded to global change, for example by using location and time data to model shifts in range. Cantrill (2018) discusses how the Australasian Virtual Herbarium database has been used for ecological and other research. Thiers and Halling (2018) extend the applications to the fungal world, showing how herbarium data can be used as a baseline to determine the distribution of macrofungi in North America. Furthermore, digitization efforts can have real payoff in public perception; Dr. Nelson sees an "increasing presence of biodiversity data and museums in the popular press, which has raised the profiles of herbaria and other collections for the general public." Along these lines, von Konrat et al. (2018) show how digital herbarium data can be used to engage citizen scientists.

Through centuries of painstaking collection and cataloguing, botanists have created a unique and irreplaceable bank of data in the tens of millions of herbarium specimens worldwide. But converting a dried, pressed plant specimen with a handwritten label from 1835 into a format that you can fit on a USB stick is no small trick. Using creative thinking, sophisticated methodology, and hard work, these scientists are bringing the valuable information locked in herbarium specimens into the digital age.

Credit: 
Botanical Society of America

Smooth dance moves confirm new bird-of-paradise species

Ithaca, NY -- Newly publicized audiovisuals support full species status for one of the dancing birds-of-paradise in New Guinea. This new species, called the Vogelkop Superb Bird-of-Paradise, is found only in the island's far-western Bird's Head, or Vogelkop, region. In a new paper published in the journal PeerJ, scientists "show and tell" half-a-dozen ways this form is distinct from the more widespread Superb Bird-of-Paradise, now called the Greater Superb Bird-of-Paradise--the bird known for its bouncy "smiley face" dance routine.

"After you see what the Vogelkop form looks like and acts like in the wild, there's little room for doubt that it is a separate species," says Ed Scholes with the Cornell Lab of Ornithology's Birds-of-Paradise Project. "The courtship dance is different. The vocalizations are different. The females look different. Even the shape of the displaying male is different."

When expanded for courtship display, the Vogelkop male's raised cape creates a completely different appearance--crescent-shaped with pointed tips rather than the oval shape of the widespread form of the species. The way the Vogelkop male dances for the female is also distinctive, the steps being smooth instead of bouncy.

Credit: 
Cornell University

Your immune system holds the line against repeat invaders, thanks to this molecule

image: (The researchers from left to right) Adam Getzler, Huitian Diao, Dapeng Wang and Matthew Pipkin led the study on the Florida campus of The Scripps Research Institute.

Image: 
Scott Wiseman/The Scripps Research Institute

JUPITER, FL--April 17, 2018--Memory T cells are a critical element of our immune system's historical archive. To prevent repeat infections, these cells retain a record of germs they've fought before.

But for all their importance, the origins of memory T cells have remained a mystery. Now, a new study from the laboratory of immunologist Matthew Pipkin, PhD, of The Scripps Research Institute's Florida campus, lays out the opening chapter of that story.

Researchers found a transcription factor protein called Runx3 puts dividing T cells on track to becoming memory T cells. This new insight may allow researchers to design drugs that improve immune responses to vaccines, Pipkin says. The discovery could also have implications for chronic diseases such as cancer, in which responding T cells sometimes become "exhausted" and unable to rally an effective defense.

"There are instances such as chronic infection and tumors where the T cells differentiate in an aberrant way that shortens their life span and decreases their function," Pipkin says. "Because our study found that Runx3 is one of the earliest players during an immune response, manipulating it might be an avenue to basically turn back the clock and reprogram dysfunctional T cells into a format that is conducive to them developing into genuine memory cells that are protective.

The study, "The transcription factor Runx3 establishes chromatin accessibility of cis-regulatory landscapes that drive memory cytotoxic T lymphocyte formation," appears online in the journal Immunity on April 17, 2018.

Runx3 coordinates a rapid memory cell response

Runx3's control of T cell differentiation is important because when our bodies fight off viruses and cancers--and our T cells burst into action--the vast majority tend to become effector cells. These effector cells are short-lived and do not persist once the infection resolves.

The amount of Runx3 has a deterministic effect on the outcome of the differentiation of the T cells, Pipkin says. Runx3 controls that burst of activity and ensures the cells are directed toward a different fate, that of memory T cells, which can live for decades.

"This finding provides molecular evidence that the programming of memory is established very rapidly, and that it's kind of a push and pull to restrain the developing memory cells from differentiating into effector cells, which is a dead-end road," says Pipkin.

The team studied what happened when Runx3 expression was partially suppressed using RNA interference. "All those experiments showed you lost the known precursors that give rise to memory T cells," Pipkin says. Conversely, cells with experimentally increased Runx3 produced more memory T cells.

Cells with increased Runx3 were also better at regenerating new rounds of memory cells than normal cells after repeated infections with lymphocytic choriomeningitis virus (LCMV) and Listeria monocytogenes. This indicated that Runx3 enhances memory T cell potential, Pipkin says.

"Our work demonstrates that Runx3 turns down another transcription factor that commits the cells to becoming these terminal effector cells, and it slows down proliferation. That keeps the cells on a trajectory into the memory lineage."

The new study also sheds light on the timeline over which immune memory is established against an invader. Researchers found molecular evidence that the programming of memory T cells happens very rapidly after the immune system first encounters a new threat. At this time, naïve CD8+ T cells must begin developing into specialists called cytotoxic T lymphocytes (CTLs), which can kill infected or malignant cells. Pipkin's lab found that Runx3 coordinates the memory T cell differentiation program within the first few hours of infection.

Pipkin and his colleagues discovered the critical role of Runx3 in T cell differentiation by challenging naïve T cells with an antibody signal that mimicked infection, and then mapping the regions of the genome that became newly accessible. This revealed that the locations on our chromosomes where Runx3 binds became receptive to binding immediately after infection, and before the first CD8+ T cell division. These regions were also receptive in fully developed memory T cells, but less so in the terminal effector cells.
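
The genomic comparison described here amounts to asking which Runx3 binding locations fall inside newly accessible regions of chromatin. The toy example below only illustrates that kind of interval overlap; the coordinates and names are invented and this is not the study's actual analysis code.

accessible_regions = [("chr1", 1000, 1500), ("chr1", 4000, 4800)]   # hypothetical open-chromatin regions
runx3_sites = [("chr1", 1200), ("chr1", 3000), ("chr1", 4500)]      # hypothetical Runx3 binding positions

def sites_in_open_chromatin(sites, regions):
    """Return the binding sites that lie within an accessible region."""
    hits = []
    for chrom, pos in sites:
        if any(chrom == c and start <= pos <= end for c, start, end in regions):
            hits.append((chrom, pos))
    return hits

print(sites_in_open_chromatin(runx3_sites, accessible_regions))     # two of the three sites overlap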

The findings raise many questions, Pipkin adds. He wants to determine if some type of therapeutic could rescue naturally occurring Runx3 deficiencies. He also wants to identify the other players that cooperate with Runx3.

"We know that Runx3 is working with a number of additional transcription factors and chromatin regulatory proteins," Pipkin says. "So, we are currently trying to identify them and determine how they collaborate with Runx3 to turn on and off different regions of the genome to promote memory development."

Credit: 
Scripps Research Institute

Mother's depression may lower her child's IQ

image: Patricia East, Ph.D., is a research scientist with the Department of Pediatrics at UC San Diego School of Medicine.

Image: 
UC San Diego Health

Roughly one in 10 women in the United States will experience depression, according to the Centers for Disease Control and Prevention. The consequences, however, may extend to their children, report researchers at University of California San Diego School of Medicine, who found that a mother's depression can negatively affect a child's cognitive development up to the age of 16.

The findings are published in the April issue of Child Development.

Researchers surveyed approximately 900 healthy children and their mothers living in Santiago, Chile at five-year intervals from the child's infancy through age 16. They observed how affectionate and responsive mothers were to their children at each age period, as well as how much mothers provided age-appropriate learning materials. Children were assessed on verbal cognitive abilities using standardized IQ tests during each assessment. Mothers were tested for symptoms of depression.

"We found that mothers who were highly depressed didn't invest emotionally or in providing learning materials to support their child, such as toys and books, as much as mothers who were not depressed. This, in turn, impacted the child's IQ at ages 1, 5, 10 and 16," said Patricia East, PhD, research scientist with the Department of Pediatrics at UC San Diego School of Medicine. "The consistency and longevity of these results speak to the enduring effect that depression has on a mother's parenting and her child's development."

On a scale from 1 to 19, the average verbal IQ score for all children in the study at age 5 was 7.64. Children who had severely depressed mothers were found to have an average verbal IQ score of 7.30, compared to a score of 7.78 in children without depressed mothers.

"Although seemingly small, differences in IQ from 7.78 to 7.30 are highly meaningful in terms of children's verbal skills and vocabulary," said East. "Our study results show the long term consequences that a child can experience due to chronic maternal depression."

Throughout the study period, at least half of the mothers were determined to be depressed based on a questionnaire with questions like, "Are you sad?" and "Do you find yourself crying?"

"For mothers in the study, there were many stressors in their lives. Most of the mothers, while literate, had only nine years of education, were not employed outside the home and often lived with extended family in small, crowded homes--factors that likely contributed to their depression," said East. "Many mothers suffer from depression in the first six months after childbirth, but for some, depression lingers."

East said study data suggested approximately 20 percent of mothers who are severely depressed when their child turns age 1 remain depressed for a long time.

"For health care providers, the results show that early identification, intervention and treatment of maternal depression are key," said East. "Providing resources to depressed moms will help them manage their symptoms in a productive way and ensure their children reach their full potential."

Study authors said future steps include further analyzing the data to see how mothers' depression affects children's own depressive symptoms through childhood and adolescence and children's academic achievement and health, such as their likelihood of being overweight or obese.

Credit: 
University of California - San Diego

Warming climate could speed forest regrowth in eastern US

image: Researchers grew tree seedlings in plots with varying soil fertility, and with and without different mixes of early succession plants such as broomsedge and goldenrod.

Image: 
Photo by Jason Fridley, Syracuse University.

DURHAM, N.C. -- Climate change could speed the natural regrowth of forests on undeveloped or abandoned land in the eastern U.S., according to a new study.

If left to nature's own devices, a field of weeds and grasses over time will be replaced by saplings, young trees and eventually mature forest. Earlier research has shown that this succession from field to forest can happen decades sooner in the southeastern U.S. than in the Northeast. But it wasn't obvious why, especially since northern and southern fields are first colonized by many of the same tree species.

Now, a study published in the Proceedings of the National Academy of Sciences points to temperature as the major factor influencing the pace at which trees take over.

The results suggest that as temperatures rise, faster-growing forests on lands that humans have left idle could play a bigger role in removing carbon dioxide from the atmosphere, say researchers from Duke University and Syracuse University.

The team conducted the experiment at six sites up and down the eastern U.S., from New York to Florida.

At each site, the researchers followed the early lives of four tree species that are common early arrivals in abandoned farm fields -- loblolly pine, black cherry, red cedar and sweetgum.

Using plastic wading pools as planters, they grew the trees from seed in plots with varying soil fertility, and with and without different mixes of early succession plants such as broomsedge and goldenrod.

In each plot the researchers also measured light availability, soil moisture, nutrients and other variables known to affect plant growth.

After two years, the tree seedlings grew faster at southern sites. But surprisingly, other plant species grew slower.

One possibility is that soil fertility is the main factor, said co-author Jason Fridley, associate professor of biology at Syracuse University. The thinking was that poorer southern soils produce a sparser carpet of weeds and grasses. This might in turn shade emerging tree seedlings to a lesser extent than in the north, and make it easier for them to grow up through the gaps.

But statistical analyses weighing the relative effects of soil fertility and other factors revealed that temperature was the biggest driver of tree seedling growth. Part of the reason is that milder winters and earlier springs mean a longer growing season, said Justin Wright, associate professor of biology at Duke.
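
A minimal sketch of how such an analysis can weigh two candidate drivers against each other, using synthetic data rather than the study's measurements: fit seedling growth against standardized temperature and soil fertility and compare the coefficients. Variable names and values below are hypothetical.

import numpy as np

rng = np.random.default_rng(1)
n = 120
temperature = rng.normal(15.0, 5.0, n)      # hypothetical mean growing-season temperature (deg C)
fertility = rng.normal(0.0, 1.0, n)         # hypothetical soil fertility index
growth = 0.8 * temperature + 0.2 * fertility + rng.normal(0.0, 2.0, n)   # toy seedling growth response

def standardize(x):
    return (x - x.mean()) / x.std()

# Standardized coefficients put both predictors on the same scale, so their
# relative magnitudes indicate which factor explains more of the variation.
X = np.column_stack([np.ones(n), standardize(temperature), standardize(fertility)])
coef, _, _, _ = np.linalg.lstsq(X, standardize(growth), rcond=None)
print("standardized effect of temperature:", round(coef[1], 2))
print("standardized effect of soil fertility:", round(coef[2], 2))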

The results are important because average annual temperatures in the eastern U.S. are predicted to warm by five to nine degrees Fahrenheit by the end of the century.

Rising temperatures could also bring more droughts, Wright cautions. But in the absence of drought stress, even minor warming will likely accelerate the transition from field to forest.

This also means that northeastern meadows that normally persist for decades may become shorter-lived, Fridley said. The forests that replace them probably won't mirror native forests, he added -- especially if cold-intolerant trees that are common colonizers of southern fields find it increasingly easy to survive and take hold in the north.

"Certainly in the next 100 years and maybe in the next 50 years, fields will likely transition much faster to woody vegetation," Fridley said. "The double whammy is the trees themselves are going to change too."

But young, rapidly growing trees can potentially absorb more carbon dioxide than weeds and grasses as they convert the heat-trapping gas to the sugar they need to grow. That means that undeveloped or abandoned land, if left undisturbed, could soon play a bigger role in offsetting human sources of carbon dioxide emissions.

"Faster-growing forests on once-cultivated land aren't going to solve the climate change problem," Wright said. "But one of the reasons we care about these abandoned sites is they have really high potential for carbon sequestration."

Credit: 
Duke University

Surviving climate change, then and now

Trade and social networking helped our Homo sapiens ancestors survive a climate-changing volcanic eruption 40,000 years ago, giving hope that we will be able to ride out global warming by staying interconnected, a new study suggests.

Analyzing ancient tools, ornaments and human remains from a prehistoric rock shelter called Riparo Bombrini, in Liguria on the Italian Riviera, archeologists at Université de Montréal and the University of Genoa conclude that the key to survival is cooperation.

Their study was published in early April in the Journal of Quaternary Science.

"Liguria is where some of the first Homo sapiens, more or less our direct ancestors, lived in Europe," said Julien Riel-Salvatore, a professor of archeology at UdeM who co-authored the study with his Italian colleague Fabio Negrino. "They came after the Neanderthals, and unlike them, when they were faced with sudden changes in their climate they didn't go locally extinct or abandon the region - they adapted."

Homo sapiens had been living in the region for about 1,000 years when a "super-eruption" in the Phlegraean Fields in southern Italy, west of present-day Naples, devastated much of Europe. "It used to be thought that this wiped out most of the early Homo sapiens in Europe, but we've been able to show that some were able to deal with the situation just fine. They survived by dealing with the uncertainty of sudden change."

In their work, the archeologists gathered tool fragments such as bladelets - small flakes knocked off large stones to use as barbs and slicing components of weapons for hunting - that showed the ingenuity of our early ancestors. Some of the flint they used was brought in from hundreds of kilometres away, indicating a very extensive social and trading network that helped them survive for the next 4,000 years.

"They had a link to people living far away, so that if things went haywire in the territory where they lived, they had the social option of depending on people they'd built relationships with - the broader the network, the easier it was to survive," said Riel-Salvatore, whose evidence also includes rare skeletal remains and a child's tooth, as well as shell and stone ornaments, that show Homo sapiens were there.

His study mirrors others on an even older archeological site, Mount Toba on the Indonesian island of Sumatra, where a super-eruption 75,000 years ago was once thought to have come close to wiping out humanity entirely, a theory since disproven. In both cases, archeology has shown that evolution isn't always as dramatic as we think.

"This seems to be part of a pattern where humans are more adaptable and more resilient in the face of these enormously disruptive events," said Riel-Salvatore. "These events can be really terrible, but only in a limited way, not across continents or globally."

It's a bit of a leap to say that what happened tens of thousands of years ago can help predict how humans today will cope with climate change, but learning from the past does help situate us for the future - and even rebut climate-change deniers, he added.

"It underscores the importance of archeology in being able to inform the more immediate issues we face. Cooperation and resilient social networks were really key in helping people ride out dramatic climate change in the past. And considering some of the challenges we're facing nowadays, and some of the entrenched positions we have to deal with, maybe this notion that cooperation is fundamental is something we can communicate as a take-home lesson."

The bulk of the data the researchers gathered for their study was excavated between 2002 and 2005 from Riparo Bombrini, a part of the Balzi Rossi site complex from the Middle-Upper Paleolithic period that was first probed in 1938 and excavated in 1976. Over the next three years, Riel-Salvatore and Negrino intend to delve further into why the Neanderthal population there disappeared and was replaced by the better-equipped - and better-connected - Homo sapiens.

Credit: 
University of Montreal

Evidence mounts for Alzheimer's, suicide risks among youth in polluted cities

image: This is pollution haze over Mexico City.

Image: 
Lilian Calderón-Garcidueñas photo

MISSOULA - A University of Montana researcher and her collaborators have published a new study that reveals increased risks for Alzheimer's and suicide among children and young adults living in polluted megacities.

Dr. Lilian Calderón-Garcidueñas said her group studied 203 autopsies of Mexico City residents ranging in age from 11 months to 40 years. Metropolitan Mexico City is home to 24 million people exposed daily to concentrations of fine particulate matter and ozone above U.S. Environmental Protection Agency standards. The researchers tracked two abnormal proteins that indicate development of Alzheimer's, and they detected the early stages of the disease in babies less than a year old.

"Alzheimer's disease hallmarks start in childhood in polluted environments, and we must implement effective preventative measures early," said Calderón-Garcidueñas, a physician and Ph.D. toxicologist in UM's Department of Biomedical and Pharmaceutical Sciences. "It is useless to take reactive actions decades later."

The research was published in the journal Environmental Research and is online at http://bit.ly/2veeDsC.

The scientists found heightened levels of the two abnormal proteins - hyperphosphorylated tau and beta amyloid - in the brains of young urbanites with lifetime exposures to fine-particulate-matter pollution (PM2.5). They also tracked Apolipoprotein E (APOE 4), a well-known genetic risk factor for Alzheimer's, as well as lifetime cumulative exposure to unhealthy levels of PM2.5 - particles which are at least 30 times smaller than the diameter of a human hair and frequently cause the haze over urban areas.

Findings indicate Alzheimer's starts in early childhood, and the disease progression relates to age, APOE 4 status and particulate exposure. Researchers found hallmarks of the disease among 99.5 percent of the subjects they examined in Mexico City. In addition, APOE 4 carriers have a higher risk of rapid progression of Alzheimer's and 4.92 times higher odds of committing suicide than APOE 3 carriers, controlling for age and particulate exposure.
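
To make a figure like 4.92 concrete, the arithmetic below shows how an odds ratio is computed from a 2x2 table. The counts are invented for illustration; the study's actual estimate was adjusted for age and particulate exposure, which requires a regression model rather than this simple calculation.

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 table of exposure by outcome."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts: outcome among APOE 4 carriers vs. APOE 3 carriers
print(round(odds_ratio(10, 40, 5, 98), 2))   # ~4.9, i.e. roughly five-fold higher odds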

Overall, the authors have documented an accelerated and early disease process for Alzheimer's in highly exposed Mexico City residents. They believe the detrimental effects are caused by tiny pollution particles that enter the brain through the nose, lungs and gastrointestinal tract, and these particles damage all barriers and travel everywhere in the body through the circulatory system.

The authors conclude that ambient air pollution is a key modifiable risk for millions of people across the globe, including millions of Americans who are exposed to harmful particulate pollution levels.

"Neuroprotection measures ought to start very early, including the prenatal period and childhood," Calderón-Garcidueñas said. "Defining pediatric environmental, nutritional, metabolic and genetic risk-factor interactions are key to preventing Alzheimer's disease."

Credit: 
The University of Montana

To starve pancreatic tumors, researchers seek to block 'self-eating,' other fuel sources

CHAPEL HILL -- To get the extra energy they need to fuel their uncontrolled growth, cancer cells break down some of their own parts for fuel - a process known as autophagy, or "self-eating." Researchers from the University of North Carolina Lineberger Comprehensive Cancer Center found a possible therapeutic strategy to block self-eating in one of the deadliest cancers, as well as to cut off the tumor's other energy sources.

The researchers are reporting preclinical findings for a potential two-treatment strategy to block multiple mechanisms of cancer cell metabolism in pancreatic cancer at the American Association for Cancer Research Annual Meeting in Chicago. The findings will be presented from 8 a.m. to noon on Wednesday.

"We know that cancer cells have a greater need for energy than normal cells," said UNC Lineberger's Channing Der, PhD, Sarah Graham Kenan Distinguished Professor in the UNC School of Medicine Department of Pharmacology.

"They get their energy by changing normal metabolic processes to allow them to generate more energy, and one of these processes is self-eating. Basically what a cancer cell does is it does this more efficiently than a normal cell."

Pancreatic cancer cells are known from other studies to rely especially heavily on autophagy, and UNC Lineberger scientists reported evidence that a type of treatment -- an ERK inhibitor -- actually increased this reliance. The researchers believe the compound prevents the cell from relying on other energy sources, driving it toward autophagy.

"The cancer cell has many ways to achieve what it wants in terms of getting more energy," Der said. "We find that if you try to stop one, a cancer cell has the ability to compensate. I think the analogy many of us use is the 'whack-a-mole' concept where you knock one thing down, and something else pops up. We need more than one hammer basically."

To block multiple energy sources at once, the researchers used an ERK inhibitor to cut off these other energy sources, alongside an investigational compound used to block autophagy, in the hope of starving the cells completely. They reported at AACR that this co-treatment showed a synergistic effect.

"What if we could cripple more than one energy sources for the cell at once?" Der said.

Credit: 
UNC Lineberger Comprehensive Cancer Center

Extreme climate variability destabilizing West Coast ecosystems

CORVALLIS, Ore. - New research shows that extreme climate variability over the last century in western North America may be destabilizing both marine and terrestrial ecosystems.

Climate is increasingly controlling synchronous ecosystem behavior in which species populations rise and fall together, according to the National Science Foundation-funded study published in the journal Global Change Biology.

Climate variability is of concern given that extreme events, such as prolonged drought or heatwaves, can disproportionately impact biology, reduce resilience and leave a lasting impact. An increase in the synchrony of the climate could expose marine and terrestrial organisms to higher risks of extinction, said study co-author Ivan Arismendi, an aquatic ecologist and assistant professor at Oregon State University.

"There has been a tremendous amount of research on climate change, but almost all of it has been focused on trends in average conditions, such as rising temperatures," Arismendi said. "However, climate is also predicted to become more variable and very little research has addressed this issue. Our study found that extreme variability is synchronizing processes within and among ecosystems at a level not seen in the last 250 years."

The interdisciplinary research team, led by the University of Texas Marine Science Institute, documented that wintertime atmospheric conditions along the west coast of North America, known as the North Pacific High, are important to marine, terrestrial, and freshwater ecosystems in California and the southwestern United States. A strong wintertime North Pacific High is associated with winds that are favorable for marine productivity, but also blocks the onshore storm track and leads to drought on land.

The researchers documented that the North Pacific High has become more variable over the past century, and that these trends have been imprinted on physical and biological indicators from the continental slope to the Sierra Nevada and beyond. There are more dramatic and frequent swings in this winter climate pattern, and not only has variability increased, but so too has the synchrony among diverse ecosystems.

"We've found that land, rivers, and oceans are all strongly related to a winter climate pattern off the western coast of North America, and that climate pattern has become more variable over the past century," said lead author Bryan Black, associate professor of marine science at UT-Austin. "This extreme variability is increasingly imprinted on these freshwater, terrestrial, and marine systems, and this has caused them to become more synchronous with one another with a number of implications for fisheries, drought, snowpack, and tree growth."

Indeed, tree-ring chronologies provide much longer histories than observational records and corroborate that variability and synchrony have risen over the past hundred years, and to levels that are as high as any observed over the past three centuries, according to the researchers.
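
One simple way to picture rising synchrony between such records, for example a tree-ring chronology and a marine productivity index, is a moving-window correlation. The sketch below uses synthetic series and invented names; it is not the study's method.

import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900, 2020)
driver = rng.normal(0.0, 1.0, years.size)                    # hypothetical North Pacific High index
series_a = 0.2 * driver + rng.normal(0.0, 1.0, years.size)   # e.g., tree growth
series_b = 0.2 * driver + rng.normal(0.0, 1.0, years.size)   # e.g., fish recruitment

# Correlation within successive 30-year windows; rising values indicate
# the two systems increasingly rise and fall together.
window = 30
for start in range(0, years.size - window + 1, window):
    r = np.corrcoef(series_a[start:start + window], series_b[start:start + window])[0, 1]
    print(f"{years[start]}-{years[start + window - 1]}: correlation = {r:.2f}")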

More frequent and larger changes in the North Pacific High appear to originate from rising variability in the tropics and are linked to the record-breaking El Niño events in 1983, 1998, and 2016 and the 2014-2015 North Pacific Ocean heat wave known as "The Blob."

Credit: 
Oregon State University