Earth

Nitrogen study casts doubt on ability of plants to continue absorbing same amounts of CO2

A new study casts doubt as to whether plants will continue to absorb as much carbon dioxide in the future as they have in the past due to declining availability of nitrogen in certain parts of the world.

When it comes to the role plants play in keeping the heat-trapping greenhouse gas out of the atmosphere, "it may not be business as usual," said Lixin Wang, an associate professor in the Department of Earth Sciences at IUPUI.

Wang is a co-author of the paper "Isotopic evidence for oligotrophication of terrestrial ecosystems," which reports that finding. It was published Oct. 22 in the journal Nature Ecology & Evolution.

image: Grasslands

In grasslands and forests, which are not directly fertilized, the availability of nitrogen to plants is declining.

The study examines global availability of nitrogen, using a data set that is more than 30,000 data points larger than those previously used to determine nitrogen availability.

An essential nutrient for plants as well as for humans and animals, nitrogen is used widely in more urban, developed countries to fertilize crops. In fact, it has been used so widely that its use has raised serious environmental concerns.

That gave people the impression "that we are kind of nitrogen-saturated everywhere, that we have too much nitrogen," Wang said.

But the researchers found that perception is not true.

In natural systems such as grasslands and forests that are not directly fertilized, the researchers said, the availability of nitrogen to plants is declining. At the same time, demand for the nutrient is rising as plants leaf out earlier and growing seasons lengthen with climate warming, leaving plants suffering from nitrogen deficiency, Wang said.

"In such systems, which cover a large part of the world, demand for nitrogen is rising at a faster rate than the supply of nitrogen," Wang said.

With nitrogen deficiency, plants are unable to absorb the same quantity of carbon dioxide as they did previously.

"We know that plants reliably suck up carbon dioxide that we emit into the environment," Wang said. "But the problem right now is if plants are suffering more and more nitrogen limitations, it means they will be able to take up less and less of the extra carbon dioxide."

"Not only will plants be more stressed for nitrogen," said Joseph Craine, the paper's lead author, "but so will animals that eat plants. Less nitrogen in plants means less protein for herbivores, which could threaten the entire food chain."

Credit: 
Indiana University

Rewilding landscapes can help to solve more than one problem

image: Bison in the Kennemerduinen National Park, The Netherlands.

Image: 
Staffan Widstrand/Rewilding Europe

Urbanisation, biodiversity loss, climate change: just some of the worldwide problems 'rewilding' - i.e. restoring food chains by returning 'missing' species to the landscape - can help tackle. Researcher Liesbeth Bakker (NIOO-KNAW) has edited a theme issue of the world's oldest life sciences journal, Phil Trans B, on rewilding, together with a Danish expert. The issue is now available online.

When animals become extinct or disappear from an area, their unique role in nature is often lost. "There is increasing evidence that this global wildlife loss does not only imply the loss of charismatic animals, but also the functions they have in ecosystems", argues ecologist Liesbeth Bakker (NIOO-KNAW).

The consequences can be disastrous. Wildfires, for instance, have become an increasingly serious problem: without large herbivores to eat the plant material, more of it remains, meaning more 'fuel' for such fires.

"Since the world-wide expansion of modern humans began", explains Bakker, "humans have overexploited large vertebrates. From the Late Pleistocene extinctions of terrestrial megafauna to the current poaching of elephants and rhinos."

From debate to data

If we are to restore nature, the role of these animals in the food web is crucial. One approach to obtaining a healthy food web (again) is by reintroducing 'missing' species. "It's called trophic rewilding," says Bakker. "There are other kinds of rewilding as well."

An example of the ripple effect caused by trophic rewilding is the reintroduction of wolves in Yellowstone National Park in the United States in the 1990s, which is even said to have changed the course of some rivers. The wolves brought down the deer/elk population, river banks suffered less erosion, and with the rivers fixed in their course, more biodiversity-rich pools formed.

The story has become part of the rather romantic and fashionable image attached to rewilding. But while plenty of people may dabble or express opinions, "scientific data on the effects of explicit rewilding efforts have until now remained scarce", says Bakker. The theme issue of Phil Trans B which she and Danish researcher Jens-Christian Svenning (University of Aarhus) have guest-edited is meant to change that.

Elk and bison

In the theme issue, researchers from all over the world share their data. Among their findings is that in the Arctic, large herbivores such as reindeer and muskoxen can actually mitigate the impact of rising temperatures.

Other examples demonstrate a similarly positive impact. Replacing ruminant livestock with non-ruminant wildlife will reduce the emission of methane - a greenhouse gas - in rangeland farming, beavers can enhance wetland plant diversity, and re-introductions of native carnivores can be an effective method for suppressing invasive carnivores and invasive herbivores.

Bakker adds: "Climate change doesn't form an impediment to the reintroduction of large animals in most cases. In the Netherlands, for instance, species such as the European bison and the elk feel right at home." She hopes rewilding will become an increasingly 'transdisciplinary' field, in which scientific and practical applications keep pace with each other and there's room for ecology, sociology, geography and economics.

Successful recipe

"These studies demonstrate that trophic rewilding is a promising tool to mitigate negative impacts of global change on ecosystems and their functioning", concludes Bakker. In due time it may even help to provide solutions for other global issues as well, including urbanisation and biodiversity loss. "But it's also clear that implementing trophic rewilding alone will not solve these problems."

Altered land-use - e.g. providing more space for rivers to follow their natural temporal and spatial dynamics - plays an important role in recipes for successful rewilding. So does scale. "Generally, it emerges that large-scale trophic rewilding produces the best results, whereas in human-dominated, fragmented landscapes a certain level of management of ecosystems may still be needed."

But even under these circumstances, concludes Bakker, "a gradual increase in naturalness of ecosystems over time is achievable." And that's even true for the Netherlands, which despite its small size and issues of overpopulation and overexploitation continues to be one of the trailblazers for rewilding.

Credit: 
Netherlands Institute of Ecology (NIOO-KNAW)

Marker found for condition that causes numerous tumors

image: Dr. Lu Le

Image: 
UTSW

DALLAS - Oct. 22, 2018 - UT Southwestern researchers have made a major advance in uncovering the biology of how thousands of disfiguring skin tumors occur in patients troubled by a genetic disorder called neurofibromatosis type 1 (NF1). This scientific advance could slow the development of these tumors.

NF1, which affects 1 in 3,000 people, has a wide spectrum of symptoms that include malignant tumors, high blood pressure, and learning disorders. While the skin tumors, which are called cutaneous neurofibromas, are most often noncancerous, they can number in the thousands and cover much of a patient's body. They also can be painful or itchy, catch on clothing, bleed and become infected. Perhaps even more severe than the physical discomfort is the emotional distress. NF1 tumors can be severely disfiguring, like a layer of warts across the skin, and patients often dress to hide them.

Currently, the only treatment for neurofibromas is surgical removal of the most uncomfortable and most disfiguring of the skin tumors. It would be impossible to remove them all.

"NF1 causes significant morbidity, and an effective treatment for NF1 is long overdue," said Dr. Lu Le, an Associate Professor of Dermatology who holds the Thomas L. Shields, M.D. Professorship in Dermatology at UT Southwestern Medical Center.

"For the first time we have a mouse model that develops different types of neurofibromas inside the body and on the skin, just like in humans. Because of this model, we now know the exact origin of these two types of tumors. If you know where the tumor begins, and you know the end result, then you can follow the steps in the occurrence of the tumor and figure out how to interrupt the development of the tumors," said Dr. Le, who treats NF1 patients as well as does research on the condition.

The researchers found that the protein Hox-B7 is a marker for the cell of origin for NF1 tumors. "It's like a GPS system in a car. By making the Hox-B7 cells light up, we can follow the development of the tumor. It's like branding," said Dr. Le, the senior author of the study and a member of the Harold C. Simmons Comprehensive Cancer Center.

Another key discovery is that a parallel pathway, the Hippo pathway, can modify growth and development of these tumors. This is particularly important because treatments are being developed to block the Hippo pathway. "If you can control the Hippo pathway, you should be able to slow the development of neurofibromas, specifically in NF1 patients who also have genetic changes in their Hippo pathway," Dr. Le said.

The research appears in the journal Cancer Discovery.

Other UT Southwestern researchers involved in this study were first author and Assistant Instructor Dr. Zhiguo Chen; postdoctoral researchers Dr. Juan Mo, Dr. Jean-Philippe Brosseau, Dr. Chung-Ping Liao, and Dr. Jonathan Cooper; research associates Tracey Shipman and Yong Wang of Dermatology; and Professor of Internal Medicine and Molecular Biology, Dr. Thomas Carroll. Dr. Carroll holds the NCH Corporation Chair in Molecular Transport.

"Research reported in this news release was supported by the National Cancer Institute of the National Institutes of Health under Award Number R01CA166593. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Research was also supported by the U.S. Army Medical Research and Materiel Command, Children's Tumor Foundation Award, Dermatology Foundation, the Giorgio Foundation, the Neurofibromatosis Therapeutic Acceleration Program, and the NF1 Research Consortium Fund."

Credit: 
UT Southwestern Medical Center

Oncologists demand more education on the use of biosimilars: ESMO takes action

Biological medicines are responsible for some of the most promising innovations in cancer treatment, including immunotherapy, targeted drugs and vaccines - but they are also expensive.

According to a report by the IQVIA Institute for Human Data Science (1), EU-wide spending on cancer therapeutics exceeded 24 billion euros in 2016, with biological medicines accounting for almost 40% of the expense. The introduction of biosimilars for the top three oncology agents is estimated to enable up to 2 billion euros in savings across Europe in 2021 alone. As such, their use could increase the availability of innovative therapies for patients who would not otherwise be treated with biologicals due to economic constraints. All of this, however, hinges on a broader understanding of biosimilars throughout the oncology ecosystem.

ESMO has already taken a position on the use of biosimilars (2), supporting their use in oncology as they represent cheaper alternatives to reference biologics with the potential to make optimal cancer care more sustainable and widely accessible.

Drawing further attention to the issue, ESMO recently published a paper (3) about the integration of biosimilars into routine oncology practice, calling for multidisciplinary collaboration to build confidence wherever it is still lacking. According to preliminary results of an ESMO survey to be published later this year, many oncologists, when questioned about their knowledge of and comfort with biosimilars, expressed only moderate confidence in their understanding of the key concepts that underpin biosimilar drug development and use. Close to 87% of respondents stated that they wished to be offered more educational activities on the subject.

On the same wavelength was the discussion at the fourth stakeholder conference on biosimilar medicines (4), organised by the European Commission in Brussels in September, where ESMO moderated a discussion on the use of biosimilars in oncology.

"Collaboration and multi-stakeholder approaches to improve understanding and eradicate misconceptions are important for the successful uptake of biosimilars in oncology," said Josep Tabernero, ESMO President. "If more education is what oncologists need, being the Society of reference that looks after their professional development, we must urgently act on that."

At a session on biosimilars during the ESMO 2017 Congress in Madrid, not only physicians but also nurses, pharmacists, patient organisations and competent authorities from the EU Member States were invited to join the discussion, in order to draw from everyone's knowledge and experience. "All of them play a crucial role in delivering these medicines to patients and making sure the latter accept and adhere to their therapies," Tabernero said. A similar approach will be proposed at this year's congress, later this month in Munich.

Extrapolating the use of a biosimilar in all indications approved for the reference drug appears the most common source of misunderstanding among physicians, nurses and patients alike, according to data in the ESMO survey. "It is a very difficult concept to explain outside of the regulatory setting," said Elena Wolff-Holz from the EMA (European Medicines Agency), first author of the paper and panel member at the event in Brussels. "This is what educational initiatives should focus on - not just for oncologists, but for all healthcare professionals and for patients," Tabernero highlighted.

Tabernero, who also co-authored the paper, cited one of its key examples in this area: "The Information Guide for Healthcare Professionals on Biosimilars in the EU has been publicly available since its release in 2017," he said. "It is the product of a concerted effort by the European Commission, the EMA, (5) as well as scientific experts, healthcare professionals and patients from the member states - and of course ESMO, which ensured that the information needs for the oncology community were also adequately addressed. This is exactly the kind of broad-spectrum collaboration we need to see more of going forward."

Credit: 
European Society for Medical Oncology

In five to 10 years, gravitational waves could accurately measure the universe's expansion

image: UChicago scientists estimate, based on how quickly LIGO made its first detection of a neutron star collision, that they could have an extremely precise measurement of the universe's rate of expansion within five to ten years.

Image: 
Robin Dienel/The Carnegie Institution for Science

Twenty years ago, scientists were shocked to realize that our universe is not only expanding, but that it's expanding faster over time.

Pinning down the exact rate of expansion, called the Hubble constant after famed astronomer and UChicago alumnus Edwin Hubble, has been surprisingly difficult. Since then, scientists have used two methods to calculate the value, and the two spit out distressingly different results. But last year's surprising capture of gravitational waves radiating from a neutron star collision offered a third way to calculate the Hubble constant.

That was only a single data point from one collision, but in a new paper published Oct. 17 in Nature, three University of Chicago scientists estimate that given how quickly researchers saw the first neutron star collision, they could have a very accurate measurement of the Hubble constant within five to ten years.

"The Hubble constant tells you the size and the age of the universe; it's been a holy grail since the birth of cosmology. Calculating this with gravitational waves could give us an entirely new perspective on the universe," said study author Daniel Holz, a UChicago professor in physics who co-authored the first such calculation from the 2017 discovery. "The question is: When does it become game-changing for cosmology?"

In 1929, Edwin Hubble announced that based on his observations of galaxies beyond the Milky Way, they seemed to be moving away from us--and the farther away the galaxy, the faster it was receding. This is a cornerstone of the Big Bang theory, and it kicked off a nearly century-long search for the exact rate at which this is occurring.

To calculate the rate at which the universe is expanding, scientists need two numbers. One is the distance to a faraway object; the other is how fast the object is moving away from us because of the expansion of the universe. If you can see it with a telescope, the second quantity is relatively easy to determine, because the light you see when you look at a distant star gets shifted into the red as it recedes. Astronomers have been using that trick to see how fast an object is moving for more than a century--it's like the Doppler effect, in which a siren changes pitch as an ambulance passes.
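
As a back-of-the-envelope illustration of that relationship (our own sketch with made-up numbers, not the study's analysis): for relatively nearby objects the recession velocity is roughly the speed of light times the redshift, and the Hubble constant is that velocity divided by the distance.

```python
# Rough illustration of v = H0 * d with invented numbers; not the study's analysis.
C_KM_S = 299_792.458  # speed of light in km/s

def recession_velocity(redshift):
    """Approximate recession velocity for a small redshift: v ~ c * z."""
    return C_KM_S * redshift

def hubble_constant(distance_mpc, redshift):
    """H0 in km/s per megaparsec, given a distance in Mpc and a redshift."""
    return recession_velocity(redshift) / distance_mpc

# A hypothetical source 40 Mpc away with redshift 0.0093 gives H0 of roughly 70 km/s/Mpc.
print(round(hubble_constant(40.0, 0.0093), 1))
```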

'Major questions in calculations'

But getting an exact measure of the distance is much harder. Traditionally, astrophysicists have used a technique called the cosmic distance ladder, in which the brightness of certain variable stars and supernovae can be used to build a series of comparisons that reach out to the object in question. "The problem is, if you scratch beneath the surface, there are a lot of steps with a lot of assumptions along the way," Holz said.

Perhaps the supernovae used as markers aren't as consistent as thought. Maybe we're mistaking some kinds of supernovae for others, or there's some unknown error in our measurement of distances to nearby stars. "There's a lot of complicated astrophysics there that could throw off readings in a number of ways," he said.

The other major way to calculate the Hubble constant is to look at the cosmic microwave background--the pulse of light created at the very beginning of the universe, which is still faintly detectable. While also useful, this method also relies on assumptions about how the universe works.

The surprising thing is that even though scientists doing each calculation are confident about their results, they don't match. One says the universe is expanding almost 10 percent faster than the other. "This is a major question in cosmology right now," said the study's first author, Hsin-Yu Chen, then a graduate student at UChicago and now a fellow with Harvard University's Black Hole Initiative.

Then the LIGO detectors picked up their first ripple in the fabric of space-time from the collision of two stars last year. This not only shook the observatory, but the field of astronomy itself: Being able to both feel the gravitational wave and see the light of the collision's aftermath with a telescope gave scientists a powerful new tool. "It was kind of an embarrassment of riches," Holz said.

Gravitational waves offer a completely different way to calculate the Hubble constant. When two massive stars crash into each other, they send out ripples in the fabric of space-time that can be detected on Earth. By measuring that signal, scientists can get a signature of the mass and energy of the colliding stars. When they compare this reading with the strength of the gravitational waves, they can infer how far away it is.

This measurement is cleaner and holds fewer assumptions about the universe, which should make it more precise, Holz said. Along with Scott Hughes at MIT, he suggested the idea of making this measurement with gravitational waves paired with telescope readings in 2005. The only question is how often scientists could catch these events, and how good the data from them would be.

'It's only going to get more interesting'

The paper predicts that once scientists have detected 25 readings from neutron star collisions, they'll measure the expansion of the universe within an accuracy of 3 percent. With 200 readings, that number narrows to 1 percent.
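
Those two figures are consistent with the precision improving roughly as one over the square root of the number of detections. The short check below assumes that scaling; it is an illustration, not the paper's calculation.

```python
import math

# Assumed 1/sqrt(N) scaling, anchored to the quoted ~3 percent after 25 detections.
def precision_percent(n_events, anchor_n=25, anchor_precision=3.0):
    return anchor_precision * math.sqrt(anchor_n / n_events)

for n in (25, 50, 100, 200):
    print(n, round(precision_percent(n), 2))
# 25 detections -> 3.0 percent; 200 detections -> ~1.1 percent, close to the quoted 1 percent.
```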

"It was quite a surprise for me when we got into the simulations," Chen said. "It was clear we could reach precision, and we could reach it fast."

A precise new number for the Hubble constant would be fascinating no matter the answer, the scientists said. For example, one possible reason for the mismatch in the other two methods is that the nature of gravity itself might have changed over time. The reading also might shed light on dark energy, a mysterious force responsible for the expansion of the universe.

"With the collision we saw last year, we got lucky--it was close to us, so it was relatively easy to find and analyze," said Maya Fishbach, a UChicago graduate student and the other author on the paper. "Future detections will be much farther away, but once we get the next generation of telescopes, we should be able to find counterparts for these distant detections as well."

The LIGO detectors are planned to begin a new observing run in February 2019, joined by their Italian counterparts at VIRGO. Thanks to an upgrade, the detectors' sensitivities will be much higher--expanding the number and distance of astronomical events they can pick up.

"It's only going to get more interesting from here," Holz said.

Credit: 
University of Chicago

Children as young as seven suffer effects of discrimination, study shows

RIVERSIDE, CA - A new UC Riverside study finds children are sensitive to and suffer the impacts of discrimination as young as 7 years old.

Previous studies have shown children can identify racism at that age, but the study from Tuppett Yates, a UC Riverside psychology professor, and Ana Marcelo, an assistant professor of psychology at Clark University, is the first to study the impacts on children under 10 years old. The study also suggests that a strong sense of ethnic-racial identity is a significant buffer against these negative effects.

"We must recognize that ethnicity-race is an important part of a person's identify and development even at an early age, rather than profess to operate as a colorblind society," Yates said.

Research has long documented the negative effects of discrimination on human development. Among black and Latino teens, these impacts manifest themselves in substance abuse, depression, and risky sexual behavior. Among adults, those who report experiencing discrimination are more likely to suffer from cardiovascular disease and diabetes.

Yates' and Marcelo's study, recently published in the Journal of Cultural Diversity and Ethnic Minority Psychology, looked at experiences of discrimination in a sample of 172 7-year-old children -- half girls, half boys. Fifty-six percent of the children were Latino, 19 percent were black, and the rest were multiethnic-racial.

First, children were given the following definition of discrimination:

When people discriminate against other people, it means they treat people badly or do not respect them because of the color of their skin, because they speak a different language or have an accent, or because they come from a different country or culture. For each of the following situations, think whether you have ever felt discriminated against because of the color of your skin, your language or accent, or because of your culture or country of origin.

The children were asked questions, all of which began: "Have you ever in your life experienced __________ because of the color of your skin, your language or accent, or your culture or country of origin..." The range of experiences spanned relationships with peers (e.g., "had someone not be friends with you"), teachers (e.g., "been treated badly or unfairly by a teacher"), and general relationships (e.g., "been called an insulting name").

One year later, children received a definition of ethnicity that began by explaining that many ethnic groups exist in the U.S., stating: "Every person is born into one or more ethnic groups, but people differ on how important their ethnicity is to them, how they feel about it, and how much their behavior is affected by it."

Then, children were asked to rate statements such as: "I have often talked to other people in order to learn more about my ethnic group," and, "I understand pretty well what my ethnic background means to me."

Yates and Marcelo then factored in ethnic-racial identity, or ERI, which reflects the beliefs and attitudes that individuals have about their ethnic and racial groups. They found that experiences of discrimination predicted increased internalized and externalized behavior problems (e.g., anxiety, depression, oppositionality) among children with below-average ERI scores, but these same experiences did not significantly predict problems among children with better-developed ERI.
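
In statistical terms, this is a moderation effect: ERI changes the strength of the link between discrimination and behavior problems. A minimal sketch of such an interaction model on simulated data is below; the variable names, coefficients, and data are invented for illustration and are not the study's.

```python
import numpy as np

# Simulated moderation (interaction) analysis; invented data, not the study's.
rng = np.random.default_rng(1)
n = 172
discrimination = rng.normal(0, 1, n)  # standardized discrimination experiences
eri = rng.normal(0, 1, n)             # standardized ethnic-racial identity score

# Simulate problems that rise with discrimination mainly when ERI is low.
problems = (0.4 * discrimination - 0.2 * eri - 0.3 * discrimination * eri
            + rng.normal(0, 1, n))

# Fit an ordinary least squares model with an interaction term.
X = np.column_stack([np.ones(n), discrimination, eri, discrimination * eri])
coefs, *_ = np.linalg.lstsq(X, problems, rcond=None)
print(np.round(coefs, 2))  # intercept, discrimination, ERI, interaction (~ -0.3)
```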

Previous research has shown that teens with a greater interest in their ethnic background, and a greater sense of belonging to their ethnic-racial group, demonstrate greater psychological well-being, and fewer negative behavioral impacts in the wake of discrimination experiences than their peers who are less well-informed and connected to their ethnic-racial group.

The new research affirmed the same phenomenon among younger children. Yates said the new research suggests that efforts to promote a sense of understanding about and belonging to one's ethnic-racial group in early development can help to buffer children who are vulnerable to discrimination.

"Parents and caregivers should acknowledge that ethnicity, race, and culture are active elements in a child's life," said Marcelo, who worked in Yates' lab as a graduate student. "Talking with children about how they experience their ethnicity-race is very important."

The researchers suggested having books and learning materials in school that represent people of color can help, as well as community events that allow children to experience their cultures through food, art, and music.

Yates and Marcelo said the recently published study is all the more salient as young children encounter increasing exposure to racial and ethnic divisions represented by the Black Lives Matter movement and the Trump administration's high-profile actions regarding immigration and foreign travel.

"These findings highlight the importance of reducing discrimination and its pernicious effects, as well as promoting a positive sense of ethnic-racial identity and belonging to partially buffer children in the interim," Yates said.

Credit: 
University of California - Riverside

Genetic study improves lifespan predictions and scientific understanding of aging

image: Paul Timmers, MRes, graduate student, University of Edinburgh (courtesy Mr. Timmers).

Image: 
(courtesy Mr. Timmers)

SAN DIEGO, Calif. - By studying the effect of genetic variations on lifespan across the human genome, researchers have devised a way to estimate whether an individual can expect to live longer or shorter than average, and have advanced scientific understanding of the diseases and cellular pathways involved in aging. Their findings were presented at the American Society of Human Genetics (ASHG) 2018 Annual Meeting in San Diego, Calif.

Presenting author Paul Timmers, MRes, a graduate student at the University of Edinburgh, and an international group of collaborators set out to identify key genetic drivers of lifespan. In the largest genome-wide association study of lifespan to date, they paired genetic data from more than 500,000 participants in the UK Biobank and other cohorts with data on the lifespan of each participant's parents. Rather than studying the effects of one or more selected genes on lifespan, they looked across the whole genome to answer the question in a more open-ended way and identify new avenues to explore in future work.

Because the effect of any given gene is so small, the large sample size was necessary to identify genes relevant to lifespan with enough statistical power, Mr. Timmers explained. Using this sample, the researchers validated six previously identified associations between genes and aging, such as the APOE gene, which has been tied to risk of neurodegenerative disease. They also discovered 21 new genomic regions that influence lifespan.

They used their results to develop a polygenic risk score for lifespan: a single, personalized genomic score that estimates a person's genetic likelihood of a longer life. Based on weighted contributions from relevant genetic variants, this score allowed the researchers to predict which participants were likely to live longest.
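
In general terms, a polygenic score of this kind is a weighted sum of a person's allele counts at the relevant variants, with weights taken from the association results. The sketch below shows that idea with hypothetical variant names and effect sizes; it is not the study's actual score.

```python
# Minimal polygenic-score sketch: a weighted sum of allele counts.
# Variant names and effect sizes are hypothetical, not the study's values.
effect_sizes = {
    "variant_A": 0.30,   # per-allele effect on the lifespan score (illustrative units)
    "variant_B": -0.45,  # e.g. an allele that shortens expected lifespan
    "variant_C": 0.10,
}

def polygenic_score(genotype):
    """genotype maps variant name -> number of effect alleles carried (0, 1 or 2)."""
    return sum(effect_sizes[v] * genotype.get(v, 0) for v in effect_sizes)

person = {"variant_A": 2, "variant_B": 1, "variant_C": 0}
print(polygenic_score(person))  # 2*0.30 + 1*(-0.45) + 0*0.10 = 0.15
```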

"Using a person's genetic information alone, we can identify the 10 percent of people with the most protective genes, who will live an average of five years longer than the least protected 10 percent," said Mr. Timmers.

The researchers also wanted to know whether genetic variants were affecting the aging process directly or affecting risk of individual diseases that could lead to death. They found that among common variants - variants found in at least 1 in 200 people - those associated with Alzheimer's disease, heart disease, and smoking-related conditions were linked to overall lifespan. Notably, they did not find lifespan associations for other cancers, suggesting that susceptibility to death caused by other cancers is due to rarer genetic variants or the environment.

"This was an interesting result," Mr. Timmers said. "We suspect that the variants we found, such as for smoking and Alzheimer's disease, pertain uniquely to the modern period of human history. For example, a genetic propensity to smoke wasn't harmful before we discovered tobacco, but it is now. Since natural selection has not yet had many generations to act on these variants, the variants are still fairly common," he explained.

In addition, the researchers examined the cell types and protein pathways in which the genetic variants associated with lifespan had the strongest effect. They found that the genes played key roles in fetal brain cells and adult prefrontal cortex cells, with particular effects in pathways related to fat metabolism. Together, Mr. Timmers noted, these findings highlight the brain as an important organ in determining lifespan and present a good opportunity for follow-up studies.

To build on their findings, the researchers plan to investigate how the variants and functional pathways they identified affect lifespan. For example, they plan to study whether these pathways are associated with single diseases that have implications for longevity or a broader spectrum of age-related diseases. By better understanding how these pathways interact with one another, they ultimately hope to identify ways to slow aging and disease onset that will improve the length and quality of life.

Credit: 
American Society of Human Genetics

Heart patients advised to move around every 20 minutes to prolong life

Toronto, Canada 20 Oct 2018: Heart patients are being advised to move around every 20 minutes in a bid to prolong life following a study presented at the Canadian Cardiovascular Congress (CCC) 2018.

CCC 2018 is being held 20 to 23 October in Toronto, Canada. Visiting experts from the European Society of Cardiology (ESC) will participate in joint scientific sessions with the Canadian Cardiovascular Society (CCS) as part of the ESC Global Activities programme.1

Heart patients spend most of their waking hours sitting, lying down, and watching television.2,3 Previous research has shown that being sedentary for long periods could shorten life but taking breaks to move around may counteract the risk, particularly if it means burning more than 770 kcal a day.4-8 This study investigated how many breaks, and for what duration, are needed to expend 770 kcal.

"Our study shows that heart patients should interrupt sedentary time every 20 minutes with a 7 minute bout of light physical activity," said study author Dr Ailar Ramadi, postdoctoral fellow, Faculty of Rehabilitation Medicine, University of Alberta, Edmonton, Canada. "Simple activities such as standing up and walking at a casual pace will expend more than 770 kcal in a day if done with this frequency and duration."

The study enrolled 132 patients with coronary artery disease. The average age was 63 years and 77% were male. Participants wore an armband activity monitor for an average of 22 hours a day for five days. The activity monitor recorded the amount of energy spent during breaks from inactivity, the amount of inactive time, and the number and duration of breaks during each sedentary hour.

Dr Ramadi said: "There is a lot of evidence now that sitting for long periods is bad for health. Our study suggests that during each hour of sitting time, heart patients should take three breaks which add up to 21 minutes of light physical activity. This will expend 770 kcal a day, an amount associated with a lower risk of premature death."
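
The arithmetic behind those numbers can be sketched as follows, with our own assumptions (the release does not specify waking hours or the energy cost of light activity):

```python
# Back-of-the-envelope check; waking hours and kcal/minute are our assumptions.
breaks_per_hour = 3
minutes_per_break = 7
waking_hours = 16               # assumed
kcal_per_minute_light = 2.3     # assumed cost of standing/casual walking

active_minutes = breaks_per_hour * minutes_per_break * waking_hours
print(active_minutes)                          # 336 minutes of light activity a day
print(active_minutes * kcal_per_minute_light)  # ~773 kcal, in line with the 770 kcal target
```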

Regarding limitations of the research, Professor Joep Perk, ESC Prevention Spokesperson, noted that this was a small, observational study with no control group. "A randomised controlled trial is needed before this can become a firm recommendation," he said. "Nevertheless, regular physical activity is key to achieving a healthy life, whether you are a cardiac patient or not."

Dr Michelle M. Graham, Scientific Programme Committee Chair of CCC 2018, said: "We are delighted to have innovative studies such as that by Ramadi and colleagues being presented at CCC. Their novel work has very practical implications, not only for patients with cardiovascular disease, but for improving prevention by altering how people work in sedentary environments."

Professor Jeroen Bax, Past President of the ESC and course director of the ESC programme at CCC 2018, said: "Sedentary lifestyles affect more than half of the world's population. ESC guidelines on the prevention of cardiovascular disease recommend a minimum of 150 minutes of moderate activity or 75 minutes of vigorous activity per week. Any activity is better than none and more activity is better than some."9

Credit: 
European Society of Cardiology

Healthy diets linked to better outcomes in colorectal cancer

Colorectal cancer patients who followed healthy diets had a lower risk of death from colorectal cancer and all causes, even those who improved their diets after being diagnosed, according to a new American Cancer Society study.

There are more than 1.4 million colorectal cancer (CRC) survivors in the United States. Previous studies have suggested a strong influence of diet quality in disease outcomes, and that some pre- and postdiagnosis dietary components are related to survival in men and women with CRC. But studies of dietary patterns to assess overall diet quality in relation to overall and CRC-specific mortality are inconsistent, making the development of evidence-based recommendations for CRC survivors difficult.

To learn more, investigators led by Mark A. Guinter, PhD, MPH, American Cancer Society post-doctoral fellow, reviewed data from 2,801 men and women diagnosed with CRC in the American Cancer Society's large, prospective Cancer Prevention Study-II (CPS-II) Nutrition Cohort. They found those whose pre- and postdiagnosis diets were consistent with the American Cancer Society Guidelines on Nutrition and Physical Activity for Cancer Prevention had lower all-cause and CRC specific mortality.

Pre-diagnosis diets that most closely aligned with ACS dietary recommendations were associated with a 22% lower risk of all-cause mortality compared to those on the other end of the spectrum. Significant inverse trends were observed for CRC specific mortality, as well. For the highest quartile of pre-diagnosis Western dietary pattern, which is characterized by high intakes of red meat and other animal products, there was a 30% higher risk of CRC death compared with the lowest quartile.

Postdiagnosis dietary patterns were also significantly associated with the risk of death. Participants whose postdiagnosis diets scored highest on the ACS guidelines, compared with those who scored lowest, had a 65% lower risk of CRC mortality and a 38% lower risk of mortality from all causes.
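
For readers used to hazard ratios, those percentages translate directly: a 65% lower risk corresponds to a hazard ratio of about 0.35, and a 30% higher risk to about 1.30. The snippet below is just that conversion, not a reproduction of the study's adjusted models.

```python
# Convert the reported "% lower/higher risk" figures into hazard-ratio form.
def hazard_ratio(percent_change):
    """percent_change: -22 means 22% lower risk, +30 means 30% higher risk."""
    return round(1 + percent_change / 100, 2)

print(hazard_ratio(-22))  # 0.78: all-cause mortality, best vs. worst pre-diagnosis ACS score
print(hazard_ratio(+30))  # 1.30: CRC death, highest vs. lowest pre-diagnosis Western-diet quartile
print(hazard_ratio(-65))  # 0.35: CRC mortality, highest vs. lowest postdiagnosis ACS score
print(hazard_ratio(-38))  # 0.62: all-cause mortality, same postdiagnosis comparison
```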

The study authors say additional diet patterns and scores that also were based on plant foods and low red and processed meat consumption corroborated their main findings. They conclude that the results suggest the importance of diet quality as a potentially modifiable tool to improve prognosis among men and women with CRC.

"This study is this first to our knowledge that considered change in diet quality across the CRC continuum," said Guinter. "These results suggest that high diet quality after diagnosis, even if poor before, may be associated with a lower risk of death."

Credit: 
American Cancer Society

How schools can optimise support for children with ADHD

New research gives the clearest guidance yet on how schools can best support children with ADHD to improve symptoms and maximise their academic outcomes.

The study, led by the University of Exeter and involving researchers at the EPPI-Centre (University College London), undertook a systematic review which analysed all available research into non-medication measures to support children with ADHD in schools. Published in Review of Education, the paper found that interventions which include one-to-one support and a focus on self-regulation improved academic outcomes.

Around five per cent of children have ADHD, meaning most classrooms will include at least one child with the condition. They struggle to sit still, to focus their attention and to control impulses much more than other children of the same age. Schools can be a particularly challenging setting for these children, and their difficulty in waiting their turn or staying in their seat impacts peers and teachers. Research shows that medication is effective, but does not work for all children, and is not acceptable to some families.

The research was funded by the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South West Peninsula - or PenCLAHRC. The team found 28 randomised control trials on non-drug measures to support children with ADHD in schools. In a meta-analysis, they analysed the different components of the measures being carried out to assess the evidence for what was most effective.
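
As a generic illustration of how a meta-analysis combines trial results (the review's actual component-level models are more involved, and the numbers below are invented), individual studies are typically weighted by the inverse of their variance:

```python
# Generic inverse-variance pooling of per-trial effect sizes; numbers are invented
# and this is not the review's actual model.
effects = [0.40, 0.25, 0.55, 0.10]          # standardized mean differences from four trials
standard_errors = [0.15, 0.20, 0.25, 0.18]

weights = [1 / se ** 2 for se in standard_errors]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(round(pooled, 2), round(pooled_se, 2))  # pooled effect ~0.31 with standard error ~0.09
```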

The studies varied in quality, which limits the confidence the team can have in their results. They found that important aspects of successful interventions for improving the academic outcomes of children are when they focus on self-regulation and are delivered in one-to-one sessions.

Self-regulation is hard for children who are very impulsive and struggle to focus attention. Children need to learn to spot how they are feeling inside, to notice triggers and avoid them if possible, and to stop and think before responding. This is much harder for children with ADHD than for most other children, but these are skills that can be taught and learned.

The team also found some promising evidence for daily report cards. Children are set daily targets which are reviewed via a card that the child carries between home and school and between lessons in school. Rewards are given for meeting targets. The number of studies looking at this was lower, and their findings did not always agree. But using a daily report card is relatively cheap and easy to implement. It can encourage home-school collaboration and offers the flexibility to respond to a child's individual needs.

Tamsin Ford, Professor of Child Psychiatry at the University of Exeter Medical School, said: "Children with ADHD are of course all unique. It's a complex issue and there is no one-size-fits-all approach. However, our research gives the strongest evidence to date that non-drug interventions in schools can support children to meet their potential in terms of academic and other outcomes. More and better quality research is needed, but in the meantime, schools should try daily report cards and work to increase children's ability to regulate their emotions. These approaches may work best for children with ADHD when delivered one to one."

Credit: 
University of Exeter

New fly species found in Indiana may indicate changing climate, says IUPUI researcher

image: These are various Lucilia blow flies collected.

Image: 
School of Science at IUPUI

INDIANAPOLIS -- A new type of blow fly spotted in Indiana points to shifting species populations due to climate change. Researchers at IUPUI have observed the first evidence of Lucilia cuprina in Indiana, an insect previously known to populate southern states from Virginia to California.

Researchers recorded the L. cuprina species more than two dozen times from 2015 to 2017 in parks and other public places throughout Central Indiana. The fly was observed as far north as Michigan in the 1950s during a short period of warmer temperatures but had not been found in this region since then.

"As temperatures change and increase, the distributions of these insects will continue to change as well," said Christine J. Picard, an associate professor of biology. "There is definitely a northward movement of species -- not just insects, but all species -- as they try to find temperatures where they are more comfortable."

The movement of this species of fly into the Midwest could also have implications for forensic investigations involving decomposing remains. The growth and development of flies play an important role for scientists looking to learn how long a human or animal has been dead.

"With forensic science and forensic entomology, you should have an idea of which flies are present in your location in part because different species will have different development times," Picard said.

The L. cuprina blow fly's sister species Lucilia sericata is widely present in Indiana and is often used in forensic cases. Since the two species are so closely related, it's difficult to tell them apart. If investigators don't know they are dealing with an L. cuprina instead of the more typically seen L. sericata, their data could be inaccurate.

Credit: 
Indiana University-Purdue University Indianapolis School of Science

For preterm infants, skin-to-skin contact affects hormones linked to attachment and stress

October 18, 2018 - For premature infants in the neonatal intensive care unit (NICU), skin-to-skin contact with parents influences levels of hormones related to mother-infant attachment (oxytocin) and stress (cortisol) - and may increase parents' level of engagement with their infants, reports a study in Advances in Neonatal Care, official journal of the National Association of Neonatal Nurses. The journal is published in the Lippincott portfolio by Wolters Kluwer.

Promoting early contact and parental engagement might help to lessen the risk of neurodevelopmental delay associated with preterm birth and NICU care, according to the exploratory study by Dorothy J. Vittner, PhD, RN, CHPE, of University of Connecticut School of Nursing and colleagues. They write, "Parental touch, especially during skin-to-skin contact (SSC) has potential to reduce adverse consequences."

Study Attempts to Measure Benefits of Skin-to-Skin Contact for Preterm Infants

The pilot study included 28 preterm infants, average gestational age 33 weeks. All infants were in stable condition while receiving NICU care. Infants underwent periods of SSC on two consecutive days: once with the mother and once with the father. Saliva samples were collected from infants and parents to measure levels of oxytocin, a hormone that has been linked to maternal-infant attachment; and the stress-related hormone cortisol.

"Oxytocin facilitates social sensitivity and attunement necessary for developing relationships and nurturance for emotional and physical health," the researchers write. Cortisol plays an important role in the "fight or flight" reaction to fear or stress.

Levels of both hormones changed in response to SSC. "Oxytocin significantly increased and cortisol levels decreased for mothers, fathers, and infants during SSC as compared to baseline," Dr. Vittner and coauthors write. The changes indicate the "calming and beneficial impact of SSC for both parents and infants."

Parents also completed a questionnaire called the "PREEMI" (Parent Risk Evaluation and Engagement Model and Instrument) scale, designed to measure attachment between parents and their preterm infants. Overall PREEMI scores indicated a "moderate to high" level of parental engagement for all participants.

Increased oxytocin and decreased cortisol levels during SSC were associated with higher PREEMI scores by the time the infant was discharged from the hospital. "We believe these findings suggest that parents with a lower salivary cortisol as seen with SSC (decreased stress) may facilitate increased parental engagement," Dr. Vittner and colleagues write.

Mothers and fathers had similar increases in oxytocin during SSC. In mothers, the rise in oxytocin was related to increased parental engagement. Unexpectedly, however, increased oxytocin during SSC in fathers was negatively related to parental engagement. Dr. Vittner and colleagues note that for many fathers, the study SSC intervention was the first time they had held their infants.

The study provides new evidence of how SSC might work to promote attachment between parents and premature infants. "The changes in oxytocin and cortisol levels provide robust support to advocate for increased SSC during infancy, especially for the vulnerable infant in the NICU," the researchers write. They note that further studies will be needed to understand these relationships, and how they affect parent-infant relationships - especially in overcoming the obstacles posed by having a premature infant who needs NICU care.

The results also suggest that the PREEMI questionnaire can provide a "window into parent engagement," potentially useful in identifying parents who may need interventions to increase engagement with their premature infant. Dr. Vittner and coauthors conclude: "Uncovering the bio-behavioral basis of early parent-infant interactions is an important step in developing therapeutic modalities to improve infant health outcomes."

Credit: 
Wolters Kluwer Health

Arctic greening thaws permafrost, boosts runoff

image: NGEE-Arctic researchers from Los Alamos, University of Alaska Fairbanks and Oak Ridge National Laboratory dig deep snow pits in tall shrub patches to understand the warming effect of snow-shrub interactions on underlying permafrost.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., Oct. 17, 2018--A new collaborative study has investigated Arctic shrub-snow interactions to obtain a better understanding of the far north's tundra and vast permafrost system. Incorporating extensive in situ observations, Los Alamos National Laboratory scientists tested their theories with a novel 3D computer model and confirmed that shrubs can lead to significant degradation of the permafrost layer that has remained frozen for tens of thousands of years. These interactions are driving increases in discharges of fresh water into rivers, lakes and oceans.

"The Arctic is actively greening, and shrubs are flourishing across the tundra. As insulating snow accumulates atop tall shrubs, it boosts significant ground warming," said Cathy Wilson, Los Alamos scientist on the project. "If the trend of increasing vegetation across the Arctic continues, we're likely to see a strong increase in permafrost degradation."

The team investigated interactions among shrubs, permafrost, and subsurface areas called taliks. Taliks are unfrozen ground near permafrost caused by a thermal or hydrological anomaly. Some tunnel-like taliks called "through taliks" extend over thick permafrost layers.

Results of the Los Alamos study published in Environmental Research Letters this week revealed that through taliks developed where snow was trapped, warmed the ground and created a pathway for water to flow through deep permafrost, significantly driving thawing and likely increasing water and dissolved carbon flow to rivers, lakes and the ocean. Computer simulations also demonstrated that the thawed active layer was abnormally deeper near these through taliks, and that increased shrub growth exacerbates these impacts. Notably, the team subtracted warming trends from the weather data used to drive simulations, thereby confirming that the shrub-snow interactions were causing degradation even in the absence of warming.
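
A minimal sketch of what subtracting a warming trend from a forcing series can look like is below; the data are synthetic and the study's actual detrending procedure is not described in the release, so treat this only as an illustration of the idea.

```python
import numpy as np

# Synthetic illustration of removing a warming trend from a temperature series;
# not the study's actual procedure.
years = np.arange(1980, 2018)
rng = np.random.default_rng(0)
temps = -10 + 0.05 * (years - years[0]) + rng.normal(0, 1.5, years.size)

slope, intercept = np.polyfit(years, temps, 1)      # estimate the linear warming trend
detrended = temps - slope * (years - years.mean())  # remove the trend, keep the long-term mean

print(round(slope, 3))                                   # ~0.05 degC per year recovered
print(round(float(detrended.mean() - temps.mean()), 6))  # ~0: the mean is preserved
```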

The Los Alamos team and collaborators from the Department of Energy (DOE) Office of Science's Next-Generation Ecosystem Experiments Arctic program, which funds this project, used a new Los Alamos-developed fine-scale model, the Advanced Terrestrial Simulator (ATS). It incorporates soil physics and captures permafrost dynamics. The team repeatedly tested results against experimental data from Alaska's Seward Peninsula.

"These simulations of through talik formation provide clues as to why we're seeing an increase in winter discharge in the Arctic," said Los Alamos postdoctoral research associate Elchin Jafarov, first author on the paper.

This model is the first to show how snow and vegetation interact to impact permafrost hydrology with through talik formation on a slope--prevalent across Alaskan terrain. The team, including collaborators from Oak Ridge National Laboratory and the University of Alaska, investigated how quickly through taliks developed at different permafrost depths, their impact on hydrology and how they interrupted and altered continuous permafrost.

Credit: 
DOE/Los Alamos National Laboratory

Winter ticks are killing moose

DURHAM, N.H. - As winter in New England seems to get warmer, fall lingers longer and spring comes into bloom earlier, areas like northern New Hampshire and western Maine are seeing an unusual, continued increase in winter ticks, which are endangering the moose population. Researchers at the University of New Hampshire have found that the swell of infestations of this parasite, which attaches itself to moose during the fall and feeds throughout the winter, is the primary cause of an unprecedented 70 percent death rate of calves over a three-year period.

"The iconic moose is rapidly becoming the new poster child for climate change in parts of the Northeast," said Pete Pekins, professor of wildlife ecology. "Normally anything over a 50 percent death rate would concern us, but at 70 percent, we are looking at a real problem in the moose population."

In the study, published in the Canadian Journal of Zoology, researchers outline the screening of 179 radio-marked moose calves (age nine to 10 months) for physical condition and parasites in the month of January over three consecutive years from 2014 to 2016. They tracked new calves for four months each winter and found that a total of 125 calves died over the three-year period. A high infestation of winter ticks was found on each calf (an average of 47,371 per moose) causing emaciation and severe metabolic imbalance from blood loss, which was the primary cause of death.

Most adult moose survived but were still severely compromised. They were thin and anemic from losing so much blood. The ticks appear to be harming reproductive health so there is also less breeding.

The researchers say winter tick epidemics typically last one to two years. But five of the last 10 years have shown tick infestations, a rare frequency that reflects the influence of climate change. They point out that right now these issues are mostly appearing in southern moose populations, but as climate change progresses they anticipate the problem will reach farther north.

"We're sitting on a powder keg," said Pekins. "The changing environmental conditions associated with climate change are increasing and are favorable for winter ticks, specifically later-starting winters that lengthen the autumnal questing period for ticks."

Fall is considered "questing" season for winter ticks. They climb up vegetation and look to attach to a host. Once they attach, they go through three active life stages (larvae, nymph, and adult) by taking a blood meal and feeding on the same animal. The ticks will feed and remain on one host during their subsequent molts until spring when adult females detach and drop to the ground. Their preferred hosts are moose and other mammals, including deer, elk, caribou, and occasionally horses and cattle. Winter ticks rarely bite and feed on humans.

Credit: 
University of New Hampshire

Prescription opioid and benzodiazepine misuse linked with suicidal thoughts

Misuse of prescription opioids or benzodiazepines (such as Xanax) was associated with suicidal ideation in a study of US older adults.

In the International Journal of Geriatric Psychiatry study of 17,608 adults aged 50 years and older, past-year use (without misuse) of prescription opioids or benzodiazepines was not associated with past-year suicidal ideation. In contrast, past-year opioid misuse was associated with an 84 percent increased odds of past-year suicidal ideation, and past-year benzodiazepine misuse was associated with a twofold increased odds, after controlling for various factors related to suicide in other work. While 2.2 percent of US older adults not engaged in either opioid or benzodiazepine misuse reported past-year suicidal ideation, the rate was 25.4 percent in those who misused both medication classes.
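
To see how those last two prevalences translate into odds, here is an unadjusted back-calculation (a reading aid only; the study's reported estimates are adjusted for other suicide-related factors):

```python
# Unadjusted back-calculation from the reported prevalences; illustration only.
def odds(p):
    return p / (1 - p)

p_no_misuse = 0.022  # past-year suicidal ideation with no opioid/benzodiazepine misuse
p_both = 0.254       # past-year suicidal ideation with misuse of both medication classes

unadjusted_or = odds(p_both) / odds(p_no_misuse)
print(round(unadjusted_or, 1))  # ~15.1, an unadjusted odds ratio for misuse of both classes
```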

"Suicide is a major public health concern in older adults. Our study found a strong link between prescription opioid or benzodiazepine misuse and suicidal ideation, which is particularly concerning because these medications are commonly prescribed to older adults," said lead author Dr. Ty Schepis, of Texas State University. "Prescribers and other health professionals are encouraged to screen for prescription opioid or benzodiazepine misuse in older adults who are prescribed these medications to prevent suicide."

Credit: 
Wiley