Culture

Behavioral interventions may be as effective at reducing food intake as anorectic drugs

Simulations predict that behavioral interventions such as imposing strict no-food restrictions after meals can be as effective as strong anorectic drugs in reducing food intake in rodents, according to a study published December 5 in the open-access journal PLOS Biology by Tom McGrath, Kevin Murphy and Nick Jones of Imperial College London, and colleagues.

A better understanding of feeding behavior is critical for addressing obesity and metabolic syndrome, but the field lacks a standard model that captures the complexity of feeding behavior. In their new study, McGrath, Murphy, Jones and colleagues constructed an accurate mathematical model of rodent feeding to examine the effects of fullness on feeding behavior and how this is modulated by anorectic agents, fasting, and the day/night cycle.

The researchers validated their model and applied it to a substantial dataset of rat feeding behavior collected under a wide variety of conditions. They identified novel mechanisms underlying the behavioral effects of anorectic drugs (such as PYY3-36, lithium chloride, GLP-1 and leptin), and investigated how behavioral interventions can be combined with anorectic drug administration to robustly reduce food intake.

The results suggest that introducing a strict minimum interval between meals or modulating the rate of upper-gut emptying (for instance, by changing the composition of food) can be as effective as administering anorectic drugs. The authors found that it was possible to improve food-intake reduction by optimizing drug administration, but the gains were relatively small. For example, in rodents, enforcing a 45-minute post-meal period during which food was not available and a 20% decrease in gut motility were each as effective at reducing food intake over a 12-hour period as a high dose of the anorectic drug PYY3-36, highlighting the potential of alternative interventions to reduce food intake. Although a direct comparison with humans is difficult to make, these results suggest that changes to diet - particularly changes that slow gut emptying - might be a strong strategy for weight loss.
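To illustrate the kind of model the paper describes, here is a minimal, hypothetical sketch of a fullness-based feeding simulation - not the authors' published model: stomach fullness decays through upper-gut emptying, the chance of starting a meal falls as fullness rises, and the same loop can be re-run with an enforced post-meal lockout or a slower emptying rate. All parameter values (meal size, emptying rate, meal probability) are illustrative assumptions.

```python
# Minimal, illustrative sketch of a fullness-based feeding simulation.
# This is NOT the published PLOS Biology model; all parameters are assumed.
import random

def simulate(hours=12, lockout_min=0, emptying_rate=1.0, seed=0):
    """Return total food intake (g) over `hours`, stepping minute by minute."""
    random.seed(seed)
    fullness = 0.0              # arbitrary "stomach fullness" units
    intake = 0.0                # total grams eaten
    minutes_since_meal = 10**6  # start far from the last meal
    for _ in range(hours * 60):
        fullness *= (1.0 - 0.005 * emptying_rate)   # upper-gut emptying per minute
        minutes_since_meal += 1
        # Probability of starting a meal falls as fullness rises;
        # an enforced lockout blocks new meals for `lockout_min` minutes after eating.
        p_meal = max(0.0, 0.05 * (1.0 - fullness))
        if minutes_since_meal >= lockout_min and random.random() < p_meal:
            intake += 0.5                            # grams per meal (assumed)
            fullness = min(1.0, fullness + 0.3)      # a meal raises fullness
            minutes_since_meal = 0
    return intake

baseline = simulate()
lockout = simulate(lockout_min=45)       # 45-minute post-meal lockout
slow_gut = simulate(emptying_rate=0.8)   # 20% slower upper-gut emptying
print(f"baseline: {baseline:.1f} g, 45-min lockout: {lockout:.1f} g, "
      f"20% slower emptying: {slow_gut:.1f} g")
```

Comparing the three runs gives a rough sense of how a behavioral constraint and a physiological change can both lower intake within the same fullness framework.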

"The surprise for us," said Nick Jones, "was both that a relatively simple model of feeding, which hinges on a notion of fullness, was so predictive of the complex feeding patterns of individual rodents. We further hadn't anticipated that the effects of anorectic agents could be mimicked with such mild behavioral interventions.'"

Credit: 
PLOS

Ratcheting up NBA rookie salaries may incentivize athletes to finish college

image: The NBA's current rookie salary scale awards the highest salary to the player picked first in the draft and dwindles with each successive pick. Two sports-enthused business school faculty at the University of Vermont propose a new salary structure, designed to see more athletes graduate, that considers both draft pick and class year.

Image: 
University of Vermont Athletics

BURLINGTON, VT - Going pro early may be a no-brainer for exceptional, young basketball stars like former Duke freshman and 2019 NBA draft first-pick Zion Williamson. But a study in the "International Journal of Sport Finance" proposes a new salary structure that might entice most other college players considering the NBA to graduate before trying their hand at going pro.

"Zion Williamson is a classic example of a strangely strong signal that foregoing the remaining time in college is rational--from a basketball perspective, we're not talking about his education or degrees--but from a basketball learning perspective, he had nothing more to gain from playing for Duke. So he should go to the pros and get the contract," says Michael J. Tomas III, finance professor at the University of Vermont (UVM) and co-author of the study with UVM accounting professor Barbara Arel.

Noting that the average NBA career length is 4.8 years, Arel and Tomas reimagined the NBA's rookie salary scale--which currently awards the highest salary to the player picked first in the draft and dwindles down with each successive player picked--in a way that considers both draft pick position and class year.

"This is our attempt to show that you could alter the NBA draft schedule to try to incentivize students to stay. There's been a big discussion about people leaving early to go to the NBA draft and I think that revolves around the idea of wanting to see them get an education," says Tomas.

Their study proposes a pay scale that locks in salary gains as athletes advance toward graduation, incorporating yearly, class-year-based bonuses into their salaries. Specifically, it offers drafted freshmen 60 percent of the current NBA rookie salary base and ratchets up to 120 percent for drafted graduates in the same draft position.

Inspired by ratchet (or cliquet) options in the finance industry, the sports-enthused business school professors say that ratcheting up rookies' salaries this way would ultimately "provide the incentive for players to delay entering the draft until they are ready to contribute to the NBA, but still allows an early exercise decision to remain rational for the very top prospects" - prospects like Williamson, for example, who are likely to be picked first and go on to earn multiple contracts throughout their NBA careers. For these players, the sooner they are drafted, the sooner they can earn non-rookie salaries and contracts.
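As a rough numerical illustration of the proposed ratchet, the sketch below uses only the two endpoints quoted above (60 percent of the current rookie base for drafted freshmen, 120 percent for drafted graduates) and assumes, purely for illustration, even 20-point steps for sophomores and juniors; the paper's actual schedule may differ.

```python
# Hypothetical illustration of a class-year "ratchet" on the rookie salary scale.
# Only the 60% (freshman) and 120% (graduate) endpoints come from the article;
# the even 20-point steps for sophomores and juniors are assumed for illustration.
MULTIPLIERS = {"freshman": 0.60, "sophomore": 0.80, "junior": 1.00, "graduate": 1.20}

def ratcheted_salary(current_scale_salary: float, class_year: str) -> float:
    """Scale a pick's current rookie-scale salary by the class-year multiplier."""
    return current_scale_salary * MULTIPLIERS[class_year]

# Example: a draft slot worth $5.0M on today's scale, under each class year.
for year, mult in MULTIPLIERS.items():
    print(f"{year:>9}: ${ratcheted_salary(5_000_000, year):,.0f}  ({mult:.0%} of current scale)")
```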

In turn, the researchers argue, the NBA's labor market would improve as a whole, as drafted athletes would enter the league better prepared for the professional game after those additional years of college basketball experience. Though the cost and burden of additional training would shift from the NBA to colleges and universities, those schools would retain top talent that might otherwise leave, while the NBA bears the brunt of the financial incentive that keeps players in college.

"With this system you wouldn't have universities that are already facing financial difficulties, trying to pay players to come and make their sports teams better," says Tomas, adding that it also maintains the competitive landscape and possibility for underdog victories and March Madness upsets that fans have come to appreciate.

Credit: 
University of Vermont

Obesity surgery improves the heart

Vienna, Austria - 5 December 2019: The benefits of bariatric surgery for obese individuals go beyond weight loss, according to a study presented today at EuroEcho 2019, a scientific congress of the European Society of Cardiology (ESC).

Study author Dr. Marie-Eve Piché of the Quebec Heart and Lung Institute in Canada said: "Bariatric surgery was conceived for weight loss; our study indicates it may also reverse subclinical heart dysfunction. Since this abnormality predicts adverse cardiovascular events and mortality, its reversal could translate into improved prognosis."

Obese individuals have a doubled risk of cardiovascular disease. Risk rises exponentially with the number of obesity-related comorbidities: namely type 2 diabetes, hypertension, and dyslipidaemia. Obesity and its comorbidities affect cardiac structure and function before overt heart disease occurs - so-called subclinical heart disease, which can be detected with cardiac imaging.

This study examined the effects of bariatric surgery on subclinical heart disease. It also looked at the impact on diabetes, hypertension, and dyslipidaemia.

The study included 38 obese patients who underwent bariatric surgery and 19 patients, matched for age and sex, who remained on the surgical waiting list. Measurements at the beginning of the study (before bariatric surgery) and at six months included echocardiography, body weight, blood pressure, blood lipids, and blood glucose.

By six months, patients in the surgery group had lost 26% of their total body weight, while those on the waiting list stayed the same weight. Rates of comorbidities at six months were significantly lower in the surgery group compared to the waiting list group: 30% versus 61% had hypertension, 5% versus 42% had dyslipidaemia, and 13% versus 40% had type 2 diabetes, respectively.

Some 22 patients (58%) in the surgery group had subclinical heart disease at the start of the study. In 82% of those patients, subclinical heart function had normalised at six months after surgery. In contrast, subclinical disease worsened in 53% of patients on the waiting list during the same period.

Further evidence of improved subclinical heart function in the surgery group was obtained by comparisons with 18 age-and-sex-matched healthy weight controls. Before surgery, obese patients had worse subclinical heart function than healthy weight controls. At six months, subclinical heart function was similar in both groups.

Dr. Piché said: "We show that abnormal subclinical heart function in obese patients is common. Early (six months) after bariatric surgery, it normalises in more than 80% of patients and is comparable to normal weight controls."

"Obesity-related comorbidities also get better," she added. "Interestingly, remission of type 2 diabetes after bariatric surgery was associated with improvement in subclinical heart function. Conversely, obese individuals with type 2 diabetes who remained on the surgical waiting list showed a worsening in their subclinical myocardial function during follow-up."

Dr. Piché said: "Bariatric surgery is an effective approach for weight loss. Our study suggests it has additional benefits on subclinical heart function, type 2 diabetes, hypertension, and dyslipidaemia. Longitudinal studies are needed to show if these changes reduce cardiovascular disease."

Credit: 
European Society of Cardiology

Genome testing for siblings of kids with autism may detect ASD before symptoms appear

One of the key priorities of interventions for autism spectrum disorder (ASD) is starting early, with some evidence showing infants as young as seven months old could benefit. Yet most children in North America aren't diagnosed with ASD until they're over four years of age. New research led by The Hospital for Sick Children (SickKids) and the University of Alberta, published on December 5, 2019 in Nature Communications, has found that testing the DNA of siblings of individuals with ASD may be predictive of a future diagnosis even if symptoms aren't yet apparent.

ASD refers to a group of neurodevelopmental conditions resulting in challenges related to communication, social understanding and behaviour. Studies show families who have a child with ASD have a 6.9 to 19.5 per cent chance of another child having ASD and a 30 to 40 per cent chance of another child having atypical development.

Genomic factors linked to ASD-related traits

"Genetic factors are the most likely reason we see a clustering of ASD related traits in families," explains Dr. Stephen Scherer, Senior Scientist and Director of The Centre for Applied Genomics (TCAG) at SickKids, Director of the McLaughlin Centre at the University of Toronto and principal investigator of the study. "We wanted to investigate the possible benefits of genetic testing for infants whose older sibling had already been diagnosed with ASD. If we can identify those children early, we may be able to enrol them earlier in therapies."

The researchers looked for copy number variations (CNVs) - genetic alterations that have been linked to ASD - in 288 infant siblings from 253 families. By age 3, 157 siblings had either been diagnosed with ASD or were developing atypically. DNA testing revealed CNVs in genes relevant to ASD in 11 (7 per cent) of the 157 siblings who were eventually diagnosed.

The study found that the presence of an ASD-relevant CNV in a sibling had a high likelihood of predicting a future diagnosis of ASD or atypical development. This marks the first time that scientists have been able to quantify the predictive value of CNVs in determining these diagnoses.

Early identification could lead to earlier intervention

"These findings add to a growing body of evidence that biomarkers might be helpful in identifying pre-symptomatic infants who are likely to develop ASD or other developmental challenges," says Dr. Lonnie Zwaigenbaum, Professor of Pediatrics, Stollery Children's Hospital Foundation Chair in Autism and Stollery Science Lab Distinguished Researcher at the University of Alberta.

"At this point, we can't fully determine the anticipated severity of a child's future symptoms. What we can say is that it's important to closely monitor their development and start therapeutic interventions early to support their skill development and address emerging functional impairments related to ASD."

The research team has confirmed similar findings in a separate group of 2,110 families, each having one child with, and a second child without, ASD. Their next step will be to look beyond CNVs and determine how newer technologies - like whole genome sequencing - might increase the early genetic detection rate.

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Researchers: Put a brake on bioenergy by 2050 to avoid negative climate impacts

Los Altos, California (5 DECEMBER 2019)--The burgeoning bioenergy sector must peak and decline in the next 30 years to alleviate extreme pressure on land, warn researchers in a new analysis published today in Global Change Biology. They assert that projections envisioning the use of biomass from crops, trees or grasses for fuel through 2100 overlook the technology's high carbon footprint and excessive land use.

"As countries worldwide are seeking renewable energy alternatives to coal, oil and other carbon-spewing fossil fuels, we find ourselves at a crossroads--and how we proceed can make or break the renewable energy sector," said Walt Reid, the lead author of The Future of Bioenergy, and the director of the conservation and science program at the David and Lucile Packard Foundation. "If we listen to the latest science, it's clear that bioenergy opportunities are mostly short-term or limited. In the long term, land-intensive bioenergy is not only inferior to wind, solar and other best bet green technologies, it also can be a major source of carbon emissions. With the exception of bioenergy from waste and ecosystem-improvement projects, it simply doesn't make sense for the climate to invest in bioenergy. It must be on its way out by 2050."

An Intergovernmental Panel on Climate Change (IPCC) report released last year found that many scenarios capable of reducing the threat of climate change relied heavily on bioenergy, predicting that energy from biomass could make up 26% of primary energy in 2050 (up from 10% in 2020) while solar and wind combined would likely account for only 22%. Those scenarios often relied on significant use of bioenergy with carbon capture and storage (BECCS), which involves growing trees across a large area of land to produce wood pellets burned for energy, then capturing and sequestering the carbon emissions. In its analysis, though, the IPCC found significant challenges associated with a high reliance on bioenergy, noting in particular that the vast areas of land required to produce biomass for energy would compete with food production and other human needs.

"With a growing world population to feed, and a climate emergency to tackle, society needs to become much smarter in how it uses our limited land resources," said Pete Harrison, executive director for EU Policy at the European Climate Foundation. "We should prioritize sources of bioenergy that do not use land, such as wastes and residues, and steer clear of using sources that leave a heavy footprint on agricultural land or forests. There is clear evidence that many policymakers have been making wrong choices; using tax payers' money to support bioenergy projects that cause deforestation; and it is now time to learn from those mistakes."

The authors of the new Global Change Biology assessment examine a flurry of recent reports that suggest even more problems with large-scale bioenergy projects reliant on large tracts of land, and also show that more cost-effective alternatives will be available in the coming decades. Pulling from these recent studies, the authors establish three reasons why large-scale bioenergy must and can peak and decline in the next 30 years:

Large-scale bioenergy emits carbon. Carbon emissions from bioenergy can be greater in the near-term than emissions from the fossil fuels it is replacing, undermining the assumption that bioenergy is always a relatively low-emission and low-cost form of energy. Burning wood pellets, for example, creates a "double climate problem." Manufacturing and shipping wood pellets entails substantial emissions of fossil CO2, and it can take decades or centuries for harvested areas to return to pre-harvest carbon stocks.

Large-scale bioenergy puts a squeeze on land. Land is already a scarce resource, and it will become even scarcer with time due to an increase in the human population and a rise in the appreciation of the conservation value of natural and mostly-natural ecosystems--even if agricultural yields continue to increase. Because land is so limited, we should use it as efficiently as possible for energy production. In contrast to land-intensive bioenergy, the amount of electricity that can be produced from a hectare of land using photovoltaics is at least 50-100 times that from biomass.

Large-scale bioenergy is inferior to other solutions. By mid-century, land-intensive bioenergy will face fierce competition from superior technologies such as wind and solar energy, the development of efficient storage and other flexibility solutions, and the advent of more effective carbon removal technologies such as direct air capture with carbon storage.

"The evidence is piling up that an energy system based on dedicating vast amounts of land for bioenergy simply uses too much land," said Reid. "More promising energy solutions--from solar power farms to carbon capture technologies--have the potential to provide much more energy from much less land in a post-2050 world. Investors are wise to think strategically about the long-term landscape of superior competitors, as well as the short-term trends."

The assessment comes at a time when the bioenergy industry is ramping up worldwide, with the European Union in the lead. Bioenergy currently accounts for 10% of the world's energy, and 50% of our renewable energy. In the European Union, bioenergy accounts for two-thirds of all renewable energy (nearly half from wood).

Two-thirds of the EU's "20% renewable energy by 2020" target depends on bioenergy. And the bloc is also about to greenlight the conversion of five large coal plants to bioenergy plants that burn imported wood pellets from overseas forests.

Land-intensive electrical power projects in particular are picking up steam as governments and industry leaders seek to transform disused coal plants into new profit centers. Between 2006 and 2015, the production of wood pellets for biomass energy use quadrupled to 26 million tons. Worldwide, demand for globally traded wood pellets destined for use in phased-out coal plants or new dedicated bioenergy plants is expected to rise 250% by 2027.

"Our assessment shows that before the EU and other countries commit to decades of expanding this technology, they must hit the pause button to recognize that bioenergy is actually increasing carbon emissions and to assess the worrying impact of dramatically increasing bioenergy on the world's most contested resource: land," said Reid. "Our need for food, conservation and the restoration of forests is simply incompatible with greatly enlarged bioenergy projects in need of land."

The study lays out a bioenergy trajectory that policymakers can use to encourage sustainable bioenergy while also opening the door for new technologies to replace land-intensive bioenergy in the very near future. These recommendations include improved accounting of the actual carbon emissions associated with the use of biomass, favoring biomass from waste, residues or land management practices that enhance carbon storage, and providing incentives for energy storage, direct air capture technologies, and low-carbon alternatives to fossil fuels.

Above all, the authors argue that bioenergy projects should be avoided if they involve natural forests, such as converting natural forests to bioenergy plantations, or use land best suited for food crops. And the authors caution that claims that bioenergy projects are a zero-carbon form of energy should be met with skepticism.

"Appropriate bioenergy can be an important part of solving the climate crisis and improving ecosystems, but if current bioenergy trends continue over the next few decades unabated, driven by well-intentioned but poorly conceived clean energy incentives, tax payers and investors may find themselves spending tens of billions in public subsidies to prop up a fuel destined for the waste basket of history, instead of directing those investments to sure bets for a zero-carbon future, such as solar and wind," Reid said.

Credit: 
Burness

Gene expression regulation in Chinese cabbage illuminated

image: Tri-methylation of histone H3 at lysine residue 27 (H3K27me3).

Image: 
Kobe University

Doctoral student Ayasha Akter (Kobe University's Graduate School of Agricultural Science) and technical staff member Satoshi Takahashi (from the RIKEN Center for Sustainable Resource Science) have revealed the important role played by the histone modification H3K27me3 in regulating gene expression in Chinese cabbage. In addition, they illuminated the role of H3K27me3 in vernalization - a vital process for enabling Brassica rapa vegetables, such as Chinese cabbage, to flower.

The collaborative research group consisted of members from Kobe University's Graduate School of Agriculture, the RIKEN Center for Sustainable Resource Science, and CSIRO Australia.

These results were initially published online in the journal 'DNA Research' on October 17, 2019.

Introduction

Histones are proteins that play a role in gene regulation - they also package DNA into structures called nucleosomes. A nucleosome is a section of DNA that is wrapped around a core of eight histone proteins - this core is known as a histone octamer (Figure 1). Histone methylation is a process whereby certain amino acids in a histone protein are modified by the addition of one, two or three methyl groups. Methylation of histones changes the packaging of the DNA and allows transcription factors or other proteins to access the DNA and turn the genes 'on' or 'off'. In this way, histones can change gene expression but not the DNA sequence. This is known as an epigenetic change. In the case of H3K27me3, three methyl groups are added to lysine residue 27 (K27) of histone H3. H3K27me3 is often associated with gene silencing.

The whole genome sequence for Chinese cabbage (Brassica rapa L.) has been available since 2011. However, the exact role that H3K27me3 plays was not well understood. This study revealed the importance of H3K27me3 in vernalization - the process by which Chinese cabbage flowers after cold exposure.

Methodology and results

For this study, two inbred lines of Chinese cabbage were used (T23 and T24). These lines were selected to investigate the role of H3K27me3 in tissue-specific gene expression because they are genetically similar to the reference whole genome sequence published in 2011. The research team compared the number of genes marked by H3K27me3 between the two lines, and between different tissues. They found that the distribution of H3K27me3 was similar between the two lines. However, there was greater variation in H3K27me3 distribution across different tissues within a line. In other words, it was found that genes with H3K27me3 marks showed tissue-specific expression (Figure 2). This suggests that this histone modification plays a role in the regulation of gene expression during plant tissue development.

Next, they investigated the relationship between the level of gene expression and the presence of H3K27me3. To do this, they measured the level of expression in the genes that were modified by H3K27me3 in all the samples (around 8,000 genes), and compared this with the level of expression in all the genes (around 40,000 genes). They found that the level of expression in genes that were modified by H3K27me3 was lower. These results revealed that H3K27me3 plays a role in repressing gene expression. Furthermore, it was found that the level of H3K27me3 was high in genes responsible for tissue-specific expression. This suggests that H3K27me3 regulates gene expression during plant development.

Chinese cabbage belongs to the same plant family (Brassicaceae) as Arabidopsis thaliana. Therefore, the researchers decided to compare the presence of H3K27me3 in these closely related plants. It was found that 40% of the modified genes in the two species were altered in a similar way by H3K27me3, suggesting that these modifications remained after speciation. This showed that H3K27me3 modification plays a vital role in both species.

Lastly, an experiment was conducted to see how H3K27me3 levels changed before and after vernalization. Vernalization refers to the process in which some varieties of plant flower after a period of exposure to cold weather.

In a previous study, it was revealed that the gene FLOWERING LOCUS C (FLC) plays a key role in preventing B. rapa vegetables from flowering before vernalization. The expression of the FLC gene ensures that the plant cannot flower before it is exposed to cold weather - this helps prevent the plant from flowering in winter. The FLC gene is repressed when the plant undergoes a cold period - allowing it to flower once temperatures become warmer.

To mimic the conditions for vernalization, Chinese cabbage and Arabidopsis thaliana plants were kept at a low temperature of 4°C for a four-week period. The results showed no large changes in H3K27me3 levels at the genome-wide level. However, H3K27me3 accumulated in the FLC genes of both species after cold exposure. In Chinese cabbage, it was observed that H3K27me3 accumulated near FLC transcription initiation sites. In other words, H3K27me3 levels were high where the FLC genes were turned 'off'. When normal growing temperatures returned (22°C), H3K27me3 spread along the entirety of the FLC paralogs. This suggests that H3K27me3 also plays an important role in ensuring that the FLC gene remains repressed after vernalization.

Further Research

Overall, this research showed that the histone modification H3 lysine 27 tri-methylation plays important roles in regulating the expression of genes responsible for tissue differentiation. The researchers also revealed the indispensable role of H3K27me3 in controlling the expression of FLC during vernalization.

Further research into the role H3K27me3 plays in individual genes, as well as into differences in H3K27me3 levels between cultivars, would deepen understanding of the role of this histone modification.

If successful, experiments using H3K27me3 to suppress FLC could enable flowering time to be controlled artificially. This knowledge could be utilized in agriculture to improve the cultivation rates of B. rapa vegetables.

Credit: 
Kobe University

Preterm births more likely when dads live in lower income areas

Lifelong lower socioeconomic status of fathers, as defined by early life and adulthood neighborhood income, is a newly identified risk factor for early preterm birth (at less than 34 weeks), according to a study published in Maternal and Child Health Journal. The rate of early preterm births was three times higher when fathers lived in lower income neighborhoods, regardless of the mother's age, marital status, education and race or ethnicity.

"We knew that the mother's socioeconomic status is a risk factor for preterm birth, but this is the first time that the father's status is linked to prematurity, even when the mother did not have high-risk demographics," says lead author James Collins, MD, MPH, Medical Director of the Neonatal Intensive Care Unit at Ann & Robert H. Lurie Children's Hospital of Chicago and Professor of Pediatrics at Northwestern University Feinberg School of Medicine. "The father's lifelong class status needs to be taken into account when designing initiatives to reduce the number of early preterm births among urban women."

For this study, Dr. Collins and colleagues analyzed the Illinois transgenerational birth file of infants (born 1989-1991) and their parents (born 1956-1976) with appended U.S. census income data.

"Our results add to the mounting evidence suggesting that socioeconomic status is one of the most important drivers of worse pregnancy outcomes in the United States, which has one of the highest rates of preterm birth among the developed countries," says Dr. Collins. "We need to address the social influencers of health for both parents in order to decrease preterm birth rates in this country."

Credit: 
Ann & Robert H. Lurie Children's Hospital of Chicago

Island 'soundscapes' show potential for evaluating recovery of nesting seabirds

image: Islands isolated from the mainland provide critical nesting sites for seabirds such as this fork-tailed storm petrel on Buldir Island, Alaska.

Image: 
Ian L. Jones

Nocturnal seabirds nesting on remote islands can be extremely difficult to study. An increasingly important tool for monitoring these populations involves acoustic sensors deployed in the field to record sounds over long periods of time. But analysis of the resulting recordings to identify and count the calls of different species can be time-consuming, even with computers and artificial intelligence.

An alternative approach for acoustic monitoring is to evaluate all of the sounds in an environment as a 'soundscape', using features such as acoustic diversity, complexity, and intensity as indicators of ecosystem health. In a new study, recently published in Restoration Ecology, researchers used soundscape analysis to evaluate the outcomes of restoration efforts in the Western Aleutian Islands.

"We learned that we can collect sound data at larger scales and understand where the ecosystem is responding to our actions," said first author Abraham Borker, who led the study as a graduate student in the Conservation Action Lab at UC Santa Cruz. Now an adjunct lecturer in ecology and evolutionary biology, Borker came to UCSC to study seabird conservation and earned his Ph.D. in 2018.

"Seabirds are special because they serve as a link between the ocean and land, transporting nutrients from the sea to land and transforming the islands they breed on," he said.

The new study took advantage of a highly successful campaign to remove invasive species and restore seabird nesting colonies in the Western Aleutian Islands as a natural experiment. These remote islands, located in the Northern Pacific Ocean between Russia and Alaska, are important nesting sites for many species of seabirds.

In the 18th century, fur traders introduced arctic foxes to the islands, and they quickly began devastating seabird populations. Later, rats brought to the islands during World War II added to the plight of nesting seabirds. The islands are now part of the Alaska Maritime National Wildlife Refuge managed by the U.S. Fish and Wildlife Service, which in 1949 began efforts to remove the invasive species and restore healthy seabird colonies.

Today, some of the islands still have invasive predators while others have had them removed for varying amounts of time. One island that was never inhabited by foxes or rats was used as a reference for comparison to restored islands, representing a healthy seabird island. Overall, the varying timelines for the removal of invasive species from each island created a perfect opportunity for Borker to test if soundscapes reflect the recovery of island seabirds.

Traditional acoustic monitoring techniques had already been used to evaluate the recovery of seabirds on these islands. That work, led by Rachel Buxton of Colorado State University (a coauthor of the new study) and published in 2013, involved identifying and counting individual bird calls.

"It's quite a laborious process, even with automated tools," Borker said. Using the recordings from the earlier study, the researchers performed soundscape analyses and compared the results with the previous findings to see if the soundscape approach could assess restoration outcomes at a fraction of the effort.

Borker and his team characterized over 800 hours of recordings from across the islands, using a variety of "eco-acoustic" indexes to quantify features of the recordings indicative of each island's ecological conditions. These indexes enabled the researchers to compare the complexity, richness, and similarities of soundscapes between different environments.
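For readers unfamiliar with eco-acoustic indexes, the sketch below computes one widely used example, the Acoustic Complexity Index, from a spectrogram. It is offered only as an illustration of the general approach; it is not necessarily the acoustic richness index that performed best in this study, and the file name and parameters are placeholders.

```python
# Sketch: computing the Acoustic Complexity Index (ACI), one common eco-acoustic
# index, from a mono recording. Illustrative only; not the specific acoustic
# richness index reported in the study. The file name is a placeholder.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("island_recording.wav")    # placeholder file
if audio.ndim > 1:
    audio = audio.mean(axis=1)                        # mix to mono

# Spectrogram: rows = frequency bins, columns = time frames.
freqs, times, sxx = spectrogram(audio.astype(float), fs=rate, nperseg=512)
magnitude = np.abs(sxx)

# ACI: for each frequency bin, sum the absolute frame-to-frame intensity changes
# and normalize by the bin's total intensity, then sum across bins.
diffs = np.abs(np.diff(magnitude, axis=1)).sum(axis=1)
totals = magnitude.sum(axis=1) + 1e-12                # avoid division by zero
aci = float((diffs / totals).sum())
print(f"Acoustic Complexity Index: {aci:.2f}")
```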

Using mathematical models, they compared the soundscape analyses to the results from the 2013 study. They found that, of the indexes tested, the acoustic richness index, as well as indexes of acoustic similarity to the reference condition, performed best in indicating seabird recovery on the islands. Their findings suggest that eco-acoustic indexes can provide an initial rapid analysis, identifying coarse differences in the amount of seabird acoustic activity between restored areas and a reference site. Their analysis also identified time since invasive species removal as the chief predictor of seabird acoustic activity, as did the 2013 study.

While promising, Borker said, this approach to soundscape monitoring isn't perfect, but it can scale in a way that other approaches struggle to match. In one case, an island still invaded with rats and with a low seabird nesting population yielded index values similar to an uninvaded and healthy island. Heavy winds and rain during the acoustic recordings may have influenced the indexes in this case, highlighting the importance of understanding the non-biological differences between recording sites, he said.

Borker is now working to get information about soundscape analysis into the hands of conservationists. Organizations such as the Nature Conservancy have already begun taking this soundscape acoustic monitoring approach to other environments.

"This tool can be used to measure change in not one or two species, but an entire ecosystem over time," said coauthor Bernie Tershy, a research biologist in the Institute of Marine Sciences at UC Santa Cruz. This approach could be used to analyze the changes in all of the species in an environment--frogs, birds, insects, and more--all at once, he said.

Sometimes it's unclear which restoration method is best, Borker said, but by using this holistic approach researchers may be able to understand which actions are more effective. Finding highly automated tools to measure the status of restored sites is critical to scaling conservation and restoration efforts worldwide.

Borker's efforts have shown that this is now possible, in less time and with fewer resources than standard acoustic monitoring. "It's exciting--we can measure conservation outcomes across a community and at large scales," he said.

Credit: 
University of California - Santa Cruz

New report: Teacher effectiveness has a dramatic effect on student outcomes

A new IZA World of Labor report, publishing tomorrow, 05/12/19, finds that teacher effectiveness has a strong effect on pupils' attainment. It goes on to look at ways to increase it, including reforming hiring practices and reforming teacher training and development.

Teacher effectiveness is the most important component of the education process within schools for pupil attainment. According to economist Simon Burgess of the University of Bristol, one estimate suggests that, in the US, replacing the least effective 8% of teachers with average teachers has a present value of $100 trillion. While there is a good understanding of how teachers' effectiveness can be measured, in this IZA World of Labor report Burgess stresses the importance for politicians of looking at ways to raise teacher effectiveness.

Burgess cites a number of studies from different countries which produced similar estimates of the impact of teacher effectiveness. These estimates have been shown to be robust and are supported by studies using experimental assignment of teachers to classes. The results show that variations in teacher effectiveness are extremely important in understanding pupils' attainment. In fact, it seems that no other attribute of schools comes close to having this much influence on student achievement. One of the most striking results is that replacing the lowest-performing 5-10% of teachers with average teachers would deliver extremely large net present value gains. One study estimates that replacing the 5% least effective teachers with average teachers would yield around $9,000 per classroom per year in future pupil earnings due to better education. Pupils taught by highly effective teachers earn more, are more likely to go to university, and are more likely to live in richer neighborhoods.
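To see how a recurring per-classroom gain of that size feeds into a present-value figure, here is a toy discounting calculation; the $9,000-per-classroom-per-year estimate is quoted above, while the 3% discount rate and 30-year horizon are assumptions chosen purely for illustration.

```python
# Toy present-value calculation for a recurring per-classroom earnings gain.
# The $9,000/classroom/year figure is quoted in the report summary above;
# the 3% discount rate and 30-year horizon are illustrative assumptions.
annual_gain = 9_000          # future pupil earnings gained per classroom per year
discount_rate = 0.03
years = 30

present_value = sum(annual_gain / (1 + discount_rate) ** t for t in range(1, years + 1))
print(f"Present value per classroom: ${present_value:,.0f}")   # roughly $176,000
```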

If teacher effectiveness is that important, we need to look at methods to improve it. Burgess summarizes a number of studies looking at teacher selection processes, teacher training methods and teacher evaluation, and how these can be used to improve teacher performance. Though more research needs to be done, three areas in particular seem to hold the greatest promise when it comes to improving teachers' effectiveness: (i) improving teacher selection and hiring procedures (by, for example, replacing ineffective teachers at an early point in their career), (ii) reforming teacher contracts and the tenure/retention decision (currently highly effective teachers are more likely to leave their job), and (iii) re-thinking teacher professional development (through, for example, personalized teacher coaching).

Burgess concludes: "Teacher effectiveness should be a central concern for education policymakers... The potential size of the impact of improving teacher effectiveness represents a truly grand prize for the countries, cities, and schools which manage to crack the code of how to raise teacher effectiveness."


Credit: 
IZA World of Labor

Brachytherapy proves effective in treating skin cancer

image: Patient's ear pre-treatment - An elderly patient on anti-coagulation therapy with squamous cell cancer of the ear.

Image: 
Study author and RSNA

CHICAGO - The use of high-dose-rate brachytherapy to treat elderly patients with common skin cancers offers excellent cure rates and cosmetic outcomes, according to a new study presented today at the annual meeting of the Radiological Society of North America (RSNA).

Squamous cell carcinoma (SCC) and basal cell carcinoma (BCC) are the most common types of skin cancer, affecting 3 million Americans each year. Although they are highly curable and less dangerous than melanoma, they can be disfiguring and costly to treat. Treatments for squamous and basal cell carcinomas include surgical removal and conventional, or external beam, radiation therapy.

"For elderly patients who don't heal as well and may have additional medical problems, surgery may not be the best option," said Ashwatha Narayana, M.D., chairman of the Department of Radiation Oncology at Northern Westchester Hospital in Mount Kisco, N.Y. "If the affected area is the tip of the nose, ear or on the eyelid, multiple surgeries and skin grafting may be required."

In high-dose-rate brachytherapy, a precise dose of radiation is delivered to the cancerous cells through catheters implanted into a custom-fitted applicator. Unlike six-week external beam radiation therapy, in which treatment sessions can last up to six hours, a course of high-dose-rate brachytherapy includes six three-minute sessions over two weeks.

"Treatment with external beam radiation therapy can be too long and painful for elderly patients," Dr. Narayana said. "It also exposes healthy tissue around the lesion to radiation, which can increase side effects. Brachytherapy delivers a higher dose of radiation directly to the tumor while sparing healthy tissue nearby."

According to Dr. Narayana, brachytherapy patients have minimal recovery time and typically experience few or no side effects that can be associated with the treatment, such as nausea, hair loss or diarrhea. They can also return to normal activities after the procedure.

In the study, radiologists used high-dose-rate brachytherapy to treat 70 patients between the ages of 70 and 100 (median age 85 years) with early-stage BCC and SCC. A total of 81 lesions (BCC: 53, SCC: 28) on the nose, face, forehead, scalp, ear, neck and legs were treated between 2013 and 2019. Lesions ranged in size from 3 to 26 millimeters (mm), with a median of 10 mm. Patients were followed for up to four years (median follow-up: 2 years).

"We had a cure rate of 96% in patients with squamous cell carcinoma and 98% in patients with basal cell carcinoma, and cosmetic outlook was excellent in 90% of cases," Dr. Narayana said. "This is a great treatment option compared to surgery."

Despite being a well-recognized treatment that is used routinely to treat other types of cancers, Dr. Narayana said brachytherapy has failed to catch on for the treatment of non-melanoma skin cancers on the face and neck. He hopes results of his study and future research will help raise awareness of high-dose-rate brachytherapy as an alternative to surgery and external beam radiation therapy.

"High-dose-rate brachytherapy is a powerful way of treating skin cancers in both elderly and younger patients," he said. "The results are impressive."

Credit: 
Radiological Society of North America

Medical marijuana cards often sought by existing heavy users

PISCATAWAY, NJ - Young adults who seek enrollment in state medical marijuana programs are often those who already use heavily rather than those with mental or physical issues that could be addressed by the drug. That's according to new results published in the Journal of Studies on Alcohol and Drugs.

"Making medical marijuana cards easy to obtain for vaguely defined mental or physical health conditions that are not supported by any research evidence has potential for those who use more heavily to claim need for a medical marijuana card solely to have easier access," says lead author Eric R. Pedersen, Ph.D., of the RAND Corporation in Santa Monica, Calif.

These results have implications for states that have legalized medical marijuana or are considering such legislation. Policymakers, according to the authors, should "design medical marijuana programs in their states that allow card acquisition only for people with mental and physical health problems that have documented evidence of medicinal benefit."

The researchers analyzed data from a long-term study of substance use prevention in California. Beginning in 2008, participants were followed from middle school through young adulthood. For the current investigation, the study contained data from 264 participants who were 18 to 20 years old in 2015 to 2017, when medical marijuana was legal in California but before outlets in that state were allowed to sell legal recreational marijuana beginning January 2018.

All participants had to have used marijuana at least once in the past month at the beginning of the study, but none had obtained a "medical marijuana card," which would allow them to buy, possess and use marijuana. Both initially and one year later, they were asked about their marijuana use, symptoms of depression and anxiety, and physical health problems.

At the one-year follow-up, 19 percent of participants had sought out and received a medical marijuana card. Overall, men were 2.91 times as likely as women to have received a medical marijuana card at follow-up.

In addition, for every additional day of marijuana use at the beginning of the study, the odds of receiving a medical marijuana card one year later increased by 7%. After statistically controlling for confounding variables, the authors found that physical and mental health problems at baseline (i.e., the problems one would ostensibly seek a medical marijuana card for) were not significant predictors of receiving a card in that follow-up period.
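Because the 7% figure is an increase in the odds per additional day of baseline use, the effect compounds multiplicatively across days; the short sketch below illustrates the arithmetic (the per-day odds ratio is taken from the result above, and the day counts are arbitrary examples, not figures from the study).

```python
# The study reports ~7% higher odds of card receipt per additional day of baseline
# marijuana use; odds ratios compound multiplicatively across days. The example
# day counts below are arbitrary illustrations, not figures from the study.
per_day_odds_ratio = 1.07

for extra_days in (5, 10, 20):
    combined = per_day_odds_ratio ** extra_days
    print(f"{extra_days:>2} extra days of use -> odds multiplied by {combined:.2f}")
```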

"It seems that more frequent use of marijuana, and not the physical and mental health problems that one ostensibly seeks a medical marijuana card to address, is what drives acquisition of a medical marijuana card," Pedersen says.

The authors note that their study does have limitations. Because this was a retrospective analysis of previously reported data, the researchers were unable to specifically ask participants the reasons they may have sought medical marijuana. Further, the timeframe may have been too short, and a more detailed and longer analysis may allow a better understanding of who seeks medical marijuana and why.

"Many individuals . . . struggle with legitimate medical and psychological concerns that can benefit from medical marijuana," according to the authors. In California, the specific qualifying conditions include chronic pain, glaucoma, AIDS, and seizures, among others. However, the law is also general enough that providers are allowed to recommend the drug to patients with "any other chronic or persistent medical symptom" that may limit "major life activities."

"It's not clear to us what participants are telling providers," Pedersen says, "but we suspect that under this catch all, you can get a recommendation pretty easily."

He adds that providers should consider reviewing forms of symptom management other than, or in addition to, marijuana and look for signs that the drug has impacted patients in a negative way.

And as more states consider legalizing medical marijuana, policymakers should be aware of both the favorable and adverse outcomes.

"In another paper, we found that young adults who had a medical marijuana card were more likely to report heavy use, greater consequences from use, selling marijuana, and driving under the influence of marijuana compared to young adults who did not have a card," Pedersen says.

Credit: 
Journal of Studies on Alcohol and Drugs

Common heart drugs linked with less heart damage from cancer therapy

Vienna, Austria - 4 December 2019: Cancer patients receiving common heart drugs have less heart damage from cancer therapy, according to research presented today at EuroEcho 2019, a scientific congress of the European Society of Cardiology (ESC). For every ten patients with breast or haematological cancer treated with heart drugs, one case of cardiotoxicity could be avoided.

"Our study provides support for the routine use of beta-blockers, angiotensin converting enzyme (ACE) inhibitors, or angiotensin II receptor blockers (ARBs) in patients receiving cancer treatment but the decision should be made on a case by case basis," said study author Dr Sergio Moral, of Hospital Universitari Josep Trueta and Hospital Santa Caterina, Girona, Spain.

Advances in oncological treatment have led to improved survival of patients with cancer, but at the same time cardiovascular disease is one of the most frequent side effects. The incidence of heart problems varies according to the drug, its dose, and patient characteristics including age and hypertension.

This study analysed the best available evidence to examine whether use of beta-blockers, ACE inhibitors, and/or ARBs was related to reduced incidence of cardiotoxicity in patients with breast or haematological cancer receiving cancer treatment. Cardiotoxicity was defined as decline in heart pump function (drop in left ventricular ejection fraction to below 50%, or a greater than 10% decline) and/or overt heart failure during the first year of follow-up.

The meta-analysis included nine randomised controlled trials. The main cancer treatment in all studies was anthracycline chemotherapy; some also administered adjuvant treatment such as trastuzumab. A total of 913 patients were enrolled, of whom 534 received heart drugs and 379 were in a control group. Of the 534 receiving heart medications, 337 had a beta-blocker, 152 had an ACE inhibitor or ARB, and 45 received a beta-blocker and ACE inhibitor.

During the one-year follow-up, 108 patients (12%) developed cardiotoxicity. Patients receiving cardioprotective treatment had a significantly lower risk of cardiotoxicity (relative risk 0.381).
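The "one case avoided for every ten patients treated" figure in the opening paragraph can be roughly reconstructed from the numbers reported here. The back-of-envelope sketch below uses only the quoted figures (534 treated, 379 controls, 108 events overall, relative risk 0.381) together with the standard definitions of absolute risk reduction and number needed to treat.

```python
# Back-of-envelope check of the "one case avoided per ten patients treated" claim,
# using only the figures reported above: 534 treated, 379 controls, 108 cases of
# cardiotoxicity overall, and a relative risk of 0.381.
n_treated, n_control = 534, 379
overall_events = 108
relative_risk = 0.381

# Solve for the control-group risk c, given treated risk = RR * c:
# n_treated * RR * c + n_control * c = overall_events
control_risk = overall_events / (n_treated * relative_risk + n_control)
treated_risk = relative_risk * control_risk
absolute_risk_reduction = control_risk - treated_risk
nnt = 1 / absolute_risk_reduction

print(f"control risk ~{control_risk:.1%}, treated risk ~{treated_risk:.1%}")
print(f"absolute risk reduction ~{absolute_risk_reduction:.1%}, NNT ~{nnt:.0f}")
```

The result, a number needed to treat of roughly nine, is consistent with the one-in-ten figure quoted above.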

Dr Moral said: "Cardioprotective medications are not habitually prescribed in patients with cancer and our study suggests that they should be considered. Cancer and cardiovascular disease share common risk factors which also influence susceptibility to cardiotoxicity. Consequently, cancer patients are advised to eat healthily, quit smoking, control their weight, and exercise."

"More research is needed to identify which patients benefit the most from cardioprotective therapy, which medication is most effective and at what dose, and the optimal duration of prophylaxis," he said.

Credit: 
European Society of Cardiology

Introducing peanuts and eggs early can prevent food allergies in high risk infants

Research undertaken by King's College London and St George's, University of London has found that introducing certain foods to infants early can prevent them from developing an allergy, even when adherence to the introduction regime is low.

In a series of papers published today in the Journal of Allergy and Clinical Immunology, researchers found that despite low adherence, early introduction to allergenic foods (those that may cause an allergic reaction), including egg and peanut, was found to be effective in preventing the development of food allergies in specific groups of infants. The research additionally highlights barriers to following the early introduction process.

The research is a continuation of The Enquiring About Tolerance (EAT) study, in which over 1,300 three-month-old infants were recruited in England and Wales and placed into one of two groups. One group was introduced to six allergenic foods (including peanut and egg) from three months of age alongside breastfeeding and was called the Early Introduction Group (EIG). The other group was exclusively breastfed for six months and was termed the Standard Introduction Group (SIG).

Results showed that:

Among children with any food sensitisation at study enrolment, 34.2% of children in the SIG developed food allergy in comparison to 19.2% of children in the EIG.

Among infants sensitised to peanut at enrolment, 33.3% of infants in the SIG developed a peanut allergy versus 14.3% in the EIG.

Among infants sensitised to egg at enrolment, 48.7% developed an egg allergy in the SIG compared to 20.0% in the EIG.

The early introduction of allergenic foods to infants who were not at a high risk of developing food allergies was not associated with an increased risk of developing food allergy.

There were no significant differences in food allergy rates between the two groups of infants with no sensitisation to any food at the time of enrolment.

The results were still evident despite only 42% of the EIG group achieving the per-protocol adherence of sustained, high-dose consumption of five or more early introduction foods. Low adherence to the protocol appeared to be most prominent among populations of increased maternal age, non-white ethnicity and lower maternal quality of life.

EAT Study Principal Investigator Gideon Lack, Professor of Paediatric Allergy, School of Life Course Sciences at King's College London said: "These results have significant implications and are informative when it comes to infant feeding recommendations concerning allergies and the development of new guidelines. If early introduction to certain allergenic foods became a part of these recommendations, we also have data that tells us what populations may need extra support when it comes to implementing the recommendations."

One paper dove deeper into the factors that influenced non-adherence in a qualitative analysis. Three major themes emerged: children refusing allergenic foods, caregiver-reported concern about the foods causing allergic reactions, and practical lifestyle constraints. These three challenges all contributed significantly to non-adherence and would need to be addressed if infant feeding recommendations were updated.

"The EAT study has provided us with a wealth of data that is still being analysed. As more research about early introduction of specific food allergens continues, we will get closer to new early introduction recommendations that will hopefully help to prevent food allergies in the future," said Professor Lack.

EAT Study Co-Principal Investigator, Dr Michael Perkin, from St George's, University of London said: "We have shown that the early introduction of foods that cause allergies can significantly reduce the chances of high-risk infants developing peanut and egg allergy. Our research adds to the body of evidence that early introduction of allergenic foods may play a significant role in curbing the allergy epidemic."

Credit: 
King's College London

Probiotic may help treat colic in infants

Probiotics--or "good bacteria"--have been used to treat infant colic with varying success. In a new trial published in the journal Alimentary Pharmacology & Therapeutics, investigators have shown that drops containing a particular probiotic strain (Bifidobacterium animalis subsp. lactis BB-12) reduced the duration of daily crying by more than 50% in 80% of the 40 infants who received the probiotic once daily for 28 days, with beneficial effects on sleep duration and on stool frequency and consistency. This compared with only 32.5% of the 40 infants who received placebo.

Infant colic is a very common gastrointestinal disorder affecting up to 25% of infants in the first 3 months of life, and although it is a benign condition, it is a source of major distress for the infants and their families. It is associated with maternal postpartum depression, early breastfeeding cessation, parental guilt and frustration, shaken baby syndrome, multiple physician visits, drug use, formula changing, and long-term adverse outcomes such as allergies and behavior and sleep problems.

The effect seen in the study was associated with a positive modulation of the gut microbiome, with increased bacterial production of butyrate, a short chain fatty acid that is able to positively regulate intestinal transit time, pain perception, the gut-brain axis, and inflammation.

"Our study provides evidence on the important role of gut microbiota as a target of intervention against infant colic," said senior author Roberto Berni Canani, MD, PhD of the University of Naples "Federico II," in Italy. "It is relevant to underline that this trial studied a specific well-characterized probiotic strain, and that these findings cannot be extrapolated for other probiotic strains."

Credit: 
Wiley

The influence of alcohol consumption among cohabiting partners

Research has linked a partner's or spouse's drinking with changes in alcohol-related behaviours, but few studies have considered only cohabiting relationships. A new study published in Drug & Alcohol Review sought to determine if a cohabiting partner's drinking habits are influenced by their partner's consumption.

In the analysis of survey data on 1,483 newly cohabiting Australian heterosexual couples, a respondent's own drinking was a stable and significant predictor of future consumption, and it was a greater predictor of later drinking than their partner's. A woman's consumption generally exerted significant influence on her male partner's later consumption, while a man's drinking had no effect for all but the first year following cohabitation.

"Cohabitation is increasingly becoming an important relationship step and precursor or alternative to marriage. Our findings suggest that cohabitation may present with similar levels of partner influence, related to alcohol consumption, as new marriages," said lead author Geoffrey Leggat, MSc, of the Centre for Alcohol Policy Research, La Trobe University, in Australia. "Partners in cohabiting relationships should be mindful of the potential effect that their alcohol consumption may have, not only on their own consumption, but on that of their partner."

Credit: 
Wiley