Culture

York U researcher identifies 15 new species of stealthy cuckoo bees

TORONTO, Tuesday, May 8, 2018 - Cuckoo bees sneakily lay their eggs in the nests of other bee species, after which their newly hatched progeny kill the host egg or larva and then feed on the stored pollen. The host, a solitary bee, never knows anything is awry. Nine new species of these clandestine bees have been found hiding in collections and museums across North America by York University PhD candidate Thomas Onuferko, as well as another six unpublished in a decades-old academic thesis.

Cuckoo bees more closely resemble wasps in appearance, lacking the fuzzy look typically attributed to bees, as they don't need those hairs to collect pollen for their young. Although not much is known about them, cuckoo bees are named after cuckoo birds, which exhibit the same cleptoparasitic behaviour.

There are now a total of 43 known species of cuckoo bee in the genus Epeolus (pronounced ee-pee-oh-lus) in North America, many of which go unnoticed hovering low to the ground in backyards or "sleeping" on leaves, as they don't have nests of their own. They are only 5.5 to 10 mm in length, smaller and rarer than the polyester bees whose nests they invade.

"It may seem surprising to some that in well-researched places like Canada and the United States there is still the potential for the discovery of new species," says Onuferko of York University's Faculty of Science. "People have been aware of a few of the new species that I'm describing, but they've never been formally named. There is a whole bunch of other species, however, that no one knew about."

Part of the reason it's taken so long to identify these new cuckoo bees is that they are small, uncommonly collected and can be difficult to tell apart. Onuferko visited collections across North America and had specimens sent to the Packer Lab at York University for examination.

Many of the newly described cuckoo bees, including one Onuferko named after well-known British broadcaster and naturalist Sir David Attenborough (Epeolus attenboroughi), possess very short black, white, red and yellow hairs that form attractive patterns.

Onuferko named another cuckoo bee after York University bee expert and thesis adviser Professor Laurence Packer - Epeolus packeri.

Where did the name "Epeolus" come from? Onuferko thinks it is likely a diminutive of Epeus/Epeius, the soldier in Greek mythology credited with devising the Trojan horse stratagem.

All 15 new species are now formally described, which will allow other researchers and bee enthusiasts to keep a lookout for them.

Credit: 
York University

Troubling stats for kids with intellectual disabilities

COLUMBUS, Ohio - By federal law passed in 1975, children with intellectual disabilities are supposed to spend as much time as possible in general education classrooms.

But a new study suggests that progress toward that goal has stalled.

Findings showed that over the past 40 years, between 55 and 73 percent of students with intellectual disabilities have spent most or all of the school day in self-contained classrooms or schools, not with their peers without disabilities.

"Given the legal mandate, it is surprising that such a large proportion of students are consistently placed in restrictive settings," said Matthew Brock, author of the study and assistant professor of special education at The Ohio State University.

The study is the first to look at national trends in education placement for students with intellectual disability - previously called mental retardation - for the entire 40 years since the law was enacted.

"I found historical trends of incremental progress toward less restrictive settings, but no evidence of such progress in recent years," said Brock, who is affiliated with Ohio State's Crane Center for Early Childhood Research and Policy.

The study has been accepted for publication by the American Journal on Intellectual and Developmental Disabilities.

The Individuals with Disabilities Education Improvement Act (as the law is now called) has the aim of educating students with disabilities in what it calls the "least restrictive environment." That means they should be placed in general education classrooms alongside peers without disabilities to the maximum extent appropriate.

Decisions about what is appropriate for each child are made by an Individual Education Program team that includes the child's parents, teachers and others.

Brock used several data sources to determine the proportion of students 6 to 21 years old with intellectual disability who were placed in each federally reported educational environment from 1976 to 2014.

The definitions of placement categories changed several times over the 40 years, so it is impossible to directly compare statistics over the entire time period, Brock said. But some general trends can be detected.

He found that in the first years following passage of the law, the proportion of students in less restrictive settings actually decreased. The share of students served in regular general education classrooms fell from 38 percent in 1976 to 30 percent in 1983.

From 1984 to 1989, the overall trend is less clear.

From 1990 to 2014, the proportion of students in less restrictive placements initially increased and then plateaued, Brock said.

The proportion of students who spent at least 80 percent of the school day in general education classrooms trended up to near 14 percent in 1998, dropped to 11 percent in 2002, hit a peak of 18 percent in 2010 and decreased slightly to 17 percent in 2014.

"Overall, the most rapid progress toward inclusive placements was in the 1990s, with more gradual progress in the 2000s and a plateau between 2010 and 2014," Brock said.

He believes the rapid progress in the 90s occurred because advocacy for special education was strongest during this period, at least on a national level.

"There are still people working really hard toward the goal of inclusion in some parts of the country, but that doesn't come through in this national data," he said.

One argument could be that inclusion has plateaued in the United States because nearly all students are already in the least restrictive environments possible, as decided by their Individual Education Program teams, Brock said.

But state-by-state data suggests something else must be going on. In 2014, students with intellectual disabilities in Iowa were 13.5 times more likely to spend most of the school day in a general education setting compared to students in the bordering state of Illinois.

These huge discrepancies in placements between states can't be explained by differences in the students. The issue is that states and even individual school districts follow different policies and ways of working with students with disabilities - and not all succeed at giving students the least restrictive environment, according to Brock.

"I don't want to send the message that all kids with intellectual disabilities should spend 100 percent of their time in general education classrooms," he said.

"But I think we need to find opportunities for all kids to spend some time with peers who don't have disabilities if we are going to follow the spirit and letter of the law."

Credit: 
Ohio State University

A new mechanism for neurodegeneration in a form of dementia

Philadelphia, May 8, 2018
A new study in Biological Psychiatry reports that dementia-related and psychiatric-related proteins cluster together to form aggregates in the brain, leading to abnormal cell function and behavior. Aggregation of the protein TDP-43 is a hallmark of a pathological process that leads to dementia called frontotemporal lobar degeneration (FTLD). The study showed that as TDP-43 accumulates in the brain of patients with FTLD, it ropes in DISC1, an important protein in the pathology of many mental conditions.

The findings provide a clue to the unsolved puzzle of why psychiatric disorders often emerge in neurodegenerative disorders. “From the clinical point of view, it is critical to understand the molecular mechanisms underlying psychiatric symptoms in neurodegenerative diseases,” said senior author Motomasa Tanaka, PhD, of RIKEN Brain Science Institute, Japan. The findings reveal that the TDP-43/DISC1 protein clusters disrupt the production of new proteins in neurons, a process critical for higher brain functions that are impaired in psychiatric disorders.

First author Ryo Endo, PhD, and colleagues found the co-aggregates in postmortem brain tissue from patients with FTLD and in a mouse model of FTLD. The FTLD model mice were hyperactive and displayed abnormal social interactions, behaviors relevant to multiple psychiatric conditions. Aggregation of the protein renders it unusable, so Dr. Endo and colleagues added DISC1 back into the mice. The behavioral abnormalities in the mice returned to normal.

“At the mechanistic level, TDP-43 and DISC1 co-aggregation disrupted activity-dependent local translation in dendrites,” said Dr. Tanaka, referring to a process in which neurons synthesize proteins from messenger RNA in response to neural activity. The disruption in translation resulted in reduced synaptic protein expression in the mice. Adding DISC1 back in also restored the reduced protein levels. The findings demonstrate that the co-aggregation of DISC1 caused the abnormalities in the model mice, suggesting that the co-aggregation of DISC1 with TDP-43 may disrupt cellular function and trigger psychiatric manifestations.

“DISC1 has long been a focus of research as a consequence of its implication in the heritable risk for schizophrenia,” said John Krystal, MD, Editor of Biological Psychiatry. “However, this new study implicates DISC1 in the biology of FTLD. There are still unanswered questions about whether DISC1 is disrupted in association with schizophrenia risk. However, the new study by Dr. Endo and colleagues provides a compelling case for further exploring the role of this protein in frontotemporal dementia,” he added.

Credit: 
Elsevier

Type of maternal homework assistance affects child's persistence

image: Type of maternal homework assistance affects child's persistence.

Image: 
UEF / Varpu Heiskanen

Different types of maternal homework assistance have a different impact on the child's way of completing school assignments in grades 2 to 4 of elementary school, according to a new study from the University of Eastern Finland and the University of Jyväskylä. Although all homework assistance presumably aims at helping the child, not all types of homework assistance lead to equally positive outcomes.

Researchers in the longitudinal First Steps Study found that the more opportunities for autonomous work the mother offered the child, the more task-persistent the child's behaviour. In other words, the child later worked persistently on his or her school assignments, which encouraged mothers to offer more and more opportunities for autonomous working.

However, the more the mother provided assistance by concretely helping the child, the less task-persistent the child's later behaviour. This, in turn, made mothers offer more and more help.

These associations between different types of maternal homework assistance and the child's task-persistent behaviour remained even after the child's skill level was controlled for.

"One possible explanation is that when the mother gives her child an opportunity to do homework autonomously, the mother also sends out a message that she believes in the child's skills and capabilities. This, in turn, makes the child believe in him- or herself, and in his or her skills and capabilities," Associate Professor Jaana Viljaranta from the University of Eastern Finland explains.

Similarly, concrete homework assistance - especially if not requested by the child - may send out a message that the mother doesn't believe in the child's ability to do his or her homework.

Homework assistance should consider the child's needs

The findings also indicate that task-persistence is a mediating factor between different types of maternal homework assistance and the child's academic performance. This helps to understand some earlier findings on how some types of maternal homework assistance predict better academic performance than others. When the mother offers the child an opportunity for autonomous working, the child will work persistently, which leads to better development of skills. If, however, the mother's homework assistance involves plenty of concrete help, the child will work less persistently, leading to poorer development of skills.

"It is important for parents to take the child's needs into consideration when offering homework assistance. Of course, parents should offer concrete help when their child clearly needs it. However, concrete help is not something that should be made automatically available in every situation - only when needed," Viljaranta says.

The First Steps Study is an extensive longitudinal study carried out by the University of Jyväskylä, the University of Eastern Finland and the University of Turku. The study examines student learning and motivation among approximately 2,000 children from kindergarten onwards. Children currently participating in the study are in secondary education.

Credit: 
University of Eastern Finland

Study looks at barriers to getting treatment for substance use disorders

May 8, 2018 - For patients with substance use disorders seen in the emergency department or doctor's office, locating and accessing appropriate treatment all too often poses difficult challenges. Healthcare providers and treatment facility administrators share their views on delays and obstacles to prompt receipt of substance use disorder treatment after referral in a study in the Journal of Addiction Medicine, the official journal of the American Society of Addiction Medicine (ASAM). This journal is published in the Lippincott portfolio by Wolters Kluwer.

Issues related to patient eligibility, treatment capacity, understanding of options, and communication problems all contribute to gaps in referral and delays to getting treatment for patients with substance use disorders, according to the new research by Claire Evelyn Blevins, PhD, of Warren Alpert Medical School of Brown University and Butler Hospital, Providence, RI; Nishi Rawat, MD, of OpenBeds, Inc., Washington, DC; and Michael Stein, MD, of Boston University and Butler Hospital.

Four Themes Affecting Obstacles to Treatment for Substance Use Disorders

The ongoing opioid crisis has drawn attention to the widening gap between the high need for and the limited access to substance use treatment in the United States. A recent Substance Abuse and Mental Health Services Administration report found that of 21.7 million Americans in need of substance use disorder treatment, only 2.35 million received treatment at a specialty facility. Yet there is little information on the organizational-level barriers to treatment for substance use disorders.

To address this issue, Dr. Blevins and colleagues performed a series of interviews with 59 stakeholders in the treatment referral process. The study gathered input from those who make referrals for substance use treatment, including emergency medicine physicians, addiction specialists, and other medical providers; as well as those who receive referrals, including substance use treatment facility staff and administrators.

Analysis of the interviews identified four broad themes:

Patient Eligibility. Healthcare providers face difficulties in determining whether patients meet criteria for admission to a particular treatment center, including the application of treatment eligibility criteria. "Eligibility requirements may prevent a patient from entering a treatment center," the researchers write.

Treatment Capacity. Even if a patient is eligible, providers have trouble finding out whether space is available. "Despite the need for services, treatment centers may not run at capacity, because of frustrations encountered and time wasted on the referral and admission process."

Knowledge of Treatment Options. Providers may not understand the levels of available care for substance use treatment, and how to select the best treatment for their patient. "After determining appropriate level of care, a provider must then find a program that meets the patient's needs, which becomes more difficult with the differences in terminology and program guidelines."

Communication. Difficulties in communication between referring providers and treatment facilities can contribute to delays to starting treatment. The need for direct referral - "from the emergency department to a bed" - is particularly high for patients with opioid use disorders.

"Access to substance use disorder treatment is often a maze that can be difficult to navigate for both providers and patients," Dr. Blevins and coauthors write. Based on the themes identified, they make recommendations for improvement in the referral process, including a database of clear eligibility criteria, real-time information on treatment capacity, and increased education and training for providers on substance use treatment.

They also propose ways to improve communication and reduce treatment waiting times, including new information technologies. The researchers write: "By improving systems that enhance communication across organizations, patient referrals may be more easily completed, improving access to care and expanding the use of appropriate treatments for the many patients in need."

In an accompanying commentary, David L. Rosenbloom, PhD, of Boston University School of Public Health discusses the underlying reasons for the current "dysfunctional referral system." He notes that referrals for other chronic diseases "may be more effective because they are to 'in-house' affiliated providers." Dr. Rosenbloom writes: "The standard of care should be to stabilize, initiate treatment, and provide a hands-on transfer to an entity that can complete a diagnostic assessment and provide evidence-based treatment" for patients with substance use disorders.

Credit: 
Wolters Kluwer Health

Impaired brain pathways may cause attention problems after stroke

image: Stroke lesions. A, Lesion incidence map in patients with acute stroke. B, Lesion incidence map shows regions in which at least 10 patients had a lesion. Color bar denotes the probability of lesion distribution. C, Brain region that is correlated with attention deficit in the voxel-based lesion-symptom mapping (VLSM) analysis. Color bar denotes the t values.

Image: 
Radiological Society of North America

OAK BROOK, Ill. - Damage to some of the pathways that carry information throughout the brain may be responsible for attention deficit in patients who have had a subcortical stroke in the brain's right hemisphere, according to a study published online in the journal Radiology. Researchers hope the findings may provide a measure for selecting suitable patients for early interventions aimed at reducing cognitive decline following stroke.

A stroke may affect cortical regions of the cerebral cortex, which includes the gray matter that lines the surface of the brain, or it may affect brain regions below the cortex, including white matter tracts connecting different regions of the brain. A stroke affecting brain structures below the cortex is known as a subcortical stroke.

More than one-third of patients experience cognitive decline after a stroke, including attention deficit, which can affect and impair the patient's ability to carry out routine activities of daily living.

"Impairment of attention has been observed in patients with both cortical and subcortical stroke," said senior study author Chunshui Yu, M.D., from the Department of Radiology at Tianjin Medical University General Hospital in Tianjin, China. "In cortical stroke, the direct involvement of cortical regions associated with attention may account for the deficit. However, the parts of the nervous and brain systems underlying attention deficit in subcortical stroke remain largely unknown."

To investigate the mechanisms underlying attention deficit in chronic subcortical stroke, Dr. Yu and colleagues combined voxel-based lesion-symptom mapping (VLSM) and diffusion tensor tractography (DTT) in 49 patients (32 men and 17 women between the ages of 40 and 71) after subcortical stroke and 52 control patients (30 men and 22 women, age 40-68). VLSM is a method of analyzing relationships between tissue damage and behavioral deficits, and DTT is an MRI technique that allows for 3-D visualization of specific white matter tracts in the brain.

A modified version of the attention network test was used to assess visual attention function. VLSM was used to identify lesion locations related to attention deficit in the stroke patients. Then DTT was used to determine the responsible impaired brain connections at the chronic stage (> 6 months post-stroke).

The results showed that, compared to the controls, patients with chronic stroke exhibited prolonged reaction time during the attention task. VLSM revealed that an acute stroke lesion in the right caudate nucleus and nearby white matter was correlated with the prolonged reaction time. DTT performed in the controls showed that the responsible lesion site lay within the right thalamic- and caudate-prefrontal pathways.

The right brain damage subgroup had significantly decreased fractional anisotropy (FA) in these pathways, which were correlated with the prolonged reaction time. FA provides a way to measure diffusion occurring within a region of the brain. FA is typically higher in brain regions of high organization. Reductions in FA have been previously associated with advancing age and in cases of cognitive impairment.
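As a rough illustration of what FA measures (this is not code or data from the study), the index is computed from the three eigenvalues of the diffusion tensor; the function name and eigenvalue inputs below are hypothetical:

```python
import numpy as np

def fractional_anisotropy(eigvals):
    """FA from the three eigenvalues of a diffusion tensor.
    Ranges from 0.0 (isotropic diffusion) to 1.0 (perfectly directional)."""
    l1, l2, l3 = eigvals
    num = np.sqrt((l1 - l2)**2 + (l2 - l3)**2 + (l1 - l3)**2)
    den = np.sqrt(2.0 * (l1**2 + l2**2 + l3**2))
    return num / den

# Isotropic diffusion (e.g. fluid-filled spaces): FA is 0
print(fractional_anisotropy([1.0, 1.0, 1.0]))   # 0.0
# Strongly directional diffusion (organized white matter): FA near 1
print(fractional_anisotropy([1.7, 0.2, 0.2]))
```

Highly organized white matter tracts constrain water diffusion along one axis, which is why damage to a pathway shows up as a drop in FA.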

"The impairment of the right thalamic- and caudate-prefrontal pathways was consistently associated with attention deficit in patients with right subcortical stroke," Dr. Yu said. "Based on this association, one can estimate which patients with stroke would be more likely to develop long-term, persistent attention deficit by evaluating the lesion-induced damage to these pathways."

Credit: 
Radiological Society of North America

Carbon satellite to serve as an important tool for politicians and climate change experts

CLIMATE: A new satellite that measures and provides detailed carbon balance information is one of the most important new tools in carbon measurement since infrared techniques, believe researchers from the University of Copenhagen. The researchers expect the satellite to be a valuable tool for the UN's work on climate change related to the Paris climate accord.

Carbon balance is important for climate and environment because carbon lost from the landscape is released to the atmosphere as CO2, increasing emissions. At the same time, carbon is an essential aspect of life on Earth: a felled tree releases carbon into the atmosphere, whereas a planted one takes up carbon into vegetation and soil. A lack of carbon in vegetation and soil can create a carbon imbalance and have climate-related consequences.

University of Copenhagen researchers have tested a new French satellite that can measure carbon balance far more precisely than the current method, which uses aerial photography. The satellite uses low-frequency passive microwaves to measure the biomass of above ground vegetation. The studies have recently been published in Nature Ecology and Evolution.

"This is one of the biggest steps related to carbon measurement since infrared measurements were developed in the 1970s," according to Postdoc Martin Stefan Brandt of the Department of Geosciences and Natural Resources Management, who is the researcher behind the study.

"The new satellite can measure emissions from all types of vegetation - including trunks and branches, not just the crowns as has been the case until now. This presents a much more detailed account of the carbon balance in the region concerned."

Vital for further work on climate change

The group of Danish researchers collected satellite imagery of the African continent over seven years. The satellite made it possible to produce a detailed map of the carbon balance across the whole of Africa.

Over the seven years, the researchers documented that drought and deforestation had a dramatic influence on carbon emissions, which has a negative effect on climate. For this reason, it is important to have a tool on hand for monitoring changes to the landscape.

"We will need to understand how various factors like deforestation and drought affect the carbon balance in order to provide a knowledge base for experts and politicians whose job it is to make decisions related to work on climate change," says Martin Stefan Brandt.

The satellite can prove to be an important tool for future work on climate change and the reduction of CO2 emissions. For example, researchers expect that the UN Intergovernmental Panel on Climate Change (IPCC) will be able to use the satellite in relation to the Paris climate accord because it is well suited to present emissions by country.

Credit: 
University of Copenhagen - Faculty of Science

Hunting dogs may benefit from antioxidant boost in diet

image: Researchers from the University of Illinois tested an antioxidant-rich performance diet in American Foxhounds and found evidence of lower oxidative stress when vitamin E and taurine were consumed at higher concentrations.

Image: 
Preston Buff

URBANA, Ill. - Free radicals, DNA-damaging reactive oxygen species, are produced in spades during exercise. Dogs that exercise a lot, like hunting dogs, may need to consume more antioxidants than their less-active counterparts to protect against this damage. But what diet formulation best meets the needs of these furry athletes? A new University of Illinois study provides some answers in a real-world scenario.

Researchers visited a kennel of American Foxhounds in Alabama over the course of a hunting season, providing one group a high-performance commercial diet and another group a test diet similar to the commercial diet, but with added antioxidants (vitamins C and E, and lutein), zinc, and taurine. During the study, dogs from both groups went on two to three hunts per week, each 2 to 5 hours in length.

"We think of it as unstructured endurance exercise. They're not running the entire time. They might stop to sniff or go more slowly to pick up a scent," says Kelly Swanson, corresponding author on the study and Kraft Heinz Company Endowed Professor in Human Nutrition in the Department of Animal Sciences and the Division of Nutritional Sciences at U of I.

Before starting the diets and on four occasions during the seven-month study, researchers took blood samples from the dogs to examine oxidative stress markers and other blood metabolites.

"We hypothesized that dogs fed the test diet would have a lower concentration of oxidative stress markers and improved performance compared to the dogs fed the commercial diet," Swanson says. "It turns out performance wasn't affected by diet, but the test diet did improve indirect measures of oxidative stress. Therefore, improved performance may be expected with more strenuous exercise when metabolic demands are higher."

The amino acid taurine, once thought to be non-essential for dogs but now recognized as an important nutrient for heart health, declined over the course of the season for dogs fed the commercial diet. The same pattern occurred with vitamin E. Although one dog did come close to a critically low level of taurine during the study, all dogs fed the commercial diet stayed within the normal range for all blood metabolites.

For dogs fed the test diet, taurine and vitamin E levels were maintained at or above the baseline. The results suggest to Swanson and his co-authors that these compounds are compromised in athletic dogs over months of unstructured exercise, and more-active dogs such as sled dogs may experience greater depletion.

"We can conclude that athletic dogs may benefit from supplementation of vitamin E and taurine to minimize oxidation and maintain taurine status," he says.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Majority of the population trusts state structures in consumer health protection

The latest issue of the BfR Consumer Monitor shows that people are becoming more and more aware of glyphosate, the active substance used in certain plant protection products, with three quarters of the population already having heard of it. Despite this, food in Germany is still regarded as safe by over 80 percent of respondents, and more than half trust the state authorities to protect the health of consumers.

This issue determined for the first time how great the interest in consumer health topics is. "More than two thirds of the population are interested in consumer health protection. That makes our mandate of providing people with comprehensive information on actual and perceived risks all the more important," says BfR President, Professor Dr. Dr. Andreas Hensel. "The goal is for consumers to remain able to decide what to do by themselves and maintain their competence in assessing risks".

http://www.bfr.bund.de/cm/364/bfr-consumer-monitor-02-2018.pdf

The BfR Consumer Monitor is an important instrument of consumer health protection. As a representative consumer survey, it gives an insight every six months as to how the German-speaking population perceives health risks. To do so, roughly 1,000 persons living in private households in Germany, aged at least 14 years, are interviewed by telephone on behalf of the BfR.

Respondents still perceive smoking, climate and environmental pollution as well as a wrong or unhealthy diet as the greatest health risks. In focus once again, and moving up into fourth place in the list of the greatest health risks, are the shortcomings of the health system. These include a perceived shortage of medical staff, the care crisis and the difficult situation in hospitals. Alcohol and unhealthy or contaminated foods are seen as further risks.

When questions are asked about selected topics, salmonella, genetically modified foods, antimicrobial resistance and plant protection product residues head the list of subjects of which people are most aware. These are also the four topics that concern the largest number of respondents. Compared to the previous year, the topics of aluminium, microplastics and glyphosate in food are much better known. Almost half of the population is concerned about glyphosate, with a similar percentage of people concerned about microplastics. By way of comparison, only a good third of respondents find aluminium in food a cause for concern.

Toys and cosmetics are estimated to be safe by a larger percentage of consumers compared to the previous survey. There has been a slight decrease in the feeling of safety where textiles are concerned.

The BfR Consumer Monitor is dedicated on the one hand to topics which receive a lot of public attention. On the other hand, it also analyses issues which have received less attention but are equally relevant, such as Campylobacter and pyrrolizidine alkaloids in food, or new methods of "genome editing" for the targeted modification of genetic makeup. As was the case last year, these topics are hardly visible in public perception and are consequently not regarded as being of particular concern. Food hygiene at home also plays only a minor role in the consciousness of the consumer.

The extent to which public perception deviates from the scientific assessment of health risks is of particular interest for the work of the BfR. Through follow-up studies and targeted communication on topics such as kitchen hygiene, the BfR aims to counteract misperceptions and misunderstandings.

Credit: 
BfR Federal Institute for Risk Assessment

Inequality is normal: Dominance of the big trees

image: These are large-diameter trees in the Douglas-fir/western hemlock forest of Wind River, Washington, USA

Image: 
James Lutz/Utah State University

The top 1% of the forest has been sharing some vital information with researchers. Ninety-eight scientists and thousands of field staff have completed the largest study of its kind to date with the Smithsonian Forest Global Earth Observatory (ForestGEO), and what they have found will have profound implications for ecological theory and carbon storage in forests. Rather than examining tree species diversity in temperate and tropical ecosystems, this global study emphasized forest structure over a vast scale. Using large forest plots from 21 countries and territories, Utah State researchers found that, on average, the largest 1% of trees in mature and older forests comprised 50% of forest biomass worldwide. Furthermore, the amount of carbon that forests can sequester depends mostly on the abundance of big trees. The size of the largest trees was found to be even more important to forest biomass than high densities of small and medium trees. Lead author Jim Lutz, Assistant Professor at Utah State University, said, "Big trees provide functions that cannot be duplicated by small or medium-sized trees. They provide unique habitat, strongly influence the forest around them, and store large amounts of carbon."
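As a rough illustration of the kind of calculation behind that headline figure, the sketch below simulates a heavy-tailed stem-diameter distribution and computes the biomass share held by the largest 1% of trees. The simulated data and the allometric exponent are assumptions for illustration only, not the study's methods or numbers:

```python
import random

# Toy illustration (not the study's data or allometry): estimate the biomass
# share held by the largest 1% of stems in a simulated stand.
random.seed(0)

# Simulated stem diameters (cm); a lognormal gives the heavy tail typical of
# mature forests, where a few stems are far larger than the rest.
dbh = [random.lognormvariate(3.0, 0.7) for _ in range(10_000)]

# Generic allometric proxy: aboveground biomass scaling roughly as dbh**2.5
# (the exponent is an assumption chosen for illustration).
biomass = sorted(d ** 2.5 for d in dbh)

top = biomass[-len(biomass) // 100:]          # largest 1% of stems
share = sum(top) / sum(biomass)
print(f"Biomass share of the largest 1% of stems: {share:.1%}")
```

Even this crude toy shows how a skewed size distribution concentrates biomass in a handful of giants; the study's 50% figure comes from measured plots, not simulations like this one.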

This study has shown that the structure of the forest is as important to consider as species diversity - the largest trees follow their own set of rules. Using 48 of the large forest dynamics plots from around the world coordinated by the Smithsonian ForestGEO Program, scientists were able to examine the variability of forest structure on a consistent basis. Co-author Dan Johnson, Research Associate at Utah State University said, "Having a worldwide group of scientists following the same methods offers us unique opportunities to explore forests at a global scale. This is a really wonderful group of scientists united by a passion for deepening our understanding of forests."

Tropical forests are well known to typically have many more species than temperate forests. However, this study found that temperate forests have higher structural complexity, both in terms of different tree sizes within an area and also between adjacent areas of forest. Co-lead author Tucker Furniss, PhD student at Utah State University said, "The distribution of big trees has not been well explained by theory. Our results emphasize the importance of considering these rare, but disproportionately important ecosystem elements. We clearly need more applied and theoretical research on these important big trees."

The researchers also found that the largest trees are representatives of the more common tree species. The ability of some trees in any given forest to reach very large sizes relative to the other trees and concentrate resources seems to be a global phenomenon. "Big trees are special," continued Lutz. "They take a long time to regrow if they are eliminated from a forest. Making sure that we conserve some big trees in forests can promote and maintain all the benefits that forests provide to us."

Credit: 
S.J. & Jessie E. Quinney College of Natural Resources, Utah State University

Stomata -- the plant pores that give us life -- arise thanks to a gene called MUTE

image: Without MUTE, Arabidopsis plants cannot produce stomata, and do not develop past the seedling stage.

Image: 
Soon-Ki Han/ Xingyun Qi

Plants know how to do a neat trick.

Through photosynthesis, they use sunlight and carbon dioxide to make food, belching out the oxygen that we breathe as a byproduct. This evolutionary innovation is so central to plant identity that nearly all land plants use the same pores -- called stomata -- to take in carbon dioxide and release oxygen.

Stomata are microscopic and critical for photosynthesis. Thousands of them dot the surfaces of plants. Understanding how stomata form is essential basic information for understanding how plants grow and produce the biomass upon which we thrive.

In a paper published May 7 in the journal Developmental Cell, a University of Washington-led team describes the delicate cellular symphony that produces tiny, functional stomata. The scientists discovered that a gene in plants known as MUTE orchestrates stomatal development. MUTE directs the activity of other genes that tell cells when to divide and not to divide -- much like how a conductor tells musicians when to play and when to stay silent.

"The MUTE gene acts as a master regulator of stomatal development," said senior author Keiko Torii, a UW professor of biology and investigator at the Howard Hughes Medical Institute. "MUTE exerts precision control over the proper formation of stomata by initiating a single round of cell division -- just one -- in the precursor cell that stomata develop from."

Stomata resemble doughnuts -- a circular pore with a hole in the middle for gas to enter or leave the plant. The pore consists of two cells -- each known as a guard cell. They can swell or shrink to open or close the pore, which is critical for regulating gas exchange for photosynthesis, as well as moisture levels in tissues.

"If plants cannot make stomata, they are not viable -- they cannot 'breathe,'" said Torii, who also is a professor at Nagoya University in Japan.

Torii and her team investigated which genes governed stomata formation in Arabidopsis thaliana, a small weed that is one of the most widely studied plants on the planet. Past research by Torii's team and other researchers had indicated that, in Arabidopsis, MUTE plays a central role in the formation of stomata. The MUTE gene encodes instructions for a cellular protein that can control the "on" or "off" state of other plant genes.

The researchers created a strain of Arabidopsis that can artificially produce a lot of the MUTE protein, so they could easily identify which genes the MUTE protein turned on or off. They discovered that many of the activated genes control cell division -- a process that is critical for stomatal development.

In Arabidopsis, as in nearly all plants, stomata form from precursor cells known as guard mother cells, or GMCs. To form a working stoma -- singular for stomata -- a GMC divides once to yield the paired guard cells. Since their data showed that MUTE proteins switched on genes that regulated cell division, Torii and her team wondered if MUTE is the gene that activates this single round of cell division. If so, it would have to be a tightly regulated process. The genetic program would have to switch on cell division in the GMC, and then quickly switch it right back off to ensure that only a single round of division occurs.

Torii's team showed that one of the genes the MUTE protein activates by binding to DNA is CYCD5;1, a gene that causes the GMC to divide. The researchers also found that MUTE proteins turn on two genes called FAMA and FOUR LIPS. This was an important discovery because, while CYCD5;1 turns on cell division of the GMC, FAMA and FOUR LIPS turn off -- or repress -- the cell division program.

"Our experiments showed that MUTE was turning on both activators of cell division and repressors of cell division, which seemed counterintuitive -- why would it do both?" said Torii. "That made us very interested in understanding the temporal regulation of these genes in the GMC and the stomata."

Through precise experiments, they gathered data on the timing of MUTE's activation of these cell division activators and repressors. They incorporated this information into a mathematical model, which simulated how MUTE acts to both activate and repress cell division in the GMC. First, MUTE turns on the activator CYCD5;1 -- which triggers one round of cell division. Then, FAMA and FOUR LIPS act to prevent further cell division, yielding one functional stoma consisting of two guard cells.
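The logic of that model -- one regulator switching on a fast activator and slower-accumulating repressors so that division fires exactly once -- can be sketched as a toy simulation. This is an illustrative caricature, not the paper's published model; all rate constants and the threshold are made up for the sketch:

```python
# Toy incoherent feed-forward loop: MUTE drives a fast activator
# (CYCD5;1-like) and a slower, higher-steady-state repressor pool
# (FAMA/FOUR LIPS-like), producing a single pulse of division activity.
def simulate(t_end=20.0, dt=0.01):
    mute = 1.0                 # MUTE switched on at t = 0
    act = rep = 0.0            # activator and repressor levels
    divisions = 0
    dividing = False
    t = 0.0
    while t < t_end:
        # Activator: produced fast, degraded fast (low steady state, quick rise).
        act += dt * (1.0 * mute - 0.5 * act)
        # Repressor: produced slowly, degraded slowly (higher steady state, slow rise).
        rep += dt * (0.3 * mute - 0.1 * rep)
        active = (act - rep) > 0.5   # net pro-division signal above threshold
        if active and not dividing:
            divisions += 1           # count each rising edge as one division
        dividing = active
        t += dt
    return divisions

print(simulate())  # one division pulse: the repressors catch up and shut it off
```

Because the repressor eventually overtakes the activator, the pro-division window opens and closes exactly once, mirroring the single GMC division the study describes.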

"Like a conductor at the podium, MUTE appears to signal its target genes -- each of which has specific, and even opposite, parts to play in the ensuing piece," said Torii. "The result is a tightly coupled sequence of activation and repression that gives rise to one of the most ancient structures on land plants."

Credit: 
University of Washington

The effect of night shifts: Gene expression fails to adapt to new sleep patterns

Have you ever considered that working night shifts may, in the long run, have an impact on your health? A team of researchers from the McGill University affiliated Douglas Mental Health University Institute (DMHUI) has discovered that genes regulating important biological processes are incapable of adapting to new sleeping and eating patterns and that most of them stay tuned to their daytime biological clock rhythms.

In a study published in the Proceedings of the National Academy of Sciences, Laura Kervezee, Marc Cuesta, Nicolas Cermakian and Diane B. Boivin, researchers at the DMHUI (CIUSSS de l'Ouest-de-l'Île-de-Montréal), were able to show the impact that a four-day simulation of night shift work had on the expression of 20,000 genes.

"We now better understand the molecular changes that take place inside the human body when sleeping and eating behaviours are in sync with our biological clock. For example, we found that the expression of genes related to the immune system and metabolic processes did not adapt to the new behaviours," says Dr. Boivin, Director of the Centre for Study and Treatment of Circadian Rhythms and a full professor at McGill University's Department of Psychiatry.

It is known that the expression of many of these genes varies over the course of the day and night. Their repetitive rhythms are important for the regulation of many physiological and behavioural processes. "Almost 25% of the rhythmic genes lost their biological rhythm after our volunteers were exposed to our night shift simulation. 73% did not adapt to the night shift and stayed tuned to their daytime rhythm. And less than 3% partly adapted to the night shift schedule," adds Dr. Cermakian, Director of the Laboratory of Molecular Chronobiology at the DMHUI and a full professor at McGill University's Department of Psychiatry.

Health problems ahead?

For this study, eight healthy volunteers were artificially subjected to a five-day schedule simulating night shift work. In a time-isolation room, they were deprived of any light or sound cues characteristic of the time of day, and were not allowed to use their phones or laptops. The first day the participants slept during their normal bedtimes. The four following days were "night shifts": the volunteers remained awake during the night and slept during the day.

On the first day and after the last night shift, the team collected blood samples at different times for a period of 24 hours. Laura Kervezee, a postdoctoral fellow on Boivin's team, then measured the expression of more than 20,000 genes using a technique called transcriptomic analysis, and assessed which of these genes presented a variation over the day-night cycle.
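One standard way to test a single gene for such a day-night rhythm is cosinor-style analysis: fit a 24-hour cosine to the expression time series and read off its amplitude and peak time. The sketch below is a generic illustration of that idea under the assumption of evenly spaced samples over one 24-hour cycle; it is not the study's actual analysis pipeline:

```python
import math

# Cosinor-style rhythm fit (a generic sketch, not the study's pipeline):
# project an evenly sampled 24 h series onto a 24 h cosine and sine.
def fit_24h_rhythm(times_h, expr):
    """Assumes samples evenly cover exactly one 24 h period."""
    w = 2 * math.pi / 24.0
    n = len(expr)
    mesor = sum(expr) / n                      # rhythm-adjusted mean level
    a = 2 / n * sum(y * math.cos(w * t) for t, y in zip(times_h, expr))
    b = 2 / n * sum(y * math.sin(w * t) for t, y in zip(times_h, expr))
    amplitude = math.hypot(a, b)
    peak_h = (math.atan2(b, a) % (2 * math.pi)) / w   # hour of peak expression
    return mesor, amplitude, peak_h

# Synthetic gene sampled every 4 h, peaking at hour 8 with amplitude 3.
t = [0.0, 4.0, 8.0, 12.0, 16.0, 20.0]
y = [10 + 3 * math.cos(2 * math.pi / 24 * (ti - 8)) for ti in t]
mesor, amp, peak = fit_24h_rhythm(t, y)
print(round(mesor, 2), round(amp, 2), round(peak, 2))  # 10.0 3.0 8.0
```

A gene would be called "rhythmic" if the fitted amplitude is significantly non-zero; a gene that "stayed tuned to its daytime rhythm" would show the same peak hour before and after the night-shift simulation.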

"We think the molecular changes we observed potentially contribute to the development of health problems like diabetes, obesity, cardiovascular diseases more frequently seen in night-shift workers on the long term," explains Dr. Boivin. However, she adds this will require further investigations.

As the study was conducted under highly controlled conditions in the laboratory, future research should extend these findings by studying the gene expression of actual night-shift workers, whose physical activity, food intake and timing of sleep may differ from one another. This could also apply to other people who are at risk of biological clock misalignment, such as travellers who frequently cross time zones.

Around 20% of the workforce in Canada, the United States and Europe is involved in shift work.

Credit: 
McGill University

Study finds possibility of new ways to treat, manage epilepsy seizures

LEXINGTON, Ky. (May 4, 2018) - New findings from the University of Kentucky published in the Journal of Neuroscience demonstrate that there may be ways to address blood-brain barrier dysfunction in epilepsy.

Epilepsy is one of the most common neurological disorders, and around one-third of epilepsy patients do not respond well to anti-seizure drugs. Until now, it was believed that epilepsy was caused solely by dysfunction in the brain's neurons. However, recent findings suggest that epilepsy can be caused by many other factors, including a dysfunctional blood-brain barrier. Essentially, seizures erode the lining of capillaries in the brain, which plays a role in letting nutrients in and keeping toxins out. This can result in a "leaky" blood-brain barrier, which leads to more seizures, resulting in epilepsy progression.

Björn Bauer's lab at the UK College of Pharmacy collaborated with Sanders-Brown Center on Aging scientists to conduct research focused on this barrier leakage. Bauer and colleagues hypothesized that glutamate, released during seizures, mediates an increase in certain enzymes and activity levels, thereby contributing to barrier leakage.

Through their research, they found that the neurotransmitter glutamate, released during seizures, increased the activity of two types of enzymes, which in turn increased barrier leakage. They also found that pharmacologically blocking or genetically deleting the enzyme cPLA2 may prevent these changes and the associated leakage. This suggests that cPLA2 is responsible for barrier leakage.

Since 30 percent of people with epilepsy do not respond well to current anti-seizure medications, these findings demonstrate there could be new ways to treat and manage seizures that currently do not respond well to medication.

The data gathered imply that cPLA2 could be a pharmaceutical target to repair and normalize barrier dysfunction and improve the treatment of epilepsy and potentially other neurological disorders that are accompanied by blood-brain barrier leakage. These strategies to repair barrier dysfunction could be valuable add-on treatments to existing pharmacotherapy.

Credit: 
University of Kentucky

Planetary waves similar to those that control weather on Earth discovered on Sun

image: Rossby waves on the Sun are waves of vorticity that move in the direction opposite to rotation. Associated flows have amplitudes of about one meter per second that peak in the Sun's equatorial regions.

Image: 
Max Planck Institute for Solar System Research

Abu Dhabi, May 7, 2018: An international team of scientists, led by Laurent Gizon, co-principal investigator of the Center for Space Science at NYU Abu Dhabi (NYUAD), has discovered planetary waves of vorticity on and inside the Sun similar to those that significantly influence weather on Earth.

Rossby waves are a natural phenomenon in the atmospheres and oceans of planets that form in response to the rotation of the planet. Like Earth, the Sun also rotates and should support Rossby waves, but their existence on the Sun has been debated, until now.

"There's no doubt what we're seeing are Rossby waves due to the measured, textbook relationship between frequency and wavelength, said Gizon.

Solar Rossby waves are gigantic in size, Gizon explained, with wavelengths comparable to the solar radius. They are an essential component of the Sun's internal dynamics because they contribute half of the Sun's large-scale kinetic energy.

"That these waves are so big and are only seen in the equatorial regions of the Sun is completely unexpected," he said.

Astrophysicists from NYUAD, the Max Planck Institute for Solar System Research, and Stanford University studied six years of space data, which revealed the Rossby waves moving in the direction opposite to the Sun's rotation.

Rossby waves on the Sun are close relatives to those known to occur in the Earth's atmosphere and oceans, Gizon said, but are extremely difficult to detect on the Sun because they have very small flow amplitudes, around one meter per second.

Solar Rossby Waves Characteristics

- waves of vorticity

- move in the direction opposite to rotation

- well-defined relationship between frequency and wavelength

- found only near the equator

- small amplitude, difficult to detect

- live for several months

- contribute half of the Sun's kinetic energy at large scales

Earth's Rossby Waves Characteristics

- found at mid-latitudes in the atmosphere and ocean

- significant role in shaping weather

Analysis and confirmation

Scientists analyzed data collected from 2010 to 2016 by the Helioseismic and Magnetic Imager (HMI) instrument on board NASA's Solar Dynamics Observatory. The study required high-precision observations of the Sun over many months.

Granules were used as passive tracers to uncover the underlying, much larger vortex flows associated with Rossby waves. "The HMI images have sufficiently high spatial resolution to allow us to follow the movement of photospheric granules on the Sun's visible surface," said Bjoern Loeptien, scientist at the Max Planck Institute and first author of the paper. These granules are small convective cells roughly 1,500 kilometers in size on the solar surface.

Helioseismology, the study of the solar interior using solar internal acoustic waves, was used to verify the findings and observe the Sun's Rossby waves at depths up to 20,000 kilometers. "The results from helioseismology and granulation tracking are in excellent agreement," asserted Gizon.

"We don't yet know what role Rossby waves play in the Sun, but know that they can't be ignored in future studies," added Katepalli R. Sreenivasan, NYUAD Center for Space Science principal investigator, "their presence may help us understand solar convection at the largest spatial scales, which remains poorly understood. They are very hard to find because of low signal levels but this research team has used ingenious data processing techniques to discover their existence."

Their findings are reported in the journal Nature Astronomy.

NYUAD Provost Fabio Piano said, "We congratulate the researchers, including NYUAD Research Professor Laurent Gizon, for their work on this important discovery confirming the presence of Rossby waves on the Sun. The Center for Space Science is running a world-class research and outreach program in solar, stellar, and exoplanet science. In addition to being a hub of intellectual activity within NYUAD, the Center is quickly becoming a significant resource in supporting the priority space sector within the UAE."

Credit: 
Rubenstein Associates, Inc.

Generic options provide limited savings for expensive drugs

Generic drug options did not reduce prices paid for the cancer therapy imatinib (Gleevec), according to a Health Affairs study released today in its May issue.

After nearly two years of generic competition, the price for a month of treatment dropped by only 10 percent, according to authors from Vanderbilt University School of Medicine (VUSM) and the University of North Carolina at Chapel Hill.

"Most estimates of price reductions due to generic entry assume prices will drop by as much as 80 percent," said senior author Stacie Dusetzina, Ph.D., associate professor of Health Policy at VUSM. "Obviously we aren't even close to that mark."

Not only did prices remain high during that period; doctors were also initially slow to prescribe the generic treatment, she said.

Gleevec, the poster child for effective cancer therapies, became available in 2001 and changed chronic myeloid leukemia from a condition with a short life expectancy into a manageable chronic disease.

Because patients typically take Gleevec every day for the rest of their lives, costs of treatment can be a significant burden.

It was priced at nearly $4,000 per bottle when it came on the market in 2001 and that price escalated to nearly $10,000 per bottle by 2015, before a generic competitor entered the market.

But prices remained high even two years after a generic option was available.

"Patients and providers have all looked forward to generic entry, expecting major price reductions," Dusetzina said. "Unfortunately, we don't see prices drop as quickly and as low as we would hope when generics are available."

Dusetzina and first author Ashley Cole, doctoral candidate at the University of North Carolina at Chapel Hill, focused specifically on generic price competition for the specialty drug Gleevec in their Health Affairs study, "Generic Price Competition For Specialty Drugs: Too Little, Too Late?"

The authors said the Gleevec case demonstrates several potential barriers to effective generic price competition, including shifts in prescribing toward more expensive brand-name treatments and smaller-than-expected price reductions.

Twenty-four percent of imatinib (Gleevec) prescription claims were for "dispense as written," according to the authors. This suggests that patients or providers specifically wanted to stay on the branded drug instead of switching to the generic.

"The more than doubling of the drug price over time and the lack of price reductions observed with nearly two years of generic drug competition is concerning," said Dusetzina, also the Ingram Associate Professor of Cancer Research.

"It begs the question whether we can rely on generic entry as a primary approach to address drug pricing for high-priced specialty medications," she said. "We need robust competition to move prices in this space."

Credit: 
Vanderbilt University Medical Center