
New method lets doctors quickly assess severity of brain injuries

A new way to rapidly assess levels of consciousness in people with head injuries could improve patient care.

The new score - based on the Glasgow Coma Scale - could also help doctors assess the health of the patient's central nervous system in cases of serious trauma or intensive care.

Using it could improve the way doctors around the world care for patients in a coma from brain injury.

The Glasgow Coma Scale (GCS) was created at the University of Glasgow and the city's Southern General Hospital in 1974.

The 15-point scale - covering the patient's ability to open their eyes, speak and move - has revolutionised the care of brain-injured patients worldwide.

The original GCS team joined forces with researchers at the University of Edinburgh to improve the scale by adding a simple score for pupil response.

Using health records from more than 15,000 patients, they showed that the new score, known as the GCS-Pupil (GCS-P), would have improved doctors' ability to predict a patient's condition in the six months following a brain injury.
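
The published GCS-P is reported to combine the two assessments by simple subtraction: the number of pupils that fail to react to light (0, 1 or 2) is taken off the GCS total, extending the scale's lower range. A minimal sketch of that arithmetic, assuming this definition (the function name and validation are illustrative, not from the study):

```python
def gcs_pupil(gcs_total, unreactive_pupils):
    """Combine a GCS total with a pupil reactivity score.

    Assumes the published GCS-P definition: the count of pupils
    that do not react to light (0, 1, or 2) is subtracted from
    the GCS total (3-15), extending the scale's range to 1-15.
    """
    if not 3 <= gcs_total <= 15:
        raise ValueError("GCS total must be between 3 and 15")
    if unreactive_pupils not in (0, 1, 2):
        raise ValueError("unreactive pupil count must be 0, 1, or 2")
    return gcs_total - unreactive_pupils

# A deeply comatose patient (GCS 6) with both pupils unreactive
# scores GCS-P 4, signalling a worse prognosis than GCS alone shows.
print(gcs_pupil(6, 2))  # -> 4
```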

A major advantage of the GCS-P is its simplicity; it could be adopted easily in hospitals, allowing doctors to assess prognosis quickly, experts say.

There are almost 350,000 hospital admissions involving damage to the brain in the UK per year, equating to one admission every 90 seconds.

Dr Paul Brennan, who co-led the study from the University of Edinburgh's Centre for Clinical Brain Sciences, said: "The importance of the Glasgow Coma Scale to medicine cannot be overstated and our simple revision really improves its predictive ability and usefulness.

"Making major decisions about brain injured patients relies on quick assessments and the new method gives us rapid insights into the patient's condition. Our next step is to test the GCS-P more widely on large data sets from Europe and the US."

Professor Sir Graham Teasdale, Emeritus Professor of Neurosurgery at the University of Glasgow, who first developed the GCS and co-led the study, said: "This has been a very successful collaboration. It promises to add a new index to the language of clinical practice throughout the world. The GCS-P will be a platform for bringing together clinical information in a way that can be easily communicated and understood."

Credit: 
University of Edinburgh

Study finds humans and other vertebrates exposed to prenatal stress have higher stress levels after birth

image: This is the overall weighted effect size and 95 percent highest posterior density intervals, and the independent influence of each moderator variable in explaining variation in effect size. Width of lines and size of points are proportional to the number of effect sizes in each category.

Image: 
Adrian V. Jaeggi

Vertebrate species, including humans, exposed to stress prenatally tend to have higher stress hormones after birth, according to a new Dartmouth-led study published in Scientific Reports. While previous research has reported examples of maternal stress experience predicting offspring stress hormones in different species, this study is the first to empirically demonstrate the impact of prenatal stress on offspring stress hormone levels using data from all known studies across vertebrates.

Through a meta-analysis of 114 results from a total of 39 observational and experimental studies across 14 vertebrate species, including birds, snakes, sheep and humans, the study examines the impact of prenatal exposure to maternal stress on offspring. The researchers analyzed the role of the hypothalamic-pituitary-adrenal (HPA) axis, the stress physiological system shared across all vertebrates, which ultimately results in the production of stress hormones known as "glucocorticoids." The HPA axis is the hormonal system responsible for mobilizing an animal's stress response. Offspring exposed prenatally to maternal stress were found to have higher levels of these stress hormones after birth. This could reflect a biological adaptation with an evolutionary history, as elevated stress hormones could increase an animal's chances of survival in a stressful environment.
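
The pooling at the heart of such a meta-analysis can be illustrated with classical inverse-variance weighting, in which precise studies count for more than noisy ones. (The actual study used a Bayesian multilevel model with posterior density intervals; the sketch below, with made-up numbers, only shows the weighting principle.)

```python
# Fixed-effect pooling of study effect sizes by inverse-variance
# weighting: each study's weight is 1 / (its sampling variance).
def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)  # variance of the pooled estimate
    return pooled, pooled_var

# Three hypothetical standardized effects with their variances;
# the third study is the most precise, so it dominates the pool.
effects = [0.30, 0.45, 0.10]
variances = [0.02, 0.05, 0.01]
est, var = pooled_effect(effects, variances)
print(round(est, 3), round(var, 4))
```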

In the present study, the researchers tested the strength of the effect of prenatal stress on offspring stress hormone levels across a range of characteristics. Remarkably, the effects of prenatal stress on offspring stress hormones were consistent across species, regardless of evolutionary relationships or of factors such as brain or body size. There were also no differences when considering offspring sex, the offspring's age at the time of assessment, or the timing or severity of the prenatal stressor exposure.

Only two factors influenced the size of the effect. Experimental studies had a stronger effect than observational studies. In addition, studies that measured glucocorticoid recovery showed a greater association with prenatal stress than was observed at baseline or during peak glucocorticoid response.

"Animals, including humans, modify their stress hormones in response to their environment. Your stress response is set like a thermostat-- your body can amp up or down stress hormones in response to anticipated environmental conditions," explains lead author Zaneta Thayer, an assistant professor of anthropology at Dartmouth.

An animal's stress response tends to be activated by external factors, such as when it sees a predator or whether food is available. Higher stress hormone levels among offspring may help extend survival but come at a cost, and may affect other physiological systems, such as reproduction. In humans, the mere anticipation of stress, or just thinking about prior experiences of discrimination or trauma, can activate a stress response. Overactive stress hormones can lead to chronic health problems in humans, including anxiety, depression and cardiovascular disease.

One of the studies included in the meta-analysis looked at how maternal stress hormones in pregnant snowshoe hares changed in relation to the abundance of their natural predators, lynxes, over a 10-year cycle. The research team found that in years when there were more lynxes, snowshoe hare offspring had higher stress hormone levels and displayed more anti-predator behaviors.

"Our stress response is meant to be adaptive to acute stress, such as being chased by predators. However, humans' stress response is often triggered by social evaluative threats and is not serving the adaptive purpose that it was designed for," added Thayer. "This research confirms what other scientists have long speculated that there are trends across species when it comes to linking prenatal stress and offspring hormonal stress responses."

Prior work co-authored by Thayer has explored early origins of humans' health disparities and the impacts of maternal stress during pregnancy on offspring's postnatal stress hormone levels.

Credit: 
Dartmouth College

Vampire bats' bloody teamwork

Many mammals consume blood as part of their diet, but blood is actually a pretty poor source of energy. Only bats (order Chiroptera) include species that feed exclusively on blood.

So how do vampire bats manage to survive on such low-grade nourishment? A recently published article in Nature Ecology & Evolution provides part of the answer. The bats had to evolve in tandem with their microorganisms.

The challenges were clear. Blood consists of 78 per cent liquid. The remainder is 93 per cent proteins and only one per cent carbohydrates. Blood provides very little in the way of vitamins. On top of all that, a blood-based diet exposes these animals to blood-borne pathogens.

To find the answer, researchers had to look at the vampire bat's genome.

So now the genes of the common vampire bat (Desmodus rotundus) have been thoroughly investigated. But not only its genes.

Professor Tom Gilbert is the senior author of the Nature article and has collaborated with PhD student and first author Lisandra Zepeda Mendoza on this research.

Gilbert works at the Centre for GeoGenetics at the University of Copenhagen, and also holds a part-time position as an adjunct professor at the Norwegian University of Science and Technology's (NTNU) University Museum.

"Coping with this kind of diet requires one species to co-evolve with other species," says Gilbert.

But exactly which other species vampire bats have co-evolved with may not be immediately apparent.

Some vampire bat characteristics are easy to recognize. Vampire bats have developed specialized adaptations that enable them to access blood and then make use of it.

All three species of vampire bats are native to the Americas. Anyone who has watched animal programs or scary movies is probably familiar with some of their adaptive behaviors.

Everyone knows about bats' razor-sharp teeth, especially their striking incisors, which are practical for penetrating the skin of their victims.

The bats also have specialized cells, called thermoreceptors, that can detect heat and are useful in finding bare skin on a sleeping animal at night.

Substances in the bat's saliva prevent the blood in a wound from coagulating. The bat's inner adaptations are at least as interesting as the more obvious outer ones. Their kidneys are specially adapted to cope with high protein content. Their immune system helps deal with any pathogens.

However, none of these known adaptations explains how vampire bats have evolved to rely exclusively on blood for their nutrition. This is what the researchers behind the recent Nature article set out to investigate.

This development seems to require one species to develop in tandem with other species. The way this happens may alter the way we perceive evolution: it turns out that some of the other species we co-evolve with may be found within us.

According to the article, a diet this specialized requires a highly specific adaptation of the genome of the species itself. But it also requires a uniquely adapted microbiome.

A microbiome consists of the entire genetic material of all the microorganisms that live in our bodies, whether we're talking about viruses, bacteria or fungi.

You and I and everyone you know are full of other organisms, maybe around 100 trillion of them. The actual number is controversial and subject to debate - but a lot of organisms at any rate.

Some folks may feel disgust at this thought, but there's little to loathe about them. We are completely dependent on other organisms to survive, and the vast majority of them are useful, or at least don't pull any bad pranks that you would notice.

We have even co-evolved with several of these organisms and share evolutionary history with them. At least that's the way it works with vampire bats.

"Our results show that vampire bats became blood sippers after their own genome and microbiome co-evolved closely," the Nature article says.

The genes of the vampire bats thus evolved along with all the microbes in their bodies.

In other words, in order to understand vampire bats you have to look at the bat's own genes and all the genes in the microbiome as a whole. This is what researchers call the "hologenome" - the genes of the host plus all its symbiotic microbial guests.

The common vampire bat has a unique hologenome. The vampire bat's microbiome helps to compensate for the lack of vitamins and various fatty substances in what would otherwise be an imbalanced diet. Its microbes also help the body get rid of waste and maintain the cells' fluid balance through osmoregulation.

The article's researchers emphasize the value they found in studying both the host and its interaction with the microbiome as they tried to figure out the adaptations that underlie the vampire bat's radical diet.

The bat's ability to survive on a diet that would be inadequate for other mammals is in fact only made possible through the help of the microbiome.

"But the main finding probably applies to all animals in regards to their diet, whether we're talking about cows and grass, vultures and carrion or koalas and eucalyptus. We have to look at both the animal itself and the collective microbes to understand what's happening. Now we've arrived at a point where this is possible both technically and economically," says Professor Gilbert.

Credit: 
Norwegian University of Science and Technology

Tiny injectable sensor could provide unobtrusive, long-term alcohol monitoring

image: Alcohol monitoring chip is small enough to be implanted just under the surface of the skin.

Image: 
David Baillot/UC San Diego Jacobs School of Engineering

Engineers at the University of California San Diego have developed a miniature, ultra-low power injectable biosensor that could be used for continuous, long-term alcohol monitoring. The chip is small enough to be implanted in the body just beneath the surface of the skin and is powered wirelessly by a wearable device, such as a smartwatch or patch.

"The ultimate goal of this work is to develop a routine, unobtrusive alcohol and drug monitoring device for patients in substance abuse treatment programs," said Drew Hall, an electrical engineering professor at the UC San Diego Jacobs School of Engineering who led the project. Hall is also affiliated with the Center for Wireless Communications and the Center for Wearable Sensors, both at UC San Diego. Hall's team presented this work at the 2018 IEEE Custom Integrated Circuits Conference (CICC) on Apr. 10 in San Diego.

One of the challenges for patients in treatment programs is the lack of convenient tools for routine monitoring. Breathalyzers, currently the most common way to estimate blood alcohol levels, are clunky devices that require patient initiation and are not that accurate, Hall noted. A blood test is the most accurate method, but it needs to be performed by a trained technician. Tattoo-based alcohol sensors that can be worn on the skin are a promising new alternative, but they can be easily removed and are only single-use.

"A tiny injectable sensor--that can be administered in a clinic without surgery--could make it easier for patients to follow a prescribed course of monitoring for extended periods of time," Hall said.

The biosensor chip measures roughly one cubic millimeter and can be injected under the skin into interstitial fluid--the fluid that surrounds the body's cells. It contains a sensor coated with alcohol oxidase, an enzyme that selectively interacts with alcohol to generate a byproduct that can be detected electrochemically. The electrical signals are transmitted wirelessly to a nearby wearable device such as a smartwatch, which also wirelessly powers the chip. Two additional sensors on the chip measure background signals and pH levels; these are canceled out to make the alcohol reading more accurate.
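
The cancellation idea is standard differential sensing: interference that appears identically on a blank reference channel can be subtracted from the enzyme-coated channel. A toy sketch of the principle (the numbers, names, and simple subtraction are illustrative, not the chip's actual circuitry):

```python
# Differential readout: subtract a blank "background" channel from
# the enzyme-coated alcohol channel so that drift and interference
# common to both electrodes cancel, leaving the alcohol-specific signal.
def corrected_signal(alcohol_ch, background_ch):
    return [a - b for a, b in zip(alcohol_ch, background_ch)]

alcohol_ch = [5.2, 5.6, 6.1, 6.0]     # enzyme electrode current (nA), toy values
background_ch = [1.2, 1.5, 1.9, 1.8]  # uncoated reference electrode (nA)
print(corrected_signal(alcohol_ch, background_ch))
```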

The researchers designed the chip to consume as little power as possible--970 nanowatts total, which is roughly one million times less power than a smartphone consumes when making a phone call. "We don't want the chip to have a significant impact on the battery life of the wearable device. And since we're implanting this, we don't want a lot of heat being locally generated inside the body or a battery that is potentially toxic," Hall said.

One of the ways the chip operates on such ultra-low power is by transmitting data via a technique called backscattering. This occurs when a nearby device like a smartwatch sends radio frequency signals to the chip, and the chip sends data by modifying and reflecting those signals back to the smartwatch. The researchers also designed ultra-low power sensor readout circuits for the chip and minimized its measurement time to just three seconds, resulting in less power consumption.
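
Backscatter communication can be pictured as the chip switching between reflecting and absorbing the reader's carrier wave, one state per bit, rather than generating its own radio signal. A toy on-off-keying model (purely illustrative, not the chip's actual modulation scheme):

```python
# Toy backscatter model: the chip encodes bits by toggling its
# reflection of the incoming carrier -- reflect the sample for a 1,
# absorb it (return 0) for a 0. No local carrier generation needed,
# which is why backscatter is so power-frugal.
def backscatter(carrier, bits):
    return [c if b else 0.0 for c, b in zip(carrier, bits)]

carrier = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]  # incoming RF samples (toy)
bits    = [1, 1, 0, 0, 1, 0]                 # one data bit per sample (toy)
print(backscatter(carrier, bits))  # -> [1.0, -1.0, 0.0, 0.0, 1.0, 0.0]
```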

The researchers tested the chip in vitro with a setup that mimicked an implanted environment. This involved mixtures of ethanol in diluted human serum underneath layers of pig skin.

For future studies, the researchers are planning to test the chip in live animals. Hall's group is working with CARI Therapeutics, a startup based in the Qualcomm Institute Innovation Space at UC San Diego, and Dr. Carla Marienfeld, an addiction psychiatrist at UC San Diego who specializes in treating individuals with substance abuse disorders, to optimize the chip for next generation rehab monitoring. Hall's group is developing versions of this chip that can monitor other molecules and drugs in the body.

"This is a proof-of-concept platform technology. We've shown that this chip can work for alcohol, but we envision creating others that can detect different substances of abuse and injecting a customized cocktail of them into a patient to provide long-term, personalized medical monitoring," Hall said.

Credit: 
University of California - San Diego

Scientists learn how to avoid a roadblock when reprogramming cells

image: Kazutoshi Takahashi (left) and Tim Rand (right), scientists in Shinya Yamanaka's laboratory at Gladstone, helped answer lingering questions about cellular reprogramming.

Image: 
Gladstone Institutes

SAN FRANCISCO, CA--April 10, 2018--Over a decade ago, Shinya Yamanaka and Kazutoshi Takahashi made a discovery that would revolutionize biomedical research and trigger the field of regenerative medicine. They learned how to reprogram human adult cells into cells that behave like embryonic stem cells. Scientists were shocked that something so complex could be done so simply, and they had thousands of questions.

The reprogrammed cells are known as induced pluripotent stem cells (iPSCs). Researchers can create iPSCs from a patient's blood or skin cells, and use these patient-specific cells to study diseases or even create new tissues that could be transplanted back into the patient as therapy.

Initially, Nobel Laureate and Gladstone Senior Investigator Yamanaka, MD, PhD, and Staff Research Investigator Takahashi, PhD, identified four genes--abbreviated as O, S, K, and M--that cause cells to transform into iPSCs. The genes O, S, and K were known to help the cells become pluripotent, which allows them to produce any other cell type in the body.

The role of gene M (short for MYC), however, was unclear. They knew that by adding MYC, they could reprogram cells 10 percent more efficiently. But they didn't know why.

Twelve years later, Yamanaka and Takahashi finally defined the role of MYC in this important reprogramming process, answering several lingering questions. Their findings are published today in the scientific journal Cell Reports.

They discovered that MYC helps cells get around a significant roadblock in the process. They also found that, in some instances, MYC isn't actually needed for adult cells to successfully transform into iPSCs.

The Power of Three Discoveries

To reprogram cells, scientists typically add four genes (O, S, K, and MYC) to a dish containing adult cells. This allows the cells to start multiplying, which is a distinctive feature of stem cells. But after three days, the cells suddenly encounter a roadblock and stop multiplying, or proliferating. Then, on day seven, the cells start multiplying again and go on to become iPSCs.

If the researchers don't add MYC to the dish, the cells go through the same process, but they never overcome the obstacle, so they cannot successfully convert into iPSCs.

"We realized that MYC seems to help cells get around this roadblock, and that this needs to happen for adult cells to turn into iPSCs, but we still didn't quite understand how MYC did that," explained Takahashi. "Interestingly, we were able to figure it out thanks to three discoveries that happened independently in the lab, while people were working on different things."

The first discovery helped them find an early indicator of a cell's potential to finish reprogramming. It also allowed them to easily identify when the roadblock would occur, providing a valuable time reference for the subsequent findings.

The second discovery stemmed from a separate project on a protein called LIN41. The scientists found that if they replaced MYC with LIN41 in the cocktail of genes involved in reprogramming--meaning if they used O, S, K and LIN41--they could convert adult cells into iPSCs with the same efficiency.

"This was strange because it meant that, contrary to what we believed, MYC isn't necessary for cells to reprogram efficiently," said Tim Rand, MD, PhD, staff scientist at Gladstone and a first author of the study. "It turns out that adding LIN41 altogether avoids the onset of the roadblock that prevents cells from converting into iPSCs."

The team found that when they use the combination of O, S, K, and LIN41, the adult cells don't stop proliferating after the third day. Instead, they continue to multiply as if nothing happened and successfully complete the reprogramming process. This is because LIN41 blocks another protein, called p21, which causes the roadblock.

The third discovery proved to be even more astonishing. It showed that, in a particular cell line, neither MYC nor LIN41 is needed to enhance reprogramming.

The scientists went through the same process using tumor-derived cells that continuously multiply. Then, they removed LIN41, and nothing happened. Puzzled, they tried to remove MYC and, once again, nothing changed.

"That result was very shocking to me," said Rand. "Given everything we thought we knew about MYC and LIN41 at the time, we couldn't comprehend how these genes were so beneficial in somatic cell reprogramming, but absolutely useless in tumor reprogramming. Eventually, when we realized how it fit in, it was such useful information. It made us realize that certain cell types can fortuitously accomplish the role of MYC and LIN41 during reprogramming--to disable the p21 response. If I could relive that day over again, I would make sure it was a big celebration."

Rand and the rest of the team realized that without p21, there is no roadblock, so LIN41 is not needed to avoid it. They also showed that MYC is mainly useful because it activates LIN41. So, without the p21 roadblock, MYC isn't needed either.
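
The dependency chain the team describes -- MYC activates LIN41, LIN41 suppresses p21, and active p21 imposes the roadblock -- can be captured in a toy boolean model (purely illustrative, not from the paper):

```python
# Toy boolean model of the reported relationships: MYC works mainly
# by activating LIN41, LIN41 blocks p21, and unblocked p21 halts
# proliferation (the "roadblock"), so reprogramming fails.
def reprogramming_succeeds(has_myc, has_lin41, cell_has_p21):
    lin41_active = has_lin41 or has_myc   # MYC acts via LIN41
    p21_active = cell_has_p21 and not lin41_active
    return not p21_active                 # no roadblock -> iPSCs form

print(reprogramming_succeeds(False, False, True))   # OSK alone: False
print(reprogramming_succeeds(True, False, True))    # OSK + MYC: True
print(reprogramming_succeeds(False, True, True))    # OSK + LIN41: True
print(reprogramming_succeeds(False, False, False))  # p21-free tumor line: True
```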

Bringing Clarity to a Complex Process

Through these multiple discoveries, the Gladstone scientists noticed that the reprogramming process involves many genes and proteins important for cancer biology. In fact, they believe the roadblock trying to prevent cells from multiplying is the same one that tries to prevent cancer from spreading.

"When cancer biologists add certain factors to a cell that should drive it toward cancer, the cell panics and, to protect itself, it stops multiplying," said Takahashi. "We think the same thing is happening here, because cells are reacting to reprogramming as if it were cancer. It's not that they're trying to block the cells from transforming into iPSCs, but they've simply never been exposed to this process before and don't know how to react."

The new study explains many important activities involved in cellular reprogramming, and debunks certain leading theories about the role of MYC in this process.

"For a long time now, the entire field was collecting data on MYC, LIN41, and other genes and proteins without knowing what most of it meant," said Yamanaka, who is also director of the Center for iPS Cell Research and Application (CiRA) at Kyoto University, and professor at UC San Francisco. "Our study finally allows us to clearly understand all the data and address questions about the roles and importance of many of these elements."

With a clearer picture of the reprogramming process in hand, the field of regenerative medicine can now build upon these findings to answer the next set of burning questions.

Credit: 
Gladstone Institutes

Solo medical practices outperform groups in treatment of cardiac disease

In a recently published article in the Annals of Family Medicine, Donna Shelley, MD, MPH, et al, aimed to describe small, independent primary care practices’ performance in meeting the Million Hearts ABCSs (aspirin use, blood pressure control, cholesterol management, and smoking screening and counseling), as well as on a composite measure that captured the extent to which multiple clinical targets are achieved for patients with a history of atherosclerotic cardiovascular disease (ASCVD). They also explored relationships between practice characteristics and ABCS measures.

The article, entitled “Quality of Cardiovascular Disease Care in Small Urban Practices,” concludes that achieving targets for ABCS measures varied considerably across practices; however, small practices were meeting or exceeding Million Hearts goals (i.e., 70 percent or greater). Practices were less likely to consistently meet clinical targets that apply to patients with a history of ASCVD. Greater emphasis is needed on providing support for small practices to address the complexity of managing patients with multiple risk factors, for both primary and secondary prevention of ASCVD.

“Quality of Cardiovascular Disease Care in Small Urban Practices,” by Donna Shelley, MD, MPH, et al, New York, New York
http://www.annfammed.org/content/16/Suppl_1/S21

Journal

The Annals of Family Medicine

Credit: 
American Academy of Family Physicians

Leadership and adaptive reserve are not associated with blood pressure control

In a recently published study in the Annals of Family Medicine, Kamal Henderson, MD, et al, assessed whether a practice’s adaptive reserve and high leadership capability in quality improvement are associated with population blood pressure control. The article, entitled “Organizational Leadership and Adaptive Reserve in Blood Pressure Control: The Heart Health NOW Study,” reveals that adaptive reserve (i.e., the ability of a practice to weather the process of change) and leadership capability in quality improvement implementation are not statistically associated with achieving top-quartile practice-level hypertension control at baseline in the Heart Health NOW project. The findings, however, may be limited by the lack of patient-related factors and by a small sample size, which preclude strong conclusions.

Credit: 
American Academy of Family Physicians

Major disruptions are frequent in primary care

In primary care practices, sustainability of performance improvements and ability to deliver continuity of care to patients can be adversely affected by major disruptive events, such as relocations and changes in ownership, clinicians, and key staff. This is according to a recently published study in the Annals of Family Medicine entitled “The Alarming Rate of Major Disruptive Events in Primary Care Practices in Oklahoma,” in which James Mold, MD, MPH, et al, documented the rates of major disruptive events in a cohort of primary care practices in Oklahoma.

During a 2-year period, major disruptive events occurred at an alarming rate, adversely affecting quality improvement efforts. Most reported events involved losses of clinicians and staff. More research is needed to identify and address the root causes of these events.

“The Alarming Rate of Major Disruptive Events in Primary Care Practices in Oklahoma,” by James W. Mold, MD, MPH, et al, Oklahoma City, Oklahoma
http://www.annfammed.org/content/16/Suppl_1/S52

Journal

The Annals of Family Medicine

Credit: 
American Academy of Family Physicians

Overlapping mechanisms in HIV cognitive disorders and Alzheimer's disease

image: Aβ oligomers are elevated in the brains of HIV(+) cases. Paraffin-embedded tissue sections from hippocampus of HIV(-) and HIV(+) individuals were prepared for immunofluorescent analysis and visualized by laser confocal microscopy. Representative images are shown from hippocampal sections triple-labeled for Aβ oligomers (red), MAP2 (green), and nuclei (blue). Red and green colocalization appears yellow.

Image: 
Stern et al., JNeurosci (2018)

A protein involved in Alzheimer's disease (AD) may be a promising target for treating neurological disorders in human immunodeficiency virus (HIV) patients, suggests a study of rat neurons and postmortem human brain tissue published in JNeurosci. The research shows that the two conditions may damage neurons in similar ways.

Although HIV-associated neurological disorders (HAND) and AD have symptoms in common, whether they also share underlying mechanisms of disease progression is controversial because HAND patients do not exhibit the amyloid plaques that are characteristic of AD. To address this question, Kelly Jordan-Sciutto and colleagues investigated the role of a well-known AD protein -- β-site amyloid precursor protein cleaving enzyme 1 (BACE1) -- in HAND. The researchers found elevated levels of BACE1 and Aβ oligomers -- the compound thought to be responsible for neuronal damage in AD -- in postmortem brain tissue of HIV-positive humans. Treating rat neurons with white blood cells from healthy humans that had been infected with HIV revealed similar mechanisms of neurotoxicity.

Credit: 
Society for Neuroscience

Stop prioritizing the car to tackle childhood obesity, governments/planners urged

The dominance of the "windscreen perspective," whereby governments and planners view the world quite literally from the driving seat, has allowed car travel to become the "default choice," the authors argue.

Consequently, investment in road building far exceeds that for active travel--public transport, footpaths, and cycle lanes--"resulting in an environment that often feels too risky for walking or cycling," they suggest.

The average length of a school journey has nearly doubled since the 1980s to just under 4 miles in 2013. But the age at which parents will allow their children to go to school by themselves has been steadily creeping up amid fears about road safety.

So they drive their children to school. But what is often not recognised is just how much air pollution children travelling by car are exposed to inside the vehicle under urban driving conditions, the authors point out.

Encouraging independent travel not only helps shed the pounds, but has knock-on social and mental health benefits, and it breaks the cycle of normalising car travel for future generations, they say.

They admit there is no single solution, but safe routes to school are needed. The UK could adopt the school travel initiatives pioneered by Germany, The Netherlands, and Denmark, they suggest.

And it could plough more cash into the Sustainable Travel Towns programme, already implemented in some parts of the UK.

This programme of town-wide measures, which aims to curb car use, has helped boost economic growth, cut carbon emissions and promote quality of life in those areas where it has been adopted, the authors point out.

"For a fraction of the road building programme cost, we could see not just safe routes to schools, but, even more importantly, safe routes wholesale across urban areas, they argue.

In an accompanying letter, sent to all four UK transport ministers--Chris Grayling in England; Humza Yousaf (Scotland); Ken Skates (Wales); and Karen Bradley (Northern Ireland)--the authors point to significant savings to the NHS, reductions in pollution levels, and ingraining sustainable travel behaviours among future generations if active travel were to be prioritised.

"The rhetoric of improving the environment in favour of children's active travel has been visible for at least two decades, but tangible changes have largely been absent from transport planning," they write.

"We suggest the time is right to redress the imbalance and give back to today's children many of the freedoms that older adults recall and benefited from in terms of the levels of independent mobility," they conclude.

Credit: 
BMJ Group

Blood flow is a major influence on tumor cell metastasis

video: This video shows blood flow tuning of tumor metastasis.

Image: 
Jacky Goetz

Scientists have long theorized that blood flow plays an integral role in cancer metastasis. New research testing this long-held hypothesis in zebrafish and humans confirms that blood flow influences where circulating tumor cells ultimately arrest in the vasculature and exit into surrounding tissue, where they can form metastases.

In a paper published April 9 in Developmental Cell, researchers from the French National Institute of Health and Medical Research (INSERM) followed labeled circulating tumor cells (CTCs) throughout the vasculature of the zebrafish embryo model. The locations where the tumor cells arrested correlated closely with blood flow velocities below 400-600 μm/s. The larger aim of the study was to visualize the impact of blood flow on key steps in metastasis--arrest of the CTCs, adhesion to the vasculature, and extravasation of the CTCs from the blood vessel.

"A long-standing idea in the field is that arrest is triggered when circulating tumor cells end up in capillaries with a very small diameter simply because of size constraints," says author Jacky G. Goetz, PhD, whose laboratory conducted the study. "This research shows that this position is not only driven by physical constraint but that blood flow has a strong impact on allowing the tumor cells to establish adhesion with the vessel wall. I think this is an important addition to understanding how and where tumor cells would eventually form metastases."

Researchers chose the zebrafish embryo model since its vasculature is highly stereotyped. "This made it much easier to document the position of all the tumor cells after they were injected," explains Goetz. The team compiled all of the images together and created heat maps of the position of the tumor cells in the vasculature.
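The pooling-and-heat-map step the team describes can be sketched in a few lines. This is an illustrative stand-in, not the authors' pipeline: the coordinates, bin width and units are hypothetical placeholders for the real imaging data.

```python
from collections import Counter
import random

# Hypothetical pooled 2D positions of arrested tumor cells across images.
random.seed(0)
positions = [(random.gauss(50, 10), random.gauss(30, 5)) for _ in range(500)]

BIN = 10  # bin width (hypothetical units)

# Bin each position onto a coarse grid; the per-bin counts form the heat map.
heatmap = Counter((int(x // BIN), int(y // BIN)) for x, y in positions)

# The most populated bin marks the strongest arrest hotspot.
hotspot, count = heatmap.most_common(1)[0]
print(hotspot, count)
```

Rendering the counts as a colour map over the vasculature then highlights where arrest events cluster.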

The researchers also found that blood flow is essential for the process of extravasation, when tumor cells leave the circulatory blood vessel and cross the endothelial barrier at a new site to establish a secondary tumor. "When we did timelapse imaging in the zebrafish embryo, we found that endothelial cells appear to curl around the tumor cells that are arrested in the blood vessel," says Goetz. "Blood flow at this step is essential. Without flow, endothelial remodeling does not occur. You need a certain amount of flow to keep the endothelium active so that it can remodel around the tumor cell."

They further confirmed this observation in brain metastases in mice using intravital correlative microscopy, an imaging technique developed by the Goetz laboratory in collaboration with Y. Schwab (EMBL, Heidelberg), which combines imaging of living multicellular model systems with electron microscopy to provide details of dynamic or transient events in vivo.

The researchers next applied these findings to study brain metastases in 100 human patients with heterogeneous primary tumor locations. As in the zebrafish model, they mapped the positions of the metastases and generated heat maps. "We were able to merge the brain metastases map with a perfusion map of a control patient and found that it reproduced exactly what we did in the zebrafish, showing that metastases preferentially develop in areas with low perfusion," says Goetz.

The researchers conclude that all of these findings show that blood flow at metastatic sites regulates where and how metastatic outgrowth develops. Looking ahead, the researchers plan on studying methods to inhibit the endothelial remodeling ability of the blood vessel to potentially impair extravasation and inhibit metastasis.

Credit: 
Cell Press

Binge-eating mice reveal obesity clues

image: Chocolate bar similar to the ones used in this study.

Image: 
CC0

Obesity is a growing issue in many countries, accelerated by easy access to calorie-dense foods that are pleasurable to eat (known as an 'obesogenic environment'). But while it's clear that eating too much leads to weight gain, little is known about the underlying behaviours that lead to overeating.

To mimic this obesogenic environment, the teams led by Mara Dierssen at CRG and Rafael Maldonado at UPF offered mice the option of a high-fat 'cafeteria' diet or a mixture of chopped-up commercial chocolate bars alongside their regular lab chow, before carrying out a detailed analysis of the animals' activity and feeding behaviour. Their results have been published in two back-to-back articles in the journal Addiction Biology.

Working together with Cedric Notredame (CRG) and Elena Martín-García (UPF), the scientists found that as well as becoming obese, the mice started very early to show the signs of addiction-like behaviour and binge-eating in response to these enticing foods.

For example, when offered chocolate for just one hour per day, the animals compulsively 'binged', consuming as much chocolate in that hour as they would over a whole day if it were continuously available. They also showed inflexible behaviours, similar to those seen in addiction, choosing to wait for chocolate while ignoring freely available standard chow. Yet, at the same time, the chocolate did not seem to satiate hunger as well as regular food.

The team found that animals on the high-fat or chocolate diet also changed their daily routines. They were more likely to eat during the daytime - mice are usually nocturnal and feed at night - and they ate shorter, more frequent 'snacks' rather than larger, more widely spaced meals.

A major problem in treating obesity is the high rate of relapse to abnormal food-taking habits after maintaining an energy-balanced diet. The scientists evaluated this relapse and found that extended access to hypercaloric diets impairs the control of food-seeking behaviour and has deleterious effects on learning, motivation and behavioural flexibility.

"Our results revealed that long-term exposure to hypercaloric diets impairs the ability to control eating behaviour, leading to negative effects on the cognitive processes responsible for a rational control of food intake," says Maldonado, head of the Neuropharmacology Laboratory at UPF.

"Obesity is not just a metabolic disease - it is a behavioural issue. People who are overweight or obese are usually told to eat less and move more, but this is too simplistic," explains Mara Dierssen, group leader of the Cellular and Systems Neurobiology laboratory at CRG. "We need to look at the whole process. By understanding the behaviours that lead to obesity and spotting the tell-tale signs early, we could find therapies or treatments that stop people from becoming overweight in the first place."

The scientists are now expanding their research to larger numbers of animals and they are also planning a study to look at addiction-like behaviours in obese people to see how well their results translate to humans.

"It is very hard to lose weight successfully, and many people end up trapped in a cycle of yo-yo dieting," Dierssen explains. "We need to focus on preventing obesity, and this study shows us that understanding and modifying behaviour could be the key." Maldonado adds: "These studies reveal the major behavioural and cognitive changes promoted by hypercaloric food intake, which could be crucial for repeated weight gain and the difficulty of maintaining an appropriate diet."

Credit: 
Center for Genomic Regulation

Gender gap in academic medicine has negative impact, but there are simple solutions

image: Dr. Sharon Straus, director of the Knowledge Translation Program at the Li Ka Shing Knowledge Institute of St. Michael's Hospital, co-led this study.

Image: 
St. Michael's Hospital

TORONTO, April 9, 2018 - Existing gender gaps in academic medicine may have a negative impact on workplace culture and organizational effectiveness, but there are simple, systems-based solutions, suggests a new study.

Published today in BMC Medicine, the study led by Dr. Reena Pattani and Dr. Sharon Straus analyzed interviews conducted with female and male faculty members at the University of Toronto and its six fully-affiliated hospitals. Interviews with frontline staff uncovered that the gender gap in academic medicine has a negative impact.

"We identified three key themes that the gender gap fuels: social exclusion of female colleagues, reinforced stereotypes, and unprofessional behaviour," said Dr. Straus, director of the Knowledge Translation Program at the Li Ka Shing Knowledge Institute of St. Michael's Hospital. "Interestingly, instead of just focusing on the issues, the study's participants also offered system-based solutions to close the gap."

Several opportunities to mitigate the gap emerged from this research, said Dr. Straus, who is also interim physician-in-chief at St. Michael's. The participants suggested more streamlined recruitment, hiring, and promotions processes and simple amendments to the work environment, such as unconscious bias training for leaders and holding meetings during work hours so that all faculty members can attend. Formalized mentorship and consistent monitoring of the gender gap were also offered as solutions.

"These are solutions at the level of the institution, rather than at the level of the individual," said Dr. Pattani, a physician at St. Michael's. "The advice historically given to women has been about how they can change their own personal behaviours. That overlooks some of the systemic factors holding women back. We need to ensure that institutions can create a more inclusive environment for all."

This work builds on previous research conducted by Dr. Straus, which found that a significant gender gap existed across the research institute at St. Michael's Hospital. Since then, steps have been taken to reduce the gender imbalance. Across the University of Toronto's Department of Medicine, search guidelines, transparent promotions processes, and a formal mentorship program have also been implemented, and a biennial faculty survey helps university and hospital leadership monitor trends of the gender gap over time. At St Michael's Hospital, the Department of Medicine has also been hosting networking opportunities for female trainees.

"These are inspiring measures to promote equity. We can work together to institute more system-based approaches like these ones, which are low cost and easy to implement," Dr. Pattani said.

The researchers acknowledge that there are also other factors beyond gender that impact workplace culture, which were not studied in this paper. Further research is needed to understand how sexual orientation, race and ethnicity play a role in the representation gap, Dr. Straus said.

Credit: 
St. Michael's Hospital

Wheat research discovery yields genetic secrets that could shape future crops

image: Floral architecture in wheat. The spikelets highlighted in purple are the additional spikelets that form part of the paired spikelets.

Image: 
CSIRO

A new study has isolated a gene controlling shape and size of spikelets in wheat in a breakthrough which could help breeders deliver yield increases in one of the world's most important crops.

The team from the John Innes Centre say the underlying genetic mechanism they have found is also relevant to inflorescence (floral) architecture in a number of other major cereals including corn, barley and rice.

The genetic identification of an agronomically relevant trait represents a significant milestone in research on wheat, a crop with a notoriously complex genome.

The findings, published today in the journal The Plant Cell, give breeders a new tool to accelerate the global quest to improve wheat. The study also highlights a range of next generation techniques available for fundamental research into wheat, the world's most abundantly produced crop.

The Wheat Initiative, which co-ordinates global wheat research, has identified floral architecture as one of the key traits that must be improved if the 1.6% yield increase needed to feed a growing world population is to be achieved.

Dr Scott Boden from the John Innes Centre, whose crop genetics laboratory led the study alongside colleagues from Australia and Cambridge, said it represented a breakthrough both in lab and field.

"This paper is an example of what we are capable of doing in wheat now with a lot of the resources that are coming on board. We have gone from the field to the lab and back again. This is a developmental gene that contributes to a lot of agronomically important traits. This knowledge and the resources that come from this study can be used to see if it really does benefit yield."

"We have approached this in an academic sense but we have moved it towards giving breeders tools they can work with to optimise floral development."

Diversity of floral architecture has been exploited by generations of crop breeders to increase yields, and genetic variation for this trait has the potential to further boost grain production.

The study focused on the genetics behind a specific mutant trait in bread wheat known as paired spikelets, where a wheat inflorescence is formed of two spikelets instead of the usual one. This trait, which bears resemblance to flower production in corn and rice, is a variation that could lead to an increase in yield.

Using a range of techniques including plant transformation, gene sequencing and speed breeding, researchers investigated lines of wheat displaying paired spikelets, derived from a mapping population called a multi-parent advanced generation intercross (MAGIC), a population of spring wheat created as a tool to study and identify the genetic origins of relevant traits.

The study revealed that a gene called TEOSINTE BRANCHED1 (TB1) regulates wheat inflorescence architecture, promoting paired spikelets via a mechanism which delays flowering and reduces the expression of genes that control the development of lateral branches called spikelets.

Further analysis showed that alleles that modify the function of TB1 were present in a wide range of major modern wheat cultivars used by breeders in the UK and Europe. The team also found variant alleles for TB1 on two of the three wheat genomes of winter and spring wheat.

Genetic analysis also showed that TB1 is linked to another gene that has been known for a long time: the so-called Green Revolution gene, Rht-1, which controls plant height.

Further studies will determine whether some of the effects attributed to Rht-1 are actually TB1 effects.

The authors of the study say the TB1 gene also contributes to the diversity of floral architecture in a number of other cereals including corn, barley and rice - with interest in the paper already coming from those research communities.

Dr Boden hopes that one of the impacts of the paper will be to encourage more early-career researchers to choose wheat for developmental research projects.

The full findings are available in the paper: Teosinte Branched1 regulates inflorescence architecture and development in bread wheat.

Credit: 
John Innes Centre

Kids with regular health care less likely to have life-threatening diabetic ketoacidosis

Having a regular health care provider is a key factor in reducing the risk of diabetic ketoacidosis (DKA), a potentially life-threatening complication, in children at diagnosis of type 1 diabetes, according to a study in CMAJ (Canadian Medical Association Journal).

Type 1 diabetes mellitus is a common chronic childhood disease. If untreated, it can result in DKA, the most common cause of death in children with type 1 diabetes. DKA occurs as the body breaks down muscle and fat for energy in place of sugar, releasing acids (ketones) into the blood.

"Having a regular primary care provider was associated with a reduced risk of DKA at diabetes onset, but this protection reached statistical significance only among those 12-17 years of age," writes Dr. Meranda Nakhla, Montreal Children's Hospital and the Research Institute of the McGill University Health Centre, Montreal, Quebec, with coauthors. "Adolescents who had a regular family physician or pediatrician were 31% less likely or 38% less likely, respectively, to present with DKA relative to those without regular primary care."

The researchers looked at data on 3704 children with newly diagnosed diabetes over the study period from 2006 to 2015. The mean age at diagnosis was 10 years and about 27% (996 children) presented with diabetic ketoacidosis at time of diagnosis. About 59% (2177 children) had a regular primary care provider before diagnosis. Children of lower socioeconomic status or living in small cities were more likely to have a diabetic ketoacidosis episode at diagnosis of diabetes than those of higher socioeconomic status or those living in urban areas.
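The reported percentages follow directly from the counts given above; a quick arithmetic check, using only the figures quoted in the study:

```python
# Counts reported in the CMAJ study (2006-2015 cohort).
n_total = 3704        # children with newly diagnosed type 1 diabetes
n_dka = 996           # presented with DKA at diagnosis
n_regular_pcp = 2177  # had a regular primary care provider before diagnosis

# Proportions, rounded to whole percentages as in the article.
pct_dka = round(100 * n_dka / n_total)          # → 27
pct_pcp = round(100 * n_regular_pcp / n_total)  # → 59
print(pct_dka, pct_pcp)
```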

"Our study provides further evidence for policy-makers about the need to develop and strengthen initiatives that promote primary care for children," write the authors. "Our results highlight the need to develop targeted interventions for children under 12 years of age, including increasing public and physician awareness (through educational campaigns) about the symptoms of diabetes in this age group."

In a related commentary http://www.cmaj.ca/lookup/doi/10.1503/cmaj.180220, Dr. Astrid Guttmann, The Hospital for Sick Children (SickKids) and the Institute for Clinical Evaluative Sciences (ICES), Toronto, Ontario, writes the "study serves as an example of one of the many important child health outcomes that are both sensitive to access to timely care and independently related to socioeconomic health status."

Credit: 
Canadian Medical Association Journal