Culture

What is a scream? The acoustics of a primal human call

image: Harold Gouzoules, a professor of psychology at Emory University, is studying the origins of screams and the role they played in human evolution.

Image: 
Emory University

Screams are prompted by a variety of emotions -- from joyful surprise to abject terror. No matter what sparks them, however, human screams share distinctive acoustic parameters that listeners are attuned to, suggests a new study published in the Journal of Nonverbal Behavior.

"Screams require a lot of vocal force and cause the vocal folds to vibrate in a chaotic, inconsistent way," says senior author Harold Gouzoules, a professor of psychology at Emory University. "Despite the inherent variation in the way that screams are produced, our findings show that listeners can readily distinquish a scream from other human calls. And we are honing in on how they make that distinction."

Jay Schwartz is first author of the paper and Jonathan Engleberg is a co-author. They are both Emory PhD candidates in Gouzoules' Bioacoustics Lab.

Gouzoules began researching monkey screams in 1980, before becoming one of the few scientists studying human screams about 10 years ago. He is interested in the origins of screams and the role they played in human development.

"Animal screams occur almost always in the context of a fight or in response to a predator," Gouzoules says. "Human screams happen in a much broader array of contexts, which makes them much more interesting."

Gouzoules' Bioacoustics Lab has amassed an impressive library of high-intensity, visceral sounds -- from TV and movie performances to the screams of non-actors reacting to actual events posted to online sites such as YouTube.

For the current study, the researchers presented 182 participants with a range of human calls. Some of the calls were screams of aggression, exclamation, excitement, fear or pain. Other calls included cries, laughter and yells.

The participants showed strong agreement about what qualified as a scream. An acoustic analysis showed that, compared with the other calls, those the participants classified as screams had a higher pitch; more roughness, or harshness, to the sound; wider variability in frequency; and a higher peak frequency.
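As a rough illustration of how such acoustic measurements are made, here is a minimal numpy/scipy sketch run on a synthetic signal. This is not the study's analysis pipeline, and roughness in particular requires more specialized measures; the signal and all parameter choices below are assumptions for illustration only.

```python
# Minimal sketch of extracting scream-like acoustic features from a signal.
# Illustrative only -- not the study's analysis pipeline.
import numpy as np
from scipy.signal import spectrogram

fs = 22050  # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
# Synthetic stand-in for a recorded call: a 1.2 kHz tone
# frequency-modulated by roughly +/- 300 Hz at 5 Hz
x = np.sin(2 * np.pi * 1200 * t + 60 * np.sin(2 * np.pi * 5 * t))

f, times, Sxx = spectrogram(x, fs=fs, nperseg=1024)

# Peak frequency: the frequency bin with the most energy, per time frame
peak_freq = f[np.argmax(Sxx, axis=0)]

print(f"mean peak frequency: {peak_freq.mean():.0f} Hz")
print(f"frequency variability (std of peak): {peak_freq.std():.0f} Hz")
```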

The current paper is part of an extensive program of research into screams by Gouzoules. In another recently published article, his lab found that listeners cannot distinguish acted screams from naturally occurring screams. Listeners can, however, correctly identify whether pairs of screams were produced by the same person or two different people.

Credit: 
Emory Health Sciences

Using lungs from increased-risk donors expands donor pool, maintains current survival rates

video: Cleveland Clinic researchers have found that using lungs from donors who are considered high risk for certain infectious diseases compared to standard risk donors results in similar one-year survival for recipients

Image: 
Cleveland Clinic

Thursday, December 5, 2019, CLEVELAND: Cleveland Clinic researchers have found that lungs from donors considered high risk for certain infectious diseases yield one-year recipient survival similar to that of lungs from standard-risk donors. In addition, researchers saw no difference in rejection or graft (donor lung) survival after one year in patients receiving lungs from increased-risk donors.

The study was published recently in the Journal of Thoracic and Cardiovascular Surgery.

In 2013, the proportion of non-standard-risk lung donors increased as the U.S. Public Health Service expanded the definition of what it means to be a "high risk" donor. The definition broadened the designation to include more organs in this category and changed the name to "increased risk" donors. The designation is used to identify risky donor behavior with the goal of reducing the transmission of HIV, hepatitis B, and hepatitis C. All organs considered for transplant are tested for infectious disease, but there is a very small possibility of an infection not showing up on early testing because the immune system has not yet produced enough antibodies to be detected.

Increased risk behaviors include activities like non-medical intravenous drug use and sexual contact with a person known or suspected to have HIV, hepatitis B or hepatitis C infections. The broadened definition also encompasses donors whose medical or behavioral history cannot be obtained. Prior to the changes, about 8% of organs were considered "high risk"; after the changes, about 22% were considered "increased risk."

During the study, researchers looked at a total of 18,490 patients, with 64% transplanted during the high-risk-designation period and 36% during the increased-risk period. Researchers found no statistically significant differences in survival, treated acute rejection, or graft survival between those receiving increased-risk or high-risk donor organs and those receiving standard-risk organs. This study did not look at recipients who accepted organs known to have hepatitis C, which, with new treatment options for the infection, is becoming more common.

Researchers worry the broadened definition has the potential to narrow the donor pool, because transplant candidates often refuse organs from increased risk donors. Transplant candidates must consent to use a non-standard-risk organ, and studies have shown up to 78% of waitlist candidates refuse an offer from an increased-risk donor. Due to organ shortages, approximately 10% of U.S. lung transplant candidates die on the waiting list every year.

"Our findings raise the question of the utility of the designation of 'increased risk' for donor lungs, since there is no impact on outcomes," said Carli Lehr, M.D., M.S., a transplant pulmonologist at Cleveland Clinic and lead author of the study. "Forgoing the designation, treating all donors as potentially at risk, and using appropriate post-transplant screening for infectious diseases may increase overall organ utilization and lessen deaths on the waitlist."

Currently, there are about 1,450 people waiting for a lung transplant in the United States.

Credit: 
Cleveland Clinic

Young people with IBD five times more likely to develop serious infections

(Vienna, December 5, 2019) Young patients with inflammatory bowel disease (IBD) are five times more likely than the general population to develop viral infections that can lead to hospitalisation or permanent organ damage, a new study published in the UEG Journal has found.1

In the first study of its kind, researchers analysed almost 2,700 IBD patients in a Paris referral centre to understand the respective roles of IBD activity and drugs in promoting systemic serious viral infection (SVI). The study identified clinically active IBD and thiopurines (a class of immunomodulators used to treat an estimated 60% of IBD patients2) as the main drivers of infection. Despite the highest risk of infection being seen in young patients between the ages of 18 and 35, a three-fold increased incidence of severe viral infections was observed in IBD patients of all ages.

The study also uncovered a concerning link between thiopurine use and a number of harmful infections. Whilst IBD patients receiving no treatment were at a similar risk level to the general population, patients treated with immunomodulators were found to be six times more likely to develop an SVI. The most common SVIs developed by IBD patients were identified as Epstein-Barr virus (EBV), which is associated with a range of diseases such as glandular fever and Hodgkin's Lymphoma, and cytomegalovirus (CMV), an infection which can pose a risk to unborn babies.

A correlation was also found between thiopurine use and EBV-induced hemophagocytic lymphohistiocytosis (HLH), an aggressive disease associated with high mortality rates.3 With a third of patients estimated to be stopping thiopurine use due to adverse side effects, these new findings underline the need to find novel therapeutic approaches to tackle IBD.2

Lead researcher Professor Laurent Beaugerie, from the Department of Gastroenterology at Saint-Antoine Hospital, commented, "Clinicians need to be aware of the substantially increased risk of SVI in patients with IBD, which had previously remained unclear. Young IBD patients are the most vulnerable to the development of SVIs, as they are less likely to have been exposed to viruses such as EBV or CMV before. They will therefore mount a less effective immune response. Their risk is further elevated by the inhibiting effect of the immunosuppressive drugs they are treated with."

The number of individual cases of IBD, which encompasses both Crohn's disease and ulcerative colitis, has shown a marked increase since 1990, rising from 3.6 million cases globally to over 6.8 million in 2017.4 Commenting on the increasingly heavy burden of IBD, Professor Beaugerie added, "The relation between IBD drugs and SVIs is especially concerning, as presently, hospitalisation due to the serious complications that accompany the disease is the main cost associated with the management of IBD. The growing prevalence of IBD across the globe will only add further to the pressure placed on healthcare structures."

New treatment pathways such as nutritional therapies in Crohn's disease and faecal microbiota transplantations (FMT), which have not been shown to be associated with an increased risk of SVI, could potentially alleviate the strain placed on healthcare systems. Therapies such as these could transform the course of treatment and confer significant benefits to patients.

The study, which has cast new light on the strong association between IBD drugs and SVI, emphasises the need for further research and funding into the area to improve patient outcomes. An investigation into promising new treatments should become the next course of action if the risk of SVI in IBD patients is to be brought closer to that of the general population.

Credit: 
SAGE

Fusion by strong lasers

image: Accelerator tunnel at the European XFEL

Image: 
DESY

Nuclear physics usually involves high energies, as illustrated by experiments to master controlled nuclear fusion. One of the problems is how to overcome the strong electrical repulsion between atomic nuclei, which requires high energies to make them fuse. But fusion could be initiated at lower energies with electromagnetic fields that are generated, for example, by state-of-the-art free electron lasers emitting X-ray light. Researchers at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) describe how this could be done in the journal Physical Review C.

During nuclear fusion, two atomic nuclei fuse into one new nucleus. In the lab this can be done with particle accelerators, as when researchers use fusion reactions to create fast free neutrons for other experiments. On a much larger scale, the idea is to implement controlled fusion of light nuclei to generate power - with the sun acting as the model: its energy is the product of a series of fusion reactions that take place in its interior.

For many years, scientists have been working on strategies for generating power from fusion energy. "On the one hand we are looking at a practically limitless source of power. On the other hand, there are all the many technological hurdles that we want to help surmount through our work," says Professor Ralf Schützhold, Director of the Department of Theoretical Physics at HZDR, describing the motivation for his research.

Tunneling at a high rate, soon to be within reach

In order to trigger nuclear fusion, you first have to overcome the strong electrical repulsion between the identically charged atomic nuclei. This usually requires high energies. But there is a different way, explains the co-author of the study, Dr. Friedemann Queißer: "If there isn't enough energy available, fusion can be achieved by tunneling. That's a quantum mechanical process. It means that you can pass (i.e., tunnel) through the energy barrier caused by nuclear repulsion at lower energies."

This is not some theoretical construct; it really happens: The temperature and pressure conditions in the sun's core do not suffice to overcome the energy barrier directly and enable hydrogen nuclei to fuse. But fusion happens nonetheless because the prevailing conditions allow the fusion reaction to be sustained thanks to a sufficiently high number of tunneling processes.

In their current work, the HZDR scientists are investigating whether controlled fusion could be facilitated with the assistance of tunneling processes using radiation. But that is also a question of energy: the lower the energy, the lower the likelihood of tunneling. Up to now, conventional laser radiation intensity has been too low to trigger the processes.
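For orientation, the textbook Gamow factor captures this energy dependence: the probability of tunneling through the Coulomb barrier falls off exponentially as the collision energy decreases. (This is a standard result quoted here for context; the authors' field-assisted calculation goes beyond it.)

```latex
% Gamow tunneling factor for two nuclei with charges Z_1, Z_2 and
% reduced mass \mu colliding at energy E:
P(E) \approx e^{-2\pi\eta(E)},
\qquad
\eta(E) = \frac{Z_1 Z_2 e^2}{4\pi\varepsilon_0\,\hbar}\,\sqrt{\frac{\mu}{2E}}
```

A superimposed, rapidly oscillating field that effectively thins the barrier raises P(E) at a fixed collision energy, which is the kind of enhancement the Dresden team investigates.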

XFEL and electron beams to assist fusion reactions

This could all change in the near future: With X-ray free electron lasers (XFEL) it is already possible to achieve power densities of 10^20 watts per square centimeter. This is the equivalent of approximately a thousand times the energy of the sun hitting the earth, concentrated on the surface of a one-cent coin. "We are now advancing into areas that suggest the possibility of assisting these tunneling processes with strong X-ray lasers," says Schützhold.

The idea is that the strong electric field causing the repulsion between the nuclei is superimposed with a weaker, but rapidly changing, electromagnetic field that can be produced with the aid of an XFEL. The Dresden researchers investigated the process theoretically for the fusion of the hydrogen isotopes deuterium and tritium. This reaction is currently considered to be one of the most promising candidates for future fusion power plants. The results show that it should be possible to increase the tunneling rate in this way; a sufficiently high number of tunneling processes could eventually facilitate a successful, controlled fusion reaction.

Today, just a handful of laser systems around the world with the requisite potential are the flagships of large-scale research facilities, like those in Japan and the United States - and in Germany, where the world's strongest laser of its type, the European XFEL, is located in the Hamburg area. At the Helmholtz International Beamline for Extreme Fields (HIBEF) located there, experiments with unique ultra-short and extremely bright X-ray flashes are planned. HZDR is currently in the process of constructing HIBEF.

The Dresden strong field physicists' next step is to dive even deeper into the theory in order to understand other fusion reactions better and be able to assess their potential for assisting tunneling processes with radiation. Analogous processes have already been observed in laboratory systems, such as quantum dots in solid-state physics or Bose-Einstein condensates, but in nuclear fusion experimental proof is still pending. Thinking yet further ahead, the authors of the study believe other radiation sources could possibly assist tunneling processes. The first theoretical results on electron beams have already been obtained.

Credit: 
Helmholtz-Zentrum Dresden-Rossendorf

Modern technology and old-fashioned legwork solve science mystery

image: New research may have put to rest a century-old question on the behavior of the single-cell organism S. roeseli, shown here (a) resting, (b) bending, (c) contracting and (d) detaching in response to an irritant.

Image: 
Joseph Dexter and Sudhakaran Prabakaran/Current Biology.

HANOVER, N.H. - December 5, 2019 - A life of avoidance, detachment and relocation might not be suitable for all, but for the single-cell eukaryote Stentor roeseli, confirmation of this idiosyncratic behavior pattern has been a long time coming.

In a study appearing in Current Biology, researchers at Dartmouth College and Harvard Medical School hope to put to rest a century-old scientific debate by demonstrating that the low-level organism S. roeseli is capable of decision making. They also offer the video evidence to prove it.

In 1906, American biologist Herbert Spencer Jennings reported that Stentor roeseli exhibited complex behavior. In response to an irritating stimulus, Jennings said that S. roeseli engaged in four distinct behaviors—bending, ciliary alteration, contraction and detachment.

The news that the organism, which lacks a central nervous system, possessed sophisticated sensing and response mechanisms sent waves through the scientific community. The findings also played a key role in early scientific debates about animal behavior.

Over a half-century later, the Jennings research was debunked by a 1967 experiment that failed to replicate Jennings' results. That study was accepted by the scientific community even though it used a different species of organism.

Now, the Dartmouth-Harvard Medical School team has confirmed Jennings' original finding.

Through a series of analyses conducted in part at Dartmouth's Neukom Institute for Computational Science on a project that began at Harvard close to a decade earlier, researchers observed the same avoidance behavior that Jennings noted over one hundred years ago.

"Our results provide strong evidence that Jennings' original observations about Stentor behavior were correct, which should help to resolve the long-standing confusion," said Joseph Dexter a fellow at Dartmouth's Neukom Institute for Computational Science and a lead author on the study. "We now have a transparent dataset, and we invite researchers to view the full set of videos to learn more about the complexities of how S. roeseli responds to stimulation."

Stentor roeseli is a colorless, trumpet-shaped protozoan that is visible to the naked eye and resembles the horn of a gramophone.

To reconstruct Jennings’ experiment, the team first had to acquire the specific species of organism used in the early 1900s. After an effort that included wading through ponds in southeastern Massachusetts, the team obtained a sample from a golf course in Manchester, England through local supplier Sciento.

The researchers then developed a platform for manipulating the organism that allowed them to target the delivery of an irritant. They settled on using polystyrene beads to stimulate reactions from the organism in the test. This was a departure from the powder used in the original experiment, but it led to an observable response that is thought to be part of a generalized avoidance strategy in S. roeseli.

As the beads were fed through a microinjection needle using a gravity-based system, the researchers worked to keep the microscope image in focus while they observed and recorded the experiment.

In the video, the researchers demonstrate how S. roeseli avoids the irritant by bending away or changing the beat of its hair-like cilia to keep from ingesting it. In response to the irritation, the organism might also contract into a protective ball, or detach from the piece of algae it is anchored to and swim to a new site.

After years of field work, video microscopy, micromanipulation and quantitative analysis, the researchers finally had the evidence that they needed to confirm Jennings' finding that the single-cell organism is capable of complex avoidance behavior.

"The results are the culmination of a long, highly-collaborative process. It was quite satisfying to work on a problem with such an interesting history and to confront some unusual challenges along the way," said Dexter.

"Our findings show that single cells can be much more sophisticated than we generally give them credit for," said senior researcher Jeremy Gunawardena, associate professor of systems biology in the Blavatnik Institute at Harvard Medical School. "They have to be 'clever' at figuring out what to avoid, where to eat and all the other things that organisms have to do to live. I think it's clear that they can have complex ways of doing so."

In addition to demonstrating how the organism responds to stimulus, the research team also confirmed Jennings' finding that S. roeseli uses a hierarchy of behaviors. While the team found few instances of the organism following the full hierarchy, they observed many partial instances with varying orders of occurrence, ultimately concluding that the behavior hierarchy exists.

According to the paper, the team considers the behavior hierarchy a form of "sequential decision making in the sense that when given similar stimulation repeatedly, the organism 'changes its mind' about which response to give, thereby following the observed hierarchy."

By generating a much larger and richer dataset than the early 1900s experiment, the team also demonstrates that the organism's decision making is distinct from habituation or classical conditioning. The team notes that the organism's choice between contraction and detachment resembled a fair coin toss, with each behavior roughly equally likely.
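For readers curious how such a fair-coin comparison is typically checked, here is a minimal sketch using SciPy's exact binomial test. The counts are invented for illustration; this is not the paper's data or analysis.

```python
# Minimal sketch of testing whether contraction vs. detachment is
# consistent with a fair coin. The counts below are hypothetical.
from scipy.stats import binomtest

contractions, detachments = 14, 11  # hypothetical observations
result = binomtest(contractions, n=contractions + detachments, p=0.5)
print(f"p-value vs. fair coin: {result.pvalue:.3f}")
# A large p-value means the data are consistent with a 50/50 choice.
```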

Credit: 
Dartmouth College

Dramatic rise in patients 'cured' of heart condition after GP performance pay scheme

The introduction of a performance-related financial incentive scheme for GPs led to a dramatic, almost five-fold rise in the number of patients whose heart rhythm condition was said to have been 'cured', say University of Birmingham researchers.

Academics at the University of Birmingham's Institute of Applied Health Research, supported by NIHR ARC West Midlands, conducted a study into patients with the most common heart rhythm condition, called atrial fibrillation. It mainly affects older people, with around 1.4 million sufferers in the UK, and it greatly increases the risk of stroke. To avoid strokes, it is important patients take anticoagulant drugs to prevent blood clotting. A previous study by the University of Birmingham showed that even after atrial fibrillation is recorded as 'cured', patients remain at high risk of stroke and should continue taking anticoagulants.

In this new study, published in BMJ Open, researchers analysed GP records dating between 2000 and 2016 of 250,788 patients with atrial fibrillation, of whom 14,757 were recorded as having cured, or 'resolved', atrial fibrillation.

The researchers found that prior to 2005, resolved cases of atrial fibrillation were uncommon, but they became almost five times more frequent after management of the condition was introduced into GP performance targets in 2006 (rising from 5.7 per 1,000 person-years in 2005, to 26.3 per 1,000 person-years in 2006). The rate has remained high ever since and increased again when further changes were made to the incentive scheme in 2012.
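The "almost five times" figure follows directly from those rates, as a quick arithmetic check shows:

```python
# Quick check of the reported rise in recorded 'resolved' atrial fibrillation.
rate_2005 = 5.7    # cases per 1,000 person-years in 2005
rate_2006 = 26.3   # cases per 1,000 person-years in 2006
print(f"fold increase: {rate_2006 / rate_2005:.1f}x")  # -> 4.6x, almost five-fold
```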

The targets are part of the Quality and Outcomes Framework (QOF), a voluntary scheme within the General Medical Services contract aimed at improving the quality of clinical care. QOF sees GPs keeping a register of patients with chronic disease. They are paid an incentive to ensure a specific percentage of atrial fibrillation patients receive drugs for stroke prevention. However, patients with 'resolved' atrial fibrillation are excluded from this register.

Research Fellow Dr Nicola Adderley, of the University of Birmingham, said: "It is possible that this increase was, in part, the result of GP practices catching up with recording 'resolved' atrial fibrillation following the introduction of QOF.

"However, we know from our previous research that the vast majority of patients with resolved atrial fibrillation do not receive stroke prevention drugs.

"Our latest study gives evidence that, since the introduction of these performance targets, patients with atrial fibrillation are deemed to be cured simply because they aren't being prescribed anticoagulants."

Tom Marshall, Professor of Public Health and Primary Care, of the University of Birmingham, said: "In this new study we also found that, since 2006, the increase in resolved cases of atrial fibrillation was most striking in the months of January to March, just before the date in April when the performance targets are measured.

"Prior to 2006, resolved cases of atrial fibrillation were recorded throughout the year with little monthly variation in incidence.

"The effect of this is that GPs' performance appears improved, which can be equated to getting troublesome children to stay at home on the day of a school inspection."

Senior Clinical Lecturer Dr Krish Nirantharakumar, of the University of Birmingham, said: "The problem is that if a patient is reported as cured they are removed from the register and therefore fall out of the system, which stops GPs keeping an eye on people with ongoing atrial fibrillation.

"Our previous research has shown that atrial fibrillation cannot ever safely be considered as resolved - it is a condition that can be present one day and absent the next and giving someone the all-clear can be a mistake.

"It can come back without a patient or their doctor realising, which means they continue to be at high risk of stroke and should still be prescribed clot-preventing drugs.

"Across the UK there are around 100,000 people with atrial fibrillation which is reported as resolved, of which around 2,000 will suffer strokes each year. They are largely untreated, yet treatment of these patients would prevent about 1,000 strokes per year."

Professor Marshall concluded: "We hope this research will lead to changes to the QOF rules so these patients don't come off the register if they are labelled 'resolved'."

Those with atrial fibrillation may be aware of noticeable heart palpitations, where their heart feels like it's pounding, fluttering or beating irregularly. Sometimes atrial fibrillation does not cause any symptoms and a person who has it is completely unaware that their heart rate is irregular.

Credit: 
University of Birmingham

Older adults who 'train' for a major operation spend less time in the hospital

CHICAGO (December 5, 2019): Older adults who "train" for a major operation by exercising, eating a healthy diet, and practicing stress reduction techniques preoperatively have shorter hospital stays and are more likely to return to their own homes afterward rather than another facility, compared with similar patients who do not participate in preoperative rehabilitation, according to research findings. The new study, which appears as an "article in press" on the Journal of the American College of Surgeons website in advance of print, evaluated a home-based program of preoperative rehabilitation--called prehabilitation--for Michigan Medicare beneficiaries.

The researchers also reported an association between prehabilitation and lower total insurance payments for all phases of care.

"Prehabilitation is good for patients, providers, and payers," said study coauthor Michael J. Englesbe, MD, FACS, a liver transplant surgeon at the University of Michigan, Ann Arbor. "We believe every patient should train for a major operation. It's like running a 5K race: You have to prepare."

Involving physical and lifestyle changes, prehabilitation, or "prehab," optimizes a patient's well-being and ability to withstand the stress of undergoing an operation, Dr. Englesbe said. Past studies show that prehabilitation lowers the rate of postoperative complications and speeds the patient's return to their normal functioning, among other advantages.*

"Prehab has been gaining momentum over the past 10 years. More surgeons and other clinicians are appreciating its benefits," Dr. Englesbe said. "However, the feasibility and value of broad implementation of prehabilitation outside the research environment were unknown."

For this new study, researchers tested the real-world effectiveness and cost savings of prehabilitation. Patients underwent diverse cardiothoracic (chest/heart) and abdominal operations at 21 hospitals in Michigan that participated in a statewide prehabilitation program called the Michigan Surgical & Health Optimization Program (MSHOP). Patients' surgeons referred them to MSHOP if they were at high risk of postoperative complications, said Dr. Englesbe, the program's co-developer and director.

Physical and psychological preparation

MSHOP involved a home-based walking program in which surgical patients tracked their steps using a pedometer and received daily reminders and feedback through phone, email, or text messages. Program participants received educational materials on nutrition, relaxation techniques, and smoking cessation as well. They also practiced using an incentive spirometer, a medical device that helps patients keep their lungs healthy after an operation.

Included in the study were 523 Medicare patients who participated in MSHOP for at least one week before an inpatient operation and filed Medicare claims between 2014 and 2017, according to the article. For comparison, the researchers used Medicare claims data during the same period to identify 1,046 matched controls: patients with similar demographic characteristics and coexisting illnesses who had the same operation at the same hospital but did not take part in prehabilitation. The average age of patients and controls was 70 years.

Participation in MSHOP ranged from 11 to 33 days, the researchers reported. Of the participants, 62 percent were reportedly "engaged" in the prehabilitation program, defined as recording step counts three or more times per week for most of the program. Thirty-nine patients (7.5 percent) asked to be removed from the program, but they remained in the statistical data analysis. For both groups, the study authors analyzed data for the hospitalization and 90 days afterward.

Prehabilitation leads to better outcomes

Participation in prehabilitation was significantly associated with several improved outcomes that are important to patients or insurers, according to the researchers:

- The hospital length of stay was shorter by one day, with a median (middle value) of six days for participating patients versus seven days for controls, who received no prehabilitation.
- Program participants were more likely to be discharged from the hospital to home: 65.6 percent versus 57 percent of controls.
- Medicare paid nearly $3,200 less in total payments for both hospital and posthospital care (what Medicare calls an "episode of care") for patients who underwent prehabilitation than for controls: $31,641 versus $34,837.
- Payments for posthospital care were especially lower, including skilled nursing facility ($941 versus $1,566 for controls) and home health care ($829 versus $960 for controls).

"Every patient scheduled for a major operation--not just those at high risk--should ask their surgeon for a prehabilitation program," Dr. Englesbe recommended.

Although the study did not evaluate patient satisfaction with prehabilitation, Dr. Englesbe said patients at his medical center who completed MSHOP described their surgical experience positively, using words such as "empowering." Some patients requested MSHOP when they required another operation, he said.

Dr. Englesbe said he hopes that prehabilitation will become the standard of surgical care in Michigan.

Nationwide, prehabilitation is an area of focus for the American College of Surgeons' Strong for Surgery program, which promotes evidence-based practices to boost preoperative health. Prehab also is part of the College's new Geriatric Surgery Verification Program standards, developed to optimize surgical care for older adults, and now enrolling hospitals nationally.

Credit: 
American College of Surgeons

The Lancet Public Health: One in two people who are homeless may have experienced a traumatic brain injury in their lifetime

This systematic review and first meta-analysis of the prevalence of traumatic brain injury (TBI) in people who are homeless or in unstable housing situations -- covering 38 studies published between 1995 and 2018, and itself published in The Lancet Public Health journal -- suggests that homeless people experience a disproportionately high lifetime prevalence of TBI.

The authors call for healthcare workers to have increased awareness of the burden and associated effects of TBI in people who are homeless, noting that more comprehensive assessments of their health - including checking for history of TBI - may help improve their care.

Lead author Jacob Stubbs, University of British Columbia, Canada, says: "Traumatic brain injury may be an important factor in the complex health challenges faced by this population. Our work emphasizes that health care professionals and frontline workers should be aware of the burden of TBI in this population, and how it relates to health and functioning." [1]

In the USA and EU, more than six million people experience homelessness each year, and homeless people are known to experience poorer mental and physical health than the general population [2]. While often preventable, TBI is a pervasive and under-recognised public health problem linked with subsequent development of neurological and psychiatric disorders.

The authors looked at existing studies from six high-income countries - Australia, Canada, Japan, South Korea, the UK, and the USA - which included people of any age who were either homeless, in unstable housing situations, or seeking services for homeless people. They examined the number of new cases and existing cases of TBI, and the association between TBI and health or functioning outcomes.

In their systematic review, they included 38 studies and reviewed the association between TBI and health or functioning outcomes in this population (e.g. mental health and health service use). They then conducted a meta-analysis (including 22 studies with data on how many participants had a history of TBI) to estimate the lifetime prevalence of all TBIs and of moderate or severe TBIs.

The results suggest that one in two homeless people experience a TBI (53%), and almost one in four experience a TBI that is moderate or severe (23%).

Their findings also suggested that TBI is consistently associated with poorer self-reported physical and mental health, suicidality and suicide risk, memory concerns, increased health service use and higher criminal justice system involvement [3]. They note that these links will need to be confirmed in further research, including prospective study designs to establish incidence rates more accurately, and that public health research and practice should focus on TBI prevention and on more thoroughly understanding the consequences of TBI in this vulnerable population. The possibility of TBI creating risk for age-related cognitive disorders in this group is largely unexplored.

Jehannine Austin, British Columbia Mental Health and Substance Use Services Research Institute, Canada, says: "The relationship between homelessness and TBI could function both ways - TBI could increase the risk of homelessness, and homelessness could increase the risk of TBI. We need a better understanding of this relationship to address the issue, and to improve outcomes in the homeless and marginally housed population." [1]

The authors speculate that, based on comparing their estimates to studies of the general population, the lifetime prevalence of TBI in people who are homeless and in unstable housing situations could potentially be between 2.5 times and 4 times higher than in the general population. Furthermore, they suggest that the lifetime prevalence of moderate or severe TBI in this population could be nearly ten times higher than estimates in the general population. [4,5]

The authors note that their findings suggest the provision of stable housing might lower the risk for TBI, and that more research is needed to study how stable housing benefits people who are homeless or in unstable housing situations.

The authors note some limitations of their study. Their prevalence estimates are limited by the quality of the studies included and differences in study design. For example, some studies used different data collection methods and varying definitions of TBI. The authors believe that these limitations may mean that their estimate (53%) is actually an underestimate of the burden of TBI in this population. They were also unable to determine the directionality of the relationship between TBI and homelessness because most studies included were retrospective.

In a linked Comment, Jesse T Young, a research fellow at the University of Melbourne, Australia, says: "It is becoming clear that TBI can be both a cause and a consequence of homelessness. The functional and socioeconomic consequences associated with TBI can present challenges to finding and retaining housing. [...] Given the increasing evidence for a potential causal relationship, a randomised controlled trial investigating the effect of housing intervention on TBI is both feasible and warranted."

Credit: 
The Lancet

Insilico publishes a review of deep aging clocks and announces the issuance of key patent

image: Insilico Medicine is granted a patent on a deep transcriptomic aging clock.

Image: 
Insilico Medicine

December 5, 2019 - Insilico Medicine today announced the publication of a paper titled "Deep biomarkers of aging and longevity: from research to applications" in Aging and the issuance of a key patent, "Deep transcriptomic markers of human biological aging and methods of determining a biological aging clock" (US20190034581). Insilico Medicine utilizes next-generation computational approaches to accelerate three areas of drug discovery and development: disease target identification, generation of novel molecules, and prediction of clinical trial outcomes. Age is a universal feature of every living being, and deep neural networks can be trained on multiple data types to predict age, or to use age as an input when generating synthetic data. This enables novel methods for target identification, data quality control, and generation of synthetic biological data.
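At its core, an aging clock of this kind is a regression model that maps a molecular profile to chronological age. The sketch below illustrates the idea with random stand-in data and scikit-learn's MLPRegressor; Insilico's patented models are deep networks trained on real transcriptomic profiles, so treat this only as a conceptual outline.

```python
# Minimal sketch of an 'aging clock': regress chronological age on
# gene-expression features. Data here are random stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_genes = 500, 200
X = rng.normal(size=(n_samples, n_genes))  # stand-in expression matrix
# Simulated ages that depend on a few 'genes' plus noise
age = 40 + X[:, :10].sum(axis=1) * 2 + rng.normal(scale=3, size=n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, age, random_state=0)
clock = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000,
                     random_state=0).fit(X_tr, y_tr)
print(f"held-out R^2: {clock.score(X_te, y_te):.2f}")
```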

"The fields of artificial intelligence, drug discovery, and aging research are rapidly converging. We are using the deep neural networks trained on age for a variety of applications such as target identification or patient stratification geared to accelerate pharmaceutical R&D. We are very happy to see the first patent on the deep aging clocks granted. At Insilico we filed for patents for a broad range of inventions in generative chemistry and in generative biology," said Alex Zhavoronkov, PhD.

Credit: 
Insilico Medicine

How flowers adapt to their pollinators

image: This is a flower of the bee-pollinated species Meriania hernandoi from the Ecuadorian cloud forest.

Image: 
Agnes Dellinger

Flowering plants are characterized by an astonishing diversity of flowers of different shapes and sizes. This diversity has arisen in adaptation to selection imposed by different pollinators, including, among others, bees, flies, butterflies, hummingbirds, bats or rodents. Although several studies have documented that pollinators can impose strong selection pressures on flowers, our understanding of how flowers diversify remains fragmentary. For example, does the entire flower adapt to a pollinator, or do only some flower parts evolve to fit a pollinator while other flower parts may remain unchanged?

In a recent study, scientists led by Agnes Dellinger of the Department of Botany and Biodiversity Research at the University of Vienna investigated flowers of 30 species of a tropical plant group (Merianieae) from the Andes. "Each of these plant species has adapted to pollination by either bees, birds, bats or rodents", says Dellinger. Using high-resolution X-ray computed tomography, the research team produced 3D models of these flowers and used geometric-morphometric methods to analyse differences in flower shape among species with different pollinators.
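The core step in geometric morphometrics is superimposing landmark configurations so that only shape differences remain. Below is a minimal sketch of that step using SciPy's Procrustes routine on invented 2D landmarks; the study worked with 3D flower models and richer analyses, so this is only an illustration of the principle.

```python
# Minimal sketch of the core geometric-morphometric step: Procrustes
# superimposition of two landmark configurations. Landmarks here are
# made up and 2D for brevity.
import numpy as np
from scipy.spatial import procrustes

flower_a = np.array([[0, 0], [1, 0], [1, 2], [0, 2], [0.5, 3.0]], float)
flower_b = np.array([[0, 0], [1, 0], [1, 2], [0, 2], [0.5, 2.2]], float)

# Removes translation, scaling and rotation, then reports the residual
# shape difference (disparity): 0 means identical shapes.
_, _, disparity = procrustes(flower_a, flower_b)
print(f"shape disparity: {disparity:.4f}")
```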

The researchers could show that flower shapes have evolved in adaptation to the distinct pollinators, but that flower shape evolution was not homogeneous across the flower. In particular, the showy sterile organs of flowers (petals) adapted to the different pollinators more quickly than the rest of the flower: the reproductive organs have evolved more slowly. "This study is among the first to analyse the entire 3-dimensional flower shape, and it will be exciting to see whether similar evolutionary floral modularity exists in other plant groups", concludes Dellinger.

Credit: 
University of Vienna

Whales may owe their efficient digestion to millions of tiny microbes

image: A bowhead whale breaches the surface of the cold waters near Point Barrow, Alaska.

Image: 
Photo by Kate Stafford, University of Washington

A study by researchers at Woods Hole Oceanographic Institution (WHOI) shows that the microbial communities inside whales may play an important role in the digestion of one of the ocean's most abundant carbon-rich lipids, known as a wax ester. Their findings were published Dec. 2 in the Journal of the International Society for Microbial Ecology.

Wax esters are one of the most difficult fats for many animals, including humans, to digest. They are especially abundant in tiny crustaceans, such as krill and copepods--a favorite prey of filter-feeding bowhead and endangered North Atlantic right whales. Wax esters are also an important lipid in our oceans world-wide, at times storing at least half of the carbon produced by plant-like marine organisms, according to previous studies.

"We found more than 80 percent of the lipids eaten by bowhead whales are wax esters, but less than 30 percent remain in the large intestine," says WHOI marine scientist Carolyn Miller, lead author of the study. As a result, bowhead whales, among other baleen whale family members, are highly efficient at digesting these lipids. "This is important not only for understanding the cycling of these important lipids in the oceans, but also because whales consume enormous quantities of wax ester-rich prey to support many aspects of their health and reproduction."

The question is: if wax esters are so difficult for other animals to digest, how are bowhead and other baleen whales able to digest them so efficiently? Part of the answer could be the millions of tiny microbes, including bacteria, living in their digestive tract. These microbial communities are commonly referred to as the 'gut microbiota.' In humans and other terrestrial animals, gut microbes play important roles in many aspects of health, including digestion, where they often have the ability to break down otherwise indigestible components of the diet.

Miller and her colleagues wanted to study whether or not the bowhead whale's gut microbes were in fact playing a role in the digestion of the wax esters. To do this, they would first need fresh samples.

But the window to extract samples is a fast-closing one, as decomposition can taint the contents of the gut. "Getting fresh samples from the insides of whales is really rare," says Miller.

Thanks to the generosity of Alaskan Native whaling captains of the Barrow Whaling Captains Association, who are permitted to take a small number of whales each year for their subsistence, Miller and her colleagues were given an opportunity to extract samples from freshly-harvested bowhead whales. Their combined efforts yielded over one hundred samples from 38 bowhead whales over four years.

In the lab, Miller and her colleagues analyzed these samples from nine locations along the gastrointestinal tract, hoping to detect changes in the microbial communities and lipids throughout the gut of each whale. What they found was a strong connection between the bacterial community and a decrease in the presence of wax esters in the lower part of the small intestine.

"Microorganisms play important roles in the digestive processes of mammals, as well as contributing to immune functioning," says Amy Apprill, a WHOI microbial ecologist and a coauthor of the study. "This study suggests that the gut bacteria may have a similarly critical role within whales, possibly providing them the assistance they need to break down these fatty prey compounds."

Miller, Apprill and their colleagues aim to build on this study's findings, hoping to determine how much of the digestion is due to the whale itself versus the microbial communities inhabiting its intestines. Ultimately, this may shed light on how these whales and their microbes digest the primary source of energy from their prey to sustain themselves. The results of this research may benefit the Alaskan Native whaling community as well by illuminating how the nutritional resources in the waters off Point Barrow, Alaska support the local bowhead whales they so rely upon.

"There have been decades of research focused on carbon cycling in the ocean, but how these compounds are being broken down, transformed and utilized to create the substantial biomass of a whale has remained a bit of a black box," Apprill adds. "This study is providing a unique glimpse into a previously hidden part of the marine food web."

Credit: 
Woods Hole Oceanographic Institution

Chronic disease prevention could ease opioid crisis

Preventing chronic disease could help curb the opioid epidemic, according to research from the University of Georgia.

The study is the first to examine the relationship between hospitalizations due to opioid misuse and chronic disease.

"When we look at the opioid crisis, most of the response has been to treat opioid overdose, making naloxone more available, for example. That's a good immediate intervention, but in the long run, we need to identify the underlying issues of the epidemic," said study author Janani Thapa, who studies chronic disease at UGA's College of Public Health.

When most health professionals talk about the risks of living with a chronic disease, opioid addiction doesn't make the list, but to Thapa, the association is obvious.

"Chronic disease is associated with pain. Pain is associated with opioid use," she said. "So, we thought, let's look at that and put some numbers behind the association."

One in four U.S. adults is living with at least one chronic disease, and many of these diseases are accompanied by chronic pain. Arthritis is one common example. Obesity is another.

That's why Thapa and her co-authors were particularly interested in the patterns of opioid-related hospitalizations among patients with conditions that were the most likely to be prescribed opioids, including asthma, arthritis, cancer, liver disease and stroke.

The researchers gathered inpatient data from a national sample of community hospitals, and they looked at the prevalence of chronic disease among patients who had been admitted for an opioid-related injury, from 2011 to 2015.

The results showed that over 90% of the opioid-related hospitalizations were among patients with two or more chronic diseases.

Thapa says the public health and health care fields need to be aware of the overlap between two of the country's growing epidemics and prioritize finding alternative, non-addictive strategies for managing chronic pain.

"These aren't separate issues," said Thapa. "The numbers that we have make the case that hospitalization is happening because these patients are taking pain medications, and chronic disease is underlying many of these cases."

Thapa would like to see this study begin a conversation about allocating more of the resources pouring in to curb the opioid crisis toward chronic disease prevention.

"We are missing a key component, I think, if we aren't talking about preventing opioid misuse through chronic disease prevention," she said.

Credit: 
University of Georgia

What does DNA's repair shop look like? New research identifies the tools

image: Human nucleus with fluorescently labeled genome (green) and sites of induced double-stranded DNA breaks (red).

Image: 
Jonah A. Eaton and Alexandra Zidovska, Department of Physics, New York University.

A team of scientists has identified how damaged DNA molecules are repaired inside the human genome, a discovery that offers new insights into how the body works to ensure its health and how it responds to diseases that stem from impaired DNA.

"Our findings show that a DNA repair process is very robust, engineered through intricate structural and dynamical signatures where breaks occur," explains Alexandra Zidovska, an assistant professor in New York University's Department of Physics and the senior author of the study, which appears in Biophysical Journal. "This knowledge can help us understand human genome in both healthy and disease-stricken states--and potentially offer a pathway for enhancing cancer diagnosis and therapy."

The human genome consists, remarkably, of two meters of DNA molecules packed inside a cell nucleus that is ten micrometers in diameter. The proper packing of the genome is critical for its healthy biological function, such as gene expression, genome duplication, and DNA repair. However, both the genome's structure and function are highly sensitive to DNA damage, which can range from a chemical change to the DNA molecule to a full break of DNA's well-known double helix.

Yet, such harm is not unusual. In fact, the human genome experiences about a thousand DNA damage events every day, which can occur naturally or due to external factors such as chemicals or UV radiation (e.g. sunlight).

But, if left unrepaired, DNA damage can have devastating consequences for the cell, note Zidovska and study co-author Jonah Eaton, an NYU doctoral student. For example, an unrepaired DNA double-strand break can lead to cell death or cancer; therefore, DNA repair mechanisms are crucial for cellular health and survival.

In the Biophysical Journal study, the scientists examined the formation, behavior, and motion of DNA double strand breaks in the human genome. To do so, they chemically induced DNA breaks inside of live human cells, tracing these breaks using fluorescent markers. With the help of microscopy and machine learning technologies, they followed the motion of the DNA breaks and genome in both space and time.
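A standard way to quantify the motion of a tracked spot is the mean squared displacement (MSD), which measures how far a particle typically wanders as a function of time lag. The sketch below illustrates the calculation on a simulated random walk; it is an assumption for illustration, not the authors' analysis code.

```python
# Minimal sketch of a mean squared displacement (MSD) calculation, a
# standard mobility measure for a tracked particle. The trajectory
# below is a simulated random walk, not real DNA-break tracking data.
import numpy as np

rng = np.random.default_rng(1)
traj = np.cumsum(rng.normal(scale=0.05, size=(1000, 2)), axis=0)  # x, y positions

def msd(traj, max_lag=100):
    """MSD(lag) averaged over all start times, for lags 1..max_lag frames."""
    return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                     for lag in range(1, max_lag + 1)])

curve = msd(traj)
print(f"MSD at lag 1: {curve[0]:.4f}, at lag 100: {curve[-1]:.4f}")
```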

In their analysis of more than 20,000 DNA breaks in over 600 cells, the researchers found that the genome tends to change the nature of its packing around a DNA break.

"This new packing state for DNA undergoing repair, is unlike any packing state found for a healthy DNA," says Zidovska. "Around a DNA break we find that DNA becomes denser than its surrounding DNA, forming a small island of highly condensed DNA immersed in the sea of uncondensed DNA."

In addition, they observed that DNA breaks also move in a unique way--different from that of undamaged parts of the genome. The islands of condensed DNA around a DNA break move faster than undamaged DNA does, yet the mobility of such islands strongly depends on their size.

"The human genome is constantly exposed to events that damage the DNA molecule," notes Zidovska. "The cell has robust repair processes that work to eliminate such damage before it can give cause to cancer or other diseases. Now we have a better understanding of how a DNA break behaves differently than the undamaged DNA, which allows us not only to gain insight into these repair processes but also inform our efforts in cancer diagnosis and therapy."

Credit: 
New York University

New tool to detect blackleg disease in potato has widespread application

image: Cover for the November 2019 issue of Plant Disease.

Image: 
The American Phytopathological Society

Potatoes are important. They rank fourth among the world's staple crops. In the United States, they are grown commercially in 30 states and valued at $4 billion annually. Potatoes are also susceptible to 160 different fungal, bacterial, and viral diseases, such as blackleg and soft rot diseases, which are caused by the bacterium Dickeya dianthicola.

In 2015, an aggressive outbreak of this bacterium led to losses of greater than $40 million. In part, this outbreak occurred because of the inability to detect this specific bacterium when seed potatoes were being screened for pathogens using DNA-based tests.

To prevent an outbreak of this magnitude from occurring again, scientists developed a user-friendly online tool called Uniqprimer, which quickly and automatically designs species-specific DNA tags (also known as primers) for detecting pathogens using DNA testing.
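The core idea behind designing species-specific primers can be sketched in a few lines: find subsequences present in the target genome but absent from its close relatives. The toy example below uses invented sequences and ignores the melting-temperature, pairing, and placement constraints a real pipeline such as Uniqprimer must also satisfy.

```python
# Toy sketch of the idea behind species-specific primer design: find
# k-mers unique to the target genome. Sequences are invented; a real
# pipeline adds melting-temperature, pairing and other checks.
def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

target = "ATGCGTACGTTAGCATCGGATCCGTA"       # stand-in for D. dianthicola
relatives = ["ATGCGTACGTTAGCATTGGATCAGTA",  # stand-ins for related species
             "TTGCGTACGTTAGCATCGGTTCCGAA"]

k = 8
candidates = kmers(target, k) - set().union(*(kmers(r, k) for r in relatives))
print(sorted(candidates))  # k-mers found only in the target genome
```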

To test Uniqprimer, the scientists screened 116 field samples for the presence of D. dianthicola and found it in samples from nine different states. The Uniqprimer-designed test detected the bacterium when other, more commonly used tests did not.

While Uniqprimer was developed with blackleg disease and potatoes in mind, it can be applied to any pathogen and in any geographic area. It can also be used by anyone with an internet connection, even those with no background in bioinformatics.

"We hope Uniqprimer and the tests it designed will aid in the accurate detection of D. dianthicola and many other pathogens," said lead author Shaista Karim. "Accurate pathogen detection is the first step for management of a disease, which helps in reducing the losses in the potato industry and informing the farmers in a timely manner to better aid on-farm decision making."

Credit: 
American Phytopathological Society

A biology boost

For many young college students, the first years are a time of wonder and excitement and early steps toward long-term goals. These years, for some students, are equally fraught with anxiety, as the realities of rigorous curricula set in alongside feelings of unpreparedness and impostor syndrome. In the STEM fields, this results in roughly 50% of first-year majors leaving their original course of study.

This is a disheartening statistic, especially given the projected need for a 33% increase in the number of STEM-degree holders nationwide to support the industries of the future. Also disheartening: those who leave their majors are disproportionately first-generation and underrepresented students, contributing to a lack of diversity both in industry and at advanced levels of research.

It doesn't have to be this way, say Mike Wilton and Eduardo Gonzalez-Niño, lecturers in the UC Santa Barbara Department of Molecular, Cellular, and Developmental Biology. In a paper published in the journal CBE-Life Sciences Education, the pair and their colleagues demonstrate that an active learning approach -- the use of tactics such as in-class iClickers, small group discussions and peer-reviewed writing assignments -- early in the biological sciences improves grades in the short-term and increases student retention in biology over the long-term.

"All students that are admitted here to UC Santa Barbara are capable of pursuing the biology major, otherwise the university would not have admitted them," said Wilton, who, with the rest of his team, has been running the BioMentors program for undergraduate biology majors since 2015.

Unfortunately, the researchers said, that sense of belonging is often lost in the mix as students -- many of whom are the first in their family to attend university -- navigate large class sizes, a fast-paced quarter system and a multitude of hurdles they did not expect along the way.

"I also faced that conundrum," Gonzalez-Niño said, thinking of his early years as a biology major. "Sometimes the background that these students have is not ideal -- they came from high schools that didn't prepare them for the rigor that you face in college." In addition to feeling out of place in a completely new environment and not knowing who to turn to for help, he said, these students often perform badly, leading them to lose confidence in their abilities and to reconsider their major.

At UCSB, roughly 1,100 new undergraduates are admitted into the biology major each year; however, about 600 have tended to leave the major after their first two years of study. Wilton and Gonzalez-Niño wanted to see if they could keep undergrads from prematurely leaving by involving them in a more participatory style of learning.

In their three-year research project, cohorts of students participated in an alternative and parallel -- but no less rigorous -- introductory biology course that ran concurrently to the traditional lecture course. In addition to active learning strategies, the intervention course replaces a weekly lecture with a tutorial that focuses on historically difficult course concepts. These discussion-based tutorials are led by Wilton and Gonzalez-Niño with the assistance of upper-division biology students, called BioMentors, who model approaches and strategies for success.

"The thing that we try to instill the most is that we're all in this together," Wilton said.

Following three years' worth of introductory biology students in both versions of the course, the researchers found significant benefits to those in the intervention section. For example, those in the intervention course outperformed their peers in the traditional lecture course by about 12% on common exam questions, and participation in the active learning program resulted in higher final course grades overall.

The students who participated in the active learning program also reported a "significantly higher perception of student belonging when compared with peers in the traditional section," according to the study. Students attributed this to the "higher perceptions of faculty support" and classroom environment -- i.e., how easy it is to share ideas and ask questions in lecture.

"The students in the active learning course were comfortable approaching us," Gonzalez-Niño said. "They can tell that we're on their side."

Taken together, the improved performance and the greater sense of belonging, the researchers said, increased the likelihood that students would remain in the biology major. In fact, students in the intervention course were 10% more likely than their peers in the traditional lecture course to participate in the subsequent introductory biology course offered the following quarter.

A biology degree is broad and diverse enough to enable individuals to work in various fields that will need these STEM graduates. According to the researchers, there is a growing demand for biologists in the biomedical fields, environmental sciences, agriculture and nutrition, and in research that not only broadens our knowledge, but also can translate into innovation.

"A huge area of demand currently that is only projected to grow is healthcare," Wilton noted. "If we can teach these students to navigate through their first gateway biology courses and help them pursue career in healthcare, that will be a productive future for a lot of students who are first-generation."

In addition, Gonzalez-Niño said, assisting underrepresented students in overcoming hurdles to learning biology will boost diversity in the field.

"When you have people with the same background all think about the same questions, often there's a limited number of answers that you can come up with," he said. "But when you have a diverse group of people thinking about the same issues, then the answers to those issues become more creative and diverse."

Credit: 
University of California - Santa Barbara