Body

Experiment reverses the direction of heat flow

Heat flows from hot to cold objects. When a hot and a cold body are in thermal contact, they exchange heat energy until they reach thermal equilibrium, with the hot body cooling down and the cold body warming up. This is a natural phenomenon we experience all the time.

It is explained by the second law of thermodynamics, which states that the total entropy of an isolated system always tends to increase over time until it reaches a maximum. Entropy is a quantitative measure of the disorder in a system. Isolated systems evolve spontaneously toward increasingly disordered states and lack of differentiation.

An experiment conducted by researchers at the Brazilian Center for Research in Physics (CBPF) and the Federal University of ABC (UFABC), as well as collaborators at other institutions in Brazil and elsewhere, has shown that quantum correlations affect the way entropy is distributed among parts in thermal contact, reversing the direction of the so-called "thermodynamic arrow of time".

In other words, heat can flow spontaneously from a cold object to a hot object without the need to invest energy in the process, as is required by a domestic fridge. An article describing the experiment with theoretical considerations has just been published in Nature Communications.

The first author of the article, Kaonan Micadei, completed his PhD under the supervision of Professor Roberto Serra and is now doing postdoctoral research in Germany. Serra, also one of the authors of the article, was supported by São Paulo Research Foundation - FAPESP via Brazil's National Institute of Science and Technology in Quantum Information. FAPESP also awarded two research grants linked to the project to another coauthor, Gabriel Teixeira Landi, a professor at the University of São Paulo's Physics Institute (IF-USP).

"Correlations can be said to represent information shared among different systems. In the macroscopic world described by classical physics, the addition of energy from outside can reverse the flow of heat in a system so that it flows from cold to hot. This is what happens in an ordinary refrigerator, for example," Serra said.

"It's possible to say that in our nanoscopic experiment, the quantum correlations produced an effect analogous to that of added energy. The direction of flow was reversed without violating the second law of thermodynamics. In fact, if we take into account elements of information theory in describing the transfer of heat, we find a generalized form of the second law and demonstrate the role of quantum correlations in the process."

The experiment was performed with a sample of chloroform molecules (a hydrogen atom, a carbon atom and three chlorine atoms) marked with a carbon-13 isotope. The sample was diluted in solution and studied using a nuclear magnetic resonance spectrometer, similar to the MRI scanners used in hospitals but with a much stronger magnetic field.

"We investigated temperature changes in the spins of the nuclei of the hydrogen and carbon atoms. The chlorine atoms had no material role in the experiment. We used radio frequency pulses to place the spin of each nucleus at a different temperature, one cooler, another warmer. The temperature differences were small, on the order of tens of billionths of a kelvin, but we now have techniques that enable us to manipulate and measure quantum systems with extreme precision. In this case, we measured the radio frequency fluctuations produced by the atomic nuclei," Serra said.

The researchers explored two situations: in one, the hydrogen and carbon nuclei began the process uncorrelated, and in the other, they were initially quantum-correlated.

"In the first case, with the nuclei uncorrelated, we observed heat flowing in the usual direction, from hot to cold, until both nuclei were at the same temperature. In the second, with the nuclei initially correlated, we observed heat flowing in the opposite direction, from cold to hot. The effect lasted a few thousandths of a second, until the initial correlation was consumed," Serra explained.

The most noteworthy aspect of this result is that it suggests a process of quantum refrigeration in which the addition of external energy (as is done in refrigerators and air conditioners to cool a specific environment) can be replaced by correlations, i.e., an exchange of information between objects.

Maxwell's demon

The idea that information can be used to reverse the direction of heat flow - in other words, to bring about a local decrease in entropy - arose in classical physics in the mid-nineteenth century, long before information theory was invented.

It was a thought experiment proposed in 1867 by James Clerk Maxwell (1831-1879), who, among other things, formulated the famous classical equations of electromagnetism. In this thought experiment, which sparked a heated controversy at the time, the great Scottish physicist argued that if there were a being capable of knowing the speed of each molecule of a gas and of manipulating all the molecules at the microscopic scale, this being could sort them into two containers, placing faster-than-average molecules in one to create a hot compartment and slower-than-average molecules in the other to create a cold compartment. In this manner, a gas initially in thermal equilibrium, owing to its mixture of faster and slower molecules, would evolve to a differentiated state with less entropy.
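The demon's sorting rule is simple enough to simulate. The toy sketch below (molecule energies drawn from an arbitrary distribution; nothing is physically calibrated) shows the two sorted halves ending up with different mean energies, i.e., different temperatures:

```python
import random

random.seed(1)

# Toy gas: each "molecule" is just a random kinetic energy.
gas = [random.gauss(0, 1) ** 2 for _ in range(10_000)]
avg = sum(gas) / len(gas)

# The demon sorts faster-than-average molecules into one container
# and slower-than-average molecules into the other.
hot = [e for e in gas if e > avg]
cold = [e for e in gas if e <= avg]

def mean(xs):
    return sum(xs) / len(xs)

print(f"hot side:  {mean(hot):.2f}")   # higher mean energy
print(f"cold side: {mean(cold):.2f}")  # lower mean energy
```

Starting from a single well-mixed population, the sorted state has two distinguishable temperatures, which is exactly the local entropy decrease Maxwell had in mind.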

Maxwell intended the thought experiment to prove that the second law of thermodynamics was merely statistical.

"The being he proposed, which was capable of intervening in the material world at the molecular or atomic scale, became known as 'Maxwell's demon'. It was a fiction invented by Maxwell to present his point of view. However, we're now actually able to operate at the atomic or even smaller scales, so that usual expectations are modified," Serra said.

The experiment conducted by Serra and collaborators and described in the article just published is a demonstration of this. It did not reproduce Maxwell's thought experiment, of course, but it produced an analogous result.

"When we talk about information, we're not referring to something intangible. Information requires a physical substrate, a memory. Erasing 1 bit of memory has a minimum energy cost: the Boltzmann constant times the absolute temperature times the natural logarithm of 2. A real device such as a flash drive expends something like 10,000 times that minimum. This lower bound on the energy needed to erase information is known as Landauer's principle, and it explains why erasing information generates heat. Notebook batteries are drained more by this heat than by anything else," Serra said.
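At room temperature the Landauer limit is tiny, as a quick calculation shows (the 300 K temperature and the one-gigabyte example are illustrative choices, not figures from the article):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

# Landauer limit: minimum energy to erase one bit of information
e_min = K_B * T * math.log(2)  # joules per bit

# Erasing one gigabyte (8e9 bits) at the Landauer limit
e_gigabyte = e_min * 8e9

print(f"{e_min:.3e} J per bit")     # on the order of 1e-21 J
print(f"{e_gigabyte:.3e} J per GB") # still only ~1e-11 J
```

Even a whole gigabyte erased at the theoretical minimum costs a vanishingly small amount of energy; the heat from real electronics comes from operating many orders of magnitude above this bound.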

What the researchers observed was that the information present in the quantum correlations can be used to perform work, in this case the transfer of heat from a colder to a hotter object, without consuming external energy.

"We can quantify the correlation of two systems by means of bits. Connections between quantum mechanics and information theory are creating what is known as quantum information science. From the practical standpoint, the effect we studied could one day be used to cool part of a quantum computer's processor," Serra said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

New method reveals how well TB antibiotics reach their targets

image: Scientists visualise the TB antibiotic bedaquiline inside lipid droplets (circles) and TB bacteria (rod shaped structures) inside human macrophages.

Image: 
Daniel Greenwood, Francis Crick Institute.

Scientists have developed a new technique that enables them to visualise how well antibiotics against tuberculosis (TB) reach their pathogenic targets inside human hosts. The findings, published in the journal Science, boost our understanding of how antibiotics work and could help guide the development of new antibiotics, which are much needed in the battle against drug resistance.

The Crick-led team used the technique to observe the TB drug, bedaquiline, in action in human cells infected with the TB-causing bacterium, Mycobacterium tuberculosis (Mtb). They found that bedaquiline accumulates in lipid droplets inside the host cells, forming a reservoir that supplies the drug to Mtb over time. The work was done in collaboration with scientists at GlaxoSmithKline and the University of Western Australia.

"Bedaquiline is the first newly approved antibiotic for TB treatment in 50 years, but until now, we didn't understand how a compound that doesn't dissolve in water could be so effective in treating TB," explains Daniel Greenwood, Crick PhD student and first author of the paper.

"Compounds that dissolve only in lipids, known as lipophilic compounds, are often abandoned during drug development because they tend to bind non-specifically to proteins and fats in our bodies. We wondered how such a drug could ever reach the bacteria. Our surprising findings show that even very lipophilic antibiotics like bedaquiline are worth pursuing in drug development. In the case of TB, this conventionally undesirable trait actually boosts drug delivery."

TB treatment

TB remains one of the world's deadliest infectious diseases, with over a million TB-related deaths worldwide each year.

When a person is infected with Mtb, their immune system tries to clear the pathogen by calling upon specialised immune cells called macrophages that recognise and engulf Mtb. But the bacteria often find ways to survive and reproduce, causing disease. Patients require a minimum of four antibiotics for at least six months to overcome the infection.

It was previously unknown whether antibiotics penetrate all the compartments of the macrophage where Mtb hides and replicates.

The method pioneered in this study, which combines three types of imaging (correlated light, electron and nano-scale ion microscopy), allows researchers to visualise the distribution of TB drugs in Mtb-infected human macrophages at high resolution, for the first time.

A test-case TB drug

Using bedaquiline as a test case, the team infected human macrophages with Mtb and, after two days, treated them with the drug. The imaging results revealed that bedaquiline accumulated in numerous compartments of the cell, most notably inside lipid droplets.

The bacteria can interact with and consume these lipid droplets. However, the team didn't know if bedaquiline would be transferred to the bacteria, or whether the lipid droplets were absorbing the antibiotic and preventing it from reaching the bacteria. Adding a chemical that prevented lipid droplets from forming significantly reduced the amount of bedaquiline in Mtb, suggesting that the lipid droplets are responsible for transferring antibiotic to the bacteria.

"Now that we can see exactly where antibiotics go once they enter macrophages, we can build up a much clearer picture of how they reach their targets, and harness these observations to design more effective treatments in the future, not only for TB but for other infectious diseases too," says Max Gutierrez, Crick group leader and senior author of the paper.

Credit: 
The Francis Crick Institute

Researchers verify 70-year-old theory of turbulence in fluids

image: A turbulence comparison from the very big (a storm on Jupiter) to the incredibly small (quantum turbulence).

Image: 
The University of Queensland

Pilots and air travellers know turbulence can be powerful, but science has struggled to fully explain the phenomenon.

Now, a University of Queensland study has confirmed a 70-year-old theory and is expected to help address "huge problems" in global engineering and transport.

Dr Tyler Neely from the ARC Centre of Excellence for Engineered Quantum Systems (EQUS) said enormous amounts of energy were used daily to transport all sorts of fluids through pipes all over the world.

"Turbulence physics also causes enormous inefficiency for moving vehicles such as ships," Dr Neely said.

"Fluid has characteristic ways of flowing, but it goes into chaotic eddies when it gets out of equilibrium.

"Better understanding fluid turbulence has great potential to make many industry and transport functions cheaper and greener around the globe."

Turbulence has absorbed scientific minds for five centuries, since Leonardo da Vinci coined the term la turbulenza.

To verify Nobel Laureate Lars Onsager's 70-year-old fluid turbulence theory, UQ and Monash University physicists developed techniques to control and measure ultra-cold atom systems.

Dr Neely said Onsager's theory only directly applies to quantum fluids called superfluids.

"The theory says if you add enough energy to a two-dimensional system, turbulence will cause giant vortices to appear," Dr Neely said.

"Vortices are regions in a fluid where the flow revolves around an axis line - a similar phenomenon can be seen in the atmosphere of the planet Jupiter.

"Our study created a superfluid by cooling a gas of rubidium atoms almost to absolute zero.

"We then precisely focussed laser beams to create vortices in the fluid, a technique similar to stirring a cup of tea with a spoon.

"It amazes me that we can do this with light and at such a small scale - the cores of the tiny vortices we created were about one-tenth the size of a human blood cell."

The study, appearing in Science (DOI: 10.1126/science.aat5718), was part of a collaboration with the Centre for Quantum Science at the University of Otago.

Research at Monash further confirmed Onsager's theory of the formation of large-scale vortex structures.

"These studies explore the range of states that Onsager predicted," Dr Neely said.

The discovery will help support the researchers' next scientific foray, which involves understanding few-particle thermodynamics.

"Better experimental and theoretical understanding will help to better engineer new quantum machines, a key aspect of EQUS research," Dr Neely said.

Credit: 
University of Queensland

Patients see multiple clinicians on one visit, thanks to new scheduling protocol

A new patient-centered scheduling protocol is improving the quality, efficiency and convenience of multiprovider health care, according to a recently published paper from The University of Texas at Austin.

Researchers with UT Austin's McCombs School of Business, the Cockrell School of Engineering and Dell Medical School describe how patients being treated for joint pain at the medical school's Musculoskeletal Institute are able to see a variety of health care providers, one after the other, during the same medical appointment, without ever having to leave the exam room.

The researchers created a scheduling algorithm that employs supply chain concepts similar to those used to track and ensure timely transport of oilfield equipment to drilling sites. In a medical setting, this approach translates into more efficiencies for patients.

At clinics across America, it's not unusual for a patient with joint pain to make a primary care or chiropractic appointment, only to be referred to a physical therapist, a surgeon, or any number of other allied health professionals. Each can require separate appointments on different dates, sometimes with months-long wait times in between, often resulting in disparate and confusing recommendations for treatment.

But at the Musculoskeletal Institute's joint pain clinic, a patient can see a chiropractor, a surgeon, a physical therapist, a social worker and a dietician, all in the same visit, in the same room, with minimal waiting. The health care providers then work together as a team to coordinate an individualized course of care for each patient. It's a model called an integrated practice unit.

The researchers tested various scheduling scenarios based on the types of patients being seen -- new and follow-up -- and the kinds of providers each would potentially need to see for appropriate, convenient care. Upon arrival at the clinic, all patients first see an associate provider (a nurse practitioner, physician assistant or chiropractor) who determines which other health professionals that patient should see during that visit. Each provider gives a "warm handoff" of the patient to the next provider entering the room, creating seamless care without redundancy.

The scheduling algorithm creates real-time updates to clinicians' schedules throughout the day, based on which patients end up needing to see them. It also eliminates long wait times for specialist appointments and relieves the patient of the responsibility for scheduling numerous different types of provider visits about the same medical issue.
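The article does not spell out the algorithm, but its core behavior, rebuilding each clinician's queue as triage results arrive, can be sketched roughly as follows. All names and data structures here are hypothetical illustrations, not the actual UT Austin system:

```python
from collections import defaultdict, deque

# Each provider's schedule is a live queue of (patient, room) stops,
# updated in real time as the associate provider's assessments come in.
schedules = defaultdict(deque)

def triage(patient, room, needed_providers):
    """After the associate provider's assessment, enqueue the patient
    with every provider that patient turns out to need."""
    for provider in needed_providers:
        schedules[provider].append((patient, room))

def next_stop(provider):
    """The provider's next 'warm handoff': which room to enter next."""
    queue = schedules[provider]
    return queue.popleft() if queue else None

triage("patient A", "room 1", ["physical therapist", "dietician"])
triage("patient B", "room 2", ["physical therapist"])

print(next_stop("physical therapist"))  # ('patient A', 'room 1')
print(next_stop("dietician"))           # ('patient A', 'room 1')
```

The key property the sketch illustrates is that patients stay put while providers move: schedules are derived from triage outcomes during the visit rather than booked weeks in advance.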

"Patients can feel the difference," said Dr. Karl Koenig, medical director for musculoskeletal care at UT Health Austin. "I've had people weep because they felt that for the first time, providers were really paying attention to them as a whole person."

The research was conducted by Douglas Morrice, professor of information, risk and operations management (IROM) and senior research fellow of UT's Supply Chain Management Center of Excellence in the McCombs School of Business; Jonathan Bard, professor in the Walker Department of Mechanical Engineering in the Cockrell School of Engineering; Pengfei Zhang, an IROM Ph.D. candidate; and Koenig, associate professor at Dell Med's Department of Surgery and Perioperative Care.

Morrice and Bard first worked on the approach in 2012 at the UT Health Science Center at San Antonio's anesthesiology clinic.

Koenig heard about the San Antonio clinic's success and decided the concept would be perfect for the Musculoskeletal Institute's joint pain clinic. When the clinic opened, he expected it to serve 28 patients a day. With the scheduling model in place, it has been treating 37 to 40.

Credit: 
University of Texas at Austin

Reducing overtesting in the emergency department could save millions

ANN ARBOR, Mich. - An emergency department is sometimes the first place a person thinks to go for health care.

"The emergency department is an essential care setting, handling over 145 million visits annually in the United States across a wide range of patient populations, from children to adults," says Keith Kocher, M.D., MPH, an assistant professor of emergency medicine at Michigan Medicine.

It's for this very reason that Michele Nypaver, M.D., a professor of emergency medicine and pediatrics at Michigan Medicine and University of Michigan C.S. Mott Children's Hospital, says the emergency department setting represents the ideal venue to implement practice improvement efforts to ensure high-quality care, informed by the best available evidence.

Kocher and Nypaver are the program directors of the Michigan Emergency Department Improvement Collaborative (MEDIC), a physician-led statewide quality network connecting a diverse set of unaffiliated emergency departments with the goal of improving quality and reducing low-value emergency care throughout Michigan.

The collaborative is funded by Blue Cross Blue Shield of Michigan and Blue Care Network through the Value Partnerships program.

In a new study, published in Annals of Emergency Medicine, Kocher, Nypaver and a team of emergency physician MEDIC clinical champions from hospitals and health systems across Michigan use data from MEDIC to highlight opportunities to safely reduce overtesting in emergency departments.

"Health care is changing," says Kocher, a member of the U-M Institute for Healthcare Policy and Innovation. "Government, insurance companies, advocacy groups and patients are all more interested than ever in demonstrating the value of care delivered."

He adds, "This requires a better understanding of the outcomes we get for the tests and treatment we perform."

Overimaging costs

MEDIC maintains a clinical registry that contains data on more than 1 million emergency department visits from 16 diverse emergency departments. It also includes data on performance across four quality measures for each department.

In the study, the research team examined data from the clinical registry, specifically looking at amounts of performed imaging. Estimates of excess imaging were calculated based on the Achievable Benchmark of Care method for determining quality improvement targets across a population.

In 2017 alone across the collaborative, the team found substantial variation in amounts of performed imaging and the potential to avoid 1,519 head CT scans for minor head injury, 3,308 chest X-rays for children with asthma, bronchiolitis or croup, and 4,254 CT scans for suspected pulmonary embolism.

"This translates to about $3.8 million annually in avoidable spending on low-value care if these MEDIC sites were to collectively improve to the benchmark standard," Kocher says.

"We show that there is the opportunity to avoid low-value imaging tests in the emergency department and, in turn, create significant health care savings."
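As a rough illustration of how avoided-test counts translate into dollars: the test counts below come from the study, but the per-test costs are invented placeholders, chosen only so that the total lands near the reported ~$3.8 million:

```python
# Avoidable tests reported by the study (2017, across 16 MEDIC sites)
avoidable = {
    "head CT, minor head injury": 1519,
    "chest X-ray, pediatric respiratory illness": 3308,
    "CT, suspected pulmonary embolism": 4254,
}

# Hypothetical per-test costs in dollars (illustrative placeholders,
# not figures from the study)
cost = {
    "head CT, minor head injury": 600,
    "chest X-ray, pediatric respiratory illness": 150,
    "CT, suspected pulmonary embolism": 550,
}

total = sum(n * cost[test] for test, n in avoidable.items())
print(f"avoidable spending: ${total:,}")  # ~$3.7M under these assumptions
```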

Kocher, Nypaver and team hope these study results show the benefit and power a collaborative can have for fellow physicians.

"Useful insight into how we continuously improve the quality of care we deliver only comes from getting meaningful, credible data into the hands of those who can best act on it -- and oftentimes that means clinicians," Nypaver says.

"MEDIC is a tool that allows emergency physicians to get actionable feedback on how they are performing on their patients in their own emergency departments, with the collaborative helping us to learn from each other to keep improving."

Future implications for payments

Kocher and Nypaver note that measurement standards for patient care are increasingly tied to reimbursement, termed "pay for performance."

"Unfortunately these measures are often initiated and implemented without the direct input of the physicians they impact," Kocher says.

"It is critical that emergency physicians bridge this gap and participate in shaping how and in what form the quality of our care is measured and how it will positively impact our patients, or it will likely be determined for us by those outside of our specialty."

He says MEDIC's learning collaborative approach provides an important model for large-scale clinical practice change demonstrating the benefits of partnerships among physicians, hospitals and payers.

Nypaver adds, "MEDIC will continue to identify challenges in emergency medicine where strong evidence exists to define best practice, where opportunity exists to improve patient outcomes and where emergency physicians are best positioned to intervene."

Credit: 
Michigan Medicine - University of Michigan

Elevated first trimester blood pressure increases risk for pregnancy hypertensive disorders

Elevated blood pressure in the first trimester of pregnancy, or an increase in blood pressure between the first and second trimesters, raises the chances of a high blood pressure disorder of pregnancy, according to a study funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), part of the National Institutes of Health.

The study was led by Alisse Hauspurg, M.D., of the University of Pittsburgh School of Medicine and appears in the American Journal of Obstetrics and Gynecology.

The researchers sought to determine how revisions in guidelines for blood pressure in non-pregnant adults might apply to pregnant women. The results suggest that blood pressure readings lower than those traditionally used to identify women as having high blood pressure may indicate a higher risk for a hypertensive disorder of pregnancy, such as gestational high blood pressure, which develops after the 20th week of pregnancy, and preeclampsia, or high blood pressure and protein in the urine. Both conditions increase the risk for stroke in the mother and for stillbirth, preterm birth and low birth weight. Preeclampsia also increases the risk for eclampsia--life-threatening seizures for the mother.

The researchers analyzed data from Monitoring Mothers-to-Be (nuMoM2b), a study that sought to identify risks for birth and pregnancy complications in first-time mothers. For roughly 8,900 women, researchers compared blood pressure readings in the first and second trimesters of pregnancy to blood pressure status in the remainder of pregnancy. None of the women had stage 2 high blood pressure (140/90 mmHg or higher) at the time they entered the study.

Of women who had elevated blood pressure in the first trimester (systolic 120 to 129 mmHg with diastolic below 80), 30.3% developed a hypertensive disorder of pregnancy, a 42% higher risk than for women with normal blood pressure (less than 120/80 mmHg). Of women with stage 1 high blood pressure (130/80 to 139/89 mmHg), 37.8% developed a hypertensive disorder of pregnancy, an 80% greater risk than for women with normal blood pressure. Stage 1 high blood pressure was associated with more than 2.5 times the risk for preeclampsia.

An increase in blood pressure between the first and second trimester also increased the risk of a hypertensive disorder of pregnancy. For example, even among women with normal blood pressure in the first trimester, an increase in systolic pressure (the top number) was associated with a 41% higher risk of any hypertensive disorder of pregnancy, compared to women with a decrease in systolic pressure between the first and second trimester. An increase in diastolic pressure (the bottom number) was associated with a 23% higher risk of a hypertensive disorder, compared to women who had a decrease in diastolic pressure during this time.
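The risk figures above are relative risks, and the baseline rate for women with normal blood pressure can be roughly back-calculated from them. The consistency check below is an editorial illustration, not a calculation from the paper:

```python
# Observed rates of hypertensive disorders of pregnancy by first-trimester BP
rate_elevated = 0.303  # elevated BP group; reported as 42% higher risk
rate_stage1 = 0.378    # stage 1 hypertension group; reported as 80% higher risk

# Back out the implied baseline rate for normal BP (<120/80) two ways;
# if the reported figures are consistent, both should land near each other.
baseline_from_elevated = rate_elevated / 1.42
baseline_from_stage1 = rate_stage1 / 1.80

print(f"{baseline_from_elevated:.1%}")  # ~21%
print(f"{baseline_from_stage1:.1%}")    # ~21%
```

Both back-calculations imply a baseline rate of roughly 21%, which is reassuringly consistent.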

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

Nuclear stress test helps identify heart attack risk in people with diabetes

OAK BROOK, Ill. - Abnormal results on a nuclear stress test are associated with a significantly increased risk of cardiac-related deaths, especially among people with diabetes, according to a multi-center study published in the journal Radiology: Cardiothoracic Imaging.

Researchers said the study results support a role for Positron Emission Tomography (PET) Myocardial Perfusion Imaging (MPI) in improving cardiac risk stratification in people with diabetes.

Diabetes is a devastating worldwide pandemic and a major risk factor for coronary artery disease and heart attack. Accurate cardiac risk assessment of these patients is vital, as it helps guide appropriate and potentially lifesaving treatments.

PET MPI has emerged in recent years as a promising diagnostic and risk assessment tool. The test uses radioisotopes and a special camera to show how well blood is flowing in the heart when it is under stress. Prior research backs the test's use in evaluating individuals with suspected or known coronary heart disease. However, less is known about its value among patients with diabetes.

In the largest study of its kind to date, investigators at four centers collected clinical data on stress tests for both diabetic and non-diabetic patients and then followed them to track the occurrence of adverse events like heart attacks. The study group consisted of 7,061 participants, including 1,966 people with diabetes.

The data showed that among diabetic patients, an abnormal PET MPI was associated with an increased risk of cardiac death across all important clinical subgroups, including those defined by age, gender and obesity, as well as patients with prior revascularization procedures such as angioplasty. Using the data, the researchers were able to more accurately assess cardiac risk for a significant proportion of diabetic patients.

"The data from the stress test among diabetic patients actually allowed us to better risk-stratify people in greater than 39 percent of the cases," said study lead author Hicham Skali, M.D., M.Sc., from Brigham and Women's Hospital and Harvard Medical School in Boston. "Patients with diabetes remain at a significantly higher risk of cardiac death compared to patients without diabetes, and the data from a stress test helps us further stratify those at greatest risk."

The results suggest PET MPI could help select which vulnerable patients may need immediate treatment while sparing others from unnecessary procedures.

Among people without diabetes, the study findings echoed those of previous studies that showed men face a significantly higher risk of cardiac death than women. But that wasn't the case among diabetic patients.

"In patients without diabetes, being a woman confers a certain advantage in that their risk of death is much lower, regardless of their stress findings," Dr. Skali said. "However, when you look at patients with diabetes, men and women have relatively the same risk of cardiovascular death, and that risk increases with worsening findings on the PET stress test."

The data also revealed that, even when they had normal PET MPI results, people with diabetes had a similar rate of cardiac death to people without diabetes who were 10-15 years older, suggesting that younger diabetic patients may require additional tools for risk stratification.

Credit: 
Radiological Society of North America

Urinary tract and other infections may trigger different kinds of stroke

DALLAS, June 27, 2019 -- Several infections have been identified as possible stroke triggers, with urinary tract infections showing the strongest link with ischemic stroke, according to new research in the American Heart Association's journal Stroke.

Previous research examined infections as triggers of stroke but was limited to the correlation of acute infections with ischemic stroke, a type of stroke caused by blocked blood vessels in the brain. This study considered a wider range of infections and examined connections with two other types of stroke: intracerebral hemorrhage, which is caused by a ruptured blood vessel in the brain, and subarachnoid hemorrhage, which results from bleeding in the space between the brain and the membranes that cover it.

"Healthcare providers need to be aware that stroke can be triggered by infections," said Mandip Dhamoon, M.D., Dr.P.H., senior study author and associate professor of neurology at the Icahn School of Medicine at Mount Sinai in New York City. "Probing into the previous weeks or months of a patient's life before the stroke can sometimes help to illuminate the possible causes of stroke if there was an infection during that time."

The researchers used the New York State Inpatient Databases and Emergency Department Databases from 2006 to 2013, which record all inpatient and emergency department visits to community hospitals in New York state. Electronic health record codes were used to identify hospitalizations and emergency department visits for the three types of stroke and for infections, including skin, urinary tract, septicemia, abdominal and respiratory infections. Records for hospitalizations for infections were considered for 7, 14, 30, 60, 90 and 120 days prior to the stroke occurrence.

For ischemic stroke, the researchers found that every infection type was linked with an increased likelihood of this type of stroke. The strongest link was seen with urinary tract infection, which was associated with more than three times the risk of ischemic stroke within 30 days of infection. For all infection types, the magnitude of stroke risk decreased as the time period between infection and ischemic stroke increased.

For intracerebral hemorrhage, the connection was strongest for urinary tract infections, septicemia (blood infection) and respiratory infections. Respiratory infection was the only infection related to the occurrence of subarachnoid hemorrhage.

"Our study shows that we need to do more to understand why and how infections are associated with the occurrence of different kinds of stroke, and that will help us to determine what we can do to prevent these types of strokes," Dhamoon said. "These findings suggest that there could be implications for vaccination, antibiotic regimens or intensive antithrombotic treatments not only to prevent the infections but to prevent stroke in those who are deemed high-risk."

Credit: 
American Heart Association

New, noninvasive test for bowel diseases

video: This video is based on a paper published in Experimental Physiology:

Inflammatory bowel disease associates with increased gut-to-blood penetration of SCFA: a new, noninvasive marker of a functional intestinal lesion
Kinga Jaworska, Marek Konop, Klaudia Bielinska, Tomasz Hutsch, Marcin Dziekiewicz, Aleksandra Banaszkiewicz & Marcin Ufnal.

Image: 
The Physiological Society & Medical University of Warsaw (Poland)

Gut diseases such as inflammatory bowel disease (IBD) are increasingly prevalent worldwide, especially in industrialised countries. In 2015 alone, 250,000 people in the UK were diagnosed with IBD, and 3 million in the United States (1, 2). Symptoms can include pain and swelling of the stomach, bloody diarrhoea, weight loss and extreme tiredness.

A new study in Experimental Physiology proposes a novel, non-invasive test for assessing gut function that may help screen and monitor treatment of gut diseases using only a small sample (1 mL) of blood and stool. How well your gut functions is determined by the gut-blood barrier, a complex multi-layer system. This can be compared to a fine-tuned filter that precisely controls the passage of nutrients and prevents bacteria passing from inside the bowel into the bloodstream.

In those with IBD, and other intestinal diseases, the gut-blood barrier is impaired. Here the intestinal wall is more like a ripped sieve, allowing more bacterial products to pass from the gut into the blood. This is commonly referred to as a leaky gut.

This test measures the concentration of gut bacterial products (produced by bacteria during metabolism) in the patient's blood and stool. The authors believe that with further research this assessment of gut leakage will be very important in the diagnosis and treatment of IBD and other intestinal diseases. (See related video for more on the science of leaky gut and the gut-to-blood permeability ratio).

The usual strategy for diagnosing and monitoring IBD is based on a colonoscopy, which is invasive, often requires anaesthesia, and assesses structural lesions, rather than gut malfunction. Gut disorders can happen before there are visible structural changes, so diagnosing based on functional tests evaluating gut leakage could allow clinicians to detect the disease earlier. While there is no cure for IBD, it is controllable. Early diagnosis would enable patients to control symptoms before they became severe, improving their quality of life.

This new research provides a non-invasive, simple test that could not only be useful for diagnosing IBD, but also other gut disorders, such as celiac disease and food allergies. It's also helpful for detecting diseases that result in a leaky gut, such as heart failure, high blood pressure and liver ailments.

Marcin Ufnal, senior author on the study said:
"This may be a very important tool for diagnosis and treatment of gut and other diseases, using the leaky gut as a marker for disease, as well as a potential target for treatment."

Credit: 
The Physiological Society

Vaccination programs substantially reduce HPV infections and precancerous cervical lesions

Human papillomavirus (HPV) vaccination programs have substantially reduced the number of infections and precancerous cervical lesions caused by the virus, according to a study published today in The Lancet by researchers from Université Laval and the CHU de Québec-Université Laval Research Centre. Results are so promising that it has now become possible to envision eliminating cervical cancer as a public health problem in the coming decades.

Infections caused by HPV are among the most common sexually transmitted diseases. Certain forms of the virus can lead to anogenital warts while others cause lesions that can develop into cancer of the mouth, throat, vagina, vulva, anus, or penis, and particularly the cervix. "HPV is found in almost 100% of cervical cancer cases," states lead author Mélanie Drolet.

The research team headed by Université Laval's Faculty of Medicine Professor Marc Brisson performed a meta-analysis of 65 studies in 14 different countries that have set up HPV vaccine programs in the last ten years. With data on 60 million people, the researchers compared the frequency of HPV infections, anogenital warts, and precancerous cervical lesions before and after the programs were launched.

Their analysis shows that infections dropped by 83% among girls aged 13 to 19 and 66% among women aged 20 to 24. For anogenital warts, the drop was 67% among 15- to 19-year-old girls, 54% for women aged 20 to 24, and 31% for those aged 25 to 29. Precancerous cervical lesions also dropped by 51% among teens aged 15 to 19 and 31% among women aged 20 to 24.

Vaccination of young women is also producing herd protection for young men. Anogenital warts among males have dropped 48% for those 15 to 19 and 32% for those aged 20 to 24.

Cervical cancer currently affects 1 woman in 150 in Canada, and the 5-year mortality rate is about 25%. "HPV vaccination is still too recent to directly measure its effects on cervical cancer as it can take decades to develop," explains Mélanie Drolet. "However, our analyses show that vaccination is producing substantial reductions in the infections that cause cervical cancer and precancerous lesions. These reductions are a first sign that vaccination could eventually lead to the elimination of cervical cancer as a public health problem. We are now trying to determine when elimination could be achieved and which vaccination and screening programs could help us achieve it faster."

Credit: 
Université Laval

Understanding how tics are suppressed may help some at risk for tic disorders

image: Studying children shortly after they began experiencing tics, researchers at Washington University School of Medicine in St. Louis discovered that although tics don't go away, most children are able to suppress and control them. Understanding how they do that may provide insight to help others at risk for significant tic disorders.

Image: 
Illustration by Michael Worful

At least 20 percent of elementary school-age children develop tics such as excessive blinking, throat clearing or sniffing, but for most of those kids, the tics don't become a long-term problem. Conventional wisdom has held that most tics go away on their own and that only in rare cases do they become chronic or develop into a disorder such as Tourette syndrome.

However, studying children shortly after tics first appear, researchers at Washington University School of Medicine in St. Louis discovered that tics don't go away completely; rather, most children simply exhibit tics less when others are watching. Learning how they do that may provide insight to help others at risk for significant tic disorders.

The findings are published online June 26 in the Journal of Child Neurology.

"We found that tics were still present one year after they first appeared but that many of the kids we studied had figured out how to suppress them," said principal investigator Kevin J. Black, MD, a professor of psychiatry. "Uncovering just how they are able to control these tics may help other children do the same and perhaps avoid chronic tic disorders such as Tourette syndrome." Chronic tic disorders affect about 3% of the population, he added.

The researchers examined 45 children who had just started experiencing some sort of tic. The kids were ages 5 to 10, with an average age of about 7 ½.

Thirty of the children were boys -- in whom tic disorders are more common -- and the other 15 were girls. All of the children were examined within a few months of when their tics first appeared, and a second time 12 months after the tics had started.

"Our expectation, initially, was that maybe one in 10 kids would still have tics at their follow-up exams," said first author Soyoung Kim, PhD, a postdoctoral research associate in psychiatry. "Most had improved a year later, but to our surprise in every case, the children still had tics -- many of them just controlled them better."

Kim, Black and their colleagues verified the presence of tics by leaving each child alone in a room with a video camera. They found it was possible for most children to suppress tics when they were being watched during neurological exams. But when left alone, the children exhibited tics, without exception.

"I find tics fascinating because they illustrate the interplay between what is volitional and what is involuntary," Black said. "People don't tic on purpose, and most can suppress it for a short period of time, but at some point, it's going to come out."

The research team was able to identify several factors that predicted problematic tics at the one-year mark, as well as factors related to the ability to suppress tics.

A history of anxiety disorder was among the predictors of an inability to control or suppress tics, as was having pronounced tics during the kids' initial exams. Having three or more vocal tics, such as throat clearing or making other noises, also indicated a likelihood of evident tics one year later.

In addition, children with higher scores on the Social Responsiveness Scale -- a test that measures behaviors on the autism spectrum -- also were likely to have continued problems with tics a year after first experiencing them.

"None of these kids had autism, but those who did a little bit worse on that test, who had what we would call sub-syndromal symptoms of autism, were more likely to have trouble with tics one year later," Black said.

The researchers used a reward system to help determine whether the children could suppress tics. In one study exercise, they rewarded children with a token worth a few pennies for every 10 seconds they could go without having a tic. Those who suppressed their tics most effectively in response to rewards exhibited fewer and less significant problems at their follow-up visits.

"My suspicion is that, over time, these kids may improve in their ability to suppress tics, just from social cues," Black said. "But perhaps more importantly, early on -- when they've experienced tics for only a few weeks or months -- some children already can suppress them. If we can develop ways to help other children acquire those skills, we might improve quality of life for those who otherwise may go on to develop a chronic tic disorder such as Tourette syndrome."

Credit: 
Washington University School of Medicine

Long delays prescribing new antibiotics hinder market for needed drugs

MADISON, Wis. -- U.S. hospitals wait over a year on average to begin prescribing newly developed antibiotics, a delay that might threaten the supply or discourage future development of needed drugs.

A survey of how 132 hospitals prescribed six new antibiotics from 2014 to 2018 found that the average time to prescribe any one of the new drugs was 398 days. Teaching and research hospitals and large hospitals tended to prescribe the drugs more quickly than smaller, non-academic hospitals.

That gap period delays the payback of research and development costs in the crucial months following federal approval of a new drug. Those challenges could reduce the likelihood that companies develop new antibiotics, which are essential for treating pathogens that develop resistance to existing antimicrobials.

Warren Rose, a professor in the School of Pharmacy at the University of Wisconsin-Madison who led the recent study, says this delay shows the need for market-boosting incentives to buoy new antibiotics during this sensitive period to ensure a robust supply of fresh antibiotics in the coming years and decades.

He points to the pharmaceutical company Achaogen, which filed for bankruptcy earlier this year after its first drug, an antibiotic, failed to gain a market foothold.

"If companies fail, then even if you call for drugs, they won't make them," says Rose. "As clinicians, we are hypocritical, because we call for drugs and then don't use them."

Rose conducted the study with Lucas Schulz, the Infectious Diseases Coordinator in the Department of Pharmacy at UW Health, Seok Yeong Kim, a student at the School of Pharmacy, and Alyssa Hartsell from the health care consulting company Vizient. The team published its findings June 22 in the journal Diagnostic Microbiology and Infectious Disease.

In response to the growing problem of antibiotic resistance, which affects millions of people in the U.S. each year, public and private organizations have provided funding and support to boost the development of new antibiotics. That support has worked -- 12 new antibiotics have received approval from the Food and Drug Administration since 2010, reversing decades of declining development.

Those drugs include several Qualified Infectious Disease Products, an FDA label for antibiotics that are eligible for extended exclusive sale protections and rapid approval as incentives for development. Rose and his team tracked the prescription of six QIDPs over four years using a Vizient database.

While the average time to use one of the new drugs was longer than a year, it varied widely from less than two weeks to more than four years. Hospitals with more complex patient needs, which tended to be larger hospitals and those associated with universities, prescribed the new drugs more quickly.

Prescribing time also varied by region. Hospitals in the South prescribed all six drugs within two years on average, while those in the Northeast took more than three and a half years to reach the same mark.

It's not entirely clear what drives this delay. These new drugs are much more expensive than legacy antibiotics -- up to $1,000 a day versus pennies for some older drugs -- which may account for some of the reluctance to begin prescribing them.

More importantly, antibiotics work best when they are only used for the right patient at the right time, and there may be barriers to including the new drugs in these protocols. Many of the recently developed antibiotics target a wide range of bacteria, which means they do not meet antibiotic stewardship guidelines that recommend targeting pathogens with narrowly active antimicrobials.

There are also few tests available for helping doctors determine whether a patient is a good candidate for these new drugs, which means they may only be prescribed after other antibiotics have failed to cure an infection. In all, these new antibiotics appear to function as a drug of last resort, delaying their use.

Rose says current incentives for companies to produce new antibiotics end before the drugs are approved and go on sale, leaving companies in the lurch if uptake is slow.

"Those incentives might not be enough," says Rose. "We have to think about things differently."

Programs that push drugs to market could be combined with incentives that pull new antibiotics forward once they receive FDA approval, Rose says. Government stockpiling of drugs or a cash reward for meeting that milestone might help companies bridge the year-plus gap to a stable market.

A recently proposed bipartisan bill in Congress, the DISARM Act, aims to institute some of these pull incentives. These sorts of programs could help ensure a steady stream of new antibiotics by providing more security after the drugs are approved.

"I don't know if this study has all the answers, but it starts the conversation," says Rose.

Credit: 
University of Wisconsin-Madison

Immune damage may explain ineffectiveness of high-dose radiation against lung cancer

image: Sameer Nath, MD, and colleagues show that radiation dose to the immune system may predict shorter survival for stage III non-small cell lung cancer patients.

Image: 
University of Colorado Cancer Center

When it comes to using radiation against lung cancer, preliminary clinical studies were pretty clear: More is better. So why did a large phase 3 clinical trial find exactly the opposite - that stage III non-small cell lung cancer patients treated with higher doses of radiation actually had shorter overall survival than patients treated with lower-dose radiation?

"At first glance, you might think that when you give a higher dose of radiation it's simply too toxic - there are just too many side effects and the benefit does not outweigh the harm. But in this trial, if you look at grade 3-5 toxicities, there was no statistically significant difference between the high- and low-dose radiation arms, and furthermore the absolute number of treatment-related deaths was quite small as a whole. Even more interestingly, tumor control was actually worse in the high-dose arm of the study, which is not easily explained by increased toxicity," says Sameer K. Nath, MD, University of Colorado Cancer Center investigator and assistant professor in the CU School of Medicine Department of Radiation Oncology.

The unexpected findings have vexed researchers since the study closed in 2013. Now a new study by Nath and CU Cancer Center colleagues offers an interesting answer: Blame it on the immune system. That's because radiation used in the lung doesn't stay in the lung. This radiation hits blood passing through the lungs and heart as well. And compromising the immune components of blood removes an important ally in the body's fight against cancer.

One reason has to do with the way anti-cancer radiation treatment works. Of course, radiation kills cells - oncologists try to focus radiation onto tumor tissue so that it primarily kills cancer cells. But not all cancer cells die immediately from radiation. Many of these cells are only damaged and it falls to the immune system to recognize and eliminate these cells with DNA damage.

"Over the last decade, a lot of research supports the idea that a functional immune system plays a key role in tumor cell killing following the DNA damage created by radiation therapy," Nath says.

But if radiation impairs the immune system, these tumor cells with DNA damage may be left to survive and ultimately escape immune-mediated removal.

"Our hypothesis is that higher doses of radiation to the immune system are contributing to worse survival in those patients," Nath says.

To test this hypothesis, Nath and colleagues used a sophisticated computer model to define the estimated dose of radiation to immune cells (EDRIC) delivered during treatment. Then they looked at a new population of patients: Could EDRIC predict survival in 117 patients with stage III non-small cell lung cancer treated at UCHealth University of Colorado Hospital between 2004 and 2017?

Yes, it could: Patients with high EDRIC (above 7.3 Gy) lived a median of 14.3 months, while those with low EDRIC (below 5.1 Gy) lived a median of 28.2 months.

"We show that the estimated dose of radiation to immune cells is an important predictor of tumor control and survival. In fact, it ends up being one of the most important predictors," Nath says. "It's a really thought-provoking finding because radiation effects on the immune system isn't something the field has focused on in the past."

Nath cautions that radiation remains a backbone treatment for the management of stage III non-small cell lung cancer, and although radiation dose to the immune system may lessen the benefit of radiation, several clinical trials have established a clear survival benefit of radiation in this population.

"What is really exciting is that now that we have ways to estimate this radiation dose, we can look at sparing the immune system as an organ-at-risk," he says. "We think it is a modifiable risk factor. Now we can focus on methods to minimize dose and in doing so, we will be able to maximize the benefit of radiation to these patients."

Though more work is needed to confirm the current findings and to define best practices for immune-sparing radiation therapy, Nath suggests that it may be useful to reduce the time over which radiation is delivered and the number of treatments to restrict the number of immune cells damaged.

"As you give each fraction of radiation to the chest, there's a certain volume of blood that's in the field. Many of the lymphocytes in that field will be killed, and as you repeat each treatment, you keep hitting a different volume of blood and overall end up depleting that circulating pool. If we reduce the number of radiation fractions or give the dose over a shorter time period, we would be hitting less blood overall," Nath says.

Additional immune-sparing strategies for future study may include the use of more focused forms of radiation that allow doctors to spare areas that contain more circulating immune cells, directing radiation away from the heart and lung, and using functional lung imaging to identify and avoid areas that are especially perfused.

Credit: 
University of Colorado Anschutz Medical Campus

A hidden truth: Hospital faucets are often home to slime and biofilm

image: Sinks and faucets tested at the University of Michigan Health System revealed slime and biofilm.

Image: 
University of Michigan Health System

Hand hygiene is a critical component of infection prevention in hospitals, but it has unintended consequences: water splashing out of a sink can spread contaminants from dirty faucets, according to new research presented last week in Philadelphia at the 46th Annual Conference of the Association for Professionals in Infection Control and Epidemiology (APIC).

Researchers at the University of Michigan Health System assessed eight different sink designs across four intensive care units to determine how dirty sinks and faucets really are. They found that shallow sink bowls enabled potentially contaminated water to splash onto patient care items, healthcare worker hands, and into patient care spaces - at times more than four feet from the sink itself.

"The inside of faucets where you can't clean were much dirtier than expected," said study author Kristen VanderElzen, MPH, CIC. "Potentially hazardous germs in and around sinks present a quandary for infection preventionists, since having accessible sinks for hand washing is so integral to everything we promote. Acting on the information we found, we have undertaken a comprehensive faucet replacement program across our hospital."

To gauge the grime level of the sinks, the researchers used adenosine triphosphate (ATP) monitoring, a measure of cleanliness. Visible biofilm was associated with higher ATP readings, and cultures taken over the course of the study grew Pseudomonas aeruginosa, mold, and other environmental organisms.

The research team also found aerators on sinks where they had previously been removed, pointing to an overall inconsistency of equipment protocols across the facility. Included in the design improvement program were sink guards, which were shown to limit splash significantly.

"As we learn more about the often stealthy ways in which germs can spread inside healthcare facilities, infection preventionists play an increasingly important role in healthcare facility design - including in the selection of sink and faucet fixtures - as this study illustrates," said 2019 APIC President Karen Hoffmann, RN, MS, CIC, FSHEA, FAPIC. "Because the healthcare environment can serve as a source of resistant organisms capable of causing dangerous infections, an organization's infection prevention and control program must ensure that measures are in place to reduce the risk of transmission from environmental sources and monitor compliance with those measures."

Credit: 
Association for Professionals in Infection Control

New knowledge on the development of asthma

Researchers at Karolinska Institutet in Sweden have studied which genes are expressed in overactive immune cells in mice with asthma-like inflammation of the airways. Their results, which are published in the journal Immunity, suggest that the synthesis and breakdown of fats plays an important part in the process.

The job of the human immune system is to read our environment and react to potentially harmful substances. In asthma, the immune system is overactive, causing inflammation in the lungs and symptoms such as coughing, wheezing and shortness of breath.

A kind of immune T cell called a Th2 cell plays a vital part in asthma-related inflammation, but the rarity of these cells and a lack of sufficiently sensitive technology have made them hard to study in any detail.

Researchers at Karolinska Institutet have now used a highly sensitive technique called single-cell RNA sequencing to analyse which genes are active in individual T cells. For the study, the team exposed mice to house dust mites, a common allergen to which most asthmatics are sensitive and which induces asthma-like lung inflammation. They then monitored gene expression in T cells before and after allergen exposure, from the lymph glands to the site of inflammation in the lungs.

They found that in the mouse lung, the T cells express a unique profile of hundreds of genes, many of which are linked to how the cells make and break down fat. When they then gave mice a drug to block fat metabolism, the lung inflammation decreased relative to controls.

"Our results suggest that fats can help to aggravate the T-cell activated inflammation in the lungs that is seen in asthma," says the study's corresponding author Jonathan Coquet, researcher at the Department of Microbiology, Tumor and Cell Biology, Karolinska Institutet. "We now plan to systematically test the importance of the hundreds of uniquely expressed genes in order to find those that can trigger or prevent the development of the disease."

Another finding of the study was that when T cells reached the lungs from the lymph glands, they received signals that switched on the production of two powerful inflammatory substances: the cytokines interleukin 5 and interleukin 13. These cytokines are responsible for many typical asthma symptoms, such as respiratory tract inflammation, muscle contraction and mucus discharge.

"Our observation is that the T cells change a great deal over time and seem to undergo a kind of reprogramming in the lungs that makes them highly inflammatory," says Dr Coquet.

Credit: 
Karolinska Institutet