Study identifies network of genes that directs trachea and oesophagus development

A new study reporting how a network of genes directs the development of the trachea and oesophagus in mice has been published today in eLife.

The results provide new insight on the genes present during development that enable the formation of the trachea and oesophagus, more commonly known as the windpipe and food pipe, respectively. This may help scientists understand what causes birth defects in which the two structures do not fully separate, leading to eating and breathing difficulties. The findings may also help scientists one day grow oesophagus or trachea tissue in the lab to treat such birth defects or conditions such as cancer that may destroy these tissues.

During prenatal development, a tube of stem cells in the embryo gives rise to the cells of both the oesophagus and trachea. These cells become distinct and eventually the two structures separate. Previous studies have suggested that a pair of master gene regulators called NKX2.1 and SOX2 may control this process, but it is not clear which genes are activated by these regulators or if there are other regulators that might also be involved.

"We wanted to determine all of the genes that distinguish the trachea from the oesophagus and to learn how NKX2.1 and SOX2 influence the development of these organs," says lead author Akela Kuwahara, a Developmental & Stem Cell Biology Program graduate at the University of California, San Francisco (UCSF), US.

To do this, Kuwahara and a team of researchers led by Jeffrey Bush, Associate Professor at the Department of Cell & Tissue Biology, UCSF, used single-cell RNA sequencing to compare all the genes that were switched on in the cells of the developing oesophagus and trachea in mice. Their results showed there are two very different sets of genes that are turned on during early development of the two organs.

Next, they compared which genes were turned on in the oesophagus and trachea of developing mice, comparing animals that had a functional NKX2.1 gene with those that did not. Most of the genes needed for the two tissues to develop were still turned on in the mice lacking NKX2.1. But a few important genes were different in these animals, including for example those needed to grow the cartilage that supports the trachea or the smooth muscle that moves food down the oesophagus.

"This suggests that NKX2.1 is not the master regulator for all genes involved in trachea development, but instead regulates only a small number of important genes," Kuwahara explains. "Our results reveal multiple new genes that are essential for trachea and oesophagus development in mice, but we now need to determine if these same genes are involved in the development of these organs in humans."

"Learning more about these genes and whether they play similar roles in humans is key to understanding how defects of the trachea and oesophagus can occur at birth," adds senior author Jeffrey Bush. "In the longer term, this insight may help us discover ways to grow new tissue from stem cells to help counter these defects."

Credit: 
eLife

Armor on butterfly wings protects against heavy rain

ITHACA, N.Y. - An analysis of high-speed raindrops hitting biological surfaces such as feathers, plant leaves and insect wings reveals how these highly water-repelling veneers reduce the water's impact.

Micro-bumps and a nanoscale wax layer on fragile butterfly wings shatter and spread raindrops to minimize damage.

The study, "How a Raindrop Gets Shattered on Biological Surfaces," was published June 8 in the Proceedings of the National Academy of Sciences.

The research showed how microscale bumps, combined with a nanoscale layer of wax, shatter and spread these drops to protect fragile surfaces from physical damage and hypothermia risk.

There already exists a large market for products that use examples from nature - known as biomimicry - in their design: self-cleaning water-resistant sprays for clothes and shoes, and de-icing coatings on airplane wings. Findings from this study could lead to more such products in the future.

"This is the first study to understand how high-speed raindrops impact these natural hydrophobic surfaces," said senior author Sunghwan "Sunny" Jung, associate professor of biological and environmental engineering in the College of Agriculture and Life Sciences. The lead author is Seungho Kim, a postdoctoral researcher in Jung's lab.

Previous studies have looked at water hitting insects and plants at low impacts and have noted the liquid's cleaning properties. But in nature, raindrops can fall at speeds of up to 10 meters per second, so this research examined how raindrops falling at high speeds interact with super-hydrophobic natural surfaces.

Raindrops pose risks, Jung said, because their impact could damage fragile butterfly wings, for example.

"[Getting hit with] raindrops is the most dangerous event for this kind of small animal," he said, noting the relative weight of a raindrop hitting a butterfly wing would be analogous to a bowling ball falling from the sky on a human.

In the study, the researchers collected samples of leaves, feathers and insects. The latter were acquired from the Cornell University Insect Collection, with the help of co-author Jason Dombroskie, collection manager and director of the Insect Diagnostic Lab.

The researchers placed the samples on a table and released water drops from heights of about two meters, while recording the impact at a few thousand frames per second with a high-speed camera.

In analyzing the film, they found that when a drop hits the surface, it ripples and spreads. A nanoscale wax layer repels the water, while larger microscale bumps on the surface create holes in the spreading raindrop.

"Consider the micro-bumps as needles," Jung said. If one dropped a balloon onto these needles, he said, "then this balloon would break into smaller pieces. So the same thing happens as the raindrop hits and spreads."

This shattering action reduces the amount of time the drop is in contact with the surface, which limits momentum and lowers the impact force on a delicate wing or leaf. It also reduces heat transfer from a cold drop. This is important because the muscles of an insect wing, for example, need to be warm enough to fly.

"If they have a longer time in contact with the cold raindrop, they're going to lose a lot of heat and they cannot fly very easily," Jung said, making them vulnerable to predators, for example.

Repelling water as quickly as possible also is important because water is very heavy, making flight in insects and birds difficult and weighing down plant leaves.

"By having these two-tiered structures," Jung said, "[these organisms] can have a super hydrophobic surface."

Credit: 
Cornell University

Heat and humidity battle sunshine for influence over the spread of COVID-19, research finds

An international team of researchers led by McMaster University has found that while higher heat and humidity can slow the spread of COVID-19, longer hours of sunlight are associated with a higher incidence of the disease, in a sign that sunny days can tempt more people out even if this means a higher risk of infection.

The findings, published online in the journal Geographical Analysis, inform the widespread scientific debate over how seasonal changes, specifically warmer weather, might shape the spread of COVID-19.

While research has shown that pathogens such as influenza and SARS thrive in lower temperatures and humidity, little is known about SARS-CoV-2, the agent that causes COVID-19.

"There is a lot of pressure to reopen the economy, and many people want to know if it will be safer to do so in the summer months," says Antonio Páez, a professor and researcher in McMaster's School of Geography & Earth Sciences who is lead author of the study.

"Restrictions in movement, which have begun to ease around the world, hinge in part on how SARS-CoV-2 will be affected by a change in season," he says.

Páez and colleagues from Spain's Universidad Politécnica de Cartagena and Brazil's Universidade Federal de Pernambuco investigated climate factors in the spread of COVID-19 in several provinces in Spain, one of the countries hardest hit by the pandemic, with more than 270,000 cases.

They combined and analyzed data on reported cases of the disease and meteorological information over a period of 30 days that began immediately before a state-of-emergency was declared.

At higher levels of heat and humidity, the researchers found that for every one per cent increase, there was a 3 per cent decline in the incidence of COVID-19, possibly because warmer temperatures curtail the viability of the virus.
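Read as a multiplicative effect, that association compounds over larger changes. The sketch below illustrates this under simplifying assumptions not stated in the article (that the 3 per cent decline applies per one-unit increase, independently of other factors):

```python
def relative_incidence(units_increase: float, decline_per_unit: float = 0.03) -> float:
    # Each one-unit rise in heat/humidity multiplies incidence by 0.97
    # under the study's reported association (simplified illustration).
    return (1.0 - decline_per_unit) ** units_increase

print(round(relative_incidence(1), 2))   # 0.97 -> 3% lower incidence
print(round(relative_incidence(10), 2))  # ~0.74 -> roughly 26% lower
```

So a ten-unit increase would be associated with roughly a quarter fewer cases, not a 30 per cent reduction, because the declines compound rather than add.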

The opposite was true for hours of sunshine: more sun meant greater spread. The researchers speculate the increase may be related to human behaviour, since compliance with lockdown measures breaks down on sunnier days.

They were also surprised to find rates of transmission dropped in denser populations and in areas with more older adults, suggesting those populations regard themselves as being at greater risk, and so are more likely to adhere to lockdown guidance.

While older adults are more vulnerable to the disease, researchers believe they are less likely overall to contribute to the spread of the disease because they are more apt to be isolated from others because of health or mobility issues.

Páez stresses that models such as the one he helped develop show that contagion of COVID-19 declines as a lockdown progresses, possibly to the vanishing point - an argument for maintaining discipline despite the approach of pleasant weather.

"We will likely see a decrease in the incidence of COVID-19 as the weather warms up, which is an argument for relaxing social distancing to take advantage of the lower incidence associated with higher temperatures," he says. "But a more conservative approach would be to use the months of summer to continue to follow strict orders to remain in place and to crush this pandemic."

Credit: 
McMaster University

Botox is an effective treatment for some common sports injuries, new research suggests

June 9, 2020 - While botulinum toxin is commonly known as a cosmetic treatment for facial lines and wrinkles, a growing body of evidence suggests that "Botox" can also be an effective treatment for certain sports injuries and chronic pain conditions, according to a review in the June issue of Current Sports Medicine Reports, official journal of the American College of Sports Medicine (ACSM). The journal is published in the Lippincott portfolio by Wolters Kluwer.

Clint Moore, DO, and colleagues of the Uniformed Services University of the Health Sciences assembled and analyzed previous research on the use of botulinum toxin A (BoNT-A) - best known by the brand name Botox - for treatment of musculoskeletal disorders. "We found evidence showing promising pain relief and functional improvements using botulinum toxin for some very common conditions, including plantar fasciopathy, tennis elbow, and painful knee osteoarthritis," Dr. Moore comments.

For These Musculoskeletal Disorders, Evidence Supports Botulinum Toxin Injection

Various types of BoNT-A are available, but all act on motor neurons (nerve cells) to produce muscle weakness and on sensory neurons to inhibit the release of pain modulators. As in cosmetic procedures, the effects of BoNT-A injection are time-limited, and treatment may need to be repeated for sustained benefits. The effects on muscle contraction last about three months, while effects on pain may last for six months.

In a critical analysis of the research literature, Dr. Moore and colleagues identified studies showing that the neuromuscular blockade provided by BoNT-A can reduce pain and improve function in several musculoskeletal conditions:

Plantar fasciopathy: The most common cause of plantar heel pain, caused by thickening and other changes in the fibrous plantar fascia in the foot. Several studies report that BoNT injections can be effective if initial conservative treatments are unsuccessful. Although limited, high-quality evidence suggests reduced pain and improved function after BoNT therapy, with no significant side effects.

Osteoarthritis: A very common and disabling condition causing pain and reduced function of the knee, shoulder, or other joints. Especially in the knee, studies have reported reduced pain and disability scores after intra-articular (into the joint) injections of BoNT-A. Dr. Moore and colleagues have found improvements lasting four to six months after BoNT injection for knee osteoarthritis.

Lateral epicondylitis: Often called "tennis elbow," a common cause of elbow pain. Studies have reported reduced pain and improvement in daily activities after BoNT-A injection. In some reports, reductions in finger movement and grip strength have occurred due to the (temporary) motor effects of BoNT-A.

Chronic exertional compartment syndrome: A condition causing painful and potentially damaging increases in pressure in muscle compartments, usually after exercise. Based on limited evidence, BoNT-A injections may be a safe and effective treatment, in some cases avoiding the need for surgery.

For each of these conditions, Dr. Moore and colleagues discuss the role of BoNT-A and how they use it in their practice, including injection technique and dosage. Their paper also reviews studies using BoNT-A for patients with myofascial pain syndrome, a relatively common cause of chronic pain - with inconclusive results.

The authors note that all of these are "off-label" uses for which BoNT-A is not an FDA-approved treatment, and emphasize the need for appropriate patient selection and counseling. Dr. Moore and coauthors conclude, "Further research is required to provide stronger clinical recommendations for the use of BoNT in musculoskeletal conditions."

Credit: 
Wolters Kluwer Health

Use of cystatin C for precise assessment of kidney function and cardiovascular risk

The glomerular filtration rate (GFR) is normally specified as a measure of kidney function. The GFR is the volume of blood that the kidneys filter per minute (the unit of measurement, normalized to a standardized body surface area, is therefore ml/min/1.73 m2). To calculate or estimate GFR (eGFR = estimated GFR), an equation based, among other parameters, on the laboratory value serum creatinine is most often applied. Creatinine, a non-protein nitrogenous substance, is a breakdown product of muscle metabolism that is released continuously and excreted in urine (making it a urinary substance). If kidney function is impaired, eGFR decreases and serum creatinine increases.

However, because the body's own creatinine production depends on various factors (e.g. age, gender and muscle mass), the reliability of creatinine-based eGFR (eGFRcr) is a recurrent topic of discussion among specialists. For example, the kidney function of a frail elderly woman (with low muscle mass and correspondingly low serum creatinine) may be wrongly assessed as normal on the basis of her creatinine level, even though it may be significantly reduced. Conversely, the high muscular creatinine production of a bodybuilder may elevate serum creatinine values and thus lead arithmetically to a low eGFR (despite normal kidney function).

The endogenous protein cystatin C (Cys-C), which is released continuously by the metabolism of almost all body cells, therefore appears to be a more suitable marker than serum creatinine. The amount of Cys-C produced is independent of age, gender and muscle mass; potential confounding factors in cystatin-based eGFR estimation (eGFRcys) are instead inflammation, cancer, thyroid dysfunction and steroid therapy. Cys-C measurement is also more expensive than creatinine measurement, and the test is not available in every laboratory.
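As a concrete illustration of how a creatinine-based estimate works, here is a minimal sketch using the CKD-EPI 2021 race-free creatinine equation, one widely used eGFRcr formula; the article does not specify which equations the cited studies apply, so this choice is an assumption:

```python
def egfr_ckd_epi_2021(scr_mg_dl: float, age: int, female: bool) -> float:
    """Estimate GFR (ml/min/1.73 m2) from serum creatinine (mg/dL).

    CKD-EPI 2021 race-free creatinine equation, for illustration only.
    The combined creatinine-cystatin C equation (eGFRcr-cys) discussed
    in the article adds further terms for cystatin C.
    """
    kappa = 0.7 if female else 0.9
    alpha = -0.241 if female else -0.329
    egfr = (142
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.200
            * 0.9938 ** age)
    if female:
        egfr *= 1.012
    return egfr

# The creatinine trap described above: the same creatinine value of
# 1.0 mg/dL implies very different filtration in a frail 85-year-old
# woman versus a muscular 25-year-old man.
print(round(egfr_ckd_epi_2021(1.0, 85, female=True)))   # markedly reduced eGFR
print(round(egfr_ckd_epi_2021(1.0, 25, female=False)))  # near-normal eGFR
```

The point of the example is that age and sex terms alone cannot fully correct for muscle mass, which is why the creatinine-independent marker cystatin C adds value.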

An equation for estimating eGFR that includes both parameters (eGFRcr-cys) has been shown to provide the most accurate approximation of true GFR, not only in early stages, but also in late stages of kidney disease. This may be due to the fact that the confounding factors of the two parameters are independent of each other and play a less significant role in the combined equation eGFRcr-cys, according to the authors. eGFRcr-cys is particularly suitable, therefore, when it is important to know how well kidneys function as precisely as possible and at an early stage (e.g. to calculate the dosage of certain drugs, for enrolment in studies, or in the case of potential kidney donors).

"Accurate measurement is needed for the early detection of CKD. The ERA-EDTA recommends that eGFRcys and eGFRcr-cys be implemented as the new standard," emphasizes Professor Denis Fouque, Lyon/France, NDT's Editor-in-Chief.

Restriction of kidney function is known to worsen the prognosis of patients with cardiovascular disease. "eGFRcys and eGFRcr-cys could be used in anybody with an eGFRcr of 45-60 or 60-90 ml/min/1.73 m2 plus another cardiovascular risk factor to confirm the diagnosis/staging of CKD. The lowest identified eGFR should be used for forward planning," explains corresponding author Dr. Jennifer Lees, Glasgow. "eGFRcys should be used in parallel with traditional cardiovascular risk factors in order to produce a more exact prediction of individual risk and to optimize the primary prevention of cardiovascular disease."

Credit: 
ERA – European Renal Association

Computer modelling predicts where vaccines are needed most

Researchers have developed a model that can estimate regional disease burden and the impact of vaccination, even in the absence of robust surveillance data, a study in eLife reveals.

The report, originally published on May 26, highlights areas that would benefit most from initiating a vaccination programme against Japanese encephalitis (JE). This will in turn guide rational assessment of the costs and benefits of vaccination, and support policymakers' decisions on allocating vaccines.

JE is a viral infection of the brain transmitted by mosquitoes. It is endemic in Asia-Pacific countries, with three billion people at risk of infection according to the World Health Organization (WHO). Only a small number of infections are symptomatic (ranging from one in 25 to one in 1,000), but people with symptomatic infections have a high risk of death (around one in three). Those who survive are often left with considerable neurological and psychological symptoms.

There are a number of vaccines available for JE, but in 2013, WHO prequalification was given to a new JE vaccine that requires only a single dose, is cheap to produce and is safer than previous vaccines. This led to a great increase in vaccination in Asia. However, given the disease's widespread prevalence across several countries, it has not been possible to estimate the impact of these vaccinations on disease burden.

"Vaccination is the most effective method of prevention but it is difficult to decide where it should be implemented or to estimate the quantitative impact without good-quality surveillance data from before and after vaccination," says lead author Tran Minh Quan, who was a Research Assistant at the Oxford University Clinical Research Unit, Wellcome Trust Asia Program, Vietnam, at the time of the study, and is now a graduate student at the University of Notre Dame, Indiana, US. "We developed a new approach using a modelling method that overcomes some of the limitations of sparse and variable surveillance data."

The team took a two-step approach to their analysis. First, they reviewed the available data on cases of JE and grouped this data by age. By focusing on age, this took out other variables and allowed the team to analyse the data according to a simple rule: the higher the rate of infection, the earlier in life people will acquire the infection. Then, by using a model that calculates the rate of infection using the age-grouped data, they generated a value called Force of Infection (FOI). This gives an idea of the intensity of transmission within a particular region.

In the second step, they used this FOI value to generate the disease burden in a specific region. When they ran this analysis with and without data on vaccination programs, it provided an estimate on the impact of vaccination on the number of global JE cases to date.
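The two-step logic can be sketched with a simple catalytic model, a common minimal approach to estimating FOI from age-grouped data. This is a hypothetical simplification: the study's actual model, fitting method and symptomatic rate are more sophisticated, and the numbers below are synthetic.

```python
import math

def prevalence_by_age(foi: float, age: float) -> float:
    # Simple catalytic model: under a constant force of infection
    # (FOI), the probability of having been infected by age a is
    # 1 - exp(-FOI * a). Higher FOI means infection earlier in life,
    # which is the "simple rule" the analysis exploits.
    return 1.0 - math.exp(-foi * age)

def fit_foi(ages, proportions, grid_step=1e-4):
    # Step 1: grid-search the FOI that best matches the age-grouped
    # data (least squares; the study uses a more elaborate fit).
    best_foi, best_err = 0.0, float("inf")
    foi = grid_step
    while foi < 1.0:
        err = sum((prevalence_by_age(foi, a) - p) ** 2
                  for a, p in zip(ages, proportions))
        if err < best_err:
            best_foi, best_err = foi, err
        foi += grid_step
    return best_foi

def annual_cases(foi, population, symptomatic_rate=1 / 250):
    # Step 2: convert FOI into burden: new infections per year,
    # scaled by the fraction that become symptomatic (between 1 in 25
    # and 1 in 1,000 per the article; 1/250 is an assumed value).
    return foi * population * symptomatic_rate

# Hypothetical age-grouped data from a high-transmission region.
ages = [5, 10, 20, 40]
observed = [prevalence_by_age(0.05, a) for a in ages]  # synthetic "truth"
foi_hat = fit_foi(ages, observed)
print(round(foi_hat, 3))                  # recovers ~0.05
print(round(annual_cases(foi_hat, 1_000_000)))
```

Running the fit with and without vaccinated cohorts removed from the data is what lets the model separate the burden averted by vaccination from the burden that remains.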

From this analysis, the team estimated that between 2000 and 2015, there were nearly two million cases of JE worldwide (1,976,238). Without vaccination, this number would have been 2,284,012, meaning that more than 300,000 JE cases were prevented globally because of vaccination. China had the highest burden of the disease but also benefited from the greatest impact of vaccination. On the other hand, estimates for countries including India, Vietnam and Indonesia suggested that up until 2015 these countries had high transmission intensity and that vaccination could be scaled up or introduced in these areas.

"Poor clinical outcomes and lack of a specific treatment makes JE prevention a priority," says senior author Hannah Clapham, who was a Mathematical Epidemiologist at the Oxford University Clinical Research Unit, Wellcome Trust Asia Program, Vietnam, at the time the study was carried out, and is now Assistant Professor at NUS Saw Swee Hock School of Public Health, Singapore.

"We estimated that in 2015 there were still 100,000 cases of JE in Asia each year, meaning that two-thirds of all cases of this severe but vaccine-preventable disease were not being averted. Given there is a cheap vaccine now available, our results will help to identify the regions that would be best targeted for vaccination in future."

Credit: 
eLife

Pitt researchers' new material allows for unprecedented imaging deeper in tissues

PITTSBURGH -- A team from the University of Pittsburgh's Department of Chemistry has established an approach for creating a metal-organic framework material that provides new routes for the sensitization of near-infrared luminescent lanthanide ions, including unprecedented possibilities for imaging deeper in tissues for more comprehensive studies of biological systems with light.

Professor Nathaniel Rosi and his team worked with Professor Stephane Petoud, INSERM Research Director at the Center for Molecular Biophysics in France and Adjunct Professor in the Department of Chemistry, on the paper, "Ship-in-a-bottle preparation of long wavelength molecular antennae in lanthanide metal-organic frameworks for biological imaging."

The research details the process by which small molecular precursors are loaded into the rigid three-dimensional cavities within lanthanide metal-organic frameworks, where they combine to form a dense array of extended molecular systems that work as "antennae" that sensitize the lanthanide cations with long-wavelength excitation light. Those long wavelengths activate the near-infrared-emitting properties of the lanthanides, which may help to create images of areas located more deeply within biological systems.

Rosi also noted the luminescence from lanthanides lasts longer than background radiation in standard biological images, so researchers will have a time advantage when studying lanthanide samples.

"We've achieved a system that's sufficiently bright that we can see it using biological imaging in the near infrared. We can also excite it at long wavelengths, up to 600 nanometers, which is highly desired so as not to disturb the biological systems," said Rosi.

The paper was published in April in the Journal of the American Chemical Society.

Rosi said this novel optical imaging agent will also help researchers detect greater numbers of biological targets from a single experiment than what is possible with current methods.

"Current limitations in imaging allow one to detect only four, maybe five, molecules at best in a single imaging experiment. What if we wanted to detect five or six, or 10? There are 14 lanthanide elements across the periodic table. Most of them have very distinct, sharp, luminescent signals. We can potentially make up to 10 optical imaging probes with different lanthanides and be able to detect all of them because they don't have overlapping signals."

Credit: 
University of Pittsburgh

Lab makes 4D printing more practical

image: Shapeshifting materials produced at Rice University with a 3D printer morph from their original form to an alternate through changes in temperature, electric current or stress. This example shows how one printed configuration can be programmed to take various shapes.

Image: 
Verduzco Laboratory/Rice University

HOUSTON - (June 9, 2020) - Soft robots and biomedical implants that reconfigure themselves upon demand are closer to reality with a new way to print shapeshifting materials.

Rafael Verduzco and graduate student Morgan Barnes of Rice's Brown School of Engineering developed a method to print objects that can be manipulated to take on alternate forms when exposed to changes in temperature, electric current or stress.

The researchers think of this as reactive 4D printing. Their work appears in the American Chemical Society journal ACS Applied Materials and Interfaces.

They first reported their ability to make morphing structures in a mold in 2018. But using the same chemistry for 3D printing limited structures to shapes that sat in the same plane. That meant no bumps or other complex curvatures could be programmed as the alternate shape.

Overcoming that limitation to decouple the printing process from shaping is a significant step toward more useful materials, Verduzco said.

"These materials, once fabricated, will change shape autonomously," Verduzco said. "We needed a method to control and define this shape change. Our simple idea was to use multiple reactions in sequence to print the material and then dictate how it would change shape. Rather than trying to do this all in one step, our approach gives more flexibility in controlling the initial and final shapes and also allows us to print complex structures."

The lab's challenge was to create a liquid crystal polymer "ink" that incorporates mutually exclusive sets of chemical links between molecules. One establishes the original printed shape, and the other can be set by physically manipulating the printed-and-dried material. Curing the alternate form under ultraviolet light locks in those links.

Once the two programmed forms are set, the material can then morph back and forth when, for instance, it's heated or cooled.

The researchers had to find a polymer mix that could be printed in a catalyst bath and still hold its original programmed shape.

"There were a lot of parameters we had to optimize -- from the solvents and catalyst used, to degree of swelling, and ink formula -- to allow the ink to solidify rapidly enough to print while not inhibiting the desired final shape actuation," Barnes said.

One remaining limitation of the process is the ability to print unsupported structures, like columns. To do so would require a solution that gels just enough to support itself during printing, she said. Gaining that ability will allow researchers to print far more complex combinations of shapes.

"Future work will further optimize the printing formula and use scaffold-assisted printing techniques to create actuators that transition between two different complex shapes," Barnes said. "This opens the door to printing soft robotics that could swim like a jellyfish, jump like a cricket or transport liquids like the heart."

Credit: 
Rice University

Infected insects may warn of impending citrus disease a year in advance

image: The citrus greening bacterium was detected in Texas psyllids in 2011, a year before it was found in citrus trees, suggesting that psyllids can serve as early sentinels of the HLB pathogen in newly invaded areas.

Image: 
Plant Disease

Citrus greening disease (Huanglongbing or HLB), transmitted by the Asian citrus psyllid, is currently the biggest threat to the citrus industry and threatens many parts of the world, including Asia, Africa, South America, and the United States. In Florida alone, citrus greening disease has accounted for losses of several billion U.S. dollars.

Despite HLB's widespread prevalence, factors influencing the epidemic are poorly understood because most research has been conducted after the pathogen has been introduced. In an attempt to change this, several Texas-based scientists surveyed commercial and residential citrus trees from 2007 to 2017 and monitored time-course variations in the proportion of infected citrus trees and infected Asian citrus psyllids (ACP).

"Unlike previous studies on citrus greening disease epidemics that were typically initiated in commercial orchards after the disease had been introduced or became widespread in the area, our study commenced five years prior to the first detection of the greening bacterium in Texas and continued for five additional years," said Olufemi Alabi, one of the scientists involved in this research. "This gave us a unique opportunity to obtain a holistic picture of the progression of the disease epidemics from its onset in both commercial and residential ecologies."

Despite the first appearance of citrus greening disease in Florida in 2005, the bacterium wasn't found in Texas until 2011, when scientists detected it in the psyllids. The disease was not detected in citrus trees until 2012, suggesting that psyllids may actually be used for early detection of the HLB pathogen in newly invaded areas.

Over the course of this decade-long study, the proportion of infected trees and psyllids increased exponentially over time, while the number of fields and residential backyards with at least one disease-affected citrus tree reached 26% and 40%, respectively, by 2017. The research also revealed seasonal fluctuations and provides comprehensive insight into the ongoing citrus greening epidemic in Texas, with potential lessons for California and other citrus-growing regions that have not yet been affected.

"Our study suggests that a flatter progression of citrus greening disease epidemics could be achieved through the implementation of strategies to protect new plantings from infection and the continued implementation of the area-wide ACP management program," said Mamoudou Sétamou, the lead author of the article.

This study is good news for Texas farmers, who were alarmed by the rapid spread of citrus greening disease epidemics in Florida and worried that the smaller Texas citrus industry would be quickly overwhelmed once the disease appeared.

"Surprisingly, our research showed that although exponential growth was observed in the progression of infected trees in Texas, the annual rate of increase was relatively slower than that reported from Florida. This led us to conduct a series of analyses that enabled us to identify potential climatic and cultural factors that may be contributing to the relatively slow spread of citrus greening disease in Texas."

Credit: 
American Phytopathological Society

BU researcher: Screening for drug use can be reasonable, but not evidence-based

Little evidence supports the new recommendations for clinical screening for drug use. Do the potential benefits outweigh the potential harms?

In the June 9 issue of JAMA, the U.S. Preventive Services Task Force (USPSTF) recommends that clinicians screen for unhealthy drug use (that is, any use of drugs that are illegal or medications not used for medical purposes) for all adult patients, but admits that there is still little evidence weighing the benefits and risks of this practice.

Screening for unhealthy drug use "is reasonable to consider in clinical practice, but it is not evidence-based for improving health," writes Dr. Richard Saitz, professor and chair of community health sciences at the Boston University School of Public Health (BUSPH), in an accompanying editorial.

"A service that likely has substantial harms for some, and small benefit for few under the most generous assumptions should be held to the same standards as other preventive services, regardless of whether it is a laboratory test that leads to an invasive treatment or a series of questions that lead to counseling and referral," writes Saitz, who is also professor of medicine at the Boston University School of Medicine, and a physician in the Clinical Addiction Research and Education Unit and the Grayken Center for Addiction at Boston Medical Center.

In the editorial, Saitz points out that the USPSTF did not hold drug screening to the same standards it usually applies to the other preventive services it evaluates. Instead, he writes, the Task Force included studies of people who were seeking help for their drug use. Usually, evidence for preventive services comes from people who are asymptomatic and not seeking help, because interventions differ in effectiveness depending on whether people are seeking help. In the case of drug use, those seeking help are likely to be more amenable to changing their use than people who are not seeking to change--and may not even think their use is risky. The only evidence the Task Force found for the efficacy of drug screening, Saitz notes, was in people seeking help for cannabis use. There was no benefit for other drugs, such as opioids or stimulants, nor were there any effects on any health outcomes.

Saitz writes that screening is "not unreasonable" for clinicians, mainly because knowledge of a patient's drug use is valuable and sometimes vital to understanding a patient's health needs. Making screening standard practice could also help reduce stigma by demonstrating that drug use is a health issue, and might encourage patients to talk to their health care providers about their drug use and even seek treatment through them, he writes.

However, Saitz also points out that screening comes with risks, particularly during pregnancy: 23 states and the District of Columbia consider drug use to be child abuse, three states consider drug use grounds for civil commitment, Alabama considers drug use to be chemical endangerment of a child, and South Carolina considers drug use to be criminal child abuse. Drug counseling delivered poorly also poses risks, he writes, and having drug use on a patient's medical record could expose them to stigma in their health care.

"These observations should serve as an important call for the development and study of new strategies to identify and address drug use in ways that can reduce related harms," Saitz writes.

Credit: 
Boston University School of Medicine

Birmingham scientists 're-train' immune system to prevent attack of healthy cells

The body's immune system can be re-wired to prevent it from recognising its own proteins which, when attacked by the body, can cause autoimmune diseases like multiple sclerosis, a significant new study by UK scientists has found.

Autoimmune diseases arise when the immune system loses its normal focus on fighting infection or disease and instead begins to attack otherwise healthy cells within the body. In the case of multiple sclerosis (MS), the body attacks proteins in myelin - the fatty, insulation-like tissue wrapped around nerves - causing the nerves to lose control over muscles.

Led by a multi-disciplinary team from the University of Birmingham, scientists examined the intricate mechanisms of the T-cells (or white blood cells) that control the body's immune system and found that the cells could be 're-trained' to stop them attacking the body's own cells. In the case of multiple sclerosis, this would prevent the body from attacking myelin basic protein (MBP) by reprogramming the immune system to recognise the protein as part of itself.

Supported by the Medical Research Council, the two-part study, published today in Cell Reports, was a collaboration between two research groups led by Professor David Wraith from the Institute of Immunology and Immunotherapy and Professor Peter Cockerill from the Institute of Cancer and Genomic Sciences.

The first stage, led by Professor Wraith, showed that the immune system can be tricked into recognising MBP by presenting it with repeated doses of a highly soluble fragment of the protein that the white blood cells respond to. Repeatedly injecting the same fragment of MBP mimics the process whereby the immune system learns to distinguish between the body's own proteins and those that are foreign. This process, a similar type of immunotherapy to that previously used to desensitise people against allergies, showed that the white blood cells that recognise MBP switched from attacking the protein to actually protecting the body.

The second stage saw gene regulation specialists, led by Professor Peter Cockerill, probe deep within the white blood cells that react to MBP to show how genes are rewired in response to this form of immunotherapy to fundamentally re-programme the immune system. The repeated exposure to the same protein fragment triggered a response that turns on genes that silence the immune system instead of activating it. These cells then carried a memory of this exposure to MBP embedded in their genes, stopping them from setting off an immune response. When T-cells are made tolerant in this way, other genes that function to activate the immune system remain silent.

Professor David Wraith said: "These findings have important implications for the many patients suffering from autoimmune conditions that are currently difficult to treat."

Professor Peter Cockerill said: "This study has led us to finally understand the underlying basis of immunotherapies which desensitise the immune system."

Further, longer-term clinical trials will be needed to determine whether antigen-specific immunotherapies can indeed deliver lasting benefits. If they do, the study published today will stand as the first to define the actual mechanisms by which T-cells can be made tolerant to the body's own proteins, in a context that may lead to further advances in the battle to overcome autoimmunity.

Credit: 
University of Birmingham

Biomedical sciences researchers provide methods to inactivate and safely study SARS-CoV-2

image: Dr. Christopher Basler, professor in the Institute for Biomedical Sciences and director of the Center for Microbial Pathogenesis at Georgia State University and a Georgia Research Alliance Eminent Scholar in Microbial Pathogenesis

Image: 
Georgia State University

ATLANTA--Virologists in the Institute for Biomedical Sciences at Georgia State University have identified detailed methods for performing research on SARS-CoV-2, the virus that causes COVID-19, including procedures that effectively inactivate the virus to enable safe study of infected cells.

The peer-reviewed paper on the novel coronavirus, published in the journal Viruses, is a resource for newcomers in the field.

"Importantly, the study defines specific methods that fully inactivate the virus, that is, make it non-infectious, in ways compatible with further scientific analysis," said Dr. Christopher Basler, professor in the Institute for Biomedical Sciences, director of the Center for Microbial Pathogenesis and a Georgia Research Alliance Eminent Scholar in Microbial Pathogenesis.

"This allows researchers to study the proteins and genes of the virus and how the infected host responds to infection outside of high containment. Confirming that such analyses can be done safely, with no risk of infection, will increase the rate of discovery about the virus and COVID-19."

When the disease COVID-19 appeared in humans, virologists in Basler's lab, who study emerging pathogens, wanted to contribute to the effort to understand SARS-CoV-2 and develop medical countermeasures for the virus. Because the new pathogen causes serious disease for which there are no definitive treatments, biosafety level 3 (BSL3) facilities are required. It was also necessary to handle the virus with extra care because so little was known about it.

To ensure the safety of the researchers and public, Basler and his team relied on biosafety experts who oversee the high-containment core at Georgia State. The experts created a plan that identified the optimal BSL3 facility on the university's Atlanta Campus for the work, developed rigorous training for the researchers (who were already experienced with high-containment work) and implemented procedures to enable safe and efficient work on SARS-CoV-2.

Credit: 
Georgia State University

Undersized airways may explain why nonsmokers get COPD

NEW YORK, NY (June 9, 2020) -- A new study of lung anatomy may explain why 1 in 4 cases of COPD--a lung disease most often linked to smoking--occur in people who have never smoked, a fact that has long perplexed researchers.

The research analyzed CT scans of more than 6,500 adults and found that people with small airways relative to their lungs' volume--a relationship termed dysanapsis--are at increased risk of chronic obstructive pulmonary disease (COPD) regardless of their smoking habits.

"Our study shows that having an undersized airway tree compromises breathing and leaves you vulnerable to COPD later in life," says lead author Benjamin M. Smith, MD, assistant professor of medicine at Columbia University Vagelos College of Physicians and Surgeons.

"Our findings suggest that dysanapsis is a major COPD risk factor -- on par with cigarette smoking," Smith says. "Dysanapsis is believed to arise early in life. Understanding the biological basis of dysanapsis may one day lead to early life interventions to promote healthy and resilient lung development."

The study was published online today in the Journal of the American Medical Association.

What Is Dysanapsis?

Air is transported into the lung via airways that resemble the branches of a tree.

In the 1970s, researchers using simple lung function tests speculated that some people have undersized airways relative to the volume of their lungs. The size mismatch was termed dysanapsis and is believed to develop in childhood when airway branches grow more slowly than lung volume. With the advent of high-resolution in vivo imaging, dysanapsis could finally be measured directly in large cohorts.
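As a rough illustration of how a dysanapsis metric might be computed from CT measurements: take a summary of airway caliber and scale it by a measure of lung size. The exact metric used in the study may differ, and the numbers below are hypothetical:

```python
import math

# Illustrative sketch (assumed form, not necessarily the study's exact
# definition): geometric mean of airway lumen diameters (mm) divided by
# the cube root of total lung volume (mm^3), so airway caliber is judged
# relative to lung size.
def airway_to_lung_ratio(lumen_diameters_mm, lung_volume_ml):
    geo_mean = math.exp(sum(math.log(d) for d in lumen_diameters_mm) /
                        len(lumen_diameters_mm))
    lung_volume_mm3 = lung_volume_ml * 1000.0  # 1 mL = 1000 mm^3
    return geo_mean / lung_volume_mm3 ** (1.0 / 3.0)

# Hypothetical measurements for two people with the same lung volume:
normal = airway_to_lung_ratio([14, 11, 8, 6, 5], 5000.0)
small_airways = airway_to_lung_ratio([10, 8, 6, 4.5, 3.5], 5000.0)
print(normal, small_airways)  # the dysanaptic (second) ratio is lower
```

A lower ratio means narrower airways for the same lung volume, which is the size mismatch the study links to COPD risk.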

Dysanapsis and Health

For many decades, the clinical significance of dysanapsis was unclear due to the difficulty in measuring airway and lung dimensions in large samples of smokers and nonsmokers.

A recent study showed that half of older adults with COPD had low lung function early in life.

"This observation motivated us to think about early life origins of COPD," Smith says. "Combining classic theories from respiratory physiology with state-of-the-art imaging in large epidemiological samples, we tested whether dysanapsis might explain a significant proportion of COPD risk."

COPD--including emphysema and chronic bronchitis--is characterized by reduced airflow from the lungs and is the third-leading cause of death in the United States.

Lung Anatomy Linked to COPD

In the new study, Smith and his colleagues analyzed health data, including lung CT scans, from more than 6,500 older adults enrolled in three major lung studies in the United States and Canada.

They found that individuals with smaller airways relative to lung size had the poorest lung function and the highest risk of COPD, being 8 times more likely to develop the disease.

The findings support a landmark 2015 study demonstrating two major pathways that lead to COPD later in life. In the classic paradigm, individuals with normal lung function experience a rapid decline after years of exposure to irritants, like cigarette smoke or air pollution.

"But there's a second pathway in people who have reduced lung function from an early age. This low starting point increases the risk for COPD in later years, even in the absence of rapid lung function decline," says Smith. "Based on our data, dysanapsis may account for a large percentage of these cases."

Dysanapsis and Smoking

The association between dysanapsis and COPD risk existed for both smokers and nonsmokers and may also explain why only a minority of heavy smokers develop COPD.

The study also looked at lifelong heavy smokers without COPD and found that these participants had larger-than-expected airways for their lung size.

"This suggests that people at the opposite end of the dysanapsis spectrum, i.e. those with larger than expected airways, may be able to incur considerable damage from smoking while maintaining enough reserve to avoid COPD," says Smith. "Of course, the harmful effects of smoking are legion, including lung cancer, heart disease, and stroke. So anyone who smokes should do their best to quit."

Credit: 
Columbia University Irving Medical Center

Unexpected uncertainty can breed paranoia, researchers find

In times of unexpected uncertainty, such as the sudden appearance of a global pandemic, people may be more prone to paranoia, Yale University researchers suggest in a new study published in the journal eLife.

"When our world changes unexpectedly, we want to blame that volatility on somebody, to make sense of it, and perhaps neutralize it,'' said Yale's Philip Corlett, associate professor of psychiatry and senior author of the study. "Historically in times of upheaval, such as the great fire of ancient Rome in 64 C.E. or the 9/11 terrorist attacks, paranoia and conspiratorial thinking increased."

Paranoia is a key symptom of serious mental illness, marked by the belief that other people have malicious intentions. But it also manifests in varying degrees in the general population. For instance, one previous survey found that 20% of the population believed people were against them at some time during the past year; 8% believed that others were actively out to harm them.

The prevailing theory is that paranoia stems from an inability to accurately assess social threats. But Corlett and lead author Erin Reed of Yale hypothesized that paranoia is instead rooted in a more basic learning mechanism that is triggered by uncertainty, even in the absence of social threat.

"We think of the brain as a prediction machine; unexpected change, whether social or not, may constitute a type of threat -- it limits the brain's ability to make predictions," Reed said. "Paranoia may be a response to uncertainty in general, and social interactions can be particularly complex and difficult to predict."

In a series of experiments, they asked subjects with different degrees of paranoia to play a card game in which the best choices for success were changed secretly. People with little or no paranoia were slow to assume that the best choice had changed. However, those with paranoia expected even more volatility in the game. They changed their choices capriciously -- even after a win. The researchers then increased the levels of uncertainty by changing the chances of winning halfway through the game without telling the participants. This sudden change made even the low-paranoia participants behave like those with paranoia, learning less from the consequences of their choices.
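The learning dynamic described here can be sketched with a toy simulation (our illustration, not the study's actual task or model): a simple delta-rule learner plays a two-choice game whose better option secretly reverses mid-game, and a "switchiness" parameter mimics the capricious choice changes, even after wins, attributed to the paranoid players.

```python
import random

random.seed(0)

# Toy two-choice card task with a hidden mid-game reversal of the
# better option, played by a delta-rule learner. "switchiness" is the
# probability of capriciously abandoning the currently preferred choice.
def play(trials=200, reversal=100, lr=0.3, switchiness=0.0):
    values = [0.5, 0.5]  # estimated win probability for each deck
    correct = 0
    for t in range(trials):
        best = 0 if t < reversal else 1       # hidden reversal of the good deck
        choice = max((0, 1), key=lambda a: values[a])
        if random.random() < switchiness:     # capricious switch, even after wins
            choice = 1 - choice
        p_win = 0.8 if choice == best else 0.2
        reward = 1.0 if random.random() < p_win else 0.0
        values[choice] += lr * (reward - values[choice])  # delta-rule update
        correct += (choice == best)
    return correct / trials

print("steady learner:   ", play(switchiness=0.0))
print("capricious learner:", play(switchiness=0.3))
```

On average the steady learner tracks the reversal and outperforms the capricious one, mirroring the finding that expecting excess volatility degrades learning from outcomes.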

In a related experiment, Yale collaborators Jane Taylor and Stephanie Groman trained rats, a relatively asocial species, to complete a similar task in which the best choices for success changed. Rats that were administered methamphetamine -- known to induce paranoia in humans -- behaved just like paranoid humans. They, too, anticipated high volatility and relied more on their expectations than on learning from the task.

Reed, Corlett and their team then used a mathematical model to compare choices made by rats and humans while performing these similar tasks. The results from the rats that received methamphetamine resembled those of humans with paranoia, researchers found.

"Our hope is that this work will facilitate a mechanistic explanation of paranoia, a first step in the development of new treatments that target those underlying mechanisms," Corlett said.

"The benefit of seeing paranoia through a non-social lens is that we can study these mechanisms in simpler systems, without needing to recapitulate the richness of human social interaction," Reed said.

Credit: 
Yale University

Accounting for nature in economies

The way we measure economic health is flawed, according to new research from the Stanford-based Natural Capital Project. When we talk about a country's economic prosperity, we're almost always referring to gross domestic product, or GDP, a calculated value based on the goods and services that flow through an economy. But GDP doesn't account for many of the benefits that people and economies receive from nature, like clean water and climate security.

To address this gap, Stanford researchers developed a new metric for measuring the value of nature's contributions to economic activity. Their study, published in Proceedings of the National Academy of Sciences, details how the approach, known as Gross Ecosystem Product (GEP), is being successfully implemented in China.

"We're basically flying blind when it comes to knowing where and how much nature to protect," said the study's senior author Gretchen Daily, a professor of environmental science at Stanford's School of Humanities and Sciences.

"GEP tracks the vital contributions of nature to society, informs investments in securing them and helps evaluate the performance of leaders and policies," added Daily, who is also the faculty director of the Natural Capital Project, an interdisciplinary partnership that helps governments and organizations integrate the values of nature's contributions into economic and development plans.

The researchers used InVEST, the Natural Capital Project's open-source mapping and modeling software, to calculate the flow of benefits that nature provides to people and inform the GEP equation. GEP is calculated in parallel ways to GDP, accounting for and aggregating all of nature's contributions to people in a single, monetary metric. These contributions provide a new lens with which to assess income and performance.
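In spirit, the aggregation works like GDP accounting: value each ecosystem service in monetary terms (quantity times a unit value, or the output of a valuation model) and sum across services. A minimal sketch, with hypothetical service names and figures, not values from the study:

```python
# Illustrative GEP aggregation: each ecosystem service is valued as
# (biophysical quantity) x (unit value in dollars), then all services
# are summed into a single monetary metric, analogous to GDP.
ecosystem_services = {
    # service: (quantity, unit value in dollars)
    "water_supply_m3":         (1.2e9, 0.30),
    "carbon_sequestration_t":  (5.0e6, 25.0),
    "flood_mitigation_events": (40.0, 2.0e6),
    "ecotourism_visits":       (3.0e6, 45.0),
}

gep = sum(qty * price for qty, price in ecosystem_services.values())
print(f"Gross Ecosystem Product (illustrative): ${gep:,.0f}")
```

The hard part in practice is the unit values: tools like InVEST estimate where services flow and how much of each benefit reaches people, which is what makes the monetary aggregation meaningful.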

"Over the past 50 years, global GDP has risen by 370 percent," said co-author Stephen Polasky, professor of ecological and environmental economics at the University of Minnesota. "Alongside global economic prosperity, we're seeing the degradation of the vital natural capital that fundamentally underpins human well-being."

The Chinese Academy of Sciences has pioneered the concept of GEP and compiled a wealth of environmental data to do the calculations. Led by Zhiyun Ouyang, director of the academy's Research Center for Eco-Environmental Sciences and lead author on the study, the research team used Qinghai province in China as a testing ground for the new measure. Qinghai province is known as the "water tower of Asia" because it sits at the source of the Mekong, Yangtze and Yellow rivers, which provide water to much of China and other Southeast Asian countries.

"Qinghai is rich in natural capital, but its GDP alone does not reflect that value," Ouyang said. "Using this new metric, we were able to place a value on important ecosystem services, especially water supply, that Qinghai currently exports to other provinces but receives no credit for in the GDP calculation."

The people downstream who benefit from Qinghai's water supply tend to live in provinces wealthier in GDP and, in urban areas, often poorer in GEP. Using GEP, leaders in China are informing "eco-compensation" programs that enable downstream water users to pay for the protection of the water source upstream. These types of programs can help alleviate poverty while keeping critical ecosystem benefits flowing.

The researchers are also thinking about how the new measure could be linked to policy. "GEP by itself is just a metric, a number. But when used to inform and drive policies, it can be a powerful new tool for changing the ways we understand and value nature globally," said Mary Ruckelshaus, managing director of the Natural Capital Project. "Being held accountable for reporting GEP may incentivize governments to make decisions that protect the natural resources on which we all depend."

Like the payment for the water eco-compensation program in China, other countries and governments have been implementing schemes to allow payment for the provision and protection of ecosystem services for decades. But policy development can be slow and burdensome, especially without a standardized measurement or approach. GEP will enable governments to more easily compare options and weigh tradeoffs between different conservation decisions.

The ultimate goal is to see the successes of its application in China applied globally so that economies everywhere track and secure the values of nature to society. The researchers are working with the United Nations Statistics Division to develop ways to scale and standardize GEP as a global reporting metric.

"We see a potential future where GEP is reported alongside GDP in all economies," said Daily. "The use of GEP is already yielding tangible results - creating jobs and restoring critical ecosystems. Securing natural capital is at the heart of a future in which all of us can thrive."

Credit: 
Stanford University