
Not just for numbers: Anchoring biases decisions involving sight, sound, and touch

TROY, N.Y. -- Numeric anchoring is a long-established technique of marketing communication. Once a price is mentioned, that number serves as the basis for -- or "anchors" -- all future discussions and decisions. But new research shows that this phenomenon is not limited to decisions that involve numbers, the use and understanding of which require high-level cognitive thinking. Anchoring also biases judgments at relatively low levels of cognition when no numbers are involved.

In research recently published in the Journal of Behavioral Decision Making, Gaurav Jain, an assistant professor in the Lally School of Management at Rensselaer Polytechnic Institute, demonstrated that anchoring even occurs in perceptual domains, like sight, sound, and touch.

To test his novel theory that anchoring could happen without numbers as the starting point, Jain conducted several studies involving different senses. For example, to test decision-making relating to haptics -- or touch -- he asked subjects to close their eyes and touch sandpaper of a certain grit. When the subjects opened their eyes, he offered them 16 sandpaper choices and asked them to find the grit that matched the first one.

Jain anchored the range of options by making participants start with either a relatively finer or coarser grit than the initial one. Those subjects who were anchored with the finer grit chose sandpaper that was finer than the one they originally touched -- and the converse was true for those anchored with the coarser grit.

"My findings offer marketing professionals another fundamental tool to guide consumer behavior by anchoring a product or message through their senses," Jain said.
Additionally, Jain's research offers critical insight into the underpinnings of the phenomenon of anchoring.

Even in academic circles, questions remain about how decisions are made and the role anchors play. Do people go from the anchor point to their final decision in one move? Or do they take incremental steps away from the anchor?

Jain's experiments gave him the opportunity to watch the decision-making process in action, leading to a conclusion that reconciles these two models. He found that his subjects reached their final decision by taking small jumps away from the anchor point, but each of those jumps was influenced by the anchor's placement.
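That account can be made concrete with a minimal simulation (purely illustrative, and not Jain's actual analysis): a decision-maker starts at the anchor, adjusts toward the true value in small steps, and stops as soon as the estimate feels "close enough," so the final judgment stays biased toward the side of the anchor. The grit values below are hypothetical.

```python
def adjust_from_anchor(anchor, target, step=1.0, tolerance=4.0, max_steps=200):
    """Toy anchoring-and-adjustment: move from the anchor toward the target in
    small increments and stop once the estimate is within `tolerance` of the
    target, so adjustment ends short of the true value."""
    estimate = anchor
    for _ in range(max_steps):
        if abs(estimate - target) <= tolerance:
            break  # "close enough" -- stop adjusting
        estimate += step if target > estimate else -step
    return estimate

# Hypothetical sandpaper grits (higher number = finer paper).
true_grit = 120
print(adjust_from_anchor(anchor=180, target=true_grit))  # 124.0: ends on the finer side
print(adjust_from_anchor(anchor=60, target=true_grit))   # 116.0: ends on the coarser side
```

Both runs stop within the same tolerance of the true grit, yet each final choice sits on the side of its anchor, which is the bias the experiments observed.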

"Discovering exactly how we humans make decisions has been nearly impossible," Jain said. "With this research, I found an opening into the black box of the human brain. I've shown how decision-making works in the perceptual domains, and it signals directly how it may work in numerical domains."

Credit: 
Rensselaer Polytechnic Institute

For hip fracture patients, hospital reimbursements rising faster than surgeon reimbursements

March 17, 2021 - In recent years, hospital charges and Medicare payments for patients with hip fractures have increased much more rapidly than charges and payments for orthopaedic surgeons, reports a study in the Journal of Orthopaedic Trauma. The journal is published in the Lippincott portfolio by Wolters Kluwer.

The gap in Medicare reimbursements to hospitals compared to surgeons widened substantially in the last decade - even as patient outcomes improved and healthcare resource use decreased, according to a new analysis by Brian Werner, MD, and colleagues of UVA Health, Charlottesville, Va. "The results confirm our hypothesis that hospital charges and payments contribute significantly more to the increasing cost of treating a hip fracture patient than surgeon charges and payments do," the researchers write.

Hospital payments rise rapidly, despite shorter lengths of stay

To evaluate trends and variations in hospital versus surgeon charges and payments, the researchers analyzed Medicare data on more than 28,000 patients treated for hip fracture between 2005 and 2014. Charges refer to the "list prices" set by hospitals and surgeons; payments refer to the fixed prices that Medicare pays for specific procedures. The analysis included patients with two hip fracture sites and three different types of procedures.

About 25,000 patients were treated with surgery or "open reduction" for proximal femur fractures. For this group, hospital charges increased by 76.9 percent during the study period: from about $37,000 to $66,000 per patient. By comparison, surgeon charges increased by 22.2 percent: from about $3,100 to $3,900. There were also discrepant trends in payments: hospital payments increased by 39 percent (from about $10,500 to $14,700 per patient), while surgeon payments actually decreased by 7.5 percent (from $916 to $847).

For a better comparison of trends in hospital and surgeon reimbursement, the researchers calculated a "charge multiplier" (CM) and a "payment multiplier" (PM) - the ratios of hospital charges and payments to surgeon charges and payments, respectively. Both multipliers increased continually over time: the CM for surgery/open reduction rose from 11.9 in 2005 to 17.2 in 2014, while the PM rose from 11.5 to 17.4. In other words, hospital payments were about 11 times as high as surgeon payments in 2005, but about 17 times as high in 2014.
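Each multiplier is simply the hospital figure divided by the surgeon figure for a given year. A quick back-of-the-envelope check using the rounded per-patient payments quoted above (so the results only approximate the published values):

```python
# Rounded per-patient Medicare payments from the release; the study's exact
# figures may differ slightly.
hospital_payment = {2005: 10_500, 2014: 14_700}
surgeon_payment = {2005: 916, 2014: 847}

for year in (2005, 2014):
    pm = hospital_payment[year] / surgeon_payment[year]  # payment multiplier (PM)
    print(f"{year}: PM ~ {pm:.1f}")
# Prints roughly 11.5 for 2005 and 17.4 for 2014, matching the reported trend.
```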

Similar trends were noted for approximately 3,000 patients undergoing closed reduction and percutaneous pinning of femoral neck fractures. For this procedure, the CM increased from 10.1 to 15.6, while the PM increased from 15.1 to 19.2. The trends were consistent across US regions.

The burden of health problems for patients with hip fracture increased during the years studied, based on a standard comorbidity index. However, patient outcomes improved, including lower mortality rates and a shorter average hospital length of stay (LOS).

"Theoretically, this decrease in LOS should decrease hospital resource utilization and consequently, hospital charges and payments," Dr. Werner and colleagues write. But instead, "As LOS decreased, hospital charges and payments actually increased relative to surgeon charges and payments."

The study cannot explain the widening gap between hospital and surgeon reimbursements, at a time when total charges for treating a patient with hip fracture are paradoxically increasing. Dr. Werner and colleagues conclude: "Identifying and rectifying the sources of increased hospital charges - rather than continually minimizing surgeon reimbursement - will be tantamount to minimizing the financial burden of hip fractures on the health care system while continuing to deliver effective and efficient patient care in the coming years."

Credit: 
Wolters Kluwer Health

Identifying cells to better understand healthy and diseased behavior

image: Georgia Tech researchers use a graphical model framework to uncover a better way to identify cells and understand neural activities in the brain.

Image: 
Christopher Moore, Georgia Tech

In researching the causes and potential treatments for degenerative conditions such as Alzheimer's or Parkinson's disease, neuroscientists frequently struggle to accurately identify cells needed to understand brain activity that gives rise to behavior changes such as declining memory or impaired balance and tremors.

A multidisciplinary team of Georgia Institute of Technology neuroscience researchers, borrowing from existing tools such as graphical models, has uncovered a better way to identify cells and understand the mechanisms of the diseases, potentially leading to better understanding, diagnosis, and treatment.

Their research findings were reported Feb. 24 in the journal eLife. The research was supported by the National Institutes of Health and the National Science Foundation.

The field of neuroscience studies how the nervous system functions, and how genes and environment influence behavior. By using new technologies to understand natural and dysfunctional states of biological systems, neuroscientists hope to ultimately bring cures to diseases.

Before that can happen, neuroscientists first must understand which cells in the brain are driving behavior, but mapping brain activity cell by cell isn't as simple as it appears.

No Two Brain Cells Are Alike

Traditionally, scientists established a coordinate system to map each cell location by comparing images to an atlas, but the notion in literature that "all brains look the same is absolutely not true," said Hang Lu, the Love Family Professor of Chemical and Biomolecular Engineering in Georgia Tech's School of Chemical and Biomolecular Engineering.

Taking a coordinate approach presents two main challenges: first, the sheer number of cells, none of which look particularly distinct; second, cells vary from individual to individual.

"This is a current huge bottleneck - you can record neuron activities all you want but if you don't understand which cells are doing what, it's difficult to compare between brains or conditions and draw meaningful conclusions," Lu said.

According to graduate researcher Shivesh Chaudhary, there is also noise in the data that makes establishing correspondence between two different regions of the brain difficult. "Some deformations may exist in data or some portions of the shape may be missing," he said.

Focusing on Cell Relationships, Not Just Geography

To overcome these challenges, the Georgia Tech researchers borrowed from two disciplines - graphical models in machine learning and a metric geometry approach to shape matching in mathematics - and built a computational method to identify cells in their model organism, the nematode C. elegans.

The team used frameworks from other fields such as natural language processing to build their own modeling software. In natural language processing, the computer can determine what sentences mean by capturing dependencies between words in a statement.

The researchers embraced a similar model but instead of capturing dependencies among the words, "We captured them among the neurons to identify cells," Chaudhary said, noting that this approach limits error propagation as compared to other methods that examine the geographic location of each cell.

"Using relationships among the cells was actually more useful in defining a cell's identity," Lu said. "If you define one, you will have the implications of the identity of the other cells."

The approach, say the research team, is significantly more accurate than the current method of identification. The algorithm, while not perfect, performs significantly better in the face of imperfect data, and "gets less rattled" by noise or errors, Lu said.

The algorithm has huge implications for many developmental diseases, since once scientists can understand the mechanism of a disease, they can find interventions.

"You can use this to do drug and genetic screens to assess genetic risks. You can take someone's genetic background and examine how this background makes cells behave differently from the standard reference genetic background," Lu said.

"One cool thing about this approach is that it is data driven, and therefore, it captures the variations among individual worms. This method has a high potential to be applicable to a wide range of studies on development and function under normal as well as disease-like conditions," said Yun Zhang, professor, Department of Organismic and Evolutionary Biology, Center for Brain Science at Harvard University.

Faster Data Analysis

The algorithm greatly accelerates the speed of analyzing whole-brain data. The researchers explained that before this advance, their lab might take 20 minutes to record a set of data, but it would take them weeks to identify cells and analyze data. With the algorithm, the analysis takes "overnight at most on a desktop," said Chaudhary.

The technique also supports crowdsourcing through collaborative online platforms that open up the algorithm to a larger community, which can test the algorithm and help build atlases.

"Every researcher working on the same problem could do recordings and contribute to further building these atlases that will be widely usable in all contexts," Lu said.

The researchers credit the success of the project to being able to draw upon multiple disciplines across physics, biology, math, and chemistry. Chaudhary, who has an undergraduate degree in chemical engineering, took advantage of developments in computer science and math to solve this particular neuroscience problem.

"In our labs, we have a physicist working on building microscopes, we have biologists, we have people like me who are inclined more towards computer science. We also collaborate with a pure mathematician," he explained. "The neuroscience field has everything. You can go any direction that you want to."

Credit: 
Georgia Institute of Technology

Study examines the use of electroconvulsive therapy in England

Electroconvulsive therapy (ECT), which involves passing electricity through the brain, remains a controversial psychiatric treatment for depression and other conditions because it can cause side effects such as memory loss and is ineffective for many patients. A recent study published in Psychology and Psychotherapy: Theory, Research and Practice has examined how ECT is currently administered and monitored throughout England.

The study was based on data provided by 37 National Health Service Trusts' responses to requests under the Freedom of Information Act. The audit found that the dwindling use of ECT in England has levelled off at about 2,500 people per year. Most recipients are women and over 60 years old. Only one Trust could report how many people received psychological therapy prior to ECT, as required by government guidelines. More than a third of ECT was given without consent, and 18% of Trusts were non-compliant with legislation concerning second opinions. Only six Trusts provided any data for positive outcomes and seven for adverse effects. None provided data on efficacy or adverse effects beyond the end of treatment.

"ECT is a potentially very dangerous procedure that requires the most stringent monitoring. Our audit shows that this is not the case at most ECT clinics in England," said lead author John Read, PhD, of the University of East London. "Currently, monitoring is left to the Royal College of Psychiatrists and they are clearly not capable of ensuring patient safety." Dr. Read and a group of other ECT experts are calling for an independent enquiry into the administration and monitoring of ECT in England.

Credit: 
Wiley

Hormone therapy shown to reduce effects of nocturia in postmenopausal women

CLEVELAND, Ohio (March 17, 2021)--As women age, they are more likely to wake up in the middle of the night to pass urine. The loss of estrogen during the menopause transition accelerates this problem, which is known as nocturia. A new study evaluated the effectiveness of different hormone therapies in managing the frequency of nocturia. Study results are published online today in Menopause, the journal of The North American Menopause Society (NAMS).

The loss of estrogen during menopause has been shown to create bladder dysfunction, sleep disorders, hot flashes, and alterations in renal water and salt handling, all of which result in higher diuresis overnight. To date, there has been little research done regarding the effect of hormone therapy on nocturia, even though hormone therapy has been proven to improve the causative factors of postmenopausal nocturia such as sleep disorders, obstructive sleep apnea, and hot flashes.

Vaginal estrogen has already been shown to help manage the various symptoms of the genitourinary syndrome of menopause, especially with regard to improving urinary function. However, little was known about the effect of systemic treatment. In addition, there is some limited evidence suggesting significant benefits of using oral estrogen in combination with oral progesterone, but nothing is known about the effects of other hormone combinations or the newer tissue-selective estrogen complex (TSEC) on nocturia.

In this new study involving nearly 250 women, participants were divided into four treatment groups: estrogen and progesterone (E+P); estrogen only in patients with prior hysterectomies; TSEC; and no treatment. The study concluded that systemic treatment with either E+P or TSEC led to a significant reduction in nocturia prevalence and significant improvement of bothersome symptoms in women with two or more nocturnal voids. The use of estrogen only resulted in a significant reduction in urgency prevalence.

Researchers believe that additional research should be conducted to better understand the underlying pathophysiologic triggers.

Results are published in the article "Hormone therapy as a positive solution for postmenopausal women with nocturia: results of a pilot trial."

"This pilot study shows a significant reduction in the prevalence and bother associated with nocturia in postmenopausal women using systemic hormone therapy. Although additional study is needed, this finding appears to be primarily related to improvements in sleep quality," says Dr. Stephanie Faubion, NAMS medical director.

Credit: 
The Menopause Society

A raw diet for puppies under 6 months old may reduce the risk of inflammatory bowel disease

According to a study conducted at the University of Helsinki, a raw diet from the late stages of suckling to roughly two months of age may reduce the prevalence of inflammatory bowel disease (IBD) in dogs later in life.

In addition, a raw diet administered subsequently up to six months was found to have a positive effect. At the same time, the study indicates that feeding dry food to puppies early on in their lives can increase the incidence of IBD later in life.

In addition to the diet, the maternal history of IBD as well as the dog's gender and age were associated with the onset of the disease in adulthood.

"Puppies whose dam suffered from IBD had a 7.9-fold risk of developing the disease, with male puppies carrying a risk that was 2.1 times that of female puppies. IBD was most prevalent among 5- to 10-year-old dogs," says Manal Hemida, DVM, the principal investigator of the study from the Helsinki One Health network.

Vaccinations given to dams during or shortly before pregnancy increased the likelihood of IBD in their offspring 1.5-fold compared with puppies whose dams had not been vaccinated during the corresponding period.

Another relevant factor was the puppies' weight: slim puppies had a 1.4-fold risk of developing the disease in adulthood compared with normal-weight puppies.

"However, it is still unclear if the lower body weight is a consequence of undiagnosed early IBD. All of our study's findings may suggest causal relationships, but do not prove them. Future prospective longitudinal dietary intervention studies are needed to confirm our findings, as well as to develop primary strategies for IBD prevention in dogs," says Docent Anna Hielm-Björkman, leader of the DogRisk research group.

As data for the study, the researchers utilised an online feeding survey introduced in 2009 by the DogRisk research group of the Faculty of Veterinary Medicine, University of Helsinki. The study investigated environmental exposures in four early life stages of dogs, two of which were the dog's intra-uterine life as a foetus and the lactation period, during which newborns receive all of their nutrition from suckling. The latter two stages were the early (1-2 months of age) and late (2-6 months of age) puppyhood periods.

Credit: 
University of Helsinki

Can I squeeze through here? How some fungi can grow through tiny gaps

image: A team led by the University of Tsukuba has found key differences that explain why some species of fungi can grow successfully through tiny gaps, whereas other fungi--typically those with faster growth rates--cannot squeeze through and stop growing. The trade-off between developmental plasticity and growth rate helps to understand how fungi penetrate surfaces or plant/animal tissues, with important implications for fungal biotechnology, ecology, and studies of disease.

Image: 
University of Tsukuba

Tsukuba, Japan - Fungi are a vital part of nature's recycling system of decay and decomposition. Filamentous fungi spread over and penetrate surfaces by extending fine threads known as hyphae.

Fungi that cause disease within living organisms can penetrate the spaces between tightly connected plant or animal cells, but how their hyphae do this, and why the hyphae of other fungal species do not, has been unclear.

Now, a team led by Professor Norio Takeshita at University of Tsukuba, with collaborators at Nagoya University and in Mexico, has discovered a key feature that helps explain the differences among species. They compared seven fungi from different taxonomic groups, including some that cause disease in plants.

The team tested how the fungi responded when presented with an obstruction that meant they had to pass through very narrow channels. At only 1 micron wide, the channels were narrower than the diameter of fungal hyphae, typically 2-5 microns in different species.

Some species grew readily through the narrow channels, maintaining similar growth rates before meeting the channel, while extending through it, and after emerging. In contrast, other species were seriously impeded. The hyphae either stopped growing or grew very slowly through the channel. After emerging, the hyphae sometimes developed a swollen tip and became depolarized so that they did not maintain their previous direction of growth.

The tendency to show disrupted growth did not depend on the diameter of the hyphae, or how closely related the fungi were. However, species with faster growth rates and higher pressure within the cell were more prone to disruption.

By observing fluorescent dyes in the living fungi, the team found that processes inside the cell became defective in the fungi with disrupted growth. Small packages (vesicles) that supply lipids and proteins (needed for assembling new membranes and cell walls as hyphae extend) were no longer properly organized during growth through the channel.

"For the first time, we have shown that there appears to be a trade-off between cell plasticity and growth rate," says Professor Takeshita. "When a fast-growing hypha passes through a narrow channel, a massive number of vesicles congregate at the point of constriction, rather than passing along to the growing tip. This results in depolarized growth: the tip swells when it exits the channel, and no longer extends. In contrast, a slower growth rate allows hyphae to maintain correct positioning of the cell polarity machinery, permitting growth to continue through the confined space."

As well as helping explain why certain fungi can penetrate surfaces or living tissues, this discovery will also be important for future research into fungal biotechnology and ecology.

Credit: 
University of Tsukuba

Crying human tear glands grown in the lab

video: A 4-hour movie of tear gland organoids exposed to noradrenaline.

Image: 
Marie Bannier-Hélaouët, copyright Hubrecht Institute

Researchers from the lab of Hans Clevers (Hubrecht Institute) and the UMC Utrecht used organoid technology to grow miniature human tear glands that actually cry. The organoids serve as a model to study how certain cells in the human tear gland produce tears or fail to do so. Scientists everywhere can use the model to identify new treatment options for patients with tear gland disorders, such as dry eye disease. Hopefully in the future, the organoids can even be transplanted into patients with non-functioning tear glands. The results will be published in Cell Stem Cell on the 16th of March.

The tear gland is located in the upper part of the eye socket. It secretes tear fluid, which is essential for lubrication and nutrition of the cornea and has antibacterial components. Rachel Kalmann (UMCU), ophthalmologist and researcher on the project, explains: "Dysfunction of the tear gland, for example in Sjögren's syndrome, can have serious consequences including dryness of the eye or even ulceration of the cornea. This can, in severe cases, lead to blindness." However, the exact biology behind the functioning of the tear gland was unknown and a reliable model to study it was lacking. That is, until now: researchers from the group of Hans Clevers (Hubrecht Institute) present the first human model to study how the cells in the tear gland cry and what can go wrong.

Crying organoids

The researchers used organoid technology to grow miniature versions of the mouse and human tear gland in a dish. These so-called organoids are tiny 3D-structures that mimic the function of actual organs. After they cultivated these tear gland organoids, the challenge was to get them to cry. Marie Bannier-Hélaouët, researcher on the project, explains: "Organoids are grown using a cocktail of growth-stimulating factors. We had to modify the usual cocktail to make the organoids capable of crying." Once the researchers found the right mixture of growth factors, they could induce the organoids to cry. "Our eyes are always wet, as are the tear glands in a dish," Bannier-Hélaouët says.

Swelling up like a balloon

Similar to the way people cry in response to, for example, pain, the organoids cry in response to chemical stimuli such as noradrenaline. The cells of the organoids shed their tears on the inside of the organoid, which is called the lumen. As a result, the organoid will swell up like a balloon. The size of the organoids can therefore be used as an indicator of tear production and secretion. "Further experiments revealed that different cells in the tear gland make different components of tears. And these cells respond differently to tear-inducing stimuli," says Yorick Post, another researcher on the project.

Atlas of cells

The tear gland is composed of several cell types, but the current model only captures one, the ductal cell. In their paper, the researchers present an atlas of the cells in the tear gland to demonstrate their differences. They generated this atlas using single-cell sequencing, a method with which individual cells can be examined and characterized. Post explains: "In the future, we would like to also grow the other tear gland cell type, the so-called acinar cell, in a dish. That way, we can eventually grow a full tear gland in the lab." With the atlas, the researchers were also able to identify new tear products, which help protect the eye from infections.

Transplanting organoids

The development of the miniature tear glands holds promise for patients suffering from tear gland disorders. Scientists everywhere can use the model to identify new drugs for patients who do not produce enough tears. Additionally, the organoids can be used to study how cancers of the tear gland form and may be treated. "And hopefully in the future, this type of organoids may even be transplantable to patients with non-functioning tear glands," Bannier-Hélaouët concludes. The study demonstrates once more the broad potential of organoid technology for science and medicine.

Credit: 
Hubrecht Institute

Brain disease research reveals differences between sexes

image: Human neurodegenerative diseases affect men and women differently, yet sex is rarely included in in vitro bioengineered models of neurodegenerative disease. Sex-related differences include a wide range of biochemical factors, gene expression, and biomechanical cues. These sex differences must be included in blood-brain barrier models to improve the understanding of sex differences in neurodegenerative disease and eventually realize personalized medicine.

Image: 
Callie Weber

WASHINGTON, March 16, 2021 -- Men and women are impacted differently by brain diseases, like Alzheimer's disease and Parkinson's disease. Researchers are urging their colleagues to remember those differences when researching treatments and cures.

In APL Bioengineering, by AIP Publishing, University of Maryland scientists highlight a growing body of research suggesting that sex differences play a role in how patients respond to brain diseases such as Alzheimer's and Parkinson's, as well as multiple sclerosis, motor neuron disease, and other brain ailments.

That is progress from just a few years ago, said Alisa Morss Clyne, director of the university's Vascular Kinetics Laboratory.

"I have worked with vascular cells for 20 years and, up until maybe about five years ago, if you asked if the sex of my cells mattered at all, I would have said no," Clyne said. Then, she worked on a difficult study in which data appeared "all over the place."

"We separated the cell data by sex, and it all made sense," Clyne said. "It was an awakening for me that we should be studying this."

As of 2020, an estimated 5.8 million Americans were diagnosed with Alzheimer's disease, another 1 million with Parkinson's disease, 914,000 with multiple sclerosis, and 63,000 with motor neuron disease. These diseases happen when nerve cells in the brain and nervous system quit working and, ultimately, die.

The changes are associated with the breakdown of what is called the blood-brain barrier -- a border of cells that keeps the wrong kind of molecules in the bloodstream from entering the brain and damaging it.

Published research has shown differences in the blood-brain barriers of men and women. Some of the research suggests the barrier can be stronger in women than men, and the barriers in men and women are built and behave differently.

That could factor into known differences in the sexes, such as Alzheimer's disease being more prevalent in older women than men, while Parkinson's impacts men more frequently and tends to do so more severely.

The authors said they hope their article will serve as a reminder to researchers not just in their own field, but across the sciences, that accounting for sex differences leads to better results.

"I think there is an awakening in the past 10 years or so that you cannot ignore sex differences," Clyne said. "My goal is to inspire people to include sex differences in their research, no matter what research they are doing."

Credit: 
American Institute of Physics

Nursing home characteristics associated with resident COVID-19 morbidity in communities with high infection rates

What The Study Did: Researchers examined nursing homes in communities with the highest COVID-19 prevalence to identify characteristics associated with resident infection rates.

Authors: Hye-Young Jung, Ph.D., of Weill Cornell Medical College in New York, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.1555)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.


Media advisory: The full study is linked to this news release.

Embed this link to provide your readers free access to the full-text article (the link will be live at the embargo time): http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2021.1555?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=031621

About JAMA Network Open: 

JAMA Network Open is the new online-only open access general medical journal from the JAMA Network. On weekdays, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.

Credit: 
JAMA Network

Viruses adapt to 'language of human cells' to hijack protein synthesis

image: Xavier Hernandez, PhD student at the Serrano Lab at the Centre for Genomic Regulation (CRG)

Image: 
Centre for Genomic Regulation (CRG)

The first systematic study of its kind describes how human viruses, including SARS-CoV-2, are better adapted to infecting certain types of tissues based on their ability to hijack cellular machinery and protein synthesis.

Carried out by researchers at the Centre for Genomic Regulation (CRG), the findings could help the design of more effective antiviral treatments, gene therapies and vaccines. The study is published today in the journal Cell Reports.

Living organisms make proteins inside their cells. Each protein consists of single units of amino acids, which are stitched together according to instructions encoded within DNA. The basic units of these instructions are known as codons, each of which corresponds to a specific amino acid. Codons are synonymous when two or more of them code for the same amino acid.

"Different tissues use different languages to make proteins, meaning they preferentially use some synonymous codons over others. We know this because tRNAs, the molecules responsible for recognising codons and sticking on the corresponding amino acid, have different abundances in different tissues," explains Xavier Hernandez, first author of the study and researcher at the CRG.

When a virus infects an organism, it needs to hijack the machinery of the host to produce its own proteins. The researchers set out to investigate whether viruses were specifically adapted to using the synonymous codons used preferentially by the tissues they infect.

The researchers downloaded the publicly available protein sequences of all known human viruses and studied their codon usage. Based on the known tRNA abundances in different tissues, they then determined how well adapted all 502 human-infecting viruses were to infecting 23 different human tissues.
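The release does not spell out the exact adaptation metric, but the flavor of such a calculation can be sketched as follows (with entirely invented tRNA abundances, loosely in the spirit of a tRNA adaptation index rather than the authors' precise method): weight each codon by the relative abundance of the tRNA that decodes it in a given tissue, then take the geometric mean of those weights over a coding sequence, so sequences enriched in the tissue's preferred synonymous codons score higher.

```python
import math

# Made-up relative tRNA abundances for one hypothetical tissue; NOT measured values.
tissue_trna_abundance = {
    "CTG": 1.00, "CTC": 0.40, "TTA": 0.10,   # synonymous codons for leucine
    "GCC": 0.90, "GCT": 0.60, "GCA": 0.30,   # synonymous codons for alanine
}

def adaptation_score(coding_sequence):
    """Geometric mean of per-codon weights: higher means the sequence favours
    codons this tissue decodes efficiently."""
    codons = [coding_sequence[i:i + 3] for i in range(0, len(coding_sequence), 3)]
    weights = [tissue_trna_abundance[c] for c in codons]
    return math.exp(sum(math.log(w) for w in weights) / len(weights))

well_adapted = "CTGGCCCTGGCC"    # uses the tissue's preferred leucine/alanine codons
poorly_adapted = "TTAGCATTAGCA"  # encodes the same peptide with rare synonymous codons
print(adaptation_score(well_adapted), adaptation_score(poorly_adapted))
```

Both toy sequences encode the same short peptide, but the one written in the tissue's preferred codons scores several times higher, which is the kind of tissue-by-virus comparison the study performs at scale.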

Viral proteins expressed during the early infection stage were better adapted to hijacking the host's protein-making machinery. According to Xavier Hernandez, "well-adapted viruses start by using the preferred language of the cell, but after taking full control they impose a new one that meets their own needs. This is important because viruses are used in gene therapy to treat genetic diseases and, if we want to correct a mutation in one tissue, we should modify the virus to be optimal for that particular tissue."

The researchers then took a closer look at how different respiratory viruses are adapted to infecting specific tissues based on their codon usage. They studied four different coronaviruses - SARS-CoV, MERS-CoV, SARS-CoV-2, and the bat coronavirus that is most closely related to SARS-CoV-2 - as well as the common flu-causing influenza A virus H1N1.

They found that SARS-CoV-2 adapted its codon usage to lung tissue, the gastrointestinal tract and the brain. As this aligns with known COVID-19 symptoms such as pneumonia, diarrhoea or loss of smell and taste, the researchers hypothesise future treatments and vaccines could take this factor into account to generate immunity in these tissues.

"Out of the respiratory viruses we took a close look at, SARS-CoV-2 is the virus that is most highly adapted to hijacking the protein synthesis machinery of its host tissue, but not more so than influenza or the bat coronavirus. This suggests that factors other than translational efficiency play an important role in infection, for example the ACE2 receptor expression or the immune system," concludes Xavier Hernandez.

The researchers' next steps include further developing a biotechnological tool to design optimised protein sequences containing codons adapted to the tissue of interest, which may be useful for the development of gene therapies.

Credit: 
Center for Genomic Regulation

BIDMC researchers model a safe new normal

Boston, Mass. - Just one year after the World Health Organization declared the novel coronavirus a global pandemic, three COVID-19 vaccines are available in the United States, and more than 2 million Americans are receiving shots each day. Americans are eager to get back to business as usual, but experts caution that opening the economy prematurely could allow a potential resurgence of the virus. How foot traffic patterns in restaurants and bars, schools and universities, nail salons and barbershops affect the risk of transmission has been largely unknown.

In an article published in npj Digital Medicine, researcher-physicians from Beth Israel Deaconess Medical Center (BIDMC) used anonymized cell-phone data to build a Business Risk Index, which quantifies the potential risk of COVID-19 transmission in these establishments. The team's index accounts for both the density of visits and the length of time individuals linger inside, providing a more precise description of the human interactions -- and thus risk of viral transmission -- going on inside.
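The release does not give the index's exact formula, but conceptually it rewards venues where many visitors overlap in a small space and linger for a long time. A hypothetical sketch of such a score (the function name and inputs are invented for illustration, not the published definition):

```python
# Hypothetical venue-level risk score combining visit density and dwell time;
# an illustration of the general idea, not the index defined in the paper.
def venue_risk(visits_per_hour, venue_area_sqft, median_dwell_minutes):
    """Crowding (visitors per 1,000 sq ft per hour) weighted by dwell time."""
    density = visits_per_hour / (venue_area_sqft / 1000)
    return density * median_dwell_minutes

bar = venue_risk(visits_per_hour=40, venue_area_sqft=2_000, median_dwell_minutes=90)
takeout = venue_risk(visits_per_hour=40, venue_area_sqft=2_000, median_dwell_minutes=8)
print(bar, takeout)  # same foot traffic, very different implied risk
```

With identical visit counts, the long-dwell venue scores an order of magnitude higher - exactly the distinction that raw mobility counts miss.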

"While business traffic pre-pandemic and during statewide shut downs has been studied, business foot traffic and its relationship to COVID-19 transmission in the so-called 'new normal' of re-opening has not been well understood." said corresponding author Ashley O'Donoghue, PhD, Economist, in the Center for Healthcare Delivery Science at BIDMC. "Many forecasting models use anonymized cell-phone mobility data as a broad measure of the movement of residents. But two regions with same levels of mobility will likely see very different levels of COVID-19 transmission if people in one region are diligently practicing social distancing and people in the other are not."

O'Donoghue and colleagues built their risk index by analyzing trends in foot traffic patterns in more than 1.25 million businesses across eight states from January to June 2020. In the six New England states, New York and California, the team saw a 30 percent drop in high-density foot traffic and long visit lengths to businesses -- two factors that can increase the risk of COVID-19 transmission -- from the pre-pandemic baseline to April 2020. They saw similar declines when they looked at similar risky foot traffic patterns in restaurants, bars, universities and personal care establishments (which includes hair and nail salons and barbershops). In both analyses, the risk index rose steadily starting in mid-June as states eased restrictions.

Next, using county-level COVID-19 data for the same time period, the team demonstrated that their index could accurately forecast future COVID-19 cases with a one-week lag. The team found that an increase in a county's average Business Risk Index was associated with an increase in COVID-19 cases per 10,000 people in one week.

"Not all types of mobility contribute equally to increased risk of transmission, so it is important to directly measure human interaction when weighing the costs and benefits of reopening and lifting restrictions on businesses," said senior author Jennifer P. Stevens, MD, MS, Director of the Center for Healthcare Delivery Science at BIDMC. "Tracking how individuals use different businesses may provide the kind of information policymakers need to re-open different businesses in the safest way possible."

O'Donoghue, Stevens and team are now building an online decision-support tool that will help policymakers and hospital decision-makers monitor weekly risk in their areas. They have also deployed a prototype of their tool for Massachusetts that is being used by a large tertiary academic medical center in Boston to monitor potential surges in their service area, and their index has been integrated as a feature in a forecasting model for a large health system in Massachusetts.

"Our index can better quantify close human interactions, which are important predictors of transmission and help identify potential disease surges," said Stevens.

Credit: 
Beth Israel Deaconess Medical Center

Leprosy drug holds promise as at-home treatment for COVID-19

image: Sumit Chanda, co-senior study author and director of the Immunity and Pathogenesis Program at Sanford Burnham Prebys.

Image: 
Sanford Burnham Prebys Medical Discovery Institute

LA JOLLA, CALIF. - March 16, 2021 - A Nature study authored by scientists at Sanford Burnham Prebys Medical Discovery Institute and the University of Hong Kong shows that the leprosy drug clofazimine, which is FDA approved and on the World Health Organization's List of Essential Medicines, exhibits potent antiviral activities against SARS-CoV-2 and prevents the exaggerated inflammatory response associated with severe COVID-19. Based on these findings, a Phase 2 study evaluating clofazimine as an at-home treatment for COVID-19 could begin immediately.

"Clofazimine is an ideal candidate for a COVID-19 treatment. It is safe, affordable, easy to make, taken as a pill and can be made globally available," says co-senior author Sumit Chanda, Ph.D., professor and director of the Immunity and Pathogenesis Program at Sanford Burnham Prebys. "We hope to test clofazimine in a Phase 2 clinical trial as soon as possible for people who test positive for COVID-19 but are not hospitalized. Since there is currently no outpatient treatment available for these individuals, clofazimine may help reduce the impact of the disease, which is particularly important now as we see new variants of the virus emerge and against which the current vaccines appear less efficacious."

Promising candidate revealed by screening drug library

Clofazimine was initially identified by screening one of the world's largest collections of known drugs for their ability to block the replication of SARS-CoV-2. Chanda's team previously reported in Nature that clofazimine was one of 21 drugs effective in vitro, or in a lab dish, at concentrations that could most likely be safely achieved in patients.

In this study, the researchers tested clofazimine in hamsters--an animal model for COVID-19--that were infected with SARS-CoV-2. The scientists found that clofazimine lowered the amount of virus in the lungs, including when given to healthy animals prior to infection (prophylactically). The drug also reduced lung damage and prevented "cytokine storm," an overwhelming inflammatory response to SARS-CoV-2 that can be deadly.

"The animals that received clofazimine had less lung damage and lower viral load, especially when receiving the drug before infection," says co-senior author Ren Sun, Ph.D., professor at the University of Hong Kong and distinguished professor emeritus at the University of California, Los Angeles (UCLA). "Besides inhibiting the virus, there are indications that the drug also regulates the host response to the virus, which provides better control of the infection and inflammation."

Clofazimine also worked synergistically with remdesivir, the current standard-of-care treatment for people who are hospitalized due to COVID-19, when given to hamsters infected with SARS-CoV-2. These findings suggest a potential opportunity to stretch the availability of remdesivir, which is costly and in limited supply.

How clofazimine works

The study showed that clofazimine stops SARS-CoV-2 infection in two ways: blocking its entry into cells and disrupting RNA replication (SARS-CoV-2 uses RNA to replicate). Clofazimine was able to reduce the replication of MERS-CoV, the coronavirus that causes Middle East Respiratory Syndrome (MERS), in human lung tissue.

"Potentially most importantly, clofazimine appears to have pan-coronavirus activity, indicating it could be an important weapon against future pandemics," says co-senior author Kwok-Yung Yuen, M.D., chair of Infectious Diseases at the University of Hong Kong, who discovered the coronavirus that causes severe acute respiratory syndrome (SARS). "Our study suggests that we should consider creating a stockpile of ready-made clofazimine that could be deployed immediately if another novel coronavirus emerges."

Video: In July 2020, Sumit Chanda shared more about his team's race to find a treatment for COVID-19: https://www.youtube.com/watch?v=BEkgvviqaf4&t=136s

Testing clofazimine in the clinic

A Phase 2 trial evaluating clofazimine in combination with interferon beta-1b as a treatment for people with COVID-19 who are hospitalized is ongoing at the University of Hong Kong. Interferon beta-1b is an immunoregulator that is given as an injection and is currently used to treat people with multiple sclerosis.

"Our data suggests that clofazimine should also be tested as a monotherapy for people with COVID-19, which would lower many barriers to treatment," says Chanda. "People with COVID-19 would be able to simply receive a regime of low-cost pills, instead of traveling to a hospital to receive an injection."

Credit: 
Sanford Burnham Prebys

Much of Mars' ancient water was buried in the planet's crust, not lost to space

Several oceans' worth of ancient water may reside in minerals buried below Mars' surface, report researchers. The new study, based on observational data and modeling, shows that much of the red planet's initial water - up to 99% - was lost to irreversible crustal hydration, not escape to space. The findings help resolve the apparent contradictions between predicted atmospheric loss rates, the deuterium to hydrogen ratio (D/H) of present-day Mars and the geological estimates of how much water once covered the Martian surface.

Ancient Mars was a wet planet - dry riverbeds and relic shorelines record a time when vast volumes of liquid water flowed across the surface. Today, very little of that water remains, mostly frozen in the planet's ice caps. Previous studies have assumed that the lost water escaped to space over several billion years, an assertion supported by the currently observed atmospheric D/H ratio. However, measurements of the current rate of atmospheric water loss are too low for atmospheric escape alone to explain all Martian water loss.

Eva Scheller and colleagues show how large volumes of water could have instead become incorporated into minerals that were buried in the planet's crust. Using observational constraints from orbiting spacecraft, rovers and Martian meteorites, Scheller et al. developed a water budget and D/H model that considers atmospheric escape, volcanic degassing and crustal hydration through chemical weathering. By simulating Martian water loss through geological time and for a range of plausible conditions, the authors discovered that Mars had lost most of its water - between 40% and 95% - over the Noachian period (~4.1 to 3.7 billion years ago). The results suggest that between 30% and 99% of Mars' initial water was incorporated into minerals and buried in the planet's crust, with subsequent escape to space of the remainder accounting for the currently observed D/H ratio.
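The isotopic bookkeeping behind such a model can be illustrated with a toy calculation (invented parameters, not Scheller and colleagues' actual model): water removed into crustal minerals carries away the reservoir's current D/H ratio unchanged, while water lost to space sheds deuterium less efficiently than ordinary hydrogen and so enriches what remains.

```python
# Toy D/H bookkeeping with invented parameters; not the published model.
def evolve_water(w0, frac_to_crust, frac_to_space, fractionation=0.2, steps=10_000):
    """Track total water w and deuterium d while small increments are lost to
    crustal hydration (no isotopic fractionation) and to space (deuterium
    escapes only `fractionation` times as efficiently as hydrogen)."""
    w, d = w0, w0              # units chosen so the initial D/H ratio is 1
    dw_crust = w0 * frac_to_crust / steps
    dw_space = w0 * frac_to_space / steps
    for _ in range(steps):
        ratio = d / w
        d -= ratio * dw_crust                   # crustal loss keeps the ratio
        d -= fractionation * ratio * dw_space   # escape discriminates against D
        w -= dw_crust + dw_space
    return d / w                                # enrichment relative to the start

print(evolve_water(1.0, frac_to_crust=0.70, frac_to_space=0.25))  # ~1.9x enrichment
print(evolve_water(1.0, frac_to_crust=0.00, frac_to_space=0.95))  # ~11x if all loss were escape
```

The contrast between the two runs shows why crustal burial matters in this kind of model: it lets a planet shed most of its water while keeping the remaining reservoir's D/H far lower than escape alone would produce.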

Credit: 
American Association for the Advancement of Science (AAAS)

Commercial truck electrification is within reach

When it comes to electric vehicles, particularly for heavy-duty trucks, the limitations of battery technology are often seen as the main barrier to widespread adoption. However, a new analysis concludes that it's the lack of appropriate policies around adoption incentives, charging infrastructure, and electricity pricing that prevents widespread electrification of commercial trucking fleets.

Researchers from the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California, Los Angeles published a new study that makes the case for prioritizing public policy to help move long-haul trucking from diesel to electric. Doing so will mean huge gains in addressing the climate crisis and avoiding premature deaths due to local vehicular pollution, which disproportionately affects communities of color.

The study analyzes the total cost of ownership of an electric long-haul truck compared to a diesel long-haul truck. Using the current price of a battery pack and assuming a 375-mile range, the researchers found that an electric long-haul truck has a 13% per mile lower total cost of ownership, with a net savings of $200,000 over the lifetime of the electric truck. The total cost of ownership analysis takes into account the purchase price and operating costs over the lifetime of the truck.
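The structure of such a per-mile comparison is straightforward to sketch (the inputs below are placeholders chosen only so the output roughly echoes the reported 13% and $200,000 figures; they are not the study's actual cost assumptions):

```python
# Hypothetical per-mile total cost of ownership (TCO) comparison; all inputs
# are placeholder values for illustration, not the study's assumptions.
def tco_per_mile(purchase_price, lifetime_miles, energy_per_mile,
                 maintenance_per_mile, other_per_mile):
    return (purchase_price / lifetime_miles
            + energy_per_mile + maintenance_per_mile + other_per_mile)

LIFETIME_MILES = 1_000_000  # assumed long-haul lifetime mileage

diesel = tco_per_mile(120_000, LIFETIME_MILES,
                      energy_per_mile=0.55, maintenance_per_mile=0.17,
                      other_per_mile=0.70)   # driver, insurance, etc. (same for both)
electric = tco_per_mile(250_000, LIFETIME_MILES,
                        energy_per_mile=0.28, maintenance_per_mile=0.11,
                        other_per_mile=0.70)

savings_per_mile = diesel - electric
print(f"diesel ${diesel:.2f}/mi, electric ${electric:.2f}/mi, "
      f"{100 * savings_per_mile / diesel:.0f}% lower per mile")
print(f"lifetime savings ~${savings_per_mile * LIFETIME_MILES:,.0f}")
```

Costs that are identical for both trucks, such as driver wages and insurance, shrink the percentage gap but not the absolute lifetime savings, which is why both figures are worth reporting.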

The researchers also showed that future reductions in battery costs - taken together with a more aerodynamic design and monetized benefits of reduced pollution - would result in a 50% per mile lower total cost of ownership compared to a diesel long-haul truck by 2030. The electrification of long-haul trucks therefore is possible, and figuring out what is required to move the nation's trucking fleet to widely adopt electric trucks is the next step, the authors said.

"Given the massive economic and environmental benefits, the case for long-haul electric trucking is stronger than ever before," said Berkeley Lab Research Scientist Nikit Abhyankar, one of the authors of the study. "Enabling policies such as adoption and charging infrastructure incentives, sales mandates, and cost-reflective electricity pricing are crucial."

Why focus on long-haul trucks?

Electric cars are becoming more prevalent now, with a substantial increase in global sales and commitments from several major auto manufacturers, including General Motors and Volvo, to sell only electric vehicles by 2030-2035. Long-haul trucks have not experienced the same level of growth, yet they are diesel-fuel guzzlers and a major source of air pollution, contributing more than 20% of U.S. transportation-sector greenhouse gas emissions.

Berkeley Lab scientists have done extensive research tracking the impact of diesel trucks on air quality and public health in disadvantaged communities. Even though diesel trucks account for just a small fraction of motor vehicles, they are responsible for almost one-third of motor vehicle CO2 emissions, and the transportation sector is the largest contributor to CO2 emissions in the US economy.

"If we can move away from diesel-dependent heavy-duty vehicles, we have a chance at significantly reducing greenhouse gas and particulate emissions from the transportation sector," said Berkeley Lab Staff Scientist Amol Phadke, lead author on this study.

There are currently two main pathways to electrify trucks - fuel cells and batteries - and both are actively being pursued by researchers at Berkeley Lab. Long-haul trucks powered by hydrogen fuel cells are on the horizon, and Berkeley Lab scientists are playing a leading role in a new DOE consortium called the Million Mile Fuel Cell Truck (M2FCT) to advance this technology. Battery-powered electric trucks have seen the most dramatic improvements in technology in recent years, making the battery costs more affordable and competitive.

What's more, electricity from renewable energy sources is becoming more cost-competitive, and Berkeley Lab researchers have shown that decarbonizing the electric grid is feasible in the coming decades, which means electric long-haul trucks would no longer contribute to greenhouse gas emissions.

"It is exciting to see recent dramatic improvements in battery technology and costs," said Phadke. "Electric trucks can generate significant financial savings for truck owners and fleet operators, while enabling inflation-proof freight transportation that can have significant macroeconomic benefits."

Credit: 
DOE/Lawrence Berkeley National Laboratory