Culture

Do obesity and smoking impact healing after wrist fracture surgery?

Boston, Mass. - Both obesity and smoking can have negative effects on bone health. A recent study led by a team at Beth Israel Deaconess Medical Center (BIDMC) examined whether they also impact healing in patients who have undergone surgery for fractures of the wrist, or the distal radius, which are among the most common bone fractures. Such fractures account for 5 percent to 20 percent of all emergency room fracture visits, and affected patients can experience challenges with daily living as well as potentially serious and costly complications.

For the study, published in the Journal of Hand Surgery, the investigators analyzed data on patients surgically treated for a distal radius fracture between 2006 and 2017 at two trauma centers. The 200 patients were divided into obese and non-obese groups (39 and 161 patients, respectively) and were also characterized as current, former, and never smokers (20, 32, and 148 patients, respectively) based on self-reported cigarette use.

At three-month and one-year follow-ups after surgery, both the obese and non-obese groups achieved acceptable scores for patient-reported function in the upper extremity - close to those of the general population. The two groups were also similar with regard to range of motion and bone alignment. At three months, smokers demonstrated worse scores related to arm, shoulder, and hand function and a lower percentage of healed fractures, but these effects improved over the course of a year. Complications were similar between groups.

"Overall we found that we can achieve excellent clinical and radiographic outcomes with surgery for displaced wrist fractures in patients who are obese and in those who smoke," said senior author Tamara D. Rozental, MD, Chief of Hand and Upper Extremity Surgery at BIDMC and Professor of Orthopedic Surgery at Harvard Medical School. "Our results show that treatment for distal radius fractures in obese and smoking patients is safe, and these patients may be treated like the general population with similar long-term results. Their short-term outcomes, however, demonstrate higher disability and, in the case of smokers, slower fracture healing."

Rozental stressed that obesity and smoking are currently considered two of the most important preventable causes of poor health in developed nations, and both are modifiable risk factors. "As such, we believe that lifestyle interventions focusing on weight loss and smoking cessation should be emphasized whenever possible," she said.

Credit: 
Beth Israel Deaconess Medical Center

Nov. journal highlights: First MCI prevalence estimates in US Latino populations

image: Alzheimer's & Dementia: The Journal of the Alzheimer's Association November 2019 issue cover

Image: 
Alzheimer's Association

CHICAGO, November 22, 2019 - In the largest dementia study of a diverse group of U.S. Latinos to date, researchers found that nearly 10% of middle-aged and older Latinos have a decline in memory and thinking skills known as mild cognitive impairment (MCI), according to a new article published online by Alzheimer's & Dementia: The Journal of the Alzheimer's Association. MCI marks early memory changes that can progress to dementia.

Hector M. González, Ph.D., University of California, San Diego, and colleagues analyzed data from more than 6,000 individuals in the ongoing Study of Latinos-Investigation of Neurocognitive Aging (SOL-INCA) funded by the National Institutes of Health. The researchers found that older age, high cardiovascular disease risk and depression symptoms were significantly associated with MCI diagnosis. MCI prevalence rates ranged from 12.9% for individuals with Puerto Rican backgrounds to 8.0% among individuals with Cuban backgrounds.

Link: "Prevalence and correlates of mild cognitive impairments among diverse Hispanics/Latinos: Study of Latinos-Investigation of Neurocognitive Aging results"

In a related Perspectives article, also newly published online, Dr. González and colleagues use SOL-INCA to illustrate a new framework for advancing research into Alzheimer's and other dementias in Latino populations. Latinos represent nearly one-fifth of the U.S. population and are a growing segment that is culturally and genetically diverse, while also facing major risks and disparities for Alzheimer's disease and related dementias.

Link: "A research framework for cognitive aging and Alzheimer's disease among diverse US Latinos: Design and implementation of the Hispanic Community Health Study/Study of Latinos--Investigation of Neurocognitive Aging (SOL-INCA)"

In the first study of tribal health care service use by Alaska Natives and American Indians, a sharp upward change in the use of health care services, including twice as many primary care visits, was seen in the year that individuals received a diagnosis of Alzheimer's or another dementia. The paper by Krista R. Schaefer, M.P.H., from Southcentral Foundation, Anchorage, AK, and colleagues is published in the November print issue of Alzheimer's & Dementia: The Journal of the Alzheimer's Association.

"Alaska has the fastest growing population of people 65 years and older compared with any other state; moreover, Alaska Native and American Indian people make up to 20% of Alaska's population. Therefore, healthcare systems will need to tailor their services in anticipation of an increase in the numbers of patients suffering from Alzheimer's disease and related dementias," the authors write.

Link: "Differences in service utilization at an urban tribal health organization before and after Alzheimer's disease or related dementia diagnosis: A cohort study" (not embargoed)

Also in the November print issue, Laerke Taudorf, M.D., from the University of Copenhagen, and colleagues analyzed data from three Danish national health registries for people 65 years and older. After adjusting for age and sex, the dementia incidence rate (new cases) in these population studies increased by an average of 9% annually from 1996 to 2003, followed by a 2% annual decline, while total prevalence (the number of people living with dementia) increased over the entire period and is still increasing. The authors conclude, "the decline in total incidence and incidence rates of dementia leads to a cautious optimism that with better health and management of risk factors, it may be possible to lower the risk of dementia."

Link: "Declining incidence of dementia: A national registry-based study over 20 years" (not embargoed)

Credit: 
Alzheimer's Association

Small, fast, and highly energy-efficient memory device inspired by lithium-ion batteries

image: The stacked layers in the proposed memory device form a mini-battery that can be quickly and efficiently switched between three different voltage states (0.95 V, 1.35 V, and 1.80 V).

Image: 
ACS Applied Materials and Interfaces

Virtually all digital devices that perform any sort of processing of information require not only a processing unit, but also a quick memory that can temporarily hold the inputs, partial results, and outputs of the operations performed. In computers, this memory is referred to as dynamic random-access memory, or DRAM. The speed of DRAM is very important and can have a significant impact on the overall speed of the system. In addition, lowering the energy consumption of memory devices has recently become a hot topic to achieve highly energy-efficient computing. Therefore, many studies have focused on testing out new memory technologies to surpass the performance of conventional DRAM.

The most basic units in a memory chip are its memory cells. Each cell typically stores a single bit by adopting and holding one of two possible voltage values, which correspond to a stored value of either "0" or "1". The characteristics of the individual cell largely determine the performance of the overall memory chip. Simpler and smaller cells with high speed and low energy consumption would be ideal to take highly efficient computing to the next level.

A research team from Tokyo Tech led by Prof. Taro Hitosugi and student Yuki Watanabe recently reached a new milestone in this area. These researchers had previously developed a novel memory device inspired by the design of solid lithium-ion batteries. It consisted of a stack of three solid layers made of lithium, lithium phosphate, and gold. This stack is essentially a miniature low-capacity battery that functions as a memory cell; it can be quickly switched between charged and discharged states that represent the two possible values of a bit. However, gold combines with lithium to form a thick alloy layer, which increases the amount of energy required to switch from one state to the other.

In their latest study, the researchers created a similar three-layer memory cell using nickel instead of gold. They expected better results using nickel because it does not easily form alloys with lithium, which would lead to lower energy consumption when switching. The memory device they produced was much better than the previous one; it could actually hold three different voltage states instead of two, meaning that it is a three-valued memory device. "This system can be viewed as an extremely low-capacity thin-film lithium battery with three charged states," explains Prof. Hitosugi. This is a very interesting feature that has potential advantages for three-valued memory implementations, which may be more area efficient.
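One reason a three-state cell is attractive, which motivates this interest in three-valued memory, is that each cell can hold log2(3) ≈ 1.585 bits of information rather than 1, so fewer cells are needed for the same data. The following Python sketch is purely illustrative of this capacity arithmetic and is not taken from the study:

```python
import math

# Illustrative only: information capacity of multi-level memory cells.
# A conventional binary cell stores log2(2) = 1 bit per cell; a
# three-state cell like the one described stores log2(3) ≈ 1.585 bits.
def bits_per_cell(num_states: int) -> float:
    return math.log2(num_states)

def cells_needed(num_bits: int, num_states: int) -> int:
    """Minimum number of cells with the given state count to hold num_bits."""
    return math.ceil(num_bits / bits_per_cell(num_states))

# Storing 1 KiB (8192 bits):
binary_cells = cells_needed(8192, 2)   # 8192 cells
ternary_cells = cells_needed(8192, 3)  # 5169 cells, roughly 37% fewer
```

This is the sense in which a three-valued device "may be more area efficient": the same data fits in measurably fewer cells, at the cost of distinguishing three voltage levels per cell instead of two.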

The researchers also found that nickel forms a very thin nickel oxide layer between the Ni and the lithium phosphate layers (see Fig. 1), and this oxide layer is essential for the low-energy switching of the device. The oxide layer is much thinner than the gold-lithium alloy layer that formed in their previous device, which means that this new "mini-battery" cell has a very low capacity and is therefore quickly and easily switched between states by applying minuscule currents. "The potential for extremely low energy consumption is the most noteworthy advantage of this device," remarks Prof. Hitosugi.

Increased speed, lower energy consumption, and smaller size are all highly demanded features in future memory devices. The memory cell developed by this research team is a very promising stepping stone toward much more energy-efficient and faster computing.

Credit: 
Tokyo Institute of Technology

New model for predicting kidney injury after common heart procedure

New Haven, Conn. --A Yale-led group of doctors has developed a new mathematical model that can predict the risk of acute kidney injury (AKI) in patients undergoing a common heart procedure.

For patients treated with percutaneous coronary intervention (PCI), commonly known as angioplasty, exposure to contrast agents -- material used in the procedure to help visualize blood vessels -- can harm the kidneys.

The new tool will enable doctors to make better pre-procedure estimates of risk and provide more personalized estimates for how much contrast material can safely be used when inserting stents in blocked or narrowed blood vessels near the heart, the researchers said. A new study describing the research appears in the journal JAMA Network Open.

"The previous models assumed that the exposure to contrast produced the same risk for everyone, but it is not one-size-fits-all. There are individual differences," said Harlan Krumholz, M.D., cardiologist and director of the Yale Center for Outcomes Research and Evaluation (CORE).

For the new study, the researchers developed a machine learning model, a type of artificial intelligence, for estimating patients' risk of AKI before their heart procedure, while also accounting for the complexity of associations between contrast levels and AKI in different risk groups. The new model was then able to predict AKI risk more accurately than previous models, the researchers said.
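The core idea, that the same contrast dose carries a different incremental risk depending on a patient's baseline risk, can be illustrated with a toy logistic model containing an interaction term. All coefficients and variable names below are invented for illustration; this is not the study's model:

```python
import math

# Hypothetical illustration: a logistic risk model in which contrast dose
# and baseline risk interact, so the added risk from a given dose depends
# on the patient's baseline. Coefficients are made up for demonstration.
def aki_risk(contrast_ml: float, baseline_score: float) -> float:
    logit = (-4.0
             + 0.010 * contrast_ml            # dose effect
             + 1.5 * baseline_score           # baseline-risk effect
             + 0.008 * contrast_ml * baseline_score)  # interaction term
    return 1.0 / (1.0 + math.exp(-logit))

# The same 100 mL of contrast raises predicted risk far more for a
# high-baseline patient than for a low-baseline one:
low_delta = aki_risk(100, 0.2) - aki_risk(0, 0.2)
high_delta = aki_risk(100, 1.0) - aki_risk(0, 1.0)
```

A model without the interaction term would assign every patient the same odds increase per milliliter of contrast, which is precisely the "one-size-fits-all" assumption Krumholz describes the older tools as making.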

Previous tools to estimate risk have not employed modern mathematical approaches or included information about the patients and their contrast exposure, they noted.

"We determined that their associations with the risk of kidney injury are quite complex. The range of contrast levels you're considering matters, and the baseline risk level for a particular patient matters, too," said Chenxi Huang, an associate research scientist at Yale and first author of the study.

Credit: 
Yale University

Samoa climate change resilience challenges Western perceptions

image: Dr Anita Latai-Niusulu interviewing a Samoan farmer

Image: 
Dr Anita Latai-Niusulu

The resilience of Samoan communities in the face of climate change is providing a blueprint for other nations to follow, according to Samoa and Otago researchers.

It is one of the first studies to examine Samoa's grassroots ability to adapt to climate change, and its authors warn officials risk ignoring village expertise at their peril.

The newly-released paper is co-authored by Dr Anita Latai-Niusulu from the National University of Samoa, and University of Otago Professors Tony Binns and Etienne Nel, both from the School of Geography.

For the study, based on Dr Latai-Niusulu's PhD thesis, researchers interviewed 165 residents in villages across Samoa's main islands Upolu and Savaii, including in coastal, inland, urban and rural areas.

More than 70 per cent of Samoa's population lives in 330 rural villages across Upolu and Savaii, and most of the country's infrastructure, population and development is near the coastline.

The researchers found villagers had a heightened awareness of climate change and noticed hotter days and longer dry spells, shorter periods of rainfall, stronger damaging winds, and sea level rise.

However, rather than despairing at the prospect, villagers have developed a pragmatic and positive approach to impending climate changes.

Past natural disasters such as Cyclones Ofa and Val in the 1990s had a devastating effect on many communities, but the recovery period also brought opportunities for developing tighter social connections, new food supplies and infrastructural development and, in some cases, village relocation.

Professor Binns says the Samoan approach challenges general Western perceptions about Pacific nations' ability to respond to climate change.

He says exposure to serious environmental challenges has not made villagers 'fatalistic' or 'helpless', but instead has given them a more optimistic outlook on life.

The close-knit Samoan village structure, with a village council (fono) made up of chiefly title holders (matai) from extended family units throughout the village, means that each villager has a voice at the local decision table.

Communities also regularly meet together at evening prayers to share information and strengthen social networks.

This, along with the fact that more than 80 per cent of Samoa's land and resources are still collectively owned, means Samoans can engage in collaborative action against climate change.

Common strategies in all villages include diversifying food and water sources, being geographically mobile, having more than one place to live, and developing mental and spiritual strength.

"Such diverse livelihood portfolios and close community collaboration have generated an impressive level of resilience which communities elsewhere in the Pacific and beyond could well emulate," Professor Binns says.

However, the study found climate change decisions in Samoa are primarily occurring at a national level, and are dominated by the views of government workers, consultancy firms and civil society workers.

Officials need to listen to community expertise and develop a more nuanced understanding of each village's key concerns - which vary according to each village's unique geographical challenges, Professor Binns says.

"Governments need to carefully reconsider their expenditure in relation to climate change adaptation, with perhaps less spending directed towards building seawalls and coastal roads.

"More support should be given to other climate change adaptation initiatives such as village, church and family activities that strengthen social networks and build social memory."

Credit: 
University of Otago

New study shows how cancer survivors develop opioid addictions

Opioids play an important role in how cancer patients manage pain, but the ongoing opioid epidemic has raised concerns about their potential for abuse.

Pain remains one of the most difficult symptoms associated with cancer. More than half of cancer patients undergoing treatment experience moderate to severe pain. Despite the accepted role of opioids in acute pain relief, the use of opioids for chronic pain (pain lasting longer than three to six months) remains controversial. Chronic opioid use can lead to diminishing effectiveness as well as dependence, misuse, abuse, and unintentional overdosing.

With an estimated 16.9 million cancer survivors in the United States and two-thirds of newly diagnosed cancer patients living more than five years, many cancer researchers believe it's important to better understand opioid use in oncology patients. While there are guidelines for how to help cancer patients avoid opioid dependence, many researchers are concerned that recommendations for risk reduction are based on expert opinion not related to cancer patients specifically.

Within a cohort of 106,732 cancer survivors diagnosed between 2000 and 2015, researchers determined rates of persistent post-treatment opioid use, diagnoses of opioid abuse or dependence, and admissions for opioid toxicity. The study cohort included patients diagnosed with one of the 12 most common cancers (bladder, breast, colon, esophagus, stomach, head and neck, kidney, liver, lung, pancreas, prostate, or rectal cancer) who were alive without recurrence two years after treatment.

Among the patients in this study, the overall incidence of persistent post-treatment opioid use was 8.3%, varying by cancer type from a low of 5.3% in prostate cancer patients to a high of 19.8% in liver cancer patients. Bladder, breast, esophagus, stomach, head and neck, liver, lung, and pancreas cancers were associated with higher odds compared with prostate cancer.

The rates of persistent opioid use after treatment varied substantially by a patient's history of opioid use prior to receiving a cancer diagnosis. Persistent post-treatment opioid use rates were lowest for patients who had never used opioids prior to their cancer diagnosis (3.5%), followed by prior intermittent users (15.0%) and prior chronic users (72.2%). The rate of post-treatment diagnoses of opioid abuse or dependence was 2.9%, and opioid-related admissions occurred in 2.1% of patients.

Several factors were associated with the risk of persistent opioid use. Younger age, white race, unemployment at the time of cancer diagnosis, lower median income, increased comorbidity, and current or prior tobacco use were all associated with increased risk. Prior diagnoses of alcohol abuse, non-opioid drug abuse, opioid abuse, and depression were associated with increased odds, and prior histories of chronic or intermittent opioid use were associated with substantially increased odds of persistent opioid use.

"Our study attempts to create an objective clinical tool that can help give providers a better understanding of a patient's risk of opioid-related toxicity," said Lucas K. Vitzthum, an author of the study. "Ultimately, clinical tools such as ours could help providers identify which patients could benefit from alternative pain management strategies or referral to pain specialists."

James D. Murphy, another author, noted that "opioids play an important role in helping patients with pain from cancer, or pain because of treatment. Despite this important role, opioid use carries a risk of problems related to long-term use, or abuse. From a healthcare provider perspective, we need better approaches to identify cancer patients at risk of these opioid-related problems."

Credit: 
Oxford University Press USA

Efficient bottom-up synthesis of new perovskite material for the production of ammonia

image: This new protocol for the production of BaCeO3−xNyHz can be carried out at much lower temperatures and in much less time compared with conventional methods.

Image: 
Tokyo Tech

Perovskites are a class of synthetic materials that have a crystalline structure similar to that of the naturally occurring mineral calcium titanate. They have been the subject of many studies because they exhibit exciting and unique properties that can be tuned according to their composition. One of their potential applications is as catalysts for the synthesis of ammonia. In other words, specific perovskites can be placed inside a reaction chamber with nitrogen and hydrogen to promote the reaction of these gases to form ammonia.

Ammonia is a useful substance that can be employed in the production of fertilizers and artificial chemicals, and even as a clean-energy carrier (in the form of hydrogen), which may be key in eco-friendly technologies. However, there are various challenges associated with the synthesis of ammonia and perovskites themselves.

The synthesis rate for ammonia is generally limited by the high energy required to dissociate nitrogen molecules. Researchers have had some success using precious metals, such as ruthenium. Recently, perovskites with some of their oxygen atoms replaced by hydrogen and nitrogen ions have been developed as efficient catalysts for ammonia synthesis. However, the traditional synthesis of perovskites with such substitutions usually has to be carried out at high temperatures (more than 800°C) and over long periods of time (weeks).

To address these issues, in a recent study carried out at Tokyo Tech, a group of researchers led by Prof. Masaaki Kitano devised a novel method for the low-temperature synthesis of one such oxygen-substituted perovskite, with the chemical name BaCeO3−xNyHz, and tested its performance as a catalyst to produce ammonia. To achieve this, they made an innovative alteration to the perovskite synthesis process. Using barium carbonate and cerium dioxide as precursors (or "ingredients") would require a very high temperature to combine them into the base perovskite, BaCeO3, because barium carbonate is very stable. In addition, one would then have to substitute the oxygen atoms with nitrogen and hydrogen ions. The team instead found that the compound barium amide reacts easily with cerium dioxide under ammonia gas flow to directly form BaCeO3−xNyHz at low temperatures and in less time (Fig. 1). "This is the first demonstration of a bottom-up synthesis of such a material, referred to as perovskite-type oxynitride-hydride," explains Prof. Kitano.

The researchers first analyzed the structure of the perovskite obtained through the proposed process and then tested its catalytic properties for the low-temperature synthesis of ammonia under various conditions. Not only did the proposed material outperform most of the state-of-the-art competitors when combined with ruthenium, but it also vastly surpassed all of them when combined with cheaper metals, such as cobalt and iron (see Fig. 2). This represents tremendous advantages in terms of both performance and associated cost.

Finally, the researchers attempted to elucidate the mechanisms behind the improved synthesis rate for ammonia. Overall, the insight provided in this study serves as a protocol for the synthesis of other types of materials with nitrogen and hydrogen ion substitutions and for the intelligent design of catalysts. "Our results will pave the way in new catalyst design strategies for low-temperature ammonia synthesis," concludes Prof. Kitano. These findings will hopefully make the synthesis of useful materials cleaner and more energy efficient.

Credit: 
Tokyo Institute of Technology

Competing signals shrink or grow liver tumor at the margins

Activating the Hippo molecular signaling pathway in liver tumor cells drives tumor growth--but activating the same pathway in healthy cells surrounding the tumor suppresses tumor growth. This unexpected effect indicates that there is a competitive interaction between tumor cells and their surrounding tissues, say Iván Moya and colleagues. The Hippo pathway and in particular two of its components, the transcriptional coactivators Yap and Taz, have been identified in experimental studies as drivers of tumor growth, making it a potential target in cancer treatments. However, the new findings by Moya et al. suggest that systemic inhibition of Yap and Taz could have unwanted consequences, by blocking the tumor-suppressing abilities of healthy cells at the tumor margin. In a mouse model of liver cancer, the researchers found Yap and Taz activity in the cells surrounding a tumor prompted cell death in the tumor cells. They also confirmed that while liver tumor cells rely on Yap and Taz for their survival, this effect is relative to the levels of Yap and Taz in surrounding cells. When Yap and Taz activity is higher in surrounding cells relative to the tumor cells, the tumors shrink--but can rebound when Yap and Taz activity is neutralized in surrounding cells.

Credit: 
American Association for the Advancement of Science (AAAS)

Non-coding DNA located outside chromosomes may help drive glioblastoma

image: Glioblastoma tumor cells with extrachromosomal EGFR gene amplification (red) indicated by white arrows.

Image: 
Photo courtesy of Case Western Reserve.

One of the ways a cancer-causing gene works up enough power to turn a normal cell into a cancer cell is by copying itself over and over, like a Xerox machine. Scientists have long noticed that when cancer-causing genes do that, they also scoop up some extra DNA into their copies. But it has remained unclear whether the additional DNA helps drive cancer or is just along for the ride.

Using human glioblastoma brain tumor samples, researchers at University of California San Diego School of Medicine and Case Western Reserve University School of Medicine have now determined that all of that extra DNA is critical for maintaining a cancer-causing gene's activation, and ultimately supporting a cancer cell's ability to survive. Comparing those findings to a public database of patient tumor genetics, they also discovered that even if two different tumor types are driven by the same cancer-causing gene, the extra DNA may differ.

The study, published November 21, 2019 in Cell, could explain why drugs will often work for some cancer types but not others.

"We've been targeting the cancer-causing gene for therapy, but it turns out we should also think about targeting the switches that are carried along with it," said co-senior author Peter Scacheri, PhD, Gertrude Donnelly Hess Professor of Oncology at Case Western Reserve University School of Medicine and member of the Case Comprehensive Cancer Center.

When the human genome was first fully sequenced, many people were surprised to find it contained far fewer genes -- segments of DNA that encode proteins -- than expected. It turns out that the non-coding regions that make up the remainder of the genome play important roles in regulating and enhancing the protein-coding genes -- turning them "on" and "off," for example.

In this study, the researchers focused on one example cancer-causing gene, EGFR, which is particularly active in glioblastoma, an aggressive form of brain cancer, and other cancers. When copies of EGFR pile up in tumors, they tend to be in the form of circular DNA, separate from the chromosome.

"In 2004, I was the lead on the first clinical trial to test a small molecule inhibitor of EGFR in glioblastoma," said co-senior author Jeremy Rich, MD, professor of medicine at UC San Diego School of Medicine and director of neuro-oncology and director of the Brain Tumor Institute at UC San Diego Health. "But it didn't work. And here we are 15 years later, still trying to understand why brain tumors don't respond to inhibitors of what seems to be one of the most important genes to make this cancer grow."

The team took a closer look at the extra DNA surrounding EGFR circles in 9 of 44 different glioblastoma tumor samples donated by patients undergoing surgery. They discovered that the circles contained as many as 20 to 50 enhancers and other regulatory elements. Some of the regulatory elements had been adjacent to EGFR in the genome, but others were pulled in from other regions of the genome.

To determine the role each regulatory element plays, the researchers silenced them one at a time. They concluded that nearly every single regulatory element contributed to tumor growth.

"It looks like the cancer-causing gene grabs as many switches as it can get its hands on ... co-opting their normal activity to maximize its own expression," Scacheri said.

First author Andrew Morton, a graduate student in Scacheri's lab, then searched a public database of patient tumor genetic information -- more than 4,500 records covering nine different cancer types. He found that the team's observation was not limited to EGFR and glioblastoma. Enhancers were amplified alongside cancer-causing genes in many tumors, most notably the MYC gene in medulloblastoma and MYCN in neuroblastoma and Wilms tumors.

"People thought that the high copy number alone explained the high activity levels of cancer-causing genes, but that's because people weren't really looking at the enhancers," Morton said. "The field has been really gene-centric up to this point, and now we're taking a broader view."

Next, the researchers want to know if the diversity in regulatory elements across cancer types could also be helping tumors evolve and resist chemotherapy. They also hope to find a class of therapeutic drugs that inhibit these regulatory elements, providing another way to put the brakes on cancer-causing genes.

"This isn't just a laboratory phenomenon, it's information I need to better treat my patients," said Rich, who is also a faculty member in the Sanford Consortium for Regenerative Medicine and Sanford Stem Cell Clinical Center at UC San Diego Health.

Credit: 
University of California - San Diego

New algorithms train AI to avoid specific bad behaviors

video: Robots, self-driving cars and other intelligent machines could become better-behaved thanks to a new way to help machine-learning designers build AI applications with safeguards against specific, undesirable outcomes such as racial and gender bias.

Image: 
Deboki Chakravarti

Artificial intelligence has moved into the commercial mainstream thanks to the growing prowess of machine learning algorithms that enable computers to train themselves to do things like drive cars, control robots or automate decision-making.

But as AI starts handling sensitive tasks, such as helping pick which prisoners get bail, policy makers are insisting that computer scientists offer assurances that automated systems have been designed to minimize, if not completely avoid, unwanted outcomes such as excessive risk or racial and gender bias.

A team led by researchers at Stanford and the University of Massachusetts Amherst published a paper Nov. 22 in Science suggesting how to provide such assurances. The paper outlines a new technique that translates a fuzzy goal, such as avoiding gender bias, into the precise mathematical criteria that would allow a machine-learning algorithm to train an AI application to avoid that behavior.

"We want to advance AI that respects the values of its human users and justifies the trust we place in autonomous systems," said Emma Brunskill, an assistant professor of computer science at Stanford and senior author of the paper.

Avoiding misbehavior

The work is premised on the notion that if "unsafe" or "unfair" outcomes or behaviors can be defined mathematically, then it should be possible to create algorithms that can learn from data how to avoid these unwanted results with high confidence. The researchers also wanted to develop techniques that make it easy for users to specify what sorts of unwanted behavior they want to constrain, and that enable machine-learning designers to predict with confidence that a system trained on past data can be relied upon when applied in real-world circumstances.

"We show how the designers of machine learning algorithms can make it easier for people who want to build AI into their products and services to describe unwanted outcomes or behaviors that the AI system will avoid with high probability," said Philip Thomas, an assistant professor of computer science at the University of Massachusetts Amherst and first author of the paper.

Fairness and safety

The researchers tested their approach by trying to improve the fairness of algorithms that predict GPAs of college students based on exam results, a common practice that can result in gender bias. Using an experimental dataset, they gave their algorithm mathematical instructions to avoid developing a predictive method that systematically overestimated or underestimated GPAs for one gender. With these instructions, the algorithm identified a better way to predict student GPAs with much less systematic gender bias than existing methods. Prior methods struggled in this regard either because they had no fairness filter built-in or because algorithms developed to achieve fairness were too limited in scope.
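
The core recipe behind this approach, train a candidate model and then accept it only if a high-confidence statistical test on held-out data bounds the unwanted behavior, can be sketched roughly as follows. This is a simplified illustration, not the authors' published code: the Hoeffding bound, the threshold values, and the two-group encoding are all assumptions made for the sketch.

```python
import numpy as np

def hoeffding_radius(n, delta, value_range):
    # Two-sided Hoeffding confidence radius for the mean of n samples
    # bounded within an interval of width value_range.
    return value_range * np.sqrt(np.log(2.0 / delta) / (2.0 * n))

def passes_fairness_test(pred, y, group, epsilon=0.1, delta=0.05, value_range=4.0):
    """Seldonian-style safety test (simplified sketch): return True only if,
    with confidence roughly 1 - delta, the gap between the two groups' mean
    signed prediction errors is at most epsilon GPA points."""
    err = np.asarray(pred) - np.asarray(y)
    a, b = err[group == 0], err[group == 1]
    gap = abs(a.mean() - b.mean())
    # Conservative: assume each group's true mean error could sit anywhere
    # within its own confidence radius of the observed mean.
    radius = hoeffding_radius(len(a), delta / 2, value_range) \
           + hoeffding_radius(len(b), delta / 2, value_range)
    return gap + radius <= epsilon
```

A trainer built around this test would fit a candidate predictor on one data split, run the test on a separate safety split, and refuse to return a model that fails it. That refusal behavior, rather than any property of the candidate model itself, is what lets designers make the high-confidence guarantee.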

The group developed another algorithm and used it to balance safety and performance in an automated insulin pump. Such pumps must decide how big or small a dose of insulin to give a patient at mealtimes. Ideally, the pump delivers just enough insulin to keep blood sugar levels steady. Too little insulin allows blood sugar levels to rise, leading to short-term discomfort such as nausea and an elevated risk of long-term complications, including cardiovascular disease. Too much, and blood sugar crashes - a potentially deadly outcome.

Machine learning can help by identifying subtle patterns in an individual's blood sugar responses to doses, but existing methods don't make it easy for doctors to specify outcomes that automated dosing algorithms should avoid, like low blood sugar crashes. Using a blood glucose simulator, Brunskill and Thomas showed how pumps could be trained to identify dosing tailored for that person - avoiding complications from over- or under-dosing. Though the group isn't ready to test this algorithm on real people, it points to an AI approach that might eventually improve quality of life for diabetics.

In their Science paper, Brunskill and Thomas use the term "Seldonian algorithm" to define their approach, a reference to Hari Seldon, a character created by science fiction author Isaac Asimov, whose fiction famously proclaimed three laws of robotics, beginning with the injunction that "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

While acknowledging that the field is still far from guaranteeing the three laws, Thomas said this Seldonian framework will make it easier for machine learning designers to build behavior-avoidance instructions into all sorts of algorithms, in a way that can enable them to assess the probability that trained systems will function properly in the real world.

Brunskill said this proposed framework builds on the efforts that many computer scientists are making to strike a balance between creating powerful algorithms and developing methods to ensure their trustworthiness.

"Thinking about how we can create algorithms that best respect values like safety and fairness is essential as society increasingly relies on AI," Brunskill said.

Credit: 
Stanford University

Unraveling gene expression

image: The pioneer transcription factor Rap1 pries open compact chromatin structure to activate genes.

Image: 
Beat Fierz, EPFL

The DNA of a single cell is 2-3 meters long end-to-end. To fit and function, DNA is packaged around specialized proteins. These DNA-protein complexes are called nucleosomes, and they are a small part of a larger structure called chromatin. Nucleosomes can be thought of as the cell's DNA storage and protection unit.

When a particular gene needs to be expressed, the cell requires access to the protected DNA within chromatin. This means that the chromatin structure must be opened and the nucleosomes must be removed to expose the underlying target gene.

This takes place in the orchestrated process of "chromatin remodeling", which regulates gene expression and involves a multitude of actors. Unravelling this pivotal step not only furthers our fundamental understanding, but may also help in the development of genetic engineering tools.

Now the lab of Beat Fierz at EPFL has uncovered the first steps in the chromatin-opening process at the level of a single molecule, using a combination of chemical biology and biophysical methods. Published in Molecular Cell, the work looks at the role of a group of proteins called "pioneer transcription factors". These proteins bind to specific DNA regions within chromatin that are themselves shielded from other proteins. Little is known about how these factors overcome the barriers of the chromatin maze.

Fierz's lab looked at yeast, a model organism for human genetics. The method involved replicating the architecture of yeast genes, combined with single-molecule fluorescence. The researchers studied a yeast pioneer transcription factor called Rap1 and found that it choreographs chromatin remodeling, opening up access for other gene-expression proteins that were previously blocked.

To do this, Rap1 first binds chromatin and then influences the action of a large molecular machine called "Remodeling the Structure of Chromatin" (RSC), displacing nucleosomes and paving the way to the now-exposed DNA for other proteins involved in controlling gene expression.

By revealing the physico-chemical mechanism of how Rap1 gains access to chromatin and opens it up, the EPFL study proposes a biological model for other pioneer transcription factors, but also provides the tools for investigating them at the level of a single molecule.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Self-restrained genes enable evolutionary novelty

image: This is a confocal micrograph of a young leaf of Cardamine hirsuta (hairy bittercress) with emerging leaflets, showing distribution of the RCO protein. Cell outlines are shown in gray. RCO shown here in red colour is active at the base of initiating leaflets where it reduces growth, leading to the formation of leaflets that are separated from each other.

Image: 
Neha Bhatia and Peter Huijser

Changes in the genes that control development can potentially make large contributions to evolution by generating new morphologies in plants and animals. However, because developmental genes frequently influence many different processes, changes to their expression carry a risk of "collateral damage". Scientists at the Max Planck Institute for Plant Breeding Research in Cologne, and collaborators, have now shown how gene self-repression can reduce the potential side effects of novel gene expression so that new forms can evolve. This self-regulation occurs via a distinctive molecular mechanism employing small regions of genomic DNA called low-affinity transcription factor binding sites.

Suppose a bird develops a modified wing shape that makes flying easier and could benefit its survival. If the same gene change also altered the bird's color, making it less attractive to mates, then the advantageous wing-shape modification would be unlikely to persist. How, then, does nature balance the potential for novelty with the risk of side effects that may prevent novelty from arising? Using the evolution of leaf shape as an example, an international team led by Director Miltos Tsiantis has provided fresh insight into this question.

This new study was done in the hairy bittercress, a small weed that the Tsiantis group has developed into a model system for understanding the evolution of plant form. It builds on previous work from the group in which a gene called RCO was found to have driven leaf shape diversification in mustard plants by acquiring a novel expression pattern.

RCO encodes a transcription factor, a type of protein that can turn other genes on or off, and RCO's new expression pattern resulted in the emergence of the more complex leaf shapes found in bittercress. The researchers have now shown that this change in gene expression was accompanied by RCO acquiring the ability to repress its own activity. Mike Levine, Director of the Lewis-Sigler Institute for Integrative Genomics at Princeton University who was not involved in the study, finds this particular insight "very compelling". As the self-repression of RCO "limits the scope of its activity", Levine explains, it "thereby blocks potentially deleterious influences on cell development and function".

Stimulating cytokinin

As a next step, the scientists identified the genes targeted by RCO, and found that many of them are responsible for coordinating local levels of cytokinin - a widely acting plant hormone known to affect cell growth. Importantly, when the self-regulation of RCO is modified, RCO stimulates cytokinin excessively and leaf shape is altered in ways that can negatively affect plant fitness. This finding confirms the idea that self-repression of RCO could be essential for the persistence of RCO-induced novel leaf morphologies.

What's particularly interesting is that this self-repression of RCO occurs in a very distinctive way. The scientists discovered that it is based on many weak interactions between the RCO protein and RCO regulatory DNA at low-affinity binding sites. "This finding is exciting", explains Tsiantis, "because low-affinity binding sites can evolve relatively quickly, thus offering an easy way for evolution to keep changes in gene expression in check, by lowering a regulator's expression".

Soft repression

Indeed, this latest work from Tsiantis's team directly demonstrates that low-affinity transcription factor binding sites can play a major role in the generation of morphological novelty. By providing a tool to "softly" repress RCO expression, these sites dampen the effects of RCO expression changes and allow cytokinin levels to be fine-tuned. This in turn promotes the appearance of more complex leaf shapes, e.g., by precisely regulating the outgrowth of lobes or leaflets along the margins of developing leaves.

These results will stimulate further efforts to understand the influence of low-affinity transcription factor binding sites on development, diversity and disease. For example, there is increasing awareness that changes in the regulation of developmental genes are a major contributor to human disease, and that other regulatory changes can reduce disease severity or protect individuals who carry disease variants. While the specific DNA sequences underlying these effects are often unknown, this latest work highlights low-affinity transcription factor binding sites as excellent candidate regions for identifying causal sequences of disease susceptibility, and for understanding variation in trait diversity more broadly in complex eukaryotes.

Credit: 
Max-Planck-Gesellschaft

Pancreatic cancer tumor classification could optimize treatment choices

CHAPEL HILL -- A study from the University of North Carolina Lineberger Comprehensive Cancer Center could help predict resistance to treatments for pancreatic cancer, one of the deadliest cancer types.

In Clinical Cancer Research, a journal of the American Association for Cancer Research, researchers led by UNC Lineberger's Jen Jen Yeh, MD, and Naim Rashid, PhD, reported findings on how two subtypes of pancreatic cancer respond differently to treatment. Importantly, they found that one subtype of the disease showed poor responses to common therapies and also had worse survival.

"Our study evaluated the best way to classify tumors according to available treatment response data from prior clinical trials," said Yeh, who is a professor of surgery and pharmacology and vice chair for research in the UNC School of Medicine Department of Surgery. "Our hope is that we can use this information to tailor treatments, and potentially avoid giving therapies that may not work well for certain patients."

Pancreatic cancer is one of the deadliest cancer types, with 9.3 percent of patients, or fewer than one in 10, surviving five years after diagnosis, according to the National Cancer Institute. The disease is typically diagnosed in later stages, when the cancer has already spread.

In 2015, UNC Lineberger researchers discovered two major subtypes of pancreatic cancer based on the molecular and genetic features of the disease. However, several other research groups reported different classification systems with three and four subtypes. Researchers said consensus was lacking regarding which of the proposed systems was optimal for clinical decision-making in pancreatic cancer.

To address this, Yeh, Rashid and colleagues first analyzed data from two recent clinical trials for pancreatic cancer to better understand which tumor classifications aligned with treatment responses. They found that the two-subtype classification aligned best with the trials' treatment outcome data.

After analyzing five independent pancreatic cancer studies, they also found that the two-subtype system best explained differences in overall patient survival, with patients classified as having basal-like tumors showing worse survival outcomes.

"We found that this simpler, two-subtype system best explained treatment responses and survival outcomes," said Rashid, the study's co-corresponding author and an assistant professor in the UNC Gillings School of Global Public Health Department of Biostatistics.

Importantly, they also saw in their data that patients classified as having the basal-like subtype showed much poorer response rates to treatments than the other subtype.

In the two trials, basal-like tumors showed no response to FOLFIRINOX, a standard therapy that combines five chemotherapy agents, or a treatment that used FOLFIRINOX as a backbone.

"In the context of these two trials, basal-like tumors didn't respond well to common first-line therapies," Rashid said. "In the future, can we use these subtypes to optimize therapies for patients?"

The other tumor type, which they called "classical," showed a better response to FOLFIRINOX treatments.

"We want to know what therapies are best for the patient so that we can maximize response and quality of life," Yeh said. "For pancreatic cancer, where time is more limited, this becomes even more important."

They also reported that they were able to simplify and adapt their classification method so that it can be used in the clinic to generate subtype predictions for a single patient.

Their new subtype classification method, generated using machine-learning approaches, relied on comparisons of how just nine pairs of genes are expressed. They found this method was extremely accurate, even when it was used to classify tumor samples that were processed and stored differently and used different methods of gene expression measurement.
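
A rank-based gene-pair classifier of this kind can be sketched in a few lines. The gene names, weights, intercept, and class direction below are hypothetical placeholders, not the published PurIST parameters; the point is that each rule depends only on which gene of a pair is expressed higher within a single sample, which is what makes the call robust to differences in sample processing and measurement platform.

```python
from math import exp

# Hypothetical gene pairs and weights, standing in for the nine
# published PurIST pairs. Each rule asks only whether the first
# gene is expressed above its partner in the same sample.
GENE_PAIRS = [("GENE_A1", "GENE_B1"), ("GENE_A2", "GENE_B2"), ("GENE_A3", "GENE_B3")]
WEIGHTS = [1.2, 0.8, 1.5]
INTERCEPT = -1.0

def classify_sample(expression):
    """expression: dict of gene name -> expression value for one tumor sample.
    Returns 'basal-like' or 'classical' via a logistic score computed
    from the binary pair-comparison indicators."""
    score = INTERCEPT
    for (gene_up, gene_down), w in zip(GENE_PAIRS, WEIGHTS):
        score += w * (1.0 if expression[gene_up] > expression[gene_down] else 0.0)
    prob_basal = 1.0 / (1.0 + exp(-score))  # logistic link
    return "basal-like" if prob_basal >= 0.5 else "classical"
```

Because only within-sample rank comparisons enter the score, no cross-sample normalization is needed, which is why a single patient's tumor can be classified in isolation.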

"This study basically provides the evidence that this is something we can feasibly do in the clinic," Yeh said.

They are working to bring their classification algorithm, which they call PurIST, into a form that can be used in future clinical trials at the North Carolina Cancer Hospital, the University of Rochester and the Medical College of Wisconsin.

Yeh said their next step is to conduct clinical trials to continue to try to understand how the tumor subtypes can inform how patients respond to treatment. They are also trying to understand the differences between the two subtypes.

"We want to use the prediction model we developed in actual trials to ensure patients are placed on optimal therapies up-front in order to optimize survival and other outcomes," Rashid said.

Credit: 
UNC Lineberger Comprehensive Cancer Center

Eastern equine encephalitis virus poses emergent threat, say NIAID officials

image: Colorized electron microscope image of mosquito salivary gland tissue infected by the eastern equine encephalitis virus. Viral particles are red.

Image: 
CDC/Fred Murphy/Sylvia Whitfield

WHAT:

Although eastern equine encephalitis (EEE), a mosquito-borne illness, has existed for centuries, 2019 has been a particularly deadly year for the disease in the United States. As of November 12, 36 confirmed cases of EEE had been reported by eight states; 13 of these cases were fatal. In a new commentary in The New England Journal of Medicine, officials from the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health, describe the eastern equine encephalitis virus (EEEV) that causes EEE, current research efforts to address EEE, and the need for a national strategy to address the growing threat of EEEV and other emerging and re-emerging viruses spread by mosquitoes and ticks (known as arboviruses).

There were 12 documented U.S.-based EEE epidemics between 1831 and 1959. The virus is spread between Culiseta melanura mosquitoes and various tree-perching birds found in forested wetlands. Occasionally, other mosquito species transmit the virus to people and other mammals. In people, EEEV takes roughly 3 to 10 days to cause symptoms. The virus initially causes fever, malaise, intense headache, muscle aches, nausea and vomiting; specific diagnostic testing may not reveal anything as EEEV is difficult to isolate from clinical samples, and testing for EEEV antibodies may be negative. Neurologic signs of EEE, which may appear within 5 days of infection, initially are nonspecific but rapidly progress. Most people (96%) infected with EEEV do not develop symptoms; however, of those who do, one-third or more die, and the others frequently suffer permanent and severe neurologic damage.

Although point-of-care diagnostics for EEE and many other mosquito-borne causes of encephalitis are not available, they would currently be of limited value in the absence of effective treatment, the authors write. So far, no antiviral drug has proven safe and effective against EEE, but many compounds are being assessed. Monoclonal antibodies have been found effective in an experimental animal model, but only when given prior to infection. Patients with EEE are currently treated with supportive care, which often includes intensive care in a hospital and ventilator assistance. Patients with EEE are not infectious, and social support and counseling for both the patient and the family are vitally important given the seriousness of the disease, the authors write.

Several EEE vaccine candidates are in development but may have trouble reaching advanced development and licensure, according to the authors. EEE outbreaks are rare, brief and focal, and occur sporadically in unpredictable locations, making it difficult to identify an appropriate target population for vaccination. Efforts to develop mosquito-saliva vaccines that would be effective against multiple mosquito diseases, including EEE, are in early stages.

In the absence of effective EEE vaccines and treatments, state and local health departments can provide an early warning of imminent human infections by surveilling horses, birds and mosquitoes, but these efforts are threatened by insufficient funding, according to the authors. In recent years, the Americas have seen a growing number of emerging and re-emerging arboviruses, such as dengue, West Nile, chikungunya, Zika and Powassan. Although outbreaks of EEE disease thus far have been infrequent and focal, the spike in cases in 2019 and the looming presence of other, potentially deadly arboviruses in the United States and globally demand a national defense strategy for arboviruses and other vector-borne diseases, the authors write. Although the best way to address these viruses is not entirely clear, to "ignore them completely and do nothing would be irresponsible," say the authors.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Decoding the fundamental mechanisms of human salivary lubrication

An interdisciplinary team of scientists led by the University of Leeds have uncovered the fundamental mechanism by which human saliva lubricates our mouth. Their multi-scale study opens the door to advancing dry mouth therapies and saliva substitutes - potentially bringing relief to people who suffer from dry mouth, which can affect swallowing, speech, nutritional intake and quality of life.

Roughly 10% of the general population and 30% of older people suffer from dry mouth, which can be caused by prescribed polymedication, certain cancer treatments and autoimmune diseases.

Previously, the molecular mechanisms that govern saliva's lubrication properties have not been well understood. This has caused significant challenges in developing effective and long-lasting treatments or therapies for dry mouth.

Now, new research harnessing expertise in food colloid science, mechanical engineering, nanoscience and chemical engineering has demonstrated for the first time that the high lubrication properties of saliva are a result of electrostatic self-assembly between mucin proteins and positively charged non-mucinous small-molecular proteins.

The study, published in the journal Advanced Materials Interfaces, puts forward an unprecedented molecular model that explains the synergistic lubrication behaviour of human salivary proteins from macro to nanoscale.

Study lead author, Dr Anwesha Sarkar from the School of Food Science and Nutrition at Leeds, said: "Human salivary lubrication underpins the fundamentals of human feeding and speech. Oral lubrication is crucial not only to one's daily life functioning but also to one's general health and wellbeing. However, until now the molecular mechanism behind salivary lubrication properties has remained elusive.

"Our research resolves the distinct roles played by mucin and by non-mucinous proteins. We found that hydrated mucin controls macromolecular viscous lubrication, forming a mesh-like nano-reservoir that traps water molecules. Non-mucinous, positively charged small proteins, on the other hand, act as a molecular bridge between mucin-mucin and mucin-surface contacts within that mesh, aiding boundary lubrication.

"We believe that this work is an important stepping-stone to designing the next-generation of nature-inspired aqueous lubricants for nutritional technologies and biomedical applications."

Credit: 
University of Leeds