
Study uncovers key parts of mechanism for activating T cells to fight cancer and other diseases

BOSTON -- In just a few years, CAR T-cell and other adoptive T-cell therapies have emerged as among the most promising forms of cancer immunotherapy. But even as these agents prove themselves against several forms of leukemia and lymphoma - and, potentially, certain solid tumors - basic questions remain about how they work.

In a study published online today by the journal Immunity, scientists at Dana-Farber Cancer Institute, Harvard Medical School, Vanderbilt University and colleagues at other institutions show how machinery within immune system T cells responds to outside signals and activates the cells to attack cancerous, infected, or otherwise diseased cells. The findings, based on 15 years of painstaking work to recreate and assemble key components of the signal-processing mechanism, may help researchers fine-tune T-cell therapies to the requirements of individual patients, the study authors say.

T cells, whose surfaces are dotted with structures known as T-cell antigen receptors (TCRs), patrol the body for signs of infection or other disease. As they keep watch, their TCRs lock onto bits of proteins, called antigens, displayed on protein structures decorating the surface of other cells in the human body. The antigens reveal whether a cell is normal or diseased. If a cell is diseased, these "protein bit flags" are recognized as "foreign," and the T cell switches on, or activates, to kill the diseased cell. In CAR T-cell therapy, billions of a patient's T cells are removed and engineered to produce a structure called a chimeric antigen receptor, or CAR, that recognizes and latches on to a cancer cell. The resulting CAR T cells - essentially, high-performance versions of ordinary T cells - are then infused into the patient, where they take up the battle against tumor cells. Other TCR immunotherapies use genetically engineered T cells employing natural TCRs rather than chimeric receptors to target specific tumor cell antigens, also called neoantigens. Of note, in every healthy human being there are billions of distinct T cells each bearing unique TCRs and collectively capable of recognizing the myriad antigens that identify diseased cells.

"While CAR T cells, and T cells in general, are often effective in identifying and killing tumor cells, the precise mechanism by which the TCR works hasn't been clear," says the study's lead author, Kristine Brazin, PhD, of Dana-Farber and Harvard Medical School. "How is the signal, which originates when the receptor links to a tumor antigen, transmitted through the cell membrane into the cell interior leading to cell activation?"

Answering that question involved a deep dive into the intricacies of the TCR. Far from being a rigid, seamless object, the receptor consists of eight distinct subunits that can move as the TCR operates, with one subunit pair even dissociating from the others in a highly choreographed manner.

The most prominent features of the TCR are two long components, dubbed α and β, which are unique to each individual T cell and extend like pincers from the cell membrane to snare a particular cell antigen. Besides α and β, there are six CD3 subunits, common to all TCRs, that are involved in signaling to the T cell that the pincers have detected antigen. Scientists have had a clear picture of the portions of the TCR that rise from the surface of the cell but knew little about the portions that anchor the receptor in the T-cell membrane.

Brazin and her colleagues focused on the α region of the TCR. Using nuclear magnetic resonance technology, they determined the structure of the section of TCRα implanted in the membrane. Here a surprise was in store.

"The assumption had been that this region, known as the transmembrane segment, was always straight," Brazin relates. "We found, however, that it is sometimes bent in an L-shaped formation."

When configured like an L, the segment remains largely within the cell membrane. When, like a flexible straw, it straightens up, one end pokes into the cell interior.

"We wanted to understand why this segment is sometimes embedded so shallowly in the membrane - in an L shape," Brazin relates. "We tried to make it straight."

To do that, she and her colleagues mutated two protein residues that cling to the sides of the transmembrane segment. Mutating one of them, Arg251, caused the segment to become slightly more embedded in the membrane; mutating the other, Lys256, immersed it much more deeply. Other residues were found to regulate the interconversion between the bent and straight forms, with the straight form jutting further through the cell membrane.

It was on the surface of the cell, however, that this bending and unbending made the biggest difference. When the transmembrane segment is in its full L shape, it presses tightly against the CD3 subunits at its side. When it unbends a little - as when Arg251 was mutated - that tightness relaxes a bit, and the T cell enters an early stage of activation. When it becomes more fully immersed in the cell membrane, the gap with CD3 widens further and the T cell enters a later stage of activation, ready to attack tumor cells.

"The looser the connection between the transmembrane segment and CD3 subunits, the higher the state of T cell activation," Brazin remarks. "Our findings suggest that the mechanical force of the TCR's interaction with antigens during T cell movement initiates T cell activation by weakening the connection between the transmembrane segment and CD3."

The finding suggests that small-molecule drugs or genetic engineering approaches that widen or narrow the space between the transmembrane segment and CD3 could be used to tune the strength of T-cell attack on cancers or non-malignant diseases, as needed for individual patients, the researchers say.

"This study represents a success of multidisciplinary basic science, explaining how bioforces involving antigen recognition initiate TCR signaling through the T-cell membrane with potential for future translational impact," says the study's senior author, Ellis Reinherz, MD, Chief of the Laboratory of Immunobiology and Professor of Medicine in the Department of Medical Oncology at Dana-Farber and Harvard Medical School. "A special scientist such as Kristine Brazin with tireless persistence, focus and intellect was required to solve this mystery over more than a decade of research effort," he added.

Credit: 
Dana-Farber Cancer Institute

Researchers teach 'machines' to detect Medicare fraud

image: This is Taghi M. Khoshgoftaar, Ph.D., co-author and Motorola Professor in FAU's Department of Computer and Electrical Engineering and Computer Science.

Image: 
Florida Atlantic University

Using a highly sophisticated form of pattern matching, researchers from Florida Atlantic University's College of Engineering and Computer Science are teaching "machines" to detect Medicare fraud. Medicare, the primary health care coverage for Americans 65 and older, accounts for 20 percent of health care spending in the United States. About $19 billion to $65 billion is lost every year because of Medicare fraud, waste or abuse.

Finding fraud is like searching for the proverbial "needle in a haystack": human auditors or investigators have the painstaking task of manually checking thousands of Medicare claims for specific patterns that could indicate foul play or fraudulent behavior. Furthermore, according to the U.S. Department of Justice, fraud enforcement efforts currently rely heavily on health care professionals coming forward with information about Medicare fraud.

A study published in the journal Health Information Science and Systems is the first to use big data from Medicare Part B and employ advanced data analytics and machine learning to automate the fraud detection process. Programming computers to predict, classify and flag potential fraudulent events and providers could significantly improve fraud detection and lighten the workload for auditors and investigators.

Researchers from FAU's Department of Computer and Electrical Engineering and Computer Science examined the Medicare Part B dataset from 2012 to 2015. They focused on detecting fraudulent provider claims within the dataset, which consisted of 37 million cases. Fraudulent activities include patient abuse or neglect as well as billing for services not rendered. Physicians and other providers who commit fraud are excluded from participating in federal health care programs like Medicare, and these cases are labeled as "fraud."

For the study, the researchers aggregated the 37 million cases down to a smaller dataset of 3.7 million and identified a unique process to map fraud labels with known fraudulent providers.

Medicare Part B data included provider information, average payments and charges, procedure codes, the number of procedures performed as well as the medical specialty, which is referred to as provider type. In order to obtain exact matches, the researchers only used the National Provider Identifier (NPI) to match fraud labels to the Medicare Part B data. The NPI is a single identification number issued by the federal government to health care providers.

Researchers directly matched the NPI across the Medicare Part B data, flagging any provider in the "excluded" database as being "fraudulent." The research team classified a physician's NPI or specialty and specifically looked at whether the predicted specialty differed from the actual specialty, as indicated in the Medicare Part B data.
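The labeling step described above can be sketched as a simple exact match on NPI. This is an illustrative reconstruction, not the authors' code, and the field names and NPI values below are hypothetical, not the actual Medicare Part B schema:

```python
# Sketch of NPI-based fraud labeling: any provider whose NPI appears in
# the federal exclusion database is labeled as "fraud".

def label_fraud(part_b_records, excluded_npis):
    """Attach a boolean fraud label to each record by exact NPI match."""
    excluded = set(excluded_npis)  # set gives O(1) membership tests
    return [{**record, "fraud": record["npi"] in excluded}
            for record in part_b_records]

# Hypothetical records and exclusion list for illustration only
records = [
    {"npi": "1003000126", "provider_type": "Cardiology"},
    {"npi": "1992999999", "provider_type": "Dermatology"},
]
labeled = label_fraud(records, {"1992999999"})
```

Because the NPI is a single federally issued identifier, matching on it alone avoids the ambiguity of matching on names or addresses.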

"If we can predict a physician's specialty accurately based on our statistical analyses, then we could potentially find unusual physician behaviors and flag these as possible fraud for further investigation," said Taghi M. Khoshgoftaar, Ph.D., co-author and Motorola Professor in FAU's Department of Computer and Electrical Engineering and Computer Science. "For example, if a dermatologist is inaccurately classified as a cardiologist, then this could indicate that this particular physician is acting in a fraudulent or wasteful way."

For the study, Khoshgoftaar, along with Richard A. Bauder, senior author, a Ph.D. student at FAU and a data scientist at FPL, and Matthew Herland, a Ph.D. student in FAU's Department of Computer and Electrical Engineering and Computer Science, had to address the fact that the original labeled big dataset was highly imbalanced. This imbalance occurred because fraudulent providers are much less common than non-fraudulent providers. This scenario can be likened to "Where's Waldo?" and is problematic for machine learning approaches because the algorithms are trying to distinguish between the classes -- and one dominates the other, thereby fooling the learner.

To combat this imbalance, the researchers used random undersampling to reduce the dataset from the 3.7 million cases down to about 12,000 cases. They created seven class distributions and used six different learners across class distributions from severely imbalanced to balanced.
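The undersampling step can be sketched in a few lines: keep every fraud case and randomly sample just enough non-fraud cases to hit a target class distribution, such as the 90:10 normal-to-fraud split the study found most effective. This is a minimal sketch of the general technique, not the authors' actual implementation:

```python
import random

def undersample_indices(labels, majority_frac=0.90, seed=42):
    """Return shuffled indices giving roughly majority_frac non-fraud cases."""
    rng = random.Random(seed)  # seeded for reproducibility
    fraud = [i for i, y in enumerate(labels) if y == 1]
    normal = [i for i, y in enumerate(labels) if y == 0]
    # Number of normal cases needed so fraud makes up (1 - majority_frac)
    n_normal = int(len(fraud) * majority_frac / (1 - majority_frac))
    keep = fraud + rng.sample(normal, min(n_normal, len(normal)))
    rng.shuffle(keep)
    return keep

# Toy data: 100 fraud cases hidden among 100,000 normal ones
labels = [1] * 100 + [0] * 100_000
idx = undersample_indices(labels)  # 1,000 indices at a 90:10 split
```

Varying `majority_frac` across several values is how one would compare class distributions from severely imbalanced to balanced, as the study did with its seven distributions.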

Results from the study show statistically significant differences between all of the learners, as well as differences in class distributions for each learner. RF100, a Random Forest learning algorithm, was the best at detecting potential fraud events.

More interestingly, and contrary to popular belief that balanced datasets perform the best, this study found that was not the case for Medicare fraud detection. Keeping more of the non-fraud cases actually helped the learner/model better distinguish between the fraud and non-fraud cases. Specifically, the researchers found the "sweet spot" for identifying Medicare fraud to be a 90:10 distribution of normal vs. fraudulent data.

"There are so many intricacies involved in determining what is fraud and what is not fraud such as clerical error," said Bauder. "Our goal is to enable machine learners to cull through all of this data and flag anything suspicious. Then, we can alert investigators and auditors who will only have to focus on 50 cases instead of 500 cases or more."

This detection method also has applications for other types of fraud including insurance and banking and finance. The researchers are currently adding other Medicare-related data sources such as Medicare Part D, using more data sampling methods for class imbalance, and testing other feature selection and engineering approaches.

"Given the importance of Medicare, which insures more than 54 million Americans over the age of 65, combating fraud is an essential part in providing them with the quality health care they deserve," said Stella Batalama, Ph.D., dean of FAU's College of Engineering and Computer Science. "The methodology being developed and tested in our college could be a game changer for how we detect Medicare fraud and other fraud in the United States as well as abroad."

Credit: 
Florida Atlantic University

Study sheds light on why a warmer world may equal a wetter Arctic

image: Elizabeth Thomas, UB assistant professor of geology, holds a sediment core -- a cylindrical sample of lakebed mud. Such samples contain organic matter that can be analyzed to learn about a region's past climate. The new study was based on data from a sediment core extracted from the bottom of Sikuiui Lake in western Greenland.

Image: 
Douglas Levere / University at Buffalo

BUFFALO, N.Y. -- The Arctic is warming faster than the rest of the globe, and as it does, it's predicted to get wetter. But why? What mechanisms might drive these changes?

A new study looks to history for answers, examining what happened in the region during a period of warming some 8,000 years ago. The research finds evidence that in this ancient time, western Greenland became more humid, a trend that's often linked to increased precipitation. The study further shows that two different climatic processes may have contributed to this elevated humidity. The processes are:

As the Arctic heats up, sea ice melts, exposing regional waters to sun, air and increased evaporation.

As the planet warms, humidity increases more in regions closer to the equator. This creates an imbalance in global humidity, and eventually, moist air from lower latitudes is drawn into the drier Arctic.

"We used geologic evidence to determine that both of these processes likely contributed to an increase in humidity in western Greenland when the region warmed rapidly 8,000 years ago," says lead researcher Elizabeth Thomas, PhD, assistant professor of geology in the University at Buffalo College of Arts and Sciences. "As such, both processes could be at play again today, contributing to possible future increases in Arctic humidity, and ultimately, precipitation."

"We don't have long or detailed written records of Arctic precipitation, so we don't fully understand how precipitation might increase in response to warming," she says. It's an important area of study, she adds, because, "precipitation in the Arctic has complex interactions with climate, and it also impacts plant communities and affects how fast glaciers may shrink."

The study was published this month in Geophysical Research Letters by a team of scientists from UB, the University of Massachusetts and Northern Arizona University. The research was funded by the National Science Foundation.

Clues in lakebed mud

To learn about the climate history of western Greenland, scientists analyzed lakebed mud dating back thousands of years. This sediment contains organic matter -- such as ancient leaf waxes, and compounds produced by bacteria -- that reveal information about the region's climatic past.

As Thomas explains, when it comes to leaf waxes, weather influences the chemical content of these waxes in ways that scientists can trace. Specifically, leaf waxes contain small amounts of a rare form of hydrogen called deuterium, and the concentration of deuterium can go up or down in response to factors such as humidity and precipitation patterns. (One example: In Arctic leaf waxes, deuterium concentrations fluctuate depending on whether precipitation originated locally or from clouds that traveled long distances from low latitudes to arrive in the region).

Chemicals called branched glycerol dialkyl glycerol tetraethers (GDGTs), produced by bacteria, also hold clues about past climate. The composition of these compounds varies depending on the temperature of the surrounding environment at the time they were produced. As a result, scientists can use branched GDGTs to reconstruct prehistoric temperature trends, Thomas says.

These chemical indicators enabled Thomas' team to investigate ancient humidity and precipitation trends in western Greenland as the region warmed some 8,000 years ago. The new research was based on leaf waxes and branched GDGTs found in a sediment sample that the team extracted from the bottom of Sikuiui Lake in western Greenland.

"These chemical indicators are fairly new tools, and they enable us to research ancient climate in ways that were not possible before," Thomas says. "We can use these tools to investigate how humidity fluctuated in a region thousands of years ago, or whether storms in an area originated locally or far away. This is important because understanding what happened in ancient times can provide us with insight into what might happen today as the climate changes."

Credit: 
University at Buffalo

Truck driver pain and discomfort can be alleviated

Almost 60 per cent of truck drivers in a recent Canadian study reported experiencing musculoskeletal disorder (MSD) pain and discomfort on the job, even though it may be preventable.

"Given the fact that MSDs account for nearly one-half of all work-related illnesses and the transportation sector makes up a significant portion of that, understanding the risk factors associated with musculoskeletal disorders is important," said lead author Sonja Senthanar, a doctoral candidate in the School of Public Health and Health Systems. "While the link between trucking and MSDs has been studied in other countries, there is a dearth of research in Canada."

According to the Ontario Ministry of Transportation, truck driving is the second most common occupation in Canada, employing nearly one in 35 men between the ages of 20 and 64.

Public health researchers at the University of Waterloo surveyed 107 male truck drivers passing through two popular highway stops in Southern Ontario and found that 57 per cent had experienced musculoskeletal pain and discomfort, especially low back pain. They found an association between this pain and discomfort and specific risk factors, including organizational safety climate, level of risk associated with the job, exhaustion from work tasks, being married and having higher education levels.

Senthanar said that being married and more educated are presumably associated with pain and discomfort because the presence of a spouse and knowledge gained from education can increase awareness of musculoskeletal symptoms - and therefore rates of reporting.

Co-author Philip Bigelow, a professor in the School of Public Health and Health Systems, said, "Physical exposures such as awkward postures, repetition, lifting, whole body vibration and prolonged sitting, as well as personal factors such as physical fitness and job satisfaction, are known to be associated with the development of MSDs. Since driving a truck involves a variety of these risk factors, programs that address these multiple factors are needed."

Bigelow said that a number of large Canadian carriers have adopted programs that take holistic approaches that include reducing vibration exposures through improved seating, modifying workloads and physical tasks, as well as promoting the overall wellness of drivers by encouraging physical activity and healthy eating.

Researchers at the University of Waterloo are members of a Canadian team of researchers that is engaged with stakeholders in the industry to identify such holistic programs and to evaluate their impacts. They hope that companies with successful programs can act as champions of driver health and wellness to improve working conditions for all truck drivers.

Credit: 
University of Waterloo

Deconstructing the superfood that determines honeybee hierarchy

Katharina Paschinger's father, a conservation chemist in Vienna, was a devoted beekeeper. Paschinger remembers fondly that he would bring royal jelly, an important food for bee larvae, as a gift on visits to her maternal grandmother. "He would feed it to my grandma and tell her it was for long life and beauty," Paschinger said. "And actually, she lived to be 98."

Royal jelly is widely believed to have health benefits, although the medical evidence is scarce (and doctors caution that some people have severe allergic reactions). One thing the substance certainly does is promote caste development in honeybees, causing genetically identical larvae to develop into very different adults. All bee larvae eat royal jelly secreted by worker bees for the first few days of life, but those picked out to be queens continue to eat it until they pupate and beyond, whereas those that will become workers switch to honey and pollen. Biologists believe molecular signals in royal jelly drive larval bees to develop into queens, but the details of that signaling -- including what molecule is most important and how it is recognized -- are not yet clear.

Questions along that line brought Katharina Paschinger, a chemist, to revisit royal jelly this year in research published in the journal Molecular & Cellular Proteomics. Paschinger and colleagues in Iain Wilson's lab at the University of Natural Resources and Life Sciences in Vienna focus on glycoproteins, proteins to which a chain of sugar molecules is attached. These sugar chains, called glycans, can dramatically affect proteins' binding and signaling activities.

Previous studies of royal jelly glycoproteins had mostly found classes of glycans known as oligomannosidic and simple hybrids. As these contain no special recognition elements, they could not explain the unique effect of royal jelly on larval fate. But Paschinger, her colleagues and some other scientists recently began to find more complex glycan structures in several insect species, such as mosquitoes and moths. Their data, Paschinger said, challenged "a really long-held belief that insects only synthesize oligomannosidic glycans. You see these statements everywhere. It's a nightmare to read such simplifications."

The diversity in other insects' glycans was a reason to suspect that royal jelly glycoproteins also had hidden depth. Royal jelly, available in bulk at health food stores, was a good candidate for a combined glycomic and glycoproteomic analysis, said first author Alba Hykollari. "If you have a sample and you want to start with glycomics, the first question is how much you have and how pure is it. We were quite lucky: We got a lot of royal jelly, and it was very pure."

To determine the structure of the glycans in royal jelly, Hykollari used enzymes to isolate the glycans from proteins and added chemical tags. She separated the tagged glycans using liquid chromatography and analyzed them using a mass spectrometer, an instrument that breaks molecules into smaller pieces and separates them by size and charge.

Paschinger analyzed the data to draw conclusions about the glycan structures. First, she compared fragmentation patterns to precursor molecules, making inferences about the glycans' structures from how they broke apart. Then, she suggested specific chemical or enzymatic treatments to test those hypotheses.

Because glycans are modular chains, like Legos, breaking off one unit at a time can give a good idea of how the whole fits together. For example, phosphoethanolamine, a subunit the team observed in royal jelly, blocks digestion by some enzymes, but it can be removed using hydrofluoric acid. If glycan fragments of a certain mass appeared after treatment with hydrofluoric acid, it was a clue that phosphoethanolamine was present.
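The mass-difference reasoning described above can be sketched as a simple check: if a post-treatment mass matches the pre-treatment mass minus the mass of a candidate subunit, that subunit was likely present. This is an illustrative toy, not the study's analysis pipeline; the ~123.008 Da value for phosphoethanolamine and the example masses are assumptions for demonstration:

```python
# Approximate monoisotopic mass added by a phosphoethanolamine modification
# (illustrative assumption, not a value reported in the study)
PE_MASS = 123.008

def infer_modification(mass_before, masses_after, delta=PE_MASS, tol=0.02):
    """True if some post-treatment mass matches mass_before - delta
    within the instrument tolerance tol (in Da)."""
    return any(abs((mass_before - delta) - m) <= tol for m in masses_after)

# Hypothetical glycan at 1825.66 Da; after hydrofluoric acid treatment,
# a species at 1702.65 Da appears -- consistent with loss of one PE unit.
has_pe = infer_modification(1825.66, [1702.65, 1500.10])
```

Real glycomic assignments combine many such treatments and fragmentation patterns, but each individual inference follows this subtract-and-match logic.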

"I would say that the N-glycome of royal jelly was definitely underestimated," said Hykollari. Of the approximately 100 glycan structures the team defined, many had not been observed before in bees. Their laboratory's exclusive focus on glycan biochemistry and their extremely sensitive mass spectrometer helped the research team determine the identity of scarce glycans, said Hykollari. "We have worked (on glycans) for many years, so I would say our workflow is optimized."

Knowing these structures could help future scientists understand the activity of glycosylated proteins in royal jelly--either how they designate larval bees as future queens or how they trip allergic alarms in the human immune system. For example, said Paschinger, a researcher could synthesize a glycan from royal jelly to see how it interacts with signaling proteins in the larva. Their own plans moving forward are to tackle the glycome of another species. "Our driving force is understanding glycoevolution," said Paschinger. "But very often we're also driven by the element of challenge."

The research team dedicated their manuscript to Paschinger's father, the chemist-beekeeper. "I am sure he would have been very happy to see something scientific come out of his beekeeping hobby," said Paschinger.

Credit: 
American Society for Biochemistry and Molecular Biology

Drugs' side effects in lungs 'more widespread than thought'

A systematic review of research has revealed that the toxic effects on the lung of drugs commonly taken to treat a range of common conditions are much more widespread than thought.

Though the 27 drugs, which treat a range of conditions including arthritis, cancer and heart disease, are successful for most patients, the team says doctors need to be more aware of the potential risks to patients' respiratory systems.

The research was carried out by academics at the Universities of Manchester, Leeds, and Sheffield as well as clinicians at NIHR Manchester Biomedical Research Centre, Royal United Hospitals Bath NHS Foundation Trust and Sheffield Teaching Hospitals NHS Foundation Trust and the European Organisation for Research and Treatment of Cancer (EORTC).

The study, which looked at data from 6,200 patients across 156 papers, is published in the Journal of Clinical Medicine.

The team are part of a €24 million project funded by the European Union and the European pharmaceutical industry’s Innovative Medicine Initiative which is developing imaging techniques for the management of drug-induced interstitial lung disease (DIILD). It is co-led by EORTC and Bioxydyn Ltd, a University of Manchester spin-out company.

Though DIILD can cause difficulty breathing, inflammation and fibrosis, the risk sometimes only becomes apparent after the drugs have been in use for some years.

The team notes, however, that clinicians are hindered because most of the papers reviewed were of low or very low quality. Between 4.1 million and 12.4 million cases of DIILD per year were reported worldwide, according to the review.

The review also found that DIILD accounted for around 3–5% of all interstitial lung disease cases.

In some of the studies, mortality rates of over 50% were reported and overall, 25% of all the patients studied died as a result of respiratory symptoms.

Steroids were the most common drug used to treat DIILD, but no studies examined their effect on outcome.

John Waterton, a Professor of Translational Imaging from The University of Manchester, was on the research team. He said: “Though this area is not well researched, we can say that the side effects of drugs on the lung are much more widespread than previously thought.

“We do know it affects a considerable number of people, which is why we want to develop better imaging tests to pick up any lung problems before they become serious.

“It’s important to stress that patients can safely continue to take their medication - but it’s also important that doctors monitor and assess them closely for side effects in the lung.”

On the team is also Dr Nazia Chaudhuri, honorary senior lecturer at The University of Manchester and a consultant physician at Wythenshawe Hospital, part of Manchester University NHS Foundation Trust, who has a specialist interest in interstitial lung disease.

She said: “Doctors need to be aware and vigilant to the possible lung toxicities and harm that can be caused by some drugs. With newer drugs coming on the market this is an increasing yet under-recognised problem, and we need better ways of detecting these side effects before they cause harm.”

Credit: 
University of Manchester

Curcumin has no benefit in reducing inflammation

A study of oral curcumin, the active medicinal ingredient in turmeric, showed no benefit in preventing inflammation and complications in patients undergoing elective surgery for aortic aneurysm repair, according to a large randomized controlled trial in CMAJ (Canadian Medical Association Journal).

"Turmeric has been used for thousands of years in Indian and Chinese medicine, and curcumin continues to gain popularity today as a natural health supplement," writes Dr. Amit Garg, Department of Medicine, Western University, and Lawson Health Research Institute, London, Ontario, with coauthors. "In this randomized trial, the largest to date, perioperative oral curcumin did not ameliorate the complications of elective abdominal aortic aneurysm repair."

Despite the increasing popularity of curcumin, and many animal studies showing benefit, few rigorous clinical trials have looked at its effects in humans. One single-centre study found that curcumin was associated with lower biomarkers for inflammation after coronary bypass surgery. In contrast, this study enrolled five times the number of patients at 10 hospitals for a different type of procedure to test the hypothesis that curcumin reduces inflammation and improves outcomes of surgery.

Researchers included 606 patients scheduled for elective surgery for abdominal aortic aneurysm repair at 10 Canadian hospitals. Participants were randomized to receive high-dose oral curcumin (2000 mg twice a day over four days) or placebo before surgery. Study results showed no positive effect of curcumin on inflammation compared with placebo, and, in secondary analyses, there was an increased risk of post-surgical kidney damage in patients in the curcumin group.

"Our findings emphasize the importance of testing turmeric and curcumin in rigorous human clinical trials before espousing any health benefits, as is currently done in the popular media," caution the authors.

Credit: 
Canadian Medical Association Journal

Hidden costs of disease to greater Yellowstone elk

image: Antlerless elk (Cervus canadensis) at a winter feedground in Wyoming.

Image: 
Mark Gocke/Wyoming Game and Fish Department

For decades researchers have known that a bacterial disease in elk, bison and cattle in the Greater Yellowstone Ecosystem causes periodic abortions in these animals and chronic illness in humans drinking infected cow's milk. The disease, called brucellosis, poses a financial concern for dairy producers and cattle ranchers, but its effects on the wild elk population have generally been considered minor.

In recent years, however, elk pregnancy rates have become the subject of controversy. Various researchers claim that stress caused by fear of wolves and nutritional deficiencies caused by drought can explain low pregnancy rates in specific elk herds, but until now the effects of brucellosis on elk pregnancy have not been scrutinized.

Utah State University researchers Gavin Cotterill and Johan du Toit report that by mid-winter, elk that test positive for brucellosis are less likely to be pregnant than healthy elk, independent of the abortions caused by the disease later in the year. Cotterill and du Toit, along with colleagues from the Wyoming Game and Fish Department, the US Geological Survey Northern Rocky Mountain Science Center and the University of California, Berkeley, discuss their findings in Ecology and Evolution, published 28 October 2018 (DOI: 10.1002/ece3.4521).

"Mid-winter elk pregnancy rates are often seen as an indicator of an elk herd's health and viability," says Cotterill, a PhD candidate at USU and lead author of the paper. "If we're interested in figuring out the effect that predators or climate are having on elk, we need to also account for disease."

The researchers analyzed pregnancy and disease data collected over the last 20 years at Wyoming's supplemental winter feed-grounds for elk and ran additional pregnancy tests on stored blood samples.

"We found that the disease causes a substantial decline in the probability of pregnancy among young adult elk and the effect is weaker in older animals, but it's still unclear what the mechanism is that's causing this to happen," said du Toit.
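
The age-dependent pattern du Toit describes can be illustrated with a toy calculation; the capture counts below are invented for illustration and are not from the study:

```python
# Hypothetical mid-winter capture records: (pregnant, total) cows sampled,
# by age class and brucellosis serostatus. All counts are invented.
counts = {
    ("young adult", "seronegative"): (85, 100),
    ("young adult", "seropositive"): (60, 100),
    ("older adult", "seronegative"): (80, 100),
    ("older adult", "seropositive"): (74, 100),
}

def pregnancy_rate(pregnant: int, total: int) -> float:
    """Simple proportion pregnant among sampled cows."""
    return pregnant / total

for age in ("young adult", "older adult"):
    neg = pregnancy_rate(*counts[(age, "seronegative")])
    pos = pregnancy_rate(*counts[(age, "seropositive")])
    print(f"{age}: {neg:.2f} seronegative vs {pos:.2f} seropositive, "
          f"drop = {neg - pos:.2f}")
```

With these invented numbers the disease-associated drop is larger for young adults (0.25) than for older animals (0.06), mirroring the age-dependent effect the researchers report.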

The results are not a cause for alarm but are a signal that this disease, which came to the area in infected cattle, has previously hidden consequences for the wild elk population. Quantifying this effect is important, particularly as brucellosis is spreading through the elk population, so that the effects of other factors influencing elk reproduction may be estimated more accurately.

"Elk numbers in most of the region are high, and we don't expect that to change because of brucellosis," said Cotterill. "It's one more factor that researchers and managers need to keep in mind moving forward."

Credit: 
S.J. & Jessie E. Quinney College of Natural Resources, Utah State University

Consequences-focused cognitive training may promote healthier habits

Interventions aimed at reducing unhealthy behaviors often focus on retraining people's mental associations, but a series of studies suggests that showing people the consequences of the behaviors may be more effective. The findings are published in Psychological Science, a journal of the Association for Psychological Science.

The research specifically focuses on "approach-avoidance" training in which participants learn to approach some targets (e.g., nutritious foods) and avoid others (e.g., junk foods). Repeated exposure to these pairings is supposed to reinforce their mental associations in ways that increase positive behaviors and reduce negative behaviors. However, studies have not consistently shown an effect of this type of training.

Psychological scientist Pieter Van Dessel and colleagues hypothesized that this training may work when it actually alters people's beliefs about the consequences that follow when they approach or avoid certain foods.

"Our findings show that targeting these adaptive inferences can be effective in changing unhealthy eating behavior," says Van Dessel, a researcher at Ghent University. "This is important because it is often difficult to change these types of automatic behavior."

Van Dessel and coauthors Sean Hughes and Jan De Houwer tested their hypothesis in three online studies and one lab-based study, with a combined total of 1,547 participants. In all four studies, the participants completed a series of computer-based trials in which they saw a digital avatar standing near an open refrigerator. In each trial, the refrigerator contained a particular food and a color cue indicated whether the participants should move the avatar toward the food or away from it.

Some participants also saw a bar that indicated the avatar's health, which reflected the consequences of the participants' decisions. If they chose to approach the healthy food (or avoid the unhealthy food), the avatar's health bar filled up and the avatar appeared healthier while exclaiming "I feel healthy." Choosing to approach the unhealthy food (or avoid the healthy food), on the other hand, depleted the health bar and the avatar appeared less healthy, exclaiming "I feel sick."

Another group of participants saw these outcomes and were explicitly told to try to make their avatar as healthy as possible.

The results were consistent across the studies. Participants who had the explicit mission of maximizing their avatar's health showed the most positive automatic evaluations of the healthy food, gave it the most positive ratings, and were most likely to choose a coupon for it. They outperformed both the participants who saw the consequences of their choices but had no goal and those who never saw the consequences and only performed the typical approach-avoidance training.

Participants who were given a health-related goal also seemed to internalize the relationship between the foods and their consequences - compared with the other groups, they were most likely to approach the healthy food and avoid the unhealthy food when they were given free rein to choose foods without any consequences.

These effects also extended to actual eating behavior. In one version of the experiment, participants followed the avatar task with a supposedly unrelated task that involved rating the sensory characteristics of snack foods, such as candy and potato chips. The results showed that participants who had worked to maximize their avatar's health in the first task ate smaller quantities of the snacks compared with their peers.

In another experiment, participants who had the goal of boosting their avatar's health reported less unhealthy eating the next day and a greater intention to eat healthy foods.

Van Dessel was intrigued to find that the results of the inference training were so robust:

"After one instance of inference training, on one occasion, in one specific context, we found effects on participants' actual snacking behavior and self-reported food consumption a day after the training," he says. "This is striking because this quick training needs to go against an entire learning history of many years, in which people might have learned that eating unhealthy foods has positive effects for them."

The findings suggest that when trainees have a goal that requires that they learn the consequences of certain behaviors, it enhances the overall effectiveness of approach-avoidance training. Future research will help to illuminate whether changing certain aspects of the task, such as making it more personally relevant or including more trials, will strengthen the effects.

The researchers are now expanding on these findings, investigating whether consequence-based training can help to reduce other unwanted behaviors (such as smoking and alcohol use), and even increase certain positive behaviors (such as environmentally friendly behaviors).

Credit: 
Association for Psychological Science

As Canadian oil exports increase, research explores effects of crude oil on native salmon

New Orleans (October 28, 2018)--Oil spills spell disaster for affected wildlife, leading to a number of detrimental outcomes, including suffocation, poisoning and longer-term problems related to exposure to crude oil and its components. New research out of the University of Guelph in Canada takes a closer look at the potential effects on regional salmon populations as Canada eyes expansion of its crude oil export capacity. The findings will be presented today at the American Physiological Society's (APS) Comparative Physiology: Complexity and Integration conference in New Orleans.

"Crude oils are a complex mixture of chemicals and represent a pervasive environmental stressor. Canada sits on the world's third largest crude oil reserve, found as bitumen in the Athabasca oil sands. Ninety-eight percent of Canada's oil comes from the oil sands, and 99 percent of our exports go to the U.S.," said study author Sarah Alderman, PhD. "As plans to bolster the export capacity of this resource intensify, so too do concerns for the added risk of spills and environmental contamination." New pipeline projects, including the Trans Mountain Expansion Project, have the potential to increase diluted bitumen shipped through salmon habitat to seaports on the West Coast.

In the lab, Alderman's research explores the effects of crude oil exposure on anadromous salmon in collaboration with scientists at Simon Fraser University and the University of British Columbia. "These fish spend parts of their life in both freshwater and the ocean, and our research is critical for finding out whether exposure leads to changes in the physiology or performance of the salmon that would impair their ability to move between these environments," Alderman said. So far, her research has shown that crude oil exposure seems to be toxic to the fishes' hearts, causing molecular- and tissue-level changes that could impair the fishes' ability to migrate successfully between freshwater and the ocean, as well as their ability to acclimate to saltwater. The ability of salmon to migrate--from fresh water at birth to salt water, where they grow to adulthood, and back to fresh water for spawning--is natural and necessary throughout the course of their life and reproductive cycle.

Alderman found that crude oil exposure early in fish development can lead to long-term consequences, including mortality months after fish are removed to uncontaminated water and brain changes that are apparent for nearly a year after exposure. The research also revealed changes to plasma proteins that signal damage to tissues and biomarkers that could be used to test whether an animal has been exposed to crude oil.

Credit: 
American Physiological Society

NASA's IMERG reveals Hurricane Willa's rainfall

video: A rainfall accumulation analysis was generated by totaling Integrated Multi-satellitE Retrievals for GPM (IMERG) data for the period from October 20 to 26, 2018. Hurricane Willa's approximate 0000Z and 1200Z locations are shown on this analysis. Willa produced rainfall totals greater than 20 inches (508 mm) in the Pacific Ocean off Mexico's coast. IMERG data also indicated that rainfall accumulations of over 15 inches (381 mm) occurred in parts of Mexico and Southeastern Texas.

Image: 
NASA/JAXA, Hal Pierce

NASA uses satellite data to calculate the amount of rainfall generated by tropical cyclones, and applied that capability to the Eastern Pacific Ocean's Hurricane Willa.

Tropical Depression 24E formed on October 20, 2018, and strengthened into Tropical Storm Willa later that day. The storm then rapidly intensified, becoming a Category 5 hurricane with winds of over 161 mph (140 knots) on October 22. Willa had weakened to Category 3 intensity when it made landfall in Sinaloa, Mexico, on October 24, 2018. Moisture streaming from Willa's remnants added to the soaking of the already waterlogged state of Texas, and contributed to storms over the Southeast and to the developing nor'easter moving over the East Coast.

At NASA's Goddard Space Flight Center in Greenbelt, Maryland a rainfall accumulation analysis was generated by totaling Integrated Multi-satellitE Retrievals for GPM (IMERG) data for the period from October 20 to 26, 2018. The IMERG rainfall accumulation data indicated that Willa produced rainfall totals greater than 20 inches (508 mm) in the Pacific Ocean off Mexico's coast. IMERG data also indicated that rainfall accumulations of over 15 inches (381 mm) occurred in parts of Mexico and Southeastern Texas.

IMERG data are generated every half hour by NASA's Precipitation Processing System by merging data from the satellites in the GPM constellation and calibrating those data with measurements from the GPM Core Observatory as well as rain gauge networks around the world. Full coverage is calculated over latitudes from 60 degrees north to 60 degrees south, with the remaining areas of the globe partially covered.
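
The accumulation analysis described above boils down to integrating half-hourly rain-rate fields over time. Below is a minimal sketch of that totaling step for a single grid cell, using invented numbers in place of real IMERG data; this is not NASA's actual processing code:

```python
# Toy stand-in for IMERG half-hourly precipitation rates (mm/hr) at one
# grid cell; real IMERG products are global gridded files, and these
# values are invented for illustration.
half_hourly_rates = [10.0, 25.4, 50.8, 0.0, 12.7] * 8  # 40 half-hour steps

# Each half-hourly rate applies for 0.5 h, so the accumulation is the
# time integral: sum(rate * 0.5 h).
total_mm = sum(rate * 0.5 for rate in half_hourly_rates)
total_inches = total_mm / 25.4  # 1 inch = 25.4 mm

print(f"accumulated rainfall: {total_mm:.1f} mm ({total_inches:.1f} in)")
# → accumulated rainfall: 395.6 mm (15.6 in)
```

The real analysis repeats this sum at every grid point over the October 20-26 window to produce the accumulation map.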

The Integrated Multi-satellitE Retrievals for GPM (IMERG) creates a merged precipitation product from the GPM constellation of satellites. These satellites include DMSPs from the U.S. Department of Defense, GCOM-W from the Japan Aerospace Exploration Agency (JAXA), Megha-Tropiques from the Centre National d'Études Spatiales (CNES) and Indian Space Research Organization (ISRO), the NOAA series from the National Oceanic and Atmospheric Administration (NOAA), Suomi-NPP from NOAA-NASA, and MetOps from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). All of the instruments (radiometers) onboard the constellation partners are intercalibrated with information from the GPM Core Observatory's GPM Microwave Imager (GMI) and Dual-frequency Precipitation Radar (DPR).

GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

Credit: 
NASA/Goddard Space Flight Center

NASA's Aqua Satellite tracks super Typhoon Yutu's oblong eye

image: On Oct. 25, 2018 at 1:30 p.m. CHST (local time, Guam) the MODIS instrument aboard Aqua captured a visible image of Typhoon Yutu with a 19-nautical-mile oblong eye.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

Visible satellite imagery from NASA's Aqua satellite captured powerful Super Typhoon Yutu as it moved through the Philippine Sea. Yutu's eye appeared oblong on satellite imagery.

On Oct. 25 at 1:30 p.m. CHST (local time, Guam), the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument aboard Aqua captured a visible image of Yutu. The MODIS image showed that the large typhoon remained symmetric, with an open, oblong eye about 19 nautical miles wide surrounded by very thick bands of powerful thunderstorms circling the center. At the time of the image, Yutu had passed and cleared Guam.

In satellite imagery on Oct. 26, the eye had become cloud-filled. Infrared satellite imagery revealed cooler cloud tops (indicating the uplift of air had strengthened) and a contracting eye occurring after the storm experienced an eyewall replacement cycle.

Mature, intense tropical cyclones can and often do undergo an eyewall replacement cycle, in which a new eyewall, or ring of thunderstorms, forms within the outer rain bands, farther from the storm's center than the original eyewall. That outer ring then begins to choke off the original eyewall, starving it of moisture and momentum. Eventually, if the cycle completes, the original eyewall dissipates and the new outer eyewall contracts to replace it. The storm's intensity can fluctuate over this period, initially weakening as the inner eyewall dies and then strengthening again as the outer eyewall contracts.

The National Weather Service (NWS) in Tiyan, Guam continued to issue a small craft advisory on Oct. 26. The NWS noted for Marianas Waters "Winds and seas will remain elevated, then slowly subside over the weekend as Super Typhoon Yutu moves away from the Marianas. A Small Craft Advisory remains in effect through Saturday afternoon."

At 11 a.m. EDT (1500 UTC) on Oct. 26, the Joint Typhoon Warning Center or JTWC noted that Yutu had maximum sustained winds near 135 knots (155.4 mph/250 kph).
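
The wind-speed conversions quoted here follow from the definition of a knot: one nautical mile (1.852 km) per hour. A quick check of the JTWC figures, using standard conversion factors:

```python
KNOT_TO_KPH = 1.852             # one nautical mile (1.852 km) per hour
KNOT_TO_MPH = 1.852 / 1.609344  # via the statute mile (1.609344 km)

knots = 135
mph = knots * KNOT_TO_MPH
kph = knots * KNOT_TO_KPH
print(f"{knots} knots = {mph:.1f} mph = {kph:.0f} kph")
# → 135 knots = 155.4 mph = 250 kph
```

The result matches the advisory's 155.4 mph / 250 kph figures.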

Super Typhoon Yutu was the equivalent of a Category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale. Yutu was centered near 17.3 degrees north latitude and 136.9 degrees east longitude, about 465 nautical miles west-northwest of Navsta, Guam, and was moving toward the west-northwest.

The JTWC forecast carries Yutu west toward the northern Philippines through Oct. 31. Yutu is expected to maintain typhoon strength through that time.

Credit: 
NASA/Goddard Space Flight Center

Proinflammatory diet linked to higher risk of kidney disease progression

San Diego, CA (October 26, 2018) -- Diets that contribute to inflammation were linked with a higher risk of chronic kidney disease (CKD) progression in a study that will be presented at ASN Kidney Week 2018, October 23-28, at the San Diego Convention Center.

CKD progression can be accompanied by chronic inflammation. To examine whether pro-inflammatory diets might increase the risk of CKD progression, Tanushree Banerjee, PhD (University of California, San Francisco) and her colleagues studied a national sample of 1,084 adults with CKD, 11.1% of whom developed kidney failure over 14 years of follow-up.

The investigators found that individuals with pro-inflammatory diets had a higher risk of developing kidney failure. "These findings have implications for the prevention of kidney failure using dietary approaches with low inflammatory potential," said Dr. Banerjee. "Nutritional interventions that focus on reducing the inflammatory aspects of diet should be tested for halting the progression of CKD."

Foods that have been positively related to concentrations of inflammatory markers include tomatoes; carbonated beverages; vegetables other than green leafy and dark yellow vegetables; and processed meat, red meat, organ meat, and fish other than dark-meat fish.

Credit: 
American Society of Nephrology

New composite material that can cool itself down under extreme temperatures

A cutting-edge, nature-inspired material that can regulate its own temperature, and that could equally be used to treat burns and to help space capsules withstand atmospheric forces, is under development at the University of Nottingham.

The research paper, Temperature-dependent polymer absorber as a switchable state NIR reactor, is published in the journal Scientific Reports today (Friday 26 October).

"A major challenge in material science is to work out how to regulate man-made material temperature as the human body can do in relation to its environment," explains lead author Dr Mark Alston, Assistant Professor in Environmental Design, from the Faculty of Engineering.

The research used a network of multiple microchannels with actively flowing fluids (fluidics) as a method and proof of concept to develop a thermally-functional material made of a synthetic polymer. The material is enhanced with precise control measures that can switch conductive states to manage its own temperature in relation to its environment.

"This bio-inspired engineering approach advances the structural assembly of polymers for use in advanced materials. Nature uses fluidics to regulate and manage temperature in mammals, and in plants to absorb solar radiation through photosynthesis, and this research used a leaf-like model to mimic this function in the polymer."

Dr Alston adds: "This approach will result in an advanced material that can absorb high solar radiation, as the human body can do, to cool itself autonomously whatever the environment it is placed in. A thermally-functional material could be used as a heat regulation system for burn injuries to cool skin surface temperature and monitor and improve healing."

This kind of heat flow management could also prove invaluable in space flight where high solar loads can cause thermal stresses on the structural integrity of space capsules.

Regulating the temperature of the vehicle's structural material would not only improve its structural properties but could also generate useful power: the thermal energy could be removed from the recirculated fluid system and stored in a reservoir tank on board the capsule. Once captured, the energy could be converted into electrical energy or used to heat water for the crew.

The experimental side of this research is laboratory-based and has been developed in collaboration with a UK Government research institute, the Science and Technology Facilities Council (STFC). The next steps for the research are to secure funding for a demonstrator scale-up to present to aerospace manufacturers and to identify an industrial partner.

Credit: 
University of Nottingham

A black bear playbook: Conservationists predict bear/human conflict hot-spots in new study

image: A new study can help wildlife managers mitigate conflict as bears expand their ranges

Image: 
Jon Beckmann/WCS

NEW YORK (October 26, 2018) - A new study by WCS, the American Museum of Natural History, and other partners uses long-term data on bear mortality to map high-probability hot-spots for human-bear conflicts. The authors say this is a critical tool for wildlife managers seeking to reduce mortality of bears as they recolonize their former range in the Great Basin and in other parts of the country.

The study, which appears in the latest issue of Global Ecology and Conservation, represents the latest information from the first and longest running effort to understand the impact of an increasing human footprint on American black bears at the wildland-urban interface.

The authors looked at expanding bear populations in the Lake Tahoe Basin and Western Great Basin Desert in Nevada, examining 382 bear deaths between 1997 and 2013. They found that the largest causes of mortality were vehicle collisions (160) and management removal of bears (132) due to animals breaking into homes, causing property damage, or otherwise threatening human safety.

The authors say that by understanding the causes and consequences of mortality on bears using long-term data, wildlife managers will be able to reduce bear deaths in urban interface areas where bears could otherwise be killed more frequently.

Said the study's co-author Jon Beckmann, Science Director of WCS's Rocky Mountain West Program: "Ultimately the goal of conservation is to have more individuals of species like bears and other carnivores on landscapes like we have accomplished in the Great Basin, but we then have to understand how to limit their mortality that results from conflicts with humans."

Lead author Rae Wynn-Grant from the Museum's Center for Biodiversity and Conservation suggests: "This approach to understanding local drivers of bear mortality can be replicated in other areas where human influence varies across the landscape. We were surprised to find that subtle indicators of human activity were important drivers of bear mortality risk, an important finding for wildlife recovery efforts."

WCS and the Nevada Department of Wildlife (NDOW) currently use this information to try to reduce mortality of bears in areas where bears already occur, and to predict hot-spots of human-bear conflict as bears recolonize their historical range in the state of Nevada. In the Great Basin, bears are returning after an absence of 80-plus years, thanks to recovering habitats and WCS/NDOW conservation efforts such as the Bear Aware Campaign, changing regulations, policies prohibiting the feeding of wildlife, and ordinances requiring bear-proof dumpsters in many regions of western Nevada. These efforts have led to the bear population expanding in number and geographic range over the past several years, with bears now showing up in areas of central and eastern Nevada.

The authors say that these conservation successes can be a model for other places, including in New York and New Jersey, and areas throughout the U.S. that are dealing with increasing human-bear conflicts at the wildland-urban interface. In addition, the authors say there are lessons for other regions of the globe where large carnivores occur in these wildland-urban interface areas such as lions at the wildland-urban interface of Nairobi, Kenya.

Credit: 
Wildlife Conservation Society