UMD study finds exercise benefits brains, changes blood flow in older adults

image: 'We are seeing that exercise can impact biomarkers of brain function in a way that might protect people by preventing or postponing the onset of dementia.' -- University of Maryland Associate Professor J. Carson Smith

Image: 
John Consoli, University of Maryland

Exercise training alters brain blood flow and improves cognitive performance in older adults, though not in the way you might think. A new study published by University of Maryland School of Public Health researchers in the Journal of Alzheimer's Disease showed that exercise was associated with improved brain function in a group of adults diagnosed with mild cognitive impairment (MCI), along with decreased blood flow in key brain regions.

"A reduction in blood flow may seem a little contrary to what you would assume happens after going on an exercise program," explained Dr. J. Carson Smith, associate professor in the Department of Kinesiology. "But after 12-weeks of exercise, adults with MCI experienced decreases in cerebral blood flow. They simultaneously improved significantly in their scores on cognitive tests."

Dr. Smith explains that for those beginning to experience subtle memory loss, the brain is in "crisis mode" and may try to compensate for the inability to function optimally by increasing cerebral blood flow. While elevated cerebral blood flow is usually considered beneficial to brain function, there is evidence to suggest it may actually be a harbinger of further memory loss in those diagnosed with MCI. The results of the study by Dr. Smith and his team suggest exercise may have the potential to reduce this compensatory blood flow and improve cognitive efficiency in those in the very early stages of Alzheimer's disease.

A control group of cognitively healthy older adults without mild cognitive impairment also underwent the exercise training program, consisting of four 30-minute sessions of moderate-intensity treadmill walking per week. But the program yielded different responses from each group.

Unlike in the group with MCI, where exercise training decreased cerebral blood flow, in the healthy group exercise training increased cerebral blood flow in the frontal cortex after 12 weeks. Their performance on the cognitive tests also significantly improved, as was observed in the MCI group.

For this study, changes in cerebral blood flow were measured in specific brain regions that are known to be involved in the pathogenesis of Alzheimer's disease, including the insula (involved in perception, motor control, self-awareness and cognitive functioning), the anterior cingulate cortex (involved in decision making, anticipation, impulse control and emotion) and the inferior frontal gyrus (involved in language processing and speech).

Specifically, among those with MCI, decreased cerebral blood flow in the left insula and in the left anterior cingulate cortex was strongly correlated with improved performance on a word association test used to measure memory and cognitive health.

A previous publication from this study led by Dr. Smith focused on how the exercise intervention influenced changes in the brain's neural networks known to be associated with memory loss and amyloid accumulation, which are both signs of MCI and Alzheimer's.

"Our findings provide evidence that exercise can improve brain function in people who already have cognitive decline," Dr. Smith said optimistically. "We have an interest in targeting people who are at increased risk for developing Alzheimer's earlier in the disease process. We are seeing that exercise can impact biomarkers of brain function in a way that might protect people by preventing or postponing the onset of dementia."

Credit: 
University of Maryland

Climate change and infertility -- a ticking time bomb?

image: Clockwise from top left: broadcast-spawning fish such as carp; small ectothermic insects including pollinating bees; endemic animals with limited latitudinal or elevation ranges such as the flightless cormorant; disease vectors including mosquitoes; coral species that are important to highly diverse reefs; and endemic plant species including the Scottish primrose.

Image: 
Joaquim Alves Gaspar, Charles Sharp, Toby Hudson, and David Glass.

Rising temperatures could make some species sterile and see them succumb to the effects of climate change earlier than currently thought, scientists at the University of Liverpool warn.

"There is a risk that we are underestimating the impact of climate change on species survival because we have focused on the temperatures that are lethal to organisms, rather than the temperatures at which organisms can no longer breed," explains evolutionary biologist Dr Tom Price from the University's Institute of Integrative Biology.

Currently, biologists and conservationists are trying to predict where species will be lost due to climate change, so they can build suitable reserves in the locations they will eventually need to move to. However, most of the data on when temperatures will prevent species from surviving in an area is based on the 'critical thermal limit' or CTL - the temperature at which they collapse, stop moving or die.

In a new opinion article published in Trends in Ecology and Evolution, the researchers highlight that extensive data from a wide variety of plants and animals suggests that organisms lose fertility at lower temperatures than their CTL.

Certain groups are thought to be most vulnerable to climate-induced fertility loss, including cold-blooded animals and aquatic species. "Currently the information we have suggests this will be a serious issue for many organisms. But which ones are most at risk? Are fertility losses going to be enough to wipe out populations, or can just a few fertile individuals keep populations going? At the moment, we just don't know. We need more data," says Dr Price.

To help address this, the researchers propose another measure of how organisms function at extreme temperatures that focuses on fertility, which they have called the Thermal Fertility Limit or 'TFL'.

"We think that if biologists study TFLs as well as CTLs then we will be able to work out whether fertility losses due to climate change are something to worry about, which organisms are particularly vulnerable to these thermal fertility losses, and how to design conservation programmes that will allow species to survive our changing climate.

"We need researchers across the world, working in very different systems, from fish, to coral, to flowers, to mammals and flies, to find a way to measure how temperature impacts fertility in that organism and compare it to estimates of the temperature at which they die or stop functioning," urges Dr Price.

The work was carried out in collaboration with scientists from the University of Leeds, University of Melbourne and Stockholm University and was funded by the UK Natural Environment Research Council (NERC).

Credit: 
University of Liverpool

Superinsulators to become scientists' quark playgrounds

image: This image shows a 3D superinsulator, in which vortex condensate (green lines) squeezes the electric field lines connecting charge-anticharge pairs (red and blue balls) into the electric strings (orange strips). These strings tightly bind these charge-anticharge pairs, completely immobilizing them, so electric current cannot be produced.

Image: 
Argonne National Laboratory

Scientists widely accept the existence of quarks, the fundamental particles that make up protons and neutrons. But information about them is still elusive, since their interaction is so strong that their direct detection is impossible and exploring their properties indirectly often requires extremely expensive particle colliders and collaborations between thousands of researchers. So, quarks remain conceptually foreign and strange like the Cheshire cat in “Alice's Adventures in Wonderland,” whose grin is detectable — but not its body.

An international group of scientists that includes materials scientist Valerii Vinokur from the U.S. Department of Energy’s (DOE) Argonne National Laboratory has developed a new method for exploring these fundamental particles that exploits an analogy between the behavior of quarks in high-energy physics and that of electrons in condensed-matter physics. This discovery will help scientists formulate and conduct experiments that could provide conclusive evidence for quark confinement, asymptotic freedom, and other phenomena, such as whether superinsulators can exist in both two and three dimensions.

Vinokur, working with Maria Cristina Diamantini from the University of Perugia in Italy and Carlo Trugenberger from SwissScientific Technologies in Switzerland, devised a theory around a new state of matter called a superinsulator, in which electrons display some of the same properties as quarks.

The electrons, they determined, share two important properties that govern quark interactions: confinement and asymptotic freedom. Confinement is the mechanism that binds quarks together into composite particles. Unlike electrically charged particles, quarks cannot be separated from each other. As distance between them increases, their pull only becomes stronger.

“This is not our everyday experience,” said Vinokur. “When you pull magnets apart, it becomes easier as they’re separated, but the opposite is true of quarks. They resist fiercely.”

Quark interactions are also characterized by asymptotic freedom, where quarks at close distance stop interacting altogether. Once they travel a certain distance away from each other, a nuclear force tugs them back in. 

In the late 1970s, Nobel laureate Gerard ’t Hooft first explained these two newly theorized properties using an analogy. He imagined a state of matter that is the opposite of a superconductor in that it infinitely resists the flow of charge rather than infinitely conducting it. In a “superinsulator,” as ’t Hooft called this state, pairs of electrons with different spins — Cooper pairs — would bind together in a way that is mathematically identical to quark confinement inside elementary particles.

“The distorted electric field in a superinsulator creates a string that binds the couples of Cooper pairs, and the more you stretch them, the more the couple resists separation,” said Vinokur. “This is the mechanism that binds quarks together into protons and neutrons.”

In 1996, unaware of ’t Hooft’s analogy, Diamantini and Trugenberger — along with colleague Pasquale Sodano — predicted the existence of superinsulators. However, superinsulators remained theoretical until 2008, when an international collaboration led by Argonne investigators discovered them experimentally in films of titanium nitride.

Using their experimental results, they constructed a theory describing superinsulator behavior that eventually led to their recent discovery, which established a Cooper pair analog to both confinement and the asymptotic freedom of quarks, the way ’t Hooft imagined, noted Vinokur.

The theory of superinsulators fleshes out a mental model that high-energy physicists can use to think about quarks, and it offers a powerful laboratory for exploring confinement physics using easily accessible materials.

“Our work suggests that systems smaller than the typical length of the strings that bind the Cooper pairs behave in an interesting way,” said Vinokur. “They move almost freely at this scale because there is not enough room for high-strength forces to develop. This movement is analogous to the free motion of quarks at a small enough scale.”

Vinokur and co-researchers Diamantini, Trugenberger, and Luca Gammaitoni at the University of Perugia are seeking ways to conclusively differentiate between 2D and 3D superinsulators. So far, they have found one — and it has broad significance, challenging conventional notions about how glass forms.

To discover how to synthesize a 2D or 3D superinsulator, researchers need “a full understanding of what makes one material three-dimensional and another two-dimensional,” Vinokur said.

Their new work shows that 3D superinsulators display a critical behavior known as Vogel-Fulcher-Tammann (VFT) when transitioning to a superinsulating state. Superinsulators in 2D, however, display a different behavior: the Berezinskii-Kosterlitz-Thouless transition.

The discovery that VFT is the mechanism behind 3D superinsulators revealed something surprising: VFT transitions, first described nearly a century ago, are responsible for the formation of glass from a liquid. Glass is not crystalline, like ice — it emerges from an amorphous, random arrangement of atoms that rapidly freeze into a solid.

The cause of VFT has remained a mystery since its discovery, but scientists long believed it began with some kind of external disorder. The 3D superinsulators described in Vinokur’s paper challenge this conventional notion and, instead, suggest disorder can evolve from an internal defect in the system. The idea that glasses can be topological — they can alter their intrinsic properties while remaining materially the same — is a new discovery.

“This fundamental breakthrough constitutes a significant step in understanding the origin of irreversibility in nature,” Vinokur said. The next step will be to observe this theoretical behavior in 3D superinsulators.

The study brought together researchers from markedly different disciplines. Vinokur is a condensed matter physicist, while Gammaitoni focuses on quantum thermodynamics. Diamantini and Trugenberger work in quantum field theory.

“It was most remarkable that we came from very disparate fields of physics,” Vinokur said. “Combining our complementary knowledge enabled us to achieve these breakthroughs.”

Results from the Cooper pairs study appear in the paper “Confinement and asymptotic freedom with Cooper pairs,” published on Nov. 7, 2018 in Communications Physics. Work on 3D superinsulator mechanisms is outlined in the paper “Vogel-Fulcher-Tamman criticality of 3D superinsulators,” published in Scientific Reports on Oct. 24, 2018.

Credit: 
DOE/Argonne National Laboratory

New type of genomic screening to produce new medicine

image: Prof. Dr Mikhail Yakimov at the Immanuel Kant Baltic Federal University (IKBFU)

Image: 
Immanuel Kant Baltic Federal University (IKBFU)

Prof. Dr Mikhail Yakimov, a researcher from the Immanuel Kant Baltic Federal University, together with his colleagues from the Heinrich Heine University Düsseldorf, the Norwegian Research Centre NORCE AS, the School of Natural Sciences of CEU San Pablo University and the Institute of Catalysis and Petrochemistry in Madrid (Spain), has conducted a study of universal transaminase enzymes. These enzymes are involved in cellular metabolism and also play a key role in constructing the building blocks of cells.

Many chemical substances used in drugs must share a special property with the most important molecular compounds of the human body. This property, called chirality, arises from molecular symmetry: two compounds can share the same chemical formula yet differ in their handedness, like left and right hands. The human body contains only D-sugars and L-amino acids, and no D-amino acids.

The researchers found that transaminases are enzymes that can synthesize compounds with a specific chirality. Today, there are many ways to detect transaminases of different chemical compositions.

Recently, a new approach to genomic and metagenomic screening has been developed. Using it, the researchers identified 10 genes that encode transaminases.

Credit: 
Immanuel Kant Baltic Federal University

Antireflection coating makes plastic invisible

image: Plastic dome coated with a new antireflection coating (right), and uncoated dome (left).

Image: 
Giebink Lab/Penn State

Antireflection (AR) coatings on plastics have a multitude of practical applications, including glare reduction on eyeglasses, computer monitors and the display on your smartphone when outdoors. Now, researchers at Penn State have developed an AR coating that improves on existing coatings to the extent that it can make transparent plastics, such as Plexiglas, virtually invisible.

"This discovery came about as we were trying to make higher-efficiency solar panels," said Chris Giebink, associate professor of electrical engineering, Penn State. "Our approach involved concentrating light onto small, high-efficiency solar cells using plastic lenses, and we needed to minimize their reflection loss."

They needed an antireflection coating that worked well over the entire solar spectrum and at multiple angles as the sun crossed the sky. They also needed a coating that could stand up to weather over long periods of time outdoors.

"We would have liked to find an off-the-shelf solution, but there wasn't one that met our performance requirements," he said. "So, we started looking for our own solution."

That was a tall order. Although it is comparatively easy to make a coating that will eliminate reflection at a particular wavelength or in a particular direction, one that could fit all their criteria did not exist. For instance, eyeglass AR coatings are targeted to the narrow visible portion of the spectrum. But the solar spectrum is about five times as broad as the visible spectrum, so such a coating would not perform well for a concentrating solar cell system.

Reflections occur when light travels from one medium, such as air, into a second medium, in this case plastic. If the difference in their refractive index, which specifies how fast light travels in a particular material, is large -- air has a refractive index of 1 and plastic 1.5 -- then there will be a lot of reflection. The lowest index for a natural coating material such as magnesium fluoride or Teflon is about 1.3. The refractive index can be graded -- slowly varied -- between 1.3 and 1.5 by blending different materials, but the gap between 1.3 and 1 remains.
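As a back-of-the-envelope illustration (using the standard Fresnel result for light hitting a surface head-on, with the index values quoted above), the fraction of light reflected at a single air-plastic interface is:

    R = \left( \frac{n_1 - n_2}{n_1 + n_2} \right)^2 = \left( \frac{1 - 1.5}{1 + 1.5} \right)^2 = 0.04

That is, about 4 percent of the light is lost at each uncoated surface, which is the loss a graded-index coating aims to eliminate.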

In a paper recently posted online ahead of print in the journal Nano Letters, Giebink and coauthors describe a new process to bridge the gap between Teflon and air. They used a sacrificial molecule to create nanoscale pores in evaporated Teflon, thereby creating a graded index Teflon-air film that fools light into seeing a smooth transition from 1 to 1.5, eliminating essentially all reflections.

"The interesting thing about Teflon, which is a polymer, is when you heat it up in a crucible, the large polymer chains cleave into smaller fragments that are small enough to volatize and send up a vapor flux. When these land on a substrate they can repolymerize and form Teflon," Giebink said.

When the sacrificial molecules are added to the flux, the Teflon will reform around the molecules. Dissolving the sacrificial molecules out leaves a nanoporous film that can be graded by adding more pores.

"We've been interacting with a number of companies that are looking for improved antireflection coatings for plastic, and some of the applications have been surprising," he said. "They range from eliminating glare from the plastic domes that protect security cameras to eliminating stray reflections inside virtual/augmented -reality headsets."

One unexpected application is in high-altitude UAVs, or unmanned aerial vehicles. These are planes with giant wingspans that are coated with solar cells. Used primarily for reconnaissance, these planes rely on sunlight to stay in near-perpetual flight, so a lot of the light they receive arrives at a glancing angle, where reflections are highest. One of the companies that makes these solar cells is exploring the AR coating to see if it can improve the amount of light harvested by a UAV.

Because the technology is compatible with current manufacturing techniques, Giebink believes the coating technology is scalable and widely applicable. At this point, his test samples have stood up to central Pennsylvania weather for two years, with little change in properties. In addition, the coating is also antifogging.

"The coating adheres well to different types of plastics, but not glass," he said. "So, it's not going to be useful for your typical rooftop solar panel with a protective glass cover. But if concentrating photovoltaics make a comeback, a critical part of those is the plastic Fresnel lenses, and we could make a difference there."

Credit: 
Penn State

Road proximity may boost songbird nest success in tropics

image: White-rumped Shamas in Thailand's tropical forests are more successful when they nest near roads--the opposite of the pattern that scientists working in temperate zones have come to expect.

Image: 
Rongrong Angkaew

In the world's temperate regions, proximity to roads usually reduces the reproductive success of birds, thanks to predators that gravitate toward habitat edges. However, the factors affecting bird nest success are much less studied in the tropics--so does this pattern hold true? New research published in The Condor: Ornithological Applications shows that interactions between roads, nesting birds, and their predators may unfold differently in Southeast Asia.

Rongrong Angkaew of King Mongkut's University of Technology Thonburi and her colleagues placed 100 nest boxes for the cavity-nesting White-rumped Shama in the forest interior and 100 near a road at an environmental research station in northeast Thailand. Monitoring nests and radio-tracking 25 fledglings from each site for seven weeks, they found that nest success was 12% higher and post-fledging survival 24% higher at the edge versus the interior--the opposite of the pattern commonly observed in temperate regions.

"There were some special challenges involved in carrying out the field work," says Angkaew. "When we started setting up the nest boxes in the field, we found a lot of tracks and other signs of poachers and illegal hunting, so we had to avoid some parts of the forest edge in order to reduce human disturbance to our nest boxes, which could have affected nestling and fledgling survival rates."

Predators caused 94% of nest failures and 100% of fledgling mortality, and locally important predators of small birds, such as green cat snakes, northern pig-tailed macaques, and raptors, appear to prefer interior forest habitat. Fledglings also preferred to spend time in dense understory habitat, which provides cover from predators and was more available near roads.

Overall, the study's results suggest that the effects of roads on birds' reproductive success depend on local predator ecology--the same rules don't necessarily apply in different biomes. Angkaew and her coauthors hope that more studies like theirs will help identify key nest predators and assess their foraging behaviors in multiple landscapes, in order to determine the best ways to conserve vulnerable bird species in areas affected by human development.

Credit: 
American Ornithological Society Publications Office

Want to squelch fake news? Let the readers take charge

Would you like to rid the internet of false political news stories and misinformation? Then consider using -- yes -- crowdsourcing.

That's right. A new study co-authored by an MIT professor shows that crowdsourced judgments about the quality of news sources may effectively marginalize false news stories and other kinds of online misinformation.

"What we found is that, while there are real disagreements among Democrats and Republicans concerning mainstream news outlets, basically everybody -- Democrats, Republicans, and professional fact-checkers -- agree that the fake and hyperpartisan sites are not to be trusted," says David Rand, an MIT scholar and co-author of a new paper detailing the study's results.

Indeed, using a pair of public-opinion surveys to evaluate 60 news sources, the researchers found that Democrats trusted mainstream media outlets more than Republicans did -- with the exception of Fox News, which Republicans trusted far more than Democrats did. But when it comes to lesser-known sites peddling false information, as well as "hyperpartisan" political websites (the researchers include Breitbart and Daily Kos in this category), both Democrats and Republicans show a similar disregard for such sources. Trust levels for these alternative sites were low overall. For instance, in one survey, when respondents were asked to give a trust rating from 1 to 5 for news outlets, hyperpartisan websites received a trust rating of only 1.8 from both Republicans and Democrats; fake news sites received a trust rating of only 1.7 from Republicans and 1.9 from Democrats. By contrast, mainstream media outlets received a trust rating of 2.9 from Democrats but only 2.3 from Republicans; Fox News, however, received a trust rating of 3.2 from Republicans, compared to 2.4 from Democrats.

The study adds a twist to a high-profile issue. False news stories have proliferated online in recent years, and social media sites such as Facebook have received sharp criticism for giving them visibility. Facebook also faced pushback for a January 2018 plan to let readers rate the quality of online news sources. But the current study suggests such a crowdsourcing approach could work well, if implemented correctly.

"If the goal is to remove really bad content, this actually seems quite promising," Rand says.

The paper, "Fighting misinformation on social media using crowdsourced judgments of news source quality," is being published in Proceedings of the National Academy of Sciences. The authors are Gordon Pennycook of the University of Regina, and Rand, an associate professor in the MIT Sloan School of Management.

To promote, or to squelch?

To perform the study, the researchers conducted two online surveys that had roughly 1,000 participants each, one on Amazon's Mechanical Turk platform, and one via the survey tool Lucid. In each case, respondents were asked to rate their trust in 60 news outlets, about a third of which were high-profile, mainstream sources.

The second survey's participants had demographic characteristics resembling those of the country as a whole -- including partisan affiliation. (The researchers weighted Republicans and Democrats equally in the survey to avoid any perception of bias.) That survey also measured the general audience's evaluations against a set of judgments by professional fact-checkers, to see whether the larger audience's judgments were similar to the opinions of experienced researchers.
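In rough outline (this is only an illustrative sketch, not the authors' actual analysis, and every column name and number below is hypothetical), a politically balanced crowd score can be built by averaging within each party before averaging across parties, and then correlated with the fact-checker ratings:

    import pandas as pd
    from scipy.stats import pearsonr

    # Hypothetical survey data: one row per (party, outlet) rating on a 1-5 scale.
    ratings = pd.DataFrame({
        "outlet": ["A", "A", "B", "B", "C", "C"],
        "party":  ["R", "D", "R", "D", "R", "D"],
        "trust":  [3.2, 2.4, 1.8, 1.9, 2.3, 2.9],
    })

    # Average within each party first, then across parties, so both sides count equally.
    crowd = ratings.groupby(["outlet", "party"])["trust"].mean().groupby("outlet").mean()

    # Hypothetical fact-checker ratings for the same outlets.
    experts = pd.Series({"A": 2.6, "B": 1.5, "C": 3.0})

    r, _ = pearsonr(crowd.sort_index(), experts.sort_index())
    print(f"crowd-expert agreement: r = {r:.2f}")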

But while Democrats and Republicans regarded prominent news outlets differently, that party-based mismatch largely vanished when it came to the other kinds of news sites, where, as Rand says, "By and large we did not find that people were really blinded by their partisanship."

In this vein, Republicans trusted MSNBC more than Breitbart, even though many of them regarded MSNBC as a left-leaning news channel. Meanwhile, Democrats, although they trusted Fox News less than any other mainstream news source, trusted it more than left-leaning hyperpartisan outlets (such as Daily Kos).

Moreover, because the respondents generally distrusted the more marginal websites, there was significant agreement among the general audience and the professional fact-checkers. (As the authors point out, this also challenges claims about fact-checkers having strong political biases themselves.)

That means the crowdsourcing approach could work especially well in marginalizing false news stories -- for instance by building audience judgments into an algorithm ranking stories by quality. Crowdsourcing would probably be less effective, however, if a social media site were trying to build a consensus about the very best news sources and stories.

Where Facebook failed: Familiarity?

If the new study by Rand and Pennycook rehabilitates the idea of crowdsourcing news source judgments, their approach differs from Facebook's stated 2018 plan in one crucial respect. Facebook was only going to let readers who were familiar with a given news source give trust ratings.

But Rand and Pennycook conclude that this method would indeed build bias into the system, because people are more skeptical of news sources they have less familiarity with -- and there is likely good reason why most people are not acquainted with many sites that run fake or hyperpartisan news.

"The people who are familiar with fake news outlets are, by and large, the people who like fake news," Rand says. "Those are not the people that you want to be asking whether they trust it."

Thus, for crowdsourced judgments to be a part of an online ranking algorithm, there might have to be a mechanism for using the judgments of audience members who are unfamiliar with a given source. Or, better yet, Pennycook and Rand suggest, users could be shown sample content from each news outlet before being asked to produce trust ratings.

For his part, Rand acknowledges one limit to the overall generalizability of the study: The dynamics could be different in countries that have more limited traditions of freedom of the press.

"Our results pertain to the U.S., and we don't have any sense of how this will generalize to other countries, where the fake news problem is more serious than it is here," Rand says.

All told, Rand says, he also hopes the study will help people look at America's fake news problem with something less than total despair.

"When people talk about fake news and misinformation, they almost always have very grim conversations about how everything is terrible," Rand says. "But a lot of the work Gord [Pennycook] and I have been doing has turned out to produce a much more optimistic take on things."

Credit: 
Massachusetts Institute of Technology

Mechanism explains breast cancer cell plasticity

image: This is Dr. Chonghui Cheng.

Image: 
Baylor College of Medicine

One of the main obstacles to successfully treating breast cancer is the cells' ability to change in ways that make them resistant to treatment. Understanding the cellular mechanisms that mediate this cancer cell plasticity may lead to improved treatments. Taking a step in that direction, a team led by researchers at Baylor College of Medicine has discovered that breast cancer cells can shift between two forms of the cell surface molecule CD44, CD44s and CD44v. Published in the journal Genes & Development, the study shows that breast cancer cells expressing mainly CD44s have increased metastatic behavior and resistance to therapy, while those expressing CD44v are not associated with these behaviors but do show increased cell proliferation.

"One of the goals of my lab is to better understand the mechanisms that allow breast cancer cells to be remarkably heterogeneous, which is one of the reasons cancer is difficult to treat," said corresponding author Dr. Chonghui Cheng, associate professor at the Lester and Sue Smith Breast Center, of molecular and human genetics and of molecular and cellular biology at Baylor College of Medicine. "In this study, we investigated cancer stem cells, a cell population that has the plasticity to generate cells with different properties, focusing on the cell surface molecule CD44."

CD44 is a well-known marker of cancer stem cells and one that is extensively studied in the Cheng lab. The CD44 gene can produce two different forms of the protein - CD44s and CD44v - via a process called alternative splicing. Cheng and her colleagues investigated whether there was a difference in the two forms of CD44 expressed in human breast cancer cells. They also wanted to know whether the different forms of CD44 contributed differently to the disease.

To answer their questions, Cheng and her colleagues took an unbiased approach. They conducted bioinformatics analyses of breast cancer patient data collected in the Cancer Genome Atlas database.

"Our analyses show that CD44s and CD44v, the two major forms of CD44 generated by alternative splicing, have distinct biological functions in breast cancer," said Cheng, who also is a member of the Dan L Duncan Comprehensive Cancer Center.

"Our findings support that cancer cells can use different forms of CD44 to survive and that they also can switch from one form of CD44 to the other," said Cheng. "Cancer cells expressing high levels of CD44s have properties of cancer stem cells, they tend to be metastatic or recurrent and to survive treatment. But when they switch to CD44v, they have fewer cancer stem cell properties but are engaged in proliferation. Alternative splicing is the mechanism that allows the CD44 proteins to switch."

The researchers envision that by manipulating the levels of the two forms of CD44, it might be possible to change the cancer cell properties in ways that may enhance the cancer's susceptibility to treatment.

"We anticipate that other genes that also undergo alternative splicing could as well contribute to the cells' fate and to the plasticity that generates cancer heterogeneity," Cheng said.

Credit: 
Baylor College of Medicine

Seven core principles can help substance use treatment systems focus on high-level goals

PISCATAWAY, NJ - Building on reviews of existing studies, researchers in Canada have identified the principles that may help improve substance use treatment systems. They have published these seven core principles in an article in the current supplement of the Journal of Studies on Alcohol and Drugs.

Lead study author Brian Rush, Ph.D., Scientist Emeritus at the Centre for Addiction and Mental Health in Toronto, says that, over the last 10 to 15 years, he has often been invited to do a system-wide review and to let those responsible for the overall network of services know how it is performing with respect to substance abuse: Is the system meeting the needs of the population? Are there gaps in the programs being offered? Are they evidence based?

"It's obviously different from a program review, which is for one particular organization," says Rush. "We're looking at the whole community network of services."

The seven principles are meant to serve as a template for system analysis, and they provide such recommendations as "a range of systems supports are needed to support and facilitate the effective delivery of services" and "attention to diversity and social-structural disadvantages is crucial to ensuring effective and equitable system design and service delivery."

Rush highlights two principles that stand out to him as most crucial. The first is the need for collaboration between the different sectors that are stakeholders in the process, including mental health services, addiction medicine doctors, primary care physicians, corrections, education -- all under the umbrella term of "a whole government response" or "whole of society."

"The biggest takeaway here is that the need in the population for alcohol and drug services is so high that the specialized alcohol and drug services alone cannot meet that need," says Rush. "They need a strong collaborative partnership with the other sectors."

He also points to the principle of the population-level approach.

"Many people with alcohol and drug problems have very serious problems, such as opioid or substance abuse combined with schizophrenia or depression, and this is a segment of the population that experiences quite serious challenges," says Rush. "But there's a larger percentage of the population who are just beginning to experience these challenges. If we don't also provide some services to these people, then they may be on a trajectory that's a far more serious and more expensive problem to help resolve. Policymakers funding programs are often responding to a crisis: Who's in the emergency room? Who's taking up hospital beds? But the fewest people at the top of the population health pyramid are the ones who cost the most. They came from somewhere, and they didn't wake up overnight as an addict in the hospital."

Rush says that although treatment system planners and funders are often looking to improve, sometimes it's hard for them to make the big decisions.

"The addiction field, more so than many other areas of health, is dominated by very strong opinions," says Rush. "It's also a marketplace for private enterprise versus public services, as well as a place where people have very strong personal experience with addiction. Depending on who that person is, they can have a very strong influence on policymakers. A principles-based methodology levels the playing field for people advocating for resources and asks them to take a very evidence-based approach in reviewing systems and making recommendations."

Rush says that although the researchers and some of the reviews are based in Canada, the findings are universal.

"We have a lot of experience with the World Health Organization and other important organizations involved in setting substance abuse treatment standards and guidelines, and a large percentage of the research we have drawn upon has been published by U.S. researchers," says Rush. "The principles draw upon our experience in consultation around the world."

Credit: 
Journal of Studies on Alcohol and Drugs

Tachycardia in cancer patients may signal increased mortality risk

Cancer patients who experienced tachycardia within one year of cancer diagnosis had higher mortality rates up to 10 years after diagnosis of tachycardia, according to research presented at the American College of Cardiology's Advancing the Cardiovascular Care of the Oncology Patient conference. The course convenes in Washington on Jan. 25-27, 2019, bringing together top experts in both cardiology and oncology to review new and relevant science in this rapidly evolving field.

Sinus tachycardia is when the heart beats faster than normal while at rest, and it may cause palpitations and discomfort. Beyond its link to cancer treatment, it can lead to complications such as blood clots that cause heart attack or stroke, heart failure, fainting or sudden death. In the study, researchers defined sinus tachycardia as a heart rate over 100 beats per minute (bpm) diagnosed via electrocardiogram.

"Tachycardia is a secondary process to an underlying disease and reflective of significant multi-system organ stress and disease in cancer patients," said Mohamad Hemu, MD, a resident at Rush University Medical Center in Chicago and one of the study authors. "As a result, the most important initial step is to figure out what is causing the tachycardia. Reversible causes like dehydration and infections should be ruled out. Additionally, cardiopulmonary processes such as pulmonary embolism and other arrhythmias must be taken into consideration. Once these and all other causes of tachycardia are ruled out, then it is more likely that sinus tachycardia is a marker of poorer prognosis in these patients."

Researchers analyzed 622 cancer patients, including patients with lung cancer, leukemia, lymphoma or multiple myeloma, treated at Rush University Medical Center from 2008 to 2016. The patients were 60.5 percent women and 76.4 percent white, with an average age of 70 years; 69.4 percent of the cohort was classified with stage 4 cancer and 43 percent had lung cancer. The study included 50 patients with tachycardia and 572 control patients without tachycardia. Patients included in the study had tachycardia at more than three different clinic visits within one year of diagnosis; patients with a history of pulmonary embolism, thyroid dysfunction, ejection fraction less than 50 percent, atrial fibrillation or a heart rate over 180 bpm were excluded.

Researchers assessed mortality after adjusting for age and other characteristics that differed significantly between patients with heart rates above and below 100 bpm; these characteristics included race, albumin, hemoglobin, beta blockers, kidney disease, use of blood thinners, and type of cancer. They also examined mortality adjusting for age and other clinically relevant characteristics, such as race, coronary artery disease, stroke, diabetes, smoking and radiation. Tachycardia was a significant predictor of overall mortality in both models. Of the patients who experienced tachycardia, 62 percent died within 10 years of diagnosis compared to 22.9 percent of the control group.
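The release does not specify the statistical software used; as a minimal sketch of how an adjusted survival model of this kind is typically fit (assuming the Python lifelines library, with entirely hypothetical column names), one might write:

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical cohort table: follow-up time in years, a death indicator,
    # a tachycardia flag, and the adjustment covariates named in the study.
    df = pd.read_csv("cohort.csv")  # columns: time, died, tachycardia, age, albumin, ...

    # Cox proportional hazards model: the coefficient on 'tachycardia' gives its
    # hazard ratio adjusted for the other covariates in the table.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="died")
    cph.print_summary()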

"We are continuously learning about the unique heart disease risks that face cancer patients, and our study shows that tachycardia is a strong prognosticator regardless of cancer type. That's why it is critically important to be co-managing both cancer and heart conditions to ensure patients receive the most effective treatment possible," said senior author Tochi M. Okwuosa, DO, director of the cardio-oncology program at Rush University Medical Center. "However, we need to do more studies to determine whether management of tachycardia in cancer patients will have any effect on survival."

Credit: 
American College of Cardiology

Drier mountains pose a double whammy for cold-adapted amphibians, says SFU study

image: The mountain-dwelling Cascades frog thrives in extreme climatic conditions.

Image: 
SFU

A species of frog endemic to the Pacific Northwest faces a 50 per cent increase in the probability of extinction by the 2080s due to climate change, according to a new study published by SFU researchers in a journal of the Ecological Society of America.

The mountain-dwelling Cascades frog thrives in extreme climatic conditions, ranging from dozens of feet of snow in winter to temperatures in excess of 90°F in summer. Cascades frogs are explosive breeders and their role as predators of flying insects is critical to aquatic and terrestrial ecosystems.

SFU biologist Wendy Palen, along with co-authors Mike Adams of the United States Geological Survey and Maureen Ryan and Amanda Kissel of Conservation Science Partners, set out to understand the effects of climate change on these unique amphibians.

Specifically, they aimed to assess how the warmer and drier temperatures occurring with climate change affect survival in two distinct stages of the frog's life cycle: the aquatic stage, in which the frogs develop as tadpoles in shallow ponds, and the terrestrial stage, in which they live as adults.

During the frogs' aquatic stage, the researchers evaluated whether warmer temperatures would increase food production and result in larger, healthier frogs upon metamorphosis, or whether entire generations of frogs would die in years when warmer, drier winters lead to ponds that dry quickly, stranding tadpoles before metamorphosis.

For the terrestrial stage, they evaluated whether the milder winters of climate change would present a warm welcome and lead to higher survival of adult frogs.

The species has been tracked in Olympic National Park's Sol Duc watershed for approximately 15 years. In fact, Palen, now a professor of biology at SFU, was a graduate student at the University of Washington when she began tagging hundreds of frogs with tiny microchips.

More recently Kissel, a lead scientist at Conservation Science Partners, continued the work by monitoring more than 50 ponds that the frogs use for breeding. She tracked water levels and the timing of metamorphosis to identify how often ponds dried before the frogs could emerge.

The team found that currently, up to a quarter of the tadpoles are stranded and die each year. Applying projections from hydrologists from the universities of Washington and Notre Dame, the researchers predict that nearly 40 per cent of the tadpoles could be lost by the 2080s as a result of dry ponds.

The results from studying the frog's terrestrial stage were even more surprising. Data showed that thinner snowpacks and warmer summer temperatures actually reduced adult survival.

Taking both trends together, the researchers forecast that the Cascades frog will face a 62 per cent risk of extinction by the 2080s.

Kissel says, "This is a worst-case scenario, where a frog that largely occurs inside some of our most protected landscapes will be at high risk of extinction by the end of this century."

The study supports an emerging picture of climate change in the Pacific Northwest where, as a result of warmer temperatures, precipitation will fall more often as rain rather than snow, leading to longer, drier summers with compounding negative consequences for many wildlife species.

Credit: 
Simon Fraser University

HBOT showed improvement in Alzheimer's disease

New Orleans, LA - Dr. Paul Harch, Clinical Professor and Director of Hyperbaric Medicine at LSU Health New Orleans School of Medicine, and Dr. Edward Fogarty, Chairman of Radiology at the University of North Dakota School of Medicine, report the first PET scan-documented case of improvement in brain metabolism in Alzheimer's disease in a patient treated with hyperbaric oxygen therapy (HBOT). The report, published in the current issue of the peer-reviewed journal Medical Gas Research, is available at http://www.medgasres.com/article.asp?issn=2045-9912;year=2018;volume=8;issue=4;spage=181;epage=184;aulast=Harch.

The authors report the case of a 58-year-old female who had experienced five years of cognitive decline, which began accelerating rapidly. Single photon emission computed tomography (SPECT) suggested Alzheimer's disease. The diagnosis was confirmed by 18Fluorodeoxyglucose (18FDG) positron emission tomography (PET) brain imaging, which revealed global and typical metabolic deficits in Alzheimer's.

The patient underwent a total of 40 HBOT treatments - five days a week over 66 days. Each treatment consisted of 1.15 atmosphere absolute/50 minutes total treatment time. After 21 treatments, the patient reported increased energy and level of activity, better mood and ability to perform daily living activities as well as work crossword puzzles. After 40 treatments, she reported increased memory and concentration, sleep, conversation, appetite, ability to use the computer, more good days (5/7) than bad days, resolved anxiety, and decreased disorientation and frustration. Tremor, deep knee bend, tandem gait, and motor speed were also improved. Repeat 18FDG PET imaging one month post-HBOT showed global 6.5-38% improvement in brain metabolism.

"We demonstrated the largest improvement in brain metabolism of any therapy for Alzheimer's disease," notes Dr. Harch. "HBOT in this patient may be the first treatment not only to halt, but temporarily reverse disease progression in Alzheimer's disease."

The report also contains video imaging, including unique rotating PET 3D Surface Reconstructions, which allow the lay person to easily see the improvements in brain function.

"PET imaging is used around the world as a biomarker in oncology and cardiology to assay responses to therapy," says Dr. Fogarty. "We now have an irrefutable biomarker system that this intervention has promise where no other real hope for recovery of dementia has ever existed before."

The physicians report that two months post-HBOT, the patient felt a recurrence in her symptoms. She was retreated over the next 20 months with 56 HBOTs (total 96) at the same dose, supplemental oxygen, and medications with stability of her symptoms and Folstein Mini-Mental Status exam.

According to the National Institutes of Health, "Alzheimer's disease is an irreversible, progressive brain disorder that slowly destroys memory and thinking skills and, eventually, the ability to carry out the simplest tasks. It is the most common cause of dementia in older adults. Alzheimer's disease is currently ranked as the sixth leading cause of death in the United States, but recent estimates indicate that the disorder may rank third, just behind heart disease and cancer, as a cause of death for older people."

The authors note that four pathological processes have been identified and primary treatment is with acetylcholinesterase inhibitors or the N-methyl-D-aspartate receptor antagonist memantine, which have been shown to have a positive impact on symptoms of Alzheimer's disease but no significant disease-modifying effects.

HBOT is an epigenetic modulation of gene expression and suppression to treat wounds and disease pathophysiology, particularly inflammation. HBOT targets all four of the pathological processes of AD by affecting the microcirculation; affecting mitochondrial dysfunction and biogenesis; reducing amyloid burden and tau phosphorylation; controlling oxidative stress; and reducing inflammation.

The first successful HBOT-treated case of Alzheimer's disease was published in 2001. The present case report is the first patient in a series of 11 HBOT-treated patients with Alzheimer's disease whose symptomatic improvement is documented with 18fluorodeoxyglucose positron emission tomography (18FDG PET).

"Our results suggest the possibility of treating Alzheimer's disease long-term with HBOT and pharmacotherapy," concludes Harch.

Credit: 
Louisiana State University Health Sciences Center

Positive self-belief key to recovery from shoulder pain

People are more likely to recover from shoulder pain if they have the confidence to carry on doing most things, despite their pain - according to new research from the University of East Anglia and University of Hertfordshire.

Researchers studied more than 1,000 people undergoing physiotherapy for shoulder pain.

They found that those who expected physiotherapy would help them were likely to recover more than those who expected minimal or no benefit.

Meanwhile, people suffering more pain, who were confident in their ability to still do most things despite their pain, were likely to recover better with physiotherapy than those suffering less pain, but who weren't confident.

Lead researcher Dr Rachel Chester, from UEA's School of Health Sciences, said: "We studied shoulder pain which is very common, affects people of all ages, and often causes substantial loss of movement and function, as well as night pain.

"Physiotherapy management is effective for many people with shoulder pain, but not everyone. We wanted to find out what factors predict why some people do better than others."

The team investigated the strength of a patient's belief or confidence in their own ability to successfully complete tasks and reach a desired outcome despite being in pain - known as 'pain self-efficacy'.

The study included 1,030 people attending physiotherapy for the treatment of musculoskeletal shoulder pain in 11 NHS trusts and social enterprises across the East of England. The team collected information on 71 patient characteristics and clinical examination findings prior to and during the patient's first physiotherapy appointment. A total of 811 people provided information on their shoulder pain and function six months later.

The majority of patients significantly improved during their course of physiotherapy. The most important predictor of outcome was the person's pain and disability at the first appointment - higher (or lower) levels were associated with higher (or lower) levels six months later.

But the most interesting finding was that pain self-efficacy could change this outcome.
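As a purely illustrative sketch (not the study's published analysis; the data file and variable names below are hypothetical), the question of whether self-efficacy shifts the six-month outcome over and above baseline pain and disability could be examined with a simple regression:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: baseline pain/disability score, baseline pain
    # self-efficacy, and the same pain/disability measure six months later.
    df = pd.read_csv("shoulder_cohort.csv")

    # Does self-efficacy predict the six-month outcome after accounting
    # for where the patient started?
    model = smf.ols("outcome_6m ~ baseline_pain + pain_self_efficacy", data=df).fit()
    print(model.summary())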

Dr Chester said: "We looked at people who started off with a high level of pain and disability and found that the more they believed in their own ability to do things and reach a desired recovery outcome - the less likely they were to be in pain and have limited function after six months.

"What really surprised us was that these people were more likely to have a better outcome than people who reported a low level of baseline pain and disability but had low pain self-efficacy.

"In addition, on average, people who expected to recover because of physiotherapy did better than those who expected minimal or no benefit.

"We recommend that physiotherapists help patients understand and manage their pain and to select treatments and exercises which help them build confidence in their shoulder and optimise their activity levels. This includes helping patients to gain the confidence to get back on track after a flair up."

Credit: 
University of East Anglia

Free lung cancer screening program builds valuable relationships with patients

image: This is Dr. Carsten Schroeder.

Image: 
Phil Jones, Senior Photographer, Augusta University

Augusta, Ga. (Jan. 24, 2019) - A free, simple screening for lung cancer can save a patient money, while building a healthy relationship for any medical needs they may have in the future. The research, published in the Annals of Thoracic Surgery, shows the partnership can be beneficial for patients looking for cardiology specialists, family medical care and other health-related issues, as well as for medical facilities that offer the free screening.

"Our mission is to find lung cancer earlier," said Dr. Carsten Schroeder, thoracic surgical oncologist at the Georgia Cancer Center and Medical College of Georgia at Augusta University. "If we find a nodule in the lung that's in the later stages, survival rate is much worse than if we find it earlier."

In the paper, titled "Financial analysis of free lung cancer screening shows profitability using broader NCCN Guidelines," Schroeder and his team analyzed fiscal years 2015-17 to evaluate indirect cost, direct cost and adjusted net margin per case after factoring in downstream revenue from treating patients with positive scans and other findings.

"In all, we have 1,600 people on the screening list," Schroeder said. "Of those, 1,200 have actually had a scan. In just over 2 percent of those patients, we found lung cancer. The remaining 400 people do not meet the necessary criteria."

The idea to develop the free lung screening program started after a major research paper was published in the summer of 2011. The National Cancer Institute's National Lung Screening Trial, which included 50,000 people, showed that computed tomography (CT) screening is better than chest X-ray for lung cancer screening.

"There was a 20 percent increase in the survival rate for those patients who had the CT screening," Schroeder said. "This paper was the one that served as a catalyst for the Centers for Medicare and Medicaid Services to start covering the cost of the screening for patients."

While patients do not need to have health insurance to qualify for the lung screening program, there are some criteria they must meet (a sketch of the eligibility rules in code follows the list):

Group 1:

55-75 years old.

Currently a smoker or have quit within the past 15 years.

Smoked at least a pack of cigarettes a day for 30+ years.

Group 2:

50-75 years old.

Smoked at least a pack of cigarettes a day for 20+ years.

Have at least one of the following additional lung cancer risks:

Personal cancer history (lung, head and neck, lymphoma).

Family history -- parent, sibling or child -- of lung cancer.

Emphysema or chronic bronchitis.

Chronic Obstructive Pulmonary Disease (COPD).

Long-term exposure to asbestos.

Asbestos-related lung disease or pulmonary asbestosis.

Long-term exposure to silica, cadmium, arsenic, beryllium, chromium, diesel fumes, nickel, radon, uranium or coal smoke and soot.
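A minimal sketch of those rules as an eligibility check (an illustration only, not the program's actual intake software; the function and field names are invented, and pack-years stand in for "a pack a day for N+ years"):

    def eligible(age, current_smoker, years_since_quit, pack_years, extra_risks):
        """Return True if a patient meets the Group 1 or Group 2 criteria above."""
        group1 = (
            55 <= age <= 75
            and (current_smoker or years_since_quit <= 15)
            and pack_years >= 30
        )
        group2 = (
            50 <= age <= 75
            and pack_years >= 20
            and len(extra_risks) >= 1  # e.g. {"COPD", "long-term asbestos exposure"}
        )
        return group1 or group2

    print(eligible(62, False, 10, 35, set()))    # True via Group 1
    print(eligible(52, True, 0, 22, {"COPD"}))   # True via Group 2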

For his research, Schroeder and his team looked at the costs from a total of 705 scans. Of those, 418 patients were referred for follow-up procedures and specialist evaluations. The adjusted net margin per case was -$212 in the first year but turned positive, reaching $177 in the third fiscal year.

One major factor influencing the profitability of a free screening is the ability to use the National Comprehensive Cancer Network (NCCN) guidelines, which cover Group 2 of eligible patients. Currently, no other hospitals or medical centers in Augusta's River Region can use NCCN guidelines because they charge patients for a lung screening. Using the NCCN guidelines allowed Schroeder and his team to detect twice the number of lung cancers than if they had only screened Group 1.

"Our free lung screening program is a win for the communities we serve and for the hospital system," Schroeder said. "We bring them in for a free screening, which serves as a starting point for their medical care and health needs for the rest of their lives."

Credit: 
Medical College of Georgia at Augusta University

Overprescribing of antidepressants common in elderly patients

In a Pharmacology Research & Perspectives study of individuals living in Olmsted County, Minnesota, from 2005 to 2012, potential overprescribing of antidepressant medications occurred in nearly one-quarter of elderly residents.

Potential antidepressant overprescribing was most likely in individuals residing in nursing homes; patients having a higher number of comorbid medical conditions; individuals who were outpatients; those taking more concomitant medications; those having greater use of acute care services; and those receiving prescriptions via telephone, e-mail, or patient portal.

"Our results, in agreement with others, suggest that the potential overprescribing of antidepressants may occur more often in elderly people who have a higher degree of clinical complexity or severity," said lead author Dr. William Bobo, of the Mayo Clinic, in Jacksonville, Florida. "This is important to consider because these individuals may be at especially high risk for clinically significant depression, and clinicians may be left with relatively little time to discuss the individual concerns that may prompt the issuing of an antidepressant prescription. This is something that we would like to look into in future studies.

Credit: 
Wiley