
BU finds Medicaid expansion improves access to postpartum care

Study comparing Utah and Colorado finds Medicaid expansion helped prevent new mothers from losing coverage to the "postpartum coverage cliff," with implications for reducing maternal mortality.

A new study led by a Boston University School of Public Health (BUSPH) researcher finds that new mothers in a state that expanded Medicaid (Colorado) were more likely to keep Medicaid coverage and access postpartum care than those in a similar state that had not yet expanded Medicaid (Utah). The study, published in the journal Health Affairs, found that this was especially true for new mothers who had experienced pregnancy/childbirth complications.

"One third of maternal mortality occurs after delivery, so continuous access to care in the year after birth is critical," says study lead author Dr. Sarah Gordon, assistant professor of health law, policy & management at BUSPH.

Medicaid policies in many states create a postpartum coverage "cliff," she explains: "Medicaid eligibility for pregnancy ends just 60 days after delivery, and income limits for parental Medicaid coverage typically fall far below income limits for pregnancy Medicaid coverage, resulting in high rates of postpartum coverage loss." Medicaid expansion can narrow that gap.

For the study, Dr. Gordon and colleagues analyzed 2013-2015 Medicaid claims data from Colorado (which expanded Medicaid in January 2014) and Utah (which would go on to expand Medicaid in 2019). The researchers found that Colorado and Utah had similar trends in 2013, but, after Colorado expanded Medicaid, new mothers in Colorado kept their Medicaid coverage for an average of one month longer than new mothers in Utah. New mothers in Colorado also averaged 0.52 more Medicaid-financed outpatient visits in the six months after delivery than new mothers in Utah. Among new mothers who had severe complications at the time of childbirth, Medicaid-financed outpatient visits nearly tripled in Colorado after expansion, and were 50 percent higher in post-expansion Colorado than in Utah.

The researchers found that Medicaid enrollment actually declined among new mothers in Utah during the observed period, while it held steady in Colorado, possibly because the Affordable Care Act made private insurance more accessible. However, after incorporating available data from these private plans, the researchers found only a very small reduction in the difference in the states' rates of coverage and postpartum care utilization.

The study shows that Utah's recent ballot-led Medicaid expansion will likely help new mothers in the state, Dr. Gordon says, and adds to the evidence behind other pushes to narrow the gap: "Eight states and Washington, D.C. have introduced laws or regulations to extend postpartum coverage, and national legislation currently advancing through the House would give states the option to do so," she says.

Credit: 
Boston University School of Medicine

Discoveries detail role of stem cell in deadly gastric cancer

ITHACA, N.Y. - A Cornell study provides important new insights into a common and deadly type of gastric cancer.

Incidence of this cancer, called gastric squamous-columnar junction (SCJ) cancer, also known as gastroesophageal cancer, rose 2.5 times in the United States between the 1970s and 2000s, while cases of all gastric cancers have decreased by more than 80% since the 1950s. Still, gastric cancers overall are the fifth most common tumors and the third-leading cause of cancer death worldwide.

The study, published Jan. 3 in the journal Nature Communications, identifies a key pathway in gastric SCJ cancers that provides a promising target for future study and therapy.

The researchers found that the progeny of a type of stem cell (Lgr5+) collect in large numbers and promote cancer in areas where two types of stomach tissues meet.

"On a global level, gastric cancer, especially gastric squamous-columnar junction cancer, is a very frequent disease and is very unfavorable in terms of prognosis, and so any new development in how the cancer forms and how we can treat it is very exciting," said Alexander Nikitin, professor of pathology and leader of the Cornell Stem Cell Program. Dah-Jiun Fu, a doctoral student in Nikitin's lab, is the paper's first author.

For the study, Nikitin and colleagues developed an experimental mouse model with two tumor suppressor genes that become inactivated under certain conditions. The model meets several parameters that are necessary for accurate research of this cancer. Previous mouse models used by other research groups had limitations, where mice only developed certain types of tumors or they died prematurely, thereby preventing study. But all the Cornell mice developed relevant forms of metastatic gastric SCJ cancers.

Previous research in Nikitin's lab and at other universities has implicated Lgr5+ stem cells in a number of cancers.

"Our studies show that that's not necessarily true for all types of cancers," Nikitin said. His group found no evidence that Lgr5+ stem cells themselves contributed to gastric SCJ cancers.

The current paper used the mouse model and organoids (miniature, simplified versions of an organ) to reveal that, rather than Lgr5+ stem cells, large pools of progeny cells called Lgr5-CD44+ cells populated these junctions. Stem cells divide, and as they do, their progeny differentiate into more specialized cells. At early divisions, however, cells are immature and have not yet differentiated; these immature progeny were detected in the earliest discernible lesions and in advanced carcinomas. They were also found to be highly susceptible to transformation in tests using organoids.

"These findings offer an exciting possibility that the cancer-prone character of other transitional zones may also be explained by the presence of a large fraction of immature cells," Nikitin said.

The study also shed light on a protein called osteopontin, which binds to a receptor called CD44, initiating a number of downstream effects. The osteopontin-CD44 complex controls the balance between stem cells and differentiated cells, and initiates the creation of more stem cells and cells with stem cell properties, such as Lgr5-CD44+ cells.

"In our case, this presence of osteopontin signaling keeps the [newly differentiated progeny] cells in an immature state," Nikitin said. "This is probably one of the mechanisms for how you have quite large pools of immature [Lgr5-CD44+] cells."

The researchers examined two populations of human patients and showed that, consistent with the mouse models, lower levels of osteopontin and CD44 correlated strongly with better patient survival, Nikitin said, while overexpression correlated with the worst prognosis of human SCJ cancer.

Future work will test osteopontin inhibitors and CD44 antibodies to see whether preventing the buildup of Lgr5-CD44+ cells can treat these cancers.

Credit: 
Cornell University

Human body temperature has decreased in the United States, Stanford study finds

Since the early 19th century, the average human body temperature in the United States has dropped, according to a study by researchers at the Stanford University School of Medicine.

"Our temperature's not what people think it is," said Julie Parsonnet, MD, professor of medicine and of health research and policy. "What everybody grew up learning, which is that our normal temperature is 98.6, is wrong."

That standard of 98.6 F was established by German physician Carl Reinhold August Wunderlich in 1851. Modern studies, however, have called that number into question, suggesting that it's too high. A recent study, for example, found the average temperature of 25,000 British patients to be 97.9 F.

In a study published today in eLife, Parsonnet and her colleagues explore body temperature trends and conclude that temperature changes since the time of Wunderlich reflect a true historical pattern, rather than measurement errors or biases. Parsonnet, who holds the George DeForest Barnett Professorship, is the senior author. Myroslava Protsiv, a former Stanford research scientist who is now at the Karolinska Institute, is the lead author.

The researchers propose that the decrease in body temperature is the result of changes in our environment over the past 200 years, which have in turn driven physiological changes.

Digging into the past

Parsonnet and her colleagues analyzed temperatures from three datasets covering distinct historical periods. The earliest set, compiled from military service records, medical records and pension records from Union Army veterans of the Civil War, captures data between 1862 and 1930 and includes people born in the early 1800s. A set from the U.S. National Health and Nutrition Examination Survey I contains data from 1971 to 1975. Finally, the Stanford Translational Research Integrated Database Environment comprises data from adult patients who visited Stanford Health Care between 2007 and 2017.

The researchers used the 677,423 temperature measurements from these datasets to develop a linear model that interpolated temperature over time. The model confirmed body temperature trends that were known from previous studies, including increased body temperature in younger people, in women, in larger bodies and at later times of the day.

The researchers observed that the body temperature of men born in the 2000s is on average 1.06 F lower than that of men born in the early 1800s. Similarly, they observed that the body temperature of women born in the 2000s is on average 0.58 F lower than that of women born in the 1890s. These calculations correspond to a decrease in body temperature of 0.05 F every decade.
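The reported per-decade rate follows directly from the cohort differences above. A minimal back-of-the-envelope check (illustrative only; the study itself fit a linear model to all 677,423 measurements):

```python
# Sketch of the reported cooling trend using only the figures in the text.

def per_decade_decline(total_drop_f, start_year, end_year):
    """Average temperature change (deg F) per decade across birth cohorts."""
    decades = (end_year - start_year) / 10
    return total_drop_f / decades

# Men: 1.06 F drop between cohorts born in the early 1800s and the 2000s.
men = per_decade_decline(1.06, 1800, 2000)
# Women: 0.58 F drop between cohorts born in the 1890s and the 2000s.
women = per_decade_decline(0.58, 1890, 2000)

print(round(men, 3))    # ~0.053 F per decade
print(round(women, 3))  # ~0.053 F per decade
```

Both cohorts yield roughly the same slope, consistent with the 0.05 F-per-decade figure in the text.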

As part of the study, the authors investigated the possibility that the decrease could simply reflect improvements in thermometer technology; thermometers used today are far more accurate than those used two centuries ago. "In the 19th century, thermometry was just beginning," Parsonnet said.

To assess whether temperatures truly decreased, the researchers checked for body temperature trends within each dataset; for each historical group, they expected that measurements would be taken with similar thermometers. Within the veterans dataset, they observed a similar decrease for each decade, consistent with observations made using the combined data.

While the authors are confident of a cooling trend, the strong influences of age, time of day and sex on body temperature preclude a single updated definition of "average body temperature" that would cover all Americans today.

Potential causes of colder body temperature

The decrease in average body temperature in the United States could be explained by a reduction in metabolic rate, or the amount of energy being used. The authors hypothesize that this reduction may be due to a population-wide decline in inflammation: "Inflammation produces all sorts of proteins and cytokines that rev up your metabolism and raise your temperature," Parsonnet said. Public health has improved dramatically in the past 200 years due to advances in medical treatments, better hygiene, greater availability of food and improved standards of living.

The authors also hypothesize that comfortable lives at constant ambient temperature contribute to a lower metabolic rate. Homes in the 19th century had irregular heating and no cooling; today, central heating and air conditioning are commonplace. A more constant environment removes a need to expend energy to maintain a constant body temperature.

"Physiologically, we're just different from what we were in the past," Parsonnet said. "The environment that we're living in has changed, including the temperature in our homes, our contact with microorganisms and the food that we have access to. All these things mean that although we think of human beings as if we're monomorphic and have been the same for all of human evolution, we're not the same. We're actually changing physiologically."

Credit: 
Stanford Medicine

Many in LA jails could be diverted into mental health treatment

More than 3,300 people in the mental health population of the Los Angeles County jail are appropriate candidates for diversion into programs where they would receive community-based clinical services rather than incarceration, according to a new RAND Corporation study.

Based on a variety of clinical and legal factors, researchers estimated that about 61% of the individuals in the jail mental health population were appropriate candidates for diversion, 7% were potentially appropriate for diversion and 32% were not appropriate for diversion.

The study, which was based on a review of the jail population as of June 2019, has findings that are similar to preliminary estimates compiled earlier by L.A. County officials.

"Knowing how many people are appropriate for diversion is a first step toward understanding the types of programs, staff and funding that would be needed to treat those individuals in the community," said Stephanie Brooks Holliday, the study's lead author and a behavioral scientist at RAND, a nonprofit research organization.

The largest mental health facilities in the U.S. are now county jails, with an estimated 15% of men and 31% of women who are incarcerated in jails nationally having a serious and persistent mental disorder.

In Los Angeles County, 30% of the people incarcerated in the county jail on any given day during 2018 were in mental health housing units and/or prescribed psychotropic medications (5,111 of 17,024 individuals in the average daily inmate population).

The Office of Diversion and Reentry was created by the county in 2015 to develop alternative approaches to dealing with mental health challenges in the criminal justice system. While L.A. County officials have been pursuing alternatives for individuals with serious mental illness who are incarcerated, there is more demand for the existing services than there is capacity.

RAND was asked by L.A. County to estimate the size of the current population of individuals incarcerated in county jails who likely would be legally suitable and clinically eligible for community-based treatment programs. The study was not limited by the availability of existing services, and considered who could be diverted assuming no limits on the types of programs or number of treatment slots available.

RAND researchers developed a set of legal and clinical criteria that reflect the general factors that the Office of Diversion and Reentry currently uses to determine whether an individual may be put forward to the courts as a candidate for diversion.

These criteria were applied to a sample of 500 people representative of the L.A. County jail mental health population. Researchers found that about 59% of men and 74% of women were appropriate candidates for diversion. Men make up 85% of the Los Angeles County jail mental health population.
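The sex-specific rates are consistent with the overall 61% estimate reported earlier. A quick sketch, using only the proportions in the release (the study's own weighting may differ):

```python
# Combine the sex-specific diversion rates, weighted by each group's share
# of the jail mental health population, to recover the overall estimate.

frac_men = 0.85           # men's share of the jail mental health population
appropriate_men = 0.59    # share of men appropriate for diversion
appropriate_women = 0.74  # share of women appropriate for diversion

overall = frac_men * appropriate_men + (1 - frac_men) * appropriate_women
print(f"{overall:.0%}")   # about 61%, matching the overall figure
```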

"Diversion is stopping the cycle between jail and homelessness," said county Supervisor Mark Ridley-Thomas. "Just in the last three years, the Office of Diversion and Reentry has safely diverted over 4,400 people from the county jails to more appropriate settings where they can get treatment, instead of the costly alternative of serving additional time in jail and being released with no supports, too often ending up homeless. This is smart policy making. RAND's research underscores the need to double down on diversion to reach all those who could benefit."

In addition to increasing diversion programs, RAND researchers suggest that L.A. County improve its ability to collect information about individuals released into community-based programs, the ways different courts handle such cases and the outcomes of people placed into diversion programs.

The county also could look for ways to improve its early diversion efforts, which may be able to help people before they enter the county's criminal justice system. For example, some jurisdictions intervene at the point of arrest in an effort to decrease the criminalization of persons with mental illness.

"But even with increases in diversion, there will continue to be a large number of individuals with mental health needs who remain in the jails," Holliday said. "It is important that there are services in place to care for people who are incarcerated and provide continuing services once they are released back into the community."

The report, "Estimating the Size of the Divertible Jail Mental Health Population in Los Angeles County," is available at http://www.rand.org. Other authors of the study are Nicholas M. Pace, Neil Gowensmith, Ira Packer, Daniel Murrie, Alicia Virani, Bing Han and Sarah B. Hunter.

Credit: 
RAND Corporation

The finance community is no longer worried about Bitcoin

2020 could well be the year that the cryptocurrency dream dies. This is not to say that cryptocurrencies will die altogether – far from it. But to all the financial romantics who have cheered the rise of bitcoin and other digital currencies over the past decade, there is a reckoning coming. Like it or not, the vision of a world in which these currencies liberate money from the clutches of central banks and other corporate giants is fading rapidly.

Sublimation, not melting: Graphene surprises researchers again

image: Abstraction

Image: 
MIPT

Physicists from the Moscow Institute of Physics and Technology and the Institute for High Pressure Physics of the Russian Academy of Sciences have used computer modeling to refine the melting curve of graphite, which has been studied for over 100 years with inconsistent findings. They also found that graphene "melting" is in fact sublimation. The results of the study came out in the journal Carbon.

Graphite is a material widely used in various industries -- for example in heat shields for spacecraft -- so accurate data on its behavior at ultrahigh temperatures is of paramount importance. Graphite melting has been studied since the early 20th century. About 100 experiments have placed the graphite melting point at various temperatures between 3,000 and 7,000 kelvins. With a spread so large, it is unclear which number is true and can be considered the actual melting point of graphite. The values returned by different computer models are also at variance with each other.

A team of physicists from MIPT and HPPI RAS compared several computer models to find where their predictions agree. Yuri Fomin and Vadim Brazhkin used two methods: classical molecular dynamics and ab initio molecular dynamics. The latter accounts for quantum mechanical effects, making it more accurate. The downside is that it only deals with interactions between a small number of atoms on short time scales. The researchers compared the obtained results with prior experimental and theoretical data.

Fomin and Brazhkin found the existing models to be highly inaccurate. But it turned out that comparing the results produced by different theoretical models and finding overlaps can provide an explanation for the experimental data.

As far back as the 1960s, the graphite melting curve was predicted to have a maximum. Its existence points to complex liquid behavior, meaning that the structure of the liquid rapidly changes on heating or densification. The discovery of the maximum was heavily disputed, with a number of studies confirming and challenging it over and over. Fomin and Brazhkin's results show that the structure of liquid carbon changes above the melting curve of graphite. The maximum therefore has to exist.

The second part of the study is dedicated to studying the melting of graphene. No graphene melting experiments have been conducted. Previously, computer models predicted the melting point of graphene at 4,500 or 4,900 K. Two-dimensional carbon was therefore considered to have the highest melting point in the world.

"In our study, we observed a strange 'melting' behavior of graphene, which formed linear chains. We showed that what happens is it transitions from a solid directly into a gaseous state. This process is called sublimation," commented Associate Professor Yuri Fomin of the Department of General Physics, MIPT. The findings enable a better understanding of phase transitions in low-dimensional materials, which are considered an important component of many technologies currently in development, in fields from electronics to medicine.

The researchers produced a more precise and unified description of how the graphite melting curve behaves, confirming a gradual structural transition in liquid carbon. Their calculations show that the melting temperature of graphene in an argon atmosphere is close to the melting temperature of graphite.

Credit: 
Moscow Institute of Physics and Technology

ACP issues guideline for testosterone treatment in adult men with age-related low testosterone

1. ACP issues guideline for testosterone treatment in adult men with age-related low testosterone

ACP's recommendations include treating for sexual dysfunction only, discontinuing treatment if sexual function does not improve, and not initiating treatment for other reasons

Notes: HD video soundbites of ACP's president discussing the guideline are available to download at http://www.dssimon.com/MM/ACP-testosterone/.

Guideline: http://annals.org/aim/article/doi/10.7326/M19-0882
Evidence Review: http://annals.org/aim/article/doi/10.7326/M19-0830
Editorial: http://annals.org/aim/article/doi/10.7326/M19-3815
URLs go live when the embargo lifts

Physicians should prescribe testosterone for men with age-related low testosterone only to treat sexual dysfunction, the American College of Physicians (ACP) says in a new evidence-based clinical practice guideline. The evidence shows that men with age-related low testosterone may experience slight improvements in sexual and erectile function. The guideline is published in Annals of Internal Medicine.

ACP suggests that physicians consider intramuscular rather than transdermal formulations when initiating testosterone treatment to improve sexual function because the costs are considerably lower for the intramuscular formulation and clinical effectiveness and harms are similar. The annual cost in 2016 per beneficiary for testosterone replacement therapy was $2,135.32 for the transdermal and $156.24 for the intramuscular formulation, according to paid pharmaceutical claims in the 2016 Medicare Part D Drug Claims data. Most men are able to inject the intramuscular formulation at home and do not require a separate clinic or office visit for administration.

Physicians should discuss whether to initiate testosterone treatment in men with age-related low testosterone with sexual dysfunction who want to improve sexual and erectile function based on the potential benefits, harms, costs, and patient preferences. Physicians should reevaluate symptoms within 12 months and periodically thereafter and discontinue testosterone treatment if sexual function does not improve. Testosterone treatment should not be initiated to improve energy, vitality, physical function, or cognition because the evidence indicates testosterone treatment is not effective.

ACP's guideline, endorsed by the American Academy of Family Physicians, applies to adult men with age-related low testosterone. It does not address screening or diagnosis of hypogonadism, or monitoring of testosterone levels.

Media contact: For an embargoed PDF or to talk to an ACP spokesperson, please contact Steve Majewski at SMajewski@acponline.org or 215-351-2514.

2. U.S. health care administration costs four times more per capita than in Canada

Rise in bureaucracy due to surging overhead of private insurers

Abstract: http://annals.org/aim/article/doi/10.7326/M19-2818
URLs go live when the embargo lifts

Health care bureaucracy cost Americans $812 billion in 2017, and represented more than one-third of total expenditures for doctor visits, hospitals, long-term care and health insurance. A study found that cutting U.S. administrative costs to Canadian levels would have saved more than $600 billion. The analysis is published in Annals of Internal Medicine.

Researchers at the City University of New York at Hunter College, Harvard Medical School and the University of Ottawa analyzed thousands of accounting reports that hospitals and other health care providers filed with regulators, as well as census data on employment and wages in the health sector, to quantify 2017 spending on administration by insurers and providers. They found that health administration costs were more than four-fold higher per capita in the U.S. than in Canada, which implemented a single-payer Medicare for All system starting in 1962 ($2,479 vs. $551 per person). Americans spent $844 per person on insurers' overhead while Canadians spent $146. Additionally, doctors, hospitals and other health providers in the U.S. spent far more on administration due to the complexity entailed in billing multiple payers and dealing with the bureaucratic hurdles that insurers impose. As a result, hospital administration cost Americans $933 per capita vs. $196 in Canada, where hospitals are paid lump-sum budgets by the single payer, much as fire departments are funded in the U.S. Physicians' billing costs were also much higher in the U.S., at $465 per capita vs. $87 in Canada.
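The per-capita figures above can be restated as ratios, and the headline savings estimate roughly recovered. A sketch using the release's numbers, with the 2017 U.S. population (about 325 million) as an outside assumption:

```python
# Per-capita administrative costs from the study (USD per person, 2017).
us_total, ca_total = 2479, 551   # all health administration
us_hosp, ca_hosp = 933, 196      # hospital administration
us_md, ca_md = 465, 87           # physicians' billing

print(round(us_total / ca_total, 1))  # ~4.5x: "more than four-fold"
print(round(us_hosp / ca_hosp, 1))    # ~4.8x for hospital administration
print(round(us_md / ca_md, 1))        # ~5.3x for physicians' billing

# Rough scale of savings at Canadian per-capita levels, assuming a
# U.S. population of 325 million (an assumption, not from the release):
savings = (us_total - ca_total) * 325_000_000
print(f"${savings / 1e9:.0f} billion")  # on the order of $600 billion
```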

The authors cautioned that their estimates probably understate administrative costs, and particularly the growth since 1999. The same authors conducted a study in 1999 that included administrative spending for some items such as dental care for which no 2017 data were available. They suggest that Medicare for All could save more than $600 billion each year on bureaucracy and repurpose that money to cover America's 30 million uninsured, and eliminate copayments and deductibles for everyone.

Media contacts: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org.

To reach the lead author, David U. Himmelstein, MD, please contact Clare Fauke at clare@pnhp.org.

3. Only 10 percent of eligible primary care providers certified to prescribe buprenorphine

Abstract: http://annals.org/aim/article/doi/10.7326/M19-2403
URLs go live when the embargo lifts

Only 10 percent of eligible primary care providers are certified to prescribe buprenorphine, a number that falls far short of what is needed to address the current opioid epidemic in the United States. Waiver certification has been much faster in areas hardest-hit by the opioid epidemic, but rural communities and communities with lower levels of secondary education are still underserved. A brief research report is published in Annals of Internal Medicine.

Expanded access to medication treatment of opioid use disorder is a critical component of the national response to the opioid crisis. From 2007 to 2017, there was roughly a four-fold increase in providers certified to prescribe buprenorphine. However, how this growth has varied by community characteristics is unclear.

Researchers from RAND Corporation studied data from the Substance Abuse and Mental Health Services Administration and the Drug Enforcement Administration to examine county-level growth in the number of buprenorphine-waivered prescribers and variation by county characteristics, including rurality, income, and rate of opioid-related overdose deaths in the past year. They found that the number of buprenorphine-waivered clinicians increased substantially between 2007 and 2017 and that growth was much faster in counties with higher rates of opioid-related overdose deaths in the preceding year. However, despite evidence that the opioid crisis has disproportionately affected rural counties that are socioeconomically disadvantaged, prescriber growth was markedly slower in small nonmetropolitan counties than in urban counties and was also slower in communities with lower levels of education, even after adjusting for the severity of the crisis. According to the authors, new models to increase access to care, broader scope of practice laws, and more aggressive training and financial incentives are needed to address a shortage of certified providers who are offering services.

Media contacts: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org.

To reach the lead author, Ryan McBain, PhD, MPH, please contact him directly at rmcbain@rand.org.

4. Cardiac troponin test cannot safely rule out inducible myocardial ischemia in patients with symptomatic coronary artery disease

Abstract: http://annals.org/aim/article/doi/10.7326/M19-0080
Abstract: http://annals.org/aim/article/doi/10.7326/M19-3731
URLs go live when the embargo lifts

In symptomatic patients with coronary artery disease (CAD), even very low high-sensitivity cardiac troponin (hs-cTn) concentrations do not generally allow clinicians to safely rule out inducible myocardial ischemia. Findings from a cohort study are published in Annals of Internal Medicine.

Currently, clinical judgement and cardiac stress imaging are used for risk stratification in symptomatic patients with CAD and suspected inducible myocardial ischemia. The optimal noninvasive method for surveillance is unknown, but hs-cTn, a quantitative marker of cardiomyocyte injury, has recently been evaluated as a clinical tool in settings other than the diagnosis of acute myocardial infarction.

Researchers from University Hospital Basel, Switzerland studied 1,896 consecutive symptomatic patients with CAD to test a novel approach: using very low hs-cTnI concentrations, less than 2.5 ng/L, to exclude inducible myocardial ischemia. They used three different assays, including the most sensitive hs-cTnI assay currently available (limit of detection 0.1 ng/L), to measure hs-cTnI and hs-cTnT in blood samples that had been taken before stress testing and processed by personnel blinded to clinical data. The researchers found that the diagnostic accuracy of hs-cTnI and hs-cTnT for identifying inducible myocardial ischemia was low, and no cutoff level achieved the predefined performance characteristics for the safe exclusion of inducible myocardial ischemia.

Media contacts: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org.

To reach the lead author, Christian Mueller, MD, please contact christian.mueller@usb.ch

Credit: 
American College of Physicians

Collaborative conservation approach for endangered reef fish yields dramatic results

image: A researcher swims in the midst of a Nassau Grouper aggregation as part of an ongoing effort to track the numbers of the critically endangered species.

Image: 
Photo by Paul Humann, copyright Grouper Moon Project

A new study from researchers at the Scripps Institution of Oceanography at the University of California San Diego has documented a successful recovery effort among Nassau Grouper populations in the Cayman Islands thanks to an approach involving government agencies, academic researchers, and nonprofit organizations.

The study, published January 6, 2020 in Proceedings of the National Academy of Sciences, used a two-pronged approach including tagging and video census data for monitoring and counting Nassau Grouper populations in an effort to more accurately estimate annual numbers of fish in the population and thus provide insight into the effects of ongoing conservation efforts. While many governments have enacted regional or seasonal fishing closures in an attempt to allow recovery of overfished stocks of aggregating reef fishes, this is one of the first studies to provide evidence that these measures can be successful.

"Normally, Nassau Grouper are relatively solitary, and tend to be hard to catch," said Lynn Waterhouse, a former PhD student in the Semmens Lab at Scripps Oceanography and research biologist at the John G. Shedd Aquarium in Chicago. "But at spawning, they come together en masse to form annual spawning aggregations, where historically tens of thousands of fish come together to reproduce, so they're very easy for fishermen to catch."

Due to overfishing during spawning, the species has suffered region-wide stock collapse. By the 1980s, large aggregations had all but disappeared from the Caribbean region. Of the remaining aggregations, few contained more than 1,000 individuals, and the species is currently listed as critically endangered by the International Union for Conservation of Nature.

In 2001, an aggregation of around 7,000 Nassau Grouper was discovered near Little Cayman, the smallest of the three islands located south of Cuba in the Caribbean Sea. Rapid overfishing of the aggregation then drove the Cayman Islands government, in 2003, to enact aggressive management policies banning fishing at aggregation sites during the spawning season. Through the Grouper Moon Project, the Cayman Islands Department of Environment (CI-DOE) partnered with a citizen conservation group called Reef Environmental Education Foundation (REEF) and scientists from Scripps Oceanography and Oregon State University to develop a monitoring strategy for the remaining Cayman Island aggregations.

"We developed a unique approach for monitoring these populations over the course of nearly two decades," said senior author Brice Semmens, an associate professor and ecologist at Scripps Oceanography. "This included a combination of using mark and recapture tagging techniques to track the proportion of tagged fish and video transects to count fish across the aggregation."
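The core mark-recapture logic can be illustrated with the classic Lincoln-Petersen estimator. This is a deliberately simplified sketch with hypothetical numbers, not the study's actual (far more sophisticated) model:

```python
def lincoln_petersen(marked, observed, observed_marked):
    """Estimate population size N from a simple mark-resight survey.

    marked          -- number of fish tagged in a first pass
    observed        -- number of fish counted in a later survey
    observed_marked -- how many of those counted fish carried tags
    """
    # The proportion of tagged fish seen in the survey approximates the
    # proportion of tagged fish in the whole aggregation:
    #   marked / N  ~  observed_marked / observed
    # Solving for N gives the estimate below.
    return marked * observed / observed_marked

# Hypothetical example: 300 fish tagged, 500 fish later counted on video
# transects, 25 of them tagged.
estimate = round(lincoln_petersen(300, 500, 25))
print(f"estimated aggregation size: {estimate} fish")
```

In practice the Grouper Moon team combined repeated tagging with video transect counts over many seasons, which allows far more robust estimates than this single-survey formula.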

The researchers faced a number of obstacles, including funding challenges and particularly difficult monitoring conditions - the Nassau Grouper has the unfortunate habit of aggregating at inconvenient and often dangerous locations along the reef shelf edge, making it difficult for divers to easily observe and tag the aggregation. But with the support of the CI-DOE, the team has been able to maintain their monitoring efforts for over 15 years.

Importantly, the researchers did not just track the number of fish in the aggregation - they worked together with the CI-DOE and local communities to share results and discuss next steps. After reviewing the data being collected by the Grouper Moon Project, in 2016 the government initiated an even more progressive fishing policy, banning all fishing of Nassau Grouper during the winter spawning season along with limits on the number and size of fish that can be kept.

The team was astonished at how quickly the Nassau Grouper population recovered: over the following decade, the aggregation on Little Cayman grew from around 1,200 fish in 2009 to over 7,000 in 2018. This growth was due, at least in part, to a rapid increase in the addition of new, younger fish to the aggregation.

"This really demonstrates the power of this collaborative approach to conservation," said co-author Christy Pattengill-Semmens, REEF's director of science. "We were able to monitor the population and provide information to support management as the data came in, allowing the Cayman government to respond rapidly with policy changes.

"These efforts have been successful because of the strength of the partnerships among the government, academic research groups, and nonprofits," she added. "CI-DOE also has a long history of working with fishing communities in the islands."

The team also emphasized that these results show that patience is key.

"Due to the way these fish breed and the timing and location of spawning events, it can take several generations before the right ocean conditions ultimately facilitate young grouper joining an aggregation," said Pattengill-Semmens. "This means that communities and governments may need to implement protection strategies over the course of years or even decades to meet their management targets."

"This is an ideal approach for conservation," said Semmens. "Just doing the science isn't enough. You need to partner with groups and governments capable of turning science into conservation decisions that support the local community."

Credit: 
University of California - San Diego

Poplars genetically modified not to harm air quality grow as well as non-modified trees

While providing benefits to the environment, some trees also emit gases to the atmosphere that worsen air pollution and alter climate. Field trials in Oregon and Arizona show that poplar trees, which emit trace amounts of the gas isoprene, can be genetically modified not to harm air quality while leaving their growth potential unchanged.

The findings, published today in the journal Proceedings of the National Academy of Sciences, are important because poplar plantations cover 9.4 million hectares (36,294 square miles) globally - more than double the land used 15 years ago. Poplars are fast-growing trees that are a source of biofuel and other products including paper, pallets, plywood and furniture frames.

Poplars and other trees used in plantation agroforestry, including palms and eucalyptus, produce isoprene in their leaves in response to climate stress such as high temperature and drought. The isoprene alleviates those stresses by signaling cellular processes to produce protective molecules; however, isoprene is so volatile that millions of metric tons leak into the atmosphere each year.

The emitted isoprene reacts with gases produced by tailpipe pollution to produce ozone, which is a respiratory irritant. Isoprene also causes higher levels of atmospheric aerosol production, which reduces the amount of direct sunlight reaching the earth (a cooling effect), and it causes the global warming potential of methane in the atmosphere to increase (a warming effect). The warming effect is most likely greater than the cooling effect. The net effect of emitted isoprene is to worsen respiratory health and, most likely, warm the atmosphere.

A research collaboration led by scientists at the University of Arizona, the Helmholtz Research Center in Munich, Portland State University and Oregon State University genetically modified poplars not to produce isoprene, then tested them in three- and four-year trials at plantations in Oregon and Arizona.

The researchers found that trees whose isoprene production was genetically suppressed did not suffer ill effects in terms of photosynthesis or "biomass production." They were able to make cellulose, used in biofuel production, and grow as well as trees that were producing isoprene. The discovery came as a surprise, given the protective role of isoprene in stressful climates, especially in the case of the Arizona plantation.

"The suppression of isoprene production in the leaves has triggered alternative signaling pathways that appear to compensate for the loss of stress tolerance due to isoprene," said Russell Monson, a professor of ecology and evolutionary biology at the University of Arizona and lead author of the study. "The trees exhibited a clever response that allowed them to work around the loss of isoprene and arrive at the same outcome, effectively tolerating high temperature and drought stress."

"Our findings suggest that isoprene emissions can be diminished without affecting biomass production in temperate forest plantations," said study co-author Steven Strauss, a distinguished professor of forest biotechnology at Oregon State University. "That's what we wanted to examine - can you turn down isoprene production, and does it matter to biomass productivity and general plant health? It looks like it doesn't impair either significantly."

The researchers used a genetic engineering tool known as RNA interference. RNA transmits protein coding instructions from each cell's DNA, which holds the organism's genetic code. The genetic tools for modifying the trees, and the protein analyses that revealed changes in the use of biochemical pathways, were developed by scientists at the Institute of Biochemical Plant Pathology, Helmholtz Research Center in Munich, Germany, who collaborated on the study.

"RNA interference is like a vaccination - it triggers a natural and highly specific mechanism whereby specific targets are suppressed, be they the RNA of viruses or endogenous genes," Strauss said. "You could also do the same thing through conventional breeding. It would be a lot less efficient and precise, and it might be a nightmare for a breeder who may need to reassess all of their germplasm and possibly exclude their most productive cultivars as a result, but it could be done. New technologies like CRISPR, short for clustered regularly interspaced short palindromic repeats, which allows for precise DNA editing at specific stretches of the genetic code, should work even better."

In an additional discovery, the researchers found that trees were able to adjust to the loss of isoprene because most plantation growth takes place during cooler and wetter times of the year.

"This means that, for this species, the natural seasonal cycle of growth works in favor of high biomass production when the beneficial effects of isoprene are needed least," Monson explained.

This observation also clarified an adaptive role for isoprene in natural forests, where protection that enhances survival during mid-season climate stress is likely more important than processes that promote growth early in the season.

"The fact that cultivars of poplar can be produced in a way that ameliorates atmospheric impacts without significantly reducing biomass production gives us a lot of optimism," Monson said. "We're striving toward greater environmental sustainability while developing plantation-scale biomass sources that can serve as fossil fuel alternatives."

Credit: 
University of Arizona

Protecting two key regions in Belize could save threatened jaguar, say scientists

Scientists studying one of the largest populations of jaguars in Central Belize have identified several wildlife corridors that should be protected to help the species' survival. The study, led by the American Museum of Natural History and the University of Bristol and published in BMC Genetics, provides new insight into where conservation efforts should be concentrated.

Jaguars are top predators inhabiting large areas of Belize's tropical forests and have a vast range spanning thousands of square miles. However, high deforestation rates for large-scale agricultural development and a constantly changing landscape mean jaguars are under increased threat and now listed as 'near threatened' on the IUCN red list of threatened species.

Dr Angelica Menchaca, who led the study while a PhD student at Bristol's School of Biological Sciences, said: "Jaguars don't stay in one place and can move long distances often through unprotected areas between reserves. Areas in between national parks with human activity may put jaguars at risk from retaliatory killing, conflict with cattle ranchers and limit connectivity between reserves."

To inform conservation and management, the team analysed genetic population structure and predicted the jaguars' movement corridors, in order to understand how animals in different reserves relate to one another and how feasible it is to maintain connectivity between reserves.

Dr Menchaca and colleagues analysed samples of jaguar faeces collected over eight years of field work in central Belize, a region which is of great importance for jaguar connectivity as it forms a bridge between the Selva Maya extending to Guatemala in the south, and Mexico in the north.

The team identified 50 different jaguars and observed high levels of gene flow among animals identified in the Cockscomb Basin Wildlife Sanctuary and the Maya Forest Corridor, making these two areas critical for conservation efforts. These areas are currently separated by the Hummingbird Highway but could potentially be connected through wildlife corridors and the expansion of natural protected areas.

Dr Menchaca added: "Our findings provide a snapshot of genetic patterns of animals inhabiting the area between 2003-2011 and provide important insight into the best routes for the jaguars to take across two key areas in Central Belize. If we are to help this threatened species, then our conservation efforts must expand protected areas to ensure its maintenance across its range."

Credit: 
University of Bristol

Exploring the 'dark side' of a single-crystal complex oxide thin film

image: Argonne scientists have looked at the local ferroelectric properties of the bottom atomic layers of freestanding complex oxide PZT detached from the epitaxial substrate.

Image: 
Argonne National Laboratory

Analysis from a team led by Argonne researchers reveals never-before-seen details about a type of thin film being explored for advanced microelectronics.

Research from a team led by scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory offers a new, nanoscopic view of complex oxides, which are promising for advanced microelectronics.

Complex oxides are multifunctional materials that could eventually lead to energy-efficient, advanced electronic memory components and quantum computing devices. Generally, these materials are produced layer-by-layer on an atomically matched substrate, a process known as epitaxial growth.

“Our study shows that this material is ready to go for future microelectronic applications, but it will require further research on ways to avoid these ripples.” — Saidur Rahman Bakaul, Argonne assistant materials scientist.

To use complex oxides in electronics, they need to be produced on silicon — an impossible task for existing epitaxial growth techniques, since the atomic structures of these two materials do not match. One possible workaround is to grow the complex oxides elsewhere and then transfer the film to another substrate. However, a key question arises: Will the local properties of a complex oxide thin film remain intact if you lift it from one substrate and deposit it on another?

The new research reveals insights about freestanding complex oxides that could eventually create an entirely new research field: complex oxide microelectronics. The work is detailed in a paper, “Ferroelectric Domain Wall Motion in Freestanding Single Crystal Complex Oxide Thin Film,” recently published in the journal Advanced Materials.

Using scanning probe microscopy, the team studied lead zirconium titanate (PZT), a type of single-crystal complex oxide ferroelectric thin film. Such single-crystal films have properties ideal for microelectronics — they are highly polarized, endurable and fast-switchable, making them suitable for future ferroelectric random-access memory chips, for example.

Growing these thin films requires temperatures of about 700 °C (1292 °F), which deteriorates the interfacial layer’s properties if directly grown on silicon. So the researchers grew the PZT on a more amenable substrate — a base of strontium titanate (STO) with a “sacrificial layer” of lanthanum strontium manganite (LSMO) sandwiched in between. To transfer the PZT thin film to another substrate, the researchers broke the bonds that united it with the LSMO.

“PZT grows beautifully on LSMO,” said Saidur Rahman Bakaul, an assistant materials scientist at Argonne who led the study. “We wanted to see what happens if we cut that interface.”

After transforming the PZT into a freestanding film, the research team flipped the film over and gently redeposited it onto an identical STO-LSMO substrate. This allowed for a first-ever view of PZT’s detached underside.

“It’s like looking at the other side of the moon, which you normally don’t see,” Bakaul said.

The team used electrostatic force microscopy with 20-nanometer-radius probes to measure the material’s local ferroelectric properties. Their analysis showed the local static properties of the bottom surface of freestanding PZT were quite similar compared to those of the top surface. This finding, Bakaul said, is very encouraging for future complex oxide microelectronics, because it confirms that the interfacial surface of the transferred PZT film is a high-quality ferroelectric layer. That means the transfer technique should be able to combine the best materials from different worlds, such as PZT (ferroelectric) and silicon (semiconductors). So far, no direct growth technique has achieved this without damaging the interfacial surface.

Using piezoresponse force microscopy images, scientists found that the detached layer’s ferroelectric domain wall velocity - a measure of the electrostatic energy landscape of complex oxides - was almost 1,000 times slower than in strongly bonded as-grown PZT films.

To find out why, the team first examined the atomic layers at the bottom surface of the PZT film with atomic force microscopy, which revealed anomalies on the surface. For an even closer look, they turned to Argonne's Center for Nanoscale Materials and Advanced Photon Source, both DOE Office of Science User Facilities, to use their joint hard X-ray nanoprobe to see the tilts in atomic planes, revealing never-before-seen ripples.

The ripples, Bakaul said, rise to the height of only a millionth of a pinhead’s diameter, but still can create a strong electric field that keeps the domain wall from moving, the theoretical analysis revealed. This claim was further supported with measurements from a scanning capacitance microscope.

The presence of such structural ripples in complex oxides, long regarded as nonbendable ceramics, is an exciting new scientific discovery and a future playground for exploring strong strain gradient-induced physical phenomena such as flexoelectric effects. However, in microelectronic devices, these tiny ripples can induce device-to-device variability.

The work, which was supported by DOE’s Office of Basic Energy Sciences, offers a unique and important level of detail about the properties of freestanding complex oxide thin films.

“Our study shows that this material is ready to go for future microelectronic applications,” Bakaul said, “but it will require further research on ways to avoid these ripples.”

Credit: 
DOE/Argonne National Laboratory

Study shows animal life thriving around Fukushima

image: These are macaque monkeys.

Image: 
UGA

Nearly a decade after the nuclear accident in Fukushima, Japan, researchers from the University of Georgia have found that wildlife populations are abundant in areas devoid of human life.

The camera study, published in the journal Frontiers in Ecology and the Environment, reports that more than 267,000 wildlife photos recorded more than 20 species, including wild boar, Japanese hare, macaques, pheasant, fox and the raccoon dog--a relative of the fox--in various areas of the landscape.

UGA wildlife biologist James Beasley said speculation and questions have come from both the scientific community and the general public about the status of wildlife years after a nuclear accident like those in Chernobyl and Fukushima.

This recent study, in addition to the team's research in Chernobyl, provides answers to the questions.

"Our results represent the first evidence that numerous species of wildlife are now abundant throughout the Fukushima Evacuation Zone, despite the presence of radiological contamination," said Beasley, associate professor at the Savannah River Ecology Laboratory and the Warnell School of Forestry and Natural Resources.

Species that are often in conflict with humans, particularly wild boar, were predominantly captured on camera in human-evacuated areas or zones, according to Beasley.

"This suggests these species have increased in abundance following the evacuation of people."

The team, which included Thomas Hinton, professor at the Institute of Environmental Radioactivity at Fukushima University, identified three zones for the research.

Photographic data was gathered from 106 camera sites from three zones: humans excluded due to the highest level of contamination; humans restricted due to an intermediate level of contamination; and humans inhabited, an area where people have been allowed to remain due to "background" or very low levels of radiation found in the environment.

The researchers based their designations on zones previously established by the Japanese government after the 2011 Fukushima Daiichi accident.

For 120 days, cameras captured over 46,000 images of wild boar. Over 26,000 of those images were taken in the uninhabited area, compared to approximately 13,000 in the restricted and 7,000 in the inhabited zones.
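A quick back-of-the-envelope check on those counts (using the article's rounded figures) shows how heavily the boar images skew toward the uninhabited zone:

```python
# Approximate wild boar image counts by zone, rounded from the article.
boar_images = {"uninhabited": 26_000, "restricted": 13_000, "inhabited": 7_000}

total = sum(boar_images.values())  # roughly 46,000 images in all
for zone, count in boar_images.items():
    # Share of all boar images captured in each zone.
    print(f"{zone}: {count / total:.0%} of boar images")
```

By this rough tally, well over half of all boar images came from the zone where humans are excluded, consistent with the authors' suggestion that boar abundance increased after people left.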

Other species seen in higher numbers in the uninhabited or restricted zones included raccoons, Japanese marten and Japanese macaque or monkeys.

Anticipating questions about physiological condition of the wildlife, Hinton said their results are not an assessment of an animal's health.

"This research makes an important contribution because it examines radiological impacts to populations of wildlife, whereas most previous studies have looked for effects to individual animals," said Hinton.

The human-inhabited zone served as the control zone for the research.

The scientists said although there is no previous data on wildlife populations in the evacuated areas, the close proximity and similar landscape of the human-inhabited zone made the area the ideal control for the study.

The team evaluated the impact of other variables: distance to road, time of activity as captured by the cameras' date-time stamps, vegetation type and elevation.

"The terrain varies from mountainous to coastal habitats, and we know these habitats support different types of species. To account for these factors, we incorporated habitat and landscape attributes such as elevation into our analysis," Beasley said.

"Based on these analyses, our results show that level of human activity, elevation and habitat type were the primary factors influencing the abundance of the species evaluated, rather than radiation levels."

The study's results indicate the activity pattern of most species aligned with their well-known history or behavior patterns. Raccoons, who are nocturnal, were more active during the night, while pheasants, which are diurnal animals, were more active during the day. However, wild boar inside the uninhabited area were more active during the day than boar in human-inhabited areas, suggesting they may be modifying their behavior in the absence of humans.

One exception to these patterns was the Japanese serow, a goat-like mammal. Normally far-removed from humans, they were most frequently seen on the camera footage in rural human-inhabited upland areas. The researchers suggest this might be a behavioral adjustment to avoid the rapidly growing boar population in the evacuated zone.

Credit: 
University of Georgia

Biodiverse forests better at storing carbon for long periods, says study

image: Teak (Tectona grandis) in a timber plantation stands unharvested in a protected area in Karnataka, India.

Image: 
Anand Osuri

As the effects of climate change are increasingly felt around the world, possible solutions, from reducing fossil fuel emissions to capturing carbon, have come to dominate policy discussions. Planting new forests and restoring existing ones have emerged as some of the best ways to capture CO2, since trees pull carbon out of the air during photosynthesis, then store it in their trunks and roots.

A new study, accepted in Environmental Research Letters, has found that diverse natural forests with a mix of tree species are more reliable and stable at absorbing and storing carbon than plantations dominated by just a few tree species, both over time and across diverse conditions. The study was co-authored by scientists from Columbia University's Earth Institute and its Department of Ecology, Evolution and Environmental Biology.

Scientists already understand that natural forests are better at sequestering carbon than more uniform, short-rotation plantations whose trees are harvested regularly. Less clear have been the relative carbon-storage benefits of natural forests versus monoculture tree plantations comprising just a few species that remain uncut for long periods.

The study looked at forests in India, where conservation laws have led to the preservation of both natural forests and former timberlands. It compared the ability of both kinds of forest to capture and store carbon in wet and dry conditions in five reserves in a mountainous region known as the Western Ghats. Among the study areas were former teak and eucalyptus plantations that have not been harvested for timber in recent years, as well as species-rich evergreen and deciduous tropical forests that were selectively logged until 1980.

The history of forest management and conservation in the Western Ghats make it an ideal location for such a study, said lead author Anand M. Osuri, a postdoctoral fellow at the Earth Institute and the Nature Conservancy. Many nature reserves in the Western Ghats include areas that were formerly managed as plantations. This creates a neat natural experiment for comparing natural forests and mature plantations under similar climatic and environmental conditions.

In field studies, the researchers analyzed tree-species richness and measured tree height and girth at one site, using this information to calculate the trees' above-ground biomass and carbon storage. Carbon capture rates, meanwhile, were estimated across all the sites using satellite detection of photosynthetic activity across a broad geographic area.

The study revealed a somewhat complex picture when it comes to carbon storage. Teak and eucalyptus plantations stored 30 to 50 percent less carbon than the natural evergreen forests, but nearly as much carbon as the moist-deciduous forests. But the natural forests showed higher stability of carbon capture across years, and especially proved their mettle in dry conditions. While tree plantations captured 4 to 9 percent more carbon than the evergreen and deciduous forests during wet seasons, they fared far worse during dry seasons, with a carbon capture rate up to 29 percent lower than that of the natural forests.
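The notion of "stability" of carbon capture can be made concrete with a coefficient of variation (standard deviation divided by mean) across seasons. The sketch below uses hypothetical capture rates, not the study's data, purely to illustrate the metric:

```python
from statistics import mean, pstdev

def cv(rates):
    """Coefficient of variation of seasonal carbon capture rates.
    Lower values indicate more stable (reliable) capture."""
    return pstdev(rates) / mean(rates)

# Hypothetical seasonal capture rates (arbitrary units): plantations do a
# bit better in wet seasons but drop sharply in dry ones, while natural
# forests stay comparatively steady.
plantation = [1.06, 1.09, 0.75, 0.71]   # wet, wet, dry, dry
natural    = [1.00, 1.02, 0.98, 0.96]

print(f"plantation CV:     {cv(plantation):.2f}")
print(f"natural forest CV: {cv(natural):.2f}")
```

Under these illustrative numbers, the plantation's capture rate varies several times more around its mean than the natural forest's does, which is the pattern the study reports in qualitative terms.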

Because climate models show that global warming will worsen droughts, the ability of natural forests to soak up carbon even during dry seasons was important, the authors say. The study concluded that even though tree plantations rival some natural forests for carbon capture, the plantations were unlikely to match the stability and hence reliability of carbon capture exhibited by forests, particularly in the face of increasing droughts and other climate disruptions. That holds critical lessons for conservationists and government officials, the authors say.

Greater stability of carbon capture in natural forests is one of several reasons why policies for protecting and regenerating such forests should be prioritized over raising plantation monocultures, Osuri said.

In addition to enhancing carbon storage, he added, such policies could also offer a much-needed boost to biodiversity conservation.

The examination of the success of natural forests versus tree plantations is especially timely. Recent international agreements, including the Bonn Challenge and the Paris Climate Accord, call for increases in tree cover as a way to address global warming. The study noted a worrisome trend, however: tree plantations composed of only a few species have expanded in recent decades, while mixed forests, especially those found in tropical areas, have contracted.

In India, the government has devoted significant resources to restoring natural forests. Still, more than half of the areas India reforested between 2015 and 2018 consisted of plantations with five or fewer tree species.

While it might be easier and cheaper to focus on one or two tree species in reforestation initiatives, the authors urged governments to deploy a broad variety of native trees species when looking at ways to ramp up carbon capture and stave off climate change.

Credit: 
Columbia Climate School

NASA's Hubble surveys gigantic galaxy

image: This Hubble Space Telescope photograph showcases the majestic spiral galaxy UGC 2885, located 232 million light-years away in the northern constellation Perseus. The galaxy is 2.5 times wider than our Milky Way and contains 10 times as many stars. A number of foreground stars in our Milky Way can be seen in the image, identified by their diffraction spikes. The brightest star photobombs the galaxy's disk. The galaxy has been nicknamed "Rubin's galaxy," after astronomer Vera Rubin (1928-2016), who studied the galaxy's rotation rate in search of dark matter.

Image: 
Credits: NASA, ESA and B. Holwerda (University of Louisville)

This majestic spiral galaxy might earn the nickname the "Godzilla galaxy" because it may be the largest known in the local universe. The galaxy, UGC 2885, is 2.5 times wider than our Milky Way and contains 10 times as many stars.

But it is a "gentle giant," say researchers, because it looks like it has been sitting quietly over billions of years, possibly sipping hydrogen from the filamentary structure of intergalactic space. This fuels modest ongoing star birth at half the rate of our Milky Way. In fact, its supermassive central black hole is a sleeping giant, too. Because the galaxy does not appear to be feeding on much smaller satellite galaxies, it is starved of infalling gas.

The galaxy has been nicknamed "Rubin's galaxy," after astronomer Vera Rubin (1928 - 2016), by Benne Holwerda of the University of Louisville, Kentucky, who observed the galaxy with NASA's Hubble Space Telescope.

"My research was in a large part inspired by Vera Rubin's work in 1980 on the size of this galaxy," said Holwerda. Rubin measured the galaxy's rotation, which provided evidence for dark matter, the unseen mass that accounts for most of the galaxy's bulk as inferred from its rotation rate. "We consider this a commemorative image. This goal to cite Dr. Rubin in our observation was very much part of our original Hubble proposal."

In results being presented at the winter American Astronomical Society meeting in Honolulu, Hawaii, Holwerda is seeking to understand what led to the galaxy's monstrous size. "How it got so big is something we don't quite know yet," said Holwerda. "It's as big as you can make a disk galaxy without hitting anything else in space."

One clue is that the galaxy is fairly isolated in space and doesn't have any nearby galaxies to crash into and disrupt the shape of its disk.

Did the monster galaxy gobble up much smaller satellite galaxies over time? Or did it just slowly accrete gas for new stars? "It seems like it's been puttering along, slowly growing," Holwerda said. Using Hubble's exceptional resolution, his team is counting the number of globular star clusters in the galaxy's halo -- a vast shell of faint stars surrounding the galaxy. An excess of clusters would yield evidence that they were captured from smaller infalling galaxies over many billions of years.

NASA's upcoming James Webb Space Telescope could be used to explore the center of this galaxy as well as the globular cluster population. NASA's planned Wide Field Infrared Survey Telescope (WFIRST) would give an even more complete census of this galaxy's cluster population, especially that of the whole halo. "The infrared capability of both space telescopes would give us a more unimpeded view of the underlying stellar populations," said Holwerda. This complements Hubble's visible-light ability to track wispy star formation throughout the galaxy.

A number of foreground stars in our Milky Way can be seen in the image, identified by their diffraction spikes. The brightest appears to sit on top of the galaxy's disk, though UGC 2885 is really 232 million light-years farther away. The giant galaxy is located in the northern constellation Perseus.

Credit: 
NASA/Goddard Space Flight Center

Nerve stimulation may benefit women with fibromyalgia

A treatment involving electrical nerve stimulation helped women with fibromyalgia in a recent clinical trial. The findings are published in Arthritis & Rheumatology.

Fibromyalgia is characterized by pain and fatigue, particularly during physical activity. Transcutaneous electrical nerve stimulation (TENS) delivers electrical currents through the skin to activate nerve pathways in the body that inhibit pain.

In this trial, TENS resulted in significant improvements in movement-related pain and fatigue compared with placebo or no TENS.

The TENS treatment was given along with standard treatments for fibromyalgia. Thus, it can provide people with a tool to help manage pain and fatigue without taking additional pain medications.

"TENS is available over the counter, is inexpensive, and is safe and easy to use," said senior author Kathleen A. Sluka, PT, PhD, FAPTA, of the University of Iowa. "It can provide a self-management option for people with chronic pain, particularly fibromyalgia, to provide an additional level of pain relief."

Credit: 
Wiley