
Transplantation followed by antiviral therapy cured hepatitis C

Transplantation followed by antiviral therapy cured hepatitis C in 100 percent of patients receiving kidneys from infected donors

Abstract: http://annals.org/aim/article/doi/10.7326/M18-0749
Editorial: http://annals.org/aim/article/doi/10.7326/M18-1781
URLs go live when the embargo lifts

Twenty patients who received kidneys transplanted from hepatitis C virus (HCV)-infected donors experienced HCV cure, good quality of life, and excellent renal function at one year. These findings offer additional evidence that kidneys from HCV-infected donors may be a valuable transplant resource. Results from the single-group trial are published in Annals of Internal Medicine.

Organs from HCV-infected deceased donors are often discarded. However, preliminary data from two small trials suggest that HCV-infected kidneys could be safely transplanted into HCV-negative patients.

Researchers from Penn Medicine report 12-month HCV treatment outcomes, estimated glomerular filtration rate, and quality of life for 10 kidney recipients in the THINKER-1 (Transplanting Hepatitis C kidneys Into Negative KidnEy Recipients) trial, and 6-month data on 10 additional recipients. All of the participants underwent lifesaving transplant with kidneys infected with genotype 1 HCV and received antiviral therapy on day 3 after transplantation. The 20 recipients achieved a 100 percent cure rate, excellent renal function, and stable to improved quality of life.

According to the researchers, these findings suggest that kidneys from HCV-infected donors may represent an important opportunity to expand the donor pool. Patients without HCV should be well-informed about the benefits and risks so that they may engage in shared decision-making.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Peter P. Reese, MD, MSCE, please contact Abbey Anderson Hunton at Abbey.Anderson@uphs.upenn.edu.

Women internists earn less than men whether they are generalists, hospitalists, or sub-specialists
ACP Research Report Calls for Gender Equity in Physician Compensation and Career Advancement

Abstract: http://annals.org/aim/article/doi/10.7326/M18-0693
Editorial: http://annals.org/aim/article/doi/10.7326/M18-1879
URLs go live when the embargo lifts

Women internists earn less than men whether they are generalists, hospitalists, or subspecialists. Factors that contribute to disparities in compensation may include choice of occupation, time taken away from work due to family obligations, gender discrimination, and productivity levels. A brief research report from the American College of Physicians (ACP) is published in Annals of Internal Medicine.

Women comprise over one third of the active U.S. physician workforce, an estimated 46 percent of all physicians-in-training, and more than half of all medical students. Despite progress toward gender diversity in the U.S. physician workforce, disparities in compensation and career advancement persist.

Researchers for ACP conducted a cross-sectional survey of a nationally representative panel of nonstudent ACP members in the U.S. to describe physician compensation by gender. They found that female internists earn less than their male counterparts, and the disparity persisted even when controlling for specialty, number of hours worked, and practice characteristics. Median annual salary for men was on average $50,000 higher than for women, with women earning 80 cents for every dollar earned by men. Further, the data highlighted that female physicians earned less than men in every specialty, with salary differences ranging from $29,000 for internal medicine specialists to $45,000 for subspecialists.
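As a quick arithmetic check, the reported $50,000 gap and the 80-cents-on-the-dollar ratio are mutually consistent: together they imply medians of roughly $250,000 for men and $200,000 for women. These implied medians are illustrative figures derived from the two reported numbers, not values quoted from the report:

```python
# Illustrative check: the implied medians below are derived from the two
# reported figures (a $50,000 gap and an 0.80 pay ratio), not quoted
# from the ACP report itself.
gap = 50_000            # reported difference in median annual salary
ratio = 0.80            # women reportedly earn 80 cents per dollar

men = gap / (1 - ratio)         # implied median for men
women = men - gap               # implied median for women

print(round(men), round(women))  # 250000 200000
assert abs(women / men - ratio) < 1e-9
```

The check confirms the two headline numbers describe the same underlying medians rather than two independent findings.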

According to the authors, this research is a step forward in ensuring that physicians are compensated equally and fairly at all stages of their professional careers in accordance with their skills, knowledge, competencies, and expertise regardless of their characteristics or gender.

Promoting gender equity and eliminating the compensation inequities that physicians can face is a longstanding goal of ACP. In April, the College published a paper, Achieving Gender Equity in Physician Compensation and Career Advancement, in Annals of Internal Medicine calling for the adoption of equitable compensation policies in all organizations that employ physicians; investment in leadership development, negotiation, and career development programs; and parental and family leave policies.

Media contact: For an embargoed PDF or to speak with an author from ACP, please contact Julie Hirschhorn at jhirschhorn@acponline.org.

Bariatric surgery linked to significant reduction in microvascular complications of type 2 diabetes

Abstract: http://annals.org/aim/article/doi/10.7326/M17-2383
URLs go live when the embargo lifts

Compared with usual care, bariatric surgery was associated with half the incidence of microvascular disease at 5 years in adults with type 2 diabetes. These findings add to a growing body of evidence suggesting that bariatric surgery not only improves glucose, blood pressure, and lipid control but also likely reduces macrovascular and microvascular complications and improves survival in patients with severe obesity and type 2 diabetes. Results from a matched cohort study are published in Annals of Internal Medicine.

Research has shown that about half of people with diabetes and severe obesity who get bariatric surgery maintain long-term glucose control without medication. But for many patients, avoiding microvascular complications affecting the nerves of the feet and hands (neuropathy), the kidneys (nephropathy), and the eyes (retinopathy) is of greater concern.

Researchers from Kaiser Permanente Washington Health Research Institute studied more than 4,000 obese patients with type 2 diabetes who underwent bariatric surgery to determine its effect on microvascular complications. They found that the risk of all microvascular complications at 5 years after surgery was less than half that of a matched control group of more than 11,000 obese patients who received usual medical care for their diabetes that did not include surgery. Overall, bariatric surgery was associated with a two-thirds decrease in neuropathy, one-half decrease in nephropathy, and one-third decrease in retinopathy.

According to the researchers, these results suggest that everyone with diabetes and severe obesity should have a conversation with their doctor about whether bariatric surgery is a reasonable treatment option for them, weighing the risks and benefits.

Media contact: For an embargoed PDF or author contact information, please contact Lauren Evans at laevans@acponline.org. To speak with the lead author, David Arterburn, MD, MPH, please contact Rebecca Hughes at Rebecca.F.Hughes@kp.org.

Patient characteristics an important factor in determining optimal blood pressure target
Clinicians go 'Beyond the Guidelines' to debate treatment for an elderly patient who does not fit neatly within parameters of current guidelines

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1312
URLs go live when the embargo lifts

Current guidelines differ on the optimum threshold above which to begin antihypertensive therapy and what the target blood pressure should be after treatment has begun. A primary care physician and a gerontologist, both from Beth Israel Deaconess Medical Center (BIDMC), debate care for an elderly patient with hypertension in a multicomponent educational article being published in Annals of Internal Medicine.

Hypertension is prevalent and is the most important risk factor for cardiovascular disease. Guidelines from the American College of Physicians/American Academy of Family Physicians recommend initiating antihypertensive therapy for patients aged 60 years or older if systolic blood pressure is 150 mm Hg or higher and treating to that same target; for high-risk patients, they recommend a lower threshold and target of 140 mm Hg. The American College of Cardiology/American Heart Association guideline, which is based largely on SPRINT (the Systolic Blood Pressure Intervention Trial), advises a systolic blood pressure target below 130 mm Hg for patients 65 years or older.

In a recent BIDMC Grand Rounds, two experts debated care for a 79-year-old man with a mean blood pressure of 157/68 mm Hg over 2 years of readings. The patient was overweight with some comorbidities, but felt that he was in relatively good health. The patient reported that in the past he had taken a blood pressure medication and did not tolerate it well. Internist Jennifer Beach, MD, and gerontologist Lewis Lipsitz, MD, both considered the patient's diagnosis, comorbidities, and cardiovascular risk factors before suggesting a target blood pressure and treatment strategy. Dr. Beach recommended a target blood pressure below 140 mm Hg, considering the patient's risk for adverse events. However, Dr. Lipsitz felt that the patient's risk for cardiovascular disease warranted a lower blood pressure target of 130 mm Hg. Both Dr. Beach and Dr. Lipsitz agreed that if treatment were needed, an ACE inhibitor or ARB should be the first-line therapy.

All 'Beyond the Guidelines' papers are based on the Department of Medicine Grand Rounds at BIDMC in Boston and include print, video, and educational components. A list of topics is available at http://www.annals.org/grandrounds.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To speak with someone regarding BIDMC Beyond the Guidelines, please contact Jennifer Kritz at jkritz@bidmc.harvard.edu.

Also new in this issue:

Value-Based Health Care Meets Cost-Effectiveness Analysis
Joel Tsevat, MD, MPH, and Christopher Moriates, MD
Medicine and Public Issues
Abstract: http://annals.org/aim/article/doi/10.7326/M18-0342

Credit: 
American College of Physicians

Scientists create atomic glue gun to build better nucleic acid drugs

When it comes to certain molecules, shape makes all the difference. The shape of limonene, for instance, a compound produced by citrus fruits, determines whether it tastes like orange juice or turpentine. In the case of therapeutics, the 3D shape of a molecule can be critical to activity.

Now, scientists at Scripps Research and Bristol-Myers Squibb have created a powerful new tool for precisely controlling the 3D architecture--also called stereochemistry--of linkages known as thiophosphates, found in some promising new drugs that target genetic molecules and other disease targets, according to a paper published today in Science.

Dubbed phosphorus-sulfur incorporation (PSI, for short), the first-of-its-kind technology acts like an atomic glue gun, binding nucleosides into oligomers with specific, preprogrammed spatial configurations at the thiophosphate linkage. The thiophosphate linkages are analogues of nature's method of connecting nucleosides and offer multiple advantages to drug development, but add the complexity of stereochemistry at the phosphorus atom. PSI provides an unprecedented, inexpensive, and simple method to enable the development of single isomers of these compounds, which can have hundreds of thousands of stereoisomers.

"Thiophosphate-based nucleotide compounds represent remarkable therapeutic potential, but our understanding of these systems has been hindered by an inability to easily control the stereochemistry of the thiophosphate during drug synthesis," says Phil Baran, PhD, a Scripps Research professor and senior scientist on the study. "PSI provides a robust and stereo-controlled method of synthesizing oligonucleotide drugs, allowing us to create, analyze and manufacture stereoisomers of a drug candidate in ways that were previously only possible with expensive and inefficient methods."

Martin Eastgate, PhD, co-senior author on the Science paper and lead scientist on the Bristol-Myers Squibb team, says that by providing a simple and generalized method for controlling the stereochemistry of the phosphorus-centered bonds, called thiophosphate linkages, PSI overcomes a significant hurdle to discovering the next generation of innovative medicines.

"The invention of these stereoselective, simple, scalable and stable reagents provides a solution to this complex problem," says Eastgate, group director and head of chemical research in Bristol-Myers Squibb's Chemical and Synthetic Development organization. "We hope the invention of the PSI reagent class will prove to be an enabling technology to the scientific community."

To build the long chain of nucleotides present in oligonucleotides, the current manufacturing technique relies on the unnatural, but highly reactive, phosphorus(III) oxidation state. One of the major limitations of applying standard P(III) chemistry to thiophosphate synthesis is a lack of control over the 3D shape of the new phosphorus-based stereocenter.

"Using P(III) chemistry to produce even a modest amount of the compound as a single stereoisomer is challenging, making it difficult to fully assess the impact of molecular shape on biological function," says Justine deGruyter, a Scripps Research graduate student and one of the first authors on the Science paper. To overcome these limitations, the Bristol-Myers Squibb and Scripps researchers explored using a different form of phosphorus, P(V), that was long eschewed by synthetic chemists due to its low reactivity. While P(V) is generally less reactive than P(III), which can make it more challenging to use in building molecules in the laboratory, the scientists suspected its superior stability could translate into far better control over the three-dimensional molecular shape during synthesis.

Over the course of two years, the Scripps and Bristol-Myers Squibb teams collaborated to develop an effective method of using P(V) to produce desired stereoisomers of molecules. They focused on finding a way to bind together chains of nucleosides with a traceless reagent that wouldn't leave behind unwanted atoms. The result of this was the reagent PSI.

The researchers have used PSI to generate pure stereoisomers of cyclic dinucleotides (CDNs), the basis of CDN drug candidates that have generated much excitement as a new type of cancer immunotherapy. CDN drugs target a protein called STING (STimulator of INterferon Genes) to activate the body's immune system against cancers.

"CDNs show incredible promise for activating the immune system against cancers, but until now there was no simple way to control their stereochemistry," says Kyle Knouse, a graduate student in Baran's lab and first author on the Science paper. "The ability to efficiently and inexpensively create pure stereoisomers will provide a powerful tool to advance CDN research."

In the case of CDNs and antisense oligonucleotide (ASO) drugs, the ability to prepare a single stereoisomer will enable scientists to explore which shapes of the drugs are most therapeutically effective and to generate those stereoisomers for clinical use. Another advantage of PSI is that it is traceless, thus avoiding the time and cost of having to remove it from the drug product during manufacturing.

The Bristol-Myers Squibb and Scripps researchers are excited to continue exploring other ways to use these reagents to build complex molecules.

Credit: 
Scripps Research Institute

Epigenetic markers of ovarian cancer

image: Johns Hopkins researchers in collaboration with Insilico Medicine identify that silencing of the GULP1 gene by methylation plays an important role in ovarian carcinogenesis.

Image: 
Insilico Medicine

July 6, 2018 - Johns Hopkins researchers, in collaboration with Insilico Medicine, a biotechnology company based in Rockville, Maryland, have identified that silencing of GULP1 gene expression by methylation plays an important role in ovarian carcinogenesis.

In a report published in the journal Cancer Letters, the authors, including researchers from both Johns Hopkins and Insilico Medicine, used an integrated approach: they coupled identification of genome-wide expression patterns in multiple cohorts of primary ovarian cancer samples and normal ovarian surface epithelium with computational analysis of gene expression data, leading to the discovery of novel cancer-specific epigenetically silenced genes. The study reveals 43 genes abnormally methylated in ovarian cancer and identifies methylation of an engulfment gene, GULP1, as a potential biomarker of ovarian cancer.

GULP1 was found to be methylated in at least one third of the more than 400 ovarian cancer cases analyzed. An inverse correlation between GULP1 methylation and expression was also observed in the TCGA data set, further validating these findings. Furthermore, GULP1 methylation was associated with late-stage disease and worse overall survival, suggesting that GULP1 expression plays an important role in ovarian cancer pathogenesis and shedding light on the mechanisms underlying GULP1-mediated growth suppression.

The authors used iPANDA, a bioinformatics software suite for analysis of intracellular signaling pathway activation based on transcriptomic data (developed by Insilico Medicine, a biotechnology company located on the Johns Hopkins University Montgomery County Campus that focuses on artificial intelligence-driven drug and biomarker discovery platforms), to compare signaling profiles between GULP1-low and GULP1-high ovarian tumors in the TCGA ovarian cancer data set. This analysis predicted that mitogenic and survival signaling pathways such as AKT, MAPK/ERK, RAS, ILK, PAK/P38, WNT, and JNK were significantly upregulated in GULP1-low tumors. These pro-survival signaling axes play a crucial role in cancer initiation, progression, and maintenance in various solid tumors, including ovarian cancer, and may contribute to acquisition of an aggressive phenotype via inhibition of apoptosis and induction of cell proliferation. In line with these observations, pro-apoptotic and anti-proliferative pathways, including those associated with TP53 and TGF-β signaling, were predicted to be downregulated in most GULP1-low tumors.

Consistent with the in silico analysis, reconstitution of GULP1 expression in vitro resulted in marked suppression of MAPK and AKT phosphorylation, along with a concomitant reduction in cell proliferation, survival, and invasion, while GULP1 depletion led to the opposite effects. Taken together, these findings indicate that GULP1 may exert tumor-suppressive activities by tethering members of multiple cross-talking pathways involved in cell growth and survival control.

While this study indicates that epigenetic regulation of GULP1 expression may play an important role in ovarian cancer, and suggests its potential clinical value as a promising prognostic biomarker, other possible mechanisms for GULP1 down-regulation, such as microRNA-mediated silencing, transcriptional regulation, or homozygous deletions, remain to be examined. A deeper understanding of the role of GULP1 in the development of ovarian cancer may therefore offer additional possibilities for the management of this disease, and further studies are planned to fully elucidate the role of GULP1 in tumorigenesis.

Credit: 
InSilico Medicine

New system selectively sequesters toxins from water

image: Contaminants can be removed from fluids that traverse a maze-like path between electrodes in a technology developed by Rice University engineers. The photo shows Rice postdoctoral researcher Kuichang Zuo placing a separator that channels water through the system.

Image: 
Jeff Fitlow/Rice University

HOUSTON - (Aug. 6, 2018) - Rice University scientists are developing technology to remove contaminants from water - but only as many as necessary.

The Rice lab of engineer Qilin Li is building a treatment system that can be tuned to selectively pull toxins from drinking water and wastewater from factories, sewage systems and oil and gas wells. The researchers said their technology will cut costs and save energy compared to conventional systems.

"Traditional methods to remove everything, such as reverse osmosis, are expensive and energy intensive," said Li, the lead scientist and co-author of a study about the new technology in the American Chemical Society journal Environmental Science & Technology. "If we figure out a way to just fish out these minor components, we can save a lot of energy."

The heart of Rice's system is a set of novel composite electrodes that enable capacitive deionization. The charged, porous electrodes selectively pull target ions from fluids passing through the maze-like system. When the pores get filled with toxins, the electrodes can be cleaned, restored to their original capacity and reused.

"This is part of a broad scope of research to figure out ways to selectively remove ionic contaminants," said Li, a professor of civil and environmental engineering and of materials science and nanoengineering. "There are a lot of ions in water. Not everything is toxic. For example, sodium chloride (salt) is perfectly benign. We don't have to remove it unless the concentration gets too high.

"For many applications, we can leave non-hazardous ions behind, but there are certain ions that we need to remove," she said. "For example, in some drinking water wells, there's arsenic. In our drinking water pipes, there could be lead or copper. And in industrial applications, there are calcium and sulfate ions that form scale, a buildup of mineral deposits that foul and clog pipes."

The proof-of-principle system developed by Li's team removed sulfate ions, which form scale and can give water a bitter taste and act as a laxative. The system's electrodes were coated with activated carbon, which was in turn coated by a thin film of tiny resin particles held together by quaternized polyvinyl alcohol. When sulfate-contaminated water flowed through a channel between the charged electrodes, sulfate ions were attracted by the electrodes, passed through the resin coating and stuck to the carbon.

Tests in the Rice lab showed the positively charged coating on the cathode preferentially captured sulfate ions over salt at a ratio of more than 20 to 1.

The electrodes retained their properties over 50 cycles. "But in fact, in the lab, we've run the system for several hundred cycles and I don't see any breaking or peeling of the material," said Kuichang Zuo, lead author of the paper and a postdoctoral researcher in Li's lab. "It's very robust."

Li said the system is intended to work with current commercial water-treatment systems. "The true merit of this work is not that we were able to selectively remove sulfate, because there are many other contaminants that are perhaps more important," she said. "The merit is that we developed a technology platform that we can use to target other contaminants as well by varying the composition of the electrode coating."

The Rice team is developing coatings for other contaminants and working with labs at the University of Texas at El Paso and Arizona State University on large-scale test systems. Zuo said it should also be possible to scale systems down for in-home water purification.

Credit: 
Rice University

Workshop advances plans for coping with disruptions on ITER

image: Participants in the three-day workshop on mitigating disruptions in the ITER fusion facility experiment.

Image: 
Elle Starkman/PPPL Office of Communications

The sixth Annual Theory and Simulation of Disruptions Workshop at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) made substantial progress toward planning a system for mitigating disruptions on ITER, the international experiment under construction in France to demonstrate the feasibility of fusion power. Disruptions, the sudden loss of heat in plasma that halts fusion reactions, can seriously damage ITER and other doughnut-shaped fusion facilities called tokamaks, and are among the major challenges facing the international experiment.

Design of a disruption mitigation system for ITER is a primary responsibility of U.S. fusion scientists, said Amitava Bhattacharjee, head of the Theory Department at PPPL and organizer of the June 16-19 workshop attended by some 35 U.S. and international physicists.

Fusion, the process that powers the sun and stars, is the fusing of light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei -- that generates massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of energy to generate electricity.

Topics discussed during the three-day event covered theoretical and experimental research on issues ranging from runaway electrons -- searing laser-like beams of high voltage and current that can damage tokamaks -- to machine-learning software for predicting disruptions in time to tame them. "Controlling disruptions, particularly runaway electrons, is among the most important challenges confronting ITER," Bhattacharjee said. "Working closely with our colleagues at ITER, we must develop a plan over the next few years for a reliable disruption mitigation system for this facility."

Shattered pellet injection

A key method outlined during the workshop is injection of shattered pellets, composed of elements such as boron or beryllium, to control ITER disruptions. The technique has chiefly been tested on the DIII-D National Fusion Facility that General Atomics operates for the DOE in San Diego, and important results from this facility were presented by DIII-D scientists. The United Kingdom's Joint European Torus (JET), the largest and most powerful tokamak in operation today, will begin shattered pellet testing in December.

Workshop presenters included Daniele Carnevale of the University of Rome Tor Vergata, who discussed control of runaway electron beams on the Frascati Tokamak Upgrade (FTU) in Frascati, Italy, and the TCV tokamak in Lausanne, Switzerland, two medium-sized fusion facilities. Carnevale said the experiments demonstrated improved control of runaway electrons through use of a fast-control feedback system that held the plasma in place, but full control of runaways has yet to be achieved. Feedback control will next be tested on JET, he said.

Among theory speakers was physicist Dylan Brennan of PPPL and Princeton University, who serves as principal investigator of the Simulation Center for Runaway Electron Avoidance and Mitigation (SCREAM). The two-year-old center combines simulations and data from worldwide experiments to explore the causes and solutions for runaway electrons. Brennan said the group has developed a rigorous way to couple large codes that simulate runaway electron generation with codes that simulate disruptions. Such coupling requires coordination of experts in these codes, he said.

Simulation tools are rapidly advancing, according to Bhattacharjee. Such tools are being developed to simulate and predict the behavior of plasmas in ITER. "Predictive capability is what we must have for ITER," he pointed out. "ITER cannot be used for learning about disruptions -- it must avoid or mitigate disruptions and operate successfully."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Nanotube 'rebar' makes graphene twice as tough

image: Rice University graduate student Emily Hacopian holds the platform she used to study the strength of rebar graphene under a microscope. Hacopian and colleagues discovered that reinforcing graphene with carbon nanotubes makes the material twice as tough.

Image: 
Jeff Fitlow/Rice University

HOUSTON - (Aug. 3, 2018) - Rice University researchers have found that fracture-resistant "rebar graphene" is more than twice as tough as pristine graphene.

Graphene is a one-atom-thick sheet of carbon. On the two-dimensional scale, the material is stronger than steel, but because graphene is so thin, it is still subject to ripping and tearing.

Rebar graphene is the nanoscale analog of rebar (reinforcement bars) in concrete, in which embedded steel bars enhance the material's strength and durability. Rebar graphene, developed by the Rice lab of chemist James Tour in 2014, uses carbon nanotubes for reinforcement.

In a new study in the American Chemical Society journal ACS Nano, Rice materials scientist Jun Lou, graduate student and lead author Emily Hacopian and collaborators, including Tour, stress-tested rebar graphene and found that nanotube rebar diverted and bridged cracks that would otherwise propagate in unreinforced graphene.

The experiments showed that nanotubes help graphene stay stretchy and also reduce the effects of cracks. That could be useful not only for flexible electronics but also electrically active wearables or other devices where stress tolerance, flexibility, transparency and mechanical stability are desired, Lou said.

Both the lab's mechanical tests and molecular dynamics simulations by collaborators at Brown University revealed the material's toughness.

Graphene's excellent conductivity makes it a strong candidate for devices, but its brittle nature is a downside, Lou said. His lab reported two years ago that graphene is only as strong as its weakest link. Those tests showed the strength of pristine graphene to be "substantially lower" than its reported intrinsic strength. In a later study, the lab found molybdenum diselenide, another two-dimensional material of interest to researchers, is also brittle.

Tour approached Lou and his group to carry out similar tests on rebar graphene, made by spin-coating single-walled nanotubes onto a copper substrate and growing graphene atop them via chemical vapor deposition.

To stress-test rebar graphene, Hacopian, Yang and colleagues had to pull it to pieces and measure the force that was applied. Through trial and error, the lab developed a way to cut microscopic pieces of the material and mount it on a testbed for use with scanning electron and transmission electron microscopes.

"We couldn't use glue, so we had to understand the intermolecular forces between the material and our testing devices," Hacopian said. "With materials this fragile, it's really difficult."

Rebar didn't keep graphene from ultimate failure, but the nanotubes slowed the process by forcing cracks to zig and zag as they propagated. When the force was too weak to completely break the graphene, nanotubes effectively bridged cracks and in some cases preserved the material's conductivity.

In earlier tests, Lou's lab showed graphene has a native fracture toughness of 4 MPa·√m. In contrast, rebar graphene has an average toughness of 10.7 MPa·√m, he said.

Simulations by study co-author Huajian Gao and his team at Brown confirmed results from the physical experiments. Gao's team found the same effects in simulations with orderly rows of rebar in graphene as those measured in the physical samples with rebar pointing every which way.

"The simulations are important because they let us see the process on a time scale that isn't available to us with microscopy techniques, which only give us snapshots," Lou said. "The Brown team really helped us understand what's happening behind the numbers."

He said the rebar graphene results are a first step toward the characterization of many new materials. "We hope this opens a direction people can pursue to engineer 2D material features for applications," Lou said.

Credit: 
Rice University

Groundbreaking poplar study shows trees can be genetically engineered not to spread

CORVALLIS, Ore. - The largest field-based study of genetically modified forest trees ever conducted has demonstrated that genetic engineering can prevent new seedlings from establishing.

The "containment traits" that Oregon State University researchers engineered in the study are important because of societal concerns over gene flow - the spread of genetically engineered or exotic and invasive trees or their reproductive cells beyond the boundaries of plantations.

"There's still more to know and more research to be done, but this looks really good," said corresponding author Steve Strauss, distinguished professor of forest biotechnology at OSU. "It's very exciting."

Findings from the study - which looked at 3,300 poplar trees in a 9-acre tract over seven growing seasons - were published today in Frontiers in Bioengineering and Biotechnology.

Poplars are fast growing and the source of many products, from paper to pallets to plywood to frames for upholstered furniture.

In trees like poplars that have female and male individuals, female flowers produce the seeds and male flowers make the pollen needed for fertilization.

Strauss and colleagues in the Department of Forest Ecosystems and Society assessed a variety of approaches for making both genders of trees sterile, focusing on 13 genes involved in the making of flowers or controlling the onset of reproduction.

Individually and in combination, the genes had their protein function or RNA expression modified with the goal of obtaining sterile flowers or a lack of flowering.

The upshot: Scientists discovered modifications that prevented the trees from producing viable sexual propagules without affecting other traits, and did so reliably year after year. The studies focused on a female, early-flowering poplar that facilitates research, but the genes they targeted are known to affect both pollen and seed and thus should provide general approaches to containment.

In addition to the findings, the research was notable for its scope, duration, and broad network of funders, both government and industry.

"I'm proud that we got the research done," Strauss said. "It took many years and many people doing it, managing it.

"People have this fear that GMO trees will take over the world, but these are containment genes that make taking over the world essentially impossible," he said. "If something is GMO, people assume it's dangerous - it's guilty until proven safe in the minds of many and in our regulations today. In contrast, scientists say the focus should be on the trait and its value and safety, not the method used."

At the start of the research, Strauss wondered whether the trees would look normal, survive, and express their new traits stably and reliably. The answer to each question was a strong yes.

"Will our trees be OK, will they be variable or unpredictable? The trees were fine," he said. "Year after year, the containment traits reliably worked where we got the genetics right. Not all of the constructs worked but that's why you do the research."

Strauss also noted that newer genetic approaches in his laboratory, especially CRISPR-based gene editing, are making the production of reliably contained and improved trees even easier and more efficient.

He pointed out that "the work focused on pollen and seeds, but poplar can also spread vegetatively - for example by root sprouts. But those are far slower, much narrower in distance, and far easier to control in and around plantations."

Credit: 
Oregon State University

Locusts help uncover the mysteries of smell

image: A palate-clearing sniff of coffee inspired Barani Raman's smell research.

Image: 
Barani Raman

Understanding how a sensory input becomes an experience -- how molecules released by a blooming flower, for instance, become the internal experience of smelling a rose -- has for millennia been a central question of philosophy.

In more recent times, it has also been a question for scientists. One way to approach it is to understand the physical brain processes behind sensory experiences. Historically, scientists have proposed different ways to describe what is happening by positing that a certain set of neurons must fire; a certain sequence of firing that must occur; or a combination of the two.

But according to a research team from the School of Engineering & Applied Science at Washington University in St. Louis, these descriptions do not account for the variability of the real world. Smells do not occur in a vacuum. The team wanted to find out what happened when sensory input was presented in sequences, more akin to what happens in the real world.

They turned to locusts.

In a paper slated for publication in Nature Communications, researchers found that in locusts, only a subset of neurons associated with a particular scent would fire when that scent was presented in a dynamic environment that included other scents. Although there was not a one-to-one relationship between a pattern of neurons activated and a specific smell, the researchers were able to determine how the locusts could still recognize a scent; it comes down to the locust being flexible in its interpretation.

"There is variability because of stimulus history," said Barani Raman, associate professor of biomedical engineering, "so flexibility is necessary to compensate."

For the experiments, the team of Washington University engineers, which included Raman, graduate research assistants Srinath Nizampatnam and Rishabh Chandak, and postdoctoral research fellow Debajit Saha, first had to train the locusts using classical conditioning, much as Pavlov trained his dogs. A machine administered a puff of the target scent, hexanol, to hungry locusts, then rewarded them with a treat: grass. After enough rounds (usually six), the locusts would open their palps -- small organs outside of their mouths that function in a similar way to lips or tongues in humans -- after they smelled hexanol, in anticipation of the grass.

Once the locusts were trained, the testing began. The locusts were exposed to the "target" odor, hexanol, either on its own or after the introduction of a different scent, called a "distractor."

Each time the target odor was introduced on its own, a locust's neural activity was the same. But when the locusts were exposed to a distractor smell first, different combinations of neurons fired when the locusts were subsequently exposed to the target.

This is the variability based on context. What has been previously smelled (and even unrelated brain states, such as hunger) can affect how a brain reacts to the same input. If that were the end of it, though, smells would rarely, if ever, be recognizable.

Imagine entering a coffee shop and buying a freshly baked chocolate chip cookie. As you bring it to your mouth, you inhale and smell that comforting, chocolate chip cookie smell. The next day, you head to a tea shop. Another batch of freshly baked cookies calls your name. If variability (induced by prior exposure to tea or coffee) alone determined how smells are processed, the scent of tea shop cookie, wafting into your nose after a strong Earl Grey, couldn't possibly smell the same as it did after you caught a whiff of Sumatra at the coffee shop.

But just as humans recognize the smell of a chocolate chip cookie in either setting, the locusts recognized the target -- even though their neurons were firing in a variety of different ways -- as evidenced by their palps, which opened as per their conditioning.

So there had to be more to the story than variability when it came to recognizing smells. The team wanted to know if there was a pattern, or a way to discern, via brain activity, how the locusts were smelling the target odorant despite the variability in brain activity.

As it turned out, there is a way. "The rules are very simple," Raman said. "An OR-of-ANDs logical operation was sufficient to compensate for variability and allow flexible decoding."

Think of an "ideal" chair: it has four legs, a seat, two armrests, and back support. If you only recognized a chair with all of these, and only these, attributes, you would miss out on a lot of good chairs -- those on a pedestal, those without armrests, etc. To be able to generalize, there needs to be some flexibility in what's recognized as a chair. One simple way is to allow any object that has two or three of the four features usually associated with a chair to be recognized as one.

The OR-of-ANDs logical operation for recognizing a chair might be [four legs AND seat] OR [seat AND back support]. In the same way, locusts show a fixed pattern of brain activity when smelling the target odorant alone, but a flexible combination involving just some of those same neurons will fire when smelling the target after smelling, say, an apple.
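The OR-of-ANDs rule is simple enough to sketch in code. The following Python toy is purely illustrative: the feature names and clause sets are invented for the chair analogy and do not come from the study's neural data.

```python
# Toy illustration of an OR-of-ANDs decoding rule: recognition succeeds
# if ANY clause -- itself an AND of required features -- is fully present.
# Feature names and clauses are invented for the chair analogy.

def or_of_ands(active_features, clauses):
    """Return True if any clause is a subset of the active features."""
    return any(clause <= active_features for clause in clauses)

# [four legs AND seat] OR [seat AND back support]
chair_clauses = [
    frozenset({"four_legs", "seat"}),
    frozenset({"seat", "back_support"}),
]

print(or_of_ands({"seat", "back_support"}, chair_clauses))   # True
print(or_of_ands({"four_legs", "armrests"}, chair_clauses))  # False
```

The same structure tolerates missing features, which is exactly the flexibility the locust data suggest: a pedestal chair with no legs still matches the second clause.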

Which subset of neurons fires depends, in large part, on what the distractor smell is; the neurons that are activated by the target alone will continue to fire, but those common to both the distractor and the target will either not be activated or will have their activity reduced.

In this way, the uniqueness of the neural response to the target odorant is enhanced. Like perfume after a whiff of coffee, if the target odorant shared few neurons with the distractor, there was less cross-talk between the smells and the history, or context, was effectively reset.

Going forward, the team plans to see if its results hold in another organism: the fruit fly. The researchers also will investigate how other sources of variability such as short-term memory might affect how smells are perceived. There is, of course, another organism of interest: humans.

The main inspiration for this research was the use of coffee beans to clear the olfactory palate, so to speak, in perfume shops.

"There, we use coffee beans to enhance the way we smell the next perfume," Raman said. "We cannot say for sure if this is exactly how other olfactory systems perform the same computation, but we expect some of the computational principles revealed by our study to be general."

Credit: 
Washington University in St. Louis

New UK research links even low levels of air pollution with serious changes in the heart

image: Research from the UK has found that people exposed to even low levels of air pollution have heart remodelling, similar to that seen in the early stages of heart failure.

Image: 
British Heart Foundation

Researchers have found that people exposed to air pollution levels well within UK guidelines have changes in the structure of the heart, similar to those seen in the early stages of heart failure. The research was part-funded by the British Heart Foundation (BHF) and is published in the journal Circulation. [1]

A team of scientists, led from Queen Mary University of London by Professor Steffen Petersen, studied data from around 4,000 participants in the UK Biobank study. Volunteers provided a range of personal information, including their lifestyles, health record and details on where they have lived, so the research team were able to remove patients with underlying heart problems, or those who had moved house during the study. Participants also had blood tests and health scans. Heart MRI (magnetic resonance imaging) was used to measure the size, weight and function of the participants' hearts at fixed times. [2]

Even though most participants lived outside major UK cities, there was a clear association between those who lived near loud, busy roads, and were exposed to nitrogen dioxide (NO2) or PM2.5 - small particles of air pollution - and the development of larger right and left ventricles in the heart. The ventricles are important pumping chambers in the heart and, although these participants were healthy and had no symptoms, similar heart remodelling is seen in the early stages of heart failure.

Higher exposures to the pollutants were linked to more significant changes in the structure of the heart. For every 1 extra μg per cubic metre of PM2.5 and for every 10 extra μg per cubic metre of NO2, the heart enlarges by approximately 1%.
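As a back-of-envelope illustration of that dose-response (the exposure differences below are invented for the example, not participant data):

```python
# Back-of-envelope use of the reported dose-response: roughly 1% heart
# enlargement per extra 1 ug/m3 of PM2.5 and per extra 10 ug/m3 of NO2.
# The exposure deltas below are illustrative, not participant data.

def estimated_enlargement_pct(extra_pm25_ug_m3, extra_no2_ug_m3):
    return extra_pm25_ug_m3 * 1.0 + (extra_no2_ug_m3 / 10.0) * 1.0

# Someone exposed to 4 ug/m3 more PM2.5 and 20 ug/m3 more NO2 than a
# neighbour would be expected to show roughly:
print(estimated_enlargement_pct(4, 20))  # 6.0 (percent)
```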

Air pollution is now the largest environmental risk factor linked to deaths in England. Globally, coronary heart disease and stroke account for approximately six in ten (58%) deaths related to outdoor air pollution. This research could help explain exactly how and why air pollution affects the heart.

In the study, average annual exposures to PM2.5 (8-12μg per cubic metre) were well within UK guidelines (25μg per cubic metre), although they approached or exceeded the World Health Organisation (WHO) guideline (10μg per cubic metre). The WHO has said that there is no safe level of PM2.5. The participants' average exposure to NO2 (10-50μg per cubic metre) approached or exceeded the WHO and UK annual average guidelines, which are both 40μg per cubic metre.

Ahead of the UK Government's consultation on their draft Clean Air Strategy closing on 14 August 2018, the British Heart Foundation want to ensure the public's heart and circulatory health is at the centre of discussions.

The Strategy commits to halving the number of people in the UK living in areas where PM2.5 levels exceed WHO guidelines (10 μg per cubic metre) by 2025, but ultimately the charity would like to see this action go further to reduce the health impacts of toxic air as quickly as possible.

Dr Nay Aung who led the data analysis from Queen Mary University of London said:

"Although our study was observational and hasn't yet shown a causal link, we saw significant changes in the heart, even at relatively low levels of air pollution exposure. Our future studies will include data from those living in inner cities like Central Manchester and London, using more in-depth measurements of heart function, and we would expect the findings to be even more pronounced and clinically important.

"Air pollution should be seen as a modifiable risk factor. Doctors and the general public all need to be aware of their exposure when they think about their heart health, just like they think about their blood pressure, their cholesterol and their weight."

Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, which part-funded the study said:

"We can't expect people to move home to avoid air pollution - Governments and public bodies must be acting right now to make all areas safe and protect the population from these harms.

"What is particularly worrying is that the levels of air pollution, particularly PM2.5, at which this study saw people with heart remodelling are not even deemed particularly high by the UK Government - this is why we are calling for the WHO guidelines to be adopted. They are less than half of UK legal limits and while we know there are no safe limits for some forms of air pollution, we believe this is a crucial step in protecting the nation's heart health.

"Having these targets in law will also help to improve the lives of those currently living with heart and circulatory diseases, as we know they are particularly affected by air pollution."

Credit: 
British Heart Foundation

Genomic study ties insect evolution to the ability to detect airborne odors

image: A new study from Illinois entomology professor Hugh Robertson and colleagues at the University of California, Davis reveals that all insects have odorant receptors that enable them to detect airborne chemicals.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- A new study reveals that all insects use specialized odorant receptors that enable them to detect and pursue mates, identify enemies, find food and - unfortunately for humans - spread disease. This puts to rest a recent hypothesis that only some insects evolved the ability to detect airborne odors as an adaptation to flight, the researchers said.

The findings are reported in the journal eLife.

"We now know that odor detection was present at the very beginning of insect evolution and that it was probably a defining feature of insects as they became terrestrial," said University of Illinois entomology professor Hugh Robertson, who led the new research with Philipp Brand and Brian R. Johnson of the University of California, Davis. "We found odorant-receptor genes in every insect species we looked at, including some that don't fly. We did not find these genes in any other arthropods, however, including other bugs with six legs."

Although odorant receptors are found in almost all terrestrial animals and in some crustaceans and mollusks, they vary a lot between groups, Robertson said.

"In insects, odorant receptors are the most simple kind that you can imagine," he said. These proteins are embedded in the membranes of neurons and extend outward to interact with chemicals in the extracellular environment. "When they bind to a chemical, they change shape, open an ion pore and change the polarity of the neuron," he said.

Insects often have hundreds of individual odorant receptors, each of which senses a particular type of chemical, Robertson said. In almost all cases, the receptors work hand-in-glove with a single coreceptor, a protein known as Orco.

Robertson has spent much of his career studying chemoreception in insects, using genomic techniques to identify receptor and coreceptor genes.

Odorant receptors are distinct from gustatory receptors, which enable almost all animals to detect chemicals in watery environments, Robertson said. When vertebrate and invertebrate creatures began to find new niches on land, some evolved the ability to also detect airborne chemicals.

"Odorant receptors evolved from gustatory receptors," Robertson said. But why and when they evolved in insects has been the subject of debate. He hypothesized in 2003 that odorant receptors were a feature of all insects.

But in a recent study, scientists failed to find odorant-receptor proteins in two groups of flightless insects. This led them to suggest that odorant receptors occurred only in flying insects and had evolved as an adaptation to flight.

The new study revisited these flightless insects, tiny creatures known as firebrats and jumping bristletails. Instead of looking for the odorant-receptor proteins in these critters, the team scoured their genomes and the genomes of other six-legged terrestrial bugs for genes that code for the receptors and for coreceptors.

The researchers found numerous odorant-receptor genes in the wingless firebrat and jumping bristletail, but not in the genomes of six-legged bugs that are not insects.

"Unequivocally, we have a full-blown odorant-receptor family in a wingless insect," Robertson said. "That refutes the claim that the whole system evolved with flight.

"Clearly, odorant receptors evolved long before wings and were not an adaptation to flight," he said. "The evolution of odorant receptors had to be an adaptation to something else, and the most obvious thing is terrestriality."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

New study shows smoking can affect breastfeeding habits

image: Marie Tarrant is Director of UBC Okanagan's School of Nursing.

Image: 
UBC Okanagan

Researchers have determined that new mothers exposed to cigarette smoke in their homes stop breastfeeding sooner than women not exposed to second-hand smoke.

The study, conducted in Hong Kong, involved more than 1,200 women from four large hospitals, explains Professor Marie Tarrant, Director of UBC Okanagan's School of Nursing. Tarrant, whose research focuses on maternal and child health, taught in the Faculty of Medicine at the University of Hong Kong before joining UBC.

"Our study showed that just being in a smoking household--whether it was the husband, mother or member of the extended family--reduced the time that a child was breast fed," says Tarrant. "In fact, the more smokers there were in the home, the shorter the breastfeeding duration."

This study, says Tarrant, is one of the first to examine the effect of family members' smoking on the duration of breastfeeding in Hong Kong since the region made substantial changes to its tobacco control regulations in 2007. In Hong Kong, about four per cent of women and 18 per cent of men smoke, for an overall average of about 10 per cent of the population -- compared to Mainland China, where smoking rates remain quite high. In Canada, about 14 per cent of the population smokes more than one cigarette a day.

"Our findings were consistent with previous studies and we found that exposure to household smokers also had a substantial negative effect on breastfeeding practices," says Tarrant. "More than one-third of participants had partners or other household members who smoked. And fathers who smoked were significantly less likely to prefer breastfeeding when compared with non-smoking partners."

Nicotine is transmitted in breastmilk to the child, and Tarrant says there is also some suggestion that it may reduce the overall quantity of the breastmilk. There is also concern about the child's environmental exposure to second-hand smoke.

"Our study did show that smoking partners may affect the mother's decision to stop breastfeeding and that paternal and household smoking exposure is strongly associated with a shorter breastfeeding duration."

Tarrant says the takeaway from the study is a recommendation that women and their families quit smoking before they become pregnant, and that new mothers who choose to restart smoking wait until they have finished breastfeeding. She also recommends that any woman who does smoke with a baby in the home make sure the infant is not exposed to second-hand smoke.

"We know the effects of environmental tobacco smoke on young babies are very detrimental, as babies who are around smoking are more likely to get respiratory infections and to experience other respiratory problems," says Tarrant. "However, if a mother is breastfeeding, the benefits of her doing that still outweigh the negative effects of the smoking as long as she maintains good smoking hygiene and doesn't expose the baby to tobacco smoke."

Tarrant's study was published recently in the Breastfeeding Medicine journal.

Credit: 
University of British Columbia Okanagan campus

Key piece identified for slowing a colorectal cancer subtype

Inhibiting the Jagged 1 protein in mice prevents the proliferation and growth of colon and rectal tumours. What is more, this approach to the disease permits the removal of existing tumours. This is the conclusion of a study led by the Molecular Mechanisms of Cancer and Stem Cells research group from the Hospital del Mar Medical Research Institute (IMIM), directed by Dr Lluís Espinosa, who is also a member of CIBERONC (the Network Centre for Biomedical Research into Cancer), in collaboration with the Pathological Anatomy and Medical Oncology Units at Hospital del Mar, and the IDIBELL-Catalan Oncology Institute. The work has been published in Nature Communications.

The researchers took tumours from patients and then implanted them into mice in order to analyse the role of this protein in cancer cell proliferation. Jagged 1 is essential for cancer cells due to its role in activating the so-called Notch cell-signalling pathway. Generally speaking, Notch inhibits cell differentiation, in other words, a cell's ability to become a mature cell that can no longer proliferate. In the case of colorectal tumours, the activation of this signalling pathway favours their proliferation and growth. In this study, the researchers discovered that the intestinal tumours of mice lack a protein known as Fringe, implying that Jagged 1 is essential for activating Notch. "The fact that Fringe is present in the normal cells of the small intestine represents a significant therapeutic opportunity for treating patients with colorectal cancer", says Dr Espinosa, since by inhibiting Jagged 1 you can halt tumour growth without affecting the function of normal tissue.

In fact, the researchers were able to see how, in healthy mice, the colon and rectum do not need Jagged 1, since in the presence of the Fringe protein there are other mechanisms for activating Notch. This need for Jagged 1 to activate Notch in the absence of Fringe was observed in 239 of the human tumour cases analysed. Therefore, inhibiting this protein could enable doctors to combat the disease without affecting the functioning of the body. Dr Espinosa explains that "we implanted human tumours with Jagged 1, without Fringe, into mice and then we treated them with antibodies. Post-treatment, the tumours were very small and had necrosed." In the study, the tumours had shrunk after 10 weeks of treatment.

Prognostic factor

The study also enabled the researchers to demonstrate that Jagged 1 protein levels in patients with colorectal cancer are a prognostic indicator: where levels are high, the disease worsens rapidly. The researchers believe that this way of treating the cancer is very promising, and several pharmaceutical companies are already working with specific antibodies to inhibit Jagged 1. Even so, the work that has just been published is a preclinical trial, and not yet transferable to patient treatment.

In this regard, Dr. Joan Albanell, one of the authors of the study, head of the Medical Oncology Unit at Hospital del Mar, and director of the IMIM's Cancer Research Programme, points out that "these results lead the way towards therapeutic strategies for selectively deactivating the properties of malignant multipotent stem cells in colon cancer. It is now very important to continue this research so that in the next few years it can culminate in clinical trials for patients with colorectal cancer. For these people, the identification of new therapeutic targets is essential."

Colorectal cancer

This is the most common type of cancer in Catalonia, with more than 6,000 new cases every year, and the second leading cause of cancer death. The frequency of colon cancer is similar in men and women, while rectal cancer is more common in men.

In 2017, 6,201 people were diagnosed in Catalonia and 2,700 died. For Spain as a whole, the number of patients diagnosed exceeded 34,000, making it the most prevalent cancer in the country. Mortality, however, has gone down 5.3% in men and 6.7% in women since 2012, thanks to the success of early detection programmes, like the Early Detection Programme for Colorectal Cancer, a joint action between Hospital del Mar and Hospital Clínic that has been running since 2009.

Credit: 
IMIM (Hospital del Mar Medical Research Institute)

Novel PET imaging method could track and guide therapy for type 1 diabetes

image: TOP: Representative axial slices of PET/CT overlay of pancreas uptake (white arrows) for each dopaminergic radioligand: A) 11C-(+)-PHNO, B) 11C-FLB457, and C) 11C-raclopride. All SUV images summed from 20-30 minutes. BOTTOM: Representative coronal PET/CT images of 11C-(+)-PHNO in pancreas (white arrows) for A) healthy control, B) C-peptide-deficient T1DM subject, and C) T1DM subject with detectable C-peptide. All SUV images are summed from 20-30 minutes.

Image: 
J Bini et al., Yale University, New Haven, CT.

RESTON, VA - Researchers have discovered a new nuclear medicine test that could improve care of patients with type 1 diabetes. The new positron emission tomography (PET) imaging method could measure beta-cell mass, which would greatly enhance the ability to monitor and guide diabetes therapies. This study is reported in the featured article of the month in The Journal of Nuclear Medicine's August issue.

According to the American Diabetes Association, approximately 1.25 million American children and adults have type 1 diabetes. Jason Bini, PhD, at the Yale University PET Center in New Haven, Connecticut, explains the significance to patients of being able to track their beta-cell mass:

"Beta-cell mass includes both functional and non-functional beta cells. Many indirect methods to measure beta-cell function are influenced by factors such as glucose and insulin levels and are not able to measure non-functional (dormant) beta cells that may be responsive to treatments. This work is important for patients because uptake of a radiotracer measured on a PET scan could guide treatment options. For example, if a patient has low beta-cell function with high signal in the PET scan, this could represent a patient with dormant beta cells that could respond to a treatment targeting existing cells. If a patient has low beta-cell function and low signal in the PET scan (very few viable or dormant beta cells present), that individual may be a candidate for beta-cell transplantation."

Beta cells and neurological tissues have common cellular receptors and transporters, so the Yale researchers screened brain radioligands for their ability to identify beta cells. Then, 12 healthy control subjects and two subjects with type 1 diabetes mellitus underwent dynamic PET/CT scans with six tracers.

The dopamine type 2/type 3 (D2/D3)-receptor agonist radioligand carbon-11 (11C)-(+)-4-propyl-9-hydroxynaphthoxazine (PHNO) was the only radioligand to demonstrate sustained uptake in the pancreas with high contrast versus abdominal organs such as the kidneys, liver, and spleen.

The results provide preliminary evidence that 11C-(+)-PHNO is a potential marker of beta-cell mass with 2:1 binding of D3 receptors over D2 receptors. While further research is needed before clinical application, 11C-(+)-PHNO is a promising way to differentiate the beta-cell mass of healthy individuals from those with type 1 diabetes mellitus, as well as track and guide therapies for diabetes patients.

Bini also points out, "These findings could facilitate development and wider dissemination of novel imaging methods in molecular imaging and nuclear medicine to assess receptor/enzyme pharmacology in diabetes and other endocrine disorders."

Credit: 
Society of Nuclear Medicine and Molecular Imaging

As temperatures rise, Earth's soil is 'breathing' more heavily

The vast reservoir of carbon stored beneath our feet is entering Earth's atmosphere at an increasing rate, most likely as a result of warming temperatures, suggest observations collected from many of Earth's ecosystems.

Blame microbes and how they react to warmer temperatures. Their food of choice - nature's detritus like dead leaves and fallen trees - contains carbon. When bacteria chew on decaying leaves and fungi chow down on dead plants, they convert that storehouse of carbon into carbon dioxide that enters the atmosphere.

In a study published August 2 in Nature, scientists show that this process is speeding up as Earth warms and is happening faster than plants are taking in carbon through photosynthesis. The team found that the rate at which microbes are transferring carbon from soil to the atmosphere has increased 1.2 percent over a 25-year time period, from 1990 through 2014.

While that may not seem like a big change, such an increase on a global scale, in a relatively short period of time in Earth history, is massive. The finding, based on thousands of observations made by scientists at hundreds of sites around the globe, is consistent with the predictions that scientists have made about how Earth might respond to warmer temperatures.

"It's important to note that this is a finding based on observations in the real world. This is not a tightly controlled lab experiment," said first author Ben Bond-Lamberty of the Joint Global Change Research Institute, a partnership between the Department of Energy's Pacific Northwest National Laboratory and the University of Maryland.

"Soils around the globe are responding to a warming climate, which in turn can convert more carbon into carbon dioxide which enters the atmosphere. Depending on how other components of the carbon cycle might respond due to climate warming, these soil changes can potentially contribute to even higher temperatures due to a feedback loop," he added.

Globally, soil holds about twice as much carbon as Earth's atmosphere. In a forest where stored carbon is manifest in the trees above, even more carbon resides unseen underfoot. The fate of that carbon will have a big impact on our planet. Will it remain sequestered in the soil or will it enter the atmosphere as carbon dioxide, further warming the planet?

To address the question, the team relied heavily on two global science networks as well as a variety of satellite observations. The Global Soil Respiration Database includes data on soil respiration from more than 1,500 studies around the globe. And FLUXNET draws data from more than 500 towers around the world that record information about temperature, rainfall and other factors.

"Most studies that address this question look at one individual site which we understand very well," said author Vanessa Bailey, a soil scientist. "This study asks the question on a global scale. We're talking about a huge quantity of carbon. Microbes exert an outsize influence on the world that is very hard to measure on such a large scale."

The study focused on a phenomenon known as "soil respiration," which describes how microbes and plants in the soil take in substances like carbon to survive, then give off carbon dioxide. Soils don't exactly breathe, but as plants and microbes in soil take in carbon as food, they convert some of it to other gases which they give off - much like we do when we breathe.

Scientists have known that as temperatures rise, soil respiration increases. Bond-Lamberty's team sought to compare the roles of the two main contributors, increased plant growth and microbial action.

The team discovered a growing role for microbes, whose action is outstripping the ability of plants to absorb carbon. In the 25-year span of the study, the proportion of soil respiration that is due to microbes increased from 54 to 63 percent. Warmer temperatures can prompt more microbial action, potentially resulting in more carbon being released from carbon pools on land into the air.
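To put that proportional shift in perspective, the arithmetic below holds the total soil respiration flux fixed (a simplifying assumption; the study found the total itself also grew) and asks how much the microbial contribution rises when its share moves from 54 to 63 percent. The flux value is an arbitrary placeholder, not a figure from the study.

```python
# Illustrative arithmetic only: the total flux below is a placeholder,
# not a value reported in the study.
total_flux = 100.0  # arbitrary units of carbon released per year

share_1990 = 0.54   # microbial share of soil respiration, start of period
share_2014 = 0.63   # microbial share of soil respiration, end of period

microbial_1990 = total_flux * share_1990
microbial_2014 = total_flux * share_2014

relative_increase = (microbial_2014 - microbial_1990) / microbial_1990
print(f"Microbial carbon release up {relative_increase:.1%} "
      f"even with total flux held constant.")
```

Even under this conservative assumption, the microbial contribution grows by roughly a sixth; with a rising total flux, the absolute increase is larger still.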

"We know with high precision that global temperatures have risen," said Bond-Lamberty. "We'd expect that to stimulate microbes to be more active. And that is precisely what we've detected. Land is thought to be a robust sink of carbon overall, but with rising soil respiration rates, you won't have an intact land carbon sink forever."

Credit: 
DOE/Pacific Northwest National Laboratory

Tech takes on cigarette smoking

Image: This is Ming-Chun Huang.

Image: 
CWRU

CLEVELAND--Researchers at Case Western Reserve University are using wearable sensor technology to develop an automatic alert system to help people quit smoking.

The smartphone app, initially limited to Android-based operating systems, automatically texts 20- to 120-second video messages to smokers when sensors detect specific arm and body motions associated with smoking.

There is no shortage of products or programs--from nicotine gum to hypnosis--to help people stop smoking. More recently, wearable technology has gained popularity in the fight against addiction.

But the mobile alert system Case Western Reserve researchers are testing may be the first that combines:

* an existing online platform with mindfulness training and a personalized plan for quitting;

* two armband sensors to detect smoking motions, a technology that demonstrated more than 98-percent accuracy in differentiating "lighting up" from other similar motions (that compares with 72-percent accuracy in systems using a single armband);

* and a personalized text-messaging service that either reminds the user of their own plan to quit or sends video messages that stress the health and financial benefits of quitting.

Collaborative effort

The system was conceived, developed and tested over the course of the last year by a team of Electrical Engineering and Computer Science researchers at the Case School of Engineering and a high school intern in collaboration with a clinical psychologist at the Case Western Reserve School of Medicine.

A paper detailing the system and reporting early findings on a group of 10 users was published in a July edition of Smart Health. The researchers said most previous studies have relied on smokers self-reporting how often they smoked, while the Case Western Reserve system more accurately tracked smoking activity based on the sensors.

"We've been able to differentiate between a single motion, which could be confused with eating or drinking, and a sequence of motions more clearly linked to the act of smoking a cigarette," said Ming-Chun Huang, an assistant electrical engineering and computer science professor who led the technical aspect of the study.

The collaboration to develop a technology-enhanced, personalized mobile smoking-cessation system started after a conversation last summer between Huang, who was looking for a new project for his students, and Monica Webb Hooper of the Case Comprehensive Cancer Center, who was seeking new ways to help her clients break the habit.

"The field of tobacco control has really adopted mobile technologies because many people won't come in for therapy," said Webb Hooper, who has been working on interventions for nearly two decades.

"We were interested in translating one of our programs into a video-based mobile application, but the motion sensors made this even more amazing," said Webb Hooper, who has extended the study to another 120 smokers--half using the program and a control group using a standard text messaging program without sensors or video messaging.

The addiction problem

Tobacco smoking is responsible for one of every five deaths in the United States, according to the Centers for Disease Control and Prevention. Other research has shown there are more than 7,000 chemicals in cigarette smoke, including carbon monoxide, hydrogen cyanide and nitrogen oxides.

Further, the National Cancer Institute reports that there are 69 known cancer-causing agents in tobacco smoke.

"Tobacco is the toughest of all addictions to overcome and cigarettes are one of the easiest drugs to become addicted to--all it takes is three (cigarettes) for some people," Webb Hooper said. "And, neurologically, it's harder to quit because we have more nicotine receptors in the brain. That's why I'm so excited about this intervention."

Credit: 
Case Western Reserve University