Tech

Immunotherapy for egg allergy may allow patients to eat egg safely long after treatment

SAN FRANCISCO, Calif. - Feb. 24, 2019 - After completing up to four years of egg oral immunotherapy (eOIT) treatment, certain participants were able to safely incorporate egg into their diet for five years. This new research was presented by the study's first author, Edwin Kim, MD, at the annual American Academy of Allergy, Asthma and Immunology (AAAAI) conference in San Francisco.

"Egg allergy is one of the most common food allergies and usually appears in early childhood. It has significant risk for severe allergic reactions and negatively affects quality of life for children with the allergy," said Kim, assistant professor of medicine and pediatrics at the UNC School of Medicine and director of the UNC Food Allergy Initiative. "While the allergy does seem to go away with age, it can last into the second decade of life for most people. Any treatment that can allow the introduction of egg into the diet of someone with egg allergy provides nutritional benefits and peace of mind for the patient and their family."

UNC School of Medicine was one of five centers to participate in the study, led by the Consortium of Food Allergy Research (COFAR) and funded by the National Institutes of Health (NIH). The trial began with either eOIT or a placebo for 55 patients aged 5-11 who were allergic to egg. The treatments were randomized - 40 participants received eOIT and 15 received the placebo.

The treatments lasted up to four years, during which patients were tested for their sensitivity to egg. Those who were considered desensitized - requiring a higher quantity of egg to cause an allergic reaction - could eat 10 grams, or about two teaspoons, of pure egg without reaction. Desensitized patients then stopped eOIT and were tested for sensitivity again. Those who did not have a reaction were considered to have sustained unresponsiveness (SU). After completing eOIT, patients were advised to add concentrated egg (scrambled, fried or boiled egg) and/or baked egg (egg incorporated into something like a cake) to their diets. For five years following the allergy treatment, patients were asked to report how much egg they ate, in what form they ate it, how often they ate it and how they felt afterward.

At the end of eOIT, 50 percent of patients were classified with SU, 28 percent of patients were classified as desensitized (without SU) and 22 percent as not desensitized. Of SU-classified patients, 100 percent were able to eat both baked and concentrated egg.

The desensitized, not-desensitized and placebo groups had more variable ingestion of baked and concentrated egg and were more likely to experience symptoms from ingestion.

"These results further support the effectiveness of eOIT as a safe way of desensitizing children and youth with egg allergy," said Kim. "Past research also suggests that eating egg may actually shorten the amount of time a patient has the allergy, so any amount of egg that is incorporated into an allergy patient's diet is helpful."

Credit: 
University of North Carolina Health Care

Doctor-affiliated PACs fund political candidates who oppose firearm safety policies

image: Dr. Jeremiah Schuur, chair of emergency medicine at Brown University's Warren Alpert Medical School, co-led research which found physician-affiliated political action committees provided more financial support to candidates who opposed increased background checks, contrary to many societies' recommendations for evidence-based policies to reduce firearm injuries.

Image: 
Courtesy Jeremiah Schuur

PROVIDENCE, R.I. [Brown University] -- Political action committees (PACs) affiliated with physician organizations contribute more money to political candidates who oppose evidence-based policies to reduce firearm-related injuries than to those who support such policies, a new study found.

This pattern of giving is inconsistent with advocacy efforts by many individual physicians and organizations in support of the policies, the researchers said.

"Doctors can -- and should -- lead efforts to prevent firearm violence," said study co-author Dr. Jeremiah Schuur, chair of emergency medicine at Brown University's Warren Alpert Medical School.

"Yet we found that the PACs affiliated with the doctors who provide frontline care for victims of gun violence contribute to candidates who are blocking evidence-based firearm safety policies. If the organized political giving of these organizations doesn't match their stated public health goals, they undermine the moral authority and scientific credibility they draw upon when advocating for policy change."

Indirectly, such contributions hinder the health and safety of patients, Schuur added.

The findings were published on Feb. 22 in the journal JAMA Network Open.

Physician professional organizations and individual doctors have recently called attention to firearm-related injuries in multiple forums, from #ThisIsOurLane tweets to policy recommendations published in 2015 in the academic journal Annals of Internal Medicine that were deemed a call to action.

To conduct the study, Schuur and his two co-authors analyzed campaign contributions from the 25 largest physician organization-affiliated PACs in the U.S. to determine whether their support for political candidates aligned with their established positions on firearm safety regulations. The authors reviewed the candidates' voting records on a U.S. Senate amendment (SA 4750) or co-sponsorship of a U.S. House of Representatives resolution (HR 1217), two legislative efforts that sought to expand background checks for firearm purchases.

The analysis found that the majority of physician-affiliated PACs provided more money to Congressional candidates who, during the 2016 election cycle, opposed increased background checks -- which the study said are an evidence-based policy shown to reduce rates of suicide, homicide and accidental firearm injury. That financial support is contrary to many of the societies' policy recommendations, said Schuur, who is also the physician-in-chief for emergency medicine at Rhode Island Hospital.

The researchers also evaluated candidates' National Rifle Association Political Victory Fund (NRA-PVF) letter-grade ratings. The NRA-PVF is a PAC that ranks political candidates based on their support for the NRA's mission, including opposition to expanding background checks and imposing limits on assault weapons. Most candidates receive either an "A" or "F" rating.

"We were surprised to find that there was a pattern across the largest PACs affiliated with physician professional organizations -- they gave more money and to a greater number of Congressional candidates who voted against background checks and were rated A by the NRA," said Hannah Decker, a medical student at Emory University's School of Medicine and study co-author. "This pattern held true even for physician groups that publicly endorsed evidence-based policies to reduce firearm injury."

The study found that 20 of 25 physician-affiliated PACs, including the American Medical Association, American College of Emergency Physicians and American Association of Orthopaedic Surgeons, contributed more money to U.S. Senate incumbents who voted against SA 4750 than to those who voted for it. Additionally, 24 PACs contributed more to House incumbents who did not co-sponsor HR 1217. In total, the 25 PACs contributed an additional $500,000 to Senate candidates who voted against SA 4750 and an additional $2.8 million to House candidates who did not co-sponsor HR 1217.

Twenty-one PACs contributed more money to candidates rated A by the NRA, and 24 contributed to a greater proportion of NRA A-rated candidates than candidates with other ratings. Overall, physician-affiliated PACs gave nearly $1.5 million more to NRA A-rated candidates than to those with other ratings.
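
As an illustration of the kind of tally behind these comparisons, the sketch below groups contributions by NRA grade. The records and field names are invented stand-ins; the actual study analyzed Federal Election Commission contribution data together with NRA-PVF ratings.

```python
from collections import defaultdict

# Toy records standing in for FEC contribution data; fields are hypothetical.
contributions = [
    {"pac": "PAC-1", "nra_grade": "A", "amount": 10000},
    {"pac": "PAC-1", "nra_grade": "F", "amount": 2500},
    {"pac": "PAC-2", "nra_grade": "A", "amount": 7500},
    {"pac": "PAC-2", "nra_grade": "F", "amount": 9000},
]

totals = defaultdict(lambda: defaultdict(int))
for c in contributions:
    totals[c["pac"]][c["nra_grade"]] += c["amount"]

for pac, by_grade in sorted(totals.items()):
    a_rated = by_grade.get("A", 0)
    others = sum(v for g, v in by_grade.items() if g != "A")
    print(f"{pac}: ${a_rated:,} to NRA A-rated candidates vs ${others:,} to others")
```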

Among the nine PACs whose affiliated organizations endorsed the policy recommendations laid out in the 2015 call to action, eight supported a greater proportion of NRA A-rated candidates. All 16 PACs affiliated with organizations that have not publicly endorsed the call to action supported a greater proportion of NRA A-rated candidates.

"The #ThisIsOurLane movement has highlighted that many physicians are willing to publicly speak out on Twitter and in the press against the NRA and in favor of evidence-based policies to reduce firearm violence," Schuur said. "We aren't suggesting that these groups actively sought to support candidates that are against evidence-based firearms policies. Rather, our study shows that these physician PACs haven't made candidates' stance on firearms policy an issue they consider.

"The question going forward is if physicians can change their organizations' PACs contribution criteria, so NRA A-rated candidates no longer get the majority of physicians' political dollars," he added.

Credit: 
Brown University

Captured carbon dioxide converts into oxalic acid to process rare earth elements

image: Carbon dioxide scrubbers remove emissions from power plant systems.

Image: 
Nathan Shaiyen/Michigan Tech

Until now, carbon dioxide has been dumped in oceans or buried underground. Industry has been reluctant to implement carbon dioxide scrubbers in facilities due to cost and footprint.

What if we could not only capture carbon dioxide, but convert it into something useful? S. Komar Kawatra and his students have tackled that challenge, and they're having some success.

A team led by Kawatra, a professor of chemical engineering at Michigan Technological University, with his PhD students Sriram Valluri and Victor Claremboux and undergraduate Sam Root, has designed a carbon dioxide scrubber. They are working on converting the carbon dioxide that they capture into oxalic acid, a naturally occurring chemical in many foods.

Root and Valluri have been invited to present their research at the Society of Mining, Metallurgy and Exploration's annual meeting in Denver in February.

Oxalic acid is used by industry to leach rare earth elements from ore bodies. The rare earths are used in electronics such as cell phones. Rare earths are not presently produced in the United States; China produces 90 percent or more of the rare earths in the world. By producing oxalic acid domestically, it may be possible to profitably extract rare earth elements in the U.S., which is important for national security, Kawatra said.

How a Carbon Dioxide Scrubber Works

The group installed their carbon dioxide scrubber at the Michigan Tech steam plant, where they are testing with real flue gas at pilot plant scale.

The steam plant produces flue gas that contains eight percent carbon dioxide. The chemical engineers' scrubber brought the emissions down to four percent and their goal is to reduce it below two percent.

"Below two percent, we are happy," Kawatra said. "Below one percent, we will be very happy."

It's a real possibility. "We've already got it down to zero percent in the laboratory," Valluri noted.
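
As rough arithmetic (treating the quoted outlet percentages as directly comparable, and ignoring the small change in total gas volume as CO2 is absorbed), those figures translate into capture efficiencies as follows:

```python
# Removal efficiency implied by the inlet/outlet CO2 percentages quoted above.
co2_in = 8.0   # percent CO2 entering the scrubber
co2_out = 4.0  # percent CO2 leaving it

print(f"current removal: {(co2_in - co2_out) / co2_in:.0%}")  # 50%

# Outlet concentrations the team is aiming for, and the removal each implies:
for target in (2.0, 1.0, 0.0):
    print(f"outlet {target}% -> {(co2_in - target) / co2_in:.0%} removal")
```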

In the steam plant, they tap a sample stream of flue gas from the boiler's main exhaust line. The flue gas comes out of the burner at 300-350 degrees Fahrenheit. The sample is compressed through a filter that removes particles, then passes through a cooling unit before it enters the bottom of the scrubbing column.

Soda Ash Captures Carbon Dioxide

A sodium carbonate solution is pumped into the top of the 11-foot-tall scrubbing column. The flue gas is bubbled up through the column. As it moves toward the top, the sodium carbonate or soda ash removes much of the carbon dioxide from the gas. Kawatra and his students monitor the amount of carbon dioxide constantly.

"The biggest challenge is a fluctuating ratio of gases in the flue gas," Valluri said. Team member Root elaborates, "You need a cascade control system that measures the carbon dioxide and manipulates the amount of scrubbing solution accordingly."

"Our next challenges are, how much can we scale the scrubber up and what can we use the carbon dioxide for," Valluri says. This ties into Valluri's and Claremboux's other research project, the conversion of carbon dioxide to useful products. They have been able to produce oxalic acid from carbon dioxide at laboratory scale.

Tech Alumnus Supports Research

John Simmons, a Michigan Tech alumnus in the Chemical Engineering Academy at Tech and chairman of Carbontec Energy in Bismarck, North Dakota, is supporting Kawatra's research. He says the savings to industry from this kind of carbon dioxide scrubber are enormous.

The usual method of removing carbon dioxide from emissions uses amines, nitrogen-based chemical compounds that bind the carbon dioxide. But amines cost $20,000 a ton, Simmons said. Carbonates like the soda ash that Kawatra's team is using cost $200 a ton.

Simmons is excited about the potential for producing a commercial product from the captured carbon dioxide. "I don't think sequestering it in the ground is a good idea," he says. "We have to find a way to utilize it commercially."

The technology, trade-named the "Clearite VI Carbon Dioxide Capture/Utilization Process," was patented (Patent No. US 7,919,064 B2) by the inventors, S. Komar Kawatra, Tim Eisele and John Simmons, and assigned to Michigan Tech. Carbontec Energy Corporation, the technology sponsor, is the exclusive worldwide licensee and plans to commercialize the technology through joint ventures and sub-licenses.

Simmons is pleased that Kawatra and his students are conducting a pilot plant study of their scrubber in Michigan Tech's natural gas fired steam plant. "It was important to test the process under actual emission conditions," he explains.

Credit: 
Michigan Technological University

New dynamic dependency framework may lead to better neural social and tech systems models

image: In a paper published recently in Nature Physics, Bar-Ilan University Prof. Havlin and a team of researchers, including Stefano Boccaletti, Ivan Bonamassa, and Michael M. Danziger, present a dynamic dependency framework that can capture interdependent and competitive interactions between dynamic systems, which they use to study synchronization and spreading processes in multilayer networks with interacting layers.

Main results in this image: (Top Left) Phase diagram for two partially competitive Kuramoto models with regions of multistability. (Top Right) Theoretical and numerical results for the flow in interdependent SIS epidemics (Erdos-Renyi graphs, average degree = 12). (Bottom Left) Path-dependent (awakening) transitions in asymmetrically coupled SIS dynamics. (Bottom Right) Critical scaling of bottlenecks (ghosts in saddle-node bifurcations) above the hybrid transitions in interdependent dynamics.

Image: 
Prof. Shlomo Havlin and team

Many real-world complex systems include macroscopic subsystems which influence one another. This arises, for example, in competing or mutually reinforcing neural populations in the brain, spreading dynamics of viruses, and elsewhere. It is therefore important to understand how different types of inter-system interactions can influence overall collective behaviors.

In 2010 substantial progress was made when the theory of percolation on interdependent networks was introduced by Prof. Shlomo Havlin and a team of researchers from the Department of Physics at Bar-Ilan University in a study published in Nature. This model showed that when nodes in one network depend on nodes in another to function, catastrophic cascades of failures and abrupt structural transitions arise, as was observed in the electrical blackout that affected much of Italy in 2003.

Interdependent percolation, however, is limited to systems where functionality is determined exclusively by connectivity, thus providing only a partial understanding of the wealth of real-world systems whose functionality is defined according to dynamical rules.

Research has shown that two fundamental ways in which nodes in one system can influence nodes in another one are interdependence (or cooperation), as in critical infrastructures or financial networks, and antagonism (or competition), as observed in ecological systems, social networks, or in the human brain. Interdependent and competitive interactions may also occur simultaneously, as observed in predator-prey relationships in ecological systems, and in binocular rivalry in the brain.

In a paper published recently in Nature Physics, Bar-Ilan University Prof. Havlin and a team of researchers, including Stefano Boccaletti, Ivan Bonamassa, and Michael M. Danziger, present a dynamic dependency framework that can capture interdependent and competitive interactions between dynamic systems, which they use to study synchronization and spreading processes in multilayer networks with interacting layers.

"This dynamic dependency framework provides a powerful tool to better understand many of the interacting complex systems which surround us," wrote Havlin and team. "The generalization of dependent interactions from percolation to dynamical systems allows for the development of new models for neural, social and technological systems that better capture the subtle ways in which different systems can affect one another."

Prof. Havlin's research since 2000 has produced groundbreaking new mathematical methods in network science which have led to extensive interdisciplinary research in the field. Following Havlin's and his colleagues' publication of the theory of percolation, he received the American Physical Society's Lilienfeld Prize, which is awarded for "a most outstanding contribution to physics". Earlier this year he received the Israel Prize in Chemistry and Physics.

Credit: 
Bar-Ilan University

Geographic distribution of opioid-related deaths

Bottom Line: Identifying changes in the geographic distribution of opioid-related deaths is important, and this study analyzed data for more than 351,000 U.S. residents who died of opioid-related causes from 1999 to 2016. Researchers report increased rates of opioid-related deaths in the eastern United States, especially from synthetic opioids. In 2016, there were 42,249 opioid-related deaths (28,498 men and 13,751 women) in the United States for an opioid-related mortality rate of 13 per 100,000 people. Eight states (Connecticut, Illinois, Indiana, Massachusetts, Maryland, Maine, New Hampshire and Ohio) had opioid-related mortality rates that were at least doubling every three years, and two states (Florida and Pennsylvania) and the District of Columbia had opioid-related mortality rates that were at least doubling every two years. A limitation of the study is the potential for misclassification of deaths, which could result in an underreporting of opioid-related deaths. The study findings suggest policies focused on reducing opioid-related deaths may need to prioritize synthetic opioids.
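
The "doubling every N years" framing corresponds to exponential growth, where the doubling time is ln(2) divided by the annual growth rate. A small worked example with made-up rates:

```python
import math

# If opioid-related mortality grows exponentially, rate(t) = rate0 * exp(beta * t)
# and the doubling time is ln(2) / beta. The rates below are illustrative only.
rate_2013, rate_2016 = 5.0, 20.0            # deaths per 100,000, 3 years apart
beta = math.log(rate_2016 / rate_2013) / 3  # per-year exponential growth rate
print(f"doubling time: {math.log(2) / beta:.1f} years")  # 1.5 -> faster than every 2 years
```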

Authors: Mathew V. Kiang, Sc.D., Stanford University School of Medicine, Palo Alto, California, and coauthors

(doi:10.1001/jamanetworkopen.2019.0040)

Editor's Note: The article contains funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Want to embed a link to this study in your story? This full-text link will be live at the embargo time http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2019.0040

About JAMA Network Open: JAMA Network Open is the new online-only open access general medical journal from the JAMA Network. Every Friday, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.

Credit: 
JAMA Network

HIV infections in US could be reduced by up to 67 percent by 2030, study finds

image: Dr. Heather Bradley, assistant professor in the School of Public Health at Georgia State University.

Image: 
Georgia State University

ATLANTA--New HIV infections in the United States could be reduced by up to 67 percent by 2030 if ambitious goals for HIV care and treatment are met and targeted prevention interventions for people at risk for HIV are rapidly scaled up, according to a study by Georgia State University and the University at Albany-SUNY.

During the 2019 State of the Union address, the federal administration announced a goal of reducing new HIV infections by 90 percent in the next 10 years. This study suggests that goal is unlikely to be achieved, but that it is possible to substantially reduce new HIV infections in the next decade with innovative models for delivering HIV care and prevention interventions, and sufficient investments to bring them to scale.

The researchers analyzed the latest HIV surveillance data from the Centers for Disease Control and Prevention (CDC) and estimated how many new HIV infections could be averted through ambitious, but attainable, national HIV prevention goals.

They predict that meeting internationally accepted targets for HIV diagnosis and care by 2025 and preventing an additional 20 percent of transmissions through targeted interventions such as pre-exposure prophylaxis (PrEP) for people with HIV risk would enable the U.S. to reduce new HIV infections by 67 percent in the next decade.

Achieving this goal would require the percentage of people diagnosed with HIV who are receiving care to increase from under 70 percent to 95 percent within six years, along with 40 percent PrEP coverage among people at risk for HIV, levels that are unprecedented in the U.S. epidemic. The results are published in the journal AIDS and Behavior.
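
As a back-of-envelope check (assuming, purely for the arithmetic, that the two effects combine multiplicatively, which is not necessarily the authors' model), the stated numbers imply how much of the reduction would have to come from the care and treatment targets alone:

```python
# Hypothetical decomposition of the 67% target into its two levers.
overall_reduction = 0.67  # 67% fewer new infections by 2030
prep_effect = 0.20        # additional 20% of transmissions prevented via PrEP

# If reductions multiply: (1 - overall) = (1 - care) * (1 - prep)
care_effect = 1 - (1 - overall_reduction) / (1 - prep_effect)
print(f"implied reduction from care/treatment alone: {care_effect:.0%}")  # ~59%
```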

"It is important to set HIV prevention goals that are ambitious, but realistic," said Dr. Heather Bradley, lead author of the study and assistant professor in the School of Public Health at Georgia State. "We know that treating people living with HIV greatly improves health and also prevents transmission of HIV infection to others. However, treating enough people to meaningfully reduce new HIV infections will require us to confront issues like poverty, unstable housing and mental health conditions that keep people living with HIV from accessing care."

Progress to reduce HIV infections in the U.S., particularly among key minority and risk groups, has been relatively stagnant, and a new national HIV strategy with achievable targets is critically needed.

"Greatly increasing the number of people living with HIV who are receiving care and treatment combined with targeted prevention strategies for people at risk for HIV infection could result in substantial reductions in new HIV infections in the next decade. Our study estimates how much improvement is possible and can help quantify what it would take to get there," Bradley said.

Credit: 
Georgia State University

Breast cancer study confirms importance of multigenerational family data to assess risk

A team of researchers led by Columbia University Mailman School of Public Health Professor Mary Beth Terry, PhD, evaluated four commonly used breast cancer prediction models and found that family-history-based models perform better than non-family-history based models, even for women at average or below-average risk of breast cancer. The study is the largest independent analysis to validate four widely used models of breast cancer risk and has the longest prospective follow-up data available to date. The findings are published online in The Lancet Oncology.

Dr. Terry and colleagues used the Breast Cancer Prospective Family Study Cohort, composed of 18,856 women from Australia, Canada, and the U.S. who did not have breast cancer when enrolled between March 1992 and June 2011. Women between the ages of 20 and 70 were selected for the study who had no previous history of bilateral prophylactic mastectomy or ovarian cancer, and whose family history of breast cancer was available. The researchers calculated 10-year risk scores for the final cohort of 15,732 women, comparing four breast cancer risk models, which vary in how they use multigenerational, genetic, and non-genetic information: the Breast and Ovarian Analysis of Disease Incidence and Carrier Estimation Algorithm model (BOADICEA), BRCAPRO, the Breast Cancer Risk Assessment Tool (BCRAT), and the International Breast Cancer Intervention Study model (IBIS). A second analysis was conducted to compare the performance of the models after 10 years based on the mutation status of the BRCA1 or BRCA2 genes.

The results showed that the BOADICEA and IBIS models, which use multigenerational family history data, were more accurate in predicting breast cancer risk than the other models. This held true even for women without a family history of breast cancer and without BRCA1 and BRCA2 mutations. The other two models, BRCAPRO and BCRAT, did not perform as well, both overall and in women under 50 years of age. The BCRAT model was well calibrated in women over 50 years who were not known to carry deleterious mutations in the BRCA1 and BRCA2 genes. Of the 15,732 eligible women, 4 percent were diagnosed with breast cancer during the median follow-up of 11-plus years.
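
One standard score in validations like this is calibration, the ratio of observed to expected cases. The sketch below uses invented numbers; the study computed 10-year risks for 15,732 women under each of the four models.

```python
# Hypothetical cohort: each entry is a woman's predicted 10-year breast cancer risk.
predicted_10yr_risks = [0.02] * 100  # 100 women, each at 2% predicted risk
observed_cases = 2                   # how many actually developed breast cancer

expected_cases = sum(predicted_10yr_risks)   # 2.0 expected cases
oe_ratio = observed_cases / expected_cases   # 1.0 = perfectly calibrated
print(f"observed/expected = {oe_ratio:.2f}")
```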

"Our study, which was enriched based on family history, was large enough to evaluate model performance across the full spectrum of absolute risk, including women with the highest risk of cancer in whom accurate prediction is especially important," said Dr. Terry, who is a Professor of Epidemiology at the Columbia Mailman School, and the Herbert Irving Comprehensive Cancer Center. "Independent validation is particularly important to understand the utility of these models across different settings."

Breast cancer risk models are used to help inform decisions about primary prevention and, increasingly, in screening programs, including when women should have mammograms. There are several different models to assess breast cancer risk, and they vary in how they take into account family history and genetics.

"Mathematical models can help estimate a woman's future risk of breast cancer. There are several available, but it is uncertain which models are the most appropriate ones to use. These findings might help provide better guidance to women with their decision-making on breast cancer screening strategies," says Dr. Robert MacInnis, who is a Senior Research Fellow in the Cancer Epidemiology and Intelligence Division at the Cancer Council, Victoria Australia and co-led the analyses with Dr. Terry.

"Our findings suggest that all women would benefit from risk assessment that involves collection of detailed family histories, and that risk models would be improved by inclusion of family history information including ages at diagnoses and types of cancer," said Dr. Terry.

Credit: 
Columbia University's Mailman School of Public Health

Triclosan added to consumer products impairs response to antibiotic treatment

image: This is E. coli from the strain used in this study. The cell wall is shown in red and DNA is shown in blue.

Image: 
Petra Levin laboratory, Washington University in St. Louis

Grocery store aisles are stocked with products that promise to kill bacteria. People snap up those items to protect themselves from the germs that make them sick. However, new research from Washington University in St. Louis finds that a chemical that is supposed to kill bacteria is actually making them stronger and more capable of surviving antibiotic treatment.

The study, available online Feb. 19 in the journal Antimicrobial Agents & Chemotherapy, suggests that triclosan exposure may inadvertently drive bacteria into a state in which they are able to tolerate normally lethal concentrations of antibiotics -- including those antibiotics that are commonly used to treat urinary tract infections (UTIs).

Triclosan is the active ingredient responsible for the "antibacterial" property marketed on many consumer products. It is added to toothpaste, mouthwash, cosmetics and even to clothing, baby toys and credit cards with the intention of reducing or preventing bacterial growth.

"In order to effectively kill bacterial cells, triclosan is added to products at high concentrations," said Petra Levin, professor of biology in Arts & Sciences.

In 2017, the U.S. Food and Drug Administration cited both safety concerns and lack of efficacy when it recommended against adding triclosan to consumer soaps, but these guidelines have not discouraged companies from adding it to other products. What's more, Levin said, "Triclosan is very stable. It lingers in the body and in the environment for a long time."

The new study in mice uncovers the extent to which triclosan exposure limits the body's ability to respond to antibiotic treatment for urinary tract infection. It also sheds new light on the cellular mechanism that allows triclosan to interfere with antibiotic treatment.

Escaping death

Some antibiotics kill bacterial cells, while others keep them from growing.

Levin and her colleagues were particularly interested in bactericidal antibiotics -- those that can kill bacterial cells and are typically prescribed by doctors to treat bacterial infections. They wanted to know whether triclosan could protect bacteria from death in the presence of killing antibiotics.

Corey Westfall, postdoctoral scholar in the Levin lab, treated bacterial cells with bactericidal antibiotics and tracked their ability to survive over time. In one group, the bacteria were exposed to triclosan prior to being given the bactericidal antibiotic. In the other group, they were not.

"Triclosan increased the number of surviving bacterial cells substantially," Levin said. "Normally, one in a million cells survive antibiotics, and a functioning immune system can control them. But triclosan was shifting the number of cells. Instead of only one in a million bacteria surviving, one in 10 organisms survived after 20 hours. Now, the immune system is overwhelmed."

Triclosan exposure allowed the bacteria to escape death by antibiotics. And the protective property was not limited to any single family of antibiotics. In fact, multiple antibiotics that are considered unique in how they kill cells were less effective at killing bacteria exposed to triclosan.

"Triclosan increased tolerance to a wide breadth of antibiotics," Westfall said. "Ciprofloxacin (also known as Cipro) was the most interesting one to us because it is a fluoroquinolone that interferes with DNA replication and is the most common antibiotic used to treat UTIs."

Antibiotics can't do their job with triclosan around

UTIs occur when bacteria, primarily Escherichia coli (E. coli), enter and infect the urinary tract. Antibiotics such as Cipro are commonly used to kill the bacteria and treat the infection.

UTIs are common; so is exposure to triclosan. A shocking percentage -- about 75 percent -- of adults in the United States have detectable levels of triclosan in their urine. About 10 percent of adults have levels high enough to prevent E. coli from growing. Could triclosan's presence in the body interfere with treating UTIs?

Westfall and Levin worked with collaborators at Washington University School of Medicine in St. Louis to answer this question.

Ana Flores-Mireles, an assistant professor at the University of Notre Dame, worked on this study as a postdoctoral scholar in the lab of Scott Hultgren, the Helen L. Stoever Professor of Molecular Microbiology at the School of Medicine. With the help of Jeffrey Henderson, associate professor of medicine and molecular biology, she figured out that mice which drink triclosan-spiked water have urine triclosan levels similar to those reported in humans.

"This result meant we could actually test the impact that human urine levels of triclosan have during antibiotic treatment of UTIs in mice," Levin said.

All of the mice with the infection received Cipro to treat the UTI. Only some of the mice drank triclosan-spiked water. After antibiotic treatment, mice with triclosan exposure had a large number of bacteria in their urine and stuck to the bladder; mice without exposure had significantly lower bacterial counts.

"The magnitude of the difference in bacterial load between the mice that drank triclosan-spiked water and those that didn't is striking," Levin said.

"If the difference in the number of bacteria between the groups was less than tenfold, it would be difficult to make a strong case that the triclosan was the culprit," Levin added. "We found 100 times more bacteria in the urine of triclosan-treated mice -- that is a lot."

This striking result has an equally striking message -- antibiotics are less effective at treating UTIs when triclosan is around, at least in mice.

Triclosan's dirty weapon: ppGpp

Triclosan is interfering with antibiotic treatment, but how?

Levin and her colleagues found that triclosan works with a cell growth inhibitor, a small molecule nicknamed ppGpp, to render cells less sensitive to antibiotics.

During times of stress, ppGpp responds by shutting down the biosynthetic pathways that make the building blocks -- DNA, RNA, protein and fat -- that ultimately become new cells. This response helps divert resources away from growth and towards survival.

"There is a rule in medicine that you don't give drugs that slow cell growth before drugs that kill cells," Levin said.

Bactericidal antibiotics kill by targeting specific biosynthetic pathways. Ampicillin targets the enzymes that make the bacterial cell wall, for example, while Cipro targets DNA synthesis. When these pathways are shut down, bactericidal antibiotics have trouble doing their job.

If triclosan triggers ppGpp, biosynthesis is curtailed and bactericidal antibiotics become ineffective at killing cells. Bacteria lacking ppGpp, however, continue biosynthesis and would be expected to die.

Levin and colleagues tested their hypothesis by engineering E. coli mutants unable to make ppGpp and compared them to E. coli able to make ppGpp. The absence of ppGpp in the mutant E. coli removed triclosan's ability to protect the cells from bactericidal antibiotics.

While clinical studies would be required to definitively prove that triclosan is interfering with antibiotic treatments in humans, Levin said, "My hope is that this study will serve as a warning that will help us rethink the importance of antimicrobials in consumer products."

Credit: 
Washington University in St. Louis

Innovative nanocoating technology harnesses sunlight to degrade microplastics

image: CLAIM Project Logo
The project receives funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 774586.

Image: 
CLAIM

Low-density polyethylene (LDPE) film microplastic fragments have been successfully degraded in water using visible-light-excited heterogeneous ZnO photocatalysts.

The innovative nanocoating technology was developed by a research team from KTH Royal Institute of Technology, Sweden and was further investigated together with PP Polymer, Sweden, as part of the EU Horizon 2020 funded project CLAIM: Cleaning Marine Litter by Developing and Applying Innovative Methods in European Seas (GA no. 774586).

Microplastics are a global menace to the biosphere owing to their ubiquitous distribution, uncontrolled environmental occurrences, small sizes and long lifetimes.

While currently applied remediation methods, including filtration, incineration and advanced oxidation processes like ozonation, all require high energy or generate unwanted byproducts, the team of CLAIM scientists proposes an innovative, toxin-free methodology that relies solely on relatively inexpensive nanocoatings and visible light.

The study, published in Environmental Chemistry Letters, is part of CLAIM's ambition to develop a small-scale photocatalytic device to be deployed in wastewater plants, aiding the degradation of microplastics in water streams into harmless components.

The scientists tested the degradation of fragmented, low-density polyethylene (LDPE) microplastic residues, by visible light-induced heterogeneous photocatalysis activated by zinc oxide nanorods. Results showed a 30% increase of the carbonyl index, a marker used to demonstrate the degradation of polymeric residues. Additionally, an increase of brittleness accompanied by a large number of wrinkles, cracks and cavities on the surface were recorded.
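
For context, a carbonyl index is typically computed from an FTIR spectrum as the absorbance of the carbonyl band relative to a stable reference band; a rise in that ratio signals oxidative degradation of the polymer. The band positions and absorbances below are assumptions for illustration, not values from the paper.

```python
# Illustrative FTIR absorbances before and after light exposure.
a_carbonyl_before, a_reference_before = 0.10, 0.50  # e.g. ~1715 and ~1465 cm^-1 bands
a_carbonyl_after, a_reference_after = 0.13, 0.50

ci_before = a_carbonyl_before / a_reference_before
ci_after = a_carbonyl_after / a_reference_after
print(f"carbonyl index: {ci_before:.2f} -> {ci_after:.2f} "
      f"(+{ci_after / ci_before - 1:.0%})")  # +30%
```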

"Our study demonstrates rather positive results towards the effectiveness of breaking low-density polyethylene, with the help of our nanocoating under artificial sunlight. In practice this means that once the coating is applied, microplastics will be degraded solely through the help of sunlight. The results provide new insights into the use of a clean technology for addressing the global microplastic pollution with reduced by-products." explains Prof. Joydeep Dutta, KTH Royal Institute of Technology.

The photocatalytic device is one of five marine cleaning technologies developed within the CLAIM project.

"A year and a half in the project we are already able to demonstrate positive results towards our ultimate goal to introduce new affordable and harmless technologies to aid us tackle the uncontrolably growing problem of marine plastic pollution. We are positive that more results will come in the following months." concludes CLAIM Coordination.

Credit: 
Pensoft Publishers

EEG helps scientists predict epileptic seizures minutes in advance

image: Carmen Mejia holds her daughter Elizabeth, whose metabolic disorder causes frequent seizures.

Image: 
UT Southwestern

DALLAS - Feb. 20, 2019 - Elizabeth Delacruz can't crawl or toddle around like most youngsters nearing their second birthday.

A rare metabolic disorder that decimated her mobility has also led to cortical blindness - her brain is unable to process images received from an otherwise healthy set of brown eyes. And multiple times a day Elizabeth suffers seizures that continually reduce her brain function. She can only offer an occasional smile or make soft bubbly sounds to communicate her mood.

"But a few months ago I heard her say, 'Mama,' and I started to cry," said Carmen Mejia, a subtle quaver in her voice as she recalled the joy of hearing her daughter. "That's the first time she said something."

Ms. Mejia realizes it may also be the last, unless doctors can find a way to detect and prevent the epileptic seizures stemming from a terminal disease called pyruvate dehydrogenase deficiency (PDHD) - which occurs when mitochondria don't provide enough energy for the cells.

A UT Southwestern study gives parents like Ms. Mejia renewed hope for their children: By monitoring the brain activity of a specific cell type responsible for seizures, scientists can predict convulsions at least four minutes in advance in both humans and mice. The research further shows that an edible acid called acetate may effectively prevent seizures if they are detected with enough notice.

Although the prediction strategy cannot yet be used clinically - a mobile technology for measuring brain activity would have to be developed - it signifies a potential breakthrough in a field that had only been able to forecast seizures a few seconds ahead.

"Many of the families I meet with are not just bothered by the seizures. The problem is the unpredictability, the not knowing when and where a seizure might occur," said Dr. Juan Pascual, a pediatric neurologist with UT Southwestern's O'Donnell Brain Institute who led the study published in Science Translational Medicine. "We've found a new approach that may one day solve this issue and hopefully help other scientists track down the root of seizures for many kinds of epilepsy."

Debunked theory

The critical difference between the study and previous efforts was debunking the long-held belief among researchers that most cells in epilepsy patients have malfunctioning mitochondria.

In fact, Dr. Pascual's team spent a decade developing a PDHD mouse model that enabled them to first discover the key metabolic defect in the brain and then determine that only a single neuron type was responsible for seizures resulting from that defect. They homed in on these neurons' electrical activity with an electroencephalogram (EEG) to detect which brainwave readings signaled an upcoming seizure.

"It's much more difficult to predict seizures if you don't know the cell type and what its activity looks like on the EEG," Dr. Pascual said. "Until this finding, we thought it was a global deficiency in the cells and so we didn't even know to look for a specific type."

Predicting seizures

The study shows how a PDHD mouse model helped scientists trace the seizures to inhibitory neurons near the cortex that normally keep the brain's electrical activity in check.

Scientists then tested a method of calculating when seizures would occur in mice and humans by reviewing EEG files and looking for decreased activity in energy-deficient neurons. Their calculations enabled them to forecast 98 percent of the convulsions at least four minutes in advance.
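
A minimal sketch of the detection idea (not the study's actual analysis): watch a sliding window of the implicated neurons' activity for a sustained drop below a threshold. The signal, window length and alarm level here are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic activity trace: baseline, then the pre-seizure decrease in
# energy-deficient neurons that served as the warning sign.
activity = np.concatenate([rng.normal(1.0, 0.1, 600),
                           rng.normal(0.5, 0.1, 300)])

window, threshold = 60, 0.7  # samples per window; alarm level
for start in range(len(activity) - window):
    if activity[start:start + window].mean() < threshold:
        print(f"warning raised at sample {start + window}")
        break
```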

Dr. Pascual is hopeful his lab can refine EEG analyses to extend the warning window by several more minutes. Even then, live, clinical predictions won't be feasible unless scientists develop technology to automatically interpret the brain activity and calculate when a seizure is imminent.

Still, he said, the discovery that a single cell type can be used to forecast seizures is a paradigm-shifting finding that may apply to all mitochondrial diseases and related epilepsies.

Potential therapy

Dr. Pascual's ongoing efforts to extend the prediction time may be a crucial step in utilizing the other intriguing finding from the study: the use of acetate to prevent seizures.

The study showed that delivering acetate into the bloodstream of PDHD mice gave their neurons enough energy to normalize their activity and decrease seizures for as long as the acetate was in the brain.

However, Dr. Pascual said the acetate would probably need more time - perhaps 10 minutes or more - to take effect in humans if taken by mouth.

Acetate, which naturally occurs in some foods, has been used in patients for decades - including newborns needing intravenous nutrition or patients whose metabolism has shut down. But it had not yet been established as an effective treatment for mitochondrial diseases that underlie epilepsy.

Among the reasons, Dr. Pascual said, is that labs have struggled to create an animal model of such diseases to study its effects; his own lab spent about a decade doing so. Another is the widespread acceptance of the ketogenic diet to reduce the frequency of seizures.

But amid a growing concern about potentially unhealthy side effects of ketogenic diets, Dr. Pascual has been researching alternatives that may refuel the brain more safely and improve cognition.

Frequent seizures

Elizabeth, among a handful of patients whose EEG data were used in the new study, has been prescribed a ketogenic diet and some vitamins to control the seizures.

Her family has seen little improvement. Elizabeth often has more than a dozen seizures a day, and her muscles and cognition continue to decline. She can't hold her head up, and her mother wonders how many more seizures her brain can take.

Elizabeth was only a few months old when she was diagnosed with PDHD, which occurs when cells lack certain enzymes to efficiently convert food into energy. Patients who show such early signs often don't survive beyond a few years.

Ms. Mejia does what she can to comfort her daughter, with the hope that Dr. Pascual's work can someday change the prognosis for PDHD. Ms. Mejia sings, talks, and offers stuffed animals and other toys to her daughter. Although her little girl can't see, the objects offer a degree of mental stimulation, she said.

"It's so hard to see her go through this," Ms. Mejia said. "Every time she has a seizure, her brain is getting worse. I still hope one day she can get a treatment that could stop all this and make her life better."

'Big questions'

Dr. Pascual is already conducting further research into acetate treatments, with the goal of launching a clinical trial for patients like Elizabeth in the coming years.

His lab is also researching other epilepsy conditions - such as glucose transporter type I (Glut1) deficiency - to determine if inhibitory neurons in other parts of the brain are responsible for seizures. If so, the findings could provide strong evidence for where scientists should look in the brain to detect and prevent misfiring neurons.

"It's an exciting time, but there is much that needs to happen to make this research helpful to patients," Dr. Pascual said. "How do we find an automated way of detecting neuron activity when patients are away from the lab? What are the best ways to intervene when we know a seizure is coming? These are big questions the field still needs to answer."

Credit: 
UT Southwestern Medical Center

Computer simulators show how to reduce damage to lungs of children in intensive care

image: Professor Declan Bates.

Image: 
University of Warwick

Mechanical ventilation of children in intensive care units is often necessary, but can damage the lungs of critically ill patients.

It's possible to change ventilator settings to reduce the risk of damage without putting child patients at risk, according to engineering researchers at the University of Warwick.

They successfully tested their new treatment strategies on simulated patients, using data from real patients collected at the Children's Hospital of Philadelphia; the study is published in the journal Intensive Care Medicine.

Changing the ventilation settings for children on life support can reduce the risk of damage to their lungs, researchers at the University of Warwick and the Children's Hospital of Philadelphia have found on computer simulated patients.

Paediatric Acute Respiratory Distress Syndrome (PARDS) is one of the most challenging diseases for doctors to manage in the paediatric intensive care unit, and can arise due to several different causes, such as pneumonia, sepsis, trauma, and drowning.

Mechanical ventilation is a life-saving medical intervention for many such patients, but the forces and stresses applied by the ventilator can themselves further damage the lungs (so-called ventilator induced lung injury - VILI).

Using patient data collected by Dr. Nadir Yehya, an attending physician in the paediatric intensive care unit at the Children's Hospital of Philadelphia, researchers from the Department of Engineering at the University of Warwick have developed a computer simulator that predicts how different ventilator settings affect the lungs of individual child patients in the ICU.

This simulator was then used to safely investigate whether, and how, ventilator settings can be changed to be more "protective", i.e. to lower the risk of causing VILI in different patients, while still maintaining adequate ventilation.

The researchers identified several strategies that, in simulated patients, led to significant reductions in variables that are associated with VILI, such as tidal volumes (the volume of air displaced between inhalation and exhalation) and driving pressures.
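
For reference, the two variables named above are computed directly from ventilator readings; the example values here are hypothetical, not from the Philadelphia dataset.

```python
# Standard bedside calculations for protective-ventilation metrics.
weight_kg = 20.0         # patient's (ideal) body weight
tidal_volume_ml = 160.0  # air volume delivered per breath
plateau_pressure = 24.0  # cmH2O at end-inspiratory hold
peep = 8.0               # cmH2O positive end-expiratory pressure

print(f"tidal volume: {tidal_volume_ml / weight_kg:.1f} mL/kg")  # ~6 mL/kg is a common protective target
print(f"driving pressure: {plateau_pressure - peep:.0f} cmH2O")  # plateau minus PEEP
```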

The next stage of this research will be to test these strategies in patients in formal prospective trials in order to evaluate the clinical benefits of more protective ventilation in real hospital environments.

Professor Declan Bates from the School of Engineering at the University of Warwick commented:

"It has been incredibly exciting to see the potential of computer simulators being realised to develop safer treatment strategies for critically ill children in the intensive care unit. We are sure that combining the expertise of medical doctors and engineers will bring about radical improvements in patient care and medical outcomes over the coming years."

Dr. Nadir Yehya from the Division of Critical Care Medicine at the Children's Hospital of Philadelphia commented:

"Collaborations such as these are essential for providing safe care for our sickest children. Computer simulations have been relatively under-utilised in paediatric intensive care, and we are excited about the opportunities to address critical areas of research using these technologies."

Credit: 
University of Warwick

Firefly-inspired surfaces improve efficiency of LED lightbulbs

image: Sapphire surface with asymmetrical pyramids to produce more light in LEDs.

Image: 
Yin Lab/ Penn State

A new type of light-emitting diode lightbulb could one day light homes and reduce power bills, according to Penn State researchers who suggest that LEDs made with firefly-mimicking structures could improve efficiency.

"LED lightbulbs play a key role in clean energy," said Stuart (Shizhuo) Yin, professor of electrical engineering. "Overall commercial LED efficiency is currently only about 50 percent. One of the major concerns is how to improve the so-called light extraction efficiency of the LEDs. Our research focuses on how to get light out of the LED."

Fireflies and LEDs face similar challenges in releasing the light that they produce because the light can reflect backwards and is lost. One solution for LEDs is to texture the surface with microstructures -- microscopic projections -- that allow more light to escape. In most LEDs these projections are symmetrical, with identical slopes on each side.
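
The trapping problem can be made concrete with the escape-cone calculation for a flat, untextured surface: only rays striking within the critical angle of total internal reflection get out. The refractive index below is a typical value for an LED semiconductor such as GaN, used here as an assumption.

```python
import math

n = 2.5                                        # assumed refractive index of the LED material
theta_c = math.asin(1 / n)                     # critical angle for total internal reflection
escape_fraction = (1 - math.cos(theta_c)) / 2  # share of isotropic emission per surface

print(f"critical angle: {math.degrees(theta_c):.1f} degrees")
print(f"escape fraction per surface: {escape_fraction:.1%}")  # only ~4%
```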

Fireflies' lanterns also have these microstructures, but the researchers noticed that the microstructures on firefly lanterns were asymmetric -- the sides slanted at different angles, giving a lopsided appearance.

"Later I noticed not only do fireflies have these asymmetric microstructures on their lanterns, but a kind of glowing cockroach was also reported to have similar structures on their glowing spots," said Chang-Jiang Chen, doctoral student in electrical engineering and lead author in the study. "This is where I tried to go a little deeper into the study of light extraction efficiency using asymmetric structures."

Using asymmetrical pyramids to create microstructured surfaces, the team found that they could improve light extraction efficiency to around 90 percent. The findings were recently published online in Optik and will appear in the April print edition.

According to Yin, asymmetrical microstructures increase light extraction in two ways. First, the greater surface area of the asymmetric pyramids allows greater interaction of light with the surface, so that less light is trapped. Second, when light hits the two different slopes of the asymmetric pyramids there is a greater randomization effect of the reflections and light is given a second chance to escape.

After the researchers used computer-based simulations to show that the asymmetric surface could theoretically improve light extraction, they next demonstrated this experimentally. Using nanoscale 3D printing, the team created symmetric and asymmetric surfaces and measured the amount of light emitted. As expected, the asymmetric surface allowed more light to be released.

The LED-based lighting market is growing rapidly as the demand for clean energy increases, and is estimated to reach $85 billion by 2024.

"Ten years ago, you go to Walmart or Lowes, LEDs are only a small portion (of their lighting stock)," said Yin. "Now, when people buy lightbulbs, most people buy LEDs."

LEDs are more environmentally friendly than traditional incandescent or fluorescent lightbulbs because they are longer-lasting and more energy efficient.

Two processes contribute to the overall efficiency of LEDs. The first is the production of light -- the quantum efficiency -- which is measured by how many electrons are converted to light when energy passes through the LED material. This part has already been optimized in commercial LEDs. The second process is getting the light out of the LED -- called the light extraction efficiency.

"The remaining things we can improve in quantum efficiency are limited," said Yin. "But there is a lot of space to further improve the light extraction efficiency."

In commercial LEDs, the textured surfaces are made on sapphire wafers. First, UV light is used to create a masked pattern on the sapphire surface that provides protection against chemicals. Then when chemicals are applied, they dissolve the sapphire around the pattern, creating the pyramid array.

"You can think about it this way, if I protect a circular area and at the same time attack the entire substrate, I should get a volcano-like structure," explained Chen.

In conventional LEDs, the production process usually produces symmetrical pyramids because of the orientation of the sapphire crystals. According to Chen, the team discovered that if they cut the block of sapphire at a tilted angle, the same process would create the lopsided pyramids. The researchers altered just one part of the production process, suggesting their approach could easily be applied to commercial manufacture of LEDs.

The researchers have filed for a patent on this research.

"Once we obtain the patent, we are considering collaborating with manufacturers in the field to commercialize this technology," said Yin.
Other researchers who worked on the project were Jimmy Yao, Wenbin Zhu, Ju-Hung Chao, Annan Shang and Yun-Goo Lee, doctoral students in electrical engineering.

Credit: 
Penn State

Breakthrough in the search for graphene-based electronics

image: Danish researchers just solved one of the biggest challenges of making effective nano electronics based on graphene: to carve out graphene to nanoscale dimensions without ruining the electrical properties. This allows them to achieve electrical currents orders of magnitude higher than previously achieved for such structures. The work shows that the quantum transport properties needed for future electronics can survive scaling down to 10 nanometer dimensions.

Image: 
Carl Otto Moesgaard

For 15 years, scientists have tried to exploit the "miracle material" graphene to produce nanoscale electronics. On paper, graphene should be great for just that: it is ultra-thin (only one atom thick, and therefore two-dimensional), it is excellent at conducting electrical current, and it should be ideal for future forms of electronics that are faster and more energy efficient. In addition, graphene consists of carbon atoms - of which we have an unlimited supply.

In theory, graphene can be altered to perform many different tasks in, for example, electronics, photonics or sensors, simply by drawing tiny patterns in it, as this fundamentally alters its quantum properties. One "simple" task, which has turned out to be surprisingly difficult, is to induce a bandgap, which is crucial for making transistors and optoelectronic devices. However, since graphene is only an atom thick, all of the atoms are important, and even tiny irregularities in the pattern can destroy its properties.

"Graphene is a fantastic material, which I think will play a crucial role in making new nanoscale electronics. The problem is that it is extremely difficult to engineer the electrical properties," says Peter Bøggild, a professor at DTU Physics.

The Center for Nanostructured Graphene at DTU and Aalborg University was established in 2012 specifically to study how the properties of graphene can be engineered, for instance by making a very fine pattern of holes. This should subtly change the quantum nature of the electrons in the material, and allow the properties of graphene to be tailored. However, the team of researchers from DTU and Aalborg experienced the same as many other researchers worldwide: it didn't work.

"When you make patterns in a material like graphene, you do so in order to change its properties in a controlled way - to match your design. However, what we have seen throughout the years is that we can make the holes, but not without introducing so much disorder and contamination that it no longer behaves like graphene. It is a bit similar to making a water pipe, with a poor flow rate because of coarse manufacturing. On the outside, it might look fine. For electronics, that is obviously disastrous," says Peter Bøggild.

Now, the team of scientists have solved the problem. Two postdocs from DTU Physics, Bjarke Jessen and Lene Gammelgaard, first encapsulated graphene inside another two-dimensional material - hexagonal boron nitride, a non-conductive material that is often used for protecting graphene's properties.

Next, they used a technique called electron beam lithography to carefully pattern the protective layer of boron nitride and graphene below with a dense array of ultra small holes. The holes have a diameter of approx. 20 nanometers, with just 12 nanometers between them - however, the roughness at the edge of the holes is less than 1 nanometer or a billionth of a meter. This allows 1000 times more electrical current to flow than had been reported in such small graphene structures.
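
A quick scale check on that geometry (the square-lattice arrangement is our assumption, for illustration):

```python
hole_diameter_nm = 20.0
edge_to_edge_nm = 12.0
pitch_nm = hole_diameter_nm + edge_to_edge_nm  # center-to-center spacing: 32 nm

holes_per_square_micron = (1000 / pitch_nm) ** 2
print(f"pitch: {pitch_nm:.0f} nm, ~{holes_per_square_micron:.0f} holes per square micron")
```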

"We have shown that we can control graphene's band structure and design how it should behave. When we control the band structure, we have access to all of graphene's properties - and we found to our surprise that some of the most subtle quantum electronic effects survive the dense patterning - that is extremely encouraging. Our work suggests that we can sit in front of the computer and design components and devices - or dream up something entirely new - and then go to the laboratory and realise them in practice," says Peter Bøggild. He continues:

"Many scientists had long since abandoned attempting nanolithography in graphene on this scale, and it is quite a pity since nanostructuring is a crucial tool for exploiting the most exciting features of graphene electronics and photonics. Now we have figured out how it can be done; one could say that the curse is lifted. There are other challenges, but the fact that we can tailor electronic properties of graphene is a big step towards creating new electronics with extremely small dimensions," says Peter Bøggild.

Credit: 
Technical University of Denmark

Fibres from old tyres can improve fire resistance of concrete

A new way of protecting concrete from fire damage using materials recycled from old tyres has been successfully tested by researchers at the University of Sheffield.

The team used fibres extracted from the textile reinforcement that is commonly embedded in tyres to guarantee their performance. Adding these fibres to the concrete mix was shown to reduce the concrete's tendency to spall explosively - where surface layers of concrete break off - under the intense heat of a fire.

Using man-made polypropylene (PP) fibres to protect concrete structures from damage or collapse if a fire breaks out is a relatively well-known technique. Many modern structures, including large-scale engineering projects such as Crossrail, have used concrete that includes PP fibres for protection against fire spalling.

The Sheffield study is the first to show that these fibres do not have to be made from raw materials, but can instead be reclaimed from used tyres. The results are published in the journal Fire Technology.

"We've shown that these recycled fibres do an equivalent job to 'virgin' PP fibres which require lots of energy and resources to produce," explains lead author Dr Shan-Shan Huang, in the Department of Civil and Structural Engineering at the University of Sheffield.

"Using waste materials in this way is less expensive, and better for the planet."

The fibres melt under the intense heat of a fire, leaving behind networks of tiny channels. Moisture within the concrete can then escape as steam, rather than remaining trapped and building up the pressure that causes the concrete to break out explosively.

"Because the fibres are so small, they don't affect the strength or the stiffness of the concrete," says Dr Huang.

"Their only job is to melt when heat becomes intense. Concrete is a brittle material, so will break out relatively easily without having these fibres help reducing the pressure within the concrete."

Protecting the concrete from fire spalling means that steel reinforcements running through the concrete are also protected. When the steel reinforcements are exposed to extreme heat they weaken very quickly, meaning a structure is much more likely to collapse. The Liverpool Waterfront Car Park suffered this kind of damage during a fire in 2017, leading to the entire structure eventually having to be demolished.

Collaborating with Twincon, a Sheffield-based company that develops innovative solutions for the construction industry, the researchers have also developed technologies for reclaiming the fibres from the used tyres.

This involved separating the fibres from the tyre rubber, untangling the fibres into strands, and then distributing them evenly into the concrete mixture.

The team plan to continue testing the material with different ratios of the fibres to concrete, and also using different types of concrete. They also plan to find out more about how the materials react to heat at the microstructure level. By scanning the concrete as it is heated, they will be able to see more precisely the structural changes taking place inside the material.

Credit: 
University of Sheffield

Fishing and pollution regulations don't help corals cope with climate change

image: This image depicts some of the ecosystem services that a healthy reef provides to people.

Image: 
Bruno et al 2019

A new study from the University of North Carolina at Chapel Hill reports that protecting coral reefs from fishing and pollution does not help coral populations cope with climate change. The study also concludes that ocean warming is the primary cause of the global decline of reef-building corals and that the only effective solution is to immediately and drastically reduce greenhouse gas emissions.

The new study published in the Annual Review of Marine Science (accessible via MarXiv: https://marxiv.org/ugk4v) found that coral reefs in areas with fishing and pollution regulations had the same level of decline as the coral reefs in unprotected areas, adding to the growing body of evidence that managed resilience efforts, like fishing and pollution regulations, don't work for coral reefs. This finding has important implications for how to protect reefs and best allocate scarce resources towards marine conservation.

Ocean warming is devastating reef-building corals around the world. About 75 percent of the living coral on the reefs of the Caribbean and south Florida has been killed off by warming seawater over the last 30 to 40 years. Australia's Great Barrier Reef was hit by extreme temperatures and mass bleaching in 2016 and 2017, wiping out roughly half of the remaining coral on its remote northern section.

Corals build up reefs over thousands of years via the slow accumulation of their skeletons, and coral reef habitats are occupied by millions of other species, including grouper, sharks, and sea turtles. In addition to supporting tourism and fisheries, reefs protect coastal communities from storms by buffering the shoreline from waves. When corals die, these valuable services are lost.

The most common response to coral decline by policy makers and reef managers is to ban fishing, based on the belief that fishing indirectly exacerbates the effects of ocean warming by enabling the seaweeds that overgrow corals. The approach, referred to as managed resilience, assumes that threats to species and ecosystems are cumulative and that, by minimizing as many threats as possible, we can make ecosystems resilient to climate change, disease outbreaks, and other threats that cannot be addressed locally.

The study's authors, led by John Bruno, a marine ecologist in the College of Arts and Sciences at the University of North Carolina at Chapel Hill, performed a quantitative review of 18 case studies that field-tested the effectiveness of the managed resilience approach. None found that it was effective. Protecting reefs inside Marine Protected Areas from fishing and pollution did not reduce how much coral was killed by extreme temperatures or how quickly coral populations recovered from coral disease, bleaching, and large storms.

"Managed resilience is the approach to saving reefs favored by many scientists, nongovernmental organizations, and government agencies, so it's surprising that it doesn't work. Yet the science is clear: fishery restrictions, while beneficial to overharvested species, do not help reef-building corals cope with human-caused ocean warming," said Bruno.

The 18 individual studies measured the effectiveness of managed resilience by comparing the effects of large-scale disturbances, like mass bleaching events, major storms, and disease outbreaks, on coral cover inside Marine Protected Areas versus in unprotected reefs. Many also measured the rate of coral population recovery after storms. The decline in coral cover was measured directly, via scuba surveys of the reef, before and periodically after large-scale disturbances. Overall, the meta-analysis included data from 66 protected reefs and 89 unprotected reefs from 15 countries around the world.
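The core comparison behind that meta-analysis is simple. Here is a minimal sketch in Python, using made-up coral-cover figures purely for illustration (the actual study pooled field survey data from scuba surveys, not these numbers):

    import statistics

    # Hypothetical percent coral cover (before, after) a mass bleaching event
    protected   = [(35, 22), (40, 28), (28, 16), (33, 21)]   # inside MPAs
    unprotected = [(36, 24), (31, 18), (42, 29), (29, 19)]   # outside MPAs

    def mean_loss(reefs):
        return statistics.mean(before - after for before, after in reefs)

    print(f"mean cover loss, protected:   {mean_loss(protected):.1f} points")
    print(f"mean cover loss, unprotected: {mean_loss(unprotected):.1f} points")

If the two losses are statistically indistinguishable, as the review found across the 18 case studies, then protection from fishing and pollution did not buffer corals against the disturbance.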

The study also assessed evidence for various assumed causes of coral decline. For many, including overfishing, seaweeds, and pollution, evidence was minimal or uncertain. In contrast, the authors found that an overwhelming body of evidence indicates that ocean warming is the primary cause of the mass coral die-off that scientists have witnessed around the world.

Credit: 
University of North Carolina at Chapel Hill