
Religion associated with HPV vaccination rate for college women

TAMPA, Fla. (August 19, 2019)- It's been more than a decade since a vaccine was introduced to prevent contraction of human papillomavirus (HPV), the most common sexually transmitted disease in the U.S. The Centers for Disease Control and Prevention (CDC) recommends patients start receiving the vaccine between ages 11 and 12, with catch-up vaccination recommended for certain groups through age 26. However, a new study conducted at the University of South Florida (USF) found many female college students have not been inoculated and religion may be a contributing factor.

The study, published in the Journal of Religion and Health, found 25 percent of female students surveyed between 18 and 26 years old had not been vaccinated for HPV, and sexual activity was the main factor related to vaccination. Of those unvaccinated students, 70 percent identified with a particular religious faith. A common religious belief is abstinence until marriage. Accordingly, some parents oppose vaccinating their children against HPV because the children are not yet sexually active and the parents believe the vaccine is unnecessary. Others feel vaccinating children could promote sexual activity, although studies show this belief has declined in recent years. Despite their religious identification, the majority of the students surveyed say they're sexually active.

"The whole point of the vaccine is to protect people against high risk types of HPV before exposure - so ideally before they're sexually active," said lead author Alicia Best, PhD, assistant professor in the USF College of Public Health. "College is often a time of sexual exploration and autonomous decision-making. So, while a student's parents may have previously opposed the HPV vaccine for religious or other reasons, these students may decide vaccination is right for them now. Therefore, college students are a key group that must be educated on the importance of HPV vaccination."

Other HPV preventative measures include condom usage and dental dams. HPV can be asymptomatic, allowing it to be unknowingly transmitted through sexual contact. While there is currently no cure for HPV, the virus is usually harmless and goes away by itself; but certain types of HPV can cause genital warts and cancer. The CDC recommends women begin getting Pap smears at age 21, which can detect cancer and other abnormalities. Women may also ask their doctor about HPV testing.

Credit: 
University of South Florida

Uncertainty in emissions estimates in the spotlight

National or other emissions inventories of greenhouse gases that are used to develop strategies and track progress in terms of emissions reductions for climate mitigation contain a certain amount of uncertainty, which inevitably has an impact on the decisions they inform. IIASA researchers contributed to several studies in a recently published volume that aims to enhance understanding of uncertainty in emissions inventories.

Estimates of greenhouse gas (GHG) emissions are important for many reasons, but it is crucial to acknowledge that these values have a certain level of uncertainty that has to be taken into account. If, for example, two estimates of emissions from a country are different, it does not necessarily imply that one or both are wrong - it simply means that there is an uncertainty that needs to be recognized and dealt with. A special issue of the Springer journal Mitigation and Adaptation Strategies for Global Change aims to enhance understanding of uncertainty in estimating GHG emissions and to provide guidance on dealing with the resulting challenges. IIASA researchers and colleagues from other international institutions, including the Lviv Polytechnic National University in Ukraine, the Systems Research Institute at the Polish Academy of Sciences, and Appalachian State University in the US, contributed to the 13 papers featured in the publication, addressing questions such as how large the uncertainties are, how to deal with them, and how they might be reduced.

According to the researchers, there are ways to decrease uncertainty but these are often difficult and ultimately expensive. In their respective papers, they point out that there are seven important issues that currently dominate our understanding of uncertainty. These include 1) verification; 2) avoidance of systemic surprises; 3) uncertainty informing policy; 4) minimizing the impact of uncertainty; 5) full GHG accounting; 6) compliance versus reporting; and 7) changes in emissions versus changes in the atmosphere.

With regard to how uncertainty in observations and modeling results can influence policy decisions on climate change mitigation, some of the papers also looked at how decision-making procedures can be improved to produce fairer rules for checking compliance, and how information around emission inventories can be communicated to make it more transparent and easier to understand. The authors explain that understanding the uncertainties is important both for those who do the calculations or modeling and for the consumers of this information, such as policymakers or consultants, as it provides an indication of how much they can rely on the data, in other words, how "strong" the conclusions are and how confident the decisions derived from the data can be.
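One concrete illustration of how such inventory uncertainties are quantified (a standard error-propagation formula of the kind used in IPCC inventory guidance, not a result from the special issue itself): the percentage uncertainty of an inventory total combines the uncertainties of its source categories in quadrature, weighted by their emissions.

```latex
% U_i     : percentage uncertainty of source category i
% E_i     : emission estimate of source category i
% U_total : percentage uncertainty of the inventory total
U_{\text{total}} \;=\; \frac{\sqrt{\sum_i \left( U_i \, E_i \right)^2}}{\left| \sum_i E_i \right|}
```

A category with a large percentage uncertainty but small emissions therefore contributes little to the overall uncertainty, which is one rationale for treating different parts of an inventory differently, as discussed below.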

"Uncertainty is higher for some GHGs and some sectors of an inventory than for others. This raises the option that, when future policy agreements are being designed, some components of a GHG inventory could be treated differently from others. The approach of treating subsystems individually and differently would allow emissions and uncertainty to be looked at simultaneously and would thus allow for differentiated emission reduction policies," explains Matthias Jonas, an IIASA researcher in the Advanced Systems Analysis Program and one of the editors of the special issue. "The current policy approach of ignoring inventory uncertainty altogether (inventory uncertainty was monitored, but not regulated, under the Kyoto Protocol) is problematic. Being aware of the uncertainties involved, including those resulting from our systems views, will help to strengthen future political decision making."

The authors all agree that dealing with uncertainty is often not a quick exercise but rather involves a painstaking, long-term commitment. Proper treatment of uncertainty can be costly in terms of both time and effort because it necessitates taking the step from "simple" to "complex" in order to grasp a wider and more holistic systems view. Only after that step has been taken is it possible to consider simplifications that may be warranted.

"Decision makers want certainty, the public wants certainty, but certainty is not achievable. We can work with the best information available and we have to keep moving forward and learning. I think that we need to convince data users such as policymakers or the public that uncertainty in these kinds of numbers is normal and expected and does not mean that the numbers are not useful," says study author Gregg Marland from Appalachian State University in the US.

Special issue co-editor Rostyslav Bun from Lviv Polytechnic National University in Ukraine confirms this sentiment and adds in conclusion: "The presence of uncertainties in estimates of GHG emissions may suggest that we have to devote more energy to decreasing uncertainties or it may simply mean that we need to be prepared to deal with a future that includes a certain measure of uncertainty."

Credit: 
International Institute for Applied Systems Analysis

Circulation of water in deep Earth's interior

image: The thick red line indicates the calculated dissociation phase boundary of phase H.

Image: 
Ehime University

The existence of water in deep Earth is considered to play an important role in geodynamics, because water drastically changes the physical properties of mantle rock, such as melting temperature, electric conductivity, and rheological properties. Water is transported into deep Earth by the hydrous minerals in the subducting cold plates. Hydrous minerals, such as serpentine, mica and clay minerals, contain H2O in the form of hydroxyl (-OH) in the crystal structure. Most of the hydrous minerals decompose into anhydrous minerals and water (H2O) when they are transported into deep Earth, at 40-100 km depth, due to the high temperature and pressure conditions.

However, it has also been reported that some hydrous minerals, called dense hydrous magnesium silicates (DHMSs), may survive in the deeper part of Earth's interior if the subducting plate is significantly colder than the surrounding mantle. DHMSs are a series of hydrous minerals with high stability under the pressures of deep Earth's interior. They are also referred to as "alphabet phases": phase A, phase B, phase D, etc.

Until recently, phase D (chemical composition: MgSi2O6H2) was known to be the highest-pressure phase of the DHMSs. However, Tsuchiya 2013 conducted first-principles calculations (a theoretical method based on quantum mechanics) to investigate the stability of phase D under pressure and found that this phase transforms to a new phase with a chemical composition of MgSiO4H2 (plus stishovite, a high-pressure form of SiO2, if the system keeps the same chemical composition) above 40 GPa (1 GPa = 10^9 Pa). This predicted phase was experimentally confirmed by Nishi et al. 2014 and named "phase H" (Figure 1). The theoretical calculation by Tsuchiya 2013 also suggests that, upon further compression, phase H finally decomposes into the anhydrous mineral MgSiO3 by releasing H2O.

Although the theoretical calculation placed the decomposition pressure of phase H around the middle of the lower mantle (which spans depths from 660 km to 2900 km), a detailed determination had not yet been achieved, because the Gibbs free energy of H2O was needed to determine the decomposition pressure of phase H. The Gibbs free energy is a thermodynamic potential that determines the stability of a system. At lower-mantle conditions, the H2O phase has a crystal structure with disordered hydrogen positions, i.e. the hydrogen atoms are statistically distributed among several different sites. To account for this disordered state of hydrogen, Tsuchiya and Umemoto 2019 computed several different hydrogen configurations and estimated the Gibbs free energy of H2O using a technique based on statistical mechanics.
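A minimal sketch of the criterion involved, assuming the decomposition reaction described above (this is the generic thermodynamic condition, not the specific computational scheme of Tsuchiya and Umemoto 2019):

```latex
% Decomposition reaction of phase H in the MgO-SiO2-H2O system
\mathrm{MgSiO_4H_2}\ (\text{phase H}) \;\longrightarrow\; \mathrm{MgSiO_3} + \mathrm{H_2O}

% Phase H breaks down once the products have the lower Gibbs free energy:
\Delta G(P,T) \;=\; G_{\mathrm{MgSiO_3}} + G_{\mathrm{H_2O}} - G_{\text{phase H}} \;<\; 0
```

The hydrogen disorder of the high-pressure H2O phase enters this balance through a configurational contribution to the Gibbs free energy of H2O, which is why a statistical-mechanics treatment was needed before the decomposition pressure could be pinned down.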

As a result, they estimated the decomposition pressure of phase H to be around 62 GPa at 1000 K, corresponding to a depth of ~1500 km (Figure 2). This result indicates that the transportation of water by subducting plates terminates at the middle of the lower mantle in the Mg-Si-O system. Tsuchiya and Umemoto 2019 also suggested that superionic ice may be stabilized by the decomposition of phase H in the subducted plate. In superionic ice, oxygen atoms crystallize at lattice points whereas hydrogen atoms are freely mobile. The chemical reactions between superionic ice and the surrounding minerals have not yet been identified, but the high diffusivity of hydrogen in superionic ice may drive reactions faster than in ordinary solid ice, though still differently from liquid water.
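As a rough plausibility check on the pressure-depth correspondence quoted above (assuming a constant mean overburden density of about 4200 kg/m3 and g of about 9.9 m/s2, values chosen purely for illustration rather than taken from the paper):

```latex
z \;\approx\; \frac{P}{\bar{\rho}\, g}
  \;=\; \frac{62 \times 10^{9}\ \mathrm{Pa}}{4200\ \mathrm{kg\,m^{-3}} \times 9.9\ \mathrm{m\,s^{-2}}}
  \;\approx\; 1.5 \times 10^{6}\ \mathrm{m} \;\approx\; 1500\ \mathrm{km}
```

In practice the conversion is done with a reference Earth model, since both density and gravity vary with depth, but this simple estimate lands in the same range.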

Credit: 
Ehime University

New discipline proposed: Macro-energy systems -- the science of the energy transition

image: Phenomena of interest to macro-energy systems are listed in bold.

Image: 
Patricia Levi

What types of electricity storage could have the biggest impact globally for a low-carbon energy future? Can humanity simultaneously de-carbonize energy and extend heat, lighting and transportation to more than a billion people now living without modern energy services?

These are the types of big-picture questions that are being answered by the research that fits into a new academic discipline--"macro-energy systems"--proposed by a group of researchers led by Stanford University.

"Macro-energy systems as a discipline illuminates the dynamics, benefits, costs and impacts of large-scale energy system transitions," says Sally M. Benson, co-director of Stanford's Precourt Institute for Energy and senior author of the perspective published Wednesday in the academic journal Joule. Benson is a professor in Energy Resources Engineering.

The new discipline would address topics that account for a large portion of energy use, like the global car fleet; that cover vast geographical regions, like supply chains; or that span decades, like energy investments. The large spatial, temporal, or energy scale of these issues requires researchers, no matter what department or discipline they come from, to use similar techniques like modeling and abstraction.

"Macro-energy systems research and education is happening already, but it's being done in different departments and published under disparate academic journals," says lead author Patricia Levi, a PhD candidate in Stanford's Management & Science Engineering Department.

Formalizing this research and education as a discipline would have many benefits, Levi says. Discipline-specific journals, conferences and funding bodies would improve research by establishing core methods and terminology, making peer review more credible, fostering collaboration, and avoiding redundant work.

"An example of the pitfalls of not doing this is the sphere of 'energy analysis,' which is also called 'net energy analysis,' and 'energy return on investment' analysis. This concept has been re-invented numerous times over decades from multiple directions by multiple actors, but still lacks a consistent methodology," says co-author Michael Carbajales-Dale, who heads the Energy-Economy-Environment Systems Analysis group at Clemson University, where he is an assistant professor in the Environmental Engineering & Earth Sciences Department.

The creation of macro-energy systems will also help its practitioners convince other researchers, policymakers, companies and funders of the value of their research, the authors write.

"Unification of different areas under the umbrella of macro-energy systems will also make it more identifiable for new students, and in turn help them find jobs after they complete their education because relevant hiring managers will understand what they studied," says co-author Simon Davidsson Kurland, a postdoctoral researcher at Chalmers University of Technology in Sweden.

As first steps, the authors ask that academics who think that their work is described by the paper begin to identify their discipline as macro-energy systems and champion that term with their peers. Second, they envision a meeting to chart the scope and direction of research and education in the discipline.

John Weyant, Stanford professor in Management Science & Engineering and director of the Energy Modeling Forum, is also a co-author of the study, as is Adam Brandt, associate professor in Stanford's Energy Resources Engineering Department.

People interested in a future workshop about macro-energy systems, or in otherwise helping to develop a community around this discipline, can connect with the authors, who also provide a select bibliography of research relevant to macro-energy systems.

Credit: 
Stanford University

Feasibility of antimicrobial stewardship interventions in community hospitals

What The Study Did: This study evaluated whether implementing two antimicrobial stewardship interventions (pharmacist approval to continue antibiotic use after the first dose and pharmacist engagement with the prescriber about antibiotic appropriateness after 72 hours of treatment) were feasible in community hospitals.

Authors: Deverick J. Anderson, M.D., M.P.H., of the Duke University School of Medicine in Durham, North Carolina, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.9369)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Archaeology at BESSY II

image: A team of researchers examined an ancient papyrus with a supposed empty spot. With the help of several methods, they discovered which signs once stood in this place and which ink was used.

Image: 
HZB

The first thing that catches an archaeologist's eye on the small piece of papyrus from Elephantine Island on the Nile is the apparently blank patch. Researchers from the Egyptian Museum, Berlin universities and Helmholtz-Zentrum Berlin have now used the synchrotron radiation from BESSY II to unveil its secret. This pushes the door wide open for analysing the giant Berlin papyrus collection and many more.

For more than a century, numerous metal crates and cardboard boxes have sat in storage at the Egyptian Museum and Papyrus Collection Berlin, all of which were excavated by Otto Rubensohn from 1906 to 1908 from an island called Elephantine on the River Nile in the south of Egypt, near the city of Aswan. Eighty percent of the texts on the papyri in these containers have yet to be studied, and this can hardly be done using conventional methods anymore. Thousands of years ago, the Egyptians would carefully roll up or fold together letters, contracts and amulets to a tiny size so that they would take up the least possible space. In order to read them, the papyri would have to be just as carefully unfolded again. "Today, however, much of this papyrus has aged considerably, so the valuable texts can easily crumble if we try to unfold or unroll them," says Prof. Dr. Heinz-Eberhard Mahnke of Helmholtz-Zentrum Berlin and Freie Universität Berlin, describing the greatest obstacle facing the Egyptologists who are eager to unearth the scientific treasures waiting in the boxes and crates in the Berlin Egyptian Museum.

Testing the fragile papyrus with nondestructive methods

The physicist at Helmholtz-Zentrum Berlin knew from many years of research how to analyse the fragile papyrus without destroying it: shining a beam of X-ray light on the specimen causes the atoms in the papyrus to become excited and send back X-rays of their own, much like an echo. Because the respective elements exhibit different X-ray fluorescence behaviour, the researchers can distinguish the atoms in the sample by the energy of the radiation they return. The scientists therefore long ago developed laboratory equipment that uses this X-ray fluorescence to analyse sensitive specimens without destroying them.

Scholars in ancient Egypt typically wrote with a black soot ink made from charred pieces of wood or bone and which consisted mainly of elemental carbon. "For certain purposes, however, the ancient Egyptians also used coloured inks containing elements such as iron, copper, mercury or lead," Heinz-Eberhard Mahnke explains. If the ancient Egyptian scribes had used such a "metal ink" to inscribe the part that now appears blank on the Elephantine papyrus, then X-ray fluorescence should be able to reveal traces of those metals. Indeed, using the equipment in their laboratory, the researchers were able to detect lead in the blank patch of papyrus.
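The element-assignment step can be pictured with a short sketch like the one below. It is illustrative only: the emission-line energies are approximate textbook values, and the function and variable names are hypothetical rather than part of the HZB laboratory software.

```python
# Sketch: assign measured XRF peaks to elements via characteristic emission-line energies.
# Line energies (keV) are approximate textbook values for a few relevant elements.
CHARACTERISTIC_LINES_KEV = {
    "Fe K-alpha": 6.40,
    "Cu K-alpha": 8.05,
    "Hg L-alpha": 9.99,
    "Pb L-alpha": 10.55,
}

def identify_elements(peak_energies_kev, tolerance_kev=0.15):
    """Match measured fluorescence peak energies to known lines within a tolerance."""
    matches = []
    for peak in peak_energies_kev:
        for line, energy in CHARACTERISTIC_LINES_KEV.items():
            if abs(peak - energy) <= tolerance_kev:
                matches.append((peak, line))
    return matches

# A hypothetical spectrum with a peak near 10.5 keV would point to lead in the "blank" patch.
print(identify_elements([10.52]))
```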

Revealing sharper details at BESSY II with "absorption edge radiography"

In fact, they even managed to discern characters, albeit as a blurry image. To capture a much sharper image, they studied it with X-ray radiography at BESSY II, where the synchrotron radiation illuminates the specimen with many X-ray photons of high coherence. Using "absorption edge radiography" at the BAMline station of BESSY II, they were able to enhance the contrast of this technique for the sample studied, and thus better distinguish the characters written on the papyrus from the structure of the ancient paper. So far, it has not been possible to translate the character, but it could conceivably depict a deity.
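In broad strokes, absorption edge radiography works as follows (this is the general principle rather than the specific BAMline protocol): transmission images are recorded at photon energies just below and just above an absorption edge of the element of interest, and their difference highlights only the regions containing that element, because its attenuation coefficient jumps sharply across the edge while that of the surrounding papyrus changes smoothly.

```latex
% Beer-Lambert attenuation of the X-ray beam through a sample of thickness t
I(E) \;=\; I_0\, e^{-\mu(E)\, t}

% Subtracting log-transmission images taken just below (E_-) and just above (E_+) the edge
% leaves a signal only where the edge element (here, lead) is present;
% contributions from materials without an edge between E_- and E_+ largely cancel.
\ln I(E_-) - \ln I(E_+) \;=\; \bigl[\mu(E_+) - \mu(E_-)\bigr]\, t
```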

Composition of the invisible ink resolved in the Rathgen laboratory

The analysis at BESSY II did not identify the kind of leaded ink the ancient scribes used to write these characters on the papyrus. Only by using a "Fourier-transform infrared spectrometer" could the scientists of the Rathgen Research Laboratory Berlin finally identify the substance as lead carboxylate, which is in fact colourless. But why would the ancient scribe have wanted to write on the papyrus with this kind of "invisible ink"? "We suspect the characters may originally have been written in bright minium (red lead) or perhaps coal-black galena (lead glance)," says Heinz-Eberhard Mahnke, summarising the researchers' deliberations.

If such inks are exposed to sunlight for too long, the energy of the light can trigger chemical reactions that alter the colours. Even many modern dyes similarly fade over time in the bright sunlight. It is therefore easily conceivable that, over thousands of years, the bright red minium or jet black galena would transform into the invisible lead carboxylate, only to mystify researchers as a conspicuously blank space on the papyrus fragment.

Method developed to study folded papyri without contact

With their investigation, Dr. Tobias Arlt of Technische Universität Berlin, Prof. Dr. Heinz-Eberhard Mahnke and their colleagues have pushed the door wide open for future studies to decipher texts even on finely folded or rolled papyri from the Egyptian Museum without having to unfold them and risk destroying the precious finds. To this end, the researchers developed a new technique for virtually opening the valuable papyri on the computer without ever touching them.

The Elephantine project funded by the European Research Council, ERC, and headed by Prof. Dr. Verena Lepper (Stiftung Preußischer Kulturbesitz-Staatliche Museen zu Berlin) is thus well on its way to studying many more of the hidden treasures in the collection of papyrus in Berlin and other parts of the world, and thus to learning more about Ancient Egypt.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Bloodsucker discovered: First North American medicinal leech described in over 40 years

video: An international team of museum scientists led by Anna Phillips, the Smithsonian's curator of parasitic worms, describe Macrobdella mimicus, the first new species of medicinal leech discovered in over 40 years, in the Aug. 15 issue of the Journal of Parasitology. Phillips and her colleagues have been exploring the diversity of medicinal leeches in North America for years. When she returned to the National Museum of Natural History from a 2015 field expedition with several orange-spotted, olive-green leech specimens she had collected from a Maryland swamp, she and her team assumed they belonged to a familiar species called M. decora, a leech that is thought to live throughout a large swath of the northern United States. But DNA sequencing revealed it was a new species.

Image: 
Ian Cook

Freshwater wetlands from Georgia to New York are home to a previously unrecognized species of medicinal leech, according to scientists at the Smithsonian's National Museum of Natural History. The new species, named Macrobdella mimicus, was first identified from specimens collected in southern Maryland, prompting a search through marshes and museum collections that ultimately revealed that the leech has long occupied a range that stretches throughout the Piedmont region of the eastern United States, between the Appalachian Mountains and the Atlantic Coast.

An international team of museum scientists led by Anna Phillips, the museum's curator of parasitic worms, describe the new species in the Aug. 15 issue of the Journal of Parasitology.

"We found a new species of medicinal leech less than 50 miles from the National Museum of Natural History--one of the world's largest libraries of biodiversity," Phillips said. "A discovery like this makes clear just how much diversity is out there remaining to be discovered and documented, even right under scientists' noses."

Leeches are parasitic worms, many of which feed on the blood of their hosts. In the 1700s and 1800s, physicians used leeches to treat a wide range of ailments, believing that by ridding a patient's body of bad blood, the parasites could cure headaches, fevers and other conditions. Any leech that readily feeds on humans is considered a medicinal leech, although in North America most leeches used for bloodletting were imported from Europe, leaving native species relatively undisturbed.

Phillips and her colleagues have been exploring the diversity of medicinal leeches in North America for years. When she returned to the museum from a 2015 field expedition with several orange-spotted, olive-green leech specimens she had collected from a Maryland swamp, she and her team assumed they belonged to a familiar species called M. decora, a leech that is thought to live throughout a large swath of the northern United States. But DNA sequencing revealed otherwise.

Examining the specimens' genomes at key regions used for species identification, Ricardo Salas-Montiel, a graduate student at the National Autonomous University of Mexico, found significant differences from the DNA of M. decora. The molecular discrepancy was surprising, but when the scientists took a closer look at the newly collected leeches, they found a physical difference that distinguished them from M. decora as well. Like M. decora, the new leeches have multiple reproductive pores along the bottom of their bodies, known as gonopores and accessory pores. In the new leeches, however, the gonopores and accessory pores are located in a different position relative to each other.

A subsequent field outing turned up more leeches in South Carolina that shared the same accessory pore positioning. "Then we sequenced [their DNA], and they all came out more closely related to the leeches we had found in Maryland than to anything else known to science," Phillips said.

Phillips quickly retrieved dozens of North American leeches stored in the Smithsonian's parasite collection and examined their accessory pores. "All of a sudden, I started finding these things everywhere," she said. Leeches with the unique pore positioning had been found in locations from northern Georgia to Long Island and preserved in the museum's collection for years. The oldest, Phillips said, dated back to 1937.

From there, Phillips expanded her search, scouring parasite collections at the North Carolina Museum of Natural Sciences and the Virginia Museum of Natural History, pinpointing additional places where the leech had been found in the past. She and her team also found fresh specimens in Georgia and North Carolina and used DNA sequencing to confirm their close relationship to the others.

Each specimen added to the team's understanding of the leech's history in the region and its geographical range. Their molecular, geographical and morphological data suggest that M. mimicus occupies a sliver of the eastern United States nestled between the ranges of two other medicinal leech species, Phillips said. The historical record from the museums' collections, with specimens spanning 63 years, adds another critical layer of information, confirming that the species was not recently introduced to the area and does not represent a newly evolved species. "It's been here this whole time," she said. "We just hadn't looked at it in this new way."

Credit: 
Smithsonian

Superdeep diamonds confirm ancient reservoir deep under Earth's surface

image: Diamonds from the Juina area: most of these are superdeep diamonds.

Image: 
Graham Pearson

Barcelona: Analyses show that gases found in microscopic inclusions in diamonds come from a stable subterranean reservoir at least as old as the Moon, hidden more than 410 km below sea level in the Earth's mantle.

Scientists have long suspected that an area of the Earth's mantle, somewhere between the crust and the core, contains a vast reservoir of rock, comparatively undisturbed since the planet's formation. Until now, there has been no firm proof of whether or where it exists. Now an international group of scientists has measured helium isotopes contained in superdeep diamonds brought to the surface by violent volcanic eruptions, to detect the footprints of this ancient reservoir. This work will be presented to scientists for the first time on Friday 23rd August at the Goldschmidt conference in Barcelona, after publication today (15 August) in the journal Science.

After the formation of the Earth, violent geological activity and extra-terrestrial impacts disrupted the young planet, meaning that almost nothing of the Earth's original structure remains. Then, in the 1980s, geochemists noted that in some basalt lavas from particular locations the ratio of helium-3 to helium-4 was higher than expected, mirroring the isotope ratio found in extremely old meteorites which had fallen to Earth. This indicates that the lava had carried the material from some kind of reservoir deep in the Earth with a composition which hasn't changed significantly in the last 4 billion years. "This pattern has been observed in 'Ocean Island Basalts', which are lavas coming to the surface from deep in the Earth that form islands such as Hawaii and Iceland," said research leader Dr Suzette Timmerman, from the Australian National University. "The problem is that although these basalts are brought to the surface, we only see a glimpse of their history. We don't know much about the mantle where their melts came from."
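For orientation (a standard convention in noble-gas geochemistry, not something spelled out in the release), helium isotope ratios are normally reported relative to the atmospheric ratio:

```latex
% R   : measured 3He/4He of the sample
% R_A : atmospheric 3He/4He, roughly 1.4 \times 10^{-6}
\frac{R}{R_A} \;=\;
\frac{\left(^{3}\mathrm{He}/^{4}\mathrm{He}\right)_{\mathrm{sample}}}
     {\left(^{3}\mathrm{He}/^{4}\mathrm{He}\right)_{\mathrm{atmosphere}}}
```

Mid-ocean-ridge basalts cluster around R/R_A of roughly 8, while some ocean island basalts reach several times that value, which is the "higher than expected" signature described above.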

To address this problem, Timmerman's team looked at helium isotope ratios in superdeep diamonds. Most diamonds are formed 150 to 230 km below the Earth's surface, before being carried up by melts. Very occasionally, some 'superdeep' diamonds (created between 230 and 800 km below the Earth's surface) are brought to the surface. These superdeep diamonds are recognisably different from normal diamonds.

Suzette Timmerman said "Diamonds are the hardest, most indestructible natural substance known, so they form a perfect time capsule that provides us a window into the deep Earth. We were able to extract helium gas from twenty-three super-deep diamonds from the Juina area of Brazil. These showed the characteristic isotopic composition that we would expect from a very ancient reservoir, confirming that the gases are remnants of a time at or even before the Moon and Earth collided. From the geochemistry of the diamonds, we know that they formed in an area called the 'transition zone', which is between 410 and 660 km below the surface of the Earth. This means that this unseen reservoir, left over from the Earth's beginnings, must be in this area or below it.

Questions remain about the form of this reservoir; is it a large single reservoir, or are there multiple smaller ancient reservoirs? Where exactly is the reservoir? What is the complete chemical composition of this reservoir? But with this work we are beginning to home in on what is probably the oldest remaining comparatively undisturbed material on Earth"

Commenting, Professor Matthew Jackson (University of California, Santa Barbara) said:

"There has been a lot of work focused on identifying the location of primordial reservoirs in the deep Earth. So this is an interesting result, with a lot of potential to "map out" where elevated 3He/4He domains are located in the Earth's deep interior. Helium can diffuse rapidly at mantle conditions, so it will be important to evaluate whether the ancient helium signature reflects compositions trapped at diamond-formation depths, or the composition of the host lava that transported to diamonds to the surface. This work is an important step towards understanding these reservoirs, and points the way to further research".

Credit: 
Goldschmidt Conference

Regenstrief, IU scientists to present cutting-edge HIT expertise at world congress

image: Research scientists representing Regenstrief Institute, Indiana University School of Medicine and IU Richard M. Fairbanks School of Public Health at IUPUI are joining -- and in some cases leading - the global health conversation at the 17th World Congress of Medical and Health Informatics (MedInfo).

Image: 
MedInfo

Research scientists representing Regenstrief Institute, Indiana University School of Medicine and IU Richard M. Fairbanks School of Public Health at IUPUI are joining - and in some cases leading - the global health conversation at the 17th World Congress of Medical and Health Informatics (MedInfo).

MedInfo is the foremost international conference for the science and practice of biomedical informatics. Every two years, scientists, physicians, academicians, students, entrepreneurs, decision-makers and other professionals from around the world gather to share pioneering research and discuss leveraging information to improve human health. This year's event is in Lyon, France Aug. 25-30. The topic is: "Health and Wellbeing: E-Networks for all."

Regenstrief and IU researchers are presenting their work on topics including artificial intelligence, public health informatics, disease surveillance and more. They are also discussing opportunities for advancing the use of electronic health record data and opportunities for electronic care planning for chronic disease, as well as offering tutorials on public and population health informatics and LOINC®, a universal code developed at Regenstrief Institute that is used for health measurements in many countries and is necessary for the electronic exchange of health information.

The biennial conference is organized by the International Medical Informatics Association (IMIA). The organization aims to promote informatics in healthcare, stimulate research and application and advance international cooperation and exchange of knowledge in informatics. This year, MedInfo is co-hosted by the French Association for the applications of medical informatics, "Association pour les Applications de l'Informatique en Médecine," an IMIA member society.

Regenstrief participation:

Panels

Challenges and opportunities in electronic care planning for chronic disease: The chronic kidney disease use case

Panelist: Theresa Cullen M.D., M.S.

M.D. from University of Arizona College of Medicine; M.S. from University of Wisconsin; B.S. from Johnston College

The role of CIOs/CRIOs in advancing the use of EHR data for translational research, precision medicine and learning health systems

Chairman: Umberto Tachinardi, M.D., M.Sc.
M.D. from Itajubá School of Medicine, Brazil; M.Sc. State University of São Paulo.

Panelist: Peter Embí, M.D., M.S.
M.D. from University of South Florida; M.S. from Oregon Health and Science University; B.S. from University of Florida.

Presentations

HIV case-based surveillance and aggregate data reporting: Demonstrating use cases for a national health information exchange (HIE) using information from patient-level systems

Authors: Theresa Cullen, M.D., M.S.; Burke Mamlin, M.D.

Integration of FHIR to facilitate electronic case reporting: results from a pilot study

Authors: Brian E Dixon, PhD, MPA; David E Taylor

Presenter: Dr. Dixon
PhD from Indiana University School of Informatics and Computing; MPA from Indiana University School of Public and Environmental Affairs; B.A. from DePauw University.

Evaluating two approaches for parameterizing the Fellegi-Sunter patient matching algorithm to optimize accuracy

Authors: Shaun Grannis, M.D., M.S.; Suranga Kasthurirathne, PhD; Huiping Xu, PhD; Na Bo, M.S.

Presenter: Dr. Grannis
M.D. from Michigan State College of Human Medicine; M.S. from Indiana University; B.S. from Massachusetts Institute of Technology.

Generalization of machine learning approaches to identify notifiable diseases reported from a statewide health information exchange

Authors: Gregory Dexter; Suranga Kasthurirathne, PhD; Brian E Dixon, PhD, MPA; Shaun Grannis, M.D., M.S.

Presenter: Gregory Dexter

Purdue University undergraduate student, Regenstrief summer scholar.

An adversarial approach to enable re-use of machine learning models and collaborative research efforts using synthetic unstructured free-text medical data

Suranga Kasthurirathne, PhD; Gregory Dexter; Shaun Grannis M.D., M.S.

Presenter: Dr. Kasthurirathne

PhD from Indiana University School of Informatics and Computing; BEng from University of Westminster, UK.

Evaluating a dental diagnostic terminology subset

Heather Taylor MPH, LDH; Thankam Thyvalikakath, DMD, MDS, PhD

Presenter: Heather Taylor

MPH from Indiana University Richard M. Fairbanks School of Public Health at IUPUI; B.S. from Indiana University School of Dentistry

Tutorials

Advancing the health and wellness for populations: A review of public health, population health, and global health informatics

Brian E Dixon, PhD, MPA; Suranga Kasthurirathne, PhD

Presenter and Chair: Dr. Dixon

PhD from Indiana University School of Informatics and Computing; MPA from Indiana University School of Public and Environmental Affairs; B.A. from DePauw University.

Presenter: Dr. Kasthurirathne

PhD from Indiana University School of Informatics and Computing; BEng from University of Westminster, UK.

Introduction to LOINC, the global vocabulary for tests, measurements, and observations in healthcare

Presenter and Chair: Dan Vreeman, PT, DPT, MS

DPT from Du

Credit: 
Regenstrief Institute

Cardiac rehabilitation: Preliminary results

In the current issue of Cardiovascular Innovations and Applications (volume 4, issue 2, pp. 121-123; DOI: https://doi.org/10.15212/CVIA.2017.0069), C. Richard Conti, Jamie B. Conti, and Jeff Plasschaert from the University of Florida Medical School, Gainesville, FL, USA, consider the impact of cardiac rehabilitation programmes.

A typical cardiac rehabilitation patient is a male aged 60 or older who is inactive but stable, smokes, is stressed and hypertensive, and has poor nutrition habits. Most patients with a chronic cardiac condition are depressed or anxious and have many concerns about their ability to function in society as they did in the past. After a few cardiac rehabilitation sessions, patients become more confident that they can exercise at a higher level than when they first appeared in the unit, have improved attitudes, and show decreased anxiety.

Credit: 
Cardiovascular Innovations and Applications

Pores for thought: Ion channel study beckons first whole-brain simulation

Pores at the surface of neurons and muscle cells control your every thought and movement, and the very beating of your heart. The way the pores behave - that is, whether they open, close, or lock shut for a short time (inactivate) depending on voltage - shapes signals in the form of ions moving across the cell surface.

For the first time, researchers at the EPFL's Blue Brain Project have mapped the behavior of the largest family of these voltage-gated ion channels: Kv channels.

Published in Frontiers in Cellular Neuroscience, and freely available online as raw data, their pioneering work will power virtual drug discovery - and, they hope, the first whole-brain simulation.

Data for a digital brain

Thousands of studies have probed the behavior of Kv channels, in various cells, by measuring the movement of ions across tiny patches of cell membrane at controlled voltages. In the early years of the Blue Brain Project, neuroscientist Dr. Rajnish Ranjan was tasked with modelling the behavior of Kv channels, based on these studies, for use in its brain simulations.

"To my surprise, despite 30 years of research none of the raw data was available," recalls Ranjan, lead author of the new study.

Even the published, processed data reflected a Wild West of study protocols, with inconsistent and incompatible results. Many Kv channels were hardly or never studied, and crucially, across the board there was a lack of studies near body temperature.

"Near body temperature, fatty cell membranes soften and slip away from recording apparatus. So, virtually all studies were performed at room temperature," Ranjan explains.

No data - no channel models - no brain simulation. If the Blue Brain team were to succeed, they would need to record their own Kv channel data - thirty years' worth of it.

Bringing ion channel research in from the cold

Fortunately, the Blue Brain team includes an ion channel-recording robot. By automating recordings, the robot allowed them to overcome the high failure rate near body temperature through sheer speed and volume of attempts.

The result is the first ever map of the behavior of all Kv channels - or any family of ion channels, for that matter. The map by turns reconciles, reinforces and refutes the last thirty years of Kv channel studies.

"Under standardized conditions and with large sample sizes, the behavior of Kv channels is largely consistent across cell lines and species. And as expected, quantitatively, Kv channels activate and inactivate faster at 35°C than at 25°C or 15°C.

"The big new finding was that Kv channels behave qualitatively very differently from 15°C, to 25°C to 35°C."

For instance, some channels inactivate only at higher temperature, so they had previously been wrongly classified as non-inactivating despite thousands of studies at room temperature. Others display a new type of delayed inactivation. Even more surprising, some channels change their behavior seemingly at random - which, though unexplained, could underlie some of the inconsistencies between previous studies.

"The qualitative differences we found at 35°C clearly demand further tests," says Ranjan.

In particular, the team's method should be applied to systematically study Kv channel behavior in the presence of known modulators - like protein signals, chaperones and anchors - or genetic channel variants.

Beyond this, work is already underway at Blue Brain to map the behavior of other voltage-gated ion channels, including ones that allow Na+, or any positive ion, to cross the cell membrane. Ultimately, all of these will be required to build a digital copy of the brain.

Channelpedia

The importance of these findings is more immediate than Blue Brain's ambitious whole-brain simulations, however. The study data will enable researchers everywhere to develop their own improved models for Kv channels - useful, for example, in drug discovery.
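The release does not include any model equations, but a minimal textbook-style sketch of a voltage-gated K+ gating variable (a Boltzmann steady-state activation curve relaxed with first-order kinetics) gives a feel for what such channel models compute. The Blue Brain models are considerably more detailed, and all parameter values below are illustrative placeholders rather than fitted values from the study.

```python
import numpy as np

def n_inf(v_mv, v_half=-20.0, slope=10.0):
    """Steady-state activation (open probability of the gate) at membrane voltage v_mv, in mV."""
    return 1.0 / (1.0 + np.exp(-(v_mv - v_half) / slope))

def simulate_step(v_mv, tau_ms=5.0, dt_ms=0.01, t_ms=50.0, n0=0.0):
    """Relax the gating variable toward n_inf(v) during a voltage-clamp step."""
    n, trace = n0, []
    for _ in range(int(t_ms / dt_ms)):
        n += dt_ms * (n_inf(v_mv) - n) / tau_ms  # dn/dt = (n_inf - n) / tau
        trace.append(n)
    return np.array(trace)

# Activation approached during a depolarizing step to 0 mV (illustrative parameters).
print(simulate_step(0.0)[-1])
```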

"The data and methods we have established to map the behavior of Kv channels can be used to systematically screen drug candidates for potentially positive or deleterious effects on channel behavior," suggests Ranjan.

To this end, the study team has provided open access to their million-plus Kv channel recordings from over 9,000 cells, and a growing dataset for other channels. These are publicly available for download in a dedicated, wiki-like platform called Channelpedia.

"Anyone can access Channelpedia online and contribute," Ranjan emphasizes. "We encourage other labs to share their ion channel data to expand and refine this resource."

The models the team provides, based on its systematic data collections, make it possible to interpret any channel recording in the context of physiological temperature.
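The release does not spell out how this conversion is done, but the quantitative part of such a temperature correction is conventionally expressed with a Q10 factor along the following lines (the Q10 value in the example is an assumed placeholder, not a number from the study). A bare Q10 rule only captures the quantitative speed-up; the qualitative differences the team reports at 35°C are exactly what their richer temperature-activity models are meant to capture beyond it.

```latex
% Q10 scaling of a kinetic rate k (or, inversely, a time constant tau) with temperature
k(T_2) \;=\; k(T_1)\, Q_{10}^{\,(T_2 - T_1)/10}

% Example: a time constant measured at 25 C, rescaled to 35 C with an assumed Q10 of 3,
% shrinks by a factor of 3:
\tau(35\,^{\circ}\mathrm{C}) \;=\; \frac{\tau(25\,^{\circ}\mathrm{C})}{3^{(35-25)/10}} \;=\; \frac{\tau(25\,^{\circ}\mathrm{C})}{3}
```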

"The temperature-activity relationships established in our models allow recordings at any temperature to be converted to physiological temperature," explains Ranjan. "This is a really key contribution of our work, since channel recordings at physiological temperature remain difficult and future experiments will likely continue to be performed at lower temperatures."

Credit: 
Frontiers

New insight into glaciers regulating global silicon cycling

image: Flying over Greenland Ice Sheet - A view of outlet glaciers terminating into the complex fjord network around Greenland.

Image: 
Jade Hatton - University of Bristol

A new review of silicon cycling in glacial environments, led by scientists from the University of Bristol, highlights the potential importance of glaciers in exporting silicon to downstream ecosystems.

This, say the researchers, could have implications for marine primary productivity and impact the carbon cycle on the timescales of ice ages.

This is because silica is needed by primary producers, such as diatoms (a form of algae that account for up to 35 percent of all marine primary productivity), and these primary producers remove significant amounts of carbon dioxide from the atmosphere, transporting it to the deep ocean.

Lead author Jade Hatton from the University of Bristol's School of Earth Sciences, said: "It is important we understand the role glaciers play in silicon cycling and we have examined previously published work considering subglacial weathering and nutrient fluxes to bring together this review, focusing upon the chemical fingerprint of silicon exported from these environments."

The team, whose findings were published this week in the journal Proceedings of the Royal Society A, considered some of the 'big questions' currently surrounding glaciers and silicon export, including the differences in the chemical fingerprint of silicon between glacial and non-glacial rivers, and whether weathering processes occurring beneath glaciers are driving these differences.

Through combining new measurements of meltwaters from over 20 glaciers in Iceland, Alaska, Greenland and Norway with existing data, the paper shows that the chemical fingerprint of silicon exported from glaciers is distinct compared to silicon within non-glacial rivers.

This chemical signature (the silicon isotopic composition) helps to understand the nature of weathering processes occurring beneath glaciers.
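Although the release does not define it, a silicon isotopic composition of this kind is conventionally reported in delta notation relative to the NBS-28 quartz reference standard, in parts per thousand:

```latex
% delta-30Si, expressed in per mil (parts per thousand), relative to the NBS-28 quartz standard
\delta^{30}\mathrm{Si} \;=\;
\left(
  \frac{\bigl(^{30}\mathrm{Si}/^{28}\mathrm{Si}\bigr)_{\mathrm{sample}}}
       {\bigl(^{30}\mathrm{Si}/^{28}\mathrm{Si}\bigr)_{\mathrm{NBS\text{-}28}}}
  \;-\; 1
\right) \times 1000
```

Systematic offsets in this value between glacial meltwaters and non-glacial rivers are the distinct "chemical fingerprint" the study reports.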

Jade Hatton added: "Data from such a range of glaciers represents a significant endeavour in terms of fieldwork and represent a vast improvement in our knowledge of the isotopic signature of silicon from glaciers.

"We suggest that the distinct silicon isotopic composition in glacial waters is driven by the high physical erosion rates beneath glaciers.

"This has implications on how we understand subglacial weathering processes and the export of nutrients from glacial environments."

These new data are presented alongside work previously carried out in Iceland and Greenland to provide stronger evidence that the relationship between glacial meltwaters and a distinct silicon isotope signature holds.

The researchers hope this wider data set will help inform more complex computer models in the future, building on previous modelling work that has demonstrated the importance of glacial silica on glacial-interglacial timescales.

The paper also provides a discussion of the complexities of glacial environments and highlights some of the important questions that are still uncertain, including the importance of particulate silica when considering the overall export flux from glacial environments.

Jade Hatton said: "Very little work has been done to understand the formation of this 'amorphous' silica beneath glaciers. We suggest the high physical erosion within these systems is extremely important; however, we encourage future work to constrain this further.

"Another highly debated area presently is the role of fjords in nutrient recycling, resulting in uncertainties in fluxes of glacial nutrients reaching the open ocean. Funding from the ERC (ICY-LAB) and the Royal Society is allowing us to continue research into this area, with projects considering biogeochemical cycling in Greenlandic fjords.

"We look forward to being able to shed light on these uncertainties by using a range of analyses from fieldwork within these fjord environments."

Credit: 
University of Bristol

Age distribution of new obesity-associated cancer cases

What The Study Did: This observational study examines changes in the age distribution of new obesity-associated cancer cases and nonobesity-associated cancer cases from 2000 to 2016 by sex and race/ethnicity.

Authors: Siran M. Koroukian, Ph.D., of Case Western Reserve University School of Medicine in Cleveland, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.9261)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

New proteomics technique gives insights into ubiquitin signalling

image: Professor David Komander has developed a new proteomics technique, called ubiquitin clipping, to understand intricate changes that control how proteins function in our cells in health and disease.

Image: 
Walter and Eliza Hall Institute, Australia

Australian researchers are among the first in the world to have access to a new approach to understand intricate changes that control how proteins function in our cells in health and disease.

The new proteomics technique called 'ubiquitin clipping' allows researchers to create high-definition maps of how proteins are modified by a process called ubiquitination. The technique provides a new level of detail for understanding the role of ubiquitination in cells, and could uncover subtle changes that contribute to a range of diseases including cancer, inflammatory conditions and neurodegenerative disorders.

The research, which was published today in Nature, was led by Walter and Eliza Hall Institute researcher Professor David Komander, who undertook the work at the MRC Laboratory of Molecular Biology in Cambridge, UK.

At a glance

A new proteomics technique has been developed to study protein ubiquitination, modifications of proteins in cells that can impact their function.

The technique, called 'ubiquitin clipping', enables researchers to map protein ubiquitination in unprecedented detail.

Ubiquitin clipping is now established at the Walter and Eliza Hall Institute, enabling Australian scientists to investigate new aspects of ubiquitin signalling, including its links to a range of diseases.

Ubiquitin architecture

Ubiquitin is a small protein that can link to other proteins in a cell, either as a single unit or in longer straight or branched chains. Professor David Komander, who heads the Institute's recently established Ubiquitin Signalling division, said protein ubiquitination could impact all cellular processes.

"Ubiquitination can change how proteins function, potentially altering their activity, redirecting them to different parts of the cell, or regulating their interactions with other proteins. One of the best-known examples of ubiquitination is when it targets specific proteins for destruction, regulating the levels of the protein in the cell, but we now know there are many more subtle and complex roles for ubiquitin signalling," he said.

"The 'architecture' of ubiquitin chains can be complex with many branches that influence its impact on proteins, yet until now it has been almost impossible for researchers to detect and distinguish between different branching structures. This has limited the experiments that were possible to understand the role of ubiquitination in disease processes."

The new 'ubiquitin clipping' technique, which was developed by Professor Komander and his colleagues at the University of Cambridge and the University of Vienna, enables scientists to measure different ubiquitin chain architectures by pretreating protein samples, and then analysing them using electrophoresis and mass spectrometry.

"Ubiquitin clipping has enabled us to reveal a whole new level of complexity in ubiquitin signalling. In our pilot experiments, we discovered branched ubiquitin chains are much more common than previously thought. We could also study combinations of modifications on ubiquitin and other proteins - a feat that was until now rather difficult," Professor Komander said.

"This is a revolutionary technique that simplifies ubiquitin research, enabling a new level of detailed experimentation. It's the difference between describing a house based solely on the number of walls, windows and doors it has, versus looking at the detailed architectural plans."

New insights into diseases

Altered ubiquitination of proteins has been implicated in a range of diseases, including cancer, inflammatory diseases and neurodegenerative disorders such as Parkinson's disease. Professor Komander said the ubiquitin clipping technique was already being applied to study patient samples.

"My Walter and Eliza Hall Institute colleague Dr Rebecca Feltham is using ubiquitin clipping to look for protein ubiquitination patterns in samples from patients with rheumatoid arthritis, a complex inflammatory disease. This could give new insights into how this disease develops and responds to existing therapies," he said.

"Ubiquitination is also a promising target for the development of new drugs. Ubiquitin clipping will be a critical aspect of my team's drug discovery research.

"It's exciting that Australian researchers are among the first in the world to have access to ubiquitin clipping, and I'm looking forward to seeing the technique underpin many exciting discoveries, including through our collaborations with other research groups."

Credit: 
Walter and Eliza Hall Institute

'The Nemo effect' is untrue: Animal movies promote awareness, not harm, say researchers

Emotive warnings accompanied the release of 'Finding Dory', prompted by global reports that its precursor 'Finding Nemo' had inspired a surge in purchases of clownfish, which in turn caused environmental and animal harm. This became known as "the Nemo effect".

The most high-profile of the warnings came from the voice of Dory herself, Ellen DeGeneres, and nearly all the appeals focused on stopping viewers from buying pets linked with the movie.

Results from scientists at the University of Oxford, published today in the journal Ambio, show that the links between consumer demand for wildlife and blockbuster movies are largely unfounded.

Their results suggest that exposure to these movies does not increase demand for live animals, but can instead drive information-seeking behaviour.

The researchers looked at data on online search patterns, from the Google Trends platform, fish purchase data from a major US importer of ornamental fish and visitation data from 20 Aquaria across the US.
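The simplest version of such a check can be sketched as follows. This is illustrative only: the file name and column names are placeholders for a Google Trends CSV export, and the published analysis is considerably more careful than a plain before/after comparison of means.

```python
import pandas as pd

# Placeholder export of weekly search interest for a movie-linked species (0-100 index).
trends = pd.read_csv("clownfish_search_interest.csv", parse_dates=["week"])

RELEASE_DATE = pd.Timestamp("2016-06-17")  # US release date of Finding Dory
window = pd.DateOffset(weeks=12)

before = trends.loc[trends["week"].between(RELEASE_DATE - window, RELEASE_DATE), "search_interest"]
after = trends.loc[trends["week"].between(RELEASE_DATE, RELEASE_DATE + window), "search_interest"]

print(f"Mean search interest, 12 weeks before release: {before.mean():.1f}")
print(f"Mean search interest, 12 weeks after release:  {after.mean():.1f}")
```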

Their results show that, counter to popular narratives, blockbuster movies that focus on lesser-known species can actually bring attention to animals that would otherwise be overlooked, illuminating animal diversity and environmental threats that are of societal concern.

The scientists say it is hard to determine exactly how reports of "the Nemo effect" originated, although past research mentions a number of press articles in the UK, USA and Australia, published shortly after the release of the movie. These were amplified by numerous other outlets around the world.

Allegations have also been made in the media linking the Harry Potter movie series and Zootopia to spikes in demand for certain species. In both cases, separate studies found that these allegations were not supported by the evidence. Similar allegations have been made for movies as diverse as Teenage Mutant Ninja Turtles and Jurassic Park.

The scientists' research suggests that there is no evidence that the "Nemo effect" is real.

Lead researcher, Diogo Veríssimo, from the Department of Zoology, University of Oxford, said: 'We think these narratives are so compelling because they are based on a clear causal link that is plausible, relating to events that are high profile - Finding Dory was one of the highest grossing animated movies in history.

'My research looks at demand for wildlife in multiple contexts. As such I was intrigued as to whether the connection between these blockbusters and demand for wildlife was as straight-forward as had been described in the media. My experience is that human behaviour is hard to influence, particularly at scale and it seemed unlikely that movies like Finding Nemo, Finding Dory and the Harry Potter series indeed generated spikes in demand for the species they feature.

'Our results suggest that the impact of movies is limited when it comes to the large-scale buying of animals. There is, however, a clear effect in terms of information-seeking which means that the media does play an important role in making wildlife and nature conservation more salient. This is particularly the case for animation movies which are viewed by a much more diverse group of people than, for example, nature documentaries.'

The researchers plan to follow up this study with an examination of the role of nature and wildlife documentaries in shaping behaviours towards nature: for example, the impact of the BBC Blue Planet series on behaviours around plastics and of the documentary Blackfish on attitudes towards cetacean captivity.

Credit: 
University of Oxford