Moving toward a future free of drug-induced hearing loss

A new special publication orchestrated by five of the nation's leading hearing experts compiles the latest research into hearing loss caused by drugs and solvents - how it occurs, how to treat it, and how to prevent it.

The compilation is being published online as a special research topic by the journal Frontiers in Cellular Neuroscience. It includes both original research and focused reviews. The Pharmaceutical Interventions for Hearing Loss Working Group organized the effort at the behest of the Department of Defense Hearing Center of Excellence.

"We're trying to elevate ways for the human population to avoid losing this important sensation for experiencing and communicating with the world around us," said co-author Peter Steyger, Ph.D., a professor of otolaryngology/head and neck surgery in the OHSU School of Medicine in Portland, Oregon.

"Ototoxicity is a threat to hearing at any age and hearing loss remains a significant side effect of chemotherapy. This review highlights how far we've come in understanding that threat and provides us with a roadmap for developing more effective ways to recognize and address the problem," added co-author Jian Zuo, Ph.D., of the Department of Developmental Neurobiology at St. Jude Children's Research Hospital in Memphis, Tennessee.

In people, hearing cells don't regenerate, so the loss is irreversible. That's why it is crucial to understand the mechanisms that affect hearing and how to prevent loss of hearing, Steyger said. The introductory editorial, "Moving toward a future free of ototoxicity," highlights the latest scientific research exploring how certain pharmaceuticals damage the inner ear while others can protect it. It also underscores the need for better monitoring and detection of hearing loss over time, especially among patients being treated with antibiotics.

"Many people don't admit they're losing their hearing until it's really bad," Steyger said.

Steyger, who lost his hearing at 14 months old after being treated with antibiotics for meningitis, noted that hearing loss affects a surprisingly large proportion of the population - rising from an estimated 1 in 500 newborns to as many as half of all people age 75 or older. The research encapsulated in the new e-book includes 22 scientific articles from 91 authors and represents the state of the science in both prevention and treatment of ototoxic hearing loss. This e-book is available to all, free of charge.

"This compilation will help to propel our knowledge forward and underscore the need to better understand the dangers of ototoxicity. The DoD Hearing Center of Excellence is honored to host and mobilize this important effort," said Carlos Esquivel, M.D., co-author and a neurotologist and chief medical officer in the Clinical Care, Rehabilitation, and Restoration Branch of the DoD Hearing Center of Excellence at Joint Base San Antonio in Texas.

Credit: 
Oregon Health & Science University

Treatment variations may be cutting short lives of lung cancer patients in England

Differences in the active treatment of lung cancer across England may be cutting short the lives of hundreds of patients with the disease every year, concludes research published online in the journal Thorax.

Disease and patient factors don't seem to be driving these variations, say the researchers, who calculate that if treatment rates rose to optimal levels, 800 patients could "have a clinically relevant extension of their lives each year."

Lung cancer survival in England is worse than in other comparable countries, with various factors, such as speed of diagnosis and access to cancer services, thought to influence the figures.

The researchers wanted to know if geographical variations in treatment might also have a role in lung cancer survival rates across the country.

So they retrieved national cancer registry information on the survival of people who had been diagnosed with lung cancer between 2005 and 2014.

The 1 year survival of lung cancer patients in England improved by one percentage point each year between 2005 and 2014, rising from 26 percent in 2005 to 36 percent in 2014, the figures showed.

The researchers then looked in detail at active treatment (surgery, radiotherapy, and chemotherapy) and its potential association with survival of 176,225 people who had been diagnosed with the disease between 2010 and 2014.

The detailed analysis of the 2010-14 period showed considerable variations in use of active treatment, which was in turn associated with survival rates.

Fewer than one in 10 patients (9.3%) had surgery in the bottom fifth (quintile) of active treatment areas compared with one in six (17%) in the top fifth. Similarly, radical radiotherapy varied from 4 percent to 13 percent, and chemotherapy from 21.5 percent to 34.5 percent.

The more active the treatment, the longer survival tended to be. The researchers calculated that this variation added up to 188 potentially avoidable deaths each year within the first two years after diagnosis among those not actively treated with surgery, plus 373 deaths among those not actively treated with radiotherapy.

Similarly, 318 deaths could have been delayed beyond the six month time point if patients everywhere had been as actively treated with chemotherapy as those in the top fifth of areas.

Chemotherapy treatment rates didn't affect two year survival rates, possibly because more advanced lung cancer tends to have a poor outlook irrespective of what treatment is given, say the researchers.

But the annual toll of avoidable deaths could reach 800 if active treatment everywhere matched the rates of the top fifth of areas, they say.

Their calculations held true after taking account of underlying conditions; age; sex; and tumour stage.

The researchers go on to say that linear associations between treatment rates and length of survival extend to the highest range of treatment rates for each option, and they conclude that "even the highest treatment rates that we observed are still below the levels required for optimal survival outcomes."

Credit: 
BMJ Group

Drug-producing bacteria possible with synthetic biology breakthrough

image: This is Declan Bates, Professor of Bioengineering at the University of Warwick's School of Engineering.

Image: 
University of Warwick

Bacteria could be programmed to produce drugs, thanks to breakthrough research into synthetic biology from the Universities of Warwick and Surrey

Researchers develop unique system to dynamically allocate essential cellular resources to both synthetic circuit and host cell - allowing both to survive and function properly

Adding synthetic circuitry to cells could enable them to be turned into factories for the production of antibiotics and other valuable drugs - opening up vast possibilities for the future of healthcare

Bacteria could be programmed to efficiently produce drugs, thanks to breakthrough research into synthetic biology using engineering principles, from the University of Warwick and the University of Surrey.

Led by the Warwick Integrative Synthetic Biology Centre at Warwick's School of Engineering and the Faculty of Health and Medical Sciences at the University of Surrey, new research has discovered how to dynamically manage the allocation of essential resources inside engineered cells - advancing the potential of synthetically programming cells to combat disease and produce new drugs.

The researchers have developed a way to efficiently control the distribution of ribosomes - microscopic 'factories' inside cells that build proteins that keep the cell alive and functional - to both the synthetic circuit and the host cell.

Synthetic circuitry can be added to cells to enhance them and make them perform bespoke functions - providing vast new possibilities for the future of healthcare and pharmaceuticals, including the potential for cells specially programmed to produce novel antibiotics and other useful compounds.

A cell only has a finite amount of ribosomes, and the synthetic circuit and host cell in which the circuitry is inserted both compete for this limited pool of resources. It is essential that there are enough ribosomes for both, so they can survive, multiply and thrive. Without enough ribosomes, either the circuit will fail, or the cell will die - or both.

Using the engineering principle of a feedback control loop, commonly used in aircraft flight control systems, the researchers have developed and demonstrated a unique system through which ribosomes can be distributed dynamically: when the synthetic circuit requires more ribosomes to function properly, more are allocated to it and fewer to the host cell, and vice versa.
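The control idea can be sketched in a few lines. This toy proportional controller is not the authors' published design - the pool size, the circuit's demand, and the gain are invented for illustration - but it shows how a feedback loop steers the circuit's share of a fixed ribosome pool toward its current demand:

```python
# Toy sketch of proportional feedback allocation of a fixed ribosome
# pool between a synthetic circuit and its host cell. All numbers and
# dynamics here are illustrative assumptions, not the published model.

def simulate(steps=200, pool=1000, demand=0.7, gain=0.1):
    """Drive the circuit's ribosome share toward its current demand."""
    share = 0.5          # fraction of the pool allocated to the circuit
    history = []
    for _ in range(steps):
        error = demand - share       # how far allocation lags demand
        share += gain * error        # proportional correction
        share = min(max(share, 0.0), 1.0)
        history.append(share)
    circuit_ribosomes = pool * share
    host_ribosomes = pool * (1 - share)
    return circuit_ribosomes, host_ribosomes, history

circuit, host, hist = simulate()
print(round(circuit), round(host))  # allocation settles near the demand
```

If the circuit's demand later changes, the same loop reallocates in the other direction - the "vice versa" described above.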

Declan Bates, Professor of Bioengineering at the University of Warwick's School of Engineering and Co-Director, Warwick Integrative Synthetic Biology Centre (WISB) commented:

"Synthetic Biology is about making cells easier to engineer so that we can address many of the most important challenges facing us today - from manufacturing new drugs and therapies to finding new biofuels and materials. It's been hugely exciting in this project to see an engineering idea, developed on a computer, being built in a lab and working inside a living cell."

José Jiménez, Lecturer in Synthetic Biology at the University of Surrey's Faculty of Health and Medical Sciences, said:

"The ultimate goal of the selective manipulation of cellular functions like the one carried out in this project is to understand fundamental principles of biology itself. By learning about how cells operate and testing the constraints under which they evolve, we can come up with ways of engineering cells more efficiently for a wide range of applications in biotechnology."

Ribosomes live inside cells and construct proteins as required for cellular functions. When a cell needs a protein, its DNA is transcribed into mRNA, which is sent to the ribosomes - these then synthesise the protein by bonding the correct amino acids together in a chain.

Credit: 
University of Warwick

Novel PET imaging agent targets copper in tumors, detects prostate cancer recurrence early

image: In scans of a 62-yr-old man with Gleason 4+3 PCa treated with radical prostatectomy, with rising PSA level (1.32) and PSA doubling time of 3.7 months, 64CuCl2-PET/CT images revealed 2 positive small left iliac lymph nodes (A,C), whereas 18F-Choline PET/CT (B,D) was negative (arrows).

Image: 
A Piccardo et al., Galliera Hospital, Genoa, Italy

RESTON, Va. - An Italian study featured in the March issue of The Journal of Nuclear Medicine demonstrates that a novel nuclear medicine imaging agent targeting copper accumulation in tumors can detect prostate cancer recurrence early in patients with biochemical relapse (rising prostate-specific antigen [PSA] level).

Copper tends to be more concentrated in tumors, making it a good imaging biomarker. For this study of 50 patients, researchers conducted PET/CT scans comparing the new imaging agent, copper-64 chloride (64CuCl2), with fluorine-18-choline (18F-Choline). Multiparametric magnetic resonance imaging (mpMRI) was also conducted. In addition to calculating the detection rate of each imaging modality, the biodistribution, kinetics of the lesions and radiation dosimetry of 64CuCl2 were evaluated.

"This is the first time this novel agent has been compared with 18F-Choline-PET/CT in a considerable number of prostate cancer patients with biochemical relapse," explains Arnoldo Piccardo, of E.O. Ospedali Galliera in Genoa, Italy. He points out, "Early detection of prostate cancer relapse may improve the clinical management of patients, for example implementing early salvage radiotherapy."

The effective dose of 64CuCl2 was determined to be 5.7 mSv, similar to those of other established PET tracers (although higher than for 18F-Choline, which is 4 mSv). Unlike 18F-Choline, 64CuCl2 is neither accumulated in, nor excreted from, the urinary tract (main uptake is in the liver); this allows for thorough pelvic assessment, increasing the possibility of identifying small lesions close to the bladder. No adverse reactions were observed after the injection of 64CuCl2, and results show that 64CuCl2-PET/CT has a higher detection rate than 18F-Choline-PET/CT in patients with low PSA levels.

"This study determined that the biodistribution of 64CuCl2 is more suitable than that of 18F-Choline for exploring the pelvis and prostatic bed," says Piccardo. "In patients with biochemical relapse and a low PSA level, 64CuCl2-PET/CT shows a significantly higher detection rate than 18F-Choline-PET/CT." He reports, "Larger trials with this PET tracer are expected to further define its capabilities and role in the management of prostate cancer."

Credit: 
Society of Nuclear Medicine and Molecular Imaging

Models show how to limit global temperature rise to 1.5°C

There are several ways to limit global temperature rise to 1.5°C by 2100, and new research led by IIASA researcher Joeri Rogelj shows under what conditions this could happen.

The team's paper, published in Nature Climate Change, is the first to look at how socioeconomic conditions such as inequalities, energy demand, and international cooperation might affect the feasibility of achieving these goals, and also considers technological and resource assumptions.

"One of the goals of the Paris Agreement is to limit warming to 1.5°C, but scientific studies mainly looked at the question of limiting warming to 2°C. This study now fills this gap and explores how climate change by the end of the 21st century can be brought in line with 1.5°C of warming. Individual studies have looked at this question in the past, but this study is the first to use a broad and diverse set of models," says Rogelj.

The researchers used six integrated assessment computer models which each attempted to model scenarios that limit warming by the end of the century to 1.5°C, under five so-called Shared Socioeconomic Pathways (SSPs). The SSPs, developed previously by IIASA and other key partner organizations, look at different ways the world and society could progress, including for example, one in which the world pursues sustainability, one in which economic and population growth continue much as they have done historically, and another in which the world pursues high economic growth with little emphasis on sustainability.

The computer models could not model a scenario that would limit warming to 1.5°C in all of the SSPs. All of the successful scenarios include a rapid shift away from fossil fuel use towards low-carbon energy sources, lowered energy use, and the removal of CO2. Strong social and economic inequalities, a focus on continued high fossil-fuel use, and poor short-term climate policies emerged as key barriers to achieving the 1.5°C goal.

"A critical value of the paper is the use of the SSPs, which has helped to systematically explore conditions under which such extreme low targets might become attainable. Our assessment shows particularly the enormous value of pursuing sustainable development for reaching extreme low climate change targets. On the other hand, fragmentation and pronounced inequalities will likely come hand-in-hand with low levels of innovation and productivity, and thus may push the 1.5°C target out of reach," says IIASA Energy Program Director and coauthor Keywan Riahi.

In the successful scenarios, by 2030 greenhouse gas emissions have already peaked and begun a decline that continues rapidly over the following two to three decades. Zero net greenhouse gas emissions are reached between 2055 and 2075. Energy demand is limited by improving energy efficiency measures. In the SSP where economic and population growth continue as they have done historically, energy demand in 2050 for example is limited to 10-40% above 2010 levels.

Bioenergy and other renewable energy technologies, such as wind, solar, and hydro, scale up drastically over the coming decades in successful scenarios, making up at least 60% of electricity generation by the middle of the century. This marks a clear shift away from unabated fossil fuel use - that is, fossil fuel burned without carbon capture and storage. Traditional coal use falls to less than 20% of its current levels by 2040 and oil is phased out by 2060. Negative emissions technologies, such as bioenergy with carbon capture and storage (BECCS), afforestation, and reforestation are considered as means to additionally remove CO2 from the atmosphere.

The 1.5°C pathways created as part of the study will now be used by the wider climate change research community to run the most complex coupled climate models. This will serve as a starting point for further research, enabling better understanding of the residual impacts at low levels of global warming.

"The study provides decision makers and the public with key information about some of the enabling conditions to achieve such stringent levels of climate protection," says Rogelj.

The researchers stress that more work will be needed. The scenarios can only take into consideration technological and economic feasibility. In the real world, other factors, such as social acceptability and international cooperation, for example, can have a large effect on feasibility. Policy advisors will need to take these into consideration.

Credit: 
International Institute for Applied Systems Analysis

New study: Snowpack levels show dramatic decline in western states

image: The year 2015 was the warmest on record for Oregon, resulting in low snowpacks and less water in many lakes and rivers. Pictured is Wallowa Lake in northeastern Oregon.

Image: 
Oregon State University

CORVALLIS, Ore. - A new study of long-term snow monitoring sites in the western United States found declines in snowpack at more than 90 percent of those sites - and one-third of the declines were deemed significant.

Since 1915, the average snowpack in western states has declined by between 15 and 30 percent, the researchers say, and the amount of water lost from that snowpack reduction is comparable in volume to Lake Mead, the West's largest manmade reservoir. The loss of water storage can have an impact on municipal, industrial and agricultural usage, as well as fish and other animals.

Results of the study are being published this week in NPJ Climate and Atmospheric Science, a Nature publication.

"It is a bigger decline than we had expected," said Philip Mote, director of the Oregon Climate Change Research Institute at Oregon State University and lead author on the study. "In many lower-elevation sites, what used to fall as snow is now rain. Upper elevations have not been affected nearly as much, but most states don't have that much area at 7,000-plus feet.

"The solution isn't in infrastructure. New reservoirs could not be built fast enough to offset the loss of snow storage - and we don't have a lot of capacity left for that kind of storage. It comes down to managing what we have in the best possible ways."

The researchers attribute the snowpack decline to warmer temperatures, not a lack of precipitation. But the consequences are still significant, they point out. Earlier spring-like weather means more of the precipitation will not be stored as long in the mountains, which can result in lower river and reservoir levels during late summer and early fall.

The study considered data from 1,766 sites in the western U.S., mostly from the U.S. Department of Agriculture's Natural Resources Conservation Service and the California Department of Water Resources. The researchers focused on measurements taken on April 1, which historically has been the high point for snowpack in most areas, though they also looked at measurements for Jan. 1, Feb. 1, March 1, and May 1 - which led to the range of decline of 15 to 30 percent.

They also used a physically based computer model of the hydrologic cycle, which takes daily weather observations and computes the snow accumulation, melting, and runoff to estimate the total snowpack in the western U.S.
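The bookkeeping such a model performs can be illustrated with a minimal degree-day sketch. This is a toy, not the study's physically based model: the melt factor and the snow/rain temperature threshold below are assumed values chosen only to show how daily weather drives accumulation and melt.

```python
# Toy degree-day snow model (illustrative sketch, not the study's
# physically based model). Daily temperature decides whether
# precipitation accumulates as snow, and warmth above a threshold
# melts the pack. Melt factor and threshold are assumptions.

def snowpack_swe(temps_c, precip_mm, melt_factor=3.0, snow_thresh_c=0.0):
    """Return daily snow water equivalent (mm) from weather inputs."""
    swe = 0.0
    series = []
    for t, p in zip(temps_c, precip_mm):
        if t <= snow_thresh_c:
            swe += p                       # precipitation falls as snow
        melt = max(0.0, melt_factor * (t - snow_thresh_c))
        swe = max(0.0, swe - melt)         # degree-day melt
        series.append(swe)
    return series

# Same precipitation, warmer days: snow becomes rain and the pack melts.
cold = snowpack_swe([-5, -3, -1, 0, -2], [10, 5, 0, 8, 4])
warm = snowpack_swe([2, 4, 1, 3, 5], [10, 5, 0, 8, 4])
print(cold[-1], warm[-1])
```

The contrast between the two runs mirrors the study's central finding: warming, not reduced precipitation, shrinks the snowpack.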

"We found declining trends in all months, states and climates," Mote said, "but the impacts are the largest in the spring, in Pacific states, and in locations with mild winter climates."

The Pacific states - California, Oregon and Washington - receive more precipitation because of the Pacific Ocean influence, and more of the snow falls at temperatures near freezing. Because the Cascade Mountains, which transect the region, are not as steep as the Rocky Mountains, they have more area that is affected by changes in temperature.

"When you raise the snow zone level 300 feet, it covers a much broader swath than it would in the inland states," Mote said.

Mote was one of 12 lead authors on a chapter of the fifth Intergovernmental Panel on Climate Change report looking at the cryosphere, which comprises snow, river and lake ice, sea ice, glaciers, ice sheets and frozen ground. Also an author on the fourth IPCC report, Mote led a 2005 study of western snowpack levels that documented declines, though ones less dramatic than those in this new study.

This latest study found:

California had the highest number of positive snowpack trends since 1955, but lingering drought during the past decade erased most of those gains and snowpack declines still dominated;

Most of the other western states had only one or two sites that reported increases in snowpack;

Regions with the most significant decrease in snowpack were eastern Oregon and northern Nevada, though snowpack decreases in excess of 70 percent also occurred in California, Montana, Washington, Idaho and Arizona.

"The amount of water in the snowpack of the western United States is roughly equivalent to all of the stored water in the largest reservoirs of those states," Mote said. "We've pretty much spent a century building up those water supplies at the same time the natural supply of snowpack is dwindling.

"On smaller reservoirs, the water supply can be replenished after one bad year. But a reservoir like Lake Mead takes four years of normal flows to fill; it still hasn't recovered from the drought of the early 2000s."

Mote said snowpack levels in most of the western U.S. for 2017-18 thus far are lower than average - a function of continued warming temperatures and the presence of a La Niña event, which typically results in warmer and drier conditions in most southwestern states.

Credit: 
Oregon State University

Moms-to-be can exercise in warm weather and use saunas without getting too hot

Pregnant women can safely exercise in warm weather and take short hot baths or saunas without risking critical elevations in body temperature that could harm their unborn child, finds a review of the available evidence published online in the British Journal of Sports Medicine.

The findings contradict current advice that pregnant women should avoid heat stress based on concerns about possible risks of exceeding a core body temperature of 39°C during pregnancy.

However, current guidelines do not clearly define heat stress limits and may therefore be discouraging physical activity during pregnancy, which benefits both mother and child. Some evidence also suggests that the body's ability to regulate its core temperature is enhanced during pregnancy.

To investigate further, a team of researchers set out to assess heat stress response during pregnancy and whether the body's thermal regulation capacity improves during pregnancy.

They analysed the results of 12 studies, published in English up to July 2017, reporting the core temperature response of 347 pregnant women to heat stress, either through exercise or through passive heating, such as using a sauna or sitting in a hot bath.

Differences in study design and quality of evidence were taken into account. Studies included women at any stage of pregnancy and responses were measured according to intensity and duration of exercise as well as ambient temperature and humidity.

No woman exceeded the recommended core temperature limit of 39°C across all studies. The highest individual core temperature reported was 38.9°C. The highest average core temperature was 38.3°C for exercise on land, 37.5°C for exercise in water, 36.9°C for hot water bathing and 37.6°C for sauna exposure.

Based on these results, the researchers say that pregnant women can safely engage in up to 35 minutes of high-intensity aerobic exercise (at 80-90% of their maximum heart rate) at air temperatures of up to 25°C and 45% relative humidity.

They can also safely participate in aqua-aerobic exercise in water temperatures ranging from 28.8°C to 33.4°C for up to 45 minutes, and sit in hot baths (40°C) or hot/dry saunas (70°C; 15% relative humidity) for up to 20 minutes, irrespective of pregnancy stage, without reaching the recommended core temperature limit of 39°C.
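The reported limits can be collected into a simple lookup. The durations and temperatures below come straight from the review's summary above; the dictionary structure, field names, and checking function are illustrative.

```python
# The review's reported safe-exposure limits, encoded as a lookup.
# Values are from the press release above; structure is illustrative.
SAFE_LIMITS = {
    "aerobic_exercise": {"max_minutes": 35, "max_air_c": 25, "max_rh_pct": 45},
    "aqua_aerobics":    {"max_minutes": 45, "water_c": (28.8, 33.4)},
    "hot_bath":         {"max_minutes": 20, "water_c": 40},
    "sauna":            {"max_minutes": 20, "air_c": 70, "rh_pct": 15},
}

def within_limit(activity, minutes):
    """Check a planned duration against the review's reported limit."""
    return minutes <= SAFE_LIMITS[activity]["max_minutes"]

print(within_limit("sauna", 15))  # True
```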

Some studies also showed a reduction in the rise in core temperature as pregnancy progressed, lending support to the theory that thermal regulation is enhanced during pregnancy. While the underlying reason for this is unclear, the researchers suspect it may be linked to changes in body mass and surface area.

They also point out some limitations of their review, such as the small body and varying quality of evidence, and inconsistency in study design. Although they were able to allow for some of these factors, they cannot rule out the possibility that they may have influenced the results. As such, they say their recommendations "may change with future research."

They suggest that more research is needed to identify safe exposure and environmental limits for pregnant women who are physically active in hotter climates, but say their results suggest that heat stress risk is low.

Credit: 
BMJ Group

Caught on camera: Amazonian crop raiders

image: Caught on camera: Amazonian crop raiders

Image: 
University of East Anglia

Papped snaffling in the jungle, a striking set of photos reveal the secret lives of Amazonian crop-raiding animals.

A new study from the University of East Anglia (UK) identifies the Amazon's 'worst offending' crop destroyers - and highlights the problems caused for rural communities.

The research team spent a year working with 47 Amazonian communities in the Juruá region of Amazonas, Brazil.

They set up 132 motion-activated camera traps and took over 61,000 photos, confirming on camera 11 crop-raiding species that farmers had identified in interviews. Collared peccaries, red brocket deer, paca and agoutis were identified as the most damaging.

They also conducted 157 interviews with rural farmers to see how communities are impacted by wild animals eating their crops.

Key findings:

Crop raiding damages the livelihoods of Amazonian communities by directly reducing crop yields, necessitating costly crop protection and reducing the range of crops that can be planted.

The most damaging crop raiders are collared peccaries, red brocket deer, paca and agoutis.

These species are not highly endangered and could be hunted for subsistence without threatening biodiversity.

Lead researcher Dr Mark Abrahams, from UEA's School of Environmental Sciences, said: "Conserving biodiversity is challenging for rural communities where livelihoods are sometimes threatened by wildlife. In parts of Africa and Asia for example, where elephants can destroy crops and even harm people, a bitter human-wildlife conflict may ensue, in which local communities kill wildlife and resent conservation organisations.

"Rural Amazonian communities are some of the world's poorest, but they live with the world's highest biodiversity. Their primary source of income and carbohydrates is farming manioc - which produces starchy tubers and grows well in infertile tropical soils.

"We wanted to find out how local communities are impacted by wild animals eating their crops."

The study finds that crop-raiding impacts subsistence farmers in three ways. Firstly, on average farmers lose more than 7 per cent of their manioc crop every year to crop raiders and some farmers lose their entire crop.

Secondly, farmers need to invest time and energy into protecting their crops to avoid much higher losses (roughly 10 times higher).

Finally, because more palatable 'sweet' manioc is three times more vulnerable to crop-raiding, farmers are forced to plant less of it and hide it amongst more chemically defended 'bitter' manioc. Small communities far from towns were worst impacted by crop raiding.

Despite these livelihood impacts, rural communities are not attempting to wipe out all crop-raiding animals.

Hunting with dogs and setting traps were as commonly reported as non-lethal crop protection methods such as enclosing their fields with nets.

"We found that the most damaging crop-raiding animals were collared peccaries which look similar to wild pigs, red brocket deer, paca and agoutis, which are both types of large rodents," said Dr Abrahams.

"These species are not highly endangered so hunting them for subsistence does not threaten biodiversity, as long as conservation measures like protected areas and natural resource management are put in place.

"This study shows that crop-raiding in the Amazon does not need to become a human-wildlife conflict. Conservationists can work with local communities to support their management of natural resources. Vulnerable species like spider monkeys, which do not damage livelihoods, could be protected from hunting, whilst common and damaging crop-raiders like agoutis could be hunted for subsistence.

"The community-based management of natural resources, including the hunting of crop raiders, could form part of the sustainable-use conservation strategy which is already being implemented in the Juruá region (see the Médio Juruá Project) and elsewhere in the tropics.

"Biodiversity conservation and rural development are sometimes presented as incompatible. Crop raiding has the potential to damage livelihoods and make communities hostile to conservation. In our study region however, it seems that crop raiding is not an insurmountable barrier to conservation."

Prof Carlos Peres, an author on the study from UEA's School of Environmental Sciences, said: "Coexisting side-by-side with wildlife often incurs a cost to subsistence farmers in tropical forests, which export colossal environmental services that enhance the lives of millions of people elsewhere. This asymmetry in costs and benefits at different scales should be explicitly recognised by both conservation and development policy agendas."

'Manioc Losses by Terrestrial Vertebrates in Western Brazilian Amazonia' is published in the Journal of Wildlife Management.

Credit: 
University of East Anglia

Diversity of cortical neurons captured in comprehensive computer models

The Allen Institute for Brain Science has produced the first comprehensive, publicly available database of predictive neuron models, along with their corresponding data. The generalized leaky integrate-and-fire (GLIF) and biophysically detailed models are described in two articles published in the journal Nature Communications.

"The publication of these mathematical-physical models of the individual components making up neural networks is an important landmark in our ten-year quest to understand the brain," said Christof Koch, Ph.D., President and Chief Scientist at the Allen Institute for Brain Science. "We now seek to understand how vast assemblies of these elements give rise to behavior, perception and the feeling of life itself - consciousness."

The GLIF and biophysical models were built from extensive data in the Allen Cell Types Database: a massive publicly available repository of cortical neurons in both the mouse and human brain. Launched in 2015, this database contains electrophysiological, morphological, and transcriptomic properties gathered from individual cells, and models simulating cell activity, all building toward a "periodic table" of cell types.

"This project is a testament to the Allen Institute's interdisciplinary collaboration and teamwork," said Nathan Gouwens, Ph.D., an Allen Institute scientist. "Five years ago, we all began working together to build the pipeline for the Allen Cell Types Database. Now we have a parallel pipeline for generating models for each cell in the database."

GLIF models reproduce the spike times of neurons and capture the abstract transformations taking place inside them. These models have the benefit of requiring relatively little computing power to simulate, making it feasible to model the millions of neurons that make up the brain of a mouse. The biophysical models are more detailed: they represent the mechanisms that reproduce the actual voltage waveform of action potentials (nerve pulses) and other forms of electrical activity across the dendritic tree and cell body of neurons. Biophysical models take considerably more computational power than GLIF models to execute.
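For readers unfamiliar with the GLIF family, the core idea can be sketched in a few lines of code. Below is a minimal, generic leaky integrate-and-fire simulation; the parameter values are illustrative assumptions, not fitted values from the Allen Cell Types Database.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the skeleton of a GLIF model.
# Parameter values here are illustrative, not fitted Allen Institute values.

def simulate_lif(i_ext, dt=1e-4, tau=0.02, r_m=1e8,
                 v_rest=-0.07, v_thresh=-0.05, v_reset=-0.07):
    """Return spike times (seconds) for an input current trace i_ext (amps)."""
    v = v_rest
    spikes = []
    for step, i in enumerate(i_ext):
        # Voltage leaks back toward rest while being driven by the input.
        v += (-(v - v_rest) + r_m * i) / tau * dt
        if v >= v_thresh:           # threshold crossing -> record a spike
            spikes.append(step * dt)
            v = v_reset             # instantaneous reset after the spike
    return spikes

# A constant 0.3 nA input for 200 ms yields a regular spike train.
spike_times = simulate_lif([0.3e-9] * 2000)
print(f"{len(spike_times)} spikes in 0.2 s")
```

The fitted GLIF variants described in the paper extend this skeleton with, for example, spike-triggered currents and adaptive thresholds, which is what lets them reproduce recorded spike times.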

"We have generated a large library of cell models - ranging from simple to complex - which can be put together like building blocks to construct higher-level circuit models, providing a valuable resource for the neuroscience community," said Anton Arkhipov, Ph.D., an Allen Institute scientist.

"The models can be used as building blocks for larger simulations, but also to understand how some cell types differ from one another. Researchers can classify cell types by looking only at model parameters," said Stefan Mihalas, an Allen Institute scientist.

Perhaps in the future, detailed models of multiple cell types can be used in larger simulations to model neurological or psychiatric disorders, such as epilepsy, autism or Alzheimer's, and applying virtual perturbations to cells or networks of cells may allow us to see how the brain might respond to specific therapies.

"These studies make great use of the Allen Institute's unique single-cell physiology database by putting all recorded responses into a single model framework," said Adrienne Fairhall, Ph.D., co-director of the Computational Neuroscience program at the University of Washington. "This can play a huge role in allowing researchers to compare and potentially differentiate the computations of different cell types."

"Our models represent a large step in enabling scientists around the world to use standardized data, methods and models for exploring network behavior in the brain," said Corinne Teeter, Ph.D., the lead scientist for the GLIF modeling effort at the Allen Institute.

Credit: 
Allen Institute

Déjà vu and feelings of prediction: They're just feelings

image: Anne Cleary's team created virtual reality scenarios using the Sims virtual world video game. They made scenes -- like a junkyard, or a hedge garden -- that later spatially mapped to previously witnessed, but thematically unrelated scenes.

Image: 
Anne Cleary/Colorado State University

Most people can relate to the prickly, unsettling experience of déjà vu: When you're in a new situation, but you feel like you've been there before.

For some, that eerie feeling has an added twist: In that moment, they feel like they know what's going to happen next. Say you're walking up a stairwell for the first time, but it feels familiar, like a dream state - so much so that you think, "At the top of the stairs, there will be a Picasso on the left."

Anne Cleary, a cognitive psychologist at Colorado State University, has spent the last several years establishing déjà vu as a memory phenomenon - a trick of the brain akin to when a word is on the tip of your tongue, but you just can't retrieve it.

Building on previous experiments, Cleary has now shown that the prescient feeling that sometimes accompanies déjà vu is just that - a feeling. But it sure feels real.

A professor in CSU's Department of Psychology, Cleary has a new paper in Psychological Science, co-authored by former graduate student Alexander Claxton, detailing how they recreated déjà vu in human subjects in order to examine the feeling of premonition during the déjà vu state. According to their results, participants were no more likely to actually be able to tell the future than if they were blindly guessing. But during déjà vu, they felt like they could - which seems to mirror real life.

Cleary is one of just a handful of déjà vu researchers in the world. Ever since she read Alan S. Brown's book, The Déjà Vu Experience, she's been fascinated by the phenomenon and wanted to experimentally unmask why it occurs.

Déjà vu has a supernatural reputation: people have asked whether it is the recall of a past life. Scientists, though, tend to approach such questions through a more logical lens.

Cleary and others have shown that déjà vu is likely a memory phenomenon. It can occur when someone encounters a scenario that's similar to an actual memory, but they fail to recall the memory. For example, Cleary and collaborators have shown that déjà vu can be prompted by a scene that is spatially similar to a prior one.

"We cannot consciously remember the prior scene, but our brains recognize the similarity," Cleary said. "That information comes through as the unsettling feeling that we've been there before, but we can't pin down when or why."

Cleary has also studied the phenomenon known as "tip of the tongue" - that sensation when a word is just out of reach of recall. Both tip of the tongue and déjà vu are examples of what researchers call "metamemory" phenomena. They reflect a degree of subjective awareness of our own memories. Another example is the memory process known as familiarity, Cleary says - like when you see a familiar face out of context and can't place it.

"My working hypothesis is that déjà vu is a particular manifestation of familiarity," Cleary said. "You have familiarity in a situation when you feel you shouldn't have it, and that's why it's so jarring, so striking."

Since she began publicizing her results about déjà vu as a memory phenomenon more than 10 years ago, people around the world have responded. You're wrong, they argued. It's not just a memory. I also feel that I know what's going to happen next.

Cleary herself doesn't relate to this feeling, but she felt the need to suss out the claims. She read a study from the 1950s by neurologist Wilder Penfield, in which he stimulated parts of patients' brains and had them talk about what they were experiencing. In at least one case, when a patient reported feeling déjà vu upon stimulation, Penfield documented concurrent feelings of premonition. Hmm, Cleary thought. There's something to this.

Her hypothesis: If déjà vu is a memory phenomenon, is the feeling of prediction also a memory phenomenon? Cleary was further motivated by a recent shift in memory research, asserting that human memory is adapted for being able to predict the future, for survival purposes, rather than simply recollecting the past.

In previously published research, Cleary and her research group created virtual reality scenarios using the Sims virtual world video game. They made scenes - like a junkyard, or a hedge garden - that later spatially mapped to previously witnessed, but thematically unrelated scenes.

While immersed in a virtual reality test scene, participants were asked to report whether they were experiencing déjà vu. Subjects were more likely to report déjà vu among scenes that spatially mapped onto earlier witnessed scenes. These foundational studies mirrored the real-life experience of "feeling like you've been there before," but being unable to recall why.

In her most recent experiments, Cleary created dynamic video scenes in which the participant was moved through a series of turns. Later, participants were moved through scenes spatially mapped to the previous ones to induce déjà vu, stopping just before the final turn. At that moment, the researchers asked whether they were experiencing déjà vu and whether they felt they knew the direction of the next turn.

Cleary and her team were intrigued to note that about half the respondents felt a strong premonition during déjà vu. But they were no more likely to actually recall the correct answer - the turn they had previously seen in a spatially mapped, different scene - than if they were to choose randomly. In other words, participants who had the feeling of prediction were pretty confident they were right, but they usually weren't.
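A chance-level comparison like this works along the lines of an ordinary binomial test: with a fixed number of possible turns, guessing alone yields a predictable accuracy distribution. A minimal sketch, assuming a two-option (e.g. left/right) choice and invented trial counts rather than the study's actual data:

```python
# One-sided binomial test against guessing. Trial counts and the 0.5 chance
# rate (a two-option turn choice) are illustrative assumptions.
from math import comb

def binomial_p_at_least(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more correct guesses."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# e.g. 26 correct out of 50 predicted turns is entirely consistent with chance.
p_value = binomial_p_at_least(26, 50)
print(f"P(>=26/50 correct by guessing) = {p_value:.3f}")
```

A large p-value here means the observed accuracy gives no evidence of genuine prediction, which is the pattern the study reports.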

Conclusion: no, déjà vu doesn't help us predict the future. But it can manifest as a feeling that we can.

Cleary and her lab are conducting follow-up experiments now that even further probe this feeling of prediction. They wonder whether it's the familiarity process that drives the feeling. They want to know whether people experience hindsight bias - that is, whether people will be convinced they knew what was going to happen, after the fact.

"I think the reason people come up with psychic theories about déjà vu is that they are these mysterious, subjective experiences," Cleary said. "Even scientists who don't believe in past lives have whispered to me, 'Do you have an explanation for why I have this?' People look for explanations in different places. If you're a scientist, you're looking for the logical reason for why you just had this really weird experience."

Credit: 
Colorado State University

Children's use of non-dental services for oral pain could be costing the NHS £2.3 million a year

Thousands of children with oral pain are being taken by parents to pharmacies and non-dental health services, including A&E, instead of their dentist, and could be costing NHS England £2.3 million a year, according to research led by Queen Mary University of London.

The study of more than half of all the pharmacies in London and nearly 7,000 parents finds that most pharmacy visits for children's pain medications in London are to treat oral pain.

Lead researcher Dr Vanessa Muirhead from Queen Mary's Institute of Dentistry said: "The fact that only 30 per cent of children with oral pain had seen a dentist before going to a pharmacy highlights a concerning underuse of dental services.

"Children with oral pain need to see a dentist for a definitive diagnosis and to treat any tooth decay. Not treating a decayed tooth can result in more pain, abscesses and possible damage to children's permanent teeth.

"These children had not only failed to see a dentist before their pharmacy visit; they had seen GPs and a range of other health professionals outside dentistry. This inappropriate and overuse of multiple health services including A&E is costing the NHS a substantial amount of money at a time when reducing waste is a government priority."

Previous research has found that the main cause of planned hospital admissions for children aged 5-9 years is to have their decayed teeth extracted under general anaesthesia. Meanwhile, a quarter of five-year-olds in England still have tooth decay in their baby teeth and approximately one in five 12-year-olds have tooth decay in their adult teeth.

Only 58 per cent of children in England and 49 per cent of children in London had visited a dentist in 2016, even though dental care is free in the UK for under 18s and national guidelines recommend dental visits at least every year for children.

In this latest study, published in BMJ Open and jointly funded by Healthy London Partnership and NHS England London Region, 951 pharmacies collected information from 6,915 parents seeking pain medications for their children in November 2016 - January 2017, and found that:

Nearly two-thirds (65 per cent) of parents seeking pain medications for their children were doing so to relieve their children's oral pain.

Only 30 per cent of children with oral pain had seen a dentist before the pharmacy visit while 28 per cent had seen between one and four different health professionals (including GPs, health visitors, school nurses and A&E departments - GPs being the most common).

Nearly one in ten children had signs and symptoms indicating a dental emergency and community pharmacy staff signposted them to emergency services.

The cost to the NHS of children contacting health professionals outside dentistry over the period was £36,573 (an annual cost of £373,288). Replicating these findings across all pharmacies in England could mean that the NHS spends an estimated £2.3 million annually when children with oral pain inappropriately use multiple health services.

41 per cent of the children had toothache; 20 per cent had pain from a newly erupting tooth and 15 per cent had a painful mouth ulcer.

Saturdays and Sundays were the peak days for parents to visit pharmacies for pain medication for children's oral pain. This could partly explain why some parents had not seen a dentist first: urgent dental care services are limited over the weekend.

Dr Muirhead added: "We need to develop integrated systems and referral processes where GPs, community pharmacists and dentists talk to each other to make sure that children with toothache see a dentist as soon as possible for treatment. We also need better training for community pharmacy staff giving parents advice and look at how dentists manage children who have toothache."

The researchers also highlight the need to work towards preventing tooth decay from occurring in the first place. This includes rolling out Scotland's Childsmile programme more widely, where fluoride toothpaste is distributed to all pre-school children, all nurseries have supervised toothbrushing every day and early years' settings have healthy low sugar meals and snacks.

The study limitations include the extrapolation of cost estimations which contained several assumptions. The researchers also possibly underestimated the number of children with oral pain in London because only community pharmacies were used as a means of identifying children and parents.

Credit: 
Queen Mary University of London

As summers get warmer, more rain may not be better than less

Warm, wet summers are historically unusual and could bring unexpected disruptions to ecosystems and society, according to new research from the University of British Columbia.

As climate change raises summer temperatures around the world, increases in precipitation could offset drought risk in some regions. However, a paper published in Nature Communications this month shows that wetter summers may bring other problems in a warming climate.

"Terrestrial climates around the world tend to alternate between cool, wet summers in some years and warm, dry summers in other years," said UBC forestry PhD candidate Colin Mahony, lead author of the study. "But climate change is driving many climates towards warmer and wetter conditions. We found that where temperature and precipitation are increasing together, climates are changing faster than the temperature trend alone would suggest."
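The geometric intuition behind the quote can be sketched with standardized anomalies: when temperature and precipitation both depart from their historical means, the combined departure exceeds the temperature departure alone. The numbers below are invented for illustration; the study's actual novelty metric is more elaborate, but the geometric point is the same.

```python
# Combined temperature + precipitation departure in standardized (sigma) units.
# All numbers are invented for illustration.
from math import sqrt

# Hypothetical summer: +1.5 C warmer and +30 mm wetter than the historical
# mean, against interannual standard deviations of 1.0 C and 25 mm.
t_dep = 1.5 / 1.0    # 1.5 sigma warmer
p_dep = 30 / 25      # 1.2 sigma wetter

combined = sqrt(t_dep**2 + p_dep**2)
print(f"temperature alone:  {t_dep:.2f} sigma")
print(f"with precipitation: {combined:.2f} sigma")  # larger than 1.5 sigma
```

In this toy case the combined climate has moved about 1.92 standard deviations from its historical norm, noticeably further than the 1.5-sigma temperature signal alone would suggest.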

Warmer, wetter summers could produce unexpected impacts, such as disease outbreaks and crop failures, because they break the climatic norms that ecosystems and human communities are adapted to.

For the study, Mahony and co-author Alex Cannon from Environment and Climate Change Canada looked at historical observations going back to 1901 and global climate model projections to the year 2100.

Previous studies of temperature alone have highlighted the tropics as a hotspot of emerging unfamiliar climates. However, this new research points to subtropical and temperate regions -- the southeastern U.S., central Canada, northern Australia, southern Africa, central Asia and the African Sahel -- as areas where these types of warmer, wetter extremes are most likely to occur.

"We're just getting into the time period where we expect to see this effect," Mahony said.

The next steps will be to look at specific effects of these compound climate extremes in individual regions of the world. Mosquito-borne human diseases such as Zika virus, dengue fever and malaria are promoted by both heat and standing water, and could be exacerbated by warm-wet extremes.

Closer to home, recent outbreaks of fungal diseases in the forests of western North America have been linked to warmer and wetter conditions at specific times of year.

"Some fungal outbreaks over the past couple of decades, such as Dothistroma needle blight, could likely have been anticipated by tracking how temperature and precipitation were changing together," said Mahony, who has worked as a forester in British Columbia for 10 years and has witnessed the impacts of climate change on the ground. "In order to respond to global warming, we need to understand how the climates of the future will be different than the familiar, historical climates that we are adapted to."

Credit: 
University of British Columbia

Our reactions to odor reveal our political attitudes

image: These are researchers at the smell laboratory at Stockholm university. Jonas Olofsson standing.

Image: 
Niklas Björling

People who are easily disgusted by body odours are also drawn to authoritarian political leaders. A survey showed a strong connection between supporting a society led by a despotic leader and being sensitive to body odours like sweat or urine. It might come from a deep-seated instinct to avoid infectious diseases. 'There was a solid connection between how strongly someone was disgusted by smells and their desire to have a dictator-like leader who can suppress radical protest movements and ensure that different groups "stay in their places". That type of society reduces contact among different groups and, at least in theory, decreases the chance of becoming ill', says Jonas Olofsson, who researches scent and psychology at Stockholm University and is one of the authors of the study.

Disgust is a basic emotion that helps us survive. When people are disgusted, they wrinkle their noses and squint their eyes, basically decreasing their sensory perception of the world. At its core, disgust is a protection against things that are dangerous and infectious - things that we want to avoid. The researchers had a theory that there would be a connection between feelings of disgust and how a person would want society to be organised. They thought that people with a strong instinct to distance themselves from unpleasant smells would also prefer a society where different groups are kept separate. 'Understanding the shared variance between basic emotional reactivity to potential pathogen cues, such as body odours, and ideological attitudes that can lead to aggression towards groups perceived as deviant can prompt future investigations into the emotional determinants of outgroup derogation. In the near future, this knowledge might inform policies to prevent ethnocentrism', says Marco Tullio Liuzza from Magna Graecia University of Catanzaro, Italy, also one of the authors.

A scale was developed for the participants to rate their levels of disgust for body odours, both their own and others. The scale was used in a large-scale survey that was given online in different countries, together with questions on their political views. In the US, questions about how they planned to vote in the presidential race in 2016 were added. 'It showed that people who were more disgusted by smells were also more likely to vote for Donald Trump than those who were less sensitive. We thought that was interesting because Donald Trump talks frequently about how different people disgust him. He thinks that women are disgusting and that immigrants spread disease and it comes up often in his rhetoric. It fits with our hypothesis that his supporters would be more easily disgusted themselves', says Jonas Olofsson.

The results of the study could be interpreted to suggest that authoritarian political views are innate and difficult to change. However, Jonas Olofsson believes that they can be changed even if they are deep-seated. 'The research has shown that the beliefs can change. If contact is created between groups, authoritarians can change. It's not carved in stone. Quite the opposite, beliefs can be updated when we learn new things.'

Credit: 
Stockholm University

New online tool gives 3-D view of human metabolic processes

An international team of researchers has developed a computational resource that provides a 3D view of genes, proteins and metabolites involved in human metabolism. Researchers used the tool to map disease-related mutations on proteins and also probed how genes and proteins change in response to certain drugs. The work provides a better understanding of disease-causing mutations and could enable researchers to discover new uses for existing drug treatments.

The findings were recently published in Nature Biotechnology. The work was led by the research group of Bernhard Palsson, Galletti Professor of Bioengineering at the University of California San Diego, in collaboration with colleagues at the University of Luxembourg, Technical University of Denmark and other institutions around the world.

The tool, called Recon3D, is the most comprehensive human metabolic network reconstruction to date. It integrates 3,288 open reading frames, which are stretches of DNA and RNA that contain protein-producing genes; 13,542 metabolic reactions; and the 3D structures of 4,140 metabolites and 12,890 proteins. Recon3D is available online through two databases: BiGG Models and the Virtual Metabolic Human database. Recon3D is also integrated into the Protein Data Bank, which can now enable users to visualize 3D structures of proteins in the context of their metabolic networks.

“This is the first resource to link all these different data types together in one place and has shown to be a very valuable tool for analyzing sequencing data,” said Elizabeth Brunk, a postdoctoral researcher at UC San Diego and first author of the study.

Approaches to analyzing sequencing data often involve treating DNA code as a linear sequence, but that doesn’t tell the whole story, Brunk explained. That’s because the proteins that DNA sequences produce naturally exist as 3D structures with coils, twists and folds—they aren’t linear. “So if you’re studying mutations and looking at them as isolated points on a line, they may look like they have no association with each other. But in the protein’s native folded state, those mutations could actually end up being close together,” Brunk said.

Researchers used Recon3D to map genetic mutations called single nucleotide polymorphisms (SNPs), which are caused by a variation of a single nucleotide base (A, C, G or T) in the DNA sequence. In this study, researchers focused on SNPs that are associated with diseases such as cancer. Researchers located where these mutations occur in proteins and found many of them occupying the same regions, which they called “mutation hotspots.” Using Recon3D, researchers found that other harmful mutations were significantly more likely to neighbor mutations that were also harmful.
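The "hotspot" idea can be illustrated generically: mutations that are far apart in the linear protein sequence can be close neighbours once the protein folds. Below is a toy sketch with invented coordinates; the actual analysis relies on curated 3D structures such as those in the Protein Data Bank.

```python
# Toy illustration: residues 12 and 85 are distant in the linear sequence but
# close in space once the protein folds. All coordinates are invented.
from math import dist  # Euclidean distance (Python 3.8+)

# residue index -> (x, y, z) of its alpha-carbon, in angstroms (made up)
coords = {
    12: (10.1, 4.2, -3.0),
    40: (35.7, -8.9, 14.3),
    85: (11.0, 5.1, -2.2),   # the chain folds back next to residue 12
}

def spatial_neighbors(residues, coords, cutoff=8.0):
    """Pairs of mutated residues within `cutoff` angstroms of each other."""
    pairs = []
    for i, a in enumerate(residues):
        for b in residues[i + 1:]:
            if dist(coords[a], coords[b]) <= cutoff:
                pairs.append((a, b))
    return pairs

mutated = [12, 40, 85]
print(spatial_neighbors(mutated, coords))  # [(12, 85)] -- a spatial hotspot
```

Residues 12 and 85 are 73 positions apart in the sequence yet less than 2 angstroms apart in this invented fold, so a purely linear analysis would miss their association.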

Researchers also used Recon3D to study how genes, proteins and metabolic reactions change in response to various drug treatments. To their surprise, the researchers found that drugs with very different molecular structures produced similar metabolic responses. Drug developers could use this tool to explore whether certain drugs can be repurposed to treat other diseases for which they weren’t originally developed, Brunk noted.

“It is wonderful to see how this international group of researchers came together to generate Recon3D, which accounts for 17 percent of the functionally annotated genes in the human genome,” Palsson said. “Given the involvement of metabolism in most major diseases (cancer, nervous system, diabetes, etc.) and wellness, Recon3D is likely to help break new ground in human metabolic research.”

Credit: 
University of California - San Diego

Young children use physics, not previous rewards, to learn about tools

Children as young as seven apply basic laws of physics to problem-solving, rather than learning from what has previously been rewarded, suggests new research from the University of Cambridge.

The findings of the study, based on the Aesop's fable The Crow and the Pitcher, help solve a debate about whether children learning to use tools are genuinely learning about physical causation or are just driven by what action previously led to a treat.

Learning about causality - about the physical rules that govern the world around us - is a crucial part of our cognitive development. From our observations and the outcome of our own actions, we build an idea - a model - of which tools are functional for particular jobs, and which are not.

However, the information we receive isn't always as straightforward as it should be. Sometimes outside influences mean that things that should work, don't. Similarly, sometimes things that shouldn't work, do.

Dr Lucy Cheke from the Department of Psychology at the University of Cambridge says: "Imagine a situation where someone is learning about hammers. There are two hammers that they are trying out - a metal one and an inflatable one. Normally, the metal hammer would successfully drive a nail into a plank of wood, while the inflatable hammer would bounce off harmlessly.

"But what if your only experience of these two hammers was trying to use the metal hammer and missing the nail, but using the inflatable hammer to successfully push the nail into a large pre-drilled hole? If you're then presented with another nail, which tool would you choose to use? The answer depends on what type of information you have taken from your learning experience."

In this situation, explains Cheke, a learner concerned with the outcome (a 'reward' learner) would learn that the inflatable hammer was the successful tool and opt to use it for later hammering. However, a learner concerned with physical forces (a 'functionality' learner) would learn that the metal hammer produced a percussive force, albeit in the wrong place, and that the inflatable hammer did not, and would therefore opt for the metal hammer.

Now, in a study published in the open access journal PLOS ONE, Dr Cheke and colleagues investigated what kind of information children extract from situations where the relevant physical characteristics of a potential tool are observable, but often at odds with whether the use of that tool in practice achieved the desired goal.

The researchers presented children aged 4-11 with a task in which they had to retrieve a floating token to earn sticker rewards. Each time, the children were presented with a container of water and a set of tools to use to raise the water level. The experiment is based on one of Aesop's most famous fables, in which a thirsty crow drops stones into a pitcher to reach the water.

In this test, some of the tools were 'functional' and some 'non-functional'. Functional tools were those that, if dropped into a standard container, would sink, raising the water level and bringing the token within reach; non-functional tools were those that would not do so, for example because they floated.

However, sometimes the children used functional tools to attempt to raise the level in a leaking container - in this context, the water would never rise high enough to bring the token within reach, no matter how functional the tool used.

At other times, the children were successful in retrieving the reward despite using a non-functional tool; for example, when using a water container that self-fills through an inlet pipe, it doesn't matter whether the tool is functional as the water is rising anyway.

After these learning sessions, the researchers presented the children with a 'standard' water container and a series of choices between different tools. From the pattern of these choices the researchers could calculate what type of information was most influential on children's decision-making: reward or function.

"A child doesn't have to know the precise rules of physics that allow a tool to work to have a feeling of whether or not it should work," says Elsa Loissel, co-first author of the study. "So, we can look at whether a child's decision making is guided by principles of physics without requiring them to explicitly understand the physics itself.

"We expected older children, who might have a rudimentary understanding of physical forces, to choose according to function, while younger children would be expected to use the simpler learning approach and base their decisions on what had been previously rewarded," adds co-first author Dr Cheke. "But this wasn't what we found."

Instead, the researchers showed that information about reward was never a reliable predictor of children's choices. Rather, the influence of functionality information increased with age - by the age of seven, it was the dominant influence in their decision making.

"This suggests that, remarkably, children begin to emphasise information about physics over information about previous rewards from as young as seven years of age, even when these two types of information are in direct conflict."
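The kind of choice tally behind this result can be sketched as follows. The ages and choices below are invented to mirror the reported trend (each trial pits a functional-but-unrewarded tool against a rewarded-but-non-functional one); they are not the study's data.

```python
# Invented ages and choices that mirror the reported trend; not real data.
from collections import defaultdict

# (age, which tool the child chose on a conflict trial)
trials = [
    (4, "rewarded"), (4, "rewarded"),
    (5, "rewarded"), (5, "functional"),
    (7, "functional"), (7, "functional"),
    (9, "functional"), (9, "functional"),
]

by_age = defaultdict(list)
for age, choice in trials:
    by_age[age].append(choice == "functional")

for age in sorted(by_age):
    rate = sum(by_age[age]) / len(by_age[age])
    print(f"age {age}: {rate:.0%} functionality-driven choices")
```

In this toy tally the proportion of functionality-driven choices rises with age, the same qualitative pattern the researchers report from children's real choices.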

Credit: 
University of Cambridge