Tech

Changes in oxygen concentrations in our ocean can disrupt fundamental biological cycles

image: On Aug. 11, 2015, a NASA satellite captured this false-color image of a large bloom of cyanobacteria (Nodularia) swirling in the Baltic Sea. These cyanobacteria fix inorganic atmospheric nitrogen (N2) into a form available to Life, a process fundamental for marine ecosystems. In our paper we show that nitrogen fixation becomes even more important when the state of oxygenation of the ocean declines.

Image: 
Photo courtesy of NASA Earth Observatory/USGS: https://earthobservatory.nasa.gov/images/86449/blooming-baltic-sea

New research led by scientists at the University of Bristol has shown that the feedback mechanisms that were thought to keep the marine nitrogen cycle relatively stable over geological time can break down when oxygen levels in the ocean decline significantly.

The nitrogen cycle is essential to all forms of life on Earth - nitrogen is a basic building block of DNA.

The marine nitrogen cycle is strongly controlled by biology, and small changes in the marine nitrogen cycle have major implications for life.
It is thought that the marine nitrogen cycle has stayed relatively stable over geological time due to a range of different feedback mechanisms.

These feedback mechanisms are called 'the nitrostat'. However, exactly how the global marine nitrogen cycle and the associated feedback mechanisms responded to past severe changes in marine oxygenation is not well understood.

The team used a data-constrained Earth system model to show that under such deoxygenated conditions the ocean can become extremely depleted in nitrogen as the total bioavailable nitrogen inventory collapses relative to phosphorus.

At the same time, the ocean transitions from an oxic, nitrate-dominated state to an anoxic, ammonium-dominated state. The substantial reduction in the ocean's bioavailable-N inventory in response to changes in marine oxygenation may represent a key biogeochemical vulnerability.

Their findings are published today in the journal Proceedings of the National Academy of Sciences of the United States of America.

Lead author Dr David Naafs from the University of Bristol's School of Earth Sciences, said: "Our results demonstrate that changing the amount of oxygen in the ocean can have disastrous effects on vital biogeochemical cycles such as the nitrogen cycle, which is essential for all forms of Life."

Co-author Dr Fanny Monteiro, from Bristol's School of Geographical Sciences, added: "Our modelling results are in agreement with the sparsely available proxy data from the geological past."

Co-author Professor Ann Pearson from Harvard University, said: "Our modelling results show the impact of changes in ocean oxygenation on the marine nitrogen cycle for places and time periods for which we do not (yet) have sufficient proxy data."

The strength and state of the marine nitrogen cycle and biological pump in the ocean are highly susceptible to disruptions in the level of oceanic oxygen.

As oxygen levels in the oceans are currently declining and expected to decline significantly more in the coming decades due to anthropogenic activities, the results indicate that the marine nitrogen cycle might be significantly disrupted in the future.

Credit: 
University of Bristol

Breakthrough in understanding common childhood cancer

Scientists studying one of the most common forms of childhood cancer have made an important breakthrough in understanding how the disease progresses.

Neuroblastoma is a rare type of cancer of the nervous system that mainly affects babies and young children.

It often begins in the adrenal gland, but in around half of cases the condition has already spread throughout the body by the time it is diagnosed, particularly to bone and bone marrow, and in these high-risk cases survival is only about 50%.

A study, led by experts at Newcastle University, UK, and published today in Clinical Cancer Research, has focussed on neuroblastoma cells which circulate in the blood and spread through the bone marrow.

It is the first time that circulating neuroblastoma tumour cells have been identified in this way and experts say it is possible to test the effect of newer targeted types of treatments on the circulating tumour cells without the need for an invasive biopsy.

Understanding the disease

Professor Deborah Tweddle, from the newly-formed Newcastle University Centre for Cancer, led the national study and believes that it is a major step forward in trying to personalise treatment, as the number of circulating tumour cells indicates the severity of the disease.

She said: "Our study is an exciting development. It has improved our understanding of the spread of neuroblastoma and why some young patients may be at high risk of the disease advancing.

"If the numbers of circulating tumour cells prove important in predicting the effects of chemotherapy then, in the future, we will hopefully be able to tailor treatment to an individual patient's needs.

"Through a greater understanding of neuroblastoma we hope to eventually improve the cure rate and, for those children who survive, we want to make sure that their quality of life is as good as possible after treatment.

"Our ultimate aim is to give those with this devastating disease the best chance possible and increase survival rates."

This study is an example of the pioneering research that is taking place at Newcastle University Centre for Cancer, with a focus on the 'discovery to trial to healthcare pathway'.

Scientists collected blood and bone marrow samples from patients at five paediatric oncology centres in England and Scotland. A total of 40 patients were studied; 23 had high-risk neuroblastoma at diagnosis.

The team analysed the samples using specialist equipment - an Image Stream Flow Cytometer run by Dr David Jamieson - to count the number of tumour cells circulating in the blood and bone marrow. The tumour cells were labelled with an antibody against a molecule called GD2, which is present on neuroblastoma cells, and then photographed. Anti-GD2 antibody therapy is now routinely used for the treatment of patients with high-risk neuroblastoma.

Experts also collected the plasma of the blood after the cells were removed and found that they could still detect small pieces of tumour DNA. Genetic tests carried out on this DNA showed that it was similar to that of the main tumour.

Professor Tweddle, an Honorary Consultant in Paediatric Oncology at the Great North Children's Hospital, Newcastle, said: "To our knowledge, this is the first study in the world to use this type of specialist equipment to look for circulating tumour cells in neuroblastoma.

"Clearly it is early days, but this study is promising as it means that we can look at tumour cells as well as tumour DNA from the same blood sample. Therefore, if it's too hazardous to biopsy the main tumour we can get all the important genetic information we need from a blood test."

Further research will look at a much larger number of patients and will be done as part of the next European high-risk neuroblastoma trial, which it is hoped will open in the UK next year.

Patient's perspective

Gemma Lowery sadly knows first-hand the devastation that neuroblastoma can cause.

Her inspirational son, Bradley, died at the age of six after bravely battling the illness from the time he was a toddler. When it was revealed that the condition was terminal, Bradley appeared as mascot for his beloved Sunderland AFC and struck up a close bond with striker Jermain Defoe.

Gemma, of Blackhall Colliery, County Durham, has welcomed the Newcastle University research, as she says it is key that there is a better understanding of the development of the condition.

She said: "It is extremely important that as much research as possible is done into neuroblastoma to help improve treatments for children with the condition.

"In general, not enough research is done into childhood cancer so the fact that Newcastle University is leading the way in studying neuroblastoma is fantastic.

"It's great that this research is a stepping stone towards personalised treatment as the standard treatment currently given is gruelling. Bradley was on life-support for 15 days, not from the cancer but from the effects of the chemotherapy.

"The Bradley Lowery Foundation is pushing for more people to become involved in scientific research and this is an example of the importance of it."

Credit: 
Newcastle University

Researchers reach milestone in quantum standardization

image: This is the Institute of Quantum Computing at the University of Waterloo in Waterloo, Canada.

Image: 
University of Waterloo

Researchers at the University of Waterloo have developed a method that could pave the way to establishing universal standards for measuring the performance of quantum computers.

The new method, called cycle benchmarking, allows researchers to assess a platform's potential for scalability and to compare one quantum platform against another.

"This finding could go a long way toward establishing standards for performance and strengthen the effort to build a large-scale, practical quantum computer," said Joel Wallman, an assistant professor at Waterloo's Faculty of Mathematics and Institute for Quantum Computing. "A consistent method for characterizing and correcting the errors in quantum systems provides standardization for the way a quantum processor is assessed, allowing progress in different architectures to be fairly compared."

Cycle benchmarking provides a solution that helps quantum computing users to both determine the comparative value of competing hardware platforms and increase the capability of each platform to deliver robust solutions for their applications of interest.

The breakthrough comes as the quantum computing race is rapidly heating up, and the number of cloud quantum computing platforms and offerings is quickly expanding. In the past month alone, there have been significant announcements from Microsoft, IBM and Google.

This method determines the total probability of error for any given quantum computing application when the application is implemented through randomized compiling. This means that cycle benchmarking provides the first cross-platform means of measuring and comparing the capabilities of quantum processors that is customized to users' applications of interest.

"Thanks to Google's recent achievement of quantum supremacy, we are now at the dawn of what I call the `quantum discovery era', said Joseph Emerson, a faculty member at IQC. "This means that error-prone quantum computers will deliver solutions to interesting computational problems, but the quality of their solutions can no longer be verified by high-performance computers.

"We are excited because cycle benchmarking provides a much-needed solution for improving and validating quantum computing solutions in this new era of quantum discovery."

Emerson and Wallman founded the IQC spin-off Quantum Benchmark Inc., which has already licensed this technology to several world-leading quantum computing providers, including Google's Quantum AI effort.

Quantum computers offer a fundamentally more powerful way of computing, thanks to quantum mechanics. Compared to a traditional or digital computer, quantum computers can solve certain types of problems more efficiently. However, qubits--the basic processing unit in a quantum computer--are fragile; any imperfection or source of noise in the system can cause errors that lead to incorrect solutions in a quantum computation.

Gaining control over a small-scale quantum computer with just one or two qubits is the first step in a larger, more ambitious endeavour. A larger quantum computer may be able to perform increasingly complex tasks, like machine learning or simulating complex systems to discover new pharmaceutical drugs. Engineering a larger quantum computer is challenging; the spectrum of error pathways becomes more complicated as qubits are added and the quantum system scales.

Characterizing a quantum system produces a profile of the noise and errors, indicating whether the processor is performing the tasks or calculations it is being asked to do. To understand the performance of any existing quantum computer for a complex problem, or to scale up a quantum computer by reducing errors, it is first necessary to characterize all significant errors affecting the system.

Wallman, Emerson and a group of researchers at the University of Innsbruck identified a method to assess all error rates affecting a quantum computer. They implemented this new technique on the ion trap quantum computer at the University of Innsbruck and found that error rates do not increase as the size of that quantum computer scales up - a very promising result.

"Cycle benchmarking is the first method for reliably checking if you are on the right track for scaling up the overall design of your quantum computer," said Wallman. "These results are significant because they provide a comprehensive way of characterizing errors across all quantum computing platforms."

Credit: 
University of Waterloo

16-million-year-old fossil shows springtails hitchhiking on winged termite

image: Distribution of springtails on termite and ant hosts within ~ 16 Ma old Dominican amber.

Image: 
N. Robin, C. D'Haese and P. Barden

When trying to better the odds for survival, a major dilemma that many animals face is dispersal -- being able to pick up and leave to occupy new lands, find fresh resources and mates, and avoid intraspecies competition in times of overpopulation.

For birds, butterflies and other winged creatures, covering long distances may be as easy as the breeze they travel on. But for soil-dwellers of the crawling variety, the hurdle remains: How do they reach new, far-off habitats?

For one group of tiny arthropods called springtails (Collembola), a recent fossil discovery now suggests their answer to this question has been to piggyback on the dispersal abilities of others, literally.

In findings published in BMC Evolutionary Biology, researchers at the New Jersey Institute of Technology (NJIT) and the Muséum national d'Histoire naturelle have detailed the discovery of an ancient interaction preserved in 16-million-year-old amber from the Dominican Republic: 25 springtails attached to, and nearby, a large winged termite and ant from the days of the early Miocene.

The fossil exhibits a number of springtails still attached to the wings and legs of their hosts, while others are preserved as if gradually floating away from their hosts within the amber. Researchers say the discovery highlights the existence of a new type of hitchhiking behavior among wingless soil-dwelling arthropods, and could be key to explaining how symphypleonan springtails successfully achieved dispersal worldwide.

"The existence of this hitchhiking behavior is especially exciting given the fact that modern springtails are rarely described as having any interspecfic association with surrounding animals," said Ninon Robin, the paper's first author whose postdoctoral research at NJIT's Department of Biological Sciences was funded by the Fulbright Program of the French-American Commission. "This finding underscores how important fossils are for telling us about unsuspected ancient ecologies as well as still ongoing behaviors that were so far simply overlooked."

Today, springtails are among the most common arthropods found in moist habitats around the world. Most springtails possess a specialized appendage under their abdomen that they use to "spring" away in flea-like fashion to avoid predation. However, this organ is not sufficient for traversing long distances, especially since most springtails are unable to survive long in dry areas.

The hitchhikers the researchers identified belong to a lineage of springtails found today on every continent, known as Symphypleona, which they say may have been "pre-adapted" to grasping on to other arthropods through prehensile antennae.

Because springtails would have encountered such winged termites and ants frequently, owing to their high abundance at the time of preservation, these social insects may have been their preferred hosts for transportation.

"Symphypleonan springtails are unusual compared to other Collembola in that they have specialized antennae that are used in mating courtship," said Phillip Barden, assistant professor of biology at NJIT and the study's principal investigator. "This antennal anatomy may have provided an evolutionary pathway for grasping onto other arthropods. In this particular fossil, we see these specialized antennae wrapping around the wings and legs of both an ant and termite. Some winged ants and termites are known to travel significant distances, which would greatly aid in dispersal."

Barden says that the discovery joins other reports from the Caribbean and Europe of fossil springtails attached to a beetle, a mayfly and a harvestman in amber, which together suggest that this behavior may still exist today.

Barden notes that evidence of springtail hitchhiking may not have been captured in such high numbers until now due to the rarity of such a fossilized interaction, as well as the nature of modern sampling methods for insects, which typically involve submersion in ethanol for preservation.

"Because it appears that springtails reflexively detach from their hosts when in danger, evidenced by the detached individuals in the amber, ethanol would effectively erase the link between hitchhiker and host," said Barden. "Amber derives from fossilized sticky tree resin and is viscous enough that it would retain the interaction. ... Meaning, sometimes you have to turn to 16-million-year-old amber fossils to find out what might be happening in your backyard."

Credit: 
New Jersey Institute of Technology

NASA spots first tropical cyclone of Southern Pacific season

image: On Nov. 25 at 9:05 a.m. EST (1405 UTC), the MODIS instrument aboard NASA's Aqua satellite gathered that temperature information. MODIS found a large area of powerful thunderstorms (yellow) circling Rita's center with cloud top temperatures as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall. Vanuatu is seen to the southwest of the storm.

Image: 
NASA/NRL

The tropical cyclone season in the Southern Pacific Ocean has kicked off with Tropical Cyclone Rita, and NASA's Aqua satellite passed over the storm and analyzed it in infrared light for temperature data.

Rita developed on Nov. 24 as Tropical Cyclone 1P, 452 miles north of Port Vila, Vanuatu. By 10 a.m. EST (1500 UTC) on Nov. 24, Tropical Cyclone 1P strengthened into a tropical storm and was named Rita.

NASA's Aqua satellite used infrared light to analyze the strength of storms in newly developed Rita. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures. On Nov. 25 at 9:05 a.m. EST (1405 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument gathered that temperature information. MODIS found a large area of powerful thunderstorms circling Rita's center with cloud top temperatures as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

At the time of the MODIS image, Rita was located northeast of Vanuatu. Because of Rita's close proximity, warnings were in effect for Vanuatu on Nov. 26: a Red Alert for Torba province and a Yellow Alert for Penama and Sanma provinces.

On Nov. 26 at 10 a.m. EST (1500 UTC), Tropical Cyclone Rita was located near latitude 12.6 degrees south and longitude 170.3 degrees east. That is about 344 nautical miles north-northeast of Port Vila, Vanuatu. Rita was moving to the south-southeast at 3 knots. Maximum sustained winds were 60 knots (69 mph/111 kph).
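For readers who want to double-check the unit conversions in figures like these, a minimal Python sketch (an illustrative calculation based on standard conversion factors, not part of NASA's bulletin) is:

    # Illustrative conversion of the quoted wind speed; not NASA code.
    knots = 60
    mph = knots * 1.15078   # 1 knot = 1.15078 mph
    kph = knots * 1.852     # 1 knot = 1.852 km/h
    print(f"{knots} kt = {mph:.0f} mph = {kph:.0f} km/h")  # 60 kt = 69 mph = 111 km/h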

Rita is forecast to move south-southeast. The storm is weakening and is expected to dissipate within three days.

Typhoons and hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Dinosaur skull turns paleontology assumptions on their head

A team of researchers at the University of Alberta has unearthed a well-preserved Styracosaurus skull--and its facial imperfections have implications for how paleontologists identify new species of dinosaurs.

The skull was discovered in 2015 by Scott Persons, then a graduate student in the Department of Biological Sciences, during an expedition in the badlands northwest of Dinosaur Provincial Park.

Nicknamed Hannah, the dinosaur was a Styracosaurus--a horned dinosaur over five metres in length with a fan of long horns. UAlberta paleontologists led by Robert Holmes, professor in the Department of Biological Sciences, have learned much from those horns--because they aren't symmetrical.

"When parts of one side of the skull were missing, paleontologists have assumed that the missing side was symmetrical to the one that was preserved," explained Persons. "Turns out, it isn't necessarily. Today, deer often have left and right antlers that are different in terms of their branching patterns. Hannah shows dramatically that dinosaurs could be the same way."

The differences in the skull's left and right halves are so extreme that, had the paleontologists found only isolated halves, they might have concluded that the two belonged to different species.

"The skull shows how much morphological variability there was in the genus," said Holmes. Like the antlers of modern deer and moose, Hannah shows that the pattern of dinosaur horns could vary significantly--meaning some fossils that were once assumed to be unique species will have to be reevaluated.

Tradition dictates that the person who finds an important dinosaur specimen gets to give it a nickname. "Hannah the dinosaur is named after my dog," explained Persons, now a professor and museum curator at the College of Charleston. "She's a good dog, and I knew she was home missing me while I was away on the expedition."

Despite the nickname, paleontologists have no way of knowing if the dinosaur was female. But they have learned other details from the skull--from a partnership with researchers in the Faculty of Engineering.

"Ahmed Qureshi and graduate student Baltej Rupal in the Faculty of Engineering assisted us in performing a 3D laser scan of the skull," said Persons. "That let our publication to include a digital reconstruction, allowing scientists all over the world to download the 3D model and inspect it in detail."

"This is the future of paleontological collections: digital dinosaurs."

Credit: 
University of Alberta

Agroecology is emerging as a new market for peasant farming

In the 2019 harvest, members of the Landless Workers' Movement (MST) in Rio Grande do Sul state, Brazil, commemorated a harvest estimated at 16,000 tons of organic and agroecological rice, the biggest production of its kind in the whole country. A total of 363 families in 15 settlements work in the rice production.

In the south of the state of Minas Gerais, 20 families of the Campo Grande "quilombo" community produce Guaií organic and agroecological coffee, internationally recognized for its high quality. Coordinated by two women's collectives, the production process is also free from agrotoxins.

In Ceará state, small rural producers in Chapada do Apodi, after years of confronting large agricultural corporations and an enormous effort to recover lands, have created a new regional market for selling organic cassava and beans. The region is known for its large banana production for export and also for contamination by agrotoxins.

"These are three examples in three regions of Brazil, but I could present cases in all regions of the world. They involve a process of resistance and overcoming of the global agrarian question. After decades of subordination to agribusiness, the socioterritorial movements have created their own food system based on agroecology," said Bernardo Mançano Fernandes, a professor at the Geography Department of the School of Science and Technology and at the Institute of Public Policies and International Relations of São Paulo State University (UNESP), in a lecture given November 22 in Paris at FAPESP Week France.

According to the geographer, although economists and governments have long bet that the solution for this population would be to produce commodities for agribusiness, the movements have understood that it is possible to produce for society, without intermediaries and creating a new market.

Thus, some Brazilian peasant movements have innovated with the creation of a new food system. "This new system is based on the principles of food sovereignty, with experiences of agroecological production, family business, and community markets, as well as, of course, the fight for lands. Until recently these peasants were subjected to deterritorialization processes, when, due to economic pressures, they had their lands expropriated. More recently, there has been reterritorialization, when they have tried to return to the land," he said.

Fernandes coordinates the UNESCO Chair in Rural Education and Territorial Development, which, through an agreement between UNESP, UNESCO, and Via Campesina, created the first post-graduate program for the population of traditional territories focused on sustainable territorial development.

Territory is a condition of existence for the indigenous, "quilombola," and peasant socioterritorial movements. "They are people who do not exist without their territories," he said.

According to Fernandes, in the 1970s and 1980s, various governments tried to implement policies for the "integration" of these populations in the production of crop and livestock commodities. "They then started to produce commodities on small scales for large corporations. However, despite this process being called integration, it was, in fact, a subordination process, since it created a series of problems for these families and these territories, such as poverty and the loss of land," he said.

According to Fernandes, it was from the 1990s onward that a new concept - that of food sovereignty - emerged, created by the socioterritorial movements, at the forefront of which was Via Campesina, based on agroecology, that is, agriculture based on an ecological perspective. "This is happening in almost all countries of the world and, evidently, in Brazil, since there is an ever greater demand for the production of healthy food. It is a new market," he said.

Selling directly to the consumer

Another characteristic of this global phenomenon, linked to the "quilombola" and indigenous peasant movement, is that it does not compete with the traditional model of monoculture, practised on large landholdings with the use of agrotoxins. "It is another logic. It makes no sense for agroecology to compete with the capitalist form of agribusiness. They are different production and product models, with different qualities and scales," he said.

Following this same logic, organic and agroecological products are not sold to large corporations, but at fairs, institutional markets, and cooperative stores. "They are creating new markets and relationships with the communities that support the farmer, offering organic and agroecological baskets sold directly to the consumer. They also sell to schools and hospitals," he said.

According to Fernandes, all the farming families that produce rice, beans, cassava, and coffee in the examples mentioned in Rio Grande do Sul, Ceará, and Minas Gerais were subordinate to the agribusiness model. "Now, organized in the Landless Workers' Movement, they have recovered their territories and gone on to produce organic and agroecological foods, as they understood that it was the only way to continue existing," he said.

Fernandes highlights that, although it is the most well-known, the MST is only one of the 126 socioterritorial movements listed in Brazil by DATALUTA - the Fight for Land Database, of the Center for Agrarian Reform Studies, Research, and Projects (NERA) at UNESP.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Scientists suggest new solution to the rare-disease problem

Thousands of rare diseases cumulatively affect millions of people across the globe, yet because each case is so rare doctors struggle to accurately diagnose and effectively treat individual patients. Every time a patient with an unspecified disorder walks into a new clinic or shows up in an emergency room, doctors must start from scratch. The patients often go through years of such experiences before they ever get a diagnosis.

Scientists believe they've hit upon a way to get a handle on this problem.

In a commentary published this month in the journal Nature Reviews Drug Discovery, an international team of data scientists and rare-disease specialists write that they have come up with a way to characterize and define diseases so that the definitions would eventually be sharable among physicians across the globe. For years, patient advocacy groups and regulators have cited an estimate that there are roughly 7,000 "rare" diseases; however, the new research suggests this number may undercount by thousands.

"This preliminary analysis suggests that there could be a substantially higher number of rare diseases than typically assumed ... with obvious implications for diagnostics, drug discovery and treatment," they write.

The scientists call for a coordinated effort to better define rare diseases so that clinicians can effectively diagnose and treat patients.

The reason: Doctors can't successfully treat what they can't recognize.

"Most of these diseases they'll never see again in their lifetime," said lead author Melissa Haendel, Ph.D., associate professor of medical informatics and clinical epidemiology in the Oregon Clinical and Translational Research Institute (OCTRI). "If we can't count them, that means we also can't define them, and therefore we can't adequately diagnose them."

The lack of agreement and imperfections in current rare disease definitions also affect research on rare disease mechanisms and potential therapies.

Researchers estimate that as much as 10% of the world's population suffers from a rare disease, translating into hundreds of millions of people. Yet the inability to reliably diagnose a specific rare disease hinders doctors' ability to treat individual patients and researchers' ability to develop effective treatments.

Haendel wrote the commentary with 18 co-authors from the United States, Australia, France and Germany. In it, the authors call upon the World Health Organization, the U.S. Food and Drug Administration, the European Medicines Agency, the National Academy of Medicine and other entities to adopt a unified definition of rare disease.

Credit: 
Oregon Health & Science University

Carbon soccer ball with extra proton probably most abundant form in space

image: The fingerprint of protonated C60.

Image: 
Radboud University

It is one of the most common forms of carbon in space: C60, a soccer ball-shaped carbon molecule, but one that has an extra proton attached to it. This is the conclusion of research carried out at Radboud University, which has succeeded for the very first time in measuring the absorption spectrum of this molecule. Such knowledge could ultimately help us to learn more about the formation of planets. The researchers will publish their findings on November 25th in Nature Astronomy.

"Almost every property of the iconic C60 molecule - also called a molecular soccer ball, Buckminsterfullerene or buckyball - that can be measured, has been measured," says Jos Oomens, professor of Molecular Structure and Dynamics at Radboud University. Even so, he and his colleagues have managed to measure something new: the absorption spectrum of the molecule in its protonated form, C60H+.

"In doing so, we show that it is probably abundant in interstellar clouds, while we also demonstrate a textbook example of the role of symmetry in molecular physics", explains Oomens.

Carbon football in space

When the astronomer Harry Kroto discovered C60 in 1985, he predicted that, due to its high stability, this new form of carbon would be widespread in space. C60 consists of 60 carbon atoms in the form of a soccer ball, and has the highest possible symmetry in molecular physics. And indeed, over the last ten years, C60 has been detected in many interstellar clouds.

It is important for astronomers to determine the chemical composition of such interstellar clouds, because this is where new stars and planets are formed, including our own solar system. The more we learn about the molecules present in these clouds, the more we can discover about how our own planet was formed. C60 is one of the most complex molecules identified so far in these clouds.

Kroto also predicted that not C60, but the protonated version of the molecule, would be the most prevalent in space. Now the researchers have shown for the first time that this could in fact be the case. "When we compared the infrared spectra emitted by interstellar clouds with our infrared spectrum for protonated C60, we found a very close match", explains Oomens.

Colour change due to symmetry loss

Protonated C60 has a proton (H+) attached to the outside of the football, which means that the molecule loses its perfect symmetry. "Our research shows that, as a result, protonated C60 absorbs many more colours of light than 'normal' C60. In fact, you could say that C60H+ has a very different colour compared with the C60 molecule, although this is in the infrared spectrum. This is a well-known effect in molecular physics, and is beautifully demonstrated in the new spectrum."

This is the first time that researchers have successfully measured the light absorption spectrum of protonated C60. Because of the charge on the molecules, they repel one another, and this makes it difficult to obtain a sufficiently high density to obtain an absorption spectrum. "We found a way to work around this using the free-electron laser at the FELIX laboratory. By combining the FELIX laser with a mass spectrometer, C60H+ disintegrates and we can detect the fragmented ions rather than measuring the direct absorption spectrum."

Credit: 
Radboud University Nijmegen

Air pollution linked to higher glaucoma risk

Living in a more polluted area is associated with a greater likelihood of having glaucoma, a debilitating eye condition that can cause blindness, finds a new UCL-led study in the UK.

People in neighbourhoods with higher amounts of fine particulate matter pollution were at least 6% more likely to report having glaucoma than those in the least-polluted areas, according to the findings published in Investigative Ophthalmology & Visual Science.

"We have found yet another reason why air pollution should be addressed as a public health priority, and that avoiding sources of air pollution could be worthwhile for eye health alongside other health concerns," said the study's lead author, Professor Paul Foster (UCL Institute of Ophthalmology and Moorfields Eye Hospital).

"While we cannot confirm yet that the association is causal, we hope to continue our research to determine whether air pollution does indeed cause glaucoma, and to find out if there are any avoidance strategies that could help people reduce their exposure to air pollution to mitigate the health risks."

Glaucoma is the leading global cause of irreversible blindness and affects over 60 million people worldwide. It most commonly results from a build-up of pressure from fluid in the eye, causing damage to the optic nerve that connects the eye to the brain. Glaucoma is a neurodegenerative disease.

"Most risk factors for glaucoma are out of our control, such as older age or genetics. It's promising that we may have now identified a second risk factor for glaucoma, after eye pressure, that can be modified by lifestyle, treatment or policy changes," added Professor Foster.

The findings were based on 111,370 participants of the UK Biobank study cohort, who underwent eye tests from 2006 to 2010 at sites across Britain. The participants were asked whether they had glaucoma, and they underwent ocular testing to measure intraocular pressure as well as spectral-domain optical coherence tomography imaging (a laser scan of the retina) to measure the thickness of the eye's macula (the central area of the retina).

The participants' data was linked to air pollution measures for their home addresses, from the Small Area Health Statistics Unit, with the researchers focusing on fine particulate matter (equal or less than 2.5 micrometres in diameter, or PM2.5).

The research team found that people in the most-polluted 25% of areas were at least 6% more likely to report having glaucoma than those in the least-polluted quartile, and they were also significantly more likely to have a thinner retina, one of the changes typical of glaucoma progression. Eye pressure was not associated with air pollution, which the researchers say suggests that air pollution may affect glaucoma risk through a different mechanism.

"Air pollution may be contributing to glaucoma due to the constriction of blood vessels, which ties into air pollution's links to an increased risk of heart problems. Another possibility is that particulates may have a direct toxic effect damaging the nervous system and contributing to inflammation," said the study's first author, Dr Sharon Chua (UCL Institute of Ophthalmology and Moorfields Eye Hospital).

Air pollution has been implicated in elevated risk of pulmonary and cardiovascular disease as well as brain conditions such as Alzheimer's disease, Parkinson's disease and stroke. Particulate matter exposure is one of the strongest predictors of mortality among air pollutants.

This study adds to previous evidence that people in urban areas are 50% more likely to have glaucoma than those in rural areas, suggesting now that air pollution may be a key contributor to that pattern.

"We found a striking correlation between particulate matter exposure and glaucoma. Given that this was in the UK, which has relatively low particulate matter pollution on the global scale, glaucoma may be even more strongly impacted by air pollution elsewhere in the world. And as we did not include indoor air pollution and workplace exposure in our analysis, the real effect may be even greater," said Professor Foster.

Credit: 
University College London

Light-trapping nanocubes drive inexpensive multispectral camera

image: A new type of lightweight, inexpensive hyperspectral camera could enable precision agriculture. This graphic shows how different pixels can be tuned to specific frequencies of light that indicate the various needs of a crop field.

Image: 
Maiken Mikkelsen & Jon Stewart, Duke University

DURHAM, N.C. -- Researchers at Duke University have demonstrated photodetectors that could span an unprecedented range of light frequencies by using on-chip spectral filters created by tailored electromagnetic materials. The combination of multiple photodetectors with different frequency responses on a single chip could enable lightweight, inexpensive multispectral cameras for applications such as cancer surgery, food safety inspection and precision agriculture.

A typical camera only captures visible light, which is a small fraction of the available spectrum. Other cameras might specialize in infrared or ultraviolet wavelengths, for example, but few can capture light from disparate points along the spectrum. And those that can suffer from a myriad of drawbacks, such as complicated and unreliable fabrication, slow functional speeds, bulkiness that can make them difficult to transport, and costs up to hundreds of thousands of dollars.

In research appearing online on November 25 in the journal Nature Materials, Duke researchers demonstrate a new type of broad-spectrum photodetector that can be implemented on a single chip, allowing it to capture a multispectral image in a few trillionths of a second and be produced for just tens of dollars. The technology is based on physics called plasmonics -- the use of nanoscale physical phenomena to trap certain frequencies of light.

"The trapped light causes a sharp increase in temperature, which allows us to use these cool but almost forgotten about materials called pyroelectrics," said Maiken Mikkelsen, the James N. and Elizabeth H. Barton Associate Professor of Electrical and Computer Engineering at Duke University. "But now that we've dusted them off and combined them with state of the art technology, we've been able to make these incredibly fast detectors that can also sense the frequency of the incoming light."

According to Mikkelsen, commercial photodetectors have been made with these types of pyroelectric materials before, but have always suffered from two major drawbacks. They haven't been able to focus on specific electromagnetic frequencies, and the thick layers of pyroelectric material needed to create enough of an electric signal have caused them to operate at very slow speeds.

"But our plasmonic detectors can be turned to any frequency and trap so much energy that they generate quite a lot of heat," said Jon Stewart, a graduate student in Mikkelsen's lab and first author on the paper. "That efficiency means we only need a thin layer of material, which greatly speeds up the process."

The previous record for detection times in any type of thermal camera with an on-chip filter, whether it uses pyroelectric materials or not, was 337 microseconds. Mikkelsen's plasmonics-based approach sparked a signal in just 700 picoseconds, which is roughly 500,000 times faster. But because those detection times were limited by the experimental instruments used to measure them, the new photodetectors might work even faster in the future.
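As a rough sanity check on that figure, the ratio of the two detection times can be computed directly; the snippet below is an illustrative calculation based only on the numbers quoted above, not code from the study:

    # Illustrative check of the quoted speed-up; not from the paper.
    previous_record = 337e-6   # 337 microseconds, in seconds
    new_detection = 700e-12    # 700 picoseconds, in seconds
    print(round(previous_record / new_detection))  # ~481,000, i.e. roughly 500,000 times faster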

To accomplish this, Mikkelsen and her team fashioned silver cubes just a hundred nanometers wide and placed them on a transparent film only a few nanometers above a thin layer of gold. When light strikes the surface of a nanocube, it excites the silver's electrons, trapping the light's energy -- but only at a specific frequency.

The size of the silver nanocubes and their distance from the base layer of gold determine that frequency, while the amount of light absorbed can be tuned by controlling the spacing between the nanoparticles. By precisely tailoring these sizes and spacings, researchers can make the system respond to any electromagnetic frequency they want.

To harness this fundamental physical phenomenon for a commercial hyperspectral camera, researchers would need to fashion a grid of tiny, individual detectors, each tuned to a different frequency of light, into a larger 'superpixel'.

In a step toward that end, the team demonstrates four individual photodetectors tailored to wavelengths between 750 and 1900 nanometers. The plasmonic metasurfaces absorb energy from specific frequencies of incoming light and heat up. The heat induces a change in the crystal structure of a thin layer of pyroelectric material called aluminium nitride sitting directly below them. That structural change creates a voltage, which is then read by a bottom layer of a silicon semiconductor contact that transmits the signal to a computer to analyze.

"It wasn't obvious at all that we could do this," said Mikkelsen. "It's quite astonishing actually that not only do our photodetectors work, but we're seeing new, unexpected physical phenomena that will allow us to speed up how fast we can do this detection by many orders of magnitude."

Mikkelsen sees several potential uses for commercial cameras based on the technology, because the process required to manufacture these photodetectors is relatively fast, inexpensive and scalable.

Surgeons might use multispectral imaging to tell the difference between cancerous and healthy tissue during surgery. Food and water safety inspectors could use it to tell when a chicken breast is contaminated with dangerous bacteria.

With the support of a new Moore Inventor Fellowship from the Gordon and Betty Moore Foundation, Mikkelsen has set her sights on precision agriculture as a first target. While plants may only look green or brown to the naked eye, the light outside of the visible spectrum that is reflected from their leaves contains a cornucopia of valuable information.

"Obtaining a 'spectral fingerprint' can precisely identify a material and its composition," said Mikkelsen. "Not only can it indicate the type of plant, but it can also determine its condition, whether it needs water, is stressed or has low nitrogen content, indicating a need for fertilizer. It is truly astonishing how much we can learn about plants by simply studying a spectral image of them."

Hyperspectral imaging could enable precision agriculture by allowing fertilizer, pesticides, herbicides and water to be applied only where needed, saving water and money and reducing pollution. Imagine a hyperspectral camera mounted on a drone mapping a field's condition and transmitting that information to a tractor designed to deliver fertilizer or pesticides at variable rates across the fields.

It is estimated that the process currently used to produce fertilizer accounts for up to two percent of the global energy consumption and up to three percent of global carbon dioxide emissions. At the same time, researchers estimate that 50 to 60 percent of fertilizer produced is wasted. Accounting for fertilizer alone, precision agriculture holds an enormous potential for energy savings and greenhouse gas reduction, not to mention the estimated $8.5 billion in direct cost savings each year, according to the United States Department of Agriculture.

Several companies are already pursuing these types of projects. For example, IBM is piloting a project in India using satellite imagery to assess crops in this manner. This approach, however, is very expensive and limiting, which is why Mikkelsen envisions a cheap, handheld detector that could image crop fields from the ground or from inexpensive drones.

"Imagine the impact not only in the United States, but also in low- and middle-income countries where there are often shortages of fertilizer and water," said Mikkelsen. "By knowing where to apply those sparse resources, we could increase crop yield significantly and help reduce starvation."

Credit: 
Duke University

New hospital tech disrupts doctors' and nurses' jobs, forces improvisation to ensure patient safety

Doctors and nurses must adapt their routines and improvise their actions as new technology disrupts their working practices, both to ensure continued patient safety and to keep their roles effective and relevant.

Research from Lancaster University Management School, published in the Journal of Information Technology, found that electronic patient records, brought in to streamline and improve work, caused changes in the division of labour and the expected roles of both physicians and nursing staff.

These changes disrupted working practices, professional boundaries and professional identities, often requiring complex renegotiations to re-establish them in order to deliver safe patient care. Managers implementing these systems are often quite unaware of the unintended consequences of their drive for efficiency.

The researchers worked in one of Saudi Arabia's biggest hospitals, in Riyadh, over the course of a nine-month period. The hospital has 2,000 beds and serves around 2,000 outpatients a day, with a staff of 1,000. They conducted interviews and observations on the wards and timed their research to coincide with the introduction of a new Computerised Physician Order Entry (CPOE) system, which provides the facility to update patient records electronically.

The CPOE system is designed to be a safer, more streamlined and efficient way to provide care to patients. However, implicit in its design is a hierarchy of medical expertise, which assumes physicians rather than nurses will enter orders.

The researchers found this was at odds with ongoing work practices, where doctors considered themselves the thinkers, and nurses the doers, carrying out such work. As a result, staff found their roles and established practices undermined, and employed methods better suited to their routines to ensure fewer mistakes and that their roles still mattered, even if the new system was not built with such decisions in mind.

The CPOE also introduced added complications in the form of yellow stickers to indicate when medication for a patient is set to be discontinued. Although these alerts were aimed at physicians, who must give the orders to renew medication, the physicians tended to ignore them because they felt the alerts were meant for nurses, who instead kept track of the stickers and issued reminders to act on them.

Co-author Professor Lucas Introna said: "Introducing the CPOE system disrupted medical work practices, especially around the division of labour between 'headwork' and 'paperwork', and staff had to renegotiate new practices, with non-human actors such as the CPOE and paper-based medical records playing a role in such renegotiations.

"The new system affected the professional identities of doctors and nurses, as they became fundamentally different actors when operating within that system as opposed to using previous methods. Boundaries were contested when trying to deal with the changes introduced.

"Under the paper system, the nurse held the role of an expert administrator of the patient record, but the new CPOE system handed that responsibility to doctors with no experience of such responsibilities. The new system did not take account of the established hierarchy and division of labour, which places physicians as authorisers rather than administrators.

"Because physicians were not used to using such systems, they were more error-prone - on occasion entering details for the wrong patient, or failing to even log on," added Professor Introna. "As a result, some claimed it was confusing and could cause serious errors, concluding it was neither safe nor useful.

"Nurses would have to correct errors and it was not uncommon for them to stand next to doctors and guide them through the system, with one nurse saying the doctor working with them was 'clueless as far as the system is concerned'.

"Doctors became novice administrators and nurses became chasers - their roles and long-established professional identities and boundaries changed by the introduction of the new system which has the potential to remove their legitimacy."

Doctors and nurses related to medical records in different ways, with nurses staying on top of medical record requirements and seeing caring for the records as caring for the patient, and vice-versa. Doctors, contrarily, saw themselves as being responsible for diagnosis and prescription, with nurses carrying out their instructions. The medical record is what tied them together and served as the centre of interaction.

Entering orders into the CPOE on an individual basis reduces the opportunities for discussion and joint decision-making when it comes to patients, while use of the system can take up to twice as long as the previous paper-based system.

Co-author Professor Niall Hayes said: "The changes in the division of labour seemed innocent on the surface, but were much more significant. The previous recording practices, using paper and notes, were not only records, but also functioned as ways of communicating and sharing knowledge, which defined roles of those involved and more besides. The recording and reviewing of records were practices at the heart of clinical care and decision-making, and this all changed.

"There was a reluctance by the physicians to engage with the CPOE's functionality. They were now expected to enter orders, previously the domain of the nurses - they were no longer just the authorisers, they now had to take direct action through the system.

"Instead, doctors would often still ask nurses to enter orders and would simply check and submit. One nurse manager suggested doctors had been spoiled in the past, and it appeared the doctors did not see the work they were expected to carry out as work they should be doing.

"As an ICU consultant explained to us, they saw 'use' of the system as getting benefit from it, not operating it."

Credit: 
Lancaster University

High expression of apoptosis protein (Api-5) in chemoresistant triple-negative breast cancers: an innovative target

image: The influence of stress conditions on API-5 expression and inhibition. (A) Hypoxia CAIX immunostaining on XBC-R and XBC-S (n = 5 for each). CAIX labelling and necrotic areas were detected at a magnification of 200 (top panel) and 400 (bottom panel). (B, C) API-5 expression in HMEC cells and the effect of anti-API-5 peptide under hypoxia and metabolic stress conditions respectively. API-5 expression showed an increase after 12 h and 24 h of hypoxia and metabolic stress respectively. The effect of anti-API-5 peptide was more efficient under stress conditions on HMEC cell viability.

Image: 
Correspondence to - Melanie Di Benedetto - melanie.dibenedetto@univ-paris13.fr and Guilhem Bousquet - guilhem.bousquet@aphp.fr

The cover for issue 61 of Oncotarget features Figure 4, "The influence of stress conditions on API-5 expression and inhibition," by Bousquet, et al.

Seventy-eight TNBC biopsies from patients with different responses to chemotherapy were analysed for API-5 expression before any treatment.

Further studies on API-5 expression and inhibition were performed on patient-derived TNBC xenografts, one highly sensitive to chemotherapies and the other resistant to most tested drugs.

Clinical analyses of the 78 TNBC biopsies revealed that API-5 was more markedly expressed in endothelial cells before any treatment among patients with chemoresistant TNBC, and this was associated with greater micro-vessel density.

Dr. Melanie Di Benedetto and Dr. Guilhem Bousquet said, "Apoptosis Inhibitor-5 (API-5) also called Anti-Apoptosis Clone 11 (AAC-11) is a 58-kDa nuclear protein highly conserved across species."

API-5 was originally identified as an anti-apoptotic protein in mouse fibroblasts and in human cervical carcinoma cell lines, in which API-5 significantly enhanced cell survival after serum deprivation or ultraviolet sensitization.

In vitro, a decrease in API-5 expression sensitized human cancer cell lines to apoptosis, and increased their sensitivity to anticancer drugs.

In human osteosarcoma cells, API-5 promotes survival through the inhibition of an E2 promoter-binding factor, and the integrity of the leucine zipper domain is required for the anti-apoptotic functions of API-5.

Previous authors engineered an anti-API-5 peptide composed of the LZ domain fused with the Antennapedia/Penetratin protein domain, since the integrity of this domain is essential for the function of API-5. This LZ peptide is able to penetrate cells in vitro and to abolish the API-5/Acinus interaction.

In this pre-clinical study, the authors studied the expression of API-5 in patients with chemotherapy-resistant TNBC and the inhibition of API-5 in a resistant TNBC xenograft model.

The Di Benedetto/Bousquet research team concluded that hypoxia is associated with drug resistance; they have previously demonstrated a link between cancer stem cells, hypoxic niches and resistance to sunitinib in renal cell carcinoma.

In vivo, hypoxia has been shown to induce resistance to apoptosis in human cervical cancer cell lines through a selection of p53-mutated cells.

Credit: 
Impact Journals LLC

Scientists develop electrochemical platform for cell-free synthetic biology

image: The new biohybrid system uses non-optical reporter enzymes contained within 16 microlitres of liquid which pair specifically with micropatterned electrodes hosted on a small chip no more than one inch in length. (To be visible, liquid shown here is more than 16 microlitres)

Image: 
Steve Southon

Scientists at the University of Toronto (U of T) and Arizona State University (ASU) have developed the first direct gene circuit to electrode interface by combining cell-free synthetic biology with state-of-the-art nanostructured electrodes.

Study results were published today in Nature Chemistry.

Long inspired by concepts from the field of electronics, with its circuits and logic gates, synthetic biologists have sought to reprogram biological systems to carry out artificial functions for medical, environmental, and pharmaceutical applications. This new work moves the field of synthetic biology toward biohybrid systems that can take advantage of benefits from each discipline.

"This is the first example of a gene circuit being directly coupled to electrodes, and is an exciting tool for the conversion of biological information into an electronic signal," said Keith Pardee, assistant professor in the Department of Pharmaceutical Sciences at U of T's Leslie Dan Faculty of Pharmacy.

The interdisciplinary effort to create the new system brought together expertise in cell-free synthetic biology from the Pardee lab (U of T), electrochemistry from the Kelley lab (U of T) and sensor design from the Green lab (ASU).

Overcoming practical limits of optical signaling

Pardee, whose research group specializes in developing cell-free diagnostic technologies that can be used safely outside the lab, received widespread attention in 2016 when he and collaborators released a platform for the rapid, portable and low-cost detection of the Zika virus using paper-based synthetic gene networks.

Bringing the capacity to detect the Zika virus outside of the clinic and to the point-of-need was a crucial step forward, but the approach relied on conventional optical signaling - a change in colour to indicate that the virus had been detected. This posed a challenge for practical implementation in countries like Brazil, where viruses that cause similar symptoms require health care providers to screen for several different pathogens in order to correctly identify the cause of a patient's infection.

This highlighted the need for a portable system that could accommodate many sensors in the same diagnostic test, a capability known as multiplexing. The challenge was that multiplexing with colour-based signaling is not practical.

"Once you get beyond three colour signals, you run out of bandwidth for unambiguous detection. Moving into the electrochemical space gives us significantly more bandwidth for reporting and signalling. We've now shown that distinct electrochemical signals can operate in parallel and without crosstalk, which is a much more promising approach for scaling up," said Pardee.

The new biohybrid system uses non-optical reporter enzymes contained within 16 microlitres of liquid which pair specifically with micropatterned electrodes hosted on a small chip no more than one inch in length. Within this chip, gene-circuit-based sensors monitor the presence of specific nucleic acid sequences, which, when activated, trigger the production of one of a panel of the reporter enzymes. The enzymes then react with reporter DNA sequences that set off an electrochemical response on the electrode sensor chip.
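
To make the multiplexing idea concrete, here is a minimal Python sketch of the signal chain described above. It is only an illustration: the target names, enzyme labels and the four-channel layout are assumptions made for this example, not details taken from the study.

# Toy model of the described signal chain: each nucleic-acid target activates
# its own gene-circuit sensor, which produces a dedicated reporter enzyme
# wired to one electrode channel on the chip. (All names are hypothetical.)
PANEL = {
    "target_A": {"reporter_enzyme": "enzyme_1", "electrode_channel": 1},
    "target_B": {"reporter_enzyme": "enzyme_2", "electrode_channel": 2},
    "target_C": {"reporter_enzyme": "enzyme_3", "electrode_channel": 3},
    "target_D": {"reporter_enzyme": "enzyme_4", "electrode_channel": 4},
}

def read_chip(detected_targets):
    """Return which electrode channels report a signal.

    Because each target drives its own enzyme and electrode, the channels
    respond independently and in parallel, with no colour-bandwidth limit.
    """
    readout = {entry["electrode_channel"]: False for entry in PANEL.values()}
    for target, entry in PANEL.items():
        if target in detected_targets:
            readout[entry["electrode_channel"]] = True
    return readout

# A sample containing two of the four targets lights up exactly two channels.
print(read_chip({"target_A", "target_C"}))  # {1: True, 2: False, 3: True, 4: False}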

Detecting antibiotic resistance genes

As a proof of concept, the team applied the new approach to detecting colistin antibiotic resistance genes, which have recently been identified in livestock globally and represent a serious threat to the use of the antibiotic as a last-resort treatment for infection. Four separate resistance genes were detected, demonstrating the ability of the system to identify and report each gene independently as well as in combination.

For synthetic biologists, this new approach represents a potential technical leap forward. Conventional synthetic biology requires that logic calculations be encoded into the DNA of the gene circuit. This can be painstaking, taking months to years to build complex circuits.

"What makes this combined approach so powerful is that the underlying connectivity of the gene circuit sensor outputs can be re-programmed at will by simply modifying the code at the level of the software rather than at the level of the DNA which is much more difficult and time consuming," said Shana Kelley, university professor in the Department of Pharmaceutical Sciences at U of T's Leslie Dan Faculty of Pharmacy, whose research group specializes in the development of highly sensitive electrochemical sensors. Bringing biology-based sensing together with electronic-based logic, memory and response elements, has the potential to transform medicine, biotech, academic research, food safety, and other practical applications, she said.

A powerful toolkit for the future

"This new system enables us to detect many different signals simultaneously, which is essential for diagnostics and monitoring systems," said co-author Alexander A. Green, assistant professor at the Biodesign Institute at Arizona State University. "The electronic output means that in the future it can be readily interfaced technologies like smartphones and distributed sensing arrays that could be brought directly to a patient's bedside."

In Toronto, Pardee and his research group are excited to see where others in the synthetic biology field will take the system. "We've essentially created a new set of tools and opened up a new avenue for signaling. Synthetic biology applications are limited at the reporting step, and this has been a significant challenge. With this new combined approach, we think we can really accelerate the field and its capacity to improve lives."

Credit: 
University of Toronto - Leslie Dan Faculty of Pharmacy

A study compares how water is managed in Spain, California and Australia

image: Julio Berbel, investigator of the Universidad de Córdoba

Image: 
Universidad de Córdoba

Turning on the faucet and having water come out has become such a common daily occurrence that nobody stops to think about it. In times of abundance, everything goes smoothly. However, when rain is scarce or almost nonexistent and reservoir capacity diminishes considerably, alarm bells go off and governments scramble to find a solution. As they say, you don't know what you've got until it's gone.

A research project at the University of Cordoba and the University of Zaragoza analyzed and compared legislative changes in water management in Spain, California and the Murray Darling Basin in Australia over the last few years. The study demonstrated that major legislative reforms in water management in these three areas have always come about as a consequence of severe droughts.

Though these three regions have similar climates, agriculture and water scarcity problems, the ways they deal with those problems are quite different. In many respects, above all at the technical level, they coincide. "In all three zones they use recycled water, desalinated water and drip irrigation. However, the institutional and legal framework differs," clarifies Julio Berbel, Professor of Agricultural Economics at the University of Cordoba and lead author of the study together with Encarna Esteban of the University of Zaragoza.

One of the main differences lies in how water ownership is managed and how the market is regulated. In Australia, water is private and there has been a strong, loosely regulated water market for years in which rights to use groundwater and river water can be bought and sold. In California, the issue is even more complicated: water is considered a private asset, but there are several conflicts over who holds the rights to use it. In Spain, by contrast, river water was declared a public resource in 1985 and water trading was banned until 2005, when the market opened up slightly in order to tackle a major hydrological crisis, though it remains heavily regulated.

The study concludes that in all three regions there is a tendency to overexploit groundwater and surface water bodies, despite legislation. Nevertheless, if he had to choose a model, researcher Julio Berbel would opt for the Spanish one. "In Australia and California, they have not properly defined the concept of ecological flow, which indicates the minimum level of water needed to conserve the plant and animal ecosystem of a river," he explains. In Australia's Murray Darling region, water is extremely scarce. In California, the Colorado River is so overused that, in its final stretch, the water no longer reaches the sea.

Although Spain's problems are not as serious, its legislation can continue to improve by adopting measures that work in the other two regions and that will be needed in future droughts, which will become more and more common due to climate change. According to Berbel, Spain could learn from the flexibility and dynamism of Australia's water market and from California's aquifer recharge system. "I truly hope we do not have to wait for the next drought to implement some of these improvements," he concludes.

Credit: 
University of Córdoba