
Researchers capture first images of oxygen in cancer tumors during radiation therapy

image: An oxygen map image recovered from a mouse undergoing radiation therapy. The luminescent oxygen probe PtG4 is injected during the week of radiation treatment and localizes between the cells of the tumor as illustrated by microscopy (red).

Image: 
Brian Pogue, PhD

LEBANON, NH - Oxygen in cancer tumors is known to be a major factor in the success of radiation therapy. Hypoxia, or oxygen starvation, in solid tumors is also thought to be an important factor in resistance to therapy. However, it is difficult to monitor tumor oxygenation without invasive sampling of oxygen distributions throughout the tissue, or without averaging across the whole tumor, even though oxygen is highly heterogeneous within a tumor. A research team at Dartmouth's and Dartmouth-Hitchcock's Norris Cotton Cancer Center, led by Brian Pogue, PhD, has developed the first non-invasive way to directly monitor oxygen distributions within the tumor at the time radiation therapy is being delivered. After injection of an oxygen probe drug, PtG4, they are able to image the distribution of oxygen within the tumor. The method measures the luminescence lifetimes of PtG4 while it is excited by the Cherenkov light emitted during radiation therapy. PtG4 stays in the tumor for at least a week and can be used for repeated imaging.
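The release does not spell out how a measured lifetime maps to an oxygen level, but phosphorescence-lifetime oximetry of this kind conventionally relies on Stern-Volmer quenching: the higher the local oxygen partial pressure, the shorter the probe's lifetime. Below is a minimal sketch of that conversion, using placeholder constants rather than PtG4's published calibration.

```python
# Illustrative sketch: converting a measured phosphorescence lifetime to pO2
# via the Stern-Volmer relation, 1/tau = 1/tau0 + kq * pO2.
# tau0 and kq below are placeholder values, not PtG4's published calibration.

def lifetime_to_po2(tau_us, tau0_us=47.0, kq_per_us_per_mmHg=0.0015):
    """Estimate pO2 (mmHg) from a phosphorescence lifetime (microseconds)."""
    return (1.0 / tau_us - 1.0 / tau0_us) / kq_per_us_per_mmHg

# Shorter lifetimes correspond to better-oxygenated regions of the tumor.
for tau in (45.0, 30.0, 15.0):
    print(f"tau = {tau:5.1f} us  ->  pO2 ~ {lifetime_to_po2(tau):5.1f} mmHg")
```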

"The imaging is all done without any additional radiation, simply by using a camera to monitor the emissions during radiotherapy treatment," explains Pogue. "Following two tumor lines, one which is known to be responsive to radiation and one which is known to be resistant, we could see differences in the oxygenation of the tumor which are reflective of their differences in response." The team's findings, "Tissue pO2 Distributions in Xenograft Tumors Dynamically Imaged by Cherenkov-Excited Phosphorescence during Fractionated Radiation Therapy," are newly published in Nature Communications, by lead author, Xu Cao.

Pogue's team is able to capture oxygenation imaging through special technology. "We have a unique set of time-gated cameras in our radiation therapy department that were designed for Cherenkov-based radiation dosimetry, but we have used them for this additional purpose of monitoring oxygen in the tumors under treatment," says Pogue. "So access to these specialized Cherenkov cameras made the measurements possible." Pogue's team also collaborated with Professor Sergei Vinogradov and his team at the University of Pennsylvania Perelman School of Medicine, who produced the PtG4 and supported the work with drug characterization and co-supervision of the study.

Pogue hopes to develop this tumor monitoring capability into a clinical aid for tracking tumor response to radiation therapy, especially for tumors that are known to be hypoxic. Having such information available at the time of treatment could help guide decisions such as giving a radiation boost where needed. "When a patient gets radiation therapy, the treatment should be designed to directly utilize as much information about the patient's tumor as possible," says Pogue. "Today, we use the shape of the tumor and the tissue around it. But, we need to also think about using measurements of the tumor metabolism because this affects the success of treatment as well. Future radiation therapy treatments should ideally incorporate metabolic features such as oxygenation of the tumor when the treatment is planned or delivered."

The next steps toward this future are already underway. Pogue's team is looking to characterize how small a region they can track oxygenation in, and how fast they can take measurements. "Our goal is to produce oxygen images at video rate, with a spatial resolution that allows us to see radiobiologically relevant hypoxic nodules in human tumors," explains Pogue.

Credit: 
Dartmouth Health

Researchers identify a protein that determines tumor progression and metastasis in rhabdomyosarcoma

image: Dr. Òscar M. Tirado group

Image: 
Dr. Òscar M. Tirado group

Rhabdomyosarcoma is the most common childhood cancer of the soft tissues, and it mainly originates in the muscles. It represents almost 5% of pediatric tumors, and the survival rate is between 60% and 70%. The work, published in the journal Cancer Letters, focuses on the most aggressive and hardest-to-treat form of rhabdomyosarcoma, the alveolar type. Metastasis plays an essential role in disease progression, because it causes a severe drop in the patient survival rate, to below 30%. Dr. Oscar M. Tirado's group at the Bellvitge Biomedical Research Institute (IDIBELL) has observed that these sarcoma cells have increased levels of the LOXL2 protein, which is implicated in the tumors' metastatic capacity.

Under normal conditions, LOXL2 acts outside the cell, modifying the surrounding extracellular matrix. In a tumor environment, however, LOXL2 acts inside the cell, promoting metastatic processes through a mechanism independent of its normal function. Dr. Olga Almacellas, the first author of the study, notes that any future treatment that aims to inhibit LOXL2 activity to reduce rhabdomyosarcoma metastasis must take the drug's ability to penetrate cells into account. She adds that existing drugs that block the classic LOXL2 activity would not affect its metastatic role, as that role is independent of its classical function.

Cellular models of alveolar rhabdomyosarcoma showed a clear decrease in metastatic capacity when LOXL2 was eliminated. In addition, injecting these cells into healthy mice demonstrated that cells expressing LOXL2 formed more metastases than those without it. Patient samples from the Virgen del Rocío University Hospital also suggested a lower survival rate for patients with higher levels of LOXL2, although a larger study with more samples would be needed to confirm this relationship.

One of the questions that remains open is how LOXL2 regulates metastatic function in tumor cells. The researchers have found that it interacts with vimentin, a cytoskeletal protein that provides structural support and mechanical resistance to the cell and plays a crucial role in cell migration, processes that are fundamental to metastasis.

Credit: 
IDIBELL-Bellvitge Biomedical Research Institute

Researchers identify mechanism that triggers a rare type of muscular dystrophy

image: IBB-UAB researchers Salvador Ventura and Cristina Batlle.

Image: 
IBB-UAB

Limb-girdle muscular dystrophy (LGMD) is the term given to a group of rare hereditary diseases characterised by the wasting and weakening of the hip and shoulder muscles.

LGMD type 1G (LGMD1G) is associated with two possible genetic mutations in a protein called hnRNPDL. Little is known about this protein, except that it exists in cells in three functional forms (isoforms) and that it may contain the genetic mutations linked to the disease.

A research team led by the Institute of Biotechnology and Biomedicine at the Universitat Autònoma de Barcelona (IBB-UAB) now explains the behaviour of this protein, its role in the cells and the phenotype caused by the genetic mutations associated with LGMD1G, in an article published in Cell Reports.

The research establishes that one of the protein's isoforms shows a greater tendency to form amyloid fibrils - toxic protein aggregates - and that this aggregation occurred significantly faster when the protein contained the genetic mutations related to the disease, which prevented it from functioning correctly.

"For the first time we can provide solid proof of the effects genetic mutations have on the process of the hnRNPDL protein aggregation", affirms Salvador Ventura, IBB-UAB researcher and coordinator of the study. "Based on data obtained with the Drosophila fruit fly, we were able to suggest a possible mechanism for the disease: that it is the loss of protein function, once the aggregates are formed, that triggers the dystrophy. A hypothesis corroborated by the first data we are beginning to obtain with humans, and that opens the door to search for possible treatments".

Differential Behaviour

To conduct the study, researchers first analysed the presence and behaviour of the three isoforms in which the protein is found within the cells: with three, two or one protein domains, or independent regions. Then they studied the effects of the genetic mutations in the most common variant.

The isoform with two domains is most common in cells and, surprisingly for researchers, is also the one with the greatest tendency to form aggregates.

The researchers also saw that the isoform with three domains has a greater tendency to undergo a process known as phase separation, discovered a few years ago and of great biological importance, which could act to prevent aggregation.

"What we have seen is that the more tendency towards phase separation, the less aggregates are formed. Until now, it was thought that phase separation was a process occurring after amyloid-type aggregation, and we have now seen that it is not always so", explains Salvador Ventura.

The study was conducted both in vitro and in human cells. It was also conducted on a transgenic model of the Drosophila fruit fly, in which the flies expressed their natural variant or each of the forms associated with the disease.

Credit: 
Universitat Autonoma de Barcelona

Brain networks come 'online' during adolescence to prepare teenagers for adult life

image: The red brain regions belong to the "conservative" pattern of adolescent development, while the blue brain regions belong to the "disruptive" pattern

Image: 
Frantisek Vasa

New brain networks come 'online' during adolescence, allowing teenagers to develop more complex adult social skills, but potentially putting them at increased risk of mental illness, according to new research published in the Proceedings of the National Academy of Sciences (PNAS).

Adolescence is a time of major change in life, with increasing social and cognitive skills and independence, but also increased risk of mental illness. While it is clear that these changes in the mind must reflect developmental changes in the brain, it has been unclear how exactly the function of the human brain matures as people grow up from children to young adults.

A team based at the University of Cambridge and University College London has published a major new research study that helps us understand more clearly the development of the adolescent brain.

The study collected functional magnetic resonance imaging (fMRI) data on brain activity from 298 healthy young people, aged 14-25 years, each scanned on one to three occasions about 6 to 12 months apart. In each scanning session, the participants lay quietly in the scanner so that the researchers could analyse the pattern of connections between different brain regions while the brain was in a resting state.
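The "pattern of connections" analysed from these resting-state scans is what the study calls functional connectivity, which is commonly quantified as the correlation between the activity time series of different brain regions. The sketch below illustrates that idea on synthetic data; the study's actual pipeline (parcellation, motion correction, longitudinal modelling) is far more involved.

```python
# Minimal illustration of functional connectivity: the correlation between the
# activity time series of different brain regions. Synthetic data only; the
# published analysis uses parcellated, motion-corrected resting-state fMRI.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_timepoints = 5, 200
ts = rng.standard_normal((n_regions, n_timepoints))
ts[1] += 0.7 * ts[0]            # make regions 0 and 1 "talk" to each other

connectivity = np.corrcoef(ts)  # n_regions x n_regions correlation matrix
print(np.round(connectivity, 2))
```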

The team discovered that the functional connectivity of the human brain - in other words, how different regions of the brain 'talk' to each other - changes in two main ways during adolescence.

The brain regions that are important for vision, movement, and other basic faculties were strongly connected at the age of 14 and became even more strongly connected by the age of 25. This was called a 'conservative' pattern of change, as areas of the brain that were rich in connections at the start of adolescence become even richer during the transition to adulthood.

However, the brain regions that are important for more advanced social skills, such as being able to imagine how someone else is thinking or feeling (so-called theory of mind), showed a very different pattern of change. In these regions, connections were redistributed over the course of adolescence: connections that were initially weak became stronger, and connections that were initially strong became weaker. This was called a 'disruptive' pattern of change, as areas that were poor in their connections became richer, and areas that were rich became poorer.

By comparing the fMRI results to other data on the brain, the researchers found that the network of regions that showed the disruptive pattern of change during adolescence had high levels of metabolic activity typically associated with active re-modelling of connections between nerve cells.

Dr Petra Vértes, joint senior author of the paper and a Fellow of the mental health research charity MQ, said: "From the results of these brain scans, it appears that the acquisition of new, more adult skills during adolescence depends on the active, disruptive formation of new connections between brain regions, bringing new brain networks 'online' for the first time to deliver advanced social and other skills as people grow older."

Professor Ed Bullmore, joint senior author of the paper and head of the Department of Psychiatry at Cambridge, said: "We know that depression, anxiety and other mental health disorders often occur for the first time in adolescence - but we don't know why. These results show us that active re-modelling of brain networks is ongoing during the teenage years and deeper understanding of brain development could lead to deeper understanding of the causes of mental illness in young people."

Measuring functional connectivity in the brain presents particular challenges, as Dr František Váša, who led the study as a Gates Cambridge Trust PhD Scholar, and is now at King's College London, explained.

"Studying brain functional connectivity with fMRI is tricky as even the slightest head movement can corrupt the data - this is especially problematic when studying adolescent development as younger people find it harder to keep still during the scan," he said. "Here, we used three different approaches for removing signatures of head movement from the data, and obtained consistent results, which made us confident that our conclusions are not related to head movement, but to developmental changes in the adolescent brain."

Credit: 
University of Cambridge

Space super-storm likelihood estimated from longest period of magnetic field observations

Analysis led by the University of Warwick shows 'severe' space super-storms occurred in 42 years out of 150 and 'great' super-storms occurred in 6 years out of 150

Super-storms can disrupt electronics, aviation and satellite systems and communications

Provides insight into the scale of the largest super-storm in recorded history

A 'great' space weather super-storm large enough to cause significant disruption to our electronic and networked systems occurred on average once in every 25 years according to a new joint study by the University of Warwick and the British Antarctic Survey.

By analysing magnetic field records at opposite ends of the Earth (UK and Australia), scientists have been able to detect super-storms going back over the last 150 years.

This result was made possible by a new way of analysing historical data, pioneered by the University of Warwick, covering the last 14 solar cycles, stretching back well before the space age began in 1957, instead of the five most recent solar cycles currently used.

The analysis shows that 'severe' magnetic storms occurred in 42 out of the last 150 years, and 'great' super-storms occurred in 6 years out of 150. Typically, a storm may only last a few days but can be hugely disruptive to modern technology. Super-storms can cause power blackouts, take out satellites, disrupt aviation and cause temporary loss of GPS signals and radio communications.

Lead author Professor Sandra Chapman, from the University of Warwick's Centre for Fusion, Space and Astrophysics, said: "These super-storms are rare events but estimating their chance of occurrence is an important part of planning the level of mitigation needed to protect critical national infrastructure.

"This research proposes a new method to approach historical data, to provide a better picture of the chance of occurrence of super-storms and what super-storm activity we are likely to see in the future."

The Carrington storm of 1859 is widely recognised as the largest super-storm on record, but predates even the data used in this study. The analysis led by Professor Chapman estimates what amplitude it would need to have had to be in the same class as the other super-storms, and hence to have a chance of occurrence that can be estimated.

Professor Richard Horne, who leads Space Weather at the British Antarctic Survey, said: "Our research shows that a super-storm can happen more often than we thought. Don't be misled by the stats, it can happen any time, we simply don't know when and right now we can't predict when."

Space weather is driven by activity from the sun. Smaller scale storms are common, but occasionally larger storms occur that can have a significant impact.

One way to monitor this space weather is by observing changes in the magnetic field at the earth's surface. High quality observations at multiple stations have been available since the beginning of the space age (1957). The sun has an approximately 11-year cycle of activity which varies in intensity and this data, which has been extensively studied, covers only five cycles of solar activity.

If we want a better estimate of the chance of occurrence of the largest space storms over many solar cycles, we need to go back further in time. The aa geomagnetic index is derived from two stations at opposite ends of the earth (in UK and Australia) to cancel out the earth's own background field. This goes back over 14 solar cycles or 150 years, but has poor resolution.

Using annual averages of the top few percent of the aa index the researchers found that a 'severe' super-storm occurred in 42 years out of 150 (28%), while a 'great' super-storm occurred in 6 years out of 150 (4%) or once in every 25 years. As an example, the 1989 storm that caused a major power blackout of Quebec was a great storm.
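The reported rates reduce to simple annual frequencies; the snippet below just re-derives the percentages and return periods quoted above from the raw counts.

```python
# Re-deriving the quoted frequencies: years containing at least one storm of
# each class, out of the 150 years covered by the aa index.
severe_years, great_years, total_years = 42, 6, 150

for label, n in (("severe", severe_years), ("great", great_years)):
    print(f"{label}: {n / total_years:.0%} of years, "
          f"i.e. roughly one year in {total_years / n:.0f}")
# severe: 28% of years; great: 4% of years, i.e. roughly one year in 25.
```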

In 2012 the Earth narrowly avoided trouble when a coronal mass ejection from the Sun missed the planet and went off in another direction. According to satellite measurements, if it had hit the Earth it would have caused a super-storm.

Space weather was included in the UK National Risk Register in 2012 and updated in 2017 with a recommendation for more investment in forecasting. In September 2019 the Prime Minister announced a major new investment of £20 million into space weather. The object is to forecast magnetic storms and develop better mitigation strategies.

Credit: 
University of Warwick

Historical impacts of development on coral reef loss in the South China Sea

image: A coral specimen collected by the naturalist W Stimpson in 1854 in Chai Wan typhoon shelter (Victoria Harbour, HKSAR). This specimen was later used by the naturalist AE Verrill to describe the species Acropora valida for the first time. A fragment of this specimen was sampled and analyzed by N. Duprey for the present study.

Image: 
©The University of Hong Kong

New research led by The University of Hong Kong, Swire Institute of Marine Science in collaboration with Princeton University and the Max Planck Institute for Chemistry highlights the historical impacts of development on coral reef loss in the South China Sea. The findings were recently published in the journal Global Change Biology.

Using cutting-edge geochemical techniques pioneered by their Princeton collaborators, the team extracted minute quantities of nitrogen from coral skeletons, which grow in observable layers similar to tree rings. Although more than 99% of the skeleton is calcium carbonate, the coral secretes a protein scaffolding onto which the minerals attach. In this way, corals can control their calcification, which increases in productive summer months and decreases in cool winter months, leading to a tell-tale alternation of high- and low-density bands. Those bands were observed and measured using x-ray equipment at the Ocean Park veterinary hospital.

Using coral cores archived at The University of Hong Kong, and working across research laboratories at HKU, Princeton, and the Max Planck Institute, the team, led by SWIMS postdoctoral fellow Dr Nicolas Duprey, extracted skeletal material from each band to reconstruct a nearly 200-year time series of change in the Pearl River Delta that pre-dates British colonization. The coral, still living as of 2007, had continuously recorded the conditions of its environment during that period by using resources from seawater to build new skeleton. Nitrogen, a key component of the protein scaffolding, is one such element, derived from the coral's diet. Coincidentally, nitrogen also bears tell-tale signs of human disturbance through the increasing prevalence of sewage pollution and a rapidly changing landscape in the Pearl River catchment within Guangdong Province.

The well-documented collapse of southern Hong Kong coral communities in the 1980s-1990s remained a mystery for the scientific community until now. The authors report that during the coral's lifespan, the human population within the multi-city megalopolis sky-rocketed some 3,000% to today's ~100 million people. Modern records showed that especially in the 1980s-1990s, during Hong Kong's rapid development and the reclamation of Victoria Harbour, water quality deteriorated substantially, coinciding with severe losses of local coral reefs, especially in western waters.

"The precise detective work that our team has led over the last 5 years allowed us to identify the main culprit of this loss: water quality! This is a very interesting find because often time global warming is pointed out as the cause of coral decline worldwide; we often feel helpless facing this scenario because it involves changing drastically everybody's lifestyle on the planet to fix the problem. However, in the Pearl River Delta, the threat is different and originates from our poor handling of waste water treatment locally. Paradoxically, this is a good news because it implies that the solution to this problem is in our hands. Indeed, the improvement in waste water treatment in the 2000's was recorded in our coral skeleton, indicating that these efforts must be continued if we want corals to come back in Hong Kong, " said lead author Dr Nicolas Duprey.

Moreover, the coral skeleton revealed that in that specific timeframe a major anomaly in the nitrogen isotope ratio occurred. This was an indication that the source of nitrogen the coral used to build its skeletal scaffolding had been replaced by pollution. To verify this, the researchers also measured the isotope values of museum specimens held by the Smithsonian Institution and the Yale Peabody Museum that were collected in Hong Kong in the 1800s during some of the first global biodiversity expeditions.
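The release does not define the ratio, but nitrogen isotope measurements of this kind are conventionally reported in delta notation relative to atmospheric N2, and sewage-derived nitrogen typically carries a distinct signature. A small sketch of the convention, with illustrative sample ratios rather than values from the study:

```python
# The nitrogen isotope ratio is conventionally reported as delta-15N, in parts
# per thousand relative to atmospheric N2. The sample ratios used here are
# illustrative numbers, not measurements from the study.

R_AIR = 0.0036765   # 15N/14N of atmospheric N2, the international reference

def delta_15n(r_sample):
    """delta-15N in per mil relative to air."""
    return (r_sample / R_AIR - 1.0) * 1000.0

print(f"{delta_15n(0.003706):.1f} per mil")   # ~ +8, an isotopically heavy signature
print(f"{delta_15n(0.003688):.1f} per mil")   # ~ +3, a lighter signature
```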

Credit: 
The University of Hong Kong

Understanding long-term trends in ocean layering

image: Upper ocean stratification has been strengthening in a large part of the global ocean since the 1960s. Color shows trends in density difference between the surface and 200-m depth.

Image: 
Ryohei Yamaguchi

Water layering is intensifying significantly in about 40% of the world's oceans, which could have an impact on the marine food chain. The finding, published in the Journal of Geophysical Research: Oceans, could be linked to global warming.

Tohoku University geophysicist Toshio Suga collaborated with climate physicist Ryohei Yamaguchi of Korea's Pusan National University to investigate how upper-ocean stratification has changed over a period of 60 years.

Upper-ocean stratification is the presence of water layers of varying densities scattered between the ocean's surface and a depth of 200 metres. Density describes how tightly water is packed within a given volume and is affected by water temperature, salinity and depth. More dense water layers lie beneath less dense ones.

Ocean water density plays a vital role in ocean currents, heat circulation, and in bringing vital nutrients to the surface from deeper waters. The more significant the stratification in the upper ocean, the larger the barrier between the relatively warm, nutrient-depleted surface, and the relatively cool, nutrient-rich, deeper waters. More intense stratification could mean that microscopic photosynthetic organisms called phytoplankton that live near the ocean's surface won't get the nutrients they need to survive, affecting the rest of the marine food chain.

Scientists think that global warming could be increasing upper-ocean stratification, but investigations have been limited and have usually used short-term data, leading to a large degree of uncertainty. Suga and Yamaguchi compiled temperature and salinity data from the World Ocean Database 2013, covering the period from 1960 to 2017. They then used mathematical equations to calculate the density difference, based on temperature and salinity, between 10 and 200 metres in the regions where data was available.
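The study itself uses the full seawater equation of state; as a rough sketch of the idea, the snippet below converts temperature and salinity at two depths into a density difference using a simple linear approximation. The coefficients and example water properties are illustrative only.

```python
# Rough sketch of the stratification metric: the density difference between
# ~10 m and 200 m, computed with a simple linear equation of state. The study
# uses the full seawater equation of state; all coefficients and the example
# temperatures/salinities below are illustrative only.

RHO0, T0, S0 = 1025.0, 10.0, 35.0   # reference density (kg/m^3), temperature (C), salinity (psu)
ALPHA, BETA = 2.0e-4, 7.6e-4        # thermal expansion (1/C) and haline contraction (1/psu)

def density(temp_c, sal_psu):
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (sal_psu - S0))

upper = density(temp_c=18.0, sal_psu=34.8)   # warm, slightly fresher water near 10 m
deep = density(temp_c=11.0, sal_psu=34.9)    # cooler water at 200 m

# A larger difference means stronger stratification (a stronger barrier to mixing).
print(f"density difference: {deep - upper:.2f} kg/m^3")
```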

They found that around 40% of the world's oceans are witnessing a rise in upper ocean density stratification. Half of this rise is happening in tropical waters. They also found that rising stratification in mid-latitude and high-latitude oceans of the Northern Hemisphere varied seasonally, with faster changes happening in the summer compared to winter months.

Additionally, inter-annual variations in several regions correlated with climatic events, such as the Pacific Decadal Oscillation, the North Atlantic Oscillation and El Niño. This suggests that changes in density stratification could be a key factor explaining how large-scale atmospheric changes impact biogeochemical processes, the researchers say.

Suga and Yamaguchi note further studies are needed to confirm this link. But for this to happen, continued international collaborative efforts are needed to gather global, long-term temperature and salinity data across various upper-ocean depths.

Credit: 
Tohoku University

Scientists discover how malaria parasites import sugar

image: Structure of the transport protein PfHT1 in complex with the sugar D-glucose.

Image: 
David Drew

The consumption of sugar is a fundamental source of fuel in most living organisms. In the malaria parasite Plasmodium falciparum, the uptake of glucose is essential to its life cycle. As in other cells, sugar is transported into the parasite by a transport protein - a door designed for sugar to pass through the cell membrane. The details of how this door works have now been revealed.

"By elucidating the atomic structure of the sugar-transporting-protein PfHT1, we can better understand how glucose is transported into the parasite", says David Drew, Wallenberg Scholar at the Department of Biochemistry and Biophysics and leading the study at Stockholm University.

The main goal of the research is a basic understanding of this important biological process, but it also has potential for the development of new antimalarial drugs. Malaria kills almost half a million people each year, according to the WHO. It has been shown that blocking this door for sugar can stop the growth of the malaria parasite.

"It's a long process from a compound with antimalarial activity to a drug that can be taken in the clinic. However, with this knowledge one can improve known antimalarial compounds so that they are more specific to the malarial transporter, so they do not have the side-effect of stopping sugar transport into our own cells. As such, this knowledge increases the likelihood that more specific compounds can be developed into a successful drug", says David Drew.

Despite millions of years of evolutionary divergence between parasites and humans, the research shows that, surprisingly, glucose is captured by the sugar-transporting protein in malaria parasites in a similar manner as by transporters in the human brain.

"This conservation reflects the fundamental importance of sugar uptake - basically, nature hit on a winning concept and stuck with it", says David Drew.

However, the malaria parasite is more flexible. Other sugars, such as fructose, can also be imported. This flexibility could give the malaria parasite a selective advantage, allowing it to survive under conditions when its preferred energy source, glucose, is unavailable.

"Every biochemistry student is taught about the process of sugar transport and it is exciting to add another important piece to this puzzle", says Lucie Delemotte, Associate Professor of Biophysics at KTH Royal Institute of Technology and Science for Life Laboratory Fellow, who collaborated on this project.

Credit: 
Stockholm University

Common form of heart failure could be treated with already approved anticancer drug

image: Steven Houser, PhD, FAHA, Senior Associate Dean of Research, Vera J. Goodfriend Endowed Chair of Cardiovascular Research, and Professor of Physiology and Medicine at the Cardiovascular Research Center at the Lewis Katz School of Medicine at Temple University.

Image: 
Lewis Katz School of Medicine at Temple University

(Philadelphia, PA) - When it comes to finding new treatments for disease, reinventing the wheel is not always necessary - drugs already in use for other conditions may do the job. And if it turns out that a pre-existing drug works, getting it approved for the treatment of another disease can happen much more quickly than for entirely new drugs never previously tested in people.

This fast-tracking approach may now prove valuable - and potentially life-saving - for patients with a common form of heart failure known as heart failure with preserved ejection fraction (HFpEF). Many patients with HFpEF feel fine at rest but experience shortness of breath upon physical exertion because their sick heart struggles to pump enough blood to meet the body's needs. HFpEF usually worsens over time, leading to major declines in quality of life, and often death.

Thanks to new research by scientists at the Lewis Katz School of Medicine at Temple University (LKSOM), however, a drug capable of reversing HFpEF may soon be available. The researchers show that a drug already approved for the treatment of some forms of cancer can reverse HFpEF symptoms and improve the heart's ability to pump blood in an HFpEF animal model.

"Although many people suffer from HFpEF, there are currently no FDA-approved therapies available for the problem," explained Steven Houser, PhD, FAHA, Senior Associate Dean of Research, Vera J. Goodfriend Endowed Chair of Cardiovascular Research, and Professor of Physiology and Medicine at the Cardiovascular Research Center at LKSOM. Dr. Houser is the senior investigator on the new study, which was published in the journal Science Translational Medicine.

"We know from previous research that heart cells from patients with HFpEF have abnormalities in the genes that are being activated as well as in the function of the proteins that they encode," Dr. Houser said. "The alterations in gene expression and protein activity in these cells involve a group of enzymes known as histone deacetylases (HDACs). Drugs that block HDAC activity have already been developed for other diseases, including cancer."

At the suggestion of collaborator Timothy A. McKinsey, PhD, LaConte Chair in Cardiovascular Research, Professor of Medicine, Associate Cardiology Division Head for Translational Research, and Director of the Consortium for Fibrosis Research & Translation (CFReT) at the University of Colorado Anschutz Medical Campus, the Houser and McKinsey teams decided to investigate the effects of an HDAC inhibitor known as SAHA on animals with HFpEF. SAHA, marketed under the name Zolinza, is currently approved for the treatment of a form of cancer known as cutaneous T-cell lymphoma.

The Houser and McKinsey teams tested SAHA in an HFpEF model in which animals progressively developed typical signs of disease, including loss of exercise tolerance and shortness of breath. The animals also experienced tissue changes similar to those that occur in humans with HFpEF, most notably heart remodeling. Heart remodeling in HFpEF characteristically involves hypertrophy, or enlargement and thickening, of the left ventricle, which is the main pump that pushes oxygen-rich blood into the aorta and through the body. Hypertrophy is one way the heart attempts to respond to chronic cardiovascular problems, such as high blood pressure.

Following treatment with SAHA, HFpEF animals showed amazing improvements. In particular, hypertrophy of the left ventricle was significantly reduced in treated compared to untreated animals. The left ventricle was also much more relaxed in treated animals, enabling the heart to fill and pump more effectively and leading to overall improvements in heart structure and function.

"The remarkable thing is that this therapy could be tested in HFpEF patients today," Dr. Houser said.

The Houser and McKinsey teams now plan to investigate what specifically makes heart cells in HFpEF abnormal. "The cells are still alive but are working abnormally. If we can figure out why, we may be able to find a more targeted approach to develop entirely new treatments for HFpEF," Dr. Houser said.

Credit: 
Temple University Health System

Success and failure of ecological management is highly variable

image: Example of the laboratory setup for the flour beetle.

Image: 
Easton White

BURLINGTON, VT--What do we really know about reasons attributed to the success or failure of wildlife management efforts? A new study originating out of UVM suggests a disquieting answer: much less than we think.

A new study in the Proceedings of the National Academy of Sciences finds that ecological systems might contain a lot of inherent randomness that makes them difficult to manage. One of the most difficult parts of managing an invasive species or a fishery is determining whether or not the management strategy was effective. If a management strategy failed to reach some goal, was this because it was the wrong strategy or because of inherent randomness in the system? Perhaps that particular management strategy would have been the right choice 9 times out of 10 and managers were simply unlucky.

Led by Dr. Easton White from the University of Vermont, in collaboration with scientists in California and Colorado, the study used mathematical models to first demonstrate that there could be high levels of variability in species management outcomes. They then tested these ideas with an experimental invasive species, the flour beetle (Tribolium confusum).

"In nature, we might only have a single study site we are concerned with managing," White says. "This means we typically only have a single replicate under study, making it difficult to determine the ultimate cause of management success or failure. The combination of mathematical models and laboratory experiments provide replication and a measure of ecological management variability."

The team also found that the highest levels of management variability occurred at intermediate levels of management effort. In other words, unless a large amount of effort is used to control a system, we are likely to fail or succeed simply by chance. This is concerning for real systems where we have limited budgets.
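This is not the model from the paper, but the point can be illustrated with a minimal stochastic sketch: a pest population under a fixed removal effort, with success defined as suppressing it below a target. Under the assumed growth, noise, and target parameters, identical strategies can succeed or fail by chance, and the outcome is most uncertain at intermediate effort.

```python
# Illustrative sketch (not the published model): a noisy pest population under
# a fixed proportional removal effort. Success = final population below a
# target. All parameters are assumptions chosen for illustration.
import random

def management_succeeds(effort, generations=20, n0=100, r=0.3, k=500, target=50):
    n = n0
    for _ in range(generations):
        n += r * n * (1 - n / k)                  # logistic growth
        n *= (1 - effort)                          # proportional removal (management)
        n = max(0.0, random.gauss(n, 0.2 * n))     # environmental/demographic noise
    return n < target

random.seed(1)
for effort in (0.05, 0.25, 0.45):
    p = sum(management_succeeds(effort) for _ in range(500)) / 500
    print(f"effort {effort:.2f}: success in {p:.0%} of replicates")
# Low effort almost always fails and high effort almost always succeeds; at
# intermediate effort the same strategy succeeds in some replicates and fails
# in others - the outcome depends on luck.
```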

"Our results suggest that much of ecological management is bound to succeed or fail simply because of good or bad luck," notes White. "In our experiment we were able to control the laboratory conditions precisely, reducing variability caused by the environment. Thus, we might expect that managing natural systems might lead to higher levels of variability."

The team also investigated the combination of different management strategies. To control the invasive species, they tried direct harvesting and controlling the beetle movement. They found that combinations of strategies, as opposed to only using a single strategy, were often more effective.

Credit: 
University of Vermont

Drug lord's hippos make their mark on foreign ecosystem

video: UC San Diego scientists and their colleagues have published the first scientific assessment of the impact that an invasive hippo population, imported by infamous drug lord Pablo Escobar, is having on Colombian aquatic ecosystems. The study revealed that the hippos are changing the area's water quality by importing large amounts of nutrients and organic material from the surrounding landscape.

Image: 
Shurin Lab, UC San Diego

Four hours east of Medellin in northern Colombia's Puerto Triunfo municipality, the sprawling hacienda constructed by infamous drug lord Pablo Escobar of "Narcos" fame has become a tourist attraction. When Escobar's empire crashed, the exotic animals housed at his family's zoo, including rhinos, giraffes and zebras, were safely relocated to new homes... except for the hippopotamuses.

(Video: Researchers film invasive Colombian hippos: https://youtu.be/fY_8EM8V5Lw)

With no safe or practical way to remove the animals, the original population of four hippos has since ballooned to more than 80, per the last estimate (background story: https://ucsdnews.ucsd.edu/feature/drug-lords-hippos-make-their-mark-on-foreign-ecosystem).

Now, scientists at the University of California San Diego and their colleagues in Colombia have provided the first scientific assessment of the impact the invasive animals are having on Colombian aquatic ecosystems. Their study is published in the journal Ecology.

"This unique species has a big impact on its ecosystem in its native range in Africa, and we found that it has a similar impact when you import it into an entirely new continent with a completely different environment and cast of characters," said UC San Diego Biological Sciences Professor Jonathan Shurin. "It's clear that this effect might include negative consequences for water quality and water resources by fueling harmful algae and bacteria."

Spanning two years, the research team completed a comprehensive assessment of water quality, oxygen levels and stable isotope signatures, comparing lakes with hippo populations to those without. The researchers also compared the microbiomes in the lakes, along with assessments of insects, crustaceans and other organisms.

The study revealed that the hippos are changing the area's water quality by importing large amounts of nutrients and organic material from the surrounding landscape. Since the nocturnal animals feed on land most of the night and spend their days cooling off in the water, their large inputs of waste are altering the chemistry and oxygen of the lakes.

"The effect of fertilizing all those bacteria and algae increases the productivity in the water," said Shurin. "We found that the lakes are more productive when they have hippos in them. This can change the kinds of algae and bacteria and can lead to problems like eutrophication, or excess algae production that can lead to harmful algal blooms similar to red tides."

From an historical perspective, the hippos offer a rare opportunity to study the types of massive animals that have largely disappeared in the Americas and their influence on new ecosystems. They provide a snapshot of a period thousands of years ago when gigantic creatures such as mammoths and mastodons roamed the Americas.

From a practical viewpoint, the new study provides Colombian officials with scientific evidence of how the hippos are disrupting the area's aquatic ecosystems. The researchers estimate that the hippo population will continue to grow dramatically in the years ahead. Their growth spurs many new questions such as how the expanding population interacts with local animals, including manatees, caymans and giant river turtles that inhabit nearby rivers.

Hippos tend to be extremely difficult to catch and can be very dangerous to confront.

"If you plot out their population growth, we show that it tends to go exponentially skyward," said Shurin. "In the next couple of decades there could be thousands of them. This study suggests that there is some urgency to deciding what to do about them. The question is: what should that be?"

The answer, according to Shurin, is much easier to address when there are 80 hippos rather than thousands.
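For a sense of the numbers: the release reports only a founding population of four and a current estimate of more than 80 animals. The short projection below assumes a roughly 26-year window to back out an exponential growth rate, so both the rate and the horizon are illustrative assumptions rather than figures from the study.

```python
# Exponential growth projection, N(t) = N0 * exp(r * t). The ~26-year window
# assumed here (and therefore the fitted rate) is an illustrative assumption;
# the release only reports growth from 4 founders to more than 80 animals.
import math

n0, n_now, years_elapsed = 4, 80, 26
r = math.log(n_now / n0) / years_elapsed     # implied growth rate, ~0.12 per year

for years_ahead in (10, 20, 30):
    print(f"in {years_ahead} years: ~{n_now * math.exp(r * years_ahead):.0f} hippos")
# Consistent with the article's point that there could be thousands within decades.
```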

Credit: 
University of California - San Diego

Nitrogen fertilizers finetune composition of individual members of the tomato microbiota

After conducting a field trial at a tomato farm near Ravenna, Italy, a team of plant pathologists and agronomists found that nitrogen fertilizers shape the composition and predicted functions of the plant microbiota. The microbiota refers to the community of microorganisms found in the interface between the soil and the roots of a plant. Similarly to the human digestive tract, the microbiota can help or hinder the plant's nutrition as it is responsible for the uptake of minerals from the soil.

Nitrogen is one of the most important nutrients, as it is a key component of healthy crop production globally. Because the microbiota is crucial to the plant's ability to take in nitrogen, scientists are very interested in identifying ways to ensure this transfer.

Using state-of-the-art DNA sequencing technologies in the tomato fields, the research team involving the University of Dundee and the University of Modena and Reggio Emilia was able to make two striking observations about what happens to tomato microbiota when it is subjected to nitrogen fertilizers. First, they discovered the tomato microbiota is a gated community.

"Not all the microbes can proliferate in the same manner in the thin layer of soil surrounding plant roots, what scientists call the rhizosphere, and within root tissues," explained Davide Bulgarelli, a plant pathologist from the University of Dundee. "Interestingly, members of the bacterial phylum Actinobacteria appeared to be particularly keen on colonizing the interior of tomato roots."

Secondly, the application of different nitrogen fertilizers fine-tunes the composition of individual members of the microbiota. These microbes are likely to be key targets in optimizing the process of plant nutrition and enhancing global food security.

"It really seems that the diet, here represented by the different nitrogen treatments, is one of the key determinants of the plant microbiota," Bulgarelli concluded. "Knowing that plants have different microbial needs when subjected to different diets will help us identify the most effective inoculants for a given scenario."

Also of note, the lab-in-the-field approach used will likely expedite the translational applications of these findings for those focused on global food security.

For more information about this research, read "Nitrogen Fertilizers Shape the Composition and Predicted Functions of the Microbiota of Field-Grown Tomato Plants" published in the November issue of the open access Phytobiomes Journal.

Credit: 
American Phytopathological Society

The Global Reef Expedition: Kingdom of Tonga

image: The Khaled bin Sultan Living Oceans Foundation conducted a major research study to assess the status of corals and reef fish in the Kingdom of Tonga.

Image: 
©KSLOF/Ken Marks.

The Khaled bin Sultan Living Oceans Foundation has published their findings from extensive coral reef surveys conducted in the Kingdom of Tonga. Released today, the Global Reef Expedition: Kingdom of Tonga Final Report contains critical information on the health and resiliency of coral reef ecosystems in Tonga, and provides scientists, policymakers, and stakeholders with invaluable information they can use to protect these fragile marine ecosystems.

"Since Tonga is a developing country, about 66% of the total population's livelihood depends upon seafood," said Apai Moala, a Senior Geologist Assistant at the Ministry of Lands & Natural Resources (MLNR) who participated in the research mission. "I do believe that with more information and better understanding by the people about the importance of marine life, we can minimize destructive activities and negative effects that might happen to reefs in the near future."

In 2013, scientists on the Global Reef Expedition--the largest coral reef survey and mapping expedition in history--came to Tonga to work with local partners and government officials to assess the health of the reef. Representatives from the Ministry of Lands, Environment, Climate Change and Natural Resources; the Ministry of Agriculture, Forests and Fisheries; and the Vava'u Environmental Protection Association (VEPA) joined over a dozen scientists from around the world on the month-long research mission in Tonga, and helped the Foundation conduct numerous coral reef outreach and education activities with local schoolchildren and community members.

On the Global Reef Expedition mission to Tonga, scientists conducted nearly 500 surveys of coral reefs and reef fish around three of Tonga's island groups: Ha'apai, Vava'u, and Niua. They also collected over 2,200 km2 of satellite imagery to develop detailed habitat and bathymetric maps of the seafloor.

The report released today summarizes the Foundation's findings from the research expedition along with recommendations for preserving Tonga's coral reefs into the future.

They found that coral reefs in the Kingdom of Tonga were moderately healthy, but the reef fish and invertebrate communities were in need of attention. At the time the surveys were conducted, some reefs had lower coral cover than expected, but it was the fish communities that were of greatest concern to the scientists. Although there were many kinds of fish seen on reefs in Tonga, the fish were small. Few large and commercially valuable fish remained. But with continued fisheries management, there is hope these reefs can recover.

"The coral reef fish communities we observed in Tonga were dominated by small fish considered low on the food chain, raising concern for the long-term sustainability of the fishery," said Renee Carlton, author and Marine Ecologist for the Foundation. "We saw very few fish that are particularly important to local fishers, such as parrotfish, emperors, snappers, and groupers. Our findings highlight the importance of marine conservation already happening in Tonga, and expanding management efforts to even in the most remote regions of the country."

The report commends the Kingdom of Tonga for the substantial work they have done in establishing Specially Managed Areas (SMAs) and Fish Habitat Reserves (FHRs) and recommends they continue and expand these efforts. Some of the specific recommendations made in the report include providing education on the importance of establishing SMAs and the benefits they provide to local fishermen, improving documentation of fish catch, encouraging sustainable fishing practices in Niuatoputapu, and protecting the reefs surrounding the northern Niua islands from larger fishing vessels. With the establishment of additional SMAs and FHRs, and continued enforcement of those that already exist, Tonga's fisheries resources can be protected and sustainably used by the people of Tonga for generations to come.

Alexandra Dempsey, Director of Science Management for the Foundation and one of the report's authors, acknowledges that Tonga's reefs may have changed since the Expedition, but the report still provides valuable data on the state of the reefs at a point in time. "Our hope is that the data included in this report will be used by the people of Tonga to help protect their coral reefs."

Credit: 
Khaled bin Sultan Living Oceans Foundation

Robot sweat regulates temperature, key for extreme conditions

ITHACA, N.Y. - Just when it seemed like robots couldn't get any cooler, Cornell University researchers have created a soft robot muscle that can regulate its temperature through sweating.

This form of thermal management is a basic building block for enabling untethered, high-powered robots to operate for long periods of time without overheating, according to Rob Shepherd, associate professor of mechanical and aerospace engineering at Cornell, who led the project.

The team's paper, "Autonomic Perspiration in 3D Printed Hydrogel Actuators," published in Science Robotics.

One of the hurdles for making enduring, adaptable and agile robots is managing the robots' internal temperature, according to Shepherd. If the high-torque density motors and exothermic engines that power a robot overheat, the robot will cease to operate.

This is a particular issue for soft robots, which are made of synthetic materials. While more flexible, they hold their heat, unlike metals, which dissipate heat quickly. An internal cooling technology, such as a fan, may not be much help because it would take up space inside the robot and add weight.

So Shepherd's team took inspiration from the natural cooling system that exists in mammals: sweating.

"The ability to perspire is one of the most remarkable features of humans," said co-lead author T.J. Wallin, a research scientist at Facebook Reality Labs. "Sweating takes advantage of evaporated water loss to rapidly dissipate heat and can cool below the ambient environmental temperature. ... So as is often the case, biology provided an excellent guide for us as engineers."

Shepherd's team partnered with the lab of Cornell engineering professor Emmanuel Giannelis, to create the necessary nanopolymer materials for sweating via a 3D-printing technique called multi-material stereolithography, which uses light to cure resin into predesigned shapes.

The researchers fabricated fingerlike actuators composed of two hydrogel materials that can retain water and respond to temperature - in effect, "smart" sponges. The base layer, made of poly-N-isopropylacrylamide, reacts to temperatures above 30 C (86 F) by shrinking, which squeezes water up into a top layer of polyacrylamide that is perforated with micron-sized pores. These pores are sensitive to the same temperature range and automatically dilate to release the "sweat," then close when the temperature drops below 30 C.

The evaporation of this water reduces the actuator's surface temperature by 21 C within 30 seconds, a cooling process that is approximately three times more efficient than in humans, the researchers found. The actuators are able to cool off roughly six times faster when exposed to wind from a fan.
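As a rough sense of scale for the reported 21 C drop, the heat removed by evaporation equals the mass of water evaporated times the latent heat of vaporization. The actuator mass and heat capacity below are assumed values for illustration, not measurements from the paper.

```python
# Back-of-the-envelope evaporative-cooling estimate. Latent heat of water is
# roughly 2.4e6 J/kg near 30 C; the actuator mass and specific heat are
# assumed values, not figures from the study.
LATENT_HEAT = 2.4e6        # J per kg of water evaporated
actuator_mass = 0.010      # kg, an assumed ~10 g hydrogel finger
specific_heat = 4000.0     # J/(kg*K), hydrogels are mostly water

delta_t = 21.0             # reported surface-temperature drop, in C
heat_removed = actuator_mass * specific_heat * delta_t
water_evaporated = heat_removed / LATENT_HEAT

print(f"heat removed: {heat_removed:.0f} J")
print(f"water evaporated: about {water_evaporated * 1000:.2f} g")
```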

One disadvantage of the technology is that it can hinder a robot's mobility. There is also a need for the robots to replenish their water supply, which has led Shepherd to envision soft robots that will someday not only perspire like mammals, but drink like them, too.

Credit: 
Cornell University

Medicaid expansion reduces cancer, saves Black lives

Expanding Medicaid in North Carolina could sharply lessen the burden of colon cancers in the state and save the lives of thousands of Black men, as well as improve access to care for men of all races, researchers report in the 27 January issue of PLOS ONE.

Cancers of the colon and rectum kill tens of thousands of people in the US each year and are particularly common among African American men. For every hundred thousand residents of the US in 2016, 37 cases of colorectal cancer were reported, and 14 deaths. For Black men, the numbers were higher: 49 cases per hundred thousand. And the rates of colon and rectal cancers are higher in Appalachia and the South compared to other parts of the country.

North Carolina, a southern state that extends into Appalachia, declined to expand access to Medicaid as part of the Affordable Care Act. That law, also known as Obamacare, not only required states to set up health insurance exchanges but also gave states the opportunity to use federal money to extend Medicaid to anyone with income up to 138% of the federal poverty level. Previously, Medicaid primarily covered poor women and children. Expanded Medicaid also covered men of working age, if their income was below the threshold.

Researchers at UConn Health and University of North Carolina (UNC) saw an opportunity to use North Carolina as a natural experiment to test whether the expansion of Medicaid could have actually reduced illness--and whether it would have saved money for the state in the long term.

"If we had expanded Medicaid in North Carolina, could we have saved lives?" That was the question Wizdom Powell, director of the UConn Health Disparities Institute, wanted to answer. Powell, UNC systems scientist Leah Frerichs, and their colleagues at UNC created a population model that simulated all the African American and white men in North Carolina, based on county by county demographics. Every individual in the model (more than 338,000 Black men and 1,496,000 whites) was assigned a probability of colon cancer screening, based on real world statistics related to income, race and neighborhood data.

They then ran the model under several different scenarios. One, in which North Carolina set up a health insurance exchange but did not expand Medicaid, mimicked what actually happened. And the rates of sickness and death caused by colon cancer in that scenario match what has actually happened in North Carolina in recent years, giving the researchers confidence the model is accurate.

Other scenarios looked at what would have happened if North Carolina had both expanded Medicaid and set up a health insurance exchange.
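This is not the published model, but the individual-based idea can be sketched in a few lines: each simulated man carries a screening probability that differs by insurance scenario, and colorectal-cancer deaths are tallied across the population. Every probability below is a placeholder, not one of the study's inputs.

```python
# Minimal sketch of an individual-based simulation (not the published model):
# screening probability differs by scenario, and colorectal-cancer deaths are
# tallied across simulated men. All probabilities are placeholder values.
import random

def simulate(n_men, p_screen, p_cancer=0.05, p_death_screened=0.2, p_death_unscreened=0.5):
    deaths = 0
    for _ in range(n_men):
        screened = random.random() < p_screen
        if random.random() < p_cancer:                    # develops colorectal cancer
            p_death = p_death_screened if screened else p_death_unscreened
            deaths += random.random() < p_death
    return deaths

random.seed(42)
n = 100_000
print("no expansion:  ", simulate(n, p_screen=0.45), "deaths")
print("with expansion:", simulate(n, p_screen=0.60), "deaths")
```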

"And we saw that if we had done that, we would have saved hundreds of Black male lives--and increased cancer screening among both Black and white men," Powell says.

"It was enlightening to see the impact on disparities," Frerichs says. And she points out how much the state would gain, for little cost. Initially, by expanding Medicaid the state of North Carolina would pay a couple dollars more for every African American man in 2018, but it would save $5.1 million in cumulative cost savings by 2044. And the state would save $9.6 million in cumulative savings for white males.

The researchers are expanding the model to encompass all adults of all races in the state, and are looking at other questions, such as the effects of proposals like Medicare for all.

Credit: 
University of Connecticut