Tech

'Water wires' may play bigger role in cellular function

TALLAHASSEE, Fla. -- Each of our cells is surrounded by a complex membrane that functions as a biological border, letting ions and nutrients such as salt, potassium and sugar in and out. The guards are membrane proteins, which do the hard work of permitting or blocking the traffic of these molecules.

Strings of bonded water molecules, called water wires, play an important role in this process that was thought to be well understood. Now, a team at the Florida State University-headquartered National High Magnetic Field Laboratory (MagLab) is upending decades-long assumptions about how they actually interact with proteins.

Their paper was published today in the Proceedings of the National Academy of Sciences.

While scientists knew water wires played a role in conducting nutrients across the cell membrane, they vastly underestimated their interactions with the membrane channel. This finding has widespread ramifications, researchers said, calling into question existing models of how water behaves inside other proteins.

"That's where this gets really interesting from a biological perspective," said corresponding author Tim Cross, director of the Tallahassee-based nuclear magnetic resonance (NMR) facility at the National MagLab and the Robert O. Lawton Professor of Chemistry. "Now we understand that those interactions between the water and the protein's oxygen atoms lining the pore are going to be much stronger than anyone has anticipated. And that's going to influence how these proteins function."

The work is also important, Cross added, because it showcases how a unique, world-record magnet, known as the Series Connected Hybrid (SCH), is giving scientists access to new details about proteins and other biological systems.

Their study focused on gramicidin A, an antibiotic peptide (or small protein) that is shaped like a helix. Two of these molecules stacked one atop the other create a narrow channel in some cell membranes through which ions can pass in and out. An eight-molecule-long water wire spanning the length of the channel acts as a kind of lubricant in this process. The hydrogens in those water molecules bond with some of the oxygen atoms in the gramicidin that encircles them. The orientations of the water wire molecules were thought to flip extremely quickly, binding and unbinding with oxygen atoms in the gramicidin A many times per nanosecond.

However, when the MagLab team took a closer look at this system, they discovered something that called that prevailing view into question. Their first clue came about two years ago, when Joana Paulino, then a postdoctoral researcher at the MagLab working with Cross, put some specially treated gramicidin A into the SCH and ran some NMR experiments.

Scientists use NMR machines to better understand the structure and function of complex molecules like proteins and viruses. They can tune the machine to identify, for example, all the sodium atoms in a sample and their orientations vis-à-vis other atoms. Each atom sends a tell-tale signal back to the machine.

But some atoms are easier to detect by NMR than others. Oxygen, for example, is quite hard to see. So, until recently, one of the most biologically active atoms in the body was all but invisible to NMR. Thanks in part to the 36-tesla field it generates (the tesla is a unit of magnetic field strength), the SCH can "see" oxygen.

The specific gramicidin samples Paulino was looking at had already been studied in depth years earlier in another powerful NMR magnet at the MagLab. Cross established his career with his work on gramicidin, known to be a perfectly symmetrical structure: The last thing he expected was a surprise.

The gramicidin sample was made up of two identical, stacked, helical molecules. Paulino examined the exact same oxygen atom on both, hoping the more sensitive SCH would detect a clearer signal from those two atoms than had been previously observed.

But she didn't see just one oxygen signal: She saw two.

At first blush, the results seemed to suggest something amiss with the model of a perfectly symmetrical gramicidin A -- the model that had earned Cross his tenure. His immediate reaction to Paulino's measurements was, "Well, that must be wrong."

His next thought: "Or, this could be something very interesting."

Repeated experiments showed Paulino's first result was indeed correct -- but not because the molecules were asymmetrical. Rather, the SCH was so sensitive that it detected one signal from a gramicidin oxygen that was bound to the water wire, and a separate signal from a gramicidin oxygen that was not bound to the wire.

The team spent years conducting more experiments to make sure they understood what they were seeing.

"Every time we ran a sample of gramicidin labeled at a different oxygen site and we saw two peaks, we did a little dance," said Paulino, lead author on the paper and now a postdoctoral scholar in biochemistry and biophysics at the University of California at San Francisco.

The fact that the SCH was able to detect the signal of the bound oxygen, the researchers determined, meant that the interactions between the water wire and the pore wall of the gramicidin A were much stronger and longer-lasting -- more than a million times longer, in fact, than scientists had believed.

"The energies associated with the process are clearly different than what was imagined," Cross said. "So, we need to go back now and take a look at the energetics and how these water wires actually function."

The findings are relevant for many other membrane proteins that feature water wires.

"The excitement now is to really start thinking about all of these other water wires in proteins that conduct ions that are essential for life," Cross said, "and to understand how this is going to influence those interactions and conductance rates."

The findings are likely to ruffle some scientific feathers because they contradict computational models of the molecular dynamics of water wires that have been accepted for decades, Cross said.

"Scientists have a pretty good understanding of a lot of things," Cross explained. "But every once in a while, something comes out of the blue and forces us to rethink things. There's nothing out there that would at all hint that there was a problem with those computational studies -- until this."

Credit: 
Florida State University

New paper helps advance myopia management strategies

image: Mark Bullimore, MCOptom, PhD, FAAO, is a scientist, speaker, and educator renowned for his expertise in myopia, contact lenses, low vision, presbyopia, and refractive surgery. His new literature analysis published in Ophthalmic & Physiological Optics gives eye care practitioners a comprehensive analysis of evidence-based information needed to help manage myopia.

Image: 
(c) Dr. Mark Bullimore

SAN RAMON, Calif., May 12, 2020--A new literature analysis published in Ophthalmic & Physiological Optics, the peer-reviewed journal of The College of Optometrists, gives eye care practitioners (ECPs) a comprehensive analysis of evidence-based information needed to help manage myopia.

Written by Dr. Mark Bullimore and Dr. Kathryn Richdale, "Myopia Control 2020: Where are we and where are we heading?" presents a range of critically evaluated safety and efficacy considerations for behavioral, optical and pharmaceutical myopia management pathways. The paper is available via open access.

The authors note the paper seeks to present a snapshot of the rapid evolution of the field, addressing multiple questions that ECPs may have. These include not only who to manage, but also relative strengths of various methodologies and when to modify or stop care. The review also discusses potential future avenues for myopia management, including a continuum of care starting with the delay of onset followed by individual or combination therapies to slow myopia progression.

"Management of an individual child should be underpinned by the evidence-based literature and clinicians must stay alert for ongoing myopia research. [This] will undoubtedly result in the evolution of the standard of care for the myopic and pre-myopic child," write Bullimore and Richdale.

The review supports a call for ophthalmology and optometry to determine a collaborative framework and referral patterns in the interest of prevention, education, and proactively addressing associated pathology. This includes working with the pediatrics and school communities to reach a broad population with high impact.

The paper was supported by an educational grant from CooperVision.

"We hope this comprehensive review reaches ECPs to offer sound, science-backed evaluation that can help advance myopia management strategies in practices worldwide," said James Gardner, Vice President of Global Myopia Management at CooperVision. "Clinical education plays an important role in our efforts to grow the category, alongside ongoing research, groundbreaking products such as our MiSight® 1 day contact lenses, advocacy and corporate social responsibility initiatives."

Mark Bullimore, MCOptom, PhD, FAAO, is a scientist, speaker, and educator renowned for his expertise in myopia, contact lenses, low vision, presbyopia, and refractive surgery. He spent most of his career at the Ohio State University and the University of California at Berkeley and is now Adjunct Professor at the University of Houston. Dr. Bullimore is Associate Editor of Ophthalmic and Physiological Optics and the former Editor of Optometry and Vision Science.

Kathryn Richdale, OD, PhD, FAAO, was founding director of the Clinical Vision Research Center, and established the Myopia Control Clinic at the State University of New York before joining the University of Houston as a tenured Associate Professor in 2017. She conducts research in the areas of cornea, contact lenses and refractive error. Dr. Richdale is also an attending doctor in the Cornea and Contact Lens Clinic and co-director of the Myopia Management Service.

Credit: 
McDougall Communications

Illuminating the impact of COVID-19 on hospitals and health systems

image: A FAIR Health Brief, May 12, 2020.

Image: 
www.fairhealth.org

NEW YORK, NY--May 12, 2020--In the third week of March 2020, as the COVID-19 pandemic escalated, large hospitals in the Northeast experienced a 26 percent decline in average per-facility revenues based on estimated in-network amounts as compared to the same period in 2019. Nationally, the decrease in revenue for large hospitals was 16 percent. These are among the findings of FAIR Health's second COVID-19 study, Illuminating the Impact of COVID-19 on Hospitals and Health Systems: A Comparative Study of Revenue and Utilization.

The third week of March 2020 was the week when thousands of new COVID-19 cases became commonplace in certain parts of the United States, particularly in the Northeast. Hospitals and health systems underwent financial strain as many elective procedures were deferred. FAIR Health's new brief illuminates the financial impact on hospitals by comparing revenues based on estimated in-network amounts on private insurance claims submitted by facilities in the first quarter (January to March) of 2020 with the first quarter of 2019 (adjusted by Consumer Price Index). The first quarter is analyzed month by month, and March is analyzed week by week. Also studied are discharge volume, settings, and diagnoses and procedures.

The study was based on claims data received by April 30, 2020, which meant some claims for services during the period examined were incurred but not reported (IBNR)--valid claims for covered services that had been performed but not yet reported to the insurer. For that reason, the 2019 claims used for the study were limited to those received by April 30, 2019, to produce an "apples to apples" comparison. Notwithstanding the IBNR issue, FAIR Health found that the impact of COVID-19 on hospitals was already substantial and of such public health relevance that it deemed it worthwhile to issue this report. FAIR Health will continue to monitor the data volume in the coming weeks.
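
To make the comparison concrete, the sketch below shows how a year-over-year percent change in per-facility revenue can be computed once the prior-year amount has been adjusted by the Consumer Price Index. The revenue figures and the CPI factor are hypothetical placeholders for illustration only, not FAIR Health's data or methodology details.

```python
# Illustrative year-over-year comparison of the kind described in the brief.
# All numbers are hypothetical placeholders, not FAIR Health data; the brief
# adjusts 2019 amounts by the Consumer Price Index (CPI).

def yoy_pct_change(revenue_2020, revenue_2019, cpi_factor):
    """Percent change of 2020 revenue versus CPI-adjusted 2019 revenue."""
    adjusted_2019 = revenue_2019 * cpi_factor
    return 100.0 * (revenue_2020 - adjusted_2019) / adjusted_2019

# Hypothetical average per-facility revenues (dollars) and a ~2% CPI adjustment:
print(round(yoy_pct_change(4.2e6, 4.4e6, 1.02), 1))  # about -6.4 percent
```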

Findings include:

In general, there was an association between larger hospital size and greater impact from COVID-19. Nationally, in large facilities (over 250 beds), average per-facility revenues based on estimated in-network amounts declined from $4.5 million in the first quarter of 2019 to $4.2 million in the first quarter of 2020. The gap was less pronounced in midsize facilities (101 to 250 beds) and not evident in small facilities (100 beds or fewer).

March was the month when COVID-19 had its greatest impact in the first quarter of 2020. Nationally, in that month, in midsize facilities, the decrease in average per-facility revenues based on estimated in-network amounts in 2020 from 2019 was four percent; in large facilities, five percent.

Facilities in the Northeast experienced a greater impact from COVID-19 than those in the nation as a whole. For example, in the Northeast, the decline in average per-facility revenues based on estimated in-network amounts in March 2020 from March 2019 was five percent for midsize facilities, nine percent for large ones.

Both nationally and in the Northeast, the decrease in facility discharge volume (i.e., patient discharges) from March 2019 to March 2020 was greater on a percentage basis than the decrease in revenues based on estimated in-network amounts. For example, in large facilities nationally, the drop in volume was 32 percent; in the Northeast, 40 percent.

Nationally, the decrease in facility discharge volume in the third week of March 2020 from the corresponding week in 2019 grew significantly compared to the first two weeks; it also appears greater than the decrease in the fourth week. But in the Northeast, in midsize facilities, the fourth week of March had a greater drop (34 percent) than the third week (30 percent).

From March 2019 to March 2020, the outpatient share of the distribution of estimated in-network amounts by settings decreased relative to the inpatient share. The effect was more pronounced in the Northeast than nationally.

The third and fourth weeks of March 2020, compared to the corresponding period in 2019, saw several changes in the most common diagnostic categories in the inpatient and ER settings. Nationally and in the Northeast, in the inpatient setting, diseases and disorders of the respiratory system rose in share of distribution by volume and estimated in-network dollars, while in the ER setting, acute respiratory diseases and infections rose.

FAIR Health President Robin Gelburd stated: "With this second study, we again use our data repository to shed light on the impact of COVID-19. As the pandemic continues to test the entire healthcare system, FAIR Health seeks to provide data and analysis to support all the system's participants."

Credit: 
FAIR Health

Researchers develop new drugs for treating polycystic hepatorenal diseases

image: Microscopic view of cells affected by non-alcoholic fatty liver disease.

Image: 
UPV/EHU

Polycystic hepatorenal diseases are hereditary, genetic disorders characterised by the progressive development of multiple symptomatic cysts in the kidneys and/or liver which may cause alterations in the function of these organs and/or complications associated with their growth. Right now, there are no effective pharmacological treatments and the only curative option is organ transplant.

Researchers at the University of the Basque Country (UPV/EHU) led by Dr Fernando Cossío, scientific director of Ikerbasque, and in the Liver Diseases Group at the Biodonostia Institute of Health Research, led by the Ikerbasque research fellow Dr Jesús M. Bañales, have collaborated on the development of new drugs that have proven to be effective in reducing the growth of hepatic and renal cysts in experimental models of this disease, which could be of huge clinical significance. Researchers at the University of Salamanca, led by Dr José J. G. Marín, at the Idibell Institute of Catalonia, led by Dr Manel Esteller, and at the Hormel Institute of Minnesota (USA), headed by Dr Sergio Gradilone, have collaborated in this multidisciplinary project led by the two Basque institutions.

The drugs developed are based on the structure of ursodeoxycholic acid (UDCA), a bile acid present in the body at low concentration that has protective properties for the liver. In fact, its administration is recommended for treating specific liver diseases. The researchers drew on the structure and properties of this molecule to design and synthesise a family of chemical derivatives aimed at inhibiting a key protein responsible for driving the growth of hepatorenal cysts. The published results show that these new drugs are capable of blocking the growth of hepatic and renal cysts in an animal model of the disease.

The promising therapeutic effects of these new drugs have sparked great interest in the international scientific community. So much so that the work was selected for oral presentation at the International Congress of the EASL, the European Association for the Study of the Liver, held in Vienna. It has also been published recently in the prestigious international journal Hepatology (official journal of the American Association for the Study of Liver Diseases), and has given rise to the PhD thesis of Dr Francisco J. Caballero (UPV/EHU, Biodonostia), lead author of the work, supervised by Dr Fernando Cossío (UPV/EHU) and Dr Jesús M. Bañales (Biodonostia, Ikerbasque).

This project recently received the FIPSE national award for Innovation (Ministry of Science and Innovation), which enabled the drugs to be patented; they have been licensed to the Basque company ATLAS Molecular Pharma so that their clinical study can be pursued. The project has also received competitive funding from the RIS3 Euskadi Programme, the Ministry of Science and Innovation, the Carlos III Institute of Health (ISCIII) and Ikerbasque.

Credit: 
University of the Basque Country

Researchers ID target for colorectal cancer immunotherapy

image: From left, Xiongbin Lu, PhD, and Sophie Paczesny, MD, PhD

Image: 
IU School of Medicine

Researchers at the Indiana University Melvin and Bren Simon Comprehensive Cancer Center have identified a target for colorectal cancer immunotherapy.

Immunotherapy uses the body's immune system to target and destroy cancer cells. Considered the future of cancer treatment, immunotherapy is less toxic than chemotherapy. Colorectal cancer is the third most common cancer among men and women, yet chemotherapy remains the standard of care as limited numbers of patients respond to current immunotherapy treatment options.

The findings published May 7 in JCI Insight could provide additional treatments for a larger number of colorectal cancer patients via a new immunotherapy pathway. Researchers identified ST2 as a novel checkpoint molecule that could help T cells become more effective.

The research is a collaboration between IU School of Medicine cancer researchers Xiongbin Lu, PhD, Vera Bradley Foundation Professor of Breast Cancer Innovation and of Medical and Molecular Genetics, and Sophie Paczesny, MD, PhD, Nora Letzter Professor of Pediatrics and of Microbiology and Immunology.

Immune checkpoints are an essential part of the immune system, with the role of preventing immune cells from destroying healthy cells. T cells are immune system cells that attack foreign invaders such as infections and can help fight cancer. But cancer is tricky, and the tumor microenvironment often finds ways to prevent T cells from attacking cancer cells, including by activating checkpoint molecules.

Within the tumor microenvironment, the body's immune system knows something is wrong and sends a stress signal such as the alarmin IL-33, which brings in immune cells called macrophages that express ST2 (the receptor for IL-33) to help. What is at first a "good" response is quickly overwhelmed and the macrophages become the enemy in fighting colon cancer.

Using patient tumor genetic data, the authors found that T-cell functionality, one of the key factors in fighting the cancer through adaptive immune responses, is reduced in patients displaying high ST2 levels. Using tumor tissue samples from the IU Simon Comprehensive Cancer Center tissue bank, the researchers found abundant expression of ST2 in macrophages in tumor tissue from early- to late-stage colorectal cancer.

"In all of the patient samples, we were able to identify ST2 expressing macrophages, which would potentially mean that targeting these ST2 macrophages would be relevant to the patients," Kevin Van der Jeught, PhD, said. Van der Jeught is a post-doctoral researcher in Lu's lab and first author of this study.

In preclinical mouse models, researchers found that by targeting the ST2-expressing macrophages, they were able to slow tumor growth. By depleting these inhibitory cells, the T cells became more active in fighting cancer.

Paczesny, a research collaborator and scientist at the Herman B Wells Center for Pediatric Research, previously led the research that discovered ST2; that work is the subject of her National Cancer Institute "Cancer Moonshot" grant focusing on immunotherapy for pediatric acute myeloid leukemia (AML). While leukemia and colorectal cancer are very different diseases, researchers have found commonality and collaboration in the ST2 protein.

"This research is bringing together the pathway in two different diseases," Paczesny said.

Lu's research focuses on cancer cell biology in diseases such as triple negative breast cancer and colorectal cancer.

"We have to develop new tools and new approaches for solid tumors, and this is the kind of collaboration we need for advancing future treatments," Lu said. Researchers from two other institutions, the University of Maryland's Marlene and Stewart Greenebaum Comprehensive Cancer Center and the VIB-UGent Center for Inflammation Research in Belgium, have contributed to this publication.

Researchers are also exploring combination therapy with existing immunotherapies, such as PD-1 checkpoint inhibitors, which boost T cells directly, while targeting ST2 on macrophages frees T cells to act by depleting those inhibitory cells.

"Potentially through a combination of two checkpoints at work on different immune cells, we could enhance the current response rates," Van der Jeught said.

The researchers plan to explore these findings further and pursue the development of ST2 for cancer immunotherapy.

Credit: 
Indiana University School of Medicine

Street smarts required in heat mitigation

image: This is Ariane Middel with a MaRTy unit on ASU's Tempe campus.

Image: 
Photo by Ken Fagan/ASU Now

One day last July, Ariane Middel and two other Arizona State University researchers headed west on Interstate 10. Squeezed inside their van were MaRTy 1 and MaRTy 2, mobile biometeorological instrument platforms that can tell you exactly what you feel in the summer heat. All five were destined for Los Angeles.

The researchers and their colleagues were headed to L.A. to start investigating how solar reflective coatings on select city streets affected radiant heat and, in turn, pedestrians' comfort on a typical summer day.

The Los Angeles Bureau of Street Services has pioneered the use of solar reflective coatings in a quest to cool city streets.

The idea is that coating a street in a color lighter than traditional black asphalt will lower the surrounding temperatures.

But Middel and her collaborators now wanted to see what effect reflective coating had on pedestrians.

"If you're in a hot, dry and sunny climate like Phoenix or L.A., the mean radiant temperature has the biggest impact on how a person experiences the heat," explains Middel, assistant professor in the ASU School of Arts, Media and Engineering and a senior sustainability scientist in the Julie Ann Wrigley Global Institute of Sustainability. "The mean radiant temperature is essentially the heat that hits the human body. It includes the radiation from the sun, so if you are standing in direct sunlight you will feel much hotter than in the shade."

Thanks to remote-sensing satellites, decades of data exist on the Earth's land surface temperature; that is, how hot a single point on the Earth's surface would feel to the touch. But that data should not be confused with near-surface ambient and radiant temperature, the heat that humans and animals "experience," said Middel, lead author of the study and director of ASU's SHaDE Lab, which stands for Sensable Heatscapes and Digital Environments.

The researchers' study is the first to measure the thermal performance of solar reflective coatings using instruments that sense meteorological variables relevant to a pedestrian's experience: radiant heat, ambient temperature, wind and humidity.

The researchers focused on two variables, surface temperature and radiant temperature over highly reflective surfaces. They took MaRTy 1 and 2 on hourly strolls through a Los Angeles neighborhood to measure a pedestrian's heat exposure over regular asphalt roads, reflective coated roads and sidewalks next to the roads.

MaRTy, which stands for mean radiant temperature, looks like a weather station in a wagon. The station measures the total radiation that hits the body, including sunlight and the heat emitted from surfaces like asphalt.
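
For readers curious how a number like mean radiant temperature is derived from such measurements, the sketch below follows the commonly used six-directional (integral radiation) approach. The absorption coefficients, angular weights and example fluxes are typical textbook values assumed for illustration; they are not taken from this study or from MaRTy's actual processing pipeline.

```python
# Sketch of estimating mean radiant temperature (MRT) from six-directional
# radiation measurements. Coefficients and fluxes below are assumed typical
# values for illustration, not values from the ASU/UCLA study.

SIGMA = 5.67e-8         # Stefan-Boltzmann constant, W m^-2 K^-4
A_SW, A_LW = 0.7, 0.97  # assumed short- and longwave absorption of a person
# Angular weights for a standing person: four lateral directions, up, down
WEIGHTS = [0.22, 0.22, 0.22, 0.22, 0.06, 0.06]

def mean_radiant_temperature(shortwave, longwave):
    """shortwave/longwave: six directional flux densities in W/m^2,
    ordered as four lateral directions, then upward, then downward."""
    absorbed = sum(w * (A_SW * k + A_LW * l)
                   for w, k, l in zip(WEIGHTS, shortwave, longwave))
    return (absorbed / (A_LW * SIGMA)) ** 0.25 - 273.15  # degrees Celsius

# Example with made-up midday fluxes over a bright, reflective surface:
print(round(mean_radiant_temperature(
    [300, 150, 150, 150, 800, 250],    # W/m^2 shortwave from six directions
    [450, 450, 450, 450, 400, 550]), 1))
```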

The study showed that the surface temperature of the coated asphalt road was up to 6 degrees Celsius cooler than the regular road in the afternoon. However, the radiant heat over the coated asphalt was 4 degrees Celsius higher than over non-coated areas, largely negating the benefit for pedestrians.

"So, if you're a pedestrian walking over the surface, you get hit by the shortwave radiation reflected back at you," Middel said.

The study also found that the coating had little effect on air temperature, lowering it by only about half a degree Celsius in the afternoon and 0.1 degrees Celsius at night.

The upshot, said V. Kelly Turner, assistant professor of urban planning at UCLA and the study's co-author, is that to cool off cities, urban climatologists and city planners need to focus on different solutions or combinations of solutions depending on a desired goal.

"The solutions are context dependent and depend on what you want to achieve," Turner explained.

A solution that addresses surface temperature is not necessarily suited to the reduction of building energy use. For example, if you want cooler surface temperatures on a playground because children are running across its surface, a reflective coating would be best. But if you want to reduce the thermal load on people, planting trees or providing shade would be more effective.

But what happens if you combine trees with cool pavement? Does the cool pavement lose its ability to reduce surface temperature? Or perhaps the cool pavement is costly to maintain when the trees drop their leaves?

"So, reflective coating is not a panacea," Turner said. "It's one tool."

It should also be noted that temperature is a multifaceted measurement of heat. Surface temperature, ambient temperature and mean radiant temperature are distinct from one another and require distinct solutions when it comes to mitigating heat.

"We need more of these experiments," Middel said. "There have been a lot of large-scale modeling studies on this. So, we don't know in real life if we get the same effects. The urban environment is so complex, and models have to always simplify. So, we don't know what really happens on the ground unless we measure, and there haven't been these types of measurements in the past."

The researchers report their findings of the Los Angeles study in, "Solar reflective pavements -- A policy panacea to heat mitigation?" which was published on April 8, 2020 in the journal Environmental Research Letters. Co-authors on the paper include Florian Schneider and Yujia Zhang of ASU, and Matthew Stiller of Kent State University.

Credit: 
Arizona State University

On the road to non-toxic and stable perovskite solar cells

image: The illustration shows the changes in the structure of FASnI3:PEACl films during treatment at different temperatures.

Image: 
HZB/Meng Li

Among the new materials for solar cells, the halide perovskites are considered particularly promising. Within a few years, the efficiency of such perovskite solar cells has risen from a few percent to over 25 percent. Unfortunately, the best perovskite solar cells contain toxic lead, which poses a hazard to the environment. However, it is surprisingly challenging to replace the lead with less toxic elements. One of the best alternatives is tin. Halide perovskites with tin instead of lead should show excellent optical properties, but in practice their efficiencies are mediocre and decrease rapidly. This rapid "aging" is their main disadvantage: the tin cations in the perovskite structure react very quickly with oxygen from the environment, so their efficiency drops.

Now, an international collaboration led by Antonio Abate, HZB, and Zhao-Kui Wang, Institute of Functional Nano & Soft Materials (FUNSOM), Soochow University, China, has achieved a breakthrough that opens up a path to non-toxic, perovskite-based solar cells with stable performance over a long period. They also use tin instead of lead, but have created a two-dimensional structure by inserting organic groups within the material, which leads to so-called 2D Ruddlesden-Popper phases. "We use phenylethylammonium chloride (PEACl) as an additive to the perovskite layers. Then we carry out a heat treatment, during which the PEACl molecules migrate into the perovskite layer. This results in vertically ordered stacks of two-dimensional perovskite crystals," explains first author Dr Meng Li. Li is a postdoc in Abate's group and has organised the close cooperation with the Chinese partners. At the Shanghai Synchrotron Radiation Facility (SSRF), they were able to precisely analyse the morphology and crystal characteristics of the perovskite films after different annealing treatments.

The best of these lead-free perovskite solar cells achieved an efficiency of 9.1 % and high stability values, both under daytime conditions and in the dark. The PEACl molecules accumulate between the crystalline perovskite layers as a result of the heat treatment and form a barrier that prevents the tin cations from oxidising. "This work paves the way for more efficient and stable lead-free perovskite solar cells," Abate is convinced.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Solve invasive seaweed problem by turning it into biofuels and fertilisers

video: Professor Mike Allen and his son Archie talk about their work on plastic pollution and making fuels from seaweed

Image: 
Professor Mike Allen, Plymouth Marine Laboratory

UK researchers have developed a cheap and simple way of creating biofuel and fertiliser from seaweed, whilst removing plastic from the oceans and cleaning up tourist beaches in the Caribbean and Central America.

Millions of tonnes of rotting seaweed wash up on the beaches of Mexico, the Caribbean and elsewhere every year.

Partly fuelled by fertilisers washing into the sea from farming in the Americas, the foul-smelling Sargassum seaweed devastates the tourism industry and harms fisheries and ocean ecosystems.

A research team, led by the University of Exeter and the University of Bath, has developed a cheap and simple way to pre-process seaweed before making bulk chemicals and biofuels from it.

Making biofuels financially viable

"Ultimately, for this to work it has to make financial sense," said Professor Mike Allen, from the University of Exeter and Plymouth Marine Laboratory.

"Processing marine biomass like seaweed usually requires removing it from the salt water, washing it in fresh water and drying it.

"The costs of these processes can be prohibitively high.

"We needed to find a process that would pay for and sustain itself - something both economically and environmentally viable.

"This work provides a crucial missing step towards a true salt-based Marine Biorefinery by establishing the initial fractionation step."

Using acidic and basic catalysts, the team devised a process that releases sugars that can be used to feed a yeast that produces a palm oil substitute. The same method also prepares the residual seaweed for the next stage of processing, called hydrothermal liquefaction.

This process subjects the organic material to high temperature and pressure, turning the seaweed into bio-oil that can be processed further into fuels, and high-quality, low-cost fertiliser.

Ed Jones, first author on the paper and PhD student at the Centre for Sustainable Circular Technologies at the University of Bath said: "In contrast with existing pre-treatment strategies, we show that an entirely salt-based biochemical conversion route can work."

"For the first time this study demonstrates that, rather than a hindrance, the presence of saltwater can be helpful."

Professor Christopher Chuck, Director of the Centre for Integrated Bioprocessing Research at the University of Bath and the project lead said: "The variety of products created by this process is a major strength. The oil industry creates a variety of products including liquid fuel, plastics and fertilisers - and we can benefit from a similar flexibility.

"We can simply alter the process conditions to produce larger or smaller amounts of specific by-products, allowing us to have meet variable demand."

Removing ocean plastics

Not only is all the seaweed used in products, but any plastic collected with it will also be converted. Part of the inspiration for the project came from Professor Allen's children, Rosie (12) and Archie (9), who helped collect seaweed samples for trial studies from the Devon coast.

Professor Allen said: "It was Rosie who triggered a whole stream of research following the painstaking removal of plastic litter from the children's seaweed samples by asking: 'Dad, can't you just convert the plastics alongside the seaweed?'"

Removing an environmental nuisance

Another strength of the plan is its use of invasive seaweed such as Sargassum - an environmental nuisance which currently costs the tourist industry vast sums, both in clean-up costs and because it deters visitors.

Professor Allen said: "Many countries in the Caribbean and Central America rely heavily on tourism, so the coronavirus pandemic and the ongoing Sargassum problem have put them on their knees. Last month more than 4 million tonnes of problematic seaweed washed up on their shores."

This is the latest in a string of developments around seaweed processing from the team, which is supported by UKRI, the Global Challenges Research Fund, the Roddenberry Foundation, Innovate UK and the Newton Fund. Exploiting their diverse expertise in phycology, chemistry, ecology, biotechnology and chemical engineering, they are now seeking to develop seaweed-based biorefineries to provide local solutions and opportunities on the global stage.

Beginning with just an inquisitive family on their local Devon beach, the ideas and concepts they have inspired are now being applied on the international stage.

Credit: 
University of Bath

New optical biosensor system may help round-the-clock management of gout

image: Experimental set-up used in the study to measure urate levels.

Image: 
Texas A&M University College of Engineering

In a recent article published in the February issue of the journal Sensors, researchers at Texas A&M University have reported a technology that might help people with gout disease monitor their symptoms better. They said their minimally invasive biosensor system may hold the key to future point-of-care therapies centered around personal management of gout, and possibly other conditions.

"Finding more ways to help patients reduce their risks of gout attacks is an important clinical need that hasn't been looked at in detail," said Dr. Mike McShane, department head and professor in the Department of Biomedical Engineering. "In the future, biosensor technology such as ours can potentially help patients take preemptive steps to reduce the severity of their symptoms and lower their long-term health costs from repeated lab visits."

Gout is a painful joint disease that affects over 8 million Americans. Patients with gout tend to have higher levels of urate salts circulating in their bloodstream, a condition called hyperuricemia. Excess urate then diffuses out of blood vessels and accumulates as crystals in the joint space. These deposits cause excruciating pain and, in advanced cases, a deterioration of joints and bones.

For gout diagnosis, physicians often use clinical criteria, like the frequency of painful incidents, location of pain and the severity of the inflammation. But for a definitive diagnosis, the fluid between the joints is examined for the presence and quantity of urate crystals. These laboratory tests can be expensive and time-consuming due to factors such as equipment and labor costs. Also, the researchers said frequent visits for laboratory testing can be difficult for elderly gout patients.

However, levels of circulating urate can be kept in check with medications. Additionally, avoiding or minimizing the consumption of foods rich in urates, like red meat and seafood, can also help in managing blood urate levels.

"Maintaining low levels of urate is critical for mitigating gout symptoms," said Tokunbo Falohun, a graduate student in the College of Engineering and the primary author of the study. "And so, we wanted to create a technology that is reliable and user-friendly so that patients can easily self-monitor their blood urate levels."

Urate reacts with oxygen in the presence of an enzyme called uricase to form allantoin. The researchers used this knowledge to develop a system where urate levels could be indirectly monitored using benzoporphyrins, a known sensor for oxygen.

Benzoporphyrins are complex molecules with unique optical properties that make them valuable in the design of optical biosensors. When hit by light from an LED, benzoporphyrins become energized and, after a short time, lose their excess energy in stages and finally emit light. But oxygen atoms can shorten the amount of time, or lifetime, that benzoporphyrins spend in an energized state.

Through collisions, oxygen atoms can carry away some of the excess energy from the benzoporphyrins. So if there are fewer oxygen atoms, fewer of them bump into the benzoporphyrins and the benzoporphyrin lifetimes increase proportionately. The researchers reasoned that when urate levels are high, benzoporphyrin lifetimes must be longer, since more oxygen is used up to make allantoin.
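
This inverse relationship between dissolved oxygen and emission lifetime is often written as the Stern-Volmer equation, and the sketch below illustrates the reasoning in that form. The unquenched lifetime and quenching constant are placeholder values chosen for illustration, not parameters reported in the study.

```python
# Illustrative sketch of the oxygen-quenching (Stern-Volmer) relationship the
# sensing scheme relies on: tau0 / tau = 1 + Ksv * [O2]. The constants below
# are placeholders for illustration, not values from the Texas A&M study.

TAU_ZERO = 250e-6   # unquenched benzoporphyrin lifetime (s), assumed
KSV = 0.04          # Stern-Volmer quenching constant (per uM O2), assumed

def lifetime(oxygen_uM):
    """Phosphorescence lifetime at a given dissolved-oxygen concentration."""
    return TAU_ZERO / (1.0 + KSV * oxygen_uM)

# More urate -> uricase consumes more oxygen -> longer measured lifetime.
for o2 in (200, 100, 50):   # falling oxygen as urate rises (uM, illustrative)
    print(o2, round(lifetime(o2) * 1e6, 1), "microseconds")
```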

Based on this rationale, McShane and Falohun set up a technology to measure benzoporphyrins' lifetimes. Their technology consisted of two main components: an optical device to both produce light and collect emitted light from benzoporphyrins; and a biocompatible hydrogel platform for encapsulating uricase and benzoporphyrins.

To mimic conditions within the body, the researchers put the pieces of hydrogels, which were thin discs millimeters in diameter, in saline-filled chambers receiving a steady flow of oxygen and continually maintained at 37 degrees Celsius. In each chamber, they then put in different levels of urate. An external computer connected to the optical system calculated and reported the lifetimes of the benzoporphyrins.

The researchers found that when they switched on the LED light, as predicted, the urate levels in each chamber directly affected the lifetimes of the benzoporphyrins. That is, if there was more urate, there were fewer oxygen atoms available for collisions and, consequently, the lifetimes of the benzoporphyrins were longer.

Although the lifetime values faithfully followed urate levels, McShane and Falohun said that additional experiments need to be done to ensure long-term stability of their optical biosensor system so that the technology is suitable for future clinical use.

However, they noted that their biosensor system demonstrates the feasibility of using the technology for personal management of gout since the hydrogels are small enough to be inserted just below the skin, at a site near oxygen-carrying blood vessels. Furthermore, they said the optical system can be easily connected to any standard computer and that their software is designed to report urate levels in a user-friendly manner. Thus, gout patients will be able to measure their urate levels precisely and as often as they need to.

"From a global health perspective, we need to empower people to make informed decisions about their health and well-being. In that regard, our system is a step toward building biomedical technologies for continuous and more frequent monitoring of disease symptoms," said McShane.

Credit: 
Texas A&M University

A 'consciousness conductor' synchronizes and connects mouse brain areas

image: Claustrum neurons are labeled with synaptophysin-GFP (pre-synapses shown in green) and WGA (trans-synapse area in red). All the neuronal cell bodies and dendrites are stained with anti-MAP2 (blue).

Image: 
RIKEN

For scientists searching for the brain's 'control room', an area called the claustrum has emerged as a compelling candidate. This little-studied deep brain structure is thought to be the place where multiple senses are brought together, attention is controlled, and consciousness arises. Observations in mice now support the role of the claustrum as a hub for coordinating activity across the brain. New research from the RIKEN Center for Brain Science (CBS) shows that slow-wave brain activity, a characteristic of sleep and resting states, is controlled by the claustrum. The synchronization of silent and active states across large parts of the brain by these slow waves could contribute to consciousness.

A serendipitous discovery actually led Yoshihiro Yoshihara, team leader at CBS, to investigate the claustrum. His lab normally studies the sense of smell and the detection of pheromones, but they chanced upon a genetically engineered mouse strain with a specific population of brain cells that was present only in the claustrum. These neurons could be turned on using optogenetic technology or selectively silenced through genetic manipulation, thus enabling the study of what turned out to be a vast, claustrum-controlled network. The study by Yoshihara and colleagues was published in Nature Neuroscience on May 11.

They started out by mapping the claustrum's inputs and outputs and found that many higher-order brain areas send connections to the claustrum, such as those involved in sensation and motor control. Outgoing connections from the claustrum were broadly distributed across the brain, reaching numerous brain areas such as prefrontal, orbital, cingulate, motor, insular, and entorhinal cortices. "The claustrum is at the center of a widespread brain network, covering areas that are involved in cognitive processing," says co-first author Kimiya Narikiyo. "It essentially reaches all higher brain areas and all types of neurons, making it a potential orchestrator of brain-wide activity."

Indeed, this is what the researchers found when they manipulated claustrum neurons optogenetically. Neural firing in the claustrum closely correlated with the slow-wave activity in many brain regions that receive input from the claustrum. When they artificially activated the claustrum by optogenetic light stimulation, it silenced brain activity across the cortex--a phenomenon known as a "Down state", which can be seen when mice are asleep or at rest. Up and Down states are known to be synchronized across the cortex by slow waves of activity that travel from the front of the brain to the back. "The slow wave is especially important during sleep because it promotes homeostasis of synapses across the brain and consolidates memories from the preceding awake period," comments Yoshihara.

The claustrum turns out to be vital for generating this slow-wave activity. Genetically removing the claustrum neurons significantly reduced slow waves in the frontal cortex. "We think the claustrum plays a pivotal role in triggering the down states during slow-wave activity, through its widespread inputs to many cortical areas," says Yoshihara. When these areas subsequently enter an up state and fire synchronously, this serves to 'replay' memories, transfer information between areas, and consolidate long-term memories, "all functions that may contribute indirectly to a conscious state," Yoshihara observes. "The claustrum is a coordinator of global slow-wave activity, and it is so exciting that we are getting closer to linking specific brain connections and actions with the ultimate puzzle of consciousness."

Credit: 
RIKEN

Moisture-sucking gels give solar panels the chills

image: When the gel is fully filled with water, it can free enough water to reduce panel temperatures by 10 degrees Celsius.

Image: 
© 2020 KAUST; Youssef A. Khalil

A cooling system developed at KAUST has improved the efficiency of a prototype solar panel up to 20 percent and requires no external energy source to operate.

Commercial silicon photovoltaic panels are only able to transform a small portion of absorbed sunlight into electricity, while the remainder of the radiation becomes heat. Because solar panels are less efficient for every degree rise in temperature, the problem of heat dissipation becomes more acute in hot environments, such as the Arabian desert.

Unfortunately, efforts to cool solar panels with conventional techniques, including refrigeration or air conditioning, tend to consume more energy than can be gained back through efficiency boosts. Now, a team led by Peng Wang from KAUST's Water Desalination and Reuse Center has produced a proof-of-concept device that aims to solve this conundrum by tapping into the natural properties of the Earth's climate.

Previously, the KAUST researchers developed a polymer containing calcium chloride, a powerful desiccant. When exposed to humid air, this material gradually expands as the calcium salts pull water into the gel, eventually doubling its initial weight. By incorporating heat-absorbing carbon nanotubes into the polymer framework, the team found they could reverse this cycle and trigger release of water with solar energy.

Renyuan Li, who was a Ph.D. student and is now a postdoctoral researcher in Wang's group, notes that one of the intriguing properties of the gel was its ability to self-adhere to numerous surfaces--including the underside of solar panels. After controlled experiments with artificial sunlight revealed that a fully filled gel could free enough water to reduce panel temperatures by 10 degrees Celsius, the team decided to build a prototype for outdoor tests at KAUST.
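
To see why a 10-degree drop matters, the sketch below applies a typical temperature coefficient for commercial silicon modules (roughly -0.4 percent of rated power per degree Celsius above 25 degrees Celsius). The coefficient, panel rating and temperatures are assumed illustrative values, not figures from the KAUST experiments.

```python
# Back-of-envelope sketch of why panel temperature matters. The temperature
# coefficient below is a typical value for commercial silicon modules, assumed
# for illustration; it is not a figure from the KAUST study.

TEMP_COEFF = -0.004   # fractional power change per degree C above 25 C (assumed)

def panel_power(cell_temp_c, rated_power_w=400.0):
    """Approximate output of a nominally rated panel at a given cell temperature."""
    return rated_power_w * (1.0 + TEMP_COEFF * (cell_temp_c - 25.0))

# Cooling a hot panel by 10 degrees C, comparable to the drop reported for the gel:
print(round(panel_power(65.0), 1), "W at 65 C")   # hotter, lower output
print(round(panel_power(55.0), 1), "W at 55 C")   # cooler, higher output
```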

During both the summer and winter seasons, the researchers watched as the gel absorbed water from the muggy overnight air and then released the liquid as daytime temperatures ramped up. Surprisingly, the solar panels showed an even greater increase in efficiency than in the indoor experiments, a jump the researchers theorize may be due to factors such as improved heat and mass transfer outdoors.

"This work shows the benefits of using atmospheric water generation to help fight climate change," says Li. "We believe this cooling technology can fulfill the requirements of many applications because water vapor is everywhere and this cooling technology is easy to adapt to different scales. The technology could be made as small as several millimeters for electronic devices, hundreds of square meters for a building, or even larger for passive cooling of power plants."

Credit: 
King Abdullah University of Science & Technology (KAUST)

Sex, genes and vulnerability

video: Some diseases exhibit a clear sex bias, occurring more often, hitting harder or eliciting different symptoms in men or women. New work led by researchers in the Blavatnik Institute at Harvard Medical School and at the Broad Institute of MIT and Harvard provides a clear genetic explanation behind the sex bias observed in some of these diseases. Nolan Kamitaki, first author of the study in Nature, provides a two-minute summary of the findings.

Image: 
Harvard Medical School

Some diseases exhibit a clear sex bias, occurring more often, hitting harder or eliciting different symptoms in men or women.

For instance, the autoimmune conditions lupus and Sjögren's syndrome affect nine times more women than men, while schizophrenia affects more men and tends to cause more severe symptoms in men than in women.

Likewise, early reports suggest that despite similar rates of infection, men are dying from COVID-19 more often than women, as happened during previous outbreaks of the related diseases SARS and MERS.

For decades, scientists have tried to pinpoint why some diseases have an unexpected sex bias. Behavior can play a role, but that explains only a piece of the puzzle. Hormones are commonly invoked, but how exactly they contribute to the disparity is unclear. As for genes, few, if any, answers have been found on the X and Y sex chromosomes for most diseases.

Now, work led by researchers in the Blavatnik Institute at Harvard Medical School and at the Broad Institute of MIT and Harvard provides a clear genetic explanation behind the sex bias observed in some of these diseases.

The team's findings, reported May 11 in Nature, suggest that greater abundance of an immune-related protein in men protects against lupus and Sjögren's but heightens vulnerability to schizophrenia.

The protein, called complement component 4 (C4) and produced by the C4 gene, tags cellular debris for prompt removal by immune cells.

The team's key findings:

Regardless of sex, natural variation in the number and type of C4 genes contained in people's DNA constitutes the largest common genetic risk factor for developing these three diseases. People with the most C4 genes were seven times less likely to develop systemic lupus erythematosus, an autoimmune condition that can range from mild to life-threatening, and 16 times less likely to develop primary Sjögren's syndrome, a systemic autoimmune syndrome characterized by dry eyes and dry mouth, than those with the fewest C4 genes. Conversely, those with the most C4 genes were 1.6 times more likely to develop the neuropsychiatric condition schizophrenia.

Even in people with similar complement gene profiles, the genes produce more protein in men than in women, further skewing disease susceptibility and protection.

"Sex acts as a lens that magnifies the effects of genetic variation," said the study's first author, Nolan Kamitaki, research associate in genetics in the lab of Steven McCarroll at HMS and the Broad.

"We all know about illnesses that either women or men get a lot more, but we've had no idea why," said Steven McCarroll, the Dorothy and Milton Flier Professor of Biomedical Science and Genetics at HMS and director of genomic neurobiology at the Stanley Center for Psychiatric Research at the Broad. "This work is exciting because it gives us one of our first handles on the biology."

McCarroll is co-senior author of the study with Timothy Vyse of King's College London.

Although C4 variation appears to contribute powerfully to disease risk, it is only one among many genetic and environmental factors that influence disease development.

The study's results are informing the ongoing development of drugs that modulate the complement system, the authors said.

"For example, researchers will need to make sure that drugs that tone down the complement system do not unintentionally increase risk for autoimmune disease," said McCarroll. "Scientists will also need to consider the possibility that such drugs may be differentially helpful in male and female patients."

On a broader level, the work offers a more solid foundation for understanding sex variation in disease than has been available before.

"It's helpful to be able to think about sex-biased disease biology in terms of specific molecules, beyond vague references to 'hormones,'" McCarroll said. "We now realize that the complement system shapes vulnerability for a wide variety of illnesses."

Cell sweeper

In 2016, researchers led by Aswin Sekar, a former McCarroll lab member who is a co-author of the new study, made international headlines when they revealed that specific C4 gene variants underlie the largest common genetic risk factor for developing schizophrenia.

The new work suggests that C4 genes confer both an advantage and disadvantage to carriers, much as the gene variant that causes sickle cell disease also protects people against malaria.

"C4 gene variants come with this yin and yang of heightened and reduced vulnerability in different organ systems," said McCarroll.

The findings, combined with insights from earlier work, offer clues about what may be happening at the molecular level.

When cells are injured, whether from a sunburn or infection, they leak their contents into the surrounding tissue. Cells from the adaptive immune system, which specialize in recognizing unfamiliar molecules around distressed cells, spot debris from the cell nuclei. If these immune cells mistake the flotsam for an invading pathogen, they may instigate an attack against material that isn't foreign at all--the essence of autoimmunity.

Researchers believe that complement proteins help tag these leaked molecules as trash so they are quickly removed by other cells before the adaptive immune system pays too much attention to them. In people with lower levels of complement proteins, however, the uncollected debris lingers longer, and adaptive immune cells may be misled into acting as if the debris is itself the cause of the problem.

As part of the new study, Kamitaki and colleagues measured complement protein levels in the cerebrospinal fluid of 589 people and the blood plasma of 1,844 people. They found that samples from women aged 20 through 50 had significantly fewer complement proteins--including not only C4 but also C3, which C4 helps activate--than samples from men of the same age.

That's the same age range in which lupus, Sjögren's and schizophrenia vulnerabilities differ by sex, Kamitaki said.

The results align with previous observations by other groups that severe early-onset lupus is sometimes associated with a complete lack of complement proteins, that lupus flare-ups can be linked to drops in complement protein levels and that a common gene variant associated with lupus affects the C3 receptor.

"There were all these medical hints," said McCarroll. "Human genetics helps put those hints together."

Two flavors

The bulk of the findings arose from analyses of whole genomes from 1,265 people along with single nucleotide polymorphism (SNP) data from 6,700 people with lupus and 11,500 controls.

C4 genes and proteins come in two types, C4A and C4B. The researchers found that having more copies of the C4A gene and higher levels of C4A proteins was associated with greater protection against lupus and Sjögren's, while C4B genes had a significant but more modest effect. On the other hand, C4A was linked with increased risk of schizophrenia, while C4B had no effect on that illness.

In men, common combinations of C4A and C4B produced a 14-fold range of risk for lupus and 31-fold range of risk for Sjögren's, compared to only 6-fold and 15-fold ranges in women, respectively.

The researchers didn't expect the genes' effects to be so strong.

"Large genetic effects tend to come from rare variants, while common gene variants generally have small effects," said McCarroll. "The C4 gene variants are common, yet they are very impactful in lupus and Sjögren's."

Still, complement genes don't tell the full story of lupus, Sjögren's or schizophrenia risk, none of which are caused entirely by genetics.

"The complement system contributes to the sex bias, but it's only one of probably many genetic and environmental contributors," said Kamitaki.

Answers from diversity

Complement genes and another family of immune-related genes, called human leukocyte antigen or HLA genes, are interspersed throughout the same complex stretch of the human genome. HLA variants have been shown to raise risk of developing other autoimmune diseases, including type 1 diabetes, celiac disease and rheumatoid arthritis, and researchers had long believed that something similar was happening with lupus and Sjögren's.

The culprit, however, remained stubbornly hard to pin down, because specific variants in HLA genes and C4 genes always seemed to appear together in the same people.

Kamitaki and colleagues overcame this hurdle by analyzing DNA from a cohort of several thousand African American research participants. The participants' DNA contained many more recombinations between complement and HLA genes, allowing the researchers to finally tease apart the genes' contributions.
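
Conceptually, the additional recombination in this cohort allows the two loci to be modelled jointly rather than as a single inherited block. A minimal sketch of that idea follows; the data file and column names are hypothetical, and this is not the study's actual statistical model.

```python
# Sketch: jointly modelling C4 dosage and an HLA variant in a case-control
# logistic regression. When the two loci are almost always inherited together,
# their coefficients cannot be separated; in a cohort with more recombination
# between them, the model can attribute the effect to the correct gene.
# Column names (lupus_case, c4a_copies, hla_risk_allele) are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cohort_genotypes.csv")  # hypothetical input

X = sm.add_constant(df[["c4a_copies", "hla_risk_allele"]])
y = df["lupus_case"]  # 1 = case, 0 = control

model = sm.Logit(y, X).fit()
print(model.summary())  # separate effect estimates for C4 and HLA
```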

"It became quite clear which gene was responsible," said McCarroll. "That was a real gift to science from African American research participants. The question had been unsolved for decades."

The discovery provides further proof that the field of genetics would benefit from diversifying the populations it studies, McCarroll said.

"It will really help for genetics to expand more strongly beyond European ancestries and learn from genetic variation and ancestries all over the world," he said.

C4 variation could contribute to sex-based vulnerabilities in other diseases not yet analyzed, the authors said. It's not yet clear whether C4 pertains to the sex bias seen in COVID-19.

"We don't know the mechanism yet for why men seem to get sicker from COVID-19," said McCarroll. "Complement molecules are potentially important in any immune or inflammatory condition, and in COVID-19, it seems the immune response can be part of a downward spiral in some patients. But we don't know the key details yet."

It also remains to be seen how the differing effects of complement genes apply to people with intersex traits, also known as disorders or differences of sex development, who don't always fit textbook genetic or biological definitions of male and female.

"That is important to understand," said McCarroll.

Credit: 
Harvard Medical School

New HIV vaccine combination strategy provides better and more durable protection

ATLANTA - Researchers from the Emory Consortium for Innovative AIDS Research in Nonhuman Primates and their colleagues across North America have shown that a new HIV vaccine is better at preventing infection and lasts longer, continuing to protect one year after vaccination. The findings, published online today in Nature Medicine, provide important insights for preventing HIV, and the timeliness of the results could also help shape the scientific community's approach to developing vaccines for COVID-19.

According to the researchers, the key to the new vaccine's markedly improved protection from viral infection is an alliance between neutralizing antibodies and cellular immunity. "Most efforts to develop an HIV vaccine focus on activating the immune system to make antibodies that can inactivate the virus, so-called neutralizing antibodies," says Eric Hunter, PhD, professor of pathology and laboratory medicine at Emory, a researcher at the Emory Vaccine Center (EVC) and Yerkes National Primate Research Center, and a Georgia Research Alliance Eminent Scholar. "We designed our vaccine to also generate a strong cellular immune response that homed in on mucosal tissues so the two arms of the immune response could collaborate to give better protection," he continues.

Hunter is one of five senior authors of this study. Two of his Emory colleagues are also senior authors: Rama Amara, PhD, professor of microbiology and immunology at Emory and a researcher at Yerkes and the EVC; and Cynthia Derdeyn, PhD, professor of pathology and laboratory medicine at Emory and also an EVC and Yerkes researcher. The other senior authors are Bali Pulendran, PhD, a former EVC and Yerkes researcher who is now a professor at Stanford, and David Masopust, PhD, professor of microbiology and immunology at the University of Minnesota. The lead authors are Emory postdoctoral scholars Tysheena Charles, PhD, and Satish Bollimpelli, PhD, as well as postdoctoral scholars Prabhu Arunachalam, PhD, at Stanford, and Vineet Joag, PhD, at University of Minnesota. The research team also included members from Cornell University, Duke University, Louisiana State University and 3M Corp.

Some 38 million people worldwide live with HIV. While antiviral medications limit the impact of the disease on daily life, HIV continues to infect 1.7 million people annually and cause some 770,000 deaths each year, which makes the Emory team's work a high priority.

In the new study, the researchers stimulated both serum and cellular immunity, which proved critical for the encouraging results. Working with rhesus macaques at Yerkes, the researchers inoculated three groups of 15 monkeys over a 40-week period. "Nonhuman primates remain the very best model for testing the potential of novel vaccines," says Hunter. The first group received several sequential inoculations of Env, a protein on the virus' outer surface known for stimulating antibody production, plus an adjuvant, a chemical combination often used in vaccines to enhance immune response. The second group was similarly inoculated but received additional injections of three different attenuated viruses modified to contain the gene for an HIV protein, Gag, that is known to stimulate cellular immunity. A third group, serving as a control, received injections containing only the adjuvant.

Following the 40-week regimen, all animals rested for 40 weeks, and then the researchers gave them booster shots of just the Env inoculation. After resting four more weeks, the researchers gave the animals 10 weekly exposures to SHIV, the simian version of HIV.

"Our results showed animals in the two experimental groups experienced significant initial protection from viral infection that was linked with high neutralizing antibody titers, particularly in the Env-only group," says Derdeyn. Even more notable, say the researchers, was several of the Env-plus-Gag animals, but none of the Env animals, remained uninfected even though they lacked robust levels of neutralizing antibodies. "This is an intriguing result because increasing the potency of neutralizing antibodies has been thought to be crucial to a vaccine's effectiveness, but doing so is difficult" Derdeyn adds.

Also difficult has been lengthening the duration of protection, but Amara says the current study shows promising results in addressing this. "When we rechallenged the study animals one year after giving the vaccines, the animals that received the Env-plus-Gag combination but not the Env-only vaccination showed a pronounced increase in the duration of protection."

Amara adds, "With these study results, we are one step closer to preventing HIV via a vaccine." The team will use the results to refine the way they approach vaccine development, including further assessing strategies to elicit cellular and neutralizing antibody responses for greater protection, with a goal of moving the new antibody plus T cell vaccine approach into clinical trials. "We think the same approach could be feasible for other pathogens, including influenza, TB, malaria and, now, COVID-19," he continues.

Credit: 
Emory Health Sciences

Loss of green space in India shown to be associated with higher cardiometabolic risk

Although the health benefits of green space are supported by a growing body of evidence, few studies on this topic have been carried out in low- and middle-income countries. A new study by the Barcelona Institute for Global Health (ISGlobal), a centre supported by the "la Caixa" Foundation, which analysed data on more than 6,000 people living in an area south of Hyderabad (India), offers evidence that urban development leading to a reduction in green space may be associated with an increase in several cardiometabolic risk factors.

Most earlier studies in high-income countries have focused on urban green space (parks, urban forests, gardens, etc.), but their findings may have limited applicability to low- and middle-income country settings, where green space largely consists of farmland and bare, open areas.

The team behind the ISGlobal-coordinated CHAI project decided to investigate the association between land-use changes involving the conversion of natural land and cropland to built-up land and cardiometabolic risk factors--hypertension, obesity and hyperglycaemia--in a periurban area south of Hyderabad undergoing urbanisation. The study also explored the possible mediating roles of air pollution, physical activity and stress in these associations.

The study included health data from over 6,000 adults and the authors analysed changes in land use over a 14-year period across an area of 700 km² using classification methods based on satellite remote sensing data from Landsat missions. Levels of air pollution (suspended particulate matter measuring less than 2.5 μm, or PM2.5) were estimated as part of the CHAI project. The cardiometabolic risk factors studied included blood pressure, triglycerides, cholesterol and fasting glucose, and participants answered survey questions about their lifestyle and stress indicators.

The findings, published in Environmental Health Perspectives, showed that a rapid increase in built-up land use within 300 m of a person's residence was associated with an increase in metabolic risk factors. People whose neighbourhoods underwent faster urban development had higher blood pressure (both systolic and diastolic), waist circumference and fasting glucose values than people whose neighbourhoods did not change.
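
As a concrete illustration of this kind of exposure-outcome analysis, a stripped-down model might look like the sketch below. The data file and variable names are hypothetical, and the study's full model adjusts for many more covariates than this.

```python
# Sketch: regressing systolic blood pressure on the change in built-up land
# within 300 m of each participant's home. Variable names (sbp,
# builtup_change_300m, age, sex) are hypothetical, and a real analysis would
# include more covariates and account for clustering by village.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("chai_participants.csv")  # hypothetical input

model = smf.ols("sbp ~ builtup_change_300m + age + C(sex)", data=df).fit()
# Estimated change in systolic blood pressure per unit increase in built-up land
print(model.params["builtup_change_300m"])
```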

Analysis by sex revealed that women bore more of the health impact of the loss of green space than men. A possible explanation for this disparity could be "differences in mobility patterns because women spend a substantially larger portion of the day close to their homes (74%) than men (52%)", explained Carles Milà, lead author of the study.

In connection with the link between urban development and increased cardiometabolic risk, Milà noted that "the study has shown that this association may, in part, be mediated by an increase in air pollution and a reduction in physical activity" due to the loss of green space close to residential areas. No association with stress was found.

ISGlobal researcher Cathryn Tonne, who coordinates the CHAI project and led the new study, concluded: "The findings of this study support the need to integrate health into urban planning to reduce the negative health impacts of urbanisation, especially in cities or neighbourhoods that are undergoing rapid land-use changes."

Credit: 
Barcelona Institute for Global Health (ISGlobal)

Exploring the impacts of climate change on hydropower production

A new study by researchers from IIASA and China investigated the impacts of different levels of global warming on hydropower potential and found that this type of electricity generation benefits more from a 1.5°C than a 2°C climate scenario.

In a sustainable and less carbon-intensive future, hydropower will play an increasingly crucial role as an important source of renewable and clean energy in the world's overall energy supply. In fact, hydropower generation has doubled over the last three decades and is projected to double again from the present level by 2050. Global warming is however threatening the world's water supplies, posing a significant threat to hydropower generation, which is a problem in light of the continuous increase in energy demand due to global population growth and socioeconomic development.

The study, undertaken by researchers from IIASA in collaboration with colleagues at several Chinese institutions and published in the journal Water Resources Research, employed a coupled hydrological and techno-economic model framework to identify optimal locations for hydropower plants under global warming levels of 1.5°C and 2°C, while also considering gross hydropower potential, power consumption, and economic factors. According to the authors, while determining the effects of different levels of global warming has become a hot topic in water resources research, there are still relatively few studies on the impacts of different global warming levels on hydropower potential.

The researchers specifically looked at the potential for hydropower production under the two different levels of warming in Sumatra, one of the Sunda Islands of western Indonesia. Sumatra was chosen as it is vulnerable to global warming because of sea level rise, and the island's environmental conditions make it an ideal location for developing and utilizing hydropower resources. They also modeled and visualized optimal locations of hydropower plants using the IIASA BeWhere model, and discussed hydropower production based on selected hydropower plants and the reduction in carbon emissions that would result from using hydropower instead of fossil fuels.
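
At its simplest, gross hydropower potential at a site depends on discharge and head, P = ρ·g·Q·H·η. The sketch below illustrates only that textbook relationship; it is not the coupled hydrological and techno-economic framework or the BeWhere model used in the study, and the discharge, head and efficiency values are purely illustrative.

```python
# Textbook estimate of hydropower output: P = rho * g * Q * H * eta.
# Values are illustrative only; the study derives discharge from a
# hydrological model driven by 1.5°C and 2°C climate scenarios.
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def hydropower_mw(discharge_m3s: float, head_m: float, efficiency: float = 0.9) -> float:
    """Return electrical power in megawatts for a given discharge and head."""
    return RHO * G * discharge_m3s * head_m * efficiency / 1e6

# Illustrative comparison of two scenarios at one hypothetical site
for label, discharge in [("1.5°C scenario", 520.0), ("2.0°C scenario", 480.0)]:
    print(label, f"{hydropower_mw(discharge, head_m=35.0):.0f} MW")
```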

The results show that global warming levels of both 1.5°C and 2°C will have a positive impact on the hydropower production of Sumatra relative to the historical period. The ratio of hydropower production to power demand provided by 1.5°C of global warming is however greater than that provided by 2°C of global warming under a scenario that assumes stabilization without overshooting the target after 2100. This is due to a decrease in precipitation and the fact that the southeast of Indonesia experiences the largest decrease in discharge under this scenario. In addition, the reduction in CO2 emissions under global warming of 1.5°C is greater than that achieved under global warming of 2°C, which suggests that higher levels of warming erode the very benefits hydropower offers for mitigating climate change. The findings also illustrate the tension between greenhouse gas-related goals and ecosystem conservation-related goals by considering the trade-off between protected areas and hydropower plant expansion.

"Our study could significantly contribute to establishing a basis for decision making on energy security under 1.5°C and 2°C global warming scenarios. Our findings can also potentially be an important basis for a large range of follow-up studies to, for instance, investigate the trade-off between forest conservancy and hydropower development, to contribute to the achievement of countries' Nationally Determined Contributions under the Paris Agreement," concludes study lead author Ying Meng, who started work on this project as a participant of the 2018 IIASA Young Scientists Summer Program (YSSP). She is currently affiliated with the School of Environment at the Harbin Institute of Technology in China.

Credit: 
International Institute for Applied Systems Analysis