Tech

Fighting Zika? Call in the T cells

image: Left: Aedes mosquito. Right: Zika virus particles (red).

Image: 
NIAID

LA JOLLA--Where Aedes mosquitoes fly, Zika virus may not be far behind. Although the explosive 2015-2016 Zika epidemics in the Americas are behind us, Zika may re-emerge, and "in many countries, Zika may be spreading in silence," says Sujan Shresta, Ph.D., a professor at La Jolla Institute for Immunology (LJI). "We need to develop effective vaccines."

In a new Science Advances study, Shresta and her colleagues at LJI report that the immune system's T cells have the power to prevent Zika infection in mice. This finding suggests that effective Zika vaccines need to activate T cells to work alongside antibodies.

"If we combine T cells and antibodies, we have even stronger protection and longer-term protection," says Annie Elong Ngono, Ph.D., a postdoctoral fellow at LJI and first author of the new study.

Zika virus infections are usually mild, but the virus can cause serious congenital malformations in infants and neurological complications in adults and children. Since Zika made headlines in 2016, when cases peaked in the Americas, researchers have developed more than 40 Zika vaccine candidates. The vast majority of these vaccines are designed to prompt the body to make antibodies that target one specific protein on the virus.

Unfortunately, there is a drawback to this neutralizing antibody approach. In many parts of the world, Zika virus spreads alongside related mosquito-borne viruses, such as dengue. Scientists have found that the presence of anti-Zika antibodies can make a subsequent case of dengue much, much worse. In a 2018 study, Shresta's lab showed that newborn mouse pups harboring anti-Zika antibodies were more vulnerable to death from dengue exposure than mice that lacked anti-Zika antibodies.

Theoretically, similar cases of "antibody-dependent enhancement" may lead to cases where lingering anti-Zika antibodies in a patient will actually make it easier for dengue to enter host cells--leading to especially devastating consequences in pregnant patients. This means that a Zika vaccine that prompts only antibody production may be risky in areas where both dengue and Zika are common. Luckily, the immune system can make more than antibodies.

For the new study, Shresta and Elong Ngono tested an experimental Zika vaccine in a mouse model. The vaccine was designed to elicit the arm of the immune system that makes T cells. The mice were given the vaccine, given a second vaccine boost four weeks later, and then exposed to Zika six weeks after that.

The team found that the vaccine could induce strong immunity against a potentially lethal Zika virus infection, mainly by inducing CD8+ T cells, also called "killer" T cells, against the virus. The vaccine also prevented Zika transmission through the placenta from mother to fetus in pregnant mice.

This vaccine approach was even more effective when combined with a vaccine candidate that induced neutralizing antibodies. "We found that it is better to have a vaccine that induces both T cells and antibodies than either one alone," says Elong Ngono.

The new research also shows the importance of targeting more than one viral protein when fighting flaviviruses, the group of viruses that includes Zika, dengue, yellow fever and Japanese encephalitis. By getting T cells and antibodies to recognize key sites on these related viruses, researchers may be closer to developing a "pan-flavivirus" vaccine to protect people in areas where several of these diseases are common.

"We think this approach can be used against other infectious diseases," Elong Ngono says. For example, recent research from LJI scientists suggests that COVID-19 vaccines may also need to elicit T cells to work alongside antibodies.

"Now the challenge is finding how best to elicit appropriately balanced antibody and T cell responses," says Shresta. "We also don't know how durable the vaccine protection is--if it's fairly short, we want to figure out how to enhance it."

Credit: 
La Jolla Institute for Immunology

New decision support tool can provide personalized antibiotic treatment recommendations

Boston, MA-- A new study led by researchers at the Harvard Pilgrim Health Care Institute developed an algorithm that could greatly reduce use of broad-spectrum antibiotics in outpatient settings, a step toward reducing antibiotic resistance. The findings will be published online November 4, 2020 in Science Translational Medicine.

As discussed by the authors, antibiotic resistance is a major threat to the practice of medicine and is driven in large part by overuse of antibiotics. Outpatient settings are where the vast majority of antibiotics are prescribed, but they are also where the fewest tools are available to help prescribers make optimal treatment decisions. This leads providers to prescribe broad-spectrum antibiotics in response to a real, as well as a perceived, increase in rates of antibiotic-resistant infection. However, use of broad-spectrum antibiotics, which work against a wide range of bacteria, creates a vicious cycle in which overuse further worsens the problem of resistance. An example is urinary tract infection (UTI), a very common reason for antibiotic use among outpatients. Despite national guidelines urging the use of narrow-spectrum treatments as first-line therapies, the most commonly prescribed treatments are ciprofloxacin and levofloxacin, which are broad-spectrum, second-line antibiotics associated with a host of adverse events.

Little attention has been paid to developing effective decision support tools for outpatient prescribers. Algorithms have been used for clinical decision support for infectious diseases since the 1970s but have not yet been widely adopted due to difficulties in integrating them into busy clinical practices. Sanjat Kanjilal, MD, MPH, lead author and Lecturer in Population Medicine at the Harvard Pilgrim Health Care Institute and Harvard Medical School, believes we now have the tools to do better. "Personalized decision support at the point of care may be an effective tool to manage antibiotic prescription for common infectious syndromes," said Dr. Kanjilal. His solution is to use machine learning models to predict the likelihood of antibiotic resistance, and then translate those likelihoods into recommendations that help prescribers make optimal treatment decisions. "Our study developed a personalized decision support algorithm for UTIs as a solution to the challenge of antibiotic prescription in the era of resistance."

The study used data from the medical records of more than 13,000 women with uncomplicated UTI who received care at two large Boston hospitals between 2007 and 2016. Dr. Kanjilal's team trained their machine learning models to predict the probability of antibiotic resistance to four commonly used treatments, and then developed a novel method to translate those probabilities into decisions that can guide prescribers to avoid ciprofloxacin and levofloxacin to the greatest extent possible, while not resulting in any undue harm to patients.
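
To make the idea concrete, here is a minimal sketch of how such a pipeline could look, under stated assumptions: one resistance classifier per antibiotic and a simple rule that prefers first-line agents whenever their predicted resistance risk falls below a threshold. The column names, antibiotic list, threshold and model choice are illustrative placeholders, not details taken from the study.

import pandas as pd
from sklearn.linear_model import LogisticRegression

ANTIBIOTICS = ["nitrofurantoin", "TMP-SMX", "ciprofloxacin", "levofloxacin"]
FIRST_LINE = {"nitrofurantoin", "TMP-SMX"}

def train_models(features: pd.DataFrame, resistance: pd.DataFrame):
    # One binary classifier per antibiotic: 1 = isolate resistant, 0 = susceptible.
    return {abx: LogisticRegression(max_iter=1000).fit(features, resistance[abx])
            for abx in ANTIBIOTICS}

def recommend(models, patient: pd.DataFrame, threshold=0.10):
    # patient is a one-row DataFrame with the same feature columns used in training.
    probs = {abx: float(m.predict_proba(patient)[0, 1]) for abx, m in models.items()}
    safe_first_line = [abx for abx in FIRST_LINE if probs[abx] < threshold]
    if safe_first_line:                      # prefer a first-line agent when risk is low
        return min(safe_first_line, key=probs.get), probs
    return min(probs, key=probs.get), probs  # otherwise pick the lowest predicted risk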

The team compared the performance of the algorithm to that of clinicians and national guidelines and found that it would have reduced prescription of second-line antibiotics by 67%. At the same time, it also reduced by 18% the selection of antibiotics to which a patient's specimen was resistant.

Added Dr. Kanjilal, "Integrating these models into outpatient care could play an important role in reducing the use of broad-spectrum antibiotics. Our future work will focus on integrating these clinical decision support tools into provider workflows and evaluating the clinical outcomes using randomized controlled trials."

Credit: 
Harvard Pilgrim Health Care Institute

New multiscale view of the human brain

image: Hyperbolic maps of the multiscale human connectome and the geometric renormalization flow.

Image: 
M. Zheng et al. / PNAS

The architecture of the brain supports cognitive and behavioural functions and is extremely complex, with connections at multiple scales that interact with each other. However, research efforts usually focus on a single spatial scale. In a study led by researchers of the Institute of Complex Systems of the University of Barcelona (UBICS), researchers studied the multiscale spatial organization of the brain and observed that, in a geometric network model, the layers at different resolutions are self-similar, that is, as we zoom out, the geometric and connectivity structure of the layers remains the same.

In order to carry out this study, researchers used two high-quality datasets with maps of neural connections (connectomes) of eighty-four healthy subjects, with five anatomical resolutions for each. According to M. Àngels Serrano, ICREA researcher at UBICS, "the results show that brain connectivity at different scales is organized with the same principles that lead to efficient decentralized communication".

The structure of the human brain spans a series of interrelated length scales that increase its complexity. "The self-similarity we identified as a pattern in the multiscale structure of the human connectome introduces simplicity as an organizing principle," notes Serrano. This means that the underlying connectivity rules that explain this structure are independent of the observation scale (at least within the scales analysed in this study), "that is, we do not need a specific set of rules for each scale," concludes Serrano.

The model predicts observations through the application of a renormalization protocol. This method is based on a geometric network model that places the nodes in a hidden metric space, defining a map, so that nodes that are closer together in this space are more likely to be connected. This type of model enables researchers to explain universal features of real networks.
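
As a rough illustration of what a geometric network model with a hidden metric space looks like, the sketch below implements the standard S1 model, a common choice for this kind of analysis. It is an assumption here that this model family applies, and the parameter values are arbitrary rather than fitted to the connectome data: nodes receive an angular coordinate and a hidden degree, and pairs that are closer in the hidden space are connected with higher probability.

import numpy as np

def s1_network(n, beta=2.5, mu=0.02, gamma=2.7, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)      # angular (similarity) coordinate
    kappa = rng.pareto(gamma - 1.0, n) + 1.0      # hidden degrees, heavy-tailed
    R = n / (2.0 * np.pi)                         # circle radius giving unit node density
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            dtheta = np.pi - abs(np.pi - abs(theta[i] - theta[j]))   # shorter arc angle
            d = R * dtheta                                           # distance on the circle
            p = 1.0 / (1.0 + (d / (mu * kappa[i] * kappa[j])) ** beta)
            adj[i, j] = adj[j, i] = rng.random() < p
    return adj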

"For every scale, there is a remarkable congruence between empirical observations and the predictions provided by the model. The results show that the same rules explain the formation of short- and long-range connections in the brain within the range of length scales covered by the datasets used," concludes the UB researcher.

The implications of this discovery are several. On the one hand, it can be useful in fundamental debates, such as whether the brain is operating close to a critical point. On the other hand, it can have applications for advanced tools for simulating brain function.

Credit: 
University of Barcelona

Delirium could be an early marker of COVID-19

image: Delirium accompanied by fever could be an early symptom of COVID-19.

Image: 
Kat Jayne / Pexels

Delirium accompanied by fever could be an early symptom of COVID-19. This is the main conclusion of a review of the scientific literature carried out by researchers from the Universitat Oberta de Catalunya (UOC) and published in the open-access Journal of Clinical Immunology and Immunotherapy. The review highlights that, together with the loss of the senses of taste and smell and the headaches that occur in the days before coughing and breathing difficulties appear, some patients also develop delirium.

As such, the manifestation of this state of confusion, when accompanied by high fever, should be considered an early marker of the disease, particularly in the case of elderly patients.

"Delirium is a state of confusion in which the person feels out of touch with reality, as if they are dreaming," explained UOC researcher Javier Correa, who carried out this study at the University of Bordeaux (France). He added that "we need to be on the alert, particularly in an epidemiological situation like this, because an individual presenting certain signs of confusion may be an indication of infection".

Correa, together with UOC Cognitive NeuroLab researcher Diego Redolar Ripoll, has reviewed the body of scientific work published on the effects of COVID-19 on the central nervous system, i.e. the brain. The review found that, although to date much of the coronavirus research conducted since the first cases of pneumonia were reported in China (on 31 December 2019) has focused on the damage the virus causes to the lungs and other organs, such as the kidneys and heart, there are growing indications that the coronavirus also affects the central nervous system and produces neurocognitive alterations, such as headaches and delirium, as well as psychotic episodes.

"The main hypotheses which explain how the coronavirus SARS-CoV-2 affects the brain point to three possible causes: hypoxia or neuronal oxygen deficiency, inflammation of brain tissue due to cytokine storm and the fact that the virus has the ability to cross the blood-brain barrier to directly invade the brain," commented Correa. He stressed that any one of these three factors has the potential to result in delirium and explained that evidence of hypoxia-related brain damage has been observed in autopsies carried out on patients who have died from the infection and that it has been possible to isolate the virus from the cerebral tissue.

According to the researchers, delirium, cognitive deficits and behavioural anomalies are most likely the result of systemic inflammation and of hypoxia in the brain, which cause the neuronal tissue to become inflamed and damage areas such as the hippocampus that are associated with the cognitive dysfunctions and behavioural alterations presented by patients suffering delirium.

Credit: 
Universitat Oberta de Catalunya (UOC)

The dangers of collecting drinking water

Collecting drinking water in low- and middle-income countries can cause serious injury, particularly for women, according to new research from the University of East Anglia.

A new international study published in BMJ Global Health reveals dangers including falls, traffic accidents, animal attacks, and fights, which can result in broken bones, spinal injuries, lacerations, and other physical injuries.

And women are most likely to sustain such injuries - highlighting the social and gender inequities of a hidden global health challenge.

Dr Jo-Anne Geere, from UEA's School of Health Sciences, said: "Millions of people don't have the luxury of clean drinking water at their home, and they face many dangers before the water even touches their lips.

"Global research on water has largely focused on scarcity and health issues related to what is in the water, but the burden and risks of how water is retrieved and carried has been overlooked until now.

"We wanted to better understand the true burden of water insecurity."

The new study was led by Northwestern University in the US, in collaboration with UEA, the University of Miami and the Household Water Insecurity Experiences Research Coordination Network (HWISE RCN).

The research team used a large global dataset to understand what factors might predict water-fetching injuries. The work draws on a survey of 6,291 randomly selected households across 24 sites in 21 low- and middle-income countries in Asia, Africa, Latin America, and the Caribbean.

They found that 13 per cent of the respondents reported some sort of injury while collecting water, and that women were twice as likely to be hurt as men.

Dr Sera Young, from Northwestern University, said: "Thirteen percent is a big number, but it is probably an underestimate. It's highly likely that more people would have reported injuries if the survey had more detailed questions."

Prof Paul Hunter, from UEA's Norwich Medical School, said: "This reinforces how the burden of water scarcity disproportionately falls on women, on rural populations, and on those who do not have water sources close to home.

"It highlights the importance of safe interventions that prioritise personal physical safety alongside traditional global indicators of water, sanitation, and hygiene."

The researchers say that keeping track of such safety measures -- in addition to the usual measures of water quality and access -- could help better assess progress towards the United Nations' Sustainable Development Goal 6.1, which sets out "to achieve universal and equitable access to safe and affordable drinking water for all" by 2030.

Dr Vidya Venkataramanan, also from Northwestern University, said: "It seems likely that water-fetching can contribute considerably to the global Water, Sanitation and Hygiene (WaSH) burden, but it usually goes unmeasured because we typically think about access and water quality. It is, therefore, a greatly underappreciated, nearly invisible public health challenge.

"It's really important that data on water-fetching injuries are systematically collected so that we can know the true burden of water insecurity. Currently, all of the broken bones, spinal injuries, lacerations and other physical injuries are not accounted for in calculations about the burden of water insecurity."

Credit: 
University of East Anglia

New multicomponent reaction frontiers

image: The researchers establish a general principle by which the polarity change of a reagent in a multicomponent process triggers domino reactions.

Image: 
Angewandte Chemie International Edition

The synthesis of complex molecules such as drugs requires processes that often involve several steps, which increase the cost and hinder access to the final product. Now, a team of the University of Barcelona has designed a new methodological approach that combines multicomponent reactions with domino-type processes (consecutive transformations on a single compound) to ease the synthesis of structurally complex molecules.

The study, published in the journal Angewandte Chemie International Edition, is led by Professor Rodolfo Lavilla, from the Faculty of Pharmacy and Food Sciences and the Institute of Biomedicine (IBUB) of the University of Barcelona. The study, whose first authors are the researchers Ouldouz Ghasghaei and Marina Pedrola (UB-IBUB), counts on the participation of experts from Masaryk University (Czech Republic) and the Leibniz Research Institute for Environmental Medicine (Germany).

Multicomponent reactions: more simplicity and efficiency

Multicomponent reactions are protocols that ease the chemical synthesis of new compounds of high complexity and structural diversity. These reactions can form several bonds and generate new molecules from a minimum of three reagents. These processes are very direct and help obtain molecules in a quick and efficient way (simplicity, atom economy, etc.) compared to traditional processes. They are also the most sustainable synthetic pathways from an environmental perspective (saving resources, less waste, etc.).

In the study, the experts establish a general principle by which the polarity change of a reagent in a multicomponent process triggers domino reactions that give access to complex connectivity. This principle would explain many transformations and would ease the design of new processes in the field of synthetic chemistry.

According to Lavilla, the new principle has been developed "with indole nuclei, a heterocycle present in many natural molecules, and particularly in drugs. Also, the compounds that were prepared with this methodology present a high structural variability (linear and angular fused rings, rigid or flexible compounds, etc.)". In the biological field, most of the products the researchers synthesized "present a powerful activity as ligands of the aryl hydrocarbon receptor", he adds, "a molecule with a determining role in several biological processes that is regarded as a potential pharmacological target for the development of new drugs".

So far, only a few specific cases of multicomponent reactions associated with a domino process had been described. "Both domino and multicomponent reactions are very complex at the mechanistic level. There are many bonds, many elementary steps, reaction intermediates, and so on," notes the researcher. He adds that by merging these two reaction families into a single process, "we increase the synthetic complexity extraordinarily. Therefore, we consider the description of these processes to be an advance that will help generalize them and extend them to new combinations in synthetic chemistry".

Technology for a greener chemistry

Multicomponent reactions have eased the development of new molecules of pharmaceutical and biomedical interest (biological probes, fluorophores, complex molecules). These techniques are increasingly being exploited by other industrial sectors.

"However, there are very few general multicomponent reactions -about a dozen compared to the hundreds of biomolecular reactions-, and this limits its applicability. In this sense, a great scientific activity is being carried out in this field to ease the access to this type of general connectivity through these reactions, and enable its application to the development of all types of organic compounds at a large scale (drugs, plastics, fertilizers, etc.)", concludes the researcher.

Credit: 
University of Barcelona

Ants are skilled farmers: They have solved a problem that we humans have yet to

Fungus-farming ants are an insect lineage that relies on farmed fungus for their survival. In return for tending to their fungal crops--protecting them against pests and pathogens, providing them with stable growth conditions in underground nests, and provisioning them with nutritional 'fertilizers'--the ants gain a stable food supply.

These fungus farming systems are an expression of striking collective organization honed over 60 million years of fungus crop domestication. The farming systems of humans thus pale in comparison, since they emerged only ca. 10,000 years ago.

A new study from the University of Copenhagen, funded by an ERC Starting Grant, demonstrates that these ants might be one up on us as far as farming skills go. Long ago, they appear to have overcome key domestication challenges that we have yet to solve.

"Ants have managed to retain a farming lifestyle across 60 million years of climate change, and Leafcutter ants appear able to grow a single cultivar species across diverse habitats, from grasslands to tropical rainforest" explains Jonathan Z. Shik, one of the study's authors and an assistant professor at the University of Copenhagen's Department of Biology.

Through fieldwork in the rainforests of Panama, he and researchers from the Smithsonian Tropical Research Institute studied how fungus-farming ants use nutrition to manage a tradeoff between the cultivar's increasingly specialized production benefits and its rising vulnerability to environmental variation.

Ants as clever farmers

We humans have bred certain characteristics -- whether a taste or texture -- into our crops.

But these benefits of crop domestication can also result in greater sensitivity to environmental threats from weather and pests, requiring increasing pesticide use and irrigation. Simply put, we weaken plants in exchange for the right taste and yield. Jonathan Z. Shik explains:

"The ants appear to have faced a similar yield-vulnerability tradeoff as their crops became more specialized, but have also evolved plenty of clever ways to persist over millions of years. For example, they became impressive architects, often excavating sophisticated and climate-controlled subterranean growth chambers where they can protect their fungus from the elements," he says.

Furthermore, these little creatures also appear able to carefully regulate the nutrients used to grow their crops.

To study how, Shik and his team spent over a hundred hours lying on the rainforest floor on trash bags next to ant nests. Armed only with forceps, they stole tiny pieces of leaves and other substrates from the jaws of ants as they returned from foraging trips.

They did this while snakes slithered through the leaf litter and monkeys peered down at them from the treetops.

"For instance, our nutritional analyses of the plant substrates foraged by leafcutter ants show that they collect leaves, fruit, and flowers from hundreds of different rainforest trees. These plant substrates contain a rich blend of protein, carbohydrates and other nutrients such as sodium, zinc and magnesium," explains Shik. "This nutritional blend can target the specific nutritional requirements of their fungal crop."

What can we learn from ants?

Over the years, the ants have adapted their leaf collecting to the needs of the fungus -- a kind of organic farming, without the benefits of the technological advances that have helped human farmers over the millennia, one might say.

One might wonder, is it possible to simply copy their ingenious methods?

"Because our plant crops require sunlight and must thus be grown above ground, we can't directly transfer the ants' methods to our own agricultural practices. But it's interesting that at some point in history, both humans and ants have gone from being hunter-gatherers to discovering the advantages of cultivation. It will be fascinating to see what farming systems of humans look like in 60 million years," concludes Jonathan Z. Shik.

Credit: 
University of Copenhagen

UL research reveals extreme levels of uric acid can significantly reduce patient survival

Extreme levels of uric acid in the blood can markedly reduce a patient's chance of surviving and shorten their lifespan by up to 11 years, according to a new study by researchers at University of Limerick's School of Medicine.

In one of the largest studies and the first of its kind in Ireland, researchers found evidence of substantial reductions in patient survival associated with extreme concentrations of serum uric acid (SUA) for both men and women.

The study, which was seed funded by the Health Research Board (HRB), has just been published in the European Journal of Internal Medicine.

"This is the first study to yield detailed survival statistics for SUA concentrations among Irish men and women in the health system," according to lead author, Dr Leonard Browne, PhD, Senior Research Fellow in Biostatistics at the UL School of Medicine.

"Our key question was to determine whether SUA, a routinely measured blood marker, could help us predict a patient's lifespan, all else being equal," Dr Browne added.

To answer this, the research team used data from the National Kidney Disease Surveillance System (NKDSS), based at UL, and created a large cohort of 26,525 patients who entered the Irish health system at University Hospital Limerick between January 1, 2006 and December 31, 2012, following them until December 31, 2013.

Dr Browne said the results were "quite astonishing".

"For men, the message was quite clear. The median survival was reduced by an average of 9.5 years for men with low levels of SUA (less than 238μmol/L), and 11.7 years for men with elevated SUA levels (greater than 535 μmol/L) compared to patients with levels of 357-416 μmol/L," he explained.

"Similarly, for women, we found that the median survival was reduced by almost 6 years for those with SUA levels greater than 416 μmol/L, compared to women with SUA in the normal range."

The shape of the mortality curves was quite different for men and women, according to Dr Browne.

"For men the shape of the association was predominantly U-shaped with optimal survival between 304-454 μmol/L, whereas, for women, the pattern of association was J-shaped with elevated risk of mortality only present for women with SUA levels beyond 409 μmol/L," he explained.

Professor Austin Stack, Foundation Chair of Medicine at UL's School of Medicine, senior author of the study, Principal Investigator for the NKDSS at UL and Consultant Nephrologist at UL Hospitals, said there was good evidence that high levels of SUA are associated with a range of serious chronic medical conditions such as kidney failure, hypertension, heart disease, stroke and diabetes.

"These known associations might in part explain the high mortality that we observed for patients with elevated SUA levels in our study," Professor Stack explained.

"Indeed, when we looked at the cause of death for these patients we found on one hand that that men and women with very high SUA levels died from cardiovascular causes of death.

"On the other hand, and quite surprisingly, we also found that very low levels of SUA were also associated with a higher risk of death primarily in men. This would of course suggest that very low levels of SUA are also detrimental to survival.

"We had speculated that patients with very low levels of SUA might reflect a subgroup that were generally sicker and had poorer nutritional status. Although when we took these considerations into our analysis, low SUA levels still predicted higher death rates in men.

"Interestingly, men who died with low SUA levels had a higher proportion of deaths from cancer - unlike those with high SUA level who had a higher proportion of deaths from cardiovascular disease," Professor Stack added.

Uric acid is a by-product of the body's metabolism and is associated with conditions such as heart disease, high blood pressure, stroke, kidney disease, and gout.

Previous work by the research group at UL found that hyperuricaemia is very common, affecting about 25% of adults in the health system, with prevalence growing year-on-year.

This current study adds to the body of evidence on the importance of SUA as a major predictor of survival and a potential target for treatment.

"A key consideration is whether we should treat hyperuricaemia and lower SUA levels to a desired target level in order to extend patient survival," said Professor Stack.

Prospective clinical trials are currently underway using uric acid lowering drugs in order to provide a definitive answer to this question.

Speaking about the results, Dr Mairead O'Driscoll, Chief Executive of the HRB, said: "This study demonstrates the enduring value of having robust datasets in place that have been collected over time. By researching the data, this team at UL and their partners are now making significant discoveries about uric acid on a frequent basis that will help shape treatments for people with conditions like heart disease, stroke and kidney disease."

Credit: 
University of Limerick

Divide and conquer--modular controller design strategy makes upgrading power grids easier

Image: 
Takayuki Ishizaki

Scientists at Tokyo Institute of Technology (Tokyo Tech) develop a novel approach for the modular design of controllers for large-scale network systems. Their strategy, which provides a completely decentralized method to design controllers for subsystems of a larger whole, could be readily applied in power grids, greatly simplifying the task of sequentially upgrading individual subdivisions while ensuring stability and performance.

The control of large-scale dynamic network systems, such as national power grids, is a remarkably challenging topic. In this context, "control" roughly means monitoring relevant output variables to ensure that the system operates stably and within safe margins. The difficulty and necessary considerations associated with the design and implementation of controllers usually skyrocket when dealing with complex networked systems, and theoretical studies to find new approaches to controller design are constantly being carried out.

One common problem that arises in large networked systems is that they are tightly integrated. When a developer changes or upgrades one subsystem, their actions in their own corner of the network can have unforeseen consequences for the rest of the network unless necessary precautions are taken for all subsystems. Even remote network disturbances caused by local temporary failures, such as the accidental grounding of a line in a power subsystem, can throw other subsystems off. Consequently, often no change can be made to one subsystem without needing to alter all the others.

However, as demonstrated in a recent study by scientists from Tokyo Tech, Japan, there is a design paradigm that can prevent such problems: modularity. This term implies working in "modules," subdivisions of the main system that can be separated, changed, and recombined independently, ideally without compromising each other. Nonetheless, as explained in their article published in IEEE Transactions on Automatic Control, achieving this independence between modules through their associated controllers is not straightforward.

In their study, the scientists developed a novel approach for the modular design of subsystem controllers in linear large-scale network systems that offers a number of advantages over existing approaches. In their approach, each developer of a subsystem can independently design and implement their controller as an add-on to the existing system. To do so, they only require knowledge of their own subsystem. A decentralized controller designed under such considerations is called a retrofit controller.

First, the scientists used a technique called Youla parametrization to formally describe all the relevant parameters of generic retrofit controllers in a networked system. Then, they laid out a unique design for their retrofit controller that required only standard techniques to implement. They also mathematically demonstrated that, given certain reasonable assumptions about the whole, such as a stable system prior to the implementation of the proposed retrofit controller, using their controller guaranteed both local and overall system stability, even in the face of variations in other controllers.

Moreover, through numerical experiments, they showed that simultaneously implementing multiple such uniquely designed controllers in a network translates into performance improvements across the entire system, and that adding more such controllers leads to greater enhancement. As Associate Professor Takayuki Ishizaki, lead author of the study, explains, "The proposed modular design method provides a new theoretical basis for sequential system upgrades, such that the stability of the current system is surpassed by its future generations. In short, each designer can individually add, remove, and modify their controller without considering the actions of other designers." His team also demonstrated the practical significance of their method through an illustrative example: generator frequency regulation in an IEEE-standard power system model.

The benefits of modularity-in-design are many, as Ishizaki concludes: "Modular design is a widely accepted strategy that simplifies the design of complex large-scale systems, enables parallel work by multiple independent entities, and enables flexible future modifications of modules." Future advances in modular design will hopefully make the control of large-scale network systems more easily tractable and make them more easily upgradeable.

Credit: 
Tokyo Institute of Technology

Luminescent wood could light up homes of the future

image: When exposed to UV light on the outside, a luminescent wood panel (right) lights up an indoor space (as seen through "windows;" red arrows), whereas a non-luminescent panel (left) does not.

Image: 
Adapted from ACS Nano 2020, DOI: 10.1021/acsnano.0c06110

The right indoor lighting can help set the mood, from a soft romantic glow to bright, stimulating colors. But some materials used for lighting, such as plastics, are not eco-friendly. Now, researchers reporting in ACS Nano have developed a bio-based, luminescent, water-resistant wood film that could someday be used as cover panels for lamps, displays and laser devices.

Consumer demand for eco-friendly, renewable materials has driven researchers to investigate wood-based thin films for optical applications. However, many materials developed so far have drawbacks, such as poor mechanical properties, uneven lighting, a lack of water resistance or the need for a petroleum-based polymer matrix. Qiliang Fu, Ingo Burgert and colleagues wanted to develop a luminescent wood film that could overcome these limitations.

The researchers treated balsa wood with a solution to remove lignin and about half of the hemicelluloses, leaving behind a porous scaffold. The team then infused the delignified wood with a solution containing quantum dots -- semiconductor nanoparticles that glow in a particular color when struck by ultraviolet (UV) light. After compressing and drying, the researchers applied a hydrophobic coating. The result was a dense, water-resistant wood film with excellent mechanical properties. Under UV light, the quantum dots in the wood emitted and scattered an orange light that spread evenly throughout the film's surface. The team demonstrated the ability of a luminescent panel to light up the interior of a toy house. Different types of quantum dots could be incorporated into the wood film to create various colors of lighting products, the researchers say.

Credit: 
American Chemical Society

Chikungunya may affect central nervous system as well as joints and lungs

image: Investigation showed that chikungunya virus can cause neurological infections. Risk of death in subacute phase is higher for patients with diabetes and significant for young adults

Image: 
William Marciel de Souza

A study conducted by an international team of researchers with FAPESP's support shows that infection by chikungunya virus can produce even more severe manifestations than the typical symptoms of the disease, such as acute fever, headache, rash, and intense joint and muscle pain.

The analysis was performed by 38 researchers affiliated with the Federal University of Ceará (UFC), the University of São Paulo (USP) and the Ministry of Health in Brazil, and with Imperial College London and Oxford University in the United Kingdom.

Their main discovery was that chikungunya can infect the central nervous system and impair cognitive and motor functions.

"The study produced important new knowledge about the disease and the virus. We not only confirmed that the virus can infect the central nervous system but also found the disease to be more deadly for young adults, rather than children and the elderly as is usually predicted in outbreaks of the disease," said William Marciel de Souza , co-author of an article on the study published in Clinical Infectious Diseases.

Souza is a researcher at the University of São Paulo's Ribeirão Preto Medical School (FMRP-USP). "The study also showed that during the acute or subacute phase of the disease [20-90 days after infection] patients with diabetes appear to die seven times more frequently than non-diabetics," he said.

The study was conducted under the auspices of the Brazil-UK Center for Arbovirus Discovery, Diagnosis, Genomics and Epidemiology (CADDE). It also derived from Souza's postdoctoral research, part of which he pursued at Oxford University in the UK with FAPESP's support via a Research Internship Abroad. Researchers affiliated with several different institutions collaborated on the project, which was also supported by Brazil's National Council for Scientific and Technological Development (CNPq).

Worst outbreak in the Americas

The study was based on a retrospective analysis of clinical and epidemiological data as well as blood, cerebrospinal fluid, and tissue samples from patients who died during the 2017 outbreak in the state of Ceará, Brazil, the worst chikungunya outbreak in the Americas. Ceará notified 194 chikungunya-related deaths and 105,229 suspected cases (1,166 per 100,000 inhabitants) in 2017.

The researchers used documentation filed during the outbreak by the Ceará State Health Department's Death Verification Service. To ascertain the cause of death in 100 cases, they analyzed blood serum and cerebrospinal fluid samples using the RT-PCR and MinION genome sequencing techniques, immunohistochemistry, and ELISA assays to detect antibodies against chikungunya.

The virus is transmitted by females of two mosquito species, Aedes aegypti and Aedes albopictus. Most chikungunya patients manifest acute symptoms such as high fever, headache, joint and muscle pain, nausea, fatigue, and rash for three weeks after being infected. Some then progress to the subacute phase, during which the symptoms persist. Joint pain may last more than three months, indicating a transition to the chronic phase, which can last years.

All the evidence gleaned from the laboratory tests and clinical records showed that in most cases of suspected death from chikungunya the patient had a central nervous system infection.

"Joint pain was the most frequent symptom, as evidenced by the name of the disease, which refers to contortion from pain [in the East-African Kimakonde language], but we also identified severe problems in the nervous system due to chikungunya," Souza said.

Viral RNA was found in cerebrospinal fluid from 36 patients and in four brain tissue samples. "The presence of the virus in the brain tissue of those infected is clear evidence that it's capable of crossing the blood-brain barrier that protects the central nervous system, and of causing infection in the brain and spinal cord," Souza said.

Most vulnerable

Besides identifying new characteristics of infection by this virus, the researchers also discovered that the risk of death in the subacute phase was seven times greater for patients with diabetes than for patients without diabetes.

Their autopsy and histopathological analysis of fatal cases pointed to viral infection as the cause of bloodstream disorders and fluid imbalances in the brain, heart, lungs, kidneys, spleen, and liver.

"The study confirmed some previous clinical findings about death from chikungunya and also brought up novel aspects of the disease and its lethality. This new information, obtained in a painstaking analysis of the Ceará outbreak, will contribute to the recognition of the factors that cause severity and also to further research to develop better treatments in future," said Luiz Tadeu Moraes Figueiredo, a professor at FMRP-USP and also a co-author of the article.

Figueiredo is engaged in research supported by São Paulo Research Foundation - FAPESP on high-throughput sequencing (HTS) to identify and characterize viruses without requiring viral isolation or cell culture. He is particularly interested in MinION genome sequencing, which is faster and more affordable than other approaches. The technology also reads RNA and DNA in real time and in a single stage.

Based on their analysis, the authors of the study concluded that older people and children were not at greater risk of dying from chikungunya than other age groups, in contrast with the profile typical of arbovirus epidemics. In the 2017 outbreak, most of the fatal victims were middle-aged.

"We normally associate arboviruses with hospitalizations and deaths for elderly patients and infected children, but our analysis of these 100 fatal cases showed that a majority [over 60%] of those with infection in the central nervous system were adults aged 40 or more," Souza said, adding that patients aged anywhere from 3 days to 85 years were among the rest of the fatal victims.

The findings show that defective or suppressed immunity is not necessarily the main source of susceptibility to the disease in such outbreaks. "Many of the victims were healthy young adults under 40, and most had no comorbidities," he said. "The analysis added another layer to our knowledge of the disease and can be extremely important to clinical practice. Even greater attention should be paid to this age group, which is also at great risk of dying."

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Water-energy nanogrid provides solution for rural communities lacking basic amenities

image: The water-energy nanogrid. Electricity generated at solar panels during peak availability is used to run a water nanofiltration system. Any excess energy is either fed to the battery pack or is available for basic household use.

Image: 
Dr. Le Xie and Dr. Shankar Chellam/Texas A&M University College of Engineering

Researchers at Texas A&M University have come up with an economical, green solution that can help underprivileged communities with their water and electricity needs.

Their standalone water-energy nanogrid consists of a purification system that uses solar energy to decontaminate water. The setup, they said, is mathematically tuned to use solar energy optimally so that the water filtration is unhindered by the fluctuations of solar energy during the course of the day.

"To serve areas that are remote and isolated, the infrastructural cost of laying down new water pipes or setting up an electricity grid is enormous and can take a very long time," said Le Xie, professor in the Department of Electrical and Computer Engineering. "To overcome these hurdles, we presented a cost-effective solution that uses solar energy to both purify water and generate electricity for basic household use."

The researchers have described their technology in the journal Applied Energy.

In the United States, the colonias are among the many rural, low-income communities along the Texas-Mexico border where basic resources are not readily available. Since the colonias are remote, their residents, consisting mainly of migrant workers, are isolated from major utility and water treatment facilities and thus have limited means for electricity and safe drinking water. Methods like boiling water can be cost-prohibitive and inadequate.

"Boiling water is one of the most expensive ways of decontamination because it takes a lot of energy to heat water," said Shankar Chellam, professor in the Zachry Department of Civil and Environmental Engineering. "Also, although boiling gets rid of biological contaminants, it does not remove many chemical contaminants. We needed a solution that could address both these problems at the same time."

An efficient way to decontaminate water is by passing it through purification systems. These machines use pumps to push water through a filter. However, the pumps require electricity, which is again scarce in the colonias. So, the researchers looked for a solution that would help with both the power and water requirements of the colonia residents.

First, to cut the dependence on centralized sources of power and water, Xie, Chellam and their team conceptualized an energy-water nanogrid, which is a standalone, truck-mountable filtration system with pumps that could run on solar-generated electricity. Next, they developed a cost-minimization mathematical scheme, called scenario-based optimization framework, that minimized the total expenditure for the standalone setup by selecting the type of filter, the number and size of solar panels and the size of the solar battery.
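
A minimal sketch of what a scenario-based sizing calculation of this kind could look like is given below, assuming a brute-force search over filter type, panel count and battery capacity; all component names, costs and energy figures are made-up illustrations, not values from the Applied Energy paper.

from itertools import product

FILTERS = {"nanofiltration": {"capital": 4000, "kwh_per_m3": 1.0},
           "reverse_osmosis": {"capital": 6000, "kwh_per_m3": 2.5}}
PANEL_COST = 300      # $ per panel (illustrative)
BATTERY_COST = 200    # $ per kWh of storage (illustrative)
WEEKLY_DEMAND_M3 = 10.0

def feasible(filt, n_panels, batt_kwh, scenario):
    # scenario: daily solar kWh produced per panel over one week
    need_per_day = WEEKLY_DEMAND_M3 / 7.0 * FILTERS[filt]["kwh_per_m3"]
    stored, water = 0.0, 0.0
    for solar_per_panel in scenario:
        energy = n_panels * solar_per_panel + stored
        used = min(energy, need_per_day)                  # run the pump as needed
        water += used / FILTERS[filt]["kwh_per_m3"]
        stored = min(batt_kwh, energy - used)             # carry surplus to the next day
    return water >= WEEKLY_DEMAND_M3

def cheapest_design(scenarios):
    best = None
    for filt, n_panels, batt in product(FILTERS, range(1, 31), range(0, 21)):
        cost = FILTERS[filt]["capital"] + n_panels * PANEL_COST + batt * BATTERY_COST
        if all(feasible(filt, n_panels, batt, s) for s in scenarios):
            if best is None or cost < best[0]:
                best = (cost, filt, n_panels, batt)
    return best   # (total cost, filter type, number of panels, battery kWh) or None

For example, cheapest_design([[4, 5, 3, 6, 5, 4, 2], [2, 3, 2, 4, 3, 2, 1]]) would return the least expensive configuration that meets the weekly demand under both of those hypothetical solar scenarios.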

This model revealed that if nanofiltration, a type of purification technique, was used, harvesting solar energy just during peak availability was sufficient to run the pumps and purify water. In other words, the water nanofiltration system was largely unaffected by the day-to-day vagaries of solar energy and could purify enough water to meet the weekly water needs of the community. In this way, any excess solar power not used for filtration could either be stored in the battery pack or used for other minor household needs, like charging cell phone batteries.

The researchers noted that although the nanofiltration system is more sophisticated and expensive than other filtration methods, its overall merit is that it can successfully desalinate and remove chemicals, like arsenic, present in local groundwater. They said nanofiltration is a preferable method for desalination and water purification for other remote regions where the contaminants within the water are not already known.

"We have for the first time used a very rigorous mathematical approach to interlink water purification and energy provision," Chellam said. "This lays out a quantitative framework that can be used in not just the colonias but in any scenario based on local conditions."

Credit: 
Texas A&M University

Using artificial intelligence can improve pregnant women's health

Researchers from the University of Seville have carried out a rigorous and detailed analysis of how artificial intelligence has been used with pregnant women over the last twelve years. The analysis confirmed that disorders such as congenital heart defects, macrosomia, gestational diabetes and preterm birth can be detected earlier when artificial intelligence is used. In the latter case, AI-based studies found a correlation between the number of preterm births and the environmental pollution to which the pregnant women had previously been exposed.

"There is growing interest in the application of artificial intelligence in obstetrics and gynecology. These applications of AI can not only monitor women's health during pregnancy, but can also help to improve the universal provision of health services, especially in the most disadvantaged areas. This field therefore contributes to improving both individual and public health," says University of Seville researcher María del Carmen Romero.

Furthermore, this work reveals the almost total lack of studies where emotions are taken into account as input parameters in risk prediction models in pregnancy (only 1.28% of the studies analyzed). Moreover, very few studies look closely at the pregnant woman's mental health (only 5.1% of the studies analyzed), despite it having been shown that the woman's psychological health is correlated with the risk of suffering certain diseases typical of pregnancy. Pregnancy is a vital state that brings with it the need for change and new learning, potentially causing anxiety, fear, worry, and even depression in women.

Systems based on affective computing could allow emotional interaction with the pregnant woman and, for example, detect emotional changes and make it possible to offer guidance or recommendations, which the system would previously have received from doctors. This can make the patient feel safer and closer to her health service and can reduce the usual feelings of anxiety or worry that sometimes lead to physical problems.

"Given that there is previous scientific evidence that supports the idea that the emotional state and mental health of the pregnant woman can influence the occurrence of risks in pregnancy, our study highlights what is a very interesting multidisciplinary research niche for affective computing in the field of health and well-being of pregnant women," the researcher adds.

Credit: 
University of Seville

Monitoring open-cast mines better than before

image: Mean deformation rates over the Hambach open-pit mine in North Rhine-Westphalia, Germany, retrieved from different SAR satellites. Negative and positive values correspond to subsidence and uplift in the line-of-sight (LOS) direction from satellite to the ground.

Image: 
Wei Tang, Mahdi Motagh, Wei Zhan

When it comes to safety in open-cast mining, soil stability is one of the most critical factors. Settlement of the ground or slipping of slopes poses a great risk to buildings and people. Now Mahdi Motagh from the German Research Centre for Geosciences GFZ, in cooperation with Chinese scientists, has evaluated data from the Sentinel 1 mission of the European Union's Copernicus program and thus demonstrated new possibilities for monitoring mining areas. The three researchers used a special radar method, the Synthetic Aperture Radar Interferometry (InSAR), to investigate lignite regions in North Rhine-Westphalia in Germany. They reported on this in the "International Journal of Applied Earth Observation and Geoinformation".

The InSAR method in itself is not new and is used in many places to detect ground deformations, whether after earthquakes or subsidence due to the overexploitation of underground water reservoirs. However, it had one decisive disadvantage: InSAR satellites such as ERS or ENVISAT only record a certain region on average once a month or less. "With its six-day repeat time interval and small orbital tube, the Sentinel 1 mission provides SAR data that help us to investigate hazards in very specific mining areas in Germany in much greater detail in terms of time and space than before," reports Mahdi Motagh, "and we can do this in near real time." The mission is also able to provide a comprehensive overview of the situation in the mining industry. By combining the results of this new technology with other on-site measurements and high-resolution SAR systems such as the German TerraSAR-X, the geotechnical risk of open-cast mines could be assessed far more completely than before.
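
For readers unfamiliar with how InSAR turns radar phase into ground motion, the following back-of-the-envelope sketch shows the standard conversion from an unwrapped interferometric phase change to line-of-sight displacement for a C-band sensor such as Sentinel-1; the wavelength value is approximate and the sign convention varies between processing chains.

import math

SENTINEL1_WAVELENGTH_M = 0.0555   # C-band wavelength, roughly 5.55 cm

def los_displacement_m(delta_phase_rad, wavelength_m=SENTINEL1_WAVELENGTH_M):
    # One full 2*pi fringe corresponds to half a wavelength of motion along the
    # line of sight; the sign (toward or away from the satellite) depends on the
    # convention adopted by the processing chain.
    return -wavelength_m * delta_phase_rad / (4.0 * math.pi)

# Example: one full fringe (2*pi radians) is about 2.8 cm of LOS displacement.
print(abs(los_displacement_m(2.0 * math.pi)))   # ~0.0277 m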

The work shows that there is significant land subsidence in the open-cast mining areas of Hambach, Garzweiler and Inden. The reason for this is the compaction of overburden over refilled areas, with subsidence rates varying between 30 and 50 centimeters per year over Inden, Hambach and Garzweiler. Satellite data also showed a significant horizontal shift of up to 12 centimeters per year at one mine face. The former open pits Fortuna-Garsdorf and Bergheim in the eastern part of the Rhenish coal fields, which have already been reclaimed for agriculture, also show subsidence rates of up to 10 centimeters per year.

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

Effective government saves lives in cyclones, other disasters

ITHACA, N.Y. - In 2008, Cyclone Nargis killed more than 138,000 people in Myanmar. It was a powerful category 3 or 4 storm at landfall, but tropical storms with similar wind speeds that year resulted in far fewer fatalities in other countries.

Elizabeth Tennant, postdoctoral associate in economics at Cornell University, wondered: What made the difference?

To quantify the relationship between natural disaster outcomes and the effectiveness of governments and other institutions, Tennant and co-author Elisabeth Gilmore, associate professor in the Department of International Development, Community and Environment at Clark University, analyzed data from more than 1,000 tropical cyclones from 1979 to 2016. They found, in a paper published Nov. 3 in PNAS, that effective national and local governments are associated with fewer deaths from tropical cyclone disasters - even in countries with similar levels of wealth and development.

Moreover, storms concentrated in areas with weaker public services, as indicated by elevated infant mortality rates, are especially deadly, the researchers found.

"These results suggest that policies and programs to enhance institutional capacity and governance can support risk reduction from extreme weather events," Tennant wrote.

One of the original motivations of the study, Tennant said, was to better understand how effective institutions and governments can moderate the increasing risks of future extreme weather events due to climate change. This research contributes to the body of evidence that institutions are an important foundation for climate adaptation, Tennant said.

There are many examples indicating that strong institutions - including government - play a critical role in protecting populations from adverse effects of natural disasters, Tennant said. But it is much more difficult to determine how universal this relationship is, she said, because there is so much variation in the frequency and severity of storms.

Natural hazards such as cyclones, the researcher wrote, result in disasters only when vulnerable human systems are exposed to hazardous conditions. In their analysis, Tennant and Gilmore explicitly accounted for hazard exposure, connecting the analysis of governance and other indicators of well-being to estimates of the severity and exposure to the tropical cyclone hazard.

They used several data sources to gather information about people, places, events and storms, including: the National Oceanic and Atmospheric Administration; the Centre for Research on the Epidemiology of Disasters Emergency Events Database; and World Governance Indicators.

"We developed an approach where we carefully modeled the extent of the storms to match them to the measures of governance and living conditions in affected areas," Tennant said. "This helps us to identify what makes people vulnerable."

Tennant first became interested in the intersections of disasters and development during her time as a Peace Corps volunteer in Honduras, where resources were constrained.

"I saw how a decade after the devastating Hurricane Mitch [1998], the disaster still affected the local communities and their well-being," Tennant said. "So what does disaster preparedness look like in a country where many people are without secure access to nutritional food and clean drinking water now? To what extent can investing in health, education and the quality of governments and institutions also serve as a useful foundation for disaster risk reduction activities?"

While the study does not suggest specific approaches to improving the quality and effectiveness of institutions, it does highlight their importance, Tennant said. "Ensuring that local institutions are involved and accountable for the delivery of public services may have multiple benefits," she said, "including reducing deaths from natural disasters."

And while the researchers completed the study before the onset of the COVID-19 pandemic, the results are consistent with lessons emerging from the virus, Tennant said: "In our view, the pandemic has provided an immediate example of how government effectiveness plays an important role in shaping societal risks, regardless of a country's wealth."

Credit: 
Cornell University