Culture

Water governance: Could less sometimes be more?

image: A diagram showing the contribution of each new rule to the overall capacity for governance coordination over time, i.e. the "improvement" in governance provided by each new rule. The different phases are visible, with an increasingly strong improvement until a turning point, after which the improvement becomes weaker. An example reading for the Swiss case (brown curve): from 1850 onwards, each new rule increasingly improves the ability to coordinate. This capacity stagnated at its peak during the first part of the 20th century, then declined gradually. Thus, by 2006, the capacity of each new rule to improve coordination had returned to a level comparable to that reached in the second half of the 19th century.

Image: 
© UNIGE

The use of environmental resources has been regulated for centuries with the aim of continually improving the management and behaviour of private and public actors. But does the never-ending introduction of new regulations really have a positive effect? Or does a surfeit of rules cause malfunctions and lead to disruptive overlaps? In an attempt to answer these questions, researchers from the Universities of Geneva (UNIGE) and Lausanne (UNIL), Switzerland, analysed water governance regulations in six European countries from 1750 to 2006. Their results, published in the journal Ecological Economics, show that rules designed to improve resource management eventually come into conflict in the long run, creating an equal number of positive and negative effects until the system falls apart. At this point, the only way out is for the state to overhaul governance.

Societies have been making rules for centuries to control behaviour and the use of natural resources such as water. At the same time, however, the competing interests of state and private actors continue to produce environmental problems. Overall, the scientific literature agrees that the way these regulations are structured has nevertheless evolved in an increasingly positive and effective direction. But to what extent is this really the case in the long run?

"To assess whether a regulation is positive in the long run, you need to factor in the ecosystem of rules that it is part of, and which it may either reinforce or disrupt", begins Thomas Bolognesi, a researcher at the Institute for Environmental Sciences (ISE) at UNIGE. In fact, a rule that has a positive impact on the use it regulates may cause turmoil once it begins to interact with existing regulations, making the entire system malfunction - a friction the authors conceptualise as transversal transaction costs (TTCs). "And over the very long term", adds the Geneva-based scientist, "the negative effect of TTCs can grow and end up being equivalent to the positive effect generated by the new regulation, creating what we called an institutional complexity trap." The quality of governance therefore rests on two key components: scope, i.e. the set of uses governed by the rules (quantity); and coherence, i.e. the fact that the rules are defined and followed correctly (quality).

Successive improvements to the system lead to breaking point

To test their hypothesis, Bolognesi and Stéphane Nahrath, a professor at UNIL's Swiss Graduate School of Public Administration (IDHEAP), scrutinised the water governance systems in six European countries (Switzerland, Belgium, Spain, France, Italy and the Netherlands) from 1750 to 2006. "The aim of the study was to determine whether the increase in the scope of the governance reduced the system-wide coherence, and even went as far as overriding the positive effects intended by the additional regulations", says professor Nahrath. The researchers identified three distinct phases in the evolution of the governance in the six countries.

The first phase, which lasted from 1750 to 1850 and was followed by around 50 years of stagnation, covered the launch of the governance process, i.e. the production of framework rules that had relatively little impact. From 1900 to 1980, governance developed and the rules, which grew in precision, generated significant positive effects. But since 1980, we have entered a phase where the negative indirect effect, linked to a drop in the system's coherence, has been reinforced and offsets the previous positive effect, even to the point of supplanting it. "This is due to the creation of a profusion of new rules, especially following the introduction of the New Public Management approach in the 1980s", notes Bolognesi. This proliferation of regulations, which were sometimes designed to regulate the same area but along different lines, had an indirect negative impact on governance and resulted in a decrease in efficiency and clarity, leading to a systemic malfunction. "Consequently, to achieve a positive effect - as slim as it is - more and more rules need to be produced, increasing the risk of malfunction and leading to a vicious circle", continues Nahrath.

System reformed by the state

Contrary to the widespread idea that water governance is constantly improving, the study by the researchers from UNIGE and UNIL demonstrates the conflicts instigated by repeatedly introducing new rules designed to increase the system's efficiency. "If we carry on in the same way, we're going to hit breaking point", warns Bolognesi. "That's why we think it's important that the state and government policy should take charge of environmental governance issues. That way, we can avoid introducing separate rules that generate frictions and uncertainties, and that could create insurmountable obstacles for coordinating the system." As professor Nahrath concludes: "The contractual rules must in no instance take precedence over state rules."

Credit: 
Université de Genève

Plant physiology: One size may not suit all

A new study published by biologists at Ludwig-Maximilians-Universitaet (LMU) in Munich demonstrates that there are no simple or universal solutions to the problem of engineering plants to enable them to cope with the challenges posed by climate change.

For plants, climate change promises one thing for sure - increased levels of stress. After all, plants put down roots. They don't have the option of moving to where the weather suits them. Wider fluctuations in temperatures and increasing levels of aridity in many regions around the world are already making their lives more difficult. Plants are highly complex and sensitive systems. Even in zones with stable climates today, variations in light levels can reduce growth rates and crop yields. For example, plants have developed sophisticated cellular mechanisms that protect them against the deleterious effects of high light intensities on photosynthesis. In one such photoprotective process, the excess light energy is dissipated as heat before it can damage the photosynthetic apparatus. This depresses yields but it is very much in the plant's interest.

Three enzymes, referred to as V, P and Z for short, play a key role in this adaptation process. In a paper published in 2016, which drew a great deal of attention, an American research group overexpressed the genes for these three proteins in tobacco plants, thus increasing the amounts of the enzymes produced in the leaves. They subsequently observed, under field conditions, that these 'VPZ' lines grew at faster rates than control plants with normal levels of the enzymes. LMU biologists Antoni Garcia-Molina and Dario Leister have now performed essentially the same experiment in the model plant Arabidopsis thaliana (thale cress). Their findings appear in the journal Nature Plants.

Their results confirm that, as in the case of tobacco, higher levels of V, P and Z reduce rates of photosynthesis while enabling the plants to adapt more rapidly (in fact, even faster than tobacco) to fluctuating light levels. Crucially however, the Arabidopsis VPZ lines did not grow faster than control plants. On the contrary, overexpression of the three enzymes resulted in retarded growth. "This clearly shows that it's not quite as easy to produce plants that are better adapted as some research groups have confidently suggested," Leister remarks. "In fact, higher levels of photoprotection may actually interfere with the operation of other mechanisms that are important for plant growth."

For Leister, these data essentially demonstrate that targeted adaptation of plants to facilitate successful adjustment to changing climatic conditions is likely to be a very complicated task. They certainly show that one cannot always expect to confer increased resistance to desiccation or optimize yields under fluctuating light levels simply by adjusting the levels of a few proteins. "The physiological processes in plants are tightly interconnected. This makes it impossible to predict the effects of flipping this switch or tightening that screw," he says. This explains why he and his colleagues approach the problem of targeted adaptation from the perspective of systems biology, which takes a 'holistic' view, as he calls it. For example, efforts to increase the yield or biomass by increasing the efficiency of photosynthesis must also ensure that the extra energy available is in fact channeled into increased growth. In principle, enhanced photosynthetic performance should result in the capture of more energy and in higher levels of metabolites. But this extra energy and abundance of chemical compounds must be put to some beneficial use. In the absence of any 'added value', increased rates of photosynthesis can prove to be detrimental to plants.

The analysis of complex relationships like this is the raison d'être of the Transregional Collaborative Research Center TR175, of which Leister is the principal coordinator. The scientists involved in the project seek to understand how plants react to biotic and abiotic environmental factors, such as drought, light levels and temperature, by analyzing their impact on the concentrations of all measurable metabolites, transcripts and proteins in plant cells. With the help of these data, they hope to identify the key components that allow plants to cope with varying conditions. In the case of crop plants that are indispensable for human nutrition, the mechanisms that underlie trade-offs between growth rates, increases in biomass and yields must also be taken into consideration. "In the context of climate change, the idea is to help plants to adapt to the changing conditions by introducing targeted genetic changes that allow them to handle the altered environmental parameters," Leister explains. Researchers refer to this strategy as 'assisted evolution'. "In order to have a realistic chance of finding sustainable solutions, we must adopt a systematic approach to the active adaptation of plants to the changing environmental conditions," he says. In this respect, some progress has already been made in certain species of algae that have very short generation times, which permits instances of successful adaptation to be rapidly detected. Such systems can then serve as sources of potentially useful genetic mutations that can be introduced into green plants.

Credit: 
Ludwig-Maximilians-Universität München

It's not about East and West, it's about top and bottom

Overall, 93 per cent of the German populace feels valued in their everyday lives, whereas far fewer - but still one out of two (52 per cent) - feel disrespected. Most Germans experience high levels of appreciation overall, especially in private contexts such as among family or friends. Disrespect, however, is most commonly experienced in the workplace. East and West Germans feel equally appreciated in their everyday lives - yet also equally disrespected. Instead, how much appreciation and disrespect a person experiences depends strongly on income, education, and employment status.

These are the results of a study conducted by a team of sociologists at the Otto von Guericke University Magdeburg (OvGU) in Germany. The study is based on the question module "Status Confidence and Anxiety" which was developed for the Innovation Sample of the German Socio-Economic Panel, a long-standing representative survey in Germany. Building on these data, sociologists Prof. Jan Delhey, Dr. Christian Schneickert, and Leonie Steckermeier (MA) investigate who experiences social appreciation and disrespect in Germany, in which contexts and why. According to Prof. Delhey, the study reveals quite surprising results.

Regarding both experiences - appreciation and disrespect in daily life - an individual's position on the socio-economic ladder makes a huge difference: the higher a person's income and level of education, the more they feel recognized and the less disrespected. "An individual's employment status is also crucial," Prof. Jan Delhey continues. "Being unemployed goes hand in hand with a significantly higher risk of experiencing less social recognition and more disrespect." In contrast, a person's regional or ethnic background proves negligible: the study finds no significant differences between East and West Germans, nor between migrants and non-migrants.

"Anyone interested in the unequal distribution of social recognition and disrespect in everyday life in Germany," Dr. Christian Schneickert concludes, "should devote attention to socio-economic inequalities rather than to socio-cultural diversity."

In addition, the researchers investigated the consequences of these everyday-life experiences for individuals' evaluations of their own lives and their satisfaction with democracy: The more people feel socially appreciated, the more satisfied they are with their lives as well as with democracy.

The study was carried out as part of the project "Recognition, Depreciation and Status Seeking" at the Chair for Macrosociology at the OVGU and was funded by the German Research Foundation (DFG).

The empirical analysis was based on survey data from a representative sample of the German population covering 3,580 respondents. The data were collected in 2016 as part of the Innovation Sample of the German Socio-Economic Panel. The data provide detailed insights into the distribution and significance of feelings of appreciation and disrespect.

The study has been published in the prestigious German social science journal Kölner Zeitschrift für Soziologie und Sozialpsychologie (KZfSS), and can be downloaded for free.

Credit: 
Otto-von-Guericke-Universität Magdeburg

MU scientists find oldest-known fossilized digestive tract -- 550 million years

image: A fossilized cloudinomorph from the Montgomery Mountains near Pahrump, Nev. This is representative of the fossil that was analyzed in the study.

Image: 
University of Missouri

A 550 million-year-old fossilized digestive tract found in the Nevada desert could be a key find in understanding the early history of animals on Earth.

Over half a billion years ago, life on Earth consisted of simple ocean organisms unlike anything living in today's oceans. Then, beginning about 540 million years ago, animal body structures changed dramatically.

During this time, the ancestors of many animal groups we know today appeared, such as primitive crustaceans and worms. Yet for years, scientists did not know how these two seemingly unrelated communities of animals were connected. An analysis of tubular fossils by scientists led by Jim Schiffbauer at the University of Missouri now provides evidence of a 550 million-year-old digestive tract -- one of the oldest known examples of fossilized internal anatomical structures -- and reveals what scientists believe is a possible answer to how these animals are connected.

The study was published in Nature Communications, a journal of Nature.

"Not only are these structures the oldest guts yet discovered, but they also help to resolve the long-debated evolutionary positioning of this important fossil group," said Schiffbauer, an associate professor of geological sciences in the MU College of Arts and Science and director of the X-ray Microanalysis Core facility. "These fossils fit within a very recognizable group of organisms -- the cloudinids -- that scientists use to identify the last 10 to 15 million years of the Ediacaran Period, or the period of time just before the Cambrian Explosion. We can now say that their anatomical structure appears much more worm-like than coral-like."

The Cambrian Explosion is widely considered by scientists to be the point in history of life on Earth when the ancestors of many animal groups we know today emerged.

In the study, the scientists used MU's X-ray Microanalysis Core facility to take a unique analytical approach for geological science -- micro-CT imaging -- that created a digital 3D image of the fossil. This technique allowed the scientists to view what was inside the fossil structure.

"With CT imaging, we can quickly assess key internal features and then analyze the entire fossil without potentially damaging it," said co-author Tara Selly, a research assistant professor in the Department of Geological Sciences and assistant director of the X-ray Microanalysis Core facility.
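The idea behind micro-CT described above - stacking 2D X-ray slices into a 3D voxel volume, then separating internal features from the surrounding rock by density - can be illustrated with a minimal, purely hypothetical sketch. The tiny synthetic volume, the density values, and the threshold below are illustrative assumptions, not details from the study's pipeline.

```python
# Conceptual sketch (not the study's actual pipeline): micro-CT yields a
# stack of 2D slices; stacked, they form a 3D voxel volume in which an
# internal feature can be segmented by a density threshold.

SIZE = 8  # voxels per edge of the toy volume


def make_volume():
    """Synthetic scan: rock matrix (density ~1.0) with a denser tube
    running through the middle (density ~3.0), standing in for an
    internal structure such as a fossilized gut."""
    vol = [[[1.0 for _ in range(SIZE)] for _ in range(SIZE)]
           for _ in range(SIZE)]
    for z in range(SIZE):  # the tube runs along the z axis
        vol[z][3][3] = 3.0
        vol[z][3][4] = 3.0
    return vol


def segment(vol, threshold):
    """Keep only the voxels denser than the threshold."""
    return [(z, y, x)
            for z, plane in enumerate(vol)
            for y, row in enumerate(plane)
            for x, density in enumerate(row)
            if density > threshold]


vol = make_volume()
feature = segment(vol, 2.0)
# 'feature' now traces the internal tube through every slice,
# without physically cutting the "fossil" open.
```

In a real workflow the volume would come from thousands of reconstructed scanner slices and the segmentation would be far more sophisticated, but the non-destructive principle is the same: the interior is read out digitally, slice by slice.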

Credit: 
University of Missouri-Columbia

Mayo Clinic discovers a molecular switch for repairing central nervous system disorders

ROCHESTER, Minn. -- A molecular switch has the ability to turn on a substance in animals that repairs neurological damage in disorders such as multiple sclerosis (MS), Mayo Clinic researchers discovered. The early research in animal models could advance a therapy already approved by the Food and Drug Administration, and could also lead to new strategies for treating diseases of the central nervous system.

Research by Isobel Scarisbrick, Ph.D., published in the Journal of Neuroscience finds that by genetically switching off a receptor activated by blood proteins, named Protease Activated Receptor 1 (PAR1), the body switches on regeneration of myelin, a fatty substance that coats and protects nerves.

"Myelin regeneration holds tremendous potential to improve function. We showed when we block the PAR1 receptor, neurological healing is much better and happens more quickly. In many cases, the nervous system does have a good capacity for innate repair," says Dr. Scarisbrick, principal investigator and senior author. "This sets the stage for development of new clinically relevant myelin regeneration strategies."

Myelin, Thrombin and the Nervous System

Myelin acts like a wire insulator that protects electrical signals sent through the nervous system. Demyelination, or injury to the myelin, slows electrical signals between brain cells, resulting in loss of sensory and motor function. Sometimes the damage is permanent. Demyelination is found in disorders such as MS, Alzheimer's disease, Huntington's disease, schizophrenia and spinal cord injury.

Thrombin is a protein in blood that aids in healing. However, too much thrombin triggers the PAR1 receptor found on the surface of cells, and this blocks myelin production. Oligodendrocyte progenitor cells capable of myelin regeneration are often found at sites of myelin injury, including demyelinating injuries in multiple sclerosis.

"These oligodendroglia fail to differentiate into mature myelin regenerating cells for reasons that remain poorly understood," says Dr. Scarisbrick. "Our research identifies PAR1 as a molecular switch of myelin regeneration. In this study, we demonstrate that blocking the function of the PAR1, also referred to as the thrombin receptor, promotes myelin regeneration in two unique experimental models of demyelinating disease."

The Research

The research focused on two mouse models. One was an acute model of myelin injury and the other studied chronic demyelination, each modeling unique features of myelin loss present in MS, Alzheimer's disease and other neurological disorders. Researchers genetically blocked PAR1 to block the action of excess thrombin.

The research not only identified a new molecular switch that turns on myelin regeneration, but also revealed a new interaction between the PAR1 receptor and a very powerful growth factor called brain-derived neurotrophic factor (BDNF). BDNF is like a fertilizer for brain cells that keeps them healthy, functioning and growing.

Significantly, the researchers found that a current Food and Drug Administration-approved drug that inhibits the PAR1 receptor also showed ability to improve myelin production in cells tested in the laboratory.

"It is important to say that we have not and are not advocating that patients take this inhibitor at this time," says Dr. Scarisbrick. "We have not used the drug in animals yet, and it is not ready to put in patients for the purpose of myelin repair. Using cell culture systems, we are showing that this has the potential to improve myelin regeneration."

Additional research is needed to verify and advance the findings toward clinical practice.

Credit: 
Mayo Clinic

Researchers develop new protocol to generate intestinal organoids in vitro

(Boston) - Boston researchers have developed a new way to generate groups of intestinal cells that can be used, among other things, to make disease models in the lab for testing treatments for diseases affecting the gastrointestinal system. Using human induced pluripotent stem cells, this novel approach combined a variety of techniques to develop three-dimensional groups of intestinal cells, called organoids, in vitro, expanding the possibilities for testing disease treatments in the lab using human cells.

Published online in Nature Communications, this process provides a novel platform to improve drug screenings and uncover novel therapies to treat a variety of diseases impacting the intestine, such as inflammatory bowel disease, colon cancer and Cystic Fibrosis.

Researchers at the Center for Regenerative Medicine (CReM) of Boston University and Boston Medical Center used donated human induced pluripotent stem cells (hiPSCs), which are created by reprogramming adult cells into a primitive state. For this study, these cells were pushed to differentiate into intestinal cells using specific growth factors in order to create organoids in a gel. This new protocol allowed the cells to develop without mesenchyme, which, in other protocols, typically provides support for the intestinal epithelial cells to grow. By taking out the mesenchyme, the researchers could study exclusively epithelial cells, which make up the intestinal tract.

In addition, using CRISPR technology, the researchers were able to modify and create a novel iPSC stem cell line that glowed green when differentiated into intestinal cells. This allowed the researchers to follow the process of how intestinal cells differentiate in vitro.

"Generating organoids in our lab allows us to create more accurate disease models, which are used to test treatments and therapies targeted to a specific genetic defect or tissue - and it's all possible without harming the patient," said Gustavo Mostoslavsky, MD, PhD, co-director of CReM and faculty in the gastroenterology section at Boston Medical Center. "This approach allows us to determine what treatments could be most effective, and which are ineffective, against a disease."

Using this new protocol, the researchers generated intestinal organoids from iPSCs containing a mutation that causes Cystic Fibrosis, which typically affects several organs, including the gastrointestinal tract. Using CRISPR technology, the researchers corrected the mutation in the intestinal organoids. The intestinal organoids with the mutation did not respond to a drug while the genetically corrected cells did respond, demonstrating their future potential for disease modeling and therapeutic screening applications.

The protocol developed in this study provides strong evidence to continue using human iPSCs to study development at the cellular level, tissue engineering and disease modeling in order to advance the understanding - and possibilities - of regenerative medicine.

"I hope that this study helps move forward our collective understanding about how diseases impact the gastrointestinal tract at the cellular level," said Mostoslavsky, who also is associate professor of medicine and microbiology at Boston University School of Medicine. "The continual development of novel techniques in creating highly differentiated cells that can be used to develop disease models in a lab setting will pave the way for the development of more targeted approaches to treat many different diseases."

Credit: 
Boston Medical Center

Deep learning differentiates small renal masses on multiphase CT

image: Except for AUC, all values are percentages. Ranges in parentheses are 95% CIs.

Image: 
<em>American Journal of Roentgenology</em> (AJR)

Leesburg, VA, January 10, 2020--A deep learning method with a convolutional neural network (CNN) can support the evaluation of small solid renal masses in dynamic CT images with acceptable diagnostic performance, according to an article published ahead-of-print in the March issue of the American Journal of Roentgenology (AJR).

Between 2012 and 2016, researchers at Japan's Okayama University studied 1807 image sets from 168 pathologically diagnosed small (≤ 4 cm) solid renal masses with four CT phases--unenhanced, corticomedullary, nephrogenic, and excretory--in 159 patients.

Masses were classified as malignant (n = 136) or benign (n = 32) using a 5-point scale, and this dataset was then randomly divided into five subsets.

As lead AJR author Takashi Tanaka explained, "four were used for augmentation and supervised training (48,832 images), and one was used for testing (281 images)."

Using the Inception-v3 CNN architecture, the researchers evaluated the AUC for malignancy and the accuracy at optimal cutoff values of the output data across six different CNN models.

Although they found no significant size difference between malignant and benign lesions, Tanaka's team did find that the AUC value of the corticomedullary phase was higher than that of the other phases (corticomedullary vs excretory, p = 0.022).

Additionally, the highest accuracy (88%) was achieved in the corticomedullary phase images.

Multivariate analysis revealed that the CNN model of the corticomedullary phase was a significant predictor for malignancy, "compared with other CNN models, age, sex, and lesion size," Tanaka concluded.
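The evaluation metrics the study reports - AUC for malignancy and accuracy at an optimal cutoff on the CNN's output scores - can be sketched in a few lines. The scores and labels below are made-up toy values, and choosing the cutoff via Youden's J statistic is a common convention assumed here, not a detail confirmed by the paper.

```python
# Hypothetical sketch: computing ROC AUC and accuracy at an optimal
# cutoff (Youden's J) from a classifier's output scores.
# Labels and scores are illustrative, not the study's data.

def roc_auc(labels, scores):
    """Rank-based AUC: probability a positive case outscores a negative."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


def optimal_cutoff(labels, scores):
    """Cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for l, s in zip(labels, scores) if l == 1 and s >= t)
        fn = sum(1 for l, s in zip(labels, scores) if l == 1 and s < t)
        tn = sum(1 for l, s in zip(labels, scores) if l == 0 and s < t)
        fp = sum(1 for l, s in zip(labels, scores) if l == 0 and s >= t)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t


labels = [1, 1, 1, 0, 0, 1, 0, 0]                    # 1 = malignant, 0 = benign
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.75, 0.2, 0.1]   # toy CNN outputs

auc = roc_auc(labels, scores)
t = optimal_cutoff(labels, scores)
acc = sum((s >= t) == l for l, s in zip(labels, scores)) / len(labels)
```

In the study itself, these metrics were computed separately for each CT phase's model on the held-out test subset, which is how the corticomedullary phase emerged as the strongest performer.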

Credit: 
American Roentgen Ray Society

Technique allows dolphin pregnancy exams to mirror those in humans

image: Ultrasound images of dolphin fetuses showing: (A) eye, (B) stomach, liver, lungs, (C) male genitalia (penis), and (d) female genitalia (vagina).

Image: 
NMMF

Ultrasound has been used for decades to study dolphin health and much of that work has been pioneered by veterinarians at the National Marine Mammal Foundation (NMMF). But in a groundbreaking study just published in Veterinary Radiology & Ultrasound, scientists have developed a new ultrasound technique for evaluating dolphin fetuses at all stages of gestation. This recent advancement in dolphin medicine allows veterinarians to evaluate dolphin pregnancies the same way doctors approach pregnancies in humans.

"We can now re-create the human 20-week fetal ultrasound exam in dolphins, which means we can better understand the health challenges dolphin mothers and their babies are facing," said NMMF Executive Director Dr. Cynthia Smith. "This is a game-changer for the conservation of bottlenose dolphins and other small cetaceans around the world."

To develop the technique, veterinarians followed 16 healthy pregnancies of dolphins in human care and determined normal findings for both the fetus and placenta. The Chicago Zoological Society's radiologist, Dr. Marina Ivančić, collaborated on the project.

"This new technique can be performed in a matter of minutes and provides a wealth of information about the health of the dolphin fetus," said Dr. Ivančić. "We are thrilled to make this technique widely available to veterinarians and radiologists, which has the potential to elevate dolphin medicine globally."

The work was part of an investigation into the long-term health effects of the Deepwater Horizon oil spill, which caused a major reduction in pregnancy success in bottlenose dolphins living within the oil spill footprint. The study was made possible by a research grant from the Gulf of Mexico Research Initiative and is already leading to major breakthroughs in scientists' ability to understand the reproductive failures.

Future applications of the technique and normal data will help identify fetal abnormalities and maternal illness, helping to determine the cause of reproductive failure in wild and managed dolphin populations.

"This advanced ultrasound technique is allowing us to diagnose problems as early as the first trimester of pregnancy in dolphins," said Dr. Forrest Gomez. "That gives us a chance to determine if there is something that could be done to save the pregnancy, which could prove critical for populations of dolphins and porpoises that are at risk."

The NMMF has also released an important study reporting the normal blood values from dolphins that reproduce successfully. This allows for the early detection of pregnancy problems, which can also help determine the cause of reproductive failures in populations that are struggling to thrive.

Credit: 
National Marine Mammal Foundation

Scientists examine how a gut infection may produce chronic symptoms

image: Along the edge of the small intestine, neurons (green) appear in close proximity to the inflammatory molecule Nlrp6 (pink).

Image: 
Laboratory of Mucosal Immunology

Sometimes the end of an intestinal infection is just the beginning of more misery. Of those who contract traveler's diarrhea, for example, an unlucky few go on to develop irritable bowel syndrome (IBS), a chronic inflammation of the intestinal tract.

Scientists aren't sure exactly how this happens, but some think an infection may contribute to IBS by damaging the gut nervous system. A new Rockefeller study takes a close look at why neurons in the gut die and how the immune system normally protects them.

Conducted with mice, the experiments described recently in Cell offer insight on IBS and could point toward potential new treatment approaches.

Keeping inflammation in check

In a healthy gut, the immune system must strike a careful balance between responding to threats and keeping that response in check to avoid damage.

"Inflammation helps the gut ward off an infection, but too much of it can cause lasting harm," says Daniel Mucida, an associate professor and head of the Laboratory of Mucosal Immunology. "Our work explores the complex mechanisms that prevent inflammatory responses from destroying neurons."

To understand the effects of an infection on the nervous system, Mucida and his colleagues gave mice a weakened form of Salmonella, a bacterium that causes food poisoning, and analyzed neurons within the intestine. They found that infection induced a long-lasting reduction of neurons, an effect they attributed to the fact these cells express two genes, Nlrp6 and Caspase 11, which can contribute to a specific type of inflammatory response.

This response, in turn, can ultimately prompt the cells to undergo a form of programmed cell death. When the researchers manipulated mice to eliminate these genes specifically in neurons, they saw a decrease in the number of neurons expiring.

"This mechanism of cell death has been documented in other types of cells, but never before in neurons," says Fanny Matheis, a graduate student in the lab. "We believe these gut neurons may be the only ones to die this way."

Macrophages to the rescue

It's not yet clear exactly how inflammation causes neurons to commit cell suicide, yet the scientists already have clues suggesting it might be possible to interfere with the process. The key may be a specialized set of gut immune cells, known as muscularis macrophages.

Previous work in Mucida's lab has shown that these cells express inflammation-fighting genes and collaborate with the neurons to keep food moving through the digestive tract. If these neurons die off, as happens in an infection, a possible result is constipation--one of a number of unpleasant IBS symptoms. In their recent report, the team demonstrates how macrophages come to the neurons' aid during an infection, ameliorating this aspect of the disorder.

Their experiments revealed that macrophages possess a certain type of receptor molecule that receives stress signals released by another set of neurons in response to an infection. Once activated, this receptor prompts the macrophage to produce molecules called polyamines, which the scientists think might interfere with the cell death process.

Getting back to normal

In other experiments, the researchers found that Salmonella infection alters the community of microbes within the guts of mice--and when they restored the animals' intestinal flora to normal, the neurons recovered.

"Using what we learned about the macrophages, one could think about ways to disrupt the inflammatory process that kills the neurons," says Paul Muller, a postdoctoral fellow in the lab.

For instance, it might be possible to develop better treatments for IBS that work by boosting polyamine production, perhaps through diet, or by restoring gut microbial communities. Since short-term stress responses also appear to have a protective effect, Muller thinks it may also be helpful to target that system.

Credit: 
Rockefeller University

SuperTIGER on its second prowl -- 130,000 feet above Antarctica

image: The Super Trans-Iron Galactic Element Recorder (SuperTIGER) instrument is used to study the origin of cosmic rays. SuperTIGER is a collaboration among Washington University in St. Louis, Goddard Space Flight Center, California Institute of Technology Jet Propulsion Laboratory and the University of Minnesota. The SuperTIGER instrument is carried aloft above Antarctica by a giant 39.5 million-cubic-foot scientific balloon. The balloon flies at a height of about 129,000 feet -- nearly four times the typical cruising altitude of commercial airliners. Here, the instrument is waiting on the launch pad as the balloon inflates.

Image: 
Wolfgang Zober, Washington University in St. Louis.

A balloon-borne scientific instrument designed to study the origin of cosmic rays is taking its second turn high above the continent of Antarctica three and a half weeks after its launch.

SuperTIGER (Super Trans-Iron Galactic Element Recorder) is designed to measure the rare, heavy elements in cosmic rays that hold clues about their origins outside of the solar system. The effort is a collaboration among Washington University in St. Louis, Goddard Space Flight Center, California Institute of Technology Jet Propulsion Laboratory and the University of Minnesota.

The longer the balloon and instrument are up, the better.

"The significance of our observation increases with the number of events we observe essentially linearly with time, so we simply want to have as long a flight as possible to maximize the statistics of the data collected," said Brian Rauch, research assistant professor of physics in Arts & Sciences at Washington University and principal investigator for SuperTIGER. "A day of data is a small increment of progress, and we just have to put our heads down and keep grinding away.

"SuperTIGER flights are marathons, not sprints."

'Fly as long as we can'

On Dec. 31, the balloon completed its first full revolution of Antarctica.

Little more than two weeks prior, Rauch and his team were celebrating a successful launch after a succession of challenging seasons on the ice.

"After three Antarctic seasons -- with 19 launch attempts, two launches and one recovery of the payload from a crevasse field -- it is wonderful to have SuperTIGER-2 finally reach float altitude and begin collecting scientific data. The third season is the charm!" Rauch said in a Dec. 15 news release.

NASA's Balloon Program Office called it a "picture-perfect launch," although the scientists suffered some technical setbacks once the instrument was in the air. There were problems with a power supply, Rauch said, and a computer failure eliminated one of the detector modules early in the flight.

"This only further emphasizes the importance for us to fly as long as we can to make up for the loss in instrument collecting power," he said. "As it is, in this flight we may hope to collect about 40% of the statistics achieved with the first SuperTIGER flight."

The 2012-13 SuperTIGER flight broke scientific ballooning records for longevity -- staying afloat for a remarkable 55 days. The current mission will not challenge that record.

"The way the stratospheric winds are circulating this season, our flight will be terminated when the balloon comes over a suitable location at the end of our second revolution around the continent," Rauch said.

Along for the ride

The balloon that carries SuperTIGER is also transporting four smaller experimental devices that are "piggybacked" onto its core scientific payload. The list includes two experiments by Washington University in St. Louis researchers:

A gamma-ray explorer developed by James H. Buckley, professor of physics in Arts & Sciences. Called APT-Lite, this instrument will be followed by a bigger device called APT, currently under development.
Another instrument, developed by Alex Meshik, research professor of physics, captured stratospheric air during the balloon's ascent. This air will be used for high-precision analyses of noble gas isotopes.

Sun never sets on SuperTIGER

"There are two SuperTIGERs (team members) still on the ice, both from Washington University," Rauch said.

"In order to provide continuous coverage for the duration of the flight, the day is divided into monitoring shifts," he said. "Those of us on the ice get to cover the graveyard shift for the folks in the United States.

"My routine has evolved into my getting up in the mid-afternoon, eating dinner, doing the monitoring shift in our office in Crary Lab, working in the office for another few hours or so, and then going to bed. (Graduate student in physics) Wolfgang Zober's routine is similar, but he usually arrives, eats and gets to the office earlier than me and calls it a 'day' closer to the end of our shift."

"When I'm not monitoring, I go walk on the trails, taking photos of penguins and seals," Zober said. "I also make it a habit of socializing with other science groups to learn about other research being done here."

There are few exceptions to the routine. The researchers did carve out a little time from their monitoring tasks to watch the Jan. 6 launch of another balloon experiment, the BLAST-TNG mission led by researchers at the University of Pennsylvania. That voyage ended only 15 hours into the flight due to technical issues.

For the SuperTIGER team, there was no stopping for the holidays.

"I managed to convince Wolfgang to take a break on New Year's Eve to experience IceStock while I covered the monitoring shift alone," Rauch said.

"I did step outside just before midnight to see the New Year in."

---
LAUNCH video available: https://youtu.be/yx-ZJpmUr7c
---

Want to follow the mission? Get updates on SuperTIGER as the flight progresses through the Washington University team's Twitter account, @SuperTigerLDB, or by following the Twitter handle @NASAUniverse.

Credit: 
Washington University in St. Louis

Prenatal Exposure to Flame Retardants Linked to Reading Problems

A new study from researchers at Columbia University Vagelos College of Physicians and Surgeons suggests that prenatal exposure to flame retardants may increase the risk of reading problems.

The study was published in the January 2020 print edition of Environment International.

An estimated 2 million children have learning disorders; of these, about 80% have a reading disorder. Genetics account for many, but not all, instances of reading disorders.

In the current study, the researchers hypothesized that in utero exposure to polybrominated diphenyl ethers (PBDEs)--a type of flame retardant that is known to have adverse effects on brain development--might alter the brain processes involved in reading. (While use of PBDEs has been banned, exposure to the compounds is still widespread because they do not degrade easily in the environment.)

The research team analyzed neuroimaging data from 33 five-year-old children--all novice readers--who were first given a reading assessment to identify reading problems. They also used maternal blood samples, taken during pregnancy, to estimate prenatal exposure to PBDEs.

The researchers found that children with a better-functioning reading network had fewer reading problems. They also showed that children with greater exposure to PBDEs had a less efficient reading network.

However, greater exposure did not appear to affect the function of another brain network involved in social processing that has been associated with psychiatric disorders such as autism spectrum disorder.

"Since social processing problems are not a common aspect of reading disorders, our findings suggest that exposure to PBDEs doesn't affect the whole brain--just the regions associated with reading," says Amy Margolis, PhD, assistant professor of medical psychology in the Department of Psychiatry at Columbia University Vagelos College of Physicians and Surgeons.

Although exposure to PBDEs affected reading network function in the 5-year-olds, it did not have an impact on word recognition in this group. The finding is consistent with a previous study, in which the effects of exposure to the compounds on reading were seen in older children but not in emergent readers. "Our findings suggest that the effects of exposure are present in the brain before we can detect changes in behavior," says Margolis. "Future studies should examine whether behavioral interventions at early ages can reduce the impact of these exposures on later emerging reading problems."

The paper is titled "Functional Connectivity of the Reading Network is Associated with Prenatal Polybrominated Diphenyl Ether Concentrations in a Community Sample of 5 Year-Old Children: A preliminary study."

Additional authors are Sarah Banker (Columbia University Irving Medical Center, New York, NY), David Pagliaccio (CUIMC), Erik De Water (Icahn School of Medicine at Mount Sinai, New York, NY), Paul Curtin (Icahn School of Medicine), Anny Bonilla (Icahn School of Medicine), Julie B. Herbstman (CUIMC), Robin Whyatt (CUIMC), Ravi Bansal (University of Southern California, Los Angeles, CA), Andreas Sjödin (Centers for Disease Control and Prevention, Atlanta, GA), Michael P. Milham (Child Mind Institute, New York, NY), Bradley S. Peterson (USC), Pam Factor-Litvak (CUIMC), Megan K. Horton (Icahn School of Medicine).

This work was supported by funding from the National Institute for Environmental Health Sciences (K23ES026239 to A.E.M., R00 ES020364 to M.K.H; R21 ES016610-01 to R.W.)

The authors report no financial or other conflicts of interest.

Credit: 
Columbia University Irving Medical Center

Lonely in a crowd: Overcoming loneliness with acceptance and wisdom

image: Dilip Jeste, MD, senior associate dean for the Center of Healthy Aging and Distinguished Professor of Psychiatry and Neurosciences at UC San Diego School of Medicine. Photo by Erik Jepson, UC San Diego Publications.

Image: 
Erik Jepson, UC San Diego Publications

By nature, human beings are social creatures. Yet, as we age, personal dynamics and lifestyles change, which can result in loneliness and isolation. With older adults increasingly moving into senior living or retirement communities, researchers at University of California San Diego School of Medicine sought to identify the common characteristics of residents who feel lonely in these environments.

"Loneliness rivals smoking and obesity in its impact on shortening longevity," said senior author Dilip V. Jeste, MD, senior associate dean for the Center of Healthy Aging and Distinguished Professor of Psychiatry and Neurosciences at UC San Diego School of Medicine. "It is a growing public health concern, and it's important that we identify the underlying causes of loneliness from the seniors' own perspectives so we can help resolve it and improve the overall health, well-being and longevity of our aging population."

Jeste noted that there are few published qualitative studies about loneliness among older adults in the independent living sector of senior housing communities, where shared common areas, planned social outings and communal activities are intended to promote socialization and reduce isolation. "So why are many older adults living in this type of housing still experiencing strong feelings of loneliness?" asked Jeste.

The new study, published online in the January 10, 2020 issue of Aging and Mental Health, found that people's experience of living with loneliness is shaped by a number of personal and environmental factors.

Researchers conducted one-and-a-half-hour individual interviews of 30 adults ages 67 to 92, part of an overall study evaluating the physical, mental and cognitive functions of 100 older adults living in the independent living sector of a senior housing community in San Diego.

In this communal setting, 85 percent of the residents reported moderate to severe levels of loneliness. "Loneliness is subjective," said Jeste. "Different people feel lonely for different reasons despite having opportunities and resources for socialization. This is not a one size fits all topic."

Three main themes emerged from the study:

Age-associated losses and inadequate social skills were considered to be primary risk factors for loneliness. "Some residents talked about the loss of spouses, siblings and friends as the cause of their loneliness. Others mentioned how making new friends in a senior community cannot replace deceased friends they grew up with," said first author Alejandra Paredes, PhD, a research fellow in the Department of Psychiatry at UC San Diego School of Medicine.
The feeling of loneliness was frequently associated with a lack of purpose in life. "We heard powerful comments like, 'It's kind of gray and incarcerating,'" said Jeste. "Others expressed a sense of 'not being attached, not having very much meaning and not feeling very hopeful' or 'being lost and not having control.'"
The research team also found that wisdom, including compassion, seemed to be a factor that prevented loneliness. "One participant spoke of a technique she had used for years, saying 'if you're feeling lonely, then go out and do something for somebody else.' That's proactive," said Jeste. Other protective factors were acceptance of aging and comfort with being alone. "One resident told us, 'I've accepted the aging process. I'm not afraid of it. I used to climb mountains. I want to keep moving, even if I have to crawl. I have to be realistic about getting older, but I consider and accept life as a transition,'" Jeste noted. "Another resident responded, 'I may feel alone, but that doesn't mean I'm lonely. I'm proud I can live by myself.'"

According to the National Center for Health Statistics, by 2029, more than 20 percent of the United States population will be over the age of 65. "It is paramount that we address the well-being of our seniors -- they are friends, parents and grandparents of the younger generations," said Jeste. "Our study is relevant to better understand loneliness within senior housing and other settings so we can develop effective interventions."

Credit: 
University of California - San Diego

Satellite constellations harvest energy for near-total global coverage

ITHACA, N.Y. - Think of it as a celestial parlor game: What is the minimum number of satellites needed to see every point on Earth? And how might those satellites stay in orbit and maintain continuous 24/7 coverage while contending with Earth's gravity field, its lumpy mass, the pull of the sun and moon, and pressure from solar radiation?

In the mid-1980s, researcher John E. Draim proposed what is generally considered to be the ideal solution: a four-satellite constellation. However, the amount of propellant needed to keep the satellites in place, and the ensuing cost, made the configuration unfeasible.

Now, a National Science Foundation-sponsored collaboration led by Patrick Reed, the Joseph C. Ford Professor of Engineering at Cornell University, has discovered the right combination of factors to make a four-satellite constellation possible, which could drive advances in telecommunication, navigation and remote sensing. And in an ingenious twist, the researchers accomplished this by making the forces that ordinarily degrade satellites instead work in their favor.

"One of the interesting questions we had was, can we actually transform those forces? Instead of degrading the system, can we actually flip it such that the constellation is harvesting energy from those forces and using them to actively control itself?" Reed said.

Their paper, "Low Cost Satellite Constellations for Nearly Continuous Global Coverage," published Jan. 10 in Nature Communications.

The AI-based evolutionary computing search tools that Reed has developed are ideally suited for navigating the numerous complications of satellite placement and management.

For this project, Reed collaborated with researchers from The Aerospace Corporation, combining his algorithmic know-how with the company's expertise in cutting-edge astrophysics, operational logistics and simulations.

In order to sift through the hundreds of thousands of possible orbits and combinations of perturbations, the team used the Blue Waters supercomputer at the University of Illinois at Urbana-Champaign. Blue Waters compressed 300 or 400 years' worth of computational exploration into the equivalent of roughly a month of actual computing, Reed said.

They winnowed their constellation designs to two models that could orbit for either a 24- or 48-hour period and achieve continuous coverage over 86% and 95% of the globe, respectively. While 100% performance coverage would be ideal in theory, the researchers found that sacrificing only 5%-14% created greater gains in terms of harvesting energy from the same gravitational and solar radiation forces that would normally make a satellite constellation short lived and difficult to control.

The tradeoff is worth it, Reed said, especially since satellite operators could control where the gaps in coverage would occur. Outages in these low-priority regions would last approximately 80 minutes a day, at most, in the worst-case scenario.

"This is one of those things where the pursuit of perfection actually could stymie the innovation," Reed said. "And you're not really giving up a dramatic amount. There might be missions where you absolutely need coverage of everywhere on Earth, and in those cases, you would just have to use more satellites or networked sensors or hybrid platforms."

Using this type of passive control could potentially extend a constellation's lifespan from five years to 15 years. These satellites would require less propellant and would float at higher elevations, removing them from the risky high-traffic zone of low Earth orbit. But perhaps the biggest selling point is the low cost. Commercial interests or countries without the financial resources to launch a large constellation of satellites could attain near-continuous global coverage very economically, with reduced long-term technical overhead.

"Even one satellite can cost hundreds of millions or billions of dollars, depending on what sensors are on it and what its purpose is. So having a new platform that you can use across the existing and emerging missions is pretty neat," Reed said. "There's a lot of potential for remote sensing, telecommunication, navigation, high-bandwidth sensing and feedback around the space, and that's evolving very, very quickly. There's likely all sorts of applications that might benefit from a long-lived, self-adapting satellite constellation with near global coverage."

The paper's lead author is Lake Singh with The Aerospace Corporation. Researchers from the University of California, Davis, also contributed.

"We leveraged Aerospace's constellation design expertise with Cornell's leadership in intelligent search analytics and discovered an operationally feasible alternative to the Draim constellation design," said Singh, systems director for The Aerospace Corporation's Future Architectures department. "These constellation designs may provide substantive advantages to mission planners for concepts out at geostationary orbits and beyond."

Credit: 
Cornell University

Study puts the 'Carib' in 'Caribbean,' boosting credibility of Columbus' cannibal claims

image: Researchers analyzed the skulls of early Caribbean inhabitants, using 3D facial "landmarks" as a genetic proxy for determining how closely people groups were related to one another.

Image: 
Ann Ross/North Carolina State University

GAINESVILLE, Fla. --- Christopher Columbus' accounts of the Caribbean include harrowing descriptions of fierce raiders who abducted women and cannibalized men - stories long dismissed as myths.

But a new study suggests Columbus may have been telling the truth.

Using the equivalent of facial recognition technology, researchers analyzed the skulls of early Caribbean inhabitants, uncovering relationships between people groups and upending longstanding hypotheses about how the islands were first colonized.

One surprising finding was that the Caribs, marauders from South America and rumored cannibals, invaded Jamaica, Hispaniola and the Bahamas, overturning half a century of assumptions that they never made it farther north than Guadeloupe.

"I've spent years trying to prove Columbus wrong when he was right: There were Caribs in the northern Caribbean when he arrived," said William Keegan, Florida Museum of Natural History curator of Caribbean archaeology. "We're going to have to reinterpret everything we thought we knew."

Columbus had recounted how peaceful Arawaks in modern-day Bahamas were terrorized by pillagers he mistakenly described as "Caniba," the Asiatic subjects of the Grand Khan. His Spanish successors corrected the name to "Caribe" a few decades later, but the similar-sounding names led most archaeologists to chalk up the references to a mix-up: How could Caribs have been in the Bahamas when their closest outpost was nearly 1,000 miles to the south?

But skulls reveal the Carib presence in the Caribbean was far more prominent than previously thought, giving credence to Columbus' claims.

Face to face with the Caribbean's earliest inhabitants

Previous studies relied on artifacts such as tools and pottery to trace the geographical origin and movement of people through the Caribbean over time. Adding a biological component brings the region's history into sharper focus, said Ann Ross, a professor of biological sciences at North Carolina State University and the study's lead author.

Ross used 3D facial "landmarks," such as the size of an eye socket or length of a nose, to analyze more than 100 skulls dating from about A.D. 800 to 1542. These landmarks can act as a genetic proxy for determining how closely people are related to one another.
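The core idea -- comparing sets of corresponding 3D facial landmarks to gauge how alike two skulls are -- can be sketched in a few lines. This is a toy illustration of the geometric-morphometric concept only, not the study's actual statistical pipeline, and the coordinates in the usage example are invented:

```python
from math import sqrt, fsum

def landmark_distance(a, b):
    """Root-mean-square distance between two sets of corresponding 3D
    landmarks after centering each set on its centroid, so that pure
    translation between skulls does not count as shape difference."""
    def centered(pts):
        c = [fsum(p[d] for p in pts) / len(pts) for d in range(3)]
        return [[p[d] - c[d] for d in range(3)] for p in pts]
    a, b = centered(a), centered(b)
    sq = fsum((pa[d] - pb[d]) ** 2 for pa, pb in zip(a, b) for d in range(3))
    return sqrt(sq / len(a))

# Hypothetical landmark sets: skull_b is skull_a shifted by (5, 1, 0),
# so after centering the shapes are identical and the distance is 0.
skull_a = [[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 2.0, 0.0]]
skull_b = [[5.0, 1.0, 0.0], [7.0, 1.0, 0.0], [5.0, 3.0, 0.0]]
print(landmark_distance(skull_a, skull_b))
```

Smaller distances suggest closer relatedness; the published analysis additionally handles rotation, scaling and population-level statistics, which this sketch omits.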

The analysis not only revealed three distinct Caribbean people groups, but also their migration routes, which was "really stunning," Ross said.

Looking at ancient faces shows the Caribbean's earliest settlers came from the Yucatan, moving into Cuba and the Northern Antilles, which supports a previous hypothesis based on similarities in stone tools. Arawak speakers from coastal Colombia and Venezuela migrated to Puerto Rico between 800 and 200 B.C., a journey also documented in pottery.

The earliest inhabitants of the Bahamas and Hispaniola, however, were not from Cuba as commonly thought, but the Northwest Amazon - the Caribs. Around A.D. 800, they pushed north into Hispaniola and Jamaica and then the Bahamas where they were well established by the time Columbus arrived.

"I had been stumped for years because I didn't have this Bahamian component," Ross said. "Those remains were so key. This will change the perspective on the people and peopling of the Caribbean."

For Keegan, the discovery lays to rest a puzzle that pestered him for years: why a type of pottery known as Meillacoid appears in Hispaniola by A.D. 800, Jamaica around 900 and the Bahamas around 1000.

"Why was this pottery so different from everything else we see? That had bothered me," he said. "It makes sense that Meillacoid pottery is associated with the Carib expansion."

The sudden appearance of Meillacoid pottery also corresponds with a general reshuffling of people in the Caribbean after a 1,000-year period of tranquility, further evidence that "Carib invaders were on the move," Keegan said.

Raiders of the lost Arawaks

So, was there any substance to the tales of cannibalism?

Possibly, Keegan said.

Arawaks and Caribs were enemies, but they often lived side by side with occasional intermarriage before blood feuds erupted, he said.

"It's almost a 'Hatfields and McCoys' kind of situation," Keegan said. "Maybe there was some cannibalism involved. If you need to frighten your enemies, that's a really good way to do it."

Whether or not it was accurate, the European perception that Caribs were cannibals had a tremendous impact on the region's history, he said. The Spanish monarchy initially insisted that indigenous people be paid for work and treated with respect, but reversed its position after receiving reports that they refused to convert to Christianity and ate human flesh.

"The crown said, 'Well, if they're going to behave that way, they can be enslaved,'" Keegan said. "All of a sudden, every native person in the entire Caribbean became a Carib as far as the colonists were concerned."

Credit: 
Florida Museum of Natural History

Deep learning, 3D technology to improve structure modeling, create better drugs

image: DOVE, created by Purdue researchers, captures structural and energetic features of the interface of a protein docking model with a 3D box and judges if the model is more likely to be correct or incorrect using 3D convolutional neural network.

Image: 
Daisuke Kihara/Purdue University

WEST LAFAYETTE, Ind. - Proteins are often called the working molecules of the human body. A typical body has more than 20,000 different types of proteins, each of which is involved in many functions essential to human life.

Now, Purdue University researchers have designed a novel approach that uses deep learning to better understand how proteins interact in the body - paving the way to producing accurate structure models of protein interactions involved in various diseases and to designing better drugs that specifically target protein interactions. The work was published online in Bioinformatics.

"To understand molecular mechanisms of functions of protein complexes, biologists have been using experimental methods such as X-rays and microscopes, but they are time- and resource-intensive efforts," said Daisuke Kihara, a professor of biological sciences and computer science in Purdue's College of Science, who leads the research team. "Bioinformatics researchers in our lab and other institutions have been developing computational methods for modeling protein complexes. One big challenge is that a computational method usually generates thousands of models, and choosing the correct one or ranking the models can be difficult."

Kihara and his team developed a system called DOVE (DOcking decoy selection with Voxel-based deep neural nEtwork), which applies deep learning to virtual models of protein interactions. DOVE scans the protein-protein interface of a model with a 3D box and uses a deep neural network to capture the structural features that distinguish correct models from incorrect ones.
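The first step of that pipeline -- turning atom coordinates at an interface into a 3D grid a convolutional network can read -- can be sketched as follows. This is a toy stand-in for DOVE's input stage, not its published implementation; the box size, resolution and coordinates are illustrative choices, not values from the paper:

```python
from math import floor

def voxelize_interface(coords, box_size=20.0, resolution=1.0):
    """Map 3D atom coordinates (in angstroms) to occupied voxel indices in a
    cubic grid centered on the atoms' centroid. Atoms falling outside the
    box are dropped. Returns the grid dimension and the set of occupied
    voxel indices (a sparse stand-in for a dense occupancy tensor)."""
    n = int(box_size / resolution)
    center = [sum(p[d] for p in coords) / len(coords) for d in range(3)]
    occupied = set()
    for p in coords:
        idx = tuple(floor((p[d] - center[d]) / resolution + n / 2)
                    for d in range(3))
        if all(0 <= i < n for i in idx):
            occupied.add(idx)
    return n, occupied

# Three hypothetical interface atoms along the x-axis.
n, occ = voxelize_interface([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
print(n, sorted(occ))
```

In a full system, each occupied voxel would also carry channels for atom type or energetic features, and the resulting tensor would be fed to a 3D convolutional classifier that scores the docking model.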

"Our work represents a major advancement in the field of bioinformatics," said Xiao Wang, a graduate student and member of the research team. "This may be the first time researchers have successfully used deep learning and 3D features to quickly understand the effectiveness of certain protein models. Then, this information can be used in the creation of targeted drugs to block certain protein-protein interactions."

Credit: 
Purdue University