Culture

Micro implants could restore standing and walking

When Vivian Mushahwar first applied to grad school, she wrote about her idea to fix paralysis by rewiring the spinal cord.

It was only after she was accepted into a bioengineering program that the young electrical engineer learned her idea had actually prompted laughter.

"I figured, hey I can fix it, it's just wires," Mushahwar said. "Yeah, well, it's not just wires. So I had to learn the biology along the way."

It's taken Mushahwar a lot of work over two decades at the University of Alberta, but the Canada Research Chair in Functional Restoration is still fixated on the dream of helping people walk again. And thanks to an electrical spinal implant pioneered in her laboratory and work in mapping the spinal cord, that dream could become a reality in the next decade.

Because an injured spinal cord dies back, it's not simply a matter of reconnecting a cable. Three herculean feats are needed. You have to translate brain signals. You have to figure out and control the spinal cord. And you have to get the two sides talking again.

People tend to think the brain does all the thinking, but Mushahwar says the spinal cord has built-in intelligence. A complex chain of motor and sensory networks regulates everything from breathing to bowels, while the brain stem's contribution is basically "go!" and "faster!" Your spinal cord isn't just moving muscles; it's giving you your natural gait.

Other researchers have tried different avenues to restore movement. By sending electrical impulses into leg muscles, it's possible to get people standing or walking again. But the effect is strictly mechanical and not particularly effective. Mushahwar's research has focused on restoring lower-body function after severe injuries using a tiny spinal implant. Hair-like electrical wires plunge deep into the spinal grey matter, sending electrical signals to trigger the networks that already know how to do the hard work.

In a new paper in Scientific Reports, the team presents a map identifying which parts of the spinal cord trigger the hips, knees, ankles and toes, and which areas put movements together. The work shows that these spinal maps are remarkably consistent across animal species, but further work is required before moving to human trials.

The implications of moving to a human clinical setting would be massive. Being able to control standing and walking would improve bone health, improve bowel and bladder function, and reduce pressure ulcers. It could help treat cardiovascular disease--the main cause of death for spinal cord patients--while bolstering mental health and quality of life. For those with less severe spinal injuries, an implant could be therapeutic, removing the need for months of gruelling physical therapy regimes that have limited success.

"We think that intraspinal stimulation itself will get people to start walking longer and longer, and maybe even faster," said Mushahwar. "That in itself becomes their therapy."

Progress can move at a remarkable pace, yet it's often maddeningly slow.

"There's been an explosion of knowledge in neuroscience over the last 20 years," Mushahwar said. "We're at the edge of merging the human and the machine."

Given the nature of incremental funding and research, a realistic timeline for this type of progress might be close to a decade.

Mushahwar is the director of the SMART Network, a collaboration of more than 100 U of A scientists and learners who intentionally break disciplinary silos to think of unique ways to tackle neural injuries and diseases. That has meant working with researchers like neuroscientist Kathryn Todd and biochemist Matthew Churchward, both in the psychiatry department, to create three-dimensional cell cultures used to test electrodes.

The next steps are fine-tuning the hardware--miniaturizing an implantable stimulator--and securing Health Canada and FDA approvals for clinical trials. Previous research has tackled the problem of translating brain signals and intent into commands to the intraspinal implant; however, the first generation of the intraspinal implants will require a patient to control walking and movement. Future implants could include a connection to the brain.

It's the same goal Mushahwar had decades ago. Except now it's no longer a laughable idea.

"Imagine the future," Mushahwar said. "A person just thinks and commands are transmitted to the spinal cord. People stand up and walk. This is the dream."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Opioid overdose risk is high after medical treatment ends, study finds

People with opioid addiction face a high risk of overdose after ending treatment with the medication buprenorphine, even when treated for 18 months, a new study by researchers at Columbia University Vagelos College of Physicians and Surgeons has found.

Among people who were treated with buprenorphine continuously for 6 to 18 months, about 5% needed medical treatment for an opioid overdose in the 6 months after ending buprenorphine treatment. The true rate is likely higher as the study was unable to account for overdose events that did not present to healthcare settings.

"The rate at which individuals relapsed and overdosed after ending treatment was alarmingly high, suggesting that discontinuing buprenorphine is a life-threatening event," says Arthur Robin Williams, MD, MBE, assistant professor of clinical psychiatry at Columbia University Vagelos College of Physicians and Surgeons.

The study also found that the longer patients continued with treatment, the lower their risk of other types of adverse outcomes, suggesting that buprenorphine treatment may be most effective as a long-term therapy for those with opioid use disorder.

Are Opioid Users Getting Evidence-Based Care?

As the opioid crisis continues, increasing attention has focused on difficulties faced by an estimated 2.1 million patients with opioid use disorder in accessing evidence-based care.

Buprenorphine, which won FDA approval in 2002 to combat opioid addiction, is dispensed to almost one million individuals annually. However, an estimated 50% to 80% of patients discontinue the treatment within several weeks or months, and there is no formal guidance on how long patients should continue taking the medication, although expert consensus supports indefinite use.

Further, many insurance plans limit treatment with buprenorphine to six months or require approvals to be reauthorized every year, and patients who are at risk for incarceration are frequently deprived of buprenorphine treatment while awaiting arraignment or sentencing.

"Many clinicians think they should prescribe buprenorphine only for time-limited periods, due to stigma and outdated beliefs that patients using medications for opioid use disorder are not in 'true recovery,'" says Williams. "Our paper is one of the first to look at the effect of long-term durations of buprenorphine treatment on subsequent outcomes."

Dangers of Stopping Buprenorphine Treatment

To understand whether the duration of buprenorphine treatment had an impact on outcomes after treatment ended, the researchers analyzed Medicaid claims data of nearly 9,000 adults (ages 18 to 64 years) across a handful of anonymously reporting states who remained in continual treatment for at least 6 months and for as many as 18 months.

Regardless of treatment duration, the researchers found that in the 6 months after treatment ended, approximately 1 in 20 individuals received treatment for an opioid overdose at least once.

Rates of new opioid prescriptions (~25%) and visits to the emergency room (~45%) remained high for all groups in the 6 months after ending buprenorphine treatment, especially among those with mental illness. Rates were significantly higher for those who stopped treatment after 6 months versus the 15-18 month cohort.

What the Study Means

Previous studies have shown that the risk of dying from an opioid overdose drops by as much as 70% during buprenorphine treatment. However, most patients relapse after they discontinue the medication.

The current study adds to a growing body of literature demonstrating that treatment with buprenorphine may be needed for several years, if not indefinitely, to reduce the risk of overdose and other adverse events.

"Patients and families need guidance, social support, and better coordination of care to help facilitate long-term maintenance with buprenorphine for opioid use disorder," Williams adds.

Credit: 
Columbia University Irving Medical Center

Anthracnose alert: How bacteria prime fifth-biggest global grain crop against deadly fungus

video: Sorghum anthracnose devastates crops of the drought- and heat-resistant cereal worldwide. Priming with rhizobacteria can boost the plant's resistance against a range of microbial attacks.

Image: 
Therese van Wyk, University of Johannesburg.

Anthracnose of Sorghum bicolor devastates crops of the drought- and heat-resistant cereal worldwide. Priming with rhizobacteria can boost the plants' resistance and tolerance against a wide range of adverse conditions such as microbial attacks.

University of Johannesburg researchers decoded how priming enhances the 'security system' of plants for a much stronger, faster defence.

Using metabolomics and machine learning algorithms, they identified changes in the sorghum plant's chemical response to fungal attack.

The low-cost approach can be used to counter other pathogens in economically important food crops.

Fungus modus operandi

The fungus Colletotrichum sublineolum sneaks up to its host in many ways. It may have been hanging around for years in the soil, on decaying plant matter or on equipment. It likes to pounce at the first rains in humid and warm conditions.

It enters the stomata, or "air vents," of the plant, and doesn't wreck things at first. Rather, its first priority is to multiply inside the plant. At this stage, it feeds on the plant without causing damage visible to the farmer.

But once the fungus has truly invaded its host, it switches from unwanted parasite to wholesale destroyer. As if a switch has been flicked, it starts demolishing the structural supports and cells of the plant. This way the fungus feeds its ravenous appetite and gets ready to procreate.

At this stage the devastating disease becomes visible on the outside of the plant. Sorghum anthracnose, or wilting disease, causes spots on the leaves and stems. The spots expand into lesions that can cover leaves completely.

The invaded plant doesn't stand a chance.

Unless friendly bacteria have teamed up with the plant beforehand. A mutual plant-bacteria interaction can switch on a plant's "security system", which can fend off the fungus, if it is sensitive, fast and strong enough.

Sorghum anthracnose is caused by the fungus Colletotrichum sublineolum, or CS fungus for short. The fungus is a picky pathogen, as most fungi are. It specializes in attacking sorghum plants of any age, though it sometimes favours plants closer to harvest. When it does attack, it can destroy entire fields of the grain, with crop yield losses that can reach 70% or more.

Coping with climate

"In the era of climate change, we are expecting longer periods of drought and excessive heat. Crop plants will also have to produce during intermittent and more severe flooding. It is time to adapt our existing crop plants for the conditions approaching us," says Prof Ian Dubery. He is the Director of the Research Centre for Plant Metabolomics at the University of Johannesburg in South Africa.

Fifth biggest grain globally

The species of sorghum mostly planted for commercial or subsistence production is Sorghum bicolor. It is indigenous to Africa, and used for food, fodder and bio-fuel in many countries.

By annual volume, it is the fifth-biggest grain crop in the world. Sorghum is key to food security for subsistence farmers producing the grain.

The crop is known as great millet and guinea corn in West Africa, dura in Sudan, mtama in eastern Africa, jowar in India and gaoliang in China; while it is usually referred to as milo or milo-maize in the United States, according to the FAO. Other names include feterita, jwari, shallu, cholam, jola, dari, and solam.

Certain varieties are highly drought and heat resistant.

Costly defences

Some varieties of Sorghum bicolor are more resistant to the CS fungus than others. However, the most productive varieties tend to have less resistance to the CS fungus. In addition, activating and exercising that resistance comes at a price - to the plant and the farmer.

The harder the sorghum plant has to work at mustering its defences, the fewer seeds it will produce. It may be able to defend itself and keep healthy leaves and stems, but end up producing a much smaller crop yield. It may even die in the process.

Also, spraying fungicides is expensive and can affect the environment. So a more sustainable way of protecting sorghum would be preferable.

Priming for defence

Priming sorghum plants with friendly bacteria around their roots can make their leaves more resistant to attacks from the CS fungus. Biofertilizers which contain these rhizobacteria are used commercially for sorghum and other crops.

In industry, these are called "plant-growth promoting rhizobacteria" or PGPR. Seeds can be coated with biofertilizers, and soil or plants can be sprayed with them.

But how and why priming works to defeat pathogens such as the CS fungus on cereal crops was unknown.

Plant metabolomics decodes a tougher security system

Prof Ian Dubery and Dr Fidele Tugizimana decoded how and why priming with rhizobacteria works on Sorghum bicolor. Tugizimana is a research associate at the Research Centre.

"Sorghum plants and the rhizobacteria they are primed with, team up to get the plant 'security system' on higher alert. It also acts faster and with a stronger response against the attacking fungus," says Dubery.

Without fungal attacks, the bacteria living in the rhizosphere (the area around the roots of the plant) help the plant in many ways. For example, they make it easier for the plant to take up nutrients like phosphates, and they fix nitrogen in the soil to make it more fertile.

In its turn, the plant helps the bacteria by releasing chemicals that they need.

For peace and war

The researchers planted a sorghum variety in trays in their laboratory. After the plants reached a height of 30 cm, they applied the rhizobacteria to the roots to prime the plants.

What the researchers wanted to find out is how the "chemical communications" work in and around a healthy plant. They analysed the chemicals synthesized by the plant in its leaves, stems and roots, as well as in the bacteria-inoculated soil of the rhizosphere.

From this, they could build up a picture of how the plant roots and bacteria "talk" to each other. They could also unravel how the roots, stems and leaves "talk" to each other to support the beneficial relationship with the bacteria.

This gave them the metabolomic "signature" of a healthy plant primed with rhizobacteria.

They infected the other half of the plants with the CS fungus to see how badly it would affect that sorghum variety. Again, they analysed the metabolic "cocktails" in the leaves, stems, roots and bacteria-inoculated soil. This gave them the metabolomic signature of a primed, infected plant.

Big data analytics

They repeated the whole process for the same variety of sorghum, with one exception. They didn't prime the roots with bacteria. Now they could see how much weaker the 'security response' was without priming.

All of these biochemical analyses generated a huge amount of complex data, more than 200 gigabytes in volume. To make sense of all of this, they employed big data analytics, which is a complex process of examining and mining large and complex datasets.

Along the way, techniques such as machine learning, chemometrics, multivariate statistical analyses and mathematical methods were used. In this way they could extract the information, so that more accurate conclusions and hypotheses could be drawn and confidently formulated.
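To give a flavour of this kind of pipeline, the sketch below simulates a small metabolite-intensity matrix and runs the sort of steps described above: scaling, dimensionality reduction and cross-validated classification of primed versus non-primed samples. This is a minimal illustration with invented data and parameters, not the research team's actual workflow or dataset.

```python
# Illustrative sketch only: a toy metabolomics-style analysis with
# simulated data (the study's real pipeline and data are not shown here).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulate intensities for 100 metabolite features in 40 samples:
# 20 non-primed (label 0) and 20 primed (label 1) plants, with ten
# features elevated in the primed group (e.g. defence-related compounds).
X = rng.normal(size=(40, 100))
y = np.array([0] * 20 + [1] * 20)
X[y == 1, :10] += 2.0  # the discriminating "cocktail" features

# Scale, compress to a few principal components, then classify.
# Cross-validation guards against overfitting in the
# many-features / few-samples regime typical of metabolomics.
model = make_pipeline(StandardScaler(),
                      PCA(n_components=5),
                      LogisticRegression())
scores = cross_val_score(model, X, y, cv=5)
print("mean CV accuracy:", round(scores.mean(), 2))
```

In a real study the feature matrix would come from mass-spectrometry or NMR measurements, and the loadings of the discriminating components would point back to candidate defence metabolites.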

Changes in chemical defence

Now they could see what new "cocktails" the sorghum plants manufactured to defend themselves, and how, with the help of the priming bacteria, these cocktails were more diverse and concentrated than those of healthy plants.

The researchers could also see what new "cocktail ingredients" the primed plants used when attacked by the CS fungus.

At one to three days after infection with the fungus, the primed plants produced several times higher quantities of plant hormones than non-primed plants, in particular hydroxyjasmonic acid-glucoside and zeatin.

The primed plants also synthesized significant quantities of the amino acids tyrosine and hydroxy-tryptophan, which non-primed plants made only in tiny quantities. They also produced more than three times as much tryptophan as usual.

At the same time, the primed plants ramped up production of lipids, especially phytosphingosine. The non-primed plants produced tiny fractions of the lipids in comparison.

The primed plants radically cut back on producing (iso)flavonoids, especially apigenin and luteolin.

Decoding the security system

"We found that primed sorghum plants have more sensitive plant security systems. They switch these systems on sooner than they would without priming. The primed plants also responded better to fungal attack. They had much lower infection rates and reduced symptom development compared to non-primed plants," says Tugizimana.

Even at nine days after infection with the CS fungus, few primed sorghum plants showed symptoms. The ones that had symptoms had few leaves affected. The lesions could be described as a localised hypersensitive response. None of the lesions spread over the entire leaf surface, he adds.

Using metabolomics analyses, they worked out how the plant was able to defend itself.

"We found out how the interaction of the beneficial bacteria with sorghum plant roots modifies the plant's ability to defend itself. The primed sorghum plant changes how it apportions energy and redirects its metabolic pathways more to defence, rather than growth or seed production. In this way, it changes the composition of its protective chemicals to resist the fungus. This is how it starts making new 'cocktails' to enhance its chemical defences.

"The primed sorghum plant is more sensitive to fungal attack, and reacts more quickly and intensely. So we can say that plant-beneficial bacteria support the plant in launching a more efficient defence," says Tugizimana.

Low-cost sustainable approach for farming

The results pave the way for similar studies on countering pathogens on other economically important crop plants, says Dubery.

"Priming with rhizobacteria can make a susceptible plant more tolerant and a tolerant plant more resistant to disease. This means that priming, or pre-conditioning, can enhance crop yields and reduce the use of pesticides. It is a promising, sustainable and low-cost option to get more effective resistance in real-world farming conditions, where many pathogens threaten food crops," he adds.

Credit: 
University of Johannesburg

'Native advertising' builds credibility, not perceived as 'tricking' visitors

CATONSVILLE, MD, December 2, 2019 - The concept of "native advertising" has existed for as long as advertisements have been designed to resemble the editorial content of newspapers and magazines. As the Internet emerged and became a powerful force, native advertising evolved, leading some in recent years to worry that native advertising, which mimics non-advertising content, could deceive web site visitors.

This concern served as the foundation for new research in the INFORMS journal Marketing Science, which sought to determine more precisely how native advertising is perceived and received by web site users.

The study, to be published in the December edition of the INFORMS journal Marketing Science, is titled "Sponsorship Disclosure and Consumer Deception: Experimental Evidence from Native Advertising in Mobile Search." It is authored by Navdeep S. Sahni and Harikesh S. Nair of Stanford University.

"We found little evidence that native advertising 'tricks' Internet users into clicking on sponsored content and driving those users directly to the advertisers," said Nair. "Instead, we found that Internet users seem to view native ads as advertisements, and they use the content to deliberately evaluate those advertisers."

The researchers studied native advertising at a mobile restaurant-search platform. They analyzed various formats of paid-search advertising, and the extent to which those ads were disclosed to over 200,000 users.
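For a sense of how such a field experiment can be analysed, here is a minimal sketch of a two-proportion z-test comparing click-through rates between two disclosure conditions. The counts below are hypothetical placeholders for illustration; they are not figures from the Sahni and Nair study.

```python
# Illustrative sketch only: testing whether click-through rates differ
# between two ad-disclosure conditions. All numbers are hypothetical.
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Return (z, two-sided p-value) for H0: rate_a == rate_b."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via the error function; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 2.4% CTR with prominent disclosure vs 2.1% without,
# 100,000 users in each arm of the experiment.
z, p = two_proportion_z(2400, 100_000, 2100, 100_000)
print("z =", round(z, 2), " p =", format(p, ".1e"))
```

With samples of this size even a 0.3-percentage-point difference is highly significant, which is why large-scale platform experiments like the one described can detect subtle effects of ad formatting.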

According to industry standards and certain regulations as instituted by the U.S. Federal Trade Commission (FTC), organizations behind native advertising are required to clearly indicate when content is paid advertising, or "sponsored content."

"One of the interesting findings of the research is that while native advertising benefits advertisers, we see no evidence of consumers getting deceived," said Sahni. "More to the point, users who see a native advertisement continue with their product search; they're more likely to later click on the advertiser's organic listings and make a purchase. In effect, consumers often follow a process of conducting their own due diligence incorporating the information they receive through native advertising."

Credit: 
Institute for Operations Research and the Management Sciences

Study finds common cold virus can infect the placenta

Researchers have shown that a common cold virus can infect cells derived from human placentas, suggesting that it may be possible for the infection to pass from expectant mothers to their unborn children.

The study, published in the journal PLOS ONE, was led by Dr. Giovanni Piedimonte, professor of pediatrics and vice president of research at Tulane University.

"This is the first evidence that a common cold virus can infect the human placenta," Piedimonte said. "It supports our theory that when a woman develops a cold during pregnancy, the virus causing the maternal infection can spread to the fetus and cause a pulmonary infection even before birth."

During pregnancy, the placenta acts as a gatekeeper to provide essential nourishment from a mother to a developing fetus while filtering out potential pathogens. Scientists are discovering that the barrier isn't as impenetrable as once believed, with recent studies showing how viruses such as Zika can slip through its defenses.

Using donated placentas, researchers isolated the three major cell types found in placentas -- cytotrophoblast, stroma fibroblasts and Hofbauer cells -- and exposed them in vitro to respiratory syncytial virus (RSV), a common cause of cold-like respiratory infections. While the cytotrophoblast cells supported limited viral replication, the other two types were significantly more susceptible to infection.

For example, the Hofbauer cells survived infection and allowed the virus to replicate inside them. Because Hofbauer cells travel within the placenta, researchers suspect they could act as a Trojan horse and transmit the virus to the fetus.

"These cells don't die when they're infected by the virus, which is the problem," Piedimonte said. "When they move into the fetus, they are like bombs packed with virus. They don't disseminate the virus around by exploding, which is the typical way, but rather transfer the virus through intercellular channels."

Researchers suspect RSV could attack lung tissue within the fetus, causing an infection that may predispose offspring to developing asthma in childhood. Piedimonte plans to launch a clinical study at Tulane to further test the theory.

Credit: 
Tulane University

Smarter strategies

Though small and somewhat nondescript, quagga and zebra mussels pose a huge threat to local rivers, lakes and estuaries. Thanks to aggressive measures to prevent contamination, Santa Barbara County's waters have so far been clear of the invasive mollusks, but stewards of local waterways, reservoirs and water recreation areas remain vigilant to the possibility of infestation by these and other non-native organisms.

Now, UC Santa Barbara-based research scientist Carolynn Culver and colleagues at UCSB's Marine Science Institute are adding to this arsenal of prevention measures with a pair of studies that appear in a special edition of the North American Journal of Fisheries Management. They focus on taking an integrated approach to the management of aquatic invasive species as the state works to move beyond its current toxic, water quality-reducing methods.

"With integrated pest management you're looking for multiple ways to manipulate vulnerabilities of a pest, targeting different life stages with different methods in a combined way that can reduce the pest population with minimal harm to people and the environment," said Culver, an extension specialist with California Sea Grant who also holds an academic appointment at Scripps Institution of Oceanography. "Often there is concentrated effort on controlling one part of the life cycle, like removing adults--which are easier to see--without thinking about the larvae that are out there."

Could hungry fish fight invasive mussels?

In one study, Culver and her colleagues explored whether certain species of sunfish could be used as a biological control method to help manage invasive freshwater mussels in Southern California lakes.

The quagga mussel and closely related zebra mussel are two of the most devastating aquatic pests in the United States. The small freshwater mussels grow on hard surfaces such as water pipes, and can cause major problems for water infrastructure. They can also negatively impact ecosystems and fisheries by feeding on microscopic plants and animals that support the food web. First appearing in North America in the 1980s, they reached California in 2007. Managing these mussels is estimated to have cost billions of dollars since their introduction into the U.S.

Culver has worked closely with lake and reservoir managers in California to help them prepare for and respond to mussel invasions. This research was needed, she said, because many of the control systems long used elsewhere were developed for industrial facilities and rely on chemical applications or toxic coatings. Those can't readily be used in California bodies of water that serve as sources of drinking water, or that are home to endangered species the chemicals could harm -- which covers the majority of California locations with mussel infestations. In San Diego, for instance, rapid colonization of the reservoirs by these mussels caused docks and buoys to sink, but conventional, toxic methods of controlling them were a cause for concern.

"Commonly used mussel control methods are problematic for San Diego reservoirs since they are primary water supply reservoirs," said study co-author Dan Daft, a water production superintendent and biologist with the city of San Diego, who found that biocontrol methods were both effective and ecologically sound for sensitive water sources.

The study found that when one species of sunfish, bluegill, was penned up in an area where mussels occur, it could significantly reduce microscopic larvae and newly settled young mussels on surfaces within the pen, and on the pen itself. This method could be one key piece of an integrated pest management strategy, and provides a new, non-chemical method for targeting early life stages of the mussels, which are hard to detect.

"Essentially you can put these fish to work in specific areas where mussels occur," Culver said.

The researchers studied two species of sunfish resident in many infested Southern California lakes -- human-built bodies of water that nearly all serve as water supplies. Although the sunfish are not native to California, they were stocked into these man-made reservoirs. According to the researchers, the method could be applied with other predatory species in other places, but no other good candidates were available where they were working.

"It's important to point out that we don't support introducing non-native species," Culver said.

A better way to clean your boat

The other study assessed an integrated management framework that Culver and colleagues had developed to manage biofouling -- the growth of organisms such as algae, barnacles and other aquatic plants and animals that settle on hard surfaces such as piers, pilings and boat hulls -- while balancing both boat operations and ecosystem health. The paper describes how, when applied as part of an integrated framework, a combination of non-toxic methods can help maintain clean boats without the use of toxic paints and coatings that are increasingly regulated due to their environmental impacts.

"Controlling the growth of these organisms is critical for boat maintenance, because they create drag that slows vessels, reduces fuel efficiency, and makes boats harder to steer," said co-author Leigh Johnson, coastal advisor emerita with UC Cooperative Extension and former California Sea Grant Extension advisor. Johnson was instrumental in initiating the research and bringing attention to the need for a balanced biofouling control management approach. "However," she added, "the methods used to control fouling on boats can impact water quality and increase transport of invasive species so it is important to consider all of these issues when deciding how to maintain a clean hull."

The primary method of controlling biofouling around the world has long been toxic antifouling paints. But there are growing concerns about the impacts of currently used copper-based paints on water quality, and many countries and US states, including California and Washington, have set standards to reduce the copper levels and leaching rates of antifouling paints. These actions, however, increase the risks of moving invasive biofouling species from place to place, including into vulnerable ecosystems such as the islands off the coast of California.

In this study, researchers tested a variety of hull coatings, California-based hull cleaning practices, and conditions in various California harbors, to identify methods that could be used in combination to control biofouling.

They found that although copper-based paints were effective when first applied, they lost effectiveness fairly quickly, and that non-native species tended to accumulate first on the toxic coatings -- sometimes within just a few months. The team also showed that frequent, minimally abrasive, in-water hull cleaning was effective and did not cause an increase in fouling as reported for other hull cleaning practices. Their documentation of the time of year when different organisms were attaching to surfaces also helped to illustrate how adjusting the timing and frequency of hull cleaning could help increase its effectiveness.

Results from the study, along with other research findings, informed the development of an integrated pest management framework that boaters can adapt to different regions and specific needs.

"It's not a one-size-fits-all approach -- it's adaptive," Culver said. "Boaters can tailor it to local environments, regulations and boating patterns, and it can be applied in areas where toxic paints have been restricted, as well as where they continue to be used. It can help to keep boat hulls clean, while reducing impacts on water quality and transport of invasive species -- three issues that often are not considered together."

Culver and her colleagues have provided information to boat owners, resource managers, and regulators about applications of this integrated approach. There also has been interest, she said, in using the technique to inform biofouling management guidance and regulations in California and elsewhere.

Credit: 
University of California - Santa Barbara

Breathing? Thank volcanoes, tectonics and bacteria

image: This figure illustrates how inorganic carbon cycles through the mantle more quickly than organic carbon, which contains very little of the isotope carbon-13. Both inorganic and organic carbon are drawn into Earth's mantle at subduction zones (top left). Due to different chemical behaviors, inorganic carbon tends to return through eruptions at arc volcanoes above the subduction zone (center). Organic carbon follows a longer route, as it is drawn deep into the mantle (bottom) and returns through ocean island volcanoes (right). The differences in recycling times, in combination with increased volcanism, can explain isotopic carbon signatures from rocks that are associated with both the Great Oxidation Event, about 2.4 billion years ago, and the Lomagundi Event that followed.

Image: 
Image by J. Eguchi/University of California, Riverside

HOUSTON -- (Dec. 2, 2019) -- Earth's breathable atmosphere is key for life, and a new study suggests that the first burst of oxygen was added by a spate of volcanic eruptions brought about by tectonics.

The study by geoscientists at Rice University offers a new theory to help explain the appearance of significant concentrations of oxygen in Earth's atmosphere about 2.5 billion years ago, something scientists call the Great Oxidation Event (GOE). The research appears this week in Nature Geoscience.

"What makes this unique is that it's not just trying to explain the rise of oxygen," said study lead author James Eguchi, a NASA postdoctoral fellow at the University of California, Riverside who conducted the work for his Ph.D. dissertation at Rice. "It's also trying to explain some closely associated surface geochemistry, a change in the composition of carbon isotopes, that is observed in the carbonate rock record a relatively short time after the oxidation event. We're trying to explain each of those with a single mechanism that involves the deep Earth interior, tectonics and enhanced degassing of carbon dioxide from volcanoes."

Eguchi's co-authors are Rajdeep Dasgupta, an experimental and theoretical geochemist and professor in Rice's Department of Earth, Environmental and Planetary Sciences, and Johnny Seales, a Rice graduate student who helped with the model calculations that validated the new theory.

Scientists have long pointed to photosynthesis -- a process that produces waste oxygen -- as a likely source for increased oxygen during the GOE. Dasgupta said the new theory doesn't discount the role that the first photosynthetic organisms, cyanobacteria, played in the GOE.

"Most people think the rise of oxygen was linked to cyanobacteria, and they are not wrong," he said. "The emergence of photosynthetic organisms could release oxygen. But the most important question is whether the timing of that emergence lines up with the timing of the Great Oxidation Event. As it turns out, they do not."

Cyanobacteria were alive on Earth as much as 500 million years before the GOE. While a number of theories have been offered to explain why it might have taken that long for oxygen to show up in the atmosphere, Dasgupta said he's not aware of any that have simultaneously tried to explain a marked change in the ratio of carbon isotopes in carbonate minerals that began about 100 million years after the GOE. Geologists refer to this as the Lomagundi Event, and it lasted several hundred million years.

One in a hundred carbon atoms is the isotope carbon-13; the other 99 are carbon-12. This 1-to-99 ratio is well documented in carbonates that formed before and after Lomagundi, but those formed during the event have about 10% more carbon-13.

Eguchi said the explosion in cyanobacteria associated with the GOE has long been viewed as playing a role in Lomagundi.

"Cyanobacteria prefer to take carbon-12 relative to carbon-13," he said. "So when you start producing more organic carbon, or cyanobacteria, then the reservoir from which the carbonates are being produced is depleted in carbon-12."

Eguchi said people tried using this to explain Lomagundi, but timing was again a problem.

"When you actually look at the geologic record, the increase in the carbon-13-to-carbon-12 ratio actually occurs up to tens of millions of years after oxygen rose," he said. "So then it becomes difficult to explain these two events through a change in the ratio of organic carbon to carbonate."

The scenario Eguchi, Dasgupta and Seales arrived at to explain all of these factors is:

A dramatic increase in tectonic activity led to the formation of hundreds of volcanoes that spewed carbon dioxide into the atmosphere.

The climate warmed, increasing rainfall, which in turn increased "weathering," the chemical breakdown of rocky minerals on Earth's barren continents.

Weathering produced a mineral-rich runoff that poured into the oceans, supporting a boom in both cyanobacteria and carbonates.

The organic and inorganic carbon from these wound up on the seafloor and was eventually recycled back into Earth's mantle at subduction zones, where oceanic plates are dragged beneath continents.

When sediments remelted into the mantle, inorganic carbon, hosted in carbonates, tended to be released early, re-entering the atmosphere through arc volcanoes directly above subduction zones.

Organic carbon, which contained very little carbon-13, was drawn deep into the mantle and emerged hundreds of millions of years later as carbon dioxide from island hotspot volcanoes like Hawaii.

"It's kind of a big cyclic process," Eguchi said. "We do think the amount of cyanobacteria increased around 2.4 billion years ago. So that would drive our oxygen increase. But the increase of cyanobacteria is balanced by the increase of carbonates. So that carbon-12-to-carbon-13 ratio doesn't change until both the carbonates and organic carbon, from cyanobacteria, get subducted deep into the Earth. When they do, geochemistry comes into play, causing these two forms of carbon to reside in the mantle for different periods of time. Carbonates are much more easily released in magmas and are released back to the surface at a very short period. Lomagundi starts when the first carbon-13-enriched carbon from carbonates returns to the surface, and it ends when the carbon-12-enriched organic carbon returns much later, rebalancing the ratio."

Eguchi said the study emphasizes the importance of the role that deep Earth processes can play in the evolution of life at the surface.

"We're proposing that carbon dioxide emissions were very important to this proliferation of life," he said. "It's really trying to tie in how these deeper processes have affected surface life on our planet in the past."

Dasgupta is also the principal investigator on a NASA-funded effort called CLEVER Planets that is exploring how life-essential elements might come together on distant exoplanets. He said better understanding how Earth became habitable is important for studying habitability and its evolution on distant worlds.

"It looks like Earth's history is calling for tectonics to play a big role in habitability, but that doesn't necessarily mean that tectonics is absolutely necessary for oxygen build up," he said. "There might be other ways of building and sustaining oxygen, and exploring those is one of the things we're trying to do in CLEVER Planets."

Credit: 
Rice University

Unexpected viral behavior linked to type 1 diabetes in high-risk children

image: This is the enterovirus Coxsackievirus B3.

Image: 
Electron microscopy image of Coxsackievirus B3 courtesy of Varpu Marjomäki Laboratory, University of Jyväskylä, and Minna Hankaniemi, Tampere University, Finland

Tampa, FL (Dec. 2, 2019) -- New results from The Environmental Determinants of Diabetes in the Young (TEDDY) study show an association between prolonged enterovirus infection and the development of autoimmunity to the insulin-producing pancreatic beta-cells that precedes type 1 diabetes (T1D). Notably, researchers also found that early adenovirus C infection seemed to confer protection from autoimmunity. The full findings were published Dec. 2 in Nature Medicine.

Viruses have long been suspected to be involved in the development of T1D, an autoimmune condition, although past evidence has not been consistent enough to prove a connection. Investigators from the University of South Florida Health (USF Health) Morsani College of Medicine, Baylor College of Medicine, and other institutions studied samples available through the TEDDY study, the largest prospective observational cohort study of newborns with increased genetic risk for T1D, to address this knowledge gap. TEDDY studies young children in the U.S. (Colorado, Georgia/Florida, and Washington State) and in Europe (Finland, Germany, and Sweden).

"Years of research have shown that T1D is complex and heterogeneous, meaning that more than one pathway can lead to its onset," said lead author Kendra Vehik, PhD, MPH, an epidemiologist and professor with the USF Health Informatics Institute. "T1D is usually diagnosed in children, teens and young adults, but the autoimmunity that precedes it often begins very early in life."

"T1D occurs when the immune system destroys its own insulin-producing beta cells in the pancreas. Insulin is a hormone that regulates blood sugar in the body. Without it, the body cannot keep normal blood sugar levels causing serious medical complications," said coauthor Richard Lloyd, PhD, professor of molecular virology and microbiology at Baylor College of Medicine.

In the current study, Vehik and her colleagues studied the virome, that is, all the viruses in the body. They analyzed thousands of stool samples collected from hundreds of children followed from birth in the TEDDY study, looking to identify a connection between the viruses and the development of autoimmunity against insulin-producing beta cells. The enterovirus Coxsackievirus has been implicated in T1D before, but the current results provide a completely new way to make the connection, by identifying specific viruses shed in the stool. The investigators were surprised to find that a prolonged infection of more than 30 days, rather than a short infection, was associated with autoimmunity.

"This is important because enteroviruses are a very common type of virus, sometimes causing fever, sore throat, rash or nausea. A lot of children get them, but not everybody that gets the virus will get T1D," Vehik said. "Only a small subset of children who get enterovirus will go on to develop beta cell autoimmunity. Those whose infection lasts a month or longer will be at higher risk."

A prolonged enterovirus infection might be an indicator that autoimmunity could develop.

Beta cells of the pancreas express a cell surface protein that helps them talk to neighboring cells. This protein has been adopted by the virus as a receptor molecule to allow the virus to attach to the cell surface. The investigators discovered that children who carry a particular genetic variant in this virus receptor have a higher risk of developing beta cell autoimmunity.

"This is the first time it has been shown that a variant in this virus receptor is tied to an increased risk for beta cell autoimmunity," Vehik said. Ultimately, this process leads to the onset of T1D, a life-threatening disease that requires life-long insulin injections to treat.

Another discovery was that the presence in early life of adenovirus C, a virus that can cause respiratory infections, was associated with a lower risk of developing autoimmunity. It remains to be investigated whether having adenovirus C in early life would protect from developing beta cell autoimmunity. Adenoviruses use the same beta cell surface receptor as Coxsackievirus B, which may offer one clue to explain this connection, although further research is needed to fully understand the details.

Other factors that affect autoimmunity and the development of T1D are still unknown, but the TEDDY study is working to identify them. The researchers seek to gain insights into the exposures that trigger T1D by studying samples taken before autoimmunity developed, starting when the TEDDY participants were 3 months old. Such findings could identify approaches to potentially prevent or delay the disease.

"Taking it all together, our study provides a new understanding of the roles different viruses can play in the development of beta cell autoimmunity linked to T1D, and suggests new avenues for intervention that could potentially prevent T1D in some children," Lloyd said.

Credit: 
University of South Florida (USF Health)

This 'fix' for economic theory changes everything from gambles to Ponzi schemes

Whether we decide to take out that insurance policy, buy Bitcoin, or switch jobs, many economic decisions boil down to a fundamental gamble about how to maximize our wealth over time. How we understand these decisions is the subject of a new perspective piece in Nature Physics that aims to correct a foundational mistake in economic theory.

According to author Ole Peters (London Mathematical Laboratory, Santa Fe Institute), people's real-world behavior often "deviates starkly" from what standard economic theory would recommend. Take the example of a simple coin toss: Most people would not gamble on a repeated coin toss where heads would increase their net worth by 50% but tails would decrease it by 40%.

"Would you accept the gamble and risk losing at the toss of a coin 40% of your house, car and life savings?" Peters asks, echoing a similar objection raised by Nicholas Bernoulli in 1713.

But early economists would have taken that gamble, at least in theory. In classical economics, the way to approach a decision is to consider all possible outcomes, then average across them. So the coin toss game seems worth playing: an equal probability of a 50% gain and a 40% loss averages out to an expected 5% gain per round.

Why people don't choose to play the game, seemingly ignoring the opportunity to gain a steady 5%, has been explained psychologically -- people, in the parlance of the field, are "risk averse". But according to Peters, these explanations don't really get to the root of the problem, which is that the classical "solution" lacks a fundamental understanding of the individual's unique trajectory over time.

Instead of averaging wealth across parallel possibilities, Peters advocates an approach that models how an individual's wealth evolves along a single path through time. In a disarmingly simple example, he randomly multiplies the player's total wealth by either 150% or 60% depending on the coin toss. That player lives with the gain or loss of each round, carrying it with them to the next turn. As the play time increases, Peters' model reveals an array of individual trajectories. They all follow unique paths. And in contrast to the classical conception, all paths eventually plummet downward. In other words, the approach reveals a fray of exponential losses where the classical conception would show a single exponential gain.
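The divergence between the two pictures is easy to reproduce. The sketch below is not Peters' own code, just a minimal simulation of the coin-toss game described above (wealth multiplied by 1.5 on heads, 0.6 on tails), played out for many independent players:

```python
import random
import statistics

random.seed(0)

N_PLAYERS, N_ROUNDS = 10_000, 100

def play(rounds: int) -> float:
    """One player's trajectory: wealth starts at 1.0 and is
    multiplied by 1.5 on heads or 0.6 on tails each round."""
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

outcomes = [play(N_ROUNDS) for _ in range(N_PLAYERS)]

# Averaging across players tracks the classical expectation
# (a factor of 1.05 per round), propped up by a few lucky runs...
print(f"mean wealth:   {statistics.fmean(outcomes):,.2f}")
# ...but the typical player decays, because the time-average growth
# rate per round is 0.5*ln(1.5) + 0.5*ln(0.6) = ln(0.9)/2 < 0.
print(f"median wealth: {statistics.median(outcomes):.6f}")
below = sum(w < 1.0 for w in outcomes) / N_PLAYERS
print(f"players below their starting wealth: {below:.0%}")
```

With these parameters the ensemble mean stays well above the starting wealth while the median collapses towards zero, mirroring the fray of losing trajectories the article describes.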

Encouragingly, people seem to intuitively grasp the difference between these two dynamics in empirical tests. The perspective piece describes an experiment conducted by a group of neuroscientists led by Oliver Hulme, at the Danish Research Center for Magnetic Resonance. Participants played a gambling game with real money. On one day, the game was set up to maximize their wealth under classical, additive dynamics. On a separate day, the game was set up under multiplicative dynamics.

"The crucial measure was whether participants would change their willingness to take risks between the two days," explains the study's lead author David Meder. "Such a change would be incompatible with classical theories, while Peters' approach predicts exactly that."

The results were striking: When the game's dynamics changed, all of the subjects changed their willingness to take risks, and in doing so were able to approximate the optimal strategy for growing their individual wealth over time.

"The big news here is that we are much more adaptable than we thought we were," Peters says. "These aspects of our behavior we thought were neurologically imprinted are actually quite flexible."

"This theory is exciting because it offers an explanation for why particular risk-taking behaviors emerge, and how these behaviors should adapt to different circumstances. Based on this, we can derive novel predictions for what types of reward signals the brain should compute to optimize wealth over time," says Hulme.

Peters' distinction between averaging possibilities and tracing individual trajectories can also inform a long list of economic puzzles -- from the equity premium puzzle to measuring inequality to detecting Bernie Madoff's Ponzi scheme.

"It may sound obvious to say that what matters to one's wealth is how it evolves over time, not how it averages over many parallel states of the same individual," writes Andrea Taroni in a companion Editorial in Nature Physics. "Yet that is the conceptual mistake we continue to make in our economic models."

Credit: 
Santa Fe Institute

Whaling and climate change led to 100 years of feast or famine for Antarctic penguins

image: A Gentoo and Chinstrap penguin standing on guano-covered rocks at a breeding colony along the Antarctic Peninsula. Image courtesy of Rachael Herman.

Image: 
Rachael Herman, Louisiana State University, Stony Brook University

BATON ROUGE - New research reveals how penguins have dealt with more than a century of human impacts in Antarctica and why some species are winners or losers in this rapidly changing ecosystem.

Michael Polito, assistant professor in LSU's Department of Oceanography & Coastal Sciences, and his co-authors published their findings in the Proceedings of the National Academy of Sciences, available Monday, Dec. 2.

"Although remote, Antarctica has a long history of human impacts on its ecosystems and animals. By the early to mid-1900s, humans had hunted many of its seals and whales nearly to extinction. Seal and whale populations are now recovering, but decades of climate change and a growing commercial fishing industry have further degraded the environment," Polito said.

Polito co-led a team of researchers from Louisiana State University, University of Rhode Island, University of Oxford, University of California Santa Cruz, and the University of Saskatchewan with the goal of understanding how human interference in Antarctic ecosystems during the past century led to booms and busts in the availability of a key food source for penguins: Antarctic krill.

"Antarctic krill is a shrimp-like crustacean that is a key food source for penguins, seals, and whales. When seal and whale populations dwindled due to historic over-harvesting, it is thought to have led to a surplus of krill during the early to mid-1900s. In more recent times, the combined effects of commercial krill fishing, anthropogenic climate change, and the recovery of seal and whale populations are thought to have drastically decreased the abundance of krill," Polito said.

In this study, the team determined the diets of chinstrap and gentoo penguins by analyzing the nitrogen stable isotope values of amino acids in penguin feathers collected during explorations of the Antarctic Peninsula during the past century.

"We've all heard the adage, 'You are what you eat.' All living things record a chemical signal of the food they eat in their tissues. We used the stable isotope values of penguin feathers as a chemical signal of what penguins were eating over the last 100 years," said Kelton McMahon, co-lead author and assistant professor at the University of Rhode Island.

Because humans have never commercially harvested penguins, Polito and colleagues expected that changes in their diets and populations would mirror shifts in krill availability. The team focused their research on chinstrap and gentoo penguins because chinstrap penguins have had severe population declines and gentoo penguin populations have increased in the Antarctic Peninsula over the past half century.

"Given that gentoo penguins are commonly thought of as climate change winners and chinstrap penguins as climate change losers, we wanted to investigate how differences in their diets may have allowed one species to cope with a changing food supply while the other could not," said Tom Hart, co-author and researcher at the University of Oxford.

The team found that both penguin species primarily fed on krill during the krill surplus in the early to mid-1900s that was caused by seal and whale harvesting. In contrast, during the latter half of the past century, gentoo penguins increasingly showed an adaptive shift from strictly eating krill to including fish and squid in their diets, unlike the chinstrap penguins that continued to feed exclusively on krill.

"Our results indicate that historic harvesting and recent climate change have altered the Antarctic marine food web over the past century. Moreover, the differing diet and population responses we observed in penguins indicate that species such as chinstrap penguins, with specialized diets and a strong reliance on krill, will likely continue to do poorly as climate change and other human impacts intensify," Polito said.

The authors predict that the Antarctic Peninsula Region will remain a hotspot for climate change and human impacts during the next century, and they believe their research will be beneficial in predicting which species are likely to fare poorly and which will withstand--or even benefit from--future changes.

According to McMahon, "By understanding how past ecosystems respond to environmental change, we can improve our predictions of future responses and better manage human-environment interactions in Antarctica."

Credit: 
Louisiana State University

Scientists build a 'Hubble Space Telescope' to study multiple genome sequences

A new tool that simultaneously compares 1.4 million genetic sequences can classify how species are related to each other at far larger scales than previously possible. Described today in Nature Biotechnology by researchers from the Centre for Genomic Regulation in Barcelona, the technology can reconstruct how life has evolved over hundreds of millions of years and makes important inroads for the ambition to understand the code of life for every living species on Earth.

Protecting Earth's biodiversity is one of the most urgent global challenges of our times. To steward the planet for all life forms, humanity must understand the way animals, fungi, bacteria and other organisms have evolved and how they interact among millions of other species. Sequencing the genomes of life on Earth can unlock previously unknown secrets that yield fresh insights into the evolution of life, while bringing new foods, drugs and materials and pinpointing strategies for saving species at risk of extinction.

The most common way scientists study these relationships is by using Multiple Sequence Alignments (MSA), a tool that can be used to describe the evolutionary relationships of living organisms by looking for similarities and differences in their biological sequences, finding matches among seemingly unrelated sequences and predicting how a change at a specific point in a gene or protein might affect its function. The technology underpins so much biological research that the original study describing it is one of the most cited papers in history.
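The dynamic-programming idea that alignment tools build on can be shown at toy scale. The sketch below is standard textbook Needleman-Wunsch pairwise global alignment, not the CRG's method, and the sequences and scoring parameters are illustrative choices:

```python
def needleman_wunsch(a: str, b: str, match=1, mismatch=-1, gap=-1):
    """Globally align two sequences by dynamic programming."""
    n, m = len(a), len(b)
    # score[i][j] = best score aligning a[:i] with b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            score[i][j] = max(score[i - 1][j - 1] + sub,
                              score[i - 1][j] + gap,
                              score[i][j - 1] + gap)
    # Trace back to recover one optimal alignment.
    out_a, out_b = [], []
    i, j = n, m
    while i > 0 or j > 0:
        sub = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + sub:
            out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            out_a.append(a[i - 1]); out_b.append("-"); i -= 1
        else:
            out_a.append("-"); out_b.append(b[j - 1]); j -= 1
    return score[n][m], "".join(reversed(out_a)), "".join(reversed(out_b))

s, x, y = needleman_wunsch("GATTACA", "GCATGCU")
print(s)   # alignment score
print(x)   # gapped version of the first sequence
print(y)   # gapped version of the second sequence
```

Real MSA tools align thousands to millions of sequences at once, so this quadratic pairwise step has to be organized with guide trees and heuristics; the scoring principle, though, is the same.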

"We currently use multiple sequence alignments to understand the family tree of species evolution," says Cédric Notredame, a researcher at the Centre for Genomic Regulation in Barcelona and lead author of the study. "The bigger your MSA, the bigger the tree and the deeper we dig into the past and find how species appeared and separated from each other.

"What we've made lets us dig ten times deeper than what we've been able to do before, helping us to see hundreds of millions of years into the past. Our technology is essentially a time machine that tells us how ancient constraints influenced genes in a way that resulted in life as we know today, much like how the Hubble Space Telescope observes things that happened millions of years ago to help us understand the Universe we live in today."

Researchers can use MSA to understand how certain species of plants have evolved to be more resistant to climate change, or how particular genetic mutations in one species makes them vulnerable to extinction. By studying a living organism's evolutionary history, scientists may come up with and test new ideas to stave off the collapse of entire ecosystems.

Technological advances have made sequencing cheaper than ever before, resulting in increasingly large datasets with more than a million sequences for scientists to analyse. Some ambitious endeavours, like the Earth BioGenome Project, may run to the tens of millions. Researchers have not been able to take full advantage of these enormous datasets because current MSAs cannot analyse more than 100,000 sequences with accuracy.

To evaluate the scale-up potential of MSA, the authors of the paper used Nextflow, a cloud-computing software developed in-house at the Centre for Genomic Regulation. "We spent hundreds of thousands of hours of computation to test our algorithm's effectiveness," says Evan Floden, a researcher at the CRG who also led on developing the tool. "My hope is that in combining high-throughput instrumentation readouts with high-throughput computation, science will usher in an era of vastly improved biological understanding, ultimately leading to better outcomes for consumers, patients and our planet as a whole."

"There is a vast amount of 'dark matter' in biology, code we have yet to identify in the unexplored parts of the genome that is untapped potential for new medicines and other benefits we can't fathom," concludes Cédric. "Even seemingly inconsequential organisms may play a pivotal role in furthering human health and that of our planet, such as the discovery of CRISPR in archaea. What we have built is a new way of finding the needles in the haystack of life's genomes."

Credit: 
Center for Genomic Regulation

A new therapeutic target against diseases caused by lipid accumulation in cells

image: The study is led by Carles Enrich and Carles Rentero, lecturers at the unit of Cell Biology in the Department of Biomedicine of the Faculty of Medicine and Health Sciences at the UB and the CELLEX Biomedical Research Center (IDIBAPS-UB).

Image: 
UNIVERSITY OF BARCELONA

Researchers from the University of Barcelona (UB) and the August Pi i Sunyer Biomedical Research Institute (IDIBAPS) have found a new molecular mechanism involved in regulating the movement of cholesterol within cells, a process essential for proper cell function.

The study, published in the journal Cellular and Molecular Life Sciences, also identifies the protein Annexin A6 (AnxA6) as a key factor in this regulation and as a potential therapeutic target against diseases caused by the accumulation of cholesterol and other lipids in endosomes, such as Niemann-Pick disease type C1, a rare genetic disease with no cure that causes liver damage and a type of dementia.

The study is led by Carles Enrich and Carles Rentero, lecturers at the unit of Cell Biology in the Department of Biomedicine of the Faculty of Medicine and Health Sciences at the UB and the CELLEX Biomedical Research Center (IDIBAPS-UB). This is the result of six years of research and a collaboration with Thomas Grewal, from the University of Sydney; Elina Ikonen, from the University of Helsinki, and the research group on Lipids and Cardiovascular Pathology of the Biomedical Research Institute at Hospital Sant Pau.

A study using CRISPR/Cas9 editing technology

Cholesterol is essential to the organization of membranes and also modulates vesicular trafficking, both basic mechanisms of cell function. To coordinate and regulate cholesterol balance, or homeostasis, cells have developed a molecular machinery that is not yet fully understood. "Understanding these mechanisms is very important for treating diseases in which an accumulation of cholesterol and other lipids causes serious physiological alterations in the liver, spleen and especially the nervous system," note Carles Enrich and Carles Rentero.

One such disease is Niemann-Pick type C1, caused by a mutation in the NPC1 gene that leads to cholesterol accumulating inside endosomes. To study this mechanism, researchers used the CRISPR/Cas9 gene-editing technique to block a single molecule, the AnxA6 protein, in cells with the disease phenotype. Blocking AnxA6 released the cholesterol trapped in endosomes, revealing the protein's essential role in regulating cholesterol transfer.

Increasing membrane contact sites

The results of the study also show that this release occurred thanks to a significant increase in membrane contact sites (MCS), nanometric structures that can be seen through electron microscopy. According to the authors, cells from affected patients contain very few of these membrane contact sites; silencing AnxA6 therefore induces the formation of MCS, counteracts the effect of the NPC1 gene mutation and redirects cholesterol towards other cell compartments, restoring the cell to normality.

"These results could help treat the clinical impact of cholesterol accumulation in Niemann-Pick and in about a dozen other diseases, including several types of cancer (pancreatic, prostate, breast) in which lipid metabolism plays a fundamental role," note the researchers.

A new paradigm in the study of cellular cholesterol transport

The involvement of membrane contact sites in cholesterol transport is a pioneering result in this field, since researchers had until now thought that lipid transport was carried out by vesicles and a class of specialized proteins. "We do not know much about the functioning and dynamics of membrane contact sites, but this study, in line with other recent work, shows that MCS are a new paradigm for understanding the regulation, transport and homeostasis of lipids, cholesterol and calcium," conclude the researchers.

Credit: 
University of Barcelona

Monkeys inform group members about threats -- following principles of cooperation

image: Male mangabey monkey looking warily in the direction of the snake.

Image: 
A. Mielke/ MPI f. Evolutionary Anthropology

Cooperation - working together or exchanging services for the benefit of everyone involved - is a vital part of human life and contributes to our success as a species. Often, rather than helping specific others, we work for the good of the community, because this helps our friends and family who are part of the group, or because we share in the benefits with everyone else. However, even though the whole group can benefit when people work together, not everyone might be willing to contribute equally. One way for humans to cooperate is by exchanging information: from gossiping and storytelling to teaching and news reporting, we rely on some individuals possessing knowledge and sharing this knowledge for the greater good.

Like humans, many non-human primates live in close-knit social groups where individuals cooperate to their mutual benefit. As in our own species, information can be an important commodity for them: primates use a variety of calls to let each other know where they are going and whether they have found food. One of the most important messages to transmit is the presence of a threat: whenever a leopard or eagle has been spotted, calling for backup can help confuse predators or fight them off. While these calls help others in the group, they also clearly benefit the caller. This is different when the threat remains in one spot: many snakes, especially vipers, do not seem to actively hunt monkeys, but if stepped on, they can still bite and kill. However, once a monkey knows where the snake is, she herself is usually safe; giving a loud call to tell everyone about the snake costs time, and potentially exposes her to other dangers. Hence not all monkeys in a group will call out. So, why do some individuals call when they detect a threat that is not dangerous to themselves anymore?

Researchers from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, made use of the reaction of sooty mangabeys to snakes to understand how this monkey species cooperates by sharing information. Mangabeys live in the tropical Taï National Park, Côte d'Ivoire, in large groups (often consisting of more than 100 members), where individuals often cannot see everything that is happening to other group members, including their kin and friends. While mangabeys often find venomous snakes in the forest, it is hard to predict and film these encounters, so the researchers created realistic snake models out of mesh and papier-mâché and hid them where the mangabeys would pass. By filming the reaction of all the monkeys that saw the snake, the team could identify who called, when, and how often. This way, they hoped to understand whether monkeys called simply because they were scared of the snake, to show off their fearlessness, to warn their friends and family, or to share information widely across the group when it was likely that others following them did not know about the threat.

Over the course of a year, the researchers worked with a wild mangabey community within the context of the Taï Chimpanzee Project, conducting two to four experiments per month. Using snake models and up to five different camera angles, they were able to get detailed recordings of the behavior of each mangabey that came close enough to see the snake. All group members are used to human observers and cameras, and their family ties, friendships, and dominance relations are known, enabling the researchers to analyze in detail whether the presence and arrival of specific group members would prompt alarm calls. Typically, after monkeys find a snake, they will stick around and observe it for a time, while other individuals follow them and also inspect the snake. Between the first and last individual to encounter the threat, there is a line of monkeys that could call; however, not all of them do, and the question is whether the ones that call differ from the ones that do not.

"What surprised us, in both natural snake encounters and in our experiments, was how different individuals reacted to the threat: most individuals did not show strong reactions unless they almost stepped on the snake, and they would usually just call once or twice and then move on. On the other hand, we had a small number of individuals who called almost every time they saw a snake", says Alex Mielke, lead author of the study. "When we look at all experiments, though, a clear pattern emerges: mangabeys did not call specifically for their kin or friends, or ignorant group members. Individuals called when few others were around the snake or nobody had called in a while, effectively broadcasting the location of the snake to the general public when there is a chance that the information gets lost. This creates a system where no individual has to invest too much - one or two calls are enough before moving on - but because the threat is regularly re-advertised, the danger for following individuals is removed."
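The calling rule the authors describe, call when few others are near the snake or when nobody has called in a while, can be illustrated with a toy simulation. All parameters and the structure below are illustrative assumptions for the sketch, not values measured in the study:

```python
# Toy simulation of the broadcast rule described in the study.
# All parameters are illustrative assumptions, not measured values.

N_MONKEYS = 30       # monkeys passing the snake, one per time step
NEARBY_WINDOW = 3    # steps a monkey lingers near the snake after arriving
FEW_OTHERS = 2       # "few others around" threshold for calling
MAX_SILENCE = 4      # call if nobody has called for this many steps

def simulate():
    """Return the total number of alarm calls given by the passing group."""
    arrivals = []    # arrival times of monkeys that have seen the snake
    last_call = 0    # time of the most recent alarm call
    calls = 0
    for t in range(N_MONKEYS):
        arrivals.append(t)
        # how many *other* monkeys are still lingering near the snake
        nearby = sum(1 for a in arrivals if t - a < NEARBY_WINDOW) - 1
        # the rule: call if few others are present, or if the group has
        # been silent too long; then move on after one call
        if nearby < FEW_OTHERS or t - last_call >= MAX_SILENCE:
            calls += 1
            last_call = t
    return calls

print(simulate(), "calls for", N_MONKEYS, "monkeys")
```

Under these made-up parameters, only a small fraction of the monkeys ever call, yet the gap between consecutive calls never exceeds the silence threshold, matching the described pattern of low individual cost with regular re-advertising of the danger.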

These results showcase how the social system an animal lives in shapes how information needs to be transmitted: in mangabeys, groups are large but all group members travel together, so an individual at the front of the group who spots a snake and calls out can probably assume that her sister at the back of the group will also get the message, as the information gets reiterated by those in between. In chimpanzees, where the community splits into subgroups while moving around, the same research group in Leipzig was able to show that individuals wait close to the snake and inform arriving group members who are potentially unaware of the danger.

The behavior of the mangabeys also sheds light on how cooperation could have evolved in a group setting. Even though the monkeys do not individually warn their kin and allies, if they find a snake and pass the information on to the rest of the group, they can rely on others to contribute in the same way, so that all group members (including family) are informed and hence less likely to be injured or killed by the snake. Similar mechanisms can be important in other situations where primates work together to achieve a goal, such as defending their territory or acquiring food together. This study thus further illuminates the extent to which cooperation shapes social behavior in our primate cousins.

Credit: 
Max Planck Institute for Evolutionary Anthropology

New clues about the origins of familial forms of amyotrophic lateral sclerosis

image: These are human cells producing aggregates of the protein Sod1 (in green).

Image: 
Aline Brasil, Elis Eleutherio, and Tiago Outeiro

A team led by Brazilian researcher Elis Eleutherio, professor at the Federal University of Rio de Janeiro, in partnership with Tiago Outeiro, at the University of Goettingen, Germany, made important progress in understanding the conformation and accumulation of certain proteins involved in amyotrophic lateral sclerosis (ALS).

"We believe protein accumulation is an important hallmark of ALS, and we still do not understand why the protein misbehaves and aggregates during the disease", explains Prof. Elis Eleutherio.

Amyotrophic lateral sclerosis (ALS) is a progressive and devastating neurodegenerative disorder affecting 1 to 3 individuals per 100,000, and is most prevalent in people between 55 and 75 years of age. The disease primarily affects a population of neurons known as 'motor neurons'. Patients suffer from irreversible motor paralysis and become incapable of speaking, swallowing, or breathing as the disease progresses.

Most ALS cases are sporadic, with no defined genetic origin; only a minority are familial, with known associated genetic alterations. Certain familial forms of ALS (fALS) are associated with mutations in the gene encoding a protein known as superoxide dismutase 1 (Sod1), which alter the folding and function of the protein.

The study, published in the journal Proceedings of the National Academy of Sciences (PNAS), allowed the scientists to understand the interaction between the normal and mutant proteins, which not only alters protein accumulation in the cell but also impairs the function of Sod1, thus contributing to the development of the disease. For the group, this discovery opens new perspectives for the treatment of ALS.

Sod1 is a protein whose roles include protecting our cells against oxidative damage. In some ALS cases, altered Sod1 protein accumulates inside neuronal cells and, researchers believe, damages the neurons, leading to their death. Importantly, normal Sod1 protein, present in sporadic cases of ALS, can also misfold and accumulate, suggesting this is a central problem in ALS.

In the study, the researchers used simple experimental models, such as baker's yeast (the same organism used to make beer, wine and bread) and human cells, in order to better understand protein aggregation in the context of the disease. They also used a strategy that mimics the genetic background of fALS, in which most patients carry one normal copy of the Sod1 gene and one copy carrying a genetic alteration. "In patients, we think that the presence of a mutant copy of Sod1 alters the behavior of the normal copy", explains Dr. Aline Brasil, the first author of the study.

"By taking advantage of novel genetic manipulation tools and powerful molecular imaging approaches that enable the direct visualization of protein complexes in the cell (a technique known as BiFC), we were able to detect 'hetero-complexes' formed by normal and abnormal (mutant) Sod1 protein", said Prof. Tiago Outeiro, leader of the German team that participated in the study.

The research opens novel perspectives for therapeutic intervention, such as the specific removal of mutant Sod1 protein, which the authors hope to explore in the near future.

"In a time when the scientific and education system in Brazil suffers from uncertainty, it is important to demonstrate that we can be competitive and conduct research that will contribute to society and may, ultimately, help change the lives of those affected by such devastating diseases", Prof. Elis Eleutherio concludes.

Credit: 
Instituto Nacional de Ciência e Tecnologia de Biologia Estrutural e Bioimagem (INBEB)

Supermarkets and child nutrition in Africa

image: Malnutrition is a widespread problem in Africa.

Image: 
E M Meemken

Hunger and undernutrition are still widespread problems in Africa. At the same time, overweight, obesity, and related chronic diseases are on the rise. Recent research has suggested that the growth of supermarkets contributes to obesity in Africa, because supermarkets tend to sell more processed foods than traditional markets. However, previous studies looked only at data from adults. New research shows that supermarkets are not linked to obesity in children, but that they instead contribute to reducing child undernutrition. The results were recently published in the journal Global Food Security.

For the research, agricultural and food economists from the University of Göttingen in Germany collected data from over 500 randomly selected children in Kenya over a period of three years. The most widely used indicator of chronic child undernutrition is "stunting": impaired growth and development, reflected in low height for age. The data show that children from households with access to a supermarket are significantly better nourished than children in the reference group, and that purchasing food in a supermarket has particularly positive effects on child growth and height, even after controlling for age, income, and other relevant factors.
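The stunting indicator is conventionally computed as a height-for-age z-score (HAZ) against a reference population, with a child below −2 standard deviations classified as stunted. A minimal sketch of that calculation; the example height and reference values here are hypothetical placeholders, not actual WHO reference data:

```python
def height_for_age_z(height_cm, ref_median_cm, ref_sd_cm):
    """Height-for-age z-score (HAZ): distance from the reference
    median height for the child's age and sex, in SD units."""
    return (height_cm - ref_median_cm) / ref_sd_cm

def is_stunted(haz, cutoff=-2.0):
    """Conventional cutoff: HAZ below -2 SD indicates stunting."""
    return haz < cutoff

# Hypothetical example values (not taken from any reference table):
haz = height_for_age_z(height_cm=82.0, ref_median_cm=87.0, ref_sd_cm=3.0)
print(round(haz, 2), is_stunted(haz))
```

A z-score framing like this is what makes "better nourished" comparable across children of different ages, since each child is measured against the reference distribution for their own age.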

"At first, we were surprised about the results, because it is often assumed that supermarkets in Africa primarily sell unhealthy snacks and convenience foods", says Dr Bethelhem Legesse Debela, the study's first author. "But our data show that households using supermarkets also consume healthy foods such as fruits and animal products more regularly." Professor Matin Qaim, the leader of the research project adds: "Not all processed foods are automatically unhealthy. Processing can improve the hygiene and shelf-life of foods. Poor households in Africa in particular often have no regular access to perishable fresh produce."

The findings make clear that modernization of the food retail sector can have multilayered effects on nutrition, which need to be analyzed in their local context. The United Nations pursues the goal of eradicating global hunger in all its forms by 2030. According to the study authors, "this can only be achieved when we better understand the complex relations between economic growth, nutrition, and health and identify and implement locally-adapted policies".

Credit: 
University of Göttingen