Earth

How the humble marigold outsmarts a devastating tomato pest

Scientists have revealed for the first time the natural weapon used by marigolds to protect tomato plants against destructive whiteflies.

Researchers from Newcastle University's School of Natural and Environmental Sciences carried out a study to prove what gardeners around the world have known for generations - marigolds repel tomato whiteflies.

Publishing their findings today (1 March) in the journal PLOS ONE, the experts have identified limonene - released by marigolds - as the main component responsible for keeping tomato whiteflies at bay. The insects find the smell of limonene repellent and are slowed down by the powerful chemical.

Large-scale application

The findings of the study have the potential to pave the way to developing safer and cheaper alternatives to pesticides.

Since limonene repels whiteflies without killing them, using the chemical shouldn't lead to resistance, and the study has shown that it doesn't affect the quality of the produce. All it takes to deter the whiteflies is interspersing marigolds in tomato plots, or hanging little pots of limonene among the tomato plants so that the smell can disperse out into the tomato foliage.

In fact, the research team, led by Dr Colin Tosh and Niall Conboy, has shown that it may be possible to develop a product, similar to an air freshener, containing pure limonene, that can be hung in glasshouses to confuse the whiteflies by exposing them to a blast of limonene.

Newcastle University PhD student Niall said: "We spoke to many gardeners who knew marigolds were effective in protecting tomatoes against whiteflies, but it has never been tested scientifically.

"We found that the chemical which was released in the highest abundance from marigolds was limonene. This is exciting because limonene is inexpensive, it's not harmful and it's a lot less risky to use than pesticides, particularly when you don't apply it to the crop and it is only a weak scent in the air.

"Most pesticides are sprayed onto the crops. This doesn't only kill the pest that is targeted, it kills absolutely everything, including the natural enemies of the pest."

Limonene makes up around 90% of the oil in citrus peel and is commonly found in household air fresheners and mosquito repellent.

Dr Tosh said: "There is great potential to use limonene indoors and outdoors, either by planting marigolds near tomatoes, or by using pods of pure limonene. Another important benefit of using limonene is that it's not only safe to bees, but the marigolds provide nectar for the bees which are vital for pollination.

"Any alternative methods of whitefly control that can reduce pesticide use and introduce greater plant and animal diversity into agricultural and horticultural systems should be welcomed."

The researchers carried out two big glasshouse trials. Working with French marigolds in the first experiment, they established that the repellent effect works and that marigolds are an effective companion plant to keep whiteflies away from the tomato plants.

For the second experiment, the team used a machine that allowed them to analyse the gaseous and volatile chemicals released by the plants. Through this they were able to pinpoint which chemical was released from the marigolds. They also determined that interspersing marigolds with other companion plants that whiteflies dislike neither increases nor decreases the repellent effect, meaning that any non-host plant of the whiteflies can repel them, not just marigolds.

A notorious pest

Whitefly adults are tiny, moth-like insects that feed on plant sap. They cause severe produce losses to an array of crops by transmitting a number of plant viruses and encouraging mould growth on the plant.

Dr Tosh said: "Direct feeding from both adults and larvae results in honeydew secretion at a very high rate. Honeydew secretion that covers the leaves reduces the photosynthetic capacity of the plant and renders fruit unmarketable."

Further studies will focus on developing a three companion plant mixture that will repel three major insect pests of tomato - whiteflies, spider mites and thrips.

Longer term, the researchers aim to publish a guide focussing on companion plants as an alternative to pesticides, which would be suitable across a range of horticultural problems.

Credit: 
Newcastle University

Novel sleep index, wakefulness may predict if patients able to breathe on their own

image: Wakefulness may predict if patients able to breathe on their own.

Image: 
ATS

March 1, 2019--Critically ill patients are more likely to be successfully weaned from a mechanical ventilator, or breathing machine, if they have higher levels of wakefulness and both their right and left brains experience the same depth of sleep, according to new research published online in the American Thoracic Society's American Journal of Respiratory and Critical Care Medicine.

In "Sleep and Pathological Wakefulness at Time of Liberation from Mechanical Ventilation," Laurent Brochard, MD, PhD, and co-authors analyzed data gathered using polysomnography, or a sleep study, together with a novel index developed by one of the co-authors, Magdy Younes, MD, PhD. The index, the odds ratio product (ORP), provides a continuous digital score from 0 (very deep sleep) to 2.5 (full wakefulness).

The researchers wanted to determine whether the ORP was associated with the likelihood that a patient could be successfully removed from mechanical ventilation.

"Patients under mechanical ventilation in intensive care units frequently suffer from severe sleep deprivation and, as a consequence, exhibit abnormal patterns of sleep or wakefulness, which explain in part the frequent development of delirium," said Dr. Brochard, senior study author and director of the Critical Care Medicine Division at the University of Toronto and a clinician-scientist at the Keenan Research Centre for Biomedical Science at St. Michael's Hospital in Toronto, Canada.

While mechanical ventilation is life-saving, it can cause lung damage, infections and other health problems, so patients should be taken off a ventilator as soon as medically possible. Physicians use a spontaneous breathing trial (SBT) during which a patient breathes with no or little help from the ventilator to assess the patient's readiness for breathing on his or her own.

"Successful separation from mechanical ventilation necessitates an adequate response from a number of physiological systems, all of which could be impaired by sleep deprivation," Dr. Brochard said, noting that previous studies have, in fact, linked pathological sleep with prolonged difficulties in being separated from the ventilator.

"We wondered whether assessing a period of sleep and wakefulness in the hours before attempting a separation from the ventilator could predict the success of this process."

The researchers analyzed data from 37 patients at 3 Toronto-area hospitals who were scheduled for an SBT and had undergone polysomnography for 15 hours before the test. SBT was successful in 19 patients. In 11 of these patients, their breathing tube was removed; in the other 8 patients the breathing tube was not removed because, despite a successful SBT, other clinical factors indicated they were not ready for extubation. In 18 patients, the SBT was unsuccessful.

The study found:

Classical sleep stages as determined by conventional sleep scoring guidelines were not associated with success or failure of the SBT.

Longer durations of full wakefulness as measured by ORP (>2.2) were highly correlated with a successful SBT and extubation.

Poor correlation between sleep depth in the right and left brain hemispheres strongly predicted SBT failure.

The researchers said the fact that ORP scores were associated with success or failure in weaning patients from mechanical ventilation, while standard sleep scores were not, most likely reflects ORP's ability to better distinguish different levels of sleep.

Dr. Brochard explained that classical sleep analysis is made very difficult by frequent "atypical" or "pathological" tracings in ICU patients. This is particularly true during a condition called pathological wakefulness.

"Defining wakefulness or sleep classically," Dr. Brochard said, "necessitates detecting short-wave brain activity that typically characterizes sleep and a decrease in higher frequencies that characterize wakefulness and comparing these results to clinical behavior: does the patient look awake or asleep?"

The researchers noted that sleep deprivation produces a brainwave pattern similar to pathological wakefulness and that, despite being clinically "awake," patients are "obtunded," meaning they are not fully alert. The authors speculate that this pathological wakefulness is the flip side of sleep deprivation.

Dr. Brochard said the dissociation between the brain hemispheres was an "entirely new" finding that raised many questions: What is exactly causing the dissociation between the two hemispheres observed in these patients? Is it primarily sleep deprivation? Or is there an influence of the sedative drugs administered in the first days, or of certain medical conditions? How and how fast can it be reversed?

Answers to these questions may lead to changes in how mechanically ventilated patients are managed in the ICU.

"We now have a monitoring tool of the brain that can help us address questions of major importance for the outcome of patients in the ICU," Dr. Brochard said.

Credit: 
American Thoracic Society

Open-source software tracks neural activity in real time

image: This is an image of neurons (white) taken using calcium imaging techniques. An innovative software dubbed CaImAn can automatically differentiate between individual neurons (yellow outlines) with nearly the same accuracy as a human (red outlines).

Image: 
Giovannucci et al./eLife 2019

Tracking the firings of individual neurons is like trying to discern who is saying what in a football stadium full of screaming fans. Until recently, neuroscientists have had to tediously track each neuron by hand.

"People spent more time analyzing their data to extract activity traces than actually collecting it," says Dmitri Chklovskii, who leads the neuroscience group at the Center for Computational Biology (CCB) at the Flatiron Institute in New York City.

A breakthrough software tool called CaImAn automates this arduous process using a combination of standard computational methods and machine-learning techniques. In a paper published in the journal eLife in January, the software's creators demonstrate that CaImAn achieves near-human accuracy in detecting the locations of active neurons based on calcium imaging data.

CaImAn (an abbreviation of calcium imaging analysis) has been freely available for a few years and has already proved invaluable to the calcium imaging community, with more than 100 labs using the software. The latest iteration of CaImAn can run on a standard laptop and analyze data in real time, meaning scientists can analyze data as they run experiments. "My lab is excited about being able to use a tool like this," says Duke University neuroscientist John Pearson, who was not involved in the software's development.

CaImAn is the product of an effort initiated by Chklovskii within his group at CCB. He brought on Eftychios Pnevmatikakis and later Andrea Giovannucci to spearhead the project. Their aim was to help tackle the enormous datasets produced by a method called calcium imaging.

That technique involves adding a special dye to brain tissue or to neurons in a dish. The dye binds to the calcium ions responsible for activating neurons. Under ultraviolet light, the dye lights up. Fluorescence only occurs when the dye binds to a calcium ion, allowing researchers to visually track a neuron's activity.

Analyzing the data gathered via calcium imaging poses a significant challenge. The process generates a flood of data -- up to 1 terabyte an hour of flickering movies -- that rapidly becomes overwhelming. "One experimenter can fill up the largest commercially available hard drive in one day," says Michael Häusser, a neuroscientist at University College London whose team tested CaImAn.

The data are also noisy. Much like mingling voices, fluorescent signals from different neurons often overlap, making it difficult to pick out individual neurons. Moreover, brain tissue jiggles, adding to the challenge of tracking the same neuron over time.

Pnevmatikakis, now a research scientist at the Flatiron Institute's Center for Computational Mathematics, first began developing the basic algorithm underlying CaImAn as a postdoc in Liam Paninski's lab at Columbia University.

"It was elegant mathematically and did a decent job, but we realized it didn't generalize well to different datasets," Pnevmatikakis says. "We wanted to transform it into a software suite that the community can use." That was partly why he was drawn to the neuroscience group at Flatiron, which develops new tools for analyzing large datasets.

Pnevmatikakis later began working with Giovannucci, then a postdoc at Princeton University, on applying the algorithm to tracking the activity of cerebellar granule cells, a densely packed, rapid-firing group of neurons. "Existing analysis tools were not powerful enough to disentangle the activity of this population of neurons and implied that they were all doing the same thing," says Giovannucci, who joined the CCB neuroscience group for three years to help develop the software for broader use. "The algorithm subtracts the background voices and focuses on a few," revealing that individual granule cells do indeed have distinct activity patterns.

Further work at the Flatiron Institute honed CaImAn's abilities and made the software easier for researchers to use for a variety of experiments without extensive customization.

The researchers recently tested CaImAn's accuracy by comparing its results with a human-generated dataset. The comparison proved that the software is nearly as accurate as humans in identifying active neurons but much more efficient. Its speediness allows researchers to adapt their experiments on the fly, improving studies of how specific bundles of neurons contribute to different behaviors. The human dataset also revealed high variability from person to person, highlighting the benefit of having a standardized tool for analyzing imaging data.

In addition to benchmarking accuracy, the researchers used the human-annotated results as a training dataset, developing machine-learning-based tools to enhance the CaImAn package. They have since made this dataset public, so that the community can use it to further extend CaImAn or to create new tools.

Credit: 
Simons Foundation

New research suggests earlier emergence of malaria in Africa

image: New research suggests an earlier emergence of malaria in Africa.

Image: 
© Institut Pasteur

Malaria, which claims hundreds of thousands of lives each year - mainly children, and especially in Africa - is one of the leading causes of death by an infectious agent, the parasite Plasmodium falciparum. In research on malaria, the genetic mutation that causes sickle cell anemia (also known as drepanocytosis), a chronic disease that is often fatal in children under five, caught the attention of the scientific community very early on because it also provides protection against malaria. After carrying out extensive research into the βS mutation by performing full sequencing of the HBB gene together with a large-scale genomic study on 479 individuals from 13 populations from Sub-Saharan Africa, scientists from the Institut Pasteur and the CNRS were able to reveal that malaria emerged in Africa at least 20,000 years ago - and not at the same time as the adoption of agriculture 4,000 to 5,000 years ago. The findings will be published in the American Journal of Human Genetics on February 28, 2019.

Individuals carrying the βS mutation in the HBB gene who do not develop sickle-cell anemia - healthy carriers - demonstrate increased resistance to malaria infection. This evolutionary paradox, first revealed in the early 1950s - a mutation that is by definition harmful but promotes the survival of some individuals - means that βS can be seen both as an emblematic example of natural selection in humans and above all as an ideal marker for malaria research, since the date of emergence of βS corresponds with the minimum date for the emergence of malaria.

Research carried out in recent decades suggests that the date of emergence of βS, and therefore also malaria, coincides with the dates on which agriculture is known to have been adopted as the main means of livelihood in Central Africa around 4,000 to 5,000 years ago. The scientific community had long accepted the existence of a causal link between the emergence of agriculture and the spread of malaria in Africa. But nothing was known about the history of malaria in African populations that did not adopt agriculture.

Drawing on new genetic data obtained by scientists from the Human Evolutionary Genetics Unit at the Institut Pasteur, a study carried out by Institut Pasteur and CNRS scientists Guillaume Laval and Lluis Quintana-Murci, in close collaboration with the Max Planck Institute in Leipzig, Germany, and the IRD, has cast doubt on the role of agriculture in the emergence of malaria in Africa. The results of this collaborative scientific research, based on a novel formalization of the specific natural selection method generally accepted in the case of βS, show that this mutation emerged around 20,000 years ago. These new findings therefore indicate that malaria was rife well before the adoption of agriculture - contradicting widely held interpretations.

The research also shows that the βS mutation appeared much more recently, approximately 4,000 years ago, in hunter-gatherer populations.

Changes in the equatorial forest during this period - most likely because of an episode of climate change and/or a period of increased deforestation owing to the emergence of agriculture - are thought to have facilitated the spread of malaria among pygmy populations. "We show that the βS mutation, which provides resistance to malaria, may have been spread by agricultural populations who came into contact with these populations of hunter-gatherers during the Bantu migration, when farming communities crossed the equatorial forest and set out on the major migratory routes to the eastern and southern regions of Sub-Saharan Africa," comments Guillaume Laval, lead author of the paper. "These results shed new light on a little-known chapter in the history of malaria and demonstrate the beneficial effects of admixture for some aspects of public health, such as the spread of mutations conferring resistance to various pathogens among human populations," adds Lluis Quintana-Murci, joint last author of the paper.

Credit: 
Institut Pasteur

500-million-year old worm 'superhighway' discovered in Canada

image: These are worm tunnels (labelled) visible in a small section of rock.

Image: 
Professor Brian Pratt, University of Saskatchewan

Prehistoric worms populated the sea bed 500 million years ago--evidence that life was active in an environment thought uninhabitable until now, research by the University of Saskatchewan (USask) shows.

The sea bed in the deep ocean during the Cambrian period was thought to have been inhospitable to animal life because it lacked enough oxygen to sustain it.

But research published in the scientific journal Geology reveals the existence of fossilized worm tunnels dating back to the Cambrian period - 270 million years before the evolution of dinosaurs.

The discovery, by USask professor Brian Pratt, suggests that animal life in the sediment at that time was more widespread than previously thought.

The worm tunnels--burrows where worms lived and munched through the sediment--are invisible to the naked eye. But Pratt "had a hunch" and sliced the rocks and scanned them to see whether they revealed signs of ancient life.

The rocks came from an area in the remote Mackenzie Mountains of the Northwest Territories in Canada which Pratt found 35 years ago.

Pratt then digitally enhanced images of the rock surfaces so he could examine them more closely. Only then did the hidden 'superhighway' of burrows made by several different sizes and types of prehistoric worm emerge in the rock.

Some were barely a millimetre in size and others as large as a finger. The smaller ones were probably made by simple polychaetes--or bristle worms--but one of the large forms was a predator that attacked unsuspecting arthropods and surface-dwelling worms.

Pratt said he was "surprised" by the unexpected discovery.

"For the first time, we saw evidence of large populations of worms living in the sediment - which was thought to be barren," he said. "There were cryptic worm tunnels - burrows - in the mud on the continental shelf 500 million years ago, and more animals reworking, or bioturbating, the sea bed than anyone ever thought."

Pratt, a geologist and paleontologist and Fellow of the Geological Society of America, found the tunnels in sedimentary rocks that are similar to the Burgess Shale, a famous fossil-bearing deposit in the Canadian Rockies.

The discovery may prompt a rethink of the level of oxygenation in ancient oceans and continental shelves.

The Cambrian period saw an explosion of life on Earth in the oceans and the development of multi-cellular organisms including prehistoric worms, clams, snails and ancestors of crabs and lobsters. Previously the seas had been inhabited by simple, single-celled microbes and algae.

It has always been assumed that the creatures in the Burgess Shale--known for the richness of its fossils--had been preserved so immaculately because the lack of oxygen at the bottom of the sea stopped decay, and because no animals lived in the mud to eat the carcasses.

Pratt's discovery, with co-author Julien Kimmig, now of the University of Kansas, shows there was enough oxygen to sustain various kinds of worms in the sea bed.

"Serendipity is a common aspect to my kind of research," Pratt said. "I found these unusual rocks quite by accident all those years ago. On a hunch I prepared a bunch of samples and when I enhanced the images I was genuinely surprised by what I found," he said.

"This has a lot of implications which will now need to be investigated, not just in Cambrian shales but in younger rocks as well. People should try the same technique to see if it reveals signs of life in their samples."

Credit: 
University of Saskatchewan

Study identifies predictors of psychiatric events during drug-assisted smoking cessation

image: Robert Anthenelli, MD, professor of psychiatry and director of the Pacific Treatment and Research Center at UC San Diego School of Medicine.

Image: 
UC San Diego Health

Researchers at University of California San Diego School of Medicine have identified a clear group of characteristics that predict heightened risk for experiencing increased anxiety or worsening of mood that interferes with daily activities when using a smoking cessation drug. Results are published in the February 27, 2019 online edition of the Journal of General Internal Medicine.

"This is the first time that a large scale multinational study has carefully examined psychiatric events among smokers with and without mental health issues, comparing risks across all three frontline medications," said first author Robert Anthenelli, MD, professor of psychiatry and director of the Pacific Treatment and Research Center at UC San Diego School of Medicine. "By identifying the baseline characteristics that predict moderate-to-severe adverse events, we can better treat all patients."

"We learned that there are three predictors of clinically significant psychiatric adverse events in both smokers with and without mental health conditions: existing anxiety, a history of suicidal ideation or behavior and being of white race," said Anthenelli. "In participants with histories of mental health issues, predictors included being young, female, a long-term smoker and a history of co-occurring psychiatric and substance abuse disorders."

The findings represent a secondary analysis of the 2016 EAGLES study, which found that 2 percent of participants with no prior history of mental illness experienced a moderate-to-severe psychiatric event while using a smoking cessation aid. In patients with a prior history of mental illness, 6 percent had such adverse events.

"The good news is that the vast majority of people do not have adverse psychiatric events when taking the most prescribed smoking cessation drugs," said Anthenelli. "These medicines are effective aids to quit smoking. However, like all drugs, these medications should be monitored, especially in patients who have a diagnosis of mental illness."

Eligible participants for the original placebo-controlled study were men and women aged 18 to 75 years who smoked an average of more than 10 cigarettes per day and who were motivated to stop smoking. Tested therapies included a nicotine patch, bupropion and varenicline.

"People with mental illness are a neglected subpopulation of smokers," said Anthenelli. "They smoke at rates two to four times higher than the general population and consume nearly one out of every two to three cigarettes sold in the U.S. and U.K.

"These individuals have a harder time quitting smoking and are disproportionately affected by tobacco-related diseases and premature death. Just because they have mental illness, however, does not mean they should not consider a smoking cessation drug as the risk of disease from long-term smoking is much greater."

Credit: 
University of California - San Diego

Custom-made proteins may help create antibodies to fight HIV

A new way to create proteins that can sneak through HIV's protective coating may be a step toward understanding the key components needed for developing a vaccine for the virus, according to researchers.

Using computational modeling, a team of researchers led by Penn State designed and created proteins that mimicked different surface features of HIV. After being immunized with the proteins, rabbits developed antibodies that were able to bind with the virus.

"We were able to show that by using our designed proteins, the blood was able to spontaneously generate antibodies that can inhibit the infection of HIV in cellular models," said Cheng Zhu, a postdoctoral fellow at Penn State College of Medicine. "When we incubated the HIV virus, its infectivity was dramatically reduced by the rabbits' blood."

Zhu added that the study -- published today (Feb. 27) in Nature Communications -- provides a novel way to design proteins for vaccines.

"The proteins -- or immunogens -- we developed aren't a finished product, but we were able to show evidence that it's possible to do," Zhu said. "Moreover, it's also very exciting that we were able to create a new method to tailor make proteins, which could open the door for developing vaccines for other infections, as well."

Although millions of people are living with HIV across the globe, creating a vaccine for the virus has eluded researchers. Vaccines work by teaching the immune system where on a virus an antibody can attach before neutralizing it. To create a vaccine, researchers first have to identify this spot.

Nikolay Dokholyan, G. Thomas Passananti Professor and Vice Chair for Research in the Department of Pharmacology at Penn State, explained that developing a vaccine for HIV is difficult because the virus constantly mutates.

"Even if we develop an antibody for a particular strain of the virus, that antibody may not even notice the next strain of the virus," Dokholyan said. "In order to develop broadly neutralizing antibodies -- antibodies that neutralize multiple strains of a virus -- we need to find something that remains constant on the virus for those antibodies to latch onto."

According to Dokholyan, HIV uses a coating of carbohydrates to protect a protein on its surface called Env. While this protein could be a potential target for vaccines, the carbohydrate coating makes it difficult or impossible for antibodies to access and neutralize it.

But sometimes, holes naturally appear in this coating, exposing the Env protein to potential antibodies. Zhu said he and the other researchers wanted to find a way to target these holes.

"The idea would be to do molecular surgery to copy sections of the virus's surface and paste them onto different, benign proteins, so they would look but not act like the Env protein," Zhu said. "Hopefully, this would allow the immune system to recognize the virus and create antibodies to neutralize it in the future."

The researchers used computational models to design proteins that would mimic the conserved protein surface of different strains of HIV to be used in the vaccine. Dokholyan said that while usually proteins are engineered by changing one amino acid at a time, they wanted to try a different approach.

"Instead of changing one amino acid at a time, it's a large surface of the HIV strain that is cut and then plugged onto a different protein," Dokholyan said. "It's an important milestone to be able to do these major molecular surgeries, and it's very exciting that the strategy worked with a very high accuracy."

After creating immunogens that used the new, HIV-mimicking proteins, the researchers immunized the rabbits and drew blood samples once a month. After analyzing the samples, the researchers found that the blood contained antibodies that were able to bind onto HIV.

The researchers said that while the findings are promising, there is still more work to be done.

"It's important that we were able to generate an immune response to HIV and show that it's possible as a proof of concept," Dokholyan said. "But, we still need to improve the antibodies' neutralization abilities and other aspects before it can become a viable vaccine."

Dokholyan said that in the future, the protein design method could potentially help create and personalize vaccines for different diseases in various areas in the world.

"Diseases can vary by location, for example, there are different strains of HIV in various countries or regions," Dokholyan said. "If we can easily customize proteins for vaccines, that's a good example of where personalized medicine is going to play a role."

Credit: 
Penn State

Getting to the core of underwater soil

image: Taking a soil core from underwater soil is a bit trickier than from soil on the surface. Scientists have to use boats and plunge the sampling equipment through water and down into the soil.

Image: 
Mark Stolt

Soils all over the Earth's surface are rigorously tested and managed. But what about soils that are down in the murky depths? Although not traditional soils, underwater soils have value and function. Some scientists are working to get them the recognition and research they deserve.

One of these scientists is Mark Stolt from the University of Rhode Island. He and his team are working to sample and map underwater soils.

"Considering that nearly half of the United States population lives within coastal counties, these soils impact many of their lives relative to commercial, recreational, and transportation activities," he explains. "Soil maps provide a mechanism to manage these areas and make fundamental use and management decisions that affect people every day. Unlike land, you can't put a price on underwater soils."

Underwater soils are affected by human activities such as dredging, aquaculture, and restoration, all of which affect water quality. In dredging, layers of the underwater soil are removed to make room--for example, to allow the bottom of a ship to pass. That soil has to go somewhere, and if it is put on land its effects can be unknown.

Aquaculture, raising aquatic plants (for example, nori) or animals (such as oysters), makes use of underwater habitats. In traditional agriculture, farmers are typically concerned about the state of their soil. However, underwater soil is poorly understood. What makes a good or bad soil for aquaculture is uncertain.

The underwater soils also face some threats from land activity, adds Stolt. For example, if too much fertilizer is put on a field it can run off into nearby bodies of water. This encourages algae growth. When that algae dies and decomposes, the water--and underwater soils--lose the oxygen fish and plants need.

"These soils are the foundation and structure of a myriad of habitats and ecosystems," Stolt says. "For example, submerged aquatic vegetation are rooted plants that derive much of their nutrients from underwater soils. They trap sediment and minimize coastal erosion. They are a huge sink for carbon dioxide that is eventually stored in the soils, and like other plants, they add oxygen. Not to mention provide habitats for the animals that live in them. Their importance goes on and on."

By taking measurements of the soils, researchers are able to add data to the National Cooperative Soil Survey maps. These provide uniform and rigorous assessments for soil and are used throughout the country. Stolt says soil scientists need to develop faster and better ways to create these maps.

Stolt and his team have been able to add new classifications of underwater soils. These classifications lay out the properties of the soil and provide information on how it was formed, and how it can be used and managed.

"There is still lots and lots to learn about underwater soils, but some of our experiences suggest these soils are quite resilient," Stolt says. "For example, we tested whether long-term high-production oyster aquaculture (up to 20 years) would have a negative effect on the underwater soil. Although there were some negative effects, the soils essentially remained the same in this particular area."

Stolt and his team continue to map underwater soils to help define and understand their value.

"I am interested in the application of soils information toward solving environmental issues and problems," he says. "I live in the Ocean State. We are surrounded by estuaries. I want to see them correctly managed and protected for everyone."

Stolt presented his research at the International Meeting of the Soil Science Society of America, Jan. 6-9, in San Diego. Studies from this work were funded by NOAA, Rhode Island Sea Grant, NIFA, and USDA-NRCS.

Credit: 
American Society of Agronomy

More extreme coastal weather events would likely increase bluff erosion, landslide activity

image: Armored vs. unarmored (photo by Ben Leshchinsky, Oregon State University)

Image: 
Ben Leshchinsky, Oregon State University

CORVALLIS, Ore. - Unstable slopes on Oregon's coastline could see a 30 percent jump in landslide movements if extreme storms become frequent enough to increase seacliff erosion by 10 percent, a new study by Oregon State University shows.

For many of the slope failures that cross Highway 101, sea cliffs form the base of active slides that already move a little every year, said the study's corresponding author, Ben Leshchinsky, a forest engineering and civil engineering researcher at OSU.

The findings are especially relevant for slides along the 360-mile-long coastline that are highly susceptible to "toe erosion" - the removal of buttressing material at the base by waves or river scour.

"The really big slides, the monstrous landslides that span hundreds of acres, are so large and driven by water that their instability is less dominated by erosion," Leshchinsky said. "But the ones that are highly exposed to erosion from the ocean, if we see increased extreme events like storm surge, then we'll see increased bluff erosion. These creeping landslides that are already active and moving every year, they'll move a lot more. If erosion increases 10 percent, slide movements might increase by 20 or 30 percent."

Also known as slow earthflows, the slides are prone to gradual movement, the result of being marginally stable to begin with, coupled with seasonal changes in the water pressure within their soil.

Fluctuations in groundwater levels are primarily what drive changes in pore pressure and in turn landslide movement, but undercutting and "toe retreat" also play key roles, Leshchinsky said. As a bluff's toe erodes, the soil in the bluff shifts gradually - sometimes ending in collapses that can topple houses and bury roads such as Highway 101, which runs the length of the Oregon coast.

"What's going on may often be imperceptible until a landslide begins to compromise infrastructure like underground utilities or roads," he said.

Leshchinsky and collaborators developed a model to determine the relationship between progressive landslide movement and slope geometry, undercutting processes and hydrological changes. The model showed agreement with data gathered at three monitored slide sites along Oregon's shoreline.

"The smaller progressive slides are particularly sensitive to movements and undercutting," he said. "Our model shows the shorter-length landslides are proportionately more destabilized by the removal of buttressing materials."

The observed rates of landslide advance demonstrate sensitivity to changes in erosion rates, he said, highlighting the potential impacts of increased future wave attack or fluvial erosion - both of which could be exacerbated by a changing climate.

"The sensitivity of slope movements to the rate of these processes highlights the importance of gathering coastal erosion data over time, especially considering estimates of future sea level rise and possible changes in the magnitude and frequency of coastal storms," Leshchinsky said. "If we can decrease erosion, then for some of the smaller slides, we could, potentially, arrest or slow slide movement significantly."

Options for decreasing erosion include dune-stabilizing vegetation where applicable, seawalls, riprap and revetments.

Revetments are sloping structures placed some distance from a bluff, designed to absorb wave energy before it hits the bluff. Newer armoring technologies, such as dynamic revetments - cobbles placed along the toe of sea cliffs - have also been used to absorb wave energy, but they require significant quantities of material and frequent maintenance.

Riprap refers to large, loose boulders; placed at the toe of a bluff, it provides stability in the manner of a flying buttress. But its use in Oregon was severely restricted in the early 1970s by the state legislature, which sided with critics who called it an eyesore that ruined beaches and caused erosion elsewhere.

"There is some truth that riprap, by virtue of preventing erosion in a given location, may shift erosion elsewhere and slows the creation of new sand, which often comes from the sea cliffs failing and degrading into new sand," Leshchinsky said. "However, in limited, controlled applications to conserve exposed critical infrastructure, the benefits may outweigh the cost."

Credit: 
Oregon State University

Researchers identify how the bacterial replicative helicase opens to start DNA replication process

image: Loader proteins (orange) attach to the DnaB replicative helicase (white), causing it to briefly spiral open. Opening of the helicase enables entry of one of the DNA molecule's two strands. Once activated, the helicase runs the length of the strand, separating the two strands and initiating the replication process. Here, the helicase-helicase loader complex is shown atop a set of cryogenic electron microscopy images used to visualize the structure.

Image: 
Jillian Chase and David Jeruzalmi

NEW YORK, February 26, 2019 - DNA replication is a complex process in which a helicase ring separates the DNA molecule's two entwined and encoded strands, allowing each to precisely reproduce its missing half. Until recently, however, researchers have not understood how the helicase--a donut-shaped enzyme composed of six identical proteins--is able to thread just one of the strands when they are bound together. Now, new research from scientists at The Graduate Center of The City University of New York, its Advanced Science Research Center (ASRC), and The City College of New York (CCNY) has solved the mystery.

In a paper published in today's issue of the journal eLife, researchers explain how the helicase loader protein (P loader) from a bacterial virus attaches to the replicative helicase, causing it to spiral open and quickly reclose around one of the DNA strands. The helicase then begins running along the strand and breaking the hydrogen bonds that bind it to the second strand, allowing each to serve as a template for replicating a complete DNA molecule.

"Going into this research, we knew there had to be a loader protein for this action to take place, but we didn't know what the process looked like," said lead investigator David Jeruzalmi, a professor of chemistry and biochemistry at The Graduate Center and CCNY.

"Through our research we were able to identify the mechanism for loading the DNA strand into the helicase, and we also learned that the loader proteins prevent any movement of the helicase at the moment it opens and closes in order to prevent any replication mistakes."

The researchers studied DNA from E. coli bacteria to sort out the replication mechanism. They employed cryo-electron microscopy and tomography to image the helicase and all of its loader proteins. In addition to helping them identify a previously unknown mechanism in the DNA replication process, the research might also point to an avenue for a novel class of antibiotics that target the bacterial DNA replication machinery, said the researchers.

"Our research is a beautiful example of the powerful ability of cryo-electron microscopy to provide important findings that can be used to develop new therapeutic drugs," said co-investigator Amedee des Georges, professor of chemistry and biochemistry at The Graduate Center and CCNY and a member of the ASRC's Structural Biology Initiative.

Credit: 
Advanced Science Research Center, GC/CUNY

Worldwide estimates suggest that nearly 1 in 2 children with cancer are left undiagnosed and untreated

A modelling study published in The Lancet Oncology journal estimates that there are almost 400,000 new cases of childhood cancer annually, while current records count only around 200,000.

The new model makes predictions for 200 countries and estimates that undiagnosed cases could account for more than half of the total in Africa, South Central Asia and the Pacific Islands. In contrast, in North America and Europe only three per cent of cases remain undiagnosed. If no improvements are made, the study authors estimate that nearly three million further cases will be missed between 2015 and 2030.

"Our model suggests that nearly one in two children with cancer are never diagnosed and may die untreated," says study author Zachary Ward from the Harvard T.H. Chan School of Public Health, USA. "Accurate estimates of childhood cancer incidence are critical for policy makers to help them set healthcare priorities and to plan for effective diagnosis and treatment of all children with cancer. While under-diagnosis has been acknowledged as a problem, this model provides specific estimates that have been lacking." [1]

Previous estimates for the total incidence of global childhood cancer have been based on data from cancer registries, which identify cases in defined populations. However, 60% of countries worldwide do not have such registries and those that do only cover a small fraction of the overall population. Many patients are not diagnosed and are therefore not recorded. This can occur due to lack of access to primary care, with patients dying undiagnosed at home, or due to misdiagnosis.

The new model developed for this study, the Global Childhood Cancer microsimulation model, incorporates data from cancer registries in countries where they exist, combining it with data from the World Health Organisation's Global Health Observatory, demographic health surveys and household surveys developed by Unicef. The model was calibrated to data from public registries and adjusts for under-diagnosis due to weaknesses in national health systems.

The study authors provide estimates of under-diagnosis for each of the 200 countries. They estimate that in 2015 there were 397,000 childhood cancer cases globally, compared to 224,000 that were recorded as diagnosed. This suggests that 43% (172,000 cases) of global childhood cancer cases were undiagnosed. There was substantial regional variation, ranging from 3% in both Western Europe (120 undiagnosed cases out of 4,300 total new cases) and North America (300 of 10,900 cases) to 57% (43,000 of 76,000 new cases) in Western Africa.
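The regional shares quoted above follow directly from the stated case counts; a minimal check:

```python
# Under-diagnosis shares implied by the 2015 figures quoted in the article:
# (estimated new cases, undiagnosed cases) per region.
figures = {
    "Global":         (397_000, 172_000),
    "Western Europe": (4_300, 120),
    "North America":  (10_900, 300),
    "Western Africa": (76_000, 43_000),
}

for region, (estimated, undiagnosed) in figures.items():
    print(f"{region}: {undiagnosed / estimated:.0%} undiagnosed")
# Global: 43%, Western Europe: 3%, North America: 3%, Western Africa: 57%
```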

In most regions of the world, the number of new childhood cancer cases is declining or stable. However, the authors estimate that 92% of all new cases occur in low and middle-income countries, a higher proportion than previously thought.

The most common childhood cancer in most regions of the world in 2015 was found to be acute lymphoblastic leukaemia, with the notable exception of sub-Saharan Africa. There were around 75,000 new cases globally, including nearly 700 in North Europe, over 1,500 in West Africa, over 3,500 in East Africa and nearly 30,000 in South Central Asia. In East and West Africa, Burkitt's lymphoma was more common, with over 4,000 cases in East Africa and over 10,000 in West Africa. For example, there were around 1,000 cases in the Democratic Republic of the Congo and Ethiopia, but only around 20 in the UK.

"Health systems in low-income and middle-income countries are clearly failing to meet the needs of children with cancer. Universal health coverage, a target of United Nations Sustainable Development Goals, must include cancer in children as a priority to prevent needless deaths," says senior author Professor Rifat Atun, Harvard University, USA. [1]

Taking population growth into account, the authors estimate that between 2015 and 2030 there will be 6.7 million new cases of childhood cancer worldwide. Of these, 2.9 million cases will be missed if the performance of health systems does not improve. The authors hope that their findings will help guide new policies in health systems to improve diagnosis and management of childhood cancers.

The authors found that barriers to access and referral in health systems result in substantial under-diagnosis of childhood cancer in many countries. They argue that current healthcare models, which concentrate treatment in a few specialised hospitals, are not enough. By strengthening health systems more widely, well-functioning healthcare delivery networks could develop, reducing the number of undiagnosed children with cancer.

"As the hidden incidence of childhood cancer starts to come to the fore, stronger health systems are needed for timely diagnosis, referral and treatment," says Ward. "Expanding cancer registration will be important so that progress can be tracked." [1]

The authors highlight that their results might be affected by limited data availability in some countries. There were only two countries in West Africa (Mali and Cameroon) with available registry data, so predictions for this region might be influenced by the extent to which these countries are representative of the region as a whole.

The authors also assumed that all diagnosed cases are accurately recorded in cancer registries. In practice, some cases might be diagnosed but not recorded, or might be incorrectly classified because of deficient pathology services. However, as new country-specific data become available, the model can be refined to provide updated estimates.

Writing in a linked Comment, Dr Eva Steliarova-Foucher, WHO's International Agency for Research on Cancer, France, says: "Where national data are available and used in the presented model, the proposed estimates should be robust. Yet the only way to validate these new estimates is for countries to ensure efficient provision of representative data... increasing registration coverage and improving the data quality of existing registries would help to reduce the estimation error, which is equivalent to 21 000 cases globally, based on the 95% uncertainty interval... developing efficient vital statistics systems would help to ensure registration completeness and unveil the magnitude of underdiagnosis of cancer. Currently, some mortality statistics are available in only four of 34 low-income countries and in 21 of 47 lower-middle income countries."

Credit: 
The Lancet

'Immunizing' quantum bits so that they can grow up

image: A new material could 'immunize' topological quantum bits so that they are resilient enough for building a quantum computer.

Image: 
Purdue University image/Morteza Kayyalha

WEST LAFAYETTE, Ind. -- Quantum computers will process significantly more information at once compared to today's computers. But the building blocks that contain this information - quantum bits, or "qubits" - are currently far too sensitive to their surroundings to build a practical quantum computer.

Long story short, qubits need a better immune system before they can grow up.

A new material, engineered by Purdue University researchers into a thin strip, is one step closer to "immunizing" qubits against noise, such as heat from other parts of a computer, that interferes with how well they hold information. The work appears in Physical Review Letters.

The thin strip, called a "nanoribbon," is a version of a material that conducts electrical current on its surface but not in its interior - a "topological insulator" - combined with two superconducting electrical leads to form a device called a "Josephson junction."

In a quantum computer, a qubit "entangles" with other qubits. This means that reading the quantum information from one qubit automatically affects the result from another, no matter how far apart they are.

Without entanglement, the speedy calculations that set apart quantum computing can't happen. But entanglement and the quantum nature of the qubits are also sensitive to noise, so they need extra protection.

A topological-insulator nanoribbon Josephson junction device is one of many options researchers have been investigating for building more resilient qubits. This resilience could come from special properties created by conducting a supercurrent on the surface of a topological insulator, where an electron's spin is locked to its momentum.

The problem so far is that a supercurrent tends to leak into the inside of topological insulators, preventing it from flowing completely on the surface.

To get more resilient, topological qubits need supercurrents to flow through the surface channels of topological insulators.

"We have developed a material that is really clean, in the sense that there are no conducting states in the bulk of the topological insulator," said Yong Chen, a Purdue professor of physics and astronomy and of electrical and computer engineering, and the director of the Purdue Quantum Science and Engineering Institute. "Superconductivity on the surface is the first step for building these topological quantum computing devices based on topological insulators."

Morteza Kayyalha, a former Ph.D. student in Chen's lab, showed that the supercurrent wraps all the way around the new topological insulator nanoribbon at temperatures 20 percent lower than the "critical temperature" at which the junction becomes superconducting. The experiment was conducted in collaboration with the lab of Leonid Rokhinson, a Purdue professor of physics and astronomy.

"It's known that as the temperature lowers, the superconductivity is enhanced," Chen said. "The fact that much more supercurrent flowed at even lower temperatures for our device was evidence that it is flowing around these protective surfaces."

Credit: 
Purdue University

New method uses AI to screen for fetal alcohol spectrum disorder

Scientists at the University of Southern California (USC), Queen's University (Ontario) and Duke University have developed a new tool that can screen children for fetal alcohol spectrum disorder (FASD) quickly and affordably, making it accessible to more children in remote locations worldwide.

The tool uses a camera and computer vision to record the patterns in children's eye movements as they watch multiple one-minute videos, or look towards or away from a target, and then compares those patterns with the recorded eye movements of other children who watched the same videos or targets. Children whose eye movements fell outside the norm were flagged by the researchers as potentially at risk for FASD and in need of a more formal diagnosis by healthcare practitioners.
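The core idea - compare a child's eye-movement features against the distribution seen in a reference group and flag large deviations for formal follow-up - can be sketched as below. This is a hypothetical illustration; the feature names, z-score heuristic, and threshold are assumptions, not the authors' published machine-learning pipeline:

```python
import statistics

def flag_for_followup(child_features, reference_group, z_threshold=2.0):
    """Flag a child for formal assessment if any eye-movement feature
    deviates strongly from the reference group's norm.
    Illustrative z-score heuristic, not the authors' pipeline."""
    for i, value in enumerate(child_features):
        ref = [child[i] for child in reference_group]
        mean, sd = statistics.mean(ref), statistics.stdev(ref)
        if sd > 0 and abs(value - mean) / sd > z_threshold:
            return True
    return False

# Hypothetical feature vectors: (mean fixation duration ms, saccade latency ms)
reference = [(200, 150), (210, 160), (190, 155), (205, 150)]
print(flag_for_followup((400, 150), reference))  # True: unusual fixations
print(flag_for_followup((200, 152), reference))  # False: within the norm
```

In practice the published approach uses richer features and trained models; the point of the sketch is only that screening reduces to detecting atypical patterns relative to a normative group.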

The technique was described in a study "Detection of Children/Youth With Fetal Alcohol Spectrum Disorder Through Eye Movement, Psychometric, and Neuroimaging Data," by Chen Zhang, Angelina Paolozza, Po-He Tseng, James N. Reynolds, Douglas P. Munoz and Laurent Itti, which appeared in Frontiers in Neurology.

According to the paper's corresponding author, Laurent Itti, a professor of computer science, psychology and neuroscience at USC, FASD is still quite difficult to diagnose: a professional diagnosis can take a long time, with the current work-up taking as much as an entire day.

"There is not a simple blood test to diagnose FASD. It is one of those spectrum disorders where there is a broad range of the disorder. It is medically very challenging and it is co-morbid with other conditions. The current gold standard is subjective, as it involves a battery of tests and clinical evaluation. It is also costly."

Itti said he and his colleagues conducted this research because they felt a screening tool might be able to reach more children who might be at risk. It is estimated that millions of children worldwide have fetal alcohol spectrum disorder, a condition that, when not diagnosed early in a child's life, can give rise to secondary cognitive and behavioral disabilities.

"The new screening procedure only involves a camera and a computer screen, and can be applied to very young children. It takes only 10 to 20 minutes and the cost should be affordable in most cases," said Chen Zhang, a doctoral candidate from the Neuroscience Graduate Program at USC and the paper's first author. "The machine learning pipeline behind this gives out objective and consistent estimations in minutes."

While this computer vision tool is not intended to replace full diagnosis by professionals, it is intended to provide important feedback so that parents can ensure that their children are seen by professionals and receive early cognitive learning and potentially behavioral interventions.

For Itti, this is not the first foray into the use of computer vision to monitor ocular movements to screen for neurological and cognitive conditions. Itti has long worked to model visual attention. Over the last decade, he also applied the same techniques to screening for Attention Deficit Disorder and Parkinson's disease.

"Sometimes people may tell you that you only use 10 percent of your brain in everyday life. But as soon as you open your eyes and process the visual world in front of you, already over 70 percent of your brain is engaged. Your ocular-motor system is so complex, that if something is going on in your brain, your eyes will give some sort of signature," Itti said.

The impact of such screening tools could be significant. Study co-author and FASD expert, James N. Reynolds, who is the Interim Chief Scientific Officer at the Kids Brain Health Network, said, "The economic impact of FASD is spread across multiple systems, including health care, education, criminal justice, as well as lost productivity costs for both individuals with FASD and their caregivers. Estimates suggest that the mean annual cost of FASD in Canada and the US ranges from $22,000-$24,000 per individual, and therefore many billions of dollars collectively on society. There is simply no escaping the fact that FASD is a major public health problem associated with tremendous economic and social costs."

USC's Itti has many ideas for how this screening tool could be implemented, whether via a mobile unit, in an app, or in a pharmacy as one of the free screening tools used while waiting for a prescription. He says, "This could be the blood pressure monitoring system for your brain."

Credit: 
University of Southern California

Researchers identify possible role of Foxp1 protein in control of autoimmune diseases

Scientists at the Higher School of Economics, the Institute of Bioorganic Chemistry of the Russian Academy of Sciences (IBCh RAS), and the Memorial Sloan Kettering Cancer Center created a genetic model that helps to understand how the body restrains autoimmune and oncological diseases. The researchers published their results in Nature Immunology.

The immune systems of humans and animals enable them to resist infectious diseases. That is, they recognize pathogens such as viruses, bacteria, fungi, protozoa, and multicellular parasites and destroy them. The T lymphocyte or "helper lymphocyte" is a special type of immune cell that identifies pathogens and helps other immune cells destroy both the pathogens and the cells they infect. The helper lymphocytes also contain a specialized lineage, called T-regulatory or "Treg" cells, that, instead of helping to fight infection, actually inhibits the response of normal lymphocytes. Mutations that interfere with the development and proper functioning of Treg cells lead to disastrous consequences for the body. Mice and humans without Treg cells develop fatal autoimmune diseases caused by T-helper cells' uncontrolled attack on the body's own cells.

The scientists studied the properties of the Foxp3 protein that is responsible for the development and proper functioning of Treg cells. They found that removing the Foxp3 protein gene from the genomic DNA prevents development of Treg cells, leading to the death of the organism. It is also known that numerous autoimmune diseases are associated with abnormal Foxp3 synthesis and Treg cell quantities. The Foxp3 protein does not work alone, but as part of a complex of proteins that help it regulate the work of genes necessary for the proper functioning of Treg cells. That set of proteins includes Foxp1, which has been the subject of much less research.

The authors of this study, under the guidance of Aleksander Rudensky, created a genetic model to explain exactly how the Foxp1 protein affects Foxp3. They began by removing part of the Foxp1 gene in Treg cells from laboratory mice. A comparison of the "normal" cells with the cells in which Foxp1 had been removed revealed that Foxp3 binds DNA much less effectively in the absence of Foxp1. That is, the genes of the proteins crucial for the proper functioning of Treg cells do not work correctly without Foxp1. Thus, if Foxp3 is essential for Treg cells, then Foxp1 also holds great importance, because its removal negatively affects Foxp3. According to the researchers, understanding the structure of the protein complex that includes Foxp3 and Foxp1 is the key to creating drugs that can selectively affect Treg cells.

'The results significantly broaden our knowledge of the molecular mechanisms regulating immunological tolerance that can be used for treating cancer and autoimmune diseases,' notes Yury Rubtsov, a co-author of the study, associate professor of the HSE faculty of Biology and Biotechnologies, and senior researcher at the IBCh RAS Laboratory of Molecular Oncology. 'Cancerous tumours attract Treg cells to defend themselves against the body's immune system. The more Treg cells present in the tumour, the worse the patient's prognosis. Thus, if we could control the quantity and activity of Treg cells by, for example, decreasing them in the case of a tumour or, on the contrary, increasing them in the case of autoimmune disease, we could create safe medicines for treating heretofore incurable illnesses.'

Credit: 
National Research University Higher School of Economics

Artificial lung cancer tissue could help find new drug treatments

image: These are LAM cells growing within the hydrogel designed to emulate the microenvironment of the lung.

Image: 
Molly Shoichet

A 3D hydrogel created by researchers in U of T Engineering Professor Molly Shoichet's lab is helping University of Ottawa researchers to quickly screen hundreds of potential drugs for their ability to fight highly invasive cancers.

Cell invasion is a critical hallmark of metastatic cancers, such as certain types of lung and brain cancer. Fighting these cancers requires therapies that can both kill cancer cells and prevent them from invading healthy tissue. Today, most cancer drugs are only screened for their ability to kill cancer cells.

"In highly invasive diseases, there is a crucial need to screen for both of these functions," says Shoichet. "We now have a way to do this."

Shoichet and her team are internationally known for their work on hydrogels, jello-like materials based on hyaluronic acid, a biocompatible substance commonly used in cosmetics. In the past, they have used hydrogels to enhance stem cells that are injected in the body to overcome disease or degeneration.

In their latest research, the team used hydrogels to mimic the environment of lung cancer, selectively allowing cancer cells, and not healthy cells, to invade. This emulated environment enabled their collaborators in Professor Bill Stanford's lab at the University of Ottawa to screen for both cancer-cell growth and invasion. The study, led by Roger Y. Tam, a research associate in Shoichet's lab, was recently published in Advanced Materials.

"We can conduct this in a 384-well plate, which is no bigger than your hand. And with image-analysis software, we can automate this method to enable quick, targeted screenings for hundreds of potential cancer treatments," says Shoichet.

One example is the researchers' drug screening for lymphangioleiomyomatosis (LAM), a rare lung disease affecting women. Shoichet and her team were inspired by the work of Green Eggs and LAM, a Toronto-based organization raising awareness of the disease.

Using their hydrogels, they were able to automate and screen more than 800 drugs, thereby uncovering treatments that could target disease growth and invasion.

In the ongoing collaboration, the researchers plan to next screen multiple drugs at different doses to gain greater insight into new treatment methods for LAM. The strategies and insights they gain could also help identify new drugs for other invasive cancers.

Shoichet, who was recently named a Distinguished Woman in Chemistry or Chemical Engineering, also plans to patent the hydrogel technology.

"This has, and continues to be, a great collaboration that is advancing knowledge at the intersection of engineering and biology," says Shoichet.

Credit: 
University of Toronto Faculty of Applied Science & Engineering