Earth

Researchers discover natural product that could lead to new class of commercial herbicide

image: Depiction of the herbicide molecule (top), which inhibits an enzyme (bottom) that plants need.

Image: 
Tang Research Group / UCLA Samueli

A garden can be a competitive environment. Plants and unseen microorganisms in the soil all need precious space to grow. And to gain that space, a microbe might produce and use chemicals that kill its plant competitors. But the microbe also needs immunity from its own poisons.

By looking for that protective shield in microorganisms, specifically the genes that can make it, a team of UCLA engineers and scientists discovered a new and potentially highly effective type of weed killer. This finding could lead to the first new class of commercial herbicides in more than 30 years, an important outcome as weeds continue to develop resistance to current herbicide regimens.

Using a technique that combines data science and genomics, the team found the new herbicide by searching the genes of thousands of fungi for one that might provide immunity against fungal poisons. This approach is known as "resistance gene-directed genome mining."
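The logic of resistance gene-directed mining can be sketched in a few lines of code. This is a hypothetical illustration of the general strategy, not the team's actual pipeline; the gene-family names and data layout are invented for the example.

```python
# Hypothetical sketch of resistance gene-directed genome mining: scan a
# fungus's biosynthetic gene clusters (BGCs) for an extra, cluster-embedded
# copy of an essential "housekeeping" gene -- a likely self-resistance gene
# sitting next to the machinery that makes the toxin.

HOUSEKEEPING_FAMILIES = {"DHAD", "HisB", "TrpB"}  # illustrative essential-enzyme families

def find_candidate_clusters(genome):
    """Return (cluster id, gene family) pairs where a cluster carries a
    duplicate of an essential gene found elsewhere in the genome."""
    core_families = {g["family"] for g in genome["core_genes"]}
    candidates = []
    for cluster in genome["biosynthetic_clusters"]:
        for gene in cluster["genes"]:
            if gene["family"] in HOUSEKEEPING_FAMILIES and gene["family"] in core_families:
                candidates.append((cluster["id"], gene["family"]))
    return candidates

# Toy genome: BGC1 carries a second copy of an essential enzyme gene,
# flagging it as a cluster that likely makes a self-targeting toxin.
genome = {
    "core_genes": [{"family": "DHAD"}, {"family": "TrpB"}],
    "biosynthetic_clusters": [
        {"id": "BGC1", "genes": [{"family": "PKS"}, {"family": "DHAD"}]},
        {"id": "BGC2", "genes": [{"family": "NRPS"}]},
    ],
}
print(find_candidate_clusters(genome))  # [('BGC1', 'DHAD')]
```

In practice this filtering is done over sequence homology across thousands of sequenced fungal genomes, but the core idea is the same: the duplicated resistance gene points both to the molecule and to its target.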

The study, which was published in Nature, also points to the potential for this genomics-driven approach to be used in medicine, with applications ranging from new antibiotics to advanced cancer-fighting drugs.

"Microorganisms are very smart at protecting themselves from the potent molecules they make to kill their enemies," said Yi Tang, the study's co-principal investigator and a UCLA professor of chemical and biomolecular engineering, and of chemistry and biochemistry. "The presence of these resistance genes provides a window into the functions of the molecules, and can allow us to discover these molecules and apply them to diverse applications in human health and agriculture."

For example, if a resistance gene that protects a microorganism from an anti-bacterial product is found, there's a possibility that the microorganism also has genes to produce that same anti-bacterial compound. That discovery could potentially lead to new antibacterial medicines.

The new herbicide acts by inhibiting the function of an enzyme that is necessary for plants' survival. The enzyme is a key catalyst in an important metabolic pathway that makes essential amino acids. When this pathway is disrupted, the plants die.

This pathway is not present in mammals, including humans, which is why it has been a common target in herbicide research and development. The new herbicide works on a different part of the pathway than current herbicides. A commercial product that uses it would require more research and regulatory approval.

"An exciting aspect of the work is that we not only discovered a new herbicide, but also its exact target in the plant, opening the possibility of modifying crops to be resistant to a commercial product based on this herbicide," said study co-principal investigator Steven Jacobsen, a professor of molecular, cell and developmental biology in the UCLA College and an investigator of the Howard Hughes Medical Institute. "We are looking to work with large agrochemical companies to develop this promising lead further."

To confirm the efficacy of the new herbicide, the UCLA team tested the fungus-produced compound on Arabidopsis, a plant commonly used in lab studies. In experiments, plants sprayed with the compound died. The researchers also inserted the fungal resistance gene into the Arabidopsis genome; plants carrying the gene were immune to the herbicide.

"The emergence of herbicide-resistance weeds is thwarting every herbicide class in use; in fact, there has not been a new type commercialized within the last 30 years," said Yan Yan, a UCLA chemical engineering graduate student who was a lead author of the paper. "We think this new, powerful herbicide -- combined with crops that are immune to it -- will complement urgent efforts in overcoming weed resistance."

Credit: 
UCLA Samueli School of Engineering

Better methods improve measurements of recreational water quality

Washington, DC - July 13, 2018 - The concentration of enterococci, bacteria that thrive in feces, has long been the federal standard for determining water quality. Researchers have now shown that the greatest influences on that concentration are the quantity of mammalian feces in the water, and the numbers of enterococci that glom onto floating particulate matter. The research is published Friday, July 13 in Applied and Environmental Microbiology, a journal of the American Society for Microbiology.

"We also found that ecosystem specific characteristics, such as freshwater sediment and freshwater transport to the estuary are important influences on enterococci concentrations in coastal recreational and shellfish harvest waters," said Stephen Jones, Ph.D. Dr. Jones is Research Associate Professor, University of New Hampshire, and Associate Director, New Hampshire Sea Grant Program.

Recreational waters can harbor an array of different bacterial pathogens, the investigators noted in their paper. Human fecal pollution is the biggest concern for public health, as there is no inter-species barrier to transmission to humans. "But other fecal sources that contain enterococci and possibly human pathogens can be chronic or intermittent sources of both, making beach water quality management and remediation efforts more complex," the investigators wrote.

Dr. Jones and his student and coauthor Derek Rothenheber collected water samples weekly at Wells, Maine, during the summer of 2016. In 2014, two of the town beaches had been flagged for intermittently exceeding state standards for concentrations of enterococci, and advisories had occasionally been posted warning the public that the waters might be unhealthy--bad publicity for a beach town. But by 2016, the Wells Beach area was meeting state of Maine standards.

Besides the beach area, the investigators sampled freshwater tributaries of the coastal watershed, and marine beach water near the outlet of the estuary, said Dr. Jones. They also sampled sediments in tributaries and in the estuary, and soil from areas surrounding the tributaries, to enable molecular analyses of microbial communities in water, sediments, and soils from the different ecosystems. They also measured water temperature, salinity, and acidity, as well as weather conditions.

Dr. Jones and Mr. Rothenheber used polymerase chain reaction (PCR) to identify the animals that were the sources of fecal material, and they used metagenomic DNA sequencing to characterize the bacterial composition of water, sediment, and soil samples from the different ecosystems.

Several agencies in the state of Maine have now adopted the researchers' methodology for assessing areas with water quality issues, and the investigators have been sharing their findings at conferences, with other scientists and resource managers, in order to spread their techniques for monitoring water quality.

The US Environmental Protection Agency established water quality regulations based on enterococci as the indicator of fecal-borne pollution, to help manage water quality at estuarine and marine beaches.

Dr. Jones noted that enterococci are versatile organisms that thrive not only in the colon, but also in soil and in the sedimentary layers of lakes, rivers, and marine waters. "Our study pulls together the multiple fecal sources, the diverse environmental reservoirs, and the changeable environmental conditions to assess how these variables can all influence enterococci concentrations in a coastal setting," said Dr. Jones. "No other study has taken such an encompassing and robust approach towards addressing the issue of the factors that influence enterococci concentrations in coastal waters."

Credit: 
American Society for Microbiology

NASA finds fragmented remnants of Beryl, located west of Bermuda

image: NASA's Aqua satellite analyzed the fragmented thunderstorms associated with the remnants of Tropical Storm Beryl on July 13 at 2:05 a.m. EDT (0605 UTC) and saw coldest cloud top temperatures (yellow/green) near minus 63 degrees Fahrenheit (minus 53 degrees Celsius).

Image: 
NASA/NRL

The remnants of former Tropical Storm Beryl are being battered by upper-level winds, which are fragmenting them even more. NASA's Aqua satellite passed over the northwestern Atlantic Ocean and found that some of those scattered thunderstorms were strong.

NASA's Aqua satellite passed over Beryl's remnants on July 13 at 2:05 a.m. EDT (0605 UTC) and analyzed the storm in infrared light. Infrared light provides temperature data and that's important when trying to understand how strong storms can be. The higher the cloud tops, the colder and the stronger they are.

NASA's Aqua satellite identified a few scattered storms with coldest cloud top temperatures near minus 63 degrees Fahrenheit (minus 53 degrees Celsius). Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.

On July 13 at 8 a.m. EDT, the National Hurricane Center noted: "An area of low pressure, associated with the remnants of Beryl, is located about 300 miles west of Bermuda. The associated shower and thunderstorm activity remains disorganized due to strong upper-level winds. These winds are expected to become even less conducive for subtropical or tropical development over the next day or two while the low moves north-northeastward at about 10 mph, and additional development will be limited once the low reaches colder waters by Saturday night, July 14 or Sunday, July 15."

The formation chance through five days is now low because of the upper-level winds.

Credit: 
NASA/Goddard Space Flight Center

Reducing Australia's cancer death rate

New research has revealed for the first time what impact cutting back on drinking and smoking as a population would have on Australia's cancer death rate.

Researchers from the Centre for Alcohol Policy Research (CAPR), La Trobe University, found reducing tobacco and alcohol consumption rates as a nation would significantly reduce future cancer deaths.

The researchers used health and consumer data dating back to the 1930s to establish the link between population-level smoking and drinking rates and cancer mortality.

They found:

Smoking half a kilogram less tobacco annually per capita would reduce Australia's overall cancer deaths by 8 per cent over 20 years;

Drinking three litres less alcohol annually per capita would reduce Australia's overall cancer deaths by 12 per cent over 20 years.
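As a back-of-envelope illustration of what those percentages mean, the projected reductions can be applied to a hypothetical baseline; the death toll below is an assumed figure for illustration, not a number from the study.

```python
# Illustrative arithmetic only: apply the study's projected percentage
# reductions to a hypothetical baseline of annual cancer deaths.
baseline = 45_000        # assumed annual cancer deaths (not from the study)
tobacco_cut_pct = 8      # 8% fewer deaths over 20 years (0.5 kg/capita less tobacco)
alcohol_cut_pct = 12     # 12% fewer deaths over 20 years (3 L/capita less alcohol)

averted_tobacco = baseline * tobacco_cut_pct // 100
averted_alcohol = baseline * alcohol_cut_pct // 100
print(averted_tobacco, averted_alcohol)  # 3600 5400
```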

Lead researcher Dr Jason Jiang said understanding the impact on a population level was important.

"We know that there is a strong link between an individual's use of alcohol and tobacco and their risk of cancer, but few studies have looked at the impact from a national perspective," Dr Jiang said.

"What we now have is evidence of the longer term health benefits of taking a collective approach to reducing smoking and drinking.

"Public health advocates and policymakers on tobacco and alcohol should work together to minimise the adverse health effects on cancer of these two risky behaviours."

Foundation for Alcohol Research and Education (FARE) Chief Executive Michael Thorn said the study provides further evidence that a reduction in per capita alcohol consumption would flow through to reductions in cancer deaths.

"The problem unfortunately is that too few Australians are aware of the link between alcohol and cancer, or understand the Government drinking guidelines which state how best to avoid those risks," Mr Thorn said.

Cancer Council Victoria CEO Todd Harper said research like this reinforced the need for greater education about the risks associated with smoking and alcohol consumption.

"There remains work to be done on both fronts. To continue to see decreases in smoking rates it is vital population-wide education campaigns occur at sustained and effective levels," Mr Harper said.

Mr Harper also expressed concern about a lack of community knowledge on the harms of alcohol.

"The lack of community knowledge of the harm alcohol can cause is particularly worrying, especially given many Australians are unknowingly drinking at levels which can damage their health, and increase their risk of eight types of cancer," Mr Harper said.

The research, published in the Journal of the American Medical Association, expands on a study conducted last year by Dr Jiang and colleagues that found reducing alcohol consumption would reduce liver, pancreatic, head and neck cancer rates.

Credit: 
La Trobe University

New research: Financial incentives create critical waterbird habitat in extreme drought

News Release: July 12, 2018 - Projections by climate scientists suggest that severe droughts may become more frequent over the next century, with significant impacts to wildlife habitat. Fortunately, new research from scientists at Point Blue Conservation Science and The Nature Conservancy shows how financial incentive programs can create vital habitat for waterbirds, filling a critical need in drought years.

Between 2013 and 2015, the Central Valley of California sustained an extreme drought, dramatically reducing wildlife habitat. The area is recognized as being of hemispheric importance for waterbirds, which use flooded agricultural land and wetlands as habitat. Under two innovative financial assistance programs, farmers are paid an incentive to flood their fields at key moments to create habitat for waterbirds. Until this research, the landscape-scale effects of these incentive programs had not been rigorously studied.

Point Blue and The Nature Conservancy researchers used satellite images to evaluate two issues: 1) the impact of the 2013-2015 drought on waterbird habitat in the Central Valley; and, 2) the amount of habitat created by incentive programs.

"Before this research was completed, we had a sense that these programs were succeeding in offsetting the impacts of the drought on wildlife, but now we know exactly how critical they are in providing bird habitat in the Central Valley," said Dr. Matt Reiter, Principal Scientist and Quantitative Ecologist at Point Blue and lead author of the study. "Program managers should place a high priority on maintaining these incentive programs in the face of more frequent severe droughts in order to sustain waterbirds in the Central Valley and the Pacific Flyway," he added.

Data analysis showed that there were declines of up to 80% in the amount of open water on post-harvest agricultural fields and declines of up to 60% in managed wetlands, in comparison with non-drought years. Crops associated with the San Joaquin Basin, specifically corn, as well as wetlands in that basin, showed larger reductions in open water than rice and wetlands in the Sacramento Valley. Overall, the satellite data showed that the 2013-2015 drought in the Central Valley was more severe than previous drought years between 2000 and 2011.
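The reported declines are simple fractions of open-water extent in drought versus non-drought years. A hedged illustration, with invented acreages rather than the study's data:

```python
# Illustrative decline calculation (hypothetical acreages, not the study's data):
nondrought_open_water = 100_000  # acres of flooded post-harvest fields, typical year
drought_open_water = 20_000      # acres during the 2013-2015 drought

decline = 1 - drought_open_water / nondrought_open_water
print(f"{decline:.0%}")  # 80% -- matching the reported upper-bound decline
```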

Looking at the amount of habitat created by incentive programs, researchers found that a large portion of the open water in rice fields during key times of the year that are critical to waterbirds could be attributed to the programs. BirdReturns, a program paid for and administered by The Nature Conservancy, provided up to 61% of all available flooded rice habitat on some days during the fall of the drought years studied. The Waterbird Habitat Enhancement Program, administered by the Natural Resources Conservation Service (NRCS), created up to 100% of available habitat on some days during the winter and on average, created 64% of available habitat.

"As we experience these wild swings between droughts and flooding, we know we need to be more agile from year to year as conditions change. Critical to that dynamic conservation is understanding what works and what doesn't and having the data to adjust on the fly," said Dr. Mark Reynolds, Lead Scientist, Migratory Bird Program, The Nature Conservancy. "It's incredibly rewarding to know these programs made a significant impact to these bird populations during these extreme drought years. It gives us a blueprint for what to do when we encounter another drought."

"NRCS has been administering our waterbird habitat incentive program for 8 years and it's great to have strong supporting data now on how well it's working," said Alan Forkey, Assistant State Conservationist for NRCS. "Learning that, during drought years in winter, on average over 60% of the available bird habitat was provided by Farm Bill conservation programs makes it clear that we are achieving the desired benefits."

Credit: 
Point Blue Conservation Science

New study finds 93 million people vulnerable to death from snakebites

SEATTLE - A new scientific study finds 93 million people live in remote areas with venomous snakes and, if bitten, face a greater likelihood of dying than those in urban settings because of poor access to anti-venom medications.

The study, conducted by the Institute for Health Metrics and Evaluation (IHME) at the University of Washington, was published today in the international medical journal, The Lancet.

"One's vulnerability to snakebites represents a nexus of ecological contexts and public health weaknesses," said Dr. David Pigott, one of the study's authors and assistant professor at IHME. "Understanding where venomous snakes live and people's proximity to effective treatments are the two most important steps toward reducing deaths. Our analysis identifies communities in greatest need."

According to Pigott, nations whose people are most vulnerable include: Benin, Congo (Brazzaville), Ethiopia, Myanmar, Nigeria, Papua New Guinea and South Sudan.

He and the other researchers identified the regions and nations where it is hard to access treatment. In addition, they generated range maps of 278 species of venomous snakes listed by the World Health Organization (WHO).

The study cross-referenced these data using criteria such as transport time to care facilities with anti-venom medications and quality of care based on a health care access and quality index, which examines 32 causes from which death should not occur in the presence of effective health care.
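The cross-referencing step can be sketched as a simple filter over locations. Everything below -- thresholds, field names, and records -- is an illustrative assumption, not the study's actual model:

```python
# Hypothetical sketch: flag locations that fall inside a venomous-snake
# range AND are far from anti-venom care AND score poorly on a health-care
# access/quality (HAQ) index. Thresholds are illustrative.

def is_vulnerable(loc, max_hours=3, min_haq=40):
    return (loc["in_snake_range"]
            and loc["hours_to_antivenom"] > max_hours
            and loc["haq_index"] < min_haq)

locations = [
    {"name": "A", "in_snake_range": True,  "hours_to_antivenom": 6.0, "haq_index": 28},
    {"name": "B", "in_snake_range": True,  "hours_to_antivenom": 0.5, "haq_index": 75},
    {"name": "C", "in_snake_range": False, "hours_to_antivenom": 9.0, "haq_index": 30},
]
print([l["name"] for l in locations if is_vulnerable(l)])  # ['A']
```

Only locations meeting all three criteria -- the "nexus of ecological contexts and public health weaknesses" -- are counted among the most vulnerable.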

In May, the WHO mandated that a comprehensive plan be developed to support countries in implementing measures for greater access to treatment for people bitten by venomous snakes. This followed a declaration last year that snakebite envenoming is a neglected tropical disease.

Other research concludes that an estimated 5 million people are bitten every year by venomous snakes, and about 125,000 of them die. This makes it one of the most burdensome neglected tropical diseases.

"In spite of the numbers, snakebites received relatively limited global attention," said Professor Simon Hay, Director of Geospatial Science at IHME. "We hope this analysis can broaden the discussion about snakes."

Researchers from the Geneva-based Université de Genève and the Hôpitaux Universitaires de Genève participated in the study.

"Thanks to this model, we were able to construct three maps that allowed us to uncover the three hot spots in terms of these three criteria, focusing most heavily on the zones where the individuals are most vulnerable," said Nicolas Ray, researcher at the Institute of Environmental Sciences and at the Institute of Global Health at Université de Genève .

Credit: 
Institute for Health Metrics and Evaluation

'No evidence' grammar schools can promote social mobility, study suggests

Expanding the number of grammar schools is unlikely to promote social mobility by providing more opportunities for disadvantaged pupils, a new study published in Educational Review finds.

Study author Binwei Lu, of Durham University, used England's National Pupil Database to show how a child's chances of going to grammar school varied depending on the Local Authority (LA) in which they lived, their social and ethnic background, and their attainment level at primary school. The database included more than 600,000 pupils, of whom around 186,000 were in the 36 LAs with grammar schools.

In those LAs, the proportion of pupils attending such schools varied widely: from 1.4 to 37.4 per cent. Selection criteria also varied, with pupils in certain LAs needing to achieve more than twice the Key Stage 2 marks of those in other LAs to have any chance of being admitted.

As a result, applying in a different LA - an option more readily available to more affluent families - could increase a child's chances.

"While it is often mentioned that coaching gives more affluent pupils an unfair advantage in grammar school selection, our study suggests that a simpler, but effective action for the rich would be to let their children sit the 11+ in other Local Authorities with more grammar school opportunities," Miss Lu noted.

The study also found that pupils eligible for free school meals, pupils with special educational needs, native English speakers, and white pupils were less likely to go to grammar schools, while those from more affluent areas and from minority ethnic groups were more likely to attend.

Despite these differences, the research showed that, during the selection process, attainment was more important than personal background, indicating no bias towards certain groups of pupils in the selection process itself. Rather, the inequality of opportunity to go to grammar school for pupils from different backgrounds was probably the result of diverging attainment among these different groups at the end of primary education.

"While this outcome demonstrates the relatively equitable process of grammar school enrolment based on selection criteria, there is also no evidence that grammar schools can help the poor, as their likelihood of attending such schools is limited," Miss Lu said.

"If secondary schools are allowed to select based on attainment, they are thus selecting pupils from more advantaged backgrounds. The assumption that grammar schools promote social mobility is therefore unsound.

"On the contrary, if grammar schools do perform better than other state schools, they will widen the gap between children from high and low socioeconomic groups by offering higher Key Stage 4 results for their pupils. In the meantime, pupils without sufficient family support, who thus perform worse than they would have otherwise at the age of 11, will lag further behind as they will be enrolled in less effective secondary schools."

Credit: 
Taylor & Francis Group

Study finds room for improvement in South Korea's polluted river basin

image: A new Portland State University study shows that even though water quality has improved in South Korea's Han River basin since the 1990s, there are still higher-than-acceptable levels of pollutants in some of the more urbanized regions in and around the capital Seoul.

Image: 
Courtesy of Heejun Chang

A new Portland State University study shows that even though water quality has improved in South Korea's Han River basin since the 1990s, there are still higher-than-acceptable levels of pollutants in some of the more urbanized regions in and around the capital Seoul.

The study by Heejun Chang, a geography professor in PSU's College of Liberal Arts and Sciences, and Janardan Mainali, a Ph.D. student in geography, was published online in the Journal of Hydrology in June. It was supported by a grant from the National Science Foundation.

The study used spatial data from the early 1990s through 2016 to examine seasonal water-quality trends in the Han River basin, the largest and most populous river basin in South Korea. The river had become synonymous with pollution as factories, farms and city sewer systems poured waste into its waters. But ahead of the 1988 Seoul Olympics, the government launched efforts to begin cleaning it up.

The study examined the relationship between water quality -- as measured by total nitrogen (TN), total phosphorus (TP), chemical oxygen demand (COD) and suspended soil particles -- and topography, population density, soils, and land cover such as changes from forest or agricultural use to urban land.

The study showed that the water quality generally improved within the Seoul metropolitan area but declined in rural areas from the early 1990s to 2016.

Some of the urbanized regions still had higher-than-acceptable concentrations of nitrogen, phosphorus and COD. Even though wastewater treatment plants were built, the study suggests that the population growth in suburban areas may have outpaced the proper treatment of wastewater as well as increasing runoff or "non-point source" pollution.

Among the findings:

In the major urban areas, the decreasing trends seen in TN, TP and COD levels over time can be attributed, in part, to the government's installation of new wastewater treatment plants, better watershed management practices and stream restoration practices

Most of the trends were explained by some combination of forest or agricultural land cover, changes in land use, the percentage of area covered by water and slope variations, suggesting that land management could be an effective strategy for improving water quality

Having vegetation and protected areas along streams was shown to help improve water quality

Chang, who also serves as a faculty fellow at PSU's Institute for Sustainable Solutions, said that governments and agencies need to be proactive in ensuring water quality is a priority, particularly in suburban and developing rural areas, by imposing more stringent regulations, implementing best management practices and creating natural buffers.

"Nature-based solutions have shown to improve water quality in the long run," he said.

Credit: 
Portland State University

Success of conservation efforts for important Caribbean reef fish hinges on climate change

image: The Nassau grouper, an iconic and endangered Caribbean reef fish.

Image: 
Photo by Alfredo Barroso.

For more than 20 years, conservationists have been working to protect one of the most recognizable reef fish in the Caribbean, the endangered and iconic Nassau grouper, and thanks to those efforts, populations of this critical reef fish have stabilized in some areas. But in a new paper, published in the journal Diversity and Distributions, marine scientists said climate change might severely hinder those efforts by the end of this century.

By 2100, breeding habitats of the Nassau grouper are projected to decline 82 percent from where they were in 2000 if nothing is done to mitigate climate change. These spawning habitats are critical to the survival of the species, said scientists from The University of Texas at Austin and East Carolina University. Additionally, suitable habitats for nonspawning fish are expected to decline 46 percent.

The paper also points out that because Nassau groupers have a narrow temperature range they can tolerate while spawning, this may create a bottleneck that will affect population recovery.

"The effects of climate change could override some of the successes of conservation efforts at local and regional scales," said Brad Erisman, assistant professor of fisheries biology at The University of Texas at Austin. "That is, if Nassau grouper no longer migrate to spawn in a particular region because the water is too warm, then protecting spawning sites in that region will be ineffective. Likewise, if the months when spawning occurs in certain regions shifts in response to climate change, then seasonal protection measures in those regions will need to shift accordingly to ensure that spawning is still protected."

The Nassau grouper contributes significantly to the ecosystem as a top predator and can act as a kind of canary in the coal mine for reef health. Their reproduction success depends on large breeding events, called spawning aggregations, where hundreds to thousands of fish gather in one area for a few days to mate. But these mass spawning events also make them easy targets for commercial fishing, and they were overfished to the point the species became endangered.

Beginning in the 1990s, several countries, including the U.S., have placed outright bans on fishing Nassau grouper. Cuba and the Dominican Republic restrict fishing during spawning season (December-February), and some areas have restricted fishing in specific breeding grounds.

"To truly understand how climate will impact fishes, we need to know how it will impact the most vulnerable life history stage, spawning. If this link in the life cycle is jeopardized, the species as a whole will be in jeopardy," said Rebecca G. Asch, an assistant professor of fisheries biology at East Carolina University.

Grouper are also important for the overall health of the ecosystem. Large predators such as sharks feed on the gathered grouper. Whale sharks and manta rays feed on the eggs that are released.

"The loss of these important, energy-rich events has negative impacts that span entire food webs and ecosystems," Erisman said.

There is some good news, scientists said. If strong steps are taken to mitigate climate change, breeding habitat is projected to decline by only 30 percent.

Next, the scientists plan to expand their research to look at how climate change may affect spawning in 12 species of grouper and snapper in the Caribbean and the Pacific. The model developed could aid researchers in studying climate change effects on other fish species that depend on large spawning events such as salmon, tuna and cod.

Credit: 
University of Texas at Austin

7,000 strokes prevented as GPs improve diagnosis and treatment of atrial fibrillation

Around 7,000 strokes each year are being prevented thanks to GPs more than doubling the proportion of high-risk patients prescribed blood-thinning drugs, University of Birmingham researchers have found.

Researchers from the University of Birmingham's Institute for Applied Health Research analysed general practice records of five million patients from 2000 to 2016 to find out how many people have a diagnosis of atrial fibrillation and how many are receiving treatment to prevent strokes.

Atrial fibrillation is the most common cause of an irregular heartbeat and increases the risk of stroke fivefold - annually, 14 per cent of all strokes in the UK are suffered by patients with the condition. Patients with atrial fibrillation are given anticoagulant drugs such as warfarin, which prevent blood clotting and reduce the risk of stroke by around two-thirds.

Up until now there has been little research showing the prevalence and treatment of atrial fibrillation. Now, University of Birmingham research, published in Heart and partly funded by the National Institute for Health Research, has found that the number of people diagnosed increased each year between 2000 and 2016, reaching around 1.2 million in 2016.

The researchers also found that, over the same 16-year period, the proportion of patients with atrial fibrillation who were prescribed anticoagulants more than doubled - rising by 40.1 percentage points, from 35.4% in 2000 to 75.5% in 2016.
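Those figures can be checked with simple arithmetic; the "strokes averted" sketch at the end is a rough, assumption-laden illustration, not the study's model.

```python
# Check the prescribing figures: 35.4% (2000) to 75.5% (2016).
start, end = 35.4, 75.5               # % of eligible AF patients anticoagulated
print(round(end - start, 1))          # 40.1 -> a 40.1 percentage-point rise
print(round(end / start, 2))          # 2.13 -> i.e. more than doubled

# Rough sketch: with ~75% coverage and a ~two-thirds relative risk
# reduction, roughly half of potential AF strokes are averted.
coverage, risk_reduction = end / 100, 2 / 3
print(round(coverage * risk_reduction, 2))  # 0.5
```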

Author Dr Nicola Adderley said: "We found that in 2000 only 35 per cent of people with atrial fibrillation who should be taking anticoagulants were actually prescribed them.

"By 2016, the situation had changed dramatically and 75 per cent were receiving them, which highlights that GPs have become markedly better at not only identifying patients with atrial fibrillation, but also at treating these patients.

"Furthermore, over the same period, general practitioners also seem to have got better at selecting who should get anticoagulants.

"The drugs are not recommended for a small minority of younger people with atrial fibrillation whose overall risk of stroke is low and prescribing fell from 20 per cent to 10 per cent in this group."

Senior and corresponding author Professor Tom Marshall said: "This is a success story for NHS general practice.

"Atrial fibrillation causes about 20,000 strokes each year but general practice is preventing about 7,000.

"The proportion of atrial fibrillation patients getting anticoagulants has continued to increase year on year.

"There are some general practices that do particularly well, and there is every indication that stroke prevention will continue to improve."

Author Dr Krishnarajah Nirantharakumar added: "We hope we will continue to see improvements in stroke prevention within the NHS and more research to investigate the management of atrial fibrillation in UK general practice."

Credit: 
University of Birmingham

Stop antibiotics before resistance 'tipping point'

Treatments using antibiotics should stop as soon as possible to prevent patients passing the "tipping point" of becoming resistant to their effects, new research has shown.

A team of researchers, led by Professor Robert Beardmore from the University of Exeter, has uncovered new evidence that suggests reducing the length of the antibiotic course reduces the risk of resistance.

For the study, the researchers examined under laboratory conditions how microbial communities - groups of microorganisms that share a common living space in the body - reacted to different antibiotic cycling patterns, in which the medication is alternately restricted and increased.

They found that changes both in the duration and dose of antibiotics used and in sugar levels (which mimic the variable sugar levels in human patients) could push these microbial communities beyond a "tipping point" - creating an irreversible shift to drug resistance.

The researchers insist this new study demonstrates that resistant species can increase within the body even after an antibiotic is withdrawn - if a tipping point was unwittingly passed during treatment.
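
"Tipping point" here is the language of bistable dynamical systems, where slowly varying a control parameter can trigger a jump that does not reverse when the parameter is restored. The study's actual community model is not reproduced here; the sketch below uses a generic toy system (dx/dt = h + x - x³, with h loosely standing in for drug pressure) purely to illustrate how a state change can outlast the treatment that caused it:

```python
import numpy as np

def simulate(h_schedule, x0=-1.0, dt=0.01, steps_per_h=500):
    """Integrate dx/dt = h + x - x**3 while h follows a schedule.

    For small |h| the system has two stable states (near -1 and +1), but the
    lower one vanishes in a saddle-node bifurcation once h exceeds ~0.385.
    """
    x = x0
    for h in h_schedule:
        for _ in range(steps_per_h):
            x += dt * (h + x - x**3)
    return x

# Ramp the "drug pressure" past the tipping point, then withdraw it entirely.
up_and_down = np.concatenate([np.linspace(0.0, 0.6, 60), np.linspace(0.6, 0.0, 60)])
print(simulate(up_and_down))   # settles near +1: the shift persists after withdrawal

# A ramp that stays below the threshold is fully reversible.
gentle = np.concatenate([np.linspace(0.0, 0.3, 60), np.linspace(0.3, 0.0, 60)])
print(simulate(gentle))        # returns to the original state near -1
```

In the experiments, sugar levels acted as a second control knob alongside the drug; the qualitative point is the same - once the threshold is crossed, simply stopping the treatment does not undo the shift.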

The study is published in leading journal Nature Ecology & Evolution on Monday, July 9th 2018.

Professor Beardmore, a mathematical biosciences expert from the University of Exeter, said: "It's a sensible idea that when you take an antibiotic away, resistance goes away too, but we wondered what kinds of antibiotic treatments don't behave like that. After all, in some clinical studies, resistance didn't disappear when the antibiotic did."

Antibiotic resistance occurs when microbes develop the ability to defeat the drugs designed to kill them, and so they multiply unhindered. Antibiotics are the most effective treatment for a wide-range of microbial infections, including strep throat and pneumonia.

For decades, patients have been instructed to complete courses of antibiotics because the perceived wisdom had been that taking too few tablets would allow bacteria to mutate and become resistant. However, more recently it has been suggested that the longer microbes are exposed to antibiotics, the more likely it is that resistance will develop.

Little research has been conducted on how the length of a course of antibiotics affects resistance; despite differences between patients - for example, in their blood sugar levels - course lengths are recommended to be the same for all.

In the new study, the researchers examined how microbial communities containing Candida albicans and Candida glabrata reacted to different doses of an antimicrobial when fed with sugar.

Both species are commonly found together in healthy people, but are also opportunistic pathogens which can cause infection.

The study showed that as the antimicrobial was introduced, the communities were reduced, while the removal of the treatment allowed them to flourish again.

Crucially, the researchers showed that if sugar levels dropped in the community, it could reach a "tipping point" whereby resistance would persist even after the antimicrobial had stopped being used.

The new research opens up the possibilities for further studies to better understand when the best time would be to stop antibiotic treatment, to prevent resistance occurring.

Co-author Professor Ivana Gudelj added: "Our body is a mother ship for microbial communities but we've still expected to understand drug resistance by studying microbial species one at a time, in the lab.

"We show this can be misleading because microbes have intricate relationships that the drugs make even more complicated, and yet our theories of antibiotic resistance have ignored this, until now. So that's the first surprise: even sugars can affect antibiotic resistance."

Credit: 
University of Exeter

Male couples report as much domestic violence as straight couples

ANN ARBOR--Nearly half of all men in a new study about intimate partner violence in male couples report being victims of abuse.

The study from the University of Michigan shows that in addition to universal stressors--finances, unemployment, drug abuse--that both heterosexual and male couples share, experiences of homophobia and other factors unique to male couples also predict abuse among them.

The study is one of the few that looks at violence from the perspective of both members of male couples (abuser and victim), said Rob Stephenson, U-M professor of nursing and director of the Center for Sexuality and Health Disparities.

Most studies examining domestic violence look at female victims in heterosexual couples or have only asked questions of one member of a male couple.

Nearly half (46 percent) of the 320 men (160 couples) in the study reported experiencing some form of intimate partner violence in the last year--physical and sexual violence, emotional abuse and controlling behavior.

"If you just looked at physical and sexual violence in male couples, it's about 25 to 30 percent, roughly the same as women," he said. "We're stuck in this mental representation of domestic violence as a female victim and a male perpetrator, and while that is very important, there are other forms of domestic violence in all types of relationships."

The research is important because it debunks that stereotype, and accounts for controlling and isolating behaviors as well as physical abuse, Stephenson said.

Ultimately, violence links back to HIV prevention because men in abusive relationships may find it hard to negotiate for condom use or even when and how they have sex, Stephenson said. Nor is there good communication about HIV status and HIV prevention in abusive relationships.

His study makes a strong connection between internalized homophobia and violence, Stephenson said. A gay man who's struggling with his identity might lash out at his partner with physical or emotional abuse as a stress response behavior--similar to heterosexual couples, where an unemployed man lashes out at his female partner because he feels inadequate, he said.

Stephenson wants clinicians to start asking male couples about violence. Right now, the majority do not, he said. The study appears in the July edition of the American Journal of Men's Health.

Credit: 
University of Michigan

Finding a weak link in the frightful parasite Schistosoma

image: This confocal microscope image shows a small snail, removed from its shell, which is infected with schistosome parasites. The snail plays a key role in the life cycle of the parasite that causes schistosomiasis, a neglected tropical disease that sickens hundreds of millions of people. The Phillip Newmark lab at the Morgridge Institute is seeking new avenues to fight the disease.

Image: 
Bo Wang, Stanford University

The parasitic disease schistosomiasis is one of the developing world's worst public health scourges, affecting hundreds of millions of people, yet only a single, limited treatment exists to combat the disease.

Researchers at the Morgridge Institute for Research are searching for potential new targets by probing the cellular and developmental biology of its source, the parasitic flatworm Schistosoma.

The team's newest work, published today in the journal eLife, sheds light on essential stages in the life cycle of this blood fluke. They characterized several different types of stem cells that govern the parasite's complex life cycle and also identified a gene associated with the earliest development of the germline, from which gametes form.

"Understanding how these stem cells drive the development of each life-cycle stage may ultimately help prevent disease transmission," says senior author Phillip Newmark, a Howard Hughes Medical Institute and Morgridge investigator and professor of integrative biology at the University of Wisconsin-Madison.

More than 250 million people, mostly in Africa and Asia, have schistosomiasis. The World Health Organization classifies it as the deadliest neglected tropical disease, killing an estimated 280,000 people each year. Children with the disease are often ravaged by anemia, malnutrition and pervasive learning disabilities.

The drug praziquantel is the primary form of treatment. It is largely effective in killing the adult worms in humans, but not the parasite's other life-cycle stages, leaving people exposed to continual reinfection.

Schistosomes have a complicated life cycle, switching through many different body plans as they move from snails to water to humans. The cycle begins in tainted freshwater lakes and ponds, where parasite eggs released from human waste hatch into tiny creatures whose sole task is to infect a specific type of snail.

Within the snail host, the parasite produces massive numbers of offspring called cercariae. These fast-swimming, fork-tailed organisms are released into the water, where they burrow through human skin and cause infection.

After penetrating host skin, the parasites must migrate into the blood vessels and find their way to the major vein that supplies the liver. During this journey, the parasites reorganize their tissues, and upon reaching the liver, begin developing reproductive organs, pair with a mate, and grow into mature adults.

The team examined the poorly understood early stages after infection through an ingenious experiment designed by co-author Jayhun "Jay" Lee, a Morgridge Postdoctoral Fellow in the Newmark Lab. He mimicked infection in a culture dish by enabling cercariae to penetrate through a portion of mouse skin into a medium on the other side. This strategy allowed him to examine when and which cells first begin to divide after infection.

"We don't get that many ah-ha moments in our lives as scientists," Newmark says. "This was one of them."

During the initial 22-36 hours of infection, they observed five distinct cells proliferating; these same stem cells were packed into the cercariae during development inside the snail. The cells give rise to the adult stem cells, initiating development of the parasite into the adult worm. From there, they identified a subset of stem cells associated with development of the reproductive system.

"We're really excited about this because it opens up a number of important research directions," Newmark says. "The drug used to fight schistosomes does not work on this stage of infection. Understanding what's happening in this early period after infection is critical, because it's also a time when the parasites should be most vulnerable."

Lee says the next research step will be to follow these five stem cells as they continue to differentiate and form tissues. "We want to use this as a road map to figure out what the cells are doing," he says.

The Newmark Lab's primary focus has been on exploring regeneration in planarians -- remarkable flatworms that can regenerate from the tiniest body fragments. They started work on schistosomes, "evil cousins" of planarians, in 2009 and applied decades of planarian biology to better understand their parasitic relatives.

It's a case where model organism research may help provide answers for a human health tragedy. "This provides another example of how curiosity-driven basic research can lead to unanticipated outcomes and why it is important to support such work," Newmark says.

Credit: 
Morgridge Institute for Research

Artificial intelligence helps Stanford researchers predict drug combinations' side effects

image: Marinka Zitnik and colleagues designed a system to predict billions of potential drug combination side effects.

Image: 
L.A. Cicero

In the last month alone, 23 percent of Americans took two or more prescription drugs, according to one CDC estimate, and 39 percent of those over age 65 take five or more, a number that has increased three-fold in the last several decades. And if that isn't surprising enough, try this one: in many cases, doctors have no idea what side effects might arise from adding another drug to a patient's personal pharmacy.

The problem is that with so many drugs currently on the U.S. pharmaceutical market, "it's practically impossible to test a new drug in combination with all other drugs, because just for one drug that would be five thousand new experiments," said Marinka Zitnik, a postdoctoral fellow in computer science. With some new drug combinations, she said, "truly we don't know what will happen."

But computer science may be able to help. In a paper presented July 10th at the 2018 meeting of the International Society for Computational Biology in Chicago, Zitnik and colleagues Monica Agrawal, a master's student, and Jure Leskovec, an associate professor of computer science, lay out an artificial intelligence system for predicting, not simply tracking, potential side effects from drug combinations. That system, called Decagon, could help doctors make better decisions about which drugs to prescribe and help researchers find better combinations of drugs to treat complex diseases.

Too many combinations

Once available to doctors in a more user-friendly form, Decagon's predictions would be an improvement over what's available now, which essentially comes down to chance - a patient takes one drug, starts taking another and then develops a headache or worse. There are about 1,000 different known side effects and 5,000 drugs on the market, making for roughly 12.5 billion possible combinations of drug pairs and side effects. Most drug pairs have never been prescribed together, let alone systematically studied.
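
The combinatorics behind that number can be checked directly (assuming unordered pairs of distinct drugs, and that any known side effect could in principle arise from any pair):

```python
from math import comb

n_drugs, n_side_effects = 5_000, 1_000

drug_pairs = comb(n_drugs, 2)               # unordered pairs of distinct drugs
combinations = drug_pairs * n_side_effects  # each pair could show any side effect

print(f"{drug_pairs:,} drug pairs x {n_side_effects:,} side effects = {combinations:,}")
# 12,497,500 drug pairs x 1,000 side effects = 12,497,500,000
```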

But Zitnik, Agrawal and Leskovec realized they could get around that problem by studying how drugs affect the underlying cellular machinery in our bodies. They constructed a massive network describing how the more than 19,000 proteins in our bodies interact with each other and how different drugs affect these proteins. Using more than 4 million known associations between drugs and side effects, the team then designed a method to identify patterns in how side effects arise based on how drugs target different proteins.

To do that, the team turned to deep learning, a kind of artificial intelligence modeled on the brain. In essence, deep learning looks at complex data and extracts abstract, sometimes counterintuitive patterns from it. In this case, the researchers designed their system to infer patterns about drug-interaction side effects and to predict previously unseen consequences of taking two drugs together.
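
The paper describes Decagon as a graph convolutional network over the protein-drug network; the untrained toy below is not that model, but it sketches the general flavor of embedding-based prediction - scoring a (drug, side effect, drug) triple from learned vectors - with every number here random and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16
drug_emb = rng.normal(size=(5, dim))  # one vector per drug (would be learned)
rel_diag = rng.normal(size=(3, dim))  # one diagonal relation matrix per side effect

def score(i, r, j):
    # Sigmoid of a bilinear form: a probability-like score that drugs i and j,
    # taken together, produce side effect r (a DistMult-style decoder).
    raw = drug_emb[i] @ (rel_diag[r] * drug_emb[j])
    return 1.0 / (1.0 + np.exp(-raw))

print(score(0, 1, 2))  # a value in (0, 1); training would make it meaningful
```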

Predicting complications

Just because Decagon found a pattern doesn't necessarily make it real, so the group looked to see whether its predictions came true, and in many cases, they did. For example, there was no indication in the team's data that the combination of atorvastatin, a cholesterol drug, and amlodipine, a blood pressure medication, could lead to muscle inflammation, yet Decagon predicted that it would, and it was right. Although it did not appear in the original data, a 2017 case report suggested the drug combination had led to a dangerous kind of muscle inflammation.

That example was borne out in other cases as well. When they searched the medical literature for evidence of ten side effects predicted by Decagon but absent from their original data, the team found that five of the ten had recently been confirmed, lending further credence to Decagon's predictions.

"It was surprising that protein interaction networks reveal so much about drug side effects," said Leskovec, who is a member of Stanford Bio-X, Stanford Neurosciences Institute and the Chan Zuckerberg Biohub.

Right now, Decagon only considers side effects associated with pairs of drugs, and in the future the team hopes to extend the results to more complex regimens, Leskovec said. They also hope to create a more user-friendly tool that gives doctors guidance on whether it's a good idea to prescribe a particular drug to a particular patient, and that helps researchers develop drug regimens with fewer side effects for complex diseases.

"Today, drug side effects are discovered essentially by accident," Leskovec said, "and our approach has the potential to lead to more effective and safer healthcare."

Credit: 
Stanford University

As brain extracts meaning from vision, study tracks progression of processing

Here's the neuroscience of a neglected banana (and a lot of other things in daily life): whenever you look at its color - green in the store, then yellow, and eventually brown on your countertop - your mind categorizes it as unripe, ripe, and then spoiled. A new study that tracked how the brain turns simple sensory inputs, such as "green," into meaningful categories, such as "unripe," shows that the information follows a progression through many regions of the cortex, and not exactly in the way many neuroscientists would predict.

The study, led by researchers at MIT's Picower Institute for Learning and Memory, undermines the classic belief that separate cortical regions play strictly distinct roles. Instead, as animals in the lab refined what they saw down to a specific understanding relevant to behavior, brain cells in each of six cortical regions operated along a continuum between sensory processing and categorization. To be sure, general patterns were evident for each region, but activity associated with categorization was shared surprisingly widely, said the authors of the study, published in the Proceedings of the National Academy of Sciences.

"The cortex is not modular," said Earl Miller, Picower Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT. "Different parts of the cortex emphasize different things and do different types of processing, but it is more of a matter of emphasis. It's a blend and a transition from one to the other. This extends up to higher cognition."

The study not only refines neuroscientists' understanding of a core capability of cognition, it could also inform psychiatrists' understanding of disorders in which categorization judgements are atypical, such as schizophrenia and autism spectrum disorders, the authors said.

Scott Brincat, a research scientist in Miller's Picower lab, and Markus Siegel, principal investigator at the University of Tübingen in Germany, are the study's co-lead authors. Tübingen postdoc Constantin von Nicolai is a co-author.

From seeing to judging

In the research, animals played a simple game. They were presented with shapes that cued them to judge what came next - either a red or green color, or dots moving in an upward or downward direction. Based on the initial shape cue, the animals learned to glance left to indicate green or upward motion, or right to indicate red or downward.

Meanwhile the researchers were eavesdropping on the activity of hundreds of neurons in six regions across the cortex: prefrontal (PFC), posterior inferotemporal (PIT), lateral intraparietal (LIP), frontal eye fields (FEF), and visual areas MT and V4. The team analyzed the data, tracking each neuron's activity over the course of the game to determine how much it participated in sensory vs. categorical work, accounting for the possibility that many neurons might well do at least a little of both. First they refined their analysis in a computer simulation, and then applied it to the actual neural data.

They found that while sensory processing largely occurred where classic neuroscience would predict - most heavily in MT and V4 - categorization was surprisingly distributed. As expected, the PFC led the way, but FEF, LIP and PIT often showed substantial categorization activity, too.

"Our findings suggest that, although brain regions are certainly specialized, they share a lot of information and functional similarities," Siegel said. "Thus, our results suggest the brain should be thought of as a highly connected network of talkative related nodes, rather than as a set of highly specialized modules that only sparsely hand-off information to each other."

The patterns of relative sensory and categorization activity varied by task, too. Few neuroscientists would be surprised that V4 cells were particularly active for color sensation while MT cells were active for sensing motion, but, more interestingly, category signals were more widespread. For example, most of the areas were involved in categorizing color, including those traditionally thought to be specialized for motion.

The scientists also noted another key pattern. In their analysis they could discern the dimensionality of the information the neurons were processing, and found that sensory information processing was highly multi-dimensional (i.e. as if considering many different details of the visual input), while categorization activity involved much greater focus (i.e. as if just judging "upward" or "downward").
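
The paper's exact dimensionality measure is not spelled out here, but one standard way to quantify the effective dimensionality of population activity is the participation ratio of the covariance spectrum. The hypothetical sketch below contrasts a "sensory-like" population (many independent axes of variation) with a "category-like" one (a single dominant axis, as in a binary judgement):

```python
import numpy as np

def participation_ratio(X):
    # Effective dimensionality of activity X (trials x neurons):
    # (sum of covariance eigenvalues)^2 / (sum of squared eigenvalues).
    lam = np.clip(np.linalg.eigvalsh(np.cov(X, rowvar=False)), 0.0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

rng = np.random.default_rng(1)
n_trials, n_neurons = 500, 40

sensory_like = rng.normal(size=(n_trials, n_neurons))  # many independent features
labels = rng.choice([-1.0, 1.0], size=n_trials)        # a binary category judgement
category_like = np.outer(labels, rng.normal(size=n_neurons))   # one dominant axis
category_like += 0.1 * rng.normal(size=(n_trials, n_neurons))  # plus a little noise

print(participation_ratio(sensory_like))   # high: tens of dimensions
print(participation_ratio(category_like))  # low: close to 1
```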

Cognition in the cortex

The broad distribution of activity related to categorization, Miller speculated, might be a sign that when the brain has a goal (in this case, to categorize), that goal needs to be represented broadly, even if the PFC is where the judgement is made. It's a bit like a business in which everyone, from the CEO down to the workers on the manufacturing floor, benefits from understanding the point of the enterprise.

Miller also said the study extends some prior results from his lab. In a previous study, he showed that PFC neurons can conduct highly multidimensional information processing, while in this study they were largely focused on just one dimension. The synthesis of the two lines of evidence may be that PFC neurons accommodate whatever degree of dimensionality pursuing a goal requires. They are versatile in how versatile they should be.

Let all this sink in, the next time you consider the ripeness of a banana or any other time you have to extract meaning from something you perceive.

Credit: 
Picower Institute at MIT