Culture

Bermudagrass harvest management options with poultry litter fertilization

image: Regrowth of two hybrid bermudagrass [Cynodon dactylon (L.) Pers.] varieties after cutting at different time intervals and stubble heights.

Image: 
John Read

When fertilizing bermudagrass with poultry litter, forage producers must consider either limiting the buildup of soil P or drawing down soil test P through cut-and-carry forage. In a previous study that supplied 122 kg ha⁻¹ of P in poultry litter, researchers found that Tifton 44 bermudagrass cut every 49 days at a 3-cm stubble height recovered 23% of the applied P. Bermudagrass P removal is closely associated with dry matter (DM) yield and can be altered through management variables such as variety, plant maturity at harvest, and litter application rate or timing. Given the tradeoff between DM yield and the amounts of crude protein and digestible DM in bermudagrass, managing harvests to maximize P recovery may reduce forage nutritive value.
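
The arithmetic behind such a recovery figure is simple: P removed in harvested forage is DM yield times tissue P concentration, and recovery is that removal divided by the P applied. Below is a minimal sketch in Python, using hypothetical yield and tissue-P values rather than the study's data.

```python
# Back-of-the-envelope P-recovery arithmetic. The yield and tissue-P numbers
# below are hypothetical placeholders, not values from the study.
def p_recovery(dm_yield_kg_ha: float, tissue_p_frac: float, p_applied_kg_ha: float) -> float:
    """Fraction of applied P removed in harvested forage."""
    p_removed = dm_yield_kg_ha * tissue_p_frac  # kg P per hectare
    return p_removed / p_applied_kg_ha

# e.g., 11,200 kg/ha of dry matter at 0.25% tissue P against 122 kg/ha applied P
print(f"{p_recovery(11_200, 0.0025, 122):.0%}")  # -> 23%
```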

In an article recently published in Crop, Forage & Turfgrass Management, researchers report on the response of forage nutritive value and P removal to the combined effects of harvest interval and stubble height in Russell and Tifton 44 bermudagrass receiving 4 tons acre⁻¹ of poultry litter supplemented with 98 lb acre⁻¹ of N each spring from 2005 to 2007. Soil test P increased by approximately 60 mg kg⁻¹, but there was no difference in soil P accumulation among the harvest interval × stubble height plots in autumn 2007. They concluded that one best management practice is cutting every 35 days at a 3-cm residual stubble height, which has long been considered a standard for economical forage production. Additionally, harvesting bermudagrass at an advanced stage of maturity is a best management practice where high soil test P is of greater concern than forage nutritive value.

Credit: 
American Society of Agronomy

The commercial consequences of collective layoffs

Researchers from Erasmus University Rotterdam and IESE Business School published a new paper in the Journal of Marketing that empirically demonstrates the effects of collective layoff announcements on sales, advertising effectiveness, and consumers' price sensitivity.

The study, forthcoming in the Journal of Marketing, is titled "The Commercial Consequences of Collective Layoffs: Close the Plant, Lose the Brand?" and is authored by Vardit Landsman and Stefan Stremersch.

Collective layoffs--the simultaneous termination of the labor contracts of a large group of workers--are quite common in many Western societies. In Europe alone, the 556 collective layoffs announced between December 2018 and November 2019 across different industries involved more than 250,000 employees. In addition to the societal implications, collective layoff decisions have an immense impact on the firms that initiate them.

The termination of employment, particularly of large numbers of people, typically evokes negative connotations. It is therefore reasonable to expect layoff announcements to have negative, rather than positive, effects on the layoff firm. This study explores whether such negative effects are universal and how large they are.

From a broad perspective, collective layoffs might be seen as a specific type of firm crisis. Other types of firm crises include product-harm crises, violations of ethical or moral norms such as sweatshop operations, and negative news about celebrities who have endorsed a brand. However, collective layoffs have several unique characteristics, and thus the commercial consequences of such layoffs warrant specific consideration. For example, while firms do not purposefully initiate most types of brand crises, they do initiate collective layoffs and thus have some level of control over timing, location, and communication. Such control may help the firm contain, ex ante, the potentially adverse outcomes of the layoffs. Moreover, a collective layoff announcement does not directly reflect on the quality of the firm's products, as a product-harm crisis does. However, the merit of the firm's prior actions, or its prospects, may be called into question.

The researchers focus on the automotive industry across nine major automotive markets (Austria, Canada, France, Germany, Italy, Japan, Spain, the U.K., and the U.S.) and study 205 collective-layoff announcements by 20 major brands between 2000 and 2015, which led to the termination of the labor contracts of more than 300,000 employees. The model specification estimates the advertising effectiveness and price sensitivity of every affected brand over time and across countries.
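
To make the idea of such a specification concrete, here is a minimal sketch of how a post-announcement shift in advertising effectiveness and price sensitivity can be estimated with a log-log regression on synthetic data. This is a toy illustration only, not the authors' model, which estimates time-varying parameters for each brand and country.

```python
# Toy log-log sales regression: a post-layoff dummy interacts with log price
# and log advertising, so the interaction terms capture post-announcement
# shifts in price sensitivity and advertising elasticity. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400
post = rng.integers(0, 2, n)  # 1 = period after a layoff announcement
log_price = np.log(rng.uniform(20_000, 40_000, n))
log_adv = np.log(rng.uniform(1e5, 1e6, n))

# Built-in "true" effects: more price-sensitive, less ad-sensitive after layoffs
log_sales = (
    20.0
    - (1.5 + 0.4 * post) * log_price
    + (0.10 - 0.04 * post) * log_adv
    - 0.09 * post
    + rng.normal(0, 0.05, n)
)
df = pd.DataFrame({"log_sales": log_sales, "log_price": log_price,
                   "log_adv": log_adv, "post": post})

fit = smf.ols("log_sales ~ log_price * post + log_adv * post", data=df).fit()
print(fit.params)  # the interaction terms recover the post-layoff shifts
```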

The data and analyses uncover the differences in the commercial consequences of collective layoffs across characteristics of the layoff announcements. The article identifies three typical information components in an announcement: (1) Motive - Did the firm motivate the collective layoff by a decline in demand? (2) Nationality - Is the firm domestic or foreign to the layoff country? (3) Layoff size - How many employees are affected by the collective layoff?

Findings are as follows. First, for two-thirds of the collective layoff announcements in the sample, sales of the corresponding brands in the layoff country decreased in the year following the announcement compared with the year before it; the average change in sales across all announcements was -6.6%. After accounting for all other effects in the model, sales for the layoff brand are 8.7% lower following a layoff announcement than the predicted level absent the announcement. Second, following collective layoff announcements, consumers become less sensitive to the firm's advertising and more sensitive to its prices. Third, the layoff announcement characteristics explain some of the differences in commercial consequences across announcements. Fourth, firms do not universally increase or decrease their advertising spending following collective layoff announcements (the median change in spending is about 2%). However, during the year following a collective layoff announcement, firms typically spend less on advertising in the layoff country (16% less, on average) than they would have absent the announcement.

These findings are relevant to marketing managers in firms that plan to announce collective layoffs. First, as Stremersch explains, "Our results regarding the commercially adverse effects of collective layoffs suggest that marketing managers should claim a place in the taskforces that manage such layoffs, alongside functional representatives of other areas such as finance and operations, to reflect the potential adverse demand-side consequences perspective of the layoffs in layoff discussions." Second, according to Landsman, "Given the adverse effects for advertising effectiveness, marketers in a layoff country should allocate attention to their advertising response. Firms typically spend less on advertising following a layoff announcement than what they would have spent absent the announcement. As a result, the adverse effects of collective layoffs on sales in the layoff country loom larger, not only because of lower advertising elasticity, but also because of lower advertising spending. An alternative response could be to increase advertising spending to compensate for the decreased effectiveness and to consider such higher ad spending in the layoff country as a restructuring cost."

Credit: 
American Marketing Association

NHS 'turning a blind eye' to labor rights violations in the trade of masks and gloves

The failure of the NHS to provide adequate protective equipment for its employees--including basic items such as gloves and masks--has been among the many unpleasant shocks of the covid-19 crisis for healthcare professionals.

Yet there is a murkier scandal about the procurement of these everyday items that the NHS has yet to face, writes freelance journalist Jane Feinmann in The BMJ today.

Mahmood Bhutta, consultant in ear, nose, and throat surgery at Brighton and Sussex University Hospitals NHS Trust, who founded the Medical Fair and Ethical Trade Group in 2006, says he feels "ashamed as a doctor to be wearing gloves manufactured using human exploitation."

Feinmann explores allegations of abuse around the production of medical gloves and the use of child labour to make surgical instruments destined for the global market.

Bhutta has been instrumental in helping to improve conditions for workers who make healthcare goods, yet labour abuses have continued--with the response to the coronavirus pandemic now bringing about an increase in the suffering of thousands of workers.

It's a concern reflected in the UK Government's Modern Slavery Statement, published on 18 March 2020, which includes a promise by Prime Minister Boris Johnson to 'take active steps to drive this increasingly pervasive evil out of our supply chains.'

With the World Health Organization warning that the "chronic global shortage of personal protective gear is among the most urgent threats to virus containment efforts," reports have emerged that a temporary reduction in the production of gloves in Malaysian factories--part of the national lockdown--has been reversed.

What's more, lobbying by the Malaysian Rubber Glove Manufacturers Association throughout March was supported by both the EU and the UK in communications that appeared to make no mention of forced labour concerns, notes Feinmann.

For example, in a letter dated 20 March, reported by Reuters, the Department of Health and Social Care urged Malaysian authorities to prioritise the production and shipment of gloves that are of "utmost criticality for fighting covid-19."

In a further statement on 30 March 2020, an NHS Supply Chain spokesperson told The BMJ that the organisation "takes all allegations of labour abuses in its supply chain very seriously and we have a range of contractual arrangements and initiatives in place to try and prevent such situations arising."

Feinmann acknowledges that taking action against modern slavery is not straightforward, with the potential for action at government level to backfire.

But demands to end forced labour and debt bondage are not being met in the medical trade, according to migrant worker specialist Andy Hall. Rather than introducing sanctions, he says, "a better response is for organisations to reward or benefit suppliers demonstrating good working conditions."

Bhutta argues that healthcare professionals "should care enough to do something about a situation that is unethical and illegal and affects the mental and physical health of hundreds of thousands--whether through propagating poverty, risking bodily injury, or through stress and depression from long working hours and a lack of respect at work."

For him, lessons should be learnt from the difficulties of getting supplies of PPE during the covid-19 emergency. "We've learnt how reliant we are on manufacturers overseas and how precarious our supply chains can be."

And he suggests that "by offering a fair price and asking suppliers to show respect for workers, backed by financial or contractual rewards, we can develop long term mutually beneficial relationships."

Credit: 
BMJ Group

Low-income workers disproportionately affected by COVID-19

Low income workers in developing countries face a higher risk of income loss during the Covid-19 lockdown because their jobs are less likely to be done from home, suggests a new study from UCL, the Bank of Thailand, Universidad Carlos III de Madrid and GRIPS, Tokyo.

The study, published in Covid Economics: Vetted and Real-Time Papers, used Thailand as a case study, but the findings are highly relevant for other countries with similar labour market structures - specifically, those with a large share of self-employment and a weak social safety net.

Dr Suphanit Piyapromdee (UCL Economics) said: "The Covid-19 pandemic has disrupted working life in many ways, with the negative consequences for employment and pay distributed unevenly under lockdown regulations.

"A substantially larger percentage of people in lower income groups have manual roles, such as construction (10% in Thailand; 11.5% in EU-27) or machine-based jobs, which means they can't work remotely and are without any income.

"Therefore without adequate government intervention to support income or employment for the poor, the adverse impact of Covid-19 could worsen income inequality."

The researchers found that low income workers, such as farmers and construction or factory workers, tend to work in jobs that require less physical proximity to other people at work than high income workers, such as office workers or school teachers.

However, low income workers tend to be in occupations that are more machine-dependent and less ICT-enabled, which makes them less able to work remotely.

The findings provide useful insights for policymakers and leaders seeking to ease the lockdown, while carefully balancing pandemic containment and economic burdens.

Dr Nada Wasi (Bank of Thailand, Puey Ungphakorn Institute for Economic Research) said: "Our analysis suggests that workers in jobs which are not adaptable to work from home, but do not require frequent physical contact with others, should be allowed to return to their workplaces first. They account for one-third of workers from low income brackets.

"On the other hand, those who usually work in close physical proximity to others, but whose jobs are well-suited to work from home, may be the last to return to normalcy."

Dr Ponpoje Porapakkarm (GRIPS) pointed out: "Married couples from the lower income households are much more likely to be in similar occupations, and are highly concentrated in jobs not adaptable to work from home. Higher income workers, in contrast, have a lower correlation between husband and wife occupations."

Assessing occupational correlation, the study found that 60% of couples from low income households are in similar occupations, such as manual labour, compared with 20% of couples from high income households.

The researchers suggest that means-tested emergency relief programs - looking at household income as well as individual income - would be more suitable than universal support programs for targeting those working in the most adversely affected occupations.

The research was carried out using data from Thailand's Labour Force Survey 2019, with information on various occupations and job requirements from the Occupational Information Network. This was used to evaluate the labour market risks arising from the Covid-19 crisis at both the individual and household levels.

The researchers created a new set of pandemic-related categories representing two important risks the Covid-19 pandemic poses to workers: the risk of earnings losses when a worker is away from their regular workplace, and the risk of contracting or spreading the virus at the workplace. Workers in developing countries, like Thailand, also tend to be more exposed because of the jobs that they do.
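
A minimal sketch of that two-risk classification, echoing the return-to-work groupings quoted above, is shown below. The adaptability and proximity scores, the 0.5 cutoff, and the example occupations are invented placeholders; the study derived its measures from Occupational Information Network job-requirement data.

```python
# Classify occupations along two pandemic risks: adaptability to work from
# home (earnings risk) and physical proximity at work (infection risk).
# Scores and cutoff below are hypothetical illustrations.
OCCUPATIONS = {
    # name: (work-from-home adaptability 0-1, physical proximity 0-1)
    "farmer":         (0.1, 0.2),
    "construction":   (0.1, 0.4),
    "factory worker": (0.1, 0.6),
    "school teacher": (0.4, 0.9),
    "office worker":  (0.9, 0.6),
}

def risk_group(wfh: float, proximity: float, cutoff: float = 0.5) -> str:
    if wfh < cutoff and proximity < cutoff:
        return "return first: not WFH-adaptable, low physical contact"
    if wfh < cutoff:
        return "high risk: not WFH-adaptable, high physical contact"
    if proximity >= cutoff:
        return "return last: WFH-adaptable, high physical contact"
    return "low risk: WFH-adaptable, low physical contact"

for job, (wfh, prox) in OCCUPATIONS.items():
    print(f"{job:15s} -> {risk_group(wfh, prox)}")
```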

Dr Warn N. Lekfuangfu (Universidad Carlos III de Madrid) said: "Our study takes the first step to analysing the impact of the pandemic from the labour supply side. Future research could also factor in labour demand, such as the decline in consumption and supply-chain effects, as well as worker re-distribution during the pandemic."

Credit: 
University College London

High cost of cancer drugs not always justified

A growing number of new cancer drugs have come on the market in recent years, yet the cost of therapies in Europe and the United States has risen. This is driving up healthcare costs, which poses a challenge not only for the Swiss social insurance system but for patients all over the world. But are the high prices of cancer drugs justified? Does the cost correspond to a drug's effectiveness in combating the disease? An international research team from the University of Zurich and Harvard Medical School carried out a study to examine these questions.

Cost comparison of 65 cancer drugs

The scientists - led by Kerstin Noëlle Vokinger, professor at UZH - analyzed the costs of cancer drugs in Switzerland, Germany, England, France and the United States. The prices of 65 new oncology drugs to treat solid tumors and various types of blood cancer were adjusted to calculate the monthly treatment costs for a standard patient.

In a second stage, the researchers investigated whether there is a link between monthly treatment costs and the clinical benefit of cancer drugs for solid tumors. The effectiveness of the drugs, which had been approved by the US and European licensing authorities (the FDA and EMA, respectively), was calculated using two well-established systems for evaluating the clinical benefit of cancer therapies: the American Society of Clinical Oncology Value Framework, and the European Society of Medical Oncology Magnitude of Clinical Benefit Scale.
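
Statistically, this second stage amounts to testing for a monotonic association between monthly treatment cost and a clinical-benefit score. A minimal sketch with synthetic numbers (not the study's data) might look like this:

```python
# Rank correlation between hypothetical monthly costs and hypothetical
# clinical-benefit grades (ESMO-MCBS style; higher = more benefit).
from scipy.stats import spearmanr

monthly_cost_usd = [14200, 9800, 21500, 7600, 18900, 12300]  # hypothetical
benefit_grade = [2, 3, 1, 2, 4, 3]                           # hypothetical

rho, p_value = spearmanr(monthly_cost_usd, benefit_grade)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
# A rho indistinguishable from zero would mirror the study's finding of no
# cost-benefit association in most of the countries examined.
```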

No correlation between cost and benefit

"Our study clearly shows that, in general, for Switzerland, Germany, England and the United States, there is no association between clinical benefit of a cancer drugs and their prices," explains lead author, Kerstin Vokinger. Only for France a correlation could be found based on one of the clinical benefit assessment systems. "It's also clear that the prices of cancer drugs in the US are significantly higher than in the four European countries, with Americans paying on average approximately twice as much for the same drug."

This is because drug pricing in the US is dictated by the free, unregulated market. In Europe, on the other hand, national authorities negotiate prices with manufacturers. Of the European countries analyzed in the study, Switzerland has the second-highest prices after England, while the same drugs are cheaper in Germany and France. It must be kept in mind, however, that NHS England benefits from non-public discounts on certain drugs, so actual prices may be lower than the official list prices.

Drug prices are not justified

"The pricing of cancer drugs is only partially justified. Drugs that are less effective should be cheaper than those with high efficacy," the UZH professor says. "National authorities should take greater account of the clinical benefits of drugs when negotiating prices, and therapies that provide high clinical benefit should be prioritized in price negotiations." Vokinger firmly believes that this is crucial in order to guarantee patients access to key cancer drugs since countries have only limited financial resources.

Credit: 
University of Zurich

COVID-19 diagnostic tests highlighted in special report

As the new coronavirus continues to claim lives, the race is on to develop fast, convenient and accurate diagnostic tests for COVID-19. Now, researchers from CAS, a division of the American Chemical Society specializing in scientific information solutions, have compiled a special report published in ACS Central Science. Drawing on journal articles and a variety of other published resources, the report provides a detailed overview of COVID-19 diagnostic tests, trends and resources.

According to the World Health Organization, as of April 26, 2020, the COVID-19 pandemic has caused more than 2.8 million confirmed illnesses and more than 193,000 deaths. Social distancing requirements and business lockdowns have slowed the virus' spread, but at the same time, these measures have disrupted people's lives and weakened economies. To help prevent future outbreaks of COVID-19, experts agree that fast, convenient and accurate diagnostic tests are desperately needed. Widespread testing of the general population would allow public health officials to identify and isolate patients early in the course of their illness, as well as asymptomatic people who might unknowingly spread the disease. To help better understand the numerous diagnostic tests available, a group of CAS scientists led by Cynthia Liu summarized the basic principles of molecular and serological assays underlying these tests. The researchers also provided a high-level view of the more than 200 diagnostic tests currently available.

Tests for COVID-19 currently fall into two major categories: those that detect the RNA of SARS-CoV-2, the virus that causes COVID-19, and those that detect antibodies in the blood of people who at some point were infected with the virus. In the first category, the most common tests rely on the reverse transcription-polymerase chain reaction (RT-PCR), which amplifies a tiny amount of viral RNA collected from nasopharyngeal swabs. Because RT-PCR requires expensive instruments, trained personnel and often several days to generate results, researchers are avidly exploring other methods, such as isothermal nucleic acid amplification and transcription-mediated amplification, as well as CRISPR technologies.

The second category of tests cannot be used for early diagnosis because antibodies do not appear in the blood until days to weeks after infection. However, serological and immunological assays could be used to confirm suspected cases, monitor the progress of an individual's disease, or identify people with past infection and potential immunity. Scientists are researching many different types of assays, such as the traditional enzyme-linked immunosorbent assay (known as ELISA), lateral flow immunoassays and surface plasmon resonance-based biosensors. Widespread deployment of both categories of tests could help manage people's return to normal activities, but many questions, including the specificity and sensitivity of the tests, remain to be answered, the researchers say.
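
A hypothetical worked example (illustrative numbers, not figures from the report) shows why those sensitivity and specificity questions matter, particularly for antibody surveys: when prevalence is low, even a fairly specific test yields many false positives.

```python
# Bayes' rule for the positive predictive value (PPV) of a diagnostic test.
# All numbers below are hypothetical illustrations.
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 95%-sensitive, 95%-specific antibody test where 2% of people were infected:
print(f"PPV = {ppv(0.95, 0.95, 0.02):.0%}")  # ~28%: most positives are false
```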

Credit: 
American Chemical Society

COVID-19 news from Annals of Internal Medicine

Below please find a summary and link(s) of new coronavirus-related content published today in Annals of Internal Medicine. The summary below is not intended to substitute for the full article as a source of information. A collection of coronavirus-related content is free to the public at http://go.annals.org/coronavirus.

1. Availability of Telemedicine Services Across Hospitals in the United States in 2018: A Cross-sectional Study
Researchers from the University of Texas Southwestern Medical Center studied data from the 2018 American Hospital Association Survey to determine the availability of telemedicine services at U.S. hospitals. They found that a large proportion did not have existing telemedicine programs and will likely require rapid investment in developing the infrastructure needed to deliver patient care remotely and to share limited health care resources across hospitals in light of the COVID-19 pandemic. Read the full text: http://annals.org/aim/article/doi/10.7326/M20-1201.

Media contacts: A PDF for this article is not yet available. Please click the link to read full text. The lead author, Snigdha Jain, MD, can be reached directly at snigdha.89@gmail.com.

2. Annals On Call Podcast: Surge Modeling for COVID-19
In the latest Annals On Call Podcast, Robert Centor, MD, of the University of Alabama at Birmingham School of Medicine, and John Wong, MD, of Tufts Medical Center, discuss pandemic surge models with regard to COVID-19. Learn more: http://annals.org/aim/article/doi/10.7326/A19-0030.

Media contacts: A PDF for this article is not yet available. Please click the link to read more. Dr. John Wong can be reached through Jeremy Lechan at jlechan@tuftsmedicalcenter.org.

Credit: 
American College of Physicians

Racial inequalities in liver cancer deaths soared after launch of hepatitis C drugs

image: Annual percent change in black/white mortality rate ratios from malignant hepatocellular cancer among United States residents 55 years and older, in the years before and after licensure of lifesaving treatments for hepatitis C virus disease in 1998.

Image: 
Baylor College of Medicine

In the United States, hepatocellular or liver cancer deaths have doubled since 1979. Hepatitis C virus is the leading cause of liver cancer. Around 1998, lifesaving drugs to treat hepatitis C - prohibitively expensive for some - were approved and launched. Historically, other lifesaving drugs such as active antiretroviral drug therapy for human immunodeficiency virus (HIV) and surfactant for respiratory distress syndrome (RDS) of the newborn have led to increasing racial inequalities in mortality following their introduction in the U.S.

Researchers from Florida Atlantic University's Schmidt College of Medicine and Baylor College of Medicine explored racial inequalities in mortality from liver cancer before and after the introduction of lifesaving drugs for hepatitis C virus in the U.S.

Results of the study, just published in EClinicalMedicine, showed that from 1979 to 1998, racial inequalities in mortality from liver cancer in the U.S. were declining, but from 1998 to 2016, racial inequalities steadily increased. From 1998 to 2016, of the 16,770 deaths from liver cancer among blacks, the excess relative to whites increased from 27.8 percent to 45.4 percent, and the trends were more prominent in men. Concurrently, racial inequalities in mortality decreased for major risk factors for liver cancer, including alcohol, obesity and diabetes.

The rate among blacks increased from 9.4 per 100,000 in 1998 to 16.7 per 100,000 in 2016, an increase of 77.7 percent, while the corresponding values for whites were 7.2 to 10.3, an increase of 43.1 percent. Rates among blacks age 55 and older increased by 1.7 percent per year from 1979 to 1997 and by 4.2 percent per year from 2000 to 2016. In contrast, corresponding rates among whites increased by 3.5 percent per year from 1979 to 1990, and then increased by 2 percent per year from 1990 to 2016.
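
The overall percentage changes quoted above follow directly from the endpoint rates, as the short sketch below shows. Note that the per-year figures reported in the study come from trend regression over the full time series, not from the simple two-endpoint annualization illustrated here.

```python
# Reproducing the endpoint arithmetic behind the quoted rate changes.
def pct_change(start: float, end: float) -> float:
    return (end - start) / start * 100

print(round(pct_change(9.4, 16.7), 1))  # blacks, 1998-2016: 77.7 (%)
print(round(pct_change(7.2, 10.3), 1))  # whites, 1998-2016: 43.1 (%)

def annualized(start: float, end: float, years: int) -> float:
    """Constant annual rate equivalent to the overall change (illustrative)."""
    return ((end / start) ** (1 / years) - 1) * 100

print(round(annualized(9.4, 16.7, 18), 1))  # ~3.2% per year, 1998-2016
```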

"We observed steady increases in racial inequalities in mortality from liver cancer after the licensure of lifesaving drugs for hepatitis C virus in the United States," said Charles H. Hennekens, M.D., Dr.PH, senior author, and the first Sir Richard Doll Professor and senior academic advisor in FAU's Schmidt College of Medicine, and an adjunct professor at Baylor College of Medicine.

Robert S. Levine, M.D., professor of family and community medicine at Baylor College of Medicine and an affiliate professor at FAU's Schmidt College of Medicine, was first author. Hennekens and Levine have collaborated on numerous peer-reviewed manuscripts for nearly five decades and previously noted similar increases in racial inequalities in mortalities from HIV and RDS following the introduction of lifesaving drugs.

The authors note that these descriptive data are useful to formulate but not test hypotheses. Among the many plausible hypotheses generated from these observations are social side effects, including unequal accessibility, acceptability and/or utilization of health care.

"A major clinical and public health priority should be to decrease racial inequalities in mortality following the introduction of lifesaving drugs in the United States and worldwide," said Hennekens.

Data for the study were derived from the U.S. Centers for Disease Control and Prevention's Wide-ranging ONline Data for Epidemiologic Research (WONDER) database and describe liver cancer mortality rates from 1979 to 2016 in adults 55 years of age and older, because they suffer the largest disease burden.

In the U.S., with respect to liver cancer, more than 3 million people are affected and about 17,000 new cases occur each year. Hepatitis C virus is estimated to result in 30,160 deaths (20,020 in men and 10,140 in women) this year alone. For men, liver cancer is the fifth most common cause of cancer deaths and for women, it is the seventh leading cause of cancer deaths. The overall death rate from liver cancer has more than doubled from 1980 to 2017.

Credit: 
Florida Atlantic University

Gentler, safer hair dye based on synthetic melanin

image: Using synthetic melanin, scientists have developed a gentler dye that can turn hair any color and lasts at least 18 washes.

Image: 
Northwestern University

EVANSTON, Ill. -- With the coronavirus pandemic temporarily shuttering hair salons, many clients are appreciating -- and missing -- the hair dye they use to cover up grays or touch up roots. Whether done at a salon or at home, however, frequent coloring can damage hair and might pose health risks from potentially cancer-causing dye components.

Now Northwestern University researchers have developed a new hair dye process that is much milder than traditional hair dyes. The dye uses synthetic melanin to mimic natural human hair pigmentation.

"We have worked with melanin for several years, focused on how we can capture some of its properties that naturally arise in biology. For example, as a pigment and a color element in all kinds of organisms," said Northwestern's Nathan Gianneschi, who led the research.

"In this study, our postdoctoral fellow Claudia Battistella wondered whether we could use a synthetic process to make a melanin coating on hair that would mimic the appearance of real, melanized hair in the full array of natural hair colors," Gianneschi said. "Such an approach, if done under mild conditions, could be an alternative to other kinds of hair dyes, avoiding some of the toxicity or allergies associated with those chemicals."

The research was published today (April 29) in the journal ACS Central Science.

Gianneschi is the Jacob and Rosalind Cohn Professor of Chemistry in Northwestern's Weinberg College of Arts and Sciences and associate director of the International Institute of Nanotechnology. Battistella is the paper's first author.

Melanin is a group of natural pigments that give hair and skin their varied colors. With aging, melanin disappears from hair fibers, leading to color loss and graying. Most permanent hair dyes use ammonia, hydrogen peroxide, small-molecule dyes and other ingredients to penetrate the cuticle of the hair and deposit coloring. Along with being damaging to hair, these harsh substances could cause allergic reactions or other health problems in colorists and their clients.

Recently, scientists have explored using synthetic melanin to color human hair, but the process required relatively high concentrations of potentially toxic heavy metals, such as copper and iron, and strong oxidants. Gianneschi's team aimed to find a gentler, safer way to get long-lasting, natural-looking hair color with synthetic melanin.

The researchers tested different dyeing conditions for depositing synthetic melanin on hair and found that they could substitute mild heat and a small amount of ammonium hydroxide for the heavy metals and strong oxidants used in prior methods. They could produce darker hues by increasing the concentration of ammonium hydroxide, or red and gold shades by adding a small amount of hydrogen peroxide.

Overall, the conditions were similar to, or milder than, those used in commercially available hair dyes. And the natural-looking colors were deposited on the hair surface rather than penetrating the cuticle, making damage less likely. The colored layer persisted for at least 18 washes.

Credit: 
Northwestern University

Experts apply microbiome research to agricultural science to increase crop yield

image: NAU associate professor Greg Caporaso

Image: 
Photo courtesy of Northern Arizona University

The global demand and consumption of agricultural crops is increasing at a rapid pace. According to the 2019 Global Agricultural Productivity Report, global yield needs to increase at an average annual rate of 1.73 percent to sustainably produce food, feed, fiber and bioenergy for 10 billion people in 2050. In the US, however, agricultural productivity is struggling to keep pace with population growth, highlighting the importance of research into traditional practices as well as new ones.

In an effort to increase crop yield, scientists at Northern Arizona University's Pathogen and Microbiome Institute (PMI) are working with Purdue University researchers to study the bacterial and fungal communities in soil to understand how microbiomes are impacting agricultural crops. They believe technological advances in microbiome science will ultimately help farmers around the world grow more food at a lower cost.

Nicholas Bokulich, a PMI assistant research professor, and Greg Caporaso, an associate professor of biological sciences and director of PMI's Center for Applied Microbiome Science (CAMS), have been testing a long-held farming belief that phylogenetics--the study of the evolutionary relationship between organisms--should be used to define crop rotation schedules.

The team recently published its findings regarding microbiome research in agricultural food production in Evolutionary Applications. The paper is titled, "Phylogenetic farming: Can evolutionary history predict crop rotation via the soil microbiome?"

Specifically, the traditional approach has been to rotate distantly related crops across different years to maximize plant yield. "One hypothesis for why this may be helpful is that plant pathogens are specific to a single host or to very closely related hosts. If you grow closely related crops in adjacent years, there is a higher chance that pathogens may be lying in wait for their hosts in the second year," Caporaso said. "But this hypothesis has not been directly tested."

The team's experiment, supported by a grant from the USDA National Institute of Food and Agriculture, spanned two outdoor growing seasons. In the first year, Purdue scientists Kathryn Ingerslew and Ian Kaplan grew 36 crops and agricultural weeds that differed in evolutionary divergence from the tomato. The experimental plots ranged from tomato (the same species) to eggplant (the same genus as tomato, but a different species) and sweet peppers (the same family as tomato, but in a different genus and species) through corn, wheat and rye, which are much more distant relatives of the tomato.

In the second season, the researchers only grew tomatoes on all of the plots. They found that in plots where tomatoes were grown in the first season, the year-two tomato yield was lower than in year one, as they expected. However, there were no significant reductions in tomato yield on any of the other plots. "This result suggests that while crop rotation is indeed important for yield, the effect may not extend beyond the species level," Caporaso said.

"This outcome was very surprising because the idea that closely related plants should be avoided in rotations is a widely held rule of thumb across the spectrum from small-scale gardening to large-scale agriculture," said Kaplan, a professor of entomology. "The fact that we cannot detect any signature of relatedness on crop yield--beyond the negative effects of single species monoculture (or tomato after tomato)--suggests that other factors need to be considered in designing crop rotation programs in sustainable farming systems."

Before planting in year two, Caporaso and Bokulich used microbiome sequencing methods to determine the composition of the bacterial and fungal communities in the soil. They found that a microbiome legacy of the year-one crops lived on: both the soil bacterial and fungal communities differed significantly across plots that had grown different plants.
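
Comparisons like this are typically made on a matrix of pairwise community dissimilarities, for example with a PERMANOVA test. The sketch below, on made-up counts, shows one common way to do this with scikit-bio; it illustrates the general approach rather than the study's actual pipeline.

```python
# PERMANOVA on Bray-Curtis dissimilarities: do soil communities differ by the
# crop grown in year one? The feature table below is synthetic toy data.
import numpy as np
from skbio.diversity import beta_diversity
from skbio.stats.distance import permanova

rng = np.random.default_rng(0)
plot_ids = [f"plot{i}" for i in range(12)]
year_one_crop = ["tomato"] * 4 + ["eggplant"] * 4 + ["corn"] * 4
counts = rng.poisson(lam=20, size=(12, 50))  # 12 plots x 50 taxa (synthetic)

bc = beta_diversity("braycurtis", counts, ids=plot_ids)  # pairwise dissimilarity
print(permanova(bc, grouping=year_one_crop, permutations=999))
# Reports a pseudo-F statistic and a permutation p-value.
```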

"We're now able to explore the role of microbes and the composition of microbiomes in agricultural systems in far greater depth and resolution than has ever been possible before," Caporaso said. "Those technologies can undoubtedly help us optimize agricultural systems to continuously enrich, rather than deplete, soil over time."

Caporaso says his long-term goal with this research direction is to collaborate with organic farmers who are practicing regenerative agriculture techniques. "We can learn how they can use advances in microbiome science to their advantage. I believe that this can help lower their fertilizer costs and water use, and build resiliency and food security in our communities."

Undergraduate researcher will conduct vermicomposting program to track microbiome of food waste

In a related project, funded by the NAU Green Fund as well as through Caporaso's lab, which is focused on software engineering in support of microbiome research, environmental science undergraduate researcher Christina Osterink plans to prototype a workplace composting program this summer. Her project will involve working with about 10 offices on campus to collect food scraps and deliver them to Roots Micro Farm in downtown Flagstaff.

While diverting food waste from the landfill to an urban farm, Osterink and her team also will track the microbiome of the collected food waste through its transformation via vermicomposting, a worm-based composting method, into high-quality soil. "This will help us develop a more precise understanding of the role of microbes in the composting process as we bring together efforts from throughout NAU's campus as well as local farmers to improve NAU's sustainability and Flagstaff's soil integrity," she said.

Caporaso is hopeful findings from Osterink's research can be applied to optimize composting systems and reduce farmers' costs.

Applying microbiome research to agricultural science, a new direction for CAMS

Caporaso notes that the vermicomposting project represents a new research direction for his lab. "Most of our work at CAMS is related to human health," he said, "but there are at least as many opportunities to apply microbiome research in other areas, such as agricultural science."

Meanwhile, the next step in the crop rotation study will be to identify the important factors for plant yield, especially if evolutionary relatedness of species is ruled out.

"Do we want to rotate crops that thrive with similar soil microbiomes, so that the beneficial bacteria and fungi are already in place to support the next growing season?" Caporaso said. "That would be valuable information for both small urban farms and large industrial operations."

Credit: 
Northern Arizona University

Defining geographic regions with commuter data

image: Comparison of MSAs of New York City region, major Texas cities, and Minneapolis (left) with their associated communities (right) in fairly populous regions.

Image: 
He et al, 2020 (PLOS ONE, CC BY)

A new mathematical approach uses data on people's commutes between and within U.S. counties to identify important geographic regions. Mark He of the University of North Carolina at Chapel Hill and colleagues present this work in the open-access journal PLOS ONE on April 29, 2020.

Defining the boundaries that separate metropolitan areas has major implications for research, governance, and economic development. For instance, such boundaries can influence allocation of infrastructure funding or housing subsidies. However, traditional methods to define metropolitan regions often hamper meaningful understanding of communities' characteristics and needs.

Drawing on methodologies from network science, He and colleagues have now developed a new method of defining metropolitan areas according to census commuter data. They organized all 3,091 counties in the contiguous United States into an interconnected network, with the number of commuters who cross county lines determining the strength of connections between counties. Notably, unlike other studies that have used commuter data to define metropolitan regions, they also accounted for within-county commuting.
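
In network terms, counties are nodes, commuter flows are weighted edges, and within-county commuting appears as self-loops. The toy sketch below uses a standard modularity-based community detection routine from networkx to illustrate the general approach; the flow numbers are invented, and the authors' method differs in detail (for example, it permits overlapping communities).

```python
# Toy county-commuting network with community detection. Nodes are county
# FIPS codes, edge weights are commuter counts (all invented), and self-loops
# record within-county commuting.
import networkx as nx
from networkx.algorithms.community import louvain_communities

flows = [
    ("36061", "36047", 120_000),    # toy inter-county flow
    ("36061", "36061", 950_000),    # within-county commuting as a self-loop
    ("36047", "36047", 500_000),
    ("06037", "06037", 3_900_000),  # a county dominated by internal commuting
    ("06037", "06059", 40_000),
]

G = nx.Graph()
for origin, dest, commuters in flows:
    if G.has_edge(origin, dest):
        G[origin][dest]["weight"] += commuters
    else:
        G.add_edge(origin, dest, weight=commuters)

for i, community in enumerate(louvain_communities(G, weight="weight", seed=0)):
    print(i, sorted(community))
```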

Using the new method, the researchers identified 182 clusters of counties that together accounted for more than 90 percent of commuters. Fourteen clusters were characterized by a high number of commuters to a central node county, while 78 clusters lacked a strong central node. They found 90 counties, including Los Angeles County, that stood alone because of high levels of within-county commuting. In contrast, 20 clusters, mostly centered around large cities, included 50 or more counties and spanned several states.

Generally, the clusters identified by the new method were larger than existing regions defined by traditional methods, suggesting the existence of important connections extending much farther than expected. (Notably, the authors permitted geographic regions to overlap, allowing a richer and more nuanced characterization of geographic areas.)

While further work is needed to refine this new method, it could enable a more nuanced understanding of meaningful metropolitan boundaries and relationships in the U.S.

The authors add: "Results from community detection suggest that traditional regional delineations that rely on ad hoc thresholds do not account for important and pervasive connections that extend far beyond expected metropolitan boundaries or megaregions."

Credit: 
PLOS

Coffee plants have a small but consistent core microbiome of fungi and bacteria

image: Team member Adam Martin collects coffee root samples.

Image: 
Roberta Fulthorpe

For most people, coffee is a necessary start to the day. For three scientists based in Toronto, coffee is a good research subject in a world with a changing climate.

These scientists explored the tissues of coffee roots, looking for signs of a "core microbiome": microbes, such as bacteria and fungi, that consistently form partnerships with the coffee plant.

The existence of consistent microbes within a plant's microbiome is strongly indicative of beneficial relationships, and a better understanding of coffee's microbial partnerships is helpful for determining best management practices and predicting coffee's responses to changing conditions.

To enhance our knowledge of coffee plant microbiomes, these ecologists used next-generation sequencing methods on samples from a number of Central American farms that differed drastically in environmental conditions and management systems. They discovered 26 bacterial and 31 fungal species that met their criteria for belonging to the core microbiome. Some of these species are known to have plant-beneficial properties and should be investigated in more detail.
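
One common way to operationalize a "core microbiome" is a prevalence threshold: taxa detected in at least a given fraction of all samples. The sketch below applies such a criterion to a synthetic feature table; the authors' exact criteria may have differed.

```python
# Prevalence-threshold core microbiome on a synthetic sample-by-taxon table.
import pandas as pd

# rows = root samples from different farms, columns = taxa, values = read counts
table = pd.DataFrame(
    {"Taxon_A": [15, 22, 9, 30], "Taxon_B": [0, 4, 0, 0], "Taxon_C": [7, 11, 5, 13]},
    index=["farm1", "farm2", "farm3", "farm4"],
)

prevalence = (table > 0).mean()        # fraction of samples containing each taxon
core = prevalence[prevalence >= 0.95]  # e.g., present in at least 95% of samples
print(core.index.tolist())             # -> ['Taxon_A', 'Taxon_C']
```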

"The bacterial core microbiome is much stronger and consistent, while the fungal microbiome is more sensitive to environmental conditions that are expected to expand in range with climate change," said Roberta Fulthorpe, one of the scientists behind this research. "We also found that fungi appear to be related to coffee root characteristics while bacteria are not."

The finding that a number of highly abundant microbial species consistently persist in coffee is a remarkable one. As team member Adam Martin explains: "That the same species are found across a huge range of temperatures, precipitation, soil conditions, and light availability, is novel evidence of a core microbiome that actually exists in real-world conditions."

"Our results open the door for understanding if or how microbiomes can be managed in real-world cropping systems. Our work also leads to interesting questions on whether or not the flavor of our morning cup of coffee is influenced by the plant's microbes."

Credit: 
American Phytopathological Society

Eyes send an unexpected signal to the brain

image: Retinal section from a mouse where cell nuclei are labeled in blue, inhibitory cells are labeled with magenta, and ipRGCs are labeled in green.

Image: 
Northwestern University

EVANSTON, Ill. -- The eyes have a surprise.

For decades, biology textbooks have stated that eyes communicate with the brain exclusively through one type of signaling pathway. But a new discovery shows that some retinal neurons take a road less traveled.

New research, led by Northwestern University, has found that a subset of retinal neurons sends inhibitory signals to the brain. Researchers previously believed the eye sends only excitatory signals. (Simply put: excitatory signaling makes neurons fire more; inhibitory signaling makes neurons fire less.)

The Northwestern researchers also found that this subset of retinal neurons is involved in subconscious behaviors, such as the synchronization of circadian rhythms to light/dark cycles and pupil constriction in response to intense bright light. By better understanding how these neurons function, researchers can explore new pathways by which light influences our behavior.

"These inhibitory signals prevent our circadian clock from resetting to dim light and prevent pupil constriction in low light, both of which are adaptive for proper vision and daily function," said Northwestern's Tiffany Schmidt, who led the research. "We think that our results provide a mechanism for understanding why our eye is so exquisitely sensitive to light, but our subconscious behaviors are comparatively insensitive to light."

The research will be published in the May 1 issue of the journal Science.

Schmidt is an assistant professor of neurobiology at Northwestern's Weinberg College of Arts and Sciences. Takuma Sonoda, a former Ph.D. student in the Northwestern University Interdepartmental Neuroscience program, is the paper's first author.

To conduct the study, Schmidt and her team blocked the retinal neurons responsible for inhibitory signaling in a mouse model. When this signal was blocked, dim light was more effective at shifting the mice's circadian rhythms.

"This suggests that there is a signal from the eye that actively inhibits circadian rhythms realignment when environmental light changes, which was unexpected," Schmidt said. "This makes some sense, however, because you do not want to adjust your body's entire clock for minor perturbations in the environmental light/dark cycle, you only want this massive adjustment to take place if the change in lighting is robust."

Schmidt's team also found that, when the inhibitory signals from the eye were blocked, mice's pupils were much more sensitive to light.

"Our working hypothesis is that this mechanism keeps pupils from constricting in very low light," Sonoda said. "This increases the amount of light hitting your retina, and makes it easier to see in low light conditions. This mechanism explains, in least part, why your pupils avoid constricting until bright light intensifies."

Credit: 
Northwestern University

Water is key in catalytic conversion of methane to methanol

image: Brookhaven Lab and Stony Brook University (SBU) members of the research team. First row, left to right: Sanjaya Senanayake (Brookhaven), Mausumi Mahapatra (Brookhaven), Jose A Rodriguez (Brookhaven), Ping Liu (Brookhaven) and Wenjie Liao (SBU). Second row: Ivan Orozco (SBU), Ning Rui (Brookhaven), Zongyuan Liu (Brookhaven) and Erwei Huang (SBU).

Image: 
Brookhaven National Laboratory

UPTON, NY--Scientists at the U.S. Department of Energy's Brookhaven National Laboratory and collaborators have revealed new details that explain how a highly selective catalyst converts methane, the main component of natural gas, to methanol, an easy-to-transport liquid fuel and feedstock for making plastics, paints, and other commodity products. The findings could aid the design of even more efficient and selective catalysts to make methane conversion an economically viable and environmentally attractive alternative to venting or flaring "waste" gas.

As described in a paper appearing in Science, the team used theory-based models and simulations to identify the atomic-level rearrangements that take place during the reaction, and then conducted experiments to verify those details. The studies revealed three essential roles for water, working in conjunction with an economical cerium-oxide/copper-oxide catalyst, to bring about the conversion of methane to methanol with 70 percent selectivity while blocking unwanted side reactions.

"We knew from previous work that we'd developed a highly selective catalyst for the direct conversion of methane to methanol in the presence of water," said Brookhaven Lab chemist Sanjaya Senanayake, who led the project. "But now, using advanced theoretical and experimental techniques, we've learned why it works so well."

The findings could speed the development of catalysts that make use of methane escaping from gas and oil wells, where it is typically vented directly into the atmosphere or burned off.

"Transporting gas is extremely difficult and potentially hazardous," Senanayake said. "But if you convert it directly into a liquid you can move it and use it instead of flaring it wastefully. While the commercialization potential for such a reaction may still take several years, we hope that our results and the understanding of how it all works will help to get there faster."

Theory lays the groundwork

The search for methane-to-methanol catalysts has turned up a few promising prospects. But many operate in several distinct steps with high energy requirements. And in many cases, competing reactions break down the methane (and any produced methanol) completely to carbon monoxide (CO) and CO2. So, when the Brookhaven team first observed that their catalyst could directly convert methane to methanol with high yield in one continuous reaction, they wanted to know more about how it performed this difficult task.

They were particularly interested in figuring out the role of water, which appeared to facilitate key steps in the process and somehow block the reaction pathways that produced CO and CO2.

Using computational tools in Brookhaven Lab's Center for Functional Nanomaterials (CFN), Brookhaven's Scientific Data and Computing Center, Stony Brook University (SBU), and the National Energy Research Scientific Computing Center (NERSC) at DOE's Lawrence Berkeley National Laboratory (Berkeley Lab), Brookhaven chemist Ping Liu developed the theoretical approach to figure out what was going on.

First, she used "density functional theory" (DFT) calculations to identify how the reactants (methane, oxygen, and water) changed as they interacted with one another and the cerium-oxide/copper-oxide catalyst at various stages during the reaction. These calculations also included information about how much energy it would take to get from one atomic arrangement to the next.

"The DFT gives you a bunch of 'snapshots' of the stages involved in the reaction and the 'bumps' or barriers you have to overcome to get from one stage to the next," she explained.

Then she performed "kinetic Monte Carlo" simulations--essentially using computers to try out all the possible ways the reaction could proceed from snapshot to snapshot. The simulations take into account all the possible pathways and energy requirements to move from one stage to the next.

"These simulations start with each intermediate stage and look at all the possibilities that can go to the next step--and figure out what is the most probable pathway," Liu said. "The simulations determine the most probable way the snapshots can be connected in real time."

The simulations also model how different reaction conditions--for example, changes in pressure and temperature--will affect reaction rates and the probable pathways.

"There were 45-50 possible components in the 'reaction network' we were simulating," said Jose Rodriguez, a leader of Brookhaven's catalysis group who also has a joint appointment at SBU. "Out of those, Ping, Erwei Huang, and Wenjie Liao, two Ph.D. students at SBU, were able to predict what would be the most favorable conditions, the best path, for going from methane to methanol and not to CO and CO2--and all induced by the presence of water."

The models predicted three roles for water: 1) activating the methane (CH4) by breaking one carbon-hydrogen bond and providing an -OH group to convert the CH3 fragment to methanol, 2) blocking reactive sites that could potentially convert methane and methanol to CO and CO2, and 3) facilitating the displacement of methanol formed on the surface into the gas phase as a product.

"All the action takes place at one or two active sites at the interface between the cerium-oxide nanoparticles and copper-oxide film that make up our catalyst," Senanayake said.

But this description was still just a model. The scientists needed evidence.

Experiments provide proof

To gather evidence, the scientists from Brookhaven and SBU conducted additional experiments in Brookhaven's Chemistry Division laboratories and took several trips to the Advanced Light Source (ALS) at Berkeley Lab. This team included SBU Ph.D. student Ivan Orozco and post-doctoral fellows Zongyuan Liu, Robert M. Palomino, Ning Rui, and Mausumi Mahapatra.

At the ALS, the group worked with Berkeley Lab's Slavomir Nemsak and collaborators Thomas Duchon (Peter-Grünberg-Institut in Germany) and David Grinter (Diamond Light Source in the United Kingdom) to perform experiments using ambient pressure (AP) x-ray photoelectron spectroscopy (XPS), which allowed them to track the reaction as it happened in real time to identify key steps and intermediates.

"The x-rays excite electrons, and the energy of the electrons tells you what chemical species you have on the surface and the chemical state of the species. It makes a 'chemical fingerprint.'" said Rodriguez. "Using this technique, you can follow the surface chemistry and reaction mechanism in real time."

Running the reaction with and without water under a range of conditions confirmed that water played the predicted three roles. The measurements showed how the reaction conditions moved the process forward and maximized the production of methanol by preventing side reactions.

"We found direct evidence for formation of CH3O--an intermediate precursor for methanol--in the presence of water," Rodriguez said. "And because you have the water, you modify all the surface chemistry to block the side reactions, and also easily release the methanol from the catalyst surface so it doesn't decompose."

"Now that we've identified the design principles for the catalyst," Senanayake said, "next we have to build a real system for using such a catalyst and test it--and see if we can make it better."

Credit: 
DOE/Brookhaven National Laboratory

Gladstone scientists identify a new potential reservoir of latent HIV

image: In a recent paper in PLOS Pathogens, Gladstone Visiting Scientist Nadia Roan, PhD, and her team describe a class of cells that preferentially support latent infection by HIV.

Image: 
Gladstone Institutes

Scientists have long known that even in the face of antiretroviral therapy, some HIV remains in infected individuals forever, hiding in small reservoirs of immune system cells. When these individuals discontinue the therapy, the virus almost always rebounds rapidly from the reservoirs, causing deadly symptoms to re-emerge.

These reservoirs remain the main obstacle to curing HIV/AIDS. But there is at present no easy way of targeting reservoir cells for elimination. Nor can scientists efficiently extract reservoir cells from patients to study them, and, ultimately, find ways to control them.

The reason is that the virus in these cells is silent. As a result, the cells do not carry on their surfaces the viral proteins that would make them easy to find.

Scientists have therefore been looking for other means to pinpoint reservoir cells.

In a recent paper in PLOS Pathogens, Gladstone Visiting Scientist Nadia Roan, PhD, and her team describe a class of cells that preferentially support latent infection by HIV. These cells are characterized by a surface protein called CD127 and are found in tissues such as lymph nodes, which are thought to harbor a larger share of the HIV reservoir than blood does.

"Our findings suggest that CD127 cells from tissues may be an important population to target for an HIV cure," says Roan, who is also an associate professor of urology at UC San Francisco.

In addition, scientists can potentially use the CD127 protein as a handle to isolate reservoir cells from patients, and study what makes them able to silence the virus, and occasionally reactivate it.

A New Reservoir?

HIV targets immune cells, known as T cells, that reside primarily in lymphoid tissues, such as lymph nodes and tonsils. Yet HIV infection studies have largely focused on T cells circulating in the blood, which are relatively easy to gain access to--volunteers are more likely to submit to a blood draw than a tissue biopsy.

But focusing on T cells present in the blood is probably giving scientists a skewed view of the reservoir composition.

"We have long suspected that reservoir cells come in different flavors, and that different tissues harbor different types of reservoir cells. But that has been difficult to show because reservoir cells in infected individuals are rare. The vast majority of in vitro models of latency use cell lines or cells circulating in the blood," says Roan.

Roan and her team, by contrast, have been studying HIV infection using tissue specimens. In previous work, her team exposed tonsil cells to HIV in the lab to see which ones were most susceptible to infection. Using a variety of experimental approaches, the team found that tonsil cells with the surface protein CD127 efficiently took up the HIV virus but only rarely let it replicate. By contrast, another type of tonsil cells, carrying CD57 on their surface, readily supported a productive infection.

That was intriguing, but it did not necessarily mean that CD127 cells were reservoir cells.

"After HIV enters a cell, the cell still has ways to escape infection," says Feng Hsiao, a former research associate in Roan's lab and co-first author of the present study.

One way is to prevent the virus from copying its genome. Unlike the genome of human cells, the HIV genome is made of RNA. One of the virus's first tasks upon entering a cell is to make DNA copies of its RNA genome, using a viral enzyme called reverse transcriptase.

Cells can hamper this step by activating an enzyme called SAMHD1 that depletes the stores of building blocks the virus needs to copy its genome. There was some evidence that this mechanism might be at play in blood cells.

However, in their present work, Roan and her team found that eliminating SAMHD1 by genetic manipulation did not allow CD127 cells to churn out virus, even though it boosted viral production by CD57 cells.

"This suggested to us that CD127 cells blocked the virus at a later step in its life cycle," says Julie Frouard, PhD, a postdoctoral scholar in Roan's lab and the other first-author of the study.

A Preference for Latent Infection

The next step for the virus is to integrate a copy of its genome into the host cell's DNA. Once there, the viral genes can take advantage of the cell machinery to produce their own proteins, which assemble into new viral particles that can go on to infect other cells.

Reservoir cells harbor HIV's genetic material integrated in their own genomes, though they somehow silence it. The occasional mobilization of this material permits the release of infectious virus. Did CD127 tonsil cells allow HIV genome integration?

To answer this question, the scientists extracted the genome of CD127 and CD57 cells that had been exposed to virus in the lab. Using genetic tools that can specifically detect integrated viral DNA sequences, they found that both cell types harbored copies of the virus's genome, even though CD127 cells produced far less virus than CD57 cells did. The CD127 cells appeared to favor a latent infection.

And yet, the virus integrated in CD127 cells is not silenced forever. Roan and her team found that by treating latently infected CD127 cells with agents known to stimulate T cells, they could coax the cells to reactivate the virus.

Hence, CD127 tissue cells could very well serve as reservoir cells in the body, keeping the virus dormant most of the time, yet able to occasionally activate it and release the seeds of a new round of infection.

"The ability of a specific type of tissue T cell to preferentially support latent infection is very intriguing, and can teach us much about how the tissue reservoir becomes established initially," says Roan.

Controlling the Reservoir

To what extent CD127 cells are a major component of the reservoir in people living with HIV awaits follow-up studies analyzing these cells from multiple tissue sites. Preliminary studies from Roan's team are encouraging, as they show that the CD127 marker on the cells' surface can indeed be used to purify enough infected tissue cells from infected individuals to allow further analyses.

Meanwhile, "CD127 tonsil cells exposed to HIV in vitro provide a novel model to study viral latency in tissues," says Roan.

Roan and her team have already started analyzing what makes CD127 cells uniquely prone to silent infections. By comparing all the genes expressed in CD127 and CD57 tonsil cells, they found evidence that CD127 cells are in a quiescent state that may prevent the expression of the virus's genes. Moreover, they also found that the virus's gene products, or RNAs, failed to undergo the necessary processing that would allow them to make viral proteins.

"Ultimately, our hope is that the mechanisms we uncover can be harnessed to control the latent reservoir and move us closer to achieving a cure for HIV," says Roan.

Credit: 
Gladstone Institutes