Culture

Some domesticated plants ignore beneficial soil microbes

While domestication of plants has yielded bigger crops, the process has often had a negative effect on plant microbiomes, making domesticated plants more dependent on fertilizer and other soil amendments than their wild relatives.

In an effort to make crops more productive and sustainable, researchers recommend reintroduction of genes from the wild relatives of commercial crops that restore domesticated plants' ability to interact with beneficial soil microbes.

Thousands of years ago, people harvested small wild plants for food. Eventually, they selectively cultivated the largest ones until the plump cereals, legumes, and fruit we know today evolved. But through millennia of human tending, many cultivated plants lost some ability to interact with soil microbes that provide necessary nutrients. This has made some domesticated plants more dependent on fertilizer, one of the world's largest sources of nitrogen and phosphorus pollution and a product that consumes fossil fuels to produce.

"I was surprised how completely hidden these changes can be," said Joel Sachs, a professor of biology at UC Riverside and senior author of a paper published today in Trends in Ecology and Evolution. "We're so focused on above ground traits that we've been able to massively reshape plants while ignoring a suite of other characteristics and have inadvertently bred plants with degraded capacity to gain benefits from microbes."

Bacteria and fungi form intimate associations with plant roots that can dramatically improve plant growth. These microbes help break down soil elements like phosphorus and nitrogen that the plants absorb through their roots. The microbes also get resources from the plants in a mutually beneficial, or symbiotic, relationship. When fertilizer or other soil amendments make nutrients freely available, plants have less need to interact with microbes.

Sachs and first author Stephanie Porter of Washington State University, Vancouver, reviewed 120 studies of microbial symbiosis in plants and concluded that many types of domesticated plants show a degraded capacity to form symbiotic communities with soil microbes.

"The message of our paper is that domestication has hidden costs," Sachs said. "When plants are selected for a small handful of traits like making a bigger seed or faster growth, you can lose a lot of important traits relating to microbes along the way."

This evolutionary loss has turned into a loss for the environment as well.

Excess nitrogen and phosphorus from fertilizer can leach from fields into waterways, leading to algae overgrowth, low oxygen levels, and dead zones. Nitrogen oxide from fertilizer enters the atmosphere, contributing to air pollution. Fossil fuels are also consumed to manufacture fertilizers.

Some companies have begun selling nitrogen-fixing bacteria as soil amendments to make agriculture more sustainable, but Sachs said these amendments don't work well because some domesticated plants can no longer pick up those beneficial microbes from the soil.

"If we're going to fix these problems, we need to figure out which traits have been lost and which useful traits have been maintained in the wild relative," Sachs said. "Then breed the wild and domesticated together to recover those traits."

Credit: 
University of California - Riverside

Improving the collection of birth and death data worldwide

University of Melbourne researchers have identified and implemented the key interventions and tools that countries can - and should - use to improve the quality and availability of critical birth and death data and ultimately, improve health outcomes.

Published in BMC Medicine, the collection is the first on Civil Registration and Vital Statistics to report on experience in implementing technical interventions over the first four years of the Bloomberg Data for Health Initiative.

Across the world, about 40-50 per cent of all deaths are unrecorded and millions of deaths do not have a documented cause.

Over four years, University of Melbourne researchers have worked with low and middle-income countries through the Bloomberg Data for Health Initiative to help them improve the collection of birth and death data. This increases understanding about the leading causes of premature death and can be used to inform public health policy and preventive action.

"People live longer if they benefit from informed, targeted public health action and policies, and that means we need to know reliably what people are dying from," said University of Melbourne Professor Alan Lopez, report co-author and expert on the global burden of disease.

As part of the Initiative, researchers have conducted the first ever multi-country assessment of medical certification improvement strategies designed to improve the accuracy of diagnoses by doctors in hospitals.

Accurate and timely completion of the medical certificate of cause of death should be a relatively straightforward procedure for physicians, but researchers say mistakes are common.

Researchers found that a reduction in incorrectly completed certificates of between 28 per cent and 43 per cent is possible, depending on the training approach - training the trainer, direct training of physicians, or online training.

The results indicate that a variety of training strategies can produce benefits in the quality of certification, but further improvements are possible.

Other recommendations include using automated verbal autopsy methods. Using a digital tablet and a 20-minute questionnaire, interviewers ask friends and family about the signs and symptoms present before the person died.

Results reported in the journal demonstrate this simple, cost-effective data collection method can produce reliable cause of death information for community deaths for which little was previously known.

Researchers are now calling for parallel investments in critical health information systems to accelerate progress built on lessons learned.

"Knowing how many people are born each year, how many are dying and from what causes, is vital to tackling public health problems, ongoing population planning and developing policies," report co-author and University of Melbourne Associate Professor Deirdre McLaughlin said.

"Our research provides evidence-based interventions that countries can use to strengthen vital statistic systems and improve their record keeping.

"Further funding and support are needed to enable low and middle-income countries to implement these recommendations, improve global health data and ultimately, health outcomes."

This BMC Medicine collection was led by the Global Burden of Disease Group at the Melbourne School of Population and Global Health, University of Melbourne, Australia. It draws upon experiences of countries involved in the Data for Health Initiative, supported by Bloomberg Philanthropies and the Australian Department of Foreign Affairs and Trade.

Credit: 
University of Melbourne

New study on COVID-19 estimates 5.1 days for incubation period

An analysis of publicly available data on infections from the new coronavirus, SARS-CoV-2, that causes the respiratory illness COVID-19 yielded an estimate of 5.1 days for the median disease incubation period, according to a new study led by researchers at Johns Hopkins Bloomberg School of Public Health. This median time from exposure to onset of symptoms suggests that the 14-day quarantine period used by the U.S. Centers for Disease Control and Prevention for individuals with likely exposure to the coronavirus is reasonable.

The analysis suggests that about 97.5 percent of people who develop symptoms of SARS-CoV-2 infection will do so within 11.5 days of exposure. The researchers estimated that for every 10,000 individuals quarantined for 14 days, only about 101 would develop symptoms after being released from quarantine.
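The arithmetic behind these estimates can be sketched from the two reported quantiles alone. The Python sketch below assumes a log-normal incubation distribution fitted to the reported median (5.1 days) and 97.5th percentile (11.5 days); the tail probability beyond 14 days it yields is of the same order as, though not identical to, the study's bootstrap-based figure of roughly 101 per 10,000.

```python
import math

# Quantiles reported in the study
median = 5.1       # days, 50th percentile
q975 = 11.5        # days, 97.5th percentile
z975 = 1.959964    # standard-normal 97.5% quantile

# Log-normal parameters implied by matching the two quantiles
mu = math.log(median)
sigma = (math.log(q975) - mu) / z975

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def frac_symptomatic_after(days):
    """Fraction of infected people whose symptoms begin after `days`."""
    return 1.0 - phi((math.log(days) - mu) / sigma)

missed_per_10k = 10_000 * frac_symptomatic_after(14)
print(f"sigma = {sigma:.3f}")
print(f"missed per 10,000 after a 14-day quarantine: {missed_per_10k:.0f}")
```

Because the study estimated the distribution from interval-censored exposure windows with bootstrap uncertainty, this two-quantile fit is only an approximation of its method.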

The findings will be published online March 9 in the journal Annals of Internal Medicine.

For the study, the researchers analyzed 181 cases from China and other countries that were detected prior to February 24, were reported in the media, and included likely dates of exposure and symptom onset. Most of the cases involved travel to or from Wuhan, China, the city at the center of the epidemic, or exposure to individuals who had been to Hubei, the province of which Wuhan is the capital.

The CDC and many other public health authorities around the world have been using a 14-day quarantine or active-monitoring period for individuals who are known to be at high risk of infection due to contact with known cases or travel to a heavily affected area.

"Based on our analysis of publicly available data, the current recommendation of 14 days for active monitoring or quarantine is reasonable, although with that period some cases would be missed over the long-term," says study senior author Justin Lessler, an associate professor in the Bloomberg School's Department of Epidemiology.

The global outbreak of SARS-CoV-2 infection emerged in December 2019 in Wuhan, a city of 11 million in central China, and has resulted in 95,333 officially confirmed cases around the world and 3,282 deaths from pneumonia caused by the virus, according to the World Health Organization's March 5 Situation Report. The majority of the cases are from Wuhan and the surrounding Hubei province, although dozens of other countries have been affected, including the U.S., but chiefly South Korea, Iran, and Italy.

An accurate estimate of the disease incubation period for a new virus makes it easier for epidemiologists to gauge the likely dynamics of the outbreak, and allows public health officials to design effective quarantine and other control measures. Quarantines typically slow and may ultimately stop the spread of infection, even if there are some outlier cases with incubation periods that exceed the quarantine period.

Lessler notes that sequestering people in a way that prevents them from working has costs, both personal and societal, which is perhaps most obvious when health care workers and first responders like firefighters are quarantined.

The new estimate of 5.1 days for the median incubation period of SARS-CoV-2 is similar to estimates from the earliest studies of this new virus, which were based on fewer cases. This incubation period for SARS-CoV-2 is in the same range as SARS-CoV, a different human-infecting coronavirus that caused a major outbreak centered in southern China and Hong Kong from 2002-04. For MERS-CoV, a coronavirus that has caused hundreds of cases in the Middle East, with a relatively high fatality rate, the estimated mean incubation period is 5-7 days.

Human coronaviruses that cause common colds have mean illness-incubation periods of about three days.

Lessler and colleagues have published an online tool that allows public health officials and members of the public to estimate how many cases would be caught and missed under different quarantine periods.

"The incubation period of COVID-19 from publicly reported confirmed cases: estimation and application" was written by co-first authors Stephen Lauer and Kyra Grantz, and Qifang Bi, Forrest Jones, Qulu Zheng, Hannah Meredith, Andrew Azman, Nicholas Reich, and Justin Lessler.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Predicting appropriate opioid prescriptions post-cesarean delivery

AURORA, Colo. (March 9, 2020) - Knowing the amount of opioids taken following cesarean section surgery and before discharge can inform individualized prescriptions and cut down on unnecessary, leftover pills that could be used for non-medical purposes, according to a new study from the University of Colorado Anschutz Medical Campus.

Cesarean delivery is the most common operation performed in the United States. Most patients are prescribed opioids following the procedure, and while persistent opioid use post-delivery is uncommon, overprescribing poses a risk for patients' communities as unused pills are often stored in unlocked locations.

The study, published today in Annals of Family Medicine, examined opioid use by 203 cesarean delivery patients for 24 hours pre-discharge. The patients then self-reported opioid use for four weeks after. While researchers found that patients who took fewer opioids pre-discharge also reported less opioid intake in the following weeks, most patients received similar, non-individualized prescriptions. There were 1,805 leftover pills from patients participating in this study alone. Just 16% reported storing leftover pills in a locked location, and fewer discarded their leftover medication altogether.

"Leftover opioids fuel nonmedical use," said Karsten Bartels, associate professor of anesthesiology at the University of Colorado Anschutz Medical Campus and senior author of the study. "While it's impossible to make a direct link, we can be cautious by avoiding large amounts of unnecessary opioids. Prescribing post-op discharge opioids based on last 24-hour use is a simple, practical tool to inform appropriate prescriptions - indeed, this practice is now being adopted for our patients at CU Anschutz. If we would re-do the study today, we would likely see many more individualized and lower prescriptions."

While the study stresses the importance of tailoring opioid prescriptions to individual patient needs, overall pain management should not be ignored. Undertreatment of pain has been associated with an increased risk for a variety of postpartum difficulties, including chronic pain and difficulty breastfeeding.

"Identify those who do need prescriptions," Bartels said. "[Opioids] can be detrimental to public health but can also be a godsend."

Credit: 
University of Colorado Anschutz Medical Campus

Focusing continuity of care on sicker patients can save millions of dollars annually

New study in the INFORMS journal Manufacturing & Service Operations Management - Key Takeaways:

Continuity of care leads to substantial improvements in patient outcomes, up to a point, and more so for sicker patients.

Continuity of care leads to fewer inpatient visits, shorter lengths of stay on those visits, and lower readmission rates.

Variety in care becomes important at a certain stage because outcomes tend to grow worse at the very highest levels of continuity of care.

CATONSVILLE, MD, March 9, 2020 - Research shows higher continuity of care, meaning a care team cooperatively involved in ongoing healthcare, is better for health outcomes, but can there be too much of a good thing? New research in the INFORMS journal Manufacturing & Service Operations Management finds the answer is "yes."

Continuity of care is related to improvements in inpatient visits, length of stay, and readmission rates, but the data show outcomes improve and then decline with increasing continuity of care, suggesting that there may be value in having multiple providers.
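This "improve then decline" pattern is an inverted-U relationship, which analysts typically detect by fitting a model with a quadratic term in continuity of care. The sketch below uses synthetic data (not the study's Veterans Health Administration data) to show how a negative quadratic coefficient identifies an interior peak:

```python
import numpy as np

# Illustrative only: synthetic data mimicking the qualitative finding that
# outcomes improve with continuity of care up to a point, then decline.
rng = np.random.default_rng(0)
continuity = rng.uniform(0, 1, 500)   # hypothetical continuity-of-care index
outcome = 4 * continuity - 3 * continuity**2 + rng.normal(0, 0.1, 500)

# Fit a quadratic; a negative leading coefficient confirms an inverted U,
# and the vertex gives the continuity level where outcomes peak.
c2, c1, c0 = np.polyfit(continuity, outcome, 2)
peak = -c1 / (2 * c2)
print(f"quadratic coefficient: {c2:.2f}, outcomes peak near continuity = {peak:.2f}")
```

With the synthetic curve above, the fitted peak falls around two-thirds of the continuity range, consistent with "too much of a good thing" beyond that point.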

The study, "Maintaining Continuity in Service: An Empirical Examination of Primary Care Physicians," conducted by Vishal Ahuja of Southern Methodist University, Carlos Alvarez of Texas Tech University, and Bradley Staats of the University of North Carolina at Chapel Hill looked at 300,000 patients from the Veterans Health Administration over an 11-year period. They focused on patients who suffer from diabetes or kidney disease, a major complication of diabetes.

Through the researchers' work and prior literature, the consensus is that continuity of care creates an opportunity for learning--with repeated interactions, the service provider gains important knowledge about the patient's situation. These exchanges may improve the efficiency of subsequent interactions, yielding both productivity and quality benefits, but the relationship may not be so straightforward.

"Variety may also prove beneficial. With too many repeated interactions, it is possible that a service provider may grow complacent and miss information. A new provider may provide a fresh look at an existing problem," said Ahuja, a professor in the Cox School of Business at SMU.

The data showed continuity of care is especially important for patients suffering from more serious conditions because it provides even more operational value for those complex patients, which results in a greater opportunity for improvement from continuity.

"At the highest levels of continuity of care, the outcomes grow worse. It is possible to have too much of a good thing. So, focusing continuity on sicker patients can result in saving millions of dollars annually," continued Ahuja. "It is necessary to understand where continuity matters most and where perhaps it is less vital."

Credit: 
Institute for Operations Research and the Management Sciences

Male size advantage drives evolution of sex change in reef fish

Some species of fish, notably parrotfish and wrasses living on coral reefs, change their biological sex as they age, beginning life as females and later becoming functionally male. New work from the University of California, Davis, shows that this sequential hermaphroditism evolves when bigger males gain an advantage in reproductive success -- for example by defending a permanent mating territory.

"People have wondered why this kind of sex change evolved," said Jennifer Hodge, a postdoctoral researcher in the Department of Evolution and Ecology in the UC Davis College of Biological Sciences. A paper about the work by Hodge, research associate Francesco Santini and Professor Peter Wainwright is published online in The American Naturalist.

Fishes are the only vertebrates to show such a sex change over their lifetime. The prevailing theory is that sequential switching from female to male evolves when it is more beneficial to reproduce as a female when small and as a male when large -- a circumstance that is likely to change depending on the social and mating behavior of the species.

Evidence to confirm this idea has been lacking, however.

Large males can hold territory

Mating behavior differs wildly across this group of fish. In some species, males defend a territory and monopolize one or more females within it. Others establish a "lek," or temporary territory, that is visited by females. And in some species, fish mate without establishing a territory at all.

Mating systems that require defending a territory or females provide the opportunity for larger males to be more reproductively successful.

"Large males can control mates and resources, establishing a territory with food and shelter," Hodge said.

Hodge, Santini and Wainwright tested this theory by looking at 89 species of wrasses and parrotfishes with different kinds of mating systems. They used DNA sequence data to reconstruct a tree of the relationships between the species of fish and assigned them to different mating systems based on published studies.

"Our results provide some of the first comparative evidence to support the theory that sequential hermaphroditism evolves when, for a given sex, reproduction is more effective at a certain size," Hodge said. The size advantage increases from lek-like to haremic mating systems, she said, as does the likelihood that all males of the population will be derived from sex-changed females.

Credit: 
University of California - Davis

Squatting or kneeling may have health benefits

image: The Hadza in Tanzania tend to squat or kneel when taking a break, which scientists believe may spare them from some risks for heart and metabolic diseases.

Image: 
David Raichlen of USC and Brian Wood of UCLA

Standing desks are so passé. It's time for squatting desks.

A USC-led study shows that squatting and kneeling may be important resting positions in human evolution -- and even for modern human health.

Sitting for hours a day is linked to some health risks, including cardiovascular disease, likely because it involves low muscle activity and low muscle metabolism. However, these risks seem paradoxical. For humans, evolutionary pressures favor conserving energy. Spending a lot of time sitting would seem to accomplish that goal. So, why should sitting be so harmful?

The USC-led team has shown that resting postures used before the invention of chairs -- like squatting and kneeling -- may hold the answer, as they involve higher levels of muscle activity than chair-sitting. These more active rest postures may help protect people from the harmful effects of inactivity.

"We tend to think human physiology is adapted to the conditions in which we evolved," said David Raichlen, a professor of biological sciences at the USC Dornsife College of Letters, Arts and Sciences. "So, we assumed that if inactivity is harmful, our evolutionary history would not have included much time spent sitting the way we do today."

The study was published on March 9 in the journal Proceedings of the National Academy of Sciences.

How you rest matters

To better understand the evolution of sedentary behaviors, the scientists studied inactivity in a group of Tanzanian hunter-gatherers, the Hadza, who have a lifestyle that is similar in some ways to how humans lived in the past.

For the study, Hadza participants wore devices that measured physical activity and periods of rest. The scientists found that they had high levels of physical activity -- over three times as much as the 22 minutes per day advised by U.S. federal health guidelines.

But the scientists also found that they had high levels of inactivity.

In fact, the Hadza are sedentary for about as much time -- around 9 to 10 hours per day -- as humans in more developed countries. However, they appear to lack the markers of chronic diseases that are associated, in industrialized societies, with long periods of sitting. The reason for this disconnect may lie in how they rest.

"Even though there were long periods of inactivity, one of the key differences we noticed is that the Hadza are often resting in postures that require their muscles to maintain light levels of activity -- either in a squat or kneeling," Raichlen said.

In addition to tracking activity and inactivity, the researchers used specialized equipment to measure muscle activity in the lower limbs in different resting postures. Squatting involved more muscle activity compared to sitting.

The researchers suggested that because the Hadza squat and kneel and have high levels of movement when not at rest, they may have more consistent muscle activity throughout the day. This could reduce the health risks associated with sedentary behavior.

"Being a couch potato -- or even sitting in an office chair -- requires less muscle activity than squatting or kneeling," Raichlen said. "Since light levels of muscle activity require fuel, which generally means burning fats, then squatting and kneeling postures may not be as harmful as sitting in chairs."

In developed countries, humans spend inactive periods sitting on their duffs in chairs, recliners or sofas, so the only time they activate their leg muscles is when they bend their knees to slide into the seat. On average, people in more industrialized societies, including the United States and Europe, spend about nine hours per day sitting.

"Preferences or behaviors that conserve energy have been key to our species' evolutionary success," said Brian Wood, an anthropologist at the University of California, Los Angeles, who has worked with the Hazda people for 16 years. "But when environments change rapidly, these same preferences can lead to less optimal outcomes. Prolonged sitting is one example."

The scientists dubbed this the "Inactivity Mismatch Hypothesis."

"Replacing chair sitting and associated muscular inactivity with more sustained active rest postures may represent a behavioral paradigm that should be explored in future experimental work," they wrote. Resolving this inactivity mismatch with our evolutionary past could pay off in better health today.

"Squatting is not a likely alternative," Raichlen said, "but spending more time in postures that at least require some low-level muscle activity could be good for our health."

Credit: 
University of Southern California

Glucose acts as a double edged sword on longevity factor SIRT1

image: During starvation, the longevity factor SIRT1 regulates glucose production and fat breakdown in the liver to provide energy to the body. Upon feeding, glucose-dependent modification of SIRT1 (glycosylation) allows the liver to transition to a fed state, where it efficiently assimilates nutrients from food to build cellular reserves of fat and glycogen. Whereas excessive glucose-dependent glycosylation of SIRT1 in the liver results in obesity and aging, an inability to undergo glycosylation also causes failure to respond to nutrient cues, insulin resistance and hyper-inflammation. Thus, our study shows that both the addition and removal of this modification on SIRT1 are essential for the metabolic flexibility of the liver, and that this regulation has immense therapeutic potential.

Image: 
Babukrishna Maniyadath, Tandrika Chattopadhyay and Ullas Kolthur

Feeding and fasting cycles exert control over metabolism and energy utilization of organisms. Any aberration is known to cause metabolic diseases, liver dysfunctions and accelerated aging. Expression and activity of the anti-aging factor SIRT1 has long been known to be beneficial in mitigating diseases such as diabetes, cardiovascular dysfunctions, neurodegeneration, cancer and aging. Global efforts are underway to both uncover molecular mechanisms that affect feed-fast cycles and also to regulate the activity of the longevity factor SIRT1.

A novel discovery from TIFR shows that glucose acts as a double-edged sword in regulating the functions of SIRT1. These results have been published in the journal PNAS. The study found that a glucose-derived cellular metabolite acted as a molecular switch to regulate both the extent and timing of activity of the longevity factor, which affected gene expression and regulated metabolic flexibility in the liver. The study has immense therapeutic potential: while loss of SIRT1 is associated with obesity and aging, its over-activation resulted in perturbed liver functions, inflammation and a pre-diabetic-like state.

India has earned the dubious distinction of harbouring the world's fastest growing diabetic population with an estimated 72 million cases in 2017, a figure expected to almost double by 2025. Increasing evidence indicates that feed-fast cycles are important for physiological homeostasis and aberrant feeding and/or fasting leads to metabolic diseases including NAFLD (Non-alcoholic fatty liver disease), hyper-inflammation, and aging. This has a huge impact on public health and lifestyle-related disorders. The liver is one of the central metabolic organs: it plays a pivotal role in maintaining organismal health and lifespan, and regulates both fat and glucose metabolism, ensuring appropriate utilization of energy sources during normal fed-fast cycles. Thus, identifying the molecular mechanisms within liver cells that regulate the metabolic fitness of an organism becomes crucial.

The current study has discovered that glucose controls the functions of SIRT1: glucose (rather, its derivative) binds and modifies ('glycosylation') this factor and ultimately reduces its levels. While this reduction is required for metabolic oscillation during fed-fast cycles, sustained glycosylation, as seen in obesity and aging, abrogates the protective functions of SIRT1. Surprisingly, genetic manipulations that eliminate this control and result in overactivation of the longevity factor during feed-fast cycles were also detrimental to liver physiology and resulted in increased blood glucose levels, akin to a pre-diabetic state. Thus, it is exciting that both excessive modification by glucose (as in obesity and aging) and no modification (as in fasting) are detrimental to organismal physiology. Therefore, this study describes a glucose-derived modification of the longevity factor SIRT1 that keeps a check on its activity and functions, specifies downstream molecular signalling and fine-tunes gene expression in the liver.

This tuning is essential for metabolic flexibility in normal fed-fast cycles and during aging. Interestingly, this glucose-dependent modification regulated organismal glucose homeostasis itself by modulating insulin signalling, mitochondrial functions and fat metabolism. While there are efforts to find therapeutic activators for SIRT1, the study shows that both over-activation and under-activation of this longevity factor could lead to diseases. Hence, ways to regulate this modification might be beneficial in tackling lifestyle disorders and aging-related diseases.

Credit: 
Tata Institute of Fundamental Research

Viewership soars for misleading tobacco videos on YouTube

Misleading portrayals of the safety of tobacco use are widespread on YouTube, where the viewership of popular pro-tobacco videos has soared over the past half-dozen years, according to research by the Annenberg Public Policy Center (APPC) of the University of Pennsylvania.

In an article published today in the Harvard Kennedy School Misinformation Review, APPC researchers found that from 2013 to 2019, different kinds of popular tobacco-themed YouTube videos saw "dramatic increases in views per day, especially for tutorials about vaping products."

The research follows up on a 2013 content analysis done by APPC which identified five major categories of pro-tobacco videos on YouTube. For example, among instructional or "how-to" videos, the highest-performing video in 2013 was on how to use a pipe, with just over 62,000 total views or 47 views per day. But in 2019, the most-viewed instructional video was on "the art of vape," which had logged over 40 million total views or over 68,000 per day.

Another category is managing risk, in which videos claim that the risks of tobacco use can be managed by various fixes, without offering scientific evidence. In this category, the top-performing 2013 video concerned cigarette smoking, with 85,000 total views or 63 views per day. In 2019, the top-viewed video in this category was on vaping, which had over 3.5 million views or over 1,600 per day.

"The easy access of such [video] material suggests that YouTube is a fertile environment for the promotion of tobacco products despite its banning of tobacco advertising," the researchers said.

YouTube, tobacco videos, and adolescents

YouTube is the second most popular website for U.S. internet traffic, after Google, and reaches 85% of adolescents. That sweeping reach gives it the potential to influence the information that young people receive on many topics -- including products that are hazardous to their health, like tobacco. Although the use of cigarettes by young people has declined in recent years, vaping has sharply increased and emerged as the main alternative to cigarettes among adolescents.

"Although we have no direct evidence of the effect of pro-vaping videos, the rise of vaping among adolescents in the last few years has been accompanied by dramatic increases in viewership of vaping videos," said lead author Dan Romer, APPC's research director. Past APPC research has found that misleading YouTube videos promoting e-cigarettes and hookahs made young adults feel more positively about those products.

In 2013, APPC researchers used a set of keywords to locate pro-tobacco videos on YouTube, from which they used a pool of 200 randomly selected videos to create a taxonomy with five categories of pro-tobacco videos: demonstrations of "fun ways to use tobacco"; instructional or tutorial videos; managing risk; assertions that tobacco use is actually healthy; and assertions that the risks of tobacco use are no greater or are less than other life risks, without describing the risks of tobacco products.

Using keyword searches and YouTube's recommendation algorithm, the researchers found the most viewed video in each of the five categories in the taxonomy. In those categories, they used keyword searches and YouTube's recommendation algorithm again in 2019 to find the top-ranked videos at that time. In four of the five categories, the best-performing videos had more than a million views.

"This suggested to us that the misleading tobacco videos we identified on YouTube are part of the information environment that eludes the restrictions that apply to regular tobacco advertising and product promotion," said co-author Patrick E. Jamieson, director of APPC's Annenberg Health and Risk Communication Institute.

Eluding YouTube's tobacco ad ban -- and making a profit

Although YouTube has banned ads for tobacco products, both YouTube and the users who create tobacco videos are legally permitted to profit from them, Romer said. "One of the perverse consequences of this business model is that a video with misleading information about a harmful product such as tobacco can be a source of profit for both YouTube and the creator," he said. "But the information in the video will go unchallenged."

Currently, there is little incentive for YouTube to remove the videos. Under Section 230 of the Communications Decency Act (CDA) of 1996, YouTube and other social media platforms are not regarded as publishers of the content they host. As a result, "it is not clear who is responsible for protecting the public from misleading health content on the internet," the Annenberg researchers said:

"As our study of YouTube illustrates, producers of misleading tobacco content can primarily represent private individuals rather than tobacco manufacturers. Indeed, the producers of the tobacco videos we identified... do not appear to be employees of the tobacco industry. Although we did not see evidence of any connection to the industry, it is nevertheless possible that a content creator could receive endorsement payments from a tobacco company."

Romer added, "Clearly this consequence was not anticipated when the CDA was passed. Perhaps it is time to revisit this measure to more appropriately recognize the business model that has emerged in social media. All content is potentially monetized and the platforms should have to recognize their role in hosting it."

How to counter misleading information

One way to counteract misleading tobacco videos, the APPC team said, is to place corrective ads that counter the misinformation on the same page as the pro-tobacco videos. In an earlier study, APPC researchers found it was possible to counteract pro-tobacco videos on YouTube by showing a corrective message on the severity and scope of health risks associated with smoking.

In the current study, they also noted that other researchers have proposed that the government give incentives to social media platforms to modify their terms of service to allow them to remove misleading health information.

Credit: 
Annenberg Public Policy Center of the University of Pennsylvania

Cryo-EM reveals unexpected diversity of photosystems

image: While the monomer and trimer were known before, the mini-PSI, dimer, and specialised dimer of dimers reported in the current work expand our understanding of the diversity of photosynthetic mechanisms in nature.

Image: 
Ella Marushchenko

Oxygenic photosynthesis is the conversion of sunlight into chemical energy that underpins the survival of virtually all complex life forms. The energy conversion is driven by a photosynthetic apparatus that captures light photons in the bioenergetic membranes of cyanobacteria, algae and plants. Photosystem I is a central component of this process.

The current textbook paradigm of Photosystem I is a trimer architecture for cyanobacteria, and a monomer for algae. Two new discoveries from a collaboration of researchers from SciLifeLab with Tsinghua University and Tel Aviv University, reported in Nature Plants, find that the freshwater cyanobacterium Anabaena has adapted a specialized Photosystem I dimer of dimers with 476 pigments, while the green alga Dunaliella has optimized a minimal form of Photosystem I (mini-PSI) to live in hypersaline environments and under light stress. Annemarie Perez Boerema from the Alexey Amunts lab (Stockholm University, SciLifeLab) used cryo-EM to visualize these unusual forms of Photosystem I.

The first study, on the Photosystem I tetramer from Anabaena, revealed an increased surface area allowing enrichment of Photosystems in the bioenergetic membrane. This provides an advantage during maturation stages requiring nitrogenase activity. The second study revealed the mini-PSI from Dunaliella, the smallest complex of its kind identified to date. The scientists also report new energy pathways, pigment binding sites and phospholipids. Unlike all other known counterparts, the mini-PSI lacks the core protein components that would be involved in interactions with additional light-harvesting partners. This observation suggests a previously unknown regulatory mechanism that reduces the association of peripheral antennae for environmental acclimatisation.

Together, the two studies show that Photosystems can photosynthesise beyond the textbook description. The discovered configurations in diverged species can be considered an evolutionary prank that nature plays on occasion, which is promising news for researchers exploring fundamental questions in bioenergetics.

Credit: 
Science For Life Laboratory

New high-cost HIV prevention drug: 'Better' isn't worth it

BOSTON/NEW HAVEN - A newly approved drug for HIV pre-exposure prophylaxis (PrEP) is unlikely to confer any discernible health benefit over generic alternatives and may undermine efforts to expand access to HIV prevention for the nation's most vulnerable populations, according to a new study appearing today (March 9) in the Annals of Internal Medicine.

The study, led by researchers at Massachusetts General Hospital (MGH) and the Yale School of Public Health, is also being released today at the Conference on Retroviruses and Opportunistic Infections in Boston, where top researchers from around the world will be discussing the ongoing battle against HIV/AIDS and related infectious diseases. The Harvard University Center for AIDS Research and the National Alliance of State and Territorial AIDS Directors were also collaborators on the research.

PrEP, a pill taken once a day, reduces the risk of HIV infection via sex or injection drug use by up to 99 percent. Since 2012, there has been one FDA-approved PrEP formulation: the combination of tenofovir/emtricitabine (F/TDF), marketed by Gilead Sciences and sold under the brand name Truvada®. Patent protection for F/TDF is due to expire and the first generic version is expected in September 2020.

"F/TDF has a strong record of safety and efficacy," said Tim Horn, director, Medication Access and Pricing at the National Alliance of State & Territorial AIDS Directors and a study co-author. "The imminent arrival of a far cheaper, equally safe and effective, generic alternative is a golden opportunity to expand access to PrEP in some of the most difficult-to-reach segments of the at-risk population."

Complicating the roll-out of generic F/TDF is the arrival of a second PrEP agent: emtricitabine/tenofovir alafenamide (F/TAF), sold under the brand name Descovy® and also marketed by Gilead. F/TAF was approved by the FDA in October 2019 for men who have sex with men (MSM) and transgender women, based on evidence of its "non-inferior" efficacy and lower impact on markers of bone and renal safety. Anticipating the entry of a generic competitor, Gilead has been moving quickly to recommend that doctors switch patients to the new formulation, which it claims is considerably safer than F/TDF. Gilead's own projections are that it will succeed in transitioning as many as 45 percent of current patients on F/TDF for PrEP to branded F/TAF before F/TDF becomes generically available.

The study examined whether there was evidence to justify the rush to get patients to use the newly branded F/TAF. "How much is 'better' worth?" said lead author Rochelle P. Walensky, MD, MPH, chief, MGH Division of Infectious Diseases and a professor at the Harvard Medical School.

To answer that question, the researchers used data obtained from publicly available sources and recently completed clinical trials to evaluate the cost-effectiveness of F/TAF and to identify the highest possible price premium that branded F/TAF could command, even under the very best of circumstances, over generic F/TDF. To that end, the researchers intentionally overstated any adverse clinical and economic consequences of generic F/TDF, inflating rates of bone and renal disease incidence, assuming that all fractures would require surgical repair and that all cases of renal disease would require dialysis and be irreversible.

"Even when we cast branded F/TAF in the most favorable light possible, we found no plausible scenario under which F/TAF would be cost-effective compared to generic F/TDF, except perhaps for the vanishingly small number of persons with exceptionally high risk of bone or renal disease," Walensky said.

The researchers are quick to point out that while it is difficult to predict the price of generic F/TDF, it is unlikely to exceed $8,300 annually. Their analysis identifies a fair price markup over generic F/TDF of, at most, $670, suggesting that payors ought to be willing to pay no more than $8,970 for F/TAF, a price far below its current $16,600 market price.
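The pricing conclusion follows from simple arithmetic; a minimal sketch using only the annual dollar figures quoted above:

```python
# All dollar figures are the annual prices quoted in the article.
generic_ftdf = 8_300    # likely upper bound for generic F/TDF price
max_premium = 670       # largest justifiable markup found by the analysis
branded_ftaf = 16_600   # current market price of branded F/TAF

# The most payers should be willing to pay for branded F/TAF.
max_justified_price = generic_ftdf + max_premium
print(max_justified_price)            # 8970

# Gap between the market price and that ceiling.
excess = branded_ftaf - max_justified_price
print(excess)                         # 7630
```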

"In the presence of a generic F/TDF option, branded F/TAF's price cannot be justified by its modest benefits," said study senior author A. David Paltiel, professor of public health (health policy) at the Yale School of Public Health. "If branded F/TAF succeeds in driving out its generic competitor, PrEP expansion in the US could grind to a halt and the new drug could end up causing more avoidable HIV transmissions than it prevents."

Credit: 
Yale School of Public Health

Ancient shell shows days were half-hour shorter 70 million years ago

WASHINGTON--Earth turned faster at the end of the time of the dinosaurs than it does today, rotating 372 times a year, compared to the current 365, according to a new study of fossil mollusk shells from the late Cretaceous. This means a day lasted only 23 and a half hours, according to the new study in AGU's journal Paleoceanography and Paleoclimatology.
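The half-hour figure follows from simple arithmetic: the length of the year is fixed, so packing 372 days into it makes each day shorter. A quick check, using the modern 365-day, 24-hour year as the baseline:

```python
# A year's duration has stayed essentially constant; only its division into days changed.
modern_hours_per_year = 365 * 24      # 8,760 hours
cretaceous_days_per_year = 372        # daily growth rings counted per year in the shell

hours_per_day = modern_hours_per_year / cretaceous_days_per_year
print(round(hours_per_day, 2))        # 23.55 -- about 23 and a half hours

shortfall_minutes = (24 - hours_per_day) * 60
print(round(shortfall_minutes))       # 27 -- roughly half an hour shorter than today
```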

The ancient mollusk, from an extinct and wildly diverse group known as rudist clams, grew fast, laying down daily growth rings. The new study used lasers to sample minute slices of shell and count the growth rings more accurately than human researchers with microscopes.

The growth rings allowed the researchers to determine the number of days in a year and more accurately calculate the length of a day 70 million years ago. The new measurement informs models of how the Moon formed and how close to Earth it has been over the 4.5-billion-year history of the Earth-Moon gravitational dance.

The new study also found corroborating evidence that the mollusks harbored photosynthetic symbionts that may have fueled reef-building on the scale of modern-day corals.

The high resolution obtained in the new study combined with the fast growth rate of the ancient bivalves revealed unprecedented detail about how the animal lived and the water conditions it grew in, down to a fraction of a day.

"We have about four to five datapoints per day, and this is something that you almost never get in geological history. We can basically look at a day 70 million years ago. It's pretty amazing," said Niels de Winter, an analytical geochemist at Vrije Universiteit Brussel and the lead author of the new study.

Climate reconstructions of the deep past typically describe long term changes that occur on the scale of tens of thousands of years. Studies like this one give a glimpse of change on the timescale of living things and have the potential to bridge the gap between climate and weather models.

Chemical analysis of the shell indicates ocean temperatures were warmer in the Late Cretaceous than previously appreciated, reaching 40 degrees Celsius (104 degrees Fahrenheit) in summer and exceeding 30 degrees Celsius (86 degrees Fahrenheit) in winter. The summer high temperatures likely approached the physiological limits for mollusks, de Winter said.

"The high fidelity of this data-set has allowed the authors to draw two particularly interesting inferences that help to sharpen our understanding of both Cretaceous astrochronology and rudist palaeobiology," said Peter Skelton, a retired lecturer of palaeobiology at The Open University and a rudist expert unaffiliated with the new study.

Ancient reef-builders

The new study analyzed a single individual that lived for over nine years in a shallow seabed in the tropics--a location which is now, 70 million years later, dry land in the mountains of Oman.

Torreites sanchezi mollusks look like tall pint glasses with lids shaped like bear claw pastries. The ancient mollusks had two shells, or valves, that met in a hinge, like asymmetrical clams, and grew in dense reefs, like modern oysters. They thrived in water several degrees warmer worldwide than modern oceans.

In the late Cretaceous, rudists like T. sanchezi dominated the reef-building niche in tropical waters around the world, filling the role held by corals today. They disappeared in the same event that killed the non-avian dinosaurs 66 million years ago.

"Rudists are quite special bivalves. There's nothing like it living today," de Winter said. "In the late Cretaceous especially, worldwide most of the reef builders are these bivalves. So they really took on the ecosystem building role that the corals have nowadays."

The new method focused a laser on small bits of shell, making holes 10 micrometers in diameter, or about as wide as a red blood cell. Trace elements in these tiny samples reveal information about the temperature and chemistry of the water at the time the shell formed. The analysis provided accurate measurements of the width and number of daily growth rings as well as seasonal patterns. The researchers used seasonal variations in the fossilized shell to identify years.
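The counting logic can be illustrated with a toy simulation. The 372-ring answer is built into the synthetic data here, so this sketches only the method (count daily rings between successive seasonal peaks), not the actual measurement:

```python
import numpy as np

DAYS_PER_YEAR = 372   # the value the study recovered; used here to generate toy data
YEARS = 9             # the analyzed individual lived for over nine years

# Simulate daily ring widths: a baseline modulated by a smooth seasonal cycle.
day = np.arange(DAYS_PER_YEAR * YEARS)
ring_width = 1.0 + 0.5 * np.sin(2 * np.pi * day / DAYS_PER_YEAR)

# Identify years from the seasonal signal: local maxima mark the same season each year.
peaks = [i for i in range(1, len(ring_width) - 1)
         if ring_width[i] > ring_width[i - 1] and ring_width[i] >= ring_width[i + 1]]

# Days per year = number of rings between successive seasonal peaks.
rings_per_year = np.diff(peaks)
print(rings_per_year.tolist())        # every gap is 372 rings
```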

The new study found the composition of the shell changed more over the course of a day than over seasons, or with the cycles of ocean tides. The fine-scale resolution of the daily layers shows the shell grew much faster during the day than at night.

"This bivalve had a very strong dependence on this daily cycle, which suggests that it had photosymbionts," de Winter said. "You have the day-night rhythm of the light being recorded in the shell."

This result suggests daylight was more important to the lifestyle of the ancient mollusk than might be expected if it fed itself primarily by filtering food from the water, like modern day clams and oysters, according to the authors. De Winter said the mollusks likely had a relationship with an indwelling symbiotic species that fed on sunlight, similar to living giant clams, which harbor symbiotic algae.

"Until now, all published arguments for photosymbiosis in rudists have been essentially speculative, based on merely suggestive morphological traits, and in some cases were demonstrably erroneous. This paper is the first to provide convincing evidence in favor of the hypothesis," Skelton said, but cautioned that the new study's conclusion was specific to Torreites and could not be generalized to other rudists.

Moon retreat

De Winter's careful count of the number of daily layers found 372 for each yearly interval. This was not a surprise, because scientists know days were shorter in the past. The result is, however, the most accurate now available for the late Cretaceous, and has a surprising application to modeling the evolution of the Earth-Moon system.

The length of a year has been constant over Earth's history, because Earth's orbit around the Sun does not change. But the number of days within a year has been decreasing over time, because days have been growing longer: friction from ocean tides, raised by the Moon's gravity, steadily slows Earth's rotation.

The pull of the tides accelerates the Moon a little in its orbit, so as Earth's spin slows, the Moon moves farther away. The Moon is receding from Earth at 3.82 centimeters (1.5 inches) per year. Precise laser measurements of the Earth-Moon distance have demonstrated this increase since the Apollo program left reflectors on the Moon's surface.

But scientists conclude the Moon could not have been receding at this rate throughout its history. Tidal friction grows much stronger as the Moon draws closer, and extrapolating today's conditions back in time would put the Moon inside the Earth only 1.4 billion years ago. Scientists know from other evidence that the Moon has been with us much longer, most likely coalescing in the wake of a massive collision early in Earth's history, over 4.5 billion years ago. So the Moon's rate of retreat has changed over time, and information from the past, like a year in the life of an ancient clam, helps researchers reconstruct that history and model the formation of the Moon.

Because 70 million years is a blink in the Moon's history, de Winter and his colleagues hope to apply their new method to older fossils and catch snapshots of days even deeper in time.
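For scale, a constant-rate estimate shows how little the Moon has moved since the Cretaceous (constancy is an assumption the study cautions against over billions of years, but it is serviceable over 70 million; the 384,400 km mean Earth-Moon distance is a standard figure, not from the article):

```python
RETREAT_CM_PER_YEAR = 3.82     # current lunar retreat rate, from laser ranging
EARTH_MOON_KM = 384_400        # present mean Earth-Moon distance (standard value)

years = 70_000_000             # age of the Torreites sanchezi shell

retreat_km = RETREAT_CM_PER_YEAR * years / 100_000   # convert cm to km
print(round(retreat_km))                             # 2674 km

fraction = retreat_km / EARTH_MOON_KM
print(f"{fraction:.1%}")                             # 0.7% of today's distance
```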

Credit: 
American Geophysical Union

Show me the methane

Though not as prevalent in the atmosphere as carbon dioxide, methane is a far more potent greenhouse gas. Emitted by natural as well as manmade sources, methane is much shorter-lived than CO2, but it acts fast and is 20 to 80 times as effective at trapping heat. A little extra methane goes a long way.

In addition, methane is invisible, which makes detection by conventional means difficult. So when UC Santa Barbara researcher Satish Kumar and colleagues noted the growing use of infrared sensing as a means of greenhouse gas detection, as was highlighted in a recent New York Times story, they were pleased. The interactive piece used infrared cameras to track emissions from oil and gas facilities in the Permian Basin, an oil field located in Texas and New Mexico.

It's a topic close to his heart -- as a member of electrical and computer engineering professor B.S. Manjunath's Vision Research Lab, Kumar does work involving multimedia signal processing and analysis.

"As a computer engineer interested in environmental management, I am incredibly glad methane leaks from previously unknown sources are being brought to light," he said.

Now, to keep the conversation alive, Kumar and his colleagues have proposed a system that does the heat detection one better, by using hyperspectral imaging and machine learning to detect the specific wavelength of methane emissions. Their work was presented at the 2020 IEEE Winter Conference on the Applications of Computer Vision.

"Infrared cameras only detect temperature signatures, so if there is a combination of gases with high temperature signatures, an infrared camera will not be able to differentiate between them," Kumar said. An infrared image might point to a suggestion of methane, but its concentration and its location couldn't be pinpointed by heat signature alone. In addition, the farther a hot gas travels from its source, the cooler it gets, eventually making it invisible to infrared.

To overcome these shortcomings, Kumar and team used data from hyperspectral cameras at wavelengths from 400 nanometers to 2,510 nm -- a range that encompasses methane's spectral signature as well as those of other gases -- in areas around the Four Corners region. Located in the American Southwest, the region is the site of what could be the largest source of methane release in the United States, particularly the San Juan Basin, shared by New Mexico and Colorado.

Hyperspectral imaging involves the collection of a series of images, in which each pixel contains a spectrum and each image represents a spectral band (a range of wavelengths). Its high sensitivity allows it to capture spectral "fingerprints" that correspond to certain materials, such as methane's 2,200-2,400 nm wavelengths, which allowed the researchers to locate methane, even in a plume of other gases.
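The band-selection idea can be sketched in a few lines. This is not the authors' pipeline (they train a deep learning model on plume shapes); it is a simplified, hypothetical band-depth test on a synthetic hyperspectral cube, using the wavelength ranges cited above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperspectral cube: (rows, cols, bands), one reflectance per band.
wavelengths = np.linspace(400, 2510, 200)            # nm, the range cited in the article
cube = rng.uniform(0.4, 0.6, size=(64, 64, wavelengths.size))

# Methane absorbs near 2,200-2,400 nm; simulate a plume as a dip in those bands.
ch4_bands = (wavelengths >= 2200) & (wavelengths <= 2400)
cube[20:30, 20:30, ch4_bands] *= 0.5                 # synthetic absorption feature

# Band-depth index: reflectance in the methane window relative to a nearby
# reference window. A lower ratio means stronger absorption.
ref_bands = (wavelengths >= 2000) & (wavelengths <= 2150)
index = cube[:, :, ch4_bands].mean(axis=2) / cube[:, :, ref_bands].mean(axis=2)

plume_mask = index < 0.8                             # crude detection threshold
print(plume_mask[20:30, 20:30].all())                # True: plume pixels flagged
```

A real scene contains other hydrocarbons with similar signatures, which is why a fixed threshold like this would not suffice on its own.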

But methane isn't the only material that absorbs at those wavelengths.

"There are many confusers to methane," Kumar said. "The hydrocarbons from roads and paints on buildings, they have the same signature as methane." The sheer amount of data and the potential for confusion between methane and other hydrocarbons led the researchers to turn to machine learning.

"We used a deep learning model to train the computer to learn the shape that a methane gas leak takes as it is released and spreads," he explained. This helped the researchers not only to pinpoint the location from which methane was being emitted, whether from gas plant or landfill, but also to automatically differentiate between methane and other hydrocarbons in the same image.

Using this method, the researchers report an 87% success rate in the accurate detection of methane leaks, more of which continue to be discovered from a variety of manmade sources. These include fugitive emissions from incomplete flaring, previously undetected leaks from poorly monitored operations, and the cumulative methane leaks from homes, businesses and urban infrastructure.

Credit: 
University of California - Santa Barbara

Tax incentives for businesses could contribute to the decline of the middle class

image: New research from a West Virginia University economist shows that tax incentives for businesses could do more harm than good, especially for the working class.

Image: 
Brian Persinger/West Virginia University

A corporation announces it's seeking bids from local governments to build a new warehouse or move its headquarters.

Policymakers tend to swoop in with this mindset: Let's entice that company with tax breaks to set up shop here. It'll create new jobs and enhance the overall health of our economy.

But economic development incentives may do more harm than good, especially for middle-class workers, according to new West Virginia University research.

"Incentivizing the wrong industries can fail to lead to growth in employment locally in those industries and our research suggests this may potentially be contributing to the decline of middle-class jobs," said Heather Stephens, assistant professor of resource economics and management in the Davis College of Agriculture, Natural Resources and Design. "For this research, we examined what happens to the mix of jobs within communities when you incentivize certain industries."

Stephens co-authored "Incentivizing the Missing Middle: The Role of Economic Development Policy" with Carlianne Patrick, of Georgia State University. Their findings are published in Economic Development Quarterly.

The researchers drew on several primary data sources, including data on taxes and incentives from the W. E. Upjohn Institute for Employment Research, an occupation-based typology from the University of Toronto and detailed industry-level wage and employment data from EMSI, Inc.

From the data, they could break down the incentivized industry classes and their impact on jobs. For instance, do economic development incentives -- such as property tax abatements and job tax credits -- for industries that contain a lot of middle-class jobs affect jobs only within those industries? Or could the impact trickle over to higher- or lower-wage industries?

Given that there is no single definition of "middle class," the researchers use two: one that classifies industries by occupation as creative class, working class (middle class) or service class; and one that classifies industries by average wage as high wage, middle wage and low wage.

The creative class can include a wide range of occupations in science, engineering, education, computer programming, business and the legal sector. These job types require high levels of cognitive and problem-solving skills. Working class occupations rely more on manual labor and skills, such as in the construction, factory and trade settings.

"This is the first time anyone's tried to look at the distributional impact," Stephens said. "In the past, we've seen studies on whether economic incentives lead to overall job growth. While incentives can lead to new jobs, those could be only low-wage jobs. Yet it still looks like you have more employment.

"Or you could be giving away substantial tax dollars to create a few high-wage jobs, but you might see overall employment losses."

A tale of two cities

Comparing employment trends in San Antonio, Texas, to Birmingham, Alabama, shows anecdotal evidence of how these incentives play out in the United States.

From 2000 to 2015, the employment index rose steadily for San Antonio. During that same time, the city shifted tax incentives from creative-class industries to working-class industries, Stephens said.

"In the early 2000s, San Antonio was incentivizing industries in the creative class more, like your Amazon-types," Stephens said. "Then they switched, and by 2015, working-class industries faced more incentives. This seems to have paid off."

On the opposite end of the spectrum, you have Birmingham. At the start of the decade, the city incentivized the working class more but has since focused its tax breaks more on the creative class. There, total employment dropped, according to Stephens' findings.

"It's an anecdote, but it's an accurate representation of the bigger picture," she said.

'Giving away the farm'

Though middle-wage jobs comprise 60 percent of employment in the country, high-wage industries face the lowest tax rates (or are incentivized more) - and the gap has been increasing, according to Stephens.

Her findings reveal that raising taxes on creative-class and high-wage industries does not lead to negative employment effects for any class of industries, and doing so actually increases employment among the working and middle classes.

She hopes policymakers take note as state and local governments continue to offer tax incentives.

"Policymakers really need to think about whether these incentives are paying off," Stephens said. "Not just whether they're creating jobs--but are they inadvertently contributing to the decline of the middle class, which they say they want to support?"

She referenced Amazon as an example. In 2019, the Virginia Senate approved tax incentives of up to $750 million for Amazon to build a facility in Arlington. More recently, in February, policymakers in Delaware signed off on plans to give Amazon $4.5 million in subsidies for a warehouse.

"Our research suggests there's no reason Virginia should have given Amazon all of those tax breaks because they're going to have the same amount of high-end jobs before and after," she said. "If anything, it could be that Virginia may see a decline in middle class jobs afterwards. And they've just given away their tax revenues. It's like giving away the farm."

Credit: 
West Virginia University

Examining diagnoses of stress-related disorders, risk of neurodegenerative diseases

What The Study Did: Researchers investigated how stress-related disorders (such as posttraumatic stress disorder, adjustment disorder and stress reactions) were associated with risk for neurodegenerative diseases, including Alzheimer and Parkinson disease and amyotrophic lateral sclerosis (ALS), using data from national health registers in Sweden.

Authors: Huan Song, M.D., Ph.D., of Sichuan University in Chengdu, China, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/ 

(doi:10.1001/jamaneurol.2020.0117)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network