
Blocking cellular communication stops SARS-CoV-2

Phosphate groups play an important biochemical role in the transmission of signals within the cell - signals that, for example, stimulate cell growth or trigger metabolic processes. Phosphate groups are attached to or removed from proteins to control their activity. In this process, each modified protein triggers the next, and the signal is passed along a signaling cascade. The target is usually the cell nucleus, where genes are switched on or off.

For the first time, biochemists and virologists from Goethe University have succeeded in documenting the full picture of the communication pathways in a human cell infected with SARS-CoV-2 and observing what changes the infection triggers. To do so, they analyzed all proteins carrying a phosphate group at a given moment in time - what is known as the phosphoproteome. The result: SARS-CoV-2 evidently relies above all on those signaling pathways of the host cell that transmit growth signals into the cell from outside. If these signaling pathways are interrupted, the virus is no longer able to replicate.

Dr. Christian Münch from the Institute of Biochemistry II at Goethe University explains: "The signaling pathways of the growth factors can be blocked precisely at the point where the signal from outside the cell docks onto a signal receiver - a growth factor receptor. There are, however, a number of very effective cancer drugs that interrupt growth factor signaling pathways slightly further down the cascade, through which the signals of different growth factor receptors are blocked. We've tested five of these substances on our cells, and all five led to a complete stop of SARS-CoV-2 replication."

Professor Jindrich Cinatl from the Institute of Medical Virology at University Hospital Frankfurt says: "We conducted our experiments on cultivated cells in the laboratory. This means that the results cannot be transferred to humans without further tests. However, from trials with other infectious viruses we know that viruses often alter signaling pathways in their human host cells and that this is important for virus replication. At the same time, already approved drugs have a gigantic lead in terms of development so that it would be possible - on the basis of our results and just a few more experiments - to start clinical studies very quickly."

Via INNOVECTIS, the researchers have patented their method of interrupting signaling pathways by means of specific inhibitors in order to treat COVID-19. INNOVECTIS was founded in 2000 as a subsidiary of Goethe University and has operated successfully since then as a service provider in the transfer of academic know-how into business practice.

Credit: 
Goethe University Frankfurt

New insights into lithium-ion battery failure mechanism

image: Coin batteries similar to those used in this study.

Image: 
University of Cambridge

Researchers have identified a potential new degradation mechanism for electric vehicle batteries - a key step to designing effective methods to improve battery lifespan.

The researchers, from the Universities of Cambridge and Liverpool, and the Diamond Light Source, have identified one of the reasons why state-of-the-art 'nickel-rich' battery materials become fatigued, and can no longer be fully charged after prolonged use.

Their results, reported in the journal Nature Materials, open the door to the development of new strategies to improve battery lifespans.

As part of efforts to combat climate change, many countries have announced ambitious plans to replace petrol or diesel vehicles with electric vehicles (EVs) by 2050 or earlier.

The lithium-ion batteries used by EVs are likely to dominate the EV market for the foreseeable future, and nickel-rich lithium transition-metal oxides are the state-of-the-art choice for the positive electrode, or cathode, in these batteries.

Currently, most EV batteries contain significant amounts of cobalt in their cathode materials. However, cobalt can cause severe environmental damage, so researchers have been looking to replace it with nickel, which also offers higher practical capacities than cobalt. However, nickel-rich materials degrade much faster than existing technology and require additional study to be commercially viable for applications such as EVs.

"Unlike consumable electronics which typically have lifetimes of only a few years, vehicles are expected to last much longer and therefore it is essential to increase the lifetime of an EV battery," said Dr Chao Xu from Cambridge's Department of Chemistry, and the first author of the article. "That's why a comprehensive, in-depth understanding of how they work and why they fail over a long time is crucial to improving their performance."

To monitor the changes in the battery materials in real time over several months of battery testing, the researchers used laser technology to design a new coin cell, also known as a button cell. "This design offers a new possibility of studying degradation mechanisms over a long period of cycling for many battery chemistries," said Xu. During the study, the researchers found that a proportion of the cathode material becomes fatigued after repeated charging and discharging of the cell, and that the amount of fatigued material increases as cycling continues.

Xu and his colleagues dived deep into the structure of the material at the atomic scale to seek answers as to why this fatigue process occurs. "In order to fully function, battery materials need to expand and shrink as the lithium ions move in and out," said Xu. "However, after prolonged use, we found that the atoms at the surface of the material had rearranged to form new structures that are no longer able to store energy."

Worse, these areas of reconstructed surface apparently act like stakes that pin the rest of the material in place, preventing the contraction that is required to reach the fully charged state. As a result, the lithium remains stuck in the lattice and the fatigued material can hold less charge.

With this knowledge, the researchers are now seeking effective countermeasures, such as protective coatings and functional electrolyte additives, to mitigate this degradation process and extend the lifetime of such batteries.

Credit: 
University of Cambridge

Study revealing structure of a protein complex may open doors to better disease research

image: An artistic representation of the Arp2/3 complex structure in an inactive state (top left) and active state with nucleated filament (bottom right). With a cryo-EM micrograph as the background, it shows actin filaments nucleated from the Arp2/3 complex.

Image: 
Stony Brook University

STONY BROOK, NY, August 25, 2020 - More than two decades ago, scientists discovered the Arp2/3 complex, an actin (cellular protein) cytoskeletal nucleator that plays a crucial role in cell division, immune response, neurodevelopment and other biological processes. But until now, no structure of the activated state of the complex had been determined, an achievement that may lay the foundation for uncovering its role in biology and in the development of disease. Researchers at Stony Brook University led by Saikat Chowdhury, PhD, determined the structure of the activated Arp2/3 complex and describe it in a paper published in Nature Structural and Molecular Biology.

Chowdhury and graduate student Mohammed Shaaban, the paper's first author, determined the first near-atomic resolution structure of Arp2/3 in its active state using cryo-electron microscopy. The structure shows the complex in its active form, bound to a signaling molecule. It also shows the nucleated actin filament, providing a structural snapshot of the global and local conformational changes in the Arp2/3 complex that help grow new actin filaments in cells.

"Obtaining the macromolecular structure of activated Arp2/3 complex has been a long-standing goal for scientists," says Chowdhury, senior author and an Assistant Professor in the Department of Biochemistry and Cell Biology in the College of Arts and Sciences at Stony Brook University. "Our structure reveals a level of molecular details which show the individual components of the complex and how they are positioned relative to each other in the active state."

Having a structure of Arp2/3 in its active state will help drive more detailed research of the complex. Chowdhury explains that this is extremely important because dysregulation of Arp2/3 is associated with cancer metastasis, neurodegeneration, bacterial and viral infections, and problems with wound healing.

"So not only does this structure enable us to fill a knowledge gap in the actin cell biology field, it potentially helps to build our understanding of the underlying causes of a number of diseases with the ultimate goal of developing new therapeutics," emphasizes Chowdhury.

Determining the Arp2/3 structure in its activated state required the researchers to use technologies available in Stony Brook University's Cryo-Electron Microscopy Facility, a center supported by the National Institutes of Health (NIH), and the High Performance Computing capabilities in the Division of Information Technology.

Chowdhury is also affiliated with the Institute of Engineering Driven Medicine and Institute of Chemical Biology & Drug Discovery at Stony Brook University, as well as an affiliated scientist at Brookhaven National Laboratory. The research was done in collaboration with Brad Nolen at the University of Oregon and supported in part by the NIH and SUNY, Stony Brook University.

Credit: 
Stony Brook University

Spurring our understanding

(Santa Barbara, Calif.) -- Once in a while, over the history of life, a new trait evolves that leads to an explosion of diversity in a group of organisms. Take wings, for instance. Every group of animals that evolved them has spun off into a host of different species -- birds, bats, insects and pterosaurs. Scientists call these "key innovations."

Understanding the development of key innovations is critical to understanding the evolution of the amazing array of organisms on Earth. Most of these happened deep in the distant past, making them difficult to study from a genetic perspective. Fortunately, one group of plants has acquired just such a trait in the past few million years.

Columbines, with their elegant nectar spurs, promise scientists an opportunity to investigate the genetic changes that underpin a key innovation. After much research, UC Santa Barbara professor Scott Hodges, research associate Evangeline Ballerini, and their coauthors at Harvard University have identified a gene critical to the development of these structures. And to their knowledge, this is among the first key innovations for which a critical developmental gene has been identified. Their findings appear in the journal PNAS.

The researchers named the gene after Gregg Popovich, head coach of the San Antonio Spurs basketball team. "This gene is a transcription factor, which means it controls spur development in columbines by regulating the activity of other genes," explained Ballerini. "So I chose the name POPOVICH because as coach, Popovich controls San Antonio Spurs development, in a sense, by regulating the activity of his players."

The evolution of spurs in columbines' ancestors seems to have led to rapid expansion in the genus. Around 70 species evolved over the past 5 to 7 million years, compared to its spurless sister genus, which counts only four species among its members.

And columbines aren't the only flowers with spurs. The trait evolved independently in many different plants, including nasturtiums, larkspurs and impatiens. "And in each of those groups, the ones that have spurs have far more species than their closest relatives that don't have spurs," said Hodges.

"We think that diversity is linked to the evolution of this spur because the spur produces nectar, which attracts animal pollinators," Ballerini said. Changing the length or shape of the spur changes the animals that can pollinate the flower. "Bees are only moving pollen between bee flowers, hummingbirds are only moving pollen between hummingbird flowers, so you're not exchanging genes between those two different populations." Eventually, the two can split into different species.

The question the researchers were trying to answer was how innovations like these develop in the first place. "If we can find genes that are important in the development of a key innovation, that will help us understand this kind of process," said Hodges.

"In most of these cases -- like in the wing example with birds, bats and insects -- those evolved so long ago that it's hard to find a particular gene that was critical for evolving that trait," he added. "Here we have a fairly recent origin of a key innovation, only 5 to 7 million years ago, and it's a fairly simple trait, so it's a little more straightforward."

Finding POPOVICH

Since columbines evolved so recently, most of them can form fertile hybrids with each other. In the 1950s and '60s, a Polish geneticist crossed a spurless species -- appropriately named the spurless columbine -- with its spurred cousins. She found that in the first generation of offspring all had spurs, but self-pollinating these yielded a second generation where spurlessness reappeared in a quarter of the plants.

That ratio was crucial to Hodges and Ballerini's work some half a century later. This simple fraction suggested that a single gene controlled the development of spurs. But columbines have roughly 30,000 genes, and only one of them was the gene they were looking for.
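The logic behind that inference is standard Mendelian genetics; as a brief worked sketch (textbook reasoning, not calculations from the paper), with S the dominant spurred allele and s the recessive spurless allele:

```latex
% P generation:  SS (spurred) x ss (spurless)  -->  F1: all Ss, all spurred
% F2 generation: Ss x Ss gives genotypes in the ratio 1 SS : 2 Ss : 1 ss, so
\[
P(\text{spurless in F2}) \;=\; P(ss) \;=\; \tfrac{1}{2}\cdot\tfrac{1}{2} \;=\; \tfrac{1}{4},
\]
% exactly the one-in-four reappearance of spurlessness that points to a single gene.
```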

Following in the footsteps of his predecessor, Hodges also crossed the spurless columbine with a spurred species, and then self-pollinated the offspring. But unlike in the previous experiment, Ballerini and Hodges now had the tools to search the plants' genetic code.

Ballerini sequenced the genome of each of the nearly 300 second generation plants and looked for instances in which the spurless plants had inherited two copies from their spurless grandparent. This narrowed the search to around 1,100 genes on one of the plants' chromosomes.
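The genotype filter itself is conceptually simple. Here is a minimal sketch of that kind of scan in Python, with an invented data layout and toy labels (the study's actual pipeline is not described in this release):

```python
import numpy as np

# Hypothetical F2 genotype matrix: rows are plants, columns are markers on one
# chromosome. Encoding: 0 = two spurred-grandparent copies, 1 = heterozygous,
# 2 = two spurless-grandparent copies.
rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(300, 5000))   # ~300 plants, toy marker count
spurless = rng.random(300) < 0.25                  # toy phenotype labels (1 in 4)

# Around a single recessive causal locus, every spurless plant should carry two
# spurless-grandparent copies, so keep only markers matching that pattern.
candidate_markers = np.flatnonzero(np.all(genotypes[spurless] == 2, axis=0))
print(f"{candidate_markers.size} candidate markers remain")
```

With real, linked markers the filter collapses the search to one chromosomal region, which is how a scan of roughly 30,000 genes narrows to about 1,100.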

Still, 1,100 genes are a lot to sort through. "There was no guarantee that these methods would lead us to the gene we were looking for," Ballerini said. "There was definitely quite a bit of work that went into all of the experiments and analyses, but in the end there was a bit of luck too."

Ballerini examined the expression of genes during five stages of early petal development in the spurless columbine and three other spurred species. She sequenced all the genes that were turned on in each stage and looked for consistent differences between the spurless and spurred plants. Eventually, with input from one of her collaborators at Harvard, Ballerini suspected she had identified the right gene. It was always turned off in the spurless species, turned on in the spurred species and was one of the 1,100 genes previously identified as associated with spurless flowers in the genetic cross. Now it was time to test her hypothesis.

She used a genetically modified virus to knock down the expression of the gene in question as well as a gene critical for producing red pigment. This way they could tell which petals were affected just by looking at the color.

Wherever POPOVICH was sidelined, the flowers developed diminutive spurs. But spur length depends on both the number and the size of cells. So the researchers worked with collaborators to count the number and measure the length of the cells making up these diminutive spurs.

"The longer spurs had more cells, and the shorter spurs had fewer cells," Hodges noted. "So the gene must have been acting by affecting how many cells were produced."

Ballerini remembers sitting in her office after finishing her final analyses. She began throwing out potential gene names to graduate student Zac Cabin, a fellow sports enthusiast. "At the same time Zac and I turned to each other and both said 'POPOVICH!'" she recalled. The name seemed a perfect fit. "And it leaves open the possibility that, if we identify other genes at play in spur development, we can name them after some of the players on the Spurs."

A path to new discoveries

While identifying POPOVICH is certainly an achievement, the true value of the discovery lies in what it reveals about the evolution of key innovations. Before this work, none of the plant groups that had well-known genomes also made spurs. "We had no idea where to start," said Hodges. "This discovery provides us a foothold."

"Once we identify one gene -- like this gene, which seems to be key in the process of forming spurs -- then we can start to figure out all of the components," he added. The team can now begin investigating which genes POPOVICH regulates, and which genes regulate POPOVICH. "This is a place to start to understand the whole system."

While the researchers don't know how POPOVICH functions in other groups of plants, it appears to influence the number of leaflets that grow on bur clovers. Columbines also express the gene in their leaves; perhaps it was recruited from the leaves into petal development, Ballerini suggested.

Novel adaptations don't appear out of nowhere, she explained. "When you're evolving a new structure, usually you're not evolving a whole brand new gene." Generally, organisms repurpose or add a purpose to an existing gene.

The authors are also interested in identifying genes involved in the second phase of spur formation: the elongation of the cells in the spur cup.

"These are things that we will want to do now that we've identified this gene," Hodges said. "And since it's a transcription factor, it must have particular genes that it's affecting. The next logical step would be to identify the targets of this gene, and that would tell us a lot more about how it functions."

The researchers expressed their gratitude toward Harvey Karp, who generously funded the Karp Discovery Award that made their research possible. "We really couldn't have done this project without it," Ballerini said.

Credit: 
University of California - Santa Barbara

Study identifies first step to beating water scarcity

New research has revealed the locations and industries in the USA where efforts to improve water consumption would have the greatest benefit for economic activity and the environment.

The study, led by researchers from Virginia Tech, used a spatially detailed database of water productivity to set realistic benchmarks for more than 400 industries and products. It is published today in the IOP Publishing journal Environmental Research Letters.

Lead author Dr Landon Marston, from Virginia Tech, said: "Nearly one-sixth of U.S. river basins cannot consistently meet society's water demands while also providing sufficient water for the environment. Water scarcity is expected to intensify and spread as populations increase, new water demands emerge, and climate changes.

"However, improving water productivity by meeting realistic benchmarks for all water users could enable US communities to expand economic activity and improve environmental flows. We asked ourselves the questions: if water productivity is improved across the US economy, how much water can be saved and in which industries and locations?' Our study is the first attempt to answer this question on a nationwide scale, and develop benchmarks to inform future action."

Using their data, the research team looked at how much water could be saved, and how much water productivity (production or dollars earned per unit of water consumed) could be improved, by bringing all users up to a target benchmark, such as the 50th percentile (median productivity), the 25th percentile (high productivity), or the 10th percentile (outstanding productivity).
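The arithmetic of the benchmarking exercise can be illustrated in a few lines. The sketch below uses made-up numbers and a simplifying assumption (output held fixed while water use falls); it is not the study's database or code:

```python
import numpy as np

def water_saved(water_use, output, percentile):
    """Water saved if every user below the benchmark productivity
    (output per unit of water) were raised to that benchmark."""
    productivity = output / water_use
    # "10th percentile" in the study's sense = productivity beaten by only 10%
    # of peers, i.e. the 90th percentile of the productivity distribution.
    benchmark = np.percentile(productivity, 100 - percentile)
    improved_use = np.where(productivity < benchmark, output / benchmark, water_use)
    return water_use.sum() - improved_use.sum()

# Toy industry: eight users with differing water productivity.
water_use = np.array([10.0, 40.0, 25.0, 60.0, 15.0, 30.0, 55.0, 20.0])
output = np.array([50.0, 80.0, 90.0, 70.0, 60.0, 90.0, 66.0, 88.0])

for p in (50, 25, 10):
    print(f"{p}th-percentile benchmark saves {water_saved(water_use, output, p):.1f} units")
```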

Co-author Dr Kyle Davis, from the University of Delaware, said: "One of the good things about a benchmarking approach is that it is not prescriptive in the practices or technologies used to reduce water consumption. Instead, it enables individuals and companies to select from a portfolio of strategies, tailored to the constraints and opportunities they face in their businesses and geographic or climatic context.

"The benchmarks we've developed represent actual water productivities achieved by a water user's regional industry peers and are therefore realistically achievable in most cases. Our study provides an upper bound of potential water savings, because we recognise that financial and regulatory barriers may prevent some water users from attaining water productivities achieved by their peers."

The study also found that some of the most water stressed areas in the US West and South have the greatest potential for water savings, with around half the savings obtained by improving water productivity in the production of corn, cotton, and alfalfa.

Dr Marston added: "By incorporating benchmark-meeting water savings within a national hydrological model, we show that depletion of river flows across Western US regions can be reduced on average by 6.2 to 23.2 per cent, without reducing economic production.

"We also identified the US industries and locations that can make the biggest impact by working with their suppliers to reduce water use 'upstream' in their supply chain. The agriculture and manufacturing sectors have the largest indirect water footprint, due to their reliance on water-intensive inputs, but these sectors also show the greatest capacity to reduce water consumption throughout their supply chains.

"Our study is an important first step towards understanding locations and industries where improved water productivity has the greatest potential to conserve water. Meeting the direct and indirect water demands of a growing population while providing enough water to meet local environmental flow requirements will be a key challenge in the coming decades. Improving water productivity will be critical in meeting this challenge, by putting water to more economically-beneficial uses, reducing unsustainable water use, and making water available for other uses, including the environment."

Credit: 
IOP Publishing

Army robo-teammate can detect, share 3D changes in real-time

image: The two robots used in the experiments are identically equipped, with the exception of Velodyne VLP-16 LiDAR (left) and Ouster OS1 LiDAR (right).

Image: 
U.S. Army photo

ADELPHI, Md. -- Something is different, and you can't quite put your finger on it. But your robot can.

Even small changes in your surroundings could indicate danger. Imagine a robot could detect those changes, and a warning could immediately alert you through a display in your eyeglasses. That is what U.S. Army scientists are developing with sensors, robots, real-time change detection and augmented reality wearables.

Army researchers demonstrated, in a real-world environment, the first human-robot team in which the robot detects physical changes in 3D and shares that information with a human in real time through augmented reality; the human can then evaluate the information received and decide on follow-on action.

"This could let robots inform their Soldier teammates of changes in the environment that might be overlooked by or not perceptible to the Soldier, giving them increased situational awareness and offset from potential adversaries," said Dr. Christopher Reardon, a researcher at the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "This could detect anything from camouflaged enemy soldiers to IEDs."

Part of the lab's effort in contextual understanding through the Artificial Intelligence for Mobility and Maneuver Essential Research Program, this research explores how to provide contextual awareness to autonomous robotic ground platforms in maneuver and mobility scenarios. Researchers also participate with international coalition partners in the Technical Cooperation Program's Contested Urban Environment Strategic Challenge, or TTCP CUESC, events to test and evaluate human-robot teaming technologies.

Most academic research in the use of mixed reality interfaces for human-robot teaming does not enter real-world environments, but rather uses external instrumentation in a lab to manage the calculations necessary to share information between a human and robot. Likewise, most engineering efforts to provide humans with mixed-reality interfaces do not examine teaming with autonomous mobile robots, Reardon said.

Reardon and his colleagues from the Army and the University of California, San Diego, published their research, Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection, at the 12th International Conference on Virtual, Augmented, and Mixed Reality, part of the International Conference on Human-Computer Interaction.

The research paired a human teammate wearing augmented reality glasses with a small autonomous mobile ground robot equipped with laser-ranging sensors, known as LIDAR, which the robot used to build a representation of the environment. As the robot patrolled the environment, it compared its current and previous readings to detect changes. Those changes were then instantly displayed in the human's eyewear to determine whether the human could interpret them.
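The core comparison is a point-cloud differencing step. A toy version in Python follows, using a simple nearest-neighbor test (an assumption for illustration; the paper's actual change-detection algorithm is not detailed in this release):

```python
import numpy as np
from scipy.spatial import cKDTree

def detect_changes(reference_cloud, current_cloud, threshold=0.1):
    """Return points in the current LIDAR scan that lie farther than
    `threshold` (in scene units) from every point in the reference scan."""
    tree = cKDTree(reference_cloud)
    distances, _ = tree.query(current_cloud)     # nearest reference point per point
    return current_cloud[distances > threshold]  # candidate 'changed' points

# Toy scans: a flat patch of ground, then the same patch plus a new object.
rng = np.random.default_rng(1)
ground = rng.uniform(0, 10, size=(2000, 3)) * np.array([1.0, 1.0, 0.02])
new_object = rng.normal(loc=[5.0, 5.0, 0.5], scale=0.05, size=(50, 3))
changes = detect_changes(ground, np.vstack([ground, new_object]))
print(f"{len(changes)} changed points detected")  # ~50, the new object
```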

In studying communication between the robot and human team, the researchers tested LIDAR sensors of different resolutions on the robot to collect measurements of the environment and detect changes. When those changes were shared with the human using augmented reality, the researchers found that human teammates could interpret changes that even the lower-resolution LIDARs detected. This indicates that--depending on the size of the changes the robot is expected to encounter--lighter, smaller and less expensive sensors could perform just as well, and run faster in the process.

This capability has the potential to be incorporated into future Soldier mixed-reality interfaces such as the Army's Integrated Visual Augmentation System goggles, or IVAS.

"Incorporating mixed reality into Soldiers' eye protection is inevitable," Reardon said. "This research aims to fill gaps by incorporating useful information from robot teammates into the Soldier-worn visual augmentation ecosystem, while simultaneously making the robots better teammates to the Soldier."

Future studies will continue to explore how to strengthen the teaming between humans and autonomous agents by allowing the human to interact with the detected changes, which will provide more information to the robot about the context of the change--for example, changes made by adversaries versus natural environmental changes or false positives, Reardon said. This will improve the autonomous context understanding and reasoning capabilities of the robotic platform, such as by enabling the robot to learn and predict what types of changes constitute a threat. In turn, providing this understanding to autonomy will help researchers learn how to improve teaming of Soldiers with autonomous platforms.

Credit: 
U.S. Army Research Laboratory

Primary care at a crossroads: Experts call for change

The old adage about a frog that gets put in a pot of cold water on a stove, and doesn't leap out even as the heat slowly climbs to boiling, might seem like a strange metaphor for primary care.
But for many primary care providers around the country, it might feel like an increasingly apt one.

And they feel like the frogs.

The heat that has gradually increased under them in recent years came from a range of sources:

Insurance companies that make them the gatekeeper for patients who need specialty care.

The electronic health record systems that eat up hours after the 'work day' is done, to document diagnoses, orders and treatment authorizations.

The increased expectations from patients and specialists that they'll respond to messages, test results or requests instantly.

The performance metrics that come with a threat of financial penalty.

The growing number of medical decisions that demand in-depth discussions with patients.

And now the rapid rise in telemedicine visits due to COVID-19.

And all of this with little or no increase in the time they have to get it all done, or reduction in the number of patients assigned to them.

Two teams of primary care providers from Michigan Medicine, the University of Michigan's academic medical center, have just published papers in the Journal of General Internal Medicine looking at issues facing primary care practitioners in the third decade of the 21st century.

More demands, same amount of time

One of the papers documents the electronic medical record demands faced by general internists at Michigan Medicine.

Each internist reported an average of 390 "in-basket" tasks each week, and reported spending an average of 20 hours a week handling those tasks and the additional work that most of them generated. That's far more than the eight hours of administrative time per week allotted to each full-time clinician.

The findings build on other studies showing that primary care providers nationwide face the highest number of incoming tasks on EMR systems.

Lead author Laurence McMahon, M.D., M.P.H., chief of general medicine at U-M, notes that the burden has driven many primary care providers across the country to cut back their clinical schedule, just to make their work hours manageable.

"Fully 70% of Michigan Medicine's general medicine faculty now work 'part-time', because it is the only way they can manage the explosion in work - in essence they are taking a pay cut in order to deal with what has become a full-time body of work," says McMahon.

Some have decided to go into hospital medicine, which comes with defined shifts providing general care to hospitalized patients. The new paper's senior author, Vineet Chopra, M.D., M.Sc., leads Michigan Medicine's Division of Hospital Medicine.

"If we are the frogs in the story, I was there when they put the pot on the stove," says McMahon, who has practiced medicine for more than 30 years. "I've had the opportunity to live through all this change, and I remember when each day would mean you'd finish your work and be done. This is no longer the case, and I fear that it will mean fewer physicians choosing primary care careers at a time when our country needs them more than ever. We need a rational way of dividing up that workload so that it's supportive of patient care and of the workforce that's delivering it."

McMahon is quick to point out that the current situation didn't come about all of a sudden, or on purpose.

"This has happened by accident, because of incremental steps," he explains. "Each time we just altered how we practice, and what we do outside our official work hours, to accommodate the additional tasks. But for many, this has made it a job that's not doable within the time we have to see the patients we need to see."

It's no wonder, he says, that some of the "frogs" have chosen to leave the pot, or climb the sides by reducing their hours.

He and his colleagues call for a serious examination of the appropriate size and composition of the panels of patients that primary care providers are expected to maintain, the ability to customize EMRs for primary care, and attention to the actual demands on providers' time. Because of the importance of lifestyle and work-life balance to the current generation of new physicians, time is of the essence to make this change.

Sharing decision making - express style

The other new paper, by U-M and VA Ann Arbor Healthcare System internists Tanner Caverly, M.D., M.P.H., and Rodney Hayward, M.D., focuses on one of the expectations that has arisen in recent years for primary care providers: shared decision making.

They put forth a framework for an abbreviated version that busy providers can practice, to help patients play a larger role in deciding what's right for them while avoiding the paternalistic approach of yesteryear. They call it "everyday shared decision making."

It takes a concept originally developed for high-stakes decisions about cancer treatment or major surgery, and adapts it for the kinds of decisions made at the primary care level.

"An emphasis in recent years has been on improving discussions between the physician and patient--about medical evidence, personal preference and overall goals. But the picture is different in primary care, where most decisions are not high-stakes. Primary care providers most often guide patients about lower-stakes decisions such as whether and when to get screened for different diseases - and only have a minute or two within a clinic visit to make those decisions," says Caverly.

In fact, some studies have shown that as a result of the short time allowed for primary care encounters, many patients aren't getting a true chance to share in the decisions about their health. Efforts to involve other clinic staff in the shared decision making process aren't working either.

In the framework that the authors and their colleagues have proposed, the primary care provider must be able to quickly use information about the individual patient's risk factors to formulate a personalized recommendation about the potential positives and negatives of the decision at hand.

But instead of trying to go over all of the odds ratios and statistics in the minute or two they can allot for such discussions, providers give the patient their evidence-based, personalized recommendation, and then put the ball in the patient's court by saying they'll support whatever the patient decides. If the patient asks for more details, they can provide them.

In order to make this approach possible, primary care providers will need more information at the time of care about how much any given treatment, screening or other intervention is likely to help that patient in particular, given their health history and risk factors.

Such information needs to be at their fingertips, including in the computer systems they use during appointments, along with a good sense of how much patient preference varies for each decision.
This could flag for providers which patients fall into the "preference sensitive zone" for any given decision, so the provider can let them know the decision isn't clear-cut and inform them about key factors that affect the decision.

Caverly and Hayward give as an example a website created to help providers determine quickly which patients will get the most benefit from a CT scan to screen for lung cancer, https://screenlc.com/.

They note that in their future efforts to test the everyday SDM model, a main task for providers is to frame the decision for that individual patient and then let them decide. For instance, for one patient, they might say about a lung cancer screening,

"For someone like you, I think the potential benefits outweigh the harms" but for another patient they might say "This is very clear-cut, and the benefits far outweigh the potential harms."

Credit: 
Michigan Medicine - University of Michigan

New research contradicts claims that Asian American students are harmed when they cannot attend their first-choice university

Washington, August 24, 2020--A new study finds evidence that contradicts claims in legal complaints to the U.S. Department of Justice arguing that Asian American students face negative consequences while in college as a result of not being admitted to and not attending their first-choice institution. These complaints led to the Trump administration launching formal investigations into the race-conscious admissions practices of Harvard and Yale universities. The findings were published today in Educational Researcher, a peer-reviewed journal of the American Educational Research Association.

The Justice Department issued its findings on the Yale investigation on August 13; its investigation of Harvard is still ongoing.

The nearly identical complaints filed by the Coalition of Asian American Associations (CAAA) and the Asian American Coalition for Education (AACE) specifically cited several negative consequences for these students: reduced time spent on leadership, public service, and co-curricular activities; diminished satisfaction in their academic institutions; a negative attitude toward academics and lower academic achievement; a lack of self-confidence and assertiveness; and negative racial interactions.

To test the groups' claims, a team of seven researchers at the University of Denver and the University of California, Los Angeles (UCLA), examined differences in Asian American student outcomes while in college, based on their college admission and enrollment decisions. For their study, the researchers analyzed longitudinal data from two national surveys administered by the Higher Education Research Institute at UCLA: the 2012 Freshman Survey and 2016 College Senior Survey. The study's sample included 1,023 students who self-identified as Asian American and completed both surveys.

The researchers assessed 27 student outcome measures spread across six general categories. The categories included academic performance and perception of academic abilities; satisfaction with college; self-confidence and self-esteem; level of student involvement; willingness and ability to contribute to society; and diversity of racial interactions.

"Overall, our findings countered the claims made by the two groups that served as the impetus of the Justice Department's investigation," said study coauthor Mike Hoa Nguyen, an assistant professor of higher education at the University of Denver. "We found that only small differences, if any, exist between the self-reported outcomes of Asian American students who were admitted to and attending their first-choice university and those students who were not."

Nguyen's coauthors include Connie Y. Chang, Victoria Kim, Rose Ann E. Gutierrez, Annie Le, and Robert T. Teranishi at UCLA, and University of Denver scholar Denis Dumas.

On 23 of the 27 outcome measures, Nguyen and his colleagues found no difference between the two groups of students, after controlling for students' SAT score, high school grade point average, gender, and first-generation college status. On one other measure--"time spent participating in student clubs or groups"--students not accepted by their first-choice institution reported higher levels of involvement than their peers. The remaining three outcome measures showed marginally higher outcomes for students at their first-choice university, with a very small magnitude of difference between the two groups.
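In outline, each of those 27 comparisons is a regression of an outcome on a first-choice indicator plus the named controls. The sketch below shows the general shape of such a model using synthetic data and hypothetical column names (the authors' actual models and variables are in the paper):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey sample (1,023 students).
rng = np.random.default_rng(2)
n = 1023
df = pd.DataFrame({
    "first_choice": rng.integers(0, 2, n),       # attended first-choice school?
    "sat_score": rng.normal(1300, 150, n),
    "hs_gpa": rng.normal(3.6, 0.3, n),
    "gender": rng.choice(["F", "M"], n),
    "first_gen": rng.integers(0, 2, n),
})
# Toy outcome with a small negative 'first choice' effect built in.
df["club_participation"] = 3 - 0.2 * df["first_choice"] + rng.normal(0, 1, n)

# One covariate-adjusted model per outcome measure; the first_choice
# coefficient estimates the adjusted group difference.
model = smf.ols(
    "club_participation ~ first_choice + sat_score + hs_gpa + C(gender) + first_gen",
    data=df,
).fit()
print(model.params["first_choice"], model.pvalues["first_choice"])
```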

In the "academic performance and perception of academic abilities" category, only one of the 11 measures--"time spent studying and doing homework"--showed a difference between the two groups, with students at their first-choice institution indicating more time on schoolwork. At the same time, the two groups reported similar levels of academic performance and perception of their academic ability.

In the "diversity of racial interactions" category, four of the five measures--including "positive cross-racial interaction"--found no differences between the two groups of students. On the fifth measure--"negative cross-racial interaction"--students at their first-choice university reported fewer negative experiences.

In the "satisfaction with college" category, students at their first-choice university scored higher on one measure--overall satisfaction with the college experience--than their peers. On the other measure in the category--"satisfaction with coursework"--there was no difference between the two groups.

"It is important to note that college choice and admission outcomes are not the only factor contributing to students' college satisfaction," Nguyen said. "Prior research indicates that feeling welcome and valued, instructional effectiveness, racial identity, and faculty and student interactions all impact college satisfaction."

In the "willingness and ability to contribute to society" and the "self-confidence and self-esteem" categories, across seven indicators, both groups showed no differences.

"The bottom line is that our findings reject the claims that Asian American students face negative consequences if they are not accepted by and do not attend their first-choice college," said Nguyen. "Our study shows that the claims are inconsistent and inaccurate."

Furthermore, Nguyen added, the findings support prior research that emphasizes the benefits of attending college, in general, even if it is not at one's first-choice institution.

"Although college choice is of vast importance for many students, including Asian Americans, our study suggests that simply relying on rankings and perceived prestige at elite universities to determine one's first-choice schools might be a disservice to students," Nguyen said. "It is what students do in college, rather than the level of institutional prestige alone, that most determines educational outcomes."

While the CAAA and AACE indicate that they broadly represent the Asian American and Pacific Islander community, public opinion research, including the Spring 2016 Asian American Voter Survey, has found that a majority of Asian Americans support race-conscious university admissions.

Credit: 
American Educational Research Association

Reducing transmission risk of livestock disease

UNIVERSITY PARK, Pa. -- The risk of transmitting the livestock virus PPRV, which threatens 80 percent of the world's sheep and goats, increases with certain husbandry practices but not herd size. A new study, led by researchers at Penn State, investigated how transmission of PPRV might change at different scales and identified specific husbandry practices associated with increased odds of infection -- including the introduction of sheep and goats to the herd, sheep or goat attendance at seasonal grazing camps, and the sales or gifting of goats from the herd.

The sheep and goat plague virus, formally known as peste des petits ruminants virus (PPRV) and now known as small ruminant morbillivirus (SRMV), produces a highly infectious and often fatal disease. This study, which appears online Aug. 24 in the journal Viruses, is the third from an international team of researchers who hope to inform strategies for the global campaign to eradicate the virus.

"If we can identify behaviors that increase transmission risk, we can better inform how we allocate resources to manage the virus," said Catherine Herzog, epidemiologist and Huck postdoctoral scholar at the Center for Infectious Disease Dynamics at Penn State and first author of the paper.

The researchers previously found that transmission risk of PPRV was greater among herds in pastoral villages, where people rely almost solely on livestock for their livelihood, compared to herds from agropastoral villages, where people rely on a mix of livestock and agriculture. However, the factors driving these differences were unclear.

Because pastoral villages typically have much larger herds than agropastoral villages, the researchers first investigated whether herd size was related to the rate at which animals become infected--the force of infection. One might expect the rate of infection to increase with herd size, because an animal in a larger herd would have the potential to closely interact with more individuals.

"We hypothesized that the force of infection would increase with herd size--a pattern known as density-dependent transmission--but interestingly this is not what we observed," said Ottar Bjørnstad, Distinguished Professor of Entomology and Biology and J. Lloyd and Dorothy Foehr Huck Chair of Epidemiology at Penn State and a member of the research team.

Instead, Bjørnstad explained that at the level of an individual compound, which might contain animals from multiple households living together in a herd, the data suggest that transmission is not related to herd size--a pattern known as frequency-dependent transmission. The researchers believe this is due to the formation of social cliques, whose size is unaffected by the overall herd size.
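In standard epidemic-model notation, the two hypotheses differ in how the force of infection (the per-animal rate of becoming infected) scales with herd size N. This is a textbook contrast, not a set of equations taken from the paper:

```latex
% Density-dependent transmission: contacts increase with herd size, so the
% force of infection grows with the number of infectious animals I:
\[
\lambda_{\text{density}} = \beta I
\]
% Frequency-dependent transmission: each animal contacts a roughly fixed
% number of others (e.g. a social clique), so only the infected fraction
% matters and the force of infection is independent of herd size:
\[
\lambda_{\text{frequency}} = \beta \, \frac{I}{N}
\]
```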

"Having a clear understanding of this relationship and if it varies across geographic scales will improve how we model the disease," he said.

Transmission risk, however, did increase with specific husbandry practices such as the attendance of sheep or goats at seasonal grazing camps, where many herds come to aggregate, and introduction of livestock to the herd. Introductions occur when animals are purchased from or returned home from the market after a failure to sell, or if they returned from being loaned to another herd for breeding opportunities or milk production. Transmission risk also increased when cattle or goats were recently removed from the herd, through gifting, sale, or death.

The researchers hope their ongoing work will help clarify the ecological mechanisms driving PPRV transmission.

"Now that we have evidence that these husbandry practices are associated with higher rates of infection in the Tanzanian setting, we can take a closer look at these practices and recommend improvements or modifications that could help mitigate the transmission risk," said Herzog. "For example, we could explore quarantine procedures around the introduction of animals from sales or gifting, and the return of animals from seasonal grazing camps. We could also focus our veterinary care on settings or events where risk is the highest."

Credit: 
Penn State

Beam me up: Researchers use 'behavioral teleporting' to study social interactions

image: "Behavioral teleporting" in action: a live fish (right) interacting with a replica (left), by Maurizio Porfiri at the NYU Tandon School of Engineering.

Image: 
Maurizio Porfiri

BROOKLYN, New York, Friday, August 14, 2020 – Teleporting is a science fiction trope often associated with Star Trek. But a different kind of teleporting is being explored at the NYU Tandon School of Engineering, one that could let researchers investigate the very basis of social behavior, study interactions between invasive and native species to preserve natural ecosystems, explore predator/prey relationships without posing a risk to the welfare of the animals, and even fine-tune human/robot interfaces.

The team, led by Maurizio Porfiri, Institute Professor at NYU Tandon, devised a novel approach to getting physically separated fish to interact with each other, leading to insights about what kinds of cues influence social behavior.

The innovative system, called “behavioral teleporting” — the transfer of the complete inventory of behaviors and actions (ethogram) of a live zebrafish onto a remotely located robotic replica — allowed the investigators to independently manipulate multiple factors underpinning social interactions in real-time. The research, “Behavioral teleporting of individual ethograms onto inanimate robots: experiments on social interactions in live zebrafish,” appears in the Cell Press journal iScience.

The team, including Mert Karakaya, a Ph.D. candidate in the Department of Mechanical and Aerospace Engineering at NYU Tandon, and Simone Macrì of the Centre for Behavioral Sciences and Mental Health, Istituto Superiore di Sanità, Rome, devised a setup consisting of two separate tanks, each containing one fish and one robotic replica. Within each tank, the live fish of the pair swam with the zebrafish replica matching the morphology and locomotory pattern of the live fish located in the other tank.

An automated tracking system scored each of the live subjects’ locomotory patterns, which were, in turn, used to control the robotic replica swimming in the other tank via an external manipulator. Therefore, the system allowed the transfer of the complete ethogram of each fish across tanks within a fraction of a second, establishing a complex robotics-mediated interaction between two remotely-located live animals. By independently controlling the morphology of these robots, the team explored the link between appearance and movements in social behavior.
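Stripped of the hardware, the loop amounts to "track in tank A, actuate in tank B" once per video frame. The following self-contained Python sketch mimics that closed loop with stand-in components and a fixed frame lag (everything here is illustrative; the real system uses camera-based tracking and an external manipulator):

```python
import math

class ReplicaController:
    """Stand-in for the external manipulator that drives the robotic replica."""
    def __init__(self):
        self.pose = (0.0, 0.0)

    def move_to(self, pose):
        self.pose = pose  # the real rig would actuate the replica here

def tracked_fish_position(t):
    """Toy stand-in for the vision tracker: a fish swimming in a circle."""
    return (math.cos(t), math.sin(t))

def teleport(duration=5.0, fps=30, lag_frames=3):
    """Replay tank-A positions on the tank-B replica a few frames late,
    mimicking the sub-0.2-second lag reported in the paper."""
    replica = ReplicaController()
    history = []
    for frame in range(int(duration * fps)):
        history.append(tracked_fish_position(frame / fps))    # track live fish
        if frame >= lag_frames:
            replica.move_to(history[frame - lag_frames])      # actuate replica
    return replica.pose

print(teleport())
```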

The investigators found that the replica teleported the fish motion in almost all trials (85% of the total experimental time), with a 95% accuracy at a maximum time lag of less than two-tenths of a second. The high accuracy in the replication of fish trajectory was confirmed by equivalent analysis on speed, turn rate, and acceleration.

Porfiri explained that the behavioral teleporting system avoids the limits of typical modeling using robots.

“Since existing approaches involve the use of a mathematical representation of social behavior for controlling the movements of the replica, they often lead to unnatural behavioral responses of live animals,” he said. “But because behavioral teleporting ‘copy/pastes’ the behavior of a live fish onto robotic proxies, it confers a high degree of precision with respect to such factors as position, speed, turn rate, and acceleration.”

Porfiri’s previous research proving robots are viable as behavior models for zebrafish showed that schools of zebrafish could be made to follow the lead of their robotic counterparts.

“In humans, social behavior unfolds in actions, habits, and practices that ultimately define our individual life and our society,” added Macrì. “These depend on complex processes, mediated by individual traits — baldness, height, voice pitch, and outfit, for example — and behavioral feedback, vectors that are often difficult to isolate. This new approach demonstrates that we can isolate influences on the quality of social interaction and determine which visual features really matter.”

The research included experiments to understand the asymmetric relationship between large and small fish and identify leader/follower roles, in which a large fish swam with a small replica that mirrored the behavior of the small fish positioned in the other tank and vice-versa.

Karakaya said the team was surprised to find that the smaller — not larger — fish “led” the interactions.

“There are no strongly conclusive results on why that could be, but one reason might be due to the ‘curious’ nature of the smaller individuals to explore a novel space,” he said. “In known environments, large fish tend to lead; however, in new environments larger and older animals can be cautious in their approach, whereas the smaller and younger ones could be ‘bolder.’”

The method also led to the discovery that interaction between fish was not determined by locomotor patterns alone, but also by appearance.

“It is interesting to see that, as is the case with our own species, there is a relationship between appearance and social interaction,” he added.

Karakaya added that this could serve as an important tool for human interactions in the near future, whereby, through the closed-loop teleporting, people could use robots as proxies of themselves.

“One example would be the colonies on Mars, where experts from Earth could use humanoid robots as an extension of themselves to interact with the environment and people there. This would provide easier and more accurate medical examination, improve human contact, and reduce isolation. Detailed studies on the behavioral and psychological effects of these proxies must be completed to better understand how these techniques can be implemented into daily life.”

Credit: 
NYU Tandon School of Engineering

Heart repair factor boosted by RNA-targeting compound

image: Chemistry professor Matthew Disney, PhD, in his lab at Scripps Research in Jupiter, Florida. Disney and his graduate student, Hafeez Haniff, developed a compound that acts on non-coding RNA to restart a healing factor silenced by heart attack.

Image: 
Matthew Sturgess for Scripps Research

JUPITER, FL--AUG. 24, 2020--A heart attack can leave parts of the heart permanently scarred and stiff, resulting in prolonged disability and potential progression toward heart failure. Scientists have studied various ways to repair or regenerate such damaged heart tissue, with limited success.

A new study from Scripps Research chemist Matthew Disney, PhD, shows that by targeting an essential biomolecule that surges in failing heart muscle, it may be possible to one day heal damaged heart tissue with medication.

In a study published Monday in the journal Nature Chemistry, Disney and his collaborators describe the discovery of the first compounds able to restart cellular production of a factor called VEGF-A, short for vascular endothelial growth factor A, in cellular models. Research over many years has shown that VEGF-A acts as a signal to stem cells, causing them to rebuild blood vessels and muscle in damaged heart tissue and improve blood flow.

Targeting RNAs, the "middleman" between genes and protein production, makes logical sense, but doing so with medicines was once deemed unfeasible. RNAs were long thought to be poor small-molecule drug targets due to their simple four-base makeup and dynamic shape. Through the years, Disney and colleagues have developed an array of computational and chemical tools designed to overcome those barriers.

"During a heart attack, the injury causes proteins that could promote new, healthy blood vessel growth to go silent," Disney explains. "We analyzed the entire pathway for how the protein is silenced, and then we used that information to identify how to reinvigorate its expression."

Lead author Hafeez Haniff, a graduate student at Scripps Research, Florida, analyzed the genomics underlying VEGF-A production to assess optimal RNA drug targets, working in collaboration with scientists at AstraZeneca. The team selected a microRNA precursor called pre-miR-377, finding it acts like a dimmer switch for VEGF-A production in failing heart muscle.

They then used Disney's computational and chemical tools, in conjunction with a diverse set of compounds from AstraZeneca's collection, in search of chemical partners able to selectively bind to the key conserved structural features of pre-miR-377.

"A remarkable on-target specificity is achieved by combining the active compound with other helper molecules," Haniff explains.

Other strategies that have been attempted to boost VEGF-A production include administration of VEGF-A itself, or delivery of messenger RNA that encodes for the protein.

"Each of these approaches uses large compounds that can have limited distribution to diseased tissues, compared to potential specific, RNA-binding small-molecule lead medicines," Disney says.

The compound has, so far, been tested in cells, not whole-animal models of heart failure, Disney notes.

"We delivered a lead small molecule compound to reprogram the cell's software to force it to re-express VEGF-A," Disney says. "Transforming TGP-377 into a potential medicine that reaches patients will take considerably more time and research."

Disney called their success a "test case" that shows it is possible to reliably and predictably develop medicinal compounds for pre-defined RNA targets and induce protein production in cellular models.

Malin Lemurell of AstraZeneca, calls it a potentially important first step.

"The ability to design small molecules capable of interacting with and modulating RNA could open new avenues to target challenging disease pathways that have previously been considered undruggable," says Lemurell, who is head of Medicinal Chemistry, Research and Early Development, Cardiovascular, Renal and Metabolism, BioPharmaceuticals R&D at AstraZeneca. "This research has enabled the generation of quality tool compounds that will be useful to probe this mode of action further."

Because of the large-scale screening done to identify TGP-377, Disney says the group has expanded the data set of known RNA-binding small molecules 20-fold, with implications for multiple incurable diseases.

"There are potential RNA drug targets for nearly every disease." Disney says. "We now have a much greater toolbox to search for lead molecules with medicinal potential."

Credit: 
Scripps Research Institute

Unconventional farming methods could help smallholders fight back against climate change

New research from Ghana shows less popular methods of biochar application are more effective in promoting cowpea growth and yield. The article, "Method of biochar application affects growth, yield and nutrient uptake of cowpea" was published in the De Gruyter open access journal Open Agriculture.

Cowpea is widely cultivated in sub-Saharan Africa and in warm regions around the world. The crop is an important source of human food, livestock feed, and green manure, and generates income for smallholder farmers. It is valued for its ability to boost soil fertility by fixing nitrogen.

But West African farmers - under pressure from climate change, drought, pests, and low soil fertility - have struggled to optimize the yield of this valuable crop. Conventional mineral fertilizers remain expensive for smallholders and can cause soil degradation.

Biochar is a charcoal-like substance made by heating waste plant matter with little or no oxygen (a process known as pyrolysis). Adding biochar to soil is a relatively new approach that has been shown to improve crop yields in many ways. It can increase the soil's water-holding capacity, reduce acidity, increase nutrient supply and retention, and promote the growth of beneficial microbes. But to date, there has been little research into the best method of applying biochar to soils to optimize its benefits.

Scientists tested the different methods of biochar application on fields at Ghana's CSIR-Soil Research Institute. They planted cowpea seeds in the site's sandy soil and compared the broadcasting, spot, and ring methods of applying biochar against an untreated control.

The broadcasting method sees biochar spread uniformly across the surface and worked into the soil using a hoe. For the spot method, biochar is placed into a small hole and covered with soil. For the ring method, biochar is dug into the soil in a ring around the place where the seed is to be planted.

The research team confirmed that biochar improved plant height and girth, the number and weight of nitrogen-fixing nodules on the cowpea, pod number, shoot and seed yield, and nitrogen and phosphorus uptake. The spot and ring methods in particular significantly improved these measures of crop success.
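
The release does not show the statistics behind "significantly improved," but treatment comparisons of this kind are commonly made with a one-way ANOVA across the application methods and the control. Below is a minimal Python sketch of such a test; the yield figures are invented placeholders, not the study's data.

```python
# One-way ANOVA across biochar treatments (sketch only).
# Seed-yield replicates below are invented, not the study's data.
from scipy import stats

yields_kg_per_ha = {
    "control":   [810, 790, 835, 805],
    "broadcast": [905, 930, 890, 915],
    "spot":      [1110, 1150, 1095, 1130],
    "ring":      [1080, 1120, 1060, 1105],
}

f_stat, p_value = stats.f_oneway(*yields_kg_per_ha.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) says at least one treatment mean differs;
# post-hoc tests would then pinpoint which application methods differ.
```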

"We've shown the traditional method of broadcast and incorporation to be less effective," says lead researcher Edward Yeboah, "whereas the spot and ring methods of biochar application show tremendous benefits for sustainable soil management. Smallholder farmers can now improve their livelihoods by focussing on spot and ring application of biochar for maximum benefit."

Credit: 
De Gruyter

Are antivitamins the new antibiotics?

image: First author Dr. Rabe von Pappenheim examines protein crystals of a bacterial enzyme that was "poisoned" with an antivitamin.

Image: 
Lisa-Marie Funk

Antibiotics are among the most important discoveries of modern medicine and have saved millions of lives since the discovery of penicillin almost 100 years ago. Many diseases caused by bacterial infections - such as pneumonia, meningitis or septicaemia - are successfully treated with antibiotics. However, bacteria can develop resistance to antibiotics which then leaves doctors struggling to find effective treatments. Particularly problematic are pathogens which develop multi-drug resistance and are unaffected by most antibiotics. This leads to severe disease progression in affected patients, often with a fatal outcome. Scientists all over the world are therefore engaged in the search for new antibiotics. Researchers at the University of Göttingen and the Max Planck Institute for Biophysical Chemistry Göttingen have now described a promising new approach involving "antivitamins" to develop new classes of antibiotics. The results were published in the journal Nature Chemical Biology.

Antivitamins are substances that inhibit the biological function of a genuine vitamin. Some antivitamins have a chemical structure similar to that of the actual vitamin whose action they block or restrict. For this study, Professor Kai Tittmann's team from the Göttingen Center for Molecular Biosciences at the University of Göttingen worked together with Professor Bert de Groot's group from the Max Planck Institute for Biophysical Chemistry Göttingen and Professor Tadhg Begley from Texas A&M University (USA). Together they investigated, at the atomic level, the mechanism of action of a naturally occurring antivitamin of vitamin B1. Some bacteria are able to produce a toxic form of this vital vitamin to kill competing bacteria. This particular antivitamin carries just one atom more than the natural vitamin, in a seemingly unimportant position, and the exciting research question was why the vitamin's action was nonetheless prevented or "poisoned".

Tittmann's team used high-resolution protein crystallography to investigate how the antivitamin inhibits an important protein from the central metabolism of bacteria. The researchers found that the "dance of the protons", which can normally be observed in functioning proteins, almost completely stops, and the protein no longer works. "Just one extra atom in the antivitamin acts like a grain of sand in a complex gear system by blocking its finely tuned mechanics," explains Tittmann. Interestingly, human proteins cope relatively well with the antivitamin and continue working. The chemist de Groot and his team used computer simulations to find out why. "The human proteins either do not bind to the antivitamin at all or in such a way that they are not 'poisoned'," says the Max Planck researcher. The difference between the antivitamin's effects on bacterial and on human proteins opens up the possibility of using it as an antibiotic in the future, thus creating new therapeutic alternatives.

The research project was funded by the German Research Foundation (DFG).

Credit: 
University of Göttingen

Excessive fructose consumption may cause a leaky gut, leading to fatty liver disease

Excessive consumption of fructose -- a sweetener ubiquitous in the American diet -- can result in non-alcoholic fatty liver disease (NAFLD), a condition that is similarly widespread in the United States. But contrary to previous understanding, researchers at University of California San Diego School of Medicine report that fructose only adversely affects the liver after it reaches the intestines, where the sugar disrupts the epithelial barrier protecting internal organs from bacterial toxins in the gut.

Developing treatments that prevent intestinal barrier disruption, the authors conclude in a study published August 24, 2020 in Nature Metabolism, could protect the liver from NAFLD, a condition that affects one in three Americans.

"NAFLD is the most common cause of chronic liver disease in the world. It can progress to more serious conditions, such as cirrhosis, liver cancer, liver failure and death," said senior author Michael Karin, PhD, Distinguished Professor of Pharmacology and Pathology at UC San Diego School of Medicine. "These findings point to an approach that could prevent liver damage from occurring in the first place."

Fructose consumption in the U.S. has skyrocketed since the 1970s with the introduction of high fructose corn syrup (HFCS), a cheaper sugar substitute that is broadly used in processed and packaged foods, from cereals and baked goods to soft drinks. Multiple studies in animals and humans have linked increased HFCS consumption with the nation's obesity epidemic and numerous inflammatory conditions, such as diabetes, heart disease and cancer. The U.S. Food and Drug Administration, however, currently regulates HFCS in the same way as other sweeteners, such as sucrose or honey, and advises only moderation of intake.

The new study, however, defines a specific role and risk for HFCS in the development of fatty liver disease. "The ability of fructose, which is plentiful in dried figs and dates, to induce fatty liver was known to the ancient Egyptians, who fed ducks and geese dried fruit to make their version of foie gras," said Karin.

"With the advent of modern biochemistry and metabolic analysis, it became obvious that fructose is two to three times more potent than glucose in increasing liver fat, a condition that triggers NAFLD. And the increased consumption of soft drinks containing HFCS corresponds with the explosive growth in NAFLD incidence."

Fructose is broken down in the human digestive tract by an enzyme called fructokinase, which is produced both by the liver and the gut. Using mouse models, researchers found that excessive fructose metabolism in intestinal cells reduces production of proteins that maintain the gut barrier -- a layer of tightly packed epithelial cells covered with mucus that prevent bacteria and microbial products, such as endotoxins, from leaking out of the intestines and into the blood.

"Thus, by deteriorating the barrier and increasing its permeability, excessive fructose consumption can result in a chronic inflammatory condition called endotoxemia, which has been documented in both experimental animals and pediatric NAFLD patients," said the study's first author Jelena Todoric, MD, PhD, a visiting scholar in Karin's lab.

In their study, Karin, Todoric and colleagues from universities and institutions around the world, found that leaked endotoxins reaching the liver provoked increased production of inflammatory cytokines and stimulated the conversion of fructose and glucose into fatty acid deposits.

"It is very clear that fructose does its dirty work in the intestine," said Karin, "and if intestinal barrier deterioration is prevented, the fructose does little harm to the liver."

The scientists noted that feeding mice high amounts of fructose and fat results in particularly severe adverse health effects. "That's a condition that mimics the 95th percentile of relative fructose intake by American adolescents, who get up to 21.5 percent of their daily calories from fructose, often in combination with calorie-dense foods like hamburgers and French fries," Karin said.
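
To put that figure in rough gram terms, assume a generic 2,000-kcal reference diet and 4 kcal per gram of carbohydrate (neither value comes from the study):

```python
# Back-of-envelope conversion of percent-of-calories to grams of fructose.
# The 2,000 kcal/day diet and 4 kcal/g are generic reference values,
# not figures from the study.
daily_kcal = 2000
fructose_share = 0.215        # 21.5% of daily calories
kcal_per_gram = 4             # energy density of carbohydrate

fructose_kcal = daily_kcal * fructose_share       # 430 kcal
fructose_grams = fructose_kcal / kcal_per_gram    # 107.5 g
print(f"{fructose_kcal:.0f} kcal, or about {fructose_grams:.0f} g of fructose per day")
```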

Interestingly, the research team found that when fructose intake was reduced below a certain threshold, no adverse effects were observed in mice, suggesting only excessive and long-term fructose consumption represents a health risk. Moderate fructose intake through normal consumption of fruits is well-tolerated.

"Unfortunately, many processed foods contain HFCS and most people cannot estimate how much fructose they actually consume," said Karin. "Although education and increased awareness are the best solutions to this problem, for those individuals who had progressed to the severe form of NAFLD known as nonalcoholic steatohepatitis, these findings offer some hope of a future therapy based on gut barrier restoration."

Credit: 
University of California - San Diego

Each human gut has a viral 'fingerprint'

COLUMBUS, Ohio - Each person's gut virus composition is as unique as a fingerprint, according to the first study to assemble a comprehensive database of viral populations in the human digestive system.

An analysis of viruses in the guts of healthy Westerners also showed that dips and peaks in the diversity of virus types between childhood and old age mirror bacterial changes over the course of the lifespan.

The Gut Virome Database developed by Ohio State University scientists identifies 33,242 unique viral populations that are present in the human gut. (A collection of viruses like those in the human gut is called a virome.) This is not cause for alarm: Most viruses don't cause disease.

In fact, the more scientists learn about viruses, the more they see them as part of the human ecosystem - suggesting viruses have potential to represent a new class of drugs that could fight disease-causing bacteria, especially those resistant to antibiotics. Better knowledge of viruses in the gut environment could even improve understanding of the gastrointestinal symptoms experienced by some of the sickest COVID-19 patients.

The researchers plan to update the open-access database on a regular basis.

"We've established a robust starting point to see what the virome looks like in humans," said study co-author Olivier Zablocki, a postdoctoral researcher in microbiology at Ohio State. "If we can characterize the viruses that are keeping us healthy, we might be able to harness that information to design future therapeutics for pathogens that can't otherwise be treated with drugs."

The study is published today (Aug. 24) in the journal Cell Host & Microbe.

Talk of the good and bad bacteria in the gut microbiome is commonplace these days, but viruses in the gut - and everywhere else - are hard to detect because their genomes lack the kind of common signature gene sequence that bacterial genomes share. So much of the vast sequence space of viruses remains unexplored that it is often referred to as "dark matter."

For this work, the researchers started with data from 32 studies over about a decade that had looked at gut viruses in a total of 1,986 healthy and sick people in 16 countries. Using techniques to detect virus genomes, the team identified more than 33,000 different viral populations.

"We used machine learning on known viruses to help us identify the unknown viruses," said first author Ann Gregory, who completed this work while she was a graduate student at Ohio State. "We were interested in how many types of viruses we could see in the gut, and we determined that by how many types of genomes we could see since we couldn't visually see the viruses."

Their analysis confirmed findings from smaller studies suggesting that though a few viral populations were shared within a subset of people, there is no core group of gut viruses common to all humans.

A few trends were identified, however. In healthy Western individuals, age influences the diversity of viruses in the gut, which increases significantly from childhood to adulthood, and then decreases after age 65. The pattern matches what is known about ebbs and flows of gut bacterial diversity with one exception: Infant guts with underdeveloped immune systems are teeming with a range of virus types, but few bacteria varieties.
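
The release does not say which diversity metric was used; a standard choice in community ecology is the Shannon index, sketched here on invented abundance profiles to show how richness and evenness move the number.

```python
# Shannon diversity index H' on invented viral-abundance profiles.
import math

def shannon_diversity(abundances):
    """H' = -sum(p_i * ln(p_i)) over relative abundances p_i > 0."""
    total = sum(abundances)
    return -sum((a / total) * math.log(a / total)
                for a in abundances if a > 0)

uneven = [900, 50, 30, 20]                  # one population dominates
even = [120, 100, 95, 90, 80, 70, 60, 55]   # many comparable populations

print(f"uneven community H' = {shannon_diversity(uneven):.2f}")  # ~0.43
print(f"even community   H' = {shannon_diversity(even):.2f}")    # ~2.05
```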

People living in non-Western countries had higher gut virus diversity than Westerners. Gregory said other research has shown that non-Western individuals who move to the United States or another Western country lose that microbiome diversity, suggesting diet and environment drive virome differences. (For example, the scientists found some intact plant viruses in the gut - the only way for them to get there is through the diet.) Variations in viral diversity could also be seen in healthy versus sick participants in the 32 studies analyzed.

"A general rule of thumb for ecology is that higher diversity leads to a healthier ecosystem," Gregory said. "We know that more diversity of viruses and microbes is usually associated with a healthier individual. And we saw that healthier individuals tend to have a higher diversity of viruses, indicating that these viruses may be potentially doing something positive and having a beneficial role."

Almost all of the populations - 97.7 percent - were phages, which are viruses that infect bacteria. Viruses have no function without a host - they drift in an environment until they infect another organism, taking advantage of its properties to make copies of themselves. The most-studied viruses kill their host cells, but scientists in the Ohio State lab in which Gregory and Zablocki worked have discovered more and more phage-type viruses that coexist with their host microbes and even produce genes that help the host cells compete and survive.

The leader of that lab, senior study author Matthew Sullivan, has his sights set on "phage therapy" - the 100-year-old idea of using phages to kill antibiotic-resistant pathogens or superbugs.

"Phages are part of a vast interconnected network of organisms that live with us and on us, and when broad-spectrum antibiotics are used to fight against infection, they also harm our natural microbiome," Sullivan said. "We are building out a toolkit to scale our understanding and capabilities to use phages to tune disturbed microbiomes back toward a healthy state.

"Importantly, such a therapeutic should impact not only our human microbiome, but also that in other animals, plants and engineered systems to fight pathogens and superbugs. They could also provide a foundation for something we might have to consider in the world's oceans to combat climate change."

A professor of microbiology and civil, environmental and geodetic engineering, Sullivan has helped establish cross-disciplinary research collaborations at Ohio State. He recently founded and directs Ohio State's new Center of Microbiome Science and co-directs the Infectious Diseases Institute's Microbial Communities program.

Zablocki noted that there is still a lot to learn about the functions of viruses in the gut - both beneficial and harmful.

"I see it as the chicken and the egg," he said. "We see the disease and we see the community structure. Was it because of this community structure that the disease occurred, or is the disease causing the community structure that we see? This standardized dataset will enable us to pursue those questions."

Credit: 
Ohio State University