
Earth's species have more in common than previously believed

The Earth hosts an abundance of life forms - from well-known animals and plants to smaller, hardier life forms such as archaea, viruses and bacteria. These life forms are fundamentally different all the way down to the cell level. Or so scientists thought.

Now an international team of researchers has analysed the proteins found in 100 different species - from bacteria and archaea to plants and humans. It is the largest protein mapping ever to be conducted across different species.

They have learned that these life forms in fact have a number of common characteristics. The study is a collaboration between researchers in Professor Matthias Mann's group in the Novo Nordisk Foundation Center for Protein Research and the Max Planck Institute of Biochemistry. It has been published in the top scientific journal Nature.

'We have mapped the proteins, together called the proteome, of 100 different species. And it is obvious that they are extremely different. At the same time, though, they have more in common than we thought. In all these life forms, a large share of the proteins focus on metabolism and on maintaining a protein balance', says Professor Matthias Mann.

Doubling of Experimentally Confirmed Proteins

Previously, researchers were mainly interested in the DNA of various organisms - for example, how much genetic material humans share with different animals. However, with advancements in the technology used for studying organisms at the molecular level, researchers have turned to proteins, the workhorses of the cell.

'A common characteristic of all these life forms is the fact that a high percentage of their proteomes focus on maintaining a sort of balance, what is called homeostasis. Another common characteristic is the fact that a large share of the proteins help to generate energy, even though the ways in which this is done differ - from photosynthesis to carbohydrate burning', says Alberto Santos Delgado, who was employed at the Novo Nordisk Foundation Center for Protein Research during the study.

The researchers have used an advanced technology called mass spectrometry to study all 100 species. The technology enabled them to double the number of proteins confirmed experimentally.

Previous research has predicted how many and which proteins exist based only on the genetic code and bioinformatic calculations. However, the new protein mapping has provided actual data on the existence of a very large number of new proteins.

Machine Learning Can Reveal New Correlations

'Our work connecting quantitative mass spectrometry-based proteomics with database resources has resulted in a data set of eight million data points with 53 million interconnections. We made all the data publicly available, enabling other researchers to use it to identify new correlations. New technologies enabled by machine learning are on the rise and we expect those to benefit from the large and uniform dataset we provide publicly', says PhD Student Johannes Mueller from the Max Planck Institute of Biochemistry.

The researchers at the University of Copenhagen focussed on data processing and bioinformatics analysis, while the researchers at the Max Planck Institute of Biochemistry in Munich focussed on mass spectrometry.

On the website Proteomes of Life, the researchers made all data from the project publicly available.

The study was funded by the Max Planck Society for the Advancement of Science, the EU Horizon 2020 programme and the Novo Nordisk Foundation.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Chinese scientists construct high-quality graph-based soybean genome

image: Soybean graph-based genome construction and pan genome analyses.

Image: 
IGDB

Soybean oil is one of the world's most important vegetable oils and soybeans are a key protein feed crop. Cultivated soybeans were domesticated from wild relatives in China approximately 5,000 years ago. At present, over 60,000 accessions adapted to different ecoregions have been developed. Extensive genetic diversity among soybean germplasms has shown the need for construction of a complete pan-genome from diverse soybean accessions.

Recently, the research group led by Prof. TIAN Zhixi from the Institute of Genetics and Developmental Biology (IGDB) of the Chinese Academy of Sciences (CAS), in cooperation with the teams of Profs. LIANG Chengzhi and ZHU Baoge, Prof. HAN Bin's team from the Center for Excellence in Molecular Plant Sciences of CAS, Prof. HUANG Xuehui's team from Shanghai Normal University, and the Berry Genomics Corporation, individually de novo assembled 26 soybean genomes and constructed a high-quality graph-based soybean pan-genome.

Based on a phylogenetic analysis of 2,898 soybean accessions, they selected 26 accessions and performed de novo genome assembly for each accession. The contig N50 sizes of the 26 whole-genome assemblies ranged from 18.8 to 26.8 Mb with a mean of 22.6 Mb, and scaffold N50 sizes ranged from 50.3 to 52.3 Mb with a mean of 51.2 Mb.
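For context, a contig N50 is the length such that contigs of at least that length together cover half of the total assembly. A minimal sketch in Python of the standard calculation, using made-up contig lengths rather than the study's data:

def n50(lengths):
    """Return the N50 of a list of contig or scaffold lengths (in bases)."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:  # reached half of the assembly
            return length

# Illustrative contig lengths only, not data from the paper.
contigs = [26_800_000, 22_600_000, 18_800_000, 5_000_000, 1_200_000]
print(n50(contigs))  # -> 22600000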

Through a comparative genome analysis of the 26 genomes plus three previously reported genomes, the scientists identified a total of 14,604,953 SNPs, 12,716,823 small insertions and deletions, 723,862 presence/absence variations, 27,531 copy number variations, 21,886 translocation events, and 3,120 inversion events.

Subsequently, by integrating these structural variations, a graph-based genome was built using the ZH13 genome as a standard linear reference genome.
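As a rough illustration of what 'graph-based' means here (a generic sketch, not the authors' actual data structure), structural variants can be encoded as alternative paths that branch off the linear reference, such as ZH13, and rejoin it. A toy Python example:

# A toy variation graph: nodes are sequence segments, edges are allowed
# adjacencies. An insertion relative to the linear reference becomes an
# alternative path between two reference nodes.
graph = {
    "nodes": {
        "r1": "ACGTACGT",   # reference segment
        "r2": "TTGACC",     # reference segment
        "ins1": "GGGAA",    # insertion carried by some accessions
    },
    "edges": [
        ("r1", "r2"),       # reference path: r1 -> r2
        ("r1", "ins1"),     # alternative path: r1 -> ins1 -> r2
        ("ins1", "r2"),
    ],
}

def paths(graph, start, end, prefix=None):
    """Enumerate all node paths from start to end (naive traversal)."""
    prefix = (prefix or []) + [start]
    if start == end:
        yield prefix
        return
    for a, b in graph["edges"]:
        if a == start:
            yield from paths(graph, b, end, prefix)

print(list(paths(graph, "r1", "r2")))
# [['r1', 'r2'], ['r1', 'ins1', 'r2']]

Reads from an accession carrying the insertion can follow the alternative path instead of being forced onto a single linear sequence.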

Further investigations revealed that these structural variations play important roles in driving genome evolution, gene structure variation and gene functional divergence, which in turn contribute to agronomic trait variations in the soybean population.

Having a reference genome opens the door to functional genomics and molecular design breeding for a species. However, an increasing number of reports has suggested that one or a few reference genomes cannot represent the full range of genetic diversity of a species. Therefore, pan-genome construction is becoming increasingly necessary.

In addition, conventional linear references are limited since they are unable to show the genotypes of different alleles from each locus. How to integrate the genotypes from different alleles into a new form of genome is a challenge.

This is the first reported graph-based genome in a plant. This graph-based genome can be used to reanalyze previously resequenced data, which will generate more comprehensive information than ever and rejuvenate those data. In turn, this will greatly facilitate functional study and breeding. An anonymous reviewer said this work is "a landmark paper for genomics."

Credit: 
Chinese Academy of Sciences Headquarters

Call for caution when using a CAR-T immunotherapy against acute myeloid leukemia

image: Part of Pablo Menéndez's Team. Matteo Baroni 2nd from left.

Image: 
Pablo Menéndez

Acute myeloid leukemia (AML) is a hematological malignancy whose incidence increases with age and that is biologically, phenotypically, and genetically very heterogeneous. Its treatment usually combines chemotherapy followed, depending on the patient's eligibility, by an allogeneic hematopoietic stem and progenitor cell transplant (allo-HSCT) to consolidate complete remission and prevent relapse. Yet, except for a few subgroups, the so-called low-risk AMLs, relapses are frequent after consolidation therapy and transplant. Chemotherapy-related toxicity, refractoriness, and failure to eradicate leukemia-initiating cells are the major causes underlying AML progression and relapse. Unfortunately, AML treatment has seen only minor improvements over the last four decades, and current 5-year event-free survival remains around 20% in adults and less than 70% in children, highlighting the desperate need for safer and more efficient therapeutics.

Lately, cellular immunotherapy based on CAR-Ts has generated unprecedented expectations in cancer treatment. CAR-T immunotherapy consists of engineering human T-cells with chimeric antigen receptors (CARs) that direct the T-cells against cell-surface tumor antigens. CAR-Ts have shown robust clinical responses in patients with B-cell malignancies thanks to their high efficacy, specificity, and persistence.

AML is challenging because there is no universal AML target antigen to direct the CAR-T against, and because target antigens are shared with healthy hematopoietic stem and progenitor cells (HSPCs), which may lead to life-threatening on-target, off-tumor cytotoxicity. Despite this, past studies have found that CD33- and CD123-redirected CAR-Ts for AML exhibit robust anti-leukemic activity, and they are in advanced preclinical and clinical development.

These CAR-Ts have generated some preclinical and clinical controversy over whether they are myeloablative, since they could also target healthy HSPCs, which are essential for hematopoiesis, or blood cell production. This suspicion stems from the fact that HSPCs also express these antigens to varying degrees, so the CAR-Ts could attack them as well.

Although some isolated short-term studies have been performed in vitro and in vivo to assess the safety of this CAR-T therapy for healthy HSPCs, there was still no substantial mid- or long-term in vivo evidence.

Matteo Baroni, a researcher in the Stem Cell Biology, Developmental Leukemia and Immunotherapy Research Group of the Josep Carreras Leukaemia Research Institute, hypothesized with his colleagues that the time a CAR-T needs to act against a healthy HSPC could be longer than against a leukemic cell. They thought that the results from previous studies on the potential myelotoxicity of redirecting T-cells against CD123 underestimated the on-target, off-tumor potential of these CAR-Ts.

Baroni and his team provided extensive evidence, in a 6-week in vivo study, that the presence of anti-CD123 CAR T-cells strongly inhibited normal hematopoiesis, causing irreversible impairment of the formation of new blood cells. These results have been recently published in the Journal for Immunotherapy of Cancer.

"We call for the caution of using CAR T-cells against CD123 in another way than before an allogenic transplant of AML patients that have experienced therapy refractoriness of disease recurrence or relapse, and cannot further benefit from standard chemotherapy. For these patients is a great alternative. We believe that these results will help clinicians to consider critical long-term effects if CAR-T CD123 cells remain, and prevent for other uses." States Baroni.

This research has been funded by the European Research Council (CoG-2014-646903, PoC-2018-811220), the Spanish Ministry of Economy and Competitiveness (MINECO, SAF2016-80481-R), and the Spanish Cancer Research Association (AECC-Semilla19).

Credit: 
Josep Carreras Leukaemia Research Institute

Detecting antibodies with glowing proteins, thread and a smartphone

image: Light emitted from sensor proteins turned bluer when samples contained higher concentrations of antibodies against three viruses. 

Image: 
Adapted from <i>ACS Sensors</i> <b>2020</b>, DOI: 10.1021/acssensors.0c00564

To defend the body, the immune system makes proteins known as antibodies that latch onto the perceived threat, be it HIV, the new coronavirus or, as is the case in autoimmune disease, part of the body itself. In a new proof-of-concept study in ACS Sensors, researchers describe a new system for detecting antibodies in a pinprick of blood within minutes, using an unlikely combination of cotton thread, glowing proteins and a smartphone camera.

While some tests simply detect the presence of an antibody, sometimes doctors want to know how much is circulating in the blood. Such quantitative tests are used to diagnose a number of conditions, including infections and autoimmune diseases. Although a quantitative antibody test is not yet approved for use in the U.S., such a test could potentially aid in assessing immunity to SARS-CoV-2. However, quantitative testing currently requires expensive, sophisticated instruments in labs, and efforts to make it more accessible have had only limited success. So, Maarten Merkx, Daniel Citterio and colleagues tested an approach that could provide a small, inexpensive alternative.  

The researchers' microfluidic thread-based analytical device (μTAD) relies on light-emitting sensor proteins held on a thread. In the presence of the right antibodies, the color of the light emitted by the sensors changes. The shift, from green to blue, correlates with the concentration of antibodies in a sample. Using a finger-prick-sized drop of pigs' blood spiked with antibodies against HIV, the team showed that their system could successfully detect antibody levels within five minutes. In addition, the device can test for the amounts of several different antibodies in a single blood sample and doesn't require extensive handling and incubation steps. They found that a smartphone camera, outfitted with an adaptor, could pick up on the shifts in the light's color, while the device itself could convert color data into test results and transmit that information. With further development, this combination of technologies could provide user-friendly, one-step analysis of antibody concentration, according to the researchers.
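In principle, the readout step reduces to converting a measured blue-to-green ratio into a concentration through a calibration curve. A hypothetical Python sketch of that conversion; the calibration values and channel intensities below are invented for illustration and are not from the paper:

import numpy as np

# Hypothetical calibration: color ratios measured for known antibody
# concentrations (illustrative values only).
calib_conc  = np.array([0.0, 1.0, 5.0, 10.0, 50.0])    # nM
calib_ratio = np.array([0.30, 0.38, 0.55, 0.70, 0.95])  # blue / (blue + green)

def concentration_from_ratio(ratio):
    """Interpolate antibody concentration from a measured color ratio."""
    return float(np.interp(ratio, calib_ratio, calib_conc))

# Example: mean blue and green channel intensities from a smartphone image.
blue, green = 120.0, 80.0
ratio = blue / (blue + green)
print(round(concentration_from_ratio(ratio), 1), "nM (illustrative)")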

Credit: 
American Chemical Society

Weed's wily ways explained in Illinois research

image: Waterhemp, one of the most economically damaging agronomic weeds, has evolved resistance to many herbicides. In two studies, University of Illinois researchers explain some of the weed's strategies to evade chemical attacks.

Image: 
Lauren D. Quinn, University of Illinois

URBANA, Ill. - Like antibiotic-resistant bacteria, some herbicide-resistant weeds can't be killed by available chemicals. The problem affects more than just the errant weed in our driveways; herbicide-resistant weeds threaten our food supply, stealing resources and outcompeting the crops that make up our breakfast cereal and feed the nation's livestock.

The weed that represents the biggest threat to Midwestern corn and soybean production, waterhemp, has outsmarted almost every kind of herbicide on the market today.

University of Illinois scientists are working to reveal waterhemp's tricks. Through years of research, they discovered the weed can ramp up production of detoxifying enzymes that neutralize certain herbicides before they can disrupt essential cellular processes. Metabolic resistance, as this strategy is known, is just one process by which waterhemp evades herbicides. Unfortunately, because there may be hundreds of detoxifying enzymes involved, metabolic resistance is hard to identify and even harder to combat.

In two recent studies, Illinois researchers explain metabolic resistance to three commonly used herbicides in waterhemp, getting closer to finding important genetic cues. Results also confirm the importance of using a multi-pronged approach to waterhemp control.

"These waterhemp populations are adapting and evolving incredible abilities to metabolize everything. It's bad news, but at least we understand the mechanisms better. And ultimately, that understanding could potentially be exploited to use waterhemp's metabolic arsenal against itself," says Dean Riechers, professor in the Department of Crop Sciences at Illinois and co-author on both studies. "That's one interesting way our research could be directly applied to controlling this weed."

Last year, Illinois researchers documented resistance to Group 15 herbicides in waterhemp. This group of herbicides, including S-metolachlor, targets very-long-chain fatty acid production in sensitive plants. The researchers suspected it was also a case of metabolic resistance, and the Illinois team, led by graduate student Seth Strom, has now confirmed it in a study published in Pest Management Science.

"We were the first group in the world to show resistance to Group 15 herbicides in waterhemp, and now we have identified the mechanism behind it," Riechers says. "Again, it's not good news because it means we're running out of herbicides, and in this case it involves pre-emergence herbicides."

The study suggests that two classes of detoxifying enzymes, known as GSTs and P450s, neutralize S-metolachlor in resistant waterhemp.

Group 15 herbicides can be safely used in corn because the crop uses GSTs to naturally detoxify the chemicals; in other words, corn has a natural tolerance to these chemicals. Strom's research suggests waterhemp is not only able to mimic corn's natural detoxification mechanism, but it evolved an additional way to avoid being harmed by S-metolachlor.

Homing in on the two classes of detoxifying enzymes is not the end of the story, however. Because plants have hundreds of enzymes in each class, the researchers have more work ahead of them to identify the specific genes that are activated.

In a separate study, Riechers and another group of Illinois scientists revealed more of waterhemp's metabolic secrets.

"We have known for the last 10 years that whenever we see waterhemp with resistance to an HPPD inhibitor in the field, such as mesotrione, it has always shown metabolic atrazine resistance, too. However, it is possible for waterhemp to be resistant to atrazine and not mesotrione," Riechers says.

The apparent association between mesotrione and metabolic atrazine resistance could be coincidental, but given how often the resistances co-occur, Riechers thought the genes controlling resistance for the two chemicals might be shared or linked.

In a study published in Weed Science, graduate student Kip Jacobs demonstrated an overlap in the genes responsible for metabolic atrazine and mesotrione resistance. Because researchers already knew the single gene for metabolic atrazine resistance, the results get them closer to understanding the genes conferring mesotrione resistance.

"Whenever we find out whether it's two or three or four genes involved in mesotrione resistance, our results tell us one of them should be the metabolic atrazine resistance gene," Riechers says. "We know which one that is."

Unfortunately, even if researchers are able to trace each resistance trait back to the genetic level, that won't ensure an easy solution to the problem. Experts say there are no new herbicide sites-of-action coming into the marketplace, so farmers will need to consider alternative methods of weed control.

"With metabolic resistance, our predictability is virtually zero. We have no idea what these populations are resistant to until we get them under controlled conditions. It's just another example of how we need a more integrated system, rather than relying on chemistry only. We can still use the chemistry, but have to do something in addition," says Aaron Hager, associate professor in the Department of Crop Sciences at Illinois and co-author on the Pest Management Science study. "We have to rethink how we manage waterhemp long term."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Self-powered 'paper chips' could help sound an early alarm for forest fires

image: A sensor (white strip) on a houseplant activates an alarm when fire is near.

Image: 
Adapted from <i>ACS Applied Materials & Interfaces</i> <b>2020</b>, DOI: 10.1021/acsami.0c04798

Recent devastating fires in the Amazon rain forest and the Australian bush highlight the need to detect forest fires at early stages, before they blaze out of control. Current methods include infrared imaging satellites, remote sensing, watchtowers and aerial patrols, but by the time they sound the alarm, it could be too late. Now, researchers reporting in ACS Applied Materials & Interfaces have developed self-powered "paper chips" that sense early fires and relay a signal.

Previously, scientists have proposed placing a network of sensors in the forest that could detect changes in temperature, smoke or humidity and wirelessly transmit a signal to responders. However, such a system hasn't yet seemed practical because all of the sensing components require power. Batteries would eventually go dead and need to be replaced. Thermoelectric materials, which convert temperature differences into electricity, could simultaneously detect temperature increases from fires and power themselves. However, most of these materials are solid inorganic semiconductors, which are often expensive, rigid and environmentally unfriendly. Yapei Wang and colleagues wanted to find out if ionic liquids could be used as thermoelectric materials for fire sensing. These fluids are salts in the liquid state, and two different types of ionic liquids can be connected in series to generate signals.

To make paper-based thermoelectric sensors, the researchers chose two ionic liquids that behaved differently when the temperature increased: One adsorbed to the surface of gold electrodes, while the other desorbed, producing opposite (positive or negative) voltages. They deposited each ionic liquid like an ink between two gold electrodes that were sputtered onto a piece of ordinary paper. When connected in series, the two ionic liquids produced an electric signal when a large temperature difference occurred, as would happen in a fire. In a pilot test of the new sensor, the researchers attached one to a houseplant. When they placed a flaming cotton ball close to the plant's roots, the temperature at the bottom of the sensor quickly increased, producing a voltage signal that an attached microcomputer chip wirelessly transmitted to a receiver. Upon picking up the signal, the receiver activated a sound alarm and a red light. The thermoelectric paper chips are cheap ($0.04), and the materials are eco-friendly, the researchers say.
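Conceptually, the receiver side of such a network is simple threshold logic: poll the transmitted sensor voltage and raise the alarm when it jumps. A toy Python sketch with simulated readings and an invented threshold (nothing below comes from the paper):

import time

VOLTAGE_THRESHOLD_MV = 5.0   # illustrative alarm threshold

def read_sensor_voltage(t):
    """Simulated thermoelectric voltage (mV): flat until a 'fire' at t = 3 s."""
    return 0.5 if t < 3 else 12.0

def monitor(poll_seconds=1.0, max_polls=10):
    """Poll the sensor and raise the alarm when the voltage crosses the threshold."""
    for i in range(max_polls):
        voltage = read_sensor_voltage(i * poll_seconds)
        if voltage > VOLTAGE_THRESHOLD_MV:
            print(f"t={i * poll_seconds:.0f}s: FIRE WARNING - sound alarm, switch on red light")
            return
        time.sleep(poll_seconds)

monitor()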

Credit: 
American Chemical Society

UConn researchers overcome a vexing problem in vaccine research

Researchers at UConn's Center of Excellence in Vaccine Research (CEVR) have made a breakthrough in vaccine development for a common and difficult-to-treat pneumonia-causing pathogen. Their research was recently published in the Nature Partner Journal npj Vaccines.

For Mycoplasma pneumoniae, vaccine development has been stalled since the 1960s due to a phenomenon called vaccine-enhanced disease (VED), or vaccine-induced disease exacerbation. A vaccine for this type of community-acquired pneumonia has long been sought, since the illness can pose problems for closed community settings such as military bases, hospitals, ships, college dormitories, and prisons.

"Two different vaccines were developed by the National Institutes of Health," says Assistant Professor in Pathobiology and Veterinary Science Steven Szczepanek. "In trials, most vaccinated subjects were protected from infection and showed no symptoms. However, for some vaccinated and infected subjects, symptoms were actually worse than those observed in people that did not receive the vaccine. This is vaccine-enhanced disease and is of course really bad."

A vaccine must strike a balance. The formulation needs just enough potency to ensure the immune system will be able to recognize a pathogen and easily kill it if the patient re-encounters it. If all goes according to plan, vaccinated patients are able to easily clear a reinfection without even knowing they were re-exposed. However, a vaccine can sometimes lead to an overreaction by the immune system upon reinfection. This vaccine-enhanced disease has been seen with other pathogens such as respiratory syncytial virus (RSV) and Dengue fever, and in animal models in SARS vaccine research, says Steven Geary, department head of Pathobiology and Veterinary Science and director of CEVR.

VED is contradictory to the very basis of vaccination.

"We're trying to develop prophylactic vaccines to prevent infections from occurring in healthy people. If the vaccines we develop will actually make infections worse in 1/3 people that get the vaccine, then most people are not going to take the vaccine - and rightfully so," says Szczepanek. "We're not talking about cancer therapeutics where the subject is already sick, where the potential benefit of finding a cure often outweighs the risk of an adverse event occurring. The medical community, and people in general, have very little tolerance for adverse events occurring in a product that is given to otherwise healthy individuals."

To get to the root cause of VED with M. pneumoniae vaccination, the researchers analyzed the building blocks of the bacteria -- the proteins, lipids, and lipoproteins -- to determine if they elicited an immune response.

"We decided to systematically tear the bug apart using different chemical and physical approaches and test different components as vaccines to see if we could identify what, exactly, was causing VED after infection. Before we started this process, we hypothesized that it was the membrane bound surface lipoproteins that were causing VED," says Szczepanek.

The team also studied details about the host immune system and what qualities of the pathogen would lead to the occurrence of VED.

"That's the $64,000 question. The short answer is that we don't know the full picture. Chemical signals used by the immune system called "cytokines" help to drive specific types of immune responses to different pathogens," says Szczepanek.

A confounding trend the researchers have found is that the cytokines that play a key role in vaccine protection against another pneumonia-causing bacterium, Streptococcus pneumoniae, are the same cytokines driving VED with M. pneumoniae. This is an example of the nuances and complexities behind vaccine development, explains Szczepanek.

"We can't even use what we know about immunity from one bacterial pathogen that causes a similar disease to understand what happens during infection with a different species. Each pathogen is complex and unique, so it seems that we will stay employed for many years to come."

The researchers were able to narrow down the candidates to certain lipoproteins on the surface of the bacteria to test their hypothesis about the immune-inducing culprit.

"After some pretty extensive testing we found out that we were right," says Szczepanek. "Chemical removal of the lipid portion of purified M. pneumoniae lipoproteins eliminated VED, and even drove some level of protection from infection. We still have some work to do to fully optimize the efficacy of a vaccine formulation, but we have identified and eliminated the cause of the nagging roadblock of VED that plagued the field for over half a century. Safety problems are no longer a concern for M. pneumoniae vaccines."

The road to a safe and effective vaccine is a long one, but the researchers at CEVR are excited to be moving forward after overcoming the difficult hurdle of VED, says Geary.

"We have to prepare and refine candidate M. pneumoniae vaccines that do not contain lipoproteins, and test them in our animal model. We will also be testing different adjuvants (compounds that are added to vaccines to increase the proper immune response). Once we have defined the precise vaccine formulation we will proceed with a phase 1 clinical trial in humans. If successful, we will continue on the FDA proscribed phase 2 and 3 clinical trials required for all human vaccines and hopefully then find a partner to produce and market it."

It is a team effort Geary adds, "The majority of the hands-on experimentation and data evaluation to date has been conducted by PhD candidates Arlind Mara and Tyler Gavitt, who will continue to perform the immunologic and vaccine efficacy analysis as this project progresses to the point of a successful vaccine."

UConn has filed a provisional patent application and the technology is available for licensing or partnering. For further information please contact Amit Kumar at a.kumar@uconn.edu.

Credit: 
University of Connecticut

Latest findings on bitter substances in coffee

image: Model of the bitter receptor TAS2R43 without extracellular domain. Within the binding pocket: Model of the bitter substance mozambioside (blue).

Image: 
©Leibniz-LSB@TUM; Dr. Antonella Di Pizio

Coffee is very popular around the world despite or perhaps because of its bitter taste. Compounds contained in the coffee such as caffeine contribute to the bitterness to varying degrees. A recent study conducted by the Leibniz-Institute for Food Systems Biology and the Technical University of Munich (TUM) provides new insights into the molecular interactions between bitter substances and bitter receptors. This is of relevance not only for taste perception.

Caffeine is surely the best-known bitter coffee constituent. However, this stimulating substance is not solely responsible for the bitter taste of the beverage. The latest findings from a study conducted by the Freising team of scientists confirm this. Using a cell-based testing system - a type of artificial tongue - and docking analyses, the team investigated five different bitter coffee constituents. The tests included the bitter substance mozambioside identified in Arabica beans, its roast product bengalensol, and the well-known coffee compounds cafestol, kahweol, and caffeine.

Based on the results of their study, the research team assumes that mainly two of the 25 human bitter taste receptors respond to the coffee's constituents. Whereas a relatively high concentration of caffeine is necessary to stimulate the receptors TAS2R46 and TAS2R43, considerably smaller amounts of the other four substances are needed. The caffeine concentration required to activate the bitter taste receptor TAS2R43 to the same degree as mozambioside or bengalensol was about 30 and 300 times higher, respectively, says lead author Tatjana Lang from the Leibniz-Institute for Food Systems Biology.

Bitter substance reduces bitter taste?

Further studies conducted by the researchers suggest that the bitter substances contained in coffee interact with each other. These studies showed that kahweol and mozambioside exhibit similar binding properties for the bitter taste receptor TAS2R43. Compared to mozambioside, however, kahweol receptor activation was relatively weak and, depending on the dose, was capable of inhibiting the mozambioside-induced activation of the bitter taste receptor. "We therefore assume that kahweol can reduce the bitter taste elicited by TAS2R43 by suppressing more effective bitter substances at the receptor," says principal investigator Maik Behrens, who is head of the research group Taste Systems Reception & Biosignals at the Leibniz-Institute.

Behrens adds that this effect could play a role in coffee preparations that do not include a filtering step, such as espresso or Turkish coffee, which are beverages that contain kahweol.

Bitter receptor affects gastric acid secretion

Behrens says the study results are exciting from another perspective as well, adding that: "All of our findings indicate that bitter coffee substances quite specifically activate two of the 25 bitter taste receptors. We furthermore know that both types of receptors are present not only in taste cells. TAS2R43 is also present in the stomach and in conjunction with caffeine plays a role in the regulation of gastric acid secretion. The question now arises as to how coffee constituents like bengalensol, which activate the receptor with much higher potency, might be involved in this regulatory process."

It is also interesting that many people do not possess the bitter taste receptor TAS2R43 due to a genetic variation. This could explain differences in individual coffee taste perception or tolerability, says Veronika Somoza, director of the Leibniz-Institute for Food Systems Biology. She adds that much more research is needed to elucidate the complex interaction of bitter substances, bitter receptors, and their effects on the human body.

Credit: 
Leibniz-Institut für Lebensmittel-Systembiologie an der TU München

NJIT researchers develop easier and faster way to quantify, explore therapeutic proteins

Researchers at New Jersey Institute of Technology in collaboration with Ohio University and Merck & Co. Inc. recently developed a new efficient method for targeted protein analysis -- one they say could speed up processes for disease testing, drug discovery and vaccine development.

The research, published in the journal Analytical Chemistry, highlights the team's new coulometric mass spectrometric (CMS) approach for determining the quantity of proteins in biological samples, potentially opening new doors for exploring proteins in the human body that may only be expressed at low levels, but which could serve important biological functions or roles as disease biomarkers, drug targets or therapeutic antibodies.

Researchers say the new mass spectrometry and electrochemistry-based approach -- capable of accurately quantifying a spectrum of small proteins to large monoclonal antibody drugs -- is an advance on current methods in the field of absolute protein quantitation, which typically require time-consuming and costly preparation of synthesized standard material for analysis.

"Measurement of the molecular changes in disease-associated processes and pathways is critical to our understanding of pathogenesis and discovery of new biomarkers for diagnosis and treatment of diseases," said Hao Chen, professor at NJIT's Department of Chemistry and Environmental Sciences, and the corresponding author of the paper. "With our new method, we've shown we can quantify a range of biomolecules accurately and quickly."

"This approach could benefit a range of life sciences research including combating the COVID-19 pandemic for instance, as it could be used to more quickly quantify various antibodies from patients to probe infection stage and to assist vaccine development," Chen added.

In proteomics, mass spectrometry analysis can offer researchers a way of quantifying thousands of proteins in a single experiment, under various conditions or stimuli. It can also be used to help uncover details about how certain proteins function, interact and change over time in healthy and disease cell states, and can reveal more about level changes of antibodies produced by the immune system to combat antigens, such as viruses or bacteria.

Pengyi Zhao, a Ph.D. researcher in Chen's lab and first author of the paper, says protein quantitation has been typically done through liquid chromatography-mass spectrometry methods that involve preparing synthetic isotope-labeled peptides. These labeled peptides are usually spiked in known concentration into samples to help determine the amount of a protein of interest based on the intensities of the protein's associated peptides relative to the added standards. "The expense and time it takes to synthesize these isotope-labeled peptide standards is a big issue hindering quantitative analysis in research and drug development," explained Zhao.
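The arithmetic behind that conventional approach is a simple ratio: with a known amount of labeled standard spiked in, the unknown amount follows from the light-to-heavy peak intensity ratio. A minimal Python sketch with illustrative numbers:

def protein_amount(light_intensity, heavy_intensity, spiked_fmol):
    """Isotope-dilution estimate: unknown = spiked amount * (light / heavy)."""
    return spiked_fmol * light_intensity / heavy_intensity

# Illustrative values only: endogenous ('light') peptide signal vs. the spiked
# isotope-labeled ('heavy') standard of known amount.
print(protein_amount(light_intensity=2.4e6, heavy_intensity=1.2e6, spiked_fmol=100.0))
# -> 200.0 fmol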

Chen says the team's new CMS approach instead quantifies proteins based on the electrochemical signature produced during mass spectrometry analysis. "In this method, absolute protein quantitation is based on the electrochemical oxidation of a surrogate peptide from target protein combined with mass spectrometric measurement of the oxidation yield ... this breakthrough opens a new door to investigate many proteins where no standard is available for analysis."
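In outline, a coulometric estimate of this kind rests on Faraday's law: the measured charge gives the moles of peptide actually oxidized, and the mass-spectrometric oxidation yield scales that up to the total amount. A rough Python sketch under those assumptions, with invented numbers rather than the authors' implementation:

FARADAY = 96485.0  # coulombs per mole of electrons

def total_peptide_moles(charge_coulombs, electrons_per_oxidation, oxidation_yield):
    """Coulometric estimate: moles oxidized = Q / (n * F), then divide by the
    oxidation yield (fraction oxidized, from the MS measurement) to get the
    total amount of the surrogate peptide."""
    oxidized = charge_coulombs / (electrons_per_oxidation * FARADAY)
    return oxidized / oxidation_yield

# Illustrative numbers: 1.93 microcoulombs passed, a 2-electron oxidation,
# and 40% of the peptide oxidized.
moles = total_peptide_moles(1.93e-6, 2, 0.40)
print(f"{moles * 1e12:.0f} pmol")  # -> 25 pmol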

The team demonstrated their new method by analyzing several proteins such as model proteins β-casein and apomyoglobin. In collaboration with Yong-Ick Kim's research group at NJIT, they also successfully quantified a key protein involved in the circadian clock, called KaiB. The team used tyrosine-containing peptides as surrogate peptides for quantitation, finding the results of the CMS analysis comparable in accuracy to the results produced by traditional isotope-labeling methods overall.

"Currently, through this proof-of-concept we've shown this method can accurately quantify various proteins from apomyoglobin to therapeutic antibodies," said Chen. "As our method does not need standards, it would enable a large-scale absolute quantitation analysis of proteins in blood, tissues, or organs, which would need thousands of expensive heavy isotope-labeled peptide standards otherwise. In the next steps, we'll apply this new method for large scale protein quantitation in different biological samples, for disease biomarker discovery."

Credit: 
New Jersey Institute of Technology

How Toxoplasma parasites glide so swiftly (video)

image: A new study shows how Toxoplasma parasites glide during an infection.

Image: 
American Chemical Society

If you're a cat owner, you might have heard of Toxoplasma gondii, a protozoan that sometimes infects humans through contact with contaminated feces in litterboxes. Although harmless to most people, T. gondii can cause serious illness or death in immunocompromised individuals or fetuses of infected pregnant women. Now, researchers reporting in ACS Nano have studied how the microorganism glides so swiftly through mammalian tissues during an infection. Watch a video of the parasites here.

According to the U.S. Centers for Disease Control and Prevention, about 11% of the U.S. population, and up to 60% of people in some parts of the world, have been infected with T. gondii, which can also be transmitted through contaminated food and water. Although the parasite infects most mammals, it only reproduces sexually in cats, which can expel large numbers of T. gondii oocysts in their feces. Once ingested by people or animals, the oocysts' envelopes are broken down by digestive enzymes, releasing parasites that can enter cells of the small intestine. There, the parasites transition into what is called the tachyzoite stage, in which they can move very quickly, massively multiply inside host cells and spread throughout the body, forming long-lived cysts in tissues such as muscle, eye and brain. Isabelle Tardieux and colleagues wanted to determine how these tiny tachyzoites glide so swiftly through tissues in a unique helical motion.

To find out, the researchers combined several types of high-resolution and high-speed 2D and 3D live imaging with force microscopy methods. They examined the parasites' movements through collagen fibers that mimicked the extracellular matrix -- a dense network of proteins that surrounds cells in tissues. Tachyzoites squeezed through the collagen meshwork by first pausing and forming a kink in the front part of their bodies. Then, the cell bodies contracted, and the parasites surged forward with a spring-like motion. Delving deeper with the help of photomicropatterning and machine learning approaches, the researchers found that these movements were caused by the formation and breakage of specific attachments between the protozoans and collagen fibers, resulting in the buildup of contractile forces in the parasites' cytoskeletons. When the front tip of a parasite released its hold on the fibers, it sprung forward with a super-fast, helical glide. 

Credit: 
American Chemical Society

Envy divides society

It's generally recognized that differences in background and education cement class differences. It is less clear when and under what circumstances individual psychological forces can drive an initially homogeneous social group apart and ultimately divide it. Claudius Gros, professor for theoretical physics at Goethe University, investigated this question in a mathematically precise way using game theory methods. "In the study, societies of agents - acting individuals - are simulated within game theory, which means that everybody optimises her/his success according to predetermined rules. I wanted to find out whether social differences can emerge on their own if no one starts off with advantages - that is, when all actors have the same skills and opportunities," the physicist explains.

The study is based on the assumption that there are things in every society that are coveted but limited - such as jobs, social contacts and positions of power. An inequality is created if the top position is already occupied and someone must therefore accept the second-best job - but not, however, a societal division. With the help of mathematical calculations Gros was able to demonstrate that envy, which arises from the need to compare oneself with others, alters individual behaviour and consequently the agents' strategies in characteristic ways. As a result of this changed behaviour, two strictly separate social classes arise.

Game theory provides the mathematical tools necessary for the modelling of decision situations with several participants, as in Gros' study. In general, constellations in which the decision strategies of the individual actors mutually influence each other are particularly revealing. The success of the individual depends then not only on his or her own actions, but on others' actions as well, which is typical of both economic and social contexts. Game theory is consequently firmly anchored in the economy. The stability condition of game theory, the "Nash equilibrium", is a concept developed by John Forbes Nash in his dissertation in 1950, using the example of poker players. It states that in equilibrium no player has anything to gain by changing their strategy if the other players do not change theirs either. An individual only tries out new behaviour patterns if there is a potential gain. Since this causal chain also applies to evolutionary processes, the evolutionary and behavioural sciences regularly fall back on game theoretical models, for example when researching animal behaviours such as the migratory flight routes of birds, or their competition for nesting sites.
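To make the Nash condition concrete, here is a tiny Python check of it for a two-player game given as payoff matrices - a textbook prisoner's dilemma, not Gros' shopping trouble model:

import itertools

def is_nash(payoff_a, payoff_b, strategy_a, strategy_b):
    """True if neither player gains by unilaterally switching pure strategies."""
    best_a = max(payoff_a[s][strategy_b] for s in range(len(payoff_a)))
    best_b = max(payoff_b[strategy_a][s] for s in range(len(payoff_b[0])))
    return (payoff_a[strategy_a][strategy_b] >= best_a and
            payoff_b[strategy_a][strategy_b] >= best_b)

# Prisoner's dilemma payoffs: rows = player A's strategy, columns = player B's
# (0 = cooperate, 1 = defect).
A = [[3, 0],
     [5, 1]]
B = [[3, 5],
     [0, 1]]

for sa, sb in itertools.product(range(2), range(2)):
    if is_nash(A, B, sa, sb):
        print("Nash equilibrium at strategies:", sa, sb)  # -> 1 1 (defect, defect)

At the equilibrium (both defect) neither player can improve by switching alone, even though mutual cooperation would pay both more - the same stability notion that keeps Gros' envy-divided society locked in place.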

Even in an envy-induced class society there is no incentive for an individual to change his or her strategy, according to Gros. It is therefore Nash stable. In the divided envy society there is a marked difference in income between the upper and lower classes, and within each class the income is the same for all members. Typical for the members of the lower class is, according to Gros, that they spend their time on a series of different activities, something game theory terms a "mixed strategy". Members of the upper class, however, concentrate on a single task, i.e., they pursue a "pure strategy". It is also striking that the upper class can choose between various options while the lower class only has access to a single mixed strategy. "The upper class is therefore individualistic, while agents in the lower class are lost in the crowd, so to speak," the physicist sums up.

In Claudius Gros' model, whether an agent lands in the upper or lower class is ultimately a matter of coincidence. It is decided by the dynamics of competition, and not by origin. For his study, Gros developed a new game theoretical model, the "shopping trouble model" and worked out a precise analytical solution. From it, he derives that an envy-induced class society possesses characteristics that are deemed universal in the theory of complex systems. The result is that the class society is beyond political control to a certain degree. Political decision-makers lose a portion of their options for control when society spontaneously splits into social classes. In addition, Gros' model demonstrates that envy has a stronger effect when the competition for limited resources is stronger. "This game theoretical insight could be of central significance. Even an 'ideal society' cannot be stably maintained in the long term - which ultimately makes the striving for a communistic society seem unrealistic," the scientist remarks.

Credit: 
Goethe University Frankfurt

A Neandertal from Chagyrskaya Cave

image: Researchers have sequenced the genome of a Neandertal from Chagyrskaya Cave in the Altai Mountains to high quality.

Image: 
Dr. Bence Viola, Dept. of Anthropology, U. of Toronto

The researchers extracted the DNA from bone powder and sequenced it to high quality. They estimate that the female Neandertal lived 60,000-80,000 years ago. From the variation in the genome they estimate that she and other Siberian Neandertals lived in small groups of less than 60 individuals. The researchers also show that the Chagyrskaya Neandertal was more closely related to the Croatian Neandertal than to the other Siberian Neandertal, which lived some 40,000 years before the Chagyrskaya Neandertal. This shows that Neandertal populations from the West at some point replaced other Neandertal populations in Siberia.

"We also found that genes expressed in the striatum of the brain during adolescence showed more changes that altered the resulting amino acid when compared to other areas of the brain", says Fabrizio Mafessoni, lead author of the study. The results suggest that the striatum - a part of the brain which coordinates various aspects of cognition, including planning, decision-making, motivation and reward perception - may have played a unique role in Neandertals.

Credit: 
Max Planck Institute for Evolutionary Anthropology

Microbes might manage your cholesterol

In the darkest parts of the world where light fails to block out the unfathomable bounty of the stars, look up. There are still fewer specks illuminating the universe than there are bacteria in the world, hidden from sight, a whole universe inside just one human gut.

Many species are known, like E. coli, but many more, sometimes referred to as "microbial dark matter," remain elusive. "We know it's there," said Doug Kenny, a Ph.D. candidate in the Graduate School of Arts and Sciences, "because of how it affects things around it." Kenny is co-first author on a new study in Cell Host and Microbe that illuminates a bit of that microbial dark matter: a species of gut bacteria that can affect cholesterol levels in humans.

"The metabolism of cholesterol by these microbes may play an important role in reducing both intestinal and blood serum cholesterol concentrations, directly impacting human health," said Emily Balskus, professor of chemistry and chemical biology at Harvard University and co-senior author with Ramnik Xavier, , core member at the Broad, co-director of the Center for informatics and therapeutics at MIT and investigator at Massachusetts General Hospital. The newly discovered bacteria could one day help people manage their cholesterol levels through diet, probiotics, or novel treatments based on individual microbiomes.

According to the Centers for Disease Control and Prevention (CDC), in 2016, over 12 percent of adults in the United States age 20 and older had high cholesterol levels, a risk factor for the country's number one cause of death: heart disease. Only half of that group take medications like statins to manage their cholesterol levels; while such drugs are a valuable tool, they don't work for all patients and, though rare, can have concerning side effects.

"We're not looking for the silver bullet to solve cardiovascular disease," Kenny said, "but there's this other organ, the microbiome, another system at play that could be regulating cholesterol levels that we haven't thought about yet."

The hog sewage lagoon

Since the late 1800s, scientists have known that something was happening to cholesterol in the gut. Over decades, work inched closer to an answer. One study even found evidence of cholesterol-consuming bacteria living in a hog sewage lagoon. But those microbes preferred to live in hogs, not humans.

Prior studies are like a case file of clues (one lab even isolated the telltale microbe in 1977, but the samples were lost). One huge clue is coprostanol, the byproduct of cholesterol metabolism in the gut. "Because the hog sewage lagoon microbe also formed coprostanol," said Balskus, "we decided to identify the genes responsible for this activity, hoping we might find similar genes in the human gut."

Meanwhile, Damian Plichta, a computational scientist at the Broad Institute and co-first author with Kenny, searched for clues in human data sets. Hundreds of species of bacteria, viruses and fungi that live in the human gut have yet to be isolated and described, he said. But so-called metagenomics can help researchers bypass a step: Instead of locating a species of bacteria first and then figuring out what it can do, they can analyze the wealth of genetic material found in human microbiomes to determine what capabilities those genes encode.

Plichta cross-referenced massive microbiome genome data with human stool samples to find which genes corresponded with high levels of coprostanol. "From this massive amount of correlations," he said, "we zoomed in on a few potentially interesting genes that we could then follow up on." Meanwhile, after Balskus and Kenny sequenced the entire genome of the cholesterol-consuming hog bacterium, they mined the data and discovered similar genes: A signal that they were getting closer.
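The correlation screen described here can be summarized in a few lines: for each candidate gene, compare coprostanol levels between samples whose metagenomes do and do not carry it. A schematic Python sketch with invented values, not the study's actual pipeline:

import statistics

# Hypothetical per-sample data: does the metagenome carry a candidate gene,
# and what coprostanol level was measured in the matched stool sample?
samples = [
    {"has_gene": True,  "coprostanol": 8.2},
    {"has_gene": True,  "coprostanol": 7.5},
    {"has_gene": True,  "coprostanol": 9.1},
    {"has_gene": False, "coprostanol": 2.1},
    {"has_gene": False, "coprostanol": 1.4},
    {"has_gene": False, "coprostanol": 2.9},
]

with_gene = [s["coprostanol"] for s in samples if s["has_gene"]]
without_gene = [s["coprostanol"] for s in samples if not s["has_gene"]]

print("mean with gene:   ", statistics.mean(with_gene))
print("mean without gene:", statistics.mean(without_gene))
# A large difference flags the gene as a candidate for follow-up.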

The human connection

Then Kenny narrowed their search further. In the lab, he inserted each potential gene into bacteria and tested which made enzymes to break down cholesterol into coprostanol. Eventually, he found the best candidate, which the team named the Intestinal Steroid Metabolism A (IsmA) gene.

"We could now correlate the presence or absence of potential bacteria that have these enzymes with blood cholesterol levels collected from the same individuals," said Xavier. Using human microbiome data sets from China, Netherlands and the United States, they discovered that people who carry the IsmA gene in their microbiome had 55 to 75 percent less cholesterol in their stool than those without.

"Those who have this enzyme activity basically have lower cholesterol," Xavier said.

The discovery, Xavier said, could lead to new therapeutics--like a "biotic cocktail" or direct enzyme delivery to the gut--to help people manage their blood cholesterol levels. But there's a lot of work to do first: The team may have identified the crucial enzyme, but they still need to isolate the microbe responsible. They need to prove not just correlation but causation--that the microbe and its enzyme are directly responsible for lowering cholesterol in humans. And, they need to analyze what effect coprostanol, the reaction byproduct, has on human health.

"It doesn't mean that we're going to have answers tomorrow, but we have an outline of how to go about it," Xavier said.

Credit: 
Harvard University

4,000th comet discovered by ESA and NASA Solar Observatory

video: ESA and NASA's SOHO has discovered 4,000 comets in nearly 25 years. Karl Battams, who leads the mission's comet-finding program, talks about four of his favorite comets first spotted by the Sun-watching observatory.

Watch on YouTube: https://youtu.be/2wT4ZQG19S0

Download in HD: https://svs.gsfc.nasa.gov/13623

Image: 
NASA's Goddard Space Flight Center

On June 15, 2020, a citizen scientist spotted a never-before-seen comet in data from the Solar and Heliospheric Observatory, or SOHO -- the 4,000th comet discovery in the spacecraft's 25-year history.

The comet is nicknamed SOHO-4000, pending its official designation from the Minor Planet Center. Like most other SOHO-discovered comets, SOHO-4000 is part of the Kreutz family of sungrazers. The Kreutz family of comets all follow the same general trajectory, one that carries them skimming through the outer atmosphere of the Sun. SOHO-4000 is on the small side, with a diameter in the range of 15-30 feet, and it was extremely faint and close to the Sun when discovered -- meaning SOHO is the only observatory that has spotted the comet, as it's impossible to see from Earth with or without a telescope.

"I feel very fortunate to have found SOHO's 4,000th comet. Although I knew that SOHO was nearing its 4,000th comet discovery, I did not initially think that this sungrazer would be it," said Trygve Prestgard, who first spotted the comet in SOHO's data. "It was only after discussing with other SOHO comet hunters, and counting through the most recent sungrazer discoveries, that the idea sunk in. I am honored to be part of such an amazing collaborative effort."

SOHO is a joint mission of the European Space Agency (ESA) and NASA. Launched in 1995, SOHO studies the Sun from its interior to its outer atmosphere, with an uninterrupted view from its vantage point between the Sun and Earth, about a million miles from our planet. But over the past two and a half decades, SOHO has also become the greatest comet finder in human history.

SOHO's comet-hunting prowess comes from a combination of its long lifespan, its sensitive instruments focused on the solar corona, and the tireless work of citizen scientists who scour SOHO's data for previously-undiscovered comets, which are clumps of frozen gases, rock and dust that orbit the Sun.

"Not only has SOHO rewritten the history books in terms of solar physics, but, unexpectedly, it's rewritten the books in terms of comets as well," said Karl Battams, a space scientist at the U.S. Naval Research Lab in Washington, D.C., who works on SOHO and manages its comet-finding program.

The vast majority of comets found in SOHO's data are from its coronagraph instrument, called LASCO, short for Large Angle and Spectrometric Coronagraph. Like other coronagraphs, LASCO uses a solid object -- in this case, a metal disk -- to block out the Sun's bright face, allowing its cameras to focus on the relatively faint outer atmosphere, the corona. The corona is critical to understanding how the Sun's changes propagate out into the solar system, making LASCO a key part of SOHO's scientific quest to understand the Sun and its influence.

But focusing on this faint region also means LASCO can do something other telescopes can't -- it can see comets flying extremely close to the Sun, called sungrazers, which are otherwise blotted out by the Sun's intense light and impossible to see. This is why nearly all of SOHO's 4,000 comet discoveries have come from LASCO's data.

Like most who have discovered comets in SOHO's data, Prestgard is a citizen scientist, searching for comets in his free time with the Sungrazer Project. The Sungrazer Project is a NASA-funded citizen science project, managed by Battams, which grew out of comet discoveries by citizen scientists early into SOHO's mission.

"I have been actively involved in the Sungrazer Project for about eight years. My work with sungrazers is what solidified my long-term interest in planetary science," said Prestgard, who recently completed a master's degree in geophysics from Université Grenoble Alpes in France. "I enjoy the feeling of discovering something previously unknown, whether this is a nice "real time" comet or a "long-gone" overlooked one in the archives."

In total, Prestgard has discovered around 120 previously-unknown comets using data from SOHO and NASA's STEREO mission.

Copious comets

This 4,000th comet discovery came earlier than scientists initially expected -- a byproduct of SOHO's teamwork with the Parker Solar Probe mission. In coordination with Parker Solar Probe's fifth flyby of the Sun, the SOHO team ran a special observation campaign in early June, increasing the frequency with which the LASCO instrument takes images of the Sun's corona, as well as doubling the exposure time for each image. These changes in LASCO's imaging were designed to help the instrument pick up faint structures that would later pass over Parker Solar Probe.

"Since Parker Solar Probe was crossing the plane of the sky as seen from Earth, the structures that we see from SOHO's coronagraphs will be in the path of Parker Solar Probe," said Angelos Vourlidas, an astrophysicist at the Johns Hopkins University Applied Physics Lab, in Laurel, Maryland, who works on the Parker Solar Probe and SOHO missions. "It's the optimal configuration to do this type of imaging."

These more-sensitive images also revealed a number of comets that, based on their brightness, would have been too faint to see in SOHO's regular, shorter-exposure images. SOHO typically sees an uptick in comet discoveries each June, because Earth's position in space places SOHO at a good angle to see sunlight reflecting off of comets following the Kreutz path, a family of comets that accounts for about 85% of the comets discovered by SOHO. But this June saw 17 comets discovered in the first nine days of the month, around double the normal rate of discoveries.

"Our exposure time is twice as long, so we're gathering way more light, and seeing comets that are otherwise too faint for us to see -- it's just like any long-exposure photography," said Battams. "It's possible that if we doubled exposure time again, we'd see even more comets."

SOHO is a cooperative effort between ESA and NASA. Mission control is based at NASA's Goddard Space Flight Center in Greenbelt, Maryland. SOHO's Large Angle and Spectrometric Coronagraph Experiment, or LASCO, which is the instrument that provides most of the comet imagery, was built by an international consortium, led by the U.S. Naval Research Lab.

Credit: 
NASA/Goddard Space Flight Center

Light-activated 'CRISPR' triggers precision gene editing and super-fast DNA repair

In a series of experiments using human cancer cell lines, scientists at Johns Hopkins Medicine say they have successfully used light as a trigger to make precise cuts in genomic material rapidly, using a molecular scalpel known as CRISPR, and observe how specialized cell proteins repair the exact spot where the gene was cut.

Results of the experiments, published June 11 in Science, not only reveal new details about the DNA repair process, but also are likely, the researchers say, to speed up and aid understanding of the DNA activity that typically causes aging and many cancers.

"Our new system of gene editing allows for targeted DNA cutting within seconds after activation. With previous technologies, gene editing could take much longer -- even hours," says postdoctoral fellow Yang Liu, Ph.D., a member of the Johns Hopkins Medicine research team.

The powerful CRISPR tool has, in recent years, enabled scientists to easily change, or "edit," DNA sequences and alter gene functions to speed the pace of research on gene-linked conditions.

Adapted from a naturally occurring gene editing system found in bacteria, CRISPR uses small sequences of genetic material called RNA as a kind of guide that is coded to match and bind to a specific sequence of genomic DNA within a cell. The CRISPR molecule also contains an enzyme called Cas9, which acts as the scalpel to cut out the DNA sequence. Then, the cell uses its own enzymes and proteins to repair the sliced DNA, often adding DNA sequences that scientists slip into the cell.
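As background on how the guide finds its target, here is a toy Python scan of a DNA sequence for a 20-nucleotide protospacer followed by the 'NGG' PAM motif that Cas9 requires - an illustration of the matching rule only, not the team's light-activated system:

import re

def find_cas9_targets(genome, protospacer):
    """Return positions where the 20-nt protospacer is followed by an NGG PAM."""
    assert len(protospacer) == 20
    pattern = re.compile(protospacer + "[ACGT]GG")
    return [m.start() for m in pattern.finditer(genome)]

# Toy sequence: the protospacer appears once, followed by a valid PAM ('TGG').
genome = "TTACG" + "GACGTTAGCCATGGATCCAA" + "TGG" + "CCGTA"
print(find_cas9_targets(genome, "GACGTTAGCCATGGATCCAA"))  # -> [5]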

Liu says that studying the DNA repair process has been hampered by an inability to damage the DNA, such as by using CRISPR, in a way that's fast, precise and "on demand."

For the new experiments, the scientists modified the CRISPR-Cas9 complex by engineering a light-sensitive RNA molecule that allows the CRISPR complex to cut genomic DNA in living cells only when exposed to a particular wavelength of light.

"The advantage of our technique is that researchers can get the CRISPR machinery to find its target without prematurely cutting the gene, holding back its action until exposed to light," says Johns Hopkins M.D.-Ph.D. candidate Roger Zou, also a member of the research team. "This allows researchers to have far more control over exactly where and when the DNA is cut," he adds.

Other research teams have experimented with both drugs and light activation to control CRISPR timing, says Taekjip Ha, Ph.D., Bloomberg Distinguished Professor of Biophysics and Biophysical Chemistry, Biophysics and Biomedical Engineering at Johns Hopkins University, and a Howard Hughes Medical Institute investigator. His team's experiments differ by improving the precise timing of CRISPR cuts and examining how quickly proteins repair the DNA damage.

For the current study, the Johns Hopkins team, led by Ha and Bin Wu, Ph.D., assistant professor of biophysics and biophysical chemistry at the Johns Hopkins University School of Medicine, delivered an electric pulse to cultures of human embryonic kidney cells and bone cancer cells, which opened pores in the cell membrane and allowed the CRISPR complex with the light-activated RNA molecule to slide into the cells. Then, the scientists waited 12 hours for the CRISPR complex to bind to a targeted spot on the genomic DNA.

When they shined a light on the cells, they tracked the amount of time it took for the CRISPR complex to make the cut.

The team found that within 30 seconds of shining the light on the cells, the CRISPR complex had cut more than 50 percent of its targets.

To further examine the timing of DNA repair, the Johns Hopkins scientists tracked when proteins involved in DNA repair latched on to the DNA cuts. They determined that repair proteins started their work within two minutes of the CRISPR activation, and the repair was completed as early as 15 minutes later.

"We have shown that light-activated gene cutting is very fast, and it has potentially wide applications in biomedical research." says Ha. "Revealing the timing of CRISPR gene cuts allows us to see biological processes far more precisely." Ha and the Johns Hopkins team have dubbed the technique "very fast CRISPR on demand."

Ha also noted that light-activation offers better location control than drugs that can diffuse widely in the cell.

The Johns Hopkins team also used high-resolution microscopes to "see" how repair proteins interact with the CRISPR cut site in living cells.

They used these microscopes and a focused beam of light to show that they could activate CRISPR cutting of one of two gene copies that are normally found in human cells. This capability, they say, offers opportunities for using CRISPR to study and eventually treat conditions linked to only one abnormal gene copy, such as Huntington's disease.

"There is a big research community interested in studying DNA damage and its impact," says Ha. "The technology we developed is well suited to study that."

Ha notes that scientists typically use ionizing radiation or chemicals to study DNA damage. While those methods can also be fast, he says, they are not specific to a certain genomic location.

The team has filed a provisional patent on the CRISPR technology described in this research.

Credit: 
Johns Hopkins Medicine