Culture

Mouse pups born from eggs derived from the granulosa cells that surround oocytes

image: This photo shows a mouse obtained from gPSC4-derived oocytes by IVM and IVF. The mouse is now about one year old. The gPSC4-derived mice are fertile, producing 13 and 7 pups in the first and second matings, respectively.

Image: 
Lin Liu

By introducing a chemical cocktail to granulosa cells, researchers in China induced the cells to transform into functional oocytes in mice. Once fertilized, these oocytes produced healthy offspring indistinguishable from naturally bred mice. The chemical reprogramming method appears December 24 in the journal Cell Reports.

Ovarian follicles are the basic functional unit of the ovary and consist of an oocyte, the immature egg, surrounded by granulosa cells. Granulosa cells are crucial to follicle development, and studies have shown that they possess stem cell-like plasticity.

"The thing about in vitro fertilization is that they only use the oocyte for the procedure," says senior author Lin Liu, of the College of Life Sciences at Nankai University. "After the egg retrieval, the granulosa cells in the follicle are discarded. It got us thinking, what if we can utilize these granulosa cells? Since every egg has thousands of granulosa cells surrounding it, if we can induce them into pluripotent cells and turn those cells into oocytes, aren't we killing two birds with one stone?"

Granulosa cells tend to undergo cell death and differentiation once removed from the follicles. Liu and his team, including PhD students Chenglei Tian and Haifeng Fu, developed a chemical "cocktail" of a ROCK inhibitor and crotonic acid for creating chemically induced pluripotent stem cells (CiPSCs) from granulosa cells. The team introduced the ROCK inhibitor to prevent cell death and promote proliferation. In combination with other key small molecules, crotonic acid facilitates the induction of granulosa cells into germline-competent pluripotent stem cells whose pluripotency resembles that of embryonic stem cells.

"It's a surprising result," says Liu. "The competency of induced pluripotent germline is usually lower than embryonic stem cells. Germline competency is crucial for germline cells to transfer genetic information to the next generation. With the co-formulation of Rock inhibitor and crotonic acid, it's not only more efficient, but the quality also increased."

Another cocktail, of a ROCK inhibitor and vitamin C, was then introduced to the germline-competent pluripotent stem cells to improve follicle development and induce meiosis, the cell-division process that produces gametes such as eggs. Germ cells and oocytes rejuvenated from granulosa cells exhibited high genomic stability and successfully produced offspring with normal fertility.

"We can consistently manipulate the concentration and treatment time of these small chemicals," says Liu. Compared to traditional stem cell-inducing methods such as transfection, which reprograms cells by introducing transcription factors to somatic cells, chemical treatment provides higher controllability. "Transfection method may have a higher risk of genetic instability."

"This is the first time we turned granulosa cells into oocytes, it is a crucial and interesting work in developmental and reproductive biology," he says. "But implementing this research to humans from mice still has a long way to go. I think it has more prospect in preserving fertility and endocrine function, than in treating infertility."

Credit: 
Cell Press

Cellular culprit suspected of pushing dengue fever from bad to worse is cleared by transcripts

image: Dengue-specific CD4 cells have both a pro-inflammatory function and an anti-inflammatory function, which is typically not seen in acute viral infections. Each dot is a double-positive cell; its color represents that cell's expression level of TIGIT, an inhibitory immunoreceptor.

Image: 
Dr. Yuan Tian, La Jolla Institute for Immunology

LA JOLLA, CA--No one knows what makes a mild dengue viral infection morph into severe, and sometimes deadly, dengue hemorrhagic fever/dengue shock syndrome. Experts previously believed the likely cause was ramped-up activity of T cells, which can massively boost an immune response to a virus. Now, however, researchers at the La Jolla Institute for Immunology (LJI) have found definitive evidence that CD4 T cells, one of the two main subtypes of T cells, are not to blame.

The finding, reported in the December 24, 2019, issue of Cell Reports, is important to both the basic understanding of this disease--the world's most common mosquito-borne illness--and to the hunt for an effective vaccine for dengue.

"We found no evidence to support the common dogma that these T cells are responsible for turning a mild infection to a severe one. This will help us narrow the search for the true culprit," says the study's lead investigator Yuan Tian, Ph.D., an AAI Intersect Fellow and a Bioinformatics Student at LJI. He is also a postdoctoral fellow in the lab of Alessandro Sette, Dr. Sci. Biol, a co-author of the study.

These issues are serious. Dengue fever is spreading. Infected mosquitoes have expanded beyond their established tropical and subtropical territories in Southeast Asia and Latin America to new regions, including Europe and the United States. More than half of the world's population is now at risk; already, 390 million infections occur annually, according to public health experts.

The goal of the LJI study was to define the molecular profile of dengue-specific CD4 T cells and to investigate whether the T cell response differs between patients with mild dengue fever and those with severe dengue hemorrhagic fever.

When analyzing dengue-specific CD4 T cells, the researchers realized that the responding CD4 T cells have both a pro-inflammatory function (regulated by the cytokine interferon gamma, or IFN-γ) and an anti-inflammatory function (regulated by the cytokine interleukin 10, or IL-10), which is typically not seen in acute viral infections. To comprehensively define these dengue-virus-specific T cells in hospitalized patients, the researchers used whole-transcriptome analysis to determine whether there was a difference in the quality of the increased response.

This approach allowed the researchers to identify all RNA transcripts--produced when a gene's DNA sequence is copied, or transcribed--within the transcriptome of dengue-specific CD4 T cells from hospitalized patients being treated for either mild or severe dengue infection. The patients were being treated in Sri Lanka, where dengue fever is endemic.

"This is a very powerful approach to detect gene expression activity because all genes upregulated in response to the virus can be identified. It is completely unbiased and does not rely on pre-selected genes," says the study's senior investigator, Daniela Weiskopf, Ph.D., an instructor at LJI.

To their surprise, the research team detected no difference in the genomic profile of dengue-virus-specific CD4 T cells regardless of whether the cells were isolated from patients with mild or severe dengue infection.

"The CD4 T cell response in the severe disease does not look different so that cannot be the switch we are all looking for," Tian says. "In fact, based on some intriguing preliminary findings, we speculate that to counteract the severe immune response occurring in acute cases, these dengue-specific CD4 cells may have gradually acquired the ability to produce more IL-10 by converting IFN?. It is as if they are trying to calm themselves, calm the inflammation. The double positive CD4 T cells could actually be helping, rather than hurting."

Tian adds that he hopes these findings will serve to "help guide efforts to develop effective dengue vaccines by improving our understanding of this novel T cell response."

Credit: 
La Jolla Institute for Immunology

A molecular map of the brain's decision-making area

image: From left: Konstantinos Meletis, Antje Märtin, Daniela Calvigioni and Rania Tzortzi, researchers at the Department of Neuroscience at Karolinska Institutet.

Image: 
Juan Perez Fernandez

Researchers at Karolinska Institutet have come one step closer to understanding how the part of our brain that is central to decision-making and the development of addiction is organized at the molecular level. In mouse models, using methods for mapping cell types and brain tissue, the researchers visualized the organization of different opioid islands in the striatum. Their spatiomolecular map, published in the journal Cell Reports, may further our understanding of the brain's reward system.

The striatum is an inner part of the brain that, among other things, regulates reward, motivation, impulses and motor function. It is considered central to decision-making and the development of various addictions.

In this study, the researchers created a molecular 3D map of the nerve cells targeted by opioids, such as morphine and heroin, and showed how they are organized in the striatum. It is an important step toward understanding how the brain's network governing motivation and drug addiction is organized. The researchers also described a spatiomolecular code that can be used to divide the striatum into distinct subregions.

"Our map forms the basis for a new understanding of the brain's probably most important network for decision-making," says Konstantinos Meletis, associate professor at the Department of Neuroscience at Karolinska Institutet and the study's main author. "It may contribute to an increased understanding of both normal reward processes and the effects of various addictive substances on this network."

To find this molecular code, the researchers used single-nucleus RNA sequencing, a method for detecting small differences between individual cells, and mapped gene expression across the striatum. The results provide the first demonstration of molecular codes that divide the striatum at three levels of classification: a spatial, a patch-matrix and a cell-type-specific organization.

"With this new knowledge we may now begin to analyze the function of different types of nerve cells in different molecularly defined areas," says Meletis. "This is the first step in directly defining the networks' role in controlling decision-making and addiction with the help of optogenetics."

This new knowledge may also form the basis for the development of new treatments based on a mechanistic understanding of the brain's network, according to the researchers.

Credit: 
Karolinska Institutet

Why isn't there a vaccine for staph?

Staph bacteria, the leading cause of potentially dangerous skin infections, are most feared for the drug-resistant strains that have become a serious threat to public health. Attempts to develop a vaccine against methicillin-resistant Staphylococcus aureus (MRSA) have failed to outsmart the superbug's ubiquity and adaptability to antibiotics.

Now, a study from Washington University School of Medicine in St. Louis may help explain why previous attempts to develop a staph vaccine have failed, while also suggesting a new approach to vaccine design. This approach focuses on activating an untapped set of immune cells, as well as immunizing against staph in utero or within the first few days after birth.

The research, in mice, found that T cells -- one of the body's major types of highly specific immune cells -- play a critical role in protecting against staph bacteria. Most vaccines rely solely on stimulating the other main type of immune cells, the B cells, which produce antibodies to attack disease-causing microorganisms such as bacteria.

The findings are published online Dec. 24 in the Journal of Clinical Investigation.

"Across the globe, staph infections have become a pervasive health threat because of increasing antibiotic resistance," said senior author Juliane Bubeck Wardenburg, MD, PhD, director of the university's Division of Pediatric Critical Care. "Despite the medical community's best efforts, the superbug has shown a consistent ability to elude treatment. Our findings indicate that a robust T cell response is absolutely essential for protection against staph infections."

Highly contagious, staph survives and thrives on human skin and can be spread through skin-to-skin contact or exposure via contaminated surfaces. Generally, the bacteria live harmlessly and invisibly in about one-third of the population. From their residence on the skin, the bacteria can cause red, pus-filled sores. Ever persistent, the superbug will deliver recurrent infections in about half of its victims.

Staph strains can enter the bloodstream, bones or organs and lead to pneumonia, severe organ damage and other serious complications in hundreds of thousands of people each year. More than 10,000 people die in the U.S. from drug-resistant staph infections annually.

"The focus in the vaccine field for Staphylococcus aureus during the past 20 years has been on generating antibody responses, not on specific T cell responses," Bubeck Wardenburg said. "This new approach shows promise."

For nearly 15 years, Bubeck Wardenburg has studied a single toxin -- called alpha-toxin -- made by staph. This toxin plays a role in tissue damage in multiple forms of infection. "An important thing about the alpha-toxin is that it is found in all staph strains, meaning those that are and are not antibiotic-resistant," she said. "Understanding this allowed us to devise studies in mice that examined the effect of alpha-toxin on the immune response in minor skin infections as well as in more serious infections that spread in the bloodstream."

The researchers found that the immune cells did not protect mice that had minor staph infections on their skin. However, mice that were exposed to life-threatening staph infections in the bloodstream did develop protection. "We discovered a robust T cell response targeting staph in the bloodstream," Bubeck Wardenburg said. "By contrast, T cells were diminished in skin infections as a result of the toxin. Because skin infection is very common, we think that staph uses alpha-toxin to prevent the body from activating a T cell response that affords protection against the bacteria."

In terms of the big picture, Bubeck Wardenburg said blocking the toxin in skin infections may yield a healthy T cell response.

Further, protecting the T cell response from the time of birth may reprogram the bacteria's overall effect on the immune system. "This bug is deliberate and acts in a sinister way early on," she said. "The bug appears to be using the toxin to shape the T cell response in a way that's favorable for the bug but not for humans."

Previous vaccine development efforts have focused on adults. However, Bubeck Wardenburg said, a vaccine may be more likely to succeed if administered before infants first encounter staph. Therefore, immunization should happen before initial exposure to staph, to block the toxin and generate a vigorous T cell response.

"We envision two strategies," Bubeck Wardenburg said. "One is immunizing pregnant women so they can transfer antibodies that protect infants against the toxin at birth. The second involves immunizing infants within a day or two after birth. Neither of these strategies has been considered for staph vaccines to date."

Credit: 
Washington University in St. Louis

Lasers learn to accurately spot space junk

image: Beijing Fangshan Satellite Laser Observatory.

Image: 
Beijing Fangshan Satellite Laser Observatory

WASHINGTON, D.C., December 24, 2019 - Chinese researchers have improved the accuracy of detecting space junk in Earth's orbit, providing a more effective way to plot safe routes for spacecraft maneuvers.

"The possibility of successfully navigating an asteroid field is approximately 3,720 to one!" exclaimed C-3PO as Han Solo directed the Millennium Falcon into an asteroid field in "Star Wars: The Empire Strikes Back." Earth's orbit is nowhere near as dangerous, but after more than half a century of space activity, collisions between jettisoned engines and disintegrated spacecraft have formed a planetary scrapheap that spacecraft need to evade.

Scientists have developed space junk identification systems, but it has proven tricky to pinpoint the swift, small specks of space litter. A unique set of algorithms for laser-ranging telescopes, described in the Journal of Laser Applications, by AIP Publishing, significantly improves the success rate of space debris detection.

"After improving the pointing accuracy of the telescope through a neural network, space debris with a cross sectional area of 1 meter squared and a distance of 1,500 kilometers can be detected," said Tianming Ma, from the Chinese Academy of Surveying and Mapping, Beijing and Liaoning Technical University, Fuxin.

Laser ranging technology uses laser reflections from objects to measure their distance. But the echo signal reflected from the surface of space debris is very weak, reducing the accuracy. Previous methods improved laser-ranging detection of debris, but only to the 1-kilometer level.
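The underlying range measurement is simple time-of-flight arithmetic; a minimal illustrative sketch (the function name and the pulse timing are invented for the example, not taken from the paper):

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to a target from the round-trip time of a laser pulse:
    the pulse covers the distance twice, so divide by two."""
    return C * t_seconds / 2.0

# An echo arriving ~10 ms after emission corresponds to ~1,500 km,
# the debris distance quoted in the article.
print(round(range_from_round_trip(0.010007) / 1000))  # distance in km
```

The difficulty the article describes is not this arithmetic but detecting the faint echo at all, which is why pointing accuracy matters so much.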

Application of neural networks - algorithms modeled on the human brain's sensory inputs, processing and output levels - to laser ranging technologies has been proposed previously. However, Ma's study is the first time a neural network has significantly improved the pointing accuracy of a laser-ranging telescope.

Ma and colleagues trained a back-propagation neural network to recognize space debris, refined with two correcting algorithms. The Genetic Algorithm and the Levenberg-Marquardt algorithm optimized the neural network's thresholds for recognizing space debris, ensuring the network wasn't too sensitive and could be trained on localized areas of space. The team demonstrated the improved accuracy by testing against three traditional methods at the Beijing Fangshan laser-ranging telescope station.

Observation data from 95 stars were used to solve for each method's algorithm coefficients, and the accuracy of detecting 22 other stars was then assessed. The new pointing-correction algorithms proved the most accurate, as well as easy to operate, with good real-time performance.
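The train-then-validate workflow above can be illustrated with a toy back-propagation sketch. Everything here (network size, synthetic data, learning rate) is invented for illustration; the actual system maps telescope mount coordinates to pointing corrections and adds Genetic Algorithm and Levenberg-Marquardt optimization:

```python
import math
import random

random.seed(0)

H = 8                                      # hidden units
w1 = [random.uniform(-1.0, 1.0) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1.0, 1.0) for _ in range(H)]
b2 = 0.0
LR = 0.05                                  # learning rate

def forward(x):
    """Return hidden activations and network output for input x."""
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    return h, sum(w2[i] * h[i] for i in range(H)) + b2

# Synthetic "calibration stars": a made-up pointing offset vs. azimuth.
data = [(x / 10.0, 0.3 * math.sin(math.pi * x / 10.0)) for x in range(-10, 11)]

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

error_before = mse()
for _ in range(2000):                      # stochastic gradient descent
    for x, t in data:
        h, y = forward(x)
        err = y - t
        for i in range(H):
            # Gradient back-propagated through the tanh hidden unit.
            grad_h = err * w2[i] * (1.0 - h[i] ** 2)
            w2[i] -= LR * err * h[i]
            w1[i] -= LR * grad_h * x
            b1[i] -= LR * grad_h
        b2 -= LR * err
error_after = mse()
print(error_after < error_before)          # training reduces the fit error
```

In the study's version of this idea, the fitted correction is applied to the telescope's commanded pointing so the narrow laser beam actually lands on the debris.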

Ma aims to further refine the method. "Obtaining the precise orbit of space debris can provide effective help for the safe operation of spacecraft in orbit."

Credit: 
American Institute of Physics

Large scale feasts at ancient capital of Ulster drew crowds from across Iron Age Ireland

image: One of the analysed pig jaws for the study.

Image: 
Dr Richard Madgwick

People transported animals over huge distances for mass gatherings at one of Ireland's most iconic archaeological sites, research concludes.

Dr Richard Madgwick of Cardiff University led the study, which analysed the bones of 35 animals excavated from Navan Fort, the legendary capital of Ulster. Researchers from Queen's University Belfast, Memorial University Newfoundland and the British Geological Survey were also involved in the research.

The site had long been considered a centre for ritual gatherings: excavations there uncovered a huge building, 40 m in diameter, and a Barbary ape cranium, likely brought from at least as far away as Iberia. The results suggest the pigs, cattle and sheep were brought from across Ireland, perhaps reared as far afield as Galway, Donegal, Down, Tyrone and Antrim. Evidence suggests some were brought from more than 100 miles away.

Dr Madgwick, based in Cardiff University's School of History, Archaeology and Religion, said: "Our results provide clear evidence that communities in Iron Age Ireland were very mobile and that livestock were also moved over greater distances than was previously thought.

"The high proportion of pig remains found there is very rare for this period. This suggests that Navan Fort was a feasting centre, as pigs are well-suited as feasting animals and in early Irish literature pork is the preferred food of the feast.

"It is clear that Navan Fort had a vast catchment and that the influence of the site was far-reaching."

Researchers used multi-isotope analysis on samples of tooth enamel to unlock the origins of each animal. Food and water have chemical compositions linked to the geographical areas where they are sourced. When animals eat and drink, these chemical signals are archived in their teeth, allowing scientists to investigate the location where they were raised.

Co-author of the research, Dr Finbar McCormick, of Queen's University, Belfast, said: "In the absence of human remains, multi-isotope analysis of animals found at Navan Fort provides us with the best indication of human movement at that time.

"Feasting, almost invariably associated with sacrifice, was a social necessity of early societies where the slaughter of a large domesticate necessitated the consumption of a large amount of meat in a short period of time."

Earlier this year, Dr Madgwick's research of 131 pigs found at sites near Stonehenge revealed animals came from as far away as Scotland and numerous other locations across the British Isles. Before this, the origins of people who visited this area and the extent of the population's movements at the time had been long-standing enigmas in British prehistory.

Dr Madgwick added: "Transporting animals across the country would have involved a great deal of time and effort so our findings demonstrate the important role they played in society. Food was clearly a central part of people's exchanges and traditions."

Credit: 
Cardiff University

Bacteria can 'outsmart' programmed cell death

image: Shigella bacteria can override programmed cell death, thereby establishing a niche to spread.

Image: 
Hamid Kashkar, University of Cologne

Certain bacteria can override a defence mechanism of the immune system, so-called programmed cell death, by inhibiting death effector molecules with components of their outer membrane. Shigella bacteria, which cause diarrhoea, use lipopolysaccharides (LPS) on their surface to block the effector caspases; lipopolysaccharides are a component of the bacterial outer membrane. This strategy enables the bacteria to multiply within the cell. This is the result of a study conducted by the molecular immunologist Professor Hamid Kashkar and his team at the Institute for Medical Microbiology and Immunology at the CECAD Cluster of Excellence in Aging Research at the University of Cologne. The article 'Cytosolic Gram-negative bacteria prevent apoptosis by inhibition of effector caspases through LPS' by Günther et al. appeared in the current issue of Nature Microbiology.

Various bacterial pathogens can escape our immune system by staying and multiplying within our body cells (intracellularly). This intracellular propagation later leads to cell breakdown and the release of microorganisms that infect neighbouring cells, spread and cause tissue damage and infectious disease.
However, the body has a response to this bacterial strategy: programmed cell death, or apoptosis, which reacts to cellular stress during infection and triggers the rapid suicide of infected cells.

Thanks to this rapid self-destruction programme of our body cells, pathogens cannot multiply, and the immune system successfully eliminates them.
Scientists have previously observed that some pathogens can effectively block apoptosis, allowing them to reproduce and spread intracellularly. However, the molecular mechanism by which these bacteria 'outsmart' the immune system was largely unknown.

The Kashkar lab has now shown that Shigella, the pathogen that causes shigellosis, a typical cause of acute inflammatory diarrhoea, blocks apoptosis by efficiently inhibiting certain enzymes, so-called caspases, which act as the engines that initiate apoptosis.

The biologists showed that lipopolysaccharides bind and block these caspases. Bacteria lacking complete LPS, on the other hand, trigger apoptosis, which prevents them from reproducing intracellularly; they are successfully eliminated by the immune system and are thus no longer able to cause infectious disease. The Kashkar lab's work has thereby deciphered an important bacterial strategy for preventing the rapid death of the host cell and establishing a niche from which to spread.

Credit: 
University of Cologne

New technology allows control of gene therapy doses

image: Scripps Research Immunology Professor Michael Farzan, PhD, developed a gene therapy switch with his Jupiter, Florida-based team, postdoctoral researcher Guocai Zhong, PhD, and research assistant Haimin Wang.

Image: 
Scripps Research

JUPITER, Fla. -- Dec. 23, 2019 -- Scientists at Scripps Research in Jupiter have developed a special molecular switch that could be embedded into gene therapies to allow doctors to control dosing.

The feat, reported in the scientific journal Nature Biotechnology, offers gene therapy designers what may be the first viable technique for adjusting the activity levels of their therapeutic genes. The lack of such a basic safety feature has helped limit the development of gene therapy, which otherwise holds promise for addressing genetically based conditions. The scientists' technique appears to solve a major safety issue and may lead to more use of the strategy.

The Scripps Research team, led by principal investigator Michael Farzan, PhD, demonstrated the power of their new switching technique by incorporating it into a gene therapy that produces the hormone erythropoietin, used as a treatment for anemia. They showed that they could suppress expression of its gene to very low levels with a special embedded molecule, and could then increase the gene's expression, over a wide dynamic range, using injected control molecules called morpholinos that the U.S. Food and Drug Administration has found to be safe for other applications.

"I think that our approach offers the only practical way at present to regulate the dose of a gene therapy in an animal or a human," Farzan says.

Gene therapies work by inserting copies of a therapeutic gene into the cells of a patient who, for example, was born without functional copies of the needed gene. The strategy has long been seen as having enormous potential to cure diseases caused by defective genes. It also could enable the steady, long-term delivery to patients of therapeutic molecules that are impractical to deliver in pills or injections because they don't survive for long in the body. However, gene therapies have been viewed as inherently risky because once they are delivered to a patient's cells, they cannot be switched off or modulated. As a result, only a handful of gene therapies have been FDA-approved to date.

The simplicity of the technique, and the fact that morpholinos are already FDA-approved, could allow the new transgene switching system to be used in a wide variety of envisioned gene therapies, Farzan adds.

Farzan's team, including study co-first authors Guocai Zhong, PhD, and Haimin Wang, respectively a postdoctoral researcher and a research assistant in the Farzan lab, crafted a transgene switch from a family of ribonucleic acid (RNA) molecules called hammerhead ribozymes. These ribozymes have the remarkable property of cutting themselves in two as soon as they are copied out into RNA from the DNA that encodes them.

A therapeutic transgene containing the DNA of such a ribozyme will thus be copied out in cells into strands of RNA, called transcripts, that will tend to separate into two pieces before they can be translated into proteins. However, this self-cleaving action of the ribozyme can be blocked by RNA-like morpholinos that latch onto the ribozyme's active site; if this happens, the transgene transcript will remain intact and will be more likely to be translated into the therapeutic protein.

The ribozyme thus effectively acts as an "off switch" for the transgene, whereas the matching morpholinos, injected into the tissue where the transgene resides, can effectively turn the transgene back "on" again--to a degree that depends on the morpholino dose.

The scientists started with a hammerhead ribozyme called N107 that had been used as an RNA switch in prior studies, but they found that the difference in production of a transgene-encoded test protein between the "off" and "on" state was too modest for this ribozyme to be useful in gene therapies. However, over months of experimentation they were able to modify the ribozyme until it had a dynamic range that was dozens of times wider.

The team then demonstrated the ribozyme-based switch in a mouse model of an EPO gene therapy, which isn't yet FDA-approved but is considered potentially better than current methods for treating anemia associated with severe kidney disease. They injected an EPO transgene into muscle tissue in live mice, and showed that the embedded ribozyme suppressed EPO production to a very low level.

Injection of a small dose of the morpholino molecules into affected tissue strongly reversed that suppression, allowing EPO production to rise by a factor of more than 200--and stay there for more than a week, compared to a half-life of a few hours for EPO delivered by a standard injection. Those properties make the ribozyme-based switch potentially suitable for real clinical applications.
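The pharmacokinetic contrast can be made concrete with simple half-life arithmetic; the 4-hour half-life below is an assumed illustrative value, since the article says only "a few hours":

```python
def fraction_remaining(hours: float, half_life_h: float = 4.0) -> float:
    """Fraction of an injected protein left after exponential decay."""
    return 0.5 ** (hours / half_life_h)

# With an assumed 4 h half-life, a standard EPO injection is ~98% gone
# within a day, while the switched transgene kept expressing for over
# a week in the study's mice.
print(round(fraction_remaining(24), 4))
```

This is why sustained transgene expression, rather than repeated injections, is attractive for short-lived hormones like EPO.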

"We got what I would have said before was an impossible range of in vivo regulation from this system," Farzan says.

Farzan and his colleagues are now working to adapt their ribozyme switch technique so that it can be used as a gene therapy failsafe mechanism, deactivating errant transgenes permanently.

Credit: 
Scripps Research Institute

Many younger patients with stomach cancer have a distinct disease, Mayo research discovers

ROCHESTER, Minn. -- Many people under 60 who develop stomach cancer have a "genetically and clinically distinct" disease, new Mayo Clinic research has discovered. Compared to stomach cancer in older adults, this new, early onset form often grows and spreads more quickly, has a worse prognosis, and is more resistant to traditional chemotherapy treatments, the study finds. The research was published recently in the journal Surgery.

While rates of stomach cancer in older patients have been declining for decades, this early onset cancer is increasing and now makes up more than 30% of stomach cancer diagnoses.

"I think this is an alarming trend, as stomach cancer is a devastating disease," says senior author Travis Grotz, M.D., a Mayo Clinic surgical oncologist. "There is little awareness in the U.S. of the signs and symptoms of stomach cancer, and many younger patients may be diagnosed late -- when treatment is less effective."

The research team studied 75,225 cases using several cancer databases to review stomach cancer statistics from 1973 to 2015. Today, the average age of someone diagnosed with stomach cancer is 68, but people in their 30s, 40s and 50s are more at risk than they used to be.

Although there is no clear cutoff age separating early onset from late-onset stomach cancer, the researchers found the distinctions held true whether they used an age cutoff of 60, 50 or 40 years. They found that the incidence of late-onset stomach cancer decreased by 1.8% annually during the study period, while the incidence of the early onset disease decreased by 1.9% annually from 1973 to 1995 and then increased by 1.5% annually through 2013. The proportion of early onset gastric cancer has doubled, from 18% of all cases in 1995 to more than 30% today.
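As a sanity check on how such annual percent changes compound, a small illustrative calculation (the rates are from the article; the study's exact arithmetic may differ):

```python
def compound(rate_pct: float, years: int) -> float:
    """Multiplicative factor after `years` of a fixed annual % change."""
    return (1 + rate_pct / 100) ** years

# A 1.8% annual decline sustained over the 42-year study window
# (1973-2015) leaves roughly 47% of the starting incidence rate.
print(round(compound(-1.8, 42), 2))
```

Small annual rates add up: the same arithmetic shows why even a modest 1.5% yearly increase in early onset disease is an alarming long-run trend.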

"Typically, we see stomach cancer being diagnosed in patients in their 70s, but increasingly we are seeing 30- to 50-year-old patients being diagnosed," Dr. Grotz says.

The increased rate of the early onset disease is not from earlier detection or screening, Dr. Grotz adds. "There is no universal screening for stomach cancer, and the younger patients actually presented with later-stage disease than the older patients," he says.

In addition to being more deadly, early onset stomach cancer is also genetically and molecularly distinct, researchers found. Furthermore, traditional risk factors for developing stomach cancer among older Americans, such as smoking tobacco, did not appear to correlate with its early onset counterpart.

"Hopefully, studies like this will raise awareness and increase physician suspicion of stomach cancer, particularly in younger patients," Dr. Grotz says. Younger patients who feel full before finishing a meal, or have reflux, abdominal pain, unintentional weight loss and difficulty eating should see their health care provider, he adds.

Stomach cancer is the 16th most common cancer in the U.S., according to the American Cancer Society. It has a five-year survival rate of 31.5%, and there will be an estimated 27,510 new cases in 2019, according to the National Cancer Institute. The World Health Organization reports that cancer was the second leading cause of death globally in 2018 and that stomach cancer was the third most common cause of cancer death that year.

Next the research team hopes to better identify risk factors for early onset stomach cancer using the Rochester Epidemiology Project and potentially other large databases.

Credit: 
Mayo Clinic

Calcium channels play a key role in the development of diabetes

Researchers at Karolinska Institutet in Sweden have deciphered the diabetogenic role of a certain type of calcium channel in insulin-secreting beta cells. The researchers believe that blockade of these channels could be a potential new treatment strategy for diabetes. The study is published in the scientific journal PNAS.

CaV3.1 channels have a marginal role in healthy insulin-secreting beta cells in the endocrine pancreas but become hyperactive with the onset of diabetes. This raises the critical question of whether the hyperactivation of these calcium channels is a cause or a consequence of diabetes. Now, researchers at Karolinska Institutet have found that increased expression of CaV3.1 leads to excessive calcium influx, impairing the expression of exocytotic proteins in beta cells.

"This leads to a reduced insulin-secretion capacity of beta cells and aberrant glucose homeostasis," explains Dr. Jia Yu, first author of the study and Senior researcher at the Department of Molecular Medicine and Surgery, Karolinska Institutet.

The role of CaV3.1 in the development of diabetes was investigated with a series of approaches, including experiments on rat and human pancreatic islets and diabetic rats. The experimental models used suggest that the results apply to both type 1 and type 2 diabetes, but more studies are needed to verify this.

"Over a long period of time, the pathological role of beta cell CaV3.1 channels in the development of diabetes and its complications has been neglected," says Dr. Shao-Nian Yang, Associate professor at the Department of Molecular Medicine and Surgery, Karolinska Institutet, and senior author of the study. "Our work pinpoints an increased expression of these channels as a critical pathogenic mechanism in diabetes, meaning that CaV3.1 channels should not be neglected in diabetes research."

Now, the researchers want to work out whether increased expression of CaV3.1 also alters transcriptomic profiles in other cell types, such as vascular smooth muscle cells and immune T cells, thereby contributing to the development of diabetes and its complications.

"The selective blockade of CaV3.1 channels may have potential as a new mechanism-based treatment strategy," says Professor Per-Olof Berggren, Director of the Rolf Luft Research Center, Karolinska Institutet, and senior author of the study. "Clinical trials with CaV3.1 channel blockers in patients with diabetes will be one of our future study priorities."

The research was supported by the Swedish Diabetes Association, Karolinska Institutet's Foundations and Funds, the Swedish Research Council, the Novo Nordisk Foundation, the Family Erling-Persson Foundation, the Strategic Research Program in Diabetes at Karolinska Institutet, the European Research Council (ERC), the Knut and Alice Wallenberg Foundation, Skandia Insurance Company Ltd, the Diabetes and Wellness Foundation, the Bert von Kantzow Foundation, Lee Kong Chian School of Medicine, Nanyang Technological University and the Stichting af Jochnick Foundation.

Credit: 
Karolinska Institutet

Study reveals a role for jumping genes during times of stress

BOSTON - Only about 1% of human DNA codes for proteins, and approximately half of the rest of the genome is made up of what used to be called "junk" sequences that can copy themselves into RNA or DNA and jump from one location to another. Previous research led by investigators at Massachusetts General Hospital (MGH) had revealed a critical role for one of these jumping genes during times of stress. In new research published by the same group in the Proceedings of the National Academy of Sciences, the investigators report a surprising new property of this jumping RNA.

The sequences that jump from place to place in the genome are more formally known as transposable elements, and their role in health and disease is not fully understood. But it has long been suspected that they are more than just parasitic elements without useful function. In their original study, Jeannie Lee, MD, PhD, an investigator in the Department of Molecular Biology at MGH, and her colleagues found that one of these transposable elements--a very abundant, short interspersed nuclear element (SINE) called B2 in mice (ALU in humans)--makes an RNA that is cut when it associates with a protein called EZH2. However, at the time, they did not know how the RNA is cut. The researchers have now made the striking discovery that B2 and ALU cut themselves.

Until four decades ago, it was thought that only proteins could act as enzymes and that only enzymes could cut nucleic acids such as DNA and RNA. But in 1982, researchers showed that RNAs can function as enzymes as well--these RNAs are called ribozymes--a discovery that led to the Nobel Prize in Chemistry in 1989. Today, 15 classes of ribozymes have been described, but they have mostly been observed in bacteria and viruses. Very few are known in mammals such as humans, and their functions are mostly unclear.

Because B2 and ALU are so abundant in our cells, the Lee group's discovery puts a new twist on the ribozyme story. "B2 and ALU are present in hundreds of thousands of copies in our DNA and they become massively expressed during stress. This is a mind-boggling amount of ribozyme activity," said Lee. The team found that B2 and ALU are normally silent, but when subjected to heat or other forms of stress, they become activated. Moreover, their RNA-cutting activity is enhanced by an interaction with the EZH2 protein.

Lee noted that cells are continually challenged by stress, and a swift response can mean the difference between life and death. "Hinging the induction of stress-related genes to self-cutting RNAs seems highly adaptive," she said. "No new synthesis of gene products would be required and the critical event would instead be the recruitment of a protein factor, EZH2, that already exists inside cells and stands ready to be mobilized."

The findings may have important clinical implications for helping the body to respond to stress, such as during the development of infections, cancer or autoimmune disease.

Credit: 
Massachusetts General Hospital

Whales use stealth to feed on fish

image: A humpback whale lunges at a large patch of anchovies while the anchovies are also under threat from sea lion and avian predators.

Image: 
Cascadia Research Collective; NMFS Permit 16111

Small fish are speedy and easy to scare. So how is it that a giant humpback whale, attacking at speeds about as fast as a person jogs, is able to eat enough fish to sustain itself? Combining field studies, laboratory experiments and mathematical modeling, researchers at Stanford University have found a surprising answer to this seemingly paradoxical feat: Whales capture fish using stealth and deception.

From a conservation and ecological standpoint, this work also derived the first quantitative estimates of how many fish humpbacks consume in a single feeding event and over time.

"Lunge-feeding whales need dense concentrations of prey to forage effectively, yet fish schools could easily disperse and render lunge-feeding ineffective if they sensed a threat," said David Cade, lead author of the paper about this work, published Dec. 23 in Proceedings of the National Academy of Sciences. "We were interested in finding why these schools of fish did not run from this huge, looming predator."

The researchers conducted lab experiments to measure anchovies' escape reaction to a virtual whale - a widening dot, representing the expanding maw of a lunging whale. The models governing how quickly the dot widened were based on recordings from whale-mounted video tags that the researchers deployed in Monterey Bay and Southern California. They then used results from these experiments to predict how many fish would escape from an oncoming whale based on their reaction times.

"One of the innovations of this study was to use predator data to inform the models we played back to fish," said Cade, who was a graduate student in the lab of Jeremy Goldbogen, assistant professor of biology at Stanford, during this research. "This allowed us to discover that the range of values at which a fish responds to an oncoming predator are all passed nearly simultaneously at a point when the whale opens its mouth, suggesting that by precisely timing its engulfment, the whale can avoid triggering escape responses in fish."

Through these experiments, models and field observations, the researchers determined that whales overcome shortcomings in speed and maneuverability by waiting to open their mouths until they are very close to the fish - essentially a whale's way of sneaking up on its prey. The researchers did not see the same delays in whales pursuing krill, which are less reactive to looming predators.

"This made sense when we realized that fish have been evolving to avoid being eaten by smaller predators for at least 100 million years, but lunge-feeding is a relatively new feeding strategy, evolutionarily speaking," said Cade.

Humpbacks, like other members of the rorqual whale group, engage in lunge-feeding. This means they lunge after prey, take in a volume of water that can be larger than their own body (thanks to expandable oral cavities) and then filter out the excess water before gulping down their catch. Opening the mouth, then, is hydrodynamically costly - like opening a parachute at high speed - and feeding on fish requires the whales to time their lunges precisely. However, these costs are outweighed by the high energetic gains from captured prey: this research estimates that, for humpbacks, stealthy fish feeding is seven times more energetically efficient per lunge than feeding on krill.

Credit: 
Stanford University

A fast and inexpensive device to capture and identify viruses

image: An array of nanotubes decorated with gold nanoparticles captures virus particles.

Image: 
Terrones Lab/Penn State

A device to quickly capture and identify various strains of virus has been developed, according to researchers at Penn State and New York University.

Currently, virologists estimate that 1.67 million unknown viruses exist in animals, a number of which can be transmitted to humans. Known viruses, such as H5N1, Zika and Ebola, have caused widespread illness and death. The World Health Organization states that early detection can halt virus spread by enabling rapid deployment of countermeasures.

"We have developed a fast and inexpensive handheld device that can capture viruses based on size," said Mauricio Terrones, distinguished professor of physics, chemistry, and materials science and engineering at Penn State. "Our device uses arrays of nanotubes engineered to be comparable in size to a wide range of viruses. We then use Raman spectroscopy to identify the viruses based on their individual vibration."

This device, called the VIRRION, has a wide range of possible uses. For farmers, early detection of a virus in the field can save an entire crop, and early detection of a virus in livestock can save a herd from illness. Humans will also benefit from detection of viruses in minutes rather than the days required by current methods. Because of its size and low cost, such a device would be useful in every doctor's office as well as in remote locations when disease outbreaks occur.

"Most current techniques require large and expensive pieces of equipment," Terrones said. "The VIRRION is a few centimeters across. We add gold nanoparticles to enhance the Raman signal so that we are able to detect the virus molecule in very low concentrations. We then use machine learning techniques to create a library of virus types."

According to Professor Elodie Ghedin, a virologist at NYU, "The VIRRION enables the rapid enrichment of virus particles from any type of sample -- environmental or clinical -- which jump-starts viral characterization. This has applications in virus emergence, virus discovery and in diagnosis. Eventually, we hope to use this device for the capture and sequencing of single virions, giving us a much better handle on the evolution of the virus in real time."

Added lead author Ying-Ting Yeh, an assistant research professor in the Terrones group, "We synthesized a gradient of aligned carbon nanotube forest arrays to capture different viruses according to their size and detect them in-situ using Raman spectroscopy. We designed and assembled a portable platform that enriches virus particles from several milliliters of clinical samples in a couple of minutes."

Credit: 
Penn State

Land use caused broad-scale historical declines of large mammals across China

image: Map of the northern boundary of Asian elephant (Elephas maximus) in the study area over the past four millennia, based on multiple archaeological and historical sources. The distribution dynamics were inconsistent with the trend of mean annual temperature across the study area. Oracle bone scripts were used for divination by a cultural group recognized as Chinese ancestors ruling much of the North China Plain. The significant similarity between these scripts and their modern forms for the large mammals supports the past wide distribution of these taxa in ancient China.

Image: 
Shuqing Teng

Cultural evolution has been the dominant driver of range contractions in megafauna taxa across China since the beginning of the Common Era, with little or no direct importance of climate. A research team led by Aarhus University, with collaborators from Nanjing University, analyzed maps of megafauna distribution dynamics and societal development based on Chinese archival records, alongside climate data across China from 2 to 1953 CE.

Human activities are now playing a dominant role in driving changes in Earth's biodiversity and are responsible for the incipient sixth mass extinction, but the historical processes leading to this situation are poorly understood and are often studied without emphasis on cultural evolution as a potential key process underlying anthropogenic impacts. A team of researchers from Aarhus University and Nanjing University has now shown that cultural evolution overshadowed climate change in driving historical broad-scale biodiversity dynamics.

By mining the deep Chinese administrative records in relation to culturally important wild megafauna species as well as sociocultural development, the researchers identified the millennia-long spread of agricultural land and agricultural intensification, as well as the specific expansion of the Han culture, as the main cause of the extirpation of five megafauna species from much of China, with little or no direct importance of climate.

Cultural evolution, not climate change, as the main driver

"China's well-preserved written records for more than 2000 years provide a unique opportunity to reconstruct long-term dynamics of culture-nature interactions across large geographical extents," says senior author Professor Jens-Christian Svenning, director of Center for Biodiversity Dynamics in Changing World (BIOCHANGE), Aarhus University. The five studied megafauna taxa include Asiatic elephant (Elephas maximus), Asian rhinoceroses (Rhinoceros sondaicus, R. unicornis and Dicerorhinus sumatrensis), tiger (Panthera tigris), Asiatic black bear (Ursus thibetanus), and brown bear (Ursus arctos), all of which were widely distributed across the study area and have played an important role in ancient China's cultural activities.

"Ancient China used to host a highly biodiverse assemblage of large mammals even in its nowadays densely populated areas such as the North China Plain and the Middle-Lower Yangtze Plain. Our research shows that the relatively recent loss of this rich megafauna in large part can be attributed to the southward expansion of intensified agricultural practices with the Han culture, which originated in North China," explains postdoc Shuqing Teng from Aarhus University and Nanjing University, the first author of the study.

Regional extirpations of these taxa from the study area were consistent with the sociocultural dynamics described above, but inconsistent with climate change. There were at least two conspicuous cooling-warming cycles during the last two thousand years, including the Medieval Warm Period and the Little Ice Age, with mean annual temperature fluctuating by around 1 to 1.5 °C, but neither had a conspicuous effect on the megafauna range dynamics.

Importance of cultural filtering

The study provides clear evidence that cultural evolution historically has overshadowed past climate change in shaping broad-scale megafauna patterns, in contrast to the common belief that human societies were unimportant in driving biodiversity dynamics at such large spatiotemporal scales until recent time frames such as the Industrial Revolution or the Great Acceleration of the 20th century.

This finding highlights the importance of culture's role in filtering current ecological assemblages from historical species pools. Perspectives through the lens of cultural filtering should also stimulate thoughts on what is natural - notably helping to overcome the Shifting Baseline Syndrome, the tendency to accept an already degraded state as natural due to lacking recognition of earlier declines - and which natural world we aim to conserve or restore.

Furthermore, modifying cultural filters will be key to responding to the challenges of the Anthropocene biodiversity crisis, which, as this study of historical China shows, is fundamentally culturally driven. How to achieve such modification is an important research challenge in its own right.

Credit: 
Aarhus University

For restricted eaters, a place at the table but not the meal

ITHACA, N.Y. - Holiday celebrations often revolve around eating, but for those with food restrictions, that can produce an incongruous feeling when dining with friends and loved ones: loneliness.

People with restricted diets - due to allergies, health issues or religious or cultural norms - are more likely to feel lonely when they can't share in what others are eating, new Cornell University research shows.

"Despite being physically present with others, having a food restriction leaves people feeling left out because they are not able to take part in bonding over the meal," said Kaitlin Woolley, assistant professor of marketing in the Samuel Curtis Johnson Graduate School of Management and lead author of the research.

Across seven studies and controlled experiments, researchers found that food restrictions predicted loneliness among both children and adults.

The research also offers the first evidence, Woolley said, that having a food restriction causes increased loneliness. For example, in one experiment, assigning unrestricted individuals to experience a food restriction increased reported feelings of loneliness. That suggests such feelings are not driven by non-food issues or limited to picky eaters, Woolley said.

"We can strip that away and show that assigning someone to a restriction or not can have implications for their feeling of inclusion in the group meal," she said.

Further evidence came from a survey of observers of the Jewish holiday of Passover. When reminded during the holiday of the leavened foods they couldn't enjoy with others, participants' loneliness increased. Yet, within their own similarly restricted group, they felt a stronger bond.

Bonding over meals is an inherently social experience, Woolley notes. In previous research, she found that strangers felt more connected and trusting of each other when they shared the same food, and eating food from the same plate increased cooperation between strangers.

But when restricted from sharing in the meal, people suffer "food worries," Woolley said. They fret about what they can eat and how others might judge them for not fitting in.

Those worries generated a degree of loneliness comparable to that reported by unmarried or low-income adults, and stronger than that experienced by schoolchildren who were not native English speakers, according to the research. Compared with non-restricted individuals, having a restriction increased reported loneliness by 19%. People felt lonelier regardless of how severe their restriction was, or whether their restriction was imposed or voluntary.

The study concluded that food restrictions and loneliness are on the rise and "may be related epidemics," warranting further research.

To date, Woolley said, children have been the primary focus of research on the effects of food restrictions. A nationally representative survey she analyzed from the Centers for Disease Control and Prevention did not track the issue among adults.

But increasingly, she said, food restrictions are being carried into adulthood, or adults are choosing restricted diets such as gluten-free, vegetarian and vegan for health or ethical reasons. Up to 30% of all participants in her research deal with restrictions, Woolley said.

"This is a problem that I don't think people are quite aware of," she said, "and that has implications for people's ability to connect with others over eating."

Credit: 
Cornell University