
Switching DNA functions on and off by means of light

DNA (deoxyribonucleic acid) is the basis of life on Earth. The function of DNA is to store all the genetic information an organism needs to develop, function and reproduce. It is essentially a biological instruction manual found in every cell. Biochemists at the University of Münster have now developed a strategy for controlling the biological functions of DNA with the aid of light. This enables researchers to better understand and control the different processes which take place in the cell - for example epigenetics, the chemical modifications that serve as a key regulatory lever in DNA. The results have been published in the journal Angewandte Chemie.

Background and methodology

The cell's functions depend on special molecules, the enzymes. Enzymes are proteins that carry out chemical reactions in the cell. They help to synthesize metabolic products, make copies of the DNA molecules, convert energy for the cell's activities, change DNA epigenetically and break down certain molecules. A team of researchers headed by Prof. Andrea Rentmeister from the Institute of Biochemistry at the University of Münster used a so-called enzymatic cascade reaction in order to understand and track these functions better. This sequence of successive reaction steps involving different enzymes makes it possible to transfer so-called photocaging groups - chemical groups which can be removed by means of irradiation with light - to DNA. Previously, studies had shown that only small residues (small modifications such as methyl groups) could be transferred very selectively to DNA, RNA (ribonucleic acid) or proteins. "As a result of our work, it is now possible to transfer larger residues or modifications such as the photocaging groups just mentioned," explains Nils Klöcker, one of the lead authors of the study and a PhD student at the Institute of Biochemistry. Working together with structural biologist Prof. Daniel Kümmel, who also works at the Institute of Biochemistry, the team was also able to explain the basis for the changed activity at a molecular level.

Using so-called protein engineering - a method for which a Nobel prize was awarded in 2018 - the Münster researchers engineered one enzyme in the cascade, making it possible to switch DNA functions on and off by means of light. With the aid of protein design, it was possible to expand the substrate spectrum of enzymes - in this case, methionine adenosyltransferases (MATs). In their work, the researchers examined two MATs. The modifications carried out offer a starting point for developing other MATs with an expanded substrate spectrum. "Combining these MATs with other enzymes has potential for future cellular applications. This is an important step for implementing in-situ generated, non-natural substances for other enzymes in epigenetic studies," says Andrea Rentmeister.

Credit: 
University of Münster

HKU chemists develop a new drug discovery strategy for "undruggable" drug targets

image: Graphic illustration of the work: DNA-programmed affinity labelling (DPAL) enables the direct screening of DNA-encoded chemical libraries (DELs) against membrane protein targets on live cells to create novel drug discovery opportunities.

Image: 
The University of Hong Kong

A research team led by Dr Xiaoyu LI from the Research Division for Chemistry, Faculty of Science, in collaboration with Professor Yizhou LI from School of Pharmaceutical Sciences, Chongqing University and Professor Yan CAO from School of Pharmacy, Second Military Medical University in Shanghai has developed a new drug discovery method targeting membrane proteins on live cells.

Membrane proteins play important roles in biology, and many of them are high-value targets that are being intensively pursued in the pharmaceutical industry. The method developed by Dr Li's team provides an efficient way to discover novel ligands and inhibitors against membrane proteins, which remain largely intractable to traditional approaches. The development of the methodology and its applications are now published in Nature Chemistry, a prestigious chemistry journal by the Nature Publishing Group (NPG).

Background

Membrane proteins on the cell surface perform a myriad of biological functions that are vital to the survival of cells and organisms. Not surprisingly, numerous human diseases are associated with aberrant membrane protein functions. Indeed, membrane proteins account for over 60% of the targets of all FDA-approved small-molecule drugs. The G-protein coupled receptor (GPCR) superfamily alone, the largest class of cell-surface receptors, is the target of ~34% of all clinical drugs. However, despite this significance, drug discovery against membrane proteins is notoriously challenging, mainly due to the special properties of their natural habitat: the cell membrane. Moreover, membrane proteins are also difficult to study in an isolated form, as they tend to lose essential cellular features and may be deactivated. In fact, membrane proteins have long been considered a type of "undruggable" target in the pharmaceutical industry.

In recent years, the DNA-encoded chemical library (DEL) has emerged as a powerful drug screening technology. To simplify, we can use a book library as an example. In a library, each book is indexed with a catalogue number and spatially encoded with a specific location on a bookshelf. Analogously, in a DEL, each chemical compound is attached to a unique DNA tag, which serves as the "catalogue number" recording the structural information of the compound. With DNA encoding, all library compounds can be mixed and screened against the target simultaneously to discover the ones that can modulate the biological functions of the target, e.g. inhibiting the proteins that are aberrantly active in malignant cancers. DELs can contain astonishingly large numbers of test compounds (billions or even trillions), and DEL screening can be conducted in just a few hours in a regular chemistry lab. Today, DEL has been widely adopted by nearly all major pharmaceutical companies worldwide. However, DEL had also encountered significant difficulties in interrogating membrane proteins on live cells.
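The catalogue-number analogy can be made concrete with a toy model. The sketch below is illustrative only: real DELs are built by combinatorial split-and-pool synthesis and decoded by deep sequencing, and every tag, compound name and "affinity" here is invented.

```python
# Toy model of a DNA-encoded library (DEL) screen.
# Each compound carries a unique DNA "catalogue number"; after pooled
# selection against a target, the recovered tags are decoded to reveal hits.
# All tags, compound names, and affinities are invented for illustration.

library = {
    "ACGTACGT": "compound-001",
    "TTGACCGA": "compound-002",
    "GGCATCCA": "compound-003",
    "CAGTTGAC": "compound-004",
}

# Hypothetical retention fractions during selection (unknown in a real screen).
affinity = {
    "compound-001": 0.001,   # non-binder
    "compound-002": 0.900,   # strong binder
    "compound-003": 0.002,   # non-binder
    "compound-004": 0.450,   # moderate binder
}

def screen(library, affinity, threshold=0.1):
    """Return the DNA tags 'sequenced' after selection: only compounds
    whose retention exceeds the threshold survive the washing steps."""
    return {tag: cpd for tag, cpd in library.items()
            if affinity[cpd] > threshold}

hits = screen(library, affinity)
for tag, cpd in sorted(hits.items()):
    print(f"tag {tag} decodes to {cpd}")
```

The point of the toy model is that the entire mixed pool is screened in one pot, and the identities of the binders are read out afterwards from their DNA tags alone.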

2 Key findings: Tracking and Boosting

There are two hurdles that the team has overcome to enable the application of DEL on live cells. First, the cell surface is not a smooth convex shape like a balloon; it is extremely complex, with hundreds of different biomolecules in a rugged topology; thus, locating the desired target on the cell surface is like finding a single tree in a thick tropical forest. The team overcame this "target specificity" problem by using a method they previously developed: DNA-programmed affinity labelling (DPAL). This method utilises a DNA-based probe system that can specifically deliver a DNA tag to the desired protein on live cells, and the DNA tag serves as a beacon to direct target-specific DEL screening. In other words, the team first installed a "tracker" on the target to achieve screening specificity.

The second challenge is target abundance. Typically, membrane proteins exist at nanomolar to low micromolar concentrations, far below the high micromolar concentration needed to capture the tiny fraction of binders among billions of non-binders in a library. To solve this problem, the team employed a novel strategy: complementary sequences in the DNA tag on the target protein and in the library allow the library to hybridise close to the target, thereby "boosting" the effective concentration of the target protein. In other words, the "tracker" not only helps the library locate the target, but also creates an attractive force that concentrates the library around the target rather than letting it be distracted by the non-binding population.
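The size of this "boosting" effect can be estimated with a back-of-the-envelope calculation. This is a sketch under simple assumptions: the ~10 nm confinement radius and the idealized spherical volume below are illustrative choices, not values from the paper.

```python
import math

# Effective molarity of a single library member tethered near its target
# by DNA hybridisation. Assumption: hybridisation confines the compound
# within a sphere of radius ~10 nm around the target (illustrative).
N_A = 6.022e23          # Avogadro's number, 1/mol
r = 10e-9               # confinement radius in metres (assumed)

volume_m3 = (4.0 / 3.0) * math.pi * r**3
volume_L = volume_m3 * 1000.0           # 1 m^3 = 1000 L
c_eff = 1.0 / (N_A * volume_L)          # mol/L for one confined molecule

print(f"effective concentration ~ {c_eff * 1e3:.2f} mM")
```

Even with these rough numbers, the effective concentration comes out in the sub-millimolar range, orders of magnitude above the nanomolar abundance of a typical membrane protein, which is the essence of the proximity "boost".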

In the publication, the team reports their detailed methodology development, and they also demonstrate the generality and performance of this method by screening a 30.42-million-compound library against folate receptor (FR), carbonic anhydrase 12 (CA-12), and epidermal growth factor receptor (EGFR) on live cells, all of which are important targets in anti-cancer drug discovery. This approach is expected to be broadly applicable to many membrane proteins. For example, classical drug targets, such as GPCRs and ion channels, may be revisited in a live-cell setting to identify new drug discovery opportunities by harnessing the power of DEL.

"We expect that the utility of this method is not limited to drug discovery, but also extends to academic research exploring challenging biological systems, such as oligomeric membrane protein complexes and cell-cell communications," said Dr Xiaoyu Li.

Co-corresponding author Professor Yizhou Li from Chongqing University said: "This method has the potential to facilitate drug discovery for membrane proteins with the power of large and complex chemical diversity from DNA-encoded chemical libraries." Co-corresponding author Professor Yan Cao from Second Military Medical University in Shanghai added: "This technology is an effective tool for characterising ligand-target interaction; it will cast new light on the development of high throughput screening methods, and thus facilitate the fishing of ligands targeting membrane proteins."

Credit: 
The University of Hong Kong

Discovery boosts theory that life on Earth arose from RNA-DNA mix

LA JOLLA, CA--Chemists at Scripps Research have made a discovery that supports a surprising new view of how life originated on our planet.

In a study published in the chemistry journal Angewandte Chemie, they demonstrated that a simple compound called diamidophosphate (DAP), which was plausibly present on Earth before life arose, could have chemically knitted together tiny DNA building blocks called deoxynucleosides into strands of primordial DNA.

The finding is the latest in a series of discoveries, over the past several years, pointing to the possibility that DNA and its close chemical cousin RNA arose together as products of similar chemical reactions, and that the first self-replicating molecules--the first life forms on Earth--were mixes of the two.

The discovery may also lead to new practical applications in chemistry and biology, but its main significance is that it addresses the age-old question of how life on Earth first arose. In particular, it paves the way for more extensive studies of how self-replicating DNA-RNA mixes could have evolved and spread on the primordial Earth and ultimately seeded the more mature biology of modern organisms.

"This finding is an important step toward the development of a detailed chemical model of how the first life forms originated on Earth," says study senior author Ramanarayanan Krishnamurthy, PhD, associate professor of chemistry at Scripps Research.

The finding also nudges the field of origin-of-life chemistry away from the hypothesis that has dominated it in recent decades: The "RNA World" hypothesis posits that the first replicators were RNA-based, and that DNA arose only later as a product of RNA life forms.

Is RNA too sticky?

Krishnamurthy and others have doubted the RNA World hypothesis in part because RNA molecules may simply have been too "sticky" to serve as the first self-replicators.

A strand of RNA can attract other individual RNA building blocks, which stick to it to form a sort of mirror-image strand--each building block in the new strand binding to its complementary building block on the original, "template" strand. If the new strand can detach from the template strand, and, by the same process, start templating other new strands, then it has achieved the feat of self-replication that underlies life.

But while RNA strands may be good at templating complementary strands, they are not so good at separating from these strands. Modern organisms make enzymes that can force twinned strands of RNA--or DNA--to go their separate ways, thus enabling replication, but it is unclear how this could have been done in a world where enzymes didn't yet exist.

A chimeric workaround

Krishnamurthy and colleagues have shown in recent studies that "chimeric" molecular strands that are part DNA and part RNA may have been able to get around this problem, because they can template complementary strands in a less-sticky way that permits them to separate relatively easily.

The chemists also have shown in widely cited papers in the past few years that the simple ribonucleoside and deoxynucleoside building blocks, of RNA and DNA respectively, could have arisen under very similar chemical conditions on the early Earth.

Moreover, in 2017 they reported that the organic compound DAP could have played the crucial role of modifying ribonucleosides and stringing them together into the first RNA strands. The new study shows that DAP under similar conditions could have done the same for DNA.

"We found, to our surprise, that using DAP to react with deoxynucleosides works better when the deoxynucleosides are not all the same but are instead mixes of different DNA 'letters' such as A and T, or G and C, like real DNA," says first author Eddy Jiménez, PhD, a postdoctoral research associate in the Krishnamurthy lab.

"Now that we understand better how a primordial chemistry could have made the first RNAs and DNAs, we can start using it on mixes of ribonucleoside and deoxynucleoside building blocks to see what chimeric molecules are formed--and whether they can self-replicate and evolve," Krishnamurthy says.

He notes that the work may also have broad practical applications. The artificial synthesis of DNA and RNA--for example in the "PCR" technique that underlies COVID-19 tests--amounts to a vast global business, but depends on enzymes that are relatively fragile and thus have many limitations. Robust, enzyme-free chemical methods for making DNA and RNA may end up being more attractive in many contexts, Krishnamurthy says.

Credit: 
Scripps Research Institute

Primordial black holes and the search for dark matter from the multiverse

image: Baby universes branching off of our universe shortly after the Big Bang appear to us as black holes.

Image: 
Kavli IPMU

The Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) is home to many interdisciplinary projects which benefit from the synergy of a wide range of expertise available at the institute. One such project is the study of black holes that could have formed in the early universe, before stars and galaxies were born.

Such primordial black holes (PBHs) could account for all or part of dark matter, be responsible for some of the observed gravitational-wave signals, and seed the supermassive black holes found in the center of our galaxy and other galaxies. They could also play a role in the synthesis of heavy elements when they collide with neutron stars and destroy them, releasing neutron-rich material. In particular, there is an exciting possibility that the mysterious dark matter, which accounts for most of the matter in the universe, is composed of primordial black holes. The 2020 Nobel Prize in Physics was awarded to a theorist, Roger Penrose, and two astronomers, Reinhard Genzel and Andrea Ghez, for their discoveries that confirmed the existence of black holes. Since black holes are known to exist in nature, they make a very appealing candidate for dark matter.

The recent progress in fundamental theory, astrophysics, and astronomical observations in search of PBHs has been made by an international team of particle physicists, cosmologists and astronomers, including Kavli IPMU members Alexander Kusenko, Misao Sasaki, Sunao Sugiyama, Masahiro Takada and Volodymyr Takhistov.

To learn more about primordial black holes, the research team looked at the early universe for clues. The early universe was so dense that any positive density fluctuation of more than 50 percent would create a black hole. However, cosmological perturbations that seeded galaxies are known to be much smaller. Nevertheless, a number of processes in the early universe could have created the right conditions for the black holes to form.

One exciting possibility is that primordial black holes could form from the "baby universes" created during inflation, a period of rapid expansion that is believed to be responsible for seeding the structures we observe today, such as galaxies and clusters of galaxies. During inflation, baby universes can branch off of our universe. A small baby (or "daughter") universe would eventually collapse, but the large amount of energy released in the small volume causes a black hole to form.

An even more peculiar fate awaits a bigger baby universe. If it is bigger than some critical size, Einstein's theory of gravity allows the baby universe to exist in a state that appears different to an observer on the inside and the outside. An internal observer sees it as an expanding universe, while an outside observer (such as us) sees it as a black hole. In either case, the big and the small baby universes are seen by us as primordial black holes, which conceal the underlying structure of multiple universes behind their "event horizons." The event horizon is a boundary below which everything, even light, is trapped and cannot escape the black hole.

In their paper, the team described a novel scenario for PBH formation and showed that the black holes from the "multiverse" scenario can be found using the Hyper Suprime-Cam (HSC) of the 8.2m Subaru Telescope, a gigantic digital camera - in the management of which Kavli IPMU has played a crucial role - near the 4,200 meter summit of Mt. Mauna Kea in Hawaii. Their work is an exciting extension of the HSC search for PBHs that Masahiro Takada, a Principal Investigator at the Kavli IPMU, and his team are pursuing. The HSC team has recently reported leading constraints on the existence of PBHs in Niikura, Takada et al. (Nature Astronomy 3, 524-534 (2019)).

Why was the HSC indispensable in this research? The HSC has a unique capability to image the entire Andromeda galaxy every few minutes. If a black hole passes through the line of sight to one of the stars, the black hole's gravity bends the light rays and makes the star appear brighter than before for a short period of time. The duration of the star's brightening tells the astronomers the mass of the black hole. With HSC observations, one can simultaneously observe one hundred million stars, casting a wide net for primordial black holes that may be crossing one of the lines of sight.
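The statement that the event duration encodes the black hole's mass follows from the standard microlensing relation: the brightening lasts roughly the time the lens takes to cross its own Einstein radius, which scales as the square root of the lens mass. The sketch below uses assumed round numbers (a Moon-mass lens halfway to Andromeda, a 200 km/s transverse velocity), not values from the HSC analysis.

```python
import math

# Einstein-radius crossing time for a microlensing event:
#   t_E = R_E / v,  with  R_E = sqrt(4 G M d) / c,
# where d = D_l * D_ls / D_s is the reduced lens distance.
# All input numbers are illustrative, not from the HSC paper.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
kpc = 3.086e19       # kiloparsec in metres

M = 7.3e22           # lens mass in kg (roughly the mass of the Moon)
D_s = 770 * kpc      # distance to the source stars in Andromeda (M31)
D_l = D_s / 2        # assumed: lens halfway to M31
d = D_l * (D_s - D_l) / D_s
v = 2.0e5            # assumed transverse velocity, 200 km/s

R_E = math.sqrt(4 * G * M * d) / c
t_E = R_E / v
print(f"Einstein radius ~ {R_E:.2e} m, crossing time ~ {t_E / 3600:.1f} h")
```

Because t_E grows as sqrt(M), a heavier lens produces a longer brightening, which is why measuring the duration of the star's flicker lets astronomers infer the black hole's mass.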

The first HSC observations have already reported a very intriguing candidate event consistent with a PBH from the "multiverse," with a black hole mass comparable to the mass of the Moon. Encouraged by this first sign, and guided by the new theoretical understanding, the team is conducting a new round of observations to extend the search and to provide a definitive test of whether PBHs from the multiverse scenario can account for all dark matter.

Credit: 
Kavli Institute for the Physics and Mathematics of the Universe

Early mammal with remarkably precise bite

image: The investigated dentition of P. fruitaensis. The upper molars (M2, M3) are offset from the lower ones (m2, m3). This causes the cusps to interlock in a way that creates a sharp cutting edge.

Image: 
© Thomas Martin, Kai R. K. Jäger / University of Bonn

Paleontologists at the University of Bonn (Germany) have succeeded in reconstructing the chewing motion of an early mammal that lived almost 150 million years ago. This showed that its teeth worked extremely precisely and surprisingly efficiently. Yet it is possible that this very aspect turned out to be a disadvantage in the course of evolution. The study is published in the journal "Scientific Reports".

At just twenty centimeters long, the least weasel is considered the world's smallest carnivore alive today. The mammal that researchers at the University of Bonn have now studied is unlikely to have been any bigger. However, the species to which it belongs has long been extinct: Priacodon fruitaensis (the scientific name) lived almost 150 million years ago, at a time when dinosaurs dominated the animal world and the triumph of mammals was still to come.

In their study, the paleontologists from the Institute for Geosciences at the University of Bonn analyzed parts of the upper and lower jaw bones of a fossil specimen - more precisely, its cheek teeth (molars). Experts can tell a lot from these, not only about the animal's diet, but also about its position in the family tree. In P. fruitaensis, each molar is barely larger than one millimeter, which means that most of their secrets remain hidden from the naked eye.

The researchers from Bonn therefore used a special tomography method to produce high-resolution three-dimensional images of the teeth. They then analyzed these micro-CT images using various tools, including special software that was co-developed at the Bonn-based institute. "Until now, it was unclear exactly how the teeth in the upper and lower jaws fit together," explains Prof. Thomas Martin, who holds the chair of paleontology at the University of Bonn. "We have now been able to answer that question."

How did creatures chew 150 million years ago?

The upper and lower jaws each contain several molars. In the predecessors of mammals, molar 1 of the upper jaw would bite down precisely on molar 1 of the lower jaw when chewing. In more developed mammals, however, the rows of teeth are shifted against each other. Molar 1 at the top therefore hits exactly between molar 1 and molar 2 when biting down, so that it comes into contact with two molars instead of one. But which was the case in the early mammal P. fruitaensis?

"We compared both options on the computer," explains Kai Jäger, who wrote his doctoral thesis in Thomas Martin's research group. "This showed that the animal bit down like a modern mammal." The researchers simulated the entire chewing motion for both alternatives. In the more primitive arrangement, the contact between the upper and lower jaws would have been too small for the animals to crush the food efficiently. This is different with the "more modern" alternative: In this case, the cutting edges of the molars slid past each other when chewing, like the blades of the pinking shears that children use today for arts and crafts.

Its dentition therefore must have made it easy for P. fruitaensis to cut the flesh of its prey. However, the animal was probably not a pure carnivore: Its molars have cone-shaped elevations, similar to the peaks of a mountain. "Such cusps are particularly useful for perforating and crushing insect carapaces," says Jäger. "They are therefore also found in today's insectivores." However, the combination of carnivore and insectivore teeth is probably unique in this form.

The cusps are also noticeable in other ways: They are practically the same size in all molars. This made the dentition extremely precise and efficient. However, these advantages came at a price: Small changes in the structure of the cusps would probably have dramatically worsened the chewing performance. "This potentially made it more difficult for the dental apparatus to evolve," Jäger says.

This type of dentition has in fact survived almost unchanged in certain lineages of evolutionary history over a period of 80 million years. At some point, however, its owners became extinct - perhaps because their teeth could not adapt to changing food conditions.

Credit: 
University of Bonn

Novel method reveals small microplastics throughout Japan's subtropical ocean

image: Six areas around Okinawa were visited to collect samples for analyzing the marine microplastics - two were to the south of the island, two around the center, and two to the north.

Image: 
OIST

Research conducted in the Light-Matter Interactions for Quantum Technologies Unit at the Okinawa Institute of Science and Technology Graduate University (OIST) has revealed the presence of small microplastics in the ocean surrounding Okinawa. The study was published in Science of the Total Environment.

"There's been a considerable amount of research on larger plastic pieces in the ocean," said Christina Ripken, PhD student in the Unit and lead author of the paper. "But the smaller pieces, those that are less than 5mm in size, haven't been in the spotlight, so it was important to identify whether they're present and the impacts they might have on living organisms."

Okinawa was an interesting place to carry out this study. A small, subtropical island in southern Japan, it is surrounded by fringing coral reefs, which means that the ocean around the beaches is reliant on surface water and wind. It has also been deemed a 'blue zone' - a region whose residents are exceptionally long-lived. Therefore, the researchers thought it crucial to monitor ocean pollution as it may adversely affect these residents.

In collaboration with the Okinawa Prefecture Government, Christina carried out the sampling in September 2018. Six sites were visited close to the island's shoreline. To look at a range of different areas, two of the sites were to the south of the island, two were around the center, and two to the north. In Naha, the capital of the Okinawa Prefecture, samples were taken from beside the industrial port and the airport. Naha has an estimated population of over 300,000 inhabitants, which represents a fourth of the total population of the island. In contrast, Cape Hedo, at the far north of the island, has a very low population and is considerably less urbanized.

At each site, the surface water was trawled for one kilometer, allowing approximately 800 liters of water to be filtered and small particles to be removed. These particles were then analyzed in the lab at OIST.

Christina worked with Dr. Domna Kotsifaki, staff scientist in the Light-Matter Interactions for Quantum Technologies Unit, who combined two techniques - the optical tweezers technique and the micro-Raman technique - to provide a novel way of analyzing the particles.

The optical tweezers technique uses lasers to hold the particle in the liquid, while the micro-Raman technique identified the unique molecular fingerprint of each particle. This allowed the researchers to see exactly what was present, whether that be organic material, trace metal, or different plastics like polyethylene or polystyrene.
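Identifying a particle from its "molecular fingerprint" amounts to comparing the measured Raman peak positions against reference spectra. The toy matcher below is deliberately much simpler than real micro-Raman analysis software; the peak positions are approximate literature values for two common polymers, and the test particle is hypothetical.

```python
# Toy Raman spectral matching: identify a particle by comparing its
# strongest peak positions (in cm^-1) against reference fingerprints.
# Peak lists are approximate literature values, for illustration only.

references = {
    "polyethylene": [1062, 1130, 1296, 1440, 2848, 2883],
    "polystyrene":  [621, 1001, 1031, 1602, 2852, 3054],
}

def identify(measured_peaks, references, tolerance=10):
    """Return the reference material whose peaks best match the measured
    ones: count measured peaks within `tolerance` cm^-1 of a reference peak."""
    def score(ref_peaks):
        return sum(any(abs(m - r) <= tolerance for r in ref_peaks)
                   for m in measured_peaks)
    return max(references, key=lambda name: score(references[name]))

# A hypothetical particle whose peaks resemble polyethylene:
particle = [1060, 1295, 1442, 2850]
print(identify(particle, references))
```

Real analysis pipelines compare full spectra (intensities as well as peak positions) against curated libraries, but the principle is the same: each material's characteristic vibrational peaks act as its fingerprint.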

"This method is what sets the study apart from other research into marine microplastics," said Dr. Kotsifaki. "It meant that we didn't need to filter out the plastic first, so we could see if there was plastic embedded within organic material or if any trace metals were present and the concentration of the plastics in the sampled seawater."

As can be expected, the researchers found that there was more plastic in the water to the south of the island than to the north. But somewhat surprisingly, they found that the plastic correlated more with where people were living rather than with particularly industrialized areas.

Concerningly, they found plastic in all the samples.

Over 75% of the plastics found in the samples were made from polyethylene, which the researchers theorized could come from broken fishing equipment, water bottle caps, household utensils, plastic bags, plastic containers, and packaging.

"In the fishing communities, at the ports and beaches where the fish are landed, workers use woven polymer sacks to store and transport items including fish," said Christina. "This is an example of how the small pieces of plastic might be leaching into the ocean."

Another way is through plastic in road dust. Recent research found a high concentration of microplastics in dust samples taken from the roads of Okinawa's heavily urbanized areas, which have considerable amounts of vehicle traffic. Some of this road dust may now be found in the ocean around Okinawa.

"We found more plastic around the heavily urbanized area in the south of the island than around the industrialized center or the rural north, but everywhere we found plastic," said Christina. "Our method means that we have a clearer view on the prevalence of microplastics around Okinawa and this can lead to risk analysis and influence policy. We hope it will help boost the environmental research area."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Discovery of aging mechanism for hematopoietic stem cells

image: No caption

Image: 
©Atsushi IWAMA, The Institute of Medical Science, The University of Tokyo

By transferring aged mouse hematopoietic stem cells (aged HSCs, *1) to the environment of young mice (the bone marrow niche, *2), it was demonstrated that the pattern of stem cell gene expression was rejuvenated to that of young hematopoietic stem cells. On the other hand, the function of aged HSCs did not recover in the young bone marrow niche. The epigenome (DNA methylation, *3) of aged HSCs did not change significantly even in the young bone marrow niche, and DNA methylation profiles were found to be a better index of HSC aging than gene expression patterns.

A research group led by Professor Atsushi Iwama at the Division of Stem Cell and Molecular Medicine, The Institute of Medical Science, The University of Tokyo (IMSUT) announced these world-first results, which were published in the Journal of Experimental Medicine (online) on November 24th.

"The results will contribute to the development of treatments for age-related blood diseases," states lead scientist, Professor Iwama at IMSUT.

Focus on changes in aged HSCs in the bone marrow niche

The research group investigated whether transferring aged HSCs into a young bone marrow niche environment would rejuvenate them.

Tens of thousands of aged hematopoietic stem/progenitor cells collected from 20-month-old mice were transplanted into 8-week-old young mice without pretreatment such as irradiation. After two months of follow-up, they collected bone marrow cells and performed flow cytometric analysis.

The research team also transplanted 10-week-old young mouse HSCs for comparison. In addition, engrafted aged HSCs were fractionated and RNA sequence analysis and DNA methylation analysis were performed.

They found that engrafted aged HSCs were less capable of producing hematopoietic cells than younger HSCs. They also showed that differentiation of aged HSCs into multipotent progenitor cells was persistently impaired even in the young bone marrow niche, and that the direction of differentiation was biased. It was found that the transfer of aged HSCs to the young bone marrow niche does not improve their stem cell function.

See the paper for details.

A more detailed analysis may reveal mechanisms that irreversibly affect aged HSC function

Aging studies focusing on HSCs have been actively pursued in mice using a bone marrow transfer model. However, the effect of aging on HSCs remains to be clarified.

Professor Iwama states: "This study has a significant impact because it clarified the effect of aging on HSCs. Our results are expected to contribute to further elucidation of the mechanism of aging in HSCs and understanding of the pathogenic mechanism of age-related blood diseases."

Credit: 
The Institute of Medical Science, The University of Tokyo

Protein tells developing cells to stick together

image: Compartments in adult tissues. Fluorescent protein expression shows posterior compartments in the wing and the abdomen. Note that the boundaries between compartments are remarkably straight.

Image: 
Tohoku University

Tohoku University scientists have, for the first time, provided experimental evidence that cell stickiness helps them stay sorted within correct compartments during development. How tightly cells clump together, known as cell adhesion, appears to be enabled by a protein better known for its role in the immune system. The findings were detailed in the journal Nature Communications.

Scientists have long observed that not-yet-specialized cells move in a way that ensures that cell groups destined for a specific tissue stay together. In 1964, American biologist Malcolm Steinberg proposed that cells with similar adhesiveness move to come in contact with each other to minimize energy use, producing a thermodynamically stable structure. This is known as the differential adhesion hypothesis.

"Many other theoretical works have emphasized the importance of differences in cell-to-cell adhesion for separating cell populations and maintaining the boundaries between them, but this had not yet been demonstrated in living animal epithelial tissues," says Erina Kuranaga of Tohoku University's Laboratory for Histogenetic Dynamics, who led the investigations. "Our study showed, for the first time, that cell sorting is regulated by changes in adhesion."

Kuranaga and her team conducted experiments in fruit fly pupae, finding that a gene, called Toll-1, played a major role in this adhesion process.

As fruit flies develop from the immature larval stage into the mature adult, epithelial tissue-forming cells, called histoblasts, cluster together into several 'nests' in the abdomen. Each nest contains an anterior and a posterior compartment. Histoblasts are destined to replace larval cells to form the adult epidermis, the outermost layer that covers the flies. The cells in each compartment form discrete cell populations, so they need to stick together, with a distinct boundary forming between them.

Using fluorescent tags, Kuranaga and her team observed that the Toll-1 protein is expressed mainly in the posterior compartment. Its fluorescence also showed a sharp boundary between the two compartments.

Further investigations showed that Toll-1 performs the function of an adhesion molecule, encouraging similar cells to stick together. This process keeps the boundary between the two compartments straight, correcting distortions that arise as the cells divide and increase in number.

Interestingly, Toll proteins are best known for recognizing invading pathogens, and little is known about their work beyond the immune system. "Our work improves understanding of the non-immune roles of Toll proteins," says Kuranaga. She and her team next plan to study the function of other Toll genes in fruit fly epithelial cells.

Credit: 
Tohoku University

Modeling can help balance economy, health during pandemic

image: Arye Nehorai, the Eugene & Martha Lohman Professor of Electrical Engineering in the Preston M. Green Department of Electrical & Systems Engineering

Image: 
Washington University in St. Louis

This summer, when bars and restaurants and stores began to reopen across the United States, people headed out despite the continuing threat of COVID-19.

As a result, many areas, including the St. Louis region, saw increases in cases in July.

Using mathematical modeling, new interdisciplinary research from the lab of Arye Nehorai, the Eugene & Martha Lohman Professor of Electrical Engineering in the Preston M. Green Department of Electrical & Systems Engineering at Washington University in St. Louis, determines the best course of action when it comes to walking the line between economic stability and the best possible health outcomes.

The group -- which also includes David Schwartzman, a business economics PhD candidate at Olin Business School, and Uri Goldsztejn, a PhD candidate in biomedical engineering at the McKelvey School of Engineering -- published their findings Dec. 22 in PLOS ONE.

The model indicates that of the scenarios they consider, communities could maximize economic productivity and minimize disease transmission if, until a vaccine were readily available, seniors mostly remained at home while younger people gradually returned to the workforce.

"We have developed a predictive model for COVID-19 that considers, for the first time, its intercoupled effect on both economic and health outcomes for different quarantine policies," Nehorai said. "You can have an optimal quarantine policy that minimizes the effect both on health and on the economy."

The work was an expanded version of a Susceptible, Exposed, Infectious, Recovered (SEIR) model, a commonly used mathematical tool for predicting the spread of infections. This dynamic model allows for people to be moved between groups known as compartments, and for each compartment to influence the other in turn.

At their most basic, these models divide the population into four compartments: Those who are susceptible, exposed, infectious and recovered. In an innovation to this traditional model, Nehorai's team included infected but asymptomatic people as well, taking into account the most up-to-date understanding of how transmission may work differently between them as well as how their behaviors might differ from people with symptoms. This turned out to be highly influential in the model's outcomes.

People were then divided into different "sub-compartments," for example by age (seniors being those older than 60) or by productivity, a measure of a person's ability to work from home under quarantine measures. As a proxy for who could continue to work during a period of quarantine, the researchers used college degrees.

Then they got to work, developing equations which modeled the ways in which people moved from one compartment to another. Movement was affected by policy as well as the decisions an individual made.
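Those compartment-to-compartment flows can be sketched as a small set of difference equations. The toy model below is an illustrative SEIR variant with an asymptomatic compartment, not the authors' published model; every parameter value and function name here is an assumption for demonstration.

```python
def seir_with_asymptomatic(beta_sym, beta_asym, sigma, gamma,
                           p_asym, days, dt=0.1, e0=1e-4):
    """Toy SEIR model extended with an asymptomatic compartment (A).

    S -> E at a rate driven by both symptomatic (I) and asymptomatic (A)
    individuals, who may transmit at different rates; E splits into I or A;
    both recover into R. All parameters are illustrative, not fitted values.
    """
    S, E, I, A, R = 1.0 - e0, e0, 0.0, 0.0, 0.0
    for _ in range(int(days / dt)):
        n = S + E + I + A + R
        new_exposed = (beta_sym * I + beta_asym * A) * S / n
        dS = -new_exposed
        dE = new_exposed - sigma * E
        dI = (1.0 - p_asym) * sigma * E - gamma * I
        dA = p_asym * sigma * E - gamma * A
        dR = gamma * (I + A)
        S, E, I, A, R = (S + dS * dt, E + dE * dt, I + dI * dt,
                         A + dA * dt, R + dR * dt)
    return S, E, I, A, R
```

In a sketch like this, policy levers would enter as additional compartments (such as isolated or quarantined groups) or as time-varying contact rates, and a declining mortality rate would make the exit from the infectious compartments time-dependent.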

Interestingly, the model included a dynamic mortality rate, one that shrank over time. "We had a mortality rate that accounted for improvements in medical knowledge over time," Goldsztejn said. "And we see that now; mortality rates have gone down."

"For example," Goldsztejn said, "if the economy is decreasing, there is more incentive to leave quarantine," which might show up in the model as people moving from the isolated compartment to the susceptible compartment. Moving from infectious to recovered, on the other hand, depends less on a person's actions and is determined largely by recovery and mortality rates.

The team looked at three scenarios, according to Schwartzman. In all three scenarios, the given timeline was 76 weeks -- at which time it assumed a vaccine would be available -- and seniors remained mostly quarantined until then.

1. If strict isolation measures were maintained throughout.

2. If, after the curve was flattened, there was a rapid relaxation of isolation measures by younger people back to normal movement.

3. If, after the curve was flattened, isolation measures were slowly lifted for younger people.

"The third scenario is the case which was the best in terms of economic damage and health outcomes," he said. "Because in the rapid relaxation scenario, there was another disease spread and restrictions would be reinstated."

Specifically, they found that in the first scenario there are 235,724 deaths and the economy shrinks by 34%.

In the second scenario, where there was a rapid relaxation of isolation measures, a second outbreak occurs for a total of 525,558 deaths, and the economy shrinks by 32.2%.

With a gradual relaxation, as in the third scenario, there are 262,917 deaths, and the economy shrinks by 29.8%.

"We wanted to show there is a tradeoff," Nehorai said. "And we wanted to find, mathematically, where is the sweet spot?" As with so many things, the "sweet spot" was not at either extreme -- total lockdown or carrying on as if there was no virus.

Another key finding was one no one should be surprised to hear: "People's sensitivity to contagiousness is related to the precautions they take," Nehorai said. "It's still critical to use precautions -- masks, social distancing, avoiding crowds and washing hands."

Credit: 
Washington University in St. Louis

Common diabetes drug may trigger rare complications for COVID-19 patients

Diabetes is a known risk factor for morbidity and mortality related to COVID-19. In diabetes patients, rare but severe complications, like the potentially lethal condition diabetic ketoacidosis (DKA), can arise when illness or certain conditions prevent cells from receiving enough glucose to fuel their functioning. An uptick in a particular type of DKA called euDKA at Brigham and Women's Hospital during the COVID-19 pandemic has led researchers to hypothesize that diabetes patients on glucose-lowering drugs may be at increased risk for euDKA when they contract COVID-19. The observational case series was published in The American Association of Clinical Endocrinologists Clinical Case Reports.

EuDKA is a subset of the diabetes complication known as DKA, which occurs when the body's cells fail to absorb enough glucose and compensate by metabolizing fats instead, creating a build-up of acids called ketones. EuDKA differs from DKA in that it is characterized by lower blood sugar levels, making it more difficult to diagnose. The U.S. Food and Drug Administration has warned that the risk of DKA and euDKA may be increased for individuals who use a popular class of diabetes drugs called sodium-glucose cotransporter 2 inhibitors (SGLT2i), which function by releasing excess glucose in the urine. Underlying nearly all euDKA cases is a state of starvation that can be triggered by illnesses that cause vomiting, diarrhea, and loss of appetite and can be compounded by the diuretic effect of SGLT2i drugs.

Brigham researchers studied five unusual euDKA cases brought to the diabetes inpatient services within the span of two months, three of which occurred in one week, at the height of the pandemic in Boston in the spring of 2020. The five cases represented a markedly heightened incidence of euDKA compared to that of the previous two years, when inpatient services saw fewer than 10 euDKA cases. All five of the recent euDKA cases were observed in COVID-19 patients who were taking SGLT2i; three patients were discharged to rehabilitation facilities, one was discharged home, and one, a 52-year-old male with acute respiratory distress syndrome, died.

"We have the background knowledge of recognizing that SGLT2 inhibitors can cause DKA and euDKA," said corresponding author Naomi Fisher of the Division of Endocrinology, Diabetes, and Hypertension. "Our report reinforces that if patients are ill or have loss of appetite or are fasting, they should pause their medication and not resume until they are well and eating properly."

The authors of the study also suspect that COVID-19 may particularly exacerbate euDKA risks. When the virus infects a patient, it binds to cells on the pancreas that produce insulin and may exert a toxic effect on them. Studies of the earlier SARS-CoV-1 virus found that many infected patients had increased blood sugar. "It's been posited through other models that the virus may be preferentially destroying insulin-producing cells," Fisher said.

Moreover, the maladaptive inflammatory response associated with COVID-19, which produces high levels of immune-response-related proteins called cytokines, may increase DKA risks. "These high levels of cytokines are also seen in DKA, so these inflammatory pathways may be interacting," Fisher said. "It's speculative, but there may be some synergy between them."

Though these findings are observational, rather than the results of a randomized controlled trial, similar reports of heightened euDKA incidence have emerged from other institutions. The authors encourage patients and physicians to halt SGLT2i-use in the event of illness, which is already standard practice for the most common diabetes drug, metformin.

"Patients should continue to monitor their blood sugar, and if the illness is prolonged or if their blood sugar is very high, they can speak to their doctor about other forms of therapy," Fisher said. "But often it's a very short course off of the drug. We're hopeful that with widespread patient and physician education, we will not see another cluster of euDKA cases amid the next surge in COVID-19 infections."

Credit: 
Brigham and Women's Hospital

COVID-19 severity affected by proportion of antibodies targeting crucial viral protein

COVID-19 antibodies preferentially target a different part of the virus in mild cases of COVID-19 than they do in severe cases, and wane significantly within several months of infection, according to a new study by researchers at Stanford Medicine.

The findings identify new links between the course of the disease and a patient's immune response. They also raise concerns about whether people can be re-infected, whether antibody tests to detect prior infection may underestimate the breadth of the pandemic and whether vaccinations may need to be repeated at regular intervals to maintain a protective immune response.

"This is one of the most comprehensive studies to date of the antibody immune response to SARS-CoV-2 in people across the entire spectrum of disease severity, from asymptomatic to fatal," said Scott Boyd, MD, PhD, associate professor of pathology. "We assessed multiple time points and sample types, and also analyzed levels of viral RNA in patient nasopharyngeal swabs and blood samples. It's one of the first big-picture looks at this illness."

The study found that people with severe COVID-19 have a low proportion of antibodies targeting the spike protein, which the virus uses to enter human cells, relative to antibodies targeting proteins of the virus's inner shell.

Boyd is a senior author of the study, which was published Dec. 7 in Science Immunology. Other senior authors are Benjamin Pinsky, MD, PhD, associate professor of pathology, and Peter Kim, PhD, the Virginia and D. K. Ludwig Professor of Biochemistry. The lead authors are research scientist Katharina Röltgen, PhD; postdoctoral scholars Abigail Powell, PhD, and Oliver Wirz, PhD; and clinical instructor Bryan Stevens, MD.

Virus binds to ACE2 receptor

The researchers studied 254 people with asymptomatic, mild or severe COVID-19 who were identified either through routine testing or occupational health screening at Stanford Health Care or who came to a Stanford Health Care clinic with symptoms of COVID-19. Of the people with symptoms, 25 were treated as outpatients, 42 were hospitalized outside the intensive care unit and 37 were treated in the intensive care unit. Twenty-five people in the study died of the disease.

SARS-CoV-2 binds to human cells via a structure on its surface called the spike protein. This protein binds to a receptor on human cells called ACE2. The binding allows the virus to enter and infect the cell. Once inside, the virus sheds its outer coat to reveal an inner shell encasing its genetic material. Soon, the virus co-opts the cell's protein-making machinery to churn out more viral particles, which are then released to infect other cells.

Antibodies that recognize and bind to the spike protein block its ability to bind to ACE2, preventing the virus from infecting the cells, whereas antibodies that recognize other viral components are unlikely to prevent viral spread. Current vaccine candidates use portions of the spike protein to stimulate an immune response.

Boyd and his colleagues analyzed the levels of three types of antibodies -- IgG, IgM and IgA -- and the proportions that targeted the viral spike protein or the virus's inner shell as the disease progressed and patients either recovered or grew sicker. They also measured the levels of viral genetic material in nasopharyngeal samples and blood from the patients. Finally, they assessed the effectiveness of the antibodies in preventing the spike protein from binding to ACE2 in a laboratory dish.

"Although previous studies have assessed the overall antibody response to infection, we compared the viral proteins targeted by these antibodies," Boyd said. "We found that the severity of the illness correlates with the ratio of antibodies recognizing domains of the spike protein compared with other nonprotective viral targets. Those people with mild illness tended to have a higher proportion of anti-spike antibodies, and those who died from their disease had more antibodies that recognized other parts of the virus."

Substantial variability in immune response

The researchers caution, however, that although the study identified trends among a group of patients, there is still substantial variability in the immune response mounted by individual patients, particularly those with severe disease.

"Antibody responses are not likely to be the sole determinant of someone's outcome," Boyd said. "Among people with severe disease, some die and some recover. Some of these patients mount a vigorous immune response, and others have a more moderate response. So, there are a lot of other things going on. There are also other branches of the immune system involved. It's important to note that our results identify correlations but don't prove causation."

As in other studies, the researchers found that people with asymptomatic and mild illness had lower levels of antibodies overall than did those with severe disease. After recovery, the levels of IgM and IgA decreased steadily to low or undetectable levels in most patients over a period of about one to four months after symptom onset or estimated infection date, and IgG levels dropped significantly.

"This is quite consistent with what has been seen with other coronaviruses that regularly circulate in our communities to cause the common cold," Boyd said. "It's not uncommon for someone to get re-infected within a year or sometimes sooner. It remains to be seen whether the immune response to SARS-CoV-2 vaccination is stronger, or persists longer, than that caused by natural infection. It's quite possible it could be better. But there are a lot of questions that still need to be answered."

Boyd is a co-chair of the National Cancer Institute's SeroNet Serological Sciences Network, one of the nation's largest coordinated research efforts to study the immune response to COVID-19. He is the principal investigator of Center of Excellence in SeroNet at Stanford, which is tackling critical questions about the mechanisms and duration of immunity to SARS-CoV-2.

"For example, if someone has already been infected, should they get the vaccine? If so, how should they be prioritized?" Boyd said. "How can we adapt seroprevalence studies in vaccinated populations? How will immunity from vaccination differ from that caused by natural infection? And how long might a vaccine be protective? These are all very interesting, important questions."

Credit: 
Stanford Medicine

Remarkable new species of snake found hidden in a biodiversity collection

image: Jeff Weinell, a KU graduate research assistant at the Biodiversity Institute, is lead author of a paper describing Waray Dwarf Burrowing Snake as both a new genus and a new species, in the peer-reviewed journal Copeia.

Image: 
University of Kansas

LAWRENCE -- To be fair, the newly described Waray Dwarf Burrowing Snake (Levitonius mirus) is pretty great at hiding.

In its native habitat, Samar and Leyte islands in the Philippines, the snake spends most of its time burrowing underground, usually surfacing only after heavy rains in much the same way earthworms tend to wash up on suburban sidewalks after a downpour.

So, it may not be shocking that when examples of the Waray Dwarf Burrowing Snake were collected in 2006 and 2007, they were misidentified in the field -- nobody had seen them before. The specimens spent years preserved in the collections of the University of Kansas Biodiversity Institute and Natural History Museum, overlooked by researchers who were unaware they possessed an entirely new genus of snake, even after further examples were found in 2014.

But that changed once Jeff Weinell, a KU graduate research assistant at the Biodiversity Institute, took a closer look at the specimens' genetics using molecular analysis, then sent them to collaborators at the University of Florida for CT scanning. Now, he's the lead author on a paper describing the snake as both a new genus, and a new species, in the peer-reviewed journal Copeia.

"I was initially interested in studying the group of snakes that I thought it belonged to -- or that other people thought it belonged to," Weinell said. "This is when I first started my Ph.D. at KU. I was interested in collecting data on a lot of different snakes and finding out what I actually wanted to research. I knew this other group of small, burrowing snakes called Pseudorabdion -- there are quite a few species in the Philippines -- and I was interested in understanding the relationships among those snakes. So, I made a list of all the specimens we had in the museum of that group, and I started sequencing DNA for the tissues that were available."

As soon as Weinell got the molecular data back, he realized the sample from the subterranean snake didn't fall within Pseudorabdion. But pinpointing where the snake should be classified wasn't a simple task: The Philippine archipelago is an exceptionally biodiverse region that includes at least 112 species of land snakes from 41 genera and 12 families.

"It was supposed to be closely related, but it was actually related to this entirely different family of snakes," he said. "That led me to look at it in more detail, and I realized that there were actually some features that were quite different from what it was initially identified as."

Working with Rafe Brown, professor of ecology & evolutionary biology and curator-in-charge of the KU Biodiversity Institute and Natural History Museum, Weinell took a closer look at the snake's morphology, paying special attention to the scales on the body, which can be used to differentiate species.

He then sent one of the specimens to the University of Florida for CT scanning to get a more precise look at the internal anatomy of the mysterious Philippine snake. The CT images turned out to be surprising.

"The snake has among the fewest number of vertebrae of any snake species in the world, which is likely the result of miniaturization and an adaptation for spending most of its life underground," Weinell said.

Finally, the KU graduate research assistant and his colleagues were able to determine that the Waray Dwarf Burrowing Snake represented a new "miniaturized genus" and species of snake. Now, for the first time, Weinell has had the chance to bestow on the snake its scientific name, Levitonius mirus.

"It's actually named for Alan Leviton, who is a researcher at the California Academy of Sciences, and he had spent decades basically studying snakes in the Philippines in the '60s, '70s, '80s and then all the way up to now," Weinell said. "So, that's sort of an honorific genus name for him. Then, 'mirus' is Latin for unexpected. That's referencing the unexpected nature of this discovery -- getting the DNA sequences back and then wondering what was going on."

In addition to Brown, Weinell's co-authors on the new paper are Daniel Paluh of the University of Florida and Cameron Siler of the University of Oklahoma. Brown said the description of Levitonius mirus highlights the value of preserving collections of biodiversity in research institutions and universities.

"In this case, the trained 'expert field biologists' misidentified specimens -- and we did so repeatedly, over years -- failing to recognize the significance of our finds, which were preserved and assumed to be somewhat unremarkable, nondescript juveniles of common snakes," Brown said. "This happens a lot in the real world of biodiversity discovery. It was only much later, when the next generation of scientists came along and had the time and access to accumulated numbers of specimens, and when the right people, like Jeff, who asked the right questions and who had the right tools and expertise, like Dan, came along and took a fresh look, that we were able to identify this snake correctly. It's a good thing we have biodiversity repositories and take our specimen-care oaths seriously."

According to Marites Bonachita-Sanguila, a biologist at the Biodiversity Informatics and Research Center at Father Saturnino Urios University, located in the southern Philippines, the snake discovery "tells us that there is still so much more to learn about reptile biodiversity of the southern Philippines by focusing intently on species-preferred microhabitats."

"The pioneering Philippine herpetological work of Walter Brown and Angel Alcala from the 1960s to the 1990s taught biologists the important lesson of focusing on species' very specific microhabitat preferences," Bonachita-Sanguila said. "Even so, biologists have really missed many important species occurrences, such as this, because ... well, simply because we did not know basic clues about where to find them. In the case of this discovery, the information that biologists lacked was that we should dig for them when we survey forests. So simple. How did we miss that? All this time, we were literally walking on top of them as we surveyed the forests of Samar and Leyte. Next time, bring a shovel."

She added that habitat loss as a result of human-mediated land use (such as conversion of forested habitats for agriculture to produce food for people) is a prevailing issue in Philippine society today.

"This new information, and what we will learn more in future studies of this remarkable little creature, would inform planning for conservation action, in the strong need for initiatives to conserve Philippine endemic species -- even ones we seldom get to see," Bonachita-Sanguila said. "We need effective land-use management strategies, not only for the conservation of celebrated Philippine species like eagles and tarsiers, but for lesser-known, inconspicuous species and their very specific habitats -- in this case, forest-floor soil, because it's the only home they have."

Credit: 
University of Kansas

Neurology patients faced with rising out-of-pocket costs for tests, office visits

MINNEAPOLIS - Just as with drug costs, the amount of money people pay out of pocket for diagnostic tests and office visits for neurologic conditions has risen over 15 years, according to a new study published in the December 23, 2020, online issue of Neurology®, the medical journal of the American Academy of Neurology. The study, funded by the American Academy of Neurology, found that people enrolled in high-deductible health plans were more likely to have high out-of-pocket costs than people in other types of plans.

"This trend of increased out-of-pocket costs could be harmful, as people may forgo diagnostic evaluation due to costs, or those who complete diagnostic testing may be put in a position of financial hardship before they can even start to treat their condition," said study author Chloe E. Hill, MD, MS, of the University of Michigan in Ann Arbor and a member of the American Academy of Neurology. "What's more, right now neurologists and patients may not have individualized information available regarding what the out-of-pocket costs might be to make informed decisions about use of care."

For the study, researchers examined out-of-pocket costs for visits to a neurologist and diagnostic tests ordered by a neurologist over a 15-year period using a large private insurance claims database. Costs for more than 3.7 million people were included.

The study found an increasing number of people were paying out-of-pocket costs for diagnostic tests and office visits over the years. The out-of-pocket costs are rising and vary greatly across patients and tests, Hill said.

For patients who had out-of-pocket costs for diagnostic tests, average inflation-adjusted out-of-pocket costs rose by as much as 190% over the study period. Average out-of-pocket costs for electroencephalogram (EEG) tests, which can be used to diagnose conditions such as epilepsy, increased from $39 to $112. For MRI scans, they increased from $84 to $242. Office visits increased from an average of $18 to $52.
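As a back-of-the-envelope check (ours, not the study's), the reported dollar figures are consistent with a rise of as much as 190%:

```python
def pct_increase(old, new):
    """Percent increase from an initial to a final value."""
    return (new - old) / old * 100.0

# Reported average out-of-pocket costs at the start vs. end of the period
print(round(pct_increase(39, 112)))  # EEG: about 187%
print(round(pct_increase(84, 242)))  # MRI: about 188%
print(round(pct_increase(18, 52)))   # office visit: about 189%
```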

Including both tests and office visits, people who paid out of pocket covered, on average, a growing share of the total cost of the service. For example, people paid on average 7% of the cost of an MRI scan at the beginning of the study, compared to 15% of the cost by the end of the study.

The percentage of people who paid out-of-pocket costs for tests varied by test, but all increased over the years. For MRIs, 24% of people paid out of pocket in 2001, compared to 70% in 2016.

People with high-deductible health plans were more likely to have out-of-pocket costs on tests and to have higher out-of-pocket costs. In 2001, none of the people in the study were enrolled in high-deductible health plans. By 2016, 11% of people were enrolled in these plans.

The researchers also found that out-of-pocket costs varied considerably. For an MRI in 2016, the people paying the median amount paid $103, while the people with the top 5% of costs paid $875.

"This study adds further weight to earlier studies from the American Academy of Neurology showing that out-of-pocket costs for neurologic medications are rising sharply, making people less likely to take their medications as often as their doctors prescribed," said James C. Stevens, MD, FAAN, President of the American Academy of Neurology. "Costs have risen to the point where systematic changes are needed. These changes could include legislative action to place a cap on out-of-pocket costs. The American Academy of Neurology is advocating for such caps on out-of-pocket drug costs in Washington, D.C."

A limitation of the study is that costs were examined for only one insurer, so the results may not reflect other private insurers or Medicaid.

Credit: 
American Academy of Neurology

People in rural areas less likely to receive specialty care for neurologic conditions

MINNEAPOLIS - A new study has found that while the prevalence of neurologic conditions like dementia, stroke, Parkinson's disease and multiple sclerosis (MS) is consistent across the U.S., the distribution of neurologists is not, and people in more rural areas may be less likely to receive specialty care for certain neurologic conditions. The study, funded by the American Academy of Neurology, is published in the December 23, 2020, online issue of Neurology®, the medical journal of the American Academy of Neurology.

"Neurologists in the United States are not evenly spread out, which affects whether patients can see a neurologist for certain conditions like dementia and stroke," said study author Brian C. Callaghan, MD, MS, of the University of Michigan in Ann Arbor and a Fellow of the American Academy of Neurology. "Our research found that some areas of the country have up to four times as many neurologists as the lowest served areas, and these differences mean that some people do not have access to neurologists who are specially trained in treating brain diseases."

However, Callaghan noted that the proportion of people receiving specialty care from a neurologist in more rural areas varied by condition. People with specific, less common conditions such as Parkinson's disease and multiple sclerosis were just as likely to see a neurologist in more rural areas as in more urban ones. By contrast, people with more common but less specific neurologic symptoms, such as dementia, pain, dizziness, vertigo or sleep disorders, were more likely to see a neurologist in more urban areas.

For the study, researchers reviewed one year of data for 20% of people enrolled in Medicare and identified 2.1 million people with at least one office visit for a neurologic condition.

Researchers recorded the number of times people had an office visit with a neurologist during that year and compared that to how many times people had office visits with other health care providers for a neurologic condition.

Researchers identified a total of 13,627 neurologists practicing in the regions where study participants lived.

Researchers found the areas with the fewest neurologists had an average of 10 neurologists for every 100,000 people, while the areas with the most neurologists had an average of 43 neurologists for every 100,000 people.
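As a quick check on the "up to four times as many" figure quoted earlier, the two reported densities can be compared directly. This is a minimal sketch using only the averages stated in the article:

```python
# Neurologist supply figures reported in the article, per 100,000 people,
# for the lowest- and highest-served regions.
lowest_density = 10   # average in areas with the fewest neurologists
highest_density = 43  # average in areas with the most neurologists

# The "up to four times as many" claim corresponds to this ratio.
ratio = highest_density / lowest_density
print(f"{ratio:.1f}")  # → 4.3
```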

Researchers also found that the prevalence of neurologic conditions was not different across regions. Nearly one-third of people had at least one office visit for a neurologic condition.

Overall, 24% of people with a neurologic condition were seen by a neurologist. In more rural areas, this number was 21%, compared to 27% in the areas with the most neurologists. Most of that difference was made up of people with dementia, back pain and stroke. For dementia, 38% of people in more rural areas saw a neurologist, compared to 47% in more urban areas. For stroke, 21% of people in more rural areas saw a neurologist, compared to 31% in more urban areas.

On the other hand, more than 80% of people with Parkinson's disease received care from a neurologist, no matter where they lived. The numbers were similar for multiple sclerosis.

"It is important that all people have access to the best neurologic care," said James C. Stevens, MD, FAAN, President of the American Academy of Neurology. "Not surprisingly, more neurologists tend to work and live in metropolitan areas, but this study underlines the need to ensure that rural areas also have a supply of neurologists to meet demand. One way to give people more access to neurologic care is with telemedicine, which has been used successfully during the COVID-19 pandemic. Remote office visits by computer or telephone are one way to extend neurological service to people in underserved areas."

A limitation of the study is that researchers looked at neurologic visits only for people with Medicare coverage, so results may not be applicable to younger people with private insurance.

Credit: 
American Academy of Neurology

New research highlights the importance of the thymus in successful pregnancies

Image: Magdalena Paolino, assistant professor and team leader at the Department of Medicine, Solna, Karolinska Institutet. Photo: Ragnar Söderberg Foundation

How the immune system adapts to pregnancies has puzzled scientists for decades. Now, findings from an international group of researchers, led by researchers at Karolinska Institutet in Sweden, reveal important changes that occur in the thymus to prevent miscarriages and gestational diabetes. The results are published in the journal Nature.

The thymus is a central organ of the immune system where specialised immune cells called T lymphocytes mature. These cells, commonly referred to as T cells, then migrate into the blood stream and tissues to help combat pathogens and cancer. An important T cell subset, known as a regulatory T cell or Treg, is also produced in the thymus. The main function of a Treg is to help regulate other immune cells.

In the study, the researchers have found that during pregnancy, the female sex hormones instruct the thymus to produce Tregs specialised in dealing with physiological changes during pregnancy. The study--which involved researchers at Karolinska Institutet, IMBA - the Institute of Molecular Biotechnology of the Austrian Academy of Sciences in Vienna and the University of British Columbia in Vancouver--further reveals that RANK, a receptor expressed in the thymus epithelia, is the key molecule behind this mechanism.

"We knew RANK was expressed in the thymus, but its role in pregnancy was unknown", says first and co-corresponding author Dr. Magdalena Paolino, assistant professor and team leader at the Department of Medicine, Solna, Karolinska Institutet.

To get a better understanding, the authors studied mice where RANK had been deleted from the thymus.

"The absence of RANK prevented the production of Tregs in the thymus during pregnancy. This resulted in fewer Tregs in the placentas, leading to miscarriages," continues Magdalena Paolino.

The study further shows that in normal pregnancies, the produced Tregs also migrate to the mother's fat tissue to prevent inflammation and help control glucose levels in the body. Pregnant mice lacking RANK had high levels of glucose and insulin in their blood and many other indicators of gestational diabetes, including fetal macrosomia.

"Similar to babies of women with gestational diabetes, the newborn pups were much heavier than average," explains Magdalena Paolino.

In addition, the deficiency of Tregs during pregnancy was shown to result in long-lasting transgenerational effects on the offspring, which remained prone to diabetes and excess weight throughout their lives. Giving the RANK-deficient mice thymus-derived Tregs isolated from normal pregnancies reversed all of these issues, including fetal loss, and normalised maternal glucose levels and the body weights of the pups.

The researchers also analysed women with gestational diabetes, revealing a reduced number of Tregs in their placentas, similar to the findings in mice.

"This research changes our view of the thymus as an active and dynamic organ required to safeguard pregnancies," Magdalena Paolino says. "It also provides new molecular insight into gestational diabetes, a disease that affects many women and which we still know little about. It emphasises the importance of clinics detecting and managing glucose metabolism in pregnant women to avert its long-term effects."

Co-corresponding author Dr. Josef Penninger notes that how rewiring of the thymus contributes to a healthy pregnancy was one of the remaining mysteries of immunology - until now.

"Our work over many years has now not only solved this puzzle - pregnancy hormones rewire the thymus via RANK - but uncovered a new paradigmatic function: the thymus not only changes the immune system of the mother to allow the fetus, but it also controls metabolic health of the mother," Josef Penninger says.

Credit: 
Karolinska Institutet