Tech

Researchers unveil new 'time machine' technique to measure cells

image: Dendritic cells (co-stained red/green) in a lymphoid follicle (Peyer's patch) draining the intestine (blue).

Image: 
Wang Cao and Shengbo Zhang, WEHI

Using a new single-cell technique, WEHI researchers have uncovered a way to understand the programming behind how stem cells make particular cell types.

The research uncovered 30 new genes that program stem cells to make the dendritic cells that kick-start the immune response.

By uncovering this process, the researchers hope they will be able to find new immunotherapy treatments for cancer, and plan to expand this technique in other areas such as discovering new drug targets in tumour initiation.

At a glance

WEHI researchers have developed a new single-cell method to understand the programming behind what causes stem cells to make particular cell types.

By testing daughters of a single stem cell in different parallel tests, researchers found 500 genes that predicted dendritic cell fate.

Using a CRISPR screen, they discovered 30 key genes amongst the 500 that program dendritic cell production.

Researchers intend to expand use of this technique to find the 'big bang' moment in cancer development to identify new drug targets to fight cancer.

Studying 'sister' cells

Led by Dr Shalin Naik, Dr Luyi Tian, Ms Sara Tomei and Mr Jaring Schreuder and published in Immunity, the research outlined the processes involved in kick-starting the generation of dendritic cells driven by the growth factor Flt3 ligand, which is used in immunotherapy.

The research team developed a new technique to link the gene expression of a single cell with what cell types it made.

"We invented a technique called 'SIS-seq' in order to study 'sister' cells that descended in parallel from the 'mother' stem cell," Dr Naik said.

"As RNA sequencing destroys the single stem cell, you are only able to measure the genetic contents of the cell but lose the chance to know what it would have made. So, there is no way of then going back in time to find that out."

"By letting a single stem cell divide only a few times, not all the way, we were able to test the sisters separately. Some were tested for what they made, and others were tested for their genetic contents."

"In this way, we have been able to link the genes with the cell types that are made."

Discovery of 30 new genes

Dr Naik said the findings would not have been possible without advances in technology that enabled the team to answer multiple questions simultaneously.

"Using a CRISPR screen, we tested 500 genes that predicted dendritic cell fate and discovered 30 new genes that actually program dendritic cells to be made," he said.

Dr Naik said the breakthrough could pave the way for new drug targets to fight cancer and improve immunotherapy treatment.

"We've now got a list of genes to try and generate or boost human dendritic cells in a petri dish for immunotherapy," he said.

"And we are going to expand the use of this technology to find the genes that program the generation of each of the different human immune cell types."

Finding the 'big bang' of cancer initiation

By examining cells at the single-cell level using this technique, researchers also intend to find the 'big bang' moment in cancer development in order to create new drug targets to fight cancer and improve immunotherapy.

"Using our time machine technique, we hope to be able to pinpoint which of the normal programs in tissue generation are hijacked by cancer causing genes in single cells and then use this information to find new targets for therapy," Dr Naik said.

This work was made possible with funding from the National Health and Medical Research Council, the Australian Research Council, the Victorian Cancer Agency and the Victorian Government.

Credit: 
Walter and Eliza Hall Institute

COVID-19 reduces access to opioid dependency treatment for new patients

image: COVID-19 has been associated with increases in opioid overdose deaths, which may be in part because the pandemic limited access to buprenorphine, a treatment used for opioid dependency, according to a new study led by Princeton University researchers.

Image: 
Egan Jimenez, Princeton University

COVID-19 has been associated with increases in opioid overdose deaths, which may be in part because the pandemic limited access to buprenorphine, a treatment used for opioid dependency, according to a new study led by Princeton University researchers.

The researchers found that Americans who were already taking opioids did not experience disruptions in their supply. Patients who were not previously taking opioids for pain management were less likely to receive a new prescription in the first months of the pandemic, but prescriptions for new patients soon bounced back to previous levels.

At the same time, fewer new patients entered into medication-assisted treatment for opioid addiction, which may have contributed to a spike in overdose deaths, according to the study, which was published in JAMA Network Open, a journal of the American Medical Association.

"Medication-assisted treatment has been shown to prevent overdose deaths, so disruptions in access to treatment have likely played a role in increasing overdose deaths during the pandemic," said Janet Currie, the Henry Putnam Professor of Economics and Public Affairs and co-director of the Center for Health and Wellbeing (CHW) at Princeton University's School of Public and International Affairs.

Prescribing patterns may be partially accounted for by the role of telemedicine. According to Currie, "Providers found it relatively easy to continue to serve existing patients remotely, but it was more difficult to bring new patients into care."

Currie conducted the study with Jonathan Zhang, a postdoctoral research associate at CHW, Molly Schnell Ph.D. '18, and Hannes Schwandt, a former postdoctoral research scholar at Princeton. Schnell and Schwandt are both now at the Institute for Policy Research at Northwestern University.

The researchers explored the pandemic's impact on the opioid crisis by using a large national database of more than 90 million prescriptions for opioid analgesics and buprenorphine.

First, they looked at prescriptions of opioid pain medications for new and existing patients. They also counted prescriptions for buprenorphine prescribed for opioid-use disorder and distinguished between new and existing patients entering and maintaining treatment for opioid dependency.

To compare to the past, the researchers reviewed statistics dating back to 2018. They then studied two time periods to capture the initial and medium-term impacts of COVID-19: the onset of the pandemic (between March and May 2020) and a period when things returned to a more normal level of activity (May to September 2020). They took several factors into account, including the total number of prescriptions filled, the strength of the drugs prescribed, and the average dispensed pills per prescription.
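The comparison against "predicted levels" described above can be illustrated with a toy calculation: fit the pre-pandemic trend in weekly prescription counts, project it forward, and express each week's count as a fraction of that projection. All numbers below are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical weekly counts of new-patient prescriptions, 2018 through Sept 2020.
weeks = np.arange(140)
baseline = 1000 - 1.5 * weeks + rng.normal(0, 20, size=140)  # gentle downward trend
counts = baseline.copy()
counts[113:122] *= 0.75  # ~25% drop at the pandemic onset (roughly March-May 2020)

# Fit the pre-pandemic trend and project it forward as the 'predicted level'.
pre = weeks < 113
slope, intercept = np.polyfit(weeks[pre], counts[pre], 1)
predicted = intercept + slope * weeks

pct_of_predicted = counts / predicted
print(pct_of_predicted[113:122].mean())  # ≈ 0.75 during the onset window
print(pct_of_predicted[130:].mean())     # back near 1.0 later in 2020
```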

The research team concluded that patients who were already taking opioid pain medication did not experience a disruption in care. Although the number of weekly prescriptions fell slightly at the start of the pandemic, the amount of medication dispensed remained flat because providers increased the amount prescribed per prescription.

There was initially a sharp drop in new opioid prescriptions from March to May 2020, followed by a rapid rebound to predicted levels. Given the downward trend in opioid prescribing, it is possible that some of these patients may never begin using opioids, which could reduce future opioid addiction.

Patterns were different for buprenorphine treatment for opioid dependency. Patients who were already in treatment retained access to these drugs during the pandemic. There was little to no change in the number of prescriptions or doses prescribed for these patients.

However, the number of buprenorphine prescriptions for new patients decreased by almost a quarter at the beginning of the pandemic and had returned to 90% of predicted levels by late August. The researchers estimate that about 37,000 fewer people received buprenorphine treatment for opioid dependency as a result of the pandemic. This reduced access to treatment for opioid addiction may have increased overdose deaths.

The study's findings underscore the importance of access to treatment for opioid addiction and show how COVID-19 widened existing cracks in the U.S. health care delivery system. The researchers are hopeful that understanding prescription patterns -- and obstacles to care -- will help to improve outcomes for patients suffering from addiction.

Credit: 
Princeton School of Public and International Affairs

Betting on drones as smart agricultural tools for pesticide use in farms

image: Can unmanned aerial vehicles spray pesticides efficiently? In a new study, researchers assessed their use in rice paddy fields in terms of costs, capacity, and management efficiency

Image: 
Yuna Seo

Besides enabling more potent smartphones and higher download speeds while riding the subway, cutting-edge technologies like artificial intelligence, robotics, and wireless communications are on the verge of revolutionizing well-established industrial fields. A remarkable example is "smart agriculture," which has seen a tremendous increase in the use of drones for various tasks, especially in Japan.

Drones, or "unmanned aerial vehicles" (UAVs), have been the focus of extensive research for agricultural applications. For example, they can take aerial images of a field and, through subsequent image processing, identify problems in specific areas of the crop fields. Another notable use case for UAVs that has been quickly gaining traction is the spraying of pesticides. In Japan, the number of hectares sprayed by drones saw a stunning 45-fold increase from 2016 to 2018. Similarly, the number of registered UAVs for agricultural spraying increased from a mere 227 to 1552 between those years.

While UAVs could either replace or complement traditional pesticide spraying methods, it remains to be proven whether they are superior to conventional methods in practice.

"Following recent technological demonstrations and verifications at field sites, there is an increasing need for farm management research of smart agricultural technology including cost and efficiency analyses; this is essential for its implementation in farms," explains Yuna Seo, who is Junior Associate Professor at Tokyo University of Science, Japan.

In an effort to address this knowledge gap, Seo led a recent study published in MDPI's Sustainability in which she, with her student Shotaro Umeda, compared different pesticide spraying technologies using realistic data. More specifically, the researchers evaluated and compared the costs, working capacity, and management efficiency of drones versus remote-controlled (RC) helicopters and tractor-mounted boom sprayers for preventively spraying pesticides over rice paddies. They made these comparisons for seven different paddy field areas to take into account differences in scale for each method.

In terms of pest-control costs, the UAVs were only slightly less expensive per unit area than the boom sprayers, mainly due to the low price of drones and savings in fuel. In this regard, the RC helicopters were much more expensive. "Although the purchase cost of boom sprayers is almost double that of UAVs, the fixed costs of both end up being similar because of the high operation, maintenance and repair costs of drones, which are notorious obstacles in UAV introduction and adoption," remarks Seo.

As for the working capacity, RC helicopters could cover much more area per hour than both drones and boom sprayers. Still, drones had a slight advantage in daily area coverage over boom sprayers. Finally, to explore the management efficiency of each method, the researchers used a technique called "data envelopment analysis," which is widely used in economics and operations management to benchmark the performance of manufacturing and service operations. The results indicated that both boom sprayers and UAVs reached maximum or near-maximum efficiency for most paddy areas, while RC helicopters were much less efficient.
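The data envelopment analysis step can be sketched as a small linear program per spraying set-up. The sketch below uses the standard input-oriented CCR formulation; the costs and areas are invented for illustration and are not the study's figures.

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(inputs, outputs):
    """Input-oriented CCR efficiency score for each decision-making unit (DMU)."""
    inputs, outputs = np.atleast_2d(inputs), np.atleast_2d(outputs)
    n = inputs.shape[0]  # number of DMUs
    scores = []
    for o in range(n):
        # Variables: [theta, lambda_1 .. lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_j <= theta * x_o  (one row per input)
        A_in = np.hstack([-inputs[o][:, None], inputs.T])
        # Outputs: sum_j lambda_j * y_j >= y_o          (one row per output)
        A_out = np.hstack([np.zeros((outputs.shape[1], 1)), -outputs.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(inputs.shape[1]), -outputs[o]],
                      bounds=[(0, None)] * (n + 1),
                      method="highs")
        scores.append(res.fun)
    return np.array(scores)

# Hypothetical example: one input (cost) and one output (area sprayed) for four
# spraying set-ups; a score of 1.0 marks a point on the efficient frontier.
cost = [[2.0], [4.0], [6.0], [8.0]]
area = [[2.0], [3.0], [4.0], [4.0]]
print(dea_input_efficiency(cost, area))  # first set-up is efficient (score 1.0)
```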

Overall, this study showcased the benefits of drones as tools for rice production and compared them to other well-established technologies. But the use of drones in agriculture is not without limitations, such as maintenance costs and aviation laws that restrict higher pesticide payloads on drones; these will need to be addressed in the future. "The total costs and efficiency of UAVs would be comparable to that of boom sprayers, which is not a hindrance for farmers wanting to switch to drones. Therefore, technological advances and deregulation are necessary to expand the use of UAVs while meeting safety measures and ensuring applicability," explains Seo.

In conclusion, this study highlights both the advantages and limitations of using drones as agricultural tools. Still, there is no doubt that smart agriculture as a whole could greatly alleviate the labor shortage problems in countries with a rapidly aging population, of which Japan is a prime example. Given the potential upsides, let us hope UAVs quickly earn their spot in agriculture so that farmers can work smarter, not harder!

Credit: 
Tokyo University of Science

Study of marten genomes suggests coastal safe havens aided peopling of Americas

image: Photo of a Pacific marten (Martes caurina).

Image: 
L. L. Master

LAWRENCE -- How did the first humans migrate to populate North America? It's one of the great scientific puzzles of our day, especially because forbidding glaciers covered most of Canada, Alaska and the Pacific Northwest during the Last Glacial Maximum (LGM). These glaciers limited human movements between northern ice-free areas, like the Beringia Land Bridge, and southern ice-free areas, like the continental United States.

Now, research from the University of Kansas into the whole genomes of the American pine marten and Pacific pine marten -- weasel-like mammals that range today from Alaska to the American Southwest -- could shed light on how the first humans populated the Americas.

The study used genomic sequence data to determine biogeographic, colonization and demographic histories of martens in North America, and it found that a coastal population of Pacific martens may have inhabited forested refuges along the ice-bound coastline of Alaska and Canada during the LGM. According to the research, published in the Journal of Biogeography, these same forested sanctuaries could have provided food, shelter and other resources to bands of human migrants who populated North America.

"The 'Coastal Refugium Hypothesis' is the idea that there were pockets of ice-free land along the coast of northwestern North America, and also communities of organisms that lived in these areas," said lead author Jocelyn Colella, assistant professor of ecology & evolutionary biology at KU and assistant curator of mammals with the KU Biodiversity Institute and Natural History Museum. "Carnivores, like martens, take a lot of resources to sustain. They need something to eat -- and martens in particular are considered 'forest-associated,' meaning they also need complex forests in order to complete their life cycles. Evidence of martens in this area suggests there may have also been forests, not just tundra and ice, which is different than what we previously thought."

These forested pockets along the coastline where the coastal Pacific marten dwelled also could have served as sanctuaries for humans where they may have hunted, foraged and found access to shelter and supplies along their icy journey.

"Presumably, migrating humans along the coast would have been seafaring -- probably using some type of boat," Colella said. "But humans have to eat, too, and so the next question is, were we good enough fishermen to live solely off of the sea, or were there other resources? It looks like there may have been substantially more resources in these areas: plants, small mammals, maybe we even ate martens. Who knows?"

Colella and her colleagues first sequenced the marten's whole genome and then performed analyses with powerful computers to determine when the different species diverged (or split off from a common ancestor and became distinct species) and infer the historical distributions of martens along the complex Northwestern Coast.

"You can compare genomes from different species to see how their evolutionary histories differ, and we do this a lot with phylogenies -- a phylogeny is kind of like a family tree, it shows the evolutionary relationships between different organisms," Colella said. "There appear to be two different lineages of Pacific marten -- one coastal lineage found on three islands along the North Pacific coast and then another continental lineage located in areas of the American Southwest, but also in the Pacific Northwest and California mountain ranges. We found a deep history for coastal Pacific martens along the North Pacific Coast -- our dates show about 100,000 years, which means they've been there since the Last Glacial Maximum when ice covered most of North America. At that time, these martens may have been isolated off the coast in ice-free areas -- or 'glacial refugia' -- available to terrestrial animals, meaning there was also terrestrial area available for humans migrating along the coast."

While marten genomes show them inhabiting these coastal refuges during the LGM, so far the fossil record hasn't confirmed this idea, according to Colella. However, the KU researcher believes some fossils may need to be reexamined.

"Scientists haven't found a lot of marten fossils from this time period along the coast," she said. "But, a lot of the fossils they have found are incomplete, sometimes just teeth, and it's hard to identify species by just their teeth. Interestingly, coastal Pacific martens are found only on islands where their semi-aquatic relative, mink, are not found -- it's possible that coastal Pacific martens have filled that niche instead. In fact, my previous work on marten morphology found that coastal martens are larger than martens on the mainland, so it's possible that some fossils may have been identified as mink but are actually be martens."

Although the coastal Pacific marten isn't today classified as a distinct species, Colella believes the research indicates it should be.

"This is the first time we've detected the coastal Pacific marten, and it's really different from mainland Pacific marten," she said. "The problem is we don't yet have enough samples to say it's a distinct species. The next step is to compare the morphology of the two groups and increase our genetic sample sizes, so we can test the species status of the insular Pacific marten. Pacific martens have this really weird geographic range -- with the coastal group found on just a couple islands off the coast of Southeast Alaska and the mainland group found in Pacific Northwest and California forests, but also on mountaintops in New Mexico and Utah. Alaskan islands are very different from the Pacific Northwest, which is very different from the Southwest. Based on their distribution today, it seems that the Pacific martens were historically more widespread. It's kind of amazing, as we start to look at the genetics of some of these animals, just how little we know."

Credit: 
University of Kansas

Forest elephants are now critically endangered -- here's how to count them

image: A team of scientists compared methodologies to count African forest elephants (Loxodonta cyclotis), recently recognized by the IUCN as a species distinct from African savannah elephants and as Critically Endangered.

Image: 
WCS Gabon

LIBREVILLE, Gabon (April 15 2021) - A team of scientists led by the Wildlife Conservation Society (WCS) and working closely with experts from the Agence Nationale des Parcs Nationaux du Gabon (ANPN) compared methodologies to count African forest elephants (Loxodonta cyclotis), recently recognized by the IUCN as a species distinct from African savannah elephants and as Critically Endangered. The study is part of a larger initiative in partnership with Vulcan Inc. to provide the first nationwide census in Gabon for more than 30 years. The results of the census are expected later this year.

Unlike savannah elephants (Loxodonta africana), which can be counted directly, usually through aerial surveys, accurately censusing elusive forest elephants is more challenging, and refinements of methods were needed. Publishing a new survey method for counting forest elephants in the journal Global Ecology and Conservation, the team compared the traditional methodology of counting elephant dung piles along line transects with spatial capture-recapture (SCR) techniques using both camera traps and DNA dung analysis. SCR estimates populations by measuring how many times, and in what locations, individual animals are recounted.
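The recount principle behind capture-recapture can be illustrated with its simplest, non-spatial form: the two-session Lincoln-Petersen estimator (here in Chapman's bias-corrected variant). SCR additionally models where each recount happens; the counts below are invented purely to show the arithmetic.

```python
def lincoln_petersen(n1, n2, m):
    """Chapman's bias-corrected two-session capture-recapture estimate.

    n1: individuals identified in session one, n2: in session two,
    m:  individuals identified in both sessions (the 'recaptures').
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical numbers: 40 individuals genotyped from dung in a first pass,
# 35 in a second pass, 13 of them seen in both.
print(lincoln_petersen(40, 35, 13))  # ≈ 104.4 elephants
```

Intuitively, the fewer recaptures there are relative to the session totals, the larger the unseen population must be.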

Said the study's lead author, Alice Laguardia of WCS's Gabon Program: "The more accurately we can count forest elephants, the more we can measure whether conservation efforts are successful. We are hopeful that the results of this study will help governments and conservation partners protect this Critically Endangered species throughout its range."

Researchers assessed the performance of the methodologies on three relatively large forest elephant populations in Gabon. They found that the SCR method that used DNA sampling of dung was comparable in accuracy to the line transect method but less expensive on larger scales.

Stephanie Bourgeois, coauthor and geneticist at ANPN, said: "Testing of this new DNA approach has been made possible by the recent development of novel genetics techniques by ANPN and the creation of a new genetics lab in Gabon, enabling all DNA analyses to be performed in-country."

SCR camera-trap surveys were more precise on smaller scales but more expensive. The authors recommend that both SCR methods continue to be used and developed. They say future findings and improvements should be compiled across studies so that the methods mature into a robust option for monitoring the African forest elephant across its range and for informing conservation strategies and action.

Forest elephants have been decimated by ivory poachers in recent years. A WCS-led census released in 2014 documented a 65 percent decline in forest elephant numbers between 2002 and 2013. Through this new study, researchers will gain a better understanding of how many forest elephants remain and where they reside. Efforts have been focused on Gabon as it is thought to harbor more than 50 percent of the remaining forest elephant population, despite accounting for less than 15 percent of the species' range, making Gabon the most important country for forest elephant conservation.

"As long as ivory is a precious commodity, elephants will be at risk," said Lee White, the Gabonese Minister of Water, Forests, the Seas, the Environment, charged with Climate Change & Land Use Planning. "In Africa there is a clear link between environmental governance, peace and security. Countries that have lost their elephant populations have all too often descended into civil strife. Through the results of this study we hope to obtain a clear picture of the trend of poaching and elephant populations in all of Gabon."

"Vulcan recognizes the significant role of accurate population data for conservation management and policy decisions," said Ted Schmitt, director, conservation at Vulcan Inc. "By providing timely census data, we can fill critical knowledge gaps and enable prioritization of conservation resources. We are pleased to be part of this effort with Wildlife Conservation Society and the government of Gabon to help preserve this important species."

Funding for this critical work was provided by Vulcan Inc., a Seattle company founded by the late philanthropist Paul G. Allen and his sister Jody Allen, who currently serves as chair.

Credit: 
Wildlife Conservation Society

A neuromagnetic view through the skull

image: For the first time, researchers were able to demonstrate that noninvasively measured fast brain oscillations show significant variability from stimulus to stimulus (rows), both in terms of the timing of successive action potentials (shifts in blue/red vertical bands) and in terms of their strength (color intensity).

Image: 
Charité | Gunnar Waterstraat

The processing of information inside the brain is one of the body's most complex processes. Disruption of this processing often leads to severe neurological disorders. The study of signal transmission inside the brain is therefore key to understanding a myriad of diseases. From a methodological point of view, however, it creates major challenges for researchers. The desire to observe the brain's nerve cells operating 'at the speed of thought', but without the need to place electrodes inside the brain, has led to the emergence of two techniques featuring high temporal resolution: electroencephalography (EEG) and magnetoencephalography (MEG). Both methods enable the visualization of brain activity from outside the skull. However, while results for slow currents are reliable, those for fast currents are not.

Slow currents - known as postsynaptic potentials - occur when signals created by one nerve cell are received by another. The subsequent firing of impulses (which transmit information to downstream neurons or muscles) produces fast currents which last for just a millisecond. These are known as action potentials. "Until now, we have only been able to observe nerve cells as they receive information, not as they transmit information in response to a single sensory stimulus," explains Dr. Gunnar Waterstraat of Charité's Department of Neurology with Experimental Neurology on Campus Benjamin Franklin. "One could say that we were effectively blind in one eye." Working under the leadership of Dr. Waterstraat and Dr. Rainer Körber from the PTB, a team of researchers has now laid the foundations which are needed to change this. The interdisciplinary research group succeeded in rendering the MEG technology so sensitive as to enable it to detect even fast brain oscillations produced in response to a single sensory stimulus.

They did this by significantly reducing the system noise produced by the MEG device itself. "The magnetic field sensors inside the MEG device are submerged in liquid helium, to cool them to -269°C (4.2 K)," explains Dr. Körber. He adds: "To do this, the cooling system requires complex thermal insulation. This superinsulation consists of aluminum-coated foils which produce magnetic noise and will therefore mask small magnetic fields such as those associated with nerve cells. We have now changed the design of the superinsulation in such a way as to ensure this noise is no longer measurable. By doing this, we managed to increase the MEG technology's sensitivity by a factor of ten."

The researchers used the example of stimulating a nerve in the arm to demonstrate that the new device is indeed capable of recording fast brain waves. As part of their study on four healthy subjects, the researchers applied electrical stimulation to a specific nerve at the wrist whilst at the same time positioning the MEG sensor immediately above the area of the brain which is responsible for processing sensory stimuli applied to the hand. To eliminate outside sources of interference such as electric networks and electronic components, the measurements were conducted in one of the PTB's shielded recording rooms. The researchers found that, by doing so, they were able to measure the action potentials produced by a small group of simultaneously activated neurons in the brain's cortex in response to individual stimuli. "For the first time, a noninvasive approach enabled us to observe nerve cells in the brain sending information in response to a single sensory stimulus," says Dr. Waterstraat. He continues: "One interesting observation was the fact that these fast brain oscillations are not uniform in nature but change with each stimulus. These changes also occurred independently of the slow brain signals. There is enormous variability in how the brain processes information about the touch of a hand, despite all of the stimuli applied being identical."

The fact that the researchers are now able to compare individual responses to stimuli opens the way for neurology researchers to investigate questions which previously remained unanswered: To what extent do factors such as alertness and tiredness influence the processing of information in the brain? What about additional stimuli which are received at the same time? The highly sensitive MEG system could also help scientists to develop a deeper understanding of, and better treatments for, neurological disorders. Epilepsy and Parkinson's disease are examples of disorders which are linked to disruptions in fast brain signaling. "Thanks to this optimized MEG technology, our neuroscience toolbox has gained a crucial new tool which enables us to address all of these questions noninvasively," says Dr. Waterstraat.

Credit: 
Physikalisch-Technische Bundesanstalt (PTB)

Mathematical method builds synthetic hearts to identify how heart shape could be linked to disease

Researchers from King's College London have created 3D replicas of full-sized healthy adult hearts from Computed Tomography (CT) images and analyzed how cardiac shape relates to function.

Published today in PLOS Computational Biology, the study also includes 1000 new synthetic hearts that have been made open access, allowing researchers to download and use them to test new algorithms, test in-silico therapies, run more statistical analyses, or generate specific shapes from the average models.

Statistical shape analysis is a technique that allows the rigorous study of anatomical changes of the heart across different subjects. Using this technique, the researchers created an average heart from a cohort of 20 healthy adult hearts and then deformed this average to generate 1000 new synthetic 3D whole hearts.

By making a synthetic heart diverge further from the average shape, more abnormal or extreme hearts can be created, bounded by the range of variation observed in the cohort.
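A common way to implement this average-and-deform procedure is a point-distribution model: compute the mean shape, extract principal modes of variation, and sample mode weights bounded by the cohort's observed spread. The sketch below assumes hypothetical landmark data and is not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical cohort: 20 hearts, each represented by 30 flattened landmark
# coordinates (real cardiac meshes have thousands of vertices).
cohort = rng.normal(size=(20, 30))

mean_shape = cohort.mean(axis=0)  # the 'average heart'
# Principal modes of anatomical variation via SVD of the centred cohort.
U, s, modes = np.linalg.svd(cohort - mean_shape, full_matrices=False)
stdev = s / np.sqrt(len(cohort) - 1)  # spread captured by each mode

def sample_shapes(n, n_modes=5, bound=3.0):
    """Deform the mean along the leading modes, bounded to +/- `bound` SDs."""
    weights = rng.uniform(-bound, bound, size=(n, n_modes)) * stdev[:n_modes]
    return mean_shape + weights @ modes[:n_modes]

synthetic = sample_shapes(1000)  # 1000 synthetic 'hearts'
```

Raising `bound` produces the more extreme shapes; keeping it within the cohort's variation keeps the synthetic anatomy plausible.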

Lead researcher Cristobal Rodero said: "Even in healthy people, everyone has a slightly different heart shape. Knowing these differences and how they affect cardiac function is a task for which computer simulations are an ideal tool."

With the data, researchers ran electromechanical simulations on the 20 hearts plus 38 extreme cases in which some automatically selected features were exaggerated, such as bigger or smaller hearts or thicker walls.

Mr Rodero said the research is a first milestone in understanding how subtle anatomical changes can impact cardiac function, and that it paves the road for other researchers to replicate and expand the study's results.

"This research could be used as an early diagnosis later down the track. For instance, we found that there is an area in the heart right before the aorta that when it gets thicker, it has a big impact in the predicted function."

"This anatomical trait has strong links with the latest results of my colleague Maciej Marciniak in the study of local hypertrophy patterns, also published in his recent study in the Journal of Hypertension."

The core of the study is based on the ability to simulate the heartbeat by computer, a complex process that requires supercomputers (more than 200,000 hours of computations that are run in parallel and take about 5 days).

The trick is then to learn the simulated mechanisms with so-called emulators, which reproduce the simulation output in a split second.

Building these emulators not only expedites the process, but also reveals which factors are most strongly linked with cardiac health and disease; this is how the study tackles the complex problem of how cardiac shape relates to function.
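The two-stage pattern described here, expensive simulations producing training data and a fast statistical emulator then standing in for them, can be sketched with a Gaussian-process surrogate. Everything in this sketch is an illustrative assumption (the input dimension, the toy response surface, and the choice of scikit-learn), not the authors' actual pipeline.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# Stand-in training data: each row is a set of shape coefficients for one
# simulated heart; y is a scalar simulation output (e.g. a function metric).
X = rng.uniform(-2, 2, size=(58, 5))      # e.g. 20 cohort + 38 extreme cases
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2  # toy response surface

# The expensive simulations supply the training set; the emulator learns the
# shape-to-function mapping and then evaluates new shapes near-instantly.
emulator = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)

# Predict the output for a previously unseen shape, with an uncertainty
# estimate, without rerunning the supercomputer simulation.
new_shape = rng.uniform(-2, 2, size=(1, 5))
prediction, std = emulator.predict(new_shape, return_std=True)
```

A Gaussian process is a common choice for this kind of surrogate because it also quantifies how uncertain the emulator is away from the training simulations.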

The vision in this research field is a future where our digital twins, our personalized computational replicas, support decision-making in the management of cardiovascular disease, as the authors described in a white paper in the European Heart Journal.

Credit: 
King's College London

Tracking the progress of fusion power through 60 years of neutral particle analysis

As the world's energy demands grow, so too does concern over the environmental impact of power production. The need for a safe, clean, and reliable energy source has never been clearer. Fusion power could fulfil such a need. A review paper published in EPJ H examines the six-decade history of neutral particle analysis (NPA), developed at the Ioffe Institute, Saint Petersburg, Russia. NPA is a vital diagnostic tool used in magnetic plasma confinement devices, such as tokamaks, that will house the nuclear fusion process and generate the clean energy of the future.

As the review's corresponding author Dr Pavel Goncharov, laboratory head at the Advanced Plasma Research Laboratory, Saint Petersburg Polytechnic University, Russia, explains, fusion power requires bringing processes that occur within the cores of stars down to earth. "Plasma is the dominant state of the visible matter in the present Universe and nuclear fusion powers the stars," says the physicist. "The ability to burn deuterium formed at the beginning of the Universe and generate energy represents a new height for mankind."

The key to this is creating and confining plasma with powerful magnetic fields. But understanding the fusion processes within the plasma, and how best to harness them, also requires powerful diagnostic techniques; NPA is one such method.

The concept of NPA originates in the 1951 work of Andrei D. Sakharov, a Nobel Peace Prize laureate, who realised that, within a plasma, fast-moving hydrogen ions would collide with slow-moving neutral hydrogen atoms, transferring their charge. Thus, measuring the flux of these fast neutral atoms as they are ejected from the plasma is a good way of diagnosing its ion distributions.

NPA is not only closely tied to the history of controlled fusion physics but will also play a key role in its future. The technique will be one of the key diagnostic methods employed by ITER -- currently the world's largest fusion experiment, which hopes to bridge the gap between smaller-scale experiments and a functioning nuclear fusion power plant.

"Three factors played a role in our interest in fusion science," remark the authors. "First, it involves multiple branches of fundamental science. Second, this field is of great practical importance. Third, a new clean and abundant energy source is the basis for a better future for mankind. This is an impressive combination."

Credit: 
Springer

Genetic ancestry versus race can provide specific, targeted insights to predict and treat many diseases

The complex patterns of genetic ancestry uncovered from genomic data in health care systems can provide valuable insights into both genetic and environmental factors underlying many common and rare diseases--insights that are far more targeted and specific than those derived from traditional ethnic or racial labels like Hispanic or Black, according to a team of Mount Sinai researchers.

In a study in the journal Cell, the team reported that this information could be used to better understand and predict which populations are more susceptible to certain disorders--including cancers, asthma, diabetes, and cardiovascular disease--and to potentially develop early interventions.

"This is the first time researchers have shown how genetic ancestry data could be used to enhance our understanding of disease risk and management at a health system level," says senior author Eimear Kenny, PhD, Professor of Medicine, and Genetics and Genomic Sciences, at the Icahn School of Medicine at Mount Sinai. "By linking this data directly to health outcomes, we believe we're contributing to an ongoing conversation to move beyond the current role of race and ethnicity in medicine."

For its study, the research team drew on Mount Sinai's BioMe™ BioBank program, recognized as one of the world's leading repositories of genomic information for diverse populations. Using machine learning methodology, scientists identified 17 distinct ethnic communities from among the 30,000 participants in the BioMe BioBank. They then linked this data to thousands of health outcomes in Mount Sinai's electronic health records. Among the findings: 25 percent of BioMe participants had genetic links to populations--such as Ashkenazi Jewish and Puerto Rican--that predisposed them to certain genetic diseases.
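As a rough illustration of how machine learning can identify communities in genetic data, here is a minimal sketch using modularity-based clustering on a toy similarity graph. The participants, edges, and the choice of NetworkX's greedy modularity algorithm are all assumptions for illustration; the study's actual method, operating on genomic data from 30,000 participants, is not described in this level of detail here.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy genetic-similarity graph: nodes are participants, edges connect pairs
# with high genetic similarity. Names and edges are purely illustrative.
G = nx.Graph()
G.add_edges_from([
    ("p1", "p2"), ("p2", "p3"), ("p1", "p3"),  # tightly linked group 1
    ("p4", "p5"), ("p5", "p6"), ("p4", "p6"),  # tightly linked group 2
    ("p3", "p4"),                              # weak bridge between groups
])

# Community detection partitions participants into densely connected groups,
# which can then be linked to health outcomes in the electronic records.
communities = greedy_modularity_communities(G)
```

On real data this graph would have tens of thousands of nodes, and each detected community could be cross-referenced against phenotypes in the health records.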

"The traditional use of demographic data by health systems fails to capture the rich ethnic heritage of patients, and thus all the genetic and environmental factors that can affect rates of disease even within the same population," says Dr. Kenny, who is Founding Director of the Institute for Genomic Health at Mount Sinai. "Our study used genomic data embedded in health system records to show how patients with origins from different countries in the Americas can have different rates of disease. For example, people of Puerto Rican and Mexican descent are broadly classified as Hispanic or Latinx, yet the former population has one of the highest rates of asthma in the world, while the latter population has one of the lowest."

The Mount Sinai study cited the APOL1 gene, which can confer a significantly greater risk of kidney and cardiovascular disease, as another reason for moving beyond the traditional demographic labels used by health care systems. The risk variants of APOL1 are most frequently seen in populations across the Americas that share African genetic ancestry. However, there are many populations around the world of African descent that might not self-identify as African, and thus be unaware that they might harbor those risk variants. Furthermore, that knowledge gap may result in these populations being underrepresented in APOL1 research.

"Our study underscores that there are limits to the narrow demographic labels used in medicine and research today--and society in general, for that matter--to attempt to characterize disease and its risk factors," says Dr. Kenny. "The types of information that can be derived from using biological markers of ancestry, however, convey a much richer and more sophisticated layer of understanding of disease risk and burden, one that could have enormous implications for health care systems globally."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Study: New approach may boost prostate cancer immunotherapies

image: Leigh Ellis, PhD

Image: 
Cedars-Sinai

LOS ANGELES (April 15, 2021) -- Researchers have discovered a new way to transform the tissues surrounding prostate tumors to help the body's immune cells fight the cancer. The discovery, made in human and mouse cells and in laboratory mice, could lead to improvements in immunotherapy treatments for prostate cancer, the second most common cancer in men in the US.

Using a technique called epigenetic reprogramming, investigators altered the tumor and tumor microenvironment by inhibiting expression of a protein known as enhancer of zeste homolog 2, or EZH2, which is found at high levels in prostate cancer. This protein helps tumors resist checkpoint inhibitor immunotherapies, which are designed to block certain other proteins that can stop immune cells from killing cancer cells.

By inhibiting EZH2, the investigators were able to reduce the tumor's resistance to checkpoint inhibitors. If confirmed in clinical trials, the research findings, recently published in the peer-reviewed journal Nature Cancer, potentially may increase the percentage of prostate cancer patients who successfully respond to this therapy.

"Our goal is to one day apply our strategy to fire up the immune system of prostate cancer patients and make the cancer vulnerable to the patient's own immune system," said molecular and cellular biologist Leigh Ellis, PhD, scientific director of the Center for Urologic Research Excellence at Cedars-Sinai Cancer and corresponding author of the multi-institutional study.

The study's findings were made through the study of cancer epigenetics. Whereas traditional genetics describe the way DNA sequences in our genes are passed from one generation to the next, epigenetics describe how genes get turned on and off in an individual's body to produce proteins. Disrupted epigenetic mechanisms can alter gene function and form cancerous cells.

The investigators zeroed in on the EZH2 mechanisms that altered the gene function of prostate cells, switching off the immune response, and on how to switch it back on.

Well-established research has demonstrated that EZH2 inactivates several tumor-suppressor genes in numerous cancers, including prostate. Based on those earlier findings, the researchers genetically and chemically inhibited EZH2 to activate important immune-related genes called interferon-stimulated genes. Interferons are proteins that alert the immune system to attack cancer cells.

This action, called "viral mimicry," involves reopening dormant areas of the genome--an organism's complete set of genetic instructions--that typically are closed for action. That process prompted the tumor cell to activate the interferon-stimulated genes, which signaled the immune system to enhance the response to checkpoint inhibitor therapy.

"We discovered that viral mimicry is key to activating interferon-stimulated genes," Ellis said. "With epigenetic therapy, we're able to get the immune response we want."

In recent years, immunotherapy drugs have been added to chemotherapy regimens, or used alone, to help a patient's own immune cells attack cancer, but the response, while promising in treating skin, colon, lung, liver and other cancers, is still low--currently, about 70% to 85% of patients taking immunotherapy drugs fail to respond to them.

The researchers' study findings hold the potential for personalized treatment approaches going forward, which may include combined regimens of EZH2 inhibitors and checkpoint inhibitors as a novel strategy to increase prostate cancer response to checkpoint immunotherapy. Following recent Food and Drug Administration approval of EZH2 inhibitors, Ellis and Edwin M. Posadas, MD, the medical director of the Center for Urologic Research Excellence, and associate professor in the Division of Oncology at Cedars-Sinai Cancer, plan to launch a Cedars-Sinai-based clinical trial of an EZH2 inhibitor in prostate cancer patients.

"We are extremely pleased to see Dr. Ellis, one of our recent researcher recruits, contribute such a significant advance to the prostate cancer field," said Dan Theodorescu, MD, PhD, director of the Cedars-Sinai Cancer enterprise and professor of Surgery and Pathology and Laboratory Medicine. "Prostate cancer has been a challenge to treat even with immunotherapy, and this work will help overcome some of the barriers and hopefully improve outcome for patients."

Credit: 
Cedars-Sinai Medical Center

Japanese-European research team discovers novel genetic mitochondrial disorder

image: A Japanese-European team of scientists described novel mutations in the LIG3 gene, which plays a key role in mitochondrial DNA replication. Normally, mitochondrial DNA is repaired and replicated by LIG3 activity, but if the gene contains mutations, enzymes necessary for energy production are not produced, potentially leading to central nervous system symptoms and muscle defects.

Image: 
Fujita Health University

DNA ligase proteins, which facilitate the formation of bonds between separate strands of DNA, play critical roles in the replication and maintenance of DNA. The human genome encodes three different DNA ligase proteins, but only one of those proteins--DNA ligase III (LIG3)--is expressed in mitochondria. LIG3 is therefore crucial for mitochondrial health, and inactivation of the homologous protein in mice causes profound mitochondrial dysfunction and early embryonic mortality. In an article recently published in the peer-reviewed journal Brain, a team of European and Japanese scientists, led by Dr. Mariko Taniguchi-Ikeda from Fujita Health University Hospital, describes a set of seven patients with a novel mitochondrial disorder caused by biallelic variants in LIG3, the gene that encodes the LIG3 protein. Their report provides a description of the patients' symptoms and a mechanistic exploration of the mutations' effects.

For Dr. Taniguchi-Ikeda, the investigation began with her desire to help a young patient. "I wanted to make a distinct clinical and genetic diagnosis for the affected patient," she explains, "because his elder brother had passed away and the surviving boy was referred to my outpatient ward for detailed genetic tests." By performing whole-exome sequencing of DNA from the surviving patient, Dr. Taniguchi-Ikeda discovered that he had inherited a p.P609L LIG3 variant from his father and a p.R811Ter LIG3 variant from his mother. The parents had kept the deceased brother's dried umbilical cord, and by analyzing DNA extracted from that source, Dr. Taniguchi-Ikeda confirmed that the brother had carried the same LIG3 variants.

Having detected a novel genetic mitochondrial disorder, Dr. Taniguchi-Ikeda wished to conduct further research by identifying other patients with pathogenic LIG3 variants. She could find no other such cases in Japan, but through a collaboration with Dr. Makiko Tsutsumi from Fujita Health University and researchers in Europe, including Professor Elena Bonora from the University of Bologna and Professor Roberto De Giorgio from the University of Ferrara, she learned of two European families also affected by such variants. One was an Italian family in which three brothers had all inherited a p.K537N variant from their father and a p.G964R variant from their mother, and the other was a Dutch family in which two daughters had inherited a p.R267Ter variant from their father and a p.C999Y variant from their mother.

These patients experienced a complex syndrome involving severe gut dysmotility and neurologic abnormalities as the most consistently observed clinical signs. The neurologic abnormalities included leukoencephalopathy, epilepsy, migraine, stroke-like episodes, and neurogenic bladder. The prominent changes in the gut were decreased myenteric neuron counts and elevated fibrosis and elastin levels. Muscle pathology assessments revealed decreased staining intensities for cytochrome C oxidase.

To better characterize how the patients' LIG3 mutations could lead to such phenotypes, the researchers conducted experiments both in vitro and on zebrafish. The in vitro experiments with patient-derived fibroblasts showed that the mutations resulted in reduced LIG3 protein levels and diminished ligase activity. The consequent deficits in mitochondrial DNA maintenance would do much to explain the patients' presentations. Experiments with zebrafish showed that disrupting the lig3 gene produced brain alterations and gut transit impairments analogous to those observed in the patients.

The study brings to light a novel disorder resulting from disruption of a gene that plays a critical role in the maintenance of mitochondrial DNA. In describing the importance of these findings, Dr. Taniguchi-Ikeda concludes, "Our study may facilitate efforts to diagnose patients with mitochondrial diseases. Our findings will also be beneficial to future investigations into the mitochondrial DNA repair system."

Credit: 
Fujita Health University

A more complete account

Even the mention of parasites can be enough to make some people's skin crawl. But to recent UC Santa Barbara doctoral graduate Dana Morton, these creepy critters occupy important ecological niches, fulfilling roles that, in her opinion, have too often been overlooked.

That's why Morton has just released the most extensive ecological food web that includes parasites. Eight years in the making, the dataset includes over 21,000 interactions between 942 species, all thoroughly annotated. The detailed description, published in the journal Scientific Data, is a boon for basic research, conservation efforts and resource management.

Understanding who eats whom, or trophic interactions, in an ecosystem is prime information for biologists. These relationships alone can tell researchers a great deal about a system, its complexity and even its overall health. However, ecologists often overlook parasites when investigating these interactions, perhaps because parasitology only recently joined the sphere of ecology, emerging from the medical sciences.

"But you can't overlook parasite interactions once you know about them," said Morton. "If you're ignoring half of the interactions in the system, you don't really know what's going on in that system."

Previous work led by her mentors, Armand Kuris and Kevin Lafferty in the Department of Ecology, Evolution, and Marine Biology, found that parasites were common in estuarine food webs. But Morton wanted to tackle a more diverse ecosystem. Given the body of research conducted on California's kelp forests, she thought it would be easy enough to simply add parasites and small, free-living invertebrates to an existing network. But she quickly realized that previous food webs compiled for the kelp forest were too coarse to build on. They focused on big fish eating little fish, but gave less attention to mammals, birds and invertebrates. She'd need to start from scratch.

An exhaustive endeavor

First Morton compiled a list of species that call the kelp forest home. She and her co-authors used basically every credible source they could find. They pored over literature reviews and got data from long-term research projects, like the Santa Barbara Coastal Long Term Ecological Research Program and the Channel Islands National Park Kelp Forest Monitoring program. She also sought out fellow divers, and when that wasn't enough, Morton and her team conducted their own field sampling.

Morton especially acknowledged the help she received from undergraduate student volunteers and experts throughout the process, including Milton Love, Bob Miller, Christoph Pierre, Christian Orsini and Clint Nelson at UC Santa Barbara; Mark Carr at UC Santa Cruz; Ralph Appy at Cabrillo Marine Aquarium; and David Kushner at Channel Islands National Park.

The authors' next task was discerning all the interactions, which fell primarily into three sorts: predator-prey, parasite-host and predator-parasite. Morton's general rule was that every animal had to eat something, and every node should have at least one connection.

It soon became clear that adults and juveniles often have different roles in food webs, requiring more detail than other food webs usually contain. This also was an exhaustive task that required scouring academic literature and databases, conducting field observations and dissections and talking with expert researchers.

By combining information on predator-prey and parasite-host relationships, Morton was able to infer some relationships based strictly on logical reasoning. For example, this helped to determine whether an ingested parasite was likely to die or infect the predator that ate its host.

Each node on the food web -- corresponding to a particular species or life stage -- had a reference in its entry. In fact, Morton made sure that the entire web was replete with metadata. "We don't want food webs to be just these black boxes where you don't know how they were put together, so you don't know how to use them appropriately," she said.

She was particularly attentive to uncertainty, and estimated her confidence for each of the tens of thousands of putative relationships. For instance, certain parasites may turn up in only one or two specimens simply because they are rare, rather than due to any specialization. Unobserved but real interactions between hosts and parasites create a false negative in the food web.

Morton, therefore, estimated the probability of false-negative links for every potential host-parasite interaction. If an absent interaction had more than a 50% chance of being a false negative, then she assigned it as a link in the network. She also removed parasite species that were especially prone to false negatives, to reduce overall error.
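The imputation rule described above, adding any absent host-parasite pair whose estimated false-negative probability exceeds 50% as a link, can be sketched as follows. The species names and probabilities are made up for illustration; they are not values from Morton's dataset.

```python
# Observed interactions recorded during field sampling (illustrative).
observed_links = {("snail", "trematode_A")}

# Estimated probability that each *absent* host-parasite pair is a false
# negative, i.e. a real interaction that sampling simply missed.
p_false_negative = {
    ("kelp_crab", "trematode_A"): 0.62,  # rarely sampled host
    ("senorita", "trematode_A"): 0.31,
}

# Absent interactions with > 50% false-negative probability become links.
imputed = {pair for pair, p in p_false_negative.items() if p > 0.5}

# The final web combines observed and imputed links.
food_web = observed_links | imputed
```

The same probability estimates also flag parasite species whose links are mostly uncertain, which is why especially error-prone species were dropped from the web entirely.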


image: About a dozen purple urchins crawl around on the reef as fish and giant kelp fronds drift in the sunlight above. While animals like purple urchins often remain in the forests and reefs, fish can travel between many different ecosystems in a region.

Image: 
Katie Davis / Caselle Lab

A major challenge Morton faced was simply knowing when the project was done. There are few sharp divides in the ocean; ecosystems are incredibly interconnected, and many species that live in the kelp forest also inhabit other ecosystems in Southern California. This project could have crept its way to becoming an account of the entire eastern Pacific.

To keep it from ballooning, Morton limited the study to the rocky reef in the depth range of giant kelp. She also made no attempt to include viruses and bacteria, nor did she specify the many phytoplankton species. Eventually the food web reached a point where additions did not change the overall structure of the network, indicating that the web was converging toward a complete account.

A complex system

Morton's years of work yielded a comprehensive food web comprising 492 free-living species and 450 parasites. Accounting for specific life stages brings the total nodes to 1,098, with 21,956 links between them.

"This is the first food web for a really structurally complex marine ecosystem, that's really dynamic and open," Morton said. She was amazed by the extent to which the network expanded after accounting for often overlooked groups of organisms. Including small, free-living invertebrates doubled the network size. Adding parasite interactions doubled it again.

The results highlight something she suspected all along: "Whether or not you decide to build a food web (which I would not recommend)," she joked, "you could still think about the parasites that might be participating in the system. If you're missing half of the interactions, you're probably missing a huge part of the picture."

Parasites were even more prevalent in the kelp forest food web than in the estuarine food webs that inspired her project. Although a parasite-filled food web might sound unhealthy, according to Morton, it is actually a good sign because parasites often need complex food chains to complete their lifecycles. "Finding a lot of parasites indicates that there are intact trophic structures and high species diversity," she said.

The parasites are only present because the kelp forest provides so many opportunities for them. Kelp forests are well known biodiversity hotspots, particularly those in the Santa Barbara Channel, which lie at the confluence of the cold-water communities north of Point Conception and the warm-water communities of Southern and Baja California.

"This new look at kelp forest food webs puts fishes in the back seat," said co-author Kevin Lafferty, Morton's advisor at the Marine Science Institute. "Most of the action is with the invertebrates. And most of those invertebrates were parasites."

Morton was surprised to find a large number of parasites that use birds and mammals as their final hosts. This suggests that birds and mammals have a larger presence in the kelp forest ecosystem than she expected.

As for next steps, Morton has already set to work comparing her kelp forest food web to the few other intertidal and lake food webs in the literature that include parasites. She also plans to study how the kelp forest food web might change as the ocean warms. But the main point of publishing her data, she said, was to inform conservation efforts and resource management in kelp forest ecosystems.

When studying ecosystems, there's often a big cloud of unknowns that lead to a lot of variability in the data. "My hope in doing this was to provide people with the resources to get a more mechanistic understanding of what they're seeing," Morton said, "because now they basically have a map of all the things that possibly could be happening in this ecosystem."

Credit: 
University of California - Santa Barbara

Microorganisms on the Rio Grande Rise are a basis for life and a possible origin of metals

image: Manipulator arm on the HyBIS hybrid remotely operated vehicle collecting crust samples from the Rio Grande Rise

Image: 
Bramley Murton

The abundant biological and mineral diversity of the Rio Grande Rise, a seamount in the depths of the Atlantic Ocean about 1,500 km from the coast of Brazil, is probably due to a great extent to little-known microscopic creatures. 

Researchers affiliated with the University of São Paulo’s Oceanographic Institute (IO-USP), collaborating with colleagues at the UK’s National Oceanography Center, investigated the microorganisms inhabiting the seamount’s ferromanganese crusts and concluded that bacteria and archaea are probably responsible for maintaining the abundant local life, besides being involved in the process of biomineralization that forms the metals present in the crusts. 

An article published in the journal Microbial Ecology describes the study, which was funded by FAPESP and the UK’s Natural Environment Research Council (NERC). 

In 2014, the International Seabed Authority (ISA) awarded Brazil a 15-year grant of mineral exploitation rights to the Rio Grande Rise. Comprising 167 member states plus the European Union, the ISA is mandated under the United Nations Convention on the Law of the Sea to organize, regulate and control all mineral-related activities in the international seabed area, which corresponds to some 50% of the total area of the world’s oceans.

“Very little is known about the area’s biodiversity or about the impact of mining on its ecosystems,” said Vivian Pellizari, a professor at IO-USP and principal investigator for the study. 

The study was part of a Thematic Project supported by FAPESP. The article is one of the results of the PhD research of Natascha Menezes Bergo, currently a postdoctoral research intern at IO-USP.

“Although the process known as microbial biomineralization is well-known, oxidation and precipitation of manganese hadn’t been proved, and we had no idea how it occurred in ocean areas. In July 2020, however, an article by US researchers was published in Nature showing for the first time that bacteria use manganese to convert carbon dioxide into biomass via a process called chemosynthesis,” said Bergo, who participated in sample collection in 2018 on the UK research vessel RRS Discovery (read more at: agencia.fapesp.br/29617/). 

“One of these bacteria, which belongs to the group Nitrospirae, was present in the DNA sequences we extracted from crust samples collected at the Rio Grande Rise. This is strong evidence that the metals there are formed not just by a geological process but also by a biological process in which microorganisms play an important part,” she noted. 

Besides iron and manganese, the crusts are rich in cobalt, nickel, molybdenum, niobium, platinum, titanium and tellurium, among other elements. Cobalt is essential to the production of rechargeable batteries, for example, and tellurium is a key input for the production of high-efficiency solar cells. In late 2018, Brazil applied to the ISA for an extension of its continental shelf to include the Rio Grande Rise.

In other parts of the world, similar areas that have been studied for longer with the same objectives include the Clarion-Clipperton Zone and the Takuyo-Daigo Seamount, both in the North Pacific, as well as the Tropic Seamount in the North Atlantic.

Formation

The Rio Grande Rise has an area of some 150,000 km², three times the size of Rio de Janeiro state, and depths ranging from 800 m to 3,000 m. Formed when present-day Africa and South America separated from the supercontinent Gondwana between 146 million years ago (mya) and 100 mya, the Rise was an island that sank some 40 mya, probably owing to the weight of a volcano and its lava and the movement of tectonic plates (read more at: revistapesquisa.fapesp.br/en/revelations-from-a-submerged-archipelago-2/). 

On one of their 2018 expeditions, the researchers collected samples from part of the Rise: ferromanganese crusts, skeletons of the corals that live on them, calcarenite rock, and biofilms from the crusts' surfaces. These biofilms are structured microbial communities enveloped in substances they secrete to protect themselves from threats such as lack of nutrients or potential toxins.

“Finding biofilm was an interesting surprise, as it’s an indicator of an incipient biomineralization process,” Bergo said. “We found the same microorganisms in our biofilm, coral, calcarenite and crust samples. The only difference was the age of the surfaces. The coral is more recent than the crusts, and the biofilm is even younger.”

A total of 666,782 DNA sequences were recovered from the samples. The bacteria and archaea found by the scientists belong to groups known to be involved in the nitrogen cycle whereby ammonia is converted into nitrite and nitrate, and hence to serve as a source of energy for other microorganisms. Besides Nitrospirae, they found other prokaryotes such as the archaeon class Nitrososphaeria. Sequencing of the samples also revealed groups involved in the methane cycle such as Methylomirabilales and Deltaproteobacteria.

The results expand scientists' understanding of the microbial diversity and potential ecological processes found on the ferromanganese crusts of the South Atlantic seabed. They will also contribute to future regulation of possible mining activities in the area of the Rio Grande Rise.

“As the crusts are removed, local circulation will probably change and this, in turn, will change the available supply of organic matter and nutrients, and hence the local microbiome and all the life associated with it,” Bergo said. “Besides, the crusts grow 1 mm every 1 million years on average, so there won’t be time for recolonization. It’s no accident that so many studies have been published recently on how to assess and mitigate the impact of deep-sea mining.”

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Unlocking new avenues for curing cancer: Cell receptor neuropilin-1 could hold the key

video: A growing body of research shows that inhibiting neuropilin-1 activity can slow tumor progression in several types of cancers

Image: 
Chinese Medical Journal

Cancer is an insidious disease that, despite being known for centuries, is still very difficult to diagnose and treat. Thus, cancer has been the focus of much research in the biomedical fields. Today, with research methods advancing rapidly and the pool of researchers growing constantly, our fundamental understanding of the mechanisms of cancer formation and proliferation has increased, and new, more promising treatment avenues have opened up.

In a new review article published in Chinese Medical Journal, scientists from Guangxi Medical University, China, have organized and analyzed the latest papers published on one promising avenue of exploration for many cancers, the cell receptor neuropilin-1 (NRP-1). Their work provides a much-needed overview of the salient findings in this field that helps direct future research on cancer treatment.

Neuropilins are transmembrane protein receptors with many functions in the body. For instance, they act as part of cell signaling pathways, regulating cellular function when various ligands (proteins) bind to them. NRP-1 in particular is critical to nerve cell and heart and blood vessel development and is otherwise found to be expressed on the surface of many types of cells, including regulatory T cells (a type of immune cell), osteoblasts (bone forming cells), adipocytes (fat storage cells), keratinocytes (a type of skin cell), and bone marrow fibroblasts (connective tissue in the bone marrow).

Because of its role in cell growth and blood vessel formation, this receptor is implicated in cancer development and proliferation. For instance, one action of NRP-1, forming complexes with various growth factors--which are a type of protein or hormone that helps stimulate cells to multiply--appears to promote tumor proliferation and metastasis (movement of the tumor into other parts of the body). In addition, NRP-1 binds to a protein expressed on regulatory T cells, which are important players in anti-tumor immunity. This binding interferes with T cell functioning, speeding up tumor progression. Further, it promotes the action of tumor-associated macrophages, which suppress the body's normal immune system, helping tumors proliferate. Finally, the importance of NRP-1 in blood vessel formation also applies to the blood vessels that tumors use to obtain nutrients for continued growth.

All of these factors combine to make hampering NRP-1 function an attractive idea for treating cancers. Several therapeutic strategies have already been developed, including blocking the interaction of NRP-1 with growth factors to prevent tumor blood-vessel formation, preventing NRP-1 presence in tumor regulatory T cells so that cancers are more susceptible to the body's immune system, and directly targeting NRP-1 expression in several types of cells. Other, more cutting-edge approaches include the development of alternative splicing variants, essentially altered versions of the original proteins that bind to NRP-1; these variants change how NRP-1 behaves when they bind to it. Some studies have also looked into combining multiple drugs that target NRP-1, or administering NRP-1-related drugs along with existing chemotherapy drugs. These studies have found that such "combination therapies" work well against multiple types of cancers.

Lead author of the article, Dr. Shao-Dan Liu, discusses the importance of reviewing recent progress in research on NRP-1 and its relationship to cancers. He says, "A comprehensive review gives us a better sense of where we are at the moment and what still needs to be known. For example, a lot of questions remain to be answered about the exact molecular mechanisms of NRP-1 action in cancer progression."

Nonetheless, recent NRP-1 research has considerably increased our understanding of this receptor's role in tumor development and enhanced our ability to predict the likely outcome of cancers, in addition to diagnosing and treating them. Dr. Liu adds, "We also hope that reading about all these advancements in cancer treatments will give the public confidence that cancer research is on the right track."

Indeed, in time, cancer may not be the formidable disease that it is today.

Credit: 
Cactus Communications

Modelling ancient Antarctic ice sheets helps us see the future of global warming

image: Lead author Anna Ruth Halberstadt in Antarctica

Image: 
Anna Ruth Halberstadt

AMHERST, Mass. - Last month saw the average concentration of atmospheric carbon dioxide (CO2) climb to almost 418 parts-per-million, a level not seen on Earth for millions of years. In order to get a sense of what our future may hold, scientists have been looking to the deep past. Now, new research from the University of Massachusetts Amherst, which combines climate, ice sheet and vegetation model simulations with a suite of different climatic and geologic scenarios, opens the clearest window yet into the deep history of the Antarctic ice sheet and what our planetary future might hold.

The Antarctic ice sheet has attracted the particular interest of the scientific community because it is "a lynchpin in the earth's climate system, affecting everything from oceanic circulation to climate," says Anna Ruth Halberstadt, a Ph.D. candidate in geosciences and lead author of the paper, which appeared recently in the journal Earth and Planetary Science Letters. Additionally, the ice sheet contains enough frozen water to raise current sea levels by 57 meters.

Yet, it has been difficult to accurately reconstruct the mid-Miocene Antarctic climate. Researchers can run models, but without geologic data to check the models against, it's difficult to choose which simulation is correct. Conversely, researchers can extrapolate from geologic data, but such data points offer only local snapshots, not a wider climatic context. "We need both models and geologic data to know anything at all," says Halberstadt. There's one final complicating factor: geology. Antarctica is bisected by the Transantarctic Mountains, and any clear picture of Antarctica's deep history must be able to account for the slow uplift of the continent's mountain range. "Without knowing the elevation," says Halberstadt, "it's difficult to interpret the geologic record."

Halberstadt and her colleagues, including researchers in both New Zealand and the UK, devised a unique approach in which they coupled an ice sheet model with a climate model, while also simulating the types of vegetation that would grow under each climatic model scenario. The team used historical geologic datasets that included such known paleoclimatic data points as past temperature, vegetation, and glacial proximity, to benchmark their modeled climates. Next, the team used their benchmarked model runs to make inferences about which CO2 and tectonic model scenarios satisfied the known geologic constraints. Finally, Halberstadt and her colleagues extrapolated continent-wide glacial conditions.
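The benchmarking step described above — keeping only the model scenarios whose simulated outputs agree with the geologic record — can be sketched as a simple constraint filter. This is a hypothetical illustration only: the variable names, constraint ranges, and scenario values below are assumptions for demonstration, not figures from the study.

```python
# Illustrative sketch of benchmarking model runs against geologic constraints.
# Each run pairs a CO2 / tectonic-uplift scenario with its simulated outputs;
# a run is "plausible" only if every output falls within the range allowed by
# the proxy data. All numbers are invented for demonstration purposes.

GEOLOGIC_CONSTRAINTS = {
    # variable name: (min, max) range permitted by geologic proxy data
    "summer_temp_c": (2.0, 10.0),
    "vegetation_cover": (0.1, 0.6),
}

MODEL_RUNS = [
    {"co2_ppm": 280, "uplift_m": 1000, "summer_temp_c": 1.0,  "vegetation_cover": 0.05},
    {"co2_ppm": 500, "uplift_m": 1000, "summer_temp_c": 6.5,  "vegetation_cover": 0.30},
    {"co2_ppm": 840, "uplift_m": 500,  "summer_temp_c": 12.0, "vegetation_cover": 0.70},
]

def satisfies_constraints(run, constraints):
    """Return True if every constrained variable lies within its allowed range."""
    return all(lo <= run[var] <= hi for var, (lo, hi) in constraints.items())

# Keep only the CO2/tectonic scenarios consistent with the geologic record.
plausible = [r for r in MODEL_RUNS if satisfies_constraints(r, GEOLOGIC_CONSTRAINTS)]
for r in plausible:
    print(f"CO2={r['co2_ppm']} ppm, uplift={r['uplift_m']} m fits the constraints")
```

In this toy example only the middle scenario survives the filter; the surviving scenarios are then the ones used to extrapolate continent-wide glacial conditions.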

The research, which was supported by the NSF, reconstructed a thick but diminished ice sheet under the warmest mid-Miocene environmental conditions. In this model, although the margins of Antarctica's ice sheet had retreated significantly, greater precipitation led to a thickening of the ice sheet's interior regions. The team's modelling further suggests that ice over the Wilkes Basin region of Antarctica advanced during glacial periods and retreated during interglacials. The Wilkes Basin is a region thought to be particularly sensitive to future warming and a potential contributor to future sea-level rise.

"Antarctica's paleoclimate," says Halberstadt, "is fundamental to understanding the future."

Credit: 
University of Massachusetts Amherst