Tech

TGen-led study results suggest more accurate diagnostic for breast cancer

PHOENIX, Ariz. -- Jan. 26, 2021 -- Breast cancer, even at its initial stages, could be detected earlier and more accurately than with current techniques by using blood samples and a unique proteomics-based technology, according to the findings of a study led by the Translational Genomics Research Institute (TGen), an affiliate of City of Hope.

Patrick Pirrotte, Ph.D., an Assistant Professor and Director of TGen's Collaborative Center for Translational Mass Spectrometry, and an international team of researchers developed a test that can detect minute amounts of breast cancer biomarkers shed into the bloodstream from the extracellular matrix (ECM), the material surrounding cancer cells, according to the findings of their study recently published in the scientific journal Breast Cancer Research.

For decades, physicians have relied on mammography breast imaging to look for cancer in a quest to provide prevention, enable early detection and reduce deaths. But the unintended consequences of both false positives and false negatives have offset the hoped-for gains of this inexact type of screening, including unnecessary biopsies of lesions that turn out to be benign, complications from surgery, and cardiovascular disease.

ECM is the network of molecules -- including collagen, enzymes and glycoproteins -- that provide structural and biochemical support to surrounding cells, including cancer cells. During the early stages of cancer, these proteins and protein fragments -- which form the tumor microenvironment -- leak into circulating blood.

"Our data reinforces the idea that this release of ECM components into circulation, even at the earliest stages of malignancy, can be used to design a specific and sensitive biomarker panel to improve detection of breast cancer," said Dr. Pirrotte, the study's senior author. "Using a highly specific and sensitive protein signature, we devised and verified a panel of blood-based biomarkers that could identify the earliest stages of breast cancer, and with no false positives."

To establish this protein signature, researchers used blood samples from 20 patients with invasive ductal carcinoma (IDC) breast cancer and from 20 women without cancer who nonetheless had positive mammograms but benign pathology at biopsy. The findings were also compared with samples from five groups of individuals diagnosed with other cancers: ovarian, lung, prostate and colon cancer, and melanoma.

Because the number of ECM molecules in blood is relatively low, researchers relied on proteomics and new sample preparation and enrichment techniques, including the use of hydrogel nanoparticles, to accurately detect cancer-associated biomarkers. The nanoparticles bind ECM proteins associated with cancer proliferation, migration, adhesion and metastasis, or the spread of cancer from one part of the body to another. Many of these proteins had never before been observed in blood samples.

"Our study results show a high degree of specificity of those markers as circulating proteins in breast cancer patients," said Khyatiben Pathak, Ph.D., a staff scientist in TGen's Collaborative Center for Translational Mass Spectrometry and a study author. "Our results justify further studies with larger groups to evaluate whether this biomarker panel improves the positive predictive value of mammography for breast cancer detection."

Credit: 
The Translational Genomics Research Institute

Two anti-viral enzymes transform pre-leukemia stem cells into leukemia

image: These are leukemia cells expressing the enzymes APOBEC3C and ADAR1.

Image: 
UC San Diego Health Sciences

Since stem cells can continually self-regenerate, making more stem cells, and differentiate into many different specialized cell types, they play an important role in our development and health. But there can also be a dark side -- stem cells can sometimes become cancer stem cells, proliferating out of control and leading to blood cancers, such as leukemia and multiple myeloma. The self-renewing nature of cancer stem cells makes them particularly hard to eradicate, and they're often the reason a blood cancer reoccurs.

Researchers at UC San Diego Health and University of California San Diego School of Medicine are working to understand what pushes pre-cancer stem cells to transform into cancer stem cells and are developing ways to stop that switch.

Their latest study, published January 26, 2021 in Cell Reports, is the first to show that, in response to inflammation, two enzymes called APOBEC3C and ADAR1 work together to fuel the transition from pre-cancer stem cells to cancer stem cells in leukemia. Both APOBEC3C and ADAR1 are activated by inflammatory molecules, especially during the body's immune response to viruses.

The researchers also found they can prevent the formation of leukemia stem cells in the laboratory by inhibiting ADAR1 with fedratinib or ruxolitinib, two existing medications for myelofibrosis, a rare bone marrow cancer.

"APOBEC3C and ADAR1 are like the Bonnie and Clyde of pre-cancer stem cells -- they drive the cells into malignancy," said co-senior author Catriona Jamieson, MD, PhD, Koman Family Presidential Endowed Chair in Cancer Research, deputy director of Moores Cancer Center, director of the Sanford Stem Cell Clinical Center and director of the CIRM Alpha Stem Cell Clinic at UC San Diego Health.

Jamieson's team has long studied ADAR1, an enzyme that edits a cell's genetic material to control which genes are turned on or off at which times, and its role in leukemia stem cells. They also previously found that high ADAR1 levels correlate with reduced survival rates for patients with multiple myeloma.

In their new study, the researchers collected blood stem cells and saliva samples donated by 54 patients with leukemia and 24 healthy control participants. They compared the whole genome sequences of pre-leukemia stem cells and leukemia stem cells collected from the patients. They were surprised to discover an uptick in the levels of both enzymes, APOBEC3C and ADAR1, during the progression to leukemia stem cells. APOBEC3C typically helps cells maintain genomic stability.

The team found that, in response to inflammation, APOBEC3C promotes the proliferation of human pre-leukemia stem cells. That sets the stage for ADAR1, which becomes overzealous in its editing, skewing gene expression in a way that supports leukemia stem cells. When the researchers inhibited ADAR1 activation or silenced the gene in patient cells in the laboratory, they were able to prevent the formation of leukemia stem cells.

APOBEC3C, ADAR1 and their roles in cancer stem cells are now the focus of Jamieson's NASA-funded project to develop the first dedicated stem cell research laboratory within the International Space Station (ISS).

That's because the NASA Twins Study -- a comprehensive biological comparison of identical twins Scott Kelly, who spent nearly a year aboard the ISS, and Mark Kelly, who stayed on Earth -- revealed an increase in inflammatory growth factors, immune dysregulation and pre-cancer mutations in Scott's blood upon his return. These molecular changes, the perfect conditions to activate APOBEC3C and ADAR1, persisted for almost a year.

"Under the auspices of our NASA task order, we are now developing APOBEC3C and ADAR1 inhibitors as a risk mitigation strategy for astronauts, so we can hopefully predict and prevent pre-cancer stem cell generation in low-Earth orbit and on deep space missions," Jamieson said.

The team is also interested in further exploring the link between viral infections and cancer. According to Jamieson, infection with viruses can trigger a flood of cytokines, molecules that help stimulate the body's immune forces. As part of that response, ADAR1 is activated to help immune cells proliferate.

"We need APOBEC3C and ADAR to help us fight off viruses," she said. "So now we're wondering -- do these enzymes play a role in the immune response to COVID-19? And could there be a downside to that as well? Can the immune response to a viral infection later raise a person's risk of pre-cancer stem cell development and ultimately cancer stem cell generation, and can we intervene to prevent that?"

Credit: 
University of California - San Diego

Southern Africa's most endangered shark just extended its range by 2,000 kilometers

video: Rare footage of shorttail nurse shark in Mozambique

Image: 
WCS Mozambique

MAPUTO, Mozambique (January 26, 2021) - A team of marine scientists led by the Wildlife Conservation Society (WCS) has confirmed that southern Africa's most threatened endemic shark - the Critically Endangered shorttail nurse shark (Pseudoginglymostoma brevicaudatum) - has been found to occur in Mozambique; a finding that represents a range extension of more than 2,000 kilometers (1,242 miles).

Publishing their findings in the journal Marine Biodiversity, the team said that the discovery was based on several records of the shark including underwater video surveys collected in 2019, recent photos of shore-based sport anglers' catches, and the identification of a specimen collected in 1967.

The diminutive shorttail nurse shark reaches lengths of approximately 75 centimeters (30 inches). Owing to its strong association with coral reefs, it is under particular threat from overexploitation by coastal fisheries and habitat degradation, and is suspected to have declined by more than 80 percent over the last 30 years.

The scientists say that the findings expand the species range southward from the coast of Tanzania by some 2,200 kilometers (1,367 miles) and 1,100 kilometers (683 miles) westward from Madagascar across the Mozambique Channel.

One of the records of the shark, from Mozambique's Ponta do Ouro Partial Marine Reserve, suggests that the species benefits from some degree of protection within a large coastal marine protected area (MPA). The authors warn, though, that the species' range within Mozambique may span a large proportion of the country's unprotected coral reef habitat.

Said Rhett Bennett, WCS Shark and Ray Conservation Program Manager, Madagascar & Western Indian Ocean: "The shorttail nurse shark is under threat within much of its Mozambique range. There are no species recovery plans in place for the species and no specific regulations pertaining to its harvest, other than a listing on the Kenya threatened and protected species list."

The authors recommend that the species should be considered for legal protection in Mozambique and throughout its limited range. In addition, they say it should be better monitored, and subject to improved management measures to reduce targeted and incidental catch.

WCS works on shark conservation around the world. The majority of the global trade in shark fins and other products such as meat remains unregulated, pushing many species toward extinction. In 2019, at CITES CoP18, WCS helped lead efforts to expand the protection of sharks from unsustainable trade.

The work was conducted in partnership with the Mozambique Instituto Nacional de Investigação Pesqueira, and the South African Institute for Aquatic Biodiversity.

Aspects of this project were funded by the Shark Conservation Fund, a philanthropic collaborative pooling expertise and resources to meet the threats facing the world's sharks and rays. The Shark Conservation Fund is a project of Rockefeller Philanthropy Advisors.

Credit: 
Wildlife Conservation Society

AI used to predict early symptoms of schizophrenia in relatives of patients

image: Sunil Kalmady Vasu (centre) led a recent study with fellow U of A researchers including Russ Greiner (left), Andrew Greenshaw (right) and Serdar Dursun (not pictured), showing that a machine learning tool could help predict early symptoms of schizophrenia in siblings and children of patients, potentially leading to earlier diagnosis and treatment.

Image: 
University of Alberta (taken pre-COVID-19)

University of Alberta researchers have taken another step forward in developing an artificial intelligence tool to predict schizophrenia by analyzing brain scans.

In recently published research, the tool was used to analyze functional magnetic resonance images of 57 healthy first-degree relatives (siblings or children) of schizophrenia patients. It accurately identified the 14 individuals who scored highest on a self-reported schizotypal personality trait scale.

Schizophrenia, which affects 300,000 Canadians, can cause delusions, hallucinations, disorganized speech, trouble with thinking and lack of motivation, and is usually treated with a combination of drugs, psychotherapy and brain stimulation. First-degree relatives of patients have up to a 19 per cent risk of developing schizophrenia during their lifetime, compared with the general population risk of less than one per cent.

"Our evidence-based tool looks at the neural signature in the brain, with the potential to be more accurate than diagnosis by the subjective assessment of symptoms alone," said lead author Sunil Kalmady Vasu, senior machine learning specialist in the Faculty of Medicine & Dentistry.

Kalmady Vasu noted that the tool is designed to be a decision support tool and would not replace diagnosis by a psychiatrist. He also pointed out that while having schizotypal personality traits may cause people to be more vulnerable to psychosis, it is not certain that they will develop full-blown schizophrenia.

"The goal is for the tool to help with early diagnosis, to study the disease process of schizophrenia and to help identify symptom clusters," said Kalmady Vasu, who is also a member of the Alberta Machine Intelligence Institute.

The tool, dubbed EMPaSchiz (Ensemble algorithm with Multiple Parcellations for Schizophrenia prediction), was previously used to predict a diagnosis of schizophrenia with 87 per cent accuracy by examining patient brain scans. It was developed by a team of researchers from U of A and the National Institute of Mental Health and Neurosciences in India. The team also includes three members of the U of A's Neuroscience and Mental Health Institute--computing scientist and Canada CIFAR AI Chair Russ Greiner from the Faculty of Science, and psychiatrists Andrew Greenshaw and Serdar Dursun, who are authors on the latest paper as well.
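
The press release does not include code, but as a rough illustration of the general idea behind an ensemble over multiple brain parcellations, the sketch below trains one classifier per parcellation-specific feature set and averages their predicted probabilities. All data, atlas names and model choices are hypothetical placeholders, not the authors' EMPaSchiz pipeline.

```python
# Toy sketch of an ensemble over multiple brain parcellations: one classifier
# per parcellation-specific feature set, with probabilities averaged at the end.
# All arrays, atlas names and model choices are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_subjects = 100
y = rng.integers(0, 2, size=n_subjects)  # 1 = patient, 0 = control (toy labels)

# One functional-connectivity feature matrix per parcellation scheme
parcellations = {
    "atlas_A": rng.normal(size=(n_subjects, 400)),
    "atlas_B": rng.normal(size=(n_subjects, 800)),
    "atlas_C": rng.normal(size=(n_subjects, 200)),
}

probas = []
for name, X in parcellations.items():
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    # Cross-validated probability of the "patient" class for every subject
    probas.append(cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1])

ensemble_score = np.mean(probas, axis=0)      # average across parcellations
prediction = (ensemble_score >= 0.5).astype(int)
print("toy ensemble accuracy:", np.mean(prediction == y))
```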

Kalmady Vasu said the next steps for the research will be to test the tool's accuracy on non-familial individuals with schizotypal traits and to track assessed individuals over time to learn whether they develop schizophrenia later in life.

Kalmady Vasu is also using the same principles to develop algorithms to predict outcomes such as mortality and readmissions for heart failure in cardiovascular patients through the Canadian VIGOUR Centre.

"Severe mental illness and cardiovascular problems cause functional disability and impair quality of life," Kalmady Vasu said. "It is very important to develop objective, evidence-based tools for these complex disorders that afflict humankind."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Male breast cancer patients face high prevalence of heart disease risk factors

Male breast cancer patients were found to have a high prevalence of cardiovascular conditions, in a small study of this rare patient population presented at the American College of Cardiology's Advancing the Cardiovascular Care of the Oncology Patient Virtual course.

"Due to the rarity of male breast cancer, there is no cardiovascular data from larger clinical trials or population studies. The lack of large data makes it even more important to individualize cardiovascular assessment and management based on each patient's unique oncologic, therapeutic and pre-existing cardiovascular risk profile to support them through cancer treatment into survivorship," said Michael Ibrahim, fourth year medical student at Georgetown University and one of the study authors.

Researchers from Georgetown Lombardi Comprehensive Cancer Center and MedStar Washington Hospital Center in Washington conducted a retrospective chart review of 24 male breast cancer patients evaluated at the medical centers. The patients were between 38 and 79 years old; 42% were African American, 29% Caucasian, 4% Hispanic and 25% another ethnicity. Half of the patients had a family history of breast cancer.

The majority of patients--79%--had invasive ductal carcinoma, which is the most common type of breast cancer. Invasive ductal carcinoma occurs when cancer starts in the breast ducts and spreads into the surrounding breast tissue.

All patients underwent a mastectomy, while 4% received anthracycline chemotherapy, 8% received HER2-targeted therapy, 16% received radiation and 71% received hormone therapy. Six patients were diagnosed with a second primary malignancy and three with a third primary malignancy.

Researchers found 88% of patients were overweight, 58% had high blood pressure and 54% had high cholesterol. Tachyarrhythmia, or an abnormally increased heart rate, preexisted in 8% of patients and developed in 13% of patients while undergoing treatment. Two patients were found to have decreased ejection fraction or decrease in how much blood the heart pumps out with each beat. Two patients developed heart failure--a chronic condition where the heart doesn't pump blood as well as it should--after treatment.

"How similar or dissimilar male and female breast cancer patients are is the fundamental, unanswered question. Contrary to most other medical conditions, data on breast cancer are driven from female patients. We extrapolate the evidence from female breast cancer patients, or the age matched male general population, and apply it to the cardiovascular care for male breast cancer patients," Ibrahim said. "However, in reality, we do not truly know the difference. For example, the median age of male breast cancer patients is older than their female counterparts. An older population could mean more cardiovascular comorbidities. More comorbidities could require more comprehensive and frequent serial monitoring. It is also unknown if risk of cardiotoxicity from anthracycline or HER-2 targeted therapy is greater or less in male versus female breast cancer patients, and more studies are warranted."

According to the researchers, the high prevalence of cardiovascular conditions in male breast cancer patients requires further investigation to better understand the risk of preexisting heart disease on long term outcomes for these patients. The findings also highlight the need for cardiologists and cardio-oncologists to be involved in male breast cancer treatment due to the common risk factors and potential cardiotoxic effects of breast cancer treatment.

Ibrahim added, "The field of cardio-oncology is well positioned to ensure that cardiologists and oncologists work closely together to address both the patients' oncologic and cardiac concerns. Cardio-oncologists or cardiologists should pay close attention to the proposed treatment plan and be part of a multidisciplinary cancer care team to evaluate the patients' cardiovascular risk prior to and through cancer treatments. On a more personal level, cancer patients are already surprised by their cancer diagnosis. Similar to the pretreatment consultation with radiation oncology, breast surgery, and medical oncology, an upfront cardiovascular risk assessment provides greater comfort and further minimizes psychological surprise with cardiovascular complications going into cancer treatment."

Credit: 
American College of Cardiology

Nanomedicine's 'crown' is ready for its close up

EAST LANSING, Mich. - An international team of researchers led by Michigan State University's Morteza Mahmoudi has developed a new method to better understand how nanomedicines -- emerging diagnostics and therapies that are very small yet very intricate -- interact with patients' biomolecules.

Medicines based on nanoscopic particles have the promise to be more effective than current therapies while reducing side effects. But subtle complexities have kept most of these particles confined to research labs and out of clinical use, said Mahmoudi, an assistant professor in the Department of Radiology and the Precision Health Program.

"There's been a considerable investment of taxpayer money in cancer nanomedicine research, but that research hasn't successfully translated to the clinic," Mahmoudi said. "The biological effects of nanoparticles, how the body interacts with nanoparticles, remain poorly understood. And they need to be considered in detail."

Mahmoudi's team has now introduced a unique combination of microscopy techniques to enable more detailed consideration of those biological effects, which the researchers described in the journal Nature Communications, published online on Jan. 25.

The team's methods let researchers see important differences between particles exposed to human plasma, the cell-free part of blood that contains biomolecules including proteins, enzymes and antibodies.

These biological bits latch onto a nanoparticle, creating a coating referred to as a corona (not to be confused with the novel coronavirus), the Latin word for crown. This corona contains clues about how nanoparticles interact with a patient's biology. Now, Mahmoudi and his colleagues have shown how to get an unprecedented view of that corona.

"For the first time, we can image the 3-D structure of the particles coated with biomolecules at the nano level," Mahmoudi said. "This is a useful approach to get helpful and robust data for nanomedicines, to get the kind of data that can affect scientists' decisions about the safety and efficacy of nanoparticles."

Although work like this is ultimately helping move therapeutic nanomedicines into the clinic, Mahmoudi is not optimistic that broad approval will happen any time soon. There's still much to learn about the particles. Furthermore, one of the things that researchers do understand very well -- that minute variations in these diminutive drugs can have outsized impact -- was underscored by this study.

The researchers saw that the coronas of nanoparticles from the same batch, exposed to the same human plasma, could differ, meaning that a single dose could provoke a variety of reactions in a patient.

Still, Mahmoudi sees an opportunity in this. He believes these particles could shine as diagnostic tools instead of drugs. Rather than trying to treat diseases with nanoscale medicine, he believes these persnickety particles would be well suited to the early detection of disease. For example, Mahmoudi's group has previously shown this diagnostic potential for cancers and neurodegenerative diseases.

"We could become more proactive if we used nanoparticles as a diagnostic," he said. "When you can detect disease at the earlier stages, it becomes easier to treat them."

Credit: 
Michigan State University

Opertech Bio's pioneering approach to taste testing and measurement published in JPET

video: TāStation® (taste+station) is an entirely new approach to taste testing and measurement. Highly efficient and cost effective, the TāStation® technology is used for evaluating new sweeteners, taste enhancers and bitter blockers and also has broad application in flavor optimization, providing a combinatorial strategy for developing the best tasting ingredient mixtures. In addition to its many applications in the food and beverage industry, TāStation® technology is particularly well suited for helping consumer healthcare and pharmaceutical companies evaluate the taste of new formulations of medications. This is because TāStation® is able to generate tremendous amounts of data using minute quantities of drug. The total amount of API (active pharmaceutical ingredient) required for an entire taste test is a small fraction of a typical single daily dose. The capability to test with this minimal level of pharmaceutical exposure cannot be accomplished by any other method or service provider.

The TāStation® includes a portable workstation with an automated high throughput system for delivering small samples in rapid succession to a subject. The TāStation® system has the capacity to determine the taste characteristics of a hundred samples in less than an hour. An individual can be trained through an interactive algorithm, which operates like a game, to make responses on a touch sensitive monitor that are dependent on the subject's ability to detect and distinguish taste stimuli. Responses are rewarded with an incremental point system that incentivizes sensory acuity. The subjects are focused on the game and may not even be aware that their taste sensitivities and preferences are being recorded.

More data means greater informative power. The comparatively large data sets that are rapidly generated by the TāStation® system are amenable to sophisticated computational and analytic tools not practical for the limited information from traditional taste assessment. Among the many advantages that result are vastly reduced errors, greater precision in detecting taste effects, improved consistency and predictive value of test outcomes, and the ability to more quickly and broadly test across a diversity of human subjects.

Image: 
Opertech Bio, Inc.

PHILADELPHIA, PA - January 25, 2021 - Opertech Bio, Inc., today announced the publication of a seminal research article describing the application of its pioneering TāStation® technology to the pharmacological characterization of human taste discrimination. The findings are published in the peer-reviewed Journal of Pharmacology and Experimental Therapeutics, JPET.

The paper, entitled "Rapid throughput concentration-response analysis of human taste discrimination," is the first to quantitatively define the concentration-response function for human taste discrimination, a crucial step in understanding the relationship between receptor activity and taste sensation. The paper is authored by the Opertech research team of R. Kyle Palmer, Mariah M. Stewart, and John Talley. The complete article is available at https://doi.org/10.1124/jpet.120.000373.

"The paper provides a rigorous scientific validation of TāStation® technology in the context of threshold sensory measurements and concentration-response analysis of sucrose and other sweeteners. The results are entirely consistent with receptor occupancy theory, implying that the taste discrimination concentration-response function is a direct reflection of the underlying activity of taste receptors" said R. Kyle Palmer, Opertech's Chief Science Officer and lead author on the paper. "The methodology enabled by the TāStation® produced remarkable test-to-test repeatability of results, which proved critical for statistical resolution of effects in small groups of subjects and even among individual participants," he added.

Credit: 
Opertech Bio Inc

First comprehensive LCA shows reprocessed medical devices cut GHG emissions in half

The carbon footprint of plastic production for initial use is greater than the global warming impact of the entire process used for medical device reprocessing
Use of reprocessed devices is environmentally superior to use of original products in 13 of 16 categories evaluated
Reprocessing found to advance "circular economy," a key strategy for reaching the UN Sustainability Goals
LCA offers evidence showing that in order to reduce greenhouse gas emissions and honor the Paris Climate Agreement, EU Member States must opt-in to EU Medical Device Regulation (MDR)'s reprocessing/remanufacturing provisions

[Berlin / Washington, DC - 25 January 2021] Hospitals could cut emissions associated with some medical device use in half by opting for regulated, reprocessed "single-use" medical devices, according to a new life cycle assessment (LCA). The LCA compared the use of a reprocessed electrophysiology catheter with the use of original catheters across 16 environmental impact categories and found that the reprocessed devices were superior in 13 of them.

The study, conducted by the Fraunhofer Institute for Environmental, Safety, and Energy Technology UMSICHT, a division of the world's leading applied research organization, Fraunhofer-Gesellschaft, and published in Sustainability, is the first comprehensive LCA exploring the environmental impact of a reprocessed "single-use" medical device compared to the "take-make-dispose" use of "single-use" original devices.

"By avoiding the use of virgin materials, reprocessing can reduce the environmental impacts of resource consumption and emissions, such as reducing abiotic resource use and the global warming impact (GWI)," said Anna Schulte, M.S.c., Fraunhofer Institute for Environmental, Safety, and Energy Technology UMSICHT and lead study author. "Hospitals that want to reduce harmful environmental impact should strongly consider using remanufactured 'single-use' medical devices like the EP catheters we studied."

"This comprehensive LCA confirms what we've thought to be true - that reprocessed medical devices are significantly environmentally superior to the original device," said Daniel J. Vukelich, President and CEO, Association of Medical Device Reprocessors. "These definitive environmental benefits, combined with the well documented financial and supply chain resiliency benefits of reprocessed devices make clear what EU Member States gain by 'opting in' to the EU MDR, and that hospitals already using remanufactured devices should double-down and expand their reprocessing programs."

Healthcare is particularly wasteful and toxic.

Last December, the journal Health Affairs concluded that the health sector is "responsible for 4.6% of global greenhouse gas emissions" and that the "vast majority of health care global greenhouse gas emissions originate in the supply chain." Hospitals' over-reliance on "disposable" or "single-use" medical devices and equipment over the last 30 years has been further exacerbated by the challenges associated with COVID. And supply chain vulnerabilities have demonstrated that reliance on a disposable culture may not always provide healthcare workers with the supplies they need.

LCA Finds Reprocessing Superior in 13 of 16 Environmental Impact Categories

The authors researched 16 "Impact Categories" and found reprocessed catheters superior to original catheters in 13, including:

Ozone Depletion: Reprocessed devices reduced ozone depletion by nearly 90% (89.7%).
Climate Change: Reprocessed catheters cut CO2-equivalent emissions in half (50.4%).
Photochemical Ozone Formation: Reprocessed devices reduced human health-impacting photochemical ozone formation by 72.8%.
Respiratory Inorganics: Reprocessed devices reduced disease incidence from respiratory inorganics by 66.8%.
Cancer Human Health Effects: Reprocessed catheters reduced cancer-causing human health effects by 60.9%.

Disinfectants and cleaning agents used for reprocessed catheters were found to elevate two environmental impacts for reprocessed devices compared to original catheters: land use for agriculture associated with citric acid cleaning agents (15.2%) and freshwater eutrophication (25.1%). The authors note, however, that certain environmental inputs for original catheter production are unknown and thus not entered in their calculations.

The environmental analysis confirms that reprocessing leads to a significant reduction in global warming impact when studying the "cradle to grave" contributions of water, sterilization gases, detergents and disinfectants, packaging materials and electricity (excluding the electricity used in original plastic production, which is unknown to the authors).
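
To make the bookkeeping concrete, the sketch below tallies a hypothetical per-use global warming impact for an original versus a reprocessed catheter. Every number is an invented placeholder, not a value from the Fraunhofer study; only the structure of the cradle-to-grave comparison is illustrated.

```python
# Illustrative bookkeeping only: how a per-use global warming impact (GWI)
# comparison might be tallied in a cradle-to-grave LCA. All numbers are
# hypothetical placeholders, not values from the Fraunhofer study.
ORIGINAL_GWI = {            # kg CO2-eq per original catheter (hypothetical)
    "plastic_production": 1.20,
    "assembly_and_packaging": 0.40,
    "sterilisation": 0.15,
    "transport_and_disposal": 0.25,
}
REPROCESSING_GWI = {        # kg CO2-eq per reprocessed use (hypothetical)
    "collection_and_transport": 0.10,
    "cleaning_and_disinfectants": 0.30,
    "sterilisation_and_packaging": 0.20,
    "testing_and_disposal_share": 0.15,
}

original_total = sum(ORIGINAL_GWI.values())
reprocessed_total = sum(REPROCESSING_GWI.values())
reduction = 1 - reprocessed_total / original_total
print(f"original:     {original_total:.2f} kg CO2-eq per use")
print(f"reprocessed:  {reprocessed_total:.2f} kg CO2-eq per use")
print(f"reduction:    {reduction:.0%}")
```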

Global Warming Impact of Plastic Manufacturing for Original "Single-Use" Medical Devices

The Fraunhofer researchers found that the global warming impact of plastic manufacturing for original EP catheters, which is avoided when using their reprocessed counterparts, accounts for more CO2 than the entire process of reprocessing, including the impact of cleaning the devices.

Global Use and Impact of Reprocessed SUDs

Reprocessing "single-use" medical devices, which requires regulated, commercial companies to collect, clean, sterilize, test and return devices for use again at hospitals is already in place at over 7,600 hospitals in the United States, Canada, Germany, England, Israel and Japan, yet only a small percent of the devices that can legally be reprocessed are. In the EU and the US, over 300 devices labelled for "single-use" are CE marked and cleared by FDA respectively for reprocessing.

Sustaining value after the end of life of SUDs helps hospitals to lower costs, as reprocessed devices cost significantly less than their original counterparts. Use of reprocessed devices helps hospitals redirect money to pressing needs, such as fighting COVID-19.

Credit: 
Whitecoat Strategies, LLC

Cholesterol starvation kills lymphoma cells

Nanoparticle is first therapy to trigger this novel way to kill lymphoma cells

Drug is being developed for clinical trials

Drug selectively attacks cancer cells, leaves normal cells unharmed

CHICAGO --- Scientists at Northwestern Medicine have developed a novel therapy to trick cancer cells into gobbling up what they think is their favorite food - cholesterol - which actually triggers their destruction. What appears to them as a cholesterol-loaded particle is actually a synthetic nanoparticle that binds to the cancer cells and starves them to death.

Although the research looked at lymphoma cells, scientists say the new experimental drug from Northwestern could be effective in other cancers with a similar appetite for cholesterol, such as kidney and ovarian cancer.

The study was published this month in the Journal of Biological Chemistry and builds upon prior work published by the group.

"Our ability to identify the novel mechanism of cell death gets us closer to translation to the bedside, where we can use this approach in patients with lymphoma who are not responding to more standard therapy," said co-corresponding author Dr. Leo I. Gordon, the Abby and John Friend Professor of Cancer Research at Northwestern University Feinberg School of Medicine and a Northwestern Medicine physician. "These data also provide a rationale to extend these observations to other cholesterol-addicted cancers, such as ovarian and kidney cancer."

This new therapy may work because the scientists also demonstrated cholesterol metabolism is quite different in target cancer cells from that of normal cells. That enables the experimental drug to selectively attack and kill vulnerable cancer cells while leaving normal cells unharmed.

New therapies are urgently needed to treat up to 40% of lymphomas that are aggressive and do not respond to current therapies. Also, SCARB1 - the target of the drug, which is involved in keeping cholesterol balanced in the cell - is present on other cancer cells that share the same appetite for cholesterol.

"Our therapy targets cancer cells that are dependent upon cholesterol uptake and perturbs the overall balance of cholesterol in the cell," said co-corresponding author Dr. C. Shad Thaxton, an associate professor of urology at Northwestern. "We discovered the cell tries to compensate by turning off pathways it requires to stay alive. We hope that this novel mechanism may be a blueprint for targeting other types of cancer."

The synthetic biologic nanoparticle therapy is the first of its kind to target cancer cells, specifically modulate cell cholesterol metabolism, and then trigger this novel way to kill cells. In addition, the scientists show the drug is not toxic to normal cells that do not harbor the same disruptions in cholesterol metabolism as the cancer cells do.

For the study, Northwestern scientists demonstrated the efficacy of the experimental drug and how it works in human cancer cell models, in animal models and in cancer cells obtained from patients with lymphoma.

The group will continue development of the drug so they can apply to begin Phase I clinical trials in patients. They also initiated a process of scaling up production of the drug to conduct studies in larger animals.

Credit: 
Northwestern University

Optimal information about the invisible

image: When light gets deflected by a disordered structure it becomes difficult to estimate where the target is located.

Image: 
TU Wien

Laser beams can be used to precisely measure an object's position or velocity. Normally, however, a clear, unobstructed view of this object is required - and this prerequisite is not always satisfied. In biomedicine, for example, the structures being examined are embedded in an irregular, complicated environment. There, the laser beam is deflected, scattered and refracted, often making it impossible to obtain useful data from the measurement.

However, researchers at Utrecht University (Netherlands) and TU Wien (Vienna, Austria) have now been able to show that meaningful results can be obtained even in such complicated environments. Indeed, there is a way to specifically modify the laser beam so that it delivers exactly the desired information in the complex, disordered environment - and not just approximately, but in a physically optimal way: nature does not allow for more precision with coherent laser light. The new technology can be used in very different fields of application, even with different types of waves, and has now been presented in the scientific journal Nature Physics.

The vacuum and the bathroom window

"You always want to achieve the best possible measurement accuracy - that's a central element of all natural sciences," says Stefan Rotter from TU Wien. "Let's think, for example, of the huge LIGO facility, which is being used to detect gravitational waves: There, you send laser beams onto a mirror, and changes in the distance between the laser and the mirror are measured with extreme precision." This only works so well because the laser beam is sent through an ultra-high vacuum. Any disturbance, no matter how small, is to be avoided.

But what can you do when you are dealing with disturbances that cannot be removed? "Let's imagine a pane of glass that is not perfectly transparent, but rough and unpolished like a bathroom window," says Allard Mosk from Utrecht University. "Light can pass through, but not in a straight line. The light waves are altered and scattered, so we can't accurately see an object on the other side of the window with the naked eye." The situation is quite similar when you want to examine tiny objects inside biological tissue: the disordered environment disturbs the light beam. The simple, regular straight laser beam then becomes a complicated wave pattern that is deflected in all directions.

The optimal wave

However, if you know exactly what the disturbing environment is doing to the light beam, you can reverse the situation: it is then possible to send in a complicated wave pattern instead of the simple, straight laser beam, which gets transformed into exactly the desired shape by the disturbances and hits right where it can deliver the best result. "To achieve this, you don't even need to know exactly what the disturbances are," explains Dorian Bouchet, the first author of the study. "It's enough to first send a set of trial waves through the system to study how they are changed by the system."

The scientists involved in this work jointly developed a mathematical procedure that can then be used to calculate the optimal wave from this test data: "You can show that, for various measurements, there are certain waves that deliver a maximum of information - for example, about the spatial coordinates at which a certain object is located."

Take for example an object that is hidden behind a turbid pane of glass: there is an optimal light wave that can be used to obtain the maximum amount of information about whether the object has moved a little to the right or a little to the left. This wave looks complicated and disordered, but is then modified by the turbid pane in such a way that it arrives at the object in exactly the desired way and returns the greatest possible amount of information to the experimental measuring apparatus.
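
As a rough numerical illustration of this principle (not the authors' actual procedure), one can ask which incident wave makes the measured output most sensitive to a small shift of the hidden object. In the sketch below, the "measured" transmission matrices are random stand-ins, and the optimal input is taken as the leading right-singular vector of the finite-difference derivative of the transmission matrix with respect to the object position.

```python
# Illustrative sketch: among all incident waves, the one whose transmitted
# pattern changes most per unit shift of the hidden object is the leading
# right-singular vector of dT/dx. Random matrices stand in for measured data.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 64, 64

def random_tm():
    # Random complex matrix as a stand-in for a measured transmission matrix
    return (rng.normal(size=(n_out, n_in))
            + 1j * rng.normal(size=(n_out, n_in))) / np.sqrt(2 * n_in)

dx = 1e-3                    # small shift of the hidden object (arbitrary units)
T0 = random_tm()             # "measured" transmission matrix, object at x
T1 = T0 + dx * random_tm()   # "measured" transmission matrix, object at x + dx

dT_dx = (T1 - T0) / dx       # finite-difference derivative of the matrix

# Optimal incident field: leading right-singular vector of dT/dx
_, _, vh = np.linalg.svd(dT_dx)
optimal_input = vh[0].conj()

random_input = rng.normal(size=n_in) + 1j * rng.normal(size=n_in)
random_input /= np.linalg.norm(random_input)

print("optimal-wave sensitivity:", np.linalg.norm(dT_dx @ optimal_input) ** 2)
print("random-wave sensitivity :", np.linalg.norm(dT_dx @ random_input) ** 2)
```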

Laser experiments in Utrecht

The fact that the method actually works was confirmed experimentally at Utrecht University: laser beams were directed through a disordered medium in the form of a turbid plate. The scattering behaviour of the medium was thereby characterised, and then the optimal waves were calculated in order to analyse an object beyond the plate - and this succeeded, with a precision in the nanometre range.

Then the team carried out further measurements to test the limits of their novel method: The number of photons in the laser beam was significantly reduced to see whether one then still gets a meaningful result. In this way, they were able to show that the method not only works, but is even optimal in a physical sense: "We see that the precision of our method is only limited by the so-called quantum noise," explains Allard Mosk. "This noise results from the fact that light consists of photons - nothing can be done about that. But within the limits of what quantum physics allows us to do for a coherent laser beam, we can actually calculate the optimal waves to measure different things. Not only the position, but also the movement or the direction of rotation of objects."
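
As background on what "limited only by quantum noise" means here (a general shot-noise argument, not a result of the paper): for any unbiased estimate of a parameter such as a position $x$, the error is bounded by the Cramér-Rao inequality, and for coherent light the Fisher information grows in proportion to the number of detected photons $N$, so

$$ \delta x \;\ge\; \frac{1}{\sqrt{\mathcal{F}(x)}}, \qquad \mathcal{F}(x) \propto N \;\;\Longrightarrow\;\; \delta x \propto \frac{1}{\sqrt{N}}. $$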

These results were obtained in the context of a program for nanometer-scale imaging of semiconductor structures, in which universities collaborate with industry. Indeed, possible areas of application for this new technology include microbiology but also the production of computer chips, where extremely precise measurements are indispensable.

Credit: 
Vienna University of Technology

Charged up: revolutionizing rechargeable sodium-ion batteries with 'doped' carbon anodes

image: Researchers in Korea have developed a "heteroatom-doped" (modified) carbon-based anode that helps sodium-ion batteries to surpass the performance of lithium-ion batteries.

Image: 
Korea Maritime and Ocean University

As the world becomes aware of the imminent environmental crisis, scientists have begun a search for sustainable energy sources. Rechargeable batteries like lithium-ion batteries are seeing a popularity surge, concurrent with the production of "greener" technologies such as electric propulsion ships (which are being developed to meet the environmental regulations of the International Maritime Organization) and other electric vehicles. But lithium is rare and difficult to distribute, putting its sustainability in doubt while also risking sharp increases in cost. Researchers have thus turned to "sodium-ion batteries" (SIBs), which are electrochemically similar to lithium-ion batteries and offer advantages like the higher abundance of sodium and cheaper production. Currently, however, the standard anode material in SIBs is graphite, which is thermodynamically unstable with sodium ions and leads to lower "reversible capacity" (a measure of how much charge can be stored) and poor performance.

To this end, researchers at Korea Maritime and Ocean University, Korea, set out to find a suitable non-graphite anode material for SIBs. Dr Jun Kang, the lead scientist, says, "Because SIBs have low performance--only 1/10th the capacity of a lithium-ion battery--it is crucial to find an efficient anode that retains graphite's low cost and stability."

Now, in their latest study published in the Journal of Power Sources, the scientists reported the following strategies to overcome the limitations of carbon-based anode materials for SIBs: (1) employing a hierarchical porous structure capable of promoting rapid Na+ transport from the bulk zone of the electrolyte to the interface of the active material; (2) retaining large specific surface areas so that Na+ migrating to the interface can easily access the active material; (3) retaining surface defects and pore structures that enable co-intercalation from the surface to the interior; (4) retaining nanostructures so that Na+ inserted into the active material through defects and pores has short diffusion paths; and (5) increasing the number of active sites through the extrinsic defects introduced by hetero-element doping. These strategies significantly improved the electrochemical performance of the battery, even surpassing that of current lithium-ion batteries.

In two of their previous studies, they successfully tested this method using phosphorus and sulfur, which were featured on the cover pages of Carbon and ACS Applied Materials & Interfaces, respectively.

Dr Kang is optimistic about the various potential applications of their technology, such as in electric propulsion ships and other vehicles, drones, and even high-performance CPUs. "These five factors afford good capacity retention, reversible capacity, ultrahigh cycling stability, high initial coulombic efficiency (80%), and remarkable rate capability. This means they can be used for a long time even with intense battery use," he explains.

Considering the advantages of sodium over lithium, these findings certainly have important implications for the engineering of sustainable, inexpensive, high-performance batteries and can take us a step closer to the realization of an energy-efficient future.

Credit: 
National Korea Maritime and Ocean University

Scientists show impact of human activity on bird species

image: The Golden Eagle is not under active conservation management in Great Britain and could be a candidate for higher prioritisation.

Image: 
Chris Gomersall (rspb-images.com)

Scientists have shown where bird species would exist in the absence of human activity under research that could provide a new approach to setting conservation priorities.

A study by Durham University, UK, in collaboration with the Royal Society for the Protection of Birds (RSPB), investigated how human activities such as agriculture, deforestation, and the drainage of wetlands have shaped where bird species are found in Great Britain today.

Researchers used data on the geographical distributions of bird species alongside simulation models to predict where bird species would exist today if the effects of human activities on the landscape were removed.
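
As a hedged sketch of what such a counterfactual projection can look like in practice, the code below fits a toy species distribution model with climate and land-use predictors and then re-projects it with the human land-use covariates set to assumed "natural" values. The data, predictors and parameter values are invented and do not represent the authors' models.

```python
# Toy sketch: fit a species distribution model with climate and land-use
# predictors, then project it with the human land-use covariates set to
# "natural" (human-free) values. Data and predictor names are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_cells = 5000

# Hypothetical predictors per grid cell
temperature = rng.normal(9, 3, n_cells)     # mean temperature (deg C)
rainfall = rng.normal(900, 250, n_cells)    # annual rainfall (mm)
farmland = rng.uniform(0, 1, n_cells)       # fraction of cell under agriculture
woodland = rng.uniform(0, 1, n_cells)       # fraction of cell wooded

# Hypothetical presence/absence of a woodland species
logit = -1 + 0.1 * temperature + 3 * woodland - 2 * farmland
presence = rng.random(n_cells) < 1 / (1 + np.exp(-logit))

X = np.column_stack([temperature, rainfall, farmland, woodland])
model = LogisticRegression(max_iter=1000).fit(X, presence)

# Counterfactual projection: remove agriculture, restore assumed woodland cover
X_natural = X.copy()
X_natural[:, 2] = 0.0          # no farmland
X_natural[:, 3] = 0.8          # assumed natural woodland fraction

current_range = model.predict_proba(X)[:, 1].sum()
natural_range = model.predict_proba(X_natural)[:, 1].sum()
print(f"expected occupied cells, current landscape  : {current_range:.0f}")
print(f"expected occupied cells, human-free scenario: {natural_range:.0f}")
```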

In this scenario there were winners and losers among different bird species due to the impact of humans.

The study found that 42 per cent of the 183 breeding bird species considered were more widely distributed today than they would be in a human-free world, particularly birds associated with farmland habitats.

Conversely, 28 per cent of species, particularly moorland and upland bird species, were much rarer today than they would be if they were not impacted by human activities.

The researchers say their findings, published in the journal Ecological Indicators, could apply to other parts of the world, too.

Co-lead author of the study, Dr Tom Mason, previously of Durham University's Department of Biosciences, but now based at the Swiss Ornithological Institute, said: "Our study suggests that farmland bird species, such as Turtle Dove and Grey Partridge, would be less widespread without the open habitats created by agriculture while moorland species, such as Golden Eagle and Greenshank, have probably been negatively affected by the long-term, extractive human use of moorlands by grazing, burning, hunting and forestry.

"We also found that species found in dense woodlands, such as Goshawk and Capercaillie, would be much more widespread in a 'human-free' Great Britain, which would be much more forested than the present day."

The study produced human-free range-size estimates which were compared against current bird distributions.

Conservation managers often use target population sizes, based on past distributions, to guide species' recovery programmes. The impact of conservation activities can then be assessed by comparing current numbers to historical numbers of bird species.

The authors argue that their approach, which instead uses simulated potential range-sizes as baselines, could complement indicators of short-term extinction risk such as the IUCN Red List of Threatened Species.

Professor Stephen Willis, of Durham University's Department of Biosciences, co-lead author on the study, said: "Our results could lead to reassessments of current conservation priorities. We identified 21 species that were not classified as threatened by the IUCN Red List for Great Britain but which had much smaller current distributions than we predict them to have in the absence of human activities.

"This suggests that their ranges are in a more degraded state than currently recognised. Some of these species, such as Greenshank, Golden Eagle and Whinchat, are not under active conservation management and could be candidates for higher prioritisation."

The researchers say that considering only recent changes in species' state can lead to a phenomenon known as "shifting baseline syndrome", where people set their expectations based on experiences in their lifetime.

This can underestimate the plight of species that have declined due to human activities a long time ago. Previous studies have tried to combat this by using historical baselines from past centuries, however historic species' records tend to be sparse and are difficult to apply across different species.

The study also identified 10 species that are not currently found in Great Britain, but which might have established themselves in the country in the absence of past human activities.

For example, the researchers say that the Kentish Plover, which has not bred in Great Britain since the middle of the 20th century, was projected to be found across south-east England in the absence of damaging human activity.

Similarly, the White-tailed Eagle, which currently has a small presence, and the Black-winged Stilt, which is a very rare breeder in Great Britain, would also be expected to be much more widespread than they are now.

The authors believe that their approach could be used to identify areas that are climatically suitable, but where habitat may have been degraded by human activities.

The RSPB plans to use this new research as one of a number of measures of how 'favourable' breeding bird populations are in Great Britain as it provides an objective measure against which to assess species' ranges or distributions.

This is important as it helps the RSPB to quantify Favourable Conservation Status and standardise methodology for its assessment. Favourable Conservation Status is an important political standard set for the conservation of migratory species under the Bonn Convention, and for habitats and species protected by EU Nature Directives.

Study co-author Dr Gillian Gilbert, of the RSPB, said: "This work could help to target where habitat and species restoration actions might lead to the return of historically lost species, or even to novel colonists."

The authors are now planning to extend their approach to also evaluate species' population sizes, which they hope will lead to unified population targets for species across Great Britain.

Credit: 
Durham University

Climate change in antiquity: mass emigration due to water scarcity

The absence of monsoon rains at the source of the Nile was the cause of migrations and the demise of entire settlements in the late Roman province of Egypt. This demographic development has now been compared with environmental data for the first time by Sabine Huebner, professor of ancient history at the University of Basel, revealing evidence of climate change and its consequences.

The oasis-like Faiyum region, roughly 130 km south-west of Cairo, was the breadbasket of the Roman Empire. Yet at the end of the third century CE, numerous formerly thriving settlements there declined and were ultimately abandoned by their inhabitants. Previous excavations and contemporary papyri have shown that problems with field irrigation were the cause. Attempts by local farmers to adapt to the dryness and desertification of the farmland - for example, by changing their agricultural practices - are also documented.

Volcanic eruption and monsoon rains

Basel professor of ancient history Sabine R. Huebner has now shown in the US journal Studies in Late Antiquity that changing environmental conditions were behind this development. Existing climate data indicates that the monsoon rains at the headwaters of the Nile in the Ethiopian Highlands suddenly and permanently weakened. The result was lower high-water levels of the river in summer. Evidence supporting this has been found in geological sediment from the Nile Delta, Faiyum and the Ethiopian Highlands, which provides long-term climate data on the monsoons and the water level of the Nile.

A powerful tropical volcanic eruption around 266 CE, which in the following year brought a below-average flood of the Nile, presumably also played a role. Major eruptions are known from sulfuric acid deposits in ice cores from Greenland and Antarctica, and can be dated to within three years. Particles hurled up into the stratosphere lead to a cooling of the climate, disrupting the local monsoon system.

New insights into climate, environment, and society

In the third century CE, the entire Roman Empire was hit by crises that are relatively well documented in the province of Egypt by more than 26,000 preserved papyri (documents written on sheets of papyrus). In the Faiyum region, these include records of inhabitants who switched to growing vines instead of grain or to sheep farming due to the scarcity of water. Others accused their neighbors of water theft or turned to the Roman authorities for tax relief. These and other adaptive strategies of the population delayed the death of their villages for several decades.

"Like today, the consequences of climate change were not the same everywhere," says Huebner. Although regions at the edge of the desert faced the harshness of the drought, others actually benefited from the influx of people moving from the abandoned villages. "New knowledge about the interaction of climate, environmental changes and social developments is very topical." The climate change of late antiquity was not, however - unlike today - caused mainly by humans, but was based on natural fluctuations.

Credit: 
University of Basel

First observation of the early link between proteins linked to Alzheimer's disease

image: State-of-the-art automatic brainstem segmentation methods were used to extract tau burden in its first aggregation site, that is, in the brainstem monoaminergic grey matter (bmGM). Beta-amyloid (Aβ) burden was extracted in the earliest cortical aggregation regions, i.e. in the bilateral medial superior frontal, inferior temporal, and fusiform areas.

Image: 
©ULiège/CRC In Vivo Imaging

A study conducted by researchers from the GIGA CRC In vivo Imaging laboratory at ULiège demonstrates, for the first time in humans, how the first deposits of tau proteins in the brainstem are associated with neurophysiological processes specific to the early stages of Alzheimer's disease development.

During the pre-clinical stages of Alzheimer's disease, i.e. when subtle changes are taking place in the brain but no cognitive symptoms can be observed, the cortex presents a state of transient hyperexcitability. To date, several studies conducted in animals have shown that tau and beta-amyloid proteins - central to the development of Alzheimer's disease - were associated with increased cortical excitability and dysfunction of brain networks. However, the relationship between the accumulation of Alzheimer's disease-related proteins and cortical hyperexcitability during the earliest stages of the disease remains poorly understood in humans, in particular due to technological limitations in the precise quantification of early protein deposition.

A study conducted by researchers from the Cyclotron Research Centre (CRC In Vivo Imaging / GIGA) of ULiège examined whether the first deposits of tau and beta-amyloid proteins in the brains of healthy individuals aged between 50 and 70 years old could be linked to a higher level of cortical excitability. "To do this, we combined different neuroimaging methodologies (magnetic resonance imaging, positron emission tomography) in order to characterise the quantity of tau and beta-amyloid proteins in their first agglomeration regions," explains Gilles Vandewalle, head of the laboratory. "That is to say, respectively, in the brainstem and in a series of upper cortical areas." In addition, the researchers measured the excitability of the participants' cortex in a non-invasive manner, using transcranial magnetic stimulation techniques in conjunction with the acquisition of electroencephalographic recordings.

The results of this study show that an increased amount of tau protein in the brainstem - its primary site of agglomeration - is specifically associated with a higher level of cortical excitability, while the researchers did not observe a significant relationship for the amount of beta-amyloid protein in the upper cortical areas. "These results constitute a first in vivo observation in humans of the early link between proteins linked to Alzheimer's disease and their impact on brain function," says Maxime Van Egroo, scientific collaborator at the CRC In Vivo Imaging and first author of the scientific article. "Furthermore, they suggest that measuring the hyperexcitability of the cortex could be a useful marker to provide information on the progress of certain cerebral pathological processes linked to Alzheimer's disease, and thus contribute to the early identification of people most at risk of developing the disease, well before the first cognitive symptoms appear."

Credit: 
University of Liège

Protein anchors as a newly discovered key molecule in cancer spread and epilepsy

Certain anchor proteins inhibit a key metabolic driver that plays an important role in cancer and developmental brain disorders. Scientists from the German Cancer Research Center (DKFZ) and the University of Innsbruck, together with a Europe-wide research network, discovered this molecular mechanism, which could open up new opportunities for personalized therapies for cancer and neuronal diseases. They published their results in the journal Cell.

The signaling protein MTOR (Mechanistic Target of Rapamycin) is a sensor for nutrients such as amino acids and sugars. When sufficient nutrients are available, MTOR boosts metabolism and ensures that sufficient energy and cellular building blocks are available. Since MTOR is a central switch for metabolism, errors in its activation lead to serious diseases. Cancers and developmental disorders of the nervous system leading to behavioral disorders and epilepsy can be the result if MTOR is malfunctioning.

Therefore, the cell controls MTOR activity very precisely with the help of so-called suppressors. These are molecules that inhibit a protein and help to regulate its activity. The TSC complex is such a suppressor for MTOR. It is named after the disease caused by its absence - tuberous sclerosis (TSC). The TSC complex is located together with MTOR at small structures in the cell, the so-called lysosomes, where it keeps MTOR in check. If the TSC complex - for example due to changes in one of its components - no longer remains at the lysosome, this can lead to excessive MTOR activity with severe health consequences.

Protein with an anchor function

The teams led by Christiane Opitz at DKFZ and Kathrin Thedieck at the University of Innsbruck therefore investigated how the TSC complex binds to lysosomes. They discovered that the G3BP proteins (Ras GTPase-activating protein-binding protein) are located together with the TSC complex on lysosomes. "There, the G3BP proteins form an anchor that ensures that the TSC complex can bind to the lysosomes," explains Mirja Tamara Prentzell of DKFZ, first author of the publication. This anchor function plays a crucial role in breast cancer cells. If the amount of G3BP proteins is reduced in cell cultures, this not only leads to increased MTOR activity, but also increases cell migration.

Drugs that inhibit MTOR prevent this spread, the researchers were able to show in cell cultures. In breast cancer patients, low levels of G3BP correlate with a poorer prognosis. "Markers like the G3BP proteins could be helpful to personalize therapies based on inhibition of MTOR," explains Kathrin Thedieck, professor of biochemistry at the University of Innsbruck. The good thing is that drugs that inhibit MTOR are already approved as cancer drugs and could be tested specifically in further studies.

G3BP proteins also inhibit MTOR in the brain. In zebrafish, an important animal model, the researchers observed disturbances in brain development when G3BP is absent. This leads to neuronal hyperactivity similar to epilepsy in humans. These neuronal discharges could be suppressed by drugs that inhibit MTOR. "We therefore hope that patients with rare hereditary neurological diseases in which dysfunctions of the G3BP proteins play a role could benefit from drugs against MTOR," says Christiane Opitz of DKFZ. In the future, the scientists plan to investigate this together with their Europe-wide research network.

Credit: 
German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ)