Earth

Targeted removals and enhanced monitoring can help manage lionfish in the Mediterranean

image: Groups of up to 18 volunteer SCUBA divers worked together to remove lionfish from the Marine Protected Area of Cape Greco

Image: 
Periklis Kleitou/University of Plymouth

Targeted removals can be effective in suppressing the number of invasive lionfish found within protected coastlines around the Mediterranean Sea.

However, if they are to really be successful they need to be combined with better long-term monitoring by communities and conservationists to ensure their timing and location achieve the best results.

Those are the key findings of a new study, one of the first of its kind to examine the effectiveness of targeted lionfish removals from both an ecological and a socio-economic perspective.

Scientists working as part of the European Union-funded RELIONMED project teamed up with specially trained divers and citizen scientists to conduct a series of removal events and surveys over a six-month period.

The work focused on three locations on the coast of Cyprus - the Zenobia shipwreck off Larnaca, and two popular diving sites within the Cape Greco Marine Protected Area - where divers removed between 35 and 119 lionfish per day at each site.

Divers then monitored those sites over several months, which showed that, in some locations, population numbers recovered within three months.

As a result, scientists writing in the journal Aquatic Conservation: Marine and Freshwater Ecosystems say such initiatives can undoubtedly be effective in reducing population numbers.

However, removals need to be carefully coordinated to ensure the lionfish are eliminated - potentially even deliberately overfished - in a manner that doesn't have negative impacts on other species.

The research was led by the University of Plymouth (UK) and the Marine and Environmental Research (MER) Lab (Cyprus). The two institutions have been collaborating for several years as part of the €1.6 million RELIONMED project, which aims to assess the history of the lionfish invasion in Cyprus and identify ways to minimise its future impact.

Periklis Kleitou, Research Assistant on the RELIONMED project and lead author on the study, said: "There are many changes happening within the Mediterranean as a consequence of human activity and climate change. The lionfish invasion has been one notable consequence of that, but this study shows there is a potential - albeit complex and challenging - solution. One of the interesting aspects of this work has been to see how the training improved divers' knowledge of the issue, and motivated them to support management efforts. That is without doubt something we can, and should, build on to ensure lionfish populations are managed sustainably now and in the future."

Lionfish first began populating the Mediterranean less than a decade ago, following the expansion of the Suez Canal and ocean warming.

The species was first recorded off the coast of Cyprus in 2014, and a lack of common predators - coupled with the lionfish's breeding habits - means numbers have increased dramatically, with sightings everywhere from coastlines to the deep sea.

The first targeted removals took place in May 2019, the approach having proved successful in areas previously invaded by lionfish, and they have been combined with education programmes about the threats the species poses and how it might be managed sustainably.

Professor of Marine Biology Jason Hall-Spencer, senior author on the current study and one of the core group of scientists that advises the International Programme on the State of the Ocean (IPSO), said: "This study demonstrates the complex nature of managing and protecting our ocean. Marine Protected Areas are undoubtedly beneficial in terms of biodiversity on the seabed, but they are also vulnerable to the spread of invasive species. Our ongoing research is showing the pivotal role citizens can play in monitoring and managing lionfish, but permitting divers to remove these fish using scuba gear will need to be applied with caution and strictly regulated to avoid illegal fishing. If implemented correctly, removal events could protect selected areas from the adverse effects of lionfish, while at the same time help to establish rich and deep links with local communities, strengthening responsibility and surveillance at corporate and social levels, and stimulating public environmental awareness."

Credit: 
University of Plymouth

Advanced bladder cancers respond to immunotherapy regardless of gene mutation status

image: UNC Lineberger's William Kim, MD, and colleagues report their research demonstrated that patients with advanced bladder cancers whose tumors have a mutated FGFR3 gene respond to immunotherapy in a manner that is similar to patients without that mutation. This discovery runs counter to previous research that suggested FGFR3-mutated bladder cancers should not be treated with immunotherapy.

Image: 
UNC Lineberger Comprehensive Cancer Center

CHAPEL HILL, North Carolina--A new study has demonstrated that patients with advanced bladder cancers whose tumors have a mutated FGFR3 gene respond to immunotherapy treatment in a manner that is similar to patients without that mutation, a discovery that runs counter to previous assumptions. This research, led by scientists at the University of North Carolina Lineberger Comprehensive Cancer Center, has important implications for patients who have not been offered immunotherapy because of their genetic profiles.

The findings are published in the British Journal of Cancer.

The National Cancer Institute estimates that 83,730 people in the United States will be diagnosed with bladder cancer in 2021, and the disease will cause 17,200 deaths. While the cancer is treatable when diagnosed early, the five-year survival rate is approximately 6 percent in advanced cases where the cancer has spread to other parts of the body. Within that low-survival group, about 15 percent of tumors have mutations in the FGFR3 gene, making the gene overactive and contributing to the disease's high mortality.

"Despite prior work suggesting that FGFR3-mutated bladder cancers should not be treated with immunotherapy, our study demonstrates the opposite, so we believe that immunotherapy should be offered without hesitation," said UNC Lineberger's William Y. Kim, MD, Rush S. Dickson Distinguished Professor of Medicine and professor of Genetics and the paper's corresponding author.

There have been several recent significant treatment advances for bladder cancer. In 2019, the FDA approved a drug, erdafitinib (Balversa), that targets FGFR3 and prolongs survival. Additionally, immune checkpoint blockade drugs, commonly known as immunotherapies, have recently been approved for advanced bladder cancer. Prior to this decade, treatment was primarily limited to systemic, platinum-based chemotherapy.

"Clinical trials have shown that bladder cancers with FGFR3 mutations have fewer immune cells, primarily T cells, than cancers without the mutation. Because tumors with low levels of immune cells tend to respond poorly to immune checkpoint blockades, it has been hypothesized that those patients would have low response rates to immunotherapy," said UNC Lineberger's Tracy Rose, MD, MPH, assistant professor at the UNC School of Medicine and the paper's co-first author.

To test the hypothesis, UNC Lineberger researchers designed a study to compare tumor tissue samples and clinical trial data from 17 patients with FGFR3-mutated bladder cancer against those from 86 patients whose tumors did not have the mutation. The investigators found that patients with FGFR3 mutations responded to immunotherapy as well as those without the mutations. At a cellular level, they also found equivalent diversities of T cell receptors and a similar balance of immune-suppression and immune-activation signals in tumors with and without FGFR3 mutations. This equivalency, or balance, indicates a similar chance of benefiting from immunotherapy.
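For readers wanting a concrete picture of this kind of two-group comparison, the sketch below tests whether response rates differ between a 17-patient and an 86-patient group using Fisher's exact test. The response counts are invented for illustration; this is not the study's actual analysis or data.

```python
# Hypothetical sketch: comparing immunotherapy response rates between a
# 17-patient FGFR3-mutant group and an 86-patient wild-type group.
# The counts below are invented, not the study's data.
from scipy.stats import fisher_exact

#            [responders, non-responders]
mutant    = [4, 13]   # 17 FGFR3-mutated patients (hypothetical split)
wild_type = [20, 66]  # 86 patients without the mutation (hypothetical split)

odds_ratio, p_value = fisher_exact([mutant, wild_type])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
# A large p-value is consistent with similar response rates in the two
# groups, which is the pattern the study reported.
```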

"The standard of care for advanced bladder cancer is getting pretty complicated, but having more options is a good thing," Kim said. "Today, most patients will get chemotherapy and then, if needed, FGFR3-altered tumors can be treated with either erdafitinib or immunotherapy."

The researchers hope to establish a clinical trial to test whether patients with FGFR3 alterations benefit more from erdafitinib or immunotherapy.

"Our study does not rule out the possibility that erdafitinib will synergize with immunotherapy," said William Weir, co-first author and an MD-PhD student at UNC-Chapel Hill. "If anything, the fact that FGFR3-altered patients benefit from immunotherapy argues that this may be a reasonable approach."

Credit: 
UNC Lineberger Comprehensive Cancer Center

Land repair vital for survival

image: Flinders University researcher Dr Martin Breed

Image: 
Flinders University

Restoration of degraded drylands is urgently needed to mitigate climate change, reverse desertification and secure livelihoods for the two billion people who live there, experts warn in a major new paper in Nature Ecology & Evolution.

Scientists leading the Global Arid Zone Project examined restoration seeding outcomes at 174 sites on six continents, encompassing 594,065 observations of 671 plant species - with the lessons learned important to meeting ambitious future restoration targets.

Flinders University's Dr Martin Breed, one of three Australian researchers who helped coordinate data collection for the live database, says the new paper is really important for Australia.

"Much of Australia is drylands and huge areas of these drylands in Australia are degraded," he says. "They have been cleared, unsustainably farmed, burnt in megafires and generally not taken good care of. Accordingly, vast areas of our drylands are now being restored to help return biodiversity and supply many important ecosystem services, such as clean air and water, supporting our good mental health and increasing agricultural productivity.

"This restoration usually requires revegetation, primarily via re-seeding areas.

"The scale of this re-seeding effort globally is truly enormous, encouraged no less by the UN Decade on Ecosystem Restoration that started this year. The annual budgets involved are in the 10s to 100s billions of dollars," Dr Breed says.

"This research lays a solid foundation to innovate new ways to effectively re-seed areas.

"It brings together a global understanding of how effective this re-seeding is in these drylands. It shows that re-seeding generally works - if you sow the seeds, the plant has a good chance of being there in the future.

"However, re-seeding is really risky, with almost 20% of seeding events failing. Alarmingly, this risk increased as areas were more arid - and with the realised and predicted increases in aridity with climate change, this does not bode well for seed-based restoration in drylands."

The Nature Ecology & Evolution article says there are reasons for optimism, even though the global targets set for dryland restoration to restore millions of hectares of degraded land have been questioned as overly ambitious.

But without a global evaluation of successes and failures it is impossible to gauge feasibility, the researchers say, pointing out:

Seeding had a positive impact on species presence: in almost a third of all treatments, 100% of species seeded were growing at first monitoring.

However, dryland restoration is risky: 17% of projects failed, with no establishment of any seeded species, and consistent declines were found in seeded species as projects matured.

Across projects, higher seeding rates and larger seed sizes resulted in a greater probability of recruitment, with further influences on species success including site aridity, taxonomic identity and species life form.

The findings suggest that investigations examining these predictive factors will yield more effective and informed restoration decision-making.
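As an illustration of how such predictive factors could feed into decision-making, the sketch below fits a logistic model of establishment probability against seeding rate, seed size and site aridity. All variable names, units, and effect sizes are assumptions for demonstration on synthetic data; this is not the Global Arid Zone Project's model.

```python
# Illustrative sketch (synthetic data): modelling the probability that a
# seeded species establishes, as a function of seeding rate, seed size
# and site aridity. Not the Global Arid Zone Project's actual analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
seeding_rate = rng.uniform(100, 5000, n)   # seeds per square metre (assumed units)
seed_size = rng.lognormal(0, 1, n)         # mg per seed (assumed units)
aridity = rng.uniform(0.05, 0.5, n)        # aridity index (assumed scale)

# Synthetic outcome: establishment is more likely with more and larger
# seeds, and less likely at more arid sites (mirroring the findings).
logit = 0.0005 * seeding_rate + 0.4 * seed_size - 8 * aridity - 0.5
establishment = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([seeding_rate, seed_size, aridity])
model = LogisticRegression(max_iter=1000).fit(X, establishment)
print(model.coef_)  # fitted signs should mirror the assumed effects
```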

Credit: 
Flinders University

Infrared held in a pincer

Many applications, from fiber-optic telecommunications to biomedical imaging processes require substances that emit light in the near-infrared range (NIR). A research team in Switzerland has now developed the first chromium complex that emits light in the coveted, longer wavelength NIR-II range. In the journal Angewandte Chemie, the team has introduced the underlying concept: a drastic change in the electronic structure of the chromium caused by the specially tailored ligands that envelop it.

Many materials that emit NIR light are based on expensive or rare metal complexes. Cheaper alternatives that emit in the NIR-I range between 700 and 950 nm have been developed but NIR-II-emitting complexes of non-precious metals remain extremely rare. Luminescence in the NIR-II range (1000 to 1700 nm) is, for example, particularly advantageous for in vivo imaging because this light penetrates very far into tissues.

The luminescence of complexes is based on the excitation of electrons, through the absorption of light, for example. When an excited electron drops back down to its ground state, part of the energy is emitted as radiation. The wavelength of this radiation depends on the energy difference between the electronic states. In complexes, this difference is significantly determined by the type and arrangement of the ligands bound to the metal.

In typical chemical (covalent) bonds, each partner brings one electron to share in a bonding pair; in many complexes both of the electrons come from the ligand. However, the line between these types of bonds is fluid: metal-ligand bonds can have partial covalent character (nephelauxetic effect). As a consequence, the energy of certain excited states is reduced, giving the emitted radiation a longer wavelength. This has been observed for polypyridine ligands, which cause the ruby red emission of trivalent chromium (Cr(III)) in complexes to shift into the NIR-I range.
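The link between emission wavelength and energy gap quoted above is the standard photon-energy relation; applying it to the NIR-II window mentioned earlier gives the target energies (a worked restatement, not a result of the study):

```latex
% Photon energy versus wavelength, applied to the NIR-II window (1000-1700 nm):
\[
E = \frac{hc}{\lambda}, \qquad
E_{1000\,\mathrm{nm}} \approx 1.24\ \mathrm{eV}, \qquad
E_{1700\,\mathrm{nm}} \approx 0.73\ \mathrm{eV}
\]
% Shifting emission from NIR-I into NIR-II therefore means lowering the gap
% between the emitting excited state and the ground state to roughly 0.7-1.2 eV.
```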

In order to increase the covalence of the metal-ligand bond and further increase the wavelength, Narayan Sinha in a team led by Claude Piguet and Oliver S. Wenger at the Universities of Basel and Geneva (Switzerland) switched from classic polypyridine ligands to a newly tailored, charged, tridentate chelate ligand. The term chelate is derived from the Greek word for the pincer of a crab, and tridentate means that the ligand has three binding sites with which it grabs the central metal ion like a pincer.

In the resulting new complex, the Cr(III) ion is surrounded on all sides by two tridentate charged chelate ligands to form an octahedral shape. This results in a drastically altered, unusual electronic structure with a high electron density on the Cr(III). In the axial direction, charge transfer takes place from the ligands to the metal, but in the equatorial plane of the octahedron, charge transfer moves from the metal to the ligands. The combined "push" and "pull" interactions likely have a strong influence on the spectroscopically relevant electrons of the Cr(III)--the key to the NIR-II emissions of the new complex.

Credit: 
Wiley

Visualizing a city's energy use

image: Simulated annual energy use intensity (EUI) of the commercial buildings in the urban building energy model (UBEM) for Pittsburgh, Pennsylvania.

Image: 
Bilec Built Environment and Sustainable Engineering Group

The building sector in the U.S. accounts for 39 percent of energy use, with commercial buildings responsible for about half of that. As cities grapple with climate change, making commercial buildings more efficient is a key part of the solution.

Researchers at the University of Pittsburgh Swanson School of Engineering and the Mascaro Center for Sustainable Innovation used the City of Pittsburgh to create a model built upon the design, materials and purpose of commercial buildings to estimate their energy usage and emissions. While other models may be hindered by a scarcity of data in public records, the researchers' Urban Building Energy Model (UBEM) uses street-level images to categorize and estimate commercial buildings' energy use. Their findings were recently published in the journal Energy & Buildings.

"We found that in the existing literature, the scale of commercial buildings was always one of the challenges. It's cumbersome or even impossible to find and process detailed information about hundreds or thousands of buildings in an urban environment," said Rezvan Mohammadiziazi, lead author and graduate student in the Swanson School's Department of Civil and Environmental Engineering. "Researchers need to rely on assumptions based on when buildings were built or what the mechanical and electrical systems look like. Our hope is that by using image processing, we can build a framework that reduces some assumptions."

The researchers used publicly available Geographic Information System (GIS) data and street-level images to develop their UBEM, then created 20 archetypes of buildings spanning eight commercial use types. The buildings were sorted into these groups based on categories including use type and construction period.

Using street-level images to determine building material, window-to-wall ratio and number of floors, and LiDAR data from the U.S. Geological Survey to determine building height, the researchers were able to simulate and map the annual energy use intensity of 209 structures in Pittsburgh. When they validated their findings against other publicly available data, the error rate was just 7 percent. Though the framework is currently guided mostly by researchers inspecting the images by hand, the team hopes it can eventually take advantage of machine learning to analyze and categorize building images more quickly.
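As a rough illustration of that validation step, the snippet below compares simulated annual energy use intensity (EUI) per archetype against reference values and reports the percent error. The archetype names and numbers are invented, not the study's data or code.

```python
# Hypothetical sketch of the validation step: comparing simulated annual
# energy use intensity (EUI) against reference benchmarks per archetype.
# All values are invented for illustration.
simulated_eui = {"office_pre1980": 92.0, "retail_post2000": 61.5}   # kWh/m2/yr
reference_eui = {"office_pre1980": 97.5, "retail_post2000": 58.0}   # kWh/m2/yr

for archetype, sim in simulated_eui.items():
    ref = reference_eui[archetype]
    error = abs(sim - ref) / ref * 100  # absolute percent error vs reference
    print(f"{archetype}: {error:.1f}% error")
```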

The focus on commercial buildings, as well, was an important addition to the field of research.

"A lot of good work has already been done in this field, but there are fewer studies focusing on commercial buildings, because data about them is more difficult to capture than residential buildings. They're bigger and have more diverse uses," explained Mohammadiziazi. "We wanted to determine if an urban model for commercial buildings could be accurate based on acceptable errors, and it was."

While one goal was to create a framework that other researchers could replicate and build upon, another was to help the City of Pittsburgh meet its ambitious energy goals. Pittsburgh has joined 22 other U.S. cities as a 2030 District, pledging to reduce energy use, water consumption and transportation emissions by 50 percent by the year 2030. By creating a tool to estimate current usage, the research can aid policy makers in setting energy goals and efficiency regulations. The University of Pittsburgh is a member of the 2030 District and has also pledged to be carbon neutral by 2037 to mark Pitt's 250th anniversary. Pitt Assistant Professor Melissa Bilec has been involved in the Pittsburgh 2030 District since its inception.

"We are fortunate--and have worked diligently--to have a strong partnership with the City of Pittsburgh, along with our own University of Pittsburgh's Facilities Management. The Mascaro Center for Sustainable Innovation values and fosters our internal and external partners," said Bilec, who is also Roberta A. Luxbacher Faculty Fellow in the department of civil and environmental engineering, and deputy director of the Mascaro Center for Sustainable Innovation. "We will not meet or exceed our climate and energy goals without aggressive action and solid planning in the building sector. Models, such as the one we created, are intended to aid in the planning process to meet our goals."

Credit: 
University of Pittsburgh

'Golden nail': Quarry near Salzgitter becomes global geological reference point

image: Salzgitter-Salder: A perfect rock boundary sequence over 40 metres.

Image: 
Photo: Silke Voigt, Goethe University Frankfurt

FRANKFURT/HANNOVER. The international team of geoscientists led by Prof. Silke Voigt from the Goethe University Frankfurt, Prof. Ireneusz Walaszczyk from the University of Warsaw and Dr André Bornemann from LBEG has thoroughly investigated 40 metres of the geological strata sequence in the former limestone quarry at Hasselberg. The researchers determined that this is the only known sequence across the Turonian-Coniacian transition without gaps, and it therefore represents a perfect rock sequence to serve geoscientists from all over the world as a reference for their research - a "Global Stratotype Section and Point (GSSP)" or, in the jargon of the geosciences, a "golden nail".

A certain group of bivalve mollusks of the family Inoceramidae first appeared in the Coniacian and is found in large numbers in Salder. In Bed 46 of the quarry, the German-Polish scientific team found the oldest appearance of the inoceramid species Cremnoceramus deformis erectus, which marks the time boundary. Careful studies also revealed other microfossils and a characteristic change in the ratio of the carbon isotopes 12C and 13C, a so-called negative anomaly in the carbon cycle.

"This means that variable geological sequences, such as marine shelf sediments in Mexico or the deep sea in the tropical Atlantic, can now be compared and classified in time," explains Prof. Silke Voigt. "This is important in order to be able to make an exact chronological classification even in the case of incomplete successions and ultimately to see, for example, what the climate was like at a certain time in the past in different places in the world."

Professor Ireneusz Walaszczyk says: "The sequence in Salzgitter-Salder prevails over other candidates, for example from the USA, India, Madagascar, New Zealand and Poland, because we have a perfect rock boundary sequence here over 40 metres, with a well-defined record of events which took place in this interval of geological time."

"The Zechstein Sea left behind massive salt layers in the North German Basin more than 250 million years ago," explains André Bornemann. "The rock layers deposited later exerted pressure on these salt layers, some of which bulged up into large salt domes, deforming younger layers in the process. Salder is located near such a salt dome, so that here the fossil-rich rock layers of the Cretaceous period are steeply upright, resulting in a wonderful profile that is very accessible for scientific investigations. That's why we at LBEG have designated this place as a geotope, and this is one of the most important geopoints of the Harz-Braunschweiger Land-Ostfalen UNESCO Global Geopark."

Background:

In the limestone quarry at Hasselberg near Salder in the north-east of the Salzgitter mountain range, limestone and marl used to be quarried for the cement industry and later for ore processing. Today, it is the location of a well-known biotope and geotope that is the property of the Stiftung Naturlandschaft (Natural Landscape Foundation), established by the BUND regional association of Lower Saxony. While the care of the quarry site has been entrusted to the Salzgitter district group of BUND, the Harz-Braunschweiger Land-Ostfalen UNESCO Global Geopark looks after the geoscientific part of the quarry. The quarry is not freely accessible for nature conservation reasons, but guided walks are occasionally offered.

90 million years ago, in the second half of the Cretaceous, it was tropically warm on Earth: the ice-free poles ensured high sea levels, and Central Europe consisted of a cluster of islands. In the sea, ammonites developed a tremendous variety of forms, while dinosaurs reigned on land. The first flowering plants began to compete with horsetails and ferns. About 89.39 million years ago, the climate began to cool slightly, sea levels began to fall, and a new period in Earth history, the Coniacian, replaced the Turonian.

Credit: 
Goethe University Frankfurt

Informing policy for long-term global food security

More than 820 million people in the world don't have enough to eat, while climate change and increasing competition for land and water are further raising concerns about the future balance between food demand and supply. The results of a new IIASA-led study can be used to benchmark global food security projections and inform policy analysis and public debate on the future of food.

Despite the fact that food supply has increased dramatically since the 1960s, the question of how to eradicate global hunger - one of the Sustainable Development Goals - and feed the growing world population in years to come, remains a major challenge. Climate change and increasing competition for land and water are further exacerbating the problem, making the need for effective policies to ensure global food security and a better understanding of the main driving forces of global hunger ever more urgent.

Scientists typically use quantified global scenarios and projections to assess long-term future global food security under a range of socioeconomic and climate change scenarios. However, due to differences in model design and scenario assumptions, there is uncertainty about the range of food security projections and outcomes. To address this uncertainty, IIASA guest researcher Michiel van Dijk and colleagues conducted a systematic literature review and meta-analysis to assess the range of future global food security projections to 2050. Their study, which has been published in the journal Nature Food, focused on two key food security indicators: future food demand, which is a key driver of a required increase in food production, and associated impacts on land use change, biodiversity and climate change, and population at risk of hunger - an indicator of the number of people that face chronic food insecurity.

"Our study aimed to determine the range of future global food demand and population at risk of hunger projections to 2050. To answer this question, we analyzed 57 studies published between 2000 and 2018. We harmonized all projections and mapped them into the five highly divergent but plausible socioeconomic futures, including sustainable, business-as-usual, divided world, inequality, and conventional development scenarios," van Dijk explains.

The study's findings provide strong support for the view that food demand will increase by between 35% and 56% over the period 2010-2050, mainly due to population growth, economic development, urbanization, and other drivers. If climate change is taken into account, the range shifts slightly, but with no statistically significant differences overall. Although less dramatic than the doubling of current production commonly cited in other studies, the increase in demand may still have negative impacts on the environment and lead to biodiversity loss. In order to prevent such impacts, increases in food production would need to be accompanied by policies and investments that promote sustainable intensification and incorporate ecological principles in agricultural systems and practices, while also reducing food loss and waste and encouraging a shift towards more plant-based diets.

In the most negative scenarios, the population at risk of hunger is expected to increase by 8% (30% when the impact of climate change is considered) over the 2010-2050 period, which implies that the Sustainable Development Goal of ending hunger and achieving food security will not be achieved. To prevent this, the researchers urge policymakers to work proactively to develop adequate long-term measures, including stimulating inclusive growth.

"Our study can fuel the public debate on the future of food by inviting every citizen to imagine and discuss a wider range of food future scenarios, rather than just a binary choice between business-as-usual and the universal adoption of organic agriculture or vegan diets. To think responsibility and creatively about the future, we need to envision multiple plausible scenarios and evaluate their consequences," notes study co-author Yashar Saghai, a researcher at the University of Twente, Enschede, the Netherlands.

Although the study did not explicitly investigate the impacts of the COVID-19 pandemic, the researchers say it is plausible that their range also includes the now more likely negative COVID-induced futures, which are associated with an increase in the population at risk of hunger rather than the decrease of around 50% that was considered business-as-usual before COVID.

"While it is too early to oversee and understand the full impact and consequences of the coronavirus pandemic, current developments show some resemblance to the most negative archetype scenarios in our analysis, which is characterized by slow economic development, a focus on domestic security and national sovereignty, and increasing inequality. This implies a potential significant increase in the number of population at risk of hunger between 2010 and 2050 in the worst case. Recent developments, underscore the need for (quantitative) scenario analysis and comparison as a tool to inform policy analysis, coordination, and planning for the future of food as well as wider societal issues," van Dijk concludes.

Credit: 
International Institute for Applied Systems Analysis

Cannabidiol promotes oral ulcer healing by inactivating CMPK2-mediated NLRP3 inflammasome

Alexandria, Va., USA - Xingying Qi, West China Hospital of Stomatology, Sichuan University, Chengdu, China, presented the oral session "Cannabidiol Promotes Oral Ulcer Healing by Inactivating CMPK2-Mediated NLRP3 Inflammasome" at the virtual 99th General Session & Exhibition of the International Association for Dental Research (IADR), held in conjunction with the 50th Annual Meeting of the American Association for Dental Research (AADR) and the 45th Annual Meeting of the Canadian Association for Dental Research (CADR), on July 21-24, 2021.

Oral ulcers are common inflammatory lesions that cause severe pain, but little effective treatment is currently available. Cannabidiol (CBD) is emerging as a therapeutic agent for inflammatory diseases, but the underlying mechanisms are not fully elucidated. Qi and colleagues sought to investigate whether and how CBD could play a therapeutic role in oral ulcers. Oral ulcer models were established on the tongues of C57BL/6 mice by acid etching or mechanical trauma, followed by local administration of CBD. Samples were harvested for macroscopic and histological evaluation.

CBD sprayed onto acid- or trauma-induced oral ulcers on mice tongues inhibited inflammation, relieved pain and accelerated lesion closure in a dose-dependent manner. The results show that CBD accelerates oral ulcer healing by inhibiting CMPK2-mediated NLRP3 inflammasome activation and pyroptosis, an effect mediated mostly by PPARγ in the nucleus and partially by CB1 in the plasma membrane. These data may shed light on the development of new therapeutic strategies for oral ulcers.

Credit: 
International Association for Dental, Oral, and Craniofacial Research

Cardio-cerebrovascular disease history complicates hematopoietic cell transplant outcomes

image: Transplanting hematopoietic stem cells to treat cancers and other conditions carries with it the risk of developing cardio-cerebrovascular diseases (CCVD)--disorders affecting the blood vessels of the heart and brain. Although research on post-transplant CCVD is extensive, there is a paucity of knowledge on the effects of pre-transplant CCVD on transplant outcomes. Now researchers in China suggest pre-transplant CCVD indirectly affects patient mortality and survival following transplant through its strong association with post-transplant CCVD.

Image: 
Chinese Medical Journal

Hematopoietic cell transplantation (HCT) is a recognized treatment option for certain blood and bone marrow cancers as well as some autoimmune and hereditary disorders. Performed to replace or modulate the body's malfunctioning hematopoietic system (which produces blood cells) or a compromised immune system following a medical condition or treatment, HCT can be autologous or allogenic. In autologous HCT, a patient's own stem cells are injected into the bloodstream, while in allogenic HCT donor stem cells are used.

Although a difficult procedure, over the years, the safety of HCT has been improved by providing enhanced postoperative support and carefully evaluating the comorbidities of HCT recipients preoperatively. Among the different risk factors for non-relapse mortality (NRM) used to determine the HCT-specific comorbidity index of patients, cardio-cerebrovascular diseases (CCVD)--diseases of the vessels supplying blood to the heart and brain--feature prominently, with an index score of 1. In fact, HCT recipients may experience up to 5.6-fold increased risk of CCVD, including coronary artery disease, cerebrovascular disease, and heart failure, and almost 4.0-fold increased risk of cardiovascular diseases-specific mortality. And the risk of post-transplant CCVD has been shown to increase following both autologous HCT and allogenic HCT.

However, most of the existing research is limited to post-transplant CCVD. As Professor Dai-Hong Liu of The Fifth Medical Center of Chinese PLA General Hospital, Beijing puts it, "Little is known about the relationship between pre-transplant CCVD and the outcomes of HCT."

To address this gap, researchers in China led by Prof. Liu conducted a retrospective study on patients receiving HCT between November 2013 and January 2020. Twenty-three patients were enrolled in the study based on their history of symptomatic arrhythmia, cardiomyopathy, congestive heart failure, valvular heart disease, coronary artery disease, ischemic attack, cerebral infarction, or cerebral artery stenosis. Constituting the pre-transplant CCVD group, these patients were compared for cardiovascular complications and HCT outcomes with 107 age- and disease status-matched controls who showed no symptoms indicative of pre-transplant CCVD. The researchers reviewed participants' electronic medical records to determine the complications and computed hazard ratios for two primary endpoints--post-transplant CCVD (defined according to the Common Terminology Criteria for Adverse Events version 5.0) and NRM. As secondary endpoints, the researchers also assessed neutrophil and platelet engraftment, length of stay in the laminar flow ward, relapse, and overall survival. Their study was published in Chinese Medical Journal.

Their findings were interesting. They observed no significant differences in neutrophil and platelet engraftment or relapse between the pre-transplant CCVD group and the control group. The two groups also did not significantly differ in overall survival at the time of the last follow-up. In fact, the cumulative incidences of 2-year NRM were similar between patients with pre-transplant CCVD and the control group patients. However, subgroup analysis revealed that among the pre-transplant CCVD patients, those receiving donor stem cells had inferior overall survival, higher NRM, and longer hospital stay when compared with patients receiving their own stem cells, attesting to the superior safety of autologous HCT over allogenic HCT.

The most interesting finding of their study, however, was the identification of a strong association between pre-transplant CCVD and the occurrence of post-transplant CCVD, with an estimated hazard ratio of 12.50. Although pre-transplant CCVD did not directly affect the measured outcomes of HCT in their analysis, the researchers believe the strong association between pre- and post-transplant CCVD indicates that cardio-cerebrovascular diseases before HCT have pronounced effects on post-transplant survival and mortality, given that post-transplant CCVD was independently associated with high NRM and inferior survival in the study.

Indeed, a sobering discovery. Prof. Liu explains, "Our findings suggest that post-transplant CCVD, which still remains the most important factor affecting transplant outcomes, is promoted by the combined effects of pre-transplant CCVD and toxicity. While the current evaluation system for pre-transplant comorbidities is still reliable, we cannot ignore transplant-related organ injuries and treatment-related toxicities, which may be compounded by pre-transplant CCVD. This is particularly important for allogenic HCT cases."

The researchers stress the need for further studies to investigate the risk pre-transplant CCVD poses for post-transplant CCVD, so that a robust evaluation system can be developed to reduce mortality following transplants. Given the increasing incidence of HCT worldwide and the associated cardio-cerebrovascular complications, this does indeed seem to be the need of the hour.

Credit: 
Cactus Communications

Artificial intelligence models to analyze cancer images take shortcuts that introduce bias

Artificial intelligence and deep learning models are powerful tools in cancer treatment. They can be used to analyze digital images of tumor biopsy samples, helping physicians quickly classify the type of cancer, predict prognosis and guide a course of treatment for the patient. However, unless these algorithms are properly calibrated, they can sometimes make inaccurate or biased predictions.

A new study led by researchers from the University of Chicago shows that deep learning models trained on large sets of cancer genetic and tissue histology data can easily identify the institution that submitted the images. The models, which use machine learning methods to "teach" themselves how to recognize certain cancer signatures, end up using the submitting site as a shortcut to predicting outcomes for the patient, lumping them together with other patients from the same location instead of relying on the biology of individual patients. This in turn may lead to bias and missed opportunities for treatment in patients from racial or ethnic minority groups who may be more likely to be represented in certain medical centers and already struggle with access to care.

"We identified a glaring hole in the in the current methodology for deep learning model development which makes certain regions and patient populations more susceptible to be included in inaccurate algorithmic predictions," said Alexander Pearson, MD, PhD, assistant Assistant Professor of Medicine at UChicago Medicine and co-senior author. The study was published July 20, in Nature Communications.

One of the first steps in treatment for a cancer patient is taking a biopsy, or small tissue sample of a tumor. A very thin slice of the tumor is affixed to a glass slide, which is stained with multicolored dyes for review by a pathologist to make a diagnosis. Digital images can then be created for storage and remote analysis by using a scanning microscope. While these steps are mostly standard across pathology labs, minor variations in the color or amount of stain, tissue processing techniques and imaging equipment can create unique signatures, like tags, on each image. These location-specific signatures aren't visible to the naked eye, but are easily detected by powerful deep learning algorithms.

These algorithms have the potential to be a valuable tool for allowing physicians to quickly analyze a tumor and guide treatment options, but the introduction of this kind of bias means that the models aren't always basing their analysis on the biological signatures they see in the images, but rather on the image artifacts generated by differences between submitting sites.

Pearson and his colleagues studied the performance of deep learning models trained on data from the Cancer Genome Atlas, one of the largest repositories of cancer genetic and tissue image data. These models can predict survival rates, gene expression patterns, mutations, and more from the tissue histology, but the frequency of these patient characteristics varies widely depending on which institutions submitted the images, and the model often defaults to the "easiest" way to distinguish between samples - in this case, the submitting site.

For example, if Hospital A serves mostly affluent patients with more resources and better access to care, the images submitted from that hospital will generally indicate better patient outcomes and survival rates. If Hospital B serves a more disadvantaged population that struggles with access to quality care, the images that site submitted will generally predict worse outcomes.

The research team found that once the models identified which institution submitted the images, they tended to use that as a stand-in for other characteristics of the image, including ancestry. In other words, if the staining or imaging techniques for a slide looked like it was submitted by Hospital A, the models would predict better outcomes, whereas they would predict worse outcomes if it looked like an image from Hospital B. Conversely, if all patients in Hospital B had biological characteristics based on genetics that indicated a worse prognosis, the algorithm would link the worse outcomes to Hospital B's staining patterns instead of things it saw in the tissue.

"Algorithms are designed to find a signal to differentiate between images, and it does so lazily by identifying the site," Pearson said. "We actually want to understand what biology within a tumor is more likely to predispose resistance to treatment or early metastatic disease, so we have to disentangle that site-specific digital histology signature from the true biological signal."

The key to avoiding this kind of bias is to carefully consider the data used to train the models. Developers can make sure that different disease outcomes are distributed evenly across all sites in the training data, or isolate a given site during training or testing when the distribution of outcomes is unequal. The result will be more accurate tools that give physicians the information they need to quickly diagnose and plan treatments for cancer patients.
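One concrete way to implement the isolation described above is a group-aware split, in which every slide from a given submitting site falls entirely on one side of the train/test divide, so the model cannot exploit site-specific staining signatures at evaluation time. The sketch below uses scikit-learn's GroupShuffleSplit with placeholder arrays; it illustrates the general technique, not the study's code.

```python
# Sketch of a site-aware train/test split: all slides from a given
# submitting site land entirely in train or entirely in test.
# Arrays are placeholders, not the study's data.
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(0)
X = rng.random((1000, 512))             # slide feature vectors (placeholder)
y = rng.integers(0, 2, 1000)            # outcome labels (placeholder)
sites = rng.integers(0, 20, 1000)       # submitting-site IDs (placeholder)

splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
train_idx, test_idx = next(splitter.split(X, y, groups=sites))

# No site appears on both sides of the split.
assert set(sites[train_idx]).isdisjoint(sites[test_idx])
```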

"The promise of artificial intelligence is the ability to bring accurate and rapid precision health to more people," Pearson said. "In order to meet the needs of the disenfranchised members of our society, however, we have to be able to develop algorithms which are competent and make relevant predictions for everyone."

Credit: 
University of Chicago Medical Center

Silicon with a two-dimensional structure

image: Changes from the natural tetrahedral structure of silicon (top left) to an unusual square planar geometry (bottom right).

Image: 
Illustration: Ebner/Greb (Heidelberg)

Silicon, a semi-metal, bonds in its natural form with four other atoms, and its three-dimensional structure takes the form of a tetrahedron. For a long time, it seemed impossible to achieve the synthesis and characterisation of a two-dimensional equivalent - geometrically speaking, a square. Now scientists from the field of Inorganic Chemistry at Heidelberg University have succeeded in producing a crystalline complex with such a configuration. PD Dr Lutz Greb from the Institute of Inorganic Chemistry underlines that it has surprising physical and chemical properties and, in the field of molecular chemistry, will open up new approaches to using the second most abundant element in the Earth's crust for catalysis and materials research.

As a classical semi-metal, silicon possesses properties of both metals and non-metals, and belongs to the carbon group of the periodic table. Like carbon, silicon forms bonds with four partners. Its three-dimensional structure then corresponds to a tetrahedron, a body with four faces. Due to the high stability of the tetrahedron, other structures are not known in natural silicon with four bonds - silicon(IV) for short. Considered purely geometrically, the two-dimensional equivalent of a tetrahedron is a square. Such configurations are already known for carbon but, according to Dr Greb, a square-planar structure had not yet been produced in silicon(IV) chemistry, even after more than 40 years of intensive effort.

Dr Greb's working group has succeeded for the first time in synthesising and completely characterising a square-planar silicon(IV) species. It was possible to show this with the aid of X-ray crystallography. The scientists grew a monocrystal which they irradiated with a finely focused beam of X-rays. The diffraction of the X-rays by the atoms of the monocrystal produced an unmistakable pattern from which the positions of the atomic nuclei can be calculated. This measurement enabled the researchers to show that they were dealing with molecules of square-planar silicon(IV). Further studies with spectroscopic methods supported this configuration. The complex displays physical and chemical properties that the researchers did not expect, e.g. colour in a naturally colourless class of substances.
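For reference, the diffraction condition underlying such a measurement is Bragg's law, standard crystallographic background rather than a result of this study:

```latex
% Bragg condition relating X-ray wavelength (lambda), lattice-plane spacing (d)
% and diffraction angle (theta). Peak positions yield the spacings d, and from
% many reflections the positions of the atomic nuclei in the unit cell are
% reconstructed.
\[
n\lambda = 2d\sin\theta, \qquad n = 1, 2, 3, \ldots
\]
```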

"Synthesising this configuration from the components we chose is comparatively simple once you have understood the key conditions," explains Dr Fabian Ebner, who is meanwhile a postdoctoral researcher at the Institute of Inorganic Chemistry. It surprised the scientists, however, that the square-planar silicon(IV) molecule constitutes a stable, isolable compound at all. "Due to the high reactivity, there are many conceivable ways of decomposition. Still, we have always believed that it is possible to isolate this compound," Dr Greb emphasises.

Lutz Greb received a Starting Grant from the European Research Council (ERC) for his research in the area of structurally constrained main-group elements. The Foundation of German Business supported Fabian Ebner's work. The research results were published in the journal Chem.

Credit: 
Heidelberg University

Stanford researchers develop tool to drastically speed up the study of enzymes

image: HT-MEK -- short for High-Throughput Microfluidic Enzyme Kinetics -- combines microfluidics and cell-free protein synthesis technologies to dramatically speed up the study of enzymes.

Image: 
Daniel Mokhtari

For much of human history, animals and plants were perceived to follow a different set of rules than the rest of the universe. In the 18th and 19th centuries, this culminated in a belief that living organisms were infused by a non-physical energy or "life force" that allowed them to perform remarkable transformations that couldn't be explained by conventional chemistry or physics alone.

Scientists now understand that these transformations are powered by enzymes - protein molecules comprised of chains of amino acids that act to speed up, or catalyze, the conversion of one kind of molecule (substrates) into another (products). In so doing, they enable reactions such as digestion and fermentation - and all of the chemical events that happen in every one of our cells - that, left alone, would happen extraordinarily slowly.

"A chemical reaction that would take longer than the lifetime of the universe to happen on its own can occur in seconds with the aid of enzymes," said Polly Fordyce, an assistant professor of bioengineering and of genetics at Stanford University.

While much is now known about enzymes, including their structures and the chemical groups they use to facilitate reactions, the details surrounding how their forms connect to their functions, and how they pull off their biochemical wizardry with such extraordinary speed and specificity are still not well understood.

A new technique, developed by Fordyce and her colleagues at Stanford and detailed this week in the journal Science, could help change that. Dubbed HT-MEK -- short for High-Throughput Microfluidic Enzyme Kinetics -- the technique can compress years of work into just a few weeks by enabling thousands of enzyme experiments to be performed simultaneously. "Limits in our ability to do enough experiments have prevented us from truly dissecting and understanding enzymes," said study co-leader Dan Herschlag, a professor of biochemistry at Stanford's School of Medicine.

By allowing scientists to deeply probe beyond the small "active site" of an enzyme where substrate binding occurs, HT-MEK could reveal clues about how even the most distant parts of enzymes work together to achieve their remarkable reactivity.

"It's like we're now taking a flashlight and instead of just shining it on the active site we're shining it over the entire enzyme," Fordyce said. "When we did this, we saw a lot of things we didn't expect."

Enzymatic tricks

HT-MEK is designed to replace a laborious process for purifying enzymes that has traditionally involved engineering bacteria to produce a particular enzyme, growing them in large beakers, bursting open the microbes and then isolating the enzyme of interest from all the other unwanted cellular components. To piece together how an enzyme works, scientists introduce intentional mistakes into its DNA blueprint and then analyze how these mutations affect catalysis.

This process is expensive and time consuming, however, so like an audience raptly focused on the hands of a magician during a conjuring trick, researchers have mostly limited their scientific investigations to the active sites of enzymes. "We know a lot about the part of the enzyme where the chemistry occurs because people have made mutations there to see what happens. But that's taken decades," Fordyce said.

But as any connoisseur of magic tricks knows, the key to a successful illusion can lie not just in the actions of the magician's fingers, but might also involve the deft positioning of an arm or the torso, a misdirecting patter or discreet actions happening offstage, invisible to the audience. HT-MEK allows scientists to easily shift their gaze to parts of the enzyme beyond the active site and to explore how, for example, changing the shape of an enzyme's surface might affect the workings of its interior.

"We ultimately would like to do enzymatic tricks ourselves," Fordyce said. "But the first step is figuring out how it's done before we can teach ourselves to do it."

Enzyme experiments on a chip

HT-MEK combines two existing technologies to rapidly speed up enzyme analysis. The first is microfluidics, which involves molding polymer chips to create microscopic channels for the precise manipulation of fluids. "Microfluidics shrinks the physical space to do these fluidic experiments in the same way that integrated circuits reduced the real estate needed for computing," Fordyce said. "In enzymology, we are still doing things in these giant liter-sized flasks. Everything is a huge volume and we can't do many things at once."

The second is cell-free protein synthesis, a technology that takes only those crucial pieces of biological machinery required for protein production and combines them into a soupy extract that can be used to create enzymes synthetically, without requiring living cells to serve as incubators.

"We've automated it so that we can use printers to deposit microscopic spots of synthetic DNA coding for the enzyme that we want onto a slide and then align nanoliter-sized chambers filled with the protein starter mix over the spots," Fordyce explained.

Because each tiny chamber contains only a thousandth of a millionth of a liter of material, the scientists can engineer thousands of variants of an enzyme in a single device and study them in parallel. By tweaking the DNA instructions in each chamber, they can modify the chains of amino acid molecules that comprise the enzyme. In this way, it's possible to systematically study how different modifications to an enzyme affects its folding, catalytic ability and ability to bind small molecules and other proteins.
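A typical way to summarize the kinetics measured in each chamber is to fit the classic Michaelis-Menten equation to initial-rate data. The sketch below does this for one hypothetical chamber with synthetic values; it illustrates the standard kinetics model, not the HT-MEK analysis pipeline.

```python
# Sketch: fitting Michaelis-Menten kinetics, v = Vmax * S / (Km + S),
# to initial-rate data from one hypothetical chamber. Values are
# synthetic; this is not the HT-MEK pipeline.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(S, vmax, km):
    """Initial reaction rate as a function of substrate concentration S."""
    return vmax * S / (km + S)

substrate = np.array([1, 2, 5, 10, 20, 50, 100.0])    # uM (assumed units)
rate = np.array([0.9, 1.6, 3.0, 4.2, 5.1, 5.8, 6.1])  # uM/s (synthetic)

(vmax, km), _ = curve_fit(michaelis_menten, substrate, rate, p0=(6.0, 5.0))
print(f"Vmax = {vmax:.2f} uM/s, Km = {km:.1f} uM")
# Repeating this fit across thousands of chambers yields per-mutant
# kinetic parameters in parallel.
```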

When the team applied their technique to a well-studied enzyme called PafA, they found that mutations well beyond the active site affected its ability to catalyze chemical reactions -- indeed, most of the amino acids, or "residues," making up the enzyme had effects.

The scientists also discovered that a surprising number of mutations caused PafA to misfold into an alternate state that was unable to perform catalysis. "Biochemists have known for decades that misfolding can occur but it's been extremely difficult to identify these cases and even more difficult to quantitatively estimate the amount of this misfolded stuff," said study co-first author Craig Markin, a research scientist with joint appointments in the Fordyce and Herschlag labs.

"This is one enzyme out of thousands and thousands," Herschlag emphasized. "We expect there to be more discoveries and more surprises."

Accelerating advances

If widely adopted, HT-MEK could not only improve our basic understanding of enzyme function, but also catalyze advances in medicine and industry, the researchers say. "A lot of the industrial chemicals we use now are bad for the environment and are not sustainable. But enzymes work most effectively in the most environmentally benign substance we have -- water," said study co-first author Daniel Mokhtari, a Stanford graduate student in the Herschlag and Fordyce labs.

HT-MEK could also accelerate an approach to drug development called allosteric targeting, which aims to increase drug specificity by targeting beyond an enzyme's active site. Enzymes are popular pharmaceutical targets because of the key role they play in biological processes. But some are considered "undruggable" because they belong to families of related enzymes that share the same or very similar active sites, and targeting them can lead to side effects. The idea behind allosteric targeting is to create drugs that can bind to parts of enzymes that tend to be more differentiated, like their surfaces, but still control particular aspects of catalysis. "With PafA, we saw functional connectivity between the surface and the active site, so that gives us hope that other enzymes will have similar targets," Markin said. "If we can identify where allosteric targets are, then we'll be able to start on the harder job of actually designing drugs for them."

The sheer amount of data that HT-MEK is expected to generate will also be a boon to computational approaches and machine learning algorithms, like the Google-funded AlphaFold project, designed to deduce an enzyme's complicated 3D shape from its amino acid sequence alone. "If machine learning is to have any chance of accurately predicting enzyme function, it will need the kind of data HT-MEK can provide to train on," Mokhtari said.

Much further down the road, HT-MEK may even allow scientists to reverse-engineer enzymes and design bespoke varieties of their own. "Plastics are a great example," Fordyce said. "We would love to create enzymes that can degrade plastics into nontoxic and harmless pieces. If it were really true that the only part of an enzyme that matters is its active site, then we'd be able to do that and more already. Many people have tried and failed, and it's thought that one reason why we can't is because the rest of the enzyme is important for getting the active site in just the right shape and to wiggle in just the right way."

Herschlag hopes that adoption of HT-MEK among scientists will be swift. "If you're an enzymologist trying to learn about a new enzyme and you have the opportunity to look at 5 or 10 mutations over six months or 100 or 1,000 mutants of your enzyme over the same period, which would you choose?" he said. "This is a tool that has the potential to supplant traditional methods for an entire community."

Credit: 
Stanford University

California's carbon mitigation efforts may be thwarted by climate change itself

image: Redwood forests such as this one in California's Humboldt County are key components of the state's climate change mitigation efforts, but UCI researchers suggest that ongoing greenhouse gas emissions may limit the ability of trees to remove carbon dioxide from the atmosphere.

Image: 
Shane Coffield / UCI

Irvine, Calif., July 22, 2021 - To meet an ambitious goal of carbon neutrality by 2045, California's policymakers are relying in part on forests and shrublands to remove CO2 from the atmosphere, but researchers at the University of California, Irvine warn that future climate change may limit the ecosystem's ability to perform this service.

In a paper published today in the American Geophysical Union journal AGU Advances, the UCI Earth system scientists stressed that rising temperatures and uncertain precipitation will cause a decrease in California's natural carbon storage capacity of as much as 16 percent under an extreme climate projection and of nearly 9 percent under a more moderate scenario.

"This work highlights the conundrum that climate change poses to the state of California," said lead author Shane Coffield, a UCI Ph.D. candidate in Earth system science. "We need our forests and other plant-covered areas to provide a 'natural climate solution' of removing carbon dioxide from the air, but heat and drought caused by the very problem we're trying to solve could make it more difficult to achieve our objectives."

Trees and plants draw CO2 from the atmosphere when they photosynthesize, and some of the carbon ends up stored in their biomass or the soil. California's climate strategy depends in part on enhanced carbon storage to offset some of the emissions from transportation, power generation and other sources. Policymakers hope that this natural carbon sequestration, combined with measures to promote green energy, will help the state reach its target of adding no net carbon to the atmosphere by 2045.
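As a back-of-the-envelope illustration of that offset arithmetic, the sketch below applies the study's 9 and 16 percent storage declines to a hypothetical annual sink. Every figure is invented for illustration; none comes from the paper or from state emissions inventories, and the percentages describe storage capacity, which is treated here as an equivalent sink reduction purely for simplicity.

```python
# Back-of-the-envelope offset arithmetic; all numbers are hypothetical,
# not from the UCI study or California's emissions inventory.
gross_emissions = 100.0    # MtCO2/yr from transport, power, etc. (made up)
baseline_sink   = 25.0     # MtCO2/yr removed by forests/shrublands (made up)

for scenario, loss in [("moderate warming", 0.09), ("extreme warming", 0.16)]:
    sink = baseline_sink * (1 - loss)   # climate-diminished sink
    net = gross_emissions - sink        # emissions left to cut elsewhere
    print(f"{scenario}: sink {sink:.1f} MtCO2/yr, "
          f"extra cuts needed {baseline_sink - sink:.1f} MtCO2/yr, "
          f"net emissions {net:.1f} MtCO2/yr")
```

The takeaway matches the researchers' warning: every percentage point of sink lost to warming is another increment of emissions that must be eliminated at the source.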

But the UCI scientists suggest that an even more aggressive approach to curtailing emissions may be necessary.

"The emissions scenario that we follow will have a large effect on the carbon storage potential of our forests," said co-author James Randerson, who holds the Ralph J. & Carol M. Cicerone Chair in Earth System Science at UCI. "A more moderate emissions scenario in which we convert to more renewable energy sources leads to about half of the ecosystem carbon [sequestration] loss compared to a more extreme emissions scenario."

Coffield said that current climate models are not in agreement about California's future precipitation, but it's probable that the northern part of the state will get wetter and the southern part drier. He also said that coastal areas of Central and Northern California and low- and mid-elevation mountain areas - sites of large offset projects - are the most likely to lose some of their carbon sequestration powers over the next several decades.

In addition, the researchers were able to estimate the effects of climate change on specific tree species. They project that coast redwoods will be constrained to the far northern part of their range by the end of the century and that hotter, drier conditions will favor oak trees at the expense of conifers.

While the study used statistical modeling to peer into the future of the state's ecosystems, the research also highlights present-day drought and wildfire as key mechanistic drivers of carbon sequestration losses. Other studies have estimated that the 2012-2015 drought killed more than 40 percent of ponderosa pines in the Sierra Nevada range, and the researchers also describe tree losses from California's intensifying wildfires.

"We hope that this work will inform land management and climate policies so that steps can be taken to protect existing carbon stocks and tree species in the most climate-vulnerable locations," Randerson said. "Effective management of fire risk is essential for limiting carbon [sequestration] losses throughout much of the state."

Credit: 
University of California - Irvine

Investigational magnetic device shrinks glioblastoma in first-in-world human test

image: Device helmet with 3 oncoscillators securely attached. The oncoscillators are connected to a controller box powered by a rechargeable battery.

Image: 
Houston Methodist

Houston Methodist Neurological Institute researchers from the department of neurosurgery shrank a deadly glioblastoma tumor by more than a third using a helmet that generates a noninvasive oscillating magnetic field, which the patient wore on his head while the therapy was administered in his own home. The 53-year-old patient died from an unrelated injury about a month into the treatment, but during that short time, 31% of the tumor mass disappeared. An autopsy of his brain confirmed the rapid response to the treatment.

"Thanks to the courage of this patient and his family, we were able to test and verify the potential effectiveness of the first noninvasive therapy for glioblastoma in the world," said David S. Baskin, M.D., FACS, FAANS, corresponding author and director of the Kenneth R. Peak Center for Brain and Pituitary Tumor Treatment in the Department of Neurosurgery at Houston Methodist. "The family's generous agreement to allow an autopsy after their loved ones' untimely death made an invaluable contribution to the further study and development of this potentially powerful therapy."

In a case study published in Frontiers in Oncology, Baskin and his colleagues detailed the journey of their pioneering patient, whose end-stage recurrent glioblastoma had persisted despite radical surgical excision, chemoradiotherapy and experimental gene therapy.

Glioblastoma is the deadliest of brain cancers in adults, nearly always fatal, with a life expectancy of a few months to two years. When the patient's glioblastoma recurred in August 2019, Baskin and his team, already working on oscillating magnetic field (OMF) treatment in mouse models, received FDA approval for compassionate use treatment of the patient with their newly invented Oncomagnetic Device under an Expanded Access Program (EAP). The protocol also was approved by the Houston Methodist Research Institute Institutional Review Board.

The treatment consisted of intermittent application of an oscillating magnetic field generated by rotating permanent magnets in a specific frequency profile and timing pattern. The therapy was first administered for two hours under supervision at the Peak Clinic; ensuing treatments were given at home with help from the patient's wife, with treatment times gradually increased to a maximum of six hours per day.
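The release does not disclose the device's actual frequency profile or timing pattern, but a toy waveform conveys the general shape of such a stimulus: a sinusoidal field from a rotating magnet, gated on and off intermittently. All parameter values below are placeholders, not the clinical protocol.

```python
# Purely illustrative: an intermittently applied oscillating field from a
# magnet rotating at frequency F_ROT. The Oncomagnetic Device's real
# frequency profile and timing pattern are not public; these values are
# placeholders, not the treatment parameters.
import numpy as np

F_ROT = 10.0          # Hz, hypothetical rotation frequency
B_PEAK = 1.0          # arbitrary units, hypothetical peak field at the scalp
ON, OFF = 0.25, 0.75  # seconds on / off per gating cycle (hypothetical)

t = np.linspace(0, 5, 5000)        # 5 seconds of signal
gate = (t % (ON + OFF)) < ON       # intermittency window: on, then off
b = B_PEAK * np.cos(2 * np.pi * F_ROT * t) * gate

print(f"duty cycle: {gate.mean():.0%}, "
      f"rms field: {np.sqrt((b**2).mean()):.3f} (arb. units)")
```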

The Oncomagnetic Device looks deceptively simple: three oncoscillators securely attached to a helmet and connected to a microprocessor-based electronic controller powered by a rechargeable battery, an invention by case study co-author Dr. Santosh Helekar. During the patient's five weeks of treatment, the magnetic therapy was well tolerated, and the tumor mass and volume shrank by nearly a third, with shrinkage appearing to correlate with the treatment dose.

Co-authored by associate professor of neurosurgery Santosh Helekar, M.D., Ph.D., research professor Martyn A. Sharpe, Ph.D., and biomedical engineer Lisa Nguyen, the case study is titled "Case Report: End-Stage Recurrent Glioblastoma Treated with a New Noninvasive Non-Contact Oncomagnetic Device." The ongoing research is supported by the Translational Research Initiative of the Houston Methodist Research Institute, Donna and Kenneth Peak, the Kenneth R. Peak Foundation, the John S. Dunn Foundation, the Taub Foundation, the Blanche Green Fund of the Pauline Sterne Wolff Memorial Foundation, the Kelly Kicking Cancer Foundation, the Gary and Marlee Swarz Foundation, the Methodist Hospital Foundation and the Veralan Foundation.

"Imagine treating brain cancer without radiation therapy or chemotherapy," said Baskin. "Our results in the laboratory and with this patient open a new world of non-invasive and nontoxic therapy for brain cancer, with many exciting possibilities for the future."

Credit: 
Houston Methodist

Enamel defects as biomarkers for exposure to environmental stressors

Alexandria, Va., USA - IADR President Pamela Den Besten presented and chaired the IADR President's Symposium "Enamel Defects as Biomarkers for Exposure to Environmental Stressors" at the virtual 99th General Session & Exhibition of the International Association for Dental Research (IADR), held in conjunction with the 50th Annual Meeting of the American Association for Dental Research (AADR) and the 45th Annual Meeting of the Canadian Association for Dental Research (CADR), on July 21-24, 2021.

Enamel pathologies may result from mutations of genes involved in amelogenesis, or from specific environmental conditions, factors and life habits. Over the past century the levels of environmental toxins and chemicals have increased, as has exposure to novel molecules or combinations of factors that can affect enamel formation. These environmental effects, which are recorded in the mineralized enamel matrix, may result in changes to the dentition that increase the risk of dental disease.

This symposium featured an international panel of experts who highlighted the effects of environmental influences on enamel formation, including:

The Effects of Early Life Adversity on Tooth Formation by Pamela Den Besten, IADR President, University of California, San Francisco, USA

Environmental Toxicants and Enamel Pathologies With a Focus on Endocrine Disrupting Chemicals by Sylvie Babajko, Centre de Recherches des Cordeliers Inserm, Paris, France

Genetic Susceptibility to Fluorosis: What Do We Know? by Marilia Afonso Rabelo Buzalaf, University of Sao Paulo, Brazil

The Effect of Phenols and Phthalates on Prenatal/Perinatal Development, Health, and Tooth Formation by James Winkler, University of Utah, Salt Lake City, USA

View Enamel Defects as Biomarkers for Exposure to Environmental Stressors in the IADR General Session Virtual Experience Platform.

Credit: 
International Association for Dental, Oral, and Craniofacial Research