
Toward overcoming solubility issues in organic chemistry

image: Insoluble reactants are hardly reactive in solution, but may react in solvent-free systems using ball milling to drive chemical reactions in the solid state (Tamae Seo, et al. Journal of the American Chemical Society. March 30, 2021).

Image: 
Tamae Seo, et al. Journal of the American Chemical Society. March 30, 2021

Scientists from Hokkaido University have developed a rapid, efficient protocol for cross-coupling reactions, vastly expanding the pool of chemicals that can be used for the synthesis of useful organic compounds.

Chemical reactions are a vital process in the synthesis of products for a diversity of purposes. For the most part, these reactions are carried out in the liquid phase, by dissolving the reactants in a solvent. However, there are a significant number of chemicals that are partially or completely insoluble, and thus have not been used for synthesis. The starting materials required for the synthesis of many cutting-edge organic materials--such as organic semiconductors and luminescent materials--are often poorly soluble, leading to problems in solution-based synthesis. Therefore, the development of a solvent-independent synthetic approach to overcome these long-standing solubility issues in organic synthesis is highly desired to synthesize new valuable organic molecules.

In recent years, synthetic techniques using ball milling have been used to carry out solvent-free reactions in the solid phase. It has been proposed that the use of ball milling would potentially overcome the aforementioned solubility issues in synthetic chemistry, but a systematic study for such purpose has never been carried out.

A team of four scientists from Hokkaido University's Institute for Chemical Reaction Design and Discovery (WPI-ICReDD), led by Associate Professor Koji Kubota and Professor Hajime Ito, has developed a rapid, efficient, solvent-free protocol for the Suzuki-Miyaura cross-coupling reaction of insoluble aryl halides. The protocol was published in the Journal of the American Chemical Society.

Aryl halides are popular starting materials for the synthesis of organic functional molecules, primarily by the palladium-catalyzed Suzuki-Miyaura cross-coupling reaction -- for which Hokkaido University's Professor Emeritus Akira Suzuki shared the 2010 Nobel Prize in Chemistry. Although the cross-coupling reactions have been employed for the synthesis of a wide range of valuable molecules, insoluble aryl halides are not suitable substrates because Suzuki-Miyaura cross-coupling reactions have primarily been carried out in solution.

Given this limitation, the scientists focused on the development of an efficient solid-state Suzuki-Miyaura cross-coupling of a number of extremely unreactive insoluble aryl halides. The key equipment consisted of a ball mill, for mixing the reactants; a heat gun, to increase the temperature at which the reactions took place; and the use of a catalytic system composed of palladium acetate (the catalyst), SPhos (a high-performance ligand for Suzuki-Miyaura cross-coupling reactions) and 1,5-cyclooctadiene (dispersant and stabilizer).

The capstone of this study was the application of the solvent-free solid-state reaction to mostly insoluble aryl halides. These reactants did not yield any products in conventional solution-based reactions. The solid-state reactions using high-temperature ball milling, however, gave the desired products. Importantly, the team discovered a new, strongly photoluminescent material prepared from the insoluble Pigment Violet 23.

"The high-temperature ball-milling technique and our catalytic system are essential for these cross-coupling reactions of insoluble aryl halides, and the protocol we have developed expands the diversity of organic molecules derived from insoluble starting materials," says Koji Kubota.

Credit: 
Hokkaido University

How X-rays could make reliable, rapid COVID-19 tests a reality

image: Molecular models constructed from the X-ray data show different antibodies bound to the SARS-CoV-2 nucleocapsid protein (pink). The scientists determined that the linear arrangement (right) has higher detection sensitivity than the sandwich arrangement (left).

Image: 
(Berkeley Lab)

Vaccines are turning the tide of the pandemic, but the risk of infection is still present in some situations. If you want to visit a friend, get on a plane, or go see a movie, there is no highly accurate, instant test that can tell you right then and there whether or not you have a SARS-CoV-2 infection. But new research from Lawrence Berkeley National Laboratory (Berkeley Lab) could help get reliable instant tests on the market.

A study led by Michal Hammel and Curtis D. Hodge suggests that a highly sensitive lateral flow assay - the same type of device used in home pregnancy tests - could be developed using pairs of rigid antibodies that bind to the SARS-CoV-2 nucleocapsid protein. Such a test would only require a small drop of mucus or saliva, could give results in 15 minutes, and could detect a COVID-19 infection one day before the onset of symptoms. Their work was published in the journal mAbs.

The current gold standard tests for COVID-19 use a form of polymerase chain reaction (PCR) to identify the presence of SARS-CoV-2 nucleic acid (RNA) rather than a viral protein. They are quite accurate, with false-negative rates below 5% (depending primarily on the sampling site, sample type, and stage of infection). However, PCR tests must be sent away for analysis at an accredited lab.

Rapid antigen tests use antibodies to detect specific parts of the viral particle itself. Current antigen tests have a very low rate of false positives but are plagued by high false negative rates, and therefore can't replace PCR tests for definitive COVID-19 diagnosis. If a more accurate antigen test were brought to market, it could serve as a helpful initial screening tool, similar to how home pregnancy tests work. In the case of a positive result, the user would need to begin appropriate precautionary measures (isolation and other transmission-prevention behaviors) and then have the diagnosis confirmed by an official test at a health clinic.
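For readers who want to see how false-negative and false-positive rates translate into real-world reliability, the short sketch below computes positive and negative predictive values for a hypothetical rapid test. The sensitivity, specificity and prevalence figures are illustrative assumptions, not numbers from the Berkeley Lab study.

```python
# Illustrative only: how sensitivity (1 - false-negative rate), specificity
# (1 - false-positive rate) and infection prevalence determine what a test
# result actually means. All numbers below are assumptions for the example.

def predictive_values(sensitivity, specificity, prevalence):
    """Return (positive predictive value, negative predictive value)."""
    tp = sensitivity * prevalence              # true positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    return tp / (tp + fp), tn / (tn + fn)

# A hypothetical antigen test with few false positives but many false
# negatives, used where 2% of test-takers are infected:
ppv, npv = predictive_values(sensitivity=0.70, specificity=0.99, prevalence=0.02)
print(f"PPV: {ppv:.1%}, NPV: {npv:.1%}")  # roughly 58.8% and 99.4%
```

With these assumed numbers, a positive result still warrants confirmation by PCR, while a negative result is fairly reassuring, which matches the screening role described above.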

"As we move toward gaining normalcy and reopening economies worldwide, there is continued demand for rapid, low-cost tests that can be self-administered without the need for a trained professional," said Hammel, a biophysicist in Berkeley Lab's Biosciences Area. "Currently used COVID-19 PCR tests are expensive, at about $100 per test, and on average, U.S. labs are performing one million tests per day. An accurate rapid antigen test could cost $1 each and eliminate the wait time."

Hammel, Hodge, and their colleagues used small angle X-ray scattering (SAXS) performed at Berkeley Lab's Advanced Light Source (ALS) to examine about 20 antibody-antigen interactions. Their data showed that a particular pair of monoclonal antibodies bound to the nucleocapsid protein very strongly and stably, in part due to the antibodies' rigidity. All antibodies vary in their degree of rigidity based on the amino acid sequence of their "arms," which are the part of the Y-shaped molecules that bind to antigens. "The combination of the two rigid antibodies was also observed to increase networking - a process in which multiple antibodies bound to the same antigen at different sites form larger clumps or 'networks,'" explained Hodge, a postdoctoral researcher and first author on the study.

Antibody networking and high binding stability are known to improve the sensitivity of lateral flow assays, and researchers have long speculated that antibody flexibility plays a role in both properties. But studying the physical dynamics of antibody-antigen pairs to find the most effective antibodies is very difficult with traditional imaging techniques, which require the molecules to be stabilized or crystallized. The SAXS technique developed by Hammel and his colleagues allows scientists to examine antibodies and antigens in their natural state, i.e. when moving freely in a liquid.

"We showed that we can rapidly identify new antibody-antigen pairs that result in a more sensitive detection assay," said Hammel. "This technique could be applied to hundreds of antibodies in a short amount of time to identify the most suitable antibodies to achieve as-of-yet unattained sensitivity of antibody-based diagnostics, which are key for early diagnosis of SARS-CoV-2 as well as other pathogens."

The team is now investigating methods of improving test sensitivity even further.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Family history, race and sex linked to higher rates of asthma in children

image: Christine Cole Johnson, Ph.D., MPH, chair of the Department of Public Health Sciences at Henry Ford Health System and the study's lead author.

Image: 
Henry Ford Health System

DETROIT (May 17, 2021) - A national study on childhood asthma led by Henry Ford Health System has found that family history, race and sex are associated in different ways with higher rates of asthma in children.

In a study published in JAMA Pediatrics, researchers found that children with at least one parent with a history of asthma had two to three times higher rates of asthma, mostly through age 4. They also reported that asthma rates in black children were much higher than in white children during their preschool years, but the rates of incidence dropped in black children after age 9, while they increased for white children later in childhood.

"These findings help us to better understand what groups of children are most susceptible to asthma early in life," said Christine Cole Johnson, Ph.D., MPH, chair of the Department of Public Health Sciences at Henry Ford and the study's lead author. "We can now use this information to develop interventions for those children at highest risk."

More than 12,000 children born between 1980 and 2014 were followed in the study. Data were collected from 31 childhood asthma cohort studies in 30 U.S. states and Puerto Rico as part of the national Environmental Influences on Child Health Outcomes (ECHO) program. The children in the cohorts were born at 34 weeks' gestation or later with no evidence of lung disease and had been followed from birth to at least their fifth birthday. The data captured whether the children had received a doctor's asthma diagnosis, along with family history of asthma, sex, race, ethnicity, birth year and the birth mother's education. The children were split nearly evenly by sex, 51% boys and 49% girls; 52% were white and 23% were black.

Funded by the U.S. National Institutes of Health, Dr. Johnson and researchers sought to measure whether childhood asthma incidence differed by family history, race and sex. Key highlights:

Children with a family history of asthma had a two-fold increased risk of asthma from age 4 through age 14 compared to those without a family history

Boys with a family history of asthma had higher rates of asthma than girls in their early years. By age 14, their rate of incidence was about the same

Black children had the highest rates of asthma regardless of a family history

"The ECHO consortium presents a unique opportunity for us to better understand patterns of asthma development given its large and diverse participant population," said Aruna Chandran, M.D., MPH, of Johns Hopkins University Bloomberg School of Public Health and a study co-author. "We hope the results of this study will be useful to both researchers and healthcare providers to better treat and ultimately even prevent asthma in children."

Asthma is a major cause of disease in children that can lead to permanent lung damage. According to the Centers for Disease Control and Prevention, an estimated 1 in 12 children 17 and younger have asthma in the United States, which causes wheezing, difficulty breathing and coughing. Every year, 1 in 6 children with asthma visits the Emergency Department, and about 1 in 20 children are hospitalized.

Credit: 
Henry Ford Health

Archaeologists teach computers to sort ancient pottery

image: A "river" of Tusayan White Ware sherds, showing the change in type designs from oldest at left to youngest at right. Deep learning allows for accurate and repeatable categorization of these sherd types.

Image: 
Chris Downum

Archaeologists at Northern Arizona University are hoping a new technology they helped pioneer will change the way scientists study the broken pieces left behind by ancient societies.

The team from NAU's Department of Anthropology has succeeded in teaching computers to perform a complex task many scientists who study ancient societies have long dreamt of: rapidly and consistently sorting thousands of pottery designs into multiple stylistic categories. By using a form of machine learning known as Convolutional Neural Networks (CNNs), the archaeologists created a computerized method that roughly emulates the thought processes of the human mind in analyzing visual information.

"Now, using digital photographs of pottery, computers can accomplish what used to involve hundreds of hours of tedious, painstaking and eye-straining work by archaeologists who physically sorted pieces of broken pottery into groups, in a fraction of the time and with greater consistency," said Leszek Pawlowicz, adjunct faculty in the Department of Anthropology. He and anthropology professor Chris Downum began researching the feasibility of using a computer to accurately classify broken pieces of pottery, known as sherds, into known pottery types in 2016. Results of their research are reported in the June issue of the peer-reviewed publication Journal of Archaeological Science.

"On many of the thousands of archaeological sites scattered across the American Southwest, archaeologists will often find broken fragments of pottery known as sherds. Many of these sherds will have designs that can be sorted into previously-defined stylistic categories, called 'types,' that have been correlated with both the general time period they were manufactured and the locations where they were made" Downum said. "These provide archaeologists with critical information about the time a site was occupied, the cultural group with which it was associated and other groups with whom they interacted."

The research relied on recent breakthroughs in the use of machine learning to classify images by type, specifically CNNs. CNNs are now a mainstay in computer image recognition, being used for everything from X-ray images for medical conditions and matching images in search engines to self-driving cars. Pawlowicz and Downum reasoned that if CNNs can be used to identify things like breeds of dogs and products a consumer might like, why not apply this approach to the analysis of ancient pottery?

Until now, the process of recognizing diagnostic design features on pottery has been difficult and time-consuming. It could involve months or years of training to master and correctly apply the design categories to tiny pieces of a broken pot. Worse, the process was prone to human error because expert archaeologists often disagree over which type is represented by a sherd, and might find it difficult to express their decision-making process in words. An anonymous peer reviewer of the article called this "the dirty secret in archaeology that no one talks about enough."

Determined to create a more efficient process, Pawlowicz and Downum gathered thousands of pictures of pottery fragments with a specific set of identifying physical characteristics, known as Tusayan White Ware, common across much of northeast Arizona and nearby states. They then recruited four of the Southwest's top pottery experts to identify the pottery design type for every sherd and create a 'training set' of sherds from which the machine can learn. Finally, they trained the machine to learn pottery types by focusing on the pottery specimens the archaeologists agreed on.
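The published paper, rather than this article, documents the exact network and training regime; as a rough sketch of the transfer-learning workflow described, the following Python/Keras code trains an image classifier on expert-labeled sherd photographs. The folder layout, backbone and hyperparameters are assumptions for illustration, not the authors' pipeline.

```python
# A minimal transfer-learning sketch (not the authors' exact code) for
# classifying sherd photographs into pottery types.
import tensorflow as tf

IMG_SIZE = (224, 224)

# Assumed layout: sherds/train/<type_name>/*.jpg, built from the sherds the
# four experts agreed on; a held-out set is used for validation.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "sherds/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "sherds/val", image_size=IMG_SIZE, batch_size=32)
num_classes = len(train_ds.class_names)

# Pretrained backbone; only the new classification head is trained here.
base = tf.keras.applications.MobileNetV2(
    include_top=False, weights="imagenet", pooling="avg",
    input_shape=IMG_SIZE + (3,))
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```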

"The results were remarkable," Pawlowicz said. "In a relatively short period of time, the computer trained itself to identify pottery with an accuracy comparable to, and sometimes better than, the human experts."

For the four archaeologists with decades of experience sorting tens of thousands of actual potsherds, the machine outperformed two of them and was comparable with the other two. Even more impressive, the machine was able to do what many archaeologists can have difficulty with: describing why it made the classification decisions that it did. Using color-coded heat maps of sherds, the machine pointed out the design features that it used to make its classification decisions, thereby providing a visual record of its "thoughts."

"An exciting spinoff of this process was the ability of the computer to find nearly exact matches of particular snippets of pottery designs represented on individual sherds," Downum said. "Using CNN-derived similarity measures for designs, the machine was able to search through thousands of images to find the most similar counterpart of an individual pottery design."

Pawlowicz and Downum believe this ability could allow a computer to find scattered pieces of a single broken pot in a multitude of similar sherds from an ancient trash dump or conduct a region-wide analysis of stylistic similarities and differences across multiple ancient communities. The approach might also be better able to associate particular pottery designs from excavated structures which have been dated using the tree-ring method.

Their research is already receiving high praise.

"I fervently hope that Southwestern archaeologists will adopt this approach and do so quickly. It just makes so much sense," said Stephen Plog, emeritus professor of archaeology at the University of Virginia and author of the book "Stylistic Variation In Prehistoric Ceramics." "We learned a ton from the old system, but it has lasted beyond its usefulness, and it's time to transform how we analyze ceramic designs."

The researchers are exploring practical applications of the CNN model's classification expertise and are working on additional journal articles to share the technology with other archaeologists. They hope this new approach to archaeological analysis of pottery can be applied to other types of ancient artifacts, and that archaeology can enter a new phase of machine classification that results in greater efficiency of archaeological efforts and more effective methods of teaching pottery designs to new generations of students.

Credit: 
Northern Arizona University

Researchers identify proteins that predict future dementia, Alzheimer's risk

The development of dementia, often from Alzheimer's disease, late in life is associated with abnormal blood levels of dozens of proteins up to five years earlier, according to a new study led by researchers at the Johns Hopkins Bloomberg School of Public Health. Most of these proteins were not known to be linked to dementia before, suggesting new targets for prevention therapies.

The findings are based on new analyses of blood samples from more than 10,000 middle-aged and elderly people--samples that were taken and stored as part of large, ongoing studies over the past decades. The researchers linked abnormal blood levels of 38 proteins to higher risks of developing Alzheimer's within five years. Of those 38 proteins, 16 appeared to predict Alzheimer's risk two decades in advance.

Although most of these risk markers may be only incidental byproducts of the slow disease process that leads to Alzheimer's, the analysis pointed to high levels of one protein, SVEP1, as a likely causal contributor to that disease process.

The study was published May 14 in Nature Aging.

"This is the most comprehensive analysis of its kind to date, and it sheds light on multiple biological pathways that are connected to Alzheimer's," says study senior author Josef Coresh, MD, PhD, MHS, George W. Comstock Professor in the Department of Epidemiology at the Bloomberg School. "Some of these proteins we uncovered are just indicators that disease might occur, but a subset may be causally relevant, which is exciting because it raises the possibility of targeting these proteins with future treatments."

More than six million Americans are estimated to have Alzheimer's, the most common type of dementia, an irreversible fatal condition that leads to loss of cognitive and physical function. Despite decades of intensive study, there are no treatments that can slow the disease process, let alone stop or reverse it. Scientists widely assume that the best time to treat Alzheimer's is before dementia symptoms develop.

Efforts to gauge people's Alzheimer's risk before dementia arises have focused mainly on the two most obvious features of Alzheimer's brain pathology: clumps of amyloid beta protein known as plaques, and tangles of tau protein. Scientists have shown that brain imaging of plaques, and blood or cerebrospinal fluid levels of amyloid beta or tau, have some value in predicting Alzheimer's years in advance.

But humans have tens of thousands of other distinct proteins in their cells and blood, and techniques for measuring many of these from a single, small blood sample have advanced in recent years. Would a more comprehensive analysis using such techniques reveal other harbingers of Alzheimer's? That's the question Coresh and colleagues sought to answer in this new study.

The researchers' initial analysis covered blood samples taken during 2011-13 from more than 4,800 late-middle-aged participants in the Atherosclerosis Risk in Communities (ARIC) study, a large epidemiological study of heart disease-related risk factors and outcomes that has been running in four U.S. communities since 1985. Collaborating researchers at a laboratory technology company called SomaLogic used a technology they recently developed, SomaScan, to record levels of nearly 5,000 distinct proteins in the banked ARIC samples.

The researchers analyzed the results and found 38 proteins whose abnormal levels were significantly associated with a higher risk of developing Alzheimer's in the five years following the blood draw.
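The statistical machinery behind such an analysis is typically a proteome-wide survival screen. As a sketch under assumed data (hypothetical file and column names, not the study's code), each protein's standardized level can be tested against time to dementia with a Cox proportional hazards model, adjusting for covariates and correcting for multiple comparisons:

```python
# Hypothetical sketch of a proteome-wide Cox regression screen; the file,
# column names and covariates are assumptions, not the study's actual data.
import pandas as pd
from lifelines import CoxPHFitter

# One row per participant: standardized protein levels, covariates (sex coded
# 0/1), follow-up time in years, and whether dementia was diagnosed (0/1).
df = pd.read_csv("aric_proteins.csv")
proteins = ["SVEP1", "protein_0002"]  # in practice, ~5,000 measured proteins

results = []
for protein in proteins:
    cph = CoxPHFitter()
    cph.fit(df[[protein, "age", "sex", "years_to_event", "dementia_event"]],
            duration_col="years_to_event", event_col="dementia_event")
    results.append((protein,
                    cph.hazard_ratios_[protein],        # risk per SD of protein
                    cph.summary.loc[protein, "p"]))

# A real analysis would correct these p-values for multiple testing
# (e.g., Bonferroni or false discovery rate) before calling a protein a hit.
```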

They then used SomaScan to measure protein levels from more than 11,000 blood samples taken from much younger ARIC participants in 1993-95. They found that abnormal levels of 16 of the 38 previously identified proteins were associated with the development of Alzheimer's in the nearly two decades between that blood draw and a follow-up clinical evaluation in 2011-13.

To verify these findings in a different patient population, the scientists reviewed the results of an earlier SomaScan of blood samples taken in 2002-06 during an Icelandic study. That study had assayed proteins including 13 of the 16 proteins identified in the ARIC analyses. Of those 13 proteins, six were again associated with Alzheimer's risk over a roughly 10-year follow-up period.

In a further statistical analysis, the researchers compared the identified proteins with data from past studies of genetic links to Alzheimer's. The comparison suggested strongly that one of the identified proteins, SVEP1, is not just an incidental marker of Alzheimer's risk but is involved in triggering or driving the disease.

SVEP1 is a protein whose normal functions remain somewhat mysterious, although in a study published earlier this year it was linked to the thickened artery condition, atherosclerosis, which underlies heart attacks and strokes.

Other proteins associated with Alzheimer's risk in the new study included several key immune proteins--which is consistent with decades of findings linking Alzheimer's to abnormally intense immune activity in the brain.

The researchers plan to continue using techniques like SomaScan to analyze proteins in banked blood samples from long-term studies to identify potential Alzheimer's-triggering pathways--a potential strategy to suggest new approaches for Alzheimer's treatments.

The scientists have also been studying how protein levels in the ARIC samples are linked to other diseases such as vascular (blood vessel-related) disease in the brain, heart and the kidney.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Educational intervention enhances student learning

May 16, 2021 -- In a study of low-income, urban youth in the U.S., researchers at Columbia University Mailman School of Public Health found that students exposed to Photovoice, an educational intervention, experienced greater improvements in STEM-capacity scores and environmental awareness scores compared to a group of youth who were not exposed to the activity. The results suggest that the Photovoice activities may be associated with improved learning outcomes. The study is published in the International Journal of Qualitative Methods.

"Our findings suggest that the Photovoice activities result in greater environmental awareness and may be associated with improved learning skills," said Nadav Sprague, doctoral fellow, Environmental Life Course Epidemiology at Columbia Mailman School.

Photovoice uses community members' knowledge and perspective to address knowledge gaps in academia, research, and policy-making. Most often used in public health research, it engages participants by having them take photos on a given topic, combined with narrative discussions in a focus-group setting.

Sprague and colleagues from Washington University in St. Louis studied the STEM-capacity, environmental perceptions, and environmental awareness of 335 low-income, St. Louis Public School students aged 9 to 15. Students in the study were assigned to one of two intervention groups, a Photovoice environmental education intervention group or a traditional intervention group without a Photovoice activity. The investigators also evaluated outcomes among a control group of youth, who did not participate in either intervention.

After the intervention, STEM-capacity and environmental awareness scores in the Photovoice intervention group were significantly higher than those of the control group, which did not participate in either intervention and showed no significant improvement in STEM capacity or environmental awareness. Both environmental education interventions were run by Washington University and Gateway to the Great Outdoors (GGO), a nonprofit organization that provides environmental health and STEM education to low-income elementary and middle schools in Chicago and St. Louis.

"There is an environmental justice issue. Low-income and non-White children often have less access to nature and greenspace than their high-income or White counterparts" said Sprague, who also founded Gateway to the Great Outdoors. Researchers believe focused based environmental education is a potential answer to halt or slow human-driven climate change, biodiversity loss, overuse of natural resources, environmental health disparities, deforestation, and other human-caused environmental issues.

While the findings of the current research, like those of previous studies, are consistent with the literature showing that contact with nature improves social connectedness, Sprague says larger intervention studies are needed to confirm the benefits of Photovoice in environmental education.

"The purpose of Photovoice is to empower individuals and communities that are traditionally neglected from research and policymaking discussions," said Sprague. "Photovoice may be an effective leadership development tool for youth, as it demonstrates to youth that their opinions and views can lead to change."

Credit: 
Columbia University's Mailman School of Public Health

Newly published data provides clearer picture of volcano collapse

video: 3D rendering of pre- and post-collapse (likeliest scenario) of Anak Krakatau and the surrounding islands based on available pre-event data outside of the Krakatau Islands and field survey data from August 2019.

Image: 
3D rendering courtesy of Stéphan Grilli.

KINGSTON, R.I. - May 17, 2021 - An article recently published in the prestigious journal Nature Communications, written by University of Rhode Island College of Engineering Professor Stéphan Grilli and his colleagues, reveals new data on the Anak Krakatau volcano flank collapse, which was triggered by an eruption on December 22, 2018.

The tsunami created by the flank collapse hit the coast of Indonesia with waves as tall as 5 meters, leaving 420 people dead and 40,000 people displaced from their homes.

New Data Used for Modeling

Ever since the eruption occurred, scientists have been collecting evidence to determine exactly how it happened, just as crime scene investigators attempt to recreate a crime scene.

"Up until now, a lot of the information we had was based on satellite images and conjecture," said University of Rhode Island Distinguished Engineering Professor Stéphan Grilli. "Until there was real data, nobody could do any better."

By combining new synthetic aperture radar (SAR) images, field observations from a marine geology underwater survey, and aerial photographs taken by drones, a more accurate model can now be created of the volcano before and after it collapsed.

New high-resolution seafloor and sub-seafloor hydroacoustic surveys have provided a comprehensive view of what the landslide deposits look like underwater.

"The renderings show how deep the sediment slid underwater and how large the pieces were that collapsed," said Grilli.

Published Findings

The article in Nature Communications, which is considered one of the world's leading multidisciplinary science journals, was published on May 14, 2021.

"For many researchers working in the natural sciences, publishing a paper in one of Nature's journals is really an honor and a sign that one's work is being recognized by the scientific community," said Grilli. "This also brings great visibility to the work, which is important because as we improve our understanding and modeling of how tsunamis are generated by natural hazards, we can improve our mitigation of their effects in coastal areas and hopefully save lives."

Grilli's research was funded by the National Science Foundation. Other co-project investigators at URI were Annette Grilli, associate professor of ocean engineering and Steve Carey, professor of oceanography.

Most of Stéphan Grilli's peers who are co-authors on the Nature article are from the United Kingdom and were funded by its Natural Environment Research Council.

Closer to Home

As devastating as the tsunami caused by Anak Krakatau was, a potentially much greater threat exists closer to the United States.

According to Grilli, if one of the volcanoes in the Canary Islands in the North Atlantic Ocean off the coast of Northwest Africa was to erupt and suffer a large flank collapse, the results would be catastrophic.

"Our sights are on the Canary Islands because that volcano shows signs of becoming unstable and an eruption could cause a major landslide on one of its flanks, which studies have shown could be up to 2,000 times larger than what we saw in Indonesia," said Grilli. "That could create a mega-tsunami, with the potential to cause inundations along the East Coast of the United State, in some areas twice as large as a category five hurricane. It could mean major destruction along the East Coast."

On a smaller scale, but within the United States, Hawaii's volcanoes pose a constant threat of eruption and flank collapses.

"If a piece of one of Hawaii's volcanoes was to break off, it could create a significant tsunami," said Grilli.

Not Much Warning

Despite advances in technology, there is still very little warning when a volcano is on the verge of eruption or a tsunami is forming as a result of it.

"We have high-frequency radar and systems that can monitor surface currents, including those caused by tsunamis, but we're still a long way from being able to predict when an earthquake, volcano eruption or tsunami may occur," said Grilli.

After Japan was struck by an earthquake with a magnitude of 9.0 in 2011, resulting in a tsunami and a nuclear power-plant accident, which left close to 18,000 people dead, the country spent $12 billion to build 42-foot-high concrete seawalls.

The walls block the view of the ocean, but experts say the barriers are worth it, as they should minimize damage and buy time for evacuation. In some areas of the United States, such as along the Cascadia subduction zone off of Northern California, Oregon, and Washington, there would be very little time to retreat to safe ground should a large earthquake and tsunami occur.

"In Oregon, people are worried about evacuation if we had 'The Big One,'" said Grilli. "Even though people have built artificial hills for a vertical evacuation, at most there would be a 15-minute tsunami warning. There just wouldn't be enough time to get everyone to safety."

Credit: 
University of Rhode Island

Researchers call for bias-free artificial intelligence

Clinicians and surgeons are increasingly using medical devices based on artificial intelligence. These AI devices, which rely on data-driven algorithms to inform health care decisions, presently aid in diagnosing cancers, heart conditions and diseases of the eye, with many more applications on the way.

Given this surge in AI, two Stanford University faculty members are calling for efforts to ensure that this technology does not exacerbate existing health care disparities.

In a new perspective paper, Stanford faculty discuss sex, gender and race bias in medicine and how these biases could be perpetuated by AI devices. The authors suggest several short- and long-term approaches to prevent AI-related bias, such as changing policies at medical funding agencies and scientific publications to ensure the data collected for studies are diverse, and incorporating more social, cultural and ethical awareness into university curricula.

"The white body and the male body have long been the norm in medicine guiding drug discovery, treatment and standards of care, so it's important that we do not let AI devices fall into that historical pattern," said Londa Schiebinger, the John L. Hinds Professor in the History of Science in the School of Humanities and Sciences and senior author of the paper published May 4 in the journal EBioMedicine.

"As we're developing AI technologies for health care, we want to make sure these technologies have broad benefits for diverse demographics and populations," said James Zou, assistant professor of biomedical data science and, by courtesy, of computer science and of electrical engineering at Stanford and co-author of the study.

The matter of bias will only become more important as personalized, precision medicine grows in the coming years, said the researchers. Personalized medicine, which is tailored to each patient based on factors such as their demographics and genetics, is vulnerable to inequity if AI medical devices cannot adequately account for individuals' differences.

"We're hoping to engage the AI biomedical community in preventing bias and creating equity in the initial design of research, rather than having to fix things after the fact," said Schiebinger.

Constructive - if constructed appropriately

In the medical field, AI encompasses a suite of technologies that can help diagnose patients' ailments, improve health care delivery and enhance basic research. The technologies involve algorithms, or instructions, run by software. These algorithms can act like an extra set of eyes perusing lab tests and radiological images; for instance, by parsing CT scans for particular shapes and color densities that could indicate disease or injury.

Problems of bias can emerge, however, at various stages of these devices' development and deployment, Zou explained. One major factor is that the data used to train the models underlying these algorithms can come from nonrepresentative patient datasets.

By failing to properly take race, sex and socioeconomic status into account, these models can be poor predictors for certain groups. To make matters worse, clinicians might lack any awareness of AI medical devices potentially producing skewed results.

As an illustrative example of potential bias, Schiebinger and Zou discuss pulse oximeters in their study. First patented around 50 years ago, pulse oximeters can quickly and noninvasively report oxygen levels in a patient's blood. The devices have proven critically important in treating COVID-19, where patients with low oxygen levels should immediately receive supplemental oxygen to prevent organ damage and failure.

Pulse oximeters work by shining a light through a patient's skin to register light absorption by oxygenated and deoxygenated red blood cells. Melanin, the primary pigment that gives skin its color, also absorbs light, however, potentially scrambling readings in people with highly pigmented skin. It's no surprise, then, that studies have shown today's industry-standard oximeters are three times more likely to incorrectly report blood gas levels in Black patients compared to white patients. Oximeters additionally have a sex bias, tending to misstate levels in women more often than men. These oximeter biases mean that dark-skinned individuals, especially females, are at risk of not receiving emergency supplemental oxygen.
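To make the mechanism concrete, the conventional "ratio of ratios" calculation behind pulse oximetry is sketched below. The linear calibration constants are placeholders; commercial devices rely on empirically derived calibration curves, and it is in those empirical calibrations, built from particular study populations, that pigmentation-related bias can enter.

```python
# Illustrative "ratio of ratios" pulse-oximetry estimate (not any specific
# device's algorithm). AC terms are the pulsatile part of the light signal,
# DC terms the steady baseline, at red and infrared wavelengths.

def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir, a=110.0, b=25.0):
    """Estimate oxygen saturation (%) with an assumed linear calibration."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)  # ratio of ratios
    return a - b * r                         # placeholder calibration constants

# With these constants, a ratio near 1.0 maps to roughly 85% saturation.
print(spo2_estimate(ac_red=0.02, dc_red=1.0, ac_ir=0.02, dc_ir=1.0))
```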

"The pulse oximeter is an instructive example of how developing a medical technology without varied demographic data collection can lead to biased measurements and thus poorer patient outcomes," said Zou.

This issue extends to the evaluation of devices after approval for clinical use. In another recent study, published in Nature Medicine and cited in the EBioMedicine paper, Zou and colleagues at Stanford reviewed the 130 medical AI devices approved at the time by the U.S. Food and Drug Administration. The researchers found that 126 out of the 130 devices were evaluated using only previously collected data, meaning that no one gauged how well the AI algorithms work on patients in combination with active human clinician input. Moreover, less than 13 percent of the publicly available summaries of approved device performances reported sex, gender or race/ethnicity.

Zou said these problems of needing more diverse data collection and monitoring of AI technologies in medical contexts "are among the lowest hanging fruit in addressing bias."

Addressing bias at the macro level

Over the longer term, the study explores how structural changes to the broader biomedical infrastructure can help overcome the challenges posed by AI inequities.

A starting point is funding agencies, such as the National Institutes of Health. Some progress has been made in recent years, Schiebinger said, pointing to how in 2016, the NIH started requiring funding applicants to include sex as a biological variable in their research, if relevant. Schiebinger anticipates the NIH instituting a similar policy for gender, as well as race and ethnicity. Her group at Stanford, meanwhile, is developing gender as a sociocultural variable during clinical trials, as reported in a February study in Biology of Sex Differences.

"We want to start with policy up front in funding agencies to set the direction of research," said Schiebinger. "These agencies have a great role to play because they are distributing taxpayer money, which means that the funded research must benefit all people across the whole of society."

Another opportunity area centers on biomedical publications, including journals and conference reports. The Stanford study authors suggest that publications set policies to require sex and gender analyses where appropriate, along with ethical considerations and societal consequences.

For medical schools, the authors suggest enhancing curricula to increase awareness of how AI might reinforce social inequities. Stanford and other universities are already making strides toward this goal by embedding ethical reasoning into computer science courses.

Another example of using an interdisciplinary approach to reduce bias is the ongoing collaboration between Schiebinger, who has taught at Stanford for 17 years and is a leading international authority on gender and science, and Zou, an expert in computer science and biomedical AI.

"Bringing together a humanist and a technologist is something Stanford is good at and should do more of," said Schiebinger. "We're proud to be in the forefront of the efforts to debias AI in medicine, all the more important considering the many other facets of human life that AI will eventually impact."

Credit: 
Stanford University

American College of Cardiology program works to improve global heart attack care

The American College of Cardiology's (ACC) Global Heart Attack Treatment Initiative (GHATI) had measurable positive impacts on care delivery for heart attacks in low- and middle-income countries, according to data from the program's first year. Results were presented at the ACC's 70th Annual Scientific Session.

Globally, more than 17 million people die each year from cardiovascular disease. Three-quarters of these deaths take place in low- and middle-income countries, which annually see about 3 million ST-elevation myocardial infarctions (STEMIs), the deadliest type of heart attack in which an artery in the heart is completely blocked. While a person who suffers a heart attack in the U.S. or Europe has a 95-97% chance of survival, the odds are significantly worse in low- and middle-income countries, where the chance of survival is 80-90%.

ACC launched GHATI in 2019 to improve heart attack outcomes in low- and middle-income countries by encouraging adherence to guideline-directed medical therapy. By the end of its first year, the program had tracked STEMI treatment metrics and outcomes for more than 2,000 patients at 18 medical centers in 13 countries on four continents. Overall, the data reveal that around 90% of hospital admissions for a heart attack adhered to guideline-directed medical therapy; the study also documented improvements in several key metrics over the course of the year.

"It is obvious that there are chances to improve the systems and lower rates of cardiovascular death in low- and middle-income countries, and indeed in all countries," said Benny J. Levenson, MD, PhD, of CV Center Berlin-Charlottenburg, Vivantes Klinikum Am Urban/Berlin in Germany and immediate past chair of GHATI. "We were pleased that the results, at one year, were heading in the right direction. We intend to continue to grow this program to be a model for many other countries to improve systems of care for heart attack and ultimately make a big impact on reducing mortality."

The quality improvement program brings ACC experts from around the world, including members of the College's Assembly of International Governors, together with cardiology teams at participating institutions to establish systems for tracking patient encounters and collecting data on outcomes. Over the course of the year, the average time patients spent in transit to the hospital decreased by 38 minutes; cardiac arrest upon arrival decreased by 4.6%; and the time from first medical contact to the use of a device to open blocked arteries improved by 28%. All these factors are known to improve outcomes after a heart attack.

While the study is not a randomized controlled trial and cannot definitively attribute the improvements to the program, feedback from participants and a robust body of previous research on quality improvement suggest the program has helped to encourage positive change, according to the researchers.

"The benefit to participating institutions starts with participating," Levenson said. "Places that have never collected data are now doing so. This leads to a culture change, because people and institutions learn to look closely at their practices and discuss their results with others. Just by gathering data, we can start to see a positive effect."

The results suggest that doctors in low- and middle-income countries are generally familiar with ACC's treatment guidelines and often show high adherence to them. However, systemic factors outside of the hospital environment, such as the availability of ambulances and emergency-response systems, likely still have a substantial impact on outcomes and may be challenging to change, Levenson said.

Researchers said the program will continue to expand into more sites and countries, including top-performing medical centers and higher-income countries. "Clinicians on cardiology care teams are by nature competitive--we have an internal drive to continue to improve metrics and outcomes," Levenson said. "We know from many years of experience that even the best can improve."

The ACC GHATI Work Group is led by GHATI Chair Cesar Herrera, MD, FACC, the Americas Representative of the ACC Assembly of International Governors and director of the CEDIMAT Cardiovascular Center in Santo Domingo, Dominican Republic, and B. Hadley Wilson, MD, FACC, chair-elect of the ACC Governance Committee and a cardiologist at Sanger Heart and Vascular Institute.

More information about GHATI is available at ACC.org/ghati.

Levenson will present the study, "Worldwide ST-Elevation Myocardial Infarction Care: One-year Results of the American College of Cardiology Global Heart Attack Treatment Initiative," on Monday, May 17, at 12:30 p.m. ET / 16:30 UTC, virtually.

Credit: 
American College of Cardiology

Multi-gene testing could detect more hereditary cancer syndromes

COLUMBUS, Ohio - Up to 38.6% of people with colon cancer who have a hereditary cancer syndrome--including 6.3% of those with Lynch syndrome--could have their conditions remain undetected with current universal tumor-screening methods, and at least 7.1% of people with colorectal cancer have an identifiable inherited genetic mutation, according to new data published by scientists at The Ohio State University Comprehensive Cancer Center - Arthur G. James Cancer Hospital and Richard J. Solove Research Institute (OSUCCC - James).

Experts say their data, which was gathered from a cohort of more than 3,300 colorectal cancer patients treated at 51 hospitals across Ohio, makes a strong scientific argument for implementing multi-gene panel testing as part of the standard of care for all colorectal cancer patients.

"Finding ways to identify high-risk individuals among colorectal cancer patients is critically needed to better manage this disease and proactively identify family members who may also be impacted," says Rachel Pearlman, MS, LGC, first author of the study and a genetic counselor/researcher at the OSUCCC - James. "Genetic screening has changed dramatically in the past 10 years, allowing us to screen individuals for numerous known genetic mutations for much lower costs. This is a powerful tool that we need to embrace more broadly for cancer prevention and surveillance."

Study Methods and Results

For this study, OSUCCC - James researchers wanted to know if using a multi-level genetic testing approach could better identify hereditary genetic risk factors (passed down through families) that dramatically increase a person's lifetime risk of cancer but often go undetected until cancer occurs.

Researchers identified 3,310 adults who had undergone surgery for invasive colorectal cancer from January 2013 to December 2016. Individuals were recruited as part of the Pelotonia-funded Ohio Colorectal Cancer Prevention Initiative (OCCPI), a statewide research initiative led by Heather Hampel, MS, LGC. Hampel is a member of the OSUCCC - James Molecular Carcinogenesis and Chemoprevention Program, as well as a professor and associate director of the Division of Human Genetics at the Ohio State College of Medicine.

The study was launched to screen newly diagnosed colorectal cancer patients and their biological relatives for Lynch syndrome, a cancer-causing condition that occurs if a person inherits a mutation in one of four genes. People with Lynch syndrome are at higher risk to develop colorectal, endometrial (uterine), ovarian and stomach cancer than are average-risk individuals.

All study participants received universal tumor screening for mismatch repair (MMR) deficiency. This characteristic is common in tumors from patients with Lynch syndrome and suggests that, if needed, the tumor would respond well to immunotherapy, a treatment that uses the body's own immune system to fight the tumor.

Individuals who met at least one of the trial-inclusion criteria also received multi-gene panel testing to identify harmful mutations. Genetic testing criteria included: MMR deficiency; colorectal cancer diagnosis before age 50; multiple primary tumors (colorectal cancer/endometrial cancer); or a first-degree relative with colorectal or endometrial cancer.

Researchers report in this new analysis that about 16% (525 patients) of participants had MMR deficiency, and about 7% had an inherited mutation. The scientists note that if universal tumor screening for Lynch syndrome had been the only method used to screen for hereditary cancer syndromes, more than 38% of patients who tested positive would have been missed--including more than 6% of individuals found to have Lynch syndrome through multi-gene testing methods.

"This is a significant and important discovery. By using pan-cancer multi-gene panel testing for all colorectal cancer patients, we could identify many individuals who are at increased risk for future cancer development and identify actionable therapeutic targets for their current cancer," says Hampel, senior author of the study. "Adopting modern testing methods as part of standard clinical practice for patients with colorectal cancer could literally save thousands of lives through early detection and surveillance of other family members who are at increased risk to develop the cancer based on inherited genetic mutations."

Credit: 
Ohio State University Wexner Medical Center

Insulin is necessary for repairing olfactory neurons

image: High dependency of new neurons on insulin signaling.

Image: 
Monell Chemical Senses Center, eNeuro

PHILADELPHIA (May 17, 2021) - Researchers have known for some time that insulin plays a vital role in regeneration and growth in some types of neurons that relay environmental sensory information to our brains, such as sight. However, they know relatively little about the role of insulin in the sense of smell. Now, investigators at the Monell Chemical Senses Center have shown that insulin plays a critical role in the maturation, after injury, of immature olfactory sensory neurons (OSNs). The team published their findings in eNeuro earlier this month.

"Our findings suggest that applying insulin into the nasal passage could be developed as a therapy for injury caused by a host of issues," said first author Akihito Kuboki, MD, a postdoctoral fellow in the lab of Johannes Reisert, PhD."

Knowing that insulin is part of the body's repair pathway for visual neurons, Kuboki suspected that the hormone might also play a role in the maturation of OSNs after injury. He also notes there are many insulin receptors in the olfactory region of the brain. Taking these factors into account, Kuboki concluded that insulin may also be involved in the sense of smell.

"Although scientists don't yet have a clear idea of how it works, we know that insulin plays a key role in preventing cell death," said Kuboki. "If insulin levels are reduced, diabetes patients have a high susceptibility to cell death, which can cause smell loss." He is pursuing this research path to shed light on why people with diabetes often suffer from smell loss, or anosmia.

The research team induced type 1 diabetes in mice to reduce the level of circulating insulin reaching the OSNs. The reduced insulin interfered with the regeneration of OSNs, resulting in an impaired sense of smell. The researchers assessed how the structure of the olfactory tissue in the nasal cavity and the olfactory bulb was affected by comparing the number of mature OSNs and how well the axons of OSNs reached the olfactory bulb. The team also recorded odorant-induced responses in the OSNs in the nasal cavity. An odor-guided behavioral task, in which the mice had to find a cookie reward by smell, measured olfactory function.

In addition, the team injured OSNs, which have a unique ability to regenerate in mammals. This approach allowed the investigators to ask whether OSNs required insulin to regenerate, which they found to be true. What's more, they discovered that OSNs are highly susceptible to insulin deprivation-induced cell death eight to 13 days after an injury. This time window indicates that during a critical stage newly generated OSNs are dependent on insulin. They also found that insulin must be applied to regenerating OSNs at this critical time point in the neurons' growth to be able to restore a mouse's sense of smell.

Also of significance, the team found that insulin promotes the regeneration of OSNs in both type 1 diabetic and nondiabetic mice. "Even in nondiabetic mice, we found that insulin can promote the regeneration of OSNs, which suggests that this could be a therapy for olfactory dysfunction in patients without diabetes," said Kuboki. The team examined the OSN regeneration process after injury only in type 1 diabetic mice and did not examine the effects of type 2 diabetes, but plans to do so in the future.

"Our findings suggest that insulin plays important roles when OSNs need to regenerate after severe injury that induces cell death in many OSNs," said Kuboki. "From this, we hope that an insulin spray can be potentially applied to treat smell loss for various reasons, including head trauma and viral infection."

Credit: 
Monell Chemical Senses Center

Ethnicity, geography and socioeconomic factors determine likelihood of detecting serious congenital heart defects before birth

Mothers who are Hispanic or who come from rural or low socioeconomic status neighborhoods are less likely to have their child's critical heart condition diagnosed before birth, according to a new study in the journal Circulation.

This is the largest and most geographically diverse study of these challenges to date. The study compared patient data of more than 1,800 children from the United States and Canada diagnosed with two of the most common, and the most serious, critical congenital heart defects: hypoplastic left heart syndrome (HLHS), when the left side of the heart is not developed completely, and transposition of the great arteries (TGA), when the two main arteries that carry blood away from the heart are reversed.

"The earlier we diagnose a heart defect, especially a serious one such as HLHS or TGA, the sooner we can make a plan for how to safely deliver the infant and reduce the impacts of that heart defect on the rest of the body," says Anita Krishnan, M.D., first author and cardiologist at Children's National Hospital. "Early detection and diagnosis of these conditions is crucial to ensuring the best possible outcome for the child, especially in protecting the brain."

Even when heart defects were detected before birth, the defects of babies from neighborhoods with lower socioeconomic status were detected later in gestation than those of other babies.

"The COVID-19 pandemic has brought the idea of significant disparities in health care to the forefront of our national attention," says Dr. Krishnan. "Even though many health care providers have seen these inequities firsthand in their own clinical experience, it was still surprising to see the strength of the association between socioeconomic position and the care available to mothers."

In both the United States and Canada, expectant mothers are first screened as part of routine prenatal care in the first trimester for early signs of congenital heart defects and other genetic disorders via blood screen and ultrasound. In the second trimester, a comprehensive ultrasound evaluation for structural anomalies is routine. If any issues are detected, the mother is referred for a fetal echocardiogram and counseling.

The authors suggest that weaker links between the subspecialists who diagnose these conditions and the neighborhoods and populations identified in the study could contribute to the disparities found.

"Prenatal detection rates may improve if we are able to leverage outreach and telehealth to strengthen the relationships between these specialties and the groups we identified in the study," Dr. Krishnan says.

The study included a total of 1,862 patients, including 1,171 patients with HLHS (91.8% prenatally diagnosed) and 691 with TGA (58% prenatally diagnosed). The study group included prenatally diagnosed fetuses with HLHS or TGA and postnatally diagnosed infants less than two months old with HLHS or TGA. Data was collected from institutions participating in the Fetal Heart Society, a non-profit 501(c) multicenter research collaborative with a mission to advance the field of fetal cardiovascular care and science.

Credit: 
Children's National Hospital

Alcohol problems severely undertreated

image: Laura Bierut, MD, and her team of researchers at Washington University School of Medicine in St. Louis have found that although the vast majority of people with alcohol use disorder see their doctors regularly for a range of issues, fewer than one in 10 ever get treatment to help curb their drinking.

Image: 
Washington University School of Medicine

Some 16 million Americans are believed to have alcohol use disorder, and an estimated 93,000 people in the U.S. die from alcohol-related causes each year. Both of those numbers are expected to grow as a result of heavier drinking during the COVID-19 pandemic.

Yet, in a new study involving data from more than 200,000 people with and without alcohol problems, researchers at Washington University School of Medicine in St. Louis found that although the vast majority of those with alcohol use disorder see their doctors regularly for a range of issues, fewer than one in 10 ever get treatment for drinking.

The findings are published in the June issue of the journal Alcoholism: Clinical & Experimental Research.

Analyzing data gathered from 2015 through 2019 via the National Survey on Drug Use and Health, the researchers found that about 8% of those surveyed met the current criteria for alcohol use disorder, the medical diagnosis for those with an addiction to alcohol. Of those who met the criteria, 81% had received medical care in a doctor's office or spent time in a hospital or clinic during the previous year. But only 12% reported they had been advised to cut down on their drinking, 5% were offered information about treatment, and 6% received treatment -- some of whom were not referred by their doctors but sought out treatment on their own.
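To put those percentages in concrete terms, the back-of-envelope arithmetic below converts them into approximate head counts. It is an illustration only, and it assumes each percentage is taken over the full group of respondents who met alcohol use disorder criteria.

    # Illustrative back-of-envelope arithmetic (not from the published study).
    # Assumes each percentage applies to all respondents who met alcohol
    # use disorder (AUD) criteria in the survey sample.
    respondents = 214_505
    met_criteria = 0.08 * respondents   # ~8% met AUD criteria
    saw_doctor = 0.81 * met_criteria    # received medical care in the past year
    advised = 0.12 * met_criteria       # advised to cut down on drinking
    offered_info = 0.05 * met_criteria  # offered information about treatment
    treated = 0.06 * met_criteria       # received treatment

    print(f"Met AUD criteria:      ~{met_criteria:,.0f}")
    print(f"Saw a doctor:          ~{saw_doctor:,.0f}")
    print(f"Advised to cut down:   ~{advised:,.0f}")
    print(f"Offered information:   ~{offered_info:,.0f}")
    print(f"Received treatment:    ~{treated:,.0f}")

Under those assumptions, roughly 17,000 respondents met the criteria, but only on the order of 1,000 reported receiving treatment.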

"It's not that these people aren't in the health-care system," said first author Carrie M. Mintz, MD, an assistant professor of psychiatry. "But although they see doctors regularly, the vast majority aren't getting the help they need."

Mintz and her colleagues evaluated data from 214,505 people. The researchers first wanted to learn whether people with alcohol use disorder had access to health care and, if they did, whether they had been screened about their alcohol use; they were considered to have been screened if their doctors had simply asked how much they drank. The researchers also evaluated whether people with drinking problems had been advised to cut down on drinking, had received additional information about treatment, or had received treatment or counseling.

The researchers found that although most people with alcohol use disorder had access to health care and although 70% reported they had been asked about alcohol use, that's where the care stopped.

"Some primary care doctors may not feel comfortable telling patients they should cut down on drinking, prescribing medication to help them cut back or referring them to treatment because they don't specialize in treating alcohol misuse; but the result is that many people who need treatment aren't getting it," said senior author Laura Jean Bierut, MD, the Alumni Endowed Professor of Psychiatry. "We used to see the same thing with smoking, but when physicians became educated about smoking and learned that many of their patients wanted to quit or cut back, doctors began offering more treatment, and more people were able to quit. We think the same thing may be possible with alcohol."

Among treatments that could be prescribed are the FDA-approved medications naltrexone, acamprosate and disulfiram, as well as psychotherapy and mutual-aid approaches, such as the 12-step program used by Alcoholics Anonymous.

"Alcohol use disorder is a chronic disease, but compared to other chronic diseases, it's wildly untreated," Bierut said. "For example, two-thirds of patients with HIV and 94% of patients with diabetes receive treatment, compared with only 6% of people with alcohol use disorder."

The researchers noted that during the pandemic, alcohol sales in the U.S. increased by 34%. Consequently, they expect that as the country emerges from COVID-19 and returns to normal, the number of people with alcohol use disorder will have climbed.

"We know alcohol use and misuse have increased during the pandemic," Mintz said. "It seems there has been a shift toward heavier drinking. Plus, many doctor's offices, AA groups and other support groups were shut down for a period of time, so we would hypothesize that even the relatively small percentage of people in treatment may have declined during the past year."

Credit: 
Washington University School of Medicine

Diamonds engage both optical microscopy and MRI for better imaging

image: The microdiamonds used as biological tracers are about 200 microns across, less than one-hundredth of an inch. They fluoresce red but can also be hyperpolarized, allowing them to be detected both optically -- by fluorescence microscopy -- and by radio-frequency NMR imaging, boosting the power of both techniques.

Image: 
Ashok Ajoy, UC Berkeley

When doctors or scientists want to peer into living tissue, there's always a trade-off between how deep they can probe and how clear a picture they can get.

With light microscopes, researchers can see submicron-resolution structures inside cells or tissue, but only as deep as the millimeter or so that light can penetrate without scattering. Magnetic resonance imaging (MRI) uses radio frequencies that can reach everywhere in the body, but the technique provides low resolution -- about a millimeter, or 1,000 times worse than light.

A University of California, Berkeley, researcher has now shown that microscopic diamond tracers can provide information via MRI and optical fluorescence simultaneously, potentially allowing scientists to get high-quality images up to a centimeter below the surface of tissue, 10 times deeper than light alone.

By using two modes of observation, the technique also could allow faster imaging.

The technique would be useful primarily for studying cells and tissue outside the body, for probing blood or other fluids for chemical markers of disease, or for physiological studies in animals.

"This is perhaps the first demonstration that the same object can be imaged in optics and hyperpolarized MRI simultaneously," said Ashok Ajoy, UC Berkeley assistant professor of chemistry. "There is a lot of information you can get in combination, because the two modes are better than the sum of their parts. This opens up many possibilities, where you can accelerate the imaging of these diamond tracers in a medium by several orders of magnitude."

The technique, which Ajoy and his colleagues report this week in the journal Proceedings of the National Academy of Sciences, utilizes a relatively new type of biological tracer: microdiamonds that have had some of their carbon atoms kicked out and replaced by nitrogen, leaving behind empty spots in the crystal -- nitrogen vacancies -- that fluoresce when hit by laser light.

Ajoy exploits an isotope of carbon -- carbon-13 (C-13) -- that occurs naturally in the diamond particles at about 1% concentration, but could also be enriched further by replacing many of the dominant carbon-12 atoms. Carbon-13 nuclei are more readily aligned, or polarized, by nearby spin-polarized vacancy centers, which become polarized at the same time they fluoresce after being illuminated with a laser. The polarized C-13 nuclei yield a stronger signal for nuclear magnetic resonance (NMR) -- the technique at the heart of MRI.

As a result, these hyperpolarized diamonds can be detected both optically -- because of the fluorescent nitrogen vacancy centers -- and at radio frequencies, because of the spin-polarized carbon-13. This allows simultaneous imaging by two of the best techniques available, with particular benefit when looking deep inside tissues that scatter visible light.

"Optical imaging suffers greatly when you go in deep tissue. Even beyond 1 millimeter, you get a lot of optical scattering. This is a major problem," Ajoy said. "The advantage here is that the imaging can be done in radio frequencies and optical light using the same diamond tracer. The same version of MRI that you use for imaging inside people can be used for imaging these diamond particles, even when the optical fluorescence signature is completely scattered out."

Detecting nuclear spin

Ajoy focuses on improving NMR -- a very precise way of identifying molecules -- and its medical imaging counterpart, MRI, in hopes of lowering the cost and reducing the size of the machines. One limitation of NMR and MRI is that large, powerful and costly magnets are needed to align or polarize the nuclear spins of molecules inside samples or the body so that they can be detected by pulses of radio waves. But humans can't withstand the very high magnetic fields needed to get lots of spins polarized at once, which would provide better images.

One way to overcome this is to tweak the nuclear spins of the atoms you want to detect so that more of them are aligned in the same direction, instead of randomly. With more spins aligned, called hyperpolarization, the signal detected by radio is stronger, and less powerful magnets can be used.
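As a rough guide to why field strength matters here, the textbook expression below gives the thermal-equilibrium polarization of spin-1/2 nuclei; the numerical estimate is illustrative and is not taken from the paper.

    % Thermal-equilibrium polarization of spin-1/2 nuclei (standard textbook
    % expression; the numerical estimate is illustrative, not from the paper).
    \[
      P \;=\; \tanh\!\left(\frac{\gamma \hbar B_0}{2 k_B T}\right)
        \;\approx\; \frac{\gamma \hbar B_0}{2 k_B T}
    \]
    % For carbon-13 (gamma/2pi ~ 10.7 MHz/T) at B_0 = 1.5 T and T ~ 300 K,
    % P is on the order of 10^-6: only about one nucleus in a million
    % contributes net signal, which is why aligning far more spins
    % (hyperpolarization) boosts the detectable signal so dramatically.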

In his latest experiments, Ajoy employed a magnetic field equivalent to that of a cheap refrigerator magnet and an inexpensive green laser to hyperpolarize the carbon-13 atoms in the crystal lattice of the microdiamonds.

"It turns out that if you shine light on these particles, you can align their spins to a very, very high degree -- about three to four orders of magnitude higher than the alignment of spins in an MRI machine," Ajoy said. "Compared to conventional hospital MRIs, which use a magnetic field of 1.5 teslas, the carbons are polarized effectively like they were in a 1,000-tesla magnetic field."

When the diamonds are targeted to specific sites in cells or tissue -- by antibodies, for example, which are often used with fluorescent tracers -- they can be detected both by NMR imaging of the hyperpolarized C-13 and the fluorescence of the nitrogen vacancy centers in the diamond. The nitrogen vacancy-center diamonds are already becoming more widely used as tracers for their fluorescence alone.

"We show one important cool feature of these diamond particles, the fact that they spin polarize -- therefore they can glow very bright in an MRI machine -- but they also fluoresce optically," he said. "The same thing that endows them with the spin polarization also allows them to fluoresce optically."

The diamond tracers also are inexpensive and relatively easy to work with, Ajoy said. Together, these new developments could, in the future, allow for an inexpensive NMR imaging machine on every chemist's benchtop. Today, only large hospitals can afford the million-dollar price tag for MRIs. He currently is working on other techniques to improve NMR and MRI, including using hyperpolarized diamond particles to hyperpolarize other molecules.

Credit: 
University of California - Berkeley

Domestic abuse head injuries prevalent among women in prison, study finds

An international study has found that four out of five women in prison in Scotland have a history of head injury, mostly sustained through domestic violence. The researchers, including SFU psychology graduate student Hira Aslam, say the study, published recently in The Lancet, has important implications for the female prison population more broadly and could help inform mental health and criminal justice policy development.

"The findings are incredibly sobering," says Aslam. "While we anticipated that the incidence of head injuries among women who are involved in the criminal justice system would be high, these estimates exceeded our expectations."

Researchers also found that violent criminal behaviour was three times more likely among women who had a history of significant head injury, while women who sustained such injuries generally had prison sentences that were three times longer. Two-thirds were found to have suffered repeated head injuries, and nearly all reported a history of abuse.

Aslam, the study's second author, who led the interviews and assessments with offenders, says expanding the study to Canadian and other prison populations would be a critical next step.

"The relationship between head trauma and both violent crime and length of incarceration suggests that this may be an important consideration in the assessment and management of violent offending, as well as in reducing the risk for reoffending," says Aslam. "There is a need to consider these vulnerability factors in Canada and elsewhere in developing appropriate policy and interventions for this population."

Credit: 
Simon Fraser University