
Biopolymer-coated nanocatalyst can help realize a hydrogen fuel-driven future

Image: To realize a hydrogen fuel-based future, it is necessary to be able to produce hydrogen efficiently and in an eco-friendly manner. (Credit: Incheon National University)

To combat climate change, shifting from fossil fuels to clean and sustainable energy sources is imperative. A popular candidate in this regard is hydrogen, an eco-friendly fuel that produces only water when used. However, efficient methods of hydrogen production are usually not eco-friendly. The eco-friendly alternative of splitting water with sunlight to produce hydrogen is inefficient and suffers from the low stability of the photocatalyst (the material that facilitates the reaction by absorbing light). How does one develop a photocatalyst that is both stable and efficient?

In a study recently published in Applied Catalysis B: Environmental, an international group of scientists, led by Assistant Professor Yeonho Kim from Incheon National University in Korea, addressed this question and reported on the performance of polydopamine (PDA)-coated zinc sulfide (ZnS) nanorods as a photocatalyst, which showed a 220% increase in hydrogen production compared with ZnS alone! Moreover, it displayed decent stability, retaining almost 79% of its activity after 24 hours of irradiation. Dr. Kim outlines the motivation behind their research, "ZnS has various photochemical applications because it can rapidly generate electric charge carriers under sunlight. However, sunlight also causes oxidation of sulfide ions leading to photocorrosion of ZnS. Recently, studies showed that controlled-thickness PDA coatings on a photocatalyst can improve conversion efficiency for solar energy and enhance photostability. But, so far, no study has addressed the physico-chemical changes at the interface of ZnS/PDA. Therefore, we wanted to study the effect of PDA binding on the photocatalytic performance of ZnS."

The scientists fabricated the PDA-coated ZnS nanocatalysts by polymerizing dopamine onto ZnS nanorods, varying the polymerization period to create samples with three different PDA thicknesses--1.2 nm (ZnS/PDA1), 2.1 nm (ZnS/PDA2), and 3.5 nm (ZnS/PDA3). They then measured the photocatalytic performance of these samples by monitoring their hydrogen production under simulated sunlight illumination.

The ZnS/PDA1 catalyst showed the highest hydrogen production rate followed by ZnS/PDA2, uncoated ZnS, and ZnS/PDA3. The team attributed the inferior performance of ZnS/PDA2 and ZnS/PDA3 to greater light absorption by the thicker PDA coatings, which reduced the light reaching ZnS and impeded the excited charge carriers from reaching the surface; uncoated ZnS, by contrast, underwent photocorrosion.

To understand the role of electronic structure in the observed enhancement, the scientists measured the emission and extinction spectra of the samples and performed density functional theory calculations. The spectra revealed that the enhanced absorption was due to Zn-O or O-Zn-S shells forming on ZnS and to the creation of energy levels near the valence band (the highest band of energy levels filled with electrons) that can accept "holes" (absences of electrons), while the calculations showed that ZnS/PDA has a unique "doubly staggered" electronic structure that facilitates the transport and separation of charge carriers at the surface. The improved durability was attributed to the lowered oxidative capacity of holes in the valence states of PDA.

Dr. Kim and his team are hopeful of wider applications of their technique. "The polydopamine coating utilized in our work is also applicable to other groups of selenide, boride, and telluride-based catalysts," comments Dr. Kim.

The future might indeed be hydrogen!

Credit: 
Incheon National University

LSU Health New Orleans study finds disadvantaged census tracts linked to COVID incidence

New Orleans, LA - An LSU Health New Orleans School of Public Health study reports a positive association between social vulnerability and COVID-19 incidence at the census tract level and recommends that more resources be allocated to socially vulnerable populations to reduce the incidence of COVID-19. The findings are published in Frontiers in Public Health.

"In our study, we found Louisiana census tracts with higher levels of social vulnerability were associated with higher COVID-19 cumulative incidence between March 9 to August 24, 2020, even after adjusting for population density," says first author Erin Biggs, MPH, Doctoral Candidate, PhD in Epidemiology at LSU Health New Orleans School of Public Health.
The researchers conducted an ecological study comparing the Centers for Disease Control and Prevention's Social Vulnerability Index (CDC SVI) and census tract-level COVID-19 case counts.

According to the CDC, social vulnerability refers to the resilience of communities when confronted by external stresses on human health, stresses such as natural or human-caused disasters, or disease outbreaks. Its Agency for Toxic Substances and Disease Registry's Social Vulnerability Index uses U.S. census variables at tract level to help local officials identify communities that may need support in preparing for hazards or recovering from a disaster. The CDC SVI ranks each tract on 15 social factors, including poverty, unemployment rate, percentage of single-parent households, lack of vehicle access, and crowded housing. The factors are grouped into four themes - socioeconomic, household composition and disability, minority and language, and housing and transportation.

The researchers identified census tracts with high levels of both social vulnerability and COVID-19 incidence. They report that as the SVI increases, so too does COVID-19 cumulative incidence.
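
As a rough illustration of the kind of tract-level, ecological analysis described here, the sketch below fits a count regression of cumulative cases on the SVI with a population offset and a population-density adjustment. The data file, column names, and model family are assumptions for illustration, not details taken from the published study.

```python
# Illustrative sketch only: tract-level regression of cumulative COVID-19
# case counts on the CDC SVI, adjusting for population density.
# File name, column names and model family are hypothetical assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

tracts = pd.read_csv("tract_data.csv")  # hypothetical columns:
# cases (cumulative count), population, svi_rank (0-1), pop_density

# A Poisson model with a log-population offset describes incidence rather
# than raw counts; population density enters as a covariate, mirroring the
# study's adjustment.
model = smf.glm(
    "cases ~ svi_rank + pop_density",
    data=tracts,
    family=sm.families.Poisson(),
    offset=np.log(tracts["population"]),
).fit()

print(model.summary())  # a positive svi_rank coefficient corresponds to
                        # higher incidence in more vulnerable tracts
```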

The authors identify some of the factors that increase risk, such as having jobs where people cannot work from home and that bring them into contact with large numbers of people, living in crowded households with less room to physically distance, less ability to buy face masks, and less access to quality scientific information. They note that African Americans and Hispanics are more likely to live in multigenerational homes, which may make self-isolation more difficult for family members who contract COVID-19. These conditions can lead to increased transmission and community prevalence.

The authors write that their findings support the recent argument that the United States faces significant challenges in its handling of the COVID-19 epidemic, particularly due to structural racism and inattention to the barriers to health at the root of the nation's racial health disparities. They conclude that the CDC's Social Vulnerability Index could be useful in identifying locations that are most impacted by COVID-19 and should thus be targeted for more specific interventions. The factors that comprise social vulnerability, such as income, education, poverty, race, and ethnicity, influence who will suffer the most from the COVID-19 epidemic.

Credit: 
Louisiana State University Health Sciences Center

Gay men who 'sound gay' encounter more stigma and discrimination from heterosexual peers

Gay men are more likely than lesbian women to face stigma and avoidant prejudice from their heterosexual peers due to the sound of their voice, a new study in the British Journal of Social Psychology reports. Researchers also found that gay men who believe they sound gay anticipate stigma and are more vigilant regarding the reactions of others.

During this unique study, researchers from the University of Surrey investigated the role of essentialist beliefs -- the view that every person has a set of attributes that provides an insight into their identity -- among heterosexual, lesbian and gay individuals, and whether these beliefs lead to prejudice and rejection towards others. Previous research in this area has shown that gay men's and lesbian women's experiences with stigma can lead to a higher likelihood of emotional distress, depression, and anxiety.

In the first part of the study, researchers surveyed 363 heterosexual participants to assess their essentialist beliefs regarding gay and lesbian individuals, asking a series of questions about discreteness (e.g., "When listening to a person it is possible to detect his/her sexual orientation from his/her voice very quickly"), immutability (e.g., "Gay/lesbian people sound gay/lesbian and there is not much they can do to really change that") and controllability (e.g., "Gay/lesbian people can choose to sound gay or straight depending on the situation").

Researchers also investigated whether participants held any prejudices (e.g., "I think male/female homosexuals are disgusting") and endorsed avoidant discrimination (e.g., "I would not interact with a man/woman who sounds gay/lesbian if I could avoid it").

It was found that participants believed voice was a better cue to sexual orientation for men than for women, and their opinions on the discreteness, immutability and controllability of 'gay-sounding' voices were linked to higher avoidant discrimination towards gay-sounding men.

In the second part of the study researchers surveyed 147 gay and lesbian participants to examine their essentialist beliefs in relation to self-perception of sounding gay, and whether this led them to expect rejection and be more vigilant, e.g., trying to avoid certain social situations and persons who may ridicule them because of their voices.

Researchers found that gay men's endorsement of beliefs that people can detect sexual orientation from voice (voice discreteness) and that speakers cannot change the way they sound (voice immutability) were associated with a stronger self-perception of sounding gay. Moreover, gay men who perceived their voices to sound more gay expected more acute rejection from heterosexuals and were more vigilant.

Dr Fabio Fasoli, Lecturer in Social Psychology at the University of Surrey, said: "What we have found is that people have stronger beliefs about the voices of gay men than lesbian women. In particular, beliefs that gay men and straight men have different voices that allow people to detect their sexual orientation were linked to stigmatisation, possibly explaining why some heterosexual individuals stigmatise gay-sounding men regardless of their sexuality. Understanding more about essentialist beliefs helps explain both the perpetration of stigma by heterosexuals and the experience of stigma by lesbians and gay men.

"It is clear from this study that voice and the perception of it are linked to stigma. This is important because it can have negative consequences for gay men's wellbeing."

Credit: 
University of Surrey

Like wine, environmental conditions impact flavor of whiskey, study finds

Image: Close-up of barley. (Credit: Oregon State University)

CORVALLIS, Ore. - Flavor differences in whiskey can be discerned based solely on the environment in which the barley used to make the whiskey is grown, a new study co-authored by an Oregon State University researcher found.

This is the first scientific study to find that the environmental conditions, or terroir, in which the barley is grown impact the flavor of whiskey, said Dustin Herb, an author of the study and a courtesy faculty member in the Department of Crop and Soil Science at Oregon State University.

"Terroir is increasingly being used to differentiate and market agricultural products, most commonly wine, as consumers grow more interested in the origins of their food," Herb said. "Understanding terroir is something that involves a lot of research, a lot of time and a lot of dedication. Our research shows that environmental conditions in which the barley is grown have a significant impact."

Herb, who is originally from Lebanon, Oregon, and earned his undergraduate and doctoral degrees from Oregon State, is the only American author of the study, which was published in the journal Foods. The other authors are all from Ireland, where the study was conducted.

Herb's doctoral research at Oregon State with Pat Hayes, a barley breeder in the College of Agricultural Sciences, focused on the contributions of barley to beer flavor. Their research found notable differences in the taste of beers brewed from malted barley varieties reputed to have distinct flavor qualities.

That research caught the attention of Waterford Distillery. The Irish distillery reached out to Herb, flew him to Ireland and asked him if he could design a study that would attempt to answer the question of whether terroir exists in whiskey. They dubbed it The Whisky Terroir Project. (Whiskey can be spelled with and without an "e.")

Herb designed a study that involved planting two common commercial varieties of barley in Ireland - Olympus and Laureate - in two distinct environments, Athy, Co. Kildare and Bunclody, Co. Wexford, in 2017 and 2018. Athy is an inland site and Bunclody is a coastal site. They were selected in part because they have different soil types and different temperature ranges and rainfall levels during the barley growing season.

The crops of each barley variety at each site in each year were harvested, stored, malted and distilled in a standardized way. Once distilled, the product is called "new make spirit." (It isn't called whiskey until it is matured in a wooden cask for at least three years.)

The researchers used gas chromatography-mass spectrometry and the noses of a trained six-person sensory panel to determine which compounds in the barley contributed most to the aroma of the new make spirit.

That analysis, along with further mathematical and statistical analysis, found that the environment in which the barley was grown had a greater contribution to the aroma of the whiskey than the variety of the barley--a clear indication of the impact terroir has on the new make spirit.
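
One conventional way to ask whether growing site or barley variety explains more of the variation in a measured aroma attribute is a factorial analysis of variance; the sketch below is purely illustrative and is not necessarily the statistical workflow the authors used (the data file and column names are hypothetical).

```python
# Illustrative sketch only: comparing the contribution of growing site vs.
# barley variety to a measured aroma attribute with a two-way ANOVA.
# Data file, column names and model are hypothetical assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

spirits = pd.read_csv("new_make_aroma.csv")
# assumed columns: aroma_score, site ("Athy"/"Bunclody"),
#                  variety ("Olympus"/"Laureate"), year (2017/2018)

model = smf.ols("aroma_score ~ C(site) + C(variety) + C(year)",
                data=spirits).fit()
anova_table = sm.stats.anova_lm(model, typ=2)

# Comparing the sums of squares for site and variety gives a rough measure
# of which factor explains more of the variation in aroma.
print(anova_table)
```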

Furthermore, the sensory analysis found distinct differences in the aroma characteristics of the new make spirit from the barley grown in each location. Spirit from Athy was more positively associated with sweet, cereal/grainy, feinty/earthy, oily finish, soapy, sour, stale and mouldy sensory attributes, while spirit from Bunclody was more associated with dried fruit and solventy attributes.

"What this does is actually make the farmer and the producer come to the forefront of the product," Herb said. "It gets to the point where we might have more choices and it might provide an opportunity for a smaller brewer or a smaller distiller or a smaller baker to capitalize on their terroir, like we see in the wine industry with a Napa Valley wine, or Willamette Valley wine or a French Bordeaux."

The sensory analysis also found differences in the aromatic profiles between the 2017 and 2018 seasons that were studied.

"This makes us think there might be a vintage aspect to the whiskey like wine, where you buy a 2019 or a 2020 or a 2016," Herb said. "Could the whiskey industry operate in a similar way, where someone is going to seek out a certain vintage of a certain year?"

To answer that question, more research needs to be done, Herb said. That is a project the Whisky Terroir Project plans to tackle: examining flavor changes in the spirits as they mature in casks to see what happens to the terroir impact.

The team is also scaling up the research to study terroir in commercial-scale barley fields over a five-year period.

Credit: 
Oregon State University

Markey's ACTION program develops cancer education curriculum for Appalachian schools

LEXINGTON, Ky. (Feb. 19, 2021) - After conducting a study to assess the need for cancer education materials in Appalachian Kentucky, members of the University of Kentucky Markey Cancer Center's Appalachian Career Training in Oncology (ACTION) program worked with faculty from the UK College of Education to create a three-part cancer education curriculum for middle and high school teachers in the region.

Kentucky is home to the highest rates of cancer incidence and mortality in the country, and that problem is further concentrated in the Appalachian region of the state. Funded by a grant from the National Cancer Institute, ACTION is a two-year program designed to prepare undergraduate and high school students for cancer-focused careers and is open to students who hail from one of the 54 Appalachian Kentucky counties. The program also educates students on ways to make a difference in their own communities through outreach and engagement.

Perhaps not surprisingly, nearly all of ACTION's participants know a friend or family member dealing with cancer.

"It became pretty clear early on that the students were all personally touched by cancer, which was one of the reasons why they were in the program, but they didn't really know much about cancer," said Nathan Vanderford, director of the ACTION Program. "So that got us talking more to them and to their teachers about what's currently being taught in schools regarding the disease."

The ACTION team conducted an online survey with science and health teachers in Appalachian Kentucky. Published in the Journal of Appalachian Health, results of the survey showed that participating teachers agreed that cancer education was important to their students' lives, but the amount of such education is inconsistent.

The participating teachers also expressed a desire to increase cancer education in their classrooms. Many teacher responses touched on the fact that both they and their students see the devastating impact of cancer in their daily lives. Several responses noted the importance of teaching students about cancer from more of a public health angle, i.e., learning more about prevention and risk factors for cancer.

"There's a huge opportunity here, and I think there's a thirst for the knowledge," Vanderford said. "In the survey, without any prompting, the teachers told us exactly what we know - that they and their students are bombarded by cancer all around them, and I think that makes them naturally more curious about the topic."

In response, Vanderford decided to create a curriculum for teachers, reaching out to UK College of Education faculty for expertise on taking the cancer information he wanted teachers to have and developing it into lessons that would fit teachers' current curriculum requirements.

"We know the teachers want to teach about cancer, but it isn't laid out in the curriculum or in the standards," said Sahar Alameh, an assistant professor of STEM education in the UK College of Education. "For example, they told us that they mention cancer when they talk about the ozone layer, or when they talk about mutation and mitosis, but it's not a lesson. My goal was to take Nathan's lessons and suggest modifications to them so they aligned with teaching standards."

"I thought it was really important to work with faculty in the College of Education, because they prepare future teachers every day and they are experts in science education pedagogy," Vanderford said. "Teachers have to teach to these academic standards anyway, but if we can take this great information about a public health topic that is of utmost importance to your students and community and you can be touching on those standards, it's a win-win."

This collaboration yielded a new three-part cancer education curriculum for middle and high school teachers in Appalachia to use in their classrooms. The curriculum includes information on cancer data and risk factors in the region, follows national and state science and health education standards, and is tailored to cultural aspects of Appalachian Kentucky.

Including that Kentucky-specific information was key, Vanderford says. He wanted the cancer lessons to reflect the reality of what teachers and students experience daily, and that also included topics like fatalistic views on cancer and issues around health care access and engagement.

"People tell me all the time that they have had friends and family with cancer and they see it everywhere, but they didn't know, for example, that our cancer rates are the highest in the country," Vanderford said. "The more relevant a topic is to a person, the more they're going to pay attention to it."

With that in mind, Alameh says the curriculum is an example of problem-based learning, a teaching method where real-world issues are used to help drive the learning process in students.

"The teachers tell you that whenever they talk about cancer, the kids - even elementary and middle school kids, not just the high school kids - want to share a story about someone they know who had cancer," she said. "When you situate learning in a meaningful situation, it definitely produces more engagement."

Vanderford and Alameh both highlight the project as a unique, fruitful collaboration between different colleges. Other contributors to the project include ACTION program coordinator Chris Prichard, UK College of Education Associate Professor Melinda Ickes and graduate student Katherine Sharp, and UK College of Arts and Sciences undergraduate student Lauren Hudson.

Hudson, a junior majoring in neuroscience, works with the ACTION Program and is first author on the study. Like the students in the program, the northern Kentucky native had firsthand experience of dealing with cancer, as her mother is a 17-year breast cancer survivor. She says she was always interested in oncology, but was unaware of the scope of the cancer problem in Kentucky until she became involved with ACTION.

"When I started working on this and learning more about it, I thought, 'Why aren't we talking about this more in the areas of Kentucky where everyone needs to know about cancer?'" she said. "This is really exciting for me. In high school, if someone had presented this material to me, I would have loved it. I know there are students out there like me who were interested in medicine, but health disparities are not always covered in high school. I like to think there are students who are going to realize that pursuing a cancer career is what they want to do from learning this curriculum."

The team has already begun disseminating the curriculum to teachers in the region and will continue to adapt the curriculum based on feedback from both teachers and students. Seeing their work making its way into Kentucky communities is exciting, says Vanderford, who notes that creating a tangible, useful product from scientific work is a rare and difficult process. With more than 40 publications under his belt, this project is especially significant for him.

"It's a scientist's dream to actually create something that people will use and that will be helpful for people's knowledge and in this case, potentially impact public health in terms of lowering cancer risk," Vanderford. "For this reason, our work here is kind of phenomenal to me."

Credit: 
University of Kentucky

College students displaced from campus due to COVID-19 show worse psychological outcomes

BOSTON -- Numerous psychiatric studies have documented increased rates of depression and anxiety among those forced to relocate, with sudden moves often affecting individuals' social support and sense of identity and control. As the COVID-19 pandemic spread through the U.S. in March of 2020, universities evacuated students from their campuses, and thousands quickly relocated. Few studies have examined the mental health impact of the sudden disruption. In a new study of 791 undergraduate and graduate students, surveyed between April 9 and August 4, 2020, researchers from Brigham and Women's Hospital, Boston University's School of Social Work, and McLean Hospital revealed that students forced to relocate during the spring were more likely to report COVID-19-related grief, loneliness and generalized anxiety symptoms than students who did not relocate. The findings are published in the Journal of Psychiatric Research.

"Students have high rates of mental health concerns, and moving is a major life stressor for anyone, so we wondered what the consequences of relocating from campus would be," said corresponding author Cindy Liu, PhD, of the Departments of Pediatric Newborn Medicine and Psychiatry at the Brigham. "We saw that moving, itself, predicted mental health concerns, even when accounting for other factors that might be involved in mental health."

These other factors included sociodemographic determinants like age and race, pre-existing mental health diagnoses, measurements of psychological resilience and distress tolerance, the COVID-19 transmission rate in the state where the school was located, and when students took the survey between April and August.

"Even before the pandemic, college students suffered from mental health crises, but COVID-19 has amplified depression, anxiety, and virus-related worry. One of the important findings of the study shows that those who have higher levels of psychological resilience and distress tolerance are much less likely to have symptoms of depression, anxiety, and PTSD. In other words, those with a higher ability to bounce back from adversity and/or to manage and cope with emotional distress were less likely to have mental health problems during the pandemic," said Hyeouk "Chris" Hahm, chair and associate professor of the Social Research Department at BU's School of Social Work, one of the paper's authors.

Approximately one-third of the respondents were required to leave campus, and roughly 80 percent needed to complete the move within one week. Of the 264 students who relocated, roughly 40 percent stated that they left valuable personal belongings behind. These students were more likely to report COVID-19-related worries, grief, and symptoms of depression, generalized anxiety and post-traumatic stress disorder (PTSD) symptoms, even when accounting for the same predictors of mental health described above.

The association between mental health concerns and leaving behind personal belongings -- which could include medication or other essential items -- was particularly striking to the researchers.

"The logistics surrounding the move, the sense of loss associated with it, how colleges communicate the relocation, and whether or not they provide resources to help students -- those are all very critical considerations that need to take place," Liu said. "When we make the decision to move students off campus, it's important to acknowledge that there may be downstream effects from that action."

Hahm added, "Many students became frustrated with the unclear guidelines coming from their colleges regarding their return dates. Some were led to believe that they could return to campus within a month; thus, some students left their belongings in their dorms. Not all students were able to return to pick up their belongings because of the long distance or limited time."

Certain vulnerable subpopulations were more likely to relocate, including those who indicated nonbinary gender identities and those who received financial aid. Compared to men, those with nonbinary gender identities were more likely to report depressive, generalized anxiety and PTSD symptoms.

The findings come from a larger survey of young adults between the ages of 18 and 30 called the CARES 2020 Project (COVID-19 Adult Resilience Experiences Study), spearheaded by Liu and Hyeouk Chris Hahm, MSSW, PhD, of the School of Social Work at Boston University. The study assesses symptoms of loneliness, depression, anxiety, and post-traumatic stress disorder, as well as two new metrics -- COVID-19-related grief and worry -- that the researchers developed to examine losses and insecurities specifically associated with the pandemic. These range from difficulties in accessing resources to fears about contracting the virus or transmitting it to loved ones. The researchers recruited participants through email lists, social media pages, and by word-of-mouth.

Achieving accurate demographic representation was difficult, which the researchers note can limit the ability to generalize their findings. For example, 80 percent of respondents were women, and Black and Latinx individuals accounted for 4.8 and 5.8 percent of respondents, respectively. Engaging marginalized groups in the study was both important and challenging amidst the broader upheaval of the pandemic. To this end, the authors speculate that their respondents were, on average, supported by families with more stable socioeconomic statuses: 44 percent received no financial aid and, of those who relocated, 86 percent were able to return to a parent or guardian and 91 percent were not obligated to pay for their living arrangement.

The researchers are currently examining the experiences of self-identified sexual and gender minorities in more depth. They have also obtained data from a second wave of the study to track young people's experiences during the pandemic beyond the timeframe covered by the initial responses.

"Hopefully there won't be a need for students to relocate again," Liu said. "But going forward, I think we need to continue following the psychological consequences of that move."

Credit: 
Brigham and Women's Hospital

Study finds COVID risk communication targeting younger adults may have biggest impact

A study of adults in the United States finds that - broadly speaking - the older you are, the more concerned you are about COVID-19, and the more steps you take to reduce your risk from COVID-19. The study suggests that the biggest boost in risk reduction would stem from communication efforts aimed at raising awareness of COVID-19 risks among U.S. adults under the age of 40.

"Our study reinforces the idea that different generations perceive the risks associated with COVID-19 very differently," says Yang Cheng, corresponding author of the study and an assistant professor of communication at North Carolina State University. "It also highlights the need to do more to communicate the need for preventive measures to younger generations."

For this study, researchers conducted a survey of 1,843 adults across the U.S. The survey was conducted in April 2020 and focused on how study participants perceived risks associated with COVID-19 and what they were doing to reduce those risks. Researchers broke the study participants into four generational groups when analyzing the survey data. Baby Boomers were those study participants who were 55 years old or older at the time of the survey. Generation X was aged 40-54. Generation Y was 25-39. Generation Z was 18-24.
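
For reference, the study's age bands map onto the generational groups as in the following minimal sketch (the cutoffs are those described above; the example ages are hypothetical).

```python
# Minimal illustration of the study's age-to-generation grouping at the
# time of the April 2020 survey, using the cutoffs described above.
def generation(age: int) -> str:
    if age >= 55:
        return "Baby Boomer"
    if age >= 40:
        return "Generation X"
    if age >= 25:
        return "Generation Y"
    if age >= 18:
        return "Generation Z"
    return "outside study range"

print([generation(a) for a in (19, 30, 47, 62)])
# ['Generation Z', 'Generation Y', 'Generation X', 'Baby Boomer']
```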

The researchers found that Baby Boomers perceived COVID-19 to pose the greatest risk, which is not surprising given that the emphasis at the time of the survey was largely on the risk the disease posed to older adults. Generations X and Y were next, and had very similar assessments of risk. Generation Z had the lowest perceived views of risk associated with COVID-19.

When it came to risk-reduction behaviors - such as wearing a mask and social distancing - Baby Boomers were again the most cautious; on average they did the most to reduce their risk. Generation X engaged in the second most risk-reduction behaviors. Generations Y and Z engaged in about the same number of risk-reduction behaviors.

The researchers also looked at how perceptions of risk influenced risk-reduction behaviors in each age group. And there were big differences here.

For example, Baby Boomers who perceived COVID-19 to be a low risk still took far more precautions than people in Generation Y or Generation Z who perceived COVID-19 to be a low risk. In other words, there was a big difference between people in different generations who thought COVID-19 wasn't dangerous to them. However, that gap narrowed considerably when people viewed the risk as being more significant - until there was very little generational gap in risk-reduction behaviors for people who felt COVID-19 posed a serious risk.

"Persuasive health messages tailored for these younger generations, to increase their level of per-ceived risk, could encourage them to engage in more risk reduction - and help us reduce the spread of the disease," Cheng says.

"It would also be valuable to run this survey again to see how attitudes and behaviors have evolved over the past year," Cheng says. "What's changed? I'd also like to explore issues related to vaccination."

Credit: 
North Carolina State University

UIC researchers invent new gene-editing tool

Researchers from the University of Illinois Chicago have discovered a new gene-editing technique that allows for the programming of sequential cuts -- or edits -- over time.

CRISPR is a gene-editing tool that allows scientists to change the DNA sequences in cells and sometimes add a desired sequence or genes. CRISPR uses an enzyme called Cas9 that acts like scissors to make a cut precisely at a desired location in the DNA. Once a cut is made, the ways in which cells repair the DNA break can be influenced to result in different changes or edits to the DNA sequence.

The discovery of the gene-editing capabilities of the CRISPR system was described in the early 2010s. In only a few years, scientists became enamored with the ease of guiding CRISPR to target almost any DNA sequence in a cell or to target many different sites in a cell in a single experiment.

"A drawback of currently available CRISPR-based editing systems is that all the edits or cuts are made all at once. There is no way to guide them so that they take place in a sequential fashion, one after the other," said UIC's Bradley Merrill, associate professor of biochemistry and molecular genetics at the College of Medicine and lead author of the paper.

Merrill and colleagues' new process involves the use of special molecules called guide RNA that ferry the Cas9 enzyme within the cell and determine the precise DNA sequence at which Cas9 will cut. They call their specially engineered guide RNA molecules "proGuides," and the molecules allow for the programmed sequential editing of DNA using Cas9.

Their findings are published in the journal Molecular Cell.

While proGuide is still in the prototype phase, Merrill and colleagues plan to further develop their concept and hope that researchers will be able to use the technique soon.

"The ability to preprogram the sequential activation of Cas9 at multiple sites introduces a new tool for biological research and genetic engineering," Merrill said. "The time factor is a critical component of human development and also disease progression, but current methods to genetically investigate these processes don't work effectively with the time element. Our system allows for gene editing in a pre-programmed fashion, enabling researchers to better investigate time-sensitive processes like how cancer develops from a few gene mutations and how the order in which those mutations occur may affect the disease."

Credit: 
University of Illinois Chicago

Reclusive neutron star may have been found in famous supernova

Video: This computer model, from a paper by Orlando and collaborators, shows the remnant in 2017, incorporating data taken by Chandra, ESA's XMM-Newton and Japan's Advanced Satellite for Cosmology and Astrophysics (ASCA). (Credit: INAF-Osservatorio Astronomico di Palermo/Salvatore Orlando)

Since astronomers captured the bright explosion of a star on February 24, 1987, researchers have been searching for the squashed stellar core that should have been left behind. A group of astronomers using data from NASA space missions and ground-based telescopes may have finally found it.

As the first supernova visible with the naked eye in about 400 years, Supernova 1987A (or SN 1987A for short) sparked great excitement among scientists and soon became one of the most studied objects in the sky. The supernova is located in the Large Magellanic Cloud, a small companion galaxy to our own Milky Way, only about 170,000 light-years from Earth.

While astronomers watched debris explode outward from the site of the detonation, they also looked for what should have remained of the star's core: a neutron star.

Data from NASA's Chandra X-ray Observatory and previously unpublished data from NASA's Nuclear Spectroscopic Telescope Array (NuSTAR), in combination with data from the ground-based Atacama Large Millimeter Array (ALMA) reported last year, now present an intriguing collection of evidence for the presence of the neutron star at the center of SN 1987A.

"For 34 years, astronomers have been sifting through the stellar debris of SN 1987A to find the neutron star we expect to be there," said the leader of the study, Emanuele Greco, of the University of Palermo in Italy. "There have been lots of hints that have turned out to be dead ends, but we think our latest results could be different."

When a star explodes, it collapses onto itself before the outer layers are blasted into space. The compression of the core turns it into an extraordinarily dense object, with the mass of the Sun squeezed into an object only about 10 miles across. These objects have been dubbed neutron stars, because they are made nearly exclusively of densely packed neutrons. They are laboratories of extreme physics that cannot be duplicated here on Earth.

Rapidly rotating and highly magnetized neutron stars, called pulsars, produce a lighthouse-like beam of radiation that astronomers detect as pulses when their rotation sweeps the beam across the sky. There is a subset of pulsars that produce winds from their surfaces - sometimes at nearly the speed of light - that create intricate structures of charged particles and magnetic fields known as "pulsar wind nebulae."

With Chandra and NuSTAR, the team found relatively low-energy X-rays from SN 1987A's debris crashing into surrounding material. The team also found evidence of high-energy particles using NuSTAR's ability to detect more energetic X-rays.

There are two likely explanations for this energetic X-ray emission: either a pulsar wind nebula, or particles being accelerated to high energies by the blast wave of the explosion. The latter effect doesn't require the presence of a pulsar and occurs over much larger distances from the center of the explosion.

On a couple of fronts, the latest X-ray study supports the case for the pulsar wind nebula -- meaning the neutron star must be there -- by arguing against the scenario of blast wave acceleration. First, the brightness of the higher-energy X-rays remained about the same between 2012 and 2014, while the radio emission detected with the Australia Telescope Compact Array increased, which goes against expectations for the blast wave scenario. Next, the authors estimate that it would take almost 400 years to accelerate the electrons up to the highest energies seen in the NuSTAR data, more than 10 times the age of the remnant.

"Astronomers have wondered if not enough time has passed for a pulsar to form, or even if SN 1987A created a black hole," said co-author Marco Miceli, also from the University of Palermo. "This has been an ongoing mystery for a few decades and we are very excited to bring new information to the table with this result."

The Chandra and NuSTAR data also support a 2020 result from ALMA that provided possible evidence for the structure of a pulsar wind nebula in the millimeter wavelength band. While this "blob" has other potential explanations, its identification as a pulsar wind nebula could be substantiated with the new X-ray data. This is more evidence supporting the idea that there is a neutron star left behind.

If this is indeed a pulsar at the center of SN 1987A, it would be the youngest one ever found.

"Being able to watch a pulsar essentially since its birth would be unprecedented," said co-author Salvatore Orlando of the Palermo Astronomical Observatory, a National Institute for Astrophysics (INAF) research facility in Italy. "It might be a once-in-a-lifetime opportunity to study the development of a baby pulsar."

The center of SN 1987A is surrounded by gas and dust. The authors used state-of-the-art simulations to understand how this material would absorb X-rays at different energies, enabling more accurate interpretation of the X-ray spectrum, that is, the amount of X-rays at different energies. This allowed them to estimate what the spectrum of the central regions of SN 1987A would be without the obscuring material.

As is often the case, more data are needed to strengthen the case for the pulsar wind nebula. An increase in radio waves accompanied by an increase in relatively high-energy X-rays in future observations would argue against this idea. On the other hand, if astronomers observe a decrease in the high-energy X-rays, then the presence of a pulsar wind nebula will be corroborated.

The stellar debris surrounding the pulsar plays an important role by heavily absorbing its lower energy X-ray emission, making it undetectable at the present time. The model predicts that this material will disperse over the next few years, which will reduce its absorbing power. Thus, the pulsar emission is expected to emerge in about 10 years, revealing the existence of the neutron star.

Credit: 
Center for Astrophysics | Harvard & Smithsonian

Models to predict dengue, zika and yellow fever outbreaks are developed by researchers

Image: A monkey being examined in the Manaus area. Scientists will monitor areas in which these diseases are endemic to investigate the factors that trigger outbreaks. (Credit: CREATE-NEO)

By Maria Fernanda Ziegler | Agência FAPESP – Yellow fever was the first human disease to have a licensed vaccine and has long been considered important to an understanding of how epidemics happen and should be combated. It was introduced to the Americas in the seventeenth century, and high death rates have resulted from successive outbreaks since then. Epidemics of yellow fever were associated with the slave trade, the US gold rush and settlement of the Old West, the Haitian Revolution, and construction of the Panama Canal, to cite only a few examples.

Centuries after the disease was first reported in the Americas, an international team of researchers will embark on a groundbreaking study to develop models that predict epidemics of yellow fever and other diseases caused by mosquito-borne arboviruses such as dengue, zika, and chikungunya.

“Knowledge of these diseases, their cycles, and the possibilities of new outbreaks is very well-established, but we still lack a systematic understanding of how to predict when outbreaks will occur. Our goal is to create predictive models to help monitor and combat outbreaks, protect the public, and develop a deeper understanding of the combination of factors that leads to epidemics,” said Maurício Lacerda Nogueira, a professor at the São José do Rio Preto Medical School (FAMERP) in the state of São Paulo, Brazil, and a member of the CREATE-NEO project funded by the US National Institutes of Health (NIH).

The new study is part of a Thematic Project supported by FAPESP to monitor the mosquito population in the urban area of São José do Rio Preto and the monkey and mosquito populations in the transition zone between rural and urban Manaus, the Amazonas state capital.

In addition to FAMERP, the Brazilian research centers that are participating in the initiative include the National Institute for Research on the Amazon (INPA), the Federal University of Mato Grosso (UFMT), the Federal University of Amazonas (UFAM), and the Heitor Vieira Dourado Foundation for Tropical Medicine (FMT-HVD). Participants located elsewhere include the University of Texas Medical Branch (UTMB), New Mexico State University (NMSU), the Massachusetts Institute of Technology (MIT), and the Cary Institute of Ecosystem Studies in the US, and the Gorgas Memorial Institute for Health Studies (ICGES) in Panama, among others.

An article written by the researchers to mark the project’s inception has been published in Emerging Topics in Life Sciences, reviewing the factors that influence the potential re-emergence of yellow fever in the Neotropics.

“Deforestation, seasonal variations in rainfall and non-human primate populations are all factors that influence outbreaks, but we need to know the tipping-point for each one, and to find that out we’ll develop predictive models based on research and monitoring conducted in arbovirus hotspots in São Paulo, Amazonas and the Pantanal in Brazil, and in Panama,” Nogueira said.

The history of yellow fever shows that outbreaks occur at intervals of between seven and ten years. “Brazil has many arboviruses, with outbreaks and even epidemics in progress all the time,” said Livia Sacchetto, also a researcher at FAMERP and a member of CREATE-NEO.

From forest to city and vice-versa

According to Sacchetto, the project also aims to find out more about spillovers and, if possible, anticipate these outbreaks in which arboviruses jump from humans to animals or vice-versa. Dengue, zika, and chikungunya are transmitted to humans and non-human primates by infected Aedes aegypti mosquitoes. In the case of yellow fever, A. aegypti is the urban vector but mosquitoes of a different genus (Haemagogus) are responsible for transmission in the countryside (sylvatic cycle).

Despite the existence of a highly effective vaccine since 1937, and no reported cases caused by urban transmission since 1942, sylvatic outbreaks of yellow fever frequently spill over into cities. “Many people and monkeys die of yellow fever in Brazil and other parts of the Americas, as well as in Africa,” Sacchetto said. “Despite the vaccine and progress in controlling transmission of the disease, we continue to see cases emerging from the sylvatic cycle. The virus is endemic in part of Brazil, with persistent circulation between mosquitoes and non-human primates, which are its primary hosts.”

This enzootic cycle is far from easy to control. “Once established, the enzootic cycle ensures that the virus stays in forests or other rural areas, but it can spread to a city via accidental infection of a human,” she said. Circulation of the virus in cities raises concerns about a return of the urban cycle involving transmission by A. aegypti. “Hence the importance of epidemiological surveillance studies and maintenance of large-scale vaccine coverage to control outbreaks.”

Predictive models for arboviruses also take into account climate change and the destruction of native vegetation by urbanization. “We have active cases of yellow fever in both non-human primates and humans in the southern states of Paraná and Santa Catarina. This hasn’t happened for several decades,” Nogueira said.

In Africa, from which yellow fever came to the Americas, most cases of the disease occur in the sub-Saharan region. Urban yellow fever is a major public health concern, with frequent outbreaks that are hard to predict.

In the Americas, as the authors of the review article note, yellow fever has historically been reported from northern Panama to northeastern Argentina. In recent years, most cases have occurred in the Amazon Basin during the rainy season, when population densities of Haemagogus mosquitoes are at their highest, but the number of reported cases following sylvatic spillover has increased in Peru, Bolivia, and Paraguay, as well as Brazil.

The article “Re-emergence of yellow fever in the Neotropics - quo vadis?” (doi: 10.1042/ETLS20200187) by Livia Sacchetto, Betania P. Drumond, Barbara A. Han, Mauricio L. Nogueira and Nikos Vasilakis can be read at: portlandpress.com/emergtoplifesci/article/4/4/411/227095/Re-emergence-of-yellow-fever-in-the-neotropics-quo.

Journal: Emerging Topics in Life Sciences

DOI: 10.1042/ETLS20200187

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Managing suicide risk in research study participants

What should researchers do if they encounter a study participant who reports suicidal thoughts?

UIC College of Nursing associate professor Susan Dunn explores this question as lead author of "Suicide Risk Management Protocol for a Randomized Controlled Trial of Cardiac Patients Reporting Hopelessness," a paper published in the January/February edition of Nursing Research.

Suicide is ranked as the 10th leading cause of death for all ages in the U.S., and suicide risk can be identified through clinical research, according to the paper.

Although suicide screening tools are widely available for patients in emergency, hospital and primary care settings and have been used in research, there is a "significant gap" in the availability of published suicide risk management protocols for use in research studies, the authors wrote.

Because of this, Dunn, who is conducting an NIH-funded study on hopelessness in cardiac patients, says she developed a protocol "from the ground up" to identify, measure and act on suicidal ideation expressed by study participants.

The safety of participants is essential in research, and staff administering studies must be prepared to evaluate and act on suicidal signals from patients, the paper's authors wrote.

"For those research studies when we know patients are at higher risk --and definitely hopelessness does put a patient at higher risk for suicide -- a suicide risk management protocol should absolutely be in place," Dunn says in an interview. "It gives me, as the [principal investigator], peace of mind to know the data collectors -- the nurses doing the intervention -- are all trained to be able to recognize if there is a potential for suicidal ideation."

Dunn's protocol uses the Columbia-Suicide Severity Rating Scale to score whether a patient is at low, moderate or high risk for suicide. Those scores then trigger next steps. For high or moderate risk levels, that would mean stopping data collection, immediately contacting both a mental health resource and the patient's provider, and making sure the patient is supervised. For low risk, it means providing a list of mental health resources. The protocol also requires that those staffing the study take a free online training to use the Columbia scale, participate in role-playing training to be able to identify and act on various suicidal risk levels, and participate in booster trainings (annually or when the protocol changes).
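
As a rough illustration of how such a protocol could be encoded for study staff, the sketch below maps the three risk levels described above to the corresponding actions. It is a simplified paraphrase of the protocol's structure for illustration only, not the published protocol or a clinical tool, and the function and message names are hypothetical.

```python
# Simplified sketch of the risk-level-to-action mapping described above.
# It paraphrases the structure of the published protocol for illustration;
# it is not a clinical tool, and the names used here are hypothetical.
from enum import Enum

class Risk(Enum):
    LOW = "low"
    MODERATE = "moderate"
    HIGH = "high"

def protocol_steps(level: Risk) -> list[str]:
    """Return the study-staff actions for an assessed suicide risk level."""
    if level in (Risk.HIGH, Risk.MODERATE):
        return [
            "Stop data collection",
            "Immediately contact a mental health resource",
            "Immediately contact the patient's provider",
            "Make sure the patient is supervised",
        ]
    # Low risk: provide resources rather than escalating.
    return ["Provide a list of mental health resources"]

for level in Risk:
    print(f"{level.value}: {protocol_steps(level)}")
```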

"The reason I wanted to publish the protocol is so that others can use it as a model for their own research," she says.

Age-adjusted suicide rates increased 30% from 2000 to 2016, according to the paper, and Dunn says they're even higher now due to the COVID-19 pandemic.

"Especially when you think about the current situation we find ourselves in with the pandemic, we know that suicide rates are higher," Dunn says. "How many studies are assessing for suicidal ideation? I think that's a significant issue. It's not always being assessed in high-risk patients."

Credit: 
University of Illinois Chicago

Positive reinforcements help algorithm forecast underground natural reserves

Texas A&M University researchers have designed a reinforcement-based algorithm that automates the process of predicting the properties of the underground environment, facilitating the accurate forecasting of oil and gas reserves.

Within the Earth's crust, layers of rock hold bountiful reservoirs of groundwater, oil and natural gas. Now, using machine learning, researchers at Texas A&M University have developed an algorithm that automates the process of determining key features of the Earth's subterranean environment. They said this research might help with accurate forecasting of our natural reserves.

Specifically, the researchers' algorithm is designed on the principle of reinforcement or reward learning. Here, the computer algorithm converges on the correct description of the underground environment based on rewards it accrues for making correct predictions of the pressure and flow expected from boreholes.

"Subsurface systems that are typically a mile below our feet are completely opaque. At that depth we cannot see anything and have to use instruments to measure quantities, like pressure and rates of flow," said Siddharth Misra, associate professor in the Harold Vance Department of Petroleum Engineering and the Department of Geology and Geophysics. "Although my current study is a first step, my goal is to have a completely automated way of using that information to accurately characterize the properties of the subsurface."

The algorithm is described in the December issue of the journal Applied Energy.

Simulating the geology of the underground environment can greatly facilitate forecasting of oil and gas reserves, predicting groundwater systems and anticipating seismic hazards. Depending on the intended application, boreholes serve as exit sites for oil, gas and water, or as entry sites for excess atmospheric carbon dioxide that needs to be trapped underground.

Along the length of the boreholes, drilling operators can ascertain the pressures and flow rates of liquids or gas by placing sensors. Conventionally, these sensor measurements are plugged into elaborate mathematical formulations, or reservoir models, that predict the properties of the subsurface such as the porosity and permeability of rocks.

But reservoir models are mathematically cumbersome, require extensive human intervention, and at times, even give a flawed picture of the underground geology. Misra said there has been an ongoing effort to construct algorithms that are free from human involvement yet accurate.

For their study, Misra and his team chose a type of machine-learning algorithm based on the concept of reinforcement learning. Simply put, the software learns to make a series of decisions based on feedback from its computational environment.

"Imagine a bird in a cage. The bird will interact with the boundaries of the cage where it can sit or swing or where there is food and water. It keeps getting feedback from its environment, which helps it decide which places in the cage it would rather be at a given time," Misra said. "Algorithms based on reinforcement learning are based on a similar idea. They too interact with an environment, but it's a computational environment, to reach a decision or a solution to a given problem."

So, these algorithms are rewarded for favorable predictions and are penalized for unfavorable ones. Over time, reinforcement-based algorithms arrive at the correct solution by maximizing their accrued reward.

Another technical advantage of reinforcement-based algorithms is that they do not make any presuppositions about the pattern of data. For example, Misra's algorithm does not assume that the pressure measured at a certain time and depth is related to what the pressure was at the same depth in the past. This property makes his algorithm less biased, thereby reducing the chances of error at predicting the subterranean environment.

When initiated, Misra's algorithm begins by randomly guessing a value for porosity and permeability of the rocks constituting the subsurface. Based on these values, the algorithm calculates a flow rate and pressure that it expects from a borehole. If these values do not match the actual values obtained from field measurements, also known as historical data, the algorithm is penalized. Consequently, it is forced to correct its next guess for the porosity and permeability. However, if its guesses were somewhat correct, the algorithm is rewarded and makes further guesses along that direction.
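
A heavily simplified sketch of this guess-evaluate-adjust loop appears below. It is not the authors' algorithm: the toy forward model, the reward definition and the accept-if-better update are placeholder assumptions used only to show how simulated borehole responses are matched against historical data, whereas the published work uses reinforcement learning with a reservoir simulator.

```python
# Toy sketch of the reward-driven history-matching loop described above.
# The forward model, reward and update rule are placeholder assumptions;
# the published algorithm uses reinforcement learning with a reservoir
# simulator rather than this simple accept-if-better search.
import random

def forward_model(porosity: float, permeability: float) -> tuple[float, float]:
    """Stand-in for a reservoir simulator: returns (pressure, flow_rate)."""
    return 1.0 / (porosity + 1e-6), 100.0 * porosity * permeability

HISTORICAL = forward_model(0.20, 0.5)   # pretend field measurements

def reward(porosity: float, permeability: float) -> float:
    """Higher (less negative) reward for predictions closer to history."""
    pressure, flow = forward_model(porosity, permeability)
    return -abs(pressure - HISTORICAL[0]) - abs(flow - HISTORICAL[1])

guess = [random.uniform(0.05, 0.35), random.uniform(0.1, 1.0)]  # random start
best_reward = reward(*guess)

for step in range(10):                  # the paper reports ~10 iterations
    candidate = [guess[0] + random.gauss(0.0, 0.02),
                 guess[1] + random.gauss(0.0, 0.05)]
    r = reward(*candidate)
    if r > best_reward:                 # "rewarded": keep moving this way
        guess, best_reward = candidate, r
    # otherwise the candidate is "penalized" and discarded
    print(f"step {step}: porosity={guess[0]:.3f}, permeability={guess[1]:.3f}")
```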

The researchers found that within 10 iterations of reinforcement learning the algorithm was able to correctly and very quickly predict the properties of simple subsurface scenarios.

Misra noted that although the subsurface simulated in their study was simplistic, their work is still a proof of concept that reinforcement algorithms can be used successfully in automated reservoir-property predictions, also referred to as automated history matching.

"A subsurface system can have 10 or 20 boreholes spread over a two- to five-mile radius. If we understand the subsurface clearly, we can plan and predict a lot of things in advance, for example, we would be able to anticipate subsurface environments if we go a bit deeper or the flow rate of gas at that depth," Misra said. "In this study, we have turned history matching into a sequential decision-making problem, which has the potential to reduce engineers' efforts, mitigate human bias and remove the need of large sets of labeled training data."

He said future work will focus on simulating more complex reservoirs and improving the computational efficiency of the algorithm.

Credit: 
Texas A&M University

Oxidation processes in combustion engines and in the atmosphere take the same routes

Image: Laboratory set-up of the free-jet experiment at TROPOS in Leipzig, which allows the investigation of the early phase of oxidation reactions under atmospheric conditions without the walls influencing the reaction behaviour. (Credit: Torsten Berndt, TROPOS)

Thuwal/Helsinki/Leipzig. Alkanes, an important component of fuels for combustion engines and an important class of urban trace gases, react via different reaction pathways than previously thought. These hydrocarbons, formerly called paraffins, therefore produce large amounts of highly oxygenated compounds that can contribute to organic aerosol and thus to air pollution in cities. An international research team has now been able to prove this through laboratory experiments with state-of-the-art measurement technology at the University of Helsinki and the Leibniz Institute for Tropospheric Research (TROPOS) in Leipzig.

The results of this interdisciplinary work provide crucial information about oxidation processes both in combustion engines and in the atmosphere, with direct implications for engine efficiency and for the formation of aerosols, especially in cities, the research team writes in Communications Chemistry, an open-access journal published by Springer Nature.

Oxidation processes play a major role both in the atmosphere and in combustion. In engines, high temperatures enable a chain reaction called autoxidation. But autoxidation also acts as an important source of highly oxygenated compounds in the atmosphere that form organic aerosol, as researchers from Finland, Germany and the USA demonstrated in 2014. Autoxidation is also one reason organic compounds age when exposed to oxygen from the air; it contributes to the spoilage of food and wine.

This chain reaction is initiated by the formation of peroxy radicals (RO2). The propensity of organic compounds to undergo such multistep autoxidation determines the ignition timing of fuels in engines and, on the other hand, the potential to form low-volatility condensable vapours, and consequently organic aerosol, in the atmosphere. How far the multistep autoxidation proceeds depends on the molecular structure of the organic compounds and on the reaction conditions. The competing reaction pathways of the peroxy radicals, which are important intermediates in all oxidation reactions, determine which reaction products form and what their key properties are, and these can ultimately affect both human health and the climate.
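
How far this chain proceeds can be pictured with a simple branching argument: at each step a peroxy radical either isomerises and adds another oxygen molecule, advancing the autoxidation, or it is lost to a chain-terminating bimolecular reaction. The short calculation below is purely illustrative and not taken from the study; the effective rate constants and the number of steps are assumptions chosen for the example.

```python
# Schematic branching-ratio estimate of multistep autoxidation.
# All numbers are assumptions for illustration, not measured values.
k_advance = 0.5    # effective rate of RO2 isomerisation plus O2 addition, s^-1 (assumed)
k_terminate = 0.2  # effective rate of chain-terminating bimolecular loss, s^-1 (assumed)
n_steps = 4        # autoxidation steps taken to count as "highly oxygenated" (assumed)

# Probability that an RO2 radical advances one more step rather than terminating.
p_advance = k_advance / (k_advance + k_terminate)

for i in range(n_steps):
    fraction = p_advance**i * (1.0 - p_advance)
    print(f"fraction of products terminating after {i} step(s): {fraction:.2f}")

print(f"fraction reaching {n_steps} or more steps (highly oxygenated): {p_advance**n_steps:.2f}")
```

In this picture, anything that speeds up termination, such as high concentrations of nitrogen oxides, shrinks the highly oxygenated fraction, which makes the observation reported below, that alkanes still form considerable amounts of such compounds under those conditions, all the more notable.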

Since peroxy radicals are very reactive, their chemical reactions take place very quickly, and individual reaction steps were therefore overlooked for a long time. The discovery of highly oxygenated organic molecules (HOMs) seven years ago was only possible thanks to advances in measurement techniques. A special mass spectrometer (chemical ionisation - atmospheric pressure interface - time of flight, CI-APi-TOF) that can monitor these very short-lived compounds has now been used to measure the radicals and oxidation products of alkanes. "Until now, there have been no studies on HOM formation from alkanes because it was assumed that their structure would be unfavourable for autoxidation," reports Dr. Torsten Berndt from TROPOS. Methane, an important greenhouse gas, belongs to the group of alkanes. The most important fossil fuels of the world economy, derived from crude oil and natural gas, also consist of alkanes, including propane, butane, pentane, hexane, heptane and octane. New findings about the oxidation behaviour of this group of substances therefore have great relevance in many areas.

To gain a deeper insight into alkane autoxidation, experiments were carried out in the free-jet flow reactor at TROPOS in Leipzig in addition to experiments in Helsinki. The experimental set-up is optimised so that the gases do not come into contact with the walls during the reaction, which excludes interference from wall processes. During the experiments, almost all reactive intermediates, RO2 radicals and their reaction products could be directly monitored. The interdisciplinary cooperation between researchers in combustion chemistry and atmospheric chemistry proved very useful, because analogous processes take place in combustion and in the atmosphere, only at higher temperatures. "As a result, it became clear that isomerisation reactions not only of RO2 radicals but also of RO radicals are responsible for the build-up of more highly oxidised products. With the alkanes, the study identified the last and perhaps most surprising group of organic compounds for which autoxidation is important," Torsten Berndt concludes.

Even at high concentrations of nitrogen oxides, which otherwise quickly terminate autoxidation reactions, the alkanes apparently produce considerable amounts of highly oxidised compounds in the air. The new findings allow a deeper understanding of autoxidation processes and motivate further investigations of the isomerisation reactions of RO radicals.

Credit: 
Leibniz Institute for Tropospheric Research (TROPOS)

Machine learning aids in simulating dynamics of interacting atoms

image: Automated data set generation provides a highly diverse sampling of atomic positions for training an accurate and general machine learning model.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., February 23, 2021--A revolutionary machine-learning (ML) approach to simulating the motions of atoms in materials such as aluminum is described in Nature Communications this week. This automated approach to "interatomic potential development" could transform the field of computational materials discovery.

"This approach promises to be an important building block for the study of materials damage and aging from first principles," said project lead Justin Smith of Los Alamos National Laboratory. "Simulating the dynamics of interacting atoms is a cornerstone of understanding and developing new materials. Machine learning methods are providing computational scientists new tools to accurately and efficiently conduct these atomistic simulations. Machine learning models like this are designed to emulate the results of highly accurate quantum simulations, at a small fraction of the computational cost."

To maximize the general accuracy of these machine learning models, he said, it is essential to design a highly diverse dataset from which to train the model. A challenge is that it is not obvious, a priori, what training data will be most needed by the ML model. The team's recent work presents an automated "active learning" methodology for iteratively building a training dataset.

At each iteration, the method uses the current-best machine learning model to perform atomistic simulations; when new physical situations are encountered that are beyond the ML model's knowledge, new reference data is collected via expensive quantum simulations, and the ML model is retrained. Through this process, the active learning procedure collects data regarding many different types of atomic configurations, including a variety of crystal structures, and a variety of defect patterns appearing within crystals.
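
The iterative procedure can be pictured with a deliberately simplified sketch. In the toy example below, which is not the Los Alamos workflow or code, a one-dimensional energy curve stands in for the expensive quantum reference calculations, an ensemble of polynomial fits stands in for the machine-learning potential, randomly visited points stand in for configurations encountered during a simulation, and disagreement within the ensemble flags the situations that lie beyond the model's knowledge.

```python
import numpy as np

# Hypothetical, simplified illustration of an active-learning loop
# (not the Los Alamos workflow or codes).
rng = np.random.default_rng(0)

def reference_energy(x):
    # Stand-in for an expensive quantum-mechanical reference calculation:
    # here just an exact 1D "energy curve".
    return x**4 - x**2 + 0.1 * x

def train_ensemble(xs, ys, n_models=5, degree=4, noise=0.02):
    # Stand-in for the ML interatomic potential: an ensemble of polynomial
    # fits to slightly perturbed copies of the data. Members agree where the
    # training data is dense and diverge where they have to extrapolate.
    return [np.polyfit(xs, ys + rng.normal(0.0, noise, len(ys)), degree)
            for _ in range(n_models)]

def predict(models, x):
    preds = np.array([np.polyval(m, x) for m in models])
    return preds.mean(axis=0), preds.std(axis=0)  # prediction and disagreement

# Small initial dataset: a few "configurations" with reference energies.
xs = rng.uniform(-0.5, 0.5, 8)
ys = reference_energy(xs)

for iteration in range(10):
    models = train_ensemble(xs, ys)
    # "Run a simulation" with the current-best model: here, simply visit a
    # batch of new configurations over a wider range than the training data.
    candidates = rng.uniform(-1.5, 1.5, 200)
    _, disagreement = predict(models, candidates)
    # Configurations where the ensemble disagrees lie beyond the model's
    # knowledge: query the expensive reference only for those, then retrain.
    uncertain = candidates[disagreement > 0.05]
    if uncertain.size == 0:
        break
    new_points = uncertain[:10]                   # label a small batch per round
    xs = np.concatenate([xs, new_points])
    ys = np.concatenate([ys, reference_energy(new_points)])

print(f"training set grew to {len(xs)} configurations after {iteration + 1} rounds")
```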

Credit: 
DOE/Los Alamos National Laboratory

Seeing schizophrenia: X-rays shed light on neural differences, point toward treatment

image: These 3D images of neurons in the brain of a schizophrenia patient show wavy, distorted neurites, which indicate that the condition may be linked to the shape of the neurons. X-ray images were taken at the Advanced Photon Source.

Image: 
Ryuta Mizutani

Schizophrenia, a chronic neurological brain disorder, affects millions of people around the world. It causes a fracture between a person’s thoughts, feelings and behavior. Symptoms include delusions, hallucinations, difficulty processing thoughts and an overall lack of motivation. Schizophrenia patients have a higher suicide rate, more health problems and a lower life expectancy than the general population.

There is no cure for schizophrenia, but the key to treating it more effectively is to better understand how it arises. And that, according to Ryuta Mizutani, professor of applied biochemistry at Tokai University in Japan, means studying the structure of brain tissue. Specifically, it means comparing the brain tissues of schizophrenia patients with those of people in good mental health, to see the differences as clearly as possible.

“The current treatment for schizophrenia is based on many hypotheses we don’t know how to confirm,” Mizutani said. “The first step is to analyze the brain and see how it is constituted differently.”

To do that, Mizutani and his colleagues from several international institutions collected eight small samples of brain tissue — four from healthy brains and four from those of schizophrenia patients, all collected post-mortem — and brought them to beamline 32-ID of the Advanced Photon Source (APS), a U.S. Department of Energy (DOE) Office of Science User Facility at DOE’s Argonne National Laboratory.

At the APS, the team used powerful X-rays and high-resolution optics to capture three-dimensional images of those tissues. (Researchers collected similar images at the Super Photon Ring 8-GeV [SPring-8] light source facility in Japan.) The resolution of the X-ray optics used at the APS can be as high as 10 nanometers. That’s about 700 times smaller than the width of an average red blood cell, and a single microliter of blood contains roughly five million of those cells.

“There are only a few places in the world where you can do this research,” Mizutani said. “Without 3D analysis of brain tissues this work would not be possible.”

According to Vincent De Andrade, physicist in Argonne’s X-ray Science Division, capturing images at high resolution presents a challenge, since the neurons being imaged can be centimeters long. The neuron is the basic working unit of the brain, a cell within the nervous system that transmits information to other cells to control body functions. The human brain has roughly 100 billion of these neurons, in various sizes and shapes.

“The sample has to move through the X-ray beam to trace the neurons through the sample,” De Andrade explained. “The field of view of our X-ray microscope is about 50 microns, about the width of a human hair, and you need to follow these neurons over several millimeters.”
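
A rough back-of-the-envelope estimate, using an assumed trace length rather than a figure from the study, shows why that is demanding:

```python
# Illustrative numbers only: the field of view is as quoted above, the trace
# length is an assumption made for the sake of the estimate.
field_of_view_um = 50    # width of the X-ray microscope's field of view, in microns
trace_length_mm = 5      # assumed distance over which a neuron must be followed
views_per_axis = trace_length_mm * 1000 / field_of_view_um
print(f"~{views_per_axis:.0f} overlapping fields of view along a single axis")
```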

What these images showed is that the structures of these neurons are uniquely different in each schizophrenia patient, which Mizutani said is evidence that the disease is associated with those structures. Images of healthy neurons were relatively similar, while neurons from schizophrenia patients showed far more deviation, both from the healthy brains and from each other.

More study is needed, Mizutani said, to figure out exactly how the structures of neurons are related to the onset of the disease and to devise a treatment that can alleviate the effects of schizophrenia. As X-ray technology continues to improve — the APS, for example, is scheduled to undergo a massive upgrade that will increase its brightness up to 500 times — so will the possibilities for neuroscientists.

“The APS upgrade will allow for better sensitivity and resolution for imaging, making the process of mapping neurons in the brain faster and more precise,” De Andrade said. “We would need resolutions of better than 10 nanometers to capture synaptic connections, which is the holy grail for a comprehensive mapping of neurons, and those should be achievable with the upgrade.”

De Andrade also noted that while electron microscopy has been used to map the brains of small animals — fruit flies, for instance — that technique would take a long time to image the brain of a larger animal, such as a mouse, let alone a full human brain. Ultrabright, high energy X-rays like those at the APS, he said, could speed up the process, and advances in technology will help scientists get a more complete picture of brain tissue.

For neuroscientists like Mizutani, the end goal is fewer people suffering with brain diseases like schizophrenia.

“The differences in brain structure between healthy and schizophrenic people must be linked to mental disorders,” he said. “We must find some way to make people healthy.”

Credit: 
DOE/Argonne National Laboratory