
Study weighs deep-sea mining's impact on microbes

image: Hydrothermal vents on the seafloor support a rich diversity of life, and they contain deposits of valuable metals used in the manufacture of lithium-ion batteries. A new research paper looks at what is known about the vital microscopic life in these locations to evaluate the possible impacts of mining these and other deep-sea locations.

Image: 
Woods Hole Oceanographic Institution

The essential roles that microbes play in deep-sea ecosystems are at risk from the potential environmental impacts of mining, a new paper in Limnology and Oceanography reports. The study reviews what is known about microbes in these environments and assesses how mining could impact their important environmental roles.

"The push for deep-sea mining has really accelerated in the last few years, and it is crucial that policy makers and the industry understand these microbes and the services they provide," said Beth Orcutt, a senior research scientist at Bigelow Laboratory for Ocean Sciences and the lead author of the study. "This paper establishes what we know and suggests next steps for using the best science to evaluate the impacts of this new human activity in the deep sea."

Microbes across the seafloor are responsible for essential ecosystem services, from fueling the food web to powering global nutrient cycles. Environments that are promising for mining are also often the sites of globally important microbial processes and unusual animal communities - and they are very slow to recover from disturbance.

Orcutt and her coauthors analyzed four types of deep-sea mineral resources, including the metal-rich rocks that stud underwater mountains and lie on the seafloor. Their findings indicate that the likely impacts of mining on microbial ecosystems vary substantially, from minimal disturbance to the irreversible loss of important ecosystem processes.

Hydrothermal vent systems, for example, are particularly sensitive - and valuable. The hot, mineral-rich waters support robust communities of microbes that form the vital base of the food web in these ecosystems. The extreme environmental conditions also foster rich genetic diversity among the microbes, making them promising candidates in the search for anti-cancer drugs and other new biotechnology applications.

"These microbes have incredible potential to inspire new solutions to all sorts of medical and technical challenges we face today," said Julie Huber, a scientist from the Woods Hole Oceanographic Institution and co-author of the new study. "But if we damage or destroy a habitat like a hydrothermal vent, we lose the diverse the pool of microbial genetic information from which we can find new enzymes or drugs."

Consumer demand for products like smartphones and electric cars is driving the rapidly growing interest in deep-sea mining for metals like cobalt and rare earth elements, which are used in lithium-ion batteries. The International Seabed Authority of the United Nations is working to establish guidelines for countries and contractors to explore the seafloor for minerals, and to eventually mine them.

While guidelines for licensed exploration already suggest that site assessments should include how much microbial life is present, the researchers on the new study emphasize that it is equally important to determine what roles the microbes are playing and assess how they would be impacted by mining.

"It is important to understand the potential impacts of mining activities to figure out if they should occur and how to manage them if they do," said James Bradley, a scientist at Queen Mary University of London and co-author on the paper. "This is an important conversation between policy makers, industry, and the scientific community, and it's important that we work together to get this right. Once these ecosystems are damaged, they may never fully recover."

Credit: 
Bigelow Laboratory for Ocean Sciences

Brain model offers new insights into damage caused by stroke and other injuries

BUFFALO, N.Y. - He calls it his "chocolate and peanut butter moment."

A University at Buffalo neuroimaging researcher has developed a computer model of the human brain that more realistically simulates actual patterns of brain impairment than existing methods. The novel advancement represents the union of two established approaches to create a digital simulation environment that could help stroke victims and patients with other brain injuries by serving as a testing ground for hypotheses about specific neurological damage.

"This model is tied accurately to the functional connectivity of the brain and is able to demonstrate realistic patterns of cognitive impairment," says Christopher McNorgan, an assistant professor of psychology in UB's College of Arts and Sciences. "Since the model reflects how the brain is connected, we can manipulate it in ways that provide insights, for example, into the areas of a patient's brain that might be damaged.

"This recent work doesn't prove that we have a digital facsimile of the human brain, but the findings indicate that the model is performing in a way that is consistent with how the brain performs, and that at least suggests that the model is taking on properties that are moving in the direction of possibly one day creating a facsimile."

The findings provide a powerful means of identifying and understanding brain networks and how they function, which could open previously unrealized possibilities for discovery and understanding.

Details on the model and the results of its testing appear in the journal NeuroImage.

Explaining McNorgan's model starts with a look at the two fundamental components of its design: functional connectivity and multivariate pattern analyses (MVPA).

For many years, traditional brain-based models have relied on a general linear approach. This method looks at every spot in the brain and how those areas respond to stimuli. This approach is used in traditional studies of functional connectivity, which rely on functional magnetic resonance imaging (fMRI) to explore how the brain is wired. A linear model assumes a direct relationship between two things, such as the visual region of the brain becoming more or less active when a light flickers on or off.
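
In standard notation, this general linear model expresses each voxel's measured time series as a weighted sum of stimulus regressors plus noise (the specific regressors vary from study to study):

$$ Y = X\beta + \varepsilon $$

Here Y is the voxel time series, X is the design matrix of stimulus regressors, β holds the fitted response weights, and ε is the residual noise.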

While linear models excel at identifying which areas are active under certain conditions, they often fail to detect complicated relationships potentially existing among multiple areas. That's the domain of more recent advances, like MVPA, a "teachable" machine-learning technique that operates on a more holistic level to evaluate how activity is patterned across brain regions.

MVPA is non-linear. Assume for instance that there's a set of neurons dedicated to recognizing the meaning of a stop sign. These neurons are not active when we see something red or something octagonal because there's not a one-to-one linear mapping between being red and being a stop sign (an apple isn't a stop sign), nor between being octagonal and being a stop sign (a board room table isn't a stop sign).

"A non-linear response ensures that they do light up when we see an object that is both red and octagonal," explains McNorgan. "For this reason, non-linear methods like MVPA have been at the core of so-called 'Deep Learning' approaches behind technologies, such as the computer vision software required for self-driving cars."

But MVPA uses brute force machine-learning techniques. The process is opportunistic, sometimes confusing coincidence with correlation. Even ideal models require researchers to provide evidence that activity in the theoretical model would also be present under the same conditions in the brain.

On their own, both traditional functional connectivity and MVPA approaches have limitations, and integrating results generated by each of these approaches requires considerable effort and expertise for brain researchers to puzzle out the evidence.

When combined, however, the limitations are mutually constrained -- and McNorgan is the first researcher to successfully integrate functional connectivity and MVPA to develop a machine-learning model that's explicitly grounded in real-world functional connections among brain regions. In other words, the mutually constrained results are a self-assembling puzzle.

"It was my chocolate and peanut butter moment," says McNorgan, an expert in neuroimaging and computational modeling.

"I've had a particular career trajectory that has allowed me to work extensively with different theoretical models. That background provided a particular set of experiences that made the combination seem obvious in hindsight."

To build his models, McNorgan begins by gathering the brain data that will teach them the patterns of brain activity that are associated with each of three categories - in this case, tools, musical instruments and fruits. These data came from 11 participants who imagined the appearance and sound of familiar category examples, like hammers, guitars and apples, while undergoing an MRI scan. These scans indicate which areas are more or less active based on blood oxygen levels.

"There are certain patterns of activity across the brain that are consistent with thinking about one category versus another," says McNorgan. "We might think of this as a neural fingerprint."

These MRI patterns were then digitized and used to train a series of computer models to recognize which activity patterns were associated with each category.

"After training, models are given previously unseen activity patterns," he explains. "Significantly above-chance classification accuracy indicates that the models have learned a generalizable relationship between specific brain activity patterns and thinking about a specific category."

To test whether the digital brain models produced by this new method were more realistic, McNorgan gave them "virtual lesions" by disrupting activations in regions known to be important for each of the categories.

He found that the mutually constrained models showed classification errors consistent with the lesion location. For example, lesions to areas thought to be important for representing tools disrupted accuracy for tool patterns, but not the other two categories. By comparison, other versions of models not trained using the new method did not show this behavior.
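
The "virtual lesion" test can be sketched the same way: silence the features standing in for one region, then re-measure accuracy per category. Again, this is an illustrative toy, not the study's model; the region-to-voxel mapping is an assumption.

```python
# Illustrative "virtual lesion" sketch (synthetic data, not the study's code).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_trials, n_voxels = 300, 150
labels = rng.integers(0, 3, n_trials)          # 0=tools, 1=instruments, 2=fruits
X = rng.normal(size=(n_trials, n_voxels))
for k in range(3):                             # voxels [50k, 50k+50) carry category k
    X[labels == k, k * 50:(k + 1) * 50] += 0.6

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X, labels)

lesioned = X.copy()
lesioned[:, 0:50] = 0.0                        # "lesion" the tool-related region

for k, name in enumerate(["tools", "instruments", "fruits"]):
    mask = labels == k
    acc = (clf.predict(lesioned[mask]) == k).mean()
    print(f"{name}: accuracy after tool-region lesion = {acc:.2f}")
# In a connectivity-grounded model, only the tool category's accuracy should drop.
```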

"The model now suggests how brain areas that might not appear to be important for encoding information when considered individually may be important when it's functioning as part of a larger configuration or network," he says. "Knowing these areas may help us understand why someone who suffered a stroke or other injury is having trouble making these distinctions."

Credit: 
University at Buffalo

Healthier school meals are evidence of the success of the Healthy Hunger-Free Kids Act

Hartford, CT - In a new analysis of studies conducted following the implementation of the 2010 Healthy Hunger-Free Kids Act (HHFKA), researchers find positive effects on the dietary quality of meals served to school-aged children.

The National School Lunch Program makes it possible for school children in the United States to receive a nutritious school lunch every day. In 2010, the HHFKA required the USDA to create updated school meal standards that better aligned with the Dietary Guidelines for Americans. This included new regulations to promote fruit, vegetables, and whole grains, and limit excess sodium.

In the first national study following the implementation of the HHFKA, researchers observed significant improvements in the quality of school meals. Greater quantities of whole fruit, whole grains, and dairy, as well as limited availability of refined grains, resulted in higher Healthy Eating Index scores, a measure of diet quality used to assess how well a set of foods aligns with the Dietary Guidelines for Americans.

On February 11, 2019 a new rule went into effect, reversing some standards set forth in the HHFKA by giving school lunchrooms flexibility to provide low-fat flavored milk, higher sodium foods, and fewer whole grains. The USDA reported that these changes were in response to concerns over higher plate waste, lower revenue, and decreased participation in the program.

The new editorial, published in the Journal of the Academy of Nutrition and Dietetics, examines concerns regarding the stricter nutrition standards set forth in the HHFKA. It is co-authored by Juliana Cohen, Assistant Professor at Merrimack College and Adjunct Assistant Professor of Nutrition at the Harvard T.H. Chan School of Public Health, and Marlene Schwartz, Professor of Human Development and Family Sciences and Director of the Rudd Center for Food Policy and Obesity at the University of Connecticut.

Key findings include:

Despite concerns over increased plate waste as a result of fruit and vegetable requirements, early regional studies comparing the proportion of foods consumed before and after the HHFKA implementation found that school plate waste did not increase.

Producing healthier meals was not associated with significantly increased costs, and increased revenue helped offset differences between operating costs and federal reimbursement rates.

Participation rates were 61% in schools that served the healthiest lunches, compared with 50% in schools that served the least healthy lunches.

Moving forward, the editorial's authors note there are several strategies that can be implemented at the state and local level to maintain the success of the Healthy Hunger-Free Kids Act. Existing literature suggests that the HHFKA has led to meaningful improvements in the quality of school meals, and the data support the importance of having strong federal school meal standards.

Credit: 
UConn Rudd Center for Food Policy and Obesity

Study examines attitudes toward transgender athletes

As several states draft legislation that would force student-athletes to compete according to the gender listed on their birth certificate instead of on a team that matches their gender identity, a team of political scientists investigated underlying factors that drive public opinion on transgender athletes.

The new study shows that while women in general are more supportive than men of transgender athletes participating in sports by gender identity instead of biological sex, women who are sports fans are more likely to oppose it, holding views that resemble those of male sports fans.

The research recently published in the journal Sex Roles investigated public attitudes toward the participation of transgender people in sports by using data from a 2015 survey of 1,020 adults across the U.S.; the data was previously used by the same researchers to analyze public opinion on a variety of transgender rights issues.

Dr. Jami Taylor, professor of political science and public administration at The University of Toledo who focuses on transgender politics and policy, is part of the team that found that attitudes about transgender athletes are strongly shaped by an individual's characteristics, political values and personality traits.

Also, the study shows people who have contact with transgender, gay and lesbian people as well as those with stronger egalitarian attitudes are more favorable toward transgender participation, whereas those with high moral traditionalism are more opposed.

"This is a very complicated area, and there are legitimate concerns about fairness for both transgender athletes and those who are not transgender," said Taylor, author of the 2017 book "The Remarkable Rise of Transgender Rights." "We need to have thoughtful policies that ensure fair competitions but also ensure that transgender athletes aren't discriminated against. As governments, nonprofits and businesses begin to craft policies that decide how and with whom transgender athletes will compete in sports, they need to avoid one-size-fits-all solutions because of the complexity of the issues."

"Given the gendered nature of sports and the resistance to the issue among sports fans - both male and female - policymakers will likely need to tread carefully and should have a care in this area as they craft policy solutions. Our work might be helpful to inform policymakers, as well as advocates who promote inclusion."

Research contributors include Taylor; Dr. Andrew Flores, assistant professor in the Department of Government at American University and lead author of the study; Dr. Donald Haider-Markel, professor and chair of the Department of Political Science at the University of Kansas; Dr. Daniel Lewis, associate professor of political science at Siena College; Dr. Patrick Miller, associate professor in the Department of Political Science at the University of Kansas; and Dr. Barry Tadlock, professor of political science at Ohio University.

Current policy depends on the position of governing bodies, such as the NCAA at the collegiate level, and applicable laws that may vary by location. For instance, California law requires that transgender students be treated according to their gender identity, not biological sex.

The issue, according to lawmakers proposing new legislation in New Hampshire, Washington, Georgia, Tennessee and Missouri, is whether transgender-rights protections are leading to unfair competition in women's sports, referencing male-to-female transgender students and arguing they have natural physical advantages over biological females.

However, the study cited a female-to-male case: Mack Beggs' victory in the Texas Class 6A girls' state wrestling championship in 2017, even though the transgender student had started his transition two years earlier and was taking testosterone injections.

"It was a ridiculous situation. He wanted to wrestle with the boys and received harsh treatment from fans when he was forced to compete with girls," Taylor said. "Due to his success, parents accused him of cheating, but the rule in Texas was he had to compete according to the gender on his birth certificate, which was a girl. If he was in California, he would've competed against boys."

The study finds that 35.6% of women agreed with allowing transgender athletes to participate in sports aligned with their gender identity, compared to 23.2% of men.

As the 2020 Olympic Games in Tokyo approach, Taylor calls the Olympics reasonably inclusive of transgender athletes and commends the International Olympic Committee for its attention to both human rights and fair competition.

"The International Olympic Committee no longer requires transgender athletes to have had surgery, but there is a strict requirement around hormonal management," Taylor said. "It's far less restrictive for female-to-male athletes than for male-to-female athletes, which seems to be a reasonable attempt to grapple with this complex issue. Importantly, the IOC's approach looks at evidence in this evolving area."

Credit: 
University of Toledo

Vanderbilt-led team discovers new genetic disease and defines underlying mechanism

video: Dr. Ela Knapik discusses her "Aha!" moment when she realized that her team had discovered a new disease.

Image: 
Vanderbilt University Medical Center

Studies that started in zebrafish have now pointed to a role for collagen secretion in a wide variety of clinical symptoms -- and in a newly identified genetic syndrome.

Ela Knapik, MD, associate professor of Medicine at Vanderbilt University Medical Center, and her colleagues discovered the syndrome caused by mutation of a single gene and named it CATIFA, an acronym for its core symptoms: cleft palate, cataracts, tooth abnormality, intellectual disability, facial dysmorphism and ADHD.

The study, reported Jan. 13 in Nature Medicine, combined three sources of information: a zebrafish model, a pediatric genetic disease and a database of electronic health records linked to a DNA biobank.

"Each of these three data sources has its own advantages and shortcomings for disease discovery," said Knapik, who is also associate professor of Cell and Developmental Biology at Vanderbilt University. "I thought, why don't we use them concurrently."

Knapik and her team were exploring the function of a gene called ric1 in zebrafish. They knew that mutation of ric1 disrupted collagen secretion and caused craniofacial and other skeletal defects in fish.

They learned that another group of investigators had reported a mutation in the RIC1 gene (the human version of the zebrafish ric1 gene) in multiple children from a single family who had pediatric cataracts. Knapik's team evaluated the zebrafish for cataracts, but didn't find any.

The story might have ended there, but Knapik was intrigued. Did the children have other symptoms that hadn't been characterized because their most pressing problem was cataracts? What might those symptoms be?

To uncover additional symptoms, Knapik and her colleagues in the Vanderbilt Genetics Institute turned to BioVU, Vanderbilt's DNA biobank and database of de-identified electronic health records.

Eric Gamazon, PhD, research instructor in Medicine, and Nancy Cox, PhD, Mary Phillips Edmonds Gray Professor of Genetics and director of the Vanderbilt Genetics Institute, had previously developed a computational method, called PrediXcan, to correlate genetically regulated gene expression with the patient phenome -- the clinical characteristics included in the electronic health record. When applied to BioVU, the method generates a list of clinical characteristics linked to reduced expression of a specific gene.
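
Conceptually, such a phenome scan can be pictured as testing each recorded clinical trait for association with a gene's genetically predicted expression. The sketch below is only a schematic of that idea on simulated data; real PrediXcan uses pre-trained genotype-to-expression models and rigorous association statistics.

```python
# Schematic of a PrediXcan-style phenome scan (simulated data; illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_patients = 5000
predicted_expr = rng.normal(size=n_patients)   # simulated predicted RIC1 expression

# Simulated binary traits from health records; only "pediatric_cataract" is
# constructed to be enriched at low predicted expression.
phenotypes = {
    "pediatric_cataract": rng.random(n_patients) < 0.02 + 0.02 * (predicted_expr < -1),
    "hypertension":       rng.random(n_patients) < 0.25,
    "asthma":             rng.random(n_patients) < 0.10,
}

for name, cases in phenotypes.items():
    t, p = stats.ttest_ind(predicted_expr[cases], predicted_expr[~cases])
    print(f"{name}: t = {t:+.2f}, p = {p:.2g}")
# Traits tied to reduced expression surface with negative t and small p, giving
# a ranked list of clinical characteristics like the one described above.
```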

The list they generated for reduced RIC1 expression matched characteristics Knapik and her team observed in the fish, she said. "I knew at that moment that we needed more information about the patients with RIC1 mutation and pediatric cataracts."

The team in Saudi Arabia that had reported the patients with RIC1 mutation agreed to reevaluate those patients for the BioVU-derived list of symptoms. Knapik said the investigators were surprised to find the constellation of symptoms in their patients: eight children from two related families.

The symptoms of the new disease, CATIFA, can be explained by loss of collagen function, Knapik said. Collagen is the main structural component of the extracellular matrix -- the "mortar" between the cellular "bricks."

Using zebrafish, Knapik and her team were able to determine that the RIC1 protein is part of the cellular machinery that processes and ships collagen out of the cell.

"In the absence of RIC1, you don't get the collagen shipment, and you don't have matrix," Knapik said. "That leads to the broad spectrum of symptoms that we find in the electronic health records of adults and in the children with CATIFA."

The findings support the concept of a continuum between the individual symptoms in a rare, Mendelian disease -- a disease caused by mutation of a single gene -- and the complex traits of common diseases, Knapik said. Knowing the mechanistic underpinning for the disease and the symptoms that may arise will improve and personalize care for children with CATIFA and for adults with similar symptoms caused by less efficient activity of RIC1, Knapik added.

The study suggests a new paradigm for accelerating the discovery process for disease mechanisms, Knapik said.

"With knowledge coming from animal models, human common disease biobanks and rare Mendelian diseases, we can put together a complete picture of gene function and advance toward better diagnosis, treatment and prevention," she said.

Credit: 
Vanderbilt University Medical Center

Tuning optical resonators gives researchers control over transparency

image: Electromagnetically induced transparency (EIT) is 'tuned' by two particles on the optical resonator. The different locations of particles control the propagation of light in either clockwise or counterclockwise directions, which switch on (upper configuration) or off (lower configuration) the interference of light, leading to controllable brightness (EIT) and darkness in the output.

Image: 
Yang Lab

In the quantum realm, under some circumstances and with the right interference patterns, light can pass through opaque media.

This feature of light is more than a mathematical trick; optical quantum memory, optical storage and other systems that depend on interactions of just a few photons at a time rely on the process, called electromagnetically induced transparency, also known as EIT.  

Because of its usefulness in existing and emerging quantum and optical technologies, researchers are interested in the ability to manipulate EIT without the introduction of an outside influence, such as additional photons that could perturb the already delicate system. Now, researchers at the McKelvey School of Engineering at Washington University in St. Louis have devised a fully contained optical resonator system that can be used to turn transparency on and off, allowing for a measure of control that has implications across a wide variety of applications.

The group published the results of the research, conducted in the lab of Lan Yang, the Edwin H. & Florence G. Skinner Professor in the Preston M. Green Department of Electrical & Systems Engineering, in a paper titled "Electromagnetically Induced Transparency at a Chiral Exceptional Point" in the January 13 issue of Nature Physics.

An optical resonator system is analogous to an electronic resonant circuit but uses photons instead of electrons. Resonators come in different shapes, but they all involve reflective material that captures light for a period of time as it bounces back and forth between or around its surface. These components are found in anything from lasers to high precision measuring devices.

For their research, Yang's team used a type of resonator known as a whispering gallery mode resonator (WGMR). It operates in a manner similar to the whispering gallery at St. Paul's Cathedral, where a person on one side of the room can hear a person whispering on the other side. What the cathedral does with sound, however, WGMRs do with light -- trapping light as it reflects and bounces along the curved perimeter.

In an idealized system, a fiber optic line intersects with a resonator, a ring made of silica, at a tangent. When a photon in the line meets the resonator, it swoops in, reflecting and propagating along the ring, exiting into the fiber in the same direction it was initially headed.
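
As background, a standard textbook relation (not specific to this study) says light resonates in such a ring when a whole number m of wavelengths fits around the circumference:

$$ m\lambda = 2\pi R\, n_{\mathrm{eff}} $$

where R is the ring radius, λ the wavelength, and n_eff the effective refractive index of the silica.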

Reality, however, is rarely so neat.

"Fabrication in high quality resonators is not perfect," Yang said. "There is always some defect, or dust, that scatters the light." What actually happens is some of the scattered light changes direction, leaving the resonator and travelling back in the direction whence it came. The scattering effects disperse the light, and it doesn't exit the system.

Imagine a box around the system: If the light entered the box from the left, then exited out the right side, the box would appear transparent. But if the light that entered was scattered and didn't make it out, the box would seem opaque.

Because manufacturing imperfections in resonators are inconsistent and unpredictable, so too is the resulting transparency. Light that enters such systems scatters and ultimately loses its strength; it is absorbed into the resonator, rendering the system opaque.

In the system devised by co-first authors Changqing Wang, a PhD candidate, and Xuefeng Jiang, a researcher in Yang's lab, there are two WGMRs indirectly coupled by a fiber optic line. The first resonator is higher in quality, having just one imperfection. Wang added a tiny pointed material that acts like a nanoparticle to the high-quality resonator. By moving the makeshift particle, Wang was able to "tune" it, controlling the way the light inside scatters.

Importantly, he was also able to tune the resonator to what's known as an "exceptional point," a point at which one and only one state can exist. In this case, the state is the direction of light in the resonator: clockwise or counterclockwise.

For the experiment, researchers directed light toward a pair of indirectly coupled resonators from the left (see illustration). The lightwave entered the first resonator, which was "tuned" to ensure light traveled clockwise. The light bounced around the perimeter, then exited, continuing along the fiber to the second, lower-quality resonator. 

There, the light was scattered by the resonator's imperfections and some of it began traveling counterclockwise along the perimeter. The light wave then returned to the fiber, but headed back toward the first resonator.

Critically, researchers not only used the nanoparticle in the first resonator to make the lightwaves move clockwise, they also tuned it in a way that, as the light waves propagated back and forth between resonators, a special interference pattern would form. As a result of that pattern, the light in the resonators was cancelled out, so to speak, allowing the light traveling along the fiber to eke by, rendering the system transparent.

It would be as if someone shined a light on a brick wall -- no light would get through. But then another person with another flashlight shined it in the same spot and, all of a sudden, that spot in the wall became transparent.

One of the more important -- and interesting -- functions of EIT is its ability to create "slow light." The speed of light in a vacuum is fixed, but light's effective speed can change based on the properties of the medium through which it moves. In a vacuum, light always travels at 300,000,000 meters per second.

With EIT, people have slowed light down to eight meters per second, Wang said. "That can have significant influence on the storage of light information. If light is slowed down, we have enough time to use the encoded information for optical quantum computing or optical communication." If engineers can better control EIT, they can more reliably depend on slow light for these applications.
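
The "slow light" figure refers to the group velocity, which standard dispersion theory gives as

$$ v_g = \frac{c}{n(\omega) + \omega\,\dfrac{dn}{d\omega}} $$

Inside an EIT transparency window the refractive index n(ω) varies extremely steeply with frequency, so the dn/dω term becomes enormous and v_g can drop to meters per second.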

Manipulating EIT could also be used in the development of long distance communication. A tuning resonator can be indirectly coupled to another resonator kilometers away along the same fiber optic cable. "You could change the transmitted light down the line," Yang said.

This could be critical for, among other things, quantum encryption.

Credit: 
Washington University in St. Louis

'Ageotypes' provide window into how individuals age, Stanford study reports

What's your type?

That question could gain new meaning, thanks to scientists who've categorized how humans age into different classes dubbed "ageotypes," reports a new study from the Stanford University School of Medicine.

"We know already there are a handful of nice molecular and clinical markers, such as high cholesterol, that are more common in older populations," said Michael Snyder, PhD, professor and chair of genetics. "But we want to know more about aging than what can be learned from population averages. What happens to an individual as they age? No one has ever looked at the same person in detail over time."

Now, Snyder and his team have done just that: They profiled a group of 43 healthy men and women between the ages of 34 and 68, taking extensive measurements of their molecular biology at least five times over two years.

The researchers determined that people generally age along certain biological pathways in the body: metabolic, immune, hepatic (liver) and nephrotic (kidney). People who are metabolic agers, for example, might be at a higher risk for diabetes or show signs of elevated hemoglobin A1c, a measure of blood-sugar levels, as they grow older. People with an immune ageotype, on the other hand, might generate higher levels of inflammatory markers or be more prone to immune-related diseases as they age. But the ageotypes are not mutually exclusive, and a metabolic ager could also be an immune ager, for example.

Using blood, stool and other biological samples, the study tracked levels of certain microbes and biological molecules, such as proteins, metabolites and lipids, in participants over two years, monitoring how the levels changed over time.
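
One simple way to picture this kind of longitudinal analysis is to fit a per-person trend to each marker and compare the trends across pathways. The sketch below is an illustration only, not the study's actual statistics; the marker names, groupings, and labeling rule are assumptions.

```python
# Illustrative ageotype sketch (simulated data; not the study's pipeline):
# fit a slope to each marker's repeated measurements, then report the pathway
# with the strongest upward trend as the dominant "ageotype".
import numpy as np

rng = np.random.default_rng(0)
timepoints = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # years; five sampling visits
pathways = {"metabolic": ["hba1c", "glucose"],
            "immune":    ["crp", "il6"],
            "hepatic":   ["alt", "ast"],
            "nephrotic": ["creatinine", "cystatin_c"]}

markers = [m for group in pathways.values() for m in group]
levels = rng.normal(0.0, 0.2, size=(len(timepoints), len(markers)))  # z-scored
levels[:, 0] += 0.5 * timepoints                   # rising hba1c -> metabolic ager

slopes = {m: np.polyfit(timepoints, levels[:, j], 1)[0]
          for j, m in enumerate(markers)}
pathway_trend = {p: np.mean([slopes[m] for m in group])
                 for p, group in pathways.items()}
print("dominant ageotype:", max(pathway_trend, key=pathway_trend.get))
```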

"Our study captures a much more comprehensive view of how we age by studying a broad range of molecules and taking multiple samples across years from each participant," Snyder said. "We're able to see clear patterns of how individuals experience aging on a molecular level, and there's quite a bit of difference." Differences not only in the ways one ages, but the rates at which one ages. Perhaps the most important thing, he said, is that the study's measurements were taken during an actionable timeframe -- two years -- making it possible for someone to counteract increased markers of aging by changing their behavior.

"The ageotype is more than a label; it can help individuals zero in on health-risk factors and find the areas in which they're most likely to encounter problems down the line," Snyder said. "Most importantly, our study shows that it's possible to change the way you age for the better. We're starting to understand how that happens with behavior, but we'll need more participants and more measurements over time to fully flesh it out."

A paper describing the study will be published Jan. 13 in Nature Medicine. Snyder is the senior author. Stanford postdoctoral scholar Sara Ahadi, PhD, and bioinformaticist Wenyu Zhou, PhD, share lead authorship.

Four of a kind

Just because an individual falls into one or more of the four ageotypes -- metabolic, immune, hepatic and nephrotic -- doesn't mean that they're not also aging along the other biological pathways, Snyder said. The ageotype signifies the pathways in which increases in aging biomarkers are most pronounced.

The study also looked at differences in aging between healthy participants and participants who are insulin-resistant, or cannot properly process sugar. "The differences in aging between healthy and insulin-resistant folks is something that's never been looked at before," Snyder said. "Overall, we found there were about 10 molecules that significantly differed between insulin-sensitive and insulin-resistant folks as they aged." Many of those markers were involved in immune function and inflammation.

Slowed aging

Perhaps most exciting -- and surprising -- is that not everyone in the study showed an increase in ageotype markers over time. In some people, markers decreased, at least for a short period, when they changed their behavior. They were not Benjamin Buttons -- that is, they still aged -- but the overall rate at which they did so declined, and in some cases aging markers decreased. In fact, the team saw this phenomenon occur in a handful of important clinical molecules, including hemoglobin A1c and creatinine, a marker for kidney function, among a small subset of participants.

In that subset, Snyder said, there were individuals who made lifestyle changes to slow their aging rate. Among those who exhibited decreased levels of hemoglobin A1c, many had lost weight, and one made dietary changes. Some who saw a decrease in creatinine, indicating improved kidney function, were taking statins. In other cases, exactly why rates of aging markers waned was unclear. For some people, there were no obvious behavioral changes, yet the team still saw a decreased rate of aging along their ageotype pathways. There was also a handful of people who maintained a slower-than-average aging rate throughout the entire study. How or why is still a mystery.

As he's demonstrated in the past, Snyder is not shy about participating in his own studies. He did so in this one, gleaning some insights into his own aging pattern. "I was a bit disappointed to see that I was aging at a pretty average rate," he said. He had collected the data on himself at the end of 2016. "I started lifting weights right around that time. It'll be interesting to see if that influences my aging pathways in another year's time."

Credit: 
Stanford Medicine

Future subtropical warming accelerates tropical climate change

image: Illustration of the atmospheric Hadley cell under present-day conditions (left). Subtropical warming leads to a weakening of the Hadley cell (right), fewer clouds in most of the tropics, a reduction in the upwelling of cold ocean water, and a resulting increase in tropical temperatures. This process explains the accelerated tropical warming found in climate models in response to increased greenhouse gas emissions.

Image: 
Elke Zeller

In response to future fossil fuel burning, climate computer models simulate a pronounced warming in the tropical oceans. This warming can influence the El Niño phenomenon and shift weather and rainfall patterns across the globe. Despite being robustly simulated in computer models of the climate system, the origin of this accelerated tropical warming has remained a mystery. A new study published this week in the journal Nature Climate Change concludes that climate change outside the tropics is the main culprit.

Earth's future warming will not be identical everywhere. Atmospheric and oceanic circulation changes, as well as cloud processes, will determine which regions will warm faster and which ones will experience a delayed warming relative to the global mean. Focusing on the tropical regions, a team of scientists from the IBS Center for Climate Physics (ICCP) at Pusan National University in South Korea, along with their international collaborators, has now developed a new method that separates the contributions from local and remote physical processes that cause warming in a given region.

The team found that the expected future warming in the tropics (15°S-15°N) originates mostly from warming that occurs in subtropical regions (16°N-32°N and 16°S-32°S). "To understand this surprising phenomenon, one has to understand how different areas interact with each other climatically," says Dr. Malte Stuecker from the ICCP and lead author of the study.

In the subtropical regions, air in the upper atmosphere is sinking, creating high pressure at the surface (see Figure). At an altitude of about 10-15 km, this sinking motion sucks in tropical air. The resulting large-scale atmospheric circulation is referred to as the Hadley cell, and its surface branch, known as the trade wind circulation, transports relatively dry subtropical air back to the tropics. Due to the effects of Earth's rotation, trade winds also cause upwelling of cold subsurface waters in the tropical Pacific and Atlantic. "In response to increasing greenhouse gas emissions, future subtropical warming will slow down the atmospheric Hadley cell," adds Stuecker. This will lead to a weakening of the surface trade winds, less upwelling of cold ocean water, and a resulting warming of the sea surface. In addition, a weaker Hadley cell also means that less humid air is rising, and cloud coverage is reduced in most of the tropics, increasing the amount of sunlight reaching the surface. "This can further exacerbate future warming in the tropics," says Axel Timmermann, Director of the ICCP and co-author of the study.

To arrive at these conclusions, the authors used a computer model of the global climate system and ran it for present-day and future CO2 conditions. By imposing the extra energy related to the CO2 change, either in the tropics or subtropics, the team then found that human-induced subtropical warming causes about 40% more future tropical surface ocean temperature change than if the same amount of extra energy were to enter Earth's atmosphere directly in the tropics.

The study presents a new paradigm for understanding the patterns of future global warming and the resulting regional differences. "Warming in one area can affect the degree of warming in another place. We are starting to appreciate how strongly different areas are connected in the climate system," says co-author Prof. Fei-Fei Jin from the University of Hawai'i, USA.

The new Nature Climate Change study also points out the possibility that a future reduction of air pollution in Asian countries may lead to regional warming in the subtropics, which in turn could trigger a warming cascade that might even affect the tropics. However, more research needs to be conducted to quantify the influences of air quality control on climate.

Credit: 
Institute for Basic Science

Physicists prove that 2D and 3D liquids are fundamentally different

image: An image of atomic trajectories in a two-dimensional liquid, generated by computer simulations. Most trajectories are elongated, and the elongation of close trajectories is similar. This is the visual signature of the collective motions found by Li et al., which demonstrates a fundamental difference between two-dimensional and three-dimensional liquids.

Image: 
NTU Singapore

A 50-year-old puzzle in statistical mechanics has been solved by an international team of researchers who have proved that two-dimensional (2D) liquids have fundamentally different dynamical properties to three-dimensional (3D) liquids.

Researchers routinely use 2D experiments and simulations to represent 3D liquids, simply because studies in 2D are easier to do.

With these studies, physicists aim at rationalising familiar macroscopic fluid properties, such as the viscosity, in terms of the microscopic motion of the particles, which in 2D can be directly visualised.

The team led by Associate Professor Massimo Pica Ciamarra at Nanyang Technological University, Singapore (NTU Singapore) set out to understand the 'thermal motion' of atoms in 2D and 3D liquids.

Using a mix of pen-and-paper calculations and numerical simulations, they predicted that atoms in 2D liquids can travel for long distances before effectively 'forgetting' their initial positions. This behaviour gives rise to a subtle collective motion of the atoms, of a sort that had previously only been thought to occur in solids.

To confirm their theoretical findings, the researchers performed experiments that tracked the motions of colloidal particles under a microscope. In ordinary three-dimensional liquids, such particles execute a type of random motion known as Brownian motion.
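
For ordinary Brownian motion, the textbook signature is a mean-squared displacement that grows linearly with time,

$$ \langle \Delta r^2(t) \rangle = 2dDt $$

where d is the spatial dimension and D the diffusion coefficient; departures from this simple picture are what signal the collective motions described next.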

But in two-dimensional liquids, the team was able to demonstrate that the Brownian motion is overlaid on large-scale collective motions. This collective motion was previously believed to only occur in 2D solids, as predicted in the 1960s by Mermin and Wagner.

The proof of the fundamental difference between 2D and 3D liquids was obtained by researchers at NTU Singapore, the Jawaharlal Nehru Centre for Advanced Scientific Research in India, the University of Science and Technology of China, and the University of California (Los Angeles) in the United States. Their work was published in November in the Proceedings of the National Academy of Sciences (PNAS).

"Our discovery shows that two-dimensional liquids and three-dimensional liquids are not just variants of each other, but fundamentally different types of matter," said Assoc Prof Pica Ciamarra.

"Our findings help to explain many puzzling differences between the dynamical properties of two- and three-dimensional liquids, which had been reported in the scientific literature," said Assoc Prof Pica Ciamarra. "It is only in 2D, not in 3D or higher dimensions, that the relaxation time is not inversely proportional to the diffusivity of the particles". "

"To extract relevant information on the dynamics of 3D liquids from the 2D investigations" added Dr. Y.-W. Li, co-author of this study, "researchers need to develop a way to selectively filter-out the effect of the observed collective particle oscillations".

Credit: 
Nanyang Technological University

Shocked meteorites provide clues to Earth's lower mantle

image: Artist's rendition of Earth, cut away to reveal individual layers, including the deep mantle.

Image: 
Mingming Li/ASU

Deep below the Earth's surface lies a thick rocky layer called the mantle, which makes up the majority of our planet's volume. While Earth's mantle is too deep for humans to observe directly, certain meteorites can provide clues to this unreachable layer.

In a study recently published in Science Advances, an international team of scientists, including Sang-Heon Dan Shim and Thomas Sharp of Arizona State University (ASU), have completed a complex analysis of a "shocked meteorite" (one that has experienced high-pressure and high-temperature conditions through impact events) and gained new insight into Earth's lower mantle.

Suizhou: a shocked meteorite

Shocked meteorites have provided many examples of deep mantle minerals since 1969, when the high-pressure mineral ringwoodite was discovered.

For this study, lead author Luca Bindi of the University of Florence (Italy), Shim and Sharp of ASU's School of Earth and Space Exploration and Xiande Xie of the Guangzhou Institute of Geochemistry (China), focused their efforts on a sample of a shocked meteorite called Suizhou.

"Suizhou was an ideal meteorite for our team to analyze," explains Shim, who specializes in using high-pressure experiments to study Earth's mantle. "It provided our team with samples of natural high-pressure minerals like those believed to make up the Earth's deep mantle."

Suizhou fell in 1986 in the Hubei province in China. Immediately after the fall of this meteorite, a group of scientists were able to find and collect samples. "It was an observed fall," explains Sharp, who specializes in studying shocked meteorites to understand shock and impact in the solar system. "So it did not suffer any chemical weathering on Earth and therefore there is no alteration of the iron."

Bridgmanite: The dominant material in the lower mantle

The Suizhou meteorite sample the researchers used for this study contains a specific silicate called "bridgmanite." This silicate is considered the dominant material in the Earth's lower mantle and makes up about 38 volume percent of our planet. It was first discovered in the shocked meteorite Tenham in 2014.

While it was previously thought that iron metal mainly existed in Earth's core, scientists discovered in the lab about 15 years ago that iron in bridgmanite can undergo self-oxidation, producing metallic iron.

This process, a chemical reaction called "charge disproportionation," is one in which atoms of the same element redistribute electrons among themselves, producing species with different oxidation states (in this case, some Fe(II) ions in bridgmanite convert to Fe(III) and Fe(0), the latter of which forms metallic iron).
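
Written as a balanced reaction for iron, with charge conserved on both sides:

$$ 3\,\mathrm{Fe}^{2+} \rightarrow 2\,\mathrm{Fe}^{3+} + \mathrm{Fe}^{0} $$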

The question remained, however, if this process could actually occur in nature.

Using high-resolution electron microscope imaging and spectroscopy, the researchers were able to conduct a set of complex analyses of the Suizhou meteorite sample at the nanometer scale.

Through these analyses, the research team discovered metallic iron nanoparticles coexisting with bridgmanite in the shocked meteorite sample, representing the first direct evidence in nature of the iron disproportionation reaction, which so far had only been observed in high-pressure experiments.

"This discovery demonstrates that charge disproportionation can occur in natural high-pressure environments and therefore in the deep interior of the Earth," says Shim.

The implications of this study, however, go beyond just this discovery, and may ultimately help us understand the greater question of how Earth itself was oxidized.

While we know that Earth's upper mantle is more oxidizing than those of other planets, and that the more oxidizing conditions of the upper mantle may be linked to the sudden rise of oxygen in the atmosphere 2.5 billion years ago, we don't yet know how the upper mantle of the Earth became more oxidizing.

"It is possible that when materials of the lower mantle are transported to the upper mantle by convection, there would be a loss of metallic iron and the oxidized iron in bridgmanite would cause more oxidizing conditions in the upper mantle," says Shim.

"Our discovery provides a possible explanation for the more oxidizing conditions of the Earth's upper mantle and supports the idea that deep interior processes may have contributed to the great oxygenation event on the surface."

Credit: 
Arizona State University

Research shows nasal spray antidote is easiest to give for opioid overdose

image: William Eggleston, clinical assistant professor at Binghamton University, State University of New York

Image: 
Binghamton University, State University of New York

BINGHAMTON, N.Y. - Of three possible ways for people to deliver the life-saving antidote naloxone to a person experiencing an opioid overdose, the use of a nasal spray was the quickest and easiest, according to research conducted by William Eggleston, clinical assistant professor at Binghamton University, State University of New York, and colleagues at SUNY Upstate Medical University.

Access to naloxone is a priority for reducing opioid deaths, but as naloxone moves closer to approval for sale over the counter, little was known about how easily and effectively the general population could administer it in an opioid overdose situation, according to Eggleston, with Binghamton University's School of Pharmacy and Pharmaceutical Sciences.

Eggleston had done prior research in which untrained people were asked to administer naloxone after completing a naloxone training video, and most were successful, he said. "We wondered if we would get similar results when individuals had no training or indication of how to do it."

Participants in the study were randomly assigned to administer naloxone to a manikin using one of three different methods: a preloaded nasal spray, an intramuscular shot or an improvised nasal atomizer kit that requires a bit of assembly before use. All three types are used by naloxone community programs in the United States.

"We had a station with a manikin and as people came up to check out the booth, we asked if they were interested in helping out with the study. Our goal was to see if there was a method that was the most intuitive," Eggleston said.

Participants were instructed to administer the device to a high-fidelity manikin in a public environment with distractions to mimic those that might be present in an actual overdose situation. No device instructions were provided and participants were evaluated using a standardized tool.

Successful administration was defined as being completed within seven minutes without critical errors, Eggleston said. The nasal spray saw the best results with a median administration time of 16 seconds, followed by the intramuscular shot, which took close to a minute to deliver. The improvised nasal atomizer kit, which required assembling three pieces, was found to be the most difficult to use by the untrained participants.

"One interesting finding is that the success rates in this trial overall, when participants were given no training, were much lower than our success rates the first time we conducted the study using video demonstrations," Eggleston said.

"People may not realize how important it is to provide training on how to administer naloxone," Eggleston said. "But when someone is not breathing, every second counts. If naloxone becomes available over the counter, our study highlights the importance of training resources, like pharmacists, public health campaigns and community resources. It also shows that the nasal spray product is the most intuitive to use and easiest to give quickly."

Eggleston worked with SUNY Upstate Medical University colleagues Vincent Calleo, MD, and Martin Kim, MD, both clinical assistant professors of emergency medicine, and Susan Wojcik, PhD, associate professor of emergency medicine.

Credit: 
Binghamton University

Visualizing chemical reactions, e.g. from H2 and CO2 to synthetic natural gas

image: Thermal image sequence showing the exothermic surface reaction front moving over the catalyst.

Image: 
Source: ACS Catalysis; DOI: 10.1021/acscatal.9b04475

Infrared (IR) thermography is used to determine the temperature of humans and objects with high precision and without interfering with the system. A single image taken with an IR camera can capture the same amount of information as hundreds to millions of thermocouples (temperature sensors) at once. Furthermore, modern IR cameras can achieve fast acquisition frequencies of over 50 Hz, which allows the investigation of dynamic phenomena with high resolution.

Now, scientists at EPFL have designed a reactor that can use IR thermography to visualize dynamic surface reactions and to correlate these observations with other rapid gas analysis methods, obtaining a holistic understanding of the reaction in rapidly changing conditions. The research was led by Robin Mutschler and Emanuele Moioli in the lab of Andreas Züttel (EPFL and Empa), in collaboration with researchers at the Polytechnic University of Milan.

The scientists applied their method to catalytic surface reactions between carbon dioxide and hydrogen, including the Sabatier reaction, which can be used to produce synthetic methane from renewable energy by combining CO2 captured from the atmosphere with H2 from water splitting. Because it enables the synthesis of renewable synthetic fuels with properties similar to their fossil counterparts, the Sabatier reaction has attracted a lot of attention recently. A catalyst is required in the Sabatier reaction to activate the relatively inert CO2 as a reactant.
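
For reference, the overall Sabatier reaction is the strongly exothermic hydrogenation of CO2 (the standard heat of reaction is roughly -165 kJ per mole of CO2):

$$ \mathrm{CO_2} + 4\,\mathrm{H_2} \rightarrow \mathrm{CH_4} + 2\,\mathrm{H_2O} $$

This exothermicity is what makes the moving reaction front visible to the IR camera.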

In particular, the EPFL researchers focused on dynamic reaction phenomena occurring during reaction activation from different initial catalyst states.

"The reaction on the catalyst is favored by a hydrogenated surface while an exposure to CO2 poisons the catalyst and inhibits a fast reaction activation," says Mutschler.

"Thanks to this new approach, we could visualize new dynamic reaction phenomena never observed before," says Moioli.

In their work they showed, for the first time and in real time, the catalyst working and responding to changes in the feed gas composition and activating from different initial states. These results give a better understanding of reaction startup and activation behavior, which can lead to optimized reactor and catalyst designs that improve the performance of reactor systems working in dynamic conditions.

This is crucial since renewable energy typically provides energy and reactants stochastically, and reactors converting renewable energy to fuels therefore have to be adapted to work in dynamic conditions.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

'Flash and freeze' reveals dynamics of nerve connections

image: Docked vesicles at a cortical synapse, from brain slices prepared using the "flash and freeze" method.

Image: 
Carolina Borges Merjane

Uniting structure and function of synapses is challenging: Function is studied in living tissue, measuring electrical signals at millisecond precision with electrophysiology, while the observation of fine structure at nanometer scale requires tissue to be fixed for electron microscopy. Peter Jonas, professor at the Institute of Science and Technology Austria (IST Austria), and his group members, first authors Carolina Borges-Merjane (postdoc) and Olena Kim (PhD student), have developed a so-called "flash and freeze" method for studying structure and function of synapses in intact neural circuits in mammalian brain slices.

Method makes structural changes during signaling visible

"Flash and freeze" refers to the flash of light used to stimulate the neurons, followed by immediate freezing of the tissue to fix it in its most native state. Peter Jonas sums up the challenge: "We mostly do impossible experiments in the Jonas lab, and the new research falls exactly into this category. Here, we take a synapse, stimulate it with light and, within milliseconds, shoot it into a chamber that freezes the structure at minus 196 degrees Celsius and at a pressure of 2,000 bar". The sample is then dropped into a tank of liquid nitrogen and prepared for analysis by electron microscopy.

This set-up allows neuroscientists to stimulate neurons and freeze the tissue immediately afterwards for analysis by electron microscopy, so that changes in anatomy right after stimulation become visible. "It is a very dynamic way of studying synapses," explains Carolina Borges-Merjane. "We can flash and then freeze immediately or wait a few milliseconds or even seconds. By taking several such snapshots, we reveal the time course of structural changes that happen during synaptic transmission." In a parallel series of electrophysiology experiments in living tissue, the researchers characterized the functional dynamics of the same type of synapses. By integrating these data sets, they show how structural changes give rise to the observed function.

Function retained in intact networks

The method presented by the Jonas group is a modification of the "flash and freeze" protocol initially used for studying neurons of the worm Caenorhabditis elegans and individually isolated or dissociated mammalian neurons. The difference: the newly reported method uses slices of the mouse brain, in which neuronal networks remain largely intact and alive. "The function of neurons is usually studied in slices of brain tissue, in which networks remain intact. Synaptic structure is typically studied in chemically fixed samples or, as previously done with flash and freeze, with dissociated neurons. With our method, we can now use the same type of preparation used to study synaptic function to simultaneously study structure," Olena Kim points out. The authors also demonstrated that the method is widely applicable to different brain regions and can therefore be used in studies of a variety of synapses in the brain.

Near identity of structurally and functionally defined vesicle pools demonstrated

In a proof-of-principle experiment, the researchers analyzed pools of vesicles at a cortical synapse. These vesicles contain the neurotransmitters that transfer signals to the neighboring neuron. They found that the structurally defined "docked" pool and the functionally defined "readily releasable" pool of synaptic vesicles are very nearly the same when observed and analyzed with the new integrated method. "This has never been demonstrated directly. We interpret our results as meaning that vesicles fuse and integrate with the plasma membrane," Jonas explains. "Our finding underlines how important it is to extend studies of both structure and function to cortical circuits."

Credit: 
Institute of Science and Technology Austria

New study shows 'organic' wounds improve produce

image: Picture of strawberry wounding field experiment with 50 perforations per plant (W50).

Image: 
Luis Cisneros-Zevallos

Texas A&M AgriLife Research scientists have found benefits of insect leaf-wounding in fruit and vegetable production. The stress responses triggered in the fruits and vegetables increased antioxidant compounds prior to harvest, making the produce healthier for human consumption.

"Many studies in the past supported this idea, but many others showed no differences," said Luis Cisneros-Zevallos, Ph.D., AgriLife Research horticulture and food scientist in College Station and principal investigator for a study addressing this controversy. "In our study we proved that wounding leaves in plants like those caused by insects produce healthier organic fruit."

The original publication, "Solving the controversy of healthier organic fruit: Leaf wounding triggers distant gene expression response of polyphenol biosynthesis in strawberry fruit (Fragaria x ananassa)," appears in the Nature journal Scientific Reports. The highly interdisciplinary research team also included molecular biologist Woo Young Bang, Ph.D., and horticulturist Leonardo Lombardini, Ph.D., both former AgriLife Research scientists.

"We conducted studies using strawberries as a crop model and applied various levels of wounding to the leaves a few days before harvesting the fruit. We found how several genes associated with sugar translocation and phenolic compound biosynthesis were overexpressed in the distant strawberry fruit," said Facundo Ibanez, Ph.D., an investigator for the project associated with the Instituto Nacional de Investigacion Agropecuaria, Uruguay.

All plants have the ability to respond to their environment by activating secondary metabolism as part of a defense mechanism or an adaptation process. This response also activates primary metabolism, which moves the carbon needed to produce those antioxidant compounds, explained Cisneros-Zevallos.

"There was the existing idea proposed by others that insects present in the field in organic farming could cause a stress response in the plant and increase antioxidant compounds," said Cisneros-Zevallos. "However, this hypothesis or concept was never tested until now, where we mimicked the damage caused by insects."

Ibanez said the study emphasized fresh produce as an excellent source of health-promoting compounds and that perhaps insects in some way can be allies to achieve even healthier produce.

"Healthier grown produce for the food industry can be a driving force for large-scale production and an attractive investment to relevant stakeholders," he said.

Organic farming has seen continued growth and rising consumer demand in recent years. This has had a positive large-scale impact on the organic industry, farmers and other industries related to organic produce, said Cisneros-Zevallos.

Several previously published studies on post-harvest wounding stress effects in different crops inspired the team to apply the approach in the field, wounding leaf surfaces in a way that mimicked insect attack on the plant.

"This observation was key when we designed the strategies to be used in the study, that simple wounding stress on leaf surfaces elicited this systemic response with the unique observation of higher accumulation of phenolic antioxidants in fruit," Ibanez said.

"Our team has elucidated a controversy that was an open question for many years," Cisneros-Zevallos said. "Understanding how these antioxidants are produced by a simple stress like wounding can certainly transform the way the fresh produce industry operates, including both organic and conventional. And it may allow the industry to adopt novel tools based on pre-harvest stress to favor the accumulation of healthier antioxidants in fresh produce and processed foods."

Credit: 
Texas A&M AgriLife Communications

Researchers discover new building blocks of catalyst zeolite nanopores

image: UMass Amherst chemists and materials scientists have advanced understanding of zeolite catalyst structure and vibrations to help lead to new materials for clean energy and carbon capture, among other applications.

Image: 
UMass Amherst/Auerbach lab

AMHERST, Mass. - Zeolite crystals, used among other things for refining petroleum into gasoline and biomass into biofuels, are the most-used catalysts by weight on the planet, and discovering how they form has been of intense interest to the chemical industry and related researchers, say chemist Scott Auerbach and colleagues at the University of Massachusetts Amherst. They hope their advance, a new way to understand zeolite structure and vibrations, will lead to new, tailor-made zeolites for sophisticated new applications.

Their cover story in a recent issue of the Journal of the American Chemical Society describes how the team used systematic analyses, a technique called Raman spectroscopy, and quantum mechanical modeling to discover new nano-scale building blocks they call "tricyclic bridges," which help explain zeolites' porous structures and dynamical behaviors.

Auerbach says, "This breakthrough is important because it gives us a way to see the invisible - the precise structures that lead to zeolite crystals. We hope such structural insights will help us to synthesize new, tailor-made zeolites for advanced applications in clean energy and carbon capture." His co-authors include chemical engineer Wei Fan and first author Tongkun Wang at UMass Amherst, with others at Worcester Polytechnic Institute.

The authors say that by replacing previous "overly simplistic" approaches, their methods can "enhance our ability to use Raman spectroscopy as an analytical tool for investigating zeolite structure and formation, using the concept of tricyclic bridges."

In this work supported by the U.S. Department of Energy Division of Materials Science and Engineering, Auerbach and colleagues say that revealing zeolite synthesis is complicated by the fact that precursor structures are mid-sized, so they fall into a nano-scale "blind spot" - too large for atomic-level and functional group structural analyses and too disordered for X-ray analyses. By contrast, Raman spectroscopy "has emerged as a powerful tool for probing medium-range structures in a variety of materials," they note.

Fan explains that until now, experimental studies on the synthesis of zeolites with new structures and compositions were based on trial-and-error methods, and characterizing the process posed a "tantalizing challenge." Their contribution based on tricyclic bridges provides a new tool for understanding the crystallization pathway, opening the door to designing materials for advanced applications in catalysis and separations, they state.

Further, they point out that "it is often assumed with little evidence that Raman bands can be assigned to individual zeolite rings." They tested this assumption and found that tricyclic bridges - collections of three connected zeolite rings - play a critical role in zeolite formation. Using this insight, they discovered a precise relationship between zeolite bond angle and Raman frequency that can be used to pinpoint structures that form during zeolite crystallization.
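
As an illustration of how such a structure-spectrum correlation can be put to work, here is a minimal sketch in Python using made-up numbers (not the values reported in the paper): a linear fit between average bond angle and Raman band position, inverted to estimate the angle behind a measured band.

    # Illustrative sketch only: hypothetical (bond angle, Raman frequency)
    # pairs standing in for the correlation described in the paper.
    import numpy as np

    # Hypothetical average Si-O-Si angles (degrees) and Raman band positions
    # (cm^-1); in this made-up data, wider angles give lower frequencies.
    angle_deg = np.array([140.0, 145.0, 150.0, 155.0, 160.0])
    freq_cm1 = np.array([520.0, 505.0, 488.0, 472.0, 455.0])

    # Least-squares line: freq = slope * angle + intercept
    slope, intercept = np.polyfit(angle_deg, freq_cm1, 1)

    def angle_from_frequency(freq):
        """Invert the fitted line to estimate the angle behind a band."""
        return (freq - intercept) / slope

    print(f"slope = {slope:.2f} cm^-1 per degree")
    print(f"a band at 495 cm^-1 implies an angle near "
          f"{angle_from_frequency(495.0):.1f} degrees")

With a calibration of this kind in hand, a measured Raman band position can be translated into a structural parameter for precursors too disordered for X-ray analysis.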

In future work, Auerbach, Fan and their team plan to measure and model Raman spectra during the zeolite crystallization process to determine which tricyclic bridges are present and which become inherited by the resulting zeolites.

Credit: 
University of Massachusetts Amherst