
Engineering enzymes to turn plant waste into sustainable products

image: Professor McGeehan is the Director of the Centre for Enzyme Innovation in the School of Biological Sciences at Portsmouth.

Image: 
Stefan Venter, UPIX Photography

A new family of enzymes has been engineered to perform one of the most important steps in the conversion of plant waste into sustainable and high-value products such as nylon, plastics and chemicals.

The discovery was led by members of the same UK-US enzyme engineering team that, last year, engineered and improved a plastic-digesting enzyme, a potential breakthrough for the recycling of plastic waste.

The study, published in the journal Proceedings of the National Academy of Sciences, was led by Professor Jen Dubois at Montana State University, Dr Gregg Beckham at the US Department of Energy's National Renewable Energy Laboratory (NREL), Professor Ken Houk at the University of California, Los Angeles together with Professor John McGeehan's team at the University of Portsmouth.

The newly engineered enzyme is active on lignin, one of the main components of plants, which scientists have spent decades trying to break down efficiently.

Professor McGeehan, Director of the Centre for Enzyme Innovation in the School of Biological Sciences at Portsmouth, said: "This is our goal - to discover enzymes from nature, bring them into our laboratories to understand how they work, then engineer them to produce new tools for the biotechnology industry. In this case, we have taken a naturally occurring enzyme and engineered it to perform a key reaction in the breakdown of one of the toughest natural plant polymers.

"To protect their sugar-containing cellulose, plants have evolved a fascinatingly complicated material called lignin that only a small selection of fungi and bacteria can tackle. However, lignin represents a vast potential source of sustainable chemicals, so if we can find a way to extract and use those building blocks, we can create great things."

Lignin acts as scaffolding in plants and is central to water delivery, providing strength as well as defence against pathogens.

"It's an amazing material," Professor McGeehan said, "cellulose and lignin are among the most abundant biopolymers on earth. The success of plants is largely due to the clever mixture of these polymers to create lignocellulose, a material that is challenging to digest."

Current enzymes tend to work on only one of the building blocks of lignin, making the breakdown process inefficient. Using advanced 3D structural and biochemical techniques the team has been able to alter the shape of the enzyme to accommodate multiple building blocks. The results provide a route to making new materials and chemicals such as nylon, bioplastics, and even carbon fibre, from what has previously been a waste product.

The discovery also offers additional environmental benefits - creating products from lignin reduces our reliance on oil to make everyday products and offers an attractive alternative to burning it, helping to reduce CO2 emissions.

The research was carried out by an international team of experts in structural biology, biochemistry, quantum chemistry and synthetic biology at the Universities of Portsmouth, Montana State, Georgia, Kentucky and California, and two US national laboratories, NREL and Oak Ridge.

Dan Hinchen, a postgraduate student at the University of Portsmouth said: "We used X-ray crystallography at the Diamond Light Source synchrotron to solve ten enzyme structures in complex with lignin building blocks. This gave us the blueprint to engineer an enzyme to work on new molecules. Our colleagues were then able to transfer the DNA code for this new enzyme into an industrial strain of bacteria, extending its capability to perform multiple reactions."

Professor McGeehan said: "We now have proof-of-principle that we can successfully engineer this class of enzymes to tackle some of the most challenging lignin-based molecules and we will continue to develop biological tools that can convert waste into valuable and sustainable materials."

Credit: 
University of Portsmouth

Researchers explain visible light from 2D lead halide perovskites

image: Jiming Bao, associate professor of electrical and computer engineering at the University of Houston, led an international group of researchers investigating how a two-dimensional perovskite composed of cesium, lead and bromine was able to emit a strong green light.

Image: 
University of Houston

Researchers drew attention three years ago when they reported that a two-dimensional perovskite - a material with a specific crystal structure - composed of cesium, lead and bromine emitted a strong green light. Crystals that produce light on the green spectrum are desirable because green light, while valuable in itself, can also be relatively easily converted to other forms that emit blue or red light, making it especially important for optical applications ranging from light-emitting devices to sensitive diagnostic tools.

But there was no agreement about how the crystal, CsPb2Br5, produced the green photoluminescence. Several theories emerged, without a definitive answer.

Now, however, researchers from the United States, Mexico and China, led by an electrical engineer from the University of Houston, have reported in the journal Advanced Materials that they used sophisticated optical and high-pressure diamond anvil cell techniques to determine not only the mechanism for the light emission but also how to replicate it.

They initially synthesized CsPb2Br5 from a related material known as CsPbBr3 and found that the root cause of the light emission is a small overgrowth of nanocrystals composed of that original material, growing along the edge of the CsPb2Br5 crystals. While CsPbBr3, the base crystal, is three-dimensional and appears green under ultraviolet light, the new material, CsPb2Br5, has a layered structure and is optically inactive.

"Now that the mechanism for emitting this light is understood, it can be replicated," said Jiming Bao, associate professor of electrical and computer engineering at UH and corresponding author on the paper. "Both crystals have the same chemical composition, much like diamond versus graphite, but they have very different optical and electronic properties. People will be able to integrate the two materials to make better devices."

Potential applications range from solar cells to LED lighting and other electronic devices.

Bao began working on the problem in 2016, a project that ultimately involved 19 researchers from UH and institutions in China and Mexico. At the time, there were two schools of scientific thought on the light emission from the cesium crystal: either the green light was due to a defect, mainly a lack of bromine, rather than to the material itself, or a variation had unintentionally been introduced, resulting in the emission.

His group started with the synthesis of a clean sample by dropping CsPbBr3 powder in water, resulting in sharper-edged crystals. The sharper edges emitted a stronger green light, Bao said.

The researchers then used an optical microscope to study the individual crystals of the compound, which Bao said allowed them to determine that although the compound is transparent, "something was going on at the edge, resulting in the photoluminescence."

They relied on Raman spectroscopy - an optical technique that uses information about how light interacts with a material to determine the material's lattice properties - to identify nanocrystals of the original source material, CsPbBr3, along the edges of the crystal as the source of the light.

Bao said CsPbBr3 is too unstable to use on its own, but the stability of the converted form isn't hampered by the small amount of the original crystal.

The researchers said the new understanding of the light emission will yield new opportunities to design and fabricate novel optoelectronic devices. The techniques used to understand the cesium-lead-halide compound can also be applied to other optical materials to learn more about how they emit light, Bao said.

Credit: 
University of Houston

Columbia researchers provide new evidence on the reliability of climate modeling

image: Clouds from deep convection over the tropical Pacific ocean, photographed by the space shuttle. Such convective activity drives the Hadley circulation of the atmosphere.

Image: 
NASA

New York, NY--June 24, 2019--For decades, scientists studying a key climate phenomenon have been grappling with contradictory data that have threatened to undermine confidence in the reliability of climate models overall. A new study, published today in Nature Geoscience, settles that debate with regard to the tropical atmospheric circulation.

The Hadley circulation, or Hadley cell--a worldwide tropical atmospheric circulation pattern that occurs due to uneven solar heating at different latitudes surrounding the equator--causes air around the equator to rise to about 10-15 kilometers, flow poleward (toward the North Pole above the equator, the South Pole below it), descend in the subtropics, and then flow back to the equator along the Earth's surface. This circulation is widely studied by climate scientists because it controls precipitation in the subtropics and also creates a region called the intertropical convergence zone, producing a band of major storms with heavy precipitation.

The study, headed by Rei Chemke, a Columbia Engineering postdoctoral research fellow, together with climate scientist Lorenzo Polvani, addresses a major discrepancy between climate models and reanalyses regarding potential strengthening or weakening of the Hadley circulation in the Northern Hemisphere as a consequence of anthropogenic emissions.

Historically, climate models have shown a progressive weakening of the Hadley cell in the Northern Hemisphere. Over the past four decades reanalyses, which combine models with observational and satellite data, have shown just the opposite--a strengthening of the Hadley circulation in the Northern Hemisphere. Reanalyses provide the best approximation for the state of the atmosphere for scientists and are widely used to ensure that model simulations are functioning properly.

The difference in trends between models and reanalyses poses a problem that goes far beyond whether the Hadley cell is going to weaken or strengthen; the inconsistency itself is a major concern for scientists. Reanalyses are used to validate the reliability of climate models--if the two disagree, that means that either the models or reanalyses are flawed.

Lead author Chemke, a NOAA Climate and Global Change postdoctoral fellow, explains the danger of this situation, "It's a big problem if the models are wrong because we use them to project our climate and send our results to the IPCC (Intergovernmental Panel on Climate Change) and policy makers and so on."

To find the cause of this discrepancy, the scientists looked closely at the various processes that affect circulation, determining that latent heating is the cause of the inconsistency. To understand which data was correct--the models or the reanalyses--they had to compare the systems using a purely observational metric, untainted by any model or simulation. In this case, precipitation served as an observational proxy for latent heating since it is equal to the net latent heating in the atmospheric column. This observational data revealed that the artifact, or flaw, is in the reanalyses--confirming that the model projections for the future climate are, in fact, correct.

The paper's findings support previous conclusions drawn from a variety of models--the Hadley circulation is weakening. That's critical to understand, says Polvani, a professor of applied physics and applied mathematics and of earth and environmental sciences who studies the climate system at the Lamont-Doherty Earth Observatory. "One of the largest climatic signals associated with global warming is the drying of the subtropics, a region that already receives little rainfall," he explained. "The Hadley cell is an important control on subtropical precipitation. Hence, any changes in the strength of the Hadley cell will result in a change in precipitation in that region. This is why it is important to determine if, as a consequence of anthropogenic emission, the Hadley cell will speed up or slow down in the coming decades."

But these findings resonate far beyond the study in question. Resolving contradictory results in scientific research is critical to maintaining accuracy and integrity in the scientific community. Because of this new study, scientists now have added confidence that models are reliable tools for climate predictions.

Credit: 
Columbia University School of Engineering and Applied Science

Researchers unveil how soft materials react to deformation at molecular level

image: Chemical and biomolecular engineering researchers Johnny Ching-Wei Lee, left, professor Simon Rogers and collaborators are challenging previous assumptions regarding polymer behavior with their newly developed laboratory techniques that measure polymer flow at the molecular level.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Before designing the next generation of soft materials, researchers must first understand how they behave during rapidly changing deformation. In a new study, researchers challenged previous assumptions regarding polymer behavior with newly developed laboratory techniques that measure polymer flow at the molecular level.

This approach may lead to the design of new biomedical, industrial and environmental applications - from polymers that aid in blood clotting to materials that more efficiently extract oil and gas from wells.

The findings are published in the journal Physical Review Letters.

Understanding the mechanics of how materials molecularly react to changing flows is critical to developing high-quality materials, the researchers said, and defining a framework for interpreting and describing these properties has eluded scientists for decades.

"When polymeric materials - synthetic or biologic - are deformed, they react at both macroscopic and molecular scales," said Simon Rogers, a University of Illinois chemical and biomolecular engineering professor and lead author of the new study. "The relationship between the two scales of response is complex and has been, until now, difficult to describe."

Previous studies have attempted to characterize the relationship between the microscopic and macroscopic behaviors of polymer deformation mathematically, the researchers said, but have been unable to relate the physics to any well-defined microstructural observations.

"In our study, we wanted to measure both the structural and mechanical properties of polymers during deformation, directly shedding light on the origin of unique mechanical properties," said Johnny Ching-Wei Lee, a graduate student and study co-author. "We thought perhaps it was best to try and use direct observations to explain the complex physics."

In the lab, the researchers simultaneously measured multiscale deformations by combining traditional tools for measuring stress and deformation at the macroscopic level with a technique called neutron scattering to observe the structure at the molecular scale.

The team found something unexpected.

"With simultaneous neutron scattering and flow measurements, we are able to directly correlate structure and mechanical properties with a time resolution on the order of milliseconds," said study co-author Katie Weigandt, a researcher from the National Institute of Standards and Technology Center for Neutron Science. "This approach has led to fundamental understanding in a wide range of nanostructured complex fluids, and in this work, validates new approaches to making polymer flow measurements."

"Previous research had assumed that the amount of applied deformation at the macroscale is what soft materials experience at the microscale," Lee said. "But the neutron-scattering data from our study clearly shows that it is the deformation that can be recovered that matters because it dictates the whole response, in terms of macroscopic flow - something that was previously unknown."

The researchers said this development will help resolve several poorly understood phenomena in polymer research, such as why polymers expand during three-dimensional printing processes.

"We have come up with what we call a structure-property-processing relationship," Rogers said. "This subtle, yet fundamentally different way of thinking about the polymer behavior summarizes what we see as a simple and beautiful relationship that we expect to be quite impactful."

The research brings key insights to the long-standing challenge in soft condensed matter, and the team said the established structure-property-processing relationships could provide a new design criterion for soft materials.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Large cohort study confirms small added obstetric risk from transfer of longer developed embryos

Vienna, 24 June 2019: The transfer of embryos cultured for five or six days (instead of two or three) after fertilisation in IVF and ICSI has become routine in many fertility clinics. Many (but not all) studies show that transferring these longer-cultured, better developed embryos - known as blastocysts - increases the chance of pregnancy and live birth.(1)

However, concerns have been raised that this extended culture to the blastocyst stage may increase obstetric complications and increase perinatal risks. Some large studies have already shown slightly higher risks of preterm birth and large-for-gestational-age rates among infants born after blastocyst transfer than after transfer of traditional three-day embryos (or after natural conceptions).

Now, a study based on the large cohort of the Committee of Nordic ART and Safety (CoNARTaS) has confirmed these concerns and found that blastocyst transfer is indeed associated with a higher risk of preterm birth (before 37 weeks), and large-for-gestational-age rates. 'These results are important,' say the investigators, 'since an increasing number of all ART treatments are performed with blastocyst transfer.'

The results of this large CoNARTaS study are presented today at the 35th Annual Meeting of ESHRE in Vienna by Dr Anne Laerke Spangmose from Copenhagen University Hospital in Denmark.

The study itself was an analysis of almost 90,000 assisted reproduction (ART) babies born in the Nordic countries (Denmark, Norway, Sweden) up to 2015, of which 69,751 were singletons and 18,154 twins. The singleton cohort comprised 8,368 born after blastocyst transfer and 61,383 born after traditional three-day embryo transfer; the twin cohort included 1,167 children born after blastocyst transfer and 16,987 after three-day embryo transfer (at the 'cleavage stage').

The analysis showed that singletons born after fresh blastocyst transfer had a 23% higher risk of being large for gestational age than those born after cleavage-stage transfer. In everyday terms, the higher risk would mean an overall increase in incidence from 3.7% following cleavage-stage transfers to 4.3% following blastocyst transfers; Dr Spangmose described this increase as 'small'. Another small increase, in the risk of preterm birth, was found following the transfer of frozen blastocysts, but none after the transfer of fresh ones.

As a further objective, the study considered the chance of having twins following cleavage-stage or blastocyst transfer. Incidence increased from 2.3% following fresh day-three transfers to 4.0% following fresh blastocyst transfers, with similar rates for frozen. 'I would say that this is a large increase,' said Dr Spangmose, 'considering the risks of perinatal and obstetric outcomes in twin births.'
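A quick sketch of the arithmetic shows why an increase that looks small in absolute terms is described as large. This uses only the incidence figures quoted above; the study's adjusted risk estimates may differ slightly.

```python
# Twin rates reported in the study: 2.3% after fresh day-three (cleavage-stage)
# transfers vs 4.0% after fresh blastocyst transfers.
cleavage_rate = 0.023
blastocyst_rate = 0.040

absolute_increase = blastocyst_rate - cleavage_rate      # in percentage points
relative_increase = blastocyst_rate / cleavage_rate - 1  # proportional change

print(f"absolute increase: {absolute_increase * 100:.1f} percentage points")
print(f"relative increase: {relative_increase:.0%}")
```

The absolute rise is a modest 1.7 percentage points, but the relative rise is roughly 74%, which is why it is characterised as large given the perinatal and obstetric risks attached to twin births.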

The results suggest that blastocyst transfer does indeed carry a small but genuine increase in the risk of obstetric complications compared with day-three transfers. The increased risk of large-for-gestational-age offspring has been reported in both human and veterinary studies.

Explaining the higher rate of large-for-gestational-age following blastocyst transfers, Dr Spangmose said: 'Extended embryo culture implies more time in vitro for the embryo, leaving it more exposed to potential external stressors like temperature or oxygen concentration. Furthermore, the culture media used for cleavage-stage and blastocyst-stage embryos are different. Studies have suggested that the media themselves may influence specific gene expression and impact perinatal outcomes.'(1)

However, despite these higher obstetric and twin risks, other meta-analyses have found that blastocyst transfer is associated with a higher delivery rate than cleavage-stage transfer - which no doubt explains its increasing use in recent years. The latest Cochrane review (2016) of blastocyst versus cleavage-stage transfer found the former associated with a 48% higher chance of live birth in an initial 'fresh' transfer cycle. The Cochrane results mean - as the reviewers explained - that if 29% of women achieved live birth after fresh cleavage-stage transfer, between 32% and 42% would do so after fresh blastocyst transfer. Regulatory authorities have recognised this in their encouragement of single embryo transfer to reduce multiple pregnancy rates, still the greatest 'complication' of ART. Embryos which reach the blastocyst stage have been said to be more 'physiological' and thus better able to select themselves for viability.(2)
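As a rough illustration of how a relative figure like the Cochrane reviewers' 48% translates into an absolute range: if that figure is an odds ratio (as Cochrane reviews typically report) with a confidence interval of roughly 1.20 to 1.82 (an assumption for illustration; the press release does not give the interval), applying it to the 29% baseline live-birth rate on the odds scale gives numbers close to the stated 32-42%.

```python
def apply_odds_ratio(baseline_prob, odds_ratio):
    """Convert a baseline probability to a new probability under a given odds ratio."""
    baseline_odds = baseline_prob / (1 - baseline_prob)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

baseline = 0.29  # live-birth rate after fresh cleavage-stage transfer (from the text)

# 1.48 is the point estimate quoted in the text; the interval bounds are
# illustrative assumptions, not figures from the press release.
for or_value in (1.20, 1.48, 1.82):
    print(f"OR {or_value:.2f} -> live-birth rate {apply_odds_ratio(baseline, or_value):.1%}")
```

With these inputs the lower and upper bounds land near 33% and 43%, broadly consistent with the 32-42% range the reviewers report; small differences are expected since the exact interval is assumed here.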

'Today,' said Dr Spangmose, 'blastocyst culture plays a crucial role in ART treatment. It increases survival rates of frozen-thawed embryos and so supports the use of single embryo transfer to reduce twin births. From our results the greater use of frozen blastocyst transfer could offset the adverse effects of extended embryo culture. Nevertheless, we still believe that blastocyst culture should be used with caution - as a tool for embryo selection and for enhancing success rates in frozen ART cycles. But we still need to consider whether blastocyst culture should be the gold standard in fresh ART cycles given the adverse risks found in our study.'

Credit: 
European Society of Human Reproduction and Embryology

Branching out: Making graphene from gum trees

image: Eucalyptus bark extract has never been used to synthesise graphene sheets before.

Image: 
RMIT University

Graphene is the thinnest and strongest material known to humans. It's also flexible, transparent and conducts heat and electricity 10 times better than copper, making it ideal for anything from flexible nanoelectronics to better fuel cells.

A new approach to graphene synthesis, developed by researchers from RMIT University (Australia) and the National Institute of Technology, Warangal (India), uses Eucalyptus bark extract and is cheaper and more sustainable than current synthesis methods.

RMIT lead researcher, Distinguished Professor Suresh Bhargava, said the new method could reduce the cost of production from US$100 per gram to a staggering US$0.50 per gram.

"Eucalyptus bark extract has never been used to synthesise graphene sheets before and we are thrilled to find that it not only works, it's in fact a superior method, both in terms of safety and overall cost," said Bhargava.

"Our approach could bring down the cost of making graphene from around US$100 per gram to just 50 cents, increasing its availability to industries globally and enabling the development of an array of vital new technologies."

Graphene's distinctive features make it a transformative material that could be used in the development of flexible electronics, more powerful computer chips and better solar panels, water filters and bio-sensors.

Professor Vishnu Shanker from the National Institute of Technology, Warangal, said the 'green' chemistry avoided the use of toxic reagents, potentially opening the door to the application of graphene not only for electronic devices but also biocompatible materials.

"Working collaboratively with RMIT's Centre for Advanced Materials and Industrial Chemistry we're harnessing the power of collective intelligence to make these discoveries," he said.

A novel approach to graphene synthesis:

Chemical reduction of graphene oxide is the most common method for synthesising graphene, as it allows production at low cost and in bulk quantities.

This method however relies on reducing agents that are dangerous to both people and the environment.

When tested in the application of a supercapacitor, the 'green' graphene produced using this method matched the quality and performance characteristics of traditionally-produced graphene without the toxic reagents.

Bhargava said the abundance of eucalyptus trees in Australia made it a cheap and accessible resource for producing graphene locally.

"Graphene is a remarkable material with great potential in many applications due to its chemical and physical properties and there's a growing demand for economical and environmentally friendly large-scale production," he said.

Credit: 
RMIT University

Long work hours associated with increased risk of stroke

DALLAS, June 20, 2019 -- People who worked long hours had a higher risk of stroke, especially if they worked those hours for 10 years or more, according to new research in the American Heart Association's journal Stroke.

Researchers reviewed data from CONSTANCES, a French population-based study group started in 2012, for information on age (participants were aged 18-69), sex, smoking and work hours, derived from questionnaires completed by 143,592 participants. Cardiovascular risk factors and previous stroke occurrences were noted from separate medical interviews.

Researchers found:

overall, 1,224 of the participants suffered strokes;

29% (42,542) reported working long hours;

10% (14,481) reported working long hours for 10 years or more; and

participants working long hours had a 29% greater risk of stroke, and those working long hours for 10 years or more had a 45% greater risk of stroke.

Long work hours were defined as working more than 10 hours a day for at least 50 days per year. Part-time workers and those who suffered strokes before working long hours were excluded from the study.
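The bullet-point percentages follow directly from the reported counts; a quick sketch using only the numbers quoted above:

```python
participants = 143_592       # total questionnaire respondents

long_hours = 42_542          # reported working long hours
long_hours_10y = 14_481      # reported long hours for 10 years or more

print(f"long hours: {long_hours / participants:.1%} of participants")
print(f"long hours for 10+ years: {long_hours_10y / participants:.1%} of participants")
```

These come out at about 29.6% and 10.1%, which the release rounds to 29% and 10%. Note that the 29% and 45% figures for stroke risk are relative-risk estimates from the study's statistical models, not something recoverable from these counts alone.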

"The association between 10 years of long work hours and stroke seemed stronger for people under the age of 50," said study author Alexis Descatha, M.D., Ph.D., a researcher at Paris Hospital, Versailles and Angers University and at the French National Institute of Health and Medical Research (Inserm). "This was unexpected. Further research is needed to explore this finding.

"I would also emphasize that many healthcare providers work much more than the definition of long working hours and may also be at higher risk of stroke," Descatha said. "As a clinician, I will advise my patients to work more efficiently and plan to follow my own advice."

Previous studies noted a smaller effect of long work hours among business owners, CEOs, farmers, professionals and managers. Researchers noted that it might be because those groups generally have greater decision latitude than other workers. In addition, other studies have suggested that irregular shifts, night work and job strain may be responsible for unhealthy work conditions.

Credit: 
American Heart Association

Rare recessive mutations pry open new windows on autism

Over the past decade, autism spectrum disorder has been linked to mutations in a variety of genes, explaining up to 30 percent of all cases to date. Most of these variants are de novo mutations, which are not inherited, affect just one copy of a gene, and are relatively easy to find. The lab of Timothy Yu, MD, PhD, at Boston Children's Hospital chose a road less travelled, tracking rare recessive mutations in which a child inherits two "bad" copies of a gene.

The study, involving one of the largest cohorts to date, suggests that recessive mutations are more common in autism than previously thought. The findings, published June 17 in Nature Genetics, provide a likely explanation for up to 5 percent of all autism cases and offer new clues to autism's biological causes.

"This is the deepest dive yet into recessive mutations in autism - but we're not done," says Yu, who led the study with first author Ryan Doan, PhD, in Boston Children's Division of Genetics and Genomics. "This study offers a glimpse of an interesting part of the puzzle we've yet to assemble."

Doubling down on double hits

Recessive mutations have been linked to autism in the past, mostly in small study populations in areas where marriages between relatives are common. When parents are fairly closely related genetically, their offspring are more likely to get "double hits" of genetic variants - mostly harmless, but some of which may be disease-causing.

The new study represented a much broader population: 8,195 individuals in the international Autism Sequencing Consortium, founded by study co-author Joseph Buxbaum, PhD, of the Icahn School of Medicine at Mount Sinai. The study included 2,343 individuals affected with autism from the U.S., the U.K., Central America, Germany, Sweden, the Middle East, and Finland. It examined whole-exome data, comparing DNA sequences for all protein-coding genes in these individuals with autism, versus 5,852 unaffected controls.

The researchers first looked for "loss of function" or "knockout" mutations that completely disabled the gene, such that the proteins they normally encode are truncated and non-functional. "The concept is simple, though the execution took a lot of careful work," says Doan, who was the study's first author.

The team identified loss-of-function mutations that were both rare (affecting less than 1 percent of the cohort) and biallelic (affecting both copies of the gene) in 266 people with autism. Overall, people with autism were 62 percent more likely than the control group to have disabling mutations in both copies of a gene.

The team also looked for biallelic missense mutations, which involve a change in a single amino acid (a "spelling error"). Missense mutations are more common than loss-of-function mutations, and some of them cause just as much damage. Biallelic missense mutations, too, were significantly more common in the autism group.

Biological clues

After excluding genetic variants that were also found in the control group and in a separate large cohort of more than 60,000 individuals without autism, Doan, Yu and colleagues were left with 41 genes that were knocked out only in individuals with autism. Overall, the researchers estimate that these genes explain another 3 to 5 percent of all cases of autism (2 percent from loss-of-function mutations, and 1 to 3 percent from missense mutations).

Eight of these had already been flagged in previous studies. The remaining 33 had never been linked to autism before, and several have intriguing attributes that call out for more investigation.

One gene, SLC1A1, for example, helps modulate activity of the brain neurotransmitter glutamate, and has been linked to a metabolic disorder associated with intellectual disability and obsessive-compulsive disorder. Another gene lost in two brothers, FEV, is critical for making the brain neurotransmitter serotonin, providing further support for the idea that dysfunction of serotonin signaling is central to autism.

Many of the double knockouts were found in just one individual and would need to be confirmed in other patients, Yu notes.

Male susceptibility to autism confirmed

Rates of autism are known to be higher among males than females, in a roughly 4:1 ratio. Yet previous studies, mostly looking at de novo mutations, have found that boys tend to have milder mutations and girls tend to have more severe mutations, a seeming contradiction.

"One hypothesis is that the female brain is somehow more robust -- that it has more reserve and is more resistant to autism, so it takes a bigger hit to knock it down," says Yu. "We asked, does the same pattern hold true for recessive mutations? And we found that it does - females had a higher rate of complete gene knockout than males."

In fact, a surprising 1 in 10 girls had a biallelic gene knockout caused by either loss-of-function mutations or severe missense mutations. And interestingly, one boy with autism lost a gene involved in estrogen signaling, suggesting that something in the estrogen pathway could be a risk factor for autism.

Credit: 
Boston Children's Hospital

B chromosome first -- mechanisms behind the drive of B chromosomes uncovered

The specific number of chromosomes is one of the defining characteristics of a species. While the common fruit fly carries 8 chromosomes and the genome of bread wheat counts 42, the human genome comprises a total of 46 chromosomes. However, about 15% of all eukaryotic species additionally carry supernumerary chromosomes referred to as "B chromosomes". Unlike the essential chromosomes of the genome, B chromosomes are expendable and often preferentially inherited, which gives them a transmission advantage called "chromosome drive". To date, little is known about the mechanisms behind this phenomenon. Researchers from the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) in Gatersleben have now been able to decipher the mechanisms behind the drive of B chromosomes in the goatgrass Aegilops speltoides. The novel insights into the workings of chromosome drive were recently published in New Phytologist.

Chromosomes are thread-like structures carrying the nuclear DNA. The majority of chromosomes found in the genome of an organism, also referred to as A chromosomes, carry genetic information vital for its development and reproduction. Nevertheless, thousands of fungal, plant and animal species carry supernumerary chromosomes, which bring no visible advantage to their host organism and can even reduce the host's fitness when present in higher numbers. These so-called B chromosomes were first described in 1907, and the purpose of these seemingly selfish chromosomes has been a black box to scientists ever since.

Recent studies suggest that B chromosomes are likely by-products of the evolution of the standard A chromosomes. Further, it is known that B chromosomes have an advantage when it comes to their inheritance. Instead of obeying the Mendelian law of equal segregation, which describes how pairs of chromosomes are separated into gametes, B chromosomes often show an increased transmission rate, higher than 0.5. Understanding this "drive" of the B chromosomes is considered key to understanding their biology. Scientists from the research group "Chromosome Structure and Function" of the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) in Gatersleben have now uncovered the mechanisms behind the drive of B chromosomes in one of the progenitors of wheat, the grass Aegilops speltoides.
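As a purely illustrative sketch (the rates below are placeholders, not values measured in the study), the difference between Mendelian segregation and chromosome drive can be simulated in a few lines of Python:

```python
import random

def transmission_frequency(rate, n_gametes=100_000, seed=1):
    """Simulate gamete formation for a carrier of a single B chromosome.

    Under Mendelian segregation, the B chromosome ends up in half of
    the gametes (rate = 0.5); a "driving" B chromosome is transmitted
    at a rate above 0.5.
    """
    rng = random.Random(seed)
    carriers = sum(rng.random() < rate for _ in range(n_gametes))
    return carriers / n_gametes

mendelian = transmission_frequency(0.5)   # ~0.5: no transmission advantage
driving = transmission_frequency(0.75)    # >0.5: chromosome drive
```

Across generations, even a modest per-transmission bias like this compounds, which is why drive lets an otherwise expendable chromosome persist in a population.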

Ae. speltoides is known to carry up to 8 B chromosomes in addition to its 14 standard A chromosomes. Using comparative genomics, the scientists identified a B chromosome-specific repeat within Ae. speltoides, which allowed them to identify and track the B chromosomes during pollen development. This revealed that an asymmetric spindle and the nondisjunction of the chromosomes' sister chromatids during the first pollen grain mitosis caused the accumulation of B chromosomes. Further, in order to differentiate between the vegetative and sperm nuclei of the pollen grain, as well as to quantify the B chromosomes within the different nuclei, the scientists developed a novel flow cytometric approach. Thanks to this, they were able to show that over 93% of the B chromosomes accumulated within the generative sperm nuclei.

The study demonstrates quantitative flow cytometry as a useful and reliable method to investigate the drive of B chromosomes. More importantly, it provides a new understanding of the mechanisms behind chromosome drive, which will help to further efforts in deciphering the function, regulation and evolution of plant chromosomes. As such, perhaps one day, the hidden purpose behind the selfish chromosomes will be revealed.

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

Scientists make single-cell map to reprogram scar tissue into healthy heart cells

image: UNC School of Medicine scientists created a cellular road map to turn fibroblasts into cardiomyocytes.

Image: 
Qian Lab, UNC School of Medicine

CHAPEL HILL, N.C. - Every year 790,000 Americans suffer a heart attack, which leaves damaged scar tissue on the heart and limits its ability to beat efficiently. But what if scientists could reprogram scar tissue cells called fibroblasts into healthy heart muscle cells called cardiomyocytes? Researchers have made great strides on this front with lab experiments and research in mice, but human cardiac reprogramming has remained a great challenge.

Now, for the first time, researchers at the McAllister Heart Institute (MHI) at the University of North Carolina at Chapel Hill have developed a stable, reproducible, minimalistic platform to reprogram human fibroblast cells into cardiomyocytes. And by taking advantage of the latest single-cell technologies and mathematical simulation, the researchers have produced a high-resolution molecular roadmap to guide precise and efficient reprogramming.

This work, published in the journal Cell Stem Cell today, was led by senior author Li Qian, PhD, Associate Professor of Pathology and Laboratory Medicine and Associate Director of MHI, a pioneer in cardiac reprogramming over the past decade. Her lab's latest work has pushed cardiac reprogramming in human patients one step closer to reality, with an eye toward helping millions of people recover from the debilitating effects of heart attacks.

"We're confident our interdisciplinary approach of combining biological experiments with single-cell genomic analyses will inspire future crucial steps toward understanding the nature of human cardiomyocytes and translating this knowledge into regenerative therapies," Qian said.

Qian and her team introduced a cocktail of three genes - Mef2c, Gata4, and Tbx5 - to human cardiac fibroblast cells with a specific optimized dose. To increase efficiency, they performed a screen of supplementary factors and identified MIR-133, a small RNA molecule that when added to the three-gene cocktail - and with further in-culture modifications - reprogrammed human cardiac fibroblast cells into cardiomyocytes at an efficiency rate of 40 to 60 percent.

Next, the group sought to answer how the process of converting cells into cardiomyocytes actually works. To answer this, they looked at the molecular changes of individual cells throughout the process of reprogramming. Their analysis identified a critical point during the reprogramming process when a cell has to "decide" between progressing into a cardiomyocyte or regressing to its previous fibroblast fate. Once that process begins, a suite of signaling molecules and proteins launches the cells onto different molecular routes that dictate their cell type development.

The researchers also created a unique cell fate index to quantitatively assess the progress of reprogramming. Using this index, they determined that human cardiac reprogramming progresses at a much slower pace than that of the previously well-described mouse reprogramming, revealing key differences across species and reprogramming conditions.
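The authors' actual cell fate index is defined in the Cell Stem Cell paper; purely as a hypothetical illustration of how such a progress score could be built, one might rate each cell by its relative similarity to reference expression profiles of the starting (fibroblast) and target (cardiomyocyte) cell types:

```python
import math

def cosine(a, b):
    """Cosine similarity between two non-negative expression vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def cell_fate_index(cell, fibroblast_ref, cardiomyocyte_ref):
    """Toy progress score: 0 = fibroblast-like, 1 = cardiomyocyte-like.

    This is NOT the index from the paper -- just a sketch of mapping a
    cell's expression profile onto a 0-to-1 reprogramming axis.
    """
    s_cm = cosine(cell, cardiomyocyte_ref)
    s_fib = cosine(cell, fibroblast_ref)
    return s_cm / (s_cm + s_fib)

fib_ref = [5.0, 1.0, 0.5]   # hypothetical marker expression levels
cm_ref = [0.5, 1.0, 5.0]
halfway = cell_fate_index([2.75, 1.0, 2.75], fib_ref, cm_ref)  # 0.5 by symmetry
```

Tracking such a score per cell over time is one way a slower trajectory (as reported for human versus mouse reprogramming) would become visible.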

"You can think of this cell reprogramming process like map routes on Google," Qian said. "The starting scar-forming fibroblasts are like cars looking for the right GPS instructions to get them home - to a cardiomyocyte fate. Our work identified roadblocks, wrong exit ramps, and gas stations - the genetic facilitators - to get the fibroblasts to the destination we want. Our newly developed cell fate index is like a gauge on the dashboard, predicting how far home is."

This work illuminates previously unidentified characteristics of human cardiac reprogramming and provides new research tools to better understand the processes of cell fate transition and reprogramming within humans.

"Our single-cell pipelines and new algorithms can certainly be used for studying other biological processes, including differentiation, de-differentiation, or drug response of a cell," Qian added. "This research approach is not limited to the heart, but heart disease remains the number one killer in the world and the main focus of our lab."

Credit: 
University of North Carolina Health Care

'Robot blood' powers machines for lengthy tasks

ITHACA, N.Y. - Researchers at Cornell University have created a system of circulating liquid -- "robot blood" -- within robotic structures, to store energy and power robotic applications for sophisticated, long-duration tasks.

The researchers have created a synthetic vascular system capable of pumping an energy-dense hydraulic liquid that stores energy, transmits force, operates appendages and provides structure, all in an integrated design.

"In nature we see how long organisms can operate while doing sophisticated tasks. Robots can't perform similar feats for very long," said Rob Shepherd, associate professor of mechanical and aerospace engineering at Cornell. "Our bio-inspired approach can dramatically increase the system's energy density while allowing soft robots to remain mobile for far longer."

Shepherd, director of the Organic Robotics Lab, is senior author of "Electrolytic Vascular Systems for Energy Dense Robots," which published June 19 in Nature. Doctoral student Cameron Aubin is lead author.

The researchers tested the concept by creating an aquatic soft robot inspired by a lionfish, designed by co-author James Pikul, a former postdoctoral researcher, now an assistant professor at the University of Pennsylvania. Lionfish use undulating fanlike fins to glide through coral-reef environments.

Silicone skin on the outside with flexible electrodes and an ion separator membrane within allows the robot to bend and flex. Interconnected zinc-iodide flow cell batteries power onboard pumps and electronics through electrochemical reactions. The researchers achieved energy density equal to about half that of a Tesla Model S lithium-ion battery.

The robot swims using power transmitted to the fins from the pumping of the flow cell battery. The initial design provided enough power to swim upstream for more than 36 hours.

Underwater soft robots offer tantalizing possibilities for research and exploration. Since aquatic soft robots are supported by buoyancy, they don't require an exoskeleton or endoskeleton to maintain structure. By designing power sources that give robots the ability to function for longer stretches of time, Shepherd thinks autonomous robots could soon be roaming Earth's oceans on vital scientific missions and for delicate environmental tasks like sampling coral reefs. These devices could also be sent to extraterrestrial worlds for underwater reconnaissance missions.

Credit: 
Cornell University

3D technology might improve body appreciation for young women

image: In a new study from the MU Center for Body Image Research and Policy, researchers found that digitally painting 3D avatars might have positive effects on body image and mental health.

Image: 
MU News Bureau

COLUMBIA, Mo. - 3D technology has transformed movies and medical imaging, and now it might be able to help young women better appreciate their bodies.

Virginia Ramseyer Winter, assistant professor in the School of Social Work and director of the MU Center for Body Image Research and Policy, is a nationally recognized body image expert. In a new study, she found that images from 3D scanners can be used to help young women focus on body appreciation, which might in turn improve mental health.

"3D body image scanning is a relatively new tool in social science research, and the research on using 3D tools for improving body image is scant," Ramseyer Winter said. "We wanted to see if it could provide a way to help young women shift their focus away from appearance and toward function."

In her study, young adult women between the ages of 18 and 25 were scanned in a 3D scanner used by researchers and students in MU's Department of Textile and Apparel Management. The researchers used modeling software to convert the scans to 3D avatars. Participants then digitally "painted" body parts that they appreciated for various reasons such as their utility or role in their relationships.

"In digitally painting their avatars, women could think about how, for example, their thighs help them run or how their arms can help hold others in an embrace," Ramseyer Winter said. "It provided the participants a way to visualize their bodies in a completely different way. It allowed the participants to recognize how our bodies are much more than a size or a number on a scale."

Immediately after digitally painting their avatars, and again three months later, participants reported increased body appreciation. Moreover, participants reported lower depressive and anxiety symptoms.

"While more research still needs to be done on the relationship between the 3D image intervention we used and its impact on mental health, we did see a significant effect on body appreciation," Ramseyer Winter said. "Prior research has shown that body appreciation is related to depression and anxiety, which leads us to think that we are on the right track in creating an intervention that can improve not only body image, but mental health as well."

Future research will look at how painting the 3D avatars impacts young women with more severe depression.

Credit: 
University of Missouri-Columbia

Crystal with a twist: Scientists grow spiraling new material

image: UC Berkeley and Berkeley Lab researchers created a new crystal built of a spiraling stack of atomically thin germanium sulfide sheets.

Image: 
UC Berkeley image by Yin Liu

With a simple twist of the fingers, one can create a beautiful spiral from a deck of cards. In the same way, scientists at the University of California, Berkeley, and Lawrence Berkeley National Laboratory (Berkeley Lab) have created new inorganic crystals made of stacks of atomically thin sheets that unexpectedly spiral like a nanoscale card deck.

Their surprising structures, reported in a new study that appeared online June 19 in the journal Nature, may yield unique optical, electronic and thermal properties, including superconductivity, the researchers say.

These helical crystals are made of stacked layers of germanium sulfide, a semiconductor material that, like graphene, readily forms sheets that are only a few atoms or even a single atom thick. Such "nanosheets" are usually referred to as "2D materials."

"No one expected 2D materials to grow in such a way. It's like a surprise gift," said Jie Yao, an assistant professor of materials science and engineering at UC Berkeley. "We believe that it may bring great opportunities for materials research."

While the shape of the crystals may resemble that of DNA, whose helical structure is critical to its job of carrying genetic information, their underlying structure is actually quite different. Unlike "organic" DNA, which is primarily built of familiar atoms like carbon, oxygen and hydrogen, these "inorganic" crystals are built of more far-flung elements of the periodic table -- in this case, sulfur and germanium. And while organic molecules often take all sorts of zany shapes, due to unique properties of their primary component, carbon, inorganic molecules tend more toward the straight and narrow.

To create the twisted structures, the team took advantage of a crystal defect called a screw dislocation, a "mistake" in the orderly crystal structure that gives it a bit of a twisting force. This "Eshelby Twist", named after scientist John D. Eshelby, has been used to create nanowires that spiral like pine trees. But this study is the first time the Eshelby Twist has been used to make crystals built of stacked 2D layers of an atomically thin semiconductor.

"Usually, people hate defects in a material -- they want to have a perfect crystal," said Yao, who also serves as a faculty scientist at Berkeley Lab. "But it turns out that, this time, we have to thank the defects. They allowed us to create a natural twist between the material layers."

In a major discovery last year, scientists reported that graphene becomes superconductive when two atomically thin sheets of the material are stacked and twisted at what's called a "magic angle." While other researchers have succeeded at stacking two layers at a time, the new paper provides a recipe for synthesizing stacked structures that are hundreds of thousands or even millions of layers thick in a continuously twisting fashion.

"We observed the formation of discrete steps in the twisted crystal, which transforms the smoothly twisted crystal to circular staircases, a new phenomenon associated with the Eshelby Twist mechanism," said Yin Liu, co-first author of the paper and a graduate student in materials science and engineering at UC Berkeley. "It's quite amazing how the interplay of materials could result in many different, beautiful geometries."

By adjusting the material synthesis conditions and length, the researchers could change the angle between the layers, creating a twisted structure that is tight, like a spring, or loose, like an uncoiled Slinky. And while the research team demonstrated the technique by growing helical crystals of germanium sulfide, it could likely be used to grow layers of other materials that form similar atomically thin layers.

"The twisted structure arises from a competition between stored energy and the energy cost of slipping two material layers relative to one another," said Daryl Chrzan, chair of the Department of Materials Science and Engineering and senior theorist on the paper. "There is no reason to expect that this competition is limited to germanium sulfide, and similar structures should be possible in other 2D material systems."

"The twisted behavior of these layered materials, typically with only two layers twisted at different angles, has already showed great potential and attracted a lot of attention from the physics and chemistry communities. Now, it becomes highly intriguing to find out, with all of these twisted layers combined in our new material, if they will show quite different material properties than regular stacking of these materials," Yao said. "But at this moment, we have very limited understanding of what these properties could be, because this form of material is so new. New opportunities are waiting for us."

Credit: 
University of California - Berkeley

Mobile crisis service reduces youth ER visits for behavioral health needs, says study

Children and youth with acute behavioral health needs who are seen through Connecticut's Mobile Crisis Intervention Service - a community-based program that provides mental health interventions and services to patients 18 years and younger - have a lower risk of experiencing a follow-up episode and are less likely to show up in an emergency room if and when another episode occurs.

That's according to a study conducted by researchers in UConn's School of Social Work published today in Psychiatric Services, a journal of the American Psychiatric Association.

"We have a huge, national issue with the influx of children and youth who have mental health needs, that are being identified by families and schools, who end up going into emergency departments that are not adequately staffed," said Michael Fendrich, a professor and associate dean for research at the School of Social Work, who was the lead researcher and author of the study, "and it ends up not as effectively addressing the mental health problems of the youth who are coming in with these acute behavioral health needs."

While mobile intervention and response services for adults are common nationwide, similar intervention services for children and youth are far less prevalent. With Connecticut's Mobile Crisis program, Fendrich said, "there's a really attentive and deeply multidimensional level of expertise among the providers, so it's a better system for identifying and addressing those acute behavioral health needs."

The study reviewed Medicaid claims data for youth served by Mobile Crisis in the 2014 fiscal year and found that, on average, youth served by Mobile Crisis had a 22 percent reduction in their rate of risk for subsequent emergency room visits during the 18 months following their Mobile Crisis intervention, when compared to youth served in the emergency room for an acute behavioral health need.
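That 22 percent figure can be read as an incidence rate ratio of roughly 0.78 (the repeat-visit rate among Mobile Crisis youth divided by the rate among emergency-room-served youth). A minimal sketch with made-up counts, not the study's actual data:

```python
def rate_ratio(events_a, person_time_a, events_b, person_time_b):
    """Incidence rate ratio: event rate in group A relative to group B."""
    return (events_a / person_time_a) / (events_b / person_time_b)

# Hypothetical counts over the same 18-month follow-up window:
# 78 repeat ER visits among Mobile Crisis youth vs. 100 among
# ER-served youth, with equal person-time in each group.
irr = rate_ratio(78, 1800, 100, 1800)   # 0.78, i.e. a 22% lower rate
```

With equal person-time in both groups, the ratio reduces to the ratio of event counts, which is why 78 vs. 100 visits yields exactly 0.78.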

"The results of this study are groundbreaking for policymakers and hospitals struggling to keep up with the number of children going to the emergency department for mental health issues," said Jeffrey Vanderploeg, a journal article co-author and President and CEO of the Child Health and Development Institute of Connecticut (CHDI), which commissioned the study and serves as the Performance Improvement Center for Mobile Crisis through a contract with the state Department of Children and Families.

Nationwide, pediatric behavioral health visits to emergency departments have skyrocketed in recent years; from 2009 to 2013, psychiatric visits to emergency departments for patients under 18 increased by more than 40 percent in the U.S. According to the researchers, emergency departments often lack providers with the specialized training necessary to address child mental health needs and, with limited resources and the strain of emergency medical situations, are often not equipped to provide the follow-up care needed to address ongoing problems. Emergency department care is also expensive.

"The implication of this is that there are models for effectively addressing the emergency department crisis we're facing, and this mobile model, which has proven effective for adults, also shows substantial promise for diversion and for helping us address this crisis in children and youth," said Fendrich. "And that's critical."

In addition to data analysis, the study included extensive focus group work and interviews with Mobile Crisis providers in Connecticut, who stressed the importance of family involvement in the program's success and suggested that greater involvement from the community and additional education for providers, school systems and emergency departments about Mobile Crisis could encourage further success for the program.

"Hospitals are doing their best to provide excellent care to children with behavioral health issues but they are often the first to acknowledge that they are not the best place for many of the children that present with a behavioral health need," said Vanderploeg. "There are many things states can do, such as investing in a mobile crisis system, to divert children that don't really need to be in the ED to more appropriate community-based services."

Connecticut's Mobile Crisis service is available to all Connecticut residents free of charge and can be accessed by calling 2-1-1, the state's partnership with the United Way.

Department of Children and Families Commissioner Vanessa Dorantes '98 MSW, who oversees the Mobile Crisis program, said, "This study further demonstrates that when we deliver timely help for a child and family in crisis, we reduce the likelihood of that child being unnecessarily sent to a hospital emergency department, and instead being well supported within their community. Connecticut's system has long provided a comprehensive, accessible and family centered approach to supporting children and their families both in crisis and linking them to ongoing supports."

While the study results were overwhelmingly positive for Mobile Crisis, Fendrich said that further study should attempt to replicate the results for additional fiscal years. Fendrich also noted that even though Mobile Crisis services were beneficial for keeping youth out of the emergency department, many youth receiving services did end up there. He and his collaborators are looking more closely at the group receiving Mobile Crisis services in order to identify predictors of their emergency department use. This could help identify at-risk populations and customize potential interventions.

Additionally, Fendrich would like to involve families in further evaluation of Mobile Crisis in order to learn how consumers benefit from the program or feel it could be improved.

"Families don't always know what the resources are that they can employ, and often the times that you employ the services are very urgent situations," Fendrich said. "Dealing with mental health, dealing with behavioral diagnosis issues in children, is stressful for the entire family, and resources like Mobile Crisis are just potentially amazing in helping to resolve some of those stresses."

Credit: 
University of Connecticut

Finding 'Nemo's' family tree of anemones

image: The "bubble-tip" sea anemone Entacmaea quadricolor.

Image: 
© B. Titus

Thanks in part to the popular film Finding Nemo, clownfishes are well known to the public and well represented in scientific literature. But the same can't be said for the equally colorful sea anemones--venomous, tentacled animals--that protect clownfishes and that the fish nourish and protect in return. A new study published online this month in the journal Molecular Phylogenetics and Evolution takes a step to change that, presenting a new tree of life for clownfish-hosting sea anemones along with some surprises about their taxonomy and origins.

"It's astounding that when we look at the relationship between clownfishes and sea anemones, which is perhaps one of the most popular examples of symbiosis out there, we have essentially no clue what is going on with one of the two major players," said Estefanía Rodríguez, one of the co-authors on the new study and an associate curator in the American Museum of Natural History's Division of Invertebrate Zoology.

The relationship between the anemone and the clownfish is a mutually beneficial one. The fish have the ability to produce a mucus coating that allows them to shelter within the anemone's venom-filled tentacles without being stung. This protects clownfishes from bigger fishes, like moray eels, which can be stung by the anemone if they get too close. In return, the highly territorial clownfishes will ward away animals that might try to eat the anemone. In addition, their feces serve as an important source of nitrogen for the anemone, and some research suggests that as the fish wiggle through the anemone's swaying tentacles, they help oxygenate the host, possibly helping it grow.

There are about 30 clownfishes that have this symbiotic relationship with anemones, and they originated in the "coral triangle" of southeast Asia. There are 10 described species of clownfish-hosting anemones, but scientists suspect that the total number may be much higher. And the information on the origin of these species, as well as the number of times the symbiosis evolved in anemones, is sparse and dated. To fill in these gaps, the research team, led by Museum Gerstner Scholar and Lerner Gray Postdoctoral Fellow Benjamin Titus, built a phylogenetic tree based on DNA from newly collected anemone specimens.

They found that as a group, anemones independently evolved the ability to host clownfish three times throughout history. That finding in itself was not unexpected, but the groupings of the species were very different from what previous work had predicted.

Two of the three independent groups originated in the Tethys Sea, an ancient ocean that separated the supercontinent of Laurasia from Gondwana during much of the Mesozoic, and in today's geography, is located near the Arabian Peninsula. The data are unclear about the origin of the third group.

"For a symbiosis that's supposedly highly co-evolved, the groups originated in very different parts of the world and probably also at very different times," Titus said.

The findings suggest that these anemones, at least the ones that originated in the Tethys, are quite old, dating back at least 12 to 20 million years and possibly earlier.

Research on this group is especially relevant as clownfishes--and their anemones--face threats from the aquarium and pet trade.

"These are very heavily collected animals, but we don't even know how many species exist in this group," Rodríguez said. "We have a lot of work to do so we can determine what's there now, what kind of threats they face, and how we can protect them."

Credit: 
American Museum of Natural History