
A new way to control microbial metabolism

Cambridge, Mass. -- Microbes can be engineered to produce a variety of useful compounds, including plastics, biofuels, and pharmaceuticals. However, in many cases, these products compete with the metabolic pathways that the cells need to fuel themselves and grow.

To help cells produce desired compounds while still maintaining their own growth, MIT chemical engineers have devised a way to induce bacteria to switch between different metabolic pathways at different times. These switches are programmed into the cells and are triggered by changes in population density, with no need for human intervention.

"What we're hoping is that this would allow more precise regulation of metabolism, to allow us to get higher productivity, but in a way where we minimize the number of interventions," says Kristala Prather, the Arthur D. Little Professor of Chemical Engineering and the senior author of the study.

This kind of switching allowed the researchers to boost the microbial yields of two different products by up to tenfold.

MIT graduate student Christina Dinh is the lead author of the paper, which appears in the Proceedings of the National Academy of Sciences this week.

Double switch

To make microbes synthesize useful compounds that they don't normally produce, engineers insert genes for enzymes involved in the metabolic pathway -- a chain of reactions that generate a specific product. This approach is now used to produce many complex products, such as pharmaceuticals and biofuels.

In some cases, intermediates produced during these reactions are also part of metabolic pathways that already exist in the cells. When cells divert these intermediates out of the engineered pathway, it lowers the overall yield of the end product.

Using a concept called dynamic metabolic engineering, Prather has previously built switches that help cells maintain the balance between their own metabolic needs and the pathway that produces the desired product. Her idea was to program the cells to autonomously switch between pathways, without the need for any intervention by the person operating the fermenter where the reactions take place.

In a study published in 2017, Prather's lab used this approach to program E. coli to produce glucaric acid, a precursor to products such as nylons and detergents. The researchers' strategy was based on quorum sensing, a phenomenon that bacterial cells normally use to communicate with each other. Each species of bacteria secretes particular molecules that help them sense nearby microbes and influence each other's behavior.

The MIT team engineered their E. coli cells to secrete a quorum sensing molecule called AHL. When AHL concentrations reach a certain level, the cells shut off an enzyme that diverts a glucaric acid precursor into one of the cells' own metabolic pathways. This allows the cells to grow and divide normally until the population is large enough to start producing large quantities of the desired product.
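In spirit, the switch acts like a population-density thermostat: grow first, produce later. As a toy illustration only (every parameter and the threshold below are invented, not taken from the study), the logic can be sketched as:

```python
# Toy model of a density-triggered metabolic switch (illustrative only;
# none of these parameters come from the study).
# Cells grow logistically and secrete an AHL-like signal in proportion
# to population density. Once the signal crosses a threshold, the
# "diverting" enzyme is treated as switched off and flux is redirected
# to the engineered product.

def simulate(steps=200, dt=0.1, growth_rate=1.0, capacity=1.0,
             secretion=0.5, ahl_threshold=0.2):
    cells, ahl, product = 0.01, 0.0, 0.0
    switched = False
    for _ in range(steps):
        cells += dt * growth_rate * cells * (1 - cells / capacity)
        ahl += dt * secretion * cells
        if ahl >= ahl_threshold:
            switched = True            # quorum reached: stop diverting flux
        if switched:
            product += dt * cells      # production scales with biomass
    return cells, ahl, switched, product

cells, ahl, switched, product = simulate()
print(f"density={cells:.2f}  AHL={ahl:.2f}  switched={switched}  product={product:.2f}")
```

The point of the sketch is the ordering: biomass accumulates first, and production is unlocked only once the AHL proxy for density crosses the threshold.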

"That paper was the first to demonstrate that we could do autonomous control," Prather says. "We could start the cultures going, and the cells would then sense when the time was right to make a change."

In the new PNAS paper, Prather and Dinh set out to engineer multiple switching points into their cells, giving them a greater degree of control over the production process. To achieve that, they used two quorum sensing systems from two different species of bacteria. They incorporated these systems into E. coli that were engineered to produce a compound called naringenin, a flavonoid that is naturally found in citrus fruits and has a variety of beneficial health effects.

Using these quorum sensing systems, the researchers engineered two switching points into the cells. One switch was designed to prevent bacteria from diverting a naringenin precursor called malonyl-CoA into the cells' own metabolic pathways. At the other switching point, the researchers delayed production of an enzyme in their engineered pathway, to avoid the buildup of a precursor that inhibits the naringenin pathway at high concentrations.

"Since we took components from two different quorum sensing systems, and the regulator proteins are unique between the two systems, we can shift the switching time of each of the circuits independently," Dinh says.

The researchers created hundreds of E. coli variants that perform these two switches at different population densities, allowing them to identify which one was the most productive. The best-performing strain showed a tenfold increase in naringenin yield over strains that didn't have these control switches built in.

More complex pathways

The researchers also demonstrated that the multiple-switch approach could be used to double E. coli production of salicylic acid, a building block of many drugs. This process could also help improve yields for any other type of product where the cells have to balance using intermediates for product formation against their own growth, Prather says. The researchers have not yet demonstrated that their method works on an industrial scale, but they are working on expanding the approach to more complex pathways and hope to test it at a larger scale in the future.

"We think it certainly has broader applicability," Prather says. "The process is very robust because it doesn't require someone to be present at a particular point in time to add something or make any sort of adjustment to the process, but rather allows the cells to be keeping track internally of when it's time to make a shift."

Credit: 
Massachusetts Institute of Technology

NIH study reports more than half of US office-based physicians recommend CHA

image: JACM, The Journal of Alternative and Complementary Medicine is dedicated to research on paradigm, practice, and policy advancing integrative health.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, December 2, 2019--A new study has shown that more than half (53.1%) of office-based physicians in the U.S., across specialty areas, recommended at least one complementary health approach (CHA) to their patients during the previous 12 months, with female physicians (63.2%) more likely to recommend a CHA than male physicians (49.3%). The study, which found physicians' sex, race, specialty, and U.S. region to be significant predictors of CHA recommendation, is published in JACM, The Journal of Alternative and Complementary Medicine, a peer-reviewed publication from Mary Ann Liebert, Inc., publishers, dedicated to paradigm, practice, and policy advancing integrative health. The article is freely available on the JACM website through January 2, 2020.

The article entitled "U.S. Physician Recommendations to Their Patients About the Use of Complementary Health Approaches" was coauthored by Barbara Stussman and Richard Nahin, PhD, MPH, National Center for Complementary and Integrative Health, National Institutes of Health, Bethesda, MD, and Patricia Barnes and Brian Ward, PhD, National Center for Health Statistics, Hyattsville, MD. The data are based on the 2012 Physician Induction Interview of the National Ambulatory Medical Care Survey (NAMCS PII).

The researchers analyzed recommendations by physicians to their patients for any CHA and for individual approaches, including massage therapy, herbs/nonvitamin supplements, chiropractic/osteopathic manipulation, yoga, acupuncture, and mind-body therapies. Overall, massage therapy was the most commonly recommended CHA, followed by chiropractic/osteopathic manipulation, herbs/nonvitamin supplements, yoga, and acupuncture. The analysis also looked at physician specialty area, including general/family practice physicians, psychiatrists, OB/GYNs, and pediatricians, and their likelihood of recommending any or a specific CHA. The authors anticipate that their findings will "enable consumers, physicians, and medical schools to better understand potential differences in use of CHAs with patients."

JACM Editor-in-Chief John Weeks, johnweeks-integrator.com, Seattle, WA, states: "It is remarkable that these 2012 data pre-date the systematic inclusion of complementary and integrative approaches in pain and opioid-related guidelines and reports from the Joint Commission, National Academy of Medicine, American College of Physicians, Food and Drug Administration, and others in the 7 years since. The data likely significantly understate present level of recommendations of complementary health practices by physicians."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Most complete commercial sugarcane genome sequence has been assembled

An international group of researchers led by Brazilian scientists has assembled the most complete genome sequence of commercial sugarcane. They mapped 373,869 genes, or 99.1% of the total genome.

This feat is the result of almost 20 years of research supported by São Paulo Research Foundation - FAPESP and will serve as a basis for the genetic improvement of the world's largest crop in tonnage according to the UN Food & Agriculture Organization (FAO).

An article describing the study is published in GigaScience. Its lead authors are Glaucia Mendes Souza, a full professor at the University of São Paulo's Chemistry Institute (IQ-USP) and a member of the steering committee for the FAPESP Bioenergy Research Program (BIOEN-FAPESP), and Marie-Anne Van Sluys, a full professor at the same university's Bioscience Institute (IB-USP) and a member of FAPESP's Life Sciences Adjunct Panel.

"It's the first time all the genes of the sugarcane plant, or the vast majority, have been seen. In previous projects by various research groups, the sequences had to be collapsed for lack of a proper assembly tool, so they were only an approximation," said Souza, who is the principal investigator for the Thematic Project "Signaling and regulatory networks associated with 'energy sugarcane'".

"This knowledge opens up many possibilities, from applications in biotechnology to genetic improvement and gene editing [substitution or elimination of genes with specific functions]," said Van Sluys, principal investigator for the Thematic Project "Contribution of genes, genomes and transposable elements to plant-microbe interaction: a sugarcane study case".

Challenges

As the researchers explained, today's commercial sugarcane hybrids have been bred over thousands of years by crossing different varieties of two species (Saccharum officinarum and S. spontaneum) and have a highly complex genome comprising 10 billion base pairs in 100-130 chromosomes. Sequencing the genome is no easy task, requiring substantial computing power to assemble the DNA fragments while keeping homologous chromosomes separate.

For comparison, the wheat genome contains 17 billion base pairs but only 46 chromosomes, while the human genome has a mere 3.2 billion base pairs, also organized into 46 chromosomes.

Although the technology available at the start of the project was capable of producing long sequences, these long sequences had to be built from smaller fragments. Assembling the genome with these sequences required significant computing power, which was supplied by Microsoft.

The idea for the whole-genome sequencing of sugarcane dates to the onset of the BIOEN Program in 2008. A presentation by Souza at a conference held by Microsoft and FAPESP in 2014 left David Heckerman, a researcher at Microsoft Research in Los Angeles, fascinated by the computational challenges posed by the initiative. He proposed a collaboration with FAPESP, which took the form of the project "Development of an algorithm for the assembly of the sugarcane polyploid genome", with Souza as principal investigator, funded under FAPESP's Research Partnership for Technological Innovation (PITE) program. The project also involved other partners, such as Bob Davidson, then a researcher at Microsoft's Seattle unit and now with Amazon.

The published sequence has made it possible for the first time to identify diversity in genome segments called gene promoters, the DNA regions that control gene expression.

"Although in some cases the genes are 99.9% identical, we can detect differences in their promoters, and these help us determine which ancestor the copies derive from, S. officinarum or S. spontaneum," Souza said.
The achievement permits studies, for example, of how different copies contribute to increased sugar and fiber yields and which copies may be advantageous to the different genotypes selected by programs to breed sugarcane varieties for sugar and for energy.

"The result confirms Brazil's and São Paulo state's leadership in research on sugarcane which is such an important plant for our country. It also reflects foresight on the part of the São Paulo research community and of FAPESP, regarding the challenge of learning about the sugarcane genome to extract knowledge leading to increased efficiency and productivity. We should always recall that research on sugarcane is one of the factors that enabled Brazil to achieve something no other country of a similar size has achieved to date, namely, producing 40% of its total energy from renewables and with low carbon emissions," said Carlos Henrique de Brito Cruz, FAPESP's Scientific Director.

Background

The variety chosen for sequencing was SP80-3280 because more data are available about this variety in scientific literature than about any other variety. During Project Sugarcane Genome (known as FAPESP SucEST, 1999-2002), 238,000 functional gene fragments from this variety were partially sequenced (read more at: https://revistapesquisa.fapesp.br/en/2012/08/22/mapping-sugarcanes/).

Today, SP80-3280 ranks among the top 20 sugarcane varieties grown in São Paulo state. It is also part of the genealogy of several commercial varieties, since it is used in new crossings. Its agricultural yield is high, and it is easily regrown by the sett method (setts are stem cuttings taken from old plants containing one or more buds), thus making it an option for late harvesting at the end of the crop year in São Paulo state.

"The knowledge obtained for this variety can be applied in studies of other genotypes, particularly for the discovery of genes that control biomass accumulation," explained Augusto Lima Diniz, a coauthor of the study and currently on a research internship abroad at Cold Spring Harbor Laboratory (CSHL) in the United States as part of his postdoctoral research for IQ-USP.

Souza and Van Sluys recently participated in an international team that sequenced the genome of S. spontaneum, the ancestor species corresponding to 10-15% of the commercial sugarcane genome. S. officinarum contributes 80-85%, and 5% is recombinant chromosomes of these two progenitor species. The study is published in Nature Genetics.

In 2018, Van Sluys was one of the authors of an article on the results of a study that mapped about half of the sugarcane monoploid genome (only one chromosome in each pair).

Based on the information obtained from this latest whole-genome sequencing effort, researchers at the University of São Paulo (USP) are developing tools for the genetic improvement of sugarcane and testing several candidate genes in Genetically Modified (GM) plants. They are also conducting comparative genomics studies on large gene families with the aim of understanding their contributions to the sugarcane varieties used in Brazilian genetic improvement programs. They hope to find genes that can increase yields, enhance drought resistance and contribute to the development of novel compounds from sugarcane.

"We're also offering the community a Genome Browser that can be used to search for specific genes and analyze sequences in comparison with previous sequencing exercises. This will be valuable to biotech projects not just relating to sugarcane but also to other crops and plants," Souza said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

When reefs die, parrotfish thrive

image: Parrotfish numbers rise as reef quality decreases.

Image: 
Kendall Clements

In contrast to most other species, reef-dwelling parrotfish populations boom in the wake of severe coral bleaching.

The surprise finding came when researchers led by Perth-based Dr Brett Taylor of the Australian Institute of Marine Science (AIMS) looked at fish populations in severely bleached areas of two reefs - the Great Barrier Reef in the western Pacific and the Chagos Archipelago in the Indian Ocean.

The sites are 8000 kilometres apart.

Bleaching is coral's stress reaction to prolonged exposure to higher sea surface temperatures.

"Warming oceans place enormous pressure on reefs and if the temperatures remain high for too long the coral will die. The more frequently this occurs, the less time there is for coral reefs to recover," Dr Taylor said.

In the damaged areas of the reefs, the study found that parrotfish populations increased in number by between two and eight times, and individual fish were about 20% larger than those in unbleached sections.

Almost every other species of fish was in sharp decline in the bleached areas.

Parrotfish, so named because their tightly packed teeth form a beak, use these teeth to scrape microorganisms off coral - and their presence in large numbers on damaged reefs very likely helps the process of repair, Taylor and his colleagues suggest.

"When bleaching reduces coral cover on the reefs, it creates large areas of newly barren surfaces," Taylor said.

"This immediately gets colonised by the microalgae and cyanobacteria, basically an internal and external layer of 'scunge', which provides nutritious, abundant food for parrotfish."

The researchers concluded that the coral and the parrotfish constitute a feedback loop, slowly bringing each other into balance. When reefs are damaged, parrotfish numbers swell. This results in low levels of scunge, giving the coral the best chance to recover. As the reef then returns to health, parrotfish numbers decline again.
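The loop described above can be caricatured with a two-variable toy model (all rates below are invented for illustration; this is not the study's model): bare surface hosts the scunge that feeds parrotfish, grazing pressure helps coral recover, and recovering coral starves the parrotfish population back down.

```python
# Caricature of the coral-parrotfish feedback loop (all rates invented
# for illustration; this is not the study's model).
# Bare surface (1 - coral) hosts the algal "scunge" that feeds parrotfish;
# grazing pressure helps coral recover, which starves parrotfish back down.

def step(coral, fish, dt=0.05, r_coral=0.3, r_fish=0.5, d_fish=0.4):
    scunge = 1.0 - coral                     # barren area hosts the food layer
    coral_next = coral + dt * r_coral * coral * (1.0 - coral) * fish
    fish_next = fish + dt * fish * (r_fish * scunge - d_fish)
    return min(max(coral_next, 0.0), 1.0), max(fish_next, 0.0)

coral, fish = 0.1, 1.0                       # state just after a bleaching event
for _ in range(2000):
    coral, fish = step(coral, fish)
print(f"coral cover={coral:.2f}  parrotfish index={fish:.2f}")
```

Running the model from a post-bleaching state shows the qualitative pattern the researchers describe: parrotfish peak while coral cover is low, then decline as the reef recovers.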

"We found reef ecosystems in two different oceans had the same response to global heat events which is indicative of the current magnitude of climate change effects," he said.

The fact that plump parrotfish were found in large numbers on both reefs indicates the feedback loop is an inherent part of reef ecology and not caused by local factors.

"Parrotfish are a vital link in the reef ecosystem," says AIMS co-author Dr Mark Meekan.

"As herbivores, their grazing shapes the structure of reefs through effects on coral growth and suppression of algae that would otherwise proliferate. Because of these important ecological roles, they have been described as 'ecosystem engineers' of reef systems."

As well as AIMS, scientists working on the project came from James Cook University in Australia, the University of Auckland in New Zealand, and the University of Lancaster in the UK.

The research is published in the journal Global Change Biology.

Credit: 
Australian Institute of Marine Science

MBL team images the bacterial hitchhikers on plastic trash in ocean

video: A summary description of the findings by Schlundt et al (2019) Spatial Structure in the "Plastisphere": Molecular Resources for Imaging Microbial Communities on Marine Plastic Debris. Mol. Ecol. Resources, DOI 10.1111/1755-0998.13119.

Image: 
Emily Greenhalgh, MBL

WOODS HOLE, Mass. - Millions of tons of plastic trash are fouling the world's ocean, most of it tiny pieces of microplastic less than a quarter-inch in size. Even the smallest marine animals can ingest these microplastics, potentially threatening their survival.

Marine microplastics aren't floating solo, either - they quickly pick up a thin coating of bacteria and other microbes, a biofilm known as "The Plastisphere." These biofilms can influence the microplastics' fate - causing them to sink or float, or breaking them down into even tinier bits, for example. They can even make the plastic smell or taste like food to some marine organisms. But very little is known about what kinds of microbes are in the Plastisphere, and how they interact with one another and the plastic.

Now, using an innovative microscopy method developed at the Marine Biological Laboratory (MBL), Woods Hole, scientists have revealed the structure of the microbial communities coating microplastic samples from a variety of ocean sites. The team, led by Linda Amaral-Zettler (who coined the term "Plastisphere"), Jessica Mark Welch, and Cathleen Schlundt, reports its results this week in Molecular Ecology Resources.

The MBL team built upon a fluorescence imaging technique developed by Mark Welch and colleagues to literally see the spatial organization of microbes on the plastic samples. They did so by designing fluorescent probes that light up and target the major known bacterial groups in the Plastisphere.
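The "combinatorial" idea behind this kind of probe design is that tagging each bacterial group with a distinct subset of fluorophores yields far more distinguishable labels than there are dyes. A generic illustration (the dye names are placeholders, not the study's actual probe set):

```python
from itertools import combinations

# With n fluorophores, tagging each taxon with a distinct nonempty subset
# of dyes gives 2**n - 1 distinguishable combinatorial labels.
def label_count(n_fluorophores):
    return 2 ** n_fluorophores - 1

def example_labels(dyes):
    labels = []
    for k in range(1, len(dyes) + 1):
        labels.extend(combinations(dyes, k))
    return labels

dyes = ["FITC", "Cy3", "Cy5"]     # placeholder dye names, not the study's set
print(label_count(len(dyes)))     # 7 distinguishable labels from 3 dyes
print(example_labels(dyes))
```

The payoff grows quickly: three dyes distinguish 7 groups, but eight dyes would distinguish 255, which is what makes the approach practical for complex communities.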

"We now have a toolkit that enables us to understand the spatial structure of the Plastisphere and, combined with other methods, a better future way to understand the Plastisphere's major microbial players, what they are doing, and their impact on the fate of plastic litter in the ocean," said Amaral-Zettler, an MBL Fellow from the NIOZ Royal Netherlands Institute for Sea Research and the University of Amsterdam.

The scientists saw diatoms and bacteria colonizing the microplastics, dominated in all cases by three phyla: Proteobacteria, Cyanobacteria, and Bacteroidetes. Spatially, the Plastisphere microbial communities were heterogeneously mixed, providing the first glimpse of bacterial interactions on marine microplastics.

Mark Welch and colleagues have previously applied their imaging technology to study microbial communities in the human mouth and in the digestive tract of cuttlefish and vertebrates.

This study customized and extended the technology, called CLASI-FISH (Combinatorial Labeling And Spectral Imaging Fluorescence In Situ Hybridization). Amaral-Zettler finds the technology so powerful that she plans to establish a CLASI-FISH microscopy platform in the Netherlands.

Credit: 
Marine Biological Laboratory

New study reveals how ancient Puerto Ricans cooked

image: Photographs of all shells analyzed in this study. Samples A-F are arranged from left to right for each level.

Image: 
Philip Staudigel, Ph.D.

MIAMI--A new study by scientists at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science, University of Miami (UM) College of Arts and Sciences, and Valencia College analyzed the fossilized remains of clams to reconstruct the cooking techniques of the early inhabitants of Puerto Rico. The results showed that Puerto Ricans over 2,500 years ago were partial to roasting rather than boiling their food as a soup.

The research team used a new chemical technique to examine fossilized clam shells collected from an archaeological site in Cabo Rojo, Puerto Rico, to identify the temperatures at which the clams were cooked. They found that cooking temperatures were around 200°C.

"The only way we have of knowing how our ancestors cooked is to study what they left behind," said the study's lead author Philip Staudigel, who conducted the analysis as a graduate student at the UM Rosenstiel School and is now a postdoctoral researcher at Cardiff University. "Here, we demonstrated that a relatively new technique can be used to learn what temperature they cooked at, which is one key detail of the cooking process."

The study, published in the journal Science Advances, sheds new light on the cultural practices of the first communities to arrive on the island of Puerto Rico, and also provides circumstantial evidence that ceramic pottery technology was not widespread during this period, since pottery would likely have been the only way to boil the clams.

The fossilized shells, dating back to around 700 BC, were cleaned and turned into a powder that was analyzed to determine its mineralogy, as well as the abundance of specific chemical bonds in the sample that were dependent upon heating of the shell.

When certain minerals are heated, the bonds between atoms in the mineral can rearrange themselves, and this rearrangement can be measured in the lab. The amount of rearrangement is proportional to the temperature to which the mineral was heated.

This technique, known as clumped isotope geochemistry, is often used to determine the temperature at which an organism formed. For this study, it was used to reconstruct the temperature at which the clams were cooked.

The abundance of bonds in the powdered fossils was then compared to clams cooked at known temperatures, as well as uncooked modern clams collected from a nearby beach.

Results showed that the majority of clams were heated to temperatures greater than 100°C - the boiling point of water - but not greater than 200°C. The results also revealed a disparity between the cooking temperature of different clams, which the researchers believe could be associated with a roasting technique in which the clams are heated from below, meaning the ones at the bottom were heated more than the ones at the top.
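In outline, the comparison works like any empirical calibration: measure the bond-rearrangement signal in reference clams cooked at known temperatures, fit a curve, and invert it for the archaeological shells. A minimal sketch with invented placeholder numbers (not the study's calibration data):

```python
# Sketch of an empirical temperature calibration (all numbers are invented
# placeholders, not the study's data). xs: known cooking temperatures of
# reference clams (°C); ys: measured bond-rearrangement signal. Fit a line
# by least squares, then invert it for an archaeological sample.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

ref_temp = [25.0, 100.0, 150.0, 200.0]       # reference cooking temps (°C)
ref_signal = [0.10, 0.40, 0.60, 0.80]        # invented, roughly linear signal

a, b = linear_fit(ref_temp, ref_signal)
unknown_signal = 0.55                        # hypothetical archaeological shell
estimated_temp = (unknown_signal - b) / a
print(f"estimated cooking temperature: {estimated_temp:.0f} °C")
```

The uncooked modern clams serve as the low-temperature anchor of such a curve, which is why the study collected them alongside the experimentally cooked reference samples.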

The pre-Arawak population of Puerto Rico were the first inhabitants of the island, arriving sometime before 3000 BC from Central or South America. They subsisted primarily on fishing and hunting near the mangrove swamps and coastal areas where they had settled.

"Much of peoples' identity draws upon where they came from; one of the most profound expressions of this is in cooking," said Staudigel. "The clams from the archaeological site appeared to be most similar to clams that had been roasted."

Credit: 
University of Miami Rosenstiel School of Marine, Atmospheric, and Earth Science

UBCO study demonstrates dogs promote page turning

image: Golden retriever Abby listens while Annie Letheman (right) reads to her sister Ruby and researcher Camille Rousseau (middle) observes.

Image: 
UBCO

Reading in the presence of a pooch may be the page-turning motivation young children need, suggests a UBC researcher.

Camille Rousseau, a doctoral student in UBC Okanagan's School of Education, recently completed a study examining the behaviour of 17 children from Grades 1 to 3 while reading with and without a dog. The study was conducted with Christine Tardif-Williams, a professor at Brock University's department of child and youth studies.

"Our study focused on whether a child would be motivated to continue reading longer and persevere through moderately challenging passages when they are accompanied by a dog," explains Rousseau.

Participants were recruited based on their ability to read independently. Prior to the study, each child was tested to determine their reading range and to ensure they would be assigned appropriate story excerpts. The researchers then chose stories slightly beyond each child's reading level.

During the study's sessions, participants read aloud to an observer and the dog handler, either with or without the handler's dog present. After finishing their first page, they were offered the choice of a second reading task or ending the session.

"The findings showed that children spent significantly more time reading and showed more persistence when a dog--regardless of breed or age--was in the room as opposed to when they read without one," says Rousseau. "In addition, the children reported feeling more interested and more competent."

With the recent rise in popularity of therapy dog reading programs in schools, libraries and community organizations, Rousseau says their research could help to develop 'gold-standard' canine-assisted intervention strategies for struggling young readers.

"There have been studies that looked at the impact of therapy dogs on enhancing students' reading abilities, but this was the first study that carefully selected and assigned challenging reading to children," she says.

Some studies and programs have children choose their own book, and while the reading experience would still be positive, Rousseau adds it's the educational experience of persevering through a moderate challenge that offers a potentially greater sense of achievement.

She hopes the study increases organizations' understanding of how children's reading could be enhanced by furry friends.

Credit: 
University of British Columbia Okanagan campus

Chronic opioid treatment may increase PTSD risk

Long-term (chronic) treatment with opioids, such as morphine, prior to trauma enhances fear learning in mice, according to a study published in Neuropsychopharmacology. The findings, which link chronic opioid treatment before a traumatic event with responses to subsequent stressful events, may suggest a possible mechanism underlying the frequent co-occurrence of post-traumatic stress disorder (PTSD) and opioid dependence.

PTSD and substance use disorders (SUDs) often occur together, with nearly 40% of individuals with PTSD also having a SUD. This is known as comorbidity and understanding it may help explain the mechanisms by which these conditions develop. Previous research has shown that PTSD increases the risk of opioid dependence, but whether opioid dependence may also increase PTSD risk remained unclear.

Using an established model of fear learning in mice, researchers at the University of California, Los Angeles, USA, assessed the potential impact of chronic opioid treatment on subsequent development of PTSD-like behaviors. They found that mice that had been treated with opioids and later experienced stress showed more pronounced post-stress reactions.

At the beginning of the study, mice were treated with morphine or saline for eight days, followed by a week of drug cessation. Both groups of mice - morphine-treated mice and saline-treated controls (22 and 24 mice, respectively) - were then subdivided into trauma and non-trauma groups. They were transferred to a chamber where animals in the trauma group received a series of mild foot shocks. A day later, both groups of animals were returned to the chamber to assess their memory of the traumatic event.

Michael Fanselow, the corresponding author said: "We have called this the trauma because the acute stressor, the foot shocks, is able to produce lasting fear and anxiety-like behaviors, such as freezing."

On the subsequent day, mice from both the trauma and non-trauma groups were transferred to a new environment and exposed to a mild stressor (a milder foot shock), before being returned to that environment for eight minutes on the fourth day of the experiment.

The authors found no behavioral differences between morphine-treated and control mice following the initial trauma; the two groups froze for similar lengths of time when returned to the environment associated with the trauma. However, morphine-treated mice showed more pronounced freezing when returned to the second environment after having been exposed to the mild stressor. Morphine-treated mice that had not experienced the trauma did not show signs of heightened fear.

Michael Fanselow said: "Our data are the first to show a possible effect of opioids on future fear learning, which may suggest that a person with a history of opioid use may become more susceptible to the negative effects of stress. Also, this ability of opioids to increase PTSD-like symptoms far outlasted the direct effects of the drug or withdrawal from the drug, suggesting that the effect may continue even after opioid treatment has stopped."

The authors also tested treating mice with opioids after the initial trauma had occurred but before exposing them to the second, mild stressor. They found that mice treated with morphine after the initial trauma did not show enhanced fear learning following exposure to the mild stressor. The findings suggest that chronic exposure to opioids before - but not after - a traumatic event occurs, impacts fear learning during subsequent stressful events.

Michael Fanselow said: "While we are generally aware that drug use, such as that in the current opioid crisis, has many deleterious effects, our results suggest yet another such effect: increased susceptibility to developing anxiety disorders. As opioids are often prescribed to treat symptoms such as pain that may accompany trauma, caution may be needed, because this may lead to a greater risk of developing PTSD if the patient is exposed to further traumatic events, such as an accident, later on."

Credit: 
Springer

Study identifies brain networks that play crucial role in suicide risk

An international team of researchers has identified key networks within the brain which they say interact to increase the risk that an individual will think about - or attempt - suicide. Writing today in Molecular Psychiatry, the researchers say that their review of existing literature highlights how little research has been done into one of the world's major killers, particularly among the most vulnerable groups.

The facts in relation to suicide are stark: 800,000 people die globally by suicide every year, the equivalent of one every 40 seconds. Suicide is the second leading cause of death globally among 15-29 year olds. More adolescents die by suicide than from cancer, heart disease, AIDS, birth defects, stroke, pneumonia, influenza, and chronic lung disease combined. As many as one in three adolescents think about ending their lives and one in three of these will attempt suicide.

"Imagine having a disease that we knew killed almost a million people a year, a quarter of them before the age of thirty, and yet we knew nothing about why some individuals are more vulnerable to this disease," said Dr Anne-Laura van Harmelen, co-first author from the University of Cambridge. "This is where we are with suicide. We know very little about what's happening in the brain, why there are sex differences, and what makes young people especially vulnerable to suicide."

A team of researchers, including Hilary Blumberg, MD, John and Hope Furth Professor of Psychiatric Neuroscience at Yale, carried out a review of two decades' worth of scientific literature relating to brain imaging studies of suicidal thoughts and behaviour. In total, they looked at 131 studies, which covered more than 12,000 individuals, looking at alterations in brain structure and function that might increase an individual's suicide risk.

Combining the results from all of the brain imaging studies available, the researchers looked for evidence of structural, functional, and molecular alterations in the brain that could increase risk of suicide. They identified two brain networks - and the connections between them - that appear to play an important role.

The first of these networks involves areas towards the front of the brain known as the medial and lateral ventral prefrontal cortex and their connections to other brain regions involved in emotion. Alterations in this network may lead to excessive negative thoughts and difficulties regulating emotions, stimulating thoughts of suicide.

The second network involves regions known as the dorsal prefrontal cortex and inferior frontal gyrus system. Alterations in this network may influence suicide attempts, in part, due to its role in decision making, generating alternative solutions to problems, and controlling behaviour.

The researchers suggest that if both networks are altered in terms of their structure, function or biochemistry, an individual may think negatively about the future and be unable to control their thoughts, placing them at higher risk of suicide.

"The review provides evidence to support a very hopeful future in which we will find new and improved ways to reduce risk of suicide," said Professor Hilary Blumberg. "The brain circuitry differences found to converge across the many studies provide important targets for the generation of more effective suicide prevention strategies. It is especially hopeful that scientists, such as my co-authors on this paper, are coming together in larger collaborative efforts that hold terrific promise."

The majority of studies so far have been cross-sectional, meaning that they take a 'snapshot' of the brain, rather than looking over a period of time, and so can only relate to suicidal thoughts or behaviours in the past. The researchers say there is an urgent need for more research that looks at whether their proposed model relates to future suicide attempts and at whether any therapies are able to change the structure or function of these brain networks and thereby perhaps reduce suicide risk.

The review highlighted the paucity of research into suicide, particularly into sex differences and among vulnerable groups. Despite suicidal thoughts often first occurring as early as during adolescence, the majority of studies focused on adults.

"The biggest predictor of death by suicide is previous suicide attempt, so it's essential that we can intervene as early as possible to reduce an individual's risk," said co-first author Dr Lianne Schmaal from the University of Melbourne. "For many individuals, this will be during adolescence. If we can work out a way to identify those young people at greatest risk, then we will have a chance to step in and help them at this important stage in their lives."

Even more striking, despite the fact that transgender individuals are at increased risk for suicide, just one individual across the 131 samples included in the review was identified as transgender.

"There are very vulnerable groups who are clearly not being served by research for a number of reasons, including the need to prioritise treatment, and reduce stigma," said van Harmelen. "We urgently need to study these groups and find ways to help and support them."

In 2018, the researchers launched the HOPES (Help Overcome and Prevent the Emergence of Suicide) study, supported by the mental health research charity MQ. HOPES brings together data from around 4,000 young people across 15 different countries in order to develop a model to predict who is at risk of suicide. Over the course of the project, the team will analyse brain scans, information on young people's environment, psychological states and traits in relation to suicidal behaviour from young people from across the world, to identify specific, universal risk-factors.

Credit: 
University of Cambridge

Solving the thermoelectric 'trade-off' conundrum with metallic carbon nanotubes

image: This is an illustration of aligned metallic carbon nanotubes in the team's thermoelectric device. A temperature gradient causes an electrical current to flow.

Image: 
Tokyo Metropolitan University

Tokyo, Japan - Scientists from Tokyo Metropolitan University have used aligned "metallic" carbon nanotubes to create a device which converts heat to electrical energy (a thermoelectric device) with a higher power output than pure semiconducting carbon nanotubes (CNTs) in random networks. The new device bypasses the troublesome trade-off in semiconductors between conductivity and electrical voltage, significantly outperforming its counterpart. High-power thermoelectric devices may pave the way for more efficient use of waste heat in applications such as wearable electronics.

Thermoelectric devices can directly convert heat to electricity. When we think about the amount of heat wasted in our environment - in air-conditioning exhaust, vehicle engines, or even body heat - it would be revolutionary if we could somehow scavenge this energy back from our surroundings and put it to good use. This thinking motivates wearable electronics and photonics: devices which could be worn on the skin and powered by body heat. Limited applications are already available in the form of body-heat-powered lights and smartwatches.

The power extracted from a thermoelectric device when a temperature gradient is formed is affected by the conductivity of the device and the Seebeck coefficient, a number indicating how much electrical voltage is generated with a certain difference in temperature. The problem is that there is a trade-off between the Seebeck coefficient and conductivity: the Seebeck coefficient drops when the device is made more conductive. To generate more power, we ideally want to improve both.
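This trade-off is conventionally summarized by the thermoelectric power factor - a standard figure of merit in the field, not specific to this study - which makes clear why raising both quantities at once is so valuable:

```latex
% Thermoelectric power factor:
%   S      = Seebeck coefficient (voltage per unit temperature difference)
%   \sigma = electrical conductivity
PF = S^{2}\,\sigma
% The extractable electrical power scales with PF. In most materials,
% increasing \sigma lowers S (and vice versa), so PF is hard to improve;
% a material in which both rise together breaks this trade-off.
```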

Semiconducting materials are generally considered superior candidates for high-performance thermoelectric devices. However, a team led by Prof. Kazuhiro Yanagi of Tokyo Metropolitan University met an unlikely hero in the form of "metallic" CNTs. Unlike purely semiconducting CNTs, they found that they could simultaneously enhance both the conductivity and Seebeck coefficient of metallic CNTs, breaking the trade-off between these two key quantities. The team went on to show that these unique characteristics arose from the one-dimensional metallic electronic structure of the material. Furthermore, they were able to align the orientation of the metallic CNTs, achieving an output which was nearly five times that of films of randomly oriented pure semiconducting CNTs.

Not only could high-performance thermoelectric elements let us use body heat to power our smartphones, but their potential biomedical applications could also ensure that they play an important role in everyday life in the future.

Credit: 
Tokyo Metropolitan University

Electro-optical device provides solution to faster computing memories and processors

In collaboration with researchers at the universities of Münster and Exeter, scientists have created a first-of-a-kind electro-optical device which bridges the fields of optical and electronic computing. This provides an elegant solution to achieving faster and more energy efficient memories and processors.

Computing at the speed of light has been an enticing but elusive prospect, but with this development it is now within tangible reach. Using light to encode as well as transfer information enables these processes to occur at the ultimate speed limit - that of light. While light-based versions of certain computing processes have recently been demonstrated experimentally, a compact device to interface with the electronic architecture of traditional computers has been lacking. The incompatibility of electrical and light-based computing fundamentally stems from the different interaction volumes in which electrons and photons operate. Electrical chips need to be small to operate efficiently, whereas optical chips need to be large, as the wavelength of light is much larger than that of electrons.

To overcome this challenging problem, the scientists came up with a solution to confine light into nanoscopic dimensions, as detailed in their paper "Plasmonic nanogap enhanced phase change devices with dual electrical-optical functionality," published in Science Advances on 29 November 2019. They created a design which allowed them to compress light into a nano-sized volume using what is known as a surface plasmon polariton. The dramatic size reduction, in conjunction with the significantly increased energy density, is what allowed them to bridge the apparent incompatibility of photons and electrons for data storage and computation. More specifically, they showed that by sending either electrical or optical signals, a photo- and electro-sensitive material could be switched between two different states of molecular order. Furthermore, the state of this phase-transforming material could be read out by either light or electronics, making the device the first electro-optical nanoscale memory cell with non-volatile characteristics.

"This is a very promising path forward in computation and especially in fields where high processing efficiency is needed," states Nikolaos Farmakidis, graduate student and co-first author.

Co-author Nathan Youngblood continues: "This naturally includes artificial intelligence applications, where on many occasions the need for high-performance, low-power computing far exceeds our current capabilities. It is believed that interfacing light-based photonic computing with its electrical counterpart is the key to the next chapter in CMOS technologies."

Credit: 
University of Oxford

Elizabeth I identified as author of Tacitus translation

A new article in the Review of English Studies argues that a manuscript translation of Tacitus's Annales, completed in the late sixteenth century and preserved at Lambeth Palace Library, was done by Queen Elizabeth I.

The article analyzes the translation's paper stock, style, and, crucially, the handwriting preserved in the manuscript to positively identify Elizabeth I as the translation's author. The researchers also trace the manuscript's transmission from the Elizabethan court to Lambeth Palace Library, via the collection of Archbishop Thomas Tenison in the seventeenth century. Thanks to his interest in the Elizabethan court and in Francis Bacon, Tenison made the library at Lambeth one of the largest collections of State Papers from the Elizabethan era.

Researchers found persuasive similarities between unique handwriting styles in the Lambeth manuscript and numerous examples of the Queen's distinctive handwriting in her other translations, including the extreme horizontal 'm', the top stroke of her 'e', and the break of the stem in 'd'.

The researchers also identified the paper used for the Tacitus translation, which suggests a court context. The translation was copied onto paper featuring watermarks with a rampant lion, the initials 'G.B.', and a crossbow countermark, which was especially popular with the Elizabethan secretariat in the 1590s. Notably, Elizabeth I used paper with the same watermarks both in her own translation of Boethius and in personal correspondence.

The tone and style of the translation also match earlier known works of Elizabeth I. The Lambeth manuscript retains the density and brevity of Tacitus's prose, and strictly follows the contours of the Latin syntax, even at the risk of obscuring the sense in English. This style is matched by other translations by Elizabeth, which the authors compare with the Tacitus translation accordingly.

"The queen's handwriting was, to put it mildly, idiosyncratic, and the same distinctive features which characterize her late hand are also to be found in the Lambeth manuscript. As the demands of governance increased, her script sped up, and as a result some letters such as 'm' and 'n' became almost horizontal strokes, while others, including her 'e' and 'd', broke apart. These distinctive features serve as essential diagnostics in identifying the queen's work."

This is the first substantial work by Elizabeth I to emerge in over a century and it has important implications for how we understand the politics and culture of the Elizabethan court.

Credit: 
Oxford University Press USA

Fine-tuning gene expression during stress recovery

image: Nuclear stress bodies (white) formed in response to heat stress.

Image: 
Hokkaido University

Scientists have discovered a novel role for non-coding RNA in fine-tuning gene expression during stress recovery, bringing them closer to solving a 30-year-old nuclear mystery.

Hokkaido University researchers are beginning to uncover the functions of mysterious organelles in the nucleus and their relation to stress, 30 years after their discovery.

The organelles, called nuclear stress bodies, form when cells are exposed to heat or chemical stress. When conditions return to normal, the organelles promote retention of RNA segments, called introns, the researchers report in The EMBO Journal.

This is important because intron retention regulates gene expression for a variety of biological functions, including stress response, cell division, learning and memory, preventing the accumulation of damaged DNA, and even tumour growth.

Among their many mysteries, nuclear stress bodies were found to assemble on a type of long non-coding RNA in response to heat and chemical stress. Molecular biologist Tetsuro Hirose of Hokkaido University's Institute for Genetic Medicine specializes in non-coding RNAs, which are molecules copied from DNA, but not translated into proteins. Hirose and his colleagues investigated the functions of nuclear stress bodies by turning off the long non-coding RNA and thus removing them from human cells.

Removing the nuclear stress bodies resulted in a vast suppression of intron retention during stress recovery. Further investigation enabled the team, which included researchers from Hokkaido University, the National Institute for Advanced Industrial Science and Technology, and the University of Tokyo in Japan, to understand how nuclear stress bodies, when they are present, help cells recover from stress.

Here is what they found: Heat stress at 42°C leads to de-phosphorylation of splicing factors called SRSFs, resulting in the removal of specific introns and the production of mature RNA molecules. Simultaneously, de-phosphorylated SRSFs become incorporated in the nuclear stress bodies. As soon as cells return to the body's normal temperature of 37°C, nuclear stress bodies recruit an enzyme to re-phosphorylate SRSFs, therefore rapidly restoring intron retention to its normal levels.

"Nuclear stress bodies probably function to fine-tune gene expressions by rapidly restoring the proper levels of intron-retaining messenger RNAs as the cell recovers from stress," Tetsuro Hirose says. Further studies are needed to reveal the specific effects of intron retention after heat stress, and to understand the detailed mechanism of the process.

Credit: 
Hokkaido University

Sounds of the past give new hope for coral reef restoration

video: Tim Gordon explains "acoustic enrichment."

Image: 
University of Exeter

Young fish can be drawn to degraded coral reefs by loudspeakers playing the sounds of healthy reefs, according to new research published today in Nature Communications.

An international team of scientists from the UK's University of Exeter and University of Bristol, and Australia's James Cook University and Australian Institute of Marine Science, say this "acoustic enrichment" could be a valuable tool in helping to restore damaged coral reefs.

Working on Australia's recently devastated Great Barrier Reef, the scientists placed underwater loudspeakers playing healthy reef recordings in patches of dead coral and found twice as many fish arrived - and stayed - compared to equivalent patches where no sound was played.

"Fish are crucial for coral reefs to function as healthy ecosystems," said lead author Tim Gordon, of the University of Exeter.

"Boosting fish populations in this way could help to kick-start natural recovery processes, counteracting the damage we're seeing on many coral reefs around the world."

This new technique works by regenerating the sounds that are lost when reefs are quietened by degradation.

"Healthy coral reefs are remarkably noisy places - the crackle of snapping shrimp and the whoops and grunts of fish combine to form a dazzling biological soundscape. Juvenile fish home in on these sounds when they're looking for a place to settle," said senior author Professor Steve Simpson, also of the University of Exeter.

"Reefs become ghostly quiet when they are degraded, as the shrimps and fish disappear, but by using loudspeakers to restore this lost soundscape, we can attract young fish back again."

Australian Institute of Marine Science fish biologist Dr Mark Meekan added: "Of course, attracting fish to a dead reef won't bring it back to life automatically, but recovery is underpinned by fish that clean the reef and create space for corals to regrow."

The study found that broadcasting healthy reef sound doubled the total number of fish arriving onto experimental patches of reef habitat, as well as increasing the number of species present by 50%.

This diversity included species from all sections of the food web - herbivores, detritivores, planktivores and predatory piscivores.

Different groups of fish provide different functions on coral reefs, meaning an abundant and diverse fish population is an important factor in maintaining a healthy ecosystem.

Professor Andy Radford, a co-author from the University of Bristol, said: "Acoustic enrichment is a promising technique for management on a local basis.

"If combined with habitat restoration and other conservation measures, rebuilding fish communities in this manner might accelerate ecosystem recovery.

"However, we still need to tackle a host of other threats including climate change, overfishing and water pollution in order to protect these fragile ecosystems."

Gordon added: "Whilst attracting more fish won't save coral reefs on its own, new techniques like this give us more tools in the fight to save these precious and vulnerable ecosystems.

"From local management innovations to international political action, we need meaningful progress at all levels to paint a better future for reefs worldwide."

Credit: 
University of Exeter

SUTD-led research sets the groundwork for patient-specific 3D printed meniscus

image: Biomechanical evaluation of isotropic and shell-core composite meniscal implants.

Image: 
SUTD

The human knee is the largest and most complex joint in the body. It contains various hard and soft tissue elements, including the meniscus, that provide stability and proper functioning. The meniscus, a cartilage between the thighbone and the shinbone, acts as a shock absorber, dissipating body weight and reducing friction during physical activities. Because the knee is among the most heavily used joints in the body, meniscal tears are the most common knee injury, arising from wear and tear as we age as well as from sports injuries and trauma.

Meniscal tears can be treated in different ways, such as physiotherapy, surgery to remove the torn tissue, or replacement with allografts or implants. While meniscal transplants serve as an alternative option, they are technically challenging due to the patient-specific sizing required, positioning, securing to the bone, and the risks of rejection and infection.

Even though a significant amount of work and research has been carried out on prostheses and implants, meniscal implants in particular seem to have been significantly overlooked. It was only in 2015 that the first meniscal implant was placed in humans during a clinical trial, and only in September 2019 that the US Food and Drug Administration approved the first artificial meniscus based on clinical trial results.

To bridge this research gap, Singapore University of Technology and Design's (SUTD) Medical Engineering and Design (MED) Laboratory collaborated with the University of Miyazaki's Biomechanics Laboratory and developed a novel methodology for the early-stage design assessment and verification of meniscal implants using computational modeling and simulation (refer to image). This initial-design stage development paves the way for the manufacturing of 3D printed menisci, which will allow for a customized, anatomically shaped meniscus for a more optimal fit when a patient receives a meniscal implant. Their research paper has been published by IEEE.

This new development will provide medical practitioners and industry experts with valuable insights into how the remnant hard and soft tissues in the knee joint would react biomechanically should an artificial meniscus be implanted. They can then use this information to evaluate aspects of in vivo performance without subjecting patients or animals to potential harm or unnecessary risk. The computational models can also be used to evaluate options that would otherwise be impossible to test experimentally or clinically.

To demonstrate this, the team developed a comprehensive and detailed computational model of an intact knee joint from the medical images and gait cycle data of a healthy adult volunteer. They validated the developed model with biomechanical experiments using the lower extremity of cadavers. They used the validated model to analyze the biomechanical response of soft-tissues in the knee joint for different meniscus conditions, such as intact meniscus, meniscal root tear, total removal of the meniscus, single material meniscal implant and a shell-core composite meniscal implant. Their results suggested that the composite meniscal implant with a shell-core structure performed better than other meniscus conditions and restored the natural biomechanical response of the joint.

This study could also explain the development of knee osteoarthritis due to increased contact stresses and altered joint kinematics caused by the loss of meniscal tissue. This novel synthetic meniscal implantation approach restores the joint mechanics close to the intact meniscus state and appears to be a promising strategy for treating patients with severe meniscal injuries. The model developed in this study sheds light on the knowledge of joint mechanics after injury or repair, and therefore can also assist in the clinical evaluation of other alternative repair techniques.

"This computational model's ability to study the effects of various meniscal implant configurations in a non-invasive manner provides clinicians and researchers with insights to make more informed decisions and enhance implant designs, positioning, anchoring to the bone as well as the choice of material properties. More importantly, it sets the groundwork for the future of patient-specific 3D printed meniscus for implantation," said lead researcher, Assistant Professor Subburaj Karupppasamy from SUTD's Engineering Product Development pillar.

Credit: 
Singapore University of Technology and Design