
Early-and-regular cannabis use by youth is associated with alteration in brain circuits that support cognitive control

image: Conflict-related neural activations. Between-group t-map of conflict-related activations.

Image: 
Elsevier

Washington, DC, June 20, 2019 - The development of neural circuits in youth, at a particularly important time in their lives, can be heavily influenced by external factors--specifically the frequent and regular use of cannabis. A new study in the Journal of the American Academy of Child and Adolescent Psychiatry (JAACAP), published by Elsevier, reports that cognitive control--an ensemble of processes by which the mind governs, regulates and guides behaviors, impulses, and decision-making based on goals--is directly affected.

The researchers found that these brain alterations were less intense in individuals who recently stopped using cannabis, which may suggest that the effects of cannabis are more robust in recent users. Additional findings from the study also suggest greater and more persistent alterations in individuals who initiated cannabis use earlier, while the brain is still developing.

"Most adults with problematic substance use now were most likely having problems with drugs and alcohol in adolescence, a developmental period during which the neural circuits underlying cognitive control processes continue to mature," said lead author Marilyn Cyr, PhD. "As such, the adolescent brain may be particularly vulnerable to the effects of substance use, particularly cannabis--the most commonly used recreational drug by teenagers worldwide," added the postdoctoral scientist in the Division of Child and Adolescent Psychiatry at the New York State Psychiatric Institute, Vagelos College of Physicians & Surgeons, Columbia University, New York.

The findings are based on functional magnetic resonance imaging (fMRI) data acquired from 28 adolescents and young adults (aged 14-23 years) with significant cannabis use and 32 age- and sex-matched non-using healthy controls. Participants were scanned while performing a Simon Spatial Incompatibility Task, a cognitive control task that requires resolving cognitive conflict to respond accurately.
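To make the task concrete: in a Simon task, the measure of interest is typically how much slower responses are when the stimulus location conflicts with the required response. Below is a minimal sketch of that computation; the trial data and field names are hypothetical, not taken from the study.

```python
# Minimal sketch of how a Simon-task conflict effect is typically quantified.
# Illustrative only: trial data and field names are hypothetical, not taken
# from the JAACAP study.
from statistics import mean

# Each trial: stimulus side, required response side, reaction time (seconds).
# "Incongruent" trials (stimulus and response on opposite sides) create the
# cognitive conflict the task is designed to elicit.
trials = [
    {"stimulus": "left",  "response": "left",  "rt": 0.42, "correct": True},
    {"stimulus": "right", "response": "left",  "rt": 0.55, "correct": True},
    {"stimulus": "left",  "response": "right", "rt": 0.57, "correct": True},
    {"stimulus": "right", "response": "right", "rt": 0.40, "correct": True},
]

def conflict_effect(trials):
    """Mean RT on incongruent minus congruent correct trials (the Simon effect)."""
    congruent   = [t["rt"] for t in trials
                   if t["correct"] and t["stimulus"] == t["response"]]
    incongruent = [t["rt"] for t in trials
                   if t["correct"] and t["stimulus"] != t["response"]]
    return mean(incongruent) - mean(congruent)

print(f"Simon conflict effect: {conflict_effect(trials) * 1000:.0f} ms")
```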

Compared to their healthy counterparts, the adolescents and young adults with significant cannabis use showed reduced activation in the frontostriatal circuits that support cognitive control and conflict resolution.

The authors also examined the degree to which fluctuations in activity related to conflict resolution are synchronized across the regions that make up this frontostriatal circuit (that is, the extent to which regions are functionally connected with each other). Although circuit connectivity did not differ between cannabis-using and non-using youth, the research team found an association between how early individuals began regularly using cannabis and the extent to which frontostriatal regions were disrupted, suggesting that earlier chronic use may have a larger impact on circuit development than later-onset use.
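In this context, "functionally connected" is commonly operationalized as the correlation between activity time courses of different regions. The sketch below illustrates the basic computation on synthetic signals; the region names and signal construction are placeholders, not the study's actual pipeline.

```python
# Sketch of functional connectivity as inter-regional correlation.
# Signals and region labels are synthetic placeholders; actual fMRI analyses
# correlate task-related BOLD time courses after extensive preprocessing.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints = 200
shared = rng.standard_normal(n_timepoints)          # common "conflict" signal

regions = {
    "dlPFC":    shared + 0.5 * rng.standard_normal(n_timepoints),
    "striatum": shared + 0.5 * rng.standard_normal(n_timepoints),
    "control":  rng.standard_normal(n_timepoints),  # unrelated region
}

ts = np.vstack(list(regions.values()))
connectivity = np.corrcoef(ts)   # pairwise Pearson correlations

for i, a in enumerate(regions):
    for j, b in enumerate(regions):
        if i < j:
            print(f"{a:>8} - {b:<8} r = {connectivity[i, j]: .2f}")
```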

"The present findings support the mission of the Adolescent Brain and Cognitive Development study, a longitudinal study aimed at understanding the developmental trajectory of brain circuits in relation to cannabis use," said Dr. Cyr. "In addition, these findings are a first step towards identifying brain-based targets for early interventions that reduce addiction behaviors by enhancing self-regulatory capacity.

"Given that substance use and relapse rates are associated with control processes, interventions based on neural stimulation, such as transcranial magnetic stimulation (TMS), and behavioral interventions, such as cognitive training, that specifically target the brain circuits underlying these control processes may be helpful as adjunct intervention strategies to complement standard treatment programs for cannabis use disorder."

Credit: 
Elsevier

Dynamic collaboration behind new research into best way of using biologging tags

image: A harbour seal tagged with a biologging tag.

Image: 
Dr. Abbo van Neer

Methods used to design F1 cars and spacecraft have played a crucial role in new research into the tags used to track animal movements.

Ecologists joined forces with aerospace colleagues at Swansea University to find the best way to reduce the drag of biologging tags - the recording devices used to track animal movements and behaviour.

Their research collaboration meant the bioscientists were able to take advantage of Computational Fluid Dynamics (CFD) - virtual fluid flow analysis carried out by a supercomputer - to run complex simulations of how a tag would affect a seal when moving through water.

Will Kay, from the Swansea Lab for Animal Movement (SLAM), worked with fellow PhD student David Naumann, from the University's renowned Zienkiewicz Centre for Computational Engineering, who is part of the team working on the Bloodhound supersonic car.

The team also included undergraduate students, academic supervisors and technicians from the College of Science and College of Engineering, as well as external partners from Natural Resources Wales.

Their paper, Minimizing the impact of biologging devices: Using Computational Fluid Dynamics for optimizing tag design and positioning, has just been published in Methods in Ecology and Evolution, a world-leading biosciences journal published by the British Ecological Society and Wiley.

Will said: "For animals like seals that move in very fast flowing currents, drag is a key issue and streamlining tags is very important. Previous research has been carried out on how tag size, shape and position affect the drag on the animal, but we wanted to look at how a combination of these factors works together to affect drag. We also aimed to provide a step-by-step guide for other bioscientists looking to apply these techniques themselves."

Biologging tags are used on animals for many reasons, such as to find out more about the transmission of disease, or to understand what habitats they use to improve conservation strategies. However, carrying a tag that causes drag can change an animal's power requirements, and might mean an animal needs to change its behaviour accordingly.

"My research began by examining tags for seals. We wanted to understand what the most important factors were for making a tag more hydrodynamic," he said.

"Looking at the hydrodynamics (the water flow around a body) of a tag on an animal identifies what enables the tag to slip through the water easily and what features act to impede progress," explains co-author Professor Rory Wilson, from SLAM.

By modelling the drag of tags in a computational environment, Will and his colleagues were able to test the drag loading of different tags and discovered that improving tag shape was the most important factor for reducing drag. In fact, tags could actually be made slightly larger so long as their shape was improved.
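The shape-versus-size trade-off can be seen in the standard drag equation, F_d = ½ρv²C_dA: a lower drag coefficient C_d (a better shape) can more than offset a modest increase in frontal area A. The values below are illustrative assumptions, not numbers from the paper, which relied on full CFD simulations.

```python
# Illustrative use of the standard drag equation F_d = 0.5 * rho * v^2 * Cd * A.
# All coefficients and areas are invented for illustration; they are not
# measurements from the Swansea study.
RHO_SEAWATER = 1025.0   # kg/m^3
SPEED = 1.5             # m/s, a plausible seal swimming speed (assumption)

def drag_force(cd, frontal_area_m2, v=SPEED, rho=RHO_SEAWATER):
    return 0.5 * rho * v**2 * cd * frontal_area_m2

boxy_tag      = drag_force(cd=0.9, frontal_area_m2=0.0020)
streamlined   = drag_force(cd=0.3, frontal_area_m2=0.0020)
bigger_smooth = drag_force(cd=0.3, frontal_area_m2=0.0025)  # 25% larger area

print(f"boxy tag:           {boxy_tag:.2f} N")
print(f"streamlined tag:    {streamlined:.2f} N")
print(f"larger streamlined: {bigger_smooth:.2f} N")  # still far below boxy
```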

Will said: "As an ecologist this research was not something I could have done on my own and that was what inspired this collaboration. Running the simulations of the different tags in different positions takes a lot of time, so we were able to capitalise on the expertise of aeronautical engineers and use the same kind of techniques used for designing F1 cars and rockets."

Their results also showed that having a hydrodynamic design can reduce the impact of a tag's position on device-induced drag. This is particularly important as it is not always possible to place a tag in the optimum position on an animal.

The researchers now hope their findings will not only act as a guide to establishing interdisciplinary collaborations with engineers but will also help other researchers to increase their understanding of tag-induced drag.

"While we don't expect our findings to be taken up as strict, formal guidelines, nor the use of CFD to be made compulsory, we hope that this work, and especially the step-by-step guide the paper contains, will aid the biologging community towards achieving best practice in tag design," added co-author Professor Luca Börger, of SLAM

Will aims to continue his research into seals and seabirds after his PhD, having accepted a position with the British Antarctic Survey working with Antarctic fur seals and leopard seals.

He added: "I am absolutely delighted that my first lead author paper has been accepted by such a world-leading ecology journal. I want to say a huge thanks to my co-authors who put in a great deal of time and effort to make this work a success."

Credit: 
Swansea University

New p53 gene discovery sheds light on how to make cancer therapies more effective

image: Steven R. Grossman, M.D., Ph.D., deputy director of VCU Massey Cancer Center, Dianne Nunnally Hoppes Endowed Chair in Cancer Research and member of Massey's Cancer Molecular Genetics and Developmental Therapeutics research programs.

Image: 
VCU Massey Cancer Center

Scientists at VCU Massey Cancer Center have discovered that the loss of a protein called DBC1 in breast cancer cells leads to the dysregulation of normal anti-cancer functions, contributing to cancer cell growth and resistance to therapies. By restoring the expression of this protein, doctors may be able to help prevent the development of cancer and increase the effectiveness of common cancer treatments.

Recently published in the journal Cell Reports, the study used mass spectrometry to identify proteins involved in the processes that regulate the gene p53, which normally acts to suppress the development of cancer and has been found to be dysregulated in a majority of cancer types.

"We screened for proteins that interact with a protein within the nucleus of cells called CREB binding protein (CBP) that is known to regulate the gene p53. We found one of the proteins discovered in this screen called DBC1 is critical to maintaining the levels and activity of p53, and the gene encoding for this protein is frequently deleted in breast cancer cells," says the study's lead author Steven R. Grossman, M.D., Ph.D., deputy director of VCU Massey Cancer Center, Dianne Nunnally Hoppes Endowed Chair in Cancer Research and member of Massey's Cancer Molecular Genetics and Developmental Therapeutics research programs.

Previous research has shown that CBP works with another protein known as MDM2 to maintain p53 levels in cells. This latest work from Dr. Grossman's laboratory shows that DBC1 regulates CBP activity, and therefore plays an essential role in maintaining p53 activity and abundance in normal cells.

"Restoring the function of DBC1 could potentially make tumors more susceptible to current cancer treatments and help prevent further cancer growth," says Grossman, who is also the chair of the Division of Hematology, Oncology and Palliative Care at the VCU School of Medicine.

The scientists used human breast cancer cell lines and mouse models to test their findings. They discovered that DBC1 levels decrease in response to cellular stress, which can be caused by platinum-based cancer drugs, for example. This drop in DBC1 decreased p53 levels, making the cells resistant to p53-mediated apoptosis, a form of controlled cell suicide. Many cancer drugs work by inducing apoptotic cell death.

Grossman and his team showed that maintaining DBC1 levels in cancer cells exposed to cisplatin, a platinum-based cancer drug, caused a substantially increased response to the drug.

"This shows that cancer cells have developed finely tuned responses to control DBC1 levels in order to avoid exaggerated apoptotic responses," says Grossman. "We're hopeful we can intervene in these processes and develop new strategies that increase the effectiveness of therapies for a variety of cancers shown to have dysregulated DBC1 levels, such as breast, lung and prostate cancers."

Credit: 
Virginia Commonwealth University

Processed foods may hold key to rise in autism

image: Researchers at UCF have identified the molecular changes that happen when neural stem cells are exposed to high levels of an acid commonly found in processed foods.

Image: 
University of Central Florida

With the number of children diagnosed with autism on the rise, the need to find what causes the disorder becomes more urgent every day. UCF researchers are now a step closer to showing the link between the food pregnant women consume and the effects on a fetus' developing brain.

Drs. Saleh Naser, Latifa Abdelli and UCF undergraduate research assistant Aseela Samsam have identified the molecular changes that happen when neural stem cells are exposed to high levels of an acid commonly found in processed foods. In a study published June 19 in Scientific Reports, a Nature journal, the UCF scientists discovered how high levels of propionic acid (PPA), used to increase the shelf life of packaged foods and inhibit mold in commercially processed cheese and bread, reduce the development of neurons in fetal brains.

Dr. Naser, who specializes in gastroenterology research at the College of Medicine's Burnett School of Biomedical Sciences, began the study after reports showed that autistic children often suffer from gastric issues such as irritable bowel syndrome. He wondered about a possible link between the gut and the brain and began examining how the microbiome -- or gut bacteria -- differed between people with autism and those who do not have the condition.

"Studies have shown a higher level of PPA in stool samples from children with autism and the gut microbiome in autistic children is different," Dr. Naser said. "I wanted to know what the underlying cause was."

In the lab, the scientists found that exposing neural stem cells to excessive PPA damages brain cells in several ways. First, the acid disrupts the natural balance between brain cells by reducing the number of neurons and over-producing glial cells. While glial cells help develop and protect neuron function, too many glial cells disturb connectivity between neurons. They also cause inflammation, which has been noted in the brains of autistic children.

Excessive amounts of the acid also shorten and damage the pathways that neurons use to communicate with the rest of the body. The combination of reduced neurons and damaged pathways impedes the brain's ability to communicate, resulting in behaviors that are often found in children with autism, including repetitive behavior, mobility issues and inability to interact with others.

Previous studies have proposed links between autism and environmental and genetic factors, but Drs. Naser and Abdelli say their study is the first to discover the molecular link between elevated levels of PPA, proliferation of glial cells, disturbed neural circuitry and autism. The 18-month study was self-funded by UCF.

PPA occurs naturally in the gut, and a mother's microbiome changes during pregnancy in ways that can increase levels of the acid. But Drs. Naser and Abdelli said eating packaged foods containing the acid can further increase PPA in the woman's gut, which then crosses to the fetus.

More research needs to be done before drawing clinical conclusions. Next, the research team will attempt to validate its findings in mouse models by seeing whether a high-PPA maternal diet causes autism in mice genetically predisposed to the condition. There is no cure for autism, which affects about 1 in 59 children, but the scientists hope their findings will advance studies for ways to prevent the disorder.

"This research is only the first step towards better understanding of Autism Spectrum Disorder," the UCF scientists concluded. "But we have confidence we are on the right track to finally uncovering autism etiology."

Credit: 
University of Central Florida

Nearly 5.4 million cancer survivors suffer chronic pain

A new report finds about one in three cancer survivors (34.6%) reported having chronic pain, representing nearly 5.4 million cancer survivors in the United States. The report, appearing as a Research Letter in JAMA Oncology, finds one in six survivors (16%), representing about 2.5 million people in the U.S., reported suffering from high impact chronic pain that restricts daily functioning. Those rates are about double the rates in the general population.

Chronic pain is one of the most common long-term effects of cancer treatment and has been linked with an impaired quality of life, lower adherence to treatment, and higher health care costs. Nevertheless, there is a paucity of information regarding the prevalence of, and risk factors for, the development of chronic pain among cancer survivors.

To gain a better understanding of the epidemiology of pain in cancer survivors and help inform future health care priorities and policies, investigators led by Changchuan (Charles) Jiang, MD MPH of Mount Sinai St. Luke’s and Mount Sinai West, New York, with researchers from Memorial Sloan-Kettering Cancer Center, University of Virginia, and the American Cancer Society investigated the prevalence of chronic pain among cancer survivors in the United States using data from the National Health Interview Survey (2016-2017). The survey collects information related to chronic pain (pain on most days or every day in the past six months) and high impact chronic pain (chronic pain limiting life or work activities on most days or every day in the past 6 months).

Overall, 1,648 of the 4,526 cancer survivors identified in the survey (34.6%) reported having chronic pain; 768 of the survivors (16.1%) reported having high impact chronic pain. Applied to the nation as a whole, those rates equal approximately 5.39 million and 2.51 million cancer survivors, respectively, in the U.S.
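As a rough check on the arithmetic, here is a minimal sketch of how sample counts translate into prevalence and national estimates. Note that NHIS prevalence figures are survey-weighted, so the published 34.6% and 16.1% differ slightly from the crude sample proportions; the survivor population used for scaling below is an assumption for illustration.

```python
# Minimal sketch of the arithmetic behind the headline numbers. NHIS figures
# are survey-weighted, so the published 34.6% / 16.1% differ slightly from
# the crude sample proportions below. The survivor population used for
# scaling is an assumed figure for illustration, not taken from the letter.
chronic_pain, high_impact, sample = 1648, 768, 4526

print(f"crude chronic pain proportion: {chronic_pain / sample:.1%}")  # ~36.4%
print(f"crude high-impact proportion:  {high_impact / sample:.1%}")   # ~17.0%

survivors_us = 15.6e6  # assumed number of US cancer survivors (illustrative)
print(f"weighted 34.6% of {survivors_us / 1e6:.1f}M survivors: "
      f"{0.346 * survivors_us / 1e6:.2f}M people")   # ~5.4M, close to 5.39M
print(f"weighted 16.1% of {survivors_us / 1e6:.1f}M survivors: "
      f"{0.161 * survivors_us / 1e6:.2f}M people")   # ~2.51M
```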

Time since diagnosis was not significantly associated with the prevalence of either type of chronic pain, but a higher prevalence of chronic and high impact chronic pain was reported among survivors with less than a high school education (39.2% for chronic pain and 18.5% for high impact chronic pain), low household income (44.6% and 22.8%, respectively), public insurance among those aged 18-64 years (43.6% and 27.1%, respectively), or no paid employment (38.5% and 20.4%, respectively).

"Because socioeconomic status and employment are associated with insurance coverage and access to care in the United States, the patterns of chronic pain that we observed in cancer survivors may be explained by barriers to cancer care and pain management as well as by the type and extent of cancer treatment received," said Xuesong Han, PhD, American Cancer Society investigator and co-author of the report. "The prevalence of chronic pain and high impact chronic pain among cancer survivors in our study was almost double that in the general population, suggesting there are important unmet needs in the large and growing community of people with a history of cancer."

Credit: 
American Cancer Society

First events in stem cells becoming specialized cells needed for organ development

TORONTO, ON (Canada) - New research by cell biologists at the University of Toronto (U of T) provides significant new insight into the very first step stem cells go through to turn into the specialized cells that make up organs.

The findings published online in Genes & Development implicate the ability of proteins to hang around in cells - their stability - as a major factor in controlling a stem cell's state, and in the decision to remain a stem cell or transform into a specialized cell.

Stem cells are regulated by a network of proteins which maintain their ability to become any type of cell - a property known as pluripotency. These proteins, known as transcription factors, are produced from genes in an organism's DNA and regulate the process by which cells decide whether or not to initiate development. The new findings highlight the role of KLF4, one of the transcription factors that gives stem cells their unique properties.

The discovery was serendipitous as the researchers initially set out to investigate how the KLF4 gene is regulated during transcription, but soon turned their attention to the KLF4 protein instead.

"Many previous studies focus on the genes that are turned on or off as stem cells are destined to make specific organs," says lead author Navroop Dhaliwal, who recently completed a PhD with Professor Jennifer Mitchell in the Department of Cell & Systems Biology in the Faculty of Arts & Science at U of T. "Our work exposes a situation earlier in the process where reducing gene expression by 90 per cent does not affect the amount of protein made. It was a really surprising finding when we first saw the results."

The researchers found KLF4 proteins made one day remained functional 24 hours later - a particular surprise as transcription factors typically only last for two or three hours in a cell.
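A back-of-the-envelope decay calculation shows why 24-hour persistence is surprising: under simple first-order degradation, the fraction of a protein remaining after time t is (1/2)^(t/t_half). The half-life value below is an illustrative assumption within the typical two-to-three-hour range the researchers mention.

```python
# Why 24-hour persistence is surprising: under simple first-order decay,
# fraction remaining = 0.5 ** (t / half_life). The half-life here is an
# illustrative assumption, not a measurement from the study.
def fraction_remaining(hours, half_life_hours):
    return 0.5 ** (hours / half_life_hours)

typical_tf = fraction_remaining(24, half_life_hours=2.5)
print(f"typical transcription factor after 24 h: {typical_tf:.4%}")
# ~0.13% left -- so detecting functional KLF4 a full day later implies a
# half-life far longer than the 2-3 hours typical of transcription factors.
```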

When they looked at how stem cells differentiate and exit the stem cell state, they found that KLF4 becomes unstable during the process; when this breakdown is prevented, the cells cannot differentiate.

"We discovered that the KLF4 protein is highly stable and locks cells in the stem cell state," said Dhaliwal. "Breaking it down, however, releases stem cells to specialize and eventually become the different organs of the body."

Dhaliwal and her colleagues say the findings indicate that KLF4 protein destabilization is a critical step in the ability of a stem cell to become any one of the hundreds of special cell types found in a mature organism.

"These findings have important implications for regenerative medicine as building new organs requires a detailed understanding of how stem cells exit their immature state," says Dhaliwal, now a postdoctoral fellow at the Hospital for Sick Children in Toronto. "Knowing this, we can now develop more efficient ways to produce patient specific stem cells and differentiate these cells to more mature cells, which will be the focus of my postdoctoral work."

Beyond its role in stem cells, KLF4 is also involved in numerous cancers. The researchers suggest the mechanisms uncovered here may shed light on its role in the development of breast cancer, squamous cell carcinoma and gastrointestinal cancer.

"The data we present highlight the importance of studying both transcriptional control and mechanisms that affect protein abundance," says Mitchell. "These mechanisms are particularity timely to keep in mind as more and more work shifts to a focus on gene expression using techniques like single cell RNA-sequencing, which would not have revealed the mechanisms we uncovered."

Credit: 
University of Toronto

Study of multiethnic genomes identifies 27 genetic variants associated with disease

image: Improving genetic diversity increases our understanding of the architecture of complex traits.

Image: 
Ernesto Del Aguila, III (NHGRI)

In a study published in the journal Nature, researchers identified 27 new genomic variants associated with conditions such as blood pressure, type 2 diabetes, cigarette use and chronic kidney disease in diverse populations. The team collected data from 49,839 participants who were African-American, Hispanic/Latino, Asian, Native Hawaiian or Native American, or who identified with other groups not defined by those categories. The study aimed to better understand how genomic variants influence the risk of developing certain diseases in people of different ethnic groups. The work was funded by the National Human Genome Research Institute (NHGRI) and the National Institute on Minority Health and Health Disparities, both parts of the National Institutes of Health.

In this study, researchers specifically looked for genomic variants in DNA that were associated with measures of health and disease. Everyone has DNA sequences that consist of the chemical bases A, C, G, T. Genomic variants occur in DNA regions where one of those bases is replaced with another, across various individuals. The team found that some genomic variants are specifically found in certain groups. Others, such as some related to the function of hemoglobin (a protein in the blood that carries oxygen), are found in multiple groups.
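As a concrete illustration of what it means for a variant to be common in one group and rare or absent in another, the sketch below tallies the frequency of an alternate allele per population. The genotypes and group labels are fabricated for illustration, not PAGE data.

```python
# Sketch of per-population allele frequency for a single variant site.
# Genotypes (copies of the alternate base per person) are fabricated
# examples; real analyses like PAGE use genome-wide arrays and far
# larger samples.
from collections import defaultdict

# (population, copies of alternate allele) for a handful of individuals
genotypes = [
    ("GroupA", 1), ("GroupA", 2), ("GroupA", 0), ("GroupA", 1),
    ("GroupB", 0), ("GroupB", 0), ("GroupB", 0), ("GroupB", 1),
]

counts = defaultdict(lambda: [0, 0])   # population -> [alt copies, total copies]
for pop, alt in genotypes:
    counts[pop][0] += alt
    counts[pop][1] += 2                # diploid: two copies per person

for pop, (alt, total) in counts.items():
    print(f"{pop}: alternate allele frequency = {alt / total:.2f}")
```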

"There are scientific benefits to including people from different ethnic groups in research studies. This paper gives us a glimpse of how ethnic diversity can be harnessed to better understand disease biology and clinical implications," said Lucia Hindorff, Ph.D., program director in the Division of Genomic Medicine at NHGRI and a co-author of the paper. "This paper represents an important comprehensive effort to incorporate diversity into large-scale studies, from study design to data analysis."

Apart from finding new genomic variants, the study assessed whether 8,979 established associations between genomic variants and specific diseases, originally identified in European-ancestry populations, could be detected in African-American, Hispanic/Latino, Asian, Native Hawaiian and Native American populations.

Their findings show that the frequency of genomic variants associated with certain diseases can differ from one group to another. For example, a strong association was found between a new genomic variant and the number of cigarettes smoked per day among Native Hawaiian participants. However, this association was absent or rare in most other populations. Not finding the variant in all groups despite large numbers of participants in each group strengthens the argument that findings from one population cannot always be generalized to others.

A variant in the hemoglobin gene, a gene known for its role in sickle cell anemia, is associated with a greater amount of blood glucose attached to hemoglobin in African-Americans. The paper in Nature is the first to confirm this association in Hispanics/Latinos, whose shared ancestry mixes European, African and Native American roots.

Such an effort is vital because the vast majority of human genomics research uses data based mostly on populations of white European ancestry. For example, a separate study showed that among 2,500 recently published human genomics papers, only 19% of the individuals studied were non-European participants.

Inclusion of non-European populations in studies is important because ethnicity may partly explain the differences in vulnerability to diseases and treatment effects. This is because there may be genomic variants present in other ethnic populations that increase risk for diseases, but they would not be found if studies were only done on white European populations. Using genomic data from white Europeans to extrapolate to other populations may not accurately predict the disease burden carried by such groups.

The study is part of the Population Architecture using Genomics and Epidemiology (PAGE) consortium, which was formed in 2008, comprising researchers at NHGRI and centers across the United States. The paper in Nature on the study, led by researchers at the Icahn School of Medicine at Mount Sinai, the Fred Hutchinson Cancer Research Center, and other academic centers, is the result of work undertaken by the consortium within the last five years.

The study is a benchmark that addresses the need for new methods and tools for collecting and disseminating large and varied amounts of genomic data, in order to make the results clinically useful. "Ultimately, the PAGE study underscores the value of studying diverse populations, because only with a full understanding of genomic variations across populations can researchers comprehend the full potential of the human genome," said Dr. Hindorff.

Through PAGE and subsequent studies, researchers will be able not only to distinguish genomic variants that are associated with diseases from those that are not, but also to understand how such associations differ across race and ethnicity. In turn, this improved understanding can be used to target and tailor new treatments to maximize benefit across multiple populations.

Credit: 
NIH/National Human Genome Research Institute

Unearthing the sweet potato proteome

The sweet, starchy orange sweet potatoes are tasty and nutritious ingredients for fries, casseroles and pies. Although humans have been cultivating sweet potatoes for thousands of years, scientists still don't know much about the protein makeup of these tubers. In ACS' Journal of Proteome Research, researchers have analyzed the proteome of sweet potato leaves and roots, and in the process, have revealed new insights into the plant's genome.

The sweet potato (Ipomoea batatas, Lam.) is a staple food in some parts of the world, in addition to being used for animal feed and industrial products, such as biofuels. The plant has a surprisingly complex genome, encoding more predicted genes than the human genome. Sweet potato also has a complex chemical composition, with a low protein content in the roots (the part that people eat) and many secondary metabolites in the leaves, making it difficult to extract sufficient quantities of proteins for analysis. Sorina and George Popescu and colleagues wanted to see whether a "proteogenomics" approach -- analyzing both protein and genetic data together -- could help them gain a better understanding of the compositions of sweet potato roots and leaves.

The team extracted proteins from root and leaf samples using two different methods and cut them into peptides, which they analyzed with liquid chromatography and mass spectrometry. The researchers identified 3,143 unique proteins from sweet potato leaves and 2,928 from roots. When they compared the proteomic data with the genome of the sweet potato, they identified some regions in the published genome sequence where their data could provide enhanced information. For example, the analysis predicted 741 new protein-coding regions that previously were not thought to be genes. The group says the results could be used to help further characterize and biofortify the tuber.
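The core proteogenomic move, searching observed peptides against the genome itself rather than only against annotated genes, can be sketched in a few lines: translate all six reading frames and look for the peptide. The toy sequence, peptide and minimal codon table below are invented for illustration; real pipelines handle splicing, sequence ambiguity and statistical scoring of peptide-spectrum matches.

```python
# Toy sketch of the proteogenomic idea: search an observed peptide against all
# six reading frames of a genome sequence. A confident peptide hit that falls
# outside annotated genes is evidence for a new protein-coding region.
# All sequences here are invented for illustration.

CODON_TABLE = {
    # Minimal table covering only the codons used in this toy example.
    "ATG": "M", "GCT": "A", "CCG": "P", "GAA": "E", "CGT": "R",
    "TTC": "F", "AAA": "K", "TAA": "*",
}

def revcomp(dna):
    return dna.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def six_frame_translations(dna):
    """Yield (frame, protein) for all six reading frames."""
    for strand, seq in (("+", dna), ("-", revcomp(dna))):
        for offset in range(3):
            codons = [seq[i:i + 3] for i in range(offset, len(seq) - 2, 3)]
            protein = "".join(CODON_TABLE.get(c, "X") for c in codons)
            yield f"{strand}{offset + 1}", protein

genome_window = "ATGGCTCCGGAACGTTTCAAATAA"   # toy unannotated genomic region
observed_peptide = "APERFK"                  # peptide seen by mass spectrometry

for frame, protein in six_frame_translations(genome_window):
    if observed_peptide in protein:
        print(f"peptide {observed_peptide} maps to frame {frame}: {protein}")
```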

Credit: 
American Chemical Society

Scientists chart course toward a new world of synthetic biology

image: The roadmap for synthetic or engineering biology identifies five research areas that the federal government needs to invest in to fuel the bioeconomy and keep the US at the forefront of the field.

Image: 
Engineering Biology Research Consortium, UC Berkeley

Genetically engineered trees that provide fire-resistant lumber for homes. Modified organs that won't be rejected. Synthetic microbes that monitor your gut to detect invading disease organisms and kill them before you get sick.

These are just some of the exciting advances likely to emerge from the 20-year-old field of engineering biology, or synthetic biology, which is now mature enough to provide solutions to a range of societal problems, according to a new roadmap released today (June 19) by the Engineering Biology Research Consortium, a public-private partnership partially funded by the National Science Foundation and centered at the University of California, Berkeley.

The roadmap is the work of more than 80 scientists and engineers from a range of disciplines, representing more than 30 universities and a dozen companies. While highly technical, the report provides a strong case that the federal government should invest in this area, not only to improve public health, food crops and the environment, but also to fuel the economy and maintain the country's leadership in synthetic biology. The report comes out in advance of the year's major technical conference for synthetic biology, 2019 Synthetic Biology: Engineering, Evolution & Design, which takes place June 23-27 in New York City.

Engineering biology/synthetic biology encompasses a broad range of current endeavors, including genetically modifying crops, engineering microbes to produce drugs, fragrances and biofuels, editing the genes of pigs and dogs using CRISPR-Cas9, and human gene therapy. But these successes are just a prelude to more complex biological engineering coming in the future, and the report lays out the opportunities and challenges, including whether or not the United States makes it a research priority.

"The question for government is, if all of these avenues are now open for biotechnology development, 'How does the U.S. stay ahead in those developments as a country?'" said Douglas Friedman, one of the leaders of the roadmap project and executive director of the Engineering Biology Research Consortium. "This field has the ability to be truly impactful for society, and we need to identify engineering biology as a national priority, organize around that national priority and take action based on it."

China and the United Kingdom have made engineering biology/synthetic biology -- which means taking what we know about the genetics of plants and animals and then tweaking specific genes to make these organisms do new things -- a cornerstone of their national research enterprise.

Following that lead, the U.S. House of Representatives held a hearing in March to discuss the Engineering Biology Research and Development Act of 2019, a bill designed to "provide for a coordinated federal research program to ensure continued United States leadership in engineering biology." This would make engineering biology a national initiative equivalent to the country's recent commitments to quantum information systems and nanotechnology.

"What this roadmap does and what all of our collaborators on this project have done is to imagine, over the next 20 years, where we should go with all of this work," said Emily Aurand, who directed the roadmapping project for the EBRC. "The goal was to address how applications of the science can expand very broadly to solve societal challenges, to imagine the breadth and complexity of what we can do with biology and biological systems to make the world a better, cleaner, more exciting place."

"This roadmap is a detailed technical guide that I believe will lead the field of synthetic biology far into the future. It is not meant to be a stagnant document, but one that will continually evolve over time in response to unexpected developments in the field and societal needs." said Jay Keasling, a UC Berkeley professor of chemical and biomolecular engineering and the chair of EBRC's roadmapping working group.

The roadmap would guide investment by all government agencies, including the Department of Energy, Department of Defense and National Institutes of Health as well as NSF.

"The EBRC roadmap represents a landmark achievement by the entire synthetic biology and engineering biology community," said Theresa Good, who is the deputy division director for molecular and cellular biosciences at the National Science Foundation and co-chair of a White House-level synthetic biology interagency working group. "The roadmap is the first U.S. science community technical document that lays out a path to achieving the promise of synthetic biology and guideposts for scientists, engineers and policy makers to follow."

Apples, meat and THC

Some products of engineering biology are already on the market: non-browning apples; an antimalarial drug produced by bacteria; corn that produces its own insecticide. One Berkeley start-up is engineering animal cells to grow meat in a dish. An Emeryville start-up is growing textiles in the lab. A UC Berkeley spin-off is creating medical-quality THC and CBD, two of the main ingredients in marijuana, while another is producing brewer's yeast that provides the hoppy taste in beer, but without the hops.

But much of this is still done on small scales; larger-scale projects lie ahead. UC Berkeley bioengineers are trying to modify microbes so that they can be grown as food or to produce medicines to help humans survive on the moon or Mars.

Others are attempting to engineer the microbiome of cows and other ruminants so that they can better digest their feed, absorb more nutrients and produce less methane, which contributes to climate change. With rising temperatures and less predictable rain, scientists are also trying to modify crops to better withstand heat, drought and saltier soil.

And how about modified microbes, seaweed or other ocean or freshwater plants -- or even animals like mussels -- that will naturally remove pollutants and toxins from our lakes and ocean, including oil and plastic?

"If you look back in history, scientists and engineers have learned how to routinely modify the physical world though physics and mechanical engineering, learned how to routinely modify the chemical world through chemistry and chemical engineering," Friedman said. "The next thing to do is figure out how to utilize the biological world through modifications that can help people in a way that would otherwise not be possible. We are at the precipice of being able to do that with biology."

While in the past some genetically engineered organisms have generated controversy, Friedman says the scientific community is committed to engaging with the public before their introduction.

"It is important that the research community, especially those thinking about consumer-facing products and technologies, talk about the ethical, legal and societal implications early and often in a way different than we have seen with biotech developments in the past," he said.

In fact, the benefits of engineering biology are so vast that it's an area we just cannot ignore.

"The opportunity is immense," Friedman said.

Credit: 
University of California - Berkeley

'Goldilocks' neurons promote REM sleep

image: Using a technique called optogenetics, a laser light was used to turn MCH neurons on or off, time-locked with the temperature warming phases. This demonstrated that MCH neurons are necessary for the increase in REM sleep during warming toward a 'just right' room temperature.

Image: 
Pascal Gugler for Insel Gruppe AG

Every night while sleeping, we cycle between two very different states of sleep. Upon falling asleep, we enter non-rapid eye movement (non-REM) sleep where our breathing is slow and regular and movement of our limbs or eyes is minimal. Approximately 90 minutes later, however, we enter rapid eye movement (REM) sleep. This is a paradoxical state where our breathing becomes fast and irregular, our limbs twitch, and our eyes move rapidly. In REM sleep, our brain is highly active, but we also become paralyzed and we lose the ability to thermoregulate or maintain our constant body temperature. "This loss of thermoregulation in REM sleep is one of the most peculiar aspects of sleep, particularly since we have finely-tuned mechanisms that control our body temperature while awake or in non-REM sleep", says Markus Schmidt of the Department for BioMedical Research (DBMR) of the University of Bern, and the Department of Neurology, Inselspital, Bern University Hospital. On the one hand, the findings confirm a hypothesis proposed earlier by Schmidt, senior author of the study, and on the other hand represent a breakthrough for sleep medicine. The paper was published in "Current Biology" and highlighted by the editors with a comment.

A control mechanism saving energy

The need to maintain a constant body temperature is our most expensive biological function. Panting, piloerection, sweating, or shivering are all energy consuming body reactions. In his hypothesis, Markus Schmidt suggested that REM sleep is a behavioral strategy that shifts energy resources away from costly thermoregulatory defense toward, instead, the brain to enhance many brain functions. According to this energy allocation hypothesis of sleep, mammals have evolved mechanisms to increase REM sleep when the need for defending our body temperature is minimized or, rather, to sacrifice REM sleep when we are cold. "My hypothesis predicts that we should have neural mechanisms to dynamically modulate REM sleep expression as a function of our room temperature", says Schmidt. Neuroscientists at the DBMR at the University of Bern and the Department of Neurology at Inselspital, Bern University Hospital, now confirmed his hypothesis and found neurons in the hypothalamus that specifically increase REM sleep when the room temperature is "just right".

REM sleep promoting neurons

The researchers discovered that a small population of neurons within the hypothalamus, called melanin-concentrating hormone (MCH) neurons, play a critical role in how we modulate REM sleep expression as a function of ambient (or room) temperature. The researchers showed that mice will dynamically increase REM sleep when the room temperature is warmed to the high end of their comfort zone, similar to what has been shown for human sleep. However, genetically engineered mice lacking the receptor for MCH are no longer able to increase REM sleep during warming, as if they are blind to the warming temperature. The authors used optogenetics techniques to specifically turn MCH neurons on or off using a laser light time-locked to the temperature warming periods. Their work confirms the necessity of the MCH system to increase REM sleep when the need for body temperature control is minimized.

Breakthrough for sleep medicine

This is the first time that an area of the brain has been found to control REM sleep as a function of room temperature. "Our discovery of these neurons has major implications for the control of REM sleep", says Schmidt. "It shows that the amount and timing of REM sleep are finely tuned with our immediate environment when we do not need to thermoregulate. It also confirms how dream sleep and the loss of thermoregulation are tightly integrated".

REM sleep is known to play an important role in many brain functions such as memory consolidation. REM sleep comprises approximately one quarter of our total sleep time. "These new data suggest that the function of REM sleep is to activate important brain functions specifically at times when we do not need to expend energy on thermoregulation, thus optimizing use of energy resources", says Schmidt.

Credit: 
University of Bern

Scientists demonstrate the advantages of diverse populations when compiling genetic data

AURORA, Colo. (June 19, 2019) - Relying strictly on genetic data from those of European descent, rather than more diverse populations, can exacerbate existing disease and increase health care disparities, according to new research.

The research letter was published today in the journal Nature.

"There have been numerous discoveries in human genetics over the last few decades that have told us a lot about biology, but most of the work is being done on those of European descent," said the study's first author Christopher Gignoux, PhD, MS, associate professor at the Colorado Center for Personalized Medicine at the University of Colorado Anschutz Medical Campus. "By limiting our focus, we are limiting our understanding of the human genetics underlying complex traits. The PAGE Study gives us an overdue opportunity to look at what we can find when studying a large number of groups together."

This was borne out in the study which examined thousands of individuals in the U.S. of non-European ancestry. The Population Architecture using Genomics and Epidemiology study (PAGE) was developed by the National Human Genome Research Institute and the National Institute on Minority Health and Health Disparities to conduct and empower genetic research in diverse populations.

Researchers genotyped 49,839 people and replicated a number of genetic variants from studies conducted strictly in populations of European descent. But PAGE investigators also made dozens of discoveries that would not have been possible in a single-population study, in both complex traits and Mendelian, or monogenic, disorders.

"In light of differential genetic architecture that is known to exist between populations, bias in representation can exacerbate existing disease and health care disparities," the study said. "Critical variants can be missed if they have a low frequency or are completely absent in European populations..." Especially rare variants.

Gignoux said the success of precision medicine and genomics depends on recruiting people from underrepresented populations for genetic studies. Right now, those genomic databases lack critical diversity, despite the fact that many people in underrepresented groups carry the greatest health burden and stand to benefit the most from being included.

"The Colorado Center for Personalized Medicine on the Anschutz Medical Campus is committed to personalized medicine here in our state and region that will benefit ALL people, regardless of who you are or where you came from," said Kathleen Barnes, PhD, director of the Colorado Center for Personalized Medicine. "Initiatives like PAGE, and the work summarized in this manuscript by Chris Gignoux and colleagues, show us the way forward in achieving our goals of inclusion. It also illuminates just how important genetic diversity is in our understanding of the architecture of genetic disease. These approaches can now feed into our personalized ancestry information resource for patients interested in their own ancestry, as well as benefit our research and clinical community."

Gignoux agreed.

"With studies of diverse groups we got a better overall picture of the genetic architecture which show the underpinnings of disease," Gignoux said. "We want to understand how genetics can improve and ameliorate disease rather than make it worse."

Credit: 
University of Colorado Anschutz Medical Campus

US military consumes more hydrocarbons than most countries -- massive hidden impact on climate

The US military's carbon footprint is enormous and must be confronted in order to have a substantial effect on battling global warming.

Research by social scientists from Durham University and Lancaster University shows the US military is one of the largest climate polluters in history, consuming more liquid fuels and emitting more CO2e (carbon-dioxide equivalent) than most countries.

The majority of greenhouse gas (GHG) accounting routinely focuses on civilian energy use and fuel consumption, not on the US military. This new study, published in Transactions of the Institute of British Geographers, calculates part of the US military's impact on climate change through critical analysis of its global logistical supply chains.

The research provides an independent public assessment of the US military's greenhouse gas emissions. It reports that if the US military were a nation state, it would be the 47th largest emitter of GHG in the world, taking into account only the emissions from fuel usage.

Report co-author Dr Patrick Bigger, of Lancaster University Environment Centre, said: "The US Military has long understood it is not immune from the potential consequences of climate change - recognising it as a threat multiplier that can exacerbate other threats - nor has it ignored its own contribution to the problem.

"Yet its climate policy is fundamentally contradictory - confronting the effects of climate change while remaining the largest single institutional consumer of hydrocarbons in the world, a situation it is locked into for years to come because of its dependence on existing aircraft and warships for open-ended operations around the globe."

Despite the recent increase in attention, the US military's dependence on fossil fuels is unlikely to change. The US is continuing to pursue open-ended operations around the globe, with the life-cycles of existing military aircraft and warships locking them into hydrocarbons for years to come.

The research comes at a time when the US military is preparing for climate change through both its global supply networks and its security infrastructure. This study brings transparency to one of the world's largest institutional consumers of hydrocarbons at a time when the issue is a hot-button topic on the US Presidential campaign trail. Leading Democratic candidates, such as Senator Elizabeth Warren, are asking critical questions of the role of the US military in climate change and examining its plans for the future.

Co-author Dr Benjamin Neimark, Associate Director of the Pentland Centre for Sustainability in Business at Lancaster, said: "This research provides ample evidence to support recent calls by activist networks to include the US military in Congresswoman Alexandria Ocasio-Cortez's Green New Deal and other international climate treaties."

Co-author Dr Oliver Belcher, of Durham University's Department of Geography, said: "Our research demonstrates that to account for the US military as a major climate actor, you must understand the logistical supply chain that makes its acquisition and consumption of hydrocarbon-based fuels possible.

"How do we account for the most far-reaching, sophisticated supply chains, and the largest climate polluter in history? While incremental changes can amount to radical effects in the long-run, there is no shortage of evidence that the climate is at a tipping point and more is needed."

The researchers' examination of the US military 'carbon boot-print' started with the US Defense Logistics Agency - Energy (DLA-E), a powerful yet virtually unresearched sub-agency within the larger Defense Logistics Agency. It is the primary purchase-point for hydrocarbon-based fuels for the US Military, and a powerful actor in the global oil market, with the fuels it delivers powering everything from routine base operations in the USA to forward operating bases in Afghanistan.

"An important way to cool off the furnace of the climate emergency is to turn off vast sections of the military machine," added Dr Neimark. "This will have not only the immediate effect of reducing emissions in the here-and-now, but create a disincentive in developing new hydrocarbon infrastructure integral to US military operations."

Other key findings of the report include:

In 2017 alone, the US military purchased about 269,230 barrels of oil a day and emitted more than 25,000 kt CO2e by burning those fuels. That same year, the Air Force purchased $4.9 billion worth of fuel and the Navy $2.8 billion, followed by the Army at $947 million and the Marines at $36 million.

If the US military were a country, it would nestle between Peru and Portugal in the global league table of fuel purchasing, when comparing 2014 World Bank country liquid fuel consumption with 2015 US military liquid fuel consumption.

For 2014, the scale of emissions is roughly equivalent to total - not just fuel - emissions from Romania, according to the DLA-E data obtained by the researchers, which cover GHG emissions from direct or stationary sources, indirect or mobile sources and electricity use, and other indirect emissions, including upstream and downstream emissions.

The Air Force is by far the largest emitter of GHG at more than 13,000 kt CO2e, almost double that of the US Navy's 7,800 kt CO2e. In addition to using the most polluting types of fuel, the Air Force and Navy are also the largest purchasers of fuel.

Credit: 
Lancaster University

Many asylum seekers suffer from depression and anxiety symptoms

Up to 40% of the adults who have sought asylum in Finland reported suffering from major depression and anxiety symptoms. More than half of both the adults and children reported having experienced at least one shocking, possibly traumatic event, such as being subjected to violence.

The results come from a recent study by the Finnish National Institute for Health and Welfare, in which more than a thousand asylum seekers who had just arrived in Finland underwent medical examinations and interviews. So far, it is the most extensive population study of asylum seekers' health at both the national and international level.

"Over 60 % of asylum seekers coming from Sub-Saharan Africa had depression and anxiety symptoms - the percentage is higher than among asylum seekers from other areas. The same group had also had the highest number of shocking experiences before coming to Finland. For example, 67% of men from Africa reported having experienced torture and 57% of women reported experiences of sexual violence," says Anu Castaneda, Research Manager from the National Institute for Health and Welfare.

According to her, it is, therefore, important to support the mental health and functioning capacity of asylum seekers already at the reception stage.

"This may be effected by supporting meaningful everyday life and activities of asylum seekers, as well as by providing counselling and discussions and information on mental health and by investing in the smooth operation of referral paths. It is particularly important to support the welfare of children and families."

Women's health weaker than men's in many respects

A larger share of women than men, 49% in all, reported having a long-term illness or health problem, such as a musculoskeletal condition, diabetes or a respiratory disorder. One in ten of the women studied was pregnant when arriving in Finland.

On the other hand, men had more injuries caused by accidents and violence, with as many as 55% reporting them. Men also smoked cigarettes more often than women, with up to 37% smoking.

In many areas of health, the situation of those coming from the Middle East and from Africa in particular was weaker than that of asylum seekers from other parts of the world.

"It would be advisable to disseminate more health-related information to asylum seekers in an understandable and easy-to-approach form," says Natalia Skogberg, Project Manager from the National Institute fro Health and Welfare.

According to the study, the asylum seekers had problems in many other areas of health as well, such as oral health. Most of the asylum seekers under the age of 18 had never been to a dentist before coming to Finland.

Alcohol and substance use rare among asylum seekers

On the other hand, some of the findings were quite positive with respect to health. For example, 85% of the adults seeking asylum said that they do not use alcohol at all, and only a few per cent drank to get intoxicated. Use of other substances was also rare among asylum seekers. Furthermore, very few of those studied showed symptoms of infectious diseases.

"The results of the study are important particularly as we want to develop our activities by which we respond to the health needs of asylum seekers," notes Olli Snellman, Head of Section from the Finnish Immigration Service.

"Based on the results, we are in the process of updating and developing the initial medical examination model applied to asylum seekers, to be adopted in all reception centres around Finland."

The purpose of the study was to produce comprehensive information on the health and welfare of adults and minors who had sought asylum in Finland in 2018 and their need for services in Finland.

Credit: 
Finnish Institute for Health and Welfare

Study predicts more long-term sea level rise from Greenland ice

image: The researchers ran their model 1500 times, testing a variety of land, ice, ocean and atmospheric variables to see how they affected ice melt rate - including three possible future climate scenarios (from left to right: low, medium, and high emissions out to the year 2300).

Image: 
Credits: NASA / Cindy Starr

Greenland's melting ice sheet could generate more sea level rise than previously thought if greenhouse gas emissions continue to increase and warm the atmosphere at their current rate, according to a new modeling study. The study, which used data from NASA's Operation IceBridge airborne campaign, was published in Science Advances today. Over the next 200 years, the ice sheet model shows that melting at the present rate could contribute 19 to 63 inches to global sea level rise, said the team led by scientists at the Geophysical Institute at the University of Alaska Fairbanks. These numbers are at least 80 percent higher than previous estimates, which forecast up to 35 inches of sea level rise from Greenland's ice.

The team ran the model 500 times out to the year 3000 for each of three possible future climate scenarios, adjusting key land, ice, ocean and atmospheric variables to test their effects on ice melt rate. The three climate scenarios depend on the amount of greenhouse gas emissions in the atmosphere in coming years. In the scenario with no reduction of emissions, the study found that the entire Greenland Ice Sheet will likely melt in a millennium, causing 17 to 23 feet of sea level rise.
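The ensemble logic, running a model many times with perturbed parameters and reporting the spread of outcomes, can be illustrated with a toy melt model. Everything in the sketch below (melt rates, parameter ranges, the simplified sea-level conversion) is invented for illustration; the actual study used a full 3-D ice sheet model.

```python
# Toy Monte Carlo ensemble in the spirit of the study's approach: run a simple
# melt model many times with perturbed parameters and report the spread of
# outcomes. All rates and ranges here are invented stand-ins, not values from
# the Science Advances paper.
import random

random.seed(42)
GREENLAND_SLE_INCHES = 24 * 12   # ~24 feet of sea level equivalent, rounded

def run_once(years=200):
    # Perturb two "knobs": a baseline melt fraction per year and an
    # outlet-glacier amplification factor (both hypothetical parameters).
    melt_per_year = random.uniform(0.0003, 0.0010)   # fraction of ice sheet
    outlet_amplification = random.uniform(1.0, 1.7)
    remaining = 1.0
    for _ in range(years):
        remaining *= 1.0 - melt_per_year * outlet_amplification
    return (1.0 - remaining) * GREENLAND_SLE_INCHES  # inches of sea level rise

results = sorted(run_once() for _ in range(500))
low, median, high = results[25], results[250], results[-25]
print(f"5th-95th percentile contribution over 200 years: "
      f"{low:.0f}-{high:.0f} inches (median {median:.0f})")
```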

In the scenario where emissions are stabilized by the end of the century rather than continuing to increase, the model shows ice loss falling to 26-57 percent of total mass by 3000. Drastically limiting emissions so that they begin to decline by the end of the century could limit ice loss to 8-25 percent; this scenario would produce up to six feet of sea level rise in the next millennium, according to the study.

The updated model more accurately represents the flow of outlet glaciers, the river-like bodies of ice that connect to the ocean. Outlet glaciers play a key role in how ice sheets melt, but previous models lacked the data to adequately represent their complex flow patterns. The study found that melting outlet glaciers could account for up to 40 percent of the ice mass lost from Greenland in the next 200 years.

By incorporating ice thickness data from IceBridge and identifying sources of statistical uncertainty within the model, the study creates a more accurate picture of how human-generated greenhouse gas emissions and a warming climate may affect Greenland in the future.

A clearer picture

Capturing the changing flow and speed of outlet glacier melt makes the updated ice sheet model more accurate than previous models, according to the authors. As ocean waters have warmed over the past 20 years, they have melted the floating ice that once shielded the outlet glaciers from those rising temperatures. As a result, the outlet glaciers flow faster, melt and thin, and the lowering surface of the ice sheet exposes new ice to warm air, which melts as well.

"Once we had access to satellite observations, we were able to capture the surface velocity of the whole Greenland Ice Sheet and see how that ice flows. We recognized that some outlet glaciers flow very fast -- orders of magnitude faster than the interior of the ice sheet," said lead author Andy Aschwanden, a research associate professor at the University of Alaska Fairbanks' Geophysical Institute.

IceBridge's detailed ice thickness measurements helped the team become the first to model these areas where outlet glaciers are affected by warmer ocean waters, and to capture more of the complex feedbacks and processes influencing ice loss than previously possible. They examined the importance of factors such as underwater melting, large chunks of ice breaking off glaciers (calving), changing snowfall rates and rising air temperatures. They also examined factors that could slow ice loss, such as the movement of Earth's surface "bouncing back" once the weight of the ice is no longer there.

"At the end of the day, glaciers flow downhill," Aschwanden said. "That's very simplified, but if you don't know where downhill is, the model can never do a good job. So the most important contributor to understanding ice flow is knowing how thick the ice is."

Each of the three emissions scenarios used in the study produced different patterns of ice retreat across Greenland. The least severe scenario showed the ice retreating in the west and north, while the moderate scenario showed ice retreating around the island, except in the highest-elevation areas. In the most severe scenario, in which emissions continue to increase at their present rate, more than half of the model runs lost more than 99 percent of the ice sheet by 3000.

At its thickest point, the Greenland Ice Sheet currently stands more than 10,000 feet above sea level. It rises high enough into the atmosphere to alter the weather around it, as mountains do. Today, this weather pattern generates almost enough snowfall to compensate for the amount of naturally melting ice each year. In the future, however, melting and flow will thin the interior, lowering it into a layer of the atmosphere that lacks the conditions necessary for sufficient replenishing snowfall.
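The feedback described here can be illustrated with a toy calculation: as the surface lowers, the standard atmospheric lapse rate implies warmer air at the ice surface, so melt accelerates. All values below are invented for illustration and are not from the study.

```python
# Toy illustration of the elevation-melt feedback: as the surface thins,
# it sits lower in warmer air (via the atmospheric lapse rate), melt
# outpaces snowfall, and thinning accelerates. All values are hypothetical.
LAPSE_RATE = 0.0065   # deg C of warming per meter of surface lowering
MELT_PER_DEGC = 0.8   # m of ice melted per year per deg C above freezing (hypothetical)
SNOWFALL = 0.3        # m of ice gained per year from snowfall (hypothetical)
STEP = 100            # years per step

surface_m = 3000.0    # initial surface elevation (hypothetical)
temp_c = 0.5          # summer air temperature at the surface (hypothetical)

for year in range(0, 600, STEP):
    melt = max(0.0, temp_c * MELT_PER_DEGC)
    thinning = melt - SNOWFALL                  # net loss per year
    print(f"year {year}: surface {surface_m:7.1f} m, thinning {thinning:+.2f} m/yr")
    surface_m -= thinning * STEP                # surface lowers over the step
    temp_c += thinning * STEP * LAPSE_RATE      # lower surface -> warmer air
```

With these placeholder numbers the thinning rate grows each century, which is the self-reinforcing behavior the researchers describe.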

"In the warmer climate, glaciers have lost the regions where more snow falls than melts in the summer, which is where new ice is formed," said Mark Fahnestock, research professor at the Geophysical Institute and the study's second author. "They're like lumps of ice in an open cooler that are melting away, and no one is putting any more ice into the cooler."

The team stressed that, although ongoing research is needed on exactly how glaciers will move and melt in response to warming temperatures, all of the model runs show that the next few decades will be pivotal to the ice sheet's fate.

"If we continue as usual, Greenland will melt," Aschwanden said. "What we are doing right now in terms of emissions, in the very near future, will have a big long-term impact on the Greenland Ice Sheet, and by extension, if it melts, to sea level and human society."

Bridging the data gap

The model runs were performed on high-performance supercomputers at NASA's Ames Research Center and the University of Alaska Fairbanks (UAF) using the Parallel Ice Sheet Model (PISM), an open-source model developed at UAF and the Potsdam Institute for Climate Impact Research. NASA also provided funding support for the study. While other ice sheet models could perform these simulations, the team said, PISM stands out for its high resolution and low computational cost.

NASA's Operation IceBridge is the world's largest airborne survey of polar land and sea ice. Using an array of aircraft and scientific instruments, IceBridge has collected data between the end of the first Ice, Cloud and Land Elevation Satellite (ICESat) mission in 2010 and the launch of its successor, ICESat-2, in 2018. It has measured the elevation of the ice along its flight paths as well as the bedrock beneath the ice sheets.

"NASA's space and airborne campaigns, like IceBridge, have fundamentally transformed our ability to try and make a model mimic the changes to the ice sheet," Fahnestock said. "The technology that allows improved imaging of the glacier bed is like a better pair of glasses allowing us to see more clearly. Only NASA had an aircraft with the instruments and technology we needed and could go where we needed to go."

Banner Image: The Greenland Ice Sheet is the second-largest body of ice in the world, covering roughly 650,000 square miles of Greenland's surface. If it melts completely, it could contribute up to 23 feet of sea level rise, according to a new study using data from NASA's Operation IceBridge.

Credit: 
NASA/Goddard Space Flight Center

Upcycling process brings new life to old jeans

A growing population, rising standards of living and quickly changing fashions send mountains of clothing waste to the world's landfills each year. Although processes for textile recycling exist, they tend to be inefficient and expensive. Now, researchers have reported in ACS Sustainable Chemistry & Engineering an efficient, low-cost method that can convert waste denim into viscose-type fibers that are either white or the original color of the garment.

Cotton-based clothing, such as denim, makes up a large proportion of textile waste. Meanwhile, farming cotton consumes land and resources. Efficiently converting waste denim into reusable cotton fibers could help address both of these problems. Previously, researchers have used ionic liquids -- salts that are liquid, not solid -- to dissolve cotton textiles into their cellulose building blocks. The cellulose was then spun into new viscose-type fibers that could be woven into textiles. However, ionic liquids are expensive and difficult to work with because of their high viscosity. Nolene Byrne and colleagues wanted to find a way to reduce the amount of ionic liquid solvent required to recycle denim into regenerated cellulose fibers.

The researchers ground three textile samples (blue denim fabric, red denim pants and a mixed-color T-shirt) into powders. Then, they dissolved the powders in a 1:4 mixture of the ionic liquid 1-butyl-3-methylimidazolium acetate and dimethyl sulfoxide (DMSO). Using a high concentration of DMSO as a co-solvent allowed the researchers to use far less ionic liquid than other methods, and the DMSO also reduced the viscosity of the solution, making it easier to spin the cellulose into new fibers. Because DMSO is much cheaper than the ionic liquid, the new process cut the solvent cost by 77%. When they pre-treated the textile powders with a sodium hydroxide solution, the researchers could produce white viscose-like fibers; without this step, the fibers retained the color of the original garment, conserving the water and energy that would otherwise be required for dyeing.
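To see how the 1:4 blend drives down solvent cost, consider a simple weighted-price calculation. The prices below are hypothetical placeholders chosen only to show how a roughly 77% reduction can arise (the paper reports the figure but the actual prices are not given here), and the 1:4 ratio is assumed to be by mass.

```python
# Rough solvent-cost comparison for the 1:4 ionic liquid:DMSO blend.
# Prices are hypothetical placeholders; the ratio is assumed to be by mass.
PRICE_IL = 100.0    # $ per kg, 1-butyl-3-methylimidazolium acetate (hypothetical)
PRICE_DMSO = 4.0    # $ per kg, dimethyl sulfoxide (hypothetical)

pure_il_cost = 1.0 * PRICE_IL                    # baseline: neat ionic liquid
blend_cost = 0.2 * PRICE_IL + 0.8 * PRICE_DMSO   # 1 part IL : 4 parts DMSO

reduction = 1.0 - blend_cost / pure_il_cost
print(f"solvent cost reduction: {reduction:.0%}")  # ~77% with these prices
```

The point of the calculation is that once four-fifths of the solvent is a far cheaper co-solvent, the blend's cost is dominated by the small ionic-liquid fraction.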

The authors acknowledge funding from the Australian Research Council Research Hub for Future Fibres.

Credit: 
American Chemical Society