
New study reveals areas of brain where recognition and identification occur

image: Map of the brain surface showing regions that preferentially activate during face (blue) and scene (red) identification.

Image: 
Oscar Woolnough, PhD, postdoctoral research fellow in the Department of Neurosurgery at McGovern Medical School at UTHealth in Houston.

Using "sub-millimeter" brain implants, researchers at The University of Texas Health Science Center at Houston (UTHealth) have been able to determine which parts of the brain are linked to facial and scene recognition.

The study was published today in Current Biology.

"The ability to recognize familiar faces and locations is crucial to everyday life," said Nitin Tandon, MD, lead author of the study and professor of neurosurgery at McGovern Medical School at UTHealth. "Identifying someone allows you to communicate with them and know who they are, and having this basic skill helps an individual attach an identity to those around them, making it easier to differentiate the who, what, and where."

Traditionally, the hippocampus and parahippocampal gyri, located in the medial temporal lobe (MTL), have been implicated as the main areas for identification processes. Recently, however, it has become clear that the memory network responsible for identification extends beyond the MTL to include a region deep inside the brain called the medial parietal cortex (MPC).

To better understand how recognition and identification occur, the researchers performed direct intracranial recordings in the MPC and MTL, structures known to be engaged during face and scene identification. In a cohort of 50 participants, a large number for this type of study, the researchers placed stereoelectroencephalography (sEEG) electrodes, used to identify epileptic seizures, in the brain and monitored brain activity during several tests. The procedure is minimally invasive, involving the insertion of fine probes through the skull.

During these tests, researchers would show patients around 300 photos of celebrity faces and famous landmarks to determine whether or not they could name what they were seeing.

"One of the things that we were able to determine was that the MPC has specific regions involved in face and scene recognition," said Oscar Woolnough, PhD, first author and postdoctoral research fellow in the Department of Neurosurgery at McGovern Medical School at UTHealth. "The MPC was preferentially activated when the patients recognized the people and places, exactly the same as traditional memory regions in MTL. We were also able to see how the MPC and MTL work together to help a person recognize faces and places."

Research shows that the part of the parietal lobe where the MPC is located begins to deteriorate early in patients with Alzheimer's disease. Identifying it as one of the regions that support memory and recognition shows researchers that areas outside the traditional memory region, namely the hippocampus, are important for understanding how brain abnormalities like Alzheimer's affect a person. Tandon hopes this study can also help drive future advances against other diseases.

"We are making many advances in understanding these very basic processes in the brain," he said. "This will give us an opportunity to create devices and other technologies that will target abnormalities of brain processes in the future. So, in essence, what we are doing is creating an understanding of the software by which the brain operates in performing these basic functions. As technology advances, we hope to be able to implement such software, if you will, to create solutions for abnormalities of brain function. In this case specifically, the inability to recognize someone or retrieve their name."

Credit: 
University of Texas Health Science Center at Houston

Ancient DNA provides new insights into the early peopling of the Caribbean

image: Illustration of one of the early settlers in the Caribbean.

Image: 
Tom Björklund.

According to a new study by an international team of researchers from the Caribbean, Europe and North America, the Caribbean was settled by several successive population dispersals that originated on the American mainland.

The Caribbean was one of the last regions of the Americas to be settled by humans. Now, a new study published in the journal Science sheds new light on how the islands were settled thousands of years ago.

Using ancient DNA, a team of archaeologists and geneticists led by researchers from the University of Copenhagen and the Max Planck Institute for the Science of Human History found evidence of at least three population dispersals that brought people to the region.

"The new data give us a fascinating glimpse of the early migration history of the Caribbean. We find evidence that the islands were settled and resettled several times from different parts of the American mainland", says Hannes Schroeder, Associate Professor at the Globe Institute, University of Copenhagen, and one of the senior authors of the study.

More data, more details

The researchers analysed the genomes of 93 ancient Caribbean islanders who lived between 400 and 3200 years ago using bone fragments excavated by Caribbean archaeologists from 16 archaeological sites across the region.

Due to the region's warm climate, the DNA from the samples was not very well preserved. But using so-called targeted enrichment techniques, the researchers managed to extract enough information from the remains.

"These methods allowed us to increase the number of ancient genome sequences from the Caribbean by almost two orders of magnitude and with all that data we are able to paint a very detailed picture of the early migration history of the Caribbean," says Johannes Krause, Director of the Max Planck Institute for the Science of Human History, and another senior author of the study.

The researchers' findings indicate that there have been at least three different population dispersals into the region: two earlier dispersals into the western Caribbean, one of which seems to be linked to earlier population dispersals in North America, and a third, more recent "wave", which originated in South America.

Connections across the Caribbean Sea

Although it is still not entirely clear how the early settlers reached the islands, there is growing archaeological evidence indicating that, far from being a barrier, the Caribbean Sea served as a kind of 'aquatic highway' that connected the islands with the mainland and each other.

"Big bodies of water are traditionally considered barriers for humans, and ancient fisher-hunter-gatherer communities are usually not perceived as great seafarers. Our results continue to challenge that view, as they suggest that there was repeated interaction between the islands and the mainland," says Kathrin Nägele, PhD student at the Max Planck Institute for the Science of Human History and one of the first authors of the study.

Biological and Cultural Diversity in the Ancient Caribbean

"The new data support our previous observations that the early settlers of the Caribbean were biologically and culturally diverse, adding resolution to this ancient period of our history", says Yadira Chinique de Armas, Assistant Professor in Bioanthropology at the University of Winnipeg who currently co-directs three large scale excavations in Cuba as part of the SSHRC project.

The researchers found genetic differences between the early settlers and the newcomers from South America who, according to archaeological evidence, entered the region around 2800 years ago.

"Although the different groups were present in the Caribbean at the same time, we found surprisingly little evidence of admixture between them", adds Cosimo Posth, group leader at the Max Planck Institute for the Science of Human History and joint-first author of the study.

"The results of this study provide yet another layer of data that highlights the diverse and complex nature of pre-Columbian Caribbean societies and their connections to the American mainland prior to the colonial invasion," says Corinne Hofman, Professor of Archaeology at Leiden University and PI of the ERC Synergy project NEXUS1492.

"Genetic data provide a new depth to our findings," agrees Mirjana Roksandic, Professor at the University of Winnipeg and the PI on the SSHRC project.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Future of the western North Pacific Subtropical High: Weaker or stronger?

image: The circulation controlling the location of the Mei-yu front and the tracks of tropical storms. A new study reports that the western North Pacific Subtropical High will intensify in the future.

Image: 
Xiaolong Chen

The western North Pacific Subtropical High (WNPSH) is a key atmospheric circulation system that strongly influences weather and climate over East and Southeast Asia. It determines the strength and position of the Mei-yu (or Baiu/Changma) front and the tracks of typhoons and western Pacific tropical cyclones. How it will change in the future concerns the livelihoods of many millions of people, yet the answer from state-of-the-art climate models is currently ambiguous: the 35 models participating in the fifth phase of the Coupled Model Intercomparison Project (CMIP5) cannot agree even on the sign of the future change.

A new study published in Nature Communications this week, led by the Institute of Atmospheric Physics, Chinese Academy of Sciences (IAP/CAS) in collaboration with the Met Office Hadley Centre in the UK and Nanjing University, has found that these uncertainties mainly result from systematic biases in how individual models simulate historical sea surface temperature. After correction against observed sea surface temperatures, and under the Representative Concentration Pathway (RCP) 8.5 high greenhouse gas emission scenario, the models tend to agree on a future intensification of the WNPSH, with 45% of the uncertainty removed. This implies a stronger East Asian summer monsoon with increased rainfall but fewer typhoon landfalls over East Asia. It could also imply an increased risk of heatwaves in southern and eastern China.
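The logic of this kind of bias correction resembles an emergent-constraint approach: if each model's projected WNPSH change correlates with its historical sea surface temperature bias, regressing out that relationship and evaluating it at zero bias (i.e. at the observed SST) removes the bias-driven part of the inter-model spread. A minimal sketch with purely synthetic numbers (the linear relationship, the 35-model toy ensemble, and all values below are assumptions for illustration, not the paper's data or method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration only -- these numbers are NOT from the paper.
# Assume each model's projected WNPSH change depends partly on its
# historical sea surface temperature (SST) bias.
n_models = 35
sst_bias = rng.normal(0.0, 0.5, n_models)          # K, bias vs. observed SST
forced_response = 1.0                              # assumed common signal
projection = forced_response - 1.5 * sst_bias + rng.normal(0.0, 0.3, n_models)

# Emergent-constraint-style correction: regress the projections on the
# bias, then remove the bias-driven component, which amounts to
# evaluating each model at zero bias (the observed SST).
slope, intercept = np.polyfit(sst_bias, projection, 1)
corrected = projection - slope * sst_bias

raw_spread = projection.std()
corrected_spread = corrected.std()
print(f"constrained central estimate: {intercept:.2f}")
print(f"spread before: {raw_spread:.2f}, after: {corrected_spread:.2f}")
```

The sketch only shows why regressing out a shared bias shrinks inter-model spread; in the study itself the corrected ensemble converges on an intensification of the WNPSH.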

"Uncertainties with climate models are here to stay, even though models have progressively improved. How to get the best information for climate adaptation and decision-makers from currently available model projections is an important research topic," said the lead author, Dr. Xiaolong Chen of the Institute of Atmospheric Physics in Beijing.

This work was jointly funded by an international programme of the Chinese Academy of Sciences and the Newton Fund from the UK under CSSP-China. Dr Peili Wu, a co-author from the Met Office Hadley Centre, said: "Given the importance of the WNPSH, this is an important step in the right direction. As the observed climate is only one realization of many possibilities, some uncertainties will remain." Dr. Tianjun Zhou, executive director of the institute and a co-author, pointed out: "The link with future changes of the East Asian summer monsoon and western Pacific tropical cyclones needs further investigation."

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Australia's ancient geology controls the pathways of modern earthquakes

image: Surface rupture trace from the 2016 Petermann Ranges earthquake.

Image: 
Dr Dan Clark, Geoscience Australia

Seismological and geological studies led by University of Melbourne researchers show the 2016 magnitude 6.0 Petermann earthquake produced a landscape-shifting 21 km surface rupture. The dimensions and slip of the fault plane were guided by zones of weak rocks that formed more than 500 million years ago.

The unusually long and smooth rupture produced by this earthquake initially puzzled scientists as Australia's typically strong ancient cratons tend to host shorter and rougher earthquakes with greater displacements at this magnitude.

"We found that in regions where weaker rocks are present, earthquakes may rupture faults under low friction," said University of Melbourne Research Fellow, Dr Januka Attanayake.

"This means that structural properties of rocks obtained from geologic mapping can help us to forecast the possible geometry and slip distributions of future earthquakes, which ultimately allow us to better understand the seismic hazard posed by our many potentially active faults.

"Australia regularly incurs earthquakes of this magnitude that could, if located close to our urban centres, cause catastrophic damage similar to that of the fatal 2011 magnitude 6.2 Christchurch earthquake in New Zealand. Luckily, most of these earthquakes in Australia have occurred in remote areas."

The Petermann Ranges, extending 320 km from east central Western Australia to the southwest corner of the Northern Territory, began forming about 600 million years ago during an intracontinental mountain-building event termed the Petermann Orogeny.

Dr Attanayake said seismic and geologic data collected in the near-field investigation of the Petermann earthquake four years ago, by a research team comprising Dr Tamarah King, Associate Professor Mark Quigley, Gary Gibson and Abe Jones in the School of Earth Sciences, helped determine that weak rock layers embedded in the strong crust may have played a role in setting off the rare earthquake.

Despite a major desert storm severely hampering field work, the geologists scoured the land for evidence of a surface rupture, both on foot and using a drone, eventually locating it two weeks into their field work. As a result, the researchers were able to map in detail the deformation along a 21-kilometre-long surface rupture trace, where the ground had been uplifted by a maximum vertical displacement of one metre.

Seismologists rapidly deployed broadband seismometers to detect and locate aftershocks that provide independent information to estimate the geometry of the fault plane that ruptured.

Dr Attanayake said: "The Petermann earthquake is a rare example where we've been able to link earthquakes with pre-existing geologic structure by combining seismological modelling and geological field mapping.

"With this insight about what caused Central Australia's old, strong, and cold cratonic crust to break and produce this significant earthquake, seismic and geologic data might help us infer possible geometries of fault planes present beneath our urban centres and forecast seismic hazard."

Credit: 
University of Melbourne

Discovery of a novel gene involved in DNA damage repair and male fertility

image: Normally, DNA breaks are introduced at the beginning of meiotic recombination. However, such DNA breaks are a threat to a cell and must be repaired immediately. The newly identified C19ORF57 gene mediates binding of BRCA2 to damaged DNA sites for repair.

Image: 
Dr. Kei-ichiro Ishiguro

A research group from the Institute of Molecular Embryology and Genetics (IMEG) at Kumamoto University, Japan has discovered that the gene C19ORF57 plays a critical role in meiosis. The gene appears to be related to the cause of male infertility and could be a big step forward for reproductive medicine.

Meiosis is a specialized type of cell division that generates sperm or eggs. During meiosis, genetic information is exchanged between maternal and paternal chromosomes through meiotic recombination. This process introduces genetic differences into the next generation.

Normally, meiotic recombination is initiated by introducing breaks in the DNA. These breaks are themselves DNA damage and a threat to the cell: although they are a normal and necessary trigger for meiotic recombination, they must be repaired immediately. In this study, Drs. Ishiguro and Takemoto discovered a novel gene that plays a crucial role in repairing DNA damage during meiotic recombination.

Previously, the same group discovered the MEIOSIN gene, which acts as a switch that turns on meiosis along with hundreds of other genes in the process. However, the functions of most of those genes have not yet been fully elucidated. C19ORF57 is one of the genes controlled by MEIOSIN, and its function was unknown until now.

The researchers set out to clarify the role of C19ORF57 in meiosis. Using mass spectrometry, the group found that C19ORF57 binds the breast cancer suppressor BRCA2, a protein known to play a role in repairing damaged DNA. This suggests that C19ORF57 and BRCA2 function together in germ cells.

Further evidence showing cooperation between C19ORF57 and BRCA2 was found through microscopic imaging. The researchers discovered that C19ORF57 goes first to damaged DNA sites and then recruits BRCA2 to the same position on the chromosomes.

Using genome editing technology to artificially inhibit the C19orf57 gene in mice, researchers found that the male animals became infertile because meiotic recombination did not complete and sperm were not produced. Further analysis of male gonads revealed that the gene plays an essential role in repairing damaged DNA.

There are many unknown causes of human male infertility, and this finding potentially reveals a new pathology. Although these experiments were performed in animal models, the C19ORF57 gene is also present in humans. Therapies and diagnostics developed from this research could help ensure the quality of meiosis and reduce the incidence of related complications.

Credit: 
Kumamoto University

Metasurface opens world of polarization

image: An SEM image of the device shows the irregular nanostructures created during the inverse design process.

Image: 
(Image courtesy of Zhujun Shi/Harvard SEAS)

Polarization, the direction in which light vibrates, is invisible to the human eye. Yet, so much of our optical world relies on the control and manipulation of this hidden quality of light.

Materials that can manipulate the polarization of light -- known as birefringent materials -- are used in everything from digital alarm clocks to medical diagnostics, communications and astronomy.

Just as light's polarization can vibrate along a straight line or an ellipse, materials can also be linearly or elliptically birefringent. Today, most birefringent materials are intrinsically linear, meaning they can only manipulate the polarization of light in a limited way. If you want to achieve broad polarization manipulation, you need to stack multiple birefringent materials on top of one another, making these devices bulky and inefficient.
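In Jones calculus, a birefringent element is a 2x2 matrix acting on the light's two field components, and "linear" birefringence means the element's eigen-polarizations are linear. The sketch below (a standard textbook illustration, not a model of the Harvard device) shows why stacking works: a single linear retarder oriented at 45 degrees already turns linear input into circular output, and chaining several such matrices is exactly the "multiple birefringent materials on top of one another" the article describes.

```python
import numpy as np

def linear_retarder(delta, theta):
    """Jones matrix of a linear waveplate: retardance delta, fast axis at theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])                      # rotation to the fast axis
    W = np.array([[np.exp(-1j * delta / 2), 0],
                  [0, np.exp(1j * delta / 2)]])          # phase delay between axes
    return R @ W @ R.T

# A quarter-wave plate (delta = pi/2) with its fast axis at 45 degrees
# converts horizontal linear polarization into circular polarization.
qwp = linear_retarder(np.pi / 2, np.pi / 4)
out = qwp @ np.array([1.0, 0.0])

# Equal amplitudes and a 90-degree phase difference -> circular light.
print(np.abs(out), np.angle(out[1]) - np.angle(out[0]))
```

Stacking is just matrix multiplication (`plate3 @ plate2 @ plate1`); the metasurface in the article collapses such a product into one element whose effective birefringence is tunable.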

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences have designed a metasurface that can be continuously tuned from linear to elliptical birefringence, opening up the entire space of polarization control with just one device. This single metasurface can operate as many birefringent materials in parallel, enabling more compact polarization manipulation, which could have far-reaching applications in polarization imaging, quantum optics, and other areas.

The research is published in Science Advances.

"It is a new type of birefringent material," said Zhujun Shi, a former graduate student at SEAS and first author of the paper. "We are able to tailor broad polarization behavior of a material beyond what naturally exists, which has a lot of practical benefits. What used to require three separate conventional birefringent components now only takes one."

"The ability to manipulate a fundamental property of light like polarization in completely new ways with a device that is compact and multifunctional will have important applications for quantum optics and optical communications," said Federico Capasso, Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS and senior author of the paper.

Metasurfaces are arrays of nanopillars spaced less than a wavelength apart that can perform a range of tasks, including manipulating the phase, amplitude and polarization of light. In the past, Capasso and his team have designed these highly ordered surfaces from the ground up, using simple geometric shapes with only a few design parameters.

In this research, however, the team turned to a new type of design technique known as topological optimization.

"Topological optimization is an inverse approach," said Shi. "You start with what you want the metasurface to do and then you allow the algorithm to explore the huge parameter space to develop a pattern that can best deliver that function."
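As a toy illustration of that inverse workflow (a hypothetical stand-in, not the team's actual solver, objective, or algorithm), the loop below starts from a random design vector and keeps any random perturbation that improves a figure of merit:

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder "physics": maps design parameters to an optical response.
# A real inverse design would call a full electromagnetic solver here.
target = np.array([0.2, 0.8, 0.5])       # desired response (arbitrary)

def figure_of_merit(params):
    response = np.tanh(params)            # stand-in for the simulation
    return -np.sum((response - target) ** 2)   # higher (closer to 0) is better

# Inverse approach via simple hill climbing: propose random tweaks to the
# design and keep those that improve the figure of merit.
params = rng.normal(size=3)
best = figure_of_merit(params)
for _ in range(2000):
    trial = params + rng.normal(scale=0.05, size=3)
    fom = figure_of_merit(trial)
    if fom > best:
        params, best = trial, fom

print(f"final figure of merit: {best:.4f}")
```

A real topology-optimization run replaces the placeholder physics with an electromagnetic simulation over a dense pixel grid and typically uses gradients from an adjoint solve rather than random perturbations, which is what lets it explore the "huge parameter space" Shi describes.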

The result was surprising. Instead of neatly ordered rectangular pillars standing like toy soldiers, this metasurface is composed of nested half circles reminiscent of crooked smiley faces -- more like something a toddler would draw than a computer.

But these odd shapes have opened up a whole new world of birefringence. Not only can they achieve broad polarization manipulations, like transforming linear polarization into any desired elliptical polarization, but the polarization can also be tuned by changing the angle of the incoming light.

"Our approach has a wide range of potential applications across industry and scientific research, including polarization aberration correction in advanced optical systems," said Capasso.

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Tumors disrupt the immune system throughout the body

Cancer treatment has advanced with the advent of immunotherapies that, in some cancers, can overcome tumors' ability to evade the immune system by suppressing local immune responses. But a new study in mice by UC San Francisco researchers has found that, depending on a cancer's tissue of origin, tumors cause widespread and variable disruption of the immune system throughout the body, not just at the primary tumor site.

Greater success for immunotherapy regimens will rely on taking these different patterns of immune system disruption into account, they said, and findings from the new study, published online in Nature Medicine on May 25, 2020, are already being investigated in the clinic.

"Different cancers do different things to change the systemic immune system, and immunotherapies that help the patient's immune system attack cancer may work best when they trigger lasting immune responses throughout the body," said the study's principal investigator, Matthew Spitzer, PhD, an assistant professor of otolaryngology and member of the UCSF Helen Diller Family Comprehensive Cancer Center.

Spitzer's lab team, including the study's lead authors, Breanna Allen and Kamir Hiam, both UCSF graduate students, determined the abundance and activity of different types of peripheral immune cells -- sampled from blood, bone marrow, spleen and lymph nodes near untreated tumors -- in mice with different types of cancer, including brain, colon, pancreatic, skin (melanoma) and breast cancer. They used mass cytometry, a recently refined technique which relies on unique metallic molecular markers and mass spectrometry to quickly quantify and identify dozens of cell types in various states of activation.

Spitzer earlier discovered that proliferation of new immune cells originating far from a tumor was required for immunotherapy treatment to be effective. In the new study, his lab team has determined that not only does an untreated cancer change the way the immune response unfolds both locally and at a distance from the tumor, but also that this disruption of the immune system evolves over time. Remarkably, however, the immune system perturbations tracked by the researchers were reversed when the tumors were surgically removed.

Three distinct types of breast cancer examined in the study caused similar patterns of disruption in peripheral immune sites, while tumors originating in other tissues caused distinctly different changes in the relative abundance and activity of different immune cell types. These differences are likely a reflection of both anatomy and physiology, according to Allen.

"Different tissues have different needs and risks when interacting with the immune system," she said. "A site like the breast, which has a lot of fat and a lot of drainage, is going to have a different level of access and interaction with the immune system in comparison to another tissue. Even in the brain, typically viewed as a protected compartment that excludes most immune cells, we found that localized tumors had effects on the immune system, even in the periphery of the body, although the response we saw was distinct from what we observed with the breast cancers."

To assess whether cancers have similar effects on the human immune system, the researchers also analyzed publicly available data on immune markers in the blood of human breast cancer patients and compared them to data from healthy individuals. They found that cancer patients showed indicators of an altered immune system that were consistent with data from the new mouse study, suggesting the findings may have direct applications to improving human immunotherapies.

Weakened Immune Defenses to Infection

While different tumor types in the study had different effects on the immune system, a common feature identified by the researchers was diminishment of the immune system's capacity to mount a new immune response, an important consideration for fighting infection as well as cancer.

People with cancer are known to have weaker responses to both infection and vaccination, but it has been unclear to what extent this may be due to immunosuppressive effects of treatment rather than the cancer itself. The new UCSF study bolsters the evidence that cancer, before any treatment, can weaken the immune system's response to infection: the researchers found that mice with cancer had weakened immune responses to both viral and bacterial infection.

Cancer immunotherapy is most effective in patients whose immune systems are already mounting an immune response; the treatment needs to be able to stimulate preexisting immune system cells, especially "killer" T cells, in order to boost their ability to effectively attack tumor cells. However, the new research suggests that many tumors may render these treatments less effective by systemically reducing the number of immune cells available to be stimulated. "Our results demonstrate an unappreciated impairment of new cellular immune responses in the context of cancer," Spitzer said.

Tumor growth in the study was linked to reduced activation of immune cells known as antigen-presenting cells, a step that must occur in order for new T cells to become activated. Antigen-presenting cells grab onto a foreign target molecule, or antigen, and display it to other cells of the immune system, including T cells. The cells that detect the antigen target are thereby primed to expand their ranks and to attack any tumor or infectious pathogen that displays the same antigen.

"Our study suggests that the antigen presenting cells may be significantly functionally altered in cancer patients, and that this alteration compromises immune responses," Hiam said.

The researchers determined that poor functioning of antigen-presenting cells in mice with cancer was responsible for the weakened response to infection. They were able to boost antigen-presenting cell activation and the immune response to infection by treating the mice with so-called "co-stimulatory molecules," which normally are made by the immune system.

"Going forward we see a time when cancer patients would receive a different formulation of the flu vaccine, for example, that a healthy person would not require, one that would activate antigen-presenting cells to produce a good immune response," Spitzer said.

"Our hope for the future is that results from this study will allow us to treat more patients with more effective immunotherapies that don't just target T-cells, but which also consider the context in which those T cells are residing, and the other types of cells they need to communicate with in order to become properly activated and to reject a tumor," Spitzer said. Spitzer is collaborating with oncologists on clinical trials to explore treatments to re-activate antigen-presenting cells, including a phase II trial to treat pancreatic cancer.

Credit: 
University of California - San Francisco

Why developing nerve cells can take a wrong turn

A group of scientists from CECAD, the Cluster of Excellence 'Cellular Stress Responses in Aging-Associated Diseases,' has found a mechanism that may explain some neurodevelopmental diseases: loss of a certain enzyme, UBE2K, impedes the differentiation of stem cells by silencing the expression of genes important for neuronal differentiation and, therefore, the development and generation of neurons. More specifically, UBE2K regulates the levels and modification of histones, key proteins that pack and organize the DNA and thereby regulate gene expression. Because histone changes are part of the cell's epigenetic landscape, they are reversible and could offer an avenue for future treatments of neurodevelopmental diseases. The study is available in the current issue of Communications Biology.

Embryonic stem cells (ESCs) can replicate indefinitely while retaining their potential to differentiate into all other cell types; in a developing organism, they give rise to nerve cells (neurons), muscle cells, and all the other cells of the body. Errors during this process can lead to congenital diseases. Degrading damaged proteins is an important part of this process, so the scientists studied the interaction between the proteasome, the cell's main machinery for degrading proteins, and the epigenetic landscape. The epigenetic landscape comprises heritable changes to an organism that are not encoded in the DNA sequence itself but arise through changes to chromatin, which can organize and silence DNA by packing it more tightly.

Azra Fatima from CECAD studied histone dynamics in immortal human embryonic stem cells (hESCs), which have a unique chromatin architecture and especially low levels of a modified form of histone H3 that carries a chemical addition of three methyl groups (H3K9me3).

Histones are proteins that form part of the chromatin in cell nuclei. They build spools around which the DNA winds, shortening it by a ratio of about 1:10 million. They also help regulate gene expression, the process by which genes produce proteins in the organism. In addition, they play an important role in cellular differentiation, in which a cell, for example an embryonic stem cell, changes into another cell type with a higher degree of specialization.

They found that embryonic stem cells exhibit high expression of UBE2K (ubiquitin-conjugating enzyme E2 K). Such enzymes are known to be important in protein degradation. Loss of the enzyme in embryonic stem cells increased the levels of the H3K9 trimethyltransferase SETDB1, resulting in higher trimethylation of H3K9 and, in turn, repression of neurogenic genes during the differentiation of the stem cells. As a result, the loss of UBE2K impaired the ability of the stem cells to differentiate into neural progenitors, a type of precursor cell that generates neurons and other cells of the nervous system.

Besides H3K9 trimethylation, the scientists found that UBE2K binds histone H3 to induce its polyubiquitination and degradation by the 26S proteasome. Notably, ubc-20, the worm orthologue of UBE2K, also regulates both histone H3 levels and H3K9 trimethylation in the germ cells of the model organism C. elegans. 'Our results indicate that UBE2K crosses evolutionary boundaries to promote histone H3 degradation and reduce H3K9me3 repressive marks in immortal cells like human embryonic stem cells and germline cells,' says Fatima.

"We found a link between the ubiquitin-proteasome system and epigenetic regulation in immortal stem cells," Fatima concluded. "It would also be interesting to see whether UBE2K regulates the epigenetic state in other cell types, such as cancer cells." David Vilchez, the corresponding author of the manuscript, added: "We believe that our findings can have important implications for understanding the development of the human brain."

By precisely regulating UBE2K levels, it may be possible to shape cell-type-specific epigenetic landscapes. Several diseases, such as Huntington's disease, are associated with alterations in epigenetic marks. Since epigenetic marks are reversible, it will be interesting to study whether the epigenetic status of pluripotent stem cells from patients can be modulated by controlling the proteasome system and UBE2K. Novel strategies could then be designed to correct epigenetic alterations at early developmental stages, providing a potential treatment for such diseases.

Credit: 
University of Cologne

Exploring the neurological impact of air pollution

Air pollution has become a fact of modern life, with a majority of the global population facing chronic exposure. Although the impact of inhaling polluted air on the lungs is well known, scientists are just now beginning to understand how it affects the brain. A new article in Chemical & Engineering News, the weekly newsmagazine of the American Chemical Society, details how researchers are connecting air pollution to dementia, autism and other neurological diseases.

Arising from vehicle emissions, power plants and factories, air pollution is a complex soup of gases, metals, organic contaminants and other materials. Over 90% of the world's population is continually exposed, at levels above the World Health Organization's guidelines, to particulate matter (PM) pollution, which is known to penetrate deep into the lungs, writes Contributing Editor Janet Pelley. Inhaling these substances causes inflammation, the body's healthy response to injury or infection; over time, however, chronic inflammation can damage healthy tissues.

Although the correlation between PM and lung damage is clear, scientists believe that these harmful particles can also impact the brain, either directly or indirectly. In a recent study, infant mice exposed to air pollution showed altered social behaviors similar to those of autistic children. Postmortem observations revealed inflammation and other abnormalities in the mice's brains resembling changes seen in children with autism. Researchers suspect that iron particles in PM could play a role, as they are known to cause cell death in Parkinson's and Alzheimer's diseases. In mice, inflammation caused by breathing polluted air also appears to boost the production of amyloid plaques, the sticky protein fragments associated with neurological diseases like Alzheimer's. While evidence is mounting that air pollution can pose a serious threat to brain health, scientists emphasize that their research must coincide with policy changes to reduce pollution worldwide. 

Credit: 
American Chemical Society

Here be methane: Skoltech scientists investigate the origins of a gaping permafrost crater

image: Researchers from Skoltech and their colleagues spent more than two years studying a 20-meter wide and 20-meter deep crater in the Yamal Peninsula in northern Russia that formed after an explosive release of gas, mostly methane, from the permafrost. They were able to deduce potential formation models for the discovered crater that has implications for geocryology and climate change studies.

Image: 
Anton Sinitsky / Arctic Research Center of the Yamal-Nenets Autonomous District

Researchers from Skoltech and their colleagues spent more than two years studying a 20-meter wide and 20-meter deep crater in the Yamal Peninsula in northern Russia that formed after an explosive release of gas, mostly methane, from the permafrost. They were able to deduce potential formation models for the discovered crater that has implications for geocryology and climate change studies.

Permafrost, which underlies about two thirds of Russia's territory, is a huge natural reservoir of methane, a potent greenhouse gas. As the Arctic warms and permafrost degrades due to climate change, scientists are concerned that this methane may start leaking into the atmosphere in massive amounts, further exacerbating global warming.

Right now, methane is already quietly seeping from underground in the Arctic, but sometimes the process is far more dramatic: a giant 40-meter-wide, alien-looking crater, dubbed the "Yamal Crater", captured everyone's imagination in 2014 when it was found just 42 kilometers from the Bovanenkovo gas field. Explosive events like these leave impressive "scars", but scientists are still not sure where the gas that causes them comes from.

"Arctic craters are relatively rare phenomena that mostly occur in the remote tundra. The frost heaving that precedes a crater usually happens quite quickly, over one to two years, and this sudden growth is hard to observe, so almost all craters were discovered after everything had already happened. We have only piecemeal evidence from locals who say they heard a noise or saw smoke and flames. Plus, a crater turns into a lake in another one to two years, which is then hard to distinguish from common thermokarst lakes in the Arctic," says Evgeny Chuvilin of the Skoltech Center for Hydrocarbon Recovery, the paper's first author.

The Skoltech team decided to study the Erkuta gas-emission crater, accidentally discovered in the summer of 2017 in the floodplain of the Erkuta-Yakha River on the Yamal Peninsula by biologists interested in falcon nesting places in the area. According to Dr Chuvilin, the Skoltech team was lucky to reach the much less famous Erkuta crater during its first year -- just one year before it, too, turned into a lake. Theirs is thus probably the only team in the world that got to look into the origins of the Erkuta crater.

The researchers took samples of permafrost soil, ground ice and water from the rim of the crater during a field trip in December 2017 and conducted drone observations six months later. They found that the strongly negative δ13C (a measure of the ratio of stable carbon isotopes 13C to 12C) of methane from ground ice samples was characteristic of biogenic hydrocarbons, yet the ratio of methane to the total amount of its homologs, ethane and propane, pointed to a deeper thermogenic source.
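The δ13C measure mentioned above follows the standard per-mil convention of isotope geochemistry: the sample's 13C/12C ratio is compared to a reference standard. A minimal sketch, in which the reference ratio and sample values are illustrative rather than taken from the study:

```python
# delta-13C: per-mil (per thousand) deviation of a sample's 13C/12C ratio
# from a reference standard. The reference value below is an approximate,
# commonly cited Vienna Pee Dee Belemnite (VPDB) ratio; treat it and the
# sample value as illustrative.

R_VPDB = 0.011237  # approximate reference 13C/12C ratio


def delta13c(r_sample: float, r_standard: float = R_VPDB) -> float:
    """Return delta-13C in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0


# A strongly negative delta-13C (biogenic methane is often far below -50)
# corresponds to a sample depleted in 13C relative to the standard:
r_biogenic = R_VPDB * (1 - 70 / 1000)      # ratio implied by -70 per mil
print(round(delta13c(r_biogenic), 1))       # -70.0
print(delta13c(R_VPDB))                     # 0.0 (sample matches standard)
```

The sign convention is the key point: the more negative the δ13C, the more the sample is depleted in the heavy isotope, which is the signature the researchers used to identify a biogenic component in the ground-ice methane.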

Based on these observations, the scientists built a model for the formation of the crater, which "matured" in one of the dried-up lakes that formed from an oxbow lake, a former paleo-channel of the Erkuta-Yakha River. This lake probably had an under-lake talik -- a zone of unfrozen soil -- that started freezing gradually after the lake dried out, building up the stress that was ultimately released in a powerful explosion.

"Cryovolcanism, as some researchers call it, is a very poorly studied and described process in the cryosphere, an explosion involving rocks, ice, water and gases that leaves behind a crater. It is a potential threat to human activity in the Arctic, and we need to thoroughly study how gases, especially methane, are accumulated in the top layers of the permafrost and which conditions can cause the situation to go extreme. These methane emissions also contribute to the rising concentrations of greenhouse gases in the atmosphere, and climate change itself might be a factor in increasing cryovolcanism. But this is still something that needs to be researched," Chuvilin notes.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

New technique takes 3D imaging an octave higher

image: Gabriel Popescu and his colleagues have developed a new 3D imaging technique to visualize biological samples.

Image: 
Courtesy Gabriel Popescu

A collaboration between researchers at the University of Illinois at Urbana-Champaign and Colorado State University resulted in a new 3D imaging technique called harmonic optical tomography (HOT) that facilitates the visualization of tissues and other biological samples on a microscopic scale.

The technique can potentially be used to assist with diagnosing cancer and other diseases. It is based on holographic information, which records light patterns, to generate 3D images of a sample. Three-dimensional imaging that can peer into the interior of an object provides critical information for a diverse range of applications, from medical diagnostics to finding cracks in oil wells and airplane wings using tomographic X-ray and ultrasound methods.

The paper "Harmonic optical tomography of nonlinear structures" was published in Nature Photonics.

"Our lab specializes in using holographic data to investigate live cells and tissues," said Gabriel Popescu, a professor of electrical and computer engineering and the director of the Quantitative Light Imaging Laboratory at the Beckman Institute for Advanced Science and Technology. "We wanted to extend this technique to nonlinear samples by combining the holographic data and new physics models."

"This work started out as an interesting theoretical project I worked on with Popescu as a part of his graduate level microscopy course in my first year of grad school. I am excited to see it mature into a functioning experimental prototype," said Varun Kelkar, a graduate student who is a member of Mark Anastasio's Computational Imaging Science Lab.

The researchers developed theoretical models describing how to image the tissue and discovered a unique capability for 3D imaging that arises, counterintuitively, from illuminating the sample with blurry, out-of-focus laser light. The team designed and built a new system at Colorado State University to collect data, which was then reconstructed with computational imaging algorithms. The experiments confirmed the theoretical predictions, demonstrating an entirely new form of optical tomography.

"A key to the experimental demonstration of this new tomographic imaging was a custom, high-power laser, which was designed and built by CSU graduate student Keith Wernsing," said Randy Bartels, a CSU professor of electrical and computer engineering and a co-author of the paper.

The researchers used two types of samples to test their theory, said Chenfei Hu, a graduate student in the Popescu group. "The first was a manufactured crystal that is typically used for generating nonlinear signals. The second was a biological sample where we used a muscle tissue."

"This new type of tomographic imaging could prove to be very valuable for a wide range of studies that currently rely on two-dimensional images to understand collagen fiber orientation, which has been used as a reporter for a number of types of cancer," said Dr. Jeff Field, the director of the Microscopy Core Facility at CSU and a Research Scientist in electrical engineering.

"Unlike typical laser-scanning microscopes, an additional benefit of HOT is that its speed makes it much less vulnerable to vibrations and unwanted microscope drift, which leads to sharper images and increased repeatability," said study co-author Kimani Toussaint, a former professor in the College of Engineering at Illinois and now professor in the School of Engineering at Brown University.

Credit: 
Beckman Institute for Advanced Science and Technology

Princeton team develops 'poisoned arrow' to defeat antibiotic-resistant bacteria

image: A team of Princeton researchers led by Prof. Zemer Gitai have found an antibiotic that can simultaneously puncture bacterial walls and destroy folate within their cells -- taking out even monstrous bacteria with the effectiveness of a poisoned arrow -- while proving immune to antibiotic resistance.

Image: 
Matilda Luk, Princeton University Office of Communications

Poison is lethal all on its own -- as are arrows -- but their combination is greater than the sum of their parts. A weapon that simultaneously attacks from within and without can take down even the strongest opponents, from E. coli to MRSA (methicillin resistant Staphylococcus aureus).

A team of Princeton researchers reported today in the journal Cell that they have found a compound, SCH-79797, that can simultaneously puncture bacterial walls and destroy folate within their cells -- while being immune to antibiotic resistance.

Bacterial infections come in two flavors -- Gram-positive and Gram-negative -- named for the scientist who discovered how to distinguish them. The key difference is that Gram-negative bacteria are armored with an outer layer that shrugs off most antibiotics. In fact, no new classes of Gram-negative-killing drugs have come to market in nearly 30 years.

"This is the first antibiotic that can target Gram-positives and Gram-negatives without resistance," said Zemer Gitai, Princeton's Edwin Grant Conklin Professor of Biology and the senior author on the paper. "From a 'Why it's useful' perspective, that's the crux. But what we're most excited about as scientists is something we've discovered about how this antibiotic works -- attacking via two different mechanisms within one molecule -- that we are hoping is generalizable, leading to better antibiotics -- and new types of antibiotics -- in the future."

The greatest weakness of antibiotics is that bacteria evolve quickly to resist them, but the Princeton team found that even with extraordinary effort, they were unable to generate any resistance to this compound. "This is really promising, which is why we call the compound's derivatives 'Irresistin,'" Gitai said.

It's the holy grail of antibiotics research: an antibiotic that is effective against diseases and immune to resistance while being safe in humans (unlike rubbing alcohol or bleach, which are irresistibly fatal to human cells and bacterial cells alike).

For an antibiotics researcher, this is like discovering the formula to convert lead to gold, or riding a unicorn -- something everyone wants but no one really believes exists, said James Martin, a 2019 Ph.D. graduate who spent most of his graduate career working on this compound. "My first challenge was convincing the lab that it was true," he said.

But irresistibility is a double-edged sword. Typical antibiotics research involves finding a molecule that can kill bacteria, breeding multiple generations until the bacteria evolve resistance to it, looking at how exactly that resistance operates, and using that to reverse-engineer how the molecule works in the first place.

But since SCH-79797 is irresistible, the researchers had nothing to reverse engineer from.

"This was a real technical feat," said Gitai. "No resistance is a plus from the usage side, but a challenge from the scientific side."

The research team had two huge technical challenges: Trying to prove the negative -- that nothing can resist SCH-79797 -- and then figuring out how the compound works.

To prove its resistance to resistance, Martin tried endless different assays and methods, none of which revealed a particle of resistance to the SCH compound. Finally, he tried brute force: for 25 days, he "serially passaged" it, meaning that he exposed bacteria to the drug over and over and over again. Since bacteria take about 20 minutes per generation, the germs had millions of chances to evolve resistance -- but they didn't. To check their methods, the team also serially passaged other antibiotics (novobiocin, trimethoprim, nisin and gentamicin) and quickly bred resistance to them.

Proving a negative is technically impossible, so the researchers use phrases like "undetectably-low resistance frequencies" and "no detectable resistance," but the upshot is that SCH-79797 is irresistible -- hence the name they gave to its derivative compounds, Irresistin.

They also tried using it against bacterial species that are known for their antibiotic resistance, including Neisseria gonorrhoeae, which is on the top 5 list of urgent threats published by the Centers for Disease Control and Prevention.

"Gonorrhea poses a huge problem with respect to multidrug resistance," said Gitai. "We've run out of drugs for gonorrhea. With most common infections, the old-school generic drugs still work. When I got strep throat two years ago, I was given penicillin-G -- the penicillin discovered in 1928! But for N. gonorrhoeae, the standard strains that are circulating on college campuses are super drug resistant. What used to be the last line of defense, the break-glass-in-case-of-emergency drug for Neisseria, is now the front-line standard of care, and there really is no break-glass backup anymore. That's why this one is a particularly important and exciting one that we could cure."

The researchers even got a sample of the most resistant strain of N. gonorrhoeae from the vaults of the World Health Organization -- a strain that is resistant to every known antibiotic -- and "Joe showed that our guy still killed this strain," Gitai said, referring to Joseph Sheehan, a co-first-author on the paper and the lab manager for the Gitai Lab. "We're pretty excited about that."

The poison-tipped arrow

Without resistance to reverse engineer from, the researchers spent years trying to determine how the molecule kills bacteria, using a huge array of approaches, from classical techniques that have been around since the discovery of penicillin through to cutting-edge technology.

Martin called it the "everything but the kitchen sink" approach, and it eventually revealed that SCH-79797 uses two distinct mechanisms within one molecule, like an arrow coated in poison.

"The arrow has to be sharp to get the poison in, but the poison has to kill on its own, too," said Benjamin Bratton, an associate research scholar in molecular biology and a lecturer in the Lewis Sigler Institute for Integrative Genomics, who is the other co-first-author.

The arrow targets the outer membrane -- piercing through even the thick armor of Gram-negative bacteria -- while the poison shreds folate, a fundamental building block of RNA and DNA. The researchers were surprised to discover that the two mechanisms operate synergistically, combining into more than the sum of their parts.

"If you just take those two halves -- there are commercially available drugs that can attack either of those two pathways -- and you just dump them into the same pot, that doesn't kill as effectively as our molecule, which has them joined together on the same body," Bratton said.

There was one problem: The original SCH-79797 killed human cells and bacterial cells at roughly similar levels, meaning that as a medicine, it ran the risk of killing the patient before it killed the infection. The derivative Irresistin-16 fixed that. It is nearly 1,000 times more potent against bacteria than human cells, making it a promising antibiotic. As a final confirmation, the researchers demonstrated that they could use Irresistin-16 to cure mice infected with N. gonorrhoeae.

New hope

This poisoned arrow paradigm could revolutionize antibiotic development, said KC Huang, a professor of bioengineering and of microbiology and immunology at Stanford University who was not involved in this research.

"The thing that can't be overstated is that antibiotic research has stalled over a period of many decades," Huang said. "It's rare to find a scientific field which is so well studied and yet so in need of a jolt of new energy."

The poisoned arrow, the synergy between two mechanisms of attacking bacteria, "can provide exactly that," said Huang, who was a postdoctoral researcher at Princeton from 2004 to 2008. "This compound is already so useful by itself, but also, people can start designing new compounds that are inspired by this. That's what has made this work so exciting."

In particular, each of the two mechanisms -- the arrow and the poison -- target processes that are present in both bacteria and in mammalian cells. Folate is vital to mammals (which is why pregnant women are told to take folic acid), and of course both bacteria and mammalian cells have membranes. "This gives us a lot of hope, because there's a whole class of targets that people have largely neglected because they thought, 'Oh, I can't target that, because then I would just kill the human as well,'" Gitai said.

"A study like this says that we can go back and revisit what we thought were the limitations on our development of new antibiotics," Huang said. "From a societal point of view, it's fantastic to have new hope for the future."

Credit: 
Princeton University

Rivers help lock carbon from fires into oceans for thousands of years

The extent to which rivers transport burned carbon to oceans - where it can be stored for tens of millennia - is revealed in new research led by the University of East Anglia (UEA).

The study, published today in Nature Communications, calculates how much burned carbon is being flushed out by rivers and locked up in the oceans.

Oceans store a surprising amount of carbon from burned vegetation, for example as a result of wildfires and managed burning. The research team describe it as a natural - if unexpected - quirk of the Earth system.

The international interdisciplinary team, including collaborators from the Universities of Exeter, Swansea, Zurich, Oldenburg and Florida International, studied the amount of dissolved carbon flowing through 78 rivers on every continent except Antarctica.

Lead researcher Dr Matthew Jones, of the Tyndall Centre for Climate Change Research at UEA, said: "Fires leave behind carbon-rich materials, like charcoal and ash, which break down very slowly in soils. We care about this burned carbon because it is essentially 'locked out' of the atmosphere for the distant future - it breaks down to greenhouse gases extremely slowly in comparison to most unburned carbon.

"We know that this burned carbon takes about 10 times longer to break down in the oceans than on land. Rivers are the conveyor belts that shift carbon from the land to the oceans, so they determine how long it takes for burned carbon to break down. So, we set out to estimate how much burned carbon reaches the oceans via rivers."

Based on a large dataset of 409 observations from 78 rivers around the world, the researchers analysed how the burned fraction of dissolved carbon in rivers varies at different latitudes and in different ecosystems. They then upscaled their findings to estimate that 18 million tonnes of dissolved burned carbon are transported annually by rivers. When combined with the burned carbon that is exported with sediments, the estimate rises to 43 million tonnes of burned carbon per year.

Dr Jones said: "We found that a surprising amount - around 12 per cent - of all carbon flowing through rivers comes from burned vegetation.

"While fires emit two billion tonnes of carbon each year, they also leave behind around 250 million tonnes of carbon as burned residues, like charcoal and ash. Around half of the carbon in these residues is in the particularly long-lived form of 'black carbon', and we show that about one-third of all black carbon reaches the oceans.

"This is a good thing because that carbon gets locked up and stored for very long periods - it takes tens of millennia for black carbon to degrade to carbon dioxide in the oceans. By comparison, only about one per cent of carbon taken up by land plants ends up in the ocean.

"With wildfires anticipated to increase in the future because of climate change, we can expect more burned carbon to be flushed out by rivers and locked up in the oceans.

"It's a natural quirk of the Earth system - a moderating 'negative feedback' of the warming climate that could trap some extra carbon in a more fire-prone world."
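The rounded figures quoted above hang together arithmetically. A quick back-of-envelope check, using only the rounded numbers stated in this release (millions of tonnes of carbon per year):

```python
# Consistency check of the burned-carbon figures quoted in the release.
# All inputs are the release's rounded values, in Mt C per year.

fire_residues = 250                  # carbon left behind as charcoal and ash
black_carbon = fire_residues * 0.5   # "around half" is black carbon
bc_to_ocean = black_carbon / 3       # "about one-third" reaches the oceans

# ~41.7 Mt, in line with the 43 Mt/yr total burned-carbon export
# (dissolved plus sediment) quoted earlier in the release:
print(round(bc_to_ocean, 1))

# The 18 Mt of dissolved burned carbon being ~12% of all dissolved carbon
# in rivers implies a total riverine dissolved-carbon flux of about:
print(round(18 / 0.12))  # 150 Mt C/yr (implied, not stated in the release)
```

This is only a sanity check on rounded press-release numbers, not a reproduction of the study's upscaling, which was based on 409 observations across 78 rivers.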

Credit: 
University of East Anglia

Study puts price tag on lost earnings from racial disparities in cancer mortality

A new American Cancer Society study puts a price tag on racial disparities in cancer mortality, finding that $3.2 billion in lost earnings would have been avoided in 2015 if non-Hispanic (NH) blacks had experienced the same years of life lost from cancer deaths and the same earning rates as NH whites. The study appears in JNCI Cancer Spectrum.

Little is known about disparities in economic burden due to premature cancer deaths by race/ethnicity in the United States. To learn more, investigators led by Jingxuan Zhao, MPH, compared person-years of life lost (PYLL) and lost earnings due to premature cancer deaths by race/ethnicity. PYLL was calculated using national cancer death and life expectancy data. That was combined with annual median earnings to generate lost earnings. PYLL and lost earnings were then compared among individuals who died at age 16-84 years due to cancer by racial/ethnic groups: NH white, NH black, NH Asian or Pacific Islander (API), and Hispanic.
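The calculation described above can be sketched in miniature. In this simplified illustration, the ages, remaining life expectancies, and earnings figure are hypothetical placeholders, not the study's data, and a single median wage stands in for the study's age-specific earnings:

```python
# Person-years of life lost (PYLL): for each premature cancer death, the
# remaining life expectancy at the age of death. Lost earnings combine PYLL
# with median earnings. All numbers below are hypothetical.

# (age_at_death, remaining_life_expectancy_at_that_age)
deaths = [(50, 31.2), (62, 21.5), (75, 12.3)]

pyll = sum(remaining for _, remaining in deaths)

median_annual_earnings = 50_000  # hypothetical dollars per year

# Simplification: value every lost year at one median wage. The study uses
# age-specific earnings and restricts to deaths at ages 16-84.
lost_earnings = pyll * median_annual_earnings

print(round(pyll, 1))        # 65.0 person-years of life lost
print(round(lost_earnings))  # 3250000 dollars
```

The study then age-standardizes these quantities (per 100,000 person-years) so that rates can be compared across racial/ethnic groups with different age structures.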

They found that in 2015, age-standardized lost earning rates (per 100,000 person-years) were $34.9 million for NH whites, $43.5 million for NH blacks, $22.2 million for APIs, and $24.5 million for Hispanics. NH blacks had higher age-standardized PYLL and lost earning rates than NH whites for 13 out of 19 cancer sites studied.

"If age-specific PYLL and lost earning rates for NH blacks were the same as those of NH whites, 241,334 PYLLs and $3.2 billion lost earnings (22.6% of the total lost earnings among NH blacks) would have been avoided," write the authors. "Improving equal access to effective cancer prevention, screening, and treatment will be important in reducing the disproportional economic burden associated with racial/ethnic disparities," they conclude.

Credit: 
American Cancer Society

Science snapshots May 2020

image: The filamentous fungus Neurospora crassa eating plant biomass.

Image: 
Vincent Wu/JGI

The World's Forests Are Growing Younger

Study finds climate change is altering forest structure, making forests less of a carbon sink

--By Christina Procopiou

Researchers from Berkeley Lab and 20 other institutions have found that land use and atmospheric changes are altering forest structure around the world, resulting in fewer of the mature trees that are better at storing carbon dioxide from the atmosphere.

The scientists evaluated data and observations from more than 160 previous studies designed to capture how interactions between forest vegetation, climate changes, and disturbance such as drought provoke ecosystem responses including increased tree mortality and decreased forest age. Results of their work, led by researchers at the Pacific Northwest National Laboratory, were published recently in the journal Science.

"This change in forests from old to young is something to be concerned about," said Lara Kueppers, Berkeley Lab faculty scientist, co-author of the paper, and co-investigator involved in the DOE's Next-Generation Ecosystem Experiments - Tropics (NGEE-Tropics) project. "In younger forests, there are, on average, fewer large trees. The lack of large trees tends to drive down the carbon stored in biomass, whereas to address climate change we want forests to hold on to as much carbon as possible."

Kueppers, who led a 2018 DOE workshop which prompted this review paper, contributed to the study by evaluating the ability of forests to regenerate and recruit new trees following natural disturbance, such as wildfire and drought, and chronic climate change. The researchers conclude that the observed trends provide a critical test for Earth System Models designed to simulate changing forest dynamics.

Read Pacific Northwest National Laboratory's news release about this paper here.

New Research Evaluates How U.S. Wind Plant Performance Changes with Age

First comprehensive study of the U.S. wind fleet shows relatively low levels of performance decline with age

--By Kiran Julin

U.S. wind plants maintain 87% of peak performance after 17 years, and newer plants show almost no decline over the first 10 years, according to a recent study from Berkeley Lab. Compared to studies of how European wind fleets age, the U.S. wind fleet shows mild performance loss with age, and plants built after 2008 show the lowest levels of performance decline that have been found in a major fleet.
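The headline figure implies a modest average annual decline. As a rough illustration, assuming for simplicity a constant geometric decline (a simplifying assumption; the study itself reports more nuanced age profiles):

```python
# If a plant retains 87% of peak performance after 17 years and the decline
# compounds at a constant annual rate (an illustrative assumption, not the
# study's model), the implied average loss per year is:

retained = 0.87
years = 17

annual_factor = retained ** (1 / years)       # per-year retention factor
annual_loss_pct = (1 - annual_factor) * 100   # per-year performance loss, %

print(round(annual_loss_pct, 2))  # 0.82 (% performance lost per year)
```

Under this simple assumption, the U.S. fleet's 13% cumulative loss over 17 years works out to well under one percent per year, consistent with the "mild performance loss" characterization.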

A team of researchers in Berkeley Lab's Energy Analysis & Environmental Impacts Division analyzed the performance of 917 onshore wind projects in the U.S. Their findings were published recently in the journal Joule.

The U.S. wind market is the second-largest in the world and supplied 7.3% of the nation's electricity generation in 2019, yet this is the first research effort to evaluate the impact of plant age on the performance of the U.S. wind fleet.

"The results indicate that age-related performance loss can be influenced by technology choices and cost-benefit decisions by project operators," said Berkeley Lab scientist Dev Millstein, the corresponding author of the paper. "The study provides evidence that recent technology changes are positively influencing how wind plants age."

Read the full article here at Berkeley Lab's Energy Analysis & Environmental Impacts Division website.

Investigation into Fungi Food Choices Yields a Buffet of Information

A study of how fungi sense and respond to available food helps explain nutrient recycling and opens the door to better methods for producing bio-based products

--By Aliyah Kovner

When you hear the word "fungi," there is a good chance it conjures an image of an idyllic toadstool, or perhaps a cluster of capped mushrooms, growing out of a fallen log or pile of leaves. Though fungi across the planet are amazingly diverse, species that decay plant matter, like these iconic forest inhabitants, are of particular interest to scientists: the enzymes they use to break down tough plant cell walls into simple sugars can be mass-produced and used in industrial processes that generate valuable carbohydrate-based compounds, such as biofuels.

Interestingly, these fungi are capable of tailoring which plant cell wall-degrading enzymes they secrete based on the composition of the food sources available. To investigate how the fungi sense and respond in this manner, a team of scientists led by UC Berkeley used multiple techniques to study the genes, gene products, and gene regulation processes in the fungus Neurospora crassa. The recently published work - a collaboration of N. Louise Glass' Lab with researchers from the U.S. Department of Energy Joint Genome Institute (JGI), located at Berkeley Lab, and the Technical University of Munich - adds to a rich array of studies seeking to improve our understanding of fungal genomes.

"This paper explored how ascomycete fungi - the largest phylum within the fungal kingdom - choose what things to eat in an environment of many food choices," explained first author Vincent Wu. "Curiosity aside, these fungi are currently being utilized to produce enzymes, proteins, and other chemicals in mass quantities, so understanding how they decide what to eat can lead to novel ways of engineering these organisms to more efficiently and sustainably assemble these products. Furthermore, this research included a massive transcriptomics analysis with the use of DAP-seq, a powerful new genetic analysis tool implemented at the JGI."

Credit: 
DOE/Lawrence Berkeley National Laboratory