Tech

Intestinal microbes reprogram genetic activity of gut mucosa

Scientists from the German Cancer Research Center and the Hebrew University in Jerusalem demonstrated in mice that intestinal bacteria reprogram DNA activity in cells of the gut mucosa and thus have a considerable impact on the development of the healthy gut. Acute intestinal inflammation induced under experimental conditions led to a huge increase in the activity of inflammation-related and cancer-promoting genes in the mucous membrane cells of microbe-colonized animals.

A large body of research work indicates that the intestinal microbiota - in other words, all the microorganisms that colonize the human gut - and its composition are linked to a whole series of diseases. The illnesses range from inflammatory diseases of the intestine and metabolic disorders such as obesity and diabetes to cancer, autism, and depression.

These studies usually only show links and do not clarify the mechanisms by which the intestinal microbes affect the human body. Frank Lyko from the German Cancer Research Center (DKFZ) and Yehudit Bergman from the Hebrew University in Jerusalem teamed up to address this issue.

To do so, the researchers compared the DNA of gut mucosal cells from mice with a normal microbiome with that of mice that had grown up under sterile conditions. They focused on analyzing DNA methylation, an epigenetic mark that prevents DNA-binding proteins from attaching to the DNA at the marked sites, thus restricting the activity of the genes there.

The researchers noticed considerable differences in the methylation patterns between sterile and microbe-colonized animals. In the latter, they found a group of "sentinel genes" activated by demethylation, which are responsible for the normal regeneration of the intestinal mucosa in the healthy gut.

Both the microbe-colonized mice and the sterile animals were treated using a chemical that attacks the intestinal mucosa, thus inducing acute inflammation. In the microbe-colonized animals, this treatment led to a reduction in DNA methylation in the gut mucosal cells, particularly affecting a number of regulatory elements. As a result, many genes that play a role in inflammation and in cancer were activated.

In contrast, the chemical hardly caused any changes in genetic activity in the microbe-free mice. "That shows that the differences in methylation are due to the bacteria and not to the chemical," Frank Lyko explained. However, if the microbiome of the microbe-colonized mice was transferred to the microbe-free animals, methylation of the intestinal mucosal cells was reduced in these mice too.

The team of German and Israeli researchers realized that this effect apparently depends on the demethylating enzymes TET2 and TET3: when these were genetically switched off, treatment with the chemical caused hardly any changes in genome methylation.

"The microbiome seems to have a considerable influence on the animals' health: It ensures normal intestinal development by using epigenetic programming to activate genes that steer regeneration of the gut mucosa. In the microbe-free mice, however, this activation does not take place," Frank Lyko explained. "During acute inflammation, the gut microbes also cause a change in genetic activity, and genes are dysregulated that are also dysregulated in patients with intestinal inflammation and cancer of the colon. This once again underlines the key role that microbes play in epigenetic regulation," he added.

Credit: 
German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ)

Microcensus in bacteria

image: Bacteria can determine the total size of a population, an ability also known as quorum sensing (left). A "pump probe system" enables Bacillus subtilis to determine the quantitative proportions of different groups (right). These systems are widely distributed in Gram-positive bacteria, but also occur in viruses and mobile genetic elements.

Image: 
MPI f. Terrestrial Microbiology/ Bischofs

Bacteria can perceive how numerous they are. They release and sense signaling molecules that accumulate as cell numbers increase, allowing them to change their behavior when a certain group size is reached. A team of researchers from the Max Planck Institute for Terrestrial Microbiology in Marburg and Heidelberg University has now been able to show that bacteria might be capable of even more: they could perceive the proportions of different groups of bacteria in their environment.

In nature, bacteria often live in complex communities, surrounded by other cells that can differ from each other, even within a species. The principal investigator, Ilka Bischofs, explains: "Imagine yourself in a ballroom full of people. Their sheer number is only of limited relevance to you; it is the gender ratio that tells you how hard it will be to find a dancing partner. Bacteria also collect information about their environment. Information about group ratios could help them make decisions and adapt in the best possible way."

The research team studied information retrieval in the bacterium Bacillus subtilis. This species possesses a large number of identically constructed chemical signaling systems that were previously thought to measure cell numbers. Instead, the bacteria may use these systems to determine the proportions of different groups within a mixed population. The respective signaling molecules are often produced by only a subset of cells, but are taken up by all bacteria. Cells therefore compete with each other for the signaling molecules: the larger the share of signal producers in the population, the more signaling molecules accumulate in the cells where they are detected. However, as with computers, the specific function of a system depends on its settings.
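To make the ratio-sensing logic concrete, here is a minimal toy model (our illustration, not the team's actual analysis; the rate constants k_prod and k_up are hypothetical). Because the signal is produced by a subset of cells but taken up by all of them, the steady-state signal level tracks the producer fraction rather than the absolute population size:

```python
# Toy steady-state model of ratio sensing, assuming production by P producer
# cells and uptake by all N cells: dS/dt = k_prod*P - k_up*N*S, which settles
# at S* = (k_prod/k_up) * (P/N) -- a function of the ratio P/N alone.

def steady_state_signal(producers, total, k_prod=1.0, k_up=0.1):
    """Steady-state signal level for `producers` producer cells among `total` cells."""
    return (k_prod / k_up) * (producers / total)

# A fixed 25% producer fraction gives the same readout at any population size,
# whereas a classic quorum-sensing readout would grow with absolute numbers.
for total in (100, 1_000, 10_000):
    print(total, steady_state_signal(total // 4, total))
```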

The research team was able to show experimentally that at least the bacterial signaling system they investigated as a test case is indeed correctly configured to facilitate ratio-sensing. Using high-resolution fluorescence microscopy (Förster Resonance Energy Transfer, FRET), they analysed the signal transduction in detail.

The ratio-sensing ability could confer decisive advantages to the bacterium. As research in recent years has shown, Bacillus subtilis often splits its population into subgroups of cells with different properties and functions. Similar to a stock broker, the bacterium diversifies its portfolio of phenotypes. Knowing the composition of the portfolio obviously makes it possible to respond adequately to environmental changes - a strategy that bacteria may have already discovered during evolution.

Credit: 
Max-Planck-Gesellschaft

New genetic signatures in childhood leukemia create paths for precision medicine

WILMINGTON, Del. (March 5, 2020) - Researchers with Nemours Children's Health System utilized Next Generation Sequencing (NGS) to more precisely identify genomic characteristics of leukemias in children, the most common childhood cancer. The study, published today in BMC Medical Genomics, identified new genetic structural variants that could be used to assess the presence of minimal residual disease during the course of chemotherapy and help determine response to various therapies.

"Progress in the field of genomic research and advanced technology has allowed us to find new variants that can better target treatments for kids with cancer," said Erin Crowgey, PhD, lead author of the study and Director of Clinical Bioinformatics at Nemours. "Pediatric leukemias have a diverse and complex genomic structure, and older sequencing techniques were missing a lot of the important information that guides our clinical evaluation, risk identification, and therapeutic strategy for patients."

Researchers analyzed DNA and RNA from bone marrow, cell lines, and umbilical cord samples from 32 pediatric leukemia patients and five adult leukemia patients. Patient samples were collected at diagnosis, end of the first treatment, and relapse. These samples were sequenced using molecular barcoding with targeted DNA and RNA library enrichment techniques. The sequencing identified multiple novel gene fusions and previously unknown copy number losses in leukemia genes. This approach enabled more sensitive detection of these genetic variants.
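As background on the barcoding step, the sketch below illustrates the general principle of unique molecular identifiers (UMIs) -- a standard technique, not a reconstruction of the Nemours pipeline: reads that share a barcode descend from one original molecule, so isolated sequencing errors can be voted out, enabling sensitive detection of rare variants such as residual disease.

```python
# Minimal sketch of UMI-based consensus calling: group reads by their unique
# molecular identifier, then take the majority base per group so that lone
# sequencing errors are discarded while true variants survive.
from collections import Counter, defaultdict

reads = [("UMI-1", "A"), ("UMI-1", "A"), ("UMI-1", "G"),  # "G" is a lone error
         ("UMI-2", "T"), ("UMI-2", "T")]                  # hypothetical reads

groups = defaultdict(list)
for umi, base in reads:
    groups[umi].append(base)

consensus = {umi: Counter(bases).most_common(1)[0][0] for umi, bases in groups.items()}
print(consensus)  # {'UMI-1': 'A', 'UMI-2': 'T'}: one consensus call per molecule
```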

"Hearing that your child has cancer is scary; particularly that the clinical presentation of his disease is unique and there is no other research that matches his case," said Sophie Hayes, mother of Eliot Hayes, a patient at Nemours. "So, when the genomic results came back and showed that Eliot's cancer was not unique at the genetic level, we were very grateful to the scientific community and to Nemours. This early knowledge enabled us to incorporate an additional line of attack into his treatment and offered us hope."

Researchers note that the ultimate goal is to bring the genomic testing into routine clinical practice so that it can be used regularly, leading to a pediatric onco-specific genomic treatment program at Nemours. "Dr. Crowgey's work brings complex genomic testing closer to the bedside to improve the treatment of children with cancer," said E. Anders Kolb, MD, Director of Nemours Center for Cancer and Blood Disorders and a co-author of the paper.

Credit: 
Nemours

Drug that keeps surface receptors on cancer cells makes them more visible to immune cells

A drug that is already clinically available for the treatment of nausea and psychosis, called prochlorperazine (PCZ), inhibits the internalization of receptors on the surface of tumor cells, thereby increasing the ability of anticancer antibodies to bind to the receptors and mount more effective immune responses. PCZ enhanced the ability of anticancer antibodies to reduce tumor growth in mice by temporarily increasing the clustering of a receptor targeted by anticancer antibodies on the surface of tumor cells. Temporary treatment of cancer patients with PCZ similarly resulted in receptor clustering on the surface of their tumor cells. The work appears March 5 in the journal Cell.

"This represents a potentially disruptive change in clinical approaches to improve therapeutic targeting as the opportunity of temporarily moving drug targets within patient tumor cells becomes possible," says senior author Fiona Simpson, a cancer researcher at the University of Queensland.

In addition to treating nausea and psychosis, PCZ is currently used in laboratory studies as an inhibitor of endocytosis, a process by which substances are brought into the cell. The temporary inhibition of endocytosis via molecules such as PCZ has been proposed for numerous clinical applications, including control of viral infection, epilepsy, and chronic kidney disease.

To help move this approach closer to clinical translation, Simpson and her team recently developed an imaging method to reliably measure endocytosis in humans. When they analyzed tumor samples from patients with squamous cell carcinoma, they found that a decrease in endocytosis of a protein called epidermal growth factor receptor (EGFR) was associated with better clinical responses to the cancer drug cetuximab, an antibody that binds to and inhibits EGFR. "We found that if we stopped the drug targets getting inside the cells by inhibiting the uptake process, called endocytosis, we got a really big immune response against the tumors in mice," Simpson says.

"This led to the concept that temporarily halting endocytosis of receptors that are the targets of monoclonal antibody therapies may be used to improve clinical outcomes," Simpson says. Their findings suggested that endocytosis inhibition in humans, which increases the clustering of EGFR on the surface of tumor cells, thereby increasing the availability of the receptor for binding to the antibody, should lead to improved outcomes for patients treated with certain therapeutic antibodies.

Exploring this idea in the new study with a different therapeutic antibody, Simpson and her team showed that PCZ improved immune responses against cetuximab-bound tumor cells. In mice with cancer, combination therapy with PCZ and cetuximab reversed resistance to cetuximab therapy and inhibited tumor growth more than either treatment alone. Similarly, combination therapy with PCZ and avelumab--an anticancer antibody that inhibits a protein called programmed death ligand 1 (PD-L1)--inhibited tumor growth in mice more than either treatment alone. "These findings highlight the crucial role of availability and persistence of the molecule targeted by the therapeutic antibody at the cell surface of tumor cells," Simpson says.

To begin to test the safety of this approach in humans, Simpson and her team carried out an open-label, uncontrolled, single-center proof-of-concept study in which a single dose of PCZ was intravenously administered to patients with squamous cell carcinoma of the head and neck. Tumor biopsies were taken before and after PCZ treatment. Analysis of results from four patients showed that PCZ increased cetuximab binding sites and the clustering of EGFR on the surface of tumor cells. "As far as we can tell, this was the first verified manipulation of endocytosis in humans," Simpson says.

Moreover, tumor cells became more homogenous with regard to the availability of cetuximab binding sites. "In our patients, we could see surface target on some tumor cells exposed and not others. After PCZ, they all had drug target on their surface, so it fixed the heterogeneity problem," Simpson says. "By overcoming the heterogeneity of drug target availability that frequently characterizes poorly responsive or resistant tumors, clinical application of PCZ may considerably improve the clinical benefit of therapeutic antibodies."

PCZ is already prescribed chronically in some people for the management of psychosis. Unfortunately, over a lifetime of PCZ use, patients may show an increased incidence of metabolic syndrome. But, in this case, the proposed use of the drug by Simpson and colleagues is short-term, and the endocytosis inhibition is reversible, negating the side effects of chronic use. "It turns out PCZ at the high concentration we use it only stops trafficking for about four hours," Simpson says. "This is enough time to line up the immune cells when an antibody drug is used against the tumor cells," but theoretically not enough time to be really toxic. Further clinical studies are underway to fully test whether this approach can be safely used in people.

From a broader perspective, the study sets the stage for testing PCZ in clinical trials to treat epilepsy or chronic kidney disease or to improve the effectiveness of anti-HIV antibodies. According to the authors, the drug might also be used as a frontline medicine to prevent or reduce infections by highly pathogenic viruses such as dengue or Ebola.

Currently, Simpson and her collaborators are conducting a phase IB safety trial to test the combination of cetuximab and PCZ in head and neck cancer, triple-negative breast cancer and adenoid cystic carcinoma. "We need to show efficacy through trials first, but if it works, we could use it to improve therapies in lots of different cancers and also other uses," Simpson says. "We are hoping once the idea is out there, lots of clever people will run with the idea and make it work even better."

Credit: 
Cell Press

Unexpected ways animals influence fires

Animals eating plants might seem like an obvious way to suppress fire, and humans are already using the enormous appetites of goats, deer, and cows to reduce the fuel available for potential wildfires. But other animals such as birds, termites, and elephants can also double as ecosystem engineers, naturally reducing or enhancing the chances, spread, or severity of wildfires as they go about their day-to-day grass-chewing, track-making, or nest-building. Researchers in Australia describe these and more surprising activities in a Review published March 5 in the journal Trends in Ecology & Evolution.

When it comes to grazing animals, it's important to consider which species of plants they are eating, and which are left behind. "A lot of the things that make a plant good to eat are the things that make it hard to burn," says first author Claire Foster (@claireNfoster), terrestrial conservation biologist and research fellow at Australian National University. "When you take out all the nutritious, palatable plants, those left over tend to be drier and more flammable."

Studies have shown that removing large grazers like cattle or rhinoceros can increase wildfire temperature, as well as the size of individual fires and the total area burnt. But, it's important to note that these grazers are most effective as a fire management tool in grassy habitats like savanna; if domesticated grazers are used improperly, they can promote the growth of less tasty but more flammable plants. This can be a problem in alpine areas and in forests with mixed plant species, where selective feeding can increase the numbers of more fire-prone plants.

"It's very clear that when used strategically, and in the right ecosystems, mammals like goats and cattle can have strong fire-suppressive effects, but I've also seen many examples where they actually do the opposite and increase the risk of severe fires," Foster says.

But there may be other, less obvious animals that could also be used in fire defense. "Some of the animals we don't necessarily think of are the insects that, by feeding on leaves, stimulate the production of defensive chemicals in the plants, changing the flammability of their leaves," says Foster. Other kinds of insects likely play a strong role in removing dead leaves from the forest floor and, in some cases, can even provide shelter for other animals from fire.

"One of the most amazing examples is from savanna ecosystems with termites," she says. "They create massive structures where a huge variety of other animals choose to live. These 'nutrient islands' attract large herbivores that preferentially graze around the termite mounds, making them less likely to burn and creating a safety zone during moderate-severity bushfires."

Further, some animals can manage fire spread by changing the arrangement of plants or dead plant materials within their habitat. Similar to how you might rake your yard, malleefowl birds gather dead leaves into piles to incubate their eggs, helping clear the ground of leaf litter. Larger animals, like elephants, can trample down plants to form wide corridors between foliage. "Gaps in fuel can be really important for fire spread; animal tracks can act like mini roads, creating breaks that can cause extinguishment of the fire front," says Foster.

Together, Foster, senior author and University of Western Australia wildlife biologist Leonie Valentine, and their co-authors illustrate that direct plant consumption is only one of several mechanisms by which animals can influence fire behavior. "Our approach was to encourage readers to think about the more subtle and indirect ways that animals might be altering fuels and to help avoid risky assumptions when we plan strategies to reduce the risk of fire outbreaks," Foster says. "We also encourage readers to consider ways that changes in wildlife populations--and not just grazers--might influence patterns of fire".

Next, Foster and her lab are investigating the relationship between forest insects and rates of leaf-litter breakdown within ecosystems as well as the effects of kangaroos and wallabies on the behavior of fire in Australian forests.

Credit: 
Cell Press

Genetic study offers comprehensive and diverse view of recent US population history

image: This figure shows a dimensionality reduction method, UMAP, applied to genetic data from the National Geographic Genographic Project, where each dot represents one individual, with colors determined from a linear combination of each individual's admixture proportions (colors correspond to each continental ancestry, with European = red, African = yellow, Native American = green, East Asian = blue, South Asian = purple).

Image: 
Dai et al./AJHG

Researchers have assembled one of the most comprehensive studies of population genetics ever conducted in the United States, bringing together large-scale genetics data from more than 32,000 participants in the National Geographic Genographic Project. This new view on the US, appearing March 5 in the American Journal of Human Genetics, reveals a remarkable degree of complexity. Beyond offering an intriguing view of the nation's recent history, the findings also have important implications for health and medicine, the researchers say.
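The visualization in the figure above combines two standard ingredients: a UMAP embedding of per-individual genetic features and colors formed as a linear combination of admixture proportions. A hedged sketch of that approach on synthetic stand-in data (the real study used genotype-derived features from Genographic participants) might look like this:

```python
# Sketch of the figure's visualization idea with synthetic data: embed
# per-individual features with UMAP and color each point by mixing one
# anchor color per continental ancestry, weighted by admixture proportions.
import numpy as np
import matplotlib.pyplot as plt
import umap  # pip install umap-learn

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 20))       # stand-in for genotype-derived features
admixture = rng.dirichlet(np.ones(5), 500)  # stand-in admixture proportions, rows sum to 1

anchors = np.array([[1.0, 0.0, 0.0],   # European = red
                    [1.0, 1.0, 0.0],   # African = yellow
                    [0.0, 1.0, 0.0],   # Native American = green
                    [0.0, 0.0, 1.0],   # East Asian = blue
                    [0.5, 0.0, 0.5]])  # South Asian = purple
colors = np.clip(admixture @ anchors, 0.0, 1.0)  # linear combination of proportions

embedding = umap.UMAP(n_neighbors=15, min_dist=0.1).fit_transform(features)
plt.scatter(embedding[:, 0], embedding[:, 1], c=colors, s=3)
plt.savefig("umap_admixture.png", dpi=200)
```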

"We have inferred US demographic history at fine scale by combining genetic, geospatial, and multigenerational birth record data," says Alicia Martin (@genetisaur) of the Broad Institute of MIT and Harvard. "This provides important lessons for the future: if we want genetic technologies to benefit everyone, we need to rethink our current approach for genetic studies because they typically miss a huge swath of American--and more broadly human--diversity."

Although much effort has been made to understand the genetic diversity in the US, fine-scale patterns were less clear for minority populations, particularly for Latin American and African descendants with complex ancestral histories. There had also been little research on the population structure of individuals in the US with East Asian, South Asian, and Middle Eastern ancestry.

Martin and colleagues report that the new effort had several distinct advantages compared to other large-scale population genetics datasets. First, genetic data for each consenting participant from the Genographic Project are accessible to researchers around the world to answer anthropological questions. Most of the study's participants also reported their postal code along with birthplace and ethnicity data for themselves, their parents, and their grandparents, enabling fine-scale insights into recent history.

Their analysis revealed the following:

Hispanic/Latino populations show a remarkably complex population structure, including geographically varying proportions of Native American, European, and African ancestry.

The evidence uncovered a north-south barrier to migration among African Americans, consistent with the transatlantic slave trade and historical patterns of segregation between states.

Some Asian Americans, a fast-growing group in the US that had been often overlooked in past genetic studies, show elevated patterns of relatedness consistent with intermarrying between close kin among their ancestors.

People of European descent show a subtle, unevenly distributed structure. For example, Irish, Finnish, and Acadian (French Canadian) descendants are common in New England, the Northwest, and the South, respectively.

Martin also noted that Hispanic/Latino populations have native origins spanning North, Central, and South America over several generations, and that the extent and timing of these roots differ, on average, among Californians, New Mexicans, Texans, and Floridians.

The researchers say that these findings are "the tip of the iceberg." As the number of participants in this and other ongoing studies continues to grow, the view of US population history will become even clearer.

Martin says they will continue this work as the available data grow to include about 250,000 Americans. In future studies, they also plan to expand to people who may or may not live in the US as well, which, Martin says, will help to tease out relationships among underrepresented populations, such as Hispanic/Latino populations in Florida who have ancestral roots from Cuba, Puerto Rico, Mexico, and elsewhere.

Credit: 
Cell Press

Exciting tweaks for organic solar cells

image: The new electron-accepting molecule TACIC can maintain its excited state 50 times longer than a conventional one.

Image: 
Illustration by Mindy Takamiya

A molecular tweak has improved organic solar cell performance, bringing us closer to cheaper, efficient, and more easily manufactured photovoltaics. The new design approach, targeting the molecular backbone of the cell's power-generating layer, was developed by scientists at Kyoto University's Institute for Integrated Cell-Material Sciences (iCeMS) and published in the journal Chemical Science.

Organic photovoltaics are expected to become the next generation of solar cells as they use cheaper components, and are more lightweight, flexible and easily manufactured compared to currently used inorganic solar cells.

"There is growing concern over the use of fossil fuels and their environmental impacts," says Hiroshi Imahori, a molecular engineer at iCeMS who led the work with colleague Tomokazu Umeyama. "We need to work hard to improve sustainable energy systems."

The power-generating layer in organic photovoltaics contains molecules that either donate or accept electrons. Light is absorbed by this thin layer, exciting the molecules, which generate charges that go on to form an electric current. But for light to be efficiently converted to electricity, the electron-accepting component needs to stay excited.

One type of organic cell is very good at absorbing a broad spectrum of light, but doesn't stay excited for long. To try to address this, Imahori, Umeyama and their colleagues in Japan targeted the molecular backbone of the cell's electron-accepting component. Specifically, they replaced a central ring with a molecule called thienoazacoronene, creating a new molecule called TACIC.

Similar to its predecessor, TACIC absorbed a broad spectrum of visible and near-infrared light. Significantly, it maintained its excited state 50 times longer, converting more than 70% of light particles into current. The design achieved this by stabilizing the vibration and rotation that normally occur when light is absorbed, saving kinetic energy and facilitating intermolecular interaction.

The cell's power conversion efficiency currently stands at just under 10%, which is comparable to other organic solar cells being researched. The team believes modifications to the side chains and core structure of the thienoazacoronene molecule could further improve the efficiency of organic photovoltaics.

Credit: 
Kyoto University

New ESO study evaluates impact of satellite constellations on astronomical observations

image: This annotated image shows the night sky at ESO's Paranal Observatory around twilight, about 90 minutes before sunrise. The blue lines mark degrees of elevation above the horizon.

A new ESO study looking into the impact of satellite constellations on astronomical observations shows that up to about 100 satellites could be bright enough to be visible with the naked eye during twilight hours (magnitude 5-6 or brighter). The vast majority of these, their locations marked with small green circles in the image, would be low in the sky, below about 30 degrees elevation, and/or would be rather faint. Only a few satellites, their locations marked in red, would be above 30 degrees of elevation -- the part of the sky where most astronomical observations take place -- and be relatively bright (magnitude of about 3-4). For comparison, Polaris, the North Star, has a magnitude of 2, which is 2.5 times brighter than an object of magnitude 3.

The number of visible satellites plummets towards the middle of the night when more satellites fall into the shadow of the Earth, represented by the dark area on the left of the image. Satellites within the Earth's shadow are invisible.

Image: 
ESO/Y. Beletsky/L. Calçada

Astronomers have recently raised concerns about the impact of satellite mega-constellations on scientific research. To better understand the effect these constellations could have on astronomical observations, ESO commissioned a scientific study of their impact, focusing on observations with ESO telescopes in the visible and infrared but also considering other observatories. The study, which considers a total of 18 representative satellite constellations under development by SpaceX, Amazon, OneWeb and others, together amounting to over 26 thousand satellites [1], has now been accepted for publication in Astronomy & Astrophysics.

The study finds that large telescopes like ESO's Very Large Telescope (VLT) and ESO's upcoming Extremely Large Telescope (ELT) will be "moderately affected" by the constellations under development. The effect is more pronounced for long exposures (of about 1000 s), up to 3% of which could be ruined during twilight, the time between dawn and sunrise and between sunset and dusk. Shorter exposures would be less impacted, with fewer than 0.5% of observations of this type affected. Observations conducted at other times during the night would also be less affected, as the satellites would be in the shadow of the Earth and therefore not illuminated. Depending on the science case, the impacts could be lessened by making changes to the operating schedules of ESO telescopes, though these changes come at a cost [2]. On the industry side, an effective step to mitigate impacts would be to darken the satellites.

The study also finds that the greatest impact could be on wide-field surveys, in particular those done with large telescopes. For example, up to 30% to 50% of exposures with the US National Science Foundation's Vera C. Rubin Observatory (not an ESO facility) would be "severely affected", depending on the time of year, the time of night, and the simplifying assumptions of the study. Mitigation techniques that could be applied on ESO telescopes would not work for this observatory although other strategies are being actively explored. Further studies are required to fully understand the scientific implications of this loss of observational data and complexities in their analysis. Wide-field survey telescopes like the Rubin Observatory can scan large parts of the sky quickly, making them crucial to spot short-lived phenomena like supernovae or potentially dangerous asteroids. Because of their unique capability to generate very large data sets and to find observation targets for many other observatories, astronomy communities and funding agencies in Europe and elsewhere have ranked wide-field survey telescopes as a top priority for future developments in astronomy.

Professional and amateur astronomers alike have also raised concerns about how satellite mega-constellations could impact the pristine views of the night sky. The study shows that about 1600 satellites from the constellations will be above the horizon of an observatory at mid-latitude, most of which will be low in the sky -- within 30 degrees of the horizon. Above this -- the part of the sky where most astronomical observations take place -- there will be about 250 constellation satellites at any given time. While they are all illuminated by the Sun at sunset and sunrise, more and more fall into the shadow of the Earth toward the middle of the night. The ESO study assumes a representative brightness for all of these satellites. With this assumption, up to about 100 satellites could be bright enough to be visible with the naked eye during twilight hours, about 10 of which would be higher than 30 degrees of elevation. All these numbers plummet as the night gets darker and the satellites fall into the shadow of the Earth. Overall, these new satellite constellations would roughly double the number of satellites visible in the night sky to the naked eye above 30 degrees [3].
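For readers checking the brightness comparisons above and in the image caption, the standard astronomical magnitude-flux relation (a textbook formula, not something introduced by the ESO study) is:

```latex
\frac{f_1}{f_2} = 100^{(m_2 - m_1)/5} \approx 2.512^{\,m_2 - m_1},
\qquad m_2 - m_1 = 1 \;\Rightarrow\; \frac{f_1}{f_2} \approx 2.5
```

This is why Polaris at magnitude 2 is about 2.5 times brighter than a magnitude-3 satellite, and why magnitude 5-6 sits near the naked-eye visibility limit.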

These numbers do not include the trains of satellites visible immediately after launch. Whilst spectacular and bright, they are short lived and visible only briefly after sunset or before sunrise, and -- at any given time -- only from a very limited area on Earth.

The ESO study uses simplifications and assumptions to obtain conservative estimates of the effects, which may be smaller in reality than calculated in the paper. More sophisticated modelling will be necessary to more precisely quantify the actual impacts. While the focus is on ESO telescopes, the results apply to similar non-ESO telescopes that also operate in the visible and infrared, with similar instrumentation and science cases.

Satellite constellations will also have an impact on radio, millimetre and submillimetre observatories, including the Atacama Large Millimeter/submillimeter Array (ALMA) and the Atacama Pathfinder Experiment (APEX). This impact will be considered in further studies.

ESO, together with other observatories, the International Astronomical Union (IAU), the American Astronomical Society (AAS), the UK Royal Astronomical Society (RAS), and other societies, is taking measures to raise the awareness of this issue in global fora such as the United Nations Committee on the Peaceful Uses of Outer Space (COPUOS) and the European Committee on Radio Astronomy Frequencies (CRAF). This is being done while exploring with the space companies practical solutions that can safeguard the large-scale investments made in cutting-edge ground-based astronomy facilities. ESO supports the development of regulatory frameworks that will ultimately ensure the harmonious coexistence of highly promising technological advancements in low Earth orbit with the conditions that enable humankind to continue its observation and understanding of the Universe.

Credit: 
ESO

Humans transport dangerous smoke residues indoors

Decades of research have demonstrated the adverse effects of fine particulate matter and volatile organic compounds (VOC) such as nicotine or acetonitrile from tobacco smoke on human health, with no "safe" level of exposure. Smoking restrictions have decreased non-smokers' exposure to secondhand smoke. Yet with worldwide smoking rates at 22%, exposure to hazardous pollutants from tobacco smoke remains a major risk for non-smokers, and thirdhand smoke (THS) has been identified as a major exposure pathway.

An international team of scientists from the Max Planck Institute for Chemistry and Yale University has now discovered that tobacco smoke residues off-gassing from previously exposed people transport contaminants equivalent to several cigarettes of secondhand smoke. This means that even if someone is in a room where no one has ever smoked, that person could still be exposed to many of the hazardous chemical compounds that make up cigarette smoke, depending on who else has entered the room or previously visited it.

Thirdhand smoke includes residual nicotine and multiple other chemicals left on surfaces like clothes, walls, skin or furniture. Desorption from these surfaces results in airborne chemicals that present significant health risks to non-smokers. This serious public health hazard hasn't been fully understood yet.

The paper is significant because studies on THS-related VOCs in non-smoking environments have been lacking: while THS transport to non-smoking sites has been proposed theoretically, no studies had yet observed or quantified the active transport and emission from people into non-smoking environments. "In real-world conditions, we see concentrated emissions of hazardous gases coming from groups of people who were previously exposed to tobacco smoke as they enter a non-smoking location with strict regulations against indoor smoking. So the idea that someone is protected from the potential health effects of cigarette smoke because they're not directly exposed to second-hand smoke is not the case", says Drew Gentner, Associate Professor of Chemical & Environmental Engineering at Yale University and Alexander von Humboldt Foundation Research Fellow.

The researchers conducted an experiment at a movie theater, measuring the real-time emissions of THS compounds from people into a non-smoking indoor environment. Over four consecutive days of measurements with an online high-resolution mass spectrometer, 35 different VOCs previously associated with THS or tobacco smoke emissions were observed at significant concentrations in the theater. The gas emissions were equivalent to those of 1-10 cigarettes of secondhand smoke in a one-hour period. A much larger range of compounds originating from cigarette smoke was observed with offline measurements of gases and aerosols. "Based on the measurements, we conclude that humans transport THS into indoor areas via their clothing and bodies. This observation is in line with what has been theorized in the past, but until now had yet to be proven empirically," says Jonathan Williams, group leader at the Max Planck Institute for Chemistry and co-author of the study.

The conclusions derived from this study are generalizable to other, less well-ventilated locations. "The observed emission rates into a more confined or less well-ventilated space such as a motor vehicle, a bar, a train, or a small room in a home would lead to much higher concentrations and occupant exposure", Gentner explains.

Credit: 
Max Planck Institute for Chemistry

York University researchers one step closer to creating organic batteries

image: York University researchers have discovered a way to make Lithium-powered batteries more environmentally friendly while retaining performance, stability and storage capacity.

Image: 
Paola Scattolon

York University researchers have discovered a way to make Lithium-powered batteries more environmentally friendly while retaining performance, stability and storage capacity.

Lithium-ion batteries use toxic heavy metals which can impact the environment when they are extracted from the ground and are difficult to dispose of safely. Cobalt is one of those heavy metals, used in battery electrodes. Part of the problem is that lithium and cobalt are not abundantly available, and supplies are dwindling.

Using organic materials is the way forward, and that has scientists like Professor Thomas Baumgartner of the Faculty of Science and his team busy developing and testing new molecules to find the right ones to replace the rare metals currently in use.

"Organic electrode materials are considered to be extremely promising materials for sustainable batteries with high power capabilities," he says.

Their latest breakthrough is the creation of a new carbon-based organic molecule that can replace the cobalt now used in cathodes or positive electrodes in lithium-ion batteries. The new material addresses the shortcomings of the inorganic material while maintaining performance.

"Electrodes made with organic materials can make large-scale manufacturing, recycling or disposing of these elements more environmentally friendly," says Baumgartner. "The goal is to create sustainable batteries that are stable and have equally as good if not better capacity."

The research is published and featured on the cover of the March edition of the journal Batteries & Supercaps, a ChemPubSoc publication.

"With this particular class of molecules that we've made, the electroactive component is very suitable for batteries as it's very good at storing electrical charges and has good long-term stability," he says.

Baumgartner and his group previously reported on the electroactive component in a paper published in the journal Advanced Energy Materials.

"We have optimized this electroactive component and put it in a battery. It has a very good voltage, up to the 3.5 volts, which is really where current batteries are now," he says. "It's an important step forward in making fully organic and sustainable batteries."

Baumgartner, along with postdoctoral researchers Colin Brides and Monika Stolar, have also demonstrated that this material is stable in long-term operation with the ability to charge and discharge for 500 cycles. One of the downsides of inorganic electrodes is that they generate significant heat when charging and require limited discharging rates for safety reasons. This new molecule addresses that shortcoming.

The next step, says Baumgartner, is to improve the capacity further. His team is currently developing the next generation of molecules that show promise in being able to increase current capacity.

Credit: 
York University

Water splitting observed on the nanometer scale

image: At rough areas of a catalyst surface, water is split into hydrogen and oxygen in a more energy efficient way than at smooth areas.

Image: 
MPI-P, License CC-BY-SA

It is a well-known school experiment: When a voltage is applied between two electrodes inserted in water, molecular hydrogen and oxygen are produced. To further the industrial use of this process, it is indispensable to make water splitting as energy-efficient as possible. In addition to the material of the electrode, its surface quality is a crucial aspect for the splitting efficiency. In particular, rough spots only a few nanometers - i.e. millionths of a millimeter - in size, called reactive centers, determine the electrochemical reactivity of an electrode.

Previous investigation methods were not accurate enough to follow chemical reactions taking place at such reactive centers on the electrode surface with sufficient spatial resolution under real operating conditions, i.e. in electrolyte solution at room temperature and with an applied voltage. A team of scientists led by Dr. Katrin Domke, independent Boehringer Ingelheim "Plus 3" group leader at the MPI-P, has now developed a new method with which the initial steps of electrocatalytic water splitting on a gold surface could be studied for the first time with a spatial resolution of less than 10 nm under operating conditions.

"We were able to show experimentally that surfaces with protrusions in the nanometer range split water in a more energy efficient way than flat surfaces," says Katrin Domke. "With our images, we can follow the catalytic activity of the reactive centers during the initial steps of water splitting".

For their method, they have combined different techniques: In Raman spectroscopy, molecules are illuminated with light that they scatter. The scattered light spectrum contains information that provides a chemical fingerprint of the molecule, enabling the identification of chemical species. However, Raman spectroscopy is typically a technique that produces only very weak and, moreover, only spatially averaged signals over hundreds or thousands of nanometers.

For this reason, the researchers have combined the Raman technique with scanning tunneling microscopy: by scanning a nanometer-thin gold tip illuminated with laser light over the surface under investigation, the Raman signal is amplified by many orders of magnitude directly at the tip apex, which acts like an antenna. This strong enhancement effect enables the investigation of very few molecules at a time. Furthermore, the tight focusing of the light by the tip leads to a spatial optical resolution of less than ten nanometers. The distinctive feature of the apparatus is that it can be operated under realistic electrocatalytic operating conditions.

"We were able to show that during water splitting at nanometer rough spots - i.e. a reactive centers - two different gold oxides are formed, which could represent important intermediates in the separation of the oxygen atom from the hydrogen atoms," says Domke. With their investigations, it is now possible to gain a more precise insight into the processes taking place on the nanometer scale on reactive surfaces and facilitate the design of more efficient electrocatalysts in the future, where less energy is needed to split water into hydrogen and oxygen.

Credit: 
Max Planck Institute for Polymer Research

Study: Organic molecules discovered by Curiosity Rover consistent with early life on Mars

PULLMAN, Wash. - Organic compounds called thiophenes are found on Earth in coal, crude oil and, oddly enough, in white truffles, the mushroom beloved by epicureans and wild pigs.

Thiophenes were also recently discovered on Mars, and Washington State University astrobiologist Dirk Schulze-Makuch thinks their presence would be consistent with early life on Mars.

Schulze-Makuch and Jacob Heinz with the Technische Universität in Berlin explore some of the possible pathways for thiophenes' origins on the red planet in a new paper published in the journal Astrobiology. Their work suggests that a biological process, most likely involving bacteria rather than truffles, may have played a role in the organic compound's existence in the Martian soil.

"We identified several biological pathways for thiophenes that seem more likely than chemical ones, but we still need proof," Dirk Schulze-Makuch said. "If you find thiophenes on Earth, then you would think they are biological, but on Mars, of course, the bar to prove that has to be quite a bit higher."

Thiophene molecules have four carbon atoms and a sulfur atom arranged in a ring, and both carbon and sulfur are bio-essential elements. Yet Schulze-Makuch and Heinz could not exclude non-biological processes leading to the existence of these compounds on Mars.

Meteor impacts provide one possible abiotic explanation. Thiophenes can also be created through thermochemical sulfate reduction, a process that involves a set of compounds being heated to 248 degrees Fahrenheit (120 degrees Celsius) or more.

In the biological scenario, bacteria, which may have existed more than three billion years ago when Mars was warmer and wetter, could have facilitated a sulfate reduction process that results in thiophenes. There are also other pathways where the thiophenes themselves are broken down by bacteria.

While the Curiosity Rover has provided many clues, it uses techniques that break larger molecules up into components, so scientists can only look at the resulting fragments.

Further evidence should come from the next rover, the Rosalind Franklin, which is expected to launch in July 2020. It will be carrying a Mars Organic Molecule Analyzer, or MOMA, which uses a less destructive analyzing method that will allow for the collection of larger molecules.

Schulze-Makuch and Heinz recommend using the data collected by the next rover to look at carbon and sulfur isotopes. Isotopes are variations of the chemical elements that have different numbers of neutrons than the typical form, resulting in differences in mass.

"Organisms are 'lazy'. They would rather use the light isotope variations of the element because it costs them less energy," he said.

Organisms alter the ratios of heavy and light isotopes in the compounds they produce: the resulting ratios can differ substantially from those of the compounds' building blocks, which Schulze-Makuch calls "a telltale signal for life."
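Such isotopic shifts are conventionally reported in delta notation (standard geochemical practice, not a formula from the Astrobiology paper); for carbon:

```latex
\delta^{13}\mathrm{C} = \left( \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1 \right) \times 1000
```

The value is expressed in parts per thousand; because enzymes prefer the lighter isotope, biologically produced carbon compounds typically show distinctly negative delta values relative to their source material.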

Yet even if the next rover returns this isotopic evidence, it may still not be enough to prove definitively that there is, or once was, life on Mars.

"As Carl Sagan said 'extraordinary claims require extraordinary evidence,'" Schulze-Makuch said. "I think the proof will really require that we actually send people there, and an astronaut looks through a microscope and sees a moving microbe."

Credit: 
Washington State University

Engineered bone marrow cells slow growth of prostate and pancreatic cancer cells

image: In experiments with mice, researchers at the Johns Hopkins Kimmel Cancer Center say they have slowed the growth of transplanted human prostate and pancreatic cancer cells by introducing bone marrow cells with a specific gene deletion to induce a novel immune response.

Image: 
Alan Friedman, M.D.

In experiments with mice, researchers at the Johns Hopkins Kimmel Cancer Center say they have slowed the growth of transplanted human prostate and pancreatic cancer cells by introducing bone marrow cells with a specific gene deletion to induce a novel immune response.

The results, described January 2020 in the Journal for ImmunoTherapy of Cancer, suggest that the technique -- a type of adoptive cell therapy -- could target such cancers in humans, using patients' own marrow cells.

"Building on these mouse studies, this approach may offer a unique method to activate patient immune systems, including T cells, against cancer," says pediatric oncologist Alan Friedman, M.D., who is the King Fahd Professor of Pediatric Oncology at the Johns Hopkins Kimmel Cancer Center.

Previous research, Friedman says, has shown that macrophages and dendritic cells, both crucial to immune response, are more likely to help mount an inflammatory fight when they lack the gene known as NF-κB p50, which encodes a transcription factor. Transcription factors are proteins that help turn specific genes "on" or "off" by binding to nearby DNA.

Past studies also showed that melanoma, fibrosarcoma, colon cancer and the brain cancer glioblastoma grow slower in mice lacking the transcription factor. The new study, Friedman says, is believed to be the first to show the same slower growth effect -- in mice lacking p50 -- for prostate and pancreatic ductal cancers.

To develop and test their technique of adoptive cell therapy, the researchers first grew immature myeloid cells taken from the marrow of p50-deficient mice and compared them with cells from mice having the p50 gene. Myeloid cells are a class of white blood cells. Immature myeloid cells, which include macrophage and dendritic cell precursors, were chosen because past studies have confirmed that these particular cells increase the likelihood of activating an anti-tumor immune response.

After inoculating both groups of mice with human prostate or pancreatic cancer cells, the investigators injected the immature myeloid cells after pretreating the animals with a widely used anti-cancer medication known as 5-fluorouracil. That drug is known to cause a drop in the number of circulating normal myeloid blood cells, reducing competition with injected cells; target myeloid cells in tumors that suppress immune response; and sometimes release antigens that T cells recognize, triggering those immune cells to attack tumors.

The researchers found that "adoptive transfer" of the p50-negative cells in combination with a dose of the 5-fluorouracil produced the best results. Compared with what happened in the mice given cells with intact p50, the tumors grew at least three times more slowly in 13 of 14 (93%) of the prostate cancers and in eight of 15 (53%) pancreatic cancers. The treatment also produced what the researchers called "striking" pancreatic cancer regression in mice that responded, with up to a tenfold reduction in tumor size, according to the researchers.

The researchers report that the transferred p50-negative myeloid cells generated tumor and lymph node macrophages and dendritic cells programmed to help the immune system fight the cancer by activating T cells. When T cells were depleted to test whether their activation was directly associated with the decrease in tumor growth, the depletion eliminated the effectiveness of the cell transfer. A similar effect was also reported in p50-negative mice with colon cancer, where T cell depletion increased their cancer development to that of the comparison mice, and in p50-negative mice with glioblastoma, whose survival advantage over wild-type mice was eliminated when their T cells were depleted. The research surrounding NF-κB p50 shows promise in more types of cancer, says Friedman. "Seven different cancers -- prostate cancer, pancreatic cancer, brain cancer, melanoma, colon cancer, sarcoma and neuroblastoma -- tested by us and others grew slower in mice lacking NF-κB p50."

Credit: 
Johns Hopkins Medicine

Solving a mystery in 126 dimensions

image: An image of how the 126-dimensional wave function tile is cross-sectioned into our three dimensions 42 times, once for each electron. This shows the domain of each electron in that tile.

Image: 
UNSW Sydney

One of the fundamental mysteries of chemistry has been solved by a collaboration between Exciton Science, UNSW and CSIRO – and the result may have implications for future designs of solar cells, organic light-emitting diodes and other next-gen technologies.

Ever since the 1930s, debate has raged inside chemistry circles concerning the fundamental electronic structure of benzene. It is a debate that in recent years has taken on added urgency, because benzene – which comprises six carbon atoms matched with six hydrogen atoms – is the fundamental building block of many opto-electronic materials, which are revolutionising renewable energy and telecommunications tech.

The flat hexagonal ring is also a component of DNA, proteins, wood and petroleum.

The controversy around the structure of the molecule arises because, although it has few atomic components, its electrons exist in a state comprising not just four dimensions – like our everyday “big” world – but 126: three spatial coordinates for each of the molecule's 42 electrons.

Analysing a system that complex has until now proved impossible, meaning that the precise behaviour of benzene electrons could not be discovered. And that represented a problem, because without that information, the stability of the molecule in tech applications could never be wholly understood.

Now, however, scientists led by Timothy Schmidt from the ARC Centre of Excellence in Exciton Science and UNSW Sydney have succeeded in unravelling the mystery – and the results came as a surprise. They have now been published in the journal Nature Communications.

Professor Schmidt, with colleagues from UNSW and CSIRO’s Data61, applied a complex algorithm-based method called dynamic Voronoi Metropolis sampling (DVMS) to benzene molecules in order to map their wavefunctions across all 126 dimensions.

Key to unravelling the complex problem was a new mathematical algorithm developed by co-author Dr Phil Kilby from CSIRO’s Data61. The algorithm allows the scientists to partition the dimensional space into equivalent “tiles”, each corresponding to a permutation of electron positions.
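As a loose illustration of the tiling idea (a toy sketch of the concept, not Dr Kilby's actual algorithm or DVMS itself): a configuration of benzene's 42 electrons is a point in 42 × 3 = 126 dimensions, and configurations that differ only by relabeling the identical electrons belong to the same tile, which one can make explicit by mapping every sample to a canonical representative:

```python
# Toy version of permutation tiling: electron configurations that differ only
# by a relabeling of the 42 identical electrons are physically equivalent, so
# each sample is sent to a canonical representative of its "tile" by sorting.
import numpy as np

def canonical_tile(config):
    """config: (42, 3) electron positions; returns a permutation-invariant copy."""
    order = np.lexsort((config[:, 2], config[:, 1], config[:, 0]))
    return config[order]

rng = np.random.default_rng(1)
a = rng.normal(size=(42, 3))   # one sampled configuration: 126 coordinates
b = a[rng.permutation(42)]     # the same configuration with electrons relabeled

assert np.allclose(canonical_tile(a), canonical_tile(b))
print("Relabeled copies land in the same tile of the 126-dimensional space.")
```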

Of particular interest to the scientists was understanding the “spin” of the electrons. All electrons have spin – it is the property that produces magnetism, among other fundamental forces – but how they interact with each other is at the base of a wide range of technologies, from light-emitting diodes to quantum computing.

“What we found was very surprising,” said Professor Schmidt. “The electrons with what’s known as up-spin double-bonded, while those with down-spin single-bonded, and vice versa.

“That isn’t how chemists think about benzene. Essentially it reduces the energy of the molecule, making it more stable, by getting electrons, which repel each other, out of each other's way.”

Co-author Phil Kilby from Data61 added: “Although developed for this chemistry context, the algorithm we developed for ‘matching with constraints’ can also be applied to a wide variety of areas, from staff rostering to kidney exchange programs.”

Journal: Nature Communications
DOI: 10.1038/s41467-020-15039-9

Credit: 
University of New South Wales

Freeze-dried soil is more suitable for studying soil reactive nitrogen gas emissions

image: Oven-drying is a commonly used method of soil drying, but scientists are now concerned about whether oven-dried samples can represent the "real" results.

Image: 
Dianming Wu

Earth's atmosphere and climate change are strongly affected by gas exchange between land and atmosphere. Reactive nitrogen (Nr) gas emissions from soils, e.g., nitrous acid (HONO) and nitric oxide (NO), play a significant role in atmospheric chemistry and also constitute a key process of the global nitrogen (N) cycle.

To understand the underlying mechanisms of soil Nr emissions, air-dried or oven-dried soils are commonly used in the laboratory. To date, few studies have compared the effects of different drying methods on soil Nr gas fluxes and N fractions.

In a paper recently published in Atmospheric and Oceanic Science Letters, Dr. Dianming Wu, from the School of Geographic Sciences, East China Normal University, and his coauthors try to identify the best approach to treating soil samples.

"We evaluated soil water content, pH, (in)organic N content, and Nr gas fluxes of air-dried, freeze-dried, oven-dried, and fresh soils from different land-use types," says Dr. Wu.

According to this study, all drying methods increased the soil ammonium, nitrate, and dissolved organic N contents compared with fresh soil. However, freeze-dried soil was closest to fresh soil in pH and in the maximum HONO and NO fluxes and total emissions over a full wetting-drying cycle, while air-drying and oven-drying significantly increased Nr gas fluxes. Therefore, global soil Nr gas emissions might be overestimated if air- or oven-dried soils are used.

The study concludes that the choice of drying method should be considered carefully in studies on the land-atmosphere interface and biogeochemical N cycling, and that freeze-drying might be better for studies involving the measurement of soil Nr gas fluxes.

"The important implication of the finding is that we need to carefully evaluate the previous understanding of the mechanism of biogeochemical nitrogen cycling based on different drying methods," concludes Dr. Wu.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences