
A cheap organic steam generator to purify water

image: The water that passes through the system by evaporation becomes very high-quality drinking water.

Image: 
Thor Balkhed

It has been estimated that by 2040 a quarter of the world's children will live in regions where clean drinking water is scarce. The desalination of seawater and the purification of wastewater are two possible ways to alleviate this, and researchers at Linköping University have developed a cheap and eco-friendly steam generator that desalinates and purifies water using sunlight. The results have been published in the journal Advanced Sustainable Systems.

"The rate of steam production is 4-5 times higher than that of direct water evaporation, which means that we can purify more water", says Associate Professor Simone Fabiano, head of the Organic Nanoelectronics group in the Laboratory of Organic Electronics.

The steam generator consists of an aerogel that contains a cellulose-based structure decorated with the organic conjugated polymer PEDOT:PSS. The polymer has the ability to absorb the energy in sunlight, not least in the infrared part of the spectrum where much of the sun's heat is transported. The aerogel has a porous nanostructure, which means that large quantities of water can be absorbed into its pores.

"A 2 mm layer of this material can absorb 99% of the energy in the sun's spectrum", says Simone Fabiano.
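As a rough illustration (my own back-of-the-envelope estimate, not a calculation from the paper), the two figures quoted above imply an effective Beer-Lambert attenuation coefficient for the aerogel layer:

```python
import math

# Illustrative estimate only: if a 2 mm layer absorbs 99% of incident
# sunlight, the effective Beer-Lambert attenuation coefficient alpha
# satisfies  transmitted_fraction = exp(-alpha * thickness).
thickness_m = 2e-3       # 2 mm layer (article figure)
transmitted = 1 - 0.99   # 1% of the light gets through (article figure)

alpha = -math.log(transmitted) / thickness_m  # effective coefficient, 1/m
print(f"effective attenuation coefficient: {alpha:.0f} 1/m")

# Under the same simple model, a hypothetical 1 mm layer would absorb:
print(f"1 mm layer would absorb: {1 - math.exp(-alpha * 1e-3):.1%}")
```

The same single-coefficient model predicts a 1 mm layer would still absorb about 90%, which is why such thin layers can be practical; the real material's absorption is of course wavelength-dependent.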

A porous and insulating floating foam is also located between the water and the aerogel, such that the steam generator is kept afloat. The heat from the sun vaporises the water, while salt and other materials remain behind.

"The aerogel is durable and can be cleaned in, for example, salt water so that it can be used again immediately. This can be repeated many times. The water that passes through the system by evaporation becomes very high-quality drinking water", Tero-Petri Ruoko assures us. He is a postdoc in the Laboratory of Organic Electronics and one of the authors of the article.

"What's particularly nice about this system is that all the materials are eco-friendly - we use nanocellulose and a polymer that has a very low impact on the environment and people. We also use very small amounts of material: the aerogel is made up of 90% air. We hope and believe that our results can help the millions of people who don't have access to clean water", says Simone Fabiano.

The aerogel was developed by Shaobo Han within the framework of his doctoral studies in the Laboratory of Organic Electronics, under Professor Xavier Crispin's supervision. The result was presented in the journal Advanced Science in 2019, and is described at the link below. After taking his doctoral degree, Shaobo Han has returned to China to continue research in the field.

Credit: 
Linköping University

Cable bacteria can drastically reduce greenhouse gas emissions from rice cultivation

image: Rice plants in soil without cable bacteria (left) and with cable bacteria (right). The activity of cable bacteria can be clearly seen by the formation of an orange rust crust on the surface of the soil in the rice pot. The bacteria dissolve black iron sulphide in the soil and convert the sulphide into sulphate while the iron wanders to the surface and forms rust when it comes into contact with oxygen.

Image: 
Vincent Valentin Scholz, AU

A Danish-German research collaboration may have found a solution to the large climate impact of the world's rice production: by adding electrically conductive cable bacteria to soil with rice plants, they could reduce methane emissions by more than 90%.

Half of the world's population is nourished by rice crops, but rice cultivation is harsh on the climate. Rice fields account for five percent of global emissions of the greenhouse gas methane, which is 25 times more potent than CO2.

This is because the rice plants grow in water. When the fields are flooded, the soil becomes poor in oxygen, creating the right conditions for microorganisms to produce methane.

Now researchers from Aarhus University and the University of Duisburg-Essen have found that cable bacteria could be an important part of the solution. In the laboratory, they have grown rice in soil with and without cable bacteria and measured what happened.

"And the difference was far beyond my expectations. The pots with cable bacteria emitted 93% less methane than the pots without cable bacteria," says Vincent Valentin Scholz, who conducted the experiments as a PhD student at the Center for Electromicrobiology (CEM) at Aarhus University.
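Combining the article's two figures (methane roughly 25 times stronger than CO2, and a 93% emission cut with cable bacteria), a toy calculation illustrates the scale of the potential savings; the methane tonnage below is purely hypothetical:

```python
# Toy CO2-equivalent calculation using the article's figures.
# The per-field methane figure is purely hypothetical.
GWP_CH4 = 25        # warming potential of methane vs CO2 (article figure)
REDUCTION = 0.93    # emission cut observed with cable bacteria

methane_t = 100.0   # hypothetical: tonnes of CH4 emitted per year

baseline_co2e = methane_t * GWP_CH4
with_bacteria_co2e = methane_t * (1 - REDUCTION) * GWP_CH4

print(f"baseline:            {baseline_co2e:.0f} t CO2e")
print(f"with cable bacteria: {with_bacteria_co2e:.0f} t CO2e")
print(f"avoided:             {baseline_co2e - with_bacteria_co2e:.0f} t CO2e")
```

For these hypothetical 100 tonnes of methane, the 93% cut would avoid 2,325 of 2,500 tonnes of CO2-equivalent per year.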

The result is published today in the scientific journal Nature Communications.

Increases sulfate and attenuates microbes

"Cable bacteria transport electrons over centimeter distances along their filaments, changing the geochemical conditions of the water-saturated soil. The cable bacteria recycle the soil's sulfur compounds, thus maintaining a large amount of sulfate in the soil. This has the consequence that the methane-producing microbes cannot maintain their activity", explains Vincent Valentin Scholz.

It is already known that the rice growers can temporarily slow down the emission of methane by spreading sulfate on the rice fields. Apparently, the cable bacteria can do this for them - and not just temporarily.

This finding adds a new angle to the role of cable bacteria as ecosystem engineers. While the authors emphasize that this is only a first laboratory observation, it is tempting to speculate that enriching cable bacteria through sensible management of the water and soil regime could become a sustainable and convenient way to reduce methane emissions from rice fields. Of course, field studies are required to see how well cable bacteria thrive in rice fields.

Credit: 
Aarhus University

Spores, please!

image: A gypsy moth caterpillar (Lymantria dispar) relishing the spores of Melampsora larici-populina, a rust fungus that has spread on a poplar leaf. The new study shows that the insect is not only herbivorous but also fungivorous, that is, it likes to feed on nutrient-rich fungi.

Image: 
Franziska Eberl, Max Planck Institute for Chemical Ecology

Black poplar leaves infected by fungi are especially susceptible to attack by gypsy moth caterpillars. A research team at the Max Planck Institute for Chemical Ecology in Jena, Germany, has now further investigated this observation. The scientists found that the young larvae of this herbivore upgrade their diet with fungal food: Caterpillars that fed on leaves covered with fungal spores grew faster and pupated a few days earlier than those feeding only on leaf tissue. The higher concentrations of important nutrients in fungi, such as amino acids, nitrogen and vitamins, are probably the reason for their better performance. The results shed new light on the co-evolution of plants and insects, in which fungi and other microorganisms play a much greater role than previously assumed (Ecology Letters. DOI: 10.1111/ele.13506).

Gypsy moth caterpillars are known as feeding generalists; this means they accept a large variety of deciduous tree species and shrubs as their food plants. Outbreaks of this species have occasionally been documented in German forest ecosystems as well.

Sybille Unsicker and her research team are investigating how poplars defend themselves against herbivores, including the gypsy moth. The scientists had observed that these trees downregulate their defense against the voracious insect when they are simultaneously being attacked by fungi. "We noticed that caterpillars are attracted by the odor of fungus-infested poplars, so we wondered why this is so: Would the caterpillars prefer to feed on infested leaves as well? Would this provide an advantage? And if so, what kind of chemicals are responsible for this?" first author Franziska Eberl asks, describing the basic questions of the study.

Feeding experiments in which the gypsy moth larvae were offered a choice of leaves with or without fungal infection revealed the clear preference of the caterpillars for leaves infected with fungi. In the early larval stage, they even consumed the fungal spores on the leaf surface before feeding on leaf tissue. "Whether rust fungi or mildew, young caterpillars selectively fed on the spores and preferred to feed on infected leaves," explains Franziska Eberl. Chemical analyses showed that mannitol, a substance that is also used as an artificial sweetener in human food, is primarily responsible for this preference. Eberl also monitored larval fitness, which is shown by how well larvae develop - a measurement that depends largely on their diet. "Larvae that consume fungus-infected leaves develop faster and also pupate earlier. This gives them an advantage over their siblings who feed on healthy leaves. Important nutrients, such as amino acids, nitrogen and B vitamins, are likely responsible for increased growth, because their concentration is higher in infected leaves," said the researcher.

The role of microorganisms puts the co-evolution of plants and insects in a new light

The observation that an insect classified as an herbivore is actually a fungivore - at least in its early larval stage - was a real surprise for the research team. "Our results suggest that microorganisms living on plants might have a more important role in the co-evolution of plants and insects than previously thought," says Sybille Unsicker, head of the study. "In the black poplar trees from our study, fungal infestation occurs every year. It is therefore indeed imaginable that herbivorous insects have been able to adapt to the additional fungal resource. Especially with regards to the longevity of trees, the evolutionary adaptation to a diet consisting of leaves and fungi seems plausible for such insects".

Further investigations are needed to clarify how widespread fungivory is in other herbivorous insect species and what influence the combination of plant and fungal food has on the immune system of insects. It is possible that this food niche also has an effect on the insects' own defense against their enemies, such as parasitoid wasps. The role of microorganisms in the interactions between plants and insects has long been underestimated, even overlooked. This study is an important step to make up for that neglect.

Credit: 
Max Planck Institute for Chemical Ecology

Often and little, or rarely and to the full?

image: When supplying energy, the method of its delivery is as important as its quantity. You probably won't manage to boil the water for your tea with a cigarette lighter. Here, PhD student Yirui Zhang illustrates this principle.

Image: 
Source: IPC PAS, Grzegorz Krzyzewski

If we were talking about food, most experts would choose the former, but in the case of energy storage the opposite is true. It turns out that more energy can be stored by charging less often, but right up to 100%.

At least, this is the conclusion arrived at from research carried out by a team of scientists at the IPC PAS. Although the studies involved idealized two-dimensional lattice systems, at the end of the day, a principle is a principle. Dr. Anna Maciołek, one of the authors of the work published in Physical Review E, describes it as follows: "We wanted to examine how the manner in which energy is stored in a system changes when we pump energy in the form of heat into it, in other words - when we heat it locally." It is known that heat spreads out and diffuses through a system. But is the accumulation of energy influenced by the way it is delivered - professionally speaking, by the "delivery alignment"? Does it matter whether we provide a lot of energy over a short period of time, none for a long time and then again a lot, or small portions of energy one after the other, almost without any breaks?

Cyclic energy supply is very common in nature. We provide ourselves with energy in just this manner by eating. The same number of calories can be provided in one or two large portions eaten during the day, or broken down into 5-7 smaller meals with shorter breaks between them. Scientists are still arguing about which regimen is better for the body.

However, when it comes to two-dimensional lattice systems, it is already known that in terms of storage efficiency the "less often and a lot" method wins. "We noticed that the amount of energy the system can store varies depending on the portion size of the energy and the frequency of its provision. The greatest amount is when the energy portions are large, but the time intervals in between their supply are also long," explains Yirui Zhang, a PhD student at the IPC PAS. "Interestingly, it turns out that if we divide this sort of storage system internally into compartments or indeed chambers, the amount of energy that can be stored in such a divided-up "battery" - if it were possible to construct - increases. In other words, three small batteries can store more energy than one large one," says the researcher. All this, assuming that the total amount of energy put into the system remains the same, and only the method of its delivery changes.

Although the research carried out by the IPC PAS team is quite basic and simply shows the fundamental principle governing energy storage in magnets, its potential applications cannot be overestimated. Let's imagine, for example, the possibility of charging an electric car battery not in a few hours, but in just under twenty minutes, or a significant increase in the capacity of such batteries without changing their volume, i.e. extending the range of the car after one charge. The new discovery may also, in the future, change the methods of charging different types of batteries by determining the optimal periodicity of supplying energy to them.

Credit: 
Institute of Physical Chemistry of the Polish Academy of Sciences

Can high-power microwaves reduce the launch cost of space-bound rockets?

image: These are rockets used in the study.

Image: 
University of Tsukuba

Tsukuba, Japan - Governments throughout the world use rockets to launch satellites and people into orbit. This currently requires a large amount of high-energy fuel, which makes up about 95% of total rocket mass. Because the launch cost of a rocket can reach 10 billion yen, launching a 1-gram payload is said to cost as much as buying 1 gram of gold. Minimizing the total cost of launching rockets would maximize the scientific payloads and increase the feasibility of space exploration.

In a study published in the Journal of Spacecraft and Rockets, researchers from the University of Tsukuba have helped solve important wireless power transmission and other efficiency issues that must be overcome to use high-powered microwaves to supplement--or nearly replace--chemical fuel for rocket launches. Their study will help researchers in this line of work properly focus their efforts.

Researchers commonly believe that a rocket requires a megawatt of beam-powered propulsion--that's approximately the power output of 10 automobiles--per kilogram of payload to reach a minimal orbit. Whether microwave transmission is sufficiently efficient for real-world applications is an open question.

Microwave beams have been transmitted by using a ground antenna that is the same size as a rocket antenna. "However, practical applications will require a large ground-based transmitter and a small receiver on the rocket, and thus variable-focus transmission," explains Assistant Professor Kohei Shimamura, lead author of the study. "We wanted to not only demonstrate this approach, but also quantify its efficiency."

In their comprehensive study, the researchers calculated the efficiencies, at short distances, of a ground-based microwave generator (51%), wireless power supply that sends the microwaves to the rocket propulsion system (14%), receiving antenna on the rocket (34%), and propulsion device that uses the microwave energy to heat the rocket propellant (6%). "Researchers can now put numbers on how efficient variable-focus transmission is at present," says Associate Professor Tsuyoshi Kariya, the other main author of the study.
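Because each stage feeds the next, the quoted stage efficiencies multiply; a short sketch (the stage labels are my paraphrase of the article's descriptions) gives the resulting end-to-end figure at short distances:

```python
# Stage efficiencies reported in the study; chained stages multiply.
stages = {
    "microwave generator": 0.51,
    "wireless power transmission": 0.14,
    "receiving antenna on the rocket": 0.34,
    "propulsion device (propellant heating)": 0.06,
}

total = 1.0
for name, eff in stages.items():
    total *= eff
    print(f"after {name}: {total:.4%}")

# end-to-end efficiency is roughly 0.15% at short distances
```

The chain comes out at roughly 0.15% overall, which makes clear why the authors single out long-distance transmission efficiency as the key target for future work.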

Future research will need to study and improve efficiencies at long distances. In the words of Assistant Professor Shimamura: "This is a difficult challenge, but an important next step in advancing microwave technology to practical use in rocket launches."

Rockets are essential technology, but their launching cost is a major disadvantage for scientific missions. With future research, high-power microwaves may one day be a low-cost method of rocket propulsion.

Credit: 
University of Tsukuba

Type 2 diabetes: Too much glucagon when α-cells become insulin resistant

Patients with type 2 diabetes secrete not only too little insulin but also too much glucagon, which contributes to poor blood glucose control. A new study from Uppsala University suggests that this is because the glucagon-secreting α-cells have become resistant to insulin.

In healthy individuals, insulin signals the body to absorb glucose, thereby reducing the sugar in the blood and providing energy to tissues. In patients with type 2 diabetes this mechanism fails, because the glucose-absorbing tissues become resistant to insulin and because too little of the hormone is released into the blood. This leads to elevated blood glucose and long-term complications that often become disabling or even life-threatening.

Often, type 2 diabetics also have elevated levels of glucagon, another hormone that is released by the pancreas. Glucagon counteracts the effects of insulin by instructing the liver to release stored glucose into the blood. After a meal, the release of glucagon is normally blocked to prevent excessive production of glucose by the liver. When this fails in diabetic patients, too much glucagon contributes to a vicious cycle that exacerbates the already high blood sugar levels of diabetics. Despite this vital function of glucagon, relatively little is known about how its release is regulated. Using advanced microscopy techniques, a team led by Omar Hmeadi in Sebastian Barg's research group at Uppsala University now adds insight into how glucagon-producing α-cells are controlled by glucose.

As expected, the experiments showed that glucagon is secreted during periods of low glucose, while high levels of the sugar efficiently block its release. However, in α-cells of type 2 diabetics this regulation was disturbed and high glucose no longer blocked the release of glucagon. To find out why, Hmeadi and colleagues isolated the α-cells and separated them from their tissue context in the pancreas. Surprisingly, the cells now behaved in a 'diabetic' manner and continued to secrete glucagon even when glucose was elevated.

The reason, Hmeadi explains, is that α-cells are normally blocked by insulin and other hormones that are released at high blood glucose from nearby cells. When the cells are separated from each other, this cell-to-cell communication is lost and glucagon secretion proceeds even when it should not. But why do the isolated α-cells behave as if they were diabetic? It turns out that the α-cells in type 2 diabetes become resistant to insulin, much like liver, fat and muscle. The result is that glucagon release is no longer inhibited during the mealtime rise in blood glucose, and this leads to the elevated levels of the hormone in type 2 diabetes.

The researchers hope that the findings will contribute to a better understanding of human type 2 diabetes and guide the development of better treatment strategies.

Credit: 
Uppsala University

Catalyst enables reactions with the help of green light

image: Reaction flasks irradiated with green light in the laboratory of the Kekulé Institute of Organic Chemistry and Biochemistry at the University of Bonn.

Image: 
© Photo: Zhenhua Zhang

For the first time, chemists at the University of Bonn and Lehigh University in Bethlehem (USA) have developed a titanium catalyst that makes light usable for selective chemical reactions. It provides a cost-effective and non-toxic alternative to the ruthenium and iridium catalysts used so far, which are based on very expensive and toxic metals. The new catalyst can be used to produce highly selective chemical products that can provide the basis for antiviral drugs or luminescent dyes, for example. The results have been published in the international edition of the journal Angewandte Chemie.

The electrons in chemical molecules are reluctant to lead a single life; they usually occur in pairs. Then they are particularly stable and do not tend to forge new partnerships in the form of new bonds. However, if some of the electrons are brought to a higher energy level with the help of light (photons), things begin to look different when it comes to this "monogamy": In such an excited state, the molecules like to donate or to accept an electron. This creates so-called "radicals" that have unpaired electrons, are highly reactive and can be used to form new bonds.

Irradiation with green light

The new catalyst is based on this principle: At its core is titanium, which is connected to a carbon ring in which the electrons are particularly mobile and can be easily excited. Green light is sufficient to use the catalyst for electron transfer to produce reactive organic intermediates that are otherwise not easily obtainable. "In the laboratory, we irradiated a reaction flask containing the titanium catalyst that can be viewed as a 'red dye' with green light," reports Prof. Dr. Andreas Gansäuer from the Kekulé Institute of Organic Chemistry and Biochemistry at the University of Bonn. "And it worked right away." The mixture generates radicals from organic molecules that initiate many reaction cycles from which a wide variety of chemical products can be produced.

A key factor in reactions with this photo redox catalyst is the wavelength of the light used for irradiation. "Ultraviolet radiation is unsuitable because it is far too energy-rich and would destroy the organic compounds," says Gansäuer. Green light from LED lamps is both mild and energy-rich enough to trigger the reaction.

Catalysts are substances that increase the speed of chemical reactions and reduce the activation energy without being consumed themselves. This means that they are available continuously and can trigger reactions that would otherwise not occur in this form. The catalyst can be tailored to the desired products depending on the organic molecule with which the titanium is bonded.

Building blocks for antiviral drugs or luminescent dyes

The new titanium catalyst facilitates the reactions of epoxides, a group of chemicals from which epoxy resins are made. These are used as adhesives or for composites. However, the scientists are not aiming for this mass product, but for the synthesis of much more valuable fine chemicals. "The titanium-based, tailor-made photo redox catalysts can for instance be used to produce building blocks for antiviral drugs or luminescent dyes," says Gansäuer. He is confident that these new catalysts provide a cost-effective and more sustainable alternative to the ruthenium and iridium catalysts used so far, which are based on very expensive and toxic metals.

The development is an international collaborative effort by Zhenhua Zhang, Tobias Hilche, Daniel Slak, Niels Rietdijk and Andreas Gansäuer from the University of Bonn and Ugochinyere N. Oloyede and Robert A. Flowers II from Lehigh University (USA). While the scientists from the University of Bonn investigated how the desired compounds could best be synthesized with the new catalyst, their colleagues from the USA carried out measurements to prove the reaction pathways. "The luminescence phenomenon really opens up interesting space to consider the design of new sustainable reactions that proceed through free radical intermediates," says Prof. Robert Flowers of Lehigh University.

Credit: 
University of Bonn

Physicists develop approach to increase performance of solar energy

Experimental condensed matter physicists in the Department of Physics at the University of Oklahoma have developed an approach to circumvent a major loss process that currently limits the efficiency of commercial solar cells.

Solar cells convert the sun's energy into electricity and are the main component of solar panels and many types of electrical devices as broad-ranging as satellites and calculators.

Members of the Photovoltaic Materials and Devices Group, led by Ian Sellers, associate professor in OU's Homer L. Dodge Department of Physics and Astronomy, along with theorists at Arizona State University led by David K. Ferry, have demonstrated a breakthrough toward the development of a hot carrier solar cell.

A hot carrier solar cell is a device that would increase the efficiency of solar cells by more than 20%, which Sellers said would be a significant breakthrough for solar energy.

"Although this device has been the source of a considerable amount of research over the last 10 to 15 years, the realization of a practical solution has thus far eluded researchers with proof-of-principle demonstrations only presented under unrealistic conditions or in materials and structures not relevant for solar cell operation," said Sellers.

Sellers says this new approach, recently published in the journal Nature Energy, demonstrates "significant progress in the realization of the hot carrier solar cell and the potential for ultra-high-efficiency single-junction semiconductor devices, which would revolutionize the field of photovoltaics and renewable energy generation."

Credit: 
University of Oklahoma

Princeton scientist solves air quality puzzle: Why does ozone linger long after its ban?

image: Meiyun Lin is a research scholar in Princeton University's Program in Atmospheric and Oceanic Sciences, the Cooperative Institute for Modeling the Earth System at Princeton University, and NOAA's Geophysical Fluid Dynamics Laboratory. She led an international team of climate researchers to address a longstanding puzzle: Why, despite laws successfully limiting pollution from cars, trucks and factories, has Europe seen little improvement in ozone air quality? Lin found the surprising chain of causes: As global climate change leads to more hot and dry weather, the resulting droughts are stressing plants, making them less able to remove ozone from the air.

Image: 
Zhiguo Zhang

When high in the atmosphere, ozone protects Earth from harmful solar radiation -- but ozone at ground level is a significant pollutant. Exposure to high concentrations of ground-level ozone aggravates respiratory illnesses, thus exacerbating the negative health effects of heat and contributing to the catastrophic impacts of recent heatwaves and drought in Europe.

In Europe, despite laws limiting pollution from cars, trucks and factories, there has been little improvement in ozone air quality. An international team led by atmospheric scientist Meiyun Lin found the surprising chain of causes: As global climate change leads to more hot and dry weather, the resulting droughts are stressing plants, making them less able to remove ozone from the air.

With hot and dry summers expected to become more frequent over the coming decades, this has significant implications for European policymakers, noted Lin, a research scholar in atmospheric and oceanic sciences and the Cooperative Institute for Modeling the Earth System at Princeton University.

In a new study published today in Nature Climate Change, Lin and her colleagues demonstrated that vegetation feedbacks during drought worsen the most severe ozone pollution episodes.

"We show that declining ozone removal by water-stressed vegetation in response to climate warming can explain the slow progress towards improving ozone air quality in Europe," she said. "Under drought stress, plants are less effective in ozone removal via stomata -- small pores in the leaves of vegetation that are responsible for controlling carbon dioxide transport for photosynthesis and water vapor losses."

Such land-biosphere feedbacks have often been overlooked in prior air quality projections. This study quantified these vegetation feedbacks using six decades of observations and new Earth system model simulations developed at the Geophysical Fluid Dynamics Laboratory, a division of the U.S. National Oceanic and Atmospheric Administration located on Princeton's Forrestal campus.

Lin and her colleagues found that severe drought stress can cause as much as 70% reductions in ozone removal by forests. "Accounting for reduced ozone removal by drought-stressed vegetation leads to a three-fold increase in high-ozone events -- above 80 parts per billion," Lin said. That is significantly worse than the European Union's ozone target: 60 parts per billion, not to be exceeded on more than 25 days per year. For reference, the U.S. standard is 70 parts per billion, not to be exceeded on more than 4 days per year.

The European Union has established an extensive body of legislation to reduce regional emissions of smog-forming chemicals from member states, but despite 45% to 70% reductions in smog-forming chemicals across a 40-year period, summertime ozone levels measured in Europe actually climbed, especially during the 1980s and '90s.

Based on their findings, Lin said, governments will need even stronger emission controls to lower ozone air pollution.

While this study focused on Europe, their findings have broad implications. Substantial reductions in ozone removal by vegetation were also observed during North America's historic heat wave and drought in summer 2012, according to an earlier study by Lin.

Over the coming decades, as the climate warms, it will be increasingly important to account for vegetation feedbacks to determine the effects of extreme pollution events, she said.

Credit: 
Princeton University

Finding leukemia's weakness using genome-wide CRISPR technology

image: UC San Diego researchers used CRISPR technology to carry out a genome-wide screen in leukemia cells to block thousands of genes at once. The tool was used to identify genes that fuel leukemia growth, like those leukemia cells (green) pictured growing within the bone marrow.

Image: 
UC San Diego Health Sciences

A team of researchers at University of California San Diego School of Medicine and Moores Cancer Center used CRISPR technology to identify key regulators of aggressive chronic myeloid leukemia, a type of cancer that remains difficult to treat and is marked by frequent relapse.

"We used CRISPR technology to carry out a genome-wide screen in leukemia cells to block thousands of genes at once. This is an extremely powerful tool that allowed us to identify a multitude of genes that fuel leukemia growth and find new vulnerabilities that can be targeted in this disease," said senior author Tannishtha Reya, PhD, professor in the departments of Pharmacology and Medicine. "The study also shows, for the first time, that whole genome CRISPR-based screens can in fact be carried out in a manner that is much more physiologically relevant: using primary cancer cells, and in the setting of the native microenvironment."

Reporting in the April 20, 2020 online edition of the journal Nature Cancer, Reya and colleagues identified RNA-binding proteins -- which normally control how, when and if cells make certain proteins -- as a key class of proteins that sustain and protect drug-resistant leukemia stem cells. The authors focused on Staufen2 (Stau2), a relatively understudied member of the RNA-binding protein family that was previously only known to control brain and nervous system development.

This work was carried out in collaboration with UC San Diego investigators Gene Yeo, PhD, and Anjana Rao, PhD, and was conducted by lead author Jeevisha Bajaj, PhD, while she was a postdoctoral fellow at UC San Diego in the Reya lab. Bajaj subsequently joined the University of Rochester Medical Center as assistant professor of biomedical genetics and a researcher at the Wilmot Cancer Institute.

The team developed a mouse model in which Stau2 was genetically deleted and found that loss of this protein led to a profound reduction in leukemia growth and propagation and markedly improved overall survival. Stau2 was also required for continued growth of primary tissue samples from patients with leukemia, indicating a conserved dependence in the human disease.

"We are particularly excited about this work because, to our knowledge, this is the first demonstration that Staufen2 is a key dependency in any cancer," said Reya, who is a member of Moores Cancer Center and the Sanford Consortium for Regenerative Medicine.

To understand how Stau2 controls cancer, researchers undertook a genome-scale computational analysis of its targets through RNA-Seq and eCLIP-Seq. This led to the discovery that this protein controls key oncogenes, such as Ras, and epigenetic regulators, such as the LSD/KDM family of proteins, which are critical drug targets being tested against leukemia and other cancers.

According to the National Cancer Institute, approximately 1.5 percent of men and women will be diagnosed with leukemia at some point during their lifetimes. While chronic myeloid leukemia (CML) can be controlled with targeted therapies, this disease can be lethal if it advances or is diagnosed in an acute "blast" phase. The findings also have implications for acute myeloid leukemia (AML) and other blood cancers.

"This work will be particularly important for the discovery of new treatments," said Bajaj. "Our genome-wide screen identified cellular signals critical for the growth of cancer, and in the future, this study will be useful to study the microenvironment, the area around the tumor that includes tissue, blood vessels and important molecular signals related to how the cancer behaves."

Credit: 
University of California - San Diego

Oak genomics proves its worth

image: Landmark 10 article collection in April 16 New Phytologist helps clarify evolution of oaks.

Image: 
New Phytologist

Lisle, Ill. (April 20) - A year and a half following the publication of the pedunculate oak genome by France's National Research Institute for Agriculture, Food and the Environment (INRAE) and The Commission for Atomic Energy and Alternative Energies (CEA), initial results based on this genomic resource were published in the April 16, 2020, issue of New Phytologist.

The 10 articles in the collection help clarify the evolution of oaks, from the deep roots of their diversification through their more recent evolution. They also identify key genes involved in oak adaptation to environmental transitions and resistance to pathogens, investigate the implications and history of oak hybridization, and trace genomic evidence for an estimated 56 million years of oak evolution. This landmark volume will be of value to tree scientists worldwide. Four of the articles were co-authored by researchers at The Morton Arboretum.

Oaks are keystone species in a wide range of forest and savanna ecosystems throughout the northern hemisphere. They are also model organisms for investigating ecological and evolutionary processes responsible for plant diversification and adaptation, especially in response to rapid environmental change. Given the high rate of global climate change, genomic approaches to understanding tree responses to the environment are particularly timely.

The volume brings to light several novel findings in tree biology that have arisen from these studies rooted in the oak genome:

- In a review of research into oaks' early and more recent evolution, evolutionary mechanisms are proposed for the high diversity and abundance of oaks across the Northern Hemisphere (Kremer and Hipp, 2020).

- Two global phylogenomic studies reconstruct the history of oak diversification across the northern hemisphere, dissecting the important role of both speciation and hybridization in shaping the diversity of oak-dominated forest communities (Crowl et al. 2020; Hipp et al. 2020).

- A synthesis of genomic findings against the backdrop of western cultural traditions suggests biological underpinnings of the oak's traditional use as a symbol of longevity, cohesiveness and robustness (Leroy et al. 2020a).

- A novel oak conservation strategy is proposed that would take advantage of hybridization in oaks, designed to facilitate the ability of these long-lived organisms to adapt to global change by naturally sharing advantageous genes across species boundaries (Cannon and Petit, 2020).

Publication of this series of research articles provides a forward-looking review of cutting-edge work being carried out in light of scientists' burgeoning understanding of the oak genome.

Credit: 
The Morton Arboretum

Diagnostic biosensor quickly detects SARS-CoV-2 from nasopharyngeal swabs

image: An artist’s rendering above shows a new test that quickly detects SARS-CoV-2 (spheres) through binding to antibodies (Y-shapes) on a field-effect transistor.

Image: 
Adapted from ACS Nano 2020, DOI: 10.1021/acsnano.0c02823

A correction to the journal article was published on Aug. 28, 2020.

According to many experts, early diagnosis and management are critical for slowing the spread of SARS-CoV-2, the new coronavirus that causes COVID-19. Therefore, the race is on to develop diagnostic tests for the virus that are faster, easier and more accurate than existing ones. Now, researchers reporting in ACS Nano have developed a field-effect transistor-based biosensor that detects SARS-CoV-2 in nasopharyngeal swabs from patients with COVID-19 in less than one minute.

Currently, most diagnostic tests for COVID-19 rely on a technique called real-time reverse transcription-polymerase chain reaction (RT-PCR), which amplifies SARS-CoV-2 RNA from patient swabs so that tiny amounts of the virus can be detected. However, the method takes at least 3 hours, including a step to prepare the viral RNA for analysis. Edmond Changkyun Park, Seung Il Kim and colleagues wanted to develop a faster diagnostic test that could analyze patient samples directly from a tube of buffer containing the swabs, without any sample preparation steps.

The team based their test on a field-effect transistor -- a sheet of graphene with high electronic conductivity. The researchers attached antibodies against the SARS-CoV-2 spike protein to the graphene. When they added either purified spike protein or cultured SARS-CoV-2 virus to the sensor, binding to the antibody caused a change in the electrical current. Next, the team tested the technique on nasopharyngeal swabs collected from patients with COVID-19 or healthy controls. Without any sample preparation, the sensor could discriminate between samples from sick and healthy patients. The new test was about 2-4 times less sensitive than RT-PCR, but different materials could be explored to improve the signal-to-noise ratio, the researchers say.
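The readout principle described above — antibody binding shifts the transistor's current — amounts to thresholding a relative current change against baseline noise. The toy sketch below illustrates only that classification idea; the currents, the 5% threshold and the function name are invented for the example and are not from the study.

```python
# Toy FET-biosensor readout: call a sample positive when the relative
# drain-current shift after antibody binding exceeds a noise threshold.
# All numbers are illustrative, not measured values from the paper.

def classify(baseline_current, measured_current, threshold=0.05):
    """Return True if |delta I / I0| exceeds the detection threshold."""
    shift = abs(measured_current - baseline_current) / baseline_current
    return shift > threshold

print(classify(1.00e-6, 1.12e-6))  # 12% shift -> positive
print(classify(1.00e-6, 1.02e-6))  # 2% shift, within noise -> negative
```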

Credit: 
American Chemical Society

Study identifies last-line antibiotic resistance in humans and pet dog

New research due to be presented at this year's European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) has identified the dangerous mcr-1 gene - which provides resistance to the last-line antibiotic colistin - in two healthy humans and a pet dog. The study is by Dr Juliana Menezes and Professor Constança Pomba from the Centre of Interdisciplinary Investigation of Animal Health of the Faculty of Veterinary Medicine, University of Lisbon, Portugal, and colleagues, and is part of the Pet Risk Consortium Project funded by the Portuguese Government.

Since first being reported in China in 2015, the mcr-1 gene has been found in various people and animals around the world. It confers resistance to colistin, an antibiotic of last resort used to treat infections from bacteria resistant to all other antibiotics. The nightmare scenario that could emerge is mcr-1 combining with already drug-resistant bacteria to create a truly untreatable infection. Other genes in the same group (mcr-2 to mcr-9) have since been identified which act in a similar manner.

In this study, the authors investigated resistance to colistin in the faecal samples of humans and pets in Portugal. Between February 2018 and October 2019, faecal samples were collected from cats and dogs and their human household members. Genetic analysis was carried out to establish the presence of five colistin resistance genes (mcr-1 to mcr-5).

The 70 households enrolled from the Lisbon region included healthy humans (n=106) living with healthy pets (n=49), pets with skin and soft tissue infections (SSTI) (n=19), and pets with urinary tract infections (UTI) (n=16). Of these, 95 faecal samples (89.6%) from humans, 45 from dogs (23 healthy, 14 with SSTIs and 8 with UTIs) and 21 from cats (18 healthy and 3 with UTIs) were positive for E. coli.

Further analysis showed colistin resistance in 5 of the 161 isolates (3%): three from healthy humans and two from dogs with skin infections. Molecular analysis revealed that three of the E. coli isolates carried the mcr-1 gene - two from healthy humans and one from a dog with a skin infection, all from different households. The two remaining colistin-resistant isolates (one from a healthy human and one from a dog with a skin infection) did not test positive for the mcr-1 to mcr-5 genes and will now be screened for the other mcr-6 to mcr-9 genetic variants.
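As a quick arithmetic check, the isolate totals and percentages above follow directly from the counts reported in the text (a minimal sketch; all figures are taken from the release):

```python
# E. coli-positive faecal samples reported in the study
humans_ecoli = 95   # of 106 human samples
dogs_ecoli = 45
cats_ecoli = 21

total_isolates = humans_ecoli + dogs_ecoli + cats_ecoli
colistin_resistant = 5

print(total_isolates)                                    # 161 isolates
print(round(100 * colistin_resistant / total_isolates))  # 3 (%)
print(round(100 * humans_ecoli / 106, 1))                # 89.6 (%)
```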

The isolates found in the faecal samples were not causing infections in humans, hence no specific treatment was recommended for them. Only the sick dog received oral amoxicillin in combination with clavulanic acid for the treatment of the skin infection.

The authors say: "To our knowledge, this is the first report of the presence of the mcr-1 gene in either a dog or healthy humans in Portugal. Further studies are needed to determine the full epidemiology of colistin resistance genes in humans and companion animals."

They add: "These humans and dogs, if in direct contact, may transmit bacteria containing the mcr-1 gene to other humans, dogs, other animals and the environment and potentially be a hazard for public health. The situation we all want to avoid at all costs is any infection totally resistant to all antibiotics, caused by bacteria already resistant to most other antibiotics also acquiring this colistin resistance gene."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Study suggests pets are not a major source of transmission of drug-resistant microbes to their owners

New research due to be presented at this year's European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) has identified genetically identical multidrug-resistant bacteria in humans and their pets, suggesting human-animal transfer is possible in this context. However, only a small number of cases were found, suggesting this is not a major source of antibiotic-resistant infections in humans. The study is by Carolin Hackmann, Institute for Hygiene and Environmental Medicine, Charité - University Hospital Berlin, Germany, and colleagues.

The project behind this study is investigating the relevance of pet care in the infection of hospital patients with multidrug-resistant organisms (MDROs), since the potential role of pets as reservoirs of MDROs remains unclear. The project focusses on the most common MDROs in pet owners: methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), 3rd generation cephalosporin-resistant Enterobacterales (3GCRE) and carbapenem-resistant Enterobacterales (CRE). The study is being performed at Charité - University Hospital Berlin, and is funded by the Federal Ministry of Health (BMG).

To assess contact with household pets as a risk factor for colonisation with one of the above-named pathogens, the authors asked pet owners about well-known risk factors and their contact with dogs and cats. This included the number of pets in the household, the closeness of contact, and any diseases and medical treatment of the pets.

To assess the genetic relationship between the human and pet MDROs, the researchers collected nasal and rectal swabs from the participants in the hospital and from their pets to test them for MDROs. Phenotypically matching MDROs in the samples of participants and their pets were tested for genetic relatedness using whole genome sequencing (WGS).

Among the first 1,500 participants, 495 participants (33%) tested positive for MDROs. In total, 296 participants (20%) owned at least 1 pet (range 1-5 pets), and 38% of these owners (112) tested positive for MDROs. There was no significant difference in the proportion of pet owners between cases and controls. Further analysis of factors concerning the health status of pets and closeness of contact to pets also showed no significant differences between cases and controls.

So far in this ongoing study, samples of 77 dogs and 71 cats from 112 pet owners have been analysed. Among these, 14% of pets (23 dogs and 1 cat) tested positive for one MDRO each. In two cases, MDROs of a dog and its owner were phenotypically matching. The matching pathogens were VR Enterococcus faecium and 3GCR Escherichia coli. The genetic analysis based on a gene-by-gene comparison approach confirmed that the matching pairs were genotypically identical in dog and owner.
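The gene-by-gene comparison mentioned above is typically done by assigning an allele number to each shared core gene and counting mismatches between the two profiles; identical profiles indicate the same strain. The sketch below illustrates only that idea - the gene names, allele numbers and profiles are invented for the example and are not the study's data.

```python
def allele_distance(profile_a, profile_b):
    """Count core genes whose allele assignments differ (cgMLST-style)."""
    shared = set(profile_a) & set(profile_b)  # compare only genes typed in both
    return sum(profile_a[g] != profile_b[g] for g in shared)

# Hypothetical allele profiles: gene -> allele number
owner_isolate = {"adk": 3, "fumC": 11, "gyrB": 4, "icd": 8}
dog_isolate   = {"adk": 3, "fumC": 11, "gyrB": 4, "icd": 8}

# Identical profiles -> distance 0, consistent with a shared strain
print(allele_distance(owner_isolate, dog_isolate))  # 0
```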

The authors conclude: "This analysis of preliminary data showed no significant difference in pet care or closeness of contact to pets between MDRO-positive and MDRO-negative hospital patients. A transmission of MDROs between human and animal was confirmed in only 1.8% of 112 pet owners and their respective pets. So far, the preliminary data does not indicate pet care as a significant risk factor for MDRO colonisation in hospital patients."
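As a quick consistency check, the percentages quoted in this study follow directly from the reported counts (a minimal sketch; all counts are taken from the text, with rounding as reported):

```python
# Figures quoted in the study summary
participants = 1500
mdro_positive = 495      # "495 participants (33%)"
pet_owners = 296         # "296 participants (20%) owned at least 1 pet"
owners_positive = 112    # "38% of these owners (112)"
transmissions = 2        # matching pairs confirmed by WGS

print(round(100 * mdro_positive / participants))        # 33 (%)
print(round(100 * pet_owners / participants))           # 20 (%)
print(round(100 * owners_positive / pet_owners))        # 38 (%)
print(round(100 * transmissions / owners_positive, 1))  # 1.8 (%)
```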

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Milky Way could be catapulting stars into its outer halo, UCI astronomers say

image: A simulated galaxy image from the FIRE-2 project, representing a structure spanning more than 200,000 light years, shows the prominent plumes of young blue stars born in gas that was originally rotating and then blown radially outward by supernova explosions.

Image: 
Courtesy of Sijie Yu / UCI

Irvine, Calif., April 20, 2020 - Though mighty, the Milky Way and galaxies of similar mass are not without scars chronicling turbulent histories. University of California, Irvine astronomers and others have shown that clusters of supernovas can cause the birth of scattered, eccentrically orbiting suns in outer stellar halos, upending commonly held notions of how star systems have formed and evolved over billions of years.

Hyper-realistic, cosmologically self-consistent computer simulations from the Feedback in Realistic Environments 2 project enabled the scientists to model the disruptions in otherwise orderly galactic rotations. The team's work is the subject of a study published today in the Monthly Notices of the Royal Astronomical Society.

"These highly accurate numerical simulations have shown us that it's likely the Milky Way has been launching stars in circumgalactic space in outflows triggered by supernova explosions," said senior author James Bullock, dean of UCI's School of Physical Sciences and a professor of physics & astronomy. "It's fascinating, because when multiple big stars die, the resulting energy can expel gas from the galaxy, which in turn cools, causing new stars to be born."

Bullock said the diffuse distribution of stars in the stellar halo that extends far outside the classical disk of a galaxy is where the "archeological record" of the system exists. Astronomers have long assumed that galaxies are assembled over lengthy periods of time as smaller star groupings come in and are dismembered by the larger body, a process that ejects some stars into distant orbits. But the UCI team is proposing "supernova feedback" as a different source for as many as 40 percent of these outer-halo stars.

Lead author Sijie Yu, a UCI Ph.D. candidate in physics, said the findings were made possible partly by the availability of a powerful new set of tools.

"The FIRE-2 simulations allow us to generate movies that make it seem as though you're observing a real galaxy," she noted. "They show us that as the galaxy center is rotating, a bubble driven by supernova feedback is developing with stars forming at its edge. It looks as though the stars are being kicked out from the center."

Bullock said he did not expect to see such an arrangement because stars are such tight, incredibly dense balls that are generally not subject to being moved relative to the background of space. "Instead, what we're witnessing is gas being pushed around," he said, "and that gas subsequently cools and makes stars on its way out."

The researchers said that while their conclusions have been drawn from simulations of galaxies forming, growing and evolving to the present day, there is actually a fair amount of observational evidence that stars are forming in outflows from galactic centers to their halos.

"In plots that compare data from the European Space Agency's Gaia mission - which provides a 3D velocity chart of stars in the Milky Way - with other maps that show stellar density and metallicity, we can see structures similar to those produced by outflow stars in our simulations," Yu said.

Bullock added that mature, heavier, metal-rich stars like our sun rotate around the center of the galaxy at a predictable speed and trajectory. But the low-metallicity stars, which have been subjected to fewer generations of fusion than our sun, can be seen rotating in the opposite direction.

He said that over the lifespan of a galaxy, the number of stars produced in supernova bubble outflows is small, around 2 percent. But during the parts of galaxies' histories when starburst events are booming, as many as 20 percent of stars are being formed this way.

"There are some current projects looking at galaxies that are considered to be very 'starbursting' right now," Yu said. "Some of the stars in these observations also look suspiciously like they're getting ejected from the center."

Credit: 
University of California - Irvine