Culture

Promiscuity in the Paleozoic: Researchers uncover clues about vertebrate evolution

image: A diagram showing the evolution of vertebrate chromosomes from an invertebrate ancestor through two whole-genome duplication events (shown as starbursts). For simplicity, only one ancestral pair of chromosomes is shown. The study found that two now-extinct ancestors, alpha and beta, hybridized during duplication #2 to give rise to the jawed vertebrate lineage. After duplication, the beta chromosomes became more degraded than the more robust alpha chromosomes.

Image: 
OIST

To look at how life evolved, scientists usually turn to the fossil record, but this record is often incomplete. Researchers from the Okinawa Institute of Science and Technology Graduate University (OIST), alongside an international team of collaborators, have used another tool - the chromosomes of living animals - to uncover clues about our past. The study, published in Nature Ecology and Evolution, reveals early events in the evolution of vertebrates, including how jawed vertebrates arose through hybridization between two species of primitive fish.

"It's remarkable that although these events occurred almost half a billion years ago, we can figure them out by looking at DNA today," said Professor Daniel Rokhsar, who leads OIST's Molecular Genetics Unit.

Reading the stories in our genes

Chromosomes are tiny structures that carry an organism's genetic material. They normally come in paired sets, with one set inherited from each parent. While humans have 23 pairs, this number varies across species.

The study found that, even over hundreds of millions of years, chromosomes can be surprisingly stable. Although mutations and rearrangements have occurred, the chromosomes of modern animals have striking similarities to each other.

"We can use these similarities to trace our evolution and infer biology from the distant past," said Professor Rokhsar, "If a group of genes is carried together on the same chromosomes in two very different animals - say, snails and sea stars - then these genes were also likely together on the same chromosome in their last common ancestor."

Two former OIST postdoctoral scholars, Professor Oleg Simakov, now at the University of Vienna, and Dr. Ferdinand Marlétaz, now at University College London, led the study that compared the chromosomes of amphioxus, a small marine invertebrate, to those of other animals, including mollusks, mammals, birds, frogs, fish, and lampreys.

After accounting for a handful of rearrangements, they concluded that the chromosomes of amphioxus resemble those of long-extinct early vertebrate ancestors and confirmed the existence of 17 ancient chromosomal units. The researchers then traced the evolution of these ancient chromosomes in living vertebrates.

"Reconstructing the ancestral chromosomes was the key that allowed us to unlock several puzzles of early vertebrate evolution," said Professor Rokhsar.

Duplicating and disappearing

The puzzles center on a phenomenon known as 'genome duplication.' In the 1970s, geneticist Susumu Ohno suggested that vertebrate genomes were doubled, perhaps repeatedly, relative to their invertebrate ancestors. Genomic studies have confirmed and refined this suggestion, but how many doublings there were, and how and when they occurred, are still debated.

Part of the challenge is that duplicated genomes change rapidly, and these changes can obscure the duplication itself. Although a doubled genome starts with redundant copies of every gene, most of these extra copies will be inactivated by mutation and eventually lost; the doubled chromosomes themselves may also become scrambled.

Using the 17 ancestral chromosome pairs as an ancient anchor, the researchers concluded that there were two separate instances of genome doubling.

The first duplication is shared by all living vertebrates - both the jawed vertebrates, including humans, birds, fish, and frogs, and the jawless lampreys and their relatives. The researchers inferred that this most ancient duplication occurred about five hundred million years ago, around the same time the earliest vertebrate fossils appear.

The second duplication is shared only by jawed vertebrates. The researchers found that, unlike the first event, gene loss after the second doubling occurred unevenly across the two sets of chromosomal copies - a surprising but informative feature.

"This kind of uneven gene loss is the hallmark of a genome duplication that follows the hybridization of two species," said Professor Rokhsar.

Usually, the hybrid offspring of two different species are infertile, in part because the chromosomes from the two parents aren't properly coordinated. But very occasionally, in some fish, frogs, and plants, the hybrid genome becomes doubled to restore chromosomal pairing. The resulting offspring have twice as many chromosomes as their mismatched parents - and are often more vigorous. The new study unexpectedly found that such hybrid-doubling occurred in our ancient ancestors.

"Over 450 million years ago, two different species of fish mated and, in the process, spawned a new hybrid species with twice as many chromosomes," said Professor Rokhsar, "And this new species would become the ancestor of all living jawed animals - including us!"

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Extra payments motivate sobriety and employment among people recovering from addiction

After a yearlong study of people with opioid dependence, Johns Hopkins Medicine researchers report evidence that adding $8 an hour to their paychecks may help those in recovery stay drug free longer, as well as encourage them to get and hold regular jobs.

The researchers say poverty is an independent risk factor for drug abuse that treatment plans largely ignore.

In a report on the study, described in the Journal of Epidemiology & Community Health online Feb. 21, the investigators say their intervention could be used widely in low-income neighborhoods as a way to promote employment, reduce drug use and help those ravaged by the opioid crisis better integrate back into their communities.

"We were hoping to have a positive result from our study, but I don't think we expected it to work quite so well," says August Holtyn, Ph.D., assistant professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine.

Holtyn says this financial stimulus was the brainchild of Kenneth Silverman, Ph.D., professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine. For more than two decades, he has been developing therapeutic workplace strategies to help promote employment while reducing drug use.

For this study, the research team recruited 91 participants from the Center for Learning and Health on the Johns Hopkins Bayview Medical Center campus in Baltimore, Maryland. The participants were ages 21 to 74 and were paid $8 an hour with an optional $2 bonus as they went through three months of job training and drug testing. All participants were then invited to work with an employment specialist to help them get a job in the community.

About 56% of the participants were African American, about 40% were white and 55% were men. All participants were being treated with methadone or buprenorphine for opioid dependence.

Of the participants, 98% lived in poverty according to the 2018 poverty thresholds set by the U.S. Census Bureau, and almost 75% were unemployed for at least three years before the study.

Forty-four participants were randomly chosen to receive a wage supplement of $8 per hour after the first three months, and the 47 people in the other group received no extra financial incentives beyond the first three months of training and drug testing.

All participants underwent regular urine screens three times each week that became more intermittent if the person remained drug-free. If a participant receiving the wage supplement tested positive for cocaine or opioids, the wage supplement was reduced to $1 and more frequent testing resumed. For each day participants were drug free, the wage supplement rose $1 per hour until reaching the maximum of $8 per hour.
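For readers who want the incentive rule spelled out, here is a minimal sketch of the adjustment scheme described above: a positive screen drops the hourly supplement to $1, and each subsequent drug-free day restores $1 up to the $8 maximum. The day-level granularity and the function name are illustrative assumptions, not details taken from the study protocol.

```python
# Illustrative sketch of the wage-supplement adjustment rule described above.
# A positive urine test drops the supplement to $1/hour; each subsequent
# drug-free day raises it by $1/hour up to the $8/hour maximum.
# Day-by-day granularity and the function name are assumptions.

MAX_SUPPLEMENT = 8   # dollars per hour
MIN_SUPPLEMENT = 1   # dollars per hour after a positive test

def update_supplement(current: int, tested_positive: bool) -> int:
    """Return the next day's hourly wage supplement."""
    if tested_positive:
        return MIN_SUPPLEMENT                     # reset after a positive screen
    return min(current + 1, MAX_SUPPLEMENT)       # rise $1/day while drug free

# Example: one positive test followed by a week of negative screens.
supplement = MAX_SUPPLEMENT
for positive in [True, False, False, False, False, False, False, False]:
    supplement = update_supplement(supplement, positive)
    print(supplement)   # prints 1, 2, 3, 4, 5, 6, 7, 8
```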

During the yearlong intervention, 65% of people with wage supplements provided urine samples free of opioids and cocaine, compared to 45% of those without wage supplements.

People with wage supplements were also 2.9 times more likely to get a job and 2.7 times more likely to rise out of poverty by the end of the year than those without wage supplements.

"Following this study, the vast majority of unemployed people that participated entered into the workforce with minimum wage positions, so we think the wage supplement is helpful in providing motivation to keep jobs that at times can be stressful and difficult," says Holtyn. She says participants worked in construction, grocery stores, the food service industry, house cleaning and delivery.

The researchers caution that the wage supplements and drug testing ended with the study period, so they will continue to follow participants for another year to see whether the reduced drug use and consistent employment persist. They are also testing their wage supplement intervention with homeless adults who have alcohol use disorder.

Credit: 
Johns Hopkins Medicine

A cheap organic steam generator to purify water

image: The water that passes through the system by evaporation becomes very high-quality drinking water.

Image: 
Thor Balkhed

It has been estimated that in 2040 a quarter of the world's children will live in regions where clean and drinkable water is lacking. The desalination of seawater and the purification of wastewater are two possible methods to alleviate this, and researchers at Linköping University have developed a cheap and eco-friendly steam generator to desalinate and purify water using sunlight. The results have been published in the journal Advanced Sustainable Systems.

"The rate of steam production is 4-5 times higher than that of direct water evaporation, which means that we can purify more water", says Associated Professor Simone Fabiano, head of the Organic Nanoelectronics group in the Laboratory of Organic Electronics.

The steam generator consists of an aerogel that contains a cellulose-based structure decorated with the organic conjugated polymer PEDOT:PSS. The polymer has the ability to absorb the energy in sunlight, not least in the infrared part of the spectrum where much of the sun's heat is transported. The aerogel has a porous nanostructure, which means that large quantities of water can be absorbed into its pores.

"A 2 mm layer of this material can absorb 99% of the energy in the sun's spectrum", says Simone Fabiano.

A porous and insulating floating foam is also located between the water and the aerogel, such that the steam generator is kept afloat. The heat from the sun vaporises the water, while salt and other materials remain behind.

"The aerogel is durable and can be cleaned in, for example, salt water such that it can be used again immediately. This can be repeated many times. The water that passes through the system by evaporation becomes very high-quality drinking water", Tero-Petri Ruoko assures us. He is postdoc in the Laboratory of Organic Electronics and one of the authors of the article.

"What's particularly nice about this system is that all the materials are eco-friendly - we use nanocellulose and a polymer that has a very low impact on the environmental and people. We also use very small amounts material: the aerogel is made up of 90% air. We hope and believe that our results can help the millions of people who don't have access to clean water", says Simone Fabiano.

The aerogel was developed by Shaobo Han within the framework of his doctoral studies in the Laboratory of Organic Electronics, under Professor Xavier Crispin's supervision. The result was presented in the journal Advanced Science in 2019. After taking his doctoral degree, Shaobo Han returned to China to continue research in the field.

Credit: 
Linköping University

Cable bacteria can drastically reduce greenhouse gas emissions from rice cultivation

image: Rice plants in soil without cable bacteria (left) and with cable bacteria (right). The activity of cable bacteria can be clearly seen by the formation of an orange rust crust on the surface of the soil in the rice pot. The bacteria dissolve black iron sulphide in the soil, converting the sulphide into sulphate, while the iron migrates to the surface and forms rust when it comes into contact with oxygen.

Image: 
Vincent Valentin Scholz, AU

A Danish-German research collaboration may have found a solution to the large climate impact of the world's rice production: By adding electrically conductive cable bacteria to soil with rice plants, they could reduce methane emissions by more than 90%.

Half of the world's population is nourished by rice crops, but rice cultivation is hard on the climate. Rice fields account for five percent of global emissions of the greenhouse gas methane, which is 25 times stronger than CO2.
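As a rough back-of-the-envelope illustration of what the 25x figure means, the sketch below converts a hypothetical methane emission into CO2-equivalents and applies the 93% reduction the researchers report later in the article. The methane tonnage is invented for illustration only.

```python
# Back-of-the-envelope CO2-equivalent arithmetic using the GWP of 25 cited above.
# The methane tonnage is a made-up example, not a measured figure.
GWP_METHANE = 25          # warming potential relative to CO2 (per the article)
methane_tonnes = 1_000    # hypothetical annual emission from a rice-growing area

co2_equivalent = methane_tonnes * GWP_METHANE
after_cable_bacteria = co2_equivalent * (1 - 0.93)   # 93% reduction seen in the pot experiments

print(f"{co2_equivalent:,} t CO2-eq before, {after_cable_bacteria:,.0f} t CO2-eq after")
# 25,000 t CO2-eq before, 1,750 t CO2-eq after
```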

This is because the rice plants grow in water. When the fields are flooded, the soil becomes poor in oxygen, creating the right conditions for microorganisms to produce methane.

Now researchers from Aarhus University and the University of Duisburg-Essen have found that cable bacteria could be an important part of the solution. In the laboratory, they have grown rice in soil with and without cable bacteria and measured what happened.

"And the difference was far beyond my expectations. The pots with cable bacteria emitted 93% less methane than the pots without cable bacteria, "says Vincent Valentin Scholz, who conducted the experiments as a PhD student at the Center for Electromicrobiology (CEM) at Aarhus University.

The result is published today in the scientific journal Nature Communications.

Increases sulfate and attenuates microbes

"Cable bacteria transport electrons over centimeter distances along their filaments, changing the geochemical conditions of the water-saturated soil. The cable bacteria recycle the soil's sulfur compounds, thus maintaining a large amount of sulfate in the soil. This has the consequence that the methane-producing microbes cannot maintain their activity", explains Vincent Valentin Scholz.

It is already known that the rice growers can temporarily slow down the emission of methane by spreading sulfate on the rice fields. Apparently, the cable bacteria can do this for them - and not just temporarily.

This finding adds a new angle to the role of cable bacteria as ecosystem engineers. While the authors emphasize that this is only a first laboratory observation, it is tempting to speculate that enriching cable bacteria through sensible management of the water and soil regime could become a sustainable and convenient way to reduce methane emissions from rice fields. Of course, field studies are needed to see how well cable bacteria can thrive in rice fields.

Credit: 
Aarhus University

Spores, please!

image: A gypsy moth caterpillar (Lymantria dispar) relishing the spores of Melampsora larici-populina, a rust fungus that has spread on a poplar leaf. The new study shows that the insect is not only herbivorous, but also fungivorous, that is, it likes to feed on nutrient-rich fungi.

Image: 
Franziska Eberl, Max Planck Institute for Chemical Ecology

Black poplar leaves infected by fungi are especially susceptible to attack by gypsy moth caterpillars. A research team at the Max Planck Institute for Chemical Ecology in Jena, Germany, has now further investigated this observation. The scientists found that the young larvae of this herbivore upgrade their diet with fungal food: Caterpillars that fed on leaves covered with fungal spores grew faster and pupated a few days earlier than those feeding only on leaf tissue. The higher concentrations of important nutrients in fungi, such as amino acids, nitrogen and vitamins, are probably the reason for their better performance. The results shed new light on the co-evolution of plants and insects, in which fungi and other microorganisms play a much greater role than previously assumed (Ecology Letters. DOI: 10.1111/ele.13506).

Gypsy moth caterpillars are known as feeding generalists; this means they accept a large variety of deciduous tree species and shrubs as their food plants. Outbreaks of this species have also been documented from time to time in German forest ecosystems.

Sybille Unsicker and her research team are investigating how poplars defend themselves against herbivores, including the gypsy moth. The scientists had observed that these trees downregulate their defense against the voracious insect when they are simultaneously being attacked by fungi. "We noticed that caterpillars are attracted by the odor of fungus-infested poplars, so we wondered why this is so: Would the caterpillars prefer to feed on infested leaves as well? Would this provide an advantage? And if so, what kind of chemicals are responsible for this?" first author Franziska Eberl asks, describing the basic questions of the study.

Feeding experiments in which the gypsy moth larvae were offered a choice of leaves with or without fungal infection revealed the clear preference of the caterpillars for leaves infected with fungi. In the early larval stage, they even consumed the fungal spores on the leaf surface before feeding on leaf tissue. "Whether rust fungi or mildew, young caterpillars selectively fed on the spores and preferred to feed on infected leaves," explains Franziska Eberl. Chemical analyses showed that mannitol, a substance that is also used as an artificial sweetener in human food, is primarily responsible for this preference. Eberl also monitored larval fitness, which is shown by how well larvae develop - a measurement that depends largely on their diet. "Larvae that consume fungus-infected leaves develop faster and also pupate earlier. This gives them an advantage over their siblings who feed on healthy leaves. Important nutrients, such as amino acids, nitrogen and B vitamins, are likely responsible for increased growth, because their concentration is higher in infected leaves," said the researcher.

The role of microorganisms puts the co-evolution of plants and insects in a new light

The observation that an insect classified as an herbivore is actually a fungivore - at least in its early larval stage - was a real surprise for the research team. "Our results suggest that microorganisms living on plants might have a more important role in the co-evolution of plants and insects than previously thought," says Sybille Unsicker, head of the study. "In the black poplar trees from our study, fungal infestation occurs every year. It is therefore quite conceivable that herbivorous insects have been able to adapt to the additional fungal resource. Especially with regard to the longevity of trees, the evolutionary adaptation to a diet consisting of leaves and fungi seems plausible for such insects."

Further investigations are needed to clarify how widespread fungivory is in other herbivorous insect species and what influence the combination of plant and fungal food has on the immune system of insects. It is possible that this food niche also has an effect on the insects' own defense against their enemies, such as parasitoid wasps. The role of microorganisms in the interactions between plants and insects has long been underestimated, even overlooked. This study is an important step to make up for that neglect.

Credit: 
Max Planck Institute for Chemical Ecology

Often and little, or rarely and to the full?

image: When supplying energy, the method of its delivery is as important as its quantity. You probably won't manage to boil the water for your tea with a cigarette lighter. Here, PhD student Yirui Zhang illustrates this principle.

Image: 
Source: IPC PAS, Grzegorz Krzyzewski

If we were talking about food, most experts would choose the former, but in the case of energy storage the opposite is true. It turns out that more energy can be stored by charging less often, but right up to 100%.

At least, this is the conclusion arrived at from research carried out by a team of scientists at the IPC PAS. Although the studies involved idealized two-dimensional lattice systems, at the end of the day, a principle is a principle. Dr. Anna Maciołek, one of the authors of the work published in Physical Review E, describes it as follows: "We wanted to examine how the manner in which energy is stored in a system changes when we pump energy in the form of heat into it, in other words - when we heat it locally." It is known that heat spreads out and diffuses through such systems. But is the amount of energy collected influenced by the way it is delivered - what the researchers call "the delivery alignment"? Does it matter whether we provide a lot of energy over a short period of time, none for a long time and then again a lot of energy, or small portions of energy one after the other, almost without any breaks?

Cyclic energy supply is very common in nature. We provide ourselves with energy in just this manner by eating. The same number of calories can be provided in one or two large portions eaten during the day, or broken down into 5-7 smaller meals with shorter breaks between them. Scientists are still arguing about which regimen is better for the body.

However, when it comes to two-dimensional lattice systems, it is already known that in terms of storage efficiency the "less often and a lot" method wins. "We noticed that the amount of energy the system can store varies depending on the portion size of the energy and the frequency of its provision. The greatest amount is when the energy portions are large, but the time intervals in between their supply are also long," explains Yirui Zhang, a PhD student at the IPC PAS. "Interestingly, it turns out that if we divide this sort of storage system internally into compartments or indeed chambers, the amount of energy that can be stored in such a divided-up "battery" - if it were possible to construct one - increases. In other words, three small batteries can store more energy than one large one," says the researcher. All this, assuming that the total amount of energy put into the system remains the same, and only the method of its delivery changes.
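To make "the delivery alignment" concrete, the sketch below simply constructs two pulse schedules that deliver the same total energy, one in rare large portions and one in frequent small ones, and checks that the totals match. It does not attempt to reproduce the lattice-model physics; the pulse counts and amplitudes are arbitrary illustrative choices.

```python
import numpy as np

# Two ways of delivering the same total energy over one window of time:
# rare, large portions versus frequent, small ones. This only builds the
# pulse trains and verifies that the totals agree; it does not model the
# lattice result that the rare/large schedule stores more energy.
steps = 120
total_energy = 60.0

rare_large = np.zeros(steps)
rare_large[::40] = total_energy / len(rare_large[::40])        # 3 big pulses

frequent_small = np.zeros(steps)
frequent_small[::2] = total_energy / len(frequent_small[::2])  # 60 small pulses

assert np.isclose(rare_large.sum(), frequent_small.sum())      # same total input
print(rare_large.max(), frequent_small.max())                  # 20.0 vs 1.0 per pulse
```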

Although the research carried out by the IPC PAS team is quite basic and simply shows the fundamental principle governing energy storage in magnets, its potential applications cannot be overestimated. Let's imagine, for example, the possibility of charging an electric car battery not in a few hours, but in just under twenty minutes, or a significant increase in the capacity of such batteries without changing their volume, i.e. extending the range of the car after one charge. The new discovery may also, in the future, change the methods of charging different types of batteries by determining the optimal periodicity of supplying energy to them.

Credit: 
Institute of Physical Chemistry of the Polish Academy of Sciences

Can high-power microwaves reduce the launch cost of space-bound rockets?

image: These are rockets used in the study.

Image: 
University of Tsukuba

Tsukuba, Japan - Governments throughout the world use rockets to launch satellites and people into orbit. This currently requires a large amount of high-energy fuel, which makes up 95% of a rocket's total mass. Because the launch cost of a rocket can reach 10 billion yen, launching a 1-gram payload is said to cost about the same as buying 1 gram of gold. Minimizing the total cost of launching rockets would maximize the scientific payload and increase the feasibility of space exploration.
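The gold comparison is easy to sanity-check: dividing a 10-billion-yen launch cost by the payload mass gives a per-gram cost of the same order as gold. Both the gold price and the payload mass in the sketch below are assumptions chosen for illustration, not figures from the study.

```python
# Rough sanity check of the "1 gram of payload ~ 1 gram of gold" comparison.
# Gold price and payload mass are assumed values, not from the study.
launch_cost_yen = 10_000_000_000   # ~10 billion yen, per the article
gold_price_yen_per_gram = 6_000    # assumed gold price
payload_mass_grams = 1_500_000     # assumed ~1.5 t of payload to orbit

cost_per_gram = launch_cost_yen / payload_mass_grams
print(f"~{cost_per_gram:,.0f} yen per gram of payload "
      f"vs {gold_price_yen_per_gram:,} yen per gram of gold")
# ~6,667 yen per gram of payload vs 6,000 yen per gram of gold
```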

In a study published in the Journal of Spacecraft and Rockets, researchers from the University of Tsukuba have helped solve important wireless power transmission and other efficiency issues that must be overcome to use high-powered microwaves to supplement--or nearly replace--chemical fuel for rocket launches. Their study will help researchers in this line of work properly focus their efforts.

Researchers commonly believe that a rocket requires a megawatt of beam-powered propulsion--that's approximately the power output of 10 automobiles--per kilogram of payload to reach a minimal orbit. Whether microwave transmission is sufficiently efficient for real-world applications is an open question.

Microwave beams have been transmitted by using a ground antenna that is the same size as a rocket antenna. "However, practical applications will require a large ground-based transmitter and a small receiver on the rocket, and thus variable-focus transmission," explains Assistant Professor Kohei Shimamura, lead author of the study. "We wanted to not only demonstrate this approach, but also quantify its efficiency."

In their comprehensive study, the researchers calculated the efficiencies, at short distances, of a ground-based microwave generator (51%), wireless power supply that sends the microwaves to the rocket propulsion system (14%), receiving antenna on the rocket (34%), and propulsion device that uses the microwave energy to heat the rocket propellant (6%). "Researchers can now put numbers on how efficient variable-focus transmission is at present," says Associate Professor Tsuyoshi Kariya, the other main author of the study.
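Because the four stages operate in series, the end-to-end efficiency is simply the product of the figures quoted above; the short sketch below multiplies them out.

```python
# End-to-end efficiency of the reported transmission chain: the stages act
# in series, so the overall efficiency is the product of the four figures.
stages = {
    "microwave generator": 0.51,
    "wireless power transfer": 0.14,
    "receiving antenna": 0.34,
    "propulsion device": 0.06,
}

overall = 1.0
for name, efficiency in stages.items():
    overall *= efficiency

print(f"overall efficiency ~ {overall:.4%}")   # roughly 0.15% end to end
```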

Future research will need to study and improve efficiencies at long distances. In the words of Assistant Professor Shimamura: "This is a difficult challenge, but an important next step in advancing microwave technology to practical use in rocket launches."

Rockets are essential technology, but their launching cost is a major disadvantage for scientific missions. With future research, high-power microwaves may one day be a low-cost method of rocket propulsion.

Credit: 
University of Tsukuba

Type 2 diabetes: Too much glucagon when α-cells become insulin resistant

Patients with type 2 diabetes secrete not only too little insulin but also too much glucagon, which contributes to poor blood glucose control. A new study from Uppsala University suggests that this is because the glucagon-secreting α-cells have become resistant to insulin.

In healthy individuals, insulin signals the body to absorb glucose, thereby reducing the sugar in the blood and providing energy to tissues. In patients with type 2 diabetes this mechanism fails, because the glucose-absorbing tissues become resistant to insulin and because too little of the hormone is released into the blood. This leads to elevated blood glucose and long-term complications that often become disabling or even life-threatening.

Often, type 2 diabetics also have elevated levels of glucagon, another hormone that is released by the pancreas. Glucagon counteracts the effects of insulin by instructing the liver to release stored glucose into the blood. After a meal, the release of glucagon is normally blocked to prevent excessive production of glucose by the liver. When this fails in diabetic patients, too much glucagon contributes to a vicious cycle that exacerbates the already high blood sugar levels of diabetics. Despite this vital function of glucagon, relatively little is known about how its release is regulated. Using advanced microscopy techniques, a team led by Omar Hmeadi in Sebastian Barg's research group at Uppsala University now adds insight into how glucagon-producing α-cells are controlled by glucose.

As expected, the experiments showed that glucagon is secreted during periods of low glucose, while high levels of the sugar efficiently block its release. However, in α-cells of type 2 diabetics this regulation was disturbed and high glucose no longer blocked the release of glucagon. To find out why, Hmeadi and colleagues isolated the α-cells and separated them from their tissue context in the pancreas. Surprisingly, the cells now behaved in a 'diabetic' manner and continued to secrete glucagon even when glucose was elevated.

The reason, Hmeadi explains, is that α-cells are normally blocked by insulin and other hormones that are released at high blood glucose from nearby cells. When the cells are separated from each other, this cell-to-cell communication is lost and glucagon secretion proceeds even when it should not. But why do the isolated α-cells behave as if they were diabetic? It turns out that the α-cells in type 2 diabetes become resistant to insulin, much like liver, fat and muscle. The result is that glucagon release is no longer inhibited during the mealtime rise in blood glucose, and this leads to the elevated levels of the hormone in type 2 diabetes.

The researchers hope that the findings will contribute to a better understanding of human type 2 diabetes and guide the development of better treatment strategies.

Credit: 
Uppsala University

Catalyst enables reactions with the help of green light

image: Reaction flasks being irradiated with green light in the laboratory of the Kekulé Institute of Organic Chemistry and Biochemistry at the University of Bonn.

Image: 
© Photo: Zhenhua Zhang

For the first time, chemists at the University of Bonn and Lehigh University in Bethlehem (USA) have developed a titanium catalyst that makes light usable for selective chemical reactions. It provides a cost-effective and non-toxic alternative to the ruthenium and iridium catalysts used so far, which are based on very expensive and toxic metals. The new catalyst can be used to produce highly selective chemical products that can provide the basis for antiviral drugs or luminescent dyes, for example. The results have been published in the international edition of the journal Angewandte Chemie.

The electrons in chemical molecules are reluctant to lead a single life; they usually occur in pairs. Then they are particularly stable and do not tend to forge new partnerships in the form of new bonds. However, if some of the electrons are brought to a higher energy level with the help of light (photons), things begin to look different when it comes to this "monogamy": In such an excited state, the molecules like to donate or to accept an electron. This creates so-called "radicals" that have unpaired electrons, are highly reactive and can be used to form new bonds.

Irradiation with green light

The new catalyst is based on this principle: At its core is titanium, which is connected to a carbon ring in which the electrons are particularly mobile and can be easily excited. Green light is sufficient to use the catalyst for electron transfer to produce reactive organic intermediates that are otherwise not easily obtainable. "In the laboratory, we irradiated a reaction flask containing the titanium catalyst that can be viewed as a 'red dye' with green light," reports Prof. Dr. Andreas Gansäuer from the Kekulé Institute of Organic Chemistry and Biochemistry at the University of Bonn. "And it worked right away." The mixture generates radicals from organic molecules that initiate many reaction cycles from which a wide variety of chemical products can be produced.

A key factor in reactions with this photo redox catalyst is the wavelength of the light used for irradiation. "Ultraviolet radiation is unsuitable because it is far too energy-rich and would destroy the organic compounds," says Gansäuer. Green light from LED lamps, by contrast, is mild enough to leave the compounds intact yet still energy-rich enough to trigger the reaction.

Catalysts are substances that increase the speed of chemical reactions and reduce the activation energy without being consumed themselves. This means that they are available continuously and can trigger reactions that would otherwise not occur in this form. The catalyst can be tailored to the desired products depending on the organic molecule with which the titanium is bonded.
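One way to see why lowering the activation energy speeds a reaction up is the Arrhenius relation, k = A·exp(-Ea/RT): the rate constant grows exponentially as the barrier drops. The sketch below plugs in two illustrative barrier heights, which are not measured values for this catalyst, just to show the scale of the effect.

```python
import math

# Arrhenius relation k = A * exp(-Ea / (R*T)): lowering the activation energy
# raises the rate constant exponentially. The two Ea values are illustrative
# numbers, not measurements for the titanium catalyst discussed here.
R = 8.314    # gas constant, J / (mol K)
T = 298.0    # room temperature, K

def rate_constant(ea_kj_per_mol: float, prefactor: float = 1.0) -> float:
    return prefactor * math.exp(-ea_kj_per_mol * 1e3 / (R * T))

uncatalysed = rate_constant(ea_kj_per_mol=100.0)
catalysed = rate_constant(ea_kj_per_mol=80.0)    # catalyst lowers the barrier

print(f"rate increases ~{catalysed / uncatalysed:.0f}x")  # ~3200x for a 20 kJ/mol drop
```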

Building blocks for antiviral drugs or luminescent dyes

The new titanium catalyst facilitates the reactions of epoxides, a group of chemicals from which epoxy resins are made. These are used as adhesives or for composites. However, the scientists are not aiming for this mass product, but for the synthesis of much more valuable fine chemicals. "The titanium-based, tailor-made photo redox catalysts can for instance be used to produce building blocks for antiviral drugs or luminescent dyes," says Gansäuer. He is confident that these new catalysts provide a cost-effective and more sustainable alternative to the ruthenium and iridium catalysts used so far, which are based on very expensive and toxic metals.

The development is an international collaborative effort by Zhenhua Zhang, Tobias Hilche, Daniel Slak, Niels Rietdijk and Andreas Gansäuer from the University of Bonn and Ugochinyere N. Oloyede and Robert A. Flowers II from Lehigh University (USA). While the scientists from the University of Bonn investigated how the desired compounds could best be synthesized with the new catalyst, their colleagues from the USA carried out measurements to prove the reaction pathways. "The luminescence phenomenon really opens up interesting space to consider the design of new sustainable reactions that proceed through free radical intermediates," says Prof. Robert Flowers from Lehigh University.

Credit: 
University of Bonn

Physicists develop approach to increase performance of solar energy

Experimental condensed matter physicists in the Department of Physics at the University of Oklahoma have developed an approach to circumvent a major loss process that currently limits the efficiency of commercial solar cells.

Solar cells convert the sun's energy into electricity and are the main component of solar panels and many types of electrical devices as broad-ranging as satellites and calculators.

Members of the Photovoltaic Materials and Devices Group, led by Ian Sellers, associate professor in OU's Homer L. Dodge Department of Physics and Astronomy, along with theorists at Arizona State University led by David K. Ferry, have demonstrated a breakthrough toward the development of a hot carrier solar cell.

A hot carrier solar cell is a device that would increase the efficiency of solar cells by more than 20%, which Sellers said would be a significant breakthrough for solar energy.

"Although this device has been the source of a considerable amount of research over the last 10 to 15 years, the realization of a practical solution has thus far eluded researchers with proof-of-principle demonstrations only presented under unrealistic conditions or in materials and structures not relevant for solar cell operation," said Sellers.

Sellers says this new approach, recently published in the journal Nature Energy, demonstrates "significant progress in the realization of the hot carrier solar cell and the potential for ultra-high-efficiency single-junction semiconductor devices, which would revolutionize the field of photovoltaics and renewable energy generation."

Credit: 
University of Oklahoma

Princeton scientist solves air quality puzzle: Why does ozone linger long after its ban?

image: Meiyun Lin is a research scholar in Princeton University's Program in Atmospheric and Oceanic Sciences, the Cooperative Institute for Modeling the Earth System at Princeton University, and NOAA's Geophysical Fluid Dynamics Laboratory. She led an international team of climate researchers to address a longstanding puzzle: Why, despite laws successfully limiting pollution from cars, trucks and factories, has Europe seen little improvement in ozone air quality? Lin found the surprising chain of causes: As global climate change leads to more hot and dry weather, the resulting droughts are stressing plants, making them less able to remove ozone from the air.

Image: 
Zhiguo Zhang

When high in the atmosphere, ozone protects Earth from harmful solar radiation -- but ozone at ground level is a significant pollutant. Exposure to high concentrations of ground-level ozone aggravates respiratory illnesses, thus exacerbating the negative health effects of heat and contributing to the catastrophic impacts of recent heatwaves and drought in Europe.

In Europe, despite laws limiting pollution from cars, trucks and factories, there has been little improvement in ozone air quality. An international team led by atmospheric scientist Meiyun Lin found the surprising chain of causes: As global climate change leads to more hot and dry weather, the resulting droughts are stressing plants, making them less able to remove ozone from the air.

With hot and dry summers expected to become more frequent over the coming decades, this has significant implications for European policymakers, noted Lin, a research scholar in atmospheric and oceanic sciences and the Cooperative Institute for Modeling the Earth System at Princeton University.

In a new study published today in Nature Climate Change, Lin and her colleagues demonstrated that vegetation feedbacks during drought worsen the most severe ozone pollution episodes.

"We show that declining ozone removal by water-stressed vegetation in response to climate warming can explain the slow progress towards improving ozone air quality in Europe," she said. "Under drought stress, plants are less effective in ozone removal via stomata -- small pores in the leaves of vegetation that are responsible for controlling carbon dioxide transport for photosynthesis and water vapor losses."

Such land-biosphere feedbacks have often been overlooked in prior air quality projections. This study quantified these vegetation feedbacks using six decades of observations and new Earth system model simulations developed at the Geophysical Fluid Dynamics Laboratory, a division of the U.S. National Oceanic and Atmospheric Administration located on Princeton's Forrestal campus.

Lin and her colleagues found that severe drought stress can cause as much as 70% reductions in ozone removal by forests. "Accounting for reduced ozone removal by drought-stressed vegetation leads to a three-fold increase in high-ozone events -- above 80 parts per billion," Lin said. That is significantly worse than the European Union's ozone target: 60 parts per billion, not to be exceeded on more than 25 days per year. For reference, the U.S. standard is 70 parts per billion, not to be exceeded on more than 4 days per year.
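To make the targets concrete, the sketch below tallies exceedance days against the 60 ppb and 70 ppb thresholds quoted above. The daily readings are randomly generated placeholders, and the real standards are defined on running 8-hour averages, so this is only schematic.

```python
import numpy as np

# Counting exceedance days against the two thresholds mentioned above.
# The daily ozone maxima here are random placeholders, not observations.
rng = np.random.default_rng(0)
daily_max_ppb = rng.normal(loc=55, scale=15, size=365)   # one hypothetical year

eu_days = int((daily_max_ppb > 60).sum())   # EU target: no more than 25 days above 60 ppb
us_days = int((daily_max_ppb > 70).sum())   # US standard: no more than 4 days above 70 ppb

print(f"days above 60 ppb: {eu_days}, days above 70 ppb: {us_days}")
```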

The European Union has established an extensive body of legislation to reduce regional emissions of smog-forming chemicals from member states, but despite 45% to 70% reductions in smog-forming chemicals across a 40-year period, summertime ozone levels measured in Europe actually climbed, especially during the 1980s and '90s.

Based on their findings, Lin said, governments will need even stronger emission controls to lower ozone air pollution.

While this study focused on Europe, their findings have broad implications. Substantial reductions in ozone removal by vegetation were also observed during North America's historic heat wave and drought in summer 2012, according to an earlier study by Lin.

Over the coming decades, as the climate warms, it will be increasingly important to account for vegetation feedbacks to determine the effects of extreme pollution events, she said.

Credit: 
Princeton University

Finding leukemia's weakness using genome-wide CRISPR technology

image: UC San Diego researchers used CRISPR technology to carry out a genome-wide screen in leukemia cells to block thousands of genes at once. The tool was used to identify genes that fuel leukemia growth, like those leukemia cells (green) pictured growing within the bone marrow.

Image: 
UC San Diego Health Sciences

A team of researchers at University of California San Diego School of Medicine and Moores Cancer Center used CRISPR technology to identify key regulators of aggressive chronic myeloid leukemia, a type of cancer that remains difficult to treat and is marked by frequent relapse.

"We used CRISPR technology to carry out a genome-wide screen in leukemia cells to block thousands of genes at once. This is an extremely powerful tool that allowed us to identify a multitude of genes that fuel leukemia growth and find new vulnerabilities that can be targeted in this disease," said senior author Tannishtha Reya, PhD, professor in the departments of Pharmacology and Medicine. "The study also shows, for the first time, that whole genome CRISPR-based screens can in fact be carried out in a manner that is much more physiologically relevant: using primary cancer cells, and in the setting of the native microenvironment."

Reporting in the April 20, 2020 online edition of the journal Nature Cancer, Reya and colleagues identified RNA-binding proteins -- which normally control how, when and if cells make certain proteins -- as a key class of proteins that sustain and protect drug-resistant leukemia stem cells. The authors focused on Staufen2 (Stau2), a relatively understudied member of the RNA-binding protein family that was previously only known to control brain and nervous system development.

This work was carried out in collaboration with UC San Diego investigators Gene Yeo, PhD, and Anjana Rao, PhD, and was conducted by lead author Jeevisha Bajaj, PhD, while she was a postdoctoral fellow at UC San Diego in the Reya lab. Bajaj subsequently joined the University of Rochester Medical Center as assistant professor of biomedical genetics and a researcher at the Wilmot Cancer Institute.

The team developed a mouse model in which Stau2 was genetically deleted and found that loss of this protein led to a profound reduction in leukemia growth and propagation, and markedly improved overall survival in mouse models. Stau2 was also required for continued growth of primary tissue samples of patients with leukemia, indicating a conserved dependence in the human disease.

"We are particularly excited about this work because, to our knowledge, this is the first demonstration that Staufen2 is a key dependency in any cancer," said Reya, who is a member of Moores Cancer Center and the Sanford Consortium for Regenerative Medicine.

To understand how Stau2 controls cancer, researchers undertook a genome-scale computational analysis of its targets through RNA-Seq and eCLIP-Seq. This led to the discovery that this protein controls key oncogenes, such as Ras, and epigenetic regulators, such as the LSD/KDM family of proteins, which are critical drug targets being tested against leukemia and other cancers.

According to the National Cancer Institute, approximately 1.5 percent of men and women will be diagnosed with leukemia at some point during their lifetimes. While chronic myeloid leukemia (CML) can be controlled with targeted therapies, this disease can be lethal if it advances or is diagnosed in an acute "blast" phase. The findings also have implications for acute myeloid leukemia (AML) and other blood cancers.

"This work will be particularly important for the discovery of new treatments," said Bajaj. "Our genome-wide screen identified cellular signals critical for the growth of cancer, and in the future, this study will be useful to study the microenvironment, the area around the tumor that includes tissue, blood vessels and important molecular signals related to how the cancer behaves."

Credit: 
University of California - San Diego

Oak genomics proves its worth

image: A landmark collection of 10 articles in the April 16 issue of New Phytologist helps clarify the evolution of oaks.

Image: 
<i>New Phytologist</i>

Lisle, Ill. (April 20) - A year and a half following the publication of the pedunculate oak genome by France's National Research Institute for Agriculture, Food and the Environment (INRAE) and The Commission for Atomic Energy and Alternative Energies (CEA), initial results based on this genomic resource were published in the April 16, 2020, issue of New Phytologist.

The 10 articles in the collection help clarify the evolution of oaks, from the deep roots of their diversification through their more recent evolution. They also identify key genes involved in oak adaptation to environmental transitions and resistance to pathogens, investigate the implications and history of oak hybridization, and trace genomic evidence for an estimated 56 million years of oak evolution. This landmark volume will be of value to tree scientists worldwide. Four of the articles were co-authored by researchers at The Morton Arboretum.

Oaks are keystone species in a wide range of forest and savanna ecosystems throughout the northern hemisphere. They are also model organisms for investigating ecological and evolutionary processes responsible for plant diversification and adaptation, especially in response to rapid environmental change. Given the high rate of global climate change, genomic approaches to understanding tree responses to the environment are particularly timely.

The volume brings to light several novel findings in tree biology that have arisen from these studies rooted in the oak genome:

- In a review of research into oaks' early and more recent evolution, evolutionary mechanisms are proposed for the high diversity and abundance of oaks across the Northern Hemisphere (Kremer and Hipp, 2020).

- Two global phylogenomic studies reconstruct the history of oak diversification across the northern hemisphere, dissecting the important role of both speciation and hybridization in shaping the diversity of oak-dominated forest communities (Crowl et al. 2020; Hipp et al. 2020).

- A synthesis of genomic findings against the backdrop of western cultural traditions suggests biological underpinnings of the oak's traditional use as a symbol of longevity, cohesiveness and robustness (Leroy et al. 2020a).

- A novel oak conservation strategy is proposed that would take advantage of hybridization in oaks, designed to facilitate the ability of these long-lived organisms to adapt to global change by naturally sharing advantageous genes across species boundaries (Cannon and Petit, 2020).

Publication of this series of research articles provides a forward-looking review of cutting-edge work being carried out in light of scientists' burgeoning understanding of the oak genome.

Credit: 
The Morton Arboretum

Diagnostic biosensor quickly detects SARS-CoV-2 from nasopharyngeal swabs

image: An artist’s rendering shows a new test that quickly detects SARS-CoV-2 (spheres) through binding to antibodies (Y-shapes) on a field-effect transistor.

Image: 
Adapted from <i>ACS Nano</i> <b>2020</b>, DOI: 10.1021/acsnano.0c02823

**A correction to the journal article was published on Aug. 28, 2020**

According to many experts, early diagnosis and management are critical for slowing the spread of SARS-CoV-2, the new coronavirus that causes COVID-19. Therefore, the race is on to develop diagnostic tests for the virus that are faster, easier and more accurate than existing ones. Now, researchers reporting in ACS Nano have developed a field-effect transistor-based biosensor that detects SARS-CoV-2 in nasopharyngeal swabs from patients with COVID-19, in less than one minute.

Currently, most diagnostic tests for COVID-19 rely on a technique called real-time reverse transcription-polymerase chain reaction (RT-PCR), which amplifies SARS-CoV-2 RNA from patient swabs so that tiny amounts of the virus can be detected. However, the method takes at least 3 hours, including a step to prepare the viral RNA for analysis. Edmond Changkyun Park, Seung Il Kim and colleagues wanted to develop a faster diagnostic test that could analyze patient samples directly from a tube of buffer containing the swabs, without any sample preparation steps.

The team based their test on a field-effect transistor -- a sheet of graphene with high electronic conductivity. The researchers attached antibodies against the SARS-CoV-2 spike protein to the graphene. When they added either purified spike protein or cultured SARS-CoV-2 virus to the sensor, binding to the antibody caused a change in the electrical current. Next, the team tested the technique on nasopharyngeal swabs collected from patients with COVID-19 or healthy controls. Without any sample preparation, the sensor could discriminate between samples from sick and healthy patients. The new test was about 2-4 times less sensitive than RT-PCR, but different materials could be explored to improve the signal-to-noise ratio, the researchers say.

Credit: 
American Chemical Society

Study identifies last-line antibiotic resistance in humans and pet dog

New research due to be presented at this year's European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) has identified the dangerous mcr-1 gene - which provides resistance to the last-line antibiotic colistin - in two healthy humans and a pet dog. The study is by Dr Juliana Menezes and Professor Constança Pomba from the Centre of Interdisciplinary Investigation of Animal Health of the Faculty of Veterinary Medicine, University of Lisbon, Portugal, and colleagues, and is part of the Pet Risk Consortium Project funded by the Portuguese Government.

Since first being reported in China in 2015, the mcr-1 gene has been found in various people and animals around the world. It confers resistance to colistin, which is an antibiotic of last resort used to treat infections from bacteria resistant to all other antibiotics. The nightmare scenario that could emerge is mcr-1 combining with already drug resistant bacteria to create a truly untreatable infection. Other genes in the same group (mcr-2 to mcr-9) have since been identified which act in a similar manner.

In this study, the authors investigated resistance to colistin in faecal samples from humans and pets in Portugal. Between February 2018 and October 2019, faecal samples were collected from cats and dogs and their human household members. Genetic analysis was carried out to establish the presence of five colistin resistance genes (mcr-1 to mcr-5).

The 70 households enrolled from the Lisbon region included healthy humans (n=106) living with healthy pets (n=49) and pets with skin and soft tissue infections (SSTI) (n=19) and urinary tract infections (UTI) (n=16). Of these, 95 faecal samples (89.6%) from humans, 45 from dogs (healthy-23, SSTIs-14 and UTI-8) and 21 from cats (healthy-18 and UTI-3), were positive for E. coli.

Further analysis showed colistin resistance in 5 out of the 161 isolates (3%): three from healthy humans and two from dogs with skin infections. Molecular analysis revealed that three of the E. coli isolates carried the mcr-1 gene: two from healthy humans and one from a dog with a skin infection, all from different households. The two remaining colistin-resistant isolates (one from a healthy human and one from a dog with a skin infection) did not test positive for the mcr-1 to mcr-5 genes and will now be screened for the other mcr-6 to mcr-9 genetic variants.
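The percentages quoted follow directly from the sample counts given above; the quick tabulation below simply reproduces them from the reported figures.

```python
# Quick tabulation of the figures quoted in the two preceding paragraphs.
human_samples, human_positive = 106, 95
dog_positive, cat_positive = 45, 21
colistin_resistant = 5

total_ecoli_isolates = human_positive + dog_positive + cat_positive            # 161
print(f"E. coli-positive human samples: {human_positive / human_samples:.1%}")       # 89.6%
print(f"colistin-resistant isolates: {colistin_resistant / total_ecoli_isolates:.1%}")  # 3.1%
```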

The isolates found in the faecal samples were not causing infections in humans, hence no specific treatment was recommended for them. Only the sick dog received oral amoxicillin in combination with clavulanic acid for the treatment of the skin infection.

The authors say: "To our knowledge, this is the first report of the presence of the mcr-1 gene from either a dog and or in healthy humans in Portugal. Further studies are needed to determine the full epidemiology of colistin resistance genes in humans and companion animals."

They add: "These humans and dogs, if in direct contact, may transmit bacteria containing the mcr-1 gene to other humans, dogs, other animals and the environment and potentially be a hazard for public health. The situation we all want to avoid at all costs is any infection totally resistant to all antibiotics, caused by bacteria already resistant to most other antibiotics also acquiring this colistin resistance gene."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases