
Why does El Niño decay faster than La Niña?

Warm and cold phases of the El Niño-Southern Oscillation (ENSO) exhibit a significant asymmetry in their decay speed. Generally, El Niño tends to turn into a La Niña event in the June-July following its mature phase; however, the negative sea surface temperature anomalies (SSTAs) associated with La Niña events can persist for more than one year after peaking, resulting in a longer duration than that of El Niño.

To address this question, the research group of Prof. Renhe Zhang used observation and reanalysis data to propose that intraseasonal oscillation over the equatorial western Pacific drives the asymmetric decay of El Niño and La Niña, and further verified this process in an ocean general circulation model.

"The difference in intraseasonal oscillation intensity brings about the asymmetry of zonal wind anomalies over the equatorial western Pacific during the El Niño and La Niña decay phases," says the study's corresponding author, Prof. Zhang, Dean of the Department of Atmospheric and Oceanic Sciences and the Institute of Atmospheric Sciences at Fudan University, "and the asymmetric zonal wind anomalies over the equatorial western Pacific result in asymmetry in the El Niño and La Niña decay phases."

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

How cigarette smoke makes head and neck cancer more aggressive

PHILADELPHIA - Head and neck cancer is the sixth most common cancer in the world. The vast majority of cases are head and neck squamous cell carcinoma (HNSCC), a type of cancer that arises in the outer layer of the skin and mucous membranes of the mouth, nose and throat. Cigarette smoking is a major risk factor for developing the disease and reduces treatment effectiveness.

Now Thomas Jefferson University researchers studying the effects of cigarette smoke on tumor progression show that cigarette smoke reprograms the cells surrounding the cancer cells, and helps drive HNSCC aggressiveness. The results were published online in the journal Molecular Cancer Research.

"Cigarette smoke changes the metabolism of cells in head and neck squamous cell carcinomas, making the tumors more efficient as an ecosystem to promote cancer growth," says Ubaldo Martinez-Outschoorn, MD, associate professor in the Department of Medical Oncology and a researcher at the Sidney Kimmel Cancer Center - Jefferson Health, who led the new study.

Tumors are composed of cancer cells that grow out of control and non-cancerous cells that support the tumor. Over half of the cells in tumors are support cells, and they create what scientists call the "tumor stroma." The most common cells in the tumor stroma are fibroblasts, which help maintain tissue architecture. In previous research, Dr. Martinez-Outschoorn and colleagues, including Joseph Curry, MD, associate professor in the Department of Otolaryngology-Head and Neck Surgery, found that interactions between fibroblasts and cancer cells promote tumor growth.

"We found previously that tumors in human and animal models of head and neck squamous cell carcinoma thrive when these distinct groups of cells support each other," says Dr. Martinez-Outschoorn. Cancer cells make use of metabolic products generated by the surrounding fibroblasts in order to obtain energy and fuel their growth. "That's where tumors are most aggressive."

Marina Domingo-Vidal, a graduate student in Dr. Martinez-Outschoorn's lab, studied the metabolic effects of tobacco in head and neck squamous cell carcinoma. "Knowing that cigarette smoke is the strongest risk factor for this cancer type, we wanted to better understand how it changes the metabolism of the different cells in the tumor," Domingo-Vidal says.

In the current study, the researchers exposed fibroblasts to cigarette smoke. They saw that fibroblasts increased a particular type of metabolism called glycolysis, which produces metabolites that are used by the nearby cancer cells to help fuel their growth. In addition, these cancer cells acquired certain features of malignancy such as increased mobility and resistance to cell death. The enhanced support of the tobacco-exposed fibroblasts caused larger tumors in a mouse model of the disease.

In a key finding, the researchers also identified a protein on tobacco-exposed fibroblasts that appeared to drive these metabolic changes. "The protein, called monocarboxylate transporter 4 (MCT4), is a major mechanism by which cigarette smoke exerts cancer aggressiveness, and we've shown how to manipulate and hopefully reverse it," says Domingo-Vidal.

"We've also seen that smoke-exposed fibroblasts interact with other cells in the tumor stroma, such as the cells of the immune system," explains Domingo-Vidal. "A healthy immune system is responsible for recognizing and attacking malignant cells, so it will now be interesting to understand how these altered fibroblasts might influence the efficacy of current immunotherapies."

The discovery has set the foundation for a clinical trial where Dr. Curry hopes to shut down the negative metabolic state induced by cigarette smoke. "Our finding is part of a growing interest in understanding the metabolic relationship between different cells in the tumor microenvironment, and how we can target them to improve patient outcomes," says Dr. Curry.

The clinical trial will use a two-pronged approach to combat the cancer. The trial will combine metformin, an FDA-approved diabetes drug that will target the cancer cells' altered metabolism, with durvalumab, an FDA-approved immunotherapy and checkpoint (PD-L1) inhibitor that takes the brakes off the immune system (ClinicalTrials.gov identifier: NCT03618654).

"We think metformin and durvalumab might have a synergistic effect on the cancer, where metformin slows the bad players, the cancer cells, and durvalumab grows the strength of the good players, the immune cells," says Dr. Curry.

Credit: 
Thomas Jefferson University

Researchers identify a possible therapeutic target for Kennedy's disease and prostate cancer

image: Interaction of the androgen receptor and chaperone Hsp70 (in red) inside the cell. Blue indicates cell nuclei.

Image: 
Xavier Salvatella, IRB Barcelona

A study led by scientists at the Institute for Research in Biomedicine (IRB Barcelona) and published in Nature Communications proposes the chaperone protein Hsp70 as an attractive therapeutic target for the treatment of Kennedy's disease--a rare neuromuscular condition--and of castration-resistant prostate cancer.

Kennedy's disease is caused by a mutation in the androgen receptor. This receptor serves as a sensor of testosterone, detecting the levels of this hormone and activating the genes responsible for male traits. But in patients with this disease, the mutated receptor shows an altered structure, as demonstrated by this lab in a previous study recently published in the same journal, and it forms aggregates that damage muscle cells and cause muscular atrophy, or wasting.

Chaperone proteins are one of the mechanisms through which the formation of toxic protein aggregates is prevented. These chaperones bind to other proteins in order to facilitate their correct folding, assembly and transport, as well as regulating their degradation. "But we didn't know the role of these chaperones in the regulation of the activity, cell concentration and solubility of the androgen receptor," says ICREA researcher Xavier Salvatella, head of the Laboratory of Molecular Biophysics at IRB Barcelona.

Using a sophisticated biophysics approach, nuclear magnetic resonance (NMR), and experiments with human cell cultures, the scientists discovered that the chaperones Hsp40 and Hsp70 bind strongly to a region of the androgen receptor that is susceptible to forming toxic aggregates. This interaction between the chaperones and the receptor prevents the formation of these deposits and facilitates their clearance.

To confirm whether increasing the activity of these chaperones decreases the formation of toxic aggregates, and whether these proteins are therefore useful for the treatment of Kennedy's disease, the scientists performed experiments in mouse models. These experiments were done in collaboration with the labs of Professors Jason E. Gestwicki and Andrew P. Lieberman of the University of California San Francisco and the University of Michigan, respectively.

"The results obtained in mice confirm that compounds that activate Hsp70 lead to a decrease in the formation of the aggregates," says Salvatella. "Therefore, the chaperone Hsp70 emerges as a possible therapeutic target for Kennedy's disease," he adds.

The results may also be useful in the search for a treatment for castration-resistant prostate cancer, the most advanced stage of this kind of cancer, which causes 30,000 deaths a year in Europe. In cells resistant to current treatments, the binding site of Hsp40 and Hsp70 in the androgen receptor is not altered, and therefore these chaperones may also serve as a therapeutic target in this disease.

Credit: 
Institute for Research in Biomedicine (IRB Barcelona)

Oil rigs could pump CO2 emissions into rocks beneath North Sea

North Sea oil and gas rigs could be modified to pump vast quantities of carbon dioxide emissions into rocks below the seabed, research shows.

Refitting old platforms to act as pumping stations for self-contained CO2 storage sites would be 10 times cheaper than decommissioning the structures, researchers say.

The sites would store emissions generated by natural gas production, and could also be used to lock away CO2 produced by other sources - such as power stations - helping to combat climate change.

Scientists from the University of Edinburgh analysed data from the Beatrice oilfield - 15 miles off the north east coast of Scotland. They found that existing platforms could be re-used as storage sites by making minor modifications.

Using a computer model, they worked out that, over a 30-year period, the scheme would be around 10 times cheaper than decommissioning the Beatrice oilfield, which is likely to cost more than £260 million.
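The cost comparison above is simple arithmetic; the sketch below reproduces it using only the figures quoted in the article (the "around 10 times cheaper" factor and the £260 million decommissioning estimate). The derived numbers are indicative only, not outputs of the study's computer model.

```python
# Indicative arithmetic only, based on figures quoted in the article.
decommissioning_cost_gbp = 260e6   # reported lower bound for decommissioning Beatrice
cost_ratio = 10                    # "around 10 times cheaper" over 30 years

storage_scheme_cost_gbp = decommissioning_cost_gbp / cost_ratio
saving_gbp = decommissioning_cost_gbp - storage_scheme_cost_gbp

print(f"Indicative scheme cost: £{storage_scheme_cost_gbp / 1e6:.0f}m")
print(f"Indicative saving vs decommissioning: £{saving_gbp / 1e6:.0f}m")
```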

Large amounts of natural gas and heat energy can still be extracted from saltwater in exhausted oil and gas fields, the team found. The gas can be used as a fuel or burnt on platforms to generate electricity.

Mixing the saltwater from the oil field with CO2 produced by burning the gas enables it to be injected deep underground for permanent safe storage, researchers say.

The scheme would bring down the costs of storing carbon emissions and postpone expensive decommissioning of North Sea oil and gas infrastructure, the team says.

The study, published in International Journal of Greenhouse Gas Control, was completed as part of the University of Edinburgh's GeoEnergy MSc programme.

Lead author Jonathan Scafidi, of the University of Edinburgh's School of GeoSciences, said: "Removing platforms at large expense is short-sighted. Re-using them to dispose of CO2 in rocks several kilometres beneath the seabed will not only be cheaper, but provides a cost-effective means of cutting the UK's CO2 emissions to meet the 2050 net-zero target."

Dr Stuart Gilfillan, also of the School of GeoSciences, who co-ordinated the study, said: "Our study shows, for the first time, that natural gas production from saltwater can be combined with CO2 storage in the North Sea. The potential revenue provided by extending natural gas production in the North Sea could help kick-start a world-leading carbon capture and storage industry in the UK."

Credit: 
University of Edinburgh

Fewer cows, more trees and bioenergy

video: The IPCC Special Report on Climate Change and Land contains many actions that can be implemented now to increase the chances that we can meet the 1.5 C goal, says Francesco Cherubini, one of the report's lead authors and head of the Industrial Ecology Programme at the Norwegian University of Science and Technology.

Image: 
Synnøve Aune, NTNU

Francesco Cherubini likes to ask his Industrial Ecology students what's the most common use of land today, and nearly all of them get the answer wrong.

"The correct answer is grazing land," says Cherubini, a professor and the director of NTNU's Industrial Ecology Programme. "Today we are using nearly half of the land on our planet to feed animals and not people."

Cherubini has more than an academic interest in this question -- and what the answer means. He's one of the lead authors of the IPCC's new special report, released today, on Climate Change and Land, and was the only author from a Norwegian-based institution.

He formally handed the report over to Ellen Hambro, Director General of the Norwegian Environment Agency, and Ola Elvestuen, Minister of Climate and Environment on Thursday, when the IPCC released the report in Geneva.

The IPCC describes the report as an "assessment of the latest scientific knowledge about climate change, desertification, land degradation, sustainable land management, food security, and greenhouse gas fluxes in terrestrial ecosystems."

The report also looks at the interrelationships between different competing land uses and how these could affect future potential climate outcomes.

Cherubini says the report offers policymakers yet more reasons to act on curbing greenhouse gas emissions and taking action to make sure that land use plays a positive role in solving the climate problem -- and acting sooner rather than later.

"Land management is key for us to achieve our climate management objectives," he said. "But we have competing land uses and limited land resources: we rely on land for animal feed, food, fibres and timber, and we need to preserve biodiversity and all the ecosystem services that land provides. And on top of that we have climate change."

The new report is an analysis of more than 7000 publications and looks at possible future scenarios to suggest the kinds of changes that need to happen under different socioeconomic situations, assuming society tries to limit warming to 1.5 C.

Using these scenarios allowed Cherubini and his co-authors to see what kinds of actions would need to be taken -- and when -- to keep warming to the 1.5 C goal.

For example, if society changes to a more plant-based diet, more efficient agriculture and food production systems and embraces new cleaner energy technologies, this will make land available for key climate change mitigation measures such as growing more trees or bioenergy crops.

"We'll need a lot of land for climate change mitigation," he said, no matter what.

Even in the most sustainable future scenario that will allow us to meet the 1.5-degree goal, the amount of forest land needed to soak up CO2 by the end of this century will top out at about 7.5 million km2, or roughly the size of Australia, Cherubini said.

In contrast, when the researchers looked at a more resource-intensive future scenario, they found it will be necessary to devote 7.5 million km2 of land to bioenergy crops as early as 2050 to keep warming below 1.5 C. Another important component of using bioenergy under this scenario, he said, is that it will have to be coupled with more and earlier carbon capture and storage.
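The "roughly the size of Australia" comparison can be checked with back-of-the-envelope arithmetic. Australia's land area of about 7.69 million km² is a standard reference value, not a figure from the report.

```python
# Back-of-the-envelope check of the land area quoted in the report.
forest_needed_km2 = 7.5e6   # forest land needed by 2100 (sustainable scenario)
australia_km2 = 7.69e6      # approximate land area of Australia (reference value)

ratio = forest_needed_km2 / australia_km2
print(f"Forest area needed is {ratio:.0%} of Australia's land area")
```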

However, under all the scenarios, land has to be used for both bioenergy and for forest growth, as well as for food production and other human uses, Cherubini said.

"Bioenergy needs to be put in place with all the other climate change mitigation options to achieve the 1.5 C goal," he said. "We shouldn't think of this as a competition for land between forest and energy crops. We need both."

When Cherubini and his colleagues looked at the different scenarios, they were also able to assess how these different futures would affect different parts of the globe in terms of food insecurity, land degradation and other negative impacts.

This allowed the scientists to provide an assessment of "how we do this in the best way possible," Cherubini said. "We have a choice in the kinds of risks we will experience based on what kind of socioeconomic development takes place."

If society doesn't develop in a sustainable way, he said the risk of land degradation and food insecurity is much higher, especially in developing countries. Additionally, mitigation efforts have to be designed carefully to make sure they have a beneficial effect, he said.

One example of how a well-intentioned programme can have unwanted or unanticipated effects is China's "Grain for Green" programme, a massive tree-planting effort.

Under all the social development scenarios, planting trees to soak up carbon is important to curb global warming. It's also a tool used to reverse soil erosion and land degradation, which is one reason why China undertook its programme.

But in the case of China's efforts, at least some of the trees they planted were non-native. That meant that the trees took up more water than a native tree would have, and have increased the risks of causing other problems, such as water shortages, Cherubini said.

One aspect of the new report that has generated the most attention is the suggestion that societies move to more vegetarian diets.

"Balanced diets, featuring plant-based foods, such as those based on coarse grains, legumes, fruits and vegetables, nuts and seeds, and animal-sourced food produced in resilient, sustainable and low-GHG emission systems, present major opportunities for adaptation and mitigation while generating significant cobenefits in terms of human health," the summary for policymakers says.

Cherubini says it's clear that societies need to embrace this kind of change, along with improving and intensifying agricultural production.

Especially in places like Africa, where there is a gap between what the land could potentially produce and what is produced, there's a great need for improvements, he said.

"We need to produce more with less so that land is available for other uses," he said. "What is clear, is that we need to change. We need cross-sectoral changes in our lifestyles and in our economies."

Although these changes will cost money, the IPCC report emphasizes that the costs of inaction will exceed the costs of immediate action in many areas. That means that money spent now can be seen as a sound investment, he said.

"These changes do not come for free, they do have a cost," he said. "But should we talk about costs, or should we rather call it an investment?"

Credit: 
Norwegian University of Science and Technology

Mind The Gap: Report on all available open-source publishing software

image: To help visualize the functional scope of various development agendas, the authors of Mind the Gap propose a hypothetical publishing workflow that covers a number of stages in order to show how various projects address different functional areas. The focus here is with software development priorities.

Image: 
Mind the Gap (mindthegap.pubpub.org)

The MIT Press is pleased to release Mind the Gap (openly published at mindthegap.pubpub.org), a major report on the current state of all available open-source software for publishing. Funded by a grant from The Andrew W. Mellon Foundation, the report "shed[s] light on the development and deployment of open-source publishing technologies in order to aid institutions' and individuals' decision-making and project planning." It will be an unparalleled resource for the scholarly publishing community and complements the recently released Mapping the Scholarly Communication Landscape census.

The report authors, led by John Maxwell, Associate Professor and Director of the Publishing Program at Simon Fraser University, catalog 52 open-source online publishing platforms, i.e. production and hosting systems for scholarly books and journals, that meet the survey criteria of "available, documented open-source software relevant to scholarly publishing" and in active development. This research provides the foundation for a thorough analysis of the open publishing ecosystem and the availability, affordances, and current limitations of these platforms and tools.

The number of open-source online publishing platforms has proliferated in the last decade, but the report finds that they are often too small, too siloed, and too niche to have much impact beyond their host organization or institution. This leaves them vulnerable to shifts in organizational priorities and external funding sources that emphasize new projects over the maintenance and improvement of existing projects. This fractured ecosystem is difficult to navigate, and the report concludes that if open publishing is to become a durable alternative to complex and costly proprietary services, it must grapple with the dual challenges of siloed development and organization of the community-owned ecosystem itself.

"What are the forces--and organizations--that serve the larger community, that mediate between individual projects, between projects and use cases, and between projects and resources?" asks the report. "Neither a chaotic plurality of disparate projects nor an efficiency-driven, enforced standard is itself desirable, but mediating between these two will require broad agreement about high-level goals, governance, and funding priorities--and perhaps some agency for integration/mediation."

"We found that even though platform leaders and developers recognize that collaboration, standardization, and even common code layers can provide considerable benefit to project ambitions, functionality, and sustainability, the funding and infrastructure supporting open publishing projects discourages these activities," explains Maxwell. "If the goal is to build a viable alternative to proprietary publishing models, then open publishing needs new infrastructure that incentivizes sustainability, cooperation, collaboration, and integration."

"John Maxwell and his team have done a tremendous job collecting and analyzing data that confirm that open publishing is at a pivotal crossroads," says Amy Brand, Director of the MIT Press. "It is imperative that the scholarly publishing community come together to find new ways to fund and incentivize collaboration and adoption if we want these projects to succeed. I look forward to the discussions that will emerge from these findings."

Credit: 
The MIT Press

New design strategy brightens up the future of perovskite-based light-emitting diodes

image: (A) Photoluminescence and (B) electroluminescence in low-dimensional and 3D perovskite-based devices.
Photoluminescence (PL) refers to the emission of light caused by the absorption of incident photons, whereas electroluminescence (EL) is the emission of light owing to the energy supplied by an electric current. Although low-dimensional perovskite exhibits better PL properties than 3D perovskite, the latter has better EL properties, which can be exploited to design very bright and efficient PeLEDs.

Image: 
Applied Physics Reviews

Scientists at Tokyo Institute of Technology discover a new strategy to design incredibly efficient perovskite-based LEDs with record-setting brightness by leveraging the quantum confinement effect.

Several techniques for generating light from electricity have been developed over the years. Devices that emit light when an electric current is applied are referred to as electroluminescent devices, and they have become orders of magnitude more efficient than the traditional incandescent light bulb. Light-emitting diodes (LEDs) comprise the most notable and ubiquitous category of these devices. A myriad of different types of LED exist nowadays, made possible by advances in our understanding of quantum mechanics and solid-state physics, and by the use of alternative materials.

Electroluminescent devices consist of several layers, the most important being the emission layer (EML), which emits light in response to an electric current. Metal halide perovskites, with the chemical formula CsPbX3 (X = I, Br, Cl), have recently been considered promising materials for fabricating the EML. However, current perovskite-based LEDs (PeLEDs) perform poorly compared with organic LEDs, which are typically used in the displays of TVs and smartphones. Several researchers have suggested fabricating PeLEDs using low-dimensional perovskites (i.e., those whose emitting structural units are connected on a plane or linearly in the crystal structure), which offer improved light-emission performance based on the quantum confinement effect of excitons. An exciton is an electron-hole pair that emits a photon efficiently. However, low-dimensional perovskites have an intrinsic drawback: their conducting properties are very poor, i.e., they have low carrier mobility, and this low mobility leads to low power efficiency.

Interestingly, as discovered by a team of researchers led by Prof. Hideo Hosono at the Tokyo Institute of Technology, it is possible to design highly efficient PeLEDs using three-dimensional (3D) perovskites, which have superior electron and hole mobility and hence would address the limitation of low-dimensional perovskites. The team investigated whether the quantum confinement effect that occurs in low-dimensional materials and gives rise to their attractive light-emission properties could be reproduced in 3D materials by using a new electron transport layer adjacent to the perovskite. In an electroluminescent device, the EML is sandwiched between two layers: the electron transport and hole transport layers. These two layers play a key role in ensuring good conducting properties of the device. The team found that the energy-level characteristics of these layers also play a crucial role in the emission efficiency of the device.

By tuning the characteristics of the electron and hole transport layers in PeLEDs, the team could prevent excitons from escaping the emission layer, ensuring that they remain confined there. "The whole device structure can be regarded as a scaled-up low-dimensional material in a sense if the energy levels of the electron/hole transport layers are sufficient for exciton confinement," explains Hosono. The team reported 3D PeLEDs with record-setting performance in terms of high brightness, high power efficiency and low operating voltage. Figure 1 shows a comparison between low-dimensional and 3D perovskite-based luminescent devices.
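The confinement idea Hosono describes can be caricatured as a simple energy-level check of the kind used for type-I heterojunctions: the emission layer's levels should sit inside those of both adjacent transport layers. The sketch below is a loose illustration of that idea, not the paper's model, and every energy value in it is invented.

```python
# Hypothetical sketch of exciton confinement as a type-I energy-level check.
# Energies are in eV relative to vacuum (more negative = deeper); all values
# below are invented for illustration and are not from the study.
def excitons_confined(eml, etl, htl):
    """Each layer is a (LUMO, HOMO) pair. Excitons stay in the emission
    layer (EML) when both adjacent layers present a higher-lying LUMO and
    a deeper HOMO than the EML, forming barriers on both sides."""
    eml_lumo, eml_homo = eml
    return all(lumo > eml_lumo and homo < eml_homo
               for lumo, homo in (etl, htl))

# Made-up device: both transport layers form barriers around the EML.
confined = excitons_confined(eml=(-3.6, -5.7), etl=(-3.0, -6.5), htl=(-2.8, -6.2))
print(confined)  # True

# Made-up counterexample: the ETL's LUMO is deeper than the EML's, so
# electrons can leak out and excitons are not confined.
leaky = excitons_confined(eml=(-3.6, -5.7), etl=(-3.9, -6.5), htl=(-2.8, -6.2))
print(leaky)  # False
```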

Aside from these tangible practical achievements, this research sheds light on how the exciton-related properties of a material can be influenced by the adjacent layers, and provides a strategy that can be readily exploited in the development of optical devices. "We believe this study provides new insight into the realization of practical PeLEDs," concludes Hosono. With such interesting advances in light-emitting materials, it seems that a (literally) brighter future awaits.

Credit: 
Tokyo Institute of Technology

The reasons behind aerosol pollution over the eastern slope of the Tibetan Plateau

image: The causes of aerosol pollution over the eastern slope of the Tibetan Plateau

Image: 
Rui Jia

The aerosol optical depth over the eastern slope of the Tibetan Plateau (ESTP) is extremely large--larger even than over some major industrialized regions and deserts--as the result of a combination of human activities and natural conditions, according to Prof. Yuzhi LIU at Lanzhou University.

Prof. Liu and her team--a group of researchers from the Key Laboratory for Semi-Arid Climate Change of the Ministry of Education, College of Atmospheric Sciences at Lanzhou University--have had their findings published in Advances in Atmospheric Sciences. A combination of satellite observations and reanalysis datasets was used to analyze the spatiotemporal distribution, classification and source of pollutants over the ESTP. The accumulation mechanism of aerosols over the region is also discussed.
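As a rough illustration of the kind of regional statistic such satellite analyses produce, the sketch below computes a mean aerosol optical depth (AOD) over a latitude band on a gridded field. The grid, bounding box and values are synthetic, not the study's data.

```python
import numpy as np

# Synthetic gridded AOD field; real studies use satellite products.
lats = np.arange(20, 40, 1.0)                 # degrees north
lons = np.arange(95, 110, 1.0)                # degrees east
aod = np.full((lats.size, lons.size), 0.3)    # invented, spatially uniform AOD

# Average over a latitude band very loosely covering the ESTP region.
band = (lats >= 26) & (lats <= 34)
region_mean = aod[band, :].mean()
print(f"Regional mean AOD: {region_mean:.2f}")
```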

"Known as 'the roof of the world', the TP [Tibetan Plateau] is a sensitive indicator and regulator of climate change, playing a significant role in driving the climate change of the Northern Hemisphere and even the globe through thermal and mechanical forcing. The addition of abundant aerosols in the air over the TP poses new climatic and environmental risks," said Dr Rui Jia, the lead author of the study.

"Aerosols over the ESTP are substantial--even more so than those over the most densely populated and industrialized regions and deserts. These aerosols are high in load and complex in type, and they can be lifted further to high altitudes and mixed with clouds," she explains. "Studying the aerosol transport processes and physical mechanisms of aerosol accumulation on a large scale over the ESTP is important."

"Local emissions related to human activity contribute directly to the accumulation of sulfate and carbonaceous aerosols over the Sichuan Basin," Dr Jia explains. "In addition, in spring, abundant carbonaceous aerosols emitted from forest, grassland and savanna fires in Southeast Asia can be transported to the ESTP by the prevailing southwesterly wind."

"Additionally, the high aerosol loading over the ESTP is also directly related to the meteorological background. The terrain-driven circulation can trap aerosols in the Sichuan Basin, and these aerosols can climb along the ESTP due to the perennial updraft," states Dr Jia. "Of course, more precipitation is beneficial for the wet deposition of the aerosols," she adds.

The corresponding author, Professor Liu, summarizes the causes of aerosol pollution over the ESTP as local emissions, transport from outside the region, and the accumulation of pollutants under specific geographical conditions.

"The inward transport, smaller vertical temperature gradient and terrain-driven circulation contribute greatly to the accumulation of aerosols over the ESTP, which poses greater challenges to environmental governance than local emissions do," says Prof. Liu.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Pupillary response to glare illusions of different colors

image: A glare illusion is an optical illusion with a luminance gradient towards the center, which makes the center appear brighter than it is.

Image: 
COPYRIGHT (C) TOYOHASHI UNIVERSITY OF TECHNOLOGY. ALL RIGHTS RESERVED.

The Department of Computer Science and Engineering at Toyohashi University of Technology formed a research team with the University of Oslo to measure the size of subjects' pupils when viewing a brightness illusion (glare illusion). The pupil expands (dilates) in dark environments and contracts in bright environments in order to control the amount of light that enters the eye. Pupil contraction is also known to occur when people view a brightness illusion. This study involved showing glare illusions to subjects in a variety of different colors, and concluded that a blue glare illusion was perceived to be the brightest among all the colors and that subjects' pupils constricted significantly in relation to this perception. The results of this study were published in the Dutch journal Acta Psychologica on July 6.

The research team hypothesized that the blue glare illusion would be perceived as the brightest because the color blue is most associated with the sky and sunlight typically appears to have a gradient of luminance. In other words, the team surmised that the human visual system often relies on ecologically-based predictions to interpret visual input. The results of this study show that the blue glare illusion was evaluated to be brighter than any other color and that viewing this illusion resulted in significant pupil contraction. This effect is thought to be unique to glare illusions, as it was not observed for visual stimuli without any illusory glare effects.
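One common way to quantify pupil responses like those described above is percent constriction relative to a pre-stimulus baseline. The sketch below illustrates that measure with invented readings; it is not the study's analysis or data.

```python
# Illustrative pupillometry measure; all readings below are invented.
def percent_constriction(baseline_mm, response_mm):
    """Percent change in pupil diameter relative to a pre-stimulus
    baseline; positive values mean the pupil constricted."""
    return (baseline_mm - response_mm) / baseline_mm * 100

# Hypothetical readings: a blue glare illusion vs. a control stimulus.
blue_glare = percent_constriction(baseline_mm=4.0, response_mm=3.4)
control = percent_constriction(baseline_mm=4.0, response_mm=3.8)
print(f"blue glare: {blue_glare:.0f}% constriction, control: {control:.0f}%")
```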

Lead author and doctoral student Yuta Suzuki explains, "The pupillary light reflex, in which the pupil constricts and dilates, follows a very concise route in neural processing. The same goes for the pupil responses we saw when subjects viewed the blue glare illusion in this study - the difference in pupil constriction among colors occurred with the pupillary light reflex. This may indicate that we have evolved to develop a faster pupillary reaction to illusions in specific colors. We also saw a correlation between pupil contraction and individual brightness perception. We hope to be able to use the pupillary light reflex as a simple tool for evaluating individual differences in brightness perception."

Assistant Professor Tetsuto Minami, who led the research team, says, "Subjective brightness perception is an individual phenomenon that cannot be comprehended by others. That is, we could only rely on people's own reported brightness perception. In this study, we saw a correlation between brightness perception and pupil contraction, and this is a new development that can be used as an index for evaluating objective brightness perception."

Professor Shigeki Nakauchi, the project leader, says, "The pupillary light reflex has drawn attention as a way to investigate a person's mental state through non-contact measurement. Work involving the pupillary light reflex will create innovative, fundamental technologies that can achieve rapid progress in communication between humans as well as between humans and robots."

Credit: 
Toyohashi University of Technology (TUT)

Researchers identify type of parasitic bacteria that saps corals of energy

image: Rebecca Vega Thurber, a coral microbiologist at Oregon State University, surveys a coral reef on Mo'orea, French Polynesia.

Image: 
Oregon State University

CORVALLIS, Ore. - Researchers at Oregon State University have proposed a new genus of bacteria that flourishes when coral reefs become polluted, siphoning energy from the corals and making them more susceptible to disease.

The National Science Foundation-funded study, published in the ISME Journal, adds fresh insight to the fight to save the Earth's embattled reefs, the planet's largest and most significant structures of biological origin.

Coral reefs are found in less than 1% of the ocean but are home to nearly one-quarter of all known marine species. Reefs also help regulate the sea's carbon dioxide levels and are a crucial hunting ground that scientists use in the search for new medicines.

Corals are home to a complex composition of dinoflagellates, fungi, bacteria and archaea that together make up the coral microbiome. Shifts in microbiome composition are connected to changes in coral health.

Since their first appearance 425 million years ago, corals have branched into more than 1,500 species, including the one at the center of this research: the endangered Acropora cervicornis, commonly known as the Caribbean staghorn coral.

In the study, when the corals were subjected to elevated levels of nutrients - i.e., pollution from agricultural runoff and sewage - the newly identified bacterial genus began dominating the corals' microbiome, jumping from 11.4% of the bacterial community to 87.9%.

The bacteria are in the order Rickettsiales, and the genus is associated primarily with aquatic organisms. Scientists named the genus Candidatus Aquarickettsia, and the coral-associated species in this study, Candidatus A. rohweri, is the first in the new clade to have its genome completely sequenced.

A clade is a group of organisms believed to have evolved from a common ancestor.

This research, which included canvassing DNA sequence data from a multitude of past studies, shows that the bacterial clade is globally associated with many different coral hosts and has genes that enable it to parasitize its hosts for amino acids and ATP, the main energy-carrying molecule within cells.

"We previously showed that this Rickettsiales OTU was found in stony corals, responds to nutrient exposure and is linked with reduced coral growth and increased mortality," said the study's lead author, Grace Klinges, a Ph.D. student in OSU's College of Science. "Now we know it has multiple genes characteristic of parasites, including the antiporter Tlc gene that robs the host cells of energy, taking the ATP and replacing it with low-energy ADP."

OTU refers to operational taxonomic unit - an OTU classifies groups of closely related organisms.

"When we discovered the gene that takes ATP from the host, we knew we were onto something really cool," Klinges said. "The gene has a similar protein structure to mitochondrial genes but does the reverse - it gives back ADP, which is no longer useful, to the animal."

When nutrient levels are normal and corals are healthy, the corals can tolerate low populations of Rickettsiales. But when a reef becomes nutrient rich, the Rickettsiales population spikes and becomes harmful as it consumes more and more of its hosts' resources.

"This order of bacteria is in the microbiome of diseased corals but it's also in healthy corals at low levels," Klinges said. "It's affecting the host's immune system even if it isn't pathogenic on its own - there are so many cascading effects. As nutrient pollution increasingly affects reefs, we suspect that parasites within the new genus will proliferate and put corals at greater risk."

In a metadata analysis of 477,075 samples from the Earth Microbiome Project and Sequence Read Archive databases, researchers found Candidatus Aquarickettsia rohweri to be present around the globe in 51 genera of stony corals and 76 genera of sponges.

"Together these data suggest that our proposed genus broadly associates with corals and also with many members of the non-bilaterian metazoan phyla - Placozoa, Porifera, Cnidaria and Ctenophora - as well as the even more ancient protists," said study co-author and OSU microbiologist Rebecca Vega Thurber.

"That blew my mind," Klinges added. "We were just looking at this one species of coral and one species of bacteria, and now we know the genus of bacteria is involved in hundreds and thousands of years of coral evolution. We expect to find different bacterial species in other corals, but they fall in the same genus and therefore have a pretty similar function. In next few years our lab will assemble genomes of related bacteria present in other species of coral."

Credit: 
Oregon State University

Protein factors increasing yield of a biofuel precursor in microscopic algae

image: Microalgae are a promising source of biofuel feedstock, as they produce triacylglycerol (TAG) as a major storage lipid, especially under nutrient-depleted conditions. LRL1 was involved in the regulatory mechanism during the later stage of P-starvation in C. reinhardtii, as its regulation might depend on P-status, cell growth, and other factors.

Image: 
Tokyo Tech

As an alternative to traditional fossil fuels, biofuels represent a more environmentally friendly and sustainable fuel source. Plant or animal fats can be converted to biofuels through a process called transesterification. In particular, the storage molecule triacylglycerol (TAG), found in microscopic algae, is one of the most promising sources of fat for biofuel production, as microalgae are small, easy to grow, and reproduce quickly. Therefore, increasing the yield of TAG from microalgae could improve biofuel production processes. With this ultimate goal in mind, Professor Hiroyuki Ohta from the Tokyo Institute of Technology and colleagues investigated the conditions under which the model microalga Chlamydomonas reinhardtii produces more TAG.

It is known that microalgae produce greater amounts of TAG when grown in environments with few nutrients. However, according to Dr. Ohta, "While low-nitrogen environments cause microalgae to produce more TAG, this strongly reduces microalgal growth and reproduction, decreasing potential gains in TAG yield." In an attempt to find conditions under which C. reinhardtii both produces more TAG and grows well, the team of researchers gave the microalga sufficient nitrogen but limited the amount of phosphorus in the environment. Under these conditions, TAG production was increased and cell growth was still promoted, increasing the overall yield of TAG.

In this experiment, the scientists used co-expression analysis to identify a C. reinhardtii protein, which they named Lipid Remodeling reguLator 1 (LRL1), that is involved in TAG production in phosphorus-limited environments. Functional analyses, in which the LRL1 gene was disrupted, revealed additional genes involved in TAG accumulation and C. reinhardtii growth under phosphorus depletion. Together, the results shed light on the underlying biochemical pathways involved in this process. A better understanding of these pathways has the potential to improve TAG--and therefore biofuel--production processes. Notes Dr. Ohta, "The discovery of proteins involved in TAG production under nutrient-depleted conditions could one day lead to methods to increase TAG yield, ultimately making biofuel production more efficient and cost-effective." This could in turn help reduce our reliance on fossil fuels and promote the widespread use of biofuels derived from microalgae.

Credit: 
Tokyo Institute of Technology

New insights into the origin of life

A famous experiment in 1953 showed that amino acids, the building blocks of proteins, could have formed spontaneously under the atmospheric conditions of early Earth. However, just because molecules could form doesn't mean that the process was likely. Now, researchers reporting in ACS Central Science have demonstrated that energetically feasible interactions between just two small molecules -- hydrogen cyanide and water -- could give rise to most of the important precursors of RNA and proteins.

The 1953 Urey-Miller experiment involved running electrical sparks, which simulated lightning, through a flask containing water, methane, ammonia and hydrogen. These simple chemicals, present on early Earth, reacted to form hydrogen cyanide, formaldehyde and other intermediates that reacted further to make amino acids and other biomolecules. But some scientists now believe that Earth's hazy atmosphere of roughly 4 billion years ago would have made it difficult for high-energy photons, such as those in lightning or ultraviolet light, to reach Earth's surface. Kumar Vanka and colleagues wondered if heat from ocean waters, which were near-boiling at the time, could have been the driving force for the reactions.

To find out, the researchers used a recently developed tool called the ab initio nanoreactor, which simulates how mixtures of molecules can collide and react, forming new molecules. The researchers found that hydrogen cyanide -- which condensed into oceans from the early Earth's atmosphere -- and water could create the molecules necessary to produce the amino acid glycine and the precursors of RNA. Importantly, these reactions were both thermodynamically and kinetically feasible: In other words, they didn't require a lot of energy or metal catalysts. The simulations reveal interesting new potential pathways for the formation of life's precursors, the team says.

Credit: 
American Chemical Society

Eating more plant-based foods may be linked to better heart health

DALLAS, August 7, 2019 -- Eating mostly plant-based foods and fewer animal-based foods may be linked to better heart health and a lower risk of dying from a heart attack, stroke or other cardiovascular disease, according to new research published in the Journal of the American Heart Association, the Open Access Journal of the American Heart Association/American Stroke Association.

"While you don't have to give up foods derived from animals completely, our study does suggest that eating a larger proportion of plant-based foods and a smaller proportion of animal-based foods may help reduce your risk of having a heart attack, stroke or other type of cardiovascular disease," said lead researcher, Casey M. Rebholz, Ph.D., assistant professor of epidemiology at Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland.

Researchers reviewed a database of food intake information from more than 10,000 middle-aged U.S. adults who were monitored from 1987 through 2016 and did not have cardiovascular disease at the start of the study. They then categorized the participants' eating patterns by the proportion of plant-based foods they ate versus animal-based foods.

People who ate the most plant-based foods overall had a:

16% lower risk of having a cardiovascular disease such as heart attack, stroke, heart failure and other conditions;

32% lower risk of dying from a cardiovascular disease and

25% lower risk of dying from any cause compared to those who ate the least amount of plant-based foods.

"Our findings underscore the importance of focusing on your diet. There might be some variability in terms of individual foods, but to reduce cardiovascular disease risk people should eat more vegetables, nuts, whole grains, fruits, legumes and fewer animal-based foods. These findings are pretty consistent with previous findings about other dietary patterns, including the Dietary Approaches to Stop Hypertension, or DASH diet, which emphasize the same food items," Rebholz said.

This is one of the first studies to examine the proportion of plant-based versus animal-based dietary patterns in the general population, noted Rebholz. Prior studies have shown heart-health benefits from plant-based diets but only in specific populations of people, such as vegetarians or Seventh Day Adventists who eat a mostly vegan diet. Future research on plant-based diets should examine whether the quality of plant foods--healthy versus less healthy--impacts cardiovascular disease and death risks, said Rebholz.

"The American Heart Association recommends eating a mostly plant-based diet, provided the foods you choose are rich in nutrition and low in added sugars, sodium (salt), cholesterol and artery-clogging saturated and trans fats. For example, French fries or cauliflower pizza with cheese are plant based but are low in nutritional value and are loaded with sodium (salt). Unprocessed foods, like fresh fruit, vegetables and grains are good choices," said Mariell Jessup, M.D., the chief science and medical officer of the American Heart Association.

The study was observational, which means it did not prove cause and effect.

Credit: 
American Heart Association

Substituting poultry for red meat may reduce breast cancer risk

Results from a new study suggest that red meat consumption may increase the risk of breast cancer, whereas poultry consumption may be protective against breast cancer risk. The findings are published in the International Journal of Cancer.

For the study, investigators analyzed information on consumption of different types of meat and meat cooking practices from 42,012 women who were followed for an average of 7.6 years.

During follow-up, 1,536 invasive breast cancers were diagnosed. Increasing consumption of red meat was associated with increased risk of invasive breast cancer: women who consumed the highest amount of red meat had a 23% higher risk compared with women who consumed the lowest amount. Conversely, increasing consumption of poultry was associated with decreased invasive breast cancer risk: women with the highest consumption had a 15% lower risk than those with the lowest consumption. Breast cancer risk was reduced even further for women who substituted poultry for red meat.

The findings did not change when analyses controlled for known breast cancer risk factors or potential confounding factors such as race, socioeconomic status, obesity, physical activity, alcohol consumption, and other dietary factors. No associations were observed for cooking practices or chemicals formed when cooking meat at high temperature.

"Red meat has been identified as a probable carcinogen. Our study adds further evidence that red meat consumption may be associated with increased risk of breast cancer whereas poultry was associated with decreased risk," said senior author Dale P. Sandler, PhD, of the National Institute of Environmental Health Sciences. "While the mechanism through which poultry consumption decreases breast cancer risk is not clear, our study does provide evidence that substituting poultry for red meat may be a simple change that can help reduce the incidence of breast cancer."

Credit: 
Wiley

Study examines characteristics of older adults with moderately severe dementia

A study published in the Journal of the American Geriatrics Society has found that many characteristics among older adults with moderately severe dementia differ depending on whether they live at home or in residential care or nursing facilities.

The study used a nationally representative dataset of U.S. older adults and included 728 people newly identified as having moderately severe dementia between 2012 and 2016. Sixty-four percent received care at home, 19% in residential care, and 17% in a nursing facility.

Individuals living at home were two to five times more likely to be members of disadvantaged populations (such as being a racial/ethnic minority, not being born in the United States, and having less than a high school education). Those living at home also had worse health and more symptoms than those living in residential care or nursing facilities.

When researchers extrapolated their results, they estimated that 3.3 million older U.S. adults developed moderately severe dementia between 2012 and 2016.

"In our experience, many people live at home with dementia even when things get hard, yet we know virtually nothing about this population. Prior research focused on people with advanced dementia in nursing homes," said lead author Krista L. Harrison, PhD, of the University of California San Francisco. "Our study is one of the first to describe people living at home at a stage of dementia plus moderate functional impairment and to compare this population to people with dementia in other settings in the United States. This is a key step towards better understanding and addressing the geriatric palliative care needs of people with dementia wherever they live."

Credit: 
Wiley