Tech

HKUST researchers unlock cancer-causing mechanism of E. coli toxin with synthetic biology approach

image: The dish on the left contains the E. coli cloned by the research team.

Image: 
The Hong Kong University of Science and Technology

While human gut microbes like E. coli help digest food and regulate our immune system, they can also produce toxins that arrest the cell cycle and eventually cause cell death. Scientists have long known that colibactin, a genotoxin produced by E. coli, can induce DNA double-strand breaks in eukaryotic cells and increase the risk of colorectal cancer in humans. However, how colibactin causes DNA damage had remained a mystery, as reconstructing colibactin metabolites is extremely difficult due to the compound's instability, low titer, and the complexity of its biosynthetic pathway.

Now, a research team led by Prof. Qian Peiyuan, Chair Professor of HKUST's Department of Ocean Science and Division of Life Science, has unearthed the missing link using a novel biosynthetic method. The team not only succeeded in cloning the colibactin gene cluster, but also found a way to mass-produce the genes for testing and validation. After repeated assays of various sets of colibactin precursors, the team eventually identified colibactin-645 as the culprit behind the DNA double-strand breaks, and uncovered the colibactin metabolite's biosynthetic pathway as well as its mechanism of causing DNA damage.

Prof. Qian said, "Although a few colibactin metabolites have been reported to damage DNA via DNA crosslinking activity, the genotoxic colibactin that directly induces DNA double-strand breaks had yet to be identified. Our research has confirmed that colibactin-645 exerts direct DNA double-strand breaks, unearthing the missing link that connects colibactin to its health effects on human beings."

LI Zhongrui, a researcher on the team, said the restructuring of colibactin's molecular scaffold provides a model for designing and synthesizing potent DNA-cleaving agents - such as synthetic restriction "enzymes" or chemotherapeutic agents.

Credit: 
Hong Kong University of Science and Technology

Ethanol fuels large-scale expansion of Brazil's farming land

A University of Queensland-led study has revealed that future demand for ethanol biofuel could potentially expand sugarcane farming land in Brazil by five million hectares by 2030.

UQ School of Earth and Environmental Sciences researcher Milton Aurelio Uba de Andrade Junior said that because Brazil produced ethanol from sugarcane, future biofuel demand would directly impact land use.

"Our study has modelled scenarios forecasting future ethanol demand based on different trajectories for gross domestic product, population growth, fuel prices, blending policies, fleet composition and efficiency gains," he said.

"A high demand scenario, fuelled by strong economic and population growth, soaring gasoline prices, and ambitious blending targets, could mean that current demand for ethanol in Brazil will double by 2030.

"If this scenario occurs, then Brazil will need an additional five million hectares of land for sugarcane crops to meet this high demand."

Mr de Andrade Junior said that most of the additional sugarcane farms were likely to expand into pasturelands, minimising impact on native forests.

"A key assumption of our modelling is that Brazil's land-use policies, such as the sugarcane agro-ecological zoning, will continue to promote the increase of agricultural yields while minimising environmental impacts," he said.

"However, in the current context of high uncertainty on the environmental agenda, such land use policies need to be closely monitored and supported to ensure that the country's natural ecosystems and biodiversity remain protected."

Credit: 
University of Queensland

New method for detecting quantum states of electrons

image: This figure, appearing in the article in Physical Review Letters, depicts a copper cell containing liquid helium and a parallel plate capacitor. Konstantinov and his team used microwave radiation to induce quantum states in the electrons.

Image: 
OIST

Quantum computing harnesses enigmatic properties of small particles to process complex information. But quantum systems are fragile and error-prone, and useful quantum computers have yet to come to fruition.

Researchers in the Quantum Dynamics Unit at the Okinawa Institute of Science and Technology Graduate University (OIST) devised a new method -- called image charge detection -- to detect electrons' transitions to quantum states. Electrons can serve as quantum bits, the smallest units of quantum information; these bits are foundational to larger computational systems. Quantum computers may be applied to understanding the mechanism of superconductivity, to cryptography, and to artificial intelligence, among other uses.

"There is a huge gap between controlling a few quantum bits and building a quantum computer," said Dr. Erika Kawakami, the lead author of a new study published in Physical Review Letters as an Editors' Suggestion. "With current state-of-the-art quantum bits, a quantum computer would need to be the size of a football field. Our new approach could potentially create a ten-centimeter chip."

A New Potential for Electrons on Helium

Electrons need to be immobilized to serve as quantum bits; otherwise, they move freely. To create an electron-capturing system, the researchers used liquid helium, which liquefies at very low temperatures, as a substrate. Since helium is free of impurities, electrons on its surface are expected to retain quantum states longer than in any other material, which is important for realizing a quantum computer.

Prof. Denis Konstantinov and his collaborators, Kawakami and Dr. Asem Elarabi, placed a parallel-plate capacitor inside a copper cell cooled to 0.2 kelvin (-272.95 degrees Celsius) and filled with condensed liquid helium. Electrons generated by a tungsten filament sat atop the liquid helium's surface, between the two capacitor plates. Then, microwave radiation introduced into the copper cell excited the electrons' quantum states, causing the electrons to move away from the bottom capacitor plate and closer to the top capacitor plate.

The researchers confirmed the excitation of quantum states by observing an electrostatic phenomenon called image charge. Like a reflection in a mirror, the image charge precisely tracks the movement of the electrons: as an electron moves relative to a capacitor plate, the charge induced on that plate moves along with it.
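The geometry behind this kind of detection can be sketched with the textbook parallel-plate result (the Shockley-Ramo theorem); the plate gap and electron heights below are illustrative assumptions, not the experiment's actual dimensions or circuit.

```python
# Toy illustration of the image-charge idea: an electron (charge -e)
# between two grounded plates separated by D induces a charge on the
# top plate of Q_top = e * z / D, where z is the electron's height
# above the bottom plate (Shockley-Ramo: Q = -q * z/D with q = -e).
# A small upward shift of the electron therefore appears as a
# measurable change in induced charge.

E_CHARGE = 1.602e-19  # elementary charge in coulombs

def induced_charge_top(z, plate_gap):
    """Charge induced on the top plate by an electron at height z."""
    return E_CHARGE * z / plate_gap

D = 2e-3  # 2 mm plate separation (illustrative value)
before = induced_charge_top(0.5e-3, D)  # electron near the bottom plate
after = induced_charge_top(0.6e-3, D)   # electron excited, moved upward
print(after - before)  # the change in induced charge signals the transition
```

The induced charge is positive and grows as the electron approaches the top plate, which is why an upward quantum transition produces a detectable charge shift.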

Moving forward, the researchers hope to use this image charge detection to measure an individual electron's spin state, or quantum orbital state, without disrupting the integrity of the quantum systems.

"Currently, we can detect the quantum states of an ensemble of many electrons," Konstantinov said. "The strong point of this new method is that we can scale down this technique to a single electron and to use it as a quantum bit."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Statistical inference mimics the working style of highly experienced crystallographers

image: Precision of crystallographic observation improves from left to right using measurement design based on the proposed method. Differences between actual and spherical electron densities are shown as blue (positive) and red (negative) surfaces.

Image: 
©Manabu Hoshino

A research team from the Japan Science and Technology Agency (JST), RIKEN, and the University of Tokyo has developed a novel data analysis method for the prior evaluation of single crystal structure analysis. The proposed method is based on precise estimation of a parameter inherent in a small, preliminarily collected data set. The team demonstrated its application to distinguishing guests in host-guest crystals before single crystal structure analysis, and to measurement design for precise crystallographic observation.

Recent advances in measurement devices and analysis programs for single crystal structure analysis have reduced the effort researchers spend on crystallographic observation of molecular structures. However, researchers are sometimes trapped in "blind" measurement-analysis iterations until the result of the analysis satisfies their purpose. "In almost all cases, the iterations are forced by one of two reasons--an observed structure is imprecise, or it is outside the scope of the research purpose," said Dr. Manabu Hoshino, a researcher at JST and RIKEN. While working on this issue to improve the efficiency of single crystal structure analysis, he conceived the idea of evaluating to-be-collected data and to-be-observed structures from preliminary data. The idea was implemented as statistical estimation of a parameter intrinsic to the data set by Bayesian inference.
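As a toy illustration of the idea (the parameter, prior, and numbers below are assumptions made for the sketch, not the authors' actual model), a Bayesian update of a data-quality parameter from a small preliminary data set might look like:

```python
# Hypothetical sketch: conjugate normal-normal Bayesian update of an
# unknown data-quality parameter (e.g., a mean signal-to-noise figure)
# from a handful of preliminary observations. All numbers are invented
# for illustration.

def posterior_normal_mean(observations, prior_mean, prior_var, noise_var):
    """Return posterior mean and variance of the unknown parameter,
    given observations with known measurement noise variance."""
    n = len(observations)
    sample_mean = sum(observations) / n
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + n * sample_mean / noise_var)
    return post_mean, post_var

# Small preliminary data set (illustrative signal-to-noise readings)
prelim = [6.1, 5.8, 6.4, 6.0]
mean, var = posterior_normal_mean(prelim, prior_mean=5.0, prior_var=4.0, noise_var=0.25)
print(round(mean, 2), round(var, 4))
```

Even with only four observations, the posterior tightens sharply around the data, which is the sense in which a preliminary data set can guide crystal selection before a full measurement is committed.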

"Highly experienced crystallographers successfully bypass the iterations by selecting a crystal and setting experimental conditions based on a careful check of a preliminarily collected small data set," said Hoshino. "In our method, an estimated parameter is used to mimic professional crystallographers' crystal selection and measurement design."

The practical utility of the proposed method was demonstrated through the prior distinction of to-be-observed guest solvent molecules included in a porous host crystal. Because crystal structures built from the same host molecule are isomorphous, the difference in included guest solvent molecules is usually clarified only after completing crystal structure analysis. Hoshino and his collaborators applied their method to preliminary small data sets collected from two host-guest crystals grown from different mixed-solvent solutions, and showed that the difference in the estimated parameters can be used to distinguish the guest solvent molecules in these crystals.

Moreover, they used the estimated parameter to simulate to-be-collected data by employing it in theoretical equations describing crystallographic data. From the simulated data, they derived the lowest signal-to-noise ratio threshold that guarantees quantitative data quality. Indeed, collected data satisfying the proposed threshold enabled precise crystallographic observation for studying the deformation of electron density by chemical bonding.

"Our proposed method will help researchers who have little experience in single crystal structure analysis with crystal selection and measurement design," concluded Hoshino. "I believe our method will encourage researchers to exploit advanced measurement devices and analysis programs."

Credit: 
Japan Science and Technology Agency

Novel mechanism of electron scattering in graphene-like 2D materials

image: Hybrid system formed by combining a Bose-Einstein condensate (BEC) and a 2D electron gas (2DEG) in novel 2D materials, such as MoS2. Electrons (black spheres) move in the 2D electron gas (2DEG, upper layer) and interact with particles in the lower layer, where photo-excited electrons and holes (gray spheres, h) form bound electron-hole pairs. The red wiggly lines represent Coulomb forces acting between particles with opposite charges.

Image: 
IBS

Understanding how particles behave at the twilight zone between the macro and the quantum world gives us access to fascinating phenomena, interesting from both the fundamental and application-oriented physics perspectives. For example, ultra-thin graphene-like materials are a fantastic playground to examine electrons' transport and interactions. Recently, researchers at the Center for Theoretical Physics of Complex Systems (PCS), within the Institute for Basic Science (IBS, South Korea), in collaboration with the Rzhanov Institute of Semiconductor Physics (Russia) have reported on a novel electron scattering phenomenon in 2D materials. The paper is published in Physical Review Letters.

The team considered a sample which consists of two subsystems: one made of particles with integer spin (bosons) and the other made of particles with half-integer spin (fermions).

For the bosonic component, they modelled a gas of excitons (bound electron-hole pairs). At low temperatures, quantum mechanics can force a large number of bosonic particles to form a Bose-Einstein condensate (BEC). This state of matter has been reported in different materials, in particular gallium arsenide (GaAs), and it has been predicted in molybdenum disulphide (MoS2).

The fermionic subsystem is a 2D electron gas (2DEG), in which electrons are confined to move in two dimensions. It exhibits intriguing magnetic and electric phenomena, including superconductivity, that is, the passage of current without resistance. These phenomena are related to electron scattering, which is mainly due to impurities and phonons, vibrations of the crystal lattice. The phonon's name derives from the Greek word for sound, since long-wavelength phonons give rise to sound, but phonons also play a role in the temperature-dependent electrical conductivity of metals.

Bosons and fermions are very different at the quantum level, so what happens when we combine a BEC and a 2DEG? Kristian Villegas, Meng Sun, Vadim Kovalev, and Ivan Savenko have modelled electron transport in such a hybrid system.

Beyond the conventional phonons and impurities, the team described an unconventional electron scattering mechanism in BEC-2DEG hybrid systems: the interactions of an electron with one or two Bogoliubov quanta (or bogolons) - excitations of the BEC with small momenta. Although phonons and bogolons share some common features, the team found that they have important differences.

According to the models, in high-quality MoS2 within a certain range of temperatures, resistivity caused by pairs of bogolons dominates over resistivity caused by single bogolons, acoustic phonons, and impurities. The reason for this difference is the mechanism of interaction between electrons and bogolons, which is of an electric nature, as opposed to the electron-phonon interaction, which is described by deformations of the sample.

This research might be useful for the design of novel high-temperature superconductors. An apparent paradox links conductivity and superconductivity: bad conductors are usually good superconductors. In the case of electron-phonon interactions, some materials that show poor conductivity because of strong scattering of electrons by phonons can become good superconductors at very low temperatures. For the same reason, noble metals, such as gold, are good conductors but bad superconductors. If this holds true also for electron-bogolon interactions, the researchers hypothesise, then designing a bad conductor whose high resistivity is caused by two-bogolon scattering might lead to "good" superconductors.

"This work not only opens perspectives in designing hybrid structures with controllable dissipation, it reports on fundamentally different temperature-dependence of scattering at low and high temperatures and sheds light on optically controlled condensate-mediated superconductivity," explains Ivan Savenko, the leader of the Light-Matter Interaction in Nanostructures (LUMIN) team at PCS.

Credit: 
Institute for Basic Science

Novel anti-cancer nanomedicine for efficient chemotherapy

image: The researchers have harnessed exosomes together with synthetic nanomaterial as carriers of anticancer drugs.

Image: 
Santos Lab

Researchers at the University of Helsinki, in collaboration with researchers from Åbo Akademi University, Finland, and Huazhong University of Science and Technology, China, have developed a new anti-cancer nanomedicine for targeted cancer chemotherapy. This new nano-tool provides a new approach to using cell-based nanomedicines for efficient cancer chemotherapy.

Exosomes contain various molecular constituents of their cell of origin, including proteins and RNA. Now the researchers have harnessed them together with synthetic nanomaterial as carriers of anticancer drugs. The new exosome-based nanomedicines enhanced tumor accumulation, extravasation from blood vessels and penetration into deep tumor parenchyma after intravenous administration.

"This study highlights the importance of cell-based nanomedicines", says the principal investigator and one of the corresponding authors of this study, Hélder A. Santos, Associate Professor at the Faculty of Pharmacy, University of Helsinki, Finland.

Nanoparticle-based drug delivery systems have shown promising therapeutic efficacy in cancer. To increase their targetability to tumors, nanoparticles are usually functionalized with targeting antibodies, peptides, or other biomolecules. However, such targeting ligands may sometimes have a negative influence on nanoparticle delivery owing to enhanced immune responses.

Biomimetic nanoparticles, on the other hand, combine the unique functionalities of natural biomaterials, such as cells or cell membranes, with the bioengineering versatility of synthetic nanoparticles, and can be used as an efficient drug delivery platform.

The developed biocompatible exosome-sheathed porous silicon-based nanomedicines for targeted cancer chemotherapy resulted in augmented in vivo anticancer drug enrichment in tumor cells. "This demonstrates the potential of the exosome-biomimetic nanoparticles to act as drug carriers to improve the anticancer drug efficacy", Santos concludes.

Credit: 
University of Helsinki

From primordial black holes new clues to dark matter

image: Projection of the neutral hydrogen fraction at redshift z=2.

Image: 
The Sherwood Simulation Suite

Moving through cosmic forests and spider webs in deep space in search of answers on the origin of the Cosmos. "We have tested a scenario in which dark matter is composed of non-stellar black holes formed in the primordial Universe," says Riccardo Murgia, lead author of the study recently published in Physical Review Letters. The research was carried out together with his colleagues Giulio Scelfo and Matteo Viel of SISSA - International School for Advanced Studies and INFN - Istituto Nazionale di Fisica Nucleare (Trieste division) and Alvise Raccanelli of CERN. Primordial black holes (PBHs to cosmologists) are objects that formed just fractions of a second after the Big Bang; many researchers consider them among the principal candidates for explaining the nature of dark matter, above all following the direct observations of gravitational waves by the VIRGO and LIGO detectors in 2016.

"Primordial black holes remain hypothetical objects for the moment, but they are envisaged in some models of the primordial universe" underlines Raccanelli of CERN. "Initially proposed by Stephen Hawking in 1971, they have come back to the fore in recent years as possible candidates for explaining dark matter. It is believed that this accounts for approximately 80% of all matter present in the Universe, so to explain even just a small part of it would be a major achievement.

Moreover, looking for evidence of the existence of PBHs, or excluding it, provides us with highly relevant information on the physics of the primordial universe."

Cosmic forests and spider webs

In this work, the scientists concentrated on the abundance of PBHs with masses around 50 times the solar mass. In short, the researchers tried to better constrain several parameters linked to their presence (specifically mass and abundance) by analysing the interaction of light emitted from extremely distant quasars with the cosmic web, a network of filaments of gas and dark matter present throughout the Universe. Within this dense weave, they concentrated on the "Lyman-alpha forest", namely the interactions of photons with the hydrogen of cosmic filaments, whose characteristics are closely linked to the fundamental nature of dark matter.

Between supercomputers and telescopes

Simulations carried out on the Ulysses supercomputer of SISSA and ICTP reproduced the interactions between photons and hydrogen, and these were compared with the "real" interactions detected by the Keck telescope in Hawaii. The researchers were then able to constrain several properties of primordial black holes and understand the effects of their presence.

"We used a computer to simulate the distribution of neutral hydrogen on sub-galactic scales, which manifests itself in the form of absorption lines in the spectra of distant sources," continues Murgia. "Comparing the results of our simulations with the data observed, it is possible to establish limits on the mass and abundance of primordial black holes and determine whether and to what extent such candidates can constitute dark matter."

The results of the study seem to disfavour the scenario in which all dark matter is composed of a certain type of primordial black hole (those with masses greater than 50 times that of the Sun), but they do not entirely exclude that such black holes could constitute a fraction of it.

"We have developed a new way to easily and efficiently explore alternative scenarios of the standard cosmological model, according to which dark matter would instead be composed of particles called WIMPs (Weakly Interacting Massive Particles)".

These results, important for the construction of new theoretical models and for the development of new hypotheses about the nature of dark matter, offer much more precise indications for tracing the intricate path to understanding one of the largest mysteries of the cosmos.

Credit: 
Scuola Internazionale Superiore di Studi Avanzati

Emissions from cannabis growing facilities may impact indoor and regional air quality

image: According to new research, strongly scented airborne chemicals called biogenic volatile organic compounds (BVOCs), which are naturally produced by cannabis plants during growth and reproduction, may impact indoor and outdoor air quality.

Image: 
Vera Samburova/DRI

The same chemicals responsible for the pungent smell of a cannabis plant may also contribute to air pollution on a much larger scale, according to new research from the Desert Research Institute (DRI) and the Washoe County Health District (WCHD) in Reno, Nev.

In a new pilot study, DRI scientists visited four cannabis growing facilities in Nevada and California to learn about the chemicals that are emitted during the cultivation and processing of cannabis plants, and to evaluate the potential for larger-scale impacts to urban air quality.

At each facility, the team found high levels of strongly-scented airborne chemicals called biogenic volatile organic compounds (BVOCs), which are naturally produced by the cannabis plants during growth and reproduction. At facilities where cannabis oil extraction took place, researchers also found very high levels of butane, a volatile organic compound (VOC) that is used during the oil extraction process.

"The concentrations of BVOCs and butane that we measured inside of these facilities were high enough to be concerning," explained lead author Vera Samburova, Ph.D., Associate Research Professor of atmospheric science at DRI. "In addition to being potentially hazardous to the workers inside the cannabis growing and processing facilities, these chemicals can contribute to the formation of ground-level ozone if they are released into the outside air."

Although ozone in the upper atmosphere provides protection from UV rays, ozone at ground-level is a toxic substance that is harmful for humans to breathe. Ozone can be formed when volatile organic compounds (including those from plants, automobile, and industrial sources) combine with nitrogen oxide emissions (often from vehicles or fuel combustion) in the presence of sunlight. All of these ozone ingredients are in ample supply in Nevada's urban areas, Samburova explained - and that impacts our air quality.

"Here in our region, unfortunately, we already exceed the national air quality standard for ground-level ozone quite a few times per year," Samburova said. "That's why it is so important to answer the question of whether emissions from cannabis facilities are having an added impact."

At one of the four cannabis growing facilities visited during this study, the team measured emission rates over time to learn about the ozone-forming potential of each individual plant. The results show that the BVOCs emitted by each cannabis plant could trigger the formation of ground-level (bad) ozone at a rate of approximately 2.6 g per plant per day. The significance of this number is yet to be determined, says Samburova, but she and her team feel strongly that their findings have raised questions that warrant further study.
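For scale, the reported per-plant figure can be extrapolated in a back-of-the-envelope way; the facility size below is a hypothetical assumption, not a value from the study.

```python
# Rough sketch (not from the study): scaling the reported ozone-forming
# potential of ~2.6 g of ozone per plant per day up to a hypothetical
# facility. The plant count is an illustrative assumption.

OZONE_G_PER_PLANT_PER_DAY = 2.6  # per-plant estimate reported by the team

def facility_ozone_kg_per_year(n_plants, days=365):
    """Total ozone-forming potential in kilograms per year."""
    return n_plants * OZONE_G_PER_PLANT_PER_DAY * days / 1000.0

print(facility_ozone_kg_per_year(10_000))  # a hypothetical 10,000-plant facility
```

At that assumed scale, the per-plant rate adds up to several tonnes of ozone-forming potential per year, which illustrates why facility-level emissions are worth quantifying.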

"This really hasn't been studied before," Samburova said. "We would like to collect more data on emissions rates of plants at additional facilities. We would like to take more detailed measurements of air quality emissions outside of the facilities, and be able to calculate the actual rate of ozone formation. We are also interested in learning about the health impacts of these emissions on the people who work there."

The cannabis facility personnel that the DRI research team interacted with during the course of the study were all extremely welcoming, helpful, and interested in doing things right, Samburova noted. Next, she and her team hope to find funding to do a larger study, so that they can provide recommendations to the growing facilities and WCHD on optimum strategies for air pollution control.

"With so much growth in this industry across Nevada and other parts of the United States, it's becoming really important to understand the impacts to air quality," said Mike Wolf, Permitting and Enforcement Branch Chief for the WCHD Air Quality Management Division. "When new threats emerge, our mission remains the same: Implement clean air solutions that protect the quality of life for the citizens of Reno, Sparks, and Washoe County. We will continue to work with community partners, like DRI, to accomplish the mission."

Credit: 
Desert Research Institute

Later puberty and later menopause associated with lower risk of type 2 diabetes in women, while use of contraceptive pill and longer time between periods associated with higher risk

New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept) shows that use of the contraceptive pill and longer menstrual cycles are associated with a higher risk of developing type 2 diabetes (T2D), while later puberty and later menopause are associated with lower risk.

The study, by Dr Sopio Tatulashvili, Avicenne Hospital, Bobigny, France, and colleagues, suggests that in general longer exposure to sex hormones, but later in life, could reduce the risk of diabetes, and that women at high-risk of T2D taking the contraceptive pill may require personalised advice.

Early screening to detect poor blood sugar control (that may lead to T2D) could lower the risk of further complications. For this reason, it is important to identify the risk factors of T2D. The aim of this study was to determine the association between various hormonal factors and the risk of developing T2D in the large prospective female E3N cohort study.*

The study included 83,799 French women from the E3N prospective cohort, followed between 1992 and 2014. Statistical models adjusted for the main T2D risk factors were used to estimate the strength and statistical significance of associations between various hormonal factors and T2D risk. The risk factors adjusted for included body mass index, smoking, age, physical activity, socioeconomic status, education level, family history of T2D, and blood pressure.

The authors observed that higher age at puberty (aged over 14 years versus under 12 years) reduced T2D risk by 12%, and increased age at menopause (52 years and over compared to under 47 years) reduced risk by 30%. Breastfeeding (ever breastfed versus never breastfed) was also associated with a 10% reduced risk of developing T2D.

Furthermore, an increased total lifetime number of menstrual cycles (over 470 in a woman's lifetime versus under 390) was associated with a 25% reduced risk of developing T2D, and longer duration of exposure to sex hormones (meaning the time between puberty and menopause) (over 38 years compared with under 31 years) was associated with a 34% decreased risk of developing T2D.
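As a rough illustration of how such exposure variables relate to reproductive milestones (a simplification invented for this sketch, not the study's actual data pipeline), lifetime cycle counts and hormone-exposure duration can be approximated from ages at puberty and menopause:

```python
# Illustrative sketch only: deriving the two exposure variables from
# reproductive milestones. It ignores pregnancies, breastfeeding, and
# contraceptive use, so it overestimates real cycle counts.

def lifetime_cycles(age_puberty, age_menopause, cycle_days=28.0):
    """Approximate lifetime number of menstrual cycles."""
    years = age_menopause - age_puberty
    return years * 365.25 / cycle_days

def hormone_exposure_years(age_puberty, age_menopause):
    """Duration of exposure to sex hormones, in years."""
    return age_menopause - age_puberty

print(round(lifetime_cycles(12, 50)))  # ≈ 496 cycles over 38 exposure years
```

Note how a longer cycle length lowers the lifetime cycle count for the same exposure window, which is consistent with the study treating cycle count and cycle length as distinct factors.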

By contrast, the use of contraceptive pills (at least once during a woman's lifetime compared with no use at all) was associated with a 33% increased risk of developing T2D, and longer time between periods (menstrual cycle length) (32 days and over versus 24 days and under) was associated with a 23% increased risk.

The authors say: "It seems that longer exposure to sex hormones but later in life could reduce the risk of later developing type 2 diabetes, independent of well-established risk factors. Risk induced by oral contraceptives could lead to personalised advice for young women at risk of developing T2D, such as those with a family history of diabetes, those who are overweight or obese, or those with polycystic ovary syndrome."

Credit: 
Diabetologia

Analysis of studies into alcohol consumption in people with type 2 diabetes suggests recommendations to moderate drinking may need to be reviewed

A meta-analysis of studies presented at this year's Annual Meeting of the European Association for the Study of Diabetes in Barcelona, Spain (16-20 September) shows that recommendations to moderate alcohol consumption for people with type 2 diabetes (T2D) may need to be reviewed, since low-to-moderate consumption could have a positive effect on blood glucose and fat metabolism.

The study is by Yuling Chen, Southeast University, Nanjing, China, and Dr Li Ling, Director of the Department of Endocrinology, Zhongda Hospital and School of Medicine, Southeast University, Nanjing, China and colleagues.

However, regardless of the effects on metabolism shown by this analysis, advice from various diabetes organisations, including Diabetes UK,* remains that people with T1D or T2D need to be careful with alcohol consumption, since drinking makes a hypoglycaemic episode (known as a hypo) more likely because alcohol makes blood sugars drop. It can also cause weight gain and other health issues.

The authors searched the PubMed, Embase, and Cochrane databases up to March 2019 for randomised controlled trials (RCTs) that assessed the relationship between alcohol consumption and glucose and lipid metabolism among adults with T2D. Data extracted from the RCTs were pooled and analysed by statistical modelling.

The authors found ten relevant RCTs involving 575 participants that were included in this review. Meta-analysis showed that alcohol consumption was associated with reduced triglyceride levels and insulin levels, but had no statistically significant effect on fasting blood glucose levels, glycated haemoglobin (HbA1c, a measure of blood glucose control), or total cholesterol, low density lipoprotein (bad) cholesterol, and high density lipoprotein (good) cholesterol.
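A meta-analysis of this kind typically pools effect sizes by inverse-variance weighting; the sketch below uses hypothetical trial numbers, not the review's data.

```python
# Minimal fixed-effect inverse-variance pooling sketch. The effect
# sizes (mean differences in triglycerides, mmol/L) and variances are
# invented for illustration, not taken from the ten included RCTs.

import math

def pool_fixed_effect(effects, variances):
    """Return the pooled effect estimate and its standard error,
    weighting each trial by the inverse of its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical per-trial mean differences and variances
effects = [-0.20, -0.15, -0.30]
variances = [0.010, 0.020, 0.015]
pooled, se = pool_fixed_effect(effects, variances)
print(round(pooled, 3), round(se, 3))
```

The pooled standard error is smaller than any single trial's, which is why combining ten RCTs can reveal effects (or confirm their absence) that individual small trials cannot.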

Subgroup analysis indicated that drinking light to moderate amounts of alcohol decreased the levels of triglycerides (blood fats) and insulin in people with T2D. Light to moderate drinking was defined by the authors as 20 g or less of alcohol per day. This translates to approximately 1.5 cans of beer (330 ml, 5% alcohol), a large (200 ml) glass of wine (12% alcohol), or a 50 ml serving of 40% alcohol spirits (for example, vodka or gin).

The authors conclude: "Findings of this meta-analysis show a positive effect of alcohol on glucose and fat metabolism in people with type 2 diabetes. Larger studies are needed to further evaluate the effects of alcohol consumption on blood sugar management, especially in patients with type 2 diabetes."

Credit: 
Diabetologia

To address hunger, many countries may have to increase carbon footprint

Achieving an adequate, healthy diet in most low- and middle-income countries will require a substantial increase in greenhouse gas emissions and water use due to food production, according to new research from the Johns Hopkins Center for a Livable Future based at the Johns Hopkins Bloomberg School of Public Health.

The paper will be published online September 16 in the journal Global Environmental Change.

Obesity, undernutrition, and climate change are major global challenges that impact the world's population. While these problems may appear to be unrelated, they share food production and consumption as key underlying drivers. By recognizing the role of food production in climate change, this study examines the challenges of simultaneously addressing hunger and the climate crisis at both the individual and country levels.

For their analysis, the researchers developed a model that assessed how alterations to dietary patterns across 140 countries would impact individual- and country-level greenhouse gas emissions and freshwater use. They used this model to assess the per capita and whole-country climate and water footprints of nine plant-forward diets, including no-red-meat, pescatarian, lacto-ovo vegetarian, and vegan diets, among others.

A key finding of the study showed that a diet in which the animal protein came predominantly from low food chain animals, such as small fish and mollusks, had nearly as low an environmental impact as a vegan diet. Researchers also determined that a diet that involved reducing animal food consumption by two-thirds--termed by study authors as "two-thirds vegan"--generally had a lower climate and water footprint than the more traditional lacto-ovo vegetarian diet.

"Our research indicates there's no one-size-fits-all diet to address the climate and nutrition crises. Context is everything, and the food production policies for each country must reflect that," says senior author of the study, Keeve Nachman, PhD, director of the Food Production and Public Health program at the Johns Hopkins Center for a Livable Future and an assistant professor with the Bloomberg School's Department of Environmental Health and Engineering.

To counter these climate impacts and to address diet-related morbidity and mortality, the authors recommend that high-income countries accelerate the adoption of plant-forward diets. The authors emphasize that an examination of these diets and their environmental footprints allows for consideration of dietary recommendations or behavioral changes that would balance health and nutrition needs, cultural preferences, and planetary boundaries.

"Our data indicate that it is actually dairy product consumption that explains much of the differences in greenhouse gas footprints across diets. Yet, at the same time, nutritionists recognize the important role dairy products can have in stunting prevention, which is a component of the World Bank Human Capital Index," says study co-author, Martin Bloem, MD, PhD, director of the Johns Hopkins Center for a Livable Future and the Robert Lawrence Professor of Environmental Health at the Bloomberg School. The World Bank's Human Capital Index calculates the contribution of health and education to the productivity of future generations of workers.

"The study findings highlight the difficulty in prescribing broad dietary recommendations to meet the needs of individual countries," says Bloem.

A food's country of origin can have enormous consequences for climate, according to the study. For example, one pound of beef produced in Paraguay contributes nearly 17 times more greenhouse gases than one pound of beef produced in Denmark. Often, this disparity is due to deforestation resulting from grazing land. "Where you get your food from matters," says Nachman. "Trade patterns have an important influence on countries' diet-related climate and fresh water impacts."

The methodology used in the study allows for new data-driven comparisons between countries and regions, and also takes into account the different contexts and conditions in these countries. The study integrates country-specific data such as current food availability and trade and import patterns with information about greenhouse gas and water use burdens that are associated with the production of specific food items by country of origin. It also takes into account the carbon emissions associated with land use changes for purposes of food production.

"It would be satisfying to have a silver bullet to address carbon footprints and the impact of food production; however, with problems as complex and global as nutrition, climate change, freshwater depletion, and economic development, that's not possible," says Bloem. "There will always be tradeoffs. Environmental impact alone cannot be a guide for what people eat; countries need to consider the totality of the nutritional needs, access, and cultural preferences of their residents. The good news is this research can be a part of the solution, as it now gives policymakers a tool to develop nationally appropriate strategies, including dietary guidelines, that help meet multiple goals."

Credit: 
Johns Hopkins Bloomberg School of Public Health

Study: Americans would rather drive themselves than have an autonomous vehicle drive them

image: Researchers at the University of Washington studied how Americans' perceived cost of commute time changes depending on who's driving. The team set up a survey that asked people across the continental U.S. to select between a personal car or a ride-hailing service for a 15-mile commute trip.

Image: 
Gao et al./Transportation

Many Americans use a ride-hailing service -- like Uber or Lyft -- to get to and from work. It provides the privacy of riding in a personal car and the convenience of catching up on emails or social media during traffic jams. In the future, self-driving vehicles could provide the same service, except without a human driver.

But would consumers be willing to ride in a driverless car?

Researchers at the University of Washington studied how Americans' perceived cost of commute time changes depending on who's driving. Through a survey, the team found that people considered a ride-hailing service at least 13% "less expensive," in terms of time, compared to driving themselves. If the researchers told people the ride-hailing service was driverless, however, then the cost of travel time increased to 15% more than driving a personal car, suggesting that at least for now, people would rather drive themselves than have an autonomous vehicle drive them.

The team published its results Aug. 6 in the journal Transportation.

"The idea here is that 'time is money,' so the overall cost of driving includes both the direct financial costs and the monetary equivalent of time spent traveling," said senior author Don MacKenzie, a UW associate professor of civil and environmental engineering who also leads the UW's Sustainable Transportation Lab. "The average person in our sample would find riding in a driverless car to be more burdensome than driving themselves. This highlights the risks of making forecasts based on how people say they would respond to driverless cars today."

The team set up a survey that asked people across the continental U.S. to select between a personal car or a ride-hailing service for a 15-mile commute trip. Half the 502 respondents were told that the ride-hailing service was driverless.

The researchers converted the responses into a score reflecting how much respondents felt the trip's travel time cost them per hour.

"If someone values their trip time at $15 per hour, that means they dislike an hour spent traveling as much as they dislike giving up $15," said co-author Andisheh Ranjbari, a research engineer at the UW's Supply Chain Transportation & Logistics Center. "So a lower number means that the time spent traveling for that trip is less burdensome."

On average, respondents preferred a ride-hailing service over driving themselves: Ride-hailing services scored at $21 an hour and driving scored $25 an hour. In addition, if the researchers reminded respondents they could multitask during a ride-hailing service ride, their perceived cost of travel time decreased even more to $13 per hour.

Technically, a ride-hailing service should be equally convenient regardless of whether a human or an autonomous vehicle is driving, but respondents disagreed. Driverless ride-hailing services scored at $28 an hour.

These results make sense, according to the team. Driverless cars aren't commercially available yet, so people are not familiar with them or may be leery of the technology.

"We believe that our respondents are telling us that if they were riding in an automated vehicle today, they would be sufficiently stressed out by the experience that it would be worse than driving themselves," MacKenzie said. "This is a reminder that automated vehicles will need to offer benefits to consumers before people will adopt them. To a first approximation, a ride-hailing service with driverless cars would need to offer services at a price at least $7 per hour less than human-driven cars, to make the driverless service more attractive."
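The arithmetic behind that last figure follows directly from the per-hour time costs reported above; the dollar values are the article's, and the calculation below is purely illustrative:

```python
# Perceived per-hour cost of travel time, as reported in the article ($/hour).
drive_yourself = 25           # driving a personal car
ride_hailing_human = 21       # human-driven ride-hailing service
ride_hailing_driverless = 28  # driverless ride-hailing service

# Price gap a driverless service would need to close to be as
# attractive (in perceived time cost) as a human-driven service.
price_gap = ride_hailing_driverless - ride_hailing_human
print(price_gap)  # → 7
```

This matches MacKenzie's first-approximation estimate that a driverless service would need to be priced at least $7 per hour below a human-driven one.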

Credit: 
University of Washington

NASA finds Humberto strengthening off the Florida Coast

image: On Sept. 15, 2019, the MODIS instrument that flies aboard NASA's Aqua provided a visible image of Tropical Storm Humberto spinning off the eastern coast of Florida and strengthening. Powerful thunderstorms circled the center and a large band of thunderstorms wrapped into the low-level center from the east.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA's Aqua satellite provided a visible image of Tropical Storm Humberto as it was strengthening off the Florida coast on Sept. 15. Humberto became a hurricane late that day.

On Sept. 15, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite provided a visible image of Tropical Storm Humberto spinning off the eastern coast of Florida and strengthening. Powerful thunderstorms circled the center and a large band of thunderstorms wrapped into the low-level center from the east. Humberto became a hurricane on Sept. 15 at 11 p.m. EDT.

On that day at 2:11 p.m. EDT (1811 UTC), the Atmospheric Infrared Sounder (AIRS) instrument aboard NASA's Aqua satellite analyzed Humberto's cloud top temperatures in infrared light. AIRS found that the coldest cloud top temperatures of the strongest thunderstorms, located around the center, were as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius). NASA research has shown that storms with cloud top temperatures that cold can produce heavy rainfall.

On Sept. 16, Humberto was stirring up the seas and creating hazardous conditions. Humberto continued to get better organized to the west of Bermuda and was pushing large swells that were affecting much of the southeastern United States coastline. The National Hurricane Center (NHC) cautioned that interests in and around Bermuda should monitor the progress of Humberto, since a Tropical Storm Watch would likely be required for Bermuda later in the day.

At 11 a.m. EDT (1500 UTC), the center of Hurricane Humberto was located near latitude 29.9 degrees north and longitude 76.5 degrees west, placing the center about 710 miles (1,145 km) west of Bermuda. Humberto was moving toward the east-northeast near 7 mph (11 kph), and this general motion with a gradual increase in forward speed was expected through early Thursday.

Data from an Air Force Reserve reconnaissance aircraft indicated that maximum sustained winds were near 85 mph (140 kph) with higher gusts. The minimum central pressure recently measured by reconnaissance aircraft was 978 millibars.

In the NHC discussion, Forecaster Stacy Stewart noted, "Humberto has been strengthening at a rate of 20 knots per 24 hours since this time yesterday, and that trend is expected to continue for the next day or so given the warm water beneath the hurricane and a continued favorable upper-level outflow pattern. The hurricane is expected to peak as a major hurricane in 36 to 48 hours."

On the forecast track, the center of Humberto is forecast to approach Bermuda Wednesday night, Sept. 18. NHC forecasters said that strengthening is expected during the next 48 hours, and Humberto could become a major hurricane by Tuesday night, Sept. 17.

Credit: 
NASA/Goddard Space Flight Center

NASA finds Tropical Depression Peipah dissipating

image: On Sept. 16, 2019, the MODIS instrument aboard NASA's Terra satellite provided a visible image of Peipah. The storm had diminished to a swirl of clouds, with only a small area of strong thunderstorms southwest of the center. Wind shear from the northeast pushed those storms southwest of center.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA's Terra satellite passed over the northwestern Pacific Ocean and provided a final view of Tropical Depression Peipah.

Peipah developed on Sept. 14 as a depression. It was the seventeenth tropical depression (17W) of the Northwestern Pacific Ocean typhoon season. On Sept. 15 at 11 a.m. EDT (1500 UTC), the depression strengthened into a tropical storm with maximum sustained winds near 35 knots (40 mph) and was named Peipah. On Sept. 16, Peipah weakened back to a depression.

On Sept. 16, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Terra satellite provided a visible image of Peipah. The storm had diminished to a swirl of clouds, with only a small area of strong thunderstorms southwest of the center. Wind shear from the northeast pushed those storms southwest of center.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked on top of the others vertically in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels.

Tropical Depression Peipah was dissipating on Sept. 16 at 11 a.m. EDT (1500 UTC). It was centered near 24.8 degrees north latitude and 142.8 degrees east longitude, about 82 miles east of Iwo To Island, Japan. Peipah was moving to the northwest and had maximum sustained winds near 25 knots.

The Joint Typhoon Warning Center issued its final warning on the system as it continued to dissipate.

Credit: 
NASA/Goddard Space Flight Center

Potential target for diabetes-associated Alzheimer's disease

image: Depletion of Caveolin-1 in Type-2 Diabetes Model Induces Alzheimer's disease Pathology Precursors

Image: 
Bonds et al., <em>JNeurosci</em> (2019)

Researchers have identified a protein that may contribute to the progression of Alzheimer's disease pathology in type-2 diabetes, reports a new study of male mice and human brain tissue. The research could have implications for future drug development.

The cause of sporadic, late-onset Alzheimer's disease is unknown. However, type-2 diabetes is associated with an increased Alzheimer's risk, which may provide a clue to its origin.

Bonds et al. examined the relationship between diabetes and Alzheimer's disease and found that a protein called caveolin-1 (Cav-1) is depleted in the temporal lobe of humans with diabetes and in a diabetic mouse model. Depletion of Cav-1 causes the upregulation of amyloid precursor protein and β-amyloid levels. Importantly, restoring Cav-1 levels in mice reduced Alzheimer's pathology and improved learning and memory deficits, revealing a potential mechanism responsible for the increased risk of Alzheimer's disease in this population.

Credit: 
Society for Neuroscience