
Heating poppy seeds, but not baking them in muffins, reduces opiate levels

You might have heard the advice to avoid eating a poppy seed bagel or muffin before a drug screen, lest you test positive for opiates. This urban legend is rooted in truth because the tiny black seeds contain small amounts of morphine and codeine that can show up in a drug test. Now, researchers reporting in ACS' Journal of Agricultural and Food Chemistry have studied how different treatments affect levels of opiates in poppy seeds.

For thousands of years, people have grown poppies both for their colorful flowers and their analgesic properties. The plant is the source of opium, which has long been used as a medicinal and recreational drug. Although opium comes from a sap-like substance that surrounds the capsule encasing the seeds, some of this substance can be transferred to the seeds during handling. As a result, scientists have detected low levels of opium alkaloids, such as morphine, codeine and thebaine, in poppy seeds. Benjamin Redan and colleagues wanted to measure levels of these opiates in commercially available seeds and determine whether different treatments, including heating or baking in a muffin, could affect their levels.

The researchers used mass spectrometry to measure the levels of three major opium alkaloids in 15 samples of commercially available poppy seeds, and detected large variations in opiate concentrations. Heating the seeds at 392 °F for at least 40 minutes degraded most of the alkaloids. However, baking the seeds within or on top of a muffin for 16 minutes at 392 °F didn't significantly change morphine, codeine or thebaine concentrations in the seeds, possibly because the internal and external temperatures of the muffins reached only 211 °F and 277 °F, respectively. Although this study showed that heating poppy seeds could reduce opiate levels, this treatment would likely alter the sensory properties or reduce the shelf life of the seeds, the researchers say.
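For readers who work in Celsius, the Fahrenheit figures above convert with the standard formula C = (F - 32) × 5/9; a trivial sketch:

```python
# Standard Fahrenheit-to-Celsius conversion for the temperatures quoted above.
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# 392 F (heating/oven)    -> 200.0 C
# 211 F (muffin interior) -> ~99.4 C
# 277 F (muffin exterior) -> ~136.1 C
for f in (392, 211, 277):
    print(f"{f} F = {f_to_c(f):.1f} C")
```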

Credit: 
American Chemical Society

Breakthrough in research on production of 2D crystals with excellent optical properties

image: Artistic visualisation: a monolayer of the 2D material molybdenum diselenide (MoSe2) is grown by directing molecular beams of selenium (yellow) and molybdenum (blue) onto an atomically flat hexagonal boron nitride substrate. Thanks to this substrate, the MoSe2 epilayer exhibits excellent optical properties. The image was chosen for the cover of the May 2020 issue of ACS Nano Letters (Source: UW Physics, A. Bogucki, W. Pacuski)

Image: 
Source: UW Physics, A. Bogucki, W. Pacuski

For the first time, monolayers of transition metal dichalcogenides with excellent optical properties were grown. A team of physicists from the University of Warsaw managed to overcome the technical difficulties faced by industry and scientists from around the world, namely the very limited size, heterogeneity, and broadening of the spectral lines of fabricated materials. Monolayers without these defects were grown by molecular beam epitaxy on atomically flat boron nitride substrates.

Two-dimensional crystals with a honeycomb structure, including the famous graphene, have already revolutionized nanoscience and have the potential to revolutionize common technologies, as well. Therefore, it is highly desirable to develop industrial-scale methods for their production.

However, despite substantial investments in the development of growth techniques for atomically thin crystals, the best-quality monolayers are currently still obtained by exfoliation, i.e. by mechanically detaching individual atomic layers from a bulk crystal. For example, graphene flakes exfoliated from bulk graphite exhibit superior electrical properties compared to grown graphene. The drawback is that mechanically exfoliated monolayers are rather small.

Similarly, optical properties of two-dimensional transition metal dichalcogenides (e.g. molybdenum diselenide) are fully revealed only for layers obtained as a result of exfoliation and after having been subjected to further mechanical treatment, such as placing them between layers of boron nitride. However, as already mentioned, this technique does not lead to atomically thin crystals on a larger scale, resulting in heterogeneity, limited size, and even to the appearance of corrugations, bubbles, and irregular edges.

Hence, it is crucial to develop a technique for growing two-dimensional transition metal dichalcogenides that will allow for the production of monolayers with a large surface area. Currently, one of the most advanced technologies for producing thin semiconductor crystals is molecular beam epitaxy (MBE). It provides low-dimensional structures on large wafers, with high homogeneity, but its effectiveness in the production of transition metal dichalcogenides has been very limited so far. In particular, the optical properties of MBE-grown monolayers have hitherto been rather modest, e.g. spectral lines have been broad and weak, giving no hope for the use of the spectacular optical properties of transition metal dichalcogenides on a larger scale.

It is in this area that researchers from the Faculty of Physics of the University of Warsaw made a breakthrough. In collaboration with several laboratories from Europe and Japan, they conducted a series of studies on the growth of transition metal dichalcogenides monolayers on an atomically flat boron nitride substrate. In this way, using the MBE method, they obtained flat crystals, equal in size to the substrate, showing uniform parameters over the entire surface, including - most valuably - excellent optical properties.

The results of the work have just been published in the latest volume of the prestigious journal Nano Letters. The discovery directs future research into the industrial production of atomically thin materials. In particular, it indicates the need to develop larger atomically flat boron nitride wafers. On such wafers, it will be possible to grow monolayers with the optical quality, dimensions, and homogeneity required for optoelectronic applications.

Physics and Astronomy first appeared at the University of Warsaw in 1816, under the then Faculty of Philosophy. In 1825 the Astronomical Observatory was established. Currently, the Faculty of Physics' Institutes include Experimental Physics, Theoretical Physics, Geophysics, Department of Mathematical Methods and an Astronomical Observatory. Research covers almost all areas of modern physics, on scales from the quantum to the cosmological. The Faculty's research and teaching staff includes ca. 200 university teachers, of which 87 are employees with the title of professor. The Faculty of Physics, University of Warsaw, is attended by ca. 1000 students and more than 170 doctoral students.

Credit: 
University of Warsaw, Faculty of Physics

New liver cancer research targets non-cancer cells to blunt tumor growth

PHILADELPHIA -- "Senotherapy," a treatment that uses small molecule drugs to target "senescent" cells, or those cells that no longer undergo cell division, blunts liver tumor progression in animal models, according to new research from a team led by Celeste Simon, PhD, a professor of Cell and Developmental Biology in the Perelman School of Medicine at the University of Pennsylvania and scientific director of the Abramson Family Cancer Research Institute. The study was published in Nature Cell Biology.

"This kind of therapy is not something that has been tried before with liver cancer," Simon said. "And in our models, so-called 'senolytic' therapy greatly reduced disease burden, even in cases with advanced disease."

Loss of the enzyme FBP1 in human liver cells significantly increases tumor growth. Previous research has shown FBP1 levels are decreased in stage 1 tumors, and further reduced as the disease progresses. In this study, Simon and her team used RNA-sequencing data to identify FBP1 as universally under-expressed in the most common form of liver cancer, hepatocellular carcinoma, regardless of underlying causes like obesity, alcoholism, and hepatitis.

The loss of FBP1 in liver cells activates the neighboring hepatic "stellate cells"--which make up ten percent of liver mass--causing fibrosis (tissue scarring) and subsequent stellate cell senescence, both of which promote tumor growth. Researchers found that these senescent stellate cells can be selectively targeted by senolytics, including Navitoclax (already in clinical trials for other diseases, like hematological malignancies), in order to blunt tumor progression driven by liver cell-specific FBP1 loss.

The team provides the first genetic evidence that FBP1 is a bona fide metabolic tumor suppressor in the liver and that its loss in liver cells promotes tumor growth through effects on other cells within the tumor microenvironment.

Using genetically engineered mouse models, the team eliminated FBP1 and found that the disease progressed more rapidly and tumor burden greatly increased in carcinogen-mediated, dietary, and other forms of hepatocellular carcinoma.

"The case with liver cancer is very dire; once you get beyond a certain stage, there are limited, if any, treatments available," Simon said. "As obesity rates continue to increase and viral infections continue to be a problem, there is going to be an increasing surge of liver cancer, which currently has few treatment options. And since FBP1 activity is also lost in renal cancer, FBP1 depletion may be generally applicable to a number of human cancers. What's unique about our senotherapy approach is that we are specifically targeting other cells in the liver tumor environment rather than the cancer cells themselves."

Credit: 
University of Pennsylvania School of Medicine

Should tomatoes go in the fridge?

image: Keep your tomatoes in the fridge or in the fruit bowl? The expert panel didn't find a huge difference in flavour.

Image: 
Division of Quality of Plant Products, University of Göttingen

There is much debate about the correct storage of tomatoes. There are two main options available to consumers: storage in the refrigerator or at room temperature. A research team from the University of Göttingen has now investigated whether there are differences in the flavour of ripe tomatoes depending on how they are stored, taking into account the entire post-harvest chain from farm to fork. No perceptible difference was found: the variety of tomato is much more important. The results have been published in the journal Frontiers in Plant Science.

How does the flavour change when ripe, picked tomatoes go through a commercial post-harvest chain and are then stored either in the refrigerator (7 degrees Celsius) or at room temperature (20 degrees Celsius)? Researchers from the Division of Quality of Plant Products at the University of Göttingen analysed flavour-related attributes in new tomato strains, drawing on the expertise of a "sensory panel". The sensory panel consisted of experienced and trained assessors who use their senses to perceive and evaluate the sensory properties of products. Among other attributes, the panel examined the discernible sweetness, acidity and juiciness of the tomatoes. No significant differences in flavour were found between the two storage options when the entire post-harvest chain was taken into account.

"It is the variety of tomato in particular that has an important influence on the flavour. Therefore, the development of new varieties with an appealing flavour can be a step towards improving the flavour quality of tomatoes," says Larissa Kanski, lead author of the study. "The shorter the storage period, the better it is for the flavour and related attributes. However, we were able to show that, taking into account the entire post-harvest chain, short-term storage of ripe tomatoes in the refrigerator did not affect the flavour," reports Head of Division Professor Elke Pawelzik.

Credit: 
University of Göttingen

Climate change will turn coastal Antarctica green, say scientists

image: Dr. Matt Davey sampling snow algae at Lagoon Island, Antarctica.

Image: 
Sarah Vincent

Scientists have created the first ever large-scale map of microscopic algae as they bloomed across the surface of snow along the Antarctic Peninsula coast. Results indicate that this 'green snow' is likely to spread as global temperatures increase.

The team, involving researchers from the University of Cambridge and the British Antarctic Survey, combined satellite data with on-the-ground observations over two summers in Antarctica to detect and measure the green snow algae. Although each individual alga is microscopic in size, when they grow en masse they turn the snow bright green and can be seen from space. The study is published today in the journal Nature Communications.

"This is a significant advance in our understanding of land-based life on Antarctica, and how it might change in the coming years as the climate warms," said Dr Matt Davey in the University of Cambridge's Department of Plant Sciences, who led the study. "Snow algae are a key component of the continent's ability to capture carbon dioxide from the atmosphere through photosynthesis."

Blooms of green snow algae are found around the Antarctic coastline, particularly on islands along the west coast of the Antarctic Peninsula. They grow in 'warmer' areas, where average temperatures are just above zero degrees Celsius during the austral summer - the Southern Hemisphere's summer months of November to February. The Peninsula is the part of Antarctica that experienced the most rapid warming in the latter part of the last century.

The team found that the distribution of green snow algae is also strongly influenced by marine birds and mammals, whose excrement acts as a highly nutritious natural fertiliser to accelerate algal growth. Over 60% of blooms were found within five kilometres of a penguin colony. Algae were also observed growing near the nesting sites of other birds, including skuas, and areas where seals come ashore.

The team used images from the European Space Agency's Sentinel 2 satellite taken between 2017 and 2019, and combined these with measurements they made on the ground in Antarctica at Ryder Bay, Adelaide Island, and the Fildes Peninsula, King George Island.

"We identified 1679 separate blooms of green algae on the snow surface, which together covered an area of 1.9 km2, equating to a carbon sink of around 479 tonnes per year," said Davey. Put into context, this is the same amount of carbon emitted by about 875,000 average petrol car journeys in the UK.
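As a rough sanity check on that comparison, the study's 479 tonnes of carbon can be converted to CO2 and divided across the quoted number of journeys. This is a back-of-envelope sketch; the per-kilometre emission figure is a typical-value assumption, not from the study:

```python
# Sanity-check the carbon-sink vs. car-journey comparison from the release.
CARBON_SINK_TONNES = 479     # tonnes of elemental carbon per year (from the study)
CAR_JOURNEYS = 875_000       # average UK petrol car journeys (from the study)

# Convert elemental carbon to CO2 using the molecular-to-atomic mass ratio 44/12.
co2_tonnes = CARBON_SINK_TONNES * 44 / 12
kg_co2_per_journey = co2_tonnes * 1000 / CAR_JOURNEYS

# Assuming a typical petrol car emits ~0.15-0.2 kg CO2 per km (our assumption),
# ~2 kg CO2 per journey implies trips of roughly 10-13 km, a plausible UK average.
print(round(kg_co2_per_journey, 2))  # ≈ 2.01 kg CO2 per journey
```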

Almost two thirds of the green algal blooms were on small, low-lying islands with no high ground. As the Antarctic Peninsula warms due to rising global temperatures, these islands may lose their summer snow cover and with it their snow algae. However, in terms of mass, the majority of snow algae is found in a small number of larger blooms in the north of the Peninsula and the South Shetland Islands, in areas where they can spread to higher ground as low-lying snow melts.

"As Antarctica warms, we predict the overall mass of snow algae will increase, as the spread to higher ground will significantly outweigh the loss of small island patches of algae," said Dr Andrew Gray, lead author of the paper, and a researcher at the University of Cambridge and NERC Field Spectroscopy Facility, Edinburgh.

Photosynthesis is the process in which plants and algae generate their own energy, using sunlight to capture carbon dioxide from the atmosphere and release oxygen. There are many different types of algae, from the tiny, single-celled species measured in this study, to large leafy species like giant kelp. The majority of algae live in watery environments, and when excess nitrogen and phosphorus are available they can multiply rapidly to create visible algal blooms.

The researchers say that the total amount of carbon held in Antarctic snow algae is likely to be much larger because carbon dioxide is also taken up by other red and orange algae, which could not be measured in this study. They plan further work to measure these other algal blooms, and also to measure the blooms across the whole of Antarctica using a mixture of field work and satellite images.

Antarctica is the world's southernmost continent, typically known as a frozen land of snow and ice. But terrestrial life can be abundant, particularly along its coastline, and is responding rapidly to climate changes in the region. Mosses and lichens form the two biggest visible groups of photosynthesising organisms, and have been the most studied to date. This new study has found that microscopic algae also play an important role in Antarctica's ecosystem and its carbon cycling.

Credit: 
University of Cambridge

Heat now more lethal than cold for people with respiratory diseases in Spain

Barcelona, 20 May 2020. A new study by the Barcelona Institute for Global Health (ISGlobal), a centre supported by the "la Caixa" Foundation, has analysed deaths linked to respiratory disease in Spain between 1980 and 2016. The study, which analysed data on more than 1.3 million deaths, found that the seasonality of temperature-attributable mortality from respiratory diseases has shifted from the coldest to the hottest months of the year. The authors concluded that the decrease in temperature-attributable mortality during the winter months is driven not by the rising temperatures associated with climate change, but by the adaptation of the population to lower temperatures.

The study, published in Nature Communications, analysed daily temperature data and mortality counts from respiratory diseases--disaggregated by sex, age group and place of residence--from 48 Spanish provinces. Analysis of the data on mortality due to respiratory diseases revealed an average decline in deaths of 16.5% per decade for the colder months compared to relatively stable figures for the warmer months of the year over the 37-year study period. Temperature-attributable deaths from respiratory diseases went from being most frequent in January and December to reaching their peak in July and August.
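To see what the per-decade figure implies over the full 37-year study period, the decline can be compounded across 3.7 decades. A back-of-envelope sketch; it assumes the 16.5% decline compounds geometrically, which the study may not state:

```python
# Compound the reported per-decade decline in cold-month respiratory mortality
# over the 37-year (3.7-decade) study window.
DECLINE_PER_DECADE = 0.165   # 16.5% per decade (from the study)
DECADES = 3.7                # 1980-2016 span expressed in decades

remaining = (1 - DECLINE_PER_DECADE) ** DECADES
total_decline = 1 - remaining
print(round(total_decline * 100, 1))  # ≈ 48.7 (% total decline)
```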

"Two or three decades ago, respiratory diseases caused by low temperatures represented an additional risk of death in Spain," commented lead author Hicham Achebak, a researcher at ISGlobal and the Autonomous University of Barcelona's Centre for Demographic Studies. "The findings of this study show that this risk has gradually been declining. Thanks to adaptive measures, such as the more widespread use of heating and improved treatment of these conditions, respiratory disease mortality is no longer driven by cold temperatures and we are seeing a complete reversal in the seasonal cycle."

Although this inversion was observed across all sex and age groups, there were differences between the groups. Vulnerability to heat increased with age and was greater in women than in men. Conversely, the effects of cold decreased with age and were less pronounced in women than in men, although the differences between groups were much less striking in this case. "In the later years of our study period, the differences in mortality risk between groups were almost imperceptible for cold temperatures, whereas the differences for the summer months were significant," commented ISGlobal researcher Joan Ballester, co-author of the study. "These observations reflect a remarkable process of adaptation to cold, but not to heat."

Climate Change and Health Policy

Climate change is associated with numerous health effects. Extreme temperatures, for example, correlate with cardiovascular and respiratory diseases. "This study shows that the projected decrease in the number of cold days due to global warming over the coming decades will not contribute to a further reduction in mortality from respiratory diseases," commented Achebak.

"Deaths attributable to hot or cold temperatures are caused by a combination of exposure to extreme temperatures and the vulnerability of the population," explained Ballester. "Reducing this vulnerability may require policies associated with socioeconomic development, such as those aimed at improving health services."

Credit: 
Barcelona Institute for Global Health (ISGlobal)

Every heart dances to a different tune

Sophia Antipolis - 20 May 2020: Play the same piece of music to two people, and their hearts can respond very differently. That's the conclusion of a novel study presented today on EHRA Essentials 4 You, a scientific platform of the European Society of Cardiology (ESC).

This pioneering research revealed how music triggers individual effects on the heart, a vital first step to developing personalised music prescriptions for common ailments or to help people stay alert or relaxed.

"We used precise methods to record the heart's response to music and found that what is calming for one person can be arousing for another," said Professor Elaine Chew of the French National Centre for Scientific Research (CNRS).

Previous studies investigating physiological responses to music have measured changes in heart rate after listening to different recordings simply categorised as 'sad', 'happy', 'calm', or 'violent'.

This small study took a more precise approach, featuring several unique aspects. Three patients with mild heart failure requiring a pacemaker were invited to a live classical piano concert. Because they all had pacemakers, their heart rates could be kept constant during the performance. The researchers measured the electrical activity of the heart directly from the pacemaker leads before and after 24 points in the score (and performance) where there were stark changes in tempo, volume, or rhythm.

Specifically, they measured the time it takes the heart to recover after a heartbeat. "Heart rate affects this recovery time, so by keeping that constant we could assess electrical changes in the heart based on emotional response to the music," said Professor Chew.

"We are interested in the heart's recovery time (rather than heart rate) because it is linked to the heart's electrical stability and susceptibility to dangerous heart rhythm disorders," explained the project's medical lead Professor Pier Lambiase of University College London. "In some people, life-threatening heart rhythm disorders can be triggered by stress. Using music we can study, in a low risk way, how stress (or mild tension induced by music) alters this recovery period."

The researchers found that the change in the heart's recovery time differed significantly from person to person at the same junctures in the music. In some listeners, recovery time shortened by as much as 5 milliseconds, indicating increased stress or arousal; in others, it lengthened by as much as 5 milliseconds, indicating greater relaxation.

Commenting on the individual nature of reactions, Professor Chew said: "Even though two people might have statistically significant changes across the same musical transition, their responses could go in opposite directions. So for one person the musical transition is relaxing, while for another it is arousing or stress inducing."

For example: a person not expecting a transition from soft to loud music could find it stressful, leading to a shortened heart recovery time. For another person it could be the resolution to a long build-up in the music and hence a release, resulting in a lengthened heart recovery time.

Professor Chew said: "By understanding how an individual's heart reacts to musical changes, we plan to design tailored music interventions to elicit the desired response."

"This could be to reduce blood pressure or lower the risk of heart rhythm disorders without the side effects of medication," added Professor Lambiase.

Professor Chew noted that while the number of patients in the study is small, the researchers amassed gigabytes of data. The results are currently being confirmed in a total of eight patients.

Credit: 
European Society of Cardiology

Deciphering the fine neuroendocrine regulatory system during development

image: Figure from "The Corazonin-PTTH neuronal axis controls systemic body growth by regulating basal ecdysteroid biosynthesis in Drosophila melanogaster"

Image: 
University of Tsukuba

Tsukuba, Japan - Development, growth and reproduction are highly regulated in all animals. One of the key components of these processes is the precise action of steroid hormones. In a new study, researchers from the University of Tsukuba uncovered a regulatory pathway that controls the biosynthesis of ecdysteroid, a steroid hormone, in the fruit fly Drosophila melanogaster, enabling proper body size adjustment during the developmental transition from growth to maturation.

Steroid hormones are key for growth and maturation during development. Because different developmental stages require variable amounts of steroid hormones, the central nervous system tightly regulates their synthesis. In Drosophila, one such control mechanism involves the production of prothoracicotropic hormone (PTTH), which is known to be important for maintaining low, or basal, ecdysteroid levels as well as for stimulating high levels of ecdysteroid production at particular time points to facilitate proper development.

"The amount of steroid hormone being produced determines its function," says senior author of the study Professor Ryusuke Niwa. "Basal levels of ecdysteroid are important for inhibiting larval growth, while peak levels facilitate maturation later on. The goal of our study was to further our understanding of what mechanisms modulate PTTH activity for basal ecdysteroid biosynthesis."

To achieve their goal, the researchers used fluorescence microscopy and found that neurons producing corazonin (Crz) make physical contact with PTTH-producing neurons in Drosophila. Next, they used electron microscopy to investigate the connection between these neurons in more detail. They found that fusion sites of so-called dense core vesicles (DCVs), into which neuroactive peptides are known to be packaged, are present between Crz neurons and PTTH neurons -- which suggests that they communicate with each other. Deactivating Crz neurons resulted in increased pupal body size, but did not affect the timing of the larva-to-pupa transition. Interestingly, larvae with inhibited Crz neuronal activity grew faster during the L3 stage, which is the final instar larval stage in Drosophila.

"These findings show how Crz neurons control growth during the L3 stage," says first author of the study, Eisuke Imura. "Using liquid chromatography/mass spectrometry analyses of 20-hydroxyecdysone, the active form of ecdysteroid, we also found that Crz neurons control the timing of basal ecdysteroid biosynthesis. We wanted to know what happens at the molecular level during the mid-L3 stage and how Crz neurons themselves are regulated."

Using fluorescence microscopy, the researchers investigated downstream molecules of Crz signaling and found that Crz receptors on PTTH neurons are present at higher levels during the mid-L3 stage, the feeding stage of larvae, than during the late-L3 stage, the wandering stage of larvae. By imaging calcium levels, which represent Crz receptor-mediated signaling, the researchers further confirmed that Crz neurons affect PTTH neurons only during the mid-L3 stage. In a search for molecules that regulate Crz signaling, the researchers found that the absence of octopamine, a neurotransmitter, replicated the effects of deactivating Crz neurons.

"These are striking results that show how Crz is a key molecule for the body size adjustment during the larval stage" says Professor Shimada-Niwa, a corresponding author of this study. "Our findings provide new insights into a neuronal axis that regulates growth and maturation during development."

Credit: 
University of Tsukuba

New wearable sensor tracks vitamin C levels in sweat

image: A wearable, non-invasive vitamin C sensor could provide a new, highly personalized option for users to track their daily nutritional intake and dietary adherence

Image: 
University of California San Diego

A team at the University of California San Diego has developed a wearable, non-invasive vitamin C sensor that could provide a new, highly personalized option for users to track their daily nutritional intake and dietary adherence. The study was published in the May 18, 2020 issue of ACS Sensors.

"Wearable sensors have traditionally been focused on their use in tracking physical activity, or for monitoring disease pathologies, like in diabetes," said first author Juliane Sempionatto, a PhD candidate in nanoengineering in Joseph Wang's lab at the UC San Diego Jacobs School of Engineering. "This is the first demonstration of using an enzyme-based approach to track changes in the level of a necessary vitamin, and opens a new frontier in the wearable device arena."

"Wearable sensors have rarely been considered for precision nutrition," said Joseph Wang, a professor of nanoengineering and director of the Center of Wearable Sensors at UC San Diego.

Why vitamin C is important

Vitamin C is an essential dietary component, as it cannot be synthesized by the human body and must be obtained through our food or via vitamin supplements. The vitamin is important for supporting immune health and collagen production, a vital player in wound healing, as well as improving iron absorption from plant-based foods. Ongoing research is examining whether or not the vitamin's role as an antioxidant might support its use in treating diseases like cancer and heart disease.

Most pressingly, the vitamin is being studied in several clinical trials for its potential in supporting recovery from COVID-19, the disease caused by the novel SARS-CoV-2 virus. A handful of past studies have linked high doses of vitamin C, alongside other treatments, to reduced mortality rates in patients with sepsis and, in one study, acute respiratory distress syndrome (ARDS) - both common conditions seen in serious cases where patients with COVID-19 require intensive care and intubation.

If vitamin C does help patients recover from the disease, such a wearable sensor might aid doctors and recovering patients in tracking their vitamin C levels during treatment and recovery, providing an opportunity for healthcare providers to precisely tune vitamin supplementation to match a patient's needs.

The wearable device

The new wearable device consists of an adhesive patch that can be applied to a user's skin, containing a system to stimulate sweating and an electrode sensor designed to quickly detect vitamin C levels in sweat. To do so, the device includes flexible electrodes containing the enzyme ascorbate oxidase. When vitamin C is present, the enzyme converts it to dehydroascorbic acid, and the resulting consumption of oxygen generates a current that is measured by the device.

In vitro testing and testing in four human subjects who had consumed vitamin C supplements and vitamin C-containing fruit juices showed that the device was highly sensitive to changes in the levels and dynamics of the vitamin when tracked across two hours. The researchers also tested the electrode detector's ability to detect temporal vitamin C changes in tears and saliva, demonstrating its cross-functionality. Differences observed in the vitamin C dynamics across different human subjects indicate that the device has promise for personal nutrition applications.
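The release does not describe how the measured current is mapped to a vitamin C concentration. Amperometric biosensors of this kind are commonly read out through a linear calibration curve fitted to known standards within the sensor's working range; the sketch below illustrates that generic procedure, with all functions and numbers hypothetical rather than taken from the paper:

```python
# Hypothetical illustration: map an amperometric sensor's current readout to
# analyte concentration via a least-squares linear calibration curve.
def fit_calibration(concentrations, currents):
    """Fit current = slope * concentration + intercept by least squares."""
    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_i = sum(currents) / n
    slope = (sum((c - mean_c) * (i - mean_i)
                 for c, i in zip(concentrations, currents))
             / sum((c - mean_c) ** 2 for c in concentrations))
    intercept = mean_i - slope * mean_c
    return slope, intercept

def concentration_from_current(current, slope, intercept):
    """Invert the calibration line to estimate concentration from a reading."""
    return (current - intercept) / slope

# Made-up calibration standards (concentration vs. measured current).
slope, intercept = fit_calibration([0, 10, 20, 30], [0.5, 2.5, 4.5, 6.5])
print(concentration_from_current(3.5, slope, intercept))  # ~15.0
```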

"Ultimately, this sort of device would be valuable for supporting behavioral changes around diet and nutrition," said Sempionatto. "A user could track not just vitamin C, but other nutrients - a multivitamin patch, if you will. This is a field that will keep growing fast." The UC San Diego team is closely collaborating with DSM, a major global nutrition company, on the use of wearable sensors for personal nutrition.

"Despite the rapid development of wearable biosensors, the potential of these devices to guide personalized nutrition has not yet been reported," said Wang. "I hope that the new epidermal patch will facilitate the use of wearable sensors for non-invasive nutrition status assessments and tracking of nutrient uptake toward detecting and correcting nutritional deficiencies, assessing adherence to vitamin intake, and supporting dietary behavior change."

With the pressing need to develop new treatments for COVID-19, the team is also looking for ways to quickly get this technology into a clinical setting, in the event that vitamin C does prove to be a helpful treatment for the disease.

Credit: 
University of California - San Diego

Iron nanorobots go undercover

image: Labeled cells could be tracked either in cell cultures or once injected into a living animal.

Image: 
© 2020 KAUST

Living cells inside the body could be placed under surveillance--their location and migration noninvasively tracked in real time over many days--using a new method developed by researchers at KAUST.

The technique uses magnetic core-shell iron nanowires as nontoxic contrast agents, which can be implanted into live cells, lighting up those cells' location inside a living organism when scanned by magnetic resonance imaging (MRI). The technique could have applications ranging from studying and treating cancer to tracking live-cell medical treatments, such as stem cell therapies.

Jürgen Kosel and his team recently showed that core-shell iron nanowires could selectively kill cancer cells with a combination attack, delivering an anticancer drug into target cells while also puncturing the cell's membrane and unleashing blasts of heat. Now, in collaboration with researchers from CIC biomaGUNE in San Sebastian, Spain, the team has shown that the same type of nanowire, with an iron core and an iron-oxide shell, can be used for noninvasive medical imaging. The nanowires could potentially be used as "theranostic" agents, able to identify, track and then take out target cells.

"Cell labeling and tracking has become an invaluable tool for scientific and clinical applications," says Aldo Martínez-Banderas, a Ph.D. student in Kosel's team. "One of the key aspects of cell tracking studies is the sensitivity to detect a small number of cells after implantation, so the strong magnetization and biocompatibility of our nanowires are advantageous characteristics for MRI tracking."

The nanowires performed well as MRI contrast agents, even at very low concentrations, and the magnetic response could be tuned by altering the thickness of the nanowire shell, the team showed. The nanowire's biocompatibility permitted long-term tracking of the live cells. "The nanowires interacted with cells without compromising their survival, functionality or capacity to proliferate," Martínez-Banderas explains. The labeled cells could be tracked either in cell cultures or once injected into a living animal. "The strong magnetization of the nanowires enabled the detection of approximately 10 labeled cells within the brain of a mouse for a period of at least 40 days, which allowed us to trace their exact location and fate in the animal," Martínez-Banderas says.

"These core-shell nanowires have various additional features, including the ability to control them magnetically to guide them to a particular location, to carry drugs, or to be heated with a laser," Kosel says. "Combining all of that with the capability of tracking creates a theranostic platform that can open the door for very promising new approaches in nanomedicine."

Credit: 
King Abdullah University of Science & Technology (KAUST)

Hunting threatens one of the world's most amazing wildlife migrations

image: Many of these fascinating birds are unfortunately declining, with several on the brink of extinction.

Image: 
The University of Queensland

As the world looks to tighten up the illegal capture of wildlife, migratory birds are being threatened by widespread and unsustainable hunting across the Asia-Pacific region.

University of Queensland-led research has revealed that three quarters of migratory shorebird species in the region have been hunted since the 1970s.

UQ PhD student Eduardo Gallo-Cajiao said the finding was deeply concerning, as these globetrotters were already under pressure from other human impacts.

"The Asia-Pacific is host to one of the most amazing animal migrations on earth," Mr Gallo-Cajiao said.

"Every year, hundreds of thousands of shorebirds, wetland-dependent species, breed across the Arctic and boreal regions, moving south to Southeast Asia, Australia, and New Zealand along a migration corridor known as the East Asian-Australasian Flyway.

"The Flyway spans 22 countries, through which 61 species of shorebirds complete their epic annual migrations, some covering up to 25,000 km each year.

"But many of these fascinating birds are unfortunately declining, with several on the brink of extinction.

"Until now, habitat loss due to the expansion of coastal infrastructure had been identified as one of the main causes of their declines, particularly around the Yellow Sea region of China and the Korean peninsula, where many birds stop to rest and feed on their migrations.

"The scale and significance of hunting was unknown prior to this study, and it's clear that it's likely contributed to declines of migratory shorebirds in this region."

The team worked for four years assembling all available evidence on hunting - analysing hunting records from 14 countries, involving 46 species.

But there are knowledge gaps, as they could not find data for eight countries.

Currently, there are five shorebird species at high risk of extinction in this region, including the critically endangered spoon-billed sandpiper, of which fewer than 500 remain.

"Our study discovered that other threatened species that have been subject to hunting include the great knot, far eastern curlew, and spotted greenshank," Mr Gallo-Cajiao said.

UQ's Professor Richard Fuller said managing hunting was complicated by the broad range of people involved, from recreational hunters to subsistence hunters and commercial traders.

"At least some hunting is driven by issues of food security, so sustainable development must be considered when developing alternatives for management," Professor Fuller said.

"There's no coordinated monitoring of how many shorebirds are taken annually across the region, which makes management really hard.

"Internationally coordinated approaches to address hunting are now underway, including through the UN Convention on Migratory Species, but these efforts need to be drastically ramped up to avoid extinctions and maintain healthy wildlife populations.

"Additional ground surveys and an international coordinated monitoring strategy are also urgently needed."

Credit: 
University of Queensland

Exercise improves memory, boosts blood flow to brain

image: The 'A' image shows the cerebral blood flow in a group of older adults at risk to develop Alzheimer's disease after one year of aerobic exercise training. The yellow and white represents increased flow into the hippocampus, the anterior cingulate cortex, and other frontal regions. The 'B' image shows no change or a reduction in blood flow in a group of at-risk older adults who did one year of stretching only.

Image: 
UTSW

DALLAS - May 20, 2020 - Scientists have collected plenty of evidence linking exercise to brain health, with some research suggesting fitness may even improve memory. But what happens during exercise to trigger these benefits?

New UT Southwestern research that mapped brain changes after one year of aerobic workouts has uncovered a potentially critical process: Exercise boosts blood flow into two key regions of the brain associated with memory. Notably, the study showed this blood flow can help even older people with memory issues improve cognition, a finding that scientists say could guide future Alzheimer’s disease research.

“Perhaps we can one day develop a drug or procedure that safely targets blood flow into these brain regions,” says Binu Thomas, Ph.D., a UT Southwestern senior research scientist in neuroimaging. “But we’re just getting started with exploring the right combination of strategies to help prevent or delay symptoms of Alzheimer’s disease. There’s much more to understand about the brain and aging.”

Blood flow and memory

The study, published in the Journal of Alzheimer's Disease, documented changes in long-term memory and cerebral blood flow in 30 participants, each of them 60 or older with memory problems. Half of them underwent 12 months of aerobic exercise training; the rest did only stretching.

The exercise group showed a 47 percent improvement in some memory scores after one year, compared with minimal change in the stretching participants. Brain imaging of the exercise group, taken while they were at rest at the beginning and end of the study, showed increased blood flow into the anterior cingulate cortex and the hippocampus – neural regions that play important roles in memory function.

Other studies have documented benefits for cognitively normal adults on an exercise program, including previous research from Thomas that showed aging athletes have better blood flow into the cortex than sedentary older adults. But the new research is significant because it plots improvement over a longer period in adults at high risk to develop Alzheimer’s disease.

“We’ve shown that even when your memory starts to fade, you can still do something about it by adding aerobic exercise to your lifestyle,” Thomas says.

Mounting evidence

The search for dementia interventions is becoming increasingly pressing: More than 5 million Americans have Alzheimer’s disease, and the number is expected to triple by 2050.

Recent research has helped scientists gain a greater understanding of the molecular genesis of the disease, including a 2018 discovery from UT Southwestern’s Peter O’Donnell Jr. Brain Institute that is guiding efforts to detect the condition before symptoms arise. Yet extensive research into how to prevent or slow dementia has not yielded proven treatments that would make an early diagnosis actionable for patients.

UT Southwestern scientists are among many teams across the world trying to determine if exercise may be the first such intervention. Evidence is mounting that it could at least play a small role in delaying or reducing the risk of Alzheimer’s disease.

For example, a 2018 study showed that people with lower fitness levels experienced faster deterioration of vital nerve fibers in the brain called white matter. A study published last year showed exercise correlated with slower deterioration of the hippocampus.

Regarding the importance of blood flow, Thomas says it may someday be used in combination with other strategies to preserve brain function in people with mild cognitive impairment.

“Cerebral blood flow is a part of the puzzle, and we need to continue piecing it together,” Thomas says. “But we’ve seen enough data to know that starting a fitness program can have lifelong benefits for our brains as well as our hearts.”

Credit: 
UT Southwestern Medical Center

The moral machine

Is it OK to kill time? Machines used to find this question difficult to answer, but a new study reveals that Artificial Intelligence can be programmed to judge 'right' from 'wrong'.

In a study published in Frontiers in Artificial Intelligence, scientists used books and news articles to 'teach' a machine moral reasoning. Further, by limiting the teaching materials to texts from different eras and societies, subtle differences in moral values were revealed. As AI becomes more ingrained in our lives, this research could help machines make the right choice when confronted with difficult decisions.

"Our study provides an important insight into a fundamental question of AI: Can machines develop a moral compass? If so, how can they learn this from our human ethics and morals?" says Dr. Patrick Schramowski, author of this study, based at the Darmstadt University of Technology, Germany. "We show that machines can learn about our moral and ethical values and be used to discern differences among societies and groups from different eras."

Previous research has highlighted the danger of AI learning biased associations from written text. For example, models trained on such text associate females with the arts and males with technology.

"We asked ourselves: if AI adopts these malicious biases from human text, shouldn't it be able to learn positive biases like human moral values to provide AI with a human-like moral compass?" explains co-author of this study, Dr Cigdem Turan, also based at Darmstadt University.

The researchers trained their AI system, named the Moral Choice Machine, with books, news and religious text, so that it could learn the associations between different words and sentences.

Turan explains, "You could think of it as learning a world map. The idea is to make two words lie closely on the map if they are often used together. So, while 'kill' and 'murder' would be two adjacent cities, 'love' would be a city far away. Extending this to sentences, if we ask, 'Should I kill?' we expect that 'No, you shouldn't.' would be closer than 'Yes, you should.' In this way, we can ask any question and use these distances to calculate a moral bias - the degree of right from wrong."
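The distance idea Turan describes can be sketched with toy vectors. The embeddings below are invented for illustration; a real system would use learned sentence embeddings, and the moral bias would be aggregated over many question-answer templates:

```python
# Toy illustration of the distance-based "moral bias" idea: the question's
# embedding is compared against the embeddings of its two candidate answers.
# The 3-D vectors here are made up purely to show the arithmetic.
import numpy as np

def cosine(a, b):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical embeddings: the question sits nearer one answer than the other.
question   = np.array([0.9, 0.1, 0.2])   # "Should I kill?"
answer_no  = np.array([0.8, 0.2, 0.3])   # "No, you shouldn't."
answer_yes = np.array([0.1, 0.9, 0.4])   # "Yes, you should."

# Moral bias: positive when "no" is the closer answer, negative otherwise.
bias = cosine(question, answer_no) - cosine(question, answer_yes)
print(round(bias, 3))  # positive here, i.e. the model leans toward "no"
```

With real sentence embeddings, the same subtraction distinguishes "kill people" (strongly negative bias) from "kill time" (neutral or positive), which is exactly the contextual sensitivity described below.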

Once the scientists had trained the Moral Choice Machine, it adopted the moral values of the given text.

"The machine could tell the difference between contextual information provided in a question," reports Schramowski. "For instance, no, you should not kill people, but it is fine to kill time. The machine did this, not by simply repeating the text it found, but by extracting relationships from the way humans have used language in the text."

Investigating further, the scientists wondered how different types of written text would change the moral bias of the machine.

"The moral bias extracted from news published between 1987 and 1996-97 reflects that it is extremely positive to marry and become a good parent. The extracted bias from news published between 2008-09 still reflects this, but to a lesser degree. Instead, going to work and school increased in positive bias," says Turan.

In the future, the researchers hope to understand how removing a stereotype that we consider to be bad affects the moral compass of the machine. Can we keep the moral compass unchanged?

"Artificial Intelligence handles increasingly complex human tasks in increasingly autonomous ways - from self-driving cars to health care. It is important to continue research in this area so that we can trust the decisions they make," concludes Schramowski.

Credit: 
Frontiers

Cognitive behavioural therapy reduces the impact of dissociative seizures

Scientists have found that adding cognitive behavioural therapy (CBT) to standardised medical care gives patients with dissociative seizures longer periods of seizure freedom, less bothersome seizures and a greater quality of life. The study, published in Lancet Psychiatry today by the Cognitive Behavioural Therapy for Adults with Dissociative Seizures (CODES) study group, was funded by the National Institute for Health Research (NIHR).

Dissociative seizures, also called functional or non-epileptic seizures, look similar in appearance to epileptic seizures or fainting but are related to a different type of involuntary blackout that is typically distressing and disabling for patients and their carers. Up to 1 in 5 adults presenting in epilepsy clinics have this hidden condition, which is one of several types of Functional Neurological Disorder (FND). Historically, patients with dissociative seizures have often been ignored or disbelieved by doctors, and research on treatment is limited. The condition is more common in women and usually has a poor outcome, with a worse quality of life than in people with epilepsy alone. People with dissociative seizures also show a marked increase in health service use.

In the largest treatment trial to date for dissociative seizures, 368 patients from centres across England, Scotland, and Wales were followed up 6 months and 12 months after treatment courses began. Researchers found patients treated with dissociative seizure specific CBT alongside standardised medical care reported the highest number of consecutive dissociative seizure-free days in the previous 6 months, along with greater functional status, self-rated and doctor-rated change in global impression scores, and satisfaction with treatment when compared with standardised medical care alone.

Lead author Laura Goldstein, Professor of Clinical Neuropsychology at the Institute of Psychiatry, Psychology & Neuroscience (IoPPN), King's College London said, "We have delivered the first large-scale multi-centre and multi-professional trial investigating treatments for adults with dissociative seizures. This is especially important as the availability of treatment for people with this disorder has been so variable in the UK and elsewhere.

"While overall there appeared to be a reduction in how often people in both groups of the study were having dissociative seizures at the end of the trial, with no clear difference between the groups, the group who had received our package of dissociative seizure-specific CBT were reporting better functioning across a range of everyday situations. They described their dissociative seizures as less bothersome, they were less distressed, reported better health and fewer symptoms, and were more satisfied with their treatment. It is important to consider providing dissociative seizure-specific CBT in addition to specialist care from neurologists and psychiatrists to treat people with dissociative seizures."

In the UK, there is currently no standardised care pathway for people with dissociative seizures. The researchers recommend the incorporation of seizure-specific CBT within specialist care from neurologists and psychiatrists. Furthermore, as participants received treatment in varied medical settings, this study suggests that the CBT combination intervention does not have to be limited to highly specialised centres and can be delivered by a range of clinical psychologists or cognitive behavioural psychotherapists. Researchers suggest future work is needed to identify which patients would benefit most from a dissociative seizure-specific CBT approach.

Neurologist, Professor Jon Stone, who was a co-investigator in the study said, "The CODES Trial is a landmark study for a condition which has, for too long, been ignored by health services. The trial has encouraged participating neurologists, psychiatrists and psychotherapists across the UK to raise their standard of care for these patients. The trial has set a new bar for evidence in this area, making it clear that there are thousands of new patients with this condition in the UK every year who we need to keep doing better for in terms of treatment and research in the future."

Co-investigator Trudie Chalder, Professor of Cognitive Behavioural Psychotherapy at IoPPN, said, "With appropriate training and supervision, we now have evidence for the effectiveness of dissociative seizure specific CBT combined with standardised medical care. This is good news for patients who have often felt misunderstood and health care professionals (HCPs) who have wanted guidance on best practice."

Credit: 
NIHR Maudsley Biomedical Research Centre

How cosmic rays may have shaped life

image: Showers of high energy particles originating from the sun and our galaxy collide with nitrogen and oxygen in the upper atmosphere. At ground level, the shower is dominated by magnetically polarized muons. At the protobiological site, nucleic acids assumed either a right-handed or left-handed helical conformation. The magnetically polarized radiation preferentially ionized one type of 'handedness' leading to a slightly different mutation rate between the two mirror proto-lifeforms. Over time, right-handed molecules out-evolved their left-handed counterparts.

Image: 
Simons Foundation

Before there were animals, bacteria or even DNA on Earth, self-replicating molecules were slowly evolving their way from simple matter to life beneath a constant shower of energetic particles from space.

In a new paper, a Stanford professor and a former post-doctoral scholar speculate that this interaction between ancient proto-organisms and cosmic rays may be responsible for a crucial structural preference, called chirality, in biological molecules. If their idea is correct, it suggests that all life throughout the universe could share the same chiral preference.

Chirality, also known as handedness, is the existence of mirror-image versions of molecules. Like the left and right hand, two chiral forms of a single molecule reflect each other in shape but don't line up if stacked. In every major biomolecule - amino acids, DNA, RNA - life only uses one form of molecular handedness. If the mirror version of a molecule is substituted for the regular version within a biological system, the system will often malfunction or stop functioning entirely. In the case of DNA, a single wrong-handed sugar would disrupt the stable helical structure of the molecule.

Louis Pasteur first discovered this biological homochirality in 1848. Since then, scientists have debated whether the handedness of life was driven by random chance or some unknown deterministic influence. Pasteur hypothesized that, if life is asymmetric, then it may be due to an asymmetry in the fundamental interactions of physics that exist throughout the cosmos.

"We propose that the biological handedness we witness now on Earth is due to evolution amidst magnetically polarized radiation, where a tiny difference in the mutation rate may have promoted the evolution of DNA-based life, rather than its mirror image," said Noémie Globus, lead author of the paper and a former Koret Fellow at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC).

In their paper, published on May 20 in Astrophysical Journal Letters, the researchers detail their argument in favor of cosmic rays as the origin of homochirality. They also discuss potential experiments to test their hypothesis.

Magnetic polarization from space

Cosmic rays are an abundant form of high-energy radiation that originates from various sources throughout the universe, including stars and distant galaxies. After hitting the Earth's atmosphere, cosmic rays eventually degrade into showers of fundamental particles. At ground level, most of the surviving shower particles are muons.

Muons are unstable particles, existing for a mere 2 millionths of a second, but because they travel near the speed of light, they have been detected more than 700 meters below Earth's surface. They are also magnetically polarized, meaning, on average, muons all share the same magnetic orientation. When muons finally decay, they produce electrons with the same magnetic polarization. The researchers believe that the muon's penetrative ability allows it and its daughter electrons to potentially affect chiral molecules on Earth and everywhere else in the universe.
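The claim that a particle living only about 2 microseconds can reach hundreds of meters underground follows from relativistic time dilation. A back-of-envelope check, assuming a Lorentz factor of 20 (an illustrative value for atmospheric muons, not one taken from the paper):

```python
# Back-of-envelope check of the muon numbers above. The Lorentz factor is
# an assumed, typical value for illustration only.
TAU = 2.2e-6   # muon proper lifetime, seconds (~2 millionths of a second)
C = 3.0e8      # speed of light, m/s
GAMMA = 20.0   # assumed Lorentz factor for a fast atmospheric muon

# Without time dilation, a muon moving near c travels only about c * tau
# before decaying: roughly 660 m, far less than the ~15 km of atmosphere.
naive_range_m = C * TAU

# With time dilation, the lab-frame lifetime stretches by gamma, extending
# the typical range to roughly 13 km.
dilated_range_m = GAMMA * C * TAU

print(round(naive_range_m), round(dilated_range_m))  # → 660 13200
```

This is why muons produced high in the atmosphere not only reach the ground but penetrate well below it, carrying their fixed magnetic polarization with them.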

"We are irradiated all the time by cosmic rays," explained Globus, who is currently a post-doctoral researcher at New York University and the Simons Foundation's Flatiron Institute. "Their effects are small but constant in every place on the planet where life could evolve, and the magnetic polarization of the muons and electrons is always the same. And even on other planets, cosmic rays would have the same effects."

The researchers' hypothesis is that, at the beginning of life on Earth, this constant and consistent radiation affected the evolution of the two mirror life-forms in different ways, helping one ultimately prevail over the other. These tiny differences in mutation rate would have been most significant when life was beginning and the molecules involved were very simple and more fragile. Under these circumstances, the small but persistent chiral influence from cosmic rays could have, over billions of generations of evolution, produced the single biological handedness we see today.

"This is a little bit like a roulette wheel in Vegas, where you might engineer a slight preference for the red pockets, rather than the black pockets," said Roger Blandford, the Luke Blossom Professor in the School of Humanities and Sciences at Stanford and an author on the paper. "Play a few games, you would never notice. But if you play with this roulette wheel for many years, those who bet habitually on red will make money and those who bet on black will lose and go away."
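Blandford's roulette analogy can be made concrete with a toy simulation: two populations replicate with noisy growth rates, one holding a tiny constant edge. All parameters are invented for illustration:

```python
# Toy simulation of a tiny, constant replication advantage compounding over
# many generations. The edge and noise values are illustrative only.
import random

def compete(generations=2000, edge=0.002, noise=0.01, seed=1):
    """Return the final fraction held by the slightly favored population."""
    rng = random.Random(seed)
    right, left = 1.0, 1.0
    for _ in range(generations):
        # Each generation, the favored side grows marginally faster on
        # average; both sides experience comparable random fluctuations.
        right *= 1.0 + edge + rng.gauss(0, noise)
        left *= 1.0 + rng.gauss(0, noise)
        total = right + left
        right, left = right / total, left / total  # renormalize shares
    return right

print(compete())  # a fraction well above 0.5
```

Even though the per-generation edge is far smaller than the noise, it compounds: over thousands of generations the favored population reliably ends up holding most of the total, just as habitual red bettors profit on a slightly biased wheel.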

Ready to be surprised

Globus and Blandford suggest experiments that could help prove or disprove their cosmic ray hypothesis. For example, they would like to test how bacteria respond to radiation with different magnetic polarization.

"Experiments like this have never been performed and I am excited to see what they teach us. Surprises inevitably come from further work on interdisciplinary topics," said Globus.

The researchers also look forward to organic samples from comets, asteroids or Mars to see if they too exhibit a chiral bias.

"This idea connects fundamental physics and the origin of life," said Blandford, who is also a Stanford and SLAC professor of physics and particle physics and former director of KIPAC. "Regardless of whether or not it's correct, bridging these very different fields is exciting and a successful experiment should be interesting."

Credit: 
Stanford University