
Zika vaccine protects both mom and fetus, but mom needs a higher dose when pregnant

GALVESTON, Texas - Researchers from The University of Texas Medical Branch at Galveston showed, for the first time, that a single, higher dose of a Zika vaccine given to a pregnant mouse safely protects both her and her fetus from the virus.

The researchers found that a single, less potent dose was not enough to protect the fetus. The findings are currently available in Nature Communications.

"Preventing birth defects in developing fetuses is an important goal of the Zika virus vaccine but studies on vaccinations in pregnant females have been lacking, raising a number of important questions that are critical to the clinical development and regulatory approval of Zika vaccines," said UTMB's Pei-Yong Shi, senior author and the I.H. Kempner professor at the department of biochemistry and molecular biology. "Could vaccination during pregnancy protect against infection and transmission to the fetus? Does pregnancy affect immune responses to Zika vaccination? Does maternal immunity from vaccination during pregnancy protect newborns against infection?"

Shi and his laboratory previously developed a Zika vaccine and continue studies to improve its efficacy.

In addition to protecting both mother and fetus, Shi said that the researchers also learned that their live-attenuated vaccine has an excellent safety profile in pregnant female mice and their fetuses. For example, they saw no adverse effects on pregnancy, fetal development or infant behavior. They also found that pregnancy weakens the mother's immune response to the vaccination, suggesting that a higher dose of the vaccine or a more immunogenic vaccine is needed during pregnancy. Taken together, their results suggest that their vaccine may be considered for both pregnant and non-pregnant people.

Credit: 
University of Texas Medical Branch at Galveston

Print me an organ -- Why are we not there yet?

image: Bioprinting of potential tissue-engineered constructs

Image: 
SUTD

3D bioprinting is a highly-advanced manufacturing platform that allows for the printing of tissue, and eventually vital organs, from cells. This could open a new world of possibilities for the medical field, while directly benefiting patients who need replacement organs.

Instead of waiting for a suitable donor or risking rejection of a transplanted organ, patients could have customised 3D-printed organs fabricated specifically to replace their faulty ones. However, even with the headway 3D bioprinting has made in the last two decades, significant strides are still needed to produce complex 3D biomimetic tissue constructs.

According to researchers from the Singapore University of Technology and Design (SUTD), Nanyang Technological University (NTU) and Asia University, tissue culture techniques in particular require accelerated progress to address the bottleneck of maturing bioprinted multi-cellular 3D tissue constructs into functional tissues. Their research paper, entitled "Print me an organ! Why are we not there yet?", has been published in Progress in Polymer Science.

In the paper, the researchers also provide an in-depth review of recent improvements in bioprinting techniques, progress in bio-ink development, and the implementation of new bioprinting and tissue maturation strategies. Special attention is given to the role of polymer science and how it complements 3D bioprinting in overcoming major impediments in the field of organ printing, such as achieving biomimicry, vascularization and 3D anatomically relevant biological structures (refer to image).

The use of complementary strategies, such as dynamic co-culture perfusion systems, was seen as critical to ensuring the maturation and assembly of bioprinted tissue constructs. Even though it is now possible to fabricate human-scale tissues or organs that can potentially mature into vascularized and partially functional tissues, the industry still lags in the bioprinting of human-specific tissues or organs because of the complexity of tissue-specific extracellular matrices (ECM) and of the tissue maturation process: suitable co-culture media to support multiple cell types are lacking, and further tissue conditioning is needed prior to implantation.

"While 3D bioprinting is still in its early stages, the remarkable leap it has made in recent years points to the eventual reality of lab-grown, functional organs. However, to push the frontiers of medicine we must overcome the technical challenges in creating tissue-specific bio-inks and optimizing the tissue maturation process. This will ultimately have a huge impact on patients' lives, many of whom may be reliant on the future of 3D bioprinting," said Professor Chua Chee Kai, lead author of the paper from SUTD.

Credit: 
Singapore University of Technology and Design

NTU Singapore scientists convert plastics into useful chemicals using sunlight

image: NTU SPMS Asst Prof Soo Han Sen with the plastic-photocatalyst mixture, which when exposed to sunlight converts the plastic to formic acid, a useful chemical.

Image: 
NTU Singapore

Chemists at Nanyang Technological University, Singapore (NTU Singapore) have discovered a method that could turn plastic waste into valuable chemicals by using sunlight.

In lab experiments, the research team mixed plastics with their catalyst in a solvent, which allows the solution to harness light energy and convert the dissolved plastics into formic acid - a chemical used in fuel cells to produce electricity.

Reporting their work in Advanced Science, the team led by NTU Assistant Professor Soo Han Sen from the School of Physical and Mathematical Sciences made their catalyst from the affordable, biocompatible metal vanadium, commonly used in steel alloys for vehicles and aluminium alloys for aircraft.

When the vanadium-based catalyst was dissolved in a solution containing a non-biodegradable consumer plastic like polyethylene and exposed to artificial sunlight, it broke down the carbon-carbon bonds within the plastic in six days.

This process turned the polyethylene into formic acid, a naturally occurring preservative and antibacterial agent, which can also be used for energy generation by power plants and in hydrogen fuel cell vehicles.

"We aimed to develop sustainable and cost-effective methods to harness sunlight to manufacture fuels and other chemical products," said Asst Prof Soo. "This new chemical treatment is the first reported process that can completely break down a non-biodegradable plastic such as polyethylene using visible light and a catalyst that does not contain heavy metals."

In Singapore, most plastic waste is incinerated, producing greenhouse gases such as carbon dioxide, and the leftover mass - burn ash - is transported to the Semakau landfill, which is estimated to run out of space by 2035.

Developing innovative zero-waste solutions, such as this environmentally friendly catalyst to turn waste into resources, is part of the NTU Smart Campus vision to develop a sustainable future.

Using energy from the sun to convert chemicals

The vanadium-based catalyst, which is supported by organic groups and typically abbreviated as LV(O), uses light energy to drive a chemical reaction, and is known as a photocatalyst.

Photocatalysts enable chemical reactions to be powered by sunlight, unlike most reactions performed in industry that require heat, usually generated through the burning of fossil fuels.

Other advantages of the new photocatalyst are that it is low cost, abundant, and environmentally friendly, unlike common catalysts made from expensive or toxic metals such as platinum, palladium or ruthenium.

While scientists have tried other approaches for turning waste plastics into useful chemicals, many approaches involve undesirable reagents or too many steps to scale up.

One example is an approach called photoreforming, where plastic is combined with water and sunlight to produce hydrogen gas, but this requires the use of catalysts containing cadmium, a toxic heavy metal. Other methods require plastics to be treated with harsh chemical solutions that are dangerous to handle.

Most plastics are non-biodegradable because they contain extraordinarily inert chemical bonds called carbon-carbon bonds, which are not readily broken down without the application of high temperatures.

The new vanadium-based photocatalyst developed by the NTU research team was specially designed to break these bonds, and does so by latching onto a nearby chemical group known as an alcohol group and using energy absorbed from sunlight to unravel the molecule like a zipper.

As the experiments were conducted at laboratory scale, the plastic samples were first dissolved by heating to 85 degrees Celsius in a solvent, before the catalyst, supplied in powder form, was added and dissolved. The solution was then exposed to artificial sunlight for a few days. Using this approach, the team showed that their photocatalyst was able to break down the carbon-carbon bonds in over 30 different compounds, demonstrating the concept of an environmentally friendly, low-cost photocatalyst.

The research team is now pursuing improvements to the process that may allow the breakdown of plastics to produce other useful chemical fuels, such as hydrogen gas.

Credit: 
Nanyang Technological University

Local traditional knowledge can be as accurate as scientific transect monitoring

image: This is the Lowland paca, Cuniculus paca.

Image: 
Hani El Bizri

New research from a cross-organisational consortium in the Amazon has found indigenous knowledge to be as accurate as scientific transect monitoring.

The research pooled resources from universities and NGOs, including the British institutions Oxford Brookes University, Manchester Metropolitan University and the University of Suffolk, to create a strong partnership of researchers from several organisations.

Thais Morcatty, a PhD student at Oxford Brookes University who will be presenting the research at the conference, said: "Detecting and monitoring wild species in a dense tropical forest, such as the Amazon, has always been a challenge, due to the difficulty in recording animals and the high costs of scientific fieldwork expeditions. In this study, our findings indicate that local people's estimates of species' abundances in the forest may be as accurate as the best scientific methods we currently have (such as linear transects)."

This offers a cost-effective approach to monitoring Amazonian species, raising awareness, guiding management decision-making and planning conservation strategies.

"These findings indicate that, overall, the results we found in around 10 years of transect monitoring is very similar to those obtained through traditional ecological knowledge for the same period", said Morcatty.

In general, there was good consensus amongst local people on species abundance, in particular for the howler monkey, Alouatta seniculus, a diurnal, arboreal species frequently hunted for subsistence, whereas low consensus was recorded for elusive, aquatic species, such as the Neotropical otter, Lontra longicaudis. The consensus itself was assessed through interviews used to derive abundance estimates for each species. Species for which there was a high level of agreement regarding their abundance also had abundance estimates similar to those derived from the long-term transects.

Based on this evidence, citizen science can be an accurate and integrative tool to obtain information about wild species. The researchers advocate that traditional ecological knowledge should be valued and the local culture of indigenous and riverine peoples in the Amazon respected. Combining efforts to collect and analyse data from long-term community-based projects allows us to have a wider understanding of the Amazon resources.

The researchers used the cultural consensus approach to identify suites of species whose abundance could reliably be estimated.

"Cultural consensus theory assumes that cultural beliefs are learned and shared across people and that there is a common understanding among people from the same cultural background regarding a topic", said Morcatty.

The cultural consensus approach was tested amongst mostly male hunters for 97 species at 16 sites throughout the Peruvian and Brazilian Amazon. Sampled species include a selection of birds, such as blue-throated piping guan (Pipile cumanensis), reptiles, such as the yellow-footed tortoise, (Chelonoidis denticulatus), and big mammals, such as the tapir (Tapirus terrestris) and the white-lipped peccary (Tayassu pecari).

The consensus approach was combined with a statistical method known as a generalized additive mixed model. This was used to assess whether factors such as body size, habitat, elusiveness and susceptibility to hunting could influence the level of consensus amongst interviewees. These factors likely act together; the model therefore allowed the researchers to examine both the cumulative effect of all features and how each acts in isolation to affect estimates of species abundance.
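The article does not include the authors' code; as a rough sketch of the modelling idea only, the snippet below fits a linear mixed model (a simplified stand-in for the paper's generalized additive mixed model) to synthetic data, with invented column names and effect sizes.

```python
# Sketch only: a linear mixed model standing in for the paper's GAMM.
# All data, column names, and effect sizes below are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "site": rng.integers(0, 16, n),              # 16 study sites
    "body_mass_kg": rng.lognormal(1.0, 1.0, n),  # body size
    "hunted": rng.integers(0, 2, n),             # 1 = hunted for subsistence
    "arboreal": rng.integers(0, 2, n),           # habitat flag
})
# Synthetic response: consensus is higher for large-bodied, hunted species.
df["consensus"] = (0.3 + 0.05 * np.log(df["body_mass_kg"])
                   + 0.2 * df["hunted"] - 0.1 * df["arboreal"]
                   + rng.normal(0, 0.1, n))

# A random intercept per site plays the role of the GAMM's random effects.
model = smf.mixedlm("consensus ~ np.log(body_mass_kg) + hunted + arboreal",
                    df, groups=df["site"]).fit()
print(model.summary())
```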

Thais Morcatty said of the study: "We tested whether the interviewees agreed more about a species' abundance because the animal is larger, because they hunt it more often for subsistence, because the species is not elusive, or because it has a specific type of habitat (terrestrial/arboreal/aquatic) or activity pattern (diurnal/nocturnal)."

Large-bodied species that were hunted had the highest cultural consensus. This indicates that, for such species, if interviews rather than transects had been carried out at the sampling sites, abundance estimates would have had the same level of confidence. However, precision was lower for small-sized, elusive and rarely hunted species.

Morcatty warns against sole reliance on the local knowledge obtained in this study, as the consensus approach was skewed towards male interviewees, men being the ones who most often go into the forest for extractive activities. In addition, the linear transects were not designed to monitor invertebrates and small vertebrates such as passerines, amphibians and small reptiles. Hence, the 97 species assessed are largely medium- and large-sized vertebrates.

It is likely that a combined transect-cultural consensus approach would be needed to assess the abundances of elusive Amazonian species. However, this interdisciplinary approach can easily be applied to other communities, not just in unsampled areas of the Amazon but in other global biodiversity hotspots.

The researchers consider partnership among scientists, managers and local people the best way to achieve wildlife conservation while also helping to preserve traditional culture and lifestyles. Information on species abundance, and especially on trends over time, is crucial for decision-making in several respects. Abundance estimates are required for establishing protected areas and for evaluating the success of management and conservation strategies. This is critical in places where species are threatened by human activities, such as overhunting, dams, mining, oil extraction and infrastructure projects.

Thais Morcatty will present the group's work on Thursday 12 December 2019 at the British Ecological Society annual meeting. The conference will bring together 1,200 ecologists from more than 40 countries to discuss the latest research.

Credit: 
British Ecological Society

Financial infidelity: Secret spending costs couples and companies

image: Boston College Carroll School of Management Assistant Professor of Marketing Hristina Nikolova and colleagues report in the Journal of Consumer Research that they have identified how to measure consumers' financial infidelity tendencies and their impact on consumption.

Image: 
Lee Pellegrini/Boston College

Chestnut Hill, Mass - Along with sexual dalliances and emotional dishonesty, add "financial infidelity" to the perils of the modern relationship, according to Boston College Assistant Professor of Marketing Hristina Nikolova and fellow researchers, who undertook the first systematic investigation into the secretive spending of romantic partners.

As retailers enter the holiday shopping season, the new study identifies "financial infidelity" as a real problem for consumers and companies, according to Nikolova, the Diane Harkins Coughlin and Christopher J. Coughlin Sesquicentennial Assistant Professor of Marketing at the Carroll School of Management.

Partners and spouses more prone to financial infidelity exhibit a stronger preference for secretive purchase options such as using cash; keeping a personal rather than a joint credit card; choosing concealing packaging; and shopping at generic rather than specialty stores, Nikolova and researchers from three other universities report in the Journal of Consumer Research.

Retailers may need to adjust traditional marketing approaches to better serve shoppers attempting to keep purchases quiet, for whatever reason, said Nikolova, whose research explores consumer psychology, in particular how couples make decisions.

"We are entering the biggest holiday shopping season and there are very simple things that retailers can do to boost their sales, such as offering inconspicuous packaging without a brand name or the ability to pay with cash," Nikolova said. "Our research suggests that these options should appeal to consumers who are prone to engage in financial infidelity. Retailers should recognize that such shoppers do exist and they will probably sneak an expensive coat or a massage amidst the gift shopping they will do."

The researchers define financial infidelity as "engaging in any financial behavior expected to be disapproved of by one's romantic partner and intentionally failing to disclose the behavior."

"Understanding financial infidelity is important because financial matters are one of the major sources of conflict within romantic couples and prior research has shown that keeping money-related secrets in relationships is a 'deal breaker'," said Nikolova.

The team developed the Financial Infidelity (FI) Scale to measure consumers' proneness to financial infidelity and to examine how financial infidelity impacts consumption. The team conducted lab and field studies and analyzed bank account data collected in partnership with a couples' money-management mobile app. App users who scored higher on the FI Scale were more likely to hide transactions and bank accounts from their partners, the researchers found.
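Neither the FI Scale items nor the app data are public; purely as an illustration of the kind of analysis described, the sketch below scores a synthetic questionnaire and correlates it with a simulated hidden-transaction rate (all names and numbers are invented).

```python
# Illustration only: synthetic stand-ins for FI-Scale responses and the
# money-management app's transaction records described in the article.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n_users = 500

# FI score: mean of 12 Likert-style items (1-7); higher = more prone.
fi_items = rng.integers(1, 8, size=(n_users, 12))
fi_score = fi_items.mean(axis=1)

# Simulated behavior: odds of hiding a transaction rise with the FI score.
p_hide = 1.0 / (1.0 + np.exp(-(fi_score - 4.0)))
hidden_share = rng.binomial(100, p_hide) / 100  # share of 100 transactions hidden

r, p = pearsonr(fi_score, hidden_share)
print(f"FI score vs. hidden-transaction share: r = {r:.2f} (p = {p:.1e})")
```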

Masking spending, shielding accounts, shipping in plain brown boxes, and burying an indulgent expenditure within the receipt from a big-box store are just some of the lengths to which people will go to avoid leaving a trail of illicit spending, according to the report, titled "Love, Lies, and Money: Financial Infidelity in Romantic Relationships."

These choices are directly relevant to marketers, as the prevalence of financial infidelity among consumers, and variation in the trait, impacts purchasing behaviors across domains, Nikolova said.

Although there is a lot of research on sexual infidelity in romantic relationships, there has been no research on financial infidelity. "The lack of research on financial infidelity is surprising because financial infidelity is very common among couples," said Nikolova.

Past studies by the National Endowment for Financial Education have shown that 41 percent of married participants who have joint finances with their partners admit to committing financial deceptions, and 75 percent reported that financial deceit had negatively affected their relationships.

"A few things that couples can do to prevent financial infidelity is to talk more, get on the same page regarding both joint and individual goals they might have, and also budget for some occasional indulgences along the way of achieving their long-term financial goals," Nikolova said.

In addition to the personal toll, it is critical for companies to be aware that there are consumer segments who are highly prone to financial infidelity, as these segments may impact their bottom lines, Nikolova said. For instance, the recent trend of businesses going cash-free may hurt retailers as some secretive spenders prefer to use cash to disguise purchases.

Retailers may have to go so far as to offer a variety of generic packaging options, absent brand identity, in order to appeal to the financially unfaithful.

Nikolova said she was surprised by the potentially costly conflict between behaviors fueled by financial infidelity and traditional marketing methods.

"The significant impact of financial infidelity on the marketing-relevant consumption behaviors was surprising to us," said Nikolova, who co-authored the report with Indiana University's Jenny Olson, University College London's Joe L. Gladstone, and University of Notre Dame's Emily Garbinsky. "This suggests that financial infidelity is important not only for consumers and their personal and relationship well-being, but also for marketers and their bottom lines."

Nikolova said future research will look at how financial infidelity varies across different relationships: specifically, how it is shaped by the distribution of financial responsibility between partners, decision-making power, and financial communication within relationships.

Credit: 
Boston College

Genetic variant largely found in patients of African descent associated with heart failure

PHILADELPHIA -- A genetic variant in the gene transthyretin (TTR)--which is found in about 3 percent of individuals of African ancestry--is a more significant cause of heart failure than previously believed, according to a multi-institution study led by researchers at Penn Medicine. The study also revealed that a disease caused by this genetic variant, called hereditary transthyretin amyloid cardiomyopathy (hATTR-CM), is significantly under-recognized and underdiagnosed.

The findings, which were published today in JAMA, are particularly important given the U.S. Food and Drug Administration's (FDA) approval of the first therapy (tafamidis) for ATTR-CM in May 2019. Prior to the new therapy, treatment was largely limited to supportive care for heart failure symptoms and, in rare cases, heart transplant.

"Our findings suggest that hATTR-CM is a more common cause of heart failure than it's perceived to be, and that physicians are not sufficiently considering the diagnosis in certain patients who present with heart failure," said the study's corresponding author Daniel J. Rader, MD, chair of the Department of Genetics at Penn Medicine. "With the recent advances in treatment, it's critical to identify patients at risk for the disease and, when appropriate, perform the necessary testing to produce an earlier diagnosis and make the effective therapy available."

hATTR-CM, also known as "cardiac amyloidosis," typically manifests in older patients and is caused by the buildup of abnormal deposits of a specific transthyretin protein known as amyloid in the walls of the heart. The heart walls become stiff, leaving the left ventricle unable to properly relax and adequately pump blood out of the heart. However, this type of heart failure--which presents similarly to hypertensive heart disease--is common, and the diagnosis of hATTR-CM is often not considered.

While it's known that hATTR-CM can lead to heart failure, predominantly in older patients of African ancestry, many questions remained about the prevalence of hATTR-CM diagnoses in TTR V122I carriers and the rate of appropriate diagnosis.

In this study, researchers from Penn Medicine and the Icahn School of Medicine at Mount Sinai used a 'genome-first' approach, performing DNA sequencing of 9,694 individuals of African and Latino ancestry enrolled in either the Penn Medicine BioBank (PMBB) or the Icahn School of Medicine at Mount Sinai BioMe biobank (BioMe). Researchers identified TTR V122I carriers and then examined longitudinal electronic health record-linked genetic data to determine which of the carriers had evidence of heart failure. They found that 44 percent of the TTR V122I variant carriers older than age 50 had heart failure, but only 11 percent of these individuals had been diagnosed with hATTR-CM. The average time to diagnosis was three years, indicating both a high rate of underdiagnosis and a prolonged time to appropriate diagnosis.

Researchers also observed left-ventricular-wall thickening, which could be a sign of early subclinical heart failure even in patients without heart failure. Higher rates of left-ventricular-wall thickening among younger TTR V122I carriers without overt heart failure were detected in the BioMe cohort, suggesting subtle changes in the heart may develop years prior to the onset of advanced signs and symptoms of the disease.

"This study suggests that workup for amyloid cardiomyopathy and genetic testing of TTR should be considered, when appropriate, to identify patients at risk for the disease and intervene before they develop more severe symptoms or heart failure," said the study's lead author Scott Damrauer, MD, an assistant professor of Surgery at Penn Medicine and a vascular surgeon at the Corporal Michael J. Crescenz VA Medical Center.

Credit: 
University of Pennsylvania School of Medicine

Genetic breakthrough identifies heart failure risk in African and Latino Americans

A genetic variation believed to increase risk for heart failure in people of African or Latino ancestry has been identified in a new study by researchers from the Icahn School of Medicine at Mount Sinai and Perelman School of Medicine at the University of Pennsylvania.

The study found that the transthyretin (TTR) V122I genetic variant was significantly associated with heart failure and that hereditary transthyretin amyloid cardiomyopathy (hATTR-CM) caused by this variant was confirmed at appreciable frequency in individuals of African or Latino ancestry. The results suggest significant under-recognition and under-diagnosis of this potentially fatal disease.

The results of the study were published today in JAMA.

hATTR-CM causes deposits of abnormal proteins called amyloids in the heart, nerves, and sometimes the kidneys and other organs, resulting in progressive organ dysfunction. The condition can run in families. Symptoms may start as early as age 20 or as late as age 80, and the average delay in diagnosis is four years. Treatment was limited to supportive care until May of this year, when the Food and Drug Administration approved tafamidis, the first and only targeted TTR therapy. Even with this new treatment, timely diagnosis is key, as the medication can only delay disease progression, not reverse the symptoms.

"Given recent advances in treatment for hATTR-CM, it is imperative to identify patients at risk for the disease and intervene before noticeable symptoms of the disease appear," said study author Ron Do, PhD, Assistant Professor of Genetics and Genomics Sciences, and co-Director of the BioMe Phenomics Center in The Charles Bronfman Institute for Personalized Medicine at the Icahn School of Medicine at Mount Sinai. "Previous studies have proposed utilizing routine genetic testing for individuals with African ancestry; however, this is not current practice and the scope of the under-diagnosis is not clear."

In this observational study, the association of the TTR V122I variant with the clinical diagnosis of heart failure was evaluated using longitudinal electronic health record-linked genetic data from two large integrated academic health systems, the Icahn School of Medicine at Mount Sinai BioMe biobank (BioMe) and the Penn Medicine Biobank (PMBB). Among carriers of the TTR V122I variant, the rates of evaluation for and diagnosis with hATTR-CM were assessed.

Because the TTR V122I variant predominantly occurs in individuals of African ancestry, researchers analyzed the association between TTR V122I carrier status and heart failure in 9,694 individuals of African (BioMe and PMBB) and Latino (BioMe) ancestry enrolled in one of the two biobanks. In PMBB, a cross-sectional analysis was performed, comparing the rate of heart failure between TTR V122I carriers and non-carriers among individuals of genetically inferred African ancestry aged 50 years or older. The analysis in BioMe used a case-control design among individuals of self-reported African or Latino ancestry, comparing the numbers of TTR V122I carriers and non-carriers between participants with prevalent heart failure and individuals over the age of 65 without heart failure.
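At its core, each design compares heart-failure rates between carriers and non-carriers. A minimal sketch of such a test, with invented counts rather than the study's data:

```python
# Sketch of the core association test: heart failure (HF) in TTR V122I
# carriers vs. non-carriers. The counts below are invented, not study data.
import numpy as np
from scipy.stats import fisher_exact

#                   HF   no HF
table = np.array([[ 40,    60],    # carriers
                  [200,   600]])   # non-carriers

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
# An odds ratio near 2 corresponds to the roughly two-fold risk reported.
```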

Patients of African or Latino ancestry carrying the TTR V122I genetic variant were twice as likely to have heart failure. Across both study samples, researchers also observed left-ventricular-wall thickening, which could be a sign of early subclinical heart failure. Higher rates of left-ventricular-wall thickening among younger TTR V122I carriers without overt heart failure were detected in the BioMe cohort, suggesting subtle changes in the heart may develop years prior to the onset of advanced signs and symptoms of the disease. Finally, only 11 percent of individuals with the genetic variant and heart failure were diagnosed with transthyretin cardiomyopathy, with an average time to diagnosis of three years, indicating both high rates of underdiagnosis and prolonged time to appropriate diagnosis.

"These findings suggest that although the TTR genetic variant is known to cause hereditary TTR amyloidosis cardiomyopathy and heart failure, there is significant under-recognition and under-diagnosis of this disease, particularly in individuals of African and/or Latino ancestry," said study co-author Girish Nadkarni, MD, Assistant Professor of Medicine (Nephrology) and co-Director of the BioMe Phenomics Center in The Charles Bronfman Institute for Personalized Medicine at the Icahn School of Medicine at Mount Sinai. "Of the observed cases, only 11 percent of individuals with the TTR genetic variant were diagnosed appropriately. It's imperative that genetic screening of TTR genetic variants be considered for early diagnosis of the disease and its treatment."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

New aluminium hydroxide stable at extremely high pressure

image: The crystal structure of ε-AlOOH

Image: 
Ehime University

Hydrogen is the most abundant element in the universe and it plays important roles in the structure, dynamics, and evolution of the planets. Hydrogen is transported into deep mantle regions as a hydrous mineral via the subduction of oceanic plates. To better understand the global hydrogen circulation in the Earth's mantle, a number of high-pressure experiments were conducted on the stability of hydrous phases under lower mantle conditions. Recent discoveries of new high-pressure hydrous minerals have extended the stability field of hydrous phases toward wider pressure, temperature and compositional ranges, suggesting the existence and the important roles of water in the deepest region of the Earth's mantle. However, there have been few studies on hydrous minerals in the multicomponent system relevant to the actual subducting slabs under the pressure and temperature conditions of the lower parts of the Earth's mantle and of those in other planetary interiors.

We conducted in-situ X-ray diffraction experiments on a major hydrous phase in the Earth's lower mantle, CaCl2-type δ-AlOOH, and its solid solutions with FeOOH and MgSiO4H2, at pressures up to ~270 GPa, far higher than those of the Earth's mantle. High pressure-temperature (P-T) conditions were achieved using a multianvil (MA) apparatus and a laser-heated diamond anvil cell (DAC), covering pressures of up to 270 GPa and temperatures of up to 2,500 K (Table S1).

Above 190 GPa at 2,500 K, we observed that δ-AlOOH transitioned to a new phase, named ε-AlOOH. We also found that the hydroxides formed solid solutions over a wide composition range in the AlOOH-FeOOH-MgSiO4H2 system, which accommodates the major elements in terrestrial rocks. Thus water could be stored in these hydroxides in the deep interiors of the Earth, terrestrial super-Earths, and the rocky cores of some icy planets, irrespective of their composition models.

Credit: 
Ehime University

New laser technique images quantum world in a trillionth of a second

image: Ultrafast pulses of extreme ultraviolet light are created in a gas jet of white plasma, and are visible as blue dots on a phosphor screen as well as yellow beams from oxygen fluorescence.

Image: 
Research to Reality

For the first time, researchers have been able to record, frame by frame, how an electron interacts with certain atomic vibrations in a solid. The technique captures a process that commonly causes electrical resistance in materials and that, in others, can cause the exact opposite--the absence of resistance, or superconductivity.

"The way electrons interact with each other and their microscopic environment determines the properties of all solids," said MengXing Na, a University of British Columbia (UBC) PhD student and co-lead author of the study, published last week in Science. "Once we identify the dominant microscopic interactions that define a material's properties, we can find ways to 'turn up' or 'down' the interaction to elicit useful electronic properties."

Controlling these interactions is important for the technological exploitation of quantum materials, including superconductors, which are used in MRI machines and high-speed magnetic levitation trains, and which could one day revolutionize how energy is transported.

At tiny scales, atoms in all solids vibrate constantly. Collisions between an electron and an atom can be seen as a 'scattering' event between the electron and the vibration, called a phonon. The scattering can cause the electron to change both its direction and its energy. Such electron-phonon interactions lie at the heart of many exotic phases of matter, where materials display unique properties.

With the support of the Gordon and Betty Moore Foundation, the team at UBC's Stewart Blusson Quantum Matter Institute (SBQMI) developed a new extreme-ultraviolet laser source to enable a technique called time-resolved photoemission spectroscopy for visualizing electron scattering processes at ultrafast timescales.

"Using an ultrashort laser pulse, we excited individual electrons away from their usual equilibrium environment," said Na. "Using a second laser pulse as an effective camera shutter, we captured how the electrons scatter with surrounding atoms on timescales faster than a trillionth of a second. Owing to the very high sensitivity of our setup, we were able to measure directly--for the first time--how the excited electrons interacted with a specific atomic vibration, or phonon."

The researchers performed the experiment on graphite, a crystalline form of carbon and the parent compound of carbon nanotubes, Bucky balls and graphene. Carbon-based electronics is a growing industry, and the scattering processes that contribute to electrical resistance may limit the materials' application in nanoelectronics.

The approach leverages a unique laser facility conceived by David Jones and Andrea Damascelli, and developed by co-lead author Arthur Mills, at the UBC-Moore Centre for Ultrafast Quantum Matter. The study was also supported by theoretical collaborations with the groups of Thomas Devereaux at Stanford University and Alexander Kemper at North Carolina State University.

"Thanks to recent advances in pulsed-laser sources, we're only just beginning to visualize the dynamic properties of quantum materials," said Jones, a professor with UBC's SBQMI and department of Physics and Astronomy.

"By applying these pioneering techniques, we're now poised to reveal the elusive mystery of high-temperature superconductivity and many other fascinating phenomena of quantum matter," said Damascelli, scientific director of SBQMI.

Credit: 
University of British Columbia

Potentially toxic chemicals from LCDs in nearly half of household dust samples tested

image: Professor John Giesy inspects a sample in his lab at the USask Toxicology Centre.

Image: 
Daniel Hallen/USask

SASKATOON - Chemicals commonly used in smartphone, television, and computer displays were found to be potentially toxic and present in nearly half of the dozens of household dust samples collected by a team of toxicologists led by the University of Saskatchewan (USask).

The international research team, led by USask environmental toxicologist John Giesy, is sounding the alarm about liquid crystal monomers--the chemical building blocks of everything from flat screen TVs to solar panels--and the potential threat they pose to humans and the environment.

"These chemicals are semi-liquid and can get into the environment at any time during manufacturing and recycling, and they are vaporized during burning. Now we also know that these chemicals are being released by products just by using them," said Giesy, Canada Research Chair in Environmental Toxicology at USask.

"We don't know yet whether this a problem, but we do know that people are being exposed, and these chemicals have the potential to cause adverse effects," said Giesy.

In a first-of-its-kind paper published Dec. 9 in Proceedings of the National Academy of Sciences, Giesy's research team assembled and analyzed a comprehensive list of 362 commonly used liquid crystal monomers gathered from 10 different industries and examined each chemical for its potential toxicity.

The team also further tested the toxicity of monomers commonly found in six frequently used smartphone models.

The researchers found the specific monomers isolated from the smartphones were potentially hazardous to animals and the environment. In lab testing, the chemicals were found to have properties known to inhibit animals' ability to digest nutrients and to disrupt the proper functioning of the gallbladder and thyroid--similar to dioxins and flame retardants which are known to cause toxic effects in humans and wildlife.

To understand how common these monomers are in the environment, researchers tested dust gathered from seven different buildings in China--a canteen, student dormitory, teaching building, hotel, personal residence, lab, and electronics repair facility. Nearly half of the 53 samples tested positive for the liquid crystal monomers.

"Ours is the first paper to list all of the liquid crystal monomers in use and assess their potential to be released and cause toxic effects," said Giesy. "We looked at over 300 different chemicals and found that nearly 100 have significant potential to cause toxicity."

Ninety per cent of the monomers tested had concerning chemical properties. They either accumulate in organisms, resist degradation in the environment, or are easily transported long distances in the atmosphere. Nearly one quarter of the chemicals tested had all three troubling characteristics.
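As an illustration of the screening logic just described (the thresholds, field names, and records below are invented placeholders, not the study's criteria), the triage by persistence, bioaccumulation, and long-range transport might look like this:

```python
# Illustration of the hazard-flag triage described in the article.
# Records and thresholds are invented placeholders, not study data.
import pandas as pd

monomers = pd.DataFrame({
    "name":        ["LCM-001", "LCM-002", "LCM-003"],  # hypothetical IDs
    "half_life_d": [250, 40, 300],       # environmental half-life, days
    "log_kow":     [6.2, 3.1, 5.8],      # octanol-water partition coefficient
    "lrt":         [True, False, True],  # long-range atmospheric transport
})

persistent      = monomers["half_life_d"] > 180
bioaccumulative = monomers["log_kow"] > 5
any_flag  = persistent | bioaccumulative | monomers["lrt"]   # ~90% in the study
all_flags = persistent & bioaccumulative & monomers["lrt"]   # ~25% in the study

print(monomers.loc[any_flag, "name"].tolist())
print(monomers.loc[all_flags, "name"].tolist())
```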

"There are currently no standards for quantifying these chemicals, and no regulatory standards," said Giesy. "We are at ground zero."

Researchers Huijun Su, Shaobo Shi, Ming Zhu, and Guanyong Su of China's Nanjing University of Science and Technology, along with Doug Crump and Robert Letcher of Environment and Climate Change Canada, worked with Giesy to conduct the research. Guanyong Su, who leads the research effort in China, is a former student of Giesy's at USask and was later a post-doctoral fellow with Environment Canada.

LCD panels are almost exclusively produced in three Asian countries: China, Japan, and South Korea. It's estimated that 198 million square metres of liquid crystal display were produced last year--enough to cover the entire Caribbean island of Aruba.
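The comparison is easy to sanity-check; Aruba's land area of roughly 180 square kilometres is an outside figure, not from the article:

```python
# Quick unit check of the Aruba comparison. Aruba's ~180 km^2 land area
# is an outside reference figure, not a number from the article.
lcd_m2 = 198e6            # square metres of LCD produced last year
lcd_km2 = lcd_m2 / 1e6    # 1 km^2 = 1,000,000 m^2
print(lcd_km2)            # 198 km^2, a bit more than Aruba's ~180 km^2
```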

"Since there are more and more of these devices being made, there's a higher chance of them getting into the environment," said Giesy.

For many years, huge amounts of globally produced e-waste--including LCD displays--have been dismantled, disposed of, and introduced into the environment.

"Right now, there are no measurements of these monomers in surface waters. Our next steps are to understand the fate and effect of these chemicals in the environment," said Giesy.

In his previous work, Giesy was also the first researcher to identify that toxic perfluorinated and polyfluorinated chemicals were widespread contaminants in the environment. His research ultimately resulted in the entire class of chemicals being banned globally.

Credit: 
University of Saskatchewan

Lily and Yuh-Nung Jan named 20th Perl-UNC neuroscience prize recipients

image: Lily Jan, Ph.D., and Yuh-Nung Jan, Ph.D.

Image: 
UCSF

CHAPEL HILL, NC - December 11, 2019 - The UNC School of Medicine has awarded the 20th Perl-UNC Neuroscience Prize to Lily Jan, PhD, and Yuh-Nung Jan, PhD, both at UC San Francisco, for the "discovery and functional characterization of potassium channels."

The husband-and-wife team will visit Chapel Hill March 26, 2020 to receive the prize - a $20,000 award - and give a lecture on their work at 3 p.m. in room G202 of the Medical Biomolecular Research Building (MBRB).

Lily and Yuh-Nung Jan are the Jack and DeLoris Lange Professor of Physiology and Biophysics and Jack and DeLoris Lange Professor of Molecular Physiology, respectively, at UC San Francisco, where they are co-principal investigators of the Jan Lab. They are also both Howard Hughes Medical Institute investigators and members of the Kavli Institute for Fundamental Neuroscience at UCSF and the UCSF Weill Institute for Neurosciences.

Their research centers on the development and function of the nervous system, including the mechanisms of dendrite development, sensory physiology, axon and dendrite regeneration, the function and regulation of potassium and calcium activated chloride channels, and how dendritic morphogenesis and channel modulation contribute to the assembly and shape of functional neuronal circuits.

Potassium channels control the flow of electrical signals in the brain and nervous system, and they are implicated in conditions ranging from epilepsy to hypertension. When the Jans began their work, potassium channels could only be detected indirectly through their electrical effects. In 1987, after nearly a decade of work, the Jans were the first to clone a gene known as Shaker, which is responsible for encoding a potassium channel membrane protein in fruit flies.

They have authored hundreds of scientific papers and mentored more than 150 future scientists over four decades.

"I have long been impressed with the quality and caliber of research from Drs. Lily and Yuh-Nung Jan," said Mark Zylka, PhD, chair of the Perl-UNC Neuroscience Prize committee and director of the UNC Neuroscience Center. "In fact, as I was finishing graduate school, I had the good fortune of traveling to San Francisco and meeting them in person. At the time, they had already made many insightful discoveries, and since then their labs have continued to provide many important scientific contributions to our field."

The Jans have received numerous awards over the course of their careers, including the 2017 Vilcek Prize, honoring the contributions of immigrants to the biomedical sciences. The Jans came to the United States from Taiwan as graduate students at Caltech in 1968. After postdoctoral training at Caltech and Harvard Medical School, they joined UCSF in 1979.

"We are honored to receive this prestigious award, and we are thrilled to join the impressive group of illustrious awardees of the Perl-UNC Neuroscience Prize," Lily Jan said. "This recognition is very much to the credit of our former and current students and postdoctoral fellows whose dedication and scientific insights are instrumental for the contributions made by the Jan lab."

Yuh-Nung Jan added, "We are both deeply indebted to our mentors, Drs. Max Delbrück, Seymour Benzer and Steve Kuffler, who all greatly influenced us as scientists. It was in Seymour's Drosophila neurogenetics lab that we started using Drosophila to study fundamental problems in neurobiology, a fruitful and enjoyable approach that we have continued to this day."

The Perl-UNC Neuroscience Prize, established in 2000, is named after former UNC professor Edward Perl, MD, who discovered that a specific type of sensory neuron responded to painful stimuli. Before this, scientists thought that sensory neurons responded to all stimuli and that pain responses were sorted out in the spinal cord. The discovery had a major impact on the field of pain research, particularly in the development of pain medications.

Dr. Perl passed away in 2014.

Credit: 
University of North Carolina Health Care

Researchers say 30% of patients taking opioids experience adverse drug interactions

CHICAGO--December 10, 2019-- Patients who do not disclose use of other medications are at higher risk of adverse drug interactions and addiction, according to new research in The Journal of the American Osteopathic Association.

A new article outlines common drug-drug interactions that alter how the body metabolizes certain opioids, causing decreased efficacy that ultimately can lead to misuse and overdose. The authors estimate that around 30 percent of patients experience such interactions; however, very few are detected and reported.

"The concern we have is that patients may not get the proper amount of pain relief due to an undetected interaction with some other medication they're taking," says Kevin Bain, MPH, PharmD, co-founder and medical director at Biophilia Partners and lead author on this article. "That can lead to them taking higher doses of their prescribed opioid and more frequently, which over time can lead to a substance use disorder or even an overdose."

Bain adds that physicians should take a thorough medication history and consult a pharmacist before prescribing opioids if they have any concerns. He also encourages patients to advocate for themselves by offering relevant information proactively.

He says commonly prescribed medications that can cause interactions with several opioids include those from antidepressant and antipsychotic classes, as well as some cardiovascular drugs used to treat arrhythmia and high blood pressure.

Preventing complications

The article notes several steps physicians can take to mitigate adverse interactions when patients take opioids and other medications concurrently.

Patients often take all their medications at once, usually in the morning, to establish a routine and ensure none are forgotten. Bain points out that this method significantly increases the likelihood of a drug interaction.

He recommends staggering the interacting medications a couple of hours apart, often taking the opioid first to ensure it is metabolized without interference. Pro-drug opioids like codeine and tramadol must first be converted by liver enzymes (chiefly CYP2D6) into their active forms to relieve pain, which is why they usually "suffer" from drug interactions.

When timing fails to avoid drug interactions, physicians can prescribe an alternate opioid less likely to interact with the patient's other medications. Conversely, physicians can consider changing the non-opioid prescription to one less likely to cause an interaction.

"The possible combinations that might result in a drug interaction are vast," says Bain. "The best approach is for physicians and patients to partner closely with a pharmacist who can advise on potential complications, especially at the start of an opioid prescription."

He adds that patients who find an opioid ineffectively manages their pain should consult their physician and never independently alter their dosage or frequency. Communication between patients and physicians is paramount to identifying and mitigating opioid-involved drug interactions, Bain concludes.

Credit: 
American Osteopathic Association

Ice in motion: Satellites capture decades of change

image: Meltwater lakes form on the surface of Greenland's Petermann Glacier, seen here in a June 2019 Landsat image. A new study finds that the number - and elevation - of meltwater lakes in Greenland is increasing.

Image: 
NASA/USGS

New time-lapse videos of Earth's glaciers and ice sheets as seen from space - some spanning nearly 50 years - are providing scientists with new insights into how the planet's frozen regions are changing.

At a media briefing Dec. 9 at the annual meeting of the American Geophysical Union in San Francisco, scientists released new time series of images of Alaska, Greenland, and Antarctica using data from satellites including the NASA-U.S. Geological Survey Landsat missions. One series of images illustrates the dramatic changes of Alaska's glaciers and could warn of future retreat of the Hubbard Glacier. Over Greenland, different satellite records show a speed-up of glacial retreat starting in 2000, as well as meltwater ponds spreading to higher elevations in the last decade, which could potentially speed up ice flow. And on Antarctic ice shelves, the view from space could reveal lakes hidden beneath the winter snow.

Using images from the Landsat mission dating back to 1972 and continuing through 2019, glaciologist Mark Fahnestock of the University of Alaska Fairbanks has stitched together six-second time-lapses of every glacier in Alaska and the Yukon.

"We now have this long, detailed record that allows us to look at what's happened in Alaska," Fahnestock said. "When you play these movies, you get a sense of how dynamic these systems are and how unsteady the ice flow is."

The videos clearly illustrate what's happening to Alaska's glaciers in a warming climate, he said, and highlight how different glaciers respond in varied ways. Some show surges that pause for a few years, or lakes forming where ice used to be, or even the debris from landslides making its way to the sea. Other glaciers show patterns that give scientists hints of what drives glacier changes.

The Columbia Glacier, for example, was relatively stable when the first Landsat satellite launched in 1972. But starting in the mid-1980s, the glacier's front began retreating rapidly, and by 2019 it was 12.4 miles (20 kilometers) upstream. In comparison, the Hubbard Glacier has advanced 3 miles (5 km) in the last 48 years. But Fahnestock's time-lapse ends with a 2019 image that shows a large indentation in the glacier, where ice has broken off.

"That calving embayment is the first sign of weakness from Hubbard Glacier in almost 50 years - it's been advancing through the historical record," he said. If such embayments persist in the coming years, it could be a sign that change could be coming to Hubbard, he said: "The satellite images also show that these types of calving embayments were present in the decade before Columbia retreated."

The Landsat satellites have provided the longest continuous record of Earth from space. The USGS has reprocessed old Landsat images, which allowed Fahnestock to handpick the clearest Landsat scenes for each summer, over each glacier. With software and computing power from Google Earth Engine, he created the series of time-lapse videos.
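Fahnestock's actual pipeline is not published here; the snippet below is a minimal Earth Engine sketch of the same idea (select the clearest summer Landsat scenes over a glacier and export them as a video), with placeholder coordinates and export settings.

```python
# Minimal Earth Engine sketch of the workflow described above: pick
# low-cloud summer Landsat 8 scenes over a glacier and export a time-lapse.
# The coordinates and export settings are placeholders, not the study's.
import ee

ee.Initialize()

glacier = ee.Geometry.Point([-145.0, 61.0]).buffer(20000)  # hypothetical AOI

frames = (
    ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA")
    .filterBounds(glacier)
    .filterDate("2013-01-01", "2019-12-31")
    .filter(ee.Filter.calendarRange(6, 8, "month"))  # summer scenes only
    .filter(ee.Filter.lt("CLOUD_COVER", 20))         # keep the clearest
    .select(["B4", "B3", "B2"])                      # true-colour bands
    .map(lambda img: img.multiply(255).toByte())     # 8-bit frames for video
)

task = ee.batch.Export.video.toDrive(
    frames, description="glacier_timelapse",
    dimensions=720, framesPerSecond=2, region=glacier)
task.start()
```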

Scientists are using long-term satellite records to look at Greenland glaciers as well. Michalea King of Ohio State University analyzed data from Landsat missions dating back to 1985 to study more than 200 of Greenland's large outlet glaciers. She examined how far the glacier fronts have retreated, how fast the ice flows, and how much ice glaciers are losing over this time span.

She found that Greenland's glaciers retreated an average of about 3 miles (5 km) between 1985 and 2018 - and that the most rapid retreat occurred between 2000 and 2005. And when she looked at the amount of glacial ice entering the ocean, she found that it was relatively steady for the first 15 years of the record, but then started increasing around 2000.

"These glaciers are calving more ice into the ocean than they were in the past," King said. "There is a very clear relationship between the retreat and increasing ice mass losses from these glaciers during the 1985-through-present record. "While King is analyzing ice lost from the front of glacier, James Lea of the University of Liverpool in the United Kingdom is using satellites data to examine ice melting on top of Greenland's glaciers and ice sheets, which creates meltwater lakes.

These meltwater lakes can be up to 3 miles (5 km) across and can drain through the ice in a matter of hours, Lea said, which can impact how fast the ice flows. With the computing power of Google Earth Engine, Lea analyzed images of the Greenland ice sheet from the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra satellite for every day of every melt season over the last 20 years - more than 18,000 images in all.

"We looked at how many lakes there are per year across the ice sheet and found an increasing trend over the last 20 years: a 27 percent increase in lakes," Lea said. "We're also getting more and more lakes at higher elevations - areas that we weren't expecting to see lakes in until 2050 or 2060."

When these high-elevation meltwater ponds punch through the ice sheet and drain, it could cause the ice sheet to speed up, he said, thinning the ice and accelerating its demise.

It doesn't always take decades' worth of data to study polar features - sometimes just a year or two will provide insights. The Antarctic ice sheet experiences surface melt, but there are also lakes several meters below the surface, insulated by layers of snow. To see where these subsurface lakes are, Devon Dunmire of the University of Colorado, Boulder, used microwave radar images from the European Space Agency's Sentinel-1 satellite. Snow and ice are basically transparent to microwave radiation, but liquid water strongly absorbs it.

Dunmire's new study, presented at the AGU meeting, found lakes dotting the George VI and Wilkins ice shelves near the Antarctic Peninsula - even a few that remained liquid throughout the winter months. These hidden lakes might be more common than scientists had thought, she said, noting that she is continuing to look for similar features across the continent's ice shelves.

"Not much is known about distribution and quantity of these subsurface lakes, but this

water appears to be prevalent on the ice shelf near the Antarctic peninsula," Dunmire said, "and it's an important component to understand because meltwater has been shown to destabilize ice shelves."

For more information on Landsat and the upcoming Landsat 9 mission, visit: https://nasa.gov/landsat or https://usgs.gov/landsat

Credit: 
NASA/Goddard Space Flight Center

A tech jewel: Converting graphene into diamond film

image: Top: Optimized models of bilayer graphene and F-diamane. Orange and grey spheres represent fluorine and carbon atoms, respectively. Bottom: Cross-sectional transmission electron micrographs of as-grown bilayer graphene and F-diamane with the highlighted interlayer and interatomic distances.

Image: 
IBS

Can two layers of graphene, the "king of the wonder materials," be linked and converted to the thinnest diamond-like material, the "king of the crystals"? Researchers of the Center for Multidimensional Carbon Materials (CMCM) within the Institute for Basic Science (IBS, South Korea) have reported in Nature Nanotechnology the first experimental observation of a chemically induced conversion of large-area bilayer graphene to the thinnest possible diamond-like material, under moderate pressure and temperature conditions. This flexible, strong material is a wide-band gap semiconductor, and thus has potential for industrial applications in nano-optics and nanoelectronics, and can serve as a promising platform for micro- and nano-electromechanical systems.

Diamond, pencil lead, and graphene are made of the same building blocks: carbon atoms (C). Yet it is the configuration of the bonds between these atoms that makes all the difference. In a diamond, the carbon atoms are strongly bonded in all directions, creating an extremely hard material with extraordinary electrical, thermal, optical and chemical properties. In pencil lead, carbon atoms are arranged as a pile of sheets, and each sheet is graphene. Strong carbon-carbon (C-C) bonds make up graphene, but the weak bonds between the sheets are easily broken, which in part explains why pencil lead is soft. Creating interlayer bonds between graphene layers forms a 2D material, similar to thin diamond films, known as diamane, with many superior characteristics.

Previous attempts to transform bilayer or multilayer graphene into diamane relied on the addition of hydrogen atoms, or on high pressure. In the former, the chemical structure and bond configuration are difficult to control and characterize. In the latter, the release of the pressure makes the sample revert back to graphene. Natural diamonds are also forged at high temperature and pressure, deep inside the Earth. The IBS-CMCM scientists, however, tried a different, winning approach.

The team devised a new strategy to promote the formation of diamane by exposing bilayer graphene to fluorine (F) instead of hydrogen. They used vapors of xenon difluoride (XeF2) as the source of F, and no high pressure was needed. The result is an ultra-thin diamond-like material, namely a fluorinated diamond monolayer, F-diamane, with interlayer bonds and fluorine on the outside.

In more detail, the F-diamane synthesis was achieved by fluorinating large-area bilayer graphene on a single-crystal metal (CuNi(111) alloy) foil, on which the needed type of bilayer graphene had been grown via chemical vapor deposition (CVD).

Conveniently, C-F bonds can be easily characterized and distinguished from C-C bonds. The team analyzed the sample after 12, 6, and 2-3 hours of fluorination. Based on the extensive spectroscopic studies and on transmission electron microscopy, the researchers were able to show unequivocally that the addition of fluorine to bilayer graphene under certain well-defined and reproducible conditions results in the formation of F-diamane. For example, the interlayer spacing between two graphene sheets is 3.34 angstroms, but it is reduced to 1.93-2.18 angstroms when the interlayer bonds are formed, as also predicted by the theoretical studies.

"This simple fluorination method works at near-room temperature and under low pressure without the use of plasma or any gas activation mechanisms, hence reduces the possibility of creating defects," points out Pavel V. Bakharev, the first author and co-corresponding author.

Moreover, the F-diamane film could be freely suspended. "We found that we could obtain a free-standing monolayer diamond by transferring F-diamane from the CuNi(111) substrate to a transmission electron microscope grid, followed by another round of mild fluorination," says Ming Huang, one of the first authors.

Rodney S. Ruoff, CMCM director and professor at the Ulsan National Institute of Science and Technology (UNIST) notes that this work might spawn worldwide interest in diamanes, the thinnest diamond-like films, whose electronic and mechanical properties can be tuned by altering the surface termination using nanopatterning and/or substitution reaction techniques. He further notes that such diamane films might also eventually provide a route to very large area single crystal diamond films.

Credit: 
Institute for Basic Science

Predicting a protein's behavior from its appearance

image: Researchers at EPFL have developed a new way to predict a protein's interactions with other proteins and biomolecules, and its biochemical activity, merely by observing its surface.

Image: 
Laura Persat / 2019 EPFL

Proteins are the building blocks of life and play a key role in all biological processes. Understanding how they interact with their environment is therefore vital to developing effective therapeutics, and it lays the foundation for designing artificial cells.

Researchers at the Laboratory of Protein Design & Immunoengineering (LPDI), part of EPFL's Institute of Bioengineering at the School of Engineering, working with collaborators at USI-Lugano, Imperial College and Twitter's Graph Learning Research division, have developed a groundbreaking machine learning-driven technique for predicting these interactions and describing a protein's biochemical activity based on surface appearance alone. In addition to deepening our understanding of how proteins function, the method - known as MaSIF - could also support the development of protein-based components for tomorrow's artificial cells. The team published its findings in the journal Nature Methods.

Data-driven research

The researchers took a vast set of protein surface data and fed the chemical and geometric properties into a machine-learning algorithm, training it to match these properties with particular behavior patterns and biochemical activity. They then used the remaining data to test the algorithm. "By scanning the surface of a protein, our method can define a fingerprint, which can then be compared across proteins," says Pablo Gainza, the first author of the study.

The team found that proteins performing similar interactions share common "fingerprints."
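MaSIF itself learns its fingerprints with geometric deep learning on protein surface patches; the toy sketch below illustrates only the matching step, with random vectors standing in for the learned descriptors.

```python
# Toy illustration of fingerprint matching only. In MaSIF the descriptors
# are learned from surface chemistry and geometry; here random unit
# vectors stand in for them.
import numpy as np

rng = np.random.default_rng(7)

def fingerprint(dim=80):
    """Stand-in for a learned surface-descriptor vector."""
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

binder = fingerprint()
# A complementary surface should score high; an unrelated one near zero.
partner = binder + 0.1 * rng.normal(size=binder.size)
partner /= np.linalg.norm(partner)
unrelated = fingerprint()

print("matched pair:  ", float(binder @ partner))    # close to 1
print("unrelated pair:", float(binder @ unrelated))  # near 0
```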

"The algorithm can analyze billions of protein surfaces per second," says LPDI director Bruno Correia. "Our research has significant implications for artificial protein design, allowing us to program a protein to behave a certain way merely by altering its surface chemical and geometric properties."

The method, published in open-source format, could also be used to analyze the surface structure of other types of molecules.

Credit: 
Ecole Polytechnique Fédérale de Lausanne