Tech

'Sandwich' structure key to thin LSMO films retaining magnetic properties

Researchers at North Carolina State University have found that the oxide ceramic material lanthanum strontium manganite (LSMO) retains its magnetic properties in atomically thin layers if it is "sandwiched" between two layers of a different ceramic oxide, lanthanum strontium chromium oxide (LSCO). The findings have implications for future use of LSMO in spintronic-based computing and storage devices.

In its bulk form, LSMO has both magnetic and metallic properties. The conductivity of the material can be altered by applying a magnetic field, which makes LSMO appealing for use as a switch in spintronic devices. However, when the material is thinned to between five and 10 atomic layers, it loses these properties.

Divine Kumah, assistant professor of physics at NC State and corresponding author of a paper describing the work, wanted to know why LSMO loses its magnetic properties at a particular thinness, and to find a way to make LSMO magnetic in thin form.

Kumah, with colleagues and graduate students from NC State, first grew thin films of LSMO on strontium titanate - a non-magnetic substrate commonly used as a neutral scaffold. The team grew films ranging from two to 10 atomic layers thick and tested them for magnetic properties.

Next, the team utilized the synchrotron light source at Argonne National Laboratory so that they could get a three-dimensional view of the arrangement of the atoms within the thin layers of LSMO. They found that at extreme thinness, the oxygen and manganese atoms moved slightly out of alignment on the surface of the material, effectively switching off its magnetism.

"At about five atomic layers we saw distortions on the surface of the layer and at the bottom interface with the scaffold," Kumah says. "The oxygen and manganese atoms rearrange themselves. Magnetism and electrical conductivity in LSMO are related to how these two atoms bond, so if there are polar distortions in the film where they move up and down, the bonds stretch out, electrons can't move through the material effectively and magnetism is switched off."

The team noted that these distortions started at the top of the film and extended approximately three layers below the surface.

"We found that the distortions occur because the crystal structure creates an electric field at the surface," Kumah says. "The oxygen and manganese atoms move in order to cancel the electric field. Our challenge was to grow something at the interfaces that is compatible with LSMO structurally but that is also insulating - so that we remove the electric field, stop the movement of the oxygen and manganese atoms and retain magnetic properties."

The researchers found that by using two layers of LSCO on either side of the LSMO, the LSMO could retain its magnetic properties at two atomic layers.

"It is like a sandwich - LSCO is the bread and LSMO is the meat," Kumah says. "You can use fewer than five layers of LSMO in this arrangement without any atomic displacement. Hopefully our work has shown that these materials can be thin enough to be useful in spintronics devices."

Credit: 
North Carolina State University

Laboratory testing suggests human lung tissue unimpacted by blu vapor

video: This video depicts the effects of cigarette smoke and myblu vapour on human lung tissue, including the impact on the cilia -- mobile, hair-like structures that line the surface of airway and lung cells

Image: 
Imperial Brands Science

A new study by Imperial Brands, owners of leading vape brand blu, contributes to the increasing evidence base substantiating vaping's harm reduction potential compared to smoking.

The research, presented at the 58th annual meeting of the Society of Toxicology earlier this year, compared the in-vitro toxicological responses of a 3D model of human lung tissue to myblu vapour and cigarette* smoke across a range of biological endpoints.

Cells were repeatedly exposed to either 30, 60 or 90 puffs of vapour, smoke and air over 4 weeks, and the results were conclusive. While myblu vapour delivered significantly more nicotine compared to the cigarette smoke, it did not trigger any significant toxicological responses under test conditions.

Arguably the most striking observation involved the cilia on the surface of the cells: mobile, hair-like structures that line the airways and lungs, helping keep them clear of mucus and dirt. After 4 weeks of repeated exposure to undiluted myblu vapour, there was no recorded decrease in either the number of cilia or the number of ciliated cells. In fact, tissue integrity was indistinguishable from the air control. The observations were in marked contrast to cigarette smoke's negative impact on lung cells, even when diluted at a 1:17 ratio.

Dr Roman Wieczorek, Group Biological & Toxicological Laboratory Manager at Imperial Brands Science and study author, commented: "Utilising state-of-the-art in-vitro methodologies based on TT21C principles, we continue to investigate the mechanistic effects of our Next-Generation Product (NGP) portfolio to substantiate its harm reduction potential."

"Our ethical assays use cells derived from humans. This negates the need to test on animals, while targeting multiple endpoints of direct relevance to adult smokers."

Dr Grant O'Connell, Head of Scientific Affairs at Imperial Brands, added: "Our process of scientific substantiation focuses on all aspects of population level harm reduction, allowing us to develop robust scientific evidence packages that demonstrate the risk-reduced potential of our NGPs.

"Unfortunately, media headlines based on misleading science containing non-realistic human exposures and extrapolated results continue to prove confusing and unhelpful at best and disastrous to the global public health agenda at worst.

"Imperial Brands and blu urge public health bodies, regulators and journalists across the world to cut through the misleading, agenda-driven murk and help us in our mission to deliver something better for the world's smokers."

Credit: 
Imperial Brands

Learning from nature's bounty: New libraries for drug discovery

image: This is an artistic depiction of a macrocycle binding to a target protein.

Image: 
Image: University of Basel, Basilius Sauter | CC BY-SA 3.0

Natural products, or their close derivatives, provide some of our most potent medicines; macrocycles, with their large carbon-rich ring systems, are one such class. The size and complexity of macrocycles have made it difficult to emulate and build on Nature's success in the laboratory. By completing a complex molecular synthesis of these compounds attached to unique identifying DNA strands, chemists at the University of Basel have built a rich collection of natural product-like macrocycles that can be mined for new medicines, as the researchers report in the journal Angewandte Chemie.

Natural evolution has created an incredible diversity of small molecular structures that perturb living systems and are therefore used as drugs in medicinal applications. Although several dozen approved medicines are macrocyclic structures, nearly all of these are natural products or close derivatives.

To find new lead compounds in drug research, huge libraries with diverse structures are required - or, simply put, rich collections of molecules. Medicinal chemists have so far failed to imitate Nature's approach to bioactive macrocyclic molecules, and the long syntheses involved have precluded the creation of large screening libraries, which are essential for identifying drug leads.

A challenge for synthetic chemistry

Researchers at the chemistry department of the University of Basel have now completed a total synthesis of over one million macrocycles that incorporate structural elements often observed in natural biologically active macrocycles.

The synthesis is based on the split-and-pool principle: before a synthesis step, the whole library is split. Each fraction is then coupled with one of several building blocks, and the newly built molecules are labeled with a covalently attached DNA sequence. Before the next synthesis step, all fractions are pooled again.

This leads to the cross combination of all diversity elements. Each combination is attached to a specific DNA barcode. Through this approach all 1.4 million members of the pooled library could be screened in a single experiment. Next generation DNA sequencing on the selected libraries could then identify macrocycles that bind target proteins.
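To make the split-and-pool combinatorics concrete, here is a minimal bookkeeping sketch in Python. The building-block names and per-cycle counts are invented for illustration (chosen so the product lands on roughly the 1.4 million members reported), and real barcodes are DNA sequences rather than strings:

```python
from itertools import product

# Hypothetical building-block names for three synthesis cycles. The counts
# are invented, but chosen so the full combinatorial product lands near the
# 1.4 million library members reported (112**3 = 1,404,928).
cycle_blocks = [
    [f"A{i}" for i in range(1, 113)],  # split into 112 fractions, cycle 1
    [f"B{i}" for i in range(1, 113)],  # cycle 2
    [f"C{i}" for i in range(1, 113)],  # cycle 3
]

# Each split/couple/pool cycle appends one building block to the molecule
# and, in parallel, ligates one DNA "codon" to its barcode. Pooling between
# cycles means every cross-combination is eventually synthesized, and the
# concatenated barcode uniquely identifies each member.
library = (("-".join(combo), combo) for combo in product(*cycle_blocks))

print(sum(1 for _ in product(*cycle_blocks)))  # 1404928 members
print(next(iter(library)))                     # ('A1-B1-C1', ('A1', 'B1', 'C1'))
```

Because pooling recombines every fraction before the next split, the number of synthesis operations grows only with the sum of the block counts per cycle, while the library size grows with their product - the key economy of the approach.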

Macrocycles are unlikely yet potent drugs

Most small molecule drugs are hydrophobic ("water-repellent") molecules with a low molecular weight (less than 500 daltons). Because of this, these drugs tend to slip without problem through cell membranes, exposing them to the great majority of disease-relevant proteins. Macrocycles buck this trend: they are often extremely large (more than 800 daltons) by medicinal chemistry standards, and yet they passively diffuse through cell membranes.

Researchers speculate that this special property of natural macrocycles derives from their ability to adapt their spatial structure (conformation) depending on the medium. Hence in the largely water-based environment of the blood stream and cell interior the macrocycles would expose their more water compatible (hydrophilic) groups to remain soluble. Once the hydrophobic cell membrane is encountered a conformational shift could allow the molecules to expose their hydrophobic face, making them soluble in membranes and hence capable of passive diffusion.

New applications possible

Despite their unique properties, macrocycles are conspicuously under-represented in medicinal chemistry. This is largely due to the synthetic challenge of creating a large collection of macrocycles for screening. With the help of a barcoding DNA strand, the Gillingham group has overcome this hurdle by developing an efficient seven-step synthesis of a natural product-like macrocycle library, all pooled in one solution.

"With a large diverse collection of macrocycles available for screening, a more data-rich investigation of the properties of these extraordinary molecules can begin", comments Dennis Gillingham. "This might reveal future medicinal applications, targets or active principles."

Credit: 
University of Basel

First cyber agility framework to train officials developed to out-maneuver cyber attacks

image: A US Army project with scientists at The University of Texas at San Antonio developed the first framework to score the agility of cyber attackers and defenders.

Image: 
US Army -- Shutterstock graphic

RESEARCH TRIANGLE PARK, N.C. (June 10, 2019) -- To help train government and industry organizations to prevent cyberattacks, scientists at The University of Texas at San Antonio, working on a research project for the U.S. Army, developed the first framework to score the agility of cyber attackers and defenders.

"The DOD and U.S. Army recognize that the cyber domain is as important a battlefront as ground, air and sea," said Dr. Purush Iyer, division chief, network sciences at Army Research Office, an element of the Army Futures Command's Army Research Laboratory. "Being able to predict what the adversaries will likely do provides opportunities to protect and to launch countermeasures. This work is a testament to successful collaboration between academia and government."

The framework developed by the researchers will help government and industry organizations visualize how well they out-maneuver attacks. Their work is published in IEEE Transactions on Information Forensics and Security, a top journal for cybersecurity.

"Cyber agility isn't just about patching a security hole, it's about understanding what happens over time. Sometimes when you protect one vulnerability, you expose yourself to 10 others," said Jose Mireles, who works for the DOD and co-developed this first-known framework as part of his UTSA master's thesis. "In car crashes, we understand how to test for safety using the rules of physics. It is much harder to quantify cybersecurity because scientists have yet to figure out what are the 'rules of cybersecurity.' Having formal metrics and measurement to understand the attacks that occur will benefit a wide range of cyber professionals."

To develop quantifiable metrics, Mireles collaborated with fellow UTSA student Eric Ficke, researchers at Virginia Tech, and a researcher at CCDC ARL and the U.S. Air Force Research Laboratory.

The project was conducted under the supervision of UTSA Professor Shouhuai Xu, who serves as the director of the UTSA Laboratory for Cybersecurity Dynamics. Together, they used a honeypot -- a computer system that lures real cyber-attacks -- to attract and analyze malicious traffic according to time and effectiveness. As both attackers and defenders created new techniques, the researchers were able to better understand how a series of engagements transformed into a new adaptive and responsive agile pattern, or what they called an evolution generation.
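The published metrics are defined formally in the IEEE paper; as a loose illustration of what scoring timestamped honeypot engagements can look like, consider the toy sketch below, in which every name, formula, and threshold is an assumption for exposition rather than the framework itself:

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float        # hours since monitoring began
    actor: str      # "attacker" or "defender"
    adapted: bool   # did this engagement introduce a new technique?

def agility_scores(events, window=24.0):
    """Toy agility score: new techniques per `window` hours for each side.

    An illustrative stand-in, NOT the metric defined in the paper.
    """
    span = max(e.t for e in events) or 1.0
    scores = {}
    for side in ("attacker", "defender"):
        adaptations = sum(1 for e in events if e.actor == side and e.adapted)
        scores[side] = adaptations * window / span
    return scores

def evolution_generations(events):
    """Count rounds in which each side answered the other's new technique."""
    generations, pending = 0, set()
    for e in sorted(events, key=lambda e: e.t):
        if e.adapted:
            pending.add(e.actor)
            if pending == {"attacker", "defender"}:
                generations += 1
                pending.clear()
    return generations

events = [Event(2, "attacker", True), Event(10, "defender", True),
          Event(30, "attacker", True), Event(55, "defender", True)]
print(agility_scores(events), evolution_generations(events))
```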

"The cyber agility framework is the first of its kind and allows cyber defenders to test out numerous and varied responses to an attack," Xu said. "This is an outstanding piece of work as it will shape the investigation and practice of cyber agility for the many years to come."

Mireles added, "A picture or graph in this case is really worth more than 1,000 words. Using our framework, security professionals will recognize if they're getting beaten or doing a good job against an attacker."

Credit: 
U.S. Army Research Laboratory

Curbing your enthusiasm for overeating

image: From L to R: Nicholas DiPatrizio, Pedro Perez, and Donovan Argueta.

Image: 
UCR/Stan Lim.

RIVERSIDE, Calif. -- Signals between our gut and brain control how and when we eat food. But how the molecular mechanisms involved in this signaling are affected when we eat a high-energy diet and how they contribute to obesity are not well understood.

Using a mouse model, a research team led by a biomedical scientist at the University of California, Riverside, has found that overactive endocannabinoid signaling in the gut drives overeating in diet-induced obesity by blocking gut-brain satiation signaling.

Endocannabinoids are cannabis-like molecules made naturally by the body to regulate several processes: immune, behavioral, and neuronal. As with cannabis, endocannabinoids can enhance feeding behavior.

The researchers detected high activity of endocannabinoids at cannabinoid CB1 receptors in the gut of mice that were fed a high-fat and sugar -- or Western -- diet for 60 days. This overactivity, they found, prevented the food-induced secretion of the satiation peptide cholecystokinin, a short chain of amino acids whose function is to inhibit eating. This resulted in the mice overeating. Cannabinoid CB1 receptors and cholecystokinin are present in all mammals, including humans.

Study results appear in Frontiers in Physiology, an open-access journal.

"If drugs could be developed to target these cannabinoid receptors so that the release of satiation peptides is not inhibited during excessive eating, we would be a step closer to addressing the prevalence of obesity that affects millions of people in the country and around the world," said Nicholas V. DiPatrizio, an assistant professor of biomedical sciences in the UCR School of Medicine who led the research team.

DiPatrizio explained that previous research by his group on a rat model showed that oral exposure to dietary fats stimulates production of the body's endocannabinoids in the gut, which is critical for the further intake of high-fat foods. Other researchers, he said, have found that levels of endocannabinoids in humans increased in blood just prior to and after eating a palatable high-energy food, and are elevated in obese humans.

"Research in humans has shown that eating associated with a palatable diet led to an increase in endocannabinoids -- but whether or not endocannabinoids control the release of satiation peptides is yet to be determined," said Donovan A. Argueta, a doctoral student in DiPatrizio's lab and the first author of the research paper.

Previous attempts at targeting the cannabinoid CB1 receptors with drugs such as Rimonabant -- a CB1 receptor blocker -- failed due to psychiatric side effects. However, the DiPatrizio lab's current study suggests it is possible to target only the cannabinoid receptors in the gut for therapeutic benefits in obesity, greatly reducing the negative side effects.

The research team plans to work on getting a deeper understanding of how CB1 receptor activity is linked to cholecystokinin.

"We would also like to get a better understanding of how specific components of the Western diet -- fat and sucrose -- lead to the dysregulation of the endocannabinoid system and gut-brain signaling," DiPatrizio said. "We also plan to study how endocannabinoids control the release of other molecules in the intestine that influence metabolism."

Credit: 
University of California - Riverside

Fracking causes some songbirds to thrive while others decline

A new paper in The Condor: Ornithological Applications, published by Oxford University Press, finds that some songbird species benefit from the spread of fracking infrastructure while others decrease in population.

The shale gas industry has grown rapidly in recent years and its resulting infrastructure can have negative consequences for native wildlife communities. While other studies have documented negative impacts of these developments on birds and their habitats, few have described variability among species in their spatial responses to fracking.

The Marcellus-Utica Shale region, which lies beneath the Appalachian Mountains and holds one of the largest deposits of natural gas in North America, is a major source of gas production in the United States. Production has led to more interstate and gathering pipelines, access roads, and other gas infrastructure in the region.

Researchers studied the relationship between 27 bird species and their distance from shale gas construction in northern West Virginia, from 2008 to 2017. They organized the birds into three groups based on their relationship to human development in the region. The first category, forest interior birds, includes species like the Ovenbird that are associated with large areas of mature forests. The second group includes early successional birds, such as Indigo Buntings, which prefer young forest and shrubland habitats. The final category is human-adapted birds, like the Brown-headed Cowbird, which thrive in environments that have been altered by people.

The researchers recorded annual changes to the landscape across the study area, and monitored birds at 142 survey stations. Over the ten-year period, the footprint of shale gas increased tenfold, from approximately 42 acres in 2008 to over 432 acres in 2017, with an even greater increase in new forest edges created by gas infrastructure. The researchers found that forest interior birds decreased in numbers near gas development, avoiding both drilling sites and road and pipeline corridors. The Ovenbird population declined 35% at the study site while Cerulean Warblers, a forest species of conservation concern, declined by 34%. These are sharp declines at a site previously recognized as a globally Important Bird Area for its significant populations of Cerulean Warblers and other forest interior birds. Conversely, early successional species like the Indigo Bunting saw population increases and began to concentrate along new pipelines and access roads. Similarly, the Brown-headed Cowbird showed an increase in numbers throughout the study site, but displayed a clear pattern of attraction to areas disturbed by fracking.
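As a rough illustration of the guild-level bookkeeping behind such numbers, here is a minimal sketch with invented counts; the study's actual dataset (142 stations, 27 species, ten years) and its statistical models are far more involved:

```python
import pandas as pd

# Hypothetical survey records: one row per species per year
# (counts invented; chosen only to echo the reported direction of change).
surveys = pd.DataFrame({
    "year":    [2008, 2008, 2008, 2017, 2017, 2017],
    "species": ["Ovenbird", "Cerulean Warbler", "Indigo Bunting",
                "Ovenbird", "Cerulean Warbler", "Indigo Bunting"],
    "count":   [20, 50, 5, 13, 33, 9],
})

guilds = {"Ovenbird": "forest interior",
          "Cerulean Warbler": "forest interior",
          "Indigo Bunting": "early successional"}
surveys["guild"] = surveys["species"].map(guilds)

# Percent change in total counts per guild between first and last year
totals = surveys.pivot_table(index="guild", columns="year",
                             values="count", aggfunc="sum")
print((totals[2017] - totals[2008]) / totals[2008] * 100)
# forest interior ~ -34%, early successional ~ +80% in this toy example
```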

This suggests that forest disturbances from shale gas energy development may create more habitats for generalist or highly adaptable species while pushing out birds that depend on interior forests. For instance, the Brown-headed Cowbird that saw an increased population in this study is a nest parasite that manipulates other birds into raising its young, which can be detrimental to the host. The researchers also noted that shale gas expansion meant increased human access and activity, including traffic, light, and noise pollution, which poses problems for birds that rely on song to attract mates. In particular, pipeline compressors are a source of chronic noise that can carry into surrounding habitats, exacerbating the negative impacts of habitat loss.

"We hope to find a way to balance our energy needs with maintaining healthy forest ecosystems, which we also depend on for clean air, clean water, carbon storage, and countless other ecological services," said author Laura S. Farwell, Ph.D. "Like the proverbial canary in the coal mine, these birds are early indicators of ecosystem degradation. We hope our research will help inform planning decisions about where to avoid or minimize gas development in order to protect valuable forest resources, both for humans and for other species."

Credit: 
Oxford University Press USA

Novel agent reactivates an immune alarm by LIF blockade

image: VHIO's Gene Expression & Cancer Group directed by Joan Seoane.

Image: 
Katherin Wermke

Promising new therapy with a dual mechanism of action to eliminate cancer stem cells and activate the immune system now in clinical development

Findings show a reactivation of the anti-cancer alarm system and draw parallels between embryogenesis and cancer

Combining LIF-neutralizing antibodies with immunotherapy promotes tumor regression, triggers immune memory, and increases survival in animal models

Results from a study spearheaded by researchers at the Vall d'Hebron Institute of Oncology (VHIO) show that blockade of the multi-functional cytokine LIF induces tumor-infiltrating T cells to home in on and eliminate cancer. Reported today in Nature Communications, this research was led by corresponding and co-first author Joan Seoane, Co-Program Director of Preclinical and Translational Research at VHIO and ICREA Research Professor, and has now culminated in a Phase I clinical trial currently assessing the safety and efficacy of LIF inhibitors in patients across three sites: the Vall d'Hebron University Hospital (HUVH), Memorial Sloan Kettering Cancer Center (MSKCC - New York, USA), and the Princess Margaret Cancer Centre (Toronto, Canada).

Developed by VHIO, novel agent MSC-1 inhibits LIF and has now been shown to have a dual mechanism of action. First, in tumors expressing high levels of LIF, this protein promotes the proliferation of cancer stem cells. LIF blockade eliminates these tumor-initiating stem cells, putting the brakes on metastatic cell spread and cancer recurrence.

Additionally, elevated LIF expression disables the anti-tumor alarm system and stops the immune system from thwarting cancer's plans. Blocking LIF reactivates the alarm to call an anti-tumoral immune response.

Pioneer of previous LIF studies, Joan Seoane and his team were the first to establish a link between this multi-functional protein and cancer as well as show that LIF blockade eliminates cancer stem cells and prevents disease progression and recurrence. In this present paper, they have now revealed its implication in the immune system's anti-cancer response.

When foreign bodies or alterations in healthy cells are detected, a biological alarm alerts the immune system to act against these 'dealers' of damage. "We have discovered that LIF silences this alarm which enables cancer to dodge the immune system's innate response. It´s just like a bank robber deactivating an alarm to escape detection by the police," explains Joan Seoane.

More specifically, the researchers have shown that LIF inhibits the CXCL9 gene, which acts as a signal to lure immune system T cells. LIF blockade induces these immune system soldiers to invade, attack and destroy tumors. "We have observed that LIF inhibition in tumors expressing high levels of this protein reactivates the signal to T cells to target and destroy cancer," says Joan.

This study also shows that combining LIF inhibition with anti-PD1 therapy delivers powerful blows against cancer. "Once the T cells infiltrate the tumors, they are activated by anti-PD1 immunotherapy. In animal models, the pairing of both agents not only halted tumor growth but also, in some cases, made tumors disappear. In these cases, immune memory is activated and the system 'remembers' the tumor, so that particular tumor does not reappear even when more tumor cells emerge," observe Monica Pascual-García and Ester Bonfill, co-first authors and post-doctoral fellows of VHIO's Gene Expression and Cancer Group, directed by Joan.

After several years' research and validating LIF's promise as a therapeutic target in preclinical and experimental models, Joan founded Mosaic Biomedicals, a VHIO-born spin-off that launched to identify, develop, potentiate and translate novel therapies into benefits for patients at the bedside as quickly as possible. Mosaic has since brought the first-in-class MSC-1 LIF inhibitor closer to the clinic. This promising agent is currently being assessed in clinical trials for further development.

Manipulating Mother Nature's love for LIF

LIF protects cancer in the same way a mother protects her embryo. Throughout evolution, LIF emerged to solve a serious issue among mammals: the fact that a living being exists inside another. The embryo carries antigens from the father, so why is it not rejected by the mother's immune system? LIF protects the embryo and induces the proliferation of embryonic stem cells, enabling its 'safe' development.

This present study exposes the parallels between embryogenesis and cancer. Joan's team have now shown that LIF assumes a crucial role in embryogenesis by protecting the embryo from the mother's immune system.

Cancer seizes on this molecular mechanism induced by LIF and uses it for its own gain. LIF is aberrantly expressed in some tumors, shielding them from the patient's own immune system in the same way that it protects the embryo. Similarly, in cancer LIF promotes the proliferation of tumor stem cells, just as it drives the proliferation of embryonic stem cells in the embryo.

This new therapeutic window is not open to all tumor types. It only shows promise for the treatment of tumors expressing high levels of LIF. The preselection of patients identified with high LIF levels detected in their tumors is critical in more precisely matching this novel therapy to those patients who would be most likely to benefit.

"Tumor types with typically high LIF levels include glioblastoma, pancreatic, ovarian, lung, and prostate cancers. Importantly, we have also observed that these cancers are also more aggressive and indicative of a poor prognosis," adds Joan.

He concludes, "These findings are the fruit of a major body of work, mainly fueled by the Spanish Association against Cancer (AECC) and the European Research Council (ERC). We are equally grateful to the FERO Foundation, and the BBVA Foundation's Comprehensive Program of Cancer Immunotherapy and Immunology (CAIMI), for their backing and belief in this research project."

Credit: 
Vall d'Hebron Institute of Oncology

A 'one-two punch' to wipe out cancerous ovarian cells

image: Model of targetable PARPi reversible senescence. The combination of PARPi and a senolytic was effective in preclinical models of ovarian cancer suggesting that coupling these synthetic lethalities provides a rational approach to their clinical use and may together be more effective in limiting resistance.

Image: 
Nature Communications and Francis Rodier (CRCHUM)

Montreal, June 11, 2019 - Researchers from the University of Montreal Hospital Research Centre (CRCHUM) have developed a two-step combination therapy to destroy cancer cells. In a study published in the journal Nature Communications, they show the superior therapeutic effectiveness of the "one-two punch" on cells of ovarian cancer patients, based on manipulation of the state of cellular aging.

With time, our cells age and enter a phase called cellular senescence. These senescent cells stop proliferating, build up in the body and cause the development of diseases such as cancer. In recent years, the scientific community has tried to heal these aging-related pathologies by targeting and destroying senescent cells.

"In the case of epithelial ovarian cancer (EOC)--the most common and lethal ovarian cancer--we act in two stages. First, we force the cancer cells to age prematurely i.e., we force them into senescence. This is the first therapeutic punch. We throw our second punch using senolysis, destroying and eliminating them. This strategy requires excellent coordination of the two steps," explained Francis Rodier, a researcher at the CRCHUM and professor at the Université de Montréal.

The team of researchers, led by Rodier and his colleague Anne-Marie Mes-Masson, discovered that EOC cells enter senescence following chemotherapy in combination with PARP inhibitors. PARPs are enzymes that help repair damage to DNA. By blocking PARPs, PARP inhibitors prevent cancer cells from repairing their DNA, stop them from proliferating and cause them to age prematurely.

"Thanks to our 'one-two punch' approach, we have managed to destroy senescent EOC cells in preclinical ovarian cancer models. Our approach could improve the effectiveness of chemotherapy in combination with PARP inhibitors and counteract the systematic resistance that develops with this treatment," said Mes-Masson, a researcher at the CRCHUM and professor at the Université de Montréal.

Future clinical trials in store?

"Our study was done using cells taken from our biobank of samples from CHUM ovarian cancer patients. These patients agreed to take part in the research study and let us store their biological specimens. Our 'one-two punch strategy' was also tested on preclinical ovarian and breast cancer models, which allowed us to validate its effectiveness," commented Mes-Masson.

Although the results of this study will be used to propose clinical trials for ovarian and triple-negative breast cancer, Rodier says that it is important to remember that they used preclinical models in which there was no immune system. "Given the importance of the immune response in humans, we need to continue evaluating our strategy in a context closer to biological reality."

According to the Canadian Cancer Society, 2,800 Canadian women were diagnosed with ovarian cancer in 2017 and 1,800 died from the disease. It is the fifth leading cause of cancer death among women in North America.

Credit: 
University of Montreal Hospital Research Centre (CRCHUM)

EWG: Nitrate pollution of US tap water could cause 12,500 cancer cases each year

WASHINGTON - Nitrate pollution of U.S. drinking water may cause up to 12,594 cases of cancer a year, according to a new peer-reviewed study by the Environmental Working Group.

For the groundbreaking study, published today in the journal Environmental Research, EWG scientists estimated the number of cancer cases in each state that could be attributed to nitrate contamination of public water systems, largely caused by farm runoff containing fertilizer and manure. They also estimated the costs of treating those cases at up to $1.5 billion a year.

"Nitrate contamination of drinking water is a serious problem, and especially severe in the nation's farm country," said Olga Naidenko, Ph.D., EWG senior science advisor and one of the study's authors. "Now, for the first time, we can see the staggering consequences of this pollution."

The current federal drinking water standard for nitrate, set in 1962, is 10 parts per million, or ppm. Yet several well-regarded epidemiological studies have linked nitrate in drinking water with cancer and other serious health issues at levels less than one-tenth of the legal limit. Earlier this year, the Environmental Protection Agency suspended plans to reevaluate its outdated nitrate standard.

Four-fifths of EWG's estimated cases were occurrences of colorectal cancer, with ovarian, thyroid, kidney and bladder cancer making up the rest. Nitrate in tap water has also been linked with serious neonatal health issues. EWG estimated that nitrate pollution may be responsible for as many as 2,939 cases of very low birth weight; 1,725 cases of very preterm birth; and 41 cases of neural tube defects.

"Millions of Americans are being involuntarily exposed to nitrate, and they are also the ones paying the heavy costs of treating contaminated tap water," said Alexis Temkin, Ph.D., a toxicologist at EWG and primary author of the study. "But the federal government is not doing enough to protect Americans from tap water contamination."

EWG scientists estimate that the level of nitrate in drinking water posing no adverse health effects is 0.14 milligrams per liter, equivalent to 0.14 parts per million. That level, roughly 70 times lower than the EPA's legal limit, represents a one-in-one-million risk of cancer.
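The study's full exposure-response model is not reproduced here, but attributable-case estimates of this kind are conventionally built on the population attributable fraction (Levin's formula). A minimal sketch with invented placeholder numbers:

```python
def attributable_cases(exposed_fraction, relative_risk, annual_cases):
    """Population attributable fraction (Levin's formula) applied to a
    baseline case count. The inputs below are invented placeholders,
    not the exposure distribution or risk estimates used by EWG.
    """
    p, rr = exposed_fraction, relative_risk
    paf = p * (rr - 1) / (p * (rr - 1) + 1)  # share of cases attributable
    return paf * annual_cases

# e.g. if 10% of people drank water above some risk threshold and that
# exposure carried a relative risk of 1.15 for colorectal cancer:
print(round(attributable_cases(0.10, 1.15, 140_000)))  # -> 2069
```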

Credit: 
Environmental Working Group

Tiny light box opens new doors into the nanoworld

image: Using a box built from stacked atomically thin layers of the material tungsten disulphide (see the atomic model), Chalmers researchers have succeeded in creating a type of feedback loop in which light and matter become one. This new concept involves two distinct processes being housed in the same nanodisk. The box has a diameter of only 100 nanometres (0.00001 centimetres) and opens the way to new fundamental research and more compact solutions in nanophotonics.

Image: 
Denis Baranov/Yen Strandqvist/Chalmers University of Technology

Researchers at Chalmers University of Technology, Sweden, have discovered a completely new way of capturing, amplifying and linking light to matter at the nanolevel. Using a tiny box, built from stacked atomically thin material, they have succeeded in creating a type of feedback loop in which light and matter become one. The discovery, which was recently published in Nature Nanotechnology, opens up new possibilities in the world of nanophotonics.

Photonics is concerned with various means of using light. Fibre-optic communication is an example of photonics, as is the technology behind photodetectors and solar cells. When the photonic components are so small that they are measured in nanometres, this is called nanophotonics. In order to push the boundaries of what is possible in this tiny format, progress in fundamental research is crucial. The innovative 'light box' of the Chalmers researchers makes the alternations between light and matter take place so rapidly that it is no longer possible to distinguish between the two states. Light and matter become one.

"We have created a hybrid consisting of equal parts of light and matter. The concept opens completely new doors in both fundamental research and applied nanophotonics and there is a great deal of scientific interest in this," says Ruggero Verre, a researcher in the Department of Physics at Chalmers and one of the authors of the scientific article.

The discovery came about when Verre and his departmental colleagues Timur Shegai, Denis Baranov, Battulga Munkhbat and Mikael Käll combined two different concepts in an innovative way. Mikael Käll's research team is working on what are known as nanoantennas, which can capture and amplify light in the most efficient way. Timur Shegai's team is conducting research into a certain type of atomically thin two-dimensional material known as TMDC material, which resembles graphene. It was by combining the antenna concept with stacked two-dimensional material that the new possibilities were created.

The researchers used a well-known TMDC material - tungsten disulphide - but in a new way. By creating a tiny resonance box - much like the sound box on a guitar - they were able to make the light and matter interact inside it. The resonance box ensures that the light is captured and bounces round in a certain 'tone' inside the material, thus ensuring that the light energy can be efficiently transferred to the electrons of the TMDC material and back again. It could be said that the light energy oscillates between the two states - light waves and matter - while it is captured and amplified inside the box. The researchers have succeeded in combining light and matter extremely efficiently in a single particle with a diameter of only 100 nanometres, or 0.00001 centimetres.
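Quantitatively, such light-matter hybrids are usually described with a standard two-coupled-oscillator model; the notation below is textbook convention, not taken from the article:

```latex
% Two-coupled-oscillator model of light-matter hybridization.
% E_c = resonator photon energy, E_x = exciton energy in the WS2 layers,
% g = coupling strength, kappa, gamma = photon and exciton linewidths.
\[
  E_{\pm} \;=\; \frac{E_c + E_x}{2} \;\pm\; \sqrt{g^{2} + \frac{(E_c - E_x)^{2}}{4}}
\]
% At zero detuning (E_c = E_x) the two hybrid branches are separated by
% the Rabi splitting \( \hbar\Omega_R = 2g \); a common criterion for the
% strong-coupling regime, where light and matter lose their separate
% identities, is \( 2g \gtrsim (\kappa + \gamma)/2 \).
```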

This all-in-one solution is an unexpected advance in fundamental research, but can hopefully also contribute to more compact and cost-effective solutions in applied photonics.

"We have succeeded in demonstrating that stacked atomically thin materials can be nanostructured into tiny optical resonators, which is of great interest for photonics applications. Since this is a new way of using the material, we are calling this 'TMDC nanophotonics'. I am certain that this research field has a bright future," says Timur Shegai, Associate Professor in the Department of Physics at Chalmers and one of the authors of the article.

Credit: 
Chalmers University of Technology

Technique pulls interstellar magnetic fields within easy reach

MADISON, Wis. -- A new, more accessible and much cheaper approach to surveying the topology and strength of interstellar magnetic fields -- which weave through space in our galaxy and beyond, representing one of the most potent forces in nature -- has been developed by researchers at the University of Wisconsin-Madison.

Together with gravity, magnetic fields play a major role in many of the astrophysical processes -- from star formation to stirring the massive dust and gas clouds that permeate interstellar space -- that underpin the structure and composition of stars, planets and galaxies. On the galactic scale, magnetic fields dominate the acceleration and propagation of cosmic rays, and play an important role in transferring heat and polarized radiation.

What's more, the polarized radiation that arises from galactic magnetic fields exceeds by orders of magnitude that of the Cosmic Microwave Background (CMB), the relic radiation of the first moments of the universe. The next milestone in understanding the origin of the universe, some scientists believe, requires measuring the CMB's polarized radiation. Importantly, unraveling the topology of the intervening magnetic fields between Earth and the CMB will be a necessary step to reliably obtain those data.

But despite their importance and pervasive influence, interstellar magnetic fields represent one of the final frontiers of astrophysics. Little is known about them, in large part, because they are exceedingly difficult to study.

"There are very limited ways to study magnetic fields in space," explains Alexandre Lazarian, a UW-Madison professor of astronomy and an authority on the interstellar medium, the seemingly empty spaces between the stars that are, in fact, rich in matter and feature twisted, folded and tangled magnetic fields composed of fully or partially ionized plasmas entrained on magnetic fields. "Our understanding of all these (astrophysical) processes suffers from our poor knowledge of magnetic fields."

Now, much of that knowledge may be more readily at hand. Writing this week (June 10, 2019) in the journal Nature Astronomy, an international team led by the Wisconsin astrophysicist demonstrates a new methodology capable of tracing the orientations of magnetic fields in the swirl of interstellar space.

The proof-of-concept reported in Nature Astronomy builds on a series of theoretical and numerical studies published over the last two years by Lazarian and his students, and which lay out a radical new approach to mapping the tangle of magnetic fields in space.

Until now, much of the detailed mapping of magnetic fields in diffuse environments such as clouds of dust and gas in space involved infrared polarimetry with instruments deployed either on satellites or balloons flown high in the stratosphere.

The new method, known as the Velocity Gradient Technique and informally as the "Wisconsin technique," uses previously collected observational data from a variety of ground-based telescopes, transcending the need to put instruments in space, a costly and limited resource for astronomers. Building on studies of turbulence in magnetic fields in conducting fluids, Lazarian and his students devised the new statistical approach to measure the topology of magnetic fields using routine spectroscopic observations taken from the ground.
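In spirit, the method exploits a property of magnetized turbulence: gradients of spectroscopic observables such as velocity centroids tend to orient perpendicular to the local magnetic field, so rotating a gradient map by 90 degrees traces the field. A bare-bones sketch of that core idea follows; the function name and toy input are invented, and the published pipeline adds sub-block averaging of gradient-angle distributions and other statistical machinery:

```python
import numpy as np

def field_orientation_from_centroids(centroid_map, block=16):
    """Estimate plane-of-sky magnetic field angles from a 2D map of
    velocity centroids. A simplified illustration of the Velocity
    Gradient Technique, not the published pipeline.
    """
    gy, gx = np.gradient(centroid_map)
    grad_angle = np.arctan2(gy, gx)       # direction of steepest change
    b_angle = grad_angle + np.pi / 2      # field is ~perpendicular to it
    # Crude stand-in for sub-block averaging: median angle per block
    ny, nx = centroid_map.shape
    out = np.empty((ny // block, nx // block))
    for i in range(ny // block):
        for j in range(nx // block):
            tile = b_angle[i*block:(i+1)*block, j*block:(j+1)*block]
            out[i, j] = np.median(tile)
    return out

# Toy input: a smooth synthetic centroid map standing in for real
# spectroscopic observations.
yy, xx = np.mgrid[0:128, 0:128]
centroids = np.sin(xx / 20.0) + 0.1 * np.cos(yy / 15.0)
print(field_orientation_from_centroids(centroids).shape)  # (8, 8)
```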

For the most part, infrared light is absorbed by Earth's atmosphere, which is why conventional magnetic field measurements require telescopes positioned on long-duration, high-altitude balloon flights, or above it on satellites. In recent years, many new measurements of interstellar magnetic fields, for instance, were gathered using the Planck satellite, a European space observatory with infrared capabilities and operational from 2009 to 2013.

Applying the new Wisconsin technique to a number of interstellar molecular clouds whose magnetic fields had been previously measured by the Planck satellite, Lazarian and his students were able to generate high-resolution maps using existing ground-based observations.

"The technique provides magnetic field maps of resolution comparable to maps obtained with the Planck mission," says Lazarian, "and it utilizes spectroscopic observations collected by researchers for other purposes. Given that the technique utilizes data from ground-based telescopes and interferometers, the resolution of magnetic field maps can be significantly improved."

In addition to determining the direction of the interstellar magnetic fields, the new methodology can determine the strength of the field at a fine scale, down to each pixel on a map. "This demonstrates that the Wisconsin technique can revolutionize studies of magnetic effects on star formation by using existing ground-based telescopes without waiting for new space-based polarization missions with a higher resolution in some distant future," Lazarian says.

The new technique, Lazarian adds, also opens a unique window to the development of three-dimensional magnetic field maps, work that has already been demonstrated in a corresponding paper published in the Astrophysical Journal by Lazarian and his student, Diego Gonzales Casanova.

To contrast the capabilities of the new technique with traditional polarimetry, Lazarian and his group, including UW-Madison physics graduate student Yue Hu and astronomy graduate student Ka Ho Yuen, key authors of the new Nature Astronomy report, deployed their new methodology to produce the first magnetic field map of the Smith Cloud, a mysterious cloud of atomic hydrogen that seems to be crashing onto the disk of the Milky Way. Previous efforts to map the cloud's magnetic field were frustrated by its weak infrared emission, obscuring dust and galactic atomic hydrogen along the same line of sight.

Credit: 
University of Wisconsin-Madison

A new picture of dengue's growing threat

SEATTLE--Research published today in Nature Microbiology paints a startling new picture of where dengue, the world's fastest-growing mosquito-borne virus, will spread to put more than 6 billion people at risk toward the end of the century.

The study predicts risk to increase in the southeastern United States, coastal areas of China and Japan, and inland regions of Australia, based on researchers' analysis of climate change data, urbanization, and resources and expertise available to control the virus. However, the biggest changes are predicted to occur in nations where dengue is already endemic.

"What was most surprising was actually how much less spread we predict in comparison to previous dengue maps," said Dr. Oliver Brady, co-author of the paper and an Assistant Professor at the London School of Hygiene & Tropical Medicine. "While climate change is likely to contribute to dengue expansion, factors including population growth and increasing urbanization in tropical areas will play a much larger role in shaping who will be at risk in the future."

The estimates are the first to include the projected spread of mosquitoes that carry the dengue virus. They forecast out to the years 2020, 2050, and 2080 at a high spatial resolution using the latest climate change projections from the Intergovernmental Panel on Climate Change (IPCC).

Dengue causes the greatest disease burden of any virus transmitted by mosquitoes, ticks, or other insects, with an estimated 10,000 deaths and 100 million infections per year, according to the World Health Organization. There is no specific treatment, but ensuring governments have robust programs for mosquito control, clinical management of disease, and outbreak response can help limit the impact of dengue.

The researchers first mapped environmental suitability for the dengue virus in 2015, then modeled the estimated distribution of dengue over the next 65 years. The results show that demographic changes in areas where the disease is already present will drive much of the increase in dengue burden, putting an estimated 60 percent of the global population at risk of contracting the virus in 2080.

The greatest shifts in dengue risk are projected to occur on the African continent, particularly in the Sahel and southern Africa. In contrast to other studies, the results do not show significant expansion of dengue across continental Europe, with only a few isolated areas around the Mediterranean likely to see low levels of risk in the future.

"We found that the population at risk of dengue will grow substantially and disproportionately in many areas that are economically disadvantaged and least able to cope with increased demands on health systems," said co-author Dr. Simon I. Hay, Director of Geospatial Science at the Institute for Health Metrics and Evaluation and Professor of Health Metrics Sciences at the University of Washington. "Mitigation strategies must focus on dengue-endemic areas, not just the risk of expansion to Western nations. Taking action now by investing in trials of novel vaccines and mosquito control and planning for sustainable population growth and urbanization are crucial steps for reducing the impact of the virus."

In addition to the US, China, Japan, and Australia, new areas at risk over the next 60 years include higher altitudes in central Mexico and northern Argentina. Areas with decreasing risk include areas in central East Africa and India.

Credit: 
Institute for Health Metrics and Evaluation

To protect kids and teens from firearm harm, answer these questions first, experts say

image: The process and outcomes behind the FACTS consortium's new paper.

Image: 
FACTS Consortium

Firearm injuries kill more American children and teens than anything else, except automobile crashes. But research on how those injuries happen, who's most likely to suffer or die from one, or what steps would prevent them, has lagged behind research on other causes of death in young people.

Meanwhile, firearm deaths among people age 19 and under have grown 44 percent since 2013.

Now, as more researchers and funding sources appear interested in pediatric firearm injury prevention research, a team of 24 experts from around the country has published a list of the 26 most pressing questions that they call for impartial studies to address.

Writing in the new issue of JAMA Pediatrics, the team lays out the list that they developed after an extensive review of the existing scientific literature and a structured consensus-building process. Their effort involved input from stakeholders from organizations that represent gun owners, law enforcement, clergy, the educational community, firearm injury prevention advocates, medical organizations and more.

The team of authors includes researchers from 12 different universities and hospitals across the country. All belong to the Firearm Safety Among Children and Teens (FACTS) consortium, funded by the National Institutes of Health. They include academic scientists who have led most of the research on pediatric firearm injury prevention to date, and are led by a trio of University of Michigan injury prevention experts.

"Firearm injury prevention research could answer many questions, but we need to address the most urgent questions first, and focus on what could bring death rates down fastest, just as we have with other causes of injury," says Rebecca Cunningham, M.D., one of the leaders of FACTS and an emergency physician at Michigan Medicine, U-M's academic medical center. "Answering these questions in an objective, rigorous way will provide valuable information for the country to use, just as past research on automobile injury led to changes that cut the death rate for children and teens in half."

Urgent questions across many topics

The FACTS team addressed research across multiple firearm outcomes, including knowledge and attitudes toward firearms, access and storage of firearms, carrying firearms, exposure to firearm violence, intentional firearm injury including suicide and mass shootings, and accidental injury.

Some of the most urgent types of questions they identified, for which no definitive, research-based answers exist, include:

How many children and teens annually suffer fatal and non-fatal firearm injuries?

How do children and teens gain access to firearms?

How are firearms stored in homes where children and teens live?

How effective are various programs for improving firearm handling, and reducing firearm violence and suicide?

How often, and under what circumstances, are children and teens protected by self-defensive use of firearms, by themselves or by others?

How effective are existing and new technologies, such as higher-pressure triggers and RFID identification safeguards, at preventing firearm injuries and deaths among young people?

What effect do existing public policies on firearms have on firearm injuries and deaths among children and teens?

How can we use data technology to provide near real-time information on firearm injuries and deaths among young people?

What are the immediate and long-term costs of pediatric firearm injuries, from health care to criminal justice and disability?

The full list of questions is published in the paper.

Cunningham and her colleagues hope it will help guide funding agencies at the state and federal level, and foundations, as they consider which research proposals to fund.

A groundswell of interest

Cunningham notes that in the past year, and especially since the launch of the FACTS website at http://www.childfirearmsafety.org, she and her colleagues have heard from many researchers who are interested in studying firearm issues.

"As funding is opening up right now, we need a roadmap for what we should be studying in all disciplines of research, and for agencies and foundations who want to invest in what will make the most difference," says Cunningham. "This is that roadmap."

In addition to the list of questions, the FACTS group will publish in August a series of papers reviewing the existing research on firearm injuries and deaths in children and teens, and related topics. That review informed the team that compiled the list of questions, she notes.

"Federally funded, peer-reviewed research is the most unbiased mechanism we have for answering societal questions," she says.

Work already under way

FACTS has funds from its initial grant to support 10 pilot studies that can produce data to guide larger studies. Some are already under way, including a nationally representative survey of teens and adults about firearm-related behaviors, and an inventory of state and county firearm policies and their relationship to the risk of school shootings.

At the same time, FACTS has gathered or identified over 60 existing pools of data that researchers can examine immediately for answers to urgent firearm research questions. It has made them available on the initiative's website for free, with another 40 expected to be online this year.

All of this, the FACTS team hopes, will ensure that their colleagues will focus on the key issues with the most urgent questions.

"Conducting research in a timely way that can serve policymakers who are focused on these issues could help new policies be more evidence-based," Cunningham says. And as new laws are made, researchers should evaluate their effectiveness and compare outcomes with states or areas that didn't pass such laws, she notes.

"There's been a giant momentum shift in the number of researchers, policymakers and organizations interested in having this information, and the number of clinicians deciding that this is their 'lane' and seeking to use proven approaches in their practice with families," says Cunningham. "This is very much parallel to what happened in the late 1960s and 1970s with automotive safety research. As researchers look to include this topic in their careers the first place they should start as scientists is to ask what's been done and what are the most urgent questions. We hope we've provided that."

Credit: 
Michigan Medicine - University of Michigan

Fiber-optic probe can see molecular bonds

image: This visualization shows the fiber-in-fiber-out process for optical spectroscopy measurement.

Image: 
Liu Group/UCR

In "Avengers: Endgame," Tony Stark warned Scott Lang that sending him into the quantum realm and bringing him back would be a "billion-to-one cosmic fluke."

In reality, shrinking a light beam to a nanometer-sized point to spy on quantum-scale light-matter interactions and retrieving the information is not any easier. Now, engineers at the University of California, Riverside, have developed a new technology to tunnel light into the quantum realm at an unprecedented efficiency.

In a Nature Photonics paper, a team led by Ruoxue Yan, an assistant professor of chemical and environmental engineering, and Ming Liu, an assistant professor of electrical and computer engineering, describe the world's first portable, inexpensive, optical nanoscopy tool that integrates a glass optical fiber with a silver nanowire condenser. The device is a high-efficiency round-trip light tunnel that squeezes visible light to the very tip of the condenser to interact with molecules locally and send back information that can decipher and visualize the elusive nanoworld.

Our ability to zoom in on the fine details of an object is limited by the wave nature of light. If you ever used an optical microscope in a science class, you probably learned that one can only magnify an object by about 2,000 times before everything becomes a blur. That's because it's impossible to distinguish any features finer than half the wavelength of light -- a few hundred nanometers for far-field visible light -- no matter how advanced your microscope is.
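The bound being described is the standard Abbe diffraction limit; as a textbook aside (not specific to this paper):

```latex
% Abbe diffraction limit: smallest resolvable feature d for wavelength
% \lambda imaged through optics of numerical aperture NA.
\[
  d \;=\; \frac{\lambda}{2\,\mathrm{NA}}
  \quad\Rightarrow\quad
  d \approx \frac{500\,\mathrm{nm}}{2 \times 1} = 250\,\mathrm{nm}
  \quad (\text{green light, } \mathrm{NA} \approx 1)
\]
% Far-field optics therefore cannot resolve features much below a few
% hundred nanometers, which is why near-field methods are needed to
% reach the single-nanometer scale described in this article.
```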

Unlike far-field waves, near-field waves only exist very close to a light source and are not governed by this rule. But they do not travel voluntarily and are very difficult to utilize or observe. Since the 1920s, scientists have thought that forcing light through a small pinhole on a metal film would generate near-field waves that could be converted to detectable light, but the first successful prototypes weren't built until half a century later.

In the early 1990s, Eric Betzig, the 2014 Nobel laureate in chemistry, made substantial improvements to earlier prototypes in imaging performance and reliability. Since then, near-field scanning optical microscopy, as the technique is known, has been used to reveal the nanoscale details of many chemical, biological, and material systems.

Unfortunately, almost another half-century later, this technique is still esoteric and used by few.

"Sending light through a tiny pinhole a thousand-times smaller than the diameter of a strand of human hair is no piece of cake," Liu said. "Only a few in a million photons, or light particles, can pass the pinhole and reach the object you want to see. Getting a one-way ticket is already challenging; a round-trip ticket to bring back a meaningful signal is almost a daydream."

Scientists have made endless efforts to improve this chance. While the most sophisticated probes today allow only one in 1,000 photons to reach the object, the UC Riverside device delivers half the photons to the tip.

"The key of the design is a two-step sequential focusing process," Yan said. "In the first step, the wavelength of the far-field light slowly increases as it travels down a gradually thinning optical fiber, without changing its frequency. When it matches the wavelength of the electron density wave in the silver nanowire lying on top of the optical fiber, boom! All energy is transferred to the electron density wave and starts to travel on the surface of the nanowire instead."

In the second step of the focusing process, the wave gradually condenses to a few nanometers at the tip apex.

The UC Riverside device, a tiny silver needle with light coming off the tip "is sort of like Harry Potter's wand that lights up a tiny area," explained Sanggon Kim, the doctoral student who carried out the study.

Kim used the device to map out the frequency of molecular vibrations that allow one to analyze chemical bonds that hold atoms together in a molecule. This is known as tip-enhanced Raman spectroscopy, or TERS, imaging. TERS is the most challenging branch of near-field optical microscopy, because it deals with very weak signals. It usually requires bulky, million-dollar equipment to concentrate light and tedious preparation work to get super-resolution images.

With the new device, Kim achieved 1-nanometer resolution with simple, portable equipment. The invention could be a powerful analytical tool that promises to reveal a new world of information to researchers in all disciplines of nanoscience.

"The integration of a fiber-nanowire assembly with tip-enhanced Raman spectroscopy coupled with a scanning tunneling microscope enables the collection of high-resolution chemical images in a simple and elegant setup, placing this tool at the forefront of optical imaging and spectroscopy. We are proud of this achievement and its impact on chemical research. We are even more encouraged by its potential application in a wide array of disciplines such as biological and materials research, which will further scientific advancement," said Lin He, acting deputy division director for the National Science Foundation Division of Chemistry that in part funded the research.

Credit: 
University of California - Riverside

How electrical stimulation reorganizes the brain

image: Location of electrodes exhibiting significant stimulation responses and modulation of the stimulation response.

Image: 
Huang et al., JNeurosci (2019)

Recordings of neural activity during therapeutic stimulation can be used to predict subsequent changes in brain connectivity, according to a study of epilepsy patients published in JNeurosci. This approach could inform efforts to improve brain stimulation treatments for depression and other psychiatric disorders.

Corey Keller and colleagues delivered electrical stimulation from implanted electrodes in 14 patients while recording participants' brain activity. Repeated sets of stimulation resulted in progressive changes to the brain's response to stimulation, with stronger responses in brain regions connected to the stimulation site. The researchers observed these changes in a matter of minutes, suggesting that electrical stimulation induces the brain to rapidly reorganize itself.

Assessing brain activity before, during, and after stimulation has the potential to personalize neuromodulation therapies. Whether these results will translate to non-invasive techniques, such as transcranial magnetic stimulation, and to other patient populations remains to be determined.

Credit: 
Society for Neuroscience