
Science snapshots: New nitrides, artificial photosynthesis, and TMDC semiconductors

image: A study led by Wenhao Sun of Berkeley Lab features a large, interactive stability map of the ternary nitrides, highlighting nitride compositions where experimental discovery is promising in blue.

Image: 
Wenhao Sun/Berkeley Lab

Groundbreaking Study Maps Out Paths to New Nitride Materials

Formed by elements combining with nitrogen, nitrides can possess unique properties with potential applications from semiconductors to industrial coatings. But before nitrides can be put to use, they first must be discovered - and the odds of finding them in nature are slim.

Now, your chances of discovering new nitrides just got better with a groundbreaking Nature Materials study led by Berkeley Lab in close collaboration with the National Renewable Energy Laboratory (NREL) and a number of other institutions.

The study features a large, interactive stability map of the ternary nitrides, highlighting nitride compositions where experimental discovery is promising in blue. So far, the map has yielded the prediction of 244 new stable ternary nitride compounds.

"For ancient explorers, sailing into the unknown was a very risky endeavor, and in the same way, exploration of new chemical spaces can also be risky," explained Wenhao Sun, lead author of the paper and staff scientist at Berkeley Lab. "If you don't find a new material where you are looking, it can be a big waste of time and effort. Our chemical map can help to guide the exploratory synthesis of nitrides, just as maps helped to guide explorers, allowing them to navigate better."

Read the full release from NREL here.

Here Comes the Sun: A New Framework for Artificial Photosynthesis
By THERESA DUQUE

Scientists have long sought to mimic the process by which plants make their own fuel using sunlight, carbon dioxide, and water through artificial photosynthesis devices, but exactly how substances called catalysts work to generate renewable fuel remains a mystery.

Now, a PNAS study led by Berkeley Lab - and supported by state-of-the-art materials characterization at the Joint Center for Artificial Photosynthesis, powerful X-ray spectroscopy techniques at the Advanced Light Source, and superfast calculations performed at the National Energy Research Scientific Computing Center - has uncovered new insight into how to better control cobalt oxide, one of the most promising catalysts for artificial photosynthesis.

When molecules of cobalt oxide cubane, so named for its eight atoms forming a cube, are in solution, the catalytic units eventually collide with one another, react, and thus deactivate.

To hold the catalysts in place, and prevent these collisions, the researchers used a metal-organic framework as a scaffold. The technique is similar to how tetramanganese, a metal-oxygen catalyst in natural photosynthesis, protects itself from self-destruction by hiding in a protein pocket.

"Our study provides a clear, conceptual blueprint for engineering the next generation of energy-converting catalysts," said Don Tilley, senior faculty scientist in Berkeley Lab's Chemical Sciences Division and a co-corresponding author of the study.

You Don't Have to Be Perfect for TMDCs to Shine Bright
By Theresa Duque

Atomically thin semiconductors known as TMDCs (transition metal dichalcogenides) could lead to devices that operate more efficiently than conventional semiconductors in light-emitting diodes, lasers, and solar cells. But these materials are hard to make without defects that dampen their performance.

Now, a study led by senior faculty scientist Ali Javey of Berkeley Lab - and published in the journal Science - has revealed that TMDCs' efficiency is diminished not by defects but by extra free electrons.

In a previous study, the researchers used chemical treatments to improve TMDCs' photoluminescence quantum yield, a ratio describing the amount of light generated by the material versus the amount of light absorbed. "But that's not ideal because the treatments are unstable in subsequent processing," said co-first author and graduate student researcher Shiekh Zia Uddin.
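Quantum yield itself is just a ratio, which a minimal sketch makes concrete (the function name and photon counts below are hypothetical illustrations, not data from the study):

```python
def quantum_yield(photons_emitted: float, photons_absorbed: float) -> float:
    """Photoluminescence quantum yield: the fraction of absorbed photons
    that the material re-emits as light."""
    if photons_absorbed <= 0:
        raise ValueError("photons_absorbed must be positive")
    return photons_emitted / photons_absorbed

# Hypothetical counts: a lossy film re-emits only a small fraction of
# what it absorbs, while a perfect emitter returns every photon.
print(quantum_yield(1e4, 1e6))  # 0.01 -> 1% quantum yield
print(quantum_yield(1e6, 1e6))  # 1.0  -> 100%, the ideal case
```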

For the current study, the researchers discovered that when they applied an electrical voltage, rather than a chemical treatment, to TMDCs made of molybdenum disulfide and tungsten disulfide, the extra free electrons were removed from the material, resulting in a quantum yield of 100%.

"A quantum yield of 100% is unprecedented in inorganic TMDCs," said Der-Hsien Lien, postdoctoral researcher and co-first author. "This is an exciting result that shows it might be much easier and cheaper than previously thought to make useful optoelectronic devices from these materials."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Climate change had significant impact on Amazon communities before arrival of Europeans

image: Raised fields in the Bolivian Llanos de Moxos region.

Image: 
Umberto Lombardo

Climate change had a significant impact on people living in the Amazon rainforest before the arrival of Europeans and the loss of many indigenous groups, a new study shows.

Major shifts in temperature and rainfall caused the disappearance of communities long before 1492, researchers have found. In contrast, other cultures still flourished just before the Spanish colonisation of the Americas.

New analysis of what the climate was like in the Amazon from 700 to 1300 shows the changing weather led to the end of communities who farmed intensively and had a strong class structure. Those who lived without political hierarchy, grew a greater variety of crops, and took more care to keep the land fertile were able to adapt and were less affected.

During this period the Amazon was home to dozens of sophisticated communities who lived in flourishing towns and villages. Conflict between these communities, and migration, also contributed to the downfall of some.

Dr Jonas Gregorio de Souza, who led the research while at the University of Exeter and is now based at the Universitat Pompeu Fabra, said: "Some Amazon communities were in decline or had changed drastically before 1492. Our research shows climate change was one of the responsible factors, but some groups survived because they had been working with their natural environment rather than against it. Those who farmed intensively, and had more pressure to produce surplus food because of a strong class structure, were less able to cope."

It is thought the population of indigenous communities declined by 90 per cent to 95 per cent after Europeans came to Amazonia due to epidemics and violence. Before this up to 10 million people had lived in Amazonia, and this loss reshaped landscapes and cultural geographies across the region.

Experts reconstructed the climate of ancient Amazonia by analysing pollen and charcoal remains, lake sediments and stalagmites. This allowed them to track how much rainfall there was in the region from year to year. They also analysed archaeological remains showing crops grown by communities in the past, and the structures they lived in.

In the Eastern Amazon the Marajoara elite lived on large mounds, each of which could have been home to around 2,000 people. These chiefdoms disintegrated after 1200. It had been thought this was due to the arrival of Aruã nomadic foragers, but the study suggests decreasing rainfall also played a part. Communities used the mounds to manage water, with the rich monopolising resources. This made them sensitive to prolonged droughts.

At the same time, the Santarém culture, established in around 1100, was flourishing. Its people grew a variety of crops, including maize, sweet potato and squash, and worked to enrich the forest. This meant drier conditions had less impact.

Experts have found communities in the Amazon built canals to manage seasonal floods. In the southern Amazon people fortified their ditches, walled plazas, causeways and roads as the climate became more volatile.

Professor Jose Iriarte, from the University of Exeter, said: "This study adds to the growing evidence that the millennium preceding the European encounter was a period of long-distance migrations, conflict, disintegration of complex societies and social re-organisation across lowland South America. It shows the weather had a real impact."

The research, part of the Pre-Columbian Amazon-Scale Transformations project, funded by the European Research Council, was carried out by academics at the University of Exeter, Pennsylvania State University, Baylor University, Universität Bern, Universidade de São Paulo, Instituto Geofísico del Peru, Northumbria University, Universidade Federal do Pará, French National Centre for Scientific Research, The University of Utah, University of Reading and the Universiteit van Amsterdam.

Climate change and cultural resilience in late pre-Columbian Amazonia is published in the journal Nature Ecology and Evolution.

Credit: 
University of Exeter

Molecular analysis could improve the early detection and prevention of endometrial cancer

image: IDIBELL-ICO researchers.

Image: 
Gemma

The use of molecular biomarkers in minimally invasive sampling opens a promising perspective for the early detection of endometrial cancer. This is the conclusion reached by members of the Screenwide research group, formed by researchers from the Bellvitge Biomedical Research Institute (IDIBELL) and the Catalan Institute of Oncology (ICO-Hospitalet). Their article, published in the International Journal of Cancer, highlights the gaps in current knowledge that must be closed to accelerate the implementation of new technologies, with the aim of improving the screening and early clinical detection of this cancer in women.

Due to the anatomical continuity between the uterine cavity and the cervix, genomic analysis of the biological material from the Papanicolaou test, or cervical cytology, which is routinely used in cervical cancer prevention programs, represents, together with other methods of non-invasive sampling, a unique opportunity to detect signs of upper genital tract disease. This could help improve the diagnosis and prevention of endometrial cancer.

Currently, strategies for detecting signs of this cancer are limited to high-risk populations and symptomatic women, since 90% of endometrial cancers present with abnormal bleeding. The new analyses will not only benefit these cases: they could also improve the screening of asymptomatic women. Molecular tests can help refine current diagnostic algorithms, since they will reduce the failure rate of classical histological diagnosis. In addition, minimally invasive methods are more appropriate in large populations of asymptomatic women, since they are much better tolerated. The first women to benefit from this new screening approach will probably be those with a family history of cancer, as in the case of Lynch syndrome, due to their high underlying risk.

The Screenwide group was created in 2016 with the aim of developing tools for the early detection and screening of endometrial and ovarian cancer. The team is led by the epidemiology group (Dr. Laura Costas), in alliance with the pathology (Dr. Xavier Matias-Guiu), gynecology (Dr. Jordi Ponce), oncology (Dr. Josep Maria Piulats), Procure (Dr. Álvaro Aytés), and genetic counseling (Dr. Joan Brunet) groups. The group collaborates internationally with the Endometrial Cancer Epidemiology Consortium (Dr. Sara Olson), Johns Hopkins University (Dr. Bert Vogelstein) and the Forecee Consortium (Dr. Martin Widschwendter), and nationally with the Vall d'Hebron Research Institute (VHIR, Dr. Eva Colás), among others. Over the last two years, the combined effort of this multidisciplinary group has allowed the recruitment of almost 500 women and the gathering of more than 1,600 biological samples. All this information will form the basis for evaluating new strategies for detecting endometrial and ovarian cancer at an early stage.

Credit: 
IDIBELL-Bellvitge Biomedical Research Institute

'Self-healing' polymer brings perovskite solar tech closer to market

image: This perovskite solar module is better able to contain the lead within its structure when a layer of epoxy resin is added to its surface. This approach to tackling a long-standing environmental concern helps bring the technology closer to commercialization.

Image: 
OIST

A protective layer of epoxy resin helps prevent the leakage of pollutants from perovskite solar cells (PSCs), report scientists from the Okinawa Institute of Science and Technology Graduate University (OIST). Adding a "self-healing" polymer to the top of a PSC can radically reduce how much lead it discharges into the environment. This gives a strong boost to prospects for commercializing the technology.

With atmospheric carbon dioxide at its highest recorded levels, and extreme weather events continuing to rise in number, the world is moving away from legacy energy systems that rely on fossil fuels towards renewables such as solar. Perovskite solar technology is promising, but one key challenge to commercialization is that it may release pollutants such as lead into the environment, especially under extreme weather conditions.

"Although PSCs are efficient at converting sunlight into electricity at an affordable cost, the fact that they contain lead raises considerable environmental concern," explains Professor Yabing Qi, head of the Energy Materials and Surface Sciences Unit, who led the study, published in Nature Energy.

"While so-called 'lead-free' technology is worth exploring, it has not yet achieved efficiency and stability comparable to lead-based approaches. Finding ways of using lead in PSCs while keeping it from leaking into the environment, therefore, is a crucial step for commercialization."

Testing to destruction

Qi's team, supported by the OIST Technology Development and Innovation Center's Proof-of-Concept Program, first explored encapsulation methods for adding protective layers to PSCs to understand which materials might best prevent the leakage of lead. They exposed cells encapsulated with different materials to many conditions designed to simulate the sorts of weather to which the cells would be exposed in reality.

They wanted to test the solar cells in a worst-case weather scenario, to understand the maximum lead leakage that could occur. First, they smashed the solar cells using a large ball, mimicking extreme hail that could break down their structure and allow lead to be leaked. Next, they doused the cells with acidic water, to simulate the rainwater that would transport leaked lead into the environment.

Using mass spectrometry, the team analyzed the acidic rain to determine how much lead had leaked from the cells. They found that an epoxy resin layer allowed the least lead leakage -- orders of magnitude lower than with the other materials.

Enabling commercial viability

Epoxy resin also performed best under a number of weather conditions in which sunlight, rainwater and temperature were altered to simulate the environments in which PSCs must operate. In all scenarios, including extreme rain, epoxy resin outperformed rival encapsulation materials.

Epoxy resin worked so well due to its "self-healing" properties. After its structure is damaged by hail, for example, the polymer partially reforms its original shape when heated by sunlight. This limits the amount of lead that leaks from inside the cell. This self-healing property could make epoxy resin the encapsulation layer of choice for future photovoltaic products.

"Epoxy resin is certainly a strong candidate, yet other self-healing polymers may be even better," explains Qi. "At this stage, we are pleased to be promoting photovoltaic industry standards, and bringing the safety of this technology into the discussion. Next, we can build on these data to confirm which is truly the best polymer."

Beyond lead leakage, another challenge will be to scale up perovskite solar cells into perovskite solar panels. While cells are just a few centimeters long, panels can span a few meters, and will be more relevant to potential consumers. The team will also direct their attention to the long-standing challenge of renewable energy storage.

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Research highlights possible targets to help tackle Crohn's disease

image: A scanning electron microscopy image of a macrophage, total magnification x5500.

Image: 
University of Plymouth

Affecting around 115,000 people in the UK alone, Crohn's Disease is a lifelong condition in which parts of the digestive system become inflamed. There is no known cure and its causes are believed to vary. But one indicator of the condition - an abnormal reaction of the immune system to certain bacteria in the intestines - has had new light shed on it thanks to scientists at the University of Plymouth.

The research has focused on different types of cells called macrophages, which are part of our immune system and are found in most tissues, where they patrol for potentially harmful organisms and destroy them.

In inflammatory diseases, like Crohn's, macrophages mediate the inflammatory destruction of the gut. Just how the tissue reacts (inflammation or suppression) is dependent on the type of macrophage cell present, and how it is stimulated - and scientists have been trying to get to the bottom of this.

The new research has shown how different types of macrophage - one type being pro-inflammatory and the other being anti-inflammatory - exhibit quite different molecular mechanisms involved in switching off their functional behaviour when bacteria are present.

And this difference, as study author Dr Andrew Foey explains, highlights the possibility of targeting and selectively suppressing the pro-inflammatory cells that drive diseases such as Crohn's Disease.

"This small step in understanding of differential off-signalling of macrophage type may go hand-in-hand with understanding the relapsing/remitting presentation of Crohn's Disease," he said. "It is suggestive of future research endeavours in targeting macrophage responses in the treatment of inflammatory diseases - and it's a really positive step."

The full study, entitled 'Macrophage subsets exhibit distinct E. coli-LPS tolerisable cytokines associated with the negative regulators, IRAK-M and Tollip' is available to view in the journal PLOS ONE (doi: 10.1371/journal.pone.0214681).

The research was led by Dr Andrew Foey with Dr Khalid AlShaghdali and PhD student Barbara Durante from the School of Biomedical Sciences at the University of Plymouth, in collaboration with Dr Jane Beal, from the School of Biological and Marine Sciences.

Infection, Immunity and Inflammation is one of the key research themes in the Institute of Translational and Stratified Medicine (ITSMed) at the University of Plymouth.

The work was funded by University of Hail, Kingdom of Saudi Arabia.

Credit: 
University of Plymouth

Stem cells reprogrammed into neurons could reveal drugs harmful to pregnancy

image: Human neurons reprogrammed from embryonic stem cells at various stages of development.

Image: 
Soham Chanda/Colorado State University

Pregnant women are often advised to avoid certain drugs because of potential risks to their unborn infant's growing brain cells. Such risks are difficult to pinpoint, though, because there are few ways to track the cellular mechanisms of a drug while the fetus is developing.

Soham Chanda, an assistant professor in the Department of Biochemistry and Molecular Biology, has designed a new experimental system that can rapidly assess the pathogenic effects of a drug on a baby's developing brain. His system uses embryonic stem cells reprogrammed into neurons, offering a powerful tool for probing genetic and molecular underpinnings of drug-induced neurodevelopmental disorders. The knowledge gained from this new method could be harnessed to uncover unknown drug risks, as well as preventive therapies.

The research is published in Cell Stem Cell, and the work was primarily carried out while Chanda was a postdoctoral researcher at Stanford University with Thomas Südhof and Marius Wernig. Chanda joined the Colorado State University faculty in January and is continuing to devise methods for understanding biochemical properties of early-stage neuronal development as one of his key research interests.

The paper in Cell Stem Cell describes a model in vitro platform demonstrating the use of reprogrammed stem cells to systematically deconstruct how a drug can disrupt neuronal development. The researchers provided proof-of-concept of their experimental platform by observing the effects of valproic acid, a commonly prescribed drug that treats epileptic seizures and is also associated with fetal brain issues.

Chanda explained that attributing certain phenotypes to different neurodevelopmental stages is extremely challenging in a living (in vivo) system. That's because during normal development, neurons are not all generated in lockstep at the same time, making it difficult to distinguish developing from mature neurons.

But with Chanda's reprogrammed stem cells, the entire neuronal population is "phase-locked" in early development stages, gradually becoming mature in a synchronous manner. "This gives you a great advantage so that when you expose them to teratogenic drugs, you see the clear effects at early vs. late maturation stages," Chanda said. Teratogenic drugs are any that disturb the development of an embryo or fetus.

Using this system, the researchers definitively showed that valproic acid has profoundly divergent effects on early- vs. late-stage neuronal development. When neurons were still immature, the drug exposure induced changes in gene expression that led to severe impairments in how the brain cells were shaped, and how they functioned. In particular, they found that these pathogenic effects were largely mediated by a reduced cellular level of the MARCKSL1 protein, which is essential for guiding the structural maturation of newly born neurons. The drug caused no ill effects in mature neurons.

Chanda said his chief aim with the project was to test the efficacy of reprogrammed neurons as an in vitro model for human neurodevelopment, and to prepare this platform to test cellular effects of many different drugs and their consequences.

"Our major goal is to understand the fundamental mechanisms of how neurons develop their morphological and functional properties, and how different molecules contribute to this process," Chanda said.

Credit: 
Colorado State University

Tuning into the LCDs of tomorrow: Exploring the novel IGZO-11 semiconductor

In 1985, Noboru Kimizuka of the National Institute for Research in Inorganic Materials, Japan, pioneered polycrystalline indium-gallium-zinc oxide (IGZO) ceramics, with the general chemical formula (InGaO3)m(ZnO)n (m, n = natural numbers; hereafter referred to as IGZO-mn). Little could he have known that their curious electrical properties would lead the electronics industry to license thin-film transistors (TFTs) made from these metal oxides for various devices, including touch displays. However, this did not come easy. In fact, even today, many of the characteristics of pure IGZO crystals remain unknown owing to the difficulty of extracting them. So what makes them tantalizing?

When you shine light on a metal, the free conduction electrons resonate with the external light (an electromagnetic wave). The light wave is thus screened and, as a result, is reflected rather than transmitted. This is why metals are good reflectors and conductors but not generally transparent. In contrast, semiconductors with a band gap larger than the photon energy, such as IGZO, cannot absorb visible light and therefore transmit it. In general, a large band gap implies that a material is an insulator, but injecting carriers into such a material, for example via oxygen defects, can yield one that is both transparent and conductive.
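As a rough illustration of the band-gap argument: a photon can only be absorbed if its energy E = hc/λ exceeds the gap. The sketch below walks through that arithmetic; the 3.5 eV gap is an assumed example value for a wide-gap oxide, not a figure from this article.

```python
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy E = h*c/lambda, expressed in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

band_gap_ev = 3.5  # assumed illustrative band gap for a wide-gap oxide
for wl in (380, 550, 750):  # violet, green, red ends of the visible range
    e = photon_energy_ev(wl)
    verdict = "absorbed" if e > band_gap_ev else "transmitted"
    print(f"{wl} nm -> {e:.2f} eV: {verdict}")
```

Since even violet photons (about 3.26 eV) fall below such a gap, the whole visible range passes through, which is why the material can look transparent.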

Being both transparent and conductive makes these semiconductors suitable for optoelectronic devices, much like the one you're reading this on! Furthermore, IGZO-based transistors have added advantages, such as high electron mobility, good uniformity over a large area, and low processing temperature, which make it possible to achieve unparalleled energy-efficient high resolution. Within the IGZO-1n family, polycrystalline IGZO-11 (i.e., InGaZnO4) exhibits the highest conductivity and the largest optical band gap. In addition, von Neumann-type computers, or simply digital computers, require "on-off" electrical circuits as their basic building blocks, with the ideal "off" state corresponding to zero current. IGZO-11 excels on this front too: its off-state current is extremely small, which means energy loss can be minimized.

However, single crystals of IGZO-11 big enough to measure its physical properties have not yet been obtained, so its precise intrinsic properties remain unexplored. Motivated by this, and by the fact that a multicomponent oxide with a layered structure could exhibit anisotropic conduction, a team of researchers, mainly from Tokyo University of Science and led by Prof Miyakawa, has developed a novel technique to grow single crystals of this type.

The primary challenge in synthesizing the multicomponent layered structure is the recurrent formation of defects during crystal growth. Furthermore, the physical properties of the material were unknown, which meant that the route for isolating the crystal had to be meticulously chalked out. Faced with the possibility that IGZO-11 might be an incongruently melting material under atmospheric pressure (i.e., that on melting the crystalline solid decomposes into a second crystal phase, different from the original, plus a liquid phase), the research team opted for the optical floating zone (OFZ) method to grow the crystal. By increasing the gas pressure, the team succeeded in suppressing evaporation and growing a good single crystal from the liquid phase.

The OFZ method thus enabled the growth of high-quality oxide crystals without the need for a crucible or container, giving better control over the temperature and pressure to which the liquid material is subjected. Additionally, using a Zn-rich feed rod allowed the researchers to compensate for the ZnO that would otherwise have evaporated, rendering the synthesis futile.

Having succeeded in synthesizing the crystal, the researchers studied its physical properties. They observed that the as-grown crystal appeared bluish in color. On annealing, that is, heating and then slowly cooling it in air with additional oxygen, the crystal became transparent. Free carriers produced by oxygen vacancies in crystals absorb red light and emit blue light; the researchers therefore attributed the color change to oxygen filling these vacancies during annealing.

To complete the tale, the researchers then measured the crystal's electrical conductivity, mobility, and carrier density, and their temperature dependences. They noted that all electrical properties decreased after annealing. By post-annealing, the carrier density and conductivity could be controlled within the ranges of 10^17-10^20 cm^-3 and 2000-1 S cm^-1, respectively, at room temperature. They also reported an increase in mobility with increasing carrier density, as previously noted in transport studies of some IGZO-1n thin films. This suggests that the unusual behavior is an intrinsic characteristic of the IGZO-1n family.
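The quoted conductivity and carrier-density ranges are tied together by the Drude relation σ = neμ, so one can estimate the mobility they imply. The sketch below does that arithmetic from the endpoint values quoted above; the resulting mobilities are illustrative estimates, not values reported by the authors.

```python
E_CHARGE = 1.602176634e-19  # elementary charge, C

def mobility_cm2_per_vs(sigma_s_per_cm: float, n_per_cm3: float) -> float:
    """Drude relation sigma = n*e*mu, solved for the mobility mu,
    in cm^2 V^-1 s^-1 (units work out because sigma and n are per cm)."""
    return sigma_s_per_cm / (n_per_cm3 * E_CHARGE)

# Endpoints of the quoted ranges (sigma: 1-2000 S/cm, n: 1e17-1e20 cm^-3)
print(mobility_cm2_per_vs(2000, 1e20))  # high-density end
print(mobility_cm2_per_vs(1, 1e17))     # low-density end
```

Note that the high-density end comes out with the larger mobility, consistent with the unusual trend the article describes for the IGZO-1n family.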

Interestingly, the team noted that the conductivity along the c-axis (the axis perpendicular to each plane in the layered structure) is more than 40 times lower than that in the ab-plane (the plane of each layer) in the single crystals, and that the anisotropy increases with decreasing carrier density. As Prof Miyakawa explains, "Indium-indium distance along the c-axis is much longer than that along the ab-plane. Therefore, the overlap of the wave function is smaller in c-axis direction." Because the degree of overlap of the wave functions of electronic orbitals governs how easily electrons can move, the researchers assert that this could be the origin of the anisotropic conductivity of IGZO-11 crystals.

Previously, the IGZO family has been used in liquid crystal displays, including those in smartphones and tablets, and recently also in large OLED televisions. Its combination of electrical conductivity and transparency makes IGZO stand out. While fabricating transistors from IGZO-11 that can be directly applied in LEDs remains a work in progress, this fascinating research marks the start of many more discoveries.

So, do you see why IGZO-11 is important or are you seeing through it?

Credit: 
Tokyo University of Science

Story tips from the Department of Energy's Oak Ridge National Laboratory, June 17, 2019

image: Researchers developed a one-of-a-kind, high-pressure cell and used it on the Magnetism Reflectometer beamline at ORNL's Spallation Neutron Source to study the spatially confined magnetism in a lanthanum-cobalt-oxide thin film.

Image: 
Genevieve Martin/Oak Ridge National Laboratory, US Dept. of Energy

Buildings--Pushing the envelope

An online tool developed by researchers at Oak Ridge National Laboratory provides architects and engineers a fast and efficient way to assess the performance of a building's envelope design before construction begins. The Building Science Advisor allows builders to evaluate the moisture durability of the envelope, or exterior, of residential buildings. "Most building envelope issues are associated with moisture problems," ORNL's Andre Desjarlais said. "With BSA, we're guiding builders through the design process by identifying features that impact durability." The tool helps builders make better-informed decisions for energy efficiency through two pathways--expert or educational. "The expert pathway gives builders the ability to input construction plan information uninterrupted, and the educational path guides users through each step of the material selection process, providing feedback so that the user can adjust plans in real-time," he said. The BSA will be expanded to include roofing systems, retrofits and commercial envelopes evaluation. [Contact: Jennifer Burke, (865) 576-3212; burkejj@ornl.gov]

Image: https://www.ornl.gov/sites/default/files/2019-06/Building_science_adv_tool.jpg

Caption: ORNL's online tool provides builders a fast and efficient way to assess the performance of a building's envelope design before construction begins. Credit: Oak Ridge National Laboratory, U.S. Dept. of Energy

Neutrons--Mastering magnetism

Researchers have pioneered a new technique using pressure to manipulate magnetism in thin film materials used to enhance performance in electronic devices. They used neutron scattering at Oak Ridge National Laboratory's Spallation Neutron Source to explore the spatial density of atoms and observe how magnetism in a lanthanum-cobalt-oxide film changed with applied pressure. "We developed a novel method to identify the critical role that strain has on the magnetism of films and their interfaces," said ORNL's Michael R. Fitzsimmons. "This allows us to study magnetism in thin films without having to compare a lot of differently grown samples." The new technique, described in Physical Review Letters, will enable novel studies into complex correlations between magnetism and pressure involving a broad class of thin films in a wide range of applications. The thin film materials were developed at ORNL, and complementary measurements were made at Argonne National Laboratory's Advanced Photon Source.--Gage Taylor [Contact: Jeremy Rumsey, (865) 576-2038; rumseyjp@ornl.gov]

Image: https://www.ornl.gov/sites/default/files/2019-06/Reflectometry%20Cell-5737_sm.jpg

Caption: Researchers developed a one-of-a-kind, high-pressure cell and used it on the Magnetism Reflectometer beamline at ORNL's Spallation Neutron Source to study the spatially confined magnetism in a lanthanum-cobalt-oxide thin film. Credit: Genevieve Martin/Oak Ridge National Laboratory, U.S. Dept. of Energy

Quantum--Squeezed light cuts noise

Oak Ridge National Laboratory physicists studying quantum sensing, which could impact a wide range of potential applications from airport security scanning to gravitational wave measurements, have outlined in ACS Photonics the dramatic advances in the field. "Quantum-enhanced microscopes are particularly exciting," ORNL's Ben Lawrie said. "These quantum sensors can 'squeeze' the uncertainty in optical measurements, reducing the uncertainty in one variable while increasing the uncertainty elsewhere." Squeezed light refers to a quantum state where the statistical noise that occurs in ordinary light is greatly reduced. Squeezed atomic force microscopes, or AFMs, could operate hundreds of times faster than current microscopes while providing a nanoscale description of high-speed electronic interactions in materials. This enhancement is enabled by removing a requirement in most AFMs that the microscope operate at a single frequency. Future sensing technologies that harness quantum properties could be deployed as new quantum-enabled devices or as "plug-ins" for existing sensors. [Contact: Sara Shoemaker, (865) 576-9219; shoemakerms@ornl.gov]
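The trade-off Lawrie describes can be made concrete with the textbook variances of a squeezed vacuum state. The sketch below uses the standard quantum-optics relations (with vacuum, i.e. shot-noise, variance set to 1/4); the numbers are illustrative and are not taken from the ACS Photonics paper:

```python
import math

def squeezed_quadrature_variances(r):
    """Quadrature variances of a squeezed vacuum state with squeezing
    parameter r. One quadrature drops below the shot-noise level while
    the conjugate quadrature rises, so the Heisenberg product
    Var(X) * Var(P) stays at its minimum value of 1/16."""
    var_x = math.exp(-2 * r) / 4   # squeezed (noise-reduced) quadrature
    var_p = math.exp(2 * r) / 4    # anti-squeezed quadrature
    return var_x, var_p

shot_noise = 0.25                  # vacuum-state variance in these units
vx, vp = squeezed_quadrature_variances(r=1.0)
squeezing_db = 10 * math.log10(shot_noise / vx)  # noise reduction in dB
```

A measurement that reads out only the squeezed quadrature sees less noise than any classical light source allows, which is the resource the quantum-enhanced microscopes exploit.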

Image: https://www.ornl.gov/sites/default/files/2019-06/Quantum-Squeezed_light_cuts_noise.jpg

Caption: Certain quantum sensors use a "squeezed" state of light to greatly reduce statistical noise that occurs in ordinary light. Credit: Reprinted with permission from B. J. Lawrie, et al., "Quantum Sensing with Squeezed Light." ACS Photonics. Copyright 2019. American Chemical Society.

Credit: 
DOE/Oak Ridge National Laboratory

Lynx in Turkey: Noninvasive sample collection provides insights into genetic diversity

image: Caucasian lynx (Lynx lynx dinniki).

Image: 
Deniz Mengulluoglu, Nurten Salikara

Little is known about the biology and the genetic status of the Caucasian Lynx (Lynx lynx dinniki), a subspecies of the Eurasian lynx distributed across portions of Turkey, the Caucasus region and Iran. To collect baseline genetic, ecological, and behavioural data and assist future conservation efforts, a team of scientists from the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) collected data and samples in a region of Anatolian Turkey over several years. They were particularly interested in whether non-invasive samples (faeces, hair) could be used to assess the genetic diversity of the study population. The results of the genetic analyses indicated an unexpectedly high genetic diversity and lack of inbreeding despite the recent isolation of the study population, a result that would not have been obtained with the use of conventional samples. The data also revealed that females stay near home ranges in which they were born whereas males disperse after separation from their mothers. These insights into the genetics and behaviour of the Caucasian Lynx are published in the scientific journal PLOS ONE.

Among lynx species, the Eurasian Lynx has the widest geographical distribution. Previous research has largely focused on European populations, with the result that there is little known about the subspecies in Asia, such as the Caucasian and Himalayan subspecies. "Scientists still know surprisingly little about their ecological requirements, spatial structure and genetic diversity", says Leibniz-IZW researcher Deniz Mengulluoglu (Department for Evolutionary Ecology). "Our study aimed at collecting baseline genetic, ecological and behavioural data of the lynx population in a mountainous region in north-west Anatolia". Making use of box trapping and non-invasive faecal sampling allowed Mengulluoglu to extract DNA and conduct genetic analyses on a population scale. The lynx population had also been monitored via camera traps at 54 different stations for nearly a decade.

Looking into family relationships of individual lynx, the data revealed that females stay near the territories in which they were born whereas males disperse after separation from their mothers. Such behaviour is known from many mammals and most likely serves to avoid inbreeding. This behaviour - females remaining in close proximity to their mothers' territory - is called female philopatry, and Mengulluoglu and his team confirmed it for this subspecies. "We can conclude from our analyses that territoriality in lynx and philopatry in female lynx can result in low genetic diversity estimates if sampling is done in small study areas via box trapping alone," says Mengulluoglu. "Using faecal samples that were non-invasively collected, we were able to sample more non-territorial individuals, gaining information about an additional component of this lynx population." Individuals are unlikely to be sampled by conventional means unless they have become used to the presence of box traps within their own range (habituation) and are thus willing to enter them. Habituation therefore biases conventional sample collection in favour of resident territorial individuals and their kittens.

A second important finding is that genetic diversity is unexpectedly high in this population. Lynx in north-west Anatolia are isolated from southern and north-eastern populations by a series of natural and human-made barriers. "Population isolation can be harmful and, for example, lead to a loss of genetic variation. But it appears that genetic diversity is in fact substantial at the moment and matches the diversity found in native European populations," states senior author Daniel Foerster from the Leibniz-IZW, Department of Evolutionary Genetics. "Management should therefore focus on maintaining the current level of diversity." As a first step, Mengulluoglu and Foerster recommend identification and conservation of primary lynx habitats and corridors in the region.

"We also need to address threats that can lead to future loss of genetic variation," adds Mengulluoglu. Since this study has set a baseline for comparison with future findings, similar work is needed for the other two Turkish populations in order to determine whether the three big populations are currently connected by gene flow at all, Mengulluoglu and Foerster say. Mengulluoglu is currently working on a long-term lynx monitoring project and the development of a "Turkish Lynx Conservation Action Plan" in collaboration with the Wildlife Department of Turkey.

Credit: 
Forschungsverbund Berlin

Rinsing system in stomach protects the teeth of ruminants

image: Sand sinks down in the rumen and collects in the abomasum, passes through the bowel and is then expelled with the undigested material in the feces.

Image: 
(Illustration: UZH)

"Field-grazing animals always eat some earth and dust along with the plants," says Jean-Michel Hatt, professor at the Clinic for Zoo Animals, Exotic Pets and Wildlife. This is particularly the case in dry regions where the wind blows a lot of dust around, which means extra work for the masticatory organs. His research team has now shown that various mechanisms prevent excessive abrasion of the teeth - thus ensuring the animals' survival.

Short and long teeth in the same habitat

Horses and zebras, for instance, have developed very long teeth in order to compensate for the abrasion caused by dust and sand. Cows and wildebeest, on the other hand, have shorter teeth. "We have always wondered how ruminants living in the same habitat manage with shorter teeth," Hatt explains.

Ruminants have a stomach system with multiple chambers - rumen, reticulum, omasum and abomasum - which use bacteria to digest the plant material they eat. The food is washed by rumen fluid and sorted into material that is already small enough to digest, and larger pieces that are regurgitated to be chewed again. It has long been assumed that the cud to be ruminated has been freed from dust and sand.

Sand collects in the stomach

Jean-Michel Hatt and his team have now for the first time tested the influence of various types of food on dental abrasion. Using computed tomography, the researchers observed in goats that the sand ingested with the plants was not equally distributed throughout the gastrointestinal tract, but collected at specific locations. "We were able to show that there was considerably less sand in the upper rumen, where the material to be ruminated is regurgitated, than in the ingested food itself," Hatt explains.

What happens to the sand? First it sinks down in the rumen and collects in the abomasum, passes through the bowel and is then expelled with the undigested material in the feces. "Organisms that develop such a washing system have a natural way to easily get rid of the rinsed-off material," says Hatt. It is only when animals ingest a large amount of sand all at once - for example through badly produced silage with an unusual amount of soil contamination - that complications can occur.

Ruminants' success model

For Hatt, the finding provides another piece of the puzzle explaining the evolutionary success of the ruminant model. It also explains why the animals do a much less thorough job of chewing their food into small pieces the first time around than they do later, when they are ruminating clean material.

Credit: 
University of Zurich

100-year-old physics model replicates modern Arctic ice melt

image: A simulation of melt pond development.

Image: 
Yi-Ping Ma

The Arctic is melting faster than we thought it would. In fact, Arctic ice extent is at a record low. When that happens--when a natural system behaves differently than scientists expect--it's time to take another look at how we understand the system. University of Utah mathematician Ken Golden and atmospheric scientist Court Strong study the patterns formed by ponds of melting water atop the ice. The ponds are dark, while the ice is bright, meaning that the bigger the ponds, the darker the surface and the more solar energy it absorbs.

So, it's more than a little important to know how the ice's reflectivity, also called albedo, is changing. That's a key component in understanding the balance between solar energy coming in and energy reflected out of the Arctic. Earlier work showed that the presence or absence of melt ponds in global climate models can have a dramatic effect on long term predictions of Arctic sea ice volume.

To model the melt ponds' growth, Golden, Strong and their colleagues tweaked a nearly 100-year-old physics model, called the Ising model, that explains how a material may gain or lose magnetism by accounting for how atoms interact with each other and an applied magnetic field. In their model, they replaced the property of an atom's magnetic spin (either up or down) with the property of frozen (white) or melted (blue) sea ice.
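As a rough illustration of that analogy (not the authors' actual code or parameters), a Metropolis-style Ising sweep can be written in a few lines of Python, with +1 standing for frozen ice and -1 for melt water:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ising-style lattice: +1 = frozen (white) ice, -1 = melted (blue) pond.
N = 64
spins = rng.choice([1, -1], size=(N, N))

def neighbor_sum(s, i, j):
    """Sum of the four nearest neighbours (periodic boundaries)."""
    n = s.shape[0]
    return (s[(i - 1) % n, j] + s[(i + 1) % n, j] +
            s[i, (j - 1) % n] + s[i, (j + 1) % n])

def step(s, J=1.0, h=0.0, T=1.5, sweeps=1):
    """Metropolis updates: a site flips between frozen and melted
    depending on the state of its neighbours (coupling J) and an
    external bias h, the analogue of the applied magnetic field."""
    n = s.shape[0]
    for _ in range(sweeps * n * n):
        i, j = rng.integers(0, n, size=2)
        dE = 2 * s[i, j] * (J * neighbor_sum(s, i, j) + h)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] *= -1
    return s

spins = step(spins, sweeps=20)
pond_fraction = np.mean(spins == -1)  # fraction of melted surface
```

The coupling makes neighbouring sites tend to match, so melted sites cluster into pond-like patches; a negative bias h (favouring the melted state) grows pond_fraction, the quantity that sets how dark, and hence how absorptive, the simulated surface is.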

"The model captures the essential mechanism of pattern formation of Arctic melt ponds," the researchers write, and replicates important characteristics of the variation in pond size and geometry. This work is the first to account for the basic physics of melt ponds and to produce realistic patterns that accurately demonstrate how melt water is distributed over the sea ice surface. The geometry of the melt water patterns determines both sea ice albedo and the amount of light that penetrates the ice, which significantly impacts the ecology of the upper ocean.

Unfortunately, a model like this can't halt the ice from melting. But it can help us make better estimates of how quickly Arctic ice or permafrost is disappearing--and better climate models help us prepare for the warmer future ahead.

Credit: 
University of Utah

UVA scientists use machine learning to improve gut disease diagnosis

Charlottesville, VA -- A study by scientists at the University of Virginia's schools of Engineering and Medicine and its Data Science Institute, published June 14 in the open-access journal JAMA Network Open, shows that machine learning algorithms applied to biopsy images can shorten the time for diagnosing and treating a gut disease that often causes permanent physical and cognitive damage in children from impoverished areas.

In places where sanitation, potable water and food are scarce, there are high rates of children suffering from environmental enteric dysfunction, a disease that limits the gut's ability to absorb essential nutrients and can lead to stunted growth, impaired brain development and even death.

The disease affects 20 percent of children under the age of 5 in low- and middle-income countries, such as Bangladesh, Zambia and Pakistan, but it also affects some children in rural Virginia.

For Dr. Sana Syed, an assistant professor of pediatrics in the UVA School of Medicine, this project is an example of why she got into medicine. "You're talking about a disease that affects hundreds of thousands of children, and that is entirely preventable," she said.

Syed is working with Donald Brown, founding director of the UVA Data Science Institute and W.S. Calcott Professor in the Department of Engineering Systems and Environment, to incorporate machine learning into the diagnostic process for health officials combating this disease. Syed and Brown are using a deep learning approach called "convolutional neural networks" to train computers to read thousands of images of biopsies. Pathologists can then learn from the algorithms how to more effectively screen patients based on where the neural network is looking for differences and where it is focusing its analysis to get results.

"These are the same types of algorithms Google is using in facial recognition, but we're using them to aid in the diagnosis of disease through biopsy images," said Brown.

The machine learning algorithm can provide insights that have evaded human eyes, validate pathologists' diagnoses, and shorten the time between imaging and diagnosis. From a technical engineering perspective, it might also offer a look into data science's "black boxes" by giving clues to how the machine reaches its conclusions.
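The core operation inside such a convolutional network is a filter slid across the image; the network learns thousands of these filters from the biopsy data. The sketch below (toy data and a hand-picked kernel, not the UVA team's model) shows how even one simple filter lights up where a boundary appears in a tiny mock "biopsy patch":

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: the building block a convolutional
    neural network stacks and learns in order to detect features."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A toy patch with a sharp vertical boundary, and a gradient filter
# that responds only where neighbouring pixel values differ.
patch = np.zeros((8, 8))
patch[:, 4:] = 1.0                      # bright tissue on the right half
edge_kernel = np.array([[-1.0, 1.0]])   # horizontal-gradient filter
response = conv2d(patch, edge_kernel)   # peaks along the boundary column
```

Where pathologists look for tissue structures by eye, a trained network combines many such learned responses, and inspecting which regions drive them is what lets the researchers see where the model "focuses."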

But for Syed, it is still about saving lives.

"There is so much poverty and such an unfair set of consequences," she said. "If we can use these cutting-edge technologies and ways of looking at data through data science, we can get answers faster and help these children sooner."

Credit: 
University of Virginia School of Engineering and Applied Science

Balancing data protection and research needs in the age of the GDPR

Scientific journals and funding bodies often require researchers to deposit individual genetic data from studies in research repositories in order to increase data sharing with the aim of enabling the reproducibility of new findings, as well as facilitating new discoveries. However, the introduction of new regulations such as the EU General Data Protection Regulation (GDPR) can complicate this, according to the results of a study to be presented at the annual conference of the European Society of Human Genetics today (Monday).

The investigators collected experiences of the practical challenges that researchers face. "Their attempts to comply with the requirements of funders and journals to deposit data often clash with the GDPR," says Dr Deborah Mascalzoni, Senior Researcher at Uppsala University in Sweden and at EURAC Research, Bolzano, Italy. "We need to follow along the path of open science while taking into account ethical and legal rules if we are to be able to comply with both the law and the funders' requirements."

According to the GDPR, participants in research studies have the right to withdraw consent for the further use of their data. To exercise this right, they must be informed about the processing of their data in research repositories, but the way in which some repositories are set up can make it difficult for researchers to comply with the law. Another challenge lies in the GDPR's requirement that data processing be limited to what is necessary to fulfil a study's objectives, which rules out the long-term retention of data for unspecified uses. The destruction of such data eliminates resources that may be useful in the future and therefore reduces research efficiency.

But solutions to the problem are readily available, say Dr Mascalzoni and her fellow researcher Heidi Beate Bentzen, LLM, from the University of Oslo in Norway. The current difficulties stem from the fact that legal and ethical design are not always embedded into the planning of a system from the beginning. Such design is therefore often added as an afterthought, a kind of 'patch' that fixes one case but is not a long-term solution. And while the GDPR needs to be interwoven into the everyday work of researchers, it is still a fairly new regulation.

"If you introduce a novel technique in the lab, you need to account for it and make it work with your existing systems, for example IT and other lab equipment. It is the same with the GDPR, which in essence is a good piece of law. What we are facing now with the repositories is that they are often based outside the EU or in one EU member state (as in the UK model) without taking into account the regulations of all EU member states," says Dr Mascalzoni.

Researchers currently may face problems submitting their work to some journals due to legal restrictions preventing them from sharing individual-level genetic data in the journals' repositories. They may be able to share data with reviewers and editors, and in some cases with other researchers who request the data for specific purposes, but not more widely. "Journals need to rethink their policies in these cases," says Dr Mascalzoni. "Because participant trust is so crucial to the future of research, we were surprised to find that research repositories had not already changed their modus operandi and that journals and funders had not amended their policies to account for the GDPR."

Another way forward is to recognise mutually equivalent policies, the researchers say. If a law prohibits a practice, individual scientists cannot reasonably be required to choose between breaking it and being excluded, so exceptions and waivers should apply.

"We hope that our work will showcase the urgency of setting up GDPR compliant research repositories and adapting the requirements of funding bodies and journals to the GDPR. If we are to operate in an open, efficient science environment, we need to build a safe place where researchers and patients can participate knowing human rights and research are taken seriously simultaneously", Dr Mascalzoni concludes.

Chair of the ESHG conference, Professor Joris Veltman, Director of the Institute of Genetic Medicine at Newcastle University, Newcastle upon Tyne, UK, said: "Data sharing is essential for scientific progress, especially in the field of human genetics where we need to combine clinical and genetic data to learn about the clinical impact of millions of genetic variations present in each person's genome. This study investigates how the new EU data protection regulations affect data sharing and what should be done to allow for this to be done in a safe and responsible manner."

Credit: 
European Society of Human Genetics

The complex fate of Antarctic species in the face of a changing climate

image: John Spicer collecting intertidal amphipods from South Cove (Rothera Research Station, British Antarctic Survey) looking west to Ryder Bay on the Western Antarctic Peninsula.

Image: 
Simon Morley

Oxygen concentrations in both the open ocean and coastal waters have declined by 2-5% since at least the middle of the 20th century.

This is one of the most important changes occurring in an ocean becoming increasingly modified by human activities, with raised water temperatures, carbon dioxide content and nutrient inputs.

Through this, humans are altering the abundances and distributions of marine species but the decline in oxygen could pose a new set of threats to marine life.

Writing in Philosophical Transactions of the Royal Society B, scientists present support for the theory that marine invertebrates with larger body size are generally more sensitive to reductions in oxygen than smaller animals, and so will be more sensitive to future global climate change.

It is widely believed that the occurrence of gigantic species in polar waters is made possible by the fact that there is more oxygen dissolved in ice cold water than in the warmer waters of temperate and tropical regions.

So as our ocean warms and oxygen decreases, it has been suggested that such oxygen limitation will have a greater effect on larger than smaller marine invertebrates and fish.

The study was conducted by John Spicer, Professor of Marine Zoology at the University of Plymouth, and Dr Simon Morley, an Ecophysiologist with the British Antarctic Survey (BAS).

They investigated how a number of different-sized amphipod species - found in abundance in Antarctic waters, and relatives of the sandhoppers on temperate beaches - performed when the oxygen in the water they were in was reduced.

Overall, there was a reduction in performance with body size, supporting the theory that larger species may well be more vulnerable because of oxygen limitation.

However, the picture is a little more complex than this with evolutionary innovation - such as the presence of oxygen binding pigments in their body fluids to enhance oxygen transport, and novel gas exchange structures in some, but not all, species - to some extent offsetting any respiratory disadvantages of large body size.

Professor Spicer, who has spent more than 30 years examining the effect of climate change on marine organisms, said: "Over the last 50 years, the oxygen in our oceans has decreased by around 2-5% and this is already having an effect on species' ability to function. Unless they adapt, many larger marine invertebrates will either shrink in size or face extinction, which would have a profoundly negative impact on the ecosystems of which they are a part. This is obviously a major cause for concern.

"Our research also shows that some species have evolved mechanisms to compensate for reductions in oxygen, and so it is not always as simple as drawing a link between size and future survival. But it would be foolhardy to pin our hopes on such 'evolutionary rescue'. Many large species will almost certainly be the first casualties of our warming, oxygen-poor ocean."

Dr Morley added: "Marine animals thrive in the Southern Ocean but life in these freezing waters has led to the evolution of many distinct characteristics. These 'strategies', which allow animals to survive in the cold, are expected to make many Antarctic marine invertebrates and fish vulnerable to the impact of climate change. Understanding these impacts will not only help us to predict the fate of marine biodiversity at the poles but will also teach us much about the mechanisms that will determine the survival of species across the world's oceans."

Credit: 
University of Plymouth

Modified enzyme can increase second-generation ethanol production

One of the main challenges of second-generation biofuel production is identifying enzymes produced by microorganisms for use in a "cocktail" of enzymes to catalyze biomass hydrolysis, in which the enzymes act together to break down the carbohydrates in sugarcane trash and bagasse, for example, and convert them into simple sugars for fermentation.

A group of researchers at the University of Campinas (UNICAMP), working in partnership with colleagues at the Brazilian Biorenewables National Laboratory (LNBR) in Campinas, São Paulo State, Brazil, have discovered that Trichoderma harzianum, a fungus found in the Amazon, produces an enzyme with the potential to play a key role in enzyme cocktails.

The enzyme, which is called β-glucosidase and belongs to glycoside hydrolase family 1 (GH1), acts in the last stage of biomass degradation to produce free glucose for fermentation and conversion into ethanol. In the laboratory, however, the researchers observed that high levels of glucose inhibited the activity of β-glucosidase.

"We also found that the enzyme's optimal catalytic activity occurred at 40 °C. This represented another obstacle to use of the enzyme because in an industrial setting, the enzymatic hydrolysis of biomass is performed at higher temperatures, typically around 50 °C," said Clelton Aparecido dos Santos, a postdoctoral researcher at UNICAMP's Center for Molecular Biology and Genetic Engineering (CBMEG) with a scholarship from FAPESP.

Based on an analysis of the enzyme's structure combined with genomics and molecular biology techniques, the researchers were able to modify the structure to solve these problems and considerably enhance its biomass degradation efficiency.

The study resulted from a project with a regular research grant from FAPESP and a Thematic Project also supported by FAPESP. The findings are published in the journal Scientific Reports.

"The modified protein we developed proved far more efficient than the unmodified enzyme and can be used to supplement the enzyme cocktails sold today to break down biomass and produce second-generation biofuels," Santos said.

To arrive at the modified protein, the researchers initially compared the crystal structure of the original molecule with structures of other wild-type β-glucosidases in the GH1 and GH3 glycoside hydrolase families. The results of the analysis showed that glucose-tolerant GH1 glucosidases had a deeper and narrower substrate channel than other β-glucosidases and that this channel restricted glucose access to the enzyme's active site.

Less glucose-tolerant β-glucosidases had a shallower but wider active-site entrance channel, allowing more of the glucose produced in the last stage of biomass degradation to enter. Retained glucose blocks the protein's channel and reduces its catalytic activity.

Based on this observation, the researchers used a molecular biology technique known as site-directed mutagenesis to replace two amino acids that might be acting as "gatekeepers" at the entrance to the enzyme's active site, letting in glucose or blocking it. Analysis of their experiments showed that the modification narrowed the channel to the active site.

"The mutant enzyme's active site shrank to a similar size to that of the glucose-tolerant GH1 β-glucosidases," Santos said.

Enhanced efficiency

The researchers conducted a number of experiments to measure the improved protein's performance in breaking down biomass, especially sugarcane bagasse, an agroindustrial waste with vast potential for profitable use in Brazil. During a research internship abroad with a scholarship from São Paulo Research Foundation - FAPESP, Santos worked with a research group led by Paul Dupree, a professor at the University of Cambridge in the UK, on an analysis of the tailored enzyme's glucose release efficiency when different sources of plant biomass were converted.

The analysis showed that the catalytic efficiency of the modified enzyme was 300% higher than that of the wild-type enzyme in terms of glucose release. Moreover, it was more glucose-tolerant, so more glucose was released from all the tested plant biomass feedstocks. The mutation also enhanced the enzyme's thermal stability during fermentation.

"Mutation of the two amino acids at the active site made the enzyme superefficient. It's ready for industrial application," said Anete Pereira de Souza, a professor at UNICAMP and principal investigator for the project. "One of the enzyme's advantages is that it's produced in vitro and not from a modified fungus or other organism, so it can be mass-produced at relatively low cost."

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo