Earth

Mysterious mechanism of graphene oxide formation explained

image: Natural graphite, used as the precursor for graphene oxide production, is a highly ordered crystalline inorganic material believed to be formed by the decay of organic matter. It is extremely thermodynamically stable and resistant to conversion into the organic-like, metastable graphite oxide.

Image: 
Kazan Federal University

Project lead Ayrat Dimiev has been working on this topic since 2012, when he was part of Professor James Tour's group at Rice University. The first results were published in 2014. That paper, which has amassed 490 citations to date, dealt with the mechanism of turning graphite into graphene oxide (GO). Dr. Dimiev later moved to the private sector and resumed his inquiries in 2017, after returning to Kazan Federal University and opening the Advanced Carbon Nanomaterials Lab. The experimental part of this new publication was conducted by Dr. Ksenia Shukhina and Dr. Artur Khannanov.

Natural graphite, used as the precursor for graphene oxide production, is a highly ordered crystalline inorganic material believed to be formed by the decay of organic matter. It is extremely thermodynamically stable and resistant to conversion into the organic-like, metastable graphite oxide. On this route, it goes through several transformations, each yielding a respective intermediate product. The first intermediate is the graphite intercalation compound (GIC). GICs were intensively studied in the second half of the 20th century and have gained renewed interest in recent years following the discovery of graphene and related materials. The second step of the complex reaction, i.e. the conversion of the GIC to pristine graphite oxide, remained mysterious. The most interesting question concerned the nature of the species that attack carbon atoms to form covalent C-O bonds. For many years, it was conventionally assumed that the attacking species are manganese derivatives such as Mn2O7 or MnO3+. In this study, the authors unambiguously demonstrated that the manganese derivatives do not even penetrate the graphite galleries; they only withdraw electron density from graphene, while the actual species attacking the carbon atoms are water molecules. Thus, the reaction cannot proceed under fully anhydrous conditions, and it speeds up in the presence of small quantities of water.

Another new finding, registered by Ksenia Shukhina for the first time, was the apparent reversibility of C-O bond formation as long as the graphite sample remains intercalated with sulfuric acid. The as-formed C-O bonds can be easily cleaved by laser irradiation, converting GO back to stage-1 GIC in the irradiated areas of the graphite flake. After careful analysis, the authors interpreted this "reversibility" as mobility of the C-O bonds: the bonds do not cleave but freely migrate along the graphene plane over micron-scale distances. The discovered phenomena and the proposed reaction mechanism provide a rationale for a range of well-known but poorly understood experimental observations in graphene chemistry, among them the coexistence of oxidized and graphenic domains in the GO structure.

The results of this fundamental study give a comprehensive view of the driving forces behind the complex processes occurring during the transformation of graphite into graphene oxide. This is the first time such a multifaceted description of a dynamic system has been made, and it is the result not only of newly obtained experimental data, but also of many years of reflection on the issue by the project lead. Understanding these processes will finally make it possible to control the reaction and obtain products with desired properties. This applies not only to the final product, graphene oxide, but also to the entire family of materials obtained by exposing graphite to acidic oxidizing mixtures: expanded graphite, graphene nanoplatelets containing from 3 to 50 graphene sheets, graphite intercalates, and doped graphene. As for graphene oxide itself, its successful use has already been repeatedly demonstrated in areas such as composite materials, selective membranes, catalysis, and lithium-ion batteries. However, the use of graphene oxide is hampered by the high cost of its production and the lack of control over the properties of the synthesized product. The published research addresses both of these problems.

Currently, work is ongoing to study the interaction of graphene oxide with metals. The researchers are firmly convinced that this process is based not just on electrostatic attraction or non-specific adsorption, as is commonly believed, but on a chemical interaction with bond formation through a coordination mechanism. The objective now is to describe the complex mechanism of the rearrangements leading to metal bonding in the dynamic structure of graphene oxide.

Credit: 
Kazan Federal University

Clear strategies needed to reduce bushmeat hunting

image: Researchers studied a wildlife trading network in Côte d'Ivoire, West Africa, and compiled one of the most comprehensive data sets to date.

Image: 
Wild Chimpanzee Foundation

Covid-19 and the associated global economic, health and societal disruptions have shed light on the alarming threat of infectious diseases emerging at an increasing rate. Around 60 percent of emerging infectious diseases are zoonotic, originating in animals; among the most prominent are Sars, Mers, Ebola, HIV and Covid-19. More than two-thirds of those originate in wild species. Many voices have called for stricter restrictions or even a blanket ban on the wildlife trade. This demand is also fuelled by the devastating effects of unsustainable hunting, which threatens hundreds of species.

However, millions of people, especially in the Global South, depend on wild meat ("bushmeat") for their livelihoods. Hunting and consuming wild meat is a vital part of their culture. Therefore, current strategies often aim at trade regulations, rather than the enforcement of strict bans. Even though species vary in their conservation value and their associated risk of transmitting zoonotic diseases, little is known about the reasons why people choose a certain species. "In order to make wildlife trade more sustainable, to prevent uncontrolled disease emergence and species decline, it is essential to know and understand these reasons, and I was surprised how little information existed on these", explains lead author Mona Bachmann, doctoral researcher at the German Centre for Integrative Biodiversity Research and the Max Planck Institute for Evolutionary Anthropology.

An international research team led by Mona Bachmann and Hjalmar Kühl from the Max Planck Institute for Evolutionary Anthropology and the German Centre for Integrative Biodiversity Research studied a wildlife trading network in Côte d'Ivoire, West Africa. Since the wildlife trade is mostly illegal, people often hesitate to share information. With the help of local, trusted informants, often hunters or bushmeat traders themselves, the researchers were able to break the ice. Around 350 hunters, 200 bushmeat traders and 1,000 bushmeat consumers provided detailed insights into the wildlife trade, contributing to one of the most comprehensive data sets for a wildlife trading network to date.

Different species, different risks

In Sub-Saharan Africa alone, the bushmeat trade encompasses over 500 species - from rats to elephants. Around 80 percent of the bushmeat biomass harvested in this region consists of fast-reproducing generalists like rodents and small-bodied duikers or antelopes. These species withstand high levels of hunting and are a crucial component of livelihoods throughout rural areas. Replacing them with alternative animal proteins could substantially increase the exploitation of fish stocks or lead to habitat degradation to provide grazing land. Species that produce fewer offspring, like many primates, are threatened by even low levels of hunting. Since they are comparatively rare, they usually represent only a small percentage of a hunter's catch. Additionally, different risks of zoonotic disease transmission are associated with these species. In general, proximity to humans - either in the phylogenetic sense, as with many primates, or in the spatial sense, as with rodents in densely populated areas - can increase the risk of disease transmission.

Most strategies aim at reducing wild meat consumption in general, irrespective of how common a species is or how likely it is to transmit diseases. However, people might use different species for different purposes. If mitigation strategies neglect this fact, rare species of greater conservation relevance that contribute little to the total bushmeat biomass, like many primates, as well as disease-prone species, would likely be overlooked.

Why people use bushmeat

According to the study, preferences for bushmeat differed widely between hunters, bushmeat traders and consumers, and so did their motivations. People hunt for monetary, nutritional, educational and cultural reasons. Primates, for example, were mostly targeted by commercial hunters for profit and consumed as a luxury meat, while rodents were hunted and consumed when alternative proteins like fish or domestic meat were lacking. Interestingly, hunters and consumers who were aware of the negative ecological consequences of unsustainable bushmeat hunting targeted or consumed primates less. In contrast, bushmeat traders did not change their behaviour.

Broadly applied mitigation approaches are often development-based, educational or cultural. The results show that such interventions may affect taxa like rodents, duikers or primates differently, and different responses are to be expected from the individual user groups of hunters, traders and consumers.

"Up to 60 percent of the consumed meat was from rodents and only seven percent from primates," says Bachmann. "If we considered bushmeat as one generic good, we would have probably identified a lack of protein as the main reason for its use and had thus recommended development-related projects. However, primates were consumed irrespective of the availability of proteins, and economic development could even increase economic resources to purchase the desired luxury product. Hence, to protect primates, development-related strategies need to be complemented by educational strategies."

The researchers therefore urge policy makers to prioritize planning processes: clear goals, such as conservation, development or disease prevention, have to be set first. Assessments then need to identify the problematic behaviour, the user group involved and its motivations. Knowledge and tools from disciplines such as psychology and marketing may help optimize campaigns.

"Scientists and practitioners in conservation often rush to find quick solutions because every delay comes at high cost," says Bachmann. "In West and Central Africa, this often leads to one-size-fits-all solutions. However, our results suggest that many conservation strategies may be tailored to fit the wrong targets. Poor planning not only hampers the effectiveness of strategies but can also cause harm and waste the already scarce resources available for biodiversity protection." Hjalmar Kühl adds: "If we really want to solve the problem of the overexploitation of wildlife and reduce the threats associated with it, for species conservation and human well-being, we need to tackle it at its roots. We cannot continue ignoring this problem, but we need to invest resources and develop strategies that really help to create a more sustainable human-wildlife co-existence."

Credit: 
Max Planck Institute for Evolutionary Anthropology

Antarctica more widely impacted by humans than previously thought

image: A Weddell seal (Leptonychotes weddellii), a large and abundant seal species, sunbathing in Antarctica. Antarctica hosts a diversity of organisms that may be sensitive to even passing disturbance.

Image: 
Steven Chown

Antarctica is considered one of the Earth's largest, most pristine remaining wildernesses. Yet since its formal discovery 200 years ago, the continent has seen accelerating and potentially impactful human activity.

How widespread this activity is across the continent has never been quantified. We know Antarctica has no cities, agriculture or industry. But we have never had a good idea of where humans have been, how much of the continent remains untouched or largely unimpacted, and to what extent these largely unimpacted areas serve to protect biodiversity.

A team of researchers led by Monash University, including Dr Bernard Coetzee from the Global Change Institute at the University of the Witwatersrand, Johannesburg (Wits University), has changed all of that. Using a data set of 2.7 million human activity records, the team showed just how extensive human use of Antarctica has been over the last 200 years. The research was published in the journal Nature.

With the exception of some large areas mostly in the central parts of the continent, humans have set foot almost everywhere.

Although many of these visited areas have only been negligibly affected by people, biodiversity is not as well represented within them as it should be.

"We mapped 2.7 million human activity records from 1819 to 2018 across the Antarctic continent to assess the extent of wilderness areas remaining and its overlap with the continent's biodiversity," says Coetzee, a conservation scientist at Wits University. Based in Skukuza in the Kruger National Park in South Africa, Coetzee helped conceptualise the study and collated a spatial database from multiple sources to map the extent of human activity in Antarctica.

"In a region often thought of as remote, we showed that in fact human activity has been extensive, especially in ice-free and coastal areas where most of its biodiversity is found. This means that 'wilderness' areas do not capture many of the continent's important biodiversity sites, but that an opportunity exists to conserve the last of the wild."

The study found that only 16% of the continent's Important Bird Areas, areas identified internationally as critical for bird conservation, are located within negligibly impacted areas, and little of the total negligibly impacted area is represented in Antarctica's Specially Protected Area network.

High human impact areas, for example some areas where people build research stations or visit for tourism, often overlap with areas important for biodiversity.

Lead author, Rachel Leihy, a PhD student in the Monash School of Biological Sciences, points out that "While the situation does not look promising initially, the outcomes show that much opportunity exists to take swift action to declare new protected areas for the conservation of both wilderness and biodiversity."

"Informatics approaches using large data sets are providing new quantitative insights into questions that have long proven thorny for environmental policymakers," says Steven Chown, the corresponding author based at Monash University.

"This work offers innovative ways to help the Antarctic Treaty Parties take forward measures to secure Antarctica's Wilderness."

The transdisciplinary team delivering this work includes researchers from Australia, the Netherlands, New Zealand, and South Africa.

Credit: 
University of the Witwatersrand

CVIA has just published a new issue, Volume 4 Issue 4

Beijing, 10 July 2020: The journal Cardiovascular Innovations and Applications (CVIA) has just published a new issue, Volume 4 Issue 4.

This issue brings together important research papers from leading cardiologists in the US, China, and Africa, including important new research on the identification of novel TTN mutations and the discovery of a digenic mutation.

Papers in the issue are as follows:

RESEARCH PAPERS

Ying Peng, Jinxin Miao, Yafei Zhai, Guangming Fang, Chuchu Wang, Yaohe Wang, Xiaoyan Zhao and Jianzeng Dong
Identification of Novel TTN Mutations in Three Chinese Familial Dilated Cardiomyopathy Pedigrees by Whole Exome Sequencing (http://ow.ly/NnJx30qYzFX)

Zhaowei Zhu, Yanan Guo, Xuping Li, Shuai Teng, Xiaofan Peng, Pu Zou and Shenghua Zhou
Glycyrrhizic Acid Attenuates Balloon-Induced Vascular Injury Through Inactivation of RAGE Signaling Pathways (http://ow.ly/xC7B30qYzJx)

Lutfu Askin, Hakan Duman, Ali Ozyildiz and Okan Tanriverdi
Association of Serum Chemerin Levels with Coronary Artery Disease: Pathogenesis and Clinical Research (http://ow.ly/yQhd30qYzL5)

Yafei Zhai, Jinxin Miao, Ying Peng, Guangming Fang, Chuchu Wang, Yaohe Wang, Xiaoyan Zhao and Jianzeng Dong
Discovery of Digenic Mutation, KCNH2 c.1898A >C and JUP c.916dupA, in a Chinese Family with Long QT Syndrome via Whole-Exome Sequencing (http://ow.ly/U27s30qYzNl)

Mohamed Sobhy, Ahmed Elshal, Noha Ghanem, Hosam Hasan-Ali, Nabil Farag, Nireen Okasha, El Sayed Farag, Mohamed Sadaka, Hisham Abo El Enein, Sameh Salama, Hazem Khamis, Khaled Shokry, Hany Ragy, Amany Elshorbagy and Radwa Mehanna
Development of Primary Percutaneous Coronary Intervention as a National Reperfusion Strategy for Patients with ST-Elevation Myocardial Infarction and Assessment of Its Use in Egypt (http://ow.ly/psWc30qYzPQ)

REVIEW

Israel Oluwasegun Ayenigbara
The Accumulation of Visceral Fat and Preventive Measures among the Elderly (http://ow.ly/1pRW30qYzQo)

COMMENTARIES

C. Richard Conti
Some Issues Related to STEMI and NSTEMI (http://ow.ly/jidg30qYzRH)

C. Richard Conti
Chronic Effusive Pericarditis and Chronic Constrictive Pericarditis (http://ow.ly/uOgJ30qYXqq)

Credit: 
Compuscript Ltd

Significant drop in stroke recurrence found among Mexican Americans

DALLAS, JULY 16, 2020 -- The rate of recurrent strokes significantly declined among Mexican Americans in a long-term study, according to new research published today in Stroke, a journal of the American Stroke Association, a division of the American Heart Association.

Recurrent stroke rates declined faster in Mexican Americans than in non-Hispanic whites. By the end of the study in 2013, the difference between the two groups had vanished. Mexican Americans had a significant reduction in stroke recurrence even though the death rate from stroke remained steady.

"Throughout this long-term study, this is the first time that we have encountered an improvement in any major marker of ethnic stroke disparities," said Lewis Morgenstern, M.D., lead study author and professor of neurology and epidemiology at the University of Michigan's Medical School and School of Public Health in Ann Arbor, Michigan.

Mexican Americans make up 63% of Hispanic Americans, the largest minority population in the U.S. Currently, 9.2% of Hispanic Americans are older than 65, and, by 2040, this percentage is expected to climb to 15.8%. This population will have a substantial risk for stroke and stroke recurrence, making secondary stroke prevention extremely important.

The Brain Attack Surveillance in Corpus Christi (BASIC) project started on Jan. 1, 2000, and is an ongoing study focused on stroke surveillance. The study's 3,571 participants self-identified as Mexican American or non-Hispanic white in Nueces County, Texas, and were from a predominantly non-immigrant population composed almost entirely of second- and third-generation U.S. citizens. The participants were 56% Mexican American and 50% women, and all were at least 45 years old. First-ever strokes occurred between Jan. 1, 2000, and Dec. 31, 2013. Records were cross-referenced with Texas Department of Health death certificates and adjusted for age, sex, hypertension, diabetes, smoking, atrial fibrillation, insurance and cholesterol.

In this study, stroke was defined as a clot disrupting blood flow to the brain (ischemic) or bleeding in the brain (hemorrhagic). Cases were followed to determine one- and two-year stroke recurrence. Recurrent strokes were observed in 206 patients during the 1-year follow-up period, and 683 deaths occurred before any recurrence. In the 2-year follow-up, 293 recurrent strokes were observed, and 883 deaths occurred before another stroke.

Researchers found:

among Mexican Americans, incidence of 1-year recurrence was 9.26% in 2000 and dropped to 3.42% in 2013;

among non-Hispanic whites, the incidence of 1-year recurrence was 5.67% in 2000 and reduced to 3.59% in 2013; and

the recurrence trend changes from 2000 through 2013 were significant among Mexican Americans but not among non-Hispanic whites.

"These results suggest that stroke recurrence continues to decline in both populations, but faster in Mexican Americans, perhaps because their rates were so high to begin with," Morgenstern said. "Individuals should work to reduce their chance of having a stroke by following national healthy living guidelines such as the American Heart Association's Life's Simple 7."

Although this is a community-based study with multiple efforts to ensure accuracy, it is still one community, and the results may not be generalizable, particularly to immigrant Hispanic populations. There is also a small chance that patients may have had a recurrent stroke during their initial hospitalization or after they left the community. The effects of sex differences were not analyzed in this study.

According to the American Stroke Association, the most common symptoms of stroke are known by the acronym F.A.S.T.: face drooping, arm weakness, speech difficulty and time to call 9-1-1. Bystanders should call 9-1-1 for immediate medical attention even if the symptoms go away.

Credit: 
American Heart Association

'Blinking' crystals may convert CO2 into fuels

image: The arrows point to titanium dioxide nanocrystals lighting up and blinking (left) and then fading (right).

Image: 
Tewodros Asefa and Eliska Mikmekova

Imagine tiny crystals that "blink" like fireflies and can convert carbon dioxide, a key cause of climate change, into fuels.

A Rutgers-led team has created ultra-small titanium dioxide crystals that exhibit unusual "blinking" behavior and may help to produce methane and other fuels, according to a study in the journal Angewandte Chemie. The crystals, also known as nanoparticles, stay charged for a long time and could benefit efforts to develop quantum computers.

"Our findings are quite important and intriguing in a number of ways, and more research is needed to understand how these exotic crystals work and to fulfill their potential," said senior author Tewodros (Teddy) Asefa, a professor in the Department of Chemistry and Chemical Biology in the School of Arts and Sciences at Rutgers University-New Brunswick. He's also a professor in the Department of Chemical and Biochemical Engineering in the School of Engineering.

More than 10 million metric tons of titanium dioxide are produced annually, making it one of the most widely used materials, the study notes. It is used in sunscreens, paints, cosmetics and varnishes, for example. It's also used in the paper and pulp, plastic, fiber, rubber, food, glass and ceramic industries.

The team of scientists and engineers discovered a new way to make extremely small titanium dioxide crystals. While it's still unclear why the engineered crystals blink and research is ongoing, the "blinking" is believed to arise from single electrons trapped on the titanium dioxide nanoparticles. At room temperature, the electrons - surprisingly - stay trapped on the nanoparticles for tens of seconds before escaping, then become trapped again and again in a continuous cycle.

The crystals, which blink when exposed to a beam of electrons, could be useful for environmental cleanups, sensors, electronic devices and solar cells, and the research team will further explore their capabilities.

Credit: 
Rutgers University

Graphene-Adsorbate van der Waals bonding memory inspires 'smart' graphene sensors

image: (a) Adsorbed CO2 molecules on graphene sensor (b) van der Waals (vdW) interaction between adsorbed molecules and graphene at zero electric field (c) vdW interaction between adsorbed molecules and graphene with electric field.

Image: 
JAIST

Monolayer graphene, an atom-thick sheet of carbon, has found immense application in diverse fields, including chemical sensors that detect single-molecule adsorption events electronically. Monitoring the changes that physisorbed molecules induce in the electrical response of graphene has therefore become ubiquitous in graphene-based sensors. Electric-field tuning of the interaction between physisorbed molecules and graphene enhances gas sensing, owing to a unique field-dependent charge transfer between the adsorbed gas and graphene. Molecular identification in graphene sensors was predicted on the basis of this electrically tunable charge transfer, which serves as a signature for different adsorbed molecules. To achieve such functionality, however, an understanding is needed of the gas adsorption/desorption events and of whether the graphene-gas molecule interaction is retained after the electric field is turned off. Until now, the graphene-gas molecule bonding interactions were assumed to be randomized by ambient thermal energy once the field is off - not surprising, since these interactions are van der Waals (vdW) bonds and so inherently weak. This assumed thermal randomization, however, had never been verified experimentally and was a major obstacle to molecular identification based on electrically tunable charge transfer in graphene gas sensors.

To clarify the bonding retention of adsorbed gas molecules on graphene with and without electric-field tuning, Osazuwa Gabriel Agbonlahor (current doctoral student), Tomonori Imamura (graduated master's student), Dr. Manoharan Murugananthan (Senior Lecturer), and Professor Hiroshi Mizuta of Mizuta Laboratory at the Japan Advanced Institute of Science and Technology (JAIST) monitored the time-dependent decay of the vdW interaction of adsorbed CO2 molecules on graphene at different electric fields. Using the electric field to tune the interaction between the adsorbed gas and graphene, they monitored the charge transfer between the adsorbed CO2 molecules and graphene both while the tuning field was on and after it was turned off. Remarkably, the graphene-gas molecule van der Waals interactions were retained for hours after the electric field was turned off, with both the charge transfer and the carrier scattering remaining characteristic of the previously applied field's magnitude and direction - that is, the adsorbed CO2 molecules demonstrated a 'vdW bonding memory'. Owing to this bonding memory, the charge-transfer and scattering properties of the adsorbed gas molecules can be studied hours after the field is turned off, which is critical for identifying adsorbed molecules by their signature charge-transfer response to an applied electric field. Furthermore, the long bonding retention time (over 2 h) of these electrically tuned adsorbed molecules sets graphene-based sensors apart as platforms for developing 'smart' sensors suitable for 'beyond-sensing' applications in memory devices and conformational switches.

Credit: 
Japan Advanced Institute of Science and Technology

Solid-state intramolecular motions in continuous fibers for fluorescent humidity sensor

image: Schematic illustration of fluorescence variation of AIE/polymer fiber sensor when exposed to water molecules.

Image: 
©Science China Press

Taking advantage of the intramolecular motion of D-A based aggregation-induced emission (AIE) molecular rotors and one-dimensional (1D) polymer fibers, the authors developed highly sensitive optical fiber sensors that respond to ambient humidity rapidly and reversibly with an observable chromatic fluorescence change. Moist environments induce swelling of the polymer fibers, activating intramolecular motions of the AIE molecules, which results in red-shifted fluorescence and a linear response to ambient relative humidity (RH). The polymer fiber provides a process-friendly architecture and a physically tunable medium for the embedded AIE molecules, making it possible to manipulate their fluorescence response characteristics.

Intramolecular motions of AIE molecules driven by ambient humidity. The D-A based AIE molecules contain three segments: an electron-donating tetraphenylethene (TPE) group, an electron-accepting pyridinium salt unit, and a spacer unit consisting of a single (TPE-P) or double (TPE-EP) bond. The highly twisted TPE group with four phenyl rings ensures intramolecular twisting motion in the solid state, while intramolecular rotation of the D-A subgroups, based on the twisted intramolecular charge-transfer (TICT) effect, achieves local polarity sensing. By combining the AIE and TICT effects, both governed by intramolecular motions, a sensitive humidity sensor is developed by embedding the AIE molecules into a water-capturing polymer.

Dry-spinning AIE/polymer microfiber sensor. Dry-spinning technology is utilized to fabricate AIE/polymer microfibers, with polyvinylpyrrolidone (PVP) chosen as the supporting material. The AIE/PVP micro-fibrous film shows a chromatic, linear fluorescence response to ambient humidity, serving as a sensitive woven fabric for spatio-temporal humidity mapping. Microfibers assembled with a UV silicone tube can be integrated into a fiber-shaped flexible device, which can act as a built-in sensor for easy readout of RH and also serve as color-tunable lighting for smart displays.

Electro-spinning AIE/polymer nanofiber sensor. Polyacrylic acid (PAA) nanofibers from electro-spinning, characterized by large surface area, high porosity, and fine flexibility, are used as a physical medium for the AIE molecules to achieve instant humidity response. The nanofibrous nonwoven membranes show ultrafast response and recovery. The mechanism of intramolecular motion of AIE molecules has thus been demonstrated for developing a highly sensitive AIE/polymer fiber sensor. The fluorescence response performance is amplified by refining the fiber structure and changing the chemical structure of the polymers. Additionally, the fibrous sensors can be used to build various architectures, facilitating multifunctionality in terms of spatial humidity mapping, high device-integration capability, and touchless positioning. The strategy of combining AIE with a 1D fiber structure will not only provide a new route for humidity sensing, but also serve as artificial nerves to sense a wide range of environmental stimuli.

Credit: 
Science China Press

Timing key in understanding plant microbiomes

image: Black cottonwood trees with holes punched in the leaves; the leaf tissue collected from the holes is used for DNA extraction.

Image: 
Devin Leopold

CORVALLIS, Ore. - Oregon State University researchers have made a key advance in understanding how timing impacts the way microorganisms colonize plants, a step that could provide farmers an important tool to boost agricultural production.

The findings, published in the journal Current Biology, will help scientists better understand the plant microbiome, which consists of hundreds of thousands of microorganisms that live in and on plants and contribute to their health and productivity.

While scientists have studied microbes living in plants for decades, it has only been in the last 10 years or so that advances in DNA sequencing technology have made it possible to characterize the unseen diversity of plant microbiomes with more precision. The surge in research involving the plant microbiome coincides with a spike in research involving the human microbiome and its role in human health and disease.

Understanding how plant microbiomes form is important because some microorganisms are beneficial and others are harmful to plants. Some factors shaping microbiome composition are predictable, like the relative humidity of the environment, or the thickness of the protective, waxy layer of cells on the leaf surface.

However, much of the variation in microbiome composition remains unexplained. The Oregon State research, led by Posy Busby, an assistant professor in the Department of Botany and Plant Pathology, and Devin Leopold, a postdoctoral fellow in her lab, unravels some of those mysteries.

In this study, Busby and Leopold explored one process that likely contributes to this unexplained variation: the order in which microorganisms colonize plants.

The research was unique in that the team studied plants with different genetic backgrounds, in this case black cottonwood trees collected throughout the Cascade Range in the Pacific Northwest, and also exposed those plants to leaf rust, a disease-causing fungus.

They found that the order in which microorganisms reached the plant had a significant impact on microbiome composition and on how susceptible the plant was to disease. Additionally, the researchers found that this random variation in the arrival order of microorganisms may be more important for highly disease-susceptible plants, which have not evolved their own effective defense strategies.

Farmers have a long history of applying beneficial microorganisms to crops. This research provides them with more information about which plant cultivars may be best suited for microbial biocontrol, and how to best time treatments to prevent disease in plants.

"Our hope is that our findings will translate into tools for combatting plant disease that aren't limited to planting only disease-resistant cultivars," Busby said. "Because maintaining diversity in our crops is essential to the long-term sustainability of our agricultural systems."

Credit: 
Oregon State University

Ultra-black skin allows some fish to lurk unseen

image: This deep-sea dragonfish has ultra-black skin capable of absorbing the bioluminescent light that might blow its cover.

Image: 
Photo by Karen Osborn, Smithsonian National Museum of Natural History.

DURHAM, N.C. -- If there were a stagehand of the sea, wearing black to disappear into the darkness backstage, it might be the dragonfish. Or the common fangtooth.

These fish live in the ocean's inky depths where there is nowhere to take cover. Even beyond the reach of sunlight, they can still be caught in the glow of bioluminescent organisms that illuminate the water to hunt. So they evade detection with a trick of their own: stealth wear.

Scientists report that at least 16 species of deep-sea fish have evolved ultra-black skin that absorbs more than 99.5% of the light that hits them, making them nearly impossible to pick out from the shadows.

These fish owe their disappearing act to tiny packets of pigment within their skin cells called melanosomes. The melanosomes of ultra-black fish are differently shaped and arranged, on a microscopic level, compared with regular black fish, says a study led by Duke University and the Smithsonian National Museum of Natural History.

The researchers say the work could lead to new light-trapping materials for use in applications ranging from solar panels to telescopes.

For the paper, to be published July 16 in the journal Current Biology, the team used a trawl net and a remotely operated vehicle to scoop up 39 black fish swimming up to a mile deep in the waters of Monterey Bay and the Gulf of Mexico, and bring them up to a ship to study.

Using a spectrometer to measure the amount of light reflected off the fishes' skin, the researchers identified 16 species that reflected less than 0.5% of light, making them some 20 times darker and less reflective than everyday black objects.
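The "some 20 times darker" comparison follows from simple arithmetic on reflectance values. A minimal sketch, assuming for illustration (this figure is not stated in the release) that everyday black objects reflect around 10% of incident light:

```python
# Back-of-the-envelope check of the reflectance comparison.
# The 10% figure for everyday black objects is an assumption for
# illustration; the press release states only the ratio (~20x).

ultra_black_reflectance = 0.005   # <0.5% of incident light, per the study
everyday_black_reflectance = 0.10  # assumed typical black paper/fabric

ratio = everyday_black_reflectance / ultra_black_reflectance
print(f"Ultra-black fish reflect ~{ratio:.0f}x less light")  # ~20x
```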

"Ultra-black arose more than once across the fish family tree," said first author Alexander Davis, a biology Ph.D. student in Sonke Johnsen's lab at Duke.

The darkest species they found, a tiny anglerfish not much longer than a golf tee, soaks up so much light that almost none -- 0.04% -- bounces back to the eye. Only one other group of black animals, the birds-of-paradise of Papua New Guinea with their ultra-dark plumage, are known to match them.

Getting decent photos of these fish onboard the ship was tough; their features kept getting lost. "It didn't matter how you set up the camera or lighting -- they just sucked up all the light," said research zoologist Karen Osborn of the Smithsonian National Museum of Natural History.

The team found that, when magnified thousands of times under electron microscopes, normal black skin and ultra-black skin look very different. Both have tiny structures within their cells that contain melanin -- the same pigment that lends human skin its color. What sets ultra-black fish apart, they say, is the shape and arrangement of these melanosomes.

Other cold-blooded animals with normal black skin have tiny pearl-shaped melanosomes, while ultra-black ones are larger, more tic-tac-shaped. And ultra-black skin has melanosomes that are more tightly packed together, forming a continuous sheet around the body, whereas normal black skin contains unpigmented gaps.

The researchers ran some computer models, simulating fish skin containing different sizes and shapes of melanosomes, and found that ultra-black melanosomes have the optimal geometry for swallowing light.

Melanosomes are packed into the skin cells "like a tiny gumball machine, where all of the gumballs are of just the right size and shape to trap light within the machine," Davis said.

Their ultra-black camouflage could be the difference between eating and getting eaten, Davis says. By being blacker than black, these fish manage to avoid detection even at ranges six times shorter.

Credit: 
Duke University

Study finds link between too much or too little sleep and increased death rates in patients with or without diabetes

New research published in Diabetologia (the journal of the European Association for the Study of Diabetes [EASD]) reveals that too much or too little sleep in people with type 2 diabetes (T2D) is linked to sharply increased death rates, with the effect much larger than that found in the non-diabetic population. The study, based on data from the USA, is by Dr Chuanhua Yu, School of Health Sciences, Wuhan University, Wuhan, Hubei, China, and Dr Xiong Chen, Department of Endocrinology, The First Affiliated Hospital of Wenzhou Medical University, Wenzhou, Zhejiang, China, and colleagues.

While previous research has shown that extreme (too much or too little) sleep duration is linked to increased mortality in the general population, in this new study the authors wanted to examine how the presence of diabetes affected this association.

The authors used data from 273,029 adults, including 248,817 without diabetes and 24,212 with T2D, who participated in the US National Health Interview Survey from 2004 to 2013 and had linked mortality data up to the end of 2015. Sleep duration was self-reported, with participants asked "on average how long do you sleep each day" (5 hours or less, 6, 7, 8, 9, or 10 or more hours/day). The relationship between sleep duration and mortality was investigated using computer modelling, with adjustments for demographics, body mass index, lifestyle behaviours and clinical variables.

As expected, regardless of sleep duration, death rates were higher in people with T2D than in those without (see table 2, full paper). The mortality rate for people with T2D at the 'ideal' level of 7 hours of sleep was 138 per 10,000 person-years, compared with 215 for 5 hours or less and 364 for 10 hours or more. After adjustment of the data, the authors used people without diabetes who slept 7 hours as the reference group. Compared with this group, people with T2D who slept 7 hours had a 42% increased risk of death; those with T2D sleeping 10 or more hours had a 2.2-fold increased risk, while those with T2D sleeping 5 hours or less had a 63% increased risk of death.
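As a back-of-the-envelope illustration (not the paper's adjusted analysis), crude rate ratios can be computed directly from the per-10,000-person-year rates quoted here and in the next paragraph, taking non-diabetics sleeping 7 hours (78 per 10,000 person-years) as the reference:

```python
# Crude mortality rate ratios from the quoted rates (per 10,000
# person-years). These are unadjusted, so they differ from the paper's
# reported risks, which adjust for demographics, BMI, lifestyle and
# clinical variables.

reference_rate = 78  # people without diabetes sleeping 7 h/day

t2d_rates = {"7 h": 138, "<=5 h": 215, ">=10 h": 364}

for sleep, rate in t2d_rates.items():
    print(f"T2D, {sleep} sleep: crude rate ratio {rate / reference_rate:.2f}")
```

Note that the crude ratios are larger than the adjusted risks reported by the authors, which is expected once confounders are accounted for.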

A similar pattern, though less pronounced, was seen in the group without T2D. For those with the ideal 7 hours of sleep, the death rate was 78 per 10,000 person-years, compared with 122 for 5 hours or less and 256 for 10 hours or more. Too much or too little sleep did increase the death rate, but not as much as in the group with T2D. Compared with those who slept 7 hours, those who slept 5 hours or less were at a 33% increased risk of death, and those with 10 hours or more had a 90% (almost doubled) increased risk of death.

Among people with T2D, there were also some links between sleep duration and cause-specific mortality. For cancer mortality, people sleeping 5 hours or less, 8 hours, or 10 hours or more per day had 41%, 26% and 59% greater risk of mortality, respectively, compared with 7 hours per day (see table 3). The association between sleep duration and CVD mortality was statistically significant only for the longest sleep duration group (a 74% increased risk for 10 or more hours per day compared with 7 hours per day). The longest sleep group (10 hours or more) also showed increased risks of stroke mortality (3 times) and Alzheimer's disease mortality (2.6 times) compared with 7 hours of sleep.

The study also found that the associations between the shortest and longest sleep durations and all-cause mortality, relative to 7 hours per day, were stronger among people with T2D diagnosed before the age of 45 than among those diagnosed after 45. The effect of extreme sleep duration on mortality was also generally more pronounced in those who had had diabetes for more than 10 years than in those with a shorter disease duration.

The authors point to previous research that shows insufficient sleep results in a 40% slower glucose clearance rate in the body, and activates the sympathetic (involuntary) part of the nervous system, which in turn can aggravate a person's status of insulin resistance, obesity, or high blood pressure. "Therefore, sleep deprivation in people with T2DM is likely to increase complications and affect the control and management of blood glucose which drive excess mortality risk," they say.

They add, however, that sleep is a complex phenomenon and extreme sleep duration may be a reflection of poorer health status and reduced functioning. For example, the finding that people with T2DM who sleep longer have an increased mortality risk may be linked to the possibility that these individuals experience greater diabetes-related complications that require more rest or long-term bed rest. Another possible explanation of these findings is that longer sleep duration has been associated with chronic inflammatory responses which increase mortality risk.

They say: "For people with T2DM, as per the general population, 6 to 8 hours of sleep is recommended on account of reducing mortality risk. Sleep interventions as an addition to standard diabetes treatment may warrant further attention."

They conclude: "This study provides preliminary evidence that the associations between sleep duration and mortality are different between people with and without diabetes. Patients with diabetes sleeping for less than or in excess of 7 hours had an increased risk of all-cause and cause-specific mortality, while too much or too little sleep also increased absolute death rates in people without diabetes, but to a lesser extent. The association was more prominent in those with younger age at disease onset. These patients may require greater medical attention that targets sleep and lifestyle to reduce the risks of adverse health outcomes."

Credit: 
Diabetologia

Revealing Brazil's rotten agribusinesses

(Washington, DC) 16 July 2020--Following reports that Brazil's current deforestation rate--1 million hectares--is the highest in a decade, a peer-reviewed study published in Science today finds that 18-22%, and possibly more, of Brazil's annual exports to the European Union are potentially contaminated with illegal deforestation, while identifying for the first time the specific producers of soy in Brazil responsible for "poisoning the barrel." Unveiling these "bad apples" among soy and beef producers, but also revealing that a vast majority--some 80% of the country's farmers--abide by the Forest Code law, the study suggests deforestation-free Brazilian agricultural production is within reach, if leaders act.

"Until now, agribusiness and the Brazilian government have claimed that they cannot monitor the entire supply chain, nor distinguish the legal from the illegal deforestation," said Raoni Rajão, a professor at the Universidade Federal de Minas Gerais (UFMG) in Belo Horizonte, Brazil, and the lead author of The Rotten Apples of Brazil's Agribusiness. "Not anymore. We used freely available maps and data to reveal the specific farmers and ranchers clearing forests to produce soy and beef ultimately destined for Europe. Now, Brazil has the information it needs to take swift and decisive action against these rule-breakers to ensure that its exports are deforestation-free. Calling the situation hopeless is no longer an excuse."

The 12 researchers from Brazil, Germany and the U.S. who wrote the study developed high-powered software to analyze 815,000 individual rural properties in order to assess where illegal deforestation associated with soy and beef production is taking place and how much of these products is reaching the EU. The article also estimates the greenhouse gas emissions from deforestation linked with soy and beef exports, pointing out the shared responsibilities of international buyers.

The Trouble with Trade

The article's findings come at a transformational moment in the history of the Amazon Basin, most of which falls on the national territory of Brazil. Led by President Jair Bolsonaro, who came into power in January 2019, the new administration has encouraged the clear-cutting of forests on private properties and public lands--in defiance of Brazil's Forest Code law and the soy moratorium agreement, which bans the clearing of forests for soy production. The government has also dismantled a series of environmental protections meant to stop illegal deforestation in conservation units and Indigenous Peoples' lands, staunch protectors of the country's forests.

International buyers of Brazil's agricultural commodities have long expressed concern that products contaminated with deforestation could be reaching their countries. EU leaders also have openly criticized the Brazilian government, bolstering demands for the boycott of Brazilian products in response to the forest fires that ripped through the country in August 2019.

"Pummeled by the impacts of political signals encouraging the clearing of forests, mostly for land grabbing, Brazil's forests are at a breaking point," said Professor Britaldo Soares-Filho, a co-author also from UFMG. "It's critical for Europe to use its trade might and purchasing power to help roll back this tragic dismantling of Brazil's environmental protection, which has implications for the global climate, local people and the country's valued ecosystem services. With this research, policymakers in Brussels finally have the information they need to assess the extent of the problem in the Brazilian soy and beef sectors. It's time for them to act."

The European Union has laid out a plan for putting policies in place banning the import of products stemming from illegal deforestation, and they are also negotiating a lucrative trade deal with Mercosur, a bloc of South American countries that includes Brazil. Though this deal is facing increasing scrutiny in Europe, with calls for additional negotiations to add protections for forests and rights, the EU's relationship with Brazil puts it in a position to help the country end illegal deforestation. The evidence laid out in the report crystalizes where efforts should be directed.

"Right now, Brazil's enforcement of its own forest protection laws isn't strong enough to guarantee compliance with the European Union's strict environmental standards for trading partners," said Dr. Felipe Nunes from UFMG. "But if Brazil is serious about its trading ambitions, it can join forces with the EU to use its own available tools, such as the CAR (the country's online environmental registry) to end illegal deforestation linked to soy and beef supply chains. Brazil already has the means. All that's needed is the political will."

Contaminated Soy and Tainted Beef

The article finds that producers on 45% of rural Amazon properties and 48% of rural Cerrado properties that supply soy and beef for exports are failing to comply with limits on deforestation laid out in Brazil's Forest Code. Of 53,000 properties producing soy in both regions, 20% have grown soy on land deforested after 2008; the authors estimate that half of this soy was produced on recently illegally deforested land.

Roughly 41%, or 13.6 million metric tons, of the EU's soy imports come from Brazil each year. Some 69% come from the Amazon and Cerrado regions. According to the study, about two million tons of soy grown on properties with illegal deforestation may have reached EU markets annually during the period of analysis, 500 thousand of which came from the Amazon. In most cases, the recently cleared areas are not used to grow soy in order to comply with the rules of the moratoria. But this has not prevented soy farms from clearing their lands illegally for pasturelands and other crops.

With respect to beef, the EU imports about 189,000 metric tons annually. The authors found that of a total of 4.1 million head traded to slaughterhouses, at least 500 thousand head came directly from properties that may have deforested illegally. This represents 2% of beef produced in the Amazon and 13% in the Cerrado. But the largest problem lies with indirect cattle suppliers, which provide steers to fattening operations and are monitored by neither the large slaughterhouses nor the government. By analyzing the flows of cattle between ranches, the study estimates that some 60% of all slaughtered head could have been potentially contaminated with illegal deforestation (44% in the Amazon and 66% in the Cerrado) at some point in the supply chain.
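The indirect-supplier analysis described above amounts to a reachability question over the cattle-trade graph: a slaughterhouse supplier is potentially contaminated if any upstream ranch in its supply chain deforested illegally. A minimal sketch of that idea, with all ranch names and trade links hypothetical rather than taken from the study's data:

```python
# Sketch of the indirect-supplier idea: a ranch is "potentially
# contaminated" if it deforested illegally OR bought cattle, directly
# or indirectly, from a ranch that did. All names and links here are
# hypothetical illustration, not the study's dataset.

suppliers = {            # ranch -> ranches it buys cattle from
    "fattener_A": ["calf_ranch_1", "calf_ranch_2"],
    "fattener_B": ["calf_ranch_3"],
    "slaughter_supplier": ["fattener_A", "fattener_B"],
}
illegal = {"calf_ranch_2"}  # ranches flagged for illegal deforestation

def potentially_contaminated(ranch, seen=None):
    """Propagate contamination upstream through the trade graph."""
    seen = seen or set()
    if ranch in illegal:
        return True
    seen.add(ranch)
    return any(potentially_contaminated(s, seen)
               for s in suppliers.get(ranch, []) if s not in seen)

print(potentially_contaminated("slaughter_supplier"))  # True, via fattener_A
```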

Soy Surge

Brazil is the world's largest producer of soy, followed by the United States and Argentina. The study reveals the production of soy, primarily used to feed meat and dairy livestock, is on the upswing across the country. Production has more than quadrupled over the past two decades and is projected to increase by another third over the next 10 years, with exports growing by 42%.

Pig farmers in the EU, the world's largest pork exporter, rely on Brazilian soy, which is also a key ingredient in feed for chickens and other animals. The increasing global demand for pork from Asia and other regions has driven up production, which translates into increased demand for soy. This boom in soybean demand has hit the Cerrado region particularly hard. Known for its rich biodiversity, the world's largest tropical savanna has already lost half of its native vegetation.

Europe's Deforestation-free Ambitions

The European Union has emerged as a global leader in developing public and private efforts to ensure deforestation-free imports of beef, soy, palm, timber and other products known to put tropical forests at risk. These efforts, as well as a food policy initiative aimed at cutting down on the long-distance transport of feed or agricultural products, are tied into the European Green Deal.

There are calls within the EU to reduce soy imports from Brazil; proposals have suggested importing the crop from geographically closer producers, such as the United States, or even building up production within the European Union's borders. This approach flows from the Farm to Fork strategy for sustainable food, a key component of the European Green Deal, which aims to significantly reduce carbon emissions from food production.

Brazil's Opportunity to Lead

The researchers argue that Brazil could develop a transparent, web-based system using public information and methods laid out in their study to track which producers are illegally clearing forests from their properties. The authors suggest this approach would be preferable to current private systems under consideration by the EU, which would require companies to monitor themselves, or to hire third parties to do so--an approach that is costly, often lacks transparency, only encompasses few farms, and is prone to conflicts of interest.

"Brazil can no longer look the other way. It's now up to its political and economic leaders to root out the bad apples in the soy and beef sectors," said Professor Rajão. "If we were to do so, Brazil could become in practice and not only in discourse a global environmental powerhouse that protects its ecosystems, while feeding the world. In collaboration with a responsible agricultural sector, state and national governments can tackle climate change and protect some of the world's most biodiverse regions."

Credit: 
Burness

Divining monsoon rainfall months in advance with satellites and simulations

image: The Indian monsoon provides water for crops across the subcontinent. Research led by The University of Texas at Austin is allowing for more accurate forecasts of the monsoon season further in advance.

Image: 
Yogendra Joshi

Researchers affiliated with The University of Texas at Austin have developed a strategy that more accurately predicts seasonal rainfall over the Asian monsoon region and could provide tangible improvements to water resource management on the Indian subcontinent, impacting more than one fifth of the world's population.

Using satellite data on the size and extent of the snow pack on the Tibetan plateau and in Siberia, the team created better climate model simulations that predict variation in monsoon rainfall the following season. The new research was published online in Environmental Research Letters.

"We are focusing on the time scale beyond the 14 days of weather forecasting to a farther, seasonal outlook," said Peirong Lin, currently a postdoc at Princeton University, who helped lead this research project while a graduate student at UT's Jackson School of Geosciences. "This is a very important time scale because water resource managers need to know the forecast months prior to the monsoon onset for decisions about resources and agriculture."

Monsoon winds and the rain that comes with them are propelled by the temperature difference between land and ocean. Current climate forecasting relies on computer models that use general circulation models, soil moisture and other factors. The new model uses complementary satellite data to improve these forecasts by revisiting a historically recognized link between snow pack characteristics and monsoon strength over the Asian monsoon region, especially on the Indian subcontinent.

"For Indian monsoons, it was empirically known almost 140 years ago that rainfall in the summer was connected to snowpack in the Himalaya," said Zong-Liang Yang, professor in the Jackson School's Department of Geological Sciences. "But with our new model, we now have a deeper understanding of the interconnected processes, and we are able to quantify the connection that predicts monsoon season strength from snow pack."

The new research uses both the breadth and depth of winter snows to more accurately simulate monsoons. The information constraining the new models comes from two satellites: the Moderate Resolution Imaging Spectroradiometer (MODIS) that provides data about snow cover, and the Gravity Recovery and Climate Experiment (GRACE) with gravitational information that determines the depth of snow. Combined, the observations make the modeled snow conditions more realistic and demonstrate that heavy snow pack--with slower heating of landmass in comparison to the ocean--leads to weaker monsoons. Conversely, milder winter snows lead to stronger monsoons.
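The complementarity of the two satellites can be illustrated with a toy calculation: MODIS constrains where snow covers the ground, while GRACE's gravity signal constrains how much mass (hence depth) is there, and the two together yield a snow volume per grid cell. This simple product is an illustrative simplification, not the paper's actual model-constraint scheme, and all numbers are made up:

```python
# Toy illustration of why MODIS and GRACE are complementary:
# areal snow cover fraction (MODIS) x snow depth (GRACE) x cell area
# gives a snow volume per grid cell. A simplification for intuition
# only; not the study's method, and all values are hypothetical.

grid_cells = [
    # (snow cover fraction from MODIS, snow depth in m from GRACE)
    (0.9, 0.5),
    (0.4, 1.2),
    (0.0, 0.0),
]
cell_area_km2 = 100.0  # hypothetical grid-cell area

snow_volume = sum(frac * depth * cell_area_km2 for frac, depth in grid_cells)
print(f"Total snow volume: {snow_volume:.1f} km^2*m")
```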

The research also finds that snow in Tibet and Siberia plays different roles in moderating monsoon rainfall. The snow pack on the Tibetan plateau is relatively thin compared with Siberia's. Detailed analysis in the research paper shows that the Tibetan snow pack improves a few weeks of forecasting. It is the Siberian snow, which melts later in the summer and thereby has a longer impact on the climate system, that improves predictions with greater lead time and further into the peak monsoon season.

There are caveats to this research. Monsoons affect a far wider region of the world, but the team's simulations showed the most pronounced forecast improvements only over the Indian subcontinent; they were not as effective over East Asia.

"The forecast is mostly improved over the Indian subcontinent likely because Indian monsoon is more sensitive to snow changes on land," said Lin. "The East Asian monsoon may be more complex."

Still, the team hopes that the new strategy developed by their research will be used to improve seasonal forecasts beyond the Indian subcontinent, with future research that expands the current simulations.

"The work that we accomplished at the Jackson School is leading the field, but it will take time before these ideas are implemented in operational modeling systems in operational centers," said Yang. "But our goal is to decrease the research-to-operation gap and find ways to use many of the underutilized satellites that can inform long-term weather prediction."

Credit: 
University of Texas at Austin

Beautyberry leaf extract restores drug's power to fight 'superbug'

image: The American beautyberry was an important medicinal plant for some Native American tribes.

Image: 
Tharanga Samarakoon

Scientists discovered a compound in the leaves of a common shrub, the American beautyberry, that boosts an antibiotic's activity against antibiotic-resistant staph bacteria. Laboratory experiments showed that the plant compound works in combination with oxacillin to knock down the resistance to the drug of methicillin-resistant Staphylococcus aureus, or MRSA.

The American Chemical Society's journal ACS Infectious Diseases published the finding, led by scientists at Emory University and the University of Notre Dame.

The American beautyberry, or Callicarpa americana, is native to the southern United States. Prolific in the wild, the shrub is also popular in ornamental landscaping. It's known for showy clusters of bright purple berries that begin to ripen in the summer and are an important food source for many species of birds.

"We decided to investigate the chemical properties of the American beautyberry because it was an important medicinal plant for Native Americans," says Cassandra Quave, co-senior author of the study and an assistant professor in Emory University's Center for the Study of Human Health and Emory School of Medicine's Department of Dermatology. Quave is also a member of the Emory Antibiotic Resistance Center and a leader in the field of medical ethnobotany, studying how indigenous people incorporate plants in healing practices to uncover promising candidates for new drugs.

Micah Dettweiler, a recent Emory graduate and a staff member of the Quave lab, is first author of the study. Christian Melander, professor of chemistry at Notre Dame, is co-senior author.

The Alabama, Choctaw, Creek, Koasati, Seminole and other Native American tribes relied on the American beautyberry for various medicinal purposes. Leaves and other parts of the plant were boiled for use in sweat baths to treat malarial fevers and rheumatism. The boiled roots were made into treatments for dizziness, stomachaches and urine retention, while bark from the stems and roots was made into concoctions for itchy skin.

Previous research found that extracts from the leaves of the beautyberry deter mosquitoes and ticks. And a prior study by Quave and colleagues found that extracts from the leaves inhibit growth of the bacterium that causes acne. For this study, the researchers focused on testing extracts collected from the leaves for efficacy against MRSA.

"Even a single plant tissue can contain hundreds of unique molecules," Quave says. "It's a painstaking process to chemically separate them out, then test and retest until you find one that's effective."

The researchers identified a compound from the leaves that slightly inhibited the growth of MRSA. The compound belongs to a group of chemicals known as clerodane diterpenoids, some of which are used by plants to repel predators.

Since the compound only modestly inhibited MRSA, the researchers tried it in combination with beta-lactam antibiotics.

"Beta-lactam antibiotics are some of the safest and least toxic that are currently available in the antibiotic arsenal," Quave says. "Unfortunately, MRSA has developed resistance to them."

Laboratory tests showed that the beautyberry leaf compound synergizes with the beta-lactam antibiotic oxacillin to knock down MRSA's resistance to the drug.

The next step is to test the combination of the beautyberry leaf extract and oxacillin as a therapy in animal models. If those results prove effective against MRSA infections, the researchers will synthesize the plant compound in the lab and tweak its chemical structure to try to further enhance its efficacy as a combination therapy with oxacillin.

"We need to keep filling the drug-discovery pipeline with innovative solutions, including potential combination therapies, to address the ongoing and growing problem of antibiotic resistance," Quave says.

Each year in the U.S., at least 2.8 million people get an antibiotic-resistant infection and more than 35,000 people die, according to the Centers for Disease Control and Prevention.

"Even in the midst of the COVID-19 pandemic, we can't forget about the issue of antibiotic resistance," Quave says. She notes that many COVID-19 patients are receiving antibiotics to deal with secondary infections brought on by their weakened conditions, raising concerns about a later surge in antibiotic-resistant infections.

Credit: 
Emory Health Sciences

Genome guardians stop and reel in DNA to correct replication errors

On the DNA assembly line, two proofreading proteins work together as an emergency stop button to prevent replication errors. New research from North Carolina State University and the University of North Carolina at Chapel Hill shows how these proteins - MutL and MutS - prevent DNA replication errors by creating an immobile structure that calls more proteins to the site to repair the error. This structure could also prevent the mismatched region from being "packed" back into the cell during division.

When a cell prepares to divide, the DNA splits, with the double helix "unzipping" into two separate backbones. New nucleotides - adenine, cytosine, guanine or thymine - are filled into the gaps on the other side of the backbone, pairing with their counterparts (adenine with thymine and cytosine with guanine) and replicating the DNA to make a copy for both the old and the new cells. The nucleotides are a correct match most of the time, but occasionally - about one time in 10 million - there is a mismatch.

"Although mismatches are rare, the human genome contains approximately six billion nucleotides in every cell, resulting in approximately 600 errors per cell, and the human body consists of more than 37 trillion cells," says Dorothy Erie, chemistry professor at UNC-Chapel Hill, member of UNC's Lineberger Comprehensive Cancer Center and co-corresponding author of the work. "Consequently, if these errors go unchecked they can result in a vast array of mutations, which in turn can result in a variety of cancers, collectively known as Lynch Syndrome."

A pair of proteins known as MutS and MutL work together to initiate repair of these mismatches. MutS slides along the newly created side of the DNA strand after it's replicated, proofreading it. When it finds a mismatch, it locks into place at the site of the error and recruits MutL to come and join it. MutL marks the newly formed DNA strand as defective and signals a different protein to gobble up the portion of the DNA containing the error. Then the nucleotide matching starts over, filling the gap again. The entire process reduces replication errors around a thousand-fold, serving as one of our body's best defenses against genetic mutations that can lead to cancer.
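The error budget quoted by the researchers checks out with simple arithmetic: a raw mismatch rate of about one in 10 million, applied across roughly six billion nucleotides per cell, then cut around a thousand-fold by mismatch repair.

```python
# Checking the numbers quoted above.

mismatch_rate = 1 / 10_000_000      # ~1 mismatch per 10 million pairings
genome_nucleotides = 6_000_000_000  # ~6 billion nucleotides per human cell

raw_errors = genome_nucleotides * mismatch_rate
after_repair = raw_errors / 1000    # repair reduces errors ~thousand-fold

print(f"Unrepaired errors per cell: {raw_errors:.0f}")  # ~600
print(f"After mismatch repair: {after_repair:.1f}")     # well under 1
```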

"We know that MutS and MutL find, bind, and recruit repair proteins to DNA," says biophysicist Keith Weninger, university faculty scholar at NC State and co-corresponding author of the work. "But one question remained - do MutS and MutL move from the mismatch during the repair recruiting process, or stay where they are?"

In two separate papers appearing in Proceedings of the National Academy of Sciences, Weninger and Erie looked at both human and bacterial DNA to gain a clearer temporal and structural picture of what happens when MutS and MutL engage in mismatch repair.

Using both fluorescent and non-fluorescent imaging techniques, including atomic force microscopy, optical spectroscopy and tethered particle motion, the researchers found that MutL "freezes" MutS in place at the site of the mismatch, forming a stable complex that stays in that vicinity until repair can take place. The complex appears to reel in the DNA around the mismatch as well, marking and protecting the DNA region until repair can occur.

"Due to the mobility of these proteins, current thinking envisioned MutS and MutL sliding freely along the mismatched strand, rather than stopping," Weninger says. "This work demonstrates that the process is different than previously thought.

"Additionally, the complex's interaction with the strand effectively stops any other processes until repair takes place. So the defective DNA strand cannot be repacked into a chromosome and then carried forward through cell division."

Credit: 
North Carolina State University