Tech

Advancing gene editing with new CRISPR/Cas9 variant

Using a new variant to repair DNA will improve both safety and effectiveness of the much-touted CRISPR-Cas9 tool in genetic research, Michigan Medicine researchers say.

Those two key problems - safety and efficacy - are what continue to hold CRISPR-Cas9 gene targeting back from its full clinical potential, explains co-senior author Y. Eugene Chen, M.D., Ph.D., a professor of internal medicine, cardiac surgery, physiology, pharmacology and medicinal chemistry, from the Michigan Medicine Frankel Cardiovascular Center.

The new CRISPR-Cas9 variant improves efficiency when inserting a gene or DNA fragment into a precise location in the genome, known as knocking in. It also reduces the rate of unintended insertions or deletions of base pairs, known as indels, that often occur during gene editing.

"We name it meticulous integration Cas9, or miCas9, to reflect its extraordinary capacity to enable maximum integration, yet with minimal indels, as well as to recognize its development at the University of Michigan," write senior authors Chen, Jifeng Zhang and Jie Xu for Nature's "Behind the Paper" series. "It provides a 'one small stone for three birds' tool in gene editing."

Credit: 
Michigan Medicine - University of Michigan

Cluster of Alaskan islands could be single, interconnected giant volcano

WASHINGTON--A small group of volcanic islands in Alaska's Aleutian chain might be part of a single, undiscovered giant volcano, say scientists presenting the findings Monday, 7 December at AGU's Fall Meeting 2020. If the researchers' suspicions are correct, the newfound volcanic caldera would belong to the same category of volcanoes as the Yellowstone Caldera and other volcanoes that have had super-eruptions with severe global consequences.

The Islands of the Four Mountains in the central Aleutians form a tight group of six stratovolcanoes named Carlisle, Cleveland, Herbert, Kagamil, Tana and Uliaga. Stratovolcanoes are what most people envision when they think of a volcano: a steep conical mountain with a banner of clouds and ash waving at the summit. They can have powerful eruptions, like that of Mount St. Helens in 1980, but these are dwarfed by far less frequent caldera-forming eruptions.

Researchers from a variety of institutions and disciplines have been studying Mount Cleveland, the most active volcano of the group, trying to understand the nature of the Islands of the Four Mountains. They have gathered multiple pieces of evidence showing that the islands could belong to one interconnected caldera.

Unlike stratovolcanoes, which tend to tap small- to modestly-sized reservoirs of magma, a caldera is created by tapping a huge reservoir in the Earth's crust. When the reservoir's pressure exceeds the strength of the crust, gigantic amounts of lava and ash are released in a catastrophic episode of eruption.

Caldera-forming eruptions are the most explosive volcanic eruptions on Earth, and they have often had global effects. The ash and gas they put into the atmosphere can affect Earth's climate and trigger social upheaval. For example, the eruption of nearby Okmok volcano in 43 BCE has recently been implicated in the disruption of the Roman Republic. The proposed caldera underlying the Islands of the Four Mountains would be even larger than Okmok. If confirmed, it would become the first in the Aleutians that is hidden underwater, said Diana Roman of the Carnegie Institution for Science in Washington, D.C., co-author of the study.

"We've been scraping under the couch cushions for data," said Roman, referring to the difficulty of studying such a remote place. "But everything we look at lines up with a caldera in this region."

Despite all these signs, Roman, along with John Power, a researcher with the U.S. Geological Survey at the Alaska Volcano Observatory and the study's lead author, maintains that the existence of the caldera is not by any means proven. To do that, the study team will need to return to the islands and gather more direct evidence to fully test their hypothesis.

"Our hope is to return to the Islands of Four Mountains and look more closely at the seafloor, study the volcanic rocks in greater detail, collect more seismic and gravity data, and sample many more of the geothermal areas," Roman said.

The caldera hypothesis might also help explain the frequent explosive activity seen at Mount Cleveland, Roman said. Mount Cleveland has arguably been the most active volcano in North America for at least the last 20 years. It has produced ash clouds reaching between 15,000 and 30,000 feet above sea level. These eruptions pose hazards to aircraft traveling the busy air routes between North America and Asia.

"It does potentially help us understand what makes Cleveland so active," said Power, who will present the work. "It can also help us understand what type of eruptions to expect in the future and better prepare for their hazards."

Credit: 
American Geophysical Union

Cooling electronics efficiently with graphene-enhanced heat pipes

image: Graphene enhanced heat pipes can efficiently cool power electronics

Image: 
Ya Liu/Johan Liu/Chalmers University of Technology

Researchers at Chalmers University of Technology, Sweden, have found that graphene-based heat pipes can help solve the problems of cooling electronics and power systems used in avionics, data centres, and other power electronics.

"Heat pipes are one of the most efficient tools for this purpose, because of their high efficiency and unique ability to transfer heat over a large distance," says Johan Liu, Professor of Electronics Production, at the Department of Microtechnology and Nanoscience at Chalmers.

The results, which also involved researchers in China and Italy, were recently published in the scientific Open Access journal Nano Select.
https://doi.org/10.1002/nano.202000195

Electronics and data centres need to be efficiently cooled in order to work properly. Graphene enhanced heat pipes can solve these issues. Currently, heat pipes are usually made of copper, aluminium or their alloys. Due to the relatively high density and limited heat transmission capacity of these materials, heat pipes are facing severe challenges in future power devices and data centres.

Large data centres that deliver, for example, digital banking services and video streaming websites, are extremely energy-intensive, and an environmental culprit with greater emissions than the aviation industry. Reducing the climate footprint of this industry is therefore vital. The researchers' discoveries here could make a significant energy efficiency contribution to these data centres, and in other applications too.

The graphene enhanced heat pipe exhibits a specific thermal transfer coefficient about 3.5 times higher than that of a copper-based heat pipe. The new findings pave the way for using graphene enhanced heat pipes in lightweight, large-capacity cooling applications such as avionics, automotive electronics, laptop computers, handsets, data centres and space electronics.

The graphene enhanced heat pipes are made of films assembled from high-thermal-conductivity graphene, with inner surfaces enhanced by carbon fibre wicks. The researchers tested pipes with an outer diameter of 6 mm and a length of 150 mm. The pipes show great advantages and potential for cooling a variety of electronics and power systems, especially where low weight and high corrosion resistance are required.

"The condenser section, the cold part of the graphene enhanced heat pipe, can be substituted by a heat sink or a fan to make the cooling even more efficient when applied in a real case," explains Ya Liu, PhD Student at the Electronics Materials and Systems Laboratory at Chalmers.

Credit: 
Chalmers University of Technology

Electrical spin filtering the key to ultra-fast, energy-efficient spintronics

image: Lead author Dr Elizabeth Marcellina is a theoretical and experimental physicist, previously at UNSW and now at NTU, Singapore

Image: 
FLEET

Spin-filtering could be the key to faster, more energy-efficient switching in future spintronic technology, allowing the detection of spin by electrical rather than magnetic means.

A UNSW paper published last month demonstrates spin detection using a spin filter that separates spin orientations according to their energies.

Ultra-fast, ultra-low energy 'spintronic' devices are an exciting, beyond-CMOS technology.

DETECTING SPIN VIA ELECTRICAL MEANS IN FUTURE SPINTRONICS

The emerging field of spintronics uses the extra degree of freedom offered by a particle's quantum spin, in addition to its charge, allowing for ultra-fast, ultra-low-energy computation.

The key is the ability to generate and detect spin as it accumulates on a material's surface.

The aim of researchers is to generate and detect spin via electrical means, rather than magnetic means, because electric fields are a lot less energetically costly to generate than magnetic fields.

Energy-efficient spintronics is dependent on both generation and detection of spin via electrical means.

In strongly spin-orbit coupled semiconductor systems, all-electrical generation of spin has already been successfully demonstrated.

However, detection of spin-to-charge conversion has always required a large range of magnetic fields, thus limiting the speed and practicality.

In this new study, UNSW researchers have exploited the non-linear interactions between spin accumulation and charge currents in gallium-arsenide holes, demonstrating all-electrical spin-to-charge conversion without the need for a magnetic field.

"Our technique promises new possibilities for rapid spin detection in a wide variety of materials, without using a magnetic field," explains lead author Dr Elizabeth Marcellina.

Previously, generation and detection of spin accumulation in semiconductors have been achieved through optical methods, or via the spin Hall effect-inverse spin Hall effect pair.

However, these methods require a large spin diffusion length, meaning that they are not applicable to strongly spin-orbit coupled materials with short spin diffusion length.

ALL-ELECTRICAL SPIN FILTERING

The UNSW study introduces a new method for detecting spin accumulation--using a spin filter, which separates different spin orientations based on their energies.

Typically, spin filters have relied on the application of large magnetic fields, which is impractical and can interfere with the spin accumulation.

Instead, the UNSW team exploited non-linear interactions between spin accumulation and charge, which facilitate the conversion of spin accumulation into charge currents even at zero magnetic field.

"Using ballistic, mesoscopic gallium-arsenide holes as a model system for strongly spin-orbit coupled materials, we demonstrated non-linear spin-to-charge conversion that is all-electrical and requires no magnetic field," says corresponding author A/Prof Dimi Culcer (UNSW).

"We showed that non-linear spin-to-charge conversion is fully consistent with the data obtained from linear response measurements and is orders of magnitude faster," says corresponding-author Prof Alex Hamilton, also at UNSW.

Because the non-linear method needs neither a magnetic field nor a long spin diffusion length, it promises new possibilities for fast detection of spin accumulation in strongly spin-orbit coupled materials with short spin diffusion lengths, such as transition metal dichalcogenides (TMDCs) and topological materials.

Finally, the rapidness of non-linear spin-to-charge conversion can enable time-resolved read-out of spin accumulation down to 1 nanosecond resolution.

Credit: 
ARC Centre of Excellence in Future Low-Energy Electronics Technologies

Battery of tests: Scientists figure out how to track what happens inside batteries

image: This DOE-created illustration shows ions in a fully charged lithium-ion battery. A team of researchers using the APS has discovered a new method to precisely measure the movement of these ions through a battery.

Image: 
Department of Energy

The future of mobility is electric cars, trucks and airplanes. But there is no way a single battery design can power that future. Even your cell phone and laptop batteries have different requirements and different designs. The batteries we will need over the next few decades will have to be tailored to their specific uses.

And that means understanding exactly what happens, as precisely as possible, inside each type of battery. Every battery works on the same principle: ions, which are atoms or molecules with an electrical charge, carry a current from the anode to the cathode through material called the electrolyte, and then back again. But their precise movement through that material, whether liquid or solid, has puzzled scientists for decades. Knowing exactly how different types of ions move through different types of electrolytes will help researchers figure out how to affect that movement, to create batteries that charge and discharge in ways most befitting their specific uses.

“We had to connect the dots before, and now we can directly detect the ions. There is no ambiguity.” — Venkat Srinivasan, deputy director, Joint Center for Energy Storage Research, Argonne National Laboratory

In a breakthrough discovery, a team of scientists has demonstrated a combination of techniques that allows for the precise measurement of ions moving through a battery. Using the Advanced Photon Source (APS), a U.S. Department of Energy (DOE) Office of Science User Facility at DOE’s Argonne National Laboratory, these researchers have not only peered inside a battery as it operates, measuring the reactions in real time, but have opened the door to similar experiments with different types of batteries.

The researchers collaborated on this result with the Joint Center for Energy Storage Research (JCESR), a DOE Energy Innovation Hub led by Argonne. The team’s paper, which details velocities of lithium ions moving through a polymer electrolyte, was published in Energy & Environmental Science.

“This is a combination of different experimental methods to measure velocity and concentration, and then compare them both to theory,” said Hans-Georg Steinrück, professor at Paderborn University in Germany and the first author on the paper. “We showed this is possible, and now we will perform it on other systems that are different in nature.”

Those methods, performed at beamline 8-ID-I at the APS, included using ultra-bright X-rays to measure the velocity of the ions moving through the battery, and to simultaneously measure the concentration of ions within the electrolyte, while a model battery discharged. The research team then compared their results with mathematical models. Their result is an extremely accurate figure representing the current carried by ions — what is called the transport number.

The transport number is essentially the amount of current carried by positively charged ions in relation to the overall electric current, and the team’s calculations put that number at approximately 0.2. This conclusion differs from those derived by other methods, researchers said, due to the sensitivity of this new way of measuring ion movement.
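The arithmetic behind such a figure is simple to sketch. The Python snippet below is illustrative only: the concentration, drift velocity and total current density are placeholder values chosen so the ratio comes out near 0.2, not numbers taken from the experiment.

```python
# Minimal sketch of the cation transport number: the fraction of the total
# current carried by Li+ ions. All numbers are illustrative placeholders.

F = 96485.0  # Faraday constant, C/mol


def cation_transport_number(c_li, v_li, j_total):
    """c_li: Li+ concentration (mol/m^3); v_li: average Li+ drift velocity (m/s);
    j_total: total current density through the cell (A/m^2)."""
    j_li = F * c_li * v_li  # current density carried by Li+ alone
    return j_li / j_total


# Example: 1000 mol/m^3 of Li+ drifting at 2 nm/s under 1 A/m^2 of total current
print(cation_transport_number(1000.0, 2e-9, 1.0))  # ~0.19, i.e. roughly 0.2
```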

The true value of the transport number has been the subject of some debate among scientists for years, according to Michael Toney, professor at the University of Colorado Boulder and an author on the paper. Toney and Steinrück were both staff scientists at the DOE’s SLAC National Accelerator Laboratory when this research was conducted.

“The traditional way of measuring the transport number is to analyze the current,” Toney said. “But it was unknown how much of that current is due to lithium ions and how much is due to other things you don’t want in your analysis. The principle is easy, but we had to measure accurately. This was certainly a proof of concept.”

For this experiment, the research team used a solid polymer electrolyte instead of the liquid electrolytes in wide use in lithium-ion batteries. As Toney notes, polymers are safer, since they avoid the flammability issues of some liquid electrolytes.

Argonne’s Venkat Srinivasan, deputy director of JCESR and an author on the paper, has extensive experience modeling the reactions inside batteries, but this is the first time he’s been able to compare those models to real-time data on the movement of ions through an electrolyte.

“For years we wrote papers about what happens inside a battery, since we couldn’t see the things inside,” he said. “I always joked that whatever I said must be true, since we couldn’t confirm it. So for decades we have been looking for information like this, and it challenges people like me who have been making the predictions.”

In the past, Srinivasan said, the best way to research the inner workings of batteries was to send a current through them and then analyze what happened afterward. The ability to trace the ions moving in real time, he said, offers scientists a chance to change that movement to suit their battery design needs.

“We had to connect the dots before, and now we can directly detect the ions,” he said. “There is no ambiguity.”

Eric Dufresne, physicist with Argonne’s X-ray Science Division, was one of the APS scientists who worked on this project. An author on the paper, Dufresne said the experiment made use of the coherence available at the APS, allowing the research team to capture the effect they were looking for down to velocities of only nanometers per second.

“This is a very thorough and complex study,” he said. “It's a nice example of combining X-ray techniques in a novel way, and a good step toward developing future applications.”

Dufresne and his colleagues also noted that these experiments will only improve once the APS undergoes an in-progress upgrade of its electron storage ring, which will increase the brightness of the X-rays it produces by up to 500 times.

“The APS Upgrade will allow us to push these dynamic studies to better than microseconds,” Dufresne said. “We will be able to focus the beam for smaller measurements and get through thicker materials. The upgrade will give us unique capabilities, and we will be able to do more experiments of this type.”

That’s a prospect that excites the research team. Steinrück said the next step is to analyze more complex polymers and other materials, and eventually to move on to liquid electrolytes. Toney said he would like to examine ions from other types of materials, such as calcium and zinc.

Examining a diversity of materials, Srinivasan said, would be important for the eventual goal: batteries that are precisely designed for their individual uses.

“If we want to create high-energy, fast, safe, long-lasting batteries, we need to know more about ion motion,” he said. “We need to understand more about what happens inside a battery, and use that knowledge to design new materials from the bottom up.”

Credit: 
DOE/Argonne National Laboratory

CNIC scientists identify a new diagnostic and therapeutic target for cardiovascular disease

video: CNIC scientists identify a new diagnostic and therapeutic target for cardiovascular disease

Image: 
CNIC

Scientists at the Centro Nacional de Investigaciones Cardiovasculares (CNIC) have identified a mitochondrial protein as a potential marker for the diagnosis of cardiovascular disease (CVD) and as a possible target for future treatments. The study is published today in the journal Nature.

Cardiovascular disease is the leading cause of death in the world, with most deaths from CVD caused by a heart attack or stroke. The leading underlying cause of the blood clots triggering these events is atherosclerosis, a chronic inflammatory disease that produces plaques composed of cell debris, fats, and fibrous material in blood vessel walls. Atherosclerosis manifests clinically as thrombosis (a blood clot inside a blood vessel), which is the principal cause of acute myocardial infarction and stroke.

Atherosclerosis develops for many years without causing symptoms, and there is therefore a pressing need for new tools for diagnosis and therapy. Study leader Dr Almudena Ramiro, of the CNIC, explained that, "we know that atherosclerosis includes an immunological component and that the innate and adaptive immune systems are both involved in the origin and progression of this disease." However, little is known about the specific response of B cells in these processes or the repertoire of antibodies these cells produce during atherosclerosis.

Now, the new study published in Nature has shown that the mitochondrial protein ALDH4A1 is an autoantigen involved in atherosclerosis. Autoantigens are molecules produced by the body that, through a variety of mechanisms, are recognized as foreign and trigger an immune response. "ALDH4A1 is recognized by the protective antibodies produced during atherosclerosis, making it a possible therapeutic target or diagnostic marker for this disease," Ramiro said.

The study characterized the antibody response associated with atherosclerosis in mice lacking the low-density lipoprotein receptor (LDLR-/-) and fed a high-fat diet. During the study, the CNIC team collaborated with researchers at the German Cancer Research Center (DKFZ), the Spanish Cardiovascular Biomedical Research Network (CIBERCV), the Fundación Jiménez Díaz Institute for Medical Research, and the Universidad Autónoma de Madrid.

Describing the study, first author Cristina Lorenzo explained, "we found that atherosclerosis is associated with the generation of specific antibodies in the germinal centers, where B cells diversify their antibodies and differentiate into high-affinity memory B cells and plasma cells."

To study the repertoire of antibodies produced during atherosclerosis, the research team performed a high-throughput analysis based on isolating individual B cells and sequencing their antibody genes. "Analysis of the sequences of more than 1700 antibody genes showed that mice with atherosclerosis produced a distinct antibody repertoire. The production of these antibodies allowed us to study their targets (their antigen specificity) and their functional properties," explained Hedda Wardemann, of the DKFZ in Heidelberg.

Among the atherosclerosis-associated antibodies, the research team found that the antibody A12 was able to recognize plaques not only in the atherosclerosis-prone mice, but also in samples from patients with atherosclerosis in the carotid arteries. "Proteomics analysis showed that A12 specifically recognized a mitochondrial protein called aldehyde dehydrogenase 4 family, member A1 (ALDH4A1), identifying this protein as an autoantigen in the context of atherosclerosis," said Lorenzo.

Ramiro added that "the study shows that ALDH4A1 accumulates in plaques and that its plasma concentration is elevated in the atherosclerosis-prone mice and in human patients with carotid atherosclerosis, establishing ALDH4A1 as a possible biomarker of the disease."

The team also found that infusion of A12 antibodies into the atherosclerosis-prone mice delayed plaque formation and reduced the circulating levels of free cholesterol and LDL, suggesting that anti-ALDH4A1 antibodies have therapeutic potential in the protection against atherosclerosis. "These results," explained Ramiro, "broaden our knowledge of the humoral response during atherosclerosis and highlight the potential of ALDH4A1 as a new biomarker and of A12 as a therapeutic agent for this disease."

The scientists conclude that their study opens the path to new diagnostic and therapeutic interventions in cardiovascular disease.

Credit: 
Centro Nacional de Investigaciones Cardiovasculares Carlos III (F.S.P.)

Scientists reverse age-related vision loss, eye damage from glaucoma in mice

At-a-glance:

Proof-of-concept study represents first successful attempt to reverse the aging clock in animals through epigenetic reprogramming.

Scientists turned on embryonic genes to reprogram cells of mouse retinas.

Approach reversed glaucoma-induced eye damage in animals.

Approach also restored age-related vision loss in elderly mice.

Work spells promise for using same approach in other tissues, organs beyond the eyes.

Success sets stage for treatment of various age-related diseases in humans.

Harvard Medical School scientists have successfully restored vision in mice by turning back the clock on aged eye cells in the retina to recapture youthful gene function.

The team's work, described Dec. 2 in Nature, represents the first demonstration that it may be possible to safely reprogram complex tissues, such as the nerve cells of the eye, to an earlier age.

In addition to resetting the cells' aging clock, the researchers successfully reversed vision loss in animals with a condition mimicking human glaucoma, a leading cause of blindness around the world.

The achievement represents the first successful attempt to reverse glaucoma-induced vision loss, rather than merely stem its progression, the team said. If replicated through further studies, the approach could pave the way for therapies to promote tissue repair across various organs and reverse aging and age-related diseases in humans.

"Our study demonstrates that it's possible to safely reverse the age of complex tissues such as the retina and restore its youthful biological function," said senior author David Sinclair, professor of genetics in the Blavatnik Institute at Harvard Medical School, co-director of the Paul F. Glenn Center for Biology of Aging Research at HMS and an expert on aging.

Sinclair and colleagues caution that the findings remain to be replicated in further studies, including in different animal models, before any human experiments. Nonetheless, they add, the results offer a proof of concept and a pathway to designing treatments for a range of age-related human diseases.

"If affirmed through further studies, these findings could be transformative for the care of age-related vision diseases like glaucoma and to the fields of biology and medical therapeutics for disease at large," Sinclair said.

For their work, the team used an adeno-associated virus (AAV) as a vehicle to deliver into the retinas of mice three youth-restoring genes--Oct4, Sox2 and Klf4--that are normally switched on during embryonic development. The three genes, together with a fourth one, which was not used in this work, are collectively known as Yamanaka factors.

The treatment had multiple beneficial effects on the eye. First, it promoted nerve regeneration following optic-nerve injury in mice with damaged optic nerves. Second, it reversed vision loss in animals with a condition mimicking human glaucoma. And third, it reversed vision loss in aging animals without glaucoma.

The team's approach is based on a new theory about why we age. Most cells in the body contain the same DNA molecules but have widely diverse functions. To achieve this degree of specialization, these cells must read only genes specific to their type. This regulatory function is the purview of the epigenome, a system of turning genes on and off in specific patterns without altering the basic underlying DNA sequence of the gene.

This theory postulates that changes to the epigenome over time cause cells to read the wrong genes and malfunction--giving rise to diseases of aging. One of the most important changes to the epigenome is DNA methylation, a process by which methyl groups are tacked onto DNA. Patterns of DNA methylation are laid down during embryonic development to produce the various cell types. Over time, youthful patterns of DNA methylation are lost, and genes inside cells that should be switched on get turned off and vice versa, resulting in impaired cellular function. Some of these DNA methylation changes are predictable and have been used to determine the biologic age of a cell or tissue.
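As a rough illustration of how methylation patterns can be translated into a biological age estimate, the sketch below implements a generic linear "epigenetic clock". The CpG sites, weights and intercept are invented placeholders, not coefficients from any published clock.

```python
# Illustrative "epigenetic clock": a linear model that maps methylation levels
# at selected CpG sites to a predicted biological age. All values are invented.

def predicted_age(methylation, weights, intercept):
    """methylation: dict of CpG site -> beta value in [0, 1]."""
    return intercept + sum(weights[site] * beta for site, beta in methylation.items())


weights = {"cg_A": 35.0, "cg_B": -20.0, "cg_C": 50.0}  # invented weights
intercept = 10.0                                       # invented intercept

young_profile = {"cg_A": 0.20, "cg_B": 0.70, "cg_C": 0.15}
aged_profile = {"cg_A": 0.55, "cg_B": 0.30, "cg_C": 0.60}

print(predicted_age(young_profile, weights, intercept))  # lower predicted age
print(predicted_age(aged_profile, weights, intercept))   # higher predicted age
```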

Yet, whether DNA methylation drives age-related changes inside cells has remained unclear. In the current study, the researchers hypothesized that if DNA methylation does, indeed, control aging, then erasing some of its footprints might reverse the age of cells inside living organisms and restore them to their earlier, more youthful state.

Past work had achieved this feat in cells grown in laboratory dishes but fell short of demonstrating the effect in living organisms.

The new findings demonstrate that the approach could be used in animals as well.

Overcoming an important hurdle

Lead study author Yuancheng Lu, a research fellow in genetics at HMS and a former doctoral student in Sinclair's lab, developed a gene therapy that could safely reverse the age of cells in a living animal.

Lu's work builds on the Nobel Prize-winning discovery of Shinya Yamanaka, who identified the four transcription factors (Oct4, Sox2, Klf4 and c-Myc) that can erase epigenetic marks on cells and return them to their primitive embryonic state, from which they can develop into any other type of cell.

Subsequent studies, however, showed two important setbacks. First, when used in adult mice, the four Yamanaka factors could also induce tumor growth, rendering the approach unsafe. Second, the factors could reset the cellular state to the most primitive cell state, thus completely erasing a cell's identity.

Lu and colleagues circumvented these hurdles by slightly modifying the approach. They dropped the gene c-Myc and delivered only the remaining three Yamanaka genes, Oct4, Sox2 and Klf4. The modified approach successfully reversed cellular aging without fueling tumor growth or erasing the cells' identity.

Gene therapy applied to optic nerve regeneration

In the current study, the researchers targeted cells in the central nervous system because it is the first part of the body affected by aging. After birth, the ability of the central nervous system to regenerate declines rapidly.

To test whether the regenerative capacity of young animals could be imparted to adult mice, the researchers delivered the modified three-gene combination via an AAV into retinal ganglion cells of adult mice with optic nerve injury.

For the work, Lu and Sinclair partnered with Zhigang He, HMS professor of neurology and of ophthalmology at Boston Children's Hospital, who studies optic nerve and spinal cord neuro-regeneration.

The treatment resulted in a two-fold increase in the number of surviving retinal ganglion cells after the injury and a five-fold increase in nerve regrowth.

"At the beginning of this project, many of our colleagues said our approach would fail or would be too dangerous to ever be used," said Lu. "Our results suggest this method is safe and could potentially revolutionize the treatment of the eye and many other organs affected by aging."

Reversal of glaucoma and age-related vision loss

Following the encouraging findings in mice with optic nerve injuries, the team partnered with colleagues at the Schepens Eye Research Institute of Massachusetts Eye and Ear: Bruce Ksander, HMS associate professor of ophthalmology, and Meredith Gregory-Ksander, HMS assistant professor of ophthalmology. They planned two sets of experiments: one to test whether the three-gene cocktail could restore vision loss due to glaucoma and another to see whether the approach could reverse vision loss stemming from normal aging.

In a mouse model of glaucoma, the treatment led to increased nerve cell electrical activity and a notable increase in visual acuity, as measured by the animals' ability to see moving vertical lines on a screen. Remarkably, it did so after the glaucoma-induced vision loss had already occurred.

"Regaining visual function after the injury occurred has rarely been demonstrated by scientists," Ksander said. "This new approach, which successfully reverses multiple causes of vision loss in mice without the need for a retinal transplant, represents a new treatment modality in regenerative medicine."

The treatment worked similarly well in elderly, 12-month-old mice with diminishing vision due to normal aging. Following treatment of the elderly mice, the gene expression patterns and electrical signals of the optic nerve cells were similar to those of young mice, and vision was restored. When the researchers analyzed molecular changes in treated cells, they found reversed patterns of DNA methylation--an observation suggesting that DNA methylation is not a mere marker or a bystander in the aging process, but rather an active agent driving it.

"What this tells us is the clock doesn't just represent time--it is time," said Sinclair. "If you wind the hands of the clock back, time also goes backward."

The researchers said that if their findings are confirmed in further animal work, they could initiate clinical trials within two years to test the efficacy of the approach in people with glaucoma. Thus far, the findings are encouraging, researchers said. In the current study, a one-year, whole-body treatment of mice with the three-gene approach showed no negative side effects.

Credit: 
Harvard Medical School

Birth defects linked to greater risk of cancer in later life

People born with major birth defects face a higher risk of cancer throughout life, although the relative risk is greatest in childhood and then declines, finds a study published by The BMJ today.

The researchers found a continued increased risk of cancer in people who had been born with both non-chromosomal and chromosomal anomalies, suggesting that birth defects may share a common cause with some forms of cancer, be that genetic, environmental, or a combination of the two.

It is generally accepted that people with major birth defects have a greater risk of developing cancer during childhood and adolescence, but it is less clear whether that risk persists into adulthood, so researchers set out to investigate.

They used health registries in Denmark, Finland, Norway, and Sweden to identify 62,295 people aged up to 46 years who had been diagnosed with cancer and matched them against 724,542 people without a cancer diagnosis (controls) by country and year of birth.

The data showed that 3.5% of cases (2,160 out of 62,295) and 2.2% of controls (15,826 out of 724,542) had a major birth defect, and that the odds of developing cancer were 1.74 times higher in people with major birth defects than in those without.
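For readers who want to see where a figure like that comes from, the crude odds ratio can be computed directly from the counts above. The sketch below does exactly that; note that the published 1.74 estimate comes from the matched analysis, so the unadjusted value is somewhat lower.

```python
# Crude (unadjusted) odds ratio from the counts quoted above.
# The published estimate of 1.74 reflects the matched/adjusted analysis,
# so it differs from this simple calculation.

cases_total, cases_with_defect = 62_295, 2_160
controls_total, controls_with_defect = 724_542, 15_826

odds_cases = cases_with_defect / (cases_total - cases_with_defect)
odds_controls = controls_with_defect / (controls_total - controls_with_defect)

print(round(odds_cases / odds_controls, 2))  # ~1.61, crude odds ratio
```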

The odds of cancer in people with birth defects were greatest in children aged 0-14 (2.52 times higher) and then declined, but they were still 1.22-fold higher in adults aged 20 or more with major birth defects compared with those without.

In particular, people with congenital heart defects, defects of the genital organs or nervous system, skeletal dysplasia and chromosomal anomalies (too few or too many chromosomes, or missing, extra or irregular portions of chromosomal DNA) continued to have a greater risk of cancer in later life.

The type of birth defect had a marked impact on the subsequent risk and type of cancer.

For example, the odds of cancer were highest (5.53-fold higher) in people with chromosomal anomalies such as Down's syndrome. The most common type of cancer in people with birth defects caused by chromosomal anomalies was leukaemia.

Structural birth defects, such as defects of the eye, nervous system and urinary organs, were associated with later cancer in the same organ or location, although the researchers stress that some of these associations were based on small numbers.

This was a large study using robust Scandinavian national registry systems, but the authors do highlight some limitations. For example, the study only included diagnoses made in the first year of life and confirmed in hospital, so some less visible birth defects may have been missed.

And while factors such as in vitro fertilisation, maternal age and maternal smoking were taken into account, other potentially influential factors, such as parental income or education, could not be adjusted for.

Nevertheless, the researchers say: "Our study showed that birth defects are associated with risk of cancer in adulthood as well as in adolescence and childhood, a finding of clinical importance for healthcare workers responsible for follow-up of individuals with birth defects."

They point out that surveillance for cancer in children with birth defects has been discussed, but thus far the absolute cancer risk has been regarded as too low.

"The most important implication of our results is to provide further rationale for additional studies on the molecular mechanisms involved in the developmental disruptions underlying both birth defects and cancer," they conclude.

Few of the associations described in this study suggest that screening is either viable or desirable for most children or adults with birth defects, say US researchers in a linked editorial.

The relation between cancers and birth defects is likely to be complex, they write, and this study did not distinguish between genetic, environmental, and iatrogenic explanations for the observed associations.

As such, they say the clinical implications of this study are limited, but the findings should certainly trigger further research "that may offer important preventive opportunities and identify high risk patient groups for enhanced targeted surveillance."

Credit: 
BMJ Group

Natural three-dimensional nonlinear photonic crystal

image: a, Experimental setup for 3D quasi-phase-matching SHG experiment. b-c, SHG spot in different polarization states when the polarization direction of incident fundamental light is along y-axis (b) and z-axis (c). d-e, Relative intensity of SHG in different polarization states when the polarization direction of incident fundamental light is along y-axis (d) and z-axis (e). f, The relationship between fundamental power and SH power.

Image: 
by Chang Li, Xuping Wang, Yang Wu, Fei Liang, Feifei Wang, Xiangyong Zhao, Haohai Yu, Huaijin Zhang

Nonlinear photonic crystals (NPCs) are transparent materials that have a spatially uniform linear susceptibility, yet a periodically modulated quadratic nonlinear susceptibility. These engineered materials are used extensively for studying nonlinear wave dynamics and in many scientific and industrial applications. Over the past two decades, there has been a continuous effort to find a technique that will enable the construction of three-dimensional (3D) NPCs. Such capability will enable many new schemes of manipulation and control of nonlinear optical interactions.

Until now, only two artificial 3D NPCs have been constructed, using femtosecond laser poling in ferroelectric LiNbO3 and Ba0.77Ca0.23TiO3 crystals. However, both nonlinear crystals feature only up-down ferroelectric domains, with no spatially rotating polarization, so the crystal cutting angle and the polarization of the incident light remain constrained if the maximum nonlinear coefficient is to be exploited. A 3D spatial rotation of the ferroelectric domains could lift this rigid requirement on the incident light, but such rotation is difficult to achieve with traditional electric-field or optical poling techniques.

In a new paper published in Light: Science & Applications, scientists from the State Key Laboratory of Crystal Materials and the Institute of Crystal Materials, Shandong University, China, and co-workers report a natural potassium tantalate niobate (KTa0.56Nb0.44O3, KTN) perovskite nonlinear photonic crystal with 3D spontaneous Rubik's domain structures. It exhibits a near-room-temperature Curie point of 40 degrees Celsius. The Rubik's domain structure is composed of 90° and 180° domains with different polarization directions. Hence, the ferroelectric domain structures arranged in the KTN crystal supply a rich set of 3D reciprocal vectors to compensate phase mismatch along arbitrary directions. Based on this 3D KTN nonlinear photonic crystal, second harmonic generation with a four-fold spot pattern was demonstrated, which proved to be the superposition of two orthogonal polarization states in different nonlinear diffraction modes.
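The role of those reciprocal vectors can be stated compactly through the standard quasi-phase-matching condition for second harmonic generation, written here in generic textbook form rather than as reproduced from the paper:

```latex
% Generic 3D quasi-phase-matching condition for SHG (textbook form, not the
% paper's notation): the domain lattice supplies a reciprocal vector G_{mnl}
% that closes the momentum mismatch between fundamental and second harmonic.
\[
  \Delta\mathbf{k} = \mathbf{k}_{2\omega} - 2\,\mathbf{k}_{\omega} - \mathbf{G}_{mnl} = 0,
  \qquad
  \mathbf{G}_{mnl} = m\,\mathbf{b}_1 + n\,\mathbf{b}_2 + l\,\mathbf{b}_3,
\]
% where b_1, b_2, b_3 are the primitive reciprocal vectors of the domain
% structure; a sufficiently rich set of G_{mnl} lets the mismatch be
% compensated along an arbitrary propagation direction.
```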

"KTN crystal contains 3D ferroelectric polarization distributions corresponding to the reconfigured second-order susceptibilities, which can provide rich reciprocal vectors for compensating phase mismatch along an arbitrary direction and polarization of incident light." they added.

"KTN crystal is easily compatible to laser writing technique, thus suggesting promising opportunities to create hierarchical nonlinear optical modulation. Therefore, this 3D nonlinear photonic crystal in perovskite ferroelectrics would find a wide variety of applications in optical communications, quantum entanglement sources, nonlinear imaging, and on-chip signal processing."the scientists forecast.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

New butterfly-inspired hydrogen sensor is powered by light

image: PhD researcher Ebtsam Alenezy holds a prototype of the light-activated hydrogen sensor, which can deliver ultra-precise results at room temperature.

Image: 
RMIT University

Inspired by the surface of butterfly wings, researchers have developed a light-activated hydrogen sensor that produces ultra-precise results at room temperature.

The technology can detect hydrogen leaks well before they pose safety risks and can measure tiny amounts of the gas on people's breath, for diagnosing gut disorders.

Commercial hydrogen sensors only work at temperatures of 150C or higher, but the prototype developed by researchers at RMIT University in Melbourne, Australia, is powered by light instead of heat.

The sensor, based on bumpy microstructures that imitate the surface of butterfly wings, is detailed in a new study published in the journal ACS Sensors.

Co-lead researcher Dr Ylias Sabri said the prototype was scalable, cost-effective and offered a total package of features that could not be matched by any hydrogen sensor currently on the market.

"Some sensors can measure tiny amounts, others can detect larger concentrations; they all need a lot of heat to work," Sabri said.

"Our hydrogen sensor can do it all - it's sensitive, selective, works at room temperature and can detect across a full range of levels."

The sensor can detect hydrogen at concentrations from as little as 10 parts per million (for medical diagnoses) up to 40,000 parts per million (the level at which the gas becomes potentially explosive).

Co-lead researcher Dr Ahmad Kandjani said the broad detection range made it ideal for both medical use and boosting safety in the emerging hydrogen economy.

"Hydrogen has potential to be the fuel of the future but we know safety fears could affect public confidence in this renewable energy source," he said.

"By delivering precise and reliable sensing technology that can detect the tiniest of leaks, well before they become dangerous, we hope to contribute to advancing a hydrogen economy that can transform energy supplies around the world."

Butterfly bumps: How the sensor works

The innovative core of the new sensor is made up of tiny spheres known as photonic or colloidal crystals.

These hollow shapes, similar to the miniscule bumps found on the surface of butterfly wings, are highly ordered structures that are ultra-efficient at absorbing light.

That efficiency means the new sensor can draw all the energy it needs to operate from a beam of light, rather than from heat.

PhD researcher and first author Ebtsam Alenezy said the room-temperature sensor was safer and cheaper to run, compared to commercial hydrogen sensors that typically operate at 150C to 400C.

"The photonic crystals enable our sensor to be activated by light and they also provide the structural consistency that's critical for reliable gas sensing," she said.

"Having a consistent structure, consistent fabrication quality and consistent results are vital - and that's what nature has delivered for us through these bioinspired shapes.

"The well-developed fabrication process for photonic crystals also means our technology is easily scalable to industrial levels, as hundreds of sensors could be rapidly produced at once."

To make the sensor, an electronic chip is first covered with a thin layer of photonic crystals and then with a titanium palladium composite.

When hydrogen interacts with the chip, the gas is converted into water. This process creates an electronic current and by measuring the magnitude of the current, the sensor can tell precisely how much hydrogen is present.
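In practice, a current readout like this would be mapped back to a hydrogen concentration through a calibration curve. The sketch below shows one way such a mapping could work; the calibration points and the log-log interpolation are invented for illustration and are not taken from the RMIT study.

```python
# Hypothetical mapping from measured current to hydrogen concentration via a
# calibration curve. The calibration points below are invented placeholders.
import numpy as np

# (hydrogen concentration in ppm, measured current in microamps) -- invented
calibration = np.array([
    [10,     0.12],
    [100,    0.45],
    [1_000,  1.60],
    [10_000, 5.10],
    [40_000, 9.80],
])


def ppm_from_current(current_uA):
    """Interpolate concentration from current on a log-log calibration curve."""
    ppm, current = calibration[:, 0], calibration[:, 1]
    log_ppm = np.interp(np.log10(current_uA), np.log10(current), np.log10(ppm))
    return float(10 ** log_ppm)


print(round(ppm_from_current(1.6)))  # -> 1000 ppm on this invented curve
```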

Unlike many commercial sensors that struggle in the presence of nitrogen oxide, the new technology is highly selective so it can accurately isolate hydrogen from other gases.

Medical applications

With elevated levels of hydrogen known to be connected to gastrointestinal disorders, the technology has strong potential for use in medical diagnosis and monitoring.

Currently, the standard diagnostic approach is through breath samples, which are sent to labs for processing.

Sabri said the new chip could be integrated into a hand-held device to deliver instant results.

"With gut conditions, the difference between healthy levels of hydrogen and unhealthy levels is miniscule - just 10 parts per million - but our sensor can accurately measure such tiny differences," he said.

A provisional patent application has been filed for the technology and the research team hopes to collaborate with manufacturers of hydrogen sensors, fuel cells, batteries or medical diagnostic companies to commercialise the sensor.

Credit: 
RMIT University

Self-repairing gelatin-based film could be a smart move for electronics

Dropping a cell phone can sometimes cause superficial cracks to appear. But other times, the device can stop working altogether because fractures develop in the material that stores data. Now, researchers reporting in ACS Applied Polymer Materials have made an environmentally friendly, gelatin-based film that can repair itself multiple times and still maintain the electronic signals needed to access a device's data. The material could be used someday in smart electronics and health-monitoring devices.

Global consumer demand for hand-held smart devices is rapidly growing, but because of their fragility, the amount of electronic waste is also increasing. Self-repairing films have been developed, but most only work a single time, and some are made with potentially harmful agents that curtail their use in biomedical applications. Researchers have tried incorporating gelatin in electronic devices because it is transparent, readily available and safe. In tests, however, damaged gelatin film was not restored quickly. Yu-Chi Chang and colleagues wanted to see if they could make a repeatedly self-healing gelatin-based film that would mend cracks in minutes and preserve electrical functionality.

The researchers mixed gelatin and glucose to create a flexible film that they sandwiched between conductive materials to simulate an electronic device. After bending the simulated device, the team saw breaks in the gelatin-glucose film disappear within 3 hours at room temperature and within 10 minutes when warmed to 140 F (60 C). Gelatin without glucose did not self-repair under the same conditions. The gelatin-glucose film also transferred an electrical signal following multiple rounds of damage and repair, with an unexpected improvement in the film's electrical performance. The experiments suggest that glucose and gelatin form reversible, interlocking imine bonds during the healing process. The new film could help maintain the durability of touchscreen and flexible display devices, advanced robotics and assisted health technologies, the researchers say.

The authors acknowledge funding from the Ministry of Science and Technology of Taiwan.

Credit: 
American Chemical Society

Rethinking race and kidney function

Race is not biology. As a social construct, race is an unreliable predictor of physiologic variation and a notoriously unreliable marker for biologic differences across populations.

To reflect this growing realization, hospital systems and professional medical organizations have started reconsidering the use of race in clinical calculators that estimate how well a person's kidneys work. Indeed, some hospital systems have already removed race from these commonly used clinical tools.

But what this move might mean for patients remains unclear.

Now a new study from Harvard Medical School forecasts the effects of this change if implemented nationwide. The results, published Dec. 2 in JAMA, suggest that removing race from kidney function tests might have both advantages and disadvantages for Black people with kidney disease.

The analysis represents the most comprehensive study to date to assess the impact of eliminating race from kidney function formulas. It is intended to help clinicians, healthcare organizations and policymakers understand the implications of such a decision, allocate resources, monitor patients and individualize care. The findings should also help patients understand what the change may mean for them and lead to greater involvement in their own care.

The researchers say that the current way of calculating kidney function by adjusting for race is flawed. However, they also caution that any changes must be implemented with a full understanding of the possible effects.

"The remnants of race-based medicine well into the 21st century expose a historical legacy of crude approaches to using identity in clinical practice," said study senior investigator Arjun Manrai, an assistant professor of biomedical informatics in the Blavatnik Institute at Harvard Medical School. "We must find better ways to individualize care and removing race from clinical algorithms is an important goal. But we must ensure that in doing so we do not inadvertently harm the very individuals we are trying to protect and care for."

Some of the anticipated benefits of dropping the race adjustment include earlier diagnoses, better access to kidney specialists and specialty services, and better care options. The possible downsides include restricting access and eligibility to medications for cardiovascular problems, diabetes, pain control and cancer or dose adjustments for these drugs.

The researchers say that understanding the potential for both benefit and harm is critical to ensuring that Black patients do not face more health inequity than they already do.

"The social and historical contexts of using race in kidney function tests are vital for understanding the clinical implications of removing this variable from the equation," said study first author James Diao, a third-year medical student at Harvard Medical School. "Our findings must be interpreted in light of significant disparities for Black patients resulting from the long history of racism in medicine, as well as data on the accuracy of kidney function equations in disadvantaged groups."

Estimating kidney function

Directly measuring a person's kidney function is cumbersome, time-consuming and inefficient. To circumvent this hurdle, clinicians use a formula to get a numeric score that estimates how well someone's kidneys are working. The formula is based on measuring the blood levels of creatinine, a waste product removed by the kidneys, and then plugging in variables, including a person's age, sex and race (Black versus White/Other). Lower creatinine levels generally mean that the kidneys are removing creatinine faster from the blood, which signals better kidney function.
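One widely used example of such a formula is the CKD-EPI 2009 creatinine equation, which includes an explicit race coefficient. The article does not specify which equation the study modelled, so the sketch below should be read as a representative illustration of how the race term shifts the score, not as the study's exact calculation.

```python
# Sketch of a race-adjusted eGFR calculation using the published CKD-EPI 2009
# creatinine equation as one common example (not necessarily the equation
# analyzed in the study). Result is in mL/min/1.73 m^2.

def egfr_ckd_epi_2009(creatinine_mg_dl, age, female, black, use_race_term=True):
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = creatinine_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black and use_race_term:
        egfr *= 1.159  # the race adjustment under debate
    return egfr


# Same hypothetical patient, with and without the race term: dropping the term
# lowers the estimate by a factor of 1/1.159 (about 14%), which can shift the
# patient into a more severe chronic kidney disease stage.
print(round(egfr_ckd_epi_2009(1.4, 60, female=False, black=True, use_race_term=True), 1))
print(round(egfr_ckd_epi_2009(1.4, 60, female=False, black=True, use_race_term=False), 1))
```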

Why adjust for race to begin with?

Until the 1990s, the standard formula used to estimate kidney function was derived from research in white males. Then, in the 1990s, researchers noticed that Black individuals had faster kidney filtration rates, even at the same creatinine levels as white individuals of the same age and sex. This led scientists to reason that creatinine levels may be naturally higher in Black individuals without compromised kidney function. To avoid overdiagnosis, the thinking went, the kidney formula needed to factor in race as a more accurate estimate of kidney function.

The solution scientists came up with was to include a statistical "adjustment" to the formula based on a person's self-identified or perceived race.

Why Black people have comparatively higher creatinine levels is not well understood. A popular misconception posits that higher creatinine levels are due to higher muscle mass among Black individuals since muscle releases more creatinine. However, Manrai says, the evidence does not support this hypothesis, which can serve to reinforce racial stereotypes.

"In general, medicine needs better, more precise ways to gauge differences across populations if elements of identity are to be incorporated into care," said Manrai, who is assistant professor of pediatrics and a faculty member in the Computational Health Informatics Program at Boston Children's Hospital.

In their study, the Harvard Medical School team analyzed 18 years' worth of data obtained from more than 9,500 Black participants in the National Health and Nutrition Examination Survey, a program of studies designed to assess the health and nutritional status of adults and children in the United States.

To estimate the number and proportion of Black adults whose care would change as a result of eliminating the race adjustment from the current formula, the researchers re-calculated participants' kidney function with and without race.

The analysis showed that, if implemented nationally, removing race as a variable from the formula could result in nearly one million new diagnoses of chronic kidney disease, increasing the proportion of Black people with kidney disease from 14.9 percent to 18.4 percent. It would also lead 1.2 million Black people with kidney disease to be reclassified as having a more advanced form of the condition.

Advantages

The new diagnoses and reclassifications to more severe kidney disease would mean earlier access to specialists, specialized care, and prompt treatment. Diagnosing someone with kidney disease sooner should result in improved care options, more referrals to kidney specialists, broader insurance coverage, and better access to specialty services for kidney care.

Removing race from the formula would lead to a million new diagnoses among Black individuals and a 6.8-percent increase in the number of Black patients referred to kidney specialists. It would also increase the number of patients eligible for specialty services such as medical nutrition therapy and kidney disease education by 9.5 percent and 61.3 percent, respectively.

The reclassification to more severe kidney disease would also mean earlier access to the kidney transplant waiting list. In all, this reclassification would result in a 7.7 percent increase in the number of Black people with kidney disease eligible for a transplant.

Disadvantages

The greater number of individuals with a clinical diagnosis of kidney disease as a result of the modified formula would mean that more people may get recommendations for dose adjustments or contraindications for certain drugs that may either interfere with kidney function or be poorly filtered by the kidneys. These include drugs for cardiovascular illness and hypertension, such as beta blockers and ACE inhibitors; metformin, a first-line drug for type 2 diabetes; newer diabetes medications known as SGLT2 inhibitors; and certain pain medications, such as opioids and nonsteroidal anti-inflammatory drugs like ibuprofen.

If a person's kidney function estimate changes, a physician would be concerned about giving the patient medications that may further erode kidney function or build up to toxic levels in the blood because their kidneys are not filtering fast enough to get rid of the drug. But in the case of lifesaving heart and diabetes medications, the risk-benefit calculus would become complicated: Should a patient with a decreased kidney function under the new formula remain on the same dose of their diabetes medication?

The modified formula may lead to a 54-percent increase in the number of Black individuals who get recommendations for dose reductions on ACE inhibitors, drugs commonly used to manage high blood pressure, heart disease and kidney disease.

The analysis also estimated a 28-percent increase in the number of Black individuals who may no longer qualify for the diabetes drugs metformin and SGLT2 inhibitors. These same patients would also no longer qualify for heart medications known as beta blockers, the cancer drug cisplatin, or blood thinners such as warfarin.

Because these therapies could have side effects on the kidneys, those reclassified as having more advanced kidney disease would be considered at high risk for complications from such treatments.

Thus, the researchers caution that taking people off such drugs, or reducing their doses, may exacerbate existing racial disparities in stroke, heart failure and cardiovascular deaths. Reducing the number of people with diabetes receiving metformin or SGLT2 inhibitors, for example, may also worsen already disparate diabetes care outcomes among Black individuals.

What might happen in reality, Manrai said, is that physicians may decide to keep individuals whose kidney scores change on their current medications--particularly if they tolerate them well--and just monitor them more aggressively. The situation may become more complicated when patients who were not previously on such medications suddenly need them because they have developed heart problems or diabetes. In this scenario, physicians may hesitate to prescribe new medications that could interfere with kidney function.

While the formula change may increase the number of Black patients with advanced kidney disease who qualify for a kidney transplant--up to 14,000 if implemented nationwide--it may also render many more Black individuals newly ineligible to donate kidneys--up to 560,000 if implemented nationwide. Researchers caution that fewer Black kidney donors may further limit access to transplantable kidneys for Black individuals with end-stage kidney failure in need of lifesaving transplants. This is because most donated kidneys come from family members, the researchers said. Kidneys from family members tend to be better suited for transplantation based on the matching of immune markers that predict organ compatibility.

The way forward

The findings underscore the urgent need for better and more accurate ways to gauge genetic differences between individuals that go beyond race, an all-too-unreliable construct, the research team said.

Current kidney function calculators must be refined, the researchers said, by removing race while at the same time ensuring that important differences related to kidney function across different populations are not missed. This refinement could be achieved by incorporating new, more reliable biomarkers that capture such variations. Such biomarkers are currently under active investigation and are not yet ready for widespread clinical use, the researchers said.

In the meantime, physicians should ensure transparency with patients whenever they apply race in any of their diagnostic or treatment decisions, the researchers said.

This is particularly important because patients tested in different settings may end up with two different kidney function estimates, a discrepancy that is also bound to confuse the physicians who treat these patients.

"It's critical to have a transparent and open dialogue between the physician and the patient around what aspects of identity are being used to guide their care, and this is much broader than kidney function," Manrai said.

Policymakers and hospital administrators could use the findings of the analysis to help determine how to optimize resource allocation for patient care and planning.

"Hospitals are grappling with this issue right now, and there's a complex set of trade-offs in either scenario. Regardless of which alternative they choose, it is important to be aware of the potential downstream effects," Diao said. "If providers know what changes might happen and how these may affect their patient populations, they can plan and allocate resources accordingly."

Credit: 
Harvard Medical School

Sensors for a 'smart' wound bandage may track healing, immune response: Study

image: Electrochemical Detection in Wound Healing

Image: 
Olja Simoska et al./ACS Sensors

Researchers from Skoltech and the University of Texas at Austin have presented a proof-of-concept for a wearable sensor that can track healing in sores, ulcers, and other kinds of chronic skin wounds, even without the need to remove the bandages. The paper was published in the journal ACS Sensors.

Chronic wounds that fail to heal quickly, such as diabetic foot ulcers or pressure ulcers, can be very tricky to manage for healthcare professionals and a nightmare for patients. To monitor the healing process and assess the need for treatment, doctors and nurses normally need to remove the bandages from a wound, which damages the recovering tissue, often hurts the patient, and requires hospital visits, not least to avoid introducing further infection. Furthermore, if a wound requires more than just visual inspection, the available options include tissue biopsies, surface swabs, or testing for pathogens -- invasive and costly procedures that can take days and still fail to produce useful treatment directions.

That is why 'smart' bandages, essentially wearable sensors that can monitor certain biomarkers during the healing process, have captured the attention of medical engineers. In the new study, the Russia-US team, led by Skoltech provost, Professor Keith Stevenson, explored electroanalytical methods that, thanks to their relative simplicity, sensitivity, durability, and other attractive characteristics, are particularly promising for clinical applications.

"Earlier stages of our research involved characterizing the sensor performance and demonstrating the sensitive and selective multianalyte detection in complex biofluid simulants that closely mimic real biological environments," Stevenson said.

For the new study, the researchers built an early prototype of an electroanalytical wound sensor based on carbon ultramicroelectrode arrays (CUAs) on flexible substrates. In previous studies, this sensor had been placed on quartz substrates, but to ensure flexibility, the authors developed a method of putting the arrays on a polyethylene terephthalate (PET) substrate.

The team used a simulated wound environment to test the sensitivity of their sensor to three critical biomarkers: pyocyanin, produced by Pseudomonas aeruginosa, a bacterium that typically colonizes chronic wounds; nitric oxide (NO*), secreted by cells of the immune system in response to bacterial infections; and uric acid, a metabolite that strongly correlates with the severity of a wound. All these compounds are electroactive: they can be oxidized or reduced at an electrode, producing a current that an electroanalytical sensor can measure.

Testing showed that both the sensor's limits of detection and its linear dynamic ranges -- the ranges of concentrations over which a sensor produces meaningful quantitative results -- fell within biologically relevant concentrations, meaning a device based on these sensors could be used for wound monitoring in a clinical setting. The researchers also tested their sensor in cell cultures, where it successfully detected pyocyanin from P. aeruginosa and NO* from macrophages (immune cells that destroy bacteria and other 'invaders'). Finally, the sensor was also able to detect the effect of Ag+ silver ions, a known antimicrobial agent, which suppressed pyocyanin production by the bacteria.
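As a rough illustration of how such figures are typically obtained, the sketch below fits a straight-line calibration in the linear region and estimates a limit of detection with the standard three-sigma rule (three times the standard deviation of blank measurements divided by the calibration slope). This is the generic analytical-chemistry recipe rather than the paper's exact procedure, and the concentrations and currents are invented for illustration.

```python
# Sketch: how a limit of detection is typically estimated from calibration data.
# This is the generic 3-sigma recipe from analytical chemistry, not the paper's
# exact procedure; the concentrations and currents below are made-up numbers.
import numpy as np

# Calibration points in the linear dynamic range: analyte concentration (uM)
# versus measured electrode current (nA).
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])
current = np.array([0.15, 1.1, 2.2, 5.4, 10.8, 21.5])

slope, intercept = np.polyfit(conc, current, 1)  # linear calibration fit

blank_replicates = np.array([0.12, 0.18, 0.14, 0.16, 0.15])  # repeated blank readings
sigma_blank = blank_replicates.std(ddof=1)

lod = 3 * sigma_blank / slope  # limit of detection, in the same units as conc
print(f"sensitivity: {slope:.2f} nA/uM, LOD ~ {lod:.3f} uM")
```

A biomarker is then quantifiable in practice only when its expected concentration in the wound falls above this limit and inside the fitted linear range, which is the comparison the study reports.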

"The next step is to utilize this sensor technology for in vivo studies and real-time monitoring of wound treatment effectiveness on human subjects in clinical settings," Professor Keith Stevenson notes.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Brexit opens the door to tougher anti-smoking measures

Brexit offers the UK opportunities to strengthen its world-leading tobacco control measures, by creating greater flexibility to respond to industry action and market developments, according to new research from the University of Bath.

The UK is currently bound by EU rules, but will enjoy greater freedom to adopt types of tobacco tax that are more effective at lifting the price of cheap tobacco products, as well as more direct pricing policies such as minimum prices or the imposition of price caps. Higher prices are one of the most effective tobacco control measures.

Furthermore, with 96 per cent of UK tobacco products originating from the EU in recent years, a no-deal Brexit is likely to raise cigarette and tobacco prices. HM Treasury has committed to applying new UK import tariffs on tobacco from 1 January next year which, if passed on to consumers, would increase the average price of a typical 20-pack of cigarettes by around 30 pence and a 30g pouch of roll-your-own tobacco by £1.77.

"Since higher prices are one of the most effective tobacco control measures, this might be a rare positive from having to trade with the EU on WTO terms. Brexit offers the chance to improve public health in the UK, but equally poses a threat to public health if rules are relaxed," said Dr Rob Branston of the University's School of Management, lead author of the study 'What does Brexit mean for UK tobacco control?' newly published in International Journal of Drug Policy.

The benefits of Brexit-related flexibility will not extend to Northern Ireland, which will be considered part of the EU customs union, following EU rules, and where trade with the EU will be tariff-free.

Northern Ireland will also retain the current photo health warning labels on tobacco packaging, whereas the remainder of the UK will switch to using Australian imagery.

An additional potential benefit of Brexit for tobacco control is an end to the import of cheap EU duty-paid tobacco and reduced smuggling due to tougher border checks. The allowance for travellers from the EU will fall from the current 800 cigarettes, or 1kg of roll-your-own tobacco, to a duty-free allowance of 200 cigarettes or 250g of tobacco.

"Duty-free allowances, tariffs, or regulatory differences will require customs checks at the UK border. Such checks might have the benefit of reducing the rate of illicit tobacco in the UK, boosting revenue to the UK government, and the effectiveness of UK tobacco tax policy," Branston said. He estimates higher tariffs would raise in the region of an additional £820 million per annum for the government.

However, the authors assert that the potential benefits of Brexit for UK tobacco control will only come to fruition if the government seizes the opportunity by continuing to prioritise policies that address tobacco harms. Dr Allen Gallagher, from the University of Bath's Department for Health, notes that senior members of the current UK government have links to neoliberal and free-market 'think tanks' such as the tobacco-industry-funded Institute of Economic Affairs, links that risk leading the administration to de-prioritise tobacco control.

"Ultimately, even if good trade agreements are reached, the benefits of Brexit for tobacco regulation will only be realised if there is the political intent to capitalise on the newly gained flexibility. There is a risk that this government's prioritisation of business interests means that the negative health impacts of tobacco will be less of a priority for the government post-Brexit and that tobacco regulation in the UK will suffer as a result." he said.

Co-author Deborah Arnott, Chief Executive of Action on Smoking and Health (ASH), said: "With 14.1% of the UK population smoking as of 2019, tackling tobacco use must remain a public health priority if government aims to make our country 'smokefree' in the next decade are to become a reality.

"With the COVID-19 pandemic occupying most current health-related attention, it could easily be overlooked that smoking remains the leading cause of preventable death in the UK, and causes more deaths each and every year than the pandemic has to date."

Credit: 
University of Bath

Oddly satisfying metamaterials store energy in their skin

video: When you press the dimpled circles on a fountain drink lid, they become either convex or concave. This demonstrates bistability: materials or structures that have two stable states. Andres Arrieta has demonstrated a patterned sheet of these domes that forms an energy-storing skin, strong enough to perform mechanical tasks, and even programmable to store and process data like a mechanical computer.

Image: 
Purdue University/Jared Pike

WEST LAFAYETTE, Ind. -- When you press the dimpled circles on a fountain drink lid, they become either convex or concave. Materials or structures that have two stable states demonstrate a concept called bistability.

A Purdue team has demonstrated that a patterned sheet of these domes will form an energy-storing skin: strong enough to perform mechanical tasks, and even programmable to store and process data like a mechanical computer.

"Bistability is an important concept found in nature," said Andres Arrieta, a Purdue assistant professor of mechanical engineering. "Earwigs, for example, have bistable, foldable wings that snap to an open state with very little energy. We are working to make programmable structures inspired from this bistability."

Arrieta's team began with a simple structure: a flat, one-inch square sheet with a pop-up dome, 3D printed from thermoplastic polyurethane. When pressed with a finger, the dome would snap to become either convex or concave. When they printed a 3-by-3 grid of these domes, they began to witness new behaviors.

"When you invert two domes that are close to another, they start interacting," Arrieta said. "And when you start making patterns of these domes on a sheet, the sheet itself begins to curve globally. Depending on which domes are inverted, you get different shapes."

They began experimenting with larger grids, pressing more complex patterns. By actuating certain domes (pressing them in or out), the sheet formed into a cylinder, a star, or a saddle shape. "These individual bistable domes are combining to form a new metamaterial, which itself has multiple stable states," Arrieta said. "We call it hierarchical multistability."

What can you do with hierarchical multistability? As a demonstration, Arrieta's team built a simple robotic gripper, using two lines of domes. When the domes were concave, the gripper arms stayed open. But by applying a small amount of air pressure to actuate the domes to become convex, the gripper arms closed tight enough to grasp and hold a small weight.

"Grasping something is easy, but maintaining that grasp requires constantly expending energy," Arrieta said. "This is true for both humans and machines. But what's interesting about this gripper is that when we invert the domes that make the gripper close, we are actually storing energy in the skin. The gripper arms are using that energy to maintain its grasp, rather than requiring some external energy source. In essence, we are using the structure itself as a mechanical battery."

The work has been published in the journals Extreme Mechanics Letters and Advanced Science.

Eventually, Arrieta hopes to utilize the technology in flexible robotics. "It's difficult to re-create a robotic hand that grips like a human hand because of the sheer number of motors and sensors required," he said. "But if we made the skin from these sheets and printed individual domes at different heights, then only a specific group of them would actuate at different levels of air pressure. By programming specific bursts of air pressure, we could activate a matrix of domes to create the multistable gripping states we need, with minimal energy."

In addition to varying the height attributes, Arrieta's team discovered that actuating the domes in a different order produces radically different shapes. With these three attributes - height, position, and order - even small grids of domes can produce a remarkable number of possible outcomes. This leads Arrieta to the next step: mechanical computing.

"When you think about it, these up-and-down domes are a lot like the 1s and 0s of computer data," Arrieta said. "We can imagine 'programming' a sheet like this by pressing the domes in certain locations in a certain order, and then 'reading' that data mechanically based on the shape of the sheet. This can be done without power or a central processor of any kind. In this way, future machines will function much more like animals, which use mechanical sensing and processing to react much more quickly."

While a great deal of mathematics and computer simulations contribute to this research, simply holding the patterned sheet of domes in your hands is a powerful, and oddly satisfying, investigative tool. "We didn't really understand how interesting it was until we started playing with it physically," Arrieta said. "When it comes to these domes, the whole is definitely greater than the sum of its parts."

Credit: 
Purdue University