Tech

Old carbon reservoirs unlikely to cause massive greenhouse gas release

image: A thin section of an ice core collected at Taylor Glacier in Antarctica. The ice core samples contain tiny air bubbles with small quantities of ancient air trapped inside. The Rochester research focused on air from the time of Earth's last deglaciation, 8,000-15,000 years ago, a period that is a partial analog to today.

Image: 
University of Rochester photo / Vasilii Petrenko

Permafrost in the soil and methane hydrates deep in the ocean are large reservoirs of ancient carbon. As soil and ocean temperatures rise, the reservoirs have the potential to break down, releasing enormous quantities of the potent greenhouse gas methane. But would this methane actually make it to the atmosphere?

Researchers at the University of Rochester--including Michael Dyonisius, a graduate student in the lab of Vasilii Petrenko, professor of earth and environmental sciences--and their collaborators studied methane emissions from a period in Earth's history partly analogous to the warming of Earth today. Their research, published in Science, indicates that even if methane is released from these large natural stores in response to warming, very little actually reaches the atmosphere.

"One of our take-home points is that we need to be more concerned about the anthropogenic emissions--those originating from human activities--than the natural feedbacks," Dyonisius says.

WHAT ARE METHANE HYDRATES AND PERMAFROST?

When plants die, they decompose into carbon-based organic matter in the soil. In extremely cold conditions, the carbon in the organic matter freezes and becomes trapped instead of being emitted into the atmosphere. This forms permafrost, soil that has been continuously frozen--even during the summer--for more than one year. Permafrost is mostly found on land, mainly in Siberia, Alaska, and Northern Canada.

Along with organic carbon, there is also an abundance of water ice in permafrost. When the permafrost thaws in rising temperatures, the ice melts and the underlying soil becomes waterlogged, helping to create low-oxygen conditions--the perfect environment for microbes in the soil to consume the carbon and produce methane.

Methane hydrates, on the other hand, are mostly found in ocean sediments along the continental margins. In methane hydrates, cages of water molecules trap methane molecules inside. Methane hydrates can only form under high pressures and low temperatures, so they are mainly found deep in the ocean. If ocean temperatures rise, so will the temperature of the ocean sediments where the methane hydrates are located. The hydrates will then destabilize, fall apart, and release the methane gas.

"If even a fraction of that destabilizes rapidly and that methane is transferred to the atmosphere, we would have a huge greenhouse impact because methane is such a potent greenhouse gas," Petrenko says. "The concern really has to do with releasing a truly massive amount of carbon from these stocks into the atmosphere as the climate continues to warm."

GATHERING DATA FROM ICE CORES

In order to determine how much methane from ancient carbon deposits might be released to the atmosphere in warming conditions, Dyonisius and his colleagues turned to patterns in Earth's past. They drilled and collected ice cores from Taylor Glacier in Antarctica. The ice core samples act like time capsules: they contain tiny air bubbles with small quantities of ancient air trapped inside. The researchers use a melting chamber to extract the ancient air from the bubbles and then study its chemical composition.

Dyonisius's research focused on measuring the composition of air from the time of Earth's last deglaciation, 8,000-15,000 years ago.

"The time period is a partial analog to today, when Earth went from a cold state to a warmer state," Dyonisius says. "But during the last deglaciation, the change was natural. Now the change is driven by human activity, and we're going from a warm state to an even warmer state."

Analyzing the carbon-14 isotope of methane in the samples, the group found that methane emissions from the ancient carbon reservoirs were small. Thus, Dyonisius concludes, "the likelihood of these old carbon reservoirs destabilizing and creating a large positive warming feedback in the present day is also low."
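For background (standard radiocarbon physics, not a result of the study itself): carbon-14 decays with a half-life of about 5,730 years, so methane produced from carbon that is tens of thousands of years old, as in permafrost or hydrates, is essentially "radiocarbon dead," while methane from recently formed carbon still carries carbon-14. In LaTeX form,

    N(t) = N_0 \, 2^{-t/t_{1/2}}, \qquad t_{1/2} \approx 5{,}730\ \text{yr}, \qquad \frac{N(50{,}000\ \text{yr})}{N_0} = 2^{-50{,}000/5{,}730} \approx 0.2\%

which is why the carbon-14 content of the ancient air lets the team apportion methane between old and contemporary sources.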

Dyonisius and his collaborators also concluded that the methane released does not reach the atmosphere in large quantities. The researchers believe this is due to several natural "buffers."

BUFFERS PROTECT AGAINST RELEASE TO THE ATMOSPHERE

In the case of methane hydrates, if the methane is released in the deep ocean, most of it is dissolved and oxidized by ocean microbes before it ever reaches the atmosphere. If the methane in permafrost forms deep enough in the soil, it may be oxidized by bacteria that eat the methane, or the carbon in the permafrost may never turn into methane and may instead be released as carbon dioxide.

"It seems like whatever natural buffers are in place are ensuring there's not much methane that gets released," Petrenko says.

The data also shows that methane emissions from wetlands increased in response to climate change during the last deglaciation, and it is likely wetland emissions will increase as the world continues to warm today.

Even so, Petrenko says, "anthropogenic methane emissions currently are larger than wetland emissions by a factor of about two, and our data shows we don't need to be as concerned about large methane releases from large carbon reservoirs in response to future warming; we should be more concerned about methane released from human activities."

Credit: 
University of Rochester

Colorado river flow dwindles due to loss of reflective snowpack

Due to the disappearance of its sunlight-reflecting seasonal snowpack, the Colorado River Basin is losing more water to evaporation than can be replaced by precipitation, researchers report. The study resolves a longstanding disagreement in previous estimates of the river's sensitivity to rising temperatures and identifies a growing potential for severe water shortages in this major basin.

The Colorado River, which supplies water to roughly 40 million people and supports more than $1 trillion of economic activity each year, is dwindling. As with many rivers feeding water-stressed regions across the globe, increased drought and warming have been shrinking the Colorado River's flow for years. However, the sensitivity and response of river discharge to climate warming remain poorly understood. As a result, critical projections of water availability under future climate-warming scenarios are highly uncertain.

Unlike previous modeling efforts, the hydrologic model developed by Christopher Milly and Krista Dunne accounts for the balance of energy between incoming solar radiation and the albedo of snowy surfaces. Bright snow and ice have high albedo, meaning they reflect solar energy back into space. Combined with historical measurements, Milly and Dunne demonstrate how snow loss in the Colorado River Basin due to climate warming has resulted in the absorption of more solar energy, boosting evaporation of water throughout the basin. The authors estimate that this albedo-loss-driven drying has reduced runoff by 9.5% per degree Celsius of warming. The results suggest that this drying will outpace the precipitation increases projected for a warmer future.
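For a rough sense of scale, the sketch below applies the reported sensitivity of about 9.5% runoff loss per degree Celsius as a simple linear extrapolation in Python; the warming levels and the linear treatment are illustrative assumptions made here, not the authors' model.

    # Back-of-envelope use of the reported sensitivity: ~9.5% less runoff per
    # 1 degree C of warming in the Colorado River Basin. Linear scaling and the
    # warming levels below are assumptions for illustration only.
    SENSITIVITY = 0.095  # fractional runoff loss per degree Celsius

    def runoff_loss(warming_c: float) -> float:
        """Estimated fractional reduction in annual runoff for a given warming."""
        return SENSITIVITY * warming_c

    for warming in (1.0, 1.5, 2.0, 3.0):  # hypothetical warming levels, degrees C
        print(f"{warming:.1f} C warming -> ~{runoff_loss(warming):.1%} less runoff")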

Credit: 
American Association for the Advancement of Science (AAAS)

Old methane sources less important in modern climate warming

The atmospheric release of ancient stores of methane in thawing permafrost or from beneath Arctic ice may not impact future climate warming as strongly as previously believed, a new study finds. Rather, emissions of the greenhouse gas from current activities are more important for our immediate future.

Methane (CH4) is a potent greenhouse gas with a global warming potential many times that of carbon dioxide. Currently, natural emissions account for nearly 40% of total CH4 emissions. However, vast quantities of old and cold CH4 are locked in climate-sensitive reservoirs such as permafrost and as hydrates beneath ice sheets. As the Earth continues to warm, these thousands-of-years-old stores have the potential to be released into the atmosphere and trigger further abrupt warming. But the sensitivity and overall impact of these sources remain unresolved, adding uncertainty to climate change predictions.

Michael Dyonisius and colleagues investigated the climatic contributions of these old carbon sources during the last deglaciation - a period roughly analogous to our modern warming climate. Using tiny bubbles of air trapped inside Antarctic ice cores, Dyonisius et al. measured the old, or "radiocarbon dead," CH4 in the atmosphere through the end of the last ice age. The authors found that CH4 emissions from old carbon reservoirs were small; instead, the vast majority of atmospheric CH4 during this period came from contemporary sources, such as decomposition or burning of new organic material. The results suggest that old CH4 emissions in response to future warming will likely not be as large as others have suggested.

Intriguingly, the authors found that CH4 emissions from biomass burning in the pre-industrial Holocene were comparable to those today. The findings suggest an underestimation of present-day CH4 or a yet unknown two-way anthropogenic influence on modern fire activity. In a related Perspective, Joshua Dean discusses the findings in greater detail.

Credit: 
American Association for the Advancement of Science (AAAS)

Columbia researchers develop new method to isolate atomic sheets and create new materials

image: This image shows atomically thin semiconductor wafers (MoS2 monolayers, lateral dimension of each panel ~ 1cm, wafer thickness of only ~0.7 nm). We obtained these monolayers from layer-by-layer exfoliation of a MoS2 single crystal using the gold tape method. The images have been Photoshop-processed for artistic appeal.

Image: 
Fang Liu, Qiuyang Li, Andrew Schlaus, Wenjing Wu, Yusong Bai, and Kihong Lee/Columbia University

New York, NY--February 20, 2020--Two-dimensional materials from layered van der Waals (vdW) crystals hold great promise for electronic, optoelectronic, and quantum devices, but manufacturing them has been limited by the lack of high-throughput techniques for exfoliating single-crystal monolayers with sufficient size and high quality. Columbia University researchers report today in Science that they have invented a new method--using ultraflat gold films--to disassemble vdW single crystals layer by layer into monolayers with near-unity yield and with dimensions limited only by bulk crystal sizes.

The monolayers generated using this technique have the same high quality as those created by conventional "Scotch tape" exfoliation, but are roughly a million times larger. The monolayers can be assembled into macroscopic artificial structures, with properties not easily created in conventionally grown bulk crystals. For instance, layers of molybdenum disulfide can be aligned with each other so that the resulting stack lacks mirror-symmetry and as a result demonstrates strongly nonlinear optical response, where it absorbs red light and emits ultraviolet light, a process known as second harmonic generation.
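For context, second harmonic generation combines two photons of frequency ω into a single photon at twice the frequency, halving the wavelength; the 700 nm value below is an illustrative red wavelength rather than a figure from the paper:

    \omega + \omega \rightarrow 2\omega, \qquad \lambda_{\text{SHG}} = \frac{\lambda_{\text{in}}}{2} \quad (\text{e.g., } 700\ \text{nm} \rightarrow 350\ \text{nm})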

"This approach takes us one step closer to mass production of macroscopic monolayers and bulk-like artificial materials with controllable properties," says co-PI James Hone, Wang Fong-Jen Professor of Mechanical Engineering at Columbia Engineering.

The discovery 15 years ago that single atomic sheets of carbon--graphene--could be easily separated from bulk crystals of graphite and studied as perfect 2D materials was recognized with the 2010 Nobel prize in physics. Since then, researchers worldwide have studied properties and applications of a wide variety of 2D materials, and learned how to combine these layers into stacked heterostructures that are essentially new hybrid materials themselves. The original scotch tape method developed for graphene, which uses an adhesive polymer to pull apart crystals, is easy to implement but is not well-controlled and produces 2D sheets of limited size--typically tens of micrometers across, or the size of a cross-section of a single strand of hair.

A major challenge for the field and future manufacturing is how to scale up this process to much larger sizes in a deterministic process that produces 2D sheets on demand. The dominant approach to scaling up the production of 2D materials has been the growth of thin films, which has yielded great successes but still faces challenges in material quality, reproducibility, and the temperatures required. Other research groups pioneered the use of gold to exfoliate large 2D sheets, but have used approaches that either leave the 2D sheets on gold substrates or involve intermediate steps of evaporating hot gold atoms that damage the 2D materials.

"In our study, we were inspired by the semiconductor industry, which makes the ultrapure silicon wafers used for computer chips by growing large single crystals and slicing them into thin disks," says the lead PI Xiaoyang Zhu, Howard Family Professor of Nanoscience in Columbia's department of chemistry. "Our approach does this on the atomic scale: we start with a high-purity crystal of a layered material and peel off one layer at a time, achieving high-purity 2D sheets that are the same dimensions as the parent crystal."

The researchers took their cue from the Nobel prize-winning scotch tape method and developed an ultraflat gold tape instead of the adhesive polymer tape. The atomically flat gold surface adheres strongly and uniformly to the crystalline surface of a 2D material and disassembles it layer by layer. The layers are the same size and dimension as the original crystal--providing a degree of control far beyond what is achievable using scotch tape.

"The gold tape method is sufficiently gentle that the resulting flakes have the same quality as those made by scotch tape technique," says postdoctoral scholar Fang Liu, the lead author on the paper. "And what is especially exciting is that we can stack these atomically thin wafers in any desired order and orientation to generate a whole new class of artificial materials."

The work was carried out in the Center for Precision Assembly of Superstratic and Superatomic Solids, a Materials Science and Engineering Research Center funded by the National Science Foundation and led by Hone. The research project used shared facilities operated by the Columbia Nano Initiative.

Motivated by recent exciting advances in "twistronics," the team is now exploring adding small rotation between layers in these artificial materials. In doing so, they hope to achieve on a macro-scale the remarkable control over quantum properties such as superconductivity that have recently been demonstrated in micrometer-sized flakes. They are also working to broaden their new technique into a general method for all types of layered materials, and looking at potential robotic automation for large scale manufacturing and commercialization.

Credit: 
Columbia University School of Engineering and Applied Science

Study shows UV technology raises the standard in disinfecting ORs and medical equipment

Ultraviolet (UV) technology developed by the New York-based firm PurpleSun Inc. eliminates more than 96 percent of pathogens in operating rooms (ORs) and on medical equipment, compared to 38 percent using manual cleaning methods that rely on chemicals to disinfect surfaces, according to a study published this month in the American Journal of Infection Control (AJIC).

Health care is in a constant battle to maintain a clean and pathogen-free patient care environment. To improve quality and reduce risk, hospitals use established protocols for cleaning and disinfecting ORs and medical equipment with chemical wipes after each case. Medical equipment ranges from surgical robots, microscopes and scanners to patient beds and stretchers.

"The challenge from an infection control standpoint is that microbes and bacteria are invisible to the human eye, and there is potential for errors in the manual cleaning and disinfection process, whether it's attributable to inadequate staffing, poor training, lack of adherence to manufacturers' instructions or other human error," said Donna Armellino, RN, DNP, vice president of infection prevention at Northwell Health, who was the lead author on this study. Among other factors, for instance, the disinfectants may have the wrong dilution, be incompatible with the materials used to clean, may not be in contact with the equipment long enough, or the chemicals may be improperly stored, reducing their pathogen-destroying effectiveness, she said.

To assess the current standard of cleaning and disinfection in ORs, Northwell clinicians teamed with PurpleSun, which developed a focused multivector, ultraviolet (FMUV) device that is used to supplement manual cleaning. FMUV takes 90 seconds to fully disinfect surfaces. They evaluated the current standard of cleaning in the OR with and without the use of FMUV.

As part of the study, researchers assessed pathogen presence by testing equipment surfaces, with results reported in colony-forming units (CFUs). CFUs represent pathogens that could potentially increase the risk of a hospital-acquired infection. Testing was done before manual cleaning, after manual-chemical cleaning and disinfection, and after the automated FMUV light treatment, using a five-point assessment technique. Compared with pre-cleaning levels, the aggregate CFUs showed that manual-chemical disinfection was 38 percent effective at killing pathogens, whereas the process using FMUV was 96.5 percent effective at reducing reported CFUs.
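As a simple illustration of how such effectiveness figures are calculated, the Python sketch below applies the standard percent-reduction formula to hypothetical colony counts chosen to reproduce the reported percentages; the counts themselves are not data from the study.

    # Percent reduction in colony-forming units (CFUs) relative to the
    # pre-cleaning baseline. The counts below are hypothetical.
    def percent_reduction(pre_cfu: float, post_cfu: float) -> float:
        return 100.0 * (pre_cfu - post_cfu) / pre_cfu

    pre = 200           # hypothetical aggregate CFUs before cleaning
    after_manual = 124  # hypothetical CFUs after manual-chemical disinfection
    after_fmuv = 7      # hypothetical CFUs after adding FMUV treatment

    print(f"Manual-chemical cleaning: {percent_reduction(pre, after_manual):.1f}% reduction")
    print(f"With FMUV:                {percent_reduction(pre, after_fmuv):.1f}% reduction")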

"The study supports the fact that operating rooms are clean, but not as clean as we'd like following manual chemical cleaning and disinfection. FMUV has the potential for changing the cleanliness of operating rooms," said Dr. Armellino. Her co-authors on the study were Kristine Goldstein, RN, and Linti Thomas, RN, of Northern Westchester Hospital; and Thomas J. Walsh, MD, and Vidmantas Petraitis, MD, both of Weill Cornell Medicine of Cornell University.

Credit: 
Northwell Health

Wearable brain stimulation could safely improve motor function after stroke

LOS ANGELES, Feb. 20, 2020 -- A non-invasive, wearable, magnetic brain stimulation device could improve motor function in stroke patients, according to preliminary late breaking science presented today at the American Stroke Association's International Stroke Conference 2020. The conference, Feb. 19-21 in Los Angeles, is a world premier meeting for researchers and clinicians dedicated to the science of stroke and brain health.

In an initial, randomized, double-blind, sham-controlled clinical trial of 30 chronic ischemic stroke survivors, a new wearable multifocal transcranial rotating permanent magnet stimulator, or TRPMS, produced significant increases in physiological brain activity in areas near the injured brain, as measured by functional MRI.

"The robustness of the increase in physiological brain activity was surprising. With only 30 subjects, a statistically significant change was seen in brain activity," said lead study author David Chiu, M.D., director of the Eddy Scurlock Stroke Center at Houston Methodist Hospital in Texas. "If confirmed in a larger multicenter trial, the results would have enormous implications. This technology would be the first proven treatment for recovery of motor function after chronic ischemic stroke."

Magnetic stimulation of the brain has previously been investigated as a way to promote recovery of motor function after stroke. The stimulation may change neural activity and induce reorganization of circuits in the brain. In this study, researchers introduced a new wearable stimulator.

Stroke survivors who had weakness on one side of their body at least three months post-stroke were enrolled in a preliminary study to evaluate safety and efficacy of the device. Half of the patients were treated with brain stimulation administered in twenty 40-minute sessions over four weeks. The rest had sham, or mock, treatment. Researchers analyzed physiologic brain activity before, immediately after and one month after treatment.

They found that treatment was well tolerated, and there were no device-related complications. Active treatment produced significantly greater increases in brain activity: nearly 9 times higher than the sham treatment.

Although the study could not prove that the transcranial stimulator improved motor function, numerical improvements were demonstrated in five of six clinical scales of motor function. The scales measured gait velocity, grip strength, pinch strength, and other motor functions of the arm. The treatment effects persisted over a three-month follow-up.

The researchers believe the study results are a signal of possible improved clinical motor function after magnetic brain stimulation for patients after stroke, which will need to be confirmed in a larger, multicenter trial.

Credit: 
American Heart Association

Newly found bacteria fights climate change, soil pollutants

image: Undergraduate research assistant David Karasz prepares cultures of Paraburkholderia for scanning electron microscopy to identify cellular structures involved in chain formation.

Image: 
Allison Usavage, Cornell University

ITHACA, N.Y. - Cornell University researchers have found a new species of soil bacteria that is particularly adept at breaking down organic matter, including the cancer-causing chemicals that are released when coal, gas, oil and refuse are burned.

Dan Buckley, professor of microbial ecology, and five other Cornell researchers, along with colleagues from Lycoming College, described the new bacterium in a paper published in the International Journal of Systematic and Evolutionary Microbiology.

The newly discovered bacteria belong to the genus Paraburkholderia, which are known for their ability to degrade aromatic compounds and, in some species, the capacity to form root nodules that fix atmospheric nitrogen.

The first step was sequencing the bacterium's ribosomal RNA genes, which provided genetic evidence that the new bacterium, Paraburkholderia madseniana, was a unique species. In studying the new bacteria, the researchers noticed that madseniana is especially adept at breaking down aromatic hydrocarbons, which make up lignin, a major component of plant biomass and soil organic matter. Aromatic hydrocarbons are also found in toxic polycyclic aromatic hydrocarbon (PAH) pollution.

This means that the newly identified bacteria could be a candidate for biodegradation research and an important player in the soil carbon cycle.

In the case of madseniana, Buckley's lab wants to learn more about the symbiotic relationship between the bacteria and forest trees. Initial research suggests that trees feed carbon to the bacteria, and in turn the bacteria degrade soil organic matter, thereby releasing nutrients such as nitrogen and phosphorus for the trees.

Understanding how bacteria break down carbon in soil could hold the key to soil sustainability and to predicting the future of the global climate.

Credit: 
Cornell University

Ancient gut microbiomes shed light on human evolution

The microbiome of our ancestors might have been more important for human evolution than previously thought, according to a new study published in Frontiers in Ecology and Evolution. An adaptive gut microbiome could have been critical for human dispersal, allowing our ancestors to survive in new geographic areas.

"In this paper, we begin to consider what the microbiomes of our ancestors might have been like and how they might have changed," says Rob Dunn of the North Carolina State University in the United States. "Such changes aren't always bad and yet medicine, diet, and much else makes more sense in light of a better understanding of the microbes that were part of the daily lives of our ancestors."

These adaptive microbiomes might have been critical for human success in a range of different environments. By using data from previously published studies to compare microbiota among humans, apes and other non-human primates, the interdisciplinary team of researchers found that there is substantial variation in composition and function of the human microbiome which correlates with geography and lifestyle. This suggests that the human gut microbiome adapted quickly to new environmental conditions.

When our ancestors walked into new geographic areas, they confronted new food choices and diseases and used a variety of different tools to obtain and process food. Their adaptive microbiome made it possible to digest or detoxify the foods they were eating in a local region and increased our ancestors' ability to endure new diseases. As such, microbial adaptation facilitated human success in a range of environments, allowing us to spread around the world.

Importantly, the social sharing of microbes might have led to local microbial adaptations. Yet our ancestors did not just share their microbiota with each other; they also outsourced them into their food.

For instance, with fermentation, the researchers posit that ancient humans "extended" their guts outside of their bodies by co-opting body microbes to allow digestion to begin externally when food was fermented. This allowed humans to store food and stay in one place for a longer time, facilitating the persistence of larger groups living together. When these groups consumed the food items together, the microbes re-inoculated the consumers and the group's microbiota became more similar to each other than to individuals from other groups.

"We outsourced our body microbes into our foods. That could well be the most important tool we ever invented. But it is a hard tool to see in the past and so we don't talk about it much," says Dunn. "Stone artifacts preserve but fish or beer fermented in a hole in the ground doesn't".

The results of this study are limited to hypotheses that still need to be tested by paleoanthropologists, medical researchers, ecologists and other professionals. "We are hoping the findings will change some questions and that other researchers will study the consequences of changes in the human microbiome," says Dunn. "Hopefully the next decade will see more focus on microbes in our past and less on sharp rocks."

Credit: 
Frontiers

Count me out of counting seeds

image: When seeds are hand counted, they are grouped on graph paper so they are easier to total. Hand counting can be a tedious process.

Image: 
Matthew Bertucci

One, two, three, four, five. One, two, three, four, five. Over and over and over. That's the dull routine of any researcher or student tasked with counting weed seeds. But just like technology has made many things in life faster and easier, relief may be coming for seed counters as well.

A team of researchers at the University of Arkansas, Auburn University and North Carolina State University set out to see if a piece of technology, called a computerized particle analyzer, can be used to count weed seeds. Their results are promising and show the analyzer accurately counts seeds.

Matthew Bertucci, a scientist at Arkansas, says seed counting is a tiring yet important fact of life in their research.

"Many weed control strategies or studies evaluate the ability of an herbicide to kill weeds and elevate crop yields," he explains. "However, weeds that have been stunted but not killed may still produce viable seed that can contribute to weed populations in later years. Monitoring weed seed production then becomes important to evaluating weed control efforts."

The group focused on the seeds from one weed, called Palmer amaranth. It's a summer annual weed especially troubling for row and vegetable crops. While native to the southwestern United States, it has spread from coast to coast.

Historically, these seeds have been counted by tediously hand counting a subset of the sample. Researchers then used that information to estimate larger quantities of seed.

To accurately hand count and double check seed numbers for this research, Bertucci and his team counted five seeds at a time and placed them in groups on graph paper.

While image-analysis methods like the computerized particle analyzer are available, their accuracy had not been scientifically validated. Many researchers still use hand counting methods. Bertucci wanted to test and report whether this imaging technology is accurate.

"Seed production is an important part of weed control," he says. "But there are few papers that consider weed seed production as part of a successful weed control strategy. It was my hope that by reporting on an automated seed counting alternative, more researchers might be willing to measure seed production when evaluating weed control."

The computerized particle analyzer is a table-top lab instrument. It has a long, fast-moving conveyor belt that sends the seeds across a camera. As the seeds are pushed past the light source of the camera, they cast a shadow that is recorded and counted. The camera takes many images per second. Special software analyzes the images to distinguish between seeds and non-seed material for counting.

About 2,000 to 2,500 seeds can be hand-counted and checked in half an hour. However, the computerized method can count that same amount in roughly 14 seconds. Accounting for the time to load samples and clean the instrument, it can take just three minutes from start to finish.
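To put those timings in perspective, the short Python calculation below compares counting rates; the 2,250-seed figure is simply the midpoint of the range quoted above, used here for illustration.

    # Rough throughput comparison based on the timings quoted above.
    seeds = 2250                   # midpoint of the 2,000-2,500 range
    hand_minutes = 30.0            # hand counting and checking
    machine_count_seconds = 14.0   # counting pass on the particle analyzer
    machine_total_minutes = 3.0    # including loading samples and cleaning

    hand_rate = seeds / hand_minutes                   # ~75 seeds per minute
    machine_rate = seeds * 60 / machine_count_seconds  # ~9,600 seeds per minute

    print(f"Hand counting:         ~{hand_rate:.0f} seeds/min")
    print(f"Analyzer (count pass): ~{machine_rate:.0f} seeds/min")
    print(f"Start-to-finish speedup: ~{hand_minutes / machine_total_minutes:.0f}x")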

Researchers think this technology can be used for other kinds of seeds. But, they would have to fine-tune software for each weed species based on their characteristics like size and weight.

The researchers also checked the accuracy of two hand counting methods used to count a subset of seeds and then estimate a larger amount. While they differ in how they begin the counting process, they were both found to be highly accurate.

"Our findings are encouraging because we found the previously-reported methods are reliable and accurate, and that the computerized method can offer a more rapid alternative if available," Bertucci says. "This research was exciting for me because I saw it as an opportunity to improve something that is mundane and has room for error. I was excited to explore any alternative and share that information with other researchers."

Credit: 
American Society of Agronomy

Mixed-signal hardware security thwarts powerful electromagnetic attacks

image: Purdue University innovators created hardware technology to use mixed-signal circuits to embed critical information to stop computer attacks.

Image: 
Shreyas Sen/Purdue University

WEST LAFAYETTE, Ind. - Security of embedded devices is essential in today's internet-connected world. Security is typically guaranteed mathematically using a small secret key to encrypt the private messages.

When these computationally secure encryption algorithms are implemented on physical hardware, they leak critical side-channel information in the form of power consumption or electromagnetic radiation. Now, Purdue University innovators have developed technology to kill the problem at the source itself - tackling physical-layer vulnerabilities with physical-layer solutions.

Recent demonstrations have shown that such side-channel attacks can succeed in just a few minutes from a short distance away. These attacks have even been used to counterfeit e-cigarette batteries, by stealing the secret encryption keys from authentic batteries to gain market share.

"This leakage is inevitable as it is created due to the accelerating and decelerating electrons, which are at the core of today's digital circuits performing the encryption operations," said Debayan Das, a Ph.D. student in Purdue's College of Engineering. "Such attacks are becoming a significant threat to resource-constrained edge devices that use symmetric key encryption with a relatively static secret key like smart cards. Our technology has been shown to be 100 times more resilient to these attacks against Internet of Things devices than current solutions."

Das is a member of Purdue's SparcLab team, directed by Shreyas Sen, an assistant professor of electrical and computer engineering. The team developed technology that uses mixed-signal circuits to embed the crypto core within signature attenuation hardware with lower-level metal routing, such that the critical signature is suppressed even before it reaches the higher-level metal layers and the supply pin. Das said this drastically reduces electromagnetic and power information leakage.

"Our technique basically makes an attack impractical in many situations," Das said. "Our protection mechanism is generic enough that it can be applied to any cryptographic engine to improve side-channel security."

Credit: 
Purdue University

Laser writing enables practical flat optics and data storage in glass

image: (Left) Birefringence image of a flat lens and intensity patterns of 488 nm laser beams with circular polarizations of different handedness, focused and defocused by the same lens. The focal lengths are ± 208 mm. (Right) The same lens corrects shortsightedness (-5 D) and longsightedness (+5 D).

Image: 
by Masaaki Sakakura, Yuhao Lei, Lei Wang, Yan-Hao Yu, and Peter G. Kazansky

Conventional optics (e.g. lenses or mirrors) manipulate the phase via optical path difference by controlling the thickness or refractive index of the material. Recently, researchers reported that an arbitrary wavefront of light can be achieved with flat optics by spatially varying anisotropy, using the geometric or Pancharatnam-Berry phase. However, despite the various methods employed for anisotropy patterning, producing spatially varying birefringence with low loss, high damage threshold and durability remains a challenge.
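For reference (standard geometric-phase optics rather than a result specific to this paper), a half-wave retarder whose fast axis is oriented at a local angle θ(x, y) imprints on circularly polarized light a Pancharatnam-Berry phase of twice that angle, with the sign set by the handedness:

    \phi_{\text{PB}}(x, y) = \pm\, 2\,\theta(x, y)

Because the two circular polarizations acquire phases of opposite sign, one patterned element can focus one handedness and defocus the other, which is consistent with the ± 208 mm focal lengths of the flat lens shown in the image above.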

In addition, birefringence-patterning technologies have also been used to generate light beams with spatially variant polarization, known as vector beams, in particular with radial or azimuthal polarization. Radially polarized vector beams are especially interesting due to the non-vanishing longitudinal electric field component when tightly focused, allowing superresolution imaging. Radial polarization is also the optimal choice for material processing. On the other hand, azimuthal vector beams can induce longitudinal magnetic fields with potential applications in spectroscopy and microscopy. Nonetheless, generating such beams with high efficiency is not a trivial matter.

In an article published in Light: Science & Applications, scientists from the Optoelectronics Research Centre, University of Southampton, UK, demonstrated a new type of birefringent modification with ultra-low loss by ultrafast laser direct writing in silica glass. The discovered birefringent modification, which is completely different from the conventional one originating from nanogratings or nanoplatelets, contains randomly distributed nanopores with elongated anisotropic shapes, aligned perpendicular to the writing polarization, which are responsible for the high transparency and controllable birefringence. The birefringent modification enabled fabrication of ultra-low loss spatially variant birefringent optical elements including a geometric phase flat prism and lens, vector beam converters and zero-order retarders, which can be used for high power lasers. The high transmittance from UV to near-infrared and the high durability of the demonstrated birefringent optical elements in silica glass overcome the limitations of geometric phase and polarization shaping using conventional materials and fabrication methods, including photo-aligned liquid crystals and meta-surfaces.

"We observed ultrafast laser induced modification in silica glass with the evidence of anisotropic nanopore formation representing a new type of nanoporous material."

"The technology of low loss polarization and geometrical phase patterning widens the applications of geometrical phase optical elements and vector beam convertors for high power lasers and visible and UV light sources."

"The space-selective birefringent modification with high transparency also enables high capacity multiplexed data storage in silica glass."

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

New mathematical model reveals how major groups arise in evolution

Researchers at Uppsala University and the University of Leeds present a new mathematical model of patterns of diversity in the fossil record, which offers a solution to Darwin's "abominable mystery" and strengthens our understanding of how modern groups originate. The research is published in the journal Science Advances.

The origins of many major groups of organisms in the fossil record seem to lie shrouded in obscurity. Indeed, one of the most famous examples, the flowering plants, was called "an abominable mystery" by Darwin. Many modern groups appear abruptly, and their predecessors - if there are any - tend to be few in number and vanish quickly from the fossil record shortly afterwards. Conversely, once groups are established, they tend to be dominant for long periods of time until interrupted by the so-called "mass extinctions" such as the one at the end of the Cretaceous period some 66 million years ago.

Such patterns appear surprising, and often seem to be contradicted by the results from "molecular clocks" - using calibrated rates of change of molecules found in living organisms to estimate when they started to diverge from each other. How can this conflict be resolved, and what can we learn from it?

In a paper, Graham Budd, Uppsala University, and Richard Mann, University of Leeds, present a novel mathematical model of the origin of modern groups, based on a so-called "birth-death" process of speciation and extinction. Birth-death models show how random extinction and speciation events give rise to large-scale patterns of diversity through time. Budd and Mann show that the ancestral forms of modern groups are typically rather few in number, and once they give rise to the modern group, they can be expected to quickly go extinct. The modern group, conversely, tends to diversify very quickly and thus swamp out the ancestral forms. Thus, rather surprisingly, living organisms capture a great percentage of all the diversity there has ever been.
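The Python sketch below is a minimal, generic birth-death (speciation-extinction) simulation of the kind such models build on; the rates, run length, and seeds are arbitrary illustrative values, and this is not the authors' actual model.

    import random

    def simulate_birth_death(n0=1, birth=0.12, death=0.10, t_max=100.0, seed=1):
        """Gillespie-style simulation of a simple birth-death process:
        each lineage speciates at rate `birth` and goes extinct at rate `death`."""
        random.seed(seed)
        n, t = n0, 0.0
        history = [(t, n)]
        while n > 0 and t < t_max:
            total_rate = n * (birth + death)
            t += random.expovariate(total_rate)       # waiting time to next event
            if random.random() < birth / (birth + death):
                n += 1                                # speciation event
            else:
                n -= 1                                # extinction event
            history.append((t, n))
        return history

    # Many runs die out early while a few diversify rapidly, echoing the point
    # that ancestral ("stem") lineages tend to be few and short-lived once a
    # successful modern group takes off.
    final_counts = [simulate_birth_death(seed=s)[-1][1] for s in range(10)]
    print(final_counts)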

The only exceptions to these patterns are caused by the "mass extinctions", of which there have been at least five throughout history, which can massively delay the origin of the modern group, and thus extend the longevity and the diversity of the ancestral forms, called "stem groups". A good example of this is the enormous diversity of the dinosaurs, which, properly considered, are stem-group birds. The meteorite impact at the end of the Cretaceous some 66 million years ago killed off nearly all of them, apart from a tiny group that survived and flourished to give rise to the more than 10,000 species of living birds.

The new model explains many puzzling features about the fossil record and suggests that it often records a relatively accurate picture of the origin of major groups. This in turn suggests that increased scrutiny should be paid to molecular clock models when they significantly disagree with what the fossil record might be telling us.

Credit: 
Uppsala University

Rise in global deaths and disability due to lung diseases over past three decades

There has been an increase in deaths and disability due to chronic respiratory (lung) diseases over the past three decades, finds an analysis of data from 195 countries published by The BMJ today.

The poorest regions of the world had the greatest disease burden. Ageing and risk factors including smoking, environmental pollution, and body weight also play a key role, say the researchers.

Chronic respiratory diseases pose a major public health problem, with an estimated 3.9 million deaths in 2017, accounting for 7% of all deaths worldwide.

Chronic obstructive pulmonary disease (COPD) and asthma are the most common conditions, but others such as pneumoconiosis (lung disease due to dust inhalation), interstitial lung disease and pulmonary sarcoidosis (due to lung scarring and inflammation) are also global public health concerns.

Previous analyses of death and loss of health due to chronic respiratory diseases were based on limited data or confined to local areas.

To try and plug this knowledge gap, researchers in China used data from the Global Burden of Disease Study 2017 to describe trends in mortality and disability adjusted life years (DALYs) - a combined measure of quantity and quality of life - due to chronic respiratory diseases, by age and sex, across the world during 1990-2017.
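For reference, the Global Burden of Disease study defines a DALY as the sum of years of life lost to premature death (YLL) and years lived with disability (YLD), a definition not spelled out in the article:

    \text{DALY} = \text{YLL} + \text{YLD}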

Between 1990 and 2017, the number of deaths due to chronic respiratory diseases increased by 18%, from 3.32 million in 1990 to 3.91 million in 2017.

The number of deaths increased with age and rose sharply in those aged 70 and older, a burden that is likely to increase as the worldwide population ages, suggest the authors.

During the 27 year study period, rates of death and disability adjusted for the age structure of the population (known as age standardised rates) decreased, particularly in men.

Overall, social deprivation was the most important factor affecting rates of death and disability, with the highest rates seen in the poorest regions of the world. Lower mortality was seen in more affluent countries, reflecting better access to health services and improved treatments.

Smoking was the leading risk factor for deaths and disability due to COPD and asthma. In 2017, smoking accounted for 1.4 million deaths and 33 million DALYs, particularly in poorer regions, indicating an urgent need to improve tobacco control in developing countries, say the authors.

Pollution from airborne particulate matter was the next most important risk factor for COPD, with one million deaths and 25 million DALYs.

A high body mass index has also accounted for the most deaths from asthma since 2013, particularly in women, and has contributed the most to DALYs since 2003, add the authors.

"As the prevalence of obesity continues to increase at a worrying rate worldwide, weight loss should be included in the management of obese patients with asthma," they write.

The researchers point to some study limitations, such as differences in disease definitions and rates of misdiagnosis across countries.

Nevertheless, they say this study showed that the number of global deaths and DALYs from chronic respiratory diseases increased from 1990 to 2017, while the age standardised mortality rate and age standardised DALY rate decreased, with a more profound decline in males.

Regions with a low socio-demographic index had the greatest burden of disease, they add. The estimated contribution of risk factors (such as smoking, environmental pollution, and a high body mass index) to mortality and DALYs "supports the need for urgent efforts to reduce exposure to them," they conclude.

Credit: 
BMJ Group

Jet stream not getting 'wavier' despite Arctic warming

image: The polar jet stream.

Image: 
NASA/Trent L Schindler

Rapid Arctic warming has not led to a "wavier" jet stream around the mid-latitudes in recent decades, pioneering new research has shown.

Scientists from the University of Exeter have studied the extent to which Arctic amplification - the faster rate of warming in the Arctic compared to places farther south - has affected the fluctuation of the jet stream's winding course over the Northern Hemisphere.

Recent studies have suggested the warming Arctic region has led to a "wavier" jet stream - which can lead to extreme weather conditions striking the US and Europe.

However, the new study by Dr Russell Blackport and Professor James Screen shows that Arctic warming does not drive a more meandering jet stream.

Instead, they believe any link is more likely to be a result of random fluctuations in the jet stream influencing Arctic temperatures, rather than the other way around.

The study is published in leading journal Science Advances on Wednesday 19 February 2020.

Dr Blackport, a Research Fellow in Mathematics and lead author of the study, said: "While there does appear to be a link between a wavier jet stream and Arctic warming in year-to-year and decade-to-decade variability, there has not been a long-term increase in waviness in response to the rapidly warming Arctic."

In recent years, scientists have studied whether the jet stream's meandering course across the Northern Hemisphere is being amplified by climate change.

For about two decades, the jet stream - a powerful band of westerly winds across the mid-latitudes - was observed to have a "wavier" flow, which coincided with greater Arctic warming through climate change.

These waves have caused extreme weather conditions to strike mainland Europe and the US, bringing outbreaks of intense cold air.

In this new study, Dr Blackport and Professor Screen studied not only climate model simulations but also the observed conditions going back 40 years.

They found that the previously reported trend toward a wavier circulation during autumn and winter has reversed in recent years, despite continued Arctic amplification.

This reversal has resulted in no long-term trends in waviness, in agreement with climate model simulations, which also suggest little change in "waviness" in response to strong Arctic warming.

The results, the scientists say, strongly suggest that the observed and simulated link between jet stream "waviness" and Arctic temperatures does not represent a causal effect of Arctic amplification on the jet stream.

Professor Screen, an Associate Professor in Climate Science at Exeter added: "The well-publicised idea that Arctic warming is leading to a wavier jet stream just does not hold up to scrutiny.

"With the benefit of ten more years of data and model experiments, we find no evidence of long-term changes in waviness despite on-going Arctic warming."

The paper, "Insignificant effect of Arctic amplification on the amplitude of mid-latitude atmospheric waves," is published in Science Advances.

Credit: 
University of Exeter

Evaluating risk of cancer in patients with psoriasis, psoriatic arthritis

What The Study Did: This observational study was a systematic review and meta-analysis that included 112 studies and examined the association between psoriasis or psoriatic arthritis and the risk of cancer, including the risk of specific cancers.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Sofie Vaengebjerg, M.D., of the University of Copenhagen in Denmark, is the corresponding author.

(doi:10.1001/jamadermatol.2020.0024)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network