Tech

Tissues protect their DNA under mechanical stress

image: Super-resolution image showing that stretch leads to wrinkling of the nuclei and rearrangement of the DNA. These physical changes lead to nuclear softening and are required to protect the DNA from mechanical damage.

Image: 
MPI f. Biology of Ageing

In everyday life, our tissues, such as skin and muscle, are stretched, pulled and compressed without damage to the cells or their DNA. A team of researchers led by Sara Wickström from the Max Planck Institute for the Biology of Ageing, the CECAD Cluster of Excellence at the University of Cologne, and the Helsinki Institute for Life Sciences at the University of Helsinki has now discovered that cells protect themselves from such stress not only by deforming their nuclei but also by softening the genetic material itself.

Protection of the genetic code within our DNA is critical for human health. Mutations in DNA lead to a large variety of diseases, such as developmental disorders or cancer. "Most of our tissues contain tissue-specific stem cells, which are long-lived cells whose function is critical for tissue function and maintenance. Due to their long lifetime, it is critical that the genome of these cells is efficiently protected from mutations to prevent diseases such as cancer", says Michele Nava, the study's lead scientist. "A lot is known about the role of chemicals and irradiation in inducing DNA damage, but how mechanical forces damage the DNA and what kind of mechanisms might exist to protect our cells from this damage has been unknown so far", Nava explains.

To study how the DNA in stem cells responds to mechanical deformation, Nava, Miroshnikova and colleagues used a special mechanical device to expose skin and muscle stem cells to mechanical stretch similar to what they would experience inside tissues. As a result of the stretch, the nucleus and the DNA not only became reorganized but also changed their mechanical properties and became softer. "It was exciting to realize that we could alter the mechanical properties of DNA simply by exerting mechanical forces on the stem cells. Even more striking was that if we experimentally prevented this change, the stem cells now acquired DNA damage, indicating that we had discovered an important protective mechanism", says Yekaterina Miroshnikova, who led the study together with Nava and Wickström.

Orienting itself to the direction of force

Digging deeper into the cellular mechanism of the stem cells' response to stretch, Nava, Miroshnikova and colleagues found that when cells were exposed to mechanical stretch for longer periods of time, the whole tissue oriented itself in the direction of the force. This tissue-scale orientation prevented deformation of the nucleus and its DNA, allowing both to return to their original state, and thus served as long-term mechanoprotection.

Finally, the researchers also noticed that cancer cells were less sensitive to mechanical stretch than healthy stem cells, due to differences in the levels of key nuclear proteins. "It is interesting to note that two central features defining cancer are their genetic instability, i.e. frequent acquisition of new mutations, as well as their insensitivity to control by extrinsic signals. A major future goal of the laboratory is to understand how defects in this newly discovered pathway could promote cancer formation and how cancers might exploit the mechanics to escape the control mechanisms of the tissue", says Sara Wickström.

Credit: 
Max-Planck-Gesellschaft

With smaller clinical trials, it may be up to doctors to notice rare drug side effects

image: Tejas Patil, MD, and colleagues say rare drug side effect highlights need for doctor vigilance when using new, targeted anti-cancer drugs.

Image: 
University of Colorado Cancer Center

Tejas Patil, MD, is a medical oncologist. Lisa Ferrigno, MD, MPH, FACS, is a trauma surgeon. Working with lung cancer patients at University of Colorado Cancer Center, they both, independently, noticed something strange: A small percentage of patients taking high doses of the drug osimertinib (Tagrisso) were developing a rare twist in the right side of their colon, a condition called cecal volvulus. Their case series, published in the journal Frontiers in Oncology, describes three of these patients, suggesting that doctors who use osimertinib with EGFR+ non-small cell lung cancer patients may consider watching for rare cases of cecal volvulus along with more expected side effects.

More broadly, the case series demonstrates an emerging trend in the evaluation of side effects for newly approved medicines: As drugs more precisely target cancer subtypes, clinical trials testing these drugs are enrolling fewer patients; smaller studies reduce the likelihood of seeing rare side effects; and so it may fall on clinicians using these drugs after FDA approval to notice medication side effects that went undetected during clinical trials.

"You have drugs getting out onto the market with less numbers involved in terms of testing, so the lay of the land with adverse effects might not be completely hashed out. There's not many people who have taken the drug," says Ferrigno.

"When your first patient has it, you just think it's a one-off. Then you see it again, and see it again, and once you get to your third patient, you start to think it's a pattern. We noticed all these patients were getting the same dose of osimertinib, and it prompted me to think it's not one-off, but a dose-exposure relationship," says Patil.

Exactly how a genetically targeted anti-cancer drug could create a right-colon twist is unclear, but similar drugs have been shown to affect the lining of the gut, hinting at a possible mechanism. Importantly, many patients are treated with 80 mg doses of osimertinib, but it was only patients treated with 160 mg doses who developed this possible side effect.

"Studies show 160mg is a pretty reasonable dose for osimertinib in many cases, so we expect to see this dose used even more, maybe leading to more of this kind of side effect. That's why it's important to start noticing these observations now," Patil says.

The authors also suggest the study highlights the potential impact of published case series, in which doctors present anecdotal evidence from interesting or unusual patients.

"Often case series make observations that don't amount to anything, but then there are the times when they're like black swans - the unexpected data points that force us to shift our thinking," Ferrigno says.

"We're dealing with an uncommon mutation receiving an uncommon drug at an uncommon dose and then getting a very uncommon surgical complication," Patil says. "The more we looked into it, the more it seemed to be pretty important."

Credit: 
University of Colorado Anschutz Medical Campus

How we end up 'confined' on YouTube

Everyone who has used the YouTube video platform knows the feeling: the successive recommendations generated by the site's algorithm sometimes "confine" us in a bubble of similar content. Camille Roth, a CNRS researcher at the Centre Marc Bloch - Franco-German Research Centre for the Social Sciences (CNRS/MEAE/MESRI/BMBF), and his colleagues Antoine Mazières and Telmo Menezes have studied this phenomenon by exploring the recommendations from a thousand videos on different subjects, running through half a million recommendations in total.

Their results show that, unlike the algorithms of other platforms, which seem to promote the exploration of novelty and serendipity, YouTube's generates a number of confinement phenomena. A user's navigation based on recommendations can be seen as a movement within a network of interconnected videos: starting out from a particular video, the recommendation network is more or less closed, in other words it leads to content that is more or less similar and redundant. In addition, the content that leads to the most confined recommendation networks also seems to revolve around the most viewed videos, or the ones with the longest viewing time. This research was published on 21 April 2020 in the journal PLOS ONE.
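To make the "closedness" idea concrete, here is a small, purely illustrative sketch (not the authors' code; the graphs and scoring are invented for this example) that treats recommendations as a directed graph and scores confinement by how few distinct videos a recommendation-following random walk reaches:

```python
import random

# Toy model of recommendation-driven navigation: videos are nodes, and
# each node maps to the list of videos the algorithm would recommend next.
# Confinement is scored by how few distinct videos a long random walk visits.
def confinement_score(recommendations, start, steps=1000, seed=0):
    """Return a 0-1 score; values near 1 mean the walk stays 'confined'."""
    rng = random.Random(seed)
    current, visited = start, {start}
    for _ in range(steps):
        options = recommendations.get(current)
        if not options:            # dead end: restart from the seed video
            current = start
            continue
        current = rng.choice(options)
        visited.add(current)
    return 1.0 - len(visited) / (steps + 1)

# A tightly closed cluster (three videos recommending each other) versus
# an open chain that keeps leading to fresh content.
closed_bubble = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
open_chain = {f"v{i}": [f"v{i + 1}"] for i in range(1000)}

print(confinement_score(closed_bubble, "A"))   # ~0.997: highly confined
print(confinement_score(open_chain, "v0"))     # ~0.0: exploratory
```

On the closed cluster the walk keeps revisiting the same three videos; on the chain it keeps reaching new content, which is the open, "serendipitous" regime the authors contrast with YouTube's behavior.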

Credit: 
CNRS

Utilizing the impact resistance of the world's hardest concrete for disaster prevention

image: Outline of the degassing and water absorption treatment. The PFC specimens are placed in a closed vessel that is depressurized using a vacuum pump, and water is introduced from the outer surface to the inside.

Image: 
Kanazawa University

Kanazawa, Japan - Concrete is the most widely used building material in the world and consequently is being continuously developed to fulfill modern-day requirements. Efforts to improve concrete strength have led to reports of porosity-free concrete (PFC), the hardest concrete tested to date. Some of the basic properties of PFC have already been explored, and now a team including Kanazawa University has probed the impact response of this innovative material. Their findings are published in the International Journal of Civil Engineering.

Ultra-high-strength concrete offers significant advantages including reducing the weight of large structures and protecting them against natural disasters and accidental impacts. PFC is an ultra-high-strength concrete whose properties can be further enhanced by incorporating steel fibers.

The way in which PFC is prepared leads to very few voids in the final material, which gives it its high strength--400 MPa can be applied to PFC before it fails, compared with 20-30 MPa for standard concrete. Some of the basic material properties of steel fiber-reinforced PFC have already been reported; now the researchers have evaluated the impact response of a range of PFC preparations with different steel fiber contents and section heights.

"The continued development of building materials is particularly important in areas where frequent natural disasters threaten the integrity of structures," study lead author Yusuke Kurihashi explains. "We carried out impact tests on a variety of steel fiber-reinforced PFC samples to determine their reactions, and in so doing, accelerate the widespread application of PFC in building projects. Our testing is designed to simulate responses to events such as rock falls, blasts and flying objects."

The researchers made two key findings. Firstly, they observed that increasing the steel fiber content from 1% to 2% reduced the damage due to the impact by 30%-50%. This significant improvement in performance is expected to inform future material design decisions.

In addition, they showed that it was possible to predict the behavior of the samples with approximately 80% accuracy by comparing calculated values with those that were measured, which will help to streamline development processes.

"We hope that PFC will contribute to enhanced building safety in the future," says Dr Kurihashi. "Although additional experimental work and statistical processing is required to fully translate PFC into widespread practical applications, our findings make a significant contribution to understanding PFC's role in improving the safety of many large structures including high-rise buildings, bridges and roads."

Credit: 
Kanazawa University

Spotting air pollution with satellites, better than ever before

image: Six different photos taken of the same area of Beijing on different days experiencing different levels of air pollution. While the naked human eye can clearly tell that some days are more polluted than others, a new machine learning algorithm can make reasonably accurate estimates of air pollution at ground level.

Image: 
Tongshu Zheng, Duke University

DURHAM, N.C. -- Researchers from Duke University have devised a method for estimating the air quality over a small patch of land using nothing but satellite imagery and weather conditions. Such information could help researchers identify hidden hotspots of dangerous pollution, greatly improve studies of the effects of pollution on human health, or potentially tease out the effects of unpredictable events on air quality, such as the outbreak of a global pandemic.

The results appear online in the journal Atmospheric Environment.

"We've used a new generation of micro-satellite images to estimate ground-level air pollution at the smallest spatial scale to date," said Mike Bergin, professor of civil and environmental engineering at Duke. "We've been able to do it by developing a totally new approach that uses AI/machine learning to interpret data from surface images and existing ground stations."

The specific air quality measurement that Bergin and his colleagues are interested in is the amount of tiny airborne particles called PM2.5. These are particles that have a diameter of less than 2.5 micrometers -- about three percent of the diameter of a human hair -- and have been shown to have a dramatic effect on human health because of their ability to travel deep into the lungs.

For example, the 2015 Global Burden of Disease study ranked PM2.5 as the fifth leading mortality risk factor globally, responsible for about 4.2 million deaths and 103.1 million years of life lost or lived with disability. And in a recent study from the Harvard University T.H. Chan School of Public Health, researchers found that areas with higher levels of PM2.5 also are associated with higher death rates due to COVID-19.

Current best practices in remote sensing to estimate the amount of ground-level PM2.5 use satellites to measure how much sunlight is scattered back to space by ambient particulates over the entire atmospheric column. This method, however, can suffer from regional uncertainties such as clouds and shiny surfaces, atmospheric mixing, and properties of the PM particles, and cannot make accurate estimates at scales smaller than about a square kilometer. While ground pollution monitoring stations can provide direct measurements, they suffer from their own host of drawbacks and are only sparsely located around the world.

"Ground stations are expensive to build and maintain, so even large cities aren't likely to have more than a handful of them," said Bergin. "Plus they're almost always put in areas away from traffic and other large local sources, so while they might give a general idea of the amount of PM2.5 in the air, they don't come anywhere near giving a true distribution for the people living in different areas throughout that city."

In their search for a better method, Bergin and his doctoral student Tongshu Zheng turned to Planet, an American company that uses micro-satellites to take pictures of the entire Earth's surface every single day with a resolution of three meters per pixel. The team was able to get daily snapshots of Beijing covering the past three years.

The key breakthrough came when David Carlson, an assistant professor of civil and environmental engineering at Duke and an expert in machine learning, stepped in to help.

"When I go to machine learning and artificial intelligence conferences, I'm usually the only person from an environmental engineering department," said Carlson. "But these are the exact types of projects that I'm here to help support, and why Duke places such a high importance on hiring data experts throughout the entire university."

With Carlson's help, Bergin and Zheng applied a convolutional neural network with a random forest algorithm to the image set, combined with meteorological data from Beijing's weather station. While that may sound like a mouthful, it's not that difficult to pick your way through the trees.

A random forest is a standard machine learning algorithm that uses a lot of different decision trees to make a prediction. We've all seen decision trees, perhaps as an internet meme that uses a series of branching yes/no questions to decide whether or not to eat a burrito. Except in this case, the algorithm is looking through decision trees based on metrics such as wind, relative humidity, temperature and more, and using the resulting answers to arrive at an estimate for PM2.5 concentrations.

However, random forest algorithms don't deal well with images. That's where the convolutional neural networks come in. These algorithms look for common features in images such as lines and bumps and begin grouping them together. As the algorithm "zooms out," it continues to lump similar groupings together, combining basic shapes into common features such as buildings and highways. Eventually the algorithm comes up with a summary of the image as a list of its most common features, and these get thrown into the random forest along with the weather data.
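A minimal sketch of that two-stage pipeline, assuming a pretrained CNN as the image summarizer and a random forest on top; the model choice (ResNet-18), shapes and data below are placeholders rather than the study's actual setup:

```python
# Sketch of the idea described above: a CNN summarizes each satellite
# image into a feature vector, and a random forest maps those features
# plus weather covariates to a PM2.5 estimate. Illustrative only.
import numpy as np
import torch
import torchvision.models as models
from sklearn.ensemble import RandomForestRegressor

# Pretrained CNN with its classification head removed, used as a
# generic feature extractor.
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()
cnn.eval()

def image_features(batch):  # batch: (N, 3, 224, 224) float tensor
    with torch.no_grad():
        return cnn(batch).numpy()  # (N, 512) feature vectors

# Placeholder data: N images, 4 weather covariates per image, and
# ground-station PM2.5 labels (micrograms per cubic meter).
N = 64
images = torch.rand(N, 3, 224, 224)
weather = np.random.rand(N, 4)   # e.g. wind, humidity, temperature, pressure
pm25 = np.random.rand(N) * 150

X = np.hstack([image_features(images), weather])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, pm25)
print(rf.predict(X[:3]))
```

Swapping the CNN's classification head for an identity layer is a standard way to reuse it as a feature extractor, so the random forest never sees raw pixels, only the image summary plus the weather data.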

"High-pollution images are definitely foggier and blurrier than normal images, but the human eye can't really tell the exact pollution levels from those details," said Carlson. "But the algorithm can pick out these differences in both the low-level and high-level features -- edges are blurrier and shapes are obscured more -- and precisely turn them into air quality estimates."

"The convolutional neural network doesn't give us as good of a prediction as we would like with the images alone," added Zheng. "But when you put those results into a random forest with weather data, the results are as good as anything else currently available, if not better."

In the study, the researchers used 10,400 images to train their model to predict local levels of PM2.5 using nothing but satellite images and weather conditions. They tested their resulting model on another 2,622 images to see how well it could predict PM2.5.

They show that, on average, their model is accurate to within 24 percent of actual PM2.5 levels measured at reference stations, which is at the high end of the spectrum for these types of models, while also having a much higher spatial resolution. While most of the current standard practices can predict levels down to 1 million square meters, the new method is accurate down to 40,000 -- about the size of eight football fields placed side by side.
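A quick arithmetic check of those areas (the field dimensions are the standard American football field with end zones, an assumption for illustration, not a figure from the study):

```python
# Rough check of the spatial-resolution comparison quoted above.
field_m2 = 109.7 * 48.8    # American football field incl. end zones, ~5,353 m2
print(40_000 / field_m2)   # ~7.5 -> "about eight football fields"
print(40_000 ** 0.5)       # 200.0: a 40,000 m2 cell is 200 m on a side
print(1_000_000 ** 0.5)    # 1000.0: the older ~1 km-per-side resolution
```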

With that level of specificity and accuracy, Bergin believes their method will open up a wide range of new uses for such models.

"We think this is a huge innovation in satellite retrievals of air quality and will be the backbone of a lot of research to come," said Bergin. "We're already starting to get inquiries into using it to look at how levels of PM2.5 are going to change once the world starts recovering from the spread of COVID-19."

Credit: 
Duke University

Promising MERS coronavirus vaccine trial in humans

image: Digitally colorized scanning electron microscopic (SEM) image of MERS coronavirus particles.

Image: 
NIAID

"The results of this vaccine trial are also important and promising with regard to the development of a vaccine against SARS-CoV-2, the new coronavirus," explains Prof. Marylyn Addo, Head of the Division of Infectious Diseases at the UKE and scientist at the DZIF. "The development of the MERS vaccine provides a basis upon which we at the DZIF can rapidly develop a vaccine against the new coronavirus."

The MERS coronavirus, identified for the first time in 2012, is listed on the World Health Organisation's Blueprint list of pathogens that are considered a particular threat to public health. The virus is transmitted from dromedary camels to humans and is also transmissible between humans. Infections with the virus cause respiratory illness with a mortality of up to 35 percent. Worldwide, close to 2,500 cases of MERS have been detected in 27 countries, with the highest numbers in Saudi Arabia. To date, neither an effective vaccine against the MERS coronavirus nor a specific drug exists.

The vaccine candidate MVA-MERS-S

"In 2014, together with DZIF partners, we started to develop a vaccine against the MERS coronavirus in preparation for larger outbreaks of the virus in the future," Addo explains. The vaccine is based on an attenuated virus (MVA: modified vaccinia virus Ankara), which had previously been used in a smallpox eradication vaccination campaign and has now been altered to contain protein components from the MERS coronavirus. This recombinant, so-called vector-based vaccine, scientifically termed MVA-MERS-S for short, is to boost immunity against MERS coronaviruses. Prof. Gerd Sutter from Ludwig-Maximilians University of Munich developed this vaccine in collaboration with Philipps University of Marburg and the Erasmus Medical Center Rotterdam. The MVA vector now serves as a basis for developing a vaccine against SARS-CoV-2, the new coronavirus.

The vaccine trial

The vaccine trial was conducted in collaboration with the Clinical Trial Center North (CTC North). A total of 23 healthy volunteers were vaccinated twice with the experimental vaccine MVA-MERS-S, with an interval of four weeks between the vaccinations. The trial was designed to answer two questions: Is the experimental vaccine MVA-MERS-S well tolerated and safe to use in humans? Does it trigger humoral and cellular immune responses in humans, i.e. the development of antibodies and T cells that are able to prevent MERS-CoV infection or curb the course of illness?

The most important findings

"The tolerability and safety of the vaccine candidate as well as the resulting immune responses are very promising," explains Dr Till Koch, one of the first authors of the trial and a DZIF stipend-holder. The vaccine was well tolerated. Local side effects (i.e. pain at the site of injection, mild erythema and warmth) occurred most frequently and presented in 69 percent of the trial subjects. No severe side effects occurred. "After the second injection of MVA-MERS-S, antibody formation and T cell responses occurred in 87 percent of the trial subjects," summarises first co-author Dr Christine Dahlke.

Prof. Stephan Becker is pleased: "These results show that the new vaccine could potentially be used in future MERS outbreaks." The trial subjects' antibody responses were investigated in his laboratory at the University of Marburg. At the DZIF, Stephan Becker coordinates the research area "Emerging Infections" and is substantially involved in all vaccine projects.

The path to a vaccine

Next, a phase Ib trial, funded by CEPI (the Coalition for Epidemic Preparedness Innovations), will be conducted in which the vaccine will be tested in 160 trial subjects in Hamburg and Rotterdam. At the German Center for Infection Research, the results and tests from this trial will be used to start the development of a vaccine against the new coronavirus as rapidly as possible. The scientists will use the same viral vector (MVA), into which they will insert a SARS-CoV-2 spike protein to replace the MERS-CoV spike protein.

Credit: 
German Center for Infection Research

Tiny sensors fit 30,000 to a penny, transmit data from living tissue

ITHACA, N.Y. - Cornell University researchers who build nanoscale electronics have developed microsensors so tiny, they can fit 30,000 on one side of a penny. They are equipped with an integrated circuit, solar cells and light-emitting diodes (LEDs) that enable them to harness light for power and communication. And because they are mass fabricated, with up to 1 million sitting on an 8-inch wafer, each device costs a fraction of that same penny.

The sensors can be used to measure inputs like voltage and temperature in hard-to-reach environments, such as inside living tissue and microfluidic systems. For example, when rigged with a neural sensor, they would be able to noninvasively record nerve signals in the body and transmit findings by blinking a coded signal via the LED.

As a proof of concept, the team successfully embedded a sensor in brain tissue and wirelessly relayed the results.

The team's paper, "Microscopic Sensors Using Optical Wireless Integrated Circuits," was published in PNAS.

The collaboration is led by Paul McEuen, professor of physical science, and Alyosha Molnar, associate professor of electrical and computer engineering. Working with the paper's lead author, Alejandro Cortese, a Cornell Presidential Postdoctoral Fellow, they devised a platform for parallel production of their optical wireless integrated circuits (OWICs) - microsensors about 100 microns across (a micron is one-millionth of a meter), mere specks to the human eye.

The OWICs are essentially paramecium-sized smartphones that can be specialized with apps. But rather than rely on cumbersome radio-frequency technology, as cellphones do, the researchers looked to light as a potential power source and communication medium.
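As a purely hypothetical illustration of light-based data transmission (this release does not describe the OWICs' actual signaling scheme), a sensor reading could be framed as a sequence of LED on/off time slots:

```python
# Hypothetical on-off-keying illustration: each bit of a sensor reading
# becomes one LED time slot (1 = LED on, 0 = LED off). This is NOT the
# OWICs' documented protocol, just a sketch of the general idea.
def encode_reading(value, bits=10):
    """Frame a reading in [0, 2**bits) as start marker + MSB-first payload."""
    assert 0 <= value < 2 ** bits
    payload = [(value >> i) & 1 for i in reversed(range(bits))]
    return [1, 1, 1, 0] + payload      # '1110' start marker for receiver sync

def decode_frame(slots, bits=10):
    payload = slots[4:4 + bits]
    return sum(bit << i for i, bit in enumerate(reversed(payload)))

frame = encode_reading(873)            # e.g. a hypothetical ADC reading
assert decode_frame(frame) == 873
print(frame)
```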

McEuen, Molnar and Cortese have launched their own company, OWiC Technologies, to commercialize the microsensors. A patent application has been filed through the Center for Technology Licensing. The first application is the creation of e-tags that can be attached to products to help identify them.

The tiny, low-cost OWICs could potentially spawn generations of microsensors that use less power while tracking more complicated phenomena.

Credit: 
Cornell University

Majority of US states and territories do not require day care providers to inform parents of firearms

Home- and center-based child care providers are not required by most states or U.S. territories to inform parents when guns are stored on the premises, according to a new study from researchers at Johns Hopkins Bloomberg School of Public Health.

The researchers found that a majority of U.S. states and territories--47 out of 56--do not require either centers or homes that provide child care to disclose to clients that they keep firearms on-site. Less than two-thirds of U.S. states and territories outright prohibit center-based child care operators from having firearms on the premises and only a handful--7 of 56--outright prohibit home-based child care operators from having firearms on the premises.

The study also found that nearly one-quarter of U.S. states and territories (13) had no regulations governing firearms in child care centers, and one-sixth (9) had no regulations governing firearms in family child care homes.

The findings were published online April 22 in JAMA Network Open.

For the study, the researchers surveyed state regulations covering the presence and storage of firearms at child care facilities in the U.S., including dedicated centers as well as home-based facilities.

"It's surprising how few states require notification to parents on whether or not a handgun is present--I think that's a critical gap that should be filled so that parents can make a more informed decision about child care," says study first author Sara Benjamin-Neelon, PhD, JD, the Helaine and Sid Lerner Professor in the Department of Health, Behavior and Society at the Bloomberg School.

While regulations were more likely to outright prohibit firearms in child care centers, family child care homes generally face restrictions on storage procedures only. For example, 46 U.S. states or territories require firearms present in home-based child care settings to be kept under lock and key; 29 require the ammunition to be stored separately; and 23 require the firearms to be unloaded.

There are more than 20 million children age 5 and under in the U.S., and almost two-thirds of them spend a substantial amount of time in center-based or home-based early care and education settings.

This is believed to be the first study that systematically examines firearm-related policies that apply to home- and center-based child care settings. Benjamin-Neelon and co-author Elyse Grossman, PhD, JD, a policy fellow at the Bloomberg School, sought to clarify the current situation by reviewing firearm-related regulations, as of June 2019, for early care and education settings in all 50 states, Washington, D.C., and the five U.S. territories. The study did not examine the consequences for noncompliance or the number of firearm-related incidents in these settings.

That lack of prohibition on guns in home-based child care settings may stem from legislators' concerns that the Second Amendment to the U.S. Constitution protects homeowners' rights to keep firearms, Benjamin-Neelon says. She notes that there is an ongoing legal challenge against an Illinois law banning guns from homes that serve as child care facilities.

Benjamin-Neelon and Grossman say they were most surprised by the limited notification requirements, with only 9 of the 56 jurisdictions requiring either operators of child care centers or family child care homes to notify parents when there are firearms present in the home. "States should consider regulations requiring notification to parents if there's a firearm on the premises," Benjamin-Neelon says.

The researchers plan to conduct further studies in this area to determine if stricter laws against firearms correlate with fewer gun-related injuries to children in home- and center-based child care settings.

"State regulations governing firearms in early care and education settings in the United States: A cross-sectional review" was written by Sara E. Benjamin-Neelon and Elyse R. Grossman.

Credit: 
Johns Hopkins Bloomberg School of Public Health

European satellite data shows extreme methane emissions from Permian oil & gas operations

video: Video features aerial footage captured with a special infrared camera to reveal invisible methane pollution being emitted from oil and gas facilities in the Permian basin.

Image: 
Environmental Defense Fund

(NEW YORK - April 22, 2020) Findings published today in the journal Science Advances show that oil and gas operations in America's sprawling Permian Basin are releasing methane at twice the average rate found in previous studies of 11 other major U.S. oil and gas regions. The new study was authored by scientists from Environmental Defense Fund, Harvard University, Georgia Tech and the SRON Netherlands Institute for Space Research.

"These are the highest emissions ever measured from a major U.S. oil and gas basin. There's so much methane escaping from Permian oil and gas operations that it nearly triples the 20-year climate impact of burning the gas they're producing," said co-author Dr. Steven Hamburg, chief scientist at EDF. "These findings demonstrate the rapidly growing ability of satellite technology to track emissions like these and to provide the data needed by both companies and regulators to know where emissions reductions are needed."

Photos, video, a map and other images for media are available here.

Based on 11 months of satellite data encompassing 200,000 individual readings taken across the 160,000-square-kilometer basin by the European Space Agency's TROPOMI instrument between May 2018 and March 2019, Permian oil and gas operations are losing methane at a rate equal to 3.7% of their gas production. The wasted methane - which is the main component of natural gas - is enough to supply 2 million U.S. households.
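A back-of-the-envelope check of those numbers (the production rate is quoted later in this release; the household consumption figure is an assumed average, not taken from the study):

```python
# Rough plausibility check of the loss-rate and household figures above.
basin_gas_cf_per_day = 11e9      # Permian gas output, ~11 billion cubic feet/day
loss_rate = 0.037                # methane lost as a share of gas production
household_cf_per_year = 75_000   # assumed average U.S. household gas use

lost_cf_per_year = basin_gas_cf_per_day * loss_rate * 365
print(lost_cf_per_year / household_cf_per_year)   # ~2.0 million households
```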

Methane is a potent greenhouse gas, human emissions of which cause over a quarter of today's warming. Reducing methane from oil and gas operations is the fastest, most cost-effective way to slow the rate of warming, even as the necessary transition to a net-zero carbon economy continues.

Findings highlight crucial new applications

Satellites offer an important new methane measurement tool that can cover large areas faster and more frequently than conventional methods. They can also provide data on gas producing regions around the world that are impossible to reach by aircraft or from the ground.

"Advances in satellite technology and data analytics are making it possible to generate regular and robust information on methane emissions from oil and gas operations even from the most remote corners of the world," said Mark Brownstein, EDF senior vice president for Energy. "It's our goal to use this new data to help companies and countries find, measure, and reduce methane emissions further and faster, and enable the public to both track and compare progress."

Launched in 2017, the TROPOMI instrument used in the study offers more precise measurements, higher resolution and better coverage than its forerunners. It is part of an emerging ecosystem of methane-tracking satellites with a growing range of capabilities, including one with even higher precision currently being developed by EDF subsidiary MethaneSAT LLC for launch in 2022. MethaneSAT will track oil and gas methane around the globe on a near-weekly basis, identifying and measuring smaller emission events and more widely dispersed sources not discernable with current technology.

Permian emissions challenge

The Permian Basin has emerged as one of the world's most prolific oil-producing regions in recent years, producing 3.5 million barrels of crude and 11 billion cubic feet of natural gas per day (about 30% and 10% of the respective U.S. totals in 2018).

Today's new peer-reviewed findings validate a set of ground-based and airborne measurements released two weeks ago by EDF's PermianMAP initiative, which found methane escaping from oil and gas operations in the most productive part of the basin at a rate of 3.5%. That project is currently collecting a year's worth of methane data across a 10,000 square-kilometer study area within the basin via fixed-wing aircraft, helicopters, towers, and ground-based mobile sensors.

High leakage rates in the Permian point to an opportunity to greatly reduce methane emissions in this sprawling oil and gas producing region through better infrastructure design and development, more effective operations, and better regulation at both the state and federal levels.

The TROPOMI study uses the latest technology and methods available to analyze and present data, a process that currently takes a great deal of time and effort. But researchers are quickly learning how to automate and accelerate these complex calculations. The MethaneSAT project, for example, is expected to deliver data based on weekly measurements in near-real time.

"Early TROPOMI images showed that the Permian was one of the largest methane hotspots in the U.S. But the satellite was new, and data analysis hadn't even started. Quantifying emissions and deriving a leak rate for a huge area was a big, hands-on effort, even with the best tools," said EDF's Dr. Ritesh Gautam, one of the study's lead researchers. "Studies like this are expanding those boundaries. MethaneSAT and missions that follow will be more capable, delivering more data much faster, in ways that are more actionable by stakeholders."

Credit: 
Environmental Defense Fund

Inappropriate diagnoses

At a glance:

A small but concerning number of former NFL players report receiving clinical diagnoses of chronic traumatic encephalopathy (CTE), according to new research

A definitive diagnosis of the neurodegenerative brain disease, thought to be caused by repeated blows to the head, can be made only on autopsy, not based on clinical exam or brain imaging

Although based on player self-reports rather than on medical records, the study findings raise concerns of inappropriate diagnosis and possible overlooking of other, more treatable, conditions

Players who reported diagnosis of CTE were more likely to have sleep apnea, heart disease, high blood pressure and depression, each of which can cause cognitive symptoms linked to CTE

Clinicians caring for former football players should ensure they are not overlooking conditions that may be treatable and exercise caution and clarity in discussing CTE

A postmortem exam of the brain remains the gold standard for diagnosing chronic traumatic encephalopathy, or CTE, the neurodegenerative brain disease believed to arise from repeated hits to the head.

Yet a small but by no means trivial number of former professional football players say they have received a diagnosis of CTE, according to a new study from Harvard Medical School and the Harvard T.H. Chan School of Public Health published April 13 in Annals of Neurology.

The research--based on a survey of nearly 4,000 former NFL players, ages 24 to 89--was conducted as part of the ongoing Football Players Health Study at Harvard University, a research initiative that encompasses a constellation of studies designed to evaluate various aspects of players' health across the lifespan.

Even though the results are based on player self-reports rather than on documented clinical diagnoses, the researchers say their findings are alarming for a number of reasons.

First, CTE is a post-mortem diagnosis and cannot be diagnosed definitively in living individuals. Second, an erroneous, or clinically unverifiable, diagnosis of CTE could obscure the role of other treatable conditions common among former football players that could cause a cluster of cognitive and behavioral symptoms mimicking CTE. Third, delivering a verdict of an untreatable disease could render patients hopeless, discouraging them from pursuing healthy behaviors and focusing on modifiable risk factors and conditions that may give rise to symptoms attributed to CTE.

Researchers emphasize that any cognitive and behavioral symptoms should be investigated thoroughly, and CTE concerns should never be dismissed.

"Former football players are rightfully worried about brain health and CTE concerns should not be overlooked, yet in the absence of validated clinical criteria and diagnostic methods for CTE, the fact that former players report being told they have the disease is highly concerning," said study lead author Rachel Grashow, a researcher at the Harvard T.H. Chan School of Public Health. "A diagnosis of CTE could downplay the effects of other conditions and discourage the pursuit of alternative explanations, while creating a sense of despair among those who believe they might have an untreatable brain condition."

Nearly 3 percent of the players in the current study (108 out of 3,913) reported they had received a diagnosis of CTE from a physician or another clinician. Those older than 60 were more likely to report a CTE diagnosis than younger players (3.7 percent, compared with 2.3 percent).

Symptoms of cognitive impairment--difficulty concentrating, forgetfulness, mood changes--were notably more common among former players who reported CTE diagnoses, regardless of age. Those who reported a CTE diagnosis were also more likely to report sleep apnea, heart disease, hypertension, stroke, depression, high cholesterol, obesity, use of prescription pain medication and low testosterone.

All of these are relatively common in former football players and can cause certain cognitive symptoms, which could be fueling clinical suspicion for CTE among some physicians, the researchers said. Given that safe and effective interventions exist for many of these conditions, it is critical that these patients are evaluated and treated before cognitive problems are prematurely or wrongly attributed to CTE.

Former players who self-identified as Black had higher percentages of CTE diagnoses, the study found. Researchers said the higher prevalence of conditions such as high blood pressure, diabetes and cardiovascular disease among Black men may explain the higher rate of CTE diagnoses in this group.

The current study was not designed to determine why or how the CTE diagnoses were made. The researchers, however, say a number of factors could be at play.

For example, some clinicians may suspect the presence of CTE because past studies have identified a link between neuropsychiatric symptoms in the decade preceding an athlete's death and subsequent postmortem CTE diagnosis. Also, clinicians may be seeing certain behavioral and cognitive changes as markers of brain degeneration, propelling them to consider a CTE diagnosis, while downplaying or not fully exploring alternative explanations for the symptoms, such as sleep apnea, heart disease or depression. Clinicians may also be more likely to consider--and suggest--CTE to players who spent their careers in higher-impact positions, the researchers said.

Lack of clarity about symptoms and possible causes might leave patients prone to over-interpretation and set the stage for misunderstanding, the researchers said.

"Given the high visibility and intense media coverage of CTE, former football players may be highly sensitive to any hints or suggestions of CTE and assume a connection between their symptoms and this rather high-profile, but not necessarily accurate or appropriate, diagnosis," Grashow said. "Either way, it is incumbent upon the physicians who care for former athletes to ensure that such clarity is achieved."

None of this is to say that some former players may not, in fact, have CTE.

"CTE is real, and it probably plays a role in the cognitive or behavioral symptoms experienced by some former players, yet many of these symptoms could also arise from a number of other, more treatable, conditions," said study senior author Ross Zafonte, head of the Department of Physical Medicine and Rehabilitation at Spaulding Rehabilitation Hospital and Harvard Medical School.

Zafonte cautioned that a grim diagnosis like CTE could magnify symptoms, a psychological phenomenon known as the nocebo effect. It could also discourage people from engaging in healthy behaviors and pursuing critical treatments for other conditions responsible for the symptoms, added Zafonte, who is also principal investigator of the Football Players Health Study.

First described in the 1920s as boxers' dementia or "punch-drunk syndrome," CTE gained public attention over the last 20 years after a series of reports identified the hallmarks of the disease--abnormal protein clumps in certain parts of the brain--in postmortem exams of former football players, many of whom had shown cognitive, emotional and behavioral symptoms for more than a decade prior to their postmortem exams. CTE develops predominantly in people who sustain repeated blows to the head, including athletes in contact sports such as boxing, football, hockey and rugby, in military personnel who sustain head trauma and in victims of domestic violence.

Credit: 
Harvard Medical School

Electronic skin fully powered by sweat can monitor health

image: Sweat-powered electronic skin

Image: 
Caltech

One of the ways we experience the world around us is through our skin. From sensing temperature and pressure to pleasure or pain, the many nerve endings in our skin tell us a great deal.

Our skin can also tell the outside world a great deal about us as well. Moms press their hands against our foreheads to see if we have a fever. A date might see a blush rising on our cheeks during an intimate conversation. People at the gym might infer you are having a good workout from the beads of sweat on you.

But Caltech's Wei Gao, assistant professor in the Andrew and Peggy Cherng Department of Medical Engineering, wants to learn even more about you from your skin, and to that end, he has developed an electronic skin, or e-skin, that is applied directly on top of your real skin. The e-skin, made from soft, flexible rubber, can be embedded with sensors that monitor information like heart rate, body temperature, levels of blood sugar and metabolic byproducts that are indicators of health, and even the nerve signals that control our muscles. It does so without the need for a battery, as it runs solely on biofuel cells powered by one of the body's own waste products.

"One of the major challenges with these kinds of wearable devices is on the power side," says Gao. "Many people are using batteries, but that's not very sustainable. Some people have tried using solar cells or harvesting the power of human motion, but we wanted to know, 'Can we get sufficient energy from sweat to power the wearables?' and the answer is yes."

Gao explains that human sweat contains very high levels of the chemical lactate, a compound generated as a by-product of normal metabolic processes, especially by muscles during exercise. The fuel cells built into the e-skin absorb that lactate and combine it with oxygen from the atmosphere, generating water and pyruvate, another by-product of metabolism. As they operate, the biofuel cells generate enough electricity to power sensors and a Bluetooth device similar to the one that connects your phone to your car stereo, allowing the e-skin to transmit readings from its sensors wirelessly.

"While near-field communication is a common approach for many battery-free e-skin systems, it could be only used for power transfer and data readout over a very short distance," Gao says. "Bluetooth communication consumes higher power but is a more attractive approach with extended connectivity for practical medical and robotic applications."

Devising a power source that could run on sweat was not the only challenge in creating the e-skin, Gao says; it also needed to deliver high power intensity over a long time with minimal degradation. The biofuel cells are made from carbon nanotubes impregnated with a platinum/cobalt catalyst and a composite mesh holding an enzyme that breaks down lactate. They can generate continuous, stable power output (as high as several milliwatts per square centimeter) over multiple days in human sweat.
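For a sense of scale, a rough power budget (every number below is an assumption for illustration, not a Caltech figure) suggests why a few square centimeters at those densities could sustain a duty-cycled Bluetooth sensor:

```python
# Rough power-budget check (all numbers are illustrative assumptions).
cell_area_cm2 = 2.0
power_density_mw_per_cm2 = 3.0   # "several milliwatts per square centimeter"
harvested_mw = cell_area_cm2 * power_density_mw_per_cm2   # 6 mW available

ble_avg_mw = 1.0     # assumed average draw of a duty-cycled BLE radio
sensor_mw = 0.5      # assumed average draw of the biosensors themselves
print(harvested_mw >= ble_avg_mw + sensor_mw)   # True: the budget closes
```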

Gao says the plan is to develop a variety of sensors that can be embedded in the e-skin so it can be used for multiple purposes.

"We want this system to be a platform," he says. "In addition to being a wearable biosensor, this can be a human-machine interface. The vital signs and molecular information collected using this platform could be used to design and optimize next-generation prosthetics. "

Credit: 
California Institute of Technology

A new way to cool down electronic devices, recover waste heat

image: A hydrogel can cool off electronics and generate electricity from their waste heat. Scale bar, 2 cm.

Image: 
Adapted from Nano Letters 2020, DOI: 10.1021/acs.nanolett.0c00800

Using electronic devices for too long can cause them to overheat, which might slow them down, damage their components or even make them explode or catch fire. Now, researchers reporting in ACS' Nano Letters have developed a hydrogel that can both cool down electronics, such as cell phone batteries, and convert their waste heat into electricity.

Some components of electronic devices, including batteries, light-emitting diodes (known as LEDs) and computer microprocessors, generate heat during operation. Overheating can reduce the efficiency, reliability and lifespan of devices, in addition to wasting energy. Xuejiao Hu, Kang Liu, Jun Chen and colleagues wanted to design a smart thermogalvanic hydrogel that could convert waste heat into electricity, while also lowering the temperature of the device. So far, scientists have developed devices that can do one or the other, but not both simultaneously.

The team made a hydrogel consisting of a polyacrylamide framework infused with water and specific ions. When they heated the hydrogel, two of the ions (ferricyanide and ferrocyanide) transferred electrons between electrodes, generating electricity. Meanwhile, water inside the hydrogel evaporated, cooling it. After use, the hydrogel regenerated itself by absorbing water from the surrounding air. To demonstrate the new material, the researchers attached it to a cell phone battery during fast discharging. Some of the waste heat was converted into 5 μW of electricity, and the temperature of the battery decreased by 20 degrees Celsius. The reduced working temperature ensures safe operation of the battery, and the electricity harvested is sufficient for monitoring the battery or controlling the cooling system.

Credit: 
American Chemical Society

The future of semiconductors is clear

image: A focused laser is used to create thin films of tin dioxide.

Image: 
© 2020 Nakao et al.

Mobility is a key parameter for semiconductor performance and relates to how quickly and easily electrons can move inside a substance. Researchers have achieved the highest mobility among thin films of tin dioxide ever reported. This high mobility could allow engineers to create thin and even transparent tin dioxide semiconductors for use in next-generation LED lights, photovoltaic solar panels or touch-sensitive display technologies.

Tin and oxygen are very familiar elements, and when combined in a certain way to become tin dioxide, the material can be made into a semiconductor. Semiconductors are fundamental to most of our technology and are the basis of computer chips, solar panels and more. Since the 1960s, tin dioxide specifically has found use in industrial applications like gas sensors and transparent electrodes for solar devices. The material is effective for these things because of its high mobility. For most applications, higher is better. However, the high mobility of tin oxide only existed in large bulk crystals, until now.

"We demonstrated the highest mobility in a thin film of tin oxide ever achieved. Improved mobility not only enhances the conductivity but also the transparency of the material," said Shoichiro Nakao, a researcher from the Department of Chemistry at the University of Tokyo. "Generally, transparency and conductivity cannot coexist in a material. Typical transparent materials such as glass or plastic are insulating, whereas conducting materials like metals are opaque. Few materials exhibit transparent conductivity -- it's very interesting!"

The more transparent a semiconductor can be, the more light it can let through. Nakao and his team have made a tin oxide thin film that allows visible light and near-infrared light to pass. This is a great benefit to the power conversion efficiency of photovoltaic solar panels, but other uses could include enhanced touch-screen displays with even better accuracy and responsiveness, or more efficient LED lights.

"Our method of production was key to creating a substance with these properties. We used a highly focused laser to evaporate pellets of pure tin dioxide and deposit or grow material exactly how we wanted it," said Nakao. "Such a process allows us to explore different growth conditions as well as how to incorporate additional substances. This means we can endow tin dioxide semiconductors with high mobility and useful functionality."

Credit: 
University of Tokyo

Quantum research unifies two ideas offering an alternative route to topological superconductivity

image: Hybrid material nanowires with pencil-like cross section (A) at low temperatures and finite magnetic field display zero-energy peaks (B) consistent with topological superconductivity as verified by numerical simulations (C).

Image: 
Nbi

A pencil-shaped semiconductor, measuring only a few hundred nanometers in diameter, is what researchers from the Center for Quantum Devices, Niels Bohr Institute, at the University of Copenhagen, in collaboration with Microsoft Quantum researchers, have used to uncover a new route to topological superconductivity and Majorana zero modes in a study recently published in Science.

The new route that the researchers discovered uses the phase winding around the circumference of a cylindrical superconductor surrounding a semiconductor, an approach they call "a conceptual breakthrough".

"The result may provide a useful route toward the use of Majorana zero modes as a basis of protected qubits for quantum information. We do not know if these wires themselves will be useful, or if just the ideas will be useful," says Charles Marcus, Villum Kann Rasmussen Professor at the Niels Bohr Institute and Scientific Director of Microsoft Quantum Lab in Copenhagen.

What we have found appears to be a much easier way of creating Majorana zero modes, where you can switch them on and off, and that can make a huge difference.

says postdoctoral research fellow Saulius Vaitiekėnas, who was the lead experimentalist on the study.

Two known ideas combined

The new research merges two ideas already known in the world of quantum mechanics: vortex-based topological superconductors and one-dimensional topological superconductivity in nanowires.

"The significance of this result is that it unifies different approaches to understanding and creating topological superconductivity and Majorana zero modes," says professor Karsten Flensberg, Director of the Center for Quantum Devices.

Looking back in time, the findings can be described as an extension of a 50-year-old piece of physics known as the Little-Parks effect. In the Little-Parks effect, a superconductor in the shape of a cylindrical shell adjusts to an external magnetic field threading the cylinder by jumping to a "vortex state" in which the quantum wavefunction around the cylinder carries a twist of its phase.
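In textbook terms (standard Little-Parks physics rather than equations from the paper), the superconducting order parameter around a shell threaded by magnetic flux must be single-valued, so its phase winds an integer number of times, set by the applied flux:

```latex
\Delta(\theta) = |\Delta|\, e^{i n \theta},
\qquad n = \mathrm{round}\!\left(\frac{\Phi}{\Phi_0}\right) \in \mathbb{Z},
\qquad \Phi_0 = \frac{h}{2e}
```

The lobes with nonzero winding are the "vortex states" referred to above; it is at finite field, in these states, that the zero-energy peaks shown in the image caption appear.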

image: Charles M. Marcus, Saulius Vaitiekėnas and Karsten Flensberg from the Niels Bohr Institute at the Microsoft Quantum Lab in Copenhagen.

What was needed was a special type of material combining semiconductor nanowires and superconducting aluminum. Those materials have been developed at the Center for Quantum Devices over the past few years. The particular wires for this study were special in that the superconducting shell fully surrounds the semiconductor. They were grown by professor Peter Krogstrup, also at the Center for Quantum Devices and Scientific Director of the Microsoft Quantum Materials Lab in Lyngby.

The research is the result of the same basic scientific wondering that through history has led to many great discoveries.

Our motivation to look at this in the first place was that it seemed interesting and we didn't know what would happen.

says Charles Marcus about the experimental discovery, which was confirmed theoretically in the same publication. Nonetheless, the idea may indicate a path forward for quantum computing.

Credit: 
University of Copenhagen

Delivery drones instead of postal vans? Study reveals drones still consume too much energy

When delivering parcels, drones often have a poorer energy balance than traditional delivery vans, as shown by a new study conducted at Martin Luther University Halle-Wittenberg. In densely populated areas, drones consume comparatively high amounts of energy and their range is strongly influenced by wind conditions. In rural areas, however, they may be able to compete with diesel-powered delivery vans. The study has been published in the journal Transportation Research Part D: Transport and Environment.

During the coronavirus pandemic many people are increasingly ordering online instead of going into shops. More and more parcels are being shipped, pushing many delivery services to the limits of their capacity. One possible solution is drones that can deliver parcels to customers from a delivery depot autonomously. What sounds like science fiction could soon become reality: "Google, DHL and Amazon have been experimenting with this concept for several years and launched the first commercial pilot projects in the USA and Australia in 2019," says Dr Thomas Kirschstein from MLU's Department of Production and Logistics. He has calculated whether current drone models are ready to compete with vans in terms of energy consumption. "When evaluating the hypothetical use of delivery drones, attention has frequently only been on whether drones could deliver parcels faster and cheaper. Sustainability aspects, on the other hand, played less of a role," explains the economist.

In his new study, Kirschstein compared the energy consumption of drones with that of the diesel-powered delivery vans and electric transport vehicles currently used by parcel carriers. Using a simulation, he sought to determine which vehicle had the best energy balance under which circumstances. Basing his calculations on the greater Berlin area, he played through several scenarios. "Among other things, we investigated how the number of parcels per stop and the traffic situation impacted energy consumption," the researcher explains. He expanded his calculations to include the emissions produced through the generation of electricity or the consumption of diesel.

Initially, a trend became evident across all scenarios: electric vans were significantly more economical than diesel vans, consuming up to 50 per cent less energy. "This is not surprising for an urban setting: In cities, vans have to drive slowly and stop and start a lot. Here, electric vehicles consume significantly less energy," says Kirschstein.

These factors are not relevant, of course, for drones. Instead, wind conditions play a crucial role in how they perform. If there is a crosswind, for example, more energy must be expended to keep the drone on course. On the other hand, headwinds or tailwinds can even have a slightly positive effect on energy consumption. "Drones consume a relatively large amount of energy when they have to hover in one place in the air, for example when they want to deliver a parcel and have to wait outside the door of the recipient's home," explains the economist.

On average, the drones in the simulation consumed up to ten times more energy than electric delivery vans in a city as densely populated as Berlin. "Parcel carriers, for example, can stop and deliver several parcels on foot if multiple customers are receiving deliveries in one street. This is not possible for drones, as they can only deliver one package at a time. This increases their energy consumption, sometimes drastically," says Kirschstein. But his simulations also reveal a scenario in which the flying couriers are more energy-efficient than delivery vans: sparsely populated rural areas. However, higher energy consumption does not necessarily mean a poorer environmental balance: "Even if drones require more energy, they could represent an alternative to diesel vehicles, provided the electricity they need is generated by environmentally friendly means," explains the researcher.
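A simplified sketch of the per-parcel logic described above (all parameter values are illustrative assumptions, not numbers from the study):

```python
# Per-parcel energy comparison in the spirit of the study's scenarios.
# Every number below is an illustrative assumption, not a study value.
def van_kwh_per_parcel(kwh_per_km, route_km, parcels):
    """One van tour serves many parcels, so its energy is shared."""
    return kwh_per_km * route_km / parcels

def drone_kwh_per_parcel(cruise_kw, speed_kmh, dist_km, hover_min, hover_kw):
    """One flight per parcel: cruise out and back, plus hovering to deliver."""
    return cruise_kw * (2 * dist_km / speed_kmh) + hover_kw * hover_min / 60

# Dense city: short shared route, many parcels per tour -> the van wins.
print(van_kwh_per_parcel(0.4, route_km=40, parcels=200))               # 0.08
print(drone_kwh_per_parcel(1.0, speed_kmh=60, dist_km=10,
                           hover_min=3, hover_kw=1.5))                 # ~0.41

# Sparse countryside: long route, few parcels -> the drone can win.
print(van_kwh_per_parcel(0.4, route_km=120, parcels=30))               # 1.6
print(drone_kwh_per_parcel(1.0, speed_kmh=60, dist_km=5,
                           hover_min=2, hover_kw=1.5))                 # ~0.22
```

The qualitative pattern matches the study: shared van tours amortize energy across many parcels in dense areas, while one-flight-per-parcel drones only pull ahead when van routes become long and sparsely loaded.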

Credit: 
Martin-Luther-Universität Halle-Wittenberg