Tech

Study improves ability to predict how whales travel through their ocean habitat

Image: Bowhead whales captured on camera in an aerial survey (Credit: Amelia Brower, NOAA/NMFS/AFSC/NMML, Permit #14245)

BOSTON, MASS. (November 2020) - Scientists at the New England Aquarium's Anderson Cabot Center for Ocean Life recently published a study that could help researchers learn where protections are needed the most for bowhead whales.

Dr. Dan Pendleton and Dr. Jessica Redfern partnered with Dr. Elizabeth Holmes of the National Marine Fisheries Service and Dr. Jinlun Zhang of the University of Washington for the study, which was published in the journal Diversity and Distributions as the cover article. The team used a highly detailed dataset to model suitable habitat for western Arctic bowhead whales in the Beaufort Sea north of Alaska. The model's map generally matched the locations of already documented whale sightings, suggesting that the approach could provide useful information for other whale species as well.

Most species distribution models use data such as ocean depth, temperature, and water chemistry. By adding food, one of the primary drivers of where a species lives, the Aquarium scientists improved the model's predictive ability. The key to this model's success was calculations that estimate fluctuations in the whales' prey: zooplankton.
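For technically minded readers, the idea can be sketched in a few lines of code: fit a simple presence/absence model with and without a prey covariate and compare predictive skill. This is a minimal illustration on synthetic data, assuming a logistic-regression-style model; the covariate names and effect sizes are placeholders, not the study's actual model or data.

```python
# Minimal species-distribution-model sketch on synthetic data:
# presence/absence predicted from physical covariates, with and
# without a prey covariate. Illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
depth = rng.uniform(0, 1, n)   # standardized physical covariates
temp = rng.uniform(0, 1, n)
prey = rng.uniform(0, 1, n)    # modeled zooplankton density (placeholder)

# Simulate whale presence driven mostly by prey, partly by temperature.
logit = -2.0 + 3.5 * prey + 1.0 * temp - 0.5 * depth
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X_physical = np.column_stack([depth, temp])
X_with_prey = np.column_stack([depth, temp, prey])
for name, X in [("physical only", X_physical), ("physical + prey", X_with_prey)]:
    model = LogisticRegression().fit(X, presence)
    auc = roc_auc_score(presence, model.predict_proba(X)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```

On data generated this way, the prey-aware model separates presences from absences noticeably better, mirroring the improvement the team attributes to its zooplankton estimates.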

Using prey-centric species distribution models is beneficial for understanding where the whales are now and where they will be in the future. As climate change continues to warm waters and melt sea ice, impacting where the whales' food source will likely be, this research could guide protection efforts as well as shipping routes and fishing guidelines.

The study focused on the western Arctic bowhead whales in the Beaufort Sea north of Alaska, but Pendleton hopes to use the prey-centric model to track the movements of North Atlantic right whales, a critically endangered species with an estimated 356 individuals remaining that migrates along the east coast of the U.S. and Canada. Right whales are threatened by climate change, collisions with ships, and entanglement in fishing lines.

"Warm water that's been coming into the Gulf of Maine has affected the supply of their primary prey," said Pendleton, adding that the shift has pushed some right whales to migrate further north into Canadian waters to feed in late summer. "They're just not in the places that they've been for the last 40 years."

Credit: 
New England Aquarium

Abundance of prey species is key to bird diversity in cities

Image: Abundance of prey species is key to bird diversity in cities (Credit: Till Kottmann/Unsplash)

Urbanisation drastically alters natural habitats and poses multiple challenges to wildlife, affecting the occurrence and abundance of many bird species. A team of scientists from the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) and the Technische Universität Berlin (TUB) collaborated to analyse breeding bird data from the Senate of Berlin gathered by citizen scientists. They found that the abundance of invertebrates such as insects or spiders as prey is a key factor affecting bird diversity in the city: the more prey is available, the more diverse the urban bird communities are. This demonstrates the importance of species interactions for explaining urban biodiversity, in addition to the impacts of anthropogenic disturbance and habitat structure. The results are published in the scientific journal Diversity and Distributions.

Species interactions profoundly shape the composition of wildlife communities, determining which species and how many individuals are found within given habitats. For example, the presence of strong competitors may result in a diminished abundance or exclusion of particular species. Similarly, prey abundance and distribution affect the numbers of predators in a community. "Although the importance of species interactions for generating biodiversity is widely recognized, studies of urban biodiversity usually focus on the impacts of anthropogenic disturbance and habitat structure, neglecting species interactions", says Stephanie Kramer-Schadt, head of the Leibniz-IZW Department of Ecological Dynamics and Professor at TUB.

To assess to what extent species interactions affect avian diversity in cities, the team led by Aimara Planillo from Leibniz-IZW analysed breeding bird monitoring data and related them to invertebrate data. Bird data were collected by citizen scientists and provided by the Senate of Berlin, and invertebrate data were collected within a collaborative project of Berlin research institutions (BBIB-BIBS) funded by the German Ministry of Education and Research (BMBF). They investigated the impact of both food-related (prey availability) and non-food related (e.g. competition) species interactions on the responses of bird species to a gradient of increasing urbanisation, using data from 66 breeding bird species in the city of Berlin.

"By applying sophisticated modelling techniques to the biodiversity data, we demonstrated that prey invertebrate abundance is one of the most important factors affecting the urban bird biodiversity", says Planillo. Senior author Radchuk adds: "Importantly, the impact of prey abundance depends on the level of urbanisation. Prey abundance had a positive effect on bird diversity under low to medium urbanisation levels. For the highly urbanised areas, prey abundance does not affect bird community, as the bird species inhabiting such areas are those adapted to persist in urban environments and often benefit from human resources."

Through these analyses the scientists were able to distinguish three different groups of bird species in Berlin, which differ in how they respond to environmental variables and to prey abundance. "We found urban species, woodland species and nature-area species in Berlin's bird community", Radchuk explains. "Urban species are akin to urban exploiters as they persisted at high abundance at high levels of anthropogenic disturbance. Woodland species are akin to urban adapters, they responded strongly to the urbanisation gradient and were favoured by high tree cover and invertebrate abundance. Finally, nature-area species were strongly negatively affected by urbanisation and positively by tree cover and open green area. They were also the least abundant of the three groups."

This categorisation of bird species will allow the design of customised conservation strategies for target species. "Our findings point out that managing urban areas in a way that maintains and increases invertebrate biodiversity is very important for supporting bird diversity in cities", Planillo concludes. "In particular, in order to maintain or increase insect survival we suggest extensive or reduced mowing, leaving dead wood and stones in place, the preservation of set-aside lands and decreased use - or preferably avoidance - of pesticides."

Credit: 
Forschungsverbund Berlin

Drug discovery: First highly scalable method to monitor protein levels and localizations

Image: Cell pool expressing hundreds of different GFP-fusion proteins (Credit: © Andreas Reicher/CeMM)

Until now, scientists have typically studied changes in proteins and their roles in the cell by using a fluorescent tag to label and follow one protein at a time. This approach limited the number of proteins that could be studied and precluded unbiased discovery approaches. Researchers at CeMM, the Research Center for Molecular Medicine of the Austrian Academy of Sciences, have now developed a highly scalable method that allows hundreds of proteins to be studied in parallel, monitoring changes in their levels and localization in the cell. This novel strategy is a notable contribution, not only to drug development for future treatments against diseases such as cancer, but also to our general understanding of proteome dynamics. Their findings have now been published in the renowned scientific journal Genome Research.

Proteins are large molecules in the cell, and they are required for the structure, function and regulation of the body's tissues and organs. They are responsible for nearly every task of cellular life and can be as diverse as the functions they serve. Protein levels and their localization within the cell regulate important aspects of many cellular processes and can become important targets for drug treatment. For example, the abundance of proteins can be increased or decreased therapeutically, by drugs that affect protein production and degradation in the cell. Proteins can also move between different cellular compartments, and thereby shift their functions. Other proteins might bind to distinct locations in response to external stimuli, such as areas where DNA damage occurs.

Traditionally, scientists use a fluorescent tag to label individual proteins and study their roles in the cell. A green fluorescent protein (GFP) is fused to one of the ends of the protein they want to study. This fusion protein is then expressed in the cell, and the cells expressing the labeled protein can be observed by fluorescence microscopy. This method permits studying many perturbations, such as different drug doses, in a time-resolved manner for a single protein. In contrast, mass spectrometry has not been suitable for monitoring such cellular perturbations across the proteome, the entire complement of proteins, in a scalable, unbiased way at a specific point in time.

Andreas Reicher and Anna Koren from CeMM Principal Investigator Stefan Kubicek's group have developed a novel strategy that, for the first time, makes it possible to observe and characterize these changes in a very large number of proteins in parallel. The method can be used not only to describe and better understand the effects of known drugs in cells, but also to discover new drug treatments that work by modulating protein levels or localizations.

The bottleneck the CeMM researchers set out to overcome in CRISPR-Cas9-based intron tagging was the need for methods that shine a light on the whole proteome, or a substantial part of it, rather than one protein at a time. To do so, they designed a method to generate cell pools containing hundreds of tagged proteins, with a different protein labeled with GFP in each cell. These cell pools were exposed to a PROTAC chemical degrader of BRD4, a transcriptional regulator that plays a key role during embryogenesis and cancer development. Using time-lapse microscopy, the researchers then observed whether the levels or subcellular localization of any of the tagged proteins in the cell pool changed in response to the treatment. Importantly, the CRISPR-Cas9 tagging strategy enabled them to identify which proteins changed localization by in situ sequencing of the entire cell pool. They thus confirmed the known targets of the drug but also revealed unexpected changes. In particular, for perturbations of BRD4 signalling they reported localization changes in six proteins that had not previously been detected by any other high-throughput method. Finally, they showed that the method reveals both expected and novel protein localization changes in response to treatment with the approved cancer drug methotrexate.

CeMM Principal Investigator Stefan Kubicek explains: "Our study describes a technology that not only applies intron tagging to a gene pool for the first time, but is also significantly optimized in all three steps - intron tagging, cellular imaging and in situ sequencing - to make the process as effective as possible. Applied to chemical libraries and candidate molecules, this method is particularly powerful for developing and deeply characterizing drugs, including those that induce or inhibit protein-protein interactions or cause chemical degradation. The described strategy will potentially accelerate drug discovery and have a great impact on the study of global and subcellular proteome dynamics."

Credit: 
CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences

Report: In retrospect, the burning of wood in district heating plants has resulted in climate savings

ENERGY A new report from the University of Copenhagen shows that the burning of wood is significantly more climate friendly than coal and slightly more climate friendly than natural gas over the long run. For the first time, researchers quantified what the conversion of 10 Danish cogeneration plants from coal or natural gas to biomass has meant for their greenhouse gas emissions.

Image: Heat plant (Photo: Getty)

A conversion to wood biomass (wood chips and pellets) by Danish district heating plants has benefited the climate and is the more climate-friendly option compared to coal and natural gas. These are the findings of a new report from the University of Copenhagen's Department of Geosciences and Natural Resource Management.

The study is the first retrospective investigation of what a conversion to wood biomass has meant for greenhouse gas emissions at ten Danish cogeneration plants -- and thereby of the climate impact of replacing either coal or natural gas with wood biomass.

Among other things, researchers calculated the so-called carbon payback period for each plant, i.e. how long it takes for the conversion to wood biomass to elicit a positive climate effect.

"Our results demonstrate that the transition from coal to wood biomass has had a positive effect on CO2 emissions after an average of six years. When it comes to the transition from natural gas, it has in most cases taken between 9 and 22 years, and in one case 37 years before CO2 emissions were reduced," says Associate Professor Niclas Scott Bentsen of the Department of Geosciences and Natural Resource Management, who is one of the authors of the report.

Reduction in CO2 emissions

The researchers also looked at the total CO2 emissions from the three energy sources over a 30-year period, which is the life expectancy of a cogeneration plant.

Transitioning from coal to biomass resulted in a 15 to 71 percent reduction in CO2 emissions, while the move away from natural gas resulted in emissions reductions between -4 and 19 percent.

The fact that, in one case, the reduction after 30 years was -4 percent (that is, a slight increase in emissions) is partly because, relative to energy content, burning natural gas emits less CO2 than burning wood, and partly because this particular plant had notable changes in its product portfolio.

"When such large fluctuations in the figures occur, it is because the payback period and the amount of CO2 emissions saved are significantly affected by the type of fuel, where it comes from and other alternative uses of the wood," says Associate Professor Niclas Scott Bentsen

Forestry residues are best for the climate

The 10 Danish cogeneration plants sourced 32 percent of their wood biomass from Danish forests, 41 percent from the Baltic states, seven percent from Russia and Belarus, and seven percent from the United States. The type of wood biomass used and the distance it needed to be transported factored into the carbon budget as well, according to Bentsen.

"For the typical plant that was once coal-fired, but now using wood from around Denmark and only uses forestry residue that cannot be used for other products, the payback period was roughly one year. The 30-year saving was as much as 60%," explains Niclas Scott Bentsen.

Wood has enormous potential to displace carbon-heavy construction materials such as steel and concrete and is therefore an important aspect of the green transition.

"Our study demonstrates that the extent to which wood is used for construction or other forms of production, where the long lifespan of wood can bind CO2, is even better for the climate than using it as fuel," says Niclas Scott Bentsen.

FACTS:

The method used in the study is an analysis of time series from individual plants covering the periods before and after conversion from fossil energy sources to wood biomass. Among other things, the analysis drew on specific knowledge of the type of fuel used, where the fuel came from and what alternative uses the wood might have had.

Energy production is responsible for a large part of Danish greenhouse gas emissions. In 2018, more than 20 percent of greenhouse gas emissions were released as a result of heat and electricity production (9.4 out of 48 million tonnes of CO2).

Burning wood biomass generates 16 percent of Denmark's total energy consumption. By comparison, wind turbines supply 7 percent.

To shorten the carbon payback period and reduce atmospheric CO2 emissions, utilities should focus on using residual biomass (tree branches and crowns from logging, or residuals from the wood industry that have no other use) and biomass from productive forests, as well as reducing long transport distances.

Credit: 
University of Copenhagen - Faculty of Science

UCF researcher examines benefits of supportive communities for older adults

ORLANDO, Nov. 17, 2020 - The number of Americans age 65 and older continues to increase as the baby boom generation ages and people are living longer. At the same time, many seniors plan to "age in place," or continue living in their current homes, despite needing more assistance as they get older.

One strategy for aging in place is an emerging idea known as aging in community, in which older adults rely on a community support group or program for assistance. This can mean older adults living with or near family and friends, or in communities where residents can easily assist one another.

To find out just how well the aging-in-community strategy is working, a University of Central Florida health management and informatics researcher examined three aging-in-community programs in Florida. Her study, which is among the first to examine some key variables for these programs, was recently published in the journal Gerontology and Geriatric Medicine.

"Given the fast approaching 'super-aged society' in the U.S., there is a critical need to identify and assess the impact of aging-in-community programs aimed at helping older adults remain independent at home while also having a sense of belonging to their community," says Su-I Hou, professor and interim chair of UCF's Department of Health Management and Informatics and author of the study.

Hou examined two important factors for successful aging in community - people's perceived ability to live independently and their perceived neighborhood social cohesiveness - in three types of aging-in-community programs in Central Florida.

These were two village programs (Thriving-in-Place in Celebration and Neighbors Network in Winter Park); the Seniors First Meals on Wheels program in Orlando, a county neighborhood lunch program; and a university-based lifelong learning program, LIFE at UCF.

In a village program, older residents band together to drive each other to the doctor, help with errands and vet any outside services or assistance.

Countywide neighborhood lunch programs provide meals and nutritional services to older adults in a group setting.

The university-based lifelong learning program provides education for older adults and ways to connect with other people and engage in campus services and programs.

Among the nearly 300 older people surveyed across the aging-in-community programs, the researcher found that the higher a person's education level, the less likely they were to perceive themselves as able to live independently.

She also found that people who were married had greater perceptions of social cohesiveness, or that they lived in a neighborhood where people helped each other out.

When comparing the three programs, study data showed that older adults participating in the neighborhood lunch program reported higher confidence that they could live independently in their own homes, yet a lower level of neighborhood social cohesiveness, compared with older adults participating in the village or lifelong learning programs.

"The findings suggest 'remain independent at home' and 'having a sense of belonging to their community' may impact older adults with different characteristics or community-support systems differently," Hou says. "It calls attention to examine how these key factors operate in different programs promoting aging in community, as well as a need to promote confidence in living independently at home among higher-educated older adults, and to facilitate a sense of belonging to their own community for single, older adults."

Credit: 
University of Central Florida

Carbyne - an unusual form of carbon

What photophysical properties does carbyne have? This was the subject of research carried out by scientists at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), the University of Alberta, Canada, and the Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, which has led to a greater understanding of the properties of this unusual form of carbon. Their findings have now been published in the latest edition of the journal Nature Communications.

'Carbon has a very special status in the periodic table of the elements and forms the basis for all forms of life due to the extremely large number of chemical compounds it can form,' explains Prof. Dr. Dirk M. Guldi at the Chair of Physical Chemistry I at FAU. 'The most well-known examples are three-dimensional graphite and diamond. However, two-dimensional graphene, one-dimensional nanotubes and zero-dimensional nanodots also open up new opportunities for electronics applications in the future.'

Material with extraordinary properties

Carbyne is a modification of carbon, known as an allotrope. It is manufactured synthetically, consists of a single, very long chain of carbon atoms, and is regarded as a material with extremely interesting electronic and mechanical properties. 'However, carbon has a high level of reactivity in this form,' emphasises Prof. Dr. Clémence Corminboeuf from EPFL. 'Such long chains are extremely unstable and thus very difficult to characterise.'

Despite this fact, the international research team successfully characterised the chains using a roundabout route. The scientists led by Prof. Dr. Dirk M. Guldi at FAU, Prof. Dr. Clémence Corminboeuf, Prof. Dr. Holger Frauenrath from EPFL and Prof. Dr. Rik R. Tykwinski from the University of Alberta questioned existing assumptions about the photophysical properties of carbyne and gained new insights.

During their research, the team mainly focused on what are known as oligoynes. 'We can manufacture carbyne chains of specific lengths and protect them from decomposition by adding a type of bumper made of atoms to the ends of the chains. This class of compound has sufficient chemical stability and is known as an oligoyne,' explains Prof. Dr. Holger Frauenrath from EPFL.

Using the optical band gap

The researchers specifically manufactured two series of oligoynes with varying symmetries and with up to 24 alternating triple and single bonds. Using spectroscopy, they subsequently tracked the deactivation processes of the relevant molecules from excitation with light up to complete relaxation. 'We were thus able to determine the mechanism behind the entire deactivation process of the oligoynes from an excited state right back to their original initial state and, thanks to the data we gained, we were able to make a prediction about the properties of carbyne,' concludes Prof. Dr. Rik R. Tykwinski from the University of Alberta.

One important finding was that the so-called optical band gap is much smaller than previously assumed. The band gap, a term from semiconductor physics, largely determines how well crystals, metals and semiconductors conduct electricity. 'This is an enormous advantage,' says Prof. Guldi. 'The smaller the band gap, the less energy is required to conduct electricity.' Silicon, for example, which is used in microchips and solar cells, possesses this important property. Carbyne could be used in conjunction with silicon in the future due to its excellent photophysical properties.
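For the curious, the extrapolation behind such a result can be sketched as a curve fit: measure the optical gap of oligoynes of increasing length n and extrapolate to the infinite chain. A commonly used empirical form is E(n) = E_inf + B/n. The data points below are hypothetical placeholders, not the study's measurements.

```python
# Sketch: extrapolating oligoyne optical gaps to the infinite-chain
# (carbyne) limit. Data values are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

n = np.array([4, 6, 8, 12, 16, 24])                 # triple bonds per chain
gap_eV = np.array([4.1, 3.5, 3.1, 2.7, 2.5, 2.3])   # hypothetical gaps

def model(n, e_inf, b):
    return e_inf + b / n

(e_inf, b), _ = curve_fit(model, n, gap_eV)
print(f"extrapolated carbyne gap: {e_inf:.2f} eV")  # the n -> infinity limit
```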

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Microbial remedies target chemical threats in the environment

Image: Srivatsan Mohana Rangan is a researcher in the Biodesign Swette Center for Environmental Biotechnology and lead author of the new study (Credit: The Biodesign Institute at Arizona State University)

Across America, hazardous waste sites pose an ongoing threat to human and environmental health. The most severe cases are known as Superfund sites, of which over a thousand currently exist. Some 50 million Americans live within three miles of one of these zones, potentially placing them at increased risk for cancer and other serious diseases.

While decontamination of such sites is a public health priority, the technical challenges are daunting. Of particular concern are a pair of chlorine-containing chemicals: TCE (trichloroethene) and perchlorate. TCE was widely used as a degreasing agent, and perchlorate is used in the manufacture of propellants. Because these chemicals were widely relied on in the past and improperly disposed of, they have often found their way into the environment, posing significant risks to human health and surrounding ecosystems.

Bioremediation for the removal of these highly toxic chemicals, especially when they are present in mixtures, has long been a challenge for scientists. Chlorinated chemicals stubbornly persist in the environment, sometimes contaminating drinking water systems.

In a new study, researchers at the Biodesign Swette Center for Environmental Biotechnology explored new ways to rid the environment of these co-occurring toxic chemicals. To accomplish this, zero-valent iron (Fe0) in combination with microbial cultures containing an unusual microbe known as Dehalococcoides mccartyi was added to soil and groundwater samples from a contaminated Superfund site in Goodyear, Arizona. The contaminated site had formerly been involved in defense and aerospace manufacturing.

The researchers describe how Dehalococcoides bacteria can act in synergy with Fe0. The new study describes the conditions under which Fe0, Dehalococcoides, and other bacteria can effectively convert TCE and perchlorate to benign or less toxic end products of microbial biodegradation (e.g., ethene).

The study appears in the current issue of the journal Environmental Science & Technology.

Critically, the technique prevents the TCE degradation reaction from stalling midway through the process. When this happens, a pair of chemicals, cis-DCE and vinyl chloride, is produced instead of ethene. This would be bad news for the environment, as vinyl chloride is recognized as a highly potent carcinogen.

Instead, by using low concentrations of aged Fe0 along with Dehalococcoides, a complete reduction of TCE and perchlorate to harmless ethene and chloride ions was achieved. The study also demonstrated that high concentrations of Fe0 inhibited TCE and perchlorate reduction, while ferrous iron (Fe2+), an oxidation product of Fe0, significantly slowed the reduction of TCE to ethene.

"Usually, polluted environments contain more than one toxic contaminant, yet, we have limited information for managing environments with multiple contaminants," says Srivatsan Mohana Rangan, lead author of the new study. "The synergies between microbiological and abiotic reactions can help achieve successful remediation of multiple contaminants simultaneously in a shorter timeframe. Our study using microbial cultures with a chemical reductant, zerovalent iron, demonstrates scenarios for successful remediation of TCE and perchlorate, but also underscores scenarios which can exacerbate environmental contamination, by generating carcinogenic chemicals."

"We hope this study will help inform remedial design at Phoenix Goodyear Airport North Superfund Site and other contaminated environments where chemical reductants such as Fe0 are used to promote long-term and sustained microbial activities in the soil and groundwater," says Anca Delgado, co-author of the new study. (In addition to her Biodesign appointment, Delgado is an assistant professor at ASU's School of Sustainable Engineering and the Built Environment.)

The research findings pave the way for advanced microbial solutions to address contamination by chlorinated chemicals at Superfund sites across the country.

Credit: 
Arizona State University

New multiplex assay is a simple, cost-effective, and efficient alternative for SARS-CoV-2 testing

Philadelphia, November 17, 2020 - Scientists from Northwell Health Laboratories have developed a new diagnostic multiplex assay that can be used for epidemiological surveillance and clinical management of COVID-19. The Northwell Health Laboratories laboratory-developed test (NWHL LDT) uses a different set of reagents than current assays and can test 91 patients at a time for SARS-CoV-2, versus a maximum of 29 patients using the modified Centers for Disease Control and Prevention (CDC) assay. The NWHL LDT performs as well as the modified CDC test with comparable analytical specificity and accuracy, report scientists in The Journal of Molecular Diagnostics, published by Elsevier.

"The COVID-19 pandemic has led to many constraints on testing availability, so we hope that providing another testing option to detect SARS-CoV-2 with a clinically-validated set of reagents will assist in this effort at a time when supply chain has been a major issue," explained lead investigator Gregory J. Berry, PhD, D(ABMM), Infectious Disease Diagnostics, Northwell Health Laboratories, Lake Success, NY, USA; and Department of Pathology and Laboratory Medicine, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, USA.

Nucleic acid amplification test (NAAT)-based assays for detection of SARS-CoV-2 in respiratory specimens have been the standard diagnostic method. The CDC initially developed the most widely used NAAT assay, which includes primers and probes to detect the N1 and N2 regions of the nucleocapsid gene, which encodes a protein that plays a key role in packaging the viral genome, as well as the human RNase P gene to monitor RNA extraction and ensure specimen quality.

Dr. Berry and Wei Zhen, PhD, also of Infectious Disease Diagnostics, Northwell Health Laboratories, developed the NWHL LDT, a one-step qualitative real-time RT-PCR test run on the 7500 Fast Dx real-time PCR instrument. The NWHL LDT assay targets the S gene of SARS-CoV-2 and uses the same internal-control primers and probes as the modified CDC assay.

A limit of detection (LOD) study of the NWHL LDT with inactivated virus exhibited equal performance with the modified CDC assay, with a final LOD of 1,301 ±13 genome equivalents for the NWHL LDT compared to 1,249 ± for the modified CDC assay. A clinical evaluation with 270 nasopharyngeal swab specimens from individuals suspected of having COVID-19 exhibited 98.5 percent positive agreement and 99.3 percent negative agreement compared to the modified CDC assay.

The NWHL LDT also showed significant efficiencies over the CDC assay, since the test requires only one set of primer and probe mix per specimen, compared to three sets and the use of three wells for each patient in the modified CDC assay. This further contributes to the ease of setting up each run. Savings in hands-on time, reagents, and consumables are another advantage at a time of global reagent shortages. The assay can be easily implemented by other laboratories for diagnostic use.
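The throughput difference follows from simple well arithmetic on a 96-well plate. The control-well counts below are assumptions chosen to reproduce the article's 91-versus-29 figures, not details taken from the paper.

```python
# Plate-capacity arithmetic (96-well plate assumed; control counts are
# our assumptions, chosen to reproduce the article's figures).
WELLS = 96

# Modified CDC assay: 3 wells per patient (N1, N2, RNase P).
cdc_control_wells = 3 * 3                         # assume 3 controls x 3 wells
cdc_capacity = (WELLS - cdc_control_wells) // 3   # -> 29 patients

# NWHL LDT: 1 multiplexed well per patient.
ldt_control_wells = 5                             # assumed control wells
ldt_capacity = WELLS - ldt_control_wells          # -> 91 patients

print(cdc_capacity, ldt_capacity)  # 29 91
```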

The authors note that the NWHL LDT was evaluated at a single site and targets a single gene, while commercial assays for detecting the highly contagious SARS-CoV-2 pathogen have trended toward dual-target designs. "Occasional monitoring to verify that mutations have not developed in the region targeted by the NWHL LDT primers and probe is an adequate quality monitor to ensure continued consistent analytical performance," commented Dr. Zhen.

Credit: 
Elsevier

'Extremely aggressive' internet censorship spreads in the world's democracies


The largest collection of public internet censorship data ever compiled shows that even citizens of what are considered the world's freest countries aren't safe from internet censorship.

The University of Michigan team used its own Censored Planet tool, an automated censorship tracking system launched in 2018, to collect more than 21 billion measurements over 20 months in 221 countries. They recently presented a paper on the findings at the 2020 ACM Conference on Computer and Communications Security.

"We hope that the continued publication of Censored Planet data will enable researchers to continuously monitor the deployment of network interference technologies, track policy changes in censoring nations, and better understand the targets of interference," said Roya Ensafi, U-M assistant professor of electrical engineering and computer science who led the development of the tool.

Poland blocked human rights sites; India, same-sex dating sites

Ensafi's team found that censorship is increasing in 103 of the countries studied, including unexpected places like Norway, Japan, Italy, India, Israel and Poland. These countries, the team notes, are rated some of the world's freest by Freedom House, a nonprofit that advocates for democracy and human rights. They were among nine countries where Censored Planet found significant, previously undetected censorship events between August 2018 and April 2020. They also found previously undetected events in Cameroon, Ecuador and Sudan.

While the United States saw a small uptick in blocking, mostly driven by individual companies or internet service providers filtering content, the study did not uncover widespread censorship. However, Ensafi points out that the groundwork for that has been put in place here.

"When the United States repealed net neutrality, they created an environment in which it would be easy, from a technical standpoint, for ISPs to interfere with or block internet traffic," she said. "The architecture for greater censorship is already in place and we should all be concerned about heading down a slippery slope."

It's already happening abroad, the researchers found.

"What we see from our study is that no country is completely free," said Ram Sundara Raman, U-M doctoral candidate in computer science and engineering and first author of the study. "We're seeing that many countries start with legislation that compels ISPs to block something that's obviously bad like child pornography or pirated content.

"But once that blocking infrastructure is in place, governments can block any websites they choose, and it's a very opaque process. That's why censorship measurement is crucial, particularly continuous measurements that show trends over time."

Norway, for example--tied with Finland and Sweden as the world's freest country, according to Freedom House--passed laws requiring ISPs to block some gambling and pornography content beginning in early 2018. Censored Planet, however, uncovered that ISPs in Norway are imposing what the study calls "extremely aggressive" blocking across a broader range of content, including human rights websites like Human Rights Watch and online dating sites like Match.com.

Similar tactics show up in other countries, often in the wake of large political events, social unrest or new laws. News sites like The Washington Post and The Wall Street Journal, for example, were aggressively blocked in Japan when Osaka hosted the G20 international economic summit in June 2019. News, human rights and government sites saw a censorship spike in Poland after protests in July 2019, and same-sex dating sites were aggressively blocked in India after the country repealed laws against gay sex in September 2018.

Censored Planet releases technical details for researchers, activists

The researchers say the findings show the effectiveness of Censored Planet's approach, which turns public internet servers into automated sentries that can monitor and report when access to websites is being blocked. Running continuously, it takes billions of automated measurements and then uses a series of tools and filters to analyze the data and tease out trends.
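One of the core ideas behind such remote measurements can be sketched simply: ask a resolver in the country of interest and a trusted control resolver for the same domain, and flag disagreements for further analysis. The snippet below is a minimal illustration of that idea using the dnspython library, not Censored Planet's actual code; the vantage resolver address is a placeholder.

```python
# Minimal sketch of a remote DNS-consistency check (illustrative only;
# NOT Censored Planet's actual implementation). Requires: pip install dnspython
import dns.resolver

def resolve_a(domain: str, nameserver: str) -> set:
    """Return the set of A records for domain as seen via nameserver."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver]
    try:
        return {rr.address for rr in resolver.resolve(domain, "A")}
    except Exception:
        return set()  # timeouts and refusals are themselves worth logging

VANTAGE_RESOLVER = "203.0.113.53"  # placeholder: open resolver in the country
CONTROL_RESOLVER = "8.8.8.8"       # well-known public resolver as control

domain = "example.com"
vantage = resolve_a(domain, VANTAGE_RESOLVER)
control = resolve_a(domain, CONTROL_RESOLVER)
if vantage and control and vantage.isdisjoint(control):
    print(f"{domain}: answers disagree; candidate for interference analysis")
```

Disagreements alone do not prove censorship: content delivery networks often return region-specific addresses, which is exactly the kind of false positive the project's filtering steps are designed to weed out.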

The study also makes public technical details about the workings of Censored Planet that Raman says will make it easier for other researchers to draw insights from the project's data, and help activists make more informed decisions about where to focus.

"It's very important for people who work on circumvention to know exactly what's being censored on which network and what method is being used," Ensafi said. "That's data that Censored Planet can provide, and tech experts can use it to devise circumventions."

Censored Planet's constant, automated monitoring is a departure from traditional approaches that rely on volunteers to collect data manually from inside countries.

Manual monitoring can be dangerous, as volunteers may face reprisals from governments. Its limited scope also means that efforts are often focused on countries already known for censorship, enabling nations that are perceived as freer to fly under the radar. While censorship efforts generally start small, Raman says they could have big implications in a world that is increasingly dependent on the internet for essential communication needs.

"We imagine the internet as a global medium where anyone can access any resource, and it's supposed to make communication easier, especially across international borders," he said. "We find that if this continues, that won't be true anymore. We fear this could lead to a future where every country has a completely different view of the internet."

Credit: 
University of Michigan

Spintronics advances -- Controlling magnetization direction of magnetite at room temperature

Image: Toward high-density spintronic memory devices with large capacity and even neuromorphic devices that mimic biological neural systems (Credit: Tokyo University of Science)

Over the last few decades, conventional electronics has been rapidly reaching its technical limits in computing and information technology, calling for innovative devices that go beyond the mere manipulation of electron current. In this regard, spintronics, the study of devices that exploit the "spin" of electrons to perform functions, is one of the hottest areas in applied physics. But, measuring, altering, and, in general, working with this fundamental quantum property is no mean feat.

Current spintronic devices--for example, magnetic tunnel junctions--suffer from limitations such as high power consumption, low operating temperatures, and severe constraints in material selection. To this end, a team of scientists at Tokyo University of Science and the National Institute for Materials Science (NIMS), Japan, has recently published a study in ACS Nano presenting a surprisingly simple yet efficient strategy to manipulate the magnetization angle in magnetite (Fe3O4), a typical ferromagnetic material.

The team fabricated an all-solid reduction-oxidation ("redox") transistor containing a thin film of Fe3O4 on magnesium oxide and a lithium silicate electrolyte doped with zirconium (Fig. 1). The insertion of lithium ions through the solid electrolyte made it possible to rotate the magnetization angle at room temperature and to significantly change the electron carrier density.

Associate Professor Tohru Higuchi from Tokyo University of Science, one of the authors of the paper, says, "By applying a voltage to insert lithium ions from a solid electrolyte into a ferromagnet, we have developed a spintronic device that can rotate the magnetization with lower power consumption than magnetization rotation by spin current injection. This magnetization rotation is caused by the change of spin-orbit coupling due to electron injection into the ferromagnet."

Unlike previous attempts that relied on using strong external magnetic fields or injecting spin-tailored currents, the new approach leverages a reversible electrochemical reaction. After applying an external voltage, lithium ions migrate from the top lithium cobalt oxide electrode and through the electrolyte before reaching the magnetic Fe3O4 layer. These ions then insert themselves into the magnetite structure, forming LixFe3O4 and causing a measurable rotation in its magnetization angle owing to an alteration in charge carriers.

This effect allowed the scientists to reversibly change the magnetization angle by approximately 10°. Although a much greater rotation of 56° was achieved by raising the external voltage further, they found that the magnetization angle could not be switched back entirely (Fig. 2). "We determined that this irreversible magnetization angle rotation was caused by a change in the crystalline structure of magnetite due to an excess of lithium ions," explains Higuchi. "If we could suppress such irreversible structural changes, we could achieve a considerably larger magnetization rotation."

The novel device represents a big step in the control of magnetization for the development of spintronic devices. Moreover, its structure is relatively simple and easy to fabricate. Dr Takashi Tsuchiya, Principal Researcher at NIMS and the corresponding author of the study, says, "By controlling the magnetization direction at room temperature through the insertion of lithium ions into Fe3O4, we have made it possible to operate with much lower power consumption than magnetization rotation by spin current injection. The developed element operates with a simple structure."

Although more work remains to be done to take full advantage of this new device, the imminent rise of spintronics will certainly unlock many novel and powerful applications. "In the future, we will try to achieve a rotation of 180° in the magnetization angle," says Dr Kazuya Terabe, Principal Investigator at the International Center for Materials Nanoarchitectonics at NIMS and a co-author of the study. "This would let us create high-density spintronic memory devices with large capacity and even neuromorphic devices that mimic biological neural systems." Some other applications of spintronics lie in the highly coveted field of quantum computing.

Only time will tell what this frontier technology has in store for us!

Credit: 
Tokyo University of Science

People who purchased firearms during pandemic more likely to be suicidal

People who purchased a firearm during the pandemic are more likely to be suicidal than other firearm owners, according to a Rutgers study.

The study, published in the American Journal of Preventive Medicine, found that about 70 percent of those who bought a firearm during the COVID-19 pandemic reported having suicidal thoughts throughout their lives, compared to 37 percent of the rest of the community of gun owners.

"People who were motivated to purchase firearms during COVID-19 might have been driven by anxiety that leaves them vulnerable to suicidal ideation," said Michael Anestis, executive director of the New Jersey Gun Violence Research Center and an associate professor at the Rutgers School of Public Health. "While this does not guarantee an increase in suicide rates, it represents an unusually large surge in risk made more troubling by the fact that firearms purchased during COVID-19 may remain in homes beyond the pandemic."

According to Anestis, more than 2.5 million Americans became first-time gun owners during the first four months of 2020, with an estimated two million firearms purchased in March 2020 alone, when the initial surge of the coronavirus pandemic began.

"Firearm owners are usually no more likely than non-firearm owners to experience suicidal thoughts. It is possible that a higher-risk group is driving the current firearm purchasing surge, introducing long-term suicide risk into the homes of individuals who otherwise may not have acquired firearms during a time of extended social isolation, economic uncertainty and general upheaval," Anestis said.

In the Rutgers study, researchers surveyed 3,500 Americans, approximately one-third of whom were firearm owners, and asked about their reasons for purchasing a gun during the pandemic, their methods of gun storage and whether they had ever experienced thoughts of suicide. The study looked at three groups: people who were existing firearm owners who did not purchase a firearm during the pandemic, people who purchased a firearm during the pandemic and non-firearm owners.

The study found that, of those who bought a firearm during the pandemic, 70 percent had experienced suicidal thoughts at some point in their lives, 56 percent had experienced suicidal thoughts during the previous year, and 25 percent had experienced suicidal thoughts during the previous month. By contrast, for individuals who did not buy guns during the pandemic, the corresponding figures were 56 percent, 24 percent and 12 percent.

People who purchased a firearm during the pandemic also were found to be more likely to have storage habits that made the firearms less secure, such as switching between unloading their firearms and loading them before storage; using locking devices and then removing them; or switching between storing a firearm outside and inside the home.

"The increase in firearm purchases is concerning given that suicide is three times more likely in homes with firearms, and there is a hundred-fold increase in an individual's suicide risk immediately following the purchase of a handgun," said Anestis. "And unsafe firearm storage increases that risk."

Credit: 
Rutgers University

Novel technique 'stuns' arthritis pain in shoulder and hip

Image: Suprascapular nerve cooled radiofrequency ablation targets (Credit: Radiological Society of North America)

OAK BROOK, Ill. - A novel outpatient procedure offers lasting pain relief for patients suffering from moderate to severe arthritis in their hip and shoulder joints, according to a study presented at the annual meeting of the Radiological Society of North America (RSNA). Researchers said the procedure could help reduce reliance on addictive opiates.

People with moderate to severe pain related to osteoarthritis face limited treatment options. Common approaches like injections of anesthetic and corticosteroids into the affected joints grow less effective as the arthritis progresses and worsens.

"Usually, over time patients become less responsive to these injections," said Felix M. Gonzalez, M.D., from the Radiology Department at Emory University School of Medicine in Atlanta, Georgia. "The first anesthetic-corticosteroid injection may provide six months of pain relief, the second may last three months, and the third may last only a month. Gradually, the degree of pain relief becomes nonsignificant."

Without pain relief, patients face the possibility of joint replacement surgery. Many patients are ineligible for surgery because of health reasons, whereas many others choose not to go through such a major operation. For those patients, the only other viable option may be opiate painkillers, which carry the risk of addiction.

Dr. Gonzalez and colleagues have been studying a novel interventional radiology treatment known as cooled radiofrequency ablation (c-RFA) for pain relief in advanced degenerative arthritis. The procedure involves placing needles where the main sensory nerves lie around the shoulder and hip joints. The nerves are then treated with a low-grade radiofrequency current that "stuns" them, slowing the transmission of pain signals to the brain.

For the new study, 23 people with osteoarthritis underwent treatment, including 12 with shoulder pain and 11 with hip pain that had become unresponsive to anti-inflammatory pain control and intra-articular lidocaine-steroid injections. Treatment was performed two to three weeks after the patients received diagnostic anesthetic nerve blocks. The patients then completed surveys to measure their function, range of motion and degree of pain before and at three months after the ablation procedures.

There were no procedure-related complications, and both the hip and shoulder pain groups reported a statistically significant decrease in pain with a corresponding increase in dynamic function after the treatment.

"In our study, the results were very impressive and promising," Dr. Gonzalez said. "The patients with shoulder pain had a decrease in pain of 85%, and an increase in function of approximately 74%. In patients with hip pain, there was a 70% reduction in pain, and a gain in function of approximately 66%."

The procedure offers a new alternative for patients who are facing the prospect of surgery. In addition, it can decrease the risk of opiate addiction.

"This procedure is a last resort for patients who are unable to be physically active and may develop a narcotic addiction," Dr. Gonzalez said. "Until recently, there was no other alternative for the treatment of patients at the end of the arthritis pathway who do not qualify for surgery or are unwilling to undergo a surgical procedure."

At last year's RSNA annual meeting, Dr. Gonzalez presented similarly encouraging results from a study of a similar procedure for the treatment of knee arthritis. Together, the knee, shoulder and hip articulations account for approximately 95% of all arthritis cases.

The procedure could have numerous applications beyond arthritic pain, Dr. Gonzalez explained. Potential uses include treating pain related to diseases such as cancer and sickle cell anemia-related pain syndrome.

"We're just scratching the surface here," Dr. Gonzalez said. "We would like to explore efficacy of the treatment on patients in other settings like trauma, amputations and especially in cancer patients with metastatic disease."

Credit: 
Radiological Society of North America

Looking inside the glass

Image: Scientists at The University of Tokyo study aluminosilicate glass to determine its complex local structure with unprecedented detail. This work may lead to tougher and more inexpensive glass for touchscreens and solar arrays (Credit: Institute of Industrial Science, the University of Tokyo)

Tokyo, Japan - A team of researchers from the Institute of Industrial Science at The University of Tokyo used advanced electron spectroscopy and computer simulations to better understand the internal atomic structure of aluminosilicate glass. They found complex coordination networks among aluminum atoms within phase-separated regions. This work may open the possibility for improved glasses for smart device touchscreens.

As the demand for smartphones, tablets, and solar panels increases, so too will the need for more high-quality, tough, transparent glass. One candidate material for these applications is aluminosilicate glass, which is made of aluminum, silicon, and oxygen. As with all amorphous materials, the glass does not form a simple lattice but exists more like a disordered "frozen liquid." However, intricate structures that have not yet been analyzed by scientists can still form between the atoms.

Now, a team of researchers at The University of Tokyo has used electron energy loss fine structure spectroscopy with a scanning transmission electron microscope to reveal the local arrangement of atoms within a glass made of 50% aluminum oxide (Al2O3) and 50% silicon dioxide (SiO2). "We chose to study this system because it is known to phase separate into aluminum-rich and silicon-rich regions," first author Kun-Yen Liao says.

When imaging with an electron microscope, some emitted electrons undergo inelastic scattering, which causes them to lose some of their original kinetic energy. The amount of energy dissipated varies based on the location and type of atom, or cluster of atoms, in the glass sample they hit. Electron energy loss spectroscopy is sensitive enough to tell the difference between aluminum coordinated in tetrahedral as opposed to octahedral clusters. By fitting the profile of the electron energy loss fine structure spectra pixel by pixel, the abundance of the various aluminum structures was determined with nanometer precision. The team also used computer simulations to interpret the data.

"Aluminosilicate glasses can be manufactured to resist high temperatures and compressive stresses. This makes them useful for a wide range of industrial and consumer applications, such as touch displays, safety glass, and photovoltaics," senior author Teruyasu Mizoguchi says. Because aluminosilicate also occurs naturally, the technique can be used for geological research as well. The work is published in The Journal of Physical Chemistry Letters as "Revealing Spatial Distribution of Al Coordinated Species in a Phase-separated Aluminosilicate Glass by STEM-EELS".
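The pixel-by-pixel fitting step can be illustrated with a simple non-negative least-squares decomposition, in which each pixel's spectrum is modeled as a mixture of reference spectra for tetrahedral and octahedral aluminum. The sketch below runs on synthetic spectra and is not the study's actual analysis pipeline.

```python
# Illustrative pixel-wise decomposition of EELS spectra into two
# reference components via non-negative least squares (synthetic data).
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(70, 90, 200)  # hypothetical energy-loss axis (eV)

def peak(center, width):
    return np.exp(-((energy - center) / width) ** 2)

# Hypothetical reference spectra for two Al coordination environments.
ref_tet = peak(75.0, 1.5)   # tetrahedral aluminum
ref_oct = peak(78.0, 1.5)   # octahedral aluminum
A = np.column_stack([ref_tet, ref_oct])

# Fake 16x16 spectrum image: each pixel is a noisy mix of the references.
rng = np.random.default_rng(0)
true_weights = rng.random((16, 16, 2))
pixels = true_weights @ A.T + 0.01 * rng.standard_normal((16, 16, energy.size))

# Fit every pixel: recover non-negative abundances per coordination type.
abundance = np.empty((16, 16, 2))
for i in range(16):
    for j in range(16):
        abundance[i, j], _ = nnls(A, pixels[i, j])

print(abundance[..., 0].mean(), abundance[..., 1].mean())
```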

Credit: 
Institute of Industrial Science, The University of Tokyo

Good long-term effects of continuous glucose monitoring

Image: Prof. Marcus Lind, University of Gothenburg (Photo: Anette Juhlin)

New data on continuous glucose monitoring for people with type 1 diabetes, over a significantly longer period than before, are now available. A University of Gothenburg study shows that using the CGM tool, with its continuous monitoring of blood sugar (glucose) levels, has favorable effects over several years.

The technology of continuous glucose monitoring (CGM) is superseding that of the classic portable blood glucose meters, which require a prick in the finger several times daily. More and more people with type 1 diabetes have access to the new technology.

With CGM, blood glucose is measured continuously by means of a fine subcutaneous fiber thread that tracks the blood sugar of the person wearing it and reports it to a phone or a separate device. An alarm can be set to issue a warning if the figure becomes too low or too high.

Previous clinical CGM trials had investigated its use over roughly six months. In the latest trial, patients were followed up for much longer: two and a half years.

The study comprised 108 adult patients at 13 hospitals in Sweden. All the patients were being treated with insulin injections.

The results, published in the scientific journal Diabetes Care, demonstrate that CGM has several long-term positive effects for people with type 1 diabetes.

The patients' average blood glucose, measured as glycated hemoglobin (HbA1c), fell by 4 mmol/mol during the trial period. This is a clear improvement, despite the patients with CGM receiving less support from hospital staff during the study period than they had done before with capillary testing, the older technique.

The duration of episodes of very low blood sugar, below 3.0 mmol/L, which affect cognition and are often unpleasant for the patient, decreased by approximately 70%. The patients' blood glucose fluctuated less, they found CGM more comfortable and enjoyable, and they were less apprehensive about excessively low blood sugar levels.

The technique thus had effects that both gave the patients medical protection and enhanced their mental wellbeing, thereby providing scope for long-term, efficacious treatment.

In charge of the study was Marcus Lind, Professor of Diabetology at Sahlgrenska Academy, University of Gothenburg, and chief physician at Uddevalla Hospital. The work was carried out at the research unit in NU Hospital Group that he leads.

"Nowadays, most people with type 1 diabetes in Sweden get CGM. However, it's important for decision makers to use longer-term data as a basis for deciding which treatments to subsidize and support," Lind says.
"We've had periods when CGM has been questioned in some parts of the country, although it's become steadily more established over time. Robust data are also important at times when financing of various technical aids is increasingly being discussed."

"The study is also important from an international point of view, since CGM isn't available for most people in the world with type 1 diabetes - and this applies in Western countries as well. Long-term data are needed to enable more patients to use CGM. The present study shows that when patients use it and also receive support at the level recommended in clinical practice for extended periods, many vital variables improve for people with type 1 diabetes. So this study is important for type 1 diabetic patients worldwide," Lind concludes.

Credit: 
University of Gothenburg

Peel-off coating keeps desalination cleaner and greener

Image: The removable polyelectrolyte coating prevents biofouling on desalination membranes and avoids the need for harmful chemicals to clean seawater desalination systems (Credit: © 2020 KAUST; Xavier Pita)

A removable coating that can be used to clean desalination membranes has been developed by KAUST researchers. The nontoxic coating could provide a safer and more efficient alternative to harmful chemicals used to clean reverse osmosis systems for seawater desalination.

The reverse osmosis desalination process uses pressure to filter seawater through a semipermeable membrane to produce fresh drinking water. While the technique is more energy efficient than other desalination approaches, its performance can be hindered by the growth of bacteria and other microorganisms on the membrane surface.

"This biofilm creates a layer that does not allow water to pass as easily," says Maria Fernanda Nava-Ocampo, a Ph.D. student under the supervision of Johannes Vrouwenvelder. "One of the biggest problems of all the current methods to control biofouling is that they do not completely remove the biofilm from the membrane system, resulting in permanent fouling. This causes elevated energy consumption and disposal of control chemicals into the sea."

While other coatings are often made from harmful chemicals, the new polyelectrolyte coating avoids the need for toxic linkers to attach to the membrane. It can also be safely flushed out of the system with brine and increased flow, leaving the membrane surface clear of biofilm.

"The advantage of our coating is that it attaches to the surface by electrostatic interactions, so we don't have to use chemicals," says Nava-Ocampo. "We also don't have to pretreat the membrane in order to coat it. The membrane stays in the system and we just pass the coating through the same current flow used for desalination."

The team tested the coating they created using a membrane fouling simulator, a small device that mimics the conditions in reverse osmosis desalination plants. They circulated the coating through the system five times to establish layers on the surface of the membrane and added biodegradable nutrients to encourage the growth of biofilm.

After eight days, the researchers flushed the system with a strong flow of high-saline solution for 24 hours to remove the coating. The team compared the performance of the coated membrane with a noncoated control.

Using transmission electron microscopy, the researchers found that the coating remained stable in salty water, making it suitable for seawater desalination. By increasing the strength of the water flow and water salinity, the team was able to successfully remove the coating and the attached biofilm from the membrane.

After this cleaning process, the flow of liquid through the coated membrane was two-fold higher than through the noncoated control.

"This showed that our technique has better cleaning potential," says Nava-Ocampo. "The next step is making it more efficient and durable at larger scales."

Credit: 
King Abdullah University of Science & Technology (KAUST)