Tech

Dogs (not) gone wild: DNA tests show most 'wild dogs' in Australia are pure dingoes

image: That's not my name: 99 per cent of wild canines in Australia are pure dingoes or dingo-dominant hybrids, but they're being labelled as 'wild dogs'.

Image: 
Chontelle Burns / Nouveau Rise Photography.

Almost all wild canines in Australia are genetically more than half dingo, a new study led by UNSW Sydney shows – suggesting that lethal measures to control ‘wild dog’ populations are primarily targeting dingoes.

The study, published today in Australian Mammalogy, collates the results from over 5000 DNA samples of wild canines across the country, making it the largest and most comprehensive dingo data set to date.

The team found that 99 per cent of wild canines tested were pure dingoes or dingo-dominant hybrids (that is, a hybrid canine with more than 50 per cent dingo genes).

Of the remaining one per cent, roughly half were dog-dominant hybrids and the other half feral dogs.

“We don’t have a feral dog problem in Australia,” says Dr Kylie Cairns, a conservation biologist from UNSW Science and lead author of the study. “They just aren’t established in the wild.

“There are rare times when a dog might go bush, but it isn’t contributing significantly to the dingo population.”

The study builds on a 2019 paper by the team that found most wild canines in NSW are pure dingoes or dingo-dominant hybrids. The newer paper looked at DNA samples from past studies across Australia, including more than 600 previously unpublished samples.

Pure dingoes – dingoes with no detectable dog ancestry – made up 64 per cent of the wild canines tested, while an additional 20 per cent were at least three-quarters dingo.
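To make the ancestry categories above concrete, here is a minimal Python sketch that bins an estimated dingo-ancestry fraction into the labels the article uses. It is an illustrative helper only, with thresholds taken from the definitions above; the study itself assigned ancestry from DNA test results, not from code like this.

```python
def classify_canine(dingo_fraction: float) -> str:
    """Bin an estimated dingo-ancestry fraction (0.0-1.0) into the
    categories the article describes. Illustrative helper only; the
    study assigned ancestry from DNA test results, not via this code."""
    if dingo_fraction == 1.0:
        return "pure dingo"              # no detectable dog ancestry
    if dingo_fraction > 0.5:
        return "dingo-dominant hybrid"   # includes the 'at least 3/4 dingo' group
    if dingo_fraction > 0.0:
        return "dog-dominant hybrid"
    return "feral dog"

# The study's 99 per cent figure covers the first two categories.
print(classify_canine(0.8))  # dingo-dominant hybrid
```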

The findings challenge the view that pure dingoes are virtually extinct in the wild – and call into question the widespread use of the term ‘wild dog’.

“‘Wild dog’ isn’t a scientific term – it’s a euphemism,” says Dr Cairns.

“Dingoes are a native Australian animal, and many people don't like the idea of using lethal control on native animals.

“The term ‘wild dog’ is often used in government legislation when talking about lethal control of dingo populations.”

The terminology used to refer to a species can influence our underlying attitudes towards it, especially when it comes to native and culturally significant animals.

This language can contribute to other misunderstandings about dingoes, such as the idea that a dingo’s ancestry can be judged by the colour of its coat – which can naturally be sandy, black, white, brindle, tan, patchy, or black and tan.

“There is an urgent need to stop using the term ‘wild dog’ and go back to calling them dingoes,” says Mr Brad Nesbitt, an Adjunct Research Fellow at the University of New England and a co-author on the study.

“Only then can we have an open public discussion about finding a balance between dingo control and dingo conservation in the Australian bush.”

Tracing the cause of hybridisation

While the study found dingo-dog hybridisation isn’t widespread in Australia, it also identified areas across the country with higher traces of dog DNA than the national average.

Most hybridisation is taking place in southeast Australia – and particularly in areas that use long-term lethal control, like aerial baiting. This landscape-wide form of lethal control involves dropping meat baits filled with the pesticide sodium fluoroacetate (commonly known as 1080) into forests via helicopter or airplane.

“The pattern of hybridisation is really stark now that we have the whole country to look at,” says Dr Cairns.

“Dingo populations are more stable and intact in areas that use less lethal control, like western and northern Australia. In fact, 98 per cent of the animals tested here are pure dingoes.

“But areas of the country that used long-term lethal control, like NSW, Victoria and southern Queensland, have higher rates of dog ancestry.”

The researchers suggest that higher human densities (and in turn, higher domestic dog populations) in southeast Australia are likely playing a key part in this hybridisation.

But the contributing role of aerial baiting – which fractures the dingo pack structure and allows dogs to integrate into the breeding packs – is something that can be addressed.

“If we're going to aerial bait the dingo population, we should be thinking more carefully about where and when we use this lethal control,” she says.

“Avoiding baiting in national parks, and during dingoes’ annual breeding season, will help protect the population from future hybridisation.”

Protecting the ecosystem

Professor Mike Letnic, senior author of the study and professor of conservation biology, has been researching dingoes and their interaction with the ecosystem for 25 years.

He says they play an important role in maintaining the biodiversity and health of the ecosystem.

“As apex predators, dingoes play a fundamental role in shaping ecosystems by keeping numbers of herbivores and smaller predators in check,” says Prof. Letnic.

“Apex predators’ effects can trickle all the way through ecosystems and even extend to plants and soils.”

Prof. Letnic’s previous research has shown that suppressing dingo populations can lead to a growth in kangaroo numbers, which has repercussions for the rest of the ecosystem.

For example, high kangaroo populations can lead to overgrazing, which in turn damages the soil, changes the face of the landscape and can jeopardise land conservation.

A study published last month found the long-term impacts of these changes are so pronounced they are visible from space.

But despite the valuable role they play in the ecosystem, dingoes are not being conserved across Australia – unlike many other native species.

“Dingoes are a listed threatened species in Victoria, so they’re protected in national parks,” says Dr Cairns. “They’re not protected in NSW and many other states.”

The need for consultation

Dr Cairns, who is also a scientific advisor to the Australian Dingo Foundation, says the timing of this paper is important.

“There is a large amount of funding currently going towards aerial baiting inside national parks,” she says. “This funding is to aid bushfire recovery, but aerial wild dog baiting doesn’t target invasive animals or ‘wild dogs’ – it targets dingoes.

“We need to have a discussion about whether killing a native animal – which has been shown to have benefits for the ecosystem – is the best way to go about ecosystem recovery.”

Dingoes are known to negatively impact farming by preying on livestock, especially sheep.

The researchers say it’s important that these impacts are minimised, but how we manage these issues is deserving of wider consultation – including discussing non-lethal methods to protect livestock.

“There needs to be a public consultation about how we balance dingo management and conservation,” says Dr Cairns. “The first step in having these clear and meaningful conversations is to start calling dingoes what they are.

“The animals are dingoes or predominantly dingo, and there are virtually no feral dogs, so it makes no sense to use the term ‘wild dog’. It’s time to call a spade a spade and a dingo a dingo.”

Journal

Australian Mammalogy

DOI

10.1071/AM20055

Credit: 
University of New South Wales

Pressure sensor with high sensitivity and linear response based on soft micropillared electrodes

image: Schematic illustration and cross-sectional SEM image of the micropillared iontronic pressure sensor.

Image: 
©Science China Press

In recent years, with the rapid development of flexible electronic skins, high-performance flexible tactile sensors have attracted growing attention and found use in fields such as artificial intelligence, health monitoring, human-computer interaction, and wearable devices. Among the various sensor types, flexible capacitive tactile sensors offer high sensitivity, low energy consumption, fast response, and a simple structure. Sensitivity is a key parameter of any sensor, and a common way to improve it is to introduce microstructures and use ionic dielectric materials at the interface, forming a nanoscale ionic-electronic interface with ultra-high specific capacitance. However, because the materials are nearly incompressible and the microstructures are designed for stability, the resulting sensing signal tends to have poor linearity and a narrow pressure response range. A sensor with high linearity simplifies the conversion between capacitance and pressure: it can greatly reduce the complexity of the circuit design and data-processing system and improve the response speed of the sensing system. Producing flexible pressure sensors with both high linearity and high sensitivity has therefore become a key issue in the development of flexible electronic skin.

Recently, Chuan Fei Guo's research group from the Department of Materials Science and Technology of Southern University of Science and Technology has made progress in the research of highly linear flexible pressure sensors. They improved the deformability of the structure by designing a flexible electrode whose surface bears high-aspect-ratio micropillars that buckle easily under load (as shown in Fig. 1). Combined with an ionic gel dielectric layer, the sensor achieves both high linearity (R² ≈ 0.999) and high sensitivity (33.16 kPa⁻¹) over a wide pressure range of 12-176 kPa.
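As a rough illustration of how those two figures of merit are computed, this Python sketch fits the relative capacitance change against pressure: the slope of the fit is the sensitivity, and R² quantifies the linearity. The data here are synthetic stand-ins, not the paper's measurements.

```python
import numpy as np

# Synthetic capacitance-pressure data standing in for the post-buckling
# regime (12-176 kPa); the real device data come from the paper.
pressure_kPa = np.linspace(12, 176, 30)
rel_cap_change = 33.16 * pressure_kPa + np.random.normal(0, 20, 30)  # dC/C0

# Sensitivity is the slope of (dC/C0) vs. pressure; linearity is the R^2.
slope, intercept = np.polyfit(pressure_kPa, rel_cap_change, 1)
predicted = slope * pressure_kPa + intercept
ss_res = np.sum((rel_cap_change - predicted) ** 2)
ss_tot = np.sum((rel_cap_change - rel_cap_change.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"sensitivity S = {slope:.2f} kPa^-1, R^2 = {r_squared:.4f}")
```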

The micropillars undergo three deformation stages under pressure: initial contact (0-6 kPa), structural buckling (6-12 kPa) and post-buckling (12-176 kPa). In the post-buckling stage, the signal exhibits high linearity and high sensitivity, as shown in Fig. 2a.

The high linearity stems from the matching of the moduli of the micropillared electrode and the dielectric layer. The micropillars are made of the silicone rubber polydimethylsiloxane (PDMS), with an elastic modulus of 1 MPa, while the ion gel membrane has an elastic modulus of 5 MPa. Finite element analysis (FEA) shows that when a micropillared structure presses into a material with a megapascal-scale modulus, the contact area changes linearly with pressure (as shown in Fig. 2b), which matches the linear sensitivity obtained in the experiment.

In addition to its highly linear sensitivity, the sensor also has a low detection limit (0.9 Pa), a fast response time (9 ms), and high stability (the signal remains stable over 6,000 compression/bending cycles). Building on this performance, the team carried out a series of application experiments. A sensor attached to the middle finger segment of an artificial hand was used to lift different weights, and the sensor signal showed step changes that scaled uniformly with the added weight (~372 pF/g). Twenty-one sensors were then attached to a robotic manipulator for object-grasping experiments, where the sensor array faithfully reflected the pressure distribution of the grasped object. The sensor was also used to monitor the human radial artery, and the pulse signal remained stable under different pre-pressures (10.23-17.75 kPa), as shown in Fig. 3. In a plantar pressure distribution test, the sensor array clearly resolved the differences in pressure distribution between different stances.

The sensor's highly linear sensitivity derives from the design of the surface micropillared structure and the matching of the mechanical properties of the electrodes and dielectric materials. The combination of Euler's stability principle, FEA and scanning electron microscopy (SEM) characterization explains the origin of the linear response. The weight-lifting and grasping experiments with the manipulator, human pulse detection and the plantar pressure distribution test show that the sensor has great application potential in intelligent robots, human-computer interaction, and health monitoring. This work also provides new design ideas for research on flexible linear sensors.

Credit: 
Science China Press

One drop of blood brings progression of multiple myeloma into better view

A new method makes it much easier to follow the progression of multiple myeloma, a form of blood cancer. With a single drop of blood, it is possible to very accurately show whether the number of cancerous cells in the bone marrow is increasing in a patient. In time, this blood test could potentially replace the current bone marrow puncture.

Researchers at Radboud university medical center, in collaboration with Erasmus MC, have taken an important step towards implementing this new diagnostic, with a study published in Clinical Chemistry. Multiple myeloma is a severe form of blood cancer. Every year, approximately 1,000 people in the Netherlands are diagnosed with the disease. Improved treatments have made it possible to keep the disease under long-term control in a growing number of patients. In some patients, treatment works so well that no disease can be measured in the blood.

However, this does not mean that the malignant cells are all gone. Often these patients reach a state of minimal residual disease. Whether this is the case is currently determined by means of a bone marrow puncture, an unpleasant procedure for patients. Moreover, the test is not sensitive enough. Medical immunologist and last author Hans Jacobs: "The disease is found almost everywhere in the bone marrow, but some areas contain more cancerous cells than others. So if you take a biopsy where there are fewer cancer cells, the test result does not accurately reflect the real situation."

That's why there is a clinical need for a good, reliable alternative. Hans Jacobs, PhD student Pieter Langerhorst and colleagues at Erasmus MC found one in a blood test based on mass spectrometry. The test measures unique molecules, derived from the cancer cells, in the blood; these molecules reveal the presence of the cancer cells in just a drop of blood. This makes it possible to see very quickly whether the number of cancer cells in the body, or disease activity, is increasing, so that a treatment (for example, medication or chemotherapy) can be started more quickly if necessary.

Multiple myeloma: errors in cell division

Multiple myeloma is an uncontrolled division of malignant white blood cells (plasma cells) in the bone marrow. Healthy plasma cells make antibodies, which protect us against infections. In patients with multiple myeloma, something goes wrong with cell division, causing a malignant proliferation of plasma cells in the bone marrow. These cancer cells make abnormal antibodies, also called M-proteins, which end up in the blood.

Unique barcode for patients

Each M-protein contains a region that is unique to the cancer cells and the patient. This unique "barcode" distinguishes the healthy antibodies from those produced by the cancer cells. The research, made possible in part by KWF, showed that this barcode makes it possible to measure disease activity with 1,000 times greater sensitivity than current blood tests allow. In the current study, the scientists investigated whether the barcode in every patient is suitable for measurement with mass spectrometry.

Pieter Langerhorst: "We were able to use an international database of more than 600 multiple myeloma patients. In all patients we were able to find a suitable patient-specific barcode, making our new blood test applicable to every patient. This exceeded our expectations. With this study, we are taking an important step towards personalized diagnostics for patients with multiple myeloma. In the coming years, we want to do more research so that this method can hopefully be used in the clinic in due course."

Credit: 
Radboud University Medical Center

A general approach to high-efficiency perovskite solar cells

image: Researchers from the Institute for Applied Physics (IAP) and the Center for Advancing Electronics Dresden (cfaed) at TU Dresden developed a general methodology for the reproducible fabrication of high efficiency perovskite solar cells. Their study has been published in the renowned journal Nature Communications.

Image: 
Christiane Kunath

Perovskites, a class of materials first reported in the early 19th century, were "re-discovered" in 2009 as a possible candidate for power generation via their use in solar cells. Since then, they have taken the photovoltaic (PV) research community by storm, reaching new record efficiencies at an unprecedented pace. This improvement has been so rapid that by 2021, barely more than a decade of research later, they are already achieving performance similar to conventional silicon devices. What makes perovskites especially promising is the manner in which they can be created. Where silicon-based devices are heavy and require high temperatures for fabrication, perovskite devices can be lightweight and formed with minimal energy investment. It is this combination - high performance and facile fabrication - which has excited the research community.

As the performance of perovskite photovoltaics rocketed upward, left behind were some of the supporting developments needed to make a commercially viable technology. One issue that continues to plague perovskite development is device reproducibility. While some PV devices can be made with the desired level of performance, others made in the exact same manner often have significantly lower efficiencies, puzzling and frustrating the research community.

Recently, researchers from the Emerging Electronic Technologies Group of Prof. Yana Vaynzof have identified that fundamental processes occurring during perovskite film formation strongly influence the reproducibility of the photovoltaic devices. When depositing the perovskite layer from solution, an antisolvent is dripped onto the perovskite solution to trigger its crystallization. "We found that the duration for which the perovskite was exposed to the antisolvent had a dramatic impact on the final device performance, a variable which had, until now, gone unnoticed in the field." says Dr. Alexander Taylor, a postdoctoral research associate in the Vaynzof group and the first author on the study. "This is related to the fact that certain antisolvents may at least partly dissolve the precursors of the perovskite layer, thus altering its final composition. Additionally, the miscibility of antisolvents with the perovskite solution solvents influences their efficacy in triggering crystallization."

These results reveal that, as researchers fabricate their PV devices, differences in this antisolvent step could cause the observed irreproducibility in performance. Going further, the authors tested a wide range of potential antisolvents, and showed that by controlling for these phenomena, they could obtain cutting-edge performance from nearly every candidate tested. "By identifying the key antisolvent characteristics that influence the quality of the perovskite active layers, we are also able to predict the optimal processing for new antisolvents, thus eliminating the need for the tedious trial-and-error optimization so common in the field." adds Dr. Fabian Paulus, leader of the Transport in Hybrid Materials Group at cfaed and a contributor to the study.

"Another important aspect of our study is the fact that we demonstrate how an optimal application of an antisolvent can significantly widen the processibility window of perovskite photovoltaic devices" notes Prof. Vaynzof, who led the work. "Our results offer the perovskite research community valuable insights necessary for the advancement of this promising technology into a commercial product."

Credit: 
Technische Universität Dresden

Mapping COVID risk in urban areas: a way to keep the economy open

image: The researchers established five levels of "risk" zones: red, orange, blue, green, and pink (from highest to lowest risk). The level of risk was determined by multiplying hazard by vulnerability.

Image: 
Study authors

As COVID-19 vaccines slowly roll out across the world, government officials in densely populated countries must still manage vulnerable communities at highest risk of an outbreak.

In a new study published in the journal Risk Analysis, researchers in India propose a COVID Risk Assessment and Mapping (CRAM) framework that results in a zoned map that officials can use to place more targeted restrictions on high-risk communities. Successfully used by officials in Jaipur at the peak of the pandemic last spring, their framework could help other vulnerable countries avoid a shutdown of their regional economies.

Led by Shruti Kanga, associate professor in the Centre for Climate Change and Water Research at Suresh Gyan Vihar University, the team used satellite remote sensing and Geographic Information Systems (GIS) technology to conduct a spatial risk assessment of the city of Jaipur, located in the state of Rajasthan.

Jaipur had been experiencing a rapid increase in COVID-19 cases since the first cases of the virus were diagnosed in India in January 2020. Due to its high population density, the Jaipur area was subject to extended lockdowns. "It became imperative for the authorities to manage lockdowns without affecting the state's economy," the authors write.

The researchers developed CRAM to provide officials with a vulnerability assessment-based lockdown strategy. Their risk-mapping method involves three steps:

1. Generating GIS layers of administrative, hazard, socio-economic, and bio-physical data.

2. Integrating hazard and vulnerability to generate the risk assessment.

3. Mapping risk using an area's "boundary zones" to prioritize risk areas and prompt action.

The final result is a GIS map of an area with color-coded risk zones delineating the neighborhoods at highest risk of a COVID outbreak.

CRAM generates a risk assessment by integrating hazard and vulnerability components associated with COVID. In the case of Jaipur, this data included these vulnerability risks: total population, population density, and availability of clean water for sanitation. Hazard risks included proximity to COVID "hotspots" (areas with a high density of confirmed positive cases) and land use/land cover--pinpointing high risk settlements and agriculture, where people gather and get exposed to the virus.

Data for each of these factors were used to create a GIS layer for the final map. GIS gives researchers the ability to layer unrelated data points on top of one another to reveal trends via visual maps. "Pandemics are a spatial phenomenon," says Suraj Kumar Singh, a co-author and professor in the Centre for Sustainable Development at Suresh Gyan Vihar University. "Their spread and lethality can only be understood holistically using GIS tools."

The researchers established five levels of "risk" zones: red, orange, blue, green, and pink (from highest to lowest risk). The level of risk was determined by multiplying hazard by vulnerability. The resulting color-coded map for Jaipur depicted significant spatial variation -- indicating that most areas under high-risk red and orange zones were concentrated along the northeastern and southwestern zones of the study area.
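A minimal Python sketch of the core arithmetic just described: risk is hazard multiplied by vulnerability, then binned into the five colour zones. The [0, 1] scaling and the quantile cut points below are assumptions for illustration; the study's actual thresholds and GIS raster workflow are not reproduced here.

```python
import numpy as np

def risk_zones(hazard: np.ndarray, vulnerability: np.ndarray) -> np.ndarray:
    """Multiply hazard by vulnerability, then bin into five zones.
    Inputs are 2-D grids scaled to [0, 1]; thresholds are illustrative."""
    risk = hazard * vulnerability
    # Quantile-based cut points; the paper's thresholds are not given here.
    cuts = np.quantile(risk, [0.2, 0.4, 0.6, 0.8])
    zones = np.digitize(risk, cuts)  # 0..4, lowest to highest risk
    labels = np.array(["pink", "green", "blue", "orange", "red"])
    return labels[zones]

hazard = np.random.rand(4, 4)         # e.g. proximity to hotspots, land use
vulnerability = np.random.rand(4, 4)  # e.g. population density, water access
print(risk_zones(hazard, vulnerability))
```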

After consulting with authorities managing COVID-19 in the area, the researchers listed specific guidelines for the areas under each risk category. For example, closing shops in red zones; allowing shops to be open three days a week in blue zones; and allowing shops to be open five days a week in green zones. "In Jaipur, our CRAM helped local authorities in deciding which areas to put under lockdown," says Singh.

Highly populated countries of Asia--India, Bangladesh, and Pakistan--have been especially vulnerable to COVID-19 because of their poverty, population densities, and weak health care systems. Some parts of India are currently experiencing a second wave of COVID-19, including the state of Maharashtra, home to the bustling city of Mumbai. Singh suggests that the CRAM framework could be used in any densely populated area with high-risk communities.

"The CRAM framework can be applied anywhere in the world," says Singh. "Researchers and decision-makers only need to change the parameters that are specific to that particular geographic region governing COVID-19 or any pandemic."

Credit: 
Society for Risk Analysis

How much peanut does it take to trigger an allergic reaction?

image: Lynne Haber, PhD, shown in front of the University of Cincinnati College of Medicine.

Image: 
Colleen Kelley/University of Cincinnati

An estimated 6 million Americans may suffer from peanut allergies. Tiny amounts of peanut protein can lead to hives, itching, tingling in the mouth, shortness of breath or nausea within minutes.

For individuals with severe peanut allergies, food-induced anaphylaxis can occur. It's a life-threatening emergency that requires treatment with an injection of epinephrine and a trip to the emergency room. Food labels offer warnings such as "may contain peanuts" or "was processed in a facility that may process nuts."

The warnings allow individuals with severe reactions to steer clear, but for consumers who may be able to tolerate a minimal amount of peanut protein without major incident the labels aren't very useful, says Lynne Haber, PhD, a University of Cincinnati College of Medicine senior toxicologist.

But a new study that Haber has led may help change that situation in the United States.

Using patient data from multiple locations, scientists used mathematical models to estimate an "eliciting dose" -- the amount of peanut protein that will cause, or elicit, an allergic reaction in a certain percentage of peanut-sensitive patients, explains Haber. The study reviewed the responses of 481 patients in double-blind, placebo-controlled studies who were exposed to increasing levels of peanut protein in a controlled clinical setting until they had an allergic reaction.

The dose calculated to elicit an allergic reaction in 1% of patients with peanut allergies was 0.052 milligrams of peanut protein, about the weight of a single grain of salt, says Haber. The eliciting dose for 5% of patients was calculated to be 0.49 milligrams of peanut protein, or about the weight of a single grain of sugar, says Haber.
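Those two values behave like the 1st and 5th percentiles of a dose-response distribution. As a hedged illustration, if one assumes a log-normal dose-response (a common model family in food-allergen risk assessment; the paper's exact model is not restated here), the two published doses pin down the distribution's parameters, from which other eliciting doses follow:

```python
from scipy.stats import norm
import numpy as np

ed01, ed05 = 0.052, 0.49  # mg peanut protein, from the study
z01, z05 = norm.ppf(0.01), norm.ppf(0.05)

# Solve ln(ED_p) = mu + sigma * z_p for the two published quantiles.
sigma = (np.log(ed05) - np.log(ed01)) / (z05 - z01)
mu = np.log(ed05) - sigma * z05

ed10 = np.exp(mu + sigma * norm.ppf(0.10))  # illustrative extrapolation
print(f"mu={mu:.2f}, sigma={sigma:.2f}, implied ED10 ~ {ed10:.1f} mg")
```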

The findings were published in the scholarly journal Food and Chemical Toxicology.

"Risk is based on a combination of how inherently hazardous something is, and how much of that substance someone is exposed to," says Haber, an adjunct associate professor of environmental and public health sciences in the UC College of Medicine. "Arsenic is more toxic than sodium chloride, also known as table salt, but if you're not exposed to any arsenic, it does not pose any risk."

"The amount of exposure is also important in determining risk," says Haber. "Water is healthy, but if you drink enough of it, it could kill you. There has been a move to shift to labeling that is based on a combination of the inherent hazard of a substance and how much of it is in a product. This is being done in Australia, New Zealand and Europe. The United States has been slower to do this."

Haber worked with patient data from Stanford Medicine and the Consortium for Food Allergy Research. The study was supported by The Institute for the Advancement of Food and Nutrition Sciences (IAFNS). This non-profit 501(c)(3) scientific research organization pools funding from food industry collaborators and advances science through the in-kind and financial contributions from public and private sector participants.

"We were asked to do analysis using data from the U.S. population as there may be differences between the U.S. and other countries in terms of peanut consumption and exposure that affect the sensitivity to peanut," says Haber. "We have posted all the data and modeling code via the internet to ensure transparency. We have identified an exposure limit that is relevant to the U.S. population using a method and data that others can use for their own analysis."

Credit: 
University of Cincinnati

Research group identifies potential therapeutic target for lupus

A recent study published in JCI found that a neutrophil’s endoplasmic reticulum, the organelle that normally makes proteins in the cell, becomes stressed in the autoimmune disorder lupus. This stress activates a molecule called IRE1α, which appears to play a critical role in lupus pathogenesis in mice.

A multidisciplinary research group at the University of Michigan, spanning microbiology, dermatology and rheumatology, discovered that IRE1α orchestrates the release of neutrophil extracellular traps, or NETs, from lupus neutrophils. NETs are sticky, spider web-like structures that cause inflammation when released at the wrong time or in the wrong place.

NETs play an important role in the pathogenesis of lupus and other autoimmune diseases, where they trigger autoantibody formation and contribute to blood vessel clotting and damage.

Thanks to the previous work of study authors Basel Abuaita, Ph.D. and Mary X. O’Riordan, Ph.D., microbiologists and immunologists at Michigan Medicine, the research group knew that the IRE1α pathway was important for neutrophil activation in models of another potentially deadly disease, Staphylococcus aureus infection.

“Given that neutrophils are over-activated in lupus, we hypothesized that the IRE1α pathway might be part of the story in this disease, too,” says Gautam Sule, Ph.D., a postdoctoral fellow in rheumatology at Michigan Medicine. “It’s what prompted this collaboration, and the result was the discovery of an abnormally activated IRE1α pathway in lupus patient neutrophils, which tracks closely with disease severity.”

However, this new study posed unique challenges, because according to study author Jason S. Knight, M.D., Ph.D., a rheumatologist at Michigan Medicine, neutrophils aren’t easy to study.

Credit: 
Michigan Medicine - University of Michigan

Rutgers study: Bariatric surgery significantly reduces cancer risk for certain patients

image: Dr. Vinod K. Rustgi, professor of medicine, clinical director of hepatology and Director of the Center for Liver Diseases and Liver Masses, Robert Wood Johnson Medical School

Image: 
Steve Hockstein/Harvard Studio Photography

Bariatric surgery can significantly reduce the risk of cancer--and especially obesity-related cancers--by as much as half in certain individuals, according to a study by researchers at Rutgers Robert Wood Johnson Medical School's Center for Liver Diseases and Liver Masses.

The research, published in the journal Gastroenterology, is the first to show bariatric surgery significantly decreases the risk of cancer in individuals with severe obesity and nonalcoholic fatty liver disease (NAFLD). The risk reduction is even more pronounced in individuals with NAFLD-cirrhosis, the researchers say.

"We knew that obesity leads to certain problems, including cancer, but no one had ever looked at it the other way around--whether weight loss actually reduced the risk of those cancers," explains study author Dr. Vinod K. Rustgi, professor of medicine, clinical director of hepatology and Director of the Center for Liver Diseases and Liver Masses, Robert Wood Johnson Medical School. "Our study showed that all cancers were decreased, but obesity-related cancers in particular were decreased even more. Specifically, it showed a reduction in risk for all types of cancer by 18 percent, with the risk for obesity-related cancers being reduced by 25 percent. When comparing cirrhotic versus non-cirrhotic patients, cancer risk was reduced by 38 percent and 52 percent, respectively."

The retrospective study looked at de-identified claims data of more than 98,000 privately insured individuals aged 18 to 64 years who were diagnosed with severe obesity and NAFLD between 2007 and 2017. Of those, more than a third (34.1 percent) subsequently had bariatric surgery.

In addition to an overall reduction in cancer risk for these individuals, researchers found that bariatric surgery was associated with significant risk reductions in these individuals for the following obesity-related cancers: colorectal, pancreatic, endometrial and thyroid cancers, as well as hepatocellular carcinoma and multiple myeloma.

The study results offer practical insight for clinicians and building blocks for future studies on the connection between NAFLD and cancer, Dr. Rustgi says.

"Understanding the connection between NAFLD and cancer may identify new targets and treatments, such as antidiabetic-, satiety-, or GLP-1-based medications, for chemoprevention in NAFLD/NASH. Though bariatric surgery is a more aggressive approach than lifestyle modifications, surgery may provide additional benefits, such as improved quality of life and decreased long-term healthcare costs," the researchers indicate.

The next step for Center researchers, Dr. Rustgi says, is to explore whether this reduced cancer risk holds true for individuals with severe obesity who do not have NAFLD. They are also planning to study the mechanism by which this reduced risk occurs--whether factors such as hormonal changes induced by weight loss are the cause of reduced cancer risk, rather than just the weight loss itself, he says. In addition, Center researchers currently are studying the impact of bariatric surgery on cardiovascular outcomes, such as decreases in heart attacks and strokes.

Credit: 
Rutgers University

International investigation discovers bald eagles' killer

image: A bald eagle's drooped wings show signs of brain infection caused by the cyanobacterium Aetokthonos hydrillicola, which grows on the leaves of the invasive hydrilla plant in human-made lakes.

Image: 
UGA

The alarm bells began ringing when dozens of eagles were found dead near an Arkansas lake.

Their deaths--and, later, the deaths of other waterfowl, amphibians and fish--were the result of a neurological disease that caused holes to form in the white matter of their brains. Field and laboratory research over nearly three decades has established the primary clues needed to solve this wildlife mystery: Eagle and waterfowl deaths occur in late fall and winter within reservoirs with excess invasive aquatic weeds, and birds can die within five days after arrival.

But until recently, the toxin that caused the disease, vacuolar myelinopathy, was unknown.

Now, after years spent identifying a new toxic blue-green algal (cyanobacteria) species and isolating the toxic compound, an interdisciplinary research group from the University of Georgia and international collaborators have confirmed the structure of this toxin. The results were recently published in the journal Science.

The cyanobacterium grows on the leaves of an invasive water plant, Hydrilla verticillata, under specific conditions: in manmade lakes when bromide is present. The bacteria--and the animal deaths from the disease they cause--have been documented in watersheds across the southeastern United States. This is why it's important for anyone in the outdoors--anglers, hunters, birdwatchers and more--to be aware of the signs of this neurological disease and avoid consuming infected animals.

"We want people to recognize it before taking birds or fish from these lakes," said Susan Wilde, an associate professor of aquatic science at the Warnell School of Forestry and Natural Resources who first discovered the cyanobacteria. In some animals, such as birds, turtles, salamanders and even a beaver, the disease manifests as erratic movements or convulsions. Anglers must be even more cautious, though, as it's impossible to detect toxin in fish without obvious symptoms.

"For fish, it's tough. I would avoid eating fish with lesions or some sort of deformities; we do see affected fish with slow swimming speeds, but anglers won't be able to see that," added Wilde. "We want people to know the lakes where this disease has been documented and to use caution in consuming birds and fish from these lakes."

Wilde and Warnell graduate students studying the cyanobacteria have compiled maps and a list of affected watersheds.

The most recent study details new mapping of the bacteria's genome, a final piece in the puzzle to understand how it develops and survives. Wilde and others have been studying the cyanobacteria since 2001, when bald eagles began dying in Georgia, South Carolina and North Carolina. The following decades saw the discovery of the cyanobacteria itself, Aetokthonos hydrillicola (Latin for "eagle killer that grows on Hydrilla"), and connections made between the invasive aquatic plant and the animals that eat it.

But until recently, said professor Timo Niedermeyer of the Institute of Pharmacy at Martin Luther University Halle-Wittenberg in Germany, the origin of the brain-decimating disease was a mystery.

Niedermeyer, who has worked with cyanobacteria natural products for years, wanted to help put the pieces together. He contacted Wilde and offered to collaborate. Samples of Hydrilla collected in the field were sent to him, and his lab cultivated the cyanobacteria in the laboratory and sent them back to UGA for further testing. But the tests came back negative: The cyanobacteria from the lab did not induce the disease.

"It's not just the birds that were going crazy, we were too. We wanted to figure this out," said Niedermeyer. Once again, he had colonized leaves sent to him from UGA.

Steffen Breinlinger, a doctoral student in his research group, then used a new imaging mass spectrometer to investigate the composition on the surface of the plant's leaf, molecule by molecule. He discovered a new substance that only occurs on the leaves where the cyanobacteria grows but is not produced in the cyanobacteria cultures. His investigations into the chemical structure of the isolated molecule revealed five bromine atoms.

"The structure is really spectacular," said Breinlinger. The properties are unusual for a molecule formed by cyanobacteria, and they provide an explanation for why the toxin did not form under laboratory conditions, where bromide isn't present. "We then added bromide to our lab cultures, and the cyanobacteria started producing the toxin."

After almost a decade of testing the isolated molecule and collaboration between the labs in Germany and Georgia, they had their proof: the molecule does trigger vacuolar myelinopathy. The researchers call their discovery aetokthonotoxin, "poison that kills the eagle."

"Finally, we did not only catch the murderer, but we also identified the weapon the cyanobacteria used to kill those eagles," said Wilde.

The neurological disease has not yet occurred in Europe, and no instance of the toxin-forming cyanobacterium has been reported there. Vacuolar myelinopathy is not yet known to affect humans, although the researchers did induce the disease in chickens with the toxin, and Wilde continues to test fish and waterfowl such as ducks and coots for the disease.

Credit: 
University of Georgia

Rural US has more heart failure deaths than urban areas

'A persistent and troubling rural disadvantage'

Strategies needed to support rural Americans

CHICAGO --- Heart failure deaths are persistently higher in rural areas of the United States compared with urban areas, reports a new Northwestern Medicine study. The research also showed that race disparities in heart failure are prevalent in both rural and urban areas, with the greatest increases among Black adults under 65 years old.

Heart failure deaths have been increasing nationally since 2011, but there is significant geographic variation in these patterns based on race.

"This work demonstrates a persistent and troubling rural disadvantage with significantly higher rates of death in rural areas compared with urban areas," said lead study author Dr. Sadiya Khan, an assistant professor of medicine at Northwestern University Feinberg School of Medicine and a Northwestern Medicine cardiologist.

The study was published in the journal PLOS ONE this month.

Possible factors behind the disparities include higher levels of adverse social factors (e.g., lower income) and of risk factors such as obesity and diabetes in rural areas, as well as fewer physicians, specifically cardiologists.

"Research is needed to identify barriers and define best strategies to prevent heart failure and optimize guideline-directed medical therapies, once heart failure develops," Khan said.

This is the first study that:

Focuses on geographic heterogeneity in heart failure mortality rates by rural or urban area

Demonstrates that patterns of heart failure mortality are changing unfavorably, with increases since 2011 in both rural and urban areas; these increases are greater among adults under age 65, with the greatest increases among Black men younger than 65.

The study used national death certificate data from the Centers for Disease Control and Prevention that capture all deaths occurring in the U.S. Investigators identified cardiovascular deaths related to heart failure that occurred since 2011 and calculated annual age-adjusted mortality rates and trends in rural and urban areas, overall, by age group, and by race and sex.
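"Age-adjusted" here means directly standardized: each age group's crude death rate is weighted by that group's share of a standard population, so rural and urban areas can be compared despite different age structures. A minimal Python sketch with invented figures (not the study's data):

```python
# Direct age standardization: weight each age group's crude death rate
# by the standard population's share of that age group.
# All figures below are invented for illustration, not the study's data.
age_groups = ["35-64", "65-74", "75+"]
deaths     = [1200, 3400, 9800]          # heart failure deaths in the area
population = [4_000_000, 900_000, 600_000]
std_weights = [0.70, 0.18, 0.12]         # standard population shares

rate = sum(w * d / p for w, d, p in zip(std_weights, deaths, population))
print(f"age-adjusted rate: {rate * 100_000:.1f} per 100,000")
```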

Credit: 
Northwestern University

Controlling bubble formation on electrodes

image: This image shows the interplay among electrode wettability, porous structure, and overpotential. With the decrease of wettability (moving left to right), the gas-evolving electrode transitions from an internal growth and departure mode to a gas-filled mode, associated with a drastic change of bubble behaviors and significant increase of overpotential.

Image: 
Image courtesy of Ryuichi Iwata, Lenan Zhang, Evelyn Wang, Betar Gallant et al

Using electricity to split water into hydrogen and oxygen can be an effective way to produce clean-burning hydrogen fuel, with further benefits if that electricity is generated from renewable energy sources. But as water-splitting technologies improve, often using porous electrode materials to provide greater surface areas for electrochemical reactions, their efficiency is often limited by the formation of bubbles that can block or clog the reactive surfaces.

Now, a study at MIT has for the first time analyzed and quantified how bubbles form on these porous electrodes. The researchers have found that there are three different ways bubbles can form on and depart from the surface, and that these can be precisely controlled by adjusting the composition and surface treatment of the electrodes.

The findings could apply to a variety of other electrochemical reactions as well, including those used for the conversion of carbon dioxide captured from power plant emissions or air to form fuel or chemical feedstocks. The work is described today in the journal Joule, in a paper by MIT visiting scholar Ryuichi Iwata, graduate student Lenan Zhang, professors Evelyn Wang and Betar Gallant, and three others.

"Water-splitting is basically a way to generate hydrogen out of electricity, and it can be used for mitigating the fluctuations of the energy supply from renewable sources," says Iwata, the paper's lead author. That application was what motivated the team to study the limitations on that process and how they could be controlled.

Because the reaction constantly produces gas within a liquid medium, the gas forms bubbles that can temporarily block the active electrode surface. "Control of the bubbles is a key to realizing a high system performance," Iwata says. But little study had been done on the kinds of porous electrodes that are increasingly being studied for use in such systems.

The team identified three different ways that bubbles can form and release from the surface. In one, dubbed internal growth and departure, the bubbles are tiny relative to the size of the pores in the electrode. In that case, bubbles float away freely and the surface remains relatively clear, promoting the reaction process.

In another regime, the bubbles are larger than the pores, so they tend to get stuck and clog the openings, significantly curtailing the reaction. And in a third, intermediate regime, called wicking, the bubbles are of medium size and are still partly blocked, but manage to seep out through capillary action.

The team found that the crucial variable in determining which of these regimes takes place is the wettability of the porous surface. This quality, which determines whether water spreads out evenly across the surface or beads up into droplets, can be controlled by adjusting the coating applied to the surface. The team used a polymer called PTFE, and the more of it they sputtered onto the electrode surface, the more hydrophobic it became. It also became more resistant to blockage by larger bubbles.

The transition is quite abrupt, Zhang says, so even a small change in wettability, brought about by a small change in the surface coating's coverage, can dramatically alter the system's performance. Through this finding, he says, "we've added a new design parameter, which is the ratio of the bubble departure diameter [the size it reaches before separating from the surface] to the pore size. This is a new indicator for the effectiveness of a porous electrode."
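As a toy illustration of how such a design parameter could be used, the sketch below classifies the expected regime from the ratio of bubble departure diameter to pore size. The numeric cutoffs are invented for illustration and are not values from the paper.

```python
def bubble_regime(departure_diameter_um: float, pore_size_um: float) -> str:
    """Classify the gas-evolution regime from the ratio of bubble
    departure diameter to pore size. The cutoff values are assumptions
    for illustration; the paper names the ratio as the key parameter
    but its thresholds are not restated here."""
    ratio = departure_diameter_um / pore_size_um
    if ratio < 1.0:
        return "internal growth and departure (surface stays clear)"
    elif ratio < 2.0:
        return "wicking (partial blockage, gas escapes by capillary action)"
    return "gas-filled (pores clogged, reaction curtailed)"

print(bubble_regime(departure_diameter_um=40, pore_size_um=60))
```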

Pore size can be controlled through the way the porous electrodes are made, and the wettability can be controlled precisely through the added coating. So, "by manipulating these two effects, in the future we can precisely control these design parameters to ensure that the porous medium is operated under the optimal conditions," Zhang says. This will provide materials designers with a set of parameters to help guide their selection of chemical compounds, manufacturing methods and surface treatments or coatings in order to provide the best performance for a specific application.

While the group's experiments focused on the water-splitting process, the results should be applicable to virtually any gas-evolving electrochemical reaction, the team says, including reactions used to electrochemically convert captured carbon dioxide, for example from power plant emissions.

Gallant, an associate professor of mechanical engineering at MIT, says that "what's really exciting is that as the technology of water splitting continues to develop, the field's focus is expanding beyond designing catalyst materials to engineering mass transport, to the point where this technology is poised to be able to scale." While it's still not at the mass-market commercializable stage, she says, "they're getting there. And now that we're starting to really push the limits of gas evolution rates with good catalysts, we can't ignore the bubbles that are being evolved anymore, which is a good sign."

Credit: 
Massachusetts Institute of Technology

Study reveals bias among doctors who classify X-rays for coal miners' black lung claims

image: Robert Cohen

Image: 
UIC

University of Illinois Chicago researchers are the first to report on the financial conflicts of interest that exist among doctors who review the chest X-rays of coal miners who file workers' compensation claims of totally disabling disease with the U.S. Department of Labor's Federal Black Lung Program.

The UIC researchers found that the determinations of these doctors - who are known as B-readers and who are certified by the National Institute for Occupational Safety and Health, or NIOSH - were strongly associated with the party that hired them.

By analyzing 63,780 radiograph classifications made by 264 physicians in Black Lung Program claims filed during 2000-2013, the researchers found that B-readers who were identified as ever having been hired by a coal miner's employer read the images as negative for pneumoconiosis in 84.8% of the records. Pneumoconiosis is the general term for a class of lung diseases caused by the inhalation of dust - coal workers' pneumoconiosis, or CWP, is commonly known as black lung disease and is caused by long-term inhalation of coal dust.

Comparatively, a lower percentage of the records were read as negative for pneumoconiosis by those hired by the Department of Labor or a miner - 63.2% and 51.3% of the records, respectively.
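At its core, the comparison above is a negative-read rate computed per hiring party. A minimal pandas sketch of that computation follows; the column names and rows are hypothetical placeholders, not the study's actual record schema.

```python
import pandas as pd

# Hypothetical records: one row per radiograph classification.
# Column names are placeholders, not the study's actual schema.
df = pd.DataFrame({
    "hired_by": ["employer", "employer", "DOL", "miner", "miner"],
    "read_negative": [True, True, False, False, True],
})

# Share of classifications read as negative, by hiring party.
negative_rates = df.groupby("hired_by")["read_negative"].mean()
print(negative_rates)  # study: employer 84.8%, DOL 63.2%, miner 51.3%
```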

These results are published today in the Annals of the American Thoracic Society.

The authors write that, given the clear association between classifications and financial conflicts of interest, the lack of consistency in classifications within and between B-readers, and the absence of an objective gold standard for chest X-ray classifications, substantial improvements in the transparency, oversight, and objectivity of black lung claims are clearly needed.

UIC's Lee Friedman and Dr. Robert Cohen are senior authors of the study.

"Certainly, we anticipated finding some bias, as there has been anecdotal evidence for some time and the Department of Labor has even taken action since 2013 to avoid such bias. But the degree of bias shown in this data is alarming," said Friedman, associate professor of environmental and occupational health sciences at the UIC School of Public Health. "It begs the question: are those actions enough and are they helping?"

For example, NIOSH has written a rule to institute a panel to review and decertify B-readers who repeatedly provide unreasonably inaccurate classifications of X-rays. However, complaints must be submitted to NIOSH and only after three independent complaint investigations will a B-reader be decertified.

"The system we have today is not being used to its full potential and, even if it were, it still leaves a lot to be desired when it comes to ensuring accurate and judicious outcomes for all parties," said Cohen, clinical professor of environmental and occupational health sciences and director of the Mining Education and Resource Center.

The analysis also found that there were 64 B-readers who classified an absence of pneumoconiosis in 95% of their classifications, with the vast majority (93.3%) of the classifications being made by B-readers who were primarily hired by the employer. The majority of these B-readers - 51 of them - classified films as negative for pneumoconiosis in more than 99% of their classifications.

In contrast, there were 23 B-readers that diagnosed simple pneumoconiosis in 95% of their classifications, with a minority (22%) of the classifications being made by B-readers who were primarily hired by the claimant-miner; 18 of these B-readers diagnosed simple pneumoconiosis in more than 99% of their classifications.

"While there is evidence of bias on both sides, it is clear that the degree of bias is much heavier on the employer side, and this is twofold," Cohen said. "Not only are those hired by an employer much more likely to classify a chest X-ray as negative for black lung disease, but it is also much more likely that an employer will have the resources to hire its own expert - at a much higher fee - in the first place.

"It is clear from this data that this bias is a systemic problem and the most significant offenders are identifiable - the records show a clear pattern of B-reader conflicts of interest," he said.

Better utilizing the current regulations to decertify B-readers who show significant bias is among the recommendations the study's authors present in the paper.

The authors also recommend that all initial contact and payments should be made by USDOL, and the other parties should be prohibited from communicating on a claim until the initial classifications are submitted, limiting coordination between the reader and requester.

Cohen and Friedman say other methods to reduce bias could include growing and diversifying the pool of B-readers; regulating the fees of B-readers who testify on behalf of either party; requiring B-readers to disclose any wholesale relationships and the associated income from related classifications; and investing in scientific advances that leverage artificial intelligence to classify chest films without bias.

"The technology is there, but we don't have the systems in place to validate or implement a process," Cohen said. "It's a matter of motivation."

"This is really just the tip of the iceberg," Friedman said. "It is very likely that this issue extends beyond the Federal Black Lung Program and is pervasive across workers' compensation systems."

Credit: 
University of Illinois Chicago

X-rays combined with AI offer fast diagnostic tool in detecting COVID-19

X-rays, first used clinically in the late 1890s, could be a leading-edge diagnostic tool for COVID-19 patients with the help of artificial intelligence, according to a team of researchers in Brazil who taught a computer program, through various machine learning methods, to detect COVID-19 in chest X-rays with 95.6 to 98.5% accuracy.

They published their results in IEEE/CAA Journal of Automatica Sinica, a joint publication of the IEEE and the Chinese Association of Automation.

The researchers have previously focused on detecting and classifying lung pathologies, such as fibrosis, emphysema and lung nodules, through medical imaging. Common symptoms presented by suspected COVID-19 infections include respiratory distress, cough and, in more aggressive cases, pneumonia - all visible via medical imaging such as CT scans or X-rays.

"When the COVID-19 pandemic arose, we agreed to put our expertise to use to help deal with this new global problem," said corresponding author Victor Hugo C. de Albuquerque, a researcher in the Laboratory of Image Processing, Signals, and Applied Computing and with the Universidade de Fortaleza.

Many medical facilities have both an inadequate number of tests and lengthy processing times, Albuquerque said, so the research team focused on improving a tool that is readily available at every hospital and already frequently used in diagnosing COVID-19: X-ray devices.

"We decided to investigate if a COVID-19 infection could be automatically detected using X-ray images," Albuquerque said, noting that most X-ray images are available within minutes, compared to the days required for swab or saliva diagnostic tests.

However, the researchers found a lack of publicly available chest X-rays to train their artificial intelligence model to automatically identify the lungs of COVID-19 patients. They had just 194 COVID-19 X-rays and 194 healthy X-rays, while it usually takes thousands of images to thoroughly teach a model to detect and classify a particular target. To compensate, they took a model trained on a large dataset of other X-ray images and trained it to use the same methods to detect lungs likely infected with COVID-19. They used several different machine learning methods, two of which resulted in a 95.6% and a 98.5% accuracy rating, respectively.
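The team's exact pipeline is not described here, but the paragraph above is the classic transfer-learning recipe. A minimal PyTorch sketch of that idea: take a network pretrained on a large generic image dataset, freeze its feature extractor, and retrain only a new two-class head (COVID-19 vs. healthy) on the small X-ray set. The model choice, hyperparameters, and `train_loader` below are all placeholders, not the study's settings.

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on a large, unrelated image dataset,
# then retrain only what is needed for the small X-ray dataset.
model = models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = False              # freeze pretrained features
model.fc = nn.Linear(model.fc.in_features, 2)  # COVID-19 vs. healthy head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_epoch(train_loader):
    """One fine-tuning pass; `train_loader` is a placeholder that would
    yield batches of chest X-ray tensors and labels (~388 images here)."""
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```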

"Since X-rays are very fast and cheap, they can help to triage patients in places where the health care system has collapsed or in places that are far from major centers with access to more complex technologies," Albuquerque said. "This approach to detect and classify medical images automatically can assist doctors in identifying, measuring the severity and classifying the disease."

Next, Albuquerque said, the researchers plan to continue testing their method with larger datasets as they become available, with the ultimate goal of developing a free online platform for medical image classification.

Credit: 
Chinese Association of Automation

Uranium compound achieves record anomalous Nernst conductivity

image: Research published in Science Advances has found that large spin-orbit coupling and strong electronic correlations in a system of uranium-cobalt-aluminum doped with ruthenium resulted in a colossal anomalous Nernst conductivity. Uranium and actinide alloys are promising materials to study the interplay among a material's topology and strong electron correlations, which could someday have applications in quantum information technologies.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., March 26, 2021--New research has demonstrated that a magnetic uranium compound can have strong thermoelectric properties, generating four times the transverse voltage from heat achieved by the previous record holder, a cobalt-manganese-gallium compound. The result unlocks new potential for the actinide elements at the bottom of the periodic table and points to a fresh direction in research on topological quantum materials.

"We found that the large spin-orbit coupling and strong electronic correlations in a system of uranium-cobalt-aluminum doped with ruthenium resulted in a colossal anomalous Nernst conductivity," said Filip Ronning, lead investigator on the paper published today in Science Advances. Ronning is director of the Institute for Materials Science at Los Alamos National Laboratory. "It illustrates that uranium and actinide alloys are promising materials to study the interplay among a material's topology and strong electron correlations. We're very much interested in understanding, tuning and eventually controlling this interplay, so hopefully one day we can exploit some of these remarkable responses."

The Nernst response occurs when a material converts a flow of heat into an electric voltage. This thermoelectric phenomenon can be exploited in devices that generate electricity from a heat source. The most notable current example is the radioisotope thermoelectric generators (RTGs) that were developed in part at Los Alamos. RTGs use heat from the natural radioactive decay of plutonium-238 to generate electricity--one such RTG is currently powering the Perseverance rover on Mars.

"What's exciting is that this colossal anomalous Nernst effect appears to be due to the rich topology of the material. This topology is created by a large spin-orbit coupling, which is common in actinides," Ronning said. "One consequence of topology in metals is the generation of a transverse velocity, which can give rise to a Nernst response as we observe. It can also generate other effects such as novel surface states that may be useful in various quantum information technologies."

The uranium system studied by the Los Alamos team generated 23 microvolts per kelvin of temperature change--four times bigger than the previous record, which was discovered in a cobalt-manganese-gallium alloy a couple of years ago and also attributed to these sorts of topological origins.
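In conventional notation, the anomalous Nernst signal quoted above is the transverse electric field produced per unit applied temperature gradient. A standard way of writing it (generic notation, not taken verbatim from the paper) is:

$$ S_{xy} \;=\; \frac{E_y}{\left|\nabla_x T\right|} \;\approx\; 23\ \mu\mathrm{V\,K^{-1}}, $$

where $E_y$ is the transverse electric field that develops across the sample and $\nabla_x T$ is the longitudinal temperature gradient driving the heat flow.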

Credit: 
DOE/Los Alamos National Laboratory

UCI study finds high-fiber diet brings significant changes to human gut microbiome

Irvine, Calif., March 25, 2021 -- A short-term intervention in daily fiber consumption can significantly alter the gut microbiome and nutrient intake, according to a study led by University of California, Irvine researchers. The research was recently published by the American Society for Microbiology.

Dietary fiber consists of resistant carbohydrates found in fruits, vegetables and whole grains. Fiber persists in our digestive system, and although it is not digestible by humans, our gut bacteria can metabolize fiber into short-chain fatty acids and other byproducts critical to human health.

Currently, the average person in North America consumes less than 50 percent of the recommended dietary fiber level, owing to decreased consumption of plant-based foods as processed foods have become widespread. This reduced-fiber diet concerns health officials because low consumption of dietary fiber may be associated with diseases such as type II diabetes and colon cancer. Furthermore, new studies have begun to demonstrate how gut microbial changes can indirectly impact human health. A better understanding of dietary fiber's role in shaping the gut microbiota could therefore provide insights into managing diseases associated with the gut microbiome.

"The lack of fiber intake in the industrialized world is starving our gut microbes, with important health consequences that may be associated with increases in colorectal cancer, auto-immune diseases and even decreased vaccine efficacy and response to cancer immunotherapy," said Katrine Whiteson, associate professor of molecular biology & biochemistry who co-directs the UCI Microbiome Initiative.

To determine if increasing dietary fiber for a short time could alter the gut microbiome diversity and metabolite production, a research team led by UCI Microbiome Initiative co-directors Whiteson and Jennifer Martiny, professor of ecology & evolutionary biology, along with Julia Massimelli Sewall, assistant teaching professor, implemented a two-week dietary intervention during an undergraduate biology course at UCI.

Students who participated in the study were given 10 high-fiber, unprocessed meals each week for two weeks. During this time, they collected samples to track their gut microbial composition before and after the intervention. The students also recorded their macronutrient intake, aiming for a goal of 50 grams of fiber per day over the two-week intervention period.

Sewall, the course instructor, noted how much she and the students enjoyed learning which foods are enriched in fiber. "We were amazed to find how high in fiber berries and avocados are and exchanged ideas for how to prepare beans and lentils," she said. "I think this experience will have a life-long impact on how we all look at nutrition labels."

She also noted that the research experience highly motivated students in the course. "The students came to class very excited to discuss what they had eaten and could not wait to analyze the microbiome sequencing information to make data-driven conclusions. The study had an interesting and educational impact," she added. "Our education research showed that the experience increased students' interest in science and heightened their awareness of their diet habits."

Graduate student Andrew Oliver, a teaching assistant for the course, coached students during the process and advised them to drink plenty of water in addition to offering instruction in microbiology methods and analysis. "Students raised their fiber intake by an average of 25 grams per day, but the variability of pre-intervention fiber intake was substantial," he said. "A few students had to go from nearly zero to 50 grams daily by the end of the study. We all became a little obsessed with how much fiber was in the food we were eating."

After the intervention, the researchers compared overall bacterial composition using DNA sequencing and measured short-chain fatty acid production using gas chromatography. In addition to sequencing, the team ran experiments targeting the known fiber degrader Bifidobacterium. The researchers found that the two-week intervention significantly altered individual gut microbiome composition, including increasing the abundance of Bifidobacterium. However, despite the observed changes in gut microbiome composition, they did not detect a significant shift in the abundance of short-chain fatty acids.

"We hope to carry out longer dietary fiber interventions and study how fiber can support the gut microbiome and promote health. At this time during a pandemic, when we need our immune health and healthy vaccine responses, we encourage everyone to think about the plant diversity of their diets and add some beans, berries and avocados where they can," said Whiteson.

Credit: 
University of California - Irvine