Tech

Alberta researchers find elusive key to stopping neglected tropical diseases

image: (From left) Researchers Barbara Knoblach, Hiren Banerjee and Rick Rachubinski are now using their discovery to identify new, less toxic drugs to treat neglected tropical diseases that affect millions in Africa and South America.

Image: 
Ryan O'Byrne

Researchers at the University of Alberta have found an important protein in the cells of a deadly infectious parasite, opening the door to less harmful treatment for millions of people suffering from diseases like sleeping sickness in Africa and Chagas disease in South America.

The parasite, a trypanosome, is transmitted through insect bites. In a study published in Life Science Alliance, U of A cell biologist Rick Rachubinski and research associates Hiren Banerjee and Barbara Knoblach found that a protein called PEX3, long believed not to exist in the parasite, was in fact present and essential for its viability.

PEX3 is a critical component of the cell biology of many living things, including humans, plants and trypanosomes. It helps make and regulate parts of the cells called peroxisomes, which help break down fatty acids and amino acids in the body to create energy.

In a trypanosome, there are specialized peroxisomes that are vital for regulating the process of turning glucose in a host's blood into energy. Disrupting the PEX3 required for making them could be an effective way to target and kill the parasite without harming the host, according to the researchers.

"Finding PEX3 in trypanosomes has been very difficult. People have been looking for years and just could not find it," said Rachubinski, who is also a member of the U of A's Women and Children's Health Research Institute. "Some people said it didn't exist, that it was a different mechanism, but we believed that the simplest answer was that we just hadn't found it yet."

The protein eluded Rachubinski and his team for years until, by chance, they attended a lecture given by a researcher from Cambridge University.

In the lecture, the presenter mentioned using HHpred, a computer program offered by the Max Planck Institute for Developmental Biology in Germany, to search for proteins based on their predicted folded structures rather than the traditional approach of comparing amino acid sequences. Inspired, Rachubinski rushed back to his lab and used the program to compare human PEX3 against trypanosomal proteins. To his surprise, the program found a previously uncharacterized trypanosomal protein that was 95 per cent similar in its folded structure to human PEX3.

"It really was a 'Eureka!' moment. After years of searching, we throw this into the computer and 10 minutes later we have the answer," he said. "It was really kind of elegant."

Banerjee, who is the first author on the study, agreed.

"It was a great moment for our lab, especially when you have all these other papers saying PEX3 is not there," he said.

Rachubinski and his team are now in the process of identifying a drug treatment that targets and disrupts enough of the trypanosome's PEX3 interactions with partner proteins to prevent the parasites from creating enough energy to survive in the body.

While current treatments for trypanosomal diseases can be highly toxic to hosts, the team's hope is that this new approach will result in drugs that are not only safe for the hosts, but also cheap to produce and distribute.

For Rachubinski, the search for cures for neglected tropical diseases like sleeping sickness (also called human African trypanosomiasis) has been a lifelong endeavour. As an undergraduate, he was inspired by Paul de Kruif's 1926 book Microbe Hunters, a collection of biographies and true-life stories of the scientists and researchers who first began the fight against bacteria.

"This one, to me, has been really rewarding because it's been a constant theme throughout my career," Rachubinski said. "This is a worldwide problem, and I've always believed that affluent nations like ours should help emerging nations that don't have the resources to deal with burdens like neglected tropical diseases.

"Knowing this discovery could help millions of people is very satisfying to me."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

CRISPR gene editing may halt progression of triple-negative breast cancer

A tumor-targeted CRISPR gene editing system, encapsulated in a nanogel and injected into the body, could effectively and safely halt the growth of triple-negative breast cancer, report researchers at Boston Children's Hospital. Their proof-of-principle study, conducted in human tumor cells and in mice, suggests a potential genetic treatment for triple-negative breast cancer, which has the highest mortality rate of all breast cancers.

The new, patent-protected strategy is reported online this week in the journal PNAS.

Triple-negative breast cancer (TNBC), lacking estrogen, progesterone and HER2 receptors, accounts for 12 percent of all breast cancers. It occurs more frequently in women under age 50, in African American women, and in women carrying a BRCA1 gene mutation. Surgery, chemotherapy, and radiotherapy are among the few treatment options for this highly aggressive, frequently metastatic cancer, which is in urgent need of more effective targeted therapeutics.

The new study, led by Peng Guo, PhD, and Marsha Moses, PhD, in the Vascular Biology Program at Boston Children's, represents the first successful use of targeted CRISPR gene editing to halt growth of a TNBC tumor in vivo (via injection into live, tumor-bearing mice). The new system is non-toxic and utilizes antibodies to selectively recognize cancer cells while sparing normal tissues.

Experiments showed that the CRISPR system was able to home in on breast tumors and knock out a well-known breast-cancer promoting gene, Lipocalin 2, with an editing efficiency of 81 percent in tumor tissue. The approach attenuated tumor growth by 77 percent in the mouse model and showed no toxicity in normal tissues.

Precision delivery of CRISPR

To date, translating CRISPR gene editing technology into disease therapies has been limited by the lack of effective CRISPR delivery systems. One method uses a virus to deliver the CRISPR system, but the virus cannot carry large payloads and potentially can cause side effects if it "infects" cells other than those targeted. Another method encapsulates the CRISPR system inside a cationic polymer or lipid nanoparticles, but these elements can be toxic to cells, and the CRISPR system is often trapped or broken down by the body before it reaches its destination.

The new approach encapsulates the CRISPR editing system inside a soft "nanolipogel" made up of nontoxic fatty molecules and hydrogels. Antibodies attached to the gel's surface then guide the CRISPR nanoparticles to the tumor site. The antibodies are designed to recognize and target ICAM-1, a molecule the Moses Lab identified in 2014 as being a novel drug target for triple-negative breast cancer.

Because the particles are soft and flexible, they can more efficiently enter cells than their stiffer counterparts. While stiffer nanoparticles can get trapped by the cell's ingestion machinery, the soft particles were able to fuse with the tumor cell membrane and deliver CRISPR payloads directly inside the cell.

"Using a soft particle allows us to penetrate the tumor better, without side effects, and with bigger cargo," says Guo, the study's first author. "Our system can substantially increase tumor delivery of CRISPR."

Once inside the cell, the CRISPR system knocked out Lipocalin 2, an oncogene that promotes breast tumor progression and metastasis. Experiments showed that loss of the oncogene inhibited the cancer's aggressiveness and tendency to migrate or metastasize. The treated mice showed no evidence of toxicity.

Although the study focused on triple-negative breast cancer, Moses believes the team's CRISPR platform could be adapted to treat pediatric cancers as well, and could also deliver conventional drugs. These studies are ongoing. The team is in discussions with a number of companies interested in the technology.

"Our system can deliver significantly more drug to the tumor, in a precise and safe way," Moses says.

Credit: 
Boston Children's Hospital

Mosquito incognito: Could graphene-lined clothing help prevent mosquito bites?

image: A new study suggests that graphene films can prevent mosquitoes from biting. Experiments showed that graphene blocks the chemical cues that mosquitoes use to sense that a blood meal is near. In lab tests, skin patches covered by graphene films got zero bites, while mosquitoes readily feasted on unprotected skin.

Image: 
Hurt Lab / Brown University

PROVIDENCE, R.I. [Brown University] -- The nanomaterial graphene has received significant attention for its potential uses in everything from solar cells to tennis rackets. But a new study by Brown University researchers finds a surprising new use for the material: preventing mosquito bites.

In a paper published in Proceedings of the National Academy of Sciences, researchers showed that multilayer graphene can provide a two-fold defense against mosquito bites. The ultra-thin yet strong material acts as a barrier that mosquitoes are unable to bite through. At the same time, experiments showed that graphene also blocks chemical signals mosquitoes use to sense that a blood meal is near, blunting their urge to bite in the first place. The findings suggest that clothing with a graphene lining could be an effective mosquito barrier, the researchers say.

"Mosquitoes are important vectors for disease all over the world, and there's a lot of interest in non-chemical mosquito bite protection," said Robert Hurt, a professor in Brown's School of Engineering and senior author of the paper. "We had been working on fabrics that incorporate graphene as a barrier against toxic chemicals, and we started thinking about what else the approach might be good for. We thought maybe graphene could provide mosquito bite protection as well."

To find out if it would work, the researchers recruited some brave participants willing to get a few mosquito bites in the name of science. The participants placed their arms in a mosquito-filled enclosure so that only a small patch of their skin was available to the mosquitoes for biting. The mosquitoes were bred in the lab so they could be confirmed to be disease-free.

The researchers compared the number of bites participants received on their bare skin, on skin covered in cheesecloth and on skin covered by graphene oxide (GO) films sheathed in cheesecloth. GO is a graphene derivative that can be made into films large enough for macro-scale applications.

It was readily apparent that graphene was a bite deterrent, the researchers found. When skin was covered by dry GO films, participants didn't get a single bite, while bare and cheesecloth-covered skin was readily feasted upon. What was surprising, the researchers said, was that the mosquitoes completely changed their behavior in the presence of the graphene-covered arm.

"With the graphene, the mosquitoes weren't even landing on the skin patch -- they just didn't seem to care," said Cintia Castilho, a Ph.D. student at Brown and the study's lead author. "We had assumed that graphene would be a physical barrier to biting, through puncture resistance, but when we saw these experiments we started to think that it was also a chemical barrier that prevents mosquitoes from sensing that someone is there."

To confirm the chemical barrier idea, the researchers dabbed some human sweat onto the outside of a graphene barrier. With the chemical cues on the other side of the graphene, the mosquitoes flocked to the patch in much the same way they flocked to bare skin.

Other experiments showed that GO can also provide puncture resistance -- but not all the time. Using a tiny needle as a stand-in for a mosquito's proboscis, as well as computer simulations of the bite process, the researchers showed that mosquitoes simply can't generate enough force to puncture GO. But that only applied when the GO is dry. The simulations found that GO would be vulnerable to puncture when it was saturated with water. And sure enough, experiments showed that mosquitoes could bite through wet GO. However, another form of GO with reduced oxygen content (called rGO) was shown to provide a bite barrier when both wet and dry.

A next step for the research would be to find a way to stabilize the GO so that it's tougher when wet, Hurt says. That's because GO has a distinct advantage over rGO when it comes to wearable technology.

"GO is breathable, meaning you can sweat through it, while rGO isn't," Hurt said. "So our preferred embodiment of this technology would be to find a way to stabilize GO mechanically so that is remains strong when wet. This next step would give us the full benefits of breathability and bite protection."

All told, the researchers say, the study suggests that properly engineered graphene linings could be used to make mosquito protective clothing.

Credit: 
Brown University

Wild ground-nesting bees might be exposed to lethal levels of neonics in soil

image: A female squash bee (University of Guelph)

Image: 
University of Guelph

In a first-ever study investigating the risk of neonicotinoid insecticides to ground-nesting bees, University of Guelph researchers have discovered at least one species is being exposed to lethal levels of the chemicals in the soil.

Examining the presence of these commonly used pesticides in soil is important given the majority of bee species in Canada make their nests in the ground.

This study focused on hoary squash bees, which feed almost exclusively on the nectar and pollen of squash, pumpkin and gourd flowers.

Researchers found that the likelihood that squash bees are being chronically exposed to lethal doses of a key neonicotinoid, clothianidin, in soil was 36 per cent or higher in squash fields.

That means 36 per cent of the population is probably encountering lethal doses, well above the acceptable threshold of 5 per cent, at which 95 per cent of the bees would survive exposure.

"These findings are applicable to many other wild bee species in Canada that nest on or near farms," said U of G School of Environmental Sciences professor, Nigel Raine, who holds the Rebanks Family Chair in Pollinator Conservation and worked on the study with PhD student and lead author Susan Chan.

"We don't yet know what effect these pesticides are having on squash bee numbers because wild bees are not yet tracked the same way that honeybee populations are monitored. But we do know that many other wild bee species nest and forage in crop fields, which is why these findings are so concerning."

Published in Scientific Reports, the study began with Chan collecting soil samples from 18 commercial squash fields in Ontario. Pesticide residue information from these samples and a second government dataset from field crops was used by Chan and colleagues Prof. Ryan Prosser, School of Environmental Sciences, and Dr. Jose Rodríguez-Gil to assess the risk to ground-nesting squash bees.

The research comes as Health Canada places new limits on the use of three key neonicotinoids, including clothianidin, while it decides whether to impose a full phase-out of these pesticides. Neonics have been linked to concerns about honeybee colony health, with research showing these bees can ingest dangerous amounts through nectar or pollen.

"Current risk assessments for insecticide impacts on pollinators revolve around honeybees, a species that rarely comes into contact with soil," said Raine. "However, the majority of bee species live most of their life in soil, so risks of pesticide exposure from soil should be a major factor in these important regulatory decisions."

"Until now, no one has examined the risk to bees from neonics in soil despite the fact these pesticides are applied directly to seeds planted into the ground, or sprayed directly onto soil at planting, and can persist for months after application," said Chan.

"Only about 20 per cent of the neonicotinoid insecticide applied to coated seeds is actually taken up into the crop plant; the rest stays in the soil and can remain there into subsequent seasons."

Squash bees are at particular risk because they prefer the already-tilled soil of agricultural fields for their elaborate underground homes. And as they build their nests, they move about 300 times their own body weight worth of soil.

Since the bees don't eat soil, it's difficult to know exactly how much pesticide residue enters the bees. But the researchers calculated that even if they are conservative and assume only 25 per cent of the clothianidin enters the bee, the risk of lethal exposure in pumpkin or squash crops is 11 per cent -- still above the widely accepted threshold of 5 per cent.

The team also examined the exposure risk in field crops, since many ground-nesting bee species live near corn and soybean fields, which use neonics as well. They found that 58 per cent of ground-nesting bees would be exposed to a lethal dose of clothianidin while building their nests even if only 25 per cent of the clothianidin in the soil enters the bee.
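
To make the exceedance logic above concrete, here is a minimal illustrative sketch in Python. It is not the study's actual probabilistic risk model, and every number in it (body weight, lethal dose, residue values) is a hypothetical placeholder; only the 300-times-body-weight soil figure and the 25 per cent uptake assumption are taken from the article.

```python
# Illustrative sketch only -- NOT the study's risk model.
# Estimated intake ~= soil handled (300x body weight) * residue concentration
# * an assumed uptake fraction, compared against a lethal-dose threshold.
# All numeric values below are hypothetical placeholders.

BEE_BODY_WEIGHT_G = 0.1                     # hypothetical squash-bee body weight (g)
SOIL_HANDLED_G = 300 * BEE_BODY_WEIGHT_G    # "about 300 times their own body weight" of soil
UPTAKE_FRACTION = 0.25                      # conservative uptake assumption quoted in the article
LETHAL_DOSE_NG = 4.0                        # hypothetical lethal dose per bee (ng clothianidin)

def exceedance_risk(residues_ng_per_g):
    """Fraction of sampled fields where estimated intake exceeds the lethal dose."""
    exceed = sum(
        1 for c in residues_ng_per_g
        if c * SOIL_HANDLED_G * UPTAKE_FRACTION > LETHAL_DOSE_NG
    )
    return exceed / len(residues_ng_per_g)

# Hypothetical residue measurements (ng clothianidin per g of soil), one per field.
fields = [0.05, 0.3, 0.8, 0.02, 1.5, 0.6, 0.1, 0.4]
print(f"Estimated exceedance risk: {exceedance_risk(fields):.0%}")
```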

"Pumpkin and squash farmers face a dilemma in that they want to protect pollinators, such as the squash bee, because they are vital to crop production, but at the same time need to protect their crops from pests," said Chan.

"New approaches are needed that allow farmers to control pests and protect pollinators simultaneously. My advice to farmers is if you find an aggregation of squash bees nesting on your farm, protect these key pollinators from exposure to neonicotinoids by either not using them at all, or at least not using them near the aggregation. Creating pesticide-free places to nest will help your population of squash bees to grow over time."

Credit: 
University of Guelph

The secret of the fireworm is out: molecular basis of its light emission

image: Bioluminescence of Odontosyllis undecimdonta worm and its secreted mucus.

Image: 
Prof. Yuichi Oba (Chubu University, Japan)

"This work is an important milestone in the framework of a large project aimed at the full characterization of a novel bioluminescent system, including the luciferase enzyme, the luciferin substrate, key reaction products, the mechanism of light emission, and the biosynthetic pathways for luciferin and its analogs. The newly discovered molecules and the mechanisms presented in this work hold the potential to stimulate the development of new bioluminescence-based applications in the future," said Aleksandra Tsarkova, one of the lead authors and a researcher at the Shemyakin-Ovchinnikov Institute of Bioorganic Chemistry of the Russian Academy of Sciences (Russia).

Bioluminescence, the emission of "cold light" by living organisms, is widespread in the natural world. It is based on an enzyme-dependent biochemical oxidation reaction in which energy is released in the form of light. Of the roughly 40 presumed bioluminescent systems, the structures of the substrates, the luciferins, have been elucidated for only nine, and a complete description of the luciferin biosynthesis pathway along with the corresponding enzymes exists for only two.

In recent years a great variety of bioluminescence-based screening methods have been designed, owing to their utility in the dynamic monitoring of a variety of cellular functions. Bioluminescence imaging techniques grow more popular each year because of their high sensitivity and specificity compared with other imaging technologies. However, each of these techniques has limitations and drawbacks imposed by the bioluminescence system it uses, which is why the continued investigation of new natural bioluminescence systems that could overcome these limitations is so important.

Since 2014, Russian scientists, in collaboration with colleagues from Japan, Brazil, Spain, the US and the UK, have determined the structures and mechanisms of action of two new luciferins, from the bioluminescence systems of fungi and of Siberian earthworms. Today, in an article published in the journal Proceedings of the National Academy of Sciences USA, the team from the Shemyakin-Ovchinnikov Institute of Bioorganic Chemistry of the Russian Academy of Sciences, with colleagues from Japan and the US, has presented the structure of the 10th luciferin. The paper describes the results of a multi-year research project, supported by the Russian Science Foundation, including the characterization of three key low-molecular-weight components of the Odontosyllis bioluminescence system, along with the pathways of luciferin oxidation leading to light emission.

The so-called 'fireworms' Odontosyllis are tiny marine polychaetes (~20 mm) that produce bright blue-green bioluminescence (Fig. 1). During the breeding season, normally in the summer, their glowing swarms appear near the seashore for short periods at night. These fascinating swarms were observed and described by Christopher Columbus in his journals during his voyage of 1492. The investigation of the bioluminescence system of Odontosyllis began in the middle of the 20th century, but despite multiple attempts, including one by the Nobel Prize laureate Osamu Shimomura (1928-2018), the biochemical basis of the Odontosyllis light-emitting process remained largely unknown.

In 2018, the Odontosyllis luciferase gene was identified through partial protein purification and RNA-seq analyses using a small amount of frozen samples. By contrast, characterizing the chemical structure of the luciferin required a much greater amount of worm biomass.

"The first hurdle to overcome was Odontosyllis sample collection - a process restricted by the tiny size of the worms and the specifics of their life cycle, allowing us to obtain only a few grams of the specimens per year. However thanks to the efforts of the late Professor Shoji Inoue, who was dedicated to the study of the Japanese Odontosyllis undecimdonta bioluminescence, we've obtained 80 grams of the lyophilized Odontosyllis worms, that he has been collecting on his own for 17 years and has kept in the freezer for future research", - said Yuichi Oba from Department of Environmental Biology of the Chubu University (Japan).

The luciferin's susceptibility to decomposition and its extreme sensitivity to UV light presented the next serious challenge to the investigation of the Odontosyllis bioluminescence system, as they made obtaining the pure substrate difficult. A specifically designed purification procedure yielded minuscule amounts of unstable luciferin, which was subjected to a battery of tests to determine its structure. NMR spectroscopy, mass spectrometry and X-ray diffraction were employed to reveal a highly unusual tricyclic heterocyclic luciferin substrate containing three sulfur atoms in different electronic states. The unique structure of Odontosyllis luciferin provides a key insight into a completely novel chemical basis of bioluminescence, as this new molecule does not share structural similarity with any other known luciferin.

Along with the luciferin, two other crucial molecules were isolated from Odontosyllis biomass and identified: Odontosyllis oxyluciferin and the product of nonspecific luciferin oxidation, termed Green and Pink, respectively, after their colors. Together, the structures of these low-molecular-weight components of the Odontosyllis bioluminescent system have enabled the researchers to propose chemical transformation pathways for the enzymatic and non-specific oxidation of luciferin. Odontosyllis oxyluciferin is the only green primary emitter described for any known bioluminescent marine organism.

The newly elucidated chemical structures of Odontosyllis luciferin, oxyluciferin and the product of non-specific oxidation of luciferin reported in Proceedings of the National Academy of Sciences USA provide key insight into the chemical transformations underlying enzymatic and non-specific oxidations of luciferin.

Credit: 
AKSON Russian Science Communication Association

NASA's Terra Satellite finds some power in Tropical Depression 13W

image: On August 26 at 9:15 a.m. EDT (1315 UTC), the MODIS instrument that flies aboard NASA's Terra satellite found strongest storms (yellow) had cloud top temperatures as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

Infrared imagery from NASA's Terra satellite revealed Tropical Depression 13W contained some powerful thunderstorms pushing high into the troposphere as it was moving west in the Philippine Sea toward the Philippines.

Tropical Depression 13W has already triggered warnings in the Philippines because it is located just east of the country. Philippines warnings include Tropical cyclone wind signal #1 over the following Luzon provinces: Cagayan, Isabela, Quirino, Nueva Vizcaya, Apayao, Abra, Kalinga, Mountain Province, Ifugao, Benguet, Aurora, Nueva Ecija, eastern portion of Pangasinan, northern portion of Quezon including Polillo Island and Catanduanes.

NASA's Terra satellite used infrared light to analyze the strength of storms and found the most powerful thunderstorms stretching north over the center from west to east. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On August 26 at 9:15 a.m. EDT (1315 UTC), the Moderate Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite found those strongest storms had cloud top temperatures as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

At 11 a.m. EDT (1500 UTC), the center of Tropical Depression 13W was located near latitude 13.9 degrees north and longitude 128.5 degrees east. That is about 498 nautical miles east of Manila, Philippines. 13W was moving toward the west, toward the Philippines. Maximum sustained winds were near 52 mph (45 knots/83 kph) with higher gusts.
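
For reference, the unit conversions quoted in bulletins like this follow standard formulas; a quick Python sketch reproduces the figures above.

```python
# Standard unit conversions matching the figures quoted in the bulletin.

def f_to_c(deg_f):
    """Fahrenheit to Celsius."""
    return (deg_f - 32) * 5 / 9

def knots_to_mph(kt):
    return kt * 1.15078

def knots_to_kph(kt):
    return kt * 1.852

def nautical_miles_to_km(nm):
    return nm * 1.852

print(f_to_c(-80))                # about -62.2 C cloud-top temperature
print(knots_to_mph(45))           # about 51.8 mph, reported as 52 mph
print(knots_to_kph(45))           # about 83.3 kph
print(nautical_miles_to_km(498))  # about 922 km east of Manila
```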

The Joint Typhoon Warning Center expects 13W will move west across the Philippines, before turning northwest towards Hainan Island, China. The system will make final landfall in Vietnam after five days.

Credit: 
NASA/Goddard Space Flight Center

Bad Blooms: Researchers review environmental conditions leading to harmful algae blooms

image: Toxic cyanobacterial (Microcystis) bloom in the Liangxi River, Wuxi, China.

Image: 
Photo Hans Paerl, June 2016. Reproduced with permission of Wiley Publishing.

The combination of population growth, wastewater discharge, agricultural fertilization and climate change is a cocktail detrimental to humans and animals: it produces harmful algal blooms, many of which are toxic to humans and wildlife.

Wayne Wurtsbaugh, Professor Emeritus in the Watershed Sciences Department at Utah State University, along with Hans Paerl and Walter Dodds, published a global review of conditions that lead to these harmful algal blooms in rivers, lakes, and coastal oceans. Wurtsbaugh says that the review will be an excellent resource for students studying pollution and for managers wanting to review recent advances in this field of study. Their review highlights how agricultural, urban, and industrial activities have greatly increased nitrogen and phosphorus pollution in freshwater and marine systems. This pollution has degraded water quality and biological resources, costing societies billions of dollars through losses to fisheries, compromised drinking water safety, increased greenhouse gas emissions and harm to related social values. Their findings have been published in "Nutrients, eutrophication and harmful algal blooms along the freshwater to marine continuum."

Their scientific review highlights that although individual bodies of water may be more affected by increases in either phosphorus or nitrogen, the unidirectional flow through streams and lakes into marine ecosystems creates a continuum in which both nutrients become important in controlling algal blooms. The authors report how increasing nutrient levels have caused harmful blooms in waters as diverse as Utah Lake (Utah), mid-west agricultural streams, and the Gulf of Mexico, where a 5,800 mi2 (15,000 km2) dead zone has developed. The authors conclude that although the specifics of algal production vary in both space and time, reducing human inputs of both phosphorus and nitrogen may be necessary to decrease harmful algal blooms along the freshwater to marine continuum. These algal blooms make waters dysfunctional as ecological, economic, and esthetic resources.

The technology currently exists to control excessive nutrient additions, but more effective environmental regulations to control agricultural nutrient pollution, and investment in more advanced wastewater treatment plants, will be needed to reduce these inputs and improve water quality. Enhancing the quality of freshwater and coastal systems will become essential as climate change and human population growth place increasing demands on high-quality water resources.

Credit: 
S.J. & Jessie E. Quinney College of Natural Resources, Utah State University

Native approaches to fire management

image: Nicholas Nix sleeps in a traditional baby basket woven out of hazelnut stems by his grandmother Margo Robbins of the Yurok Tribe.

Image: 
Margo Robbins

It costs more than a new iPhone XS, and it's made out of hazelnut shrub stems. Traditional baby baskets of Northern California's Yurok and Karuk tribes come at a premium not only because they are handcrafted by skilled weavers, but because the stems required to make them are found only in forest understory areas experiencing a type of controlled burn once practiced by the tribes but suppressed for more than a century.

A new Stanford-led study with the U.S. Forest Service in collaboration with the Yurok and Karuk tribes found that incorporating traditional techniques into current fire suppression practices could help revitalize American Indian cultures, economies and livelihoods, while continuing to reduce wildfire risks. The findings could inform plans to incorporate the cultural burning practices into forest management across an area one and a half times the size of Rhode Island.

"Burning connects many tribal members to an ancestral practice that they know has immense ecological and social benefit especially in the aftermath of industrial timber activity and ongoing economic austerity," said study lead author Tony Marks-Block, a doctoral candidate in anthropology who worked with Lisa Curran, the Roger and Cynthia Lang Professor in Environmental Anthropology.

"We must have fire in order to continue the traditions of our people," said Margo Robbins, a Yurok basket weaver and director of the Yurok Cultural Fire Management Council who advised the researchers. "There is such a thing as good fire."

The study, published in Forest Ecology and Management, replicates Yurok and Karuk fire treatments that involve cutting and burning hazelnut shrub stems. The approach increased the production of high-quality stems (straight, unbranched and free of insect marks or bark blemishes) needed to make culturally significant items such as baby baskets and fish traps up to 10-fold compared with untreated shrubs.

Reducing fuel load

Previous studies have shown that repeated prescribed burning reduces fuel for wildfires, thus reducing their intensity and size in seasonally dry forests such as the one the researchers studied in the Klamath Basin area near the border with Oregon. This study was part of a larger exploration of prescribed burns being carried out by Stanford and U.S. Forest Service researchers who collaborated with the Yurok and Karuk tribes to evaluate traditional fire management treatments. Together, they worked with a consortium of federal and state agencies and nongovernmental organizations across 5,570 acres in the Klamath Basin.

The consortium has proposed expanding these "cultural burns" - which have been greatly constrained throughout the tribes' ancestral lands - across more than 1 million acres of federal and tribal lands that are currently managed with techniques including less targeted controlled burns or brush removal.

Tribes traditionally burned specific plants or landscapes as a way of generating materials or spurring food production, as opposed to modern prescribed burns that are less likely to take these considerations into account. The authors argue that increasing the number of cultural burns could ease food insecurity among American Indian communities in the region. Traditional food sources have declined precipitously due in part to the suppression of prescribed burns that kill acorn-eating pests and promote deer populations by creating beneficial habitat and increasing plants' nutritional content.

"This study was founded upon tribal knowledge and cultural practices," said co-author Frank Lake, a research ecologist with the U.S. Forest Service and a Karuk descendant with Yurok family. "Because of that, it can help us in formulating the best available science to guide fuels and fire management that demonstrate the benefit to tribal communities and society for reducing the risk of wildfires."

The researchers write that it would be easy and efficient to include traditional American Indian prescribed burning practices in existing forest management strategies. For example, federal fire managers could incorporate hazelnut shrub propane torching and pile burning into their fuel reduction plans to meet cultural needs. Managers would need to consult and collaborate with local tribes to plan these activities so that the basketry stems could be gathered post-treatment. Larger-scale pile burning treatments typically occur over a few days and require routine monitoring by forestry technicians to ensure they do not escape or harm nearby trees. As these burn, it would be easy for a technician to simultaneously use a propane torch to top-kill nearby hazelnut shrubs. This would not require a significant increase in personnel hours.

Fires with a purpose

"These are fires with a purpose, said Curran, who is also a senior fellow at the Stanford Woods Institute for the Environment. "Now that science has quantified and documented the effectiveness of these practices, fire managers and scientists have the information they need to collaborate with tribes to implement them on a large scale."

Credit: 
Stanford University

Optical neural network could lead to intelligent cameras

image: The system uses a series of 3D-printed wafers or layers with uneven surfaces that transmit or reflect incoming light.

Image: 
UCLA Samueli School of Engineering

UCLA engineers have made major improvements on their design of an optical neural network - a device inspired by how the human brain works - that can identify objects or process information at the speed of light.

The development could lead to intelligent camera systems that figure out what they are seeing simply by the patterns of light that run through a 3D engineered material structure. Their new design takes advantage of the parallelization and scalability of optical-based computational systems.

For example, such systems could be incorporated into self-driving cars or robots, helping them make near-instantaneous decisions faster and using less power than computer-based systems that need additional time to identify an object after it's been seen.

The technology was first introduced by the UCLA group in 2018. The system uses a series of 3D-printed wafers or layers with uneven surfaces that transmit or reflect incoming light - they're reminiscent of frosted glass in look and effect. These layers have tens of thousands of pixel points - essentially artificial neurons that form an engineered volume of material that computes all-optically. Each object has a unique light pathway through the 3D fabricated layers.

Behind those layers are several light detectors, each previously assigned in a computer to deduce what the input object is by where the most light ends up after traveling through the layers.

For example, if it's trained to figure out handwritten digits, then the detector assigned to identify a "5" will receive the most light after the image of a "5" has traveled through the layers.

In this recent study, published in the open access journal Advanced Photonics, the UCLA researchers have significantly increased the system's accuracy by adding a second set of detectors to the system, and therefore each object type is now represented with two detectors rather than one. The researchers aimed to increase the signal difference between a detector pair assigned to an object type. Intuitively, this is similar to weighing two stones simultaneously with the left and right hands - it is easier this way to differentiate if they are of similar weight or have different weights.
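
As a rough illustration of that differential readout, here is a toy Python sketch (not the UCLA group's code; the detector signals are made up): each object class is assigned a positive and a negative detector, and the predicted class is the one with the largest normalized difference between its pair of signals.

```python
import numpy as np

# Toy sketch of a differential-detection readout for a diffractive optical
# classifier: each class gets a (positive, negative) detector pair and the
# class score is the normalized signal difference. The intensities below
# are invented for illustration only.

def differential_scores(pos_signals, neg_signals):
    pos = np.asarray(pos_signals, dtype=float)
    neg = np.asarray(neg_signals, dtype=float)
    return (pos - neg) / (pos + neg + 1e-12)   # per-class normalized difference

# Hypothetical optical power collected on the 10 positive and 10 negative
# detectors after light from an input digit propagates through the layers.
pos = [0.9, 1.1, 0.8, 1.0, 1.2, 3.5, 0.7, 1.0, 0.9, 1.1]
neg = [1.0, 1.0, 0.9, 1.1, 1.0, 0.8, 1.0, 0.9, 1.0, 1.0]

scores = differential_scores(pos, neg)
print("Predicted digit:", int(np.argmax(scores)))   # the largest differential signal wins: 5
```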

This differential detection scheme helped UCLA researchers improve their prediction accuracy for unknown objects that were seen by their optical neural network.

"Such a system performs machine-learning tasks with light-matter interaction and optical diffraction inside a 3D fabricated material structure, at the speed of light and without the need for extensive power, except the illumination light and a simple detector circuitry," said Aydogan Ozcan, Chancellor's Professor of Electrical and Computer Engineering and the principal investigator on the research. "This advance could enable task-specific smart cameras that perform computation on a scene using only photons and light-matter interaction, making it extremely fast and power efficient."

The researchers tested their system's accuracy using image datasets of hand-written digits, items of clothing, and a broader set of various vehicles and animals known as the CIFAR-10 image dataset. They found image recognition accuracy rates of 98.6%, 91.1% and 51.4% respectively.

Those results compare very favorably to earlier generations of all-electronic deep neural nets. While more recent electronic systems have better performance, the researchers suggest that all-optical systems have advantages in inference speed, low-power, and can be scaled up to accommodate and identify many more objects in parallel.

Credit: 
UCLA Samueli School of Engineering

Astrophysicists link brightening of pulsar wind nebula to pulsar spin-down rate transition

image: An illustration of the pulsar and pulsar wind nebula (PWN) system (not to scale). The relativistic wind from the central pulsar is terminated by a shock at a radius of about one light-year and starts to radiate. The typical size of a PWN is a few light-years. The image of the Large Magellanic Cloud galaxy shown in the lower left was taken by YE Ziyi.

Image: 
Institute of High Energy Physics

Astrophysicists have discovered that the pulsar wind nebula (PWN) surrounding the famous pulsar B0540-69 brightened gradually after the pulsar experienced a sudden spin-down rate transition (SRT). This discovery, made by a group of astrophysicists led by GE Mingyu and LU Fangjun at the Institute of High Energy Physics of the Chinese Academy of Sciences, provides important clues to the spin-down mechanism and the magnetic field structure of the pulsar, as well as the physical properties of the PWN. The results were published in Nature Astronomy.

Pulsars are highly magnetized neutron stars born from supernova explosions of massive stars. They typically have radii of about 10 km and surface magnetic field strengths around 1 trillion Gauss. According to classic pulsar theory, an isolated pulsar loses energy through magnetic dipole radiation and thus slows down. However, more and more theorists believe that the main way an isolated pulsar loses its rotational energy is through a relativistic wind consisting of electrons, positrons and possibly magnetic field. If the wind is strong enough, it will eventually form a detectable PWN through interaction with the surrounding material. The famous Crab nebula is such a PWN, with a size of several light-years, i.e., about a hundred thousand times the distance from Earth to the Sun.

PSR B0540-69 is located in the Large Magellanic Cloud galaxy, a satellite galaxy about 160,000 light-years from our Milky Way. In December 2011, the spin-down rate of this pulsar suddenly increased by 36% and has remained almost constant since then, which means the energy release rate of the pulsar has also increased by 36%. Unlike other pulsars with similar spin-down rate transitions, which are accompanied by pulse profile and/or flux changes and are attributed to changes in the magnetospheres, no variation in either the pulse profile or flux has been detected from PSR B0540-69, making the cause of its SRT a mystery.

GE stated, "Using data obtained by a few X-ray astronomical satellites, we find that the X-ray PWN around PSR B0540-69 brightened gradually, by up to 32±8% over the prior flux, during a period of about 400 days after the SRT. We show that the SRT most likely resulted from a sudden enhancement of the magnetic field in the pulsar's magnetic pole region, which does not significantly affect the pulsed X-ray emission but increases the pulsar wind power and hence the PWN's X-ray emission." This is the first time that PWN brightening has been observationally connected with a pulsar spin-down rate transition, implying that the pulsar wind is the main factor slowing down the pulsar's spin. "The 400-day timescale of the flux increase corresponds to a magnetic field strength of about 0.8 milligauss in the PWN. This is also the first direct measurement of the PWN's magnetic field, and it is consistent with the value estimated previously under some assumptions," LU added.
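
As a rough back-of-the-envelope check of that last figure (a sketch only, assuming the 400-day brightening timescale can be read as the synchrotron cooling time of the X-ray-emitting electrons at a representative photon energy of about 5 keV; the published analysis may be more involved), a field of about 0.8 milligauss does give a cooling time of roughly 400 days:

```python
import math

# Back-of-the-envelope estimate using standard order-of-magnitude synchrotron
# relations; this is NOT the paper's model, just a rough consistency check.

SIGMA_T = 6.652e-25   # Thomson cross-section (cm^2)
M_E = 9.109e-28       # electron mass (g)
C = 2.998e10          # speed of light (cm/s)
H = 6.626e-27         # Planck constant (erg s)
KEV = 1.602e-9        # 1 keV in erg

def synchrotron_cooling_time_s(b_gauss, e_kev):
    """Cooling time of electrons whose characteristic synchrotron photons have energy e_kev."""
    nu_gyro = 2.8e6 * b_gauss                                   # electron gyrofrequency (Hz)
    # Characteristic photon energy ~ 0.29 * (3/2) * gamma^2 * h * nu_gyro
    gamma = math.sqrt(e_kev * KEV / (0.29 * 1.5 * H * nu_gyro))
    return 6 * math.pi * M_E * C / (SIGMA_T * gamma * b_gauss**2)

t = synchrotron_cooling_time_s(0.8e-3, 5.0)
print(f"cooling time ~ {t / 86400:.0f} days")   # comes out near the ~400-day timescale
```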

Credit: 
Chinese Academy of Sciences Headquarters

Philippine airborne campaign targets weather, climate science

image: Smoke off the west coast of Borneo and dueling tropical cyclones in the Philippine Sea are shown in this color image from the Advanced Himawari Imager aboard the JAXA Himawari-8 satellite, from August 21st, 2018.

Image: 
CIRA/CSU and JAXA/NOAA/NASA

NASA's P-3B science aircraft soared into the skies over the Philippines on Aug. 25 to begin a nearly two-month-long investigation on the impact that smoke from fires and pollution have on clouds, a key factor in improving weather and climate forecasts. The Cloud, Aerosol, and Monsoon Processes Philippines Experiment (CAMP2Ex) is the most comprehensive field campaign to date in Maritime Southeast Asia to study the relationship between aerosol particles as they interact with surrounding monsoon meteorology, cloud microphysics and the sun's radiation.

Led by NASA, the U.S. Naval Research Laboratory (NRL) and the Manila Observatory in conjunction with the Philippine Atmospheric, Geophysical and Astronomical Services Administration and the Philippine Department of Science and Technology, CAMP2Ex comprises an interdisciplinary, international team of field researchers, modelers and remote sensing developers.

The study seeks to tackle some of the most difficult weather and climate phenomena to understand, monitor and forecast. The Maritime Continent--comprising Sumatra, the Malay Peninsula, Borneo, Sulawesi, the Philippines and numerous other islands and surrounding seas--has long been an area of scientific inquiry. Agricultural and deforestation fires from the region, along with air pollution from cities, provide a ready supply of aerosol particles that influence major weather processes. Besides the torrential monsoons over the Asian archipelago, the region also produces moisture that provides rainfall over the Pacific Ocean and can even influence weather in the continental United States.

"We know aerosol particles can affect clouds and precipitation, but we don't yet have a quantitative understanding of those processes," said Hal Maring, Radiation Sciences Program Manager at NASA Headquarters in Washington. "Our goal is to improve satellite products and numerical models to help scientists better predict weather and climate."

"Numerous studies have linked the presence of pollution and smoke from agricultural fires and fires from deforestation to changes in cloud and storm properties, but we lack the observations of the actual mechanisms taking place," said NRL research meteorologist Jeffrey Reid. "CAMP2Ex provides a much-needed crucible for satellite observing systems and model predictions to monitor and understand how atmospheric composition and weather interact."

Credit: 
NASA/Goddard Space Flight Center

New paper creates omega-3 calculator for researchers to specify EPA+DHA doses in studies

A new study published in the American Journal of Clinical Nutrition will make it possible for researchers to calculate how much omega-3 EPA and DHA they need to use in their studies in order for subjects to reach a healthy Omega-3 Index.

Until now, there has been very little guidance about what dose of EPA and DHA should be tested in a study. And with the wide differences in study results in recent years, it is likely that dose played a role in the relative success or failure of omega-3 studies. In other words, if the dose of EPA and DHA in a study isn't high enough to make an impact on blood levels (i.e. the Omega-3 Index), there may be no effect on the desired endpoint, leading to a neutral result.

When it comes to cardiovascular disease (CVD) in particular, the literature supporting the benefits of omega-3s EPA and DHA has been mixed. On one hand, a 2018 meta-analysis concluded that current evidence does not support a role for omega-3s in CVD risk reduction.

On the other hand, three major randomized trials reported in late 2018 showed that omega-3s significantly reduced risk for vascular death, myocardial infarction, and major adverse cardiovascular events. The latter study was particularly compelling because it used 4 grams of EPA (as opposed to the usual 0.84 grams of EPA and DHA) in statin-treated patients and found a 25% risk reduction in CVD events.

According to Kristina Harris Jackson, PhD, RD, co-lead author of the paper, "A low dose could make a study show no effect of EPA and DHA, which makes the literature more indecisive and the medical community more skeptical of omega-3 benefits. Hopefully, ensuring the dose of EPA and DHA is high enough to reach a target Omega-3 Index level will clarify whether or not EPA and DHA are effective."

How to Use the Calculator

The model equation developed in this paper can be used to estimate the final Omega-3 Index (and 95% CI) of a population given the omega-3 EPA and DHA dose and baseline Omega-3 Index. As an example, a population with a baseline Omega-3 Index of 4.9% that is given 840 mg EPA and DHA per day (as a 1-gram ethyl ester capsule) would achieve a mean Omega-3 Index of 6.5% (95% CI: 6.3%, 6.7%).

Rearranging the equation, one can calculate the approximate EPA/DHA doses (of triglyceride forms) needed to achieve a mean Omega-3 Index of 8% in 13 weeks. This would require about 2200 mg of EPA and DHA for a baseline Omega-3 Index of 2%, approximately 1500 mg for a baseline Omega-3 Index of 4%, and roughly 750 mg of EPA and DHA for a baseline Omega-3 Index of 6%.

Using this example, Jackson and her colleagues predicted that the minimum dose of EPA and DHA necessary to be 95% certain that the mean baseline Omega-3 Index of 4% will increase to 8% (in 13 weeks) is 1750 mg per day of a triglyceride formulation or 2500 mg per day of an ethyl ester formulation. Both of these forms are common in fish oil preparations.

So in order for 95% of subjects (not just 50%) to achieve a desirable Omega-3 Index from a baseline of 4%, roughly 2000 mg per day of EPA and DHA (depending on the chemical form) would likely be required.
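
As a rough illustration of the arithmetic above, the following Python sketch simply interpolates the dose figures quoted for triglyceride-form EPA+DHA over 13 weeks; it illustrates the article's numbers and is not the regression equation published in the paper.

```python
# Rough sketch: linearly interpolate the EPA+DHA doses quoted above
# (triglyceride form, 13 weeks, target mean Omega-3 Index of 8%) against the
# baseline Omega-3 Index. Illustration only -- not the published model.

ANCHORS = [          # (baseline Omega-3 Index %, approx. mg/day EPA+DHA to reach 8%)
    (2.0, 2200),
    (4.0, 1500),
    (6.0, 750),
]

def dose_to_reach_8_percent(baseline_index):
    """Linear interpolation between the anchor points quoted in the article."""
    if not ANCHORS[0][0] <= baseline_index <= ANCHORS[-1][0]:
        raise ValueError("baseline outside the 2-6% range quoted in the article")
    for (x0, y0), (x1, y1) in zip(ANCHORS, ANCHORS[1:]):
        if x0 <= baseline_index <= x1:
            frac = (baseline_index - x0) / (x1 - x0)
            return y0 + frac * (y1 - y0)

print(dose_to_reach_8_percent(4.0))   # 1500 mg/day, as quoted above
print(dose_to_reach_8_percent(5.0))   # ~1125 mg/day, interpolated
```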

Do Researchers Still Need the Omega-3 Index if They Have a Calculator?

The calculator presented in this paper does not eliminate the need for Omega-3 Index testing. In fact, establishing a baseline Omega-3 Index is essential to use the calculator.

"The recommended doses are simply average responses, but individual responses to EPA and DHA are still very difficult to predict," said Dr. Jackson. "In a recent consumer cohort, we found individuals spanned the full range of Omega-3 Index despite reporting the same amount of fish intake and supplement use."

This paper showed that if people want to reach 8% in a relatively short amount of time, such as three to four months, they would need 1-2 grams EPA and DHA per day, depending on their starting Omega-3 Index.

"As noted, the equation developed [in this paper] can aid in predicting population Omega-3 Index changes, but because of the large interindividual variability in the Omega-3 Index response to EPA and DHA supplementation, it will likely be less useful in the clinical setting where direct testing of the Omega-3 Index would be the preferred approach to assessing EPA and DHA status," the study authors explained.

Credit: 
Wright On Marketing & Communications

One-third of pre-approved prescription drugs have not completed the FDA approval process

CATONSVILLE, MD, August 26, 2019 - The Food and Drug Administration's (FDA) Accelerated Approval Program was created in 1992 to significantly accelerate the ability to bring certain new drugs to market. New research to be published in an upcoming issue of the INFORMS journal Manufacturing & Service Operations Management reveals that a large number of drug manufacturers are failing to complete the approval process, meaning a significant number of drugs on the market are not yet fully approved.

Part of the program's requirement is that manufacturers must complete post-market studies to prove the effectiveness of every drug that has been pre-approved under this initiative. Only then can those drugs be converted to full, regular approval.

But according to this newly published research analyzing publicly available data from 2014 to 2018, those studies are not being completed as promised. From 1992 to 2008, 36% of post-market studies had not been completed, and 50% of the uncompleted studies took an average of 5 years to even begin.

"Manufacturers apparently have little incentive to do the post-market studies because they are not easily enforced and they are expensive," said Liang (Leon) Xu, the study's lead researcher and a professor of supply chain and analytics at the University of Nebraska-Lincoln College of Business. "Withdrawing a drug from the market takes time and without proof of ineffectiveness cannot be enforced immediately."

FDA regulators face challenges in determining and enforcing deadlines because of asymmetric information and moral hazard. They must optimize the tradeoff between providing public access to potentially life-saving drugs and mitigating public health risks from ineffective drugs.

Xu along with his fellow authors, Hui Zhao and Nicholas Petruzzi, both at the Smeal College of Business at The Pennsylvania State University, take advantage of what's already in place to increase the possibility of implementing a solution.

"Currently, the FDA requires drug manufacturers to pay a fixed fee to fund its new drug application review. By replacing this fixed fee with a new one tied to a post-market study deadline, we can leverage an existing tool to make sure these studies get done," said Xu.

This fee would depend on the probability of a drug's success and the enforceability of an unproven drug being pulled from the market. Additionally, Xu, Zhao, and Petruzzi developed an alternative option. In that scenario, if the current fee cannot be altered into a deadline-dependent fee, the analysis can be modified to calculate a single deadline.

In summary, this new research provides important insights to regulators on granting pre-approval of a drug under the FDA's Accelerated Approval Program. If the enforceability of a given drug is expected to be low, then regulators may consider requiring a higher success probability before proceeding with conditional approval.

Credit: 
Institute for Operations Research and the Management Sciences

Obesity tied to weakened response to taste

BINGHAMTON, N.Y. - Obesity is connected with a reduced response to taste, according to a new study featuring faculty at Binghamton University, State University of New York.

Taste perception is known to change with obesity, but the underlying neural changes remain poorly understood.

"It's surprising that we know so little about how taste is affected by obesity, given that the taste of food is a big factor in determining what we choose to eat," said Binghamton University Professor of Psychology Patricia Di Lorenzo.

To address this issue, a team of researchers including Di Lorenzo and former graduate student Michael Weiss aimed to detail the effects of obesity on responses to taste stimuli in the nucleus tractus solitarius, a part of the brain involved with taste processing. The researchers recorded the responses to taste stimuli from single cells in the brainstem of rats that were made obese by eating a high-fat diet. They found that taste responses in these obese rats were smaller in magnitude, shorter in duration and took longer to develop, compared with those in lean rats.

These results suggest that a high-fat diet produces blunted, but more prevalent, responses to taste in the brain, and a weakened association of taste responses with ingestive behavior.

While Di Lorenzo stressed that these findings currently only apply to rats, she said that this same process could possibly translate to humans.

"Others have found that the number of taste buds on the tongue are diminished in obese mice and humans, so the likelihood that taste response in the human brain is also blunted is good," said Di Lorenzo.

She and her team are looking into the effects of gastric bypass surgery on brainstem responses to see if this procedure can recover some or all of the deficits in the taste system.

Credit: 
Binghamton University

Concussions linked to erectile dysfunction in former NFL players

Former professional football players who have experienced concussion symptoms, including loss of consciousness, disorientation or nausea after a head injury, are more likely to report low testosterone and erectile dysfunction (ED), according to research published Aug. 26 in JAMA Neurology.

The research--based on a survey of more than 3,400 former NFL players representing the largest study cohort of former professional football players to date--was conducted by investigators at the Harvard T.H. Chan School of Public Health and Harvard Medical School as part of the ongoing Football Players Health Study at Harvard University, a research program that encompasses a constellation of studies designed to evaluate various aspects of players' health across their lifespans.

The researchers caution that their findings are observational--based on self-reported concussion symptoms and indirect measures of ED and low testosterone.

The results do not prove a cause-effect link between concussion and ED, nor do they explain exactly how head trauma might precipitate the onset of ED, the investigators noted. However, the findings do reveal an intriguing and powerful link between history of concussions and hormonal and sexual dysfunction, regardless of player age. Notably, the ED risk persisted even when researchers accounted for other possible causes such as diabetes, heart disease or sleep apnea, for example. Taken together, these findings warrant further study to tease out the precise mechanism behind it.

One possible explanation, the research team said, could be injury to the brain's pituitary gland that sparks a cascade of hormonal changes culminating in diminished testosterone and ED. This biological mechanism has emerged as a plausible explanation in earlier studies that echo the current findings, such as reports of higher ED prevalence and neurohormonal dysfunction among people with head trauma and traumatic brain injury, including military veterans and civilians with head injuries.

The new findings also suggest that sleep apnea and use of prescription pain medication contribute to low testosterone and ED. It remains unclear whether they do so independently, as consequences of head injury, or both, the researchers said.

Sexual function is not only a critical marker of overall health but also central to overall well-being, the researchers note. Understanding the mechanisms behind the possible downstream effects of head injury, they said, can inform treatments and preventive strategies.

"Former players with ED may be relieved to know that concussions sustained during their NFL careers may be contributing to a condition that is both common and treatable," said study lead author Rachel Grashow, a researcher at the Harvard T.H. Chan School of Public Health.

The results are based on a survey of 3,409 former NFL players, average age 52 years (age range 24 to 89), conducted between 2015 and 2017. Participants were asked to report how often blows to the head or neck caused them to feel dizzy, nauseated or disoriented, or to experience headaches, loss of consciousness or vision disturbances--all markers of concussion. Responders were grouped in four categories by number of concussive symptoms.

Next, the former players were asked whether a clinician had recommended medication for either low testosterone or ED, and whether they were currently taking such medications.

Men who reported the highest number of concussion symptoms were two and a half times more likely to report receiving either a recommendation for medication or to be currently taking medication for low testosterone, compared to men who reported the fewest concussion symptoms. Men with the most concussion symptoms were nearly two times more likely to report receiving a recommendation to take ED medication or to be currently taking ED medication than those reporting the fewest symptoms. Players who reported losing consciousness following head injury had an elevated risk for ED even in the absence of other concussion-related symptoms.

Notably, even former players with relatively few concussion symptoms had an elevated risk for low testosterone, a finding that suggests there may be no safe threshold for head trauma, the team said.

Of all participants, 18 percent reported low testosterone and nearly 23 percent reported ED. Slightly less than 10 percent of participants reported both.

As expected, individuals with cardiovascular disease, diabetes, sleep apnea and depression, as well as those taking prescription pain medication--all of which are known to affect sexual health--were more likely to report low testosterone levels and ED. Yet, the link between concussion history and low testosterone levels and ED persisted even after researchers accounted for these other conditions.

The link between history of concussion and ED was present among both the older and the younger players--those under age 50 in this case--the analysis showed, and it persisted over time.

"We found the same association of concussions with ED among both younger and older men in the study, and we found the same risk of ED among men who had last played twenty years ago," said study senior author Andrea Roberts, a researcher at the Harvard T.H. Chan School of Public Health. "These findings suggest that increased risk of ED following head injury may occur at relatively young ages and may linger for decades thereafter."

Given that ED is both fairly common and easily treatable, those who experience symptoms are encouraged to report them to their physicians, the researchers said.

Importantly, prompt evaluation of ED is critical because it can signal the presence of other conditions, including heart disease and diabetes. The findings also suggest that it may be important for clinicians to assess all patients with concussion history for the presence of neurohormonal changes.

"ED is a fact of life for many men," said Herman Taylor, director of player engagement and education and director of the Cardiovascular Research Institute at Morehouse School of Medicine. "Anyone with symptoms should seek clinical attention and thorough evaluation, particularly since ED can be fueled by cardiovascular and metabolic disorders. The good news is that this is a treatable condition."

Credit: 
Harvard Medical School