Earth

Treat early or wait? Experts ponder best way to manage milder forms of spinal muscular atrophy

image: Sonographic findings of quadriceps in child diagnosed with SMA: (A) Age 2 months: normal echogenicity and clearly visible internal septa. (B) Age 14 months, 6 months after onset of disease: a pathologic increase in background echogenicity in comparison to subcutaneous fat tissue and less definable intramuscular septa. (C) Age 17 months, after eight months of nusinersen therapy: partial recovery with increase in volume and normalization of echogenicity in the rectus femoris.

Image: 
Wolfgang Müller-Felber, MD

Amsterdam, June 11, 2020 - The advent of therapeutic interventions for spinal muscular atrophy (SMA) has increased the importance of presymptomatic diagnosis and treatment. When to start treatment in children with less severe disease remains controversial. Now, in a report published in the Journal of Neuromuscular Diseases, German researchers argue for an earlier start of treatment to prevent permanent nerve damage, challenging recommendations originally proposed by a group of American experts, which suggest a strict follow-up strategy for children expected to have less severe disease.

"Recent advances in the treatment of spinal muscular atrophy argue in favor of newborn screening (NBS) for SMA. While a highly sensitive and specific method exists to detect the most important causative mutation of spinal muscular atrophy, a homozygous deletion of the SMN1 gene, prognosis of the clinical course of the disease remains difficult. The most important disease modifier is the copy number of the SMN2 gene," explained lead investigator Wolfgang Müller-Felber, MD, Dr. V. Hauner Children's Hospital, Department of Pediatric Neurology and Developmental Medicine, LMU - University of Munich, Munich, Germany.

SMA is an autosomal recessive disease characterized by the degeneration of spinal cord alpha motor neurons. It is the most frequent genetic cause of death in infants. The disease is caused by a lack of survival motor neuron (SMN) protein, which leads to the loss of spinal motor neurons. The severity of disease ranges from type 0, the most serious congenital form, through type 1, which is characterized by motor impairment beginning within the first six months, inability to sit, and death due to respiratory failure, to type 3, in which symptoms manifest after walking begins, although the ability to walk may be lost over time. Genetically, the disease is attributed to the presence of a homozygous deletion of the SMN1 gene, while the copy number of the highly related SMN2 gene is known to modify disease, with four or more SMN2 copies suggesting milder disease progression.

The researchers reported their experiences with children identified through an NBS program for SMA that was conducted in Germany from January 2018 to November 2019. After screening nearly 280,000 newborns, 38 children were found to have biallelic loss of SMN1, confirming a genetic diagnosis of SMA. Of those, 15 (40%) had four or more SMN2 copies, predicting a likely milder disease course. The children were treated according to the 2018 American SMA NBS Multidisciplinary Working Group recommendations, which include regular clinical, electrophysiological, and muscle ultrasound exams until the onset of SMA symptoms. According to the recommendations, treatment should begin only when symptoms appear.

"We found that inconsistent compliance, technical issues, and difficulties with the correct interpretation of clinical symptoms can make the use of the 2018 American SMA NBS Multidisciplinary Working Group recommendations problematic and thus may have a negative impact on the correct timing of treatment initiation," noted Prof. Dr. Müller-Felber. These recommendations have now been modified and are published in the same issue of the Journal of Neuromuscular Diseases.

One specific case in the German study concerned a child in whom the SMN2 copy number was initially incorrectly reported as high and treatment thus not initiated. The child then developed symptoms of SMA at eight months of age. After onset of symptoms, it took another five weeks until the insurance company approved treatment. During the delay, the child continued to deteriorate until he was unable to bear any weight on his legs. Once treatment began, the child partially recovered. "We assume early treatment prevents or limits damage to motor neurons, helps recovery, and prevents progression, thus arguing for the earliest possible treatment," commented Prof. Dr. Müller-Felber.

Another case in the German study revealed that two older affected siblings had gone undiagnosed until SMA was detected in their newborn sibling through screening.

"In view of the results of our study, an inflexible follow-up-only strategy seems to be too risky for most children, even when four copies of the SMN2 gene make a benign course of the disease probable. NBS programs increase the chances for early diagnosis as well as discovering missed diagnoses. An earlier start of treatment should be considered in all children diagnosed by NBS in order to prevent irreversible damage of motor neurons," stated Prof. Dr. Müller-Felber.

Credit: 
IOS Press

Putting 'super' in natural killer cells

image: Colorized scanning electron micrograph of a natural killer cell.

Image: 
NIAID

Using induced pluripotent stem cells (iPSCs) and deleting a key gene, researchers at University of California San Diego School of Medicine have created natural killer cells -- a type of immune cell -- with measurably stronger activity against a form of leukemia, both in vivo and in vitro.

The findings are published in the June 11, 2020 online issue of Cell Stem Cell.

Natural killer (NK) cells are lymphocytes in the same family as T and B cells, and are part of the innate immune system. They circulate throughout the body and are among the first to respond to the presence of foreign cells or invaders, most notably viruses and early signs of cancer.

As such, they hold great promise as the basis for anticancer therapies, able to identify and target malignant cells, but their efficacy has proven limited.

In the new study, a research team led by senior author Dan Kaufman, MD, PhD, professor of medicine in the Division of Regenerative Medicine, director of cell therapy at UC San Diego School of Medicine and a faculty member of both the Sanford Consortium for Regenerative Medicine and the Sanford Stem Cell Clinical Center at UC San Diego Health, advanced their potential in two ways.

First, they created NK cells from iPSCs, which are derived from skin or blood cells that have been reprogrammed back to an embryonic-like pluripotent state and then directed to become NK cells. This strategy produces a standardized cell population, rather than needing to isolate cells on a patient-specific basis.

Second, the researchers deleted a gene called CISH in the stem cell-derived NK cells. The CISH gene regulates expression of a protein that suppresses cytokine signaling. Cytokines are signaling molecules that summon other immune system cells, such as macrophages, lymphocytes and fibroblasts, to sites of infection, inflammation and trauma.

"Deletion of CISH in NK cells removes an internal 'checkpoint' that is normally activated or expressed when NK cells are stimulated by cytokines, such as IL15," said Kaufman. "We found that CISH-deleted iPSC-derived NK cells were able to effectively cure mice that harbor human leukemia cells, whereas mice treated with the unmodified NK cells died from the leukemia."

"These studies demonstrate that we can now edit iPSC-derived NK cells to remove an inhibitory gene inside the cell to improve activation of NK cells. We demonstrate that the CISH deletion improves NK cell function in at least two different ways. First, it removes a brake on IL15 signaling, which improves NK cell activation and function, even at low IL15 concentrations. Second, it leads to metabolic reprogramming of the NK cells. They become more efficient at energy utilization, which improves their function in vivo."

Kaufman said he and colleagues are now working to translate the findings into a clinical therapy.

"As iPSC-derived NK cells are now in clinical trials to treat both hematologic (blood) malignancies and solid tumors, we expect that CISH-deleted iPSC-NK cells can provide an even more effective treatment.

"Importantly, iPSCs provide a stable platform for gene modification and since NK cells can be used as allogeneic cells that do not need to be matched to individual patients, we can create a line of appropriately modified iPSC-derived NK cells suitable for treating hundreds or thousands of patients as a standardized, 'off-the-shelf' therapy."

Credit: 
University of California - San Diego

An ion channel senses cell swelling and helps cells to choose a response

image: Liz Haswell's lab at WashU provides insight into how plants sense and respond (including by programmed cell death) to mechanical signals, such as cell swelling, rather than chemical signals, such as nutrients or growth factors.

Image: 
(Washington University in St. Louis)

After a dry spell, a rainy day can feel rejuvenating. But for plants, a downpour can mean trouble. Faced with water suddenly rushing into its tissues, a plant must control its cells' volume or risk them exploding.

New research from Washington University in St. Louis offers clues about how mechanosensitive ion channels in the plant's cells respond to swelling by inducing cell death - potentially to protect the rest of the plant.

"The plant's response to cell swelling has been studied for a long time and a lot is known about the signaling events. However, the sensor that detects cell swelling in the first place was not known," said Liz Haswell, professor of biology in Arts & Sciences.

The discovery -- reported by Haswell and Debarati Basu, postdoctoral research scholar in the Haswell lab, in the June 11 issue of Current Biology -- provides insight into how plants sense and respond to mechanical signals, such as cell swelling, rather than chemical signals, such as nutrients or growth factors.

Plant cells are armed with a strong yet flexible outer cell wall that holds back the force of water pushing out from inside the cell. Lacking a skeleton, plants only have the force of water and cellulose to keep them upright. Without that force, they go limp. But as the pressure pushing out - turgor pressure - becomes too great, the cell swells and an imbalance occurs.

It has been documented in plants that cell swelling leads to a release of calcium into the cell cytoplasm and a buildup of reactive oxygen species, unstable molecules containing oxygen that can lead to cell death. As the cell responds to the swelling, specific genes get turned on or off.

But the player that senses cell swelling has been missing.

Sandwiched between the outer cell wall and the internal contents of the cell is the plasma membrane. Embedded in the plasma membrane are mechanosensitive ion channels, or tunnels, that release ions in response to membrane stretch. Mechanosensitive ion channel 10 (MSL10) is one member of the family of mechanosensitive ion channels that is a focus of the Haswell lab.

Basu applied a chemical that would cause the cell wall to lose its strength and become soft. At the same time, she could increase the turgor pressure inside the cell and study the role of MSL10 in the initial steps involved in the cell swelling response.

Plant cells carrying a mutation that made MSL10 overly active responded to cell swelling similarly to wildtype plants - calcium was released, reactive oxygen species were made, and gene expression changed - but the response was more pronounced. When the plant cells lacked MSL10, the response was missing altogether.

Basu and Haswell discovered that MSL10 is not only an ion transporter but a primary responder to cell swelling.

"MSL10 is an ion channel, so it's tempting to think that it itself is transporting calcium. That may not be true," explained Basu. "Our results raise the possibility that MSL10 senses the cell swelling and activates a different channel that then transports the calcium."

As the cell swelled, the cell wall failed to maintain the force of the turgor pressure. But it did not explode. Instead, the cell died. But only plants with functional MSL10 died. In plants lacking MSL10, death was avoided.

"This might seem counterintuitive," Haswell said. "Why is MSL10 required for cells to die? You'd expect it to save cells' lives during swelling, not the other way around. The key is that cells weren't dying a normal kind of death; they were undergoing programmed cell death."

Basu found that MSL10 activates programmed cell death - a regulatory mechanism that originates from inside of the cell. Cell damage itself did not cause death; MSL10 triggered a program of cell suicide.

Why the plant triggers cell suicide in response to cell swelling is still a mystery. But Basu and Haswell have some intriguing hypotheses.

"The plasma membrane has probably been damaged. So maybe the plant wants to recoup some of that material and incorporate it back into the plant through this regulated process," Basu offered.

Or perhaps these damaged cells are more susceptible to infection, and the plant commits cell suicide as a way to save the plant at the sacrifice of a few cells.

"We already know that when a pathogen infects a plant, the plant will kill off a bunch of the cells that are infected to prevent the spread of the infection," Haswell said.  "This idea of cell suicide in response to mechanical stimuli is intriguing."

Credit: 
Washington University in St. Louis

The brain uses minimum effort to look for key information in text

image: Volunteers' brain activity was recorded using electroencephalography (EEG). In the EEG, a selective electric brain potential was observed in response to reading high- versus low-value words.

Image: 
Cognitive computing research group / University of Helsinki

By analysing brain activity, researchers found that the brain regulates its resource use and tries to identify the most essential information.

A recently completed study indicates that the human brain avoids unnecessary effort. When a person is reading, they strive to gain as much information as possible while dedicating as little of their cognitive capacity as possible to the processing.

This finding is presented in an article by specialists in computer science and psychology at the University of Helsinki, published in May in Scientific Reports, a multidisciplinary open-access journal operated by the publishers of Nature.

According to the study, the brain processes information by taking into account the relative importance of the content that is being read. When the brain interprets the meaning of the words being read, it attempts to allocate resources to the words that provide the most information about the content of the text.

Previous studies have shown that word length and frequency, as well as syntactic and semantic errors in sentences, affect the brain's responses to language.

In the recently published study, the perspective was expanded above the level of individual sentences to the discourse level, studied using six-sentence paragraphs. At this level, the relationships between words become increasingly complex, and the significance of context in interpreting individual words grows. Until now, little has been known about how the brain processes information at the discourse level.

Difference between high and low value of information

The researchers developed a model based on information theory to determine the informativeness of words and associated these with brain activity. A study was conducted by having volunteers read sentences from Wikipedia entries while recording their brain activity using electroencephalography (EEG). In the EEG, a selective electric brain potential was observed in response to reading high- versus low-value words.

"When someone reads the sentence 'Cats are small, usually furry mammals', words such as 'mammal' and 'furry' evoke a particular pattern of brain activity. This suggests that the brain is efficiently processing information: concentrating its efforts where the most additional value in understanding the message is to be gained," says Michiel Spapé, a senior researcher who contributed to the study.
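The informativeness measure described above can be illustrated with a minimal sketch. This is not the authors' actual model, which was built on information theory over Wikipedia text and paired with EEG recordings; it simply estimates a word's information content as its surprisal, the negative log probability of the word under a toy unigram frequency model:

```python
import math
from collections import Counter

def word_surprisal(corpus_tokens):
    """Estimate each word's information content (surprisal, in bits)
    as -log2 of its relative frequency in a reference corpus."""
    counts = Counter(corpus_tokens)
    total = sum(counts.values())
    return {w: -math.log2(c / total) for w, c in counts.items()}

# Toy reference corpus: frequent function words carry little
# information; rarer content words carry more.
corpus = ("the cat sat on the mat the cat ran "
          "mammals are usually furry animals").split()
surprisal = word_surprisal(corpus)

# 'the' occurs 3 times in 14 tokens; 'furry' only once,
# so 'furry' has the higher surprisal.
assert surprisal["furry"] > surprisal["the"]
```

On this view, the EEG differences between high- and low-value words would track a quantity like this surprisal score, computed per word over the text being read.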

A related finding revealed that AI-based techniques applied to brain measurements for individual words can predict whether the information gain of the words read is low or high.

"Consequently, we are able to predict the information gain of content processed by people without accessing the content itself. Instead, we only utilise brain measurements," says Tuukka Ruotsalo, Academy research fellow in charge of the study at the University of Helsinki.

The results can be utilised in future brain-information interfaces, which observe brain function when people perceive and process various types of information.

"Such applications could be used, for example, in healthcare, or, in the future, even in modelling the tastes, values and opinions of ordinary consumers," Ruotsalo says.

Ruotsalo points out that the research is only at its basic stage.

"Practical applications are associated with ethical and technical challenges that must be solved before anything concrete can be developed."

Credit: 
University of Helsinki

Substandard hand sanitizers readily available on market, confirm pharmacists

image: Hand sanitisers amid CoViD-19: A critical review of alcohol-based products on the market and formulation approaches to respond to increasing demand by pharmacists at the University of Huddersfield.

Image: 
University of Huddersfield

AN INTERNATIONAL team of pharmacy experts has researched the effectiveness of hand sanitisers in the fight against CoViD-19 and warned the public to beware of sub-standard products. They have also provided detailed "recipes" for the manufacture of effective hand sanitising gels and explained the science behind them.

There is a real risk, they write in a new article, that consumers are obtaining and using hand cleaners with low or inadequate concentrations of alcohol. These might appear similar to hand disinfectants, but purchasers are often unaware that such products cannot ensure disinfection and are not fit for use amid the pandemic.

Awareness campaigns

The authors - including the UK's Dr Hamid Merchant, who is Subject Leader in Pharmacy at the University of Huddersfield - set out ways to minimise the risks.

They discourage the public from buying hand sanitiser from unknown or unreliable e-commerce sites.

They also state that pharmacists and retailers should advise customers on selecting appropriate products for CoViD-19 infection control, and call for awareness campaigns to teach the public how to differentiate between products fit only for general hygiene and cleansing and those fit for coronavirus infection control.

The experts also urge regulatory bodies to revisit their current rules on hand sanitisers.

The new article - a collaboration between eight pharmacists based at universities in the UK, Italy and Jordan - appears in the International Journal of Pharmaceutics. Freely available online, it is titled Hand sanitisers amid CoViD-19: A critical review of alcohol-based products on the market and formulation approaches to respond to increasing demand.

The authors chart the massive spike in demand for hand sanitisers around the world, as purchasers stocked up their "pandemic pantries". This led to stocks rapidly vanishing from the shelves, with even hospitals and other healthcare facilities running out.

The researchers also believe that current awareness of the importance of hand disinfection means it will remain an integral part of people's hygiene routine, even post-CoViD-19.

They investigate the scientific basis for hand cleansing and analyse when washing with soap and water - which can remove virtually all types of pathogens - is preferable to using alcohol-based hand rubs (ABHRs), which are less effective when hands are extremely greasy or dirty.

"However, handwashing facilities are not readily available at work or public places. Moreover, in instances where hand sanitisation is needed more frequently, such as during frequent contact with individuals or products, the ABHRs are the most effective and convenient infection preventive measure," states the article.

But the authors add that it is important to emphasise that ABHRs only work when used correctly.

"Considering that not all ABHR formulations are the same, appropriate labelling is important to clearly state the alcohol concentration and instructions to direct the correct dose/amount needed to achieve an adequate sanitisation. The choice of container, closure and dispenser is also vital in dispensing the correct amount of the sanitiser on each use."

Substandard products

Substandard products are available in certain markets. Where a product falls short only in its formulation quality, the risk for consumers is "mainly a reduced perception of product quality and attractiveness, and reduced ease of use; while overall product efficacy is maintained". However, many products on the market do not seem to comply with the alcohol types and concentrations recommended by the World Health Organisation.

But much more worrying is the market presence of "hand cleaners containing substandard and/or unknown concentrations of alcohol that are not meant to be sold or used as disinfectants".

The article includes detailed scientific data on the types and proportions of alcohol used in ABHRs and the added substances that are used to combat excessive skin dryness and to increase the viscosity of gels, because purely liquid formulations can be far less effective due to rapid evaporation of alcohol. Dr Merchant suggested that a standardised pharmacopoeial monograph with tightly-controlled specification may be a way forward.
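As an illustration of the concentration arithmetic involved (an assumption-based sketch, not a formulation taken from the paper): the WHO-recommended hand-rub formulations target roughly 80% v/v ethanol, and the volume of alcohol stock required follows from the dilution relation C1·V1 = C2·V2, ignoring the slight volume contraction when ethanol and water mix.

```python
def dilution_volume(stock_pct, target_pct, final_volume_ml):
    """Volume of alcohol stock (ml) needed so that the final mixture
    reaches the target v/v concentration, via C1*V1 = C2*V2."""
    if target_pct > stock_pct:
        raise ValueError("cannot concentrate by dilution")
    return target_pct * final_volume_ml / stock_pct

# Diluting 96% v/v ethanol stock to 80% v/v for 1 L of hand rub:
stock_needed = dilution_volume(96, 80, 1000)
remainder = 1000 - stock_needed  # glycerol, hydrogen peroxide, water
assert round(stock_needed, 1) == 833.3
```

This kind of back-of-envelope check also shows why labelling matters: a product made from an under-strength or unknown stock cannot hit the target concentration.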

There are also charts and descriptions that provide detailed instructions for the production of effective alcohol-based sanitisers. This will aid pharmacists, as well as manufacturers in fields such as brewing and perfumery that have switched to making ABHRs in response to the surge in demand.

Most of the analytical research for the article was carried out in Italy by Professor Marco Cespi's team. The University of Huddersfield's Dr Hamid Merchant was invited to help with the project by one of the article's authors, the Italian scientist Dr Alberto Berardi, now based at the Faculty of Pharmacy of the Applied Science Private University in Amman, Jordan, with which the University of Huddersfield's Department of Pharmacy has recently formed a special collaboration.

Credit: 
University of Huddersfield

A post-pandemic world: will populations be on the move? Study shows contagions could be catalysts for mass migration

video: Pandemics and civil unrest often lead to mass migration, both temporarily and permanently.

Image: 
Sophia Ioannidis, University of Sydney

Will populations stay put in a post-pandemic world? Not according to new epidemic research published in Scientific Reports by a group of University of Sydney pandemic modelers led by Centre for Complex Systems Director, Professor Mikhail Prokopenko.

Disease outbreaks, civil unrest and war often bring about the biggest movements of people. The end of the Second World War saw the largest movement of people in Europe's history, with millions settling in Australia in the decades following 1945.

In 2015-16, the Syrian conflict displaced over four million people who dispersed across the world seeking safety, while the Ebola crisis similarly saw both temporary and permanent relocation.

"While many countries' borders are now closed, making migration virtually impossible, a post-pandemic world might look very different," said Faculty of Engineering academic, Professor Mikhail Prokopenko, who recently contributed to the G08 Covid-19 Federal Advisory report, Roadmap to Recovery.

"Epidemics are examples of wider contagion phenomena which also include social segregation, 'infodemics' (waves of misinformation), and social unrest," said Professor Prokopenko.

"Our theoretical modelling suggested that, when faced with either threat or opportunity, people tend to avoid risks, seek an advantage, or both. One can stretch these scenarios and imagine how attractive a destination Australia may appear if the local transmission of COVID-19 is eliminated in our country," he said.

While governments around the world have called for restrictions on migration during the post-pandemic recovery phase, people who have been affected by economic collapse or worsened health conditions may consider short-term or even long-term relocation to safer regions.

"We showed that large-scale collective behaviors, such as migration, can result from very small changes in human decision-making", said the study's lead author and Centre for Complex Systems PhD student, Nathan Harding.

"In other words, even if individuals re-assess their risks only slightly, their combined actions can bring a tipping point in terms of population resettlement," said Mr Harding.

"While Ebola outbreaks affected only relatively small areas of the world, the COVID-19 pandemic has affected almost every country and continues to spread. Therefore we can expect far-reaching impacts that may boost global and regional migration," said Professor Prokopenko.

HOW THE MODELLING WORKED

The model traces a "contagion" spread in an abstract geographical region, where agents representing people make choices to stay or move around. The method looks at how changes in individual preferences affect the behaviour of a large population. The model is theoretical and needs to be calibrated and validated with real-world data in order to evaluate specific diseases and scenarios.
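A toy stay-or-move simulation can make the tipping-point idea concrete. This is a hypothetical sketch, not the authors' model: agents relocate away from regions with high infection prevalence with a probability scaled by an individual risk sensitivity, and a small change in that sensitivity shifts the aggregate fraction of the population that resettles.

```python
import random

def simulate(n_agents=500, n_regions=10, steps=50,
             risk_sensitivity=0.5, seed=0):
    """Toy stay-or-move model: each agent may relocate away from its
    region with probability proportional to the local infection
    prevalence, scaled by an individual risk sensitivity. Returns the
    fraction of agents that relocated at least once."""
    rng = random.Random(seed)
    region = [rng.randrange(n_regions) for _ in range(n_agents)]
    # Fixed per-region prevalence for illustration: region 0 is safest.
    prevalence = [r / (n_regions - 1) for r in range(n_regions)]
    moved = [False] * n_agents
    for _ in range(steps):
        for i in range(n_agents):
            if rng.random() < risk_sensitivity * prevalence[region[i]]:
                # Relocate to a strictly lower-prevalence region.
                region[i] = rng.randrange(region[i])
                moved[i] = True
    return sum(moved) / n_agents

# A modest increase in individual risk sensitivity produces a clearly
# larger aggregate resettlement fraction.
low = simulate(risk_sensitivity=0.1)
high = simulate(risk_sensitivity=0.3)
assert high > low
```

As in the study's framing, the collective outcome (how much of the population moves) responds disproportionately to small shifts in individual decision-making; a calibrated model would replace the fixed prevalence and movement rule with epidemic dynamics and real-world data.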

Credit: 
University of Sydney

A breakthrough in developing multi-watt terahertz lasers

image: A phase-locking scheme for plasmonic lasers is developed in which traveling surface-waves longitudinally couple several metallic microcavities in a surface-emitting laser array. Multi-watt emission is demonstrated for single-mode terahertz lasers in which more photons are radiated from the laser array than those absorbed within the array as optical losses.

Image: 
Yuan Jin, Lehigh University

Terahertz lasers could soon have their moment. Emitting radiation that sits between microwaves and infrared light along the electromagnetic spectrum, terahertz lasers have been the focus of intense study due to their ability to penetrate common packaging materials such as plastics, fabrics, and cardboard; to identify and detect various chemical and biomolecular species; and even to image some types of biological tissue without causing damage. Fulfilling terahertz lasers' potential hinges on improving their intensity and brightness, achieved by enhancing power output and beam quality.

Sushil Kumar, associate professor in Lehigh University's Department of Electrical and Computer Engineering, and his research team are working at the forefront of terahertz semiconductor 'quantum-cascade' laser (QCL) technology. In 2018, Kumar, who is also affiliated with Lehigh's Center for Photonics and Nanoelectronics (CPN), reported on a simple yet effective technique to enhance the power output of single-mode lasers based on a new type of "distributed-feedback" mechanism. The results were published in the journal Nature Communications and received a lot of attention as a major advance in terahertz QCL technology. The work was performed by graduate students, including Yuan Jin, supervised by Kumar and in collaboration with Sandia National Laboratories.

Now, Kumar, Jin and John L. Reno of Sandia are reporting another terahertz technology breakthrough: they have developed a new phase-locking technique for plasmonic lasers and, through its use, achieved a record-high power output for terahertz lasers. Their laser produced the highest radiative efficiency for any single-wavelength semiconductor quantum cascade laser. These results are explained in a paper, "Phase-locked terahertz plasmonic laser array with 2 W output power in a single spectral mode" published yesterday in Optica.

"To the best of our knowledge, the radiative efficiency of our terahertz lasers is the highest demonstrated for any single-wavelength QCL to-date and is the first report of a radiative efficiency of greater than 50% achieved in such QCLs," said Kumar. "Such a high radiative efficiency beat our expectations, and it is also one of the reasons why the output power from our laser is significantly greater than what has been achieved previously."

To enhance the optical power output and beam quality of semiconductor lasers, scientists often utilize phase-locking, an electromagnetic control system that forces an array of optical cavities to emit radiation in lock step. Terahertz QCLs, which utilize optical cavities with metal coatings (claddings) for light confinement, are a class of lasers known as plasmonic lasers that are notorious for their poor radiative properties. There are only a limited number of techniques available in prior literature, they say, that could be utilized to improve radiative efficiency and output power of such plasmonic lasers by significant margins.

"Our paper describes a new phase-locking scheme for plasmonic lasers that is distinctly different from prior research on phase-locked lasers in the vast literature on semiconductor lasers," says Jin. "The demonstrated method makes use of traveling surface waves of electromagnetic radiation as a tool for phase-locking of plasmonic optical cavities. The efficacy of the method is demonstrated by achieving record-high output power for terahertz lasers that has been increased by an order of magnitude compared to prior work."

The use of traveling surface waves that propagate along the metal layer of the cavities - outside, in the surrounding medium, rather than inside - is a unique method developed in Kumar's group in recent years, and one that continues to open new avenues for further innovation. The team expects that the output power level of their lasers could lead to collaborations between laser researchers and application scientists toward development of terahertz spectroscopy and sensing platforms based on these lasers.

This innovation in QCL technology is the result of a long-term research effort by Kumar's lab at Lehigh. Kumar and Jin jointly developed the final implemented design through experimentation over a period of approximately two years. The collaboration with Dr. Reno of Sandia National Laboratories provided the semiconductor material that forms the quantum cascade optical medium for these lasers.

The primary innovation in this work, according to the researchers, is in the design of the optical cavities, which is somewhat independent from the properties of the semiconductor material. The newly acquired inductively-coupled plasma (ICP) etching tool at Lehigh's CPN played a critical role in pushing the performance boundaries of these lasers, they say.

This research represents a paradigm shift in how such single-wavelength terahertz lasers with narrow beams will be developed going forward, says Kumar, adding: "I think the future of terahertz lasers is looking very bright."

Credit: 
Lehigh University

Twisted microfiber networks' response to water vapor

image: Optical microscopy image of a single fiber of self-assembled polysaccharide in snaking, twisted, and straight structures.

Image: 
JAIST

Researchers at the Japan Advanced Institute of Science and Technology (JAIST) - graduate student Kulisara Budpud, Assoc. Prof. Kosuke Okeyoshi, Dr. Maiko Okajima, and Prof. Tatsuo Kaneko - reveal a unique polysaccharide fiber that forms a twisted structure during drying and shows spring-like behavior. This spring-like behavior of the twisted structures is put to practical use as a reinforcing structure in a vapor-sensitive film with millisecond-scale response times. The work is published in Small as a Full Paper titled "Vapor-Sensitive Materials from Polysaccharide Fibers with Self-Assembling Twisted Microstructures".

Polysaccharides play a variety of roles in nature, including molecular recognition and water retention. Even so, in vitro studies of the microscale structures of polysaccharides are scarce because of the difficulty of regulating their self-assembly. If the self-assembled structures of these natural polysaccharides can be reconstructed in vitro, it will lead not only to an increased understanding of the morphological changes involved in polysaccharide self-assembly in water but also to the development of a new class of bio-inspired materials exhibiting regulated structures on the nanometer scale. In this research, it is demonstrated that a cyanobacterial polysaccharide named sacran can hierarchically self-assemble into twisted fibers from the nanoscale to the microscale, with diameters of approximately 1 μm and lengths >800 μm, remarkably larger than polysaccharide assemblies previously reported. Unlike other rigid fibrillar polysaccharides such as cellulose, the sacran fiber can flexibly transform into two-dimensional snaking and three-dimensional twisted structures at an evaporative air-water interface (Fig. 1). This twisted sacran fiber behaves like a mechanical spring in a humid environment.

The conditions under which the twisted structure forms were optimized by controlling the drying speed; the drying speed and the capillary force are the dominant factors in creating these formations. To show the potential use of these spring-like polysaccharide fibers, a crosslinked polysaccharide film was prepared as a vapor-sensitive material, and the effects of the microfibers' spring behavior in an environment with a humidity gradient were demonstrated (Fig. 2). The film reversibly and quickly switched between flat and bent states within 300-800 ms. This repulsive motion of the film is caused by the snaking and twisted structures of the fibers responding to changes in moisture. The sacran film responds rapidly as a water drop retreats, changing from the bent state to the flat state. Because the extended sacran fibers store extension stress like a spring, the network can quickly release water by shrinking, and the bent film immediately becomes flat again. Thus, the snaking and twisted fiber network enables millisecond bending and stretching responses to changes in local humidity.

With this simple method, the JAIST researchers created a unique micro-spring from a natural polysaccharide that can be used practically as a vapor-sensitive material. Moreover, by introducing functional molecules into the microfiber, it should be possible to prepare a variety of soft actuators that respond to other changes in the external environment, such as light, pH, and temperature. The method for preparing vapor sensors developed in this study not only improves understanding of how the motion of self-assembled structures responds to stimuli but also contributes to the design of environmentally adaptive materials with high potential for sustainable use.

Credit: 
Japan Advanced Institute of Science and Technology

Lightning in a (nano)bottle: new supercapacitor opens door to better wearable electronics

image: This is an outline of the new supercapacitor.

Image: 
Pavel Odinev / Skoltech

Researchers from Skoltech, Aalto University and Massachusetts Institute of Technology have designed a high-performance, low-cost, environmentally friendly, and stretchable supercapacitor that can potentially be used in wearable electronics. The paper was published in the Journal of Energy Storage.

Supercapacitors, with their high power density, fast charge-discharge rates, long cycle life, and cost-effectiveness, are a promising power source for everything from mobile and wearable electronics to electric vehicles. However, combining high energy density, safety, and eco-friendliness in one supercapacitor suitable for small devices has been rather challenging.

"Usually, organic solvents are used to increase the energy density. These are hazardous, not environmentally friendly, and they reduce the power density compared to aqueous electrolytes with higher conductivity," says Professor Tanja Kallio from Aalto University, a co-author of the paper.

The researchers proposed a new design for a "green" and simple-to-fabricate supercapacitor. It consists of a solid-state material based on nitrogen-doped graphene flake electrodes distributed in a NaCl-containing hydrogel electrolyte. This structure is sandwiched between two single-walled carbon nanotube film current collectors, which provide stretchability. The hydrogel in the supercapacitor design enables compact packing and high energy density and allows the use of an environmentally friendly electrolyte.

The scientists improved the volumetric capacitance, energy density, and power density of the prototype over analogous supercapacitors described in previous research. "We fabricated a prototype with unchanged performance under 50% strain after a thousand stretching cycles. To ensure lower cost and better environmental performance, we used a NaCl-based electrolyte. The fabrication cost could be lowered further by implementing 3D printing or other advanced fabrication techniques," concluded Skoltech professor Albert Nasibulin.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Could we run out of sand? Scientists adjust how grains are measured

image: Associate Professor Ana Vila-Concejo, School of Geosciences, The University of Sydney in the field at the Great Barrier Reef.

Image: 
University of Sydney

Humans see sand as an infinite resource. We are astounded to discover there are more stars in the universe than grains of sand on our beaches.

Yet in some areas, sand is in short supply and scientists have discovered the way we keep track of this resource has given us misleading information.

In many instances, we have simply been measuring sand the wrong way.

"Not all sand is the same," said Associate Professor Ana Vila-Concejo from the University of Sydney School of Geosciences. "Yet the models for assessing sand and how it moves mostly rely on one type. This means we have an inaccurate picture of what is happening, especially in coastal areas that are vulnerable to climate change."

Dr Amin Riazi from Eastern Mediterranean University worked with Associate Professor Vila-Concejo during a short stay at the University of Sydney to develop new engineering models that account for the different shapes of sand grains. Standard models assume sand grains are spherical, which is fine for common sands made up of ground-down silica and quartz rocks.

However, carbonate sands derived from shells, corals and the skeletons of marine animals tend to be elliptical, less dense and have more holes and edges. The new research has taken this into account with astounding results, finding that existing models underestimate the surface area of carbonate sands by 35 percent.

Published today in the Nature journal Scientific Reports, Associate Professor Vila-Concejo's team has shown that standard engineering models also overestimate transport of carbonate sands on the seafloor by more than 20 percent and underestimate suspended transport of this sand by at least 10 percent.
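
The effect of grain shape on surface area can be illustrated with a back-of-the-envelope sketch (this is a simplification, not the authors' published equations): compare an ellipsoidal grain with a sphere of equal volume, using Knud Thomsen's approximation for ellipsoid surface area. The axis ratios below are hypothetical.

```python
import math

def sphere_area(r):
    """Surface area of a sphere of radius r."""
    return 4.0 * math.pi * r ** 2

def ellipsoid_area(a, b, c, p=1.6075):
    """Knud Thomsen's approximation to the surface area of an
    ellipsoid with semi-axes a, b, c (relative error ~1%)."""
    t = (a**p * b**p + a**p * c**p + b**p * c**p) / 3.0
    return 4.0 * math.pi * t ** (1.0 / p)

def equal_volume_sphere_radius(a, b, c):
    """Radius of the sphere with the same volume as the ellipsoid."""
    return (a * b * c) ** (1.0 / 3.0)

# A hypothetical flattened, elongated grain (semi-axes, arbitrary units):
a, b, c = 1.0, 0.6, 0.3
r = equal_volume_sphere_radius(a, b, c)
extra = ellipsoid_area(a, b, c) / sphere_area(r) - 1.0
print(f"Ellipsoid has {extra:.1%} more surface area than an equal-volume sphere")
```

A model that treats every grain as a sphere therefore systematically underestimates surface area for flattened or elongated grains, in the same direction as the 35 percent underestimate reported for carbonate sands.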

"This means we are not accounting for sand correctly," she said. "While this has impact on construction and manufacturing, it could also have a big effect on the management of coastal areas impacted by climate change."

Sand is used throughout industry. From the glass in your mobile phone to the base of roads, sand is used across our economy. In fact, sand and gravel are the most extracted materials on the planet, exceeding even fossil fuels.

Nature last year reported that illegal sand mining is happening in about 70 countries and hundreds of people have been killed in battles over sand in the past decade.

Associate Professor Vila-Concejo said: "While sand wars are not happening in Australia, we do have areas with chronic coastal erosion and sand loss such as at Jimmys Beach in Port Stephens."

NEW MATHEMATICAL MODELS

Her team took carbonate sand from near Heron Island on the Great Barrier Reef and observed how it responded under experimental conditions. Based on these observations, they developed new mathematical equations that much better predict how carbonate sands move.

The team confirmed this by applying their equations to existing data on carbonate sand movement accumulated over six years from observations off the north coast of Oahu, Hawaii.

"Keeping track of carbonate sand will become increasingly important," said Dr Tristan Salles, also from the School of Geosciences in the Faculty of Science.

"If islands and atolls are at risk from erosion caused by sea-level rise, it will be vital to understand how the sands protecting them will respond to the ocean currents, waves and high-energy sea swells battering them."

He said these new equations are likely to be used to update all sediment transport models. "This will include evaluating beach and atoll responses to ocean hydrodynamics in carbonate-sand-rich regions, some of which are most vulnerable to the impacts of climate change," Dr Salles said.

At present, coastal engineering uses models based on siliciclastic sands. Associate Professor Vila-Concejo hopes that the models her team has developed can be used to improve management of coastal areas.

"This means we can develop a far more accurate picture of how changing oceans will affect marine ecosystems where carbonate sands are dominant," Associate Professor Vila-Concejo said.

"Understanding how, why and when sediments move is crucial to managing and predicting the effects of climate change and our new work will help in the development of mitigation and adaptation strategies."

Credit: 
University of Sydney

A compound unlike any other

image: This is Roberta O'Connor.

Image: 
WSU

A compound discovered in the gills of wood-eating clams could be the solution to a group of parasites responsible for some of the world's most common infections.

That compound is tartrolon E, a byproduct of bacteria that help shipworms, a group of saltwater clams, digest the wood they eat.

According to research recently published in PLOS Pathogens, the compound, unlike any other, has been shown to kill the parasites that cause malaria, toxoplasmosis, cryptosporidiosis, theileriosis and babesiosis.

"There are compounds that work against the individual parasites, but to find one that works against this entire group, that is what made this unique," said Roberta O'Connor, an associate professor in Washington State University's Veterinary Microbiology and Pathology unit, and first author on the paper.

While there are already effective drugs for many of the parasites mentioned here, O'Connor said this group of parasites, called apicomplexans, readily develops drug resistance.

"Development of new, effective drugs against apicomplexan parasites is an ongoing need for human and veterinary medicine," she said.

One of those parasites in need of a more effective remedy is Cryptosporidium.

Cryptosporidium, a waterborne zoonotic parasite, is a major cause of diarrhea in children, immunocompromised patients, and in newborn animals worldwide. The parasite infects millions of humans and agricultural animals annually.

In addition to killing this class of parasites in vitro, tartrolon E was able to kill Cryptosporidium in newborn mice.

Beginning this summer, WSU researchers will test the compound against Cryptosporidium in lambs.

Currently, nitazoxanide is the only drug approved by the Food and Drug Administration to treat cryptosporidiosis.

"Nitazoxanide doesn't work well for those [patients] who are immunocompromised or malnourished and those are the people most vulnerable to Cryptosporidium," O'Connor said.

O'Connor is the principal investigator on the study, which will characterize the specific effects of tartrolon E on Cryptosporidium parasites. Villarino will lead the pharmacokinetics portion of the study in immunocompromised mice to further assess tartrolon E's effectiveness and optimal dose regimens.

The research is made possible by a recently awarded 5-year, $1.6 million grant from the National Institutes of Health.

"We will define how the drug behaves in the body and how much of the drug is needed to control Cryptosporidiuminfection," Villarino said. "We want the maximum effect with minimal adverse effects."

This aspect of the research on the compound is a key component for drug development.

"This could have a significant impact on human and veterinary medicine because there is no other drug that can effectively treat this condition," Villarino said.

O'Connor and Villarino are hopeful tartrolon E will lead to a clinically developed drug but they know it is a long way to get there.

"Tartrolon E is obviously hitting some system that is common to [all] these parasites," O'Connor said. "Even if this compound isn't successful, if we can determine the mechanism, we will have identified a common drug target for all these parasites."

Credit: 
Washington State University

Coal-tar-sealant major source of PAH contamination in Great Lakes tributaries

Runoff from pavement with coal-tar-based sealant is the most likely primary source of polycyclic aromatic hydrocarbons, or PAHs, found in the majority of streambed sediments of Great Lakes tributaries, according to a study published in Environmental Toxicology and Chemistry. PAHs are a group of chemicals found in crude oil and coal and produced as byproducts of burning. Under certain conditions, PAHs can have harmful effects on organisms in the environment, so it is important to understand their sources, distribution and magnitude in the Great Lakes Basin.

Scientists with the US Geological Survey, the US Environmental Protection Agency, and the University of Helsinki collected sediment samples from 71 streambed sites throughout the US portion of the Great Lakes Basin. The sites were selected to represent watersheds with a range of land uses, from 0.7 to 100% urban. The authors employed multiple lines of evidence to identify the most likely source of PAHs to sediment. They found that, based upon relative concentrations of the particular PAHs found in the sediments, dust from coal-tar-sealant was most likely the dominant PAH source for 57 of the sites, with the remainder of PAHs coming from sources such as vehicle emissions.

Pavement sealant is a black, shiny liquid sprayed or painted on asphalt parking lots, driveways and playgrounds to improve appearance and protect the underlying asphalt. Coal-tar sealants have significantly higher levels of PAHs and related compounds compared with alternatives such as asphalt-based sealants and contribute more PAHs to the watershed than other urban sources, including vehicle emissions, used motor oil and particles abraded from tires of moving vehicles. PAHs from coal-tar sealants are transported to streams through stormwater runoff.

Concentrations of PAHs in 62% of the samples exceeded screening criteria for aquatic life. The research indicates the need for further analysis to determine the potential effects of PAHs on aquatic life in that area in order to improve habitat for aquatic organisms, and underscores the need to understand the source of the PAHs.

Credit: 
Society of Environmental Toxicology and Chemistry

Remixed mantle suggests early start of plate tectonics

image: New Curtin University research on the remixing of Earth's stratified deep interior suggests that global plate tectonic processes, which played a pivotal role in the existence of life on Earth, started to operate at least 3.2 billion years ago.

Image: 
Professor Zheng-Xiang Li

New Curtin University research on the remixing of Earth's stratified deep interior suggests that global plate tectonic processes, which played a pivotal role in the existence of life on Earth, started to operate at least 3.2 billion years ago.

Published in Nature's Scientific Reports, researchers from Curtin University's Earth Dynamics Research Group re-analysed global data to detect sudden changes in the chemical characteristics of basalt and komatiite lava rocks, believed to have been derived from Earth's upper and lower mantle layers and erupted to the surface between two and four billion years ago.

Lead researcher PhD Candidate Mr Hamed Gamal El Dien, from Curtin's School of Earth and Planetary Sciences, said there was much scientific debate over the exact start date of plate tectonics on Earth.

"Some scientists believe plate tectonics only began to operate from around 800 million years ago, whereas others think it could go as far back as four billion years ago, soon after the formation of our planet," Mr Gamal El Dien said.

"So far nearly all the evidence used in this debate came from scarcely preserved surface geological proxies, and little attention has been paid to the record kept by Earth's deep mantle - this is where our research comes in.

"For the first time, we were able to demonstrate that a significant shift in mantle composition (or a major mantle remixing) started around 3.2 billion years ago, indicating a global recycling of the planet's crustal materials back in to its mantle layer, which we believe shows the start of global plate tectonic activity."

During the earliest stages of Earth's planetary differentiation, the planet was divided into three main layers: the core, the mantle and the crust. Scientists believe there would have been very little remixing between the lighter crust and the much denser mantle, until the onset of plate tectonics.

However through the ongoing process of subduction, some lighter crustal materials are carried back into the denser deep Earth and remixed with the mantle. The question the researchers then asked was, when did this global and whole-mantle remixing process start?

"Keeping the basic process of subduction in mind, we hypothesise that ancient rock samples found on the crust, that are ultimately sourced from the deep mantle, should show evidence of the first major 'stirring up' in the mantle layer, marking the start of plate subduction as a vital component of plate tectonic processes," Mr Gamal El Dien said.

To complete this research, the team looked at the time variation of the isotopic and chemical composition of approximately 6,000 mantle-derived basaltic and komatiitic lava rocks, dated to be between two and four billion years old.

Research co-author John Curtin Distinguished Professor and Australian Laureate Fellow Professor Zheng-Xiang Li, head of the Earth Dynamics Research Group, said the research is highly significant in understanding the dynamic evolution of our planet.

"Plate tectonic activity on the planet is responsible for the formation of mineral and energy resources. It also plays a vital role for the very existence of mankind. Plate tectonics are found uniquely operative on Earth, the only known habitable planet," Professor Li said.

"Through our retrospective analysis of mantle-derived samples, we discovered that after the initial chemical stratification and formation of a hard shell in the first billion years of Earth's 4.5 billion year history, there was indeed a major chemical 'stir up' some 3.2 billion years ago.

"We take this 'stir up' as the first direct evidence from deep Earth that plate tectonics started over 3 billion years ago, leading to a step change in mantle composition, followed by the oxygenation of our atmosphere and the evolution of life."

Credit: 
Curtin University

Self-driving cars that recognize free space can better detect objects

image: New CMU research shows that what a self-driving car doesn't see (in green) is as important to navigation as what it actually sees (in red).

Image: 
Carnegie Mellon University

PITTSBURGH--It's important that self-driving cars quickly detect other cars or pedestrians sharing the road. Researchers at Carnegie Mellon University have shown that they can significantly improve detection accuracy by helping the vehicle also recognize what it doesn't see.

Empty space, that is.

The very fact that objects in your sight may obscure your view of things that lie further ahead is blindingly obvious to people. But Peiyun Hu, a Ph.D. student in CMU's Robotics Institute, said that's not how self-driving cars typically reason about objects around them.

Rather, they use 3D data from lidar to represent objects as a point cloud and then try to match those point clouds to a library of 3D representations of objects. The problem, Hu said, is that the 3D data from the vehicle's lidar isn't really 3D -- the sensor can't see the occluded parts of an object, and current algorithms don't reason about such occlusions.

"Perception systems need to know their unknowns," Hu observed.

Hu's work enables a self-driving car's perception systems to consider visibility as it reasons about what its sensors are seeing. In fact, reasoning about visibility is already used when companies build digital maps.

"Map-building fundamentally reasons about what's empty space and what's occupied," said Deva Ramanan, an associate professor of robotics and director of the CMU Argo AI Center for Autonomous Vehicle Research. "But that doesn't always occur for live, on-the-fly processing of obstacles moving at traffic speeds."

In research to be presented at the Computer Vision and Pattern Recognition (CVPR) conference, which will be held virtually June 13-19, Hu and his colleagues borrow techniques from map-making to help the system reason about visibility when trying to recognize objects.

When tested against a standard benchmark, the CMU method outperformed the previous top-performing technique, improving detection by 10.7% for cars, 5.3% for pedestrians, 7.4% for trucks, 18.4% for buses and 16.7% for trailers.

One reason previous systems may not have taken visibility into account is a concern about computation time. But Hu said his team found that was not a problem: their method takes just 24 milliseconds to run. (For comparison, each sweep of the lidar is 100 milliseconds.)

Credit: 
Carnegie Mellon University

Transforming spleen to liver brings new hope for organ regeneration

image: The two research groups working on this project.

Image: 
Lei Dong and Chunming Wang

Scientists from Nanjing University and University of Macau have transformed the spleen into a functioning liver in living mice, which could bring new hope for patients suffering from organ shortage worldwide.

For nearly ten million people with end-stage organ failure, implanting a new organ to replace the damaged one might be their last hope of survival. However, the shortage of donors, immune rejection and numerous other medical, ethical and economic factors have kept patients in a devastatingly long queue, and many never receive an organ. Each day, in the United States alone, twenty people die waiting for transplants.

In the past three decades, tissue engineering (TE) has promised to create functioning tissue from test tubes. This approach aims to culture living cells in 3D scaffolds, induce them to grow into the desired tissue and, finally, transplant this living tissue back into the body to substitute for the damaged one. 'The goal of TE is to restore function through the delivery of living elements which become integrated into the patient', wrote Joseph Vacanti and Robert Langer, two pioneers in regenerative medicine, in their manifesto paper published in Lancet in 1999.

This approach has made remarkable progress, providing promising solutions for repairing structurally simpler tissues. However, to regenerate vital and complex organs such as the liver, TE still has a long way to go. The structure of an organ like the liver is too complicated for replication by current technologies - particularly its abundant, open, organised blood vessels connecting to the body for nutrient supply. Simplified prototypes engineered in the laboratory survive poorly after transplantation without adequate blood supply.

To address this challenge, in their recent paper published in Science Advances, the team from Nanjing and Macau adopted a different way of thinking. Instead of engineering an organ for transplantation, they directly transformed an existing organ - the spleen - into a 'new' organ that fulfils the liver's function in the same mouse. The researchers injected a pre-selected tissue extract into the spleens of mice, after which the spleen showed a lower immune response and produced more of the extracellular matrix required for cell growth. They then implanted mouse, rat and human liver cells into the remodelled spleen, observing over several months that these cells not only survived immune rejection and grew into liver-like structures but, more importantly, performed the liver's function in the host body. Perhaps most excitingly, the spleen-transformed liver could rescue mice with 90% of their original liver removed.

This paper is published online under the title 'Transforming the spleen into a liver-like organ in vivo'. Professor Lei Dong of Nanjing University, the leading author of this work, believes this technology could 'solve the fundamental challenges in tissue engineering, including insufficient cells, immune rejection and lack of blood vasculature, at one time'. Dong also suggests that, instead of focusing too much on tissue structure, their strategy concentrates on restoring tissue function in vivo, which should be the original goal of tissue engineering. Professor Chunming Wang of the University of Macau, co-corresponding author of the paper, highlights the safety of the new strategy: 'no adverse responses were observed for as long as eight weeks, such as immune rejection or unwanted spreading of the transplanted cells', which indicates its translational potential. The authors are confident that their approach can overcome long-standing obstacles in regenerative medicine and, ultimately, help to regenerate large organs 'on demand'.

Professor Xiaokun Li, an expert in regenerative medicine and Member of the Chinese Academy of Engineering, rates this work highly for its 'unique strategy to achieve liver regeneration, solid findings on functionalities of transplanted cells, and impressive potential for translational medicine'. Li recommends that future work be performed on larger animals, with comprehensive evaluation of both efficacy and safety, towards clinical application.

Credit: 
Nanjing University School of Life Sciences