Tech

Why doesn't Ebola cause disease in bats, as it does in people?

GALVESTON, Texas - A new study by researchers from The University of Texas Medical Branch at Galveston has uncovered new information on why the Ebola virus can live within bats without causing them harm, while the same virus wreaks deadly havoc on people. This study is now available in Cell Reports.

The Ebola virus causes a devastating, often fatal, infectious disease in people. Within the past decade, Ebola has caused two large and difficult to control outbreaks, one of which recently ended in the Democratic Republic of the Congo.

When a virus causes serious disease in people, it is a sign that humans are not well-adapted hosts for it. Viruses depend on a living host for their survival and have natural reservoirs - a host animal species in which a virus naturally lives and reproduces without causing disease. Bats are likely a natural reservoir for the Ebola virus, but little is known about how the virus evolves in bats.

Like most other RNA viruses, Ebola carries its genetic material in a form that makes it more prone to genomic errors and mutations than other types of viruses. Because of this, Ebola and similar viruses have a remarkable ability to adapt to and replicate in new environments.

In the study, the research team, led by Alex Bukreyev, a UTMB virologist in the departments of pathology and microbiology and immunology, working with the team of Raul Andino at the University of California, San Francisco, investigated how the Ebola virus adapts to both bat and human cells. They repeatedly assessed changes in mutation rates and in the structure of Ebola virus populations in both bat and human cell lines using ultra-deep genetic sequencing.

"We identified a number of meaningful differences in how the Ebola virus evolves when placed in a human cell line relative to a bat cell line," Bukreyev said. "For instance, the RNA editing enzyme called ADAR within bat cells play a greater role in the replication and evolution of the Ebola virus than do such enzymes in human cells. We found that the envelope protein of Ebola virus undergoes a drastic increase in certain mutations within bat cells, but this was not found in human cells. This study identifies a novel mechanism by which Ebola virus is likely to evolve in bats."

The study suggests that the Ebola virus and bats can live together harmoniously because of the bat cell's ability to induce changes in the virus that make it less capable of harm. Bukreyev said that the study's findings validate the ultra-deep genetic sequencing used in this study as a predictive tool that can identify viral mutations associated with more adaptive evolution. This technology can be very useful in studying, and perhaps shaping, the evolution of emerging viruses, like SARS-CoV-2, the virus responsible for COVID-19.

Credit: 
University of Texas Medical Branch at Galveston

Under pressure, nontoxic salt-based propellant performs well

image: Associate Professor Joshua Rovey with graduate students Nick Rasmont and Matt Klosterman

Image: 
Aerospace Engineering, Grainger Engineering

In smaller spacecraft such as CubeSat satellites, a salt-based monopropellant is showing promise. It can be used both in a high-thrust chemical mode for fast, time-sensitive maneuvers and in an electric mode for slow maneuvers, such as orbit maintenance. Now, researchers in the Department of Aerospace Engineering at the University of Illinois at Urbana-Champaign have more knowledge about how it performs under pressure.

The propellant, called FAM-110A, is a mixture of two commercially available salts.

Watch a video of the combustion: https://youtu.be/7Riz4t6D6As

"Unlike hydrazine, the most commonly used monopropellant available today, our mixture is nontoxic. It's also denser, so a smaller tank can be used for storage. And because it can be used in a combined chemical-electric thruster, it requires less plumbing than two separate thruster systems with their own propellants," said Nicolas Rasmont, a graduate student who is working with AE faculty member Joshua Rovey.

Rasmont said his research project looked at the velocity of its combustion in the high-thrust chemical mode and how it performs under different pressures. The findings will help inform the design of a rocket engine using this type of propellant.

"We learned that the preparation and storage conditions have a deeper influence on combustion velocity than we anticipated," Rasmont said. "We don't have a complete explanation for that, yet. We think it's because FAM-110A can absorb moisture from the atmosphere very quickly. Both components are hygroscopic, and other researchers found that even a small increase in water content can alter the combustion properties of similar propellants."

The experiment studied FAM-110A along with two controls whose combustion behaviors are well known: nitromethane, and a mixture of 80 percent HAN (hydroxylammonium nitrate) with 20 percent water. The propellants were subjected to a range of pressures from 0.5 to 11.0 megapascals, while a high-speed camera captured images of the flame to calculate the burn rate.

"The liquid propellant is placed in a quartz tube inside of the high-pressure combustion chamber. The chamber is closed and pressurized with nitrogen, then the top of the liquid is ignited, the flame front descends down the tube, and is observed through a viewport. Full combustion can take between a few seconds to a few hundredths of a second."

"The burn rate influences how you design an engine," Rasmont said. "If your burn rate is too high, you're going to have combustion flashback--in which the decomposition flame will try to go back through the piping to the tank. It's going so fast that it can blow up everything.

"But on the other hand, if your burn rate is too low, then the combustion will be difficult to maintain. In the end, we want to operate in a Goldilocks zone where the burn rate is not too high, so it's safe, and not too low, so that the combustion is stable and efficient. We found that FAM-110A has a Goldilocks zone which is fairly wide, without any abrupt change in burn rate with increasing pressure.

In addition, the burn rate plateaus at a high pressure, which is also a desirable behavior.

"It means we could build a rocket engine using our propellant to be practical at almost any kind of pressure level. However, we also learned that FAM-110A leaves a significant amount of liquid residue after it burns. This is undesirable because it means that the combustion is incomplete. We might have to change the formulation in order to improve the efficiency of its combustion."

Rasmont said the next step is to try to use it on an actual rocket engine to see if it is practical. "The tests we have done here are encouraging, but they also point out limitations and places we can improve."

Credit: 
University of Illinois Grainger College of Engineering

OCT-based technique captures subtle details of photoreceptor function

image: Researchers have developed a unique synchronized high-speed OCT/scanning light ophthalmoscope (SLO) system that captures the function of the retina's rods and cones. The OCT images are co-registered with SLO images to pinpoint the location and type of photoreceptors captured in the series of 3D OCT images. The optical setup is shown.

Image: 
Mehdi Azimipour, UC Davis Eye Center.

WASHINGTON -- Researchers have developed a new instrument that has, for the first time, measured tiny light-evoked deformations in individual rods and cones in a living human eye. The new approach could one day improve detection of retinal diseases such as age-related macular degeneration, a leading cause of blindness in people over 55 worldwide.

"Our instrument offers a unique way to study retinal disease at the cellular level," said research team leader Ravi Jonnal from the University of California Davis (UC Davis) Eye Center. "Because existing methods for measuring dysfunction are much less sensitive, it offers a potential new way to detect disease."

In The Optical Society (OSA) journal Optics Letters, Jonnal and colleagues describe their new instrument, which is based on optical coherence tomography (OCT). Using the new approach, they were able to measure how individual rods and cones respond to light, and could detect deformations that were significantly smaller than the wavelength of the imaging light source.

The work is part of an emerging international field of research that aims to develop methods to fully capture the function of the retinal neural circuit of living people.

Combining imaging methods

Vision begins when rod and cone photoreceptors in the eye's retina detect light and initiate signals through a process called phototransduction. Retinal diseases such as age-related macular degeneration and retinitis pigmentosa cause vision loss by interfering with the function of rods and cones.

Because rods are thought to be more sensitive to the impacts of these diseases, changes in their function could provide an early indicator of disease or its progression. However, the small size of rods makes it difficult to image them, much less measure how well they are functioning.

In the new work, the researchers developed a unique high-speed OCT system capable of detecting slight swelling in the outer segments of the photoreceptors that occurs as a side effect of phototransduction. The system accomplishes this by capturing specialized OCT images simultaneously with scanning light ophthalmoscope images, enabling it to pinpoint the location and type of photoreceptors captured in a series of hundreds of 3D OCT images.

"Although imaging the swelling of rods and cones can reveal the dynamics of their response to light, until recently, it was not known if these changes could be measured in vivo in the human eye," said Mehdi Azimipour, first author of the paper. "This is because the size of the photoreceptors and the scale of the light-evoked deformations were well below the resolutions provided by retinal imaging systems."

Imaging high-speed dynamics

Recently, full-field OCT has been used to visualize the light-evoked deformation of larger peripheral cones. The OCT system developed by the researchers from UC Davis offers better confocality, which improves image quality by rejecting more scattered light and suppressing associated noise. Because the light-evoked deformation of photoreceptors can be very fast, the new system incorporates a high-speed Fourier-domain mode-locked laser that enables fast imaging and can scan 16 times faster than commercially available lasers used for swept source OCT.

To capture the highest resolution images possible, the researchers incorporated adaptive optics technology that measures the eye's aberrations and corrects them in real time. Even with adaptive optics, rod photoreceptors are too small to be imaged with the system's 1-micron-wavelength light source. To overcome this problem, the researchers added a scanning light ophthalmoscope imaging channel that uses a wavelength of less than 1 micron to increase the imaging resolution. This allowed differentiation of rods and cones in co-registered OCT images.

The researchers used their new instrument to measure the deformations of rods and cones in response to light of varying intensity in living human eyes. The responses of the cells increased as the light intensity increased until saturation occurred, consistent with phototransduction.

Because the new instrument produces large quantities of data (3.2GB/s) over even a small field of view, software needs to be developed to allow scanning of larger areas of retina and automatic data processing. This would make the system more practical for clinical use.

The researchers are now planning to use the instrument to measure photoreceptor light responses of patients with retinal diseases to see if new insights can be gained. "We hope to be involved in using the system to test novel therapies for blinding diseases, to speed up the process of bringing those therapeutics to the clinic," said Azimipour.

Credit: 
Optica

Cover crop mixtures must be 'farm-tuned' to provide maximum ecosystem services

image: This is how the cover crop mix looked at one study site. From the same seed mixture, cover crop mixture expression varied greatly across farms and researchers hypothesized that this variation was correlated with soil inorganic nitrogen concentrations and growing days.

Image: 
Ebony Murrell, Penn State

Penn State researchers, in a recent study, were surprised to learn that they could take the exact same number of seeds from the same plants, put them in agricultural fields across the Mid-Atlantic region and get profoundly different stands of cover crops a few months later.

The study came to be known as "'farm-tuning' cover crop mixtures," noted researcher Jason Kaye, professor of soil biogeochemistry, who added that the findings are significant because they show the need to customize cover crop mixes to achieve desired ecosystem services, depending on soil and climatic conditions.

Cover crop mixtures composed of multiple species planted in rotation between cash crops provide a suite of benefits -- such as erosion reduction, weed control, and adding carbon and nitrogen to the soil. But it turns out that the expression of species in a mixture can differ greatly across locations.

The study was novel due to its breadth and complexity. Researchers tracked a five-species cover crop mix planted over two growing seasons on eight organic dairy farms in Pennsylvania and New York and on research plots at Penn State's Russell E. Larson Agricultural Research Center. In the University's experimental plots, researchers manipulated cover crop expression with nitrogen inputs and planting dates to learn the response of the various species to soil conditions and growing days.

"There have been very few studies like this -- especially looking at cover crop mixtures that are comprised of more than two species -- that analyze how species interact with each other, so I think it's important research," Kaye said. "There has been a misguided assumption that you plant a cover crop mixture and you get the same result wherever you put it."

Commercial seed companies sell many pre-formulated seed mixtures, but they can also make customized mixes, Kaye said. "Our results show that with fixed, preformulated mixtures, what you grow is not always what you expect."

In the study, all eight of the participating farmers seeded the standard mixture and a "farm-tuned" mixture of the same five species -- canola, Austrian winter pea, triticale, red clover and crimson clover -- with seeding rates adjusted to achieve farmer-desired services. At each location, researchers parsed out the effects of soil inorganic nitrogen and growing days on cover crop mixture expression.

When soil inorganic nitrogen and growing days were high, they found, canola dominated the mixture, especially in the fall.

From the same seed mixture, cover crop mixture expression varied greatly across farms, and researchers hypothesized that this variation was correlated with soil inorganic nitrogen concentrations and growing days, explained lead researcher Barbara Baraibar, who was a postdoctoral scholar in the Department of Plant Sciences when the study was conducted.

Kaye's laboratory in the College of Agricultural Sciences has been conducting a continuous experiment evaluating the effectiveness of various cover crop mixtures since 2011.

In findings recently published in PLOS ONE, the researchers reported that low soil inorganic nitrogen favored legume species while a shorter growing season favored triticale. Changes in seeding rates influenced mixture composition in fall and spring but interacted with growing days to determine the final expression of the mixture.

The results show, Baraibar pointed out, that when soil inorganic nitrogen availability is high at the time of cover crop planting, highly competitive species can dominate mixtures, which could potentially decrease services provided by other species, especially legumes. And early planting dates can exacerbate the dominance of aggressive species.

Based on this study, farm managers should choose cover crop species and seeding rates according to their soil inorganic nitrogen and planting dates to ensure the provision of desired services, she advised, suggesting that the real value of this research is that it provides usable information to farmers who want to take advantage of it.

"We wanted to have an experiment that would be real enough for farmers to be able to use the data," Baraibar said. "We wanted to know what actually happens when we plant these cover crop mixtures on many different farms that have different soils and different management. Our research paves the way for farmers and seed companies to use soil and climate knowledge to design custom seed mixes with predictable growth from the different species."

Credit: 
Penn State

USU mathematicians unravel a thread of string theory

image: Using an abstract graph, Utah State University researchers identify divisors within each K3 surface to examine varied symmetries. The different Jacobian elliptic fibrations correspond to specific colors of a connected subset of the nodes of the graph. The symmetries of the graph and the possible colorings of the nodes are crucial to understanding the symmetries of the underlying physical theories.

Image: 
Malmendier/Hill, USU

LOGAN, UTAH, USA - Simply put, string theory is a proposed method of explaining everything. Actually, there's nothing simple about it. String theory is a theoretical framework from physics that describes one-dimensional, vibrating fibrous objects called "strings," which propagate through space and interact with each other. Piece by piece, energetic minds are discovering and deciphering fundamental strings of the physical universe using mathematical models. Among these intrepid explorers are Utah State University mathematicians Thomas Hill and his faculty mentor, Andreas Malmendier.

With colleague Adrian Clingher of the University of Missouri-St. Louis, the team published findings about two branches of string theory in the paper, "The Duality Between F-theory and the Heterotic String in D=8 with Two Wilson Lines," in the August 7, 2020 online edition of 'Letters in Mathematical Physics.' The USU researchers' work is supported by a grant from the Simons Foundation.

"We studied a special family of K3 surfaces - compact, connected complex surfaces of dimension 2 - which are important geometric tools for understanding symmetries of physical theories," says Hill, who graduated from USU's Honors Program with a bachelor's degree in mathematics in 2018 and completed a master's degree in mathematics this past spring. "In this case, we were examining a string duality between F-theory and heterotic string theory in eight dimensions."

Hill says the team proved the K3 surfaces they investigated admit four unique ways to slice the surfaces as Jacobian elliptic fibrations, formations of torus-shaped fibers. The researchers constructed explicit equations for each of these fibrations.

"An important part of this research involves identifying certain geometric building blocks, called 'divisors,' within each K3 surface," he says. "Using these divisors, crucial geometric information is then encoded in an abstract graph."

This process, Hill says, enables researchers to investigate symmetries of underlying physical theories demonstrated by the graph.

"You can think of this family of surfaces as a loaf of bread and each fibration as a 'slice' of that loaf," says Malmendier, associate professor in USU's Department of Mathematics and Statistics. "By examining the sequence of slices, we can visualize, and better understand, the entire loaf."

The undertaking described in the paper, he says, represents hours of painstaking "paper and pencil" work to prove theorems of each of the four fibrations, followed by pushing each theorem through difficult algebraic formulas.

"For the latter part of this process, we used Maple Software and the specialized Differential Geometry Package developed at USU, which streamlined our computational efforts," Malmendier says.

Credit: 
Utah State University

Novel method can efficiently create several 'building blocks' of pharmaceutical drugs

image: A novel method that promises to accelerate drug discovery research for several diseases

Image: 
Waseda University

Several drugs, including those for depression, schizophrenia, and malaria, would not exist if not for a class of organic compounds called alicyclic compounds. These compounds are 3D structures formed when three or more carbon atoms join in a ring via covalent bonds, but the ring is not aromatic.

Aromatic compounds (or arenes) are another class of organic compounds which are 2D structures with reactive properties distinct from those of alicyclic compounds. A well-known example is benzene, the six-carbon ring comprising alternating single- and double-bonds between the carbon atoms.

By dearomatizing arenes, one can get alicyclic compounds. In fact, this dearomatization is one of the most powerful ways of obtaining alicyclic compounds. But some of the most abundantly available arenes, such as benzene and naphthalene, are very stable, and breaking them up to construct alicyclic compounds has been challenging. With existing methods, large amounts of reactants often yield very little product.

"The highly efficient conversion of readily and commercially available arenes to high value-added alicyclic compounds could accelerate drug discovery research by leaps," say Assistant Professor Kei Muto and Professor Junichiro Yamaguchi of Waseda University, Japan, who led the discovery of a novel efficient method. Their study is published in the Royal Society of Chemistry's Chemical Science.

In the novel method, bromoarenes are reacted with two other classes of organic compounds, diazo compounds and malonates, in the presence of a palladium catalyst (compound that enables a chemical reaction), under optimal conditions of concentration, temperature, and time (experimentally ascertained in the study). Subsequently, good amounts of the corresponding alicyclic compounds are produced.

"What is really special about this method is that a range of bromoarenes, including benzenoids, azines, and heteroles, can be converted to their alicyclic counterparts," Muto says. He speaks also of the key portions of an alicyclic molecule that give it complexity and utility--the functional groups attached to the cyclic carbons. He says, "The obtained compounds have functional groups at two points in the cyclic chain, and these can be easily diversified through further reactions to yield a variety of highly functionalized 3D molecules."

The use of malonates as reactants is what allows this multi-functionalization, setting this novel method apart from existing methods, which are often highly specific in terms of the products possible. Because malonates are known to predominantly react with palladium-benzyl complexes, the use of a palladium-based catalyst became key to the success of this method. The palladium catalyst led to the formation of a benzyl-palladium intermediate that could then react with malonates, producing the final multi-functionalized alicyclic products.

Thus, designing an appropriate catalysis process was essential to developing the aromatic-to-alicyclic transformation technique. "Next, we would like to design new catalysts to make this reaction more general; that is, compatible with a broader range of arenes," says Yamaguchi.

With their future plans in place, Muto and Yamaguchi are confident of the good that their team's work can do in the world: "We believe this organic reaction will help drug discovery research finally 'escape from the flatland' of the simpler and 2D aromatic compounds, so to speak, thereby advancing medicinal chemistry significantly."

Credit: 
Waseda University

Energy-efficient tuning of spintronic neurons

image: Scanning electron microscopy image and schematic of cross-sectional structure of the created spin-Hall nano-oscillator device with the gate electrode

Image: 
Johan Åkerman and Shunsuke Fukami

The human brain efficiently executes highly sophisticated tasks, such as image and speech recognition, on a far lower energy budget than today's computers. The development of energy-efficient and tunable artificial neurons capable of emulating brain-inspired processes has, therefore, been a major research goal for decades.

Researchers at the University of Gothenburg and Tohoku University jointly reported on an important experimental advance in this direction, demonstrating a novel voltage-controlled spintronic microwave oscillator capable of closely imitating the non-linear oscillatory neural networks of the human brain.

The research team developed a voltage-controlled spintronic oscillator, whose properties can be strongly tuned, with negligible energy consumption. "This is an important breakthrough as these so-called spin Hall nano-oscillators (SHNOs) can act as interacting oscillator-based neurons but have so far lacked an energy-efficient tuning scheme - an essential prerequisite to train the neural networks for cognitive neuromorphic tasks," proclaimed Shunsuke Fukami, co-author of the study. "The expansion of the developed technology can also drive the tuning of the synaptic interactions between each pair of spintronic neurons in a large complex oscillatory neural network."

Earlier this year, the Johan Åkerman group at the University of Gothenburg demonstrated, for the first time, 2D mutually synchronized arrays accommodating 100 SHNOs while occupying an area of less than a square micron. The network can mimic neuron interactions in our brain and carry out cognitive tasks. However, a major bottleneck in training such artificial neurons to produce different responses to different inputs has been the lack of a scheme to control individual oscillators within such networks.

The Johan Åkerman group teamed up with Hideo Ohno and Shunsuke Fukami at Tohoku University to develop a bow tie-shaped spin Hall nano-oscillator made from an ultrathin W/CoFeB/MgO material stack with an added functionality of a voltage controlled gate over the oscillating region [Fig. 1]. Using an effect called voltage-controlled magnetic anisotropy (VCMA), the magnetic and magnetodynamic properties of CoFeB ferromagnet, consisting of a few atomic layers, can be directly controlled to modify the microwave frequency, amplitude, damping, and, thus, the threshold current of the SHNO [Fig. 2].

The researchers also found a giant modulation of SHNO damping up to 42% using voltages from -3 to +1 V in the bow-tied geometry. The demonstrated approach is, therefore, capable of independently turning individual oscillators on/off within a large synchronized oscillatory network driven by a single global drive current. The findings are also valuable since they reveal a new mechanism of energy relaxation in patterned magnetic nanostructures.

Fukami notes that "With readily available energy-efficient independent control of the dynamical state of individual spintronic neurons, we hope to efficiently train large SHNO networks to carry out complex neuromorphic tasks and scale up oscillator-based neuromorphic computing schemes to much larger network sizes."

Credit: 
Tohoku University

Tumour gene test could help to predict ovarian cancer prognosis

image: Formalin-Fixed Paraffin-Embedded (FFPE) tumour blocks in Professor Ramus' lab.

Image: 
Photo: UNSW

A tumour test could help to identify ovarian cancer patients with predicted poor survival, and down the track inform new therapeutic approaches, the results of a major international collaboration have shown.

The research paper led by UNSW Medicine - involving 125 authors across 86 organisations, including University of Southern California (USC), University of Cambridge, University of British Columbia, Huntsman Cancer Institute, Mayo Clinic, and Peter MacCallum Cancer Center in Melbourne - was published in Annals of Oncology.

It is predicted that 1,532 Australian women will be diagnosed with ovarian cancer in 2020 and that 1,068 will die from the disease this year. Survival is poor, and the type studied in this paper - high grade serous ovarian cancer - is both the most common and the one with the worst survival. Ovarian cancer is the eighth most commonly occurring cancer in women, with nearly 300,000 new cases globally in 2018.

"We conducted an analysis of 3,769 tumour samples from women with ovarian cancer and found we were able to reliably use a piece of tumour to determine how good a woman's survival chances would be five years after diagnosis," says lead author Professor Susan Ramus from UNSW Medicine.

The researchers found their gene expression test was substantially better at predicting survival than using a patient's age and cancer stage.

"When women were divided into five groups, we found that the women whose tumour gene expression was associated with the best prognosis had nine years survival, whereas the women in the poorest survival group have two years survival, which is a very big difference," Professor Ramus says.

"Our vision is that clinicians could use our test at diagnosis to identify the group of patients who wouldn't do well on the current treatments and potentially offer them alternatives - for example, we may be able to put those patients into clinical trials and offer them different treatments that may improve their survival."

For the study, the team used a training set of samples and a test set - nearly 4,000 samples in total.

"Using novel statistical approaches, we analysed data from six previous gene expression studies, which helped us identify genes likely to be involved in high grade serous ovarian cancer survival," says the paper's first author, Dr Joshua Millstein from USC.

After putting together a panel of about 500 candidate genes, the team measured gene expression in the 4,000 samples using the NanoString platform.

"To predict survival from gene expression, we chose one of four machine learning methods, an approach called 'elastic net', which performed the best in the training data," Dr Millstein says.

"We used the training set to determine what genes could be used in the prediction, and then we tested them to see whether we got the same results in the other set," Professor Ramus says.

Professor Ramus is the co-founder of the Ovarian Tumour Tissue Analysis (OTTA) consortium, an international group of researchers working on a number of large-scale projects, using the samples compiled by the consortium to address important clinical questions.

"The consortium is unique in this space because it has access to thousands of samples - which is a lot of samples for a rare disease like ovarian cancer," she says.

"That's what enabled us to develop this prognostic tool - other groups have tried before to look at prognosis, but nothing has been used clinically. At the moment, only patient age and stage are used to determine survival, so something like our tool is sorely needed."

The researchers say they selected genes for analysis that had known drug targets.

"Some of the genes we identified as being predictors for good or poor survival may be potential targets for new treatments. At the moment the majority of ovarian cancer patients get the same treatment - it's not like breast cancer or other cancers where they look at your tumour and select from a range of treatments. So this is a way to stratify patients and potentially give more personalized treatment down the track."

To validate the findings further, the research team wants to include the test in a prospective study and clinical trials.

"Potentially, we could incorporate it within a clinical trial so that the women who are predicted to have poor survival could get alternative treatments as rapidly as possible," Prof. Ramus says.

The researchers hope their test will be ready for clinical use in the near future.

Credit: 
University of New South Wales

DNA damage triggers reprogramming into stem cells

image: DNA damage causes cells to reprogram themselves into stem cells and regenerate new plant bodies in the moss Physcomitrella patens.

Image: 
NIBB

A joint research team from the National Institute for Basic Biology (NIBB) in Japan, Huazhong Agricultural University in China, and the Czech Academy of Sciences in the Czech Republic has discovered that DNA damage causes cells to reprogram themselves into stem cells and regenerate new plant bodies in the moss Physcomitrella patens. The researchers describe this phenomenon as a unique environmental adaptation of plants.

In animals, cells with severe DNA damage undergo apoptosis--cell death--and are eliminated. These new results published in Nature Plants tell a different story for moss cells. Ms. Nan Gu, a NIBB Special Inter-University Researcher who is a graduate student at Huazhong Agricultural University under the mentorship of Dr. Chunli Chen, and her collaborators discovered that when the DNA of the moss is damaged, it is immediately repaired. Furthermore, the cells with repaired DNA become stem cells, which can produce an entire plant body, similar to fertilized egg cells. Nan Gu says, "I was shocked by the result, because animal cells choose to die, but plant cells choose to produce new offspring".

After Physcomitrella plants were soaked in a DNA-damaging solution for 6 hours, their DNA was severely broken. However, the damaged DNA was repaired to almost its original state within one day. After that, the STEMIN1 gene, a master regulator of reprogramming, was triggered. STEMIN1-positive cells eventually became stem cells and went on to form whole plants with stems and leaves.

"It has been known that, in plants, differentiated cells around dead cells can become stem cells. However, this is the first discovery that differentiated cells with damaged DNA themselves become stem cells", explains Dr. Yosuke Tamada, a co-first author of this study.

"This phenomenon we discovered is a strategy for environmental adaptation, especially in plants, which are not able to escape from adverse environments as quickly as animals", said Professor Mitsuyasu Hasebe from NIBB, who led the research team.

Credit: 
National Institutes of Natural Sciences

Scientists develop approach to synthesize unconventional nanoalloys for electrocatalytic application

Metallic alloys at the nanometer scale (nanoalloys) have great potential in electrocatalysis. The interaction among different components in nanoalloys may modify the electronic configurations of active metals and generate synergistic effects, boosting their performance in terms of activity, durability and selectivity in electrocatalytic reactions.

Alloying with cheap transition metals is an effective way to reduce the dosage of relatively expensive metals, e.g. Pt and Pd. Usually, the nanoalloys composed of miscible metals could be synthesized and tailored by wet-chemistry methods.

However, engineering of nanoalloys composed of various immiscible metals or dissimilar elements (referred to as high entropy nanoalloys) remains challenging due to vastly different chemical and physical properties.

Recently, Prof. Mansoo Choi's group from Seoul National University (SNU) and Prof. Jun YANG's group from the Institute of Process Engineering (IPE) of the Chinese Academy of Sciences proposed an approach to synthesize unconventional nanoalloys for electrocatalytic applications.

The study was published in Matter on August 17. Dr. FENG Jicheng from SNU and Associate Professor CHEN Dong from IPE serve as the co-first authors.

They prepared 55 distinct nanoalloys with a mean particle size of ca. 5 nm, including miscible, immiscible and high entropy nanoalloys, via a vapor-source technology, also called "sparking mashup".

The proposed synthetic approach broke through the limitation of wet-chemistry methods for synthesizing immiscible and high entropy nanoalloys. The composition and size of the obtained nanoalloys were also controllable by varying the experimental conditions/parameters.

Specifically, benefiting from their tiny sizes, electronic interactions among different components and clean surfaces, the as-prepared Pt-/Pd-containing nanoalloys exhibited superior performances in the electrocatalytic oxidation of methanol and ethanol.

The next goal of the research focuses on the optimization of Pt-/Pd-containing nanoalloys to further enhance their performance in catalyzing electrochemical reactions in direct alcohol fuel cells.

"Through this way, we expect that cost-effective, highly active and durable electrocatalysts could be created for a variety of renewable energy technologies and beyond," said Prof. YANG.

Credit: 
Chinese Academy of Sciences Headquarters

Telehealth visits have skyrocketed for older adults, but some concerns & barriers remain

image: Key findings about the use of telehealth by older adults in 2020, compared with 2019.

Image: 
University of Michigan

One in four older Americans had a virtual medical visit in the first three months of the COVID-19 pandemic, most of them by video, a new telehealth poll finds. That's much higher than the 4% of people over 50 who said they had ever had a virtual visit with a doctor in a similar poll taken in 2019.

Comfort levels with telehealth, also called telemedicine, have also increased. Back in 2019, most older adults expressed at least one serious concern about trying a telehealth visit. But by mid-2020, the percentage with such concerns had eased, especially among those who had experienced a virtual visit between March and June of this year.

Yet not all older adults see virtual care as an adequate substitute for in-person care, even in a pandemic, the National Poll on Healthy Aging findings show.

And 17% of people over 50 still say they have never used any sort of video conferencing tool for any reason, including medical care. While that's 11 percentage points lower than in the 2019 poll, that lack of experience or access could be a barrier to receiving care without having to leave home as the pandemic continues to surge in dozens of states.

Both the 2019 and 2020 polls were done for the University of Michigan's Institute for Healthcare Policy and Innovation with support from AARP and Michigan Medicine, U-M's academic medical center. Both involved a national sample of more than 2,000 adults aged 50 to 80.

"These findings have implications for the health providers who have ramped up telehealth offerings rapidly, and for the insurance companies and government agencies that have quickly changed their policies to cover virtual visits," says Laurie Buis, Ph.D., M.S.I., a health information technology researcher at U-M who helped design the poll and interpret its results. "Tracking change over time could inform future efforts, and highlights the need for much more research on concerns, barriers and optimal use of telehealth by older adults."

"This has been an extraordinary time for the telemedicine movement, and these poll results show just how powerful this 'trial by fire' has been," says Jeff Kullgren, M.D., M.P.H., M.S., associate director of the poll, health care researcher and a primary care provider who uses telehealth with his patients at the VA Ann Arbor Healthcare System. "But our data also highlight areas of continued concern for patients that need to be addressed."

COVID-19 impacts

The poll finds that 30% of older adults had participated in a telehealth visit at some point by June 2020, perhaps reflecting changes in insurance coverage that began to take effect before the pandemic. But the figure for March through June suggests that much of the movement to telehealth visits resulted from states mandating reductions in elective and non-emergency health care during the first months of the pandemic, as part of "stay home" public health requirements aimed at reducing the spread of the virus.

Nearly half of those who had had a telehealth visit said that they had had an in-person visit canceled or rescheduled by their health care provider between March and June, and 30% said that a virtual visit was the only option when they called to schedule an appointment.

Awareness about the special risks of COVID-19 among older adults may have also played a role, as 45% of respondents said the pandemic made them more interested in telehealth. The percentage was higher among those who had had a telehealth visit in the past. But only 15% of the poll respondents who had a telehealth visit said that fear of the virus led them to request a telehealth visit, whether for a new concern or in place of a previously scheduled visit.

Among those who had a telehealth visit this past spring, 91% said it was easy to connect with their doctor. One-third had their visits via a video connection from their phone, and another third carried out the video visit on a tablet or computer. In addition, 36% had an audio-only visit by phone, which the 2019 poll did not ask about.

Year-over-year change

The new poll asked older adults many of the same questions as the poll conducted in 2019. This allows for comparisons between the two years, such as the percentage who said:

They feel very or somewhat comfortable with video conferencing technologies: 64%, up from 53% in 2019

At least one of their health providers offers telehealth visits: 62%, up from 14%

They are interested in using telehealth to connect with a provider they had seen before: 72%, up from 58%

They are interested in using telehealth for a one-time follow-up appointment after a procedure or operation: 63%, up from 55%

They have concerns about privacy during a telehealth visit: 24%, down from 49%

They are concerned they would have difficulty seeing or hearing the provider during a video visit: 25%, down from 39%

But there was almost no change in the percentage who said they would feel comfortable seeing a provider for the first time via a virtual visit (about one in three would), and the percentage who felt that the quality of care in a telehealth visit was not as good (about two-thirds).

Moving forward

Physician groups, insurers, professional societies and organizations including AARP are monitoring the situation with telehealth, and in some cases advocating for the temporary changes in Medicare and Medicaid payment policy, and other relevant regulations, to become permanent.

In June, AARP Research released a report about older adults' awareness of and attitudes toward telehealth. AARP has also published resources to help older adults and their caregivers use telehealth services.

"It's clear from this study and AARP's research that older adults are increasingly comfortable with telehealth and are willing to use technology to interact with their health providers," says Alison Bryant, Ph.D., senior vice president of research for AARP, says. "As the coronavirus pandemic continues, telehealth has been a useful tool for older adults to access health care from the safety of their own homes, but we must be mindful that not everyone can access these services."

Meanwhile, Buis is leading a new COVID-19-related telehealth research interest group as part of a broader IHPI initiative to evaluate the impacts and appropriate use of telehealth as well as barriers. Buis is an assistant professor in the U-M Department of Family Medicine, which has pivoted to virtual care for many primary care appointments along with the rest of Michigan Medicine.

The National Poll on Healthy Aging results are based on responses from a nationally representative sample of 2,074 adults aged 50 to 80 who answered a wide range of questions online. Questions were written, and data interpreted and compiled, by the IHPI team. Laptops and Internet access were provided to poll respondents who did not already have them.

Credit: 
Michigan Medicine - University of Michigan

Researchers got busy after nearly giving away the solution to a math riddle

COMPUTER SCIENCE Researchers from the University of Copenhagen and the Technical University of Denmark (DTU) thought that they were five years away from solving a math riddle from the 1980s. In reality, and without knowing it, they had nearly cracked the problem and had just given away much of the solution in a research article. The solution could be used to improve tomorrow's phones and computers.

Jacob Holm and Eva Rotenberg

A veritable brain teaser. That's how one can safely describe this mathematical problem in the discipline of graph theory. Two mathematicians from the University of Copenhagen's Department of Computer Science and DTU have now solved a problem that the world's quickest and most clever have been struggling with since the 1980s.

The two computer scientists, Assistant Professor Jacob Holm of UCPH and Associate Professor Eva Rotenberg of DTU almost gave their solution away in the summer of 2019, after submitting a research article that became the precursor to the article in which they finally solved the math riddle.

"We had nearly given up on getting the last piece and solving the riddle. We thought we had a minor result, one that was interesting, but in no way solved the problem. We guessed that there would be another five years of work, at best, before we would be able to solve the puzzle," explains Jacob Holm, who is a part of BARC, the algorithm section at UCPH's Department of Computer Science.

The three utilities problem

In 1913, a precursor to the now solved mathematical conundrum was published in "The Strand Magazine" as "The Three Utilities Problem". It caused the magazine's readers to scratch their heads and ponder. In the problem, each of three cottages must be connected to water, gas and electricity, while the "lines" between the houses and the three utilities may not cross each other -- which turns out to be impossible.

A solution between the lines

Simply put, the puzzle is about how to connect a number of points in a graph without allowing the lines connecting them to cross -- and how, with a mathematical calculation, an algorithm, one can make changes to an extensive "graph network" to ensure that no lines intersect, without having to start all over again. These properties can be used for, among other things, building immense road networks or the tiny innards of computers, where electrical circuitry on circuit boards may not cross.

Jacob Holm has been interested in the mathematical conundrum since 1998, but the answer was only revealed while the two researchers were reading through their already submitted research article. In the meantime, the researchers heard about a novel mathematical technique that they realized could be applied to the problem.

"While reading our research article, we suddenly realized that the solution was before our eyes. Our next reaction was 'oh no - we've shot ourselves in the foot and given away the solution,' says Associate Professor Eva Rotenberg of DTU.

About graph theory

A GRAPH is a very simple construction used to model things that can be described as objects and the connections between them. Graph theory is both an area of mathematics and an important tool in computer science.

In this context, a graph can be illustrated by a diagram consisting of a number of points (nodes, vertices) associated with a number of lines (edges). Each edge is illustrated as a line (or curved piece) with nodes as its two endpoints.

About the solution

There are two kinds of updates in dynamic graphs: one can delete an edge, and one can insert a new edge. These two operations are made by the user, while an algorithm keeps track of the network's drawing at all times. This is the algorithm that the researchers have found the recipe for.
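As a loose illustration of those two user operations (not the researchers' algorithm, whose whole point is to avoid recomputing the drawing from scratch), a dynamic graph structure might be sketched in Python as below; it leans on networkx's check_planarity routine, and the class name DynamicPlanarGraph is hypothetical:

# Loose sketch (not the authors' algorithm): a dynamic graph supporting the two
# update operations described above, naively re-checking planarity after each change.
import networkx as nx

class DynamicPlanarGraph:
    def __init__(self):
        self.graph = nx.Graph()

    def insert_edge(self, u, v):
        """User operation 1: insert a new edge, rejecting it if no planar drawing exists."""
        self.graph.add_edge(u, v)
        is_planar, _embedding = nx.check_planarity(self.graph)
        if not is_planar:
            self.graph.remove_edge(u, v)    # roll back the insertion
        return is_planar

    def delete_edge(self, u, v):
        """User operation 2: delete an edge (this can never break planarity)."""
        self.graph.remove_edge(u, v)

# The "three utilities" graph illustrates the limit: the ninth connection cannot be drawn.
g = DynamicPlanarGraph()
for house in ["h1", "h2", "h3"]:
    for utility in ["water", "gas", "electricity"]:
        print(house, utility, g.insert_edge(house, utility))   # last line prints False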

Read the research article: https://arxiv.org/abs/1911.03449

Could be used for computer electronics

This is when the two researchers got busy writing the research paper and tying up loose ends to solve the conundrum that Holm had been working on intermittently since 1998.

"We worked on the article non-stop, for five to six weeks. And, it ended up filling more than 80 pages," says Eva Rotenberg.

Fortunately, no one beat them to the solution and the two researchers were able to present their results at the main theoretical computer science conferences, which were meant to be held in Chicago, but ended up being held virtually.

So, what can the solution to this mathematical conundrum be used for? The two researchers don't know for sure, but they have a few suggestions.

"Our research is basic research, so we rarely know what it will end up being used for. Even from the start, we find applications difficult to imagine," says Jacob Holm, who adds:

"the design of microchips and circuit boards, found in all electronics, could be an area where our result ends up being used. When drawing wires on a circuit board, they must never intersect. Otherwise, short circuits will occur. The same applies to microchips, which contain millions of transistors and for which one must have a graph drawing."

Credit: 
University of Copenhagen

Scientists use photons as threads to weave novel forms of matter

image: Schematic of the experimental setup.

Image: 
University of Southampton

New research from the University of Southampton has successfully discovered a way to bind two negatively charged electron-like particles, which could create opportunities to form novel materials for use in new technological developments.

Positive and negative electric charges attract each other, forming atoms, molecules, and all that we usually refer to as matter. However, negative charges repel each other, and in order to form atom-like bound objects, some extra glue is needed to compensate for this electrostatic repulsion and bind the particles together.

In this latest study, published in the journal Nature Physics, an international team, led by Professor Simone De Liberato from the School of Physics and Astronomy at the University of Southampton, demonstrated for the first time that photons, the particles which compose light, can be used to glue together negative charges, creating a novel form of matter they named a Photon Bound Exciton.

Implementing a theoretical prediction published last year by the same team, Prof De Liberato and co-workers fabricated a nano-device, trapping electrons into nanoscopic wells. They started by showing that photons that struck the device with high enough energy extracted electrons from the wells, an expected manifestation of the photoelectric effect, whose discovery earned Einstein his 1921 Nobel prize.

Prof De Liberato and his team then enclosed the device between two gold mirrors, which trapped the photons and focussed the luminous energy close to the electrons, dramatically increasing the interaction between light and matter. They observed that a negatively charged electron kicked out by a photon instead remains trapped in the well, bound to the other negatively charged electrons in a novel electronic configuration stabilised by the photon.

This result demonstrates the possibility of engineering novel artificial atoms with designer electronic configurations, dramatically expanding the list of materials available for scientific and technological applications.

Explaining the significance of his team's discovery, Prof De Liberato said: "We demonstrated how to use light as a sort of subatomic ziptie, binding together electrons to create novel atom-like objects. Doing so, we broadened the catalogue of materials available to design photonic devices. I look forward to seeing how the many colleagues working in photonics will exploit this extra leeway to engineer novel amazing devices."

Credit: 
University of Southampton

Environment drivers of ecological complexity in marine intertidal communities

Environmental conditions such as sea surface temperature and the occurrence of cold water upwelling events drive the structure of interaction networks in marine intertidal communities via their effects on species richness, according to new research.

In a paper published in Ecology, Swansea University's Dr Miguel Lurgi paired up with researchers from the Coastal Marine Research station in Chile, who compiled data on species composition and three types of ecological interactions occurring between species in rocky intertidal communities across 970 km of shoreline of the South American Pacific coast. They analysed the environmental drivers of the structure of networks of ecological interactions between species in marine intertidal communities.

The team calculated a suite of properties across networks of different ecological interaction types: trophic, competitive, and positive (e.g. mutualistic). Using these network properties, they then investigated potential environmental drivers of this multivariate network organisation. These included variation in sea surface temperature and coastal upwelling, the main drivers of productivity in nearshore waters.

Their results suggest that structural properties of multiplex ecological networks are affected by the number of species in the community and modulated by abiotic factors influencing productivity and environmental predictability, such as cold water upwelling events and long-term averages of sea surface temperature.

These effects are stronger on non-trophic negative ecological interactions, such as competition for space among organisms, than on the ecological relationships between consumers and their resources.

These findings highlight the need for a more complete understanding of the geographical variability of ecological interactions and the networks they form if researchers aim to predict the potential effects of environmental changes on ecological communities.

Dr Miguel Lurgi, a lecturer in biosciences at Swansea University and an author of the research, said:

"It is quite exciting that for the first time we get a complete picture of the environmental drivers of the organisation of multiplex ecological networks in marine intertidal communities. Furthermore, our work highlights the directionality of these changes.

"This information is very valuable to predict the effects of environmental change on the organisation of natural communities."

Credit: 
Swansea University

Widespread electric vehicle adoption would save billions of dollars, thousands of lives

EVANSTON, Ill. -- Northwestern University researchers have combined climate modeling with public health data to evaluate the impact of electric vehicles (EVs) on U.S. lives and the economy.

A new study found that if EVs replaced 25% of combustion-engine cars currently on the road, the United States would save approximately $17 billion annually by avoiding damages from climate change and air pollution. In more aggressive scenarios -- replacing 75% of cars with EVs and increasing renewable energy generation -- savings could reach as much as $70 billion annually.

"Vehicle electrification in the United States could prevent hundreds to thousands of premature deaths annually while reducing carbon emissions by hundreds of millions of tons," said Daniel Peters, who led the study. "This highlights the potential of co-beneficial solutions to climate change that not only curb greenhouse gas emissions but also reduce the health burden of harmful air pollution."

"From an engineering and technological standpoint, people have been developing solutions to climate change for years," added Northwestern's Daniel Horton, senior author of the study. "But we need to rigorously assess these solutions. This study presents a nuanced look at EVs and energy generation and found that EV adoption not only reduces greenhouse gases but saves lives."

The study was published online last week (August 13) in the journal GeoHealth.

During this research, Peters was an undergraduate researcher in Horton's laboratory at Northwestern. He now works for the Environmental Defense Fund. Horton is an assistant professor of Earth and planetary sciences in Northwestern's Weinberg College of Arts and Sciences.

To conduct the study, Horton, Peters and their team looked at vehicle fleet and emissions data from 2014. If 25% of U.S. drivers adopted EVs in 2014 -- and the power required to charge their batteries came from 2014's energy generation infrastructure -- then 250 million tons of carbon dioxide (CO2) emissions would have been mitigated. Although the impact of carbon emissions on the climate is well documented, combustion engines also produce other harmful pollutants, such as particulate matter and the precursors to ground-level ozone. Such pollutants can trigger a variety of health problems, including asthma, emphysema, chronic bronchitis and ultimately premature death.

After leaving tailpipes and smokestacks, pollutants interact with their environment, including background chemistry and meteorology.

"A good example is to look at nitrogen oxides (NOx), a group of chemicals produced by fossil-fuel combustion," Peters explained. "NOx itself is damaging to respiratory health, but when it's exposed to sunlight and volatile organic compounds in the atmosphere, ozone and particulate matter can form."

To account for these interactions, the researchers used a chemistry-climate model developed at the Geophysical Fluid Dynamics Laboratory. Jordan Schnell, a postdoctoral fellow in Horton's lab, performed the model experiments that simultaneously simulate the atmosphere's weather and chemistry, including how emissions from combustion engines and power generation sources interact with each other and other emissions sources in their environments.

With this model, the researchers simulated air pollutant changes across the lower 48 states, based on different levels of EV adoption and renewable energy generation. Then, they combined this information with publicly available county health data from the U.S. Environmental Protection Agency (EPA). This combination enabled the research team to assess health consequences from the air quality changes caused by each electrification scenario.

The research team assigned dollar values to the avoided climate and health damages that could be brought about by EV adoption by applying the social cost of carbon and value of statistical life metrics to their emission change results. These commonly used policy tools attach a price tag to long-term health, environmental and agricultural damages.

"The social cost of carbon and value of statistical life are much-studied and much-debated metrics," Horton said. "But they are regularly used to make policy decisions. It helps put a tangible value on the consequences of emitting largely intangible gases into the public sphere that is our shared atmosphere."

Credit: 
Northwestern University