Tech

Solar development: super bloom or super bust for desert species?

image: The rare Barstow woolly sunflower was more sensitive to solar development impacts than its common relative, the woolly daisy, in a study by UC Davis and UC Santa Cruz.

Image: 
Karen Tanner

Throughout the history of the West, people have often rushed decisions in the desert -- and those decisions backfired. In the 1920s, the Colorado River Compact notoriously overallocated water among several western states -- allocations still in use today -- because the measurements were taken during an unusually wet period.

More recently, operators of the massive Ivanpah Solar Electric Generating System in the Mojave Desert are spending around $45 million on desert tortoise mitigation after initial surveys undercounted the endangered animals before construction.

A study published in the journal Ecological Applications from the University of California, Davis, and UC Santa Cruz warns against another potential desert timing mismatch amid the race against climate change and toward rapid renewable energy development.

"Our study suggests that green energy and species conservation goals may come into conflict in California's Mojave Desert, which supports nearly 500 rare plant species as well as a rapidly expanding solar industry," said lead author Karen Tanner, who conducted the work as a Ph.D. student at UC Santa Cruz under a grant led by UC Davis assistant professor Rebecca R. Hernandez.

Tanner spent seven years teasing out the demography of two native desert flowers -- the rare Barstow woolly sunflower (Eriophyllum mohavense) and the common Wallace's woolly daisy (Eriophyllum wallacei) -- comparing their performance both in the open and under experimental solar panels. The authors wondered: How would desert-adapted plants respond to panels that block light and rainfall? Would rare species respond differently than common species to these changes?

These aren't easy questions to unearth. At one point, Tanner glued tiny seeds to individual toothpicks to gather emergence data. At another, she scoured the desert floor on hands and knees to count emerging seedlings of the rare sunflower -- about the size of a thumbnail at maturity.

SUPER BLOOM SURPRISES

Such painstaking commitment is one reason no previous studies have modeled species' responses to photovoltaics at the population level. Modeling little-known species in the elusive desert takes time, along with overcoming tricky logistical and mathematical challenges. What is nowhere in sight one year may thrive the next.

That element of surprise is what makes "super blooms" so special and so captivating. Those bursts of wildflowers blanket expanses of desert landscapes after especially wet years and are believed to be critical to the long-term persistence of desert annual populations.

The study found that solar panel effects on plant response were strongly influenced by weather and physical features of the landscape. During the 2017 super bloom, panel shade negatively affected population growth of the rare species, but had little effect on its common relative.

The study suggests that rare species may be more sensitive to solar development impacts than common species. It highlights the potential for solar panel effects to vary among species, as well as over space and time.
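
To make concrete what "modeling at the population level" means here, the sketch below implements a minimal annual-plant growth model of the sort such demographic studies build on. It is a simplified illustration, not the study's model, and every vital rate in it is a hypothetical placeholder.

```python
# Minimal annual-plant population model: seeds either stay dormant in the
# seed bank or germinate, survive to flowering, and produce new seeds.
# All vital rates are hypothetical placeholders, not values from the study.

def annual_lambda(germination, survival, fecundity, seed_survival):
    """Per-year population growth rate (lambda) for an annual plant."""
    dormant = (1 - germination) * seed_survival      # seeds that wait
    recruited = germination * survival * fecundity   # new seeds made this year
    return dormant + recruited

# Hypothetical scenario: panel shade halves fecundity of the rare species.
lam_open  = annual_lambda(germination=0.3, survival=0.5, fecundity=8, seed_survival=0.4)
lam_shade = annual_lambda(germination=0.3, survival=0.5, fecundity=4, seed_survival=0.4)

print(f"rare species lambda: open={lam_open:.2f}, shade={lam_shade:.2f}")
# lambda > 1 means the population grows; lambda < 1 means it declines --
# which is how shade effects in a wet "super bloom" year can matter most.
```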

A QUESTION OF TIME

The study provides an example of the importance of taking the necessary time to understand an ecosystem before irrevocably changing it.

"The desert -- and many other biomes -- don't respond on our timescales," said Hernandez, co-director of the Wild Energy Initiative through the UC Davis John Muir Institute. "If we want to understand them, we need to study them on the timescales they operate. Otherwise, it is like taking a photo of a moving train and calling it a shipping container. Racing to build renewable energy in places that have already been skinned of their biology makes sense -- let's not wait to put solar on existing rooftops. But in natural environments, we need to listen and observe first."

Credit: 
University of California - Davis

Revealing the secret cocoa pollinators

image: Landscapes in Central Sulawesi, Indonesia, dominated by cocoa plantations

Image: 
Manuel Toledo

The importance of pollinators in ensuring successful harvests and thus global food security is widely acknowledged. However, the specific pollinators of even major crops - such as cocoa - have not yet been identified, and many questions remain about sustainability, conservation and plantation management to enhance their populations and, thereby, pollination services. Now an international research team based in Central Sulawesi, Indonesia, and led by the University of Göttingen has found that ants and flies - but not ceratopogonid midges, as was previously thought - appear to have a crucial role to play. In addition, they found that promoting biodiversity-friendly landscapes, leaf-litter and shade trees in agroforestry systems was important for enhancing the tiny cocoa pollinators. The research was published in Biological Conservation.

The team, in collaboration with Tadulako University in Palu, carried out two separate experiments involving 42 cocoa agroforestry farms in the Napu Valley of Central Sulawesi. The work included applying a sticky glue to over 15,000 flowers in more than 500 trees for an eight-month period and recording the identity and abundance of captured flower visitors. In one experiment involving 18 farms, they investigated the effect of the distance between the forest and the farm, and the amount of canopy cover from shade trees, on the abundance of the main pollinators. In the second experiment in 24 different cocoa farms, they measured the effect of leaf-litter management on pollinators. In both experiments they quantified the amount of forest and agroforests surrounding the 42 cocoa farms.
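
The release does not say how the team analyzed these data, but as a purely illustrative sketch, visitor counts from such a design could be modeled with a Poisson regression on forest distance and canopy cover. The farm data below are invented.

```python
# Hypothetical illustration: Poisson regression of flower-visitor counts on
# forest distance and canopy cover. Data are invented, not from the study.
import pandas as pd
import statsmodels.formula.api as smf

farms = pd.DataFrame({
    "visitors":      [14, 9, 22, 5, 17, 8],           # captured flower visitors
    "forest_dist_m": [100, 800, 50, 1200, 200, 900],  # distance to forest edge
    "canopy_pct":    [60, 30, 75, 20, 55, 40],        # shade-tree canopy cover
})

model = smf.poisson("visitors ~ forest_dist_m + canopy_pct", data=farms).fit()
print(model.summary())
# A negative forest_dist_m coefficient would mean fewer visitors on farms
# farther from forest; a positive canopy_pct coefficient would favor shade.
```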

The researchers found that ants were the most common flower visitors. This highlighted their potential as pollinators, whether directly (by transporting pollen) or indirectly (by disturbing pollinators and promoting their movement). The study also shows that preserving biodiversity-friendly landscapes, such as forests and agroforests, and promoting agroforestry systems is crucial for pollinator conservation. This in turn promotes pollination and sustainable cocoa production. "We were surprised that we did not capture any ceratopogonid midges, even though these tiny midges were considered the most important pollinators of cocoa. This emphasizes that cocoa pollinators are more diverse than previously known but also that there is still much to learn," said Dr Manuel Toledo-Hernández, from the University of Göttingen and first author of the study. "Current global cocoa initiatives should consider the role of biodiversity-friendly habitats for the conservation of pollinators, because their pollination services are an ecological alternative towards current commitments on combining high yields with conservation," added Toledo-Hernández and his coauthors Teja Tscharntke and Thomas C. Wanger.

Credit: 
University of Göttingen

Development of microsatellite markers for censusing of endangered rhinoceros

image: A Sumatran rhino is seen at the Sumatran Rhino Sanctuary in Way Kambas National Park in Indonesia

Image: 
Bertha Letizia

Today, the Sumatran rhinoceros (Dicerorhinus sumatrensis) is critically endangered, with fewer than 100 individuals surviving in Indonesia on the islands of Sumatra and Borneo. To ensure survival of the threatened species, accurate censusing is necessary to determine the genetic diversity of remaining populations for conservation and management plans.

A new study reported in BMC Research Notes characterized 29 novel polymorphic microsatellite markers -- repetitive DNA sequences -- that serve as a reliable censusing method for wild Sumatran rhinos. The study was a collaborative effort involving the University of Illinois Urbana-Champaign, the Eijkman Institute for Molecular Biology in Indonesia, Queen's University in Canada, and the San Diego Zoo.

"It's hard to do population censusing for this species because there's not a ton of them and they're very elusive so it's hard to figure out how many there are," said Jessica Brandt, former PhD student in the Roca lab who led the study. "We were looking for ways to do that without handling the species. This was a joint effort between groups of people who were interested in working on these endangered species and contributing to their management."

Sumatran rhinos live in dense rainforests that are hard to traverse, making it difficult to track their populations. The researchers relied on fecal DNA collected from Sumatran rhino dung samples, requiring little interaction with individuals in the wild. Although dung sampling offers many benefits, fecal DNA can be degraded and the age of the samples hard to determine. To overcome these challenges, the researchers designed optimized microsatellite markers that were short and easy to amplify from dung samples.

"Microsatellite markers are found in non-coding regions and because of that, they evolve pretty quickly," said Brandt. "They're really useful in populations where you want to identify individuals because you're going to see more variation at those particular markers than if you're using a protein-coding gene."

"During replication of the DNA, these markers can very easily expand or contract like a genomic accordion" said University of Illinois Urbana-Champaign animal sciences professor Alfred Roca, also a member of the Carl R. Woese Institute for Genomic Biology. "By looking at enough of these markers, you can distinguish animals because microsatellites evolve very quickly and are highly variable within species. These markers are ideal to target regions of the rhino genome that are highly variable."

Using high-quality DNA sequences from captive Sumatran rhinos, the researchers identified 29 polymorphic candidate loci for further optimization. To test their utility for censusing, 13 of the 29 markers were randomly selected and tested on fecal samples collected from wild Sumatran rhinos. The researchers were able to amplify nine of the markers from 11 wild fecal samples.

"The combination of these markers yielded better statistical power for identifying Sumatran rhino individuals and amplified very well when tested using non-invasive samples," said Sinta Saidah, co-author and research assistant at the Eijkman Institute for Molecular Biology in Indonesia. "We hope that we can use these markers on more samples collected in the field to provide island-wide population data for Sumatran rhinoceros species, which will help us devise better conservation strategies for this critically endangered species."

"To make a conservation plan, you have to know who's there and what their current level of diversity is," said Brandt. "Our markers would allow the Indonesian officials to determine not just how many rhinos they can count but whether or not they're related. Ultimately, another goal would be to expand this research to include other endangered rhinoceros species."

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Plastic pollution in the deep sea: A geological perspective

image: Pollutants, including plastic, reach deep-sea fans through linked sediment routing systems, as well as from outside the associated catchment(s), via near-shore and shelfal currents (i.e., littoral cells), eolian transport, surface currents, and direct input from oceanic sources such as shipping and fishing.

Image: 
I.A. Kane and A. Fildani (modified from Hessler and Fildani [2019])

Boulder, Colo., USA: A new focus article in the May issue of Geology summarizes research on plastic waste in marine and sedimentary environments. Authors I.A. Kane of the Univ. of Manchester and A. Fildani of the Deep Time Institute write that "Environmental pollution caused by uncontrolled human activity is occurring on a vast and unprecedented scale around the globe. Of the diverse forms of anthropogenic pollution, the release of plastic into nature, and particularly the oceans, is one of the most recent and visible effects."

The authors cite multiple studies, including one in the May issue by Guangfa Zhong and Xiaotong Peng, discussed in a previous GSA press release when it was published online ahead of print (26 Jan. 2021). Zhong and Peng were surprised to find plastic waste in a deep-sea submarine canyon located in the northwestern South China Sea.

"Plastic is generally considered to be the dominant component of marine litter, due to its durability and the large volume produced," write Kane and Fildani. "Nano- and microplastics are a particularly insidious form of anthropogenic pollutant: tiny fragments and fibers may be invisible to the naked eye, but they are ingested with the food and water we consume and absorbed into the flesh of organisms."

One of their vital questions is, "If some plastics can survive for >1000 years in terrestrial environments, how long do they last in ocean trenches that are kilometers deep, dark, cold, and at high pressure? How long does it take macroplastics to break down into microplastics and nanoplastics in the deep sea?"

"While it is incumbent on policy makers to take action now to protect the oceans from further harm, we recognize the roles that geoscientists can play," write Kane and Fildani. That includes using their deep-time perspective to address the societal challenges, their understanding of the present-day distribution on the seafloor and in the sedimentary record, using geoscience techniques to record the downstream effects of mitigation efforts, and to predict the future of seafloor plastics.

In summary, they write, "We understand ... the transient nature of the stratigraphic record and its surprising preservation, and the unique geochemical environments found in deep-sea sediments. Our source-to-sink approach to elucidate land-to-sea linkages can identify the sources and pathways that plastics take while traversing natural habitats and identify the context in which they are ultimately sequestered, and the ecosystems they affect. This will happen by working closely with oceanographers, biologists, chemists, and others tackling the global pollution problem."

Credit: 
Geological Society of America

Personalised medications possible with 3D printing

Customised medicines could one day be manufactured to patients' individual needs, with University of East Anglia (UEA) researchers investigating technology to 3D 'print' pills.

The team, including Dr Andy Gleadall and Prof Richard Bibb at Loughborough University, identified a new additive manufacturing method that allows medicines to be 3D printed in highly porous structures, whose porosity can be used to regulate the rate at which the drug is released into the body when the medicine is taken orally.

Dr Sheng Qi, a Reader in Pharmaceutics at UEA's School of Pharmacy, led the research. The project findings, 'Effects of porosity on drug release kinetics of swellable and erodible porous pharmaceutical solid dosage forms fabricated by hot melt droplet deposition 3D printing', are published today in the International Journal of Pharmaceutics.

Dr Qi said: "Currently our medicines are manufactured in 'one-size-fits-all' fashion.

"Personalised medicine uses new manufacturing technology to produce pills that have the accurate dose and drug combinations tailored to individual patients. This would allow the patients to get maximal drug benefit with minimal side effects.

"Such treatment approaches can particularly benefit elderly patients who often have to take many different types of medicines per day, and patients with complicated conditions such as cancer, mental illness and inflammatory bowel disease."

The team's work, Dr Qi said, is building the foundation for the technology needed in future to produce personalised medicine at the point-of-care. She said 3D printing has the unique ability to produce porous pharmaceutical solid dosage forms on-demand.

Pharmaceutical 3D printing is a new research field that has developed rapidly over the past five years. The most commonly used 3D printing methods require the drug to be processed into spaghetti-like filaments prior to printing.

The team investigated a newly developed 3D printing method that can rapidly produce porous pharmaceutical tablets without the use of filaments. The results revealed that by changing the size of the pores, the speed of a drug escaping from the tablet into the body can be regulated.
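
As a hedged illustration of how pore size could translate into release rate, the sketch below uses the Korsmeyer-Peppas power law, a standard empirical model for swellable and erodible dosage forms. The constants are invented; the paper fits real dissolution data.

```python
# Korsmeyer-Peppas power law: M_t / M_inf = k * t^n, a standard empirical
# model for drug release from swellable/erodible matrices. Constants are
# hypothetical; larger pores are assumed to raise the rate constant k.

def fraction_released(t_hours, k, n=0.6):
    """Cumulative fraction of drug released by time t (capped at 1.0)."""
    return min(1.0, k * t_hours**n)

for label, k in [("small pores", 0.15), ("large pores", 0.30)]:
    profile = [fraction_released(t, k) for t in (1, 2, 4, 8)]
    print(label, [f"{f:.2f}" for f in profile])
# The large-pore tablet releases roughly twice as fast at every time point,
# which is the lever the researchers describe for tuning drug release.
```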

Further research will be required to use this porosity to tailor the dose and dosing frequency (i.e. once or twice daily) of medicine to each patient's needs, and to use the principle to build multiple medicines into a single daily poly-pill for patients on a complex medicine regimen.

Credit: 
University of East Anglia

Election campaigns: attacks and smearing backfire and can benefit other candidates

image: Vincenzo Galasso (Bocconi University)

Image: 
Paolo Tonato

Candidates often give in to the temptation to attack opponents in electoral campaigns through negative ads (more than 55% of the ads aired by the Clinton and Trump campaigns in 2016 were negative), even though evidence of this tactic's effectiveness is mixed at best. A study by Bocconi University professors Vincenzo Galasso, Tommaso Nannicini and Salvatore Nunnari, just published in the American Journal of Political Science, reveals the backlash of electoral smearing and shows that, in a three-candidate race, it is the "idle candidate" (the one neither attacking nor being attacked) who gains the upper hand.

During a three-candidate mayoral race in a mid-sized Italian town in 2015, the authors were able to randomize the door-to-door canvassing activity of volunteer supporters of one of the challengers to the incumbent mayor. A third of the 55 electoral precincts were canvassed by the volunteers with a positive message, a third with a negative message regarding the incumbent, and the remaining third received no information.

"We found a strong, positive spillover effect of negative campaigning on the idle candidate, whose vote share increased by 3.7% when the incumbent was attacked with a negative ad by the other challenger - a gain of about 13% with respect to the idle candidate's average vote share," Professor Galasso explains.

In the lab experiments involving 2,971 participants, the candidates were designed to have similar individual characteristics and no ideological differences. As in the field experiment, one of the three candidates delivered (by video) either positive or negative messages. In this case, the tone of the negative message was either neutral or aggressive.

When all three candidates campaigned positively, the vote share of the observed challenger was 29.4%. This dropped to 17% when this challenger attacked the incumbent in a neutral tone and to 14.9% when he used an aggressive tone. The main beneficiary of the change in preferences was the idle challenger, whose vote share grew from 35.9% under an all-positive campaign to 53% when the attack was delivered in a neutral tone and to 54.4% when it was aggressive.

Through a series of questions to the lab experiment participants, the authors were also able to capture the mechanism that leads to vote-shifting: negative (as opposed to positive) campaigning increases voters' beliefs that the attacker is competitive, rather than cooperative, that he would not be a good mayor, and that he is ideologically extreme.

Credit: 
Bocconi University

Technique to automatically discover simulation configurations for behaviors hard to test

image: Overview of the technique developed at the National Institute of Informatics, under the ERATO-MMSD project, to automatically search for simulation configurations that test various behaviors of automated driving systems.

Image: 
© National Institute of Informatics

The research team led by Fuyuki Ishikawa at the National Institute of Informatics (NII, Japan) developed a technique to search automatically for simulation configurations that test various behaviors of automated driving systems. This research was conducted under the ERATO-MMSD project (*1) funded by the Japan Science and Technology Agency (JST, Japan). The proposed technique iterates trials on simulations using an optimization method called evolutionary computation so that it discovers simulation configurations that lead to specific features of driving behaviors such as high acceleration, deceleration, and steering operation. The outcome of this research was presented at ICST 2021 (*2), a flagship conference on software testing held April 12-16, 2021.

Background

More attention is being focused on automated driving systems (ADS) and advanced driver-assistance systems. New car models with Level 3 autonomy are emerging -- ones that, under certain conditions, do not require human drivers to supervise the driving. However, the ADS functionality in practical use is limited to specific situations such as traffic jams on highways or fixed routes. Greater safety and reliability are required before ADS can be used in environments with an enormous range of situations, such as urban areas.

One of the key functions in ADS is path planning, which continuously updates the direction and speed by examining the surrounding environment including other cars and pedestrians. The path-planning functionality needs to handle not only safety but also multiple other aspects such as the extent of acceleration/deceleration, steering operation, and lane conformance.

Simulation-based testing is commonly used for ADS. A typical approach is for human testers to enumerate scenarios, for example: "the ego-car is going to take a right turn, but a car is approaching from the opposite direction." However, ADS behavior can differ within the same right-turn scenario -- for example, taking the turn without needing to brake, or decelerating and waiting for a long time before turning. It is essential to check the different behaviors an ADS can exhibit before deploying it in society. However, specific behaviors such as long deceleration are unlikely to occur when many simulations are simply run under configurations with different positions of other cars and so on. Moreover, an ADS has further specific behaviors, such as simultaneous occurrences of strong acceleration and large steering operations. Configuring simulations to cause such specific behaviors intentionally is very difficult.

Research Method and Outcome

In this research, we proposed a technique for test generation that automatically searches for simulation configurations leading to specific features of driving behaviors such as high acceleration and deceleration and high amounts of steering operation. We use an optimization technique called evolutionary computation, which repeats simulation trials to adjust configurations so that specified driving behaviors last for a long period of time. In this way, the technique can discover simulation configurations, such as the positions of other cars, leading to the desired features of driving behaviors.
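
A minimal sketch of that search loop follows. The simulate() stub stands in for the real driving simulator, and the toy fitness surface, mutation scheme and parameters are all invented for illustration; the paper's evolutionary computation is more elaborate.

```python
# Evolutionary search for simulation configurations (here: entry timings of
# two other cars) that maximize how long a target behavior, e.g. deceleration,
# lasts. simulate() is a stub for a real ADS simulation run.
import random

def simulate(config):
    """Stub: return seconds of deceleration observed for this configuration.
    A real implementation would run the path planner in a driving simulator."""
    t1, t2 = config
    return max(0.0, 5.0 - abs(t1 - 2.0) - abs(t2 - 3.5))   # toy fitness surface

def mutate(config, sigma=0.3):
    return tuple(max(0.0, t + random.gauss(0, sigma)) for t in config)

population = [(random.uniform(0, 6), random.uniform(0, 6)) for _ in range(20)]
for _ in range(50):                          # generations
    ranked = sorted(population, key=simulate, reverse=True)
    parents = ranked[:5]                     # keep the best configurations
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

best = max(population, key=simulate)
print(f"best car-entry timings: {best}, deceleration sustained: {simulate(best):.2f}s")
```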

The proposed technique also avoids generating only simulation configurations that lead to dangerous situations such as collisions. It therefore reveals features of driving behaviors not limited to emergency situations. In addition, it can search for and trigger combinations of behaviors, such as simultaneous occurrences of high acceleration and large steering operations.

We applied the test generation technique to a path-planning program provided by Mazda (*3) and evaluated it. The technique could generate specific behaviors that were rarely caused in random simulations. For example, it generated strong acceleration together with large steering operations, as well as high acceleration following high deceleration, in a scenario for a right turn at an intersection. These cases occurred only with very specific timings of other cars entering the intersection. In this way, we showed the technique can intentionally trigger combinations of specific behaviors via simulation configurations that are very difficult for human engineers to design.

Future outlook

This research was conducted in the JST ERATO-MMSD project. In the project, we investigated other techniques for discovering simulation scenarios that lead to crashes (*4), techniques that explain the causes of crashes (*5), and techniques that fix behaviors to avoid the detected crashes (*6). The present research aimed to increase confidence in system safety by checking "various situations," in addition to the techniques for detecting and fixing problematic behaviors. Thus, we established a comprehensive approach for testing ADS with both tests for detecting problems and tests for checking diverse cases, as has long been done for conventional software programs.

Late 2020 featured a competition of test generation tools for advanced driver-assistance systems (ADAS), held in conjunction with the SBST Workshop (*7) in May 2021. The ERATO-MMSD project submitted a tool called Frenetic (*8) to the competition. Frenetic achieved strong results in terms of the rate of generated failure cases and their diversity, drawing directly on the aforementioned research experience.

We provided comprehensive testing techniques for ADS. Although we used the program provided by Mazda in our evaluations, the techniques are generic and can be tailored for the specific demands of each automotive company. For example, we can adjust the techniques to the emerging framework called responsibility-sensitive safety proposed by Intel and Mobileye. We will endeavor to make our techniques available by tailoring them for emerging international standards as well as the demands from each automotive company.

Comment by Fuyuki Ishikawa

"We have conducted active research on the path-planning component through collaboration with Mazda. We have established a holistic set of testing and debugging techniques, including the aforementioned one, by adapting techniques for conventional program code. The key of these techniques is to search for solutions such as "desirable tests" and "desirable fix actions." We will extend and empirically validate the techniques given emerging standards as well as different demands in each ADS application."

Credit: 
Research Organization of Information and Systems

Algorithms improve how we protect our data

image: Prof. Yongjune Kim, DGIST

Image: 
DGIST

Daegu Gyeongbuk Institute of Science and Technology (DGIST) scientists in Korea have developed algorithms that more efficiently measure how difficult it would be for an attacker to guess secret keys for cryptographic systems. The approach they used was described in the journal IEEE Transactions on Information Forensics and Security and could reduce the computational complexity needed to validate encryption security.

"Random numbers are essential for generating cryptographic information," explains DGIST computer scientist Yongjune Kim, who co-authored the study with Cyril Guyot and Young-Sik Kim. "This randomness is crucial for the security of cryptographic systems."

Cryptography is used in cybersecurity for protecting information. Scientists often use a metric, called 'min-entropy', to estimate and validate how good a source is at generating the random numbers used to encrypt data. Data with low entropy is easier to decipher, whereas data with high entropy is much more difficult to decode. But it is difficult to accurately estimate the min-entropy for some types of sources, leading to underestimations.

Kim and his colleagues developed an offline algorithm that estimates min-entropy based on a whole data set, and an online estimator that only needs limited data samples. The accuracy of the online estimator improves as the amount of data samples increases. Also, the online estimator does not need to store entire datasets, so it can be used in applications with stringent memory, storage and hardware constraints, like Internet-of-things devices.
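
Min-entropy itself is defined as H_min = -log2(p_max), where p_max is the probability of the source's most likely output. The sketch below illustrates only the offline/online distinction; it is not the authors' estimator, whose 500-fold speedup comes from more sophisticated techniques.

```python
# Offline vs. online min-entropy estimation, H_min = -log2(p_max).
# Conceptual sketch only -- not the algorithm from the paper.
import math
from collections import Counter

def offline_min_entropy(samples):
    """Offline: scan a complete dataset and use the mode's frequency."""
    counts = Counter(samples)
    p_max = counts.most_common(1)[0][1] / len(samples)
    return -math.log2(p_max)

class OnlineMinEntropy:
    """Online: update symbol counts one sample at a time, never storing the
    dataset itself -- the property that suits memory-constrained IoT devices."""
    def __init__(self):
        self.counts = Counter()
        self.n = 0
    def update(self, symbol):
        self.counts[symbol] += 1
        self.n += 1
    def estimate(self):
        return -math.log2(max(self.counts.values()) / self.n)

stream = [0, 1, 1, 2, 3, 1, 0, 2] * 100
online = OnlineMinEntropy()
for s in stream:
    online.update(s)

# Both estimates agree as data accumulate (~1.42 bits here, p_max = 3/8).
print(offline_min_entropy(stream), online.estimate())
```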

"Our evaluations showed that our algorithms can estimate min-entropy 500 times faster than the current standard algorithm while maintaining estimation accuracy," says Kim.

Kim and his colleagues are working on improving the accuracy of this and other algorithms for estimating entropy in cryptography. They are also investigating how to improve privacy in machine learning applications.

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

Strategic formulation of common cement could have a big impact on water purification

image: Open channels that carry water can be made with a new cement formulation that will purify the water via sun exposure as it flows toward its destination.

Image: 
U.S. Geological Survey Department of the Interior/USGS

HOUSTON - Researchers have found the right formula for mixing a cement that does double duty as a structural material and a passive photocatalytic water purifier with a built-in means of replenishment: simply sand down the material's surface to refresh the photocatalytic quality.

They found this recipe using a few very precise physical laboratory experiments whose data were then greatly amplified using a combinatorial computational method that tested thousands of combinations of cement composites and their photocatalytic qualities.
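
As an illustration of the combinatorial idea only (not C-Crete's actual code), one can enumerate candidate formulations and rank them with a response model calibrated on the lab measurements. Everything below, from the loadings to the scoring function, is hypothetical.

```python
# Enumerate cement x photocatalyst x loading combinations and rank them by a
# stand-in photocatalytic score. In the study, a calibrated model amplified a
# handful of precise experiments into thousands of evaluated combinations.
from itertools import product

cements = ["white_portland", "volcanic_ash_A", "volcanic_ash_B",
           "photoactive", "ordinary_portland"]
tio2_types = [f"TiO2_{i}" for i in range(1, 10)]   # nine photocatalyst variants
loadings = [1, 3, 5, 10]                           # hypothetical wt% TiO2

def score(cement, tio2, loading):
    """Stand-in for a response model fitted to the lab data."""
    base = 0.0 if cement == "ordinary_portland" else 1.0  # OPC showed no effect
    variant = int(tio2.split("_")[1])                     # 1..9
    return base * loading * (1 + 0.05 * variant)          # toy variant effect

ranked = sorted(product(cements, tio2_types, loadings),
                key=lambda combo: score(*combo), reverse=True)
print(f"{len(ranked)} combinations screened; best: {ranked[0]}")
```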

The results, say the researchers from C-Crete Technologies and Rice University, indicate that the ingredients themselves are more important than the molecular structure of the cement or the particle size of the photocatalyst used. This offers an important finding for cement -- making concrete and its primary ingredient, cement, more eco-friendly is a goal of much ongoing research -- and the methodology also holds promise for developing other environmentally friendly, multifunctional materials.

The findings could be applied in cement used in "roadways, bayous, canals, parking lots, anywhere that water washing over concrete's surface is exposed to sun," says Rouzbeh Shahsavari, president of C-Crete Technologies and lead author of the paper, which appeared online April 27 in the journal Langmuir. "Since experiments are typically costly, difficult and time consuming, the exciting part of this work is that we can now analyze limited experimental results with our novel combinatorial approach and still obtain meaningful insights and correlations that would have been conventionally obtained by hundreds or thousands of experiments."

"Broadly, this method can be applied to not just cements or water purification applications but other areas in materials discovery that face limited and sparse experimental data. We have developed a platform technology that helps development teams to decrease the costs of their R&D and come up with completely novel materials with unprecedented and multifunctional properties." says Shahsavari.

The researchers used five types of readily available cement, nine types of the photocatalyst titanium dioxide and two common pollutants -- methyl orange, a cancer-causing substance, and dioxane, a possible carcinogen -- both of which are found in drinking water. Of the five cements, white Portland cement, two types of volcanic ash-based Portland cement, and a commercially available photoactive cement all proved to have the replenishable photocatalytic quality. The most common variety, ordinary Portland cement, did not.

"Basically, tweaking cement composition including its belite and ferrite will go a long way for photocatalysis while the mechanical properties remain essentially unchanged. " says Shahsavari.

Credit: 
C-Crete Technologies

7T brain scans reveal potential early indicator of Alzheimer's

DALLAS (May 3, 2021) - Researchers from the Center for BrainHealth® at The University of Texas at Dallas are investigating a potential new early indicator of the decline toward Alzheimer's disease: measuring the energy metabolism of the living human brain using cutting-edge imaging techniques.

The scientists devised a unique way to illustrate energy consumption and reserves in the brain with phosphorus magnetic resonance spectroscopy using an ultra-high-field 7 Tesla MRI scanner. Their results suggest that neurological energy metabolism might be compromised in mild cognitive impairment (MCI), the stage of decline between healthy aging and more serious disease states like dementia and Alzheimer's.

Dr. Namrata Das, PhD'20, a program specialist and research neuroscientist in the School of Behavioral and Brain Sciences, is lead author of the study published online April 6 in Frontiers in Neuroscience.

"Much of what we know about cognitive decline at the molecular level comes from post-mortem brain examinations or animal models," said Das, who also holds a medical doctorate and master of public health degree. "What we set out to do was monitor in real time the biological mechanisms that cause this decline in humans to better understand the multiple factors involved."

Senior author Sandra Bond Chapman, PhD, chief director of the Center for BrainHealth at UT Dallas, said the results demonstrate "new pathways to advance discovery."

"This research provides a promising new way to elucidate the brain's health -- or early disturbance of its health -- due to changes in metabolism. The new approach is the utilization of 7 Tesla magnetic resonance imaging, a noninvasive, safe technology," said Chapman, the Dee Wyly Distinguished University Chair in BrainHealth. "It has exciting implications for early detection of Alzheimer's disease and the potential to measure disease response to treatments."

Although Alzheimer's disease was first defined more than a century ago, treatment remains elusive. According to Das, this is because "multiple mechanisms become abnormal, causing a cascade of events, and we don't know which comes first.

"Most current research is focused on accumulation of beta-amyloid and tau protein in the brain. Here, we're trying to learn if there are other early markers that can be tracked live via imaging. We hope that our findings, when integrated with measurements of tau and beta-amyloid, will give more profound information."

The researchers theorize that the energy level disturbance occurs early in Alzheimer's disease, based on prior post-mortem work that indicated the metabolism deficit is lower in earlier stages of Alzheimer's than it is in severe cases.

"That research set the path we are on to answer these questions with imaging technology," Das said.

The current study was conducted at the Advanced Imaging Research Center (AIRC), a collaborative facility shared by UT Dallas and other North Texas institutions and located on the UT Southwestern Medical Center campus. The facility houses several MRI scanners that operate at magnetic fields up to 7 Tesla (7T) for human studies. MRIs using such strong magnets -- the magnet in a 7T machine is more than twice as powerful as 3T clinical MRIs -- can illuminate metabolic processes and provide unprecedented detail in the resulting images.

In the study, 41 participants -- 15 cognitively normal, 15 with MCI and 11 with early Alzheimer's -- underwent assessment of executive function, memory, attention, visuospatial skills and language. The 7T MRI scans focused on measuring ratios between the energy molecules adenosine triphosphate (ATP) and phosphocreatine (PCr), and inorganic intracellular phosphate.

"Most of the energy in a cell is coming from the mitochondria," Das said. "It is theorized that mitochondrial dysfunction occurs early in Alzheimer's disease and that ATP and PCr are not synthesized properly. With 3T MRI, we could not see these molecular levels accurately. The 7T gets us there."

The researchers' scans of the participants' temporal lobes indicated that the ratio of PCr to ATP -- which Das referred to as the energy reserve index -- correlated with the participants' cognition levels.
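
Conceptually, the analysis reduces to computing each participant's PCr-to-ATP ratio and testing how it tracks cognition. Below is a minimal sketch with fabricated numbers; the study's measurements came from 7T phosphorus spectroscopy, not from values like these.

```python
# Energy reserve index (PCr/ATP) vs. cognition -- a conceptual sketch with
# fabricated data, standing in for the study's 7T 31P-MRS measurements.
from scipy.stats import pearsonr

pcr       = [1.9, 1.8, 1.7, 1.5, 1.4, 1.2]   # phosphocreatine signal (a.u.)
atp       = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]   # ATP signal (a.u.)
cognition = [29, 28, 26, 24, 22, 19]          # hypothetical screening scores

reserve_index = [p / a for p, a in zip(pcr, atp)]
r, p_value = pearsonr(reserve_index, cognition)
print(f"PCr/ATP vs cognition: r={r:.2f}, p={p_value:.3f}")
# A positive correlation would mirror the reported pattern: lower energy
# reserve in MCI, and lower still in Alzheimer's disease.
```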

"The energy reserve was lower in patients with mild cognitive impairment and lower still in those with Alzheimer's," she said. "We believe this is the first paper to confirm that energy reserve decreases in MCI, in many cases, years before Alzheimer's disease sets in."

While 7T MRI machines are not yet widely available for routine clinical evaluation of patients, Das said that the techniques used in the research study could be adapted to more commonly available 3T machines.

"The technology is evolving in such a way that we may soon be able to modify what we see on 7T scans to be detected with 3T, and 3T is available everywhere," she said. "We can tweak some of the MRI parameters we use to acquire these images with 3T, as has been done with proton spectroscopy. We hope this can be accomplished within the next few years."

In the future, the research team intends to combine this energy-level biomarker with positron emission tomography scans that measure beta-amyloid and tau protein, the most widely known markers of Alzheimer's disease. Meanwhile, Das will continue her research on using MRI to find novel neuroimaging markers at Harvard Medical School's McLean Imaging Center beginning July 1.

"We hope to determine if the abnormal brain energy metabolism has a relationship with the accumulation of beta-amyloid and tau," Das said. "Researchers have hypothesized for years that such metabolism shortfalls might precede such accumulations, but only now, with 7T, do we have the modality to find out."

Credit: 
Center for BrainHealth

Atrial fenestration during AVSD repair is associated with increased mortality

Boston, MA (May 2, 2021) - A new study, presented today at the AATS 101st Annual Meeting, shows an association between leaving an atrial communication at biventricular repair of unbalanced atrioventricular septal defect (AVSD) and decreased survival at five years, after adjusting for other known risk factors. During repair of AVSD, surgeons may leave an atrial level shunt when they have concerns about postoperative pulmonary hypertension, a hypoplastic right ventricle (RV) or hypoplastic left ventricle (LV), or as part of their routine practice. The study sought to determine factors associated with mortality after biventricular repair of AVSD.

The study included 581 patients enrolled from 31 Congenital Heart Surgeons' Society (CHSS) institutions from Jan. 1, 2012 - June 1, 2020. Parametric multiphase hazard analysis was used to identify factors associated with mortality. A random effect model was used to account for possible inter-institutional variability in mortality.
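
The release names a parametric multiphase hazard model with a random institutional effect. As a simplified stand-in for readers who want to see the shape of such an analysis, the sketch below fits a Cox model with errors clustered by institution using the lifelines library; the column names and all data are hypothetical.

```python
# Simplified stand-in for the survival analysis: Cox regression on
# fenestration status with robust errors clustered by institution.
# (Not the study's parametric multiphase hazard model; data are invented.)
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years":       [5.0, 2.1, 5.0, 0.8, 5.0, 3.3],  # follow-up time
    "died":        [0,   1,   0,   1,   0,   1],     # event indicator
    "fenestrated": [0,   1,   0,   1,   1,   0],     # atrial fenestration left
    "institution": [1,   1,   2,   2,   3,   3],     # enrolling center
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died",
        cluster_col="institution", formula="fenestrated")
cph.print_summary()   # a hazard ratio > 1 for 'fenestrated' would indicate
                      # higher mortality, echoing the reported association
```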

An atrial fenestration was placed during biventricular repair in 23 percent of patients. Overall, five-year survival after repair was 91 percent. The atrial fenestration group had 83 percent five-year survival, compared with 93 percent in the non-fenestrated group. According to Connor Callahan, the John W. Kirklin/David Ashburn Fellow of the Congenital Heart Surgeons' Society Data Center and University of Toronto and a general surgery resident at Washington University/Barnes-Jewish Hospital, "Atrial fenestration was associated with reduced long-term survival in our study. As a result of these findings, institutions and surgeons that routinely use fenestration may reevaluate its role in their practice."

While leaving an atrial communication at biventricular repair of unbalanced AVSD is associated with significantly reduced long-term survival after adjusting for other known risk factors, it is unknown whether this association is causal, related to unmeasured factors, or a combination of both. The impact of fenestration would optimally be determined by a future randomized clinical trial.

Credit: 
American Association for Thoracic Surgery

Study: Nurses' physical, mental health connected to preventable medical errors

video: A new study by The Ohio State University College of Nursing found that critical care nurses nationwide reported alarmingly high levels of stress, depressive symptoms and anxiety even before the COVID-19 pandemic began. These factors correlated with an increase in self-reported medical errors.

Image: 
The Ohio State University College of Nursing

A study led by The Ohio State University College of Nursing finds that critical care nurses in poor physical and mental health reported significantly more medical errors than nurses in better health.

The study, which was conducted before the COVID-19 pandemic, also found that "nurses who perceived that their worksite was very supportive of their well-being were twice as likely to have better physical health."

The study findings were published today in the American Journal of Critical Care.

"It's critically important that we understand some of the root causes that lead to those errors and do everything we can to prevent them," lead author Bernadette Melnyk said. She serves as vice president for health promotion, chief wellness officer and dean of the College of Nursing at Ohio State.

The authors cited research on the prevalence of stress, anxiety, depression and burnout symptoms among critical care nurses as a basis for examining the potential correlation between well-being and medical errors. The study surveyed nearly 800 members of the American Association of Critical-Care Nurses.

"It's clear that critical care nurses, like so many other clinicians, cannot continue to pour from an empty cup," Melnyk said. "System problems that contribute to burnout and poor health need to be fixed. Nurses need support and investment in evidence-based programming and resources that enhance their well-being and equip them with resiliency so they can take optimal care of patients."

Study findings included:

Of those surveyed, 61% reported suboptimal physical health, while 51% reported suboptimal mental health.

Approximately 40% screened positive for depressive symptoms and more than 50% screened positive for anxiety.

Those who reported worse health and well-being had a 31% to 62% higher likelihood of making medical errors.

Nurses who reported working in places that provided greater support for wellness were more than twice as likely to have better personal health and professional quality of life compared with those whose workplace provided little or no support.

The Ohio State Wexner Medical Center has several programs to promote clinician well-being, including its Employee Assistance Program, which offers confidential mental health resources and services such as counseling and mindfulness coaching, and its Stress, Trauma and Resilience (STAR) Program, which offers the Buckeye Paws pet therapy program to help build coping and resiliency skills.

The authors mention that levels of stress, anxiety and depression are likely even higher in the current environment than before the pandemic, when the study was conducted.

"The major implication of this study's findings for hospital leaders and policy makers is that critical care nurses whose well-being is supported by their organizations are more likely to be fully engaged in patient care and make fewer medical errors, resulting in better patient outcomes and more lives saved," the researchers wrote.

Credit: 
MediaSource

Low profile thoracic aortic endograft device reduces complications and expands patient pool

Boston, MA (May 1, 2021) - Preliminary results of a clinical trial, presented today at the AATS 101st Annual Meeting, showed that a new, low-profile thoracic aortic endograft is safe and effective in the treatment of descending thoracic aortic aneurysms and penetrating atherosclerotic ulcers (PAUs). A multi-disciplinary team, led by cardiac and vascular surgeons as co-investigators, conducted the study in 36 centers in the United States and Japan, enrolling patients between 2016 and 2019.

The trial aimed to measure safety and efficacy of the RELAY®Pro endovascular device, a second-generation product featuring a dramatically reduced profile and a non-bare stent (NBS) configuration.

The prospective, international, non-blinded, non-randomized pivotal trial analyzed a primary safety endpoint of major adverse events (MAE) at 30 days (death, myocardial infarction, stroke, renal/respiratory failure, paralysis, bowel ischemia, procedural blood loss) and a primary effectiveness endpoint of treatment success at one year (technical success, patency, absence of aneurysm rupture, type I/III endoleaks, stent fractures, secondary interventions, aneurysm expansion, and migration). Treatment success at one year was 89.2 percent.

"With a 3 to 4 French profile reduction, this second generation thoracic endograft device met the one year safety and effectiveness endpoints in a pivotal study for the treatment of patients with aneurysms of the descending thoracic aorta or PAUs," explained Dr. Wilson Szeto, Professor of Surgery at the Hospital of the University of Pennsylvania and Penn Presbyterian Medical Center. "It is particularly positive to note that the reduction in profile and the availability of a non bare stents configuration will expand the population of patients who can be treated with the device and reduces complications."

In this study, the vast majority of patients in the United States were treated with a percutaneous approach, which can dramatically reduce surgical complications associated with higher profile devices requiring surgical cutdown for deployment. At one year follow up, patients demonstrated a low risk of mortality, endoleak or structural integrity concerns. Follow-up continues to five years and the device is currently being evaluated for approval by the FDA.

Credit: 
American Association for Thoracic Surgery

A glimmer of hope: New weapon in the fight against liver diseases

image: a) The γ-sEVs show increased levels of annexin A1, lactotransferrin, and aminopeptidase N; they increase counts of anti-inflammatory macrophages with high motility and phagocytic ability, which enhance repair of damaged tissue, and they induce regulatory T cells and fibrolysis. b) The γ-sEVs also reduce inflammation and downregulate fibrogenesis.

Image: 
Niigata University

Niigata, Japan - Researchers from Niigata University, the University of Tokyo, Osaka University and Tokyo Medical University, Japan, have developed a new approach that could revolutionize the treatment, prevention, and possibly reversal of the damage caused by liver diseases. This novel strategy exploits small extracellular vesicles (sEVs) derived from interferon-γ (IFN-γ) pre-conditioned mesenchymal stromal cells (MSCs), termed γ-sEVs.

Cirrhosis and other chronic liver diseases remain a global health concern, with close to 2 million deaths reported annually -- approximately 3.5% of worldwide deaths each year. These statistics are largely driven by the fact that 75 million of the 2 billion people who consume alcohol worldwide are diagnosed with alcohol-use disorders and are at risk of developing alcohol-induced liver disease. In addition, the prevalence of viral hepatitis-induced liver damage continues to rise.

These sobering numbers inspired a team of scientists led by Prof. Shuji Terai of the Division of Gastroenterology and Hepatology, Graduate School of Medical and Dental Sciences, Niigata University, to build upon previous knowledge that livers with advanced cirrhosis often lose the ability to control fibrosis -- the development of fibrous connective tissue as a reparative response to injury or damage. In an interview, Prof. Terai said, "Our results showed that modified extracellular vesicles can become a new therapeutic strategy for liver cirrhosis."

Since clinically advanced cirrhosis is an end-stage disease that can effectively be treated only by liver transplantation at present, there is a race in the field with many scientists developing targeted therapies for modulating fibrosis and aiding tissue regeneration.

One of the most popular approaches is cell therapy, where mesenchymal stromal cells (MSCs) and macrophages have shown potential for inducing liver fibrosis regression. The popularity of this approach is centered on its cost-effectiveness: MSCs are obtainable not only from bone marrow but also from medical waste, including umbilical cord tissue, adipose tissue, and dental pulp. Apart from their ready availability, MSCs can also be grown relatively easily in the lab. Furthermore, rather than acting directly by replacing damaged tissues, MSCs have previously been shown to act as medical signaling cells that indirectly produce cytokines, chemokines, growth factors, and exosomes crucial for repairing and regenerating damaged tissue. Over the years, considerable progress has been made toward capacity building for research and clinical trials of novel treatment strategies against liver diseases. This is exemplified by previous demonstrations that MSCs exert anti-inflammatory, anti-fibrotic, and anti-oxidative effects through these humoral factors. Although tissue rejection is one of the barriers to cell/tissue transplantation interventions, MSCs are attractive for possessing low immunogenicity, which facilitates their use in both autologous (same individual) and allogeneic (different individuals of the same species) transplantation, as evidenced by nearly 1,000 clinical trials across many fields, including the treatment of liver diseases.

In a series of mouse studies, this team of researchers from Niigata University, the University of Tokyo, Osaka University and Tokyo Medical University found that IFN-γ pre-conditioned human AD-MSC-derived sEVs (AD-MSC-γ-sEVs) can increase anti-inflammatory macrophage counts; these macrophages are key players in tissue repair, including the regression of fibrosis and the promotion of tissue regeneration in vitro.

They report that both human adipose tissue-derived MSC sEVs (AD-MSC-sEVs) and AD-MSC-γ-sEVs can promote macrophage motility and phagocytic activity. In addition, they show that AD-MSC-γ-sEVs contain proteins that induce anti-inflammatory macrophages and can effectively control inflammation and fibrosis in a mouse model of cirrhosis. Using single-cell RNA-seq, they confirmed that AD-MSC-γ-sEV therapy can induce multidimensional transcriptional changes. Taken together, these results suggest that AD-MSC-derived sEVs can affect the shape and function of macrophages and effectively recruit them to damaged areas, thereby promoting repair of damaged liver tissue.

In an interview, Dr. Atsunori Tsuchiya of the Division of Gastroenterology and Hepatology, Graduate School of Medical and Dental Sciences, Niigata University, who was part of the research team, said, "Both mesenchymal stromal cells and macrophages are reported to have therapeutic effects for liver cirrhosis; however, the relationship between these cells and their mechanisms of action was not clear. We challenged this problem." He went on to add, "We found the important fact that extracellular vesicles from interferon-γ pre-conditioned MSCs can induce the tissue-repair macrophages, which can regress fibrosis and promote liver regeneration effectively." These words were echoed by Dr. Suguru Takeuchi, also of the Division of Gastroenterology and Hepatology, Graduate School of Medical and Dental Sciences, Niigata University, and part of the research team: "In our previous study, we reported that mesenchymal stromal cells administered intravenously migrate to the lung and can work as 'conducting cells' that affect the macrophage 'working cells' in the liver. In this study we first elucidated that extracellular vesicles from mesenchymal stromal cells are the key molecules affecting the macrophages."

This proof-of-concept pilot study, which complements macrophage therapy, holds potential as a strategy for treating liver diseases using IFN-γ pre-conditioned sEVs. However, further work to determine the mechanisms underlying the induction of regulatory T cells by IFN-γ pre-conditioned MSCs and their sEVs remains part of the team's future research plans before these findings can be translated to humans in phased, controlled clinical trials.

Credit: 
Niigata University

New brain-like computing device simulates human learning

image: By connecting single synaptic transistors into a neuromorphic circuit, researchers demonstrated that their device could simulate associative learning.

Image: 
Northwestern University

Researchers have developed a brain-like computing device that is capable of learning by association.

Similar to how famed physiologist Ivan Pavlov conditioned dogs to associate a bell with food, researchers at Northwestern University and the University of Hong Kong successfully conditioned their circuit to associate light with pressure.

The research will be published April 30 in the journal Nature Communications.

The device's secret lies within its novel organic, electrochemical "synaptic transistors," which simultaneously process and store information just like the human brain. The researchers demonstrated that the transistor can mimic the short-term and long-term plasticity of synapses in the human brain, building on memories to learn over time.

With its brain-like ability, the novel transistor and circuit could potentially overcome the limitations of traditional computing, including their energy-sapping hardware and limited ability to perform multiple tasks at the same time. The brain-like device also has higher fault tolerance, continuing to operate smoothly even when some components fail.

"Although the modern computer is outstanding, the human brain can easily outperform it in some complex and unstructured tasks, such as pattern recognition, motor control and multisensory integration," said Northwestern's Jonathan Rivnay, a senior author of the study. "This is thanks to the plasticity of the synapse, which is the basic building block of the brain's computational power. These synapses enable the brain to work in a highly parallel, fault tolerant and energy-efficient manner. In our work, we demonstrate an organic, plastic transistor that mimics key functions of a biological synapse."

Rivnay is an assistant professor of biomedical engineering at Northwestern's McCormick School of Engineering. He co-led the study with Paddy Chan, an associate professor of mechanical engineering at the University of Hong Kong. Xudong Ji, a postdoctoral researcher in Rivnay's group, is the paper's first author.

Problems with conventional computing

Conventional, digital computing systems have separate processing and storage units, causing data-intensive tasks to consume large amounts of energy. Inspired by the combined computing and storage process in the human brain, researchers, in recent years, have sought to develop computers that operate more like the human brain, with arrays of devices that function like a network of neurons.

"The way our current computer systems work is that memory and logic are physically separated," Ji said. "You perform computation and send that information to a memory unit. Then every time you want to retrieve that information, you have to recall it. If we can bring those two separate functions together, we can save space and save on energy costs."

Currently, the memory resistor, or "memristor," is the most well-developed technology for combined processing and memory, but memristors suffer from energy-costly switching and low biocompatibility. These drawbacks led researchers to the synaptic transistor -- especially the organic electrochemical synaptic transistor, which operates with low voltages, continuously tunable memory and high compatibility for biological applications. Still, challenges exist.

"Even high-performing organic electrochemical synaptic transistors require the write operation to be decoupled from the read operation," Rivnay said. "So if you want to retain memory, you have to disconnect it from the write process, which can further complicate integration into circuits or systems."

How the synaptic transistor works

To overcome these challenges, the Northwestern and University of Hong Kong team optimized a conductive, plastic material within the organic, electrochemical transistor that can trap ions. In the brain, a synapse is a structure through which a neuron can transmit signals to another neuron, using small molecules called neurotransmitters. In the synaptic transistor, ions behave similarly to neurotransmitters, sending signals between terminals to form an artificial synapse. By retaining stored data from trapped ions, the transistor remembers previous activities, developing long-term plasticity.

The researchers demonstrated their device's synaptic behavior by connecting single synaptic transistors into a neuromorphic circuit to simulate associative learning. They integrated pressure and light sensors into the circuit and trained the circuit to associate the two unrelated physical inputs (pressure and light) with one another.

Perhaps the most famous example of associative learning is Pavlov's dog, which naturally drooled when it encountered food. After conditioning the dog to associate a bell ring with food, the dog also began drooling when it heard the sound of a bell. For the neuromorphic circuit, the researchers activated a voltage by applying pressure with a finger press. To condition the circuit to associate light with pressure, the researchers first applied pulsed light from an LED lightbulb and then immediately applied pressure. In this scenario, the pressure is the food and the light is the bell. The device's corresponding sensors detected both inputs.

After one training cycle, the circuit made an initial connection between light and pressure. After five training cycles, the circuit significantly associated light with pressure. Light alone was able to trigger a signal, or "conditioned response."
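
The conditioning loop can be caricatured in a few lines of code: each paired light-plus-pressure presentation potentiates the light pathway's weight until light alone crosses the response threshold. The numbers below are illustrative, not measurements from the device.

```python
# Caricature of the circuit's Pavlovian conditioning: pairing light (the
# "bell") with pressure (the "food") strengthens a synaptic weight until
# light alone triggers the output. Values are illustrative only.

weight = 0.1           # light -> output coupling (arbitrary units)
THRESHOLD = 0.5        # output fires when total drive exceeds this
LEARNING_RATE = 0.12   # weight gain per paired presentation (Hebbian-style)

def fires(light, pressure, w):
    drive = pressure * 1.0 + light * w   # pressure alone always fires
    return drive >= THRESHOLD

for cycle in range(1, 6):                       # five training cycles
    fires(light=1, pressure=1, w=weight)        # paired presentation
    weight += LEARNING_RATE                     # long-term potentiation
    print(f"cycle {cycle}: light alone fires -> {fires(1, 0, weight)}")
# Output flips to True partway through training: the conditioned response.
```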

Future applications

Because the synaptic circuit is made of soft polymers, like a plastic, it can be readily fabricated on flexible sheets and easily integrated into soft, wearable electronics, smart robotics and implantable devices that directly interface with living tissue and even the brain.

"While our application is a proof of concept, our proposed circuit can be further extended to include more sensory inputs and integrated with other electronics to enable on-site, low-power computation," Rivnay said. "Because it is compatible with biological environments, the device can directly interface with living tissue, which is critical for next-generation bioelectronics."

Credit: 
Northwestern University