Earth

Improving innovation: Assessing the environmental impacts of emerging technology

Although many new technologies offer the promise to improve human welfare, they can also produce unintended environmental consequences. And while applying the principles of life cycle assessment (LCA) early in technology development can provide important insights about how to avoid damage to the environment, existing methods focus on products or processes that are already commercially established.

Meanwhile, the procedures and tools used to assess emerging technologies tend to be applied on an ad hoc basis, with no clear guidelines as to what methods are available, applicable or appropriate.

A new special issue of Yale's Journal of Industrial Ecology addresses this gap with cutting edge research that advances methods, tests new approaches against emerging technologies, and assesses novel technologies for transportation, infrastructure, energy, and materials. The special issue, "Life Cycle Assessment for Emerging Technologies," includes findings with far-reaching implications for technology developers and policy makers.

View the issue: https://onlinelibrary.wiley.com/toc/15309290/2020/24/1

For example, two papers reveal the potential environmental consequences of the rapid increase in production of the lithium-ion battery packs that power everything from electric cars to portable computing devices. In contrast to earlier analyses, these studies show that, on a global scale, expansion of lithium production is likely to continue without being slowed by resource constraints for up to three more decades. Meanwhile, localized environmental impacts associated with the extraction and processing of high-grade lithium brines are likely to create geographic imbalances in the environmental impacts and benefits of that expansion.

The issue also includes papers on fresh approaches to comparative assessment of emerging energy technologies. These new analyses make clear that the age of single-technology solutions at massive, industrial scales is coming to a close. The papers here examine the environmental impacts of alternative energy futures for algae-derived fuels, hydrogen, solar, and offshore wind energy technologies.

"The research in this issue advances not only the understanding and methods for the environmental assessment of novel technologies, italso shows the potential for refashioning the tools of systematic environmental assessment to apply at the earliest stages of the innovation cycle," said Reid Lifset, editor-in-chief of the Journal of Industrial Ecology.

Another innovation is the creation of LCA inventories (databases) that can be aligned with the scenarios used in the integrated assessment models (IAMs) widely used in climate change modeling. Methods to incorporate technology readiness levels (TRLs) that are used in R&D management allow connection of LCA with other complementary tools such as multicriteria decision analysis, risk analysis, techno-economic analysis, and the development of data repositories for emerging materials, processes, and technologies.

Credit: 
Yale School of the Environment

What if we could teach photons to behave like electrons?

To develop futuristic technologies like quantum computers, scientists will need to find ways to control photons, the basic particles of light, just as precisely as they can already control electrons, the basic particles in electronic computing. Unfortunately, photons are far more difficult to manipulate than electrons, which respond to forces as simple as the sort of magnetism that even children understand.

But now, for the first time, a Stanford-led team has created a pseudo-magnetic force that can precisely control photons. In the short term, this control mechanism could be used to send more internet data through fiber optic cables. In the future, this discovery could lead to the creation of light-based chips that would deliver far greater computational power than electronic chips. "What we've done is so novel that the possibilities are only just beginning to materialize," said postdoctoral scholar Avik Dutt, first author of an article describing the discovery in Science. (link to paper: https://science.sciencemag.org/content/367/6473/59)

Essentially, the researchers tricked the photons - which are intrinsically non-magnetic - into behaving like charged electrons. They accomplished this by sending the photons through carefully designed mazes in a way that caused the light particles to behave as if they were being acted upon by what the scientists called a "synthetic" or "artificial" magnetic field.

"We designed structures that created magnetic forces capable of pushing photons in predictable and useful ways," said Shanhui Fan, a professor of electrical engineering and senior scientist behind the research effort.

Although still in the experimental stage, these structures represent an advance on the existing mode of computing. Storing information is all about controlling the variable states of particles, and today, scientists do so by switching electrons in a chip on and off to create digital zeroes and ones. A chip that uses magnetism to control the interplay between the photon's color (or energy level) and spin (whether it is traveling in a clockwise or counterclockwise direction) creates more variable states than is possible with simple on-off electrons. Those possibilities will enable scientists to process, store and transmit far more data on photon-based devices than is possible with electronic chips today.
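
The release's counting argument can be made concrete with a few lines of arithmetic: combining several color levels with two circulation directions multiplies the number of distinguishable states per photon. This is a minimal sketch; the level counts below are purely illustrative, not values from the Stanford experiment.

```python
# Illustrative state counting (hypothetical numbers, not from the Stanford study).
# A conventional bit stores one of 2 states (on/off). A photon whose color can
# take one of several discrete levels, combined with two circulation directions
# (clockwise/counterclockwise), offers color_levels * 2 states per particle.

def states_per_photon(color_levels: int, spin_states: int = 2) -> int:
    """Distinguishable (color, spin) combinations available to one photon."""
    return color_levels * spin_states

for levels in (2, 4, 8):
    print(f"{levels} color levels x 2 spin directions -> "
          f"{states_per_photon(levels)} states (vs. 2 for an on/off electron)")
```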

To bring photons into the proximities required to create these magnetic effects, the Stanford researchers used lasers, fiber optic cables and other off-the-shelf scientific equipment. Building these tabletop structures enabled the scientists to deduce the design principles behind the effects they discovered. Eventually they'll have to create nanoscale structures that embody these same principles to build the chip. In the meantime, says Fan, "we've found a relatively simple new mechanism to control light, and that's exciting."

Credit: 
Stanford University School of Engineering

What birdsong tells us about brain cells and learning

Most scientists who study the brain believe that memories are stored through networks of synapses, or connections that form between neurons. Learning takes place as neurons form new connections and strengthen or weaken existing ones, giving the brain its so-called synaptic plasticity. There is growing evidence, however, that the intrinsic, built-in properties of the cells themselves, not just the connections between them, also play a role in this process.

New research by neuroscientists at the University of Chicago uses a unique model -- the intricate mating songs of birds -- to show how these intrinsic properties are closely tied to the complex processes of learning. The study, published in Nature Communications, could add a new layer of complexity to our understanding of the brain.

"We are able to go directly from the properties of the cells to the behavior of the animal," said Dan Margoliash, PhD, a neurobiologist and senior author of the new study. "This suggests that it's not just rapid changes at synapses that are driving learning and memory, but changes in the intrinsic properties of cells as well."

Male zebra finches are known for singing complex, precise songs to attract female mates. They try to produce the exact same pattern and timing of notes every single time, and to some extent the females judge a male's fitness by the precision of his songs. But the birds aren't born with a full repertoire of songs; they have to learn and practice their calls just like a young saxophonist practices scales and basic melodies before graduating to John Coltrane's catalog.

Margoliash uses this as an opportunity to study the underlying activity in the brain as the birds learn this complex behavior. "Songbirds are wonderful to study on their own, but this isn't just about songbirds. This is about neuroscience writ large," he said.

All living cells have an internal electrical voltage, which is different from the voltage in their surrounding environment. Neurons are special because they have what's called action potentials, or the ability to rapidly change the flow of current in and out of the cell. The sequence and timing of these action potential spikes constitute the information that neurons pass along through the network, so they are an important source of data for understanding how the brain learns.

In the new study, Margoliash and Arij Daou, PhD, a former postdoctoral scholar at UChicago and now assistant professor at the American University of Beirut, Lebanon, recorded the patterns of action potential spikes from zebra finch neurons at different stages of development -- adult birds with fully-developed song patterns and juveniles that were still learning.

Neurons have a variety of channels and proteins embedded in their cell membranes that open and close in complex ways depending on how much current is flowing in or out. This collection of mechanisms comprises the intrinsic properties of the cell, which can change with the magnitude and strength of currents that are flowing across the cell membrane.

Having recorded the currents flowing through the cells, Margoliash and Daou devised a mathematical way to compare how closely the intrinsic properties of any two birds matched each other. The intrinsic properties of a given class of neurons within a single bird were similar to each other, but they varied from bird to bird. When the researchers made a similar calculation of how closely the birds' songs resembled each other, they came to a striking conclusion.

"This was the great 'Aha!' moment," Margoliash said. "When we did that calculation for the birds, we found that birds that were close in terms of their intrinsic properties also had similar songs."

This relationship held up across different pairings of birds as well. Sibling adult birds that were raised by the same parents -- and thus taught the same way -- had both similar songs and intrinsic cell properties. But juvenile birds that hadn't yet perfected their songs were all over the map. There were no clear relationships between the intrinsic cell properties of the juveniles and their songs, no matter how they were related.

The researchers were also able to show how the intrinsic properties of cells changed in response to changes in song patterns. Using a device that recorded the birdsong and played it back at a slight delay, they caused the birds to alter their song patterns in a way that resembles stuttering in humans. The birds immediately got stuck trying to start to sing. Eventually such birds would get stuck on certain notes, or repeat patterns that they wouldn't produce in a natural environment.

Interestingly, this same technique can induce stuttering in people too. If a speaker listens to a slightly delayed feed of their own voice, it will cause them to trip over words and repeat syllables. But for many people who stutter, hearing the delayed feed can help reduce the stuttering.

Within a few hours after listening to the stutter-inducing delayed feedback, the intrinsic properties of neurons changed in these birds too, suggesting a direct link to the altered singing behavior. Margoliash says that this is evidence of a biological mechanism for stuttering that could provide a useful model for humans as well, given the similarities in behavior.

"There certainly are important cognitive components of stuttering that we haven't had a chance to study yet and see how useful the birdsong model is," he said, "but at the fundamental level we can study the neural basis of that behavior precisely. Having an animal model for stuttering could be a major breakthrough."

Credit: 
University of Chicago

Enriching newborns' environment in the right way helps heal young, injured brains

WASHINGTON-(Feb. 19, 2020)- An enriched environment--with increased opportunities for physical activity, socialization and exploring novel stimuli--helped lessen functional, anatomical and cellular deficits in an experimental model of brain damage caused by oxygen deprivation at birth. What's more, recovery of the brain's white matter required a combination of all experimental interventions, not just a single intervention, suggests a new study led by researchers at Children's National Hospital. Their findings, published online Feb. 19, 2020, in Nature Communications, could lead to new treatments for children affected by this condition.

About 450,000 babies are born preterm in the U.S. every year, a number that continues to rise, says senior author Vittorio Gallo, Ph.D., chief research officer for Children's National Hospital and scientific director for Children's National Research Institute. Oxygen deprivation caused by immature lungs or birth injuries is a common consequence of prematurity, which leads to permanent neurological deficits and disabilities, Gallo explains.

Premature babies require minimal handling for their first months of life in order to eliminate stressful stimuli and optimize their development. Efforts have been made to switch from the noisy and crowded environment of older neonatal intensive care units to new, quiet, private family rooms in order to eliminate noise and light. However, recent studies suggest that infants who were treated in private family rooms had lower language and motor scores compared to infants in open wards, raising questions about the ideal level of stimulation that premature neonates require in order to achieve optimal brain development. The mechanisms by which environmental stimuli positively affect brain development in the early neonatal period and lead to better neurological outcomes remain unclear.

To determine how enriched environments may affect recovery for newborns who suffer brain injury after birth, Gallo and colleagues leveraged a preclinical model of newborns exposed to low oxygen levels shortly after birth. These experimental models had brain damage similar to premature human babies with hypoxic brain injuries.

After injury, some of these experimental models grew up in standard enclosures, with little more than nesting materials, a few other cage mates, and access to food and water. Others grew up in enriched environments: larger enclosures equipped with a running wheel as well as objects of differing sizes and colors that were switched out frequently, and more cage mates for enhanced socialization.

When these preclinical models were young adults, the researchers assessed how well they performed on a functional test of motor skills in which both groups scurried up a narrow, inclined beam. While foot slips were common in both groups, those raised in an enriched environment had about half as many as those raised in the less-stimulating enclosures.

When researchers examined the brains, they found that these functional improvements were linked to significantly enhanced division and maturation of oligodendrocytes, cells in the brain's white matter that support nerve cells and produce myelin, a fatty insulating sheath that covers the long extensions that connect nerve cells to each other and to other parts of the body. Indeed, consistent with the cellular and functional findings, the white matter of experimental models raised in enriched environments had significantly more myelin content than that of counterparts raised in the simpler environment.

Further experiments showed that for these improvements in function and anatomy to occur in experimental models raised in the enriched environments, they needed all three elements: enhanced physical activity, socialization and cognitive stimulation from novel objects. Additionally, exposure to these elements needed to start early and be continuous and long term. Those experimental models that weren't raised in a completely enriched environment or whose exposure to the environment started later, was interrupted, or was cut short didn't have any improvements in function and white matter recovery.

Digging deeper, Gallo and colleagues employed next generation sequencing to investigate oligodendrocyte gene expression in these animals, identifying broad differences in networks of genes involved in oligodendrocyte development between the two groups.

Gallo notes that these results and future studies to better understand the effects of enriched environments could lead to better ways to care for premature babies that help lessen or prevent the long-term consequences of oxygen deprivation.

Credit: 
Children's National Hospital

New 3D chirality discovered and synthetically assembled

image: (a) Luminescence of CDCl3 solutions of samples 17, 18e and 18f in NMR tubes, [c] (mg/ml): A, left = 6.7, right = 2.2; B, left (18e) and right (18f) = 8. (b) AIE displays of multi-layer 3D molecules: 10a, 18f and 16 in THF/water systems; [c] (M).

Image: 
©Science China Press

The origin of life on Earth, for human beings, animals and plants alike, is attributed to chirality because it is essential for the formation of biomolecules such as nucleic acids, proteins and carbohydrates. Research on chirality has become increasingly active and extensive owing to its importance in chemistry, materials science and pharmaceuticals, particularly for developing new drugs with higher potency and fewer side effects. So far, four main types of chirality have been described in the scientific community: element-central, axial/helical, spiro and double planar chirality; the first three exist in nature, while the last is rendered artificially in the laboratory. That last type involves the structural unit of ferrocene, invented by Nobel Laureate G. Wilkinson at Imperial College London, with chiral ferrocenes developed by Caltech chemist G. Fu.

A recent report describes a fifth type of chirality, multi-layer 3D chirality of C2- and/or pseudo-C2-symmetry, and its first chemical synthesis by Professor Guigen Li's labs at Texas Tech University and Nanjing University in collaboration with Professor Tao Jiang at Ocean University of China (Wu G-Z, Liu Y-X, Yang Z, et al. Natl Sci Rev, doi.org/10.1093/nsr/nwz203, advance access publication 16 Dec 2019). X-ray structural analysis confirmed that this multi-layer 3D chirality features a nearly parallel arrangement of three layers: top, middle and bottom. The top and bottom layers restrict each other from free chemical bond rotation; if either layer is removed, the chirality is lost through racemization. The new 3D chirality is therefore distinct from any documented planar or axial chirality.

Retro-synthetic analysis (RSA), established by Nobel Laureate E. J. Corey at Harvard, and the Suzuki-Miyaura cross coupling, invented by Nobel Laureate A. Suzuki and his coworker N. Miyaura in Japan, played important roles in the discovery and synthesis of the multi-layer 3D chirality of C2-symmetry. Double cross-couplings invented by S. Buchwald at MIT and J. Hartwig at UC Berkeley enabled the assembly of multi-layer 3D chirality of pseudo-C2-symmetry.

Intriguingly, the resulting chiral multi-layer 3D products displayed a macroscopic chirality phenomenon that can be observed directly by the human eye, without the aid of microscopic devices (Image 1a). When solutions containing the 3D samples were slowly evaporated by exposure to air for a few days, anti-clockwise spiral loops formed in the glass containers. These spiral loops shine green when irradiated with UV light at 365 nm. Also of interest, when a capped NMR tube containing a CDCl3 solution of a 3D compound was stored at room temperature for several weeks, right-handed spiro, textile-shaped solids formed inside the tube. Nearly all of the 3D products displayed strong fluorescence, even in their solid forms, under UV light at 365 nm, as shown in Image 1.

Changing the functional groups of the multi-layer 3D chiral molecules shifted their luminescence from a gold color to dark green (Image 2a). The molecules also displayed strong aggregation-induced emission (AIE) (Image 2c): the higher the water fraction, the stronger the luminescence, as confirmed both qualitatively and quantitatively. Several of these 3D molecules showed unusually high optical rotation, indicating great potential for optical materials science and technology.

Credit: 
Science China Press

Changes to Title X mean contraception access for teens could worsen nationwide, study shows

AURORA, Colo. (February 19, 2020) - Many teens lost access to confidential family planning services in Texas due to family planning budget cuts and loss of Title X funds, says a new study led by the University of Colorado College of Nursing just published in the Journal of Adolescent Health. Lack of clarity around parental consent laws, confusion among staff, and funding uncertainty made it more difficult for organizations to provide confidential, low-cost, and quality services to teens. This research suggests that contraception access for teens throughout the nation could worsen as new changes to Title X are implemented.

This study, based on three waves of interviews conducted between 2012 and 2015 at 47 organizations, followed the Texas legislature's 2011 decision to cut the state's family-planning budget by two-thirds in an effort to exclude Planned Parenthood from receiving funds. However, this change impacted other organizations as well. Programs that had Title X funding but lost it, 79% of those in the study, consistently reported a decrease in the number of adolescent clients served and in their ability to care for teens.

According to the study, these budgetary changes created confusion for teens, family planning organization staff, and providers. For example, teens that were covered by the federal Medicaid program could get confidential services at a clinic that did not have Title X funding, but teens that had different insurance could not. Furthermore, teens could access other reproductive health services, such as testing for sexually transmitted infections, without parental consent, but not birth control. Sometimes changes in funding occurred quickly, which made it challenging for organizations to understand and train staff in how to comply with the parental consent requirements.

"The loss of Title X funds meant that clinics had to suddenly start requiring parental consent or send adolescent clients elsewhere," said lead author and CU Nursing Assistant Professor Kate Coleman-Minahan, PhD, RN. "This put a big strain on providers, both administratively and ethically. It was distressing for them to turn away teens seeking confidential care. They felt like they couldn't meet their mission."

Texas is one of 24 states that do not explicitly allow minors to consent for contraception. Federal Title X funding allows teens to access confidential family planning services, including free or reduced-cost contraceptives, even in states such as Texas that require parental consent. A major overhaul of the program by the current administration in 2019 includes policy changes that affect organizations across the country. A number of Title X-funded clinics have left the program since publication of the new policies, and a slate of legal battles challenging the changes is pending.

"Our research from Texas foreshadows the potential negative impact that the recently implemented Title X regulations will have on teens' abilities to receive high quality sexual and reproductive services," said study co-author Dr. Kari White, principal investigator of the Texas Policy Evaluation Project (TxPEP) and Associate Professor of Social Work and Sociology at The University of Texas at Austin.

"I am already seeing what we observed in Texas beginning to happen here," says Coleman-Minahan, who provides family planning to teens in Colorado as a nurse practitioner. "Even though Colorado does not require parental consent for contraception, the new rules are already causing confusion and constrain our ability to provide high quality care."

Credit: 
University of Colorado Anschutz Medical Campus

Could water solve the renewable energy storage challenge?

image: Seasonal pumped storage project and main components.

Image: 
IIASA

Seasonal pumped hydropower storage (SPHS), an already established yet infrequently used technology, could be an affordable and sustainable solution to store energy and water on an annual scale, according to new IIASA research published in the journal Nature Communications. The study shows that, compared with other mature storage solutions such as natural gas, SPHS has considerable potential to provide highly competitive energy storage costs.

"The energy sectors of most countries are undergoing a transition to renewable energy sources, particularly wind and solar generation," says IIASA postdoc Julian Hunt, the study lead author. "These sources are intermittent and have seasonal variations, so they need storage alternatives to guarantee that the demand can be met at any time. Short-term energy storage solutions with batteries are underway to resolve intermittency issues, however, the alternative for long-term energy storage that is usually considered to resolve seasonal variations in electricity generation is hydrogen, which is not yet economically competitive."

Seasonal pumped hydropower storage means pumping water into a deep storage reservoir, built parallel to a major river, during times of high water flow or low energy demand. When water is scarce or energy demand increases, stored water is then released from the reservoir to generate electricity.

The new study is the first to provide a global, high-resolution analysis of the potential and costs for SPHS technology. In their analysis, researchers assessed the theoretical global potential for storing energy and water seasonally with SPHS, focusing on the locations with the highest potential and lowest cost. They also analyzed different scenarios where the storage of energy and water with SPHS could be a viable alternative. The study included topographical, river network and hydrology data, infrastructure cost estimation, and project design optimization, to identify technically feasible candidate sites.

The new study shows that water storage costs with SPHS plants vary from 0.007 to 0.2 US$/m3, long-term energy storage costs vary from 1.8 to 50 US$/MWh, and short-term energy storage costs vary from 370 to 600 US$/kW of installed power generation capacity, considering dam, tunnel, turbine, generator, excavation, and land costs. The estimated world energy storage potential below a cost of 50 US$/MWh is 17.3 PWh, which is approximately 79% of the world's electricity consumption in 2017.
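
A quick back-of-the-envelope check of that last figure, taking 2017 world electricity consumption as roughly 22 PWh (the value implied by the release's own numbers):

```python
# Sanity check of the ~79% figure; the 22 PWh consumption value is an
# approximation implied by the release, not a number stated in the study.
sphs_potential_pwh = 17.3          # storage potential below 50 US$/MWh
world_consumption_2017_pwh = 22.0  # approximate global electricity consumption in 2017

share = sphs_potential_pwh / world_consumption_2017_pwh
print(f"SPHS potential covers about {share:.0%} of 2017 consumption")  # -> about 79%
```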

The researchers found that significant potential exists for SPHS around the world, in particular in the lower part of the Himalayas, Andes, Alps, Rocky Mountains, northern part of the Middle East, Ethiopian Highlands, Brazilian Highlands, Central America, East Asia, Papua New Guinea, the Sayan, Yablonoi and Stanovoy mountain ranges in Russia, as well as a number of other locations with smaller potential.

"Concerns about the intermittency and seasonality of wind and solar can be valid, but are also sometimes exaggerated," says IIASA researcher Edward Byers, a study coauthor. "This study demonstrates that there is an extremely high potential for SPHS to be used across much of the world, providing a readily-available, affordable and sustainable solution to support the transition to sustainable energy systems and overcome real and perceived barriers to high shares of renewable generation."

The study also addresses some of the potential environmental concerns related to hydropower. Because SPHS reservoirs are deep and constructed parallel to, rather than within, the course of a river, the environmental and land use impacts can be 10 to 50 times smaller than those of traditional hydropower plants.

Hunt says, "With the need for a transition to a more sustainable world with lower CO2 emissions, renewable energies and energy storage will play a major role in the near future. Given the vast untapped and cheap potential of SPHS, it will soon play an important role in storing energy and water on a yearly basis."

Credit: 
International Institute for Applied Systems Analysis

Beta-arrestin-2 increases neurotoxic tau driving frontotemporal dementia

image: JungA (Alexa) Woo, PhD, a neuroscientist at the University of South Florida Morsani College of Medicine, is lead author of a newly published study suggesting a new approach to inhibit the buildup of brain-damaging tau tangles associated with several types of dementia.

Image: 
© University of South Florida

TAMPA, Fla. (Feb. 18, 2020) -- The protein β-arrestin-2 increases the accumulation of neurotoxic tau tangles, a cause of several forms of dementia, by interfering with removal of excess tau from the brain, a new study by the University of South Florida Health (USF Health) Morsani College of Medicine found.

The USF Health researchers discovered that a form of the protein comprised of multiple β-arrestin-2 molecules, known as oligomerized β-arrestin-2, disrupts the protective clearance process normally ridding cells of malformed proteins like disease-causing tau. Monomeric β-arrestin-2, the protein's single-molecule form, does not impair this cellular toxic waste disposal process known as autophagy.

Their findings were published today in the Proceedings of the National Academy of Sciences (PNAS).

The study focused on frontotemporal lobar degeneration (FTLD), also called frontotemporal dementia -- second only to Alzheimer's disease as the leading cause of dementia. This aggressive, typically earlier onset dementia (ages 45-65) is characterized by atrophy of the front or side regions of the brain, or both. Like Alzheimer's disease, FTLD displays an accumulation of tau, and has no specific treatment or cure.

"Our research could lead to a new strategy to block tau pathology in FTLD, Alzheimer's disease and other related dementias, which ultimately destroys cognitive abilities such as reasoning, behavior, language, and memory," said the paper's lead author JungA (Alexa) Woo, PhD, an assistant professor of molecular pharmacology and physiology and an investigator at the USF Health Byrd Alzheimer's Center.

"It has always been puzzling why the brain cannot clear accumulating tau" said Stephen B. Liggett, MD, senior author and professor of medicine and medical engineering at the USF Health Morsani College of Medicine. "It appears that an 'incidental interaction' between β-arrestin-2 and the tau clearance mechanism occurs, leading to these dementias. β-arrestin-2 itself is not harmful, but this unanticipated interplay appears to be the basis for this mystery."

"This study identifies beta-arrestin-2 as a key culprit in the progressive accumulation of tau in brains of dementia patients," said coauthor David Kang, PhD, professor of molecular medicine and director of basic research for the Byrd Alzheimer's Center. "It also clearly illustrates an innovative proof-of-concept strategy to therapeutically reduce pathological tau by specifically targeting beta-arrestin oligomerization."

The two primary hallmarks of Alzheimer's disease are clumps of sticky amyloid-beta (Aβ) protein fragments known as amyloid plaques and neuron-choking tangles of a protein called tau. Abnormal accumulations of both proteins are needed to drive the death of brain cells, or neurons, in Alzheimer's, although the tau accumulations now appear to correlate better with cognitive dysfunction than Aβ, and drugs targeting Aβ have been disappointing as a treatment. Aβ aggregation is absent in the FTLD brain, where the key feature of neurodegeneration appears to be excessive tau accumulation, known as tauopathy. The resulting neurofibrillary tangles -- twisted fibers laden with tau -- destroy synaptic communication between neurons, eventually killing the brain cells.

"Studying FTLD gave us that window to study a key feature of both types of dementias, without the confusion of any Aβ component," Dr. Woo said.

Monomeric β-arrestin-2 is mostly known for its ability to regulate receptors, molecules on the cell that are responsible for hormone and neurotransmitter signaling. β-arrestin-2 can also form multiple interconnecting units, called oligomers. The function of β-arrestin-2 oligomers is not well understood.

The monomeric form was the basis for the laboratory's initial studies examining tau and its relationship with neurotransmission and receptors, "but we soon became transfixed on these oligomers of β-arrestin-2," Dr. Woo said.

Among the researchers' findings reported in PNAS:

Both in cells and in mice with elevated tau, β-arrestin-2 levels are increased. Furthermore, when β-arrestin-2 is overexpressed, tau levels increase, suggesting a maladaptive feedback cycle that exacerbates disease-causing tau.

Genetically reducing β-arrestin-2 lessens tauopathy, synaptic dysfunction, and the loss of nerve cells and their connections in the brain. For this experiment researchers crossed a mouse model of early tauopathy with genetically modified mice in which the β-arrestin-2 gene was inactivated, or knocked out.

Oligomerized β-arrestin-2 -- but not the protein's monomeric form -- increases tau. The researchers blocked β-arrestin-2 molecules from binding together to create oligomerized forms of the protein. They demonstrated that pathogenic tau significantly decreased when β-arrestin-2 oligomers are converted to monomers.

Oligomerized β-arrestin-2 increases tau by impeding the ability of cargo protein p62 to help selectively degrade excess tau in the brain. In essence, this reduces the efficiency of the autophagy process needed to clear toxic tau, so tau "clogs up" the neurons.

Blocking of β-arrestin-2 oligomerization suppresses disease-causing tau in a mouse model that develops human tauopathy with signs of dementia.

"We also noted that decreasing β-arrestin-2 by gene therapy had no apparent side effects, but such a reduction was enough to open the tau clearance mechanism to full throttle, erasing the tau tangles like an eraser," Dr. Liggett said. "This is something the field has been looking for -- an intervention that does no harm and reverses the disease."

"Based on our findings, the effects of inhibiting β-arrestin-2 oligomerization would be expected to not only inhibit the development of new tau tangles, but also to clear existing tau accumulations due to the mechanism of enhancing tau clearance," the paper's authors conclude.

The work is consistent with a new treatment strategy that could be preventive for those at risk or with mild cognitive impairment, and also for those with overt dementias caused by tau, by decreasing the existing tau tangles.

Credit: 
University of South Florida (USF Health)

Improving the electrical and mechanical properties of carbon-nanotube-based fibers

image: Postdoctoral researcher Gang Wang loads a sample into the system used to perform the nanotube crosslinking operation while Joseph Lyding looks on.

Image: 
Doris Dahl, Beckman Institute, University of Illinois

The Lyding Group recently developed a technique that can be used to build carbon-nanotube-based fibers by creating chemical crosslinks. The technique improves the electrical and mechanical properties of these materials.

The paper, "Enhanced Electrical and Mechanical Properties of Chemically Cross-Linked Carbon-Nanotube-Based Fibers and Their Application in High-Performance Supercapacitors," was published in ACS Nano.

"Carbon nanotubes are strong and are very good at conducting heat and electricity," said Gang Wang, a postdoctoral research associate in the Lyding lab, which is at the Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign. "Therefore, these materials have wide applications and can be used as strong fibers, batteries, and transistors."

There are many ways to build materials that have carbon-nanotube-based fibers. "Airplane wings can be made, for example, by embedding these fibers in a matrix using epoxy," said Joseph Lyding, the Robert C. MacClinchie Distinguished Professor of Electrical and Computer Engineering and a Beckman faculty member. "The epoxy acts as a binder and holds the matrix together."

However, combining the tubes to make such materials can lead to a loss in important properties. "We came up with a method to bring a lot of that performance back," Lyding said. "The method is based on linking the individual carbon nanotubes together."

The researchers dispersed brominated hydrocarbon molecules within the nanotube matrix. When heat is applied, the bromine groups detach, and the molecules covalently bond to adjacent nanotubes.

"When you pass current though these materials, the resistance to the current is highest at the junctions where the nanotubes touch each other," Lyding said. "As a result, heat is generated at the junctions and we use that heat to link the nanotubes together."

The treatment is a one-time process. "Once those bonds form, the resistance at the junction drops, and the material cools off. It's like popcorn going off -- once it pops, that's it," Lyding said.

The researchers faced many challenges when they were trying to build these materials. "We have to find the right molecules to use and the proper conditions to make those bonds," Wang said. "We had to try several times to find the right current and then use the resulting material to build other devices."

"This paper is the first step in making a new class of materials. It is likely that the performance we see now will become better because it has not been explored fully yet," Lyding said. "We are interested in investigating how strong we can make these materials, how we can improve their electrical conductivity, and whether we can replace copper wires with materials that are 10 times lower in weight and have the same performance."

Credit: 
Beckman Institute for Advanced Science and Technology

An early warning system for damage in composite materials

A team at the National Institute of Standards and Technology (NIST) has developed a tool to monitor changes in widely used composite materials known as fiber reinforced polymers (FRPs), which can be found in everything from aerospace and infrastructure to wind turbines. The new tool, integrated into these materials, can help measure the damage that occurs as they age.

"This gives us the ability to develop better, more fatigue-resistant composites," said NIST chemist Jeff Gilman. "We can see when the fiber starts to break. We now have a way to quantify the damage."

Since the 1960s, scientists have been experimenting with ways to make FRPs lighter and stronger. This has often meant testing the bond between fiber and resin. As reported in a previous publication, the NIST team added small molecules that fluoresce after the impact of mechanical force. These molecules, called "mechanophores," change color or light up, helping identify tiny nanometer-sized openings or cracks between the fiber and resin.

The NIST team has taken this technology to the next level by incorporating the mechanophore throughout the composite resin. Although not noticeable to the naked eye, the newest approach allows scientists to use special microscopy imaging techniques to measure FRP damage. The approach incorporates a minute amount (less than 0.1% mass) of a fluorescent dye called rhodamine that causes no appreciable changes in the material's physical properties.

If the new mechanophore is embedded in structures made of FRP, field testing for fatigue could be done inexpensively and on a regular basis. Structures like wind turbines could frequently be scanned easily for interior cracks, even years after they've been erected.

Initial work with this new tool also revealed a surprise about FRP damage. When a fiber breaks, it sends out a kind of "shock wave" that moves throughout the material, explained Jeremiah Woodcock, the lead author of a new paper about the mechanophore published in Composites Science and Technology. In the past, it was believed that most of the damage was happening at the point of breakage.

"We thought that when we looked at the results, there'd be a halo of light around the crack, showing the fluorescence of the mechanophore," Woodcock said. Instead, they found that damage occurs in places that are very remote from the point of fiber fracture. "It's like we knew about the earthquake but didn't know about the tsunami that follows after it."

The NIST mechanophore research also found that existing testing was unintentionally damaging the material's strength. This has, in turn, led designers and engineers to overdesign FRPs. Using the mechanophore could, therefore, bring down energy and manufacturing costs and increase the ways these materials are used in industry.

Credit: 
National Institute of Standards and Technology (NIST)

Solar technology breakthrough at the University of Queensland

video: Animation explaining quantum dot solar cell technology

Image: 
University of Queensland

The development of next generation solar power technology that has potential to be used as a flexible 'skin' over hard surfaces has moved a step closer, thanks to a significant breakthrough at The University of Queensland.

UQ researchers set a world record for the conversion of solar energy to electricity via the use of tiny nanoparticles called 'quantum dots', which pass electrons between one another and generate electrical current when exposed to solar energy in a solar cell device.

The development represents a significant step towards making the technology commercially-viable and supporting global renewable energy targets.

Professor Lianzhou Wang, who led the breakthrough, said: "Conventional solar technologies use rigid, expensive materials. The new class of quantum dots the university has developed are flexible and printable.

"This opens up a huge range of potential applications, including the possibility to use it as a transparent skin to power cars, planes, homes and wearable technology. Eventually it could play a major part in meeting the United Nations' goal to increase the share of renewable energy in the global energy mix."

Professor Wang's team set the world record for quantum dot solar cell efficiency by developing a unique surface engineering strategy, overcoming previous challenges stemming from the fact that the surfaces of quantum dots tend to be rough and unstable, which made them less efficient at converting solar energy into electrical current.

"This new generation of quantum dots is compatible with more affordable and large-scale printable technologies," said Professor Wang.

"The near 25 per cent improvement in efficiency we have achieved over the previous world record is important. It is effectively the difference between quantum dot solar cell technology being an exciting 'prospect' and being commercially viable.'

UQ Vice-Chancellor and President Professor Peter Høj AC extended his congratulations to the UQ team.

"The world needs to rapidly reduce carbon emissions and this requires us to invest much more in research to improve existing energy-generation technologies and develop entirely new ones," he said.

"Harnessing the power of fundamental technological and scientific research is a big part of this process - and that's what we're focused on at UQ".

Credit: 
University of Queensland

Warming oceans are getting louder

SAN DIEGO--One of the ocean's loudest creatures is smaller than you'd expect--and will get even louder and more troublesome to humans and sea life as the ocean warms, according to new research presented here at the Ocean Sciences Meeting 2020.

Snapping shrimp create a pervasive background crackling noise in the marine environment. Scientists suspect the sound helps the shrimp communicate, defend territories and hunt for food. When enough shrimp snap at once, the noise can dominate the soundscape of coastal oceans, sometimes confusing sonar instruments. Listen to snapping shrimp sounds here: https://youtu.be/1Y9IhiSk-Pk

Researchers will present new results Friday at the Ocean Sciences Meeting 2020 suggesting that with increased ocean temperatures, snapping shrimp will snap more often and louder than before. This could amplify the background noise, or soundscape, of the global ocean, with implications for marine life and humans.

"It's a really cool little animal," said Aran Mooney, a marine biologist at Woods Hole Oceanographic Institution who will present the work. "They're a crustacean, kind of like a little shrimp or lobster. They make a sound by like closing a claw so fast it makes this bubble and when that bubble implodes, it makes that snapping sound."

Mooney and his colleague Ashlee Lillis detected a strong relationship between warmer waters and louder, more frequent snapping shrimp sounds by experimenting with shrimp in tanks in their lab and by listening to shrimp in the ocean at different water temperatures.

"As you increase that temperature, snap rates increase," Mooney said.

This makes sense because shrimp are essentially cold-blooded animals, meaning their body temperature and activity levels are largely controlled by their environment, in the same way ants can move faster in warmer weather than in cool weather.

"We can actually show in the field that not only does snap rate increase, but the sound levels increase as well," Mooney said. "So the seas are actually getting louder as water, warmer temperatures."

Louder snapping shrimp could potentially have harmful effects on fish and even sonar used by submarines and ships.

"We know that fish use sound to communicate," Mooney said. "Fish call each other, and they make sounds to attract mates and for territorial defense. If the seas get louder, it has the potential to influence those communications. We don't really know that yet. That's something we have to follow up on."

Human use of sound in the oceans might also be impaired by very loud snapping shrimp. Common instruments like sonar fish finders might be affected, Mooney said. There is also the possibility louder seas could affect instruments the Navy uses to detect mines, which could have implications for national defense, he said.

Credit: 
American Geophysical Union

Scientists: Estonia has the most energy efficient new nearly zero energy buildings

image: National nearly zero energy requirements for new apartment buildings with district heating, normalized using European input data and primary energy factors. Estonia was the only country complying with the European Commission recommendation, marked as EU Nordic. The Estonian requirement was followed by the Norwegian and Finnish ones, while the Swedish one came last as the least strict in this comparison.

Image: 
Jarek Kurnitski

A recent study carried out by an international group of building scientists showed that Estonia is among the countries with the most energy efficient buildings in Europe. The analyses of the NZEB energy performance requirements and reference apartment buildings in four countries (Estonia, Norway, Finland and Sweden) showed that the nearly zero energy buildings constructed in Estonia are most energy-efficient, i.e. their energy consumption is the lowest.

Head of the TalTech Nearly Zero Energy Buildings Research Group, Professor Jarek Kurnitski says, "The reason for our success story in building energy efficiency is, besides the 15-year sterling work carried out by our researchers and engineers, also the fact that Estonia as a fast-evolving country was one of the few in Europe to establish minimum energy performance requirements in 2019 with the ambition following the cost-optimal calculations results. This approach was initially planned also by our neighbouring country Finland, who, however, bowed to market pressure and opted for a slower development path. This meant slightly lower construction costs, but significantly higher operation costs in the long run."

Europe is divided into four climate zones. Estonia, together with the other Nordic countries, is located in the fourth, i.e. the coldest, zone. When benchmarking NZEB requirements against the European Commission's NZEB recommendations, the differences in national methodologies that affect the results of the calculations must be taken into account. For example, Finland consumes nearly twice as much domestic hot water per person as Central Europe.

The international research group drew its conclusions by taking into account the NZEB requirements in force in the European Union, which are expressed as national energy performance indicators.

"Primary energy indicator" means total weighted energy consumption per heated square metre of a building. It is calculated not based on the total cost of energy consumption, but by using conversion factors, which take into account the primary energy content of energy delivered into the building. The conversion factors as well as many other input data differ from a country to another. National nearly zero energy performance indicators must be calculated based on cost-optimal levels, i.e. by determining the present value of a 30-year life cycle: the costs are composed of the construction costs, 30-years maintenance and operation costs with interest and increase of energy prices, which is why energy costs play an important role.

"Finding the most cost-effective energy efficient solutions is an economic-technical optimization task the building scientists need to tackle. However, we should not split hairs upon improving energy efficiency, because in order to achieve cost-efficiency, construction costs must be kept under control," says Professor Kurnitski, who is also leading one of Estonian centres of excellence - ZEBE centre of excellence for zero energy and resource efficient smart buildings and districts.

Why have many countries not been as successful as Estonia in achieving the energy efficiency goals?

In Finland, for example, the 10-year economic recession created an unfavourable environment, which made decision-makers cautious and, instead of targeting low life-cycle costs, a course was taken to achieve as low a construction cost as possible. This solution is not sustainable in the longer run. In Sweden, the process of preparing the requirements took too long, which explains the relatively low ambition of the requirements set out within the prescribed deadline.

"Strict energy efficiency requirements constitute a kind of consumer protection for a house or apartment buyer. Energy efficiency would never improve only by considering construction market preferences - it is profitable for the developer to build at the lowest possible cost, because the buyers are looking for a home and usually do not take later high operation costs into account at the time of purchase," Jarek Kurnitski says.

In Estonia, the first energy efficiency requirements based on primary energy use were established in 2008. In 2013, the requirements became stricter and from 2019-2020 NZEB requirements were established (from 2019 valid for public buildings and from 2020 for residential buildings).

Professor Kurnitski says, "Estonia has trusted its scientists, responded quickly and flexibly to the changes and achieved rapid improvement in energy efficiency along with economic growth. In 2013, we reached the same level as Sweden, Finland and Norway, but now we are a step ahead of them. Somewhat surprisingly, Estonia had the courage to establish ambitious NZEB requirements by regulations and make the construction sector stick to the requirements."

Energy efficiency requirements shall be reviewed every five years. In 2013, the required energy class in an energy performance certificate was C; class A is now required under the NZEB requirements. More classes may be introduced in the future; for example, class A+++ is already used as the highest class for household appliances. Energy costs have decreased significantly in modern residential buildings. For example, compared to an apartment building built in the 1970s, a modern zero energy building uses less than half the energy, and its heating costs are less than a quarter.

Credit: 
Estonian Research Council

Mayo researchers create, test AI to improve EKG testing for hypertrophic cardiomyopathy

ROCHESTER, Minn. - An approach based on artificial intelligence (AI) may allow EKGs to be used to screen for hypertrophic cardiomyopathy in the future. With hypertrophic cardiomyopathy, the heart walls become thick and may interfere with the heart's ability to function properly. The disease also predisposes some patients to potentially fatal abnormal rhythms. Current EKG technology has limited diagnostic yield for this disease.

New Mayo Clinic research suggests that a convolutional neural network AI can be trained to detect unseen characteristics of hypertrophic cardiomyopathy. The standard 12-lead EKG is a readily available, low-cost test that can be performed in many settings, including those with limited resources.

Hypertrophic cardiomyopathy may be underdiagnosed because it often does not cause symptoms. Patients are often unaware they have it until they experience complications, but early identification can be important. Hypertrophic cardiomyopathy is one of the leading causes of sudden death in adolescents and young adults participating in sports.

Peter Noseworthy, M.D., a Mayo Clinic cardiologist, suggests that AI might offer an effective and readily-available method for earlier diagnosis of hypertrophic cardiomyopathy through an EKG. Dr. Noseworthy is senior author on a newly published study in the Journal of the American College of Cardiology: "Detection of Hypertrophic Cardiomyopathy Using a Convolutional Neural Network-Enabled Electrocardiogram."

Researchers trained and validated a convolutional neural network using digital 12-lead EKGs from 2,448 patients known to have hypertrophic cardiomyopathy and 51,153 who did not, matching the control subjects for age and sex. Next, they tested the AI's ability to detect the disease on a different group of 612 subjects with hypertrophic cardiomyopathy and 12,788 control subjects.
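
The release does not describe the network's architecture; purely as a schematic of the kind of model involved, and not the Mayo Clinic implementation, a small one-dimensional convolutional network over a 12-lead EKG could be organized as follows. The layer sizes and the 5,000-sample recording length are assumptions chosen for the example.

```python
# Schematic 1D CNN for 12-lead EKG classification (illustrative only; not the
# Mayo architecture -- channel counts, kernel sizes and input length are assumed).
import torch
import torch.nn as nn

class EcgCnn(nn.Module):
    def __init__(self, n_leads: int = 12, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_leads, 32, kernel_size=7, padding=3), nn.BatchNorm1d(32), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.BatchNorm1d(64), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.AdaptiveAvgPool1d(1),          # collapse the time axis to one value per channel
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                     # x shape: (batch, leads, samples)
        return self.classifier(self.features(x).squeeze(-1))

# One hypothetical batch: 8 EKGs, 12 leads, 10 seconds sampled at 500 Hz.
logits = EcgCnn()(torch.randn(8, 12, 5000))
print(logits.shape)                           # torch.Size([8, 2])
```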

For diagnostic tests such as this neural network, the diagnostic performance is measured mathematically through the area under the receiver operating characteristic curve, on a scale where 0.5 is poor (flip of a coin) and 1.0 is excellent (perfect test). The measurement relates to the test's ability to correctly identify patients who have the disease (sensitivity), and correctly identify patients who do not have the disease (specificity).

For comparison, a typical positive Pap smear test would have an area under the curve of 0.7 and a mammogram would be 0.85. The study found the AI's ability to distinguish patients with hypertrophic cardiomyopathy from those without it had an area under the curve of 0.96, a powerful predictor.
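
For readers who want to see the metric in action, the area under the curve can be computed from predicted scores and true labels in a few lines; the labels and scores below are made up for illustration, not study data.

```python
# Toy illustration of the area-under-the-curve metric described above.
from sklearn.metrics import roc_auc_score

y_true = [1, 1, 1, 0, 0, 0, 0, 0]   # 1 = has hypertrophic cardiomyopathy
y_score = [0.9, 0.8, 0.25, 0.3, 0.2, 0.35, 0.1, 0.05]  # model's predicted probabilities

# ~0.87 on this toy data; 1.0 would be a perfect test, 0.5 a coin flip.
print(roc_auc_score(y_true, y_score))
```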

"The good performance in patients with a normal EKG is fascinating," says Dr. Noseworthy. "It's interesting to see that even a normal EKG can look abnormal to a convolutional neural network. This supports the concept that these networks find patterns that are hiding in plain sight."

The study also tested the AI on subgroups. The area under the curve for predicting hypertrophic cardiomyopathy in a group of patients diagnosed with left ventricular hypertrophy, a disease caused by high blood pressure that also is characterized by heart wall thickening, was 0.95. The area under the curve in the subgroup with only normal EKGs was also 0.95. The area under the curve for the subgroup of patients diagnosed with aortic stenosis (narrowing of the valve) was 0.94. The test performed similarly well in a subset of patients who had been genetically tested and confirmed to have pathogenic mutations for the disease.

"The subgroups are important for understanding how to apply the test. It's good to see that the AI performs well when the EKG is normal as well as when it is abnormal due to left ventricular hypertrophy," says Konstantinos Siontis, M.D., a resident cardiologist at Mayo Clinic and co-first author of the study. "Perhaps even more important is the fact that the algorithm performed best in the younger subset of patients in our study (under 40 years old), which highlights its potential value in screening younger adults."

More research remains to be done, such as testing the AI in other adult populations, children and adolescents to find out where it works well and where it fails.

"This is a promising proof of concept, but I would caution that, despite its powerful performance, any screening test for a relatively uncommon condition is destined to have high false positive rates and low positive predictive value in a general population. We still need to better understand which particular populations will benefit from this test as a screening tool," says Dr. Siontis.

"We also need to learn more about what specific characteristics of hypertrophic cardiomyopathy this network is detecting. We hope to learn how to apply this technology to screening and managing patients in families affected by this disease," adds Dr. Noseworthy.

Credit: 
Mayo Clinic

Health coaching shown to improve inhaler use among low-income COPD patients

Over 14 million U.S. adults have chronic obstructive pulmonary disease, and many face barriers to using inhaled medications regularly and effectively. Although inhaled medications can improve daily life and long-term outcomes, only 25 to 43% of people with COPD use them regularly. In addition, inhalers can be complex to use--requiring users to master a series of six to eight steps that differ across devices. Physicians and health teams have not yet found a solution to bring COPD medication adherence to the level of other chronic diseases.

In a multi-site randomized controlled trial from the University of California, San Francisco, non-licensed, trained health coaches offered COPD patients one-on-one support in person and by phone, with contact at least every three weeks for nine months. Participants were primarily low-income, African American and Latino men in an urban area. Those who received health coaching showed significant improvement in adherence to controller inhalers and improved inhaler technique, with 40% of health-coached patients versus 11% of a control group able to demonstrate effective use of their inhalers after the intervention. The researchers conclude that improved inhaler technique and adherence are among multiple factors contributing to long-term COPD outcomes, but that their research has confirmed one technique--the use of lay health coaches--that may help patients get optimal benefit from their COPD medications.

Credit: 
American Academy of Family Physicians