Tech

Lessening water quality problems caused by hurricane-related flooding

June 1 is the start of hurricane season in the Atlantic, and with 2020 predicted to be particularly active, residents in coastal regions are keeping watchful eyes on the weather. Flooding is often the most damaging effect of tropical storms, and it can disproportionately affect vulnerable people and ecosystems. Now, in ACS' Environmental Science & Technology, researchers study water quality impacts of two recent hurricanes in North Carolina and suggest interventions to protect susceptible areas.

Water quality problems caused by flooding can threaten the health of humans and wildlife. Rising waters can make wastewater treatment plants, sewers, hazardous waste sites, agricultural lands and animal feeding operations overflow, carrying pollutants into waterways. Flood hazard maps exist to help keep homes and businesses out of flood plains, but the maps aren't always accurate. Danica Schaffer-Smith and colleagues wanted to find out how well the maps predicted actual flooding from two recent hurricanes. They also wanted to identify threats to water quality and find opportunities to improve the area's resilience to future storms.

The researchers developed a computer algorithm that, using satellite images, mapped areas in North Carolina flooded by Hurricane Matthew in 2016 and Florence in 2018. They found that hurricane flooding occurred beyond the state-mapped flood hazard zones during both storms. When the team correlated the flooded areas to socioeconomic characteristics of the people living there, they found larger impacts on communities that had higher proportions of older adults, people with disabilities, unemployment and mobile homes. The researchers mapped many potential sources of water pollution within the flooded areas, including hazardous waste sites, industrial discharges, wastewater treatment plants, and swine and poultry farms. Certain interventions, such as government land buyouts, forest restoration or wetland conservation, could help lessen the impacts of future hurricanes, the researchers say.
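
The paper's mapping algorithm is not described in detail here, but as a rough illustration of how satellite imagery can be turned into a flood map, the sketch below thresholds a standard water index (NDWI). The band arrays, threshold, and choice of index are assumptions for illustration only, not the method developed by Schaffer-Smith and colleagues.

```python
import numpy as np

def flood_mask(green, nir, threshold=0.0):
    """Flag likely water pixels using the normalized difference water index,
    NDWI = (green - NIR) / (green + NIR); values above the threshold are
    treated as open water / inundation."""
    ndwi = (green - nir) / (green + nir + 1e-9)
    return ndwi > threshold

# Hypothetical reflectance tiles standing in for satellite bands (green and
# near-infrared); real use would read georeferenced rasters instead.
green = np.random.rand(256, 256)
nir = np.random.rand(256, 256)
print(f"{flood_mask(green, nir).mean():.2%} of pixels flagged as water")
```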

Credit: 
American Chemical Society

Disorder in fish shoals may reap rewards at dinner time

image: Visual fields of three-spined sticklebacks in an ordered group (above) and in a disordered group (below). Groups of three-spined sticklebacks in the experimental arena as viewed underwater (right).

Image: 
James Herbert-Read and Hannah MacGregor

The advantages of animals foraging in an orderly group are well-known, but research by the University of Bristol has found an element of unruly adventure can help fish in the quest for food.

The study, published today [1 June] in Nature Communications, sheds new light on why fish shoals frequently switch between behaving in states of extreme order and disorder. It found certain individuals perform better when the group is disordered because they are more observant and faster to find sources of food, while others excel by following the orderly crowd and exploiting their more proactive peers.

Lead author Dr Hannah MacGregor, Research Associate in the School of Biological Sciences, said: "We know how animals behave in collective formations, but the benefits of this are less well understood. The findings of our study are intriguing because they reveal why swarm-like fish shoals are in a constant state of flux, as each fish vies for order or disorder to hold sway depending on the state in which they individually perform best.

"It was surprising that the unruly, more disruptive fish can have a competitive edge when it comes to foraging, since they are more alert and able to seek out new food sources which might escape the attention of others."

The study monitored 12 groups of three-spined sticklebacks over a month. It measured how quickly individual fish in shoals located a food source that appeared unpredictably in their environment. By repeatedly testing the responses of the same groups, it was possible to measure whether individual fish performed better or worse depending on how organised the group was when the food appeared.
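
For readers wondering how "how organised the group was" is typically quantified, collective-behaviour studies often use a polarization order parameter, the length of the mean heading vector. The snippet below is a generic sketch of that metric with made-up headings, not necessarily the exact measure used in this study.

```python
import numpy as np

def polarization(headings):
    """Order parameter for a shoal: the length of the mean unit heading vector.
    `headings` has shape (n_fish, 2), one unit vector per fish. Values near 1
    mean the group is highly aligned (ordered); values near 0 mean disorder."""
    return np.linalg.norm(headings.mean(axis=0))

# Hypothetical snapshot: heading angles (radians) for eight fish.
angles = np.array([0.10, 0.20, 0.05, 3.00, 1.50, 0.15, 2.80, 0.00])
headings = np.column_stack([np.cos(angles), np.sin(angles)])
print(f"polarization = {polarization(headings):.2f}")
```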

"The disorderly 'first responders' to the food were quickest and in the minority. It was fascinating to see not only how individual styles varied, but also that the fish generally swam in their shoals in the way that was most advantageous for themselves," Dr MacGregor said.

"Those that thrived in disorder tended to swim less aligned to their neighbours, suggesting they may try to disrupt the shoal in case more food appears."

A recognised benefit of foraging in an orderly group is the sharing of information about the location of food. Individuals that rely more on this social information benefit from a highly organised formation that allows the information to be transmitted more effectively.

However, the study showed how some individuals are particularly good at detecting the food independently, without the help of their shoal mates. For these individuals, a disorganised group is better because their ability to see the environment is not constrained by needing to face the same direction as the shoal. In addition to this improved line of sight, they also have less competition because others in the group are slower to react.

The findings provide food for evolutionary thought and indicate the need for closer investigation to better understand the complex dynamics within collective animal behaviour and the importance of individual diversity.

Dr MacGregor said: "Highly organised shoals offer better access to social information and protection from predators. But our research indicated that orderly group behaviour may not always be a good thing when foraging if you are a fish that is very good at obtaining your own 'private' information about new food resources.

"Conflict between individuals over the preferred organisation of the group could explain why shoals of fish spontaneously transition between orderly and disorderly collective behaviour, as they swim to their different strengths."

Credit: 
University of Bristol

Stanford study shows dry air drives overlooked changes in how plants drink and breathe

Plants drink up much of the water that falls to Earth. They take what they need before releasing it through tiny holes on the underside of their leaves, just as people release water vapor with every exhale.

How much a plant drinks and the rate at which it releases water, or transpires, depends partly on moisture levels in the air and soil. Global warming will shift this process more than previously predicted, according to new research from Stanford University.

Published June 1 in Nature Climate Change, the paper shows current climate models underestimate how severely plants ration their water use in response to dry air, and overestimate the effect of dry soil. The results suggest plants in many regions will lock away less water than expected during hot droughts in the future, leaving more water available to percolate into reservoirs, underground aquifers, rivers, lakes and streams.

"This is good news," said study co-author Alexandra Konings, an assistant professor of Earth system science at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth). Yet there is also a dark side to the findings: While water resources may be less diminished, plant growth and carbon uptake will likely suffer more than most models predict.

"Whether plants will fare better in future droughts is a more complex question," said lead author Yanlan Liu, a postdoctoral scholar in Konings' lab. "But now we know plants will use less water than expected."

For agricultural crops, this means the best available estimates of future water needs, growth and vulnerability are "likely to be incorrect" during periods when the atmosphere is very dry, said another of the study's authors, Mukesh Kumar, who is an associate professor of civil, construction and environmental engineering at the University of Alabama.

Atmospheric dryness going 'through the roof'

The scientists looked specifically at a component of climate models that estimates evapotranspiration, which refers to the rate at which Earth's land surface and plants return water to the atmosphere. "So much of the water balance in any given ecosystem goes to evapotranspiration, it has implications for how much water is left over for water resources for people," Konings said. "It also has big effects on weather and climate."

A common modeling approach treats this dynamic process more or less as a function of soil moisture. "That's not realistic because vegetation responds to drought based on the amount of water inside the leaves," Konings said.
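
To make that contrast concrete, here is a purely illustrative sketch of the two ingredients under discussion: a textbook soil-moisture stress factor and an empirical logarithmic response to vapour pressure deficit (after Oren et al. 1999). The parameter values are assumptions, and this is not the Stanford team's hydraulic model.

```python
import numpy as np

def soil_beta(theta, theta_wilt=0.10, theta_fc=0.35):
    """Textbook soil-moisture stress factor: 1 at field capacity, 0 at the
    wilting point, linear in between (parameter values are assumptions)."""
    return float(np.clip((theta - theta_wilt) / (theta_fc - theta_wilt), 0.0, 1.0))

def vpd_downregulation(vpd_kpa, m=0.6, d_ref=1.0):
    """Empirical stomatal response to dry air (after Oren et al. 1999):
    conductance falls roughly with the logarithm of vapour pressure deficit."""
    return float(np.clip(1.0 - m * np.log(max(vpd_kpa, 1e-3) / d_ref), 0.0, 1.0))

# Same moist soil, increasingly dry air: a soil-moisture-only scheme sees no
# stress at all, while the atmospheric-dryness term keeps cutting transpiration.
for vpd in (0.5, 1.5, 3.0):
    print(f"VPD {vpd} kPa: soil factor {soil_beta(0.30):.2f}, "
          f"VPD factor {vpd_downregulation(vpd):.2f}")
```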

Few climate models try to disentangle the effects of dry soil and dry air when predicting changes in evapotranspiration. "The models in use right now work really well if you're averaging wet and dry conditions over multiple years, but not in times of drought," said Konings, who is also a center fellow, by courtesy, at Stanford Woods Institute for the Environment.

This entanglement becomes increasingly problematic under climate change. In some hot spots around the globe, episodes of dangerously humid heat are striking with growing severity and frequency. But as temperatures rise, Konings said, most droughts will be accompanied by relatively dry air. Hotter air can simply hold more water vapor than cooler air, which means the atmosphere becomes less saturated if it heats up without additional water. As a result, while future changes in soil moisture are hard to predict and likely to vary by region, she said, "Atmospheric dryness is going to go through the roof."

Bringing in hydraulics

The researchers modeled the effect of this drying on plants' drinking habits by zooming in on responses in the plant hydraulic system - the pipes and valves inside a plant's roots, stem and leaves. They developed mathematical techniques to derive evapotranspiration rates from a combination of widely available datasets, including records of soil texture, canopy heights, plant types and flows of carbon and water vapor at 40 sites around the world. Then they cross-checked their techniques against limited real-world measurements of evapotranspiration.

The development of a hydraulic model, in itself, is not a first. But the researchers went further, comparing the different model approaches to understand the impact of plant hydraulics under various conditions.

They found the most widely used approaches for estimating evapotranspiration miss about 40 percent of the effect of dry air. This is like a weather forecast that fails to mention wind chill or stifling humidity. The effect is strongest - and current predictions are the most off-base - in places where plants are the least adapted to drought. Konings said, "We were surprised that this had such a big effect."

Credit: 
Stanford's School of Earth, Energy & Environmental Sciences

Carbon nanotube transistors make the leap from lab to factory floor

Carbon nanotube transistors are a step closer to commercial reality, now that MIT researchers have demonstrated that the devices can be made swiftly in commercial facilities, with the same equipment used to manufacture the silicon-based transistors that are the backbone of today's computing industry.

Carbon nanotube field-effect transistors or CNFETs are more energy-efficient than silicon field-effect transistors and could be used to build new types of three-dimensional microprocessors. But until now, they've existed mostly in an "artisanal" space, crafted in small quantities in academic laboratories.

In a study published June 1 in Nature Electronics, however, scientists show how CNFETs can be fabricated in large quantities on 200-millimeter wafers that are the industry standard in computer chip design. The CNFETs were created in a commercial silicon manufacturing facility and a semiconductor foundry in the United States.

After analyzing the deposition technique used to make the CNFETs, Max Shulaker, an MIT assistant professor of electrical engineering and computer science, and his colleagues made some changes to speed up the fabrication process by more than 1,100 times compared to the conventional method, while also reducing the cost of production. The technique deposited carbon nanotubes edge to edge on the wafers, with 14,400 by 14,400 arrays of CNFETs distributed across multiple wafers.

Shulaker, who has been designing CNFETs since his PhD days, says the new study represents "a giant step forward, to make that leap into production-level facilities."

Bridging the gap between lab and industry is something that researchers "don't often get a chance to do," he adds. "But it's an important litmus test for emerging technologies."

Other MIT researchers on the study include lead author Mindy D. Bishop, a PhD student in the Harvard-MIT Health Sciences and Technology program, along with Gage Hills, Tathagata Srimani, and Christian Lau.

Solving the spaghetti problem

For decades, improvements in silicon-based transistor manufacturing have brought down prices and increased energy efficiency in computing. That trend may be nearing its end, however, as increasing numbers of transistors packed into integrated circuits do not appear to be increasing energy efficiency at historic rates.

CNFETs are an attractive alternative technology because they are "around an order of magnitude more energy efficient" than silicon-based transistors, says Shulaker.

Unlike silicon-based transistors, which are made at temperatures around 450 to 500 degrees Celsius, CNFETs also can be manufactured at near-room temperatures. "This means that you can actually build layers of circuits right on top of previously fabricated layers of circuits, to create a three-dimensional chip," Shulaker explains. "You can't do this with silicon-based technology, because you would melt the layers underneath."

A 3D computer chip, which might combine logic and memory functions, is projected to "beat the performance of a state-of-the-art 2D chip made from silicon by orders of magnitude," he says.

One of the most effective ways to build CNFETs in the lab is a method for depositing nanotubes called incubation, where a wafer is submerged in a bath of nanotubes until the nanotubes stick to the wafer's surface.

The performance of the CNFET is dictated in large part by the deposition process, says Bishop, which affects both the number of carbon nanotubes on the surface of the wafer and their orientation. They're "either stuck onto the wafer in random orientations like cooked spaghetti or all aligned in the same direction like uncooked spaghetti still in the package," she says.

Aligning the nanotubes perfectly in a CNFET leads to ideal performance, but alignment is difficult to obtain. "It's really hard to lay down billions of tiny 1-nanometer diameter nanotubes in a perfect orientation across a large 200-millimeter wafer," Bishop explains. "To put these length scales into context, it's like trying to cover the entire state of New Hampshire in perfectly oriented dry spaghetti."
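
The back-of-the-envelope scaling behind that analogy can be checked directly. Assuming a dry-spaghetti diameter of roughly 1.7 millimeters (our assumption, not a figure from the paper), the numbers land in the same ballpark.

```python
# Back-of-the-envelope check of the "New Hampshire in spaghetti" analogy.
d_nanotube = 1e-9       # m, the ~1-nanometer nanotube diameter quoted above
d_spaghetti = 1.7e-3    # m, assumed dry-spaghetti diameter (our assumption)
wafer_width = 0.200     # m, a 200-millimeter wafer

scale = d_spaghetti / d_nanotube              # ~1.7 million-fold blow-up
scaled_wafer_km = wafer_width * scale / 1000  # wafer width at spaghetti scale
print(f"scale factor ~{scale:.1e}")
print(f"200 mm wafer becomes ~{scaled_wafer_km:.0f} km across")
# Roughly 340 km, comparable to New Hampshire's ~300 km north-south extent.
```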

The incubation method, while practical for industry, doesn't align the nanotubes at all. They end up on the wafer more like cooked spaghetti, which the researchers initially didn't think would deliver sufficiently high CNFET performance, Bishop says. After their experiments, however, she and her colleagues concluded that the simple incubation process would work to produce a CNFET that could outperform a silicon-based transistor.

CNFETs beyond the beaker

Careful observations of the incubation process showed the researchers how to alter the process to make it more viable for industrial production. For instance, they found that dry cycling, a method of intermittently drying out the submerged wafer, could dramatically reduce the incubation time -- from 48 hours to 150 seconds.

Another new method called ACE (artificial concentration through evaporation) deposited small amounts of nanotube solution on a wafer instead of submerging the wafer in a tank. The slow evaporation of the solution increased the concentration of carbon nanotubes and the overall density of nanotubes deposited on the wafer.

These changes were necessary before the process could be tried on an industrial scale, Bishop says: "In our lab, we're fine to let a wafer sit for a week in a beaker, but for a company, they don't have that luxury."

The "elegantly simple tests" that helped them understand and improve on the incubation method, she says, "proved really important for addressing concerns that maybe academics don't have, but certainly industry has, when they look at setting up a new process."

The researchers worked with Analog Devices, a commercial silicon manufacturing facility, and SkyWater Technology, a semiconductor foundry, to fabricate CNFETs using the improved method. They were able to use the same equipment that the two facilities use to make silicon-based wafers, while also ensuring that the nanotube solutions met the strict chemical and contaminant requirements of the facilities.

"We were extremely lucky to work closely with our industry collaborators and learn about their requirements and iterate our development with their input," says Bishop, who noted that the partnership helped them develop an automated, high-volume and low-cost process.

The two facilities showed a "serious commitment to research and development and exploring the edge" of emerging technologies, Shulaker adds.

The next steps, already underway, will be to build different types of integrated circuits out of CNFETs in an industrial setting and explore some of the new functions that a 3D chip could offer, he says. "The next goal is for this to transition from being academically interesting to something that will be used by folks, and I think this is a very important step in this direction."

Credit: 
Massachusetts Institute of Technology

The interface of genomic information with the electronic health record

Advances in genetic and genomic testing technology have not only introduced the use of clinical genomic information into virtually every area of medical care; this testing has also become an essential tool for achieving the goal of precision medicine. As genomic data become more complex, so too must the electronic health record (EHR) evolve to provide optimal care for patients, maximizing benefits while minimizing harm. Issues of patient autonomy, access, genetic literacy, privacy and protection, and transferability of data, as well as the appropriate genomic data set, are key to facilitating the incorporation of genomic information into patient care.

In an effort to provide practical guidance and important considerations regarding how genomic information can be incorporated into electronic health records, the American College of Medical Genetics and Genomics (ACMG) has released, "The interface of genomic information with the electronic health record: a points to consider statement of the American College of Medical Genetics and Genomics (ACMG)."

"The electronic health record serves as a powerful interactive tool in improving the healthcare of patients and populations," said Terri Grebe, MD, FACMG Chair of the ACMG Social, Ethical, and Legal Issues Committee. "As an integral component of medical treatment, genomic data in the EHR must therefore be continuously and easily accessible to both patients and providers, while simultaneously receiving appropriate privacy protection, to achieve the goal of personalized medicine. This ACMG document provides guidelines on the storage and access of genomic information, improvements in EHR systems, and ethical issues surrounding the sharing of genomic data."

This new ACMG points to consider document addresses types of genomic information in the EHR, mechanisms of placement, data entry, usage, patient/provider access, results disclosure, portability, and privacy. It highlights patient, family, and societal benefits, discusses areas of concern and identifies where further modifications are needed, and makes recommendations for further optimization. It also highlights unique characteristics of genomic information that require additional attention as they relate to universal bioethical principles.

A few of the specific points to consider include:

Genetic data in the medical record should be readily and continuously accessible to the patient, including test results, secondary findings, AND the clinician's interpretation.

Caution should be exercised in assessing the quality and medical actionability of outside results from other institutions and laboratories uploaded to the EHR by the patient, particularly direct-to-consumer testing companies. These results would be best stored in a separate section of the EHR or flagged in such a way as to clarify the origin of the report.

Further optimization of the interoperability of EHR networks is encouraged to allow separate institutions that provide care to the same patient to view the patient's genetic data, furthering coordinated care and minimizing the risk of duplicate testing, with its attendant waste of resources. The use by EHR vendors of standards such as the Health Level 7 (HL7) genomics model and Fast Healthcare Interoperability Resources (FHIR), including the emerging FHIR genomics standards, is encouraged.

In the future development/revision of EHRs, the ability to easily retrieve genomic information will be vital to enable targeted testing for family members, facilitating cost reduction, earlier diagnosis and treatment.

Informed consent should be adapted to reflect these points to consider and be explicit about the right of access, the mechanism of access, the delayed release of certain results, and the potential usage of personal genomic information by the ordering institution as well as by outside agencies such as public health programs and genomic databases.

The statement concludes that further research is needed to determine the optimal approaches for patient access to and use of genomic information in the EHR, as well as protecting patient privacy and avoiding harm. While direct patient access to the EHR is appropriate and will facilitate patients' involvement in their own health care, it is not a substitute for face-to-face interaction, which remains the ideal method of communication of potentially life-altering personal health information. These points to consider should be viewed as guidance for the ordering provider, clinical geneticist, laboratory geneticist and genetic counselor, and for institutions and vendors. They are intended to assist providers, institutions and vendors to develop policies and procedures that optimize the use of the EHR in the delivery of healthcare to maximize patient benefit, minimize harm, improve population health and decrease healthcare costs.

Credit: 
American College of Medical Genetics and Genomics

Aiming for an enduring relationship

image: Why do some couples stay together yet others split up? Timing and desire for commitment are key indicators, according to new insights from SMU Assistant Professor Kenneth Tan.

Image: 
Singapore Management University

SMU Office of Research & Tech Transfer - Are you ready for love? It's an age-old question that has inspired pop songs and romantic literature, as well as fuelling advice columns in celebrity magazines. But will your love endure, or is it just a fling?

The spark of mutual attraction may remain a mystery but there's a science to relationships that can help predict outcomes, according to recent research co-authored by Kenneth Tan, Assistant Professor of Psychology in the School of Social Sciences at Singapore Management University (SMU).

A pertinent factor is timing, a subjective sense that now is the right moment to be intimately involved with someone on an ongoing basis.

"We see from the research that timing is important in that it has an influence on boosting - or undermining - relationship commitment," Professor Tan says.

For the paper, 'It's about time: Readiness, commitment and stability in close relationships', the researchers introduced the construct of commitment readiness into the larger theory of relationship receptivity and tested it with data collected across five studies of people currently involved in romantic relationships.

The researchers found that a higher degree of readiness was associated with higher commitment to a relationship. By controlling for commitment at one time point, they also showed that readiness temporally preceded future increases in commitment.

Readiness also predicted relationship maintenance over and above commitment, and was uniquely associated with more self-disclosure. Although not associated with overall accommodation of transgressions, readiness was associated with less neglect and fewer exit strategies, both of which are destructive forms of relationship behaviour.

Interestingly, readiness was also associated with less loyalty, suggesting that although individuals who were more ready engaged in less destructive responses to conflict, they wouldn't passively wait for things to get better.

No gender differentiation appeared in the initial findings but Professor Tan notes that females may feel more ready if they sense their biological clock is ticking. And he believes the research would also hold for same-sex relationships.

But it's still unclear what gives rise to a sense of being ready for a committed relationship.

"Of course we have some preliminary ideas in mind, [such as] how secure you feel, your self-esteem, how much you are prioritising a relationship over other issues, and so forth. That's [part of] the next step in the research," Professor Tan says.

Yearning for a long-term commitment

In a related paper, 'Seeking and ensuring interdependence: Desiring commitment and the strategic initiation and maintenance of close relationships', the researchers considered how the intensity of longing for an enduring connection impacted on the likelihood of a successful ongoing partnership.

Again using empirical data, the researchers examined personal attitudes to interdependence through the lens of commitment desirability, which is defined as the subjective desire to be involved in a committed romantic relationship.

In a new insight, the researchers contend that it's not only the level of commitment that's relevant, it's also how much you want to be in a committed relationship: the strength of the desire.

"It's not that commitment doesn't matter. What we see is that basically [people] have this readiness, or desire, [which has] an added effect on commitment itself. So they sort of work hand in hand," Professor Tan says.

The evidence from three studies showed that, in their efforts to have long-lasting relationships, individuals who desire commitment gauge whether a potential partner shares that desire, and use this perception to think and behave in ways that facilitate and promote relationship success, as well as to protect themselves against getting too close to a partner who is not also interested in commitment.

But there may be a downside. One of the studies suggested that relying on high commitment desire runs the risk of getting into a relationship with someone who would provide security and need fulfilment in the long term but who is not an especially responsive partner.

In that particular study, the researchers only looked at two kinds of partner - highly responsive and moderately responsive. "We didn't have a low responsive partner because it's more likely that someone won't choose a low responsive partner," explains Professor Tan.

Highly responsive partners were described as "really understanding, really caring, and really trying to validate you as a potential partner".

"Whereas for a moderately responsive one, we said that they were kind of caring and validating, but they also needed their own space. For the most part they seemed like average people," he adds.

Professor Tan points out there are other considerations in maintaining relationships. For example, if there's a growth mindset: thinking that challenges can be surmounted, that relationships can be built to become better, and that having something for the long term is good because you can still work on things and can progress to become more successful in the future.

Implications for public policy

It's arguable that a human urge to form intimate couples is timeless. But rising divorce rates and the popularity of arrangements such as 'hooking up' and 'friends with benefits' suggest not everyone is receptive to the long haul. And the convenience of online dating sites opens up new impermanent opportunities.

"We started noticing, in terms of demographics, that people are getting married later, people are actually saying they don't want a relationship right now because of certain priorities, they are no longer interested, or they have become resigned to not having a relationship," Professor Tan says.

"We started to wonder why that was the case. So we're trying to tease that apart. I think this is also a question that pertains to public policy: whether it is in the U.S., where we first started thinking about it, or here [in Singapore] where we're thinking about how to boost dating rates, marriage rates, fertility rates and so on."

Future research will consider how readiness can be changed, or how people can be motivated to become more ready, and so the team is looking at antecedents to readiness and desire.

"And [we're] looking at the interaction between readiness and relationship status on wellbeing, and also whether that has any implications on feeling like people's single-hood has been stereotyped, in a sense, and whether that has any negative consequences," Professor Tan says.

He offers questions such as: "Are you satisfied with life? Do you find there's less meaning because you are not partnered with someone but nonetheless, you are currently ready for one?"

The researchers' analysis of romantic human behaviour has a long way to run.

Credit: 
Singapore Management University

Universal virus detection platform to expedite viral diagnosis?

image: Schematics of the reactive polymer-coated surface for dsRNA capture and detection.

Image: 
KAIST

Prompt, precise, and large-scale detection of viruses is key to combating infectious diseases such as COVID-19. A new viral diagnostic strategy that uses reactive polymer-grafted surfaces to capture double-stranded RNAs will serve as a pre-screening test for a wide range of viruses with enhanced sensitivity.

Currently, the most widely used viral detection methodology is polymerase chain reaction (PCR) diagnosis, which amplifies and detects a piece of the viral genome. Prior knowledge of the relevant primer nucleic acids of the virus is essential for this test.

The detection platform developed by KAIST researchers identifies viral activities without amplifying specific nucleic acid targets. The research team, co-led by Professor Sheng Li and Professor Yoosik Kim from the Department of Chemical and Biomolecular Engineering, constructed a universal virus detection platform by utilizing the distinct features of the PPFPA-grafted surface and double-stranded RNAs.

The key principle of this platform is utilizing the distinct feature of reactive polymer-grafted surfaces, which serve as a versatile platform for the immobilization of functional molecules. These activated surfaces can be used in a wide range of applications including separation, delivery, and detection. As long double-stranded RNAs are common byproducts of viral transcription and replication, these PPFPA-grafted surfaces can detect the presence of different kinds of viruses without prior knowledge of their genomic sequences.

"We employed the PPFPA-grafted silicon surface to develop a universal virus detection platform by immobilizing antibodies that recognize double-stranded RNAs," said Professor Kim.

To increase detection sensitivity, the research team devised a two-step detection process analogous to a sandwich enzyme-linked immunosorbent assay, in which the bound double-stranded RNAs are visualized using fluorophore-tagged antibodies that also recognize the RNAs' double-stranded secondary structure.

By utilizing the developed platform, long double-stranded RNAs can be detected and visualized from an RNA mixture as well as from total cell lysates, which contain a mixture of various abundant contaminants such as DNAs and proteins.

The research team successfully detected elevated levels of hepatitis C and A viruses with this tool.

"This new technology allows us to take on virus detection from a new perspective. By targeting a common biomarker, viral double-stranded RNAs, we can develop a pre-screening platform that can quickly differentiate infected populations from non-infected ones," said Professor Li.

"This detection platform provides new perspectives for diagnosing infectious diseases. This will provide fast and accurate diagnoses for an infected population and prevent the influx of massive outbreaks," said Professor Kim.

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

These flexible feet help robots walk faster

image: An off-the-shelf six-legged robot equipped with the feet designed by UC San Diego engineers can walk up to 40 percent faster than when not equipped with the feet.

Image: 
University of California San Diego

Roboticists at the University of California San Diego have developed flexible feet that can help robots walk up to 40 percent faster on uneven terrain such as pebbles and wood chips. The work has applications for search-and-rescue missions as well as space exploration.

"Robots need to be able to walk fast and efficiently on natural, uneven terrain so they can go everywhere humans can go, but maybe shouldn't," said Emily Lathrop, the paper's first author and a Ph.D. student at the Jacobs School of Engineering at UC San Diego.

The researchers will present their findings at the RoboSoft conference which takes place virtually May 15 to July 15, 2020.

"Usually, robots are only able to control motion at specific joints," said Michael T. Tolley, a professor in the Department of Mechanical and Aerospace Engineering at UC San Diego and senior author of the paper. "In this work, we showed that a robot that can control the stiffness, and hence the shape, of its feet outperforms traditional designs and is able to adapt to a wide variety of terrains."

The feet are flexible spheres made from a latex membrane filled with coffee grounds. Structures inspired by nature, such as plant roots, and by man-made solutions, such as piles driven into the ground to stabilize slopes, are embedded in the coffee grounds.

The feet allow robots to walk faster and grip better because of a mechanism called granular jamming that allows granular media, in this case the coffee grounds, to go back and forth between behaving like a solid and behaving like a liquid. When the feet hit the ground, they firm up, conforming to the ground underneath and providing solid footing. They then unjam and loosen up when transitioning between steps. The support structures help the flexible feet remain stiff while jammed.

It's the first time that such feet have been tested on uneven terrain, like gravel and wood chips.

The feet were installed on a commercially available hexapod robot. Researchers designed and built an on-board system that can generate negative pressure to control the jamming of the feet, as well as positive pressure to unjam the feet between each step. As a result, the feet can be actively jammed, with a vacuum pump removing air from between the coffee grounds and stiffening the foot. But the feet also can be passively jammed, when the weight of the robot pushes the air out from between the coffee grounds inside, causing them to stiffen.

Researchers tested the robot walking on flat ground, wood chips and pebbles, with and without the feet. They found that passive jamming feet perform best on flat ground but active jamming feet do better on loose rocks. The feet also helped the robot's legs grip the ground better, increasing its speed. The improvements were particularly significant when the robot walked up sloped, uneven terrain.

"The natural world is filled with challenging grounds for walking robots---slippery, rocky, and squishy substrates all make walking complicated," said Nick Gravish, a professor in the UC San Diego Department of Mechanical and Aerospace Engineering and study coauthor. "Feet that can adapt to these different types of ground can help robots improve mobility."

In a companion paper co-authored by Tolley and Gravish, with Ph.D. student Shivan Chopra as first author, researchers quantified exactly how much improvement each foot generated. For example, compared with a fully rigid foot, the flexible foot reduced the depth of penetration into sand on impact by 62 percent and reduced the force required to pull the foot out by 98 percent.

Next steps include incorporating soft sensors on the bottom of the feet to allow an electronic control board to identify what kind of ground the robot is about to step on and whether the feet need to be jammed actively or passively.

Researchers will also keep working to improve design and control algorithms to make the feet more efficient.

Credit: 
University of California - San Diego

Developing a digital holography-based multimodal imaging system to visualize living cells

image: 3D fluorescence imaging results for Physcomitrella patens. Below: The yellow arrows indicate fluorescent images of in-focus nuclei.

Image: 
Kobe University

A research group led by Kobe University's Professor MATOBA Osamu (Organization for Advanced and Integrated Research) has successfully achieved 3D fluorescence and phase imaging of living cells based on digital holography. They used plant cells with fluorescent protein markers in their nuclei to demonstrate this imaging system.

The group consisted of Project Assistant Professor Manoj KUMAR and Assistant Professor Xiangyu QUAN (both of the Graduate School of System Informatics), Professor AWATSUJI Yasuhiro (Kyoto Institute of Technology) and Associate Professor TAMADA Yosuke (Utsunomiya University).

This technology will form a foundation for living cell imaging, which is indispensable in the life sciences field. It is also expected that using this technology to visualize stem cell processes in plants will increase our understanding of them.

These research results appeared in the journal Scientific Reports published by Springer Nature on May 15.

Main Points

Creation of an integrated system that can perform high-speed 3D fluorescence and quantitative phase imaging without scanning.

The mathematics behind the integrated system's fluorescence imaging was formulated.

Each of the spatial 3D fluorescence and phase images was generated from a single 2D image. Time-lapse observations are also possible.

It is expected that this technology will be applied to visualize stem cell formation processes in plants.

Research Background

The optical microscope was invented at the end of the sixteenth century and the British scientist Robert Hooke was the first to discover cells in the mid-seventeenth century. The invention has been developed into new technologies such as the phase-contrast microscope (1953 Nobel Prize in Physics), which allows living cells to be observed without the need to stain them, and fluorescent cellular imaging, in which specific molecules are marked using fluorescent proteins (2008 Nobel Prize in Chemistry) and observed in living cells. These have become essential tools for observation in life sciences and medical fields.

Phase imaging uses differences in the optical path length of light as it passes through a biological sample to reveal structural information about it. Fluorescence imaging provides information about specific molecules inside the biological sample and can reveal their functions. Intracellular structure and motility are complex, however, and the ability to visualize multidimensional physical information combining phase and fluorescence imaging would be useful for understanding them. An imaging system that could generate such varied physical information simultaneously and instantaneously from 3D living cells would serve as a foundation technology for bringing about innovation in biology.

The hybrid multimodal imaging system constructed in this study can obtain phase and fluorescent 3D information in a single shot. It enables researchers to quantitatively and simultaneously visualize a biological sample's structural or functional information using a single platform.

Research Methodology

In this study, the researchers constructed a multimodal digital holographic microscope that could record a sample's fluorescent information and phase information simultaneously (Figure 1). This uses digital holography as a base, whereby interfered light information from the object is recorded and then optical calculations made by computer are used to generate 3D spatial information about the object.

The microscope in the study is composed of two different optical systems. The first is the holographic 3D fluorescence imaging system, shown on the right-hand side of Figure 1. In order to obtain the 3D fluorescence information, a spatial light modulator is used to split the fluorescent light emitted from fluorescent molecules into two light waves that can interfere with one another.

At this point, the 3D information from the object light is preserved by giving one of the lightwaves a slightly different curvature radius and propagation direction; at the same time, the two lightwaves share an optical pathway (mostly proceeding along the same axis), allowing a temporally stable interference measurement to be taken. This optical system was formulated mathematically in this study, clarifying the recorded interference intensity distribution for the first time. The formula enabled the researchers to find experimental conditions that would improve the quality of the reconstructed fluorescent images. They were able to generate three-dimensional images of living cells and their structures by applying this fluorescent 3D holographic system. The molecules and structures of living cells were labelled with fluorescent proteins, allowing their dynamic behaviors to be observed through this new 3D fluorescence microscopy.

The imaging technology developed by this study enables 3D images, which up until now have taken comparatively longer to generate via laser scanning, to be generated in a single shot without the need for scanning.

The second, pictured on the left of Figure 1, is the holographic 3D phase imaging system. A living plant cell is made up of components such as a nucleus, mitochondria, chloroplasts, and a thin cell wall. It is possible to visualize the structures of these components from the differences in phase (optical path length). In this system, a Mach-Zehnder interferometer provides a plane reference wave, making it easy to obtain the optimum interference fringes for each measured object.

This digital holographic microscope was created by unifying fluorescence and phase measurement systems.

Subsequently, this microscope was used to visualize living plant cells. An experiment was carried out to demonstrate that 4D observations, combining three spatial dimensions with time, are possible. The moss Physcomitrella patens and fluorescent beads with a mean size of 10 μm were used. Figures 2 and 3 show the experimental results for simultaneous 3D fluorescence and phase imaging of the moss. Figure 2 (b) shows that seven nuclei are visible when a conventional full-field fluorescence microscope is used. Of these seven, the nuclei numbered 1, 2, and 4 are in focus; the nuclei at other depths, however, appear only as weak, blurred fluorescent signals. To solve this issue, the proposed method extracts the fluorescent lightwave information from the fluorescent hologram in Figure 2 (c). Based on this information, a numerical Fresnel propagation algorithm can reconstruct fluorescent images at any depth or distance. Therefore, in-focus images of multiple fluorescent nuclei at different depths were restored from a single image without scanning.
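
As an aside for readers curious about the refocusing step, numerical Fresnel propagation can be implemented in a few lines with fast Fourier transforms. The sketch below is a generic transfer-function implementation with made-up wavelength, pixel pitch, and depth values, not the group's actual reconstruction code.

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Numerically refocus a complex 2D field by distance z using the Fresnel
    transfer-function method (the constant exp(ikz) phase factor is dropped)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Hypothetical object wave extracted from a hologram (placeholder data);
# sweep a few depths and score sharpness, e.g. by intensity variance.
rng = np.random.default_rng(0)
object_wave = np.exp(1j * 2 * np.pi * rng.random((512, 512)))
for z in (10e-6, 20e-6, 35e-6):               # candidate refocus depths in metres
    refocused = fresnel_propagate(object_wave, wavelength=520e-9, dx=0.2e-6, z=z)
    print(f"z = {z * 1e6:.0f} um, sharpness = {np.var(np.abs(refocused)**2):.3f}")
```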

Figure 2 (d), (e), and (f) show the reconstructed images at three different planes. The axial distance between (d) and (e) is 10 micrometers, and 15 micrometers between (e) and (f). The yellow arrows indicate the nuclei that are in focus. It is possible to see which of the seven nuclei in the fluorescence image are in focus across the three planes.

Figure 3 shows the quantitative phase distribution for the three planes at different depths. It was possible to visualize individual chloroplasts (there are many around the edges of the cells, as indicated by the red peaks in Figure 3 (b)). The cell thickness, as calculated from the measured quantitative phase values, was about 17 micrometers, a size very close to other reference values.

Further Developments

In this study a multimodal digital holographic microscope was developed, capable of simultaneous 3D phase and fluorescence measurements. With the ability to conduct quantitative phase and fluorescence imaging at the same time, it is believed that this method will serve as a new foundation technology for the visualization of living biological tissues and cells. In particular, it has been shown that this microscope can be applied to complex plant cells. It could be utilized to gain a diversified understanding of the stem cell formation process in plants, which reproduce more easily than animal cells. In the future, it may be possible to use this information to control the process of stem cells via stimulation by light. Efficient plant reproduction and growth achieved through the proposed imaging and future stimulation could be applied to the development of a food cultivation system.

This technology could be developed by further improving the light usage efficiency. In digital holography, it is necessary to spatially widen the beam diameter, split it into two, and then overlap them again in order to use the interference between the two paths of light. Therefore, it is also necessary to increase the fluorescent energy for it to be observed by the image sensor. In order to achieve this, a large amount of light energy would be required to illuminate the living cells, however, cellular damage caused by the light would be a big problem. It is thought that a volume hologram could be used to boost light usage efficiency while avoiding cellular phototoxicity.

Another issue is that the reconstructed 3D distribution extends into the light propagation direction (the object's depth direction), decreasing axial resolution. The researchers are working on methods using deep learning and filtering in order to suppress this extension in depth direction and enhance image quality.

Credit: 
Kobe University

Researchers have developed a first-principles quantum Monte Carlo package called TurboRVB

image: Schematic figure of the TurboRVB workflow [K. Nakano et al. J. Chem. Phys. 152, 204121 (2020)]. The code implements flexible many-body wave function ansatz, such as JSD: Jastrow Slater, JAGP: Jastrow Geminal, and JPf: Jastrow Pfaffian. One can prepare a trial wave function using a built-in density functional theory (DFT) code and perform subsequent first-principles variational quantum Monte Carlo (VMC) and lattice discretized diffusion quantum Monte Carlo (LRDMC) calculations. Since forces acting on atoms can be computed, structural optimizations and molecular dynamics simulations are also feasible in TurboRVB.

Image: 
AIP Publishing

First-principles quantum Monte Carlo is a framework used to tackle the solution of the many-body Schrödinger equation by means of a stochastic approach. This framework is expected to be the next generation of electronic structure calculations because it can overcome some of the drawbacks in density functional theory and wavefunction-based calculations. In particular, the quantum Monte Carlo framework does not rely on exchange-correlation functionals, the algorithm is well suited for massively parallel supercomputers, and it is easily applicable to both isolated and periodic systems.

"TurboRVB" is a first-principles quantum Monte Carlo software package that was initially launched by Prof. Sandro Sorella (International School for Advanced Studies/Italy) and Dr. Michele Casula (Sorbonne University/France), and has been continuously developed by many contributors for over 20 years. Very recently, Assist. Prof. Kosuke Nakano at Japan Advanced Institute of Science and Technology (JAIST, President: Minoru Terano, located at Nomi, Ishikawa, Japan) and his collaborators have published a comprehensive review paper in The Journal of Chemical Physics [K. Nakano et al. J. Chem. Phys. 152, 204121, 2020, DOI: 10.1063/5.0005037].

TurboRVB is distinguished from other first-principles quantum Monte Carlo codes by the following features. (a) The code employs resonating valence bond (RVB)-type wave functions, such as the Jastrow Geminal and Jastrow Pfaffian, which include correlation effects beyond the Jastrow-Slater wave function commonly used in other QMC codes. (b) State-of-the-art optimization algorithms, such as stochastic reconfiguration and the linear method, are implemented, helping to realize a stable optimization of the amplitude and nodal surface of a many-body wave function at the variational quantum Monte Carlo level. (c) The so-called lattice-regularized diffusion Monte Carlo method is implemented in the code, which provides numerically stable diffusion quantum Monte Carlo calculations. (d) The implementation of adjoint algorithmic differentiation allows derivatives of many-body wave functions to be computed very efficiently, making structural optimizations and molecular dynamics simulations possible.
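
For readers unfamiliar with variational Monte Carlo, the toy sketch below shows the basic idea, Metropolis sampling of |ψ|² and averaging of a local energy, for a single hydrogen atom with a one-parameter trial wavefunction. It is a conceptual illustration only and bears no relation to TurboRVB's wave functions, optimizers, or code.

```python
import numpy as np

rng = np.random.default_rng(1)

def local_energy(r, alpha):
    # Hydrogen atom, trial wavefunction psi = exp(-alpha * r) (atomic units):
    # E_L(r) = -alpha**2 / 2 + (alpha - 1) / r, exactly -0.5 Ha when alpha = 1.
    return -0.5 * alpha**2 + (alpha - 1.0) / r

def vmc_energy(alpha, n_steps=50_000, step=0.5):
    """Metropolis sampling of |psi|^2 followed by averaging the local energy."""
    pos = np.array([0.5, 0.0, 0.0])
    r = np.linalg.norm(pos)
    total = 0.0
    for _ in range(n_steps):
        trial = pos + step * rng.uniform(-1.0, 1.0, size=3)
        r_trial = np.linalg.norm(trial)
        # Acceptance probability |psi(trial) / psi(pos)|^2 = exp(-2 alpha (r' - r))
        if rng.random() < np.exp(-2.0 * alpha * (r_trial - r)):
            pos, r = trial, r_trial
        total += local_energy(r, alpha)
    return total / n_steps

for alpha in (0.8, 1.0, 1.2):
    print(f"alpha = {alpha}: E ~ {vmc_energy(alpha):.3f} Ha")
```

The variational principle guarantees the estimated energy is lowest at the best trial wavefunction, here alpha = 1, which is what a run of the sketch shows within statistical noise.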

Credit: 
Japan Advanced Institute of Science and Technology

A good egg: Robot chef trained to make omelettes

image: Robot arm preparing an omelette.

Image: 
University of Cambridge

A team of engineers have trained a robot to prepare an omelette, all the way from cracking the eggs to plating the finished dish, and refined the 'chef's' culinary skills to produce a reliable dish that actually tastes good.

The researchers, from the University of Cambridge in collaboration with domestic appliance company Beko, used machine learning to train the robot to account for highly subjective matters of taste. The results are reported in the journal IEEE Robotics and Automation Letters, and will be available online as part of the virtual IEEE International Conference on Robotics and Automation (ICRA 2020).

A robot that can cook has been an aspiration of sci-fi authors, futurists, and scientists for decades. As artificial intelligence techniques have advanced, commercial companies have built prototype robot chefs, although none of these are currently commercially available, and they lag well behind their human counterparts in terms of skill.

"Cooking is a really interesting problem for roboticists, as humans can never be totally objective when it comes to food, so how do we as scientists assess whether the robot has done a good job?" said Dr Fumiya Iida from Cambridge's Department of Engineering, who led the research.

Teaching a robot to prepare and cook food is a challenging task, since it must deal with complex problems in robot manipulation, computer vision, sensing and human-robot interaction, and produce a consistent end product.

In addition, taste differs from person to person - cooking is a qualitative task, while robots generally excel at quantitative tasks. Since taste is not universal, universal solutions don't exist. Unlike other optimisation problems, special tools need to be developed for robots to prepare food.

Other research groups have trained robots to make cookies, pancakes and even pizza, but these robot chefs have not been optimised for the many subjective variables involved in cooking.

Egg dishes, omelettes in particular, have long been considered a test of culinary skill. A popular piece of French culinary mythology states that each of the one hundred pleats in a chef's hat represents a different way to cook an egg, although the exact origin of this adage is unknown.

"An omelette is one of those dishes that is easy to make, but difficult to make well," said Iida. "We thought it would be an ideal test to improve the abilities of a robot chef, and optimise for taste, texture, smell and appearance."

In partnership with Beko, Iida and his colleagues trained their robot chef to prepare an omelette, from cracking the eggs through to plating the finished dish. The work was performed in Cambridge's Department of Engineering, using a test kitchen supplied by Beko plc and Symphony Group.

The machine learning technique developed by Iida's team makes use of a statistical tool, called Bayesian Inference, to squeeze out as much information as possible from the limited amount of data samples, which was necessary to avoid over-stuffing the human tasters with omelettes.

"Another challenge we faced was the subjectivity of human sense of taste - humans aren't very good at giving absolute measures, and usually give relative ones when it comes to taste," said Iida. "So we needed to tweak the machine learning algorithm - the so-called batch algorithm - so that human tasters could give information based on comparative evaluations, rather than sequential ones."

But how did the robot measure up as a chef? "The omelettes in general tasted great - much better than expected!" said Iida.

The results show that machine learning can be used to obtain quantifiable improvements in food optimisation. Additionally, such an approach can be easily extended to multiple robotic chefs. Further studies have to be conducted to investigate other optimisation techniques and their viability.

"Beko is passionate about designing the kitchen of the future and believe robotics applications such as this will play a crucial part. We are very happy to be collaborating with Dr Iida on this important topic," said Dr Graham Anderson, the industrial project supervisor from Beko's Cambridge R&D Centre.

Credit: 
University of Cambridge

New NiMH batteries perform better when made from recycled old NiMH batteries

image: This is Dag Noréus, professor at the Department of Materials and Environmental Chemistry.

Image: 
Photo: Niklas Björling

A new method for recycling old batteries can provide better-performing and cheaper rechargeable nickel-metal hydride (NiMH) batteries, as shown in a new study by researchers at Stockholm University.

"The new method allows the upcycled material to be used directly in new battery production," says Dag Noréus, professor at the Department of Materials and Environmental Chemistry at Stockholm University who, together with other researchers, has conducted the study published in the scientific journal Molecules.

The new recycling consists of mechanical washing and separation of reusable electrode material and corrosion products from old, used electrodes.

"More than 95 percent is useful and several steps can be saved in the manufacture of new batteries that also get better performance. Recycling will be easier as it avoids costly remelting and reduction included in the conventional battery recycling."

Hydride batteries, so-called NiMH batteries, are based on a nickel electrode and a hydrogen electrode, where the hydrogen is stored in a metal hydride. The battery is one of the four basic types of rechargeable batteries available on the market today. The others are based on lithium, nickel-cadmium or lead.

Used in hybrid vehicles and electric toothbrushes

NiMH batteries were developed during the 1990s and are used, for example, in hybrid vehicles such as the Toyota Prius, but also in electric toothbrushes and electric razors, that is, in appliances used near the body, where you want safer batteries that do not risk exploding like those of lithium. NiMH is also considered more environmentally friendly as it does not contain toxic heavy metals.

The research was initially focused on finding metal hydrides that could store large amounts of hydrogen in solid form. When successful, the first use was in rechargeable batteries. The high hydrogen content doubled the battery capacity compared to batteries of nickel-cadmium.

"The new thing with our study is that the material gains better properties when used in new batteries after passing a simplified recycling process. This is not a new battery, but a significant improvement of the usefulness of rechargeable hydride batteries."

Breakthrough during the Second World War

Developing rechargeable battery chemistries is time-consuming. The breakthrough for nickel-cadmium batteries, which could be used in portable electronics, came during the Second World War.

"It is difficult and takes a long time to get a rechargeable battery chemistry, despite the fact that one could basically build an electrochemical cell or battery of virtually all chemical reactions. The first rechargeable battery, based on lead, was introduced in the mid-1800s. Then we had to wait until the early 1900 for the rechargeable nickel-cadmium battery. Unfortunately, both were based on environmentally hazardous metals such as lead and cadmium."

The difficulty of finding workable battery chemistries is further reflected in the fact that it took almost until the end of the 20th century before NiMH and lithium batteries emerged.

Modern rechargeable batteries contain rare materials and materials that have undergone complicated and costly processing steps to work well, explains Dag Noréus. When the battery cells are manufactured, they must be activated, during which they undergo a number of careful charge and discharge cycles according to a special schedule.

"It can be compared to the run-in period that previously newly purchased cars had to go through, when they were delivered from the factory. The formation of the batteries is done at the battery factory. It requires time and investments. Much of this can now be saved by using already activated material," says Dag Noréus.

Credit: 
Stockholm University

Russian scientists improve batteries for sensors

image: Scientist works on Atomic Layer Deposition equipment

Image: 
Peter the Great St.Petersburg Polytechnic University

Researchers at Peter the Great St. Petersburg Polytechnic University (SPbPU) have moved a step closer to creating a solid-state thin-film battery for miniature devices and sensors. The results of the study were published in a special issue dedicated to improved materials for lithium- and sodium-ion batteries (the journal Energies, MDPI).

The development of miniature devices such as biosensors, smartwatches and Internet of Things (IoT) devices requires small, complex power supplies with a high energy density. According to experts, traditional technologies for lithium-ion battery production are reaching their limits: it is difficult to shrink the power source further, or to control its shape, within the required dimensions. Microelectronic techniques such as atomic layer deposition, however, can enable the production of miniature solid-state lithium-ion batteries with high specific energy.

"We were able to obtain the cathode material, lithium nickelate using the Atomic Layer Deposition method, which allows setting the thickness of the films with high precision", said Dr. Maxim Maximov of High School of Materials Physics and Technologies, Institute of Mechanical Engineering, Materials and Transport SPbPU.

He noted that the researchers demonstrated high specific capacities at increased discharge currents, which can improve the performance and efficiency of devices as well as reduce their size.

According to the scientist, the production of thin-film positive electrodes based on lithium nickelate and lithium mixed oxides with a high nickel content is a major step toward the creation of efficient solid-state batteries, which are safe because they contain no liquid electrolyte.

Credit: 
Peter the Great Saint-Petersburg Polytechnic University

Clever computing puts millions into charities' hands

Charities can now begin accessing millions of pounds more in donations thanks to a small shift in how people can donate.

Gift Aid is a UK government tax relief that allows registered charities to reclaim the basic-rate tax the donor has already paid on the donated funds.

For every £10 that is donated to a UK charity, the charity can claim a further £2.50 in Gift Aid.
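
The 25 per cent figure comes straight from the basic rate of income tax (20 per cent): a £10 gift is treated as an amount on which tax has already been paid, so its gross equivalent is £10 / (1 - 0.2) = £12.50, and the charity reclaims the £2.50 difference. The short snippet below simply illustrates that arithmetic; the function and variable names are ours, not part of any Gift Aid system.

# Illustrative arithmetic only (not Swiftaid code): Gift Aid reclaimable on a donation
# at the UK basic rate of income tax.
BASIC_RATE = 0.20

def gift_aid(donation_gbp: float) -> float:
    """Basic-rate tax a charity can reclaim on a donation made under Gift Aid."""
    gross = donation_gbp / (1.0 - BASIC_RATE)   # pre-tax equivalent of the gift
    return gross - donation_gbp                 # equals 25% of the net donation

print(f"Gift Aid on a £10 donation: £{gift_aid(10.00):.2f}")   # £2.50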

But in any given year, more than £500 million - the Gift Aid on about a third of all donations - goes unclaimed, because donors and charities often lack the time or inclination to do the paperwork needed to claim it.

Now, thanks to experts in mathematical modelling, systems engineering, cyber security and a software company, that is about to change.

In a paper published in the journal Formal Aspects of Computing, the researchers explain how they developed their model of Swiftaid, a system launched by Streeva. The Swiftaid project was awarded to Streeva and the University of Surrey by Innovate UK.

Swiftaid has been recognised by HMRC as the first solution to automatically attach Gift Aid to contactless donations.

Lead author Dr David Williams, of the University of Portsmouth's School of Computing, said: "To our knowledge, this is the first time formal methods have been applied to any system of charitable giving."

Formal methods apply mathematical rigour to model and analyse systems against essential requirements. They are typically used in safety-critical sectors, such as defence and aerospace, but Dr Williams and his colleagues have found value in applying these advanced techniques in broader settings.
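
The paper's models are not reproduced in this release, but the style of reasoning can be illustrated. The toy sketch below exhaustively explores a drastically simplified state machine for a single donation and checks the requirement that Gift Aid is never claimed twice; the states, transitions and the check itself are assumptions made for illustration, not the authors' formal model of Swiftaid.

# Toy illustration only (not the authors' model of Swiftaid): exhaustively explore a
# simplified donation state machine and check that Gift Aid is never claimed twice.
from collections import deque

def transitions(state, claims):
    """Yield possible next (state, claims) pairs; 'claims' counts HMRC claims made."""
    if state == "donated":
        yield ("declared", claims)        # a Gift Aid declaration is attached
    if state == "declared":
        yield ("claimed", claims + 1)     # the charity claims Gift Aid from HMRC
    # "claimed" is a terminal state: the donation is settled

def check_no_double_claim():
    """Breadth-first search over every reachable state, asserting the requirement."""
    seen, queue = set(), deque([("donated", 0)])
    while queue:
        state, claims = queue.popleft()
        if (state, claims) in seen:
            continue
        seen.add((state, claims))
        assert claims <= 1, "requirement violated: Gift Aid claimed more than once"
        queue.extend(transitions(state, claims))
    return len(seen)

print(f"explored {check_no_double_claim()} reachable states; requirement holds")

Real formal-methods tooling works at far greater scale and rigour, but the discipline is the same: state the requirement precisely, then check it against every behaviour the system can reach.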

Dr Williams added: "It is extremely satisfying when our research has a real impact. In this case, we were able to improve the design of the Swiftaid system, making it easier and more secure for donors, charities and HMRC to provide added support to those most in need via Gift Aid.

"The inconvenience of having to complete a Gift Aid declaration form by hand every time someone donates often puts people off. Swiftaid automates the Gift Aid process to remove this barrier, vastly streamlining the process and unlocking millions in extra funding for worthy causes."

One of the research co-authors, Professor Steve Schneider, director of the Surrey Centre for Cyber Security, was the project's principal investigator. He said: "We're proud of what we have achieved in this project and the benefits it will bring to charitable giving. We've seen where formal methods has added real value and increased assurance in the Swiftaid system."

To use Swiftaid, donors create an account via swiftaid.co.uk and securely add one or more of their contactless debit cards. Every time a person donates, they can add the 25 per cent Gift Aid by touching their phone to a reader, similar to the contactless payments used on London Underground. The platform securely informs HMRC on behalf of both the donor and the charity so that the Gift Aid can be claimed.

Swiftaid is automated through a donor's email address and will soon be accessible via mobile phones.

Another researcher involved, Dr Salaheddin Darwish at Royal Holloway, University of London, said: "This work is one of the few attempts to improve the robustness and assurance of the charity system using formal modelling and will enable charities to apply Gift Aid legislation efficiently and effectively and maximise their benefits."

David Michael, CEO and founder of Streeva, said: "To collaborate and co-author a paper with leading academics is definitely a huge achievement and a proud moment. We have worked extremely hard to make sure we are creating a fully robust automatic Gift Aid solution for UK charities that has the assurance, trust and rigour behind it."

Credit: 
University of Surrey

Squid studies illuminate neural dysfunction in ALS; suggest new route to therapy

image: A catch of the longfin inshore squid (Doryteuthis pealeii) aboard the collecting boat of the Marine Biological Laboratory, The Gemma. This species has been used for basic research in neuroscience since the 1930s.

Image: 
Daniel Cojanu

WOODS HOLE, Mass. -- Amyotrophic lateral sclerosis (ALS) is one of the most devastating adult-onset neurodegenerative diseases. Patients, including the late actor/playwright Sam Shepard, become progressively weaker and eventually paralyzed as their motor neurons degenerate and die.

To find a cure for ALS, which is fatal, scientists need a deeper understanding of how it interrupts motor neuron communication channels. Motor neurons reach from the brain to the spinal cord and from the spinal cord to the muscles, so when they die, the brain can no longer initiate muscle movement.

Yuyu Song of Harvard Medical School was a Grass Fellow at the Marine Biological Laboratory (MBL), Woods Hole, when she took advantage of a powerful research organism in neuroscience, the local squid, to start asking how a mutant protein associated with familial ALS behaves under controlled conditions.

Her study, recently published in eNeuro, clarifies the mechanisms underlying neural dysfunction in ALS, and also suggests a novel approach to restoring the health of motor neurons in patients with the disease.

Song focused on how this mutant protein, called G85R-SOD1, affects neurotransmission at the squid "giant synapse," the junction where neurons transmit chemical signals to muscle fibers, causing the muscle to contract.

The squid giant synapse is one of the few mature nervous-system preparations that mimic human neuromuscular junctions while allowing precise experimental manipulations and live measurements.

Song showed that the presence of misfolded mutant SOD1 inhibits synaptic transmission and diminishes the pool of synaptic vesicles, whose job is to deliver neurotransmitters critical for neuronal connections.

Surprisingly, synaptic function was restored by intermittent, high-frequency stimulation, which suggested aberrant calcium signaling may underlie SOD1 toxicity to normal synaptic transmission.

To test this hypothesis, Song used calcium imaging to capture the abnormal calcium influx in the presynaptic terminal and confirmed the protective role of a calcium chelator, which corrected the calcium imbalance without affecting neurotransmission.

This suggests a new approach to therapeutic intervention for ALS, in which chemical or electrical regulation of calcium and its downstream signaling pathways may restore the health of motor neurons.

"Altogether, our results not only demonstrated synaptic dysfunction related to ALS and its underlying molecular pathways, but also extended our understanding of fundamental synaptic physiology," Song says.

Credit: 
Marine Biological Laboratory