Tech

A remote laboratory for performing experiments with real electronic and communications equipment

image: Students can connect from anywhere with an internet connection, using a control program to access a real board and work remotely with the circuit, while viewing the results of their work via a webcam

Image: 
UOC

Laboratories are an inherent part of technology qualifications, as practical experiments are essential for students to acquire the competencies and skills that they will need during their future professional development. Providing this learning in a virtual format is one of the challenges posed by the current COVID-19 pandemic--a challenge that distance universities have been addressing for years. RLAB-UOC is a remote laboratory designed and developed by the Universitat Oberta de Catalunya (UOC) that enables students in the Faculty of Computer Science, Multimedia and Telecommunications to conduct practical experiments with real electronic and communications equipment anywhere, at any time. A new article published in the scientific journal Electronics describes the characteristics of the laboratory and analyses students' satisfaction with a type of experiment that has been run over the past six semesters.

"The main results show that the students' opinions are very positive, and that it is feasible to do remote experimentation with highly complex and expensive equipment and devices, like those in electronics and communications laboratories, in a 100% online environment, integrating the theory and practical classroom in the same environment. This improves the learning experience and the students acquire the practical and professional skills they need," pointed out Carlos Monzo, a researcher in the Faculty of Computer Science, Multimedia and Telecommunications at the UOC, and the leading author of the study. Other researchers and members of faculty also took part in the research: Germán Cobo, David García, José Antonio Morán and Eugènia Santamaria.

Controlling an electronic circuit remotely

The case study of the RLAB-UOC, which has been used by hundreds of students since its inception in 2012, was carried out in Digital Electronics, an advanced course in the Bachelor's Degree in Telecommunications Technologies and Services Engineering (GETiST). Specifically, it is based on an experiment with a circuit controlled by an FPGA, a programmable electronic device physically located at the UOC. Students can connect from anywhere with an internet connection, using a control program to access a real FPGA board and work remotely with the circuit, while viewing the results of their work via a webcam.

The students subsequently answered a questionnaire to rate the perceived quality and usefulness of the remote laboratory and their opinion of the use of real instruments compared to the simulations that are also used in other experiments during the course. The results of the survey show that students rate the laboratory and working with real tools "very positively". They also think that it would be useful to extend the use of this laboratory to other courses.

Creating this feeling of being immersed in a real task was one of the most important challenges in the development of this technology. "The study shows a specific example which presents the structure of a subject where theory and practice exist side by side, and the students rate it very positively. It was also very important for us to prevent the feeling from being similar to working with a simulation, which we achieved with the design of the laboratory and the experiments carried out," stressed the researcher.

A patented technology

The development of this remote laboratory has been accompanied by a patent to protect a technology for carrying out a type of experiment primarily related to the design and assembly of analogue electronic circuits in a fully online environment. "Its main characteristics are that it enables elements with two and three poles (resistors, capacitors, coils, transistors, etc.) to be connected automatically and remotely, which gives students the flexibility to perform the assemblies they want virtually. The invention also allows the possible experiments to be scaled, so that the system can be expanded," said the researcher.

The types of experiments that have been performed have evolved over time and they have been changed, updated or created to meet the students' academic needs. "The aim is to be able to offer the most advanced qualifications with experimental needs, taking into account the completely remote nature of our studies," pointed out the researcher.

Home laboratories

The UOC also offers other alternatives in order to address this need to carry out practical experiments: virtual simulations and home labs. "When physical access to the devices is required, the student is sent equipment like Lab@Home. This is another UOC invention, in which the student can manipulate electronic devices and components in situ, and it has also been highly rated by students, as we showed in another recent article," concluded Carlos Monzo.

Credit: 
Universitat Oberta de Catalunya (UOC)

RUDN University chemists synthesize biodiesel from the Jatropha curcas plant

image: RUDN University chemists have proposed a new method of producing fuel from Jatropha curcas, a poisonous tropical plant. Natural minerals and a non-toxic additive from vegetable raw materials are used for that. The reaction efficiency is 85%. The fuel can be used in diesel internal combustion engines.

Image: 
RUDN University

RUDN University chemists have proposed a new method of producing fuel from Jatropha curcas, a poisonous tropical plant. Natural minerals and a non-toxic additive from vegetable raw materials are used for that. The reaction efficiency is 85%. The fuel can be used in diesel internal combustion engines. The results are published in the International Journal of Green Energy.

Jatropha curcas is a common plant in many tropical regions. Its seeds contain a lot of oil, but they cannot be used in agriculture because the oil contains toxins that are dangerous to people and animals. The composition of jatropha oil is, however, suitable for the manufacture of biodiesel. One challenge in processing the plant raw material is selecting sufficiently effective and safe catalysts. RUDN University chemists found a suitable catalyst and selected the optimal additive -- a substance that improves the useful properties of the fuel.

"Mineral catalysts with a complex chemical composition, for example, zeolites -- calcium and sodium silicates, have performed well in biodiesel production from vegetable and animal fats. They are quite active, eco-friendly and can be reused. But biodiesel, like hydrocarbons, cannot be used without improving additives", Ezeldin Osman, PhD student at RUDN University.

RUDN University chemists decided to use furfural as an additive for biodiesel. Furfural is obtained from plant waste, such as sawdust or straw, and it improves the characteristics of diesel fuel -- in particular its cetane number, an indicator of flammability.

As a first step, the researchers obtained biodiesel from Jatropha curcas oil. To do this, they mixed the oil with three times as much methanol and added a catalyst -- minerals from the zeolite group, mainly thomsonite. The amount of catalyst was five times less than the amount of oil. RUDN University chemists also tested other reaction conditions, but the highest yield of biodiesel (up to 85% of the reaction products) was obtained at this ratio of reagents and a temperature of 75°C.

The main part of the experiment was the selection of the optimal amount of furfural to improve the characteristics of the biodiesel. RUDN University chemists mixed biodiesel and the additive in equal quantities; in other variants they used twice as much additive as fuel, and vice versa. It turned out that the highest cetane number (64.1) was obtained in fuel containing 66.6% furfural -- 4.3 units higher than that of biodiesel without furfural. In this ratio, the additive removes compounds that impair flammability, such as alcohols and carbonyl compounds, from the biodiesel. The achieved characteristics allow biodiesel from Jatropha curcas to be used in internal combustion engines in the future.
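To make the blend figures above concrete, here is a small back-of-the-envelope check (our own illustrative arithmetic, not the researchers' code; the function name and the reading of the 2:1 additive-to-fuel ratio are assumptions): a blend with twice as much additive as fuel corresponds to the roughly 66.6% furfural share, and the reported 4.3-unit gain implies a cetane number of about 59.8 for the unmodified biodiesel.

```python
# Illustrative arithmetic only; not taken from the published study.
def furfural_fraction(additive_parts: float, fuel_parts: float) -> float:
    """Fraction of furfural in an additive/fuel blend."""
    return additive_parts / (additive_parts + fuel_parts)

print(round(furfural_fraction(2, 1) * 100, 1))  # 66.7 -> the ~66.6% blend reported
print(round(64.1 - 4.3, 1))                     # 59.8 -> implied cetane number without furfural
```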

"The additive reduced the content of aluminum, sodium, magnesium, potassium, iron and other substances in biodiesel that form ash -- a non-combustible solid residue of fuel. This not only improves fuel performance, but also reduces the risk of engine wear. At the same time, furfural is a stable additive at high temperatures, environmentally friendly in production and application. We will continue experiments to improve diesel fuel with this substance", Tatiana Sheshko, PhD, the head of the Adsorption and Catalysis Laboratory at RUDN University.

Credit: 
RUDN University

New data science platform speeds up Python queries

PROVIDENCE, R.I. [Brown University] — Researchers from Brown University and MIT have developed a new data science framework that allows users to process data with the programming language Python — without paying the “performance tax” normally associated with a user-friendly language.

The new framework, called Tuplex, is able to process data queries written in Python up to 90 times faster than industry-standard data systems like Apache Spark or Dask. The research team unveiled the system in research presented at SIGMOD 2021, a premier data processing conference, and have made the software freely available to all.

“Python is the primary programming language used by people doing data science,” said Malte Schwarzkopf, an assistant professor of computer science at Brown and one of the developers of Tuplex. “That makes a lot of sense. Python is widely taught in universities, and it’s an easy language to get started with. But when it comes to data science, there’s a huge performance tax associated with Python because platforms can’t process Python efficiently on the back end.”

Platforms like Spark perform data analytics by distributing tasks across multiple processor cores or machines in a data center. That parallel processing allows users to deal with giant data sets that would choke a single computer to death. Users interact with these platforms by inputting their own queries, which contain custom logic written as “user-defined functions,” or UDFs. A UDF might, for example, extract the number of bedrooms from the text of a real estate listing so that a query can search all of the real estate listings in the U.S. and select the ones with three bedrooms.
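As a purely illustrative sketch (this is not Tuplex code; the function, regular expression and sample data are our own), a bedroom-extraction UDF of the kind described above might look like this in plain Python:

```python
import re

def extract_bedrooms(listing_text):
    """Return the number of bedrooms mentioned in a listing, or None if not found."""
    match = re.search(r"(\d+)\s*(?:bed(?:room)?s?|br\b)", listing_text, re.IGNORECASE)
    return int(match.group(1)) if match else None

listings = [
    "Sunny 3 bedroom craftsman near the park",
    "Downtown loft, 1BR, great views",
    "Charming bungalow with three bedrooms",  # spelled out -- this one slips through
]
three_bedroom = [text for text in listings if extract_bedrooms(text) == 3]
```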

Because of its simplicity, Python is the language of choice for creating UDFs in the data science community. In fact, the Tuplex team cites a recent poll showing that 66% of data platform users utilize Python as their primary language. The problem is that analytics platforms have trouble dealing with those bits of Python code efficiently.

Data platforms are written in high-level computer languages that are compiled before running. Compilers are programs that take computer language and turn it into machine code — sets of instructions that a computer processor can quickly execute. Python, however, is not compiled beforehand. Instead, computers interpret Python code line by line while the program runs, which can mean far slower performance.

“These frameworks have to break out of their efficient execution of compiled code and jump into a Python interpreter to execute Python UDFs,” Schwarzkopf said. “That process can be a factor of 100 less efficient than executing compiled code.”

If Python code could be compiled, it would speed things up greatly. But researchers have tried for years to develop a general-purpose Python compiler, Schwarzkopf says, with little success. So instead of trying to make a general Python compiler, the researchers designed Tuplex to compile a highly specialized program for the specific query and common-case input data. Uncommon input data, which account for only a small percentage of instances, are separated out and referred to an interpreter.

“We refer to this process as dual-case processing, as it splits that data into two cases,” said Leonhard Spiegelberg, co-author of the research describing Tuplex. “This allows us to simplify the compilation problem as we only need to care about a single set of data types and common-case assumptions. This way, you get the best of two worlds: high productivity and fast execution speed.”
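A schematic sketch of that dual-case split in ordinary Python (this is not how Tuplex is implemented -- Tuplex compiles the fast path to native code -- and every name here is invented for illustration):

```python
def fast_path(row):
    # Specialized for the common case: numeric fields already have the expected types.
    return row["price"] // max(row["bedrooms"], 1)

def slow_path(row):
    # General fallback: tolerate strings, thousands separators and missing fields.
    price = int(float(str(row.get("price", 0)).replace(",", "")))
    bedrooms = int(str(row.get("bedrooms", 1)).strip() or 1)
    return price // max(bedrooms, 1)

def process(rows):
    results, set_aside = [], []
    for row in rows:
        try:
            results.append(fast_path(row))        # common case: specialized, compiled path
        except (TypeError, KeyError, ValueError):
            set_aside.append(row)                 # rare case: deferred to the general path
    results.extend(slow_path(row) for row in set_aside)
    return results

rows = [{"price": 450000, "bedrooms": 3}, {"price": "1,200,000", "bedrooms": "4"}]
print(process(rows))  # the first row takes the fast path, the second falls back
```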

And the runtime benefit can be substantial.

“We show in our research that a wait time of 10 minutes for an output can be reduced to a second,” Schwarzkopf said. “So it really is a substantial improvement in performance.”

In addition to speeding things up, Tuplex also has an innovative way of dealing with anomalous data, the researchers say. Large datasets are often messy, full of corrupted records or data fields that don’t follow convention. In real estate data, for example, the number of bedrooms could either be a numeral or a spelled-out number. Inconsistencies like that can be enough to crash some data platforms. But Tuplex extracts those anomalies and sets them aside to avoid a crash. Once the program has run, the user then has the option of repairing those anomalies.
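A toy continuation of the earlier bedroom example (again hypothetical, not the Tuplex API) shows what such a post-run repair step might look like: the rows that were set aside, such as spelled-out bedroom counts, are fixed up afterwards and can then be re-submitted instead of crashing the whole job.

```python
WORD_TO_NUMBER = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}

def repair_bedrooms(raw_value):
    """Best-effort fix for anomalous bedroom fields such as 'three' or ' 4 '."""
    text = str(raw_value).strip().lower()
    if text.isdigit():
        return int(text)
    return WORD_TO_NUMBER.get(text)  # None if the value is still unresolvable

set_aside = ["three", " 4 ", "studio"]
print([repair_bedrooms(value) for value in set_aside])  # [3, 4, None]
```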

“We think this could have a major productivity impact for data scientists,” Schwarzkopf said. “To not have to run out to get a cup of coffee while waiting for an output, and to not have a program run for an hour only to crash before it’s done would be a really big deal.”

Credit: 
Brown University

G-quadruplex-forming DNA molecules enhance enzymatic activity of myoglobin

image: The aptameric enzyme subunit (AES) enhances myoglobin-derived peroxidase reaction.

Image: 
Kazunori Ikebukuro/ TUAT

A collaboration led by Distinguished Professor Dr. Kazunori Ikebukuro from Tokyo University of Agriculture and Technology (TUAT), Japan, discovered that G-quadruplex (G4)-forming DNA binds myoglobin through a parallel-type G4 structure. Through the G4 binding, the enzymatic activity of myoglobin increases over 300-fold compared to that of myoglobin alone (Figure). This finding indicates that DNA may act not only as a carrier of genetic information in living organisms but also as a regulator of as-yet-unknown biological phenomena.

"Aptamers" are nucleic acid-based synthetic ligands that can be used against many target molecules with high affinity and specificity. Some aptamers that bind to proteins are reported as specific ligands and biological function regulators. Dr. Ikebukuro and his group have developed many DNA aptamers that bind proteins, especially enzymes. In addition, they developed the aptameric enzyme modulator--aptameric enzyme subunit (AES), which can inhibit enzymatic activities. The current challenge for the group is to create novel aptamers that upregulate the catalytic activity of enzymes.

The collaborative team of TUAT, RIKEN (Japan), DENSO CORPORATION (Japan), and the University of North Carolina, Chapel Hill (USA) succeeded in developing a new AES that increases the peroxidase activity of myoglobin. Myoglobin contains a heme as a cofactor, and the research team found that a region near the heme-binding site can be positively charged. "We hypothesized that this region is likely to interact with negative charges of the DNA oligonucleotides derived from its sugar-phosphate backbone, which may lead to an enhancing effect on the enzymatic activity," said Dr. Ikebukuro.

The chemiluminescence measurements in their study showed that the AES specifically enhanced the peroxidase activity of myoglobin by up to 300-fold compared to that of myoglobin alone (Figure). Further, the AES bound strongly to myoglobin near its heme, as expected (Figure). Structural analyses by NMR and spectroscopic observation revealed that the AES folded into a parallel-type G-quadruplex structure.

"Our study has revealed that DNA can potentially work as a regulator of protein's functions in the cell. On the other hand, because the AES produces a dramatically enhanced chemiluminescent signal, it could also offer a new strategy for future biosensor application studies," Dr. Ikebukuro added.

Credit: 
Tokyo University of Agriculture and Technology

New chatbot can explain apps and show you how they access hardware or data

image: Hey GUI is a chatbot that can be used to find images or textual information about apps and their user interfaces using natural language conversations.

Image: 
Aalto University

Chatbots have already become a part of our everyday lives with their quick and intuitive way to complete tasks like scheduling and finding information using natural language conversations. Researchers at Aalto University have now harnessed the power of chatbots to help designers and developers create new apps and to allow end users to find information about the apps on their devices.

The chatbot -- 'Hey GUI' (pronounced goo-ee), short for Graphical User Interface, which will be presented at ACM Designing Interactive Systems 2021 on 1 July -- can answer questions by showing images and screenshots of apps, or through simple text phrases.

"Hey GUI eliminates the need for coding skills or technical expertise. Anyone can use it to search for information on, for example, which one of my apps is asking for permission to use the camera, or where is the search button on this screen", says postdoctoral researcher Kashyap Todi of Aalto University.

Important findings for chatbot developers and researchers

Different kinds of applications and user interfaces contain vast amounts of information, which can be used for different needs and tasks. In their study, the researchers set out to find the most desirable chatbot features and how users would have conversations with them.

Three target groups were considered in the experiment: designers, software developers, and general users. Over 100 designers, developers, and ordinary users responded to the survey.

"Our goal was to get a good understanding of what capabilities and functions different kinds of users would find most beneficial when using this new type of chatbot, as well as how they would formulate questions -- especially what kinds of words and phrases they would use. This is an important first step towards developing chatbots that can help users find information about apps using simple conversations", says Kashyap.

As a result of the survey, the researchers collected and analysed 21 key features and more than 1,300 of the most common questions put forward by users. These were used to create Hey GUI, which allows users to find information within user interfaces through simple conversations based on the most frequent expressions gathered in the survey.

"Designers can get inspiration, for instance, by asking Hey GUI to find good online shop designs. The chatbot responds with several screenshots from existing apps in the catalogue. Software developers could ask a chatbot what typical colour codes are used for login buttons, and the chatbot responds with several options", Kashyap adds.

Questions from ordinary users placed special emphasis on matters related to privacy and accessibility information, or on getting help with their apps, such as asking which apps display maps.

The research prototype is currently under development. The common questions gathered from the survey are available on the project website (https://osf.io/g97my/).

The study was conducted in collaboration with the universities of Luxembourg and Bayreuth.

Credit: 
Aalto University

Unlocking the power of the microbiome

image: Differently fluorescent bacteria on a leaf surface.

Image: 
Photo: Maximilian Mittelviefhaus / ETH Zurich

Hundreds of different bacterial species live in and on leaves and roots of plants. A research team led by Julia Vorholt from the Institute of Microbiology at ETH Zurich, together with colleagues in Germany, first inventoried and categorised these bacteria six years ago. Back then, they isolated 224 strains from the various bacterial groups that live on the leaves of thale cress (Arabidopsis thaliana). These can be assembled into simplified, or "synthetic" plant microbiomes. The researchers thus laid the foundations for their two new studies, which were just published in the journals Nature Plants and Nature Microbiology.

Volume control of the plant response

In the first study, the researchers investigated how plants respond to their colonisation by microorganisms. Vorholt's team dripped bacterial cultures onto the leaves of plants that had, up to that point, been cultivated under sterile conditions. As expected, different types of bacteria triggered a variety of responses in the plants. For example, exposure to certain genera of Gammaproteobacteria caused the thale cress plants to activate a total of more than 3,000 different genes, while those of Alphaproteobacteria triggered a response in only 88 genes on average.

"Despite this broad range of responses to the different bacteria of the microbiome, we were astonished to pinpoint a central response: the plants practically always activate a core set of 24 genes," Vorholt says. But that's not all the team found: acting as a kind of volume control for the plant response, the activation intensity of these 24 genes provides information as to how extensively the bacteria have colonised the plant. This volume control also predicts how many additional genes the plant will activate as it adapts to the new arrivals.

Plants with defects in some of these 24 genes are more susceptible to harmful bacteria, Vorholt's team has shown. And since other studies had noticed that some genes in the core set are also involved in plant responses to osmotic shock or fungal infestation, the ETH researchers infer that the 24 genes constitute a general defensive response. "It looks like an immune training, even though the bacteria we used aren't pathogens, but rather partners in a natural community," Vorholt says.

Bacterial community out of control

In the second study, Vorholt and her team explored how bacterial communities change when mutations cause a plant to be deficient in one or several genes. The team expected to see that genetic defects in receptors, which plants use to detect the presence of microbes, play a major role in the story.

What they didn't expect was that another genetic defect would have the biggest effect: if the plants were deficient in a certain enzyme, an NADPH oxidase, the bacterial community was thrown off-kilter. Plants use this enzyme to produce highly reactive oxygen radicals, which have an antimicrobial effect. In the absence of this NADPH oxidase, microbes that under normal circumstances lived peacefully on the leaves developed into what are known as opportunistic pathogens.

Is the NADPH oxidase found among the core set of 24 genes responsible for the general defensive response? "No, that would have been too good to be true," says Sebastian Pfeilmeier, a member of Vorholt's research group and lead author of the study. This is because the gene responsible for the NADPH oxidase is turned on prior to contact with microbes and because the enzyme is activated by chemical reactions governed by phosphorylation.

For Vorholt, the two studies show that synthetic microbiomes are a promising approach to investigating the complex interactions within different communities. "Since we can control and precisely engineer the communities, we can do much more than just observe what happens. In addition to simply determining cause and effect, we can understand them on a molecular level," Vorholt says. An ideal microbiome protects plants from diseases while also making them more resilient to drought and salty conditions. This is why the agricultural industry is among those interested in the team's results. They should help farmers harness the power of the microbiome in the future.

Credit: 
ETH Zurich

Novel microscopy method at UT Southwestern provides look into future of cell biology

image: Zebrafish.

Image: 
UT Southwestern Medical Center

What if a microscope allowed us to explore the 3D microcosm of blood vessels, nerves, and cancer cells instantaneously in virtual reality? What if it could provide views from multiple directions in real time without physically moving the specimen and worked up to 100 times faster than current technology?

UT Southwestern scientists collaborated with colleagues in England and Australia to build and test a novel optical device that converts commonly used microscopes into multiangle projection imaging systems. The invention, described in an article in today's Nature Methods, could open new avenues in advanced microscopy, the researchers say.

"It is a completely new technology, although the theoretical foundations for it can be found in old computer science literature," says corresponding author Reto Fiolka, Ph.D. Both he and co-author Kevin Dean, Ph.D., are assistant professors of cell biology and in the Lyda Hill Department of Bioinformatics at UT Southwestern.

"It is as if you are holding the biological specimen with your hand, rotating it, and inspecting it, which is an incredibly intuitive way to interact with a sample. By rapidly imaging the sample from two different perspectives, we can interactively visualize the sample in virtual reality on the fly," says Dean, director of the UTSW Microscopy Innovation Laboratory, which collaborates with researchers across campus to develop custom instruments that leverage advances in light microscopy.

Currently, acquiring 3D-image information from a microscope requires a data-intensive process, in which hundreds of 2D images of the specimen are assembled into a so-called image stack. To visualize the data, the image stack is then loaded into a graphics software program that performs computations to form two-dimensional projections from different viewing perspectives on a computer screen, the researchers explain.
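For readers who want the gist of that conventional pipeline, here is a minimal sketch (illustrative only; the array sizes, names and the choice of a maximum-intensity projection are our assumptions): the image stack is treated as a 3D volume, rotated to the desired viewing angle and then collapsed into a 2D projection.

```python
import numpy as np
from scipy.ndimage import rotate

stack = np.random.rand(200, 512, 512)  # stand-in for hundreds of 2D camera frames (z, y, x)

def view_from_angle(volume, angle_deg):
    """Rotate the volume about the y-axis, then collapse it into a 2D projection."""
    rotated = rotate(volume, angle_deg, axes=(0, 2), reshape=False, order=1)
    return rotated.max(axis=0)  # maximum-intensity projection along z

projection = view_from_angle(stack, 30)  # one viewing perspective; repeat for each new angle
```

Every new viewing angle repeats the rotation over the full volume, which is why interactive exploration of large stacks becomes data- and compute-intensive.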

"Those two steps require a lot of time and may need a very powerful and expensive computer to interact with the data," Fiolka says.

The team realized it could form projections from multiple angles by optical means, bypassing the need to acquire image stacks and rendering them with a computer. This is achieved by a simple and cost-effective unit consisting of two rotating mirrors that is inserted in front of the camera of the microscope system.

"As a result, we can do all this in real time, without any noticeable delay. Surprisingly, we can look from different angles 'live' at our samples without rotating the samples or the microscope," Fiolka says. "We believe this invention may represent a new paradigm for acquiring 3D information via a fluorescence microscope."

It also promises incredibly fast imaging. While an entire 3D-image stack may require hundreds of camera frames, the new method requires only one camera exposure.

Initially, the researchers developed the system with two common light-sheet microscopes that require a post-processing step to make sense of the data. That step is called de-skewing and essentially means rearranging the individual images to remove some distortions of the 3D-image stack. The scientists originally sought to perform this de-skewing optically.
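As a rough illustration of what de-skewing does computationally (a sketch under assumed names and calibration values, not the authors' software): each z-slice is shifted laterally in proportion to its depth, undoing the shear introduced during acquisition.

```python
import numpy as np
from scipy.ndimage import shift

def deskew(stack, shear_px_per_slice):
    """Shift every z-slice along x in proportion to its index to undo the acquisition shear."""
    corrected = np.zeros_like(stack)
    for z, plane in enumerate(stack):
        corrected[z] = shift(plane, (0, z * shear_px_per_slice), order=1)
    return corrected

deskewed = deskew(np.random.rand(64, 256, 256), shear_px_per_slice=0.5)
```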

While experimenting with the optical de-skewing method, they realized that when they used an incorrect amount of "de-skew," the projected image seemed to rotate.

"This was the aha! moment. We realized that this could be bigger than just an optical de-skewing method; that the system could work for other kinds of microscopes as well," Fiolka said.

"This study confirms the concept is more general," Dean says. "We have now applied it to various microscopes, including light-sheet and spinning disk confocal microscopy."

Using the new microscope method, they imaged calcium ions carrying signals between nerve cells in a culture dish and looked at the vasculature of a zebrafish embryo. They also rapidly imaged cancer cells in motion and a beating zebrafish heart.

Credit: 
UT Southwestern Medical Center

Newly discovered genetic variants in a single gene cause neurodevelopmental disorder

Rochester, Minn. -- Mayo Clinic researchers have discovered that genetic variants in a neuro-associated gene called SPTBN1 are responsible for causing a neurodevelopmental disorder. The study, published in Nature Genetics, is a first step in finding a potential therapeutic strategy for this disorder, and it increases the number of genes known to be associated with conditions that affect how the brain functions.

"The gene can now be included in genetic testing for people suspected of having a neurodevelopmental disorder, which may end the diagnostic odyssey these people and their families have endured," says Margot Cousin, Ph.D., a translational genomics researcher in Mayo Clinic's Center for Individualized Medicine and the study's lead author.

For the global study, a collaboration with The University of North Carolina at Chapel Hill, researchers investigated disease-causing variants of the SPTBN1 gene in 29 people with clinical neurodevelopmental symptoms, including language and motor delays, intellectual disability, autistic features, seizures, behavioral and movement abnormalities, and variable dysmorphic facial features. Overall, the team identified 28 unique variants.

Dr. Cousin says most of the genetic variants were not inherited, but rather newly occurred in the patients who were affected.

"We showed through multiple different model systems, including computational protein modeling, human- and mouse cell-based systems, patient-derived cell systems, and in vivo mouse studies, the impact the variants have on the function of the protein encoded by the SPTBN1 gene," Dr. Cousin explains. "I had a hunch this gene was the answer for these patients, but it wasn't until we accrued and studied more patients with variants in SPTBN1 that we could see how the variants had damaging effects on the protein and we could begin putting the story together."

The SPTBN1 gene codes for a protein called beta-two spectrin, which is abundantly expressed in the brain and other parts of the body. Beta-two spectrin makes protein networks within cells, and it is essential for the brain's development and connectivity.

"Interestingly, some of the variants behave very differently than the others, where some make the beta-two spectrin protein unstable and some disrupt its ability to make important interactions with other proteins," Dr. Cousin says. "But these differences in functional effects helped to explain the clinical variability we were observing in the patients."

The study also demonstrates the challenges in rare disease genomics, as many neurodevelopmental diseases remain undiagnosed under the standard of care.

But Dr. Cousin is hopeful the tide is turning.

"Advances in genome sequencing and our ability to interpret the enormous amount of data we generate with various types of 'omic' technologies has led to increases in the discovery of novel disease-causing genes," Dr. Cousin says. "But rigorous studies encompassing the clinical manifestations of affected people and the underlying mechanism of disease are often critical to solidifying a new gene-disease relationship."

Omic technologies include genomics (the detection of genes), transcriptomics (messenger RNA), proteomics (proteins) and metabolomics (metabolites).

Dr. Cousin emphasizes that bringing this genetic finding to light required much perseverance.

"The clinical variability we observed in people early on was not very compelling that this could be a single genetic condition," she explains. "The gene, however, had many of the hallmarks of a rare monogenic disease gene, including that the normal population doesn't have variation in SPTBN1, other spectrin genes cause neurological syndromes, and mouse studies completely lacking the protein have severe defects."

Dr. Cousin says the cell-based and animal models developed in the study will continue to be invaluable in advancing knowledge of the disease mechanisms and testing any potential therapeutic strategies.

"While there is not yet a specific treatment available for people affected by SPTBN1-associated disease, we can now provide patients with an answer to the root cause of disease, which is the most important first step toward finding a cure."

Credit: 
Mayo Clinic

Using AI to predict 3D printing processes

image: Melt pool shape and the temperature and melt pool flow velocity predicted by a physics-informed neural network (PINN) for cases A, B and C at quasi-steady state (2 microseconds).

Image: 
Qiming Zhu, Zeliang Liu, Jinhui Yan

Additive manufacturing has the potential to allow parts or products to be created on demand in manufacturing, automotive engineering, and even in outer space. However, it is a challenge to know in advance how a 3D-printed object will perform, now and in the future.

Physical experiments -- especially for metal additive manufacturing (AM) -- are slow and costly. Even modeling these systems computationally is expensive and time-consuming.

"The problem is multi-phase and involves gas, liquids, solids, and phase transitions between them," said University of Illinois Ph.D. student Qiming Zhu. "Additive manufacturing also has a wide range of spatial and temporal scales. This has led to large gaps between the physics that happens on the small scale and the real product."

Zhu, Zeliang Liu (a software engineer at Apple), and Jinhui Yan (professor of Civil and Environmental Engineering at the University of Illinois), are trying to address these challenges using machine learning. They are using deep learning and neural networks to predict the outcomes of complex processes involved in additive manufacturing.

"We want to establish the relationship between processing, structure, properties, and performance," Zhu said.

Current neural network models need large amounts of data for training. But in the additive manufacturing field, obtaining high-fidelity data is difficult, according to Zhu. To reduce the need for data, Zhu and Yan are pursuing 'physics-informed neural networks,' or PINNs.

"By incorporating conservation laws, expressed as partial differential equations, we can reduce the amount of data we need for training and advance the capability of our current models," he said.

Using the Frontera and Stampede2 supercomputers at the Texas Advanced Computing Center (the #10 and #36 fastest in the world, as of June 2021), Zhu and Yan simulated the dynamics of two benchmark experiments: an example of 1D solidification, when solid and liquid metals interact; and an example of laser beam melting tests taken from the 2018 NIST Additive Manufacturing Benchmark Test Series.

In the 1D solidification case, they input data from experiments into their neural network. In the laser beam melting tests, they used experimental data as well as results from computer simulations. They also developed a 'hard' enforcement method for boundary conditions, which, they say, is equally important to solving the problem.

The team's neural network model was able to recreate the dynamics of the two experiments. In the case of the NIST Challenge, it predicted the temperature and melt pool length of the experiment within 10% of the actual results. They trained the model on data from 1.2 to 1.5 microseconds and made predictions at further time steps up to 2.0 microseconds.

The team published their results in Computational Mechanics in January 2021.

"This is the first time that neural networks have been applied to metal additive manufacturing process modeling," Zhu said. "We showed that physics-informed machine learning, as a perfect platform to seamlessly incorporate data and physics, has big potential in the additive manufacturing field."

Zhu sees engineers in the future using neural networks as fast prediction tools to provide guidance on the parameter selection for the additive manufacturing process -- for instance, the speed of the laser or the temperature distribution -- and to map the relationships between additive manufacturing process parameters and the properties of the final product, such as its surface roughness.

"If your client requires a specific property, then you'll know what you should use for your manufacturing process parameters," Zhu said.

In a separate paper in Computer Methods in Applied Mechanics and Engineering published online in May 2021, Zhu and Yan proposed a modification of the existing finite element method framework used in additive manufacturing to see if their technique could deliver better predictions than existing benchmarks.

Mirroring a recent additive manufacturing experiment from Argonne National Lab involving a moving laser, the researchers showed that simulations, performed on Frontera, differed in depth from those in the experiment by less than 10.3% and captured the common experimentally-observed chevron-type shape on the metal top surface.

Zhu and Yan's research benefits from the continued growth of computing technologies and federal investment in high performance computing.

Frontera not only speeds up studies such as theirs, it opens the door to machine and deep learning studies in fields where training data is not widely available, broadening the potential of AI research.

"The most exciting point is when you see that your model can predict the future using only a small amount of existing data," Zhu said. "It's somehow learning about the evolution of the process.

"Previously, I was not very confident on whether we'd be able to predict with good accuracy over temperature, velocity, and geometry of the gas-metal interface. We showed that we're able to make nice data inferences."

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

In a supramolecular realm: Advances in intracellular spaces with de novo designed peptide

image: Scientists at Tokyo Institute of Technology (Tokyo Tech) have set out to harness the potential of self-assembling peptides (SAPs) in intracellular spaces. They present a de novo designed peptide, Y15, which displays a strong tendency to assemble in cellular environments. The addition of Y15-tagged bioactive proteins can functionalize these assemblies, enhancing their utility and relevance by leaps and bounds.

Image: 
Tokyo Tech

Over the last two decades, biomaterials research has made significant progress, transitioning from traditional biomaterials to biomaterials with controlled structure and dynamic functionality. A number of building blocks have been explored for developing biomaterials by self-assembly, but SAPs have garnered special attention due to their tunability and potential use in various applications such as tissue engineering, wound healing, and vaccinations. Despite these benefits, the SAP-based approach is less explored in the intracellular context.

Fortunately, a team of scientists from the Tokyo Institute of Technology (Tokyo Tech), led by Assistant Prof. Takayuki Miki, has reported a de novo peptide, Y15, that readily forms secondary structures to enable bottom-up synthesis of functional protein assemblies in live cells. Their findings are published in Nature Communications. Assistant Prof. Miki explains, "The cell environment is a dense milieu in which macromolecules occupy a considerable volume; developing SAPs that interact in such a crude setting is difficult. Thankfully, we were able to create a de novo peptide--that is, from scratch--which shows a high propensity to assemble in intracellular spaces."

To leverage the self-assembling propensity, the scientists designed and synthesized four variable-length Yn peptides--Y9, Y11, Y13, and Y15--as well as the negative control peptide Y15(K9P). Y15 clearly outperformed the others and self-assembled rapidly at low buffer concentrations, creating β-sheet structures. Y15 forms an amphiphilic β-strand with a hydrophobic face composed of tyrosine (Tyr) residues and a hydrophilic face composed of alternating glutamic acid (Glu) and lysine (Lys) residues. Thus, neighboring β-strands can connect through hydrophobic and aromatic interactions between Tyr residues as well as electrostatic interactions between negatively charged Glu and positively charged Lys residues.

The scientists realized that Y15 could be a key motif in the formation of protein assemblies in a liquid solution. To evaluate the assembly mechanism, Y15 was genetically tagged with a model protein, superfolder green fluorescent protein (sfGFP). They discovered that the fusion could form fibrous structures in test tubes. Furthermore, when Y15-sfGFPs were expressed in live cells, they retained their self-assembling affinity by forming clusters. Several tests corroborated the clustering of Y15-sfGFP in live cells, confirming that these peptides can facilitate protein assembly.

The researchers were also able to build artificial microscale structures by fusing Y15 to Azami-Green, a tetrameric fluorescent protein. Thus, Y15-based assemblies could be adorned with functional proteins as well. Further, the researchers reconstructed Nck (an adaptor protein) clusters by incorporating Y15-tagged Nck into Y15-based supramolecules. This resulted in N-WASP (neural Wiskott-Aldrich syndrome protein)-mediated actin polymerization, which in turn plays an integral role in cell locomotion. The Y15-based assembly helped in evaluating the impact of Nck domain valency and density-dependency, hence serving as a platform for in-cell reconstitution studies.

The Y15 SAPs are much smaller than previously reported protein-based tags. Likewise, the use of an SAP-based system is sensible and straightforward, and it can be genetically tagged to optimize functioning. Assistant Prof. Miki concludes, "We believe that SAPs will set the stage for a range of applications, such as cluster formation with stimuli responsiveness, artificial constitution of phase separations, reconstitution of natural protein complexes, and many more. We hope that this rational engineering of supramolecular assembly will pave the way for future intracellular research."

We sure have our fingers crossed!

Credit: 
Tokyo Institute of Technology

Plastic drapes reduce hypothermia in premature babies

image: Huong (Kelle) Phan, clinical assistant professor, University of Houston College of Nursing

Image: 
University of Houston

Most babies born prematurely or with health problems are quickly whisked away to the Neonatal Intensive Care Unit (NICU) where they might require assisted heating devices to regulate their temperature. A University of Houston College of Nursing researcher is reporting that the traditional use of cloth blankets and towels during peripherally inserted central catheter (PICC) placement may hinder heat transfer from the assisted heating mechanisms, increasing the risk for neonatal hypothermia. In Advances in Neonatal Care, Huong (Kelle) Phan, clinical assistant professor, reports that a plastic drape lowers the incidence of hypothermia.

"The use of the plastic drape is a quality improvement to reduce the hypothermia rate in very low birth-weight (VLBW) neonates by replacing cloth blanket/towels with a plastic drape during PICC placement," said Phan. "A plastic drape shows promise in improving nursing practice by providing improved thermoregulation for premature neonates during PICC placement."

When a premature baby's body temperature drops below 36.5°C, the baby may experience cold stress, which is a cause for concern. The recommended temperature range for postnatal stabilization is between 36.5° and 37.5°C.

Phan's research project included implementing plastic drapes over three months, during 58 PICC procedures in a Level-3 NICU. A pre-/posttest was used to evaluate the impact of the intervention on hypothermia rates compared with a baseline cloth group and a concurrent cloth cohort.

"After the 3-month implementation period, the hypothermia rate for the intervention group was lower than that for the baseline cloth group (5.2% and 11.3%, respectively). Post-PICC hypothermia rates were significantly lower for the intervention group than for the concurrent cloth cohort," said Phan.

This evidence demonstrated that plastic drapes reduced the hypothermia rate in the NICU for VLBW neonates during PICC placement compared with cloth blankets or towels.

"Phan's innovative nursing intervention of using the plastic drape during a PICC insertion helps some of our most vulnerable patients, those infants that must be treated in neonatal intensive care units," said Kathryn Tart, founding dean and Humana Endowed Dean's Chair in Nursing, UH College of Nursing.

Phan recommends further research to replicate the findings with larger samples of PICC insertions and to test the use of a plastic drape in the operating room and during other NICU procedures.

Credit: 
University of Houston

New technology detects greater variety of T cells that respond to coronaviruses

Scientists have developed a new technology to detect a wider variety of T cells that recognize coronaviruses, including SARS-CoV-2. The technology revealed that killer T cells capable of recognizing epitopes conserved across all coronaviruses are much more abundant in COVID-19 patients with mild disease versus those with more severe illness, suggesting a protective role for these broad-affinity T cells. The ability to distinguish T cells based on their affinities to SARS-CoV-2 could help scientists elucidate the disparity in COVID-19 outcomes and determine which COVID-19 patients will or will not exhibit a successful immune response against the virus, the authors say.

Their work improves upon the primary tool used to identify T cells - antigen tetramers bound to MHC - by instead attracting the cells to multimerized complexes of antigens bound to MHC called "spheromers." Because tetramers can harbor a maximum of four MHC-antigen complexes, they tend to miss T cells with a low affinity for certain antigens. Vamsee Mallajosyula and colleagues tackled these limitations with their spheromers, each of which simultaneously displays 12 copies of an individual peptide-MHC complex. The spheromer is easy to produce and compatible with currently available MHC molecules and tetramer components, allowing for easy adoption of their new protocol, the authors say.

When applied to blood samples from COVID-19 patients and individuals not yet exposed to SARS-CoV-2, the spheromers stained specific T cells more efficiently and captured a more diverse repertoire of TCRs compared with the tetramer. Using the technology, the authors found that T cells capable of recognizing peptides conserved across all coronaviruses were more abundant and exhibited a "memory" phenotype - a desirable feature among T cells targeted by vaccines - compared with T cells that only recognize SARS-CoV-2. Indeed, COVID-19 patients with mild disease harbored a greater number of killer T cells with these conserved specificities than those with more severe illness, suggesting the broad-affinity T cells are protective, the authors say. Next steps will require enhancing the spheromer technology to include more MHC proteins, they add.

Credit: 
American Association for the Advancement of Science (AAAS)

Researchers identify brain circuit for spirituality

image: Scientists have recently discovered a relationship between spirituality and an evolutionarily-ancient brain circuit.

Image: 
Michael Ferguson

More than 80 percent of people around the world consider themselves to be religious or spiritual. But research on the neuroscience of spirituality and religiosity has been sparse. Previous studies have used functional neuroimaging, in which an individual undergoes a brain scan while performing a task to see what areas of the brain light up. But these correlative studies have given a spotty and often inconsistent picture of spirituality. A new study led by investigators at Brigham and Women's Hospital takes a new approach to mapping spirituality and religiosity and finds that spiritual acceptance can be localized to a specific brain circuit. This brain circuit is centered in the periaqueductal gray (PAG), a brainstem region that has been implicated in numerous functions, including fear conditioning, pain modulation, altruistic behaviors and unconditional love. The team's findings are published in Biological Psychiatry.

"Our results suggest that spirituality and religiosity are rooted in fundamental, neurobiological dynamics and deeply woven into our neuro-fabric," said corresponding author Michael Ferguson, PhD, a principal investigator in the Brigham's Center for Brain Circuit Therapeutics. "We were astonished to find that this brain circuit for spirituality is centered in one of the most evolutionarily preserved structures in the brain."

To conduct their study, Ferguson and colleagues used a technique called lesion network mapping that allows investigators to map complex human behaviors to specific brain circuits based on the locations of brain lesions in patients. The team leveraged a previously published dataset that included 88 neurosurgical patients who were undergoing surgery to remove a brain tumor. Lesion locations were distributed throughout the brain. Patients completed a survey that included questions about spiritual acceptance before and after surgery. The team validated their results using a second dataset made up of more than 100 patients with lesions caused by penetrating head trauma from combat during the Vietnam War. These participants also completed questionnaires that included questions about religiosity (such as, "Do you consider yourself a religious person? Yes or No?").

Of the 88 neurosurgical patients, 30 showed a decrease in self-reported spiritual belief from before to after neurosurgical brain tumor resection, 29 showed an increase, and 29 showed no change. Using lesion network mapping, the team found that self-reported spirituality mapped to a specific brain circuit centered on the PAG. The circuit included positive nodes and negative nodes -- lesions that disrupted these respective nodes either decreased or increased self-reported spiritual beliefs. Results on religiosity from the second dataset aligned with these findings. In addition, in a review of the literature, the researchers found several case reports of patients who became hyper-religious after experiencing brain lesions that affected the negative nodes of the circuit.

Lesion locations associated with other neurological and psychiatric symptoms also intersected with the spirituality circuit. Specifically, lesions causing parkinsonism intersected positive areas of the circuit, as did lesions associated with decreased spirituality. Lesions causing delusions and alien limb syndrome intersected with negative regions, associated with increased spirituality and religiosity.

"It's important to note that these overlaps may be helpful for understanding shared features and associations, but these results should not be over-interpreted," said Ferguson. "For example, our results do not imply that religion is a delusion, that historical religious figures suffered from alien limb syndrome, or that Parkinson's disease arises due to a lack of religious faith. Instead, our results point to the deep roots of spiritual beliefs in a part of our brain that's been implicated in many important functions."

The authors note that the datasets they used do not provide rich information about the patient's upbringing, which can have an influence over spiritual beliefs, and that patients in both datasets were from predominantly Christian cultures. To understand the generalizability of their results, they would need to replicate their study across many backgrounds. The team is also interested in untangling religiosity and spirituality to understand brain circuits that may be driving differences. Additionally, Ferguson would like to pursue clinical and translational applications for the findings, including understanding the role that spirituality and compassion may have in clinical treatment.

"Only recently have medicine and spirituality been fractionated from one another. There seems to be this perennial union between healing and spirituality across cultures and civilizations," said Ferguson. "I'm interested in the degree to which our understanding of brain circuits could help craft scientifically grounded, clinically-translatable questions about how healing and spirituality can co-inform each other."

Credit: 
Brigham and Women's Hospital

Rethinking plastics

image: University of Delaware researchers LaShanda Korley (left) and Thomas Epps, III, are co-authors of a Science magazine article calling for a concerted effort to address the urgent crisis of plastics pollution. With collaborators from the United Kingdom and Lawrence Berkeley National Lab in California, they call for new approaches to plastics design, production and use, with the goal of keeping plastics out of landfills and waterways, reusing the valuable resources they represent indefinitely in a "circular" plastics economy.

Image: 
Graphics by Jeffrey C. Chase

People lived without plastic until the last century or so, but most of us would find it hard to imagine how.

Plastics now are everywhere in our lives, providing low-cost convenience and other benefits in countless applications. They can be shaped to almost any task, from wispy films to squishy children's toys and hard-core components. They have shown themselves vital in medicine and have been pivotal in the global effort to slow the spread of the COVID-19 pandemic over the past 16 months.

Plastics seem indispensable these days.

Unfortunately for the long-term, they are also nearly indestructible. Our planet now bears the weight of more than seven billion tons of plastics, with more being produced every day. An ever-growing waste stream clogs our landfills, pollutes our waterways and poses an urgent crisis for our planet.

Four scientists have published a call to action in a new issue of Science devoted to the plastics problem.

In a sweeping introductory article, the scientists -- including two from the University of Delaware, one from the Lawrence Berkeley National Laboratory in California and another from the University of Sheffield in the United Kingdom -- call for fundamental change in the way plastics are designed, produced, used and reused.

The ultimate goal: Designing, adopting and ensuring a "circular" lifecycle for plastics that leads not to a landfill or an ocean or a roadside, but to a long life of near-infinite use and reuse of the valuable resources and applications they represent.

That requires new approaches to chemistry, engineering, industrial processes, policy and global collaboration, according to co-authors LaShanda T.J. Korley, director of the Center for Plastics Innovation (CPI) at the University of Delaware and the principal investigator of a National Science Foundation (NSF) Partnerships for International Research and Education effort in Bio-inspired Materials and Systems; UD's Thomas H. Epps, III, co-director of CPI, lead principal investigator of an NSF Growing Convergence Research (GCR) effort in Materials Life-Cycle Management and director of the Center for Hybrid, Active, and Responsive Materials (CHARM) at UD; Brett A. Helms of the Molecular Foundry at Lawrence Berkeley National Laboratory in California; and Anthony J. Ryan of the Grantham Centre for Sustainable Futures at the University of Sheffield in the United Kingdom.

"The plastics waste dilemma is a global challenge that requires urgent intervention and a concerted effort that links partners across industrial, academic, financial, and government sectors buttressed by significant investments in sustainability," they write.

It's a tall order that includes attention to recycling, "upcycling" (reusing materials in new added-value ways), development of new materials and recognition of the needs of under-resourced communities.

"There's not a one-size-fits-all solution," said Korley, Distinguished Professor of Materials Science and Engineering at UD, who has spent her career developing new plastics with specific properties. "How people live with waste and how they recycle is so different. Traveling in Europe has highlighted the stark contrast in the usage of single-use plastics, such as drinking straws and cutlery in comparison to the U.S. Across the U.S., cities and municipalities within a single state may do things differently."

Complex recipes are used in many plastics, Korley said, and often include several kinds of polymers and other additives. Each component can complicate recycling efforts or make recycling impossible, which is why recyclers will accept some kinds of plastic and refuse others.

But how can plastics be designed so that all of their components can be deconstructed for future use in other products?

This is the challenge for CPI, which Korley directs. Its focus is on "upcycling" plastics -- finding ways to turn plastic waste into valuable materials such as fuels and lubricants. Researchers use catalysis and enzymes to reconstitute some kinds of plastic, such as high-density polyethylene (HDPE), low-density polyethylene (LDPE) and polystyrene/Styrofoam, the kinds of plastics used in milk jugs, shampoo bottles, sandwich bags, coffee cups, grocery bags and food packaging.

"Different materials properties require the use of different polymers and blends and additives, which contributes to the complexity and hierarchy of waste," Korley said.

The Science paper addresses that and much more, with an urgency that reflects the real and present dangers for a planet choked by discarded plastics that aren't going anywhere anytime soon.

Some of those realities are grim indeed. Take the plastic water bottle that helped quench your thirst after a morning jog five years ago, for example. It will probably be with us -- somewhere -- for another 395 years. Slow deterioration doesn't help us either: scientists have found that tiny, worn-down bits of plastic, known as microplastics, are prevalent in the water we drink and the foods we eat.

Less than 10% of plastic waste is recycled at all, and less than 1% will be recycled more than once. About 12% will be incinerated. Millions of tons of discarded plastic wind up in giant swirls of debris in the ocean, while the rest piles up in landfills, sinks into riverbeds or lies on roadsides around the world.
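Treating those figures as rough upper bounds, the arithmetic looks something like the sketch below. Note that the seven-billion-ton total is cumulative plastic while the percentages describe plastic waste, so this is an order-of-magnitude illustration rather than a precise accounting.

TOTAL_TONS  = 7e9   # "more than seven billion tons" of plastics
RECYCLED    = 0.10  # "less than 10%" recycled at all (upper bound)
INCINERATED = 0.12  # "about 12%" incinerated

remainder = 1.0 - RECYCLED - INCINERATED
print(f"Recycled (at most):             {TOTAL_TONS * RECYCLED:.1e} tons")
print(f"Incinerated (approx.):          {TOTAL_TONS * INCINERATED:.1e} tons")
print(f"Landfill, waterways, roadsides: ~{TOTAL_TONS * remainder:.1e} tons ({remainder:.0%})")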

But Helms, a co-author from the Lawrence Berkeley National Lab, was part of the team that created a next-generation plastic called PDK (polydiketoenamine), which can be reduced back to its molecular parts and reassembled as needed.

"We're at a critical point where we need to think about the infrastructure needed to modernize recycling facilities for future waste sorting and processing," Helms said after the new material was announced. "If these facilities were designed to recycle or upcycle PDK and related plastics, then we would be able to more effectively divert plastic from landfills and the oceans. This is an exciting time to start thinking about how to design both materials and recycling facilities to enable circular plastics."

The building blocks of plastics -- monomers -- are made up of elements including carbon, hydrogen, oxygen, nitrogen, chlorine and sulfur. These monomers are linked by chemical bonds to become polymers, which can be used in the formation of plastics to be crafted into various forms for many different uses.

The value of all those resources is lost in single-use applications, said Sheffield's Ryan. He calls it a "convenient truth": the convenience and low cost of such products make them compelling to consumers, who rarely recognize the material's inherent value or its cost to the planet. Marketing strategies that label certain plastic products "green" or biodegradable to draw well-intentioned consumers concern him especially.

"Cynical 'greenwash' is the biggest problem for plastics sustainability," he said. "So I was very keen to work with LaShanda and Thomas on this. I have known them since they were Ph.D. students."

Korley and Epps are at the forefront of efforts to extend the life of petroleum- and bio-derived plastics and, where possible, to put them on a circular path that runs from production through first use to reconstitution, again and again. Innovation and collaboration are pillars of the new centers they lead: Korley's U.S. Department of Energy-backed CPI, and the NSF-backed CHARM and GCR efforts led by Epps, who is the Allan and Myra Ferguson Distinguished Professor of Chemical and Biomolecular Engineering.

Ryan said he sees a "circular economy" as critical. Recycling, upcycling and the development of new materials all have value, he said, but none of them is a "silver bullet." Addressing the plastics dilemma requires recognizing the true value of plastics.

"One solution is something America is not very good at -- regulations, policy and taxation," he said. "There isn't an easy answer to the plastics problem. An unrestrained market isn't going to provide it.

"For all of these issues where science and engineering and society intersect, the answer is always: It's complicated."

A more accurate perspective, in Ryan's view, is to see the plastics problem as related to the climate change problem without allowing it to be a distraction.

"Climate change is an inconvenient truth and an invisible truth," he said. "You can't see what's causing it and you can't see carbon dioxide in the atmosphere. You don't associate driving to the store with climate change.

"You do associate things with plastics waste -- and that is a convenient truth. We have no problem taking fossil fuels and turning them into plastics. But now we need to take care of that precious plastic. Don't just throw it away. It's just too cheap. Because of the pollution problem, we need to give it an artificially high price."

Lifecycle analysis data are key to making evidence-based decisions, Ryan said, and consumers and lawmakers can't do that on their own. They need professionals to break down the costs and benefits and explain the options.

"It's far more complex than most people are willing to consider," he said.

The call to action is comprehensive.

"To achieve a more sustainable future, integration of not only technological considerations, but also equity analysis, consumer behavior, geographical demands, policy reform, life-cycle assessment, infrastructure alignment, and supply chain partnerships are vital," the authors said.

Korley said she sees growing passion for this daunting challenge.

"These initiatives drive excitement among our students -- high school, undergraduate and graduate and our postdocs," she said. "People are passionate about doing something to better the world. And they can talk to their grandmother or their niece or nephew and explain why the work they are doing matters."

Credit: 
University of Delaware

Skin in the game: Transformative approach uses the human body to recharge smartwatches

image: Sunghoon Ivan Lee demonstrates how the wearable device is charged through his left forearm's contact with the power transmitter below the keyboard.

Image: 
UMass Amherst

As smartwatches become increasingly able to monitor vital signs of health, including what's going on while we sleep, a problem has emerged: these wearable, wireless devices are often off our bodies overnight, charging at the bedside.

"Quality of sleep and its patterns contain a lot of important information about patients' health conditions," says Sunghoon Ivan Lee, assistant professor in the University of Massachusetts Amherst College of Information and Computer Sciences and director of the Advanced Human Health Analytics Laboratory.

But that information can't be tracked on smartwatches if the wearable devices are being charged as users sleep, which prior research has shown is frequently the case. Lee adds, "The main reason users discontinue the long-term use of wearable devices is because they have to frequently charge the on-device battery."

Pondering this problem, Lee brainstormed with UMass Amherst wearable computing engineer Jeremy Gummeson to find a solution to continuously recharge these devices on the body so they can monitor the user's health 24/7.

The scientists' aha moment came when they realized "human skin is a conductible material," Lee recalls. "Why can't we instrument daily objects, such as the office desk, chair and car steering wheel, so they can seamlessly transfer power through human skin to charge up a watch or any wearable sensor while the users interact with them? Like, using human skin as a wire.

"Then we can motivate people to do things like sleep tracking because they never have to take their watch off to charge it," he adds.

In a paper published in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Lee, Gummeson and lead author Noor Mohammed, a Ph.D. student in Lee's lab, lay out the technical groundwork and showcase its feasibility. "I am hopeful that this will open a lot of possibilities toward the development of battery-less wearable devices both for consumer and clinical applications," Mohammed says.

This week, the UMass Amherst team received a $598,720 grant from the National Science Foundation to continue to develop the system's hardware and software.

Gummeson, an assistant professor of electrical and computer engineering, explains how the technology uses human tissue as a transfer medium for power. "In this device we have an electrode that couples to the human body, which you could think of as the red wire, if you're thinking of a traditional battery with a pair of red and black wires," he says.

The equivalent of the conventional black wire is formed between two metal plates, one embedded in the wearable device and one in an instrumented everyday object; the plates become coupled (or virtually connected) through the surrounding environment when the frequency of the energy carrier signal is sufficiently high, in the hundreds of megahertz (MHz) range.
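A back-of-the-envelope sketch helps explain why such a high carrier frequency matters. The coupling capacitance below is an assumed, order-of-magnitude value chosen for illustration (it is not reported by the team); the point is simply that the impedance of a capacitive path falls as frequency rises, which is what makes the "virtual" return wire practical in the hundreds-of-MHz range.

import math

def capacitive_impedance_ohms(freq_hz, capacitance_f):
    # Magnitude of an ideal capacitor's impedance: |Z| = 1 / (2 * pi * f * C)
    return 1.0 / (2.0 * math.pi * freq_hz * capacitance_f)

C_COUPLING = 5e-12  # 5 pF -- assumed environmental coupling, for illustration only

for f in (60.0, 1e6, 300e6):  # mains frequency, 1 MHz, and a ~300 MHz carrier
    z = capacitive_impedance_ohms(f, C_COUPLING)
    print(f"f = {f:>13,.0f} Hz -> |Z| ~ {z:,.0f} ohms")

At 60 Hz the assumed 5 pF path looks like hundreds of megohms, effectively an open circuit, while at a few hundred MHz it drops to roughly a hundred ohms, low enough to close the loop through the surroundings.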

The researchers tested a prototype of their technology with 10 people in three scenarios during which the individuals' arm or hand made contact with the power transmitter - either as they worked on a desktop keyboard or a laptop, or as they were holding the steering wheel of a car.

Their research showed that approximately 0.5 to 1 milliwatt (mW) of direct current (DC) power was transferred to the wrist-worn device using the skin as the transfer medium. This small amount of power conforms to safety limits established by the International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the Federal Communications Commission (FCC).
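To put 0.5 to 1 mW in context, a quick calculation shows why that level of power can matter for an ultra-low-power tracker but not for a full-featured smartwatch. Only the 0.5-1 mW transfer range comes from the study; the contact hours and average device draws below are assumptions made for illustration.

TRANSFER_POWER_MW = (0.5, 1.0)   # measured transfer range reported in the study
CONTACT_HOURS_PER_DAY = 6        # assumed hours touching instrumented desks/wheels

# Assumed average power draws, for illustration only (not figures from the study):
DEVICE_DRAW_MW = {"ultra-low-power tracker": 0.5, "full-featured smartwatch": 50.0}

for p_mw in TRANSFER_POWER_MW:
    harvested_mwh = p_mw * CONTACT_HOURS_PER_DAY  # energy delivered in a day (mWh)
    for device, draw_mw in DEVICE_DRAW_MW.items():
        runtime_h = harvested_mwh / draw_mw        # hours of operation that energy funds
        print(f"{p_mw} mW x {CONTACT_HOURS_PER_DAY} h = {harvested_mwh:.1f} mWh "
              f"-> about {runtime_h:.1f} h of a {device} (assumed {draw_mw} mW draw)")

Under these assumptions, a day of incidental contact could keep a very frugal tracker topped up but would cover only minutes of a power-hungry smartwatch, consistent with the researchers' own framing below.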

"You can think of the amount of power that gets transmitted by our technology as roughly comparable to what's transmitted through the human body when you stand on a body composition scale, hence poses minimal health risks," Gummeson says.

The person who comes into contact with the power transmitter feels no sensation. "This is way beyond the frequency range that the human can actually perceive," Lee says.

The prototype currently doesn't produce enough power to continuously operate a sophisticated device such as an Apple Watch but could support ultra-low-power fitness trackers like Fitbit Flex and Xiaomi Mi-Bands.

The UMass Amherst team aims to improve the power transfer rate in subsequent studies and says smart wearable devices also will become more power-efficient as technologies advance. "We imagine in the future as we further optimize the power that's consumed by the wearable sensors, we could reduce and ultimately eliminate the charging time," Gummeson says.

Lee adds, "We think this is an innovative solution."

Credit: 
University of Massachusetts Amherst