Tech

Washable electronic textiles to usher in an era of even smarter wearable products

image: Experiments using the new electronic textile to drive LEDs in RGB colors.

Image: 
Korea Institute of Science and Technology (KIST)

With the wearable electronic device market having firmly established itself in the 21st century, active research is being conducted on electronic textiles, which are textiles (e.g. clothing) capable of functioning like electronic devices. Fabric-based items are flexible and can be worn comfortably all day, making them the ideal platform for wearable electronic devices.

The research team of Dr. Jung-ah Lim of the Korea Institute of Science and Technology (KIST, president: Byung-gwon Lee) announced that it has developed a fibrous transistor with a fiber structure, giving it the characteristics of a textile while allowing it to be inserted into clothing and to retain adequate functionality even after washing.

Existing technology involves physically attaching a solid electronic device (a sensor, etc.) to the surface of clothing or using conductive textiles to connect various devices, with little to no attention paid to the wearer's comfort. Existing thread-type transistors are made by depositing a flat transistor onto a single conductive thread. Electrodes made in this manner require a high voltage to be activated, and the low current they generate is often insufficient to drive display devices (LEDs, etc.). Until now, it was also difficult to form electronic circuits through contact with other devices (as in woven fabrics) or to apply a protective layer that would make the transistor washable.

The transistor developed by the KIST research team is made by connecting twisted electrodes. Using this structure, the team was able to adjust the length of the threads and the thickness of the semiconductor to obtain currents over 1,000 times higher than those possible with existing transistors, even at low voltages (below -1.3 V).

Through tests, Lim's team confirmed that even after being bent or wound around a thin cylindrical object (approximately 7 millimeters thick) over 1,000 times, the transistor maintained over 80 percent of its performance. The team also reported that performance remained adequate even after the transistor was washed in water containing detergent. The team was also able to drive an LED device with the transistor inserted between the threads of clothing, and to measure electrocardiogram signals through signal amplification.

Lim said, "The results of this study point to a new device structure that can overcome the limitations of current electronic textiles, including low current, high activation voltage, and low resilience to washing. We expect that our study will contribute to the development of even smarter wearable products in the future, including next-generation wearable computers and smart clothing that can monitor vital signs."

Credit: 
National Research Council of Science & Technology

Americans still eat too much processed meat and too little fish

Philadelphia, June 21, 2019 - A new study in the Journal of the Academy of Nutrition and Dietetics, published by Elsevier, found that the amount of processed meat consumed by Americans has remained unchanged over the past 18 years, and that their intake of fish/shellfish has not increased either. In addition, one quarter of US adults still eat more unprocessed red meat than the recommended level, and less than 15 percent meet the guidelines for fish/shellfish consumption. On a positive note, Americans are eating less beef and more chicken than they did 18 years ago; in fact, for the first time, consumption of poultry exceeds that of unprocessed red meat.

"Despite strong evidence linking processed meat with cancer risk, consumption of processed meat among US adults didn't change over the study period (1999-2016)," said lead investigator Fang Fang Zhang, MD, PhD, Friedman School of Nutrition Science and Policy, Tufts University, Boston, MA, USA. "While factors other than health (e.g., social, cultural, and economic) can influence Americans' food choices, the lack of widespread awareness of health risks associated with processed meat may have contributed to the lack of consumption change in the past 18 years. Our findings support further actions to increase the public awareness of the health risks associated with high processed meat consumption in the US."

The study used a nationally representative sample of dietary data from nearly 44,000 US adults (ages 20 and older) who participated in the National Health and Nutrition Examination Survey (NHANES) through 2016. The investigators assessed trends in consumption of processed meat, unprocessed red meat, poultry, fish, and shellfish over the past 18 years, along with purchase locations.

In addition to the overall trends noted above using full NHANES data, the research team also compared NHANES data from 1999-2000 to 2015-2016. Key findings include:

Processed meats: Consumption remained unchanged - 182 grams/week compared with 187 grams/week.

Top five consumed (percentage among total, 2015-2016): Luncheon meat (39 percent), sausage (24 percent), hot dog (9 percent), ham (9 percent), and bacon (5 percent)

Primary purchase locations: Stores and fast-food restaurants

Unprocessed red meat: Decreasing trend - 340 grams/week compared with 284 grams/week, primarily due to decreased consumption of beef (down by 78 grams/week).

Poultry: Increasing trend - 256 grams/week compared with 303 grams/week, primarily due to increased consumption of chicken (up by 34 grams/week)

Fish/seafood: Consumption remained unchanged - 115 grams/week compared with 116 grams/week

There is accumulating evidence linking excessive consumption of processed meat to increased risk of obesity, diabetes, cardiovascular diseases, and some cancers. Processed meat has been classified as "carcinogenic to humans" (Group 1) by the International Agency for Research on Cancer (IARC). The American Cancer Society (ACS) and the World Cancer Research Fund (WCRF)/American Institute for Cancer Research (AICR) have issued recommendations to limit processed meat consumption for cancer prevention. A study by Dr. Zhang published last month estimated that 14,524 new cancer cases among US adults aged 20 years and older in 2015 were attributable to high consumption of processed meat. Future research is needed to identify barriers to reducing processed meat consumption, evaluate the effectiveness of potential public health interventions, and explore policies such as nutrition quality standards, excise taxes, and health warning labels.

The low consumption of fish/shellfish among US adults could be due to its high retail price, lack of awareness of its health benefits, and concerns about mercury contamination in certain fish, although the scientific evidence suggests that the benefits of fish intake exceed the potential risks for most individuals. Given that fish consumption (2015-2016) was only half of the recommended level in the 2015-2020 Dietary Guidelines for Americans, efforts are needed to promote the consumption and variety of seafood, especially those varieties high in omega-3 fatty acids.

"Findings of this study can inform public health policy priorities for improving diet and reducing chronic disease burden in the US. Because stores and fast-food restaurants are main purchase locations for processed meat, future policies may prioritize these as primary sites of intervention for reducing processed meat consumption among US adults," noted Dr. Zhang.

Credit: 
Elsevier

Researchers find a mechanism to improve pancreatic islet transplantation in type 1 diabetes

image: Inhibiting PTP1B phosphatase activity allows transplanted beta cells in the pancreas to survive and fulfil their function, restoring normal blood sugar levels.

Image: 
Figueiredo et al.

The leading cause of lost functionality in transplanted pancreatic islets is their low capacity to create new vessels that deliver nutrients to the cells. This is one of the main reasons transplantation fails as a treatment for type 1 diabetes.

Researchers from the University of Barcelona and IDIBAPS led a study that identifies a protein as a potential modulator of the revascularization of pancreatic islets. In a study conducted on diabetic mice transplanted with islets from other animals or with human islets, the researchers showed that grafts lacking this protein show higher revascularization, favouring cell viability, and that normal sugar levels and glucose tolerance are recovered.

The study is coordinated by Ramon Gomis, professor at the Faculty of Medicine and Health Sciences of the University of Barcelona, head of the IDIBAPS research group Pancreatic Islets: biomarkers and function, and researcher at the Diabetes and Associated Metabolic Diseases Networking Biomedical Research Centre (CIBERDEM), and Rosa Gasa, researcher in the same group.

The first author of the study, published in the journal Science Translational Medicine, is Hugo Figueiredo, a researcher in the IDIBAPS group.

Regenerative medicine for type 1 diabetes treatment

One strategy used in the treatment of type 1 diabetes, based on regenerative medicine, is pancreatic islet transplantation. Islets are formed by different types of cells with an endocrine function that produce hormones such as insulin and glucagon. In type 1 diabetes, the beta cells in the islets, which are responsible for the production of insulin, are selectively destroyed by an autoimmune process.

For this reason, islet transplantation can restore physiological function in patients with this type of diabetes. "Although this transplant is carried out in some centers, it has limitations, such as the chronic administration of immunosuppressants, and it is applied in those cases in which the disease is not properly controlled. At the moment, it is indicated in the context of a kidney transplant, and we opt for a dual vascular kidney-pancreas transplant," says Ramon Gomis, coordinator of the study.

In the context of islet transplantation, there are two basic challenges: the chronic administration of immunosuppressants, and the fact that the transplant is not revascularized properly, which makes it harder for nutrients and oxygen to reach the cells. This deficiency causes the islets to stop being viable and to die.

In their normal environment, islets are surrounded by a dense network of capillaries that delivers oxygen and nutrients and transports the hormones they generate into the bloodstream. During transplantation, islets are separated from their vascular network, and their proper function and subsequent survival depend on their ability to create new vessels connecting to the recipient's vascular system. "The islet implant is revascularized, but not fast enough. In the article, we focused on getting enough vessels to keep the islets in optimal condition and improve the success of this strategy for the treatment of type 1 diabetes," notes researcher Rosa Gasa, coordinator of the study.

Improving revascularization is one of the challenges

In the study, the researchers identified a molecular target, the phosphatase PTP1B, whose inhibition would allow transplanted pancreatic islets to remain viable. The results show that inhibiting this enzyme, which is present in all cells including pancreatic beta cells, promotes the activity of the pro-angiogenic growth factor VEGF, which enables the creation of new blood vessels. The result is greater revascularization of the graft, which improves islet survival and functionality. "Revascularization is induced by hypoxia and the lack of nutrients, and the inhibition of the phosphatase amplifies this response. When the stimulus disappears, the creation of new blood vessels stops," notes Rosa Gasa.

"This article is a concept proof that can remove one of the reasons why the pancreatic islet transplant fails. There are less specific inhibitors of PTP1B and phosphatases and therefore, the next step will be to test these inhibitors in the islet transplant in humans and assess its success", concludes Ramon Gomis.

Credit: 
University of Barcelona

Multi-mobile (M2) computing system makes android & iOS apps sharable on multiple devices

image: Multi-mobile (M2) Computing System Makes Android and iOS Apps Sharable on Multiple Devices. M2 integrates cameras, displays, microphones, speakers, sensors, and GPS to improve audio conferencing, media recording, and Wii-like gaming, and allow greater access for disabled users.

Image: 
Naser AlDuaij/Columbia Engineering

New York, NY--June 20, 2019--Computer scientists at Columbia Engineering have developed a new computing system that enables current, unmodified mobile apps to combine and share multiple devices, including cameras, displays, speakers, microphones, sensors, and GPS, across multiple smartphones and tablets. Called M2, the new system operates across heterogeneous systems, including Android and iOS, combining the functionality of multiple mobile systems into a more powerful one that gives users a seamless experience across the various systems.

With the advent of bezel-less smartphones and tablets, M2 answers the growing demand for multi-mobile computing: users can dynamically switch their Netflix or Spotify streams from their smartphones to a collection of other nearby systems for a larger display or better audio. Instead of using smartphones and tablets in isolation, users can combine their systems' functionalities so that they all work together. Users can even combine photos taken from different cameras and from different angles into a single, detailed 3D image.

"Given the many popular and familiar apps out there, we can combine and mix systems to do cool things with these existing unmodified apps without forcing developers to adopt new set of APIs and tools," says Naser AlDuaij, the study's lead author and a PhD student working with Computer Science Professor Jason Nieh. "We wanted to use M2 to target all apps without adding any overhead to app development. Users can even use M2 to run Android apps from their iPhones."

The challenge for the team was that mobile systems are not only highly heterogeneous, but that heterogeneous device sharing is also difficult to support. Beyond hardware heterogeneity, there are also many diverse platforms and OS versions, with a wide range of incompatible device interfaces that dictate how software applications communicate with hardware.

While different mobile systems have different APIs and low-level devices are vendor-specific, the high-level device data provided to apps is generally in a standard format. So AlDuaij took a high-level device data approach and designed M2 to import and export device data in a common format to and from systems, avoiding the need to bridge incompatible mobile systems and device APIs. This method enables M2 not only to share devices, but also to mix and combine devices of different types of data since it can aggregate or manipulate device data in a known format.
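
To make the idea concrete, here is a minimal sketch of the common-format approach. This is hypothetical illustration only, not M2's actual API: the record fields and function names are invented for the example.

```python
# Hypothetical sketch of the common-format idea; not M2's real API.
# Each system exports device data as simple records, and transformations
# operate on the shared format instead of vendor-specific interfaces.

def to_common_touch(x_frac, y_frac, source_id):
    """Normalize a touch event to resolution-independent coordinates."""
    return {"type": "touch", "x": x_frac, "y": y_frac, "source": source_id}

def scale_to_display(event, width_px, height_px):
    """Map a common-format touch event onto a target display's pixels."""
    return (int(event["x"] * width_px), int(event["y"] * height_px))

# A touch at the center of a phone screen...
event = to_common_touch(0.5, 0.5, source_id="phone-A")

# ...lands at the center of a tablet display with a different resolution.
print(scale_to_display(event, 2048, 1536))  # (1024, 768)
```

Because every device's data passes through the same representation, aggregating input from several touchscreens or converting one device type into another reduces to simple operations on these records.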

"With M2, we are introducing device transformation, a framework that enables different devices across disparate systems to be substituted and combined with one another to support multi-mobile heterogeneity, functionality, and transparency," says AlDuaij, who presented the study today at MobiSys 2019, the 17th ACM International Conference on Mobile Systems, Applications, and Services. "We can easily manipulate or convert device data because it's in a standard format. For example, we can easily scale and aggregate touchscreen input. We can also convert display frames to camera frames or vice versa. M2 enables us to reinterpret or represent different devices in different ways."

Among M2's device "transformations" is fusing display data from multiple devices to provide a multi-headed display for a better "big screen" viewing or gaming experience. By converting accelerometer sensor data to input touches, M2 can transform a smartphone into a Nintendo Wii-like remote to control a game on another system. Eye movements can also be turned into touchscreen input, a useful accessibility feature for disabled users who cannot use their hands.

For audio conferencing without having to use costly specialized equipment, M2 can be deployed on smartphones across a room to leverage their microphones from multiple vantage points, providing superior speaker-identifiable sound quality and noise cancellation. M2 can redirect a display to a camera so that stock camera apps can record a Netflix or YouTube video and can also enable panoramic video recording by fusing the camera inputs from two systems to create a wider sweeping view. One potentially popular application would let parents seated next to each other record their child's wide-angled school or sports performance.

"Doing all this without having to modify apps means that users can continue to use their favorite apps with an enhanced experience," AlDuaij says. "M2 is a win-win--users don't need to worry about which apps would support such functionality and developers don't need to spend time and money to update their apps."

Using M2 is simple: all a user has to do is download the M2 app from Google Play or Apple's App Store. No other software is needed. One mobile system runs the unmodified app; the input and output from all systems are combined and shared with the app.

"Our M2 system is easy to use, runs efficiently, and scales well, especially compared to existing approaches," Nieh notes. "We think that multi-mobile computing offers a broader, richer experience with the ability to combine multiple devices from multiple systems together in new ways."

The Columbia team has started discussions with mobile OS vendors and phone manufacturers to incorporate M2 technologies into the next releases of their products. With a few minor modifications to current systems, mobile OS vendors can make multi-mobile computing broadly available to everyone.

Credit: 
Columbia University School of Engineering and Applied Science

Gold for silver: A chemical barter

From effective medicines to molecular sensors to fuel cells, metal clusters are becoming fundamentally useful in the health, environment, and energy sectors. This diverse functionality of clusters arises from their variability in size and type. Now, scientists led by Professor Yuichi Negishi, of the Department of Applied Chemistry at Tokyo University of Science, add to this ongoing tale by explaining the dynamics of a metal cluster, the thiolate-protected gold-silver alloy, in solution; this helps in understanding the stability, geometry, and tunability of these clusters for their applications.

Metal clusters are formed when a number of metal atoms come together into clumps somewhere between the size of a molecule and that of a bulk solid. Recently, these clusters have gained a lot of attention owing to their diverse chemical capabilities, which depend on their size and composition. Unlike the closed, regular, and stable packing observed in bulk metal lattices, the geometry of these clusters, which often governs their chemical reactivity, is based on special atomic arrangements that minimize the energy. Furthermore, their functionalities vary depending on the number of constituent atoms in the cluster. Because these micro-level factors govern the ultimate macro-level activity of the clusters, understanding cluster dynamics at the atomic scale is essential. Recent exploration in the field has enabled the cataloging of these clumps as compounds of defined chemical compositions.

One such interesting metal cluster with catalytic properties and luminescence is the thiolate-protected gold-silver alloy cluster. These metal clusters are formed when thiolate-protected individual gold and silver clusters are kept together in a solution. The individual pure clusters undergo metal exchange, like a chemical "barter": a gold atom for a silver atom. While this cluster-metal complex reaction (CMCR) method is widely used, its actual dynamics and the energy incentive driving such processes are not understood. This became the seed of curiosity for Prof. Negishi's team, as they state: "the dynamic behavior of these clusters in solution must be taken into consideration to understand the origins of the catalytic activity and luminescence properties of gold-silver alloy clusters in addition to the geometric structure."

To illuminate the metal exchange behavior between the pure clusters after synthesis, the team devised an experiment based on reverse-phase chromatography. They chose this setup because it differentiates molecules based on electronic features, i.e., whether the molecule is polar (with simultaneous positive and negative centers) or non-polar (without separation of charge).

Using this setup proved useful: the team found that individual structural isomers (different spatial and geometric arrangements of a given cluster) change in solution even though the mass of the cluster remains unchanged. This indicated an intra-cluster exchange of metal atoms, which changed the electronic state of the cluster while its mass stayed the same. They also reported that, with the passage of time after synthesis, the concentrations of different structural types of gold-silver alloys in the solution changed, indicating that an inter-cluster metal exchange was also at play. Lastly, the researchers observed that inter-cluster metal exchange occurs much more frequently right after synthesis and eventually slows down after the solution stands for a long time. They attributed this to differences in stability and energy among the structures. "The metastable geometries formed initially likely convert to thermodynamically stable geometries through inter-cluster (and intra-cluster) metal exchange in solution," explains Prof. Negishi.

The scientists verified their claims about the observed dynamics of the cluster-metal complex reaction (CMCR) by carrying out a comparative study with an alternate synthesis procedure. Since the traditional procedure (co-reduction of metal ions) produces alloys under severe conditions, only the thermodynamically and energetically favorable structures see the light of day. Thus, predominantly stable structures are formed, indicating that metal exchange is relatively suppressed. This stood in contrast to the clusters formed by CMCR, where signatures of various species are initially observed. As time passes, like all things in nature, the unstable species rearrange themselves into stable ones. How? Through metal exchange, of course!

To summarize, Prof. Negishi states, "These results demonstrate that gold-silver alloy clusters have different geometric structures (and distributions) immediately after synthesis, depending on the synthesis method. Thereby, their dynamic behavior in solution also depends on the synthesis method."

The study of clusters with varying core sizes and compositions offers exciting opportunities to harness novel physical and chemical properties. But that's not all. It also provides insight into their structure-property relationships, almost like peeping into the "social life" of atoms!

Credit: 
Tokyo University of Science

One step closer to chronic pain relief

Sortilin, which is a protein expressed on the surface of nerve cells, plays a crucial role in pain development in laboratory mice - and in all likelihood in humans as well. This is the main conclusion of the study 'Sortilin gates neurotensin and BDNF signalling to control peripheral neuropathic pain', which has just been published in the journal Science Advances.

The results are based on a decade of basic research, and even though the studies have so far been carried out only in mice, they provide hope for the development of a medicine that can help people with pain induced by nerve injury, called neuropathic pain by medical professionals.

This pain may be triggered by an acute injury or by a chronic disease in the pain pathways, such as diabetes, and is characterised by different sensations, including burning, pricking, stinging, tingling, freezing or stabbing, in a chronic and disabling way.

The patients have in common that they could fill a shopping basket with painkillers ranging from local anaesthetic ointments to morphine "without ever really getting any good results," as the primary author of the article, Assistant Professor Mette Richner, puts it. She is employed at the Department of Biomedicine and the DANDRITE research centre, both part of Aarhus University, Denmark.

Mette Richner explains that chronic pain is triggered by overactive nerve cells, i.e. nerve cells where the regulation of their activity is not working properly. For this reason, it is necessary to gain knowledge of the changes happening at the molecular level to be able to 'nudge things into place again'.

"And it's here, at the molecular level, that we've now added a crucial piece to a larger puzzle," says Mette Richner, who explains that sortilin - and now things get a little convoluted - appears to 'put the brakes on the brake' which, at the molecular level, stops the body's pain development.

"Once nerve damage has occurred, and the nerve cells go into overdrive, molecules are released which start a domino effect that ultimately triggers pain. The domino effect can be inhibited by a particular molecule in the spinal cord called neurotensin, and our studies show that the neurotensin is 'captured' by sortilin, so that the brake is itself inhibited," explains Mette Richner, who began on the project as a PhD student in Professor Anders Nykjaer's group and subsequently completed it as a postdoc in Associate Professor's Christian B. Vaegter's research group. Both are last authors of the study.

The research group's hope is that the pharmaceutical industry will continue to investigate whether it is possible to block sortilin locally in the spinal cord, so that the neurotensin can move freely and get the brake to function, thereby inhibiting the pain. In connection with this, Christian Vaegter emphasises that there is obviously some way to go from mouse to human.

"Our research is carried out on mice, but as some of the fundamental mechanisms are quite similar in humans and mice, it still gives an indication of what is happening in people suffering from chronic pain," says Christian Vaegter.

The idea of studying the complicated pain-related puzzle in relation to the spinal cord arises from a decade's worth of research into both pain and sortilin. The initial studies revolved around mice that lack the ability to form sortilin and were apparently pain-free despite nerve damage - and of course the studies were done in accordance with methods approved by the Danish Animal Experiments Inspectorate.

The research group subsequently ascertained that normal mice likewise did not develop pain after nerve damage when the researchers blocked sortilin - and from there the hunt for the correlation began, before it was ultimately explained by the regulation of the pain-inhibiting molecule neurotensin.

Chronic pain in brief

Around eight percent of the population suffer from neuropathic pain, and the number of sufferers is expected to increase in step with longer life expectancy and more lifestyle diseases.

It is triggered by chronic diseases such as diabetes and multiple sclerosis and affects around one third of people in these two groups of patients. Chronic pain frequently occurs following amputations and is seen in almost seven out of ten patients with brain and spinal cord injuries. Chronic pain also affects seven out of ten patients receiving chemotherapy.

Chronic pain can occur in all parts of the body and is triggered by nerve damage. In principle, the cause could be a bicycle accident (deep wounds, serious blows), sports injuries, amputation, chemotherapy or autoimmune diseases.

Chronic pain is defined as pain that has failed to disappear three months after the wound has healed.

Credit: 
Aarhus University

Using graphene and tiny droplets to detect stomach-cancer causing bacteria

image: This is a schematic illustration of the study design and results

Image: 
Osaka University

Osaka, Japan - Biosensors are currently used in healthcare to monitor blood glucose; however, they also have the potential to detect bacteria. Researchers at Osaka University have invented a new biosensor using graphene--a material consisting of a one-atom-thick layer of carbon--to detect bacteria such as those that attack the stomach lining and that have been linked to stomach cancer. When the bacteria interact with the biosensor, chemical reactions are triggered which are detected by the graphene. To enable detection of the chemical reaction products, the researchers used microfluidics to contain the bacteria in extremely tiny droplets close to the sensor surface.

To get the bacteria to stick, the researchers covered the graphene with antibodies, a common way of anchoring bacteria to biosensor surfaces. However, although antibodies are very small (~10 nm), on the atomic scale, compared with the atom-thin layer of graphene, they are actually quite large and bulky. While the bacteria interact with the antibodies, the graphene cannot detect those bacteria directly because the antibodies on its surface block the signal; this signal-blocking effect is referred to as Debye screening.

To overcome the Debye screening limitation, the researchers instead decided to monitor chemical reactions being performed by the bacteria in the presence of certain chemicals, which they added to the tiny water droplet. The chemicals produced in the reactions are far smaller than the antibodies and can slip between them easily and reach the graphene surface. By only analyzing the bacteria in tiny droplets generated through microfluidics, the bacteria and their reaction products can be kept close to the graphene surface and the concentration of the reaction products can even be monitored over time.

"Our biosensor enables highly sensitive and quantitative detection of bacteria that cause stomach ulcers and stomach cancer by limiting its reaction in a well-defined microvolume," study co-author Kazuhiko Matsumoto says.

The graphene sensing surface feeds back electrical signals that vary depending on how much of the reaction product is present in the microdroplet and how quickly it is accumulating. These electrical signals can be used to calculate the number of bacteria in the droplet. The graphene is set up in a field-effect transistor (FET) structure, whose role is to dramatically amplify the electrical detection signals from the graphene sensing surface.

"Our biosensor is essentially a mini laboratory on a graphene FET. This sensor demonstrates how two-dimensional materials such as graphene are getting closer to being applied in practical medical and healthcare applications," first author Takao Ono says.

The results of the study can be used to create a whole host of these "lab-on-a-graphene-FET" biosensors to detect various different bacteria. Detection of tiny concentrations of bacteria could be achieved in less than 30 minutes; hence, this work points to faster diagnoses of potentially harmful bacteria in the future.

Credit: 
Osaka University

Why climate change means a rethink of coffee and cocoa production systems

image: Coffee and cocoa are both traditionally grown under tree shade in order to reduce heat stress and conserve soil.

Image: 
Bioversity International/K.DeSousa

Global demand for coffee and cocoa is on the rise. Yet across the equatorial belt where these two crops are produced, the future is not looking bright. Climate change in the tropics is pushing coffee and cocoa closer to the limits of physiological tolerance and constraining the places where they can grow in the future.

A new study examines future climate scenarios in Mesoamerica and how they could affect the distribution of these crops from Panama to Central Mexico. Coffee production, especially of Arabica coffee, will likely decrease as global warming and extreme weather events reduce the geographical areas where it grows best, and increase susceptibility to pest and disease outbreaks - coffee leaf rust affected 70% of the coffee farms in Central America in 2017. Cocoa may have a more positive future.

The study concludes that half of the current coffee plantations that are vulnerable to global warming could be replaced by cocoa in the future. "This opens a window of opportunity for climate change adaptation," highlights Kaue de Sousa, research fellow and lead author of the study. "The interest of smallholder farmers in cocoa is growing, driven by the vulnerability of coffee in the changing climate. Now we have to build capacity among smallholders to adapt their crop systems successfully," continues de Sousa.

Both crops are mostly grown under agroforestry management - where trees are incorporated into farming systems. "Coffee and cocoa are both traditionally grown under tree shade in order to reduce heat stress and conserve soil, but the shade trees are typically ignored in most future climate change studies", says Roeland Kindt, Senior Ecologist at World Agroforestry. The agroforestry approach also brings additional ecosystem services, which make the production system more resilient, for example, by conserving water and providing habitats for birds and insects which can act as natural pest predators.

"Agroforestry systems are clear examples of how positive interactions between plants can ameliorate harsh growing conditions and facilitate agricultural productivity. Our study explores which tree species may be more successful in future coffee and cocoa plantations to create more benign microclimates" says Milena Holmgren, Expert on Ecosystem Resilience to Climate Variability, at Wageningen University.

When the researchers examined the top ten trees currently present in coffee and cocoa agroforestry systems, they found, worryingly, that these are the ones most vulnerable to climate change. The authors found that the distribution ranges of almost 80% of tree species in coffee areas and 62% in cocoa areas will drastically shrink. These include tree species that are important for fruit (e.g. mango, guava and avocado) and timber (e.g. cedar), as well as an estimated 56% loss of nitrogen-fixing trees (e.g. poro and guama), which can enhance soil productivity and conservation. "Despite the concerning decrease in tree suitability, our study provides alternatives for coffee and cocoa agroforestry under the climate emergency faced by farmers today," adds de Sousa.

Transforming agroforestry systems by changing tree species composition remains the best bet to adapt most of the coffee and cocoa farms across Mesoamerica, the study recommends. This would involve urgent changes to land use planning, incorporating diversified tree species and including underutilized species into redesigned agroforestry systems. The seed sector also needs to step up by offering farmers seeds and seedlings of the most suitable tree species for each climatic zone. Farmers also need to get on board.

"Farmers need to rethink current agroforestry species composition and use a portfolio that is suitable in the future climate. This is a challenging task, because it takes a long time for farmers to see their investments bearing fruit," explains de Sousa. "We identified that this potential may rely on currently underutilized tree species, such as June plum, sapodilla and breadnut; species that are currently present in coffee and cocoa systems, but in low densities, as they are mainly remnants of previous farm vegetation rather than being actively planted and managed by farmers."

Jenny Ordonez, senior author, concludes: "This study is a very useful first step to improve the design of agroforestry systems, as it shows a window of opportunity to maintain diversified agroforestry systems using underutilized species or novel combinations of species. It also opens new areas of research to promote the use of underutilized species that will maintain their suitability under climate change."

Credit: 
Bioversity International

Laser method promising for detecting trace chemicals in air

WASHINGTON -- Researchers have developed a new laser-based method that can detect electric charges and chemicals of interest with unprecedented sensitivity. The new approach could one day offer a way to scan large areas for radioactive material or hazardous chemicals for safety and security applications.

The new technique, called mid-infrared picosecond laser-driven electron avalanche, detects extremely low charge densities -- the number of electric charges in a certain volume -- in air or other gases. The researchers were able to measure electron densities in air produced by a radioactive source at levels below one part per quadrillion, equivalent to picking out one free electron from a million billion normal air molecules.
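
As a quick sanity check on that figure, one can compare the 1,000-electrons-per-cubic-centimeter detection floor reported later in this article against the molecular density of air. The sketch below assumes the standard value of roughly 2.5e19 molecules per cubic centimeter for sea-level air.

```python
# Back-of-envelope check (assumes ~2.5e19 molecules/cm^3 in sea-level air,
# a standard value, and the 1000 electrons/cm^3 floor quoted below).
air_molecules_per_cm3 = 2.5e19
electrons_per_cm3 = 1e3

fraction = electrons_per_cm3 / air_molecules_per_cm3
print(fraction)  # ~4e-17, comfortably below 1e-15 (one part per quadrillion)
```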

In Optica, The Optical Society's journal for high impact research, researchers from the University of Maryland report using the new method to calibrate lasers used to inspect irradiated air from 1 meter away. They say the approach could be applied to detecting other chemicals and species and could be scaled up for remote detection at distances of 10 meters and, eventually, 100 meters.

"We can determine charge densities much too low to measure with any other method," said Daniel Woodbury, the lead author on the paper. "We demonstrate the method's ability to detect a radioactive source, but it could eventually be used for any situation that requires measuring trace amounts of a chemical in a gas, such as helping to track pollution, chemicals or safety hazards."

Detecting electrons in air

The new technique is based on a process known as electron avalanche in which a laser beam accelerates a single free electron in a gas until it gains enough energy to knock a different electron off a molecule, resulting in a second free electron. This process repeats and develops into a collisional cascade, or avalanche, that grows exponentially until a bright observable spark appears in the laser focus.

"Although laser-driven electron avalanche has existed since the 1960s, we used a new kind of high-energy, long-wavelength laser -- a picosecond mid-IR laser -- to enable detection of localized collisional cascades seeded only by the initial free electrons," said Howard M. Milchberg, the research team lead. "When shorter wavelength laser pulses are used, the original free electrons seeding the avalanches are masked by free electrons generated directly by laser photons, rather than through collisions."

The research builds on the group's previous work, which demonstrated that avalanche breakdown driven by a mid-IR laser was sensitive to the density of electrons near a radioactive source and changed the amount of time it took for the breakdown to happen.

"We conceived this method to remotely measure radiation near a radioactive source because the signals from Geiger counters and scintillators, conventional detectors of radioactive decay products, drop significantly at distances far from the source," said Robert M. Schwartz, a student working on the project. "With a laser beam, however, we can remotely probe electrons produced in air near the source."

However, in their previous experiments it was hard to determine exactly how many electrons were seeding a breakdown because the avalanche growth is exponential. "Ten, 100 or even 1000 electrons could all produce very similar signals," said Woodbury. "While we could use theoretical models to give rough estimates, we couldn't definitively say what electron densities we were measuring."

In the new work, the researchers realized that, for the right laser pulse length, the multiple breakdowns seeded by individual electrons inside the laser focus would remain distinct. Taking images of the laser focal volume and counting these sparks -- each seeded by an individual electron -- is equivalent to measuring the density of these original seed electrons.
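
In other words, the measurement reduces to counting. A minimal sketch of that counting step follows; the numbers are made up for illustration, and the real imaging and volume calibration are of course more involved.

```python
# Illustrative sketch of the counting step; all numbers are hypothetical.
# Each distinct spark in the imaged focal volume is seeded by one free
# electron, so density = sparks counted / volume probed.

spark_count = 12             # sparks counted in one image of the laser focus
focal_volume_cm3 = 1.2e-2    # calibrated imaged volume (made-up value)

seed_density = spark_count / focal_volume_cm3
print(f"{seed_density:.0f} electrons/cm^3")  # 1000 electrons/cm^3
```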

They found that a mid-infrared laser (3.9-micron wavelength) with a 50-picosecond pulse duration hit the sweet spot in terms of both wavelength and pulse duration.

Sensitivity plus location and time information

The researchers demonstrated the viability of the detection concept by using it to measure charge densities produced near a radioactive source that ionizes the air. They measured electron densities down to a concentration of 1000 electrons per cubic centimeter, limited by the background charge in air from cosmic rays and naturally occurring radioactivity. The method was used to precisely benchmark their laser avalanche probe for remote detection of the radioactive source.

"Other methods are limited to approximately 10 million times higher concentrations of electrons with little to no spatial and temporal resolution," said Milchberg. "Our method can count electrons directly and determine their location with a precision on the order of ten microns on time scales of about 10 picoseconds."

The researchers say that the technique can be used to measure ultra-low charge densities from a range of sources including strong field physics interactions or chemical species. "Pairing the picosecond mid-IR laser with a second laser that selectively ionizes a molecule of interest could allow the technique to measure the presence of chemicals with sensitivities far better than 1 part per trillion, the current limit for detecting very small concentrations in a gas," said Woodbury. They are continuing work to make the method more practical for use in the field.

Credit: 
Optica

Silver loading and switching: Unintended consequences of pulling health policy levers

PITTSBURGH, June 20, 2019 - A move by the White House in 2017--decried by many health policy analysts as an attempt to undercut the Affordable Care Act (ACA)--had unanticipated consequences that improved the affordability of health insurance for Marketplace enrollees, a University of Pittsburgh Graduate School of Public Health-led analysis confirms.

The findings, reported today in the journal Health Services Research, show that the Trump Administration's cut of the ACA's cost-sharing reduction payments to health insurers caused insurance providers to compensate by changing the distribution of premiums in ways that increase federal government subsidies to Marketplace enrollees. And, surprisingly, geographic markets where a single insurer has a monopoly produced the best pricing for low-income enrollees.

"The narrative about monopoly markets has largely been doom and gloom," said Coleman Drake, Ph.D., assistant professor in Pitt Public Health's Department of Health Policy and Management. "But actually, in terms of affordability, monopoly insurance markets are resulting in very low- to no-cost premiums for Marketplace enrollees. On the other hand, this is a really inefficient way to spend federal tax dollars to create affordable health insurance."

The federal government provides premium tax credits to people with incomes at or below 400% of the federal poverty level who buy health insurance through HealthCare.gov or similar state-based Marketplaces. The amount of the tax credit or subsidy varies depending on the market, or region, where the person is buying health insurance, because different insurers offer different plans with different premiums.

Every health insurer participating in the Marketplace offers health insurance plans that correspond to a "metal level"--bronze, silver, gold and platinum--with bronze costing the least and offering the lowest benefit generosity. The subsidy is determined based on each market's "premium spread," defined as the difference between the second lowest cost silver plan--the "benchmark"--and the lowest cost plan offered in the market.

For example, a single enrollee in 2018 whose income is 180% of the federal poverty limit would be expected to pay $100 per month for health insurance. If the premium on the benchmark plan in their region was $200 per month and the lowest cost plan was $140, then the premium spread would be $60. That enrollee would pay $40 for the plan with the lowest cost premium, which is equal to that person's $100 expected monthly contribution less the $60 subsidy for the premium spread.
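
The arithmetic in that example can be written out directly. This is a simplified sketch of the calculation described above; actual ACA subsidy rules include caps and other details omitted here.

```python
# Simplified premium-spread arithmetic from the example above; real ACA
# subsidy rules have additional details omitted here.
benchmark_premium = 200.0      # second-lowest-cost silver plan, $/month
lowest_premium = 140.0         # lowest-cost plan in the market, $/month
expected_contribution = 100.0  # enrollee's expected payment at 180% FPL

premium_spread = benchmark_premium - lowest_premium  # $60
subsidy = benchmark_premium - expected_contribution  # $100 tax credit
out_of_pocket = lowest_premium - subsidy             # $40 for the cheapest plan

print(premium_spread, subsidy, out_of_pocket)  # 60.0 100.0 40.0
```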

When it was first enacted, the ACA also provided additional help to Marketplace enrollees with incomes at or below 250% of the federal poverty level by enabling them to obtain policies with lower co-payments and deductibles, also known as cost-sharing reduction subsidies. In turn, the government compensated insurers for the additional costs associated with offering these more generous benefits to very low-income enrollees. These cost-sharing reduction subsidy payments to insurers are what the Trump Administration cut in October 2017.

In response, state insurance commissioners in 42 states instructed insurers to "silver load," which means increasing the premium for benchmark silver plans to cover these additional costs, thereby increasing the premium spread and creating larger premium subsidies. Silver loading works best in markets with only one insurer because that monopoly insurer sets the premium for both the benchmark silver plan and the lowest cost plan.

The average monthly premium spread before the cost-sharing reduction cut was about $60. Following the cut, the average monthly premium spread jumped to $133.52 in 2018 and $147.94 in 2019.

But that approach did not benefit individuals who bought their insurance outside of the Marketplace or who did not qualify for premium tax credits. As a result, insurance commissioners are increasingly encouraging "silver switching," whereby insurers are allowed to sell off-Marketplace plans that are very similar, but not identical, to on-Marketplace plans in terms of benefits, but only the on-Marketplace plans are silver loaded. This allowed the off-Marketplace plans to retain lower premiums and was permitted in 24 states in 2018 and 29 in 2019.

States that allowed both silver loading and silver switching saw a 121% jump in premium spreads, compared to a 71% jump in states that only allowed silver loading, indicating that insurers were cautious about losing off-Marketplace customers with the increased premiums in states that allowed silver loading but not silver switching.

Coupling silver loading and silver switching thus maximizes premium affordability for enrollees on and off the Marketplace.

"States that are taking this second step and allowing both silver loading and silver switching are trying to ensure that insurers continue to operate in the individual market and that consumers at middle and higher incomes can afford health insurance," said coauthor Jean Marie Abraham, Ph.D., Wegmiller Professor of Healthcare Administration in the Division of Health Policy and Management at the University of Minnesota School of Public Health. "Of course, an important trade-off is that such policy responses ultimately lead to higher federal government spending than would have otherwise occurred under the original policy."

Credit: 
University of Pittsburgh

Inflammatory mechanisms may underlie increased risk of prostate cancer among WTC responders

Bottom Line: Inflammatory and immune-regulatory mechanisms were found to be altered in animal models and in archived prostate cancer tumor samples of responders exposed to dust from the World Trade Center terrorist attacks on Sept. 11, 2001.

Journal in Which the Study was Published: Molecular Cancer Research, a journal of the American Association for Cancer Research

Authors: Emanuela Taioli, MD, PhD, director of the Institute for Translational Epidemiology at the Icahn School of Medicine at Mount Sinai and associate director for Population Science at the Tisch Cancer Institute, both in New York; and William Oh, MD, chief of the Division of Hematology and Medical Oncology at the Icahn School of Medicine at Mount Sinai and deputy director at the Tisch Cancer Institute

Background: "World Trade Center responders show an overall increase in cancer incidence, including prostate cancer," said Taioli. "It is important to address the reasons for this increased incidence in order to prevent new cases in this aging cohort."

Previous work has reported an increased incidence of prostate cancer among responders to attacks on the World Trade Center. However, because these responders have been closely monitored, it remains unknown whether this increased incidence is truly related to exposure to carcinogens in the World Trade Center dust or whether it is a result of surveillance bias, noted Taioli. "Our study aimed to determine potential underlying mechanisms that could explain the link between World Trade Center responders and increased prostate cancer incidence," she said.

How the Study Was Conducted and Results: To identify differential patterns of gene expression potentially caused by exposure to World Trade Center dust, the researchers compared archived prostate cancer tumors from unexposed individuals (14 patients) and World Trade Center responders (15 patients). Prostate cancer tumors taken from responders showed a downregulation of genes involved in immune-cell chemotaxis and proliferation and an upregulation of genes involved in apoptosis and immune modulation, compared with prostate cancer tumors taken from unexposed patients. Additionally, cell-type enrichment analyses revealed an upregulation of pro-inflammatory cell types in prostate cancer tissue samples taken from responders compared with samples taken from unexposed patients.

To understand how inhalation of World Trade Center dust may affect a healthy prostate, the researchers exposed rats to dust that was collected at Ground Zero within 72 hours after the attacks. Rats anesthetized using isoflurane received either a two-hour exposure to the dust or no exposure to the dust for two consecutive days. The dose was adjusted to mimic the level of dust first responders would have inhaled during the initial three days at Ground Zero. To identify both immediate and delayed responses to World Trade Center dust, rat prostates were harvested at one day or 30 days post exposure and analyzed.

Rat prostate samples taken after one day of exposure revealed an upregulation of pro-inflammatory cell types compared with controls. Prostate samples taken 30 days post exposure revealed an upregulation of genes involved in cholesterol biosynthesis compared with controls.

"Cholesterol is an important precursor to androgens, which are known to drive prostate cancer development," explained Oh. "Our preliminary finding that exposure to World Trade Center dust increased the expression of genes in the cholesterol biosynthesis pathway highlights an additional mechanism by which environmental exposures may lead to the progression of prostate cancer."

Author's Comments: "It has been recognized that inflammation may be an important consideration in prostate cancer progression," said Oh.

"In our study, both the archived human prostate cancer tissues of 9/11 responders and the prostates of rats experimentally exposed to World Trade Center dust showed an increase in pro-inflammatory cell types," noted Taioli. "This finding represents the first mechanistic link between exposure to World Trade Center dust and prostate cancer."

Oh noted, "Our results suggest that inflammatory mechanisms are activated in the prostate after exposure to World Trade Center dust, which may give rise to chronic inflammation and contribute to prostate cancer progression."

Study Limitations: Limitations of the study include a small sample size of archived human prostate samples.

Funding & Disclosures: This study was sponsored by grants from the Centers for Disease Control and Prevention and the National Institute for Occupational Safety and Health.
Oh is a consultant/advisory board member for Sema4, CheckPoint Sciences, AstraZeneca, Sanofi, Genzyme, Bayer, and Janssen.

Credit: 
American Association for Cancer Research

Discovery of a 'holy grail' with the invention of universal computer memory

image: This new electronic memory device would allow computers that do not need to boot up and that could go into an energy-saving sleep mode - even between keystrokes.

Image: 
Lancaster University

A new type of computer memory which could solve the digital technology energy crisis has been invented and patented by scientists from Lancaster University in the UK.

The electronic memory device - described in research published in Scientific Reports - promises to transform daily life with its ultra-low energy consumption.

In the home, energy savings from efficient lighting and appliances have been completely wiped out by increased use of computers and gadgets, and by 2025 a 'tsunami of data' is expected to consume a fifth of global electricity.

But this new device would immediately reduce peak power consumption in data centres by a fifth.

It would also allow, for example, computers that do not need to boot up and could instantaneously and imperceptibly go into an energy-saving sleep mode - even between keystrokes.

The device is the realisation of the search for a "Universal Memory" which has preoccupied scientists and engineers for decades.

Physics Professor Manus Hayne of Lancaster University said: "Universal Memory, which has robustly stored data that is easily changed, is widely considered to be unfeasible, or even impossible, but this device demonstrates its contradictory properties."

A US patent has been awarded for the electronic memory device with another patent pending, while several companies have expressed an interest or are actively involved in the research.

The inventors of the device used quantum mechanics to solve the dilemma of choosing between stable, long-term data storage and low-energy writing and erasing.

The device could replace the $100bn market for Dynamic Random Access Memory (DRAM), which is the 'working memory' of computers, as well as the long-term memory in flash drives.

While writing data to DRAM is fast and low-energy, the data is volatile and must be continuously 'refreshed' to avoid it being lost: this is clearly inconvenient and inefficient. Flash stores data robustly, but writing and erasing it is slow and energy-intensive and degrades the memory, making it unsuitable for working memory.

Professor Hayne said: "The ideal is to combine the advantages of both without their drawbacks, and this is what we have demonstrated. Our device has an intrinsic data storage time that is predicted to exceed the age of the Universe, yet it can record or delete data using 100 times less energy than DRAM."

Credit: 
Lancaster University

Millions with neurological diseases could find new option in neurostimulation devices

WEST LAFAYETTE, Ind. - The United States is seeing an increase in the number of neurological diseases. Stroke is ranked as the fifth leading cause of death, with Alzheimer's ranked sixth. Another neurological disease, Parkinson's, affects nearly 1 million people in the U.S.

Implantable neurostimulation devices are a common way to treat some of these diseases. One of the most commonly used elements in these devices is the platinum microelectrode, but it is prone to corrosion, which can reduce the functional lifetime of the devices.

Purdue University researchers have come up with a solution to help - they are adding a graphene monolayer to the devices to protect the microelectrodes. The research is published in the June 6 edition of 2D Materials.

"I know from my industry experience that the reliability of implantable devices is a critical issue for translating technology into clinics," said Hyowon "Hugh" Lee, an assistant professor in Purdue's College of Engineering and a researcher at the Birck Nanotechnology Center, who led the research team. "This is part of our research focusing on augmenting and improving implantable devices using nano and microscale technologies for more reliable and advanced treatments. We are the first ones that I know of to address the platinum corrosion issue in neurostimulation microelectrodes."

Lee said he learned about the advantage of using graphene from his colleague at Birck Nanotechnology Center, Zhihong Chen, who is an expert in graphene technology. The team has shown the graphene monolayer to be an effective diffusion barrier and electrical conductor.

"If you attempt to deliver more charge than the electrode can handle, it can corrode the electrode and damage the surrounding tissues," Lee said. He also thinks that microscale electrodes are going to play a key role in the future with more demand for precise and targeted neurostimulation therapy. "We think neurosurgeons, neurologists, and other scientists in neuroengineering field will be able to use this electrode technology to better help patients with implantable devices for restoring eyesight, movement, and other lost functionalities."

Lee and his team are working with the Purdue Research Foundation Office of Technology Commercialization on patenting and licensing the technology. They are looking for partners interested in licensing it.

The work aligns with Purdue's Giant Leaps celebration of the university's global advancements made in health care research as part of Purdue's 150th anniversary. It is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Credit: 
Purdue University

Nursing home care costs significantly outpace general inflation and medical care prices

WASHINGTON -- One of the largest studies on out-of-pocket costs for nursing home care finds prices are high and rising faster than other medical care and consumer prices, reports a team of health policy researchers.

Their study, published in Medical Care Research and Review, examines nursing home prices in eight states between 2005 and 2010 and finds out-of-pocket prices rising significantly faster than both general inflation and medical care inflation.

For example, annual out-of-pocket costs for nursing home care in California increased by as much as 30% during the study period.

The study also finds substantial price variation across states. In 2010, at an average of $131 a day (about $47,800 annually), Texas had the least expensive nursing home out-of-pocket cost, while New York State, at $334 daily ($121,910 a year), had the most expensive.

The study also finds that prices differ between nursing homes even after adjusting for staffing levels and geographical differences.

For-profit nursing home chains charged the lowest prices, while nonprofit chains charged the most. The price differential between for-profit and nonprofit chains is about $4,160 annually, equivalent to 6.2% of the average price of for-profit nursing homes. However, there is no statistically significant difference in prices between for-profit and nonprofit independently operated nursing homes.
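
The arithmetic behind these figures can be reproduced directly from the reported daily rates and percentages; a quick Python sketch:

```python
# Reproducing the study's annual figures from the reported daily rates.
DAYS_PER_YEAR = 365

texas_daily, new_york_daily = 131, 334
print(f"Texas:    ${texas_daily * DAYS_PER_YEAR:,}/year")     # $47,815 ("about $47,800")
print(f"New York: ${new_york_daily * DAYS_PER_YEAR:,}/year")  # $121,910

# The $4,160 annual gap between for-profit and nonprofit chains equals
# 6.2% of the average for-profit price, implying an average for-profit
# price of roughly $67,000 a year.
implied_for_profit_avg = 4160 / 0.062
print(f"Implied average for-profit price: ${implied_for_profit_avg:,.0f}/year")
```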

The researchers also find that higher market concentration of nursing homes leads to higher prices, and that nursing homes near capacity charge more than those with more rooms available.

The study aims to bring more transparency to the out-of-pocket prices of nursing home care. "Not many people have those kinds of resources, and so it is important to understand how fast prices grow and how they vary," says the study's lead author, Sean Huang, PhD, MA, assistant professor in the Department of Health Systems Administration at the School of Nursing & Health Studies at Georgetown University Medical Center.

Typically, individuals in need of nursing home care who do not have Medicaid pay out of pocket until they run out of money; only then do they become eligible for Medicaid, Huang says. Only a small fraction of nursing home residents have private insurance, such as long-term care insurance, that helps cover the costs.

This study used a unique dataset on nursing home prices from 2005 to 2010 across eight states. "Very few people have studied this topic, so it required building the largest dataset on nursing home prices to date," Huang says. "This kind of information is very valuable to potential consumers of this care."

Credit: 
Georgetown University Medical Center

Tailor-made prosthetic liners could help more amputees walk again

image: Dr. Elena Seminati, John Roberts, Matt Young and Dr. Vimal Dhokia display a bespoke prosthetic liner.

Image: 
University of Bath

Researchers at the University of Bath have developed a new way of designing and manufacturing bespoke prosthetic liners in less than a day.

This potentially life-changing project combines advances in computer science with an innovative manufacturing process to create affordable new personalised prosthetic liners for lower limb amputees.

There are more than 45,000 people in England alone who rely on prosthetic limbs, with more than 5,000 people each year having new lower-limb amputations. For these individuals, the interface between their residual limb (the amputation site) and their artificial limb is of critical importance for maintaining healthy, active lives, and a good fit could make the difference in whether or not they walk again.

Following an amputation, a person's residual limb constantly changes in shape and size during the healing process, which can last between 12 and 18 months. This variation can result in a poorly fitting liner, leading to tissue damage, pain and discomfort for amputees. The problem is exacerbated by patient activity levels and environmental conditions such as hot weather, and the discomfort can lead to patients abandoning their prosthesis and rehabilitation regime, instead being forced to rely on a wheelchair.

This project, drawing on the expertise of a multidisciplinary team of researchers from the University of Bath's Department of Mechanical Engineering, Department for Health and Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA), is a new approach to providing the liners which fit inside the prosthetic socket that attaches to the artificial leg.

Amputees must currently return to their NHS prosthetist every time their limb changes size to have their socket replaced or adjusted - often many times in the first year following amputation. By providing a series of personalised liners of different sizes that all fit within the same prosthetic socket, the frequency of these visits can be reduced, improving patient well-being and saving the NHS time and money.

Using a state-of-the-art scanner that quickly captures 3D shape, the research team precisely scans an amputee's residuum. The scanned data is used to create a full digital model of the residuum, which in turn is used to design the personalised liner. The liner is then manufactured using a cryogenic machining technique, negating the need for complex and time-consuming moulds.

At Bath, the researchers are using a soft, neoprene-like polymer - similar to that used in wetsuits - for the liner, which is more comfortable than the silicone liners used by the NHS. The entire process takes less than a day, from scanning right through to the physical liner being fitted.
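
In outline, the production route is a four-step pipeline. The hypothetical sketch below illustrates its shape; every function and type name is invented for illustration, as the release does not describe the team's actual software:

```python
from dataclasses import dataclass

# Hypothetical outline of the scan -> model -> design -> machine workflow.

@dataclass
class ResiduumScan:
    points_mm: list   # 3D point cloud from the scanner, e.g. [(x, y, z), ...]

def build_digital_model(scan: ResiduumScan) -> list:
    # Stand-in for reconstructing a full digital surface model of the
    # residuum from the scan data.
    return scan.points_mm

def design_liner(surface_mm: list, wall_mm: float = 3.0) -> dict:
    # Stand-in for offsetting the residuum surface outward by the
    # liner's wall thickness to obtain the liner geometry.
    return {"inner_surface_mm": surface_mm, "wall_mm": wall_mm}

def machine_liner(liner: dict) -> str:
    # Stand-in for the cryogenic machining step, which cuts the soft
    # polymer directly and so avoids making a mould.
    return f"Liner machined: wall thickness {liner['wall_mm']} mm"

scan = ResiduumScan(points_mm=[(0.0, 0.0, 0.0)])   # placeholder data
print(machine_liner(design_liner(build_digital_model(scan))))
```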

Case study: John Roberts

The Bath researchers are currently trialling this approach with an amputee volunteer, John Roberts. He was born with one leg shorter than the other and, despite multiple operations, suffered years of chronic pain before finally electing for amputation in 2017. After lifelong pain, John found living post-amputation a relief, but as his stump healed it changed shape over time, and the socket of the prosthetic limb began to rub, causing blisters and irritation.

To help his socket fit his stump, he currently wears multiple socks and a silicone layer; with the new liner, he can simply wear it next to his skin because it fits perfectly.

This not only gives a more comfortable fit, but makes it easier and faster to fit the limb, meaning John can quickly put on his prosthetic if he needs to get out of bed at night or in the case of a fire.

John said: "I've been quite active since the amputation, enjoying walking and gardening again, so my stump has changed shape a lot, meaning I have to wear up to six layers of socks to make sure the prosthetic still fits properly.

"I've had a few issues with rubbing causing blood blisters with my socket. But with this new liner, blisters aren't really a problem.

"The other really good thing is that I can put on my leg quickly in an emergency. I was very impressed with the new liner, it's amazing what you can do with technology!"

To test the new liner, the researchers inserted pressure sensors inside the socket to check the fit of the liner and used motion capture technology to monitor John's gait.

Dr Elena Seminati, Lecturer in Clinical Biomechanics at the University of Bath, said: "We use pressure sensors inside the liner to check that the pressure is not too high, which could cause skin damage.

"Secondly we use motion capture technology to check that the movement of John's lower limbs is symmetrical and we also measure him walking across force plates to ensure there is no overloading in his knee, ankle and hip joints.

"We've found this new liner reduces the pressure on the stump significantly, reducing the risk of skin damage and making it more comfortable to walk.

"We hope this technology will help many amputees in the future."

Lecturer in the Department of Mechanical Engineering, Dr Vimal Dhokia, said: "There's a window of around 18 months in which an amputee decides whether to use prosthetics to learn to walk again or use a wheelchair. Unfortunately this is the time when the residuum changes in size and shape as part of the healing process, making it difficult to get a good fitting.

"Our technology will help achieve a comfortable fit for the patient and really make a difference in helping them walk again and improve their quality of life."

CAMERA Centre Manager, Matt Young, added: "By working closely with the NHS rehab team at the Bristol Centre for Enablement we've learnt a lot about the real issues faced by amputees.

"This project came about from conversations with amputees and prosthetists and is focused on providing a new solution that has genuine potential for adoption for the majority of users - rather than just a lucky few with comprehensive health insurance."

The researchers are continuing to test and develop this approach, working with John as well as other volunteers joining the study this autumn, to further improve the process and fit of these liners.

They hope to demonstrate this approach is economically viable for use in the NHS and believe this can reduce the burden and costs on the NHS as well as dramatically improve the quality of life of amputees using prosthetics.

Credit: 
University of Bath