Liquid metal sensors and AI aid prosthetic touch

Researchers in the US have used a combination of stretchable, liquid metal sensors and artificial intelligence to replicate fingertip touch in prosthetics.

With over 3,000 touch receptors that largely respond to pressure, the human fingertip is an essential tool for fine motor skills and the manipulation of objects. However, this refined sense of touch has yet to be synthesised in prosthetics, which often leads to objects being dropped or crushed by a prosthetic hand.

In a new study, published in the journal Sensors, researchers from Florida Atlantic University outline how they fitted the fingertips of a prosthetic hand with stretchable tactile sensors that use liquid metal. Encapsulated within silicone-based elastomers, the technology is claimed to provide advantages over traditional sensors, including high conductivity, compliance, flexibility and stretchability.

The FAU team used individual fingertips on the prosthesis to distinguish between different speeds of a sliding motion along textured surfaces. The four base textures differed in a single parameter: the distance between their ridges. To detect the textures and speeds, the researchers trained four machine learning algorithms, collecting 20 trials for each of ten complex surfaces comprised of randomly generated permutations of the four base textures, and testing the algorithms' ability to distinguish between those ten surfaces.
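The link between ridge spacing and sliding speed can be illustrated with a simple physical sketch: a fingertip sliding at speed v over ridges spaced d apart should excite the sensor at a fundamental frequency of roughly v/d, which is why spectral features can separate both texture and speed. The signal model, sample rate and parameter values below are illustrative assumptions, not taken from the study.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the strongest non-DC frequency component of a 1-D signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0  # ignore the DC offset
    return freqs[np.argmax(spectrum)]

fs = 1000.0                 # assumed sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
v, d = 0.02, 0.002          # 20 mm/s slide over 2 mm ridge spacing
f_expected = v / d          # 10 Hz fundamental
# Simulated sensor trace: ridge-induced sinusoid plus measurement noise
sensor = np.sin(2 * np.pi * f_expected * t) + 0.1 * np.random.randn(len(t))

print(dominant_frequency(sensor, fs))  # peak near 10 Hz
```

Changing either the speed or the ridge spacing shifts this peak, so a spectrum carries information about both variables at once.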

Results showed that integrating tactile information from liquid metal sensors on four prosthetic fingertips simultaneously distinguished between complex, multi-textured surfaces. The machine learning algorithms were able to accurately distinguish between all the speeds with each finger. This new technology could improve the control of prosthetic hands and provide haptic feedback, helping amputees restore a previously severed sense of touch.

"Significant research has been done on tactile sensors for artificial hands, but there is still a need for advances in lightweight, low-cost, robust multimodal tactile sensors," said senior author Erik Engeberg, an Associate Professor at FAU’s Department of Ocean and Mechanical Engineering.

"The tactile information from all the individual fingertips in our study provided the foundation for a higher hand-level of perception enabling the distinction between ten complex, multi-textured surfaces that would not have been possible using purely local information from an individual fingertip. We believe that these tactile details could be useful in the future to afford a more realistic experience for prosthetic hand users through an advanced haptic display, which could enrich the amputee-prosthesis interface and prevent amputees from abandoning their prosthetic hand."

According to Florida Atlantic, the researchers compared four different machine learning algorithms for their classification capabilities: K-nearest neighbour (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). The time-frequency features of the liquid metal sensors were extracted to train and test the machine learning algorithms. The NN generally performed best at speed and texture detection with a single finger, and achieved 99.2 per cent accuracy in distinguishing between the ten multi-textured surfaces using four liquid metal sensors from four fingers simultaneously.
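The comparison described above can be sketched as follows. This is a minimal illustration using scikit-learn (an assumed library choice, not named by the study) on synthetic spectral features, with hypothetical ridge frequencies standing in for the real sensor data; it is not the team's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def spectral_features(texture_freq, fs=1000, n=1000):
    """Synthetic sensor trace: a sinusoid at the texture's ridge frequency
    plus noise, reduced to a normalised magnitude-spectrum feature vector."""
    t = np.arange(n) / fs
    sig = np.sin(2 * np.pi * texture_freq * t) + 0.3 * rng.standard_normal(n)
    spec = np.abs(np.fft.rfft(sig))[:50]
    return spec / spec.max()

texture_freqs = [5, 10, 20, 40]          # four hypothetical textures
X = np.array([spectral_features(f) for f in texture_freqs for _ in range(20)])
y = np.repeat(np.arange(4), 20)          # 20 trials per class, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# The four classifier families compared in the study
models = {
    "KNN": KNeighborsClassifier(n_neighbors=3),
    "SVM": SVC(),
    "RF": RandomForestClassifier(random_state=0),
    "NN": MLPClassifier(max_iter=2000, random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2f}")
```

Because each synthetic texture produces a distinct spectral peak, all four classifiers separate the classes easily here; on real sensor data the margins between algorithms are what studies like this one measure.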

“With this latest technology from our research team, we are one step closer to providing people all over the world with a more natural prosthetic device that can 'feel' and respond to its environment," said Stella Batalama, dean of FAU's College of Engineering and Computer Science.