A team of researchers at the University of Bristol has developed a technique that applies deep learning to a sensing robotic fingertip, allowing the fingertip to infer more detailed information about the surfaces it touches. More details are available in their paper published in ‘IEEE Robotics & Automation Magazine’.
Video Credits: Nathan Lepora, YouTube
One of the researchers who carried out this study is Professor Nathan Lepora. For almost a decade, Professor Lepora has been trying to recreate the sense of touch in robots. He said, “Our overall idea was to artificially recreate the sense of touch when controlling robots as they physically interact with their surroundings.”
He added, “Humans do this without thinking—for example, when brushing their fingers over an object to feel its shape. However, the computations underlying this are surprisingly complex. We implemented this type of physical interaction on a robot, by applying deep learning to an artificial fingertip that senses analogously to human skin.”
The deep learning technique developed by Professor Lepora and his team works by producing accurate estimates of surface angles, which enables better control of robotic fingertips. In the future, it could allow robots to possess physical dexterity comparable to that of humans, enabling them to efficiently adapt their grasping and manipulation strategies to the object they are interacting with.
Speaking about the role of deep learning in the research, Professor Lepora said, “Deep learning allowed us to construct reliable maps from the sensory data to surface features such as edge angle.”
He explained further, “This is difficult, because sliding a soft human-like fingertip over surfaces distorts the data it gathers. Previously, we were not able to separate this distortion from the shape of the surface, but in this work, we succeeded by training a deep convolutional neural network with examples of distorted tactile data, which allowed us to produce accurate surface angle estimates to within a fraction of a degree.”
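The idea described here can be sketched as a small convolutional network that regresses a surface angle from a tactile image. The following is a minimal illustrative sketch, not the authors' code: the network architecture, image size, and training setup are all assumptions, and random tensors stand in for real tactile images labelled with ground-truth angles.

```python
import torch
import torch.nn as nn

class TactileAngleNet(nn.Module):
    """Toy CNN mapping a 64x64 tactile image to a single angle estimate."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 -> 16x16
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),  # single output: estimated surface angle
        )

    def forward(self, x):
        return self.regressor(self.features(x))

# One training step on synthetic stand-in data. In the real setting, the
# inputs would be distorted tactile images and the labels the true angles.
model = TactileAngleNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

images = torch.randn(8, 1, 64, 64)   # batch of 8 fake tactile images
angles = torch.rand(8, 1) * 180.0    # fake ground-truth angles in degrees

pred = model(images)
loss = loss_fn(pred, angles)
loss.backward()
optimizer.step()
```

Training many such steps on labelled examples is what lets the network learn to separate the fingertip's soft distortion from the underlying surface shape.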
Once robots are equipped with a sense of touch, they can improve the control of their hands and fingertips, enabling them to estimate the shape and texture of the objects, or parts of objects, they come in contact with.
In his previous work, Professor Lepora had used machine learning techniques such as probabilistic classifiers. He found that these techniques enabled robots to perform only basic tasks, such as feeling simple 2D shapes with a slow tapping motion.
Speaking about the paper, Professor Lepora said, “The breakthrough in this new paper was that the methods we used work in three dimensions on natural complex objects, sliding the fingertip much as humans would do. We could do this because of the advances in deep learning over the last few years.”
The researchers demonstrated the effectiveness of their technique by applying it to a single robotic fingertip. With further development, it could be applied to a soft robot’s limbs and fingertips, allowing it to handle tools and complete manipulation tasks much as humans do.
This technology could prove useful in developing more capable robots for a range of purposes, such as completing household chores, attending to a patient’s needs, or picking produce on farms.