Columbia engineers are the first to demonstrate a robotic finger with a highly precise sense of touch over a complex, multicurved surface. The work was reported in the journal IEEE/ASME Transactions on Mechatronics.
Researchers at Columbia Engineering announced that they have created a new type of robotic finger with a high-precision sense of touch. The finger can localize touch with sub-millimeter accuracy (< 1 mm) over a large, multicurved surface, much like its human counterpart.

Existing methods for building touch sensors have proven difficult to integrate into robot fingers for several reasons: covering multicurved surfaces is hard, the sensors require large numbers of wires, and fitting them into small fingertips is difficult. Together these problems have prevented their use in dexterous hands that can feel what they touch. The Columbia Engineering team took a new approach: overlapping signals from light emitters and light receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.

By measuring light transport between every emitter and receiver, the team showed that a very rich signal data set can be obtained, one that changes in response to deformation of the finger caused by touch. They then demonstrated that purely data-driven deep learning methods can extract useful information from these signals, including contact location and applied normal force. The end result is a fully integrated, touch-sensitive robotic finger with a low wire count, built with accessible manufacturing methods and designed for easy integration into robotic hands.

In this project, the researchers use light to sense touch. Under its "skin," the finger has a layer of transparent silicone into which they embedded more than 30 LEDs. The finger also has more than 30 photodiodes that measure how the light bounces around inside that layer. Whenever the finger touches something, its skin deforms, shifting the light in the transparent layer beneath it. By measuring how much light travels from every LED to every diode, the researchers obtain roughly 1,000 signals, each containing some information about the contact.
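The all-pairs measurement described above can be sketched in a few lines. This is a minimal illustration, not the authors' firmware: the emitter/photodiode counts are assumed (the article says only "more than 30" of each), and `set_emitter` / `read_photodiodes` are hypothetical stand-ins for the hardware I/O.

```python
import numpy as np

N_EMITTERS = 32   # assumed count ("more than 30 LEDs")
N_RECEIVERS = 32  # assumed count ("over 30 photodiodes")

def measure_transport(set_emitter, read_photodiodes):
    """Strobe each LED in turn and record every photodiode reading.

    `set_emitter(i)` and `read_photodiodes()` are placeholders for the
    actual hardware interface. Measuring every emitter-receiver pair
    yields N_EMITTERS * N_RECEIVERS signals (~1,000 here).
    """
    signals = np.empty((N_EMITTERS, N_RECEIVERS))
    for e in range(N_EMITTERS):
        set_emitter(e)                    # turn on LED e, others off
        signals[e] = read_photodiodes()   # read all photodiodes at once
    return signals.ravel()                # flat ~1,000-element feature vector
```

Each touch perturbs many emitter-receiver paths at once, which is why the signals overlap and why the flattened vector is a natural input for a learned model.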
Because light can also bounce around inside a curved space, these signals can cover a complex three-dimensional shape such as a fingertip. For comparison, the human finger provides extraordinarily rich contact information: more than 400 tiny touch sensors per square centimeter of skin.

The team also designed the data-processing pipeline using machine learning algorithms. Because the many signals all partially overlap, the data are too complex for humans to interpret. Fortunately, modern machine learning techniques can extract the information researchers care about: where the finger is being touched, what is touching it, how much force is being applied, and so on.

In addition, the team built the finger so that it can be mounted on robotic hands. Integrating the system into a hand is simple: thanks to this technology, the finger collects nearly 1,000 signals yet needs only a 14-wire cable to connect it to the hand, with no complex off-board electronics required. The researchers already have two dexterous hands in their laboratory, capable of grasping and manipulating objects, equipped with these fingers; one hand has three fingers and the other four. In the coming months, the team will use these hands to demonstrate dexterous manipulation skills based on tactile and proprioceptive data.
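The mapping from overlapping signals to touch information is a regression problem. Below is a minimal sketch of the idea, not the authors' model: a small feed-forward network that maps the ~1,000-element signal vector to a contact location (x, y, z) and a normal force. The layer sizes and initialization are illustrative assumptions; in practice such a network would be trained on recorded touch data.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(n_in=1024, n_hidden=128, n_out=4):
    # One ReLU hidden layer; He-style initialization. Sizes are illustrative.
    return {
        "W1": rng.normal(0.0, np.sqrt(2.0 / n_in), (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0.0, np.sqrt(2.0 / n_hidden), (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def predict_touch(params, signals):
    """Map a flat light-transport signal vector to (location, force).

    Returns the predicted contact point (x, y, z) and normal force.
    """
    h = np.maximum(0.0, signals @ params["W1"] + params["b1"])  # ReLU layer
    out = h @ params["W2"] + params["b2"]                       # linear head
    return out[:3], out[3]
```

Because every touch perturbs many emitter-receiver pairs simultaneously, a learned model like this can disentangle location and force from the overlapping signals far more effectively than any hand-written rule.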