Here you can find both a written summary and a poster-style summary of my 4th-year master's thesis, entitled "Robotic Touch: A Unified Representation for Geometric and Mechanical Properties of Objects". You can jump to the poster summary here.
My thesis concerned robotic touch and how it can be used to determine both geometric and mechanical properties of objects.
Identifying objects using both vision and touch is something humans do almost unconsciously all the time. In robotics, however, objects are usually identified using vision alone, which isn't always possible (an object may be occluded or poorly lit, for example). Robots can also learn a lot about what an object is and how to manipulate it through properties that are only discoverable by touch. We focused on object stiffness, aiming to estimate the stiffness of objects and classify them using robotic touch.
To do this, we used a robotic arm and a tactile sensor (skin), which we integrated via a software architecture that let us control the arm and gripper using feedback from the skin. We ran experiments on five test objects, squeezing each one incrementally to gather data on the forces measured by the skin and on how tightly the gripper was compressing the object.
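To give a flavour of the data collection, here is a minimal sketch of such a squeeze loop in Python. The gripper and skin interfaces, step size, and force limit are my own placeholders, not the actual architecture:

```python
import numpy as np

def squeeze_and_record(gripper, skin, step_mm=0.5, force_limit=5.0):
    """Close the gripper in small increments, logging gripper width and
    taxel responses at each step, until a force limit is reached."""
    samples = []
    while skin.max_force() < force_limit:       # stop before damaging the object
        gripper.close_by(step_mm)               # one small squeeze increment
        samples.append((gripper.width(), skin.read()))  # (width, taxel array)
    widths, forces = zip(*samples)
    return np.array(widths), np.stack(forces)   # shapes: (n,), (n, n_taxels)
```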
To find the stiffness and classify the objects accordingly, we used two different approaches.
The first was a model-based approach that modelled the grasped object as a series of springs obeying Hooke's Law (F = kx). Some simplifying assumptions about the behaviour of the object and sensor resulted in a linear modelling equation for the stiffness at each taxel (tactile element). We then filtered the data to remove unreliable taxels and dynamic points unaccounted for by our static model, and fit a linear regression to a plot of taxel response against gripper displacement. The gradient of this line served as our local stiffness measurement at that taxel. The overall (global) stiffness of the object was taken as the mean of the local estimates across all taxels. The local estimates were also used to create an augmented point cloud that encoded both geometric and stiffness information about the object, with two forms of interpolation applied to estimate the stiffness at points not covered by a taxel.
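As a rough illustration of the local and global estimates, a minimal sketch is below; the variable names and the filtering threshold are assumptions for illustration, not the thesis code:

```python
import numpy as np

def estimate_stiffness(widths, forces, min_response=0.1):
    """Fit a line to each taxel's response against gripper displacement;
    the gradient is that taxel's local stiffness estimate.

    widths: (n_samples,) gripper widths
    forces: (n_samples, n_taxels) taxel responses
    """
    displacement = widths[0] - widths       # compression relative to first sample
    local = []
    for taxel in forces.T:                  # one column per taxel
        if taxel.max() < min_response:      # skip unreliable taxels
            continue
        gradient, _ = np.polyfit(displacement, taxel, 1)  # slope of the fit line
        local.append(gradient)
    local = np.array(local)
    return local, local.mean()              # local estimates, global (mean) stiffness
```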
Our second approach used several machine learning algorithms to train models that classify objects from a single snapshot of haptic information (taxel responses and gripper width). Three supervised algorithms were used: Random Forests, K-Nearest Neighbours, and Support Vector Machines. One unsupervised algorithm, K-Means Clustering, was also applied. To reduce the complexity of our data, Principal Component Analysis was used to project our 79-dimensional feature space onto three principal axes, allowing the data to be plotted in 3D. Random Forests performed best, achieving up to 100% accuracy on our test set.
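A minimal sketch of this kind of pipeline with scikit-learn is shown below, using random placeholder data in place of our real haptic snapshots; the hyperparameters are illustrative, not the values used in the thesis:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 79))    # placeholder: 79-D haptic snapshots
y = rng.integers(0, 5, size=200)  # placeholder: five object classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Project onto three principal axes, then classify with a Random Forest.
clf = make_pipeline(PCA(n_components=3),
                    RandomForestClassifier(n_estimators=100))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```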
We presented two accurate approaches to identifying the stiffness of objects from squeeze-based haptic data, as well as a way to display unified geometric and stiffness information as an augmented point cloud. A model-based approach may be more applicable to previously unseen objects, while a machine learning approach can be used for rapid object classification.