CONTENTS
Intro; Foreword; Preface; Acknowledgements; Contents; Acronyms; Mathematical Notation;
Part I Background; 1 Introduction; 1.1 Robotic Manipulation and Grasp; 1.2 Robotic Tactile Perception; 1.3 Tactile Exploratory Procedure; 1.4 Tactile Perception for Shape; 1.5 Tactile Perception for Texture; 1.6 Tactile Perception for Deformable Objects; 1.7 Visual-Tactile Fusion for Object Recognition; 1.8 Public Datasets; 1.8.1 Tactile Dataset; 1.8.2 Visual-Tactile Fusion Datasets; 1.9 Summary; References; 2 Representation of Tactile and Visual Modalities; 2.1 Tactile Modality Representation; 2.1.1 Tactile Sequence; 2.1.2 Dynamic Time Warping Distance; 2.1.3 Global Alignment Kernel; 2.2 Visual Modality Representation; 2.3 Summary; References;
Part II Tactile Perception; 3 Tactile Object Recognition Using Joint Sparse Coding; 3.1 Introduction; 3.2 Kernel Sparse Coding; 3.3 Joint Kernel Sparse Coding; 3.4 Experimental Results; 3.4.1 Data Collection; 3.4.2 Result Analysis; 3.4.3 Results for the Public Dataset; 3.5 Summary; References; 4 Tactile Object Recognition Using Supervised Dictionary Learning; 4.1 Introduction; 4.2 Tactile Dictionary Learning; 4.3 Extreme Learning Machines; 4.4 Extreme Kernel Sparse Learning; 4.5 Reduced Extreme Kernel Sparse Learning; 4.6 Optimization Algorithm; 4.6.1 Calculating the Sparse Coding Vectors; 4.6.2 Calculating the Dictionary Atoms; 4.6.3 Calculating the ELM Coefficients; 4.7 Algorithm Analysis; 4.8 Experimental Results; 4.8.1 Data Description and Experimental Setting; 4.8.2 Parameter Selection; 4.8.3 Accuracy Performance Comparison; 4.8.4 Comparison of Reduced Strategies; 4.9 Summary; References; 5 Tactile Adjective Understanding Using Structured Output-Associated Dictionary Learning; 5.1 Introduction; 5.2 Problem Formulation; 5.3 Optimization Algorithm; 5.3.1 Calculating the Sparse Coding Vectors; 5.3.2 Calculating the Dictionary Atoms; 5.3.3 Calculating the Classifier Parameters; 5.3.4 Algorithm Summarization; 5.4 Classifier Design; 5.5 Experimental Results; 5.5.1 Data Description and Experimental Setting; 5.5.2 Performance Comparison; 5.5.3 Parameter Sensitivity Analysis; 5.6 Summary; References; 6 Tactile Material Identification Using Semantics-Regularized Dictionary Learning; 6.1 Introduction; 6.2 Linearized Tactile Feature Representation; 6.3 Motivation and Problem Formulation; 6.4 Proposed Model; 6.5 Optimization Algorithm; 6.5.1 Calculating the Sparse Coding Vectors; 6.5.2 Calculating the Dictionary Atoms; 6.5.3 Algorithm Summarization; 6.6 Classifier Design; 6.7 Experimental Results; 6.7.1 Experimental Setting; 6.7.2 Performance Comparison; 6.8 Summary; References;
Part III Visual-Tactile Fusion Perception; 7 Visual-Tactile Fusion Object Recognition Using Joint Sparse Coding; 7.1 Introduction; 7.2 Problem Formulation; 7.3 Kernel Sparse Coding for Visual-Tactile Fusion; 7.3.1 Kernel Sparse Coding; 7.3.2 Joint Kernel Group Sparse Coding; 7.4 Experimental Results; 7.4.1 Data Collection.
SUMMARY OR ABSTRACT
This book introduces the challenges of robotic tactile perception and task understanding and describes an approach based on machine learning and sparse coding techniques. A set of structured sparse coding models is developed to address the issues of dynamic tactile sensing. The book then demonstrates that the proposed framework is effective for multi-finger tactile object recognition, multi-label tactile adjective recognition, and multi-category material analysis, all of which are challenging practical problems in robotics and automation. The proposed sparse coding models are further extended to the challenging visual-tactile fusion recognition problem, and a series of efficient optimization algorithms is developed to implement them. The book is suitable as a reference for graduate students with a basic knowledge of machine learning, as well as for professional researchers interested in robotic tactile perception and understanding and in machine learning.
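As a rough illustration of the technique family the summary refers to, the sketch below shows generic (unsupervised) dictionary learning followed by a separate linear classifier trained on the resulting sparse codes, using scikit-learn and synthetic placeholder data. It does not reproduce the book's joint, supervised, or structured sparse coding formulations; all data and parameter choices here are assumptions made for illustration only.

# Minimal sketch: dictionary learning + classification on sparse codes.
# Synthetic stand-in for linearized tactile feature vectors; not the book's method.
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: 200 samples, 64-dimensional features, 4 object classes.
n_samples, n_features, n_classes = 200, 64, 4
y = rng.integers(0, n_classes, size=n_samples)
class_means = rng.normal(size=(n_classes, n_features))
X = class_means[y] + 0.3 * rng.normal(size=(n_samples, n_features))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Learn a dictionary and encode each sample with at most 5 nonzero coefficients (OMP).
dico = DictionaryLearning(
    n_components=32,
    transform_algorithm="omp",
    transform_n_nonzero_coefs=5,
    random_state=0,
)
codes_tr = dico.fit_transform(X_tr)
codes_te = dico.transform(X_te)

# Classify objects from their sparse codes.
clf = LinearSVC().fit(codes_tr, y_tr)
print("held-out accuracy:", clf.score(codes_te, y_te))

In this generic pipeline the dictionary is learned without label information; the supervised and structured models surveyed in the book instead couple dictionary learning with the classifier or with output structure, which is what the listed chapters develop.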