Jivko Sinapov, Kaijen Hsiao and Radu Bogdan Rusu Proprioceptive Perception for Object Weight Classification
What is Proprioception? “It is the sense that indicates whether the body is moving with required effort, as well as where the various parts of the body are located in relation to each other.” - Wikipedia
Why Proprioception? Full vs. Empty
Why Proprioception? Soft vs. Hard
Related Work: Proprioception • “Learning Haptic Representations of Objects” [Natale et al., 2004]
Related Work: Proprioception • Proprioceptive Object Recognition [Bergquist et al., 2009]
General Approach • Let the robot experience what full and empty bottles “feel” like • Use prior experience to classify new bottles as either full or empty
Behavior: • Power, “Play and Exploration in Children and Animals”, 2000
Behaviors 1) Unsupported Holding 2) Lifting
Data Representation • Each behavior execution is recorded as a tuple [Ji, Ei, Ci]: joint positions Ji, joint efforts Ei, and a class label Ci ∈ {full, empty}
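The recorded tuple [Ji, Ei, Ci] can be sketched as a simple data structure. This is an illustrative layout only; the field names and the use of a Python dataclass are assumptions, not the paper's actual data format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BehaviorExecution:
    """One recorded behavior execution [Ji, Ei, Ci] (hypothetical layout)."""
    joint_positions: List[List[float]]  # Ji: one row of joint angles per timestep
    joint_efforts: List[List[float]]    # Ei: one row of joint efforts per timestep
    label: str                          # Ci: "full" or "empty"

# A toy two-timestep trial with made-up numbers:
trial = BehaviorExecution(
    joint_positions=[[0.1, 0.4, -0.2], [0.1, 0.5, -0.2]],
    joint_efforts=[[1.2, 0.8, 0.3], [1.3, 0.9, 0.3]],
    label="full",
)
```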
Example Recorded Joint Efforts of Left Arm:
Classification Procedure • Given a new execution [Ji, Ei, ?], feature extraction feeds the recognition model, which outputs Pr( ‘empty’ ) and Pr( ‘full’ )
Recognition Model • Given a query X = [Ji, Ei, ?]:
1) Find the N closest neighbors to X in joint-feature space
2) Train a classifier C on those N neighbors that maps effort features to a class label
3) Use the trained classifier C to label X
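The three steps above can be sketched as a lazy, locally trained classifier. This is a minimal stand-in, assuming Euclidean distance in joint-feature space and a simple k-NN vote over effort features in place of whatever classifier the authors actually trained on the neighborhood.

```python
import numpy as np
from collections import Counter

def classify(x_joint, x_effort, train_joints, train_efforts, train_labels, N=3, k=1):
    """Local recognition model sketch:
    1) find the N nearest training examples in joint-feature space,
    2) fit a classifier on them (here: k-NN over effort features),
    3) use it to label the query."""
    # Step 1: N nearest neighbors in joint-feature space
    d_joint = np.linalg.norm(train_joints - x_joint, axis=1)
    nbrs = np.argsort(d_joint)[:N]
    # Steps 2-3: vote among the k closest of those neighbors in effort space
    d_eff = np.linalg.norm(train_efforts[nbrs] - x_effort, axis=1)
    top = np.argsort(d_eff)[:k]
    votes = [train_labels[nbrs[i]] for i in top]
    return Counter(votes).most_common(1)[0][0]

# Toy data: joint features pick the local region, effort features decide the label.
train_joints = np.array([[0.0, 0.0], [0.0, 0.1], [0.1, 0.0], [5.0, 5.0], [5.0, 5.1]])
train_efforts = np.array([[1.0, 1.0], [1.0, 1.1], [0.0, 0.0], [9.0, 9.0], [9.0, 9.0]])
train_labels = ["full", "full", "empty", "full", "full"]
pred = classify(np.array([0.0, 0.0]), np.array([1.0, 1.0]),
                train_joints, train_efforts, train_labels, N=3, k=1)
```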
Training Procedure • Objects: • Procedure: • Place object on table • Robot grasps it and performs the current behavior (either hold or lift) in a random position in space • Robot puts object back down on table in random position; repeat. • Each behavior performed 100 times on each bottle in both full and empty states • A total of 2 x 5 x 100 x 2 = 2000 behavior executions
Evaluation • 5-fold cross-validation: at each iteration, data from four of the five bottles is used for training, and the remaining bottle is used for testing • Three classification algorithms evaluated: • K-Nearest Neighbors • Support Vector Machine (quadratic kernel) • C4.5 Tree
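The leave-one-bottle-out split can be sketched as below; the point is that the test bottle never contributes training data, so the model is evaluated on a genuinely novel object. The function name and list-based indexing are illustrative assumptions.

```python
def bottle_folds(bottle_ids):
    """Build leave-one-bottle-out folds: each fold trains on data from
    four bottles and tests on all trials of the held-out fifth bottle."""
    folds = []
    for held_out in sorted(set(bottle_ids)):
        train = [i for i, b in enumerate(bottle_ids) if b != held_out]
        test = [i for i, b in enumerate(bottle_ids) if b == held_out]
        folds.append((train, test))
    return folds

# Two trials per bottle, five bottles -> five folds:
folds = bottle_folds([0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
```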
Can the robot boost recognition rate by applying a behavior multiple times?
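One plausible way to boost the recognition rate over repeated applications of a behavior (an assumption on my part, not necessarily the paper's combination rule) is to treat the per-trial class probabilities as independent and multiply them, then renormalize:

```python
import numpy as np

def combine_trials(prob_list):
    """Naive-Bayes-style evidence combination over repeated trials:
    multiply per-trial class probabilities elementwise, then renormalize.
    This is a hypothetical combination rule, not taken from the paper."""
    combined = np.prod(np.asarray(prob_list), axis=0)
    return combined / combined.sum()

# Three noisy trials, each only slightly favoring "empty"
# (each row is [Pr(empty), Pr(full)]); combined, the evidence sharpens:
p = combine_trials([[0.6, 0.4], [0.55, 0.45], [0.7, 0.3]])
```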
Application to Regression • Given a query X = [Ji, Ei, ?]:
1) Find the N closest neighbors to X in joint-feature space
2) Train a regression model C on those N neighbors that maps effort features to object weight
3) Use the trained regression model C to estimate the weight of X
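The regression variant can be sketched in the same lazy style. As a deliberate simplification, the locally trained regression model is replaced here by a plain average of the neighbors' known weights; the real model would map effort features to weight.

```python
import numpy as np

def estimate_weight(x_joint, train_joints, train_weights, N=2):
    """Local regression sketch: find the N nearest neighbors in
    joint-feature space and average their known weights (a crude
    stand-in for a regression model fit on those neighbors)."""
    d = np.linalg.norm(train_joints - x_joint, axis=1)
    nbrs = np.argsort(d)[:N]
    return float(np.mean(train_weights[nbrs]))

# Toy data: the two nearby trials weigh 1.0 and 2.0 lbs, the far one 10.0 lbs.
train_joints = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
train_weights = np.array([1.0, 2.0, 10.0])
est = estimate_weight(np.array([0.0, 0.0]), train_joints, train_weights, N=2)
```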
Regression Results • Mean Abs. Error = 0.08827 lbs • Chance error = 0.2674 lbs
Application to Sorting Task • Sorting task: • Place empty bottles in the trash • Move full bottles to the other side of the table
Application to a new recognition task Full or empty?
Behavior: slide object across table • 40 trials with a full box and 40 trials with an empty box • Recognition Accuracy: 98.75% (all three algorithms)
Conclusion • Behavior-grounded approach to proprioceptive perception • Implemented as a ROS package: • http://www.ros.org/wiki/proprioception This work has been submitted to ICRA 2011.
Future Work • More advanced proprioceptive feature extraction • Multi-modal object perception: • Auditory • 3D • Tactile