Evaluation of a Pointing Interface for a Large Screen Based on Regression Model with Image Features Koichi Shintani† Tomohiro Mashita†‡ Kiyoshi Kiyokawa†‡ Haruo Takemura†‡ †Graduate School of Information Science and Technology, Osaka University, Japan ‡Cybermedia Center, Osaka University, Japan
Background • Gesture input interfaces have become widespread • However, commonly used gesture interfaces are limited to small screens Touch Screen Nintendo DS
Examples of Pointing Interfaces • Wii Remote [1] • For mid- to large-size screens • Pointing coordinates are based on the device's orientation and position • Vogel et al.'s work [2] • For large-size screens • Ways of pointing • Touch screen • Relative displacement • Ray casting [1] Wii Remote. Nintendo Co., Ltd. http://www.nintendo.co.jp/ [2] D. Vogel and R. Balakrishnan. Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays. Proceedings of UIST 2005, the ACM Symposium on User Interface Software and Technology, pp. 33-42.
Drawback of Existing Pointing Interfaces • They suffer from problems of spatial cognition Target
Effect of Spatial Cognition • Example of the effect • Errors between real space and cognitive space (Soechting et al. [3]) [3] J. F. Soechting and M. Flanders. Sensorimotor Representations for Pointing to Targets in Three-Dimensional Space.
Classification of Pointing Interfaces Low Cognitive Load? Microsoft Kinect Touch Screen Nintendo DS Vogel et al. PlayStation Move Wii Remote Hands-In Gyroscopic Mouse Small Screen (reaching distance) Large Screen (walking distance)
Easy Pointing Interface • Hands-free • Reduce the negative effects on spatial cognition • Real-time interaction Approach • Vision-based system • Appearance learning • Direct mapping with linear regression
Prototype System • Projector (1024 × 768) • Camera (640 × 480) • Screen: 2.8 m wide × 2.1 m high • Distance to user: 2.5 m
Flow of Proposed Method Regression Coefficients
Image Features • Eigenimage • Calculate eigenvectors of images • Moment Features • Directions of principal axes of inertia • Centroids of images Principal axis of inertia Centroid
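The moment features above can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `moment_features` and the binary-silhouette input are assumptions, and the principal-axis direction is taken from the standard second-order central moments.

```python
import numpy as np

def moment_features(silhouette):
    """Centroid and principal-axis direction of a binary silhouette image.

    `silhouette` is a 2-D array where nonzero pixels belong to the user.
    Returns [cx, cy, theta]: centroid coordinates and the angle (radians)
    of the principal axis of inertia.
    """
    ys, xs = np.nonzero(silhouette)
    # Centroid from the zeroth- and first-order moments.
    cx, cy = xs.mean(), ys.mean()
    # Second-order central moments.
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    # Direction of the principal axis of inertia.
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return np.array([cx, cy, theta])
```

For a horizontal bar of pixels the centroid falls at the bar's center and theta is 0 (axis aligned with x), which matches the geometric intuition.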
Learning Phase • T : a matrix consisting of the (x, y) coordinates of the target points • X : a matrix consisting of intercept terms and image features • The features are a set of either eigenvectors or moment features • B : a matrix of regression coefficients, obtained by least squares: B = (XᵀX)⁻¹XᵀT
Estimation Phase • B : a matrix of regression coefficients • x : a vector consisting of an intercept term and image features • The features are a set of either eigenvectors or moment features • Estimated coordinates: (x̂, ŷ) = xᵀB
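The two phases amount to an ordinary linear least-squares fit from image features to screen coordinates. A minimal sketch follows; the toy data, shapes, and names are illustrative assumptions (the real system extracts eigenimage or moment features from camera frames):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: one feature vector per pointing target.
n_targets, n_features = 20, 3
F = rng.normal(size=(n_targets, n_features))            # image features
W = rng.normal(size=(n_features, 2))                    # hidden linear map
T = F @ W + np.array([512.0, 384.0])                    # target (x, y) on screen

# Learning phase: prepend an intercept column and solve for the
# regression coefficients B by linear least squares.
X = np.hstack([np.ones((n_targets, 1)), F])
B, *_ = np.linalg.lstsq(X, T, rcond=None)

# Estimation phase: map a new feature vector to screen coordinates.
def estimate(features, B):
    x = np.concatenate([[1.0], features])
    return x @ B   # estimated (x, y)
```

Since the toy targets are an exactly linear function of the features, the fit recovers them; with real image features the residual error is what the experiments below measure.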
Experiments • Experiment 1 • Evaluation of estimation accuracy • With 6 test subjects using their own datasets • Experiment 2 • Evaluation of training data size • With 1 test subject, 6 datasets
Datasets for Evaluation • Test Dataset • A dataset with target positions that were the same as the training data • Midpoint Dataset • A dataset with target positions at the midpoints of the training data Example Position for Midpoint Dataset Position for Training Dataset and Test Dataset
Experiments • Experiment 1 • Evaluation of estimation accuracy • With 6 test subjects using their own datasets • Experiment 2 • Increasing the amount of training data • With 1 test subject, 6 datasets
Experiment 1: Evaluation of estimation accuracy • Procedure • Capture a training dataset • Capture a dataset for evaluation • Estimate the indicated positions from the evaluation dataset in two ways (moment features or eigenimage) • Subjects: 6 students (22 to 26 years old)
Estimation Results for the Test Datasets • Mean error over all subjects • Eigenimage: ~23 cm • Moment features: ~20 cm • Examples of estimation results • Block size: 28 × 21 cm Moment features Eigenimage
Estimation Results for the Midpoint Datasets • Mean error over all subjects • Eigenimage: ~25 cm • Moment features: ~22 cm • Examples of estimation results Eigenimage Moment features
Experiment 1: Discussion • Errors within a column point in the same direction • Likely due to the order of pointing during data capture With Test Datasets Eigenimage Moment features
Experiments • Experiment 1 • Evaluation of estimation accuracy • With 6 test subjects using their own datasets • Experiment 2 • Increasing the amount of training data • With 1 test subject, 6 datasets
Experiment 2: Increasing the amount of training data • Evaluate the relation between training dataset size and accuracy (moment features) • Using 1 dataset for estimation • Using 6 datasets for estimation • 1 subject 1 dataset 6 datasets
Estimation Results for the Test Datasets • Screen size: width 2.8 m, height 2.1 m • Red line: estimation error exceeds 102 pixels along the x axis or 76 pixels along the y axis Using 1 set Using 6 sets Mean error: 24 cm Mean error: 17 cm Mean error: 28 cm Mean error: 16 cm Mean error: 17 cm Mean error: 15 cm Mean error: 37 cm
Estimation Results for the Midpoint Datasets • Screen size: width 2.8 m, height 2.1 m • Red line: estimation error exceeds 102 pixels along the x axis or 76 pixels along the y axis Using 1 set Using 6 sets Mean error: 17 cm Mean error: 31 cm Mean error: 15 cm Mean error: 29 cm Mean error: 23 cm Mean error: 27 cm Mean error: 55 cm
Experiment 2: Discussion • The estimation errors with 6 datasets are lower than those with 1 dataset • By increasing the amount of training data, the influence of small movements occurring in the pointing motions is lessened.
Conclusion • A method to estimate the positions on a large screen indicated by a user's pointing gestures • With the prototype system, moment features are more stable and better suited than eigenimages • Across all subjects, the estimation accuracy is about 5 deg with our method
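The ~5 deg figure can be sanity-checked from the reported centimeter errors, assuming the user stands roughly 2.5 m from the screen (the remaining distance in the prototype layout; this attribution is an assumption):

```python
import math

distance_m = 2.5      # assumed user-to-screen distance from the prototype setup
mean_error_m = 0.20   # ~20 cm mean error with moment features (test datasets)

# Angle subtended at the user's position by the mean positional error.
angular_error_deg = math.degrees(math.atan2(mean_error_m, distance_m))
# ≈ 4.6 degrees, consistent with the "about 5 deg" figure
```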
Future work • Enhanced interaction • Using motion gestures, hand postures, etc. • Improve estimation accuracy • Model the relation between the user's posture and pointing motion • Develop a practical application