Moveable Interactive Projected Displays Using Projector Based Tracking. Johnny C. Lee 1,2, Scott E. Hudson 1, Jay W. Summet 3, Paul H. Dietz 2. 1 Carnegie Mellon University, 2 Mitsubishi Electric Research Labs, 3 Georgia Institute of Technology. UIST 2005, Seattle, WA.
UIST 2004 – Automatic Projector Calibration • Embed light sensors in the surface. • Project patterns to find the sensor locations. • Pre-warp the source image to fit the surface. • (video clip 1) • Correspondence between the location data and the projected image comes for free (i.e., no calibration against an external tracking system is needed). • Transforms passive surfaces into active displays in a practical manner. • Variety of useful applications.
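The location-discovery idea above can be sketched in a few lines: the projector shows one Gray code pattern per coordinate bit, the embedded light sensor records one bit per frame, and decoding the Gray code on each axis yields the sensor's pixel coordinate. This is a minimal illustration, not the authors' implementation; the function names are assumptions.

```python
def binary_to_gray(n):
    """Gray code of n, as the projector would encode a coordinate."""
    return n ^ (n >> 1)

def gray_bits_to_int(bits):
    """Decode a Gray-code bit list (MSB first) back to an integer."""
    value, prev = 0, 0
    for b in bits:
        prev ^= b                   # running XOR recovers each binary bit
        value = (value << 1) | prev
    return value

def decode_location(x_bits, y_bits):
    """Recover the (x, y) pixel location seen by one light sensor."""
    return gray_bits_to_int(x_bits), gray_bits_to_int(y_bits)
```

For example, a sensor at x = 300 in an 800-pixel-wide image would observe the 10 bits of `binary_to_gray(300)` over 10 frames; Gray coding ensures adjacent columns differ in only one bit, so a sensor near a boundary misreads its position by at most one unit.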
Example applications: Touch Calibration • Projector-based AR (Shader Lamps, MERL/UNC) • DiamondTouch (MERL) • Everywhere Displays (IBM) • Interactive Whiteboard
Focus on Moveable Projected Displays • Goals of this work: • Achieve interactive tracking rates for hand-held surfaces. • Reduce the perceptibility of the location discovery patterns. • Explore interaction techniques supported by this approach.
Display Surface Constructed from foam core and paper Touch-sensitivity is provided by a resistive film Lighter than a legal pad
Tablet PC-like Interaction Video clip 2
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Gray Code Patterns • Black and White: the difference between regions is visible to the human eye. • Frequency Shift Keyed (FSK), alternating HF and LF regions: the difference is NOT visible to the human eye.
FSK Transmission of Patterns • FSK transmission of the Gray code patterns makes the striped region boundaries invisible to the human eye. • Patterns appear to be solid grey squares to observers. • The light sensor is able to demodulate the HF and LF regions into 0s and 1s. • This is accomplished using a modified DLP projector.
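The demodulation step can be sketched as simple edge counting: within each bit period the sensor sees either a high-frequency or a low-frequency square wave, and counting threshold crossings classifies the region. The sample count, threshold, and edge cutoff below are illustrative assumptions, not the firmware's actual constants.

```python
def count_edges(samples, threshold=0.5):
    """Number of threshold crossings within one bit period of samples."""
    levels = [1 if s > threshold else 0 for s in samples]
    return sum(1 for a, b in zip(levels, levels[1:]) if a != b)

def demodulate_bit(samples, edge_cutoff=4):
    """HF region (many edges) decodes as 1; LF region as 0."""
    return 1 if count_edges(samples) > edge_cutoff else 0
```

Because both carriers flicker faster than human flicker fusion, observers see the same mean brightness either way, while the sensor sees clearly distinct edge counts.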
Inside a DLP projector • DLP = Digital Light Processing • Many consumer projectors currently use DLP technology • "DLP" is Texas Instruments' marketing term for its DMD-based projection technology • DMD = Digital Micro-mirror Device • Each mirror corresponds to a pixel • Brightness corresponds to the duty cycle of the mirror Pictures from Texas Instruments literature
Inside a DLP projector Light source Projector Lens Color wheel DMD
Inside our modified DLP projector Light source Projector Lens DMD
FSK Transmission of Location Patterns • Removing the color wheel flattens the color space of the projector into a monochrome scale. • Multiple points in the former color space now have the same apparent intensity to a human observer, but are manifested as differing signals. • The patterns formerly known as "red" and "grey" are rendered as 180Hz and 360Hz signals, respectively. • A monochrome projector is not ideal, but serves as a proof-of-concept device until we can build a custom DMD projector.
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Projector Specifications Infocus X1 (~$800 new) • 800x600 SVGA resolution • 1 DMD chip • 60Hz refresh rate Full-Screen Location Discovery Time: • 20 images (log2 of the pixel count per axis: 10 + 10) • 333ms at 60Hz 3Hz maximum update rate
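The timing arithmetic on this slide, worked out explicitly (a sketch using the Infocus X1 numbers quoted above):

```python
import math

width, height, fps = 800, 600, 60

# One Gray-code image per coordinate bit, per axis: 10 + 10 = 20 images.
patterns = math.ceil(math.log2(width)) + math.ceil(math.log2(height))

discovery_time_ms = patterns / fps * 1000  # ~333 ms per full discovery
max_update_rate = fps / patterns           # 3 Hz ceiling at full screen
```

This 3 Hz ceiling is what motivates the incremental tracking scheme on the next slide.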
Incremental Tracking • Project small tracking patterns over the last known locations of each sensor for incremental offsets • Black masks reduce visibility of tracking patterns • Tracking loss strategies are needed (later) • Smaller area = fewer patterns = faster updates • 32x32 unit grid requiring 10 images • 6Hz update rate
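One incremental-tracking step can be sketched as follows: a small Gray code pattern is projected over the last known location, the sensor decodes a coordinate local to that pattern, and the global position is updated by the offset. The centering convention is an assumption for illustration.

```python
GRID = 32  # local pattern is a 32x32 unit grid (5 bits per axis)

def incremental_update(last_pos, local_coord):
    """Apply the locally decoded coordinate (0..GRID-1 on each axis),
    treating the tracking pattern as centered on the last position."""
    lx, ly = last_pos
    cx, cy = local_coord
    return (lx + cx - GRID // 2, ly + cy - GRID // 2)
```

A sensor decoded at the pattern center leaves the position unchanged; any other local coordinate shifts the next pattern accordingly.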
Tracking Demo Video clip 3
Latency and Interleaving • Incremental tracking is a tight feedback loop: project update project update … • 6Hz update rate assumes 100% utilization of the 60 frames/sec the projector can display • System latencies negatively impact channel utilization • Achieving 100% utilization of the projection channel requires taking advantage of the axis-independence of Gray Code patterns.
System Latency – Full X-Y Tracking Only 73% utilization Projection: X patterns Y patterns X patterns Y patterns Graphics & Video Hardware & OS scheduling Software: update & draw X-Y patterns draw X-Y patterns Time
System Latency - Interleaved Tracking 100% utilization of the projection channel and 12Hz interleaved update Projection: X patterns Y patterns X patterns Y patterns Software: draw Y patterns update & draw X patterns update & draw X patterns draw X patterns update & draw Y patterns Time
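A back-of-the-envelope sketch of why interleaving doubles the update rate: because the Gray code axes are independent, each 5-frame burst of X or Y patterns can deliver a one-axis update while the software prepares the other axis, keeping the projector busy every frame. The numbers mirror the slides; the formulas are an illustrative simplification.

```python
fps = 60
bits_per_axis = 5  # 32x32 local tracking grid

# One update per complete X+Y pair vs. one axis update per burst.
full_xy_rate = fps / (2 * bits_per_axis)  # 6 Hz
interleaved_rate = fps / bits_per_axis    # 12 Hz
```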
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Tracking Pattern Size Tracking Area Tracking Rate 32x32 grid 12Hz interleaved 16x16 grid 15Hz interleaved +25% rate -75% area A smaller tracking area increases the risk of losing the sensor (i.e., it lowers the maximum supported velocity). The log2 relationship makes it hard to gain speed through the use of smaller patterns.
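The log2 relationship on this slide can be captured in a small helper (a sketch): halving the grid width removes only one bit per axis, so a large cut in tracking area buys only a modest rate gain.

```python
import math

def interleaved_rate_hz(grid_units, fps=60):
    """Interleaved one-axis update rate for a square tracking grid."""
    return fps / math.ceil(math.log2(grid_units))
```

Shrinking from a 32x32 to a 16x16 grid cuts the area by 75% but raises the rate only from 12 Hz to 15 Hz, matching the slide's figures.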
Tracking Pattern Size Pixel Density Decreases
Tracking Pattern Size large, coarse tracking pattern small, fine tracking pattern • Preserves physical size of tracking pattern (cm) • Preserves maximum supported velocity (m/s) • Distance is approximated from screen size • Scaling factor is adjustable (precision vs. max velocity): ~2.5mm; 25cm/s
Motion Modeling Predicting the motion can be used to increase the range of supported movement (e.g. trading maximum acceleration against maximum velocity). Much of the existing work on motion modeling is applicable. But no model is perfect, and mispredictions can lead to tracking loss, potentially yielding poorer overall performance. Models are likely to be application and implementation specific.
Tracking Pattern Shape • We used square tracking patterns due to the axis aligned nature of Gray code patterns. • Patterns with high-radial symmetry are best for general movement in two-dimensions. • Pattern geometry can be optimized for specific applications.
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Detecting Occlusions or Tracking Loss • Causes of tracking loss: • occlusions • exiting the projection area • exceeding the range of motion supported by the tracking patterns With FSK transmission, tracking loss corresponds to a disappearance of the carrier signal. This allows error detection on a per-bit basis. • Implemented on a low-cost PIC processor as: • sudden drop in signal amplitude • insufficient amplitude • invalid edge count
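The per-bit carrier check described above can be sketched as: a bit is trusted only if the carrier appears present, judged by signal amplitude and a plausible edge count. The thresholds below are illustrative assumptions, not the PIC firmware's actual constants.

```python
def carrier_present(samples, min_amplitude=0.2, edge_range=(2, 31)):
    """Reject a bit period with insufficient amplitude or an edge count
    outside the plausible LF..HF range (occlusion or tracking loss)."""
    amplitude = max(samples) - min(samples)
    if amplitude < min_amplitude:
        return False
    midpoint = (max(samples) + min(samples)) / 2
    levels = [1 if s > midpoint else 0 for s in samples]
    edges = sum(1 for a, b in zip(levels, levels[1:]) if a != b)
    return edge_range[0] <= edges <= edge_range[1]
```

Because validity is decided per bit rather than per full pattern sequence, an occlusion is detected within a single bit period instead of after a whole failed location read.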
Lost Tracking Behavior Single/independent sensors: • Discard and hope the sensor has not moved • Perform a full screen discovery process (+333ms) • Grow the tracking pattern around last location until reacquired Multiple sensors of known geometric relationship: • Try the above three techniques. • Compute predicted lost sensor locations using the locations of the remaining available sensors.
Tracking Loss With Multiple Sensors video clip 4
Estimating Lost Sensors Note: Transformations for each point cannot be implemented as a simple matrix stack because LIFO ordering of sensor loss and re-acquisition is not guaranteed.
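Predicting a lost sensor from the remaining ones can be sketched as follows when the surface is rigid and its geometry known: two tracked corners determine a 2D similarity transform (rotation, uniform scale, and translation), which then places the lost corner. Complex numbers keep the arithmetic compact; this is an illustrative formulation, not the paper's implementation.

```python
def predict_lost_corner(p_ref, q_ref, r_ref, p_now, q_now):
    """p/q are the two still-tracked corners; r is the lost one.
    *_ref are positions in the surface's reference frame (complex);
    *_now are current projector-space positions (complex)."""
    s = (q_now - p_now) / (q_ref - p_ref)  # rotation + uniform scale
    return p_now + s * (r_ref - p_ref)
```

A pure translation of the surface moves the predicted corner by the same offset, and a rotation of the tracked pair rotates the prediction with it, which is why the estimate survives arbitrary orderings of loss and re-acquisition.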
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Supported Interaction Techniques Video clip 5
Supported Interaction Techniques Magic Lens Focus + Context Simulated Tablet PC Input Pucks Location Aware Displays
Conclusion • Unifying the tracking and projection technology greatly simplifies the implementation and execution of applications that combine motion tracking with projected imagery. • Coherence between the location data and projected image is free. • Does not require an external tracking system or calibration • Simple: Demos were created in about a week • This approach has the potential to change the economics of interactive displays • The marginal cost of each display can be as low as $10 USD • Museum: wireless displays could be handed out to visitors. • Medical Clinic: physical organization of patient charts/folders
Future Work • Removing the color wheel was a proof-of-concept workaround. • Construct a high-speed projector using a DLP development kit • Explore using infrared to project invisible patterns • Explore other applications where low-speed positioning is sufficient. • Achieve 18+ Hz (36+ Hz interleaved) tracking with visible patterns and an unmodified DLP projector using the RGB color-wheel sections. • Use multiple projectors (or steerable projectors) to increase freedom of movement.
Acknowledgements Funded in part by the National Science Foundation under grants IIS-0121560 and IIS-0325351 Funded in part by Mitsubishi Electric Research Labs Johnny Chung Lee johnny@cs.cmu.edu
Technical Details • Infocus X1 ($800) 800x600, 60Hz • PIC16F819 at 20 MHz, 10-bit ADC • Sensor package < $10 in volume • 4-wire resistive touch-sensitive film • IF-D92 fiber optic phototransistors • 45 bytes/sec for location data • 25 mW during active tracking • Latency: 77–185 ms