Augmented Reality Software
Outline • Augmented Reality (AR) defined • Types of AR • Expectations • Challenges • Demo Clips • Applications • ARToolKit • Demo • Questions
Augmented Reality Defined • AR is the overlay of virtual computer graphics images on real world imagery • The merging of synthetic sensory information into a user’s perception of a real environment • A subset of mixed reality
Types of Augmented Reality • Video see-through AR: virtual images are overlaid on live video of the real world and then presented to the user • Optical see-through AR: computer graphics are overlaid or cast onto a view of the real world • Typically requires a see-through head-mounted display (HMD)
Types of Augmented Reality (Continued) • Registration/pattern tracking and rendering – recognition software locates registration marks or fiducial patterns, calculates their orientation and distance, then renders the object(s) • Chromakey fusion – a virtual world (or object) is fused to real-world video via color-key replacement (sketched below)
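The color-key replacement behind chromakey fusion can be sketched in a few lines of C. This is a minimal illustration under assumed conventions (interleaved 8-bit RGB frames, a per-channel tolerance, hypothetical function and parameter names); production systems typically key in a chroma-oriented color space such as YUV rather than raw RGB.

#include <stdint.h>
#include <stdlib.h>

/* Chroma-key fusion sketch: wherever the rendered virtual frame matches the
 * key color (within a tolerance), keep the real-world video pixel; elsewhere
 * the virtual pixel replaces it. Frames are interleaved 8-bit RGB. */
static int near_key(const uint8_t *px, const uint8_t key[3], int tol)
{
    return abs(px[0] - key[0]) <= tol &&
           abs(px[1] - key[1]) <= tol &&
           abs(px[2] - key[2]) <= tol;
}

void chromakey_fuse(const uint8_t *real, const uint8_t *virt, uint8_t *out,
                    int width, int height, const uint8_t key[3], int tol)
{
    for (int i = 0; i < width * height; i++) {
        const uint8_t *v = &virt[3 * i];
        const uint8_t *r = &real[3 * i];
        const uint8_t *src = near_key(v, key, tol) ? r : v;
        out[3 * i + 0] = src[0];
        out[3 * i + 1] = src[1];
        out[3 * i + 2] = src[2];
    }
}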
Expectations (Some) • Near natural 3D perception of virtual objects or images in a real world setting, rendered in near real time • Fusion – achieving exact alignment when rendering virtual images with real world objects • Occlusion between real and virtual objects • Translucence of real and virtual objects
Expectations (Some more) • Shadows cast upon real and virtual objects • Virtual object animations • Virtual object interactivity • Grasping/selecting/manipulating virtual objects • Collision detection between real and virtual objects • Modifying virtual object geometry and/or appearance • Examples: dissection, compression, stretching, compounding, inscribing, etc.
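One of the expectations above, occlusion between real and virtual objects, is commonly met by rendering a "phantom" model of the tracked real object into the depth buffer only, so virtual geometry behind it fails the depth test while the live video stays visible. Below is a minimal fixed-function OpenGL sketch; it assumes the modelview and projection matrices are already set from tracking, and the two draw_* callbacks are hypothetical stand-ins for the real object's geometry and the virtual content.

#include <GL/glut.h>

/* Hypothetical application callbacks standing in for scene geometry. */
static void draw_phantom_real_object(void) { glutSolidCube(50.0); }   /* matches a tracked real object */
static void draw_virtual_object(void)      { glutSolidTeapot(40.0); } /* the virtual content */

/* Depth-only phantom pass, then normal rendering of virtual objects:
 * virtual fragments behind the phantom fail the depth test, so the live
 * video already drawn underneath remains visible there. */
void draw_with_occlusion(void)
{
    glEnable(GL_DEPTH_TEST);

    /* Pass 1: write the real object's shape to the depth buffer only. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    draw_phantom_real_object();
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    /* Pass 2: virtual objects, occluded wherever the phantom is nearer. */
    draw_virtual_object();
}

Translucence can be approached in a similar two-pass spirit by enabling alpha blending for the virtual pass once the depth relationships are established.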
Challenges • Maintaining registration/pattern tracking • Overcoming noise • Lighting/contrast • Focus/pattern clarity • Cluttered real world imagery • Latency/frame rate • Tracking computations • Rendering computations • Occlusions, Translucence, Shadows, Animation, Interactivity, etc.
Challenges (Continued) • Managing budgetary constraints • Image size (320×240) • Number of virtual objects/fiducial marks • Frame rate requirements • Mono vs. stereoscopic rendering
Registration/Fiducial Marks • Tracking markers • Pattern templates
Demo Clips • 3D near fusion • Simple animation • Animated translation in the affine coordinate system • Translucence vs. occlusion • Occlusions
Demo Clips (Continued) • Complex registrations • User interactivity … and virtual object behaviors
Applications • Face-to-face AR collaboration • Remote AR conferencing • Views (FOVs) from an HMD or NVGs • Surgical trainers • Surgical/anatomical viewing tools • Ultrasound visualizer • Flight HUDs, infantry HUDs, building/ship inspector HUDs
ARToolKit • C-based software library • Uses Microsoft's Vision SDK/Video for Windows • Performs viewpoint calculations in near real time to achieve fusion • Supports both video and optical see-through AR • Operating systems: • Windows 95/98/NT when using the Connectix Color QuickCam • Windows 95/98 when using USB cameras • Requires GLUT 3.6 or higher
ARToolKit Steps • 1. Initialize the application • 2. Grab a video input frame • 3. Detect the markers • 4. Calculate the camera transformation • 5. Draw the virtual objects • 6. Close the video path • Steps 2-5 repeat as the main loop (see the code sketch below)
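The numbered steps map closely onto calls in the classic ARToolKit C API. The skeleton below is a sketch modeled on ARToolKit's simpleTest-style examples; the data file names (camera_para.dat, patt.hiro), the marker width, and the exact function signatures are assumptions that vary between ARToolKit versions.

#include <stdlib.h>
#include <GL/glut.h>
#include <AR/gsub.h>
#include <AR/video.h>
#include <AR/param.h>
#include <AR/ar.h>

static int    patt_id;
static int    thresh = 100;            /* lighting threshold for binarization */
static double patt_width     = 80.0;   /* marker size in millimetres (assumed) */
static double patt_center[2] = { 0.0, 0.0 };
static double patt_trans[3][4];

/* Step 5: place the camera from the marker pose and draw a simple cube. */
static void draw(void)
{
    double gl_para[16];

    argDrawMode3D();
    argDraw3dCamera(0, 0);
    glClear(GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);

    argConvGlpara(patt_trans, gl_para);   /* marker transform -> OpenGL modelview */
    glMatrixMode(GL_MODELVIEW);
    glLoadMatrixd(gl_para);
    glTranslated(0.0, 0.0, 25.0);         /* sit the cube on top of the marker */
    glutSolidCube(50.0);

    glDisable(GL_DEPTH_TEST);
}

/* Step 6: close the video path before exiting. */
static void cleanup(void)
{
    arVideoCapStop();
    arVideoClose();
    argCleanup();
}

static void keyEvent(unsigned char key, int x, int y)
{
    if (key == 0x1b) { cleanup(); exit(0); }   /* ESC quits */
}

/* Steps 2-5, run once per frame by argMainLoop(). */
static void mainLoop(void)
{
    ARUint8      *dataPtr;
    ARMarkerInfo *marker_info;
    int           marker_num, j, k;

    if ((dataPtr = arVideoGetImage()) == NULL) { arUtilSleep(2); return; }   /* step 2 */

    argDrawMode2D();
    argDispImage(dataPtr, 0, 0);                                  /* show live video */

    arDetectMarker(dataPtr, thresh, &marker_info, &marker_num);   /* step 3 */
    arVideoCapNext();

    k = -1;                                   /* keep the most confident detection */
    for (j = 0; j < marker_num; j++)
        if (marker_info[j].id == patt_id && (k == -1 || marker_info[j].cf > marker_info[k].cf))
            k = j;
    if (k == -1) { argSwapBuffers(); return; }

    arGetTransMat(&marker_info[k], patt_center, patt_width, patt_trans);   /* step 4 */
    draw();                                                                /* step 5 */
    argSwapBuffers();
}

/* Step 1: initialize video, camera parameters, graphics, and the pattern. */
int main(int argc, char **argv)
{
    ARParam wparam, cparam;
    int     xsize, ysize;

    glutInit(&argc, argv);
    arVideoOpen("");                                   /* default video source */
    arVideoInqSize(&xsize, &ysize);
    arParamLoad("Data/camera_para.dat", 1, &wparam);   /* camera calibration file */
    arParamChangeSize(&wparam, xsize, ysize, &cparam);
    arInitCparam(&cparam);
    argInit(&cparam, 1.0, 0, 0, 0, 0);                 /* open the display window */
    patt_id = arLoadPatt("Data/patt.hiro");            /* pre-trained pattern template */

    arVideoCapStart();
    argMainLoop(NULL, keyEvent, mainLoop);             /* steps 2-5 repeat each frame */
    return 0;
}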
How ARToolKit Works (pipeline: input video → threshold video → virtual overlay) • The live video image is turned into a binary image based on a lighting threshold value (a contrast value) • ARToolKit finds potential tracking markers in the binary image • Potential markers are compared against pre-trained pattern template(s); a match means the corresponding virtual object(s) will be rendered • ARToolKit uses the known marker size and pattern orientation to calculate the size and orientation of the virtual object • Render: the virtual image is overlaid on the tracking marker
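The first step (binarization against a lighting threshold) is simple to illustrate. Below is a sketch in plain C with a hypothetical function name, assuming an 8-bit grayscale frame; ARToolKit itself works on the camera's native pixel format and exposes the threshold as a tunable parameter.

#include <stdint.h>

/* Binarize a grayscale frame against a lighting threshold: dark pixels
 * (candidate marker borders and interiors) become 1, everything else 0.
 * The connected dark regions are what the marker detector then examines. */
void threshold_frame(const uint8_t *gray, uint8_t *binary,
                     int width, int height, uint8_t thresh)
{
    for (int i = 0; i < width * height; i++)
        binary[i] = (gray[i] < thresh) ? 1 : 0;
}

Because the threshold is a single global value, poor lighting or low contrast (among the challenges listed earlier) can merge or erase marker regions, which is why the threshold is normally adjustable at run time.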
Bibliography • James R. Vallino, http://www.cs.rit.edu/~jrv/research/ar/ • Hirokazu Kato, Mark Billinghurst, Rob Blanding, Richard May, http://www.hitl.washington.edu/research/shared_space/