Haptic Rendering Max Smolens COMP 259 March 26, 2003
What is haptics? • Using the sense of touch to interact with computers and virtual environments
What is haptic rendering? • The process of computing and generating forces in response to user interactions with virtual objects
Why use haptics? • Increases the information flow between the computer and the user • Intrinsically bilateral • When we push on an object, it pushes back on us
Why use haptics? (2) • Our sensing of forces is closely tied to our visual system and sense of three-dimensional space • Information and intent can be conveyed in a physically direct and primal way
Haptic Applications • Medicine • Surgical simulators for training • Manipulating robots for minimally invasive surgery • Telemedicine, remote diagnosis • Accessibility for the disabled
Haptic Applications (2) • Entertainment • Video games, simulators that enable the user to feel and manipulate objects in the environment • Education • Feel phenomena at a variety of spatial and temporal scales • Studying complex data sets
Haptic Applications (3) • Industry • CAD systems • Virtual prototyping • Assembly and disassembly can guide final design • Shape sculpting • Expressive, free-form shape generation and modification
Haptic Applications (4) • The arts • Virtual painting, sculpting • Virtual musical instruments
Human haptics • Mechanical, sensory, motor and cognitive components • Two classes of sensory information: • Tactile • Kinesthetic
Human haptics (2) • Tactile information • From skin in contact with an object • Spatial and temporal variations of forces within the contact region • Slipping, fine textures, small shapes, and softness
Human haptics (3) • Kinesthetic information • Net forces along with position and motion of limbs • Coarse properties of object • Large shapes, spring-like compliances
Human haptics (4) • Kinesthetic resolution: • 2 degrees for fingers and wrist • 1 degree for shoulder • Force exerted by a finger: • 50 to 100 N maximum • 5-15 N typically during exploration and manipulation
What makes a good interface? • Must work with human abilities and limitations • Approximations of real-world haptic interactions determined by limits of human performance
A good haptic interface • Free motion must feel free • Low back-drive inertia and friction • No motion constraints • Ergonomics and comfort • Pain, discomfort and fatigue will detract from the experience
A good haptic interface (2) • Suitable range, resolution and bandwidth • User should not be able to go through rigid objects by exceeding force range • No unintended vibrations • Solid objects must feel stiff
Haptic rendering • Two parts: collision detection and collision response
Two types of interactions • Point-based haptic interactions • Only end point of device, or haptic interface point (HIP), interacts with virtual object • When moved, collision detection algorithm checks to see if the end point is inside the virtual object • Depth calculated as distance between HIP and closest surface point
Two types of interactions (2) • Ray-based haptic interactions • Probe of haptic device modeled as a line-segment whose orientation matters • Can touch multiple objects simultaneously • Torque interactions
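A minimal Python sketch of the point-based case from the previous slide, using an axis-aligned box as the virtual object (the box, the function name, and the return convention are illustrative assumptions, not from the slides): the HIP is tested for containment, and the penetration depth is its distance to the closest surface point, here the nearest face.

```python
def point_in_box_depth(hip, box_min, box_max):
    """Point-based collision check against an axis-aligned box.

    Returns (penetration depth, outward contact normal) when the HIP is
    inside the box, or None when there is no contact.
    """
    inside = all(lo < p < hi for p, lo, hi in zip(hip, box_min, box_max))
    if not inside:
        return None
    best_depth, best_normal = float("inf"), None
    for axis in range(3):
        for face, sign in ((box_min[axis], -1.0), (box_max[axis], 1.0)):
            depth = abs(hip[axis] - face)      # distance from HIP to this face
            if depth < best_depth:
                normal = [0.0, 0.0, 0.0]
                normal[axis] = sign            # points out through the closest face
                best_depth, best_normal = depth, normal
    return best_depth, best_normal
```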
Collision detection • Detect collisions between the haptic probe and virtual objects • Bounding volume hierarchies, spatial partitioning • H-COLLIDE, a hybrid technique: • Partitions the virtual workspace into a uniform grid • For each grid cell containing primitives, computes an OBBTree
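The H-COLLIDE data structures themselves are not shown in the slides; as a rough sketch of the hybrid idea, the Python below hashes triangles into a uniform grid so that only the primitives in the probe's cell need an exact test (in H-COLLIDE each occupied cell would hold an OBBTree over its primitives). The cell size and function names are assumptions.

```python
from collections import defaultdict

def build_grid(triangles, cell_size):
    """Map each uniform-grid cell to the indices of triangles whose
    bounding box overlaps it."""
    grid = defaultdict(list)
    for idx, tri in enumerate(triangles):          # tri: three (x, y, z) vertices
        xs, ys, zs = zip(*tri)
        lo = [int(min(c) // cell_size) for c in (xs, ys, zs)]
        hi = [int(max(c) // cell_size) for c in (xs, ys, zs)]
        for i in range(lo[0], hi[0] + 1):
            for j in range(lo[1], hi[1] + 1):
                for k in range(lo[2], hi[2] + 1):
                    grid[(i, j, k)].append(idx)
    return grid

def candidate_triangles(grid, point, cell_size):
    """Triangles in the cell containing the probe point; only these
    need exact (e.g. OBBTree-based) collision tests."""
    key = tuple(int(c // cell_size) for c in point)
    return grid.get(key, [])
```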
Simple collision response • Haptic rendering of 3D sphere
Simple collision response (2) • Reaction force calculated using the linear spring law F=kx • k: stiffness of object • x: depth of penetration • Direction of force along surface normal
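A small Python/numpy sketch of this spring-law response for the sphere; the stiffness value is an arbitrary placeholder.

```python
import numpy as np

def sphere_reaction_force(hip, center, radius, k=800.0):
    """Penalty force for a rigid sphere: if the HIP has penetrated,
    return F = k * x along the outward surface normal, where x is the
    penetration depth; otherwise return zero force."""
    offset = np.asarray(hip, float) - np.asarray(center, float)
    dist = np.linalg.norm(offset)
    if dist >= radius or dist == 0.0:
        return np.zeros(3)              # no contact (or degenerate at the center)
    normal = offset / dist              # outward surface normal
    depth = radius - dist               # penetration depth x
    return k * depth * normal           # F = k x
```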
Penalty methods • Subdivide object and associate each subvolume with a surface • Determine feedback force directly from penetration • Works well for simple geometric shapes
Penalty methods (2) • There are some problems • Two different paths can reach the same location, so the penetration depth alone does not determine which surface was entered or which way the force should point
Penalty methods (3) • Force summation for multiple objects • Compute the net force by adding the individual penalty forces • Correct for perpendicular surfaces • For obtuse angles between the surfaces the summed force becomes too large • When the surfaces are nearly parallel, the force is too large by a factor of two
Penalty methods (4) • Problems with thin objects • Once the HIP is pushed more than halfway through a thin object, the nearest surface is the far side, so the force pulls it through the rest of the way
Solution? God-object • Zilles, Salisbury (1995) • Cannot stop HIP from penetrating virtual objects • Define additional variables to represent the virtual location of the haptic interface (god-object, IHIP, proxy)
God-object (2) • In free space, HIP and IHIP are collocated • When HIP moves into an object, the IHIP remains on the surface • IHIP computed such that its distance from the HIP is minimized • Correct force vector is unambiguous
God-object (3) • Infinite surface: • Active if the old IHIP is a positive distance from the surface and the HIP is a negative distance from the surface • Finite extent: • If a line traced from the old IHIP to new HIP passes through the facet, then consider the facet active
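A Python/numpy sketch of the finite-extent test above: a triangular facet becomes active when the segment from the old IHIP to the new HIP crosses it. The point-in-triangle test and the argument conventions are illustrative; inputs are numpy arrays.

```python
import numpy as np

def facet_active(ihip_old, hip, tri):
    """True if the segment from the old IHIP to the new HIP passes
    through the triangular facet `tri` (a 3x3 array of vertices)."""
    a, b, c = tri
    n = np.cross(b - a, c - a)               # facet normal (unnormalized)
    d_old = np.dot(ihip_old - a, n)          # signed distances to the facet plane
    d_new = np.dot(hip - a, n)
    if d_old * d_new > 0 or d_old == d_new:  # same side, or segment parallel to plane
        return False
    t = d_old / (d_old - d_new)              # segment/plane intersection parameter
    p = ihip_old + t * (hip - ihip_old)      # intersection point on the plane

    def same_side(p1, p2, e0, e1):           # are p1, p2 on the same side of edge e0-e1?
        return np.dot(np.cross(e1 - e0, p1 - e0), np.cross(e1 - e0, p2 - e0)) >= 0

    return same_side(p, a, b, c) and same_side(p, b, c, a) and same_side(p, c, a, b)
```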
God-object (4) • When touching convex portion of an object, only one surface should be active at a time
God-object (5) • When touching a concave portion of an object, multiple surfaces can be active • 2 surfaces: constrain IHIP to a line • 3 surfaces: constrain IHIP to a point • The IHIP might cross another surface before the HIP does • Solution: iterate the process until no new constraints are found
God-object (6) • Location computation using Lagrange multipliers • x, y, z: coordinates of IHIP • xp, yp, zp: coordinates of HIP • Constraints added as planes
God-object (7) • Minimize L by setting its six partial derivatives equal to 0, solvable with 65 multiplies and divides
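In the god-object formulation the Lagrangian is L = ½[(x−xp)² + (y−yp)² + (z−zp)²] + Σᵢ λᵢ(Aᵢx + Bᵢy + Cᵢz − Dᵢ), and setting its partial derivatives to zero gives a small linear system (6×6 when three planes are active). The Python/numpy sketch below solves that system generically rather than with the paper's hand-counted 65 operations; representing each constraint plane as a (normal, offset) pair is an assumption.

```python
import numpy as np

def constrained_ihip(hip, planes):
    """Find the IHIP closest to the HIP subject to lying on each active
    constraint plane (n . x = d).  `planes` is a list of (normal, d) pairs,
    at most three of them.  The stationarity conditions of the Lagrangian
    form the linear system  [I A^T; A 0] [x; lambda] = [hip; d]."""
    A = np.array([n for n, _ in planes], dtype=float)   # k x 3 constraint normals
    d = np.array([d for _, d in planes], dtype=float)   # k plane offsets
    k = len(planes)
    M = np.zeros((3 + k, 3 + k))
    M[:3, :3] = np.eye(3)
    M[:3, 3:] = A.T
    M[3:, :3] = A
    rhs = np.concatenate([np.asarray(hip, dtype=float), d])
    return np.linalg.solve(M, rhs)[:3]                  # first three entries are the IHIP
```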
Rendering surface details • Smoothing • Friction • Textures
Force shading • Render objects as smooth and continuous, even if underlying representation is not • Compute force vector for each vertex, interpolate over polygonal surfaces (like Phong shading)
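A sketch of the interpolation step in Python/numpy, assuming the contact point lies inside the triangle; the per-vertex normals would come from averaging adjacent facet normals, exactly as in Phong shading.

```python
import numpy as np

def shaded_force_direction(contact, tri, vertex_normals):
    """Interpolate the three per-vertex normals at the contact point with
    barycentric weights and return the unit result, used as the force
    direction so that a coarse mesh feels smooth."""
    a, b, c = tri
    area = np.linalg.norm(np.cross(b - a, c - a))             # 2 * triangle area
    w_a = np.linalg.norm(np.cross(b - contact, c - contact)) / area
    w_b = np.linalg.norm(np.cross(c - contact, a - contact)) / area
    w_c = 1.0 - w_a - w_b
    n = w_a * vertex_normals[0] + w_b * vertex_normals[1] + w_c * vertex_normals[2]
    return n / np.linalg.norm(n)
```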
Surface friction • Without friction, virtual objects feel “icy-smooth” • Coulomb friction: sticking and sliding • Forces tangential to surface, direction opposite of motion
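A hedged Python/numpy sketch of one common way to render stick-slip Coulomb friction, using a tangential spring toward a "stiction" anchor; the stiffness and friction coefficients are placeholders, and the exact model is not specified in the slides.

```python
import numpy as np

def friction_force(hip_tangent, anchor, normal_force_mag,
                   k_t=500.0, mu_static=0.6, mu_kinetic=0.4):
    """Return (tangential friction force, updated anchor).

    While sticking, a spring pulls the HIP's tangential position back toward
    the anchor, opposing its motion.  If the spring force exceeds the static
    friction limit, the contact slips: the force is capped at the kinetic
    limit and the anchor is dragged along behind the HIP.
    """
    f = k_t * (anchor - hip_tangent)           # spring toward the anchor
    f_mag = np.linalg.norm(f)
    if f_mag <= mu_static * normal_force_mag:  # sticking
        return f, anchor
    direction = f / f_mag                      # slipping
    f = mu_kinetic * normal_force_mag * direction
    new_anchor = hip_tangent + f / k_t         # drag the anchor so the spring matches f
    return f, new_anchor
```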
Haptic texturing • Force perturbation (Max and Becker, 1994) • Modify the direction and magnitude of the force vector to convey texture
Haptic texturing (2) • Image-based: • Construct texture field from 2D image data • Map heights onto the object surface • Procedural: • Generate synthetic texture fields using mathematical functions
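A rough Python/numpy sketch of the image-based variant: sample a 2D height image at the contact's surface coordinates and tilt the force direction by the local gradient, in the spirit of bump mapping. The texture-coordinate mapping, tangent frame, and scale factor are all assumptions.

```python
import numpy as np

def perturbed_force_direction(base_normal, tangent_u, tangent_v,
                              height, u, v, scale=1.0):
    """Perturb the surface normal (force direction) by the finite-difference
    gradient of a height-field texture sampled at (u, v) in [0, 1]^2."""
    h, w = height.shape
    i, j = int(v * (h - 1)), int(u * (w - 1))
    dh_du = height[i, min(j + 1, w - 1)] - height[i, max(j - 1, 0)]
    dh_dv = height[min(i + 1, h - 1), j] - height[max(i - 1, 0), j]
    n = base_normal - scale * (dh_du * tangent_u + dh_dv * tangent_v)
    return n / np.linalg.norm(n)
```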
Challenges • Graphics update rate must be about 20-30 Hz • Haptic update rate must be around 1 kHz • Decouple the simulation and haptic loops using multiple processors or a dedicated machine
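A toy Python sketch of the decoupling idea: a fast haptic loop renders forces against a simple local model (here just a plane) that a slower simulation loop keeps updated. Real systems use a dedicated processor or real-time scheduling; Python's sleep-based timing will not actually hold 1 kHz, so this only illustrates the structure.

```python
import threading, time

class SharedModel:
    """Latest local surface approximation published by the slow simulation
    loop and read by the fast haptic loop."""
    def __init__(self):
        self.lock = threading.Lock()
        self.plane = ((0.0, 1.0, 0.0), 0.0)      # (normal, offset) placeholder

def haptic_loop(model, stop):
    while not stop.is_set():
        with model.lock:
            normal, offset = model.plane
        # ...read the device, compute F = k*x against the local plane, command force...
        time.sleep(0.001)                         # ~1 kHz target

def simulation_loop(model, stop):
    while not stop.is_set():
        # ...full collision detection / simulation against all objects...
        with model.lock:
            model.plane = ((0.0, 1.0, 0.0), 0.0)  # publish the new local approximation
        time.sleep(0.033)                         # ~30 Hz

# usage sketch:
#   stop = threading.Event(); model = SharedModel()
#   threading.Thread(target=haptic_loop, args=(model, stop), daemon=True).start()
#   threading.Thread(target=simulation_loop, args=(model, stop), daemon=True).start()
```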
6-DOF haptics challenges • Detect all surface contacts instead of contact at a single point • Calculate a reaction force and torque at every point or region of contact • Maintain the 1 kHz refresh rate
References • Basdogan, C. and Srinivasan, M.A. “Haptic rendering in virtual environments.” http://network.ku.edu.tr/~cbasdogan/Papers/VRbookChapter.pdf • Chen, E. “Six degree-of-freedom haptic system for desktop virtual prototyping applications.” Proc. First International Workshop on Virtual Reality and Prototyping, p. 97-106, 1999. • Gregory, A., Lin, M., Gottschalk, S. and Taylor, R. “A Framework for Fast and Accurate Collision Detection for Haptic Interaction.” Proc. IEEE Virtual Reality (VR '99), p. 38-45, 1999. • Mark, W. et al. “Adding force feedback to graphics systems: issues and solutions.” Proc. ACM SIGGRAPH 1996. • Massie, T.H. and Salisbury, J.K. “The PHANTOM haptic interface: a device for probing virtual objects.” Proc. ASME Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 1994. • McNeely, W., Puterbaugh, K. and Troy, J. “Six degree-of-freedom haptic rendering using voxel sampling.” Proc. ACM SIGGRAPH 1999.
References (2) • Ruspini, D., Kolarov, K. and Khatib, O. “The haptic display of complex graphical environments.” Proc. ACM SIGGRAPH 1997. • Salisbury, J.K. et al. “Haptic rendering: programming touch interaction with virtual objects.” Proc. ACM SIGGRAPH 1995. • Salisbury, J.K. and Srinivasan, M.A. “Phantom-based haptic interaction with virtual objects.” IEEE Computer Graphics and Applications, 17(5), p. 6-10, 1997. • Salisbury, J.K. “Making graphics physically tangible.” Communications of the ACM, 42(8), p. 74-81, 1999. • Srinivasan, M.A. and Basdogan, C. “Haptics in virtual environments: taxonomy, research status, and challenges.” Computers & Graphics, 21(4), p. 393-404, 1997. • Zilles, C.B. and Salisbury, J.K. “A constraint-based god-object method for haptic display.” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, Human Robot Interaction, and Cooperative Robots, Vol. 3, p. 146-151, 1995.