Immersion, Presence, Distributed VR
Bob Hobbs, Staffordshire University Computing School
Outline • Context • Immersion • Presence • Shared Environments
Virtual Reality is a Tool • What it is: • Use of highly interactive, real-time immersive systems to convey information • What it is not: • Desktop graphics • Text based • Non-interactive • Linear
Immersion: Realisation of an Environment • generates displays ideally in all sensory systems; • fully encloses the participant in those displays; • tracks the body, limbs and head; • determines the optical, auditory... arrays as a function of head tracking • Either: • displays a Virtual Body whose movements are a function of the tracking (mainly with an HMD) • or the participant can visualise self and world directly (CAVE)
Virtual Body • At any moment there is a position in the geometry with respect to which sensory data is generated - the egocentric self-reference position. • This corresponds to the place occupied by the human actor in the environment. • At the self-reference position there is a functioning VB represented by the displays.
Means of Immersion • Cave Environment • Boom-Mounted Display • Head-Mounted Display • ImmersaDesk
ImmersaDesk [image of the ImmersaDesk display]
Stereo Displays • Binocular vision considerably enhances visual depth perception. • Stereo displays, such as the StereoView option on SGI workstations, can provide high-resolution stereo with real-time interaction. • StereoView consists of two items: specially designed eyewear and an infrared emitter. • The shutters alternately open and close every 1/120th of a second, in conjunction with the alternating display of the left- and right-eye views, presenting each eye with an effective 60 Hz refresh. • The infrared emitter transmits the left/right signal from the IRIS workstation to the wireless eyewear so that the shuttering of the liquid crystal shutters is locked to the alternating left/right image display. • As a result, each eye sees a unique image and the brain integrates the two views into a stereo picture.
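To make the geometry behind the two alternating views concrete, the sketch below (not SGI StereoView code; the interocular distance and axis conventions are assumptions) shows how a renderer might derive the left- and right-eye camera positions from a tracked head position.

```python
import numpy as np

# A minimal sketch: generate left/right eye camera positions by offsetting the
# tracked head position along the head's "right" axis by half an assumed
# interocular distance.

INTEROCULAR_DISTANCE = 0.065  # metres; assumed average eye separation

def eye_positions(head_position, head_right_axis, iod=INTEROCULAR_DISTANCE):
    """Return (left_eye, right_eye) world positions for stereo rendering."""
    head_position = np.asarray(head_position, dtype=float)
    right = np.asarray(head_right_axis, dtype=float)
    right = right / np.linalg.norm(right)   # ensure the axis is unit length
    half = 0.5 * iod * right
    return head_position - half, head_position + half

# Example: head 1.6 m above the floor, looking down -Z, so "right" is +X.
left_eye, right_eye = eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
# The scene is then rendered once per eye while the shutter glasses alternate
# at 120 Hz, giving each eye an effective 60 Hz refresh.
```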
Head-Mounted Display • EyePhone: a head-mounted display system that presents the rich 3D cues of head-motion parallax and stereopsis. • Designed to take advantage of human binocular vision, it has the following general characteristics: • headgear with two small LCD colour screens, each optically channelled to one eye, for binocular vision; • special optics in front of the screens, for a wide field of view; • a tracking system (Polhemus 3Space Isotrack) for precise location of the user's head in real time.
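As a rough illustration of how head tracking produces the head-motion parallax cue, the following sketch (hypothetical; not the Polhemus or EyePhone API) converts a tracked head pose into the view matrix used to render each frame.

```python
import numpy as np

# A minimal sketch: turn a tracked head pose (position plus orientation as a
# 3x3 rotation matrix) into a 4x4 world-to-eye view matrix.

def view_matrix_from_head_pose(position, rotation):
    """Return the inverse of the head's pose matrix (world-to-eye transform)."""
    R = np.asarray(rotation, dtype=float)   # head orientation in the world frame
    t = np.asarray(position, dtype=float)   # head position in the world frame
    view = np.eye(4)
    view[:3, :3] = R.T                      # inverse of a rotation is its transpose
    view[:3, 3] = -R.T @ t                  # translate so the head sits at the origin
    return view

# Each frame the tracker supplies a new (position, rotation) sample and the
# scene is re-rendered with the updated view matrix, giving motion parallax.
view = view_matrix_from_head_pose([0.0, 1.6, 0.5], np.eye(3))
```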
Position/orientation trackers • There are 3 main ways of recording positions and orientations: magnetic, ultrasonic and optical. Magnetic tracking devices have been the most successful. • The Polhemus 3Space Isotrack and Ascension Birds (Flock of Birds) are not perfect but are the most common: a source generates a low-frequency magnetic field that is detected by the sensor. • The second approach is generally based on a tripod of 3 ultrasonic speakers set in a triangular arrangement, each of which emits ultrasonic sound signals. • Optical systems use light sources in a similar way (InterSense). • The eddy effect is used to detect orientation, and position is found by grid reference.
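For the ultrasonic case, the sketch below illustrates the underlying geometry: with the three speaker positions known and three times of flight measured at the receiver, the receiver position can be recovered by trilateration. The speaker layout, timings and speed of sound are illustrative assumptions, not values from any particular product.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature; an assumption for this sketch

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Return the receiver position on the near side of the speaker plane."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)      # local x-axis along p1->p2
    i = ex @ (p3 - p1)
    ey = p3 - p1 - i * ex
    ey = ey / np.linalg.norm(ey)                  # local y-axis in the speaker plane
    ez = np.cross(ex, ey)                         # local z-axis out of the plane
    d = np.linalg.norm(p2 - p1)
    j = ey @ (p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i * x) / j
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))    # pick the solution in front of the tripod
    return p1 + x * ex + y * ey + z * ez

# Times of flight (seconds) from the three transmitters, converted to ranges:
times = np.array([2.9e-3, 3.1e-3, 3.0e-3])
ranges = SPEED_OF_SOUND * times
position = trilaterate([0, 0, 0], [0.5, 0, 0], [0.25, 0.43, 0], *ranges)
```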
Electromagnetic Position Tracking [block diagram: transmitter and receiver connected via driving electronics and SP electronics to a computer, which outputs position and orientation]
Electromagnetic Position Tracking [timing diagram comparing alternating current (AC) and direct current (DC) operation: the transmitter X, Y and Z antennas are excited in turn (times T0-T3) and the corresponding signals appear on the receiver X, Y and Z antennas]
Position Tracking Systems • Polhemus Inc. (http://www.polhemus.com) • 3Space ISOTRAK (1 sensor) • 3Space FASTRAK (many sensors) • Ascension Technology Corp. (http://www.ascension-tech.com) • Flock of Birds • pcBIRD • SpacePad
Tracker Calibration • Dynamic errors • caused by external electromagnetic fields • can be corrected by increasing the measurement frequency, synchronising the measurements with the external field source, and filtering • Static errors • caused by field distortions due to surrounding metal and external fields • can be corrected via tracker calibration
Calibration Table [diagram: true vs. tracked sensor positions plotted in the X-Z plane]
Calibration Example [plot: CAVE with the Flock of Birds transmitter 4 feet from the floor, measurements taken on a 1-foot grid and corrected with a 4th-order polynomial fit]
Interpolation [diagram: a point in tracked space corrected by interpolating among the surrounding calibration-grid points (distances d1 to d8) in true space]
V. Kindratenko, A. Bennett, “Evaluation of Rotation Correction Techniques for Electromagnetic Position Tracking Systems”, in Proc. VE 2000, pp. 13-22
Data Acquisition Techniques • The size and type of a calibration table depend on • the type of calibration technique to be used • the severity of the field distortions • the required calibration quality • The calibration table can be • irregular (for a high-order polynomial fit) • regular in the true space (for interpolation) • regular in the tracked space (for tri-linear interpolation)
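As an illustration of the "regular in the tracked space" case, the sketch below applies tri-linear interpolation of stored (true minus tracked) offsets to correct a tracked position. The grid extents and the synthetic distortion are assumptions for the example; a real table would come from measurements such as those in the cited paper.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Calibration grid: tracked-space sample coordinates (metres), here an assumed
# 11 x 11 x 11 grid roughly covering a CAVE-sized volume.
xs = ys = zs = np.linspace(-1.5, 1.5, 11)
gx, gy, gz = np.meshgrid(xs, ys, zs, indexing="ij")

# For each grid node, the offset that maps tracked -> true position.
# A smooth fake distortion stands in for real measurements.
offsets = np.stack([0.05 * np.sin(gx),
                    0.03 * np.cos(gy),
                    0.02 * gz ** 2], axis=-1)

# One tri-linear interpolator per offset component.
interpolators = [RegularGridInterpolator((xs, ys, zs), offsets[..., k])
                 for k in range(3)]

def correct(tracked_position):
    """Apply the calibration: tracked position + interpolated offset."""
    p = np.atleast_2d(tracked_position)
    offset = np.stack([f(p) for f in interpolators], axis=-1)
    return (p + offset)[0]

print(correct([0.2, -0.4, 1.0]))
```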
An Immersive Participant • A user will be head tracked • Have a ‘Wand’ • Stereo glasses in CAVE • HMD user may have additional tracking sensors – Data Glove or Motion tracker
Data Glove • Hand measurement devices must sense both the flexing angles of the fingers and the position/orientation of the wrist in real time. • A typical example of a hand measurement device is the DataGlove from VPL Research. • The DataGlove consists of a lightweight nylon glove with optical sensors mounted along the fingers.
Each sensor is a short length of fibre-optic cable, with a light-emitting diode (LED) at one end and a phototransistor at the other. • When the cable is flexed, some of the LED's light is lost, so less light is received by the phototransistor. • Attached to the back of the glove is a 3Space Isotrack system to measure the orientation/position of the gloved hand.
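A minimal sketch of how such a sensor reading might be turned into a joint angle follows; the linear mapping and the raw ADC values are illustrative assumptions, not VPL's actual calibration.

```python
# Convert a raw phototransistor reading into a finger flex angle using a simple
# per-sensor calibration: the user first holds the finger straight, then fully bent.

def flex_angle(raw, raw_straight, raw_bent, max_angle_deg=90.0):
    """Linearly map a raw light reading onto [0, max_angle_deg] degrees."""
    # Less light reaches the phototransistor as the fibre is bent, so the
    # reading decreases from raw_straight towards raw_bent.
    t = (raw_straight - raw) / float(raw_straight - raw_bent)
    t = min(max(t, 0.0), 1.0)   # clamp to the calibrated range
    return t * max_angle_deg

# Example calibration for one sensor (hypothetical ADC counts):
print(flex_angle(raw=612, raw_straight=840, raw_bent=310))  # about 39 degrees
```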
Data Suit • Much less popular than the DataGlove: allows the positions of the body to be measured. • A typical example of the use of a datasuit: • the Fuji TV film the Dream of Mr. M. • A 3D character approximately performs the same motion as the animator. • Another way of measuring body positions is simply to use a collection of sensors such as the Flock of Birds. • However, this needs algorithms for calibration and conversion (see the paper by Molet et al.)
Sound • MIDI equipment and workstation audio for sound generation and effects; filter processors and 3D-audio cards for spatial audio. • Two categories of sound in VR can be identified: • Simulation of real-world acoustics: based on our everyday experience, the physical behaviour of sound can be modelled. • This comprises sound generation (e.g. caused by object collisions), sound propagation and auralization. • Immersive user interfaces can be used to evaluate simulation results. • Sound at the user interface: sound can be applied to support the user in the current task or to provide information about invisible proceedings.
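As a small illustration of the first category, the sketch below applies an inverse-distance attenuation and a constant-power stereo pan to a point source. Real auralization also models reflections, occlusion and interaural time differences; the function names here are hypothetical.

```python
import numpy as np

def spatialise(source_pos, listener_pos, listener_right, gain=1.0):
    """Return (left_gain, right_gain) for a point source."""
    source_pos = np.asarray(source_pos, dtype=float)
    listener_pos = np.asarray(listener_pos, dtype=float)
    to_source = source_pos - listener_pos
    distance = np.linalg.norm(to_source)
    attenuation = gain / max(distance, 1.0)     # 1/r law beyond 1 metre
    # Project the direction onto the listener's right axis:
    # -1 = hard left, +1 = hard right.
    pan = float(np.dot(to_source / distance, np.asarray(listener_right, float)))
    theta = (pan + 1.0) * np.pi / 4.0           # constant-power pan law
    return attenuation * np.cos(theta), attenuation * np.sin(theta)

# Example: a source ahead and to the right of the listener.
left_g, right_g = spatialise([2.0, 0.0, -3.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0])
```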
Presence • Presence is a state of consciousness where the human actor has a sense of being in the location specified by the displays. • We take presence as the central feature of "virtual reality": • "A virtual reality is defined as a real or simulated environment in which a perceiver experiences telepresence" (Steuer). • The unique feature of "virtual reality" systems is that they are general-purpose presence-transforming machines.
Meaning of Presence • Presence is the psychological sense of being there in the environment specified by the displays. • a high degree of presence in the VE should lead to the participant experiencing objects and processes in the virtual world as (temporarily) more the presenting reality than the real world in which the VE experience is actually embedded. • A correlate of this is that the participant should exhibit behaviours that are the same as those they would carry out in similar circumstances in everyday reality. • The VE experience - should be more like visiting a place, rather than like seeing images designating a place
Design in Immersive VEs With design in immersive virtual environments... • designer shares same space as objects; • a degree of evaluation can take place in the virtual space; • presence leads to the designer behaving in a manner appropriate to everyday reality in similar circumstances. • Special "interactive techniques" and behaviours do not have to be learned...
Feedback • Two forms of feedback • Force Feedback • Manipulating virtual objects • Gravity • Simulation • Touch (tactile) Feedback • Texture appreciation • Navigation • Sensitive • Use Haptic Devices
What is a haptic interface? • A haptic interface is a force reflecting device which allows a user to touch, feel, manipulate, create, and/or alter simulated 3D-objects in a virtual environment. • Movement trackers do not provide feedback
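A common way to render such reflected forces is the penalty (virtual spring) model sketched below for a flat virtual wall. The stiffness value is an assumption, and this is not the API of any particular device such as the PHANTOM.

```python
import numpy as np

STIFFNESS = 800.0   # N/m; an assumed virtual-wall stiffness

def wall_force(tip_position, wall_point, wall_normal, k=STIFFNESS):
    """Return the 3D force (N) to command from the haptic device."""
    n = np.asarray(wall_normal, dtype=float)
    n = n / np.linalg.norm(n)
    # Penetration depth of the probe tip behind the wall plane (positive = inside).
    penetration = np.dot(np.asarray(wall_point, float) - np.asarray(tip_position, float), n)
    if penetration <= 0.0:          # tip is still outside the wall: no force
        return np.zeros(3)
    return k * penetration * n      # spring force pushes the tip back out

# Example: a wall through the origin facing +Z, tip 2 mm inside it.
print(wall_force([0.0, 0.0, -0.002], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))  # ~[0, 0, 1.6] N
```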
Usage It could be used to • train physical skills such as those jobs requiring specialised hand-held tools (e.g. surgeons, astronauts, mechanics), • provide haptic-feedback modelling of three-dimensional objects without a physical medium (such as automobile body designers working with clay models), or • mock up developmental prototypes directly from CAD databases (rather than in a machine shop).
Phantom A very common haptic device, mainly used for augmentation on desktop systems
Presence in Multi-participant Environments • Sense of being in a place • sense of sharing the same space as other individuals • Sense of belonging to a totality more than just the sum of the individuals • Awareness may be an important factor enhancing shared presence. • Shared presence may correspondingly enhance awareness
Tele-Immersion • Goal - not just making these collaborations possible, but making them convenient
CAVERNsoft Application Virtual Harlem • Bryan Carter, Bill Plummer – ATC (Advanced Technology Center at Univ of Missouri- Columbia ) • SIGGRAPH 1999 • Harlem is reconstructed for an African American Literature course at MU. Instead of just reading literary works from this era, this prototype will allow students to become immersed and engaged in an interactive literature course. • Jim Sosnoski, Jim Fletcher- English Dept. Univ Illinois Chicago • Steve Jones- Communications Dept. Univ Illinois Chicago
Avatars • Tracking head and hand position and orientation give good cues • Extendable pointing rays can be useful in large spaces • Exaggerated head and hand motions give better cues than just hand
Shared Virtual Environments in Europe • Collaborative Virtual Environments (COVEN) ACTS • Develops an integrated teleworking platform that supports multi-sensory presence for collaboration in shared virtual environments. • Services: • mechanisms to support the presence of users in shared virtual environments. • browsing and interaction facilities for large numbers of users accessing enormous quantities of remote information; • synchronised multi-sensory interaction with dynamic representations of three-dimensional objects and actors; • support for collaborative tasks requiring complex motor skills and shared information.
VR Applications • Augmented Reality • Placing data in the normal workspace • Data Visualisation • Explaining data through better representation • Training • For dangerous/expensive procedures • Conferencing • Social context for telecommunication • Health • Treatment of phobias/psychological disorders • Entertainment