

  1. Monash University, School of Computer Science and Software Engineering. Region Warping in a Virtual Reality System with Priority Rendering. Yang-Wai Chow, Ronald Pose, Matthew Regan

  2. Overview • Background: Address Recalculation Pipeline, Priority Rendering • Description of the challenges/problems: Large object segmentation, Tearing • Solution to the problem: Region Priority Rendering, Region Warping • Experimental Results • Future Work

  3. Background • Address Recalculation Pipeline • Priority Rendering

  4. The latency problem [Timeline: user actions → delay → actions reflected by the display; this interval is the end-to-end latency] The Address Recalculation Pipeline (ARP) was designed to reduce the end-to-end latency due to user head rotations in immersive Head Mounted Display (HMD) virtual reality systems • Latency is a major factor that plagues the design of immersive HMD virtual reality systems • End-to-end latency is defined as the time between a user's action and the moment that action is reflected by the display

  5. Head Mounted Display (HMD) Lengthy delays in immersive Head Mounted Display (HMD) virtual reality systems can have adverse effects on the user • Latency can completely destroy the illusion of reality that the virtual reality system attempts to present to the user

  6. Normal sequence of events [Timeline: head tracking → image creation → buffer swap → image valid; conventional systems attempt to shorten this section] Conventional virtual reality display systems attempt to shorten the end-to-end latency by reducing scene complexity and/or by using faster rendering engines • Even with the fast graphics accelerators available today that can render over 100 frames per second (fps), the end-to-end latency remains a factor to be contended with • The update cycle is still bound by the need to obtain up-to-date head orientation information (where the user is looking) before any form of rendering can commence

  7. Conventional virtual reality system On conventional graphics systems, the rendering process is bound by the need to obtain up-to-date head orientation information prior to rendering [Pipeline diagram stages: head orientation, database traversal, geometric transform, lighting, face classification, clipping, viewport mapping, scan conversion, pixel addressing, image composition, display buffer, display image]

  8. The Address Recalculation Pipeline (ARP) virtual reality system [Pipeline diagram stages: head orientation, database traversal, geometric transform, lighting, face classification, clipping, scan conversion, pixel addressing, image composition, anti-aliasing, display buffer, locate pixel, wide angle correction, viewport mapping, display image] The ARP is fundamentally different from conventional systems in that it implements delayed viewport mapping, a concept whereby viewport mapping is performed post-rendering

  9. [Timeline comparison: head tracking → image creation → buffer swap → image valid gives the average latency to head rotations WITHOUT the pipeline; head tracking → image valid gives the average latency to head rotations WITH the pipeline] The ARP effectively decouples viewport orientation mapping from the rendering process, and in this manner removes the usually lengthy rendering time and buffer swapping delays from the latency • By separating viewport orientation mapping from the rendering process, latency is now bound by the HMD unit's update rate and the time required to fetch pixels from display memory • The system is much less dependent on the rendering frame rate and is therefore fairly independent of scene complexity

  10. [Diagram: cube rendering surface with faces labelled Top, Bottom, Left, Right, Front, Back] In order to implement delayed viewport mapping, the ARP requires the scene that encapsulates the user's head to be pre-rendered onto display memory • The surface of a cube was chosen as the rendering surface surrounding the user's head, mainly because of its rendering simplicity • The rendering surface of a cube contains six standard viewport mappings, each orthogonal to the others • There are standard algorithms for cube surface rendering • This kind of rendering can be found in the computer graphics technique known as cube environment mapping (see the sketch below)
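
A minimal sketch (Python, purely illustrative; not the ARP hardware) of the standard cube-map face selection that delayed viewport mapping relies on: the dominant axis of a view direction chooses one of the six faces, and the remaining components give coordinates on that face. The face-to-axis naming below is an assumption made for illustration only.

    def cube_face_lookup(dx, dy, dz):
        # Map a view direction to one of six cube faces plus (u, v) coordinates
        # in [-1, 1] on that face. The axis with the largest magnitude decides
        # the face; the other two components are divided by it.
        ax, ay, az = abs(dx), abs(dy), abs(dz)
        if ax >= ay and ax >= az:                 # x-axis dominant
            face = 'Right' if dx > 0 else 'Left'
            u, v = dy / ax, dz / ax
        elif ay >= ax and ay >= az:               # y-axis dominant
            face = 'Top' if dy > 0 else 'Bottom'
            u, v = dx / ay, dz / ay
        else:                                     # z-axis dominant
            face = 'Front' if dz > 0 else 'Back'
            u, v = dx / az, dy / az
        return face, u, v

    # Example: a direction looking mostly forward and slightly up
    print(cube_face_lookup(0.1, 0.3, 0.95))       # ('Front', ...)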

  11. Image Composition A rendering method known as Priority Rendering was developed to be used in conjunction with the ARP system, for the purpose of reducing the overall rendering load • Priority Rendering is based on the concept of image composition • Different sections of the scene can be rendered onto separate display memories before being combined to form an image of the whole scene
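
As an illustration of the image composition idea, here is a hedged sketch of one common way it can be realised: two display memories are combined pixel by pixel, keeping whichever layer is nearer at each pixel. The ARP composites in hardware, so its exact rule may differ.

    import numpy as np

    def compose(colour_a, depth_a, colour_b, depth_b):
        # Per-pixel composition of two display memories: colour_* are
        # (H, W, 3) arrays, depth_* are (H, W) arrays. The layer with the
        # smaller depth value wins at each pixel.
        nearer_a = depth_a <= depth_b
        colour = np.where(nearer_a[..., None], colour_a, colour_b)
        depth = np.where(nearer_a, depth_a, depth_b)
        return colour, depth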

  12. Priority Rendering Priority Rendering allows different sections of the scene that surrounds the user's head to be rendered onto separate display memories and therefore to be updated at different update rates (a toy assignment is sketched below) • In the ARP system, most objects in the scene remain valid upon user head rotations • Perspective 'foreshortening': objects closer to the user appear larger than distant objects • Also, upon user translations, objects closer to the user appear to move by larger amounts than distant objects
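
A toy sketch of the assignment idea: objects are bucketed into display memories with different update rates. The distance thresholds and rates below are invented for illustration; broadly speaking, Priority Rendering derives the priority from how quickly an object's rendered image becomes invalid, not from raw distance alone.

    def assign_display_memory(object_distance, thresholds=(10.0, 50.0)):
        # Toy priority assignment: nearer objects go to display memories that
        # are updated more often. The thresholds are made up for illustration.
        if object_distance < thresholds[0]:
            return 0    # updated every frame (e.g. 60 Hz)
        elif object_distance < thresholds[1]:
            return 1    # updated at a medium rate (e.g. 30 Hz)
        else:
            return 2    # distant scenery, updated infrequently (e.g. 15 Hz)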

  13. Description of the challenges/problems • Large object segmentation • Tearing

  14. Large object segmentation Using large object segmentation in conjunction with Priority Rendering could potentially further reduce the overall rendering load • Fractal terrain example: a fractal terrain typically consists of thousands of polygons. If the terrain were segmented for priority rendering, different sections of it could be updated at different update rates

  15. The tearing problem The implementation of object segmentation with Priority Rendering gives rise to a potential scene tearing problem • Tearing can potentially occur when different sections of the same object are rendered at different update rates, whilst the user is translating through the scene

  16. Scene tearing artefacts will completely destroy the illusion of reality and therefore have to be addressed before object segmentation can be used effectively • Fractal terrain tearing example

  17. Solution to the problem • Region Priority Rendering • Region Warping

  18. Region Priority Rendering Region Priority Rendering was devised to implicitly sort objects spatially and also to provide a criterion for object segmentation • This methodology involved dividing the virtual world into equal-sized clusters, or regions

  19. Region Priority Rendering By dividing the virtual world into square-based regions, object segments could be assigned to the different display memories with their different update rates (see the sketch below) • Objects in the regions were assigned to the display memories, and hence update rates, based on spatial locality • Large objects were to be segmented along region boundaries • In this way tearing would be predictable, and the size of the tearing could also be computed
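
A minimal sketch of the spatial bookkeeping this implies (illustration only; REGION_SIZE is an assumed parameter, not a value from the paper): computing which square region a point falls in, and detecting when an edge of a large object crosses a region boundary and would therefore need to be segmented.

    import math

    REGION_SIZE = 8.0   # assumed region edge length in world units

    def region_of(x, z):
        # Integer (column, row) index of the square region containing the
        # point (x, z) on the ground plane.
        return (math.floor(x / REGION_SIZE), math.floor(z / REGION_SIZE))

    def crosses_region_boundary(v0, v1):
        # True if the edge from v0 to v1 spans two regions, i.e. the object
        # it belongs to would have to be segmented along that boundary.
        return region_of(*v0) != region_of(*v1)

    # Example: an edge 10 units long crosses a region boundary
    print(crosses_region_boundary((3.0, 3.0), (13.0, 3.0)))   # True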

  20. Region Warping Region Warping was designed to hide the scene tearing artefacts resulting from object segmentation with Priority Rendering • Region Warping essentially involves the perturbation of object vertices in order to hide the tearing artefacts

  21. Experimental Method

  22. Normalization Before region warping could be performed, the vertices had to be normalized in order to determine the exact amount of perturbation required for each vertex • Normalization was performed using what can be seen as concentric squares centered on the boundary of the region the user is currently located in
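
One way to read the 'concentric squares' description, offered purely as an interpretation and not as the paper's exact formula: measure a vertex's distance from the user with the Chebyshev (max) metric in region units, so that points of equal value lie on concentric squares, and keep the fractional part as the normalized position between two successive squares.

    def normalize_vertex(vx, vz, ux, uz, region_size=8.0):
        # Interpretation only. Chebyshev (max) distance in region units from
        # the user's position, so equal values form concentric squares.
        # Returns the square ring index and the fractional position within
        # that ring, in [0, 1). region_size is an assumed parameter.
        dx = abs(vx - ux) / region_size
        dz = abs(vz - uz) / region_size
        d = max(dx, dz)
        ring = int(d)        # which concentric square the vertex lies on
        frac = d - ring      # normalized position between two squares
        return ring, frac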

  23. The warping All vertices in the regions had to be perturbed in order to avoid the potential problem of objects looking out of place • Region warping forces the vertices in the different regions to align, thus hiding the tearing from the user • Two interpolation methods were used in the experiments: linear interpolation and squared interpolation (sketched below) • Analysis was conducted to determine which form of interpolation produced better results
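
The difference between the two interpolation methods can be shown with a hedged sketch (not the paper's exact warp function): a vertex is moved by a perturbation offset weighted either by the normalized distance t (linear) or by t squared, which keeps the visible displacement smaller near the user.

    def warp_vertex(vertex, offset, t, squared=False):
        # Perturb a vertex by a fraction of 'offset', where t in [0, 1] is the
        # normalized distance from the previous step. Linear interpolation
        # weights the offset by t; squared interpolation weights it by t * t,
        # which leaves vertices near the user almost unmoved. The offset
        # itself (how far neighbouring regions disagree) is just an input here.
        w = t * t if squared else t
        return tuple(v + w * o for v, o in zip(vertex, offset))

    # Same vertex and offset, halfway between two concentric squares (t = 0.5)
    print(warp_vertex((1.0, 0.0, 2.0), (0.2, 0.0, 0.1), 0.5))                # linear
    print(warp_vertex((1.0, 0.0, 2.0), (0.2, 0.0, 0.1), 0.5, squared=True))  # squared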

  24. Gallery scene Scene used for the experiments

  25. Experimental Results

  26. Scene tearing An example of a single frame showing the scene tearing effect

  27. Region Warping results The exact same frame, this time with Region Warping

  28. Analysis The level of distortion caused by linear interpolation and squared interpolation region warping was analysed • The Mean Squared Error (MSE) and Peak Signal-to-Noise Ratio (PSNR) error metrics were used to analyse the two warping methods mathematically (see the sketch below) • These error metrics are commonly used to measure the level of distortion in video or image compression techniques
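
For reference, the standard definitions of these two metrics, computed here between a warped frame and the normally rendered reference frame (a generic sketch, independent of the paper's own measurement code):

    import numpy as np

    def mse(reference, test):
        # Mean Squared Error between two frames of identical shape.
        diff = reference.astype(np.float64) - test.astype(np.float64)
        return np.mean(diff ** 2)

    def psnr(reference, test, max_value=255.0):
        # Peak Signal-to-Noise Ratio in decibels. Higher values mean the test
        # frame is closer to the reference; identical frames give infinity.
        m = mse(reference, test)
        return float('inf') if m == 0 else 10.0 * np.log10(max_value ** 2 / m)

    # Usage (frame arrays are placeholders):
    # print(mse(normal_frame, warped_frame), psnr(normal_frame, warped_frame))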

  29. Mean Squared Error (MSE) results A lower Mean Squared Error (MSE) value indicates that a frame differs less from the original (normal rendering)

  30. Peak Signal-to-Noise Ratio (PSNR) results A higher Peak Signal-to-Noise Ratio (PSNR) value means that a frame is closer to the original (normal rendering)

  31. Difference images Observation of the difference images showed that squared interpolation region warping is more attractive as it pushes the bulk of errors further away from the user's point of view [Difference image showing error between normal rendering, linear interpolation and squared interpolation region warping] [Difference image showing error between normal rendering and linear interpolation region warping]

  32. Where to from here… Future work • Human visual perception experiments • Relationships between the level of distortion, region sizes, speed of user translations, etc. • Computational and rendering load • Dynamic shadow generation and shaders

  33. Questions or Suggestions?
