Data Visualization
Lecture 7: 3D Scalar Visualization, Part 2 - Volume Rendering: Introduction
Volume Rendering
• This is a quite different mapping technique for visualizing 3D scalar data (compared with isosurfacing)
• It aims to model the volume as a partially opaque gel material - the colour and opacity at a point depend on the scalar value
• By controlling the opacity, we can:
• EITHER show surfaces, by setting the opacity to 0 or 1
• OR see both exterior and interior regions, by grading the opacity from 0 to 1
[Note: opacity = 1 - transparency]
Example - Forest Fire From Numerical Model of Forest Fire, NCAR, USA
Medical Imaging
• A major application area is medical imaging
• Different scanning techniques include:
• CT (Computed Tomography)
• MRI (Magnetic Resonance Imaging)
• Three-dimensional images are constructed from multiple 2D slices, separated by an interslice gap
• Scanners give the average value for a region, rather than the value at a point
Examples of Brain Scans
• Magnetic Resonance Imaging (MRI)
• Computed Tomography (CT)
• SPECT
Example - Medical Imaging
• CT scan data, 256 x 256 x 226
• Rendered by the VolPack software
Data Classification - Assigning Opacity to CT Data
• CT will identify fat, soft tissue and bone
• Each will have a known absorption level, say ffat, fsoft_tissue, fbone
• A transfer function that peaks at fsoft_tissue will highlight soft tissue
[Figure: opacity α (from 0 to 1) plotted against CT value, peaking at fsoft_tissue]
Data Classification - Assigning Opacity to CT Data
• To show all types of tissue, we assign an opacity to each type and linearly interpolate between them
[Figure: opacity α plotted against CT value, with control points at ffat, fsoft_tissue and fbone]
Data Classification - Constructing the Gel - CT Data
• Opacity is assigned as a function of CT number (f) - this is known as the opacity transfer function
• In practice, the boundaries between materials are of key importance - hence a two-stage algorithm is used:
(i) Calculate the opacity α as above
(ii) Scale by the gradient magnitude of the function to highlight boundaries:
    α* = α |grad f|, where grad f = [∂f/∂x, ∂f/∂y, ∂f/∂z]
• So what is the opacity in homogeneous areas?
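A minimal sketch (not part of the original slides) of this two-stage classification in Python/NumPy, assuming the CT volume is a regular 3D array; the soft-tissue value and peak width are illustrative placeholders:

import numpy as np

def opacity_transfer(f, f_soft_tissue=40.0, width=20.0):
    # Stage (i): opacity peaks at the soft-tissue CT value and falls off to 0.
    return np.clip(1.0 - np.abs(f - f_soft_tissue) / width, 0.0, 1.0)

def gradient_scaled_opacity(volume):
    # Stage (ii): scale the opacity by |grad f| to highlight material boundaries;
    # in homogeneous regions |grad f| is near 0, so the scaled opacity is near 0.
    gz, gy, gx = np.gradient(volume.astype(float))   # central differences
    grad_mag = np.sqrt(gx**2 + gy**2 + gz**2)
    alpha = opacity_transfer(volume)
    return alpha * grad_mag / (grad_mag.max() + 1e-9)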
Data Classification - Constructing the Gel - CT Data
• Colour classification is done similarly - this is known as the colour transfer function
[Figure: colour transfer function over CT number, assigning a colour (e.g. yellow, red, white) to each material range - air, fat, soft tissue, bone]
Data Classification - Constructing the Gel - Temperature Data
• Volume rendering is also useful for other data - e.g. CFD temperature
• Opacity transfer function: possibly increase opacity with temperature
• Colour transfer function: e.g. interpolate from blue (0,0,1) at low temperature to red (1,0,0) at high temperature
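A short sketch (not from the slides) of two such colour transfer functions; the temperature range, CT values and colour-to-material pairings below are illustrative assumptions, not values given in the lecture:

import numpy as np

def temperature_colour(t, t_min=0.0, t_max=100.0):
    # Linear blend from blue (0,0,1) at t_min to red (1,0,0) at t_max.
    s = np.clip((t - t_min) / (t_max - t_min), 0.0, 1.0)
    return np.stack([s, np.zeros_like(s), 1.0 - s], axis=-1)

def ct_colour(f):
    # Piecewise colour assignment over CT number, one colour per tissue class,
    # linearly interpolated between the class centres (all values illustrative).
    ct_values = np.array([-1000.0, -100.0, 40.0, 400.0])             # air, fat, soft tissue, bone
    colours = np.array([[0, 0, 0], [1, 1, 0], [1, 0, 0], [1, 1, 1]], dtype=float)
    r = np.interp(f, ct_values, colours[:, 0])
    g = np.interp(f, ct_values, colours[:, 1])
    b = np.interp(f, ct_values, colours[:, 2])
    return np.stack([r, g, b], axis=-1)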
Data Classification in IRIS Explorer
• The ColourMap tool in IRIS Explorer can be used to assign colour and transparency to data
Example Storm cloud data rendered by IRIS Explorer – Isosurface & volume rendering
Ray Casting to Render the Volume
1. Assign colour and opacity to data values
• The classification process assigns a gel colour to the original data
2. Apply light to the volume
• A lighting model gives the light reflected towards the viewer at any point in the volume - if we know the normal
• Imagine an isosurface shell through each data point - the surface normal is provided by the gradient vector (see lecture 6)
• Thus we get the colour reflected at each data point
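A sketch (not from the lecture) of step 2, assuming NumPy, a gradient-based normal and a simple two-sided diffuse (Lambertian) lighting model; the light direction is an arbitrary choice:

import numpy as np

def shade_volume(volume, base_colour, light_dir=(0.0, 0.0, 1.0)):
    # Surface normal at each voxel taken from the gradient of the scalar field.
    gz, gy, gx = np.gradient(volume.astype(float))
    normals = np.stack([gx, gy, gz], axis=-1)
    normals /= np.maximum(np.linalg.norm(normals, axis=-1, keepdims=True), 1e-9)
    light = np.asarray(light_dir, float)
    light /= np.linalg.norm(light)
    # Diffuse term |n . l|; two-sided, so the sign of the gradient does not matter.
    diffuse = np.abs(normals @ light)
    return base_colour * diffuse[..., None]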
Casting the Rays and Taking Samples
3. For each pixel in the image:
a) cast a ray from the eye through the pixel into the data volume, taking samples at regular unit intervals between the entry and exit points
b) measure the colour reflected at each sample in the direction of the ray (colour and opacity obtained by interpolation)
c) composite the colour from all samples along the ray, taking into account the opacity of the gel it passes through en route to the eye
[Figure: ray from the eye point through the image plane into the data volume, with sample points one unit apart between entry and exit points]
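A sketch (again, not from the slides) of steps 3(a)-(b), assuming the entry and exit points of a ray are already known in voxel coordinates and all samples lie strictly inside the volume; colour or opacity at a sample comes from trilinear interpolation of the eight surrounding voxels:

import numpy as np

def trilinear(field, p):
    # field: 3D array indexed [z, y, x]; p = (x, y, z) in voxel coordinates.
    x, y, z = p
    x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
    dx, dy, dz = x - x0, y - y0, z - z0
    value = 0.0
    for k in (0, 1):
        for j in (0, 1):
            for i in (0, 1):
                w = ((dx if i else 1 - dx) *
                     (dy if j else 1 - dy) *
                     (dz if k else 1 - dz))
                value += w * field[z0 + k, y0 + j, x0 + i]
    return value

def ray_samples(entry, exit_point, step=1.0):
    # Sample positions one unit apart from the entry point towards the exit point.
    entry, exit_point = np.asarray(entry, float), np.asarray(exit_point, float)
    direction = exit_point - entry
    length = np.linalg.norm(direction)
    direction /= length
    return [entry + t * direction for t in np.arange(0.0, length, step)]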
Compositing the Samples along a Ray - One Sample
• Imagine a block of gel, one unit wide, around the sample point, with intensity I1 and opacity α1, in front of an opaque background emitting I0
• The intensity reaching the eye is:
    I* = I0 (1 - α1) + I1 α1
Compositing the Samples along a Ray - Two Samples
• A second block of gel, with intensity I2 and opacity α2, sits between the first block and the eye
• Compositing the two samples in turn:
    I* = I0 (1 - α1) + I1 α1
    I** = I* (1 - α2) + I2 α2
        = I0 (1 - α1)(1 - α2) + I1 α1 (1 - α2) + I2 α2
Compositing the Samples along a Ray
• The process continues for all samples, yielding a final intensity, or colour, for the ray - and this is assigned to the pixel
• Try it for a third sample, then you should be able to deduce the general formula:
    I = Σ (i = 0 to n) Ii αi Π (j = i+1 to n) (1 - αj)    [with α0 = 1 for the opaque background]
• Note that if one compositing step is done for each ray in turn, then the next step, and so on, the image will be created in a sweep from back to front, showing all the data (even behind opaque parts)
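This back-to-front recurrence is straightforward to code; a minimal sketch, assuming per-ray lists of sample intensities and opacities ordered from the background (index 0, opacity 1) towards the eye:

def composite_back_to_front(intensities, opacities):
    # intensities[0] is the opaque background I0 (opacities[0] = 1).
    I = intensities[0]
    for Ii, ai in zip(intensities[1:], opacities[1:]):
        I = I * (1.0 - ai) + Ii * ai      # I <- I (1 - alpha_i) + I_i alpha_i
    return I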
Front-to-Back Compositing
• Compositing can also work front-to-back, starting with the sample nearest the eye (intensity In, opacity αn):
    I* = αn In
    α* = αn    (cumulative opacity)
• Adding the next sample (intensity In-1, opacity αn-1):
    I** = I* + (1 - α*) αn-1 In-1
    α** = α* + (1 - α*) αn-1
Front-to-Back Compositing - Early Termination
• The advantage of front-to-back compositing is that we can stop the process once the accumulated opacity reaches 1.0 - there is no point in going further
• Again, you should be able to deduce the general formula if you look at three samples
• Can you show that front-to-back and back-to-front compositing give the same answer?
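A companion sketch for front-to-back compositing with early ray termination; samples are assumed to be ordered from the eye towards the background, and the 0.99 cut-off is an illustrative choice rather than a value from the lecture:

def composite_front_to_back(intensities, opacities, threshold=0.99):
    I, alpha = 0.0, 0.0
    for Ii, ai in zip(intensities, opacities):
        I += (1.0 - alpha) * ai * Ii      # add this sample's contribution
        alpha += (1.0 - alpha) * ai       # accumulate opacity
        if alpha >= threshold:            # ray is effectively opaque: stop early
            break
    return I, alpha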
Maximum Intensity Projection
• When performance rather than accuracy is the goal, we can avoid compositing altogether and approximate I by the maximum intensity along the ray
• MIP: Maximum Intensity Projection
• Often used in angiography...
Maximum Intensity Projection
• Performance is the major issue:
• the lack of shading in the image drives the need for real-time rotation
• fast identification of the maximum becomes important
• Options for finding the maximum along a ray:
• analytical maximum in each cell along the ray
• maximum of the samples along the ray
• skip cells whose values fall below the current maximum
[Figure: rays cast from the image plane through the volume]
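For comparison with the compositing sketches above, MIP along one ray is just a running maximum over the samples (a trivial sketch, not from the slides):

def mip_along_ray(samples):
    # Maximum Intensity Projection: keep the largest sample value on the ray.
    best = float("-inf")
    for s in samples:
        if s > best:
            best = s
    return best

In practice the running maximum also enables the cell-skipping optimisation mentioned above: a cell whose values cannot exceed the current maximum need not be sampled at all.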
Next Lecture • Parkinson Room 108 • Monday 10-11 • Hope to have a second lecture on Monday 11.00 – 12.00 … room to be announced!