
Digital Movies

Presentation Transcript


  1. Digital Movies • Digital video – an image sequence representing the frames of a motion picture or video. • For full motion – display at 25 or 30 frames per second. • Five-minute movie: 5 x 60 x 25 = 7,500 images. • Example – NTSC standard: • 720 x 480 pixels • 30 frames per second • 24 bpp color • bit rate: 248 Mbps
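
As a quick check of the figures on this slide, the raw (uncompressed) bit rate follows directly from the frame size, frame rate and color depth. A short sketch (Python, not part of the original slides):

```python
# Raw bit rate of uncompressed NTSC-resolution video (figures from slide 1).
width, height = 720, 480        # pixels per frame
fps = 30                        # frames per second
bpp = 24                        # bits per pixel (24 bpp color)

bits_per_frame = width * height * bpp     # 8,294,400 bits
bit_rate = bits_per_frame * fps           # 248,832,000 bit/s
print(f"raw bit rate: {bit_rate / 1e6:.1f} Mbit/s")   # ~248.8 Mbit/s

# Frame count of a five-minute movie at 25 frames per second.
print(5 * 60 * 25)                        # 7,500 frames
```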

  2. Application Bit Rates • Bit rates depend on the application. • Examples: • CD-ROM @ 1.5 Mbps requires roughly 200:1 compression of NTSC color video. • Mobile communication of the same video @ 10 kbps requires roughly 24,800:1 compression.
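
The required compression ratios are simply the raw rate divided by the channel rate; the sketch below reproduces the slide's figures (the exact ratios differ slightly from the rounded values quoted above):

```python
raw = 248_832_000      # raw NTSC bit rate from the previous slide, bit/s
cd_rom = 1_500_000     # 1.5 Mbit/s CD-ROM channel
mobile = 10_000        # 10 kbit/s mobile channel

print(f"CD-ROM: {raw / cd_rom:.0f}:1")   # ~166:1, quoted as about 200:1 on the slide
print(f"mobile: {raw / mobile:.0f}:1")   # ~24,883:1, quoted as 24,800:1 (using 248 Mbit/s)
```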

  3. Standards of video compression

  4. Standards of video compression (cont'd)

  5. Standards of video compression (cont'd)

  6. Intraframe and Interframe • Intraframe coding – code each frame separately, ignoring temporal redundancy (adjacent frames are similar). • Interframe coding – use the previous frame as a prediction for the current frame (as in DPCM). • But reference frames are sometimes needed, e.g. for random access: combine intraframe and interframe coding. • We consider interframe coding: motion compensation & the MPEG standard.
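
A minimal sketch of the interframe idea, coding each frame as a difference from its predecessor (temporal DPCM); the NumPy arrays and function names below are purely illustrative, not from any standard codec:

```python
import numpy as np

def interframe_encode(frames):
    """Code the first frame intra (as is) and every later frame as the
    difference from its predecessor (temporal DPCM)."""
    coded = [frames[0].astype(np.int16)]                  # intra-coded reference
    for prev, cur in zip(frames, frames[1:]):
        coded.append(cur.astype(np.int16) - prev.astype(np.int16))
    return coded

def interframe_decode(coded):
    """Rebuild the frames by accumulating the differences."""
    frames = [coded[0]]
    for diff in coded[1:]:
        frames.append(frames[-1] + diff)
    return frames
```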

  7. Early video coding • Divide each frame into changed and unchanged regions. • Then code the addresses of the changed locations and their updated intensity values. • DPCM scheme with prediction based on the value of the same pixel in the previous frame. • Later extended to motion-compensated coding.
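
A sketch of the changed/unchanged-region idea: only pixels that differ noticeably from the previous frame are addressed and updated. The threshold and the function name are assumptions chosen for illustration:

```python
import numpy as np

def changed_pixels(prev, cur, threshold=4):
    """Return the addresses and the updated values of pixels that changed
    by more than `threshold` since the previous frame."""
    diff = np.abs(cur.astype(np.int16) - prev.astype(np.int16))
    rows, cols = np.nonzero(diff > threshold)
    return list(zip(rows, cols)), cur[rows, cols]   # addresses, new values
```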

  8. Motion Compensated Prediction • A similar DPCM coding scheme. • Prediction of pixel values is based on motion estimates. • Motion estimates relate pixels in the previous frame to pixels in the current frame. • If we know where a pixel ‘goes’ in the next frame, we can use its value as a prediction for that frame. • The coder incorporates motion estimation and motion-compensated prediction. • The prediction error is coded using a block-based DCT or a subband scheme. • But the motion estimates must be included in the code: overhead.

  9. Motion Compensated Prediction (cont'd)

  10. Motion Compensated Prediction (cont'd) • Apply block-based motion estimation. • Divide each frame into M x M blocks. • For each block B, look in the previous frame for a block B’ that closely matches B (the match can be based on how much the pixels differ). • If a good match is found, code block B as a combination of: • (a) a “motion” vector, the difference between the positions of B and B’; • (b) the value difference between B and B’.
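
The block-matching step can be sketched as an exhaustive search over a small window in the previous frame, scoring candidates with the sum of absolute differences (SAD); the block size and search range below are illustrative choices, not values fixed by any standard:

```python
import numpy as np

def best_match(prev, block, top, left, search=8):
    """Find the block B' in `prev` that best matches `block` (anchored at
    (top, left) in the current frame) within +/- `search` pixels.
    Returns the motion vector (dy, dx) and the residual block."""
    M = block.shape[0]
    H, W = prev.shape
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + M > H or x + M > W:
                continue                                  # candidate leaves the frame
            cand = prev[y:y + M, x:x + M]
            sad = np.sum(np.abs(block.astype(np.int16) - cand.astype(np.int16)))
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    dy, dx = best_mv
    matched = prev[top + dy:top + dy + M, left + dx:left + dx + M]
    return best_mv, block.astype(np.int16) - matched.astype(np.int16)
```

The encoder would then transmit the motion vector plus the (transform-coded) residual instead of the raw block.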

  11. Motion Compensated Prediction: Example

  12. Motion Compensated Prediction (cont'd) • In general, there are three types of motion-compensated prediction: • Unidirectional motion-compensated prediction: only a previous or a subsequent frame is used for prediction. • Bidirectional motion-compensated prediction: a previous reference frame and a subsequent reference frame are both used to determine a motion vector for each block; only the motion vector corresponding to the previous or subsequent reference frame that results in the smallest matching error is used. • Interpolative motion-compensated prediction: the prediction from the previous reference frame and the prediction from the future reference frame are averaged; two motion vectors have to be stored or transmitted.
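
The interpolative case can be sketched as a simple average of the two motion-compensated predictions, one fetched from the past reference frame and one from the future reference frame (the two blocks are assumed to have been located already, e.g. with a block matcher like the one above):

```python
import numpy as np

def interpolative_prediction(past_block, future_block):
    """Average the motion-compensated prediction from the previous reference
    frame and the one from the future reference frame."""
    avg = (past_block.astype(np.float32) + future_block.astype(np.float32)) / 2.0
    return np.round(avg).astype(np.int16)
```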

  13. MPEG: Moving Picture Experts Group • MPEG was designed to serve the following requirements: • Random access: the user should be able to access the media at any point in time. • Fast searches: forward and reverse search should be quick. • Reverse playback of the stream. • Audio-video synchronization. • Robustness: especially needed when transmitting over a network; even if some packets are lost, the decoder should be able to decode the other packets. • Low encoding/decoding delay: to enable clients to view live events.

  14. Features of MPEG • MPEG is a syntax standard for the representation of the bitstream. • Defines the bitstream and the decoding procedure. • Supports operations including: • motion estimation • motion-compensated coding • DCT coding • quantization and entropy coding. • The bitstream contains sequence and coding parameters – able to deal with different image sizes, frame rates, etc.

  15. MPEG Fundamentals Encoding categories: • Intraframe encoding: encoding is independent of other frames. • Interframe encoding: encoding depends on other frames. “To achieve a high compression ratio, the temporal redundancy of subsequent pictures must be exploited (interframe), while the demand for random access requires intraframe coding.” MPEG algorithm: • Spatial redundancy is reduced using “DCT-based compression”. • Temporal redundancy is exploited using “block-based motion compensation”.

  16. Hierarchical Layered Bit Stream

  17. Hierarchical Layered Bit Stream (cont'd)

  18. Sequence Layer • The input sequence is divided into groups of pictures (GOPs). • A picture is a frame in the sequence. • The GOPs make up the sequence layer, plus a start code and a header. • The header carries information about the sequence: image size, frame rate, etc.

  19. Group Layer • A GOP is made up of frames. • With a start code and a header. • Each picture can be: • I-picture • P-picture • B-picture • D-picture • depending on how it is coded.

  20. Group Layer: Picture

  21. Group Layer: Picture (cont'd) • I frame: intra-coded image, coded without any reference to other images. I frames have the lowest compression ratio within MPEG; they serve as random access points in MPEG streams. • P frame: predictively coded frame. Requires information from the previous I and/or P frame for encoding and decoding. Achieves a higher compression ratio than I frames. • B frame: bidirectionally predictively coded frame. Requires information from the previous and the following I and/or P frames for encoding and decoding. Achieves the highest compression ratio. • D frame: DC-coded frame. These frames are intraframe encoded and can be used for fast-forward or rewind mode. Only the DC coefficients are coded; the AC coefficients are neglected.
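
Because a B frame needs both of its reference frames, the transmission/decoding order of a GOP differs from the display order. The sketch below reorders a display-order pattern so each B frame follows the references it depends on; the pattern string is only an illustrative example, not mandated by MPEG:

```python
def decode_order(gop="IBBPBBP"):
    """Reorder a display-order GOP so every B frame follows both of the
    reference frames (I or P) it is predicted from."""
    out, pending_b = [], []
    for frame in gop:
        if frame in "IP":              # reference frame: decode it first
            out.append(frame)
            out.extend(pending_b)      # then the B frames that were waiting for it
            pending_b = []
        else:                          # B frame: hold until its next reference arrives
            pending_b.append(frame)
    return "".join(out + pending_b)

print(decode_order("IBBPBBP"))         # -> IPBBPBB
```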

  22. Slice, Macroblock and Block Layers • Slice layer – each picture is made up of slices (sets of macroblocks). • Macroblock layer – a set of macroblocks, each containing: • four 8 x 8 luminance blocks • two 8 x 8 chrominance blocks (subsampled). • The compression mode can be selected per macroblock, e.g. quantization, interframe or intraframe encoding, etc. • Block layer – 8 x 8 luminance or chrominance pixels represented by quantized, run-length-encoded DCT coefficients. A block is the basic unit for the DCT.
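
A sketch of how one macroblock is assembled from the component planes, assuming the chrominance planes have already been subsampled by 2 in each direction (see the preprocessing slide below); the function name and indexing are illustrative:

```python
def macroblock(Y, U, V, r, c):
    """Collect the four 8 x 8 luminance blocks of the 16 x 16 region at (r, c),
    plus the two co-located 8 x 8 chrominance blocks from the subsampled
    U and V planes (Y, U, V are NumPy arrays)."""
    luma = [Y[r + i:r + i + 8, c + j:c + j + 8] for i in (0, 8) for j in (0, 8)]
    chroma = [U[r // 2:r // 2 + 8, c // 2:c // 2 + 8],
              V[r // 2:r // 2 + 8, c // 2:c // 2 + 8]]
    return luma, chroma
```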

  23. MPEG Fundamentals: Basic Coding Techniques (MPEG-1) MPEG video coding uses an adaptive combination of: • transform-domain coding of individual frames: DCT • motion-compensated prediction between frames • temporal differential PCM (DPCM), which exploits the temporal correlation between pixels (interframe).

  24. MPEG Fundamentals: Image Preprocessing Goals: • Generate a block- and component-based image structure. • Reduce the number of input pixels for the encoding step. Example: • Determine the YUV image representation: • Y = 0.30 R + 0.59 G + 0.11 B (luminance component) • U = 0.493 (B - Y) (chrominance component) • V = 0.877 (R - Y) (chrominance component) • Use blocks of 8 x 8 pixels. • Subsample the chrominance components to generate macroblocks (MBs).
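
A sketch of the preprocessing step using the slide's own coefficients: convert an RGB frame to Y, U and V planes and subsample the chrominance by 2 in each direction (assuming even frame dimensions), which is what gives four luminance blocks plus two chrominance blocks per macroblock:

```python
import numpy as np

def rgb_to_yuv_subsampled(rgb):
    """rgb: H x W x 3 float array (H, W even). Returns the full-resolution Y
    plane and the 2:1-subsampled U and V planes."""
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    Y = 0.30 * R + 0.59 * G + 0.11 * B          # luminance
    U = 0.493 * (B - Y)                         # chrominance
    V = 0.877 * (R - Y)                         # chrominance
    # Subsample chrominance: average each 2 x 2 neighborhood.
    sub = lambda c: (c[0::2, 0::2] + c[1::2, 0::2] +
                     c[0::2, 1::2] + c[1::2, 1::2]) / 4.0
    return Y, sub(U), sub(V)
```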

  25. MPEG: DCT-based Compression DCT example: • DC coefficient C00: proportional to the average pixel intensity, strongly dominant. • AC coefficients: changes in pixel intensities, rapidly decreasing in magnitude.
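
A minimal 8 x 8 DCT example (using SciPy as an assumed tool, not something the slides prescribe); for a flat block only the DC coefficient C00 survives:

```python
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    """2-D type-II DCT of an 8 x 8 block: apply the 1-D DCT to columns, then rows."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

block = np.full((8, 8), 128.0)              # a flat block of intensity 128
coeffs = dct2(block)
print(coeffs[0, 0])                         # DC coefficient: 8 * 128 = 1024.0
print(np.abs(coeffs).sum() - coeffs[0, 0])  # remaining (AC) energy: ~0
```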

  26. MPEG: DCT-based Compression (cont'd) • Quantization: map the DCT coefficients to a small number of levels. • Entropy coding (variable-length coding, VLC): • ‘zig-zag’ scan of the quantized coefficients • typically Huffman or run-length encoding • the DC coefficient C00 is treated separately.
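
A sketch of the zig-zag scan and a simple (zero-run, level) encoding of the quantized coefficients. The uniform quantizer step below is an illustrative simplification; MPEG actually uses per-coefficient quantization matrices and Huffman/VLC tables:

```python
import numpy as np

def zigzag(block):
    """Scan an 8 x 8 block along anti-diagonals, alternating direction."""
    h, w = block.shape
    out = []
    for s in range(h + w - 1):
        diag = [(i, s - i) for i in range(h) if 0 <= s - i < w]
        out.extend(block[i, j] for i, j in (diag if s % 2 else reversed(diag)))
    return np.array(out)

def run_level(coeffs, q=16):
    """Uniformly quantize, then encode the AC coefficients as (zero-run, level) pairs."""
    levels = np.round(coeffs / q).astype(int)
    pairs, run = [], 0
    for c in levels[1:]:                 # the DC coefficient is treated separately
        if c == 0:
            run += 1
        else:
            pairs.append((run, int(c)))
            run = 0
    return int(levels[0]), pairs         # (DC level, AC (run, level) pairs)
```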

  27. MPEG: Motion Compensated Prediction Predictive coding: • Video frames close in time are usually highly similar. • Coding the differences is more efficient than coding the original frames. Motion compensation: • Objects usually move over time. • Improve the prediction by first estimating the motion of each MB. • Code and transmit the motion vectors.

  28. MPEG: Motion Compensated Prediction (cont'd) Example:

  29. Basic MPEG-1 Coding Scheme Flexible group-of-pictures (GOP) structure: • I frames: intraframe-coded pictures • P frames: interframe-coded pictures • B frames: bidirectionally predicted pictures.
