Proposals for video coding complexity assessment

Presentation Transcript


  1. Proposals for video coding complexity assessment Daniele Alfonso JCTVC-A026 Dresden, DE, 15-23 April 2010

  2. Properties of a measurement system
  • The result of a measurement is a random variable.
  • The quality of a measurement is defined by:
    • Accuracy: closeness of a measured quantity to its true value.
    • Precision (or Repeatability): the degree to which further measurements show the same or similar results.
  • Other desirable properties:
    • Reproducibility: ability of a test or experiment to be accurately reproduced by someone else working independently.
    • Simplicity: capability to execute measurements in an automated way, with limited human interaction.
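  As an illustration of the distinction between accuracy and precision, the following Python sketch (with made-up sample values, not taken from the contribution) estimates the bias of repeated measurements against a known reference and their spread:

      from statistics import mean, pstdev

      true_value = 100.0                       # assumed known reference (hypothetical)
      samples = [101.2, 99.8, 100.5, 100.9]    # made-up repeated measurements

      bias = mean(samples) - true_value        # accuracy: closeness to the true value
      spread = pstdev(samples)                 # precision: dispersion of repeated results

      print(f"bias = {bias:+.2f}, std dev = {spread:.2f}")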

  3. Complexity as execution time
  • The computational complexity of an application is often estimated by its overall execution time.
  • The OS can put the CPU in one of the following 4 states:
    • Executing in User Mode: the CPU is executing the machine code of a process that accesses its own data space in memory.
    • Executing in System Mode (also known as Kernel Mode): the CPU is executing a system call made by the process to request the services of the kernel.
    • Idle waiting for I/O: processes are sleeping while waiting for the completion of I/O to disk or other block devices.
    • Idle: no processes are ready to run on the CPU and none are sleeping waiting for block I/O; processes may be waiting for keyboard input or network I/O.
  (A minimal sketch of reading these counters on Linux follows below.)
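  On Linux, these four accounting categories can be read from the kernel's aggregate CPU counters. A minimal sketch, assuming a Linux system (values are in clock ticks):

      def cpu_time_breakdown(path="/proc/stat"):
          """Map the aggregate CPU counters in /proc/stat onto the four states above."""
          with open(path) as f:
              fields = f.readline().split()   # "cpu user nice system idle iowait irq ..."
          user, nice, system, idle, iowait = (int(v) for v in fields[1:6])
          return {
              "user": user + nice,    # executing in user mode
              "system": system,       # executing in system (kernel) mode
              "iowait": iowait,       # idle waiting for block I/O
              "idle": idle,           # idle
          }

      print(cpu_time_breakdown())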

  4. Elapsed Time vs. User Time
  • Measuring the overall elapsed time is:
    • not accurate: it accounts for CPU time spent in idle states and in running other processes, on behalf of the OS or other users;
    • of low reproducibility: it depends on the CPU architecture, not only on the application.
  • Measuring the user time solves the accuracy issues, but it still has low reproducibility.
  • Both measures may have limited precision (see next slide).
  (A minimal sketch contrasting the two measures follows below.)
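  The difference between the two measures can be seen by timing the same workload with a wall-clock timer and with the process CPU-time counters. A minimal Python sketch (the workload is a placeholder for an encoder run, not the contribution's encoder):

      import os, time

      def busy_work(n=2_000_000):
          return sum(i * i for i in range(n))   # stand-in for an encoding run

      t0_wall, t0_cpu = time.perf_counter(), os.times()
      busy_work()
      t1_wall, t1_cpu = time.perf_counter(), os.times()

      elapsed = t1_wall - t0_wall               # includes idle time and other processes
      user = t1_cpu.user - t0_cpu.user          # CPU time spent in user mode only
      system = t1_cpu.system - t0_cpu.system    # CPU time spent in kernel mode
      print(f"elapsed={elapsed:.3f}s  user={user:.3f}s  system={system:.3f}s")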

  5. Encoding User Time results
  • VMR = Variance-to-Mean Ratio (Index of Dispersion).
  • It is advisable to perform multiple measurements and take the average.
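  For reference, the Index of Dispersion quoted on the slide is conventionally defined as the ratio between the variance and the mean of the repeated measurements:

      \mathrm{VMR} = \frac{\sigma^2}{\mu}

  A low VMR indicates that repeated runs cluster tightly around their mean, i.e. the measurement is precise.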

  6. Complexity definitions
  • Clock Rate (CR) depends only on the CPU → not interesting.
  • Cycles Per Instruction (CPI) depends on the application, on the compiler and on the CPU, and is very hard to measure.
  • Proposed simplified complexity definition (see the reconstruction sketched below).
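  The reasoning behind the simplification follows the classical CPU performance equation, which ties execution time to the three factors listed on this slide:

      T_{\mathrm{exec}} = \frac{IC \times CPI}{CR}

  The exact formula on the original slide is not reproduced in this transcript; a plausible reading, consistent with the Valgrind-based measurements on the next slide (where I-cache accesses give the instruction count), is that the proposal drops CR (a pure CPU property) and CPI (hard to measure) and retains the executed instruction count IC as the simplified computational-complexity figure.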

  7. Valgrind
  • Linux tool suite for debugging and profiling.
  • Includes an instruction-cache and data-cache simulator:
    • I-cache accesses → Instruction Count (IC);
    • D-cache accesses → memory bandwidth.
  • Pros:
    • freeware and open source (GPL 2);
    • available for many Linux distributions as a precompiled package;
    • maintained (latest release dated 19 Aug. 2009);
    • reliable and easy to use.
  • Cons:
    • application execution slows down by up to 100 times;
    • clock cycles per instruction are not measured;
    • data size of D-cache accesses is not measured.
  (A minimal usage sketch follows below.)
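  A minimal sketch of collecting the two counters with the cachegrind tool, assuming a Linux host with Valgrind installed; the encoder binary and its options are placeholders, and the summary format may vary slightly across Valgrind versions:

      import re
      import subprocess

      def count_cache_accesses(cmd):
          """Run cmd under cachegrind and return (instruction refs, data refs)."""
          result = subprocess.run(
              ["valgrind", "--tool=cachegrind"] + cmd,
              capture_output=True, text=True,
          )
          # Cachegrind prints its summary on stderr, e.g. "==1234== I   refs:  1,234,567"
          def grab(label):
              m = re.search(rf"{label}\s+refs:\s+([\d,]+)", result.stderr)
              return int(m.group(1).replace(",", ""))
          return grab("I"), grab("D")

      # Hypothetical invocation (binary name and arguments are examples only)
      i_refs, d_refs = count_cache_accesses(["./encoder", "--input", "sequence.yuv"])
      print(f"Instruction count (IC): {i_refs:,}  Data accesses: {d_refs:,}")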

  8. Valgrind results
  • “rolling tomatoes” test sequence (1080p24).
  • I-cache and D-cache accesses for the encoding of 1 second of video.
  • Extremely low VMR → a single measurement is sufficient.

  9. Valgrind results (2)
  • Encoding and decoding complexity values, averaged over 4 HD 1080p test sequences.

  10. Remarks
  • The complexity measurement method shall have good properties of accuracy, precision, reproducibility and simplicity.
  • Complexity is a multi-dimensional concept: we focused on computation and memory bandwidth, because these factors have the most relevant impact on the implementation cost.
  • It is not possible to define the computational complexity of an application “per se”: it depends on the application, on the compiler technology and on the CPU Instruction Set Architecture.
  • We can perform a simplified complexity estimation over a generic x86 CPU architecture with the Valgrind tool suite for the Linux OS.
  • It is necessary to specify and agree on the following points: compiler version, optimization level, set of test sequences and configurations.

  11. Proposals to JCT-VC
  • To evaluate HVC contributions in terms of both coding efficiency and complexity efficiency.
  • To define a procedure for complexity assessment, considering the present contribution.
  • To specify the procedure in a document such as “Recommended simulation common conditions for complexity efficiency experiments”.
