Proposals for video coding complexity assessment Daniele Alfonso JCTVC-A026 Dresden, DE, 15-23 April 2010
Properties of a measurement system • The result of a measurement is a random variable. • The quality of a measurement is defined by: • Accuracy: closeness of a measured quantity to its true value. • Precision (or repeatability): the degree to which repeated measurements show the same or similar results. • Other desirable properties: • Reproducibility: the ability of a test or experiment to be accurately reproduced by someone else working independently. • Simplicity: the capability to execute measurements in an automated way, with limited human interaction.
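The distinction between accuracy and precision can be made concrete with a small numerical sketch (all figures below are invented for illustration): a set of repeated measurements can cluster tightly (high precision) while still being biased away from the true value (low accuracy).

```python
import statistics

# Hypothetical repeated measurements of a quantity whose true value is 10.0.
true_value = 10.0
measurements = [10.4, 10.5, 10.3, 10.6, 10.4]  # illustrative numbers only

mean = statistics.mean(measurements)
accuracy_error = abs(mean - true_value)    # bias: closeness to the true value
precision = statistics.stdev(measurements)  # spread across repetitions

# Here the spread is small (precise) but the bias is large (inaccurate).
print(f"bias = {accuracy_error:.2f}, stdev = {precision:.3f}")
```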
Complexity as execution time • The computational complexity of an application is often estimated by its overall execution time. • The OS can put the CPU in one of the following four states: • Executing in User Mode: the CPU is executing the machine code of a process that accesses its own data space in memory. • Executing in System Mode (also known as Kernel Mode): the CPU is executing a system call made by the process to request the services of the kernel. • Idle waiting for I/O: processes are sleeping while waiting for the completion of I/O to disk or other block devices. • Idle: no processes are ready to run on the CPU and none are sleeping waiting for block I/O; processes may still be waiting for keyboard or network input.
Elapsed Time vs. User Time • Measuring the overall elapsed time is: • not accurate: it accounts for CPU time spent in idle states and in running other processes, on behalf of the OS or other users. • of low reproducibility: it depends on the CPU architecture, not only on the application. • Measuring the user time solves the accuracy issues, but reproducibility remains low. • Both measures may have limited precision (see next slide).
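The gap between elapsed (wall-clock) time and CPU time can be demonstrated directly: an idle period consumes wall-clock time but almost no CPU time, which is exactly why an elapsed-time measurement overstates the work done. A minimal Python sketch:

```python
import time

wall_start = time.perf_counter()   # wall-clock (elapsed) timer
cpu_start = time.process_time()    # CPU time (user + system) of this process

time.sleep(0.2)                    # idle period: no CPU work is performed

wall_elapsed = time.perf_counter() - wall_start
cpu_elapsed = time.process_time() - cpu_start

# The sleep shows up in wall-clock time but barely in CPU time.
print(f"wall: {wall_elapsed:.3f} s, cpu: {cpu_elapsed:.3f} s")
```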
Encoding User Time results • VMR = Variance-to-Mean Ratio (Index of Dispersion). • It is advisable to perform multiple measurements and take the average.
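The VMR of a set of repeated timing runs is straightforward to compute; the sketch below uses invented user-time samples, since the original result table is not reproduced here.

```python
import statistics

# Hypothetical user-time samples (seconds) from repeated encoder runs.
user_times = [41.2, 41.9, 40.8, 42.3, 41.5]

mean = statistics.mean(user_times)
variance = statistics.variance(user_times)  # sample variance
vmr = variance / mean                       # Variance-to-Mean Ratio

# Averaging repeated runs smooths out the run-to-run noise the VMR quantifies.
print(f"mean = {mean:.2f} s, VMR = {vmr:.5f}")
```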
Complexity definitions • Execution time can be written as T = (IC × CPI) / CR, where IC is the Instruction Count. • Clock Rate (CR) depends only on the CPU → not interesting. • Cycles Per Instruction (CPI) depends on the application, the compiler and the CPU, and is very hard to measure. • Proposed simplified complexity definition: the Instruction Count IC, obtained by dropping the CR and CPI terms.
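The classical execution-time relation behind this simplification can be checked numerically; all figures below are illustrative, not measured values.

```python
# Illustrative figures only: a run executing 3e9 instructions at an
# average of 1.5 cycles per instruction on a 3 GHz CPU.
IC = 3e9    # Instruction Count
CPI = 1.5   # Cycles Per Instruction (application/compiler/CPU dependent)
CR = 3e9    # Clock Rate in Hz (CPU dependent)

T = IC * CPI / CR   # execution time in seconds

# Dropping the CPU-dependent CR and the hard-to-measure CPI leaves IC as
# the machine-comparable complexity figure.
print(f"T = {T} s, simplified complexity = {IC:.0e} instructions")
```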
Valgrind • Linux tool suite for debugging and profiling. • Includes an Instruction-cache and Data-cache simulator: • I-cache accesses → Instruction Count (IC); • D-cache accesses → memory bandwidth. • Pros: • freeware and open source (GPL 2); • available for many Linux distributions as a precompiled package; • maintained (latest release dated 19 Aug. 2009); • reliable and easy to use. • Cons: • slows down application execution by up to 100×; • clock cycles per instruction are not measured; • data size of D-cache accesses is not measured.
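Valgrind's Cachegrind tool prints a textual summary of cache accesses; extracting IC and D-cache counts from that summary is easy to script. The sketch below follows the general shape of Cachegrind's summary lines, but the sample numbers are invented.

```python
import re

# Sample lines in the style of Cachegrind's summary output (values invented).
sample_output = """\
==1234== I   refs:      575,038,123
==1234== D   refs:      301,422,990  (210,911,218 rd + 90,511,772 wr)
"""

def parse_refs(text, kind):
    """Return the access count for kind 'I' or 'D', or None if absent."""
    m = re.search(rf"==\d+== {kind}\s+refs:\s+([\d,]+)", text)
    return int(m.group(1).replace(",", "")) if m else None

ic = parse_refs(sample_output, "I")  # I-cache accesses = Instruction Count
dc = parse_refs(sample_output, "D")  # D-cache accesses
print(ic, dc)
```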
Valgrind results • “rolling tomatoes” test sequence (1080p24). • I-cache and D-cache accesses for encoding 1 s of video. • Extremely low VMR → a single measurement is sufficient.
Valgrind results (2) • Encoding and decoding complexity values, averaged over four HD 1080p test sequences.
Remarks • The complexity measurement method shall have good properties of accuracy, precision, reproducibility and simplicity. • Complexity is a multi-dimensional concept: we focused on computation and memory bandwidth, because these factors have the greatest impact on implementation cost. • It is not possible to define the computational complexity of an application “per se”: it depends on the application, on the compiler technology and on the CPU Instruction Set Architecture. • We can perform simplified complexity estimation over a generic x86 CPU architecture with the Valgrind tool suite for Linux OS. • It is necessary to specify and agree on the following points: compiler version, optimization level, set of test sequences and configurations.
Proposals to JCT-VC • To evaluate HVC contributions in terms of both coding efficiency and complexity efficiency. • To define a procedure for complexity assessment, considering the present contribution. • To specify the procedure in a document such as “Recommended simulation common conditions for complexity efficiency experiments”.