

  1. In the name of Allah, the compassionate, the merciful

  2. Digital Video Systems. S. Kasaei, Room: CE 307, Department of Computer Engineering, Sharif University of Technology. E-Mail: skasaei@sharif.edu. Webpage: http://sharif.edu/~skasaei. Lab. Website: http://mehr.sharif.edu/~ipl

  3. Acknowledgment. Most of the slides used in this course have been provided by Prof. Yao Wang (Polytechnic University, Brooklyn), based on the book Video Processing & Communications, written by Yao Wang, Jörn Ostermann, & Ya-Qin Zhang, Prentice Hall, 1st edition, 2001, ISBN: 0130175471. [SUT Code: TK 5105 .2 .W36 2001].

  4. Chapter 1: Foundation of Video Coding, Part II: Scalar & Vector Quantization

  5. Outline
     - Overview of source coding systems
     - Scalar quantization
     - Vector quantization
     - Rate-distortion characterization of lossy coding
     - Operational rate-distortion function
     - Rate-distortion bound (lossy coding bound)

  6. Components of a Coding System

  7. Lossy Coding
     - For discrete sources:
       - Lossless coding: bitrate >= entropy rate.
       - One can further quantize the source samples to reach a lower rate.
     - For continuous sources:
       - Lossless coding would require an infinite bitrate!
       - One must quantize the source samples to reach a finite bitrate.
       - The lossy coding rate is bounded by the mutual information between the original source & the quantized source that satisfies a distortion criterion.
     - Quantization methods:
       - Scalar quantization.
       - Vector quantization.

  8. Scalar Quantization
     - General description
     - Uniform quantization
     - MMSE quantizer
     - Lloyd algorithm

  9. Function Representation
     - A scalar quantizer is described by its boundary values (decision values) b_l and its reconstruction values g_l.
     - Quantizer mapping: Q(f) = g_l, if f ∈ B_l.

  10. Line Partition Representation
     - Quantization levels: L
     - Boundary values: b_l
     - Partition regions: B_l = [b_{l-1}, b_l)
     - Reconstruction values: g_l
     - Quantizer mapping: Q(f) = g_l, if f ∈ B_l
     (Figure: the real line partitioned by the boundary values, with one reconstruction value marked in each region.)
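To make the mapping Q(f) = g_l concrete, here is a minimal Python sketch (not from the slides; the 4-level example values are purely illustrative):

```python
import numpy as np

def scalar_quantize(f, boundaries, levels):
    """Map each sample f to the reconstruction value g_l of the
    partition region B_l = [b_{l-1}, b_l) that contains it."""
    idx = np.digitize(f, boundaries)        # region index for each sample
    idx = np.clip(idx, 0, len(levels) - 1)  # clamp samples outside the nominal range
    return levels[idx]

# Illustrative 4-level quantizer on [0, 1)
boundaries = np.array([0.25, 0.5, 0.75])         # interior boundary values b_l
levels = np.array([0.125, 0.375, 0.625, 0.875])  # reconstruction values g_l
print(scalar_quantize(np.array([0.1, 0.3, 0.9]), boundaries, levels))
# -> [0.125 0.375 0.875]
```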

  11. Distortion Measure
     - General measure: the mean distortion incurred in each region.
     - Mean Square Error (MSE): d(f, g) = (f − g)².

  12. Uniform Quantization
     - Quantization step size: q = B / L = B · 2^(−R), where B is the source range and L = 2^R the number of levels.
     - Uniform source: each additional bit provides a 6 dB gain (the MSE is q²/12, so halving q quarters the error).
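As a quick numerical check of the 6 dB-per-bit rule, a small sketch (the uniform source on [0, B) and the midpoint reconstruction are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
B = 1.0
f = rng.uniform(0.0, B, size=100_000)     # uniform source samples

for R in range(1, 7):                     # bits per sample
    L = 2 ** R                            # number of levels
    q = B / L                             # step size q = B / L = B * 2**(-R)
    f_hat = (np.floor(f / q) + 0.5) * q   # quantize to the midpoint of each cell
    mse = np.mean((f - f_hat) ** 2)       # close to q**2 / 12
    snr_db = 10 * np.log10(np.var(f) / mse)
    print(f"R={R}: q={q:.4f}  MSE={mse:.2e}  SNR={snr_db:.2f} dB")
# SNR grows by about 6.02 dB for each additional bit.
```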

  13. Minimum MSE (MMSE) Quantizer
     - Determine b_l and g_l to minimize the quantization MSE σ_q².
     - Setting ∂σ_q²/∂b_l = 0 and ∂σ_q²/∂g_l = 0 yields:
       - Nearest-neighbor condition: each boundary lies midway between neighboring reconstruction values, b_l = (g_l + g_{l+1}) / 2.
       - Centroid condition: each g_l is the centroid (conditional mean) of its region B_l.

  14. High-Rate Approximation of SQ
     - For a source with an arbitrary pdf, when the bitrate (number of quantization levels) is very high, the pdf within each partition region can be approximated as flat, and the distortion becomes σ_q² = ε² σ² 2^(−2R), where ε² depends on the pdf of the normalized unit-variance source:
       - Uniform source: ε² = 1
       - i.i.d. Gaussian source: ε² = 2.71 (w/o VLC)
       - Bound for Gaussian source: ε² = 1
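A tiny worked evaluation of the high-rate formula σ_q² ≈ ε² σ² 2^(−2R) using the ε² values listed above (just plugging in numbers, nothing beyond the slide):

```python
sigma2 = 1.0                                  # unit-variance source
eps2 = {"uniform": 1.0,                       # uniform source
        "i.i.d. Gaussian (w/o VLC)": 2.71,    # fixed-length codes, no entropy coding
        "Gaussian bound": 1.0}                # lossy coding bound
for name, e2 in eps2.items():
    for R in (2, 4, 8):
        print(f"{name:26s} R={R}: sigma_q^2 ~ {e2 * sigma2 * 2 ** (-2 * R):.2e}")
```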

  15. Lloyd Algorithm
     - Iterative algorithm for determining the MMSE quantizer parameters (b_l, g_l).
     - Can be based on a pdf or on training data.
     - Iterates between the centroid condition & the nearest-neighbor condition, updating until the distortion stops decreasing (D0 = D1); a sketch is given below.
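A minimal training-data-based sketch of the Lloyd iteration (illustrative only: the even-grid initialization and the stopping tolerance are assumptions, not taken from the slides):

```python
import numpy as np

def lloyd(samples, L, n_iter=100, tol=1e-6):
    """Design an L-level MMSE scalar quantizer from training data by
    alternating the nearest-neighbor and centroid conditions."""
    samples = np.asarray(samples, dtype=float)
    # initialize reconstruction values g_l on an even grid over the data range
    g = np.linspace(samples.min(), samples.max(), 2 * L + 1)[1::2]
    prev_d = np.inf
    for _ in range(n_iter):
        b = (g[:-1] + g[1:]) / 2              # nearest-neighbor condition: midpoints
        idx = np.digitize(samples, b)         # assign each sample to a region B_l
        for l in range(L):                    # centroid condition: g_l = mean of B_l
            in_cell = samples[idx == l]
            if in_cell.size:
                g[l] = in_cell.mean()
        d = np.mean((samples - g[idx]) ** 2)  # current distortion
        if prev_d - d < tol:                  # stop when the distortion settles (D0 = D1)
            break
        prev_d = d
    return b, g, d

# Example: 4-level quantizer for a unit-variance Gaussian source
rng = np.random.default_rng(0)
b, g, d = lloyd(rng.normal(size=50_000), L=4)
print("boundaries:", np.round(b, 3), " levels:", np.round(g, 3), " MSE:", round(d, 4))
```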

  16. Vector Quantization
     - General description
     - Nearest-neighbor quantizer
     - MMSE quantizer
     - Generalized Lloyd algorithm

  17. Vector Quantization: General Description
     - Motivation: quantize a group of samples (a vector) together, to exploit the correlation between these samples.
     - Each sample vector is replaced by one of L representative vectors (or patterns) that often occur in the signal.
     - The task is to find the L most popular patterns.

  18. Vector Quantization: General Description
     - Applications:
       - Color quantization: quantize all colors appearing in an image to L colors, for display on a monitor that can only show L distinct colors at a time (an adaptive palette).
       - Image quantization: quantize every NxN block into one of L typical patterns (obtained through training). More efficient with a larger block size, but the block size is limited by the complexity.

  19. VQ as Space Partition
     - Original vector: f ∈ R^N
     - Quantization levels: L
     - Partition regions: B_l
     - Reconstruction vector (codeword): g_l
     - Quantizer mapping: Q(f) = g_l, if f ∈ B_l
     - Codebook: C = {g_l, l = 1, 2, ..., L}
     - Bit rate: R = (1/N) log2 L
     - Every point in a region B_l is replaced by (quantized to) the codeword g_l (the point marked by the circle in the figure).

  20. Distortion Measure
     - General measure: the mean distortion between the input vector and its codeword.
     - MSE: the squared Euclidean distance, d(f, g) = ||f − g||².

  21. Nearest-Neighbor (NN) Quantizer
     - Encoding: each input vector is mapped to the codeword closest to it (minimum distortion).
     - Challenge: how to determine the codebook? (An encoding sketch follows below.)
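A minimal sketch of NN encoding with a given codebook (the random codebook and block shapes here are placeholders for illustration; in practice the codebook would come from training, e.g. the generalized Lloyd algorithm listed in the outline):

```python
import numpy as np

def vq_encode(vectors, codebook):
    """Map each N-dimensional vector to the index of its nearest codeword
    (squared-error distance), by exhaustive search over all L codewords."""
    d2 = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)                  # one codeword index per input vector

rng = np.random.default_rng(0)
N, L = 4, 16                                  # vector dimension, codebook size
codebook = rng.normal(size=(L, N))            # placeholder codewords g_l
blocks = rng.normal(size=(1000, N))           # input vectors f
indices = vq_encode(blocks, codebook)         # what the encoder transmits
reconstructed = codebook[indices]             # decoder: simple table lookup
print("bits per sample:", np.log2(L) / N)     # R = log2(L) / N
```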

  22. Complexity of NN VQ
     - Complexity analysis:
       - Must compare the input vector with all of the codewords.
       - Each comparison takes N operations.
       - Needs L = 2^{NR} comparisons.
       - Total operations = N · 2^{NR}.
       - Total storage = N · 2^{NR}.
     - Both the computation and the storage requirements increase exponentially with N! (Illustrated numerically below.)
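A quick numeric illustration of that exponential growth (assuming R = 1 bit per sample; the values of N are arbitrary):

```python
R = 1                                   # bits per sample
for N in (2, 4, 8, 16, 32):
    L = 2 ** (N * R)                    # codebook size
    ops = N * L                         # operations per input vector (exhaustive search)
    storage = N * L                     # scalars stored in the codebook
    print(f"N={N:2d}  L={L:10d}  ops/vector={ops:12d}  storage={storage:12d}")
```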
