Sampling and reconstruction
CS 4620 Lecture 3 • Cornell CS4620 Fall 2019

Sampled representations • How to store and compute with continuous functions? Common scheme for representation: samples: write down the function's values at many sample points


  1. Discrete filtering in 2D • Same equation, one more index: now the filter is a rectangle you slide around over a grid of numbers • Commonly applied to images: blurring (using box, using Gaussian, …), sharpening (impulse minus blur) • Usefulness of associativity: often apply several filters one after another: (((a * b1) * b2) * b3); this is equivalent to applying one filter: a * (b1 * b2 * b3)

  2. And in pseudocode…
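A minimal Python sketch (not the lecture's pseudocode) of the discrete 2D filtering described above; the filter is an odd-sized (2r+1) by (2r+1) array, and samples that fall outside the image are treated as zero here (edge handling is discussed later):

    import numpy as np

    def filter_image(a, f):
        """Discrete 2D convolution: slide the filter f over the image a,
        weighting the pixels under it and summing."""
        r = f.shape[0] // 2
        h, w = a.shape
        out = np.zeros((h, w))
        for y in range(h):
            for x in range(w):
                s = 0.0
                for j in range(-r, r + 1):
                    for i in range(-r, r + 1):
                        if 0 <= y - j < h and 0 <= x - i < w:   # outside pixels count as zero
                            s += f[j + r, i + r] * a[y - j, x - i]
                out[y, x] = s
        return out

    box3 = np.ones((3, 3)) / 9.0     # 3x3 box blur kernel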

  3. [Philip Greenspun] (figure: original | box blur; sharpened | Gaussian blur)

  5. Optimization: separable filters • basic algorithm is O(r^2): large filters get expensive fast! • definition: a 2D filter f(x, y) is separable if it can be written as a product of two 1D filters: f(x, y) = f1(x) f1(y) • this is a useful property for filters because it allows factoring the 2D convolution into two 1D convolutions

  7. Separable filtering • (figure) the 2D filter splits into two 1D filters: first, convolve with one; second, convolve with the other
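A sketch of the two-pass idea in Python (not from the slides), assuming the 2D filter is the product of a 1D kernel f1 with itself, f[j, i] = f1[j] f1[i]; it matches filter_image(a, np.outer(f1, f1)) from the sketch above, but the per-pixel cost drops from O(r^2) to O(r) per pass:

    import numpy as np

    def filter_separable(a, f1):
        """Convolve rows with the 1D kernel f1, then convolve columns with f1."""
        r = len(f1) // 2
        h, w = a.shape
        tmp = np.zeros((h, w))
        for y in range(h):                      # horizontal pass
            for x in range(w):
                tmp[y, x] = sum(f1[i + r] * a[y, x - i]
                                for i in range(-r, r + 1) if 0 <= x - i < w)
        out = np.zeros((h, w))
        for y in range(h):                      # vertical pass
            for x in range(w):
                out[y, x] = sum(f1[j + r] * tmp[y - j, x]
                                for j in range(-r, r + 1) if 0 <= y - j < h)
        return out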

  8. Continuous convolution: warm-up • Can apply a sliding-window average to a continuous function just as well • output is continuous; integration replaces summation

  13. Continuous convolution • Sliding average expressed mathematically: the box average is (1/(2r)) ∫ from x−r to x+r of g(t) dt; note the normalization, which is needed only for the box • Convolution just introduces weights: (f ⋆ g)(x) = ∫ f(t) g(x − t) dt; weighting is now by a function, the weighted integral is like a weighted average, and again the bounds are set by the support of f(x)

  14. One more convolution • Continuous–discrete convolution: input is a sequence a and a continuous function f; output is a continuous function, (a ⋆ f)(x) = Σ_i a[i] f(x − i) • used for reconstruction and resampling

  15. Continuous–discrete convolution (figure)

  20. And in pseudocode…
      function reconstruct(sequence a, filter f, real x)
          s = 0
          r = f.radius
          for i = ⌈x − r⌉ to ⌊x + r⌋ do
              s = s + a[i] f(x − i)
          return s
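The same routine as runnable Python (an illustrative sketch, not the course code); the filter is passed as a plain function plus an explicit radius, samples are assumed to sit at integer positions 0 … len(a) − 1, and samples outside that range count as zero:

    import math

    def reconstruct(a, f, radius, x):
        """Continuous-discrete convolution (a * f)(x): nearby samples a[i]
        weighted by the continuous filter evaluated at x - i."""
        s = 0.0
        for i in range(math.ceil(x - radius), math.floor(x + radius) + 1):
            if 0 <= i < len(a):
                s += a[i] * f(x - i)
        return s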

  21. Resampling • Changing the sample rate: in images, this is enlarging and reducing • Creating more samples: increasing the sample rate, “upsampling”, “enlarging” • Ending up with fewer samples: decreasing the sample rate, “downsampling”, “reducing”

  22. Resampling • Reconstruction creates a continuous function: forget its origins, go ahead and sample it

  27. And in pseudocode…
      function resample(sequence a, filter f, real x0, real ∆x, int N)
          allocate output sequence b of length N
          for j = 0 to N − 1 do
              b[j] = reconstruct(a, f, x0 + j∆x)
          return b
      (figure: output sample positions x0, x0 + ∆x, …, x0 + (N − 1)∆x)

  28. Defining the source interval • Convenient: define samples using an interval to be resampled, and N • Exactly how to fill an interval with N samples? Desiderata: the sample pattern should be centered in the interval (prevents shifting of content as you change resolution); sample patterns of adjacent intervals should tile (makes it meaningful to break up a signal into pieces); N all the way down to 1 should work • Solution: think of breaking the interval into N subintervals and centering a sample in each one: ∆x = (x_r − x_l)/N, sample i goes at x_l + (i + 0.5)∆x, e.g. first sample at x_l + ∆x/2 and last sample at x_l + (N − 1/2)∆x

  29. And in pseudocode…
      function resample(sequence a, filter f, real xl, real xr, int N)
          allocate output sequence b of length N
          for j = 0 to N − 1 do
              b[j] = reconstruct(a, f, xl + (j + 1/2)(xr − xl)/N)
          return b
      Note that this expands into a double loop
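And the interval version in Python (again a sketch, reusing the reconstruct function defined above); sample j is centered in the j-th of N equal subintervals of [xl, xr]:

    def resample(a, f, radius, xl, xr, n):
        """Resample the sequence a onto n samples centered in [xl, xr]."""
        dx = (xr - xl) / n
        return [reconstruct(a, f, radius, xl + (j + 0.5) * dx) for j in range(n)]

    # Example: upsample 4 samples (at integer positions 0..3, interval [-0.5, 3.5])
    # to 8 samples using a radius-1 tent filter (linear interpolation)
    tent = lambda x: max(0.0, 1.0 - abs(x))
    b = resample([0.0, 1.0, 0.0, 2.0], tent, 1.0, -0.5, 3.5, 8)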

  30. Cont.–disc. convolution in 2D • same convolution, just two variables: now loop over nearby pixels, weighting them with the reconstruction filter • the reconstruction filter looks like a discrete filter, but the offsets are not integers and the filter is continuous • remember: the placement of the filter (the sample point) relative to the pixel grid is variable for reconstruction

  31. Cont.–disc. convolution in 2D • (figure) Example showing the filter weights used to compute a reconstructed value at a single point.

  32. And in pseudocode…
      function reconstruct(sequence a, filter f, real x, real y)
          s = 0
          for j = ⌈y − f.r⌉ to ⌊y + f.r⌋ do
              for i = ⌈x − f.r⌉ to ⌊x + f.r⌋ do
                  s = s + a[i, j] f(x − i, y − j)
          return s

  33. Defining the source rectangle • exactly the same way we defined the source interval in 1D • except now there is an interval (l, r) in x and an interval (b, t) in y • the intervals in x and y constitute a rectangle, with corners (l, b) and (r, t) • sample points are positioned with the same considerations (at the centers of a grid of sub-rectangles)

  34. And in pseudocode…
      function resample(image a, filter f, rectangle r, int w, int h)
          allocate output image b of size (w, h)
          for j = 0 to h − 1 do
              y = r.b + (j + 1/2)(r.t − r.b)/h
              for i = 0 to w − 1 do
                  x = r.l + (i + 1/2)(r.r − r.l)/w
                  b[i, j] = reconstruct(a, f, x, y)
          return b
      Note that this expands into a quadruple loop
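The 2D versions in Python (a sketch following the pseudocode above, not the course code); the image is a NumPy array indexed [row, column], the rectangle is passed as (l, b, r, t), and out-of-range pixels count as zero:

    import math
    import numpy as np

    def reconstruct2d(a, f, radius, x, y):
        """Continuous-discrete convolution in 2D: weight nearby pixels by f(x - i, y - j)."""
        h, w = a.shape
        s = 0.0
        for j in range(math.ceil(y - radius), math.floor(y + radius) + 1):
            for i in range(math.ceil(x - radius), math.floor(x + radius) + 1):
                if 0 <= j < h and 0 <= i < w:
                    s += a[j, i] * f(x - i, y - j)
        return s

    def resample2d(a, f, radius, rect, w, h):
        """Resample image a onto a w x h grid of samples centered in rect."""
        l, b, r, t = rect
        out = np.zeros((h, w))
        for j in range(h):
            y = b + (j + 0.5) * (t - b) / h
            for i in range(w):
                x = l + (i + 0.5) * (r - l) / w
                out[j, i] = reconstruct2d(a, f, radius, x, y)
        return out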

  35. Separable filters for resampling • just as in filtering, separable filters are useful; separability in this context is a statement about a continuous filter, rather than a discrete one: f(x, y) = f1(x) f1(y) • with a separable filter the region of support is rectangular • all widely used resampling filters are separable; there are good reasons, best explained with frequency-domain arguments (it's easiest to design separable filters to suppress grid artifacts) • (figure: pixel [i, j], sample point (x, y), offsets |x − i| and |y − j|)

  36. Optimized separable resampling • for larger filters, separability provides an opportunity for an optimization • resample in two passes, one resampling each row and one resampling each column • intermediate storage required: product of one dimension of the source and the other dimension of the destination
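A sketch of the two-pass version (reusing the 1D resample function from earlier, with a 1D filter f1 such that f(x, y) = f1(x) f1(y)); for a separable filter this computes the same result as the quadruple loop above, and the intermediate image has the source height and the destination width:

    import numpy as np

    def resample2d_separable(a, f1, radius, rect, w, h):
        """Two-pass resampling with a separable filter."""
        l, b, r, t = rect
        src_h = a.shape[0]
        # Pass 1: resample each source row to the destination width.
        tmp = np.array([resample(a[j, :], f1, radius, l, r, w) for j in range(src_h)])
        # Pass 2: resample each intermediate column to the destination height.
        out = np.zeros((h, w))
        for i in range(w):
            out[:, i] = resample(tmp[:, i], f1, radius, b, t, h)
        return out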

  37. [Philip Greenspun] (figure) two-stage resampling using a separable filter • cost: O(r N_src M_dst) for the first pass plus O(r N_dst M_dst) for the second, versus O(r^2 N_dst M_dst) for direct 2D resampling

  38. A gallery of filters • Box filter: simple and cheap • Tent filter: linear interpolation • Gaussian filter: very smooth antialiasing filter • B-spline cubic: very smooth • Catmull-Rom cubic: interpolating • Mitchell-Netravali cubic: good for image upsampling

  39. Box filter (figure)

  40. Tent filter (figure)

  41. Gaussian filter (figure)

  42. B-spline cubic
      f_B(x) = (1/6) (−3(1 − |x|)^3 + 3(1 − |x|)^2 + 3(1 − |x|) + 1)   for −1 ≤ x ≤ 1
             = (1/6) (2 − |x|)^3                                       for 1 ≤ |x| ≤ 2
             = 0                                                       otherwise

  43. Catmull-Rom cubic
      f_C(x) = (1/2) (−3(1 − |x|)^3 + 4(1 − |x|)^2 + (1 − |x|))   for −1 ≤ x ≤ 1
             = (1/2) ((2 − |x|)^3 − (2 − |x|)^2)                  for 1 ≤ |x| ≤ 2
             = 0                                                  otherwise

  44. Mitchell-Netravali cubic
      f_M(x) = (1/3) f_B(x) + (2/3) f_C(x)
             = (1/18) (−21(1 − |x|)^3 + 27(1 − |x|)^2 + 9(1 − |x|) + 1)   for −1 ≤ x ≤ 1
             = (1/18) (7(2 − |x|)^3 − 6(2 − |x|)^2)                       for 1 ≤ |x| ≤ 2
             = 0                                                          otherwise

  45. Lanczos
      sinc(x) = sin(πx) / (πx)
      f_L2(x) = sinc(x) sinc(x/2)   for |x| < 2, 0 otherwise
      f_L3(x) = sinc(x) sinc(x/3)   for |x| < 3, 0 otherwise
      (figure: plots of f_L2 on [−2, 2] and f_L3 on [−3, 3])
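For reference, a direct Python transcription of these filter formulas (an illustrative sketch; the function names are my own):

    import math

    def f_bspline(x):
        x = abs(x)
        if x <= 1: return (-3*(1-x)**3 + 3*(1-x)**2 + 3*(1-x) + 1) / 6.0
        if x <= 2: return (2-x)**3 / 6.0
        return 0.0

    def f_catmull_rom(x):
        x = abs(x)
        if x <= 1: return (-3*(1-x)**3 + 4*(1-x)**2 + (1-x)) / 2.0
        if x <= 2: return ((2-x)**3 - (2-x)**2) / 2.0
        return 0.0

    def f_mitchell_netravali(x):
        return f_bspline(x) / 3.0 + 2.0 * f_catmull_rom(x) / 3.0

    def sinc(x):
        return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

    def f_lanczos(x, a=2):
        # a = 2 or a = 3 gives the L2 / L3 filters above
        return sinc(x) * sinc(x / a) if abs(x) < a else 0.0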

  46. Effects of reconstruction filters • For some filters, the reconstruction process winds up implementing a simple algorithm • Box filter (radius 0.5): nearest-neighbor sampling; the box always catches exactly one input point, and it is the input point nearest the output point, so output[i, j] = input[round(x(i)), round(y(j))], where x(i) computes the position of output coordinate i on the input grid • Tent filter (radius 1): linear interpolation; the tent catches exactly 2 input points, the weights are a and (1 − a), and the result is straight-line interpolation from one point to the next
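A quick check of the tent-filter claim (a sketch reusing the 1D reconstruct function from earlier): reconstruction with a radius-1 tent filter agrees with ordinary linear interpolation, compared here against numpy.interp:

    import numpy as np

    tent = lambda x: max(0.0, 1.0 - abs(x))
    a = [0.0, 2.0, 1.0, 3.0]                     # samples at integer positions 0..3
    for x in np.linspace(0.0, 3.0, 13):
        assert abs(reconstruct(a, tent, 1.0, x) - np.interp(x, range(4), a)) < 1e-12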

  47. Properties of filters • Degree of continuity • Impulse response • Interpolating or not • Ringing, or overshoot • (figure: an interpolating filter used for reconstruction)

  48. Ringing, overshoot, ripples • Overshoot: caused by negative filter values • Ripples: constant in, non-constant out; ripple-free when the filter's integer translates sum to one: Σ_i f(x + i) = 1 for all x
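A quick numeric check of that condition (a sketch using the filter functions written out above): the cubic B-spline, Catmull-Rom, and Mitchell-Netravali filters all have integer translates that sum to one, so constant input reconstructs to constant output:

    def translate_sum(f, x, radius=3):
        """Sum of the filter over integer translates, evaluated at offset x."""
        return sum(f(x + i) for i in range(-radius, radius + 1))

    for f in (f_bspline, f_catmull_rom, f_mitchell_netravali):
        for x in (0.0, 0.1, 0.37, 0.5):
            assert abs(translate_sum(f, x) - 1.0) < 1e-12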

  49. Constructing 2D filters • Separable filters (most common approach)

  50. Yucky details • What about near the edge? the filter window falls off the edge of the image, so we need to extrapolate • methods: • clip filter (black) • wrap around • copy edge • reflect across edge • vary filter near edge [Philip Greenspun]
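A sketch of the first four strategies as index-mapping helpers (illustrative code; the names are my own). A filtering loop would push every out-of-range index through one of these before reading the image; one common form of "vary filter near edge" is instead to renormalize whatever weights remain inside the image (not shown):

    def clip_index(i, n):
        return i if 0 <= i < n else None        # None: treat the sample as black (zero)

    def wrap_index(i, n):
        return i % n                            # wrap around

    def clamp_index(i, n):
        return min(max(i, 0), n - 1)            # copy edge

    def reflect_index(i, n):
        i = i % (2 * n)                         # reflect across edge
        return i if i < n else 2 * n - 1 - i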

  63. Reducing and enlarging • Very common operation: devices have differing resolutions, applications have different memory/quality tradeoffs • Also very commonly done poorly • Simple approach: drop/replicate pixels • Correct approach: use resampling

  64. [Philip Greenspun] (figure: original image, 1000 pixel width)

  65. [Philip Greenspun] (figure: 250 pixel width, by dropping pixels vs. Gaussian filter)

  66. [Philip Greenspun] (figure: 4000 pixel width, box reconstruction filter vs. bicubic reconstruction filter)

  67. Types of artifacts • Garden variety: what we saw in this natural image; fine features become jagged or sparkle • Moiré patterns

  68. [Hearn & Baker cover] (figure: 600 ppi scan of a color halftone image)
