
Study: Accelerating Spatially Varying Gaussian Filters


DESCRIPTION

This is my study review.


1. Accelerating Spatially Varying Gaussian Filters. Jongmin Baek, David E. Jacobs, Stanford University. SIGGRAPH Asia 2010.

2. Drawback of the bilateral filter: piecewise-flat regions, false edges, and blooming at high-contrast regions (bilateral vs. the proposed method).

3. Drawback of the bilateral filter: spurious detail on the road at color edges (bilateral vs. the proposed method).

4. Spatially varying Gaussian filters: a noisy y(x) signal, its bilateral-filtered result, and its spatially-varying-Gaussian-filtered result.

5. Spatially varying Gaussian filters: a 2D Gaussian kernel vs. a 3D Gaussian kernel on the noisy y(x) signal; the spatially varying filter orients its kernels along the gradient.

6. Outline: Introduction; Review of Related Work; Acceleration (kernel sampling to speed up the spatially varying Gaussian kernel); Applications; Limitations and Future Work; Conclusion.

7. Introduction:
- Bilateral filter [Tomasi and Manduchi 98]: blurs pixels spatially while preserving sharp edges.
- Joint bilateral filter for upsampling [Eisemann & Durand 04].
- Bilateral filter as a special case of non-local means [Buades et al. 05].
- High-dimensional Gaussian filter [Paris and Durand 06]: merges the spatial (x, y) and range (I, or r, g, b) domains into one space.
- Importance sampling speeds up the high-dimensional Gaussian filter: bilateral grid [Paris & Durand 06; Chen et al. 07], Gaussian KD-tree [Adams et al. 09], permutohedral lattice [Adams et al. 09].
- Trilateral filter [Choudhury & Tumblin 03]: orients the kernel along the image gradient, i.e., a spatially varying Gaussian kernel. So far there has been no acceleration method for spatially varying Gaussian kernels.

8. Gaussian kernels: isotropic 1D, 2D, ..., N-D Gaussian kernels; elliptical Gaussian kernels; anisotropic Gaussian kernels; the distance measure of a Gaussian kernel.

9. Isotropic Gaussian kernels in 1D, 2D, ..., N-D:
$k_{1D}(u) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{u^2}{2\sigma^2}\right)$,
$k_{2D}(x) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{x^\top x}{2\sigma^2}\right)$,
$k_{ND}(x) = \frac{1}{(2\pi)^{N/2}\sigma^N}\exp\left(-\frac{x^\top x}{2\sigma^2}\right)$.

10. Elliptical kernels: an anisotropic Gaussian kernel is a radial kernel that has been rotated and scaled (figure: radial vs. elliptical kernel around a point p).

11. For the radial kernel the exponent uses $\|p\|^2$; for the elliptical kernel it uses $p^\top\Sigma^{-1}p$.

12. Distance of x: the Euclidean distance is $D(x) = \sqrt{\sum_i x_i^2} = \sqrt{x^\top x}$; the Mahalanobis distance is $D_M(x) = \sqrt{x^\top\Sigma^{-1}x}$, where $\Sigma$ is the covariance matrix with variances $\sigma_i^2$ on the diagonal and covariances $\sigma_{ij}$ off the diagonal.

13. Mahalanobis distance [P. C. Mahalanobis 1936]: given p = (x1, x2) in X1-X2 space, what is the distance of p in the Y1-Y2 space of an elliptical kernel? $D_M(x) = \sqrt{x^\top\Sigma^{-1}x}$.

14. Mahalanobis distance: the same point p has different distances $D_{M_1}(p) \neq D_{M_2}(p)$ under different kernels; the distance is determined by the standard deviations of the kernel.

15. Mahalanobis distance: project p onto the axes of the kernel, then divide by the standard deviation along each axis.

16. Mahalanobis distance: projecting p onto the kernel axes gives $y = E^\top x$, and dividing by the standard deviations gives $D_M(y) = \sqrt{y^\top\Lambda^{-1}y} = \sqrt{(y_1/\sigma_{y_1})^2 + (y_2/\sigma_{y_2})^2}$, which equals $D_M(x) = \sqrt{x^\top\Sigma^{-1}x}$.

17. With the eigendecomposition $\Sigma = E\Lambda E^\top$, where $E = [e_1\ e_2]$ contains the eigenvectors and $\Lambda = \mathrm{diag}(\lambda_1, \lambda_2)$ the eigenvalues, we have $\Sigma^{-1} = E\Lambda^{-1}E^\top$ and hence $D_M(x) = \sqrt{x^\top\Sigma^{-1}x} = \sqrt{(E^\top x)^\top\Lambda^{-1}(E^\top x)} = \sqrt{y^\top\Lambda^{-1}y}$. Note: the PCA axes of the kernel are the axes of Y.

18. Checking the variances along the kernel axes: with $y_1 = e_1^\top x$, $\mathrm{var}(y_1) = \mathrm{var}(e_{11}x_1 + e_{21}x_2) = e_{11}^2\sigma_{x_1}^2 + 2e_{11}e_{21}\sigma_{x_1x_2} + e_{21}^2\sigma_{x_2}^2 = e_1^\top\Sigma e_1 = \lambda_1$, and likewise $\mathrm{var}(y_2) = e_2^\top\Sigma e_2 = \lambda_2$. Therefore $D_M(y) = \sqrt{y^\top\Lambda^{-1}y} = \sqrt{y_1^2/\sigma_{y_1}^2 + y_2^2/\sigma_{y_2}^2}$.

19. Multivariate Gaussian distribution: $k(x) = \frac{1}{(2\pi)^{N/2}|\Sigma|^{1/2}}\exp\left(-\frac{x^\top\Sigma^{-1}x}{2}\right)$, where $\Sigma$ is the covariance matrix of x; the exponent is minus half the squared Mahalanobis distance.
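To make the Mahalanobis-distance and anisotropic-kernel notation on slides 12-19 concrete, here is a minimal NumPy sketch of my own (not code from the paper); the covariance matrix and test point are made-up illustrative values:

```python
import numpy as np

def anisotropic_gaussian_weight(x, cov):
    """Evaluate an N-D anisotropic Gaussian kernel at offset x.

    The exponent is minus half the squared Mahalanobis distance
    D_M(x)^2 = x^T cov^{-1} x, matching the multivariate Gaussian on slide 19.
    """
    x = np.asarray(x, dtype=float)
    cov = np.asarray(cov, dtype=float)
    n = x.size
    d2 = x @ np.linalg.inv(cov) @ x                            # squared Mahalanobis distance
    norm = (2.0 * np.pi) ** (n / 2.0) * np.sqrt(np.linalg.det(cov))
    return np.exp(-0.5 * d2) / norm

# An elliptical (anisotropic) kernel: long along one eigen-axis, short along the other.
Sigma = np.array([[4.0, 1.5],
                  [1.5, 1.0]])
p = np.array([1.0, -0.5])
w = anisotropic_gaussian_weight(p, Sigma)

# Equivalent view in the kernel's own axes (slides 16-18): project onto the
# eigenvectors, then divide by the per-axis standard deviations.
lam, E = np.linalg.eigh(Sigma)                                 # Sigma = E diag(lam) E^T
y = E.T @ p                                                    # coordinates along the kernel axes
d2_axes = np.sum(y ** 2 / lam)                                 # equals p^T Sigma^{-1} p
```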
20. Roadmap (recurring slide): bilateral filters remove noise while keeping edges, but their kernel is not fixed, so a plain fixed-kernel convolution does not apply directly. The options reviewed in the talk: lift the problem so a fixed kernel applies (large memory cost); down-sample, convolve, and up-sample; blur on importance samples (leaf nodes or lattice points); and finally a spatially varying anisotropic Gaussian kernel.

21. Bilateral filter: the kernel weight depends on both the position distance and the color distance: $I'(p) = \frac{1}{K}\sum_q I(q)\,N_{\sigma_s}(\|p-q\|)\,N_{\sigma_r}(|I(p)-I(q)|)$ with $K = \sum_q N_{\sigma_s}(\|p-q\|)\,N_{\sigma_r}(|I(p)-I(q)|)$. The total weight is $W = W_S \cdot W_R$: $W_S$ is fixed, but $W_R$ depends on $|I(p)-I(q)|$, so W is not fixed (a brute-force reference sketch is given after slide 33 below).

22. Roadmap recap: importance sampling on a grid, KD-tree, or lattice.

23. The Gaussian bilateral filter is a kind of high-dimensional Gaussian filter: its weight is $\exp\left(-\frac{\|p-q\|^2}{2\sigma_s^2}\right)\exp\left(-\frac{|I_p-I_q|^2}{2\sigma_r^2}\right)$ (cf. "A Fast Approximation of the Bilateral Filter using a Signal Processing Approach").

24. Writing $P = [p\ I_p]$ and $Q = [q\ I_q]$, the product of the two Gaussians is a single Gaussian over the joint space x range domain: $\exp\left(-\frac{\|p-q\|^2}{2\sigma_s^2} - \frac{|I_p-I_q|^2}{2\sigma_r^2}\right) = \exp\left(-(P-Q)^\top\Sigma^{-1}(P-Q)\right)$ with $\Sigma^{-1} = \mathrm{diag}(1/2\sigma_s^2,\ 1/2\sigma_s^2,\ 1/2\sigma_r^2)$.

25. Higher-dimensional functions (pipeline figure): the homogeneous values and weights $(w_i, w)$ go through Gaussian convolution, then division, then slicing.

26. High-dimensional Gaussian filtering, e.g., an RGB image with an isotropic Gaussian kernel; a diagonal matrix D := diag{...} of per-axis variances defines the elliptical kernel.

27. High-dimensional Gaussian filters: the 2D bilateral kernel vs. a 3D Gaussian kernel on the noisy y(x) signal and its bilateral-filtered result.

28. Roadmap recap.

29. Why do we need the bilateral grid? The high-dimensional Gaussian filter costs a lot of memory and running time. For a w x h image with an n-pixel-wide kernel: the bilateral filter needs wh storage and O(wh n^2) time for a gray image (3wh and O(3wh n^2) for RGB), while the high-dimensional Gaussian filter needs 256wh storage and O(256wh n^3) time for a gray image (256^3 wh and O(256^3 wh n^5) for RGB). The bilateral grid [Chen et al. 07]: down-sample (points {pixel, color} onto a coarse grid), Gaussian-blur the coarse grid, up-sample (coarse grid back to points {pixel, color}).

30. The same pipeline with the grid: DOWNSAMPLE, Gaussian convolution, UPSAMPLE, then division and slicing.

31. Roadmap recap.

32. Importance sampling. The bilateral grid is a kind of importance sampling: a high-dimensional kernel plus sampling on the grid (downsample, Gaussian blur, upsample). The Gaussian KD-tree [Adams et al. 09] is a high-dimensional kernel plus sampling on leaf nodes: splatting (downsample points to leaf nodes), blurring (Gaussian blurring on leaf nodes), slicing (upsample from leaf nodes back to points).

33. Importance sampling in the Gaussian KD-tree: the high-dimensional Gaussian filter samples the neighborhood of each point; importance sampling with the Gaussian KD-tree evaluates samples as near as possible, $V_i \approx \sum_j \exp\left(-\tfrac{1}{2}(p_j - p_i)^\top D^{-1}(p_j - p_i)\right)V_j$, where the $p_j$ are leaf-node samples and the number of samples $s_j$ drawn in each subtree (with $\sum_j s_j = s$) follows the kernel mass there.
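As a companion to slides 21-24, here is a brute-force (unaccelerated) bilateral filter sketch of mine in NumPy. It mirrors the $W = W_S \cdot W_R$ weighting; the function name, sigma defaults, and window radius are my own illustrative choices, not values from the paper:

```python
import numpy as np

def bilateral_filter(img, sigma_s=3.0, sigma_r=0.1, radius=6):
    """O(w*h*n^2) reference bilateral filter for a 2-D grayscale array."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_s = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))          # fixed spatial weight W_S
    pad = np.pad(img.astype(float), radius, mode='edge')
    for y in range(h):
        for x in range(w):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            w_r = np.exp(-(patch - img[y, x]) ** 2 / (2.0 * sigma_r ** 2))  # range weight W_R
            wgt = w_s * w_r                                            # W = W_S * W_R, not fixed
            out[y, x] = np.sum(wgt * patch) / np.sum(wgt)              # normalize by K
    return out
```

The per-pixel range weight in the inner loop is exactly why the accelerations above exist: it prevents expressing the filter as one fixed convolution unless the problem is lifted into the joint space-range domain as on slides 23-24.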
34. Why not simply do importance sampling on an ordinary Gaussian filter? (Figure comparing a Gaussian filter and a high-dimensional Gaussian filter at points p, q1, q2.)

35. Roadmap recap.

36. The permutohedral lattice [Adams, Baek, et al. EG 2010]: an alternative to the Gaussian KD-tree for high-dimensional Gaussian filtering.

37. Roadmap recap.

38. The high-dimensional Gaussian kernel can be made spatially varying along the gradient.

39. Why do we need a spatially varying kernel? (Figure: the signal and its smoothed result.)

40. #1.1: High-dimensional Gaussian filter with an isotropic kernel (radial kernel).

41. #1.1 continued: the smoothed result.

42. #1.2: High-dimensional Gaussian filter with an anisotropic kernel (elliptical kernel): does it smooth correctly?

43.-44. #1.2 continued (figures).

45. #1.3: Spatially varying Gaussian filter: smooths as desired.

46. Why not just use an isotropic (radial / ball) kernel? The image resolution and the color-range resolution differ: we usually apply a small image kernel (3x3, 5x5, ...), but what is an appropriate size for the color-range kernel? It depends on the color distribution, the color space, and special images such as HDR.

47. The high-dimensional Gaussian kernel can be made spatially varying along the gradient.

48. Trilateral filter [Choudhury & Tumblin 03]: orients the kernel along the image gradients $G_1 = \partial I/\partial x$, $G_2 = \partial I/\partial y$, with $\Delta x = x_j - x_i$, $\Delta y = y_j - y_i$. Generalized as $P_i = (x_i, y_i, I(x_i, y_i))$ and $V_i = (I(x_i, y_i), x_i, y_i, 1)$.

49. Acceleration strategies: importance sampling (Gaussian KD-tree); dimensionality elevation, M: (x, y) -> (x, y, x+y, x-y) (then a standard Gaussian KD-tree); kernel sampling & segmentation (permutohedral lattice).

50. Importance sampling when the kernel is isotropic: it is easy to estimate the kernel mass $G_D(x,\cdot)$ over each region ($q_0$ over $R_0$, $q_1$ over $R_1$), and hence how many samples to draw there.

51. Importance sampling when the kernel is anisotropic: estimating the kernel mass per region becomes much harder.

52. Dimensionality elevation for spatially varying kernels (figure).

53. Dimensionality elevation: $M: \mathbb{R}^2 \to \mathbb{R}^4$, $M(x, y) = \left(\frac{x}{\sqrt{2}}, \frac{y}{\sqrt{2}}, \frac{x+y}{\sqrt{8}}, \frac{x-y}{\sqrt{8}}\right)$; similarly $M: \mathbb{R}^3 \to \mathbb{R}^{13}$, with components proportional to x, y, z, their pairwise sums and differences, and the triple combinations $x \pm y \pm z$ (see the code sketch after slide 59 below).

54. Kernel sampling & segmentation. Kernel sampling: let D = {D1, D2, ...} be the set of kernels, and assume the kernel is locally constant. The space of possible kernels is very large (O(d_p^2) degrees of freedom), but in practice D is restricted; call its degrees of freedom d_r. Kernel segmentation by clustering would not be efficient.

55. Kernel sampling & segmentation. Sample the Gaussian kernels {D_l} regularly and sparsely. Segmentation {S_l}: for each D_l in D, define the segment S_l as the set of points {P_i} such that P_i belongs to S_l only if blurring P_i with D_l is necessary for interpolating its result. Each segment S_l is filtered separately, with the kernel rotated or sheared so that D_l is diagonal. (Figure: segments S1-S4 for the sparsely sampled kernels D = {D1, D2, ..., Dn}.)

56. Kernel sampling & segmentation: sparsely sampled kernels D1, D2, D3, D4 (figure).

57. Kernel sampling & segmentation: building a segment, S2 <- S2 U {P_i} (figure).

58. Kernel sampling & segmentation: the resulting segmentation for k = 0 vs. k = 16 (figure).

59. Kernel sampling & segmentation: segmentation result (figure).
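A tiny sketch of the dimensionality-elevation map on slide 53; the scale factors 1/sqrt(2) and 1/sqrt(8) are my reading of the garbled slide text, so treat them as an assumption:

```python
import numpy as np

def elevate_2d(x, y):
    """Dimensionality elevation M: R^2 -> R^4 (slide 53).

    After elevation, a kernel whose axes in R^2 point along x, y, or the two
    diagonals corresponds to an axis-aligned (diagonal) covariance in R^4,
    so a standard Gaussian KD-tree can be applied to the elevated points.
    """
    return np.array([x / np.sqrt(2.0),
                     y / np.sqrt(2.0),
                     (x + y) / np.sqrt(8.0),
                     (x - y) / np.sqrt(8.0)])

p4 = elevate_2d(3.0, 1.0)
```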
60. Kernel sampling & segmentation for segment S3 (figure).

61. Review of accelerating methods for the spatially varying Gaussian filter:
- Importance sampling: blur on Gaussian KD-tree leaf nodes; the number of samples per subtree is proportional to the ratio of the kernel's integral over that subtree.
- Dimensionality elevation: elevate the dimension, e.g., $M(x, y) = (x/\sqrt{2},\ y/\sqrt{2},\ (x+y)/\sqrt{8},\ (x-y)/\sqrt{8})$, and apply the standard Gaussian KD-tree (similarly for $\mathbb{R}^3$).
- Kernel sampling: sample kernels at permutohedral-lattice nodes and blur the lattice nodes.
- Comparison of the proposed methods.

62. Applications: tone mapping; sparse range image upsampling.

63. Bilateral tone mapping: decompose the image into {B, D}, where B is the base layer (the HDR large-scale variation) and D is the detail layer (the local texture variations relative to the base, kept for the LDR output). Tone mapping: scale down B and add D back. Comparison of ways to obtain B: bilateral filtering is quick but shows artifacts; kernel sampling is quick and approximates the trilateral result; the trilateral filter is slow. (See the code sketch at the end of these notes.)

64.-65. Figures: bilateral (blooming and false edges) vs. kernel sampling vs. trilateral.

66. Joint bilateral upsampling: use a bilateral kernel (spatial and range weights) to up-sample image operations performed at low resolution [Kopf et al. 2007].

67. Sparse range image upsampling: a range image (depth map) encodes scene geometry as a per-pixel distance map, useful for autonomous vehicles and background segmentation. Joint-bilateral-filter upsampling relies on similar colors having similar depth. (Figure: color image, ground-truth depth, bilaterally upsampled depth.)

68. Results of sparse range image upsampling on the Synthetic1 dataset.

69. Results of sparse range image upsampling on the Highway2 dataset.

70. Limitations: the time complexity of kernel sampling grows polynomially with d_p and linearly with the dataset size. The number of sampled kernels affects the resulting quality: too few samples cause kernel sampling to degenerate into a spatially invariant Gaussian filter; too many samples create segments with too few points and make the dilation less effective.

71. Conclusion: a flexible scheme for accelerating spatially varying high-dimensional Gaussian filters by segmenting and tiling the image data; results comparable to the trilateral filter, but faster, and better than the bilateral filter; applicable to traditional bilateral-filter applications such as tone mapping and sparse image upsampling.

72. Future work: Shot noise varies with signal strength and is particularly prevalent in areas such as astronomy and medicine, so these fields could use a fast photon-shot-noise denoising filter. Video denoising: align the blur kernels in the space-time volume with object movement. Light-field filtering or upsampling: align the blur kernels with edges in the ray-space hyper-volume.

73. Q&A.
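To connect the tone-mapping application on slide 63 back to the filters above, here is a rough base/detail decomposition sketch in the Durand-Dorsey style (not the paper's exact pipeline); the function name, contrast target, and sigma values are illustrative assumptions of mine:

```python
import numpy as np

def tone_map(luminance, edge_preserving_blur, target_contrast=5.0):
    """Base/detail tone mapping (slide 63): B is an edge-preserving blur of
    log luminance, D = log L - B; only B is compressed, D is kept."""
    log_l = np.log10(np.maximum(luminance, 1e-6))
    base = edge_preserving_blur(log_l)                                  # B (base layer)
    detail = log_l - base                                               # D (detail layer)
    scale = np.log10(target_contrast) / max(base.max() - base.min(), 1e-6)
    out_log = base * scale + detail                                     # scale down B, add D
    return 10.0 ** (out_log - out_log.max())                            # map the peak to 1.0

# Plugging in different blurs for B is the comparison on slides 63-65, e.g. the
# brute-force bilateral sketch given earlier in these notes:
# ldr = tone_map(hdr_luminance, lambda l: bilateral_filter(l, sigma_s=8.0, sigma_r=0.4))
```

Swapping the edge-preserving blur used for B (bilateral, kernel sampling, or trilateral) is exactly the comparison made on slides 63-65.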