COMPUTER VISION @
Prasanna Rangarajan
Dr. Panos Papamichalis
04/09/10
Page 2
Organization
Imaging under “Structured Light”
– what is “Structured Light” ?
– estimating depth using “Structured Light”
– Optical Super-Resolution : using Structured Light to overcome the lowpass nature of an imaging system
– how is Optical Super-Resolution different from Digital Super-Resolution ?
– what is wrong with the state of the art in Optical Super-Resolution ?
Mathematical Model for Imaging under “Structured Light”
– macroscopic OSR using “Structured Light” and the corresponding workflow
– estimating depth using “Structured Light” and the corresponding workflow
Other areas of interest
Uncalibrated Macroscopic OSR + Depth estimation in a single setup
Page 3
Structured Light and its applications
What is Structured Light ? … periodic light patterns
Why is it useful ?
– Traditionally, used to recover depth maps & surface topology
– Recently, used in microscopes to resolve spatial detail that the microscope optics alone cannot resolve
Page 4
Closer look at Depth from Structured Light
Phase Measuring Profilometry
Principle
– project a sinusoidal (periodic) pattern onto the scene, at a known angle
– an image of the scene, viewed from a different position and/or angle, reveals lateral displacements + frequency changes related to topological variations
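The displacement principle is commonly implemented by phase-shifting the projected sinusoid. A minimal numpy sketch of N-step phase recovery, with a synthetic 1-D phase ramp standing in for a real scene (the function name, pattern count, and values are illustrative, not from the talk):

```python
import numpy as np

def recover_phase(images, shifts):
    """N-step phase shifting: each image is I_k = A + B*cos(phi + delta_k).
    Returns the wrapped phase phi, in (-pi, pi]."""
    num = sum(I * np.sin(d) for I, d in zip(images, shifts))
    den = sum(I * np.cos(d) for I, d in zip(images, shifts))
    return -np.arctan2(num, den)

# synthetic 1-D "scene": a known phase ramp, imaged under 3 shifted patterns
phi_true = np.linspace(-3.0, 3.0, 256)
shifts = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
images = [5.0 + 2.0 * np.cos(phi_true + d) for d in shifts]
phi_est = recover_phase(images, shifts)
```

With equally spaced shifts the ambient term cancels and the arctangent recovers the wrapped phase exactly; depth then follows from the phase via triangulation.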
Mephisto 3D Scanner, from 3D Dynamics
http://www.youtube.com/watch?v=854ZTvs8UoU
http://www.youtube.com/watch?v=VGxEUKPNqcA
SL hits from the TI website
1. Application Report DLPA021, “Using the DLP Pico 2.0 Kit for Structured Light Applications”
2. Blog entry “3D Metrology and Structured Light”, by Dennis Doane
Other DLP-based SL scanners : ViaLUX, GFM, 3D3, ShapeQuest
Page 5
Problem : Diffraction imposes a fundamental limit on the spatial resolution of a camera. Cameras behave like low-pass filters.
Objective of Optical Super-Resolution : Improve the resolution of a camera without altering its physical parameters:
Optical Super-Resolution using Structured Light has revolutionized microscopy in recent years
Diffraction in ocean waves
Smaller cutoff frequency : f/# = 10
Larger cutoff frequency : f/# = 2.8
Principle : shift frequencies outside the passband into the passband
How ? modulate the amplitude of a periodic pattern with scene information
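A numpy sketch of this modulation principle in 1-D, with an ideal low-pass filter as a crude stand-in for the camera optics (all frequencies below are illustrative choices, not values from the talk): scene detail at 120 cycles is invisible to a 64-cycle low-pass "camera", but multiplying by a 100-cycle carrier first moves a copy of it down to 20 cycles, inside the passband.

```python
import numpy as np

n = 1024
t = np.arange(n) / n
f_scene, f_carrier, f_cutoff = 120.0, 100.0, 64.0   # cycles (illustrative)

scene = np.cos(2 * np.pi * f_scene * t)       # detail beyond the optical cutoff
carrier = np.cos(2 * np.pi * f_carrier * t)   # periodic illumination pattern

def lowpass(x, cutoff):
    """Ideal low-pass filter: crude stand-in for diffraction-limited optics."""
    X = np.fft.fft(x)
    X[np.abs(np.fft.fftfreq(n, 1 / n)) > cutoff] = 0.0
    return np.fft.ifft(X).real

direct = lowpass(scene, f_cutoff)                # camera alone: detail is lost
modulated = lowpass(scene * carrier, f_cutoff)   # a copy survives at 20 cycles
```

The product term cos(2π·120t)·cos(2π·100t) contains a difference frequency at 20 cycles, which passes through the filter even though the scene frequency itself does not.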
Page 6
Optical Super-Resolution using Structured Light
How is it different from Digital Super-Resolution ?
Optical Super-Resolution : recover spatial frequencies lost to diffraction ( beyond the optical cutoff )
Digital Super-Resolution : recover spatial frequencies lost to aliasing ( but only up to the optical cutoff )
See Optical Super-Resolution in action : http://zeiss-campus.magnet.fsu.edu/tutorials/superresolution/hrsim/hrsim.swf
Page 7
Optical Super-Resolution using Structured Light
Perspective & De-magnification
scene-dependent distortion ( useful for recovering depth but not OSR )
Perspective & de-magnification present a real challenge for macroscopic imaging /illumination systems such as commercial cameras/projectors
Imaging & illumination systems in Structured Light-microscopy DO NOT experience significant perspective effects
imaging parallel lines on railroad track
How do we eliminate the scene-dependent distortion ?
Page 8
Solution-1 : Collocate the camera & projector, and illuminate the scene with a specific periodic pattern
Daniel A. Vaquero, Ramesh Raskar, Rogerio S. Feris, & Matthew Turk, “A Projector-Camera Setup for Geometry-Invariant Frequency Demultiplexing”, IEEE Computer Vision and Pattern Recognition (CVPR'09)
Macroscopic OSR using Structured Light
Eliminating the scene-dependent distortion
Are we really shifting frequencies outside the passband of the optics, into the passband ?
Solution-2 : Coincide the camera & projector using a beam-splitter
L. Zhang & S. K. Nayar, “Projection Defocus Analysis for Scene Capture and Image Display”, SIGGRAPH 2006
“Macroscopic OSR” for imaging systems observing a 3D scene : unsolved since 1963
W. Lukosz and M. Marchand, “Optischen Abbildung Unter Ueberschreitung der Beugungsbedingten Aufloesungsgrenze”, Opt. Acta 10, 241-255 (1963)
Page 9
Macroscopic OSR using Structured Light
Our contributions
– Identify a family of camera+projector setups that can realize OSR in macroscopic imaging, for arbitrary scenes
– Unify existing embodiments of Structured Light
– Single setup for recovering depth & realizing OSR
raw image super-resolved image
depth map
Publications
“Surpassing the Diffraction-limit of Digital Imaging Systems using Sinusoidal Illumination Patterns”, Computational Optical Sensing and Imaging, OSA Technical Digest (Optical Society of America, 2009)
“Perspective Imaging under Structured Light”, under review
“A Method and Apparatus for Surpassing the Diffraction Limit in Imaging Systems”, patent being pursued, August 2009
Page 11
Perspective Imaging under Structured Light
Proposed Mathematical Model
– The image planes of the camera and projector are parallel ( permits uncalibrated OSR & depth estimation )
– Images captured by the camera are free of aliasing
– The camera point spread function does not change appreciably over the shallow depth of field of the projector
Page 12
Perspective Imaging under Structured Light
Unified Mathematical Model
Suppose the projector is projecting the periodic illumination pattern
What is the expression for the image captured by the camera ?
Ideally we would like to illuminate the scene with a complex sinusoid
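Physical projectors can only emit nonnegative intensities, so a complex-sinusoid illumination is typically synthesized from phase-shifted real patterns. A sketch under that assumption (the reflectance profile and frequencies are made up, and projector blur and perspective are ignored):

```python
import numpy as np

n, f0 = 512, 12.0
x = np.arange(n) / n
scene = 0.6 + 0.3 * np.cos(2 * np.pi * 5.0 * x)   # made-up reflectance profile

# three real, nonnegative patterns: 1 + cos(2 pi f0 x + psi_k)
shifts = 2 * np.pi * np.arange(3) / 3
captures = [scene * (1 + np.cos(2 * np.pi * f0 * x + p)) for p in shifts]

# the weighted sum cancels the DC and conjugate terms, leaving the
# complex-sinusoid image scene * exp(j 2 pi f0 x)
complex_img = (2 / 3) * sum(C * np.exp(-1j * p) for C, p in zip(captures, shifts))
```

The three phase weights e^(-j psi_k) sum to zero, which kills the unmodulated term, and the conjugate sideband averages out, so the combination behaves as if the scene had been illuminated by a complex sinusoid.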
Page 13
Perspective Imaging under Structured Light
Unified Mathematical Model
1. Identify the scene point that is illuminated by the projector pixel, and its corresponding camera pixel
2. Propagate intensity from , accounting for scene reflectance and projector defocus
Page 14
Perspective Imaging under Structured Light
Expression for camera image under Structured Light
How do we find ?
The effect of projector defocus is a depth dependent blurring of each frequency component in the illumination pattern !!!
Page 15
We can express the incident intensity at the scene point in the camera coordinates as
Perspective Imaging under Structured Light
Effect of projector defocus
Projector Defocus
– depth-dependent magnification
– depth-dependent phase distortion
Putting it together
Page 16
Perspective Imaging under Structured Light
Expression for camera image under Structured Light
The expression applies to all embodiments of structured light that rely on sinusoidal illumination in a parallel stereo setup !!!
BUT, how does it help in realizing OSR and recovering depth ?
Page 18
Identify image of scene obtained under ambient + uniform projector illumination
Identify images of scene obtained under complex sinusoidal illumination
Macroscopic OSR under Structured Light
Basic Idea
depend on scene geometry
ambient + uniform illumination
complex sinusoidal illumination
Page 19
Macroscopic OSR under Structured Light
Basic Idea
In the special case that they are independent of scene depth, the image captured by a traditional camera is low-pass filtered ( LPF ) scene information
What happens when we demodulate ?
The demodulated image contains spatial frequencies that exceed the bandwidth of the imaging system !!!
band-pass filtered ( BPF ) scene information
modulate amplitude of complex sinusoid with scene information
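The modulate-then-demodulate idea can be sketched in 1-D numpy, with an ideal low-pass filter standing in for the diffraction-limited optics (all frequencies are illustrative): scene detail at 260 cycles, beyond a 200-cycle cutoff, survives capture because the complex-sinusoid carrier down-shifts it into the passband, and demodulation restores its true location.

```python
import numpy as np

n, f_cut, f0 = 2048, 200.0, 180.0
t = np.arange(n) / n
scene = np.cos(2 * np.pi * 260.0 * t)   # detail beyond the 200-cycle cutoff

def lowpass(x, cutoff):
    """Ideal low-pass filter standing in for the camera optics."""
    X = np.fft.fft(x)
    X[np.abs(np.fft.fftfreq(n, 1 / n)) > cutoff] = 0.0
    return np.fft.ifft(X)

# capture under complex-sinusoid illumination exp(+j 2 pi f0 x):
# the carrier shifts part of the out-of-band detail into the passband
captured = lowpass(scene * np.exp(2j * np.pi * f0 * t), f_cut)

# demodulation shifts the captured band back to its true location
demodulated = captured * np.exp(-2j * np.pi * f0 * t)
```

The demodulated signal's spectral peak sits at 260 cycles, beyond what the "optics" alone could transmit, which is exactly the claim on this slide.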
Page 20
The expressions for are invariant to scene geometry, when
Macroscopic OSR under Structured Light
When are they invariant to scene geometry ?
Examples
– image planes of camera & projector are coplanar
– illumination pattern is aligned with the epipolar lines in the projector
Page 21
Identify the raw image and the exponentially modulated images
Macroscopic OSR under Structured Light
Complete Workflow
Page 22
Identify the frequency of the modulating pattern
After modulation, the DC component in shifts to the carrier frequency
Macroscopic OSR under Structured Light
Complete Workflow
The DC component of the super-resolved image must have zero phase. Use this to identify
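One way to identify the carrier, sketched in numpy under the assumption that the scene's DC term dominates its spectrum: after modulation the strongest spectral peak of the captured image sits at the carrier frequency, and the phase at that peak gives the carrier phase (all values below are synthetic, not from the talk).

```python
import numpy as np

n = 1024
x = np.arange(n) / n
scene = 1.0 + 0.2 * np.cos(2 * np.pi * 7.0 * x)   # DC-dominant synthetic scene
f0, phi0 = 150.0, 0.7                              # "unknown" carrier

modulated = scene * np.exp(1j * (2 * np.pi * f0 * x + phi0))

X = np.fft.fft(modulated)
k = np.argmax(np.abs(X))              # the scene's DC term now sits at the carrier
f_est = np.fft.fftfreq(n, 1 / n)[k]   # estimated carrier frequency
phi_est = np.angle(X[k])              # estimated carrier phase
```

Forcing the recovered DC term to have zero phase, as the slide states, amounts to dividing out this estimated phase during phase compensation.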
Page 23
Aliasing Management : avoid aliasing of demodulated spatial frequencies that exceed the detector Nyquist frequency
Macroscopic OSR under Structured Light
Complete Workflow
Page 24
Aliasing Management : avoid aliasing of demodulated spatial frequencies that exceed the detector Nyquist frequency
Macroscopic OSR under Structured Light
Complete Workflow
How is it done ? ( sinc-interpolation ) symmetrically increase the size of the modulated images prior to demodulation
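The sinc-interpolation step can be sketched in numpy: zero-padding the spectrum symmetrically enlarges the image without adding any frequency content, so frequencies shifted during demodulation stay below the new, larger Nyquist limit. (The upsampling factor and test signal below are illustrative.)

```python
import numpy as np

def sinc_upsample(sig, factor):
    """Sinc interpolation via symmetric zero padding in the Fourier domain:
    the signal grows by `factor` without gaining any frequency content."""
    n = sig.shape[0]
    X = np.fft.fftshift(np.fft.fft(sig))
    pad = (factor - 1) * n // 2
    Xp = np.pad(X, pad)               # symmetric zero padding
    return np.fft.ifft(np.fft.ifftshift(Xp)) * factor

sig = np.cos(2 * np.pi * 3 * np.arange(64) / 64)
up = sinc_upsample(sig, 2)            # 128 samples of the same 3-cycle cosine
```

The upsampled signal is the same 3-cycle cosine on a twice-as-fine grid, which is what leaves headroom for the demodulated out-of-band frequencies.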
Page 25
Macroscopic OSR under Structured Light
Complete Workflow
Without aliasing management
With aliasing management
Aliasing Management : avoid aliasing of demodulated spatial frequencies that exceed the detector Nyquist frequency
Page 26
Demodulation + Phase Compensation
Macroscopic OSR under Structured Light
Complete Workflow
Any collocated/co-incident camera+projector setup can be used to recover spatial frequencies exceeding the bandwidth of an imaging system
Quick Recap
Page 28
Depth from Collocated Structured Light
a specific instance of “Phase Measuring Profilometry”
Objective : recover surface topology from the phase distortion induced by depth variation, in a collocated stereo setup
How do we recover depth maps from ?
1. Identify the modulated image
Page 29
Depth from Collocated Structured Light
a specific instance of “Phase Measuring Profilometry”
Objective : recover surface topology from the phase distortion induced by depth variation, in a collocated stereo setup
How do we recover depth maps from ?
2. Attempt demodulation + phase compensation
Page 30
Depth from Collocated Structured Light
Complete Workflow
To avoid ambiguities in phase unwrapping, two patterns ( one low frequency, one high frequency ) are employed
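A hedged numpy sketch of the two-frequency idea: the low-frequency phase is unambiguous (it stays below 2π over the field of view) and is used to select the 2π fringe order of the wrapped high-frequency phase. The frequencies and the synthetic phase ramp are illustrative, not values from the talk.

```python
import numpy as np

def two_frequency_unwrap(wrapped_hi, f_hi, phi_lo, f_lo):
    """Pick the 2*pi fringe order of the wrapped high-frequency phase
    using the unambiguous low-frequency phase as a coarse prediction."""
    predicted = phi_lo * (f_hi / f_lo)
    order = np.round((predicted - wrapped_hi) / (2 * np.pi))
    return wrapped_hi + 2 * np.pi * order

f_lo, f_hi = 1.0, 16.0                     # illustrative fringe frequencies
phi_true = np.linspace(0.0, 2 * np.pi * f_hi * 0.9, 500)   # true fine phase
phi_lo = phi_true * (f_lo / f_hi)          # coarse phase: stays below 2*pi
wrapped = np.angle(np.exp(1j * phi_true))  # fine phase, wrapped to (-pi, pi]
unwrapped = two_frequency_unwrap(wrapped, f_hi, phi_lo, f_lo)
```

The high-frequency pattern supplies depth precision; the low-frequency pattern only has to be accurate to within half a fine fringe to pick the right order.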
Page 32
Experimental Results
Setup-1 : vertically collocated camera+projector
Page 33
Experimental Results - OSR
Setup-1 : vertically collocated camera+projector
OSR is possible only in the horizontal direction
Page 34
Experimental Results – Estimating depth
Setup-1 : vertically collocated camera+projector
Page 35
Experimental Results - OSR
Setup-2 : non-collocated camera+projector
Page 36
Experimental Results
Setup-2 : non-collocated camera+projector
Without aliasing management
With aliasing management
Page 37
Closing Arguments & Open Issues
Putting things in perspective
It is possible to resolve detail exceeding the bandwidth of a macroscopic imaging system
There are camera+projector setups that can recover depth information + resolve detail exceeding the bandwidth of the imaging system
Can we super-resolve when the optical axes of the camera and projector are crossed ?
Can we accommodate aliasing during image capture ?
Bar-code scanners
Counterfeit Bill Detection
Non-contact fingerprint scanning
Non-contact archived document scanning
Artwork authentication
Page 39
Inpainting using Two-View Geometry
Basic Idea
Objective : remove occluders in an image with the aid of a second image of the scene, taken from a different viewpoint
Why 2 images ?
Single image inpainting methods have trouble filling large regions, and often produce unrealistic results
Principle : fill in missing pixels by copying information from the corresponding epipolar lines in the second image
How does it work ?
1. Estimate the epipolar geometry relating the 2 views using plane+parallax & covariance of the estimated homography
2. Fill in missing pixels by transferring intensities along corresponding epipolar lines
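Step 2 relies on knowing, for each missing pixel, its epipolar line in the second view, which follows directly from the fundamental matrix estimated in step 1. A minimal sketch (the rectified-stereo fundamental matrix used in the check is illustrative, not the setup from the talk):

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l' = F @ [u, v, 1] in the second view for pixel
    x = (u, v) in the first view, scaled so (a, b) is a unit normal:
    points (u', v') on the line satisfy a*u' + b*v' + c = 0."""
    l = F @ np.array([x[0], x[1], 1.0])
    return l / np.linalg.norm(l[:2])

# rectified (pure horizontal translation) stereo: epipolar lines are horizontal
F_rect = np.array([[0.0, 0.0, 0.0],
                   [0.0, 0.0, -1.0],
                   [0.0, 1.0, 0.0]])
line = epipolar_line(F_rect, (3.0, 5.0))   # the line v' = 5 in the second view
```

The transfer step then searches along this line in the second view for intensities to copy into the occluded region.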
Page 40
Inpainting using Two-View Geometry
How well does it work ?
View-1 before inpainting
View-1 after inpainting
Page 41
Least Squares Parameter Estimation
Pg. 108 : “Data normalization is an essential step in the DLT algorithm. It must not be considered optional.” ( Hartley & Zisserman, Multiple View Geometry )
Problem : A majority of the estimation tasks in computer vision are heteroscedastic in nature. This poses a problem for LS estimators.
Popular Solution : De-center & rescale the coordinates before solving the LS problem
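The de-center-and-rescale step is commonly done as Hartley normalization: translate the centroid of the points to the origin and scale so the mean distance from the origin is sqrt(2). A numpy sketch (the sample points are arbitrary):

```python
import numpy as np

def normalize_points(pts):
    """Hartley normalization: move the centroid to the origin and scale so
    the mean distance from the origin is sqrt(2). Returns the 3x3 similarity
    T and the normalized points (pts is an (N, 2) array)."""
    centroid = pts.mean(axis=0)
    mean_dist = np.mean(np.linalg.norm(pts - centroid, axis=1))
    s = np.sqrt(2) / mean_dist
    T = np.array([[s, 0.0, -s * centroid[0]],
                  [0.0, s, -s * centroid[1]],
                  [0.0, 0.0, 1.0]])
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    return T, (T @ homog.T).T[:, :2]

pts = np.array([[100.0, 200.0], [140.0, 220.0], [90.0, 260.0], [160.0, 180.0]])
T, pn = normalize_points(pts)
```

The similarity T is applied to both point sets before forming the LS system, and its effect is undone on the estimated homography afterwards; the next slide's question is precisely how the estimate depends on this choice of normalization.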
Page 42
Least Squares Parameter Estimation
Problem with normalization : The LS estimate depends on the choice of normalization
1. Is there a matrix N that makes the LS estimator invariant to coordinate normalization ?
“Estimating homographies without coordinate normalization”, Proceedings of the IEEE International Conference on Image Processing (ICIP 2009), November 2009, Cairo, pp. 3517-3520
2. What choice of N induces the smallest bias in the LS estimate ?
“Improved algebraic methods for circle fitting”, Electronic Journal of Statistics, 3 (2009), 1075-1082
“Hyperaccurate ellipse fitting without iterations”, Proceedings of the 5th International Conference on Computer Vision Theory and Applications (VISAPP'10), May 2010, Angers, France, to appear
“High accuracy homography computation without iterations”, Proceedings of the 16th Symposium on Sensing via Image Information (SSII2010), June 2010, Japan, to appear
“Hyperaccurate least squares and its applications”, Proceedings of the International Conference on Pattern Recognition (ICPR'10), August 2010, Turkey, to appear
Page 43
Least Squares Parameter Estimation
Representative Results !!!
Page 44
Thank You !!!
Vikrant Bhakta, Dr. Marc Christensen, Dr. Kenichi Kanatani
Collaborators
Page 45
Thank You !!!
http://spie.org/x34304.xml