RGBD Image Processing for 3D Modeling and Texturing
ICIP 2013 Tutorial Dr. Hwasup Lim, Dr. Seungkyu Lee,
and Dr. Jongwoo Lim
Commodity Depth Cameras

| Product Name | Depth Sensing | Resolution | Distance Range | Frame Rate | Price |
|---|---|---|---|---|---|
| PrimeSense Carmine | Structured Light | D 640x480, C 640x480 | 0.8m-3.5m | 30fps | $200 |
| MS Kinect for Windows | Structured Light | D 640x480, C 640x480 | 0.8m-4m (0.4m-3.0m) | 30fps | $250 |
| Asus Xtion Pro Live | Structured Light | D 640x480, C 640x480 | 0.8m-3.5m | 30fps | $200 |
| Intel/Creative Senz3D | Time of Flight | D 320x240, C 1280x720 | 0.15m-1m | 30fps (D) | $149 |
| SoftKinetic DS325 | Time of Flight | D 320x240, C 1280x720 | 0.15m-1m | 30fps (D) | $249 |
| MS Kinect 2.0 for Xbox One | Time of Flight | D 512x424, C 1920x1080 | ? | 30fps | ? |
RGBD Paper Statistics
[Chart: publication counts per year, from Google Scholar]
- Keyword: "RGBD"
- Keywords: "Kinect" + "simultaneous localization and mapping", "object reconstruction", "multiple Kinect", "interference mitigation", "transparency and calibration"
3D Imaging Technology
Extract or reconstruct 3D information from one or multiple imaging sensors.
- 2D Imaging: intensity of the incoming light ray
- 3D Imaging: intensity of all light rays of the light field
The Lambertian surface assumption has been made for 3D imaging; an extension to non-Lambertian surfaces is required for true 3D imaging.
Light Field
  I = P(x, y, z, θ, φ, λ, t)
[Figure: two rays (θ1, φ1) and (θ2, φ2) through the point (x, y, z) at time t]
Structured Light
- Project light patterns and detect correspondences
- Perform triangulation between matching points
Pros: low price, high accuracy, high resolution
Cons: limited sensing range (50cm-2m), limited dynamic imaging, occlusion
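The triangulation step can be sketched as follows; the focal length, baseline, and disparity values are illustrative, not taken from any specific sensor.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Structured-light triangulation: Z = f * b / d.

    Depth is inversely proportional to the disparity between a
    projected pattern point and where the detector observes it.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical calibration: f = 580 px, baseline = 7.5 cm.
z = depth_from_disparity(580.0, 0.075, 29.0)
print(round(z, 3))  # 1.5 (meters)
```

Because depth varies as 1/disparity, depth resolution degrades quadratically with distance, one reason these sensors are limited to a few meters.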
Kinect Principle (1/4)
It is based on the structured-light principle.
[Figure: projected IR speckle pattern]
Kinect Principle (2/4)
0. Calibrate the source and detector
1. A known IR pattern is projected from the source
2. The detector identifies each dot (or set of dots)
3. Triangulate to calculate the depth
Kinect Principle (3/4)
- Random speckles identify the x, y locations
- The orientation and shape of the speckles change with distance, identifying the z location
Kinect Principle (4/4)
Measured Depth and Color Images
Time-of-Flight Principle (1/6)
Reflected IR shows a phase delay proportional to the distance from the camera.
Time-of-flight of light → distance: it is not simple to measure the flight time directly at each pixel of any existing image sensor.
Laser Scanning
- Structured-light and time-of-flight principles applied with laser light
Pros: high accuracy and repeatability, long sensing range
Cons: active sensing using a laser beam, limited online 3D sensing
Structured Laser Light
LIDAR (Time-of-Flight)
Time-of-Flight Principle (2/6)
Q1 through Q4 are the amounts of electrons measured at the corresponding times.
In real situations, it is difficult to sense the electric charge at an exact time instant.
Distance Measurement
IR Modulation
[Figure: timing diagrams for pulse and sinusoidal modulation — emitted vs. reflected light, gates TX0/TX1, integration window Ton, round-trip delay TTOF, and per-gate electron counts N0/n, N1/n]

Pulse light: the reflected pulse divides its charge between the two gates in proportion to the delay,

  TTOF = Ton · N1 / (N0 + N1)

Sinusoidal light: the gate counts contain sin(2π·TTOF/T) and cos(2π·TTOF/T) terms of the modulation period T, so TTOF is recovered from their ratio with an arctangent, as in the four-phase scheme on the following slides.
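The pulse-gated distance computation can be sketched as follows, assuming two gates that collect N0 and N1 electrons during a pulse of width Ton (all values illustrative):

```python
def pulse_tof_depth(n0, n1, t_on, c=299_792_458.0):
    """Pulse ToF: the reflected pulse splits its charge between two
    gates in proportion to the round-trip delay, so
    T_TOF = T_on * N1 / (N0 + N1) and depth = c * T_TOF / 2."""
    t_tof = t_on * n1 / (n0 + n1)
    return c * t_tof / 2.0

# Equal charge in both gates -> delay equals half the 20 ns pulse width.
d = pulse_tof_depth(n0=1000, n1=1000, t_on=20e-9)
print(round(d, 3))  # 1.499 (meters)
```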
Demodulation-related Error
Depth Calculation Error
[Figure: four-gate demodulation — Gate1 through Gate4 sample the reflected NIR (emitted amplitude A0, reflected amplitude r·A0, delay td) over the period T0, yielding charges Q1-Q4; the signs of Y1 and Y2 select the quadrant of td within 0 to 2T0: (Y1≥0, Y2≥0), (Y1<0, Y2≥0), (Y1<0, Y2<0), (Y1≥0, Y2<0)]

  Y1 = n·Q1 − n·Q2
  Y2 = n·Q3 − n·Q4
  td = arctan(Y2 / Y1), with the quadrant chosen from the signs of Y1 and Y2

[Panels: four-phase depth calculation; difference in depth calculation]
Time-of-Flight Principle (3/6)
Distance

  d = (c / 2) · td
  td = (T / 2π) · arctan( (Q3 − Q4) / (Q1 − Q2) )

Equivalently, with normalized charges qi:

  d = (c·T / 4π) · arctan( (q3 − q4) / (q1 − q2) )
Assumption: a single reflected IR signal.
In principle, the amplitude of the reflected IR does not affect the depth calculation, since it cancels in the charge ratio.
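The four-phase depth computation, and its insensitivity to the reflected amplitude, can be sketched as follows (the 20 MHz modulation frequency is an illustrative value; atan2 resolves the quadrant from the signs of the two charge differences):

```python
import math

def four_phase_depth(q1, q2, q3, q4, mod_freq_hz, c=299_792_458.0):
    """d = c * phase / (4 * pi * f), phase = atan2(Q3 - Q4, Q1 - Q2)."""
    phase = math.atan2(q3 - q4, q1 - q2) % (2 * math.pi)
    return c * phase / (4 * math.pi * mod_freq_hz)

# Scaling all four charges by the same factor (a stronger reflection)
# leaves the depth unchanged: only the phase matters, not the amplitude.
d1 = four_phase_depth(100, 60, 90, 50, 20e6)
d2 = four_phase_depth(200, 120, 180, 100, 20e6)
print(abs(d1 - d2) < 1e-9)  # True
```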
Time-of-Flight Principle (4/6)
- Large Sensor Pixel
- Scattering
- Multipath
- Motion Blur
- Transparent Object
In real situations, multiple reflected IR signals with different phase delays and amplitudes can be superposed.
  d = (c·T / 4π) · arctan( ((q3 + q̂3) − (q4 + q̂4)) / ((q1 + q̂1) − (q2 + q̂2)) )

where qi and q̂i are the charges contributed by the two superposed IR signals.
We do not know how many IR signals will be superposed.
Multiple Light Reception: due to the interference of multiple IR light reflections → use a multipath interference model [5]
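The effect of superposed returns can be simulated with an idealized charge model; the surface depths, amplitudes, and modulation frequency below are hypothetical:

```python
import math

C = 299_792_458.0

def charges(depth_m, amp, mod_freq_hz):
    """Ideal four-phase charges for a single sinusoidal return."""
    phase = 4 * math.pi * mod_freq_hz * depth_m / C
    return (amp * (1 + math.cos(phase)), amp * (1 - math.cos(phase)),
            amp * (1 + math.sin(phase)), amp * (1 - math.sin(phase)))

def depth_from_charges(q, mod_freq_hz):
    phase = math.atan2(q[2] - q[3], q[0] - q[1]) % (2 * math.pi)
    return C * phase / (4 * math.pi * mod_freq_hz)

f = 20e6
direct = charges(1.0, amp=100.0, mod_freq_hz=f)       # true surface at 1 m
bounce = charges(1.6, amp=30.0, mod_freq_hz=f)        # longer multipath route
mixed = tuple(a + b for a, b in zip(direct, bounce))  # charges superpose
print(round(depth_from_charges(mixed, f), 3))  # biased above the true 1.0 m
```

A single return is recovered exactly; once a second return with a different delay is added, the arctangent mixes the two phases and the estimate lands between the two path lengths.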
Multipath Errors
[Figure: multipath interference — IR from the LED reaches the sensor both directly and via a secondary bounce]
2-Layer approximation of transparent object
  d = (c·T / 4π) · arctan( ((α·q3 + (1−α)·q̂3) − (α·q4 + (1−α)·q̂4)) / ((α·q1 + (1−α)·q̂1) − (α·q2 + (1−α)·q̂2)) )

where α is the mixing ratio between the front (transparent) surface return (qi) and the rear surface return (q̂i).
- Sometimes 2 layers are not enough
- Multiple reflections occur between objects (when they are close)
- In most cases, the objects have specular surfaces
Multipath Errors (Transparency)
Depth error in transparent objects
Multipath Errors (Transparency)
Without the second-layer return:

  td = arctan( (Q3 − Q4) / (Q1 − Q2) )

With it:

  td = arctan( ((Q3 − Q4) + (Q̂3 − Q̂4)) / ((Q1 − Q2) + (Q̂1 − Q̂2)) )

where Q̂i are the charges from the second (transparent-layer) return.
[Panels: depth and IR-intensity images]
Time-of-Flight Principle (5/6)
What we have calculated is the distance R from the camera to the object surface along the reflected IR ray.
Based on each pixel's location and the field-of-view information, Z can be calculated from R to obtain undistorted 3-dimensional geometry.
Time-of-Flight Principle (6/6)
Measured Depth and IR Intensity Images
Demodulation Error
Caused by irregularity (asymmetry) of the modulation process.
PMD, 1-4 m, B-spline fitting [1]; SR3100, 1-5 m, integration time 2 ms - 32 ms, 6th-degree polynomial fitting [2]
[Plot: depth error (cm) vs. distance (m), approximately -10 to 25 cm over 1-4.5 m]
Integration time-related Error
The number of electrons collected varies with the integration time, so the repeatability of each depth point varies.
[Panels: depth maps at integration times of 30 ms and 80 ms]
Amplitude Correction
Light attenuates according to the inverse-square law.
Distance-based intensity correction [18]
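A distance-based intensity correction can be sketched under a plain inverse-square model; the reference distance is an arbitrary illustrative choice:

```python
def correct_intensity(measured_intensity, depth_m, ref_depth_m=1.0):
    """Undo inverse-square attenuation: rescale the measured IR
    amplitude to what it would be at the reference distance."""
    return measured_intensity * (depth_m / ref_depth_m) ** 2

# A surface at 2 m returns 1/4 of the light it would at 1 m;
# the correction restores the 1 m value.
print(correct_intensity(25.0, 2.0))  # 100.0
```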
Large Sensor Pixel
To increase sensitivity, a large pixel size or pixel binning is used.
[Figure: a large pixel straddling a depth edge integrates both IR signal #1 and IR signal #2]

  d = (c·T / 4π) · arctan( ((q3 + q̂3) − (q4 + q̂4)) / ((q1 + q̂1) − (q2 + q̂2)) )
Multiple light reflections between the lens and the sensor → use a scattering model [4] or an anti-reflection material on the lens
Light Scattering
Light scattering [4]
Non-Local Means Filter
uses a weighted average of all pixels, weighted by the similarity of square patch neighborhoods
Depth Noise Reduction
Non-local denoising result [7]
Bilateral Filter
uses a weighted average of depth values from nearby pixels
Depth Noise Reduction
Bilateral filter kernel (spatial location × intensity) [9]
Bilateral filtering result
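A minimal 1D sketch of the bilateral filter applied to a depth profile; the kernel widths sigma_s (spatial) and sigma_r (range) are illustrative choices:

```python
import math

def bilateral_filter_1d(depth, sigma_s=2.0, sigma_r=0.1, radius=3):
    """Each output is a weighted average of nearby depths; the range
    kernel suppresses neighbors across depth discontinuities, so edges
    survive while flat regions are smoothed."""
    out = []
    for i, d in enumerate(depth):
        wsum = vsum = 0.0
        for j in range(max(0, i - radius), min(len(depth), i + radius + 1)):
            w = (math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)) *
                 math.exp(-((d - depth[j]) ** 2) / (2 * sigma_r ** 2)))
            wsum += w
            vsum += w * depth[j]
        out.append(vsum / wsum)
    return out

# Noisy step edge: the noise is smoothed but the jump at index 4 is kept.
noisy = [1.0, 1.02, 0.98, 1.01, 2.0, 2.01, 1.99, 2.02]
print([round(v, 2) for v in bilateral_filter_1d(noisy)])
```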
Joint Bilateral Filter-based Method
refines depth values using color similarity [15]
Depth Super-resolution
(a) Color image 640x640, (b) input depth map 64x64, (c) refined depth map 640x640 [15]
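In the spirit of [15], joint bilateral upsampling can be sketched in 1D: low-resolution depth samples are averaged with weights from spatial distance and from color similarity in the high-resolution image, so depth edges snap to color edges. All parameters and data below are illustrative.

```python
import math

def joint_bilateral_upsample_1d(depth_lr, color_hr, scale,
                                sigma_s=1.0, sigma_c=0.1):
    """For each high-res pixel, average all low-res depth samples,
    weighted by distance in low-res coordinates and by color similarity
    sampled from the high-res guide image."""
    out = []
    for i, c in enumerate(color_hr):
        pos = i / scale  # this pixel's position in low-res coordinates
        wsum = vsum = 0.0
        for j, dj in enumerate(depth_lr):
            cj = color_hr[min(len(color_hr) - 1, int(j * scale))]
            w = (math.exp(-((pos - j) ** 2) / (2 * sigma_s ** 2)) *
                 math.exp(-((c - cj) ** 2) / (2 * sigma_c ** 2)))
            wsum += w
            vsum += w * dj
        out.append(vsum / wsum if wsum > 0 else depth_lr[int(pos)])
    return out

# 4-sample depth upsampled 2x; the color edge at index 4 places the depth edge.
depth_lr = [1.0, 1.0, 2.0, 2.0]
color_hr = [0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8]
print([round(v, 2) for v in joint_bilateral_upsample_1d(depth_lr, color_hr, 2)])
```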
ToF Motion Blur
A moving camera or object causes wrong depth calculation → motion blur
[Figure: image sensor observing a moving object]
ToF Motion Blur
The characteristics of ToF motion blur differ from those of color motion blur.
[Panels: overshoot blur, undershoot blur, overshoot blur]
ToF Motion Blur
The characteristics of ToF motion blur differ from those of color motion blur.
[Figure: radiated and reflected IR, and the four phase signals Q1-Q4 accumulated over the integration time inside the ToF camera]
  d = (c·T / 4π) · arctan( ((n1·q3 + (n−n1)·q̂3) − (n1·q4 + (n−n1)·q̂4)) / ((n1·q1 + (n−n1)·q̂1) − (n1·q2 + (n−n1)·q̂2)) )

where n1 of the n subframes integrate the old surface (qi) and the remaining n−n1 integrate the new one (q̂i).
ToF Motion Blur
Blur Detection
[Pipeline: input depth → blur detection → blur level → deblurred depth]
- There are fixed relations between Q1-Q4 at a static pixel:
    Q1 + Q2 = Q3 + Q4
    Q1 + Q2 + Q3 + Q4 = K
- We assume a single flat foreground and a single flat background.
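Under the static-pixel identity above, a per-pixel blur check can be sketched as follows; the tolerance and charge values are illustrative:

```python
def detect_tof_blur(q1, q2, q3, q4, tol=0.05):
    """For a static pixel the complementary gate pairs collect equal
    totals, Q1 + Q2 == Q3 + Q4.  Motion during the integration time
    mixes foreground and background phases and breaks this identity,
    so a large relative mismatch flags the pixel as blurred."""
    k1, k2 = q1 + q2, q3 + q4
    return abs(k1 - k2) / max(k1, k2) > tol

print(detect_tof_blur(166.9, 33.1, 174.3, 25.7))  # False (static pixel)
print(detect_tof_blur(150.0, 33.1, 210.0, 25.7))  # True (inconsistent pair)
```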