[Figure: ToF depth-sensing principle for pulsed and sinusoidal light. For pulsed light, two transfer gates TX0/TX1 collect N0 electrons in phase with the emitted pulse (width T_on) and N1 electrons out of phase, so the time of flight and depth follow from
T_TOF = T_on * N1 / (N0 + N1),  d = (c / 2) * T_TOF.
For sinusoidal light, the emitted signal is correlated against cosine/sine references and the phase delay of the reflected signal encodes T_TOF.]
Course Schedule
Session I (8:30 am - 10:15 am)
Introduction to ToF Sensor Research
Principle of ToF Depth Sensor
Image Processing Algorithms for Depth Image Quality Improvement
1. Intrinsic parameter acquisition and non-linear calibration
2. Range ambiguity
3. Depth noise modeling and denoising
4. Super-resolution
5. ToF motion blur
Demodulation-related Error
Caused by irregularities in the modulation process
Corrected with a look-up table, B-splines [1], or polynomials [2]
Systematic Depth Errors
Figure 1. PMD, 1-4 m, B-spline fitting [1]
Figure 2. SR3100, 1-5 m, IT 2 ms - 32 ms, 6th-degree polynomial fitting [2]
Demodulation-related Error
Corrected with a 4-stage depth calculation [1]
Systematic Depth Errors
[Figure: four gates (Gate1-Gate4) sample the reflected NIR against the emitted NIR over one modulation period T0, accumulating charges Q1-Q4 for a delay t_d; the sign pattern of Y1 and Y2 selects the quadrant across 0, T0/2, T0, 3T0/2, 2T0.]
Y1 = nQ1 - nQ2
Y2 = nQ3 - nQ4
t_d = arctan(Y2 / Y1) (up to the modulation scale)
Figure 3. Four-phase depth calculation
Figure 4. Difference in depth calculation
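As a concrete sketch of the four-phase relation above, the snippet below converts the phase samples of one pixel into a metric depth. The 20 MHz modulation frequency and the ideal sinusoidal form of the samples are assumptions for illustration, not values from these slides.

```python
import numpy as np

C = 299_792_458.0   # speed of light (m/s)
F_MOD = 20e6        # assumed modulation frequency (Hz)

def four_phase_depth(q1, q2, q3, q4, f_mod=F_MOD):
    """Depth from the four phase samples Q1..Q4 of a CW ToF pixel.

    Uses Y1 = Q1 - Q2 and Y2 = Q3 - Q4 and the arctangent relation
    from Figure 3; the phase maps to distance via d = c*phi/(4*pi*f).
    """
    y1 = q1 - q2
    y2 = q3 - q4
    phase = np.arctan2(y2, y1) % (2 * np.pi)  # fold into [0, 2*pi)
    return C * phase / (4 * np.pi * f_mod)
```

At 20 MHz the unambiguous range c/(2f) is about 7.5 m; anything beyond folds back, which is the depth-folding problem discussed later.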
Integration time-related Error
Caused by the number of electrons collected during the integration time; too few collected electrons worsen the repeatability problem
Systematic Depth Errors
Figure 5. Colored 3D point cloud of a flat wall at a constant distance of 1 meter, for integration times of 2 ms, 4 ms, and 8 ms [2].
Pixel-related Error
Caused by per-pixel differences in the material properties of the CMOS gates, together with amplitude-related error
Corrected using a Fixed Pattern Noise table
Systematic Depth Errors
Figure 6. Fixed pattern noise, SR-2, IT 100 ms, nominal distance 2.452 m [3]
Figure 7. Depth-colored amplitude-related errors (color scale 0.41-0.45 m). Depth image of a flat wall at 0.43 meters. Depth overestimation can be observed due to low illumination (borders of the image) [2].
Amplitude-related Error
Caused by the non-uniformity of IR illumination and the reflectivity variation of objects
Corrected using a polynomial fitting model
Systematic Depth Errors
Figure 8. Amplitude image of a planar object with a ramp image; parts of the ramp are selected for calibration (blue rectangle).
Figure 9. The depth samples (blue) and the model (green) fitted to the error.
Amplitude Correction
Light attenuates according to the inverse-square law
Systematic Depth Errors
Figure 20. Distance-based intensity correction [18]
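A minimal sketch of a distance-based intensity correction under the inverse-square assumption. The reference distance d_ref and the purely radial 1/d^2 falloff are simplifying assumptions, not the exact model of [18].

```python
import numpy as np

def distance_corrected_amplitude(amplitude, depth, d_ref=1.0):
    """Normalize ToF amplitude for inverse-square falloff.

    Scales each pixel's amplitude to the value it would have at the
    reference distance d_ref, assuming pure 1/d^2 attenuation.
    """
    amplitude = np.asarray(amplitude, dtype=float)
    depth = np.asarray(depth, dtype=float)
    return amplitude * (depth / d_ref) ** 2
```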
Temperature-related Error
Caused by the response of the semiconductor to temperature change
Mitigation: wait at least 4 minutes after power-on
Systematic Depth Errors
Figure 10. Distance offset drift (fixed target) caused by self-induced heating of the sensor. SR-2 [3]
Light Scattering
Caused by multiple light reflections between the lens and the sensor
Use a scattering model [4] or an anti-reflection coating on the lens
Non-systematic Depth Errors
Figure 11. Light scattering in a ToF camera [4]
Figure 12. Scattering artifacts: (a) color image, (b) background range image, (c) range image with foreground, (d) range image difference [4]
Multiple Light Reception
Caused by interference among multiple light reflections
Use a multipath interference (MPI) model [5]
Non-systematic Depth Errors
Figure 13. Multipath interference (IR LED and sensor geometry)
Figure 14. Top view of the corner. Green points: laser scanner reference. Red points: ToF camera with MPI (RMSE 57 mm). Black points: compensated for MPI (RMSE 17 mm) [5]
Jump Edge Error
Caused by multiple light reception
Use an outlier rejection method [6][7]
Non-systematic Depth Errors
Figure 15. Jump edge error
Motion Artifact (Blur)
Caused by movement during the integration time
Detect pixels with phase deviation [14]
Non-systematic Depth Errors
Figure 16. Motion blurring
Figure 17. Motion artifact: distance (m) across consecutive positions, dynamic vs. static case [14]
Depth Folding (Phase Wrapping, Range Folding)
Caused by the modulo ambiguity in the measured phase delay
Use multiple frequencies [17] or an MRF with a continuous-surface assumption [8]
Non-systematic Depth Errors
Figure 20. Depth unfolding [8]
Emitted signal: A sin(2*pi*f*t); incoming signal: A' sin(2*pi*f*t + phi) with phi = 2*pi*f * (2R/c). Since phi is measured modulo 2*pi, ranges R and R + R_max return the same phase, where R_max = c / (2f).
Figure 18. Depth folding
Figure 19. Depth unfolding using multiple frequencies [17]
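To illustrate the multiple-frequency idea [17], the sketch below brute-forces the wrap counts of two wrapped measurements and keeps the hypothesis pair that agrees best. The frequencies and the small search range are assumptions for illustration; a real unwrapping scheme (e.g. the method of [17]) is more careful about noise.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def unfold_depth(d1, d2, f1=20e6, f2=16e6, max_wraps=5):
    """Resolve phase wrapping from two wrapped depth measurements.

    d1, d2 are depths measured at modulation frequencies f1, f2,
    each wrapped to its own unambiguous range R = c / (2 f).
    Tries all wrap counts up to max_wraps and returns the average
    of the two hypotheses that agree best.
    """
    r1, r2 = C / (2 * f1), C / (2 * f2)
    best = None
    for k1 in range(max_wraps):
        for k2 in range(max_wraps):
            h1, h2 = d1 + k1 * r1, d2 + k2 * r2
            err = abs(h1 - h2)
            if best is None or err < best[0]:
                best = (err, 0.5 * (h1 + h2))
    return best[1]
```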
Comments on Depth Measurement Errors
No unified solution handles all of these errors!
Correct FPN and distance offset first
Utilize amplitude and color information (if available)
Use a constant integration time
Open research issues remain: depth range ambiguity, the multiple-light-reception effect, and motion artifacts in complex scenes
Bilateral Filter
uses a weighted average of depth values from nearby pixels
Depth Noise Reduction
Figure 21. Bilateral filter kernel [9]
Figure 22. Bilateral filtering result
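A minimal NumPy sketch of a bilateral filter for a depth map (not the exact kernel of [9]); the radius and sigma values are illustrative. Spatial and range weights are multiplied, so large depth differences suppress averaging across edges.

```python
import numpy as np

def bilateral_filter_depth(depth, radius=2, sigma_s=2.0, sigma_r=0.05):
    """Edge-preserving smoothing of a depth map.

    Each output pixel is a weighted average of its neighbours, with
    weights combining spatial distance (sigma_s, pixels) and depth
    difference (sigma_r, depth units), so depth edges survive.
    """
    h, w = depth.shape
    pad = np.pad(depth, radius, mode='edge')
    out = np.zeros_like(depth, dtype=float)
    wsum = np.zeros_like(depth, dtype=float)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = pad[radius + dy:radius + dy + h,
                          radius + dx:radius + dx + w]
            w_spatial = np.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
            w_range = np.exp(-((shifted - depth) ** 2) / (2 * sigma_r ** 2))
            weight = w_spatial * w_range
            out += weight * shifted
            wsum += weight
    return out / wsum
```

With sigma_r much smaller than the depth step at an edge, the edge is preserved almost exactly while flat regions are smoothed.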
Non-Local Means Filter
uses a weighted average over all pixels, weighted by the similarity of their square neighborhoods
Depth Noise Reduction
Figure 23. Non-local denoising result [7]
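A brute-force non-local means sketch for small depth maps, following the square-neighborhood-similarity idea above; the patch/search sizes and the smoothing parameter h are illustrative, and this nested loop is far from an optimized implementation.

```python
import numpy as np

def nlm_denoise_depth(depth, patch=1, search=3, h=0.05):
    """Non-local means for a depth map (brute force, small images).

    Pixel weights come from the similarity of (2*patch+1)^2 square
    neighbourhoods inside a (2*search+1)^2 search window.
    """
    pad = patch + search
    d = np.pad(depth, pad, mode='reflect')
    H, W = depth.shape
    out = np.zeros_like(depth, dtype=float)
    for y in range(H):
        for x in range(W):
            cy, cx = y + pad, x + pad
            ref = d[cy - patch:cy + patch + 1, cx - patch:cx + patch + 1]
            num = den = 0.0
            for sy in range(-search, search + 1):
                for sx in range(-search, search + 1):
                    ny, nx = cy + sy, cx + sx
                    cand = d[ny - patch:ny + patch + 1,
                             nx - patch:nx + patch + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / (h * h))
                    num += w * d[ny, nx]
                    den += w
            out[y, x] = num / den
    return out
```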
Learning-based Method
addresses depth discontinuity or low infrared reflectivity
uses Random Forest regressor trained with real-world
data [5]
Depth Noise Reduction
Figure 24. Flying pixels due to
unsuitable reflectivity and large
depth discontinuities [6]
Figure 25. Artifact pixels [6]
Joint Bilateral Filter-based Method
refines depth values using color similarity [15]
Depth Super-resolution
(a) Color image 640x640 (b) Input depth map 64x64 (c) Refined depth maps 640x640 [15]
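A toy joint-bilateral-upsampling sketch in the spirit of [15]: each high-resolution pixel averages nearby low-resolution depth samples, weighted by spatial distance and by color similarity in a grayscale guidance image. The 3x3 low-res support, the grayscale guidance, and the sigma values are assumptions, not the exact method of [15].

```python
import numpy as np

def joint_bilateral_upsample(depth_lo, color_hi, scale,
                             sigma_s=1.0, sigma_c=0.1):
    """Upsample a low-res depth map guided by a high-res image.

    For each high-res pixel, averages nearby low-res depth samples,
    weighting by spatial distance (in low-res coordinates) and by
    color similarity in the grayscale guidance image.
    """
    Hh, Wh = color_hi.shape
    Hl, Wl = depth_lo.shape
    out = np.zeros((Hh, Wh))
    for y in range(Hh):
        for x in range(Wh):
            ly, lx = y / scale, x / scale
            num = den = 0.0
            for j in range(max(0, int(ly) - 1), min(Hl, int(ly) + 2)):
                for i in range(max(0, int(lx) - 1), min(Wl, int(lx) + 2)):
                    ws = np.exp(-((j - ly) ** 2 + (i - lx) ** 2)
                                / (2 * sigma_s ** 2))
                    # guidance color at this low-res sample's position
                    cy = min(Hh - 1, int(j * scale))
                    cx = min(Wh - 1, int(i * scale))
                    wc = np.exp(-(color_hi[y, x] - color_hi[cy, cx]) ** 2
                                / (2 * sigma_c ** 2))
                    num += ws * wc * depth_lo[j, i]
                    den += ws * wc
            out[y, x] = num / den
    return out
```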
Markov Random Field-based Method
assumes that discontinuities in depth and color coexist
Depth Super-resolution
Figure 27. MRF-based depth super-resolution result [12]
Multiframe-based Method
models the image formation process from multiple depth
images
Depth Super-resolution
Figure 28. super-resolution result [11]
Filtering-based Method
smooths the disparity map to hide disoccluded regions.
Novel View Synthesis
Figure 29. Depth map smoothing [17] Figure 30. Depth map smoothing [18]
Inpainting-based Method
fills the disoccluded region using image-inpainting
techniques.
Novel View Synthesis
Figure 27. Bi-layer inpainting [15]
Pipeline: image warping -> occlusion boundary labeling -> foreground/background segmentation -> exemplar-based inpainting [19]
[Figure: warped disparity map; labeled occlusion boundary; region to be inpainted split into foreground and background regions; segmented region, inpainted result, and original image]
ToF Motion Blur
A moving camera or a moving object causes wrong depth calculation: motion blur
[Figure: a moving object in front of the image sensor produces motion blur]
ToF Motion Blur
The characteristics of ToF motion blur differ from those of color motion blur
[Figure: examples of overshoot blur and undershoot blur]
ToF Motion Blur
The characteristics of ToF motion blur differ from those of color motion blur
[Figure: radiated and reflected IR with the 4-phase signals Q1, Q2, Q3, Q4 integrated inside the ToF camera over the integration time.]
Depth calculation using the relation of the 4-phase signals:
t_d = arctan( (nQ3 - nQ4) / (nQ1 - nQ2) )
ToF Motion Blur
ToF motion blur model
Static case: t_d = arctan( (nQ3 - nQ4) / (nQ1 - nQ2) )
If the scene changes after m of the n integration sub-frames, so that each gate mixes m new samples Q with (n - m) old samples Q~, the computed delay becomes
t_d(m) = arctan( (nQ3 - nQ4) / ( (mQ1 + (n - m)Q~1) - (mQ2 + (n - m)Q~2) ) )
Differentiating t_d(m) with respect to m and solving yields the mixing ratio m/n as a function of the measured phase samples.
ToF Motion Blur
With normalized phase samples the mixing ratio reduces to an expression in Q1 alone,
m/n = (1 - 2Q1) / (2 - 2Q1),
which stays between 0 and 1.
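The mixing model above can be simulated directly: blend the ideal phase samples of an old and a new depth by the ratio alpha = m/n and recompute the arctangent. The 20 MHz modulation frequency and the ideal sinusoidal samples are assumptions for illustration; real blur also depends on sensor specifics.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)
F = 20e6           # assumed modulation frequency (Hz)

def phase_samples(depth):
    """Ideal 4-phase samples for a CW ToF pixel at the given depth."""
    phi = 4 * np.pi * F * depth / C
    return np.array([1 + np.cos(phi), 1 - np.cos(phi),
                     1 + np.sin(phi), 1 - np.sin(phi)])

def blurred_depth(d_old, d_new, alpha):
    """Depth computed when a fraction alpha of the integration time
    sees the new depth and (1 - alpha) the old one (the m/n ratio)."""
    q = alpha * phase_samples(d_new) + (1 - alpha) * phase_samples(d_old)
    phi = np.arctan2(q[2] - q[3], q[0] - q[1]) % (2 * np.pi)
    return C * phi / (4 * np.pi * F)
```

Because the arctangent of blended samples is not a linear blend of the two phases, intermediate alpha values can land away from the straight interpolation between the two depths, which is why ToF blur can overshoot or undershoot.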
[Figure: reflection viewed as a system. Input signal: light; system: BRDF; output signal: image.]
Inverse Reflectometry with Few Inputs
[Homogeneous BRDF]
Reflection as convolution
Global coordinate = local coordinate:
B(u_o) = Integral_{-pi/2}^{pi/2} L(u_i) rho(u_i, u_o) du_i
(lighting L, BRDF rho, reflected light field B)
Global coordinate != local coordinate (surface orientation o, local angles u_i', u_o'):
B(o, u_o') = Integral_{-pi/2}^{pi/2} L(o + u_i') rho(u_i', u_o') du_i'
=> a Linear Shift Invariant system
Inverse Reflectometry with Few Inputs
[Homogeneous BRDF]
Reflection as convolution:
B(o, u_o') = Integral_{-pi/2}^{pi/2} L(o + u_i') rho(u_i', u_o') du_i'
Analyzing well-posed conditions for inverse lighting/reflectometry:
High frequencies in the BRDF (sharp highlights) -> well-posed inverse lighting
High frequencies in the lighting distribution (sharp features such as point lights, edges, etc.) -> well-posed inverse reflectometry
In the frequency domain the LSI system becomes a product of coefficients (convolution operator):
B = L * rho   <=>   B_{l,p} = Lambda_l L_l rho_{l,p}
Inverse Reflectometry with Few Inputs
[Homogeneous BRDF]
Applying a nested algorithm for estimating the unknown lighting/BRDF: compare photograph vs. rendered image and alternately update
rho_{l,p} = B_{l,p} / (Lambda_l L_l)
L_l = B_{l,p} / (Lambda_l rho_{l,p})
Dipole diffusion approximation, as a sum of front & back irradiance map contributions:
A_j ~ Sum R_d(r_{j,f}) E_f dA_f + Sum R_d(r_{j,b}) E_b dA_b
Inverse Reflectometry with Few Inputs
[Homogeneous BSSRDF]
Estimating a diffuse reflectance function
Supposing the lighting/shape is given, the front/back irradiance maps can be computed.
R_d is expressed as a linear combination of piecewise-constant basis functions:
R_d(r) ~ Sum_{j=1}^{J} c_j e_j(r)
Substituting into the irradiance sums gives the matrix entries
a_ij = Sum_k ( K_f e_j(r_{k,f}) + K_b e_j(r_{k,b}) )
and the coefficients c_j follow from solving the linear system A x = b.
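The linear system for the coefficients can be sketched with hypothetical piecewise-constant bases: each basis function is the indicator of one radial bin, and least squares recovers the c_j. The bin edges and sample values below are made up for illustration; the real system of [the BSSRDF fit] also folds in the irradiance kernels K_f, K_b.

```python
import numpy as np

def fit_rd_coefficients(r, rd, edges):
    """Fit R_d(r) ~ sum_j c_j e_j(r) with piecewise-constant bases.

    edges defines radial bins; each basis e_j is the indicator of
    one bin. Solves the least-squares system A c = b for c.
    """
    A = np.stack([(r >= lo) & (r < hi)
                  for lo, hi in zip(edges[:-1], edges[1:])],
                 axis=1).astype(float)
    c, *_ = np.linalg.lstsq(A, rd, rcond=None)
    return c
```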
Inverse Reflectometry with Few Inputs
[Homogeneous BSSRDF]
Given the coefficients of the diffuse reflectance function, it is possible to render the target materials under varying illumination/geometric conditions.
Inverse Lighting with Few Inputs
[General scene]
Capturing an incident light field using a mirror-like light probe
0 stop -3.5 stop -7 stop
Mirror light probe
Generating HDR map for
incident lighting
Response function
[Debevec 97]
Multiple images with varying exposure time for HDR recovery
Capturing high frequencies in light distribution
Inverse Lighting with Few Inputs
[General scene]
Capturing an incident light field using a matte light probe
[Wang and Samaras 02]
Input
Detecting critical
boundaries
Reconstructed
A set of points
perpendicular to the light
direction
A single image with Lambertian object
Low frequencies in light distribution
Inverse Lighting with Few Inputs
[General scene]
Capturing an incident light field using a planar light probe
[Alldrin and Kriegman 06]
A single image with a three-layered light probe
Low frequencies and some high frequencies in light distribution
Top Pattern: Sinusoidal printed on translucent sheet
Medium: Glass (0.096 in, Refractive index 1.52)
Bottom: Sinusoidal printed on Lambertian sheet
Inverse Lighting with Few Inputs
[General scene]
Capturing an incident light field from shadows [Sato et al. 02]
Input
Regenerating shadows
Inverse Lighting with Few Inputs
[General scene]
Lighting distribution over the hemisphere
Estimating the light distribution
Inverse Lighting with Few Inputs
[Specific scene]
Estimating an incident light field from a class-object
Faces, eyes, etc.
R=7.6mm
Anatomical model for human eye
Gaze detection
Inverse Lighting with Few Inputs
[Specific scene]
Estimating an incident light field from a class-object
Faces, eyes, etc.
Extracting an environment map (incident light)
From many to few inputs
Compact parametric representation for materials
Complex materials and global illumination
Diffuse (Texture), glossy, specular (Direct illumination)
Translucency, subsurface scattering (Indirect illumination)
Lack of detail
Geometry enhancement ( motivated by super-resolution )
Semantic approach
Noisy inputs
Accounting for noise models during inference
Robust algorithm ( motivated by photometric stereo)
Problem Definition - Reminder
Inverse Problem with Complex Illumination
Estimating both the lighting and reflectance from a single
image
Lambertian surfaces to Non-Lambertian surfaces
A recent study extends this work to recover a semiparametric reflectance model, more general than empirical models [Chandraker and Ramamoorthi 11]
Necessary to recover the texture
Photograph Rendered
Related work: Ramamoorthi and Hanrahan [01]
Earlier work derived a theoretical analysis of the inverse problem, factorizing the reflectance and lighting from a single image
Inverse Problem with Complex Illumination
[General BRDF]
A general BRDF accounts for a variety of materials
Proposes a semiparametric reflectance model, more general than empirical models, written as a sum of univariate functions
They proved that a 2-lobe BRDF can be uniquely identified from a single input image
In various empirical models, alpha_i can correspond to the light vector, the halfway vector, or the view vector
Estimating the reflectance functions:
The unknowns are the directions alpha_i and the forms of the functions f_i
Solved as a regression problem
rho(n) = Sum_{i=1}^{K} f_i(alpha_i^T n)
min_{alpha_i, f_i} Sum ( rho(n) - Sum_{i=1}^{K} f_i(alpha_i^T n) )^2
[Chandraker and Ramamoorthi 11]
Inverse Problem with Complex Illumination
[General BRDF]
Comparable to ground truth
Better than the empirical
model
Input Relighting
Ground truth
Error
Input Proposed
Ground truth
T-S Model
Inverse Problem with Complex Illumination
[Global Illumination]
Inverse light transport to extract the global illumination
effects (e.g. interreflection)
Exploiting the duality of forward/inverse light transport
[Carroll et al. 11]
Inverse Problem with Complex Illumination
[Global Illumination]
Extracting the indirect illumination iteratively
The coefficients of S^-1 are given by a Neumann series
Inverse Problem with Complex Illumination
[Global Illumination]
Separating the indirect illumination out
Inverse Problem with Complex Illumination
[Textured Surface]
Accounting for spatial variation in the BRDF
Assuming a bivariate BRDF
Input: about one hundred images under varying illumination conditions
[Alldrin et al. 08]
Few inputs
Compact parametric representation for materials
Complex materials and global illumination
Diffuse (Texture), glossy, specular (Direct illumination)
Translucency, subsurface scattering (Indirect illumination)
Lack of detail
Geometry enhancement ( motivated by super-resolution )
Semantic approach
Noisy inputs
Accounting for noise models during inference
Robust algorithm ( motivated by photometric stereo)
Inverse Problem In Practice
Inverse Problem with Noises
Depth/color misalignment
The depth image and the color image must be brought to the same vantage point
Depth noise dependent on material properties
Most depth sensors suffer from depth distortion that depends on the material characteristics
Missing data
Inverse Problem with Noises
Robust approach for photometric stereo
The classic PS problem assumes Lambertian materials without shadows
In reality, surfaces are mostly non-Lambertian and include shadows
A robust approach handles the non-Lambertian illumination and shadows as errors
Rank minimization problem
Convex Lambertian surfaces are represented by an at-most-rank-3 structure
Formulate the problem as rank minimization with a sparse-error constraint:
min_{A,E} rank(A) + lambda * ||E||_0   s.t.   D = A + E
Solution via convex programming (relaxation):
min_{A,E} ||A||_* + lambda * ||E||_1   s.t.   D = A + E
[Wu et al., ACCV10] [Candes, Li, Ma, and Wright 09]
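A compact sketch of the convex relaxation above, solved with a plain augmented-Lagrangian (ADMM) loop: singular-value thresholding updates the low-rank part A and soft thresholding updates the sparse error E. The parameter choices follow common defaults (lambda = 1/sqrt(max(m, n))); this is not the exact solver used in [Wu et al., ACCV10].

```python
import numpy as np

def rpca(D, lam=None, mu=None, n_iter=500):
    """Low-rank + sparse split: min ||A||_* + lam*||E||_1, D = A + E.

    Basic ADMM / inexact-ALM loop with fixed penalty mu.
    """
    m, n = D.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or 0.25 * m * n / np.abs(D).sum()
    Y = np.zeros_like(D)   # dual variable
    E = np.zeros_like(D)
    for _ in range(n_iter):
        # singular value thresholding step for the low-rank part A
        U, s, Vt = np.linalg.svd(D - E + Y / mu, full_matrices=False)
        A = U @ np.diag(np.maximum(s - 1 / mu, 0)) @ Vt
        # soft-thresholding step for the sparse error E
        T = D - A + Y / mu
        E = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0)
        Y += mu * (D - A - E)
    return A, E
```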
Inverse Problem with Noises
[Wu et al., ACCV10]
Conclusion
It is time to merge advanced graphics and vision research for 3D imaging technology.
Practical issues remain for general scenes, complex materials, and few input images.
Parametric vs. Data-driven representation for materials
Complex illumination
Physically accurate constraints for appearance
Robust approach for geometry reconstruction
For outdoor scenery, we need to resolve the range limit of the depth camera
Interference on IR
Range limit
Q&A
Thank you!