Document 7732973


Multi-Focus Range Sensor using Coded Aperture
Shinsaku HIURA (Osaka Univ.)
Takashi MATSUYAMA (Kyoto Univ.)
Depth from Defocus
Depth estimation using the quantity of blurring
- cf. Depth from Focusing: single-image analysis; the focused position is searched by moving the lens – not suitable for real-time measurement
- Passive, and no physical motion
  - suited for real-time measurement of real objects
  - small and light-weight equipment
- Limitation of passive sensing: a planar photograph is indistinguishable from the real object
Stable depth estimation using relative defocus analysis
Depth from Defocus: multiple-image analysis
- distance is estimated from the amount of blurring – no physical motion, suited for real-time measurement
- stable analysis against varied texture
- a small and optimal sensor is necessary
Multi-Focus Camera
- Converted from a color CCD camera
- Each CCD is shifted 1 mm along the optical axis
- Neutral density obtained by re-coating the prism surface
- As small and light as a usual color CCD camera
Telecentric Optics
(Figure: usual optics vs. telecentric optics)
- The aperture is set at the front focal plane
- Image size and intensity are equal on each image plane; only the blurring varies
- First applied to DFD by Nayar
Problems of past Depth from Defocus research
High-frequency information is lost by blur (= low-pass filter):
→ unstable range estimation
→ too sensitive to the texture or environment
→ a high-quality, noiseless image is necessary
  (e.g., Nayar averages over 256 images to eliminate noise)
If the "blur" effect is changed to a high-pass or band-pass filter, it is possible to stabilize range estimation
→ structured aperture (coded aperture)
Multi-focus camera with a Coded Aperture
- The blurring kernel is the scaled shape of the aperture
- The magnification ratio varies with object distance
Multi-focus camera with a coded aperture
(Video)
Mathematical model of blurring
(Diagram: a scene s(x, y) at distance u, imaged with focus position v, is convolved with the blur kernel a(x, y) at magnifications k1, k2, k3, giving the input images i1(x, y), i2(x, y), i3(x, y).)

i_m(x, y) = (1 / k_m^2) · a(x / k_m, y / k_m) * s(x, y)    (*: convolution)

k_m = (v − w_m) / f    (w_m: position of image plane m)
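The blurring model can be checked numerically: the kernel is the aperture's hole pattern scaled by the magnification k_m, and each input image is its convolution with the scene. A minimal numpy sketch, where the hole positions and magnifications are made-up demo values:

```python
import numpy as np

def scaled_kernel(holes, k, size):
    """Blur kernel a(x/k, y/k): the aperture's hole pattern scaled by k.

    Normalizing by the total weight plays the role of the 1/k^2
    energy factor in the continuous model.
    """
    kern = np.zeros((size, size))
    c = size // 2
    for x, y in holes:
        kern[c + int(round(k * y)), c + int(round(k * x))] += 1.0
    return kern / kern.sum()

def blur(scene, holes, k):
    """i_m(x, y) = a(x/k_m, y/k_m) * s(x, y) via circular FFT convolution."""
    kern = scaled_kernel(holes, k, scene.shape[0])
    return np.real(np.fft.ifft2(np.fft.fft2(scene) *
                                np.fft.fft2(np.fft.ifftshift(kern))))

rng = np.random.default_rng(0)
scene = rng.random((64, 64))       # textured test scene s(x, y)
holes = [(-3, 0), (3, 0)]          # hypothetical coded-aperture hole layout
i1 = blur(scene, holes, k=1.0)     # CCD 1: small magnification
i2 = blur(scene, holes, k=2.0)     # CCD 2: larger magnification
```

Because the kernel is normalized, each blurred image preserves the scene's total intensity while its texture is attenuated differently per CCD.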
Range estimation using FFT
Fourier transform of the blurring process:

I_m(p, q) = A_m(p, q, v) · S(p, q)

(p, q: spatial frequency; v: focus position)
The first factor, I_m(p, q), is calculated from the input images; the second, A_m(p, q, v), from the blurring model.
The original image information is eliminated by division, and the minimum residual is searched by varying v.
Eval. func. of range estimation
Original image info. is eliminated using division:

r(v) = Σ_{(m,n)∈s} Σ_{p,q} | I_m(p, q) / I_n(p, q) − A_m(p, q, v) / A_n(p, q, v) |
Process flow
1. Windowing & FFT
2. Eliminate scene texture info.
3. Minimize residual → range value
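The FFT-based search can be sketched end to end: compute each image's spectrum, form the spectral-ratio residual against the model kernels, and pick the candidate focus position with the minimum residual. In this toy sketch the hole layout, the magnifications, and the candidate-to-magnification mapping are all made-up values, and the residual is restricted to frequencies where the ratios are numerically stable:

```python
import numpy as np

def kernel_spectrum(holes, k, size):
    """A_m(p, q, v): FFT of the aperture pattern at magnification k (set by v)."""
    kern = np.zeros((size, size))
    c = size // 2
    for x, y in holes:
        kern[c + int(round(k * y)), c + int(round(k * x))] += 1.0
    return np.fft.fft2(np.fft.ifftshift(kern / kern.sum()))

def residual(I1, I2, holes, k1, k2, eps=1e-3):
    """r(v) = sum | I1/I2 - A1/A2 | over numerically stable frequencies."""
    A1 = kernel_spectrum(holes, k1, I1.shape[0])
    A2 = kernel_spectrum(holes, k2, I1.shape[0])
    mask = (np.abs(I2) > eps) & (np.abs(A2) > eps)
    return np.abs(I1[mask] / I2[mask] - A1[mask] / A2[mask]).sum()

rng = np.random.default_rng(1)
S = np.fft.fft2(rng.random((64, 64)))      # spectrum of a textured scene
holes = [(-3, 0), (3, 0)]                  # hypothetical 2-hole aperture
I1 = kernel_spectrum(holes, 1.0, 64) * S   # observed spectra; the true
I2 = kernel_spectrum(holes, 2.0, 64) * S   # magnifications are 1.0 and 2.0

# each candidate focus position v fixes the pair of magnifications (k1, k2)
cands = [(1.0, 2.0), (1.5, 2.5), (2.0, 3.0)]
r = [residual(I1, I2, holes, k1, k2) for k1, k2 in cands]
best = cands[int(np.argmin(r))]
```

Dividing the two spectra cancels S(p, q), so the residual depends only on how well the candidate kernels explain the observed relative defocus.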
Restoration of blur-less image
Inverse filter:

S(p, q) = Σ_{m=1}^{3} W_m · I_m(p, q) / A_m(p, q, v)

(v: focused position; W_m: weight calculated from v)
A high-quality image can be restored because multiple images are used.
Rich information remains in the images thanks to the coded aperture.
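A toy run of the weighted inverse filter, using two simulated images for brevity. Here the weights W_m are simply taken proportional to |A_m| per frequency (an assumption standing in for the paper's v-derived weights), so frequencies that a kernel suppresses contribute little to the sum:

```python
import numpy as np

def kernel_spectrum(holes, k, size):
    """A_m(p, q, v): FFT of the aperture pattern at magnification k."""
    kern = np.zeros((size, size))
    c = size // 2
    for x, y in holes:
        kern[c + int(round(k * y)), c + int(round(k * x))] += 1.0
    return np.fft.fft2(np.fft.ifftshift(kern / kern.sum()))

def restore(I, A, eps=1e-3):
    """S(p, q) = sum_m W_m * I_m / A_m  (weighted inverse filtering).

    I, A: observed-image / blur-kernel spectra stacked along axis 0.
    W_m proportional to |A_m| (an assumption, not the paper's weights),
    normalized so the weights sum to 1 at each frequency.
    """
    W = np.abs(A) + eps
    W /= W.sum(axis=0)
    A_safe = np.where(np.abs(A) < eps, eps, A)  # guard kernel zeros
    return np.real(np.fft.ifft2((W * I / A_safe).sum(axis=0)))

rng = np.random.default_rng(2)
scene = rng.random((64, 64))
holes = [(-3, 0), (3, 0)]                      # hypothetical aperture
A = np.array([kernel_spectrum(holes, k, 64) for k in (1.0, 2.0)])
I = A * np.fft.fft2(scene)                     # simulated blurred spectra
restored = restore(I, A)
```

Because the two kernels have no common spectral zeros, every frequency is recoverable from at least one image, which is why multiple images give a high-quality restoration.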
Aperture design(1)
(Plot: gain vs. spatial frequency)
The spatial frequency response must be structured for easy and stable analysis.
High-frequency information must be preserved.
Aperture design(2)
(Blurring kernels; the diameter of each hole is ignored.)
- A usual circular aperture is not optimal; this type is suited for beautiful defocused photographs.
- Blurring is not observed.
- Monotonic, and low gain when blurred.
- Feasible, but more peaks are desired.
Simple example: two-hole aperture

a(x, y) = (1/2) [ δ(x − c1, y) + δ(x + c1, y) ]

1-D Fourier transform of the blurring kernel:

A_m(s) = cos(2π c1 k_m s)

The Fourier transform of the blurring kernel is a cosine, and the period of the cosine varies with object distance.
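This cosine spectrum is easy to verify numerically in 1-D; the grid size and hole offset c1 below are arbitrary demo values:

```python
import numpy as np

N, c1 = 256, 4                 # grid size and hole offset (demo values)
s = np.arange(N) / N           # discrete spatial frequency
for k in (1, 2, 3):            # magnification k_m grows with object distance
    a = np.zeros(N)
    a[(c1 * k) % N] = 0.5      # delta at +c1*k_m
    a[(-c1 * k) % N] = 0.5     # delta at -c1*k_m
    A = np.fft.fft(a)          # spectrum of the scaled two-hole kernel
    # A_m(s) = cos(2*pi*c1*k_m*s): a cosine whose period shrinks as k_m grows
    assert np.allclose(A, np.cos(2 * np.pi * c1 * k * s))
```

Because the period of the cosine is tied to k_m, matching the observed spectral ripple against the model directly indicates the object distance.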
Robustness of range estimation
Residual of the evaluation function with varied range.
This "valley shape" shows the robustness of the depth estimation.
Experiment: input images
Last CCD
Center CCD
First CCD
Restored blur-less image
Reconstructed object shape
- 3-D object shape
- Blur-free image (partial)
Range analysis using convolution
Depth is estimated by searching for the position that gives the same images.
The blurring kernel is convolved optically; the same convolution is applied to the opposite image, and we acquire the same image (because of the commutative law of convolution).
A usual circular aperture cannot be used, because blurring twice gives almost flat images. The coded aperture enables this simple principle.
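The principle can be checked numerically: at the correct depth, convolving i1 with the other image's kernel a2 equals convolving i2 with a1, since s * a1 * a2 = s * a2 * a1 by commutativity, while a wrong depth hypothesis breaks the match. Hole layout and magnifications are made-up demo values:

```python
import numpy as np

def conv2(f, g):
    """Circular 2-D convolution via FFT."""
    return np.real(np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(g)))

def kernel(holes, k, size):
    """Aperture hole pattern scaled by magnification k, normalized."""
    a = np.zeros((size, size))
    c = size // 2
    for x, y in holes:
        a[c + int(round(k * y)), c + int(round(k * x))] += 1.0
    return np.fft.ifftshift(a / a.sum())

rng = np.random.default_rng(3)
scene = rng.random((64, 64))
holes = [(-3, 0), (2, 2), (0, -3)]            # hypothetical coded aperture
a1, a2 = kernel(holes, 1.0, 64), kernel(holes, 2.0, 64)
i1, i2 = conv2(scene, a1), conv2(scene, a2)   # the camera does this optically

cross12 = conv2(i1, a2)   # blur image 1 with image 2's kernel
cross21 = conv2(i2, a1)   # and vice versa: equal at the correct depth
wrong = conv2(i1, kernel(holes, 3.0, 64))     # wrong depth hypothesis
```

In the sensor, the search over depth hypotheses amounts to repeating this cross-convolution with candidate kernel scales and keeping the one where the two results agree.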
Experiment
Range
image
Input image
Measured scene
Asymmetric aperture design
(Figure: asymmetric aperture)
The convolution kernel changes at the focus plane (the phase part of the spatial frequency response changes).
Erroneous range estimation is suppressed using the asymmetric aperture, because the phase part of the spatial frequency response changes across the focus plane.
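A 1-D illustration of why the phase change matters: behind the focal plane the kernel is the point-reflection of the aperture, so a symmetric aperture blurs identically in front of and behind focus (an ambiguity), while an asymmetric one does not. Hole positions and weights below are arbitrary demo values:

```python
import numpy as np

def blur1d(sig, a):
    """Circular 1-D convolution via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(sig) * np.fft.fft(a)))

def mirror(a):
    """a[(-n) % N]: the kernel flipped about the origin (behind focus)."""
    return np.roll(a[::-1], 1)

rng = np.random.default_rng(4)
sig = rng.random(128)

sym = np.zeros(128);  sym[3] = sym[-3] = 0.5         # symmetric 2-hole slice
asym = np.zeros(128); asym[3] = 0.7; asym[-5] = 0.3  # asymmetric slice

front_s, back_s = blur1d(sig, sym),  blur1d(sig, mirror(sym))
front_a, back_a = blur1d(sig, asym), blur1d(sig, mirror(asym))
```

The symmetric kernel equals its own mirror, so its front/back blurs coincide and the sign of the defocus is lost; the asymmetric kernel's mirror differs, which removes the front/back ambiguity.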
Aperture symmetry and
robustness of range estimation
Input image
Motion sequence measurement using input image recording
The 3 images are recorded as an RGB image on an optical video disc recorder.
The image is deteriorated by Y/C translation and cross-talk between the RGB channels.
Result: finger motion
Input images
Range images
(Video)
Conclusions
- A small, light multi-focus camera was developed.
- A coded aperture was applied to depth from defocus.
- Stable range estimation was achieved:
  - range estimation / image restoration by FFT
  - range estimation by convolution
- Recorded images can be used for motion analysis because the range estimation is robust enough.
- Real-time range measurement is possible using image processing hardware; the simple method is easily ported to parallel hardware.
Real-time calculation using image processing hardware
- The simple convolution method can easily be ported to image processing hardware.
- The massively parallel hardware IMAP-Vision is used for the experiment.
- Spec: 10G instructions/sec with 256 PEs.
- Calculation at 25 frames/sec can be achieved. (However, this board does not have a separate RGB capture interface; the experiment is calculation only.)