A Motion-Based Particle Filter and
Randomly Perturbed Active Contours for Object Tracking

Many real-world applications
require accurate object tracking. Traditional applications
include video surveillance, autonomous vehicle navigation,
human-computer interfaces, robot localization, etc. Recent
advances in computer, multimedia and communication technologies
have created opportunities in new applications such as wireless
communication, interactive imaging and virtual reality.
Object tracking is also a very challenging task: the target's
state-space representation can be highly non-linear, and the
observations (e.g., audio and/or visual sensory data) are almost
always corrupted by background clutter.
In a probabilistic
framework, the tracking problem is formulated as the estimation
of the posterior density of the target state given all past
observations. Since the target's dynamics and observations can
be highly non-linear and non-Gaussian, the Kalman filter fails
to track objects in real-world environments. Sequential Monte
Carlo filters, also known as particle filters, use a
non-parametric technique based on a set of random samples,
called particles, to estimate the posterior density. The samples
are drawn from a proposal density that is easy to sample from,
and each sample is assigned an importance weight to account for
the discrepancy between the posterior and proposal densities.
In theory, if the number of samples is sufficiently large, the
sample approximation of the posterior distribution can be made
arbitrarily accurate; in practice, only a finite number of
samples can be used. Moreover, when a good dynamic model of the
target is not available, the particle filter samples regions of
the conditional density function that are not in the vicinity of
the modes associated with the tracked object. Consequently, the
tracker loses sight of the tracked object and instead follows
spurious objects and clutter.
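The predict/weight/resample cycle described above can be illustrated with a minimal bootstrap particle filter on a 1-D toy target. This is a hedged sketch, not the tracker developed in this work: the random-walk dynamics, Gaussian observation likelihood, and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, observation,
                         dynamics_std=1.0, obs_std=0.5):
    """One predict/update/resample cycle of a bootstrap particle filter.

    The proposal is the dynamics prior (here an assumed random-walk
    model), so each particle's importance weight reduces to its
    observation likelihood.
    """
    # Predict: propagate particles through the (assumed) dynamic model.
    particles = particles + rng.normal(0.0, dynamics_std,
                                       size=particles.shape)
    # Update: weight each particle by a Gaussian observation likelihood.
    weights = np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights so that
    # samples concentrate near the posterior modes.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Toy run: a 1-D target moving at speed 1.0, observed with noise.
n = 500
particles = rng.normal(0.0, 2.0, size=n)
weights = np.full(n, 1.0 / n)
true_state = 0.0
for t in range(30):
    true_state += 1.0
    obs = true_state + rng.normal(0.0, 0.5)
    particles, weights = particle_filter_step(particles, weights, obs)

estimate = np.dot(weights, particles)  # posterior-mean estimate
```

Note that the filter tracks the moving target even though its assumed dynamics (a zero-mean random walk) do not match the true constant-velocity motion; this mismatch is exactly the situation where the weak-dynamic-model degeneracy discussed above begins to appear.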
To overcome these
difficulties, we introduce motion cues into the
particle-filtering framework. We develop a motion-based
particle filter that is robust to sharp movements of the tracked
object while propagating only a few particles, thus achieving
both robustness and efficiency. For post-tracking contour
refinement, we use a 1-D causal active contour representation
based on dynamic programming to find the best local contour
delineating a non-rigid object. Since the traditional active
contour model is sensitive to its model parameters and initial
condition, as a consequence of local minima in the cost
function, we improve the convergence of the active contour by
performing the optimization over multiple randomly sampled
initial conditions. Our experiments on object tracking in
challenging real-world videos demonstrate the dramatic
improvement achieved by the proposed motion-based particle
filter and randomly perturbed active contour system.
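A 1-D causal, dynamic-programming contour search of the kind mentioned above can be sketched as a Viterbi-style pass over a polar cost grid, choosing one radius per angular column. The cost grid, smoothness penalty, and function below are illustrative assumptions, not the exact formulation used in this work.

```python
import numpy as np

def dp_contour(cost, smooth=0.5):
    """Pick one radius index per column of `cost` minimizing the sum of
    image costs plus a smoothness penalty |r_i - r_{i-1}| between
    neighboring columns (Viterbi-style dynamic programming)."""
    n_cols, n_r = cost.shape
    radii = np.arange(n_r)
    # pen[j, k]: smoothness price of moving from radius k to radius j.
    pen = smooth * np.abs(radii[:, None] - radii[None, :])
    total = cost[0].astype(float).copy()
    back = np.zeros((n_cols, n_r), dtype=int)
    for i in range(1, n_cols):
        trans = total[None, :] + pen          # cost via each predecessor
        back[i] = np.argmin(trans, axis=1)    # best predecessor per radius
        total = cost[i] + np.min(trans, axis=1)
    # Backtrack the cheapest path from the last column.
    path = np.empty(n_cols, dtype=int)
    path[-1] = int(np.argmin(total))
    for i in range(n_cols - 1, 0, -1):
        path[i - 1] = back[i, path[i]]
    return path

# Toy cost grid: low-cost (edge-like) cells at radii [2, 2, 3, 3, 2].
cost = np.ones((5, 5))
for col, r in enumerate([2, 2, 3, 3, 2]):
    cost[col, r] = 0.0
contour = dp_contour(cost, smooth=0.5)
```

Because each column depends only on its predecessor, the search is causal and finds the globally cheapest path through the grid in O(columns × radii²) time.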
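The benefit of optimizing from multiple randomly sampled initial conditions can be seen on a toy multi-modal energy. The 1-D function, gradient-descent settings, and restart count below are illustrative assumptions, not the active contour energy used in this work.

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_descent(df, x0, steps=500, lr=0.01):
    """Plain gradient descent with a clipped step for stability."""
    x = x0
    for _ in range(steps):
        g = float(np.clip(df(x), -10.0, 10.0))
        x -= lr * g
    return x

# Multi-modal toy energy: a local minimum near x ~ 1.13 and the
# global minimum near x ~ -1.30.
f = lambda x: x**4 - 3 * x**2 + x
df = lambda x: 4 * x**3 - 6 * x + 1

# A single run from x0 = 1.0 gets trapped in the local minimum.
local = grad_descent(df, 1.0)

# Multistart: randomly perturb the initial condition and keep the
# lowest-energy result, which can escape the local basin.
starts = 1.0 + rng.normal(0.0, 2.0, size=30)
best = min((grad_descent(df, x0) for x0 in starts), key=f)
```

Each restart is an independent local optimization, so the extra cost is linear in the number of perturbed initial conditions, while the chance of missing the global basin shrinks rapidly as restarts are added.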
N. Bouaynaya and D. Schonfeld, “On the Optimality of Motion-Based
IEEE Transactions on Circuits and Systems for Video Technology,
N. Bouaynaya and D. Schonfeld, "Motion-based particle filtering
for head tracking applications" (invited paper), Electronic
Imaging Newsletter, vol. 15, no. 2, p. 8, 2005.
N. Bouaynaya and D. Schonfeld, "Complete system for head
tracking using motion-based particle filter and randomly
perturbed active contour," in Proceedings of SPIE, Image and
Video Communications and Processing (IVCP'05), vol. 5685,
March 2005, pp. 864-873. (Finalist for Best Student Paper
Award.)
N. Bouaynaya, W. Qu and D. Schonfeld, "An Online Motion-Based
Particle Filter for Head Tracking Applications," in IEEE
International Conference on Acoustics, Speech, and Signal
Processing (ICASSP'05), vol. 2, March 18-23, 2005, pp. 225-228.
W. Qu, N. Bouaynaya and D. Schonfeld, "Automatic Multi-Head
Detection and Tracking System Using a Novel Detection-Based
Particle Filter and Data Fusion," in IEEE International
Conference on Acoustics, Speech, and Signal Processing
(ICASSP'05), vol. 2, March 18-23, 2005, pp. 661-664.