
An efficient object tracking algorithm based on dynamic particle filter

Introduction

Tracking of moving non-rigid objects in a video sequence is a frequently solved task in computer vision systems. For this purpose, dynamic models which describe the motion of the object are used, or algorithms which measure certain features of the object, e.g., roundness, color or edges. Measuring features or motion is not possible without the impact of noise; therefore, every measurement can be treated as a probabilistic one. When the motion or the features of the object can be described with a linear model, linear methods can be used for tracking [1-3]. However, many object tracking cases are non-linear and can be solved using a particle filter.

The particle filter is an algorithm based on probabilistic measurements which does not use the probability density function directly but approximates it by a certain number of particles. Each particle is described by its position and its weight, which is a certain measure of probability or similarity. Such approximation of the probability density function is also called a sequential Monte Carlo method [4]. The particle filter algorithm is recursive in nature and passes through three stages: evolution, re-sampling and propagation. After every computation, each particle is modified anew and propagated based on the latest measurements and the motion model. During the re-sampling stage, the particles with the smallest weight values are eliminated. A minimal sketch of one such recursion is given below.
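To make the three stages concrete, here is a minimal single-recursion sketch in Python/NumPy. It is a generic bootstrap filter, not the exact filter of this paper; the measure and propagate callables stand in for the measurement and motion models described in the following sections.

```python
import numpy as np

def particle_filter_step(particles, weights, measure, propagate):
    """One recursion of a generic particle filter (illustrative sketch).

    particles : (n, d) array of state hypotheses
    measure   : callable mapping one particle to a likelihood
    propagate : callable applying the motion model to the particle array
    """
    n = len(particles)

    # Evolution: weight each particle by the latest measurement.
    weights = np.array([measure(p) for p in particles], dtype=float)
    if weights.sum() == 0:              # nothing matched: keep all particles
        weights = np.full(n, 1.0 / n)
    weights /= weights.sum()

    # Re-sampling: eliminate low-weight particles, duplicate strong ones.
    idx = np.random.choice(n, size=n, p=weights)
    particles = particles[idx]

    # Propagation: move the survivors according to the motion model.
    particles = propagate(particles)
    return particles, np.full(n, 1.0 / n)
```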

The particle filter can be used in various object tracking systems, e.g., car detection and tracking, vision-based security systems, etc. [6]. In [5], an adaptive particle filter was proposed for the recognition of suddenly appearing faces. The authors of that paper used two time series models and one presumption for face recognition and tracking. The first adaptive model describes the motion and appearance process from the video data; the second model describes the differences between the tracked face and the face database. The presumption is that all faces in the database are in frontal view.

The color features of the observable object are frequently used for describing the tracking target and separating the foreground from a complex background. Color-based tracking can be implemented using color density images [7], color histograms [8], etc. [10-12, 16-18]. The tracking algorithm proposed in this paper is similar to the CONDENSATION algorithm [9], which is dedicated to tracking the object contour. However, the algorithm presented here tracks the color features of the observable object. Gaussian random noise is used as the motion model for predicting the object movement. Particles distributed according to such a motion model can follow any movement of the object.

In this paper, the problem of tracking non-rigid objects against a complex background is solved using a dynamic particle filter, where the number of particles depends on the rate of change of the object location. The proposed approach automatically uses fewer particles when the position of the object does not change rapidly from one frame to the next, and more particles when the object moves relatively fast.

Since the object tracking algorithm is quite complicated, there is a need for a formal description and validation of the algorithm. The dynamic PLA notation DPLA [15] was chosen for the formal specification of the proposed algorithm, as it provides a possibility to define the variation of the number of particles in the tasks being solved.

[FIGURE 1 OMITTED]

The paper is organized as follows. The second section explains the proposed dynamic particle filter in conceptual and formal terms. The third section introduces the experimental setup and results. Finally, conclusions and future work are presented in the last section.

Conceptual model of the dynamic particle filter

This section briefly explains the proposed dynamic particle filter. As mentioned, the particle filter passes through three stages: evolution, re-sampling and propagation. Each stage is executed in sequence. During the evolution stage, the weight of each generated particle is computed based on the measure model. In this paper, the measure model uses the correlation between the template HSV color histogram and the histogram of the selected region (see Fig. 2); the model is sketched after Fig. 2. If the histogram of a particle is similar to the template histogram, the weight of the particle is set to 1, and otherwise to 0. The particles with the smallest weight values are eliminated during the re-sampling stage. New particles are generated during the propagation stage according to Gaussian random noise and the information about the last object location. The real location of the object is the stable point of the particle cloud. The object location found in frame t-1 is used as the central point for particle propagation.

[FIGURE 2 OMITTED]
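As an illustration of this measure model, the following sketch builds a normalized hue-saturation histogram of an image patch and assigns the binary weight. The bin counts, the 0-180 hue range (OpenCV's HSV convention) and the threshold beta are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def hs_histogram(patch_hsv, bins=(16, 16)):
    """Normalized 2-D hue-saturation histogram of an HSV image patch."""
    h = patch_hsv[..., 0].ravel()
    s = patch_hsv[..., 1].ravel()
    hist, _, _ = np.histogram2d(h, s, bins=bins, range=[[0, 180], [0, 256]])
    return hist.ravel() / max(hist.sum(), 1)

def particle_weight(patch_hsv, template_hist, beta=0.8):
    """Binary weight: 1 if the patch histogram correlates with the
    template above the threshold beta, otherwise 0."""
    q = hs_histogram(patch_hsv)
    zeta = np.corrcoef(q, template_hist)[0, 1]
    return 1.0 if zeta >= beta else 0.0
```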

Formal description of the dynamic particle filter

The dynamic particle filter algorithm was formalized using the formal DPLA (dynamic PLA) notation [15], which is an extension of PLA [14]. Dynamic PLA is able to describe various dynamic changes of systems. In the dynamic PLA specification, the proposed algorithm is described using the internal events of the decision making (DM) aggregate, which decompose the dynamic particle filter into seven stages (a schematic dispatch of these stages is sketched after the list):

e''_1 -- the selection of the number of particles;

e''_2 -- the estimation of the histograms and positions of generated particles;

e''_3 -- the comparison of histograms;

e''_4 -- the selection of proper particles;

e''_5 -- the estimation of the maximum distance among selected particles;

e''_6 -- the selection of the proper group of particles using clustering;

e''_7 -- the estimation of the new average coordinates.
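The order of these internal events can be pictured as a simple dispatch pipeline. The sketch below uses hypothetical stub handlers (the DPLA specification fixes only the order and the condition on e''_6); it illustrates the control flow, not the formalism itself.

```python
def _stub(name):
    """Placeholder handler; a real implementation would mutate the state."""
    def handler(state):
        state.setdefault("trace", []).append(name)
        return state
    return handler

EVENT_PIPELINE = [
    ("e1", _stub("select particle count")),
    ("e2", _stub("estimate histograms and positions")),
    ("e3", _stub("compare histograms")),
    ("e4", _stub("select proper particles")),
    ("e5", _stub("estimate max distance")),
    ("e6", _stub("cluster particles")),          # conditional stage
    ("e7", _stub("estimate new coordinates")),
]

def process_frame(state):
    """Run internal events e1..e7 in order; e6 fires only when the maximum
    particle distance reaches the threshold alpha (see H(e''_5) below)."""
    for name, handler in EVENT_PIPELINE:
        if name == "e6" and state.get("max_dist", 0.0) < state["alpha"]:
            continue
        state = handler(state)
    return state
```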

In the dynamic PLA specification, the particles d_i are interpreted as elements of the set D(t_m), which is defined in the discrete component of the aggregate DM (see Fig. 3). During particular internal events, new attributes such as coordinates, histograms and correlation coefficients are added to the particles.

[FIGURE 3 OMITTED]

The initial state Z(t_0) of the aggregate DM, which includes all attributes of the aggregate at the initial time moment, is described below:

Z(t_0) = {X(t_0), Y(t_0), E'(t_0), E''(t_0), z_v(t_0), ν(t_0)}, (1)

where X(t_0) = {x} is the set of input signals, Y(t_0) = {y} the set of output signals, E'(t_0) = {e'_1} the set of external events, E''(t_0) = {e''_i}, i = 1,…,7, the set of internal events, and z_v(t_0) = {w(e''_i, t_m)}, i = 1,…,7, the continuous component.

Discrete component:

ν(t_0) = {D(t_0), k(t_0), σ_x(t_0), σ_y(t_0), C(t_0), MAX(t_0), reg, red, exp, β, α, μ_r}, (2)

where D is the set of particles; k the number of the picture; C the class of particles at the initial time; MAX the maximum distance between particles; σ_x, σ_y the standard deviations; α = const the threshold value of the distance; reg = const the regular number of particles; red = const the reduced number of particles; exp = const the expanded number of particles; β = const the threshold value of the correlation coefficient; μ_r = const the threshold value of movement.

The set of transition operators:

H(t_0) = {H(e'_1), H(e''_i)}, i = 1,…,7. (3)

The set of output operators:

G(t_0) = {G(e'_1), G(e''_i)}, i = 1,…,7, (4)

where A(t_0) = {∅} is the set of initial aggregates and R(t_0) = {∅} the set of internal links.

As external event e'_1, the aggregate DM receives the picture (input signal x) generated by the camera. Whenever processing of the picture is finished, the aggregate sends the output signal y back to the camera and asks for another picture. With the first input signal, k(t_m) = 0, the average coordinates μ_x, μ_y of the initial position and the template histogram p(j) of the tracked object are determined and added to the discrete component of the aggregate. Finally, the first internal event e''_1 is invoked regardless of the value of the variable k. The transition operator of the external event is defined below:

H(e'_1): w(e''_1, t_{m+1}) = t_m + η^1_k, k(t_m) ≥ 1. (5)

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII]. (6)

The tracking process is described by the transition operators of the internal events, performed in the appropriate order.

The selection of the number of particles is performed first, by the internal event H(e''_1):

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII], (7)

where D(t_m) is the set of particles.

The number n of particles can be regular (reg), reduced (red) or expanded (exp). It is chosen according to the position shift of the tracked object between two contiguous results. The position shift

Δ(t_m) = √[(μ_x(t_m) - μ_x(t_{m-1}))² + (μ_y(t_m) - μ_y(t_{m-1}))²] (8)

is compared with μ_r, whose value depends on the ratio of the observable object size to the total image size. This value was determined experimentally in our example.
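The exact decision rule and its constants are not reproducible from the text, so the following sketch only illustrates the idea. The budget levels 250/500/3000 come from the experiments reported below, while the two-threshold split and the sigma values are hypothetical placeholders.

```python
import numpy as np

def select_particle_budget(mu_prev, mu_curr, mu_r,
                           red=250, reg=500, exp=3000,
                           sigma_small=5.0, sigma_large=40.0):
    """Choose the particle count and spread from the position shift.

    mu_prev, mu_curr : (x, y) object positions in frames t-1 and t
    mu_r             : experimentally chosen movement threshold
    The 3*mu_r split below is a guess; the paper does not give the rule.
    """
    shift = np.hypot(mu_curr[0] - mu_prev[0], mu_curr[1] - mu_prev[1])
    if shift < mu_r:            # object nearly static: reduced budget
        return red, sigma_small
    elif shift < 3 * mu_r:      # moderate motion: regular budget
        return reg, sigma_small
    else:                       # fast motion or lost target: expanded budget
        return exp, sigma_large
```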

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII]. (9)

The standard deviations σ_x(t_m) and σ_y(t_m) are also selected by comparison with this value.

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII]. (10)

The coordinates of the selected particles are generated using a Gaussian distribution with the estimated averages μ_x(t_m), μ_y(t_m) and standard deviations σ_x(t_m), σ_y(t_m). According to the obtained coordinates, the histogram q_i(j) of each particle is determined. The estimation of the histograms and positions of the generated particles H(e''_2):

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII], (11)

where

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII] (12)

After this event, the third internal event e''_3 is invoked.
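A minimal sketch of this generation step, assuming independent Gaussian noise in each coordinate:

```python
import numpy as np

def generate_particles(n, mu_x, mu_y, sigma_x, sigma_y, rng=None):
    """Draw n particle coordinates around the last known object position
    using the Gaussian random-noise motion model."""
    if rng is None:
        rng = np.random.default_rng()
    xs = rng.normal(mu_x, sigma_x, size=n)
    ys = rng.normal(mu_y, sigma_y, size=n)
    return np.column_stack([xs, ys])    # shape (n, 2)
```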

The comparison of histograms H(e''_3):

For each particle, the correlation coefficient ζ_i is determined by comparing its histogram q_i(j) with the template histogram p(j) (q̄_i and p̄ below denote the mean bin values):

ζ_i = Σ_j (q_i(j) - q̄_i)(p(j) - p̄) / √[Σ_j (q_i(j) - q̄_i)² Σ_j (p(j) - p̄)²]. (13)

All particles whose weight ζ_i is equal to or greater than β are considered proper, and all others are eliminated from the set of particles D(t_m). If the set D(t_m) contains no particles which satisfy the weight requirement, the algorithm starts from the beginning, i.e., new particles are generated in the internal event e''_1. Otherwise, the internal event e''_5 is invoked.
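The comparison and selection steps can be sketched together as follows; np.corrcoef stands in for the correlation estimator, and an empty survivor list signals the restart through e''_1.

```python
import numpy as np

def select_proper_particles(histograms, template, beta):
    """Return indices of particles whose histogram correlates with the
    template at least beta; an empty list means: regenerate (event e''_1)."""
    keep = []
    for i, q in enumerate(histograms):
        zeta = np.corrcoef(q, template)[0, 1]
        if zeta >= beta:
            keep.append(i)
    return keep
```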

The selection of proper particles H(e''_4) is described below:

D(t_{m+1}) = D(t_m) \ d_i, d_i ∈ D(t_m) | (14)

ζ_i < β; (15)

w(e''_5, t_{m+1}) = t_m + η^5_k, if D(t_{m+1}) ≠ ∅; (16)

w(e''_1, t_{m+1}) = t_m + η^1_k, if D(t_{m+1}) = ∅. (17)

If the particles selected in the event e''_4 are distributed into a few separate classes, the maximum distance between particles is estimated. If this distance md exceeds α, clustering e''_6 is started (a sketch follows these equations). Otherwise, the new average coordinates are estimated from the set of particles C(t_m) and the location of the tracked object is updated during the internal event e''_7.

The estimation of the maximum distance among particles H(e''_5):

MAX(t_{m+1}) = max D(d_i, d_j), d_i, d_j ∈ D(t_m), (18)

where md = max(D(d_i, d_j)) over all particle pairs, MAX(t_m) = md, and d_1, d_2 are the most remote pair, i.e., D(d_1, d_2) = MAX(t_m), and

w(e''_6, t_{m+1}) = t_m + η^6_k, if MAX(t_m) ≥ α. (19)

w(e''_7, t_{m+1}) = t_m + η^7_k, if MAX(t_m) < α. (20)
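A direct way to compute MAX(t_m) together with the most remote pair of particles is sketched below; the O(n²) distance matrix is acceptable for a few thousand particles.

```python
import numpy as np

def max_pairwise_distance(coords):
    """Maximum Euclidean distance among particle coordinates and the most
    remote pair (used as the initial centroids for clustering)."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)           # (n, n) distance matrix
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    return dist[i, j], coords[i], coords[j]
```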

The selection of the proper group of particles using the k-means clustering method H(e''_6):

The clustering method is used for grouping the particles. Here, however, the initial centroids d̄_1 and d̄_2, which are the coordinates of the most remote particles, are determined in advance (a combined sketch follows equation (25)):

d̄_1 = (x̄_1, ȳ_1), d̄_2 = (x̄_2, ȳ_2). (21)

Based on d̄_1 and d̄_2, a distribution into two groups C_1, C_2 is performed. The distance from the particles d̄_1 and d̄_2 to all the others is estimated:

D(d̄_k, d_i) = √[(x̄_k - x_i)² + (ȳ_k - y_i)²], k = 1, 2. (22)

All particles are grouped into two clusters:

d_i ∈ C_1, if D(d̄_1, d_i) ≤ D(d̄_2, d_i); otherwise d_i ∈ C_2. (23)

The cluster which has more particles is selected for processing:

C(t_{m+1}) = C_1, if |C_1| ≥ |C_2|; otherwise C(t_{m+1}) = C_2. (24)

w(e''_7, t_{m+1}) = t_m + η^7_k. (25)
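Both the grouping and the choice of the larger cluster can be sketched in a few lines; the function below also returns the mean position of the winning group, which the following event e''_7 uses as the new object location.

```python
import numpy as np

def split_and_select(coords, d1, d2):
    """Assign each particle to the nearer of the two extreme centroids
    d1, d2 and return the mean position of the larger group."""
    dist1 = np.linalg.norm(coords - d1, axis=1)
    dist2 = np.linalg.norm(coords - d2, axis=1)
    group1 = coords[dist1 <= dist2]
    group2 = coords[dist1 > dist2]
    winner = group1 if len(group1) >= len(group2) else group2
    return winner.mean(axis=0)      # new (mu_x, mu_y) of the tracked object
```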

After all these actions, the internal event e''_7 is invoked. During this event, the new average coordinates are estimated. The tracked position μ_x(t_m), μ_y(t_m) of the object is equal to the average coordinates of all particles from the set C(t_m):

μ_x(t_{m+1}) = (1/|C(t_m)|) Σ_{d_i ∈ C(t_m)} x_i, μ_y(t_{m+1}) = (1/|C(t_m)|) Σ_{d_i ∈ C(t_m)} y_i. (26)

Since the processing of the current picture is finished after this internal event, the output signal Y = y_1 is generated.

The formal specification described above gives the requirements for the practical realization of the filter, defining the algorithm as seven stages (the internal events in the specification).

Experiments

In order to test the proposed dynamic particle filter, a video of 174 colored frames (size 640x480) of a randomly moving ball was taken. A complex scene with a colored background and objects of a color similar to the target object was designed to check the robustness of the proposed approach. The video data was gathered with an ordinary USB web camera in daylight (illuminance about 1500 lux). The template histogram of the observable object was selected from a region of 30x30 pixels in the first frame. The starting position of the ball and the size of the region of interest are selected by the user. The subsequent positions of the ball are tracked using the dynamic particle filter.

The test trajectory of the ball movement was obtained by measuring the real position of the ball in each frame by hand. This trajectory was compared with the trajectory obtained by the proposed approach. The threshold of the minimum distance between two ball positions in two different frames was varied from 0 to 5. Three numbers of particles (250, 500 and 3000) were used in the experiment. The mean execution time, the mean absolute error between the two trajectories and the particle usage were measured during the experiment. The tracking results are shown in Fig. 5-8.

Fig. 4 shows an image of the experimental environment; the target trajectory is colored black and the measured trajectory is colored white. The illumination of the background is not evenly distributed; therefore, the measured trajectory differs slightly from the target trajectory in the places where the illumination is stronger.

[FIGURE 4 OMITTED]

[FIGURE 5 OMITTED]

The coordinates of the target and measured trajectories extended in the time domain are shown in Fig. 5. It is clearly visible that the curves are similar. The proposed algorithm tracks the position of the observable object efficiently and robustly with a relatively small offset (see Fig. 5).

The mean absolute error between the target and measured trajectories was computed for different threshold values of the minimum distance. The curve is shown in Fig. 6. The MAE increases when the threshold value increases. A sketch of this error measure is given below.
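The paper does not state whether the error is taken per coordinate or as a Euclidean distance per frame; the sketch below uses the per-coordinate mean absolute error.

```python
import numpy as np

def trajectory_mae(target, measured):
    """Mean absolute error between two (n, 2) trajectories, averaged over
    frames and both coordinates."""
    return float(np.mean(np.abs(np.asarray(target) - np.asarray(measured))))
```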

The mean execution time is the time spent to track the position of the observable object in each video frame. The execution time was obtained using MATLAB; therefore, comparisons with the results of similar experiments should take this fact into account. The numerical relation between the mean execution time and the threshold of the minimum distance is shown in Fig. 7. The curve shows that the execution time decreases when the threshold value is increased.

[FIGURE 6 OMITTED]

[FIGURE 7 OMITTED]

More time (about 0.85 s) is needed to track the position of the observable object when the threshold is equal to 0, and less (about 0.55 s) when the threshold is equal to 5. This can be explained by the relation between the particle usage and the threshold of the minimum distance. The particle usage shows how many times a certain number of particles was used during the total video processing (see Fig. 8). The proposed algorithm uses 500 particles in each frame when the threshold of the minimum distance is equal to 0. When the threshold is equal to 5, the algorithm uses 250 particles more frequently than 500 particles per frame. 3000 particles are used only in a few frames, when the location of the observable object has changed rapidly from one frame to another. This number of particles is sufficient to cover almost the full image and to recover the position of the lost object.

[FIGURE 8 OMITTED]

The tracking algorithm based on the dynamic particle filter dynamically changes the number of particles and allows the computation time to be decreased. If fewer particles are used, the real position of the observable object can still be found, with a certain error (see Fig. 6). However, the mean absolute error increases less than the computation time decreases: by 19 % and 35 %, respectively.

Conclusions and future work

An algorithm of the dynamic particle filter was presented which enables the tracking of a non-rigid object against a complex background. The proposed algorithm was formalized using the DPLA notation, which allowed us to specify the dynamics of the real example in a mathematical, unambiguous and understandable manner. Experimental results show that the proposed dynamic particle filter can successfully follow the object when it moves against a complex background.

Future work will incorporate the extension of the algorithm with automatic selection of the template histogram which takes topological information into account, detection of suddenly appearing objects, and tracking of multiple similar objects. The proposed algorithm can be used for associative communication technologies and vision-based security systems.

Acknowledgments

The authors appreciate the financial support of the Lithuanian State Science and Studies Foundation.

Received 2009 01 28

References

[1.] Forsyth D. A., Ponce J. Computer vision: a modern approach // Englewood Cliffs, NJ: Prentice Hall.--2002.--P. 235-240.

[2.] Trucco E., Verri A. Introductory techniques for 3D computer vision // Englewood Cliffs, NJ: Prentice Hall, 1998.--P. 125-130.

[3.] Paragios N., Chen Y., Faugeras O. Handbook of mathematical models in computer vision // Springer. ISBN 0387-26371.--2006.--P. 293-307.

[4.] Vermaak J., Ikoma N., Godsill S. J. Sequential Monte Carlo framework for extended object tracking // IEE Proc.--Radar Sonar Navig.--2005.--Vol. 152, No. 2.--P. 353-363.

[5.] Shaohua Z., Chellappa R., Moghaddam B. Adaptive visual tracking and recognition using particle filters // ICME'03 Proc. 2003 Int. conf. on Multimedia and Expo.--2003.--Vol. 2.--P. 349-352.

[6.] Shaohua Z., Chellappa R., Moghaddam B. Appearance tracking using adaptive models in a particle filter // Proc. of the 6th Asian conf. on Computer Vision.--2004.--P. 86-90.

[7.] Martinez-del-Rincon J., Orrite-Urunuela C., Herrero-Jaraba J. E. An efficient particle filter for color-based tracking in complex scene // Conf. on Advanced Video and Signal Based Surveillance.--2007.--P. 176-181.

[8.] Barrero P., Canas M. J., Matellan V. Visual object tracking in 3D with color based particle filter // Proc. of World Academy of Science, Engineering and Technology.--2005.--Vol. 4.--P. 200-203.

[9.] Isard M., Blake A. CONDENSATION--conditional density propagation for visual tracking // Int. Journal of Computer Vision.--1998.--Vol. 29, No. 1.--P. 5-28.

[10.] Rekleitis I., Dudek G., Milios E. Probabilistic cooperative localization and mapping in practice // IEEE Int. Conf. on Robotics and Automation (ICRA'03).--2003.--Vol. 2.--P. 1907-1912.

[11.] Fujun P., Pingyuan C., Yangzhou C. Adaptive MCMC particle filter for nonlinear and non-Gaussian state estimation // Int. Conf. on Innovative Computing Information and Control (ICICIC'08).--2008.--P. 494-494.

[12.] Kurazume R., Yamada H., Murakami K., Iwashita Y., Hasegawa T. Target tracking using SIR and MCMC particle filters by multiple cameras and laser range finders // IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS'08).--2008.--P. 3838-3844.

[13.] Thomas C. F., Phan L. L., Gerald S. R., Don M. T. Scalp electrode impedance, infection risk and EEG data quality // Clin. Neurophysiol.--2001.--P. 536-544.

[14.] Pranevicius H. Formal Specification and Analysis of Computer Network Protocols: Aggregate Approach // Kaunas: Technologija, 2004.--P. 50-180.

[15.] Pranevicius H., Makackas D., Paulauskaite-Taraseviciene A. Multi-Agent Systems Formalization Using Formal Method DPLA // The 21st annual European Simulation and Modelling Conference, St.Julians, Malta, 2007.--P. 427-431.

[16.] Paulinas M., Miniotas D., Meilunas M., Usinskas A. An Algorithm for Segmentation of Blood Vessels in Images // Int. Journal of Electronics and Electrical engineering.--2008.--No. 3(83).--P. 25-28.

[17.] Bistrov V. Analyse of Kalman Algorithm for Different Movement Modes of Land Mobile Object// Int. Journal of Electronics and Electrical engineering.--2008.--No. 6(86).--P. 89-92.

[18.] Laptik R., Navakauskas D. MAX-MIN Ant System in Image Processing // Int. Journal of Electronics and Electrical engineering.--2009.--No. 1(89).--P. 21-24.

V. Raudonis, R. Simutis

Department of Process Control, Kaunas University of Technology, Studentu str. 50, Kaunas, Lithuania, phone: 300296, e-mail: rimvydas.simutis@ktu.lt

A. Paulauskaite-Taraseviciene

Department of Business Informatics, Kaunas University of Technology, Studentu str. 56, Kaunas, Lithuania, phone: 300378, e-mail: agne@ifko.ktu.lt