State-Driven Particle Filter for Multi-Person Tracking
David Gerónimo, Frédéric Lerasle, Antonio Manuel López
Multi-person tracking is an active research topic with applications in driver assistance, surveillance, multimedia and human-robot interaction. Combined with human detectors, particle filters offer a robust method to filter noisy detections and provide temporal coherence. However, traditional problems such as occlusions with other targets or with the scene, temporal drifting, and the detection of lost targets remain unsolved, degrading the performance of tracking systems. Some authors propose to overcome these problems with heuristics that are neither well explained nor formalized in the corresponding papers, for instance by defining exceptions to the model update depending on track overlap. In this paper we propose to formalize these events by means of a state graph that explicitly defines the current state of each track (e.g., potential, tracked, occluded or lost) and the transitions between states. This approach has the advantage of linking track actions, such as online model updating, to the track state, which gives flexibility to the system. It provides an explicit representation for adapting the multiple parallel trackers to the context: each track can use a specific filtering strategy, dynamic model, number of particles, etc., depending on its state. We implement this technique in a single-camera multi-person tracker and evaluate it on public video sequences.
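To make the state-driven idea concrete, the following is a minimal sketch, assuming a Python implementation; the TrackState names follow the states listed above, but the StateConfig fields, particle counts, noise values and transition conditions are illustrative assumptions, not the paper's actual parameters. It shows how per-track actions (e.g., whether the appearance model is updated) and filter settings (number of particles, dynamic-model noise) can be bound to a track's state, with transitions driven by detection and overlap evidence.

```python
from enum import Enum, auto
from dataclasses import dataclass

class TrackState(Enum):
    POTENTIAL = auto()
    TRACKED = auto()
    OCCLUDED = auto()
    LOST = auto()

@dataclass
class StateConfig:
    num_particles: int            # filter size can vary per state
    update_appearance_model: bool # e.g., freeze the model while occluded
    process_noise: float          # looser dynamics when the target is unseen

# Hypothetical per-state settings; the paper only states that such
# parameters may differ per state, not these concrete values.
STATE_CONFIG = {
    TrackState.POTENTIAL: StateConfig(num_particles=50,  update_appearance_model=False, process_noise=1.0),
    TrackState.TRACKED:   StateConfig(num_particles=100, update_appearance_model=True,  process_noise=1.0),
    TrackState.OCCLUDED:  StateConfig(num_particles=150, update_appearance_model=False, process_noise=2.0),
    TrackState.LOST:      StateConfig(num_particles=0,   update_appearance_model=False, process_noise=0.0),
}

def next_state(state: TrackState, has_detection: bool,
               overlaps_other_track: bool, frames_unseen: int,
               max_unseen: int = 10) -> TrackState:
    """Hypothetical transition function over the state graph; the actual
    transition conditions used by the authors may differ."""
    if state is TrackState.POTENTIAL:
        # A potential track is confirmed by a supporting detection.
        return TrackState.TRACKED if has_detection else TrackState.LOST
    if state is TrackState.TRACKED:
        if overlaps_other_track:
            return TrackState.OCCLUDED
        if not has_detection and frames_unseen > max_unseen:
            return TrackState.LOST
        return TrackState.TRACKED
    if state is TrackState.OCCLUDED:
        if has_detection and not overlaps_other_track:
            return TrackState.TRACKED
        if frames_unseen > max_unseen:
            return TrackState.LOST
        return TrackState.OCCLUDED
    return TrackState.LOST
```

Binding the configuration lookup to the transition function is what gives each parallel tracker its context-dependent behavior: after every frame, a track calls next_state with its current evidence and then reads STATE_CONFIG for the filter settings to use on the following frame.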