Published Papers

2006.01

Matching gait image sequences in the frequency domain for tracking people at a distance

Ryusuke Sagawa, Yasushi Makihara, Tomio Echigo, Yasushi Yagi

Abstract

This paper describes a new method to track walking people by matching their gait image sequences in the frequency domain. When a person walks at a distance from a camera, that person often appears and disappears, either occluded by other people or objects, or by leaving the field of view. It is therefore important to track the person by establishing correspondence between the image sequences before and after the disappearance. For tracking, computational time is a more crucial factor than it is for identification. We create a three-dimensional volume by piling up an image sequence of human walking. After using Fourier analysis to extract the frequency characteristics of the volume, our method computes the similarity of two volumes. We propose computing the correlation of the amplitudes of the principal frequencies to reduce the cost of comparison. Finally, we experimentally test our method and validate that the amplitudes of the principal frequencies and spatial information are important for discriminating gait image sequences.
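The pipeline described above (pile silhouette frames into a spatio-temporal volume, take a Fourier transform along time, compare the amplitudes of a few low-order frequency components by correlation) can be sketched as follows. This is a minimal illustration of the general idea, not the paper's implementation; the function name `gait_similarity`, the parameter `n_freq`, and the choice of normalized correlation are assumptions for the sake of the example.

```python
import numpy as np

def gait_similarity(seq_a, seq_b, n_freq=5):
    """Hypothetical sketch: compare two gait silhouette sequences,
    each an array of shape (T, H, W), via the amplitudes of their
    principal temporal frequencies. Assumes equal sequence length T.
    """
    def principal_amplitudes(vol):
        # FFT along the time axis of the spatio-temporal volume;
        # the amplitude spectrum is phase-invariant, so it is robust
        # to the walking phase at which each sequence starts.
        spec = np.fft.rfft(vol, axis=0)
        # keep only the lowest n_freq components (incl. the DC term),
        # which reduces the cost of comparison.
        return np.abs(spec)[:n_freq]

    a = principal_amplitudes(np.asarray(seq_a, dtype=float))
    b = principal_amplitudes(np.asarray(seq_b, dtype=float))

    # normalized cross-correlation of the amplitude features;
    # spatial layout is preserved because (H, W) axes are not pooled.
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())
```

Keeping the per-pixel amplitude maps rather than a single global spectrum reflects the abstract's finding that both the principal-frequency amplitudes and spatial information matter for discrimination.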