Depth-Based Real-Time Gait Recognition
Abstract
Each person exhibits unique movement patterns during gait cycles, and this information can be extracted from a live video stream and used for subject identification. In recent years, there has been a profusion of sensors that, in addition to RGB video images, also provide depth data in real time. In this paper, a method is proposed that enhances appearance-based gait recognition by integrating features extracted from depth data. Two approaches are presented that incorporate simple depth features in a manner suitable for real-time processing. Unlike previous works, which typically use short-range sensors such as the Microsoft Kinect, a long-range stereo camera is used here in an outdoor environment. The experimental results show that the proposed approaches improve recognition rates compared to existing popular gait recognition methods.
This paper was recommended by Regional Editor Zoran Stamenkovic.