Gesture is one of the fundamental modes of natural human-machine interaction. To understand gestures, a system must be able to interpret the 3D movements of a human. This paper presents a computer vision-based real-time 3D gesture recognition system that uses depth images to track the 3D joint positions of the head, neck, shoulders, arms, hands, and legs. Tracking is performed with a Kinect motion sensor through the OpenNI API, and 3D motion gestures are recognized from the movement trajectories of those joints. The user-to-Kinect distance is compensated with the proposed center of gravity (COG) correction method, and 3D joint positions are normalized with the proposed joint position normalization method. For gesture learning and recognition, data mining classification algorithms such as Naive Bayes and neural networks are used. The system is trained to recognize 12 gestures used by umpires in a cricket match, using about 2000 training instances covering the 12 gestures performed by 15 persons. Evaluated with 5-fold cross-validation, the system achieves 98.11% accuracy with the neural network and 88.84% accuracy with the Naive Bayes classifier.
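The COG correction and joint position normalization described above can be sketched as follows. This is an illustrative guess at the idea, not the paper's exact method: it centers the tracked joints on their center of gravity (removing dependence on the user's position relative to the sensor) and scales by the largest joint-to-COG distance (removing dependence on body size and sensor distance). The function name and the particular scaling rule are assumptions.

```python
import numpy as np

def normalize_joints(joints):
    """Center 3D joint positions on the body's center of gravity (COG)
    and scale so that coordinates are invariant to the user's distance
    from the sensor and to body size.

    `joints` is an (N, 3) array of tracked joint coordinates. The uniform
    COG (mean of all joints) and the max-distance scaling used here are
    illustrative assumptions, not the paper's published equations.
    """
    joints = np.asarray(joints, dtype=float)
    cog = joints.mean(axis=0)        # COG correction: subtract the centroid
    centered = joints - cog
    # Scale by the largest joint-to-COG distance so all coordinates
    # fall within the unit ball around the origin.
    scale = np.linalg.norm(centered, axis=1).max()
    return centered / scale if scale > 0 else centered
```

After this step, per-frame joint coordinates can be concatenated over a gesture's trajectory and fed to a classifier such as Naive Bayes or a neural network.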
In this paper we propose new estimators of the parameters of a Naive Bayes classifier based on Beta distributions. Equations for these estimators were derived using an EM-like algorithm, yielding numerical estimates of those parameters. Furthermore, two forms of this Naive Bayes classifier are presented.
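A Beta-distribution Naive Bayes classifier of the general kind described above can be sketched as follows. Since the paper's EM-like estimating equations are not given here, this sketch substitutes a simple closed-form method-of-moments fit for the Beta parameters; the class name, the fitting rule, and all identifiers are assumptions made only for illustration.

```python
import math
import numpy as np

def fit_beta_mom(x):
    """Method-of-moments fit of Beta(a, b) to values in (0, 1).
    A stand-in for the paper's EM-like estimators, used for illustration."""
    m, v = x.mean(), x.var()
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

def beta_logpdf(x, a, b):
    """Log-density of Beta(a, b), via the log-Beta function."""
    log_B = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return (a - 1.0) * np.log(x) + (b - 1.0) * np.log(1.0 - x) - log_B

class BetaNaiveBayes:
    """Naive Bayes with an independent Beta likelihood per class and feature."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = {c: float(np.mean(y == c)) for c in self.classes_}
        # One (a, b) pair per class and per feature.
        self.params_ = {c: [fit_beta_mom(X[y == c, j])
                            for j in range(X.shape[1])]
                        for c in self.classes_}
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            s = np.log(self.priors_[c])
            s = s + sum(beta_logpdf(X[:, j], a, b)
                        for j, (a, b) in enumerate(self.params_[c]))
            scores.append(s)
        return self.classes_[np.argmax(np.vstack(scores), axis=0)]
```

Feature values must lie strictly inside (0, 1) for the Beta log-density to be finite; in practice inputs are typically rescaled or clipped before fitting.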