I developed a university project under the "Robótica-Autismo" project at the University of Minho, Portugal. It consists of an emotion recognition system that is able to detect six different facial expressions. The system is interfaced with the Zeno R50 robotic platform from Robokind. A video of the experimental setup is available at the following link: https://www.youtube.com/edit?video_id=sRbZPZVENuU
The system uses the recent Intel RealSense camera together with machine learning techniques to classify the user's facial expression.
You can use the hand skeleton data that is available in the Intel RealSense SDK. With the joint position data, you can train a machine learning model to classify certain gestures. For my project I used Support Vector Machines (SVMs), but you can use another machine learning method. Since I developed my project in C#, I used a C# machine learning library, Accord.NET.
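For illustration, here is a minimal sketch of how such a gesture classifier could be trained with Accord.NET's multiclass SVM. It assumes each gesture sample has already been converted into a flat feature vector of joint coordinates taken from the RealSense hand data; the feature layout, sample values, and gesture labels below are hypothetical placeholders, not my actual training set.

```csharp
using System;
using Accord.MachineLearning.VectorMachines.Learning;
using Accord.Statistics.Kernels;

class GestureTrainingSketch
{
    static void Main()
    {
        // Hypothetical training samples: each row is one gesture example,
        // e.g. the flattened (x, y, z) positions of a few hand joints
        // reported by the RealSense SDK for a single frame.
        double[][] inputs =
        {
            new double[] { 0.10, 0.20, 0.50,  0.12, 0.25, 0.48 },
            new double[] { 0.11, 0.21, 0.49,  0.13, 0.26, 0.47 },
            new double[] { 0.40, 0.05, 0.60,  0.42, 0.07, 0.58 },
            new double[] { 0.41, 0.06, 0.61,  0.43, 0.08, 0.59 },
        };

        // Class labels for the samples above: 0 = "open hand", 1 = "fist"
        // (hypothetical gestures, just to show the multiclass setup).
        int[] outputs = { 0, 0, 1, 1 };

        // Multiclass SVM with a Gaussian kernel, trained one-vs-one
        // using Sequential Minimal Optimization.
        var teacher = new MulticlassSupportVectorLearning<Gaussian>()
        {
            Learner = (p) => new SequentialMinimalOptimization<Gaussian>()
            {
                // Estimate the kernel parameters from the training data.
                UseKernelEstimation = true
            }
        };

        var machine = teacher.Learn(inputs, outputs);

        // Classify a new frame of joint positions.
        int predicted = machine.Decide(new double[] { 0.39, 0.06, 0.62, 0.41, 0.07, 0.60 });
        Console.WriteLine("Predicted gesture class: " + predicted);
    }
}
```

In practice you would replace the placeholder arrays with feature vectors extracted from recorded gesture sessions, and hold out part of the data to validate the trained model.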
In the literature there are many articles that use the Kinect together with machine learning approaches to classify gestures.