
AR Drone Control
AR Drone Control
This is a project we pursued at Mudd. It involved hacking a Kinect to recognize specific hand gestures and translate them into commands for an AR Drone.
The user defines boundary regions on the Kinect camera feed; when the tracked hand "touches" one of these boundaries, the Kinect registers it as a gesture. By combining several such boundaries, a wide range of commands can be programmed for the AR Drone, letting a user control it with simple hand movements.
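The boundary-touch idea above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the normalized hand coordinates, boundary layout, and command names are all assumptions, and in practice the hand position would come from a Kinect skeleton tracker.

```python
# Sketch of boundary-based gesture detection. Assumes hand positions
# arrive as normalized (x, y) coordinates in [0, 1] from a tracker.
# Boundary zones and drone commands below are illustrative only.

BOUNDARIES = {
    "left":   lambda x, y: x < 0.1,  # hand touches left edge of the feed
    "right":  lambda x, y: x > 0.9,
    "top":    lambda x, y: y < 0.1,
    "bottom": lambda x, y: y > 0.9,
}

COMMANDS = {
    "left":   "turn_left",
    "right":  "turn_right",
    "top":    "ascend",
    "bottom": "descend",
}

def gesture_to_command(hand_x, hand_y):
    """Return the drone command triggered by the hand position, if any."""
    for name, touches in BOUNDARIES.items():
        if touches(hand_x, hand_y):
            return COMMANDS[name]
    return None  # hand inside the neutral region: no command
```

For example, moving the hand to the far left of the feed (`gesture_to_command(0.05, 0.5)`) would map to the turn-left command, while a hand in the center maps to no command at all.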
More information can be found at the course website.
A demonstration is shown on the left.

