Using Kinect with Ordnance Survey mapping

We thought it would be interesting to hook up a Microsoft Xbox Kinect to work with our mapping, letting you control maps on screen using simple gestures. This video shows how we got on, and all I’ll say is that with moves like this, the GeoDoctor must be a real goer on the dancefloor…

In all seriousness though, we think this could make geography lessons even more engaging, helping children understand maps in a fun and interactive way.

If you want to try this for yourself, you do need to be fairly technically proficient, but here’s a quick guide to how we did it.

Let us know how you get on.

How to use an Xbox Kinect to control a map

More detailed installation instructions can be found on the websites listed below. Version numbers of the software are changing rapidly, so you may need to check whether the current releases require a particular driver version.

Download and install the PrimeSensor driver modules for OpenNI from the SensorKinect site on GitHub https://github.com/avin2/SensorKinect

Download and install OpenNI 1.1.0.41 from http://www.openni.org/downloadfiles/opennimodules/openni-binaries/20-latest-unstable

Download and install Flexible Action and Articulated Skeleton Toolkit (FAAST) http://projects.ict.usc.edu/mxr/faast/

Write an HTML page containing an OpenLayers map (like OS OpenSpace) that can be controlled via keyboard commands.
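As a starting point, here is a minimal sketch of such a page. It uses a plain OpenLayers 2 map with an OpenStreetMap layer rather than the OS OpenSpace API, and the centre coordinates are just illustrative; the key part is the `KeyboardDefaults` control, which makes the arrow keys pan the map and the + and − keys zoom it, matching the key presses FAAST will generate:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Kinect-controlled map</title>
  <script src="http://openlayers.org/api/OpenLayers.js"></script>
</head>
<body>
  <div id="map" style="width: 100%; height: 100%;"></div>
  <script>
    // KeyboardDefaults binds arrow keys to panning and +/- to zooming,
    // so the synthetic key presses from FAAST will drive the map.
    var map = new OpenLayers.Map("map", {
      controls: [
        new OpenLayers.Control.Navigation(),
        new OpenLayers.Control.KeyboardDefaults()
      ]
    });
    map.addLayer(new OpenLayers.Layer.OSM());
    // Example centre point (longitude/latitude), transformed to the
    // map's spherical Mercator projection.
    map.setCenter(
      new OpenLayers.LonLat(-1.47, 50.93).transform(
        new OpenLayers.Projection("EPSG:4326"),
        map.getProjectionObject()
      ),
      12
    );
  </script>
</body>
</html>
```

Note that the keyboard control only receives events while the map page has input focus, which is why the final step below switches focus to the mapping window.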

Edit the FAAST configuration file (faast.cfg) to map the incoming actions to keyboard commands that will pan and zoom the map:

[Actions]

# mappings from input events to output events
# format: event_name threshold output_type event

#lean_forwards 15 key_hold +
#lean_backwards 15 key_hold -
left_foot_sideways 20 key_hold -
right_foot_sideways 20 key_hold +
right_arm_up 20 key_hold up_arrow
right_arm_down 20 key_hold down_arrow
left_arm_up 20 key_hold up_arrow
left_arm_down 20 key_hold down_arrow
right_arm_out 20 key_hold right_arrow
left_arm_out 20 key_hold left_arrow

Connect the Kinect and check the drivers have recognized the device correctly by navigating to C:\Program Files\OpenNI\Samples\Bin\Release (or C:\Program Files (x86)\OpenNI\Samples\Bin\Release) and trying out the existing demo applications such as NIViewer.

Start FAAST and hit the Connect button. The user and the detected skeleton should appear when standing in front of the Kinect sensor.

Start the mapping website in a new window, preferably on a second connected monitor.

Press the Start Emulator button in the FAAST window and the key presses should now be generated when the appropriate action is detected by the Kinect.

Change the mouse and keyboard input focus to the mapping window and the key presses should end up there. You should now be able to control and move the map using the detected input gestures from the Kinect.
