There are countless ways to add interactivity to your project and create a responsive environment – from analog methods (e.g. mirrors, lights and shadows, movable parts) to digital methods (e.g. touch sensors, microphones, cameras). As a maker, the key is to keep things as simple as possible and avoid adding unnecessary complexity to your project. Often the simplest solution is also the most elegant.

Variables to consider when selecting an interaction method:

  • the size of your environment, and the space/range of the interaction
  • the qualities of your space – e.g. is the interaction environment dark, or fully lit?
  • what exactly are you tracking in your interaction?  e.g. surfaces, objects in space, human bodies, sounds?
  • what type of “data” do you need?  e.g. proximity/distance, position in 3D space, gestures, loudness?

Often with mixed reality installations, you want to track the general movement of a person or people in a space.  For larger spaces this could require a camera (possibly a depth camera like a Kinect) – and computer vision algorithms capable of detecting lights, shapes, and movement (or more complex things like faces).
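The core idea behind this kind of movement tracking is frame differencing (a simple form of background subtraction): compare each camera frame to a reference frame and count which pixels changed. A real setup would use OpenCV or a tool like TSPS; the sketch below is a minimal, self-contained illustration of the concept, with frames represented as plain 2D lists of grayscale values (0–255).

```python
def motion_fraction(prev_frame, curr_frame, threshold=25):
    """Return the fraction of pixels whose brightness changed by more
    than `threshold` between two frames of identical dimensions."""
    changed = 0
    total = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            total += 1
            if abs(p - c) > threshold:
                changed += 1
    return changed / total if total else 0.0

# Example: a 4x4 "background" frame vs. a frame where a bright blob
# (e.g. a person) has entered the lower-right corner.
background = [[10] * 4 for _ in range(4)]
current = [row[:] for row in background]
current[2][2] = current[2][3] = current[3][2] = current[3][3] = 200

print(motion_fraction(background, current))  # 4 of 16 pixels changed -> 0.25
```

Production systems refine this with a running-average background model, blur to suppress noise, and blob detection to find *where* the movement is – which is essentially what TSPS does for you.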

Fortunately there is a free tool designed specifically for this purpose (by the LAB at Rockwell Group).  The software is a few years old, but it still runs on modern Macs and PCs.

Toolkit for Sensing People in Spaces

The Toolkit for Sensing People in Spaces (TSPS) can use as input:

  • webcam (USB or built-in)
  • Kinect 1.0 (useful for easy “background subtraction”)

It can detect the position, outline, and movement of:

  • shapes (e.g. a person’s body, hand, etc.)
  • faces (and other complex things)
  • lights, shadows

It can run in the background on a computer, and provide a constant stream of data over:

  • OSC (open sound control)
    • a lightweight protocol for sharing interaction data between applications on the same computer, over a network, or even over the internet
    • can be tied into Unity, Processing, openFrameworks, Max/MSP, and many other software or code frameworks, etc. etc.
  • SpaceBrew 
    • a technology designed for interactive code projects: SpaceBrew lets live data be easily shared and routed between devices over a local network or the internet
    • tutorials for: Javascript, Processing, openFrameworks
  • WebSockets
    • a widely used protocol for real-time, two-way communication between browsers and servers over the internet
  • TCP
    • the internet’s core transport protocol – this allows you to stream raw data manually to any computer or server, given an IP address and a port number
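To make the OSC option above concrete, here is a minimal sketch (standard library only) of what an OSC message looks like on the wire: an address pattern, a type-tag string, then big-endian arguments, each section null-padded to a 4-byte boundary. The `/TSPS/personUpdated` address and the x/y float arguments are illustrative assumptions for this sketch, not TSPS's exact schema – check the TSPS docs for the real message layout.

```python
import struct

def _read_padded_string(data, offset):
    """Read a null-terminated string, then skip to the next 4-byte boundary."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # OSC strings are padded with 1-4 null bytes to a multiple of 4.
    return s, end + (4 - (end - offset) % 4)

def parse_osc_message(data):
    """Parse one OSC message into (address, [args]). Handles 'f' and 'i' tags."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":                      # 32-bit big-endian float
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "i":                    # 32-bit big-endian int
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
    return address, args

def build_osc_message(address, floats):
    """Encode an address and float args into OSC bytes (for demonstration)."""
    def pad(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    msg = pad(address) + pad("," + "f" * len(floats))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# A hypothetical person-position update, e.g. normalized x/y in [0, 1]:
packet = build_osc_message("/TSPS/personUpdated", [0.5, 0.25])
print(parse_osc_message(packet))  # ('/TSPS/personUpdated', [0.5, 0.25])
```

In practice you would not parse OSC by hand – libraries like python-osc, oscP5 (Processing), or ofxOsc (openFrameworks) handle this – but seeing the byte layout makes it clear why OSC is so easy to route between applications and across networks.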

Check out this introductory video to see whether TSPS might work for your project.  You will learn how to calibrate a camera for your needs: