My First LEAP Video

Here it is, the first video that I’ve done related to LEAP:

Handy Mouse – Using the Leap as a mouse controller

By using the Leap Motion, I have finally taken the step of getting into “Gesturing”.  By that I just mean using my hands and fingers in ways other than keyboard and mouse.  Of course, the Leap Motion allows you to point with a stick, your fingers, and what have you.

As a basic input device, it spews a stream of data to the developer, who must then make sense of it.  This is true of all input devices, even down to the lowly mouse.  The difference with more traditional devices is that their input streams are well understood and mapped to models that work for existing applications.  The mouse spews x, y, wheel, and button state, and that’s about it.  The keyboard: keycode, up/down, and LED state.

The Leap Motion and other such input devices are a bit different.  Their data streams, and how they map to applications, are different.  What does waving your hand wildly do?  What does swiping in the form of an “S” curve do?  Unknown.  Furthermore, how do these motions map to a traditional application?  Should they?
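To make the contrast concrete, here is a rough sketch of the two kinds of record involved.  The field names are illustrative assumptions, loosely based on the tipPosition data used later in this post, not an actual device API:

```lua
-- What a traditional mouse reports: a tiny, well-understood packet.
local mouseEvent = {
  x = 640, y = 480,   -- screen position
  wheel = 0,          -- wheel delta
  button = "left",    -- which button, if any
}

-- A sketch of what a Leap-style device reports each frame.  The field
-- names here are illustrative assumptions, not the actual Leap API.
local pointerEvent = {
  timestamp = 1234567,                  -- microseconds
  tipPosition = {12.5, 180.0, -40.2},   -- x, y, z in millimeters
  direction = {0.0, -0.2, -0.98},       -- unit vector of the tool
}

-- The mouse packet already *means* something; the pointer frame does
-- not, until the application decides what the motion signifies.
print(mouseEvent.x, pointerEvent.tipPosition[2])
```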

Well, one mapping that I have chosen to take a look at is mouse movement. It seems simple enough: pick up a pencil or chopstick, use it to point at the screen, move around, and ‘click’ by doing quick dips of the tip.  Seems natural enough.  Here’s a bit of code to achieve that:


-- pointertrack.lua
package.path = package.path..";../?.lua"

local LeapScape = require("LeapScape");
local MouseBehavior = require("MouseBehavior");
local UIOSimulator = require("UIOSimulator");

local scape, err = LeapScape();

if not scape then
  print("No LeapScape: ", err)
  return false
end

-- Turn x,y coordinates coming off the Leap into simulated mouse movements
local OnMouseMove = function(param, x, y)
  UIOSimulator.MouseMove(x, y);
end

local mousetrap = MouseBehavior(scape, UIOSimulator.ScreenWidth, UIOSimulator.ScreenHeight);
mousetrap:AddListener("mouseMove", OnMouseMove, nil);

scape:Start();

run();



Of course this uses the TINN runtime, and can be found in the Leap TINNSnip project.

What’s going on here? First of all, create the LeapScape object. Then create a function which will turn x,y, coordinates into simulated mouse movements. Lastly, create an instance of the ‘MouseBehavior’ object, and add the OnMouseMove() function as a listener of the “mouseMove” event. Start the LeapScape, and run() the program.

You do this from a command line. Once the program is running, you can point the stick around the screen and see that the cursor follows it.

But first, you have to create a calibration that constrains the movement. The Leap Motion gives you roughly a 2′×2′×2′ sensing volume in which it can detect various movements. What you need to do is create a mapping from a portion of that space into your screen coordinates. So, first you run the mousetrain.lua program, which looks like this:

--package.path = package.path.."../../?.lua"
package.path = package.path..";../?.lua"

local LeapScape = require("LeapScape");
local FrameObserver = require("FrameObserver");
local UIOSimulator = require("UIOSimulator");
local StopWatch = require("StopWatch");
local GDI32 = require("GDI32");
local FileStream = require("FileStream");

-- Map a value from one range to another
local mapit = function(x, minx, maxx, rangemin, rangemax)
  return rangemin + (((x - minx)/(maxx - minx)) * (rangemax - rangemin))
end

-- Clamp a value to a range
local clampit = function(x, minx, maxx)
  if x < minx then return minx end
  if x > maxx then return maxx end

  return x
end

local main = function()
  local scape, err = LeapScape();

  if not scape then
    print("No LeapScape: ", err)
    return false
  end

  local fo = FrameObserver(scape);

  local sensemin = {math.huge, math.huge, math.huge}
  local sensemax = {-math.huge, -math.huge, -math.huge}

  -- We'll use this to do some drawing on the screen
  local hdcScreen = GDI32.CreateDCForDefaultDisplay();

  -- Spin until the specified number of milliseconds has elapsed
  local busywait = function(millis)
    local sw = StopWatch();

    while true do
      if sw:Milliseconds() > millis then
        break
      end
    end
  end

  -- Draw a small target centered on the given position
  local drawTarget = function(originx, originy, width, height)
    local brushColor = RGB(255,0,0);

    local x = originx - width/2;
    local y = originy - height/2;

    x = clampit(x, 0, UIOSimulator.ScreenWidth-1 - width);
    y = clampit(y, 0, UIOSimulator.ScreenHeight-1 - height);

    local right = x + width
    local bottom = y + height
    hdcScreen:RoundRect(x, y, right, bottom, 4, 4)
  end

  -- Track the extremes of the tip positions seen so far
  local observerange = function(param, event)
    local tp = event.tipPosition;

    sensemin[1] = math.min(tp[1], sensemin[1])
    sensemin[2] = math.min(tp[2], sensemin[2])
    sensemin[3] = math.min(tp[3], sensemin[3])

    sensemax[1] = math.max(tp[1], sensemax[1])
    sensemax[2] = math.max(tp[2], sensemax[2])
    sensemax[3] = math.max(tp[3], sensemax[3])
  end

  -- Show a target, and observe the pointer while the user dwells on it
  local dwellAtPosition = function(x, y)
    drawTarget(x, y, 32, 32);
    fo:AddPointerObserver(observerange, nil)
    busywait(3000)
    fo:RemovePointerObserver(observerange, nil)
  end

  -- Hit each of the four corners of the screen
  local matchTargets = function()
    dwellAtPosition(0, UIOSimulator.ScreenHeight-1);
    dwellAtPosition(0, 0);
    dwellAtPosition(UIOSimulator.ScreenWidth-1, 0);
    dwellAtPosition(UIOSimulator.ScreenWidth-1, UIOSimulator.ScreenHeight-1);
  end

  -- Write the observed range out as a loadable Lua chunk
  local writeConfig = function()
    local fs = FileStream.Open("sensor.cfg");

    local output = {
      string.format("do return {");
      string.format("sensemin = {%3.2f, %3.2f, %3.2f};", sensemin[1], sensemin[2], sensemin[3]);
      string.format("sensemax={%3.2f, %3.2f, %3.2f};", sensemax[1], sensemax[2], sensemax[3]);
      string.format("} end");
    }
    output = table.concat(output,"\n");

    fs:WriteString(output);  -- assuming FileStream provides a WriteString method
  end

  scape:Start();
  matchTargets();
  writeConfig();
end

run(main);



This snippet will place a target in each of the 4 corners of the screen. You point at it for a bit, and then move on to the next one. Easy enough. Once you’ve done that, it will write the range information out to a configuration file. Then you can use the pointertrack program to your heart’s content.
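As a sketch of how that configuration might be consumed: the file is written as a plain Lua chunk (“do return {…} end”), so it can be loaded with loadfile.  The mapit and clampit helpers are reproduced from mousetrain.lua above, while the toScreen function and the sample sensor values are my own illustration, not part of the actual project:

```lua
-- Map a value from one range to another (as in mousetrain.lua)
local mapit = function(x, minx, maxx, rangemin, rangemax)
  return rangemin + (((x - minx)/(maxx - minx)) * (rangemax - rangemin))
end

-- Clamp a value to a range (as in mousetrain.lua)
local clampit = function(x, minx, maxx)
  if x < minx then return minx end
  if x > maxx then return maxx end
  return x
end

-- The sample values below stand in for a real sensor.cfg file.
-- In the real program you could do: local config = loadfile("sensor.cfg")()
local config = { sensemin = {-120.0, 80.0, -60.0}, sensemax = {120.0, 320.0, 60.0} }

-- Map a sensed tip position into screen coordinates.  The sensor's
-- y axis grows upward while the screen's grows downward, so the
-- y range is mapped in reverse.
local toScreen = function(tp, screenWidth, screenHeight)
  local x = mapit(tp[1], config.sensemin[1], config.sensemax[1], 0, screenWidth-1)
  local y = mapit(tp[2], config.sensemin[2], config.sensemax[2], screenHeight-1, 0)
  return clampit(x, 0, screenWidth-1), clampit(y, 0, screenHeight-1)
end

-- A tip at the center of the sensed volume lands near the center of the screen
local x, y = toScreen({0.0, 200.0, 0.0}, 1920, 1080)
print(x, y)   -- 959.5  539.5
```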

There is a ‘MouseBehavior’ object, which is a work in progress. It basically filters the stream of events coming off the Leap device, determines what is a ‘move’, what is a ‘click’, and the like, and fires off the appropriate event to whoever may be observing. Oh yes, the IObservable/IEnumerable thing comes home to roost.

The Behaviors are an interesting place to be. It really makes you think. What is a “mouse click”? Is it a dip of the pointing device? By how much, and for how long? Is there a ‘right click’? How can I simulate the wheel? Is it perhaps a circular motion? This is where being able to annotate a gesture, and then subsequently search for that pattern in the data stream, becomes really interesting.
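As a hedged sketch of what one such behavior might look like, a ‘dip click’ detector could watch the tip’s height and fire when it drops and recovers quickly.  This is not the actual MouseBehavior implementation; the names, thresholds, and time window are all invented for illustration:

```lua
-- A toy "dip click" detector, NOT the actual MouseBehavior code.
-- It watches the tip's height (y) and reports a click when the tip
-- dips by more than 'depth' mm and recovers within 'window' ms.
-- All names and thresholds here are illustrative assumptions.
local DipDetector = {}
DipDetector.__index = DipDetector

function DipDetector.new(depth, window)
  return setmetatable({
    depth = depth or 20,     -- millimeters the tip must drop
    window = window or 300,  -- milliseconds allowed for the dip
    baseline = nil,          -- resting height of the tip
    dipstart = nil,          -- timestamp when the dip began
  }, DipDetector)
end

-- Feed one sample: tip height in mm, timestamp in ms.
-- Returns true when a complete quick dip (down and back up) is seen.
function DipDetector:observe(y, millis)
  if self.baseline == nil then
    self.baseline = y
    return false
  end

  if self.dipstart == nil then
    if y < self.baseline - self.depth then
      self.dipstart = millis         -- tip has dropped; dip begins
    else
      self.baseline = y              -- drift the baseline while at rest
    end
    return false
  end

  -- In a dip: did the tip recover?
  if y >= self.baseline - self.depth then
    local quick = (millis - self.dipstart) <= self.window
    self.dipstart = nil
    return quick                     -- quick recovery counts as a click
  end

  if millis - self.dipstart > self.window then
    self.dipstart = nil              -- too slow; treat as ordinary movement
    self.baseline = y
  end
  return false
end

local d = DipDetector.new(20, 300)
local clicked = false
-- Tip rests at ~200mm, dips to ~170mm, recovers within 150ms
for _, s in ipairs{{200,0},{199,50},{170,100},{168,150},{199,250}} do
  clicked = d:observe(s[1], s[2]) or clicked
end
print(clicked)   -- true
```

A real behavior would feed this from the device’s pointer events and fire a listener, in the same AddListener style the MouseBehavior object uses above.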

For now, I’m happy enough to get basic mouse movement. Soon enough though, chording gestures should be close at hand.