Leap Motion and the pointlessness of another pointer

Last October I worked on a neat project with Spinifex LA which used a Leap Motion device to let people control a virtual camera in 3D space just by moving their hands in front of a screen (a case study on the project is coming soon). But now, about four months after the project launched, I've yet to find a reason to plug the Leap Motion back in.

I'm probably getting ahead of myself. You might be wondering what the Leap Motion is. Here's the official blurb:

The Leap Motion Controller senses how you naturally move your hands and lets you use your computer in a whole new way. Point, wave, reach, grab. Pick something up and move it. Do things you never dreamed possible.

Basically it's a little box you plug into your compy through USB:

Leap around the place

(The battery is just there for scale; like I said, it's little.) You position your hands above the sensor, and the accompanying software streams out data about your hand and finger positions. So the dream is using your hands as pointers to play out Minority Report.

Now it's not quite as full-on as that; for a start, the detection area is a bit smaller, maybe up to a meter above the device and a few handspans wide. But that works pretty well for sitting down at a computer and waving your hands about.

It actually works

It tracks really, really well. There's a demo you can run while setting up the Leap Motion that just shows where your fingers are in relation to the sensor, and it's impressively accurate at picking up even tiny movements. I recall being disappointed with the accuracy of the Kinect, but this is miles ahead.

They've really targeted developers as their early market, which (being a developer and all) I think is super-smart. The SDK and docs available are great.

I ended up using a set of C# bindings for Unity, which came with some excellent docs and great examples, allowing me to get the gesture side of the project rolling quickly. The only thing I wished for was more info about finger joints past the knuckle; some of the Leap Motion demos seem to track extra finger joints, but the Unity API didn't look to have that built in. I'm not sure whether Leap Motion's code does clever interpolation of the data in the existing API or whether they have better tooling internally. Either way, it was pretty minor and didn't affect my project.
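For anyone curious what working with it looked like, here's a rough sketch of polling the device from a Unity script. This is written from memory of the v1-era C# bindings, so the exact class and method names (Controller, Frame, SwipeGesture and so on) may differ in newer SDK versions:

```csharp
using Leap;
using UnityEngine;

// Minimal sketch of polling the Leap Motion from a Unity script.
// Based on my memory of the v1 C# bindings; names may differ in later SDKs.
public class LeapPolling : MonoBehaviour
{
    private Controller controller;

    void Start()
    {
        controller = new Controller();
        // Gesture types have to be enabled before the SDK will report them.
        controller.EnableGesture(Gesture.GestureType.TYPE_SWIPE);
    }

    void Update()
    {
        // Grab the most recent frame of tracking data each Unity update.
        Frame frame = controller.Frame();

        foreach (Hand hand in frame.Hands)
        {
            Vector palm = hand.PalmPosition;      // millimetres, relative to the sensor
            foreach (Finger finger in hand.Fingers)
            {
                Vector tip = finger.TipPosition;  // one point per finger; no joint chain here
            }
        }

        // Built-in gesture recognition (swipe, circle, tap, etc.)
        foreach (Gesture gesture in frame.Gestures())
        {
            if (gesture.Type == Gesture.GestureType.TYPE_SWIPE)
            {
                SwipeGesture swipe = new SwipeGesture(gesture);
                Debug.Log("Swipe direction: " + swipe.Direction);
            }
        }
    }
}
```

You basically grab the latest frame every update and read hand, finger and gesture data straight off it, which is about as friendly as motion-tracking APIs get.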

The other smart thing Leap Motion have done is to create their own app store, Airspace, giving a central place to find Leap Motion enabled apps.

The problem

I still don't know what to use it for. Put simply, the Leap Motion is a solution in search of a problem. It's an excellent developer toy, and I think it can fill a niche in novel interfaces for exhibitions, but as a mainstream device to sit with a home or business computer, I can't find any real use for it.

The first problem with any tech along these lines is Gorilla Arm. Any activity that involves holding your arm out in front of you for an extended period, be it with a Leap Motion or a vertically mounted touchscreen, is going to result in arm fatigue.

That might be surmountable if there were a compelling use case, but I can't find anything that isn't already covered (better) by another device. Browsing through the top-rated apps on Airspace you can find things like a New York Times app,

something a touchscreen could certainly do better.

Or a utility for adding gestures to the OS

gestures which look much more time-consuming and error-prone than the keyboard, mouse/touchpad or touchscreen. The scrolling in that example in particular is positively painful.

Some other demos that I've played around with are just that: demos. Nice little showcases of how accurately fingers can be detected, but they ultimately serve no purpose (and how often do people play with Google Earth anyway?). There are piles of games, but on the casual side Fruit Ninja, Cut the Rope and others are just transplanting touchscreen mechanics and adding in potential gorilla arm. And if you think there are applications for motion control on the hardcore gamer side, perhaps with FPS aiming, please go have a play with your Wii or Kinect and see how that worked out.

Getting deeper

To make the Leap Motion worthwhile as a control mechanism you need to tap into what makes it unique: it has depth. You don't just get an X/Y position on the screen; you get the Z axis as well. But I can't find a compelling experience that uses that, and I'm not sure one exists.
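To make the idea concrete, here's the sort of thing I mean: a hypothetical snippet (again using the v1-era C# names from memory, with a made-up scaling constant) that maps how far your palm is pushed toward the screen onto a camera dolly:

```csharp
using Leap;
using UnityEngine;

// Hypothetical sketch: attach to a camera and let palm depth drive a dolly.
// Names are from memory of the v1 C# bindings; the scale factor is made up.
public class DepthDolly : MonoBehaviour
{
    private Controller controller;
    public float millimetresToUnits = 0.01f;  // arbitrary mapping from Leap mm to scene units

    void Start()
    {
        controller = new Controller();
    }

    void Update()
    {
        Frame frame = controller.Frame();
        if (frame.Hands.Count == 0)
            return;  // no hand over the sensor, leave the camera alone

        Hand hand = frame.Hands[0];

        // PalmPosition is in millimetres relative to the sensor;
        // z runs toward/away from the screen, which is the axis a mouse can't give you.
        float depth = hand.PalmPosition.z * millimetresToUnits;

        // Dolly the camera along its local Z as the hand pushes in and out.
        Vector3 pos = transform.localPosition;
        pos.z = -depth;
        transform.localPosition = pos;
    }
}
```

It's trivial to wire up, which only underlines the point: the hard part isn't reading the Z axis, it's finding an app where controlling it with your hand actually beats a scroll wheel.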

When I see HP advertising a laptop with a Leap Motion sensor built in,

Spotted on the top of Young & Jackson across from Flinders St station

and the billboard invites people to "control apps with the wave of a hand", I just want to ask: "What apps?"

If anyone out there has found a mainstream use for these things, if anybody out there uses their Leap Motion daily, I'm all ears. Until then, I think my Leap Motion is just going to gather dust.
