Book: Kinect Hacks: Tips & Tools for Motion and Pattern Detection
The way we interact with machines is always changing. As technology evolves, new
ways of interacting with computers become available to us, one innovative breakthrough
after the next. If we go back 10 years, RIM was just starting to build
phone capabilities into its line of BlackBerry mobile devices. Now we have touch
screens capable of delivering a full computing experience in the palm of our hands.
Voice recognition software is finally becoming mainstream with the introduction of
Siri on the iPhone 4S. We are rapidly entering an age in which being tethered to an
accessory or peripheral, such as a mouse or keyboard, will be considered an archaic
way of getting things done.
Touch screen interfaces are all the rage right now, but the next true evolution in
human/computer interaction won’t require you to physically touch a thing. You’ve
seen it before in a number of sci-fi films: some guy in a futuristic getup is waving his
hands around, barking orders at a computer that seamlessly tracks his movements
and executes every command with flawless accuracy. The proper name for this type
of computer interaction is Natural User Interface (NUI), which describes the
ability to issue commands to a device using nothing more than your own natural body
movements, gestures, and voice. You’ll no longer need to clean those germ-infested
keyboards and touch screens or pick up new batteries for your wireless mouse or
gaming controller. The possibilities are truly endless, and we’re already starting to see
deployments of NUI interactivity all over the world. Soon you'll see NUIs in store windows,
at bus stations, in malls, and in many other places that could benefit from adding
natural human interaction to the process of selling and providing information.