Nickapedia: Managing VMware vSphere with XBOX Kinect Gestures (Wow!)
In my house we’ve been using the XBOX Kinect features more and more. Oftentimes mommy and daddy will want to watch a movie, and perhaps the kids didn’t put the controllers back or they are just hard to find in the dark, so why bother looking for the controllers when you can just gesture with your hands?
One day I was explaining to a manager how with vMotion your server could be over “here” on server A (gesturing to the right) and vMotion could move the workload, uninterrupted, over “here” to server B (gesturing to the left). By then I had read several articles on how the XBOX Kinect had been “hacked” for a variety of other applications. The light bulb went off, and one day I made the following tweet:
Is someone writing a Kinect interface for vSphere so that we can vMotion VM’s by swiping our hand from right to left?
I was mostly joking but partially serious, in that I knew it would be possible to do (with A LOT of smarts). Jeramiah Dooley (@jdooley_clt) retweeted my tweet and copied @lynxbat (Nicholas Weaver). I’ve been observing some of Nicholas’ efforts over the past year, and I remember thinking, “If anyone could do this, he probably could.”
Over the course of the next week, Nicholas made some tweets indicating that he was hard at work on a new project he was really excited about, but it never occurred to me that I might know what he was working on.
After watching the Green Bay Packers dominate in the Georgia Dome last night, I woke up in a good mood this morning and decided to check out what was going on in the Twitter-verse. I saw that Nicholas had made a new post which many others were re-tweeting with enthusiasm. So I checked it out and began watching Nick’s video.
In the first few seconds of Nick’s video I thought I heard him mention Blue Shift Blog. “Huh? Me? What in the world did *I* do??”. As Nick went into more detail the proverbial light bulb switched on — “He did it! He actually did it!”.
Before I share Nick’s amazing video and demonstration, just a few quick (kinda) points. Sometimes we make technologies that have that “gee-whiz” factor, but over time, and in the absence of value, the novelty wears off and the technology is forgotten. I don’t think that is the case here, for several reasons.
The Evolution of Human Interfaces
The idea of working with more natural interfaces really became part of our culture with the movie “Minority Report,” where we saw Tom Cruise searching for and manipulating information using hand and finger gestures. The advent of Microsoft Surface, iPhones, iPads and Android phones was a small step in this direction, as we became familiar with gestures such as swiping and multi-touch gestures like pinch-to-zoom.
The XBOX Kinect was a huge leap forward in that our entire bodies can be tracked (in a spring update, even facial gestures like smiling or raising one’s eyebrows will be tracked). The Kinect has been a monster hit in the gaming market, with over 8 million units already sold, and Microsoft is working on extending the Kinect platform to the Windows PC market. As amazing as Kinect is, it is still largely a 1.0 technology. Gesture-based interfaces will become both more advanced and more commonplace in the future, and I suspect they will some day extend into the datacenter as well.
Gesture Interfaces as a Metaphor for Abstraction
Nick talked about this at some length in his video, and it is an excellent point. Virtualization is the abstraction of server workloads from the physical hardware. Now, virtualization is not technically a requirement for cloud computing (IaaS), but in many cases I think it will be a critical piece (more on this in a future Agility post). A big part of the cloud computing concept is the abstraction of workloads from traditional boundaries, which enables more flexibility, agility, and savings.
While virtualization provides abstraction from the hardware, it also provides a management layer in the datacenter from which many things can be done. As Nick mentions in his video, you can now power on hundreds of VMs from a script rather than running around the datacenter hitting power buttons. In the future I expect more and more APIs to be developed within the hypervisor in areas such as security, storage and networking, providing greater functionality as well as interaction with the physical hardware elements. In this sense, the gesture interface really does provide an interesting metaphor for this abstraction layer, which keeps gaining functionality and capability, vMotion being just one example.
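To make the idea concrete, here is a toy sketch of what “swipe to vMotion” boils down to: read a stream of tracked hand positions, decide whether it’s a left or right swipe, and translate that into a migration request. Everything here is hypothetical and purely illustrative — the gesture stream, the threshold, and the `migrate()` stub stand in for a real Kinect skeleton feed and the actual vSphere API calls that Nick’s project drives.

```python
# Toy sketch: turning a tracked hand position into a vMotion request.
# All names and values are illustrative assumptions, not Nick's code or
# the real Kinect/vSphere APIs.

SWIPE_THRESHOLD = 0.5  # hypothetical: metres of horizontal hand travel


def classify_swipe(hand_x_positions):
    """Return 'left', 'right', or None from a series of hand x-coordinates."""
    if len(hand_x_positions) < 2:
        return None
    travel = hand_x_positions[-1] - hand_x_positions[0]
    if travel <= -SWIPE_THRESHOLD:
        return "left"
    if travel >= SWIPE_THRESHOLD:
        return "right"
    return None


def migrate(vm, direction):
    """Stand-in for a vMotion call: map swipe direction to a target host."""
    target = "hostB" if direction == "left" else "hostA"
    return f"vMotion {vm} -> {target}"


# A right-to-left swipe (x decreasing) moves the workload to host B:
positions = [0.8, 0.5, 0.1, -0.2]
direction = classify_swipe(positions)
if direction:
    print(migrate("web-vm-01", direction))
```

The interesting part is how little glue is needed: once the hypervisor exposes operations like vMotion through an API, any front end, whether a script, a web console, or a Kinect, can drive them.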
While few enterprises will be using motion gestures to manage their datacenters in 2011, I think that as both human interfaces and virtualization/IaaS evolve there is the potential to see much more along the lines of what Nick has demonstrated.
Enough of me talking. Nick did all the work, and what he did is nothing short of amazing. I felt a bit guilty knowing that I may have somehow been a contributing factor to long nights and sleep deprivation, but I suspect he’d say it was well worth it. Here’s Nick’s blog entry, and below is the video. Amazing!