WOW! No, I don't mean World of Warcraft; I really mean WOW!
Following Pathfinder's lead, given in the comments on my Predictions for 2011 post, I came upon a hack for the Microsoft Kinect sensor from researcher Evan Suma of the University of Southern California. Working with his team at the school's Institute for Creative Technologies, he has put together a middleware program, based on the open source framework OpenNI, that exploits Kinect's sensor to recognize the player's skeleton and translate body gestures, movements and voice into keyboard commands. This lets World of Warcraft players plug the Kinect into a USB port on their PC and interact directly with the game. The software package is called FAAST (Flexible Action and Articulated Skeleton Toolkit), and it has been released as open source so anyone, they say, can download it and use it with WoW or modify it to work with other applications.
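To make the mechanism a little more concrete, here is a minimal sketch of the gesture-to-keyboard idea behind a tool like FAAST. To be clear, everything in it is illustrative: the joint names, the thresholds and the send_key helper are hypothetical stand-ins, not the actual FAAST or OpenNI APIs. Only the overall pattern, pose thresholds firing simulated key presses into the focused game window, reflects how this kind of middleware works.

```python
# Hypothetical sketch of a FAAST-style gesture-to-keyboard mapper.
# The skeleton source and send_key() are stand-ins, NOT the real
# OpenNI or FAAST APIs; only the idea (pose thresholds triggering
# simulated key presses) reflects how such middleware behaves.

from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # position in metres, relative to the sensor
    y: float
    z: float

def send_key(key: str, held: bool) -> None:
    """Stand-in for injecting a key event into the focused game window."""
    state = "down" if held else "up"
    print(f"key {key} {state}")

def update(skeleton: dict[str, Joint]) -> None:
    """Turn one frame of tracked joints into WoW-style key events."""
    torso, hand = skeleton["torso"], skeleton["right_hand"]

    # Leaning towards the sensor (torso closer than a threshold) -> walk.
    send_key("w", held=torso.z < 2.0)

    # Raising the right hand well above the torso -> fire an attack key.
    send_key("1", held=hand.y > torso.y + 0.3)

# One fake frame: the player leans in and raises a hand.
update({"torso": Joint(0.0, 1.0, 1.8),
        "right_hand": Joint(0.3, 1.5, 1.7)})
```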
Given that FAAST is open source, anyone working with virtual worlds like OpenSim should be able to play with the functions to get an avatar to sit, walk, run, turn about and activate a selection of gestures. I am sure a lot more might be possible too, such as bringing up the inventory menu and clicking to wear an item of clothing while the avatar is seen, apparently, dressing as the result of an appropriate gesture being activated (I have always wished for this, since clothes simply appearing has always seemed very un-lifelike to me).
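Since FAAST, as the team describes it, emits ordinary key events, pointing it at a different application is in principle just a different set of bindings. As a purely hypothetical illustration, a binding table for an OpenSim viewer might look like the following; the gesture names are invented, and the target keys follow common Second Life-style viewer defaults (WASD to move, Home to toggle flying).

```python
# Hypothetical gesture-to-key bindings for an OpenSim viewer, assuming a
# FAAST-like mapper that exposes named gestures. The gesture names are
# made up; the keys follow typical Second Life-style viewer defaults.
OPENSIM_BINDINGS = {
    "lean_forward":    "w",     # walk forward
    "lean_back":       "s",     # walk backward
    "turn_left":       "a",     # turn left
    "turn_right":      "d",     # turn right
    "raise_both_arms": "home",  # toggle flying
}

for gesture, key in OPENSIM_BINDINGS.items():
    print(f"{gesture} -> {key}")
```

Something like dressing an avatar would be harder, since that is a menu action rather than a single key press, which is exactly where a deeper viewer integration would be needed.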
Currently, the number of gestures available to WoW players is limited, but the researchers aim to add more. In the video the team released (seen here), Suma uses his body to walk, his left hand to move the camera and his right hand to select attack spells. I can imagine that in a virtual world a gun or sword could be taken in hand and used in combat by those users more inclined to action adventure, but interactive social, medical and educational applications could also benefit tremendously. As Skip Rizzo said in the video, the system could help get young people out of their chairs and doing more exercise. Indeed, rehabilitation exercises and the fight against childhood obesity and diabetes could all be helped by this system.
I make no secret that I want to see a free and open metaverse of hypergrid-connected virtual worlds come about, where we, through our avatars, can live some precious moments as extra-sentient beings: playing, socializing, working and making money and, above all, marvelling at the creativity of others and enjoying the fantastic worlds we create. I think Kinect could open up new and exciting possibilities for virtual worlds and facilitate interaction in ever more life-like ways. I imagine that a few years further ahead haptics will come into play too, and we will be meeting and shaking hands no matter where on Earth we live.
Imagine being separated by thousands of miles from work colleagues or those you love and yet being able to walk towards their hologram while they walk towards yours. You meet and shake hands or just hold the hand of your wife or husband, and by the miracle of technology you actually feel the touch. How amazing would that be?
A little late to the party here, but I have two comments:
First, FAAST is not open source. They do not make the source available at all.
Second, it can't convert speech into commands, because the Kinect audio input is not part of the drivers provided by OpenNI. It only has access to the depth and colour cameras and nothing else - no motor control and no audio input.
The fact that FAAST is not open source is quite surprising to me, given that it was created by a researcher. If anyone should know the value of open source, it is researchers.
Thanks, Anonymous. There are some good articles about Kinect and FAAST on the Chapter & Metaverse blog. See the link in the sidebar.
FAAST was said to be open source, and they promised to release the source in early 2011. However, this has not happened. I guess they want to delay it until 1.0, or never?