After a while with VR, you forget that your full body isn’t being tracked… until you try to kick something, or you look down at your non-existent body. Immotionar is a group in Italy who are working on a way to enable full body tracking. Here’s how it works and how developers can give it a go in their own apps.
“We strongly believe that since we use and see our full body in our everyday life, we should use it even in VR: without it, the experience is absolutely incomplete. And we also strongly believe that the system should work out of the box: once assembled and configured, the users should come and play without wearing anything else. We know, it’s a very ambitious goal!” — Antony Vitillo
It truly is an ambitious goal, which is why I was so keen to chat with their team and find out more about how it all works, what their aspirations are and how developers can start trying it out. Their full body tracking API is called “ImmotionRoom” and its goal is to “let the user see and use all his/her full body inside virtual reality”. According to Antony Vitillo, Immotionar’s R&D Chief Developer, the aim is “to provide a complete solution that lets the user live full body virtual reality experiences in an affordable way, while developers develop full body VR experiences very easily”. They’re also aiming for their system to support the largest possible number of sensors and headsets, giving everyone broad freedom of choice when bringing in full body tracking. Combine that with the already ambitious goal of full body tracking that works out of the box and you’ve got a huge amount of ambition right there!
I’m pretty fascinated by the idea of full body tracking, especially with an API already out that developers can use! According to Immotionar, it works using one or more additional sensors around the user that detect their body movements. At the moment, the ImmotionRoom SDK supports the Microsoft Kinect 1 and 2. With three or more Kinects connected to the system, it can provide 360-degree tracking of your movements, and the more Kinects you set up, the more robust the tracking becomes.
From a more technical standpoint, the system works using a proprietary service that runs on a PC and receives Kinect tracking data streamed from every sensor in the network (each PC can be connected to only one Kinect, due to Microsoft’s specs), then reconstructs the complete user position. That reconstructed position is streamed to VR applications so the user can see and use their body to interact with the environment.
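To make that reconstruction step a bit more concrete, here’s a minimal sketch of how skeleton data from several networked Kinects might be fused into a single user position. Everything below (the names, the data shapes, the confidence-weighted average) is my own assumption for illustration, not ImmotionRoom’s actual API:

```python
# Hypothetical sketch of fusing per-sensor skeletons into one user position.
# None of these names come from the ImmotionRoom SDK.

from dataclasses import dataclass

@dataclass
class Joint:
    x: float
    y: float
    z: float
    confidence: float  # 0.0 (inferred/occluded) .. 1.0 (fully tracked)

def merge_skeletons(skeletons):
    """Merge per-sensor skeletons (dicts of joint name -> Joint, already
    transformed into a shared world space) into one confidence-weighted
    skeleton. A joint occluded from one Kinect's viewpoint contributes
    little, so sensors with a clear view dominate the estimate."""
    merged = {}
    names = {name for skel in skeletons for name in skel}
    for name in names:
        joints = [skel[name] for skel in skeletons if name in skel]
        total = sum(j.confidence for j in joints)
        if total == 0:
            continue  # no sensor has a reliable estimate of this joint
        merged[name] = Joint(
            x=sum(j.x * j.confidence for j in joints) / total,
            y=sum(j.y * j.confidence for j in joints) / total,
            z=sum(j.z * j.confidence for j in joints) / total,
            confidence=max(j.confidence for j in joints),
        )
    return merged

# Two sensors see the left hand; the front one has the clearer view.
front = {"hand_left": Joint(0.30, 1.00, 0.50, confidence=0.9)}
back  = {"hand_left": Joint(0.50, 1.00, 0.50, confidence=0.1)}
fused = merge_skeletons([front, back])
print(fused["hand_left"])  # position weighted toward the front sensor
```

The same idea is why extra Kinects make the tracking more robust: whenever a limb is occluded from one viewpoint, another sensor usually still has a confident estimate of it.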
It currently supports the Oculus Rift, HTC Vive, OSVR, Samsung Gear VR and Google Cardboard, and I’m told that developers can bring support to other headsets too through a dedicated interface. In terms of development platforms, it currently works with Unity, and Immotionar says a layer for standard .NET apps is on its way! The Unity SDK provides some simple Unity packages with the necessary prefabs, and they even provide a way of easily switching between each target headset build when needed. As Antony says, “Develop once, build everywhere!”
According to Antony Vitillo, there’s a range of tracking services that each track a little but not everything — “Leap Motion and Intel’s RealSense track only hands and do it very well. Kinect tracks the body well, but it’s not so stable in hands tracking. So the natural solution will be mixing both technologies in our suite, obtaining a full body with perfect hands!”
The Oculus Rift and HTC Vive use physical controllers and sensors worn or held by the user, which track faster and with better precision than Immotionar’s markerless approach. There’s a comparison between the Oculus tracking and Immotionar tracking below:
Full body MOCAP tracking suits like the Perception Neuron also track faster and with better precision, since they’re worn on the body. However, Immotionar’s vision is a future where the user doesn’t need to wear anything. Overall, it’s a trade-off: slightly less effective tracking without the hassle, or full-on tracking that requires wearing sensors or holding additional controllers. Here’s hoping one day soon we get the best of both worlds!
Developers out there can start developing with the ImmotionRoom SDK for free right now — head to Immotionar.com to find out more and download it, along with the runtime and sample apps. I can’t wait to see where this concept leads and how far these guys can push the technology.
A huge thank you to Antony Vitillo for spending some time talking about their tech! You can find out more at ImmotionAR’s website and follow what they’re up to on their Twitter. If you’ve got any suggestions for their team, you can get in touch via email at email@example.com.