Today's Virtual Reality hardware, by and large, requires users to interact with their environments through handheld controllers studded with a myriad of buttons, sticks, and triggers that allow for all types of input.
Never an industry to leave well enough alone, VR hardware and software creators looked inward to see if there were other ways to interface with environments, ways not encumbered by a bulky, obtrusive, distracting controller. So what is universal, intuitive, non-distracting, and used to control things every day? Hands.
We at VisionThree were definitely part of that onslaught of creators who thought there had to be a better way. With the help of a third-party peripheral, our developers cracked the code of hand-tracking a while ago, as evidenced here. Using Leap Motion hardware literally affixed to the front of the head-mounted display (HMD), we built VR software that could track the user's hands in space and allow for various gesture controls for interacting with objects and environments. Pretty cool! Some would say VERY cool!
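For the technically curious, here's roughly what that looks like in code. This is a minimal sketch using the classic Leap Motion C++ API, not our actual production code: connect to the controller, tell the tracking service the sensor is mounted on a headset, and read back palm and fingertip positions every frame.

```cpp
#include <iostream>
#include "Leap.h"

int main() {
    Leap::Controller controller;
    // Let the tracking service know the sensor is mounted on an HMD,
    // looking out at the hands rather than up from a desk.
    controller.setPolicy(Leap::Controller::POLICY_OPTIMIZE_HMD);

    // In a real app this polling would be synced to the render loop,
    // once per displayed frame.
    while (true) {
        Leap::Frame frame = controller.frame();  // latest tracking data

        for (const Leap::Hand& hand : frame.hands()) {
            Leap::Vector palm = hand.palmPosition();  // millimeters, sensor-relative
            std::cout << (hand.isLeft() ? "Left" : "Right")
                      << " palm at (" << palm.x << ", " << palm.y
                      << ", " << palm.z << ")\n";

            // Individual fingertip positions are what drive gesture
            // checks and object interaction.
            for (const Leap::Finger& finger : hand.fingers()) {
                Leap::Vector tip = finger.tipPosition();
                (void)tip;
            }
        }
    }
}
```

Those raw positions then get mapped into the VR scene, where they can grab, poke, and point at things.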
We caught up with Nate Logan, VisionThree's Technical Director and Lead Engineer for the V3CORE Training Platform (V3's in-house VR Training product), to get some of his thoughts on this very innovative subject. "It's hard to beat a modern VR controller for simulating precise tools, but they can be a bit intimidating for anyone who hasn't grown up with a Nintendo or an iPad. That said, it's downright magical to see your hands appear in VR, right where you expect to see them," said Logan.
Hand-tracking, as Logan sees it, is more of a User Experience (UX) problem than a technical one. The things that make actions and gestures feel intuitive are social constructs that we all, as humans, agree on. "You've got to choose gestures that everyone understands, and preferably gestures that people do intuitively the same way. Then you have to trim that down so that it's as precise as possible while keeping it open enough where everybody wins."
Sounds easy enough, right? He went on to explain some of the challenges of initially going down the path of hand-tracking with third-party hardware.
“With Leap, our biggest challenge was that it was looking at your hand from a single perspective. So if it couldn’t see the back of your hand, it had to guess at where your fingers were. And it didn’t always guess right.”
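That guessing shows up directly in the API. The Leap SDK exposes a per-hand confidence score, and in practice you learn to ignore or smooth low-confidence frames. Here's a sketch of that kind of gating; the threshold is purely illustrative, not a value from our shipping software:

```cpp
#include "Leap.h"

// Returns true when a tracked hand is reliable enough to drive a gesture.
// When the sensor can't see the back of the hand it extrapolates finger
// positions, and Leap::Hand::confidence() (0.0 to 1.0) drops accordingly.
bool isHandReliable(const Leap::Hand& hand) {
    const float kMinConfidence = 0.7f;  // illustrative threshold only
    return hand.isValid() && hand.confidence() >= kMinConfidence;
}
```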
So we just needed someone to keep pushing the hardware further. (Trumpet fanfare!) And then there was Zuckerberg! Okay, honestly, there was the team that Zuckerberg pays, but we digress. When Facebook purchased Oculus, little did we know what that would mean.
In September 2019, at the Oculus Connect 6 keynote, Oculus announced that at some point in 2020 it would allow VR users to engage with the virtual world in a much more intuitive way: hand-tracking was coming to its revolutionary Quest hardware. Then Oculus stunned everyone in December 2019 by announcing it was set to deploy the functionality early!
This magical capability was to arrive directly on devices via an OS software update. Users would then be able to see and feel the capability in the Oculus lobby and in select menus and lists native to the OS. The software development kit (SDK) was to follow the update a week or so later so content creators could "get their own hands dirty." (As we went to press, the SDK had still not been released.) Oculus maintains that it has evolved this capability.
See their über-flashy marketing video here. But probably the best part: the hardware is already built in! No new gadgets to install, tack on, or risk losing. And pardon us nerding out for a minute, but the Quest simply has more cameras than Leap ever had. The more to see you with, my dear!
Okay, okay, so how does it FEEL?
We again tapped Nate Logan (who is incapable of lying, by the way) to have a look.
“Quest's new hand tracking feature could go a long way toward lowering the barrier to entry for VR training. Each finger is individually tracked. I can make most gestures you can think of. I'm able to just look at an object and pinch my fingers to activate it, or flick my hand to scroll through content. And more gestures should be possible in the future.”
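Since the Quest SDK wasn't in developers' hands at press time, we can't show you its actual API. But the pinch gesture Logan describes is easy to picture with the Leap API we already use, which boils thumb-to-index proximity down to a single pinch strength. A rough sketch, with hysteresis thresholds that are our own illustrative choices:

```cpp
#include "Leap.h"

// Fires a one-shot "select" when the user pinches thumb and index finger.
// Leap::Hand::pinchStrength() runs from 0.0 (open hand) to 1.0 (full pinch);
// the two thresholds add hysteresis so a wavering value near the trigger
// point doesn't fire on every frame.
class PinchSelector {
public:
    bool update(const Leap::Hand& hand) {
        const float kPinchOn  = 0.8f;  // illustrative values, not tuned ones
        const float kPinchOff = 0.5f;

        const float strength = hand.pinchStrength();
        bool fired = false;

        if (!pinching_ && strength > kPinchOn) {
            pinching_ = true;
            fired = true;  // caller activates whatever the user is looking at
        } else if (pinching_ && strength < kPinchOff) {
            pinching_ = false;
        }
        return fired;
    }

private:
    bool pinching_ = false;
};
```

Pair that with a gaze ray from the headset and you have the look-and-pinch interaction Logan describes.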
With a glimmer of sadness and wistfulness in this time of giving, Logan lamented the absence of the SDK.
“Oculus (content) developers haven't been handed the keys to use it in apps just yet, but we're looking forward to seeing what we can do with it.”
So what could it all mean? With developers gaining access to this new feature soon, future VR applications could see a paradigm shift in how we create experiences for our customers. Using your hands inside a virtual space just as you would in the real world is a game changer. If things progress the way the Nate Logans of the world want them to, this is about to be a very exciting time. To take an already highly immersive training conduit and make it even more so is truly groundbreaking in an already groundbreaking space!
If you'd like to talk to us about training in Virtual Reality in your industry and/or want to see the V3CORE Training Platform in action, please reach out today. We can accommodate in-person demonstrations at our Customer Experience Center in Indianapolis or virtual meetings via Zoom. We love to make new contacts, educate, and explore where VR technology can go. You can also follow us on Facebook and LinkedIn, where we regularly post updates about what's going on with us and the VR/AR industries at large.