A new sample project from Meta for Epic’s Unreal Engine gives developers the building blocks for more robust open-air hand interactions in VR.
If you own a Meta Quest, you can test the Oculus Hand Gameplay Showcase for Unreal on App Lab now. Just leave the Oculus Touch controllers behind.
Most virtual reality content is made in either Unity or Unreal, the latter being the work of Fortnite creator Epic Games. Unreal is often the engine of choice for bigger-budget game projects, and Epic’s CEO, Tim Sweeney, is one of the biggest proponents of the so-called “Metaverse” concept.

Hand tracking is a secondary control system on Quest 2 (meaning the hardware isn’t optimized for it), but on future computerized goggles or glasses from Meta that could change, and gestures made with your hands in the open air could become the primary control system. That means Meta needs to get developers building more of their apps to work with hand tracking, and this release is the latest in a series of samples from Facebook (and now Meta) showing how developers can support robust interactions using the system. For Unreal developers, this sample project “contains reusable components based on the more robust hand tracking mechanics from First Steps with Hand Tracking and Tiny Castles.”
Mechanics covered in the sample include teleportation, grabbing, throwing, and pushing buttons. Meta has even posted the source code, “Oculus showcase of hand tracking based interactions in Unreal,” on GitHub.
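We haven’t dug into the sample’s source, but a grab-style interaction like the ones listed above often boils down to simple joint math: compare the distance between two tracked fingertip joints against a threshold. Here is a minimal, hypothetical Python sketch of that idea; the function names, threshold, and joint positions are our own illustrative assumptions, not code from Meta’s sample.

```python
import math

# Roughly 2 cm between thumb tip and index tip counts as a pinch.
# This threshold is an assumed tuning value, not taken from the sample.
PINCH_THRESHOLD_M = 0.02

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Return True when the two fingertip joints are close enough to count as a pinch/grab."""
    return distance(thumb_tip, index_tip) < threshold

# Hypothetical fingertip positions in meters:
print(is_pinching((0.10, 0.00, 0.30), (0.11, 0.00, 0.30)))  # 1 cm apart -> True
print(is_pinching((0.10, 0.00, 0.30), (0.16, 0.00, 0.30)))  # 6 cm apart -> False
```

A real engine-side implementation would read these joint positions from the hand tracking API each frame and add hysteresis so the grab doesn’t flicker at the threshold, but the core test looks much like this.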
We haven’t tried the project yet but will be downloading it soon. If you’ve tried it out, please share your thoughts in the comments below.