|Type|Software Development Kit (SDK)|
|Industry|Virtual Reality, Augmented Reality, and Mixed Reality|
|Supported Devices|All devices that support ARKit|
|Release Date|September 2017|
A number of tech firms are launching, or planning to launch in the near future, hardware and software products related to virtual reality and augmented reality. In the first half of September 2017, ManoMotion, a computer vision and machine learning company, announced its plan to integrate its phone-based gesture control tool with ARKit, Apple's augmented reality development kit. Once complete, this integration will bring hand tracking into AR, allowing users' hand gestures to be tracked using the phone's camera, sensors, and onboard processors.
Hand Gesture Input for Apple’s ARKit
The race to lead the world into a new era dominated by augmented reality and virtual reality has already begun with the announcement of Apple's ARKit and Google's ARCore. Now, tech enthusiasts are eagerly waiting to see how developers use these software development kits to create apps and games that let users experience augmented reality on their Apple and Android devices. We have already seen a few developers release experiments and prototypes built with the two kits, but ManoMotion is the first firm to integrate hand tracking into the augmented reality platform.
Even before the announcement, ManoMotion was widely known for its work on gesture-based input. Talking about ManoMotion's latest presentation, the company's CEO Daniel Carlman shared some interesting insights. Like its predecessor, the new SDK tracks many of the hand's 27 degrees of freedom (DOF). The new build is also capable of tracking depth and identifying hand gestures such as release, grab, tap, click, and swipe. The beauty of the integrated version is its ability to perform these complex functions without taxing memory, the CPU, or the battery.
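To make the idea concrete, here is a minimal sketch of how a gesture classifier might map tracked hand joints to discrete labels like pinch or grab. The joint layout, function names, and thresholds are illustrative assumptions, not ManoMotion's actual API; a real tracker draws on many more of the hand's 27 degrees of freedom.

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y, z) points in metres."""
    return math.dist(a, b)

def classify(thumb_tip, index_tip, palm):
    """Classify one frame of hand data into a coarse gesture label.

    Thresholds are hypothetical, chosen only for illustration.
    """
    pinch_gap = distance(thumb_tip, index_tip)
    finger_curl = distance(index_tip, palm)
    if pinch_gap < 0.02:        # fingertips touching
        return "pinch"
    if finger_curl < 0.05:      # index finger curled toward the palm
        return "grab"
    return "open"

print(classify((0.01, 0, 0), (0.02, 0, 0), (0, 0, 0)))  # pinch
```

In practice such a classifier would run per camera frame, with depth estimated from the phone's single RGB feed rather than supplied directly.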
ManoMotion even released a video showing its new ARKit-driven app in action. The clip clearly draws our attention to ARKit's spatial mapping abilities. On the device's display, we can see the application identify the user's hand and mimic its physical gesture (in this case, a flicking motion). The flick attempts to land a virtual ping-pong ball in a virtual cup.
ManoMotion's ARKit-based beer pong game might not seem like a big achievement to some, but the ability to use the hand to interact with the virtual world on a phone opens up a world of possibilities. Augmented reality headsets like the Meta 2 and HoloLens are already far ahead in this regard, relying on gesture tracking to make their user interfaces (UI) more interactive. Introducing gesture tracking into augmented reality has practical applications in numerous fields: placing virtual objects on real surfaces, turning internet-connected switches on and off, and resizing windows are just some of the benefits of blending the two concepts.
According to Daniel Carlman, the inability to instinctively interact with AR objects in virtual space has, so far, hampered the growth of augmented reality technology. He considers being the first to integrate gesture control into ARKit a major milestone for the company, and he is keen to see how developers create and rewrite the concept of interaction in the AR arena.
How Does the Integration of Gesture Control into ARKit Help?
ManoMotion's announcement will allow developers to build ARKit-based games, apps, and digital content with hand-tracking functionality. What this means is:
- Users will be able to use their hands in 3D to maneuver virtual objects in mixed reality and augmented reality scenes.
- Virtual objects can be controlled with either the left or the right hand.
- Hand gestures and virtual-object manipulations can be defined, and the extent of manipulation can be controlled by users. The SDK comes with its own set of predefined hand gestures, including grab, swipe, pinch, push, and point, which can be used when developing augmented reality content.
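The points above can be sketched as a simple event loop that routes predefined gestures to virtual-object manipulation. The event names and handler below are hypothetical, made up for illustration; they are not ManoMotion's actual SDK interface.

```python
class VirtualObject:
    """A toy stand-in for an AR object that gestures can manipulate."""

    def __init__(self):
        self.position = 0.0   # 1-D position, for simplicity
        self.scale = 1.0
        self.held = False

    def handle(self, event, value=None):
        """Route one gesture event to an object manipulation."""
        if event == "grab":
            self.held = True
        elif event == "release":
            self.held = False
        elif event == "swipe" and self.held:
            self.position += value   # drag only while held
        elif event == "pinch":
            self.scale *= value      # resize the object

cup = VirtualObject()
cup.handle("grab")
cup.handle("swipe", 0.3)   # moves the held object
cup.handle("pinch", 2.0)   # doubles its size
cup.handle("release")
cup.handle("swipe", 0.5)   # ignored: nothing is held
print(cup.position, cup.scale)  # 0.3 2.0
```

The design point is that the SDK supplies the gesture stream; what each gesture does to the scene remains the developer's decision.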
Downloading ManoMotion’s ARKit Integrated SDK
ManoMotion has announced that the ARKit-integrated hand-tracking SDK can be downloaded from its official website. Initially, the new build will be available only for Unity iOS; in subsequent updates, the company plans to extend support to native iOS.
To get their hands on the ARKit-integrated SDK, developers must sign up and join the ManoMotion developer community. Apart from the SDK, developers also get access to product documentation, comprehensive tutorials, guides, and forums.
ManoMotion - About the Company
ManoMotion is a computer vision company headquartered in Stockholm, Sweden, with an office in Palo Alto, US. It is a group of software development engineers and computer vision and machine learning scientists working to expand the horizons of technology. The company has spent seven years researching gesture technology and has seen remarkable success in revolutionizing human-machine interaction.
In the area of gesture technology, the framework developed by ManoMotion needs just an RGB camera to recognize gestures and track hand movements accurately in real time, and it performs these complex functions with low processing overhead. The company offers businesses and developers the tools and knowledge required to integrate gesture-based interactions into their own creations, like apps and games. ManoMotion's solutions and frameworks support several platforms and focus on multiple industries, including Automotive, Consumer Electronics, Virtual Reality, Augmented Reality, Mixed Reality, and Embedded Systems.