This article was first posted on gooroo.io
Developing applications and experiences for HoloLens is not hard to get started with, but after you have started a number of projects and worked through the same setup and plumbing a few times, you wish there was an easier way to do it. Well, there is! Created by the team that produced Fragments and Young Conker, the HoloToolkit is a collection of scripts and components intended to accelerate the development of holographic applications targeting Windows Holographic.
The toolkit is free and comes in two varieties: one for Visual Studio and one for Unity. Developing for HoloLens means spending a LOT of time in the Unity editor, and the HoloToolkit for Unity is a simple .unitypackage file you can download and import into your Unity project.
This will show a new menu item “HoloToolkit” in Unity.
These menu items are the most visible change the HoloToolkit makes to Unity. Having menu items to perform the most trivial and basic configuration tasks for a new project gets you to the fun bits a lot quicker, and it makes sure you haven’t missed anything.
Apart from the obvious menu item changes, the HoloToolkit also brings with it a LOT of scripts, prefabs, materials, shaders and plugins you can use in Unity3D.
While the sheer number of scripts in the toolkit puts a full walkthrough out of scope for this article, I’ll outline some of the capabilities they bring.
Some of the issues with capturing a user’s gaze are that users don’t hold their heads perfectly still, and they might try to look at objects that are hidden behind other objects. With the toolkit you get GazeManager and GazeStabilizer. These two scripts can help you determine whether users are gazing at a particular object, whether there are multiple objects in the gaze direction, stabilize the gaze for precision targeting based on general direction and interpolation algorithms, and more.
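To give an idea of how this looks in practice, here is a minimal sketch of polling the GazeManager for the currently focused object. The `Instance` and `HitObject` names are how I remember the API from the HoloToolkit version I used; the namespace and member names have moved around between releases, so check them against your copy of the toolkit.

```csharp
using UnityEngine;
using HoloToolkit.Unity; // in some toolkit versions this lives in HoloToolkit.Unity.InputModule

public class GazeLogger : MonoBehaviour
{
    private GameObject lastFocused;

    void Update()
    {
        // GazeManager raycasts from the user's head each frame;
        // HitObject is the object the (stabilized) gaze currently hits, or null.
        GameObject focused = GazeManager.Instance.HitObject;
        if (focused != lastFocused)
        {
            lastFocused = focused;
            Debug.Log(focused != null ? "Gazing at: " + focused.name : "Gazing at nothing");
        }
    }
}
```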
To deliver input to a mixed reality experience, you generally rely on GGV: gaze, gesture and voice. The HoloToolkit provides an InputManager script which has a large number of functions and features to manage the various ways a user might input data to a HoloLens app using gaze and gesture. The script also dispatches relevant events, such as hold for a gesture and change of focus for gaze. The InputManager script is 610 lines of genuine goodness that provides a bunch of plumbing for managing gaze and gestures in your app.
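As a sketch of that event-dispatch model: you put a script on a focusable object, implement one of the toolkit’s handler interfaces, and the InputManager routes events to it. The interface and event-data names below (`IFocusable`, `IInputClickHandler`, `InputClickedEventData`) are from the HoloToolkit InputModule as I recall it; verify them against your toolkit version.

```csharp
using UnityEngine;
using HoloToolkit.Unity.InputModule;

// Attach to a GameObject with a collider; the InputManager dispatches
// focus changes and air-tap ("click") events to these interfaces.
public class HologramInteraction : MonoBehaviour, IFocusable, IInputClickHandler
{
    public void OnFocusEnter() { Debug.Log(name + " gained gaze focus"); }
    public void OnFocusExit()  { Debug.Log(name + " lost gaze focus"); }

    public void OnInputClicked(InputClickedEventData eventData)
    {
        // The user air-tapped while this object had focus.
        Debug.Log(name + " was selected");
    }
}
```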
While voice commands are not difficult to implement (read my article on developing for HoloLens for more details), some of the process can be tedious. The KeywordManager script not only removes some of that plumbing code, but also provides a way to set keywords, or voice commands, explicitly in Unity rather than in code. It also gives you the option to start the keyword recognizer manually, which is very handy when you don’t want the app listening for voice commands from the start.
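For context, this is roughly the plumbing the KeywordManager saves you from writing yourself, using Unity’s built-in KeywordRecognizer from `UnityEngine.Windows.Speech` (the keyword strings here are just illustrative examples):

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

public class ManualVoiceCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    void Start()
    {
        // Every keyword must be declared up front; the recognizer
        // raises an event whenever one of them is heard.
        recognizer = new KeywordRecognizer(new[] { "reset scene", "place object" });
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start(); // KeywordManager lets you defer this until you choose
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        Debug.Log("Heard: " + args.text);
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            recognizer.Dispose();
        }
    }
}
```

With the KeywordManager, the keyword list and the responses to each keyword are instead configured in the Unity Inspector.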
The collaboration story on the HoloLens is one of the most compelling features for building real-world experiences that will actually have some impact. Being able to see the same digital assets on two different devices in real time, while interacting with them, is incredibly powerful.
The HoloToolkit provides some enhancements to the sharing experience, which makes it, again, just a bit simpler. It provides everything from additional menu items in Unity, to pairing scripts and session management.
The defining factor of the HoloLens over other digital reality devices is spatial mapping, which allows interaction between physical objects and digital assets. The spatial grid is a relatively high definition set of polygons that make up the space the user is in. The hard part comes when you need to identify which surfaces you can use to place objects: what is a wall, what is the floor, and so on. And again, HoloToolkit to the rescue!
There are scripts like the SpatialMappingManager that let you select materials, draw shadows, visualise the spatial mapping and set the spatial mapping source. The TapToPlace script lets users move objects around and place them on real-world surfaces.
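As a sketch, hiding the spatial mesh once the app has finished scanning the room might look like this; `DrawVisualMeshes` is the property name from the HoloToolkit version I used, so confirm it against your version.

```csharp
using UnityEngine;
using HoloToolkit.Unity;

public class MeshVisibilityToggle : MonoBehaviour
{
    // Show the spatial mapping mesh while scanning, hide it afterwards.
    public void OnScanningComplete()
    {
        SpatialMappingManager.Instance.DrawVisualMeshes = false;
    }
}
```

TapToPlace itself needs no code at all: add the component to an object in the editor, and the user can tap the object, move it with their gaze, and tap again to place it on the spatial mesh.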
Use It. Now.
The HoloToolkit really is a no-brainer when you start a new mixed reality project. The time saved, both in setting up the project and in the learning curve of implementing the various event handlers and features, is significant. Because the project is on GitHub, it is regularly updated and maintained, and pull requests are taken in frequently.
If you haven’t already got your first project on the way, check out my introduction course on Pluralsight.