Build a Simple HoloLens application

Microsoft HoloLens is a pair of augmented reality head-mounted smart glasses, developed and manufactured by Microsoft, that brings immersive holographic experiences to its users.

This article describes the necessary steps you should follow to build, load, and run a simple WaveEngine application on the Microsoft HoloLens device.

At the moment, the HoloLens extension is in an early preview stage, and is still under development.

Previous steps

There is no separate SDK for HoloLens. Holographic app development uses Visual Studio 2015 Update 2 with the Windows 10 SDK (version 1511 or later).

Don’t have a HoloLens? You can install the HoloLens emulator to build holographic apps without a HoloLens. You can find more information concerning HoloLens tools here.

Create a Stereo Camera

In this step, you will learn how to create a Stereo Camera, which will render the scene into the headset.

WaveEngine has a set of components that allows you to create a Stereo Camera regardless of the VR technology used (Oculus, Google Cardboard, HoloLens…).

  1. First of all, create an Empty Entity 3D. This action creates an Entity that only has a Transform3D component:    
  2. Add the VRCameraRig component to the previously created entity. This component is responsible for creating the stereo camera configuration:
    The VRCameraRig component creates a hierarchy under its owner entity:

Stereo Camera hierarchy

This hierarchy is used to maintain a 3D stereo camera system that allows you to draw each eye separately and helps developers know where every special feature is located (eyes, position tracker, etc.).

A brief overview of the Stereo camera hierarchy:

  • TrackingSpace. This is a single entity with a Transform3D component. It allows you to adjust the VR tracking space to the app requirements. For example, by default, all tracking position units given by Microsoft HoloLens are measured in meters, so if you want to use centimeters instead of meters, you only need to scale the TrackingSpace entity 100 times (Transform3D.LocalScale = new Vector3(100, 100, 100)).
    • LeftEyeAnchor. This is the left eye camera entity. Always coincides with the position of the left eye.
    • RightEyeAnchor. This is the right eye camera entity. Always coincides with the position of the right eye.
    • CenterEyeAnchor. This entity is placed in the average location between the left and right eye position. This entity is commonly used to know the “head” position.
    • TrackerAnchor. A simple entity (only contains a Transform3D), that is placed in the location of the position tracker camera.
      • Note: The TrackerAnchor entity is commonly used in other integrations (such as Oculus Rift). In a HoloLens application, this entity is not necessary.
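For example, the TrackingSpace scaling described above could be done from code. This is a sketch, not a definitive implementation: the entity name "camerarig" and the dotted lookup path are illustrative and depend on how your scene is structured:

[code language="csharp"]
// Scale the tracking space so that 1 unit = 1 cm instead of 1 m.
// "camerarig" is assumed to be the name of the entity holding the VRCameraRig.
var trackingSpace = this.EntityManager.Find("camerarig.TrackingSpace");
trackingSpace.FindComponent<Transform3D>().LocalScale = new Vector3(100, 100, 100);
[/code]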

VRCameraRig component

The VRCameraRig component is responsible for controlling the stereo camera system. It has the following properties:

  • Monoscopic. If true, both eyes see the same image; this option disables the stereoscopic 3D effect.
  • VRMode. Specifies how the stereo camera will be rendered. It accepts the following values:
    • HmdMode. Both eyes will be rendered in the headset. This is the default value.
    • AttachedMode. Only the center eye is rendered, into the application window; nothing is rendered into the headset. This mode is useful for debugging purposes.
    • All. Both modes are rendered at the same time (both eyes in the headset and the center eye in the application window).
  • Camera properties:
    • NearPlane, FarPlane. Sets the near and far plane of the stereo camera.
    • BackgroundColor. Sets the background color of the stereo camera.
    • ClearFlags. Sets the clear flags of the stereo camera.

Note: For HoloLens devices, we recommend keeping the default values, adjusting only the near and far planes to fit your scene requirements.
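Putting these properties together, configuring the rig from code could look like the following sketch (it assumes a cameraRigEntity variable that already has the VRCameraRig component attached, and the sample values are illustrative):

[code language="csharp"]
var cameraRig = cameraRigEntity.FindComponent<VRCameraRig>();

cameraRig.VRMode = VRMode.HmdMode;   // render both eyes into the headset (default)
cameraRig.Monoscopic = false;        // keep the stereoscopic 3D effect
cameraRig.NearPlane = 0.1f;          // adjust both planes to your scene scale
cameraRig.FarPlane = 1000f;
[/code]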

Basic HoloLens integration

The VRCameraRig creates the stereoscopic camera entity hierarchy, but it is VR platform agnostic, so it is designed to work with other VR platforms as well (like Google Cardboard, Oculus Rift…).

The HoloLens integration can be done in the following steps:

  1. Add the HololensProvider component to your camera rig entity (the entity that has the VRCameraRig component). This component is responsible for updating the VRCameraRig hierarchy entities with the HoloLens headset information, and configures the cameras to render inside the HoloLens headset.
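The rig creation and this provider step can be combined when building the camera rig entity from code. A minimal sketch, assuming WaveEngine's fluent AddComponent API (the entity name "camerarig" is arbitrary):

[code language="csharp"]
// Create the camera rig entity: a Transform3D, the platform-agnostic
// VRCameraRig, and the HoloLens-specific provider on top.
var cameraRig = new Entity("camerarig")
    .AddComponent(new Transform3D())
    .AddComponent(new VRCameraRig())
    .AddComponent(new HololensProvider());

this.EntityManager.Add(cameraRig);
[/code]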

Create the HoloLens launcher

A HoloLens application is basically a Universal Windows Platform (UWP) application with some special initialization.

In that case, to run your application on a HoloLens device, you need to follow these steps:

  1. Go to Edit > Project Properties and add a new Profile:
  2. Select a new Universal Windows Platform (UWP) profile, and the most important thing, Select the HoloLens in the Launcher Type combo box:

After that, try adding some models to your scene (an airplane in this example), and place the camera rig entity at your desired world origin position. Open the recently created HoloLens solution, and run it:

And when you execute your application, you will see your first awesome HoloLens example!

Spatial Inputs

All HoloLens spatial inputs are exposed to the user via the SpatialInputService service. Therefore, to start interacting with gestures, you need to register that service. This is usually done in your Game.cs:

[code language="csharp"]
// Game.cs file
WaveServices.RegisterService(new SpatialInputService());
[/code]


The SpatialInputService has the following properties:

  • IsConnected. A boolean value indicating whether the application is running on a valid HoloLens device or emulator.
  • SpatialState. Exposes the current gesture information and has the following properties itself:
    • IsSelected. A boolean that indicates whether the Select gesture is being performed at that moment. More information here.
    • Confidence. The gesture confidence value. Higher values indicate that the gesture is well captured.
    • Source. An enum value that indicates the source of the gesture (Hand, Voice, Clicker controller).

For example, a common scenario in a Wave behavior could be the following:

[code language="csharp"]
var spatialInput = WaveServices.GetService<SpatialInputService>();

if (spatialInput.IsConnected)
{
    var state = spatialInput.SpatialState;

    if (state.IsSelected)
    {
        // Do an action while the Select gesture is active
    }
}
[/code]
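In context, that fragment would typically live inside a Behavior's Update method, which WaveEngine calls every frame. A sketch (the class name is illustrative, and the null check guards against the service not being registered):

[code language="csharp"]
public class SelectGestureBehavior : Behavior
{
    protected override void Update(TimeSpan gameTime)
    {
        var spatialInput = WaveServices.GetService<SpatialInputService>();

        if (spatialInput != null && spatialInput.IsConnected)
        {
            if (spatialInput.SpatialState.IsSelected)
            {
                // Do an action while the Select gesture is active
            }
        }
    }
}
[/code]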

Keyword commands

Voice is a key form of input on HoloLens. It allows you to command a hologram directly, without relying on gestures: you only need to say your command.

These functions are exposed in the KeywordRecognizerService. To start interacting with keyword commands, you need to register that service. This is usually done in your Game.cs class:

[code language="csharp"]
// Game.cs file
WaveServices.RegisterService(new KeywordRecognizerService());
[/code]


Later, you only need to tell the service the keyword list, start the voice recognition, and subscribe to the OnKeywordRecognized event:

[code language="csharp"]
var keywordService = WaveServices.GetService<KeywordRecognizerService>();

if (keywordService.IsConnected)
{
    // 1. Set the keywords
    keywordService.Keywords = new string[] { "Begin Action", "End Action" };

    // 2. Start voice recognition
    keywordService.Start();

    // 3. This event is fired when a specified keyword is recognized
    keywordService.OnKeywordRecognized += this.OnKeywordRecognized;
}
[/code]

When a keyword is recognized, the service calls your subscribed method. There you can execute your desired actions:

[code language="csharp"]
// This method is fired when a keyword is recognized
private void OnKeywordRecognized(KeywordRecognizerResult result)
{
    switch (result.Text)
    {
        case "Begin Action":
            // Begin the action
            break;

        case "End Action":
            // End the action
            break;
    }
}
[/code]

You can find the source code of this project available on the WaveEngine GitHub page:
