AR support for HoloLens?

The current docs explicitly state that AR is not supported on HoloLens 2. Will this change soon, or is there a way to enable it anyway? Also, are there any docs on which gestures map to which inputs?

Hi,

AR is actually supported on HoloLens 2 with CloudXR 2.0.
You have to use the files available in Sample\WindowsMR\HoloLens.

Hand tracking works fine (3DoF on translation and 1DoF on rotation, plus Trigger_Click).
Beyond that, by digging into the code I found this mapping:
HOLD = Touchpad_Click
SINGLE_TAP = Trigger_Click
HAND_MOVED_LEFT = ApplicationMenu
HAND_MOVED_RIGHT = System
HAND_MOVED_UP = Grip_Click
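
If it helps, here is a rough Unity C# sketch of how that mapping could be polled through the OpenVR C# bindings (openvr_api.cs from the SteamVR plugin) in the server-side app. The class name and the log messages are placeholders I made up, not something from the CloudXR samples:

```csharp
using UnityEngine;
using Valve.VR; // openvr_api.cs from the SteamVR plugin

// Rough sketch: poll the OpenVR controller state that the CloudXR client
// forwards for the HoloLens hands, and check the buttons listed above.
public class HandGestureReader : MonoBehaviour
{
    void Update()
    {
        CVRSystem vrSystem = OpenVR.System;
        if (vrSystem == null) return;

        for (uint i = 0; i < OpenVR.k_unMaxTrackedDeviceCount; i++)
        {
            if (vrSystem.GetTrackedDeviceClass(i) != ETrackedDeviceClass.Controller)
                continue;

            VRControllerState_t state = new VRControllerState_t();
            uint size = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRControllerState_t));
            if (!vrSystem.GetControllerState(i, ref state, size))
                continue;

            // SINGLE_TAP arrives as Trigger_Click, HOLD as Touchpad_Click, etc.
            if (IsPressed(state, EVRButtonId.k_EButton_SteamVR_Trigger))
                Debug.Log($"Device {i}: single tap (Trigger_Click)");
            if (IsPressed(state, EVRButtonId.k_EButton_SteamVR_Touchpad))
                Debug.Log($"Device {i}: hold (Touchpad_Click)");
            if (IsPressed(state, EVRButtonId.k_EButton_Grip))
                Debug.Log($"Device {i}: hand moved up (Grip_Click)");
            if (IsPressed(state, EVRButtonId.k_EButton_ApplicationMenu))
                Debug.Log($"Device {i}: hand moved left (ApplicationMenu)");
            if (IsPressed(state, EVRButtonId.k_EButton_System))
                Debug.Log($"Device {i}: hand moved right (System)");
        }
    }

    static bool IsPressed(VRControllerState_t state, EVRButtonId button)
    {
        return (state.ulButtonPressed & (1ul << (int)button)) != 0;
    }
}
```

In a real app you would probably route these through SteamVR Input actions rather than raw button masks, but the raw poll makes the mapping above easy to verify.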

Thanks. That is the code I am already using. Good to hear that it actually supports AR rendering against a transparent background and not just fullscreen VR… Have you seen any example of how one should use it, and how to configure apps using OpenVR to take advantage of this? (In Unity? In Unreal? Or some sample C++ OpenVR project?) I'm just trying to understand what is possible before I invest too much time in this. In the end I am looking to track real objects and place server-side rendered content in relation to them. For a first test it would be great to see this working at all by placing a server-side rendered object on surfaces, just like in the basic HoloLens Unity projects, but with the heavy lifting done on the server.

Hi again,

In Unity, you have to hide the environment and show only the objects you want to display on the AR device. The camera must then use Solid Color clearing, with black as the background, and the far clipping plane at around 10 meters.
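
For completeness, here is a minimal sketch of that camera setup as a script, using the standard built-in Camera API; the component name and the environmentRoot field are placeholders for whatever your scene uses (you can of course just set the same values in the Inspector instead):

```csharp
using UnityEngine;

// Minimal sketch of the camera setup described above, assuming a Unity
// scene streamed by the CloudXR server to the HoloLens 2 client.
// Black is treated as transparent on the AR side, so only the objects
// you leave visible end up composited over the real world.
public class CloudXRArCameraSetup : MonoBehaviour
{
    // Placeholder: root of the scenery you do NOT want streamed to the device.
    public GameObject environmentRoot;

    void Start()
    {
        Camera cam = Camera.main;

        // Solid black background instead of a skybox.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;

        // Keep the far clip plane short (~10 m) as suggested above.
        cam.farClipPlane = 10f;

        // Hide the environment so only the AR content is rendered.
        if (environmentRoot != null)
            environmentRoot.SetActive(false);
    }
}
```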

However, the issue you'll face is that you won't be able to track a real object with the HoloLens 2, because the running app (the CloudXR client) doesn't get the camera video stream that the common tracking code relies on…

Thanks! I will test this once I get the HoloLens 2 to work a little better with the basics. And I guess I can create a more extensive HoloLens 2 client app that sends tracking info to the server in some way.
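
In case it is useful, here is a purely hypothetical sketch of such a side channel: a C# component that pushes a tracked pose to the server over UDP. The address, port, and text format are made up for illustration, this is not part of CloudXR, and UWP/HoloLens networking specifics are not handled:

```csharp
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Hypothetical side channel: push a tracked pose to the server over UDP.
// Host, port and the plain-text payload are placeholders for this sketch.
public class PoseSender : MonoBehaviour
{
    public string serverHost = "192.168.1.100"; // placeholder address
    public int serverPort = 48010;              // placeholder port
    public Transform trackedObject;             // whatever you are tracking

    private UdpClient udp;

    void Start()
    {
        udp = new UdpClient();
    }

    void Update()
    {
        if (trackedObject == null) return;

        Vector3 p = trackedObject.position;
        Quaternion q = trackedObject.rotation;

        // Simple text payload: position + rotation, one datagram per frame.
        string msg = $"{p.x} {p.y} {p.z} {q.x} {q.y} {q.z} {q.w}";
        byte[] data = Encoding.UTF8.GetBytes(msg);
        udp.Send(data, data.Length, serverHost, serverPort);
    }

    void OnDestroy()
    {
        udp?.Close();
    }
}
```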