Hololight Stream supports Unity's AR Foundation. The currently supported AR Foundation features with this package are listed in the table below.
Combined with the AR Foundation support, Hololight Stream also provides touch input for the iOS Client via the new Input System.
Feature | HoloLens 2 | iOS/iPadOS | Quest 2 / Pro / 3 | Magic Leap 2 | Lenovo VRX | Desktop |
---|---|---|---|---|---|---|
Collaborative participants | | | | | | |
Camera | ✓ | ✓ | | | | |
Device tracking | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Environment probes | | | | | | |
Face tracking | | | | | | |
Human segmentation | | | | | | |
Light estimation | | | | | | |
Meshing | ✓ | ✓ | | | | |
Occlusion | * | | | | | |
Plane tracking | ✓ | ✓ | | | | |
Point clouds | | | | | | |
QR Code Tracking | ✓ | ✓ | | | | |
Raycast | ✓ | | | | | |
Session management | | | | | | |
Spatial Anchors | ✓ | ✓ | ✓ | | | |
2D Image tracking | ✓ | | | | | |
2D & 3D body tracking | | | | | | |
3D Object tracking | | | | | | |
* Supported outside of AR Foundation, see Occlusion Support
- Prerequisites
  - AR Foundation 5.0.5
  - Follow the steps listed in First Installation
- In the scene, add an AR Session Origin object (`GameObject -> XR -> AR Session Origin`)
- Expand the AR Session Origin object and select the AR Camera
- Set the AR Camera background to have an alpha value of `0` within the inspector window
- In the scene, add an AR Session object (`GameObject -> XR -> AR Session`)

Follow the steps listed in First Run.
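If preferred, the AR Camera's background alpha can also be set from a script instead of the inspector. The following is a minimal sketch (the component name is illustrative, not part of the package):

```csharp
using UnityEngine;

// Hypothetical helper: forces the attached camera's background color
// to have an alpha of 0, mirroring the inspector step above.
[RequireComponent(typeof(Camera))]
public class TransparentCameraBackground : MonoBehaviour
{
    void Awake()
    {
        var cam = GetComponent<Camera>();
        var color = cam.backgroundColor;
        color.a = 0f; // alpha must be 0 for the streamed background to show
        cam.backgroundColor = color;
    }
}
```

Attach this to the AR Camera under the AR Session Origin so the alpha is set before the first frame is rendered.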
AR Foundation enables users to access the device camera through the AR Camera Manager. For usage of the camera within AR Foundation, see the Unity Documentation. For additional camera usage through Stream, see Camera Stream.
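As a minimal sketch of the standard AR Foundation pattern (the component name is illustrative), camera frames can be observed by subscribing to the AR Camera Manager's `frameReceived` event:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example component: listens for camera frames
// delivered through the AR Camera Manager.
public class CameraFrameListener : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // args.textures holds the camera image textures for this frame.
        Debug.Log($"Received camera frame with {args.textures.Count} texture(s)");
    }
}
```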
The package provides the meshing subsystem to receive and render the real-world mesh. This can be used directly through the subsystem, through AR Foundation's AR Mesh Manager, or through MRTK's Spatial Awareness.
Functionality | HoloLens 2 | iOS | Quest 2 / Pro / 3 | Magic Leap 2 |
---|---|---|---|---|
World Meshes | ✓ | ✓ | | |
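When using AR Foundation's AR Mesh Manager, mesh updates can be observed through its `meshesChanged` event. A minimal sketch (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: logs real-world mesh updates reported by the
// AR Mesh Manager (which must be a child of the AR Session Origin).
public class MeshListener : MonoBehaviour
{
    [SerializeField] ARMeshManager meshManager;

    void OnEnable()  => meshManager.meshesChanged += OnMeshesChanged;
    void OnDisable() => meshManager.meshesChanged -= OnMeshesChanged;

    void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        // added/updated/removed are lists of MeshFilter components.
        Debug.Log($"Meshes added: {args.added.Count}, " +
                  $"updated: {args.updated.Count}, removed: {args.removed.Count}");
    }
}
```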
The package provides the plane tracking feature to detect and visualise planes within the environment. For instructions on how to use plane tracking, see Unity's Plane Detection manual. To use it as a part of MRTK's Scene Understanding feature, see Spatial Awareness.
The plane tracking feature supports the following functionality:
Functionality | HoloLens 2 | iOS | Quest 2 / Pro / 3 | Magic Leap 2 |
---|---|---|---|---|
Arbitrary Plane Detection | ✓ | | | |
Boundary Vertices | * | ✓ | | |
Classification | ✓ | ✓ | | |
Horizontal Plane Detection | ✓ | ✓ | | |
Vertical Plane Detection | ✓ | ✓ | | |
* The HoloLens 2 Client does not support boundary vertices; therefore, the boundary vertices returned correspond to the four vertices at the plane extents.
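Following the standard AR Foundation pattern, detected planes can be observed through the AR Plane Manager's `planesChanged` event. A minimal sketch (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: reacts to plane detection events
// raised by the ARPlaneManager on the AR Session Origin.
public class PlaneListener : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            // classification is available where the support table lists it.
            Debug.Log($"New plane {plane.trackableId}, classification: {plane.classification}");
    }
}
```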
Stream adds QR Code Tracking to AR Foundation's feature set. For more detailed information, see QR Code Support.
The package provides the raycast feature, which carries out hit tests against real-world objects, such as detected planes. For instructions on how to carry out raycasts, see Unity's Raycast manual.
The raycast feature supports the following functionality. Currently, it only supports raycasts against planes:
Functionality | HoloLens 2 | iOS | Quest 2 / Pro / 3 | Magic Leap 2 |
---|---|---|---|---|
Tracked Raycasts | | | | |
Viewpoint Raycasts | | | | |
World Based Raycasts | ✓ | | | |
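Using the standard AR Foundation API, a world-based raycast against detected planes can be sketched as follows (the component and method names are illustrative):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical example: raycasts from a screen position against detected
// planes via the ARRaycastManager on the AR Session Origin.
public class PlaneRaycaster : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    public bool TryGetPlaneHit(Vector2 screenPoint, out Pose pose)
    {
        pose = default;
        // Only plane hits are requested, matching the support table above.
        if (raycastManager.Raycast(screenPoint, s_Hits, TrackableType.Planes))
        {
            pose = s_Hits[0].pose; // hits are sorted by distance; take the closest
            return true;
        }
        return false;
    }
}
```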
The package provides the anchoring subsystem to add, track, and remove real-world anchors. For instructions on how to use them with AR Foundation, see Unity's Anchor Manager manual. Additionally, for more information about how to use the Anchor's extension methods, see Spatial Anchors.
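In AR Foundation 5, an anchor is typically created by adding an `ARAnchor` component to a GameObject at the desired pose; the scene's `ARAnchorManager` then tracks it. A minimal sketch (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: creates a real-world anchor at a given pose.
public class AnchorCreator : MonoBehaviour
{
    public ARAnchor CreateAnchor(Pose pose)
    {
        var go = new GameObject("Anchor");
        go.transform.SetPositionAndRotation(pose.position, pose.rotation);
        // Adding the ARAnchor component registers the anchor with the
        // ARAnchorManager in the scene; destroy the component to remove it.
        return go.AddComponent<ARAnchor>();
    }
}
```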
Touch data is passed to Unity via the new Input System. This data can be accessed through actions which are configured with Unity Input Actions. To find out more about the new Input system, see Unity's Input System manual.
Currently, Hololight Stream only supports single touch with the iOS client, which will always be considered the primary touch.
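The primary touch can be read through the Input System's EnhancedTouch API, as in this minimal sketch (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

// Hypothetical example: reads touches via the new Input System's
// EnhancedTouch API; only a single (primary) touch is delivered.
public class TouchReader : MonoBehaviour
{
    void OnEnable()  => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Touch began at {touch.screenPosition}");
        }
    }
}
```

Alternatively, touch can be bound through Unity Input Actions rather than polled, as described in Unity's Input System manual.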
The package provides the 2D image tracking feature to detect and track images in the environment from a supplied library. For instructions on how to use 2D image tracking, see Unity's Image Tracking manual.
⚠️ When running in editor play mode, the `Keep Texture at Runtime` checkbox must be checked for each image in the `Image Library`. If it is not, a warning will appear within Unity because the texture is not available.
The 2D image tracking feature supports the following functionality:
Functionality | HoloLens 2 | iOS | Quest 2 / Pro / 3 | Magic Leap 2 |
---|---|---|---|---|
Image Validation | | | | |
Moving Images | ✓ | | | |
Mutable Library | | | | |
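Tracked images can be observed through the AR Tracked Image Manager's `trackedImagesChanged` event, per the standard AR Foundation pattern. A minimal sketch (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: reacts to 2D image tracking events
// raised by the ARTrackedImageManager.
public class ImageListener : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager;

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
            Debug.Log($"Detected image '{image.referenceImage.name}'");
        foreach (var image in args.updated)
            Debug.Log($"Image '{image.referenceImage.name}' pose updated");
    }
}
```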
Unity provides an example project to demonstrate the functionality of AR Foundation, which can also be used with Hololight Stream. The package can be found here and provides a number of sample scenes for each feature. To use, load the project into Unity and follow the steps in Getting Started.
- Some AR Foundation features are currently not supported; therefore, not every sample scene will work correctly. If using the samples, stick to the scenes that demonstrate the functionality listed above.
- Remember to set the AR Camera background to have an alpha value of `0`. If this is not done, the background image will always be black.
- These samples use the legacy input system for touch, which is not currently supported. Any scene that uses touch may need to be updated to use the new Input System.
The list below contains specific issues that may occur when running Hololight Stream. For all other troubleshooting issues, see Troubleshooting.
- If the background of the camera is black, make sure the `AR Session Origin -> AR Camera` background's alpha channel has been set to zero, as instructed in Scene Configuration.
- If touch input is not triggering events, ensure that the Unity game window has focus (is actively selected). If it does not, events will not be triggered because the game is considered not to be in focus.