How to Run 'Azure-Kinect Examples for Unity'


Despite its name, ‘Azure-Kinect Examples for Unity’ can work with several depth sensors – Azure Kinect, RealSense and Kinect-v2. The installation steps depend on which sensor you have at your disposal.


1. (Azure Kinect) Download and install the Azure-Kinect Sensor SDK, as described in the ‘Azure-Kinect SDKs’ section below.

2. (Azure Kinect) Download and install the Azure-Kinect Body Tracking SDK, as described in the ‘Azure-Kinect SDKs’ section below.

3. (Kinect-v2) Download and install the Kinect SDK 2.0, as described in the ‘Kinect-v2 SDK’ section below.

4. (RealSense) Download and install the RealSense SDK 2.0, as described in the ‘RealSense SDK’ section below.

5. Import this package into a new Unity project.

6. Open ‘File / Build Settings’ and switch to the ‘PC, Mac & Linux Standalone’ platform, with ‘Windows’ as the target platform.

7. Make sure that Direct3D11 is the first option in the ‘Auto Graphics API for Windows’ list, found under ‘Player Settings / Other Settings / Rendering’.

8. Open and run a demo scene of your choice from a subfolder of the ‘AzureKinectExamples/KinectDemos’ folder. Short descriptions of the available demo scenes will be published soon.
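The SDK installations from steps 1–4 can be sanity-checked before opening Unity. Below is a minimal Python sketch (not part of the package) that looks for each SDK’s native runtime DLL in the directories on your PATH. The DLL names used here (k4a.dll, k4abt.dll, Kinect20.dll, realsense2.dll) are the ones these SDKs normally ship; your versions or install locations may differ, so treat any ‘NOT FOUND’ result as a hint to check the SDK’s install directory, not as proof the SDK is missing.

```python
import os

# Assumed runtime DLL names for each supported sensor SDK;
# adjust if your SDK version ships differently named binaries.
SDK_DLLS = {
    "Azure-Kinect Sensor SDK": "k4a.dll",
    "Azure-Kinect Body Tracking SDK": "k4abt.dll",
    "Kinect SDK 2.0": "Kinect20.dll",
    "RealSense SDK 2.0": "realsense2.dll",
}

def find_sdk_dlls(extra_dirs=None):
    """Return {sdk_name: dll_path or None}, scanning extra_dirs plus PATH."""
    search_dirs = list(extra_dirs or [])
    search_dirs += os.environ.get("PATH", "").split(os.pathsep)
    found = {}
    for sdk, dll in SDK_DLLS.items():
        hit = None
        for d in search_dirs:
            candidate = os.path.join(d, dll)
            if os.path.isfile(candidate):
                hit = candidate
                break
        found[sdk] = hit
    return found

if __name__ == "__main__":
    for sdk, path in find_sdk_dlls().items():
        print(f"{sdk}: {path or 'NOT FOUND'}")
```

You only need the SDK that matches your sensor, so a single found DLL is enough for the corresponding demo scenes to work.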