The Kinect for Windows Software Development Kit (SDK) 2.0.1410.19000 represented a significant milestone in the evolution of natural user interface (NUI) development on the Windows platform. This release introduced robust support for the Kinect for Windows v2 sensor, opening up new possibilities for developers building immersive, interactive applications. The SDK provided a comprehensive toolkit for harnessing the advanced capabilities of the Kinect v2 sensor and integrating them into developers' projects.
This article delves into the key features and functionalities offered by the Kinect for Windows Software Development Kit (SDK) 2.0.1410.19000, providing a detailed overview of its components and how they empowered developers to create innovative applications across various domains.
Comprehensive Feature Set
The Kinect for Windows Software Development Kit (SDK) 2.0.1410.19000 encompassed a wide range of features, designed to cater to diverse development needs:
Windows Store Support
One of the most significant advancements in this release was the ability to develop and publish Kinect-enabled applications targeting the Windows Store, allowing developers to reach a wider user base through the official Microsoft app marketplace. Almost all Kinect SDK and sensor functionality was available within this API surface, with the notable exception of speech recognition. Further details on developing Windows Store applications using Kinect can be found at: https://go.microsoft.com/fwlink/?LinkId=517592.
Unity Support
For the first time, the Kinect API set became accessible within the Unity Pro game engine through a dedicated Unity package. This integration gave developers a powerful platform for creating interactive games and experiences with the Kinect sensor. The Unity plugins exposed APIs for core Kinect functionality, Visual Gesture Builder, and face tracking, allowing these features to be integrated directly into Unity applications. These plugins could be downloaded from: https://go.microsoft.com/fwlink/?LinkID=513177.
.NET APIs
The Managed API set within the SDK provided a familiar development environment for developers working in C# and other .NET languages. This option offered a fast and efficient development workflow, leveraging existing investments and expertise in the .NET Framework, and provided access to the full range of Kinect SDK and sensor functionality.
Native APIs
For applications demanding maximum performance and control, the SDK offered Native APIs written in C++. These APIs allowed developers to leverage the full speed and power of the hardware. While structurally similar to the Managed API set, the Native APIs represented a significant departure from the v1.x native APIs, offering improved ease of use and access to the complete Kinect SDK and sensor functionality.
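As an illustration of the native programming model, the following is a minimal sketch that opens the default sensor and polls a single body frame, using the interfaces declared in Kinect.h (applications link against Kinect20.lib). It is a simplified example rather than production code; a real application would typically subscribe to frame-arrived events instead of polling.

```cpp
// Minimal native sketch: open the default Kinect v2 sensor and poll one body frame.
// Assumes the Kinect for Windows SDK 2.0 headers (Kinect.h) and Kinect20.lib.
#include <windows.h>
#include <Kinect.h>
#include <cstdio>

// Small helper to release COM-style interfaces safely.
template <typename T> void SafeRelease(T*& p) { if (p) { p->Release(); p = nullptr; } }

int main()
{
    IKinectSensor* sensor = nullptr;
    if (FAILED(GetDefaultKinectSensor(&sensor)) || FAILED(sensor->Open()))
        return -1;

    IBodyFrameSource* source = nullptr;
    IBodyFrameReader* reader = nullptr;
    sensor->get_BodyFrameSource(&source);
    source->OpenReader(&reader);

    // Poll until a frame is available (a real app would use a frame-arrived event).
    IBodyFrame* frame = nullptr;
    while (FAILED(reader->AcquireLatestFrame(&frame)))
        Sleep(10);

    IBody* bodies[BODY_COUNT] = { nullptr };
    if (SUCCEEDED(frame->GetAndRefreshBodyData(BODY_COUNT, bodies)))
    {
        for (IBody* body : bodies)
        {
            BOOLEAN tracked = FALSE;
            if (body && SUCCEEDED(body->get_IsTracked(&tracked)) && tracked)
            {
                Joint joints[JointType_Count];
                body->GetJoints(JointType_Count, joints);
                const CameraSpacePoint& head = joints[JointType_Head].Position;
                printf("Head at (%.2f, %.2f, %.2f) m\n", head.X, head.Y, head.Z);
            }
        }
        for (IBody*& body : bodies) SafeRelease(body);
    }

    SafeRelease(frame);
    SafeRelease(reader);
    SafeRelease(source);
    sensor->Close();
    SafeRelease(sensor);
    return 0;
}
```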
Advanced Audio Processing
The Kinect sensor and SDK combined a microphone array with advanced signal-processing algorithms to create a virtual, software-based microphone with strong directionality and noise cancellation. The system could determine the direction of sound sources (the audio beam) and provide high-quality audio input for speech recognition and other audio applications.
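To give a flavour of how this surfaces in code, the sketch below queries the current audio beam direction through the native IAudioSource, IAudioBeamList, and IAudioBeam interfaces from Kinect.h. Treat it as a simplified sketch rather than the canonical pattern: the SDK's AudioBasics sample instead reads beam angles from audio beam frames while audio is actively streaming, which is more robust.

```cpp
// Minimal native sketch: query the current audio beam direction.
// A complete capture loop (as in the SDK's AudioBasics sample) would read beam
// angles from audio beam frames while audio is streaming.
#include <windows.h>
#include <Kinect.h>
#include <cstdio>

int main()
{
    IKinectSensor* sensor = nullptr;
    if (FAILED(GetDefaultKinectSensor(&sensor)) || FAILED(sensor->Open()))
        return -1;

    IAudioSource* audioSource = nullptr;
    IAudioBeamList* beamList = nullptr;
    IAudioBeam* beam = nullptr;

    if (SUCCEEDED(sensor->get_AudioSource(&audioSource)) &&
        SUCCEEDED(audioSource->get_AudioBeams(&beamList)) &&
        SUCCEEDED(beamList->OpenAudioBeam(0, &beam)))
    {
        FLOAT angle = 0.0f, confidence = 0.0f;   // beam angle is reported in radians
        beam->get_BeamAngle(&angle);
        beam->get_BeamAngleConfidence(&confidence);
        printf("Beam angle: %.1f degrees (confidence %.2f)\n",
               angle * 180.0f / 3.14159265f, confidence);
        beam->Release();
    }

    if (beamList) beamList->Release();
    if (audioSource) audioSource->Release();
    sensor->Close();
    sensor->Release();
    return 0;
}
```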
Enhanced Face APIs
The Face APIs were significantly expanded in this version of the SDK, providing a wide range of functionality for creating rich and engaging facial experiences. Developers could detect faces within the sensor’s field of view, align them to five unique facial identifiers, and track their orientation in real time. The HD Face technology, in particular, offered 94 unique "shape units" per face, enabling the creation of highly detailed and expressive facial meshes. These meshes could be tracked in real time, capturing subtle facial movements and expressions with remarkable accuracy.
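The sketch below shows how the native face-tracking pipeline is typically wired together: a body stream supplies the tracking ID of a person, and a face frame source configured with the desired FaceFrameFeatures returns a bounding box and orientation for that person. It follows the pattern of the SDK's FaceBasics sample but is a simplified sketch rather than a drop-in implementation; real applications also link against Kinect20.Face.lib and copy the NuiDatabase folder alongside the executable.

```cpp
// Simplified native sketch of the Face API wiring: feed a tracked body's ID to a
// face frame source, then read back the face bounding box and rotation.
#include <windows.h>
#include <Kinect.h>
#include <Kinect.Face.h>
#include <cstdio>

int main()
{
    IKinectSensor* sensor = nullptr;
    if (FAILED(GetDefaultKinectSensor(&sensor)) || FAILED(sensor->Open()))
        return -1;

    // Body stream: the face tracker follows a body's tracking ID.
    IBodyFrameSource* bodySource = nullptr;
    IBodyFrameReader* bodyReader = nullptr;
    sensor->get_BodyFrameSource(&bodySource);
    bodySource->OpenReader(&bodyReader);

    // Face stream: request a bounding box and rotation in color space.
    const DWORD features = FaceFrameFeatures_BoundingBoxInColorSpace |
                           FaceFrameFeatures_RotationOrientation;
    IFaceFrameSource* faceSource = nullptr;
    IFaceFrameReader* faceReader = nullptr;
    CreateFaceFrameSource(sensor, 0, features, &faceSource);
    faceSource->OpenReader(&faceReader);

    for (int i = 0; i < 300; ++i)   // run for ~10 seconds
    {
        // Give the face source the tracking ID of the first tracked body.
        IBodyFrame* bodyFrame = nullptr;
        if (SUCCEEDED(bodyReader->AcquireLatestFrame(&bodyFrame)))
        {
            IBody* bodies[BODY_COUNT] = { nullptr };
            bodyFrame->GetAndRefreshBodyData(BODY_COUNT, bodies);
            for (IBody* body : bodies)
            {
                BOOLEAN tracked = FALSE;
                if (body && SUCCEEDED(body->get_IsTracked(&tracked)) && tracked)
                {
                    UINT64 trackingId = 0;
                    body->get_TrackingId(&trackingId);
                    faceSource->put_TrackingId(trackingId);
                    break;
                }
            }
            for (IBody* body : bodies) { if (body) body->Release(); }
            bodyFrame->Release();
        }

        // Read the latest face result, if any.
        IFaceFrame* faceFrame = nullptr;
        if (SUCCEEDED(faceReader->AcquireLatestFrame(&faceFrame)))
        {
            IFaceFrameResult* result = nullptr;
            if (SUCCEEDED(faceFrame->get_FaceFrameResult(&result)) && result)
            {
                RectI box = {};
                Vector4 rotation = {};
                result->get_FaceBoundingBoxInColorSpace(&box);
                result->get_FaceRotationQuaternion(&rotation);
                printf("Face box (%d,%d)-(%d,%d), rotation w=%.2f\n",
                       box.Left, box.Top, box.Right, box.Bottom, rotation.w);
                result->Release();
            }
            faceFrame->Release();
        }
        Sleep(33);
    }

    faceReader->Release();  faceSource->Release();
    bodyReader->Release();  bodySource->Release();
    sensor->Close();        sensor->Release();
    return 0;
}
```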
Hand Pointer Gesture Support
The SDK offered improved support for controlling applications through hand pointer gestures. Sample code in ControlsBasics-XAML, ControlsBasics-WPF, and ControlsBasics-DX demonstrated how to integrate hand pointer gesture support into applications. This functionality was an evolution of the KinectRegion/KinectUserViewer support first introduced in Kinect for Windows v1.7. KinectRegion and KinectUserViewer were available for XAML and WPF applications, while DirectX support was built on top of a lower-level Toolkit Input component.
Kinect Fusion
This release enabled developers to create and deploy Kinect Fusion applications. Kinect Fusion allowed for the real-time reconstruction of 3D models from depth data captured by the Kinect sensor. The v2 implementation offered higher resolution, improved camera tracking, and enhanced performance compared to the 1.x releases.
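To illustrate the underlying idea rather than the SDK surface itself, the following is a simplified, self-contained sketch of truncated signed distance function (TSDF) integration, the volumetric technique at the heart of Kinect Fusion. It deliberately does not use the SDK's Kinect Fusion interfaces, and the identity camera pose, volume placement, and camera intrinsics are placeholder assumptions for illustration only.

```cpp
// Conceptual sketch of volumetric fusion: each depth frame is integrated into a
// voxel grid holding a truncated signed distance function (TSDF).
#include <vector>
#include <cmath>
#include <cstdint>

struct Voxel { float tsdf = 1.0f; float weight = 0.0f; };

struct Volume {
    int dim;            // voxels per axis
    float voxelSize;    // metres per voxel
    std::vector<Voxel> voxels;
    Volume(int d, float s) : dim(d), voxelSize(s), voxels(size_t(d) * d * d) {}
    Voxel& at(int x, int y, int z) { return voxels[(size_t(z) * dim + y) * dim + x]; }
};

// Integrate one depth frame (millimetres) into the TSDF volume.
// fx, fy, cx, cy are pinhole intrinsics of the depth camera; the camera is
// assumed to sit at the origin looking down +Z (identity pose) for brevity.
void integrate(Volume& vol, const uint16_t* depthMm, int width, int height,
               float fx, float fy, float cx, float cy, float truncation)
{
    for (int z = 0; z < vol.dim; ++z)
        for (int y = 0; y < vol.dim; ++y)
            for (int x = 0; x < vol.dim; ++x) {
                // Voxel centre in camera space; the volume starts 0.5 m from the sensor.
                float px = (x - vol.dim / 2) * vol.voxelSize;
                float py = (y - vol.dim / 2) * vol.voxelSize;
                float pz = z * vol.voxelSize + 0.5f;

                // Project the voxel into the depth image.
                int u = static_cast<int>(fx * px / pz + cx);
                int v = static_cast<int>(fy * py / pz + cy);
                if (u < 0 || u >= width || v < 0 || v >= height) continue;

                float measured = depthMm[v * width + u] * 0.001f;  // to metres
                if (measured <= 0.0f) continue;                    // no depth reading

                // Signed distance along the ray, truncated to [-1, 1].
                float sdf = measured - pz;
                if (sdf < -truncation) continue;                   // far behind the surface
                float tsdf = std::fmin(1.0f, sdf / truncation);

                // Weighted running average of observations for this voxel.
                Voxel& vox = vol.at(x, y, z);
                vox.tsdf = (vox.tsdf * vox.weight + tsdf) / (vox.weight + 1.0f);
                vox.weight = std::fmin(vox.weight + 1.0f, 64.0f);
            }
}
```

In the real pipeline, the camera pose is re-estimated for every incoming frame by aligning it against the current reconstruction before integration, which is where the improved camera tracking of the v2 implementation comes into play.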
Kinect Studio
Kinect Studio underwent a major rewrite to accommodate the new sensor and provide users with greater customization and control. The redesigned user interface offered flexibility in workspace layout and view customization. Users could compare two 2D or 3D views side-by-side or create custom layouts tailored to their specific needs. The separation of monitoring, recording, and playback streams exposed additional functionality, such as file- and stream-level metadata. The timeline featured in- and out-points for precise playback control, pause-points for suspending playback at specific times, and markers for attaching metadata to various points in time. The tool also included playback looping and additional 2D/3D visualization settings.
Visual Gesture Builder (Preview)
Visual Gesture Builder (VGB) was introduced as a tool for building gesture detectors from body-frame data using machine learning. Developers could tag multiple body-data clips with metadata about a specific gesture; a machine-learning trainer then used those tags to extract a gesture definition, which the gesture detection runtime could load to detect one or more gestures within an application. VGB offered a path to rapid prototyping of gesture-based interactions, and a companion tool, vgbview, let developers benchmark their gesture definitions without writing any code.
Installation
The original article included installation instructions which are omitted here for brevity.
Driver Download
The official driver associated with the Kinect for Windows Software Development Kit (SDK) 2.0.1410.19000 was distributed through Microsoft’s official channels; the original download link may no longer be active.
Alternative download link:
https://www.filehorse.com/download-kinect-for-windows-sdk/download/
Conclusion
The Kinect for Windows Software Development Kit (SDK) 2.0.1410.19000 marked a significant advancement in the development of Kinect-based applications. With its comprehensive feature set, support for various development platforms, and enhanced performance, it empowered developers to create immersive and interactive experiences across a wide range of applications. While the technology has evolved since this release, this SDK remains a notable chapter in the history of natural user interfaces.