
Build your own holographic studio with RoomAlive Toolkit


With the current wave of new Augmented Reality devices like the Microsoft HoloLens and Meta Glasses, the demand for holographic content is rising. A holographic scene can be captured with 3D depth cameras like the Microsoft Kinect or the Intel RealSense, and by combining multiple 3D cameras a subject can be captured from different sides. One challenge in achieving this is calibrating the multiple 3D cameras so that their recorded geometry can be aligned. In this article I will show you how I used the RoomAlive Toolkit to calibrate two Kinects; the resulting calibration is then used for rendering a live 3D scene. The source code for this application is available on GitHub.

Calibration with RoomAlive Toolkit

Calibrating multiple Kinects typically requires a manual process of recording a collection of calibration points. With the RoomAlive Toolkit it is possible to automatically calibrate multiple projectors and Kinects. Since projectors are used to connect the Kinects during calibration, at least one projector is required, and each Kinect has to be able to see a fair amount of the projected calibration images. Once the calibration is done, the projector is no longer needed.
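Conceptually, the calibration gives you a pose (rotation and translation) for each Kinect relative to one shared coordinate system, so the individual depth clouds can simply be transformed into that common space and merged. The sketch below illustrates that idea; the CameraPose type and its members are my own illustrative names, not the toolkit's actual API.

```csharp
using System.Collections.Generic;
using System.Numerics;

// Illustrative only: the RoomAlive calibration yields a pose per camera;
// the exact types and names in the toolkit differ.
public sealed class CameraPose
{
    public Matrix4x4 Rotation;   // rotation part, stored in a 4x4 matrix
    public Vector3 Translation;  // camera position in the shared world frame

    // Map a point from this camera's local space into the shared world space.
    public Vector3 ToWorld(Vector3 cameraPoint) =>
        Vector3.Transform(cameraPoint, Rotation) + Translation;
}

public static class PointCloudMerge
{
    // Merge the depth points of several calibrated cameras into one cloud.
    public static List<Vector3> Merge(
        IEnumerable<(CameraPose pose, IEnumerable<Vector3> points)> cameras)
    {
        var world = new List<Vector3>();
        foreach (var (pose, points) in cameras)
            foreach (var p in points)
                world.Add(pose.ToWorld(p));
        return world;
    }
}
```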

Holographic studio setup

In the picture above you see the geometry and color image recorded by two Kinect V2s. There is an overlap in the middle. The two magenta frustums indicate where the Kinects were positioned, and the purple frustum shows the position of the projector. Note that both Kinects were placed in portrait mode for better coverage of the room. I also placed a few extra items in the scene to help the solver algorithm in the RoomAlive Toolkit find a calibration.

Collecting data from multiple Kinects

Since each Kinect V2 needs to be attached to a separate PC, we need a way to collect the depth and color data on the main system. The RoomAlive Toolkit also contains a networking solution for doing just that. Each system runs a KinectServer WCF service that provides the raw Kinect color and depth data. All data is updated as it arrives, without any further attempt at synchronization, which results in some artifacts at the seams of the captured scenes.
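In other words, the main system just polls every remote Kinect and renders whatever frames arrived last. A minimal sketch of that pull loop is shown below; IKinectFrameSource and its methods are stand-ins I made up to illustrate the idea, not the actual WCF contract exposed by the toolkit's KinectServer.

```csharp
// Hypothetical stand-in for the KinectServer WCF contract; the real
// service exposes different operation names and data formats.
public interface IKinectFrameSource
{
    byte[] LatestColorFrame();   // e.g. 1920x1080 BGRA pixels
    ushort[] LatestDepthFrame(); // e.g. 512x424 depth values in millimeters
}

public sealed class KinectClient
{
    private readonly IKinectFrameSource source;

    public byte[] Color { get; private set; }
    public ushort[] Depth { get; private set; }

    public KinectClient(IKinectFrameSource source) => this.source = source;

    // Called once per render frame: take whatever data has arrived,
    // without trying to synchronize the cameras with each other.
    public void Poll()
    {
        Color = source.LatestColorFrame();
        Depth = source.LatestDepthFrame();
    }
}
```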

Rendering live Kinect data

The simplest form of rendering is drawing the raw geometry captured by each Kinect. The RoomAlive Toolkit contains a SharpDX-based example that shows how to do this, including how to filter the depth data and how to apply lens distortion correction. Using this information I built a new application based on the SharpDX Toolkit (an XNA-like layer) to do the rendering. The application contains a few extras so you can tweak some of the rendering parameters at runtime. I added a clipping cylinder so you can easily reduce the rendered geometry to a single person or object, as seen in the video. And just for fun, I added a shader that gives the live 3D scene a more cliché holographic look. More information about the parameters and controls can be found on the Holographic Studio GitHub page.
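The clipping cylinder itself is conceptually simple: a reconstructed world-space point is kept only if its horizontal distance from a chosen vertical axis stays within a radius and its height stays within a range. In the application this test runs on the GPU, but the logic looks roughly like the sketch below (the type and parameter names are mine, chosen for illustration).

```csharp
using System.Numerics;

// Illustrative clipping volume: a vertical cylinder in world space.
public readonly struct ClipCylinder
{
    public readonly Vector3 Center;  // point on the (vertical) cylinder axis
    public readonly float Radius;    // horizontal clip radius in meters
    public readonly float Height;    // vertical clip extent in meters

    public ClipCylinder(Vector3 center, float radius, float height)
    {
        Center = center;
        Radius = radius;
        Height = height;
    }

    // True when a world-space point should be kept (rendered).
    public bool Contains(Vector3 p)
    {
        float dx = p.X - Center.X;
        float dz = p.Z - Center.Z;
        bool insideRadius = dx * dx + dz * dz <= Radius * Radius;
        bool insideHeight = p.Y >= Center.Y && p.Y <= Center.Y + Height;
        return insideRadius && insideHeight;
    }
}
```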

Other demos of real-time video capture

Holoportation by Microsoft Research
3D Video Capture with Three Kinects by Oliver Kreylos


Rebuilding the HoloLens scanning effect with RoomAlive Toolkit

The initial video that introduced the HoloLens to the world contains a small clip visualizing how the device sees its environment: a pattern of large and smaller triangles gradually overlays the real-world objects seen in the video. I decided to recreate this effect in real life with a projection mapping setup consisting of a projector and a Kinect V2 sensor.

HoloLens room scan

Prototyping in Shadertoy

First I experimented with the idea by prototyping a pixel shader in Shadertoy, an online tool that allows developers to prototype, experiment with, test, and share pixel shaders using WebGL. I started with a raymarching example by Iñigo Quilez and set up a small scene with a floor, a wall, and a bench. The calculated 3D world coordinates could then be used to overlay the triangle effect. The raymarched geometry would later be replaced by geometry scanned with the Kinect V2. The screenshot below shows what the effect looks like; the source code of this shader can be found on the Shadertoy website.

Shadertoy Room Scanning Shader
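At its core the scan effect is a ring that expands outward from an origin over time: a point is highlighted when its world-space distance to the origin is close to the current scan radius, and the triangle pattern fades in behind that leading edge. The Shadertoy prototype does this per pixel in GLSL; the sketch below restates the same idea in C# with made-up parameter names, just to show the math.

```csharp
using System;
using System.Numerics;

public static class ScanEffect
{
    // Returns the effect intensity (0..1) for a world-space position,
    // given the scan origin, the elapsed time and the expansion speed.
    // Parameter names and default values are illustrative.
    public static float Intensity(
        Vector3 worldPos, Vector3 scanOrigin,
        float elapsedSeconds, float speed = 1.0f, float ringWidth = 0.15f)
    {
        float scanRadius = elapsedSeconds * speed;
        float distance = Vector3.Distance(worldPos, scanOrigin);

        if (distance > scanRadius)
            return 0f;                       // the scan has not reached this point yet

        // Bright leading edge that falls off behind the expanding ring.
        float behind = scanRadius - distance;
        float edge = MathF.Max(0f, 1f - behind / ringWidth);

        // Faint persistent fill where the triangle pattern remains visible.
        const float fill = 0.2f;
        return MathF.Max(edge, fill);
    }
}
```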

Projection mapping with RoomAlive Toolkit

During Build 2015 Microsoft open sourced a library called the RoomAlive Toolkit that contains the mathematical building blocks for creating RoomAlive-like experiences. The library contains tools to automatically calibrate multiple Kinects and projectors so that they all share the same coordinate system. This means each projector can project onto the correct location in a room, even on dynamic geometry. The toolkit also includes an example that reprojects the recorded image with a pixel shader effect. I used this example to apply the scan-effect pixel shader I prototyped earlier onto live scanned 3D geometry.
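Reprojection works much like rendering the scene from the projector's point of view: a world-space point captured by the Kinect is transformed by the projector's calibrated view and projection matrices, and the resulting clip-space position tells you which projector pixel will land on that point in the room. The sketch below shows that mapping; the matrix names are placeholders for the calibration data the toolkit produces, not its actual API.

```csharp
using System.Numerics;

public static class ProjectorMapping
{
    // Map a world-space point (from the Kinect) to normalized projector
    // image coordinates (0..1), using the projector's calibrated pose and
    // intrinsics expressed here as view/projection matrices (placeholder names).
    public static Vector2? ToProjectorUv(
        Vector3 worldPoint, Matrix4x4 projectorView, Matrix4x4 projectorProjection)
    {
        Vector4 clip = Vector4.Transform(
            Vector4.Transform(new Vector4(worldPoint, 1f), projectorView),
            projectorProjection);

        if (clip.W <= 0f)
            return null;                     // point is behind the projector

        // Perspective divide, then map from [-1,1] NDC to [0,1] texture coordinates.
        float x = clip.X / clip.W * 0.5f + 0.5f;
        float y = 1f - (clip.Y / clip.W * 0.5f + 0.5f);
        return new Vector2(x, y);
    }
}
```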

Source code on GitHub

Bring Your Own Beamer

The installation was shown at the Bring Your Own Beamer event held on September 25th, 2015 in Utrecht, The Netherlands. For this event I made some small artistic adjustments. In the original video the scanning of the world seems to start from the location of the person wearing the HoloLens; in the installation shown at the event, people were able to trigger the scanning effect with their feet. The effect starts at the triggered location and expands across the floor, up their legs, and over any other geometry in the room.

 

The distance from the camera determines the base color used for a particular scan, and multiple scans interfere with each other, generating a colorful experience. The video shows how part of the floor and part of the wall are mapped with a single vertically mounted projector. People particularly seemed to enjoy playing with the latency of the projection onto their body by moving quickly.
