
Softening the HoloLens FOV border

To hide the limited field of view of a Mixed Reality headset like the HoloLens or the Magic Leap, you can fade out the holograms at the border of the view. I will discuss three possible techniques, each with its own advantages and disadvantages.

Note that all three techniques produce the same visual result in the HoloLens. However, when recorded with Mixed Reality Capture, the border effect seems to fall largely outside the field of view of the MRC camera.

Post-processing effect

The most modern solution is applying a post-processing effect. Post-processing effects in Unity can be a heavy hit on fillrate, which is why Microsoft advises against using them on HoloLens. The Magic Leap has a bit more graphics power to spend, so it may be a viable solution on that device. Typically a post-processing effect works by rendering the scene to a texture and then re-rendering that texture on a screen-aligned quad with a filter effect (e.g. grayscale, bloom, vignette) applied during that re-render. This can be done multiple times in succession, but since each re-render touches every screen pixel, it will cost you fillrate.

Post-processing a scene in Unity needs two assets working together:

  • a script attached to the camera that implements OnRenderImage
  • a shader that is applied when the scene image is re-rendered in OnRenderImage

The biggest advantage of this technique is that it doesn’t require any changes to the content of your scene, but it may be a bit of overkill just to add a fading border.

Postprocessing shader with red/green output

Postprocessing final result

BorderFadePostProcess.cs

using UnityEngine;
public class BorderFadePostProcess : MonoBehaviour
{
    [Range(0, 500)]
    public float borderWidth = 100;
    private Material material;
    void Awake()
    {
        // Create a Material using the FadeBorder shader
        material = new Material(Shader.Find("Hidden/FadeBorder"));
    }
    // Postprocess the image
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        material.SetFloat("_borderWidth", borderWidth);
        
        // Rerender the scene on a screen aligned quad with the given material/shader
        Graphics.Blit(source, destination, material);
    }
}

FadeBorder.shader

Shader "Hidden/FadeBorder"
{
 Properties
 {
  _MainTex ("Texture", 2D) = "white" {}
 }
 SubShader
 {
  // No culling or depth
  Cull Off ZWrite Off ZTest Always
  Pass
  {
   CGPROGRAM
   #pragma vertex vert
   #pragma fragment frag
   
   #include "UnityCG.cginc"
   struct appdata
   {
    float4 vertex : POSITION;
    float2 uv : TEXCOORD0;
   };
   struct v2f
   {
    float2 uv : TEXCOORD0;
    float4 vertex : SV_POSITION;
   };
   v2f vert (appdata v)
   {
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.uv = v.uv;
    return o;
   }
   
   sampler2D _MainTex;
   
   // Width of the border in pixels
   fixed _borderWidth;
   float linearStep(float a, float b, float t)
   {
    return saturate((t - a) / (b - a));
   }
   fixed4 frag (v2f i) : SV_Target
   {
    fixed4 col = tex2D(_MainTex, i.uv);
    // Distance to border along x and y
    float2 distanceInPixels = (0.5 - abs(i.uv.xy - 0.5)) * _ScreenParams.xy;
    // Linear border fade
    float mask = linearStep(0, _borderWidth, min(distanceInPixels.x, distanceInPixels.y));
    // Debug output: visualize the mask as the red/green overlay shown above
    //return col + lerp(fixed4(1, 0, 0, 1), fixed4(0, 1, 0, 1), mask);
    // Return masked color
    return col * mask;
   }
   ENDCG
  }
 }
}

Fading materials

Another solution is to let the material/shader do the fading. The shader calculates the screen position of each fragment and fades out the material near the edge of the screen. Every hologram in the scene needs a material with this edge fading in its shader.
Modifying the shaders you already use may improve performance, but this highly depends on what is visible in your scene. No separate post-processing pass is needed, but each shader uses a few more instructions and all shaders in your scene have to be modified.

Material fade with red/green output

Material fade final result

See the GitHub repository linked below for the source code of the Standard_BorderFade shader. It’s a modification of the Standard shader from the HoloToolkit.
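To give an idea of the approach, here is a minimal sketch of an unlit shader that fades at the screen border. It is not the actual Standard_BorderFade source: the shader name and structure are illustrative, but the screen position calculation and the border mask mirror the post-processing shader above. Fading to black works because the HoloLens display is additive, so black pixels are effectively invisible.

Shader "Unlit/BorderFadeSketch"
{
 Properties
 {
  _MainTex ("Texture", 2D) = "white" {}
  _borderWidth ("Border width in pixels", Range(0, 500)) = 100
 }
 SubShader
 {
  Pass
  {
   CGPROGRAM
   #pragma vertex vert
   #pragma fragment frag

   #include "UnityCG.cginc"
   struct appdata
   {
    float4 vertex : POSITION;
    float2 uv : TEXCOORD0;
   };
   struct v2f
   {
    float2 uv : TEXCOORD0;
    float4 screenPos : TEXCOORD1;
    float4 vertex : SV_POSITION;
   };

   sampler2D _MainTex;
   float4 _MainTex_ST;
   float _borderWidth;

   v2f vert (appdata v)
   {
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    // Pass the screen position of this vertex to the fragment shader
    o.screenPos = ComputeScreenPos(o.vertex);
    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
    return o;
   }

   float linearStep(float a, float b, float t)
   {
    return saturate((t - a) / (b - a));
   }

   fixed4 frag (v2f i) : SV_Target
   {
    fixed4 col = tex2D(_MainTex, i.uv);
    // Perspective divide gives normalized screen coordinates in 0..1
    float2 screenUV = i.screenPos.xy / i.screenPos.w;
    // Distance to the nearest screen edge in pixels
    float2 distanceInPixels = (0.5 - abs(screenUV - 0.5)) * _ScreenParams.xy;
    // Same linear border fade as the post-processing shader
    float mask = linearStep(0, _borderWidth, min(distanceInPixels.x, distanceInPixels.y));
    return col * mask;
   }
   ENDCG
  }
 }
}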

Screenspace border

The third solution is rendering a screen-aligned border that fades from transparent black on the inside to opaque black on the outside. It may sound a bit old school, but it does the job with minimal performance impact and no need to modify the content of your scene.


Screenspace border with red/green output

Screenspace border final result

BorderDrawer.cs

using UnityEngine;
public class BorderDrawer : MonoBehaviour
{
    [Range(0, 500)]
    public float borderWidth = 100;
    private Material material;
    private Color OutsideColor = new Color(0, 0, 0, 1);
    private Color InsideColor = new Color(0, 0, 0, 0);
    private void Start()
    {
        // Create a Material using an unlit vertex color shader
        material = new Material(Shader.Find("Unlit/VertexColor"));
    }

    void OnPostRender()
    {
        // HoloLens resolution 1280 x 720 
        // HoloLens screen/safe area 1268 x 720
        float left = (float)borderWidth / (float)Screen.width;
        float bottom = (float)borderWidth / (float)Screen.height;
        float right = 1 - left;
        float top = 1 - bottom;
        GL.PushMatrix();
        material.SetPass(0);
        GL.LoadOrtho();
        // Draw a triangle strip around the screen edge: the outer vertices
        // are opaque black, the inner vertices fully transparent
        GL.Begin(GL.TRIANGLE_STRIP);
        GL.Color(OutsideColor);
        GL.Vertex3(0.0F, 0.0F, 0);
        GL.Color(InsideColor);
        GL.Vertex3(left, bottom, 0);
        GL.Color(OutsideColor);
        GL.Vertex3(1F, 0F, 0);
        GL.Color(InsideColor);
        GL.Vertex3(right, bottom, 0);
        GL.Color(OutsideColor);
        GL.Vertex3(1F, 1F, 0);
        GL.Color(InsideColor);
        GL.Vertex3(right, top, 0);
        GL.Color(OutsideColor);
        GL.Vertex3(0F, 1F, 0);
        GL.Color(InsideColor);
        GL.Vertex3(left, top, 0);
        GL.Color(OutsideColor);
        GL.Vertex3(0F, 0F, 0);
        GL.Color(InsideColor);
        GL.Vertex3(left, bottom, 0);
        GL.End();
        GL.PopMatrix();
    }
}
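The BorderDrawer script relies on a shader called Unlit/VertexColor that simply outputs the interpolated vertex colors with alpha blending. The version in the GitHub project may differ, but a minimal sketch could look like this:

Shader "Unlit/VertexColor"
{
 SubShader
 {
  // Draw on top of everything and blend using the vertex alpha
  Tags { "Queue"="Overlay" }
  Blend SrcAlpha OneMinusSrcAlpha
  Cull Off ZWrite Off ZTest Always
  Pass
  {
   CGPROGRAM
   #pragma vertex vert
   #pragma fragment frag

   #include "UnityCG.cginc"
   struct appdata
   {
    float4 vertex : POSITION;
    fixed4 color : COLOR;
   };
   struct v2f
   {
    float4 vertex : SV_POSITION;
    fixed4 color : COLOR;
   };

   v2f vert (appdata v)
   {
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.color = v.color;
    return o;
   }

   fixed4 frag (v2f i) : SV_Target
   {
    // Interpolated vertex color: opaque black at the outer vertices,
    // fully transparent at the inner vertices of the border strip
    return i.color;
   }
   ENDCG
  }
 }
}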

GitHub

The source code for these three techniques is available in this GitHub repository.

References

Astronaut model on Remix3D

Performance recommendations for HoloLens apps


HoloLens scanning effect in Unity

In a previous blog post I talked about my attempt to rebuild the HoloLens scanning effect as shown in this video. After following the HoloLens Academy tutorials I decided to see how easily my existing shader could be integrated into Unity. It turned out that only a minimal amount of plumbing was needed.

HoloLens room scan

I took the project files from the HoloLens course on spatial mapping (Holograms 230). That course explains how to apply your own material and a custom shader to the mesh generated by the spatial mapper. For quick iterations you can even load a previously saved room mesh. I added a new unlit shader and a material that uses it. The Spatial Mapping Manager script applies this material to the mesh coming from the spatial mapper.

Unity shader variables

Most of the plumbing came down to defining variables and using them in the shader. The main animation is driven by the global time. Unity provides this as the built-in vector _Time, whose y-component contains the elapsed time in seconds. I added a few variables to control the look and behavior of the effect, such as Main Color, Speed, Triangles Scale and Range Scale.

The center of the effect is also a variable that can be configured. It could be updated by doing a raycast intersection as explained in the HoloLens Academy tutorials. Currently the effect keeps pulsating every 5 seconds. To trigger the effect only on an event, the global time could be replaced by a separately controlled progress variable, as sketched below.
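The sketch below gives a rough idea of that plumbing in the fragment shader. The uniform names and the band calculation are illustrative assumptions; the real shader is in the GitHub project linked below.

// Illustrative sketch of the shader plumbing, not the project's actual code.
// Uniforms exposed in the material inspector (names are assumptions):
fixed4 _Color;          // Main Color
float _Speed;           // Expansion speed of the scan
float _TrianglesScale;  // Scale of the triangle pattern (used where the pattern is generated)
float _RangeScale;      // Width of the visible band
float4 _Center;         // World space center of the effect, set from a script

// Visibility of the scan effect for a world space position
float scanBand(float3 worldPos)
{
    // _Time.y holds the elapsed time in seconds; restarting every 5 seconds
    // makes the effect pulse. Replacing the global time with a separately
    // controlled _Progress uniform would trigger the effect on an event.
    float progress = fmod(_Time.y, 5.0) * _Speed;

    // Distance of this position to the configured center point
    float dist = distance(worldPos, _Center.xyz);

    // Only show the effect in a band that expands outward from the center
    return saturate(1.0 - abs(dist - progress) / _RangeScale);
}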

Differences with original effect

To create the effect of triangles walking across the floor and up the walls, the shader needs to calculate uv coordinates based on a world location, preferably with as few seams as possible. Instead of using the direct distance to the configured center point, I used the horizontal distance to that point and added the vertical distance. This works reasonably well on connected surfaces, but note that it is not a real walk across the topology of the mesh.
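In the sketch above this comes down to replacing distance(worldPos, _Center.xyz) with something like the helper below (again illustrative, not the project's actual code):

// Horizontal distance in the XZ plane plus the vertical distance, instead of
// the straight-line distance to the center. The band then appears to walk
// across the floor first and up the walls afterwards.
float scanDistance(float3 worldPos, float3 center)
{
    float horizontal = length(worldPos.xz - center.xz);
    float vertical = abs(worldPos.y - center.y);
    return horizontal + vertical;
}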

The original video has a slightly different border effect, with more distortion and a different color. I experimented with mimicking that, but decided to leave it out. I used the effect in an interactive installation where I preferred a stronger border that looked like a wave expanding outwards.

The source code of the project is available on GitHub.

HoloLens Shader Pack

A new version of this shader was optimized for running on the actual HoloLens device. This shader and many others are included in the HoloLens Shader Pack on the Unity Asset Store.

Rebuilding the HoloLens scanning effect with RoomAlive Toolkit

The initial video that introduced the HoloLens to the world contains a small clip that visualizes how the device sees its environment. It shows a pattern of large and smaller triangles that gradually overlays the real-world objects seen in the video. I decided to try to rebuild this effect in real life with a projection mapping setup consisting of a projector and a Kinect V2 sensor.

HoloLens room scan

Prototyping in Shadertoy

First I experimented with the idea by prototyping a pixel shader in Shadertoy. Shadertoy is an online tool that allows developers to prototype, experiment with, test and share pixel shaders using WebGL. I started with a raymarching example by Iñigo Quilez and set up a small scene with a floor, a wall and a bench. The calculated 3D world coordinates could then be used to overlay the triangle effect. The raymarched geometry would later be replaced by geometry scanned with the Kinect V2. The screenshot below shows what the effect looks like. The source code of this shader can be found on the Shadertoy website.

Shadertoy Room Scanning Shader

Projection mapping with RoomAlive Toolkit

During Build 2015 Microsoft open-sourced a library called the RoomAlive Toolkit, which contains the mathematical building blocks for building RoomAlive-like experiences. The library contains tools to automatically calibrate multiple Kinects and projectors so they all use the same coordinate system. This means each projector can project onto the correct location in a room, even on dynamic geometry. The toolkit also includes an example of reprojecting the recorded image with a pixel shader effect. I used this example to apply the previously prototyped scan-effect pixel shader to live scanned 3D geometry.

Source code on GitHub

Bring Your Own Beamer

The installation was shown at the Bring Your Own Beamer event held on September 25th 2015 in Utrecht, The Netherlands. For this event I made some small artistic adjustments. In the original video the scanning of the world seems to start from the location of the person wearing the HoloLens. In the installation shown at the event people were able to trigger the scanning effect with their feet. The effect starts at the triggered location and expands across the floor and up their legs and any other geometry in the room.

 

The distance from the camera determines the base color used for a particular scan. Multiple scans interfere with each other and generate a colorful experience. The video shows how part of the floor and part of the wall are mapped with a single vertically mounted projector. People seemed to particularly like to play with the latency of the projection onto their body by moving quickly.
