Category: website

Blowing bubbles with Shader Graph

I ran into the bi-weekly tech art challenge by Harry Alisavakis, this time with the theme Watercolors. I decided to participate and use Shader Graph in Unity to do some soap bubble rendering.

Reflections & Iridescence

Real life soap bubble

The most visible features of a soap bubble are its reflections and its rainbow-like colours. You can see reflections from both the front and the back side of the bubble surface, which results in a second, mirrored reflection of the bubble's environment.

The colours are caused by interference between the light reflected from the outside and the inside of the thin bubble film. This phenomenon is called iridescence; it can also be found in seashells, butterflies and other insects, and you may know it from the surface of a CD. When you inspect the bubble surface closely you can see a complex pattern of dancing colours, caused by variations in the film thickness that arise from complex fluid interactions known as the Marangoni effect.
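For intuition, a minimal two-beam interference model already shows why thickness maps to colour: the wave reflected off the inner surface travels an extra optical path of 2·n·d·cosθ, and the half-wave phase shift at the outer surface makes the reflection cancel whenever that path is a whole number of wavelengths. Here is a small Python sketch of that physics (an illustration only, not what the shader computes; n = 1.33 assumes a water-like film):

import math

def thin_film_intensity(thickness_nm, wavelength_nm, n=1.33, cos_theta=1.0):
    # Two equal-amplitude beams plus the half-wave shift at the outer
    # surface: intensity is sin^2(pi * path / wavelength), dropping to
    # zero when 2*n*d*cos(theta) is a multiple of the wavelength.
    path = 2.0 * n * thickness_nm * cos_theta
    return math.sin(math.pi * path / wavelength_nm) ** 2

def film_tint(thickness_nm):
    # Crude RGB estimate: sample one representative wavelength per channel.
    return tuple(round(thin_film_intensity(thickness_nm, w), 2)
                 for w in (650.0, 510.0, 475.0))

print(film_tint(150.0), film_tint(300.0))  # different thickness, different hue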

Reflections in Shader Graph

For the reflections I created a simple room with two fake windows and used a reflection probe to bake it into a cubemap. Alternatively, you could use an HDRI image captured in real life.

Bubbles need windows to reflect

A basic PBR graph with the Workflow set to Specular, the Surface set to Transparent and a Smoothness of 1 already gives you a nice reflective material. Add in a bit of Fresnel effect so the reflections are strongest towards the edges of the bubble, and here's the result.

Shaders are made easy with Shader Graph
Cubemap reflection with Fresnel falloff
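For reference, Unity's Fresnel Effect node boils down to a dot product between the surface normal and the view direction, raised to a power. A Python sketch of the same math (both vectors assumed normalized):

def fresnel(normal, view_dir, power=5.0):
    # Strongest at grazing angles, where the normal is close to
    # perpendicular to the view direction; zero when looking straight on.
    n_dot_v = max(0.0, sum(n * v for n, v in zip(normal, view_dir)))
    return (1.0 - n_dot_v) ** power

print(fresnel((0, 0, 1), (0, 0, 1)))  # head-on view: 0.0
print(fresnel((0, 0, 1), (1, 0, 0)))  # grazing view: 1.0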

Iridescence in Shader Graph

I took a more artistic approach to simulating the bubble's surface variations by blending two layers of scrolling noise. This misses the swirls that you typically see in a real bubble, but that won't be noticeable at a distance. I added a vertical falloff to simulate that the film is a bit thicker at the bottom of the bubble. The thickness variations result in an animated grayscale value, and a gradient lookup is then used to turn that value into the iridescence colour. The gradient is based on a paper by Andrew Glassner.

The iridescence part hooks into the specular colour of the PBR node
Front face reflection and iridescence
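Here is a rough Python sketch of what the node network computes per fragment. The noise() and gradient_lookup() helpers are hypothetical stand-ins for the noise nodes and the Glassner-based gradient, and the scroll speeds and falloff weight are made-up values:

def film_thickness(u, v, time, noise):
    # Blend two layers of noise scrolling in different directions.
    layer1 = noise(u + 0.05 * time, v + 0.02 * time)
    layer2 = noise(u - 0.03 * time, v + 0.04 * time)
    thickness = 0.5 * (layer1 + layer2)
    # Vertical falloff: the film is a bit thicker near the bottom (low v).
    return thickness + 0.35 * (1.0 - v)

def iridescence_colour(u, v, time, noise, gradient_lookup):
    # Map the animated grayscale thickness to a colour via the gradient.
    return gradient_lookup(film_thickness(u, v, time, noise))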

Double sided materials

To create a double-sided material that handles front-facing and back-facing surfaces differently, you can use the Is Front Face node and branch into different parts of the shader based on its value. This works well with opaque materials and can be done in a single pass. Below you see a simple example that displays the result on an open cylinder.

Opaque material marked as Two Sided
Two-sided opaque material ShaderGraph
Open cylinder with two-sided material applied
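In code terms the branch is tiny; a Python sketch, where is_front_face plays the role of the Is Front Face node feeding a Branch node:

def two_sided_colour(is_front_face, front_colour, back_colour):
    # Branch per fragment on the facing direction.
    return front_colour if is_front_face else back_colour

print(two_sided_colour(True, "red", "blue"))   # outside of the cylinder
print(two_sided_colour(False, "red", "blue"))  # inside of the cylinder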

Double sided transparency

With transparent materials we typically want to do a separate back-face pass and a front-face pass, to at least have a coarse way of sorting the model's surfaces. That is easy to do in a hand-coded ShaderLab shader, but a bit harder when using Shader Graph.
Since the Universal Render Pipeline (URP) only supports single-pass materials by default, we need a different trick. We could create a copy of the mesh and render it with a separate material for the back side, but that would mess up the scene tree, and it would get even worse if the geometry were dynamic. Instead I chose to add a second material that uses the same Shader Graph shader, but with a RenderFront flag that selects front-face or back-face rendering.
Note that Unity shows a warning and advises the use of multiple shader passes, which are not supported when using URP. 🤷‍♂️

Two-pass mesh rendering with different materials
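Conceptually the renderer now makes two passes over the same mesh. A Python sketch of the intended draw order, assuming Unity renders a renderer's material slots in array order (back faces before front faces gives the coarse sorting we want for transparency):

def draw_bubble(draw_mesh, mesh, back_material, front_material):
    # Material slot 0: RenderFront off, shades back faces only.
    draw_mesh(mesh, back_material)
    # Material slot 1: RenderFront on, shades front faces only.
    draw_mesh(mesh, front_material)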

The Shader Graph uses the RenderFront flag combined with the Is Front Face node to determine whether the front or the back face needs to be rendered. I used alpha clipping to prevent rendering of the back face when front-face rendering is active. Note that for back-face rendering the surface normal needs to be flipped.

Two-sided reflections part
Bubble with two-sided reflection and iridescence
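Putting it together, here is a Python sketch of the per-fragment logic, with render_front standing in for the RenderFront material flag:

def shade_fragment(render_front, is_front_face, normal, shade):
    # Alpha clip: each material only keeps the faces it is meant to draw.
    if render_front != is_front_face:
        return None  # fragment discarded
    # Back faces need their normal flipped to point towards the camera.
    if not is_front_face:
        normal = tuple(-c for c in normal)
    return shade(normal)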


Augmented Reality avatar on a web page

Augmented reality on the web is happening, and it's becoming easier to create! Below you will find a small test: an avatar model I created online and embedded in this post. The avatar can be viewed in your own environment on ARCore and ARKit enabled devices.

ReadyPlayer.me avatar

I used ReadyPlayer.me to create a personal 3D avatar. On that website you can use a selfie to create a 3D model that resembles you, then make adjustments to the generated result, and finally download the model as a .glb file.
A .glb file is the binary version of a glTF file. glTF is a specification for the efficient transmission and loading of 3D scenes and models by applications.

Google modelviewer

I used Google's modelviewer to embed the glb model file in the web page below and enabled AR viewing.

Here’s the source of the snippet I used:

<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.js"></script>
<script nomodule src="https://unpkg.com/@google/model-viewer/dist/model-viewer-legacy.js"></script>
<model-viewer src="avatar.glb"
              ios-src="avatar.usdz"
              ar
              ar-modes="webxr scene-viewer quick-look fallback"
              ar-scale="auto"
              alt="Readyplayer.me avatar"
              auto-rotate
              camera-controls></model-viewer>

If AR mode is available, an AR button will be visible in the bottom right corner of the viewer. Depending on which mode your device supports, you will either go straight into AR or open a model viewer app.

Here’s the result of running it on an Android phone with ARCore support.

Universal Scene Description

Running the demo on an iPad was a bit more work than I anticipated. Apple only supports models in the .usdz format (Pixar's Universal Scene Description). As you can see in the modelviewer declaration above, there is a separate ios-src attribute for use on iOS devices.

I could not find a simple tool (running on Windows) to convert the .glb to .usdz; there seem to be better options on iOS.
I finally found a route that works: import the .glb into Blender, save a .blend file, import the .blend file into Unity, and finally export the model to .usdz.


ARCore supported devices; a detailed list of phones & tablets

When you want to use an Augmented Reality app that depends on Google's ARCore (AKA "Google Play Services for AR"), you need to know which devices support it. There is an official list of ARCore supported devices, but it only shows brief device names; if you need more details you have to search for them yourself. That is not very efficient with an ever-growing list of supported devices, especially if you want to find out which tablets currently support ARCore apps.

The official list with extra details

There's a more detailed list available from the Google Play Console, but to download it you need a developer account and an uploaded app that uses ARCore. Quite a bit of friction if all you are looking for is which hardware you need for ARCore apps to work.

So I decided to bite the bullet and upload a Unity app with ARCore support to my Google Play developer account. I cloned Unity's arfoundation-samples from GitHub, built the app and made an internal release in the Play Console. After that I was able to access the Device Catalog under Release Management. As you can see below, the app was supported by 216 of a total of 13,579 Android devices back in January 2020.

The Download Device List button lets you download a CSV file that also includes details like the Form Factor (PC/Phone/Tablet), the System on Chip, Screen Sizes, Screen Densities and more.

The downloaded devicelist.csv can be found on GitHub here.

ARCore supported tablets

A quick filter of the ARCore supported device list brings up the tablets that currently support ARCore as of September 2020 (a small pandas sketch of the filter follows the list):

  • Acer Chromebook Tab 10
  • LG G Pad 5 10.1 FHD
  • Samsung Galaxy Tab S3
  • Samsung Galaxy Tab S4
  • Samsung Galaxy Tab S5e
  • Samsung Galaxy Tab S6
  • Samsung Galaxy Tab S7
  • Samsung Galaxy Tab Active Pro
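The filter itself is a one-liner; a pandas sketch, assuming the form-factor column in the Play Console export is named "Form Factor":

import pandas as pd

# Load the device list exported from the Google Play Console.
devices = pd.read_csv("devicelist.csv")

# Keep only the tablets and save them as a separate list.
tablets = devices[devices["Form Factor"] == "Tablet"]
tablets.to_csv("devicelist-tablets.csv", index=False)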

The tablets-only devicelist.csv can be found on GitHub here.

Depth API support

In June 2020 Google officially introduced the Depth API. This API allows developers to retrieve a depth map from the device, which is useful for occluding virtual objects behind real ones or for scanning your environment. Not all ARCore supported devices support the Depth API. To see which ones do, you can add a filter to the Device Catalog in the Google Play Console:
Add a filter, select System Feature, then search for and select com.google.ar.core.depth.

The list of ARCore devices that also support the Depth API can be found on GitHub here.


Switched to WordPress

I decided to switch to a WordPress-based website to make it easier to manage content and simply to make it more modern. I want to add a few of my old VRML projects as YouTube videos.

The glory days of VRML are over. I had a lot of fun with it, but now it’s time to move on.
