Spark AR — Glass Refraction Shader Tutorial

Thibaut Evrard
Published in The Startup
6 min read · Nov 23, 2020


For my latest Instagram filter, I wanted to include a glass refraction effect. I was really surprised to find that there were no tutorials out there on how to do this, only assets you need to pay for.

I took a few hours to create my own simplified version of a glass shader. Nothing fancy, but it does the job quite well. Here is a quick tutorial on how I did it.

A bit of Background:

If we look at a perfectly clear glass object, we can only see it because of Reflections and Refractions, in other words, how light is bent by the object ( Refraction ) or bounces back from it ( Reflection ). Spark AR has options to handle Reflections, so we don’t need to worry about those for now. What we really want to focus on today is Refraction.
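For reference, real refraction is governed by Snell’s law, which relates the bend angle to the refractive indices of the two media ( the shader we build below only roughly approximates this behaviour ):

```latex
% Snell's law: a ray crossing the boundary between two media with
% refractive indices n_1 and n_2 bends so that
n_1 \sin\theta_1 = n_2 \sin\theta_2
```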

Prerequisites

Knowing your way around Spark and understanding how the patch editor works should be enough to get you through this tutorial. However, a basic understanding of shaders will help you see why this actually works!

Let’s Start!

First, create a new project and bring a 3D object onto your scene!

Think about our glass effect this way: for each fragment of our 3D object, we want to trace a ray from the camera to the object, apply a deformation to it and project it onto our canvas. Then, we want to sample the camera texture at that point and apply it to our object fragment.

Step 1: Texture Projection

Let’s start by projecting the scene’s camera texture onto our object. We want to make our object transparent by perfectly mapping the background texture onto it.

To begin with, create a new material ( I will name it glassMaterial ), set the shader type to standard and apply it to a mesh on the scene.

Now, bring your material texture onto the patch editor, as well as the extracted texture from the camera ( Scene -> Camera -> Texture Extraction ).

Time to get into the serious stuff! Let’s look at each fragment of our 3D object and figure out the colour of the background right behind it.

You can skip to step 2 by simply adding the Texture UV Projection patch from the asset library and placing it between the camera texture and the material texture. However, I’ll still go through the process of making a custom texture projection.

First let’s connect our camera texture to our object fragments and see what happens.

Local position lets us look at the position of each fragment of our mesh. We then extract the x and y coordinates and use them as UV coordinates to sample the corresponding pixel from our camera texture.

The basic principle is there, but our coordinate systems are wrong. First, we need to translate our vertex position from local space to world position. We do that by multiplying the local position by the model-view-projection matrix.

I will group these nodes into a function named worldPosition
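If patch graphs aren’t your thing, here is a rough Python sketch of what the worldPosition group computes ( a hand-rolled 4x4 matrix multiply; in Spark AR the patch editor does this for you, and strictly speaking the model-view-projection matrix takes us to clip space, but we’ll stick with the group’s name ):

```python
# Sketch of the worldPosition group: transform a fragment's local
# position by a 4x4 model-view-projection matrix.

def world_position(mvp, local_pos):
    """Multiply a 4x4 matrix by (x, y, z, 1) and return (x, y, z, w)."""
    x, y, z = local_pos
    v = (x, y, z, 1.0)
    return tuple(sum(mvp[row][col] * v[col] for col in range(4))
                 for row in range(4))

# An identity matrix leaves the position unchanged (and w = 1).
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(world_position(identity, (0.5, -0.25, 2.0)))  # (0.5, -0.25, 2.0, 1.0)
```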
Next, we need to translate our world position into UV coordinates. I won’t explain this in detail, but here is how to do it:

I will group these new nodes under a function named worldPositionToUV
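For the curious: a common way to go from a projected position to UV coordinates is a perspective divide followed by a remap from the [-1, 1] range to [0, 1], and the worldPositionToUV group does something equivalent. A rough Python sketch:

```python
def world_position_to_uv(pos):
    """Map a projected position (x, y, z, w) to texture UVs in [0, 1]:
    perspective divide, then remap from [-1, 1] to [0, 1]."""
    x, y, z, w = pos
    u = (x / w) * 0.5 + 0.5
    v = (y / w) * 0.5 + 0.5
    return (u, v)

# The origin of the projected space maps to the centre of the texture.
print(world_position_to_uv((0.0, 0.0, 1.0, 1.0)))  # (0.5, 0.5)
```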
Let’s connect everything back together and see what happens.

Noyce! If you change the shader type of your object to flat, it should completely disappear.

Step 2: Refraction

Ok, pretty basic so far, let’s put some refraction in the mix.

In order to approximate refraction, let’s imagine that a ray of light goes from the camera to the canvas. When the ray hits the surface of our object, it “bounces” off it at an angle that depends on the angle between the ray of light and the surface normal ( we also take into account a refraction constant that varies depending on the material, e.g. glass, plastic or water ).

In this step, we will be dealing with our fragment world normal. Let’s make a new worldNormal group. The idea is the same as getting the fragment worldPosition but with normals.

We also know that we need to deal with a ray that goes from the camera to the canvas, so let’s bring that in as a new vector3.

Cool! Now let’s calculate the angle between these two vectors. We do this by taking the arccosine of the dot product of our normalized vectors.

Now we just have to multiply our angle by the worldNormal to get our final refraction direction.

I’ll create a new group from these and name it refractionVector.

Scale this vector to the distance between our current fragment and the canvas. To do this, we can just multiply it by the inverse of the z parameter of our worldPosition.

Now let’s multiply what we get by a refraction constant. We can play around with this value later to tune our shader. I’ll name that constant refractionStrength.

The x and y values we get from this last multiplication represent the distance our refraction vector will travel before hitting the canvas. We are pretty much done now. All we have left to do is extract the x and y components from our vector and add them to worldPosition before plugging them into the worldPositionToUV we created earlier.
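Putting the pieces from this step together, here is a Python sketch of the whole refraction offset ( this mirrors the tutorial’s approximation, not physically exact refraction, and I’m assuming the camera-to-canvas ray is the constant vector (0, 0, 1) ):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def refraction_offset(world_normal, world_pos_z, refraction_strength=0.2):
    """Approximate the tutorial's refraction offset: the angle between the
    camera ray and the surface normal, scaled by the normal, the inverse
    fragment depth and a tunable strength constant."""
    view_ray = (0.0, 0.0, 1.0)               # assumed camera-to-canvas ray
    n = normalize(world_normal)
    angle = math.acos(dot(view_ray, n))      # angle between ray and normal
    direction = tuple(angle * c for c in n)  # "refraction direction"
    scaled = tuple(c / world_pos_z for c in direction)  # scale to canvas distance
    x, y, _ = tuple(c * refraction_strength for c in scaled)
    return (x, y)                            # added to worldPosition x and y

# A surface facing the camera head-on produces no offset.
print(refraction_offset((0.0, 0.0, 1.0), world_pos_z=2.0))  # (0.0, 0.0)
```

A tilted normal produces a non-zero x/y offset, which is exactly the shift we add to worldPosition before feeding it into worldPositionToUV.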

With a refraction strength of 0.2, you should get this kind of result:

Set the shader type to Physically Based and add an environment texture to make it sexy!

Done! Now you can go out and enjoy life!



Hi! I'm a Creative Technologist based in London. I love playing around with tech and creating rough prototypes!