Introduction
Game developers are always looking for efficient ways to implement stunning effects in their games. This is especially important when targeting mobile platforms, where resources must be carefully balanced to achieve maximum performance.
When developing the Ice Cave demo, we researched the concept of local cubemaps in depth. In a previous blog, I wrote about how implementing reflections based on local cubemaps has proven to be a very efficient technique for rendering high quality reflections. This blog is devoted to a new technique developed by the ARM demo team to render refractions, also based on local cubemaps.
Refraction: what is it?
Refraction is an important effect to consider when striving for extra realism when rendering semi-transparent geometry.
Refraction is the change in direction of a wave due to a change in the transmission medium. It is essentially a surface phenomenon. The refractive index determines how much light is bent, or refracted, when entering a material. Snell’s Law establishes the relationship between the refractive indices and the sines of the angles of incidence and refraction, as shown in Figure 1.
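In symbols, Snell’s Law reads

    n1 · sin(θ1) = n2 · sin(θ2)

where n1 and n2 are the refractive indices of the two media, and θ1 and θ2 are the angles the incident and refracted rays make with the surface normal.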
Figure 1. Refraction of light as it passes from one medium to another.
Refraction implementations
Developers have tried to render refraction since the very moment they started to render reflections, since these two physical processes take place together on any semi-transparent surface. There are several well-known techniques for rendering reflections, but the same cannot be said for refractions.
Existing methods for implementing refraction at runtime (ray tracing is excluded due to its complexity) differ depending on the specific type of refraction. Nevertheless, most techniques render the scene behind the refractive object to a texture at runtime, then apply a non-physically-based distortion in a second pass to achieve the “refracted look”. This approach, which varies in the way the texture perturbation is performed, is used to render the refraction that takes place in water, heat haze and glass objects, among other effects.
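Most of these approaches boil down to something like the following fragment-shader sketch, which perturbs the screen-space UVs used to sample a texture containing the scene behind the object (this is a sketch of the generic technique with illustrative names, not code from the Ice Cave demo):

    // Render-to-texture refraction via UV perturbation (not physically based).
    sampler2D _SceneBehind;    // the scene behind the object, rendered to texture
    float _Distortion;         // strength of the fake refraction

    struct v2f
    {
        float4 pos      : SV_POSITION;
        float2 screenUV : TEXCOORD0;   // screen-space UVs of the fragment
        float3 normalWS : TEXCOORD1;   // world-space normal
    };

    float4 frag(v2f i) : SV_Target
    {
        // Offset the screen-space UVs along the surface normal to fake the
        // bending of light; the amount is an artistic, not physical, choice.
        float2 uv = i.screenUV + i.normalWS.xy * _Distortion;
        return tex2D(_SceneBehind, uv);
    }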
Although some of these techniques can achieve credible results, texture perturbation is not physically based and the results are not always correct. If a realistic refraction is intended by rendering to texture from the point of view of the “refraction camera”, there may be areas that are not directly visible to the camera but become visible via refraction. Nevertheless, the main limitation of runtime render-to-texture methods, besides physical correctness and the performance penalty, is quality: there is often pixel shimmering or pixel instability that is easily perceived while the camera is moving.
The use of static cubemaps to implement refraction is not new. Since cubemaps first became available in 1999, developers have used them to implement reflections as well as refractions. However, when using cubemaps in a local environment, we get incorrect results if we don’t apply the local correction, and this is as true for refractions as it is for reflections.
Refractions based on local cubemaps
We bake the environment surrounding the refractive object into a static cubemap, and fetch the texel from the cubemap based on the direction of the refracted vector, after applying the local correction (see Figure 2).
Figure 2. The local correction applied to the refraction vector.
We apply the local correction in the same way we did with reflections in a previous blog. After determining the direction of the refracted vector, we need to find where it intersects the bounding box that delimits the volume of the local scene. The next step is to build a new vector from the position where the cubemap was generated to the intersection point and use this final vector to fetch the texel from the cubemap to render what is behind the refractive object. We get a physically based refraction as the direction of the refraction vector is calculated according to Snell’s Law. Moreover, there is a built-in function we can use in our shader to find the refraction vector R strictly according to this law:
R = refract(I, N, eta);
where I is the normalized view (incident) vector, N is the normalized normal vector, and eta is the ratio of the indices of refraction (n1/n2).
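To make the local correction concrete, here is a simplified Cg sketch of the LocalCorrect function (the full implementation is in the reflections blog; the names below are illustrative). It intersects the ray defined by the fragment position and the direction vector with the bounding box, then builds the corrected fetch vector:

    // Correct a direction vector for a local cubemap (simplified sketch).
    // All positions and directions are in world space.
    float3 LocalCorrect(float3 origDir, float3 bboxMin, float3 bboxMax,
                        float3 vertexPos, float3 cubemapPos)
    {
        // Intersect the ray (vertexPos + t * origDir) with the bounding box
        // using the slab method.
        float3 invOrigDir = 1.0 / origDir;
        float3 intersectA = (bboxMax - vertexPos) * invOrigDir;
        float3 intersectB = (bboxMin - vertexPos) * invOrigDir;
        // Per axis, keep the intersection that lies in the forward direction.
        float3 furthest = max(intersectA, intersectB);
        // Distance along the ray to the nearest exit face of the box.
        float dist = min(min(furthest.x, furthest.y), furthest.z);
        float3 intersectPos = vertexPos + origDir * dist;
        // The corrected vector runs from the cubemap generation position
        // to the intersection point.
        return intersectPos - cubemapPos;
    }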
Shader implementation
For the simple case of a thin refractive surface, the shader implementation is straightforward, as shown in Figure 3. As with reflections, to apply the local correction in the fragment shader we need to pass in the position where the cubemap was generated, as well as the minimum and maximum bounds of the bounding box (all in world coordinates).
Figure 3. Shader implementation of refraction based on a local cubemap.
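In case the figure is not to hand, a minimal fragment shader for the thin-surface case could look like the sketch below, assuming the LocalCorrect function from above; the uniform and varying names are illustrative:

    samplerCUBE _Cube;        // the baked local cubemap
    float4 _BaseColour;       // diffuse colour of the surface
    float  _Eta;              // ratio of refractive indices, n1/n2
    float3 _BBoxMin;          // bounding box of the local environment
    float3 _BBoxMax;
    float3 _CubemapPos;       // position where the cubemap was generated

    struct v2f
    {
        float4 pos         : SV_POSITION;
        float3 vertexWorld : TEXCOORD0;   // fragment position, world space
        float3 normalWorld : TEXCOORD1;   // normal, world space
    };

    float4 frag(v2f i) : SV_Target
    {
        // Incident vector from the camera to the fragment (world space);
        // _WorldSpaceCameraPos is a Unity built-in.
        float3 viewDirWS = normalize(i.vertexWorld - _WorldSpaceCameraPos);
        float3 normalWS  = normalize(i.normalWorld);

        // Refraction direction according to Snell's Law.
        float3 refractDirWS = refract(viewDirWS, normalWS, _Eta);

        // Apply the local correction before fetching from the cubemap.
        float3 localRefractDir = LocalCorrect(refractDirWS, _BBoxMin, _BBoxMax,
                                              i.vertexWorld, _CubemapPos);
        float4 refraction = texCUBE(_Cube, localRefractDir);

        return _BaseColour * refraction;
    }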
Once we have fetched the texel corresponding to the locally corrected refraction direction, we might want to combine the refraction colour with other lighting, for example reflections, which in most cases take place simultaneously with refraction. In this case, we just need to pass an additional view vector to the fragment shader, apply the local correction to it, and use the result to fetch the reflection colour from the same cubemap. Below is a code snippet showing how reflection and refraction might be combined to produce a final output colour.
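The snippet below is a sketch rather than the exact demo code; it computes both vectors in the fragment shader and reuses the illustrative names from above:

    // Reflection
    float3 reflDirWS    = reflect(viewDirWS, normalWS);
    float3 localReflDir = LocalCorrect(reflDirWS, _BBoxMin, _BBoxMax,
                                       i.vertexWorld, _CubemapPos);
    float4 reflColour   = texCUBE(_Cube, localReflDir);

    // Refraction
    float3 refractDirWS    = refract(viewDirWS, normalWS, _Eta);
    float3 localRefractDir = LocalCorrect(refractDirWS, _BBoxMin, _BBoxMax,
                                          i.vertexWorld, _CubemapPos);
    float4 refractColour   = texCUBE(_Cube, localRefractDir);

    // Balance the two contributions and tint with the diffuse colour.
    float4 finalColour = _BaseColour * lerp(refractColour, reflColour,
                                            _ReflAmount);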
A coefficient _ReflAmount, which is passed as a uniform to the fragment shader, is used to adjust the balance between the reflection and refraction contributions. You can tweak _ReflAmount manually to achieve the exact look you are trying to achieve. You can find the implementation of the LocalCorrect function in the reflections blog.

When the refractive geometry is a hollow object, refractions and reflections take place on both the front and back surfaces (as shown in Figure 4). In this case, we need to perform two rendering passes.
In the first pass, we render the semi-transparent object last in the rendering queue with front-face culling on, i.e. we render only the back faces, and with depth buffer writing disabled so that it does not occlude other objects. The back-face colour is obtained by mixing the colours calculated from the reflection, the refraction and the diffuse colour of the object itself.
In the second pass, we render only the front faces (back-face culling), again last in the rendering queue and with depth writing off. The front-face colour is obtained by mixing the refraction and reflection textures with the diffuse colour. In this final pass we alpha-blend the resulting colour with the output of the previous pass. Notice the combination of environment refractions and reflections in both pictures in Figure 4.
The refraction in the second pass adds more realism to the final rendering, but we could skip this step if the refraction on the back faces is enough to convey the effect.
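In ShaderLab terms, the two-pass setup might be structured as in the skeleton below; the shader bodies are elided and the names are illustrative, as only the culling, depth and blending states are the point here:

    Shader "Custom/HollowRefractive"   // illustrative name
    {
        SubShader
        {
            // Rendered with the transparent geometry, late in the queue.
            Tags { "Queue" = "Transparent" }

            // Pass 1: back faces only (front-face culling), no depth writes,
            // so the object does not occlude the geometry behind it.
            Pass
            {
                Cull Front
                ZWrite Off
                CGPROGRAM
                // ... back-face reflection + refraction + diffuse colour,
                //     as in the snippets above.
                ENDCG
            }

            // Pass 2: front faces only, alpha-blended over the first pass.
            Pass
            {
                Cull Back
                ZWrite Off
                Blend SrcAlpha OneMinusSrcAlpha
                CGPROGRAM
                // ... front-face reflection + refraction + diffuse colour.
                ENDCG
            }
        }
    }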
Figure 5 shows the result of implementing refractions based on local cubemaps on a semi-transparent phoenix in the Ice Cave demo.
Figure 5. Refractions based on local cubemaps in the Ice Cave demo.
Preparing the cubemap
Preparing the cubemap for use in the refraction implementation is a simple process. We just need to place a camera at the centre of the refractive geometry and render the surrounding static environment to a cubemap in the six directions, with the refractive object hidden during the rendering process. This cubemap can then be used for implementing both refraction and reflection. The ARM Guide To Unity contains a simple script for rendering a cubemap.
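A minimal editor wizard along the lines of Unity's standard cubemap-baking example might look like this (the names are illustrative; the ARM Guide To Unity contains the original script):

    using UnityEngine;
    using UnityEditor;

    // Bakes the static environment into a cubemap from a chosen position.
    public class RenderCubemapWizard : ScriptableWizard
    {
        public Transform renderPosition;   // centre of the refractive geometry
        public Cubemap cubemap;            // target cubemap asset

        [MenuItem("GameObject/Render into Cubemap")]
        static void RenderCubemap()
        {
            DisplayWizard<RenderCubemapWizard>("Render cubemap", "Render");
        }

        void OnWizardCreate()
        {
            // Temporary camera at the bake position. Remember to hide the
            // refractive object (e.g. disable its renderer) before baking.
            GameObject go = new GameObject("CubemapCamera");
            Camera cam = go.AddComponent<Camera>();
            go.transform.position = renderPosition.position;
            go.transform.rotation = Quaternion.identity;

            // Render the scene into all six faces of the cubemap.
            cam.RenderToCubemap(cubemap);

            DestroyImmediate(go);
        }
    }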
Conclusions
Local cubemaps have proven to be an excellent technique for rendering reflections. In this blog I have shown how to use local cubemaps to implement very optimized, high-quality refractions that can be combined at runtime with reflections, as seen in the Ice Cave demo. This is especially important when developing graphics for mobile devices, where runtime resources must be carefully balanced. Nevertheless, this technique does have limitations due to the static nature of the cubemap. How to deal with refractions when the environment is dynamic will be the subject of another blog.