This blog was written by Kapileshwar Syamasundar during his summer placement at ARM in the ARM Mali Graphics demo team. Kapil did some great work at ARM porting the Ice Cave demo to VR using Unity, we hope you can benefit from this too.
Ice Cave VR
1. Introduction
Ice Cave, the latest demo from ARM Mali Ecosystem, has been shown with great success this year in such major events as GDC, Unite Europe, and Unite Boston. The demo has been developed in Unity and aims to demonstrate that it is possible to render high visual quality content on current mobile devices. A number of highly optimized special effects were developed in-house, specifically for this demo, some of which are based on completely new techniques, for example the rendering of shadows and refractions based on local cubemaps.
Figure 1. View of the cave from the entrance in the Ice Cave demo.
The Ice Cave demo was released at a time when Virtual Reality has become the centre of attention in the game development community, and related events and media. A number of VR demos and games have already been released but VR performance requirements can limit the complexity of VR content and therefore the visual quality of the final VR experience.
It is in this landscape that the Ecosystem demo team decided to port the Ice Cave demo to Samsung Gear VR and this task was assigned to me. In this blog I describe my experience in porting the Ice Cave demo to VR during my eight weeks summer placement in the Ecosystem demo team.
By the time I joined the demo team, Unity had just released a version with VR native support for Oculus Rift and Samsung Gear VR. Previously, VR support was only available by means of a plugin based on Oculus Mobile SDK, but this had some obvious limitations:
- Each VR device has a different plugin
- Plugins may conflict with each other
- Release of newer VR SDKs / Runtimes can break older games
- Lower-level engine optimizations are not possible with the plugin approach, which uses two separate cameras
Conversely, the newly released native Unity VR integration lacked both support and sufficient information for developers, and still had many unresolved issues. Nonetheless, the team was convinced that with the native integration in Unity we would be able to achieve the best possible performance; a key point in guaranteeing a successful VR user experience.
2. Samsung Gear VR
The Samsung Gear VR headset does not have a built in display but has instead been designed to host a mobile phone. At the time of writing, the Samsung Gear VR comes in two versions; one for Samsung Note 4 and another for the latest Samsung Galaxy S6. Some of the main specifications of the Samsung Galaxy S6 version are listed below.
· Sensors: Accelerometer, Gyrometer, Geomagnetic, Proximity
· Motion to Photon Latency < 20 ms
· Manual Focal Adjustment
· Main Physical UI: Touch Pad
· Oculus’s Asynchronous TimeWarp technology
Figure 2. The Samsung Gear VR for Samsung Galaxy S6.
Samsung Gear VR is powered by Oculus VR software and incorporates the Oculus Asynchronous TimeWarp technology. This important feature helps reduce latency (the time taken to update the display in response to the latest head movement), a key issue to address in VR devices. Besides the TimeWarp technology, the Samsung Gear VR has several sensors which it uses in place of the ones incorporated in the phone.
The Samsung Gear VR has its own hardware and features a touch pad, back button, volume key and, according to the specifications, an internal fan designed to help demist the device while in use.
The key point here however, is that you can insert your Samsung Galaxy S6 into the headset and enjoy an immersive experience with just a smartphone. We are no longer limited to the screen size of the phone and can instead become completely immersed in a virtual world.
3. Main steps to port an app/game to VR in Unity
VR integration in Unity has been achieved following one of the main Unity principles, that it must be simple and easy. The following basic steps are all that are needed to port a game to VR:
· Use Unity 5.1 (or any later version), which includes native VR support.
· Obtain the signature file for your device from the Oculus website and place it in Plugins/Android/assets folder.
· Set the “Virtual Reality Supported” option in Player Settings.
· Set a parent for the camera. Any camera control must apply position and orientation changes to the camera parent.
· Associate the camera control with the Gear VR headset touch pad.
· Build your application and deploy it on the device. Launch the application.
· You will be prompted to insert the device into the headset. If the device is not ready for VR you will be prompted to connect to a network so the device can download the Samsung VR software.
· NB. It is useful to set the phone to developer mode to visualize the application running in stereo without inserting it into the Gear VR device. You can only enable developer mode if you have previously installed an appropriately signed VR application.
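The camera parenting step in the checklist above can be sketched as a small script. This is only a sketch: the class name and the Inspector-assigned field are assumptions, not part of the original demo's code.

```csharp
using UnityEngine;

// Sketch: attach to an empty GameObject that acts as the camera's parent.
// With native VR support, Unity overwrites the Camera's local transform
// with head tracking every frame, so all programmatic movement must be
// applied to this parent node instead.
public class VRCameraRig : MonoBehaviour
{
    public Camera vrCamera; // assign the VR camera in the Inspector

    void Start()
    {
        // Ensure the camera really is a child of this rig object.
        if (vrCamera != null && vrCamera.transform.parent != transform)
        {
            vrCamera.transform.SetParent(transform, false);
        }
    }

    // Move the rig, never the camera itself.
    public void MoveRig(Vector3 worldDelta)
    {
        transform.position += worldDelta;
    }
}
```

Any camera controller in the scene then calls `MoveRig` (or rotates this object) rather than touching the camera transform directly.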
Enabling Gear VR developer mode:
· Go to your device Settings - Application Manager - Gear VR Service
· Select "Manage storage"
· Tap on the "VR Service Version" six times
· Wait for the scan process to complete and you should now see the Developer Mode toggle

Developer mode allows you to launch the application without the headset and also to dock the headset at any time without Home launching.
Figure 3. Steps to enable VR Developer mode on Samsung Galaxy S6.
Figure 4. Side-by-side view of the stereo viewports captured with VR developer mode enabled.
4. Not as simple as it seems. Considering VR specifics.
After following the instructions above, I saw nothing but a black screen when inserting the device into the headset. It took me some time to get the VR application running in order to establish that some existing features had to be changed and others added.
VR is a completely different user experience, and accounting for this is one of the key issues when porting to VR. The original demo had an animation mode which moved the camera through different parts of the cave to show the main features and effects. However, in VR this animation caused motion sickness in the majority of users, particularly when moving backwards. We therefore decided to remove this mode completely.
We also decided to remove the original UI. In the original Ice Cave demo a tap on the screen triggers a menu with different options but this was unsuitable for VR. The original navigation system, based on two virtual joysticks, was also unsuitable for VR so we decided to entirely replace it with a very simple user interaction based on the touch pad:
· Pressing and holding the touch pad moves the camera in the direction the user looks.
· When you release the pressure the camera stops moving.
· A double tap resets the camera to the initial position.
This simple navigation system was deemed to be intuitive and easy by all users trying the VR version of the demo.
Figure 5. User interaction with touch pad on the Samsung Gear VR.
The camera speed was also a feature we considered carefully as many users experienced motion sickness when the camera moved just a little too fast. After some tests we were able to set a value that most people were comfortable with.
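The touch pad navigation described above can be sketched as follows. This is a sketch under two assumptions: that the Gear VR touch pad is surfaced to Unity as mouse button 0, and that the speed and double-tap values shown are placeholders rather than the values actually tuned for the demo. The script is meant to sit on the camera's parent object, since head tracking owns the camera transform itself.

```csharp
using UnityEngine;

// Sketch of the touch-pad navigation: hold to move in the look
// direction, release to stop, double tap to reset the camera.
// Assumes the Gear VR touch pad reports as mouse button 0.
public class TouchPadNavigation : MonoBehaviour
{
    public Transform head;             // the VR camera (child of this object)
    public float speed = 0.7f;         // kept low to avoid motion sickness
    public float doubleTapWindow = 0.3f;

    Vector3 startPosition;
    Quaternion startRotation;
    float lastTapTime = -10f;

    void Start()
    {
        startPosition = transform.position;
        startRotation = transform.rotation;
    }

    void Update()
    {
        // Double tap: reset the camera rig to its initial pose.
        if (Input.GetMouseButtonDown(0))
        {
            if (Time.time - lastTapTime < doubleTapWindow)
            {
                transform.position = startPosition;
                transform.rotation = startRotation;
            }
            lastTapTime = Time.time;
        }

        // Press and hold: move in the direction the user is looking.
        if (Input.GetMouseButton(0))
        {
            transform.position += head.forward * speed * Time.deltaTime;
        }
        // Releasing the pad simply stops the movement.
    }
}
```

Note that all movement is applied to the parent transform; the `head` reference is only read to obtain the current look direction.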
Additionally, the camera has to be set as a child of a game object. This is the only way Unity can automatically integrate the head tracking with the camera orientation. If the camera has no parent this link will fail so any translation and rotation of the camera has to be applied to the camera parent node.
In VR, as in reality, it is important to avoid tight spaces so the user doesn’t feel claustrophobic. The original Ice Cave was built with this in mind and provides ample space for the user.
The only effect not imported to VR was the dirty lens effect. In the original Ice Cave demo this effect is implemented as a quad that is rendered on top of the scene. A dirty texture appears with more or less intensity depending on how much the camera is aligned with the sun. This didn’t translate well to VR and so the decision was made to completely remove it from the VR version.
Figure 6. Dirty lens effect implemented in the original Ice Cave demo.
5. Extra features in the Ice Cave VR version
In the original demo the user can pass through the walls to look at the cave from the outside. However, in VR this didn't create a good experience, and the sense of immersion disappeared when you went out of the cave. Instead, I implemented camera collision detection and smooth sliding for when the user moves very close to the walls.
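One simple way to get both collision detection and smooth sliding in Unity is to drive the camera rig through a CharacterController, which resolves collisions by sliding along surfaces. This is only a sketch of the idea, not the demo's actual implementation; the radius and speed values are assumptions.

```csharp
using UnityEngine;

// Sketch: camera collision with smooth sliding along the cave walls.
// CharacterController.Move() clips motion against colliders and slides
// along them instead of passing through.
[RequireComponent(typeof(CharacterController))]
public class CameraCollision : MonoBehaviour
{
    public Transform head;       // the VR camera (child of this object)
    public float speed = 0.7f;

    CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
        controller.radius = 0.3f; // keeps the viewpoint away from the walls
    }

    void Update()
    {
        if (Input.GetMouseButton(0))
        {
            // Motion is resolved against the cave geometry, so the user
            // slides smoothly along a wall rather than going outside.
            controller.Move(head.forward * speed * Time.deltaTime);
        }
    }
}
```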
When running a VR application on Samsung Gear VR, people around the user are naturally curious about what the user is actually seeing. We thought that it would be interesting, particularly for events, to stream the content from the VR headset to another device such as a tablet. We decided to explore the possibility of streaming just the camera position and orientation to a second device running a non-VR version of the same application.
The new Unity network API allowed rapid prototyping, and in a few days I had an implementation which worked pretty well. The device actually running the VR version on the Samsung Gear VR works as a server and on each frame sends the camera position and orientation over wireless TCP to a second device that works as a client.
Figure 7. Streaming camera position and orientation from Samsung Gear VR to a second device.
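The server side of this streaming scheme can be sketched as below. The port number and the wire format (seven floats per frame: position plus rotation quaternion) are assumptions for illustration, not the demo's actual protocol.

```csharp
using System.IO;
using System.Net;
using System.Net.Sockets;
using UnityEngine;

// Sketch: each frame, send the VR camera's pose over TCP to the
// companion (non-VR) device. Port and wire format are assumptions.
public class PoseStreamServer : MonoBehaviour
{
    public Transform head;     // the VR camera
    TcpListener listener;
    BinaryWriter writer;

    void Start()
    {
        listener = new TcpListener(IPAddress.Any, 9050);
        listener.Start();
    }

    void Update()
    {
        // Accept the companion device when it connects.
        if (writer == null && listener.Pending())
        {
            writer = new BinaryWriter(listener.AcceptTcpClient().GetStream());
        }
        if (writer == null) return;

        // Seven floats per frame: position (x, y, z) + rotation quaternion.
        Vector3 p = head.position;
        Quaternion q = head.rotation;
        writer.Write(p.x); writer.Write(p.y); writer.Write(p.z);
        writer.Write(q.x); writer.Write(q.y); writer.Write(q.z); writer.Write(q.w);
        writer.Flush();
    }

    void OnDestroy()
    {
        listener.Stop();
    }
}
```

The client side of the non-VR build would read the same seven floats each frame and apply them to its own camera.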
Using the built-in touch pad to control the camera motion proved very successful. Nevertheless, we decided to provide the user with an alternative method of control using an external Bluetooth mini controller readily available elsewhere. This required us to write a plugin to extend the Unity functionality by intercepting the Android Bluetooth events and using them to trigger movement and resetting of the camera. Unfortunately, there is not much information available, so it was only possible to intercept the messages coming from two keys, but this was enough to move/stop and reset the camera.
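On the Unity side, an Android plugin of this kind typically forwards key events with UnitySendMessage. The sketch below shows what the receiving script might look like; the message name, the two key-code strings, and the move/stop toggle behaviour are all assumptions, not the actual plugin interface used in the demo.

```csharp
using UnityEngine;

// Sketch: Unity-side receiver for key events forwarded by a hypothetical
// Android plugin via UnitySendMessage("BluetoothReceiver", "OnKey", code).
// The two key codes are placeholders, not the real controller's values.
public class BluetoothReceiver : MonoBehaviour
{
    public Transform head;       // the VR camera (child of this object)
    public float speed = 0.7f;

    Vector3 startPosition;
    bool moving;

    void Start()
    {
        startPosition = transform.position;
    }

    // Called from the Java side of the plugin.
    public void OnKey(string keyCode)
    {
        if (keyCode == "KEY_A")          // first key: toggle move/stop
        {
            moving = !moving;
        }
        else if (keyCode == "KEY_B")     // second key: reset the camera
        {
            moving = false;
            transform.position = startPosition;
        }
    }

    void Update()
    {
        if (moving)
        {
            transform.position += head.forward * speed * Time.deltaTime;
        }
    }
}
```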
6. Conclusions
Ice Cave VR was implemented during my summer placement with ARM’s Ecosystem Demo team in less than eight weeks with no previous experience of Unity. This was possible thanks to the native VR integration Unity released on version 5.1. In principle, just a few steps are necessary to port a game to VR, although in practice you need to do some extra work to fine-tune the specific requirements of VR in your game. With this integration, Unity has greatly contributed to the democratisation of VR.
Unity VR integration is still in progress and some reported issues are expected to be solved in coming versions. Nonetheless, the Ice Cave VR version shows that it is possible to run high quality VR content on mobile devices if resources are balanced properly at runtime by using highly optimized rendering techniques.
All the advanced graphics techniques utilised in the Ice Cave demo are explained in detail in the ARM Guide for Unity Developers. In the guide it is possible to find the source code or code snippets for these techniques, which allowed me to understand how they work.
What I consider the most relevant in all this is the fact that with mobile VR we are no longer limited to the size of our smartphones to enjoy a game. Now we can be part of a limitless virtual world and enjoy a wonderful VR experience from a tiny smartphone inserted in a head set. This really is an outstanding step forward!
Now you can check out the video.