The virtual game environment was created using Unity.
The current application was designed to run on Android.
Code hosting and version control were handled through GitHub.
Ready-made assets, such as clouds, were downloaded and imported from the Unity Asset Store. A terrain object was created with Unity's Terrain Toolbox, and Unity's built-in Terrain Tools were then used to shape it with a variety of brushes that raise and lower the terrain, smooth its height, and paint textures. Snow textures were painted on the islands at one end of the environment and desert textures at the other, so that players could more easily localise themselves on the map.
Assets were imported to create a custom skybox and to place clouds in the sky, and a custom script was then written to move the clouds across the sky at runtime. Lastly, an audio clip was imported and attached to the boat camera as an audio source to simulate ambient sound.
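The cloud-movement script is not reproduced in the report; a minimal sketch of how such a script might look, assuming a fixed wind direction and drift speed, is:

```csharp
using UnityEngine;

// Hypothetical sketch of the cloud-drift behaviour described above.
public class CloudDrift : MonoBehaviour
{
    public Vector3 windDirection = Vector3.right; // assumed wind direction
    public float driftSpeed = 2f;                 // assumed speed in units/second

    void Update()
    {
        // Move the cloud along the wind direction each frame.
        transform.position += windDirection.normalized * driftSpeed * Time.deltaTime;
    }
}
```

Attached to each cloud object, this moves it a frame-rate-independent distance every update.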
This application enables several modes of interaction. The phone's accelerometer is mapped to the ship's heading: players tilt the phone to steer the boat left and right. The accelerometer reading is smoothed with Unity's Lerp function to reduce jitter.
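The tilt-to-steer mapping with Lerp smoothing could be sketched as follows; the turn speed and smoothing factor are assumptions, not values from the project:

```csharp
using UnityEngine;

// Sketch of accelerometer steering with Lerp-based smoothing (parameter
// values are illustrative assumptions).
public class TiltSteering : MonoBehaviour
{
    public float turnSpeed = 60f; // degrees per second at full tilt (assumed)
    public float smoothing = 5f;  // higher values respond faster (assumed)

    private float smoothedTilt;

    void Update()
    {
        // Input.acceleration.x is roughly -1..1 when tilting the phone sideways.
        float rawTilt = Input.acceleration.x;

        // Lerp towards the raw reading each frame to damp sensor jitter.
        smoothedTilt = Mathf.Lerp(smoothedTilt, rawTilt, smoothing * Time.deltaTime);

        // Rotate the boat around its vertical axis.
        transform.Rotate(0f, smoothedTilt * turnSpeed * Time.deltaTime, 0f);
    }
}
```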
The microphone's sound input generates wind in the game. A low-pass filter removes unwanted frequencies from the input. The input also modulates the amplitude of the waves, by distinguishing between low- and high-frequency noise.
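One way to realise this in Unity is to route the microphone into an AudioSource, attach an AudioLowPassFilter, and estimate loudness from the output samples. The cutoff frequency and buffer size below are assumptions:

```csharp
using UnityEngine;

// Sketch: microphone loudness (after low-pass filtering) drives wind strength.
[RequireComponent(typeof(AudioSource), typeof(AudioLowPassFilter))]
public class MicWind : MonoBehaviour
{
    private AudioSource source;
    private readonly float[] samples = new float[256];

    public float WindStrength { get; private set; } // consumed by wind/wave code

    void Start()
    {
        source = GetComponent<AudioSource>();
        // Record from the default microphone into a looping 1-second clip.
        source.clip = Microphone.Start(null, true, 1, 44100);
        source.loop = true;
        source.Play();

        // Cut frequencies above ~1 kHz so high-frequency hiss does not
        // drive the wind (cutoff value is an assumption).
        GetComponent<AudioLowPassFilter>().cutoffFrequency = 1000f;
    }

    void Update()
    {
        // Estimate loudness as the RMS of the most recent output samples.
        source.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        WindStrength = Mathf.Sqrt(sum / samples.Length);
    }
}
```

Separating the signal into low- and high-frequency bands, as the report describes for the waves, could be done analogously with `GetSpectrumData` and a split of the resulting frequency bins.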
The phone's light sensor drives the spawning of clouds. The sensor measures the current illuminance (lux) of the ambient light in the room; by covering the sensor, and thus lowering the light level, the player can spawn clouds above their ship. The clouds follow the wind direction and can be used to hide the boat on the map. A wind effect was created with Unity's particle system, which responds to a wind zone.
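Unity does not expose the Android light sensor directly, so a small native plugin is typically needed to read the lux value. The spawn logic could then look like the sketch below, where `LightSensorPlugin.CurrentLux` is a hypothetical accessor and the threshold and cooldown are assumptions:

```csharp
using UnityEngine;

// Sketch of lux-driven cloud spawning. LightSensorPlugin is a hypothetical
// bridge to the Android light sensor; threshold and cooldown are assumed.
public class CloudSpawner : MonoBehaviour
{
    public GameObject cloudPrefab;   // cloud prefab (assumed)
    public float luxThreshold = 10f; // "sensor covered" level (assumed)
    public float cooldown = 2f;      // seconds between spawns (assumed)

    private float nextSpawnTime;

    void Update()
    {
        float lux = LightSensorPlugin.CurrentLux; // hypothetical plugin call

        if (lux < luxThreshold && Time.time >= nextSpawnTime)
        {
            // Spawn a cloud a few units above the ship.
            Instantiate(cloudPrefab,
                        transform.position + Vector3.up * 5f,
                        Quaternion.identity);
            nextSpawnTime = Time.time + cooldown;
        }
    }
}
```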
To enable multi-device support in the current game, networking was essential. One challenge was that each user needed an asymmetric view of the world: the goal was to provide a first-person camera view on each player's client device while, at the same time, the server device projected an overview onto a large screen, with all camera views kept in sync at all times. To accomplish this, a custom networking system was set up using the UNet HLAPI (High Level API).
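The report does not show how the asymmetric views were wired up; under the UNet HLAPI, one plausible sketch is a `NetworkBehaviour` on the ship prefab that enables the first-person camera only on the owning client, leaving the host's scene camera to provide the projected overview:

```csharp
using UnityEngine;
using UnityEngine.Networking; // UNet HLAPI (deprecated in later Unity versions)

// Sketch of asymmetric camera views: each client sees through its own
// ship's camera; remote copies keep their cameras disabled.
public class PlayerView : NetworkBehaviour
{
    public Camera firstPersonCamera; // assumed child camera on the ship prefab

    public override void OnStartLocalPlayer()
    {
        // Called only on the client that owns this ship.
        firstPersonCamera.enabled = true;
    }

    void Start()
    {
        if (!isLocalPlayer)
        {
            // Remote ships should not hijack the local display; the server's
            // own scene camera renders the large-screen overview instead.
            firstPersonCamera.enabled = false;
        }
    }
}
```

Position and rotation synchronisation across all views would then be handled by UNet's standard components, such as `NetworkTransform` on the ship prefab.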
Any more questions about the project? Contact us at