Enlighten VR Development Diary - Prototyping


Welcome to the first Enlighten VR development diary!

Our diary entries will take a look at the behind-the-scenes of our demo development process, describing what goes into creating an Enlighten application that showcases innovative and new ways of working with light.

We wanted to show the importance of indirect lighting in virtual reality, and so began development on a new demo that will premiere at the VRX Conference on December 7-8.

For last year's VRX conference, we ported our Subway demo to VR. This time, we decided to create a custom demo to showcase the ultimate Enlighten experience in VR. We began with a prototype phase to identify the main elements to include and how to implement them.

This dev diary covers the first phase of the prototyping process. Subsequent diaries will cover the second stage, where more ideas will be explored and condensed into one or two fully playable experiences, followed by the pre-production stage, where we will focus on reference gathering, art style definition and development of the final demo.

Why prototyping?

The prototyping phase is critical in turning a high-level idea into a plan with known content. It acts as the foundation of a project, giving it direction and scope.

It helps to answer the 'unknowns' of development. For our Enlighten VR showcase, these mainly comprised:

  • How do we use storytelling?
  • What set of content is relevant to both games and enterprise markets?
  • What is our target platform and what are its constraints?
  • What is the duration of the experience?
  • How do we best showcase the impact of indirect lighting?
  • What are the performance constraints of lighting in VR?

Target Platform

Choosing the right VR platform is crucial to delivering the best demo experience. With Enlighten fully supporting all VR platforms, selecting one from the vast range was going to be challenging, as each has its own unique benefits and challenges.

Although Enlighten fully supports mobile platforms, we decided early on not to target mobile VR, as the team has already created a number of mobile VR demos, including Ice Cave and CaveVR. Instead, the non-mobile VR platforms, such as the HTC Vive, Oculus Rift and PlayStation VR, were considered for the new VR demo.

Considerations

The Enlighten art team has a lot of pre-existing knowledge of developing VR demos on both the Oculus Rift and Gear VR that could be used for this showcase. It was decided early on that Unreal Engine 4 would be the engine of choice as it offers a high level of support for VR which would ease the development process.

Before proceeding with the prototype development, some things needed to be taken into consideration.

  • We would need to consider motion control input to add interaction with the lighting.
  • Short, intuitive experiences would be ideal for presenting the demo to multiple people.
  • As obvious as it may seem, it is very dark inside the headset, with zero outside light permeating through; this could be used as a seamless way to introduce the player to the demo.
  • Navigation through space would need careful attention in the context of the experience, as traditional FPS navigation could lead to motion sickness.
  • Lastly, immersion and presence come from combining a number of ingredients in the right amounts: performance, image quality, sound, navigation and interaction would all need to be balanced to give an immersive and engaging experience.

Both the Oculus Rift and HTC Vive presented unique challenges and capabilities for our final demo, so we decided to create some basic prototype scenes for each platform while exploring each major element.

Prototypes:

Prototype 1: Light & Sound


Platform: Oculus Rift

Purpose: To assess the impact of light interaction and sound on a scene.

Concept:

A simple room featuring various objects, with two "firefly" lights flying around, changing colour and intensity. The user can control the fireflies' light colour, visibility and movement, with the global illumination updating in response to their actions. In addition to the changes in light, the fireflies emit a faint buzzing sound, heard as 3D spatial audio, giving cues as to their position and distance from the viewer.

Findings:

Behavioural responses added a great deal of character and personality to the fireflies: they turn red when they hit an object, evoking anger. With this approach, the light sources themselves become an effective means of emotional storytelling. Interaction with the fireflies added an extra dimension of presence within VR, and spatial audio strongly supported the lighting in communicating with the user and directing their attention.
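The behavioural response described above can be sketched as a simple state update. This is a minimal Python illustration, not code from the demo (which was built in Unreal Engine 4); the class and names are our own.

```python
class Firefly:
    """Minimal sketch of a firefly light's behavioural response (hypothetical names)."""

    CALM = (1.0, 0.9, 0.5)   # warm yellow while flying freely
    ANGRY = (1.0, 0.1, 0.1)  # red flash after hitting an object

    def __init__(self):
        self.colour = Firefly.CALM
        self.anger = 0.0  # seconds of "angry" colour remaining

    def on_collision(self):
        # Turning red on impact gives the light source a personality,
        # so the lighting itself carries the emotional storytelling.
        self.colour = Firefly.ANGRY
        self.anger = 1.5

    def tick(self, dt):
        # Fade back to the calm colour once the anger timer expires;
        # the global illumination updates as the colour changes.
        if self.anger > 0.0:
            self.anger -= dt
            if self.anger <= 0.0:
                self.colour = Firefly.CALM
```

In the actual demo, the colour change would drive a dynamic light whose indirect contribution Enlighten updates in real time.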

Prototype 2: Navigation and Interaction


Platform: HTC Vive

Purpose:

To further explore the impact of light while investigating different navigation and interaction techniques.

Concept:

The user starts in a pitch-black room with nothing but a torch. They notice a locked door, which must be unlocked by finding a four-digit code. Using nothing but the torch and the motion controllers, the user must move around the room and solve a series of light-related puzzles to obtain each digit of the door's unlock code. When all the puzzles are completed and the code obtained, the door unlocks.
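The lock logic behind this concept is simple to sketch: each puzzle reveals one digit, and the door opens once all digits are found. This Python sketch is purely illustrative; the names and structure are assumptions, not taken from the prototype.

```python
class CodeDoor:
    """Sketch of the 'escape the room' lock: four digits, each earned
    by solving a separate light-based puzzle (hypothetical names)."""

    def __init__(self, code):
        self.code = code                   # e.g. "4921"
        self.found = [None] * len(code)    # digits revealed so far
        self.unlocked = False

    def solve_puzzle(self, index):
        # Completing puzzle `index` reveals that digit of the code.
        self.found[index] = self.code[index]
        # The door unlocks only once every digit has been found.
        if None not in self.found:
            self.unlocked = True
```

Structuring the experience this way gives each puzzle a clear reward and the room a single win condition.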

Findings:

Trying to use light to solve puzzles in different ways showed that light as a game mechanic has a lot of scope and can be very effective. This is especially interesting because it is not something often utilised in traditional game development, and it offers an opportunity for Enlighten to demonstrate a new approach to VR gameplay.

The original intention of this prototype was to investigate how a torch light would work and feel within a VR gameplay experience. Whilst this was initially achieved, it didn't hold value without a purpose or context. As such, the team decided to create an 'escape the room' experience, using only light (or different features of the torch) in order to solve the puzzles. This purpose created a much more immersive experience.

During play testing, some people found the torch not particularly intuitive to use. Because a torch is a real-world object, people expect it to behave exactly as it does in the real world. As a result, any 'real-world' interactive objects we create must replicate real-world functionality and behaviour.

Interacting with the environment by solving environment-based puzzles proved a very fun and immersive experience. This prototype used a greater range of the play space, making the user get down on their hands and knees or stretch out an arm, for example. This also proved a great way of bringing in new types of gameplay that are unique to VR. Yet people's preferences for navigation systems varied, which means navigation in the final demo must be more intuitive, or give the user more flexibility of choice.

Prototype 3: Gameplay and Environment


Platform: HTC Vive

Purpose: To explore the type of environment that complements multiple dynamic light sources, and to gather performance statistics.

Concept:

The user is placed in a dark labyrinth with a number of apertures. Armed with a controller that spawns glowing balls and a shovel-like bat, the user must navigate through the environment. The balls glow when they hit a surface, illuminating the passage with various pulsating colours.
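The pulsating glow after a ball strikes a surface can be modelled as a sinusoidal pulse with an exponential decay. This is only an illustrative sketch; the function name, pulse rate and decay constant are our own guesses, not values from the prototype.

```python
import math

def glow_intensity(t, hit_time, pulse_hz=2.0, decay=0.5):
    """Illustrative pulsating glow for a ball after it strikes a surface
    at `hit_time` (hypothetical parameters, not from the prototype)."""
    if t < hit_time:
        return 0.0  # the ball has not hit anything yet, so no glow
    age = t - hit_time
    # Sinusoidal pulse in [0, 1], fading out exponentially over time.
    pulse = 0.5 * (1.0 + math.sin(2.0 * math.pi * pulse_hz * age))
    return pulse * math.exp(-decay * age)
```

Feeding a value like this into a dynamic light's intensity is what produces the pulsating indirect lighting on the labyrinth walls.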

Findings:

Batting balls around a level feels intuitive and satisfying, and the physics supports this experience well. The level can be simple, but it needs three-dimensional interactions with walls, holes and other props to create interesting indirect lighting effects. For example, throwing glowing balls behind walls while moving through the labyrinth produces spectacular indirect lighting, especially across multiple levels. Varying the labyrinth design to enable more vertical gameplay or destruction elements would create more dynamic and challenging levels. Adding multiplayer support, scoring mechanics and different ball sizes would also make for a more engaging experience.

You can also read about our performance findings in a separate article.

Conclusion

During the development of these prototypes, the art team gained valuable experience across the VR platforms. The Oculus Rift proved to have great benefits, including ease of setup (should we choose to later bring this demo to customer meetings) and great display quality. More work, however, needs to be done on the controller side and how it can be used effectively for our final demo.

The HTC Vive showed a lot of promise. Its display quality is unrivalled which is ideal for showcasing the importance of light in VR. The Vive controllers also provide unique interactive capabilities which the team can use to create innovative ways to play with light, leading to a more immersive experience.

As it stands, we are most likely to proceed with the HTC Vive as the target platform.

The prototypes have given the team valuable insights into the key elements that need to be implemented, including controls, navigation and performance. These will be explored further in the second stage of the prototype phase, as more work on the content side is still needed to identify the killer Enlighten VR experience.

Stay tuned for our next development diary!

If you are interested in finding out more about Enlighten, please visit the Enlighten page or request an evaluation.
