Our journey through Virtual Reality, software and HMDs

Ever since we laid our hands on the original Oculus Rift DK1, we have been interested in the possibilities that Virtual Reality gives us. From the hardware side, HMDs (Head Mounted Displays) have huge growth potential and the capability to provide a good sense of presence in a virtual world, while from the software side VR pushes a different perspective on real-time interactivity in applications and games.
We were quick to realize that our MassiveEngine was a great fit for Virtual Reality development, and implemented working support for the Oculus Rift in a mere two weeks. Throughout the years we have been working with VR from a research, content and software perspective. This post shows our journey and highlights projects we have worked on and decisions we made.

Why Virtual Reality

A question we still get quite a lot, mostly from people who have never experienced VR, or only saw an old, laggy prototype running on a DK1 or a cheap three-dollar Google Cardboard clone.
VR is something whose potential you see immediately when you try it; the immersion and interaction, compared to an otherwise flat 2D experience, change your perspective on how you experience media. Everything from games and entertainment to research and even medical treatment are valid reasons to look into VR. We were on the bandwagon early, getting to use the original Oculus Rift DK1 within the first couple of weeks of availability after its successful Kickstarter campaign. Originally, curiosity about how this would work in software was more than enough reason to chase it and to study and research techniques and implementations for a functional and effective VR pipeline. This led to a couple of questions: which devices to support, what to do (and not do) in the rendering pipeline, which rendering techniques would be optimal for the current generation of hardware, and would current hardware even suffice?
Some of these questions are still partly unanswered; some we solved with external and our own research, and a bit of help from a couple of really smart programmers, which leads us to…

Which HMDs to support? (All of them!)

So the Oculus Rift came first, and for a while it seemed like the rest of the industry would be happy with that, still unsure of the capabilities of the devices and worried that VR would become just another gimmick to die off (like Nintendo’s Virtual Boy, which did… well, do you have one?). After the initial year of Oculus growing bigger and bigger, alternatives started to show up, some very promising! This raised a question for us: how to support all these devices? Almost every HMD has its own SDK, some not allowing inclusion in software that also supports other devices with the same functionality, some requiring very specific software paths that can be problematic, and so on. It also became clear that multi-platform support was becoming an issue, with most devices initially shipping on Windows only, a no-go for us. Our solution?

OpenHMD Demo App

http://www.openhmd.net – our preferred library for HMD support.


OpenHMD is a small C library which gives us a multi-platform, hardware-agnostic driver for HMDs. This solved everything that we and other developers had issues with. After successfully running a rather fast initial implementation in the MassiveEngine, we settled on it and eventually became contributors to this open source library ourselves. Regardless of the HMD you have, it works in our software, and if your HMD is not supported yet? It will be very soon!


How about portable alternatives?

Initially, portable platforms such as Android were not on our priority list; however, after seeing the capabilities of phone-based VR experiences, we started contributing an Android backend to OpenHMD. Using the Android Native Development Kit (NDK) together with the sensor fusion of OpenHMD, Android support was easy to add, but the render pipeline needed to support these devices is a tad more difficult.
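OpenHMD’s actual sensor fusion is written in C and does more than this (it also corrects gyro drift using the accelerometer), but the core idea of turning raw gyroscope rates into an orientation is just incremental quaternion integration. A minimal, purely illustrative Python sketch:

```python
import math

def integrate_gyro(q, gyro, dt):
    """One dead-reckoning step: rotate orientation quaternion q = (w, x, y, z)
    by the angular velocity gyro = (wx, wy, wz) in rad/s over dt seconds."""
    wx, wy, wz = gyro
    angle = math.sqrt(wx * wx + wy * wy + wz * wz) * dt
    if angle < 1e-9:
        return q  # no measurable rotation this step
    # Axis of rotation (normalized angular velocity) and half-angle terms.
    ax, ay, az = wx * dt / angle, wy * dt / angle, wz * dt / angle
    s, c = math.sin(angle / 2.0), math.cos(angle / 2.0)
    w2, x2, y2, z2 = c, ax * s, ay * s, az * s
    w1, x1, y1, z1 = q
    # Hamilton product q * dq applies the incremental rotation.
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)
```

Feeding this the NDK’s gyroscope samples every frame gives a head orientation; without the accelerometer correction it slowly drifts, which is exactly why proper sensor fusion is needed.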
PC GPUs have no problem doing multiple passes and storing a good amount of temporary data. This functionality can be used for things like deferred rendering or effects such as SSAO. Mobile phones are currently still limited in VRAM and processing power, and sometimes even in memory bandwidth, which can be a bottleneck for CPU/GPU communication.
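A back-of-the-envelope calculation shows why bandwidth is the pain point. The byte counts below are illustrative, not measurements from our renderer: a forward pass writes roughly a colour target plus depth, while a deferred G-buffer writes several render targets per pixel before lighting even starts.

```python
def frame_write_mb(width, height, bytes_per_pixel):
    """Rough per-frame framebuffer write traffic in megabytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

# Forward: RGBA8 colour (4 bytes) + 32-bit depth (4 bytes) = 8 bytes/pixel.
forward = frame_write_mb(1920, 1080, 8)

# Deferred: a typical G-buffer (albedo, normals, material params, depth)
# can easily write ~20 bytes/pixel before the lighting pass.
deferred = frame_write_mb(1920, 1080, 20)
```

At VR frame rates this happens 60+ times a second for two eyes, so the deferred layout multiplies memory traffic that a phone simply does not have to spare, which is why we stick to forward rendering on mobile for now.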

We are currently using an old-school forward renderer for our mobile experiments, but will look into more modern techniques when Vulkan becomes more standard on phones.


A cheap Chinese HMD which fits a phone, running an OpenHMD demo. This is a slightly better version of a Google Cardboard.

Of course an implementation for iOS should be possible, but since we have no current projects that rely on iOS, we have not bothered with it yet. With some interesting dedicated HMDs coming which run Android on a beefy APU such as the Nvidia Tegra X1 (I am looking at you, Gameface), support for Android and portable HMDs is interesting enough for us to spend some time on it.

Initial testing and Research

Gameforce 2014 Gronos Oculus

Gameforce 2014 event, showing our six-week prototype of Gronos optimized for VR.


Since day one we have been experimenting with how to utilize HMDs in our games, trying everything from basic first-person puzzle games and simple exploration games to fast-paced third-person multiplayer games! Our first finding was that the control scheme is very important. The games we started with are more traditional games optimized for VR, which works fine when using correct input layouts. Research was done with multiple big test groups over the course of a year to find the best device and layout to reduce motion sickness and improve immersion.

We found that motion sickness varies a lot from person to person, with some people being almost 100% immune while others could not use HMDs for longer than a couple of seconds. Improvements in the hardware help a lot with this, but we also found techniques in input handling and content production that reduce motion sickness overall: small things like the brightness of the scene, reducing the effort needed to learn the controls, not taking camera control away from the player, a correct per-person setup of IPD (Interpupillary Distance, the distance between your eyes) and much more!
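The IPD setup mentioned above boils down to something very simple in the renderer: each eye’s camera is shifted half the IPD sideways from the head position before building its view matrix. A minimal sketch (the function name and the default value are illustrative, not our engine’s API):

```python
def eye_offsets(ipd_m):
    """Return the (x, y, z) camera offsets for the left and right eye.

    Each eye is shifted half the interpupillary distance along the
    head's local x axis: left eye negative, right eye positive."""
    half = ipd_m / 2.0
    return (-half, 0.0, 0.0), (half, 0.0, 0.0)

# ~63 mm is a commonly quoted average adult IPD; a per-user measured
# value gives noticeably better comfort than the default.
left, right = eye_offsets(0.063)
```

Getting this value wrong effectively scales the world slightly for the viewer, which is one of the subtle contributors to discomfort.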
Note: it is always useful to know your personal IPD; just drop by an optician and ask for a quick check – most do it for free anyway!

We have also been working with multiple universities and companies to implement VR for the purposes of psychological research, predicting human behaviour, studying motion sickness and more, probing the boundaries of what is possible with VR.

As software and general content consultants we have been working internationally to provide people with answers to their problems, or with their first steps into Virtual Reality. We find that even though a lot of tools are emerging and the documentation is mostly there to get started, a lot is still lacking, specifically information on building a good quality product: do’s and don’ts for content, performance optimisations, and requirements for reducing dizziness and nausea.

What about the toolchain?


Our MassiveEngine is well optimized for Virtual Reality and should have all the components needed to build a game for VR. Since our entire toolchain is built on programming and Blender, the only tool that could benefit from VR support would be Blender. Speaking of which!

Recently we have been working on a VR pipeline for Blender, our software of choice for game development. Since Blender integrates well with the MassiveEngine, this was very nice to work on! Utilizing OpenHMD, the combination of open source with platform and hardware independence was a match made in heaven. This can be used to preview content for game engines, or to preview video and image renders in combination with Multiview rendering! It allows for a great pipeline from Blender to equirectangular 3D panoramas or games without the need to validate in other software.
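Blender’s Multiview pipeline handles the equirectangular output for us, but for readers unfamiliar with the format, the mapping behind such a panorama is straightforward: longitude maps to the image’s width, latitude to its height. A sketch of that projection, assuming a -z-forward, y-up convention (the function name is ours, purely for illustration):

```python
import math

def dir_to_equirect(x, y, z, width, height):
    """Map a unit view direction to equirectangular pixel coordinates.

    Longitude (yaw around y) spans the full image width; latitude
    (pitch) spans the height. Assumes -z is 'forward' and y is 'up'."""
    lon = math.atan2(x, -z)                       # -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, y)))       # -pi/2 .. pi/2
    u = (lon / (2.0 * math.pi) + 0.5) * width     # 0 at left edge
    v = (0.5 - lat / math.pi) * height            # 0 at top edge
    return u, v
```

For a stereo 3D panorama the same mapping is rendered twice, once per eye, which is exactly what Multiview’s per-eye cameras provide.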

So, what about the games then?


Reverie Realms: Arena – First Build. A three-week multiplayer deathmatch arena prototype we made for an event. This game is a third-person experiment and is part of our internal testing universe (more on that at a later date).


We are constantly experimenting with VR content and have a couple of things in the pipeline we cannot talk about yet (but they are pretty exciting!). For our own games we try to include a well optimized VR experience wherever we can, but we have not necessarily jumped on the ‘just make VR games’ bandwagon yet. We will probably end up creating a couple of VR exclusives, or at least games that are VR-first, but we still believe there is a need for quality games traditionally played on a screen or TV, on your powerful rig with your favourite keyboard/mouse or controller.