- The name
- My plan
- The main HMDs
- Generic HMD libraries
- VR sickness and other issues
- VR sickness
- Fixes for VR sickness
- Not all HMDs are monitors
- Optimus not so prime
- Wayland and binary drivers
- Lack of standards
- Code ideas
I've recently come out of a contract where I was adding Oculus Rift support to an existing application. This is something I always wanted to play with. Having had that experience, now I want to add generic HMD (Head Mounted Display, like the Rift) support to Evas_3D. It will be great for my major, years long, virtual worlds project.
There are a few different HMDs being released soonish. Many variations on Google Cardboard have been out for a while, Samsung Gear VR was released last month, Oculus Rift is likely to be released in the next month or three, HTC Vive will probably be released later this year, and there are many more.
http://www.hypergridbusiness.com/faq/best-virtual-reality-headsets/ lists dozens of them (mostly Cardboard variations).
I was using an Oculus Rift DK2 (Developer Kit 2) supplied by the client for this contract. I have to hand that back soon, but I'll be buying my own Oculus Rift when it gets released. I'll probably get a Google Cardboard style device as well.
The contract used the Oculus SDK, but obviously that only handles Oculus HMDs, and these days only on Windows. Oculus used to support Linux and Mac OS X, but they dropped that to concentrate on Windows for the release. Oculus claim the other OS support will return. There's a few open source, cross platform, generic HMD libraries that are trying to handle a bunch of HMDs, and that's what I'd rather be using for EFL. It's still early days for this tech, so no standards or anything yet.
It's always good to figure out a name for the thing you are working on at the beginning, even if it's a placeholder name. Traditionally EFL things tend to start with the letter "e", and this involves VR support. So I'm proposing eVR, eVeR, or some variation on that. I think I'll stick with eVeR as a working title.
Also, I have decided to start calling HMDs "facebricks". See if it catches on. B-)
So, my plan is to try out the generic open source HMD libraries I mentioned before, and check to see if there are others. If one comes out as a clear winner, I'll start with that. If it's too close to call, it might be up for some discussion, or we could just support both.
Then I'll hook it up to my EFL based SledjHamr project and see what it takes to get it to work. Again, if the libraries are close, this might help shake out quality aspects to help decide. Armed with this experience, and a working example, we can discuss how to get it added to EFL. I'll have working EFL based code by then, so it should be simple enough to port into EFL proper.
If there's any other EFL devs that are into HMDs and have some, I'd be happy to collaborate. Actually, my default position for anything involved in my virtual world work is that it's such a huge project, with so many moving parts, that any part I can get others to worry about instead of me is something I'm very happy to let others do. I just happen to be well placed right now to be doing this HMD work, and eager to get it working with SledjHamr.
The main HMDs at this time are -
Google Cardboard: There's actually heaps of these; it's a standard more than a particular device.
The TrinusVR that I mention below is software that runs under Windows. It connects, over the network or USB, to a smartphone strapped into a Cardboard chassis. On the phone, the display software uses Cardboard (I think). It seems to work fine with the Oculus pre-display distortion.
HTC Vive: The main competitor to Oculus Rift, though they actually cooperate. HTC is partnering with Steam for this.
Oculus Rift: The one that restarted the HMD industry.
OSVR HDK: An open source / open hardware kit that is designed for people to hack up.
Samsung Gear VR: That one's a bit of a hybrid as I understand it. Essentially Google Cardboard style hardware (attach your own smartphone to a more or less generic and cheap case), with an Oculus SDK, though the Gear VR SDK is different from the Oculus Rift PC SDK. The hardware is only designed for certain top Samsung phones, and there's some sort of control on one side. So far, the only open source library I have found for this is Samsung's own, which doesn't support anything else.
Sony PlayStation VR: Used to be called Project Morpheus. Specifically for the PlayStation 4.
libovr_nsb: "Pure C implementation of the Oculus Rift SDK". A conversion of the Oculus SDK (a horrid bit of C++ coding, not all of it open source) into open source C.
LibVR: "Simple C library to handle virtual reality headsets".
OpenHMD: "OpenHMD aims to provide a Free and Open Source API and drivers for immersive technology, such as head mounted displays with built in head tracking. Our focus is to implement support for as many devices as possible, on any available platform."
OpenVR: Made by Valve, so related to SteamVR. Apparently it's just a thin wrapper around libraries from the other HMD makers, which also tend to be binary blobs. It doesn't seem to have any actual source; the github is full of binaries. WTF?
OSVR: "Kind of wish Valve had just embraced OSVR. It's got a much broader scope, and includes position, orientation, pose, analog/button inputs, skeletal tracking (hands and body), and eye tracking. It supports input devices distributed across a network (e.g. using a smartphone or smartwatch as an orientation tracker from your PC), and is based on an already proven industry standard (VRPN) used for many years. It's also extensible and supports "virtual" interfaces in a pipeline - for example, a plugin that takes raw sensor data and smooths it, or combines multiple orientation/position sensors into a skeletal pose tracking ..."
VRPN: This is the library the ndofdev developer highly recommends for new VR work, which we might need for input controllers anyway. It works as a server that your application connects to, designed so that the devices can be spread across a network. It uses the .C file extension, and the little bits I have looked at all actually look like C. It has been around for a very long time; there's even an Amiga port, I think.
Actually, I think VRPN is for the input devices only - the HMD support listed is for the position / rotation trackers used, not the actual displays. Which is likely why OSVR uses it as a base, but then adds display support on top.
| library | language | license | Android | iOS | Linux | Mac OS X | Windows | notes |
|---|---|---|---|---|---|---|---|---|
| libovr_nsb | C | Oculus Rift SDK License | | | yes | | | The author wants cross platform, but it doesn't look like that's been implemented. |
| LibVR | C | BSD 2 clause | | | | | | I can't find a lot of info about this. |
| OpenHMD | C | Boost Software License | yes | no | yes | yes | yes | Also supports FreeBSD. |
| OpenVR | C++ | Custom, looks BSDish | | | yes | yes | yes | No source! |
| OSVR | C++ | Apache License, Version 2.0 (and others) | yes | | yes | yes | yes | Made up of lots of "projects". Each project can have a different license! Based on VRPN. |
| VRPN | C++ | Was public domain, now Boost Software License 1.0 | yes | client only | yes | yes | yes | Input and tracking devices only. |
| library | Google Cardboard | HTC Vive | Oculus Rift | OSVR | Samsung Gear VR | Sony PlayStation VR | other |
|---|---|---|---|---|---|---|---|
| libovr_nsb | | | yes | | | | Unlikely to support others. |
| OpenHMD | yes | | DK1, DK2 | | | | Pass in data for sensor fusion. Supports other Android based devices. |
| OSVR | | | DK1, DK2 | yes | | | Supports lots of input devices, plus a few other obscure HMDs. |
| VRPN | | | | yes | | | Supports lots of input devices, but no display. |
These considerations exist for most, if not all, HMDs.
It's crucial that this head tracking / update display thing happens with absolutely bare minimum latency. Basically the more latency in a HMD, the more likely the wearer is to throw up, and when wearing a HMD, you can't see the real world, so you are likely to throw up all over your expensive computer, missing the conveniently placed bucket entirely.
So called VR sickness is actually a very crucial issue. It's similar to motion sickness, only reversed. The world is moving for your eyes, but your inner ear disagrees, which triggers a "something is terribly wrong" response in your body, which tends to make you throw up. Your body thinks that maybe the world is screwed up coz of whatever you ate last, so it tries to get rid of that as a first level emergency response. So you throw up. I'm no doctor, this is how it was explained to me. There's lots of ongoing research to reduce this problem. In general though, most people can "get their VR legs" as it's called, slowly getting used to it by short initial exposures, getting longer as you feel more comfy.
Motion sickness is similar, with exactly the same response. Only the world is still for your eyes, but your inner ear thinks you are moving.
I don't suffer from either problem myself, but it's something you have to be aware of. The number of times I have thrown up in my five and a half decade life can be counted on one hand, with plenty of fingers left over. VR sickness is a BIIIG topic of conversation for HMD developers.
Either way, the likelihood of throwing up and feeling bad is high for some people. There's heaps of discussions about how to avoid these problems. There's a medical checklist I go through when introducing new people to HMDs. The Rift in particular insists on displaying a health and safety screen when you start.
Yes, HMD developers can be obsessive about preventing VR sickness. Coz if you are not, people get ill, people stop using your stuff, people sue you for making them ill, people give you a bad rep, ... it's not good, best to avoid it.
Actually, one of the things recommended to help keep VR sickness at bay is to NOT have a floating UI. You should try to make the UI part of the 3D world. Still, floating UIs are gonna happen. My client in particular gets a bit ill with floating UIs that are at a fixed position relative to the HMD. He prefers the UIs to be stuck to the objects they represent. So clicking on a 3D object causes its UI window to popup next to the object, and the window stays in a fixed position relative to that object. I worry that people might forget they have windows open on dozens of objects scattered throughout the world. lol
On the other hand, research is showing that having some sort of fixed "cockpit" might help to avoid the sickness. The theory is that this provides a fixed frame of reference to help offset the rest of the world spinning wildly as your inner ear sits quietly in your office chair. So in-vehicle games are popular, there's even a popular game where you play the part of a truck driver, driving around Europe, making deliveries I think (I've not tried it, but I keep hearing about this game, sounds boring to me, that's my brother-in-law's job, and he's boring).
The difference is that the cockpit surrounds you, the floating UI doesn't, otherwise this would seem to be contradictory. This sort of stuff is still being researched.
In the end though, yes floating UIs will be made, but it's not encouraged.
For their own reasons, Oculus in particular moved away from "just be a monitor" to "be a specialised non monitor device". A very controversial move. Oculus may not support "just be a monitor" mode in the future. Dunno about the plans of the other HMDs though. I suspect Oculus Rift might be one of the popular ones. I've not looked at the other HMDs yet to see if they might do something similar.
Some HMDs in particular I know are NOT monitors. TrinusVR for example is a device on the end of a WiFi or USB connection, not a monitor. I think Google Cardboard is similar, in that the actual device is an ordinary smart phone that's not pretending to be a monitor. TrinusVR actually wraps Google Cardboard.
Still, this HMD detection and configuration step is the most trivial part of the entire process. With so many HMDs on the market, and more coming, plus it being early days for this tech, there are no standards, so non DRM detection methods will be needed as well. In the end a manual process will also be needed, which will have to fall back to "just be a monitor".
Intel GPUs are not supported well for HMDs, they tend to not be grunty enough. It's early days, VR sickness is a thing, so generally really high end graphics cards are needed. In fact, Rift has trouble with nVidia Optimus graphics chips that are very common in laptops. Optimus puts an Intel GPU in front of an nVidia GPU. Everything has to go through the Intel chip, coz the nVidia connects to the monitors THROUGH the Intel chip. This introduces extra latency that Oculus spit the dummy on. They no longer support Optimus, in fact they deliberately refuse to work on Optimus chips since their 0.7 SDK. 0.6 and earlier worked fine on Optimus though. I think they are just being precious. The Windows desktop supplied by my client has Optimus, and the client themselves will be using Optimus based laptops. So fuck you Oculus, they are sticking with SDK 0.6.
Personally, from my experience supporting Second Life / OpenSim users professionally, most people tend to use low end student / business laptops for that, coz they are cheap and plentiful. These tend to have a hard time with SL / OS, and will have a harder time with HMDs. SL / OS is a horrible code base; I'm sure EFL based code could run much faster. One of my goals is to make sure these low powered laptops can actually get a useful display. Which is sorta the exact opposite of what Oculus has done; they are being precious about making sure VR is well accepted by the world in general, so they just refuse to run on anything slow.
Of course, this is all dependent on being able to actually get Wayland/Weston running (ie: it does not work on some hardware, like the nVidia binary blobs), but it does work with the Intel OSS drivers, the nouveau drivers work well also, and I hear that some AMD drivers work too...
Lack of support for nVidia binary blobs is going to be an issue. nVidia have only recently added HMD support to their drivers, and those changes will take longer to get to the open source drivers, if at all. AMD might be in the same boat. Intel GPUs I don't think cater to HMDs explicitly.
Perhaps this is one reason why Oculus "temporarily" dropped Linux support, if DRM won't let us use the nVidia binary blobs?
VR movies are all over the place as far as formats are concerned. Each of those DRM_MODE_FLAGS DH mentioned before, plus a few more, have been used to make movies. Movie players have a dozen or more complex controls to try to sort out what to do for any particular movie. Some movies bypass the entire problem and come embedded inside their own viewer, or rather three of them to support the major HMDs.
- Negotiate with the compositor over Wayland protocol if it can handle a 3d stereoscopic output
- Add detection of 3d stereoscopic output to Enlightenment and ecore_drm backend
```c
/**
 * DRM_CLIENT_CAP_STEREO_3D
 *
 * If set to 1, the DRM core will expose the stereo 3D capabilities of the
 * monitor by advertising the supported 3D layouts in the flags of struct
 * drm_mode_modeinfo.
 */
#define DRM_CLIENT_CAP_STEREO_3D  1
```
So basically, get the mode info and check the "flags" of the mode struct for supported layouts. Possible values on "flags":
```c
#define DRM_MODE_FLAG_3D_MASK                   (0x1f << 14)
#define DRM_MODE_FLAG_3D_NONE                   (0 << 14)
#define DRM_MODE_FLAG_3D_FRAME_PACKING          (1 << 14)
#define DRM_MODE_FLAG_3D_FIELD_ALTERNATIVE      (2 << 14)
#define DRM_MODE_FLAG_3D_LINE_ALTERNATIVE       (3 << 14)
#define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_FULL      (4 << 14)
#define DRM_MODE_FLAG_3D_L_DEPTH                (5 << 14)
#define DRM_MODE_FLAG_3D_L_DEPTH_GFX_GFX_DEPTH  (6 << 14)
#define DRM_MODE_FLAG_3D_TOP_AND_BOTTOM         (7 << 14)
#define DRM_MODE_FLAG_3D_SIDE_BY_SIDE_HALF      (8 << 14)
```
This would/should actually be quite simple to detect in the ecore_drm code as we already fetch mode info for crtcs. Would just be a matter of checking for these flags.
You also need to distort the result, not just be stereoscopic. The distortion remaps the resulting image so that when it eventually hits your eyes, the distortion of the HMD lenses is compensated for. It's not that tricky; shaders do it easily. Basically the lenses, needed to make sure ordinary people can focus on a screen that is a few mere centimetres away from their eyes, introduce pincushion distortion, so the reverse barrel distortion has to be pre applied, so it all comes out straight in the end.
Then there's also dealing with rotation and position sensors on HMDs. They do their magic by detecting where your head is at and which direction it is pointing, so the software can adjust the view into the 3D world as you move your head around. This is the very crucial part, due to VR sickness.