This past Tuesday the Oculus Quest standalone VR headset was released. As of today, it remains sold out at most of the retail locations carrying it across the United States, as well as at Oculus.com. It appears to be a success for Facebook, or at least we can say it is off to a good start. However, shortly after launch, reports began popping up on Reddit about dead pixels being a problem. The number of people reporting the problem was small, but it left me wishing for a way to test my new headset. Given the apps and built-in features, there isn't a way to perform a thorough dead pixel check. So I decided to make one. As it turns out, the app seems to have been useful to many people and was even featured in a video by popular VR YouTuber RaMarcus (click below to watch).
I'm glad this small investment of my time has turned out to help people. In this blog post, I share how I built this simple app and give a little history on the tools that made this utility possible to build in about half an hour.
The process of testing dead pixels
Unfortunately, dead pixels are a common problem with modern displays, and VR screens are not exempt. But finding the problem pixels can be difficult. An effective testing method is to display solid colors in sequence, allowing the user to scan the display surface for the contrasting little culprits.
A VR solution
The ideal solution for testing dead pixels in VR is to have the solid color sequence encompass the full tracked 360° view area. To accomplish this I could have spent a small amount of time in one of the common game engines such as Unreal or Unity3D. But using those tools would produce a binary requiring installation, and thus distribution through the official Oculus storefront. As I discussed in my previous blog post, that isn't possible on the Oculus Quest without going through a rather cumbersome proposal submission process, which is overkill for such a simple utility. I also felt pretty strongly that this was something I'd like to write once and run on every headset. As it turns out, there is a standard perfect for this task: WebVR.
WebVR is an open standard created at Mozilla and first released in 2016; you can read more about its history here. WebVR is an API built on top of the existing WebGL standard, which browsers already expose through the HTML5 canvas element, so it offers a straightforward path for browser vendors to implement. I mention this only for background. For the problem I was chasing, my primary concern wasn't the history but the degree of WebVR support on the Oculus Quest: is the Quest's built-in web browser WebVR compliant? It turns out the answer is yes!
Building for WebVR
A basic A-Frame page looks like this:
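The snippet below is A-Frame's standard starter scene; the library version in the script tag is an assumption:

```html
<html>
  <head>
    <!-- The A-Frame library; the version number here is an assumption. -->
    <script src="https://aframe.io/releases/0.9.2/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```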
The scene node <a-scene> is roughly analogous to a movie scene. It defines a 3D space in which you can place objects, monitor events on those objects, animate them, and so on. Primitives can then be declared as child nodes of the <a-scene> node. In the sample above, we have a primitive box, sphere, cylinder, plane and sky. Every primitive has position and rotation attributes; each contains a string of three values, integer or float, corresponding to x, y and z. For example, <a-box> has position x=-1, y=0.5, z=-3 and rotation x=0, y=45 (degrees), z=0 (if 3D coordinate systems are new to you, you might want to have a look here), expressed in the format position="-1 0.5 -3". In addition to position and rotation we also find some more familiar HTML attributes: width and height let us define scale, and color uses the traditional HTML hexadecimal notation for R, G, B values 0-255.
When we view the page in a WebVR compliant browser such as Mozilla Firefox or Chrome, we see the shapes in 3D space with a VR icon in the bottom right corner. If a recognized VR headset is present, clicking the VR icon transports us into a virtual reality environment. The limitations of the headset technology apply, but with a fully tracked 6DoF headset such as the Quest, the user can walk around their physical environment and their movement will be tracked and mirrored in the 3D virtual environment they are experiencing.
Considering the needs of the app I wanted to build, I decided to use this sample as a starting point. I created a simple intro scene with a texture mapped to a primitive (I ended up choosing a cube) showing basic instructions on how the app would function, painting a simple texture for the purpose in a paint program. Then I decided I would like a 360° image for the backdrop of this scene, and chose one of the park photographs I had captured while creating our VR Cardio app for the Oculus Go.
To apply the instruction texture to the cube I used the src attribute of <a-box>; to apply my 360 photo to the surrounding area I used the src attribute of <a-sky>. Intro scene done!
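A minimal sketch of that intro scene; the asset ids and filenames are placeholders, not the ones actually used in the app:

```html
<a-scene>
  <a-assets>
    <!-- Placeholder filenames; the real app uses its own instruction texture and park photo. -->
    <img id="instructions" src="instructions.png">
    <img id="park" src="park360.jpg">
  </a-assets>
  <!-- Instruction texture on a cube in front of the user -->
  <a-box src="#instructions" position="0 1.5 -2"></a-box>
  <!-- The 360° park photo wraps the whole scene -->
  <a-sky src="#park"></a-sky>
</a-scene>
```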
To track my location in the flow I used a numeric variable labeled phase, then implemented a switch statement inside a timerTicked() method called at the end of each setTimeout(..). The switch statement first removes the box primitive with the instructions, then clears the src attribute of the sky to remove the 360 park image. With the instruction box and 360 photo gone, the user is left with only the <a-sky> object in the scene. So to iterate through a series of solid colors, all I had to do was set the <a-sky> color values, provide a reasonable timeout for each transition, and keep looping the sequence, restarting at step 1 instead of step -1 (since we only need to transition away from the intro scene on the first iteration).
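The phase sequencing can be sketched like this; the color palette and helper names are my assumptions, not the app's exact code:

```javascript
// Solid test colors to cycle through (an assumed palette, not the app's exact list).
const TEST_COLORS = ['#FF0000', '#00FF00', '#0000FF', '#FFFFFF', '#000000'];

// Phase -1 is the intro teardown; phases 1..N map to the solid test colors.
// After the last color we restart at 1, skipping the one-time intro step.
function nextPhase(phase) {
  if (phase === -1) return 1;
  return phase >= TEST_COLORS.length ? 1 : phase + 1;
}

// The color to apply for a given phase; null during the intro teardown.
function colorForPhase(phase) {
  return phase >= 1 ? TEST_COLORS[phase - 1] : null;
}
```

In the app, timerTicked() would remove the intro box and clear the sky's src when phase is -1, otherwise apply colorForPhase(phase) via sky.setAttribute('color', ...), then advance with nextPhase(phase) and schedule the next tick with setTimeout.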
As a final touch, I decided to include <a-entity> objects for the left and right Oculus Touch controllers. Including these two nodes added tracked 3D models of the user's controllers to the scene. Generally these would only be added if we wanted to handle their events (e.g. when the user presses and holds the trigger), but in this case I provided them purely for aesthetic reasons. I felt keeping them present would help the user feel less isolated. I know it's a bit silly to be concerned with that in a utility app, but it's also a very small thing to add. So why not?
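The two controller entities use A-Frame's oculus-touch-controls component and look roughly like this:

```html
<!-- Tracked controller models; oculus-touch-controls ships with A-Frame. -->
<a-entity oculus-touch-controls="hand: left"></a-entity>
<a-entity oculus-touch-controls="hand: right"></a-entity>
```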
The resulting app can be found here
Want to know more about A-Frame? Additional references and sample apps for A-Frame can be found below:
Until next time,