Finding dead Oculus Quest pixels in less than 100 lines of code

Published on May 25, 2019

This past Tuesday the Oculus Quest standalone VR headset was released. As of today, it remains sold out at most of the retail locations carrying it across the United States, as well as at Oculus.com. It appears to be a success for Facebook, or at least we can say it is off to a good start. However, shortly after launch, reports began popping up on Reddit about dead pixels. The number of people reporting the problem was small, but it left me wishing for a way to test my new headset. The built-in apps and features don't offer a way to perform a thorough dead pixel check, so I decided to make one. As it turns out, the app seems to have been useful to many people and was even featured in a video by popular VR YouTuber RaMarcus (link below).

 

See: https://www.youtube.com/watch?v=e4GnHq6IQZQ&t=83s


I'm glad this small investment of my time has turned out to help people. In this blog post, I share how I built this simple app and give a little history on the tools that made it possible to build in about half an hour.

 

The process of testing dead pixels 

Unfortunately, dead pixels are a common problem with modern displays, and VR screens are not exempt. But finding the problem pixels can be difficult. An effective method of testing is to display solid colors in sequence, allowing the user to scan the display surface for the contrasting little culprits.


A VR solution

The ideal solution for testing dead pixels in VR is to have the solid color sequence encompass the full tracked 360-degree view area. To accomplish this I could have spent a small amount of time in one of the common game engines such as Unreal or Unity3D. But using those tools would produce a binary that requires installation, and thus distribution through the official Oculus storefront. As I discussed in my previous blog post, that isn't possible on the Oculus Quest without going through a rather cumbersome proposal submission process, which is overkill for such a simple utility. I also felt pretty strongly that this should be something I could write once and run on every headset. As it turns out, there is a standard perfect for this task: WebVR.

WebVR is an open standard created at Mozilla and first released in 2016 (you can read more about its history here). It is an API built on top of the existing WebGL standard, which browsers already ship as part of the modern HTML5 web platform, so it offers a straight path for browser vendors to implement. I mention this only for background; to solve the problem at hand, my primary concern wasn't the history but the degree of WebVR compatibility on the Oculus Quest. Is the Quest's built-in web browser WebVR compliant? It turns out the answer is yes!
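As a quick sanity check, the WebVR 1.1 API exposes navigator.getVRDisplays(), and a page can probe for it before offering a VR mode. Here's a minimal sketch (the supportsWebVR helper name is my own, not part of any spec):

```javascript
// Minimal WebVR feature probe. The WebVR 1.1 API exposes
// navigator.getVRDisplays(); if it's missing, the browser can't enter VR.
// (supportsWebVR is my own helper name, not part of any spec.)
function supportsWebVR(nav) {
  return !!(nav && typeof nav.getVRDisplays === "function");
}

// In a real page you would call supportsWebVR(window.navigator) and,
// when true, enumerate the attached headsets:
//   navigator.getVRDisplays().then(function (displays) {
//     console.log(displays);
//   });
```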

 


Building for WebVR

There are a few paths a developer can take to build for WebVR. The higher up the chain of abstraction we go, the less code we usually have to write, and with each higher level we give up a bit of under-the-hood control in exchange for convenience. With that in mind, I decided to use the A-Frame JavaScript library. A-Frame was designed to provide a simple programming interface to WebVR, and it's built on top of another open source library, three.js. I'm taking the time to discuss this stack of technology because I think it's important to recognize the value open source has brought us. The hard work of some truly brilliant software engineers, such as Vladimir Vukićević, has made it possible for me to deliver this simple tool with very little effort.

With A-Frame we get a rich WebVR JavaScript library. It's easy to add the needed script includes to a new page and begin developing VR experiences using HTML, delivered over HTTP as static content hostable on any web server.

 

A-Frame overview

A basic A-Frame page looks like this:

<html>
  <head>
    <script src="https://aframe.io/releases/0.9.1/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>

 

The <a-scene> element is roughly analogous to a movie scene. It defines a 3D space in which you can place objects, monitor events on those objects, animate them, and so on. Primitives can then be declared as child nodes of the <a-scene> node. In the sample above, we have a primitive box, sphere, cylinder, plane, and sky. Position and rotation attributes exist for all primitives; each contains a string of three values (integer or float) corresponding to x, y, and z. For example, <a-box> has position x=-1, y=0.5, z=-3 and rotation x=0, y=45 (degrees), z=0, written in attribute form as position="-1 0.5 -3" and rotation="0 45 0". (If 3D coordinate systems are new to you, you might want to have a look here.) In addition to position and rotation we also find some more familiar HTML attributes: width and height let us define scale, and color uses the traditional HTML hexadecimal system for R, G, B values 0-255.
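To make the mapping concrete, here is how such an "x y z" attribute string unpacks into numeric coordinates. A-Frame does this parsing internally; parsePosition below is just a throwaway helper of my own for illustration:

```javascript
// Parse an A-Frame-style "x y z" attribute string into numbers.
// (A-Frame handles this internally; this helper is only an illustration.)
function parsePosition(str) {
  const parts = str.trim().split(/\s+/).map(Number);
  return { x: parts[0], y: parts[1], z: parts[2] };
}

// The <a-box> from the sample above:
parsePosition("-1 0.5 -3"); // → { x: -1, y: 0.5, z: -3 }
```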

When we view the page in a WebVR compliant browser such as Mozilla Firefox or Chrome, we see the shapes in 3D space with a VR icon in the bottom right corner. If a recognized VR headset is present, clicking the VR icon transitions the scene into a virtual reality environment. The limitations of the headset still apply, but with a fully tracked 6DoF headset such as the Quest, the user can walk around their physical environment and their movement will be tracked and mirrored in the 3D virtual environment they are experiencing.

 

Considering the needs of the app I wanted to build, I used this sample as a starting point. I created a simple intro scene consisting of a primitive (I ended up choosing a cube) with a texture showing some basic instructions on how the app functions, which I drew in a paint program. For the backdrop of this scene I wanted a 360 image, so I used one of the 360 park photographs I had captured while creating our VR Cardio app for the Oculus Go.

To apply the texture to the cube I used the src attribute of <a-box>; to apply my 360 photo to the surrounding area I used the src attribute of <a-sky>. Intro scene done!
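In A-Frame markup that amounts to two src attributes (the file names and ids here are placeholders of my own, not the app's actual assets):

```html
<a-scene>
  <!-- instruction texture mapped onto the cube (placeholder file name) -->
  <a-box id="instructions" src="instructions.png" position="0 1.5 -2"></a-box>
  <!-- 360 park photo as the backdrop (placeholder file name) -->
  <a-sky id="sky" src="park-360.jpg"></a-sky>
</a-scene>
```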

 


Now to build the rest. To get the desired effect, I needed a way to cycle through solid colors and have them fill the user's view area. Using the <a-sky> color attribute made this simple. To have the colors cycle on a timer I would need some event handling. Fortunately, since A-Frame is a JavaScript library, it was easy to use JavaScript's built-in setTimeout(...) function to run code after a certain number of milliseconds has passed.

To track my location in the flow I used a numeric variable which I labeled phase, then implemented a switch statement inside a timerTicked() method called at the end of each setTimeout(...). The switch statement is set up to first remove the box primitive with the instructions, then clear the src attribute of the sky to remove the 360 park image. With the instruction box and 360 photo gone, the user is left with only the <a-sky> object in the scene. So to iterate through a series of solid colors, all I had to do was set the <a-sky> color values, provide a reasonable timeout for each transition, and keep looping the sequence, restarting at step 1 instead of step -1 (since we only need to transition away from the intro scene on the first iteration).
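A sketch of that loop looks something like the following. The color list, interval, and element lookups are my own assumptions for illustration, not the app's exact code:

```javascript
// Solid colors to cycle through; any set will do for a dead pixel test.
const COLORS = ["#FF0000", "#00FF00", "#0000FF", "#FFFFFF", "#000000"];
const INTERVAL_MS = 3000; // dwell time per color (my own choice)

// Pure helper: advance the phase, wrapping back to 0 (not -1) so the
// intro-scene teardown only ever runs once.
function advancePhase(phase, numColors) {
  return (phase + 1) % numColors;
}

let phase = -1; // -1 = intro scene is still showing

function timerTicked(sky, instructionBox) {
  switch (phase) {
    case -1:
      // First tick: remove the instruction box and clear the 360 photo,
      // leaving only <a-sky> in the scene.
      instructionBox.parentNode.removeChild(instructionBox);
      sky.removeAttribute("src");
      break;
    default:
      sky.setAttribute("color", COLORS[phase]);
  }
  phase = advancePhase(phase, COLORS.length);
  setTimeout(function () { timerTicked(sky, instructionBox); }, INTERVAL_MS);
}
```

In the real page this would be kicked off once, e.g. with timerTicked(document.querySelector("a-sky"), document.querySelector("a-box")).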

 

 

As a final touch, I decided to include <a-entity> objects for the left and right Oculus Touch controllers. Including these two nodes adds tracked 3D models of the user's controllers to the scene. Generally these would only be added if we wanted to handle their associated events (e.g. when the user presses and holds the trigger), but in this case I included them for aesthetic reasons. I felt keeping them visible would help the user feel less isolated. I know it's a bit silly to be concerned with that in a utility app, but it's also a very small thing to add. So why not?
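Adding the tracked controller models comes down to two lines of markup using A-Frame's oculus-touch-controls component:

```html
<a-entity oculus-touch-controls="hand: left"></a-entity>
<a-entity oculus-touch-controls="hand: right"></a-entity>
```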

The resulting app can be found here:
http://www.immersha.com/webvr/vr-pixel-test.html


Want to know more about A-Frame? Additional references and sample apps for A-Frame can be found below:

A-Frame Samples
A-Frame Documentation

 

Until next time,

Jeremy

 

Written by jdeats76


James Fox 06/12/2019 13:56

Is it possible to get a version of this where the user presses a button the get the color to change. I am trying to take a picture of a dead red sub pixel and the timer is not quite long enough.

Mob 12/24/2019 17:52

Just use a Youtube Dead Pixel test on VR Mode. Pause it and make your photo.
Have the same problem with a green subpixel.
Seems to be a common problem on quest.
So I think I dont return it and not to get a new quest with a dead pixel more in my field of view.