How to Sharpen Oculus Rift Visuals with Supersampling – Road to VR

As part of the Oculus SDK (Software Development Kit), Oculus ships a handy debug tool which lets developers surface extra information to troubleshoot performance issues or glitches, and control runtime rendering elements such as compositor layers. One of its options also allows you to increase a running game’s pixel density, that is, the resolution the game is rendered at before being down-sampled and sent to the Oculus Rift display.

See Also: ‘Edge of Nowhere’ Review

The tool allows you to increase said pixel density in fractional increments, with a setting of ‘2’ quadrupling the pixels your VR gaming PC has to render. This is effectively a form of super-sampling, a particularly expensive form of anti-aliasing that also happens to produce excellent image quality. For games that offer minimal graphical options, it can be well worth doing, assuming you have the GPU grunt to push the extra pixels and maintain that hallowed 90 FPS.
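To make the relationship concrete, here is a minimal sketch, assuming the Oculus PC SDK (LibOVR 1.x) C API, of how a developer might query the recommended per-eye render target size at a given pixel density – the same knob the debug tool’s override turns. The density values and printout below are purely illustrative.

```cpp
// Sketch only: queries the recommended eye buffer size at two pixel densities
// using the LibOVR 1.x C API (OVR_CAPI.h). Error handling kept minimal.
#include <cstdio>
#include <OVR_CAPI.h>

int main()
{
    if (OVR_FAILURE(ovr_Initialize(nullptr)))
        return 1;

    ovrSession session;
    ovrGraphicsLuid luid;
    if (OVR_FAILURE(ovr_Create(&session, &luid)))
        return 1;

    ovrHmdDesc hmd = ovr_GetHmdDesc(session);

    // Recommended left-eye buffer at native (1.0) and doubled (2.0) pixel density.
    ovrSizei base  = ovr_GetFovTextureSize(session, ovrEye_Left,
                                           hmd.DefaultEyeFov[ovrEye_Left], 1.0f);
    ovrSizei dense = ovr_GetFovTextureSize(session, ovrEye_Left,
                                           hmd.DefaultEyeFov[ovrEye_Left], 2.0f);

    // Each axis scales linearly with density, so pixel count scales with its
    // square: a density of 2 means roughly 4x the pixels to render per eye.
    std::printf("density 1.0: %d x %d (%d px)\n", base.w,  base.h,  base.w  * base.h);
    std::printf("density 2.0: %d x %d (%d px)\n", dense.w, dense.h, dense.w * dense.h);

    ovr_Destroy(session);
    ovr_Shutdown();
    return 0;
}
```

Because each axis scales linearly with the density value, the total pixel count scales with its square – which is why a setting of 2 quadruples the rendering work.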

For the purposes of gauging image quality in this very quick walk-through, we used the recently released Edge of Nowhere from Insomniac Games, a very pretty title which, sadly, includes zero options to tweak in-game pixel density. The difference the lowest and highest settings make to this game is pretty impressive, really letting the production design detail shine through.

How to Run the Oculus Debug Tool to Improve Image Quality

It’s important to follow the above steps precisely on your initial run, as doing them in a different order can cause severe stuttering, which you might put down to performance issues but is in fact just ‘how it works’. If you do see this, retry the above procedure and you may be surprised at the results.

We tested Edge of Nowhere with our Road to VR Exemplar gaming PC, which packs an NVIDIA GeForce GTX 980 Ti, and managed to push a pixel density of 1.8 before frame rate became an issue and it was obvious Asynchronous Timewarp was working beyond its limits (jerky animation and skipped frames). The resulting difference in image quality is obvious, however, with subtle texturing and shader effects revealed and far fewer distracting image artifacts on distant objects – it looked closer to how you imagine the artists envisaged it.

As far as we can tell, the debug tool overrides any in-game pixel density settings, so titles like Chronos and EVE: Valkyrie, which allow you to tweak this, should simply adopt the Oculus Debug Tool setting. However, we haven’t tested this extensively, so your mileage may vary, and that goes for title compatibility too. Give it a shot and let us know how you get on in the comments below.

We partnered with AVA Direct to create the Exemplar Ultimate, our high-end VR hardware reference point against which we perform our tests and reviews. Exemplar is designed to push virtual reality experiences above and beyond what’s possible with systems built to lesser recommended VR specifications.

In our game “Kismet” the primary difference between each quality setting is resolution, and we used this same function to increase fidelity. The Vive uses a similar function called r.ScreenPercentage. The difference between 100% and 150% screen percentage is astounding. In fact, Oculus is set to 150% in Unreal by default, as 100% looks like garbage. Even higher resolutions look great, but beyond 175-200% there is no appreciable difference. It’s very tricky.

As a developer, we aren’t targeting 2160×1200 at 90fps. We are targeting 3240×1800 at 90fps, minimum. Our hidden “Uber” quality setting, which requires at least a 980 Ti, renders at 4320×2400 at 90fps – nearly a billion pixels per second! On a 4K monitor our game will run at 120fps with a 980 Ti. Imagine any other game running at those resolutions and speeds. This is why it’s so difficult to get high quality visuals. I spent weeks getting Kismet to run on minimum spec machines without sacrificing visual quality. Thankfully the Unreal devs at Epic are taking VR seriously and new render optimizations are frequent. Those guys are the best!
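For readers curious how this looks from the engine side, here is a minimal sketch, assuming Unreal Engine 4 and its standard console variable API, of raising r.ScreenPercentage from game code. The function name is our own, and the 150 value simply mirrors the default the commenter mentions; tune it to your GPU.

```cpp
// Sketch only: sets the r.ScreenPercentage console variable from C++ in UE4.
// 100 = native eye-buffer resolution; 150 renders 1.5x per axis (2.25x the
// pixels) and downsamples, i.e. supersampling.
#include "HAL/IConsoleManager.h"

void SetVRScreenPercentage(float Percent /* e.g. 150.0f */)
{
    // Look up the console variable the renderer reads each frame.
    if (IConsoleVariable* ScreenPercentage =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        ScreenPercentage->Set(Percent);
    }
}
```

Typing the equivalent command in the in-game console (r.ScreenPercentage 150) achieves the same thing during testing.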

There are two steps missing from the above article. With Oculus Home open, if you run the debug tool and set the pixel ratio to 2 (or whatever your PC can handle), your head tracking will go all wacky. You have to close the debug tool after setting the ratio – this fixes the tracking in Oculus Home while keeping your set pixel ratio – then run the debug tool again and set the ratio once more. Next, you run the game, and it will use the new pixel ratio. All of this will tax your GPU very heavily, so unless you have at least a 980, or better yet a 1070, you’d better wait until you can get a stronger GPU.

The real visible difference in Oculus Home is that the image becomes much clearer than without this trick. You will not see much higher resolution, since you are still limited by the resolution of the displays – there will still be a “net” of pixels in all the games and in Oculus Home itself.

In addition to the above trick, if you have at least a GTX 1070, you can also set a custom resolution in your GPU settings, something like 1920×1440, and then run the game at that resolution. I tried all of this with Mass Effect 3. Picture quality is amazing, but I still see the “pixel net” because of the displays themselves. No tricks will get rid of that until Oculus releases a CV2 or CV3 with higher-resolution displays.