Tom has been working with the Oculus Rift for a long time. Some time ago he made a nice example for the Oculus Rift DK1, and now he has just released an updated ANE and Flare3D example for the new DK2.
We talked with him about Virtual Reality, the Oculus Rift and Flare3D’s integration:
What is your previous experience with Virtual Reality?
From the first time I saw the holodeck in Star Trek: The Next Generation, I’ve always wanted to pursue virtual reality.
In the ’90s, during the first VR hype, I was 12 or 13 years old and the local computer shop showcased the huge Forte VFX1 HMD for PC. It really was not that good, and I certainly did not have the funds to buy something like that, but it always stayed in the back of my head.
When Nvidia started experimenting with the first stereo drivers that worked with shutter glasses (before slow LCDs became the standard), I immediately went and bought them. Battlefield 1942 looked gorgeous in stereo 3D.
After graduating I started working as a Flash developer at an awesome multimedia company called B·U·T (www.but.be). When the first 3D engines came out in Flash, before Stage3D arrived on the scene, I created some demos with stereo views in anaglyph 3D.
I was reading the MTBS forums (http://www.mtbs3d.com/) at the time Palmer Luckey was contemplating his first Rift prototype and took it to Kickstarter. I asked my boss for a developer kit and could not wait for it to arrive. After playing around with it for a few evenings I wanted to create things myself, so I looked around to see if somebody had already created a Native Extension for AIR. Jonathan Hart had just started his ANE and a few other people were contributing code. I set out to create the distortion shader (in AGAL) and spent many, many nights until the basics finally came together.
By then I had found out about Flare3D and its FLSL language. I asked for help on the forums and just minutes later Ariel came up with a working distortion shader with chromatic aberration. I felt like all my AGAL studying had been for nothing, but I was sold on Flare3D. The support is just great!
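The distortion correction discussed here boils down to a radial warp of the rendered eye texture; the AGAL/FLSL shader evaluates roughly this math per pixel (chromatic aberration is typically handled by scaling each color channel slightly differently). A minimal Python sketch of the polynomial model, with illustrative coefficients rather than the Rift’s calibrated ones:

```python
def barrel_distort(u, v, k=(1.0, 0.22, 0.24, 0.0), center=(0.5, 0.5)):
    """Warp one texture coordinate with the radial polynomial model
    r' = r * (k0 + k1*r^2 + k2*r^4 + k3*r^6) used for lens correction.
    Coordinates are in [0, 1]; the coefficients are illustrative only."""
    dx, dy = u - center[0], v - center[1]
    r2 = dx * dx + dy * dy
    scale = k[0] + r2 * (k[1] + r2 * (k[2] + r2 * k[3]))
    return center[0] + dx * scale, center[1] + dy * scale

# The lens center stays fixed; samples further out are pushed outward,
# pre-compensating for the lens pinching the image inward.
print(barrel_distort(0.5, 0.5))   # center is unchanged
print(barrel_distort(0.9, 0.5))   # off-center point moves further out
```

In the actual shader this runs per fragment on the GPU, sampling the rendered scene texture at the warped coordinate.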
Are you working on an Oculus Rift project?
While the implementation is already really usable, there is still work to be done; latency tracking is next on the to-do list. But the most important bits are done, including Timewarp.
I’ll soon be making a demo project so we can pitch ideas to clients. Personally I would really like to make a space shooter. Sitting in a cockpit looking into the void of space is just something that is made for VR.
Jonathan Hart just completed work on integrating the Oculus into the award-winning VR-based music video for Beck’s “Sound and Vision”, produced by Stopp Family and directed by Chris Milk, which won 2 Webby Awards last year. This piece also uses Flare3D, and in a very novel way. http://www.stopp.se/chris-milk-beck-hello-again/
Beyond that, there is also work underway to support the Samsung Gear VR and to integrate Oculus’ Android SDK as a platform target for the ANE. Exciting times are ahead!
What does the Oculus Rift DK2 offer compared with the previous version?
DK1 had a 1280×800 60Hz LCD display. DK2 has a 75Hz full-HD OLED display capable of showing a much crisper image. When Adobe AIR is delivering 75fps and you turn on the low-persistence mode, the difference from the first developer kit is huge. Low persistence shows the rendered frame for only a fraction of the ~13ms it would normally be visible to the eye, so the image does not smear when you look around the way it did on DK1.
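The numbers here follow directly from the refresh rate: at 75Hz a frame stays on screen for 1000/75 ≈ 13.3ms, and low persistence lights the panel for only a small slice of that. A quick back-of-the-envelope check (the 2ms lit time is an assumption for illustration, not a published DK2 spec):

```python
refresh_hz = 75
frame_ms = 1000 / refresh_hz          # how long one frame occupies the display
persistence_ms = 2.0                  # assumed lit time in low-persistence mode
duty_cycle = persistence_ms / frame_ms

print(f"frame time: {frame_ms:.1f} ms")          # the ~13 ms figure quoted above
print(f"panel lit for {duty_cycle:.0%} of each frame")
```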
Besides the better visual qualities, it also has an external camera that tracks the user’s position, which makes a huge difference in not getting sick.
DK2’s only downside is that it has a lower FOV than the first developer kit. But it’s important to remember that all of this is still for developers. The final consumer version specs are in no way locked down, and I’m sure they’ll improve almost every aspect of the current developer kit for the first consumer version. For example, the new prototype Oculus showed last week has completely different optics, a higher refresh rate (90Hz is hinted) and, by the looks of it, a higher resolution.
Is the example ready, or does it need a few tweaks to be considered finished?
It needs tweaking: the perspective projection is calculated in Flare3D, while Oculus provides a projection matrix from the SDK. I just can’t make sense of the one they are providing; it’s something that needs looking at so the setup will automatically work for future versions.
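One likely source of the mismatch is that the Oculus SDK builds its per-eye projection from the tangents of four independent FOV half-angles (the frustum is off-center per eye), not from the single symmetric FOV a typical engine camera uses. A hedged Python sketch of such an off-axis matrix; the tangent values are made up, and the row-major layout with depth mapped to [0, 1] (a D3D-style convention) is an assumption:

```python
def eye_projection(tan_up, tan_down, tan_left, tan_right, znear, zfar):
    """Off-axis projection built from FOV half-angle tangents.
    Row-major 4x4, right-handed eye space (camera looks down -z),
    clip depth mapped to [0, 1]. Conventions are assumed, not the SDK's."""
    sx = 2.0 / (tan_left + tan_right)
    sy = 2.0 / (tan_up + tan_down)
    ox = (tan_right - tan_left) / (tan_left + tan_right)  # horizontal off-center shift
    oy = (tan_up - tan_down) / (tan_up + tan_down)        # vertical off-center shift
    return [
        [sx,  0.0, ox,  0.0],
        [0.0, sy,  oy,  0.0],
        [0.0, 0.0, zfar / (znear - zfar), zfar * znear / (znear - zfar)],
        [0.0, 0.0, -1.0, 0.0],
    ]

# Made-up, roughly Rift-like asymmetric tangents; not real DK2 values.
m = eye_projection(1.33, 1.33, 1.06, 1.09, znear=0.1, zfar=1000.0)
```

With symmetric tangents the off-center terms vanish and this collapses to an ordinary perspective matrix, which is why a symmetric engine camera can look "almost right" while still disagreeing with the SDK’s matrix.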
Adobe AIR does vsync, but in my test setup it locks to the refresh rate of the slowest monitor, which in many cases is 60Hz. That might be an area where AIR could be improved: it should check which display it is running on and adjust, so that when you go fullscreen on the Oculus it starts running at 75fps. The current workaround is to buy a decent high-refresh-rate main monitor or to set the Rift as the only display, which is highly impractical. According to Adobe there is no framerate cap on AIR for desktop, which means the 90Hz consumer version of the Oculus is already feasible: the world just needs hardware that refreshes at that frequency.
What was your experience integrating Flare3D with the Oculus Rift?
Flare’s API and the FLSL language are perfect for the job, but what is even better is the active community and a lively forum where you can go for help. The scriptable IDE is also great and allowed me to write an importer for the Oculus demo scene. I’m really looking forward to the next version. People jumping in and creating FXAA shaders that work together with the distortion shaders really gives you the energy to keep working on it.
Full example and Sources
The example, ANE, and source code are available in his GitHub repository here:
Tom Goethals is a multimedia developer currently employed at B·U·T (www.but.be).
He lives in Belgium with his awesome wife and 2 kids, and has worked on many award-winning interactive campaigns, installations and websites.
Follow him on Twitter using @Fragilem17