We opened our new GitHub repositories a few weeks ago, and little by little we have been filling them with new and helpful things. We expect to continue doing so.
The first one to join was Flare3D Labs. This repo includes all kinds of experimental code (deferred lighting, Adobe AIR Gamepad, SAO, etc.) that you can use as an inspirational starting point for your own creations.
We then added a new Flare3D Engine repo where you can find parts of the core engine, either to extend its functionality or as documentation to better understand how things work internally; for example, all material filters and their FLSL shaders can be found there.
If you want to keep up with the latest examples, please follow us on GitHub. If you like the content, please feed our ego by giving us a star in your preferred repo, and if you think you are ready to add some amazing stuff, we will be glad to receive your contribution.
Visit us on GitHub regularly; cool things will be published soon!
If you want to stay tuned for the latest news and announcements, follow us on Facebook and Twitter.
Tom has been working with the Oculus Rift for a long time. Some time ago he made a nice example for the Oculus Rift DK1, and now he has just released an updated ANE and Flare3D example for the new DK2.
We talked with him about Virtual Reality, Oculus Rift and Flare3D’s integration:
What is your previous experience with Virtual Reality?
From the first time I saw the holodeck in Star Trek: The Next Generation, I've always wanted to pursue virtual reality.
In the '90s, when I was 12 or 13 years old during the first VR hype, the local computer business showcased Forte's huge VFX-1 HMD for PC. It really was not that good, and I certainly did not have the funds to buy something like that, but it always stayed in the back of my head.
When Nvidia started experimenting with the first stereo drivers that worked with shutter glasses (before slow LCDs became the standard), I immediately went and bought them. Battlefield 1942 looked gorgeous in stereo 3D.
After graduating I started working as a Flash developer at an awesome multimedia company called BUT (www.but.be). When the first 3D engines came out in Flash, before Stage3D came onto the scene, I created some demos with stereo views in anaglyph 3D.
I was reading the MTBS forums (http://www.mtbs3d.com/) at the time Palmer Luckey was contemplating his first Rift prototype and took it to Kickstarter. I asked my boss for a developer kit and could not wait for it to arrive. After playing around with it for a few evenings I wanted to create things myself, so I looked around to see if somebody had already created a Native Extension for AIR. Jonathan Hart had just started his ANE, and a few other people contributed code. I set out to create the distortion shader (in AGAL) and spent many, many nights until the basics finally came together.
By then I had found out about Flare3D and its FLSL language. I asked for help on the forums, and just minutes later Ariel came up with a working distortion shader with chromatic aberration. I felt like all my AGAL studying was for nothing, but I was sold on Flare3D. The support is just great!
Are you working on an Oculus Rift project?
While the implementation is already really usable, there is still work to be done; latency tracking is next on the to-do list. But the most important bits are done, including Timewarp.
I'll soon be making a demo project so we can pitch ideas to clients. Personally, I would really like to make a space shooter: sitting in a cockpit looking into the void of space is just something that is made for VR.
Jonathan Hart just completed work on integrating the Oculus into the award-winning, VR-based music video for Beck's “Sound and Vision”, produced by Stopp Family and directed by Chris Milk, which won two Webby Awards last year. That piece also uses Flare3D, and in a very novel way. http://www.stopp.se/chris-milk-beck-hello-again/
Beyond that, work is also already underway to support the Samsung Gear VR and to integrate Oculus' Android SDK as a platform target for the ANE. Exciting times are ahead!
What does the Oculus Rift DK2 offer compared with the previous version?
DK1 had a 1280×800, 60 Hz LCD display. DK2 has a 75 Hz full-HD OLED display that is capable of showing a much crisper image. When Adobe AIR is delivering 75 fps and you turn on low-latency mode, the difference from the first developer kit is huge. Low-latency mode shows the rendered frame only for a fraction of the roughly 13 ms (1000 ms / 75) it would normally be visible to the eye. The result is that the image does not smear like it did on DK1 when you look around.
Besides the better visual quality, it also has an external camera that tracks the user's position, which makes a huge difference in not getting sick.
DK2's only downside is that it has a lower FOV than the first developer kit. But it's important to remember that all of this is still for developers. The final consumer version's specs are in no way locked down, and I'm sure they'll improve almost every aspect of the current developer kit for the first consumer version. For example, the new prototype Oculus showed last week has completely different optics, a higher refresh rate (90 Hz is hinted), and it looks like it has a higher resolution.
Is the example ready, or does it need a few more tweaks before it can be considered finished?
It needs tweaking. The perspective projection is calculated in Flare3D, while Oculus provides a projection matrix from the SDK. I just can't make sense of the one they are providing; it's something that needs looking at so the setup will automatically work with future versions.
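For readers curious what such an SDK-supplied matrix encodes, here is a minimal, illustrative sketch (not the Oculus SDK's actual code; every name here is ours) of building an OpenGL-style off-axis perspective projection from per-eye FOV half-angle tangents, which is the kind of input HMD SDKs typically describe a frustum with:

```python
def fov_projection(up_tan, down_tan, left_tan, right_tan, z_near, z_far):
    """Build a 4x4 row-major off-axis perspective projection from FOV tangents.
    Camera looks down -Z; clip-space z maps to [-1, 1] (OpenGL convention)."""
    # Frustum bounds on the near plane, derived from the tangents.
    left   = -left_tan  * z_near
    right  =  right_tan * z_near
    bottom = -down_tan  * z_near
    top    =  up_tan    * z_near
    return [
        [2 * z_near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * z_near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(z_far + z_near) / (z_far - z_near),
                -2 * z_far * z_near / (z_far - z_near)],
        [0, 0, -1, 0],
    ]

def project(m, x, y, z):
    """Apply the matrix to a point and perform the perspective divide (returns NDC)."""
    cx = m[0][0] * x + m[0][2] * z
    cy = m[1][1] * y + m[1][2] * z
    cz = m[2][2] * z + m[2][3]
    cw = m[3][2] * z
    return cx / cw, cy / cw, cz / cw
```

With symmetric 90-degree half-angles (all tangents 1.0), a point on the near plane lands at NDC z = -1 and a point on the frustum's right edge lands at NDC x = 1, which is a quick sanity check when comparing against whatever matrix the SDK hands back.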
Adobe AIR does vsync, but in my test setup it locks to the refresh rate of the slowest monitor, which in many cases is 60 Hz. That might be an area where AIR could be improved: it should check which display it is running on and adjust accordingly, so that when you go fullscreen on the Oculus it starts running at 75 fps. The current workaround is to buy a decent high-refresh-rate main monitor or to set the Rift as the only display, which is highly impractical. According to Adobe, there is no framerate cap on AIR desktop, which means the 90 Hz consumer version of the Oculus is already feasible: the world just needs hardware that refreshes at that frequency.
What was your experience integrating Flare3D with the Oculus Rift?
Flare's API and the FLSL language are perfect for the job, but what is even better is the active community and the lively forum where you can go for help. The scriptable IDE is also great and allowed me to write an importer for the Oculus demo scene. I'm really looking forward to the next version. People jumping in and creating FXAA shaders that work together with the distortion shaders really gives you the energy to keep working on it.
Full example and Sources
The example, ANE, and source code are available on his GitHub repository here:
Tom Goethals is a multimedia developer currently employed at B·U·T (www.but.be).
He lives in Belgium with his awesome wife and two kids, and has worked on many award-winning interactive campaigns, installations, and websites.
Follow him on Twitter at @Fragilem17.
Walk through our deferred lighting demo using Adobe AIR Gamepad!
One of the latest features released with AIR is Adobe AIR Gamepad. Basically, if you have Adobe AIR installed on your Android phone or tablet, you can use the device as a joystick in your game. The integration is really easy (check out Adobe's tutorial here) and opens up a lot of possibilities: you can add joystick functionality to your game, or use the device as a second screen for interactive applications.
To play the demo, follow these steps:
Important: Flash Player 14 (or higher) is required
- Open the demo here
- Launch the Adobe AIR application on your phone or tablet.
- Shake your device to enable AIR Gamepad.
- Type the code that appears on your device into the demo and press “connect”.
- Once connected, your device will vibrate and its screen will show the touch areas for walking and looking around in the demo.
This demo uses our “deferred lighting” example. You will find the demo's source code at Flare3D Labs, in a new folder called “gamepad” that has been added to src.
The VirtualPad class receives a reference to the gamepad and handles its input the same way it handles touch input. The class was originally made to detect traditional touch events, but with a few changes it was adapted to work with AIR Gamepad. For complementary information, check out the AIR Gamepad docs here.
Interpreting the input
AIR Gamepad recognizes multi-touch and gesture input and sends it to your application, so you can handle the input as if your app were running on the mobile device. Once the connection between the app and the device has been established, you can skin the device's screen by sending it an image. In our demo, we send an image that marks the touch areas on the screen.
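Conceptually, turning those raw touch points into "walk" and "look" controls is just a matter of mapping screen regions to virtual thumbsticks. The following is a language-neutral sketch in Python (the half-screen split and the pad radius are our illustrative assumptions, not the demo's actual layout):

```python
def classify_touch(x, screen_width):
    """Assign a touch to a control area: left half walks, right half looks
    (the split point is an assumption for illustration)."""
    return "walk" if x < screen_width / 2.0 else "look"

def virtual_stick(x, y, center_x, center_y, radius):
    """Convert a touch inside a pad area into thumbstick-style axes in [-1, 1],
    clamped to the unit circle so distant touches saturate instead of overshooting."""
    dx = (x - center_x) / radius
    dy = (y - center_y) / radius
    mag = (dx * dx + dy * dy) ** 0.5
    if mag > 1.0:
        dx, dy = dx / mag, dy / mag
    return dx, dy
```

In the actual demo this logic lives in ActionScript inside VirtualPad, driven by the touch events AIR Gamepad forwards; the sketch only shows the region-to-axes idea.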
It’s time to play!
The contest has come to an end, and we can say it was a great experience for us. We'll certainly repeat it in the future!
Although the contest's goal seemed easy to achieve, the reality was different: only three participants managed to finish their versions of BREAKOUT. It just wasn't right to choose two winners among three participants, so (only this time) we decided to declare all three finalists winners!
Now it’s time to meet the winners:
Tim did a great job! Basically, he took the example and turned it into a full-grown game!
Respecting the traditional gameplay, his “Breakout 3D” includes 8 levels, power-ups (some of them really funny), keyboard and mouse support, and a mobile version for Android!
Gabriel presents “Wonder Runner 3D”, and we are still trying to figure out what kind of game it is! It is 100% surreal: you are a man running across an oneiric world searching for a golden ball… Yes, it's a crazy idea, but it has potential!
Our dear Wandah is a usual suspect in the Flare3D world 😉, with proven experience using Flare3D. Wandah spent 10 hours converting our Breakout example into a mix between a football game and a tower defense! The game recreates the final match of the latest World Cup, Argentina versus Germany; it includes cute graphics, soccer sounds, splash screens, and menus, and it runs perfectly!
Now it's time to think about the future! Keep in touch; we are cooking up something special for the next mini Challenge!