I came across Gerb over at the Papervision3D mailing list, quickly became interested in his project (http://vimeo.com/1738859), and decided to do a little special about it.
Say hello to Gerb, a developer from the Netherlands who decided to recreate Johnny Chung Lee’s DesktopVR, a real challenge with the added difficulty of this being his first time working with AS3 and Papervision3D!
First, some background on Gerb:
“During my studies I had some experience building haptic interfaces and interactive installations, and this always interested me. But in the past you had to build such applications in programming languages like Lingo (Director) or C#. Now, with AS3, it is possible to use external hardware in combination with Flash. This opens up a wide variety of possibilities, since Flash is relatively easy to use and very versatile in terms of animation and user interfaces. And with the introduction of hardware rendering in Flash 10, things get really interesting.”
From there it was an easy decision for him to get started. Here’s the story in Gerb’s own words:
“I chose the subject, VR and Flash, for this project after seeing the video of Johnny Chung Lee’s DesktopVR demo. I thought it was really cool and the technology behind it was relatively simple, so I decided to give it a shot and rebuild the project in Flash. All the elements were there: the WiiFlash team had just released their API for using the WiiMote in Flash, and with the help of PV3D for the 3D modeling I started the project. Although the DesktopVR program was open source and written in C#, which has the same kind of syntax as AS3, porting the 3D algorithms was rather difficult. Since this was my first experience with AS3 and PV3D I had some catching up to do, but thanks to Tim Koppers of strafwerk.nu (a Breda-based media company), who guided me through some aspects of AS3 and PV3D, things turned out OK. I actually tried several 3D engines (like Alternativa3D and others), but I settled on PV3D because it gave me the best control over all the parameters I needed for this project (like camera position, angles, zooming, etc.).”
“The whole project was built using a WiiMote, a pair of glasses mounted with IR lights, the WiiFlash API, PV3D, and a large TV screen (thx Tim ;). The WiiMote contains an IR camera, and with some simple mathematics you can calculate the position of a person wearing the IR-mounted glasses relative to the WiiMote/screen. You can then use this information to transform the 3D landscape’s perspective to match the person’s viewing perspective, creating the illusion of depth. This updates as the person moves, so you can look around objects by moving your head.”
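For the curious, here is a minimal sketch of the “simple mathematics” Gerb mentions: estimating the head position from the two IR dots the WiiMote’s camera reports. It is written in Python for readability (the demo itself is AS3), and all constants here (camera resolution, field of view, LED spacing) are illustrative assumptions, not values from Gerb’s or Johnny Chung Lee’s code.

```python
import math

# Assumed constants for illustration; a real setup would calibrate these.
IR_CAM_WIDTH = 1024               # WiiMote IR camera horizontal resolution (pixels)
IR_CAM_HEIGHT = 768               # vertical resolution (pixels)
FOV_RADIANS = math.radians(45)    # approximate horizontal field of view
RADIANS_PER_PIXEL = FOV_RADIANS / IR_CAM_WIDTH
DOT_SPACING_MM = 215              # distance between the two IR LEDs on the glasses

def head_position(p1, p2):
    """Estimate the head position (x, y, z) in mm relative to the WiiMote,
    given the two IR dot coordinates (in pixels) reported by its camera."""
    # Apparent separation of the two dots on the sensor, in pixels.
    pixel_sep = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    # The known LED spacing subtends this angle, which yields the distance.
    angle = RADIANS_PER_PIXEL * pixel_sep
    z = (DOT_SPACING_MM / 2.0) / math.tan(angle / 2.0)
    # Midpoint of the dots, measured from the camera's optical centre.
    mx = (p1[0] + p2[0]) / 2.0 - IR_CAM_WIDTH / 2.0
    my = (p1[1] + p2[1]) / 2.0 - IR_CAM_HEIGHT / 2.0
    # Convert the angular offset of the midpoint into lateral offsets.
    x = math.sin(RADIANS_PER_PIXEL * mx) * z
    y = math.sin(RADIANS_PER_PIXEL * my) * z
    return x, y, z
```

Feeding the resulting (x, y, z) into the 3D engine’s camera position each frame (as Gerb does with PV3D) is what makes the scene’s perspective track the viewer’s head.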
And when asked what’s next, Gerb let us know that the next step is to create a game based on this technology with innovative gameplay. We can’t wait for it!
You can check out the demo here: http://vimeo.com/1738859