Just a short snippet of some further experiments I’m doing with interactive 3D motion tracking. This is a continuation of this experiment.
I shot some footage of me holding an iPhone, tracked the screen and converted the tracking data. I then added interactivity to the buttons (in Adobe Edge Animate) so that you can press the actual buttons on the video and the numbers you press appear on the phone's screen (all tracked in 3D space) – there's a rough sketch of the idea below.
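For anyone curious how the overlay side of that might look, here's a minimal sketch. The element IDs, class names and beep asset are made up for illustration, and the real project goes through Edge Animate's own API rather than plain DOM calls:

```ts
// Hypothetical sketch: each "key" is a transparent div positioned over the
// video by the converted 3D tracking data; tapping it appends its digit to
// the on-screen "phone display" element and plays a beep.
const display = document.getElementById("phone-display")!;
const beep = new Audio("beep.mp3"); // placeholder sound asset

document.querySelectorAll<HTMLElement>(".phone-key").forEach((key) => {
  key.addEventListener("click", () => {
    const digit = key.dataset.digit ?? "";   // e.g. <div class="phone-key" data-digit="7">
    display.textContent += digit;            // show the pressed number on the tracked phone
    beep.currentTime = 0;
    beep.play();
  });
});
```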
You’ll notice that button presses (and the beep sound) are a bit laggy – I haven’t quite worked out how to get this properly synced yet. If anyone knows, feel free to leave a comment – I’d love to know!
And in true reality-feedback style, this does actually run on an iPhone 4 (remember them?), albeit a bit stuttery.
Update: The button/audio lag is now fixed – I realised I had made the schoolboy error of using the ‘mousedown’ event for the buttons instead of ‘touchstart’. I had also missed off the viewport meta tag, which can avoid the 300ms tap delay in some browsers – thanks to @VirgilRocks for the nudge!
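In case it helps anyone hitting the same problem, here's roughly what the fix looks like (again, the class names and beep asset are placeholders, not the actual project code):

```ts
// Bind 'touchstart' so the press registers immediately on touch screens,
// keeping 'mousedown' as a fallback for desktop browsers.
const beep = new Audio("beep.mp3"); // placeholder sound asset

document.querySelectorAll<HTMLElement>(".phone-key").forEach((key) => {
  const press = (e: Event) => {
    e.preventDefault();              // suppress the delayed synthetic mouse events on touch devices
    beep.currentTime = 0;
    beep.play();
  };
  key.addEventListener("touchstart", press, { passive: false });
  key.addEventListener("mousedown", press); // fallback for non-touch devices
});
```

The viewport meta tag I'd missed is just `<meta name="viewport" content="width=device-width, initial-scale=1">` in the page head – with that in place some mobile browsers drop the 300ms tap delay entirely.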