Rare Sighting: A Gesture-Driven Interactive Display With A Point!
August 10, 2020 by Dave Haynes
I am not much of a fan of body- and gesture-tracking applications that ask people to stand in front of a screen, wave their arms, and watch those movements mimicked on the display they're viewing.
It tends to be eye-candy that has little evident impact or point.
But via LinkedIn I stumbled across an IBM-funded interactive application done last summer that uses this tech to maximum effect – because of the content tie-in and the scale of the “activation.”
Last July, IBM funded a 50th anniversary exhibition of the Apollo 11 lunar landing.
The Brooklyn creative/solutions shop VolvoxLabs worked with the agency Ogilvy USA to create a two-day pop-up event in the vast central area of the Oculus, the retail and pedestrian concourse beside the rebuilt World Trade Center site in New York.
Visitors were able to step onto the IBM Moon Walk experience and see themselves as astronauts out for a moonwalk. Cameras, body-tracking and gesture software, and a whole lot of other software enabled people to gesture in front of a big direct-view LED display, and see themselves in real-time doing the same on the screen, but dressed as astronauts.
After the walk on the moon, visitors received a souvenir digital postcard to share online.
VolvoxLabs integrated wrnchAI machine learning for real-time skeletal tracking, TouchDesigner for processing the data, and Unreal Engine to create real-time graphics. The walking surface was also made from direct-view LED tiles, so steps on the virtual lunar surface left footprints.
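For the curious, the basic shape of that pipeline – camera frames feeding a pose estimator, joint data getting smoothed and retargeted, and the result driving a rendered avatar on the LED wall – can be sketched out in a few lines. This is a minimal illustration only; every function name below is a hypothetical stand-in, not the actual wrnchAI, TouchDesigner, or Unreal API:

```python
# Rough sketch of a gesture-driven avatar pipeline, with stubbed stages:
# pose estimation (wrnchAI's role), smoothing (TouchDesigner's role),
# and mapping joints to the display (Unreal Engine's role).
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    x: float  # normalized coordinates, 0..1
    y: float

def estimate_skeleton(frame):
    """Stand-in for a 2D skeletal tracker; a real one returns ~25 joints
    per detected person from each camera frame."""
    return [Joint("head", 0.5, 0.1), Joint("hand_r", 0.7, 0.4)]

def smooth(prev, curr, alpha=0.5):
    """Exponential smoothing, the kind of filtering typically applied
    to jittery joint data before it drives an avatar."""
    if prev is None:
        return curr
    return [Joint(c.name,
                  alpha * c.x + (1 - alpha) * p.x,
                  alpha * c.y + (1 - alpha) * p.y)
            for p, c in zip(prev, curr)]

def to_screen(joints, width, height):
    """Map normalized joints to pixel coordinates on the LED wall,
    where the render engine would pose the astronaut avatar."""
    return {j.name: (round(j.x * width), round(j.y * height)) for j in joints}

prev = None
for frame in range(3):          # stand-in for a live camera feed
    skeleton = estimate_skeleton(frame)
    prev = smooth(prev, skeleton)
    pose = to_screen(prev, 1920, 1080)
print(pose["head"])             # head joint mapped onto a 1920x1080 wall
```

The real install layers a lot more on top of this – multi-person tracking, 3D retargeting onto an astronaut rig, and the footprint effect on the LED floor – but the loop above is the core idea.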
This was very well done, and I am a bit surprised I am just learning about it now. I am also a bit surprised that it ran for all of two days in NYC.
The difference between this, and what I have seen demo’d at trade shows, and here and there in real world installs, is the direct tie-in to familiar, iconic content. It’s the difference between something that’s briefly interesting, and something that’s memorable.