I recently wanted to explore the notion of a deforming mirror using software.
We all know Apple’s famous Photo Booth app and its different effects. But here I wanted to work on a modification that fragments the image rather than distorting it.
Thing is, I wanted it to be real-time, so I started building an application with openFrameworks and made a video of the result:
Music: Nicolas Jaar - Wouh
- Sorry about the poor framerate: the app itself isn’t laggy at all (it runs at 60 fps); the stutter comes from screencasting at such a high resolution -
After playing with this app, I thought about how to turn it into an installation that people could experiment with, without any mouse, in a more immersive way.
I finally came up with the idea of using the person’s distance as an input:
I like the idea of losing precision when we are precisely looking for it.
The closer you come, the less you’ll be able to see the big picture. But it may show you some details you would never have seen otherwise.
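This inverse relationship between distance and detail could be sketched as a simple mapping from the viewer’s distance to a fragment count: the closer you are, the more fragments, so the mirror shatters into detail just as you try to get close. All the ranges below are made-up placeholders, not values from the actual app:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical mapping: a near viewer gets dense fragmentation (detail,
// but no big picture); a far viewer gets coarse fragmentation.
// Every constant here is an illustrative guess.
int fragmentsForDistance(float distanceCm) {
    const float nearCm = 50.0f;    // closest expected viewer
    const float farCm  = 300.0f;   // farthest expected viewer
    const int maxFragments = 400;  // densest fragmentation (up close)
    const int minFragments = 16;   // coarsest fragmentation (far away)

    // Clamp the distance to the expected range, then map linearly:
    // near distance -> many fragments, far distance -> few fragments.
    float t = (std::clamp(distanceCm, nearCm, farCm) - nearCm) / (farCm - nearCm);
    return static_cast<int>(std::round(maxFragments + t * (minFragments - maxFragments)));
}
```

A nonlinear curve (e.g. squaring `t`) might feel better in practice, since perceived closeness isn’t linear either.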
Technically, I don’t yet know how to measure this distance, but I think the keep-it-simple approach (definitely not the most accurate one) would be to use OpenCV to estimate it from the user’s head size. A Kinect, or perhaps infrared sensors, could also work.
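For the OpenCV route, once a face detector reports the head’s width in pixels, a rough pinhole-camera model converts it to distance: the detected width shrinks in proportion to how far away the head is. A minimal sketch, where the focal length and average head width are assumptions you’d calibrate for a real camera (e.g. by standing at a known distance once and measuring the detected width):

```cpp
// Pinhole camera model: pixelWidth = focalPx * realWidth / distance,
// therefore distance = focalPx * realWidth / pixelWidth.
// focalPx and realHeadWidthCm are hypothetical placeholder values,
// not calibrated numbers from any actual setup.
float distanceFromHeadWidth(float headWidthPx,
                            float focalPx = 600.0f,
                            float realHeadWidthCm = 15.0f) {
    if (headWidthPx <= 0.0f) return -1.0f;  // no face detected this frame
    return focalPx * realHeadWidthCm / headWidthPx;
}
```

With these placeholder numbers, a 90-pixel-wide head would read as roughly 100 cm away, and the estimate should be smoothed over a few frames since detectors jitter.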
Anyway, for now it’s just a piece of software, and I hope it’ll become a “real” installation one day :)