interactive video projection


ra byn (robin)

Apr 26, 2012, 4:30:39 PM
to Qlab Google
I've been struggling with a concept that I realized might be best vented
here...

I work for a museum & I've been asked to come up with an interactive video
projection for the front lobby area.

Something like this:

http://www.youtube.com/watch?v=sAhgQ-LUcCQ

or this

http://www.youtube.com/watch?v=SSryrC4keAM

or this

http://www.youtube.com/watch?v=i_3MQnDsZzU

or this

http://www.youtube.com/watch?v=1ZMNjoudyFM&feature=related

Is this something that can be done with QLab & IR sensors? If not, is
there someone on the list who specializes in this sort of thing, or
someone known to a list member who does?

The only company that has responded to my emails so far is in China, & this
seems like the sort of thing that needs a point person stateside to pull
off.

Please let me know & thanks in advance,

ra byn



dan howarth

Apr 26, 2012, 9:26:41 PM
to ql...@googlegroups.com
On Thu, Apr 26, 2012 at 1:30 PM, ra byn (robin) <ra...@rabyn.com> wrote:
> I work for a museum & I've been asked to come up with an interactive video
> projection for the front lobby area.
>
> Is this something that can be done with QLab & IR sensors? If not, is
> there someone on the list who specializes in this sort of thing, or
> someone known to a list member who does?

i've seen some of this lately onstage with dance (i work at a dance school).
the better things i've seen were done with Max/MSP and a Kinect camera from an Xbox
(it has an SDK). also look at Pure Data .. i bet some of the video-mapping packages have
live-camera-feedback features too ..

i wouldn't have tried to use qlab for this, really. qlab seems to be more for when
things are meant to be predictable. the data-wiggler environments like Max are for when
things are going to be random. just a thought.

Brendan Aanes

Apr 26, 2012, 11:40:54 PM
to ql...@googlegroups.com
Usually projects like this are done in Jitter (part of Max/MSP), or perhaps Isadora these days. QLab is not the tool I'd reach for.

I glanced at the first video and I don't think it's done with IR sensors. Typically for these kinds of installations, a camera watches the projected area and compares what it sees to the projection in real time. When it sees something like a hand or body part cast a shadow, it tracks that area. That's fairly easy as video tracking goes. If you use Jitter, cv.jit, Cyclops, and softVNS are commonly used video-tracking packages. You could do this with IR or ultrasound sensors, but you'd need a huge number of them to get useful data.
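
For a rough sense of what that tracking step looks like in code, here's a minimal frame-differencing sketch in Python with OpenCV (purely illustrative, not from any of my pieces; the thresholds are arbitrary, and a real installation would compare against the known projected image rather than just the previous camera frame):

    import cv2

    cap = cv2.VideoCapture(0)                  # camera aimed at the projection surface
    ok, prev = cap.read()
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # anything that moved since the last frame lights up in the difference image
        diff = cv2.absdiff(gray, prev)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

        # find moving blobs (hands, bodies, shadows) and take their bounding boxes
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) > 500:       # ignore small specks of noise
                x, y, w, h = cv2.boundingRect(c)
                # (x + w/2, y + h/2) is the point your projected content would react to
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

        cv2.imshow('tracking', frame)
        prev = gray
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()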

Feel free to email me off-list; I sometimes do interactive video art pieces (and am putting one up tomorrow, in fact). If you're looking to hand the project off to someone, Snibbe Interactive is a company that specializes in this stuff, mostly for corporate and museum clients. Scott Snibbe was one of the first to develop the video-tracking-on-projection strategy; he also worked on the Björk Biophilia iPhone app. They're probably expensive to hire, though.

hope that helps. 


On Thursday, April 26, 2012, ra byn (robin) wrote:
> I work for a museum & I've been asked to come up with an interactive video
> projection for the front lobby area.
>
> Is this something that can be done with QLab & IR sensors? If not, is
> there someone on the list who specializes in this sort of thing, or
> someone known to a list member who does?

Lucas Krech

Apr 27, 2012, 2:47:26 AM
to ql...@googlegroups.com

If you want to get all codey, there are OpenCV and/or Processing to look at. I worked on a show (lighting, not projections) where we did live interactive video with boid swarms that variously gravitated toward and fled from the dancers. The projection designers used the approach Brendan mentions, but with an IR camera (actually a hacked webcam) to detect movement, so we could have objects appear in blackout conditions.
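
To give a flavor of the boid side of it, the steering math reduces to something like this toy numpy sketch (illustrative only, not the show code, with the usual separation/alignment/cohesion rules left out; the dancer position is a made-up point that would come from your camera tracking):

    import numpy as np

    N = 100
    pos = np.random.rand(N, 2) * 100           # boid positions in projection coordinates
    vel = np.random.rand(N, 2) - 0.5           # boid velocities

    def step(pos, vel, target, attract=True, strength=0.05, max_speed=2.0):
        """One update: steer every boid toward (or away from) a tracked point."""
        to_target = target - pos
        dist = np.linalg.norm(to_target, axis=1, keepdims=True) + 1e-6
        steer = (to_target / dist) * strength
        vel = vel + steer if attract else vel - steer

        # clamp speed so the swarm moves smoothly
        speed = np.linalg.norm(vel, axis=1, keepdims=True)
        vel = np.where(speed > max_speed, vel / speed * max_speed, vel)
        return pos + vel, vel

    # e.g. a dancer tracked at (60, 40): the swarm gravitates there;
    # flip attract=False and it scatters away instead
    dancer = np.array([60.0, 40.0])
    for _ in range(200):
        pos, vel = step(pos, vel, dancer, attract=True)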

 

You might try integrating the OpenCV plugin for Quartz Composer (from Kineme) if you need the cue-stack options of QLab.

 

-L 
