Lighting Chases


micpool

Apr 14, 2018, 3:17:31 PM
to QLab
Just had a quick play with an idea I had to simplify the programming of lighting chases by using simple grayscale animations and mapping regions to lights.
Obviously there's a long way to go, but as I'm unlikely to have time to do any more with this for a few weeks, I thought I would share this low-res demo in case anyone else is inspired to have a go at something similar.

Animated video, still, or text cues in QLab play out to a 1000-pixel-wide Syphon surface. This is picked up by a simple Quartz Composer patch which defines 11 equally spaced points and outputs their grayscale values, via OSCULATOR, to 11 fixtures in the lighting dashboard.
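The sampling step described above (11 equally spaced points across a 1000-pixel-wide surface, each grayscale value driving one fixture) can be sketched outside Quartz Composer. This is a hypothetical illustration, not the actual patch; the exact point spacing and the 0-255 DMX scaling are assumptions:

```python
# Sketch of the sampling idea: read 11 equally spaced points from a
# 1000-pixel-wide grayscale row and scale each 0.0-1.0 value to a
# 0-255 DMX level. Not the original Quartz Composer patch; point
# spacing and DMX scaling are assumptions.

def sample_points(row, n_points=11):
    """Pick n_points evenly spaced samples from one row of pixels."""
    step = (len(row) - 1) / (n_points - 1)
    return [row[round(i * step)] for i in range(n_points)]

def to_dmx(gray_values):
    """Map grayscale 0.0-1.0 to integer DMX levels 0-255."""
    return [max(0, min(255, round(g * 255))) for g in gray_values]

# A horizontal ramp from black (0.0) to white (1.0), 1000 pixels wide;
# animating this ramp sideways would produce a simple intensity chase.
row = [x / 999 for x in range(1000)]
levels = to_dmx(sample_points(row))  # one level per fixture
```

Animating the source image then just means re-sampling each frame and sending the fresh levels out to the fixtures.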

Mic

Video to Lighting Chases lores480.mov

Chris Ashworth

Apr 14, 2018, 9:12:54 PM
to micpool, ql...@googlegroups.com
Ha, I love this. I was spending a lot of time the last few days researching the different kinds of lighting effects and had been thinking “for intensity and color chases, it sure feels like a lot of them would be easier to understand or easier to build in the form of animating them visually instead of parametrically….” 

Michael James Mette

Apr 16, 2018, 11:43:28 AM
to QLab


I used a similar concept for my pixel-mapped LED tape. QLab runs a video that is sent via Syphon to MadMapper, which drives the LED tape. QLab is also running the two videos for the projectors and the backing tracks we use. It has been a dynamic and elegant system. I think I may experiment with lighting chases like you demonstrated in your video. Thanks for the idea, Micpool!


Johnson, Philip

Apr 16, 2018, 12:27:44 PM
to ql...@googlegroups.com

This looks really cool, and it's what I'm working on right now.

Can I ask a couple of questions?

  1. Are QLab and MadMapper on the same machine?
  2. Can you send images via Syphon to another Mac running MadMapper?
  3. Are you using pixel tape with individually addressable pixels, or RGB LED tape?

I've been trying to set up MadMapper on a second machine and was going to use OSC commands to play presets on the MadMapper recorder. I can see the OSC controls on the mapper machine, and QLab is set up on a machine by our sound board. Any suggestions? My solution isn't as elegant as yours appears to be.
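For anyone experimenting with this kind of OSC control, an OSC message is just a small UDP packet, so one can be hand-rolled with nothing but the standard library. The address `/preset/1` and port 8010 below are placeholders, not documented MadMapper addresses; check MadMapper's OSC setup window for what it actually listens for:

```python
# Minimal hand-rolled OSC message over UDP (stdlib only). The address
# and port are placeholders -- substitute whatever MadMapper's OSC
# setup window shows for the control you want to trigger.
import socket
import struct

def _osc_string(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *ints):
    """Build an OSC message with zero or more int32 arguments."""
    msg = _osc_string(address) + _osc_string("," + "i" * len(ints))
    for v in ints:
        msg += struct.pack(">i", v)  # OSC ints are big-endian int32
    return msg

# Fire-and-forget over UDP (here to localhost; in practice, use the
# MadMapper machine's IP and its configured OSC input port):
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/preset/1", 1), ("127.0.0.1", 8010))
```

QLab's own Network cues can send OSC directly, so a script like this is mainly useful for testing what the MadMapper machine receives.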


Philip Johnson

Lighting Designer/Technical Director

Department of Theatre and Dance

Texas A&M University – Corpus Christi 


Sean Dougall

Apr 16, 2018, 1:50:34 PM
to ql...@googlegroups.com
Syphon itself works within a single computer. The benefit of it is that an application like QLab can just render into a texture in video RAM and send MadMapper a reference to that texture, without actually copying over any pixel data. So it’s extremely lightweight and as low-latency as you can possibly get.

There are a few projects out there that purport to be “Syphon over the network” or some such, but in reality they have to pull pixels out of Syphon, send them over the network, and put them back into Syphon on the other machine, all of which incurs latency. Personally, I would prefer to send a regular HDMI video output from QLab into something like a Blackmagic Mini Recorder on the MadMapper machine, if I had to keep them separate. Do you have a particular constraint that’s requiring you to split it across two machines?

--
Sean Dougall



Michael James Mette

Apr 16, 2018, 3:25:41 PM
to QLab
Philip,

     I'm pretty much pushing the limits on one machine (and originally thought I would need a second MacBook Pro), but was able to get it all on one machine. My Syphon file is very small (256 x 64 @ 24 fps), rendered in Apple ProRes. I am also running two other HD video outputs and multitrack audio from QLab. QLab's lighting was able to handle our modest face lights (about 200 channels of DMX). Rendering in ProRes is very important when you're running video files. Also, having an SSD is almost essential.

     I did have some slowdowns when I was testing it, but they were due to the heat of the computer. When the CPU heated up, it would slow down the video and drop frames. To be fair, I was testing it in our non-climate-controlled garage workshop in August. Once I put it in proper conditions, and added a laptop stand with a fan to be safe, we've had no issues with it.

     I'm also using DMXKing's LEDmx PRO 4 for each of my panels. The new firmware upgrade (3.2) was ESSENTIAL. Before that there were lots of little flickers and issues syncing up, but with the new firmware they've been rock solid. MadMapper starts up basically minimized on my desktop so that QLab gets all of the space on the screen. These panels get loaded up and have played about 100 shows and are still rocking strong!

peace,
mjm