Best Practice for QLab Redundant system

FL K

Aug 16, 2023, 9:48:25 PM
to QLab
Hi dear Hivemind,

I am coming up to a third year of working a large event with a (largely) redundant QLab system (as almost always I find there seem to be hard-to-eliminate single points of failure in there, but that is a bigger discussion) - I just wanted to ask what folks do out there in terms of best practice with redundant QLab 5:

- Which tools if any do folks use to automatically sync new versions across to the second machine after edits? I am going to look at some sort of remote trigger with AppleScript or similar to ideally:
  - save/ideally bundle a new version across to the second machine
  - wait until it has successfully completed
  - remotely quit QLab
  - remotely open the newly synced QLab file
  - ideally load to time and start all active cues from machine 1 on machine 2 (granted it would likely not be perfect sync)
  - position the cue to the current playhead of Machine 1

Most of this I will be able to program, I feel, but I think what I could use help with, if it exists, is:
- locate and copy the current file to the other machine
- sync any new media files over (I remember someone talking about rsync here?) and wait for success there
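To make the steps above concrete, here is a rough sketch of how they might hang together as a shell function run from the main Mac. Everything here is an assumption for illustration - the backup hostname, the paths, the workspace filename, and the use of passwordless SSH are all placeholders, not anything QLab itself provides:

```shell
# Hypothetical sketch - backup hostname, paths, and workspace name are all
# placeholders. Assumes passwordless SSH from the main Mac to the backup.
sync_to_backup() {
  local backup="backup-mac.local"
  local show_dir="$HOME/Shows/MyShow"

  # Quit QLab on the backup first, so it isn't holding the old workspace open
  ssh "$backup" 'osascript -e "tell application \"QLab\" to quit"' || true

  # rsync only returns once the transfer has completed (or failed), which
  # covers the "wait until it has successfully completed" step; -a preserves
  # timestamps, so unchanged media is skipped on subsequent runs
  rsync -a --delete "$show_dir/" "$backup:Shows/MyShow/" || return 1

  # Reopen the freshly synced workspace on the backup
  ssh "$backup" 'open "$HOME/Shows/MyShow/MyShow.qlab5"'
}
```

The load-to-time and playhead-matching steps would have to sit on top of this (presumably via OSC) and are the genuinely hard part.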

One other related question - I assume you'd want to stay away from shared network storage as a source, both for performance and single-point-of-failure reasons?

Still rooting for this somehow being incorporated into QLab 5 (or 6?) - it would revolutionise redundant systems in the same way the collab feature has revolutionised collaboration: a built-in main/backup system (or even better, one with no fixed/exchangeable roles!) that could handle this without restarting QLab on the non-active machine. The approach above seems to force the edits to be on the "live" machine (so that the restart of the secondary QLab can be tolerated without interruption), and so would not easily allow what in LX land some consoles call "BLIND" mode - editing without being live.

I am sure my thoughts above contain omissions, unreasonable simplification, and generalisations, and I am always keen to be educated - feel free to tear it apart as gently as your personal patience allows ;).

Cheers,

Freddy

FL K

Aug 26, 2023, 6:42:53 AM
to QLab
Sorry, just following up - any thoughts from within F53, or from the community for that matter? Or are people just doing manual copies in that situation?

Sam Kusnetz

Aug 26, 2023, 11:15:44 AM
to ql...@googlegroups.com
Hi Freddy,

I just wanted to ask what folks do out there in terms of best practice with redundant QLab 5

The standard approach in New York City theater, both on Broadway and off-Broadway, is a system like this:

  • Two Macs, usually high-spec Mac Minis, each running the same version of macOS, QLab, and usually Dante Virtual Soundcard.
  • An xDIP or ServSwitch CX networked KVM system.
  • Three completely physically independent computer networks; one for Dante primary, one for Dante secondary, and one for non-Dante network traffic. The Macs use their built-in ethernet port to connect to one of these networks, and USB-to-ethernet adapters to connect to the others.
  • A USB or MIDI trigger box with four to eleven buttons that connects to both Macs and is the only source of QLab control during a show. This is how we guarantee that the playhead is in the same spot and that cues start on both Macs at the same time.

- Which tools if any do folks use to automatically sync new versions across to the second machine after edits?

Most people do not do this automatically. Most folks manually copy the workspace and media to the backup Mac at the end of each working day. Then, when opening night approaches and the workspace stops changing (or at least stops changing often), the top-of-day procedure starts to include booting the backup Mac and testing it daily.

- save/ideally bundle a new version across to the second machine

Bundling is almost never necessary in QLab 4, and is not part of QLab 5’s vocabulary at all. Saving a copy of the workspace folder is all you need.

- wait until it has successfully completed
- remotely quit QLab
- remotely open the newly synced QLab file

This seems complex to me, but if you’re going to do it I would recommend quitting QLab on the backup before starting the copy.

- ideally load to time and start all active cues from machine 1 on machine 2 (granted it would likely not be perfect sync)

It would 100% definitely not be in sync. It will be out of sync by an amount of time equal to the time it takes to find the current time of all cues on the main, transmit that information to the backup, and load-to-time and start those cues there.

If you perform this sync action while no cues are playing, of course, you’ll have no problem at all.

What is the failure scenario you are trying to guard against? It seems to me like you’re trying to bring a backup system online mid-show, and expecting it to be able to take over at any moment. Is that accurate? If so, why is this necessary in your context?

- locate and copy the current file to the other machine
- sync any new media files over (I remember someone talking about rsync here?) and wait for success there

Something I use which is not common, but which works very well for me, is Syncthing.

This tool works sort of like a server-less version of Dropbox. You can designate any folder as a source of syncing, and then other computers can attach themselves to that folder and sync it. You can set the syncing to be one-way, too, which is how I do it. Changes made on the main Mac are pushed to the backup, but if I were to make a change on the backup, it doesn’t push back to the main. This gives me some peace of mind.

QLab doesn’t reload the workspace while it’s open, of course, so you still need to manually decide when to launch QLab on the backup, but I view that as an asset and not a liability.

One other related question - I assume you'd want to stay away from shared network storage as a source, both for performance and single-point-of-failure reasons?

Yes. The SSD internal to the Mac, or connected by USB, is wildly, absurdly faster than even the fastest network storage. It’s not even close.

Plus, with network storage you have a whole other device you need to babysit, make sure it’s healthy and running at peak performance, etc. This is a lot of extra work to get worse performance.

Simpler really is better when it comes to backups!

Still rooting for this somehow being incorporated into QLab 5 (or 6?) - it would revolutionise redundant systems in the same way the collab feature has revolutionised collaboration: a built-in main/backup system (or even better, one with no fixed/exchangeable roles!)

Yes, that sounds pretty terrific doesn’t it!

Best
Sam

Sam Kusnetz (he/him) | Figure 53
