QLab load on CPU + GPU


Andy Dolph

Feb 10, 2014, 1:50:23 PM
to ql...@googlegroups.com
Chris and Sean (or whoever else has an answer... but I suspect this is architectural...)

I'm looking at the new Mac Pro for QLab.

How much multithreading does QLab take advantage of?

Is there a maximum number of cores/threads that are likely to be useful to QLab?

For a machine that will drive large multi-display video (eventually including edge blending, corner pinning, etc.), I have a choice between allocating money toward a CPU with more cores or toward more powerful GPUs. My tendency is to go with either the
3.7GHz quad-core with 10MB of L3 cache, or the
3.5GHz 6-core with 12MB of L3 cache,
and then get the biggest GPU we can afford (likely the dual D500 option).

Does that make sense?

Why or why not?

Thanks!

Andy

Chris Ashworth

Feb 10, 2014, 2:09:10 PM
to ql...@googlegroups.com

Andy Dolph wrote:
> Chris and Sean (or whoever else has an answer... but I suspect this is
> architectural...)
>
> I'm looking at the new Mac Pro for QLab.
>
> How much multithreading does QLab take advantage of?

Each audio and video file is decoded on its own dedicated thread. Thus,
QLab can be decoding media in parallel on as many cores as the machine
makes available.
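A toy sketch of that per-file pattern, with Python threads standing in for QLab's decode threads and hypothetical file names:

```python
import threading
import queue

def decode(path, out):
    # Stand-in for a real media decoder: each file is processed
    # entirely on its own thread, independently of the others.
    out.put((path, f"decoded:{path}"))

# Hypothetical cue list; one dedicated thread per media file.
files = ["intro.mov", "underscore.wav", "finale.mov"]
results = queue.Queue()
threads = [threading.Thread(target=decode, args=(f, results)) for f in files]
for t in threads:
    t.start()
for t in threads:
    t.join()

decoded = dict(results.get() for _ in files)
print(sorted(decoded))  # every file decoded on its own thread
```

With more files than cores, the OS scheduler simply time-slices the decode threads across the cores that exist.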

As with all performance analysis, the degree to which adding CPU cores
will improve performance is limited by the smallest bottleneck in the
chain (CPU, storage speed, graphics card, bus speed, and so on).
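In other words, the slowest stage sets the ceiling. A toy illustration with entirely made-up throughput numbers:

```python
# Hypothetical sustained throughput of each stage in the chain, in MB/s.
stages = {
    "storage": 450,
    "bus": 1200,
    "cpu_decode": 900,
    "gpu_upload": 700,
}

# The pipeline can move data no faster than its slowest stage.
bottleneck = min(stages, key=stages.get)
ceiling = stages[bottleneck]
print(bottleneck, ceiling)  # storage 450
```

Doubling CPU cores here raises "cpu_decode" but leaves overall throughput pinned at 450 MB/s until storage improves.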

But yes, QLab is heavily multi-threaded and will take advantage of
extra CPU cores if they are available.

I will say that unless you are performing heavy effects (audio or video)
the CPU is not usually the bottleneck.

I'm not sure I can reliably make a more specific prescription, since
there are so many factors for any given design and system configuration.

-C

Andy Lang

Feb 10, 2014, 3:06:18 PM
to ql...@googlegroups.com
The one other thing I can add to what Chris said, re: CPU choices, is
that the math gets complicated, and a higher core count isn't
necessarily faster in many circumstances. The best explanation and
breakdown comes from this Marco Arment post that my teammate Sam found
a while back:

http://www.marco.org/2013/11/26/new-mac-pro-cpus

Best always,
Andy

talkingtobrian

Feb 20, 2016, 10:03:56 PM
to QLab
"How much multithreading does QLab take advantage of?"

On a similar note, is QLab able to take advantage of dual graphics? I did some performance monitoring on our 2013 Mac Pro with dual AMD D300 graphics - the RAM and CPU are hardly touched, but one GPU was working hard. The other was idle. We were using all three graphics/Thunderbolt buses, so in my mind we shouldn't have been making one GPU do all the work, unless QLab only uses one GPU.

Sam Kusnetz

Feb 21, 2016, 8:53:10 AM
to ql...@googlegroups.com


talkingtobrian wrote:
> On a similar note, is QLab able to take advantage of dual graphics? I did some performance monitoring on our 2013 Mac Pro with dual AMD D300 graphics - the RAM and CPU are hardly touched, but one GPU was working hard. The other was idle. We were using all three graphics/Thunderbolt buses, so in my mind we shouldn't have been making one GPU do all the work, unless QLab only uses one GPU.

On the new Mac Pros, only one of the GPUs is actually connected to the
display outputs. The second one, strangely, has nothing to do with
driving displays and is exclusively available for GPU computing tasks.
Final Cut Pro, for example, uses the second GPU to do background
rendering of video effects so that you can continue to use the program
easily even while rendering is taking place.
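That division of labor can be sketched abstractly; here a Python worker thread stands in for the compute GPU, and the loop stands in for the app's responsive foreground (clip name and timings are hypothetical):

```python
import concurrent.futures
import time

def render_effect(clip):
    # Stand-in for heavy rendering work handed off to the second GPU.
    time.sleep(0.01)
    return f"rendered:{clip}"

# The render runs in the background, so the foreground loop below keeps
# servicing "events" instead of blocking until the render finishes.
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(render_effect, "clip-01")
    events = 0
    while not future.done():
        events += 1          # the app stays interactive meanwhile
        time.sleep(0.001)
    print(future.result())
```

The point is only the shape of the design: rendering and interaction proceed concurrently because they run on separate resources.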

QLab has not been written to take advantage of the compute-only GPU,
although it's something we're looking at. The majority of QLab users are
not using Mac Pros, and the number of tasks in QLab that could be
offloaded to that second GPU is not huge, so it's something we need to
consider carefully in order to determine whether it's the best use of our
development energy.

On a personal note, I really do wish Apple provided a way for both GPUs
to be involved in video output directly. That would just make things so
much easier!

Cheerio
Sam

--
Sam Kusnetz | Figure 53
s...@figure53.com

talkingtobrian

Feb 21, 2016, 11:51:25 AM
to QLab
"The majority of QLab users are not using Mac Pros, and the number of tasks in QLab that could be offloaded to that second GPU is not huge, so it's something we need to consider carefully in order to determine whether it's the best use of our development energy."

Thanks for the clarification - I wasn't sure if it was a testing error. I know there is some loss of speed when swapping from processor to processor - so using the second GPU wouldn't help with pre-load?

I'm very interested in your statement that most of your user base isn't using Pros - are you able to tell us what you feel is the primary machine type? I figured the Pro would be best, with all of the multiple outputs built in. But I've been hitting a bottleneck, and now I see why. Two places I work at just bought Mac Pros specifically for QLab, and one already has two others for the same purpose.

sam kusnetz

Feb 21, 2016, 12:35:06 PM
to ql...@googlegroups.com

> On Feb 21, 2016, at 11:51 AM, talkingtobrian <echostatio...@gmail.com> wrote:
>
> Thanks for the clarification. Wasn't sure if it was a testing error. I know there is some loss of speed when swapping from processor to processor

I'm not entirely sure what you mean by that.

> so using the second GPU wouldn't help with pre-load?

I may not have been clear enough before. Building an application to use the second GPU is a significant rewrite to major parts of the code. You may recall what a mess Apple found themselves in when they rewrote Final Cut Pro between version 7 and version X, which was the revision that paved the way for GPU compute support (among other things).

So what I really meant to say is that it's not a question of flipping a switch; it's a serious and substantial task to rewrite QLab in this way.

> I'm very interested in your statement that most of your user base isn't using Pros - are you able to tell us what you feel is the primary machine type?

I think QLab users are using Mac Minis or MacBook Pros. My personal experience isn't terrifically broad, but most of the theaters I've worked in off-Broadway and off-off-Broadway are using Minis.

> I figured the pro would be best, with all of the multiple outputs built in.

For video users, you are most certainly correct that the Mac Pro gives the most power and most flexibility. But it also costs, you know, quite a lot more money!

I'm certainly not saying there's no benefit possible here; I'm just saying it's not an open-and-shut case - there are a lot of obscure variables and difficult-to-predict consequences.

Best
Sam
--
Sam Kusnetz | Figure 53
http://qlab.tips | http://qlab.tv
(mobile)

Chris Ashworth

Feb 21, 2016, 1:52:31 PM
to talkingtobrian, ql...@googlegroups.com
As it turns out we have some hard data about this too, in the form of the anonymous system profiles folks can choose to provide when checking for updates.

I don’t at present have a summary of that data immediately available but scrolling through a few pages of the data suggests that Mac Pros are used much more rarely than other models. I’d be interested to see a summary of those numbers, and will share that if/when we pull that together.

-C

Sean Dougall

Feb 21, 2016, 7:28:01 PM
to ql...@googlegroups.com
Out of curiosity, I took a different approach and ran a quick check of the activation database. It looks as though in the last six months, a little less than 3.7% of computers activated with QLab licenses were “trashcan” Mac Pro models. I suspect that the real number, adding in computers running the free version of QLab, is likely to be lower.

That said, I believe there are cases, particularly within those 3.7%, where that second GPU power could be seriously beneficial. There’s been some lobbying for this within Figure 53 as well. :-) The trick is, as Sam said, identifying areas where processing can be parallelized enough to make the benefit worth the development effort. Only certain types of operations are suitable for offloading to the GPU, but we believe we’ve identified some areas (I’m looking at you, video effects) where it seems like a good fit and likely to yield some major benefits. But again, not without a serious amount of tearing down and rebuilding.
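Video effects suit the GPU because they are typically per-pixel: every output pixel depends only on its matching input pixel, so the work maps cleanly onto thousands of shader cores. A toy illustration (a tiny grayscale "frame"; a real effect would of course run on the GPU, not in Python):

```python
# Per-pixel brightness effect on a toy 4-pixel grayscale frame.
frame = [10, 200, 250, 0]

def brighten(pixel, gain=1.5):
    # Clamp to the valid 8-bit range after applying the gain.
    return min(255, int(pixel * gain))

# Each call is independent of every other pixel, which is exactly the
# property that makes this kind of effect embarrassingly parallel.
out = [brighten(p) for p in frame]
print(out)  # [15, 255, 255, 0]
```

Effects that need neighboring pixels (blurs, for instance) still parallelize well; it's the stages with sequential dependencies that resist GPU offload.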

Cheers,
Sean