September 15, 2015 at 7:49 AM
I'm getting a pretty drastically different video experience between QLab and QuickTime and I'm wondering if there is any extra tax in the QLab 3 video engine (which I know is no longer QT), or any other known causes or instances where QLab would come up with different video output than QuickTime?
Artifact one looks like dropped frames: horizontal scrolling is much more jagged/stuttery in QLab than the same file viewed in QuickTime. This is what makes me wonder about data paths and processing, but again, if QuickTime can play it fine, then the machine specs must be suitable as far as read and decode speeds. I haven't had the chance to watch the CPU and disk load while it's running.
The second artifact has to do with horizontal line rendering. The original content is a Betamax tape that is interlaced, so 480i. The new file does a great job of rendering the interlaced video as progressive, but in QLab, some horizontal scrolling text seems to 'wobble' at the top and bottom, where it doesn't in QuickTime. Is it possible that QLab decodes the ProRes differently than QT?
Unfortunately my knowledge of video is limited, so I'm unable to describe the differences better, or know what could be causing these discrepancies. I've tried to include as much info as possible; perhaps some is extraneous and irrelevant. Any thoughts are appreciated. Thanks!
Hello Brian
I'm getting a pretty drastically different video experience between QLab and QuickTime and I'm wondering if there is any extra tax in the QLab 3 video engine (which I know is no longer QT), or any other known causes or instances where QLab would come up with different video output than QuickTime?
The essential difference is that QuickTime Player prioritizes smooth playback over accuracy, so if your computer can't keep up, QT will drop frames, automatically switch to a lower-resolution encoding, or make small changes to playback speed in order to compensate. QLab does not do this.
Artifact one looks like dropped frames: horizontal scrolling is much more jagged/stuttery in QLab than the same file viewed in QuickTime. This is what makes me wonder about data paths and processing, but again, if QuickTime can play it fine, then the machine specs must be suitable as far as read and decode speeds. I haven't had the chance to watch the CPU and disk load while it's running.
Can you tell us more about your computer, please? CPU, GPU, RAM, VRAM, disk?
The second artifact has to do with horizontal line rendering. The original content is a Betamax tape that is interlaced, so 480i. The new file does a great job of rendering the interlaced video as progressive, but in QLab, some horizontal scrolling text seems to 'wobble' at the top and bottom, where it doesn't in QuickTime. Is it possible that QLab decodes the ProRes differently than QT?
It's not only possible, it's guaranteed! QLab 3 uses AVFoundation, which is Apple's appointed video framework of the future, and QuickTime does not. So a direct comparison between QLab and QuickTime is not really apples to apples.
I *strongly* encourage you to de-interlace in advance if possible, as interlacing is the work of the devil.
OK not really... what I should say is interlacing is a necessary evil in some cases. But video playback on a computer is not one of them! All progressive, all the time.
I'm getting a pretty drastically different video experience between QLab and QuickTime and I'm wondering if there is any extra tax in the QLab 3 video engine
The essential difference is that QuickTime Player prioritizes smooth playback over accuracy
Can you define accuracy? Do you mean time, or resolution, or both, now that I'm reading your answer in a different way? So QLab only plays as many perfect frames as it can at the playback rate, yes?
I will have to check Activity Monitor when I get back to the system to see if it's the disk or the processing that's choking; maybe the processing, if it's not the GPU. Maybe we have to go to uncompressed video?
Can you tell us more about your computer, please? CPU, GPU, RAM, VRAM, disk?
It's the current base MacBook Air, so not fancy, but not old. That gives it a dual-core 1.6 GHz i5, 4 GB of RAM, HD6000 graphics, and the stock 128 GB SSD. This is not the show machine, but these artifacts have been seen by other people on other machines, so it's a consistent problem, I imagine, because no one can handle the 60 fps completely.
I *strongly* encourage you to de-interlace in advance if possible, as interlacing is the work of the devil.
It is. The original content is 30 fps interlaced, and perhaps you can answer another question: does the video become progressive running through QLab?
I would not expect a MacBook Air to run 60 fps HD video successfully in QLab. The HD6000 GPU is pretty lackluster. Also, on Macs with an integrated GPU, the amount of system RAM that is given to the GPU is based on the total amount of system RAM. With the minimum RAM installed, 4 GB, the GPU has very little RAM to work with.
I *strongly* encourage you to de-interlace in advance if possible, as interlacing is the work of the devil.
It is. The original content is 30 fps interlaced, and perhaps you can answer another question: does the video become progressive running through QLab?
Well strictly speaking, all video that runs through a computer becomes progressive, because the fundamental way a computer displays video is progressive, not interlaced. The jaggy lines you see when playing back interlaced video on a computer are a side effect of automatic on-the-fly de-interlacing.
I would like to know if a 30 fps version of your video plays nicely. Try taking your 60p file and downsampling to 30fps. I would also like to know if you have access to a Mac with a dedicated GPU, and whether running this video on that system performs better.
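If ffmpeg happens to be available, one way to sketch that 60p-to-30p test file is below; the file names and the choice of ProRes 422 Proxy are just assumptions for illustration, not a prescribed workflow:

```shell
# Downsample a 60p ProRes file to 30 fps, re-encoding as ProRes 422
# Proxy (prores_ks profile 0). "in_60p.mov"/"out_30p.mov" are placeholders.
ffmpeg -i in_60p.mov \
       -vf fps=30 \
       -c:v prores_ks -profile:v 0 \
       -c:a copy \
       out_30p.mov
```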
I would not expect a MacBook Air to run 60 fps HD video successfully in QLab. The HD6000 GPU is pretty lackluster. Also, on Macs with an integrated GPU, the amount of system RAM that is given to the GPU is based on the total amount of system RAM. With the minimum RAM installed, 4 GB, the GPU has very little RAM to work with.
Noted and expected. Deciding the minimum system requirements is part of this project as well. Since this is a package that will get sent out to unknown machines, we obviously want to be as optimized as possible to allow for the greatest compatibility, i.e. hopefully not require everyone to go buy an iMac or a top-of-the-line MacBook Pro.
Well strictly speaking, all video that runs through a computer becomes progressive, because the fundamental way a computer displays video is progressive, not interlaced. The jaggy lines you see when playing back interlaced video on a computer are a side effect of automatic on-the-fly de-interlacing.
Perhaps it's misleading that you can output 1080i in System Preferences, but maybe that's only the hardware output, and everything processed inside the computer is progressive?
I would like to know if a 30 fps version of your video plays nicely. Try taking your 60p file and downsampling to 30fps. I would also like to know if you have access to a Mac with a dedicated GPU, and whether running this video on that system performs better.
Yes, 30 fps seems to play well, but the de-interlacing does not look as good as the frame-to-frame averaging that can happen at 60 fps.
I realized that the wobbling text may be our de-interlace algorithm being exposed when QLab drops frames: the next frame would fill in the gaps but isn't being displayed.
That's related to the de-interlace, which is why I'm partly wondering if the transfer to progressive should be pixel for pixel. Right now it's upsampled from sub-SD (Betamax that wasn't even full frame) to 720, so it's also trying to expand 2 original lines into 3.x lines (which is why sometimes it averages up and sometimes down), so I've requested a pixel-for-pixel de-interlace only.
My current thought in trying to get optimized (a discrete video card may still be required for 60 fps) is that we have the worst combination of files to output: assuming we want to standardize on 1080 going to a projector, our 720 content is both a big file to decompress from ProRes and still has to be scaled to 1080. So my question now is, other than trying it out, is it known which is least taxing for CPU/GPU/disk among these 4 options:
sub-SD ProRes
sub-SD uncompressed
1080 ProRes
1080 uncompressed
Or I should say: I know we are trading processing for disk I/O in these options. If 1080 is the desired output, is it easier on the processing for QLab to read uncompressed 1080, or something smaller and scale it? It seems like the choke point is not disk speed (I was seeing only 10 MB/s or so), so we could either go to uncompressed video, or a larger frame size, or both. I don't know if scaling or decoding ProRes is more work (I would think decoding).
Because this will get shipped around the world, we were hoping to keep the files at a size that could be downloaded, but uncompressed HD for over an hour would not be viable. However, ultimately the video quality is by far the most important aspect, so we have to get that right first.
That makes good sense to me, and I applaud your attitude. That said, a MacBook Air is just not going to do a good job with video, period. You may have some success with simple video playback, and I understand that's what you're trying to work out, but I also think it's reasonable to have a minimum hardware requirement for your show that's higher than the minimum hardware requirement for QLab 3.
I would like to know if a 30 fps version of your video plays nicely. Try taking your 60p file and downsampling to 30fps. I would also like to know if you have access to a Mac with a dedicated GPU, and whether running this video on that system performs better.
Yes, 30 fps seems to play well, but the de-interlacing does not look as good as the frame-to-frame averaging that can happen at 60 fps.
The explanation here is that your system is simply not able to process 60 frames per second. Whatever visual artifacts you observe at 60 fps are a result of attempting to push too much data through your Mac. When you play back your 60 fps video through QuickTime Player, you're most definitely not actually seeing 60 frames per second; you're seeing Apple's very high quality on-the-fly downconversion to some slower frame rate.
That's related to the de-interlace, which is why I'm partly wondering if the transfer to progressive should be pixel for pixel. Right now it's upsampled from sub-SD (Betamax that wasn't even full frame) to 720, so it's also trying to expand 2 original lines into 3.x lines (which is why sometimes it averages up and sometimes down), so I've requested a pixel-for-pixel de-interlace only.
I most definitely recommend pixel-for-pixel.
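If you end up doing the transfer yourself, a pixel-for-pixel de-interlace (no scaling at all) could be sketched with ffmpeg as below; the file names are placeholders, and your transfer house may well use better tools:

```shell
# De-interlace without changing the frame size: yadif=0 emits one
# progressive frame per input frame (30 interlaced in, 30p out),
# then re-encode as ProRes 422 Proxy.
ffmpeg -i capture_480i.mov \
       -vf yadif=0 \
       -c:v prores_ks -profile:v 0 \
       -c:a copy \
       capture_480p.mov
```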
This next statement might stress some people out, but stay with me here: I also recommend moving away from thinking about broadcast standards altogether. SD, 720p, 1080i, 1080p... these are broadcast standards that really have nothing to do with a computer. They're convenient shorthands, and if you're outputting from your Mac into an actual broadcast environment then they certainly matter, but when you're starting with video on your hard disk, playing it through QLab, and outputting to a computer monitor or a video projector that accepts input via HDMI, DVI, or VGA, just think about pixels. If the native resolution of your projector is 1920x1080, then it's 1920x1080. It's not HD, it's not 1080p or 1080i, it's just 1920 pixels wide by 1080 pixels high.
My current thought in trying to get optimized (a discrete video card may still be required for 60 fps) is that we have the worst combination of files to output: assuming we want to standardize on 1080 going to a projector, our 720 content is both a big file to decompress from ProRes and still has to be scaled to 1080. So my question now is, other than trying it out, is it known which is least taxing for CPU/GPU/disk among these 4 options:
sub-SD ProRes
sub-SD uncompressed
1080 ProRes
1080 uncompressed
Once again, ProRes 422 Proxy is our most recommended format. It has a low data rate, so it won't soak your disk I/O, and its compression is fast to decode, so it's easy on your CPU and GPU. Any visual shortcomings of Proxy are completely irrelevant due to the picture quality of your source material.
The best thing is to not scale in software at all. QLab playing a video at scale 1 is more efficient than QLab playing a video at scale 2 or scale 0.5. See previous paragraph regarding "sub SD" versus 1080.
If you want to standardize on 1920x1080 projectors then import your Betamax at 1920x1080 progressive, 30 frames per second.
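Assuming ffmpeg and a Lanczos scaler, that whole preparation (de-interlace, scale to the projector's native pixels, drop to 30 fps, encode ProRes 422 Proxy) might be done in one pass along these lines; the file names are placeholders, not part of any prescribed workflow:

```shell
# De-interlace, scale to the projector's native 1920x1080, conform to
# 30 fps, and encode as ProRes 422 Proxy (prores_ks profile 0).
ffmpeg -i betamax_480i.mov \
       -vf "yadif=0,scale=1920:1080:flags=lanczos,fps=30" \
       -c:v prores_ks -profile:v 0 \
       -c:a copy \
       show_1080p30.mov
```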
But if you're going to dictate that your show needs a 1920x1080 projector, it seems reasonable to me that you can also dictate that your show needs a Mac of a certain minimum spec.
Or I should say: I know we are trading processing for disk I/O in these options. If 1080 is the desired output, is it easier on the processing for QLab to read uncompressed 1080, or something smaller and scale it? It seems like the choke point is not disk speed (I was seeing only 10 MB/s or so), so we could either go to uncompressed video, or a larger frame size, or both. I don't know if scaling or decoding ProRes is more work (I would think decoding).
Don't think about it as an either/or proposition. It's easy for you to avoid scaling; just prepare your material at the size at which you'll be projecting it. So since it's easy to avoid, avoid it.
Playing back completely uncompressed video at 1920x1080 will not perform better on low-grade hardware than wisely chosen compressed video.
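Rough numbers bear this out. Here's a back-of-the-envelope comparison, assuming 8-bit 4:2:2 for the uncompressed case and Apple's published target of roughly 45 Mbit/s for Proxy at 1080p30:

```shell
# Uncompressed 8-bit 4:2:2 at 1920x1080: 2 bytes per pixel.
bytes_per_frame=$((1920 * 1080 * 2))
uncompressed_mbs=$((bytes_per_frame * 30 / 1000000))
echo "uncompressed 1080p30: ~${uncompressed_mbs} MB/s"   # ~124 MB/s

# ProRes 422 Proxy targets roughly 45 Mbit/s at 1080p30.
proxy_mbs=$((45 / 8))
echo "ProRes 422 Proxy 1080p30: ~${proxy_mbs} MB/s"      # ~5 MB/s
```

So uncompressed 1080 multiplies the disk load by roughly twenty times while giving the CPU almost nothing back, which is why Proxy is the recommendation.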
Because this will get shipped around the world, we were hoping to keep the files at a size that could be downloaded, but uncompressed HD for over an hour would not be viable. However, ultimately the video quality is by far the most important aspect, so we have to get that right first.
Again, I agree with your approach, except for the part where the whole show gets downloaded instead of installed on a Mac that is properly configured and tested, and then shipped around with the show. I think that would give you a higher chance of success.
I hope this all helps!