Sony LUT Rec709


Peppin Kishore

Aug 3, 2024, 3:36:28 PM
to inredjaiti

After being super confused about why my workflow got destroyed overnight, I finally realised that the difference in my case comes down to 4:2:2 clips. Applying a technical LUT to 4:2:2 footage delivers incorrect results: the image is blown out and oversaturated. The same LUT, in the same version of Premiere, with the same files, looks correct on my Windows machine. 4:2:0 clips behave the same across both laptops. I tried multiple S-Log to Rec 709 LUTs and many different clips. Any ideas what could be responsible for this?

Also, check the preferences ... it sounds like one of your rigs is a Mac? That should definitely have the "display color management" option turned on in the prefs. Most people on PCs should too, unless they've really tightly calibrated and profiled their monitors.

Hey Neil, I just wanted to say a big thank you! I was struggling with this for days. Interpret Footage was the option everyone suggests, but everything was already set to Rec 709; what did the trick was the "Display Color Management" tick box. I'm saved, once again thanks!

I hate to come back to this issue, but opening my project again on this beautiful Monday morning I realised the issue is not resolved. I'm super confused, as I remember seeing correct colours on my footage in the project's timeline the other day when Neil first explained what needed to be done.

So, to maybe provide some extra clues, I'm attaching screenshots of how ticking the "Display Color Management" box helped, how it looked before, and how it rendered on a Windows machine (so yes, the problem persists on my M1 Pro Mac). There might be one more layer of secondaries in there, but the point is that nothing is clipping.

Thanks for exploring this issue further; here's some sample footage and the LUTs, 300 MB total, so it shouldn't be too bad. Just one more thing I realised: maybe it's not related to chroma subsampling but to the codec being used. When I shoot H.264 it ends up being 4:2:2, and HEVC is 4:2:0, and I missed that initially when looking at the properties of the clips that get messed up.
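For anyone unsure what the 4:2:2 vs 4:2:0 distinction means in practice, here's a minimal numpy sketch, entirely illustrative and nothing Premiere- or camera-specific, of how much chroma data each scheme actually stores:

```python
import numpy as np

# A stand-in "chroma plane" for an 8x8-pixel patch.
h, w = 8, 8
chroma = np.arange(h * w, dtype=float).reshape(h, w)

# 4:2:2 keeps full vertical chroma resolution, halves horizontal.
chroma_422 = chroma[:, ::2]

# 4:2:0 halves both horizontal and vertical chroma resolution,
# so it keeps only half as many chroma samples as 4:2:2.
chroma_420 = chroma[::2, ::2]

print(chroma_422.shape)  # (8, 4)
print(chroma_420.shape)  # (4, 4)
```

Luma is untouched in both cases; only the colour planes are decimated, which is why subsampling differences tend to show up in saturated detail rather than in overall brightness.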


They can do seemingly amazing things, but the colorists I know all call them the dumbest math out there. You have to test them carefully before using them, as in the right (or perhaps wrong) circumstances most LUTs can produce spectacular artifacting. So for some clips of a job they might work fine, while something with darker shadows or hotter highlights gets mangled.

And all LUTs are built for clips with specific color and tonal range values. Any clip "outside" those values gets clipped. So all the colorists I know teach applying LUTs after a node (Resolve) or layer (in an Adobe app), where you can trim the tonal range and saturation to fit within the LUT, so it won't clip the whites, crush the blacks, or over-ramp color.

And ... in applying both of these LUTs to each clip, I found that applying them in the Creative tab and using the Basic tab controls to 'trim' the clips into the LUTs worked beautifully. No clipped data at all. In fact, as I brought the Exposure slider down after dropping a LUT onto the guy clip, I could watch the clipped line across the forehead area resolve into the full data of a properly exposed image.
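The "trim the clip into the LUT" idea can be sketched numerically. Below is a toy 1D LUT in numpy, not the actual Leeming LUT or Premiere's interpolation; the point is only that input values outside the LUT's expected domain all collapse to the same output, while scaling the signal into range first preserves the detail:

```python
import numpy as np

# A toy 1D "LUT" defined only over input code values 0..1:
# anything outside that domain gets hard-clipped by the lookup.
lut_in = np.linspace(0.0, 1.0, 33)
lut_out = lut_in ** (1 / 2.2)          # placeholder transform

def apply_lut(x):
    # np.interp clamps to the table ends, i.e. every value above
    # 1.0 maps to the same output: that's the clipped forehead.
    return np.interp(x, lut_in, lut_out)

signal = np.array([0.2, 0.9, 1.3, 1.6])   # super-white highlights

clipped = apply_lut(signal)

# "Trimming into the LUT": pull exposure down first so the whole
# signal sits inside the LUT's expected range, then apply the LUT.
trimmed = apply_lut(signal / signal.max())

print(clipped[-2:])   # the two hottest values are identical: data lost
print(trimmed[-2:])   # distinct values: highlight detail survives
```

This is why the corrections go *before* the LUT in the chain (earlier node in Resolve, Basic tab under a Creative-tab LUT in Premiere): a trim applied after the LUT can only darken an already-flattened highlight, not recover it.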

Thank you, Neil, for taking the time to test the LUTs! I'm with you that LUTs are just LUTs and at best a good starting point. The issue here is that the Leeming LUT is my trusted technical LUT for bringing S-Log to Rec 709, and it's been working fairly well as such. It still does on HEVC 4:2:0 clips; it just behaves oddly on my Mac with the clips I sent you.

I guess I'm always rushing these replies and not being very scientific with examples and explanations. This time I took the time to properly compare two clips on the Windows laptop and the M1 Mac, and I realised that the problem starts even before applying the LUT. I'm attaching two waveforms of the raw clips before adding the LUT: same timeline, same sequence and project settings, same file, same latest Premiere Pro 2022. You can see that the waveform starts at a different point on one computer than on the other. Clearly there is a software issue there, and it prevents me from working with my old project files on the new Mac without redoing colour correction from scratch.

Ahh, PCs and Macs ... there's a fun issue there, in that the wonderful folks at Apple chose intentionally to "reinterpret" video standards differently than everyone else. macOS has its ColorSync color management system, which does two rather nasty things to video media.

In order to work around that as best as possible, the Pr engineers added the "display color management" option in the Preferences. All Mac users, and most PC users, should have this checked. Unless you're running a pretty solid CM setup with stringent calibration/profiling to Rec.709, it should be selected.

Credit.. Alister Chapman.. just read this .. and its the first time I have ever seen someone make this comparison /explanation of the ever discussed "Arri look -Sony look".. makes sense to me.. and typical of the Japanese mind set for everything to be incredibly accurate .. over the actual greater goal.. to make a video camera that produces nice images..

"Arri's colour gamut is very limited compared to Sony's. The Alexa sensor has nowhere near the gamut of the F55 or Venice. But that is what gives the Arri cameras their out of the box look. One of Sony's problems has always been the effort to produce an accurate large gamut image rather than a pleasing image. Point a Sony at almost any test chart and you will see very few problems in the colorimetry. But often accurate isn't as pleasing to look at as more artistically adjusted colors. "

This is interesting, but the idea that Sony color is accurate and Arri color is optimized for a "pleasing image" feels very vague. Why wouldn't accurate color be pleasing? Is there any information on what is actually going on inside the Arri, and on how an engineer "artistically adjusts the colorimetry"?

It is reliant, to some extent, on the behaviour of the sensor, which is very low noise, but that sensor is now far from leading-edge technology and its capability in terms of colorimetry and dynamic range is not unmatched.

The fundamentals of it are desaturated highlights and some fairly tricky multiplicative and divisive stuff (video engineers would call it "matrixing"). This has been much talked about as effectively emulating the subtractive colour model of film in the additive colour model of a digital cinematography camera, but it seems intended to prevent saturation falling off with luminance, at least as much as is practical.
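As a rough illustration of the "desaturated highlights" idea: blend each pixel toward its own luma as luminance rises. The knee position and roll-off curve below are invented for the sketch; only the Rec.709 luma weights are standard, and this is not a claim about Arri's actual matrixing.

```python
import numpy as np

# Standard Rec.709 luma weights for R, G, B.
REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def desaturate_highlights(rgb, knee=0.7):
    y = rgb @ REC709_LUMA                        # per-pixel luma
    # Blend factor: 0 below the knee, ramping to 1 at full scale
    # (a made-up shape standing in for a real roll-off curve).
    t = np.clip((y - knee) / (1.0 - knee), 0.0, 1.0)
    grey = np.stack([y, y, y], axis=-1)
    return rgb * (1 - t)[..., None] + grey * t[..., None]

hot_pixel = np.array([[1.0, 0.8, 0.8]])   # bright, slightly pink
print(desaturate_highlights(hot_pixel))   # channels pulled together
```

Dark and mid-tone pixels pass through unchanged; only pixels above the knee have their channels pulled toward grey, which is why the effect reads as highlights "rolling off" into white rather than clipping while still saturated.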

Most of this is over my head! But that last thing you said seems quite important, as the Alexa has very rich color in the shadows. Looking at the Venice footage, this is something I noticed was lacking.

Yes, there is that saturation-with-luminance thing, emulating film, and it looks great, but that's only in 709. You can get the same from a Sony, but you have to shoot S-Log and add a LUT; there are quite a few "Arri look" LUTs out there. In Log (well, S-Log3) and RAW that wouldn't apply, and as you say, then it's all down to LUTs and/or the skill of the colorist. Arri have been clever to get a very nice out-of-the-box look in 709, whereas Sony seem not to have bothered with that and just stuck with getting the most accurate look.

"Why wouldn't accurate be pleasing.". you want to put Rosco,Tiffen and every colorist out of a job .. :).. from what Ive read.. Arri in 709 mode as luminance goes up.. they do some clever thing with saturation levels... emulating what film does.. in 709 mode

Personally speaking, I think the issue with the Sony look is not about color hues or color saturation as much as it is about color luminance. At least from what I have read, over 75% of the code values assigned to reproducing colors are above middle grey. Sony really wants to produce clean images, so it maps colors to be brighter to get a higher bit depth for them. Darker colors feel richer and more filmic, but with digital cameras they usually produce a noisier image, since the bottom stops have the fewest code values. And you will hear a larger uproar about noise than about color, so Sony didn't map the colors lower. Total speculation, but when I drop the luminance of the colors with a YUV color space node, the Sony look starts to fade.
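The "drop the luminance of the colours in a YUV node" move can be sketched like this. The matrix is the standard full-range BT.709 RGB-to-Y'CbCr one; the 0.85 scale and the chroma gate are placeholder values of mine, not anything from an actual Resolve node:

```python
import numpy as np

# Full-range BT.709 RGB -> Y'CbCr matrix (rows: Y, Cb, Cr).
FWD = np.array([[ 0.2126,  0.7152,  0.0722],
                [-0.1146, -0.3854,  0.5000],
                [ 0.5000, -0.4542, -0.0458]])
INV = np.linalg.inv(FWD)

def darken_colours(rgb, scale=0.85):
    ycc = rgb @ FWD.T
    chroma = np.hypot(ycc[..., 1], ycc[..., 2])
    # Pull luma down only where the pixel actually carries colour;
    # the 0.05 threshold and 0.85 scale are arbitrary placeholders.
    ycc[..., 0] *= np.where(chroma > 0.05, scale, 1.0)
    return ycc @ INV.T

saturated = np.array([[0.9, 0.2, 0.2]])
neutral = np.array([[0.5, 0.5, 0.5]])
print(darken_colours(saturated))  # darker red
print(darken_colours(neutral))    # grey untouched
```

Neutrals have near-zero Cb/Cr, so they pass through unchanged, while saturated pixels get darker, which matches the poster's observation that lowering colour luminance makes the "bright, clean" Sony rendering start to fade.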

There are many tests out there of the Alexa shot side by side with Vision film stock, and they are extremely close. Of all the digital cameras, the Alexa reacts to exposure the most like film. Just look at what happens when you overexpose it: the Alexa fails 'gracefully', like film does.

In the 1990s and early 2000s, Arri had already done an enormous amount of research into digitally quantifying film with the Arri laser film scanner and laser film recorder. I would eat my hat if a lot of that R&D didn't end up in the Alexa color science.

I've color graded a few films now that I shot with Alexa. And... I don't feel that the basic "Alexa" look is really film-like. It looks pretty good by itself, but some technique is involved if one wants it to look more "cinematic" or film-like.
