Loading in caches in a production/robust way


Alex Fuller

Apr 15, 2025, 1:08:08 AM
to gaffer-dev
Hi all,

In the wonderful world of loading scene caches into Gaffer, I'm finding that out of the box it has quite a few limitations unless you're willing to roll your sleeves up and write custom code. I'll start with a bit of a brain-dump to paint the picture.

So outside of using an external tool to build your scene for you (eg. Houdini Solaris), the general approach is to bring in Alembics/VDBs/USDs (.sccs too, though usually not for outside users/studios), using convenience nodes like `CollectScenes` to wrap up a lot of this without delving into C++-land (eg. drive the CollectScenes node with an expression that loads a JSON file or queries a DB for which files to load in).
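
A minimal sketch of that expression idea (the JSON schema, paths, and node names are made up):

```python
# Python expression on a CollectScenes node, reading a hypothetical layout
# JSON of the form { "caches" : { "<rootName>" : "/path/to/cache.abc", ... } }.
import json
import IECore

with open( "/path/to/layout.json" ) as f :  # made-up path
	layout = json.load( f )

parent["CollectScenes"]["rootNames"] = IECore.StringVectorData( sorted( layout["caches"].keys() ) )

# A second expression on the SceneReader upstream of CollectScenes can then
# pick the file per root, via the context variable CollectScenes creates:
#   parent["SceneReader"]["fileName"] = layout["caches"][ context["collect:rootName"] ]
```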

Or the other approach is to create an in-memory OpenUSD stage in an expression, do your operations in pure OpenUSD, and side-load it into the SceneReader by giving the fileName plug a specific string that references the in-memory stage. While I commend the thought process behind this approach, I *think* this is ultimately a massive hack and not something to use in a serious production sense...
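
For completeness, the mechanism looks something like this in an expression (a minimal sketch; the layer path is made up, and I'm going from memory on the `stageCache:<id>.usd` fileName convention that Cortex's USD reader understands):

```python
# Build a stage in memory, park it in the shared UsdUtils stage cache, then
# point the SceneReader at it via the cache id.
from pxr import Usd, UsdUtils

stage = Usd.Stage.CreateInMemory()
stage.GetRootLayer().subLayerPaths.append( "/path/to/shot_layout.usda" )  # made-up path
# ...any other pure-OpenUSD edits here...

stageCacheId = UsdUtils.StageCache.Get().Insert( stage )
parent["SceneReader"]["fileName"] = "stageCache:%d.usd" % stageCacheId.ToLongInt()
```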

Now I'll go over what currently exists at the studio, and what it plans to move towards. For this post I'll just focus on solving the legacy approach, as those scenes need to open without too much fuss and with minimal external conversion logic, and the 'new' way is still far enough away from the present...

Legacy:
  • A .json file describes which Alembic assets exist in a scene, along with their instanced transforms
  • An asset loader that takes the above file and also splices in lookdev files, FX elements, and procedurals (.klf, additional .abcs, VDBs, Yeti hair, etc.)
New:
  • Everything OpenUSD more or less
  • Custom asset resolver for OpenUSD that talks to shotgrid/flow for 'ground truth' publishes

The `CollectScenes` approach is probably the more sensible way, as it also gives you additional control and visibility over what is coming into the scene (and lets you manually swap in your own version of a cache, or omit caches entirely). However, for lookdev published from something like a `USDLayerWriter`, I can't really see a way to layer it back on when using this approach, as the caches are 'Cortex/Gaffer-native' at that stage. The other thought was to 'side-car' an empty .scc file containing only the hierarchy and the shader assignments, load that in, and do a `MergeScenes`; but I'd like to stick to a USD-native way and not introduce .scc files...

I've had some thoughts over the latter, but it always seems to lead towards custom C++ code: eg. duplicate the Gaffer SceneReader node and make it a mini USD stage builder of sorts, with a flashy UI to pick and choose which caches go into the USD stage file to then load in, including the lookfile (a .usda layer). The other thought was a custom SceneInterface that holds an IECoreUSD::USDScene: it would load this JSON file + lookfiles + FX elements etc., convert them to an in-memory USD stage, and pass that directly to the USDScene as an internal member (with a bunch of m_usdScene-> pimpl redirects). That's a black box though, as SceneReader is too basic here to fine-tune with just a `fileName` string...

Another thought: a `MergeScenes`-like node that takes in a lookdev USD layer file might be the way to go, but I think that's tricky, as the incoming scene is already Cortex/Gaffer-native and the .usda would, I guess, have to be interpreted as a complete scene and not a layer 'over', so it'd be purely based on hierarchy-matching, which *should* be ok...

Anyway, this is getting quite lengthy, and I'd like to hear the input others have on this. Ultimately I think I'll need to write something; I just want to be smart about it.

Cheers

rupert thorpe

Apr 15, 2025, 5:37:19 AM
to gaffe...@googlegroups.com
Hi,
some thoughts:  
I think the "everything must be USD" approach to pipeline is not a good fit for a small company.
it works well (I imagine) when a pipeline can be totally pre-planned, and there is enough technical/developer people that every department has a dedicated person (or more) to make things go smoothly.

In smaller companies, where the pipeline/workflow evolves over time and less technical people end up needing to solve problems, it can be quite problematic.
You might find that you need to re-cache already-cached geometry, for example, or come across strange/subtle issues that take multiple days to debug.

My approach has been to use USD for geometry in a similar way to an old-school .abc pipeline, which in my opinion gives you almost all the advantages of the format (eg: no need for .json files), while avoiding most of the pain.
I essentially created versioned 'usd bundles' per shot for animation, and had 'usd override' files per seq/shot for environment.
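
(A bundle here is nothing fancy; just a layer referencing the versioned caches for the shot, roughly like the sketch below, with invented paths and prim names.)

```python
# A per-shot 'usd bundle': one layer that references the versioned animation
# caches published for the shot.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew( "sh010_anim_v003.usd" )
root = UsdGeom.Xform.Define( stage, "/sh010" )
stage.SetDefaultPrim( root.GetPrim() )

for name, cache in [
	( "hero", "hero_anim_v003.usd" ),
	( "chair", "chair_anim_v001.usd" ),
] :
	prim = stage.DefinePrim( "/sh010/" + name, "Xform" )
	# Assumes each cache has a defaultPrim set at publish time.
	prim.GetReferences().AddReference( cache )

stage.GetRootLayer().Save()
```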

Shaders are still mostly renderer-specific, so I don't see a problem with having a renderer-specific way of dealing with them.

Regarding the "Custom asset resolver for OpenUSD that talks to shotgrid/flow for 'ground truth' publishes": I would caution that trying to pull a lot of data from ShotGrid can be very slow.
This might not be a great user experience if there are lots of assets in a scene.

As far as dealing with your legacy data goes, it should be fairly easy to write a script which converts your existing .json layout files directly to .usd files. You will then end up with a USD file for every .json file (I suggest just putting them next to each other), with the advantage that the USD can be read directly into DCCs.
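
A minimal sketch of such a converter, with an invented JSON schema of the form { "assets" : [ { "name" : ..., "file" : ..., "instances" : [ [ 16 floats ], ... ] } ] }:

```python
# Convert a legacy .json layout file to a .usd layout file: one Xform per
# instance, referencing the Alembic cache directly (requires USD's Alembic
# file format plugin).
import json
import sys

from pxr import Usd, UsdGeom, Gf

def convertLayout( jsonPath, usdPath ) :

	with open( jsonPath ) as f :
		layout = json.load( f )

	stage = Usd.Stage.CreateNew( usdPath )
	root = UsdGeom.Xform.Define( stage, "/layout" )
	stage.SetDefaultPrim( root.GetPrim() )

	for asset in layout["assets"] :
		for i, matrix in enumerate( asset["instances"] ) :
			xform = UsdGeom.Xform.Define( stage, "/layout/%s_%03d" % ( asset["name"], i ) )
			xform.AddTransformOp().Set( Gf.Matrix4d( *matrix ) )
			xform.GetPrim().GetReferences().AddReference( asset["file"] )

	stage.GetRootLayer().Save()

if __name__ == "__main__" :
	convertLayout( sys.argv[1], sys.argv[2] )
```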

Without knowing how your current shader assignment works it's difficult to know how easy this part will be, but if it is name-based assignment rather than Maya referencing then you should be ok. I think a name-based approach is best.
In the past I have created Arnold operator-based lookdevs (which used name-based assignments) and then converted them to Gaffer with a script. It's not exactly an elegant approach, but it did work, and allowed rendering of the assets in any DCC that had Arnold.
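
In Gaffer terms, the converted assignments end up as this kind of wiring (illustrative only; the pattern and shader hookup are made up):

```python
# Name-based shader assignment: a PathFilter with wildcard patterns driving a
# ShaderAssignment, independent of which DCC the cache came from.
import IECore
import GafferScene

bodyFilter = GafferScene.PathFilter()
bodyFilter["paths"].setValue( IECore.StringVectorData( [ "/.../*_body*" ] ) )  # '...' matches any depth

assignment = GafferScene.ShaderAssignment()
assignment["filter"].setInput( bodyFilter["out"] )
# assignment["in"] connects to the incoming scene, and assignment["shader"]
# to eg. an ArnoldShader's "out" plug.
```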

R



Daniel Dresser

Apr 15, 2025, 9:06:36 PM
to gaffer-dev
The overall philosophy of this stuff gets pretty subjective in a hurry, so I'm not sure how valuable my input is on that ( my personal opinion is quite similar to R's, though I would say the higher level USD stuff isn't a good fit for a small company or a big company. I agree the simple parts of USD work fine as a replacement for abc ).

On the specifics, you said "The other thought was to 'side-car' an empty .scc file containing only the hierarchy and the shader assignments, load that in, and do a `MergeScenes`; but I'd like to stick to a USD-native way and not introduce .scc files...".

I don't see why this would require .scc? Why not publish a USD file with just hierarchy and shader assignments, load it as a regular scene ( not a special USD layer or anything ) and use Gaffer nodes like MergeScenes to apply that? I think some of these processes at Image Engine still use .scc for legacy reasons and because it still works, but I would think USD could store this data just as well?
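
A minimal, untested sketch of that wiring, with made-up file paths ( and I'm going from memory on the mode plug names ):

```python
# Merge a lookdev-only USD ( hierarchy + shader assignments ) over the
# geometry cache, matching locations by name.
import Gaffer
import GafferScene

script = Gaffer.ScriptNode()

script["geo"] = GafferScene.SceneReader()
script["geo"]["fileName"].setValue( "/jobs/show/asset_geo.usd" )    # made-up path

script["look"] = GafferScene.SceneReader()
script["look"]["fileName"].setValue( "/jobs/show/asset_look.usda" ) # made-up path

script["merge"] = GafferScene.MergeScenes()
script["merge"]["in"][0].setInput( script["geo"]["out"] )
script["merge"]["in"][1].setInput( script["look"]["out"] )
# Take attributes ( ie. shader assignments ) from the lookdev input wherever
# its locations match the geometry's.
script["merge"]["attributesMode"].setValue( int( GafferScene.MergeScenes.Mode.Merge ) )
```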

-Daniel

Nico Rehberg

Apr 24, 2025, 8:10:36 AM
to gaffer-dev
Or the other approach is to create an in-memory OpenUSD stage in an expression, do your operations in pure OpenUSD, and side-load it into the SceneReader by giving the fileName plug a specific string that references the in-memory stage. While I commend the thought process behind this approach, I *think* this is ultimately a massive hack and not something to use in a serious production sense...

Hi,
out of curiosity I wonder if this opinion about the in-memory USD assembly is shared by others too.

I am actually using it at the moment for layering the materials onto the cache at a per-asset level, plus some other tweaks, and it feels pretty powerful to have access to the whole USD data.
I haven't used it on bigger scenes or for the whole shot assembly yet, but the thought crossed my mind: why write and handle on-disk USD shot files when I can do it "live" in Gaffer?

Is it really just a hack and should I stay away from it?

Cheers, Nico

John Haddon

Apr 24, 2025, 11:34:48 AM
to gaffe...@googlegroups.com
On Thu, Apr 24, 2025 at 1:10 PM Nico Rehberg <nicor...@gmail.com> wrote:
Or the other approach is to create an in-memory OpenUSD stage in an expression, do your operations in pure OpenUSD, and side-load it into the SceneReader by giving the fileName plug a specific string that references the in-memory stage. While I commend the thought process behind this approach, I *think* this is ultimately a massive hack and not something to use in a serious production sense...

out of curiosity I wonder if this opinion about the in-memory USD assembly is shared by others too.

I am actually using it at the moment for layering the materials onto the cache at a per-asset level, plus some other tweaks, and it feels pretty powerful to have access to the whole USD data.
I haven't used it on bigger scenes or for the whole shot assembly yet, but the thought crossed my mind: why write and handle on-disk USD shot files when I can do it "live" in Gaffer?

Is it really just a hack and should I stay away from it?

I can tell you that at least one large Gaffer/USD production pipeline is using it successfully, and I don't have any qualms about other folks doing the same. Ultimately it might be preferable for Gaffer to provide a more user-friendly workflow for some of this stuff, but in the meantime I wouldn't discourage anyone from using the stage cache approach.

Cheers...
John

