Hi,
Here are a few practical tips for getting your Gaffer automation off the ground:
Automation using out-of-the-box nodes
1. You can use Python expressions connected to Wedge or other task nodes to build dynamic automation jobs. This approach only works if the job structure (e.g. the number of files to be processed in parallel) is known at dispatch time. For example, you can use a Python expression with os.listdir(quicktime_dir) to drive the Strings plug on a Wedge node, creating a task branch for each quicktime.
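As a sketch, the expression pasted into an Expression node might look like the following. The node name "Wedge", the quicktime_dir variable and the .mov filter are all illustrative; in a real setup you would read the directory from a context variable or a string plug (and the Wedge's mode must be set to Strings):

```python
# Body of a Python Expression node driving the Wedge's "strings" plug
# with one entry per quicktime found at dispatch time.
# `quicktime_dir` is a placeholder for your real source of the path.
import os
import IECore

quicktime_dir = "/path/to/quicktimes"
parent["Wedge"]["strings"] = IECore.StringVectorData(
	sorted( f for f in os.listdir( quicktime_dir ) if f.endswith( ".mov" ) )
)
```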
2. If the number of tasks is not known at dispatch time, you can process tasks in batches. For example, you can create a job with 10 tasks, each processing a share of the quicktimes in parallel (0, 10, 20 for the first task, 1, 11, 21 for the second task, etc). You will not achieve maximum parallelism this way, but it is still a practical solution.
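The round-robin batching above can be sketched in plain Python (makeBatches is a hypothetical helper; inside Gaffer each dispatched task would pick its batch using its wedge or frame index):

```python
def makeBatches( items, batchCount ) :
	"""Split items round-robin into batchCount batches, so with a
	batchCount of 10, batch 0 gets items 0, 10, 20, ... and batch 1
	gets items 1, 11, 21, ..."""
	return [ items[ i :: batchCount ] for i in range( batchCount ) ]

# Each of the 10 dispatched tasks then processes one batch.
batches = makeBatches( [ "qt%02d.mov" % i for i in range( 25 ) ], 10 )
```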
Automation with custom nodes
To get the most out of automation, you should create your own nodes and apps. You are right that there are not many open-source examples of Gaffer task nodes, but Gaffer itself is open source and we constantly use its source code as a source of inspiration for our tools. Most of the task nodes in the GafferDispatch module are Python-based, and they are great examples to learn from when creating your own task nodes.
1. Most of our task nodes are uber-nodes built from other nodes and expressions. Many of them do not actually implement the Task API; they simply connect task plugs. For example, our Arnold render-layer dispatch node is built from:
- 10 Python expressions
- a TaskList
- 5 scene nodes, e.g. StandardOptions, ArnoldOptions, etc
- an Arnold render node
- 5 cs-python commands (wrappers of PythonCommand) to run denoising, post-processing, error-checking, etc
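The plug-connection approach can be sketched roughly as below, to be run inside a Gaffer session. The node names are illustrative, and SystemCommand merely stands in for the real render/denoise/check nodes:

```python
# Session sketch: wire task plugs together instead of implementing
# the Task API yourself. Node names and ordering are illustrative.
import Gaffer
import GafferDispatch

script = Gaffer.ScriptNode()

render = GafferDispatch.SystemCommand( "Render" )
denoise = GafferDispatch.SystemCommand( "Denoise" )
check = GafferDispatch.SystemCommand( "ErrorCheck" )
taskList = GafferDispatch.TaskList( "RenderLayer" )

for node in ( render, denoise, check, taskList ) :
	script.addChild( node )

# Denoise runs after the render, error-checking after denoise...
denoise["preTasks"][0].setInput( render["task"] )
check["preTasks"][0].setInput( denoise["task"] )

# ...and the TaskList gathers the chain into a single output task.
taskList["preTasks"][0].setInput( check["task"] )
```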
Task nodes use a combination of per-frame tasks, sequence tasks and immediate tasks. Immediate tasks are particularly useful for setup work that should be executed once during dispatch.
2. We have a few nodes that override parts of the Task API:
- the hash() function is usually overridden to enable or disable certain tasks programmatically
- execute() or executeSequence() is overridden to change the actual execution of the task, or to add things like progress reporting. Our Nuke command node and our assetised ImageWriter node are examples of nodes that override these functions
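A minimal sketch of the pattern, assuming a Python subclass of GafferDispatch.TaskNode; the class, plug names and behaviour are all hypothetical, and you should cross-check against the Python task nodes in the GafferDispatch source:

```python
import IECore
import Gaffer
import GafferDispatch

class ReportingCommand( GafferDispatch.TaskNode ) :

	def __init__( self, name = "ReportingCommand" ) :

		GafferDispatch.TaskNode.__init__( self, name )

		self["fileName"] = Gaffer.StringPlug()
		self["active"] = Gaffer.BoolPlug( defaultValue = True )

	def hash( self, context ) :

		# Disable the task programmatically : an empty hash tells
		# the dispatcher there is nothing to do.
		if not self["active"].getValue() :
			return IECore.MurmurHash()

		h = GafferDispatch.TaskNode.hash( self, context )
		self["fileName"].hash( h )
		h.append( context.getFrame() )
		return h

	def execute( self ) :

		fileName = self["fileName"].getValue()
		# Progress reporting wrapped around the actual work.
		IECore.msg( IECore.Msg.Level.Info, "ReportingCommand", "Processing %s" % fileName )
		# ... do the actual work here ...

IECore.registerRunTimeTyped( ReportingCommand, typeName = "ReportingCommand" )
```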
3. We wrap the execution of typical DCC commands in custom nodes to make automation templates simpler, e.g. CsNukeCommand or CsMayaCommand.
4. 'Scene globals' are a great way to pass pipeline data down the graph. For example, we rely on scene globals to pass information about render-layers down the graph, so our templates have a single render node that can deal with a variable number of layers and layer setups. I cannot overstate how much our pipeline (including automation) relies on passing data in the globals plug.
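A sketch of the idea, assuming a CustomOptions node publishing hypothetical render-layer data into the globals (the option name and layer values are made up):

```python
# Session sketch: publish render-layer data into the scene globals,
# where any downstream node or expression can read it.
import IECore
import Gaffer
import GafferScene

script = Gaffer.ScriptNode()

options = GafferScene.CustomOptions( "RenderLayers" )
script.addChild( options )
options["options"].addChild(
	Gaffer.NameValuePlug( "user:renderLayers", IECore.StringVectorData( [ "beauty", "shadow" ] ) )
)

# Downstream, an expression can read the globals and react, e.g. :
#   layers = parent["SomeNode"]["out"]["globals"]["option:user:renderLayers"]
```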
5. Please remember that Gaffer supports connections between arbitrary plug types, not just scene, image and task ones. You can pass asset information between nodes using string, stringArray, dict or your own custom data type. For example, a render node can pass asset data to downstream slapcomp nodes through plug connections.
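For instance, a plain string connection between user plugs might be set up like this inside a session; the node and plug names are illustrative, and the same pattern applies to richer data types:

```python
# Session sketch: hand asset data from a render node to a slapcomp
# node through an ordinary string plug connection.
import Gaffer

script = Gaffer.ScriptNode()

render = Gaffer.Node( "Render" )
slapcomp = Gaffer.Node( "Slapcomp" )
script.addChild( render )
script.addChild( slapcomp )

flags = Gaffer.Plug.Flags.Default | Gaffer.Plug.Flags.Dynamic
render["user"]["assetPath"] = Gaffer.StringPlug(
	defaultValue = "/assets/shotA/renders/beauty", flags = flags
)
slapcomp["user"]["assetPath"] = Gaffer.StringPlug( flags = flags )

# The slapcomp now follows whatever the render node publishes.
slapcomp["user"]["assetPath"].setInput( render["user"]["assetPath"] )
```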
6. A more advanced automation pipeline will require passing dynamic data between task nodes using files or databases. There is currently no built-in support for this in Gaffer, although there are plans to add it in the future. At Cinesite, the passing of dynamic data between tasks is file-based; we have a couple of C++ nodes that deal with the nitty-gritty aspects of data management and type conversion.
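In the absence of built-in support, a minimal file-based handoff can be sketched in plain Python (writeTaskData and readTaskData are hypothetical helpers; the real Cinesite nodes are C++ and considerably more involved):

```python
import json
import os

def writeTaskData( path, data ) :
	"""Write a task's output data atomically, so a downstream task
	never observes a half-written file."""
	tmp = path + ".tmp"
	with open( tmp, "w" ) as f :
		json.dump( data, f )
	os.replace( tmp, path )

def readTaskData( path ) :
	"""Read the data a previous task wrote."""
	with open( path ) as f :
		return json.load( f )
```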
7. You might want to consider creating custom Gaffer apps to handle automation. Gaffer provides a number of built-in apps (e.g. gui or execute), and it is quite easy to register your own using GAFFER_APP_PATHS. You can run most automation tasks using the execute or python apps, but custom apps can make the pipeline cleaner.
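A custom app is just a file on GAFFER_APP_PATHS; the sketch below assumes a hypothetical "publish" app (the file layout, parameter and body are illustrative, so compare with the built-in apps shipped with Gaffer):

```python
# apps/publish/publish-1.py - discovered via GAFFER_APP_PATHS and
# run as `gaffer publish`. The class name matches the app name.
import IECore
import Gaffer

class publish( Gaffer.Application ) :

	def __init__( self ) :

		Gaffer.Application.__init__( self, "Runs the publish step of an automation job." )

		self.parameters().addParameters( [
			IECore.FileNameParameter( "script", "The .gfr script to publish.", "" ),
		] )

	def _run( self, args ) :

		# Load args["script"].value, execute the relevant task nodes,
		# register outputs with the asset system, etc.
		return 0  # a non-zero return signals failure to the shell
```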
8. Some pipelines use a custom dispatcher to take full control of dispatching and automation. We rely on the built-in Tractor dispatcher, but we use the preDispatch signal to tweak dispatched jobs. Many of our pipeline tasks rely on self-expanding tasks (tasks that generate other tasks). I do not think Gaffer supports them out of the box, but you can add such functionality in preDispatch hooks.
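A hook of that kind might be wired up in a startup file roughly as follows; the path and the callback body are illustrative, and you should verify the signal signature against your Gaffer version:

```python
# Startup sketch ( e.g. ~/gaffer/startup/dispatch/preDispatch.py ) :
# inspect or tweak every job before it is dispatched.
import GafferDispatch

def __preDispatch( dispatcher, nodes ) :

	# Modify the job here - e.g. append generated tasks for
	# self-expanding workflows, set farm-specific plugs, or
	# validate the graph. Returning True cancels the dispatch.
	return False

GafferDispatch.Dispatcher.preDispatchSignal().connect( __preDispatch, scoped = False )
```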
Regards,
Alex