Maya SceneInterface workflow questions


Nathan Rusch

Dec 5, 2014, 2:58:04 PM
to cort...@googlegroups.com
Hello all,

I'm exploring the possibility of using Cortex as the backend for an animation caching pipeline in Maya (stop me if you've heard this one before...). What we are basically looking to do is take advantage of a deferred/on-demand loading workflow like the one shown in the famous DigiPro 2013 demo (http://vimeo.com/74885942), with the addition of shader assignments. Unfortunately, the Cortex build process is proving to be painfully fragile so far, so before I fight with that any more, I thought I would try to get some basic information to see if Cortex will be able to do what we're after out of the box.

As I understand it, this is the current state of the SceneInterface implementations in Maya (apologies if I cross some terminology up):
  • SceneInterface reader is there and fully functional.
    • Not currently implemented for Alembic caches though.
  • SceneInterface writer is not implemented for Maya.
    • Cortex includes code for a functional Houdini version that could theoretically be adapted to work in Maya without too much trouble.

The remaining questions I have are:

  • Are there existing tools in Cortex for performing shader assignments to objects that exist in caches loaded using the SceneInterface?
    • In other words, shader assignment to cache nodes without Maya node representations.
  • Does the presence of a SceneInterface writer imply a supported storage format as well? Or is it totally abstracted away from the storage backend?
    • I've seen at least one thread mentioning a SceneCache as well, but I haven't really looked into how those two do (or don't) correlate or overlap.

Anyway, I would really appreciate any information anyone would be willing to share on the current state of the functionality I'm looking for in Cortex itself, or anecdotes about workflows built around these components.

Thanks,


-Nathan

Andrew Kaufman

Dec 5, 2014, 8:59:17 PM
to cort...@googlegroups.com
On 12/05/2014 11:58 AM, Nathan Rusch wrote:

As I understand it, this is the current state of the SceneInterface implementations in Maya (apologies if I cross some terminology up):
  • SceneInterface reader is there and fully functional.
    • Not currently implemented for Alembic caches though.
  • SceneInterface writer is not implemented for Maya.
    • Cortex includes code for a functional Houdini version that could theoretically be adapted to work in Maya without too much trouble.

You've got it right so far. We really should get a public Maya exporter, but unfortunately we're pretty slammed with production stuff for the time being, and it's hard to divert resources since we've got a functioning exporter internally. The Houdini exporter basically just uses IECoreHoudini::LiveScene to describe the hierarchy, and iterates through the children, writing the attributes and objects as it goes. IECoreHoudini::LiveScene is just another SceneInterface, one that doesn't load data from disk, but rather converts it live in memory from Houdini. There is already an equivalent IECoreMaya::LiveScene which does the same in Maya.
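
To make that exporter pattern concrete, here is a minimal sketch of the describe-and-recurse idea. Everything below is invented for illustration — `MiniScene` is a toy in-memory class, not a Cortex type — but the method names (`childNames`, `child`, `createChild`) deliberately echo the SceneInterface API described above:

```python
# Toy stand-in for a SceneInterface location, so the recursive
# exporter pattern can be shown without a Cortex build. Only the
# method names echo the real API; the class itself is invented here.
class MiniScene:
    def __init__(self, name="/"):
        self.name = name
        self.attributes = {}
        self.object = None
        self._children = {}

    def childNames(self):
        return list(self._children)

    def child(self, name):
        return self._children[name]

    def createChild(self, name):
        c = MiniScene(name)
        self._children[name] = c
        return c

def copyScene(inScene, outScene):
    # Depth-first: write this location's attributes and object,
    # then recurse into the children, creating them as we go.
    outScene.attributes.update(inScene.attributes)
    outScene.object = inScene.object
    for childName in inScene.childNames():
        copyScene(inScene.child(childName), outScene.createChild(childName))
```

A real exporter would call `readAttribute()` / `writeAttribute()` and `readObject()` / `writeObject()` (with a time argument) rather than plain assignment, but the traversal shape is the same.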

The Alembic support would be great as well, and not too difficult, but again, since we don't use Alembic in production it's hard to justify for us until we get some downtime.



  • Does the presence of a SceneInterface writer imply a supported storage format as well? Or is it totally abstracted away from the storage backend?
    • I've seen at least one thread mentioning a SceneCache as well, but I haven't really looked into how those two do (or don't) correlate or overlap.

There isn't necessarily a storage format involved with a SceneInterface at all. Both the Maya and Houdini LiveScene implementations operate entirely in memory, querying the host application DAG and converting geometry and attributes to Cortex equivalents on the fly. However, both of those are read-only at present; the write methods throw exceptions.

SceneCache (.scc) is one example of a SceneInterface implementation which stores data in a file, as is LinkedScene (.lscc), the difference being that the latter can store hard links to external SceneInterface files (of any supported extension). The Alembic support will come by implementing a SceneInterface class in IECoreAlembic, at which point a .lscc could store links to .abc files as well, and the Maya and Houdini nodes could read .abc.



  • Are there existing tools in Cortex for performing shader assignments to objects that exist in caches loaded using the SceneInterface?
    • In other words, shader assignment to cache nodes without Maya node representations.

You can assign an IECore::Shader object as an attribute at any level in the SceneInterface hierarchy. At Image Engine we make shader assignments via Gaffer graphs, and we have Gaffer hooked into Maya using a proprietary package called Caribou. The Maya hierarchy flows from Maya into Gaffer using IECoreMaya::LiveScene. We store the shader assignments separately from the anim caches, and hook them together later on using automated asset management tools. We demoed this workflow at the Gaffer BOF at Siggraph 2014, but haven't managed to get the videos online yet... see above about being slammed with production at the minute...

If you want to generate those IECore::Shaders during your anim export, without having any Maya equivalent, that should be possible by using IECoreMaya::LiveScene::registerCustomAttributes (also bound to python if you prefer). That allows you to write custom functions to tell IECoreMaya::LiveScene about SceneInterface attributes that it wouldn't otherwise consider. Using that, your exporter can stay pretty basic: outScene->writeAttribute( liveScene->readAttribute() ), and if you ever decide to go for fancier live workflows like Caribou, you'll already have your shaders flowing through. You'd need to know how to create the IECore::Shader objects in the first place, and which levels of the hierarchy to attach them to. I'll assume you have an idea how you want to do that already since you're not interested in representing the data in Maya though? Actually, re-reading your question, I'm not entirely sure I've understood. What sort of shaders are you trying to assign and what renderer are you using?
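
As a sketch of that idea (hedged: the lookup table and attribute name below are hypothetical, and the exact `registerCustomAttributes` signature should be checked against the IECoreMaya docs for your Cortex version — the registration call itself needs a Maya session, so it is left commented out):

```python
# Hypothetical shader-assignment lookup, standing in for whatever
# asset-management or Maya query you would really perform.
_shaderAssignments = {
    "|root|geo": {"shaderName": "myShader", "parameters": {"Kd": 0.8}},
}

def customAttributeNames(dagPath):
    # Advertise our attribute only at locations that have an assignment.
    if str(dagPath) in _shaderAssignments:
        return ["user:shaderAssignment"]
    return []

def readCustomAttribute(dagPath, attrName):
    # Return the data for the attribute we advertised above.
    if attrName == "user:shaderAssignment":
        return _shaderAssignments.get(str(dagPath))
    return None

# Inside Maya, these would be handed to LiveScene (hypothetical call --
# check the exact registration signature for your Cortex version):
# IECoreMaya.LiveScene.registerCustomAttributes(
#     customAttributeNames, readCustomAttribute )
```

With something like this registered, the basic `writeAttribute( readAttribute() )` exporter loop picks the shaders up for free.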



Anyway, I would really appreciate any information anyone would be willing to share on the current state of the functionality I'm looking for in Cortex itself, or anecdotes about workflows built around these components.


I guess I kind of addressed some of that above. Let us know if you have other specific questions. We'll try to get those videos online eventually...

Cheers,
Andrew

Chad Dombrova

Dec 8, 2014, 6:44:28 PM
to cort...@googlegroups.com
Hi Andrew,
Thanks for the detailed response.  

I’m trying to get a grasp on how much development will be needed in order to achieve our basic goals, but there’s a lot of code to dig through and only so much that can be gleaned from tinkering around, so pardon the barrage of questions.

Big picture, we’re trying to achieve the following:
- open Gaffer in Maya
- load a live Maya scene into a Gaffer graph
- load an animated geometry cache
- use Gaffer to assign arnold shaders
- render

I’ve got some questions about each of these:

1. open Gaffer within Maya:
Gaffer and Maya are both written using Qt, so I imagine that the actual loading of the GUI within Maya is not too difficult (compilation issues and library dependencies aside).
- What is the role of Caribou in hooking Gaffer into Maya? 

2. operate on a live Maya scene within a Gaffer graph
- I know that the maya scene graph is brought into the cortex universe by IECoreMaya::LiveScene (aka IECoreMaya::MayaScene), but I’m not quite sure how this is represented in Gaffer. Do you load the live scene into Gaffer using a node, like you would for a geo cache?
- Will the Gaffer graph automatically recompute when an attribute changes in Maya?

3. load an animated geometry cache
We’re happy to put off using Alembic and start with cortex’s native cache formats.  Our goal is to visualize the cache in openGL (preferably within the Maya viewport) but not load any geometry data into Maya.
- How would one go about loading an .scc into Maya / Gaffer?  Is the Op for this included in the source?
- From the videos, it appears that when geometry is loaded within Gaffer it is only possible to get a bounding box representation within Maya’s viewport.  Is that the case?  Are your artists using the Gaffer openGL viewport instead of the Maya viewport?
- The DigiPro 2013 video showed some pretty deep Maya integration (right-click menus, dynamically expanding/contracting children in outliner, etc).  Is any of that included in the Cortex source?  Is that workflow superseded by the Gaffer integration?

4. use Gaffer to assign arnold shaders
- Are there any included operators for doing shader assignment?
- Does cortex/gaffer do the work of inspecting the arnold universe to detect what arnold shaders are available, and present those within the GUI?

Thanks,
chad






Nathan Rusch

Dec 8, 2014, 7:20:20 PM
to cort...@googlegroups.com, and...@image-engine.com
Thanks Andrew. I don't want to pile too much more onto Chad's questions, but I was wondering if you could shed some light on a very specific usage question.

I managed to get Cortex to build, and I've been trying to test out the SceneCache reader in Maya, but I'm not sure what all of the required steps are. I can create a SceneCache instance from the example file I have, but I don't know how to expose anything from it to Maya's UI. Here are the steps I've assembled purely from some guesswork and poking around:

- Create the SceneCache from a .scc file.
- Create an OpHolderNode in Maya
- Get an FnOpHolder interface pointing at the OpHolderNode
- Call FnOpHolder.setParameterised and hand it the SceneCache instance

In Python:

import pymel.core as pm
import IECore
import IECoreMaya

ophNode = pm.createNode('ieOpHolderNode', name='opHolderTest')
sc = IECore.SceneInterface.create('/path/to/sceneCache.scc', IECore.IndexedIO.Read)
oph = IECoreMaya.FnOpHolder('opHolderTest')
oph.setParameterised(sc)

The last call crashes Maya (in IECoreMaya::ParameterisedHolder<MPxNode>::createAndRemoveAttributes).

So, my question is, is this even remotely close to the right approach for trying to expose a SceneCache scene as a Maya node (a la the DigiPro example)? I have no idea if I'm even doing the right thing(s), as there is a dearth of information on this kind of thing.

Thanks,


-Nathan

Andrew Kaufman

Dec 8, 2014, 7:52:35 PM
to Nathan Rusch, cort...@googlegroups.com
I'll answer your question before Chad's because it requires a bit less explanation. A SceneInterface isn't a Parameterised object, so you can't use the ieOpHolderNode node to represent it in Maya. That node is only for storing Ops, and it's crashing because it's being dodgy, assuming you gave it an Op without double-checking.

David just sent the python necessary to load a SceneInterface in Maya, but I'll paste it again to be thorough:

    import maya.cmds
    import IECoreMaya

    fn = IECoreMaya.FnSceneShape.createShape( "myscene" )
    maya.cmds.setAttr( fn.fullPathName() + '.file', 'myscenefile.scc', type='string' )


That will make you an ieSceneShape node, which has a "file" attribute that you can set through the normal Maya mechanisms. The create method also returns you the function set (or you can instantiate one later on of course). FnSceneShape has several methods for expanding, collapsing, converting to geo, etc, which are the functions driven by the context menus in the viewport.

Andrew

PS.
I just realized we haven't posted the Doxygen API docs in ages... I'll try to get those updated publicly. If you look in your install dir, you should find "doc/html/index.html" and from there you can find the Op class and see its class inheritance.

Andrew Kaufman

Dec 8, 2014, 8:38:36 PM
to cort...@googlegroups.com
On 12/08/2014 03:44 PM, Chad Dombrova wrote:
Big picture, we’re trying to achieve the following:
- open Gaffer in Maya
- load a live Maya scene into a Gaffer graph
- load an animated geometry cache
- use Gaffer to assign arnold shaders
- render


This should all be possible, and is in fact exactly how we are running our Look Dev and Lighting departments, except that we're rendering with 3delight rather than Arnold.



1. open Gaffer within Maya:
Gaffer and Maya are both written using Qt, so I imagine that the actual loading of the GUI within Maya is not too difficult (compilation issues and library dependencies aside).
- What is the role of Caribou in hooking Gaffer into Maya? 


This is where we need to distinguish between Gaffer: The Framework and Gaffer: The Lighting/Rendering Application. At its core, Gaffer is a node graph framework with a Qt UI for editing node graphs. The bits and bobs can be built into various Gaffer.Applications, several of which ship with the Gaffer binaries (in the apps folder). One of those is called "gui" and is a standalone lighting/rendering app. Caribou is a separate (proprietary) Application, which handles the flow of data between Maya and Gaffer.

The Gaffer framework loads in Maya out of the box (assuming you've matched dependencies, etc). Just get it on your PYTHONPATH and "import Gaffer" or "import GafferUI" or "import GafferScene", etc. You can also start Gaffer's event loop within Maya out of the box: "GafferUI.EventLoop.mainEventLoop().start()"

Caribou is a proprietary package, which can link Gaffer to host DCCs. In Maya, this integration stores a Gaffer script on a Maya node. It provides methods to launch a Maya panel that embeds a GafferUI.ScriptWindow, and it takes responsibility for keeping that Gaffer.ScriptNode alive and managing callbacks between Maya and Gaffer for syncing selection, undo queues, dirtying things, etc. In the future, we'll likely use it to do something similar in Houdini as well.



2. operate on a live Maya scene within a Gaffer graph
- I know that the maya scene graph is brought into the cortex universe by IECoreMaya::LiveScene (aka IECoreMaya::MayaScene), but I’m not quite sure how this is represented in Gaffer. Do you load the live scene into Gaffer using a node, like you would for a geo cache? 

Yes, Caribou defines a node from which the Maya scene flows in, just like you would load a geo cache from disk.


- Will the Gaffer graph automatically recompute when an attribute changes in Maya?


The recomputation depends on how your node is implemented, but assuming you didn't try to prevent any recomputation, and your node simply outputs the result of IECoreMaya::LiveScene, then yes, changes in Maya would be immediately reflected next time the Gaffer node computes. That's because LiveScene queries the DAG live, so any changes you'd get from normal Maya API calls should be reflected. Note that I said "next time the Gaffer node computes" though. You'd need some custom Maya integration to trigger signals on the Gaffer end to tell it that things have changed, otherwise Gaffer's internal caching mechanism won't look to upstream nodes for updates, and your MayaInput node won't be asked to recompute. At Image Engine, we use Caribou to handle all of that integration.
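
That pull-based caching behaviour can be illustrated with a toy example (nothing here is Gaffer API — the class is invented purely to show the behaviour): a node that caches its last result keeps returning stale data until something explicitly marks it dirty, which is the role the Caribou callbacks play.

```python
class CachingNode:
    """Toy pull-model node: recomputes only after an explicit dirty signal."""
    def __init__(self, computeFn):
        self._computeFn = computeFn
        self._cache = None
        self._dirty = True

    def setDirty(self):
        # The signal a Maya callback (a la Caribou) would fire on scene edits.
        self._dirty = True

    def value(self):
        if self._dirty:
            self._cache = self._computeFn()
            self._dirty = False
        return self._cache


# The "Maya scene" here is just a dict the compute function reads live.
liveScene = {"translateX": 1.0}
node = CachingNode(lambda: liveScene["translateX"])

node.value()                   # computes: 1.0
liveScene["translateX"] = 2.0
node.value()                   # still 1.0 -- nothing told the node it's dirty
node.setDirty()
node.value()                   # now recomputes: 2.0
```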



3. load an animated geometry cache
We’re happy to put off using Alembic and start with cortex’s native cache formats.  Our goal is to visualize the cache in openGL (preferably within the Maya viewport) but not load any geometry data into Maya.
- How would one go about loading an .scc into Maya / Gaffer?  Is the Op for this included in the source?

You don't need an Op. See the response to Nathan for the necessary python code to load a .scc in Maya. The gist is to use IECoreMaya.FnSceneShape.create(). To load a .scc in Gaffer, just use a GafferScene.SceneReader node.



- From the videos, it appears that when geometry is loaded within Gaffer it is only possible to get a bounding box representation within Maya’s viewport.  Is that the case?  Are your artists using the Gaffer openGL viewport instead of the Maya viewport?

No, that's not the case; that was just a demo to show that you don't need to have any more than a bounding box in Maya, and you can still see it all in Gaffer. The ieSceneShape node provided by IECoreMaya has several Preview options to draw bounds, full geo (as OpenGL only), or convert to live Maya geo. The viewports in Maya and Gaffer are completely independent (unless you want to link them). Our artists are currently using both viewports. Most object viewing and manipulation happens in the Maya viewport, but they have the Gaffer one there for verification and certain overlays. We have the selection synced between the two viewports as well. We also have a working prototype which redraws the Gaffer viewport on top of the Maya viewport, though I don't know if that workflow has become popular with artists as of yet.



- The DigiPro 2013 video showed some pretty deep Maya integration (right-click menus, dynamically expanding/contracting children in outliner, etc).  Is any of that included in the Cortex source?  Is that workflow superseded by the Gaffer integration?


Most of that automatic expansion/collapsing from RMB menus in Maya is indeed included in the Cortex source. A small portion of it was added via our (proprietary) Asset Management system (Jabuka), such as "Expand to Asset". You can add your own advanced functions to that RMB menu using IECoreMaya.SceneShapeUI.addDagMenuCallback().



4. use Gaffer to assign arnold shaders
- Are there any included operators for doing shader assignment?

The GafferScene.ShaderAssignment node can be used to assign shaders to the specified paths in your hierarchy. You can double-check you've assigned the right things using the SceneInspector pane.
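
The path-based idea behind that kind of assignment can be shown generically — this toy helper is invented here (it is not Gaffer's PathFilter/ShaderAssignment API) and just illustrates the match-paths-to-shaders concept:

```python
import fnmatch

def assignShaders(paths, rules):
    """Toy path-based assignment (not Gaffer API): map each scene path
    to the first rule whose wildcard pattern matches it."""
    result = {}
    for path in paths:
        for pattern, shader in rules:
            if fnmatch.fnmatchcase(path, pattern):
                result[path] = shader
                break  # first matching rule wins
    return result

paths = ["/root/geo/sphere", "/root/geo/cube", "/root/lights/key"]
rules = [("/root/geo/*", "plasticShader"), ("/root/lights/*", "lightShader")]
assignments = assignShaders(paths, rules)
```

In Gaffer the filtering is done per-node with filters rather than one global table, but the mental model of "patterns select hierarchy paths, assignments attach shaders to them" is the same.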


- Does cortex/gaffer do the work of inspecting the arnold universe to detect what arnold shaders are available, and present those within the GUI?

Any Arnold shader that Arnold can see should automatically become a node in Gaffer. These nodes get added to your Gaffer.Application using GafferArnoldUI.ShaderMenu.appendShaders(), which you can see an example of in the "gui" app startup scripts. If you want to control the look of plugs on the shaders, you can use Gaffer's shader metadata syntax (though I don't know if that has kept quite up to par with the GafferRenderMan side of things).


Cheers,
Andrew

Chad Dombrova

Dec 8, 2014, 9:27:31 PM
to cort...@googlegroups.com
Hi Andrew,
Thanks again for the explanation.  I feel like I'm making some mental progress. 

The natural next question is:  are you planning on open-sourcing Caribou?  I feel a bit guilty asking this, when you all have already provided so much for free, but it sounds like a critical component, particularly the callbacks that trigger gaffer to recompute.  Without Caribou, Maya+Gaffer seems like a car without a transmission*.  


-chad


* analogy credit: Nathan Rusch

Nathan Rusch

Dec 8, 2014, 10:06:03 PM
to cort...@googlegroups.com, natha...@gmail.com, and...@image-engine.com
Of course I overcomplicated things... thanks for the example. Everything seems to work well. And yeah, I was going to mention somewhere that the Doxygen link on the GitHub wiki is broken.

-Nathan

John Haddon

Dec 9, 2014, 4:55:10 AM
to cort...@googlegroups.com
Hello, and thanks for your interest...

As far as open sourcing Caribou goes, I'm afraid we have no plans for that. To stretch your analogy a bit, our main interest in open source is to collaborate with like-minded mechanics, building a common set of reusable car parts, rather than to give away ready-built cars. We don't really have the resources for the latter, and I think it's only common sense to keep some of the more innovative bits to ourselves, at least in the short term.

It might be worth mentioning that we've approached the Maya/Gaffer integration work fairly cautiously, having used Gaffer in smaller roles on previous shows before jumping into the full Caribou integration. Our first step was to use it for defining shader networks in lookdev, but those networks were still being assigned via our old Cortex procedurals. Next, we extended those procedurals so that, as a final step, they pulled attribute state from a Gaffer graph, so you could use Gaffer for shader and attribute assignments. That is a much simpler integration, and might be something you could consider as a halfway house too.

Cheers...
John

p.s. I should clarify that my personal goal for Gaffer itself is most definitely for it to be a car as much as it is a bunch of components, but that's somewhat separate from our use of it at Image Engine.




Chad Dombrova

Dec 10, 2014, 1:11:45 AM
to cort...@googlegroups.com
Thanks, John. We are going to keep poking around and see how far we can get with what's included. 


Sent from Mailbox