High disk-space cost of hard normals in Alembic


Jordan Hueckstae

unread,
Feb 14, 2013, 3:29:02 PM2/14/13
to alembic-d...@googlegroups.com
I’ve run into this issue exporting an Alembic cache on a character with a large number of deforming meshes with hard normals.  For our production asset, the normal data more than quadruples the size of the cache for a 50-frame sequence.

The initial issue of how to store Maya’s hard normals in Alembic was first discussed here:
http://code.google.com/p/alembic/issues/detail?id=211

The resolution of this discussion was to have the Alembic exporter save out explicit normals if hard edges (or locked normals) were present in a mesh.  This solves the issue while maintaining lossless normals across applications, but it also has the very high disk-space cost of writing out per-vertex per-face normal information for each sample.  

Alembic currently calculates “soft” normals on the fly when no normal data is present.  It would be a huge boon if Alembic stored a static array of edge or face-vertex booleans that define which vertices are “hard”.  The normals of these vertices can be calculated on the fly from the face normals, thereby saving a huge amount of disk space (static per-edge booleans versus a per-sample, per-face-vertex triplet of doubles).


Ben Houston

unread,
Feb 14, 2013, 3:39:17 PM2/14/13
to alembic-d...@googlegroups.com
We are running into the same issue with 3DS Max, as it uses "smoothing groups" for its normal calculation, which don't really translate into other tools. Thus we are forced to save out hard normals in some cases, and it is incredibly costly.
-ben



--
Best regards,
Ben Houston
CTO, Exocortex Technologies, Inc.
http://www.exocortex.com

Lucas Miller

unread,
Feb 14, 2013, 3:40:29 PM2/14/13
to alembic-d...@googlegroups.com
Alembic doesn't calculate normals on the fly; it either provides them to the application, or lets the application choose how to calculate them.

Hard vs. soft normals is an application-specific notion that doesn't really belong in Alembic.

Lucas


Lucas Miller

unread,
Feb 14, 2013, 3:46:02 PM2/14/13
to alembic-d...@googlegroups.com
One potential way to make it less costly is to create an indexed geometry parameter.
It will only help you if the same normal value is repeated over and over amongst your face-varying points.
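
For what it's worth, a minimal sketch of what writing such an indexed normals sample could look like with the AbcGeom C++ API; the deduplication of repeated values into vals/indices is assumed to have happened already, and the function name is just for illustration:

// Sketch only: "vals" holds the unique normal values, "indices" holds one
// index per face-vertex referencing into vals.
#include <Alembic/AbcGeom/All.h>
#include <vector>

using namespace Alembic::AbcGeom;

void setIndexedNormals(OPolyMeshSchema::Sample& sample,
                       const std::vector<N3f>& vals,
                       const std::vector<Alembic::Util::uint32_t>& indices)
{
    // The array samples only reference the vectors, so vals and indices must
    // stay alive until the schema sample is actually written.
    ON3fGeomParam::Sample normals(N3fArraySample(vals),
                                  UInt32ArraySample(indices),
                                  kFacevaryingScope);
    sample.setNormals(normals);
}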

Lucas

Ben Houston

unread,
Feb 14, 2013, 3:53:02 PM2/14/13
to alembic-d...@googlegroups.com
We do save out proper indexed normals, but the savings are very mild. It is still huge (at least 3 floats per normal-vertex per frame), whereas with "smoothing groups" (one int32 per face) they are not animated at all.

Maybe one just needs to determine a means to map these non-standard smoothing methods onto the subdivision crease specifications or something like that. I wonder if it can be done perfectly and without any loss of fidelity.

Best regards,
-ben

Jordan Hueckstae

unread,
Feb 14, 2013, 8:11:12 PM2/14/13
to alembic-d...@googlegroups.com
"Hard vs soft normals is an application specific notion, that doesn't really belong in Alembic."

I would argue that hard edges are a common enough occurrence across applications to be considered an application-agnostic geometric feature; exactly the kind of CG primitive described in Alembic's goals.

Storing per-sample per-face-vertex normals is Alembic's current method of saving this information.  This suggestion is about more efficient disk-space usage.  Storing a static array of face-vertex “hardness” booleans should be fine grained enough to support any application that makes use of edge hardness.  For example, both Maya’s edge-hardness boolean, and Max’s smoothing groups could be implemented using this same array, and such caches would be interchangeable between applications.

It would be up to the individual importers and exporters to determine whether they support edge-hardness and how to display that information.  For example, the Maya importer would need to map the per-face-vertex data to an array of per-edge booleans, and the Max importer would need to map the per-face-vertex data to per-smoothing group integers.  The per-face-vertex boolean array is (intentionally) capable of storing more information than either of these approaches, so edge cases (no pun intended) will need to be handled by the importer.

Ben Houston

unread,
Feb 14, 2013, 8:28:08 PM2/14/13
to alembic-d...@googlegroups.com
Hi Jordan,

I think we have to do this, as we have had users complain about file sizes in some situations, and after investigating, the cause turned out to be animated explicit normals.

Can we come up with a standard name to place this information in the arbGeomParams section? How will edges be ordered in this array?

I think it could go in the base of the PolyMesh primitive, but only if
Lucas agrees. It should be considered optional information for sure.

Also how is this information set in Maya? Can you share the API
reference so we can add this to our Maya plugin?

Best regards,
-ben

Rob Bredow

unread,
Feb 15, 2013, 9:36:06 AM2/15/13
to alembic-d...@googlegroups.com
Storing per-sample per-face-vertex normals is Alembic's current method of saving this information.

Yes, allowing for the host application to precisely define the normals for passing along to another application unfamiliar with the modeling system's smoothing algorithms.
 
This suggestion is about more efficient disk-space usage.  Storing a static array of face-vertex “hardness” booleans should be fine grained enough to support any application that makes use of edge hardness.  For example, both Maya’s edge-hardness boolean, and Max’s smoothing groups could be implemented using this same array, and such caches would be interchangeable between applications.

That would be more compact, agreed. However, Houdini's loader (for example) would need to understand how to interpret these booleans and turn them back into normals. RenderMan as well. So, this would take a fairly important (and application-specific) modeling concept and move the logic for the "look" of the normals into either the Alembic library (yuck) or into each of the individual loaders (worse for consistency).

This isn't to say storing application-specific data which is more compact is not allowed in Alembic--it certainly can be done, as any arbitrary data can be attached at any level of the model and interpreted how you'd like for your particular application. But the model definitions chosen for Alembic and the reference loaders/savers were settled on because they were the most basic common set for interchange without ambiguity.

I would argue that compactness is very important, but getting the same model (or normals) in every application that loads an Alembic file is more important.

Rob

Steve LaVietes

unread,
Feb 15, 2013, 9:48:00 AM2/15/13
to alembic-d...@googlegroups.com, alembic-d...@googlegroups.com
One thing to keep in mind is that there's no GeomParam scope which describes data attached to edges. If added in arbGeomParams, you'd end up with a potentially large array attached at a "constant" scope which, by default, would be added (and meaningless) in output for RenderMan and Arnold.

The contents of userProperties have the benefit of having no pre-defined behavior in terms of renderer output.
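
To illustrate the distinction, a rough sketch of attaching such an array under userProperties instead of arbGeomParams with the C++ API; the property name "hardFaceVertices" is made up here, and no schema for this data has been agreed on:

#include <Alembic/AbcGeom/All.h>
#include <vector>

using namespace Alembic::AbcGeom;

// Hypothetical sketch only; "hardFaceVertices" is an invented name.
void writeHardnessFlags(OPolyMesh& mesh,
                        const std::vector<Alembic::Util::bool_t>& flags)
{
    // userProperties carries no implied meaning for renderer output,
    // unlike a constant-scope entry in arbGeomParams.
    OCompoundProperty user = mesh.getSchema().getUserProperties();
    OBoolArrayProperty prop(user, "hardFaceVertices");
    prop.set(BoolArraySample(flags));
}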

I'm also curious to see how the hard/soft/smoothing information is stored and interpreted across different applications.

-stevel

Alex Suter

unread,
Feb 15, 2013, 11:42:50 AM2/15/13
to alembic-d...@googlegroups.com
We've had similar issues with the size and export time costs of including normals in the cache. So far our solution is to only export normals if we find a hard edge at export time, and only export them for the geometry with that hard edge; otherwise we omit them from the cache and let the packages interpret things as they will.

We're usually going between proprietary tools, Maya, and a rendering package and it's mostly been okay.

For fast playback of cached data, not having the normals on tap and having to calculate them is an issue, though. We haven't figured that one out yet.

            -- Alex

Jordan Hueckstae

unread,
Feb 15, 2013, 8:06:55 PM2/15/13
to alembic-d...@googlegroups.com
I’m excited to hear that there’s interest in doing this!

There are three relevant Maya API methods on MFnMesh: setEdgeSmoothing sets an edge hard or soft, and cleanupEdgeSmoothing and updateSurface both need to be called once you are done setting the edges hard or soft.  The edge iterator, MItMeshEdge, also has similar functions, setSmoothing and updateSurface; the edge iterator does not need to call cleanupEdgeSmoothing (according to the docs).  Let me know if you need more information, I’m happy to help with getting this feature added.
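
For anyone wiring this up, a minimal sketch of that call sequence on the import side; it assumes a valid MDagPath to the mesh shape and a list of edge ids to harden, and is not taken from any existing plugin code:

#include <maya/MFnMesh.h>
#include <maya/MDagPath.h>
#include <maya/MIntArray.h>

MStatus setHardEdges(const MDagPath& meshPath, const MIntArray& edgeIds)
{
    MStatus status;
    MFnMesh mesh(meshPath, &status);
    if (status != MS::kSuccess) return status;

    for (unsigned int i = 0; i < edgeIds.length(); ++i) {
        // second argument: true = soft, false = hard
        mesh.setEdgeSmoothing(edgeIds[i], false);
    }
    // per the docs, both of these are needed after MFnMesh::setEdgeSmoothing
    mesh.cleanupEdgeSmoothing();
    mesh.updateSurface();
    return MS::kSuccess;
}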

Jordan Hueckstae

unread,
Feb 15, 2013, 8:08:44 PM2/15/13
to alembic-d...@googlegroups.com
Alex, we are using this method as well.  It is part of how Alembic decides whether to export normals from Maya.

Chadrik

unread,
Feb 19, 2013, 5:48:12 PM2/19/13
to alembic-d...@googlegroups.com, r...@185vfx.com

I would argue that compactness is very important, but getting the same model (or normals) in every application that loads an Alembic file is more important.
 
I agree that consistency across applications is important, but it is precisely because of its lack of compactness that Alembic is currently failing at this goal.

Consider the current options supported by Alembic:
1) Export explicit normals:  all applications produce same result.
2) Do not export normals:  result is application dependent.

When a cache does not contain normals (#2), Alembic plugins simply create a mesh without providing any normals, and the application does what it will.  Depending on the application, this could result in hard normals, soft normals, or a mix of hard and soft based on edge angle, etc.  This is a failure of consistency.

Why would a user choose not to export normals and risk this inconsistent behavior?  Because caches with normals are humongous!  Providing an edge hardness boolean in the official spec and adding support to application plugins will increase Alembic's inter-application fidelity by providing a lossless option that is actually practical to use. The amount of data required is so small that it could be on by default if full normals export is not enabled, thereby closing the loophole that allows for this inconsistent behavior.
 As it stands right now, many studios do not have a practical solution for losslessly transferring meshes when using Alembic. We're forced to micro-manage dynamic attributes to enable or disable edge-hardness, and even that doesn't suffice to keep the file size manageable. With an edge- or vertex-hardness boolean, we could just leave it on all the time and not have to worry about it.

That would be more compact, agreed. However, Houdini's loader (for example), would need to understand how to interpret these booleans and turn them back into normals. Renderman as well. So, this would take a fairly important (and application specific) modeling concept and move the logic for the "look" of the normals into either the Alembic library (yuck) or into each of the individual loaders (worse for consistency).

I’d like to delve into the “yuck” factor of putting this logic in Alembic.  Just how strongly do you feel about this?  The rules for computing the normals are well-defined: if a face-vertex has a “hard” boolean set to true, its normal is the same as the face normal; if it is set to false, it is the average of the normals of the faces sharing that vertex.  The resulting set of normals would be exactly the same as if explicit normals were provided by the exporting application, but the caches would be a fraction of the size.  It’s not a straight data dump, but there’s not much computation involved. It's really just a stronger form of compression.
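
Taken literally, that rule is only a few lines; a sketch using Imath types (simple cross-product face normals here, a robust version would want Newell's method for n-gons, and this is not proposed Alembic API):

#include <ImathVec.h>   // header path may differ per OpenEXR install
#include <vector>

using Imath::V3f;

// points:      per-point positions
// faceIndices: flattened per-face-vertex point indices
// faceCounts:  vertex count per face
// hard:        the proposed static per-face-vertex hardness flags
// returns per-face-vertex normals, equivalent to an explicit normals export
std::vector<V3f> computeNormals(const std::vector<V3f>& points,
                                const std::vector<int>& faceIndices,
                                const std::vector<int>& faceCounts,
                                const std::vector<bool>& hard)
{
    std::vector<V3f> faceNormals(faceCounts.size());
    std::vector<V3f> smooth(points.size(), V3f(0.f));
    size_t fv = 0;
    for (size_t f = 0; f < faceCounts.size(); ++f) {
        const V3f& a = points[faceIndices[fv]];
        const V3f& b = points[faceIndices[fv + 1]];
        const V3f& c = points[faceIndices[fv + 2]];
        faceNormals[f] = (b - a).cross(c - a).normalized();
        // accumulate face normals onto shared points for the "soft" case
        for (int v = 0; v < faceCounts[f]; ++v)
            smooth[faceIndices[fv + v]] += faceNormals[f];
        fv += faceCounts[f];
    }
    // hard face-vertex -> face normal, soft -> averaged point normal
    std::vector<V3f> result(faceIndices.size());
    fv = 0;
    for (size_t f = 0; f < faceCounts.size(); ++f)
        for (int v = 0; v < faceCounts[f]; ++v, ++fv)
            result[fv] = hard[fv] ? faceNormals[f]
                                  : smooth[faceIndices[fv]].normalized();
    return result;
}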

Personally, I think that a minor loss of purity is a fair trade for a standard that behaves consistently across applications and is practical to use. The Alembic core can provide a convenience function to compute a complete set of normals which can be used by those applications that don't have a notion of edge hardness.  Other application plugins can get direct access to the edge hardness array and use it to their advantage.  Which brings me to my last point...

For those applications that support it, there are likely performance benefits for setting static edge hardnesses once, rather than pulling a complete set of normals from the cache each frame and resetting all those normals.

Another option that I’d like to throw out there, in case it gets us past an impending impasse, is storing a static set of normals in tangent space.  Is that more or less palatable than a hard/soft bool array?

So, to recap:

Pros:
- Dramatically smaller caches
- Possibility to always enable edge-hardness export --> more consistent behavior across applications
- Possible performance improvements for applications that support it

Cons:
- More application-specific work to do at the plugin level
- Less purity / more logic added to the Alembic core



Lucas Miller

unread,
Feb 19, 2013, 6:04:56 PM2/19/13
to alembic-d...@googlegroups.com, r...@185vfx.com
Dramatically smaller caches but with a non-negligible computation cost is a non starter.

Are you sure that edge-hardness is consistent among nearly all applications, or just the applications you happen to be using?

Lucas


Chadrik

unread,
Feb 19, 2013, 6:19:40 PM2/19/13
to alembic-d...@googlegroups.com, r...@185vfx.com
Dramatically smaller caches but with a non-negligible computation cost is a non starter.

Applications that support hard edges might actually be faster.  If Alembic is not providing the app any normals, then the application is required to calculate the normals itself.  We would be moving that calculation into Alembic for consistency and reduced file size, so performance could theoretically be similar.  This is the case we are trying to improve.  Full normal export is the clear winner in terms of consistency, but that option would not change.

Are you sure that edge-hardness is consistent among nearly all applications, or just the applications you happen to be using?

So far we've confirmed that Maya, Max, and XSI have some concept of edge-hardness.  

Ben Houston

unread,
Feb 19, 2013, 8:22:20 PM2/19/13
to alembic-d...@googlegroups.com

Chadrik, just a quick note: you say edge hardness, but you describe a data set of booleans that are per face-vertex rather than per edge. Which is the preferred canonical representation?

Sent from my phone, sorry for my grammar and terseness.

Kevin Campbell

unread,
Feb 19, 2013, 9:46:04 PM2/19/13
to alembic-d...@googlegroups.com, r...@185vfx.com
Edge hardness in prman is an integer, not a boolean.
Which of these apps supports an edge hardness of 5?

i'd prefer keeping this sort of computation out of alembic.

kevin

--
kevin campbell
director of production technology | kevin.c...@rsp.com.au
rising sun pictures | www.rsp.com.au
phone +61 8 8400 6456 | mobile +61 432 483 166

Jonathan Litt

unread,
Feb 20, 2013, 2:50:43 AM2/20/13
to alembic-d...@googlegroups.com
+1 to everything that Chadrik said. I totally get Rob's motivation for thinking "yuck", but the "yuck" factor of the alternative (doing nothing) must be taken into account too. Opportunity yuck cost. :)

> Dramatically smaller caches but with a non-negligible computation cost is a non starter.

Why? Is it a stated goal of the Alembic project that computation speed is of the utmost priority above all other things? Not trying to troll, I'm honestly curious why you think that. I would think that cache size is a *very* important priority for Alembic, especially if the cache author has options to decide on their own about file size vs. computation speed. I/O time, especially precious network I/O time, must also be considered when speaking of cost for both reading and writing. If a file is x-times bigger than it could otherwise be, the I/O cost for some values of "x" might be orders of magnitude larger than other computations such as this proposed one.

Regardless of all that, I agree with Chadrik's other message that this would save time in many cases and probably not be any worse in the rest of them.

Would handedness be an issue in the hypothetical normals computation?

-Jonathan

Alex Suter

unread,
Feb 20, 2013, 11:51:51 AM2/20/13
to alembic-d...@googlegroups.com
Handedness shouldn't be an issue. Alembic already has a standard, which happened to be the opposite of our proprietary tools' convention, which caused some issues. Whether or not it's calculated by the Alembic code shouldn't change that direction.

Rob Bredow

unread,
Feb 20, 2013, 12:18:32 PM2/20/13
to alembic-d...@googlegroups.com
> Dramatically smaller caches but with a non-negligible computation cost is a non starter.

Why? Is it a stated goal of the Alembic project that computation speed is of the utmost priority above all other things?

I believe Lucas is referring to the "Alembic Promises" document which was prepared early on in Alembic's life to guide all future development. The document was created to provide the many vendors supporting Alembic a sense of the future priorities of the library and to ensure we never reversed the core goals of the project:


With #2 being the issue potentially at play here. But, you are correct that the total cost for reading the geometry should be taken into account. It's certainly possible to imagine that a significantly smaller file with some computation to fill in the missing normals could actually end up being faster. Not yet proven, but that seems like it could potentially be true.

Let me confirm I understand the file size issue correctly: You have deforming geometry and in order to store the normals you need to store an extra vector for each vertex. In the case of soft normals (shared), this should add 1 vector for each point, potentially doubling the size of the uncompressed point data (or you can leave them off and the host program will add them on by default). In the case of hard normals (unshared), you could potentially have as many extra vectors as you have average number of sides on your n-gons (so maybe 3 times larger files for a triangle-based topology or 4 times larger for a quad-based topology). And, in your cases, the hard and soft normals need to be mixed on your animated models so the importing program can't just know which type of calculation to run on a per object or per shape basis. Is that what you're seeing?

I think the key challenge here is that many people have built pipelines that do not support "hard" normals at all, while some rely on them regularly. Those who don't need them don't want to clutter up Alembic with application-specific modeling techniques (the basis for me saying "yuck"), while those who use them everyday can't understand why it's not considered a standard. Prman users probably use subd surfaces when they want to vary apparent crease sharpness which allows for creation of a geometric representation of the modeled surface rather than just averaging or not averaging a normal. Others who rely on polys find the explicit method of "hard" normal support annoying and wasteful. 

My personal mission with regards to the development of Alembic is to keep it from becoming the "tif" of the geometry libraries. By that I mean I would hate to see Alembic be a file format that requires such extensive Alembic expertise to load and save and display correctly that most software doesn't bother to support it fully and then you get different results in different packages. Today, with explicit normals, you pay for the precision with extra storage, but the results are simple and guaranteed to match in every package. That's not to say we could never support "hard" normals, just that's the thinking up to this point.

To move forward, what needs to be proven is if all of the modeling packages that support "hard" normals without explicitly outputting normals can be satisfied with a single definition of which vertices are hard and soft. And, can we transparently provide support for applications that don't support hard normals in a way that works everywhere. So far I've heard a couple of slightly different variations and no conclusion on widespread application support. If a single definition can be proven to work with all modeling and animation packages without significant compromise, and implementation is simple enough and fast enough not to break the Alembic Promises, we should consider either adding this ourselves or accepting a contribution to add this to Alembic.


Rob




 

Chad Dombrova

unread,
Feb 21, 2013, 12:32:08 AM2/21/13
to alembic-d...@googlegroups.com, r...@185vfx.com
Edge hardness in prman is an integer, not a boolean. Which of these apps supports an edge hardness of 5?

prman’s int-based hardness is for subdivision surface creases, which is a different topic.  what I’m concerned with here is a more compact way to represent face-vertex normals.

With #2 being the issue potentially at play here. But, you are correct that the total cost for reading the geometry should be taken into account. It's certainly possible to imagine that a significantly smaller file with some computation to fill in the missing normals could actually end up being faster. Not yet proven, but that seems like it could potentially be true.

i was curious just how computationally intensive these on-the-fly normals calculations actually are, so i did a test with arnold, with 1,000 non-instanced spheres, with 1,000 verts each, totalling 1 million verts.  i exported the scenes to .ass file with and without normals and ran 3 trials each.  the scene did in fact build faster when no normals were present and arnold was required to compute normals on the fly:  0.71s vs 0.97s. however,
speeds were similar reading from my hard disk versus from our server, which could indicate that another possible slowdown is in copying data around.

Let me confirm I understand the file size issue correctly: You have deforming geometry and in order to store the normals you need to store an extra vector for each vertex. In the case of soft normals (shared), this should add 1 vector for each point, potentially doubling the size of the uncompressed point data (or you can leave them off and the host program will add them on by default). In the case of hard normals (unshared), you could potentially have as many extra vectors as you have average number of sides on your n-gons (so maybe 3 times larger files for a triangle-based topology or 4 times larger for a quad-based topology). And, in your cases, the hard and soft normals need to be mixed on your animated models so the importing program can't just know which type of calculation to run on a per object or per shape basis. Is that what you're seeing?

exactly correct.

To move forward, what needs to be proven is if all of the modeling packages that support "hard" normals without explicitly outputting normals can be satisfied with a single definition of which vertices are hard and soft.

Here’s an overview of what we know:

- Maya: hard/soft boolean per edge
- XSi: hard/soft boolean per edge (plus vertex creasing? need to look into this)
- Max: smoothing groups, int32 per face (int is a mask: each face can be in multiple smoothing groups. this effectively allows per-vertex control of hardness)
- Modo: smoothing groups (no other details yet)
- Houdini: Rob mentioned that Houdini does not support edge hardness or smoothing groups

we will be gathering more details on the underlying API calls as well as a more thorough understanding of what scenarios each system is capable of, but based on our most recent research, our current front-runner is a per face-vertex smoothing group integer.  We prefer this format because:

- edge hardness can represent scenarios that per-face smoothing groups cannot, and vice versa
- face-vertex smoothing groups can represent either
- per-face smoothing groups map easily to per-face-vertex smoothing groups
- we would not need to introduce the concept of an edge to alembic

we believe that per face-vertex smoothing groups should be capable of representing the systems used in all of the above software packages, but we’ll know with more certainty once we have done more investigation.
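
purely as a strawman for discussion, the proposed static data could sit in a face-varying int32 geom param along these lines (the name "smoothingGroups" and the choice of arbGeomParams are hypothetical, nothing has been agreed on yet):

#include <Alembic/AbcGeom/All.h>
#include <vector>

using namespace Alembic::AbcGeom;

// hypothetical sketch of the proposed per face-vertex int32 data
void writeSmoothingGroups(OPolyMeshSchema& schema,
                          const std::vector<Alembic::Util::int32_t>& groups)
{
    // one int32 per face-vertex, written once and static across samples
    OInt32GeomParam param(schema.getArbGeomParams(), "smoothingGroups",
                          false /* not indexed */, kFacevaryingScope,
                          1 /* array extent */);
    param.set(OInt32GeomParam::Sample(Int32ArraySample(groups),
                                      kFacevaryingScope));
}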

i encourage other suggestions or critiques of these proposals.

And, can we transparently provide support for applications that don't support hard normals in a way that works everywhere.

Yes, but it would require alembic to do the work of producing a complete set of normals from the hard/soft data stored in the alembic file.  this could be exposed as a “computeNormals()” function, which could be easily inserted into every application loader at the point of normals loading, because its data type would be the same as the explicit normals pulled from the cache (a list of vectors).

from there, plugins for applications that could further benefit from working directly with the hard/soft data (like Maya, Max, and XSi) could do so, thus avoiding the need to reset the object-normals each frame.  they would instead map the hard/soft data to the necessary application-specific representation, once per object.  this phase would only be worthwhile if there is a dramatic performance gain, as the “computeNormals()” would be far easier to implement.

a good first step to isolating and analyzing the potential performance implications would be to implement a computeNormals() prototype that just computes “soft” normals (no hard/soft data required).  comparing no-normals vs alembic-computed-normals performance in various applications would reveal the effect of moving the computation from the application into alembic.  comparing full-normals vs alembic-computed-normals would reveal the impact of disk read vs computation.

So far I've heard a couple of slightly different variations and no conclusion on widespread application support. If a single definition can be proven to work with all modeling and animation packages without significant compromise, and implementation is simple enough and fast enough not to break the Alembic Promises, we should consider either adding this ourselves or accepting a contribution to add this to Alembic.

Thanks Rob.  we’ll work on gathering the necessary information.  Again, any input from various 3d app experts would be a big help to make sure we don’t overlook anything.

-chad

Ben Houston

unread,
Feb 21, 2013, 10:03:47 AM2/21/13
to alembic-d...@googlegroups.com, r...@185vfx.com
I think that edge hardness is common enough that we need to support it -- we've had clients specifically look at alternatives to Alembic because of this. Since Exocortex is investing significantly in Alembic, we really don't want this to happen, as it costs us money.

If we provide a common implementation of deriving the vertex normals
from a standardized edge hardness input, then even applications that
don't support edge hardness can benefit from having smaller files.

Chad Dombrova, you requested some simple soft normals code. I've attached some pretty standard code for calculating area-weighted vertex normals from just face indices; it uses some weird array types which you can replace with std::vector pretty easily. This is a pretty efficient method of doing this calculation while sticking to straight C++ with IlmMath types. It contains an OpenMP multithreading construct as well, which can easily be removed.
AreaWeightedVertexNormals.cpp
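
For anyone who can't grab the attachment, the core of the standard approach looks roughly like this (single-threaded, triangles assumed; this is a rewrite for the list, not the attached file):

#include <ImathVec.h>   // header path may differ per OpenEXR install
#include <vector>

using Imath::V3f;

// triIndices holds 3 point indices per triangle; accumulating the
// unnormalized cross product weights each face by its area.
std::vector<V3f> areaWeightedVertexNormals(const std::vector<V3f>& points,
                                           const std::vector<int>& triIndices)
{
    std::vector<V3f> normals(points.size(), V3f(0.f));
    for (size_t t = 0; t < triIndices.size(); t += 3) {
        const int i0 = triIndices[t], i1 = triIndices[t + 1], i2 = triIndices[t + 2];
        // cross product length is twice the triangle area
        const V3f n = (points[i1] - points[i0]).cross(points[i2] - points[i0]);
        normals[i0] += n;
        normals[i1] += n;
        normals[i2] += n;
    }
    for (size_t i = 0; i < normals.size(); ++i)
        normals[i].normalize();   // zero-length vectors are left as-is by Imath
    return normals;
}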

Andrew Lyons

unread,
Feb 21, 2013, 11:29:02 AM2/21/13
to alembic-d...@googlegroups.com, r...@185vfx.com
Let's be scientific, and start with more complete sample sets.

Here's a full list of commercially available 3D animation software:
http://en.wikipedia.org/wiki/List_of_3D_animation_software

Here's a full list of commercially available compositing software (some with 3D support):
http://en.wikipedia.org/wiki/Category:Compositing_software

Not all of these packages are used extensively in high end production, so I'd argue that it's questionable whether Alembic should attempt to serve the users of all these software packages - professional or otherwise. Note also that these lists do not include Zeno, or any of the other proprietary tools used at major high end animation and visual effects studios - some of whom lead development on Alembic.

As someone who has wrestled for years to develop interfaces to pipeline infrastructure designed for specific prosumer 3D packages, I'm really opposed to integrating support for features of a specific application or application vendor into what is supposed to be a generic exchange format. The guys that call the shots on Alembic have been really generous in the past, but if it were up to me, I'd nix this idea.

Cheers



--
===================================================
Andrew D Lyons | Software Engineer | http://www.linkedin.com/in/tstex
===================================================

Chad Dombrova

unread,
Feb 21, 2013, 2:17:44 PM2/21/13
to alembic-d...@googlegroups.com, r...@185vfx.com
Not all of these packages are used extensively in high end production, so I'd argue that it's questionable whether Alembic should attempt to serve the users of all these software packages - professional or otherwise.

that's true. so let's focus on the apps that matter.

keep in mind that when talking about support for edge hardness, we're primarily concerned with producers of alembic files.  i've proposed what i think is a workable solution for providing transparent read-only access to alembic files with hardness data, so  applications like renderers and compositors which utilize alembic in a read-only fashion 99.99% of the time will be trivial to support.  3d applications which export to alembic will require more care, as they will be required to map their concept of edge hardness to alembic's (something they already do with vertex winding order, etc).

so far in our investigation we've discovered that most 3d applications that can create animated deforming geometry have some concept of edge hardness:  Max, Maya, XSi, Modo, and Blender.  I would love to have suggestions for other applications we should look into.  If Houdini does not support edge hardness that would seem very odd to me, so I will have one of our FX TDs look more into this.

Note also that these lists do not include Zeno, or any of the other proprietary tools used at major high end animation and visual effects studios - some of whom lead development on Alembic.

these studios can continue to write full normals to their infinite disk drives :)  seriously though, i don't think that disk space is a concern for ILM or Sony or they would have addressed this already.

As someone who has wrestled for years to develop interfaces to pipeline infrastructure designed for specific prosumer 3D packages, I'm really opposed to integrating support for features of a specific application or application vendor into what is supposed to be a generic exchange format. 

trust me, i don't want to do that either.  i'm not proposing that we put Maya's edge hardness or Max's smoothing groups into alembic.  i believe that there is an uber-concept that can represent them all.  this is not a "feature", it is an optimization to make alembic practical to use in production.

-chad


Alex Suter

unread,
Feb 21, 2013, 11:37:18 PM2/21/13
to alembic-d...@googlegroups.com
I wish we had infinite disks. :)

ILM is definitely concerned about disk space. Currently we're not writing normals by default unless we encounter a hard edge set on a polymesh. It does come up occasionally, but most of the time we're fine without normals.

The downsides are slower export times, slower loading times, and more disk space. So far we've avoided it.

