--
You received this message because you are subscribed to the Google Groups "alembic-discussion" group.
To unsubscribe from this group and stop receiving emails from it, send an email to alembic-discuss...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Also, it sounds like the main performance issue in that case would be interleaving HDD reads with decompression. Did you do that as well in your tests?
Our testing shows that lzma works amazingly on mesh data. 2x better than gzip.
Best regards,
Ben Houston
http://Clara.io Online 3d modeling and rendering
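A quick harness for checking the lzma-vs-gzip comparison on your own buffers. The flat float grid below is just a placeholder for real mesh positions, so ratios will vary with the data; nothing here is taken from Ben's tests.

```python
import lzma, struct, zlib

# Placeholder "mesh": a regular grid of float32 positions (not real mesh data).
raw = struct.pack("30000f", *(0.01 * i for i in range(30000)))

gz = zlib.compress(raw, 9)          # Deflate, the algorithm behind gzip/Zip
xz = lzma.compress(raw, preset=9)   # LZMA, the algorithm behind xz/7z
print("zlib: %.2fx   lzma: %.2fx" % (len(raw) / len(gz), len(raw) / len(xz)))
```

Swapping in a real vertex buffer dumped from an Alembic file would give numbers comparable to the ones discussed here.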
On 22 Jan 2015 17:39, "Lucas Miller" <miller...@gmail.com> wrote:
You didn't account for the amount of time it takes to do the decompression in your calculation. 5x less data isn't a 5x speedup in reads; it's the 5x smaller read plus the decompression time of that data.

Decompressing one large dataset (the whole file) is also better than having to decompress several much smaller datasets (per-sample data).

Lucas
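Lucas's arithmetic can be made concrete with a back-of-envelope model. All throughput numbers below are illustrative assumptions, not measurements from this thread:

```python
# Total read time = platter transfer time + decompression time, so a 5x
# smaller file is well short of a 5x speedup.

def read_time_s(size_bytes, disk_mb_s, decompress_mb_s=None, ratio=1.0):
    """Time to load a dataset: transfer of the on-disk bytes, then decompression."""
    mb = size_bytes / 1e6
    t = (mb / ratio) / disk_mb_s          # only size/ratio bytes hit the disk
    if decompress_mb_s is not None:
        t += mb / decompress_mb_s         # but all bytes must be decompressed
    return t

uncompressed = read_time_s(1_000e6, disk_mb_s=150)
compressed = read_time_s(1_000e6, disk_mb_s=150, decompress_mb_s=250, ratio=5)
print(uncompressed, compressed)   # ~6.7 s vs ~5.3 s: nowhere near 5x faster
```

With faster storage (SSD, warm cache) the transfer term shrinks and the decompression term dominates, which is when compression can become a net loss.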
On Thu, Jan 22, 2015 at 2:32 PM, Milan Bulat <milan...@thefoundry.co.uk> wrote:
I would think that if HDD / server speed is the bottleneck, instead of the CPU cycles, compression would do good? A prime example would be simulation datasets, where de-duplication does not help much.

I've found that compressing Alembic files with Rar / Zip reduces their size by 2x to 5x. That translates to a 2x to 5x speed increase in read-from-platter compared to uncompressed data, and reduces server load.

I've just tested compressing an Alembic file containing a sphere of 500000 polygons, and it shows a 7x decrease in file size when Rar-ed.

Thanks,
Milan
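This kind of measurement can be roughly reproduced without Rar, using zlib (the Deflate algorithm behind Zip). The synthetic sphere below is only a stand-in for a real Alembic payload, so expect a different ratio than Milan's 7x:

```python
import math, struct, zlib

# Crude lat/long sphere: vertex positions as packed float32 triples.
verts = []
for i in range(200):
    for j in range(200):
        theta, phi = math.pi * i / 199, 2 * math.pi * j / 199
        verts += [math.sin(theta) * math.cos(phi),
                  math.sin(theta) * math.sin(phi),
                  math.cos(theta)]
raw = struct.pack("%df" % len(verts), *verts)
packed = zlib.compress(raw, 9)
print("ratio: %.1fx" % (len(raw) / len(packed)))
```

Archivers like Rar do better than raw Deflate partly through larger match windows and better entropy coding, which matters on highly regular geometry.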
On Thursday, 22 January 2015 18:40:38 UTC+1, Lucas wrote:

After extensive testing with zlib, and exploring much faster alternatives like lz4 and snappy, I found that compressing sample data greatly decreased read performance, so I left it out of AbcCoreOgawa. Compression could theoretically help with certain very large data sets, but those are unlikely to be encountered in the average Alembic file.

I don't believe Ogawa ever advertised better compression support; you most likely confused that with improved data sharing.

Lucas

On Thu, Jan 22, 2015 at 6:09 AM, Milan Bulat <milan...@thefoundry.co.uk> wrote:

Is there any way to make Ogawa compress sample data? Looking at the code, it seems to ignore the Archive's setCompressionHint. As far as I understood, one of Ogawa's advertised features was better support for compression (i.e. no need to chunk data up).

Thanks,
Milan
Lucas, I assume the decompressor needs to be restarted for each Alembic property? Is there no way to (de)compress the whole schema sample in one go?