Mike,
Thanks for the quick response. I'm currently running ACE 1.6 over
roughly 1 TB's worth of data (91G files), just for testing purposes,
over a recently upgraded gigabit network connection to the file store.
As hoped, the network upgrade has let me better assess the scalability
of running this in production. Based on your Chronopolis figures,
which we'll never hit here but which can be scaled down to our
environment, it sounds like I should bump my JVM heap up further to
allow additional headroom. Thanks for those comparative numbers.
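For reference, here's a minimal sketch of how the heap bump might look for a Tomcat-hosted ACE instance. The file path and heap values are assumptions, not taken from your setup; adjust them for your installation.

```shell
# Hypothetical example: raise the JVM heap for Tomcat (which hosts ACE).
# Add to $CATALINA_HOME/bin/setenv.sh so it applies on every startup.
# -Xms sets the initial heap, -Xmx the maximum; values here are illustrative.
export CATALINA_OPTS="$CATALINA_OPTS -Xms512m -Xmx2048m"
```

Tomcat would need a restart afterward for the new settings to take effect.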
With regard to the duplicate-file issue, I'd previously removed the
duplicates from the file store only, not from ACE. Such a selective
removal from ACE looks to be done by selecting the ACE collection,
navigating to "Browse," and then choosing "Remove," correct? If I'm
understanding you correctly, I should then be able to re-run an audit
on the collection, and the "fixed" duplicates shouldn't appear in the
Show Duplicates File list. Is that right?
Michael