Issue 80 in xar: Xar chokes with a large number of files


codesite...@google.com

Mar 15, 2011, 8:16:27 PM
to xar-...@googlegroups.com
Status: New
Owner: ----
Labels: Type-Defect Priority-Medium

New issue 80 by roberto....@gmail.com: Xar chokes with a large number of
files
http://code.google.com/p/xar/issues/detail?id=80

What steps will reproduce the problem?
1. Gather about 70,000 XML files, roughly 3 KB each (a script to recreate
such a layout is sketched after these steps).
2. Attempt to create a xar archive with the command "xar -cf xml.xar xml/".
3. Watch xar chug for about an hour, eat up gobs of RAM, and not finish.
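A minimal shell sketch of that setup (the file count, sizes, and paths are
from this report; the generator script itself is hypothetical):

#!/bin/sh
# Recreate the reported layout: ~70,000 XML files of roughly 3 KB each.
mkdir -p xml
filler=$(head -c 3000 /dev/zero | tr '\0' 'x')   # ~3 KB of padding
i=0
while [ "$i" -lt 70000 ]; do
    printf '<?xml version="1.0"?>\n<doc id="%d">%s</doc>\n' "$i" "$filler" \
        > "xml/file_$i.xml"
    i=$((i + 1))
done

# The failing command from step 2:
xar -cf xml.xar xml/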

What is the expected output? What do you see instead?

I would like xar to yield an archive in a reasonable amount of time.
Unfortunately, it took so long that I killed the process before it
completed.

What version of the product are you using? On what operating system?

xar 1.5.2 on Gentoo

Please provide any additional information below.

While the process was running, the archive file on disk remained at 0
bytes. Does xar try to build the entire archive in memory and only dump it
to disk at the end? With a large number of files, that approach appears to
break down.

I was attempting to compare file size and performance between xar and tar.
tar completed the task within a few minutes and reduced the XML directory
from 445 MB to 46 MB. I have no numbers for xar because it never finished.
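For reference, the comparison might be run like this (a sketch; it assumes
gzip compression on the tar side, since plain tar would not shrink the
data, and the report does not say which compressor was used):

time tar -czf xml.tar.gz xml/   # finished in a few minutes per the report
time xar -cf xml.xar xml/       # did not finish per the report
du -sh xml/ xml.tar.gz xml.xar  # compare directory and archive sizes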


codesite...@google.com

Apr 25, 2011, 5:22:45 PM
to xar-...@googlegroups.com

Comment #1 on issue 80 by 1billgar...@gmail.com: Xar chokes with a large
number of files

It's my understanding that xar is supposed to do only limited work in RAM
(e.g., data is processed in 4 KB chunks) and to write to files in /tmp
before the final archive is constructed.
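If that's so, those temporary files should be visible while xar runs; a
quick check (a sketch assuming lsof is available on the system):

# List any files under /tmp held open by a running xar process.
lsof -p "$(pgrep -x xar)" | grep /tmp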

I've been using the source distributed with Mac OS X
(<http://www.opensource.apple.com/source/xar/>), which calls itself v1.6.
