I tried running multiBamSummary with the latest version available in conda (3.1.2) after a fresh install and ran into a struct.error. Other things tested:
- re-ran on the BAM files
- re-ran on the BAM files using an old version (2.5.4)
- re-ran with -r (region specified and unspecified)

All attempts end with the same error. Any suggestions? It could be a problem on our server, but we can't figure out what!
```
multiBamSummary bins -b a.bam b.bam -bs 100 -p 5 -r ST4.03ch03 -o /dev/null
Number of bins found: 622901
Traceback (most recent call last):
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/bin/multiBamSummary", line 11, in <module>
    main(args)
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/site-packages/deeptools/multiBamSummary.py", line 238, in main
    labels=args.labels)
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/site-packages/numpy/lib/npyio.py", line 657, in savez_compressed
    _savez(file, args, kwds, True)
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/site-packages/numpy/lib/npyio.py", line 715, in _savez
    zipf.close()
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/zipfile.py", line 1608, in close
    self._write_end_record()
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/zipfile.py", line 1710, in _write_end_record
    centDirSize, centDirOffset, len(self._comment))
struct.error: argument out of range
```
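For what it's worth, the traceback ends inside Python's zipfile while numpy is closing the compressed .npz archive. The following is just a stdlib-only sketch of my own (not deepTools code) that writes a zip member the same general way, once to a regular file and once to /dev/null, to check whether deepTools is involved at all:

```python
import os
import struct
import tempfile
import zipfile

def try_zip_write(path, payload):
    """Write one compressed member to a zip at `path`; return 'ok' or the error."""
    try:
        with open(path, "wb") as fh:
            with zipfile.ZipFile(fh, mode="w", compression=zipfile.ZIP_DEFLATED) as zf:
                zf.writestr("arr_0.npy", payload)
        return "ok"
    except struct.error as exc:
        return "struct.error: {}".format(exc)

# A payload big enough to force several buffer flushes, loosely mimicking the
# coverage matrix multiBamSummary saves (622901 bins in the run above).
payload = os.urandom(1 << 20)  # 1 MiB of incompressible bytes

tmp = tempfile.NamedTemporaryFile(suffix=".zip", delete=False)
tmp.close()
print("regular file:", try_zip_write(tmp.name, payload))
os.unlink(tmp.name)

# /dev/null is a character device; zipfile relies on tell() to record member
# offsets, which may misbehave there, so the result can differ per system.
if os.path.exists("/dev/null"):
    print("/dev/null:  ", try_zip_write("/dev/null", payload))
```

If the regular-file write prints ok while the /dev/null one fails, that would point at the `-o /dev/null` target rather than at deepTools or the BAM files.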
Many thanks
Sukhdeep
Verbose output:
```
multiBamSummary bins -b a.bam b.bam -bs 100 -p 5 -r ST4.03ch03:20000000:40000100 -v -o /dev/null
step size is 100
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:32000000-33000000
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:22000000-23000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:35000000-36000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:38000000-39000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:33000000-34000000
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:32000000-33000000
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:22000000-23000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:38000000-39000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:35000000-36000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:33000000-34000000
ForkPoolWorker-1 countReadsInRegions_worker: processing 10000 (2418721.0 per sec) @ ST4.03ch03:32000000-33000000
ForkPoolWorker-2 countReadsInRegions_worker: processing 10000 (2476268.7 per sec) @ ST4.03ch03:22000000-23000000
ForkPoolWorker-4 countReadsInRegions_worker: processing 10000 (2479196.1 per sec) @ ST4.03ch03:38000000-39000000
ForkPoolWorker-5 countReadsInRegions_worker: processing 10000 (2478756.6 per sec) @ ST4.03ch03:33000000-34000000
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:20000000-21000000
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:28000000-29000000
ForkPoolWorker-3 countReadsInRegions_worker: processing 10000 (2227103.5 per sec) @ ST4.03ch03:35000000-36000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:31000000-32000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:37000000-38000000
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:20000000-21000000
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:28000000-29000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:29000000-30000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:31000000-32000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:37000000-38000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:29000000-30000000
ForkPoolWorker-1 countReadsInRegions_worker: processing 10000 (3185224.8 per sec) @ ST4.03ch03:20000000-21000000
ForkPoolWorker-2 countReadsInRegions_worker: processing 10000 (3165990.3 per sec) @ ST4.03ch03:28000000-29000000
ForkPoolWorker-4 countReadsInRegions_worker: processing 10000 (3124714.3 per sec) @ ST4.03ch03:31000000-32000000
ForkPoolWorker-5 countReadsInRegions_worker: processing 10000 (3087452.3 per sec) @ ST4.03ch03:37000000-38000000
ForkPoolWorker-3 countReadsInRegions_worker: processing 10000 (2823116.4 per sec) @ ST4.03ch03:29000000-30000000
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:34000000-35000000
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:34000000-35000000
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:24000000-25000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:40000000-40000100
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:24000000-25000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:40000000-40000100
ForkPoolWorker-4 countReadsInRegions_worker: processing 1 (2194.8 per sec) @ ST4.03ch03:40000000-40000100
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:30000000-31000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:36000000-37000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:30000000-31000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:36000000-37000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:39000000-40000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:39000000-40000000
ForkPoolWorker-2 countReadsInRegions_worker: processing 10000 (3018570.7 per sec) @ ST4.03ch03:24000000-25000000
ForkPoolWorker-1 countReadsInRegions_worker: processing 10000 (2384075.5 per sec) @ ST4.03ch03:34000000-35000000
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:27000000-28000000
ForkPoolWorker-2, processing 0 (0.0 per sec) reads @ ST4.03ch03:27000000-28000000
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:25000000-26000000
ForkPoolWorker-4 countReadsInRegions_worker: processing 10000 (2826731.4 per sec) @ ST4.03ch03:30000000-31000000
ForkPoolWorker-5 countReadsInRegions_worker: processing 10000 (2857155.3 per sec) @ ST4.03ch03:36000000-37000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:23000000-24000000
ForkPoolWorker-1, processing 0 (0.0 per sec) reads @ ST4.03ch03:25000000-26000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:26000000-27000000
ForkPoolWorker-5, processing 0 (0.0 per sec) reads @ ST4.03ch03:23000000-24000000
ForkPoolWorker-4, processing 0 (0.0 per sec) reads @ ST4.03ch03:26000000-27000000
ForkPoolWorker-2 countReadsInRegions_worker: processing 10000 (3196634.4 per sec) @ ST4.03ch03:27000000-28000000
ForkPoolWorker-3 countReadsInRegions_worker: processing 10000 (1750763.5 per sec) @ ST4.03ch03:39000000-40000000
ForkPoolWorker-5 countReadsInRegions_worker: processing 10000 (3205674.1 per sec) @ ST4.03ch03:23000000-24000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:21000000-22000000
ForkPoolWorker-4 countReadsInRegions_worker: processing 10000 (3195903.7 per sec) @ ST4.03ch03:26000000-27000000
ForkPoolWorker-3, processing 0 (0.0 per sec) reads @ ST4.03ch03:21000000-22000000
ForkPoolWorker-1 countReadsInRegions_worker: processing 10000 (1821472.2 per sec) @ ST4.03ch03:25000000-26000000
ForkPoolWorker-3 countReadsInRegions_worker: processing 10000 (2468254.0 per sec) @ ST4.03ch03:21000000-22000000
Number of bins found: 200001
Traceback (most recent call last):
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/bin/multiBamSummary", line 11, in <module>
    main(args)
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/site-packages/deeptools/multiBamSummary.py", line 238, in main
    labels=args.labels)
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/site-packages/numpy/lib/npyio.py", line 657, in savez_compressed
    _savez(file, args, kwds, True)
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/site-packages/numpy/lib/npyio.py", line 715, in _savez
    zipf.close()
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/zipfile.py", line 1608, in close
    self._write_end_record()
  File "/usr/share/cluster-apps/miniconda/envs/deeptools/lib/python3.5/zipfile.py", line 1710, in _write_end_record
    centDirSize, centDirOffset, len(self._comment))
struct.error: argument out of range
```
Best,
Sukhdeep
I am a colleague of Sukhdeep and the sysadmin; the miniconda installation is part of my responsibilities. I set up a deeptools conda env like this:
```
$ conda create -n deeptools deeptools=3.1.2
$ conda list -n deeptools
# packages in environment at /opt/miniconda/envs/deeptools:
#
# Name Version Build Channel
asn1crypto 0.24.0 py36_0
bcftools 1.9 h4da6232_0 bioconda
blas 1.0 mkl
bzip2 1.0.6 h14c3975_5
ca-certificates 2018.03.07 0
certifi 2018.8.13 py36_0
cffi 1.11.5 py36h9745a5d_0
chardet 3.0.4 py36_1
cryptography 2.3.1 py36hc365091_0
curl 7.61.0 h84994c4_0
cycler 0.10.0 py36_0
dbus 1.13.2 h714fa37_1
decorator 4.3.0 py36_0
deeptools 3.1.2 py36h470a237_0 bioconda
expat 2.2.5 he0dffb1_0
fontconfig 2.13.0 h9420a91_0
freetype 2.9.1 h8a8886c_0
glib 2.56.1 h000015b_0
gst-plugins-base 1.14.0 hbbd80ab_1
gstreamer 1.14.0 hb453b48_1
htslib 1.7 0 bioconda
icu 58.2 h9c2bf20_1
idna 2.7 py36_0
intel-openmp 2018.0.3 0
ipython_genutils 0.2.0 py36_0
jpeg 9b h024ee3a_2
jsonschema 2.6.0 py36_0
jupyter_core 4.4.0 py36_0
kiwisolver 1.0.1 py36hf484d3e_0
libcurl 7.61.0 h1ad7b7a_0
libdeflate 1.0 h470a237_0 bioconda
libedit 3.1.20170329 h6b74fdf_2
libffi 3.2.1 hd88cf55_4
libgcc 7.2.0 h69d50b8_2
libgcc-ng 8.2.0 hdf63c60_1
libgfortran-ng 7.3.0 hdf63c60_0
libpng 1.6.34 hb9fc6fc_0
libssh2 1.8.0 h9cfc8f7_4
libstdcxx-ng 8.2.0 hdf63c60_1
libuuid 1.0.3 h1bed415_2
libxcb 1.13 h1bed415_1
libxml2 2.9.8 h26e45fe_1
matplotlib 2.2.3 py36hb69df0a_0
mkl 2018.0.3 1
mkl_fft 1.0.4 py36h4414c95_1
mkl_random 1.0.1 py36h4414c95_1
nbformat 4.4.0 py36_0
ncurses 6.1 hf484d3e_0
numpy 1.15.0 py36h1b885b7_0
numpy-base 1.15.0 py36h3dfced4_0
openssl 1.0.2p h14c3975_0
pandas 0.23.4 py36h04863e7_0
pcre 8.42 h439df22_0
pip 10.0.1 py36_0
plotly 3.1.1 py36h28b3542_0
py2bit 0.3.0 py36_1 bioconda
pybigwig 0.3.12 py36hdfb72b2_0 bioconda
pycparser 2.18 py36_1
pyopenssl 18.0.0 py36_0
pyparsing 2.2.0 py36_1
pyqt 5.9.2 py36h22d08a2_0
pysam 0.14.1 py36hae42fb6_1 bioconda
pysocks 1.6.8 py36_0
python 3.6.6 hc3d631a_0
python-dateutil 2.7.3 py36_0
pytz 2018.5 py36_0
qt 5.9.6 h52aff34_0
readline 7.0 ha6073c6_4
requests 2.19.1 py36_0
retrying 1.3.3 py36_2
samtools 1.7 1 bioconda
scipy 1.1.0 py36hc49cb51_0
setuptools 40.0.0 py36_0
sip 4.19.8 py36hf484d3e_0
six 1.11.0 py36_1
sqlite 3.24.0 h84994c4_0
tk 8.6.7 hc745277_3
tornado 5.1 py36h14c3975_0
traitlets 4.3.2 py36_0
urllib3 1.23 py36_0
wheel 0.31.1 py36_0
xz 5.2.4 h14c3975_4
zlib 1.2.11 ha838bed_2
```
We have tried recreating the environment with a fresh miniconda installation and with another Python version. All attempts failed. The weird thing is that multiBamSummary keeps running until we interrupt it, and only then does it report an error, which is not related to the KeyboardInterrupt, as seen in Sukhdeep's report above.
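In case it helps narrow down the server suspicion: zipfile records archive member offsets via tell(), which should advance on a regular file but might not on a character device like /dev/null. Here is a small stdlib-only probe of mine (not deepTools code) that compares the two:

```python
import os
import tempfile

def tell_after_writes(path, chunk=b"x" * 4096, n=4):
    """Open `path` unbuffered for writing; record tell() after each write."""
    positions = []
    with open(path, "wb", buffering=0) as fh:
        for _ in range(n):
            fh.write(chunk)
            positions.append(fh.tell())
    return positions

tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.close()
print("regular file:", tell_after_writes(tmp.name))  # 4096, 8192, 12288, 16384
os.unlink(tmp.name)

# On a character device the kernel may not advance the file offset on write,
# so the reported positions can stay at 0.
if os.path.exists("/dev/null"):
    print("/dev/null:  ", tell_after_writes("/dev/null"))
```

If /dev/null reports non-advancing positions on our server, that could explain why the .npz (a zip archive) cannot be finalized there while a regular output file would work.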
Thank you!