[dcm4che] Handling very big data


samue...@irm.kr

Jan 19, 2017, 8:07:15 AM
to dcm4che
I have been trying to create and read very big DICOM files containing MPEG-4 video (about 2 GB) with the dcm4che library.

I have two problems in handling big DICOM files.

First, creating the big DICOM video file works, but is it possible to write the image/video data without loading it all into memory first? Loading big image data into memory and then saving it to a file takes quite a long time. I tried Java's MappedByteBuffer, but I still had to allocate an additional byte[] to feed the data to the DICOM dataset. My code looks like this:

MappedByteBuffer pixelDataBuffer = null;
byte[] pixelData = null;
try (RandomAccessFile randomAccessFile = new RandomAccessFile(new File(input), "r");
     FileChannel fileChannel = randomAccessFile.getChannel()) {
    // Map the whole MPEG-4 file and copy it into a heap byte[] for the dataset
    pixelDataBuffer = fileChannel.map(FileChannel.MapMode.READ_ONLY, 0, fileChannel.size());
    pixelData = new byte[pixelDataBuffer.capacity()];
    pixelDataBuffer.get(pixelData, 0, pixelDataBuffer.capacity());
} catch (IOException e) {
    e.printStackTrace();
}

Attributes dataset = new Attributes();

(abbreviated...)

// First fragment item is the (empty) Basic Offset Table, second is the whole video
Fragments fragments = new Fragments(VR.OB, false, 0);
fragments.add(null);
fragments.add(pixelData);
dataset.setValue(Tag.PixelData, VR.OB, fragments);

File dcmFile = new File(output);
try (DicomOutputStream dos = new DicomOutputStream(dcmFile)) {
    dos.writeDataset(fmi, dataset);
} catch (IOException e) {
    e.printStackTrace();
}
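As an aside on the first problem: instead of copying the whole mapped buffer into a byte[], it may be possible to hand the pixel data to the dataset as a reference using dcm4che's BulkData class, so the MPEG-4 payload is streamed from the source file when the dataset is written. The sketch below is only an illustration of that idea; the BulkData constructor shown (uri, offset, length, bigEndian) and whether writeDataset streams the referenced bytes should be checked against the dcm4che version in use, and since the length parameter is an int this would still be limited to fragments below 2 GiB.

import java.io.File;
import java.io.IOException;

import org.dcm4che3.data.Attributes;
import org.dcm4che3.data.BulkData;
import org.dcm4che3.data.Fragments;
import org.dcm4che3.data.Tag;
import org.dcm4che3.data.VR;
import org.dcm4che3.io.DicomOutputStream;

public class BulkDataWriteSketch {

    // Sketch only: write 'dataset' with the MPEG-4 file referenced as bulk data
    // instead of an in-memory byte[]; 'fmi' is the file meta information.
    static void writeWithBulkDataReference(File mp4File, File dcmFile,
                                           Attributes fmi, Attributes dataset)
            throws IOException {
        long mp4Length = mp4File.length();   // must stay below 2 GiB for the int cast below

        Fragments fragments = new Fragments(VR.OB, false, 2);
        fragments.add(null);                 // empty Basic Offset Table item
        // Assumed constructor: BulkData(uri, offset, length, bigEndian)
        fragments.add(new BulkData(mp4File.toURI().toString(), 0, (int) mp4Length, false));
        dataset.setValue(Tag.PixelData, VR.OB, fragments);

        try (DicomOutputStream dos = new DicomOutputStream(dcmFile)) {
            // If BulkData is resolved during writing, the fragment bytes are copied
            // from mp4File here rather than held in memory beforehand.
            dos.writeDataset(fmi, dataset);
        }
    }
}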

Writing the file this way works, and I got a big DICOM video file. But the second problem is that dcm4che could not handle this big file afterwards.

When I tried to dcmdump the file, I got the following error message.

846: (7FE0,0010) OB #-1 PixelData
858: (FFFE,E000) #0 [] Item
Exception in thread "main" java.lang.NegativeArraySizeException
at java.util.Arrays.copyOf(Arrays.java:3236)
at org.dcm4che3.io.DicomInputStream.readValue(DicomInputStream.java:733)
at org.dcm4che3.tool.dcmdump.DcmDump.appendFragment(DcmDump.java:204)
at org.dcm4che3.tool.dcmdump.DcmDump.readValue(DcmDump.java:165)
at org.dcm4che3.io.DicomInputStream.readFragments(DicomInputStream.java:705)
at org.dcm4che3.io.DicomInputStream.readValue(DicomInputStream.java:538)
at org.dcm4che3.tool.dcmdump.DcmDump.readValue(DcmDump.java:111)
at org.dcm4che3.io.DicomInputStream.readAttributes(DicomInputStream.java:516)
at org.dcm4che3.io.DicomInputStream.readDataset(DicomInputStream.java:444)
at org.dcm4che3.tool.dcmdump.DcmDump.parse(DcmDump.java:87)
at org.dcm4che3.tool.dcmdump.DcmDump.main(DcmDump.java:245)

When I dump the file with another tool such as DCMTK's dcmdump, it works fine:

(0028,0103) US 0                                        #   2, 1 PixelRepresentation
(7fe0,0010) OB (PixelSequence #=2)                      # u/l, 1 PixelData
  (fffe,e000) pi (no value available)                     #   0, 1 Item
  (fffe,e000) pi 00\00\00\20\66\74\79\70\69\73\6f\6d\00\00\02\00\69\73\6f\6d\69\73... # 1939383416, 1 Item
(fffe,e0dd) na (SequenceDelimitationItem)               #   0, 0 SequenceDelimitationItem
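The dcm4che failure looks like an int overflow while the ~1.9 GB fragment is buffered into a byte[]. One approach that might avoid materializing the fragment when parsing with dcm4che is DicomInputStream's IncludeBulkData option, which returns bulk data as URI references instead of byte arrays. This is only a sketch of that idea; whether it actually gets past the overflow for a fragment of this size would need to be verified:

import java.io.File;
import java.io.IOException;

import org.dcm4che3.data.Attributes;
import org.dcm4che3.io.DicomInputStream;
import org.dcm4che3.io.DicomInputStream.IncludeBulkData;

public class BulkDataReadSketch {

    // Sketch only: parse a DICOM file while keeping the encapsulated pixel-data
    // fragments as BulkData references into the file instead of byte arrays.
    static Attributes readWithoutLoadingPixelData(File dcmFile) throws IOException {
        try (DicomInputStream dis = new DicomInputStream(dcmFile)) {
            dis.setIncludeBulkData(IncludeBulkData.URI);
            return dis.readDataset(-1, -1);
        }
    }
}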

Has anybody tried to handle big images/videos with dcm4che and found a solution?

Thanks,

Samuel

m.faro...@gmail.com

May 2, 2019, 4:50:20 PM
to dcm4che
Reviving an old thread.

I am also seeing the same problem when storing DICOM files with a segment larger than 2 GB.

Is this a known issue?
Are there any plans to fix it?

Or any suggestions on what would be required to fix it?

Thanks

Vibha Maini

Jul 29, 2020, 2:40:44 PM
to dcm4che
Is there any fix for this issue?