upload assistant 1.1.2, One item not uploaded: java.io.IOException: too many bytes written


Christopher Grave

Sep 27, 2018, 8:32:30 AM
to xnat_discussion
Hi All,


I have a user who is attempting to upload a reasonably large exam (>15GB) using the Upload Assistant v1.1.2.

After selecting the relevant folder containing the scan data, it takes a long time (understandably) finding data files and searching for sessions. Once that completes and the user selects the scans to upload, they receive the following error message:

One item not uploaded: java.io.IOException: too many bytes written

Any advice would be greatly appreciated.


error message.jpg


Christopher Grave

Oct 1, 2018, 10:16:38 AM
to xnat_discussion
FWIW, I have some further info which I obtained from the Upload Assistant log. In addition, the image set in question is 6.5GB.

2018-10-01 14:56:44,079 [pool-6-thread-1] DEBUG org.nrg.dcm.edit.ScriptApplicator - Resource: class path resource [META-INF/xnat/dicom-edit4/makeSessionLabel-function.properties]
2018-10-01 14:56:44,079 [pool-7-thread-1] DEBUG org.nrg.dcm.edit.ScriptApplicator - Created instance of script function org.nrg.xnat.upload.dcm.IndexedSessionLabelFunction using the context constructor
2018-10-01 14:56:44,079 [pool-6-thread-1] DEBUG org.nrg.dcm.edit.ScriptApplicator - Created instance of script function org.nrg.xnat.upload.dcm.IndexedSessionLabelFunction using the context constructor
2018-10-01 14:56:44,838 [pool-7-thread-1] DEBUG org.nrg.xnat.upload.io.dcm.SeriesZipper - zip file built
2018-10-01 14:56:45,365 [pool-6-thread-1] DEBUG org.nrg.xnat.upload.io.dcm.SeriesZipper - zip file built
2018-10-01 14:56:45,765 [pool-3-thread-3] INFO  org.nrg.xnat.upload.dcm.Study - upload aborted: shutting down executor
java.io.IOException: too many bytes written
at sun.net.www.protocol.http.HttpURLConnection$StreamingOutputStream.write(HttpURLConnection.java:3523)
at org.nrg.xnat.upload.io.dcm.ZipSeriesUploader.sendFixedSize(ZipSeriesUploader.java:84)
at org.nrg.xnat.upload.io.dcm.ZipSeriesUploader.call(ZipSeriesUploader.java:55)
at org.nrg.xnat.upload.io.dcm.ZipSeriesUploader.call(ZipSeriesUploader.java:39)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Suppressed: java.io.IOException: insufficient data written
at sun.net.www.protocol.http.HttpURLConnection$StreamingOutputStream.close(HttpURLConnection.java:3558)
at org.nrg.xnat.upload.io.dcm.ZipSeriesUploader.sendFixedSize(ZipSeriesUploader.java:92)
... 8 more

McKay, Mike

Oct 1, 2018, 10:42:40 AM
to xnat_discussion

I haven't worked on the Upload Assistant code, so I'm not sure what is going wrong. One possibility that comes to mind is that it's hitting the original 4GB cap on zip file size (if our code uses a library that enforces this). Another possibility is that your TurbineResources.properties file has the property services.UploadService.size.max=1048576000, and that property is causing larger uploads to fail. The only other thing that comes to mind is that there may be some issue with special character encoding, like https://github.com/cloudant/java-cloudant/issues/155 . Sorry I couldn't be more helpful, but maybe one of those leads will be useful.
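
If that property does turn out to be the culprit, raising it in TurbineResources.properties might look something like this (the value is in bytes; the number below is just an illustrative larger cap, not a verified recommendation):

# TurbineResources.properties -- illustrative larger cap in bytes, not a verified value
services.UploadService.size.max=20971520000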


-Mike



Herrick, Rick

Oct 2, 2018, 10:10:44 AM
to xnat_di...@googlegroups.com

It’s definitely possible that you’re running into the upload limit that Mike mentions. Note that it doesn’t really matter that the study in question is 6.5GB, since the Upload Assistant actually uploads by series; however, if one of the series in your study is larger than the default 1GB limit, that could be the problem. That said, I don’t think the importer uses the Turbine upload service; it actually handles the upload on its own, using classes from, and based on, the Apache Commons FileUpload library. The default size limit for that library is -1, which means the library itself imposes no limit, so any limit is imposed either by the application or by Tomcat.

 

Working from that, I think there are two places you could try to increase the size limit for your upload: the master web.xml in the conf folder of your Tomcat installation, or the web.xml of your XNAT application in the WEB-INF folder of your deployed app. Shut down Tomcat and edit one of these files, adding the following block:

 

<multipart-config>
    <max-file-size>2097152000</max-file-size>
    <max-request-size>2097152000</max-request-size>
    <file-size-threshold>0</file-size-threshold>
</multipart-config>

 

You can put this anywhere in the file as long as it’s between the <web-app> and </web-app> elements. Save the file, restart Tomcat, and retry your upload.

 

Give that a try and see if it fixes your issue.

-- 

Rick Herrick

Sr. Programmer/Analyst

Neuroinformatics Research Group

Washington University School of Medicine

Phone: +1 (314) 273-1645



Christopher Grave

Oct 3, 2018, 10:50:53 AM
to xnat_discussion
Hi Rick,

Thanks for the suggestion, but unfortunately it made no difference, grrrr. (FWIW, see the attached screenshot of the Upload Assistant prior to upload.) Interestingly, for testing purposes I made a copy of that particular scan and deleted over half of the files so that the scan read as just over 3GB; that copy uploaded without any problems, which is why I'm convinced some form of limit is being applied. Where exactly, though, is where I am lost. Thanks for all your help with this.




Screenshot from 2018-10-03 15-03-01.png

Herrick, Rick

Oct 3, 2018, 1:54:13 PM
to xnat_di...@googlegroups.com

Could the limit be in a proxy server in front of Tomcat, something like Apache HTTPD or nginx? For nginx the relevant setting is client_max_body_size, which defaults to 1MB, I think. The XNAT Vagrant VM has this set to 0, which removes any size limit at the nginx level. The corresponding setting for Apache HTTPD is LimitRequestBody, which also takes 0 to specify no size limit.
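
For example, something like this in the nginx configuration that proxies to Tomcat should lift the limit (a sketch, assuming nginx is the proxy in front of your XNAT; the directive can go in the http, server, or location context):

server {
    # 0 disables nginx's request-body size check entirely
    client_max_body_size 0;
}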

 

-- 

Rick Herrick

Sr. Programmer/Analyst

Neuroinformatics Research Group

Washington University School of Medicine

Phone: +1 (314) 273-1645

 


Christopher Grave

Oct 3, 2018, 3:06:58 PM
to xnat_discussion
Hey Rick,

It is behind an nginx proxy, and that was one of my initial thoughts, being something I have come across before, but changing that particular setting didn't help. If I recall correctly, that particular limit manifests itself as an HTTP error (rather than this exception). I could be wrong.

Cheers

Chris.

John Flavin

Oct 3, 2018, 3:16:20 PM
to XNAT Discussion board
After googling around on this error message and stack trace, I think Rick is off base. This isn't an XNAT problem; this is an Upload Assistant problem. It is telling the HTTP connection object that it will write X bytes (which the connection writes into the Content-Length header) and then writing more than X bytes into the stream. See https://stackoverflow.com/questions/13138369/too-many-bytes-written-httpsurlconnection and https://stackoverflow.com/questions/8643250/http-post-request-bodies-differently-encoded-charset, for example.
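
A minimal sketch of that failure mode (the URL below is just a placeholder, not anything from this thread): the connection is told to expect exactly 512 bytes, the caller then writes 1024, and the stream itself throws the exception before the server ever sees the excess.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class TooManyBytesDemo {
    public static void main(String[] args) throws Exception {
        byte[] payload = new byte[1024];           // what we actually write
        int declared = 512;                        // what we promise via Content-Length

        HttpURLConnection connection =
                (HttpURLConnection) new URL("http://example.org/upload").openConnection();
        connection.setDoOutput(true);
        connection.setRequestMethod("POST");
        connection.setFixedLengthStreamingMode(declared); // Content-Length: 512

        try (OutputStream out = connection.getOutputStream()) {
            out.write(payload); // 1024 > 512 -> java.io.IOException: too many bytes written
        }
    }
}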

Flavin


Moore, Charlie

Oct 3, 2018, 3:28:31 PM
to xnat_di...@googlegroups.com

I took a look at the code for the assistant, and I think I agree with Flavin: https://bitbucket.org/xnatdev/upload-assistant/src/9f2d9a6b19afdbb921bb27e4b9e84293617d6b84/src/main/java/org/nrg/xnat/upload/io/dcm/ZipSeriesUploader.java?at=master&fileviewer=file-view-default .  In particular, line 72 calls:

 

connection.setFixedLengthStreamingMode((int) zipSize);

 

where

 

final long zipSize = zipFile.length();

 

So, if the size of the zip for any series exceeds the maximum Java int size (which comes out to about 2.14GB), you’re going to have a problem, I would think. Presumably when you trimmed down the series to about 3GB, the zip compression got the file under the maximum.
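
To illustrate with made-up numbers: casting a zip size above Integer.MAX_VALUE to int silently wraps around, so the declared content length no longer matches what the uploader actually writes.

public class ZipSizeCastDemo {
    public static void main(String[] args) {
        long zipSize = 6_500_000_000L;          // a hypothetical ~6.5GB series zip
        int declared = (int) zipSize;           // what would be handed to the connection
        System.out.println(zipSize);            // 6500000000
        System.out.println(declared);           // -2089934592 -- wrapped, nowhere near the real size
        System.out.println(Integer.MAX_VALUE);  // 2147483647, the ~2.14GB ceiling
    }
}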

 

Thanks,

Charlie



Christopher Grave

Oct 5, 2018, 10:50:29 AM
to xnat_discussion
Hi Flavin, Charlie,

Thanks for the reply and explanation; it's starting to make sense now. What would you say is the best way to address this limitation?

Regards

Chris

Herrick, Rick

Oct 5, 2018, 12:27:32 PM
to xnat_di...@googlegroups.com

The only real way to fix it is to fix the code that's choking on the long->int conversion, which, as it happens, is basically pretty easy, so I just did that. The signing certificate we use for the Upload Assistant has expired, and the renewal is held up in bureaucratic entanglements between the certificate issuer and the general university mass. As soon as we get the renewed certificate, I'll push the code up and create a new development build that you can test to see whether it fixes the problem.
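
For the curious, the change presumably amounts to something like the following (a sketch of the idea only, not the actual committed code): declare the content length with the long overload of setFixedLengthStreamingMode, available since Java 7, instead of casting the zip size to int.

import java.io.File;
import java.net.HttpURLConnection;

final class FixedLengthFixSketch {
    // Sketch only: declare the upload size without truncating it to an int.
    static void declareContentLength(HttpURLConnection connection, File zipFile) {
        final long zipSize = zipFile.length();
        connection.setFixedLengthStreamingMode(zipSize); // long overload, no overflow
    }
}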

 

-- 

Rick Herrick

Sr. Programmer/Analyst

Neuroinformatics Research Group

Washington University School of Medicine

Phone: +1 (314) 273-1645

 



Noman Ghafoor

Oct 2, 2023, 2:05:42 AM
to xnat_discussion
Was this problem ever fixed? Reading through the thread, I didn't find the solution. I'm still facing the same problem with the current version, 1.1.3.


Regards
Malik
