
Download file and show progress


Alexandru

Jul 17, 2018, 9:34:23 AM
Hi,

I have a Tcl script on my server that I use to download multiple files from the server with Tcl. Everything works fine, except that the progress bar on the client machine is not working.

The Tcl script on the server (SoftwareDownload.cgi) builds the content to be downloaded from the file contents and attaches the Content-Length information. Content-Length is also used by the client machine to display the progress bar:

set bound "-----NEXT_PART_[clock seconds].[pid]"
# Create content to be uploaded (path, EMail, Password, mtime, file content)
set content ""
set bytesize 0
foreach path $files relpath $relfiles {
    # Get modified date of file
    set mtime [file mtime $path]
    # Get file size
    incr bytesize [file size $path]
    # Get file content of online path
    set fid [open $path "rb"]
    fconfigure $fid -encoding "binary" -translation "binary"
    set data [read $fid]
    close $fid
    append content "--$bound\r\nContent-Disposition: form-data; name=\"path\"\r\n\r\n$relpath\r\n"
    append content "--$bound\r\nContent-Disposition: form-data; name=\"mtime\"\r\n\r\n$mtime\r\n"
    append content "--$bound\r\nContent-Disposition: form-data; name=\"data\"; filename=\"$path\"\r\n"
    append content "--$bound\r\nContent-Type: application/octet-stream\r\n"
    append content "--$bound\r\nContent-Transfer-Encoding: binary\r\n\r\n$data\r\n"
}
append content "--$bound--\r\n"
# Append info to activity.dat
Activity [file join $users_path $EMail] "Software download $relfiles"

fconfigure stdout -encoding "binary" -translation "binary"
puts "Content-Type: multipart/form-data; boundary=$bound"
puts "Content-Length: $bytesize"
puts "Content-Transfer-Encoding: Binary\r\n"
puts -nonewline $content


On the client machine I call something like:

::http::geturl https://www.something.de/cgi-bin/SoftwareDownload.cgi -query paths=<filename>&dates=%7B%7D&EMail=<Email>&Password=<password> -headers {Accept-Encoding gzip} -progress ::meshparts::UIProgressBar
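
Spelled out, I build the call roughly like this (a sketch only; the tls registration and the filename/email/password variables stand in for the real values):

package require http
package require tls
::http::register https 443 ::tls::socket

# Sketch: $filename, $email and $password are placeholders for the real values;
# dates is the literal string {} as in the encoded query above
set query [::http::formatQuery paths $filename dates {{}} EMail $email Password $password]
set token [::http::geturl https://www.something.de/cgi-bin/SoftwareDownload.cgi \
    -query $query \
    -headers {Accept-Encoding gzip} \
    -progress ::meshparts::UIProgressBar]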

Although I specify the -progress option, the progress bar callback procedure gets zero for the expected file size argument:

proc ::meshparts::UIProgressBar {token expected received} {
    puts "$token $expected $received"
    if {$expected==0} {
        return
    }
    set ::meshparts::progress [expr {int(100.0 * double($received)/double($expected))}]
    update idletasks
}

The output of the callback is something like:

::http::178 0 8106
::http::178 0 16202
::http::178 0 24298
::http::178 0 32394
::http::178 0 40490
::http::178 0 48586
::http::178 0 56682
::http::178 0 64778
::http::178 0 72874
::http::178 0 80970
::http::178 0 89066

So obviously the content length is not interpreted correctly. I wonder what I'm doing wrong. Is my HTTP response header wrong?

Thanks!
Alexandru

Brad Lanam

Jul 17, 2018, 11:07:00 AM
On Tuesday, July 17, 2018 at 6:34:23 AM UTC-7, Alexandru wrote:
> So obviously the content length is not interpreted correctly. I wonder what I'm doing wrong. Is my HTTP response header wrong?
>
> Thanks!
> Alexandru

This block will not output the end-of-line characters for each
line.

fconfigure stdout -encoding "binary" -translation "binary"
puts "Content-Type: multipart/form-data; boundary=$bound"
puts "Content-Length: $bytesize"
puts "Content-Transfer-Encoding: Binary\r\n"
puts -nonewline $content

Try:

fconfigure stdout -encoding utf-8 -translation crlf
puts "Content-Type: multipart/form-data; boundary=$bound"
puts "Content-Length: $bytesize"
puts "Content-Transfer-Encoding: Binary\n"
fconfigure stdout -encoding binary -translation binary
puts -nonewline $content


Brad Lanam

Jul 17, 2018, 11:10:44 AM
On Tuesday, July 17, 2018 at 8:07:00 AM UTC-7, Brad Lanam wrote:
> On Tuesday, July 17, 2018 at 6:34:23 AM UTC-7, Alexandru wrote:
> > So obviously the content length is not interpreted correctly. I wonder what I'm doing wrong. Is my HTTP response header wrong?
> >
> > Thanks!
> > Alexandru
>
> This block will not output the end-of-line characters for each
> line.

Well, that statement is not quite right.
It will output a 0x0a. But the standard requires \r\n.

Brad Lanam

Jul 17, 2018, 11:27:03 AM
On Tuesday, July 17, 2018 at 8:10:44 AM UTC-7, Brad Lanam wrote:
> > fconfigure stdout -encoding utf-8 -translation crlf
> > puts "Content-Type: multipart/form-data; boundary=$bound"
> > puts "Content-Length: $bytesize"
> > puts "Content-Transfer-Encoding: Binary\n"
> > fconfigure stdout -encoding binary -translation binary
> > puts -nonewline $content

I was assuming Linux, but I don't know what platform(s) you are on.
The above is cross-platform compliant.

Content-Length should be set to [string length $content], not just the
size of the file.
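
On the server side that would be something like (sketch):

# The binary $content string holds one character per byte, so string length
# gives the byte count of the full multipart body, not just the file data
puts "Content-Length: [string length $content]"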

Alexandru

Jul 17, 2018, 11:35:52 AM
I'm on Windows. The hint about string length might be the solution. Testing...

Rich

Jul 17, 2018, 11:41:43 AM
Brad Lanam <brad....@gmail.com> wrote:
> On Tuesday, July 17, 2018 at 8:10:44 AM UTC-7, Brad Lanam wrote:
>> > fconfigure stdout -encoding utf-8 -translation crlf
>> > puts "Content-Type: multipart/form-data; boundary=$bound"
>> > puts "Content-Length: $bytesize"
>> > puts "Content-Transfer-Encoding: Binary\n"
>> > fconfigure stdout -encoding binary -translation binary
>> > puts -nonewline $content
>
> I was assuming Linux, but I don't know what platform(s) you are on.
> The above is cross-platform compliant.

Yes, oddly enough the http standard requires the two character \r\n end
of line markers for the header lines.

I've never seen a good explanation for why it was designed this way.

Alexandru

Jul 17, 2018, 11:43:47 AM
Those changes had no effect.

Alexandru

Jul 17, 2018, 11:45:20 AM
On Tuesday, July 17, 2018 at 5:07:00 PM UTC+2, Brad Lanam wrote:
Using string length also has no effect.
Actually that is logical: the problem is that the reported length is zero. It should at least be larger than zero, even if the value itself were wrong.

Brad Lanam

Jul 17, 2018, 11:55:43 AM
Yes. It should have had some value.
I'm just looking at my client side, and I don't ever look at the content
length. Of course I set it on the server side.

I guess some debugging is in order:

set htoken [::http::geturl ....]
set meta [::http::meta $htoken]
set cl [dict get $meta Content-Length]
puts $cl
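
And if the header might be missing, guard for it, e.g.:

# Sketch: the meta dict may not contain the header at all
if {[dict exists $meta Content-Length]} {
    puts "Content-Length: [dict get $meta Content-Length]"
} else {
    puts "no Content-Length header; meta = $meta"
}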

Alexandru

Jul 17, 2018, 12:03:33 PM
Very interesting:

When I download a very small file, the Content-Length is present in the header and in the meta array. The progress bar also gets the right values, though the progress is not visible because everything runs very fast.

When I download larger files, the Content-Length is not present in the header or in the meta array. I think I have to debug the http package code...

Alexandru

Jul 17, 2018, 1:07:57 PM
I'm having difficulties finding the correct http code. I have placed a "puts" command into the http.tcl that I found in the Tcl directory, but I see no output. Debugging the http package will be hard this way...

Brad Lanam

Jul 17, 2018, 1:16:17 PM
On Tuesday, July 17, 2018 at 10:07:57 AM UTC-7, Alexandru wrote:
> I'm having difficulties finding the correct http code. I have placed a "puts" command into the http.tcl that I found in the Tcl directory, but I see no output. Debugging the http package will be hard this way...

Oh yes. The design of the http package is messed up, and the output
disappears. I would just open a log file and dump the output there.
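Something along these lines (the proc name and log file name are just examples):

# Example only: append debug messages to a file instead of stdout
proc httpdebug {msg} {
    set f [open http-debug.log a]
    puts $f "[clock format [clock seconds] -format %T] $msg"
    close $f
}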

I recently, on Windows, had an issue where there was no data returned.
I still don't know if that's my bug or http (intermittent, hard to debug).
In that case, $meta will just be empty. I am using -command with callback
procedures. I just retry the http-get in that case.

From: https://wiki.tcl-lang.org/1475

"
PYK 2016-04-03: Yes, http::Finish does swallow errors in the -command if an error is already being propagated. The whole http module needs a little redesigning. In the meantime, the callback command can be "liberated" with something like this:

http::geturl $url -command [list after idle some_command]
"

Alexandru

Jul 17, 2018, 2:56:09 PM
Is the package somehow "built in"? I have removed the pkgIndex.tcl from the http package but "package require http" still works.

Brad Lanam

Jul 17, 2018, 3:26:24 PM
This is the one it loads...
the http1.x is ancient.

bll-tecra:bll$ ls $HOME/local/lib/tcl8/8.6
http-2.8.12.tm tdbc/

Rich

Jul 17, 2018, 3:26:46 PM
If you are running 8.6 (and/or maybe 8.5) the http package is actually
a tcl 'module'. So the file that is sourced is the one ending in *.tm,
not the one with the pkgIndex.tcl file.
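
You can list the directories that are scanned for *.tm modules with, for example:

# Show the module search path, one directory per line
puts [join [::tcl::tm::path list] \n]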

Alexandru

Jul 17, 2018, 11:41:41 PM
Thanks for the tip. I could insert "puts" commands and see the output.

What I found is that the server sends "Transfer-encoding: chunked" for larger files and "Transfer-encoding: gzip" for smaller files. For smaller files, the Content-Length is present, for larger ones it is not. This behavior can be explained:

https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Transfer-Encoding
"chunked
Data is sent in a series of chunks. The Content-Length header is omitted in this case and at the beginning of each chunk you need to add the length of the current chunk in hexadecimal format, followed by '\r\n' and then the chunk itself, followed by another '\r\n'. The terminating chunk is a regular chunk, with the exception that its length is zero. It is followed by the trailer, which consists of a (possibly empty) sequence of entity header fields."
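
To make the quoted framing concrete, two made-up chunks would be sent like this (illustration only):

# Illustration only: two made-up chunks plus the terminating zero-length chunk
set body ""
foreach chunk [list "Hello, " "world!"] {
    append body [format %x [string length $chunk]] \r\n $chunk \r\n
}
append body 0 \r\n \r\n
puts $body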

Seems like the http package has a bug here.

heinrichmartin

Jul 18, 2018, 3:36:12 AM
On Tuesday, July 17, 2018 at 8:56:09 PM UTC+2, Alexandru wrote:
> Is the package somehow "built in"? I have removed the pkgIndex.tcl from the http package but "package require http" still works.

Try [package versions] and [package ifneeded]. The latter with the short syntax.
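For example:

package require http
# lists the versions for which load scripts are registered
puts [package versions http]
# short syntax (name plus version only) returns the load script,
# which points at the *.tm file that is actually sourced
puts [package ifneeded http [package provide http]]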

Alexandru

Aug 18, 2019, 5:19:23 PM
After a long time I resumed investigating the issue with the missing Content-Length data in the HTTP response header.

Turns out, the http package has no bug here.

When the server uses "Transfer-Encoding: chunked", there is no Content-Length, since the server sends chunks of data until it reaches the end of the data. This avoids reading the full content of the file before sending it.

So it is technically impossible to show a progress bar when downloading larger files that are chunked by the server.

The second important question is when Apache decides to use chunked transfer. Here is the answer:

https://serverfault.com/questions/59047/apache-sending-transfer-encoding-chunked

"Chunked output occurs when Apache doesn't know the total output size before sending, as is the case with compressed transfer (Apache compresses data into chunks when they reach a certain size, then despatches them to the browser/requester while the script is still executing). You could be seeing this because you have mod_deflate or mod_gzip active."

Knowing this, the solution to the problem was simple: place a filter directive into httpd.conf (Apache's config file) that turns off data compression for specific cases (in my case, the execution of a CGI script meant for downloads):

SetEnvIfNoCase Request_URI .*name-of-cgi-script.* no-gzip dont-vary

The CGI script itself uses the Tcl zlib package to compress the data, so the transfer is still fast. The only disadvantage is that now the file content must be loaded into memory before the transfer.
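
Roughly like this in the CGI script (a sketch of the idea, not the exact production code; the Content-Encoding header is how I understand it has to be declared):

# Sketch: gzip the whole multipart body with Tcl's built-in zlib command,
# so a Content-Length can still be sent and the client sees real progress
set compressed [zlib gzip $content -level 9]

fconfigure stdout -encoding utf-8 -translation crlf
puts "Content-Type: multipart/form-data; boundary=$bound"
puts "Content-Encoding: gzip"
puts "Content-Length: [string length $compressed]"
puts ""
fconfigure stdout -encoding binary -translation binary
puts -nonewline $compressed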

Cheers
Alexandru