Whenever I run this script, the contents of the file are dumped to
stdout and the writeproc is never entered. Neither of the puts
statements is executed. The file is created but has zero bytes.
Does the callback work with FTP or am I doing something wrong?
Also, I never did get an answer to my other question: is there a
method for capturing the data from several successive downloads to a
variable without resetting the entire configuration between each? See
my post dated Nov 22.
Thanks for any help.
SCRIPT -------
package require TclCurl
# This is one contrived example, but it works.
proc writeToFile {readData} {
    puts "writeToFile called [incr ::i]"
    puts -nonewline $::inFile $readData
    return
}
set i 0
set inFile [open "cosa.tar" w+]
fconfigure $inFile -translation binary
curl::transfer -url "ftp://10.31.65.222/TRANS_LIVE/EMPIreply" \
    -writeproc writeToFile -userpwd foobar:hayseed
close $inFile
>
> I copied a script from Andres Garcia almost verbatim except for the
> FTP address.
I had never tried writeproc with ftp, so I did this:
proc writeToFile {readData} {
    puts "writeToFile called [incr ::i]"
    puts -nonewline $::inFile $readData
    return
}
set i 0
set inFile [open "gimp.tar.bz2" w+]
fconfigure $inFile -translation binary
curl::transfer -url "ftp://ftp.rediris.es/mirror/gimp/v2.7/gimp-2.7.0.tar.bz2" \
    -writeproc writeToFile
close $inFile
And it works, does it work for you?
> Also i never did get an answer to my other question. Is there a
> method for capturing the data from several successive downloads to a
> variable without resetting the entire configuration between each.
It should work without resetting, but there may be a bug somewhere;
unfortunately, I don't have time to look into it.
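For the successive-downloads question, the approach I would try is a reusable handle from curl::init rather than curl::transfer, capturing each body into a variable with -bodyvar. This is only a sketch under my reading of the TclCurl docs (I have not tested it against this bug), and the URLs below are placeholders, not real servers:

```tcl
package require TclCurl

# Sketch: reuse one handle across downloads, collecting each body
# into a Tcl variable via -bodyvar instead of writing to a file.
set handle [curl::init]
foreach url {
    "ftp://example.com/file1"
    "ftp://example.com/file2"
} {
    # Only the URL changes between transfers; the rest of the
    # handle's configuration is kept as-is.
    $handle configure -url $url -bodyvar body
    $handle perform
    puts "fetched [string length $body] bytes from $url"
}
$handle cleanup
```

If the reported bug is in reconfiguration, isolating it to a minimal loop like this should at least show which option fails to take effect on the second perform.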
Andres
It still did not work, so I tried it on Unix, where it does work.
Evidently the callback works on Unix but not on Windows.
It must have something to do with the way Windows buffers its data.
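If Windows buffering really is the culprit, one thing worth trying (an untested guess on my part, not a confirmed fix) is disabling Tcl's own channel buffering when the output file is opened, so nothing sits in the channel buffer:

```tcl
# Open the output file in binary mode with unbuffered writes, so
# each chunk handed to the writeproc reaches the file immediately.
set inFile [open "cosa.tar" w+]
fconfigure $inFile -translation binary -buffering none
```

That only addresses Tcl's side of the buffering, of course; if the data never reaches the writeproc at all on Windows, the problem is inside TclCurl and this will not help.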