Of course they are still available via http at:

http://www.kl.net/scans

KL
Luis E. Fernandez
Kevin Loch wrote in message <359C04...@NOSPMkl.net>...
Larry A.
FTP is marginally better at doing batch transfers, depending on which
client you use. This is why people requested it. They want to download
all of the scans for a set without saving each one in a browser window.
Personally, I use a web browser and download them as I USE them (no,
the server isn't at my house). Several people have run webcopy or similar
programs to dump the whole (now >800MB!) thing. I guess they're making
thumbnails, which would be nice. In any case, they're there; do whatever
works best for you.
It would be nice if one of the mega-indexes (like Pause) linked to the
scans for each set. I have no intention of ever duplicating the index
effort.
P.S. Even with a basic ftp client you could dump the whole thing by
creating the entire directory structure first (perhaps by trimming
du-k.txt into a script) and doing an MGET * from /scans. I just hope
you've got lots of bandwidth and/or lots of time :)
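The du-k.txt trick above can be sketched in a few lines. This is only a hypothetical helper (the function name and sample paths are mine, not Kevin's): it parses `du -k`-style output (size, tab, path) into a sorted list of directories to pre-create before the MGET run.

```python
def dirs_from_du(du_text):
    """Turn "du -k" output into a list of directories to mkdir.

    Each input line looks like "<kbytes>\t<path>"; we keep just the
    path.  Sorting puts parents before their children, so the mkdirs
    can be issued in order.
    """
    dirs = []
    for line in du_text.splitlines():
        line = line.strip()
        if not line:
            continue
        parts = line.split(None, 1)  # split size from path
        if len(parts) == 2:
            dirs.append(parts[1])
    return sorted(dirs)

# Example: feed it a captured du-k.txt and print mkdir commands.
sample = "1024\t/scans\n512\t/scans/space\n256\t/scans/space/6984\n"
for d in dirs_from_du(sample):
    print("mkdir " + d)
```

From there a scripted ftp client can walk the same list and mget each directory's files in turn.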
KL
Godzilla wrote:
>
> In article <6nhbpo$j...@enews1.newsguy.com>, lfer...@newsguy.com says...
> > Which is better, FTP or HTTP?
> Of the two protocols, ftp is far superior for massive data transfer.
To get an overview of what is on offer, an HTML listing of a directory
is much more comfortable (you can also easily print it) - so choose
whichever suits what you want to do!
regards, Rob1
--
Fraunhofer Institut fuer Grafische Datenverarbeitung, Darmstadt
FhG-IGD Homepage: http://www.idg.fhg.de
Private Homepage: http://www.informatik.uni-oldenburg.de/~rob
EMail: r...@igd.fhg.de
> Several people have run webcopy or
> similar
> programs to dump the whole (now >800MB!) thing. I guess they're making
> thumbnails,
> which would be nice. In any case, thy're there, do whatever works best
> for you.
I did just that this week. It took nearly 48 hours over a 64K link. I was
'soak testing' the newly installed leased line at work :-)
>
> It would be nice if one of the mega-indexes (like puase) linked to the
> scans for each set. I have no intention of ever duplicating the index
> effort.
I am currently doing some work on a post-classic space set reference for my
web page and plan to do just that.
Huw
Ya, we see you did it already, VERY nice job. (1) Do you have to
manually(2) keep a flag up to date in the DB to drive that?
--
Larry Pieniazek http://my.voyager.net/lar
PLEASE do not send anything to my Auburn Hills address
any more. I'm not there. Note me directly for more info.
1 - I think my post announcing I spotted it is 8 minutes ahead of your
post bragging about it so nyaa nyaa...
2 - or under crawler control, as compared to figuring it out on the fly
by trying to see if the directory exists
>Due to enormous demand I have made the scans available via ftp:
>
>ftp://ftp.kl.net/scans
>
>Of course they are still available via http at:
>
>http://www.kl.net/scans
>
>KL
Great! Thank you very much.
Now I can do a 'directory compare' with my CD, download only the new
directories, and add them to the CD......
And with a cable-tv connection at 36 Kbyte/sec, possibly online 24 hours a
day, I can get a LOT of pictures.
Ardjan
----------
I will wear your white feather, I will carry your white flag
I will swear I have no nation, but I'm proud to own my heart
Marillion, 'Misplaced Childhood'
No junk-mail! Replace 'Flying_Dutchman' with 'ardjan.besse'
Ardjans YALP: http://unet.univie.ac.at/~a8705125/ardjan
Heh heh... Easier said than done!
(But how about this?--)
:
:
<if ($set->{Instructions})>
<A HREF="$set->{Instructions}">Building instructions</A> for this set are
available at <A HREF="http://www.kl.net/">Kevin Loch's Net Server</A>.<BR>
</if>
:
:
Muahaahaaaa! :]
--Todd
I almost forgot -- the little tables that pop up when there is more than one
match -- they now show which sets have photos and building-instruction scans.
For example, here is a list of all the Space Police II sets...
http://database.lugnet.com/pause/search/?theme=Space,Space+Police+II
...and it shows that scans are still needed for 6813 and 6984.
--Todd
I didn't want to overload Kevin's server by checking on-the-fly every time
the page is accessed.* So there is a script which, when invoked, goes and
talks to Kevin's server and asks it for a list of all the sets that are
there. This takes about 0.55 seconds over HTTP.
It really ought to be a cron job set up to run once a day at night, or some
other kind of inter-server trigger/semaphore somehow. But this is certainly
OK for starters, as long as Kevin doesn't mind.
--Todd
* OTOH, it could be smart and only look once per day per set, and then if
it does it on-the-fly for a set, it would probably only delay the page
generation very slightly, once per day, for some random person. I'll
have plenty of time to think about this in more detail as I continue packing.
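Todd's once-a-day refresher could look something like this minimal sketch. The listing URL and the flag format are assumptions on my part (the real Pause internals aren't shown here); the idea is just to fetch the scan server's directory listing once, pull out the set numbers, and flip the has-scans flags locally.

```python
import re
import urllib.request

SCAN_INDEX = "http://www.kl.net/scans/"   # assumed listing URL

def sets_from_listing(html):
    """Pull set numbers out of an HTML directory listing.

    Assumes directories named after set numbers, e.g. href="6984/".
    """
    return sorted(set(re.findall(r'href="(\d{3,5})/"', html)))

def refresh_flags(flag_db, listing_html):
    """Set each set's flag to True iff its scans are on the server.

    flag_db is a plain dict {set number: bool}; a real index would
    use whatever database drives the search pages.
    """
    available = set(sets_from_listing(listing_html))
    for set_no in flag_db:
        flag_db[set_no] = set_no in available
    return flag_db

def run_once():
    # The cron-job entry point: one HTTP fetch per day.
    with urllib.request.urlopen(SCAN_INDEX) as resp:
        return sets_from_listing(resp.read().decode("latin-1"))
```

Run `run_once()` from cron nightly and the index never hits Kevin's server more than once a day, no matter how many visitors browse the search pages.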
>
> I didn't want to overload Kevin's server by checking on-the-fly every time
> the page is accessed.*
<snip>
> * OTOH, it could be smart and only look once per day per set, and then if
> it does it on-the-fly for a set, it would probably only delay the page
> generation very slightly only once per day to some random person.
Hmm... how about this for an optimization, given that it's not likely
that scans will go away... only look if you currently think the scan
isn't there, and if you haven't looked in X hours? As the scan set
approaches completeness, you can ratchet X down and keep the load on the
server constant, since you'll be looking less and less, percentage-wise.
Then, to handle the rare cases where a scan goes away (1), run your
crawler cronjob once a day to verify everything and reset appropriate
flags.
Don't let thinking about this stop you from packing up my stuff, though!
:-)
--
Larry Pieniazek http://my.voyager.net/lar
PLEASE do not send anything to my Auburn Hills address
any more. I'm not there. Note me directly for more info.
1 - is this ever going to happen? I'm just being thorough, because I
can't imagine a scenario where a scan gets pulled. Kevin is pretty
rigorous about not allowing scans up for current sets, etc.