
Usenet news - backissues


Andy Woodward

Apr 22, 1994, 6:29:23 AM
I am looking for a way to avoid missing useful news threads while away
from work for periods longer than the couple of days (maximum!) expiry
time of our local server.

Are there any FTP, Gopher or WWW sites out there which archive Usenet
news, preferably with all the files for a day in a single file, so I
could just download the group to minimise connect time?

Seems like a useful service. Someone must do it?

M. Hedlund

Apr 22, 1994, 9:04:38 PM
In article <azw.1895...@aber.ac.uk>,
a...@aber.ac.uk (Andy Woodward) wrote:

> I am looking for a way to avoid missing useful news threads while away
> from work for periods longer than the couple of days (maximum!) expiry
> time of our local server.
>
> Are there any FTP, Gopher or WWW sites out there which archive Usenet
> news, preferably with all the files for a day in a single file, so I
> could just download the group to minimise connect time?

The general answer to your question is no, there is no one site that
perpetually archives all of Usenet news for public access. The size of
such an archive would make its maintenance unworkable. It is likely that
your expiry time is so short for precisely this reason. Many sites
(especially commercial, public-access internet service providers) will keep
articles going back for two weeks or even a month as part of their general
operation, and you should be able to access one of these sites to get
articles you want.

However, there are archives for some particular groups. I don't know of a
list of archive sites for all known Usenet archives (although such a list
would be handy and is at least possible), but you could certainly post
requests in those groups with which you are concerned. alt groups are very
unlikely to be archived, comp groups are more likely, and everything else is
fairly hit-or-miss. It basically depends on how important it is to keep
an archive of questions and answers. For instance, comp.lang.perl is
archived, and its archive is especially useful because the author of the
language and the authors of the two best (only?) books on the language post
there almost every day. In such a case, you are more likely to find a
sysadmin willing to donate space.

It is very unlikely that all of the articles for a day will be condensed
into a single file. The ftp command 'mget' (multiple-get) makes this
unnecessary. Some groups (such as comp.dcom.telecom) are _digested_, or
filtered into a handy journal. A digest, if one exists, is almost certain
to contain information you might deem important, and in only one file at
that.
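
For what it's worth, once you do locate an FTP archive for a group, the
per-file fetching is easy to script rather than typing 'mget' by hand. Below
is a minimal sketch in Python using the standard ftplib module; the host name
and directory path are hypothetical placeholders, so substitute whatever
archive site you actually find:

    # Minimal sketch: fetch every article file from one group's archive
    # directory, mimicking ftp's 'mget'. The host and path below are
    # hypothetical placeholders, not a real archive site.
    from ftplib import FTP

    HOST = "ftp.example.edu"                  # hypothetical archive host
    GROUP_DIR = "pub/usenet/comp.lang.perl"   # hypothetical archive path

    ftp = FTP(HOST)
    ftp.login()                               # anonymous login
    ftp.cwd(GROUP_DIR)
    for name in ftp.nlst():                   # one file per article
        with open(name, "wb") as fh:
            ftp.retrbinary("RETR " + name, fh.write)
        print("retrieved", name)
    ftp.quit()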

<hed...@teleport.com>
