
Anyone using Leafnode?


noi

Jan 18, 2007, 12:14:13 AM
Anyone out there using leafnode?

I've browsed the online docs but I didn't see answers to:

Does Leafnode handle large volume newsgroups?

Does it handle large numbers of newsgroups with a huge volume of
articles?

Does Leafnode download only the headers or complete articles?

Any memory issues?

Thanks

Walter Mautner

Jan 20, 2007, 1:53:52 PM
noi wrote:

> Anyone out there using leafnode?
>

Yes.



> I've browsed the online docs but I didn't see answers to:
>
> Does Leafnode handle large volume newsgroups?
>

Provided you have a big enough spool directory and a suitable filesystem.

> Does it handle large numbers of newsgroups with a huge volume of
> articles?
>

What is "large"? Here, it handles 24hrsupport.helpdesk just fine.



> Does Leafnode download only the headers or complete articles?
>

Your decision.
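
Whether fetchnews pulls whole articles or only headers is set in
/etc/leafnode/config. A minimal sketch (the server name and the numbers
are invented; only the keywords are real, and they come up again below):

# sketch only -- invented upstream server and limits
server = news.example.org
# headers only on the first fetchnews pass; bodies come on a later one
delaybody = 1
# skip articles bigger than these limits
maxbytes = 100000
maxlines = 2000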

> Any memory issues?
>
Looks like you missed "man leafnode".

--
vista policy violation: Microsoft optical mouse found penguin patterns
on mousepad. Partition scan in progress to remove offending
incompatible products. Reactivate MS software.
Linux 2.6.17-mm1,Xorg7.1/nvidia [LinuxCounter#295241,ICQ#4918962]

noi

Jan 20, 2007, 5:35:53 PM
On Sat, 20 Jan 2007 19:53:52 +0100, Walter Mautner wrote this:

> noi wrote:
>
>> Anyone out there using leafnode?
>>
> Yes.
>
>> I've browsed the online docs but I didn't see answers to:
>>
>> Does Leafnode handle large volume newsgroups?
>>
> Provided you have a big enough spool directory, and a useful filesystem.
>
>> Does it handle large numbers of newsgroups with a huge volume of
>> articles?
>>
> What is "large"? Here, it handles 24hrsupport.helpdesk just fine.

I'm looking through 2.5 million headers in each of 3 or 4 newsgroups,
i.e., 10 million total.

>
>> Does Leafnode download only the headers or complete articles?
>>
> Your decision.
>
>> Any memory issues?
>>
> Looks like you missed "man leafnode".


It's not installed by default on my system, so before I leaped I thought
I'd ask for real-world experiences.

Thank you very much for your reply.

Walter Mautner

Jan 21, 2007, 11:20:56 AM
noi wrote:

> On Sat, 20 Jan 2007 19:53:52 +0100, Walter Mautner wrote this:
>
>> noi wrote:
>>
>>> Anyone out there using leafnode?
>>>
>> Yes.
>>
>>> I've browsed the online docs but I didn't see answers to:
>>>
>>> Does Leafnode handle large volume newsgroups?
>>>
>> Provided you have a big enough spool directory, and a useful filesystem.
>>
>>> Does it handle large numbers of newsgroups with a huge volume of
>>> articles?
>>>
>> What is "large"? Here, it handles 24hrsupport.helpdesk just fine.
>
> I'm looking through 2.5 million headers in each of 3 or 4 newsgroups,
> i.e., 10 million total.
>

Probably binary newsgroups, with each of the many parts counted as a
single message?
Leafnode was not made for binary newsgroups. It is a "private"
news-_server_ and not an offline reader (though you can use it to fetch
news online, then store and read them offline). You cannot tell leafnode
which binaries to download on the first pass, so you get either everything
(impracticable) or nothing (not what you want). The "delaybody=1" feature
would be nice, but you need a second scheduled fetch to actually download
the bodies you want. On the other hand, you then miss the plain-text news
on the first pass as well.
I found no way to make it download bodies immediately when the message
size is below a certain level (a text message) and delay them when it's
bigger (a binary). The "maxlines" and "maxbytes" settings are just yes or
no, so a second fetch will not change anything. Hmm, I think you could run
a script to swap config files in between and omit the max... keywords on
the second run.
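
Something like this could drive the two passes (a rough Python sketch;
the config file paths are invented, fetchnews is leafnode's fetch
program):

import shutil
import subprocess

# Invented paths: one config with the max* filters, one without them.
FILTERED = "/etc/leafnode/config.filtered"
UNFILTERED = "/etc/leafnode/config.full"
ACTIVE = "/etc/leafnode/config"

def fetch_with(config):
    # Swap the named config in, then run one fetchnews pass.
    shutil.copy(config, ACTIVE)
    subprocess.run(["fetchnews"], check=True)

# First pass filters out big articles; second pass picks up the rest.
fetch_with(FILTERED)
fetch_with(UNFILTERED)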

noi

Jan 21, 2007, 6:16:46 PM

Yep. That's about what I suspected. I'll have to stick with the slow
manual method I use today with Pan.

Thanks for the feedback.

Walter Mautner

Jan 22, 2007, 1:56:11 AM
noi wrote:

> On Sun, 21 Jan 2007 17:20:56 +0100, Walter Mautner wrote this:
>
>> noi wrote:

....


>>>> What is "large"? Here, it handles 24hrsupport.helpdesk just fine.
>>>
>>> I'm looking through 2.5 million headers in each of 3 or 4 newsgroups,
>>> i.e., 10 million total.
>>>
>> Probably binary newsgroups, with each of the many parts counted as a
>> single message?
>> Leafnode was not made for binary newsgroups. It is a "private"
>> news-_server_ and not an offline reader (though you can use it to fetch
>> news online, then store and read them offline). You cannot tell leafnode
>> which binaries to download on the first pass, so you get either
>> everything (impracticable) or nothing (not what you want). The
>> "delaybody=1" feature would be nice, but you need a second scheduled
>> fetch to actually download the bodies you want. On the other hand, you
>> then miss the plain-text news on the first pass as well. I found no way
>> to make it download bodies immediately when the message size is below a
>> certain level (a text message) and delay them when it's bigger (a
>> binary). The "maxlines" and "maxbytes" settings are just yes or no, so a
>> second fetch will not change anything. Hmm, I think you could run a
>> script to swap config files in between and omit the max... keywords on
>> the second run.
> Yep. That's about what I suspected. I'll have to stick with the slow
> manual method I use today with Pan.
>

Newsgroups, as a store-and-forward method (like mail), were developed in
the early days of the net. They weren't intended for binary postings at
all, so every add-on is just a crutch to push them past their limits.
The better medium these days for distributing big binary files (even
without splitting them) is BitTorrent. Once you are registered at a
tracker, you can use the associated forum to request uploads, much the
same way you can with binary newsgroups.

noi

Jan 22, 2007, 2:35:58 AM


If my news server and the posters to it supported BitTorrent, maybe I'd
use it. I don't think BitTorrent can handle the volume or traffic of my
news server. Isn't BitTorrent some kind of peer-to-peer? Article retention
on the news server is around 90 days, i.e., 2.5 million articles, and it's
anonymous.

Thanks for the reply.

Walter Mautner

Jan 24, 2007, 1:46:28 AM
noi wrote:

....


> If my news server and the posters to it supported BitTorrent, maybe I'd
> use it.

Ah, you have a subscription, ok.

> I don't think BitTorrent can handle the volume or traffic of my news
> server. Isn't BitTorrent some kind of peer-to-peer? Article retention on
> the news server is around 90 days, i.e., 2.5 million articles, and it's
> anonymous.
>

Yes, BitTorrent is p2p, and speed depends on how many others are
down/uploading the binary you want at the same time (both happen
simultaneously), or have left it in their client's upload queue while they
down/upload something else. It is great for downloading whole CDs and even
full DVDs, very efficient, and I regularly use it to fetch the most recent
Linux distributions. You can get all kinds of legal and not-so-legal
stuff. My last bigger download, a full openSUSE 10.2 DVD (4.3 GB), took no
longer than a few hours, actually overnight, on a cable (5 Mbit)
connection.
I would just try out Azureus or KTorrent - but make sure you have the
incoming port (adjustable) forwarded from your router.
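
A quick way to test the forwarding (a Python sketch; 6881 is just a
common BitTorrent default, use whatever port your client is set to):

import socket

PORT = 6881  # common BitTorrent default; adjust to your client's setting

# Listen on all interfaces, then connect to your public IP from outside
# the LAN; a successful connection means the router forwards the port.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    s.listen(1)
    print("listening on port", PORT, "- connect from outside to test")
    conn, addr = s.accept()
    print("connection from", addr[0], "- forwarding works")
    conn.close()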
