This is really annoying me as I have a newsfeed with really good retention
but there are early items I cannot access because of this
> When I try to download a large group (e.g. one with 1,328,000 unread
> articles) XNews brings up an out of memory message, and won't even let
> me exit properly. I have a dual processor, 2mb ram with nothing else
> significant running and at least 50 GB free space on the drive I am
> using. Is there a limit to the number of articles in the software
> itself or do I need to edit any settings? (keep it simple guys
> please!!)
2 GiB RAM, I think you mean.
You might be able to handle a bit fewer than 1,000,000 headers at a
time, but probably not more. Xnews stores all the headers in RAM, and
then still more RAM is needed to sort that many headers. Probably
someone will be along to suggest readers that handle huge numbers of
headers differently (unless they're all tired of answering this
question). But if you want to use Xnews, your only option is to limit
the number of headers downloaded at a time.
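To see why a million-odd headers hits the wall, here is a minimal
back-of-envelope sketch. The 500 bytes-per-header figure and the 1.5x
sorting overhead are assumptions for illustration only, not measured
Xnews numbers:

```python
# Rough estimate of RAM needed to hold and sort N headers.
# BYTES_PER_HEADER and the 1.5x sort factor are assumed values,
# not figures taken from Xnews itself.

BYTES_PER_HEADER = 500  # assumed: overview line plus in-memory overhead

def headers_memory(n_headers, bytes_per_header=BYTES_PER_HEADER):
    """Approximate bytes needed for n_headers, including sort workspace."""
    raw = n_headers * bytes_per_header
    # Sorting typically needs extra working space; assume ~50% more.
    return int(raw * 1.5)

# The 1,328,000 unread headers from the original post:
need = headers_memory(1_328_000)
print(f"{need / 1024**2:.0f} MiB")  # prints "950 MiB"
```

Under these assumptions the headers alone approach a gigabyte, which
is uncomfortably close to what a 32-bit process can actually allocate
once address-space fragmentation is taken into account, regardless of
how much physical RAM the machine has.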
> newguy wrote:
>
>> When I try to download a large group (e.g. one with 1,328,000 unread
>> articles) XNews brings up an out of memory message
>
> Xnews is not designed for heavy binary use. Get a newsreader that is,
> and save Xnews for text.
Which free reader would people recommend for heavy use like this? I find
Xnews really easy to use and like the way that it just runs from the .exe
and doesn't require an install. Anyway, what's recommended?
> sittingduck <du...@nomail.afraid.org> wrote in
> news:Xns99EB73729D946du...@invalid.quakefour.net:
>
>> newguy wrote:
>>
>>> When I try to download a large group (e.g. one with 1,328,000
>>> unread articles) XNews brings up an out of memory message
>>
>> Xnews is not designed for heavy binary use. Get a newsreader that
>> is, and save Xnews for text.
>
> Which free reader would people recommend for heavy use like this -
I'd recommend Xnews with a limit set on the number of headers
downloaded at one time.
I'd also recommend you update to the current release.
--
XS11E, Killing all posts from Google Groups
The Usenet Improvement Project:
http://improve-usenet.org
As <<Q>> said, when there are a lot of headers, you have to go
through them in smaller batches. The size of the batches depends on
the amount of physical memory and/or patience that you have. I
posted a procedure on how to do this in Xnews some time ago. Check
the following:
<http://groups.google.com/group/news.software.readers/msg/6118c385f02456b8>
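The batching idea itself is simple, whatever reader you use: split the
group's article-number range into fixed-size chunks and pull one chunk
at a time. A minimal sketch of the range arithmetic (not Xnews's actual
mechanism, which is a setting limiting headers pulled per retrieval):

```python
def batch_ranges(first, last, batch_size):
    """Yield inclusive (start, end) article-number ranges of at most
    batch_size articles each, covering first..last."""
    start = first
    while start <= last:
        end = min(start + batch_size - 1, last)
        yield (start, end)
        start = end + 1

# e.g. work through 1,328,000 articles 300,000 at a time:
for lo, hi in batch_ranges(1, 1_328_000, 300_000):
    print(lo, hi)  # five batches; the last one is 1200001..1328000
```

Pick the batch size to suit your RAM and patience; mark each batch read
(or save what you want from it) before fetching the next.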
HTH,
John