The problem I'm having is in busy, high-traffic daily binary groups like:
alt.binaries.cd.image
alt.binaries.mp3.sounds.completecd
There is so much traffic that when I log on to the group and do a
refresh servers, Xnews tries to download the file list, freezes up,
and then doesn't respond, and I have to restart the computer to
bring it back.
I am assuming that I have something set up wrong; anyone who can
give me some direction or help would be greatly appreciated.
Xnews ???.01.30
sittingduck
>
> Xnews doesn't handle heavy binary use well.
> Try Newsleecher, or Newsbin.
>
> Try downloading fewer headers at a time with Xnews.
>
With a modern high-speed connection and a 'crazy busy'
newsgroup, it is possible to choke Xnews.
Depending on how you have your storage options set up,
Xnews can eventually end up dealing with so many headers
that it runs out of memory.
How many headers Xnews can handle at a time seems to be a
function of how much RAM is available on your PC.
I only have about 384 MB of memory on my laptop and I can
only deal with about 300-400 thousand headers in a
group before Xnews starts to slow down.
On my old PC with only 128 MB it was closer to 100-150
thousand headers.
So the rule of thumb seems to be that you can only
deal with about as many thousands of headers at a time as
your PC has megabytes of memory (e.g. 512 MB of
RAM = roughly 512 thousand headers).
Beyond this things start to gradually slow down until,
at maybe twice the above numbers, things slow to a
halt and the PC may appear to freeze up.
A busy group on a premium server with many weeks of
retention can easily have MILLIONS of posts.
What seems to happen is that Xnews just innocently asks
Windows for more and more memory as it tries to sort and
thread all those additional posts.
Windows is happy to give Xnews the extra memory, but
it starts dipping into 'virtual' memory on the
hard drive (which is about 10,000 times slower than
real RAM for the kind of sorting that Xnews is doing).
So things just slow to a stop.
If you leave the storage option in the Xnews Setup panel
set to save headers, then every time you open the
group, Xnews will keep adding to the list.
Sooner or later you exceed the number of headers that
Xnews can quickly sort and thread with the amount of
RAM you have available, and you get a freeze-up.
I have dealt with really, really HUGE groups with Xnews
(millions of headers total), but I am careful to leave
storage set to 'save nothing' in the Xnews setup panel
and then use the 'open special' function to control
the number of headers I am working with at one time.
You can get to 'open special' easily by right-clicking
the group and picking 'open special' from the menu.
Then you can use the 'Start' and 'End' slider controls
at the top of the 'open special' window to set limits
on how many headers will be retrieved.
Try setting storage to 'save nothing' and limiting the
headers to a few hundred thousand at a time, and see if
that eliminates the freeze-up problems.
When using 'open special' to set this number, be sure to
slide the arrows close to each other and count UP from
zero in the "GET" window on the right until you reach the
desired number. If you count DOWN from the MAX, you can
be fooled: the total count shown on the right can be so
large that it overflows the small window, making it look
like you are downloading, for example, 200,000 headers when
the total is actually much larger (like 1,200,000 or 2,200,000).
Lastly, I should mention that limiting the number of headers
in use at one time doesn't mean that you can't get to older
posts! If you look at how 'open special' works, you will
see that it's possible to 'walk through' a really large
group in segments by setting the 'Start' and 'End' sliders,
so you can find older posts.
I don't find this much of a limit in day-to-day use, because
once I have read the older posts, I just retrieve the newer
headers in the group. It's a little slow to get to older posts
in some of the insanely busy groups like alt.binaries.dvd, because
there are literally MILLIONS of posts and I have to scan them
a few hundred thousand at a time.
I would think you should be able to handle the groups you listed
fairly well if you just limit the headers to some reasonable
number with 'open special'.
If this is too big a hassle, I am sorry to say that you will
have to think about another News Reader for those groups.
Xnews_Fan
P.S.
This issue with insanely huge numbers of headers in some
groups is still a pain NO MATTER WHICH NEWSREADER YOU USE.
There is a new indexing method for posts called an
"NZB" file, which bypasses the need for headers (you
may have seen these NZB indexes posted in some groups).
Using NZB files you can save (or post) just a 'pointer' to
a large post in a newsgroup so that a newsreader program can
find it later without needing to retrieve all the headers.
You can find these "NZB" indexes on web search engines
(or posted in much less busy 'index only' groups), then
jump right to downloading the post in question, without
even worrying about the huge number of headers.
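For the curious, an NZB file itself is just a small XML index
that lists the message-IDs making up a post, so a newsreader
can fetch those articles directly. A rough sketch (I'm going
from memory here; the group, subject, and message-IDs below
are made up for illustration):

<?xml version="1.0" encoding="iso-8859-1"?>
<!DOCTYPE nzb PUBLIC "-//newzBin//DTD NZB 1.0//EN"
 "http://www.newzbin.com/DTD/nzb/nzb-1.0.dtd">
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="someone@example.com" date="1130000000"
        subject="example.part01.rar (1/2)">
    <groups>
      <group>alt.binaries.cd.image</group>
    </groups>
    <segments>
      <segment bytes="250000" number="1">part1of2.AAAA@example.com</segment>
      <segment bytes="250000" number="2">part2of2.BBBB@example.com</segment>
    </segments>
  </file>
</nzb>

A newsreader that understands this format just requests those
articles by message-ID and reassembles the post, which is why
no header download is needed.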
The latest version of Xnews can SAVE an NZB file index
for a group of files, which is a good first step, but,
unfortunately, it can't currently expand an NZB file in the
article list (or load one into a queue), so at this time
there is no way in Xnews to take full advantage of what NZB
files offer: downloading large posts without having to
retrieve the headers first.
Some NZB support IS there already, so hopefully MORE will
be added, and this could make a HUGE difference in dealing
with these 'monster groups' with Xnews.
So if you are a bit frustrated, you may want to stay tuned.
> So there is no way to solve this with a change in settings?
Yes. You can use Ctrl+Enter to enter a group without downloading all
the headers at once. There are also settings in the ini file you can
use to cut off header download at a certain number.
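If I remember right, the relevant entries go in the [XOVER] section
of Xnews.ini and look something like this (adjust the numbers to
suit your RAM):

[XOVER]
PromptThreshold=200000
LimitPerGet=100000

PromptThreshold is the group size at which Xnews starts prompting
you, and LimitPerGet caps how many headers are pulled per retrieval.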
--
David
> The problem I'm having is in busy, high-traffic daily binary groups
> like:
> [...]
> There is so much traffic that when I log on to the group and do a
> refresh servers, Xnews tries to download the file list, freezes up,
> and then doesn't respond, and I have to restart the computer to
> bring it back.
>
>
> I am assuming that I have something set up wrong; anyone who can
> give me some direction or help would be greatly appreciated.
As was mentioned by other posters, it's your RAM that is being used
up and is spilling over to virtual memory on your hard drive. What
you need to do in this case is load a smaller set of headers,
extract what you need, delete the headers to recover memory, then
read the next batch of headers, and so on. It's not well documented
in the manual, but in Xnews the procedure for doing this is as
follows:
Suppose that when a group contains more than 200,000 posts, you would
like to read it in batches of 100,000. In that case, before running
Xnews, you would edit the [XOVER] section of Xnews.ini to contain:
[XOVER]
PromptThreshold=200000
LimitPerGet=100000
AutoFree=1
Now, whenever you attempt to open a group that has more than 200,000
posts, a window will automatically pop up labeled "Set number of
headers to retrieve". The sliders should already be set correctly,
and all you have to do is place a check-mark in front of "Limit to"
in the Incremental retrieval section. The number will be pre-set to
100,000 (the "LimitPerGet" entry above). Make sure the "free old
headers" box is also checked. Then hit return or click OK.
Xnews will then download the next 100,000 articles. When you are
done with those, simply hit the F5 key (or click the icon of a
sheet with two green semi-circular arrows on the bottom bar, or use
Group --> Refresh Headers). Xnews will ask you to confirm that you
want to delete the existing headers (to save memory). Simply click
"Yes", and the existing headers will be cleared and the next 100,000
headers will be loaded. Continue until you finish reading the group.
When done, you can press F8 ("Catchup") so you only read new
articles the next time you open the group.
Simply change the [XOVER] numbers to reflect the amount of RAM you
have available (it's a hit-and-miss approach).
HTH,
John