What I like to do when writing to a log file is: open the file, write a
record - then close the file. There is only ever one log file open at a
time - but I'm getting a "Too Many Open Files" message!!
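The pattern is roughly this (a minimal sketch; the proc name and append
mode are just illustrative):

proc writeLog {logFile msg} {
    # Open in append mode, write one record, close straight away.
    set log [open $logFile a]
    puts $log $msg
    close $log
}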
I did some investigation and noticed that each time a file is opened an
incrementing file ID is assigned - when this reaches "file1999" I get
"too many open files". Surely it must be possible to open/close a file
more than 2000 times!!
I guess I must be missing something - how do I get past the "2000"
limit?
Many Thanks,
=Adrian=
Use the file channels command and see how many files are actually open. I
have had this bug a couple of times over the years. Sometimes it took
months before it occurred, but invariably I wasn't closing a file, even
though I thought I was. One reason is that I would have a branch of code
with a return in it that sometimes was executed. Then the file was never
closed, and new ones kept being opened.
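For illustration, that kind of leak can look something like this (just a
sketch; the proc name, variable names, and the condition are made up):

proc logMessage {logFile msg} {
    set log [open $logFile a]
    if {$msg eq ""} {
        # Early return on this branch: $log is never closed,
        # so every call that takes this path leaks one channel.
        return
    }
    puts $log $msg
    close $log
}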
"file channels" does indeed list *all* of the channels - The list is
added to each time the file is opened.
However: It still seems to me as though the file should be closed. The
file close part of the code looks like this...
if {[catch {close $log}]} {
    puts "***Error*** Closing ($log) - $logFile"
} else {
    puts "Log Closed ($log) - $logFile"
}
...The script always displays the "Log Closed..." message each time the
log is written to. I still don't understand why the file is not
closing.
Regards,
=Adrian=
Maybe you're opening the file twice and closing it only once? Check every
possibility.
--
Regards!
Googie
This block looks right, so most likely there are cases
where you don't even get to this quoted block at all.
Add another debugging "puts" at the place where
you open the logfile...
My bet is that you have more opens than closes.
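Something along these lines, assuming the same $log/$logFile names as in
your close block (append mode is just a guess):

set log [open $logFile a]
puts "Log Opened ($log) - $logFile"
puts "Channels: [file channels]"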
One thing I have noticed: There is a proc (writeEvent) which writes a
message to one or more logfiles. Each time the proc is called to write
to the logfiles the same file-id is used for all of the logs (The
message is correctly written to each of the different logfiles). When
the proc is called again the next file-id is used. Here is some of my
debugging output:-
proc writeEvent called to log message to the "all" and "error" logs...
Log Opened: Old (file5) - /home/adavis/tivolimon/log/all.20060131
Channels: file4 stdin file5 stdout stderr
Log Closed (file5) - /home/adavis/tivolimon/log/all.20060131
Log Opened: Old (file5) - /home/adavis/tivolimon/log/error.20060131
Channels: file4 stdin file5 stdout stderr
Log Closed (file5) - /home/adavis/tivolimon/log/error.20060131
...The next time this proc is called I get...
Log Opened: Old (file6) - /home/adavis/tivolimon/log/all.20060131
Channels: file4 stdin file5 stdout file6 stderr
Log Closed (file6) - /home/adavis/tivolimon/log/all.20060131
Log Opened: Old (file6) - /home/adavis/tivolimon/log/error.20060131
Channels: file4 stdin file5 stdout file6 stderr
Log Closed (file6) - /home/adavis/tivolimon/log/error.20060131
...I guess I must be missing something simple (Arrrgh!!)
Regards,
=Adrian=
My guess is that you have another open somewhere in your code (or the code
you are calling).
From the path, I assume you're running on Unix or Linux
(perhaps you already said so, but I missed it).
When your program has been running for a while (and the Channels-list
has grown to about 20 entries or so), find out the pid of your
process, and run: lsof -p pidofyourprog
It should then tell you which files you *really* still
have open.
If you don't have lsof, install it now. It's an invaluable debugging aid.
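You can also get the pid from inside the script itself; a quick sketch
(assumes lsof is on the PATH, and the catch swallows any lsof warnings):

# Dump this process's open files via lsof.
catch {exec lsof -p [pid]} out
puts $out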
> ...I guess I must be missing something simple (Arrrgh!!)
Most likely you're missing some other (perhaps not log-related)
open, either in your scripts, or indirectly called through
some libraries.
With "lsof" you at least see which files' handles your program is
stockpiling. This should give you a hint as to where
to look at more throroughly.
--
Bryan Oakley
http://www.tclscripting.com
> What I like to do when writing to a log file is: open the file, write
> a record - then close the file. There is only ever one log file open
> at a time - but I'm getting a "Too Many Open Files" message!!
One other avenue for debugging is to wrap the open and close
commands to do your own tracking. What I used to do is something akin
to [rename open i_open] (but only if you haven't already done it...
you can check if [open] is a proc using [info proc open], or you could
probably check for [info command i_open], etc.), and then create a proc
called open that calls i_open for you. Keep your own record of what's
going on, and look for any discrepancies. You can then create an
[open2] or something that takes a new first argument which it stashes
in its own records. You throw your [open2]s around where you suspect
things aren't quite right, passing it some kind of useful token to help
find the instances that go wrong.
Mind you, I last used that technique several years ago on a very old
version of Tcl; it should still work just the same, though there's
probably a better way to do it now.
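A rough sketch of the idea (untested; the i_open/i_close names and the
messages are just placeholders):

if {[info commands i_open] eq ""} {
    # Stash the real commands, then wrap them with logging versions.
    rename open i_open
    rename close i_close
    proc open {args} {
        set chan [eval i_open $args]
        puts stderr "OPEN  $chan <- $args"
        return $chan
    }
    proc close {args} {
        puts stderr "CLOSE [lindex $args 0]"
        eval i_close $args
    }
}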
Fredderic
Thanks for all the help.
Best Regards,
=Adrian=
ls -ll /proc/<pid>/fd
uwe