(Back again :-) )
Background: the application is real-time control of manufacturing measurement data collection. Logging is collected from all parts of the application and stored in a set of folders within a base directory for NLog.
We have moved the logging base directory for NLog off the C drive (the install directory) to a much larger second drive, which holds both the NLog files and the actual measurement data. The rough goal is to confine the NLog files to 10% of that drive.
There are about a dozen different log file types in half a dozen sub-directories (organized by topic within the application, via NLog.config), which (using a date name pattern) start new files every day.
Our application has the capability to search, extract, analyze, and report on information from these log files, for the purpose of locating patterns around events or within time frames. So it is critical to keep significant sets of these files intact, but the size constraints need to be managed. We certainly want to keep all files for the most recent several weeks, but then purge the very large logs (detailed trace logs). We can then run detailed reports on recent history, but also generalized reports over much longer time spans.
I have read about the ability to use archiving - let me state the goal and then return to how archiving seems to work...
Optimal goal: for each sub-directory of log files/topics:
Set a size limit on the accumulated set of log files contained in that directory (e.g. 50 GB) - note: not per file, but cumulative across the files.
If the size is exceeded, start archiving/purging/moving files from oldest toward newest until under the limit (by some margin? if the limit = 50 GB, then remove until 10% under the limit?)
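If we end up writing our own purge logic, the goal above reduces to a fairly small maintenance job. A minimal sketch (not NLog functionality - `purge_to_limit`, the `*.log` glob, and the delete-vs-move choice are all assumptions for illustration; a real job would run on a schedule per sub-directory):

```python
from pathlib import Path

def purge_to_limit(log_dir, limit_bytes, headroom=0.10):
    """Delete oldest-first until the directory's cumulative size is at
    least `headroom` (e.g. 10%) under `limit_bytes` - the margin acts as
    hysteresis so the purge does not fire again on the very next check."""
    files = sorted(Path(log_dir).glob("*.log"),
                   key=lambda p: p.stat().st_mtime)   # oldest first
    total = sum(p.stat().st_size for p in files)
    if total <= limit_bytes:
        return                                        # under the cap; nothing to do
    target = limit_bytes * (1 - headroom)             # purge down to e.g. 90% of cap
    for p in files:
        if total <= target:
            break
        size = p.stat().st_size
        p.unlink()                                    # or shutil.move() to cold storage
        total -= size
```

Because it works oldest-to-newest on the cumulative total, it never leaves the size-based-archiving gaps described below: what survives is always a contiguous recent window.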
The archiving process, as I read it, is configured per target, with two method choices:
Size-based archiving - archive a log file if it exceeds a size. This seems unworkable, as it will leave smaller files indefinitely and take larger files immediately - leaving gaps in our timelines.
Time-based archiving - "wakes up" on an interval and archives files into an archiving area, with an option to limit the number of archive files. This is closer to what we need, but with some problems - to wit:
(1) It is not size-based - so we would need to estimate, per logging topic, a number of days to keep in order to stay within our size constraints. A manageable problem.
(2) It periodically moves log files into the archiving area. This significantly complicates our ability to execute searches.
As an example - if we want to run a search covering 5 days of history (say, 6 days ago until yesterday), and archiving was run two days ago ("Monday"), our search range must span both files in the archive and files in the current log directories, and must have detailed knowledge of the differing file names/organization.
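For reference, this is roughly what the time-based option looks like on a File target in NLog.config (attribute names per the NLog File target; the paths, target name, and limits here are hypothetical). Note that `fileName` and `archiveFileName` diverge - which is exactly the dual naming scheme a search spanning the archive boundary has to know about:

```xml
<!-- Sketch only - hypothetical paths/limits; verify attributes against your NLog version -->
<target xsi:type="File"
        name="trace"
        fileName="D:/Logs/Trace/trace.${shortdate}.log"
        archiveFileName="D:/Logs/Archive/Trace/trace.{#}.log"
        archiveEvery="Day"
        archiveNumbering="Date"
        archiveDateFormat="yyyy-MM-dd"
        maxArchiveFiles="30" />
```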
Are there other options I have missed? I am balancing the trade-off between writing our own archive/purge code/logic vs. investing in more complicated search & extract code/logic.
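On the search-and-extract side of that trade-off, the split between live and archived files can be hidden behind a thin resolver that knows both locations. A minimal sketch, assuming date-stamped filenames (the `trace.YYYY-MM-DD.log` pattern and the two-directory layout are assumptions, not NLog's actual archive naming):

```python
from datetime import date, timedelta
from pathlib import Path

def files_in_range(current_dir, archive_dir, start, end):
    """Yield the log file for each day in [start, end], checking the
    live directory first and the archive second, so callers see one
    seamless timeline regardless of where a given day's file lives."""
    day = start
    while day <= end:
        name = f"trace.{day:%Y-%m-%d}.log"            # hypothetical pattern
        for folder in (current_dir, archive_dir):
            p = Path(folder) / name
            if p.exists():
                yield p
                break                                 # a day's file lives in one place
        day += timedelta(days=1)
```

With something like this at the bottom of the search stack, the rest of the report/extract code never needs to know when archiving last ran.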