Urugers!
If I may ask a Unix question: you are the smartest people I know. This is for a site written in Ruby.
I have sensitive information being logged to both my nginx access and error logs. My solution was to have nginx write to named pipes and then filter the sensitive info out into sanitized files. My setup:
nginx.conf
...
error_log /opt/nginx/logs/error.log;
access_log /opt/nginx/logs/access.log;
...
init.d/nginx
...
sanitize_file() {
  if [ ! -p "$1" ]; then
    mkfifo "$1"
  fi
  touch "$2"
  start-stop-daemon --start --background --exec /home/deploy/bin/sanitizer.sh -- "$1" "$2"
  echo "sanitizing $1 to $2"
}
...
start)
  sanitize_file /opt/nginx/logs/access.log /opt/nginx/logs/access.log.sanitized
  sanitize_file /opt/nginx/logs/error.log /opt/nginx/logs/error.log.sanitized
  echo -n "Starting $DESC: "
  start-stop-daemon --start --quiet --pidfile /opt/nginx/logs/$NAME.pid \
    --exec $DAEMON -- -c $NGINX_CONF_FILE
...
sanitizer.sh
#!/bin/bash
# Read from the named pipe ($1) and append masked output to the sanitized log ($2).
sed 's/[0-9]\{4\}[- ]\?.\{6,9\}[- ]\?[0-9]\{4,5\}/xxxxxxxxxxxxxxxx/g' "$1" >> "$2"
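For reference, the substitution masks card-number-like tokens (4 digits, an optional separator, 6 to 9 characters, an optional separator, then 4 or 5 digits). The sample line below is just illustrative, and `\?` is a GNU sed extension to basic regular expressions:

```shell
# Mask a card-number-like token in a sample log line (GNU sed assumed).
echo 'user paid with 4111 111111 1111' |
  sed 's/[0-9]\{4\}[- ]\?.\{6,9\}[- ]\?[0-9]\{4,5\}/xxxxxxxxxxxxxxxx/g'
# prints: user paid with xxxxxxxxxxxxxxxx
```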
The server starts fine and handles requests, but nothing shows up in my sanitized files. If I stop the nginx process, all the output finally flows to the correct files, with the filters working.
What is preventing the files from being written until the writers are closed out? It was my understanding that a pipe is useful precisely for writing and reading at the same time. I have learned that the order in which the pipe is created, written to, and read from matters, and every source I have found online suggests the setup I have in place.
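For what it's worth, I can reproduce the symptom without nginx at all. Holding the pipe's write end open, the way a long-running nginx worker holds its log open, keeps sed's output from appearing until the writer closes. The /tmp paths and the "secret" sample line here are made up just for the test:

```shell
#!/bin/bash
# Stand-in for the nginx -> fifo -> sed pipeline, using throwaway paths.
PIPE=/tmp/repro.pipe
OUT=/tmp/repro.sanitized
rm -f "$PIPE" "$OUT"
mkfifo "$PIPE"
touch "$OUT"

# Reader, as in sanitizer.sh:
sed 's/secret/xxxxxx/g' "$PIPE" >> "$OUT" &

# Writer that stays alive, like an nginx worker holding its log open:
exec 3> "$PIPE"
echo "one secret line" >&3
sleep 1
# On GNU sed this reports 0: sed has read the line, but its stdout is
# block-buffered because it is redirected to a regular file.
echo "bytes in sanitized file while writer is open: $(wc -c < "$OUT")"

# Close the write end; sed sees EOF, flushes its buffer, and exits.
exec 3>&-
wait
echo "bytes after writer closes: $(wc -c < "$OUT")"
```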
Any help would be much appreciated!
George