Log all shell script output to a file from within the script
Typically when logging the output of a shell script, you would do something like the following:
some-script.sh > some-logfile
But what if you wanted to achieve the same from within the shell script? I recently discovered a fairly lean approach by applying some magic to standard streams.
As it turns out, you can not only redirect stderr to stdout, a commonly known use case:
some-script.sh > some-logfile 2>&1
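As a quick sanity check, here is a minimal, self-contained sketch of that redirection; it uses a temp file via `mktemp` in place of `some-logfile` so it cleans up after itself:

```shell
#!/bin/bash
# Demo: with `2>&1`, stderr follows stdout into the same file.
log=$(mktemp)
{
  echo "to stdout"
  echo "to stderr" >&2
} > "$log" 2>&1
cat "$log"   # both lines appear, in order
rm -f "$log"
```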
You can also make arbitrarily chosen file descriptors mimic stderr and stdout, and alter how stderr and stdout themselves behave. For example, we can make both stderr and stdout prefix each line with a timestamp and write to a file while the script executes, and restore the original state when the script finishes.
Put the following at the top of your shell script to make the standard streams log each line, prefixed with a timestamp, to a file.
exec 3>&1 4>&2
exec 1> >(sed "s/^/[$(date)] /" > some-logfile) 2>&1
First, the above makes file descriptor 3 a copy of stdout and descriptor 4 a copy of stderr. Second, it redirects descriptor 1, i.e. standard output, into a process substitution that prepends a timestamp to each line and writes to the log file, and redirects descriptor 2, i.e. standard error, to descriptor 1.
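A handy consequence of saving the originals, sketched below under the assumption of a bash shell (the log path is a temp file for the example): descriptor 3 still refers to the original stdout, so a script can write a message past the log when needed.

```shell
#!/bin/bash
# Sketch: while fd 1 feeds the timestamping pipeline, fd 3 still
# reaches the original stdout.
log=$(mktemp)
exec 3>&1 4>&2
exec 1> >(sed "s/^/[$(date)] /" > "$log") 2>&1

echo "recorded in the log with a timestamp"
echo "shown on the original stdout, bypassing the log" >&3

exec 1>&3 2>&4   # restore; sed sees EOF on its input and finishes
sleep 1          # crude wait for the background sed to flush
cat "$log"
rm -f "$log"
```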
We now have effectively "proxied" stdout and stderr. To make sure the default behavior is restored as soon as the script finishes, I recommend adding the following below:
finalize() {
    exec 1>&3 2>&4
}
trap finalize EXIT
With these lines, we define a function that restores the original streams and register it to run whenever the script exits, however it exits.
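Putting the pieces together, here is a complete sketch; the log path is a temp file so the example is self-contained. One caveat worth knowing: `$(date)` inside the double-quoted sed program is expanded once, when `exec` runs, so every line carries the script's start time rather than a per-line timestamp (per-line timestamps would need something like `awk` or the `ts` utility instead).

```shell
#!/bin/bash
# Full pattern: timestamped logging to a file, restored on exit.
logfile=$(mktemp)

exec 3>&1 4>&2                                     # keep originals on 3 and 4
exec 1> >(sed "s/^/[$(date)] /" > "$logfile") 2>&1 # timestamp and log

finalize() {
    exec 1>&3 2>&4                                 # restore stdout and stderr
}
trap finalize EXIT

echo "normal output, captured with a timestamp"
echo "error output, captured the same way" >&2
```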