Yes, but doing so may lose some data, unless you output both. If we can read the full log and extract just the values we're interested in at the moment, that seems valuable. I used the Common Log Format as an example, but it's possible to use any value from the JSON.
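As a rough illustration of what I mean, plain jq can already pull a common-log-style line out of Caddy's JSON access log (this assumes current field names like ts, request.remote_ip, request.method, request.uri, status and size; older releases used request.remote_addr instead):

jq -r '"\(.ts | floor | todate) \(.request.remote_ip) \(.request.method) \(.request.uri) \(.status) \(.size)"' caddy.log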
I decided to expand the idea of logparse into something a little more capable. Instead of a single hardcoded output format, you can now customise it however you like.
If you wanted tab-separated output, with the Unix timestamp rendered as a datetime with milliseconds, simply call logparse with these selectors:
> logparse -s "datetime_ms tab client_ip tab proto tab status tab uri" caddy.log
2024-05-26 13:42:52.601 CET 10.10.10.117 HTTP/2.0 200 /gentoo/gentoo-distfiles/distfiles/9b/tl-hrlatex.source-2021.tar.xz
2024-05-26 13:43:50.508 CET 10.10.10.134 HTTP/2.0 200 /gentoo/gentoo-portage/app-emulation/metadata.xml
2024-05-26 13:44:00.654 CET 10.10.10.132 HTTP/2.0 200 /gentoo/gentoo-distfiles/distfiles/b6/libvirt-python-9.9.0.tar.gz
2024-05-26 02:22:43.414 CET 240e:1:1::1234 HTTP/3.0 200 /res/browse.css
I also added a conversion from Caddy's decimal TLS version and cipher suite codes to their common names, such as TLS 1.3 and TLS_RSA_WITH_AES_128_CBC_SHA256.
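(If you wanted to do the same with plain jq, a small lookup table is enough; the numbers below are the standard code points Go's crypto/tls uses, e.g. 772 is TLS 1.3 and 60 is TLS_RSA_WITH_AES_128_CBC_SHA256, and request.tls.* assumes current Caddy field names:)

jq -r '{"769":"TLS 1.0","770":"TLS 1.1","771":"TLS 1.2","772":"TLS 1.3"} as $ver | {"4865":"TLS_AES_128_GCM_SHA256","4866":"TLS_AES_256_GCM_SHA384","4867":"TLS_CHACHA20_POLY1305_SHA256","60":"TLS_RSA_WITH_AES_128_CBC_SHA256"} as $cs | "\($ver[.request.tls.version | tostring] // "unknown") \($cs[.request.tls.cipher_suite | tostring] // "unknown")"' caddy.log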
It’s possible to change the naming scheme and the jq selectors/filters in the config file. I suppose you could create different configs for different sources, not just Caddy’s.
More examples, details, and source code can be found on my wiki: logparse: A JSON log parser for Caddy webserver logs.
I hope it will be useful to someone.
I’ve found this useful when you’re only logging to standard output:
docker compose logs web | tail -1 | cut -d"|" -f2- | jq
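(The cut is there because Compose prefixes every line with the service name and a | separator; -f2- keeps everything after the first |, i.e. the JSON itself.)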
If you’re using journald as a log store, you can prefix most of the jq examples here as follows: journalctl -f -u caddy -o cat | jq <etc>