On a dedicated Linux VM
As a gateway from the internet to 3 servers (4 services), with a reverse proxy
Also filtering some countries with caddy-maxmind-geolocation
a. System environment:
Ubuntu 20.04
systemd
b. Command:
sudo systemctl enable --now caddy
c. Service/unit/compose file:
# caddy.service
#
# For using Caddy with a config file.
#
# Make sure the ExecStart and ExecReload commands are correct
# for your installation.
#
# See https://caddyserver.com/docs/install for instructions.
#
# WARNING: This service does not use the --resume flag, so if you
# use the API to make changes, they will be overwritten by the
# Caddyfile next time the service is restarted. If you intend to
# use Caddy's API to configure it, add the --resume flag to the
# `caddy run` command or use the caddy-api.service file instead.
[Unit]
Description=Caddy
Documentation=https://caddyserver.com/docs/
After=network.target network-online.target
Requires=network-online.target
[Service]
Type=notify
User=caddy
Group=caddy
ExecStart=/usr/bin/caddy run --environ --config /etc/caddy/Caddyfile
ExecReload=/usr/bin/caddy reload --config /etc/caddy/Caddyfile --force
TimeoutStopSec=5s
LimitNOFILE=1048576
LimitNPROC=512
PrivateDevices=yes
PrivateTmp=true
ProtectSystem=full
AmbientCapabilities=CAP_NET_BIND_SERVICE
[Install]
WantedBy=multi-user.target
d. My complete Caddy config:
# Caddyfile
# 17/12/2022
#
{
# debug
order rate_limit before basicauth
}
(checks) {
# check if the client is local
@is_local remote_ip 192.168.9.0/24
# check if the client is from an authorised country (with the plugin caddy-maxmind-geolocation)
@in_countries {
maxmind_geolocation {
db_path "/usr/share/GeoIP/GeoLite2-Country.mmdb"
allow_countries FR PT DK SE NL AT BE DE LU IE IT ES GI GB CH MC AD LI NO VA SM
}
}
# check if the url is admin and so unauthorised
@url_authorized {
not {
path /wp-admin* /wp-login* /wp-comments*
}
}
}
(rateLimit) {
# rate limit for all client (static_limit) and each client (dynamic_limit)
rate_limit {
distributed
zone static_limit {
key static
events 100
window 10s
}
zone dynamic_limit {
key {remote_host}
events 60
window 1m
}
}
}
(logsW) {
log {
output file /var/log/caddy/access.log
}
}
(handleProxy) {
# handle the reverse proxy based on the previous snippet with the target as argument
import checks
handle @is_local {
reverse_proxy {args.0}
}
handle @url_authorized {
handle @in_countries {
import rateLimit
reverse_proxy {args.0}
}
handle {
abort
}
}
handle {
error 404
}
}
www.atelier.bris.fr bris.fr www.bris.fr {
redir https://atelier.bris.fr{uri}
}
atelier.bris.fr/defaultsite {
redir https://atelier.bris.fr
}
atelier.bris.fr {
import handleProxy "http://atelier.lan.bris.fr:80"
import logsW
}
3. The problem I’m having:
With this Caddyfile I get the access log in /var/log/caddy/:
-rw-rw-r-- 1 caddy caddy 94140 Dec 18 19:36 access.log
and it works fine.
I’d like to move this log file to /tmp to process it: format it and append it to a daily file on a server.
What happens is: when I move the file to /tmp, Caddy keeps writing to the file now in /tmp, and no new access.log appears in /var/log/caddy/.
If I remove the log file in /var/log/caddy/, access.log is no longer updated by Caddy until I disable/enable the service.
Is this normal?
How can I consume /var/log/caddy/access.log, for instance with a cron job every 5 minutes?
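I’m imagining something like this in cron (the file and script paths here are just made up for the example):

```
# /etc/cron.d/caddy-log-export  -- hypothetical file and script path
*/5 * * * * root /usr/local/bin/caddy-log-export.sh
```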
Thanks for the doc
I’ve read it and my understanding is that I’ll get a file, and then the file will be compressed:
-rw-r--r-- 1 caddy caddy 17328 Dec 20 21:35 /var/log/caddy/access.log
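For the record, I see that the rolling/compression is configured on the `log` directive’s file output; a sketch with illustrative values:

```
log {
	output file /var/log/caddy/access.log {
		roll_size 10MiB     # roll when the file reaches this size
		roll_keep 5         # keep at most 5 rolled files
		roll_keep_for 720h  # delete rolled files older than 30 days
	}
}
```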
My point is what to do after that, if I want to start simple (not going to something like Grafana right now).
So I was thinking about an Excel file to analyse the logs, as right now I don’t know what to look at!
I’ve read about jq, but how do I get the data out of the log file without breaking it, and without processing the same data several times?
So I found that mv was not good, rm + touch neither, but truncate -s 0 filename is OK.
The whole script I’ve made:
#!/bin/bash
#
# Extract fields from the Caddy access log, format them to CSV and
# append them to a day file and a month file on the NFS folder.
#
# 18/12/2022 - initial
# 20/12/2022 - make month and day files

# Path to the NFS folder
DESTDIRNAS="/mnt/criosNFS/log"

# Files
LOGFILE="/var/log/caddy/access.log"
LOGTOTMP="/tmp/log$$"
LOGTOCSV="/tmp/logCSV"
YYYYMMDD=$(date +"%Y-%m-%d")
YYYYMM=$(date +"%Y-%m")
LOGTODAY="$DESTDIRNAS/access_$YYYYMMDD.csv"
LOGTOMONTH="$DESTDIRNAS/access_$YYYYMM.csv"

# Nothing to do if the log file is missing or empty
if [ ! -s "$LOGFILE" ] ; then
	echo "No $LOGFILE"
	exit 0
fi

# Copy LOGFILE to tmp, then empty it in place
# (truncate keeps Caddy's open file handle valid, unlike mv or rm)
if ! cp "$LOGFILE" "$LOGTOTMP" ; then
	echo "ERROR: cp $LOGFILE to $LOGTOTMP KO"
	exit 1
fi
truncate -s 0 "$LOGFILE"
chmod 777 "$LOGTOTMP"

# Format log to CSV
if ! jq -r '[.ts,.request.remote_ip,.request.host,.request.uri,.status]|@csv' "$LOGTOTMP" >> "$LOGTOCSV" ; then
	echo "ERROR: format $LOGTOTMP to $LOGTOCSV"
	mv "$LOGTOTMP" "${LOGTOTMP}_err"
	rm -f "$LOGTOCSV"
	exit 2
fi
chmod 777 "$LOGTOCSV"

# Append the new lines to the day file on NFS, then to the month file
# (append LOGTOCSV to both, so the month file gets each line only once)
if ! cat "$LOGTOCSV" >> "$LOGTODAY" ; then
	echo "ERROR: cat $LOGTOCSV >> $LOGTODAY"
	exit 3
fi
cat "$LOGTOCSV" >> "$LOGTOMONTH"
chmod 777 "$LOGTODAY"
chmod 777 "$LOGTOMONTH"
rm -f "$LOGTOTMP" "$LOGTOCSV"
I didn’t succeed in setting up goaccess.
I’ve read some topics on the forum but it still doesn’t work.
==15664== GoAccess - Copyright (C) 2009-2020 by Gerardo Orellana
==15664== https://goaccess.io - <hello@goaccess.io>
==15664== Released under the MIT License.
==15664==
==15664== FILE: /var/log/caddy/access.log
==15664== Parsed 10 lines producing the following errors:
==15664==
==15664== IPv4/6 is required.
==15664== Format Errors - Verify your log/date/time format
You can check your current version via goaccess --version
Do not use the package provided by the official Ubuntu repositories.
That one is horribly outdated.
Use the official GoAccess repository instead: https://goaccess.io/download#official-repo
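If I remember correctly, recent versions (1.5.2 and later) ship a predefined format for Caddy’s JSON logs, so once you’re on an up-to-date build it should be as simple as:

```
# assumes GoAccess >= 1.5.2 with the built-in CADDY log format
goaccess /var/log/caddy/access.log --log-format=CADDY -o report.html
```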