Logging both JSON and common formats for the same site

1. Caddy version (caddy version):

v2.1.1 (official docker image)

2. How I run Caddy:

I’m using Caddy as a simple static web server. I really like the JSON logs, but I also want Common Log Format logs so I can point fail2ban at them. I’d like to configure my site to log twice: once in JSON and once in common format.

a. System environment:

Raspberry Pi 4
Ubuntu 20.04.1 Server
Docker version 19.03.8

b. Command:

Whatever the official docker image does by default

c. Service/unit/compose file:

  caddy:
    image: caddy
    container_name: caddy
    hostname: caddy
    restart: unless-stopped
    ports:
      - 80:80
      - 443:443
    volumes:
      - ./caddy/Caddyfile:/etc/caddy/Caddyfile
      - ./caddy/data:/data
      - ./caddy/config:/config
      - ./caddy/landing:/srv/landing

d. My complete Caddyfile or JSON config:

# logging
(logging) {
    log {
        format json
        output file /data/log/caddy.log {
            roll_size       50MiB
            roll_keep       10
            roll_keep_for 8760h
        }
    }

    #log {
    #  format single_field common_log
    #  output file /data/log/access.log {
    #    roll_size       50MiB
    #    roll_keep       10
    #    roll_keep_for 8760h
    #  }
    #}
}


# Root site
landing.dinn.ca, landing.stevedinn.com {
    root * /srv/landing
    file_server
    import logging
}

3. The problem I’m having:

I can’t figure out the correct way to configure multiple log outputs for my site in Caddy. I’m sure it’s possible, but despite poring over the documentation and this forum, I can’t seem to get it right.

4. Error messages and/or full log output:

I don’t really have anything to post here.

5. What I already tried:

So far, I’ve tried having multiple log sections in my logging snippet, but it only uses the last one, not both.

6. Links to relevant resources:

This post seems to have come the closest so far to an answer that I can digest, but I’m not there yet.

I’m a Caddy noob, but I’ve really tried my best to get this to work. I find there’s a distinct lack of complete configuration examples in the documentation; I see lots of snippets of configuration, but it’s hard to know where they’re all supposed to go.

Edit: Missed pasting in my docker-compose snippet.

Somebody correct me if I’m wrong, but it seems that this is not possible using the Caddyfile configuration format? I’ve exported my Caddyfile to JSON, and I’ve mocked up this JSON config. Will this accomplish what I want?

Side note: is there a way I can use the JSON configuration without submitting it to the admin API? Can I just keep it in a file like the Caddyfile?

{
    "logging": {
        "logs": {
            "default": {
                "exclude": [
                    "http.log.access.json",
                    "http.log.access.common",
                    "http.log.access.common_and_json"
                ]
            },
            "json": {
                "writer": {
                    "filename": "/data/log/caddy.log",
                    "output": "file",
                    "roll": true,
                    "roll_size_mb": 10,
                    "roll_gzip": true,
                    "roll_local_time": true,
                    "roll_keep": 10,
                    "roll_keep_days": 365
                },
                "encoder": {
                    "format": "json"
                },
                "include": [
                    "http.log.access.json"
                ]
            },
            "common": {
                "writer": {
                    "filename": "/data/log/access.log",
                    "output": "file",
                    "roll": true,
                    "roll_size_mb": 10,
                    "roll_gzip": true,
                    "roll_local_time": true,
                    "roll_keep": 10,
                    "roll_keep_days": 365
                },
                "encoder": {
                    "format": "single_field",
                    "field": "common_log"
                },
                "include": [
                    "http.log.access.common"
                ]
            },
            "common_and_json": {
                "include": [
                    "http.log.access.json",
                    "http.log.access.common"
                ]
            }
        }
    },
    "apps": {
        "http": {
            "servers": {
                "hera": {
                    "listen": [
                        ":443"
                    ],
                    "routes": [
                        {
                            "match": [
                                {
                                    "host": [
                                        "landing.dinn.ca",
                                        "landing.stevedinn.com"
                                    ]
                                }
                            ],
                            "handle": [
                                {
                                    "handler": "subroute",
                                    "routes": [
                                        {
                                            "handle": [
                                                {
                                                    "handler": "vars",
                                                    "root": "/srv/landing"
                                                },
                                                {
                                                    "handler": "file_server",
                                                    "hide": [
                                                        "/etc/caddy/Caddyfile"
                                                    ]
                                                }
                                            ]
                                        }
                                    ]
                                }
                            ],
                            "terminal": true
                        }
                    ],
                    "logs": {
                        "logger_names": {
                            "landing.dinn.ca": "common_and_json",
                            "landing.stevedinn.com": "common_and_json"
                        }
                    }
                }
            }
        },
        "tls": {
            "automation": {
                "policies": [
                    {
                        "issuer": {
                            "email": "steve@stevedinn.com",
                            "module": "acme"
                        }
                    }
                ]
            }
        }
    }
}

It’s not possible from the Caddyfile currently, but it is possible if you use JSON config instead.

The logging config has been purposefully kept simple so far because it’s quite complex to configure logging for individual sites; the logging app is a completely separate system from the http app, as you’ll see in the underlying JSON config.

I suggest you open an issue on GitHub to ask for this.

For now though, I suggest you set up some tooling separately that reads from the JSON logs and writes the common_log field (which is included in the JSON) to another file. You can do this with jq pretty easily, a CLI tool that manipulates JSON, but you’ll need to find something that keeps it running continually as new logs are streamed in.
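
For example, something along these lines might do the trick (untested, and assuming the log paths from your config above; --unbuffered makes jq flush each line as it arrives):

tail -f /data/log/caddy.log | jq -r --unbuffered '.common_log' >> /data/log/access.log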

This might be useful for you:

Hah you came to that conclusion.

To run with the JSON config, override the Docker command to use caddy run --config /etc/caddy/caddy.json instead.
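
With the compose file above, that could look roughly like this (the ./caddy/caddy.json host path is just an example; adjust it to wherever you keep the JSON config):

  caddy:
    image: caddy
    container_name: caddy
    restart: unless-stopped
    command: caddy run --config /etc/caddy/caddy.json
    ports:
      - 80:80
      - 443:443
    volumes:
      - ./caddy/caddy.json:/etc/caddy/caddy.json
      - ./caddy/data:/data
      - ./caddy/config:/config
      - ./caddy/landing:/srv/landing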

Another thought I had was just to have a separate program parse the JSON file and output the format I’d like to another file.

I’ve written a small Python script that listens on stdin for Caddy JSON logs and spits them out in Common-like format. Note that this isn’t exactly the same as Caddy’s common_log format, which doesn’t include the referer. There are several fail2ban jails that need the referer to work properly, so this might actually be the better option.

#!/usr/bin/env python3

# Read Caddy JSON access log entries from stdin and print them in a
# Common-Log-style format (with the referer and user agent appended).

import sys
import json
import datetime

k = 0
try:
    for line in sys.stdin:
        # Skip blank lines rather than letting json.loads() blow up on them.
        if not line.strip():
            continue

        l = json.loads(line)

        # Drop the port from "ip:port" (naive; IPv6 addresses also contain colons).
        remoteIp = l["request"]["remote_addr"].split(":")[0]

        # Caddy's "ts" field is a Unix timestamp in seconds.
        timestamp = datetime.datetime.fromtimestamp(l["ts"])

        referer = '-'
        if 'Referer' in l['request']['headers']:
            referer = l['request']['headers']['Referer'][0]

        userAgent = '-'
        if 'User-Agent' in l['request']['headers']:
            userAgent = l['request']['headers']['User-Agent'][0]

        print(
            remoteIp + " "
            + " -" + " "
            + "[" + timestamp.isoformat(sep=' ', timespec='milliseconds') + "]" + " "
            + '"' + l['request']['host'] + '"' + " "
            + l['request']['method'] + " "
            + l['request']['uri'] + " "
            + l['request']['proto'] + " "
            + str(l['status']) + " "
            + str(l['size']) + " "
            + '"' + referer + '"' + " "
            + '"' + userAgent + '"'
        )
        k = k + 1
except Exception:
    print(sys.exc_info()[0])
    sys.stdout.flush()
    print("Error on line: " + str(k))

This may be more of a Linux problem than a Caddy problem, but is there any way to tail -f the caddy JSON log file and have it continuously write to access.log? My attempts have not worked.

tail -f /srv/docker/caddy/data/log/caddy.log | python3 /srv/docker/caddy/caddy-to-common.py | tee -a /srv/docker/caddy/data/log/access.log

That command seems to just stall and stop writing to access.log
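
If it’s the buffering issue I’m thinking of, the stall is probably Python rather than tail: when stdout is a pipe instead of a terminal, Python block-buffers its output, so printed lines sit in the buffer until a few kilobytes accumulate. Running the interpreter unbuffered with python3 -u (or printing with flush=True in the script) should make lines show up in access.log as they arrive:

tail -f /srv/docker/caddy/data/log/caddy.log | python3 -u /srv/docker/caddy/caddy-to-common.py | tee -a /srv/docker/caddy/data/log/access.log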

For now though, I suggest you set up some tooling separately that reads from the JSON logs and writes the common_log field (which is included in the JSON) to another file. You can do this with jq pretty easily, a CLI tool that manipulates JSON, but you’ll need to find something that keeps it running continually as new logs are streamed in.

Wow, it seems like I was on the right track then.

Thanks for your help.

FYI, the other option is to take your Caddyfile, use caddy adapt to get the underlying JSON, pipe that through jq to make whatever modifications you need with its various manipulation commands, and then run Caddy with that JSON config. That way you can keep the Caddyfile as your base and just sprinkle in the extra bits programmatically.

Basic, incomplete example:

caddy adapt --config Caddyfile | jq '.logging.logs += {"common": { ... }}' > caddy.json
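
Filling in the elided part with the common logger from the JSON config earlier in this thread might look roughly like this (untested; you’d still need the server’s logger_names mapping for http.log.access.common to be emitted):

caddy adapt --config Caddyfile | jq '.logging.logs += {
  "common": {
    "writer": {
      "output": "file",
      "filename": "/data/log/access.log"
    },
    "encoder": {
      "format": "single_field",
      "field": "common_log"
    },
    "include": ["http.log.access.common"]
  }
}' > caddy.json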

I’m still a novice at jq, so I likely won’t be able to help much with usage here; Google/StackOverflow will be your friend :sweat_smile:

It is hilarious to me how close I came to independently writing nearly the same Python script as @DWiskow. Of course his is more mature and developed, and mine is only an hour old, but still :stuck_out_tongue:
