Caddy fails with systemd but runs fine in the console

1. The problem I’m having:

Caddy fails to start as a systemd service and errors out. Here's the error message:

× caddy.service - Caddy
     Loaded: loaded (/lib/systemd/system/caddy.service; enabled; vendor preset: enabled)
     Active: failed (Result: exit-code) since Sat 2024-11-23 20:01:16 IST; 1 day 7h ago
       Docs: https://caddyserver.com/docs/
    Process: 5521 ExecStart=/usr/bin/caddy run --environ --config /etc/caddy/Caddyfile (code=exited, status=1/FAILURE)
   Main PID: 5521 (code=exited, status=1/FAILURE)
     Status: "loading new config: http app module: start: finalizing automatic HTTPS: managing certificates for [kamakoti.tv www.kamakoti.tv stream2.kamakoti.tv stream.kamakoti.tv admin.kamakoti.tv beta.kamakoti.tv]: automate: manage [kamakoti.tv www.kamakoti.tv admin.kamakoti.tv beta.kamakoti.tv]: kamakoti.tv: caching certificate: decoding certificate metadata: unexpected end of JSON input"
        CPU: 68ms

Nov 23 20:01:16 kamakotitv caddy[5521]: {"level":"info","ts":1732372276.8801594,"logger":"tls.cache.maintenance","msg":"started background certificate maintenance","cache":"0xc0001be080"}
Nov 23 20:01:16 kamakotitv caddy[5521]: {"level":"info","ts":1732372276.8804023,"msg":"[INFO][FileStorage:/var/lib/caddy/.local/share/caddy] Lock for 'storage_clean' is stale (created: 2024-11-23 20:00:45.037774379 +0530 IST, last update: 2024-11-23 20:00:45.037774379 +0530 IST); removing then retrying: /var/lib/caddy/.local/share/caddy/locks/storage_clean.lock"}
Nov 23 20:01:16 kamakotitv caddy[5521]: {"level":"info","ts":1732372276.8806276,"logger":"http.log","msg":"server running","name":"srv0","protocols":["h1","h2","h3"]}
Nov 23 20:01:16 kamakotitv caddy[5521]: {"level":"info","ts":1732372276.880672,"logger":"http.log","msg":"server running","name":"remaining_auto_https_redirects","protocols":["h1","h2","h3"]}
Nov 23 20:01:16 kamakotitv caddy[5521]: {"level":"info","ts":1732372276.880676,"logger":"http","msg":"enabling automatic TLS certificate management","domains":["kamakoti.tv","www.kamakoti.tv","stream2.kamakoti.tv","stream.kamakoti.tv","admin.kamakoti.tv","beta.kamakoti.tv"]}
Nov 23 20:01:16 kamakotitv caddy[5521]: {"level":"info","ts":1732372276.8807924,"logger":"tls.cache.maintenance","msg":"stopped background certificate maintenance","cache":"0xc0001be080"}
Nov 23 20:01:16 kamakotitv caddy[5521]: Error: loading initial config: loading new config: http app module: start: finalizing automatic HTTPS: managing certificates for [kamakoti.tv www.kamakoti.tv stream2.kamakoti.tv stream.kamakoti.tv admin.kamakoti.tv beta.kamakoti.tv]: automate: manage [kamakoti.tv www.kamakoti.tv admin.kamakoti.tv beta.kamakoti.tv]: kamakoti.tv: caching certificate: decoding certificate metadata: unexpected end of JSON input
Nov 23 20:01:16 kamakotitv systemd[1]: caddy.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 20:01:16 kamakotitv systemd[1]: caddy.service: Failed with result 'exit-code'.
Nov 23 20:01:16 kamakotitv systemd[1]: Failed to start Caddy.


But when I run it in the console via “caddy run”, it works fine.

3. Caddy version:

2.8.4

4. How I installed and ran Caddy:

Via package install.

a. System environment:

Ubuntu 22.04 with the latest kernel

b. Command:

I ran caddy validate and it returned no errors.

d. My complete Caddy config:

kamakoti.tv www.kamakoti.tv {
        root * /var/www/html/ktvm

        reverse_proxy localhost:19350

        header {
                # disable FLoC tracking
                Permissions-Policy interest-cohort=()

                # enable HSTS
                Strict-Transport-Security max-age=31536000;

                # disable clients from sniffing the media type
                X-Content-Type-Options nosniff

                # clickjacking protection
                X-Frame-Options DENY

                # keep referrer data off of HTTP connections
                Referrer-Policy no-referrer-when-downgrade
        }
}

stream.kamakoti.tv {
        tls {
                dns cloudflare fffffffffffffffffffffffffff
        }
        reverse_proxy http://10.1.0.5:8880 {
        }
}

beta.kamakoti.tv {
        reverse_proxy http://10.1.0.5:3000
}

admin.kamakoti.tv {
        reverse_proxy http://10.1.0.5:18943
}

stream2.kamakoti.tv {
        tls {
                dns cloudflare ffffffffffffffffffffffffffffffff
        }
        reverse_proxy http://10.1.0.5:8980 {
        }
}

Unfortunately, this is a known issue with Caddy prior to v2.9.0 (currently in beta, where it's fixed). Sometimes the storage code fails to write the file correctly, so it ends up corrupted. The workaround is to wipe out the storage for that domain and let Caddy reissue the certs.

You can find the storage at /var/lib/caddy/.local/share/caddy.
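A rough sketch of that cleanup (assumed paths; the issuer directory under certificates/ depends on which CA issued the cert, so check what find prints before deleting anything):

# stop the service so Caddy isn't touching storage while you clean up
sudo systemctl stop caddy

# locate the stored data for the affected domain
sudo find /var/lib/caddy/.local/share/caddy/certificates -type d -name kamakoti.tv

# remove that domain's directory (substitute the issuer path that find printed)
sudo rm -rf /var/lib/caddy/.local/share/caddy/certificates/<issuer>/kamakoti.tv

# start Caddy again and it will obtain a fresh certificate
sudo systemctl start caddy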

You probably don't need this stuff. Don't blindly set headers like this; security headers are an application-layer concern. Your upstream app should set the headers it needs to work correctly.

This doesn't make sense; root has no effect here. You typically use root when you're serving static files with file_server, but with only reverse_proxy you're not doing that. Remove it.
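With those two changes, that first site block could shrink to something like this:

kamakoti.tv www.kamakoti.tv {
        # no root/file_server and no extra headers needed; just proxy
        reverse_proxy localhost:19350
}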

You can deduplicate this with snippets: Caddyfile Concepts — Caddy Documentation
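For example, a sketch assuming both sites can share the same Cloudflare token (the snippet name cloudflare_tls is arbitrary):

# reusable TLS config, imported by multiple sites
(cloudflare_tls) {
        tls {
                dns cloudflare fffffffffffffffffffffffffff
        }
}

stream.kamakoti.tv {
        import cloudflare_tls
        reverse_proxy http://10.1.0.5:8880
}

stream2.kamakoti.tv {
        import cloudflare_tls
        reverse_proxy http://10.1.0.5:8980
}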

Also, you can use env vars so that your Caddyfile doesn't contain secrets (making it easier to copy-paste without leaking them). See Keep Caddy Running — Caddy Documentation for that.
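Roughly, that means setting the token in the caddy service's environment (for example with an Environment=CF_API_TOKEN=... line in a systemd drop-in created via systemctl edit caddy) and referencing it as a placeholder; CF_API_TOKEN is just an example name:

(cloudflare_tls) {
        tls {
                # token is read from the service's environment, not stored in this file
                dns cloudflare {env.CF_API_TOKEN}
        }
}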

Thanks, how do I do that if I want to have multiple CF tokens for different domains?

You can have more than one env var and snippet; just give them names. But why do you have more than one anyway?
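A sketch of that (the snippet and variable names here are made up):

(cf_account_a) {
        tls {
                dns cloudflare {env.CF_TOKEN_ACCOUNT_A}
        }
}

(cf_account_b) {
        tls {
                dns cloudflare {env.CF_TOKEN_ACCOUNT_B}
        }
}

Then each site block imports whichever snippet matches its account, e.g. import cf_account_a in stream.kamakoti.tv and import cf_account_b in stream2.kamakoti.tv.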

I have a single Caddy reverse proxy server, and the domains are in different CF accounts.
