On Demand TLS for the front-end only

1. The problem I’m having:

I’m looking to host several websites and would like some help on how to approach serving my apps. I’m new to Caddy, by the way.
My use case:
I have one app running on localhost:8000 and a second app running on localhost:3000.
The app on localhost:8000 is a CMS dashboard that users will access through a single domain, e.g. mydashboard.thecms.com.

The app on localhost:3000 is the front-end app, which will hopefully serve hundreds or thousands of domains. I have discovered on-demand TLS, which is great and exactly what I need here, as I won’t know the domain names in advance.

My apps are currently running on a Rocky Linux 9 server. Everything is hardcoded, as I am testing with 3 websites at the moment.

This is the current Caddyfile:

mydashboard.thecms.com {
    reverse_proxy localhost:8000
}

frontend1.com, www.frontend1.com {
    reverse_proxy localhost:3000
}

frontend2.com, www.frontend2.com {
    reverse_proxy localhost:3000
}

This is running fine for testing; I’m now looking to implement on-demand TLS. Please see the new Caddyfile I’m trying to implement below.

2. Error messages and/or full log output:


3. Caddy version:

v2.7.6

4. How I installed and ran Caddy:

a. System environment:

b. Command:

sudo dnf install 'dnf-command(copr)'
sudo dnf copr enable @caddy/caddy
sudo dnf install caddy

c. Service/unit/compose file:


d. My complete Caddy config:

# for the dashboard
mydashboard.thecms.com {
    reverse_proxy localhost:8000
}

# on demand tls for the frontend
{
    on_demand_tls {
        ask http://localhost:5555/check
    }
}

https:// {
    tls {
        on_demand
    }
    reverse_proxy localhost:3000
}

Will this work? Just checking that I could have a setup like this one.

5. Links to relevant resources:

Serving tens of thousands of domains over HTTPS with Caddy - Wiki - Caddy Community

Thank you very much in advance.

Yes, that would work, except you should always place global options first in your Caddyfile, not in the middle.

If you have any site block with a domain, Caddy will manage that directly, whereas for other domains it will invoke your ask endpoint (if an existing cert isn’t cached) to see if it should be allowed.
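For reference, the ask endpoint can be very small. Here’s a minimal sketch in Python, assuming the /check path and port 5555 from your config. The domain query parameter is what Caddy sends with each request; the ALLOWED set is a stand-in for whatever lookup your CMS actually does (e.g. a database query):

```python
# Minimal sketch of an "ask" endpoint for Caddy's on-demand TLS.
# Caddy sends GET /check?domain=<hostname>; a 200 response allows
# a certificate to be issued, any other status denies it.
import sys
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Stand-in for a real lookup against your CMS database.
ALLOWED = {"frontend1.com", "www.frontend1.com",
           "frontend2.com", "www.frontend2.com"}

def check_domain(domain: str) -> int:
    """Return the HTTP status the ask endpoint should answer with."""
    return 200 if domain in ALLOWED else 400

class AskHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path != "/check":
            self.send_response(404)
            self.end_headers()
            return
        domain = parse_qs(url.query).get("domain", [""])[0]
        self.send_response(check_domain(domain))
        self.end_headers()

if __name__ == "__main__" and "--serve" in sys.argv:
    HTTPServer(("localhost", 5555), AskHandler).serve_forever()
```

You can sanity-check it with something like `curl -i 'http://localhost:5555/check?domain=frontend1.com'` and confirm you get a 200 for known domains and a 400 for anything else.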

Hi, thank you. I will try this soon. Is there anything I should take into consideration before replacing my current Caddyfile with the new one that has on-demand TLS enabled?

Another thing I’ve noticed with my current file is that when the server restarts, Caddy fails to start and throws a timeout error. So what I have been doing is restarting Caddy with one domain, then adding another domain and reloading Caddy, and so on (very primitive :grin:). I guess I should be adding some timeout to each domain? Any examples?
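For context, the kind of timeout I mean would be a systemd drop-in like this (just a sketch, I haven’t tried it; the 300s value is arbitrary, and this would only give Caddy more time to start rather than fix whatever is slowing it down):

```ini
# /etc/systemd/system/caddy.service.d/override.conf
# Create with `sudo systemctl edit caddy`, then `sudo systemctl daemon-reload`.
# Raises systemd's start timeout for Caddy (the default is 90s on many distros).
[Service]
TimeoutStartSec=300
```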

This is my current Caddyfile:

mydashboard.thecms.com {
    reverse_proxy localhost:8000
}

frontend1.com, www.frontend1.com {
    reverse_proxy localhost:3000
}

frontend2.com, www.frontend2.com {
    reverse_proxy localhost:3000
}

This will be the new Caddyfile:

{
    on_demand_tls {
        ask http://localhost:5555/check
    }
}

https:// {
    tls {
        on_demand
    }
    reverse_proxy localhost:3000
}

mydashboard.thecms.com {
    reverse_proxy localhost:8000
}

Another option that occurs to me is using the Caddy API: from my dashboard app, I could send a POST request to http://localhost:2019 to add a new domain to the Caddy config every time a new domain is added. Would this be a good option, or would you recommend using on-demand TLS instead?

Cheers,
Eustachio

Not really.

Make sure your ask endpoint works properly and returns a 200 status response when asked for domains you know about, and that it rejects other domains with a 400.

Huh? Show your logs. I don’t understand what you mean.

On Demand seems exactly like what you need, but I don’t have a full picture of what you’re trying to achieve.

I can’t recommend using the API if you also want to use a Caddyfile; those are inherently incompatible approaches, because config changes made via the API will not be persisted back to the Caddyfile. Caddyfile-to-JSON is a one-way, “lossy” conversion.

Ok, basically if I reboot the server, Caddy will try to reload. Here is the error when running journalctl -xeu caddy.service --no-pager:

Dec 11 15:28:54 webarto-server systemd[1]: Starting Caddy...
░░ Subject: A start job for unit caddy.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit caddy.service has begun execution.
░░
░░ The job identifier is 235.
Dec 11 15:28:54 webarto-server caddy[784]: caddy.HomeDir=/var/lib/caddy
Dec 11 15:28:54 webarto-server caddy[784]: caddy.AppDataDir=/var/lib/caddy/.local/share/caddy
Dec 11 15:28:54 webarto-server caddy[784]: caddy.AppConfigDir=/var/lib/caddy/.config/caddy
Dec 11 15:28:54 webarto-server caddy[784]: caddy.ConfigAutosavePath=/var/lib/caddy/.config/caddy/autosave.json
Dec 11 15:28:54 webarto-server caddy[784]: caddy.Version=v2.7.6 h1:w0NymbG2m9PcvKWsrXO6EEkY9Ru4FJK8uQbYcev1p3A=
Dec 11 15:28:54 webarto-server caddy[784]: runtime.GOOS=linux
Dec 11 15:28:54 webarto-server caddy[784]: runtime.GOARCH=amd64
Dec 11 15:28:54 webarto-server caddy[784]: runtime.Compiler=gc
Dec 11 15:28:54 webarto-server caddy[784]: runtime.NumCPU=4
Dec 11 15:28:54 webarto-server caddy[784]: runtime.GOMAXPROCS=4
Dec 11 15:28:54 webarto-server caddy[784]: runtime.Version=go1.20.10
Dec 11 15:28:54 webarto-server caddy[784]: os.Getwd=/
Dec 11 15:28:54 webarto-server caddy[784]: LANG=en_US.UTF-8
Dec 11 15:28:54 webarto-server caddy[784]: PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin
Dec 11 15:28:54 webarto-server caddy[784]: NOTIFY_SOCKET=/run/systemd/notify
Dec 11 15:28:54 webarto-server caddy[784]: HOME=/var/lib/caddy
Dec 11 15:28:54 webarto-server caddy[784]: LOGNAME=caddy
Dec 11 15:28:54 webarto-server caddy[784]: USER=caddy
Dec 11 15:28:54 webarto-server caddy[784]: INVOCATION_ID=d02e72ad6517492094888d309829a91e
Dec 11 15:28:54 webarto-server caddy[784]: JOURNAL_STREAM=8:17586
Dec 11 15:28:54 webarto-server caddy[784]: SYSTEMD_EXEC_PID=784
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6135402,"msg":"using provided configuration","config_file":"/etc/caddy/Caddyfile","config_adapter":""}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6172988,"logger":"admin","msg":"admin endpoint started","address":"localhost:2019","enforce_origin":false,"origins":["//127.0.0.1:2019","//localhost:2019","//[::1]:2019"]}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6182199,"logger":"http.auto_https","msg":"server is listening only on the HTTPS port but has no TLS connection policies; adding one to enable TLS","server_name":"srv0","https_port":443}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6183107,"logger":"http.auto_https","msg":"enabling automatic HTTP->HTTPS redirects","server_name":"srv0"}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6194887,"logger":"tls.cache.maintenance","msg":"started background certificate maintenance","cache":"0xc00010ea80"}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6208663,"logger":"http","msg":"enabling HTTP/3 listener","addr":":443"}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6224236,"logger":"http.log","msg":"server running","name":"srv0","protocols":["h1","h2","h3"]}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6227007,"logger":"http.log","msg":"server running","name":"remaining_auto_https_redirects","protocols":["h1","h2","h3"]}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6228032,"logger":"http","msg":"enabling automatic TLS certificate management","domains":["mydashboard.webarto.co.uk","mattheweyles.art","www.mattheweyles.art","brianburton.art","www.brianburton.art"]}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"warn","ts":1702308534.6246464,"logger":"tls","msg":"storage cleaning happened too recently; skipping for now","storage":"FileStorage:/var/lib/caddy/.local/share/caddy","instance":"966c2002-5baa-4962-9536-3f4d8b4bb0af","try_again":1702394934.6246445,"try_again_in":86399.999999569}
Dec 11 15:28:54 webarto-server caddy[784]: {"level":"info","ts":1702308534.6247573,"logger":"tls","msg":"finished cleaning storage units"}
Dec 11 15:29:24 webarto-server caddy[784]: {"level":"warn","ts":1702308564.6274936,"logger":"tls","msg":"stapling OCSP","error":"no OCSP stapling for [mydashboard.webarto.co.uk]: making OCSP request: Post \"http://r3.o.lencr.org\": dial tcp 62.115.253.187:80: i/o timeout","identifiers":["mydashboard.webarto.co.uk"]}
Dec 11 15:29:54 webarto-server caddy[784]: {"level":"warn","ts":1702308594.6309583,"logger":"tls","msg":"stapling OCSP","error":"no OCSP stapling for [mattheweyles.art]: making OCSP request: Post \"http://r3.o.lencr.org\": dial tcp [2001:2030:21::3e73:fc68]:80: i/o timeout","identifiers":["mattheweyles.art"]}
Dec 11 15:30:24 webarto-server systemd[1]: caddy.service: start operation timed out. Terminating.
Dec 11 15:30:24 webarto-server caddy[784]: {"level":"info","ts":1702308624.299674,"msg":"shutting down apps, then terminating","signal":"SIGTERM"}
Dec 11 15:30:24 webarto-server caddy[784]: {"level":"warn","ts":1702308624.300021,"msg":"exiting; byeee!! 👋","signal":"SIGTERM"}
Dec 11 15:30:24 webarto-server caddy[784]: {"level":"warn","ts":1702308624.6338496,"logger":"tls","msg":"stapling OCSP","error":"no OCSP stapling for [www.mattheweyles.art]: making OCSP request: Post \"http://r3.o.lencr.org\": dial tcp [2001:2030:21::3e73:fc68]:80: i/o timeout","identifiers":["www.mattheweyles.art"]}
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: State 'stop-sigterm' timed out. Killing.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Killing process 784 (caddy) with signal SIGKILL.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Killing process 834 (n/a) with signal SIGKILL.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Killing process 835 (n/a) with signal SIGKILL.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Killing process 839 (n/a) with signal SIGKILL.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Killing process 1017 (caddy) with signal SIGKILL.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Killing process 1019 (n/a) with signal SIGKILL.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Main process exited, code=killed, status=9/KILL
░░ Subject: Unit process exited
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ An ExecStart= process belonging to unit caddy.service has exited.
░░
░░ The process' exit code is 'killed' and its exit status is 9.
Dec 11 15:30:29 webarto-server systemd[1]: caddy.service: Failed with result 'timeout'.
░░ Subject: Unit failed
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit caddy.service has entered the 'failed' state with result 'timeout'.
Dec 11 15:30:29 webarto-server systemd[1]: Failed to start Caddy.
░░ Subject: A start job for unit caddy.service has failed
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit caddy.service has finished with a failure.
░░
░░ The job identifier is 235 and the job result is failed.
Dec 11 15:31:39 webarto-server systemd[1]: Starting Caddy...
░░ Subject: A start job for unit caddy.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit caddy.service has begun execution.
░░
░░ The job identifier is 731.
Dec 11 15:31:39 webarto-server caddy[1457]: caddy.HomeDir=/var/lib/caddy
Dec 11 15:31:39 webarto-server caddy[1457]: caddy.AppDataDir=/var/lib/caddy/.local/share/caddy
Dec 11 15:31:39 webarto-server caddy[1457]: caddy.AppConfigDir=/var/lib/caddy/.config/caddy
Dec 11 15:31:39 webarto-server caddy[1457]: caddy.ConfigAutosavePath=/var/lib/caddy/.config/caddy/autosave.json
Dec 11 15:31:39 webarto-server caddy[1457]: caddy.Version=v2.7.6 h1:w0NymbG2m9PcvKWsrXO6EEkY9Ru4FJK8uQbYcev1p3A=
Dec 11 15:31:39 webarto-server caddy[1457]: runtime.GOOS=linux
Dec 11 15:31:39 webarto-server caddy[1457]: runtime.GOARCH=amd64
Dec 11 15:31:39 webarto-server caddy[1457]: runtime.Compiler=gc
Dec 11 15:31:39 webarto-server caddy[1457]: runtime.NumCPU=4
Dec 11 15:31:39 webarto-server caddy[1457]: runtime.GOMAXPROCS=4
Dec 11 15:31:39 webarto-server caddy[1457]: runtime.Version=go1.20.10
Dec 11 15:31:39 webarto-server caddy[1457]: os.Getwd=/
Dec 11 15:31:39 webarto-server caddy[1457]: LANG=en_US.UTF-8
Dec 11 15:31:39 webarto-server caddy[1457]: PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin
Dec 11 15:31:39 webarto-server caddy[1457]: NOTIFY_SOCKET=/run/systemd/notify
Dec 11 15:31:39 webarto-server caddy[1457]: HOME=/var/lib/caddy
Dec 11 15:31:39 webarto-server caddy[1457]: LOGNAME=caddy
Dec 11 15:31:39 webarto-server caddy[1457]: USER=caddy
Dec 11 15:31:39 webarto-server caddy[1457]: INVOCATION_ID=15447e0d279f4cd4a3e1ba710ee6daa1
Dec 11 15:31:39 webarto-server caddy[1457]: JOURNAL_STREAM=8:17334
Dec 11 15:31:39 webarto-server caddy[1457]: SYSTEMD_EXEC_PID=1457
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3535414,"msg":"using provided configuration","config_file":"/etc/caddy/Caddyfile","config_adapter":""}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.355705,"logger":"admin","msg":"admin endpoint started","address":"localhost:2019","enforce_origin":false,"origins":["//localhost:2019","//[::1]:2019","//127.0.0.1:2019"]}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.355884,"logger":"http.auto_https","msg":"server is listening only on the HTTPS port but has no TLS connection policies; adding one to enable TLS","server_name":"srv0","https_port":443}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3559098,"logger":"http.auto_https","msg":"enabling automatic HTTP->HTTPS redirects","server_name":"srv0"}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3559415,"logger":"tls.cache.maintenance","msg":"started background certificate maintenance","cache":"0xc0003a8a80"}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3566704,"logger":"http","msg":"enabling HTTP/3 listener","addr":":443"}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3568628,"logger":"http.log","msg":"server running","name":"srv0","protocols":["h1","h2","h3"]}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3569431,"logger":"http.log","msg":"server running","name":"remaining_auto_https_redirects","protocols":["h1","h2","h3"]}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3569567,"logger":"http","msg":"enabling automatic TLS certificate management","domains":["mydashboard.webarto.co.uk","mattheweyles.art","www.mattheweyles.art","brianburton.art","www.brianburton.art"]}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"warn","ts":1702308699.357847,"logger":"tls","msg":"storage cleaning happened too recently; skipping for now","storage":"FileStorage:/var/lib/caddy/.local/share/caddy","instance":"966c2002-5baa-4962-9536-3f4d8b4bb0af","try_again":1702395099.3578458,"try_again_in":86399.999999629}
Dec 11 15:31:39 webarto-server caddy[1457]: {"level":"info","ts":1702308699.3585103,"logger":"tls","msg":"finished cleaning storage units"}
Dec 11 15:32:09 webarto-server caddy[1457]: {"level":"warn","ts":1702308729.3579254,"logger":"tls","msg":"stapling OCSP","error":"no OCSP stapling for [mydashboard.webarto.co.uk]: making OCSP request: Post \"http://r3.o.lencr.org\": dial tcp [2001:2030:21::3e73:fc59]:80: i/o timeout","identifiers":["mydashboard.webarto.co.uk"]}
Dec 11 15:32:39 webarto-server caddy[1457]: {"level":"warn","ts":1702308759.3589654,"logger":"tls","msg":"stapling OCSP","error":"no OCSP stapling for [mattheweyles.art]: making OCSP request: Post \"http://r3.o.lencr.org\": dial tcp [2001:2030:21::3e73:fc68]:80: i/o timeout","identifiers":["mattheweyles.art"]}
Dec 11 15:33:09 webarto-server caddy[1457]: {"level":"warn","ts":1702308789.3601675,"logger":"tls","msg":"stapling OCSP","error":"no OCSP stapling for [www.mattheweyles.art]: making OCSP request: Post \"http://r3.o.lencr.org\": dial tcp [2001:2030:21::3e73:fc59]:80: i/o timeout","identifiers":["www.mattheweyles.art"]}
Dec 11 15:33:09 webarto-server systemd[1]: caddy.service: start operation timed out. Terminating.
Dec 11 15:33:09 webarto-server caddy[1457]: {"level":"info","ts":1702308789.5476334,"msg":"shutting down apps, then terminating","signal":"SIGTERM"}
Dec 11 15:33:09 webarto-server caddy[1457]: {"level":"warn","ts":1702308789.5477512,"msg":"exiting; byeee!! 👋","signal":"SIGTERM"}
Dec 11 15:33:14 webarto-server systemd[1]: caddy.service: State 'stop-sigterm' timed out. Killing.
Dec 11 15:33:14 webarto-server systemd[1]: caddy.service: Killing process 1457 (caddy) with signal SIGKILL.
Dec 11 15:33:14 webarto-server systemd[1]: caddy.service: Killing process 1461 (caddy) with signal SIGKILL.
Dec 11 15:33:14 webarto-server systemd[1]: caddy.service: Killing process 1462 (caddy) with signal SIGKILL.
Dec 11 15:33:14 webarto-server systemd[1]: caddy.service: Killing process 1463 (n/a) with signal SIGKILL.
Dec 11 15:33:14 webarto-server systemd[1]: caddy.service: Main process exited, code=killed, status=9/KILL
░░ Subject: Unit process exited
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ An ExecStart= process belonging to unit caddy.service has exited.
░░
░░ The process' exit code is 'killed' and its exit status is 9.
Dec 11 15:33:14 webarto-server systemd[1]: caddy.service: Failed with result 'timeout'.
░░ Subject: Unit failed
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit caddy.service has entered the 'failed' state with result 'timeout'.
Dec 11 15:33:14 webarto-server systemd[1]: Failed to start Caddy.
░░ Subject: A start job for unit caddy.service has failed
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit caddy.service has finished with a failure.
░░
░░ The job identifier is 731 and the job result is failed.

This is happening using the current Caddyfile I showed in my previous messages.

Looks like Caddy isn’t able to connect to Let’s Encrypt’s OCSP servers (which provide a proof, signed by the CA, that your certificate hasn’t been revoked).

This is a networking issue on your end. Why would that fail to connect? Do you have something which might be blocking connections to specific servers?

Ok, I have some firewalls in place; ports 80 and 443 are open, though. Is there another port that needs to be open? Please note that this only happens when I reboot the server. What I have been doing after a reboot is restarting Caddy with one domain only, which makes Caddy start (active), and then adding one domain at a time and running caddy reload. That way works.

To give more context, some of these domains come from GoDaddy, and they have been pointed to my server using A records.

I have just checked the firewalls: port 80 is open for inbound requests but isn’t open for outbound ones. Could this be what is causing the issue?

Cheers,
Thanks for helping

These are outgoing connections, so the inbound port rules don’t apply here. If you were blocking outbound connections on port 443, you wouldn’t be able to browse anything on the internet at all.

Ok, when I say that I have firewalls in place and so on, I mean on the server I have Caddy running on. If you mean networking issues on my side, such as my Wi-Fi or the connection I get internet from, I haven’t touched any of that.

Try curl -v http://r3.o.lencr.org on your Caddy server. From there, you can get some clues about which layer the blocking is happening at.

I get this:

*   Trying 2001:2030:21::3e73:fc68:80...
* Connected to r3.o.lencr.org (2001:2030:21::3e73:fc68) port 80 (#0)
> GET / HTTP/1.1
> Host: r3.o.lencr.org
> User-Agent: curl/7.76.1
> Accept: */*
> 
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Server: nginx
< Content-Length: 0
< Cache-Control: max-age=16551
< Expires: Tue, 12 Dec 2023 15:01:23 GMT
< Date: Tue, 12 Dec 2023 10:25:32 GMT
< Connection: keep-alive
< 
* Connection #0 to host r3.o.lencr.org left intact

It seems like it connects fine. I will reboot the server at some point and try this straight away when the server is back up.
Cheers

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.