1. Caddy version (caddy version):
v2.3.0 h1:fnrqJLa3G5vfxcxmOH/+kJOcunPLhSBnjgIvjXV/QTA=
2. How I run Caddy:
I want my customers to register an account and select a plan. Then we create a new subdomain for them and reload Caddy's config.
The current setup has two servers:
- One server for registration, which runs Nginx
- Another server for serving the subdomains created by the customers, which runs Caddy
Two things I like about Caddy made us try it:
- Automatic HTTPS
- Ability to load new configuration using the API without any CLI commands.
The problem we currently have is a deadlock after calling the Caddy admin's [POST] /load API:
"msg": "stopping current admin endpoint"
"error": "shutting down admin server: context deadline exceeded"
a. System environment:
Ubuntu 20.04.2 LTS
b. Command:
We used the following commands after installing Caddy:
systemctl disable caddy.service
systemctl enable caddy-api.service
systemctl start caddy-api.service
c. My complete Caddyfile or JSON config:
tenant-subdomain.ahmedhat.com {
	root * /var/www/ahmedhat-spa/build
	try_files {path} {path}/ /index.html?{query}
	file_server
}
This block will be repeated for each customer registered in our solution, with tenant-subdomain replaced by their chosen subdomain.
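Since every tenant gets an identical site block, the Caddyfile can be generated programmatically before being pushed to the admin API. A minimal sketch in Python (the template mirrors the block above; the function name is my own):

```python
# Render one site block per tenant subdomain.
# Doubled braces ({{ }}) escape Caddyfile placeholders like {path}.
BLOCK_TEMPLATE = """{subdomain}.ahmedhat.com {{
	root * /var/www/ahmedhat-spa/build
	try_files {{path}} {{path}}/ /index.html?{{query}}
	file_server
}}
"""

def render_caddyfile(subdomains):
    """Build the full Caddyfile: one site block per tenant."""
    return "\n".join(BLOCK_TEMPLATE.format(subdomain=s) for s in subdomains)
```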
3. The problem I’m having:
When a customer subscribes, the first server creates the account and calls the [POST] /load API to apply the new config, but the call runs into the deadlock.
4. Error messages and/or full log output:
{"level":"error","ts":1587571718.9594,"logger":"admin","msg":"stopping current admin endpoint","error":"shutting down admin server: context deadline exceeded"}
5. What I already tried:
- I exposed localhost:2019 on port 2060 through Caddy itself, but I still hit the deadlock after calling http://193.122.75.178:2060/load (193.122.75.178 is the IP address of the server running Caddy).
- I created a small project that receives the Caddyfile content and calls the localhost:2019/load API. This project runs on the second server (the one with Caddy) and is exposed to the first server through Nginx on a custom port.
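For reference, the relay's call to the admin API boils down to a single POST with the text/caddyfile content type. A sketch in Python (the URL is Caddy's default admin address; the function name is my own):

```python
import urllib.request

ADMIN_LOAD = "http://localhost:2019/load"  # Caddy's default admin address

def build_load_request(caddyfile: bytes) -> urllib.request.Request:
    """Build the POST /load request that applies a new Caddyfile."""
    return urllib.request.Request(
        ADMIN_LOAD,
        data=caddyfile,
        headers={"Content-Type": "text/caddyfile"},
        method="POST",
    )

# Sending it is then: urllib.request.urlopen(build_load_request(body))
```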
The second approach currently works fine, and I can limit access in Nginx to certain IPs, or just the IP of the first server. But my issue with it is that it relies on Nginx, which doesn't feel like the right approach.
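The IP restriction described above can live entirely in the Nginx server block fronting the relay project. A sketch, assuming the relay listens on 127.0.0.1:8080, the custom port is 9000, and 203.0.113.10 stands in for the first server's IP:

```nginx
server {
    listen 9000;                           # custom port exposed to the first server

    location / {
        allow 203.0.113.10;                # placeholder: first server's IP
        deny  all;                         # reject everyone else
        proxy_pass http://127.0.0.1:8080;  # the relay project
    }
}
```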
So my question is: what is the right approach for my current setup?