Secure API Endpoint

Is it possible to secure the Caddy API endpoint? Any documentation on how to do this?

I have a SaaS app running on a different server, and I need to add new routes when a customer adds a custom domain. So I want to be able to securely update the configuration.


What is insecure about it?

(There are many possible answers to this depending on your threat model. Namely, its defaults are potentially insecure if you’re running untrusted, arbitrary code on the same machine. But at that point all bets are off anyway, so :man_shrugging: )

So I think I’m asking the wrong question. After doing a bit more research, it looks like you can only access it from localhost?

I want to access it from a different server, not from localhost. I have a SaaS app on a different server and want to send requests from there to update the JSON config when a customer adds a custom domain. So really I want to open it up a little bit. What’s the best way to do this?
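For context, the kind of request I mean would append a route via the admin API, e.g. a POST to `/config/apps/http/servers/srv0/routes` (the server name `srv0` and the domain/upstream below are just placeholders for my setup). A sketch of building that route payload:

```python
import json

def make_route(custom_domain, upstream):
    """Build a Caddy JSON route that matches a customer's custom
    domain and reverse-proxies it to an internal upstream."""
    return {
        "match": [{"host": [custom_domain]}],
        "handle": [{
            "handler": "reverse_proxy",
            "upstreams": [{"dial": upstream}],
        }],
        "terminal": True,
    }

# Hypothetical customer domain and backend address:
route = make_route("shop.customer.example", "127.0.0.1:8080")

# POSTing this JSON to the routes array appends the new route, e.g.:
#   curl -X POST http://<caddy-host>:2019/config/apps/http/servers/srv0/routes \
#        -H "Content-Type: application/json" -d '<this JSON>'
print(json.dumps(route, indent=2))
```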

Gotcha. So you probably want HTTPS on the admin endpoint and require credentials to access it.

At this moment, there’s not really a great way to do that in Caddy by itself, but you could always reverse proxy to it.

The reason this is not yet supported is that it’s a bit of a chicken-and-egg problem: if the admin endpoint is served over HTTPS, how is that done, i.e. what HTTPS configuration does it have (where does it get the certs, what settings does it use, etc.)? It needs configuration to serve HTTPS, but configuration is received through the admin endpoint. I haven’t figured out an elegant way to solve that problem yet, but I also haven’t invested much time thinking about it.

As for credentials, that’d be pretty straightforward depending on the requirements. But even right now, you could enforce Origin checking but use a secret value on the Origin header instead of the real origin. Then it’s kind of like a bearer token. But this is a bit of a hack (not to mention insecure over plaintext HTTP).
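As a sketch of that hack, the admin config’s `origins` list tells Caddy which `Origin` header values to accept, so you could put a secret value there (the value below is obviously a placeholder, and again, don’t rely on this over plaintext HTTP):

```json
{
	"admin": {
		"listen": "0.0.0.0:2019",
		"origins": ["some-long-random-secret"]
	}
}
```

Then the client sends `Origin: some-long-random-secret` with each API request.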

So, for now: reverse proxy to the admin endpoint and let the reverse proxy terminate TLS.
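For example, you could use Caddy itself as that proxy with a Caddyfile along these lines (the site address, username, and hash are placeholders; generate the hash with `caddy hash-password`):

```
admin.example.com {
	basicauth {
		# replace with a real bcrypt hash from `caddy hash-password`
		apiuser <bcrypt-hash>
	}
	reverse_proxy localhost:2019
}
```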

In the future: I’m sure we’ll figure out a solution. (Have any ideas?) But we haven’t yet.


To add onto that – depending on how your network/server infrastructure is set up, you could have a private network and firewall rules to only allow connections to port 2019 on your server from other trusted servers.

For example, in Amazon AWS you’d have a VPC that prevents outside traffic from reaching port 2019 on your server that runs Caddy.

You can configure the admin endpoint’s listen address via JSON (which I assume is how you’re configuring Caddy if you’re looking to use the API).
In your case that might be something like "listen": "0.0.0.0:2019" to allow connections from anywhere (but then you’d limit what can access it with firewall rules).
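That admin block would look something like this (rest of the config omitted):

```json
{
	"admin": {
		"listen": "0.0.0.0:2019"
	}
}
```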


Awesome, thanks everyone.

My two servers are in the same VPC on AWS, so the port is only open from within the VPC.
I also serve it over HTTPS through a reverse proxy and use the basic auth middleware to secure it with a username and password. It’s probably a bit overkill since only my VPC can access the port.

