On-demand TLS, UX issue with negative responses

I’m using on-demand TLS together with HTTPS-only, and both work really well.

Just two smaller “issues” that, if addressed, could maybe improve my setup.

The TLS ask configuration basically tells Caddy to ask the endpoint first and only then try to obtain a certificate. If a request comes in over plain HTTP, Caddy answers with a redirect to HTTPS.

This is fine if the ask-endpoint returns 200, but what if it doesn’t? The user will be greeted with an SSL_ERROR_INTERNAL_ERROR_ALERT in their browser.

Is there a way to extend the asking to before the redirect to HTTPS is issued? Basically a “domain on demand”? For my application this doesn’t make a big difference, but it would improve UX quite a bit if Caddy responded with a 404 over HTTP instead of redirecting the client to HTTPS and then failing the handshake with an invalid certificate.

Furthermore, if the backend’s response was “do not issue a certificate”, Caddy will ask again on the next request. Can this negative response be cached? I’m seeing quite a few scripted attempts to hack the server, some with completely bogus domain names; caching would prevent the (very fast) backend from ever having to check the same name twice. (If there were a cacheable way to check for eligible domains, this wouldn’t be an issue.)

Curl example on Caddy 2.8.4:

curl -kiL we-dont-serve-this.domain
HTTP/1.1 308 Permanent Redirect
Connection: close
Location: https://we-dont-serve-this.domain/
Server: Caddy
Date: Fri, 11 Oct 2024 08:55:41 GMT
Content-Length: 0

curl: (35) OpenSSL/3.2.2: error:0A000438:SSL routines::tlsv1 alert internal error

Come to think of it, my assumptions are a bit stupid, because modern browsers will try HTTPS first, and without a valid certificate they’ll always show an SSL error.

So only my question, whether the negative ask-response could be cached, remains.

Well, caching implies you eat up memory on the Caddy server to keep a cache. Since the list of possible domains is infinite, that can be problematic. You’d have to use an LRU-type cache to keep the memory usage bounded.

You can implement that yourself if you want by writing your own permission plugin (it can wrap the http permission module, which is what ask uses).
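
Very roughly, such a plugin could look like the sketch below. This is an untested illustration, not a drop-in implementation: the tls.permission namespace and the CertificateAllowed signature are based on how the built-in http permission module is wired up in recent Caddy versions (verify against the caddytls source), and the cached_ask name, TTL default, and 10,000-entry cap are made up. Instead of wrapping the built-in module, the sketch just issues the GET to the ask endpoint itself and remembers denials for a while:

package negcache

import (
	"context"
	"fmt"
	"net/http"
	"net/url"
	"sync"
	"time"

	"github.com/caddyserver/caddy/v2"
)

func init() {
	caddy.RegisterModule(CachedAsk{})
}

// CachedAsk asks an HTTP endpoint whether a certificate may be issued
// (like the built-in ask/http permission module) and caches denials so
// the backend is not asked again for the same bogus name right away.
type CachedAsk struct {
	Endpoint string         `json:"endpoint,omitempty"`
	TTL      caddy.Duration `json:"ttl,omitempty"`

	mu     sync.Mutex
	denied map[string]time.Time // name -> when the cached denial expires
}

// CaddyModule returns module info; the tls.permission namespace is
// assumed from the built-in tls.permission.http module.
func (CachedAsk) CaddyModule() caddy.ModuleInfo {
	return caddy.ModuleInfo{
		ID:  "tls.permission.cached_ask",
		New: func() caddy.Module { return new(CachedAsk) },
	}
}

// Provision sets defaults and initializes the cache.
func (c *CachedAsk) Provision(ctx caddy.Context) error {
	if c.TTL <= 0 {
		c.TTL = caddy.Duration(10 * time.Minute)
	}
	c.denied = make(map[string]time.Time)
	return nil
}

// CertificateAllowed returns nil if a certificate for name may be obtained.
func (c *CachedAsk) CertificateAllowed(ctx context.Context, name string) error {
	c.mu.Lock()
	if exp, ok := c.denied[name]; ok && time.Now().Before(exp) {
		c.mu.Unlock()
		return fmt.Errorf("%s: denied (cached)", name)
	}
	if len(c.denied) > 10000 { // crude bound; a real plugin would use an LRU
		c.denied = make(map[string]time.Time)
	}
	c.mu.Unlock()

	// Ask the backend using the same convention as the built-in ask
	// endpoint: GET <endpoint>?domain=<name>, where 2xx means "allowed".
	req, err := http.NewRequestWithContext(ctx, http.MethodGet,
		c.Endpoint+"?domain="+url.QueryEscape(name), nil)
	if err != nil {
		return err
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	resp.Body.Close()

	if resp.StatusCode >= 200 && resp.StatusCode < 300 {
		return nil
	}

	// Remember the denial so the backend isn't asked again until TTL expires.
	c.mu.Lock()
	c.denied[name] = time.Now().Add(time.Duration(c.TTL))
	c.mu.Unlock()
	return fmt.Errorf("%s: denied by ask endpoint (status %d)", name, resp.StatusCode)
}

You’d still need to build it into Caddy (e.g. with xcaddy) and select it as the on-demand permission module in your TLS automation config.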

But really, your backend should be fast enough that caching is unnecessary: the check should be a simple O(1) table lookup, so a cache in Caddy wouldn’t buy you much.
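
For what it’s worth, an ask backend really can be that small. Here’s a minimal sketch of such an endpoint in Go; the /check path, the port, and the domain list are made up, and a real one would look names up in your application’s data:

package main

import (
	"log"
	"net/http"
)

// allowedDomains is the O(1) lookup table the ask endpoint checks.
// The entries here are placeholders.
var allowedDomains = map[string]bool{
	"example.com":     true,
	"app.example.com": true,
}

func main() {
	// Caddy calls GET <endpoint>?domain=<name>; only a 2xx status
	// means "go ahead and get a certificate".
	http.HandleFunc("/check", func(w http.ResponseWriter, r *http.Request) {
		domain := r.URL.Query().Get("domain")
		if allowedDomains[domain] {
			w.WriteHeader(http.StatusOK)
			return
		}
		w.WriteHeader(http.StatusNotFound)
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:5555", nil))
}

You’d then point the ask option (or the on_demand_tls global option in the Caddyfile) at something like http://127.0.0.1:5555/check.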


Yes, after taking a weekend to think about it, I realize my “concerns” were pretty much invalid. Thanks for your answer though!
