Question: is there any way to configure
`http.handlers.cache` for a single route such that, when 10 simultaneous requests are being served and all of them hit a stale response, only one request is issued to the backend?
Currently, in the logs of my Node application, I can see that every time a resource is stale and is requested 10 times simultaneously, Caddy serves the stale version, but it also re-fetches a fresh version from the backend 10 times in parallel.
What I used to do in nginx, following their microcaching tutorial, was to enable
`proxy_cache_lock`, which restricts the number of concurrent attempts to populate the cache, so that while a cache entry is being created, further requests for that resource are queued up inside NGINX (see the section "Optimized Microcaching with NGINX").
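For reference, the relevant part of that nginx setup looks roughly like this. The directive names are real nginx directives; the cache path, zone name, upstream, and timings are just illustrative:

```nginx
proxy_cache_path /tmp/cache keys_zone=micro:10m;  # illustrative path/zone

server {
    location / {
        proxy_pass http://backend;       # illustrative upstream
        proxy_cache micro;
        proxy_cache_valid 200 1s;        # microcache responses for 1 second
        proxy_cache_use_stale updating;  # serve stale while a refresh is in flight
        proxy_cache_lock on;             # only one request repopulates the cache
    }
}
```

With `proxy_cache_lock on`, only the first request for a stale/missing entry goes to the backend; concurrent requests either wait for it or are served the stale copy via `proxy_cache_use_stale updating`.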
Is there some configuration that would make Caddy behave the same way?