1. The problem I’m having:
I am trying to migrate an API frontend from nginx to Caddy.
The nginx config is rather convoluted, but does almost everything it needs to, except shared caching. After unsuccessfully trying memcached (no TTL) and redis (strange behaviour), I decided to at least try to do this with Caddy. I do have a bit of experience with Caddy, but I ran into issues this time and I could really use a bit of help.
The configuration should do the following:
- direct requests to the correct API instance, depending on whether the request is public or not; there are two instances: pubapi and api
- pre-authenticate requests before they hit the cache if the request uses basic auth AND is supposed to be cached; this is to make sure private cache objects are not accessed by unauthorized requests
- enable cache for some endpoints
I am using the argsort module to make sure identical requests get a single cache key.
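(If I understand the argsort module correctly, it canonicalizes the query string by sorting the parameters, so, with illustrative URLs:)

```caddyfile
# both of these should end up with the same cache key,
# since argsort rewrites the query string into sorted order:
#   GET /v2/public/Summary?year=2024&org=12
#   GET /v2/public/Summary?org=12&year=2024
```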
I am using @matt's caddy-ratelimit module and @darkweak's darkweak/souin module with the go-redis storage.
Issues I still have:
- `mode bypass` seems not to work unless I also drop the `Cache-Control` and `Expires` headers coming from the API (yes, the API is not the best API out there).
- I would really like to cache only responses with status code 200, but that seems to be impossible at the moment; if @darkweak confirms that is the case, I will open an issue in the proper GitHub repo.
- the forward_auth does not, by itself, prevent access to the cache or the backend when the response from the auth backend is not 2xx
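For the forward-auth issue, what I would have expected to need is an explicit handler for the failure case; a minimal sketch (untested, reusing my existing matcher and auth path) would be:

```caddyfile
reverse_proxy @needs_foward_auth pubapi:3000 {
	method GET
	rewrite * /internal/basicAuthCheck
	header_up X-Forwarded-Method {method}
	header_up X-Forwarded-Uri {uri}
	@good status 2xx
	handle_response @good {
		# auth passed: forward the Authorization header to the next handler
		request_header Authorization {rp.header.Authorization}
	}
	@bad status 3xx 4xx 5xx
	handle_response @bad {
		# auth failed: copy the auth backend's status and body back
		# to the client so the request never reaches cache or backend
		copy_response
	}
}
```

(I have not verified whether handler ordering relative to the cache directive also plays a role here.)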
2. Error messages and/or full log output:
`mode bypass` not working. It will start caching once I drop `Cache-Control` and `Expires` from upstream.
{"level":"debug","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"You're running Souin with the following storages REDIS"}
{"level":"debug","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Storer initialized: []types.Storer{(*redis.Redis)(0xc001d8a680)}."}
{"level":"debug","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Try to load the storer REDIS-redis:6379--12-souin-redis-1m0s as surrogate backend"}
{"level":"debug","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Surrogate storage initialized."}
{"level":"debug","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Set zacache as Cache-Status name"}
{"level":"debug","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Allow 2 method(s). [GET HEAD]."}
{"level":"debug","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"The cache logic will run as bypass: &{Strict:false Bypass_request:true Bypass_response:true}"}
{"level":"info","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Set backend timeout to 10s"}
{"level":"info","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Set cache timeout to 10s"}
{"level":"info","ts":"2024-10-27T16:18:39.572Z","logger":"http.handlers.cache","msg":"Souin configuration is now loaded."}
{"level":"debug","ts":"2024-10-27T16:18:39.577Z","logger":"http.handlers.cache","msg":"Cleanup..."}
{"level":"debug","ts":"2024-10-27T16:19:39.443Z","logger":"http.handlers.cache","msg":"Incomming request &{Method:GET URL:/v2/public/test2 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Accept:[*/*] Accept-Encoding:[gzip] User-Agent:[curl/8.7.1] X-Forwarded-For:[xx.xx.xx.xx] X-Forwarded-Host:[some.host] X-Forwarded-Proto:[https]] Body:0xc0000104b0 GetBody:<nil> ContentLength:0 TransferEncoding:[] Close:false Host:some.host Form:map[] PostForm:map[] MultipartForm:<nil> Trailer:map[] RemoteAddr:10.100.254.16:37774 RequestURI:/v2/public/test2 TLS:<nil> Cancel:<nil> Response:<nil> ctx:0xc000d64600 pat:<nil> matches:[] otherValues:map[]}"}
{"level":"debug","ts":"2024-10-27T16:19:39.444Z","logger":"http.handlers.cache","msg":"Request cache-control &{MaxAge:-1 MaxStale:-1 MaxStaleSet:false MinFresh:-1 NoCache:false NoStore:false NoTransform:false OnlyIfCached:false StaleIfError:0 Extensions:[]}"}
{"level":"debug","ts":"2024-10-27T16:19:39.451Z","logger":"http.handlers.cache","msg":"Request the upstream server"}
{"level":"debug","ts":"2024-10-27T16:19:40.020Z","logger":"http.handlers.cache","msg":"Response cache-control &{MustRevalidate:false NoCache:map[] NoCachePresent:false NoStore:true NoTransform:false Public:false Private:map[] PrivatePresent:false ProxyRevalidate:false MaxAge:-1 SMaxAge:-1 Immutable:false StaleIfError:-1 StaleWhileRevalidate:-1 Extensions:[]}"}
3. Caddy version:
/etc/caddy # caddy version
v2.8.4 h1:q3pe0wpBj1OcHFZ3n/1nl4V4bxBrYoSoab7rL9BMYNk=
/etc/caddy # caddy build-info | grep souin
dep github.com/darkweak/souin v1.7.2 h1:i9t/fdCgvBuKM7NNYcqnCQCmFlk9nEF91ltuksfFyjs=
dep github.com/darkweak/souin/plugins/caddy v1.7.2 h1:EVYHOcRsr3XnM45RiGKATogk3qy1/cH8rVR8p+BDe14=
/etc/caddy # caddy build-info | grep redis
dep github.com/darkweak/storages/go-redis v0.0.10 h1:lwUwcLB1LlO7NFC/tPB/8IOOAHbkJjvE9VFVi+DcmiE=
dep github.com/darkweak/storages/go-redis/caddy v0.0.10 h1:iHtq111uKyGXFvN+OrL5x0pR3bf7BcBqXt9dVRWwA2Q=
dep github.com/redis/go-redis/v9 v9.5.4 h1:vOFYDKKVgrI5u++QvnMT7DksSMYg7Aw/Np4vLJLKLwY=
/etc/caddy # caddy build-info | grep rate
dep github.com/mholt/caddy-ratelimit v0.0.0-20240828171918-12435ecef5db h1:30N0UnATYd7E8iaWSSOTlsr2/rd8v+7w0X+2Jc8FDJk=
4. How I installed and ran Caddy:
docker and docker-compose
a. System environment:
standard docker image build with xcaddy, run with docker-compose
b. Command:
caddy run --config /etc/caddy/Caddyfile
c. Service/unit/compose file:
web:
command:
- caddy
- run
- --config
- /etc/caddy/Caddyfile
build: .
restart: always
Dockerfile:
ARG VERSION=%%VERSION%%
ARG VCS_URL
ARG VCS_REF
ARG BUILD_DATE
ARG TARGETPLATFORM
ARG TARGETOS
ARG TARGETARCH
FROM caddy:${VERSION}-builder AS builder
RUN CGO_ENABLED=0 GOARCH=${TARGETARCH} GOOS=${TARGETOS} \
xcaddy build \
--with github.com/greenpau/caddy-security \
--with github.com/lucaslorentz/caddy-docker-proxy/v2 \
--with github.com/teodorescuserban/caddy-argsort \
--with github.com/teodorescuserban/caddy-cookieflag \
--with github.com/teodorescuserban/caddy-ip-map \
--with github.com/mholt/caddy-ratelimit \
--with github.com/darkweak/storages/go-redis/caddy \
--with github.com/darkweak/souin/plugins/caddy
# --with github.com/caddyserver/cache-handler
# FROM caddy:${VERSION}-alpine
FROM alpine:3.20
ENV XDG_CONFIG_HOME=/config XDG_DATA_HOME=/data
WORKDIR /etc/caddy
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
COPY etc ./
RUN apk add nss-tools curl && \
rm -rf /var/cache/apk/*
# CMD ["caddy", "docker-proxy", "--caddyfile-path", "/etc/caddy/Caddyfile"]
CMD ["caddy","run","--config","/etc/caddy/Caddyfile","--adapter","caddyfile"]
# to be able to tunnel into the admin api to grab profiles
#EXPOSE 12019
d. My complete Caddy config:
{
debug
log {
output file /logs/main.log
format json {
time_format iso8601
}
level debug
}
local_certs
auto_https disable_redirects
# lets rate limit before cache, we have no idea if that's cached or not
order rate_limit before cache
import ./include/cache.caddyfile
}
import ./include/snippets.caddyfile
:80 {
argsort lowercase
import ./include/ratelimit.caddyfile
map {http.request.uri} {is_public_url} {
default 0
"~^/v1/public*" 1
"~^/v2/public*" 1
}
map {http.request.uri} {needs_cache} {needs_subrequest} {
default 0 0
"~^/v2/public/Summary" 1 0
"~^/v1/public/Show" 1 0
"~^/v1/fbs/flow/custom-search" 1 1
"~^/v2/fbs/org" 1 1
"~^/v1/fbs/flow/usage-years/org/[0-9]+" 1 1
}
map {http.request.header} {auth_type} {
default none
"~^Bearer .*" bearer
"~^Basic .*" basic
}
@needs_foward_auth expression `{needs_subrequest} == "1"`
# until devs will expect X-Forwarded-Uri instead of X-Original-URI,
# we'll do with the expanded version
# forward_auth @needs_foward_auth pubapi:3000 {
# uri /internal/basicAuthCheck
# copy_headers Authorization
# }
reverse_proxy @needs_foward_auth pubapi:3000 {
method GET
rewrite * /internal/basicAuthCheck
header_up X-Forwarded-Method {method}
header_up X-Forwarded-Uri {uri}
# that needs to go
header_up X-Original-Uri {uri}
@good status 2xx
handle_response @good {
request_header {
Authorization {rp.header.authorization}
}
}
}
@needs_cache expression `{needs_cache} == "1"`
cache @needs_cache
@go_public expression `{auth_type} == "none" || {auth_type} == "special"`
import logs pubapi
handle @go_public {
import gotobackend pubapi:3000
}
import logs api
handle {
import gotobackend api:3000
}
}
/etc/caddy/include/snippets.caddyfile:
(logs) {
log {
output file /logs/{args[0]}
format filter {
wrap json {
time_format iso8601
}
fields {
common_log delete
request>headers>Authorization delete
}
}
}
}
(gotobackend) {
header X-Backend {args[0]}
header Server "{http.request.host}"
# mode bypass seems to not work
@remove_cache_headers expression `{needs_cache} == "1"`
reverse_proxy @remove_cache_headers {
header_down -Server
#header_down -Surrogate-Control
header_down -Cache-Control
header_down -Expires
to {args[0]}
}
reverse_proxy {
header_down -Server
to {args[0]}
}
}
/etc/caddy/include/cache.caddyfile:
cache {
cache_name zacache
# allowed_http_verbs GET HEAD
api {
debug
prometheus
souin
}
log_level debug
# see https://github.com/darkweak/souin/issues/345#issuecomment-1560574770
mode bypass
ttl 15s
stale 60s
timeout {
backend 10s
cache 100ms
}
default_cache_control public
key {
template {http.request.uri}
hash
}
redis {
configuration {
Addrs redis:6379
DB 12
}
}
}
/etc/caddy/include/ratelimit.caddyfile:
# add here any naughty IP.
ipmap {http.request.remote.host} {is_bad_ip} {
default 0
1.1.1.1 1
}
# add here any referrer that would need a higher rate limit than the rest.
map {http.request.header.Referer} {is_our_app} {
default 0
"https://fbs.example.local/" 1
}
# ipmap {http.request.remote.host} {is_our_ip} {
ipmap {remote_ip} {is_our_ip} {
default 0
127.0.0.1 1
10.0.0.0/8 1
}
rate_limit {
log_key
zone bad_ip {
match expression `{is_bad_ip} == "1"`
key {remote_host}
events 1
window 5s
}
zone our_ip {
match expression `{is_our_ip} == "1"`
key {remote_host}
events 4
window 1s
}
zone our_app {
match expression `{is_our_app} == "1"`
key {remote_host}
events 5
window 1s
}
zone unknown_source {
match expression `{is_bad_ip} == "0" && {is_our_ip} == "0" && {is_our_app} == "0"`
key {remote_host}
events 1
window 3s
}
}
5. Links to relevant resources:
The rate limit works really well; I only had to `order rate_limit before cache`. I was surprised by how well it works, despite a much simpler interface than what I am used to coming from nginx.
The caching seems to work really well with redis, although:
- `mode bypass` seems not to work; the API really wants to send out `Expires: 0` and a ridiculous `Cache-Control: no-store, no-cache, must-revalidate, proxy-revalidate`, and without manually dropping those two headers there is no caching (and I don't see anything specific in the debug log saying "I don't cache because...").
- It is caching almost all responses with the default TTL, including some status codes it shouldn't (according to the code I've seen here), like the 429 generated by the rate limit.
- The forward_auth does not make the request return 403 when authentication is incorrect; instead, unauthorized users can access private cache objects created by authenticated users.
I hope I did not forget anything.
@matt @francislavoie @darkweak, I am very grateful for all the work you have done on Caddy and on the various modules.
I would really appreciate it if you could take a look and provide some ideas.