Caddy v2 + Varnish + on-demand TLS

I have been trying to set up a custom CDN using Caddy and Varnish. The idea is that Caddy generates SSL certificates on demand and proxies requests to Varnish, which in turn forwards them to the backend server, a Node.js application. If a request matches a cached object, Varnish returns the cached result; otherwise it fetches fresh data from the backend.
The flow is: client → Caddy (TLS termination) → Varnish (cache) → Caddy (:8080) → Node.js backend (:3000).

Here are the respective files:
docker-compose.yml

version: '3.7'

networks:
  web:
    external: true
  internal:
    external: false
    driver: bridge

services:
  caddy:
    image: caddy
    container_name: caddy
    restart: unless-stopped
    ports:
      - "8080:8080"
      - "80:80"
      - "443:443"
    volumes:
      - $PWD/Caddyfile:/etc/caddy/Caddyfile
      - $PWD/site:/srv
      - caddy_data:/data
      - caddy_config:/config
    networks:
      - web

  varnish:
    container_name: varnish
    image: varnish:stable
    restart: unless-stopped
    volumes:
      - $PWD/data/varnish/default.vcl:/etc/varnish/default.vcl
    networks:
      - web
      - internal
volumes:
  caddy_data:
    external: true
  caddy_config:

Caddyfile

{
    on_demand_tls {
        ask https://check-domain-url
    }
}

https:// {
    tls {
        on_demand
    }

    reverse_proxy varnish:80 {
        header_up Host {host}  # Won't work with another value or transparent preset
        header_up X-Forwarded-Host {host}
        header_up X-Real-IP {remote}
        header_up X-Forwarded-For {remote}
        header_up X-Forwarded-Proto {scheme}
        header_up X-Caddy-Forwarded 1
        header_down Cache-Control "public, max-age=31536000"
    }

    header /_next/static/* {
        Cache-Control "public, max-age=31536000, immutable"
    }
}

:8080 {
    reverse_proxy backend-address:3000
}

default.vcl

vcl 4.0;

backend default {
    .host = "caddy";
    .port = "8080";
}

sub vcl_deliver {
    # Insert diagnostic header to show hit or miss
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
        set resp.http.X-Cache-Hits = obj.hits;
    } else {
        set resp.http.X-Cache = "MISS";
    }
}

sub vcl_backend_response {
    set beresp.ttl = 10s;
    set beresp.grace = 1h;
}

Everything is working fine:

  1. SSL certificates are being generated
  2. The proxy is working
  3. Varnish is returning results
  4. The proxy is fetching from the backend

The only problem is that the Varnish cache is always a MISS, which is the one thing it is supposed to fix.
I have tried everything I can think of, but it looks like Varnish sees every request as a new request.
Any ideas?

You can remove all these headers. Caddy will set the appropriate values automatically. See the docs:

Also, transparent doesn’t exist in Caddy v2, that was something in Caddy v1. Caddy v2 is a complete rewrite, so you can’t use any config from Caddy v1 with v2.
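With those `header_up` lines removed, the site block reduces to something like this (a sketch, not your exact config; Caddy's `reverse_proxy` sets `X-Forwarded-For` and `X-Forwarded-Proto` and passes `Host` through by default):

```caddyfile
https:// {
    tls {
        on_demand
    }
    reverse_proxy varnish:80
}
```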

:man_shrugging:

I don’t know enough about Varnish to suggest anything in particular.

FYI, GitHub - caddyserver/cache-handler (a distributed HTTP caching module for Caddy) should be ready soon.


Thanks @francislavoie for the swift response.
I removed those headers, but the cache is still always a miss.
I now think the misconfiguration is probably on the Varnish side instead.

Thanks for the update on the distributed cache.
Would that be available for multi-tenant domains and on-demand TLS?

Hello @amanintech
Does your application backend return a Cache-Control HTTP header with the no-cache directive?
Do you send an HTTP request with a Cache-Control HTTP header containing a no-cache or no-store directive?
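For reference, Varnish's built-in `vcl_backend_response` marks a response uncacheable (hit-for-miss) when it carries `Cache-Control: no-cache`, `no-store`, or `private`, or a `Set-Cookie` header. A sketch of forcing such responses into the cache anyway, assuming the `/_next/static/` assets from the Caddyfile are safe to cache regardless of what the backend sends:

```vcl
sub vcl_backend_response {
    # Assumption: immutable build assets are safe to cache
    # even if the backend sends no-cache or Set-Cookie.
    if (bereq.url ~ "^/_next/static/") {
        unset beresp.http.Set-Cookie;
        unset beresp.http.Cache-Control;
        set beresp.uncacheable = false;
        set beresp.ttl = 1h;
    }
}
```

Varnish concatenates subs with the same name, so this can sit alongside the existing `vcl_backend_response` that sets the TTL and grace.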


I never thought about it that way. And indeed the backend was sending no-cache. I have now modified it, but it is still not hitting the cache.
But anyway, @darkweak, that was indeed part of the problem, if not the whole problem. Any idea where else it might be going wrong?
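One other common cause worth checking (a guess, since the app isn't shown here): the built-in `vcl_recv` bypasses the cache for every request that carries a `Cookie` header, so a single session or analytics cookie makes every request a miss. A sketch of stripping cookies where they are assumed not to matter:

```vcl
sub vcl_recv {
    # Assumption: responses under /_next/static/ don't vary by cookie.
    if (req.url ~ "^/_next/static/") {
        unset req.http.Cookie;
    }
}
```

Running `varnishlog -g request` while reproducing a miss will also show why a request was passed or a response was deemed uncacheable.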