Caddy v2 + Varnish + on-demand TLS

I have been trying to set up a custom CDN using Caddy and Varnish. The idea is to generate SSL certificates on demand with Caddy and then pass requests to Varnish, which forwards them to the backend server, a Node.js application. If the request matches a cached object, Varnish returns the cached result; otherwise it fetches fresh data from the backend.
The flow is described in the diagram below.

Here are the respective files:

docker-compose.yml:

```yaml
version: '3.7'

networks:
  web:
    external: true
  internal:
    external: false
    driver: bridge

services:
  caddy:
    image: caddy
    container_name: caddy
    restart: unless-stopped
    ports:
      - "8080:8080"
      - "80:80"
      - "443:443"
    volumes:
      - $PWD/Caddyfile:/etc/caddy/Caddyfile
      - $PWD/site:/srv
      - caddy_data:/data
      - caddy_config:/config
    networks:
      - web

  varnish:
    container_name: varnish
    image: varnish:stable
    restart: unless-stopped
    volumes:
      - $PWD/data/varnish/default.vcl:/etc/varnish/default.vcl
    networks:
      - web
      - internal

volumes:
  caddy_data:
    external: true
  caddy_config:
```


Caddyfile:

```
{
    on_demand_tls {
        ask https://check-domain-url
    }
}

https:// {
    tls {
        on_demand
    }

    reverse_proxy varnish:80 {
        header_up Host {host}  # Won't work with another value or the transparent preset
        header_up X-Forwarded-Host {host}
        header_up X-Real-IP {remote}
        header_up X-Forwarded-For {remote}
        header_up X-Forwarded-Proto {scheme}
        header_up X-Caddy-Forwarded 1
        header_down Cache-Control "public, max-age=31536000"
    }

    header /_next/static/* {
        Cache-Control "public, max-age=31536000, immutable"
    }
}

:8080 {
    reverse_proxy backend-address:3000
}
```

default.vcl:

```vcl
vcl 4.0;

backend default {
    .host = "caddy";
    .port = "8080";
}

sub vcl_deliver {
    # Insert diagnostic header to show Hit or Miss
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
        set resp.http.X-Cache-Hits = obj.hits;
    } else {
        set resp.http.X-Cache = "MISS";
    }
}

sub vcl_backend_response {
    set beresp.ttl = 10s;
    set beresp.grace = 1h;
}
```

Everything is working fine:

  1. SSL certificates are being generated
  2. The proxy is working
  3. Varnish is returning results
  4. The proxy is fetching from the backend

The only problem is that the Varnish cache always reports a MISS, and caching is the one thing it is supposed to do.
I have tried everything I can think of, but it looks like Varnish treats every request as a new one.
Any ideas?
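For context on what Varnish uses as its cache key: by default it hashes only the URL and the Host header, so same-path requests should normally hit the same object. A sketch of the built-in `vcl_hash` (shown here for reference, it is not part of my config):

```vcl
sub vcl_hash {
    # Default cache key: URL plus Host header (or server IP if no Host)
    hash_data(req.url);
    if (req.http.host) {
        hash_data(req.http.host);
    } else {
        hash_data(server.ip);
    }
    return (lookup);
}
```

If misses persist with identical URL and Host, the cause is usually something that makes Varnish bypass the cache entirely, such as request headers handled in `vcl_recv`.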

You can remove all these headers. Caddy will set the appropriate values automatically. See the docs:

Also, transparent doesn’t exist in Caddy v2, that was something in Caddy v1. Caddy v2 is a complete rewrite, so you can’t use any config from Caddy v1 with v2.
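To illustrate (a minimal sketch, assuming the same site block as above): with those `header_up` lines removed, the proxy section reduces to:

```
https:// {
    tls {
        on_demand
    }

    # Caddy v2 already passes Host through and sets
    # X-Forwarded-For and X-Forwarded-Proto automatically
    reverse_proxy varnish:80
}
```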


I don’t know enough about Varnish to suggest anything in particular.

FYI, caddyserver/cache-handler on GitHub (a distributed HTTP caching module for Caddy) should be ready soon.


Thanks @francislavoie for the swift response.
I removed those headers, but the cache is still always a miss.
I now think the misconfiguration is probably on the Varnish side instead.

Thanks for the update on the distributed cache.
Would that be available for multi-tenant domains and on-demand TLS?

Hello @amanintech
Does your application backend return a Cache-Control HTTP header with the no-cache directive?
Do you send an HTTP request with a Cache-Control HTTP header containing a no-cache or no-store directive?
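For reference, if the backend's headers cannot be changed, one possible workaround on the Varnish side is to override them in `vcl_backend_response`. This is only a sketch, and it assumes those responses are actually safe to cache:

```vcl
sub vcl_backend_response {
    # Assumption: these responses are safe to cache despite the backend's headers
    if (beresp.http.Cache-Control ~ "no-cache|no-store|private") {
        unset beresp.http.Cache-Control;
        set beresp.ttl = 10s;
        return (deliver);
    }
}
```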


I never thought about it that way. And indeed the backend was sending no-cache. But now I have modified it, and it is still not hitting the cache.
Anyway, @darkweak, that was indeed part of the problem, if not the whole problem. Any idea where else it might be going wrong?

Turns out that the only thing that needed to be done was to unset the incoming cookie.
For some reason, with a cookie present, Varnish considers the request unique even though the path and hostname remain the same.

I have included a block in the .vcl file which unsets the cookie:

```vcl
sub vcl_recv {
    unset req.http.Cookie;
}
```

(Note: this can be refined with some rules if certain cookies still need to be passed to the backend.)
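As a sketch of such rules (the `session_id` cookie name here is a hypothetical placeholder), the usual Varnish pattern is to strip everything except a whitelist:

```vcl
sub vcl_recv {
    if (req.http.Cookie) {
        # Keep only the session_id cookie (hypothetical name), drop the rest
        set req.http.Cookie = ";" + req.http.Cookie;
        set req.http.Cookie = regsuball(req.http.Cookie, "; +", ";");
        set req.http.Cookie = regsuball(req.http.Cookie, ";(session_id)=", "; \1=");
        set req.http.Cookie = regsuball(req.http.Cookie, ";[^ ][^;]*", "");
        set req.http.Cookie = regsuball(req.http.Cookie, "^[; ]+|[; ]+$", "");
        if (req.http.Cookie == "") {
            unset req.http.Cookie;
        }
    }
}
```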
Now the cache hit rate is 100% on all domains and subdomains :partying_face::partying_face:
Though I think using the built-in Caddy cache would be a better solution. Looking forward to it.


This topic was automatically closed after 30 days. New replies are no longer allowed.