Using Varnish with Caddy

I’m trying to put Caddy in front of Varnish as a reverse proxy to serve up WordPress. Currently I have WordPress working with Caddy, but I can’t get Varnish to play along. What I tried:

  • Served my main site with Caddy and SSL.
  • Installed Varnish and configured its VCL on the default port.
  • Set Caddy as a reverse proxy to Varnish (rough sketch of the intended chain below).
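
Roughly, the chain I’m aiming for looks like this (the ports are the ones the tutorial uses, so treat them as assumptions):

    browser --HTTPS--> Caddy (:443, public site)
            Caddy --HTTP--> Varnish (:6081, cache)
            Varnish --HTTP--> Caddy (:8060, WordPress via php-fpm)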

I followed this tutorial. I think I’m missing something. I also found a GitHub Gist which I didn’t understand.

Can someone please lay out the steps to configure Varnish with the latest Caddy server on Ubuntu 16.04?

Anyone? No?

I use a pair of Caddy servers in front of a pair of Varnish servers for all our stuff, and it’s working great.

If you say what the actual problem is you’re having, you might get more help.

Basically I have WordPress configured with Varnish and Apache, which I use as my blog. I want to swap Apache for Caddy, but every time I’ve tried it, I’ve failed. I linked the tutorial I tried to follow.

@Cylindric, do you mind sharing your VCL and Caddyfile (you can edit out the domain and email if you want)?

Failed how? What are the problems you’re seeing? “It doesn’t work” doesn’t really give much information. Are you getting an error? Something isn’t loading? Isn’t caching? Is it WordPress SEO that’s broken?

There’s no point in me showing my VCL, even if I could; it’s over 1,100 lines of stuff specific to our routing requirements.


Sorry for the vague response. It just doesn’t load; it shows there is no site at that port. Let me reconfigure it and I’ll post the actual error.

Visit the site; the error code keeps changing.

Here’s my Caddyfile:

junayeed.me {
    proxy / localhost:6081 { # Proxy to Varnish
        header_upstream Host "localhost" # Won't work with another value or transparent preset
        header_upstream X-Real-IP {remote}
        header_upstream X-Forwarded-For {remote}
        header_upstream X-Forwarded-Proto {scheme}
    }
}


localhost:8060 { # Varnish will access this
    tls /etc/caddy/ssl/cert_chain.crt /etc/caddy/ssl/junayeed.me.key
    root /var/www/wordpress
    gzip
    fastcgi / /run/php/php7.0-fpm.sock php
    rewrite {
        if {path} not_match ^\/wp-admin
        to {path} {path}/ /index.php?_url={uri}
    }
}

And my /etc/default/varnish

DAEMON_OPTS="-a :6081 \
             -T localhost:6082 \
             -f /etc/varnish/default.vcl \
             -S /etc/varnish/secret \
             -s malloc,256m"

And my VCL

# Default backend definition. Set this to point to your content server.
backend default {
    .host = "localhost";
    .port = "8060";
}
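
(For reference, a couple of quick sanity checks for a setup like this; the paths and ports are taken from the configs above, so adjust if yours differ:)

    # does the VCL compile? (-C prints the generated C code and exits)
    sudo varnishd -C -f /etc/varnish/default.vcl > /dev/null

    # is Varnish actually listening on 6081, and Caddy on 8060?
    sudo ss -tlnp | grep -E ':6081|:8060'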

What am I doing wrong?

Okay, so just to clarify: you’re using one instance of Caddy as both the front-end server that the public sees, and the backend server that Varnish will request pages from?

“Error 503 Backend fetch failed” is basically Varnish telling you that it can’t get any data back from Caddy. What do you get if you browse to https://junayeed.me:8060? That’s what Varnish will be doing. Check your Caddy error log and access log, and see if it’s being hit at all.
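
If logging isn’t already turned on, something like this in the backend site block will give you both (the log file paths here are just a suggestion):

    localhost:8060 {
        log /var/log/caddy/backend-access.log
        errors /var/log/caddy/backend-error.log
        # ... rest of the existing config ...
    }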

That’s right; it was done like this in the tutorial by @Nixtren.

Nothing, that page is not accessible. I have that port open in AWS, though.

You won’t get anything because Caddy is only listening on “localhost” for that port. Try setting it to just :8060 and see what you get. Or try to fetch the page from the server locally.

You need to see what Caddy is returning when someone hits that URL. Varnish is just a normal user-agent in this respect, it isn’t doing anything particularly clever.
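
For example, from a shell on the server itself (this is roughly the request Varnish makes, since it only speaks plain HTTP to its backends; -k just skips certificate checks in case that block is serving TLS):

    # plain HTTP, which is what Varnish will speak to its backend
    curl -v -o /dev/null http://localhost:8060/

    # and the same over HTTPS, in case Caddy is only answering TLS on that port
    curl -vk -o /dev/null https://localhost:8060/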

Getting a 502 Bad Gateway.

Sounds like your Caddy config isn’t happy with serving up your files.
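
If that 502 is coming from the fastcgi side, it’s worth making sure php-fpm is up and that the socket path matches the one in the Caddyfile (the service name and path below are the ones from the config above):

    # is php-fpm running?
    sudo systemctl status php7.0-fpm

    # does the socket Caddy is pointed at actually exist?
    ls -l /run/php/php7.0-fpm.sock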

Did it, and also documented the process here (original link; see the updated URL below).

404 - Page not found.

https://chorompotro.com/caddy-varnish-wordpress-ubuntu-fun/

Changed the URL and edited the original post; also kept the reply above as a reference.

FYI, Varnish may have issues handling HTTP/2 under load: see Caddy 0.10.9 + Varnish Cache 5.2 HTTP/2 thread starvation bug.
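
If that bug does bite, Caddy from that era can be started with HTTP/2 switched off while you wait for a fix (the flag name is the 0.10.x one, so check it against your version):

    # assumption: 0.10.x-era flags; -conf points at the Caddyfile, -http2=false disables HTTP/2
    caddy -conf /etc/caddy/Caddyfile -http2=false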
