Automated / Scripted Updates?

When running Apache or nginx on a production server, updating the web server to the latest version is as easy as:

sudo apt-get dist-upgrade

However when running Caddy I need to download an entirely new binary file, stop my service, swap out the binary, and restart my service. Not only is this process a little bit of a pain, but I’m not even sure of the best way to “download an entirely new binary file” in an easy and script-able way. At first I was going to the Caddy website, clicking the checkboxes for the plugins I wanted, downloading the file to my (Windows) computer, then uploading via SFTP. Now I keep a reference to that download link and just use an ugly wget command like:

wget 'https://caddyserver.com/download/build?os=linux&arch=amd64&features=cors%2Cgit%2Cjwt%2Cdigitalocean'

However, when Caddy went from v0.8 to v0.9 this broke suddenly, because one of the features I was using (I forget which one now) was pulled into core, so my download link started returning a 400 error. If a feature name changing or disappearing entirely can break my update process, I wouldn’t consider that process “production-ready”. I don’t want a cron job running every Sunday at 2am to take my server down unexpectedly.
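For what it’s worth, the download-and-swap dance is scriptable; the subtle part is replacing the file safely while the service may be reading it. Here is a minimal sketch of just the swap step, using throwaway files instead of a real binary and leaving out the download and service restart (all paths here are stand-ins, not anything from my actual setup):

```shell
#!/bin/sh
# Sketch of the "swap out the binary" step with stand-in files.
# On a real server the target would be e.g. /usr/local/bin/caddy,
# followed by a service restart -- both are assumptions here.
set -eu

dir=$(mktemp -d)
printf 'old-binary' > "$dir/caddy"           # the "running" binary
printf 'new-binary' > "$dir/caddy.download"  # the freshly downloaded build
chmod +x "$dir/caddy.download"

# mv within one filesystem is atomic: anything opening the path sees
# either the old file or the new one, never a partial write.
mv "$dir/caddy.download" "$dir/caddy"

cat "$dir/caddy"
```

Downloading to a temp file and `mv`-ing it into place avoids ever serving a half-written binary; the restart is the only part that still causes downtime.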

Is there any good way to automatically update Caddy when a new version is released?

There isn’t an official (or any?) package for Caddy on Debian. I might get around to it eventually, but for now I’m holding off so I don’t bite off more than I can chew.

Before Caddy 0.9, restarts always spawned a new process, meaning you could swap out the binary file and signal USR1 and voila, Caddy would also update its binary. In 0.9, restarts are in-process (which is generally better) but I admit we need a USR2 to upgrade the binary with zero downtime.
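The swap-then-signal pattern can be illustrated with a toy script: a “server” process traps a USR2 signal and reacts, analogous to telling the real server to re-exec the already-swapped binary on disk. Everything here (the worker function, the messages) is invented for illustration:

```shell
#!/bin/bash
# Toy illustration of the swap-then-signal pattern: a "server" traps
# USR2 and reacts, the way Caddy would re-exec its swapped binary.
worker() {
  trap 'echo upgraded; exit 0' USR2
  echo ready
  while true; do sleep 0.2; done   # pretend to serve requests
}

out=$(
  worker &
  pid=$!
  sleep 0.5               # give the trap time to install
  kill -USR2 "$pid"       # operator side: signal after swapping the file
  wait "$pid"
)
echo "$out"
```

The appeal of the signal approach is that the process never stops listening: the old process keeps serving until the new one is ready.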

How is that ugly? :rolling_eyes: It’s literally wget <url>.

We apparently define production-ready differently. I mean that Caddy is suitable for use in production. (With the usual disclaimer of “use the right tool for the right job,” yada yada.) What I don’t say is that Caddy is backward-compatible. In fact, I explain that it’s subject to breaking changes before 1.0.

I realize the situation can be frustrating, but I’m hoping such changes are infrequent. Still, I would prefer to change things now rather than later.

Interesting that you would request this – I would actually love this, but most people I’ve talked to about updating Caddy automatically in production are averse to the idea, so I haven’t pursued it. But I still envision a web server that, set up one day in 2016, will still be using modern cipher suites and protocols in 2020 because it kept applying safe, rolling updates. Unfortunately, most site owners don’t like the risk this entails.

As for how the wget command is ugly, I mean the URL is ugly and almost impossible to remember. I always end up going to the website, checking the boxes for the features I want, then right-clicking the download button and selecting “Copy link address”.

Regarding the production-ready comment, I meant to imply that my update process is not production-ready. I believe that to be production-ready you need to be able to anticipate the results and you should have the expectation that it will continue to run unless something changes. I can set up Caddy and it will keep running in an expected and predictable manner, but any attempt to automate the update process is at risk of suddenly breaking when a feature is removed from Caddy.

I can understand why some people are averse to automatic updates. If you test version 0.9 exhaustively to the point that you finally have faith it won’t die in production, you don’t want version 1.0 to sneak its way onto the server and crash because of some uncaught bug. This is the reason apt doesn’t automatically update: you have to tell it that you want to update using apt-get dist-upgrade. That way you can test the update on a staging server, verify everything still works, and check off on it before upgrading production.

I’m in the smaller group of people who believe that you should upgrade first and be prepared to roll back if necessary. I want to leave security holes and bugs in production for the minimum amount of time necessary. This is why I have a cron job which updates all of my software early in the morning on a Sunday. I wake up Sunday morning and check my email: if the upgrade failed or caused the server to start throwing errors, I can fix it before anyone notices on Monday. However, it saves me from having to manually upgrade and run regression tests when, 99% of the time, nothing goes wrong.
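A sketch of what that crontab might look like (the script path and address are placeholders, not my actual setup; cron mails any output to MAILTO, which is how the Sunday-morning email arrives when a script that is silent on success prints on failure):

```
# Hypothetical crontab: run all updates Sundays at 02:00.
MAILTO=admin@example.com
0 2 * * 0  /usr/local/bin/update-all.sh
```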

I have been using this as a universal Caddy upgrader and for checking which plugins are currently available: getcaddy github repo

It can be used with a cron job or manually. It automatically installs the plugins that were part of the old binary (if present).
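For instance, a weekly cron invocation might look like this (a sketch only: the `curl https://getcaddy.com | bash -s personal` form follows the repo’s README, but verify the current arguments there; the log path is an assumption):

```
# Sketch: re-run getcaddy weekly; it reinstalls the plugins found in
# the existing binary. Check the repo's README for the exact arguments.
@weekly curl -fsSL https://getcaddy.com | bash -s personal >> /var/log/getcaddy.log 2>&1
```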

Note that since Caddy 0.10.9 this only works for the personal edition of the binary.

Instead of setting up the environment manually, use the Caddy installer. See the full installation tutorial here. Once you’ve set up with the Caddy installer, create a bash script at /usr/local/bin/caddyupdate like this:

#!/bin/bash
# Update Caddy (with these plugins) only if a newer version is
# available, then start the service again.
caddy update http.cors,http.git,http.jwt,tls.dns.digitalocean
caddy start

Now create a cron entry using the @daily tag to execute /usr/local/bin/caddyupdate for regular updates. The script above will only update Caddy if there is a newer version available. See an update example here. Note that a few seconds of downtime will occur!
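The cron entry itself could be as simple as this (the log path is an assumption):

```
# Hypothetical crontab line: @daily runs once per day at midnight.
@daily /usr/local/bin/caddyupdate >> /var/log/caddyupdate.log 2>&1
```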

Seems a little opinionated - while it’s not for me, it definitely looks like a pretty comprehensive little helper!

Did you know that you can send the Caddy process a USR2 signal to gracefully reload the binary? You might be able to eliminate this downtime when upgrading.