Can I use Caddy for a high-traffic site?

Moderator edit: This is a very popular search result, but it’s also from 4+ years ago, when the software was barely a year old. To answer your search query succinctly from a 2021 perspective: Yes, you can use Caddy for high-traffic sites. It’s as fast and efficient as any other Go server, and its TLS and certificate features scale better than any other server’s, even those with built-in ACME support. Caddy is used by many businesses in production. Most bottlenecks come from poor system tuning.


I have a website which is mostly unused during the year, but receives a high volume of traffic for a really short period of time. When we have to use it, it can receive up to 15,000 active connections per second for one or two hours.

Right now, we are using Nginx and it’s fine, but since the whole website is only static content, I thought of using Caddy instead. I was wondering, though, whether it could handle such traffic as well as Nginx. We are using a VPS on DigitalOcean, and before an event we switch to the biggest size (20 cores and 64 GB of RAM), so I don’t think we would be short on CPU power or RAM…
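For reference, the Caddy config I have in mind seems tiny, if I understand the docs right. Something like this (hypothetical domain and path, Caddy 0.9-era syntax):

```
# Hypothetical Caddyfile for a static site
example.com {
    root /var/www/site   # static files live here
    gzip                 # compress responses
}
```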

Is anyone using Caddy on a big site right now? I’d love some feedback, since I can’t just try it directly and risk failure…

Thanks!

So what’s stopping you from using multiple test sites to check?

The only way to figure out your situation reliably is to actually do it.
Then the next best way is to test.

You could put haproxy in front of both, old & new, and flick between them if the new one falls over. Remember there are OS tweaks related to the C10K problem too…
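Roughly what I mean, as a sketch (ports and addresses are hypothetical): nginx stays on one port, Caddy on another, and haproxy prefers Caddy but fails back to nginx automatically:

```
frontend www
    bind *:80
    default_backend caddy_new

backend caddy_new
    server caddy 127.0.0.1:8081 check
    # `backup` means nginx only takes traffic if Caddy's health check fails
    server nginx 127.0.0.1:8080 check backup
```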

And if it is ONLY static why not just put a CDN in front of it?

If you have the time available to experiment, we’d all be interested in reading a real world blog post about what you found when you decided to kick the tyres on caddy :slight_smile:

Caddy can handle ~30,000 Requests/second

Source? :slight_smile: – “Provo Linux User Group - 2/16/16 - Matt Holt - “Caddy””


Thanks for this number @NurdTurd, it’s really interesting!

I think we will try with a separate server, not on the main one as you suggested @SoreGums, since we want to keep everything simple for the live website.

Keep in mind, you could go for a couple of load balancers.

Maybe, but our current system (only one server, with NGINX) is really simple and works well…

Stick to it then! You’re not obligated to switch to Caddy. No worries!

You can always load test.
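For example, with a tool like `hey` (the staging URL and numbers below are hypothetical, just a sketch to show the shape of a test):

```shell
#!/bin/sh
# Sketch of a load test against a staging copy of the site.
# Assumes `hey` is installed: go install github.com/rakyll/hey@latest
TARGET="https://staging.example.com/"  # hypothetical staging URL
CONCURRENCY=500                        # concurrent workers
DURATION=2m                            # how long to run

if command -v hey >/dev/null 2>&1; then
    hey -z "$DURATION" -c "$CONCURRENCY" "$TARGET"
else
    echo "hey not installed; would run: hey -z $DURATION -c $CONCURRENCY $TARGET"
fi
```

Run it against a staging box, never the live server, and watch sockets and file descriptors while it runs.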

If it’s all static content, you should also use a CDN. Something like Cloudflare can be a free way to spare your server part of the load and improve the experience for your users.


Of course! But the ease of use of Caddy is really compelling… :slight_smile:

It’s super majestic isn’t it! :smiley: :+1:

1 Like

We have thought of that, but almost all our users are in the same country (France) and it’s cheaper to use only one server without CDN.

And we don’t need more, frankly. Caddy is more of an experiment I’d like to try…

That number came from my very specific benchmark testing: no system tuning, serving only static files, a certain Go version, etc., etc…

But yes, Caddy is fast @nicolinux. It depends on exactly what you put in your Caddyfile. Most people think their traffic is higher than it really is; for example, 200-300 people on your site at a time isn’t a lot. Your biggest problem with serving really high-traffic sites is going to be system tuning.

Oh sorry! I didn’t realize all the specifics.

Also, that kind of sustained throughput will depend heavily on how well you tune your OS TCP stack to deal with high traffic. You are likely to run out of sockets and the like when handling a high number of small requests.
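The kind of knobs I mean, on Linux (values are illustrative starting points, not recommendations; measure before and after):

```
# /etc/sysctl.d/99-tuning.conf -- illustrative values only
net.core.somaxconn = 65535                  # bigger listen backlog
net.ipv4.ip_local_port_range = 1024 65535   # more ephemeral ports
net.ipv4.tcp_fin_timeout = 15               # recycle FIN-WAIT sockets sooner
net.ipv4.tcp_max_syn_backlog = 65535        # survive connection bursts
fs.file-max = 1048576                       # system-wide fd limit
```

You’ll also want to raise the per-process open-file limit for the web server itself (e.g. `ulimit -n` or `LimitNOFILE=` in a systemd unit), since every connection costs a file descriptor.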

1 Like

I’m not an expert at all, but does this kind of tuning really have a big impact? We have not looked at the OS level so far; the NGINX configuration was enough until now.

Yes, OS-level tuning outside of the web server itself makes a huge difference to performance :slight_smile:

I am looking at integrating Caddy into my Centmin Mod LEMP stack installer, so I needed to evaluate Caddy’s performance and scalability. I did some HTTP/2-based benchmarks back in December 2015 with Caddy 0.8 and my custom-installed Nginx setup, and you can see the results at

Now that Caddy 0.9 is out, I’ll be revisiting these benchmarks and comparisons, along with my other web server integrations for Apache 2.4, OpenLiteSpeed and H2O, in my LEMP stack :slight_smile:


We do about 30 req/s sustained and peak at about 1,300 req/s. It uses headers, proxy, and TLS. So far so good.

1 Like