Moderator edit: This is a very popular search result, but it’s also from 4+ years ago, when the software was barely a year old. To answer your search query succinctly in a 2021 perspective: Yes, you can use Caddy for high-traffic sites. It’s as fast and efficient as any other Go server, and its TLS and certificate features scale better than any other server, even those with built-in ACME support. Caddy is used by many businesses in production. Most of the bottlenecks come from poor system tuning.
Hello
I have a website which is mostly unused during the year, but receives a high volume of traffic for a really short period of time. When we have to use it, it can receive up to 15,000 active connections per second for one or two hours.
Right now, we are using Nginx and it’s fine, but since the whole website is only static content, I thought of using Caddy instead. But I was wondering if it could handle such traffic as well as Nginx. We are using a VPS on DigitalOcean, and before an event we switch to the biggest size (20 cores and 64 GB of RAM), so I don’t think we would be short on CPU power or RAM…
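For what it’s worth, from reading the docs it looks like the whole config for a static site like ours would only be a few lines. Something like this, I think (domain and root path are placeholders, and I haven’t tested it myself):

```
example.com {
    # Serve files from this directory (placeholder path)
    root * /var/www/html
    # Compress responses on the fly
    encode gzip
    # Enable the static file server
    file_server
}
```

That simplicity is part of why I’m tempted to switch.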
Is anyone using Caddy on a big site right now? I’d love to get some feedback, since I can’t try it directly and risk failure…
So what’s stopping you from using multiple test sites to check?
The only way to figure out your situation reliably is to actually do it.
Then the next best way is to test.
You could put haproxy in front of both, old and new, and flick between the new and old if the new falls over. Remember there are OS tweaks related to the C10k problem too…
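A minimal sketch of that flip in haproxy config terms (backend names, addresses, and ports are all made up for illustration):

```
frontend web
    bind *:80
    # Point this at nginx_old to roll back instantly
    default_backend caddy_new

backend caddy_new
    # Caddy listening on an internal port (placeholder)
    server caddy1 127.0.0.1:8080 check

backend nginx_old
    # Existing Nginx kept warm as a fallback (placeholder)
    server nginx1 127.0.0.1:8081 check
```

Switching back is just changing `default_backend` and reloading haproxy, so the risk of trying Caddy live is pretty contained.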
And if it is ONLY static, why not just put a CDN in front of it?
If you have the time available to experiment, we’d all be interested in reading a real-world blog post about what you found when you decided to kick the tyres on Caddy.
Thanks for this number @NurdTurd, it’s really interesting!
I think we will try with a separate server, not on the main one as you suggested @SoreGums, since we want to keep everything simple for the live website.
If it’s all static content, you should also use a CDN. Something like Cloudflare can be a free way to spare your server part of the load and improve the experience for your users.
In my very specific benchmark testing (no system tuning, serving only static files, with a certain Go version, etc., etc.)…
But yes, Caddy is fast @nicolinux. It depends on exactly what you put in your Caddyfile. Most people think their traffic is higher than it really is. For example, 200-300 people on your site at a time isn’t a lot. Your biggest problem with serving really high-traffic sites is going to be system tuning.
Also, that kind of sustained throughput will depend heavily on how well you tune your OS’s TCP stack to deal with high traffic. You are likely to run out of sockets and file descriptors when handling a high number of small requests.
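As a rough sketch, these are the sysctl knobs people usually reach for first on Linux when short-lived connections start exhausting sockets. The values below are starting points, not prescriptions; benchmark before and after:

```
# /etc/sysctl.conf (apply with: sysctl -p)

# Widen the ephemeral port range so more short-lived sockets fit
net.ipv4.ip_local_port_range = 1024 65535

# Allow reusing sockets stuck in TIME_WAIT for new connections
net.ipv4.tcp_tw_reuse = 1

# Raise the accept/listen backlog to absorb connection bursts
net.core.somaxconn = 65535
net.ipv4.tcp_max_syn_backlog = 65535

# Raise the system-wide open file descriptor ceiling
fs.file-max = 1000000
```

You will also want to raise the per-process file descriptor limit (`ulimit -n`, or `LimitNOFILE=` in the systemd unit) for whichever server you run, since each open connection consumes a descriptor.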
I’m not an expert at all, but does this kind of tuning really have a big impact? We haven’t looked at the OS level so far, because the NGINX configuration alone has been enough until now.
Now that Caddy 0.9 is out, I’ll be revisiting these benchmarks and comparisons, along with my other web server integrations (Apache 2.4, OpenLiteSpeed, and H2O), in my LEMP stack.