I’m building a new .NET Core 2.0 site on an Ubuntu box and was intending to go with nginx when I heard about Caddy.
I’m a crazy performance nut who likes to see sub-500 ms load times for my simple web apps.
I’ve seen that nginx is faster than Caddy under high-traffic loads. I know there’s more to the performance story than time to first byte, but I’m curious how quickly Caddy can serve content on low-traffic sites.
For sites with maybe 100-200 concurrent connections, how does Caddy compare with nginx?
This is a complicated question to answer, I hope you know. If you feel you are qualified to do so, I suggest you run your own benchmarks: brew install wrk (on a Mac, for example) and then use wrk to compare.
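For example, something like this, assuming the same static page is served by Caddy and nginx on different ports (the ports here are just placeholders, adjust to your setup):

    # 4 threads, 150 open connections, 30-second run against each server
    wrk -t4 -c150 -d30s http://localhost:2015/   # Caddy
    wrk -t4 -c150 -d30s http://localhost:8080/   # nginx

Run each a few times and look at the latency distribution, not just requests/sec.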
100-200 concurrent connections isn’t much. When the Caddy website gets a lot of attention on social media or HN or something, there are easily 300+ visitors on the site at a time, and its pages load in less than a few hundred ms in total. Of course there are a lot of external factors involved, etc. etc…
You could add the {latency} placeholder to your log format and keep track of latency that way.
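A minimal Caddyfile sketch of that, assuming Caddy 1.x log directive syntax and that your .NET Core app listens on localhost:5000 (both are assumptions, adjust to your setup):

    example.com {
        proxy / localhost:5000
        # log the request duration alongside the usual fields
        log / /var/log/caddy/access.log "{remote} {method} {uri} {status} {latency}"
    }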
Ultimately, my advice in these situations is to just try it. 100 connections is easy. Even if nginx is faster in microbenchmarks, is it perceptibly faster? Probably not. The real questions you should be asking, since system resources are likely not your bottleneck, are:
Which web server will be easier to maintain?
Which will do more work for me with less configuration?
Which will help me to do more with my site with less room for error?
For which server do I have easier access to its developers and community?
I’ll be running it on a rather small VPS, 1 GB RAM, 1 CPU…
Which is easier to maintain? Less config? Less room for error? Access to the devs? Caddy by a long shot. (And I really like that you used Discourse for the community.)
Thanks for the quick answer. If this is any indication, I think I’ll really enjoy using Caddy.
How quick? It depends on whether you’re looking at TTFB, first meaningful paint, and other metrics that make up initial perceived page load time, versus actual page load time. But as @matt stated, do your own tests to compare. I recommend webpagetest.org for doing page load tests across several geographic regions. You can also leverage webpagetest.org’s API to script the testing, which is what I do so I can see how incremental tweaks and tuning affect page load speeds.
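If you want a rough idea of what scripting it looks like, kicking off a run is just an HTTP call to the API; a sketch (the API key and test location are placeholders you’d fill in from your own account):

    # start a test and get the result URLs back as JSON
    curl "https://www.webpagetest.org/runtest.php?url=https://example.com/&k=YOUR_API_KEY&location=Dulles:Chrome&f=json"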
There are so many varieties/forks of wrk, but do any of them actually support HTTP/2 over HTTPS? To test HTTP/2 over HTTPS, you’d probably want to use nghttp2’s h2load load tester: https://nghttp2.org/documentation/h2load-howto.html
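Something along these lines with h2load, assuming your site already serves HTTP/2 over TLS (the numbers are just a starting point):

    # 10,000 requests total, 100 clients, up to 10 concurrent streams per connection
    h2load -n10000 -c100 -m10 https://example.com/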