Announce: New http cache plugin

Did you also think about using an in-memory cache? I would actually prefer that over static file caching because the latter could also be done by the application.

@eva2000 I didn’t expect that result, but it is completely logical that nginx is faster: it is written in C, it was written with performance as a primary concern, and it has had years of optimizations. Caddy is still young and has a lot of room for optimization.

Maybe the results were because of the different stream ciphers used (in your benchmark nginx used ChaCha20 versus Caddy’s AES; ChaCha20 is supposed to be much faster), or TLS session tickets (enabled in nginx but not in Caddy), which might force a new TLS session each time — I’m not sure if h2load uses them. As for the latest versions, they will probably help: Go got faster in recent releases, especially after 1.7 with the new compiler optimizations.

@kekub I have thought about an in-memory store. I wrote the storage layer independently, so new implementations can be plugged in. I have even written one that uses mmap, but it had the problem that with a lot of cached pages it could fill up your memory and start swapping the cached content. So I prefer not to release it until there is a setting to limit the cache size (using an LRU algorithm or something similar).
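Such a size limit could be a byte-bounded LRU. A rough sketch in Go (not the plugin’s actual code — all names here are made up for illustration):

```go
package main

import (
	"container/list"
	"fmt"
)

// entry is one cached response body keyed by URL path.
type entry struct {
	key   string
	value []byte
}

// lruCache evicts the least-recently-used entry once the total
// size of stored values exceeds maxBytes.
type lruCache struct {
	maxBytes int
	curBytes int
	order    *list.List               // front = most recently used
	items    map[string]*list.Element // key -> element in order
}

func newLRUCache(maxBytes int) *lruCache {
	return &lruCache{
		maxBytes: maxBytes,
		order:    list.New(),
		items:    make(map[string]*list.Element),
	}
}

func (c *lruCache) Get(key string) ([]byte, bool) {
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	c.order.MoveToFront(el) // mark as recently used
	return el.Value.(*entry).value, true
}

func (c *lruCache) Put(key string, value []byte) {
	if el, ok := c.items[key]; ok {
		c.curBytes += len(value) - len(el.Value.(*entry).value)
		el.Value.(*entry).value = value
		c.order.MoveToFront(el)
	} else {
		c.items[key] = c.order.PushFront(&entry{key, value})
		c.curBytes += len(value)
	}
	// Evict from the back until we are under the limit again.
	for c.curBytes > c.maxBytes && c.order.Len() > 0 {
		oldest := c.order.Back()
		e := oldest.Value.(*entry)
		c.order.Remove(oldest)
		delete(c.items, e.key)
		c.curBytes -= len(e.value)
	}
}

func main() {
	c := newLRUCache(10) // tiny limit for demonstration
	c.Put("/a", []byte("12345"))
	c.Put("/b", []byte("12345"))
	c.Get("/a")                // touch /a so /b is the eviction candidate
	c.Put("/c", []byte("123")) // pushes total over 10 bytes, evicts /b
	_, okA := c.Get("/a")
	_, okB := c.Get("/b")
	fmt.Println(okA, okB) // true false
}
```

The idea is that swapping only happens because the store grows without bound; capping it by bytes keeps the hot pages resident and lets the OS alone decide what to do with the rest.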


Indeed, nginx has a few years’ head start as well :slight_smile:

Indeed, I will retest when I have more free time — looking forward to seeing what Go 1.8+ brings to Caddy :slight_smile:

Already seeing if this can help with an Apache/AJP proxy for a backend app. Apache caching was helpful, but slow. This already seems faster in initial tests. Thank you!

Edit: per what @kekub was asking about; when I was looking into the setup I’m working on now, Caching Guide - Apache HTTP Server Version 2.4 was somewhat helpful. I guess they use 3 different mechanisms? One of them appeared to be specifically memory caching for TLS/SSL + credentials usage.

I think you can still use this: the key is to set it to cache images or something. It goes by MIME type, which I wish the http.gzip module did as well, since Apache supports doing DEFLATE (GZIP) based on type.

Working example for me (thus far)…

cache {
    default_max_age 1440m
    match_header Content-Type image/* text/css
    path /tmp/caddy
    status_header X-Cache-Status
}

Have you seen any real progress in production with this configuration? What can I expect?

This is being used in concert with http.proxy to secure an older backend app. I believe my boss’s words were “f-ing hit it out of the park”. :grinning:

Edit: I should say I’m still experimenting with the default_max_age; 10080m = 7 days; 1440m = 1 day.


Excellent!

:grinning: Good to know the docs say default_max_age defaults to 5 minutes :+1:

Hi @nicolasazrak ,

I am currently trying to get your plugin working. For my demo website I enabled it by using just the cache directive:

site.com {
    root /var/www/site.com/build
    gzip
    cache
}

This works, but it “feels” slower. Since the site only serves static assets, I figured the cache could only add overhead anyway. Also, I cannot find any cached files on my system.

I added:

site.com {
    root /var/www/site.com/build
    gzip
    cache {
        path /tmp/caddy-cache
    }
}

where caddy-cache is a folder created in /tmp with chmod 777 and with its owner and group set to the caddy user. This leads to an HTTP 500 “Internal Server Error”.

When I change the path to /home/myuser/caddy-cache (same setup as above) it crashes all my websites although systemd is reporting that caddy is up and running without any problems.

Is there something that I am doing wrong? Was this plugin intended to only cache proxy responses?

Thanks in advance
Kevin

@kekub The cache is not really useful with static content, because serving from the cache is pretty similar to serving the file directly. It might be a little faster because it keeps the file descriptor open, but the difference is not big. Although it works (the cache is independent of where the data came from), it is more useful when you have slow responses, for example ones coming from an app server written in a slower language.
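For that slow-backend case, pairing the cache with http.proxy might look something like this (a hypothetical Caddyfile sketch — the backend address and max age are made up for illustration):

    site.com {
        proxy / localhost:8080
        cache {
            path /tmp/caddy-cache
            default_max_age 10m
        }
    }

Here the first hit pays the app server’s latency and subsequent hits within the max age are served from the cache.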

About the 500 errors: that is really bad, it should not happen, much less crash Caddy. Please open an issue at github.com/nicolasazrak/caddy-cache/issues. Are you using other plugins? Is that your full Caddyfile? Do the logs show anything?

I have a slow HDD in my server and want to cache some of the files on a ramdisk.

I will open an issue within the next few days. My Caddyfile is quite big, as I have about 10 mini sites on that server. I did not see any logs, but I will check in detail.

Make sure you’ve enabled the error log (errors directive) and the process log (-log CLI option). :slight_smile:
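For reference, a minimal sketch of both (the log paths here are just example locations):

    site.com {
        root /var/www/site.com/build
        errors /var/log/caddy/errors.log
        cache
    }

and start Caddy with the process log enabled:

    caddy -log /var/log/caddy/process.log

The errors directive captures per-site errors like that 500, while the process log should show anything that takes the whole server down.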

Revisited http.cache with source-built Caddy 0.10.9. http.cache proxy caching still helps for index.html in h2load HTTP/2 HTTPS testing, though it is still slower than non-cached Nginx results: https://community.centminmod.com/threads/caddy-http-2-server-benchmarks-part-2.12873/#post-54715

Tested Caddy source builds with GCC 4.8.5 and GCC 7.1.1 on CentOS 7.4, on a 2 CPU OpenVZ VPS with 2GB RAM and a 50GB SSD.


Thanks for the benchmarks. Nginx will always be faster: first, it’s written in C; second, it has been in development since 2004.


Yes, Nginx had an earlier head start. This highlights why sharing and encouraging benchmarks of both official and source-compiled Caddy binaries should occur (see Caddy EULA section 3.1 h - benchmarking info clarification?). How else can you continually track Caddy’s development progress in terms of performance and regressions over time? :slight_smile:


Yeah, you have a point. Matt should ease the restriction on pre-built binaries. Anyway, the new tool Caddy-X (to be released) should ease deployment.


Regarding building any version: if you’re going for speed, these might help. Not much of a Go coder (yet).


Cheers @unquietwiki — not much of a Go person myself either. If someone can whip up a guide for setting up Caddy with a profiler, I’ll start including profiler info in my future benchmarks too :slight_smile:
