Rate limiting API endpoint

I am using caddy-ratelimit by mholt.

My goal is to throttle the /api endpoint per IP address, to avoid situations where a client accidentally triggers a flood of traffic. The actual API is served by a Python (Waitress) server behind Caddy's reverse proxy.

However, the caddy-ratelimit documentation and examples are sparse, so I am not sure whether I am doing this correctly, e.g. how the handle block interacts with the match directive that caddy-ratelimit defines separately.

Could anyone confirm whether this is a working configuration, and whether there are any gotchas?

{
    # Disable the Caddy admin API
    # This is personal preference, you can remove this if desired
    admin off

    log {
        output file /var/log/caddy/access.log
        format json
    }

    # Caddy must be told where the custom rate_limit directive
    # fits in the directive order
    order rate_limit before basicauth
}


http://example.com {

    # Backend API request
    handle /api* {

        # Rate limit the /api endpoint to 20 requests per minute
        rate_limit {
            window 1m
            events 20
        }
        # This is the upstream Waitress server
        reverse_proxy 127.0.0.1:3456 {
            # Backend API must respond to an individual API call within 20 seconds
            transport http {
                response_header_timeout 20s
            }
        }
    }
}

Hmm, I’m not even sure where that syntax comes from; the readme doesn’t show anything like that. (And there is a full example as well.)

You need to specify a zone:

rate_limit {
	zone <name> {
		key    {remote_host}
		window 1m
		events 20
	}
}

(Give it a name of your choosing.)
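Putting that together with the original config, the handle block could look like this sketch (the zone name api is just a placeholder; pick your own):

handle /api* {
	# Allow each client IP 20 requests per minute on /api
	rate_limit {
		zone api {
			key    {remote_host}
			window 1m
			events 20
		}
	}
	# Upstream Waitress server
	reverse_proxy 127.0.0.1:3456 {
		transport http {
			response_header_timeout 20s
		}
	}
}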
