Can Caddy "fire and forget" a reverse proxy request?

I have some requests that are “fire and forget” analytics requests. They come into a Caddy server and get reverse proxied to an upstream location. This works fine, but Caddy waits for the response from the upstream before returning to the client. Is there a way to have Caddy immediately return a 200 status, then asynchronously forward the request to the upstream and ignore the response?

Welcome Eric!

In Caddy, there’s not. Returning 200 right away and forwarding in the background would mean nothing throttles the in-flight upstream requests, which could lead to resource exhaustion.

One way to do this is to have your client close the connection after the request is sent, but before waiting for the response. Caddy will immediately clean up the resources associated with that connection, including killing the connection to the upstream. Hopefully the request made it to the upstream in time.
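Just to illustrate the mechanics of “send, then hang up”, a bare-bones client could look something like this (plain HTTP and a made-up host and path; this is a sketch, not a recommendation):

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	// Connect straight to the server (hypothetical host; plain HTTP only for
	// illustration, a real setup behind Caddy would almost certainly be HTTPS).
	conn, err := net.Dial("tcp", "analytics.example.com:80")
	if err != nil {
		return
	}

	// Write a complete request...
	body := `{"event":"page_view"}`
	fmt.Fprintf(conn, "POST /collect HTTP/1.1\r\nHost: analytics.example.com\r\n"+
		"Content-Type: application/json\r\nContent-Length: %d\r\nConnection: close\r\n\r\n%s",
		len(body), body)

	// ...then close without waiting for a response. Caddy notices the
	// disconnect and tears down its upstream connection, as described above.
	conn.Close()
}
```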

A better way to do this is to have your backend respond immediately every time and, if the processing is going to take a while, handle the analytics asynchronously on the backend instead. Of course, that could lead to resource exhaustion there too.
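As a sketch of that pattern with Go’s standard net/http (the route, port, and processAnalytics function here are made up for illustration):

```go
package main

import (
	"io"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/collect", func(w http.ResponseWriter, r *http.Request) {
		// Read the payload before responding; the body isn't safe to use
		// after the handler returns.
		payload, err := io.ReadAll(r.Body)
		if err != nil {
			http.Error(w, "bad request", http.StatusBadRequest)
			return
		}

		// Respond right away so Caddy (and the client) are released immediately.
		w.WriteHeader(http.StatusOK)

		// Do the slow part in the background. A real service would bound this
		// (queue, worker pool) to avoid the resource exhaustion mentioned above.
		go processAnalytics(payload)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}

// processAnalytics stands in for the actual (possibly slow) analytics work.
func processAnalytics(payload []byte) {
	log.Printf("processing %d bytes of analytics", len(payload))
}
```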

The best way to do this is to have the client send the analytics asynchronously. That way, the client depletes its own resources, hopefully before the server does. Unlikely, but at least the damage lands on the client as well, instead of only on your own infrastructure.


Browsers actually have an API for this, navigator.sendBeacon(), which asynchronously sends a fire-and-forget message to the server, even if the tab is being closed. It’s quite useful when implementing analytics or other types of data collection on your site.

Thanks for the quick reply! My use case was mostly that I wanted to put the Caddy servers geographically closer to the end user, so that analytics requests would be faster from their perspective.

The backend I’m communicating with is not geo-distributed, and making it so would be difficult, so I was hoping for a sort of “poor man’s” geo-distributed setup 😄

I am certainly aware there could be resource exhaustion issues, but do you think this is something I could accomplish with an extension to Caddy if I were to roll up my sleeves and start playing with the internals?

You could probably make a copy of the reverse_proxy module as a plugin and make whatever changes you need…

But I still don’t see how that would have any benefit. Why not just make sure your client code uses something like sendBeacon, which doesn’t care about the response? This seems much more like a client problem to solve than a server problem.
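That said, if you do want to experiment with a plugin, a stripped-down HTTP handler module is probably simpler than copying all of reverse_proxy. Here’s a very rough, untested sketch of the idea; the module ID, the Upstream field, and the forwarding logic are all made up for illustration:

```go
package fireforget

import (
	"bytes"
	"io"
	"net/http"

	"github.com/caddyserver/caddy/v2"
	"github.com/caddyserver/caddy/v2/modules/caddyhttp"
)

func init() {
	caddy.RegisterModule(Handler{})
}

// Handler responds 200 immediately, then forwards a copy of the request
// to Upstream in the background and ignores the upstream's response.
type Handler struct {
	// Upstream is a hypothetical config field: the URL to forward requests to.
	Upstream string `json:"upstream,omitempty"`
}

// CaddyModule returns the Caddy module information.
func (Handler) CaddyModule() caddy.ModuleInfo {
	return caddy.ModuleInfo{
		ID:  "http.handlers.fire_forget", // made-up module ID
		New: func() caddy.Module { return new(Handler) },
	}
}

func (h Handler) ServeHTTP(w http.ResponseWriter, r *http.Request, _ caddyhttp.Handler) error {
	// Buffer the body now; it isn't safe to read after the handler returns.
	body, err := io.ReadAll(r.Body)
	if err != nil {
		return caddyhttp.Error(http.StatusBadRequest, err)
	}

	// Release the client immediately.
	w.WriteHeader(http.StatusOK)

	// Forward in the background and discard the response. Unbounded goroutines
	// are exactly the resource exhaustion risk discussed above.
	go func() {
		resp, err := http.Post(h.Upstream, r.Header.Get("Content-Type"), bytes.NewReader(body))
		if err == nil {
			resp.Body.Close()
		}
	}()
	return nil
}

// Interface guard
var _ caddyhttp.MiddlewareHandler = (*Handler)(nil)
```

A real module would also need Caddyfile support (an UnmarshalCaddyfile method) if you want to configure it from a Caddyfile, plus some way to bound the number of in-flight forwards, which is exactly the resource exhaustion concern mentioned earlier.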


Hi Francis, thanks for the reply! There’s a little more to the story - I called it analytics for brevity, but it’s not exactly analytics data being sent, and it can’t use sendBeacon for a number of reasons (request configuration being one). In addition, there are backwards-compatibility requirements.

The customers involved also, for better or worse, measure the request round-trip latency even though it’s async, and would prefer the endpoint to be geographically nearer to speed things up. (The SSL handshake alone gets significantly faster if we can put the endpoint at an edge near the client.)

This topic was automatically closed after 30 days. New replies are no longer allowed.