Caddy randomly truncates responses in production

Hi! I’m trying to replace nginx with Caddy in our production stack.

But I have a problem: large responses from the proxied Node.js backend (~1.5–2 MB of JSON) get cut off randomly, so clients can’t parse them. Occasionally a response comes through intact, but that’s rare.

It occurs both with gzip and without it.

It doesn’t occur when I start Caddy on my local machine and proxy all requests to the production backend.

So I suppose the truncation is caused by load on Caddy. In production we have about 100 POST req/s (not that much) proxied to the backend.

I tried a couple of things, like adjusting timeouts, but without any result.

And there are no errors, because Caddy returns the data with a 200 status.
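A 200 status by itself doesn’t prove the body arrived intact: the status line and headers are sent before the body, so a connection dropped mid-body still looks "successful" in the logs. Here is a small self-contained sketch (a toy server, not the actual stack) that advertises 1000 bytes but closes after sending 100; the client still sees 200:

```python
# Toy demonstration: a 200 response whose body is truncated mid-transfer.
# All names here are illustrative, not taken from the thread.
import http.client
import socket
import threading

def truncating_server(listener):
    conn, _ = listener.accept()
    conn.recv(4096)  # read and discard the request
    conn.sendall(
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Type: application/json\r\n"
        b"Content-Length: 1000\r\n"
        b"\r\n"
        + b"x" * 100  # only 100 of the promised 1000 bytes
    )
    conn.close()  # drop the connection mid-body

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=truncating_server, args=(listener,), daemon=True).start()

client = http.client.HTTPConnection("127.0.0.1", port)
client.request("GET", "/api/data")
resp = client.getresponse()
try:
    body = resp.read()
except http.client.IncompleteRead as e:
    body = e.partial  # the bytes that did arrive before the close

print(resp.status, len(body))  # status is 200 even though the body is short
```

Comparing the Content-Length header against the bytes actually received (as `http.client` does here by raising `IncompleteRead`) is one way to confirm on the client side that responses really are being cut off in transit.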

Here is the config:

0.0.0.0:80

log stdout
errors stdout

proxy /api api:3000 {
  without /api
  transparent
}
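The post says timeouts were tried but doesn’t show how. The Caddyfile syntax above looks like Caddy 1.x, which has a `timeouts` directive; a write timeout firing mid-transfer on a large, slow response would look exactly like random truncation with a 200 status, since the headers have already gone out. One way to rule that out is to disable them entirely (a sketch, assuming Caddy 0.11-era syntax):

```
0.0.0.0:80

timeouts none

log stdout
errors stdout

proxy /api api:3000 {
  without /api
  transparent
}
```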

Any help is appreciated. :)

There could be a lot of things causing this. How have you verified that this is not a bug or issue with your backend app? Or perhaps a client cutting off the connection.

Hi, Matt. I think Node.js is fine, because everything works flawlessly with nginx. And as I said, when I run Caddy on my local machine and proxy requests to the production Node.js backend, everything is OK. The problems begin when I deploy it.

Oh, and everything is deployed with Docker, so Caddy runs in a Docker container.

Hmm, I’m not sure. You’ll need to find a way to reproduce the behavior somewhat reliably if we’re going to be able to look at what is going on. Can you help us with that? Try to reduce the setup down to as minimal as possible for it to happen.
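A reduced setup could be a compose file with only the two containers involved, isolated from the rest of the production stack. Everything below (image names, tags, paths) is assumed for illustration, not taken from the thread:

```yaml
# Hypothetical minimal reproduction: just Caddy and the Node.js backend.
# Image tags and paths are placeholders.
version: "3"
services:
  caddy:
    image: abiosoft/caddy:0.11.0   # assumed Caddy 1.x image
    ports:
      - "80:80"
    volumes:
      - ./Caddyfile:/etc/Caddyfile
    depends_on:
      - api
  api:
    build: ./api                   # the backend that returns the large JSON
```

Driving this with a load generator (e.g. `ab` with a high concurrency against the `/api` endpoint) and comparing response sizes would show whether load inside Docker alone is enough to trigger the truncation.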


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.