Can't serve 5GB file

I’m trying to use Caddy to serve a 5GB file. When I use wget to download it, this is the behaviour I’m getting:

wget http://xxx/zipfiles/
--2017-02-16 17:48:29--  http://xxx/zipfiles/
Resolving ...
Connecting to ...|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5404789663 (5.0G) [application/zip]
Saving to: ‘’
     1% [=>                  ]  80.81M  2.52MB/s   in 28s

2017-02-16 17:48:57 (2.85 MB/s) - Connection closed at byte 84732332. Retrying.

--2017-02-16 17:48:58--  (try: 2)  http://xxx/zipfiles/
Connecting to ...|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 5404789663 (5.0G), 5320057331 (5.0G) remaining [application/zip]
Saving to: ‘’
     3% [++==>               ] 163.23M  3.50MB/s   in 25s

2017-02-16 17:49:23 (3.35 MB/s) - Connection closed at byte 171158660. Retrying.

Unfortunately, there is nothing in the log files except 200 (OK) and 206 (Partial Content) responses.

I’m using caddy 0.9.5.

Caddyfile:

xxx {

    root /home/xxx/public/

    rewrite /favicon.ico /static/img/favicon.ico

    proxy / {
        header_upstream X-Forwarded-Host {host}
        header_upstream X-Forwarded-Server {host}
        header_upstream X-Forwarded-Proto {scheme}
        except /media /static /zipfiles
    }

    tls {
        dns cloudflare
    }
}
I have the same, or at least a very similar, issue. I'm also proxying (to lighttpd and to a Python server, with the same result), and I see this behaviour with any non-tiny file (it starts in the tens-of-megabytes range for me). Large files often download very slowly even before the connection drops.

Caddy 0.9.5 (+94e382e Sun Feb 05 15:23:24 UTC 2017) on Debian 8.

Did you remember to disable timeouts?

timeouts none
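For reference, here's roughly where that directive goes in a Caddyfile (a sketch using the placeholders from this thread; `xxx` stands in for the real site address):

```
xxx {
    timeouts none
    root /home/xxx/public/
}
```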

I had

proxy / ip:80 {
        max_fails 10
        fail_timeout 30s
        keepalive 1
}
timeouts 1m

because I picked that up somewhere in the GitHub issues while researching this a few days ago.

Now I set timeouts to 0 and it seems to have fixed it. Should I keep the keepalive?

Depends on your server configuration; the default is 2. Glad you got it working. 🙂

Disabling the timeout globally, thus opening myself up to e.g. slowloris attacks, doesn’t sound like the right solution to me…

nginx has client_body_timeout and client_header_timeout to mitigate slowloris. It still can serve large downloads over a long time just fine.
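Caddy's `timeouts` directive does accept per-phase values, which gets you something similar: a short header/read timeout to blunt slowloris, with no write timeout so long downloads survive. A sketch (check your version's docs for the exact sub-directives and accepted values):

```
timeouts {
    header 10s
    read   10s
    write  none
    idle   5m
}
```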

Unfortunately you may have to use nginx if you need this functionality.

I understand that the inability to set timeouts differently between requests comes from further up the chain in Go’s net/http package.

Yup, sorry. I've only ever seen slowloris in the wild once myself – Caddy had no timeouts at all before this release, and in the two years prior I never heard a single reported slowloris issue (except that one time) – so you might be okay for now. 🙂

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.