Streaming breaks when Caddy proxies Next.js

1. The problem I’m having:

I have the following proxy chain: Caddy → Next.js (centralized auth middleware) → Python backend.

Individually, every component streams correctly: Caddy pointed directly at the backend streams responses properly, and Next.js pointed directly at the backend streams responses properly.

Caddy pointed at Next.js, however, buffers the event stream until the request finishes. Nothing else in the setup has changed except which upstream Caddy proxies to. I've also tried forcing HTTP/1.1 to the upstream and setting flush_interval to -1.
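For reference, the variant with those settings looked roughly like this (a sketch of what I tried, not my full config; per the Caddy docs, flush_interval -1 disables response buffering and transport http { versions 1.1 } pins the upstream protocol):

```
:6746 {
  reverse_proxy localhost:3000 {
    # Flush each chunk to the client immediately instead of buffering
    flush_interval -1
    # Force HTTP/1.1 to the upstream
    transport http {
      versions 1.1
    }
  }
}
```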

I’m using the following command for every test. Port 3000 is Next.js, 8000 is the backend, and 6746 is Caddy.

curl -vL -X POST "http://localhost:{3000,8000,6746}/api/documents/v2/extract" \
    -F "files=@/home/sam/Downloads/PO8419.pdf" \
    -F 'fields=[
      {
        "name": "Purchase Order Number",
        "type": "string",
        "description": "The purchase order number or PO number from the document. This is a different field than the order number"
      },
      {
        "name": "Order Date",
        "type": "date",
        "description": "The date when the purchase order was created or issued"
      }
    ]'

2. Error messages and/or full log output:

Caddy pointed at Next.js (the whole stream arrives in one chunk once processing is finished):

Note: Unnecessary use of -X or --request, POST is already inferred.
* Host localhost:6746 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
*   Trying [::1]:6746...
* Connected to localhost (::1) port 6746
* using HTTP/1.x
> POST /api/documents/v2/extract HTTP/1.1
> Host: localhost:6746
> User-Agent: curl/8.14.1
> Accept: */*
> Content-Length: 421058
> Content-Type: multipart/form-data; boundary=------------------------lw704SvVQDryxkxfJtO12m
> 
* upload completely sent off: 421058 bytes
< HTTP/1.1 200 OK
< Access-Control-Allow-Headers: *
< Access-Control-Allow-Origin: *
< Cache-Control: no-cache
< Content-Type: text/event-stream; charset=utf-8
< Date: Mon, 18 Aug 2025 03:49:17 GMT
< Server: uvicorn
< Vary: Accept-Encoding
< Via: 1.1 Caddy
< Transfer-Encoding: chunked
< 
data: {"status": "processing", "message": "Processing 1 documents in parallel", "document_count": 1, "field_count": 2, "fields": ["Purchase Order Number", "Order Date"]}

data: {"status": "document_processing", "message": "Processing document: PO8419.pdf", "file_index": 0, "filename": "PO8419.pdf"}

data: {"status": "azure_processing", "message": "Analyzing PO8419.pdf with Azure Document Intelligence...", "file_index": 0, "filename": "PO8419.pdf"}

data: {"status": "batch_progress", "message": "Completed 1/1 documents", "completed_count": 1, "total_count": 1}

data: {"status": "document_error", "message": "Error processing PO8419.pdf: too many values to unpack (expected 1)", "file_index": 0, "filename": "PO8419.pdf", "error": "too many values to unpack (expected 1)"}

data: {"status": "completed", "message": "Parallel bulk extraction completed: 0 successful, 1 failed", "total_documents": 1, "successful_count": 0, "failed_count": 1, "results": [{"filename": "PO8419.pdf", "file_index": 0, "success": false, "error": "too many values to unpack (expected 1)"}]}

* Connection #0 to host localhost left intact

Next.js pointed at the backend (streams each message as expected):

Note: Unnecessary use of -X or --request, POST is already inferred.
* Host localhost:3000 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
*   Trying [::1]:3000...
* Connected to localhost (::1) port 3000
* using HTTP/1.x
> POST /api/documents/v2/extract HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/8.14.1
> Accept: */*
> Content-Length: 421058
> Content-Type: multipart/form-data; boundary=------------------------Cs7ftSvwQrIG7Ku3JrQw8i
> 
* upload completely sent off: 421058 bytes
< HTTP/1.1 200 OK
< date: Mon, 18 Aug 2025 03:49:31 GMT
< server: uvicorn
< cache-control: no-cache
< connection: keep-alive, close
< access-control-allow-origin: *
< access-control-allow-headers: *
< content-type: text/event-stream; charset=utf-8
< transfer-encoding: chunked
< Vary: Accept-Encoding
< 
data: {"status": "processing", "message": "Processing 1 documents in parallel", "document_count": 1, "field_count": 2, "fields": ["Purchase Order Number", "Order Date"]}

data: {"status": "document_processing", "message": "Processing document: PO8419.pdf", "file_index": 0, "filename": "PO8419.pdf"}

data: {"status": "azure_processing", "message": "Analyzing PO8419.pdf with Azure Document Intelligence...", "file_index": 0, "filename": "PO8419.pdf"}

data: {"status": "document_error", "message": "Error processing PO8419.pdf: too many values to unpack (expected 1)", "file_index": 0, "filename": "PO8419.pdf", "error": "too many values to unpack (expected 1)"}

data: {"status": "batch_progress", "message": "Completed 1/1 documents", "completed_count": 1, "total_count": 1}

data: {"status": "completed", "message": "Parallel bulk extraction completed: 0 successful, 1 failed", "total_documents": 1, "successful_count": 0, "failed_count": 1, "results": [{"filename": "PO8419.pdf", "file_index": 0, "success": false, "error": "too many values to unpack (expected 1)"}]}

* Connection #0 to host localhost left intact

Caddy pointed at the backend (streams each message as expected):

Note: Unnecessary use of -X or --request, POST is already inferred.
* Host localhost:6746 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
*   Trying [::1]:6746...
* Connected to localhost (::1) port 6746
* using HTTP/1.x
> POST /api/documents/v2/extract HTTP/1.1
> Host: localhost:6746
> User-Agent: curl/8.14.1
> Accept: */*
> Content-Length: 421058
> Content-Type: multipart/form-data; boundary=------------------------LsMtkbmNtGAS5dA0ynJi7i
> 
* upload completely sent off: 421058 bytes
< HTTP/1.1 200 OK
< Access-Control-Allow-Headers: *
< Access-Control-Allow-Origin: *
< Cache-Control: no-cache
< Content-Type: text/event-stream; charset=utf-8
< Date: Mon, 18 Aug 2025 03:59:47 GMT
< Server: uvicorn
< Via: 1.1 Caddy
< Transfer-Encoding: chunked
< 
data: {"status": "processing", "message": "Processing 1 documents in parallel", "document_count": 1, "field_count": 2, "fields": ["Purchase Order Number", "Order Date"]}

data: {"status": "document_processing", "message": "Processing document: PO8419.pdf", "file_index": 0, "filename": "PO8419.pdf"}

data: {"status": "azure_processing", "message": "Analyzing PO8419.pdf with Azure Document Intelligence...", "file_index": 0, "filename": "PO8419.pdf"}

data: {"status": "batch_progress", "message": "Completed 1/1 documents", "completed_count": 1, "total_count": 1}

data: {"status": "document_error", "message": "Error processing PO8419.pdf: too many values to unpack (expected 1)", "file_index": 0, "filename": "PO8419.pdf", "error": "too many values to unpack (expected 1)"}

data: {"status": "completed", "message": "Parallel bulk extraction completed: 0 successful, 1 failed", "total_documents": 1, "successful_count": 0, "failed_count": 1, "results": [{"filename": "PO8419.pdf", "file_index": 0, "success": false, "error": "too many values to unpack (expected 1)"}]}

* Connection #0 to host localhost left intact

3. Caddy version:

2.10.0

4. How I installed and ran Caddy:

a. System environment:

System: Linux desktop 6.16.0 #1-NixOS SMP PREEMPT_DYNAMIC Sun Jul 27 21:26:38 UTC 2025 x86_64 GNU/Linux
Caddy: Installed through Nixpkgs

b. Command:

Unsure; the command is abstracted away by Devenv.

c. Service/unit/compose file:

d. My complete Caddy config:

{
  admin off
}

:6746 {
  @supabase path_regexp supabase ^/supabase(/.*)?$
  handle @supabase {
    rewrite * {re.supabase.1}
    reverse_proxy localhost:54321
  }

  @ws_excel {
    path_regexp ws_excel ^/excel(/.*)?$
    header Connection *Upgrade*
    header Upgrade websocket
  }
  handle @ws_excel {
    rewrite * /excel{re.ws_excel.1}
    reverse_proxy localhost:6747
  }

  reverse_proxy localhost:{3000,8000} {
    header_up Host dev.coalesc.xyz
  }
}

5. Links to relevant resources:

{3000,8000} isn’t valid in the Caddyfile. Please avoid using Bash’s brace expansion when showing Caddyfile examples. It can be confusing.

Also, in your curl examples you’re not sending the Host header as dev.coalesc.xyz, but your reverse_proxy block sets that header upstream. Have you tried running curl with -H 'Host: dev.coalesc.xyz', or testing the Caddyfile both with and without the header_up Host dev.coalesc.xyz line?
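For example, something like this (adjust the path and file to your setup):

```
curl -v -H 'Host: dev.coalesc.xyz' \
    "http://localhost:6746/api/documents/v2/extract" \
    -F "files=@/home/sam/Downloads/PO8419.pdf"
```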

Can you enable debug in the global options and share the logs?

{
    debug
}

To 3000:

[devenv:processes:caddy] {"level":"debug","ts":1755525725.465343,"logger":"http.handlers.reverse_proxy","msg":"selected upstream","dial":"localhost:3000","total_upstreams":1}
[devenv:processes:caddy] {"level":"debug","ts":1755525725.4919698,"logger":"http.handlers.reverse_proxy","msg":"upstream roundtrip","upstream":"localhost:3000","duration":0.02645524,"request":{"remote_ip":"::1","remote_port":"44898","client_ip":"::1","proto":"HTTP/1.1","method":"POST","host":"localhost:6746","uri":"/api/documents/v2/extract","headers":{"Accept":["*/*"],"Content-Length":["421058"],"Content-Type":["multipart/form-data; boundary=------------------------WLOuYkoy0Bzg4TFRM2a2qS"],"X-Forwarded-For":["::1"],"X-Forwarded-Proto":["http"],"X-Forwarded-Host":["localhost:6746"],"Via":["1.1 Caddy"],"User-Agent":["curl/8.14.1"]}},"headers":{"Server":["uvicorn"],"Cache-Control":["no-cache"],"Access-Control-Allow-Origin":["*"],"Date":["Mon, 18 Aug 2025 14:02:05 GMT"],"Access-Control-Allow-Headers":["*"],"Content-Type":["text/event-stream; charset=utf-8"],"Vary":["Accept-Encoding"]},"status":200}

To 8000:

[devenv:processes:caddy] {"level":"debug","ts":1755525926.6216435,"logger":"http.handlers.reverse_proxy","msg":"selected upstream","dial":"localhost:8000","total_upstreams":1}
[devenv:processes:caddy] {"level":"debug","ts":1755525926.63575,"logger":"http.handlers.reverse_proxy","msg":"upstream roundtrip","upstream":"localhost:8000","duration":0.01403345,"request":{"remote_ip":"::1","remote_port":"51610","client_ip":"::1","proto":"HTTP/1.1","method":"POST","host":"localhost:6746","uri":"/api/documents/v2/extract","headers":{"Via":["1.1 Caddy"],"User-Agent":["curl/8.14.1"],"Accept":["*/*"],"Content-Length":["421058"],"Content-Type":["multipart/form-data; boundary=------------------------zooBIzpG4DVRW2yKUmWvVd"],"X-Forwarded-For":["::1"],"X-Forwarded-Proto":["http"],"X-Forwarded-Host":["localhost:6746"]}},"headers":{"Access-Control-Allow-Headers":["*"],"Content-Type":["text/event-stream; charset=utf-8"],"Date":["Mon, 18 Aug 2025 14:05:26 GMT"],"Server":["uvicorn"],"Cache-Control":["no-cache"],"Connection":["keep-alive"],"Access-Control-Allow-Origin":["*"]},"status":200}

The curly braces are just meant to represent what I’ve tried, not what’s literally in the config. Keeping or removing header_up makes no difference.
