1. The problem I’m having:
I have the following proxy chain: Caddy → Next.js (centralized auth middleware) → Python backend.
Individually, every component streams fine: Caddy pointed directly at the backend streams responses properly, and Next.js pointed at the backend streams responses properly.
But Caddy pointed at Next.js buffers the event stream until the request finishes. Nothing else in the setup changes between tests except which upstream Caddy's reverse proxy targets. I've also tried forcing the upstream transport to HTTP/1.1 and setting flush_interval to -1.
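For reference, the two attempts mentioned above looked roughly like this on the Next.js upstream (paraphrased; they have since been removed from the config in section 4d because neither changed the behavior):

```caddyfile
reverse_proxy localhost:3000 {
	# Stream responses to the client immediately instead of buffering
	flush_interval -1
	# Force HTTP/1.1 on the connection to the upstream
	transport http {
		versions 1.1
	}
}
```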
I’m using the following command against each component; port 3000 is Next.js, 8000 is the backend, and 6746 is Caddy.
```sh
curl -vL -X POST "http://localhost:{3000,8000,6746}/api/documents/v2/extract" \
  -F "files=@/home/sam/Downloads/PO8419.pdf" \
  -F 'fields=[
    {
      "name": "Purchase Order Number",
      "type": "string",
      "description": "The purchase order number or PO number from the document. This is a different field than the order number"
    },
    {
      "name": "Order Date",
      "type": "date",
      "description": "The date when the purchase order was created or issued"
    }
  ]'
```
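To help isolate this, I can swap in a minimal stand-in backend: if its events still arrive in one chunk through Caddy → Next.js, the application code is ruled out entirely. This is a hypothetical stdlib-only sketch, not my real backend (the port and path are illustrative):

```python
# Hypothetical stand-in for the Python backend: an SSE endpoint that emits
# one event per second, so buffering anywhere in the chain is obvious.
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


class SSEHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/event-stream")
        self.send_header("Cache-Control", "no-cache")
        self.send_header("Transfer-Encoding", "chunked")
        self.end_headers()
        for i in range(5):
            event = f'data: {{"tick": {i}}}\n\n'.encode()
            # Write one chunked-encoding frame per event (hex size, CRLF,
            # payload, CRLF) and flush immediately.
            self.wfile.write(f"{len(event):x}\r\n".encode() + event + b"\r\n")
            self.wfile.flush()
            time.sleep(1)
        # Terminating zero-length chunk ends the response.
        self.wfile.write(b"0\r\n\r\n")


def serve(port: int = 8000):
    # Point the proxy chain at this port instead of the real backend.
    ThreadingHTTPServer(("127.0.0.1", port), SSEHandler).serve_forever()
```

Watching `curl -N` against each hop then shows whether events arrive once per second or all at once at the end.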
2. Error messages and/or full log output:
Caddy pointed at Next.js (the whole stream arrives in one chunk once processing finishes):

```
Note: Unnecessary use of -X or --request, POST is already inferred.
* Host localhost:6746 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
* Trying [::1]:6746...
* Connected to localhost (::1) port 6746
* using HTTP/1.x
> POST /api/documents/v2/extract HTTP/1.1
> Host: localhost:6746
> User-Agent: curl/8.14.1
> Accept: */*
> Content-Length: 421058
> Content-Type: multipart/form-data; boundary=------------------------lw704SvVQDryxkxfJtO12m
>
* upload completely sent off: 421058 bytes
< HTTP/1.1 200 OK
< Access-Control-Allow-Headers: *
< Access-Control-Allow-Origin: *
< Cache-Control: no-cache
< Content-Type: text/event-stream; charset=utf-8
< Date: Mon, 18 Aug 2025 03:49:17 GMT
< Server: uvicorn
< Vary: Accept-Encoding
< Via: 1.1 Caddy
< Transfer-Encoding: chunked
<
data: {"status": "processing", "message": "Processing 1 documents in parallel", "document_count": 1, "field_count": 2, "fields": ["Purchase Order Number", "Order Date"]}
data: {"status": "document_processing", "message": "Processing document: PO8419.pdf", "file_index": 0, "filename": "PO8419.pdf"}
data: {"status": "azure_processing", "message": "Analyzing PO8419.pdf with Azure Document Intelligence...", "file_index": 0, "filename": "PO8419.pdf"}
data: {"status": "batch_progress", "message": "Completed 1/1 documents", "completed_count": 1, "total_count": 1}
data: {"status": "document_error", "message": "Error processing PO8419.pdf: too many values to unpack (expected 1)", "file_index": 0, "filename": "PO8419.pdf", "error": "too many values to unpack (expected 1)"}
data: {"status": "completed", "message": "Parallel bulk extraction completed: 0 successful, 1 failed", "total_documents": 1, "successful_count": 0, "failed_count": 1, "results": [{"filename": "PO8419.pdf", "file_index": 0, "success": false, "error": "too many values to unpack (expected 1)"}]}
* Connection #0 to host localhost left intact
```
Next.js pointed at the backend (streams each message as expected):

```
Note: Unnecessary use of -X or --request, POST is already inferred.
* Host localhost:3000 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
* Trying [::1]:3000...
* Connected to localhost (::1) port 3000
* using HTTP/1.x
> POST /api/documents/v2/extract HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/8.14.1
> Accept: */*
> Content-Length: 421058
> Content-Type: multipart/form-data; boundary=------------------------Cs7ftSvwQrIG7Ku3JrQw8i
>
* upload completely sent off: 421058 bytes
< HTTP/1.1 200 OK
< date: Mon, 18 Aug 2025 03:49:31 GMT
< server: uvicorn
< cache-control: no-cache
< connection: keep-alive, close
< access-control-allow-origin: *
< access-control-allow-headers: *
< content-type: text/event-stream; charset=utf-8
< transfer-encoding: chunked
< Vary: Accept-Encoding
<
data: {"status": "processing", "message": "Processing 1 documents in parallel", "document_count": 1, "field_count": 2, "fields": ["Purchase Order Number", "Order Date"]}
data: {"status": "document_processing", "message": "Processing document: PO8419.pdf", "file_index": 0, "filename": "PO8419.pdf"}
data: {"status": "azure_processing", "message": "Analyzing PO8419.pdf with Azure Document Intelligence...", "file_index": 0, "filename": "PO8419.pdf"}
data: {"status": "document_error", "message": "Error processing PO8419.pdf: too many values to unpack (expected 1)", "file_index": 0, "filename": "PO8419.pdf", "error": "too many values to unpack (expected 1)"}
data: {"status": "batch_progress", "message": "Completed 1/1 documents", "completed_count": 1, "total_count": 1}
data: {"status": "completed", "message": "Parallel bulk extraction completed: 0 successful, 1 failed", "total_documents": 1, "successful_count": 0, "failed_count": 1, "results": [{"filename": "PO8419.pdf", "file_index": 0, "success": false, "error": "too many values to unpack (expected 1)"}]}
* Connection #0 to host localhost left intact
```
Caddy pointed at the backend (streams each message as expected):

```
Note: Unnecessary use of -X or --request, POST is already inferred.
* Host localhost:6746 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
* Trying [::1]:6746...
* Connected to localhost (::1) port 6746
* using HTTP/1.x
> POST /api/documents/v2/extract HTTP/1.1
> Host: localhost:6746
> User-Agent: curl/8.14.1
> Accept: */*
> Content-Length: 421058
> Content-Type: multipart/form-data; boundary=------------------------LsMtkbmNtGAS5dA0ynJi7i
>
* upload completely sent off: 421058 bytes
< HTTP/1.1 200 OK
< Access-Control-Allow-Headers: *
< Access-Control-Allow-Origin: *
< Cache-Control: no-cache
< Content-Type: text/event-stream; charset=utf-8
< Date: Mon, 18 Aug 2025 03:59:47 GMT
< Server: uvicorn
< Via: 1.1 Caddy
< Transfer-Encoding: chunked
<
data: {"status": "processing", "message": "Processing 1 documents in parallel", "document_count": 1, "field_count": 2, "fields": ["Purchase Order Number", "Order Date"]}
data: {"status": "document_processing", "message": "Processing document: PO8419.pdf", "file_index": 0, "filename": "PO8419.pdf"}
data: {"status": "azure_processing", "message": "Analyzing PO8419.pdf with Azure Document Intelligence...", "file_index": 0, "filename": "PO8419.pdf"}
data: {"status": "batch_progress", "message": "Completed 1/1 documents", "completed_count": 1, "total_count": 1}
data: {"status": "document_error", "message": "Error processing PO8419.pdf: too many values to unpack (expected 1)", "file_index": 0, "filename": "PO8419.pdf", "error": "too many values to unpack (expected 1)"}
data: {"status": "completed", "message": "Parallel bulk extraction completed: 0 successful, 1 failed", "total_documents": 1, "successful_count": 0, "failed_count": 1, "results": [{"filename": "PO8419.pdf", "file_index": 0, "success": false, "error": "too many values to unpack (expected 1)"}]}
* Connection #0 to host localhost left intact
```
3. Caddy version:
2.10.0
4. How I installed and ran Caddy:
a. System environment:
System: Linux desktop 6.16.0 #1-NixOS SMP PREEMPT_DYNAMIC Sun Jul 27 21:26:38 UTC 2025 x86_64 GNU/Linux
Caddy: Installed through Nixpkgs
b. Command:
Unsure; the run command is abstracted away by Devenv.
c. Service/unit/compose file:
d. My complete Caddy config:
```caddyfile
{
	admin off
}

:6746 {
	@supabase path_regexp supabase ^/supabase(/.*)?$
	handle @supabase {
		rewrite * {re.supabase.1}
		reverse_proxy localhost:54321
	}

	@ws_excel {
		path_regexp ws_excel ^/excel(/.*)?$
		header Connection *Upgrade*
		header Upgrade websocket
	}
	handle @ws_excel {
		rewrite * /excel{re.ws_excel.1}
		reverse_proxy localhost:6747
	}

	# Either localhost:3000 (Next.js) or localhost:8000 (backend), depending on the test
	reverse_proxy localhost:{3000,8000} {
		header_up Host dev.coalesc.xyz
	}
}
```