Undocumented / unexpected timeout status codes on Souin cache?

1. The problem I’m having:

I’m getting a text/plain response with status 200 whose body is “Internal server error” when the backend does not respond before the cache module’s backend timeout expires.

Before configuring any timeouts in the cache config, I was getting this behaviour after 10 seconds, presumably the module’s default.

I think in this case I should at least get a 504 Gateway Timeout, which I could then handle to send the response body as JSON (roughly as sketched below).
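
What I’d ideally like to do is something along these lines (a rough sketch, assuming the cache handler surfaced the deadline as an actual 504 error that handle_errors could catch; the JSON body is just illustrative):

:8181 {
        cache
        reverse_proxy pubapi:3000

        # only reached if the cache layer returned a real error status
        # instead of writing a 200 response itself
        handle_errors 504 {
                header Content-Type application/json
                respond `{"error": "gateway timeout"}` 504
        }
}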

I was going to post this issue here or here but maybe here is better.

The dumb solution is to just set the cache timeout to something ridiculous like 30 minutes, but that would be really strange.
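
If it came to that, the workaround would just be bumping the timeout block in the cache config to something like this (assuming it is the backend value that governs the deadline being hit here, which the 35s timing below suggests):

	timeout {
		backend 30m # absurdly long, just so DEADLINE-EXCEEDED never fires
		cache 2s
	}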

Thank you!

cc @darkweak

2. Error messages and/or full log output:

/etc/caddy # time curl -v 127.0.0.1:8181/v1/public/fff
*   Trying 127.0.0.1:8181...
* Connected to 127.0.0.1 (127.0.0.1) port 8181
> GET /v1/public/fff HTTP/1.1
> Host: 127.0.0.1:8181
> User-Agent: curl/8.8.0
> Accept: */*
>
* Request completely sent off

< HTTP/1.1 200 OK
< Cache-Status: My-Cache; fwd=bypass; detail=DEADLINE-EXCEEDED
< Server: Caddy
< Date: Fri, 06 Dec 2024 13:55:40 GMT
< Content-Length: 21
< Content-Type: text/plain; charset=utf-8
<
* Connection #0 to host 127.0.0.1 left intact
Internal server error
real    0m 35.00s
user    0m 0.00s
sys     0m 0.00s
{"level":"info","ts":"2024-12-06T13:55:40.517Z","logger":"http.handlers.cache","msg":"Internal server error on endpoint /v1/public/fff: [0xc001a2e4e0]"}
{"level":"error","ts":"2024-12-06T13:55:40.517Z","logger":"http.log.error","msg":"context deadline exceeded","request":{"remote_ip":"127.0.0.1","remote_port":"47044","client_ip":"127.0.0.1","proto":"HTTP/1.1","method":"GET","host":"127.0.0.1:8181","uri":"/v1/public/fff","headers":{"User-Agent":["curl/8.8.0"],"Accept":["*/*"]}},"duration":35.002951182}

3. Caddy version:

Caddy version: v2.8.4

Non-standard modules versions:

cache v1.7.3-0.20241027205551-c816c2ba5f81
docker_proxy v2.9.1
http.authentication.providers.authorizer v1.1.29
http.handlers.argsort v0.0.0-20240612174347-3b6537189566
http.handlers.authenticator v1.1.29
http.handlers.cache v1.7.3-0.20241027205551-c816c2ba5f81
http.handlers.cookieflag v0.0.0-20240612175654-c749657e50d6
http.handlers.ipmap v0.0.0-20241012204604-d7269fa80899
http.handlers.rate_limit v0.0.0-20240828171918-12435ecef5db
security v1.1.29
storages.cache.redis v0.0.11

  Non-standard modules: 11

  Unknown modules: 0

4. How I installed and ran Caddy:

Built a custom Docker image and ran it with Docker Compose.

a. System environment:

Docker version 25.0.5
Docker Compose version v2.29.5

b. Command:

docker compose up -d web

c. Service/unit/compose file:

  web:
    command:
      - caddy
      - run
      - --config
      - /etc/caddy/Caddyfile
    image: caddy:custom

d. My complete Caddy config:

{
        # debug
        log {
                output file /logs/main.log {
                        roll_keep_for 10d
                }
                format json {
                        time_format iso8601
                }
                level info
        }
        local_certs
        auto_https disable_redirects
        import ./include/cache.caddyfile
}

:8181 {
        cache 
        reverse_proxy {
                header_down -Server
                to pubapi:3000
        }
}

./include/cache.caddyfile:

cache {
	cache_name My-Cache
	# allowed_http_verbs GET HEAD
	api {
		# debug
		# prometheus
		souin
	}
	log_level debug
	# see https://github.com/darkweak/souin/issues/345#issuecomment-1560574770
	mode bypass
	ttl 30s # 3h
	stale 60s # 72h
	timeout {
	 	backend 35s
	 	cache 2s
	}
	default_cache_control public
	key {
		template {http.request.uri}
		hash
	}
	redis {
		configuration {
			Addrs redis:6379
			DB 12
		}
	}
}

5. Links to relevant resources:

The pubapi container runs a very simple Python script that waits for a configurable number of seconds before returning the request method, request URL and request headers in the response.

The wait before responding is set to 90 seconds for these tests.

I run the script with: SLEEP=90 python /s.py

The script:

import os
import time
import urllib.parse
from http.server import BaseHTTPRequestHandler, HTTPServer

class HeaderEchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Get sleep duration from environment variable, default to 5 seconds
        sleep_duration = int(os.environ.get('SLEEP', 5))

        # Wait for specified number of seconds before responding
        time.sleep(sleep_duration)

        # Send response status code
        self.send_response(200)

        # Send headers
        self.send_header('Content-type', 'text/plain')
        self.end_headers()

        # Collect request details
        request_details = f"Request Method: {self.command}\n"
        request_details += f"Request Path: {self.path}\n"

        # Parse query parameters
        parsed_path = urllib.parse.urlparse(self.path)
        query_params = urllib.parse.parse_qs(parsed_path.query)
        if query_params:
            request_details += "Query Parameters:\n"
            for key, value in query_params.items():
                request_details += f"{key}: {value}\n"

        # Collect all request headers
        header_text = "\nRequest Headers:\n"
        for header, value in self.headers.items():
            header_text += f"{header}: {value}\n"

        # Combine all information
        full_response = request_details + header_text

        # Write the response
        self.wfile.write(full_response.encode())

def run_server(port=3000):
    server_address = ('', port)
    httpd = HTTPServer(server_address, HeaderEchoHandler)
    print(f"Server running on port {port}")
    httpd.serve_forever()

if __name__ == "__main__":
    run_server()
