Load API endpoint sometimes times out

I am using the admin API endpoint to dynamically set reverse proxies. Usually the call returns instantly. Sometimes it appears to time out: the call takes a while, yet the config is still applied.

Looking through the journal, I came across this:

```
Sep 01 09:17:09 FLEX-5 caddy[31121]: {"level":"error","ts":1598973429.5918858,"logger":"admin","msg":"stopping current admin endpoint","error":"shutting down admin server: context deadline exceeded"}
```

I did find this, which helped me understand roughly what is happening: https://stackoverflow.com/questions/43321894/context-timeout-implementation-on-every-request-using-golang

When I use curl in a terminal, this never happens, but when I use axios (with JavaScript) it occasionally happens.

I am wondering if I need something in the axios headers that curl sends and I don't, or if this is a bug in Caddy?

This is the code I use to update the config:

    // POST the full JSON config to Caddy's admin /load endpoint
    axios({
      method: 'post',
      url: 'http://localhost:2019/load',
      headers: {
        'Content-Type': 'application/json'
      },
      data: config
    })

Any suggestions welcomed. Thanks.

Interesting; I don’t know what axios really is (or how it works, especially), but this usually happens if there’s a request or connection that is still active. Is axios shutting down the connection after the request?

It’s just a JavaScript library that wraps the XMLHttpRequest browser APIs to make them easier to use.

@Jeff_Galbraith does it happen with the fetch API too? Just to narrow down if it’s something weird that axios is doing or whether it’s anything from a browser.


@francislavoie I will look into it next week. I’ll have to use the node-fetch package, since the fetch API is not built into Node.

Ah, didn’t realize you’re using node, I assumed this was frontend JS. :man_shrugging:

@francislavoie Finally got some time to swap out axios for node-fetch, and I’m seeing the same results.

This gets written to the journal, but (sometimes) the response never comes back:

```
Sep 21 10:08:51 FLEX-5 caddy[14345]: {"level":"info","ts":1600704531.5086918,"logger":"admin.api","msg":"received request","method":"POST","host":"localhost:2019","uri":"/load","remote_addr":"127.0.0.1:36916","headers":{"Accept":["*/*"],"Accept-Encoding":["gzip,deflate"],"Connection":["close"],"Content-Length":["4151"],"Content-Type":["application/json"],"User-Agent":["node-fetch/1.0 (+https://github.com/bitinn/node-fetch)"]}}
```

Here is what I do: when my service starts up, it doesn’t assume the existing Caddy JSON configuration is correct. It rebuilds the config and sends it to Caddy, which always returns immediately. Then, if the user makes changes via a web browser, the same process runs again with the additional data, and the config is sent to Caddy’s /load endpoint. That is where it gets “stuck”. Again, not always.
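As a sketch of that flow (assuming Node 18+ so global `fetch` is available; `buildFullConfig` and `toLoadRequest` are stand-in names for my rebuild step, not my actual code):

```javascript
// Rough sketch of the service's flow: rebuild the entire config, then
// POST it to Caddy's /load endpoint, which replaces the running config.

// Stand-in for the rebuild step described above (hypothetical shape).
function buildFullConfig(userChanges = {}) {
  return { apps: { http: { servers: {} } }, ...userChanges };
}

// Build the request that would be sent to the admin API.
function toLoadRequest(config) {
  return {
    url: 'http://localhost:2019/load',
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(config)
    }
  };
}

// On startup, and again on every user change:
//   const { url, init } = toLoadRequest(buildFullConfig());
//   await fetch(url, init); // global fetch, Node 18+
```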

When it does get “stuck”, everything returns to normal after a “socket hang up” error.

I have no idea what the problem might be.

I just tried with the latest release and the same thing happens:

```
v2.2.0-rc.3 h1:sCf7QCxI8l2/tvH2sJFtOC7cBAOWtB5w6XQPMPIWqqU=
```

I am seeing this in the journal now:

```
Sep 21 11:10:09 FLEX-5 caddy[13872]: {"level":"debug","ts":1600708209.0988941,"logger":"http.stdlib","msg":"http: panic serving 127.0.0.1:59106: inconsistent label cardinality: expected 4 label values but got 3 in prometheus.Labels{"handler":"reverse_proxy", "method":"GET", "server":"srv0"}\ngoroutine 510 [running]:\nnet/http.(*conn).serve.func1(0xc0009aa960)\n\tnet/http/server.go:1801 +0x147\npanic(0x1546720, 0xc0007f87d0)\n\truntime/panic.go:975 +0x3e9\ngithub.com/prometheus/client_golang/prometheus.(*HistogramVec).With(0xc00000e1f0, 0xc000843380, 0x7f6c8f6f5b40, 0xc000094300)\n\tgithub.com/prometheus/client_golang@v1.7.1/prometheus/histogram.go:501 +0xc5\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*metricsInstrumentedHandler).ServeHTTP(0xc0009df940, 0x7f6c8f6f8fc0, 0xc000843290, 0xc00098f600, 0x19da9e0, 0xc000350420, 0x0, 0x0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/metrics.go:137 +0x655\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1(0x7f6c8f6f8fc0, 0xc000843290, 0xc00098f600, 0xc00037fb20, 0xc0009df940)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/routes.go:259 +0x5a\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc00037fb20, 0x7f6c8f6f8fc0, 0xc000843290, 0xc00098f600, 0x177c401, 0xf)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58 +0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1(0x7f6c8f6f8fc0, 0xc000843290, 0xc00098f600, 0x2, 0xc000824000)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/routes.go:231 +0x128\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc00037f9e0, 0x7f6c8f6f8fc0, 0xc000843290, 0xc00098f600, 0x2, 0x2)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58 +0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp/rewrite.Rewrite.ServeHTTP(0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0xc0005523c0, 0x1,
…)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/rewrite/rewrite.go:102 +0x26d\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*metricsInstrumentedHandler).ServeHTTP(0xc0009df480, 0x7f6c8f6f8fc0, 0xc0008431d0, 0xc00098f600, 0x19da9e0, 0xc00037f9e0, 0x0, 0x0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/metrics.go:129 +0x4e2\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1(0x7f6c8f6f8fc0, 0xc0008431d0, 0xc00098f600, 0xc00037fa20, 0xc0009df480)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/routes.go:259 +0x5a\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc00037fa20, 0x7f6c8f6f8fc0, 0xc0008431d0, 0xc00098f600, 0x1, 0x0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58 +0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1(0x7f6c8f6f8fc0, 0xc0008431d0, 0xc00098f600, 0x1, 0x1)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/routes.go:231 +0x128\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc00037fa00, 0x7f6c8f6f8fc0, 0xc0008431d0, 0xc00098f600, 0xc000350420, 0x19da9e0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58 +0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Subroute).ServeHTTP(0xc0009df160, 0x7f6c8f6f8fc0, 0xc0008431d0, 0xc00098f600, 0x19da9e0, 0xc000350420, 0xc0004ae640, 0x203000)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/subroute.go:74 +0x8a\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*metricsInstrumentedHandler).ServeHTTP(0xc0009df9c0, 0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0x19da9e0, 0xc000350420, 0x0, 0x0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/metrics.go:129 +0x4e2\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1(0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0xc00037f9c0,
0xc0009df9c0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/routes.go:259 +0x5a\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc00037f9c0, 0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0xc00077b501, 0xfdf468)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58 +0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1(0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0x0, 0xc00093bec0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/routes.go:231 +0x128\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc000350440, 0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0xc000097500, 0xc0000c3480)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58 +0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1(0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0x40d9d0, 0xc000843110)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/routes.go:203 +0x2fd\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc000350460, 0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0x2, 0x0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58 +0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).enforcementHandler(0xc0009db560, 0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0x19da9e0, 0xc000350460, 0xa9a328cbcf0, 0x2f2f30ec)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/server.go:279 +0x9d\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).wrapPrimaryRoute.func1(0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0xc02f2f30ec, 0xa9a328cbcf0)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/server.go:255 +0x5a\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc000350480, 0x19f2ec0, 0xc00057ae00, 0xc00098f600, 0xc000830900, 0xc00098f600)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/caddyhttp.go:58
+0x44\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).ServeHTTP(0xc0009db560, 0x19f2ec0, 0xc00057ae00, 0xc00098f600)\n\tgithub.com/caddyserver/caddy/v2@v2.2.0-rc.3/modules/caddyhttp/server.go:204 +0x4bd\nnet/http.serverHandler.ServeHTTP(0xc000629960, 0x19f2ec0, 0xc00057ae00, 0xc00098f300)\n\tnet/http/server.go:2843 +0xa3\nnet/http.(*conn).serve(0xc0009aa960, 0x19f7cc0, 0xc000adf900)\n\tnet/http/server.go:1925 +0x8ad\ncreated by net/http.(*Server).Serve\n\tnet/http/server.go:2969 +0x36c"}
```

My JSON (unformatted, as it’s too big for a message):

$ curl localhost:2019/config/
{"apps":{"http":{"servers":{"srv0":{"automatic_https":{"disable":true},"listen":[":80"],"routes":[{"group":"group1","handle":[{"handler":"vars","root":"/opt/ivt/photography"}],"match":[{"path":["/files/*"]}]},{"group":"group1","handle":[{"handler":"vars","root":"/opt/ivt/apps/6.0.0/packages/client/spa"}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"rewrite","uri_substring":[{"find":"/console_socket","replace":"/console/socket.io"}]}]},{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3001"}]}]}]}],"match":[{"path":["/console_socket/*"]}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"rewrite","uri_substring":[{"find":"/web_app_socket","replace":"/web/socket.io"}]}]},{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3001"}]}]}]}],"match":[{"path":["/web_app_socket/*"]}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"rewrite","uri_substring":[{"find":"/liveview","replace":"/socket.io"}]}]},{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3004"}]}]}]}],"match":[{"path":["/liveview/*"]}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"rewrite","uri_substring":[{"find":"/weather","replace":"/socket.io"}]}]},{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3010"}]}]}]}],"match":[{"path":["/weather/*"]}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"rewrite","uri_substring":[{"find":"/archive","replace":"/socket.io"}]}]},{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3003"}]}]}]}],"match":[{"path":["/archive/*"]}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"rewrite","uri_substring":[{"find":"/alarms","replace":"/socket.io"}]}]},{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3002"}]}]}]}],"match":[{"path":["/alarms/*"]}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"rewrite","uri_substring":[{"find":"/ptz","replace":"/socket.io"}]}]},{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3006"}]}]}]}],"match":[{"path":["/ptz/*"]}]},{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"reverse_proxy","headers":{"request":{"set":{"X-Forwarded-Proto":["{http.request.scheme}"],"X-Real-Ip":["{http.request.remote.host}"]}}},"upstreams":[{"dial":"localhost:3001"}]}]}]}],"match":[{"path":["/*"]}]},{"handle":[{"handler":"file_server","hide":["Caddyfile"]}]}]}}},"tls":{"automation":{"policies":[{"issuer":{"module":"internal"}}]}}},"logging":{"logs":{"default":{"level":"DEBUG"}}}}

Turns out the “panic” message is triggered by our Netgear router:

```
Sep 21 11:26:25 FLEX-5 caddy[13872]: {"level":"debug","ts":1600709185.44511,"logger":"http.handlers.reverse_proxy","msg":"upstream roundtrip","upstream":"localhost:3001","request":{"remote_addr":"192.168.0.16:52283","proto":"HTTP/1.1","method":"GET","host":"192.168.0.14","uri":"/currentsetting.htm","headers":{"X-Real-Ip":["192.168.0.16"],"X-Forwarded-For":["192.168.0.16"],"X-Forwarded-Proto":["http"],"User-Agent":["Genie Windows 2.4.62"]}},"duration":102.848549157,"error":"context canceled"}
```

https://www.google.com/search?q=currentsetting.htm&oq=currentsetting.htm&aqs=chrome..69i57j0l2.684j0j7&sourceid=chrome&ie=UTF-8

Also found this in the journal:

```
Sep 21 11:39:17 FLEX-5 caddy[20528]: {"level":"error","ts":1600709957.6683054,"logger":"admin","msg":"stopping current admin endpoint","error":"shutting down admin server: context deadline exceeded"}
```

Wrap your JSON in ``` backticks to use code styling. Easier to read.

The “inconsistent label cardinality” panic is being fixed in a branch currently; it’s unrelated.

So where do we stand on the original problem at this point? I’m a little lost with all the replies.

Normally I would wrap it, but that makes it a single line that, IMO, is harder to read.

Sorry for all the replies. I have not cracked the issue yet. Sometimes the call to the Caddy /load endpoint is fine, and other times it gets “stuck” (it never returns until a “socket hang up” error is issued).
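One thing that might at least make the failure mode less painful (a workaround sketch, not a root-cause fix; `postWithTimeout` is my own name) is to abort the request after a deadline rather than waiting for Node’s eventual “socket hang up”:

```javascript
// Sketch: abort a stuck POST after a deadline instead of waiting for the
// eventual "socket hang up". Uses global fetch (Node 18+) plus
// AbortController; postWithTimeout is a made-up helper name.
async function postWithTimeout(url, body, ms = 5000) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    return await fetch(url, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(body),
      signal: controller.signal
    });
  } finally {
    clearTimeout(timer); // don't leave the timer pending on success
  }
}
```

On timeout the promise rejects with an AbortError, so the caller can log it and retry instead of hanging indefinitely.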

I do see the incoming call logged in the journal, so I know Caddy got the data. It just doesn’t appear to send the response back every time.