HTTP catch-all for unknown domains logs redirects for known domains

1. The problem I’m having:

I have a (big, dynamic) list of known domains that Caddy should serve. Each has its own site block (generated via a template outside of Caddy), using only the domain name (without a scheme, e.g. example.com). This works fine so far (the automatic HTTP→HTTPS redirects too).

I want to have a catch-all HTTP block to display a static error page for unknown domains (i.e. everything without a site block), so I added this block too:

http:// {
  root /srv/http/unknown_domain
  import logs UNKNOWN
  import assets
  handle {
    import fileserver
  }
}

Here are the includes:

(logs) {
	log {
		output file /var/log/caddy/access.log {
			roll_size 50MiB
			roll_keep 0
			roll_keep_for 24h
		}
		format append {
			fields {
				account "{args[:]}"
			}
			wrap filter {
				fields {
					request>remote_ip delete
					request>headers>Accept-Encoding delete
					request>headers>Accept delete
					request>headers>Connection delete
					request>tls delete
					request>remote_port delete
					resp_headers delete
				}
				wrap json {
					time_format iso8601
				}
			}
		}
	}
}
(fileserver) {
	file_server {
		hide .git .hg .htaccess
		index index.html index.htm nocontent.html
		{args[:]}
	}
	encode {
		gzip
		minimum_length 1024
	}
}
(assets) {
	handle_path /assets/* {
		import fileserver root /srv/http/_assets
	}
}

This generally works (the page is shown for unknown domains, and known domains are still redirected to HTTPS), but all HTTP requests for known domains are logged with the field account=UNKNOWN.

I suspect I can work around this by adding manual redirects for all known domains (sketch below), but I hope I can avoid that. I was a bit surprised that the http:// site block is even used for requests that would match a site block defined without a scheme.
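For reference, the manual workaround would look roughly like this for each known domain (a sketch only; example.com and the account name are placeholders for the generated entries, not part of my actual config):

http://example.com {
	# log with the correct account instead of UNKNOWN
	import logs example
	# replicate the automatic HTTP→HTTPS redirect
	redir https://{host}{uri} permanent
}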

I feel this part of the docs is crucial:

"If you already have a server listening on the HTTP port, the HTTP-HTTPS redirect routes will be inserted after your routes with a host matcher, but before a user-defined catch-all route."
But I have not found a way to prevent this yet…
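For what it's worth, the route order the docs describe can be inspected by adapting the Caddyfile to JSON and looking at the routes of the :80 server (named http in my global options). This is just a sketch; the Caddyfile path is the Arch package default and jq is only used for readability:

# adapt the Caddyfile and print the routes of the :80 server
caddy adapt --config /etc/caddy/Caddyfile | jq '.apps.http.servers.http.routes'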

2. Error messages and/or full log output:

No errors per se…

Just that every HTTP request for a “known domain” (i.e. one for which a site block is defined) is logged as an “unknown domain” (because of the logging config in the http:// catch-all).

3. Caddy version:

v2.10.0

4. How I installed and ran Caddy:

pacman -S caddy
systemctl start caddy

a. System environment:

Arch Linux

b. Command:

systemctl start caddy

c. Service/unit/compose file:

d. My complete Caddy config:

{
	admin "unix//run/caddy/admin.socket"
	persist_config off

	email dev+tls-brutus.uberspace.de@uberspace.de
	default_sni brutus.uberspace.de
	skip_install_trust
	acme_ca https://acme-staging-v02.api.letsencrypt.org/directory

	log default {
		output stderr
		format console
	}

	metrics {
		per_host
	}

	grace_period 5s
	servers :443 {
		name https
	}
	servers :80 {
		name http
	}
}

(logs) {
	# ARGS: account name
	log {
		output file /var/log/caddy/access.log {
			roll_size 50MiB
			roll_keep 0
			roll_keep_for 24h
		}
		format append {
			fields {
				account "{args[:]}"
			}
			wrap filter {
				fields {
					request>remote_ip delete
					request>headers>Accept-Encoding delete
					request>headers>Accept delete
					request>headers>Connection delete
					request>tls delete
					request>remote_port delete
					resp_headers delete
				}
				wrap json {
					time_format iso8601
				}
			}
		}
	}
}

(fileserver) {
	# ARGS: extra args for `file_server` directive
	file_server {
		hide .git .hg .htaccess
		index index.html index.htm nocontent.html
		{args[:]}
	}
	encode {
		gzip
		minimum_length 1024
	}
}

(assets) {
	handle_path /assets/* {
		import fileserver root /srv/http/_assets
	}
}

# UNKNOWN DOMAINS - Catchall

http:// {
	root /srv/http/unknown_domain
	import logs UNKNOWN
	import assets
	handle {
		import fileserver
	}
}

# KNOWN DOMAINS
# might be a lot more entries like this…

brutus.uber8.space {
	root /srv/http/_accounts/brutus/html
	import logs brutus

	tls {
		on_demand
	}

	import fileserver
}

5. Links to relevant resources:

Maybe this was a bit long-winded… sorry.

What I’m trying to do is have an http:// catch-all that logs requests to domains not explicitly defined (it logs them by adding an account field with the value “UNKNOWN” to the log lines and then shows a static “we do not know this domain” page). That basically works already.

But HTTP→HTTPS redirects for known (defined) sites are now also logged with the account field set to “UNKNOWN”, instead of the value set in their site block. The redirect itself works, though, and the static error page is not shown.

How can I prevent requests for known sites from being logged like unknown ones?

Any ideas?