1. The problem I’m having:
Two potentially interlinked problems:
- I’m trying to install Grawlix Webcomic CMS on my server, which runs Caddy; as far as I can tell, Grawlix is written for Apache. I’ve gotten far enough that I can enter the admin panel and upload a page, but uploading takes me to a 500 Internal Server Error. When I return to the main page (Test | My Comic), it shows the page I just uploaded, but navigating to any other page returns a 404 error. At that point in my testing I figured I’d try to replicate the .htaccess rules that ship with Grawlix in my Caddyfile, which would be easier to do under a subdomain, and that’s when I ran into my second problem. Here is the .htaccess file I tried to replicate:
<IfModule mod_rewrite.c>
RewriteEngine on
# Uncomment the next line if you get server errors when installing or running your site. Some hosts require it.
# RewriteBase /
# If you're installing Grawlix in a subdirectory, you may need to uncomment that line and change / to your directory, e.g. /grawlix/
# Allow common web files.
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.js$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.html$
# Allow image files.
RewriteCond %{REQUEST_FILENAME} !^.*\.jpg$
RewriteCond %{REQUEST_FILENAME} !^.*\.jpeg$
RewriteCond %{REQUEST_FILENAME} !^.*\.gif$
RewriteCond %{REQUEST_FILENAME} !^.*\.png$
RewriteCond %{REQUEST_FILENAME} !^.*\.svg$
RewriteCond %{REQUEST_FILENAME} !^.*\.ico$
# Allow media files.
RewriteCond %{REQUEST_FILENAME} !^.*\.swf$
RewriteCond %{REQUEST_FILENAME} !^.*\.mov$
RewriteCond %{REQUEST_FILENAME} !^.*\.wmv$
RewriteCond %{REQUEST_FILENAME} !^.*\.mp3$
RewriteCond %{REQUEST_FILENAME} !^.*\.pdf$
RewriteCond %{REQUEST_FILENAME} !^.*\.zip$
# Allow font files.
RewriteCond %{REQUEST_FILENAME} !^.*\.eot$
RewriteCond %{REQUEST_FILENAME} !^.*\.otf$
RewriteCond %{REQUEST_FILENAME} !^.*\.woff$
RewriteCond %{REQUEST_FILENAME} !^.*\.ttf$
#Allow files for autoSSL to issue certificates
RewriteCond %{REQUEST_FILENAME} !^.*\.txt$
RewriteCond %{REQUEST_FILENAME} !^.*\.tmp$
# Every other URL request goes through index.php.
RewriteRule ^(.*)$ index.php?$1
</IfModule>
And here is what I put in my Caddyfile, under the relevant subdomain (my best attempt, may be entirely off-base):
@redirects {
not file *.css *.js *.php *.xml *.html *.jpg *.jpeg *.gif *.eot *.otf *.woff *.ttf *.txt *.tmp
}
rewrite @redirects /index.php
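For reference, a translation that stays closer to the .htaccess might use a `path` matcher instead of `file` (the RewriteCond lines only test the extension, not whether the file exists on disk) and pass the original path as a query string the way `RewriteRule ^(.*)$ index.php?$1` does. This is only an untested sketch of what I mean; I'm not sure it's right either:

```
@notStatic {
	not path *.css *.js *.php *.xml *.html *.jpg *.jpeg *.gif *.png *.svg *.ico *.swf *.mov *.wmv *.mp3 *.pdf *.zip *.eot *.otf *.woff *.ttf *.txt *.tmp
}
# The .htaccess sends everything else to index.php?$1,
# i.e. the original path ends up in the query string
rewrite @notStatic /index.php?{path}
```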
- I made a subdomain pointing to the directory where Grawlix is located, so that I could apply the above rewrite rule to it, and found that it loads the page but none of the linked assets (images, CSS stylesheets). I then checked a previously working subdomain defined in the same Caddyfile, and it now returns a 404 error. The subdomain that reverse-proxies to a local port is still working fine. I found a thread on this forum listing the correct configuration for subdomains, and as far as I can tell mine matches it, especially since it worked fine before.
2. Error messages and/or full log output:
curl -vL output for https://bonyfish.net/etc/grawlix-test/comic, one of the pages that leads to a 404 error:
* Trying 82.25.84.122:443...
* Connected to bonyfish.net (82.25.84.122) port 443 (#0)
* ALPN: offers h2,http/1.1
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* CAfile: /etc/ssl/certs/ca-certificates.crt
* CApath: /etc/ssl/certs
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
* TLSv1.3 (IN), TLS handshake, Certificate (11):
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
* TLSv1.3 (IN), TLS handshake, Finished (20):
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.3 (OUT), TLS handshake, Finished (20):
* SSL connection using TLSv1.3 / TLS_AES_128_GCM_SHA256
* ALPN: server accepted h2
* Server certificate:
* subject: CN=bonyfish.net
* start date: Oct 16 22:48:31 2025 GMT
* expire date: Jan 14 22:48:30 2026 GMT
* subjectAltName: host "bonyfish.net" matched cert's "bonyfish.net"
* issuer: C=US; O=Let's Encrypt; CN=E7
* SSL certificate verify ok.
* using HTTP/2
* h2h3 [:method: GET]
* h2h3 [:path: /etc/grawlix-test/comic]
* h2h3 [:scheme: https]
* h2h3 [:authority: bonyfish.net]
* h2h3 [user-agent: curl/7.88.1]
* h2h3 [accept: */*]
* Using Stream ID: 1 (easy handle 0x55f416d7cf20)
> GET /etc/grawlix-test/comic HTTP/2
> Host: bonyfish.net
> user-agent: curl/7.88.1
> accept: */*
>
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
< HTTP/2 404
< alt-svc: h3=":443"; ma=2592000
< server: Caddy
< content-length: 0
< date: Sun, 09 Nov 2025 19:25:11 GMT
<
* Connection #0 to host bonyfish.net left intact
curl -vL output for https://photography.bonyfish.net, a previously-working subdomain:
* Trying 82.25.84.122:443...
* Connected to photography.bonyfish.net (82.25.84.122) port 443 (#0)
* ALPN: offers h2,http/1.1
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* CAfile: /etc/ssl/certs/ca-certificates.crt
* CApath: /etc/ssl/certs
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
* TLSv1.3 (IN), TLS handshake, Certificate (11):
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
* TLSv1.3 (IN), TLS handshake, Finished (20):
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.3 (OUT), TLS handshake, Finished (20):
* SSL connection using TLSv1.3 / TLS_AES_128_GCM_SHA256
* ALPN: server accepted h2
* Server certificate:
* subject: CN=photography.bonyfish.net
* start date: Oct 17 09:18:31 2025 GMT
* expire date: Jan 15 09:18:30 2026 GMT
* subjectAltName: host "photography.bonyfish.net" matched cert's "photography.bonyfish.net"
* issuer: C=US; O=Let's Encrypt; CN=E8
* SSL certificate verify ok.
* using HTTP/2
* h2h3 [:method: GET]
* h2h3 [:path: /]
* h2h3 [:scheme: https]
* h2h3 [:authority: photography.bonyfish.net]
* h2h3 [user-agent: curl/7.88.1]
* h2h3 [accept: */*]
* Using Stream ID: 1 (easy handle 0x564e86e0ef20)
> GET / HTTP/2
> Host: photography.bonyfish.net
> user-agent: curl/7.88.1
> accept: */*
>
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
< HTTP/2 404
< alt-svc: h3=":443"; ma=2592000
< server: Caddy
< content-length: 0
< date: Sun, 09 Nov 2025 19:31:59 GMT
<
* Connection #0 to host photography.bonyfish.net left intact
curl -vL output for https://bonyfish.net/photography, which is another way to reach the same page that the previously checked subdomain points to (I’ve left out the actual page content below this, for brevity):
* Trying 82.25.84.122:443...
* Connected to bonyfish.net (82.25.84.122) port 443 (#0)
* ALPN: offers h2,http/1.1
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* CAfile: /etc/ssl/certs/ca-certificates.crt
* CApath: /etc/ssl/certs
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
* TLSv1.3 (IN), TLS handshake, Certificate (11):
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
* TLSv1.3 (IN), TLS handshake, Finished (20):
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.3 (OUT), TLS handshake, Finished (20):
* SSL connection using TLSv1.3 / TLS_AES_128_GCM_SHA256
* ALPN: server accepted h2
* Server certificate:
* subject: CN=bonyfish.net
* start date: Oct 16 22:48:31 2025 GMT
* expire date: Jan 14 22:48:30 2026 GMT
* subjectAltName: host "bonyfish.net" matched cert's "bonyfish.net"
* issuer: C=US; O=Let's Encrypt; CN=E7
* SSL certificate verify ok.
* using HTTP/2
* h2h3 [:method: GET]
* h2h3 [:path: /photography]
* h2h3 [:scheme: https]
* h2h3 [:authority: bonyfish.net]
* h2h3 [user-agent: curl/7.88.1]
* h2h3 [accept: */*]
* Using Stream ID: 1 (easy handle 0x5558ae1a3f20)
> GET /photography HTTP/2
> Host: bonyfish.net
> user-agent: curl/7.88.1
> accept: */*
>
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
< HTTP/2 308
< alt-svc: h3=":443"; ma=2592000
< content-type: text/html; charset=utf-8
< location: /photography/
< server: Caddy
< content-length: 49
< date: Sun, 09 Nov 2025 19:34:01 GMT
<
* Ignoring the response-body
* Connection #0 to host bonyfish.net left intact
* Issue another request to this URL: 'https://bonyfish.net/photography/'
* Found bundle for host: 0x5558ae1a30b0 [can multiplex]
* Re-using existing connection #0 with host bonyfish.net
* h2h3 [:method: GET]
* h2h3 [:path: /photography/]
* h2h3 [:scheme: https]
* h2h3 [:authority: bonyfish.net]
* h2h3 [user-agent: curl/7.88.1]
* h2h3 [accept: */*]
* Using Stream ID: 3 (easy handle 0x5558ae1a3f20)
> GET /photography/ HTTP/2
> Host: bonyfish.net
> user-agent: curl/7.88.1
> accept: */*
>
< HTTP/2 200
< accept-ranges: bytes
< alt-svc: h3=":443"; ma=2592000
< content-type: text/html; charset=utf-8
< etag: "d18r69k2jg1s9ad"
< last-modified: Mon, 13 May 2024 19:10:49 GMT
< server: Caddy
< vary: Accept-Encoding
< content-length: 12037
< date: Sun, 09 Nov 2025 19:34:01 GMT
journalctl has no entries from around the date when this problem began, but I do have some error logs directly from Caddy. Here are the entries from today:
{"level":"error","ts":1762715438.846299,"logger":"http.log.access.log0","msg":"handled request","request":{"remote_ip":"108.90.143.43","remote_port":"49735","client_ip":"108.90.143.43","proto":"HTTP/3.0","method":"GET","host":"bonyfish.net","uri":"/etc/grawlix-test/_admin/book.page-edit.php?created=1&page_id=17","headers":{"Sec-Fetch-Mode":["navigate"],"Cookie":["REDACTED"],"Sec-Fetch-User":["?1"],"Accept":["text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"],"Accept-Language":["en-US,en;q=0.5"],"Sec-Fetch-Dest":["document"],"Alt-Used":["bonyfish.net"],"Upgrade-Insecure-Requests":["1"],"Sec-Gpc":["1"],"Accept-Encoding":["gzip, deflate, br, zstd"],"Priority":["u=0, i"],"Referer":["https://bonyfish.net/etc/grawlix-test/_admin/book.page-create.php"],"Sec-Fetch-Site":["same-origin"],"Dnt":["1"],"User-Agent":["Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:144.0) Gecko/20100101 Firefox/144.0"]},"tls":{"resumed":true,"version":772,"cipher_suite":4865,"proto":"h3","server_name":"bonyfish.net"}},"bytes_read":0,"user_id":"","duration":0.007423502,"size":0,"status":500,"resp_headers":{"Pragma":["no-cache"],"Content-Type":["text/html; charset=utf-8"],"Status":["500 Internal Server Error"],"Expires":["Thu, 19 Nov 1981 08:52:00 GMT"],"Date":["Sun, 09 Nov 2025 19:10:38 GMT"],"Via":["0.0 Caddy"],"Cache-Control":["no-store, no-cache, must-revalidate"]}}
{"level":"error","ts":1762716173.6226206,"logger":"http.log.access.log0","msg":"handled request","request":{"remote_ip":"108.90.143.43","remote_port":"60443","client_ip":"108.90.143.43","proto":"HTTP/3.0","method":"GET","host":"bonyfish.net","uri":"/etc/grawlix-test/_admin/book.page-edit.php?created=1&page_id=18","headers":{"Cookie":["REDACTED"],"Dnt":["1"],"Sec-Gpc":["1"],"Priority":["u=0, i"],"User-Agent":["Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:144.0) Gecko/20100101 Firefox/144.0"],"Referer":["https://bonyfish.net/etc/grawlix-test/_admin/book.page-create.php"],"Sec-Fetch-Dest":["document"],"Upgrade-Insecure-Requests":["1"],"Sec-Fetch-Mode":["navigate"],"Sec-Fetch-User":["?1"],"Accept-Language":["en-US,en;q=0.5"],"Accept-Encoding":["gzip, deflate, br, zstd"],"Sec-Fetch-Site":["same-origin"],"Accept":["text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"],"Alt-Used":["bonyfish.net"]},"tls":{"resumed":true,"version":772,"cipher_suite":4865,"proto":"h3","server_name":"bonyfish.net"}},"bytes_read":0,"user_id":"","duration":0.004807171,"size":0,"status":500,"resp_headers":{"Status":["500 Internal Server Error"],"Expires":["Thu, 19 Nov 1981 08:52:00 GMT"],"Cache-Control":["no-store, no-cache, must-revalidate"],"Pragma":["no-cache"],"Content-Type":["text/html; charset=utf-8"],"Date":["Sun, 09 Nov 2025 19:22:53 GMT"],"Via":["0.0 Caddy"]}}
3. Caddy version:
v2.10.2 h1:g/gTYjGMD0dec+UgMw8SnfmJ3I9+M2TdvoRL/Ovu6U8=
4. How I installed and ran Caddy:
a. System environment:
Debian 12 Bookworm, Caddy installed through apt package manager
b. Command:
caddy start
c. Service/unit/compose file:
[Unit]
Description=Caddy
Documentation=https://caddyserver.com/docs/
After=network.target network-online.target
Requires=network-online.target
[Service]
Type=notify
User=caddy
Group=caddy
ExecStart=/usr/bin/caddy run --environ --config /etc/caddy/Caddyfile
ExecReload=/usr/bin/caddy reload --config /etc/caddy/Caddyfile --force
TimeoutStopSec=5s
LimitNOFILE=1048576
PrivateTmp=true
ProtectSystem=full
AmbientCapabilities=CAP_NET_ADMIN CAP_NET_BIND_SERVICE
[Install]
WantedBy=multi-user.target
d. My complete Caddy config:
{
debug
}
bonyfish.net {
root /var/www/html
file_server
php_fastcgi unix//run/php/php8.2-fpm.sock
log {
level ERROR
output file /var/lib/caddy/log.file {
roll_size 10mb
}
}
# GoToSocial split domain configuration
redir /.well-known/host-meta* https://feed.bonyfish.net{uri} permanent
redir /.well-known/webfinger* https://feed.bonyfish.net{uri} permanent
redir /.well-known/nodeinfo* https://feed.bonyfish.net{uri} permanent
# Expanded PHP form from Caddy documentation (with edits from php.watch/articles/caddy-php)
route {
# Add trailing slash for directory requests
@canonicalPath {
file {path}/index.php
not path */
}
redir @canonicalPath {http.request.orig_uri.path}/ 308
# If the requested file does not exist, try index files
@indexFiles file {
try_files {path} {path}/index.php
split_path .php
}
rewrite @indexFiles {file_match.relative}
# Proxy PHP files to the FastCGI responder
@phpFiles path *.php
reverse_proxy @phpFiles unix//run/php/php8.2-fpm.sock {
transport fastcgi {
split .php
}
}
}
}
photography.bonyfish.net {
root * /var/www/photography
file_server
}
# Using this subdomain as a testing ground for Grawlix
etc.bonyfish.net {
# root /var/www/html/etc
root * /var/www/html/etc/grawlix-test
file_server
php_fastcgi unix//run/php/php8.2-fpm.sock
@redirects {
not file *.css *.js *.php *.xml *.html *.jpg *.jpeg *.gif *.eot *.otf *.woff *.ttf *.txt *.tmp
}
rewrite @redirects /index.php
}
feed.bonyfish.net {
# Optional, but recommended, compress the traffic using proper protocols
encode zstd gzip
# The actual proxy configuration to port 8080 (unless you've chosen another port number)
reverse_proxy * http://127.0.0.1:8080 {
# Flush immediately, to prevent buffered response to the client
flush_interval -1
#@no_ua header !User-Agent
header_up User-Agent "GoToSocial"
}
}
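As an aside, my understanding from the docs is that `php_fastcgi` already expands to include a `try_files {path} {path}/index.php index.php` fallback, so a simpler variant of the etc block might do roughly what the .htaccess does on its own. I haven't verified this, and I don't know whether Grawlix also needs the original path passed as a query string the way the .htaccess rule does:

```
etc.bonyfish.net {
	root * /var/www/html/etc/grawlix-test
	# php_fastcgi's built-in try_files already falls back to
	# index.php for requests that don't match a real file
	php_fastcgi unix//run/php/php8.2-fpm.sock
	file_server
}
```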
5. Links to relevant resources:
- Grawlix CMS Github
- Grawlix test page on my server
- Grawlix test page under subdomain with broken assets
- Post I used to try to understand the .htaccess file
Thanks very much for looking! I’m really out of my depth here and have exhausted any ideas I might have, so I appreciate any help you can give me!