1. The problem I’m having:
From time to time, my site (a bibliography database with 15,000+ entries) gets overwhelmed by bot requests and stops responding, even though I block all bots through robots.txt.
I found a recipe for Apache (on the French site Docs Evolix, page "Gestion des bots" [bot management]):
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} "FooBot" [NC]
RewriteRule ^ - [F,L]
but I can’t ‘translate’ this into a Caddyfile entry.
Apologies if this sounds trivial …
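From the request-matcher docs I pieced together the following attempt, but I'm not sure it's correct. `FooBot` is just the placeholder bot name from the Apache recipe; the `@badbots` matcher name is my own choice. My understanding is that `(?i)` in the regexp would replace Apache's [NC] (case-insensitive) flag, and `respond … 403` would replace [F]:

```
www.bobc.uni-bonn.de {
	# Match any request whose User-Agent contains "FooBot" (case-insensitive)
	@badbots header_regexp User-Agent (?i)foobot
	# Refuse those requests with 403 Forbidden, like Apache's [F] flag
	respond @badbots 403

	# ... rest of the site block as below ...
}
```

Is that roughly the right shape, or is there a more idiomatic way to do this in a Caddyfile?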
2. Error messages and/or full log output:
—
3. Caddy version:
v2.11.1 h1:C7sQpsFOC5CH+31KqJc7EoOf8mXrOEkFyYd6GpIqm/s=
4. How I installed and ran Caddy:
Installation through Debian package manager
a. System environment:
Debian GNU/Linux 13.3
b. Command:
c. Service/unit/compose file:
d. My complete Caddy config:
(common) {
	header {
		Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
		X-Xss-Protection "1; mode=block"
		X-Content-Type-Options "nosniff"
		X-Frame-Options "DENY"
		Content-Security-Policy "upgrade-insecure-requests"
		Referrer-Policy "strict-origin-when-cross-origin"
		Cache-Control "public, max-age=15, must-revalidate"
		Feature-Policy "accelerometer 'none'; ambient-light-sensor 'none'; autoplay 'self'; camera 'none'; encrypted-media 'none'; fullscreen 'self'; geolocation 'none'; gyroscope 'none'; magnetometer 'none'; microphone 'none'; midi 'none'; payment 'none'; picture-in-picture *; speaker 'none'; sync-xhr 'none'; usb 'none'; vr 'none'"
	}
	encode gzip
}

www.bobc.uni-bonn.de {
	root * /var/www/wikindx
	root /adminer* /usr/share/adminer
	file_server
	import common
	log {
		output file /var/log/caddy/access.log
	}
	php_fastcgi unix//run/php/php8.4-fpm.sock
}

www.bobc.uni-bonn.de:8088 {
	reverse_proxy localhost:3000
}
5. Links to relevant resources:
—