Turning away the crackers?

I’m seeing a lot of hits to our site that look like obvious cracking/exploit attempts, e.g.:

  GET /cms/license.txt
  GET /wp-login.php
  GET /w/license.txt
  GET /2/license.txt
  GET /test/web.config.txt
  GET /vendor/phpunit/phpunit/LICENSE
  GET /vendor/phpunit/phpunit/src/Util/PHP/Template/TestCaseMethod.tpl.dist
  GET /res/license.txt
  GET /backup/license.txt
  GET /back/license.txt
  GET /_wp/license.txt

Question: is it worth putting a handler in Caddyfile v2 to deal with the more common of these and turn them away? If so, should I return a 404? 403?

Right now, since my site is an SPA, these requests always get the index.html file, which is just a waste of bandwidth.

What would the handler look like?

This one’s 100% up to you, but it can be done without any special handler implementation.

@exploits {
  path /cms/license.txt
  path /wp-login.php
  path /w/license.txt
  path /2/license.txt
  path /test/web.config.txt
  path /vendor/phpunit/phpunit/LICENSE
  path /vendor/phpunit/phpunit/src/Util/PHP/Template/TestCaseMethod.tpl.dist
  path /res/license.txt
  path /backup/license.txt
  path /back/license.txt
  path /_wp/license.txt
}
respond @exploits 404
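
The same thing can also be collapsed onto one line if you prefer, since a named matcher consisting of a single matcher doesn't need a block, and the path matcher accepts multiple paths (matched as OR):

@exploits path /cms/license.txt /wp-login.php /w/license.txt /2/license.txt /test/web.config.txt /vendor/phpunit/phpunit/LICENSE /vendor/phpunit/phpunit/src/Util/PHP/Template/TestCaseMethod.tpl.dist /res/license.txt /backup/license.txt /back/license.txt /_wp/license.txt
respond @exploits 404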

A 404 makes it seem like the resource simply isn't there. A 403 is a specific, configured response that might flag you for additional attention, since it communicates that those resources exist but are secured.

I think in general this kind of thing isn't worth bothering with: if your own site has specific resources that must be secured, secure those in particular. Nothing will stop bots from scanning your site for exploitable files. Secure the areas that must be secured and let Caddy respond with its default 404s for resources that don't exist.
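
For example, securing a specific area directly (here a hypothetical /admin/* path, which is just my placeholder) can be done with basicauth instead of trying to enumerate scanner probes:

basicauth /admin/* {
  # username and hash are placeholders; generate the hash with `caddy hash-password`
  admin <paste output of `caddy hash-password` here>
}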

Bandwidth concerns do make sense as a reason to turn these requests away, especially if your SPA payload is large and you're serving the whole thing to a high volume of crawlers. The above example should work for that.
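
Putting it together with a typical SPA site block might look something like this (a sketch; the site address, root path, and the shortened path list are assumptions on my part). One wrinkle: try_files sorts before respond in the Caddyfile directive order, so the SPA rewrite should live in its own handle block, otherwise the scanner paths get rewritten to /index.html before the matcher ever sees them:

example.com {
  root * /srv/app
  encode gzip

  @exploits path /wp-login.php */license.txt
  handle @exploits {
    respond 404
  }

  # fallback handle: everything else gets the SPA
  handle {
    try_files {path} /index.html
    file_server
  }
}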


Thanks. I’m prolly just being pedantic and letting it bug me that they get a 200 response on these, even tho what they get back (index.html) isn’t of any use to them. It’s only some 500 bytes gzipped, so no big deal I suppose. That’s sorta the downside of an SPA: almost nothing gets a 404.

Thanks for the thoughts and the respond stanza. I might yet decide to use it.

No worries.

Yeah, to be fair: SPAs aren’t uncommon these days, and malicious actors whose crawlers flag lots of 200s for attention just ignore those flags, simply because SPAs are so common and return a 200 for every request. A 200 doesn’t actually indicate whether the SPA is exploitable or not.

I can’t say it should be a security concern, especially in terms of standing out from the crowd; it’s definitely not abnormal any more to respond this way.

But yeah, it’s your call. Probably won’t hurt, realistically, to go either way.


This topic was automatically closed after 180 days. New replies are no longer allowed.