Basic auth and no access

1. Caddy version (caddy version):

latest

2. How I run Caddy:

docker rootless

a. System environment:

Ubuntu 20.04

b. Command:

paste command here

c. Service/unit/compose file:

paste full file contents here

d. My complete Caddyfile or JSON config:

    # GLOBAL
    {
        # Global options block. Entirely optional; HTTPS is on by default.
        # Optional email for Let's Encrypt.
        email mail@example.com
        # Optional Let's Encrypt staging CA for testing.
        acme_ca https://acme-staging-v02.api.letsencrypt.org/directory

        servers {
            timeouts {
                read_body   10s
                read_header 10s
                write       10s
                idle        2m
            }
            max_header_size 16384
        }
    }

    # SNIPPETS

    (mustheaders) {
            header {
                    Strict-Transport-Security "max-age=31536000; includesubdomains; preload"
                    Content-Security-Policy "default-src https: 'unsafe-inline' 'unsafe-eval'"
                    X-Content-Type-Options "nosniff"
                    X-Frame-Options "SAMEORIGIN"
                    Referrer-Policy "strict-origin-when-cross-origin"
                    X-Xss-Protection "1; mode=block"
                    Feature-Policy "accelerometer 'none'; ambient-light-sensor 'none'; autoplay 'none'; camera 'none'; encrypted-media 'none'; fullscreen 'self'; geolocation 'none'; gyroscope 'none'; magnetometer 'none'; microphone 'none'; midi 'none'; payment 'none'; picture-in-picture *; speaker 'none'; sync-xhr 'none'; usb 'none'; vr 'none'"
                    Expect-CT "max-age=604800"
                    -Server
            }
    }
    (offlinewebsite) {
            header {
                    X-Robots-Tag "noindex, nofollow, noarchive, nosnippet, notranslate, noimageindex"
            }
            basicauth * {
                    admin hashed-password
            }
    }
    (onlinewebsite) {
            header {
                    X-Robots-Tag "noarchive, notranslate"
            }
    }

    (compression) {
            encode zstd gzip
    }

    (caching) {
            header {
                    Cache-Control "public, max-age=604800, must-revalidate"
            }
    }

    (security) {
     
            # Unusual URL rewrite
            try_files {path} {path}/ /index.*

            # deny all access to these folders
            @denied_folders path_regexp /(\.github|cache|bin|logs|backup.*|test.*|content|core|image.*|js|css|php|config|lib|assets|rel|priv|tracker)/.*$
            respond @denied_folders "Access denied" 403
           
            @no_access {
                      not path /content/*
                      not path /core/*
                      not path /assets/*
                      not path /images/*
                      not path /portfolio/*
                      not path /js/*
                      not path /css/*
                      not path /files/*
              }

            # deny running scripts inside core system folders
            @denied_system_scripts path_regexp /(core|content|test|system|vendor)/.*\.(txt|xml|md|html|yaml|php|pl|py|cgi|twig|sh|bat|yml|js)$
            respond @denied_system_scripts "Access denied" 403

            # deny running scripts inside user folder
            @denied_user_folder path_regexp /user/.*\.(txt|md|yaml|php|pl|py|cgi|twig|sh|bat|yml|js)$
            respond @denied_user_folder "Access denied" 403

            # deny access to specific files in the root folder
            @denied_root_folder path_regexp /(index.php.*|wp-admin.php|wp-login.php|wp-config.php.*|xmlrpc.php|config.production.json|config.development.json|index.js|package.json|renovate.json|.*lock|mix.*|ghost.js|startup.js|\.editorconfig|\.eslintignore|\.eslintrc.json|\.gitattributes|\.gitignore|\.gitmodules|\.npmignore|Gruntfile.js|LICENSE|MigratorConfig.js|LICENSE.txt|composer.lock|composer.json|nginx.conf|web.config|htaccess.txt|\.htaccess)
            respond @denied_root_folder "Access denied" 403

            # block bad crawlers
            @badbots header User-Agent "aesop_com_spiderman, alexibot, backweb, batchftp, bigfoot, blackwidow, blowfish, botalot, buddy, builtbottough, bullseye, cheesebot, chinaclaw, cosmos, crescent, curl, custo, da, diibot, disco, dittospyder, dragonfly, drip, easydl, ebingbong, erocrawler, exabot, eyenetie, filehound, flashget, flunky, frontpage, getright, getweb, go-ahead-got-it, gotit, grabnet, grafula, harvest, hloader, hmview, httplib, humanlinks, ilsebot, infonavirobot, infotekies, intelliseek, interget, iria, jennybot, jetcar, joc, justview, jyxobot, kenjin, keyword, larbin, leechftp, lexibot, lftp, libweb, likse, linkscan, linkwalker, lnspiderguy, lwp, magnet, mag-net, markwatch, memo, miixpc, mirror, missigua, moget, nameprotect, navroad, backdoorbot, nearsite, netants, netcraft, netmechanic, netspider, nextgensearchbot, attach, nicerspro, nimblecrawler, npbot, openfind, outfoxbot, pagegrabber, papa, pavuk, pcbrowser, pockey, propowerbot, prowebwalker, psbot, pump, queryn, recorder, realdownload, reaper, reget, true_robot, repomonkey, rma, internetseer, sitesnagger, siphon, slysearch, smartdownload, snake, snapbot, snoopy, sogou, spacebison, spankbot, spanner, sqworm, superbot, superhttp, surfbot, asterias, suzuran, szukacz, takeout, teleport, telesoft, thenomad, tighttwatbot, titan, urldispatcher, turingos, turnitinbot, *vacuum*, vci, voideye, libwww-perl, widow, wisenutbot, wwwoffle, xaldon, xenu, zeus, zyborg, anonymouse, *zip*, *mail*, *enhanc*, *fetch*, *auto*, *bandit*, *clip*, *copier*, *master*, *reaper*, *sauger*, *quester*, *whack*, *picker*, *catch*, *vampire*, *hari*, *offline*, *track*, *craftbot*, *download*, *extract*, *stripper*, *sucker*, *ninja*, *clshttp*, *webspider*, *leacher*, *collector*, *grabber*, *webpictures*, *seo*, *hole*, *copyright*, *check*"
            respond @badbots "Access denied" 403
    }

    (proxy) {
            header_up X-Forwarded-Proto {scheme}
            header_up X-Forwarded-For {remote}
            header_up X-Real-IP {remote}
            header_down X-Powered-By "the Holy Spirit"
            header_down Server "CERN httpd"
    }

    (logs) {
            log {
                output file /var/log/caddy/caddy.log
                format single_field common_log
            }
    }

    # STRIP WWW PREFIX

    www.example.com {
            redir * https://{http.request.host.labels.1}.{http.request.host.labels.0}{path} permanent
    }

    # WEBSITES

    example.com {
            import mustheaders
            import offlinewebsite
            import security
            import caching
            reverse_proxy internal_IP:2351 {
                    import proxy
            }

            import logs
    }

3. The problem I’m having:

As you can see, I inserted basic auth into the “offline” snippet and used it for the whole website. Is that correct?
This is Docker, so how do I run “caddy hash-password”?
During development the website is marked as “offline” with the offline snippet.

Also, there is a “no access” setting in the security snippet; is it pasted correctly?
“No access” should block browsing of some folders.

4. Error messages and/or full log output:

5. What I already tried:

Please fill out the help topic template. It’s a forum rule. It even says right at the top of it:

DO NOT DELETE THIS TEMPLATE. THAT WILL MAKE US SAD.

YOU MUST USE THIS TEMPLATE TO GET HELP.

docker exec <caddy-container-id> caddy hash-password

done …

docker exec caddy hash-password

I got this as a result:
hash-password: EOF
So no hashed password was displayed.
Got it by adding -it to the command; now I got the hashed password.
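
For reference, the interactive form of the command suggested above is:

docker exec -it <caddy-container-id> caddy hash-password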

What about my other question:
there is a “no access” setting in the security snippet; is it pasted correctly?
“No access” should block browsing of some folders.

What do you mean by “is it pasted correctly”? The matcher isn’t used anywhere, so it is defined but never applied.

I know there is something wrong. This code below is inserted in the “security” snippet, but it doesn’t do anything.

        @no_access {
                  not path /content/*
                  not path /core/*
                  not path /assets/*
                  not path /images/*
                  not path /portfolio/*
                  not path /js/*
                  not path /css/*
                  not path /files/*
          }

I want to block browsing of these folders on my example.com website. I used to use “deny 403”, but it breaks the CMS. I just want to block browsing if someone pastes a folder’s URL into the browser.

First, as said earlier, you’ve defined a named matcher, but it isn’t being used anywhere, so of course it isn’t doing anything on its own. In other words, it is as if you’ve defined an “if” condition but no action to take when that condition is met.
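
To illustrate (a generic sketch with a hypothetical @example matcher, not something meant for your site), a named matcher only has an effect once a directive references it:

    @example path /secret/*
    respond @example "Access denied" 403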

Second, there are a few issues with this part:

You are telling Caddy to try {path}/, which is effectively the directory path. Directories are not browseable by default, so Caddy will serve any index.html or index.txt instead, because those are the default index files for the file_server directive. Also, /index.* will not work unless you literally have a file with that name, because the file matcher (which backs try_files) does not expand globs.
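
For example (a sketch only, assuming your index file is literally named index.html rather than relying on a glob), the file matcher can check an explicit name like this:

    try_files {path} {path}/ /index.html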

Finally, note that the named matcher @no_access matches more than just the directories; it matches the contents of the directories as well. So if you use @no_access as-is to respond with “Access denied”, you’ll be denying access to any file within those directories too, which is probably why your CMS of choice fails (if you used this particular matcher).
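
If the goal is only to deny requests for the folder URLs themselves, something along these lines would match the bare directory paths but not the files inside them (a sketch only, untested against your CMS; the folder names come from your config and @folder_browse is just an illustrative name):

    @folder_browse path_regexp ^/(content|core|assets|images|portfolio|js|css|files)/?$
    respond @folder_browse "Access denied" 403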

So basically, can I remove “try_files {path} {path}/ /index.*” because Caddy blocks browsing folders by default, and should I add an index.html file manually to each folder?

Not knowing exactly what you’re using, what your CMS needs, and what your needs are, I can’t say one way or another. I can only explain how Caddy behaves and you should decide accordingly.

This topic was automatically closed after 30 days. New replies are no longer allowed.