Allow access to website only from one IP address

(offlinewebsite) {
        @public_networks not remote_ip <MY_Remote_Client_IP>
        respond @public_networks 403
}

<site_address> {
        import offlinewebsite
        reverse_proxy local_IP
}

I want only my own IP to have access to the website. The matcher should block all clients except my IP, but with this config I am blocked with a 403 myself. What's wrong?

There could be all kinds of reasons. Do you have Cloudflare or another kind of proxy in front of Caddy? If so, the remote IP Caddy sees will not be your real client IP, and you may need to use the forwarded option of the remote_ip matcher.
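For reference, a minimal sketch of what that might look like — the `<MY_IP>` placeholder is illustrative, and this assumes a trusted proxy in front of Caddy is setting `X-Forwarded-For`:

```caddyfile
# Sketch only: <MY_IP> stands in for your real client IP.
# "forwarded" tells remote_ip to check the first address in the
# X-Forwarded-For header instead of the TCP connection's source
# address — only safe when a trusted proxy sits in front of Caddy.
@public_networks not remote_ip forwarded <MY_IP>
respond @public_networks "Access denied" 403
```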

Again, please fill out the help thread template, and please avoid redacting information in your config. It only makes it harder for us to help, and for you to receive help.


No, I do not use a VPN, Cloudflare, or any other proxy. My IP is the same as in the config.

That still doesn’t answer much. Can you please fill out the template and provide the logs?


My Caddyfile:

# Global options block. Entirely optional, https is on by default
# Optional email key for lets encrypt
# Optional staging lets encrypt for testing.
{
        servers {
                timeouts {
                        read_body   10s
                        read_header 10s
                        write       10s
                        idle        2m
                }
                max_header_size 16384
        }
}

(mustheaders) {
        header {
                Strict-Transport-Security "max-age=31536000; includesubdomains; preload"
                Content-Security-Policy "default-src https: 'unsafe-inline' 'unsafe-eval'"
                X-Content-Type-Options "nosniff"
                X-Frame-Options "SAMEORIGIN"
                Referrer-Policy "strict-origin-when-cross-origin"
                X-Xss-Protection "1; mode=block"
                Feature-Policy "accelerometer 'none'; ambient-light-sensor 'none'; autoplay 'none'; camera 'none'; encrypted-media 'none'; fullscreen 'self'; geolocation 'none'; gyroscope 'none'; magnetometer 'none'; microphone 'none'; midi 'none'; payment 'none'; picture-in-picture *; speaker 'none'; sync-xhr 'none'; usb 'none'; vr 'none'"
                Expect-CT "max-age=604800"
        }
}

(offlinewebsite) {
        header {
                X-Robots-Tag "noindex, nofollow, noarchive, nosnippet, notranslate, noimageindex"
        }
        @public_networks not remote_ip <MY_IP>
        respond @public_networks "Access denied" 403
}

(onlinewebsite) {
        header {
                X-Robots-Tag "noarchive, notranslate"
        }
}

(compression) {
        encode zstd gzip
}

(caching) {
        header {
                Cache-Control "public, max-age=604800, must-revalidate"
        }
}

(security) {
        # Unusual URL rewrite
        try_files {path} {path}/ /index.*

        # deny all access to these folders
        @denied_folders path_regexp /(\.github|cache|bin|logs|backup.*|test.*|content|core|image.*|js|css|php|config|lib|assets|rel|priv|tracker)/.*$
        respond @denied_folders "Access denied" 403

        # deny running scripts inside core system folders
        @denied_system_scripts path_regexp /(core|content|test|system|vendor)/.*\.(txt|xml|md|html|yaml|php|pl|py|cgi|twig|sh|bat|yml|js)$
        respond @denied_system_scripts "Access denied" 403

        # deny running scripts inside user folder
        @denied_user_folder path_regexp /user/.*\.(txt|md|yaml|php|pl|py|cgi|twig|sh|bat|yml|js)$
        respond @denied_user_folder "Access denied" 403

        # deny access to specific files in the root folder
        @denied_root_folder path_regexp /(index.php.*|wp-admin.php|wp-login.php|wp-config.php.*|xmlrpc.php|config.production.json|config.development.json|index.js|package.json|renovate.json|.*lock|mix.*|ghost.js|startup.js|\.editorconfig|\.eslintignore|\.eslintrc.json|\.gitattributes|\.gitignore|\.gitmodules|\.npmignore|Gruntfile.js|LICENSE|MigratorConfig.js|LICENSE.txt|composer.lock|composer.json|nginx.conf|web.config|htaccess.txt|\.htaccess)
        respond @denied_root_folder "Access denied" 403

        # block bad crawlers
        @badbots header User-Agent "aesop_com_spiderman, alexibot, backweb, batchftp, bigfoot, blackwidow, blowfish, botalot, buddy, builtbottough, bullseye, cheesebot, chinaclaw, cosmos, crescent, curl, custo, da, diibot, disco, dittospyder, dragonfly, drip, easydl, ebingbong, erocrawler, exabot, eyenetie, filehound, flashget, flunky, frontpage, getright, getweb, go-ahead-got-it, gotit, grabnet, grafula, harvest, hloader, hmview, httplib, humanlinks, ilsebot, infonavirobot, infotekies, intelliseek, interget, iria, jennybot, jetcar, joc, justview, jyxobot, kenjin, keyword, larbin, leechftp, lexibot, lftp, libweb, likse, linkscan, linkwalker, lnspiderguy, lwp, magnet, mag-net, markwatch, memo, miixpc, mirror, missigua, moget, nameprotect, navroad, backdoorbot, nearsite, netants, netcraft, netmechanic, netspider, nextgensearchbot, attach, nicerspro, nimblecrawler, npbot, openfind, outfoxbot, pagegrabber, papa, pavuk, pcbrowser, pockey, propowerbot, prowebwalker, psbot, pump, queryn, recorder, realdownload, reaper, reget, true_robot, repomonkey, rma, internetseer, sitesnagger, siphon, slysearch, smartdownload, snake, snapbot, snoopy, sogou, spacebison, spankbot, spanner, sqworm, superbot, superhttp, surfbot, asterias, suzuran, szukacz, takeout, teleport, telesoft, thenomad, tighttwatbot, titan, urldispatcher, turingos, turnitinbot, *vacuum*, vci, voideye, libwww-perl, widow, wisenutbot, wwwoffle, xaldon, xenu, zeus, zyborg, anonymouse, *zip*, *mail*, *enhanc*, *fetch*, *auto*, *bandit*, *clip*, *copier*, *master*, *reaper*, *sauger*, *quester*, *whack*, *picker*, *catch*, *vampire*, *hari*, *offline*, *track*, *craftbot*, *download*, *extract*, *stripper*, *sucker*, *ninja*, *clshttp*, *webspider*, *leacher*, *collector*, *grabber*, *webpictures*, *seo*, *hole*, *copyright*, *check*"
        respond @badbots "Access denied" 403
}

(proxy) {
        header_up X-Forwarded-Proto {scheme}
        header_up X-Forwarded-For {remote}
        header_up X-Real-IP {remote}
        header_down X-Powered-By "the Holy Spirit"
        header_down Server "CERN httpd"
}

(logs) {
        log {
                output file /var/log/caddy/caddy.log
                format single_field common_log
        }
}

<redirect_site_address> {
        redir * https://{}.{}{path} permanent
}

<site_address> {
        import mustheaders
        import offlinewebsite
        import security
        import caching
        reverse_proxy internal_IP:2351 {
                import proxy
        }
        import logs
}

I’ve deployed this exact config and it’s working as intended: the IP address I configured is permitted, while any other IP address is blocked. What IP address are you using, and how is Caddy deployed?

My ISP IP is static. Let’s say my IP is <MY_IP>. The matcher is:

@public_networks not remote_ip <MY_IP>
respond @public_networks "Access denied" 403

It blocks me.

Turn on access logs (the log directive) and the debug global option. What do you see in your logs for your remote IP on the requests you expect not to be blocked?
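To illustrate, a minimal sketch of enabling both — the site address and log path here are placeholders, not from the original config:

```caddyfile
{
        # global option: verbose debug-level logging from Caddy itself
        debug
}

example.com {
        # per-site access log; every handled request is recorded
        # along with the remote_addr Caddy saw for it
        log {
                output file /var/log/caddy/access.log
        }
}
```

The `remote_addr` field in those access log entries is the first thing to check when a remote_ip matcher misbehaves.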

My output file /var/log/caddy/caddy.log is empty.

Yes — only my IP should be allowed to see the website.

You didn’t fill out the thread template, so I don’t know what environment you’re running in, but if you’re running Caddy as a systemd service, run journalctl -u caddy --no-pager | less to see your logs.

Please follow the instructions we give; it only wastes our time and yours if you don’t provide the information we ask for.

Here is the log:

{"level":"error","ts":1615071422.0426748,"logger":"http.log.access.log0","msg":"handled request","request":{"remote_addr":"","proto":"HTTP/2.0","method":"GET","host":"","uri":"/","headers":{"User-Agent":["Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.72 Safari/537.36"],"Accept":["text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9"],"Sec-Fetch-Mode":["navigate"],"Sec-Fetch-Dest":["document"],"Cache-Control":["max-age=0"],"Sec-Gpc":["1"],"Sec-Fetch-Site":["none"],"Sec-Fetch-User":["?1"],"Accept-Encoding":["gzip, deflate, br"],"Accept-Language":["pl"],"Upgrade-Insecure-Requests":["1"]},"tls":{"resumed":true,"version":772,"cipher_suite":4865,"proto":"h2","proto_mutual":true,"server_name":""}},"common_log":" - - [06/Mar/2021:22:57:02 +0000] \"GET / HTTP/2.0\" 403 13","duration":0.00012848,"size":13,"status":403,"resp_headers":{"Referrer-Policy":["strict-origin-when-cross-origin"],"X-Content-Type-Options":["nosniff"],"X-Xss-Protection":["1; mode=block"],"X-Robots-Tag":["noindex, nofollow, noarchive, nosnippet, notranslate, noimageindex"],"Cache-Control":["public, max-age=604800, must-revalidate"],"Content-Type":[],"Strict-Transport-Security":["max-age=31536000; includesubdomains; preload"],"X-Frame-Options":["SAMEORIGIN"],"Content-Security-Policy":["default-src https: 'unsafe-inline' 'unsafe-eval'"],"Expect-Ct":["max-age=604800"],"Feature-Policy":["accelerometer 'none'; ambient-light-sensor 'none'; autoplay 'none'; camera 'none'; encrypted-media 'none'; fullscreen 'self'; geolocation 'none'; gyroscope 'none'; magnetometer 'none'; microphone 'none'; midi 'none'; payment 'none'; picture-in-picture *; speaker 'none'; sync-xhr 'none'; usb 'none'; vr 'none'"]}}

As you can see in that log, Caddy is seeing the remote IP as

It is a Docker-internal IP, not the remote client (user) IP.

This is why we’ve been asking for the template: it tells us how you’re deploying Caddy and lets us hypothesize about the potential issues. In this case, this is a Docker networking issue; see the GitHub issue for it. The solution is to use host network mode rather than bridge, which means Docker will bind those ports directly on the host machine.
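As a sketch, a Docker Compose service using host networking might look like this — the service name, image tag, and volume paths are illustrative, not taken from the thread:

```yaml
services:
  caddy:
    image: caddy:2
    # host mode: no bridge NAT, so Caddy sees the real client IP.
    # Note: "ports:" mappings are ignored when network_mode is host.
    network_mode: host
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
      - caddy_data:/data

volumes:
  caddy_data:
```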

