Multiple trailing slashes in path redirect to one slash

1. The problem I’m having:

Hi All, I want to create a configuration in my site block (sorry if I'm not following naming conventions properly) where, when a path contains two or more trailing slashes, Caddy redirects with a 301 to the same path with the extra slashes removed, while keeping the path and query params.

So for example

https://example.com/path///?q=1 (301) => https://example.com/path/?q=1

I tried a few different things to no avail; the config part that tries to regex-match and redirect was given to me by GPT in my final desperation.

My use case is SEO friendliness and general usability, since I would otherwise have to handle this at the application level, which is more complicated and ultimately slower. I believe this should ideally be handled at the edge.

Plus, if you have any input on the config, I would highly appreciate it!

2. Error messages and/or full log output:

{
  "level": "info",
  "ts": 1738922212.2602706,
  "logger": "http.log.access.log1",
  "msg": "handled request",
  "request": {
    "remote_ip": "xxx.xxx.xxx.xxx",
    "remote_port": "63478",
    "client_ip": "xxx.xxx.xxx.xxx",
    "proto": "HTTP/2.0",
    "method": "GET",
    "host": "example.com",
    "uri": "///",
    "headers": {
      "Cache-Control": [
        "max-age=0"
      ],
      "Sec-Ch-Ua-Mobile": [
        "?0"
      ],
      "Accept": [
        "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7"
      ],
      "Sec-Fetch-Site": [
        "none"
      ],
      "Sec-Ch-Ua": [
        "\"Not A(Brand\";v=\"8\", \"Chromium\";v=\"132\", \"Google Chrome\";v=\"132\""
      ],
      "Authorization": [
        "REDACTED"
      ],
      "Accept-Encoding": [
        "gzip, deflate, br, zstd"
      ],
      "Sec-Fetch-Dest": [
        "document"
      ],
      "Priority": [
        "u=0, i"
      ],
      "Sec-Ch-Ua-Platform": [
        "\"Windows\""
      ],
      "Dnt": [
        "1"
      ],
      "Accept-Language": [
        "hu-HU,hu;q=0.9,en-US;q=0.8,en;q=0.7,de;q=0.6"
      ],
      "User-Agent": [
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36"
      ],
      "Sec-Fetch-Mode": [
        "navigate"
      ],
      "Upgrade-Insecure-Requests": [
        "1"
      ],
      "Sec-Fetch-User": [
        "?1"
      ]
    },
    "tls": {
      "resumed": false,
      "version": 772,
      "cipher_suite": 4865,
      "proto": "h2",
      "server_name": "example.com"
    }
  },
  "bytes_read": 0,
  "user_id": "xxxx",
  "duration": 0.359900168,
  "size": 40311,
  "status": 404,
  "resp_headers": {
    "Content-Security-Policy": [
      "base-uri 'none'; default-src 'none'; connect-src 'self' https: https://api.example.com; font-src 'self' data: https://fonts.gstatic.com https://fonts.googleapis.com https://ka-p.fontawesome.com/releases/; form-action 'self'; frame-ancestors 'self'; frame-src 'self' https://www.googletagmanager.com https://*.cookiebot.com https://*.doubleclick.net https://*.google.hu https://*.google.at https://*.google.de https://*.typeform.com/; img-src 'self' data: https://cdn.xxx.at https://analytics.xxx.at https://ws.hotjar.com https://maps.gstatic.com https://maps.googleapis.com https://storage.googleapis.com/glxxxxorit_dev_assets/ https://www.googletagmanager.com https://*.linkedin.com https://*.google.hu https://*.google.com https://*.google.at https://*.google.de https://*.doubleclick.net https://*.facebook.com https://*.cookiebot.com https://*.bing.com https://api.example.com; manifest-src 'self'; media-src 'self' https://cdn.xxx.at https://analytics.xxx.at https://ws.hotjar.com https://api.example.com https://*.cookiebot.com https://*.doubleclick.net; object-src 'none'; script-src-attr 'none'; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com https://embed.typeform.com; script-src 'self' https://www.googletagmanager.com https://analytics.xxxx.at https://ws.hotjar.com https://maps.googleapis.com 'strict-dynamic' 'nonce-Sh4p5l2t82E1r5wZiEEwzg==' https://api.example.com https://*.cookiebot.com https://*.doubleclick.net https://*.google.at https://*.google.de; upgrade-insecure-requests; worker-src 'self' blob: https://api.example.com;"
    ],
    "Cross-Origin-Opener-Policy": [
      "same-origin"
    ],
    "Permissions-Policy": [
      "accelerometer=(), autoplay=(self \"https://www.youtube.com\" \"https://player.vimeo.com\"), camera=(), display-capture=(), encrypted-media=(), fullscreen=(self \"https://www.youtube.com\" \"https://player.vimeo.com\" \"https://cdn.xxxx.at\"), geolocation=(), gyroscope=(), magnetometer=(), microphone=(), midi=(), payment=(), picture-in-picture=(), publickey-credentials-get=(), screen-wake-lock=(), sync-xhr=(self), usb=(), web-share=(), xr-spatial-tracking=()"
    ],
    "Alt-Svc": [
      "h3=\":443\"; ma=2592000"
    ],
    "Cross-Origin-Embedder-Policy": [
      "'unsafe-none'"
    ],
    "X-Xss-Protection": [
      "0"
    ],
    "Content-Type": [
      "text/html;charset=utf-8"
    ],
    "X-Download-Options": [
      "noopen"
    ],
    "Cross-Origin-Resource-Policy": [
      "same-origin"
    ],
    "Cache-Control": [
      "public, max-age=604800, must-revalidate"
    ],
    "X-Robots-Tag": [
      "noarchive, notranslate"
    ],
    "Referrer-Policy": [
      "strict-origin-when-cross-origin"
    ],
    "X-Frame-Options": [
      "DENY"
    ],
    "X-Dns-Prefetch-Control": [
      "off"
    ],
    "Strict-Transport-Security": [
      "max-age=31536000; includesubdomains; preload"
    ],
    "X-Permitted-Cross-Domain-Policies": [
      "none"
    ],
    "X-Content-Type-Options": [
      "nosniff"
    ],
    "Vary": [
      "Accept-Encoding"
    ],
    "Date": [
      "Fri, 07 Feb 2025 09:56:52 GMT"
    ],
    "Origin-Agent-Cluster": [
      "?1"
    ],
    "Content-Encoding": [
      "zstd"
    ],
    "Access-Control-Allow-Origin": [
      "*"
    ]
  }
}

3. Caddy version:

v2.9.1 h1:OEYiZ7DbCzAWVb6TNEkjRcSCRGHVoZsJinoDR/n9oaY=

4. How I installed and ran Caddy:

a. System environment:

Linux ubuntu 5.15.0-131-generic
with systemd

b. Command:

caddy reload

c. Service/unit/compose file:

N/A

d. My complete Caddy config:

(global_robots) {
	route {
		file_server /robots.txt {
			root /etc/caddy
			index robots.txt
		}
	}
}

(frontend_logging) {
	log {
		output file /var/log/caddy/frontend-access.log {
			roll_size 10mb
			roll_keep 20
			roll_keep_for 720h
		}
	}
}

(backend_logging) {
	log {
		output file /var/log/caddy/backend-access.log {
			roll_size 10mb
			roll_keep 20
			roll_keep_for 720h
		}
	}
}

(mustheaders) {
	header {
		Strict-Transport-Security "max-age=31536000; includesubdomains; preload"
		X-Content-Type-Options "nosniff"
		Referrer-Policy "strict-origin-when-cross-origin"
		-Server
		-X-Powered-By
	}
}

(onlinewebsites) {
	header {
		X-Robots-Tag "noarchive, notranslate"
	}
}

(compression) {
	encode zstd gzip
}

(caching) {
	header {
		Cache-Control "public, max-age=604800, must-revalidate"
	}
}

(security) {
	# block bad crawlers
	@badbots header User-Agent "aesop_com_spiderman, alexibot, backweb, batchftp, bigfoot, blackwidow, blowfish, botalot, buddy, builtbottough, bullseye, cheesebot, chinaclaw, cosmos, crescent, curl, custo, da, diibot, disco, dittospyder, dragonfly, drip, easydl, ebingbong, erocrawler, exabot, eyenetie, filehound, flashget, flunky, frontpage, getright, getweb, go-ahead-got-it, gotit, grabnet, grafula, harvest, hloader, hmview, httplib, humanlinks, ilsebot, infonavirobot, infotekies, intelliseek, interget, iria, jennybot, jetcar, joc, justview, jyxobot, kenjin, keyword, larbin, leechftp, lexibot, lftp, libweb, likse, linkscan, linkwalker, lnspiderguy, lwp, magnet, mag-net, markwatch, memo, miixpc, mirror, missigua, moget, nameprotect, navroad, backdoorbot, nearsite, netants, netcraft, netmechanic, netspider, nextgensearchbot, attach, nicerspro, nimblecrawler, npbot, openfind, outfoxbot, pagegrabber, papa, pavuk, pcbrowser, pockey, propowerbot, prowebwalker, psbot, pump, queryn, recorder, realdownload, reaper, reget, true_robot, repomonkey, rma, internetseer, sitesnagger, siphon, slysearch, smartdownload, snake, snapbot, snoopy, sogou, spacebison, spankbot, spanner, sqworm, superbot, superhttp, surfbot, asterias, suzuran, szukacz, takeout, teleport, telesoft, thenomad, tighttwatbot, titan, urldispatcher, turingos, turnitinbot, *vacuum*, vci, voideye, libwww-perl, widow, wisenutbot, wwwoffle, xaldon, xenu, zeus, zyborg, anonymouse, *zip*, *mail*, *enhanc*, *fetch*, *auto*, *bandit*, *clip*, *copier*, *master*, *reaper*, *sauger*, *quester*, *whack*, *picker*, *catch*, *vampire*, *hari*, *offline*, *track*, *craftbot*, *download*, *extract*, *stripper*, *sucker*, *ninja*, *clshttp*, *webspider*, *leacher*, *collector*, *grabber*, *webpictures*, *seo*, *hole*, *copyright*, *check*"
	respond @badbots "Access denied" 403
}

www.example.com {
	import frontend_logging
	import global_robots
	redir https://example.com{uri} 301
}

example.com {
	basicauth /* {
		# It is deprecated, I know, but it is not critical for me now
		xxx $yyyyyy
	}

	# Rewrite or redirect logic
	@multipleTrailingSlashes path_regexp ^(.*)//+$
	redir @multipleTrailingSlashes {http.request.uri.path}/ 301

	import frontend_logging
	import mustheaders
	import global_robots
	import onlinewebsites
	import security
	import caching
	import compression
	import redirects # static redirects listed inside, like `redir /old/path /new/path temporary`
	reverse_proxy 127.0.0.1:3000
}
www.api.example.com {
	import backend_logging
	import global_robots
	redir https://api.example.com{uri} 301
}
api.example.com {
	import backend_logging
	import mustheaders
	import global_robots
	import compression
	reverse_proxy 127.0.0.1:1337
}



5. Links to relevant resources:

I already checked this solution; am I not doing the same thing, only with redir?

rewrite changes the URI internally before the request reaches your backend, while redir sends an HTTP redirect response back to the client. There's a doc on trailing slashes, which says:

You will not usually need to configure this yourself; the [file_server directive](https://caddyserver.com/docs/caddyfile/directives/file_server) will automatically add or remove trailing slashes from requests by way of HTTP redirects, depending on whether the requested resource is a directory or file, respectively.

As for how exactly it should reflect in a Caddyfile, I don’t know.
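If an internal rewrite (rather than a client-visible 301) were acceptable, one option might be the `uri` directive's `path_regexp` form, which does a regex replacement on the path before the request reaches the reverse proxy. A minimal, untested sketch against your setup:

```
example.com {
	# Collapse a run of two or more trailing slashes to a single slash,
	# internally, before the request is proxied (no redirect is sent).
	uri path_regexp //+$ /

	reverse_proxy 127.0.0.1:3000
}
```

Note this wouldn't give you the 301 your SEO use case asks for; it only normalizes the path Caddy passes upstream.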


That auto-truncation is what's not happening for me: currently the frontend application tries to resolve the '///' path, which tells me Caddy just reverse proxies the request without any path modification.

So maybe it is automatic in general, but not with reverse_proxy?

It doesn’t look like you have a file_server directive or root directive for the example.com site block. Did you try manually setting those?

A file_server is not a reverse proxy, which is what I asked about. Besides, the documented feature is about adding a missing trailing slash, not removing extras.

No? You’re specifically redirecting to the path component without also passing on the query component. Perhaps you just need to add {query} to the redirect. Caddyfile Concepts — Caddy Documentation
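Something like this sketch might work, combining a named regexp capture with {query} (untested; the lazy `(.*?)` capture is meant to avoid a chain of redirects when there are several trailing slashes):

```
@extraSlashes path_regexp extraSlashes ^(.*?)//+$
redir @extraSlashes {re.extraSlashes.1}/?{query} 301
```

One caveat: when the query string is empty, the redirect target still ends with a bare `?`, which browsers tolerate but which isn't the cleanest URL.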
