Robots.txt seems not to work under Caddy

I have a WordPress site that was originally behind nginx, but I switched to Caddy to avoid dealing with SSL all the time. However, around the time I made the switch, articles that robots.txt says to ignore started showing up in search engine results again. Yet the robots.txt file is accessible when I navigate to https://site.com/robots.txt

Is there something extra that has to be configured in Caddy to get robots.txt working?

Nope, nothing special is needed.
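For reference, here's a minimal Caddyfile sketch for a typical WordPress setup; the domain, web root, and PHP-FPM socket path are placeholders, so substitute your own. As long as a physical robots.txt sits in the web root, file_server serves it with no extra configuration:

```
site.com {
	# Placeholder web root; point this at your WordPress install.
	root * /var/www/wordpress

	# Placeholder PHP-FPM socket path; yours may differ.
	php_fastcgi unix//run/php/php-fpm.sock

	# Serves static files from the web root, including robots.txt.
	file_server
}
```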

Does /robots.txt still appear if you browse there in private mode / after clearing cache?

Yup, it still appears. Could it just be a coincidence?

Possibly.

Crawlers don’t particularly care about the server that gives them the file, the headers, or the connection metadata; they just care about the contents of the robots.txt file. As long as the contents are identical, you shouldn’t have seen any difference in behaviour.
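For illustration, a hypothetical robots.txt that blocks one path for every crawler looks like this:

```
User-agent: *
Disallow: /some-hidden-article/
```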

Run your site through one of the online tools that validate robots.txt and make sure it parses as you’d expect it to.
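If you’d rather check locally, Python’s standard library robotparser fetches the file and applies the rules the same way a compliant crawler would. The URLs below are placeholders for your own site and a path the file is supposed to block:

```
from urllib.robotparser import RobotFileParser

# Placeholder URLs: substitute your real site and a disallowed path.
robots_url = "https://site.com/robots.txt"
blocked_page = "https://site.com/some-hidden-article/"

parser = RobotFileParser(robots_url)
parser.read()  # fetches and parses the live robots.txt

# can_fetch() returns False when the rules forbid this user agent
# from crawling the URL, i.e. False means the rule is being honoured.
print(parser.can_fetch("Googlebot", blocked_page))
```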

Hmm, after some more thinking, my guess is that the articles are being linked to from another site, and that’s causing them to show up: robots.txt only stops compliant crawlers from fetching the pages; it doesn’t stop search engines from indexing URLs they discover through external links. Might have to use noindex meta tags or just remove the articles.
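For what it’s worth, the noindex tag goes in each page’s <head>. One caveat worth knowing: a crawler has to be able to fetch the page to see the tag, so the URL must not also be disallowed in robots.txt:

```
<!-- The page must NOT be blocked by robots.txt, or crawlers
     will never fetch it and see this tag. -->
<meta name="robots" content="noindex">
```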
