The internal SSL mechanism implemented inside Caddy does not allow us to proxy the robots.txt file (which is needed to enable/disable page visibility for search engines). Any suggestions for a possible workaround?
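One possible workaround is to have Caddy serve a static robots.txt itself instead of proxying it, while everything else still goes to the backend. This is only a sketch under assumptions: `example.com`, the backend address `localhost:8080`, and the directory `/srv/static` are all placeholders, since no config was shared in this topic.

```caddyfile
example.com {
	# Serve a local robots.txt directly from disk for this one path.
	# /srv/static is a placeholder directory containing robots.txt.
	handle /robots.txt {
		root * /srv/static
		file_server
	}

	# All other requests are proxied to the backend (address is an assumption).
	handle {
		reverse_proxy localhost:8080
	}
}
```

With `handle` blocks, the first matching block wins, so requests for `/robots.txt` never reach the proxied backend.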

1. Caddy version (caddy version):

2. How I run Caddy:

a. System environment:

b. Command:

Paste command here.

c. Service/unit/compose file:

Paste full file contents here.
Make sure backticks stay on their own lines,
and the post looks nice in the preview pane.

d. My complete Caddyfile or JSON config:

Paste config here, replacing this text.
Use `caddy fmt` to make it readable.
DO NOT REDACT anything except credentials.
LEAVE DOMAIN NAMES INTACT.
Make sure the backticks stay on their own lines.

3. The problem I’m having: I am not technical, and therefore can’t answer any of the questions listed, but I am wondering if anyone knows how to solve the issue I raised in the main header.

4. Error messages and/or full log output:

5. What I already tried:

6. Links to relevant resources:

Please fill out the help topic template.

Hello Francis,

I don’t have the info, as I’m not technical and can’t get hold of it. I am trying to figure out a problem. I would delete this post so I don’t cheese anyone off, but that doesn’t seem possible. Sorry for the inconvenience. I will try to ask for help through other forums, maybe.

Yours sincerely,

Phil

This topic was automatically closed after 30 days. New replies are no longer allowed.