Using Caddy to secure ports

1. The problem I’m having:

I have two services running on two different ports: an Ollama server on 11434 and a Whisper server on 5000, and I would like to restrict access to those ports using API keys passed along in the auth header. I have followed a number of tutorials using reverse_proxy to try to lock down my site, but after launching Caddy I am still able to reach the site without any auth errors or Caddy output.

I am also running an Apache2 web server, and I was hoping to be able to keep it rather than having Caddy handle those requests.

2. Error messages and/or full log output:

kim@el3ktra:~$ caddy run --config ./Caddyfile --envfile ./caddy.env
2025/02/13 03:09:59.999	INFO	using config from file	{"file": "./Caddyfile"}
2025/02/13 03:10:00.000	INFO	adapted config to JSON	{"adapter": "caddyfile"}
2025/02/13 03:10:00.000	INFO	admin	admin endpoint started	{"address": "localhost:2019", "enforce_origin": false, "origins": ["//[::1]:2019", "//127.0.0.1:2019", "//localhost:2019"]}
2025/02/13 03:10:00.001	INFO	tls.cache.maintenance	started background certificate maintenance	{"cache": "0xc000754d00"}
2025/02/13 03:10:00.001	DEBUG	http.auto_https	adjusted config	{"tls": {"automation":{"policies":[{}]}}, "http": {"servers":{"srv0":{"listen":[":5000"],"routes":[{"handle":[{"handler":"subroute","routes":[{"handle":[{"handler":"authentication","providers":{"http_basic":{"accounts":[{"password":"{env.BASIC_USER_AUTH}","username":"{env.BASIC_AUTH_USER}"}],"hash":{"algorithm":"bcrypt"},"hash_cache":{}}}},{"handler":"reverse_proxy","upstreams":[{"dial":"localhost:5000"}]}]}]}],"terminal":true}],"automatic_https":{"skip":["el3ktra.net"]}}}}}
2025/02/13 03:10:00.001	DEBUG	http	starting server loop	{"address": "[::]:5000", "tls": false, "http3": false}
2025/02/13 03:10:00.001	WARN	http	HTTP/2 skipped because it requires TLS	{"network": "tcp", "addr": ":5000"}
2025/02/13 03:10:00.001	WARN	http	HTTP/3 skipped because it requires TLS	{"network": "tcp", "addr": ":5000"}
2025/02/13 03:10:00.001	INFO	http.log	server running	{"name": "srv0", "protocols": ["h1", "h2", "h3"]}
2025/02/13 03:10:00.001	INFO	autosaved config (load with --resume flag)	{"file": "/home/kim/.config/caddy/autosave.json"}
2025/02/13 03:10:00.001	INFO	serving initial configuration
2025/02/13 03:10:00.006	INFO	tls	cleaning storage unit	{"storage": "FileStorage:/home/kim/.local/share/caddy"}
2025/02/13 03:10:00.008	INFO	tls	finished cleaning storage units

3. Caddy version:

2.9.1 h1

4. How I installed and ran Caddy:

curl https://webi.sh/caddy | sh

a. System environment:

Ubuntu 24.04.1 LTS
Arch: x86_64

b. Command:

caddy run --config ./Caddyfile --envfile ./caddy.env

c. Service/unit/compose file:

n/a; Caddy is run directly from the command line (see 4b above).

d. My complete Caddy config:

*caddy.env:*
BASIC_AUTH_USER='apitoken'
BASIC_USER_AUTH='<generated from caddy hash-password>'

*Caddyfile (port 5000 version):*
{
        debug
}
http://el3ktra.net:5000 {
        basic_auth {
                {env.BASIC_AUTH_USER} {env.BASIC_USER_AUTH}
        }
        reverse_proxy localhost:5000
}

5. Links to relevant resources:

You seem to be trying to have Caddy and the app both listening on port 5000, which can’t work unless you have also limited which interfaces they are listening on.
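
In other words, only one process can bind a given interface and port at a time. If it helps to see that behaviour in isolation, here's a minimal Python sketch (the addresses and ports are arbitrary, not tied to your setup):

import socket

# Two listeners cannot share the same interface:port.
a = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
a.bind(("127.0.0.1", 0))          # let the OS pick a free port
a.listen()
port = a.getsockname()[1]

b = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    b.bind(("127.0.0.1", port))   # same interface and port -> fails
except OSError as err:
    print(err)                    # e.g. [Errno 98] Address already in use
finally:
    a.close()
    b.close()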

Thanks. I replaced my Caddyfile with:

{
        debug
}
http://el3ktra.net:2800 {
        reverse_proxy localhost:5000
}

and attempted to access it locally at 192.168.0.128:2800… and it is definitely doing something… I am getting a blank screen, but only when I am running Caddy. I really do appreciate your help, as I do feel like this is a silly mistake, but I also feel like I am very close.

Do you have curl installed by any chance?

If yes, can you run:

curl -v http://192.168.0.128:2800

and post the result?


Great idea! I do see that it’s talking to Caddy, but it’s not forwarding to port 5000 for some reason. I also have debugging on and am not getting any messages.

kim@el3ktra:~$ curl -v http://192.168.0.128:2800
*   Trying 192.168.0.128:2800...
* Connected to 192.168.0.128 (192.168.0.128) port 2800
> GET / HTTP/1.1
> Host: 192.168.0.128:2800
> User-Agent: curl/8.5.0
> Accept: */*
> 
< HTTP/1.1 200 OK
< Server: Caddy
< Date: Thu, 13 Feb 2025 23:58:51 GMT
< Content-Length: 0
< 
* Connection #0 to host 192.168.0.128 left intact

Can you also try:

curl -v http://localhost:5000

to see what comes back?

And I’m just spitballing here, but can you also try this in your Caddyfile?

reverse_proxy http://localhost:5000

Thank you to the people who have responded. I have made good progress and you guys were able to answer the question.

This is working:
caddy reverse-proxy --from :2800 --to :5000

kim@el3ktra:~$ caddy reverse-proxy --from :2800 --to :5000 &
[4] 541836
kim@el3ktra:~$ 2025/02/14 00:13:51.649	WARN	admin	admin endpoint disabled
2025/02/14 00:13:51.649	INFO	http.auto_https	automatic HTTPS is completely disabled for server	{"server_name": "proxy"}
2025/02/14 00:13:51.649	INFO	tls.cache.maintenance	started background certificate maintenance	{"cache": "0xc00059e800"}
2025/02/14 00:13:51.649	WARN	http	HTTP/2 skipped because it requires TLS	{"network": "tcp", "addr": ":2800"}
2025/02/14 00:13:51.649	WARN	http	HTTP/3 skipped because it requires TLS	{"network": "tcp", "addr": ":2800"}
2025/02/14 00:13:51.649	INFO	http.log	server running	{"name": "proxy", "protocols": ["h1", "h2", "h3"]}
2025/02/14 00:13:51.649	INFO	caddy proxying	{"from": "http://:2800", "to": [":5000"]}
2025/02/14 00:13:51.654	INFO	tls	storage cleaning happened too recently; skipping for now	{"storage": "FileStorage:/home/kim/.local/share/caddy", "instance": "185eebab-5799-4898-8ae6-8a6be712937e", "try_again": "2025/02/15 00:13:51.654", "try_again_in": 86399.999999686}
2025/02/14 00:13:51.654	INFO	tls	finished cleaning storage units

kim@el3ktra:~$ curl -v http://192.168.0.128:2800
*   Trying 192.168.0.128:2800...
* Connected to 192.168.0.128 (192.168.0.128) port 2800
> GET / HTTP/1.1
> Host: 192.168.0.128:2800
> User-Agent: curl/8.5.0
> Accept: */*
> 
< HTTP/1.1 200 OK
< Access-Control-Allow-Headers: content-type, authorization
< Access-Control-Allow-Origin: *
< Content-Length: 2099
< Content-Type: text/html
< Server: Caddy
< Server: whisper.cpp
< Date: Fri, 14 Feb 2025 00:13:59 GMT
< 

    <html>
    <head>
        <title>Whisper.cpp Server</title>
        <meta charset="utf-8">
        <meta name="viewport" content="width=device-width">
        <style> ...

Then, I got the following Caddyfile to work:

{
        debug
}
http://el3ktra.net:2800 {
        reverse_proxy :5000
}

So now I am trying to add auth:

{
        debug
}
http://el3ktra.net:2800 {
        basic_auth {
                {env.BASIC_AUTH_USER} {env.BASIC_USER_AUTH}
        }
        reverse_proxy :5000
}

Attempting to access port 2800 without credentials is blocked:

el3ktra@LilL3x:~/ $ wget el3ktra.net:2800
--2025-02-13 20:42:50--  http://el3ktra.net:2800/
Resolving el3ktra.net (el3ktra.net)... 67.160.190.98
Connecting to el3ktra.net (el3ktra.net)|67.160.190.98|:2800... connected.
HTTP request sent, awaiting response... 401 Unauthorized

Username/Password Authentication Failed.

Like I said before, at this point I think that the server is working. But, if you want to stick around for the afterparty, I am having a devil of a time getting a POST request through now.

import requests

ul_url = 'http://el3ktra.net:2800/inference'
files = {'file': f}  # f is an already-open audio file handle
auth_header = {'Authorization': 'Bearer <api_key>', 'Content-Type': 'application/json'}
data = {'response_format': 'json'}
response = requests.post(ul_url, files=files, headers=auth_header, data=data)
# also tried:
response = requests.post(ul_url, files=files, data=data, auth=('kim', '<api_key>'))

I also tried “Basic” instead of Bearer, and putting a username in front, delimiting with both a space and a colon. Regardless, the POST is returning 401.

I’m having a devil of a time finding a client-side example, so if you have a link you can share, that would be very helpful.

Again, thank you to everyone that helped me with this issue.

Can you try

curl -u "login:password" http://el3ktra.net:2800

Replace login and password with the login and password you’ve configured in your Caddyfile.

If that doesn’t work, the credentials you’re using are wrong.

You can add -v to that curl command to get a verbose output.
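
If you'd rather test from Python (since that's where your 401s are showing up), this is roughly the equivalent of that curl command; the URL and credentials below are placeholders for whatever you've configured:

import requests

# requests' auth= tuple sends an HTTP Basic Authorization header, like curl -u.
resp = requests.get("http://el3ktra.net:2800", auth=("login", "password"))
print(resp.status_code)   # 200 if the credentials match, 401 otherwise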

I would also add that you’re mixing up HTTP Basic Auth with Authorization Bearer - those are two different things.
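
Concretely, the two headers look like this (the values are just illustrative); Caddy's basic_auth directive only understands the first form, while a bearer token has to be matched some other way, for example with a header matcher as shown below:

import base64

# HTTP Basic Auth: username:password, base64-encoded by the client.
print("Authorization: Basic " + base64.b64encode(b"login:password").decode())
# Bearer auth: a single opaque token, no username, sent as-is.
print("Authorization: Bearer <api_key>")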

If you want to try the Authorization Bearer approach, here’s a quick-and-dirty Caddyfile. Feel free to tweak it as needed:

{
	http_port 8080
}

:8080 {
	@apiToken {
		header Authorization "Bearer 067D5328-2E4D-45BE-B57B-C6641F933B8E"
	}
	handle @apiToken {
		respond "API section"
	}
	respond "Unauthorized" 401
}

$ caddy run --config Caddyfile

And here are my curl tests:

$ curl http://localhost:8080
Unauthorized

$ curl http://localhost:8080 -H 'Authorization: Bearer 067D5328-2E4D-45BE-B57B-C6641F933B8E'
API section

The token 067D5328-2E4D-45BE-B57B-C6641F933B8E is just an example - be sure to replace it with your own. You can generate one using the uuidgen command. Also, make sure to use HTTPS rather than HTTP.
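
And here are the same two checks from Python, in case that's easier to drop into your script (the token is the example value from the Caddyfile above, so substitute your own):

import requests

url = "http://localhost:8080"
print(requests.get(url).text)        # -> Unauthorized
print(requests.get(url, headers={
    "Authorization": "Bearer 067D5328-2E4D-45BE-B57B-C6641F933B8E",
}).text)                             # -> API section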

I now have a server running Ollama and Whisper behind authenticated ports that I can access via requests calls in Python. :tada:

I wanted to document the final solution for anyone else who is trying to authenticate their ports (namely Ollama) and then call them using requests or curl.

My Caddyfile (no env file used):

http://el3ktra.net:2800 {
        handle /* {
                basic_auth {
                        <username1> <hash1>
                        <username2> <hash2>
                        ...
                }
                reverse_proxy :5000
        }
}

http://el3ktra.net:2900 {
        handle /* {
                basic_auth {
                        <username1> <hash1>
                        <username2> <hash2>
                        ...
                }
                reverse_proxy :11434
        }
}

Python Requests call (to Whisper):
resp = requests.post(url, files=files, data=data, auth=(email, api))
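
For completeness, here is roughly what that Whisper call looks like end to end; the username, password, and file path below are placeholders for your own values:

import requests

email = "<username1>"                     # one of the users from the Caddyfile
api = "<the password that was hashed>"    # the plaintext password, not the hash
url = "http://el3ktra.net:2800/inference"

with open("audio.wav", "rb") as f:        # placeholder audio file
    resp = requests.post(
        url,
        files={"file": f},
        data={"response_format": "json"},
        auth=(email, api),                # HTTP Basic Auth through Caddy
    )
print(resp.status_code, resp.text)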

Python Ollama call:
self.client = ollama.Client(host=url, auth=(email, api))

I closed ports 5000 and 11434, and it works like a charm.

Thank you to timelordx for all your help.
