1. Caddy version (caddy version):
v2.3.0 h1:fnrqJLa3G5vfxcxmOH/+kJOcunPLhSBnjgIvjXV/QTA=
2. How I run Caddy:
a. System environment:
Using Docker on Windows, inside the WSL2 Ubuntu subsystem
b. Command:
caddy run --config Caddyfile
c. Service/unit/compose file:
Dockerfile:

FROM debian

# Install Caddy from the official Cloudsmith apt repository
RUN apt-get update && apt-get install -y debian-keyring debian-archive-keyring apt-transport-https curl && \
    curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/gpg.key' | apt-key add - && \
    curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/debian.deb.txt' | tee -a /etc/apt/sources.list.d/caddy-stable.list && \
    apt-get update && \
    apt-get install -y caddy

# Keep the container running; Caddy itself is started manually with the command from (b)
CMD ["ping", "localhost"]
docker-compose.yml:

version: '3.3'
services:
  caddy:
    build: .
    ports:
      - "2019:2019"
      - "9000:9000"
    restart: always
    volumes:
      - $PWD/.:/var/www
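
For reference, this is roughly how I bring it up (a sketch; the container name caddy_caddy_1 is just Compose's default project_service_1 naming and may differ on your machine):

# build the image and start the container in the background
docker-compose up -d --build
# start Caddy inside the running container, from the directory where the Caddyfile is mounted
docker exec -w /var/www -it caddy_caddy_1 caddy run --config Caddyfile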
d. My complete Caddyfile or JSON config:
Caddyfile (mounted into the container):

localhost:9000 {
    encode gzip
    reverse_proxy https://storage.googleapis.com
    rewrite * /<bucket>/<directory>/<subdirectory>{uri}
    log {
        output file /var/log/caddy/bucket.log
    }
}
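
As a sanity check, the Caddyfile can be syntax-checked inside the container with Caddy's built-in validate subcommand:

caddy validate --config Caddyfile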
3. The problem I’m having:
I want to use Caddy as a reverse proxy for static HTML, XML, JS, CSS and image files in a Google storage bucket. The bucket is publicly accessible.
If I request localhost:9000/index.html (or omit index.html entirely), I’d like to see the file https://storage.googleapis.com/<bucket>/<directory>/<subdirectory>/index.html
At the moment I get the error below, so some part of the proxy is working, but not all of it. Has anybody managed to successfully reverse-proxy a Google storage bucket?
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.
</Details>
</Error>
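
To rule out a permissions problem on the GCS side, the object can also be fetched directly, bypassing the proxy (same placeholders as in the Caddyfile):

curl -i https://storage.googleapis.com/<bucket>/<directory>/<subdirectory>/index.html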
4. Error messages and/or full log output:
2021/03/05 19:59:01.812 error http.log.access.log0 handled request {"request": {"remote_addr": "172.19.0.1:47048", "proto": "HTTP/2.0", "method": "GET", "host": "localhost:9000", "uri": "/", "headers": {"User-Agent": ["Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:86.0) Gecko/20100101 Firefox/86.0"], "Accept": ["text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8"], "Accept-Language": ["de,en-US;q=0.7,en;q=0.3"], "Accept-Encoding": ["gzip, deflate, br"], "Dnt": ["1"], "Upgrade-Insecure-Requests": ["1"], "Cache-Control": ["max-age=0"], "Te": ["trailers"]}, "tls": {"resumed": false, "version": 772, "cipher_suite": 4865, "proto": "h2", "proto_mutual": true, "server_name": "localhost"}}, "common_log": "172.19.0.1 - - [05/Mar/2021:19:59:01 +0000] \"GET / HTTP/2.0\" 403 223", "duration": 1.2390266, "size": 223, "status": 403, "resp_headers": {"Date": ["Fri, 05 Mar 2021 19:59:01 GMT"], "Expires": ["Fri, 05 Mar 2021 19:59:01 GMT"], "Cache-Control": ["private, max-age=0"], "X-Guploader-Uploadid": ["ABg5-UyJ2I4ixCv11Ac67EcuA3Z1IyCXQHvL6v7Tia1V_fsRNZLwXlXHJmD8kC2dq-Eb1OgJTZDrJK5RLR_2EODAZg"], "Content-Type": ["application/xml; charset=UTF-8"], "Server": ["Caddy", "UploadServer"], "Content-Length": ["223"]}}
2021/03/05 19:59:01.947 info http.log.access.log0 handled request {"request": {"remote_addr": "172.19.0.1:47048", "proto": "HTTP/2.0", "method": "GET", "host": "localhost:9000", "uri": "/favicon.ico", "headers": {"User-Agent": ["Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:86.0) Gecko/20100101 Firefox/86.0"], "Accept": ["image/webp,*/*"], "Accept-Language": ["de,en-US;q=0.7,en;q=0.3"], "Accept-Encoding": ["gzip, deflate, br"], "Dnt": ["1"], "Referer": ["https://localhost:9000/"], "Te": ["trailers"]}, "tls": {"resumed": false, "version": 772, "cipher_suite": 4865, "proto": "h2", "proto_mutual": true, "server_name": "localhost"}}, "common_log": "172.19.0.1 - - [05/Mar/2021:19:59:01 +0000] \"GET /favicon.ico HTTP/2.0\" 0 0", "duration": 0.0016936, "size": 0, "status": 0, "resp_headers": {"Server": ["Caddy"]}}
5. What I already tried:
I tried to get it working but was not able to, and I’m nearly out of ideas; that’s why I’m asking for help here. Is reverse_proxy even the right tool for this? The one thing I have not verified yet is sketched below.
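My suspicion is that the proxied request still carries Host: localhost:9000, so the upstream may need to see its own hostname instead. A variant using reverse_proxy’s header_up subdirective (unverified on my side):

localhost:9000 {
    encode gzip
    rewrite * /<bucket>/<directory>/<subdirectory>{uri}
    reverse_proxy https://storage.googleapis.com {
        # override the Host header the upstream sees (assumption: GCS expects its own hostname)
        header_up Host storage.googleapis.com
    }
}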
Why I’m trying this:
My goal is to have multiple pages in the storage bucket, each in its own directory, and to serve them on different domains, as well as to add pages dynamically via the admin API.