Proxy by HTTP method

I have the following endpoints:

POST /photos
GET /photos/photo_id
DELETE /photos/photo_id

In the GET case, I want Caddy to serve the photos from disk.
In the POST and DELETE cases, I want it to proxy the requests to my Node.js upstream servers.

As far as I can tell, neither the proxy directive nor its except subdirective supports specifying HTTP methods.
The most trivial (and slightly ugly) solution would be to have a separate path, e.g. /p/photo_id, for GET requests, with /p passed to except. However, it just doesn’t feel “idiomatic”. Since this seems like a common use case, how would you guys approach this issue?

With thanks,
Laszlo

Someone correct me if I’m wrong, but this is not usually done in web server configuration.

Neither Apache nor nginx can conditionally proxy based on request method. Both have capable workarounds involving rewrites. Caddy might be capable of something similar.

How would I approach this issue? The application I’m proxying to would ideally handle POST, DELETE, and GET requests. Outside of this, I might consider writing an intermediary program to handle routing between the Node.js uploader and the disk-based file server (this would be pretty easy in Go, I believe).

That is correct. HTTP verbs are usually handled with very complicated rewrite rules, or directly in the application API. I would suggest it’s best handled in the API.

So /photos/image1.jpg, is that what you’re suggesting? Of course Caddy can do that; that’s just a read action.

So the real issue is your dynamic actions: POST (an upload) and DELETE (a file deletion). I highly suggest these go behind a different URL path. Either /api/photos/ or /photos/api. Whichever you choose, your Node app is going to need to handle the file logic: image processing, writing files to disk, filename collisions, permissions (who can upload), etc.

Also, how are you going to know what’s on disk? Your API should provide some query mechanism that points to the “static” URL, or just serves up the file directly. That is all outside of Caddy.

Caddy can serve static files all day long. But it’s not going to be able to do any of those dynamic functions unless you write your own plugin.

Thanks for your suggestion.

Outside of this, I might consider writing an intermediary program to handle routing between the Node.js uploader and the disk-based file server (this would be pretty easy in Go, I believe).

So in the absence of a capable workaround, an intermediary program would be like a custom plugin: it intercepts requests before the proxy plugin has a chance to act on them, and if a request matches the verb + path combo, Caddy returns the photo from disk; otherwise it passes control to the next middleware (or plugin, I’m not sure about the correct term). Right?

Since I started using Caddy only yesterday, I am not sure how easily this could be done; however, it seems to be one possible solution to this problem.
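
In plain net/http terms, I imagine something roughly like this (only a sketch, with a made-up function name, and not Caddy’s real plugin interface, which I have not looked at yet):

package photosplit

import "net/http"

// serveGetsFromDisk answers GET requests for photos from disk and hands
// every other method to the next handler, which inside Caddy would be
// the proxy middleware.
func serveGetsFromDisk(files, next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        if r.Method == http.MethodGet {
            files.ServeHTTP(w, r)
            return
        }
        next.ServeHTTP(w, r)
    })
}

Wiring this into Caddy for real would of course mean implementing its actual middleware interface rather than chaining plain http.Handler values.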

Thanks for your answer.

So /photos/image1.jpg, is that what you’re suggesting? Of course Caddy can do that; that’s just a read action.

Yes, that is what I am trying to achieve.

I highly suggest these go behind a different URL path. Either /api/photos/ or /photos/api.

Basically, this is the workaround I am using for the time being. In its current form: /photos/image1.jpg for GET actions. I have except /photos inside the proxy directive, so Caddy reads the photos from disk. I have a /p endpoint on the Node.js side for everything else photo-related, like deleting and creating photos, but NOT getting them; Caddy proxies all those requests to my Node.js server. But this is the exact situation I would like to avoid.
As far as I see it, /photos/image1.jpg is conceptually one single resource, and I don’t really like the idea of splitting it up because of purely technical limitations.

I omitted this from my original question, but I have a distributed “image processing pipeline” :slight_smile: which makes sure that whatever photo someone uploads via the Node.js endpoint eventually becomes available inside Caddy’s working directory. That is why I want Caddy to do a file read only in the case of a GET request.

Also, I am trying to use the same single Caddy instance as a static file server and as a reverse proxy, at least to start with. Later on I might rewrite the /photos endpoint as a Caddy plugin and run separate static file servers, which would solve this issue, but at the moment that looks like premature optimisation.

Looking at Apache and nginx, they don’t appear to have any mechanism to proxy differently based on HTTP method. At least nothing simple. What they do have is some frankly arcane and hard-to-administer rewrite rules, with which you might be able to do something similar to what you want.

So that’s Apache and nginx.
What makes Caddy powerful is that it’s also simple and extensible. If you really don’t want the Node backend to serve the file, and you want something simple, use a different path. I know that doesn’t match REST ideals, but it’s the quick solution.

Otherwise, you could hack on the proxy code directly. At a quick glance, I’d start with the match function and modify it to look at the http.Request, which contains the Method field. Then you could match a proxy upstream on that condition as well. It also means you would have to extend the proxy directive to accept HTTP methods:

proxy /photos localhost:3000 {
   method POST DELETE   # <-- would need a custom subdirective
}
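
On the Go side, the new condition itself would be tiny; roughly something like this (a hypothetical helper, not the proxy plugin’s actual code):

package proxy

import "net/http"

// methodMatches reports whether the request uses one of the methods
// listed in the (hypothetical) method subdirective above.
func methodMatches(r *http.Request, allowed []string) bool {
    for _, m := range allowed {
        if r.Method == m {
            return true
        }
    }
    return false
}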

But that still leaves GET, which wouldn’t be proxied… so that requires more hacking somewhere.

See where this is going? I would just use a custom path, or just serve it all from the Node process(es). Come to think of it, that’s another option: fire up multiple Node processes and use Caddy as a front-end load balancer. Sorry if that doesn’t help.

Honestly, I was thinking about just writing a short Go program with gorilla/mux and the standard net/http package. Have Caddy proxy all requests to it, and program it to pass POST and DELETE requests on to the next stage (the Node.js server).
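
Something along these lines, assuming a ./photos directory on disk and the Node.js server on localhost:3000 (both just placeholders):

package main

import (
    "log"
    "net/http"
    "net/http/httputil"
    "net/url"

    "github.com/gorilla/mux"
)

func main() {
    upstream, err := url.Parse("http://localhost:3000") // placeholder Node.js address
    if err != nil {
        log.Fatal(err)
    }
    proxy := httputil.NewSingleHostReverseProxy(upstream)

    r := mux.NewRouter()
    // GET: read the photo straight from disk.
    r.PathPrefix("/photos/").Methods(http.MethodGet).
        Handler(http.StripPrefix("/photos/", http.FileServer(http.Dir("./photos"))))
    // POST and DELETE: pass the request on to the Node.js server.
    r.PathPrefix("/photos").Methods(http.MethodPost, http.MethodDelete).Handler(proxy)

    // Caddy would proxy everything under /photos to this port.
    log.Fatal(http.ListenAndServe(":8080", r))
}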

But again, the ideal solution would definitely be for the same endpoint to handle all the verbs (i.e. for the Node.js server to handle serving files back to Caddy to be returned to the client).

Some great comments from @jim regarding extending Caddy for this purpose, though.


@jim @Whitestrake Thanks to both of you for the great answers.

I think I am going to stick with the two-path workaround suggested by @jim, at least for the time being, and if time permits, probably move towards a single-endpoint solution. I also like the idea of running a simple custom-made file-serving process instead of a full-blown server.

I am a bit reluctant to use Node as a static file server, though, because in the past I saw numerous benchmarks showing that Node.js is not the best option for that use case. However, in those benchmarks it was compared against nginx. (After some googling, I found that, in terms of static file serving, the golang-based solutions are comparable to the Node.js ones, but both are outperformed by nginx.)

For plain static file serving, Go’s net/http package is comparably fast to nginx because they both use Unix sendfile. You’ll find that Caddy is quite fast at serving static files, so don’t worry. :slight_smile: (Although we have a lot of optimization work to do!)
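
For what it’s worth, the standard library version of “serve this directory” is only a few lines (the ./photos path is just an example):

package main

import (
    "log"
    "net/http"
)

func main() {
    // http.FileServer hands the file to the kernel via sendfile where the
    // platform supports it, which is where most of the speed comes from.
    http.Handle("/photos/", http.StripPrefix("/photos/", http.FileServer(http.Dir("./photos"))))
    log.Fatal(http.ListenAndServe(":8080", nil))
}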


[quote=“Whitestrake, post:7, topic:1689”]
Some great comments from @jim regarding extending Caddy for this purpose, though.
[/quote]
Thanks @Whitestrake, being able to proxy based on HTTP method would certainly be a unique feature amongst HTTP servers, I believe. Maybe something to look into at some point.

Wouldn’t be hard to do; just need to figure out the Caddyfile syntax to represent it.

For Caddy 2.0 I plan on introducing a standard “request matcher” syntax of sorts that would have that capability.
