Serverless plugins?

The idea is to let Caddy act as if it were 'AWS Lambda'

I’ve been looking into serverless solutions such as AWS Lambda, Google Cloud Functions and Microsoft Azure Functions.

All of these services do basically one thing: execute the right piece of code on an HTTP request (they can also run on a timer, but I won’t go into that here). They all support Node.js, and with some workarounds you can execute Go code that way. But none of them feel “just right”: controlling the raw HTTP reply is next to impossible, and even when it is possible, you usually have no control over the HTTP headers you return (such as Location: for redirects) or over the status code for non-200 results.

The idea behind this “plugin discussion” is to find something that’s easy to configure and plays nicely with code that’s already written.

Example config

I think it would be great if Caddy had a plugin which “runs some code” on an HTTP request. I’m no expert on Caddy (yet), but I’d imagine things to be configurable like this:

serverless:
  - url: /home
    run: docker run myservice ./server -port $HTTP_PORT
    keepalive: 10s
  - url: /contact
    run: rkt run myservice -- -port $HTTP_PORT
    keepalive: 60s
  - url: /shop
    run: /usr/local/bin/myservice -address /run/caddy/mywebsite/shop.socket
    keepalive: 15s

The Docker/rkt-specific syntax might be a bit off here, but the idea is to (optionally) specify a port or unix socket the serverless component should listen on, so the web server can be started on the fly. The keepalive parameter is how long Caddy should wait without incoming requests before “killing” the container.
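
To make the lifecycle concrete, here is a rough Go sketch (not Caddy plugin code; the type name, the hard-coded port and the command are made up) of what the handler behind such a directive might do: start the backend on the first request, reverse-proxy to it, and kill it once it has been idle for the keepalive period.

// Rough sketch of on-demand process management (hypothetical, not Caddy code).
package main

import (
    "log"
    "net/http"
    "net/http/httputil"
    "net/url"
    "os/exec"
    "sync"
    "time"
)

type lazyBackend struct {
    mu        sync.Mutex
    cmd       *exec.Cmd
    lastSeen  time.Time
    keepalive time.Duration
    target    *url.URL // where the started process will listen
    command   []string // e.g. {"/usr/local/bin/myservice", "-port", "9000"}
}

func (b *lazyBackend) ServeHTTP(w http.ResponseWriter, r *http.Request) {
    b.mu.Lock()
    if b.cmd == nil {
        // Start the serverless component on demand.
        cmd := exec.Command(b.command[0], b.command[1:]...)
        if err := cmd.Start(); err != nil {
            b.mu.Unlock()
            http.Error(w, err.Error(), http.StatusBadGateway)
            return
        }
        b.cmd = cmd
        go b.reap()
    }
    b.lastSeen = time.Now()
    b.mu.Unlock()

    // A real implementation would wait until the backend accepts connections
    // and would avoid killing it while a request is still in flight.
    httputil.NewSingleHostReverseProxy(b.target).ServeHTTP(w, r)
}

// reap kills the process once it has seen no requests for the keepalive period.
func (b *lazyBackend) reap() {
    for {
        time.Sleep(b.keepalive)
        b.mu.Lock()
        if time.Since(b.lastSeen) >= b.keepalive {
            b.cmd.Process.Kill()
            b.cmd.Wait()
            b.cmd = nil
            b.mu.Unlock()
            return
        }
        b.mu.Unlock()
    }
}

func main() {
    target, _ := url.Parse("http://127.0.0.1:9000") // stands in for $HTTP_PORT
    http.Handle("/shop", &lazyBackend{
        keepalive: 15 * time.Second,
        target:    target,
        command:   []string{"/usr/local/bin/myservice", "-port", "9000"},
    })
    log.Fatal(http.ListenAndServe(":8080", nil))
}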

I’ve also wondered: couldn’t we just send the HTTP request through stdin and the HTTP response through stdout? Possibly, but that would limit the number of simultaneous requests somewhat. Then again, I’m just brainstorming here.
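
For what it’s worth, Go’s standard library already implements a close cousin of this: the CGI protocol, where the request reaches a child process through environment variables and stdin (for the body), and the full response, status line and headers included, is read back from its stdout. One process per request avoids the concurrency limit at the cost of a fork per hit. A minimal sketch (reusing the /shop binary from the example config, which would have to speak CGI):

package main

import (
    "log"
    "net/http"
    "net/http/cgi"
)

func main() {
    http.Handle("/shop", &cgi.Handler{
        Path: "/usr/local/bin/myservice", // executable speaking CGI on stdin/stdout
        Root: "/shop",                    // URI prefix used to compute PATH_INFO
    })
    log.Fatal(http.ListenAndServe(":8080", nil))
}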

Feedback welcome
I’m rather new to the serverless paradigm, but I’m sure everyone here has some criticism they’d like to share. This idea is based solely on my impression of serverless computing from what I’ve read on the web, so feel free to correct me!

Hey Etienne! I admit I don’t have much experience with “serverless” solutions either (I just see the name serverless being made fun of all over the interwebs).

Instead of that config syntax, consider Caddyfile syntax. :wink: Along those lines, it seems quite possible.
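
Off the top of my head, nothing like this exists yet, but your example recast as a hypothetical Caddyfile directive might read something like this (the directive name and sub-options are entirely made up):

mywebsite.com {
    serverless /home {
        run docker run myservice ./server -port $HTTP_PORT
        keepalive 10s
    }
    serverless /shop {
        run /usr/local/bin/myservice -address /run/caddy/mywebsite/shop.socket
        keepalive 15s
    }
}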

This is an interesting idea. How would we handle:

  • Knowing if the process is already running?
  • Terminating the process?

For instance, take your /home case that uses Docker: what if myservice is updated? How does that container get restarted, and is that Caddy’s job?

– James