Is it possible to orchestrate multiple backend API calls with a single frontend call?

1. The problem I’m having:

Trying to figure out whether it is possible to make multiple backend API calls when a single call comes into a Caddy endpoint. Given the Caddyfile configuration for ports 2016 and 2017 (in the Caddyfile section below), when I invoke

curl -k https://localhost:2018

I would like to see the following output (i.e., Caddy calls both endpoints, one after the other):

Goodbye, world! (localhost:2017)
Goodbye, world! (localhost:2016)

2. Error messages and/or full log output:

Don’t know yet

3. Caddy version:


4. How I installed and ran Caddy:

choco install caddy

a. System environment:

Windows 11

b. Command:

caddy run

d. My complete Caddy config:

localhost:2016 {
	respond "Goodbye, world! (localhost:2016)"
}

localhost:2017 {
	respond "Goodbye, world! (localhost:2017)"
}

localhost:2018 {
	# ?? Not sure how to structure this so that it calls both localhost:2017 and localhost:2016 ??
}

Why do you need to do that? Please explain what you’re actually trying to do.

It’s not really possible to do that with anything built into Caddy.

You could sort of fake it using the templates directive, but only by making requests to the same virtual host, with the httpInclude template function.
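A rough, untested sketch of what that could look like. The `/from-2017` and `/from-2016` paths are made-up internal routes, not anything from this thread, and proxying to the HTTPS backends may additionally require trusting their certificates:

```caddyfile
localhost:2018 {
	# Evaluate template actions in response bodies
	# (text/plain is among the default MIME types).
	templates

	# Internal routes that proxy to the two backends.
	handle /from-2017 {
		reverse_proxy https://localhost:2017
	}
	handle /from-2016 {
		reverse_proxy https://localhost:2016
	}

	# httpInclude makes virtual sub-requests to this same site,
	# so each included path is served by the handlers above.
	handle {
		respond `{{httpInclude "/from-2017"}}{{httpInclude "/from-2016"}}`
	}
}
```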


Like Francis said, I don’t think it can be done currently, but if we had more details, maybe we could add it to our roadmap.

I was trying to simulate the presence of an API gateway in the Caddy server during my API development. I’m not sure whether this is a legitimate request for the Caddy server or not, but here is the scenario if you want to pursue this work (assuming it is not possible today).

  1. Client app obtains an access token from the authorization server on behalf of the user.
  2. Client app makes a call to service A with the access token.
  3. API gateway receives the request for service A and extracts the access token.
  4. API gateway invokes the backend token-validation service with this token via reverse proxy and gets custom API authorizations.
  5. If the API gateway gets an error (e.g., because the token is invalid or expired), it rejects the request right there. Otherwise, it proceeds to the next step.
  6. API gateway includes these custom authorizations on the request and forwards the client request to the actual service A implementation (again via the reverse-proxy mechanism).
  7. The service A implementation checks these custom authorizations and decides whether to process the request (thereby not having to deal with validating the access token and obtaining authorizations itself).

Hope that helps.


That sounds pretty specific, and a custom module could certainly do it. :+1: But have you seen our forward auth capabilities? forward_auth (Caddyfile directive) — Caddy Documentation
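For the scenario above, a forward_auth setup might look roughly like this. The ports, path, and header name are placeholders I’ve invented for illustration, not anything from your setup:

```caddyfile
localhost:2018 {
	# Send a copy of each incoming request to a hypothetical
	# token-validation service first. A non-2xx response from it
	# is returned to the client and the request stops here.
	forward_auth localhost:9000 {
		uri /validate
		# Copy the validator's custom-authorization response
		# header onto the request before proxying upstream.
		copy_headers X-Api-Authorizations
	}

	# Otherwise, forward to the actual service A implementation
	# (again, a placeholder address).
	reverse_proxy localhost:9001
}
```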

Are you basically looking for something like Kong? I’ve thought about a Caddy module to do that kind of thing, but have debated whether users would find it useful and whether it’d be generally useful enough.

Sure, I can use Kong.

I was merely attempting to see whether this is possible with the Caddy server or not, since Caddy is so lightweight and easy to get working natively on Windows, especially for development.

I will take a look at forward auth capabilities.



I could see it being a Caddy module. Someone just needs to fund its development. :+1:

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.