Introducing Lura API Gateway Module for Caddy: Feedback Wanted!

Hello Caddy Community,

I’m excited to share a personal project I’ve been working on – a new Caddy module that implements an API Gateway using the Lura project framework (formerly known as KrakenD). This module is designed to bring advanced API gateway functionalities to Caddy, making it easier to manage and orchestrate complex API environments.

What is the Lura API Gateway Module?

The Lura API Gateway module for Caddy allows you to define multiple API endpoints, each backed by one or more backend services. It supports response aggregation and transformation rules per endpoint, enabling complex API orchestrations with ease. Here’s a quick overview of the main features:

  • Multiple Endpoints: Define a set of endpoints representing your public API.
  • Backend Services: Configure backend services for each endpoint, with options for load balancing, field filtering, and response transformation.
  • Tighter Caddy Integration: Use any Caddy placeholder in backend routing, and configure your API gateway with a Caddyfile.

Example Configuration

Here’s an example of how you can configure the Lura module using a Caddyfile:

{
    log {
        level DEBUG
    }
}

:{$LISTEN_PORT}

handle /__ready {
    respond OK 200
}

lura {
    timeout 10s
    cache_ttl 3600s

    debug_endpoint
    echo_endpoint /__custom/echo

    endpoint /users/{user} {
        method GET
        concurrent_calls 2
        timeout 1000s
        cache_ttl 3600s

        backend {$BACKEND_HOST} {
            url_pattern /registered/{user}
            allow id name email
            mapping {
                email>personal_email
            }
        }

        backend {$BACKEND_HOST} {
            url_pattern /users/{user}/permissions
            group permissions
        }
    }

    endpoint /dynamic {
        method GET

        backend {$BACKEND_HOST} {
            method GET
            url_pattern /tenants/{header.X-Tenant-Id}
        }
    }
}
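
For the /users/{user} endpoint above, Lura merges the two backend responses into a single payload: the first backend is filtered down to id, name, and email (with email renamed to personal_email by the mapping), and the second backend’s response is nested under the permissions key by the group option. A client would receive something roughly like this (the values, and the contents of the permissions object, are made up for illustration):

{
    "id": 42,
    "name": "Jane Doe",
    "personal_email": "jane@example.com",
    "permissions": {
        "read": true,
        "write": false
    }
}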

Current Status and Future Plans

This project is currently in a pre-alpha release state, and not all Lura configuration options are exposed yet.

Here are some of the next steps and features I plan to work on:

  • Support More Lura Config Options: Expose additional configuration options to leverage the full power of the Lura framework.
  • Custom Caddy Modules for Request/Response Modifiers: Allow users to create custom modules to modify requests and responses, enhancing flexibility and control.
  • Additional API Gateway Features: Implement features like rate limiting, circuit breaker, and more to provide a comprehensive API gateway solution.
  • Improving Performance and Scalability: Focus on optimizing performance and scalability to handle increased traffic and backend load efficiently.
  • Robust Error Handling and Logging: Enhance error handling and logging for better troubleshooting and monitoring.
  • Enhanced Security Features: Implement security measures to protect against common vulnerabilities and ensure safe API operations.

How You Can Help

I’d love to get feedback from the Caddy community on this module. Your insights and suggestions will be invaluable in shaping the future of this project. Specifically, I’m looking for feedback on:

  • Use Cases: How could you see yourself using this module?
  • Features: What features are most important to you?
  • Performance: Any tips on optimizing performance and scalability?
  • Security: Any suggestions for improving the security features?

You can find the project on GitHub. Feel free to open issues, submit pull requests, or simply share your thoughts in this thread.

The module documentation has already been published at Caddy Docs.

Thank you for your time and feedback!


Cool!

At a glance, it seems like your config is kinda reimplementing concepts built into Caddy, like request matching. Wouldn’t it make more sense to lean on Caddy’s matchers to define your endpoint matching? It could also let you flatten the config a bit by not having everything inside lura, if you have multiple handlers (one per route/endpoint).
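
For illustration, the flattened shape I have in mind would be something like this (a purely hypothetical sketch, not something the module supports today; the named path_regexp matcher and the {re.user.1} placeholder are standard Caddy, but the bare per-route lura block is made up):

@user path_regexp user ^/users/([^/]+)$
handle @user {
    lura {
        backend {$BACKEND_HOST} {
            url_pattern /registered/{re.user.1}
        }
    }
}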


I think your idea is great. At first I wanted to support exactly that, but I ran into some issues that made me take a different route:

As a design principle, Lura wants you to be very verbose and specific about what is exposed, since the goal is usually to define an API contract at the gateway level, so that clients don’t rely on the upstream backends’ API contracts (this is not just a reverse proxy).

So I worried that allowing arbitrary request matchers would go against this principle and design goal.

There is also extra technical complexity involved that I preferred to avoid, since at first I was only validating a proof of concept.

The Lura project is composed mainly of three layers: configuration, router, and proxy. The router layer is responsible for exposing the public API. Allowing arbitrary request matchers means the existing router adapters can’t be reused, so I would have to rewrite most of that layer.

Finally, I could not find a request matcher that would let me explicitly declare path variables and have them populated as placeholders, the way I can when defining the endpoint /users/{user}.

Specifically about this last issue: can this easily be implemented in Caddy?

To conclude, I believe your proposal is a valid approach worth exploring; it is just not as straightforward.

You can use path_regexp to capture path segments and then use them later with {re.1} (i.e. using Caddy placeholder replacement)
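
For example, something along these lines works in stock Caddy today (untested sketch; the upstream is just the same {$BACKEND_HOST} placeholder used in your post):

@user path_regexp user ^/users/([^/]+)$
handle @user {
    rewrite * /registered/{re.user.1}
    reverse_proxy {$BACKEND_HOST}
}

Here the matcher is named, so the capture is available as {re.user.1}; the shorter {re.1} form refers to an unnamed regexp matcher.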
