Reverse Proxy with Caddy

Caddy is a production-ready web server with automatic HTTPS. It makes an excellent reverse proxy for Stario apps: zero-configuration TLS, a simple config syntax, and native Unix socket support.

Basic Setup

Point Caddy at your Stario app running on a TCP port:

example.com {
    reverse_proxy localhost:8000
}

That's it. Caddy automatically obtains and renews TLS certificates from Let's Encrypt.
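
To try this locally, validate the file and then run Caddy in the foreground (assuming the config is saved as Caddyfile in the current directory):

# check that the Caddyfile parses before starting
caddy validate --config Caddyfile

# run Caddy in the foreground with this config
caddy run --config Caddyfile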

Unix Socket Proxy

For production, use Unix sockets to avoid TCP overhead. This is the recommended setup when Caddy and your app run on the same machine:

example.com {
    reverse_proxy unix//sockets/app.sock
}
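
The app has to bind that same path. A minimal app-side sketch, assuming your Stario application object is named app; the serve() call mirrors the production example later on this page, and the import path is hypothetical:

# minimal sketch - `app` and its module path are placeholders for your project
import asyncio

from myproject import app  # hypothetical import of your Stario app

async def main() -> None:
    # bind the exact path Caddy proxies to
    await app.serve(unix_socket="/sockets/app.sock")

asyncio.run(main())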

Reusable Snippet

If you run multiple apps, define a snippet to keep things DRY:

(sock) {
    reverse_proxy unix/{args[0]}
}

example.com {
    import sock /sockets/myapp.sock
}

api.example.com {
    import sock /sockets/api.sock
}

Socket Permissions

When using Unix sockets, both Caddy and your app need read/write access to the socket file. Mismatched permissions are the most common cause of proxy failures here: Caddy returns a 502 and logs a "permission denied" dial error.
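
A quick way to narrow this down is to hit the socket directly with curl, bypassing Caddy (socket path taken from the example above):

# talk to the app over the socket, bypassing Caddy
curl --unix-socket /sockets/app.sock http://localhost/

# repeat as the caddy user to check its access specifically
sudo -u caddy curl --unix-socket /sockets/app.sock http://localhost/

If the first command works but the second fails, the socket is healthy and the problem is Caddy's access to it.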

Same host (no containers): Run Caddy and your app as the same user, or add the Caddy user to your app's group:

usermod -aG myapp caddy
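
Group membership only applies to newly started processes, so restart Caddy after the change (the service name assumes Caddy was installed from the official packages):

# verify the caddy user now belongs to the app's group
id caddy

# restart so the running Caddy process picks up the new group
systemctl restart caddy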

Containers (Docker/Podman): Mount a shared directory and make sure the container process can write to it. Stario creates sockets with 0o666 permissions, so the main concern is the directory itself:

# Create a shared socket directory
mkdir -p /sockets
chmod 775 /sockets

# Podman: use :z for SELinux relabeling
podman run -v /sockets:/sockets:z myapp

If Caddy runs on the host and your app runs in a container, ensure the host user running Caddy can read the socket file the container creates. With rootless Podman, the container's UID maps to a different UID on the host, so you may need to match UIDs (as shown below) or use a shared group.
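
One way to match UIDs with rootless Podman is --userns=keep-id, which keeps your host UID inside the container. A sketch, reusing the image name and mount from the command above:

# keep the host user's UID/GID inside the container so the socket it creates
# is owned by the same host user that runs Caddy
podman run --userns=keep-id -v /sockets:/sockets:z myapp

# inspect the user-namespace mapping if ownership still looks wrong
podman unshare cat /proc/self/uid_map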

Production-Ready Caddyfile

A complete Caddyfile with global options, timeouts, admin API, and Unix socket proxying:

# Reusable snippet for socket-based apps
(sock) {
    reverse_proxy unix/{args[0]}
}

{
    email you@example.com

    log {
        output stdout
        format json
        level INFO
    }

    # Admin API on Unix socket (more secure than TCP)
    admin unix//sockets/caddy-admin.sock

    servers {
        timeouts {
            read_body   30s
            read_header 10s
            write       30s
            idle        2m
        }
        max_header_size 32kb
    }
}

# Your Stario app
example.com {
    import sock /sockets/myapp.sock
}

On the app side:

await app.serve(unix_socket="/sockets/myapp.sock", workers=4)

Caddy handles TLS (automatic Let's Encrypt), HTTP/2, and SSE streaming out of the box - no Nginx-style proxy_buffering off workaround or other special configuration is needed.

Hot-Reloading Configuration

Caddy supports reloading its config without downtime via the admin API. With the admin socket configured above:

curl --unix-socket /sockets/caddy-admin.sock \
    -X POST \
    -H "Content-Type: text/caddyfile" \
    --data-binary @Caddyfile \
    "http://localhost/load"

This lets you update routing (e.g., after deploying a new service) without restarting Caddy or dropping connections.
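
The caddy reload subcommand wraps this same API call; with the admin endpoint on a Unix socket, point it there explicitly (paths as configured above):

# adapt and load the Caddyfile through the admin socket
caddy reload --config Caddyfile --address unix//sockets/caddy-admin.sock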

Why Caddy over Nginx?

                 Caddy                         Nginx
TLS              Automatic (Let's Encrypt)     Manual cert management
Config           Simple Caddyfile              Verbose config blocks
SSE/Streaming    Works by default              Requires proxy_buffering off
Hot reload       Admin API, zero downtime      nginx -s reload
Unix sockets     Native unix/ prefix           proxy_pass http://unix:...