
Overview

Reminix agents are standard HTTP servers. Deploy them anywhere you can run Python or Node.js applications.
Looking for the quickest path to production? Reminix Cloud handles infrastructure, scaling, and monitoring for you: just push your code and go. This guide covers self-hosting for teams that prefer to run on their own infrastructure.

Deployment Options

Reminix offers two ways to deploy your agents:
| Feature | Self-Hosting | Reminix Cloud |
| --- | --- | --- |
| Setup | Deploy anywhere | Push to GitHub |
| Infrastructure | You manage | Fully managed |
| Scaling | Manual | Automatic |
| Monitoring | Your own tools | Built-in dashboard |
| Cost | Infrastructure costs | Usage-based |
| Best for | Full control, compliance needs | Fast deployment, zero DevOps |
Both options use the same open source Reminix Runtime, so you can switch between them anytime. No vendor lock-in.

Docker

Create a Dockerfile for your agent:
FROM python:3.11-slim

WORKDIR /app

COPY pyproject.toml .
RUN pip install .

COPY main.py .

EXPOSE 8080
CMD ["python", "main.py"]
Build and run the image:
docker build -t my-agent .
docker run -p 8080:8080 -e OPENAI_API_KEY=sk-... my-agent
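For local development or single-host deployments, the same image also runs under Docker Compose. A minimal sketch (the service name and environment passthrough are illustrative, not a fixed convention):

```yaml
# docker-compose.yml
services:
  my-agent:
    build: .
    ports:
      - "8080:8080"
    environment:
      - PORT=8080
      - OPENAI_API_KEY=${OPENAI_API_KEY}
```

Run `docker compose up`; the API key is read from your shell environment rather than baked into the image.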

Kubernetes

Define a Deployment and a Service:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-agent
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-agent
  template:
    metadata:
      labels:
        app: my-agent
    spec:
      containers:
      - name: my-agent
        image: my-agent:latest
        ports:
        - containerPort: 8080
        env:
        - name: PORT
          value: "8080"
        - name: OPENAI_API_KEY
          valueFrom:
            secretKeyRef:
              name: agent-secrets
              key: openai-api-key
        livenessProbe:
          httpGet:
            path: /health
            port: 8080
          initialDelaySeconds: 10
          periodSeconds: 30
        readinessProbe:
          httpGet:
            path: /health
            port: 8080
          initialDelaySeconds: 5
          periodSeconds: 10
---
apiVersion: v1
kind: Service
metadata:
  name: my-agent
spec:
  selector:
    app: my-agent
  ports:
  - port: 80
    targetPort: 8080
  type: LoadBalancer
Create the secret:
kubectl create secret generic agent-secrets \
  --from-literal=openai-api-key=sk-...
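Scaling is manual when self-hosting, but on Kubernetes you can approximate autoscaling with a HorizontalPodAutoscaler targeting the Deployment above. A sketch (the replica bounds and CPU threshold are illustrative):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-agent
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-agent
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70
```

This requires the metrics server to be installed in the cluster and CPU requests set on the container.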

Multiple Agents

You can serve multiple agents from a single deployment. This is useful for related agents that share dependencies.

Docker with Multiple Agents

# main.py
from reminix_runtime import agent, serve
import os

@agent
async def summarizer(text: str) -> str:
    """Summarize text."""
    return f"Summary: {text[:100]}..."

@agent
async def translator(text: str, target: str = "es") -> str:
    """Translate text."""
    return f"Translated to {target}: {text}"

if __name__ == "__main__":
    port = int(os.environ.get("PORT", 8080))
    serve(agents=[summarizer, translator], port=port)
Build the image as in the Docker section above, then run it:
docker run -p 8080:8080 my-multi-agent

Kubernetes with Multiple Agents

The same deployment serves all agents. Call specific agents by name:
# Call the summarizer agent
curl -X POST http://my-agent/agents/summarizer/invoke \
  -H "Content-Type: application/json" \
  -d '{"text": "Long document..."}'

# Call the translator agent
curl -X POST http://my-agent/agents/translator/invoke \
  -H "Content-Type: application/json" \
  -d '{"text": "Hello", "target": "es"}'
Use the /info endpoint to discover available agents:
curl http://my-agent/info
# {"agents": [{"name": "summarizer", ...}, {"name": "translator", ...}]}
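Client-side, the /info response can drive discovery. A minimal sketch that extracts agent names from the JSON shape shown above (the schema beyond `agents[].name` is not specified here, so only that field is used):

```python
import json

def agent_names(info_json: str) -> list[str]:
    """Extract agent names from a /info response body."""
    return [a["name"] for a in json.loads(info_json)["agents"]]

# Example response body, trimmed to the field used above
body = '{"agents": [{"name": "summarizer"}, {"name": "translator"}]}'
print(agent_names(body))  # ['summarizer', 'translator']
```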
For detailed multi-agent patterns and best practices, see Multiple Agents.

Serverless

For serverless deployments, use toHandler() (TypeScript) or to_asgi() (Python) to get a handler compatible with edge/serverless platforms.

TypeScript

// Vercel Edge Function (app/api/agent/route.ts)
import { agent } from '@reminix/runtime';

const myAgent = agent('my-agent', {
  description: 'A serverless agent',
  handler: async ({ prompt }) => `Completed: ${prompt}`
});

// Export the handler
export const POST = myAgent.toHandler();
export const GET = myAgent.toHandler();
Compatible platforms:
  • Vercel Edge Functions - Export the handler directly
  • Cloudflare Workers - export default { fetch: agent.toHandler() }
  • Deno Deploy - Deno.serve(agent.toHandler())
  • Bun - Bun.serve({ fetch: agent.toHandler() })

Python

# AWS Lambda with Mangum
from mangum import Mangum
from reminix_runtime import agent

@agent
async def my_agent(prompt: str) -> str:
    """A serverless agent."""
    return f"Completed: {prompt}"

# Lambda handler
handler = Mangum(my_agent.to_asgi())
Compatible platforms:
  • AWS Lambda - Use Mangum adapter
  • GCP Cloud Functions - Use functions-framework with ASGI
  • Any ASGI server - uvicorn, hypercorn, daphne

Cloud Platforms

Railway

railway login
railway init
railway up
Set environment variables in the Railway dashboard.

Render

  1. Connect your GitHub repository
  2. Set environment variables in the dashboard
  3. Render auto-deploys on push

Fly.io

fly launch
fly secrets set OPENAI_API_KEY=sk-...
fly deploy

AWS

App Runner (easiest):
  1. Create App Runner service
  2. Connect to your container registry or GitHub
  3. Set environment variables in configuration
ECS/Fargate:
  1. Push image to ECR
  2. Create task definition with environment variables
  3. Create ECS service
Lambda (container image):
  1. Push image to ECR
  2. Create Lambda function from container image
  3. Set environment variables in configuration

Google Cloud

Cloud Run (recommended):
gcloud run deploy my-agent \
  --image gcr.io/PROJECT/my-agent \
  --platform managed \
  --set-env-vars OPENAI_API_KEY=sk-...
GKE: Use the Kubernetes configuration above.

Azure

Container Apps:
az containerapp create \
  --name my-agent \
  --resource-group mygroup \
  --image my-agent:latest \
  --target-port 8080 \
  --env-vars OPENAI_API_KEY=sk-...

Environment Variables

Configure your agent using environment variables:
| Variable | Description |
| --- | --- |
| PORT | Server port (default: 8080) |
| OPENAI_API_KEY | OpenAI API key |
| ANTHROPIC_API_KEY | Anthropic API key |
| DATABASE_URL | Database connection string |
Read them at startup:
import os

port = int(os.environ.get("PORT", 8080))
api_key = os.environ.get("OPENAI_API_KEY")

Health Checks

The runtime exposes /health automatically. Configure your platform to use it:
curl http://localhost:8080/health
# {"status": "ok"}
Most platforms support health check configuration:
  • Kubernetes: livenessProbe and readinessProbe
  • Docker Compose: healthcheck
  • AWS ECS: Health check in task definition
  • Cloud Run: Automatic via /health
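A deploy script can also poll /health itself before routing traffic. A self-contained sketch (the helper is ours, not part of the runtime):

```python
import time
import urllib.request

def wait_healthy(url: str, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Poll a health endpoint until it returns HTTP 200 or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            pass  # server not up yet; keep polling
        time.sleep(interval)
    return False
```

For example, `wait_healthy("http://localhost:8080/health")` blocks until the agent reports ready or 30 seconds pass.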

Connecting to Self-Hosted Agents

Point the SDK to your self-hosted agent:
from reminix import Reminix

client = Reminix(base_url="https://my-agent.example.com")
response = client.agents.invoke("my-agent", prompt="Hello!")
print(response["content"])
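Self-hosted endpoints can be briefly unavailable during restarts or deploys, so invocations are often wrapped in a retry. A generic sketch (the backoff numbers are illustrative, and `invoke_with_retry` is ours, not part of the SDK):

```python
import time

def invoke_with_retry(call, attempts: int = 3, base_delay: float = 0.5):
    """Run `call` with exponential backoff, re-raising the final error."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Usage with the client configured above:
# response = invoke_with_retry(
#     lambda: client.agents.invoke("my-agent", prompt="Hello!")
# )
```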

Next Steps