Overview
reminix-anthropic provides a simple wrapper for the Anthropic API. This is a model-only adapter for basic chat completions without tool calling.
Installation
pip install reminix-anthropic
This will also install reminix-runtime as a dependency.
Quick Start
Wrap an Anthropic client and serve it:
from anthropic import AsyncAnthropic
from reminix_anthropic import serve_agent
client = AsyncAnthropic()  # reads ANTHROPIC_API_KEY from the environment by default

if __name__ == "__main__":
    serve_agent(client, name="claude-agent", model="claude-sonnet-4-20250514", port=8080)
Configuration
Model options are set when wrapping the client (client is the AsyncAnthropic instance from the Quick Start):

from reminix_anthropic import wrap_agent

agent = wrap_agent(
    client,
    name="my-agent",
    model="claude-sonnet-4-20250514",  # Model to use
    max_tokens=4096,                   # Maximum response tokens
)
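To run a configured agent, it can be handed to the runtime's serve helper (the same function used in the multi-agent example below); a minimal sketch:

from reminix_runtime import serve

if __name__ == "__main__":
    serve(agents=[agent], port=8080)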
Multiple Agents
For multi-agent projects, use wrap_agent + serve instead of serve_agent:
from anthropic import AsyncAnthropic
from reminix_anthropic import wrap_agent
from reminix_runtime import serve

client = AsyncAnthropic()

sonnet = wrap_agent(client, name="sonnet-agent", model="claude-sonnet-4-20250514")
haiku = wrap_agent(client, name="haiku-agent", model="claude-3-5-haiku-20241022")

serve(agents=[sonnet, haiku], port=8080)
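Each agent is then addressable by its name. For example, with the Reminix client used in the Usage section below (the prompt here is only illustrative):

from reminix import Reminix

reminix_client = Reminix()

# Route the request to the Haiku-backed agent by name
response = reminix_client.agents.invoke("haiku-agent", prompt="Summarize Python in one line")
print(response.output)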
Usage
With Prompt
Single request/response with a prompt:
from reminix import Reminix

client = Reminix()

response = client.agents.invoke(
    "claude-agent",
    prompt="Explain quantum computing in one sentence",
)
print(response.output)
With Messages
For conversations with message history:
response = client.agents.invoke(
    "claude-agent",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is Python?"},
        {"role": "assistant", "content": "Python is a programming language."},
        {"role": "user", "content": "What are its main uses?"},
    ],
)
print(response.output)
System messages are automatically extracted and passed to Claude’s system parameter.
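Conceptually, this split mirrors how you would call the Anthropic SDK directly; a simplified sketch (not the adapter's actual implementation):

from anthropic import Anthropic

anthropic_client = Anthropic()

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is Python?"},
]

# system-role entries become Claude's `system` parameter; the rest stay as chat messages
system_text = "\n".join(m["content"] for m in messages if m["role"] == "system")
chat_messages = [m for m in messages if m["role"] != "system"]

response = anthropic_client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    system=system_text,
    messages=chat_messages,
)
print(response.content[0].text)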
Streaming
For real-time streaming responses:
for chunk in client.agents.invoke(
    "claude-agent",
    prompt="Tell me a story",
    stream=True,
):
    print(chunk, end="", flush=True)
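Under the hood this corresponds to Anthropic's streaming API; a rough sketch of the equivalent direct SDK call (assuming the adapter forwards text deltas as chunks):

import asyncio
from anthropic import AsyncAnthropic

async def main() -> None:
    anthropic_client = AsyncAnthropic()
    # Stream text deltas as Claude generates them
    async with anthropic_client.messages.stream(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Tell me a story"}],
    ) as stream:
        async for text in stream.text_stream:
            print(text, end="", flush=True)

asyncio.run(main())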
When to Use This Adapter
Use reminix-anthropic when:
- You need simple chat completions
- You do not need tool calling
- Direct model access is sufficient
Use LangChain/LangGraph instead when:
- You need tool calling
- You need agents with memory
- You need complex workflows
Deployment
See the Deployment guide for deploying to Reminix, or Self-Hosting for deploying on your own infrastructure.
Set your Anthropic API key in Project Settings → Secrets.
Next Steps