Overview
@reminix/anthropic provides a simple wrapper around the Anthropic API. It is a model-only adapter for basic chat completions and does not support tool calling.
Installation
npm install @reminix/anthropic
This will also install @reminix/runtime as a dependency.
Quick Start
Wrap an Anthropic client and serve it:
import Anthropic from '@anthropic-ai/sdk';
import { serveAgent } from '@reminix/anthropic';
const client = new Anthropic();
serveAgent(client, { name: 'claude-agent', model: 'claude-sonnet-4-20250514', port: 8080 });
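By default, new Anthropic() reads your API key from the ANTHROPIC_API_KEY environment variable; passing it explicitly is equivalent:
import Anthropic from '@anthropic-ai/sdk';
import { serveAgent } from '@reminix/anthropic';

// Equivalent to new Anthropic() with ANTHROPIC_API_KEY set in the environment.
const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
serveAgent(client, { name: 'claude-agent', model: 'claude-sonnet-4-20250514', port: 8080 });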
Configuration
Options are passed alongside the Anthropic client when wrapping it:
import { wrapAgent } from '@reminix/anthropic';

const agent = wrapAgent(client, {
  name: 'my-agent',                  // Agent name, used when invoking it
  model: 'claude-sonnet-4-20250514', // Model to use
  maxTokens: 4096                    // Maximum response tokens
});
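A wrapped agent is then served with serve from @reminix/runtime. A minimal single-agent sketch, assuming an agents array with one entry works the same way as the multi-agent example below:
import { serve } from '@reminix/runtime';

// Assumption: serve accepts a single-element agents array, mirroring the
// multi-agent example in the next section.
serve({ agents: [agent], port: 8080 });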
Multiple Agents
For multi-agent projects, use wrapAgent + serve instead of serveAgent:
import { wrapAgent } from '@reminix/anthropic';
import { serve } from '@reminix/runtime';
const sonnet = wrapAgent(client, { name: 'sonnet-agent', model: 'claude-sonnet-4-20250514' });
const haiku = wrapAgent(client, { name: 'haiku-agent', model: 'claude-3-5-haiku-20241022' });
serve({ agents: [sonnet, haiku], port: 8080 });
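Each agent is addressed by its registered name when invoked; for example, with the Reminix SDK shown in the Usage section below:
import Reminix from '@reminix/sdk';

const reminix = new Reminix();
// The first argument selects which registered agent handles the request.
const summary = await reminix.agents.invoke('haiku-agent', { prompt: 'Summarize TypeScript in one sentence' });
console.log(summary.output);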
Usage
With Prompt
Single request/response with a prompt:
import Reminix from '@reminix/sdk';

const client = new Reminix();
const response = await client.agents.invoke('claude-agent', {
  prompt: 'Explain quantum computing in one sentence'
});
console.log(response.output);
With Messages
For conversations with message history:
const response = await client.agents.invoke('claude-agent', {
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'What is TypeScript?' },
    { role: 'assistant', content: 'TypeScript is a typed superset of JavaScript.' },
    { role: 'user', content: 'What are its main benefits?' }
  ]
});
console.log(response.output);
System messages are automatically extracted and passed to Claude’s system parameter.
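Under the hood this maps onto the Anthropic Messages API, which takes the system prompt as a top-level system field rather than as a message. A simplified, illustrative sketch, not the adapter's actual implementation:
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic();
// The adapter performs this split for you: system messages become the `system`
// parameter, and the remaining user/assistant turns go in `messages`.
const completion = await anthropic.messages.create({
  model: 'claude-sonnet-4-20250514',
  max_tokens: 4096,
  system: 'You are a helpful assistant.',
  messages: [
    { role: 'user', content: 'What is TypeScript?' },
    { role: 'assistant', content: 'TypeScript is a typed superset of JavaScript.' },
    { role: 'user', content: 'What are its main benefits?' }
  ]
});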
Streaming
For real-time streaming responses:
const stream = await client.agents.invoke('claude-agent', {
  prompt: 'Tell me a story',
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
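If you also need the complete text after streaming finishes, accumulate the chunks as you print them (assuming each chunk is a plain text fragment, as in the loop above):
let fullText = '';
for await (const chunk of stream) {
  process.stdout.write(chunk); // print as it arrives
  fullText += chunk;           // and keep the full response for later use
}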
When to Use This Adapter
Use @reminix/anthropic when:
- You need simple chat completions
- You don't need tool calling
- Direct model access is sufficient
Use LangChain/LangGraph/Vercel AI instead when:
- You need tool calling
- You need agents with memory
- You need complex workflows
Deployment
See the Deployment guide for deploying to Reminix, or Self-Hosting for deploying on your own infrastructure.
Set your Anthropic API key in Project Settings → Secrets.
Next Steps