## Overview

`@reminix/langchain` lets you wrap existing LangChain.js agents and deploy them to Reminix with minimal changes.
## Installation

```bash
npm install @reminix/langchain
```

This will also install `@reminix/runtime` as a dependency.
## Quick Start

Wrap your LangChain agent and serve it:

```typescript
import { ChatOpenAI } from '@langchain/openai';
import { createOpenAIFunctionsAgent, AgentExecutor } from 'langchain/agents';
import { tool } from '@langchain/core/tools';
import { ChatPromptTemplate } from '@langchain/core/prompts';
import { z } from 'zod';
import { serveAgent } from '@reminix/langchain';

// Your existing LangChain setup
const searchTool = tool(
  async ({ query }) => `Results for: ${query}`,
  {
    name: 'search',
    description: 'Search for information',
    schema: z.object({ query: z.string() })
  }
);

const llm = new ChatOpenAI({ model: 'gpt-4o' });

const prompt = ChatPromptTemplate.fromMessages([
  ['system', 'You are a helpful assistant.'],
  ['human', '{input}'],
  ['placeholder', '{agent_scratchpad}'],
]);

const agent = await createOpenAIFunctionsAgent({ llm, tools: [searchTool], prompt });
const executor = new AgentExecutor({ agent, tools: [searchTool] });

// Wrap and serve
serveAgent(executor, { name: 'search-agent', port: 8080 });
```
## Wrapping Different Agent Types

### OpenAI Functions Agent

```typescript
import { createOpenAIFunctionsAgent, AgentExecutor } from 'langchain/agents';
import { wrapAgent } from '@reminix/langchain';

const agent = await createOpenAIFunctionsAgent({ llm, tools, prompt });
const executor = new AgentExecutor({ agent, tools });
const reminixAgent = wrapAgent(executor, 'my-agent');
```

### ReAct Agent

```typescript
import { createReactAgent, AgentExecutor } from 'langchain/agents';
import { wrapAgent } from '@reminix/langchain';

const agent = await createReactAgent({ llm, tools, prompt });
const executor = new AgentExecutor({ agent, tools });
const reminixAgent = wrapAgent(executor, 'react-agent');
```

### Structured Chat Agent

```typescript
import { createStructuredChatAgent, AgentExecutor } from 'langchain/agents';
import { wrapAgent } from '@reminix/langchain';

const agent = await createStructuredChatAgent({ llm, tools, prompt });
const executor = new AgentExecutor({ agent, tools });
const reminixAgent = wrapAgent(executor, 'chat-agent');
```
## Configuration

```typescript
const reminixAgent = wrapAgent(executor, {
  name: 'my-agent',
  description: 'What this agent does',

  // Map LangChain input/output
  inputKey: 'input',   // LangChain input key
  outputKey: 'output', // LangChain output key

  // Streaming support
  streaming: true
});
```
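The `inputKey`/`outputKey` options tell the wrapper which keys to read from and write to on the LangChain executor. As a rough sketch of that mapping in plain TypeScript (the wrapper's actual internals aren't part of this guide, so `toLangChainInput` and `fromLangChainOutput` are illustrative names, not real API):

```typescript
// Illustrative sketch of the inputKey/outputKey mapping — assumed behavior,
// not the actual @reminix/langchain implementation.
type MappingConfig = { inputKey: string; outputKey: string };

// Translate a Reminix-style payload into the shape the executor expects.
function toLangChainInput(payload: { input: string }, cfg: MappingConfig) {
  return { [cfg.inputKey]: payload.input };
}

// Pull the executor's result field back out under the standard `output` key.
function fromLangChainOutput(result: Record<string, unknown>, cfg: MappingConfig) {
  return { output: result[cfg.outputKey] };
}

const cfg: MappingConfig = { inputKey: 'input', outputKey: 'output' };
console.log(toLangChainInput({ input: 'hello' }, cfg)); // { input: 'hello' }
```

If your executor uses non-default keys (say, `question` and `text`), setting `inputKey: 'question'` and `outputKey: 'text'` would route the same payload accordingly.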
## Multiple Agents

For multi-agent projects, use `wrapAgent` together with `serve` from `@reminix/runtime` instead of `serveAgent`:

```typescript
import { wrapAgent } from '@reminix/langchain';
import { serve } from '@reminix/runtime';

const research = wrapAgent(researchExecutor, 'research-agent');
const writer = wrapAgent(writingExecutor, 'writing-agent');

serve({ agents: [research, writer], port: 8080 });
```
## With Memory

LangChain agents with memory work too:

```typescript
import { BufferMemory } from 'langchain/memory';
import { wrapAgent } from '@reminix/langchain';

const memory = new BufferMemory({ returnMessages: true });
const executor = new AgentExecutor({ agent, tools, memory });
const reminixAgent = wrapAgent(executor, 'chat-agent');
```

Memory persists only within a session. For cross-request memory, implement external storage.
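As a minimal sketch of what such external storage could look like (plain TypeScript, with an in-process `Map` standing in for a real database; `SessionStore` is an illustrative name, not part of `@reminix/langchain` or LangChain.js):

```typescript
// Minimal session-keyed message store — illustrative only. A production
// version would persist to Redis or a database instead of a Map.
type StoredMessage = { role: 'user' | 'assistant'; content: string };

class SessionStore {
  private sessions = new Map<string, StoredMessage[]>();

  // Append one message to a session's history, creating it if needed.
  append(sessionId: string, message: StoredMessage): void {
    const history = this.sessions.get(sessionId) ?? [];
    history.push(message);
    this.sessions.set(sessionId, history);
  }

  // Return the full history for a session (empty if unknown).
  history(sessionId: string): StoredMessage[] {
    return this.sessions.get(sessionId) ?? [];
  }
}

const store = new SessionStore();
store.append('session-1', { role: 'user', content: 'Search for TypeScript tutorials' });
store.append('session-1', { role: 'assistant', content: 'Here are some tutorials...' });
console.log(store.history('session-1').length); // 2
```

A persisted history like this can then be replayed to the agent as the `messages` array shown under Usage.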
## Usage

Once deployed, call your agent using `invoke`. See Agents for detailed guidance.

For task-oriented operations:

```typescript
import Reminix from '@reminix/sdk';

const client = new Reminix();

const response = await client.agents.invoke('search-agent', {
  input: 'Search for TypeScript tutorials'
});

console.log(response.output);
```
### With Messages

For conversations with message history:

```typescript
const response = await client.agents.invoke('search-agent', {
  messages: [
    { role: 'user', content: 'Search for TypeScript tutorials' },
    { role: 'assistant', content: 'Here are some tutorials...' },
    { role: 'user', content: 'Which one is best for beginners?' }
  ]
});

console.log(response.output);
```
### Streaming

For real-time streaming responses:

```typescript
const stream = await client.agents.invoke('search-agent', {
  input: 'Tell me about TypeScript',
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
```
## Deployment

See the Deployment guide for deploying to Reminix, or Self-Hosting for deploying on your own infrastructure.
## Next Steps