SDK · Updated 2026-03-18

# LangChain Adapter

Use RetainDB as persistent memory inside LangChain chains and agents.

Applies to: `@retaindb/sdk`
## Install

```bash
npm install @retaindb/sdk
```

## Drop-in memory for any LangChain chain
```typescript
import { RetainDB, createLangChainMemoryAdapter } from "@retaindb/sdk";
import { ConversationChain } from "langchain/chains";
import { ChatOpenAI } from "@langchain/openai";

const db = new RetainDB({ apiKey: process.env.RETAINDB_KEY });

const chain = new ConversationChain({
  llm: new ChatOpenAI({ model: "gpt-4o" }),
  memory: createLangChainMemoryAdapter(db, {
    userId: "user-123",
    sessionId: "chat-456",
  }),
});

await chain.call({ input: "Hi, I'm Alex" });
const response = await chain.call({ input: "What's my name?" });
// Memory carries forward across sessions automatically
```

## What the adapter does
- `loadMemoryVariables` — searches RetainDB with the actual user input as the query, so retrieved memories are relevant to what the user is asking (not a static "recent context" dump)
- `saveContext` — passes the turn through `ingestSession` for proper fact/preference extraction, not raw event strings
- `clear` — no-op by design; RetainDB memories are durable. Call `db.user(id).forget(memoryId)` for targeted deletion.
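The three methods above can be sketched as a minimal LangChain-style memory class. This is an illustrative stand-in, not the adapter's published source; `FakeStore` is a hypothetical in-memory substitute for the RetainDB client, and the keyword matching is a deliberately naive stand-in for real retrieval:

```typescript
interface MemoryInput { input: string }
interface MemoryOutput { output: string }

// Hypothetical stand-in for the RetainDB client, for illustration only.
class FakeStore {
  private facts: string[] = [];
  async search(query: string, topK: number): Promise<string[]> {
    // Naive relevance: keep facts sharing a word with the query.
    const words = query.toLowerCase().split(/\W+/);
    return this.facts
      .filter((f) => words.some((w) => w.length > 0 && f.toLowerCase().includes(w)))
      .slice(0, topK);
  }
  async ingest(text: string): Promise<void> {
    this.facts.push(text);
  }
}

class SketchMemoryAdapter {
  constructor(
    private store: FakeStore,
    private opts: { topK?: number; memoryKey?: string } = {},
  ) {}

  // Retrieval is query-driven: the user's current input is the search query.
  async loadMemoryVariables(values: MemoryInput): Promise<Record<string, string>> {
    const hits = await this.store.search(values.input, this.opts.topK ?? 10);
    return { [this.opts.memoryKey ?? "history"]: hits.join("\n") };
  }

  // Each turn is ingested for extraction rather than stored as a raw event string.
  async saveContext(input: MemoryInput, output: MemoryOutput): Promise<void> {
    await this.store.ingest(`User: ${input.input}\nAI: ${output.output}`);
  }

  // Deliberate no-op: durable memories are deleted individually instead.
  async clear(): Promise<void> {}
}
```

The key design point the sketch shows: `loadMemoryVariables` receives the chain's input values, so retrieval can be scoped to the question at hand instead of replaying a fixed window of history.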
## Options

| Option | Type | Default | Description |
|---|---|---|---|
| `userId` | `string` | required | User identity |
| `sessionId` | `string` | — | Scopes memory to a conversation |
| `topK` | `number` | `10` | Memories to retrieve per turn |
| `memoryKey` | `string` | `"history"` | LangChain memory variable name |
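The defaulting rules in the table can be made concrete with a small sketch. The option names come from the table; the `resolveOptions` helper itself is hypothetical, not part of the SDK:

```typescript
interface AdapterOptions {
  userId: string;      // required
  sessionId?: string;  // scopes memory to one conversation
  topK?: number;       // defaults to 10
  memoryKey?: string;  // defaults to "history"
}

// Hypothetical helper showing how the table's defaults would be applied.
function resolveOptions(
  opts: AdapterOptions,
): { userId: string; sessionId?: string; topK: number; memoryKey: string } {
  if (!opts.userId) throw new Error("userId is required");
  return {
    userId: opts.userId,
    sessionId: opts.sessionId,
    topK: opts.topK ?? 10,
    memoryKey: opts.memoryKey ?? "history",
  };
}
```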
## LCEL (recommended over chains)

With LangChain Expression Language, call `getContext` directly:
```typescript
import { RunnableSequence } from "@langchain/core/runnables";
import { ChatOpenAI } from "@langchain/openai";
import { RetainDB } from "@retaindb/sdk";

const db = new RetainDB({ apiKey: process.env.RETAINDB_KEY });

const chain = RunnableSequence.from([
  async (input: { userId: string; question: string }) => {
    const { context } = await db.user(input.userId).getContext(input.question);
    return { ...input, context };
  },
  (input) => `Context:\n${input.context}\n\nQuestion: ${input.question}`,
  new ChatOpenAI({ model: "gpt-4o" }),
]);

const result = await chain.invoke({ userId: "user-123", question: "How should I format my code?" });
```