Updated 2026-03-18

LangChain Adapter

Use RetainDB as persistent memory inside LangChain chains and agents.

Applies to: @retaindb/sdk

If you're calling an LLM directly, `db.user(userId).runTurn(...)` is simpler — it handles retrieve, inject, and store in one call and doesn't require LangChain at all. Use this adapter only if you're already using LangChain's chain abstractions.

Install

bash
npm install @retaindb/sdk

Drop-in memory for any LangChain chain

typescript
import { RetainDB, createLangChainMemoryAdapter } from "@retaindb/sdk";
import { ConversationChain } from "langchain/chains";
import { ChatOpenAI } from "@langchain/openai";

const db = new RetainDB({ apiKey: process.env.RETAINDB_KEY });

const chain = new ConversationChain({
  llm: new ChatOpenAI({ model: "gpt-4o" }),
  memory: createLangChainMemoryAdapter(db, {
    userId: "user-123",
    sessionId: "chat-456",
  }),
});

await chain.call({ input: "Hi, I'm Alex" });
const response = await chain.call({ input: "What's my name?" });
// Memory carries forward across sessions automatically

What the adapter does

  • loadMemoryVariables — searches RetainDB with the actual user input as the query, so retrieved memories are relevant to what the user is asking (not a static "recent context" dump)
  • saveContext — passes the turn through ingestSession for proper fact/preference extraction, not raw event strings
  • clear — no-op by design; RetainDB memories are durable. Call db.user(id).forget(memoryId) for targeted deletion.
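The three behaviors above can be sketched as a minimal TypeScript shape. This is an illustration, not the shipped adapter: the `RetainUser` interface below is inferred from the method names on this page (`getContext`, `ingestSession`), and the exact signatures are assumptions.

```typescript
// Minimal structural interface standing in for the real RetainDB client;
// method signatures here are assumptions inferred from this page.
interface RetainUser {
  getContext(query: string): Promise<{ context: string }>;
  ingestSession(turn: { input: string; output: string }): Promise<void>;
}

function createMemorySketch(user: RetainUser, memoryKey = "history") {
  return {
    // Retrieval is query-driven: the user's actual input is the search
    // query, so returned memories are relevant to the current question.
    async loadMemoryVariables(values: { input: string }) {
      const { context } = await user.getContext(values.input);
      return { [memoryKey]: context };
    },
    // Each completed turn goes through ingestSession so RetainDB can run
    // fact/preference extraction instead of storing raw event strings.
    async saveContext(input: { input: string }, output: { output: string }) {
      await user.ingestSession({ input: input.input, output: output.output });
    },
    // clear() is a deliberate no-op: memories are durable across sessions.
    async clear() {},
  };
}
```

Targeted deletion still works the same way as elsewhere in the SDK: call `db.user(id).forget(memoryId)` rather than relying on `clear`.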

Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| userId | string | (required) | User identity |
| sessionId | string | (none) | Scopes memory to a conversation |
| topK | number | 10 | Memories to retrieve per turn |
| memoryKey | string | "history" | LangChain memory variable name |
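As a quick reference, the options can be collected into a single object. The values here restate the table's defaults; the adapter call itself is commented out because it requires a live RetainDB client.

```typescript
// Options for createLangChainMemoryAdapter, restating the documented
// defaults. topK and memoryKey are optional; these are their defaults.
const memoryOptions = {
  userId: "user-123",     // required: user identity
  sessionId: "chat-456",  // optional: scope memory to one conversation
  topK: 10,               // default: memories retrieved per turn
  memoryKey: "history",   // default: LangChain memory variable name
};

// const memory = createLangChainMemoryAdapter(db, memoryOptions);
```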

With LangChain Expression Language (LCEL), skip the memory adapter and call getContext directly:

typescript
import { RunnableSequence } from "@langchain/core/runnables";
import { ChatOpenAI } from "@langchain/openai";
import { RetainDB } from "@retaindb/sdk";

const db = new RetainDB({ apiKey: process.env.RETAINDB_KEY });

const chain = RunnableSequence.from([
  async (input: { userId: string; question: string }) => {
    const { context } = await db.user(input.userId).getContext(input.question);
    return { ...input, context };
  },
  (input) => `Context:\n${input.context}\n\nQuestion: ${input.question}`,
  new ChatOpenAI({ model: "gpt-4o" }),
]);

const result = await chain.invoke({ userId: "user-123", question: "How should I format my code?" });

