
LangChain

#ai #tools

LangChain is a framework for building LLM and RAG applications. It gives you a unified interface across model providers (ChatOpenAI, Anthropic, Ollama, Bedrock...), a prompt-template system, streaming, and a composable vocabulary of chains/runnables for wiring multi-step pipelines.
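The runnable idea boils down to function composition behind a common `invoke` interface. A toy sketch of that concept — the `Runnable` interface and `runnable` helper below are illustrative, not LangChain's actual classes (the real ones add streaming, batching, config, etc.):

```typescript
// Toy sketch: anything with invoke() composes via pipe().
// Names here are illustrative, not LangChain's real API.
interface Runnable<In, Out> {
  invoke(input: In): Promise<Out>;
  pipe<Next>(next: Runnable<Out, Next>): Runnable<In, Next>;
}

function runnable<In, Out>(fn: (input: In) => Promise<Out>): Runnable<In, Out> {
  return {
    invoke: fn,
    // pipe() fuses two steps into one runnable: this step's output feeds the next.
    pipe: <Next>(next: Runnable<Out, Next>) =>
      runnable(async (input: In) => next.invoke(await fn(input))),
  };
}

// Two trivial "steps" wired into a pipeline, mimicking prompt -> model.
const template = runnable(async (vars: { text: string }) => `Translate: ${vars.text}`);
const fakeModel = runnable(async (promptStr: string) => promptStr.toUpperCase());

const chain = template.pipe(fakeModel);
// chain.invoke({ text: "hi" }) resolves to "TRANSLATE: HI"
```

Everything downstream (prompts, models, parsers) speaks this one interface, which is what makes multi-step pipelines cheap to rearrange.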

Typical TS setup:

import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";

// Reads OPENAI_API_KEY from the environment by default.
const model = new ChatOpenAI({ model: "gpt-4o-mini" });
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "Translate to {language}"],
  ["user", "{text}"],
]);

// pipe() composes the template and the model into a single runnable chain.
const chain = prompt.pipe(model);
const result = await chain.invoke({ language: "Italian", text: "hi" });

Packages: langchain, @langchain/core, @langchain/openai (plus provider packages as needed). For non-trivial agents the usual next step is LangGraph; for observability, LangSmith.
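Assuming npm, the packages above install in one line:

```shell
npm install langchain @langchain/core @langchain/openai
```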