Using Tools with LangChain

import { ChatAnthropic } from "@langchain/anthropic";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createToolCallingAgent } from "langchain/agents";
import { AgentExecutor } from "langchain/agents";
import { LangChainAdapter } from "@reacter/openapitools";

// Initialize the language model (Claude or OpenAI)
const llm = new ChatAnthropic({
  model: "claude-3-7-sonnet-20250219",
  apiKey: "your-anthropic-api-key",
  verbose: true,
});

// Initialize the adapter
const toolsAdapter = new LangChainAdapter(
  "your OpenAPI Tools Apikey - https://openapitools.com/dashboard/settings",
  {
    autoRefreshCount: 50,
  }
);

async function main() {
  // Define a prompt template with placeholders
  const prompt = ChatPromptTemplate.fromMessages([
    ["system", "You are a helpful assistant"],
    ["placeholder", "{chat_history}"],
    ["human", "{input}"],
    ["placeholder", "{agent_scratchpad}"],
  ]);

  // Get tools in LangChain format
  const tools = await toolsAdapter.getLangChainTools();

  // Create an agent with the LLM, tools, and prompt
  const agent = createToolCallingAgent({
    llm,
    tools,
    prompt,
  });

  // Create an executor to use the agent
  const agentExecutor = new AgentExecutor({
    agent,
    tools,
  });

  // Invoke the agent with a user query
  const res = await agentExecutor.invoke({
    input: "Can you generate an OTP for 9347994869?",
  });

  // Output the result
  console.log("Agent response:");
  console.log(res);
}

main().catch(console.error);

The LangChain integration provides a different approach to using tools by leveraging LangChain’s agent framework. Instead of manually handling the tool calls, the AgentExecutor takes care of the back-and-forth with the model. This is particularly useful when building applications where you want to simplify the interaction flow.
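
By default the executor only returns the final answer, but it can also report the individual tool calls it made along the way. Below is a minimal sketch that reuses the agent and tools from the example above; it assumes the langchain version you have installed supports the returnIntermediateSteps option on AgentExecutor.

// Sketch: inspect the tool calls the executor made on your behalf.
// Assumes AgentExecutor in your langchain version accepts returnIntermediateSteps.
const inspectableExecutor = new AgentExecutor({
  agent,
  tools,
  returnIntermediateSteps: true,
});

const result = await inspectableExecutor.invoke({
  input: "Can you generate an OTP for 9347994869?",
});

// Each intermediate step pairs the tool action the agent chose with the tool's output.
for (const step of result.intermediateSteps ?? []) {
  console.log("Tool:", step.action.tool);
  console.log("Input:", step.action.toolInput);
  console.log("Observation:", step.observation);
}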

Key benefits of using the LangChain integration:

  1. Simplified coding: The agent manages the conversation flow and tool execution
  2. Standardized interface: Works with different LLM providers through LangChain’s interfaces
  3. Access to LangChain features: Leverage LangChain’s memory, prompt templates, and other tools (a short chat-history sketch follows this list)
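
For example, because the prompt above declares a {chat_history} placeholder, earlier turns can be passed straight into the invocation. A minimal sketch, reusing the agentExecutor from the main example and the standard message classes from @langchain/core/messages:

import { HumanMessage, AIMessage } from "@langchain/core/messages";

// Earlier turns are injected into the {chat_history} placeholder of the prompt.
const followUp = await agentExecutor.invoke({
  input: "Can you resend that OTP to the same number?",
  chat_history: [
    new HumanMessage("Can you generate an OTP for 9347994869?"),
    new AIMessage("I've generated and sent an OTP to 9347994869."),
  ],
});

console.log(followUp.output);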

To use different models, you can swap out the LLM implementation:

// Using OpenAI
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "gpt-4o",
  apiKey: "your-openai-api-key",
  verbose: true,
});

// Rest of the code remains the same
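
In general, any LangChain chat model that supports tool calling (i.e., implements bindTools) should work here; only the model construction changes, while the adapter, agent, and executor code stays the same.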