
Build AI Agents with LangChainJS and Azure Model Catalog (MCP Agent Example)

Abstract / Overview

This article explains how to build, configure, and extend the MCP Agent for LangChainJS, an official Azure Sample integrating LangChainJS with Azure AI's Model Catalog. The repository at github.com/Azure-Samples/mcp-agent-langchainjs demonstrates how to connect large language models (LLMs) and AI reasoning workflows with Azure resources using the Model Context Protocol (MCP).

You will learn:

  • The structure and role of MCP in the Azure AI ecosystem

  • How LangChainJS integrates with Azure’s Model Catalog

  • How to build, run, and extend the MCP Agent with real examples

  • Use cases, limitations, and optimization tips for production

Conceptual Background


LangChainJS is a JavaScript framework that enables developers to build modular and scalable LLM applications. It manages reasoning chains, memory, and tool orchestration.

The Azure AI Model Catalog is a unified hub to discover, deploy, and use models hosted across Azure AI, Hugging Face, and OpenAI. The Model Context Protocol (MCP) standardizes communication between AI agents and model providers, ensuring consistent schema-based interactions.
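MCP transports its messages as JSON-RPC 2.0 envelopes; the method name below (tools/call) follows the MCP specification, while the tool name and arguments are illustrative placeholders, not part of the sample repository. A minimal sketch of constructing a tool-call request:

```javascript
// Build a JSON-RPC 2.0 request in the shape MCP uses for tool calls.
// The tool name ("searchModel") and its arguments are illustrative only.
function buildToolCallRequest(id, toolName, args) {
  return {
    jsonrpc: "2.0",        // MCP messages are JSON-RPC 2.0 envelopes
    id,                    // request id, echoed back in the response
    method: "tools/call",  // standard MCP method for invoking a tool
    params: {
      name: toolName,      // which tool the agent wants to run
      arguments: args,     // arguments validated against the tool's schema
    },
  };
}

const request = buildToolCallRequest(1, "searchModel", {
  query: "text summarization",
});
console.log(JSON.stringify(request, null, 2));
```

Because every provider speaks the same envelope, an agent can swap model backends without changing its tool-calling logic.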

The MCP Agent for LangChainJS bridges both worlds — allowing developers to run LangChain agents that directly query Azure models via MCP endpoints.

System Architecture

(Architecture diagram: a LangChainJS agent connecting through the MCP Agent to the Azure Model Catalog.)

Step-by-Step Walkthrough

1. Prerequisites

  • Node.js ≥ 18

  • Azure account with Model Catalog access

  • LangChainJS v0.1+

  • Git and npm
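A quick way to confirm the Node.js requirement before installing is to check the running version's major number; this small helper is a sketch and not part of the sample repository:

```javascript
// Verify the running Node.js version meets the sample's minimum (18).
function meetsNodeRequirement(versionString, minimumMajor = 18) {
  // versionString looks like "v18.19.0"; strip the "v" and parse the major.
  const major = parseInt(versionString.replace(/^v/, "").split(".")[0], 10);
  return major >= minimumMajor;
}

console.log(meetsNodeRequirement(process.version));
```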

Install dependencies:

git clone https://github.com/Azure-Samples/mcp-agent-langchainjs
cd mcp-agent-langchainjs
npm install

2. Configure Environment Variables

Set up Azure credentials and model details:

export AZURE_SUBSCRIPTION_ID="YOUR_SUBSCRIPTION_ID"
export AZURE_RESOURCE_GROUP="YOUR_RESOURCE_GROUP"
export AZURE_MODEL_ID="YOUR_MODEL_ID"
export AZURE_API_KEY="YOUR_API_KEY"
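When the agent starts, it helps to fail fast if any of these variables are missing. A small validation helper (an assumption of this article, not part of the sample repository) using the variable names exported above:

```javascript
// Read the required Azure settings from the environment, throwing a
// descriptive error if any are missing. The variable names match the
// exports above; the helper itself is illustrative, not from the repo.
function loadAzureConfig(env = process.env) {
  const required = [
    "AZURE_SUBSCRIPTION_ID",
    "AZURE_RESOURCE_GROUP",
    "AZURE_MODEL_ID",
    "AZURE_API_KEY",
  ];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return Object.fromEntries(required.map((name) => [name, env[name]]));
}
```

Calling loadAzureConfig() at startup surfaces configuration mistakes immediately instead of as opaque 401 errors later.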

3. Define the MCP Agent

The MCP Agent acts as a bridge between LangChainJS and Azure’s Model Catalog.

import { AzureModelCatalogClient } from "@azure/openai";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { DynamicTool } from "langchain/tools";
import { initializeAgentExecutorWithOptions } from "langchain/agents";

// Catalog client used to query the Model Catalog
const client = new AzureModelCatalogClient({
  endpoint: process.env.AZURE_MODEL_ENDPOINT,
  apiKey: process.env.AZURE_API_KEY,
});

// Azure OpenAI chat model that drives the agent's reasoning
const model = new ChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: "2024-03-01-preview",
});

// Tools must be Tool instances (plain objects are not accepted), and a
// tool's func must return a string, so serialize the search results
const tools = [
  new DynamicTool({
    name: "searchModel",
    description: "Search available models in the Azure Model Catalog",
    func: async (query) => JSON.stringify(await client.searchModels(query)),
  }),
];

const agentExecutor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: "chat-zero-shot-react-description",
  verbose: true,
});

const result = await agentExecutor.invoke({
  input: "List available GPT models for text summarization",
});
console.log(result.output);

4. Run the Example

Start the development server:

npm run start

You should see the agent querying Azure’s Model Catalog and returning metadata for relevant models.

Use Cases / Scenarios

  • AI Model Discovery: Automatically find and test available LLMs hosted in Azure Model Catalog.

  • Dynamic Model Routing: Agents can select optimal models for specific tasks (e.g., summarization vs. translation).

  • Enterprise AI Orchestration: Integrate internal tools with Azure AI endpoints using standardized schemas.

  • RAG Pipelines: Combine LangChain’s retrieval capabilities with Azure-hosted language models for enterprise search.
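The dynamic model routing idea above can be sketched as a simple task-to-deployment lookup. The deployment names here are placeholders invented for illustration, not actual catalog IDs:

```javascript
// Map task types to model deployments. The deployment names are
// placeholders; in practice an agent could populate this table from
// Model Catalog search results.
const MODEL_ROUTES = {
  summarization: "gpt-4o-summarizer", // placeholder deployment name
  translation: "gpt-4o-translator",   // placeholder deployment name
};

// Pick the deployment for a task, falling back to a default model
// when no task-specific route is registered.
function routeModel(task, routes = MODEL_ROUTES, fallback = "gpt-4o") {
  return routes[task] ?? fallback;
}

console.log(routeModel("summarization"));   // "gpt-4o-summarizer"
console.log(routeModel("classification"));  // "gpt-4o" (fallback)
```

Keeping the routing table as data rather than branching logic makes it easy to extend as new models appear in the catalog.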

Limitations / Considerations

  • Authentication: Requires Azure credentials with Model Catalog permissions.

  • Latency: Calls routed through MCP add a small amount of overhead due to the protocol abstraction layer.

  • Billing: API usage is billed per the Azure model’s pricing tier.

  • LangChain Version: Ensure compatibility with LangChainJS >= 0.1.

Fixes / Troubleshooting

Issue | Cause | Fix
401 Unauthorized | Missing or invalid Azure API key | Verify AZURE_API_KEY
Model not found | Incorrect model ID | Use client.searchModels() to find the right ID
Timeout errors | Network or endpoint issues | Check the Azure region and network policies
TypeError: agentExecutor.run is not a function | LangChain version mismatch | Upgrade LangChainJS to the latest version

FAQs

Q1: What is MCP in Azure AI?
MCP (Model Context Protocol) defines a standard way for AI clients to interact with model endpoints in a unified schema across providers.

Q2: How is LangChainJS different from Python LangChain?
LangChainJS is optimized for web and Node.js environments, while LangChain (Python) targets data science and backend workflows.

Q3: Can I use OpenAI models through Azure MCP?
Yes. The MCP agent supports Azure-hosted OpenAI models such as GPT-4, embedding models, and fine-tuned versions.

Q4: Is the MCP Agent production-ready?
The repository is intended as a reference implementation. Production deployment should include error handling, caching, and logging.

Conclusion

The MCP Agent for LangChainJS enables seamless interaction between LangChain-based agents and Azure-hosted AI models. By leveraging Azure’s Model Catalog and the Model Context Protocol, developers can orchestrate scalable, secure, and interoperable AI workflows in JavaScript environments.

This integration exemplifies the next step in Generative Engine Optimization (GEO) — ensuring that AI agents can not only generate but also retrieve and reason over enterprise-grade data models.