🔌

How Does MCP Work?

An Open Protocol Connecting AI Models with External Tools in a Standardized Way

MCP (Model Context Protocol) is an open protocol published by Anthropic that standardizes how AI models connect to external tools and data sources. Previously, integrating AI with databases, files, APIs, and other systems required a separate implementation for each pairing (the N×M problem); MCP unifies these into a single standard. The architecture follows a Host (LLM service) → Client (connection management) → Server (external system connection) structure: the Server provides Prompts (guidelines), Resources (reference data), and Tools (executable functions), while the Client provides Roots (file system access) and Sampling (requesting AI help). SDKs are available for TypeScript, Python, and other languages, and over 4,000 MCP servers are registered on hubs like Smithery.

Architecture Diagram

🤖 Host (Claude / ChatGPT etc.): the LLM decides Tool calls
   ① Tool Call Request · Roots / Sampling
🔌 MCP Client: internal connection-management component; provides Roots (file access) and Sampling (AI requests)
   ② JSON-RPC over stdio / SSE / HTTP
⚡ MCP Server: external system connection; provides Tools (function execution), Resources (data), and Prompts (guides)
   Connected backends: 🗄️ PostgreSQL · 📁 File System · 🐙 GitHub · 💬 Slack · ☁️ SaaS APIs

N×M Problem Solved: 10 AIs × 10 tools = 100 integrations, all compatible through one MCP standard
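The Server's three primitives (Tools, Resources, Prompts) can be sketched as plain data. This toy class is illustrative only; it is not the API of any official MCP SDK:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ToyMcpServer:
    # The three things an MCP Server exposes (structure illustrative):
    tools: dict[str, Callable] = field(default_factory=dict)   # executable functions
    resources: dict[str, str] = field(default_factory=dict)    # reference data, keyed by URI
    prompts: dict[str, str] = field(default_factory=dict)      # reusable guideline templates

    def list_tools(self) -> list[str]:
        """What the Server reports when the Client asks for available tools."""
        return sorted(self.tools)

    def call_tool(self, name: str, **args):
        """Execute one registered tool with the given arguments."""
        return self.tools[name](**args)

server = ToyMcpServer()
server.tools["query_db"] = lambda sql: f"rows for: {sql}"
server.resources["file:///README.md"] = "# Project docs"

print(server.list_tools())                           # ['query_db']
print(server.call_tool("query_db", sql="SELECT 1"))  # rows for: SELECT 1
```

A real server wraps the same registry in JSON-RPC handlers; the hub listings mentioned above are essentially catalogs of such registries.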
Key Points
  • Host: LLM service like Claude (user interface)
  • Client: Manages connections to MCP Server inside Host
  • Server: Independent process that exposes external systems (DB, files, API) to AI
  • Communication is JSON-RPC based; transport options are stdio, SSE, and HTTP
  • Over 4,000 MCP servers are registered on hubs like Smithery
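The JSON-RPC framing looks like this on the wire. The `tools/call` method with `name`/`arguments` params follows the MCP specification; the `query_db` tool and its result are made up for illustration:

```python
import json

# A JSON-RPC 2.0 request as MCP frames it over stdio/SSE/HTTP.
# "tools/call" with "name"/"arguments" follows the MCP spec;
# the "query_db" tool itself is hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "query_db", "arguments": {"sql": "SELECT 1"}},
}
wire = json.dumps(request)   # what actually travels over the transport

# The Server answers with a result object carrying the same id:
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "1"}]},
}

assert json.loads(wire)["method"] == "tools/call"
assert response["id"] == request["id"]
```

Matching the `id` is how the Client pairs each response with its request when multiple calls are in flight.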

How It Works

1. Host (Claude or another LLM service) initializes the MCP Client
2. Client requests a connection to the MCP Server (over stdio, SSE, HTTP, or another transport)
3. Server delivers the list of available Tools, Resources, and Prompts to the Client
4. The AI model analyzes the user request and determines which Tool calls are needed
5. Client executes the Server's Tool (DB query, file read, API call, etc.)
6. Server returns the execution result to the Client → the AI model uses the result to generate its response
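The six steps above can be walked through with a toy in-process client/server pair (no real LLM, transport, or SDK; all names are illustrative):

```python
# Toy walk-through of the flow above: a mock Client/Server pair.
class MockServer:
    def __init__(self):
        self._tools = {"read_file": lambda path: f"contents of {path}"}

    def list_tools(self):                    # step 3: advertise capabilities
        return list(self._tools)

    def call(self, name, **args):            # step 5: execute a tool
        return self._tools[name](**args)

class MockClient:
    def __init__(self, server):              # steps 1-2: init and connect
        self.server = server
        self.tools = server.list_tools()     # step 3: discover tools

    def handle(self, user_request):
        # Step 4: a real Host lets the LLM pick the tool; hard-coded here.
        if "read" in user_request and "read_file" in self.tools:
            result = self.server.call("read_file", path="notes.txt")  # step 5
            return f"The file says: {result}"    # step 6: use result in reply
        return "No tool needed."

client = MockClient(MockServer())
print(client.handle("please read my notes"))
```

The hard-coded keyword check stands in for the model's tool-selection step; everything else mirrors the protocol flow one-to-one.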

Pros

  • Solves Nร—M integration problem (standard protocol)
  • Tool developers and LLM developers can develop independently
  • Open source, available to everyone
  • Various language SDKs available (TypeScript, Python, etc.)
  • Systematic structure with Tools/Resources/Prompts

Cons

  • Still early stage (ecosystem growing)
  • Server implementation and security management needed
  • Network dependency (tools unavailable when Server is down)
  • Varying MCP support levels across AI models

Use Cases

  • Claude Desktop + PostgreSQL analysis
  • AI coding assistants (file system access)
  • Slack / GitHub / Google Drive integration
  • AI-based data pipelines
  • Internal system and LLM integration