
Connecting to LLMs

🤖 AI Summary

Integration options by difficulty: Easy - Claude Desktop (config JSON), Claude Code CLI (.mcp.json), FastMCP Client (Python). Medium - LangChain, LangGraph, CrewAI, AutoGen, MCP SDK. Hard - Anthropic API, OpenAI API (manual tool loop). For quick chat → Claude Desktop. For Python apps → FastMCP/LangChain. For multi-step agents → LangGraph. For multi-agent → CrewAI.

This MCP server can connect to any LLM that supports the Model Context Protocol or tool calling.

Feature Support by Client

Different MCP clients support different features:

| Client | Tools | Resources | Prompts | Notes |
|--------|-------|-----------|---------|-------|
| Claude Desktop | ✅ | ✅ | ❌ | No prompts support |
| Claude Code CLI | ✅ | ✅ | ✅ | Resources via `@`, prompts via `/mcp__` |
| Cursor | ✅ | ❌ | ❌ | Tools only |
| Windsurf | ✅ | ❌ | ❌ | Tools only |
| Zed | ✅ | ✅ | ❌ | No prompts |
| Continue.dev | ✅ | ✅ | ❌ | No prompts |
| VS Code + Copilot | ✅ | ⚠️ | ⚠️ | Partial support |
| LangChain | ✅ | ⚠️ | ⚠️ | Manual prompt loading |
| OpenAI API | ✅ | ❌ | ❌ | Tools via function calling |

Legend:

  • ✅ Supported
  • ⚠️ Partial/manual implementation
  • ❌ Not supported

See Prompts for available coaching prompts.

Sources: Claude Code MCP Docs, MCP Sampling Status

Native MCP Support

These clients have built-in MCP support:

| Client | Setup Difficulty | Best For |
|--------|------------------|----------|
| Claude Desktop | Easy | Interactive chat with tools |
| Claude Code CLI | Easy | Development workflows |

Agentic Frameworks

Use MCP tools with popular agent frameworks:

| Framework | Setup Difficulty | Best For |
|-----------|------------------|----------|
| LangChain | Medium | Complex agent pipelines |
| LangGraph | Medium | Stateful multi-step agents |
| CrewAI | Medium | Multi-agent collaboration |
| AutoGen | Medium | Conversational agents |

Direct API Integration

For custom implementations:

| Method | Setup Difficulty | Best For |
|--------|------------------|----------|
| FastMCP Client | Easy | Python scripts |
| MCP SDK | Medium | Custom clients |
| Anthropic API | Hard | Full control |
| OpenAI API | Hard | OpenAI models |
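
The FastMCP Client is the quickest way to call the server's tools from a plain Python script. The snippet below is a sketch: the server path and the tool name/arguments are assumptions, so use `list_tools()` to see what the server actually exposes.

```python
import asyncio

from fastmcp import Client


async def main():
    # Stdio transport is inferred from the .py path (path is an assumption)
    async with Client("server.py") as client:
        tools = await client.list_tools()
        print("Available tools:", [t.name for t in tools])

        # Hypothetical tool call -- replace with a real tool from the list above
        result = await client.call_tool("get_match_summary", {"match_id": "12345"})
        print(result)  # exact return shape depends on the FastMCP version


asyncio.run(main())
```

The Anthropic and OpenAI rows are marked Hard because you run the tool loop yourself: fetch the MCP tool schemas, pass them to the chat API, execute any requested tool calls through the MCP session, and feed the results back until the model produces a final answer. A hedged sketch against the Anthropic Messages API (model name and server path are assumptions; the OpenAI function-calling flow is analogous):

```python
import asyncio

import anthropic
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Translate MCP tool definitions into Anthropic tool schemas
            mcp_tools = await session.list_tools()
            tools = [
                {
                    "name": t.name,
                    "description": t.description or "",
                    "input_schema": t.inputSchema,
                }
                for t in mcp_tools.tools
            ]

            client = anthropic.Anthropic()
            messages = [{"role": "user", "content": "Analyse my latest match."}]

            # Manual tool loop: keep calling until the model stops requesting tools
            while True:
                response = client.messages.create(
                    model="claude-3-5-sonnet-latest",  # placeholder model name
                    max_tokens=1024,
                    tools=tools,
                    messages=messages,
                )
                if response.stop_reason != "tool_use":
                    print(response.content[0].text)
                    break

                messages.append({"role": "assistant", "content": response.content})
                tool_results = []
                for block in response.content:
                    if block.type == "tool_use":
                        # Execute the requested tool against the MCP server
                        result = await session.call_tool(block.name, block.input)
                        tool_results.append({
                            "type": "tool_result",
                            "tool_use_id": block.id,
                            # Assumes the first content block is text
                            "content": result.content[0].text,
                        })
                messages.append({"role": "user", "content": tool_results})


asyncio.run(main())
```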

Which Should I Use?

Just want to chat with match analysis? → Claude Desktop

Building a Python application? → FastMCP Client or LangChain

Need complex multi-step analysis? → LangGraph

Want multiple specialized agents? → CrewAI