Learn how to connect your custom applications and data with AI models like ChatGPT, Claude, and your own LLMs.
This repository demonstrates four ways to integrate your data with AI systems:
| Example | Description | Works With |
|---|---|---|
| `mcp-server-stdio/` | MCP Server (stdio transport) | Claude Desktop, Windsurf |
| `mcp-server-sse/` | MCP Server (SSE transport) | ChatGPT MCP Apps |
| `chatgpt-actions/` | REST API with OpenAPI | ChatGPT Custom GPTs |
| `custom-llm-app/` | Your own LLM + Tools | Databricks, OpenAI, any LLM |
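All four approaches share the same core idea: you describe your tools to the model in a machine-readable schema, and the model decides when to call them. Here is a rough sketch of such a schema in the JSON-Schema style used by MCP tool listings and OpenAI-style function calling (the tool name and fields are illustrative, not taken from this repo):

```python
import json

# Illustrative tool description in the JSON-Schema style shared by MCP
# tool listings and OpenAI-style function calling. The tool name and
# parameters are made up for this example.
tool_description = {
    "name": "get_products",
    "description": "List products, optionally filtered by category.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "category": {
                "type": "string",
                "description": "Product category, e.g. 'pizza'",
            }
        },
        "required": [],
    },
}

# Given this schema, the AI client emits a structured call like:
example_call = {"name": "get_products", "arguments": {"category": "pizza"}}

print(json.dumps(tool_description, indent=2))
```

The four examples differ mainly in *how* this description reaches the model (stdio, SSE, or an OpenAPI document), not in the shape of the idea.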
- `docs/AI-TOOLS-PRIMER.md` - Complete guide explaining all concepts
- `docs/QUICK-START.md` - Get running in 5 minutes
```bash
# Python 3.9+
python --version

# Node.js (for MCP Inspector)
node --version
```

```bash
pip install -r requirements.txt
```

```bash
# MCP Server (stdio) - test with MCP Inspector
npx @modelcontextprotocol/inspector python mcp-server-stdio/server.py

# MCP Server (SSE) - run server, then connect Inspector
python mcp-server-sse/server.py
# Open http://localhost:8001

# ChatGPT Actions - REST API
python chatgpt-actions/main.py
# Open http://localhost:8000/docs

# Custom LLM App (requires Databricks credentials)
cp .env.example .env
# Edit .env with your credentials
python custom-llm-app/app.py
# Open http://localhost:8002
```

```
ai-tools-example/
├── README.md
├── requirements.txt
├── .env.example
├── .gitignore
│
├── docs/
│   ├── AI-TOOLS-PRIMER.md     # Complete primer/guide
│   └── QUICK-START.md         # Quick start guide
│
├── mcp-server-stdio/          # MCP with stdio transport
│   ├── server.py
│   └── README.md
│
├── mcp-server-sse/            # MCP with SSE transport
│   ├── server.py
│   └── README.md
│
├── chatgpt-actions/           # REST API for ChatGPT
│   ├── main.py
│   └── README.md
│
├── custom-llm-app/            # Your own LLM app
│   ├── app.py
│   └── README.md
│
└── shared/                    # Shared data/utilities
    └── products.py
```
```
User: "Show me pizzas"
         │
         ▼
┌─────────────────────────────┐
│ AI reads tool descriptions  │
│ Decides: call get_products  │
│ with category="pizza"       │
└─────────────────────────────┘
         │
         ▼
┌─────────────────────────────┐
│ Your server executes tool   │
│ Returns: [Pizza data...]    │
└─────────────────────────────┘
         │
         ▼
┌─────────────────────────────┐
│ AI formats response         │
│ "Here are our pizzas..."    │
└─────────────────────────────┘
```
| Protocol | Purpose | Use Case |
|---|---|---|
| MCP | AI ↔ Tools | Claude, Windsurf, ChatGPT |
| OpenAPI | API Documentation | ChatGPT Actions, REST clients |
| A2A | Agent ↔ Agent | Multi-agent systems |
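ChatGPT Actions consume an OpenAPI document rather than MCP's tool listing. Frameworks like FastAPI generate this document automatically (which is why `chatgpt-actions/main.py` serves interactive docs at `/docs`), but a minimal hand-written spec for a hypothetical `GET /products` endpoint shows roughly what ChatGPT actually reads; the endpoint and schema names below are illustrative:

```python
import json

# Minimal hand-written OpenAPI 3.1 document for a hypothetical
# GET /products endpoint. ChatGPT treats operationId as the tool name
# and the parameter schemas as the tool's arguments.
openapi_spec = {
    "openapi": "3.1.0",
    "info": {"title": "Products API", "version": "1.0.0"},
    "paths": {
        "/products": {
            "get": {
                "operationId": "get_products",
                "summary": "List products, optionally filtered by category",
                "parameters": [
                    {
                        "name": "category",
                        "in": "query",
                        "required": False,
                        "schema": {"type": "string"},
                    }
                ],
                "responses": {"200": {"description": "A JSON list of products"}},
            }
        }
    },
}

print(json.dumps(openapi_spec, indent=2))
```

In practice you import a generated spec (or its URL) into the Custom GPT editor rather than writing one by hand.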
Copy `.env.example` to `.env` and fill in your credentials:

```bash
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=your-token-here
DATABRICKS_ENDPOINT=databricks-claude-3-7-sonnet
```

Use ngrok to expose your local server:

```bash
ngrok http 8000
# Use the https URL in ChatGPT
```

Contributions welcome! Please read the contributing guidelines first.
MIT License - see LICENSE file.