End-to-end examples for connecting apps & data to LLMs using MCP, OpenAPI tools, and function calling (ChatGPT, Claude, Databricks).

AI Tools Integration Examples

Learn how to connect your custom applications and data with AI models like ChatGPT, Claude, and your own LLMs.

License: MIT · Python 3.9+

🎯 What This Repo Covers

This repository demonstrates 4 different ways to integrate your data with AI systems:

| Example | Description | Works With |
| --- | --- | --- |
| `mcp-server-stdio/` | MCP Server (stdio transport) | Claude Desktop, Windsurf |
| `mcp-server-sse/` | MCP Server (SSE transport) | ChatGPT MCP Apps |
| `chatgpt-actions/` | REST API with OpenAPI | ChatGPT Custom GPTs |
| `custom-llm-app/` | Your own LLM + Tools | Databricks, OpenAI, any LLM |

📚 Documentation

- [AI Tools Primer](docs/AI-TOOLS-PRIMER.md): complete primer/guide
- [Quick Start](docs/QUICK-START.md): quick start guide

🚀 Quick Start

Prerequisites

# Python 3.9+
python --version

# Node.js (for MCP Inspector)
node --version

Install Dependencies

pip install -r requirements.txt

Run Examples

# MCP Server (stdio) - test with MCP Inspector
npx @modelcontextprotocol/inspector python mcp-server-stdio/server.py

# MCP Server (SSE) - run server, then connect Inspector
python mcp-server-sse/server.py
# Open http://localhost:8001

# ChatGPT Actions - REST API
python chatgpt-actions/main.py
# Open http://localhost:8000/docs

# Custom LLM App (requires Databricks credentials)
cp .env.example .env
# Edit .env with your credentials
python custom-llm-app/app.py
# Open http://localhost:8002

📁 Project Structure

ai-tools-example/
├── README.md
├── requirements.txt
├── .env.example
├── .gitignore
│
├── docs/
│   ├── AI-TOOLS-PRIMER.md      # Complete primer/guide
│   └── QUICK-START.md          # Quick start guide
│
├── mcp-server-stdio/           # MCP with stdio transport
│   ├── server.py
│   └── README.md
│
├── mcp-server-sse/             # MCP with SSE transport
│   ├── server.py
│   └── README.md
│
├── chatgpt-actions/            # REST API for ChatGPT
│   ├── main.py
│   └── README.md
│
├── custom-llm-app/             # Your own LLM app
│   ├── app.py
│   └── README.md
│
└── shared/                     # Shared data/utilities
    └── products.py
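
All four examples serve the same sample catalog from `shared/products.py`, which is what lets every transport (stdio, SSE, REST) expose identical tool behavior. A minimal sketch of what that module might look like; the field names and sample rows are illustrative assumptions, not the repository's actual data:

```python
# Hypothetical sketch of shared/products.py -- the fields and sample
# rows are assumptions, not the repository's actual data.
from typing import Optional

PRODUCTS = [
    {"id": 1, "name": "Margherita", "category": "pizza", "price": 9.50},
    {"id": 2, "name": "Pepperoni", "category": "pizza", "price": 11.00},
    {"id": 3, "name": "Tiramisu", "category": "dessert", "price": 6.00},
]


def get_products(category: Optional[str] = None) -> list:
    """Return all products, optionally filtered by category."""
    if category is None:
        return PRODUCTS
    return [p for p in PRODUCTS if p["category"] == category.lower()]
```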

🔑 Key Concepts

How AI Tool Calling Works

User: "Show me pizzas"
         │
         ▼
┌─────────────────────────────┐
│  AI reads tool descriptions │
│  Decides: call get_products │
│  with category="pizza"      │
└─────────────────────────────┘
         │
         ▼
┌─────────────────────────────┐
│  Your server executes tool  │
│  Returns: [Pizza data...]   │
└─────────────────────────────┘
         │
         ▼
┌─────────────────────────────┐
│  AI formats response        │
│  "Here are our pizzas..."   │
└─────────────────────────────┘
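
The three boxes above can be simulated end to end without a model or a network call. In the sketch below the AI's decision step is a hard-coded stub; in a real app it arrives as the model's function-calling response. `get_products` and its data are illustrative, not the repo's actual code:

```python
# Minimal tool-calling loop with the model's decision stubbed out.
# In production, `decide` is replaced by the LLM's function-call response.
from typing import Optional


def get_products(category: Optional[str] = None) -> list:
    catalog = [
        {"name": "Margherita", "category": "pizza"},
        {"name": "Tiramisu", "category": "dessert"},
    ]
    return [p for p in catalog if category in (None, p["category"])]


TOOLS = {"get_products": get_products}  # tool registry your server exposes


def decide(user_message: str) -> dict:
    # Stub for "AI reads tool descriptions, decides which tool to call".
    return {"tool": "get_products", "arguments": {"category": "pizza"}}


def handle(user_message: str) -> str:
    call = decide(user_message)
    result = TOOLS[call["tool"]](**call["arguments"])  # your server executes
    # Stub for "AI formats response" from the raw tool output.
    names = ", ".join(p["name"] for p in result)
    return f"Here are our pizzas: {names}"


print(handle("Show me pizzas"))  # -> Here are our pizzas: Margherita
```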

Protocols Comparison

| Protocol | Purpose | Use Case |
| --- | --- | --- |
| MCP | AI ↔ Tools | Claude, Windsurf, ChatGPT |
| OpenAPI | API Documentation | ChatGPT Actions, REST clients |
| A2A | Agent ↔ Agent | Multi-agent systems |
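
Under the hood, MCP frames everything as JSON-RPC 2.0 messages carried over the chosen transport (stdio or SSE). A hand-rolled sketch of answering a `tools/list` request; real servers should use the official SDK, and the tool metadata below is an assumption for illustration:

```python
# Sketch of the JSON-RPC 2.0 exchange underlying MCP. The stdio transport
# reads and writes one JSON object per line on stdin/stdout.
import json

TOOL_DEFS = [{
    "name": "get_products",  # assumed tool name, for illustration only
    "description": "List products, optionally filtered by category.",
    "inputSchema": {
        "type": "object",
        "properties": {"category": {"type": "string"}},
    },
}]


def handle_message(raw: str) -> str:
    req = json.loads(raw)
    if req["method"] == "tools/list":
        result = {"tools": TOOL_DEFS}
    else:
        result = {}
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})


request = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
response = json.loads(handle_message(request))
print(response["result"]["tools"][0]["name"])  # -> get_products
```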

🛠️ Configuration

For Custom LLM App (Databricks)

Copy .env.example to .env and fill in your credentials:

DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=your-token-here
DATABRICKS_ENDPOINT=databricks-claude-3-7-sonnet
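
A sketch of how `custom-llm-app/app.py` might turn those variables into a request target, assuming Databricks' standard serving-endpoint URL pattern; the helper name is made up:

```python
# Hypothetical helper: build the model-serving URL from .env values.
# URL shape follows Databricks' documented pattern:
#   {host}/serving-endpoints/{endpoint}/invocations
import os


def serving_url() -> str:
    host = os.environ["DATABRICKS_HOST"].rstrip("/")
    endpoint = os.environ["DATABRICKS_ENDPOINT"]
    return f"{host}/serving-endpoints/{endpoint}/invocations"


# Fall back to the placeholder values from .env.example if unset.
os.environ.setdefault("DATABRICKS_HOST", "https://your-workspace.cloud.databricks.com")
os.environ.setdefault("DATABRICKS_ENDPOINT", "databricks-claude-3-7-sonnet")
print(serving_url())
```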

For ChatGPT Actions (Public URL)

Use ngrok to expose your local server:

ngrok http 8000
# Use the https URL in ChatGPT
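
ChatGPT discovers what it can call by reading your API's OpenAPI schema (if the example uses FastAPI, one is generated automatically at `/openapi.json`). A trimmed, hypothetical fragment of such a schema; the path, `operationId`, and server URL are placeholders, not the repo's actual schema:

```yaml
# Hypothetical OpenAPI fragment for a ChatGPT Action (paths are assumed).
openapi: 3.1.0
info:
  title: Products API
  version: "1.0.0"
servers:
  - url: https://example.ngrok-free.app   # paste your ngrok https URL here
paths:
  /products:
    get:
      operationId: getProducts            # ChatGPT uses this as the tool name
      summary: List products, optionally filtered by category
      parameters:
        - name: category
          in: query
          required: false
          schema:
            type: string
      responses:
        "200":
          description: Matching products
```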

📖 Learn More

🤝 Contributing

Contributions welcome! Please read the contributing guidelines first.

📄 License

MIT License - see LICENSE file.
