# OpenRouter Python SDK

The OpenRouter Python SDK is a type-safe toolkit for building AI applications with access to 300+ language models through a unified API.

## Why use the OpenRouter SDK?

Integrating AI models into applications means handling different provider APIs, managing model-specific requirements, and avoiding common implementation mistakes. The OpenRouter SDK standardizes these integrations and protects you from those footguns.

```python
from openrouter import OpenRouter
import os

with OpenRouter(
    api_key=os.getenv("OPENROUTER_API_KEY")
) as client:
    response = client.chat.send(
        model="minimax/minimax-m2",
        messages=[
            {"role": "user", "content": "Explain quantum computing"}
        ]
    )
```

The SDK provides these core benefits:

### Auto-generated from API specifications

The SDK is automatically generated from OpenRouter's OpenAPI specs and updated with every API change. New models, parameters, and features appear in your IDE autocomplete immediately. No manual updates. No version drift.

```python
# When new models launch, they're available instantly
response = client.chat.send(
    model="minimax/minimax-m2",
    messages=[
        {"role": "user", "content": "Hello"}
    ]
)
```

### Type-safe by default

Every parameter, response field, and configuration option is fully typed with Python type hints and validated with Pydantic. Invalid configurations are caught at runtime with clear error messages.

```python
response = client.chat.send(
    model="minimax/minimax-m2",
    messages=[
        {"role": "user", "content": "Hello"}
        # ← Pydantic validates message structure
    ],
    temperature=0.7,  # ← Type-checked and validated
    stream=True       # ← Response type changes based on this
)
```

**Actionable error messages:**

```python
# Instead of generic errors, get specific guidance:
# "Model 'openai/o1-preview' requires at least 2 messages.
#  You provided 1 message. Add a system or user message."
```
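
A minimal sketch of surfacing those messages at runtime, reusing the `client` from the example above; it catches `Exception` broadly rather than assuming specific exception classes, which may vary by SDK version:

```python
# Sketch only: swap Exception for the SDK's specific error types
# if your installed version exposes them.
try:
    response = client.chat.send(
        model="openai/o1-preview",
        messages=[{"role": "user", "content": "Hello"}]
    )
except Exception as err:
    # The error message explains what went wrong and how to fix it
    print(f"Request failed: {err}")
```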

**Type-safe streaming:**

```python
stream = client.chat.send(
    model="minimax/minimax-m2",
    messages=[{"role": "user", "content": "Write a story"}],
    stream=True
)

for event in stream:
    # Full type information for streaming responses
    content = event.choices[0].delta.content if event.choices else None
    if content:
        print(content, end="", flush=True)
```

**Async support:**

```python
import asyncio
import os

from openrouter import OpenRouter

async def main():
    async with OpenRouter(
        api_key=os.getenv("OPENROUTER_API_KEY")
    ) as client:
        response = await client.chat.send_async(
            model="minimax/minimax-m2",
            messages=[{"role": "user", "content": "Hello"}]
        )
        print(response.choices[0].message.content)

asyncio.run(main())
```

## Installation

```bash
# Using uv (recommended)
uv add openrouter

# Using pip
pip install openrouter

# Using poetry
poetry add openrouter
```

**Requirements:** Python 3.9 or higher

Get your API key from [openrouter.ai/settings/keys](https://openrouter.ai/settings/keys).
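
The snippets in this README read the key via `os.getenv("OPENROUTER_API_KEY")`; one way to make it available is to export it in your shell (placeholder value shown):

```bash
# Make the key visible to os.getenv("OPENROUTER_API_KEY")
export OPENROUTER_API_KEY="your-api-key"
```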

## Quick start

```python
from openrouter import OpenRouter
import os

with OpenRouter(
    api_key=os.getenv("OPENROUTER_API_KEY")
) as client:
    response = client.chat.send(
        model="minimax/minimax-m2",
        messages=[
            {"role": "user", "content": "Hello!"}
        ]
    )

    print(response.choices[0].message.content)
```