
feat: add MiniMax as third LLM provider#131

Open
octo-patch wants to merge 1 commit into guangzhengli:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax AI as a first-class LLM provider alongside OpenAI and Azure OpenAI. MiniMax offers an OpenAI-compatible API with models like MiniMax-M2.7 (1M context window) and MiniMax-M2.7-highspeed.

Changes

Backend

  • Add MINIMAX enum value to ModelType and minimaxApiKey field to KeyConfiguration interface
  • Add MINIMAX_API_KEY / MINIMAX_API_MODEL environment variables in const.ts
  • Handle MiniMax configuration in configuration.ts (headers + env vars + validation)
  • Route MiniMax chat models via OpenAI-compatible basePath (https://api.minimax.io/v1) in openai.ts
  • Add MiniMax embedding support using embo-01 model in embeddings.ts
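
The backend changes above can be sketched roughly as follows. This is an illustrative sketch only — the enum value, `minimaxApiKey` field name, and basePath come from this PR description, but the surrounding type shapes and the `resolveBasePath` helper are assumptions, not the actual code:

```typescript
// Sketch of the provider routing described above (names beyond those in the
// PR description are hypothetical).
enum ModelType {
  OPENAI = "OPENAI",
  AZURE_OPENAI = "AZURE_OPENAI",
  MINIMAX = "MINIMAX", // new enum value added by this PR
}

interface KeyConfiguration {
  apiType: ModelType;
  apiKey?: string;
  minimaxApiKey?: string; // new field added by this PR
  apiModel?: string;
}

const MINIMAX_BASE_PATH = "https://api.minimax.io/v1";

// Route MiniMax chat models through the OpenAI-compatible basePath.
function resolveBasePath(config: KeyConfiguration): string {
  return config.apiType === ModelType.MINIMAX
    ? MINIMAX_BASE_PATH
    : "https://api.openai.com/v1";
}
```

Because MiniMax exposes an OpenAI-compatible API, only the basePath and API key need to change; the existing OpenAI client code can be reused as-is.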

Frontend

  • Add MiniMax tab in KeySettings.tsx with API key input and model selector (M2.7 / M2.7-highspeed)
  • Pass x-minimax-api-key header in all API calls (chat, query, embedding)
  • Update validation to recognize MiniMax API key
  • Rename settings button/title from "OpenAI" to generic "API Key Settings"
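
The header change above might look like the sketch below. The `x-minimax-api-key` header name is from this PR; the helper function and its shape are assumptions for illustration:

```typescript
// Hypothetical helper: build request headers, attaching the MiniMax key
// only when one is configured.
function buildHeaders(minimaxApiKey?: string): Record<string, string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (minimaxApiKey) {
    headers["x-minimax-api-key"] = minimaxApiKey; // forwarded to the backend provider config
  }
  return headers;
}
```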

Documentation

  • Add MiniMax to feature list in README.md
  • Add MINIMAX_API_KEY and MINIMAX_API_MODEL to doc/env-vars.md

Tests (31 total)

  • Unit tests: types.test.ts (4), configuration.test.ts (8), openai.test.ts (7), embeddings.test.ts (4)
  • Integration tests: integration.test.ts (5) — end-to-end config → model/embeddings flow
  • All tests passing
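
One kind of check the configuration tests likely cover is key validation per provider. A minimal sketch, assuming a validation helper of roughly this shape (the function name and types are hypothetical):

```typescript
// Hypothetical validation helper: a MiniMax configuration is valid only
// when a non-blank minimaxApiKey is present.
interface ProviderKeys {
  apiType: "OPENAI" | "AZURE_OPENAI" | "MINIMAX";
  apiKey?: string;
  minimaxApiKey?: string;
}

function hasValidKey(config: ProviderKeys): boolean {
  if (config.apiType === "MINIMAX") {
    return Boolean(config.minimaxApiKey?.trim());
  }
  return Boolean(config.apiKey?.trim());
}
```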

How to use

Via environment variables

```
OPENAI_TYPE=MINIMAX
MINIMAX_API_KEY=your-minimax-api-key
MINIMAX_API_MODEL=MiniMax-M2.7
```

Via UI

Click API Key Settings → select the MiniMax tab → enter your API key and choose a model.

Test plan

  • All 31 unit and integration tests pass (npx jest)
  • Manual verification: select MiniMax provider in UI, enter API key, send a chat message
  • Manual verification: upload a file and chat with it using MiniMax provider

