feat: add MiniMax as LLM provider (M2.7)#1152

Open
octo-patch wants to merge 3 commits into feder-cr:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch octo-patch commented Mar 14, 2026

Summary

Add MiniMax (https://www.minimax.io) as a supported LLM provider with the latest MiniMax-M2.7 flagship model.

Changes

  • Add MiniMaxModel class using LangChain ChatOpenAI with MiniMax OpenAI-compatible API
  • Register minimax provider in AIAdapter factory
  • Add MINIMAX constant to src/utils/constants.py
  • Update README with supported LLM providers table (MiniMax-M2.7 recommended)
  • Document minimax in config.py provider list
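The registration described in the bullets above might be sketched roughly as follows. The names `MINIMAX`, `MiniMaxModel`, and `AIAdapter` come from the PR itself, but the base URL, constant value, constructor signature, and factory shape are assumptions for illustration, not the actual implementation:

```python
# Hypothetical sketch of the provider registration described in this PR.
# The base URL, constructor, and factory internals are assumptions.

MINIMAX = "minimax"  # would live in src/utils/constants.py per the PR


class MiniMaxModel:
    """Thin wrapper that would delegate to LangChain's ChatOpenAI,
    pointed at MiniMax's OpenAI-compatible endpoint (URL assumed)."""

    BASE_URL = "https://api.minimax.io/v1"  # assumption, not from the PR

    def __init__(self, api_key: str, model: str = "MiniMax-M2.7"):
        self.api_key = api_key
        self.model = model
        # In the real change this would construct something like:
        # self.llm = ChatOpenAI(model=model, api_key=api_key,
        #                       base_url=self.BASE_URL)


class AIAdapter:
    """Factory mapping a provider string to its model class."""

    _providers = {MINIMAX: MiniMaxModel}

    @classmethod
    def create(cls, provider: str, api_key: str, model: str):
        try:
            model_cls = cls._providers[provider]
        except KeyError:
            raise ValueError(f"Unknown LLM provider: {provider}")
        return model_cls(api_key=api_key, model=model)


llm = AIAdapter.create(MINIMAX, api_key="sk-...", model="MiniMax-M2.7")
print(llm.model)  # → MiniMax-M2.7
```

The dict-based factory mirrors how multi-provider adapters commonly dispatch on a config string; the actual `AIAdapter` in the repo may use a different mechanism.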

Usage

```python
# config.py
LLM_MODEL_TYPE = 'minimax'
LLM_MODEL = 'MiniMax-M2.7'
```

Set MINIMAX_API_KEY in data_folder/secrets.yaml.
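If `secrets.yaml` uses flat key/value entries, the addition might look like the fragment below; the exact key name is an assumption derived from the `MINIMAX_API_KEY` variable mentioned above:

```yaml
# data_folder/secrets.yaml (key name assumed from MINIMAX_API_KEY)
minimax_api_key: "your-minimax-api-key-here"
```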

Why

MiniMax-M2.7 is the latest flagship model, with a 204K context window and enhanced reasoning and coding capabilities.

Add MiniMax (https://www.minimax.io) as a supported LLM provider.
MiniMax offers an OpenAI-compatible API with models like MiniMax-M2.5,
featuring a 204K context window.

Changes:
- Add MiniMaxModel class using langchain ChatOpenAI with MiniMax base URL
- Register minimax provider in AIAdapter factory
- Add MINIMAX constant
- Update README with supported LLM providers table
- Document minimax in config.py provider list
- Update recommended model from MiniMax-M2.5 to MiniMax-M2.7
- MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities
@octo-patch octo-patch changed the title feat: add MiniMax as LLM provider feat: add MiniMax as LLM provider (M2.7) Mar 18, 2026