
feat: add MiniMax as alternative LLM provider #72

Open
octo-patch wants to merge 1 commit into OpenGVLab:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Adds MiniMax as a first-class LLM provider alongside OpenAI for InternGPT's conversation agent.

Key changes:

  • New iGPT/controllers/llm_provider.py with a MiniMaxLLM class wrapping MiniMax's OpenAI-compatible chat completion API as a LangChain BaseLLM, plus a create_llm() factory for provider-agnostic LLM creation
  • Updated ConversationBot.init_agent() to accept provider/api_key parameters
  • Updated app.py with a provider dropdown selector (OpenAI / MiniMax) in the login UI
  • Auto-detection: if MINIMAX_API_KEY is set (without OPENAI_API_KEY), MiniMax is chosen automatically; or set LLM_PROVIDER=minimax explicitly
  • MiniMax-specific handling: temperature clamping (MiniMax requires >0), <think> tag stripping, both old and new openai SDK support
  • Models: MiniMax-M2.7, M2.7-highspeed, M2.5, M2.5-highspeed
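
The MiniMax-specific handling above (temperature clamping and <think> tag stripping) might be sketched roughly like this; the helper names and the clamp floor are assumptions for illustration, not the PR's actual code:

```python
import re

# Hypothetical helpers mirroring the MiniMax-specific handling described
# in the key changes; names and exact values are assumptions.
THINK_TAG_RE = re.compile(r"<think>.*?</think>", re.DOTALL)

def clamp_temperature(temperature: float, minimum: float = 0.01) -> float:
    """MiniMax rejects temperature <= 0, so clamp to a small positive floor."""
    return max(temperature, minimum)

def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> reasoning blocks from a model response."""
    return THINK_TAG_RE.sub("", text).strip()
```

Stripping the reasoning block client-side keeps the agent loop from feeding chain-of-thought text back into the conversation history.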

Files changed (7 files, 547 additions):

  • iGPT/controllers/llm_provider.py — new provider module
  • iGPT/controllers/ConversationBot.py — use create_llm() factory
  • iGPT/controllers/__init__.py — export new symbols
  • app.py — provider selector UI + MiniMax login support
  • README.md — usage docs with model table
  • tests/test_llm_provider.py — 33 unit tests
  • tests/test_llm_provider_integration.py — 3 integration tests
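
The "both old and new openai SDK support" mentioned in the key changes could be handled along these lines; the function name, base URL, and overall shape are assumptions for illustration only:

```python
def make_chat_fn(api_key, base_url="https://api.minimax.io/v1"):
    """Return a chat(model, messages, **kw) -> str function that works with
    either the new (>= 1.0) or legacy (< 1.0) openai SDK.
    The base_url default is an assumption, not taken from the PR."""
    try:
        from openai import OpenAI  # new-style SDK (>= 1.0)
        client = OpenAI(api_key=api_key, base_url=base_url)

        def chat(model, messages, **kw):
            resp = client.chat.completions.create(
                model=model, messages=messages, **kw)
            return resp.choices[0].message.content
    except ImportError:
        import openai  # legacy SDK (< 1.0)
        openai.api_key = api_key
        openai.api_base = base_url

        def chat(model, messages, **kw):
            resp = openai.ChatCompletion.create(
                model=model, messages=messages, **kw)
            return resp["choices"][0]["message"]["content"]
    return chat
```

Probing at import time rather than pinning a single SDK version keeps the provider usable in environments that still run the pre-1.0 openai package.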

Test plan

  • 33 unit tests covering MiniMaxLLM, detect_provider, create_llm factory, constants
  • 3 integration tests against live MiniMax API (basic completion, highspeed model, factory)
  • Manual testing: select MiniMax from dropdown, enter API key, verify agent conversation works
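
The auto-detection rule exercised by those unit tests could be sketched as follows; the function signature is an assumption based on the description above:

```python
import os

def detect_provider(env=None):
    """Hypothetical sketch of the auto-detection described in the summary:
    an explicit LLM_PROVIDER=minimax wins; otherwise MiniMax is chosen only
    when MINIMAX_API_KEY is set and OPENAI_API_KEY is not."""
    env = os.environ if env is None else env
    if env.get("LLM_PROVIDER", "").lower() == "minimax":
        return "minimax"
    if env.get("MINIMAX_API_KEY") and not env.get("OPENAI_API_KEY"):
        return "minimax"
    return "openai"
```

Defaulting to OpenAI whenever both keys are present is what keeps the change backward-compatible for existing deployments.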

This PR is fully backward-compatible — existing OpenAI users see no behavior change.

Commit message

Add MiniMax (M2.7, M2.7-highspeed, M2.5, M2.5-highspeed) as a first-class
LLM provider alongside OpenAI.

Changes:
- New iGPT/controllers/llm_provider.py with MiniMaxLLM class and create_llm()
  factory. Wraps MiniMax's OpenAI-compatible chat completion API as a
  LangChain BaseLLM. Includes temperature clamping (MiniMax requires >0)
  and think-tag stripping.
- Updated ConversationBot.init_agent() to accept provider/api_key params
  and use the factory instead of hardcoded OpenAI.
- Updated app.py with a provider selector dropdown in the login UI.
- Auto-detection: if MINIMAX_API_KEY is set (without OPENAI_API_KEY),
  MiniMax is chosen automatically. Or set LLM_PROVIDER=minimax explicitly.
- 33 unit tests + 3 integration tests.
- README docs with model table and usage instructions.
