feat: add MiniMax as first-class LLM provider #275

Open
octo-patch wants to merge 1 commit into FellouAI:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch octo-patch commented Mar 21, 2026

Summary

Add MiniMax as a dedicated LLM provider, allowing users to configure MiniMax models with provider: "minimax" instead of using the generic openai-compatible provider.

Changes

  • packages/eko-core/src/types/llm.types.ts: Add "minimax" to LLMprovider type union
  • packages/eko-core/src/llm/rlm.ts: Add minimax provider branch in getLLM() using createOpenAICompatible with default base URL https://api.minimax.io/v1
  • README.md: Add MiniMax example in quickstart code block
  • packages/eko-core/test/llm/minimax.test.ts: Add 8 unit tests + 3 integration tests
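A minimal sketch of the type-union and base-URL changes described above (the surrounding provider names and the helper are assumed for illustration and are not taken from the repo; the actual branch in `getLLM()` hands the resolved URL to `createOpenAICompatible` from `@ai-sdk/openai-compatible`):

```typescript
// Sketch only: union members besides "minimax" are assumed examples.
type LLMProvider = "openai" | "anthropic" | "openai-compatible" | "minimax";

interface LLMConfig {
  provider: LLMProvider;
  model: string;
  apiKey: string;
  baseURL?: string; // optional override of the provider default
}

// The default base URL this PR uses for MiniMax.
const MINIMAX_BASE_URL = "https://api.minimax.io/v1";

// Resolve the base URL the way a minimax branch in getLLM() would,
// before constructing the OpenAI-compatible client.
function resolveBaseURL(cfg: LLMConfig): string | undefined {
  if (cfg.baseURL) return cfg.baseURL;
  return cfg.provider === "minimax" ? MINIMAX_BASE_URL : undefined;
}

console.log(resolveBaseURL({ provider: "minimax", model: "MiniMax-M2.7", apiKey: "key" }));
// → https://api.minimax.io/v1
```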

Usage

```typescript
const llms: LLMs = {
  default: {
    provider: "minimax",
    model: "MiniMax-M2.7",  // or MiniMax-M2.5, MiniMax-M2.5-highspeed
    apiKey: "your-minimax-api-key"
  }
};
```

The MiniMax API is OpenAI-compatible, so this integration reuses the existing @ai-sdk/openai-compatible package and adds no new dependencies.
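Because the API is OpenAI-compatible, a raw request against the default base URL follows the standard chat-completions shape. The sketch below builds (but does not send) such a request; it is an illustration of the wire format, not code from this PR:

```typescript
// Build an OpenAI-compatible chat request against the MiniMax base URL
// used by this PR. Nothing here is MiniMax-specific except the URL.
const BASE_URL = "https://api.minimax.io/v1";

function buildChatRequest(model: string, prompt: string, apiKey: string) {
  return {
    url: `${BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

const req = buildChatRequest("MiniMax-M2.7", "Hello!", "your-minimax-api-key");
console.log(req.url); // → https://api.minimax.io/v1/chat/completions
```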

Available Models

| Model | Context Window | Best For |
| --- | --- | --- |
| MiniMax-M2.7 | 1M tokens | Latest, most capable |
| MiniMax-M2.5 | 204K tokens | Balanced performance |
| MiniMax-M2.5-highspeed | 204K tokens | Fast inference |

Test Plan

  • 8 unit tests pass (type validation, config, multi-model, multi-provider)
  • 3 integration tests included (non-streaming, streaming, tool calling)
  • No new dependencies added
  • Follows existing provider patterns (modelscope, openrouter)
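One of the config-validation unit tests might be shaped roughly like this (the validation helper and test data are assumed for illustration; the repo's actual tests live in packages/eko-core/test/llm/minimax.test.ts):

```typescript
// Hypothetical test shape: checks a multi-provider config in which
// "minimax" sits alongside another provider. Not copied from the repo.
type LLMEntry = { provider: string; model: string; apiKey: string };

function validateLLMs(llms: Record<string, LLMEntry>): boolean {
  return Object.values(llms).every(
    (e) => e.provider.length > 0 && e.model.length > 0 && e.apiKey.length > 0
  );
}

const llms = {
  default: { provider: "minimax", model: "MiniMax-M2.7", apiKey: "key-a" },
  fallback: { provider: "openrouter", model: "some-model", apiKey: "key-b" },
};
console.log(validateLLMs(llms)); // → true
```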

Add MiniMax (https://www.minimaxi.com) as a dedicated LLM provider using
the OpenAI-compatible API via @ai-sdk/openai-compatible. Users can now
configure MiniMax models (MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed)
with provider: "minimax" instead of using the generic openai-compatible
provider.

Changes:
- Add "minimax" to LLMprovider type union
- Add minimax provider branch in getLLM() with default base URL
- Add MiniMax example in README quickstart
- Add 8 unit tests + 3 integration tests
octo-patch force-pushed the feature/add-minimax-provider branch from bfea0a6 to 866264d on March 21, 2026 at 07:59.