MiniMax Releases M2 AI Model with 230B Parameters, Claims Top Global Ranking for Open-Source Intelligence
Summary
MiniMax has launched MiniMax-M2, an open-source AI model with 230 billion total parameters (10 billion active). The company claims the #1 global ranking among open-source models on intelligence benchmarks, alongside strong coding results of 69.4% on SWE-bench Verified and 46.3% on Terminal-Bench. The model is available for free through multiple platforms.
Key Points
- MiniMax releases MiniMax-M2, an open-source MoE model with 230 billion total parameters and 10 billion active parameters, designed specifically for coding and agentic workflows
- The model ranks #1 among open-source models globally on intelligence benchmarks and posts strong results on coding benchmarks such as SWE-bench Verified (69.4%) and Terminal-Bench (46.3%)
- MiniMax-M2 is now available through multiple channels, including a free API on the MiniMax Open Platform, open-source weights on Hugging Face, and deployment support via the SGLang and vLLM frameworks (a minimal serving sketch follows this list)
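
For readers who want to try the open weights locally, the sketch below shows one way to serve the model with vLLM and query it through vLLM's standard OpenAI-compatible endpoint. The Hugging Face repo ID `MiniMaxAI/MiniMax-M2`, the tensor-parallel setting, and the local port are assumptions for illustration; check the official model card for the exact serving instructions.

```python
# Minimal sketch: serve MiniMax-M2 with vLLM, then query it via the
# OpenAI-compatible API. The repo ID and serving flags below are assumptions,
# not verified details from the announcement.
#
# 1) Start the server (shell):
#    vllm serve MiniMaxAI/MiniMax-M2 --tensor-parallel-size 8
#
# 2) Query it from Python:
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's default OpenAI-compatible endpoint
    api_key="EMPTY",                      # vLLM does not require a real API key by default
)

response = client.chat.completions.create(
    model="MiniMaxAI/MiniMax-M2",  # assumed Hugging Face repo ID
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a linked list."}
    ],
)
print(response.choices[0].message.content)
```

The same client code works unchanged against any OpenAI-compatible endpoint, so switching between a local vLLM deployment and a hosted API is mostly a matter of changing `base_url` and the API key.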