Published: June 16, 2025

Day 1/5 of #MiniMaxWeek: We’re open-sourcing MiniMax-M1, our latest LLM, setting new standards in long-context reasoning.
- World’s longest context window: 1M-token input, 80k-token output
- State-of-the-art agentic use among open-source models
- RL at unmatched efficiency

[Image in tweet by MiniMax (official)]

Now integrated into MiniMax Chat → https://chat.minimax.io/
