@lmstudio
Download and run local LLMs on your computer 👾 https://t.co/e2E0DLMFJ5
DeepSeek R1 Distilled models are now available in LM Studio! 1.5B, 7B, 8B, 14B, 32B, and 70B variants. Please update to LM Studio 0.3.7 first...
🚀 Qwen2.5-1M: 1 million (!) token context model you can run locally! Comes in 7B and 14B sizes. Supported now in LM Studio: `lms get qwen2.5-1m` ...
LM Studio now supports MCP! Connect your favorite MCP servers to local LLMs, right on your computer. ...