Published: January 22, 2026

That’s correct! As I mentioned, this tutorial is about using Claude Code with local models

Thank you!

You def can. For example: glm-4.7:cloud

THAT’S the real question. You’d have to ask Anthropic first about when they’re going open source

Appreciate it, Aakash!

PSA: It works.

You couldn’t be more wrong. It supports tool calling

Image in tweet by Alvaro Cintas

Make sure to download the latest version of Ollama and Claude Code
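The setup above can be sketched as a few environment variables. This is a hedged sketch, not an official recipe: it assumes Claude Code respects `ANTHROPIC_BASE_URL` and `ANTHROPIC_MODEL`, and that a recent Ollama exposes an Anthropic-compatible endpoint on its default port (11434) — check both tools’ docs for the exact variable names in your versions.

```shell
# Point Claude Code at a local Ollama server instead of Anthropic's API.
# Variable names are assumptions from common setups; verify against the docs.
export ANTHROPIC_BASE_URL="http://localhost:11434"  # local Ollama server
export ANTHROPIC_MODEL="gpt-oss:20b"                # any model tag you've pulled
# then launch Claude Code in your project directory:
#   claude
```

With those set, Claude Code’s requests go to the local server rather than the cloud, so the model tag just has to be one Ollama can serve.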

🫡

Thanks for Ollama 🫶

A couple of things: make sure you have a context length of 32k or more, and it will also depend on the model. For example, try (if your machine can handle it) gpt-oss:20b
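The 32k+ context-length advice can be sketched as a server setting. A minimal sketch, assuming `OLLAMA_CONTEXT_LENGTH` is the server-level default-context variable (it is in recent Ollama releases, but verify for yours; older versions used a per-model `num_ctx` parameter instead):

```shell
# Raise Ollama's default context window to 32k so Claude Code's long
# prompts fit; the variable name is an assumption, check `ollama serve --help`.
export OLLAMA_CONTEXT_LENGTH=32768
# restart the server so the setting takes effect, then pull the model:
#   ollama serve
#   ollama pull gpt-oss:20b
```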

The gpt-oss:20b might be a good option
