
How to Run Any LLM in Claude Cowork and Claude Code

You can now run Claude Cowork and Claude Code directly in Claude Desktop with any LLM you want: GPT-5, GPT OSS 120B, Gemini, open-weight models through OpenRouter, a local model on your laptop, or your company's enterprise gateway (Bedrock, Vertex, Foundry). Anthropic released this with almost no announcement. No update, no blog post, just technical documentation. I found it by chance, and more than 20 hours later there is still no official statement. As Paweł Huryn (@PawelHuryn) put it: "Anthropic has discreetly rolled out third-party inference support for Cowork and Code within Claude Desktop. This should be compatible with local models or OpenRouter when using the LiteLLM proxy."
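As a rough sketch of the LiteLLM route mentioned above, the setup could look like the following. This assumes the LiteLLM proxy exposes an Anthropic-compatible endpoint locally and that Claude Code reads `ANTHROPIC_BASE_URL`, `ANTHROPIC_AUTH_TOKEN`, and `ANTHROPIC_MODEL` from the environment; the model names, port, and file name are illustrative, not taken from Anthropic's documentation.

```shell
# Sketch only: model names, port, and filenames are illustrative assumptions.

# 1. Describe the upstream model in a LiteLLM config file.
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: my-gateway-model              # name Claude Code will request
    litellm_params:
      model: openrouter/openai/gpt-oss-120b   # illustrative OpenRouter route
      api_key: os.environ/OPENROUTER_API_KEY  # read from the environment
EOF

# 2. Start the proxy locally (requires `pip install 'litellm[proxy]'`).
litellm --config litellm_config.yaml --port 4000 &

# 3. Point Claude Code at the proxy instead of api.anthropic.com.
export ANTHROPIC_BASE_URL="http://localhost:4000"
export ANTHROPIC_AUTH_TOKEN="dummy-key"       # the proxy does the real auth
export ANTHROPIC_MODEL="my-gateway-model"
claude
```

The same pattern should apply to a local model or an enterprise gateway: only the `litellm_params` block in the config changes, while Claude Code keeps talking to the proxy.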
