8/11/2025

So, you've been hearing the buzz about GLM 4.5, the massive 355-billion-parameter model from Z.ai, & you're probably already a fan of Claude Code for its slick in-terminal coding experience. The natural question is, how do you get these two powerhouses to work together, especially if you want to run the model locally using something like LMStudio?
Well, here’s the thing. It’s not exactly a plug-and-play situation right out of the box. But don't worry, it's TOTALLY doable, & I'm going to walk you through how to get it all set up. It involves a little bit of tinkering, but honestly, the payoff is pretty awesome. You get the agentic coding capabilities of Claude Code powered by a top-tier open-source model running on your own machine.
Let's dive in.

First, a Quick Rundown: The Key Players

Before we get our hands dirty, let's quickly recap what we're working with here.
  • GLM 4.5: This is the beast of a model we want to use. Developed by Z.ai, GLM-4.5 is a Mixture-of-Experts (MoE) model with some serious chops in reasoning & coding. It even has a lighter version, GLM-4.5-Air, which is more manageable for running on local hardware. The key thing to remember is that it's designed to be a powerful, flexible alternative to models like GPT-4.
  • Claude Code: If you're not already using it, you should be. It's a tool from Anthropic that brings an AI coding assistant directly into your terminal. It's fantastic for generating code, debugging, & even tackling entire projects with its agent-like abilities. The catch is that, by default, it's designed to work with Anthropic's own models.
  • LMStudio: This is the magic ingredient for running large language models (LLMs) on your local machine. It provides a super user-friendly interface for downloading, configuring, & running various open-source models. It also creates a local server that exposes the model through an API, which is EXACTLY what we need to connect it to other tools.
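To make that last point concrete, here's a minimal sketch of what talking to LMStudio's local server looks like. It assumes LMStudio's server is running on its default port (1234) and that a GLM 4.5 variant is loaded; the model name "glm-4.5-air" is a placeholder for whatever identifier your LMStudio instance actually shows.

```python
import json
import urllib.request

# LMStudio serves an OpenAI-compatible API on localhost:1234 by default.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_payload(prompt, model="glm-4.5-air"):
    """Return an OpenAI-style chat completion request body."""
    return {
        "model": model,  # placeholder; use the model name LMStudio shows
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local_model(prompt):
    """POST a prompt to the local LMStudio server and return the reply text."""
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Any tool that can speak this OpenAI-style protocol can use your local model, which is the property the rest of this setup relies on.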

The Challenge: Bridging the Gap

So, the main hurdle here is that Claude Code is built to talk to Anthropic's API, while LMStudio exposes an OpenAI-compatible API endpoint. The two don't speak the same protocol out of the box, so we need to build a bridge between them.
Turns out, the community has already come up with a couple of clever solutions for this. The most popular approach involves using a tool called claude-code-router.

The claude-code-router Method: Your New Best Friend

The claude-code-router is a nifty little tool that acts as a proxy server. It intercepts the requests from Claude Code & forwards them to the model of your choice, in our case, GLM 4.5 running in LMStudio. It essentially translates the conversation between the two.
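To give you a feel for the "translation" part, here's a simplified sketch of the kind of request mapping such a proxy performs. This is illustrative only, not the router's actual code: it converts an Anthropic Messages-style request body into an OpenAI-style one, and the target model name is an assumed placeholder.

```python
def anthropic_to_openai(body, target_model="glm-4.5-air"):
    """Map an Anthropic Messages-style request body to an OpenAI-style one.

    Illustrative sketch only; the real claude-code-router handles many
    more details (streaming, tool calls, error mapping, etc.).
    """
    messages = []
    # Anthropic's API carries the system prompt as a top-level field;
    # OpenAI-style APIs expect it as the first message in the list.
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body.get("messages", []))
    return {
        "model": target_model,  # swap in whatever model LMStudio has loaded
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }
```

The proxy sits between Claude Code and LMStudio, applying a mapping like this in one direction and the reverse on the way back.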
Here’s a step-by-step guide on how to get this set up.

Step 1: Get Your Tools in Order

First things first, you'll need to install a few things.
  1. Install Claude Code: If you haven't already, you can install it using npm.
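The usual way to do this is with a global npm install (assuming you have Node.js and npm available):

```shell
npm install -g @anthropic-ai/claude-code
```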

Copyright © Arsturn 2025