8/11/2025

So, you want to set up a local GPT-OSS agent on your M-series Mac with MCP support? Awesome. It's a game-changer for running AI locally, giving your models the power to interact with the outside world. Honestly, it's not as complicated as it might sound. Once you get it going, you have a powerful, private AI assistant that can do some seriously cool stuff.
I've been down this rabbit hole, and I'm going to walk you through the whole process, step-by-step. We'll be using LM Studio, which is a fantastic tool for this, especially on Apple Silicon.

Why Go Local with GPT-OSS & MCP?

Before we dive in, let's quickly talk about why this is such a big deal. Running a model like GPT-OSS locally means a few things:
  • Privacy: Your data stays on your machine. No sending prompts or files to the cloud. This is HUGE.
  • Cost: It's free. Once you have the hardware, you're not paying for API calls.
  • Customization: You can tinker with the models, settings, & tools to your heart's content.
But local models have traditionally been isolated. They can't browse the web, access your files, or use external tools. That's where MCP (Model Context Protocol) comes in. It's a standard that lets your local LLM connect to "tools," which can be anything from a web search to your calendar. This gives your local agent "superpowers."
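Under the hood, MCP messages are JSON-RPC 2.0: the client asks a server what tools it offers (`tools/list`) and then invokes one with `tools/call`. Here's a minimal sketch of what a tool-call request looks like on the wire; the `web_search` tool name and its arguments are made-up placeholders for illustration:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP tools/call request (MCP uses JSON-RPC 2.0 framing)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Example: invoke a hypothetical web-search tool.
msg = make_tool_call(1, "web_search", {"query": "GPT-OSS benchmarks"})
print(json.dumps(msg, indent=2))
```

You never write these messages by hand — LM Studio and the MCP servers exchange them for you — but seeing the shape helps demystify what "connecting a tool" actually means.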

What You'll Need

  • An M-series Mac: These instructions are specifically for Macs with M1, M2, M3, or M4 chips. The unified memory architecture of Apple Silicon is a big advantage here.
  • LM Studio: This is the application we'll use to download & run the GPT-OSS model. You can grab it from lmstudio.ai.
  • A GPT-OSS Model: We'll download this through LM Studio. We'll start with the 20B parameter model, as it's more manageable for most consumer hardware.
  • (Optional) Docker Desktop: If you want to run a whole suite of local tools easily, Docker is the way to go. We'll cover this in the advanced section.

Step 1: Installing LM Studio & Getting the GPT-OSS Model

First things first, let's get LM Studio up & running.
  1. Download & Install LM Studio: Head over to the LM Studio website & download the macOS version. Install it like you would any other Mac application.
  2. Find the GPT-OSS Model: Open LM Studio. You'll be greeted with a pretty intuitive interface. On the left-hand side, click on the "Discover" tab (the little magnifying glass).
  3. Download the Model: In the search bar, type "GPT-OSS". You'll see a few options. Look for the `openai/gpt-oss-20b` model. This is the 20-billion parameter version, which is a good starting point. Click the "Download" button. It's a big file, so it might take a while depending on your internet connection.

Step 2: Running the Model & Understanding the Basics

Once the download is complete, it's time to load the model & have a quick chat.
  1. Load the Model: Click on the "Chat" tab (the speech bubble icon) on the left. At the top of the screen, you'll see a dropdown menu to select a model. Choose the GPT-OSS model you just downloaded.
  2. Model Configuration: On the right-hand side, you'll see a bunch of configuration options. For now, the defaults are fine, but it's good to know they're there. One important setting is the "Context Window." LM Studio defaults to 4096 tokens, which is a decent starting point. We'll come back to why this is so important later.
  3. Start Chatting: Now, just type a message in the chatbox at the bottom & hit enter. You're now having a conversation with a powerful AI model running entirely on your Mac. Pretty cool, right?
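Beyond the chat window, LM Studio can also serve your loaded model over an OpenAI-compatible local API (started from its Developer/server view). Here's a sketch of talking to it from Python, assuming LM Studio's default port of 1234 and the model identifier we downloaded above:

```python
import json
import urllib.request

# Default LM Studio local server endpoint (assumes the server is started in LM Studio).
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt, model="openai/gpt-oss-20b", temperature=0.7):
    """Build an OpenAI-style chat completion payload for the local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def ask(prompt):
    """POST the prompt to the local server & return the model's reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        LMSTUDIO_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# ask("Say hello in five words.")  # requires the LM Studio server to be running
```

Nothing here is required for the MCP setup below, but it's handy to know your local model is scriptable like any cloud API, minus the bill.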

Step 3: Setting Up MCP Support - Giving Your Agent "Tools"

Okay, now for the fun part: giving our local agent some capabilities beyond just chatting. This is where MCP comes in. We're going to configure LM Studio to connect to MCP servers.
The magic happens in a file called `mcp.json`. This is where you tell LM Studio about the tools you want to make available to the model.
  1. Find the `mcp.json` file: In LM Studio, on the right-hand sidebar, click on the "Program" tab (it looks like a terminal prompt, `>_`). Under the "Install" section, you'll see a button that says "Edit mcp.json". Click it.
  2. Understanding the `mcp.json` structure: This will open an editor with a JSON file. It'll probably be pretty empty to start with. The basic structure looks like this:
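An illustrative example, using the `mcpServers` key format shared by MCP clients (the server name, the `@modelcontextprotocol/server-filesystem` package, & the directory path are placeholders — swap in whatever tools & paths you actually want to expose):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"]
    }
  }
}
```

Each entry under `mcpServers` tells LM Studio how to launch one tool server: the `command` to run & the `args` to pass it. Save the file, & the tools that server exposes become available to your model.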


Copyright © Arsturn 2025