Want to build your own AI assistant? One that stays on your computer, respects your privacy, and can actually do things for you?

In this practical guide, I’ll show you how to create a simple yet powerful AI assistant using a local Large Language Model (LLM) and the Model Context Protocol (MCP).

No cloud dependencies. Just your own machine, your own tools, and a little Python.

Why a local LLM?

Most AI tools today rely on the cloud: your messages travel across networks, and your data leaves your device.

With a local LLM, all processing happens on your computer. This means:

  • Your data never leaves your machine
  • No internet connection needed
  • Full control over what the AI sees and does

This works perfectly with open-source models, and it can also be used with paid AI APIs if you prefer. But for privacy and independence, we’ll go local.

Step 1: Install LM Studio

Head to https://lmstudio.ai and download LM Studio. It’s free and easy to use.

LM Studio lets you download and run local LLMs directly on your machine. No need to install complex frameworks or run servers.

Step 2: Pick a Model That Fits Your Setup

LM Studio offers hundreds of open-source models. You can find them here: https://lmstudio.ai/models

A very popular model is openai/gpt-oss-20b, a 20-billion-parameter model trained for reasoning and tool use. It’s smart, capable, and great for complex tasks.

But here’s the catch: it needs at least 12 GB of RAM.

If you don’t have that much memory, no worries! Try Qwen3-4b instead: a smaller, efficient model trained on real-world knowledge. It only needs about 2 GB of RAM and still delivers solid performance. That’s the model I’ll be using in this article.

Find Qwen3-4b in the list and click “Use Model in LM Studio” to download it directly.

Once downloaded, open LM Studio and select your model from the top bar (or press Ctrl + L to switch quickly).

Wait for it to load, then click “Create a new Chat” (or use Ctrl + N) and ask it a simple question:

“Why is the sky blue?”

If it answers correctly, you’ve successfully set up your local AI assistant and you’re now in full control of your data.

But what can these AI models actually do?

Even the smartest model won’t know, for example:

  • What tools you use in your factory
  • How to access your production schedule
  • What your customers’ urgent orders are

Without context, the AI is just a chatbot: like a smart friend who knows everything but can’t actually do anything.

That’s where tools come in.

The Model Context Protocol (MCP)

MCP stands for Model Context Protocol: a simple, standardized way for AI models to call real-world tools (like files, APIs, or internal software).

For years, AI systems could only respond to questions. Now, thanks to MCP, they can act like a real employee who runs tasks, checks files, and updates systems.

For example:

  • Google Drive MCP lets AI create files or update spreadsheets
  • Slack MCP lets AI send messages or reply to threads

In your workplace, you likely have tools that do real work from scheduling to inventory tracking. Why not let your AI use those tools?
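Under the hood, MCP messages are JSON-RPC 2.0: when the model decides to use a tool, its host sends a `tools/call` request to the MCP server. Here’s a sketch of what such a request looks like (the tool name and arguments are illustrative, borrowed from the example we’ll build below):

```python
import json

# A hypothetical MCP "tools/call" request (JSON-RPC 2.0), as an MCP host
# would send it to a server. Tool name and arguments are illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "optimize_schedule",
        "arguments": {"order_id": "X2345", "priority": "URGENT"},
    },
}
print(json.dumps(request, indent=2))
```

You never write these messages by hand; the MCP library and the host application exchange them for you.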

Step 3: Build your own MCP Server (in Python)

You don’t need to build a full AI platform. Just one simple Python script and you’re ready to go. I do assume you know how to install Python and download libraries using pip. We’ll use the FastMCP library, which makes building MCP servers easy and fast. Here’s a simple example:

from fastmcp import FastMCP

# Create the MCP server; the name is how it identifies itself to clients.
app = FastMCP("manufacturing_mcp")

@app.tool()
def optimize_schedule(order_id: str, priority: str) -> str:
    """Reschedule an order with the given priority."""
    return f"Optimized schedule for order {order_id} which has priority: {priority}."

if __name__ == "__main__":
    # Serve over HTTP so LM Studio can reach it at http://localhost:8000/mcp
    app.run(transport="streamable-http", host="0.0.0.0", port=8000)

That’s it!

What’s happening here?

  • We create a new MCP app called manufacturing_mcp
  • The @app.tool() decorator tells the AI: “This function is a tool I can call”
  • We define input parameters (order_id, priority) and what it returns
  • We start the server on http://localhost:8000/mcp

Just run this script and keep it running. Your server is now live!
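Because each tool is just a plain Python function, you can grow the server one function at a time. Here’s a sketch of a second, hypothetical tool, check_inventory, using made-up inventory data; in the server script you would register it with the same @app.tool() decorator:

```python
# Hypothetical inventory data; a real tool would query your inventory system.
INVENTORY = {"steel_rod": 120, "bearing": 8}

# In the server script, decorate this with @app.tool() to expose it to the AI.
def check_inventory(part: str, minimum: int = 10) -> str:
    """Report stock for a part and flag it if below a minimum level."""
    count = INVENTORY.get(part)
    if count is None:
        return f"Unknown part: {part}."
    status = "LOW" if count < minimum else "OK"
    return f"{part}: {count} in stock ({status})."

print(check_inventory("bearing"))
```

Keeping each tool a small, self-contained function also means you can test it directly, without the AI in the loop.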

Step 4: Connecting the MCP Server to LM Studio

Now, we need to tell LM Studio about this tool.

  1. In LM Studio, click the wrench icon (🔧) in the top-right corner
  2. Select “Program”
  3. Click “Install” then choose “Edit mcp.json”

You’ll see a default JSON file:

{
  "mcpServers": {}
}

Now, update it to include your new MCP server:

{
  "mcpServers": {
    "ManufacturingMCP": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
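If you later run more than one MCP server, each one gets its own entry under mcpServers. A sketch, with a hypothetical second server name and port:

```json
{
  "mcpServers": {
    "ManufacturingMCP": { "url": "http://localhost:8000/mcp" },
    "InventoryMCP": { "url": "http://localhost:8001/mcp" }
  }
}
```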

This tells LM Studio: “I have a tool called ManufacturingMCP that runs on my local server.”

Step 5: Start using your AI Agent

Back in the chat, click the socket icon (like a tiny plug) and enable the ManufacturingMCP tool.

Now, try asking:

“We just received an urgent order X2345. Please reschedule it with priority URGENT.”

What happens?

  • The AI asks: “Do you allow me to run this tool?”
  • You click “Proceed”
  • Optionally, check the box: “Always allow any tool from mcp/manufacturing-mcp”
    (This gives full control, useful if you trust your AI to act)

The AI will then run the optimize_schedule function and respond with a message like:

“Optimized schedule for order X2345 which has priority: URGENT.”

Your AI now acts, not just talks.

What’s Next?

You can extend this setup with even more tools:

  • Read incoming emails
  • Pull real-time order lists
  • Send notifications to team members
  • Check inventory levels
  • Trigger alerts for delays

The AI will now use these tools to gather real data and respond with accurate, actionable insights, just like a real employee.
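Tools don’t have to return plain sentences; returning structured data (like JSON) lets the model reason over it. Here’s a sketch of a hypothetical list_orders tool for the “pull real-time order lists” idea, using made-up orders:

```python
import json
from typing import Optional

# Hypothetical orders; a real tool would query your order system's API.
ORDERS = [
    {"id": "X2345", "priority": "URGENT", "due": "2025-01-15"},
    {"id": "X2346", "priority": "NORMAL", "due": "2025-01-20"},
]

# In the server script, decorate this with @app.tool() to expose it to the AI.
def list_orders(priority: Optional[str] = None) -> str:
    """Return orders as JSON, optionally filtered by priority."""
    matches = [o for o in ORDERS if priority is None or o["priority"] == priority]
    return json.dumps(matches, indent=2)

print(list_orders("URGENT"))
```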

Final Thoughts

This isn’t magic. It’s simple software + clear logic + a little Python. You don’t need access to expensive AI services.

With a local LLM and a single MCP server, you can:

  • Keep your data private
  • Automate daily tasks
  • Build an AI assistant that does real work, not just talk

And the best part? You can build this in your own time and customize it for your business.

So go ahead, try it out. Start small. Add tools one by one. Watch your AI grow from a chatbot into a real helper.


If you have a problem that needs solving, feel free to get in touch at hookprograms@outlook.com to discuss possible solutions.
