Your Homelab Has a Brain Now. Here Is How to Wire It.
Every conversation with your AI starts from zero. You paste context. You re-explain your project. You copy notes in like it is 2023 and you are feeding a search bar. Meanwhile, you have a vault full of notes, a lab full of hardware, and an automation stack collecting dust. There is a better way.
What if Claude could read your entire Obsidian vault? Search it. Write to it. And your automation layer could trigger workflows based on what it finds. All on your hardware. Zero subscriptions.
That is what we are building.
The Stack
Four layers. Each one does one thing.
- Proxmox: the infrastructure. Runs everything on your hardware.
- n8n: the automation. Lives in an LXC container on Proxmox. Handles triggers, webhooks, workflows.
- Obsidian MCP Server: the knowledge layer. Exposes your vault to Claude over the Model Context Protocol.
- Claude via MCP: the intelligence. Connects to your vault, reads your notes, acts on what it finds.
```mermaid
graph TD
    A["Claude (MCP Host)"] -->|"JSON-RPC 2.0"| B["MCP Client"]
    B -->|"STDIO"| C["Obsidian MCP Server"]
    C -->|"REST API"| D["Obsidian Vault"]
    E["n8n (LXC)"] -->|"Proxmox API"| F["Proxmox VE"]
    E -->|"Webhooks / Triggers"| C
    E -->|"API Calls"| A
```
The result: an AI that knows everything you have ever written, running on your own iron, talking to your automation stack, costing you nothing per month.
How It Actually Works
MCP stands for Model Context Protocol. It is an open standard that lets AI applications connect to external data sources through a clean client-server architecture. Think of it like a USB port for AI. The AI does not need to know how your data is stored. It just needs a server that speaks the protocol.
The architecture has four participants. The Host is the AI application (Claude Desktop or Claude Code). Inside the host lives the Client, which manages the actual connection. The Server is the program that exposes your data, and the data source itself is the fourth piece: here, your Obsidian vault via the Local REST API plugin. Communication between client and server happens over STDIO (standard input/output) for local setups, using JSON-RPC 2.0 messages. The host decides what Claude can access. The server just makes things available.
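To make the framing concrete, here is a minimal sketch of the JSON-RPC 2.0 envelope a client writes to the server's stdin for a tools/call request. The tool name and arguments are illustrative stand-ins, not the actual tool surface of any particular server.

```python
import json

# A JSON-RPC 2.0 request as an MCP client would frame it.
# "search_vault" and its arguments are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_vault",
        "arguments": {"query": "homelab"},
    },
}

# STDIO transport: one JSON message per line on the server's stdin.
wire_message = json.dumps(request) + "\n"
print(wire_message, end="")
```

The envelope is the protocol's whole surface at this layer: everything interesting lives inside `params`.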
Here is the real config. This goes in your Claude Desktop config file:
Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
```json
{
  "mcpServers": {
    "mcp-obsidian": {
      "command": "uvx",
      "args": ["mcp-obsidian"],
      "env": {
        "OBSIDIAN_API_KEY": "<your_api_key_here>",
        "OBSIDIAN_HOST": "localhost",
        "OBSIDIAN_PORT": "27124"
      }
    }
  }
}
```
command tells Claude how to launch the server. uvx runs it directly from PyPI without installing globally. args is the package name. env passes your Obsidian Local REST API credentials. The API key comes from the plugin settings inside Obsidian. Port 27124 is the default.
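For intuition, here is a rough sketch of how those three env values turn into a REST call. This is not the actual mcp-obsidian internals, just the shape of the mapping; the /vault/ path and Bearer auth follow the Local REST API plugin's conventions, and port 27124 is its HTTPS endpoint (self-signed certificate).

```python
import os

def build_request(path: str) -> tuple[str, dict]:
    """Map the MCP server's env config onto a Local REST API call.

    Illustrative only -- the real mcp-obsidian code may differ.
    """
    host = os.environ.get("OBSIDIAN_HOST", "localhost")
    port = os.environ.get("OBSIDIAN_PORT", "27124")
    key = os.environ.get("OBSIDIAN_API_KEY", "")
    # 27124 is the plugin's HTTPS port, so expect a self-signed cert.
    url = f"https://{host}:{port}{path}"
    headers = {"Authorization": f"Bearer {key}"}
    return url, headers

url, headers = build_request("/vault/")  # e.g. list vault files
```

Every vault operation the server exposes ultimately reduces to a call like this against the plugin.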
When Claude needs a note, here is what actually happens:
```mermaid
sequenceDiagram
    participant C as Claude
    participant MC as MCP Client
    participant MS as MCP Server
    participant V as Vault
    C->>MC: 1. tools/call
    MC->>MS: 2. JSON-RPC request (stdin)
    MS->>V: 3. GET /vault/note (REST API)
    V-->>MS: 4. Note content
    MS-->>MC: 5. JSON-RPC response (stdout)
    MC-->>C: 6. Tool result
```
Claude issues a tools/call request. The MCP client serializes it as a JSON-RPC 2.0 message and writes it to the server’s stdin. The server hits the Obsidian REST API, gets the note content, and writes the response back to stdout. Claude now has your note. The whole round trip happens locally. No cloud. No latency worth measuring.
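You can simulate that round trip end to end with a toy server. This sketch is not an MCP implementation; it only demonstrates the STDIO framing: the "client" writes a JSON-RPC request to a child process's stdin and reads the response from its stdout.

```python
import json
import subprocess
import sys

# A toy "server": reads one JSON-RPC request from stdin and echoes
# a canned result to stdout. Real MCP servers speak the full protocol.
SERVER = """
import json, sys
req = json.loads(sys.stdin.readline())
resp = {"jsonrpc": "2.0", "id": req["id"],
        "result": {"content": "note body goes here"}}
sys.stdout.write(json.dumps(resp) + "\\n")
"""

request = {"jsonrpc": "2.0", "id": 7,
           "method": "tools/call",
           "params": {"name": "get_note", "arguments": {}}}

# The "client" side: write the request to stdin, read stdout.
proc = subprocess.run(
    [sys.executable, "-c", SERVER],
    input=json.dumps(request) + "\n",
    capture_output=True, text=True,
)
response = json.loads(proc.stdout)
print(response["result"]["content"])
```

Swap the canned result for a real REST call against your vault and you have the essence of what the Obsidian MCP server does.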
Why Your Homelab Is the Right Place for This
You own the data. Your notes never leave your network. No cloud sync shipping your vault to someone else’s servers. No training data contribution you did not consent to. Your knowledge stays on your hardware, behind your firewall. Full stop.
Zero subscription cost. Proxmox is free. Self-hosted n8n is free. Obsidian is free. The MCP server is open source. The only thing you are paying for is the Claude subscription you already have (or API credits if you are running headless). Compare that to cobbling together three SaaS tools at $20/month each to get half the capability.
The lab becomes intelligent. This is the part most people miss. Once n8n can talk to Claude and Claude can talk to your vault, you are not just searching notes anymore. You are building workflows that reason over your knowledge base. n8n triggers a workflow, Claude reads your vault for context, makes a decision, writes the result back. That is a different category of automation than “if this then that.”
What You Can Build With This
Stop thinking about this as a note-reading trick. Think about what becomes possible when your AI has memory and your automation stack has intelligence.
- Morning briefing from your daily notes. n8n fires at 6 AM, tells Claude to read yesterday’s daily note and your task board, generates a prioritized summary of what carried over and what is due. Written back to today’s note before you pour coffee.
- Auto-tagging based on vault context. New note lands in your inbox. Claude reads it, compares it against your existing tag taxonomy and MOC structure, applies tags and links it to related notes. Your vault organizes itself.
- Proxmox alerts summarized and logged. n8n watches your Proxmox API for resource alerts. When a container spikes CPU or a disk fills up, Claude gets the alert, pulls recent related notes from your infra log, writes a contextualized incident note with recommended actions.
- Research pipelines triggered by webhook. You find an article worth reading. Hit a webhook from your phone. n8n sends the URL to Claude, which reads it, extracts key points, cross-references your vault for related notes, and writes a literature note with backlinks already in place.
- Weekly review automation. Every Sunday, Claude scans your completed tasks, meeting notes, and daily logs for the week. Generates a review draft: what shipped, what slipped, what patterns are emerging. You edit and publish. Ten minutes instead of an hour.
The pattern is the same every time: trigger, context from vault, intelligence from Claude, output back to vault or another system. Once you see it, you cannot unsee it.
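That loop can be sketched as a single function with the three moving parts injected as callables. The names here are stand-ins: in practice the vault read/write goes through the MCP server and the "ask" step is a Claude call fired from n8n.

```python
from typing import Callable

def run_workflow(
    trigger_payload: dict,
    read_context: Callable[[str], str],        # e.g. vault note via MCP
    ask_model: Callable[[str], str],           # e.g. a Claude call
    write_output: Callable[[str, str], None],  # e.g. write back to vault
) -> str:
    """The trigger -> context -> intelligence -> output loop."""
    context = read_context(trigger_payload["note"])
    prompt = f"Given this context:\n{context}\nDo: {trigger_payload['task']}"
    result = ask_model(prompt)
    write_output(trigger_payload["note"], result)
    return result
```

Each workflow in the list above is this function with different triggers and different prompts; the plumbing never changes.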
Where to Start
Step 1: Get Proxmox running. If you do not have a hypervisor yet, install Proxmox VE on any spare hardware. Old desktop, mini PC, rack server. It does not matter. Proxmox is free and it is what the rest of the stack runs on.
Step 2: Deploy n8n in an LXC container. Spin up a lightweight Debian LXC on Proxmox and install n8n. It runs on Node.js, pulls minimal resources, and gives you a visual workflow builder with access to the Proxmox API, webhooks, schedules, and hundreds of integrations.
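Once n8n is up, anything on your network can also poll Proxmox directly: the API is plain HTTPS with token auth on port 8006. A minimal sketch of building such a request; the host, node name, and token value are placeholders, and handling Proxmox's self-signed certificate is left to you.

```python
def proxmox_lxc_url(host: str, node: str, token: str) -> tuple[str, dict]:
    """Build a Proxmox VE API request to list LXC containers.

    The token follows Proxmox's PVEAPIToken scheme
    (user@realm!tokenid=secret); all values here are placeholders.
    """
    url = f"https://{host}:8006/api2/json/nodes/{node}/lxc"
    headers = {"Authorization": f"PVEAPIToken={token}"}
    return url, headers

url, headers = proxmox_lxc_url(
    "pve.local", "pve",
    "root@pam!n8n=00000000-0000-0000-0000-000000000000",
)
```

This is the same endpoint n8n's HTTP Request node would hit for the Proxmox alert workflows described earlier.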
Step 3: Wire up the MCP server. Install the Obsidian Local REST API community plugin in your vault. Enable it, copy the API key. Add the config block above to your Claude Desktop config. Restart Claude. Ask it to search your vault. Watch it work.
That is the path. Three steps to a homelab that thinks.
The Close
Your notes are not just files. Your lab is not just hardware. And your AI should not be starting from scratch every time you open a chat window.
Wire these pieces together and you have something most people do not even know is possible yet: a personal AI that knows your entire knowledge base, runs on your own infrastructure, and costs you nothing beyond what you are already paying. No vendor lock-in. No data leaving your network. Just your brain, extended.
Next post: the full n8n + Claude + Proxmox automation deep dive. We are going to build the actual workflows. The morning briefing. The auto-tagger. The whole damn thing.
Stay wired.
Sources
- Model Context Protocol Specification (official docs)
- mcp-obsidian (PyPI)
- Obsidian Local REST API Plugin
- Proxmox VE
- n8n