
Server Ollama Bridge

A bridge to a local Ollama LLM server: run Llama, Mistral, Qwen, and other local models through MCP.

Tags: aggregators, llm

Installation

npx -y mcp-server-ollama-bridge
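The bridge talks to a locally running Ollama instance, so it can help to confirm Ollama is reachable before wiring anything up. A minimal sketch in Python, using Ollama's documented `/api/tags` endpoint on its default port 11434 (the helper names here are illustrative, not part of the bridge):

```python
import json
import urllib.request

# Ollama's default endpoint; change this if your server binds elsewhere.
OLLAMA_URL = "http://localhost:11434"

def model_names(tags_response):
    """Extract model names from a parsed Ollama /api/tags payload."""
    return [m["name"] for m in tags_response.get("models", [])]

def list_local_models(base_url=OLLAMA_URL):
    """Fetch /api/tags and return the names of locally installed models."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return model_names(json.load(resp))
```

If the request fails, start the server with `ollama serve` and pull at least one model (e.g. `ollama pull llama3`) before installing the bridge.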

Configuration

{
  "mcpServers": {
    "mcp-server-ollama-bridge": {
      "command": "npx",
      "args": ["-y", "mcp-server-ollama-bridge"]
    }
  }
}

How to use

  1. Run the installation command above (if needed)
  2. Open your Claude Code settings file (~/.claude/settings.json)
  3. Add the configuration to the mcpServers section
  4. Restart Claude Code to apply changes
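Steps 2 and 3 amount to merging the JSON block above into the settings file. One way to sketch that in Python (the helper name is mine, and it assumes the settings file is plain JSON as shown in the Configuration section):

```python
import json
import os

def add_ollama_bridge(settings_path):
    """Merge the mcp-server-ollama-bridge entry into a JSON settings file."""
    settings = {}
    if os.path.exists(settings_path):
        with open(settings_path) as f:
            settings = json.load(f)
    # Create the mcpServers section if it is missing, then add the bridge
    # entry exactly as given in the Configuration section.
    settings.setdefault("mcpServers", {})["mcp-server-ollama-bridge"] = {
        "command": "npx",
        "args": ["-y", "mcp-server-ollama-bridge"],
    }
    with open(settings_path, "w") as f:
        json.dump(settings, f, indent=2)
    return settings
```

Existing keys in the file are preserved; only the `mcpServers` section gains the new entry.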