
n8n MCP – Model Context Protocol Server for Claude

Pricing: Free

n8n MCP Server connects AI tools like Cursor, Claude Desktop, and other MCP-compatible clients to n8n workflows. It enables AI agents to trigger workflows, access 400+ integrations, automate tasks, and interact with APIs using the Model Context Protocol (MCP).

Curated by AI Stack · Platform pick
Installation Instructions →
Category: Automation / Workflow Orchestration
Company: n8n GmbH
Compatible Tools:
Claude (Primary), Cursor, GitHub Copilot, Replit Agent, Windsurf

Featured on AI Stack

Add this badge to your README or site so visitors know this MCP is listed in our directory.

<a href="https://ai-stack.dev/mcps/n8n-mcp" target="_blank" rel="noopener noreferrer" style="display:inline-block;padding:6px 12px;background:#1a1f27;color:#93c5fd;border:1px solid #2d323a;border-radius:6px;font-size:12px;text-decoration:none;font-family:system-ui,sans-serif;">Listed on AI Stack MCP Directory</a>

About the n8n MCP Server

Quick overview of why teams use it, how it fits into AI workflows, and key constraints.

n8n in AI Workflows Without Context Switching

AI-powered workflows have become essential for modern technical teams, but the overhead of navigating between dashboards, scripts, and API endpoints can severely limit their effectiveness. The n8n Model Context Protocol (MCP) solves this problem by enabling AI assistants like Cursor and Claude Desktop to directly trigger n8n workflows, access 400+ native integrations, and automate tasks without the need for manual context switching.

With n8n MCP, AI agents can seamlessly pull the right data, trigger the appropriate actions, and interact with APIs across your entire tech stack, all through a single, secure interface. This eliminates the friction of bouncing between disparate tools and allows your AI to focus on delivering valuable insights and automations.

How n8n MCP Improves AI‑Assisted Workflows

The n8n MCP opens up a wide range of AI-powered workflow capabilities, including:

  • AI-assisted incident response and investigation, where the agent can pull relevant data, trigger remediation steps, and summarize findings
  • Automated reporting and monitoring, with the agent generating customized summaries, visualizations, and alerts based on your data
  • Intelligent task automation, allowing the agent to execute complex multi-step workflows without manual intervention
  • Enhanced collaboration, by enabling AI assistants to interact with your team's workflows and systems in a secure, controlled manner

Architecture and Data Flow

The n8n MCP is powered by the n8n server, which acts as a secure intermediary between AI agents and your underlying systems and APIs. When an AI agent makes a request through the MCP, the server translates it into the appropriate API calls, handles authentication and authorization, and returns the response to the agent. This abstraction layer keeps sensitive data and actions under your control while still allowing the AI to leverage your full tech stack.
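
To make this concrete, here is a minimal sketch of that translation step, assuming hypothetical tool names and webhook paths (the real server derives these from your n8n instance; nothing below is the project's actual API):

```python
# Hypothetical mapping from MCP tool names to n8n webhook paths.
# A real deployment would register these from the n8n instance itself.
TOOL_TO_WEBHOOK = {
    "run_daily_report": "/webhook/daily-report",
    "create_crm_lead": "/webhook/crm-lead",
}

def translate_mcp_request(request: dict, base_url: str) -> tuple[str, dict]:
    """Translate an MCP tools/call request into an n8n webhook URL + payload."""
    if request.get("method") != "tools/call":
        raise ValueError("only tools/call requests are handled in this sketch")
    params = request["params"]
    path = TOOL_TO_WEBHOOK[params["name"]]  # raises KeyError for unknown tools
    return base_url + path, params.get("arguments", {})

# A JSON-RPC 2.0 request as an MCP client would send it:
req = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "run_daily_report", "arguments": {"date": "2024-01-01"}},
}
url, payload = translate_mcp_request(req, "http://localhost:5678")
print(url)      # http://localhost:5678/webhook/daily-report
print(payload)  # {'date': '2024-01-01'}
```

The point of the intermediary is visible here: the agent only names a tool, and the server decides which endpoint, credentials, and payload that maps to.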

Communication between the AI agent and the n8n server runs over a stdio- or SSE-based transport, providing a reliable, real-time data flow for interactive workflows. When self-hosted, this architecture keeps your data within your own environment, and you retain full visibility and control over how the AI interacts with your systems.
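
The two transports can be sketched roughly as follows; this is a simplified illustration of newline-delimited JSON-RPC and SSE data framing, not the exact MCP wire protocol:

```python
import json

def frame_stdio(message: dict) -> bytes:
    """stdio transport: one JSON-RPC message per newline-delimited line."""
    return (json.dumps(message) + "\n").encode()

def parse_sse_event(raw: str) -> dict:
    """SSE transport: the server pushes events whose data lines carry JSON."""
    data_lines = [line[len("data: "):] for line in raw.splitlines()
                  if line.startswith("data: ")]
    return json.loads("".join(data_lines))

msg = {"jsonrpc": "2.0", "id": 7, "method": "tools/list"}
wire = frame_stdio(msg)
assert json.loads(wire.decode()) == msg

event = 'event: message\ndata: {"jsonrpc": "2.0", "id": 7, "result": {}}\n\n'
assert parse_sse_event(event) == {"jsonrpc": "2.0", "id": 7, "result": {}}
```

stdio suits a locally launched server process, while SSE suits a server reachable over HTTP; both carry the same JSON-RPC messages.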

When n8n MCP Is Most Useful

  • AI-assisted incident investigation and response
  • Automated generation of status reports, dashboards, and other data summaries
  • Proactive monitoring and alerting, with the AI agent taking action based on predefined thresholds
  • Integration of AI assistants like Cursor and Claude Desktop into your existing workflows and toolchain
  • Streamlining of repetitive tasks and processes through AI-powered automation
  • Enhanced collaboration between your team and AI agents, with the agent able to directly trigger and interact with your workflows

Limitations and Operational Constraints

To use the n8n MCP, you'll need to have an n8n server instance set up and configured with the appropriate API keys and permissions. The AI agent will also need to be MCP-compatible and have the necessary authentication credentials to access the n8n server.

  • API key requirements: You'll need to provide an API key or token that grants the AI agent access to your n8n server.
  • Rate limits: The n8n server may have rate limits in place to prevent abuse, which could impact the AI agent's ability to make requests.
  • Platform/host restrictions: The n8n server must be accessible from the AI agent's environment, which may have network or firewall restrictions.
  • Environment/network setup: The n8n server will need to be properly configured and secured within your infrastructure.
  • Model/tooling compatibility: The AI agent must support MCP and be able to handle the data formats and transport (stdio or SSE) used by the server.
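
On the rate-limit point, a common client-side mitigation is retry with exponential backoff. The sketch below is generic and not part of n8n or the MCP server itself:

```python
import time

def call_with_backoff(request_fn, max_retries: int = 4, base_delay: float = 0.5):
    """Retry a rate-limited call, doubling the wait after each failure."""
    for attempt in range(max_retries + 1):
        try:
            return request_fn()
        except RuntimeError:  # stand-in for an HTTP 429 from the server
            if attempt == max_retries:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, 4s, ...

# Demo: a fake endpoint that rejects the first two calls.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return "ok"

print(call_with_backoff(flaky, base_delay=0.01))  # ok
```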

Example Configurations

For a stdio server (n8n MCP example):
Configure your MCP client to launch the bridge as a local command; see https://n8n.io/ for project details.
For an SSE server:
URL: http://example.com:8080/sse
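
Putting these together, an MCP client entry for an SSE-style server might look like the template below. The `n8n-workflows` key and URL are placeholders, and the exact schema varies by client, so treat this as an illustrative sketch rather than a verified configuration:

```json
{
  "mcpServers": {
    "n8n-workflows": {
      "url": "http://example.com:8080/sse"
    }
  }
}
```

Clients that launch stdio servers typically expect a `command` field (with optional `args` and `env`) in place of `url`; check your client's documentation for the exact shape.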

n8n MCP Specific Instructions

1. Install Node.js (v18+)
2. Install n8n globally:
npm install -g n8n
3. Start n8n:
n8n start
4. Expose n8n's MCP-compatible endpoint (if using a self-hosted MCP bridge, configure it in your MCP client settings)
5. Add the MCP server configuration inside Cursor or Claude Desktop settings

Usage Notes

Help other developers understand when this MCP works best and where to be careful.

Best used when:

  • Automating multi-step workflows
  • Connecting AI agents to APIs
  • CRM, email, Slack, and database automation
  • Backend automation for AI tools

Avoid or be careful when:

  • High-frequency polling workflows
  • Large batch executions without queue mode
  • Running without proper credential security

Known limitations or caveats:

  • Requires self-hosting or an n8n Cloud account
  • Some integrations require paid plans
  • Rate limits depend on connected APIs
