Self-Hosted n8n + MCP: Build Event-Driven AI Agents (Setup & Demo)
Self-Hosted AI Starter Kit: Setup n8n with Ollama & Qdrant for Building AI Agents - • Self Hosted AI Starter Kit: Setup n8n with...

What is MCP (Model Context Protocol)?
Model Context Protocol (MCP) is an open standard that defines how AI agents and language models interact with external tools, data sources, and services. It enables a structured, event-driven request/response mechanism, allowing models to discover, invoke, and exchange context with external systems.

Key Features:
- Standardized Communication: enables seamless interaction between agents, tools, and LLMs.
- Event-Driven: supports structured request/response over various transports (e.g., stdio, HTTP/SSE).
- Modular & Scalable: easily extend AI capabilities by plugging in new tools via MCP servers.
- Interoperable: designed to work across different AI platforms and environments.

Why MCP?
- Standardizes communication between tools, agents, and models.
- Facilitates real-time, event-driven AI workflows.
- Bridges AI and software with a shared, open protocol.
- Enables modular, scalable, and reusable tool integrations.
- Works across systems: LLMs, APIs, databases, files, memory, and more.
- Simplifies orchestration of multi-step, multi-tool AI agents.
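To make the request/response mechanism concrete: MCP messages are JSON-RPC 2.0. The sketch below shows the rough shape of a "tools/list" exchange, by which a client discovers what a server exposes. The tool name and schema contents here are hypothetical; real servers return their own.

```python
import json

# A client asking an MCP server which tools it exposes sends a
# JSON-RPC 2.0 request with the "tools/list" method:
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# The server replies with a result describing each tool and its input
# schema. (Illustrative shape only; "get_calendar_events" is a made-up name.)
response = {
    "jsonrpc": "2.0",
    "id": 1,  # matches the request id
    "result": {
        "tools": [
            {
                "name": "get_calendar_events",
                "description": "Fetch upcoming calendar events",
                "inputSchema": {"type": "object", "properties": {}},
            }
        ]
    },
}

print(json.dumps(request))
```

Once the host knows a tool's name and input schema, it can invoke it with a "tools/call" request following the same pattern.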
MCP Components

MCP Server:
- Exposes tool capabilities to AI agents
- Handles event publishing and responses
- Examples: https://modelcontextprotocol.io/examples

MCP Client:
- Connects to a specific MCP Server
- Sends requests and receives responses

MCP Host (AI application):
- Manages one or more MCP Clients
- Coordinates tool usage and request handling
- Examples: Claude, Cursor, Windsurf

MCP Transport Mechanisms

Stdio Transport:
- Uses standard input/output (stdin/stdout)
- Ideal for local tools or CLI-based agents
- Common in dev tools like Cursor or Claude Desktop
- Simple and secure for local workflows

HTTP + SSE (Server-Sent Events):
- Client → Server: HTTP POST
- Server → Client: Server-Sent Events (SSE)
- Good for remote tools or services
- Enables streamed responses and real-time updates
- Often used in hosted AI services

MCP in Action

MCP Host (AI application / orchestrator):
- Manages one or more MCP Clients
- Decides when to trigger a request
- Coordinates tool usage and permission policies
- Examples: Claude, Cursor, Windsurf

MCP Client:
- Acts as the communication layer between Host and Server
- Sends structured requests and handles responses
- Maintains the connection with its designated MCP Server

MCP Server:
- Interfaces with external tools, services, or data sources
- Executes tasks and returns structured results
- Exposes available tool capabilities via a schema

Demo Agent
1. The user sends a message to our chat agent asking to list Google Calendar events or create a new event.
2. The AI Agent routes the request to an MCP Client node.
3. The Client connects to a sub-workflow running an MCP Server Trigger.
4. The sub-workflow fetches event data or creates a new Google Calendar event via the API and responds to the Client.

More details on MCP: https://modelcontextprotocol.io/intro...

MCP support in n8n:
n8n recently introduced support for MCP in its latest versions. To enable MCP in your self-hosted n8n:
1. Add this line to your Docker environment config:
   N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
2. Restart your containers.
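If you run n8n with Docker Compose, the environment variable can be set in the service definition. A minimal sketch, assuming your service is named "n8n" and uses the official image (adjust both to match your own compose file):

```yaml
# docker-compose.yml (fragment) — service name and image are assumptions
services:
  n8n:
    image: n8nio/n8n
    environment:
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
```

Apply the change by recreating the container, e.g. `docker compose up -d`.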
Then, from the n8n UI, install the n8n-nodes-mcp package. This lets your AI Agent node use the MCP Client and MCP Server Trigger nodes as tools. In this demo the MCP Client connects to a local MCP server running inside n8n, but it can also connect to external MCP servers. Check out 🔗 https://github.com/modelcontextprotoc... for ready-made servers for Figma, GitHub, FireCrawl, and more.
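To illustrate the stdio transport described above without any SDK: a server reads one JSON-RPC message per line from stdin and writes one reply per line to stdout. The toy dispatcher below mimics that loop for a single message; "demo_tool" is a placeholder, and a real server would use the official MCP SDK and return proper JSON-RPC error objects.

```python
import json

def handle_message(line: str) -> str:
    """Toy sketch of an MCP server's stdio handling: parse one
    JSON-RPC message, dispatch on its method, return one reply."""
    msg = json.loads(line)
    if msg.get("method") == "tools/list":
        # Advertise a single hypothetical tool.
        result = {"tools": [{"name": "demo_tool", "description": "placeholder"}]}
    else:
        # Real servers respond with a structured JSON-RPC error instead.
        result = {"error": "unknown method"}
    return json.dumps({"jsonrpc": "2.0", "id": msg.get("id"), "result": result})

# A real server loops over sys.stdin; here we feed one message directly.
reply = handle_message('{"jsonrpc": "2.0", "id": 7, "method": "tools/list"}')
print(reply)
```

The HTTP + SSE transport carries the same JSON-RPC payloads; only the framing (POST requests in, server-sent events out) differs.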