MCP (Model Context Protocol)

MCP is a set of rules that helps AI models communicate and understand context better when working with different apps, tools, or systems.

🔹 Model (🤖) – This is the AI system (like ChatGPT) that processes information.

🔹 Context (🧠) – The background info, memory, or ongoing conversation that the AI uses to give better responses.

🔹 Protocol (📡) – The rules and structure for sharing data and making sure everything runs smoothly.

✅ Helps AI remember useful details in a conversation
✅ Makes AI interactions smarter & more relevant
✅ Allows different apps and AI models to work together

Think of it like a universal translator 🗣️ that makes sure AI models understand and respond correctly across different platforms.

A path to AI agents

MCP can be seen as a gateway to AI agents 🤖 because it helps AI models connect, share context, and work together across different tools, apps, or systems.

Think of MCP like an airport control tower 🏢✈️ that manages how AI agents communicate and exchange information. It ensures that:

✅ AI agents understand the same context
✅ They can work together smoothly
✅ Different models & apps can interact without confusion

This makes AI more powerful and useful by allowing agents to remember, adapt, and take action across various platforms.

Supported MCP features

Otoroshi provides full support for the MCP specification, both as a client (connecting to MCP servers) and as a server (exposing tools to MCP clients).

As an MCP client (MCP Connectors)

MCP Connectors allow Otoroshi to connect to external MCP servers and consume their capabilities:

  • Tools (tools/list, tools/call) - Discover and call tools exposed by MCP servers
  • Resources (resources/list, resources/read) - List and read resources (files, data, etc.) from MCP servers
  • Resource Templates (resources/templates/list) - List parameterized resource templates
  • Prompts (prompts/list, prompts/get) - List and retrieve prompt templates with their arguments

All these features support filtering (include/exclude patterns) and advanced rules (JsonPath-based access control).
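To give a feel for what include/exclude filtering does, here is a minimal sketch in Python. The exact pattern semantics Otoroshi uses (glob, regex, or JsonPath rules) are not shown in this section, so this example assumes simple glob patterns; the tool names are hypothetical.

```python
from fnmatch import fnmatch

def filter_tools(tool_names, include=("*",), exclude=()):
    """Keep tools matching at least one include pattern and no exclude pattern.

    Hypothetical sketch: Otoroshi's actual filter semantics may differ
    (e.g. JsonPath-based rules rather than globs).
    """
    return [
        name for name in tool_names
        if any(fnmatch(name, p) for p in include)
        and not any(fnmatch(name, p) for p in exclude)
    ]

# Hypothetical tool names exposed by an MCP server
tools = ["fs_read", "fs_write", "web_search", "shell_exec"]
safe = filter_tools(tools, include=("fs_*", "web_*"), exclude=("fs_write",))
print(safe)  # ['fs_read', 'web_search']
```

The same include/exclude idea applies to resources and prompts as well as tools.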

As an MCP server (MCP Plugins)

MCP Server Plugins allow Otoroshi to expose tool functions and MCP connector capabilities as an MCP server, supporting three transport protocols:

  • HTTP - Simple request/response MCP server
  • SSE (Server-Sent Events) - Streaming MCP server for long-lived connections
  • WebSocket - Full-duplex MCP server
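Whatever the transport, the message itself is the same JSON-RPC 2.0 envelope defined by the MCP specification: sent as an HTTP POST body, an SSE-session message, or a WebSocket text frame. A minimal sketch of building such an envelope (the tool name and arguments here are hypothetical):

```python
import json

def mcp_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 envelope as used by MCP.

    Only the transport changes between HTTP, SSE, and WebSocket;
    the envelope stays the same.
    """
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Hypothetical call to a tool named "get_weather"
req = mcp_request("tools/call", {"name": "get_weather",
                                 "arguments": {"city": "Paris"}})
print(json.dumps(req))
```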

All three transports support the full MCP protocol:

Method                      Description
tools/list                  List available tools
tools/call                  Execute a tool
resources/list              List available resources
resources/read              Read a resource by URI
resources/templates/list    List available resource templates
prompts/list                List available prompts
prompts/get                 Get a prompt with its messages

Resources are returned with their uri, name, description, and mimeType. Resource contents can be text or binary (blob). Prompts include their name, description, and arguments (with name, description, and required for each argument). When getting a prompt, the response includes the prompt description and its messages (each with a role and content).
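The field shapes described above can be checked with a couple of small helpers. This is an illustrative sketch only, with hypothetical payload values; the field names come from the description above.

```python
def check_resource(res):
    """Validate the resource descriptor fields described above."""
    for key in ("uri", "name", "description", "mimeType"):
        assert key in res, f"missing {key}"
    return res

def check_prompt(prompt):
    """Validate a prompt descriptor and its arguments."""
    assert "name" in prompt and "description" in prompt
    for arg in prompt.get("arguments", []):
        # each argument carries name, description, and required
        assert {"name", "description", "required"} <= arg.keys()
    return prompt

# Hypothetical payloads matching the shapes described in the text
resource = check_resource({
    "uri": "file:///data/report.txt",
    "name": "report",
    "description": "Quarterly report",
    "mimeType": "text/plain",
})
prompt = check_prompt({
    "name": "summarize",
    "description": "Summarize a document",
    "arguments": [{"name": "doc", "description": "Document text",
                   "required": True}],
})
print(resource["mimeType"], prompt["arguments"][0]["required"])
```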

Otoroshi + MCP = ❤️

Otoroshi gives you several ways to interact with MCP.

Thanks to our MCP Connectors, you can link Otoroshi to your AI provider and to an MCP server.