Introduction
In the rapidly evolving world of AI and software development, integrating AI agents with external tools is becoming increasingly important. The Model Context Protocol (MCP), combined with the Agno framework, provides a powerful and flexible way to connect AI agents to custom tools and services. This blog post dives deep into the concept of MCP, explains how it integrates with Agno, and contrasts it with traditional agent tool approaches. We'll also walk through a practical example using the provided code snippets, with step-by-step instructions to run it.
By the end of this post, you’ll have a clear understanding of:
- What MCP is and how it works.
- How Agno leverages MCP to enhance AI agent capabilities.
- The differences between MCP and traditional agent tools.
- How to set up and run a sample MCP-based application.
What is MCP?
The Model Context Protocol (MCP) is a protocol designed to enable seamless communication between AI agents and external tools or services. It provides a standardized way to define, expose, and invoke functions (or “tools”) that an AI agent can use to perform tasks. MCP is particularly useful in scenarios where an AI model needs to interact with custom logic, external APIs, or computational resources that are not natively supported by the model.
```mermaid
flowchart LR
    AI[AI Agent] <--> MCP{MCP Protocol}
    MCP <--> T1[Math Tool]
    MCP <--> T2[API Tool]
    MCP <--> T3[Custom Tool]
    style AI fill:#f9d5e5,stroke:#333,stroke-width:2px
    style MCP fill:#eeeeee,stroke:#333,stroke-width:2px
    style T1 fill:#d5f9e6,stroke:#333,stroke-width:1px
    style T2 fill:#d5f9e6,stroke:#333,stroke-width:1px
    style T3 fill:#d5f9e6,stroke:#333,stroke-width:1px
```
Key features of MCP include:
- Modularity: MCP allows developers to define tools as independent modules, making it easy to add, remove, or update tools without modifying the core AI agent logic.
- Standardized Interface: MCP provides a consistent interface for tool invocation, ensuring compatibility across different AI models and frameworks.
- Flexibility: Tools can be implemented in any programming language, as long as they adhere to the MCP protocol.
- Scalability: MCP supports both local and remote tool execution, enabling distributed computing scenarios.
In the context of the provided code, MCP is used to define a set of mathematical tools (e.g., `add`, `multiply`, `divide`, `subtract`) that an AI agent can call to perform calculations.
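To make this concrete, here is a minimal sketch of what exposing a function over MCP looks like using the Python SDK's FastMCP helper. This is an illustrative toy server, not part of the example repository; the actual math server used in this post appears in the code walkthrough below.

```python
# Illustrative sketch only (not from the example repo): any plain Python
# function can be exposed as an MCP tool by registering it with FastMCP.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Demo")

@mcp.tool()
def square(n: int) -> int:
    """Return the square of a number"""
    return n * n

if __name__ == "__main__":
    # stdio transport: the client (e.g., an Agno agent) spawns this process
    # and talks to it over standard input/output.
    mcp.run(transport="stdio")
```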
What is Agno?
Agno is a Python framework designed to simplify the development of AI agents that interact with external tools and services. It abstracts away much of the complexity involved in integrating AI models with tools, providing a high-level interface for developers. Agno supports various AI models (e.g., OpenAI’s GPT models) and integrates seamlessly with MCP to enable tool-based interactions.
Agno’s key components include:
- Agent: The core component that orchestrates interactions between the AI model and tools.
- Models: Interfaces to AI models, such as OpenAI’s `gpt-4o-mini`.
- Tools: Modular components that define the functions or services the agent can use, often implemented via MCP.
In the provided code, Agno is used to create an agent that interacts with an MCP-based math server to perform calculations.
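Before wiring in MCP, a bare Agno agent looks roughly like the sketch below. It assumes an `OPENAI_API_KEY` is available in the environment and that Agno exposes a synchronous `print_response` counterpart to the `aprint_response` call used later; no tools are attached yet.

```python
# Minimal Agno agent sketch: model only, no tools attached yet.
# Assumes OPENAI_API_KEY is set in the environment.
from agno.agent import Agent
from agno.models.openai import OpenAIChat

agent = Agent(
    model=OpenAIChat(id="gpt-4o-mini"),
    markdown=True,
)
agent.print_response("Explain the Model Context Protocol in one sentence.")
```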
How MCP Works with Agno
MCP and Agno work together to create a robust system for AI-tool integration. Here’s how the process works:
```mermaid
sequenceDiagram
    participant User
    participant Agent as Agno Agent
    participant MCP as MCP Server
    participant Tools as Math Tools
    User->>Agent: "What's (3+5)×12?"
    Agent->>MCP: Call add(3,5)
    MCP->>Tools: Execute add(3,5)
    Tools-->>MCP: Return 8
    MCP-->>Agent: Return 8
    Agent->>MCP: Call multiply(8,12)
    MCP->>Tools: Execute multiply(8,12)
    Tools-->>MCP: Return 96
    MCP-->>Agent: Return 96
    Agent-->>User: Display result
```
- Define MCP Tools: Developers create a server (e.g., `server.py`) that defines a set of tools using the MCP protocol. Each tool is a Python function decorated with `@mcp.tool()`, which exposes it to the MCP interface.
- Run the MCP Server: The MCP server is launched, making the tools available for invocation. In the example, the server is run using `mcp.run(transport="stdio")`, which communicates via standard input/output.
- Integrate with Agno: Agno’s `MCPTools` class is used to connect to the MCP server. The Agno agent is configured with the MCP tools and an AI model (e.g., `gpt-4o-mini`).
- Agent Interaction: The agent processes user queries, determines which tools to call based on the query, and invokes the appropriate MCP tools. The results are returned to the user.
This architecture decouples the AI model from the tool implementation, allowing for greater flexibility and reusability.
MCP vs. Traditional Agent Tools
Traditional agent tools are often tightly coupled with the AI framework or model, leading to limitations in flexibility and scalability. Here’s how MCP differs:
| Aspect | MCP | Traditional Agent Tools |
|---|---|---|
| Modularity | Highly modular; tools are independent and can be added or removed easily. | Often tightly integrated, requiring framework-specific changes. |
| Interface | Standardized protocol for tool invocation, compatible with multiple frameworks. | Framework-specific APIs, limiting cross-platform use. |
| Execution | Supports local and remote execution, enabling distributed systems. | Typically local execution, tied to the agent’s runtime. |
| Language Support | Tools can be written in any language that supports MCP. | Usually limited to the language of the AI framework. |
| Scalability | Designed for scalability, with support for distributed computing. | Limited scalability, often requiring custom solutions. |
In summary, MCP provides a more flexible, scalable, and standardized approach to tool integration compared to traditional methods, making it ideal for complex AI applications.
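To make the contrast tangible, here is a sketch of the “traditional” style, where the tool is a plain Python function handed directly to the agent. It assumes Agno accepts bare functions in its `tools` list and is used purely for illustration: the tool lives inside the agent’s process and language, whereas the MCP version later in this post runs the same logic in a separate server process behind a standard protocol.

```python
# "Traditional" tool sketch for contrast: the tool is defined in-process and
# passed straight to the agent, coupling it to this framework and runtime.
# (Assumes Agno accepts plain Python functions as tools.)
from agno.agent import Agent
from agno.models.openai import OpenAIChat

def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

agent = Agent(
    model=OpenAIChat(id="gpt-4o-mini"),
    tools=[add],
    show_tool_calls=True,
)
agent.print_response("What is 3 + 5?")
```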
Code Walkthrough
Let’s examine the provided code to understand how MCP and Agno are implemented.
1. `server.py` – Defining MCP Tools
The `server.py` file defines a math server with four tools: `add`, `multiply`, `divide`, and `subtract`.
```python
# server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b

@mcp.tool()
def divide(a: int, b: int) -> float:
    """Divide two numbers"""
    if b == 0:
        raise ValueError("Cannot divide by zero")
    return a / b

@mcp.tool()
def subtract(a: int, b: int) -> int:
    """Subtract two numbers"""
    return a - b

if __name__ == "__main__":
    mcp.run(transport="stdio")
```
- FastMCP: Initializes an MCP server named “Math”.
- Tool Definitions: Each function is decorated with `@mcp.tool()`, making it available as an MCP tool.
- Transport: The `stdio` transport allows the server to communicate via standard input/output.
```mermaid
graph TD
    Server[MCP Server] --> A[Add Tool]
    Server --> M[Multiply Tool]
    Server --> D[Divide Tool]
    Server --> S[Subtract Tool]
    style Server fill:#f9f9f9,stroke:#333,stroke-width:2px
    style A fill:#e6ffe6,stroke:#333,stroke-width:1px
    style M fill:#e6ffe6,stroke:#333,stroke-width:1px
    style D fill:#e6ffe6,stroke:#333,stroke-width:1px
    style S fill:#e6ffe6,stroke:#333,stroke-width:1px
```
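A nice side effect of this design is that the tools remain ordinary Python functions underneath, so they can be exercised without the agent or the protocol in the loop. The sketch below assumes `server.py` is importable from the current directory and that `@mcp.tool()` returns the original function unchanged, as current FastMCP versions do.

```python
# Quick sanity test of the math tools, bypassing MCP and the agent entirely.
# Importing server.py does not start the server thanks to the
# __name__ == "__main__" guard.
from server import add, divide, multiply, subtract

assert add(3, 5) == 8
assert multiply(8, 12) == 96
assert subtract(6, 9) == -3
print(round(divide(5, 9), 4))  # 0.5556
```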
2. `.env` – Environment Configuration
The `.env` file stores the OpenAI API key required for the AI model. This key is loaded by the `agent.py` script using the `dotenv` library.
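If you want to confirm the key is being picked up before involving the agent, a quick check along these lines can help (a sketch; the file and variable names match the ones used in this example):

```python
# Sanity check: load .env from the current directory and confirm the key.
import os
from dotenv import load_dotenv

load_dotenv()
if not os.getenv("OPENAI_API_KEY"):
    raise SystemExit("OPENAI_API_KEY not found -- check your .env file")
print("OpenAI API key loaded.")
```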
3. `agent.py` – Running the Agno Agent
The `agent.py` file sets up and runs the Agno agent, connecting it to the MCP server.
```python
import asyncio
import os
from dotenv import load_dotenv
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.mcp import MCPTools
from mcp import StdioServerParameters

load_dotenv()

async def run_agent():
    server_params = StdioServerParameters(
        command="python",
        args=["./server.py"],
    )
    async with MCPTools(server_params=server_params) as mcp_tools:
        agent = Agent(
            model=OpenAIChat(id="gpt-4o-mini"),
            tools=[mcp_tools],
            markdown=True,
            show_tool_calls=True,
            debug_mode=True,
        )
        await agent.aprint_response(
            "what's (3 + 5) x 12? and 5/9 and 6-9 and addition of all 3 outputs",
            stream=True,
        )

if __name__ == "__main__":
    asyncio.run(run_agent())
```
- Environment Setup: Loads the OpenAI API key from the `.env` file.
- MCP Server Connection: Configures the MCP server using `StdioServerParameters` to run `server.py`.
- Agent Configuration: Creates an Agno agent with the `gpt-4o-mini` model and MCP tools.
- Query Processing: Sends a complex query to the agent, which uses the MCP tools to compute the results (a variation that captures the answer programmatically is sketched below).
Steps to Run the Example
Follow these steps to set up and run the provided code on your local machine.
Prerequisites
- Python 3.10+: Ensure a recent Python version is installed; the `mcp` package requires Python 3.10 or newer.
- pip: Python’s package manager.
- Virtual Environment (Optional): Recommended for isolating dependencies.
- OpenAI API Key: Obtain a valid key from OpenAI.
Step 1: Set Up the Project Directory
- Clone the repository:
git clone https://github.com/Hitesh2498/MCP-with-Agno-Agents.git
Step 2: Install Dependencies
Install the required Python packages:
pip install -r requirements.txt
- `python-dotenv`: For loading environment variables.
- `agno`: The Agno framework.
- `mcp`: The MCP protocol library.
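If you are recreating `requirements.txt` by hand rather than using the one from the repository, it would look roughly like this. The unpinned versions and the `openai` entry (typically needed by Agno’s OpenAI model class) are assumptions; prefer the repository’s file if they differ.

```text
python-dotenv
agno
mcp
openai  # assumed: needed by Agno's OpenAIChat model
```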
Step 3: Configure the Environment
Edit the `.env` file to include your OpenAI API key:
OPENAI_API_KEY=your-openai-api-key-here
Replace `your-openai-api-key-here` with your actual key.
Step 4: Run the Application
Run the `agent.py` script:
python agent.py
This will:
- Start the MCP server (`server.py`) in the background.
- Initialize the Agno agent.
- Process the query: “what’s (3 + 5) x 12? and 5/9 and 6-9 and addition of all 3 outputs”.
Expected Output
The agent will output the results of the calculations, leveraging the MCP tools. Since the agent is configured with `markdown=True`, the response is formatted in markdown and will look something like this:
1. (3 + 5) × 12:
- Tool call: `add(3, 5)` → 8
- Tool call: `multiply(8, 12)` → 96
2. 5 / 9:
- Tool call: `divide(5, 9)` → 0.5556 (approx)
3. 6 - 9:
- Tool call: `subtract(6, 9)` → -3
4. Addition of all outputs:
- Tool call: `add(96, 0.5556)` → 96.5556
- Tool call: `add(96.5556, -3)` → 93.5556 (approx)
Final Result: 93.5556
The exact output may vary slightly depending on floating-point precision and the AI model’s response.
Troubleshooting
- API Key Error: Ensure the `.env` file contains a valid OpenAI API key.
- Module Not Found: Verify that all dependencies are installed.
- Server Failure: Check that `server.py` is in the same directory as `agent.py`.
Conclusion
The Model Context Protocol (MCP), when combined with the Agno framework, offers a powerful solution for integrating AI agents with custom tools. By providing a standardized, modular, and scalable approach, MCP overcomes many limitations of traditional agent tools. The provided code demonstrates a practical application of MCP and Agno, showcasing how to create a math server and connect it to an AI agent for complex calculations.
Whether you’re building AI-driven applications or experimenting with tool integration, MCP and Agno provide a robust foundation. Try running the example, explore the code, and consider how MCP can enhance your own projects!