MCP Server
This guide explains how to run AIGNE Framework agents as a Model Context Protocol (MCP) server. By following these steps, you can expose your custom agents as tools to any MCP-compatible client, such as Claude Code, extending the client's capabilities.
Overview
The Model Context Protocol (MCP) is an open standard designed to enable AI assistants to securely connect with various data sources and tools. By operating AIGNE agents as MCP servers, you can augment MCP-compatible clients with the specialized capabilities of your agents.
Prerequisites
Before proceeding, ensure the following requirements are met:
- Node.js: Version 20.0 or higher must be installed. You can download it from nodejs.org.
- AI Model Provider: An API key from a provider like OpenAI is required for the agents to function.
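To confirm the Node.js requirement, you can check the installed major version. This is a minimal sketch that parses the output of `node --version`; the fallback value is only there so the check fails cleanly when Node.js is missing:

```shell
# Parse the major version from `node --version` output (e.g. "v20.11.1")
# and check it against the 20.0+ requirement.
version="$(node --version 2>/dev/null || echo v0.0.0)"
major="${version#v}"; major="${major%%.*}"
if [ "$major" -ge 20 ]; then
  echo "Node.js $version is supported"
else
  echo "Node.js 20 or newer is required (found $version)" >&2
fi
```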
Quick Start
You can start the MCP server directly without a local installation by using npx.
1. Run the MCP Server
Execute the following command in your terminal to start the server on port 3456:

```shell
npx -y @aigne/example-mcp-server serve-mcp --port 3456
```

Upon successful execution, the server starts, and you will see the following output, indicating that the MCP server is active and accessible.
```
Observability OpenTelemetry SDK Started, You can run `npx aigne observe` to start the observability server.
MCP server is running on http://localhost:3456/mcp
```

2. Connect to an AI Model
The agents require a connection to a Large Language Model (LLM) to process requests. If you run the server without configuring a model provider, you will be prompted to select a connection method.

You have three primary options for connecting to an AI model.
Option A: Connect to the Official AIGNE Hub
This is the recommended method for new users.
- Select the first option, "Connect to the Arcblock official AIGNE Hub."
- Your web browser will open to the AIGNE Hub authorization page.
- Follow the on-screen instructions to approve the connection. New users are automatically granted 400,000 tokens for evaluation purposes.

Option B: Connect to a Self-Hosted AIGNE Hub
If you operate your own instance of AIGNE Hub, select the second option.
- You will be prompted to enter the URL of your self-hosted AIGNE Hub.
- Provide the URL and follow the subsequent prompts to complete the connection.
For instructions on deploying a self-hosted AIGNE Hub, please visit the Blocklet Store.

Option C: Connect via a Third-Party Model Provider
You can connect directly to a third-party model provider, such as OpenAI, by setting the appropriate API key as an environment variable.
For example, to use OpenAI, set the OPENAI_API_KEY variable (in your shell or a `.env` file):

```shell
export OPENAI_API_KEY="your_openai_api_key_here"
```

After setting the environment variable, restart the MCP server command. For a list of supported variables for other providers, such as DeepSeek or Google Gemini, refer to the example configuration file in the repository.
Available Agents
This example exposes several pre-built agents as MCP tools, each with a distinct function:
| Agent | File Path | Description |
|---|---|---|
| Current Time | | Provides the current date and time. |
| Poet | | Generates poetry and creative text. |
| System Info | | Reports information about the system. |
Connecting to an MCP Client
Once the server is running, you can connect it to an MCP-compatible client. The following example uses Claude Code.
- Add the running MCP server to Claude Code with the following command:

```shell
claude mcp add -t http test http://localhost:3456/mcp
```

- Invoke the agents from within the client. For instance, you can request system information or ask for a poem.
Example: Invoking the System Info Agent
Example: Invoking the Poet Agent
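Because the server speaks MCP over HTTP, you can also smoke-test it without a full client. The sketch below builds the JSON-RPC `tools/list` request defined by the MCP specification; the endpoint URL comes from the Quick Start above, and the exact response shape depends on the server implementation:

```shell
# JSON-RPC request body an MCP client sends to enumerate available tools.
body='{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
echo "$body"
# With the server from the Quick Start running, send it with:
#   curl -s -X POST http://localhost:3456/mcp \
#     -H "Content-Type: application/json" \
#     -H "Accept: application/json, text/event-stream" \
#     -d "$body"
```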
Observing Agent Activity
AIGNE includes an observability tool that allows you to monitor and debug agent executions in real-time.
- Start the observability server by running the following command in a new terminal window:

```shell
npx aigne observe --port 7890
```

- Open your web browser and navigate to http://localhost:7890.
The dashboard provides a user-friendly interface to inspect execution traces, view detailed call information, and understand agent behavior. This is an essential tool for debugging, performance tuning, and gaining insight into how your agents process information.

Below is an example of a detailed trace for a request handled by the Poet Agent.

Summary
You have successfully started an MCP server, connected it to an AI model, and exposed AIGNE agents as tools to an MCP client. This allows you to extend the capabilities of AI assistants with custom logic and data sources.
For more advanced examples and agent types, please explore the following sections: