# MCP GitHub
This guide walks through an example that shows how to interact with GitHub repositories using the AIGNE Framework and the GitHub MCP (Model Context Protocol) Server. You will learn how to set up and run the example, enabling an AI agent to perform GitHub operations such as searching repositories and managing files.
## Overview
The example showcases the integration of an MCPAgent that connects to a GitHub MCP server. This agent equips the AI with a suite of tools (skills) to interact with the GitHub API. The workflow allows a user to make requests in natural language, which the AI agent then translates into specific function calls to the GitHub agent to perform actions such as searching for repositories, reading file contents, or creating issues.
## Prerequisites
Before proceeding, ensure your system meets the following requirements (a quick version check is shown after the list):
- Node.js: Version 20.0 or higher.
- npm: Included with Node.js.
- GitHub Personal Access Token: A token with the necessary permissions for the repositories you intend to interact with. You can create one from your GitHub settings.
- AI Model Provider Account: An API key from a provider like OpenAI or a connection to an AIGNE Hub instance.
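You can confirm the Node.js and npm requirements from a terminal:

```bash
node --version   # should report v20.0.0 or newer
npm --version    # npm is installed together with Node.js
```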
## Quick Start
You can run this example directly without a local installation using npx.
First, set your GitHub token as an environment variable.
Set your GitHub token:

```bash
export GITHUB_TOKEN=YOUR_GITHUB_TOKEN
```

Next, execute the example.

Run the example:

```bash
npx -y @aigne/example-mcp-github
```

## Connect to an AI Model
Upon first execution, if no AI model is configured, you will be prompted to connect one.
You have several options to proceed:
### 1. Connect to the Official AIGNE Hub
This is the recommended approach. Selecting this option will open your browser to the official AIGNE Hub page. Follow the on-screen instructions to authorize the connection. New users receive complimentary credits to get started.
### 2. Connect to a Self-Hosted AIGNE Hub
If you operate your own instance of AIGNE Hub, choose this option. You will be prompted to enter the URL of your self-hosted Hub to complete the connection.
### 3. Configure a Third-Party Model Provider
You can also connect directly to a supported third-party model provider, such as OpenAI. To do this, set the provider's API key as an environment variable.
For example, to use OpenAI, configure your OPENAI_API_KEY:
Set your OpenAI API key:

```bash
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
```

After setting the environment variable, run the npx command again. For a list of supported environment variables for other providers, refer to the .env.local.example file in the project source.
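As an illustrative sketch only, a local environment file for other providers might look like the following; apart from OPENAI_API_KEY, the variable names shown here are assumptions, so treat .env.local.example in the repository as the authoritative reference:

```bash
# Illustrative .env.local sketch; only OPENAI_API_KEY is confirmed by this guide.
# The other variable names are assumptions - check .env.local.example for the real list.
OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
# ANTHROPIC_API_KEY="YOUR_ANTHROPIC_API_KEY"
# GEMINI_API_KEY="YOUR_GEMINI_API_KEY"
```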
## Installation from Source
For developers who wish to inspect or modify the code, follow these steps to run the example from a local clone.
### 1. Clone the Repository
Clone the main AIGNE Framework repository from GitHub.
```bash
git clone https://github.com/AIGNE-io/aigne-framework
```

### 2. Install Dependencies
Navigate to the example's directory and install the necessary dependencies using pnpm.
```bash
cd aigne-framework/examples/mcp-github
pnpm install
```

### 3. Run the Example
Execute the start script to run the example.
Run in one-shot mode:

```bash
pnpm start
```

The example also supports interactive chat mode and can accept input piped from other commands.

Run in interactive chat mode:

```bash
pnpm start -- --chat
```

Use pipeline input:

```bash
echo "Search for repositories related to 'modelcontextprotocol'" | pnpm start
```

## Command-Line Options
You can customize the execution with the following command-line parameters:
| Parameter | Description | Default |
|---|---|---|
| `--chat` | Run in interactive chat mode. | Disabled (one-shot mode) |
| `--model` | Specify the AI model to use. | |
| `--temperature` | Set the temperature for model generation. | Provider default |
| `--top-p` | Set the top-p sampling value. | Provider default |
| `--presence-penalty` | Set the presence penalty value. | Provider default |
| `--frequency-penalty` | Set the frequency penalty value. | Provider default |
| `--log-level` | Set the logging level. | |
| `--input` | Specify input directly as an argument. | None |
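As a usage sketch (assuming the flag names in the table above), several options can be combined in a single invocation:

```bash
# Interactive chat with an explicit model and a lower sampling temperature.
# Flag names follow the table above; adjust them if your local version differs.
pnpm start -- --chat --model openai --temperature 0.3
```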
## Code Example
The following TypeScript code demonstrates the core logic of the example. It initializes an AI model, sets up the MCPAgent for GitHub, and invokes an AIAgent to perform a repository search.
index.ts

```typescript
import { AIAgent, AIGNE, MCPAgent } from "@aigne/core";
import { OpenAIChatModel } from "@aigne/core/models/openai-chat-model.js";

// Load environment variables
const { OPENAI_API_KEY, GITHUB_TOKEN } = process.env;

// Initialize OpenAI model
const model = new OpenAIChatModel({
  apiKey: OPENAI_API_KEY,
});

// Initialize GitHub MCP agent
const githubMCPAgent = await MCPAgent.from({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-github"],
  env: {
    GITHUB_TOKEN,
  },
});

// Create AIGNE instance with the model and GitHub skills
const aigne = new AIGNE({
  model,
  skills: [githubMCPAgent],
});
```
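The listing above covers the setup portion of index.ts (the full file runs to 33 lines). As a hedged sketch of how the remaining lines might tie things together, assuming the AIAgent.from and aigne.invoke APIs exported by @aigne/core, the agent could be created, invoked, and shut down like this:

```typescript
// Hedged continuation sketch, not the verbatim remainder of index.ts.
// Create an agent whose instructions let it use the GitHub skills registered above.
const agent = AIAgent.from({
  instructions: "You are a helpful assistant that uses GitHub tools to answer questions.",
});

// Invoke the agent with a natural-language request; the framework decides which
// GitHub MCP skills to call in order to fulfil it.
const result = await aigne.invoke(
  agent,
  "Search for repositories related to 'modelcontextprotocol'",
);
console.log(result);

// Stop the AIGNE instance and the underlying MCP server process when done.
await aigne.shutdown();
```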
## Available Operations
The GitHub MCP server exposes a wide range of GitHub functionality as skills that the AI agent can use. The main categories are listed below, with example prompts after the list:
- Repository Operations: Search, create, and get information about repositories.
- File Operations: Get file contents, create or update files, and push multiple files in a single commit.
- Issue and PR Operations: Create issues and pull requests, add comments, and merge pull requests.
- Search Operations: Search code, issues, and users.
- Commit Operations: List commits and get commit details.
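Each category can be exercised with a plain-English request, for example via the piped-input form shown earlier (the repository, file, and issue details below are placeholders):

```bash
# File operation: read a file from a repository (placeholder path)
echo "Show me the contents of README.md in AIGNE-io/aigne-framework" | pnpm start

# Issue operation: open an issue (placeholder repository and title)
echo "Create an issue in my-org/my-repo titled 'Docs: fix typo in setup guide'" | pnpm start
```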
## Debugging with AIGNE Observe
To inspect and analyze the behavior of your agent, you can use the aigne observe command. This tool starts a local web server that provides a user interface for viewing execution traces, call details, and other runtime data.
To start the observation server, run:
Start the AIGNE observe server:

```bash
aigne observe
```

Once the server is running, you can access the web interface in your browser to view a list of recent executions and drill down into the details of each trace.
This tool is invaluable for debugging, understanding how the agent interacts with tools and models, and optimizing performance.
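If the aigne command is not already available on your machine, it is provided by the AIGNE CLI; assuming the package name @aigne/cli, it can be installed globally or run ad hoc:

```bash
# Install the AIGNE CLI globally (package name assumed to be @aigne/cli)
npm install -g @aigne/cli

# Or run the observer without a global install
npx -y @aigne/cli observe
```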