Examples


Explore a collection of practical examples that demonstrate the various features and workflow patterns of the AIGNE Framework. This section provides hands-on, executable demos to help you understand intelligent conversation, MCP protocol integration, memory mechanisms, and complex agentic workflows.

Quick Start#

You can run any example directly without a local installation using npx. This approach is the fastest way to see the AIGNE Framework in action.

Prerequisites#

  • Node.js (version 20.0 or higher) and npm installed.
  • An API key for your chosen Large Language Model (LLM) provider (e.g., OpenAI).
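Before running anything, you can sanity-check these prerequisites from the shell. This is a generic sketch, not an official AIGNE tool; `major_of` is just a local helper for parsing the `node --version` string.

```shell
# Quick prerequisite check for the AIGNE examples:
# confirm Node.js 20+ and npm are available on the PATH.

# major_of extracts the major version number from a string like "v20.11.1".
major_of() {
  printf '%s\n' "$1" | sed 's/^v//' | cut -d. -f1
}

if command -v node > /dev/null 2>&1; then
  node_version=$(node --version)
  if [ "$(major_of "$node_version")" -ge 20 ]; then
    echo "Node.js $node_version is new enough"
  else
    echo "AIGNE examples need Node.js 20+, found $node_version" >&2
  fi
else
  echo "Node.js is not installed" >&2
fi

command -v npm > /dev/null 2>&1 || echo "npm is not installed" >&2
```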

Run an Example#

Execute the following commands in your terminal to run a basic chatbot.

  1. Set your API key: Replace YOUR_OPENAI_API_KEY with your actual OpenAI API key.
    export OPENAI_API_KEY=YOUR_OPENAI_API_KEY
  2. Run in one-shot mode: The agent will process a default prompt and exit.
    npx -y @aigne/example-chat-bot
  3. Run in interactive mode: Use the --interactive flag to start an interactive session where you can have a conversation with the agent.
    npx -y @aigne/example-chat-bot --interactive

Using Different LLMs#

You can specify different models by setting the MODEL environment variable along with the corresponding API key. Below are configurations for several popular providers.

  • OpenAI
    export MODEL=openai:gpt-4o
    export OPENAI_API_KEY=...
  • Anthropic
    export MODEL=anthropic:claude-3-opus-20240229
    export ANTHROPIC_API_KEY=...
  • Google Gemini
    export MODEL=gemini:gemini-1.5-flash
    export GEMINI_API_KEY=...
  • DeepSeek
    export MODEL=deepseek/deepseek-chat
    export DEEPSEEK_API_KEY=...
  • AWS Bedrock
    export MODEL=bedrock:anthropic.claude-3-sonnet-20240229-v1:0
    export AWS_ACCESS_KEY_ID=...
    export AWS_SECRET_ACCESS_KEY=...
    export AWS_REGION=...
  • Ollama
    export MODEL=llama3
    export OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1"
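For example, switching the chat bot from OpenAI to Anthropic only requires exporting the corresponding variables before launching the example. The key value below is a placeholder; substitute your real Anthropic API key.

```shell
# Run the chat bot against Anthropic instead of the default OpenAI model.
export MODEL=anthropic:claude-3-opus-20240229
export ANTHROPIC_API_KEY=...   # your Anthropic API key here

npx -y @aigne/example-chat-bot --interactive
```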

Example Library#

This section organizes a curated list of examples into the categories below, each demonstrating a specific capability or workflow pattern within the AIGNE Framework. Each example has its own detailed guide.

Core Functionality#

Workflow Patterns#

MCP and Integrations#

Debugging#

To gain insight into an agent's execution, you can enable debug logs or use the AIGNE observation server.

View Debug Logs#

Set the DEBUG environment variable to * to output detailed logs, which include model calls and responses.

DEBUG=* npx -y @aigne/example-chat-bot --interactive
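Because `DEBUG=*` produces a large volume of output, it is often easier to capture the logs for later inspection. The sketch below assumes the debug logs are written to stderr, as is conventional for `debug`-style loggers; adjust the redirection if that assumption does not hold.

```shell
# Capture the verbose debug output in a file while the chat runs normally.
DEBUG=* npx -y @aigne/example-chat-bot 2> aigne-debug.log

# Inspect the captured logs afterwards.
less aigne-debug.log
```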

Use the Observation Server#

The aigne observe command starts a local web server that provides a user-friendly interface to inspect execution traces, view detailed call information, and understand your agent’s behavior. This is a powerful tool for debugging and performance tuning.

  1. Install the AIGNE CLI:
    npm install -g @aigne/cli
  2. Start the observation server:
    aigne observe

    (Screenshot: a terminal showing the aigne observe command starting the server.)

  3. View Traces: After running an agent, open your browser to http://localhost:7893 to see a list of recent executions and inspect the details of each run.
    (Screenshot: the AIGNE observability interface showing a list of traces.)
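Putting the steps above together, a typical debugging session uses two terminals: one for the observation server and one for the agent under test.

```shell
# Terminal 1: install the CLI (once) and start the observation server.
npm install -g @aigne/cli
aigne observe        # serves the trace UI at http://localhost:7893

# Terminal 2: run an agent; its executions appear as traces in the UI.
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY
npx -y @aigne/example-chat-bot --interactive
```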