Overview


The AIGNE Framework is designed to be model-agnostic, allowing developers to integrate a wide variety of large language models (LLMs) and image generation models from different providers. This is achieved through a standardized adapter system that abstracts the unique APIs of each provider behind a consistent interface.

At the core of this system are the ChatModel and ImageModel agents. These specialized agents act as a bridge between your application's logic and the external AI services. By using these standardized agents, you can switch between model providers with minimal code changes, often only requiring a change in configuration. Each supported provider has a dedicated package (e.g., @aigne/openai, @aigne/anthropic) that contains the specific implementation for communicating with its API.

This section provides a summary of all officially supported model providers. For detailed instructions on how to install, configure, and use a specific provider, please refer to its dedicated documentation page.
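As a rough illustration of the adapter system described above, the sketch below swaps one chat model provider for another by changing only how the model is constructed. The class names (OpenAIChatModel, AnthropicChatModel), constructor options, and invoke call shown here are assumptions for illustration only; consult each provider's documentation page for the actual exports and options.

```typescript
// Hypothetical sketch: exact class names, options, and methods may differ
// from the real @aigne/* packages — see each provider's page for details.
import { OpenAIChatModel } from "@aigne/openai"; // assumed export
import { AnthropicChatModel } from "@aigne/anthropic"; // assumed export

// Both adapters expose the same ChatModel interface, so the surrounding
// application code does not change when the provider changes.
const model =
  process.env.MODEL_PROVIDER === "anthropic"
    ? new AnthropicChatModel({ apiKey: process.env.ANTHROPIC_API_KEY })
    : new OpenAIChatModel({ apiKey: process.env.OPENAI_API_KEY });

// A single call site works regardless of which provider backs `model`.
const result = await model.invoke({
  messages: [{ role: "user", content: "Summarize the AIGNE adapter system." }],
});
console.log(result);
```

In practice this means provider selection can live entirely in configuration (for example, an environment variable), while the rest of the application only depends on the shared ChatModel interface.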


Supported Chat Models

The following table lists the chat model providers that are officially supported by the AIGNE Framework. Select a provider to view its detailed integration guide.

| Provider | Package |
| --- | --- |
| AIGNE Hub | @aigne/aigne-hub |
| Anthropic | @aigne/anthropic |
| AWS Bedrock | @aigne/bedrock |
| DeepSeek | @aigne/deepseek |
| Doubao | @aigne/doubao |
| Google Gemini | @aigne/gemini |
| LMStudio | @aigne/lmstudio |
| Ollama | @aigne/ollama |
| OpenAI | @aigne/openai |
| OpenRouter | @aigne/open-router |
| Poe | @aigne/poe |
| xAI | @aigne/xai |

Supported Image Models

The following table lists the image generation model providers that are officially supported. Select a provider to view its detailed integration guide.

| Provider | Package |
| --- | --- |
| AIGNE Hub | @aigne/aigne-hub |
| Doubao | @aigne/doubao |
| Google Gemini | @aigne/gemini |
| Ideogram | @aigne/ideogram |
| OpenAI | @aigne/openai |
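Image model adapters follow the same pattern as chat models. The sketch below is illustrative only; the class name (OpenAIImageModel), option shape, and result format are assumptions, so check the relevant provider page for the real API.

```typescript
// Hypothetical sketch — class name, options, and result shape are assumptions.
import { OpenAIImageModel } from "@aigne/openai"; // assumed export

const imageModel = new OpenAIImageModel({ apiKey: process.env.OPENAI_API_KEY });

// An ImageModel agent is invoked much like a ChatModel agent,
// with a prompt describing the image to generate.
const image = await imageModel.invoke({
  prompt: "A minimalist logo for a model-agnostic AI framework",
});
console.log(image);
```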