@aigne/anthropic
AIGNE Anthropic SDK for integrating with Claude AI models within the AIGNE Framework.
Introduction#
@aigne/anthropic provides seamless integration between the AIGNE Framework and Anthropic's Claude language models and API. The package lets developers use Claude models in their AIGNE applications through the framework's consistent model interface while taking advantage of Claude's advanced AI capabilities.
Features#
- Anthropic API Integration: Direct connection to Anthropic's API services using the official SDK
- Chat Completions: Support for Claude's chat completions API with all available models
- Tool Calling: Built-in support for Claude's tool calling capability (see the Tool Calling example below)
- Streaming Responses: Support for streaming responses for more responsive applications
- Type-Safe: Comprehensive TypeScript typings for all APIs and models
- Consistent Interface: Compatible with the AIGNE Framework's model interface
- Error Handling: Robust error handling and retry mechanisms
- Full Configuration: Extensive configuration options for fine-tuning behavior
Installation#
Using npm#
npm install @aigne/anthropic @aigne/core
Using yarn#
yarn add @aigne/anthropic @aigne/core
Using pnpm#
pnpm add @aigne/anthropic @aigne/core
Basic Usage#
import { AnthropicChatModel } from "@aigne/anthropic";
const model = new AnthropicChatModel({
  // Provide API key directly or use environment variable ANTHROPIC_API_KEY or CLAUDE_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify Claude model version (defaults to 'claude-3-7-sonnet-latest')
  model: "claude-3-haiku-20240307",
  // Configure model behavior
  modelOptions: {
    temperature: 0.7,
  },
});
const result = await model.invoke({
  messages: [{ role: "user", content: "Tell me about yourself" }],
});
console.log(result);
/* Output:
{
  text: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?",
  model: "claude-3-haiku-20240307",
  usage: {
    inputTokens: 8,
    outputTokens: 15
  }
}
*/
Streaming Responses#
import { AnthropicChatModel } from "@aigne/anthropic";
const model = new AnthropicChatModel({
  apiKey: "your-api-key",
  model: "claude-3-haiku-20240307",
});
const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Tell me about yourself" }],
  },
  undefined,
  { streaming: true },
);
let fullText = "";
const json = {};
for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}
console.log(fullText); // Output: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?"
console.log(json); // { model: "claude-3-haiku-20240307", usage: { inputTokens: 8, outputTokens: 15 } }
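Tool Calling#
Tool calling is listed among the package's features. The sketch below is illustrative only: it assumes the function-style tools field of @aigne/core's ChatModelInput (a name, description, and JSON Schema parameters) and a toolCalls field on the output; check the core package types for the exact shapes.
import { AnthropicChatModel } from "@aigne/anthropic";
const model = new AnthropicChatModel({
  apiKey: "your-api-key",
  model: "claude-3-haiku-20240307",
});
// Hypothetical weather tool described with a JSON Schema for its parameters.
const result = await model.invoke({
  messages: [{ role: "user", content: "What is the weather in Paris?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
});
// If Claude decides to call the tool, the result is expected to carry the
// requested tool call (tool name plus JSON arguments) instead of plain text.
console.log(result.toolCalls);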
License#
Elastic-2.0
Classes#
AnthropicChatModel#
Implementation of the ChatModel interface for Anthropic's Claude API
This model provides access to Claude's capabilities including:
- Text generation
- Tool use
- JSON structured output
Default model: 'claude-3-7-sonnet-latest'
Examples#
Here's how to create and use a Claude chat model:
import { AnthropicChatModel } from "@aigne/anthropic";
const model = new AnthropicChatModel({
  // Provide API key directly or use environment variable ANTHROPIC_API_KEY or CLAUDE_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify Claude model version (defaults to 'claude-3-7-sonnet-latest')
  model: "claude-3-haiku-20240307",
  // Configure model behavior
  modelOptions: {
    temperature: 0.7,
  },
});
const result = await model.invoke({
  messages: [{ role: "user", content: "Tell me about yourself" }],
});
console.log(result);
/* Output:
{
  text: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?",
  model: "claude-3-haiku-20240307",
  usage: {
    inputTokens: 8,
    outputTokens: 15
  }
}
*/
Here's an example with a streaming response:
import { AnthropicChatModel } from "@aigne/anthropic";
const model = new AnthropicChatModel({
  apiKey: "your-api-key",
  model: "claude-3-haiku-20240307",
});
const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Tell me about yourself" }],
  },
  { streaming: true },
);
let fullText = "";
const json = {};
for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}
console.log(fullText); // Output: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?"
console.log(json); // { model: "claude-3-haiku-20240307", usage: { inputTokens: 8, outputTokens: 15 } }
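The class capabilities above also list JSON structured output. The following sketch is illustrative only: it assumes a json_schema responseFormat on the input and a json field on the output, mirroring @aigne/core's ChatModel types; check those types for the exact shapes.
import { AnthropicChatModel } from "@aigne/anthropic";
const model = new AnthropicChatModel({
  apiKey: "your-api-key",
  model: "claude-3-haiku-20240307",
});
// Ask for output that conforms to a JSON Schema; the parsed object is
// expected on `result.json` rather than `result.text`.
const result = await model.invoke({
  messages: [{ role: "user", content: "Name a city and its country" }],
  responseFormat: {
    type: "json_schema",
    jsonSchema: {
      name: "city_info",
      schema: {
        type: "object",
        properties: {
          city: { type: "string" },
          country: { type: "string" },
        },
        required: ["city", "country"],
      },
    },
  },
});
console.log(result.json); // e.g. { city: "Paris", country: "France" }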
Extends#
ChatModel
Indexable#
[key: symbol]: () => string | () => Promise<void>
Constructors#
Constructor#
new AnthropicChatModel(options?): AnthropicChatModel
Parameters#
Parameter | Type |
|---|---|
| options? | AnthropicChatModelOptions |
Returns#
AnthropicChatModel
Overrides#
ChatModel.constructor
Properties#
options?#
optional options: AnthropicChatModelOptions
Accessors#
client#
Get Signature#
get client(): Anthropic
Returns#
Anthropic
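For example, the underlying SDK instance can be used directly for calls outside the ChatModel interface. A minimal sketch (the max_tokens value and prompt are illustrative):
import { AnthropicChatModel } from "@aigne/anthropic";
const model = new AnthropicChatModel({ apiKey: "your-api-key" });
// `client` exposes the official Anthropic SDK instance used under the hood.
const message = await model.client.messages.create({
  model: "claude-3-haiku-20240307",
  max_tokens: 256,
  messages: [{ role: "user", content: "Hello, Claude" }],
});
console.log(message.content);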
modelOptions#
Get Signature#
get modelOptions(): undefined | ChatModelOptions
Returns#
undefined | ChatModelOptions
Methods#
process()#
process(input): PromiseOrValue<AgentProcessResult<ChatModelOutput>>
Process the input using Claude's chat model
Parameters#
Parameter | Type | Description |
|---|---|---|
| input | ChatModelInput | The input to process |
Returns#
PromiseOrValue<AgentProcessResult<ChatModelOutput>>
The processed output from the model
Overrides#
ChatModel.process
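In typical application code this method is not called directly; the invoke() calls shown in the examples above are the usual entry point and delegate to it.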
Interfaces#
AnthropicChatModelOptions#
Configuration options for Claude Chat Model
Properties#
Property | Type | Description |
|---|---|---|
| apiKey? | string | API key for Anthropic's Claude API. If not provided, will look for ANTHROPIC_API_KEY or CLAUDE_API_KEY in environment variables. |
| model? | string | Claude model to use. Defaults to 'claude-3-7-sonnet-latest'. |
| modelOptions? | ChatModelOptions | Additional model options to control behavior. |
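For instance, when ANTHROPIC_API_KEY or CLAUDE_API_KEY is already exported in the environment, apiKey can be omitted and the default model is used; a minimal sketch:
import { AnthropicChatModel } from "@aigne/anthropic";
// Reads the API key from the environment and falls back to the default
// model ('claude-3-7-sonnet-latest').
const model = new AnthropicChatModel({
  modelOptions: {
    temperature: 0.2,
  },
});
const result = await model.invoke({
  messages: [{ role: "user", content: "Summarize the AIGNE Framework in one sentence." }],
});
console.log(result.text);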