
@aigne/anthropic



AIGNE Anthropic SDK for integrating with Claude AI models within the AIGNE Framework.

Introduction#

@aigne/anthropic provides seamless integration between the AIGNE Framework and Anthropic's Claude models and API. It lets developers use Claude in their AIGNE applications through a consistent interface across the framework while taking advantage of Claude's advanced AI capabilities.

Features#

  • Anthropic API Integration: Direct connection to Anthropic's API services using the official SDK
  • Chat Completions: Support for Claude's chat completions API with all available models
  • Tool Calling: Built-in support for Claude's tool calling capability
  • Streaming Responses: Support for streaming responses for more responsive applications
  • Type-Safe: Comprehensive TypeScript typings for all APIs and models
  • Consistent Interface: Compatible with the AIGNE Framework's model interface
  • Error Handling: Robust error handling and retry mechanisms
  • Full Configuration: Extensive configuration options for fine-tuning behavior

Installation#

Using npm#

npm install @aigne/anthropic @aigne/core

Using yarn#

yarn add @aigne/anthropic @aigne/core

Using pnpm#

pnpm add @aigne/anthropic @aigne/core

Basic Usage#

import { AnthropicChatModel } from "@aigne/anthropic";

const model = new AnthropicChatModel({
  // Provide API key directly or use environment variable ANTHROPIC_API_KEY or CLAUDE_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify Claude model version (defaults to 'claude-3-7-sonnet-latest')
  model: "claude-3-haiku-20240307",
  // Configure model behavior
  modelOptions: {
    temperature: 0.7,
  },
});

const result = await model.invoke({
  messages: [{ role: "user", content: "Tell me about yourself" }],
});

console.log(result);
/* Output:
{
  text: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?",
  model: "claude-3-haiku-20240307",
  usage: {
    inputTokens: 8,
    outputTokens: 15
  }
}
*/
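As the comments above note, the API key can be passed explicitly or picked up from `ANTHROPIC_API_KEY` or `CLAUDE_API_KEY`. A minimal sketch of that fallback order (the `resolveApiKey` helper and its `env` parameter are illustrative, not part of the package):

```typescript
// Resolve the Claude API key: an explicit value wins, otherwise fall back
// to ANTHROPIC_API_KEY, then CLAUDE_API_KEY, in the given environment map.
// `resolveApiKey` is an illustrative helper, not a @aigne/anthropic export.
function resolveApiKey(
  explicit: string | undefined,
  env: Record<string, string | undefined>,
): string | undefined {
  return explicit ?? env.ANTHROPIC_API_KEY ?? env.CLAUDE_API_KEY;
}

const env = { ANTHROPIC_API_KEY: "env-key" };
console.log(resolveApiKey("direct-key", env)); // "direct-key" — explicit key wins
console.log(resolveApiKey(undefined, env)); // "env-key" — environment fallback
```

In real code you would pass `process.env` as the environment map; a plain object is used here so the sketch runs anywhere.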

Streaming Responses#

import { AnthropicChatModel } from "@aigne/anthropic";

const model = new AnthropicChatModel({
  apiKey: "your-api-key",
  model: "claude-3-haiku-20240307",
});

const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Tell me about yourself" }],
  },
  { streaming: true },
);

let fullText = "";
const json = {};

for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}

console.log(fullText); // Output: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?"
console.log(json); // { model: "claude-3-haiku-20240307", usage: { inputTokens: 8, outputTokens: 15 } }
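The accumulation loop above can be exercised without an API key by faking a stream that yields the same delta shape. The `Chunk` type, `collect` helper, and `fakeStream` mock below are our own illustrations of that shape, not package exports:

```typescript
// The delta shape mirrors the streaming example above: each chunk may carry
// a text fragment and/or a partial JSON object.
type Chunk = {
  delta: { text?: { text: string }; json?: Record<string, unknown> };
};

// Accumulate text and JSON deltas exactly as the loop above does.
async function collect(stream: AsyncIterable<Chunk>) {
  let fullText = "";
  const json: Record<string, unknown> = {};
  for await (const chunk of stream) {
    const text = chunk.delta.text?.text;
    if (text) fullText += text;
    if (chunk.delta.json) Object.assign(json, chunk.delta.json);
  }
  return { fullText, json };
}

// A mock stream standing in for the real model response.
async function* fakeStream(): AsyncGenerator<Chunk> {
  yield { delta: { text: { text: "I'm Claude, " } } };
  yield { delta: { text: { text: "an AI assistant." } } };
  yield { delta: { json: { model: "claude-3-haiku-20240307" } } };
}

collect(fakeStream()).then(({ fullText, json }) => {
  console.log(fullText); // "I'm Claude, an AI assistant."
  console.log(json); // { model: "claude-3-haiku-20240307" }
});
```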

License#

Elastic-2.0

Classes#

AnthropicChatModel#

Implementation of the ChatModel interface for Anthropic's Claude API

This model provides access to Claude's capabilities including:

  • Text generation
  • Tool use
  • JSON structured output

Default model: 'claude-3-7-sonnet-latest'

Examples#

Here's how to create and use a Claude chat model:

const model = new AnthropicChatModel({
  // Provide API key directly or use environment variable ANTHROPIC_API_KEY or CLAUDE_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify Claude model version (defaults to 'claude-3-7-sonnet-latest')
  model: "claude-3-haiku-20240307",
  // Configure model behavior
  modelOptions: {
    temperature: 0.7,
  },
});

const result = await model.invoke({
  messages: [{ role: "user", content: "Tell me about yourself" }],
});

console.log(result);
/* Output:
{
  text: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?",
  model: "claude-3-haiku-20240307",
  usage: {
    inputTokens: 8,
    outputTokens: 15
  }
}
*/

Here's an example with streaming response:

const model = new AnthropicChatModel({
  apiKey: "your-api-key",
  model: "claude-3-haiku-20240307",
});

const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Tell me about yourself" }],
  },
  { streaming: true },
);

let fullText = "";
const json = {};

for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}

console.log(fullText); // Output: "I'm Claude, an AI assistant created by Anthropic. How can I help you today?"
console.log(json); // { model: "claude-3-haiku-20240307", usage: { inputTokens: 8, outputTokens: 15 } }

Extends#

  • ChatModel

Indexable#

[key: symbol]: (() => string) | (() => Promise<void>)

Constructors#

Constructor#

new AnthropicChatModel(options?): AnthropicChatModel

Parameters#

Parameter | Type
--------- | ----
options? | AnthropicChatModelOptions

Returns#

AnthropicChatModel

Overrides#

ChatModel.constructor

Properties#

options?#

optional options: AnthropicChatModelOptions

Accessors#

client#
Get Signature#

get client(): Anthropic

Returns#

Anthropic

modelOptions#
Get Signature#

get modelOptions(): undefined | ChatModelOptions

Returns#

undefined | ChatModelOptions

Methods#

process()#

process(input): PromiseOrValue<AgentProcessResult<ChatModelOutput>>

Process the input using Claude's chat model

Parameters#

Parameter | Type | Description
--------- | ---- | -----------
input | ChatModelInput | The input to process

Returns#

PromiseOrValue<AgentProcessResult<ChatModelOutput>>

The processed output from the model

Overrides#

ChatModel.process

Interfaces#

AnthropicChatModelOptions#

Configuration options for Claude Chat Model

Properties#

Property | Type | Description
-------- | ---- | -----------
apiKey? | string | API key for Anthropic's Claude API. If not provided, will look for ANTHROPIC_API_KEY or CLAUDE_API_KEY in environment variables.
model? | string | Claude model to use. Defaults to 'claude-3-7-sonnet-latest'.
modelOptions? | ChatModelOptions | Additional model options to control behavior.
