
@aigne/deepseek



AIGNE DeepSeek SDK for integrating DeepSeek AI models with the AIGNE Framework.

Introduction#

@aigne/deepseek integrates the AIGNE Framework with DeepSeek's language models and API. The package lets developers use DeepSeek models in AIGNE applications through the framework's consistent model interface while taking advantage of DeepSeek's capabilities.

Features#

  • DeepSeek API Integration: Direct connection to DeepSeek's API services
  • Chat Completions: Support for DeepSeek's chat completions API with all available models
  • Function Calling: Built-in support for function calling capabilities
  • Streaming Responses: Streaming support for lower-latency, more responsive applications
  • Type-Safe: Comprehensive TypeScript typings for all APIs and models
  • Consistent Interface: Compatible with the AIGNE Framework's model interface
  • Error Handling: Robust error handling and retry mechanisms
  • Full Configuration: Extensive configuration options for fine-tuning behavior
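The function-calling feature listed above uses DeepSeek's OpenAI-compatible tool format. A minimal sketch of a tool definition follows; note that the `ToolDefinition` type and the `getWeather` tool are illustrative assumptions for this example, not exports of this package, and the exact `invoke` shape may differ by framework version:

```typescript
// Illustrative sketch: an OpenAI-compatible tool definition.
// ToolDefinition is a local type written for clarity only;
// it is NOT exported by @aigne/deepseek.
type ToolDefinition = {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>; // JSON Schema for the arguments
  };
};

const getWeather: ToolDefinition = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
      },
      required: ["city"],
    },
  },
};

// Assumed usage (hypothetical; check your framework version's signature):
// const result = await model.invoke({ messages, tools: [getWeather] });
console.log(getWeather.function.name); // → "get_weather"
```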

Installation#

Using npm#

npm install @aigne/deepseek @aigne/core

Using yarn#

yarn add @aigne/deepseek @aigne/core

Using pnpm#

pnpm add @aigne/deepseek @aigne/core

Basic Usage#

import { DeepSeekChatModel } from "@aigne/deepseek";

const model = new DeepSeekChatModel({
  // Provide API key directly or use environment variable DEEPSEEK_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify model version (defaults to 'deepseek-chat')
  model: "deepseek-chat",
  modelOptions: {
    temperature: 0.7,
  },
});

const result = await model.invoke({
  messages: [{ role: "user", content: "Introduce yourself" }],
});

console.log(result);
/* Output:
{
  text: "Hello! I'm an AI assistant powered by DeepSeek's language model.",
  model: "deepseek-chat",
  usage: {
    inputTokens: 7,
    outputTokens: 12
  }
}
*/

Streaming Responses#

import { DeepSeekChatModel } from "@aigne/deepseek";

const model = new DeepSeekChatModel({
  apiKey: "your-api-key",
  model: "deepseek-chat",
});

const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Introduce yourself" }],
  },
  undefined,
  { streaming: true },
);

let fullText = "";
const json = {};

for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}

console.log(fullText); // Output: "Hello! I'm an AI assistant powered by DeepSeek's language model."
console.log(json); // { model: "deepseek-chat", usage: { inputTokens: 7, outputTokens: 12 } }

License#

Elastic-2.0

Classes#

DeepSeekChatModel#

Implementation of the ChatModel interface for DeepSeek's API

This model uses the OpenAI-compatible API format to interact with DeepSeek's models, with DeepSeek-specific configuration and capabilities.

Default model: 'deepseek-chat'

Examples#

Here's how to create and use a DeepSeek chat model:

const model = new DeepSeekChatModel({
  // Provide API key directly or use environment variable DEEPSEEK_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify model version (defaults to 'deepseek-chat')
  model: "deepseek-chat",
  modelOptions: {
    temperature: 0.7,
  },
});

const result = await model.invoke({
  messages: [{ role: "user", content: "Introduce yourself" }],
});

console.log(result);
/* Output:
{
  text: "Hello! I'm an AI assistant powered by DeepSeek's language model.",
  model: "deepseek-chat",
  usage: {
    inputTokens: 7,
    outputTokens: 12
  }
}
*/

Here's an example with streaming response:

const model = new DeepSeekChatModel({
  apiKey: "your-api-key",
  model: "deepseek-chat",
});

const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Introduce yourself" }],
  },
  { streaming: true },
);

let fullText = "";
const json = {};

for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}

console.log(fullText); // Output: "Hello! I'm an AI assistant powered by DeepSeek's language model."
console.log(json); // { model: "deepseek-chat", usage: { inputTokens: 7, outputTokens: 12 } }

Extends#

  • OpenAIChatModel

Indexable#

[key: symbol]: (() => string) | (() => Promise<void>)

Constructors#

Constructor#

new DeepSeekChatModel(options?): DeepSeekChatModel

Parameters#

| Parameter | Type |
| --- | --- |
| options? | OpenAIChatModelOptions |

Returns#

DeepSeekChatModel

Overrides#

OpenAIChatModel.constructor

Properties#

apiKeyEnvName#

protected apiKeyEnvName: string = "DEEPSEEK_API_KEY"

Overrides#

OpenAIChatModel.apiKeyEnvName
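
Because the key is resolved from the DEEPSEEK_API_KEY environment variable, you can omit apiKey from the constructor and export the variable instead. A minimal shell sketch (the key value is a placeholder):

```shell
# Export the key so DeepSeekChatModel can pick it up
# without an explicit apiKey option
export DEEPSEEK_API_KEY="your-api-key"

# Verify it is visible to child processes (e.g. your Node.js app)
printenv DEEPSEEK_API_KEY
```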

supportsNativeStructuredOutputs#

protected supportsNativeStructuredOutputs: boolean = false

Overrides#

OpenAIChatModel.supportsNativeStructuredOutputs

supportsToolsEmptyParameters#

protected supportsToolsEmptyParameters: boolean = false

Overrides#

OpenAIChatModel.supportsToolsEmptyParameters
