@aigne/gemini
AIGNE Gemini SDK for integrating with Google's Gemini AI models within the AIGNE Framework.
Introduction#
@aigne/gemini provides seamless integration between the AIGNE Framework and Google's Gemini language models and API. This package lets developers easily leverage Gemini's advanced AI capabilities in their AIGNE applications, offering a consistent interface across the framework while taking advantage of Google's state-of-the-art multimodal models.
Features#
- Google Gemini API Integration: Direct connection to Google's Gemini API services
- Chat Completions: Support for Gemini's chat completions API with all available models
- Multimodal Support: Built-in support for handling both text and image inputs
- Function Calling: Support for function calling capabilities
- Streaming Responses: Support for streaming responses for more responsive applications
- Type-Safe: Comprehensive TypeScript typings for all APIs and models
- Consistent Interface: Compatible with the AIGNE Framework's model interface
- Error Handling: Robust error handling and retry mechanisms
- Full Configuration: Extensive configuration options for fine-tuning behavior
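The retry behavior mentioned above can also be layered around any model call yourself. Below is a minimal, hypothetical sketch: the `withRetry` helper and its backoff parameters are illustrative, not part of the SDK.

```typescript
// Hypothetical retry helper: not part of @aigne/gemini; shown only to
// illustrate wrapping a model call with exponential backoff.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      // Exponential backoff: 500ms, 1000ms, 2000ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}

// Usage (sketch): wrap any invoke call.
// const result = await withRetry(() =>
//   model.invoke({ messages: [{ role: "user", content: "Hello" }] }),
// );
```

Whether you rely on the SDK's built-in retries or a wrapper like this depends on how much control you want over backoff timing and which errors you treat as transient.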
Installation#
Using npm#
npm install @aigne/gemini @aigne/core
Using yarn#
yarn add @aigne/gemini @aigne/core
Using pnpm#
pnpm add @aigne/gemini @aigne/core
Basic Usage#
import { GeminiChatModel } from "@aigne/gemini";
const model = new GeminiChatModel({
  // Provide API key directly or use the environment variable GEMINI_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify Gemini model version (defaults to 'gemini-1.5-pro' if not specified)
  model: "gemini-1.5-flash",
  modelOptions: {
    temperature: 0.7,
  },
});
const result = await model.invoke({
  messages: [{ role: "user", content: "Hi there, introduce yourself" }],
});
console.log(result);
/* Output:
{
  text: "Hello from Gemini! I'm Google's helpful AI assistant. How can I assist you today?",
  model: "gemini-1.5-flash"
}
*/
Streaming Responses#
import { GeminiChatModel } from "@aigne/gemini";
const model = new GeminiChatModel({
  apiKey: "your-api-key",
  model: "gemini-1.5-flash",
});
const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Hi there, introduce yourself" }],
  },
  { streaming: true },
);
let fullText = "";
const json = {};
for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}
console.log(fullText); // Output: "Hello from Gemini! I'm Google's helpful AI assistant. How can I assist you today?"
console.log(json); // { model: "gemini-1.5-flash" }
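The accumulation loop above depends only on the chunk shape (`{ delta: { text?, json? } }`), so it can be exercised against a hand-rolled mock stream. The chunk contents and the `Chunk` interface below are illustrative stand-ins, not types exported by the SDK; the real stream comes from `model.invoke(..., { streaming: true })`.

```typescript
// Illustrative stand-in for the streamed chunk shape used above.
interface Chunk {
  delta: { text?: { text?: string }; json?: Record<string, unknown> };
}

// Mock stream emitting text deltas followed by a JSON delta.
async function* mockStream(): AsyncGenerator<Chunk> {
  yield { delta: { text: { text: "Hello " } } };
  yield { delta: { text: { text: "from Gemini!" } } };
  yield { delta: { json: { model: "gemini-1.5-flash" } } };
}

// Same accumulation logic as the example above, factored into a function.
async function collect(stream: AsyncIterable<Chunk>) {
  let fullText = "";
  const json: Record<string, unknown> = {};
  for await (const chunk of stream) {
    const text = chunk.delta.text?.text;
    if (text) fullText += text;
    if (chunk.delta.json) Object.assign(json, chunk.delta.json);
  }
  return { fullText, json };
}
```

Because text arrives incrementally while structured fields arrive as JSON deltas, accumulating both separately, as shown, keeps the final text and metadata intact.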
License#
Elastic-2.0
Classes#
GeminiChatModel#
Implementation of the ChatModel interface for Google's Gemini API
This model uses the OpenAI-compatible API format to interact with Google's Gemini models, providing access to models such as Gemini 1.5 and Gemini 2.0.
Examples#
Here's how to create and use a Gemini chat model:
const model = new GeminiChatModel({
  // Provide API key directly or use the environment variable GEMINI_API_KEY
  apiKey: "your-api-key", // Optional if set in env variables
  // Specify Gemini model version (defaults to 'gemini-1.5-pro' if not specified)
  model: "gemini-1.5-flash",
  modelOptions: {
    temperature: 0.7,
  },
});
const result = await model.invoke({
  messages: [{ role: "user", content: "Hi there, introduce yourself" }],
});
console.log(result);
/* Output:
{
  text: "Hello from Gemini! I'm Google's helpful AI assistant. How can I assist you today?",
  model: "gemini-1.5-flash"
}
*/
Here's an example with streaming response:
const model = new GeminiChatModel({
  apiKey: "your-api-key",
  model: "gemini-1.5-flash",
});
const stream = await model.invoke(
  {
    messages: [{ role: "user", content: "Hi there, introduce yourself" }],
  },
  { streaming: true },
);
let fullText = "";
const json = {};
for await (const chunk of stream) {
  const text = chunk.delta.text?.text;
  if (text) fullText += text;
  if (chunk.delta.json) Object.assign(json, chunk.delta.json);
}
console.log(fullText); // Output: "Hello from Gemini! I'm Google's helpful AI assistant. How can I assist you today?"
console.log(json); // { model: "gemini-1.5-flash" }
Extends#
OpenAIChatModel
Indexable#
[key: symbol]: (() => string) | (() => Promise<void>)
Constructors#
Constructor#
new GeminiChatModel(options?): GeminiChatModel
Parameters#
| Parameter | Type |
|---|---|
| `options`? |  |
Returns#
Overrides#
OpenAIChatModel.constructor
Properties#
apiKeyEnvName#
protected apiKeyEnvName: string = "GEMINI_API_KEY"
Overrides#
OpenAIChatModel.apiKeyEnvName
supportsEndWithSystemMessage#
protected supportsEndWithSystemMessage: boolean = false
Overrides#
OpenAIChatModel.supportsEndWithSystemMessage
supportsToolsUseWithJsonSchema#
protected supportsToolsUseWithJsonSchema: boolean = false
Overrides#
OpenAIChatModel.supportsToolsUseWithJsonSchema
supportsParallelToolCalls#
protected supportsParallelToolCalls: boolean = false
Overrides#
OpenAIChatModel.supportsParallelToolCalls
supportsToolStreaming#
protected supportsToolStreaming: boolean = false
Overrides#
OpenAIChatModel.supportsToolStreaming
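The protected properties above follow a simple pattern: a subclass of OpenAIChatModel declares which capabilities the target backend supports, and the framework consults those flags. The sketch below illustrates that pattern only; `BaseChatModel` and `GeminiLikeModel` are made-up stand-ins, not the real OpenAIChatModel or GeminiChatModel.

```typescript
// Stand-in base class illustrating the capability-flag pattern; the real
// OpenAIChatModel lives in the AIGNE Framework.
class BaseChatModel {
  protected apiKeyEnvName = "OPENAI_API_KEY";
  protected supportsParallelToolCalls = true;
  protected supportsToolStreaming = true;

  // Summarize the flags this model advertises.
  describe(): string {
    return `${this.apiKeyEnvName}: parallelTools=${this.supportsParallelToolCalls}, toolStreaming=${this.supportsToolStreaming}`;
  }
}

// A Gemini-style subclass overrides the flags, mirroring the overrides
// documented above (GEMINI_API_KEY, parallel tool calls off, tool
// streaming off).
class GeminiLikeModel extends BaseChatModel {
  protected override apiKeyEnvName = "GEMINI_API_KEY";
  protected override supportsParallelToolCalls = false;
  protected override supportsToolStreaming = false;
}
```

This keeps backend differences declarative: the shared request-building logic stays in the base class, and each provider subclass only states what its API can do.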