@cherrystudio/ai-sdk-provider

CherryIN provider bundle for the Vercel AI SDK.
It exposes the CherryIN OpenAI-compatible entrypoints and dynamically routes Anthropic and Gemini model ids to their CherryIN upstream equivalents.

Installation

npm install ai @cherrystudio/ai-sdk-provider @ai-sdk/anthropic @ai-sdk/google @ai-sdk/openai
# or
yarn add ai @cherrystudio/ai-sdk-provider @ai-sdk/anthropic @ai-sdk/google @ai-sdk/openai

Note: This package requires the peer dependencies ai, @ai-sdk/anthropic, @ai-sdk/google, and @ai-sdk/openai to be installed.

Usage

import { generateText } from 'ai'
import { createCherryIn, cherryIn } from '@cherrystudio/ai-sdk-provider'

const cherryInProvider = createCherryIn({
  apiKey: process.env.CHERRYIN_API_KEY,
  // optional overrides:
  // baseURL: 'https://open.cherryin.net/v1',
  // anthropicBaseURL: 'https://open.cherryin.net/anthropic',
  // geminiBaseURL: 'https://open.cherryin.net/gemini/v1beta',
})

// Chat models will auto-route based on the model id prefix:
const openaiModel = cherryInProvider.chat('gpt-4o-mini')
const anthropicModel = cherryInProvider.chat('claude-3-5-sonnet-latest')
const geminiModel = cherryInProvider.chat('gemini-2.0-pro-exp')

// Generate text with the AI SDK's generateText helper:
const { text } = await generateText({
  model: openaiModel,
  prompt: 'Hello CherryIN!'
})
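
The package also exports a ready-made cherryIn instance (imported above). As a sketch, assuming it follows the common AI SDK provider convention of reading the key from the CHERRYIN_API_KEY environment variable, it can be used without calling createCherryIn:

// Assumption: the default instance reads CHERRYIN_API_KEY from the environment.
const defaultModel = cherryIn.chat('gpt-4o-mini')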

The provider also exposes completion, responses, embedding, image, transcription, and speech helpers aligned with the upstream APIs.
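
For example, the embedding helper can be paired with the AI SDK's embed function. The method name embedding and the model id below are assumptions for illustration, mirroring the chat helper above:

import { embed } from 'ai'

// Assumption: the provider exposes an embedding(modelId) helper; the model id is a placeholder.
const { embedding } = await embed({
  model: cherryInProvider.embedding('text-embedding-3-small'),
  value: 'Hello CherryIN!'
})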

See the AI SDK docs for configuring custom providers; one approach is sketched below.
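
For instance, the AI SDK's customProvider helper can wrap CherryIN-routed models behind app-level aliases. The alias names below are arbitrary examples:

import { customProvider } from 'ai'

const models = customProvider({
  languageModels: {
    // Arbitrary alias names mapped to CherryIN-routed models.
    fast: cherryInProvider.chat('gpt-4o-mini'),
    reasoning: cherryInProvider.chat('claude-3-5-sonnet-latest')
  }
})

const fastModel = models.languageModel('fast')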