Compare commits

...

19 Commits

Author SHA1 Message Date
Dafei Zhao
58bb3ab6f6 fix: fix channel_id column name (#681, close #688) 2023-11-10 21:50:52 +08:00
qingfengfenga
d306cb5229 feat: add improve docker-compose.yml and support fast startup (#685)
Co-authored-by: 王彦朋 Penn Wang <penn.wang@digitwin.com.cn>
2023-11-10 21:40:00 +08:00
Yuhang
6c5307d0c4 docs: add deploy to zeabur button (#693)
* Update README.md

* Update README.en.md

* Update README.ja.md
2023-11-10 21:20:59 +08:00
Baksi
7c4505bdfc fix: numeric sorting in tables (#695)
* Update sorting method for id

* Update sorting method for id (token)

* Update sorting method for id (redemptions)

* Update sorting method for id (channel)

* chore: use same logic for all tables

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
2023-11-10 21:20:05 +08:00
Mikey
9d43ec57d8 feat: sync pricing for new 1106 models (#696)
* feat: sync pricing for new 1106 models

* chore: change ratio after 2023-12-11

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
2023-11-10 21:08:23 +08:00
JustSong
e5311892d1 docs: update readme 2023-11-08 23:17:12 +08:00
wzxjohn
bc7c9105f4 chore: update quota calc logic (close #599) (#627)
* fix: change quota calc code (close #599)

Use float64 during the calculation and apply math.Ceil at the end. The quota charged will be slightly higher than the official standard, but it is guaranteed never to be lower.

* chore: remove blank line

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
2023-11-05 19:15:06 +08:00
wood chen
3fe76c8af7 fix: fix Cloudflare AI Gateway channel test support (#639)
* Support testing OpenAI channels when using Cloudflare AI Gateway

* refactor: change logic

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
2023-11-05 19:08:25 +08:00
papersnake
c70c614018 feat: support chatglm_turbo (#648)
* feat: support chatglm_turbo

* fix: remove characterglm
2023-11-05 17:59:38 +08:00
Baksi
0d87de697c fix: fix typo (#651) 2023-11-02 22:24:22 +08:00
MaricoHan
aec343dc38 feat: support xunfei v3 (#637) 2023-10-29 22:03:01 +08:00
JustSong
89d458b9cf feat: able to set RELAY_TIMEOUT 2023-10-22 20:39:49 +08:00
JustSong
63fafba112 feat: support ERNIE-Bot-4 (close #608) 2023-10-22 18:48:35 +08:00
Bryan
a398f35968 fix: fix postgresql support (#606)
* fix postgresql support

fixes #517

* fix: fix pg support

* chore: delete useless code

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
2023-10-22 18:38:29 +08:00
yiGmMk
57aa637c77 fix: set Accept header if not given (#615)
* fix: FastGPT chat requests to Tongyi Qianwen (Qwen) failing

* refactor: Dockerfile

* Revert "refactor: Dockerfile"

This reverts commit a538c4f28e.

* chore: update implementation

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
2023-10-22 17:56:20 +08:00
vc
3b483639a4 feat: add cloudflare ai gateway support for image & audio (#607)
* Update channel-test.go

* Update relay-audio.go

* Update relay-image.go

* chore: using a util function

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
2023-10-22 17:50:52 +08:00
subnew
22980b4c44 docs: add description for TIKTOKEN_CACHE_DIR (#612)
* Update README.md

* Update README.md
2023-10-22 17:31:27 +08:00
Pluto
64cdb7eafb fix: docker compose healthcheck failed (#593) 2023-10-14 21:55:16 -05:00
JustSong
824444244b feat: able to delete all disabled channels 2023-10-14 17:25:48 +08:00
30 changed files with 247 additions and 79 deletions

3
.gitignore vendored
View File

@@ -5,4 +5,5 @@ upload
*.db
build
*.db-journal
logs
+data

View File

@@ -189,6 +189,8 @@ If you encounter a blank page after deployment, refer to [#97](https://github.co
> Zeabur's servers are located overseas, automatically solving network issues, and the free quota is sufficient for personal usage.
+[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/templates/7Q0KO3)
1. First, fork the code.
2. Go to [Zeabur](https://zeabur.com?referralCode=songquanpeng), log in, and enter the console.
3. Create a new project. In Service -> Add Service, select Marketplace, and choose MySQL. Note down the connection parameters (username, password, address, and port).

View File

@@ -190,6 +190,8 @@ Please refer to the [environment variables](#environment-variables) section for
> Zeabur's servers are located overseas, so network issues are resolved automatically.
+[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/templates/7Q0KO3)
1. First, fork the code.
2. Go to [Zeabur](https://zeabur.com?referralCode=songquanpeng), log in, and enter the console.
3. Create a new project. In Service -> Add Service, select Marketplace and choose MySQL. Note down the connection parameters (username, password, address, and port).

View File

@@ -75,7 +75,7 @@ _✨ Access all large models through the standard OpenAI API format, out of the box
+ [x] [Tencent Hunyuan](https://cloud.tencent.com/document/product/1729)
2. Supports configuring mirror sites and many third-party proxy services:
+ [x] [OpenAI-SB](https://openai-sb.com)
-+ [x] [CloseAI](https://console.closeai-asia.com/r/2412)
++ [x] [CloseAI](https://referer.shadowai.xyz/r/2412)
+ [x] [API2D](https://api2d.com/r/197971)
+ [x] [OhMyGPT](https://aigptx.top?aff=uFpUl2Kf)
+ [x] [AI Proxy](https://aiproxy.io/?i=OneAPI) (invitation code: `OneAPI`
@@ -160,6 +160,19 @@ sudo service nginx restart
The initial account username is `root` and the password is `123456`
+### Deployment with Docker Compose
+> Only the startup method differs; the parameters are the same, see the Docker deployment section above
+```shell
+# Currently starts with MySQL; data is stored in the ./data/mysql folder
+docker-compose up -d
+# Check the deployment status
+docker-compose ps
+```
### Manual Deployment
1. Download the executable from [GitHub Releases](https://github.com/songquanpeng/one-api/releases/latest) or build it from source:
```shell
@@ -249,6 +262,8 @@ docker run --name chatgpt-web -d -p 3002:3002 -e OPENAI_API_BASE_URL=https://ope
> Zeabur's servers are located overseas, which automatically avoids network issues, and the free quota is sufficient for personal use
+[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/templates/7Q0KO3)
1. First, fork the code.
2. Go to [Zeabur](https://zeabur.com?referralCode=songquanpeng), log in, and enter the console.
3. Create a new project. In Service -> Add Service, select Marketplace and choose MySQL. Note down the connection parameters (username, password, address, and port).
@@ -352,6 +367,10 @@ graph LR
13. Request rate limiting:
    + `GLOBAL_API_RATE_LIMIT`: global API rate limit (excluding relay requests), the maximum number of requests per IP within three minutes, default `180`.
    + `GLOBAL_WEB_RATE_LIMIT`: global web rate limit, the maximum number of requests per IP within three minutes, default `60`.
+14. Tokenizer cache settings:
+    + `TIKTOKEN_CACHE_DIR`: by default the program downloads the token encodings of some common models (e.g. `gpt-3.5-turbo`) over the network at startup; in unstable or offline network environments this can break startup. Set this directory to cache the data, and the cache can be moved to an offline environment.
+    + `DATA_GYM_CACHE_DIR`: currently has the same effect as `TIKTOKEN_CACHE_DIR`, but with lower priority.
+15. `RELAY_TIMEOUT`: relay request timeout in seconds; no timeout is set by default.
### Command-Line Arguments
1. `--port <port_number>`: the port number the server listens on, default `3000`.

View File

@@ -21,12 +21,9 @@ var QuotaPerUnit = 500 * 1000.0 // $0.002 / 1K tokens
var DisplayInCurrencyEnabled = true
var DisplayTokenStatEnabled = true
-var UsingSQLite = false
// Any options with "Secret", "Token" in its key won't be return by GetOptions
var SessionSecret = uuid.New().String()
-var SQLitePath = "one-api.db"
var OptionMap map[string]string
var OptionMapRWMutex sync.RWMutex
@@ -98,6 +95,8 @@ var SyncFrequency = GetOrDefault("SYNC_FREQUENCY", 10*60) // unit is second
var BatchUpdateEnabled = false
var BatchUpdateInterval = GetOrDefault("BATCH_UPDATE_INTERVAL", 5)
+var RelayTimeout = GetOrDefault("RELAY_TIMEOUT", 0) // unit is second
const (
RequestIdKey = "X-Oneapi-Request-Id"
)

6
common/database.go Normal file
View File

@@ -0,0 +1,6 @@
+package common
+var UsingSQLite = false
+var UsingPostgreSQL = false
+var SQLitePath = "one-api.db"

View File

@@ -3,6 +3,7 @@ package common
import (
"encoding/json"
"strings"
+"time"
)
// ModelRatio // ModelRatio
@@ -19,12 +20,15 @@ var ModelRatio = map[string]float64{
"gpt-4-32k": 30, "gpt-4-32k": 30,
"gpt-4-32k-0314": 30, "gpt-4-32k-0314": 30,
"gpt-4-32k-0613": 30, "gpt-4-32k-0613": 30,
"gpt-4-1106-preview": 5, // $0.01 / 1K tokens
"gpt-4-vision-preview": 5, // $0.01 / 1K tokens
"gpt-3.5-turbo": 0.75, // $0.0015 / 1K tokens "gpt-3.5-turbo": 0.75, // $0.0015 / 1K tokens
"gpt-3.5-turbo-0301": 0.75, "gpt-3.5-turbo-0301": 0.75,
"gpt-3.5-turbo-0613": 0.75, "gpt-3.5-turbo-0613": 0.75,
"gpt-3.5-turbo-16k": 1.5, // $0.003 / 1K tokens "gpt-3.5-turbo-16k": 1.5, // $0.003 / 1K tokens
"gpt-3.5-turbo-16k-0613": 1.5, "gpt-3.5-turbo-16k-0613": 1.5,
"gpt-3.5-turbo-instruct": 0.75, // $0.0015 / 1K tokens "gpt-3.5-turbo-instruct": 0.75, // $0.0015 / 1K tokens
"gpt-3.5-turbo-1106": 0.5, // $0.001 / 1K tokens
"text-ada-001": 0.2, "text-ada-001": 0.2,
"text-babbage-001": 0.25, "text-babbage-001": 0.25,
"text-curie-001": 1, "text-curie-001": 1,
@@ -46,8 +50,10 @@ var ModelRatio = map[string]float64{
"claude-2": 5.51, // $11.02 / 1M tokens "claude-2": 5.51, // $11.02 / 1M tokens
"ERNIE-Bot": 0.8572, // ¥0.012 / 1k tokens "ERNIE-Bot": 0.8572, // ¥0.012 / 1k tokens
"ERNIE-Bot-turbo": 0.5715, // ¥0.008 / 1k tokens "ERNIE-Bot-turbo": 0.5715, // ¥0.008 / 1k tokens
"ERNIE-Bot-4": 8.572, // ¥0.12 / 1k tokens
"Embedding-V1": 0.1429, // ¥0.002 / 1k tokens "Embedding-V1": 0.1429, // ¥0.002 / 1k tokens
"PaLM-2": 1, "PaLM-2": 1,
"chatglm_turbo": 0.3572, // ¥0.005 / 1k tokens
"chatglm_pro": 0.7143, // ¥0.01 / 1k tokens "chatglm_pro": 0.7143, // ¥0.01 / 1k tokens
"chatglm_std": 0.3572, // ¥0.005 / 1k tokens "chatglm_std": 0.3572, // ¥0.005 / 1k tokens
"chatglm_lite": 0.1429, // ¥0.002 / 1k tokens "chatglm_lite": 0.1429, // ¥0.002 / 1k tokens
@@ -86,9 +92,24 @@ func GetModelRatio(name string) float64 {
func GetCompletionRatio(name string) float64 {
if strings.HasPrefix(name, "gpt-3.5") {
+if strings.HasSuffix(name, "1106") {
+return 2
+}
+if name == "gpt-3.5-turbo" || name == "gpt-3.5-turbo-16k" {
+// TODO: clear this after 2023-12-11
+now := time.Now()
+// https://platform.openai.com/docs/models/continuous-model-upgrades
+// if after 2023-12-11, use 2
+if now.After(time.Date(2023, 12, 11, 0, 0, 0, 0, time.UTC)) {
+return 2
+}
+}
return 1.333333
}
if strings.HasPrefix(name, "gpt-4") {
+if strings.HasSuffix(name, "preview") {
+return 3
+}
return 2
}
if strings.HasPrefix(name, "claude-instant-1") {

View File

@@ -199,3 +199,11 @@ func GetOrDefault(env string, defaultValue int) int {
func MessageWithRequestId(message string, id string) string {
return fmt.Sprintf("%s (request id: %s)", message, id)
}
+func String2Int(str string) int {
+num, err := strconv.Atoi(str)
+if err != nil {
+return 0
+}
+return num
+}

View File

@@ -10,6 +10,7 @@ import (
"one-api/common" "one-api/common"
"one-api/model" "one-api/model"
"strconv" "strconv"
"strings"
"sync" "sync"
"time" "time"
) )
@@ -49,6 +50,8 @@ func testChannel(channel *model.Channel, request ChatRequest) (err error, openai
}
requestURL += "/v1/chat/completions"
}
+// for Cloudflare AI gateway: https://github.com/songquanpeng/one-api/pull/639
+requestURL = strings.Replace(requestURL, "/v1/v1", "/v1", 1)
jsonData, err := json.Marshal(request)
if err != nil {

View File

@@ -127,8 +127,8 @@ func DeleteChannel(c *gin.Context) {
return
}
-func DeleteManuallyDisabledChannel(c *gin.Context) {
+func DeleteDisabledChannel(c *gin.Context) {
-rows, err := model.DeleteChannelByStatus(common.ChannelStatusManuallyDisabled)
+rows, err := model.DeleteDisabledChannel()
if err != nil {
c.JSON(http.StatusOK, gin.H{
"success": false,

View File

@@ -117,6 +117,15 @@ func init() {
Root: "gpt-3.5-turbo-16k-0613", Root: "gpt-3.5-turbo-16k-0613",
Parent: nil, Parent: nil,
}, },
{
Id: "gpt-3.5-turbo-1106",
Object: "model",
Created: 1699593571,
OwnedBy: "openai",
Permission: permission,
Root: "gpt-3.5-turbo-1106",
Parent: nil,
},
{ {
Id: "gpt-3.5-turbo-instruct", Id: "gpt-3.5-turbo-instruct",
Object: "model", Object: "model",
@@ -180,6 +189,24 @@ func init() {
Root: "gpt-4-32k-0613", Root: "gpt-4-32k-0613",
Parent: nil, Parent: nil,
}, },
{
Id: "gpt-4-1106-preview",
Object: "model",
Created: 1699593571,
OwnedBy: "openai",
Permission: permission,
Root: "gpt-4-1106-preview",
Parent: nil,
},
{
Id: "gpt-4-vision-preview",
Object: "model",
Created: 1699593571,
OwnedBy: "openai",
Permission: permission,
Root: "gpt-4-vision-preview",
Parent: nil,
},
{ {
Id: "text-embedding-ada-002", Id: "text-embedding-ada-002",
Object: "model", Object: "model",
@@ -274,7 +301,7 @@ func init() {
Id: "claude-instant-1", Id: "claude-instant-1",
Object: "model", Object: "model",
Created: 1677649963, Created: 1677649963,
OwnedBy: "anturopic", OwnedBy: "anthropic",
Permission: permission, Permission: permission,
Root: "claude-instant-1", Root: "claude-instant-1",
Parent: nil, Parent: nil,
@@ -283,7 +310,7 @@ func init() {
Id: "claude-2", Id: "claude-2",
Object: "model", Object: "model",
Created: 1677649963, Created: 1677649963,
OwnedBy: "anturopic", OwnedBy: "anthropic",
Permission: permission, Permission: permission,
Root: "claude-2", Root: "claude-2",
Parent: nil, Parent: nil,
@@ -306,6 +333,15 @@ func init() {
Root: "ERNIE-Bot-turbo", Root: "ERNIE-Bot-turbo",
Parent: nil, Parent: nil,
}, },
{
Id: "ERNIE-Bot-4",
Object: "model",
Created: 1677649963,
OwnedBy: "baidu",
Permission: permission,
Root: "ERNIE-Bot-4",
Parent: nil,
},
{ {
Id: "Embedding-V1", Id: "Embedding-V1",
Object: "model", Object: "model",
@@ -324,6 +360,15 @@ func init() {
Root: "PaLM-2", Root: "PaLM-2",
Parent: nil, Parent: nil,
}, },
{
Id: "chatglm_turbo",
Object: "model",
Created: 1677649963,
OwnedBy: "zhipu",
Permission: permission,
Root: "chatglm_turbo",
Parent: nil,
},
{ {
Id: "chatglm_pro", Id: "chatglm_pro",
Object: "model", Object: "model",

View File

@@ -6,12 +6,11 @@ import (
"encoding/json" "encoding/json"
"errors" "errors"
"fmt" "fmt"
"github.com/gin-gonic/gin"
"io" "io"
"net/http" "net/http"
"one-api/common" "one-api/common"
"one-api/model" "one-api/model"
"github.com/gin-gonic/gin"
) )
func relayAudioHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode { func relayAudioHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode {
@@ -66,12 +65,11 @@ func relayAudioHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode
baseURL := common.ChannelBaseURLs[channelType]
requestURL := c.Request.URL.String()
if c.GetString("base_url") != "" {
baseURL = c.GetString("base_url")
}
-fullRequestURL := fmt.Sprintf("%s%s", baseURL, requestURL)
+fullRequestURL := getFullRequestURL(baseURL, requestURL, channelType)
requestBody := c.Request.Body
req, err := http.NewRequest(c.Request.Method, fullRequestURL, requestBody)

View File

@@ -6,12 +6,11 @@ import (
"encoding/json" "encoding/json"
"errors" "errors"
"fmt" "fmt"
"github.com/gin-gonic/gin"
"io" "io"
"net/http" "net/http"
"one-api/common" "one-api/common"
"one-api/model" "one-api/model"
"github.com/gin-gonic/gin"
) )
func relayImageHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode { func relayImageHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode {
@@ -61,16 +60,12 @@ func relayImageHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode
isModelMapped = true
}
}
baseURL := common.ChannelBaseURLs[channelType]
requestURL := c.Request.URL.String()
if c.GetString("base_url") != "" {
baseURL = c.GetString("base_url")
}
-fullRequestURL := fmt.Sprintf("%s%s", baseURL, requestURL)
+fullRequestURL := getFullRequestURL(baseURL, requestURL, channelType)
var requestBody io.Reader
if isModelMapped {
jsonStr, err := json.Marshal(imageRequest)

View File

@@ -6,13 +6,15 @@ import (
"encoding/json" "encoding/json"
"errors" "errors"
"fmt" "fmt"
"github.com/gin-gonic/gin"
"io" "io"
"math"
"net/http" "net/http"
"one-api/common" "one-api/common"
"one-api/model" "one-api/model"
"strings" "strings"
"time" "time"
"github.com/gin-gonic/gin"
) )
const ( const (
@@ -31,7 +33,14 @@ var httpClient *http.Client
var impatientHTTPClient *http.Client
func init() {
-httpClient = &http.Client{}
+if common.RelayTimeout == 0 {
+httpClient = &http.Client{}
+} else {
+httpClient = &http.Client{
+Timeout: time.Duration(common.RelayTimeout) * time.Second,
+}
+}
impatientHTTPClient = &http.Client{
Timeout: 5 * time.Second,
}
@@ -118,12 +127,7 @@ func relayTextHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode {
if c.GetString("base_url") != "" { if c.GetString("base_url") != "" {
baseURL = c.GetString("base_url") baseURL = c.GetString("base_url")
} }
fullRequestURL := fmt.Sprintf("%s%s", baseURL, requestURL) fullRequestURL := getFullRequestURL(baseURL, requestURL, channelType)
if channelType == common.ChannelTypeOpenAI {
if strings.HasPrefix(baseURL, "https://gateway.ai.cloudflare.com") {
fullRequestURL = fmt.Sprintf("%s%s", baseURL, strings.TrimPrefix(requestURL, "/v1"))
}
}
switch apiType { switch apiType {
case APITypeOpenAI: case APITypeOpenAI:
if channelType == common.ChannelTypeAzure { if channelType == common.ChannelTypeAzure {
@@ -156,6 +160,8 @@ func relayTextHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode {
fullRequestURL = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions" fullRequestURL = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions"
case "ERNIE-Bot-turbo": case "ERNIE-Bot-turbo":
fullRequestURL = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/eb-instant" fullRequestURL = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/eb-instant"
case "ERNIE-Bot-4":
fullRequestURL = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions_pro"
case "BLOOMZ-7B": case "BLOOMZ-7B":
fullRequestURL = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/bloomz_7b1" fullRequestURL = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/bloomz_7b1"
case "Embedding-V1": case "Embedding-V1":
@@ -366,6 +372,9 @@ func relayTextHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode {
}
req.Header.Set("Content-Type", c.Request.Header.Get("Content-Type"))
req.Header.Set("Accept", c.Request.Header.Get("Accept"))
+if isStream && c.Request.Header.Get("Accept") == "" {
+req.Header.Set("Accept", "text/event-stream")
+}
//req.Header.Set("Connection", c.Request.Header.Get("Connection"))
resp, err = httpClient.Do(req)
if err != nil {
@@ -406,9 +415,7 @@ func relayTextHelper(c *gin.Context, relayMode int) *OpenAIErrorWithStatusCode {
completionRatio := common.GetCompletionRatio(textRequest.Model)
promptTokens = textResponse.Usage.PromptTokens
completionTokens = textResponse.Usage.CompletionTokens
-quota = promptTokens + int(float64(completionTokens)*completionRatio)
-quota = int(float64(quota) * ratio)
+quota = int(math.Ceil((float64(promptTokens) + float64(completionTokens)*completionRatio) * ratio))
if ratio != 0 && quota <= 0 {
quota = 1
}
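To see why the single math.Ceil matters (#627), here is a minimal sketch comparing the old and new rounding with made-up token counts; the ratio values are illustrative only:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// Hypothetical numbers, chosen only to show the rounding difference.
	promptTokens, completionTokens := 7, 5
	completionRatio := 1.333333 // e.g. GetCompletionRatio("gpt-3.5-turbo")
	ratio := 0.75               // e.g. ModelRatio["gpt-3.5-turbo"]

	// Old: two integer truncations, which could undercharge.
	oldQuota := promptTokens + int(float64(completionTokens)*completionRatio)
	oldQuota = int(float64(oldQuota) * ratio)

	// New: stay in float64 and round up once at the end.
	newQuota := int(math.Ceil((float64(promptTokens) + float64(completionTokens)*completionRatio) * ratio))

	fmt.Println(oldQuota, newQuota) // 9 11 — the new value is never below the exact cost (~10.25)
}
```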

View File

@@ -176,3 +176,13 @@ func relayErrorHandler(resp *http.Response) (openAIErrorWithStatusCode *OpenAIEr
openAIErrorWithStatusCode.OpenAIError = textResponse.Error
return
}
+func getFullRequestURL(baseURL string, requestURL string, channelType int) string {
+fullRequestURL := fmt.Sprintf("%s%s", baseURL, requestURL)
+if channelType == common.ChannelTypeOpenAI {
+if strings.HasPrefix(baseURL, "https://gateway.ai.cloudflare.com") {
+fullRequestURL = fmt.Sprintf("%s%s", baseURL, strings.TrimPrefix(requestURL, "/v1"))
+}
+}
+return fullRequestURL
+}
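A small, self-contained illustration of what this helper does for a Cloudflare AI Gateway channel; the account/gateway segments below are placeholders, not real values:

```go
package main

import (
	"fmt"
	"strings"
)

// fullURL mirrors getFullRequestURL for the OpenAI channel type: a Cloudflare
// AI Gateway base URL already ends in ".../openai", so the leading "/v1" of the
// incoming request path is dropped to avoid a duplicated segment.
func fullURL(baseURL, requestURL string) string {
	if strings.HasPrefix(baseURL, "https://gateway.ai.cloudflare.com") {
		return baseURL + strings.TrimPrefix(requestURL, "/v1")
	}
	return baseURL + requestURL
}

func main() {
	base := "https://gateway.ai.cloudflare.com/v1/ACCOUNT_TAG/GATEWAY/openai" // placeholder channel base_url
	fmt.Println(fullURL(base, "/v1/chat/completions"))
	// https://gateway.ai.cloudflare.com/v1/ACCOUNT_TAG/GATEWAY/openai/chat/completions
}
```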

View File

@@ -298,8 +298,8 @@ func getXunfeiAuthUrl(c *gin.Context, apiKey string, apiSecret string) (string,
common.SysLog("api_version not found, use default: " + apiVersion) common.SysLog("api_version not found, use default: " + apiVersion)
} }
domain := "general" domain := "general"
if apiVersion == "v2.1" { if apiVersion != "v1.1" {
domain = "generalv2" domain += strings.Split(apiVersion, ".")[0]
} }
authUrl := buildXunfeiAuthUrl(fmt.Sprintf("wss://spark-api.xf-yun.com/%s/chat", apiVersion), apiKey, apiSecret) authUrl := buildXunfeiAuthUrl(fmt.Sprintf("wss://spark-api.xf-yun.com/%s/chat", apiVersion), apiKey, apiSecret)
return domain, authUrl return domain, authUrl
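The new branch generalizes the old hard-coded v2.1 mapping: any api_version other than v1.1 appends its major version to the domain. A minimal sketch of just that mapping:

```go
package main

import (
	"fmt"
	"strings"
)

// xunfeiDomain reproduces the domain selection above: v1.1 keeps "general",
// v2.1 becomes "generalv2", v3.1 becomes "generalv3", and so on.
func xunfeiDomain(apiVersion string) string {
	domain := "general"
	if apiVersion != "v1.1" {
		domain += strings.Split(apiVersion, ".")[0]
	}
	return domain
}

func main() {
	for _, v := range []string{"v1.1", "v2.1", "v3.1"} {
		fmt.Println(v, "->", xunfeiDomain(v))
	}
	// v1.1 -> general, v2.1 -> generalv2, v3.1 -> generalv3
}
```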

View File

@@ -9,21 +9,21 @@ services:
ports:
- "3000:3000"
volumes:
-- ./data:/data
+- ./data/oneapi:/data
- ./logs:/app/logs
environment:
-- SQL_DSN=root:123456@tcp(host.docker.internal:3306)/one-api # modify this line, or comment it out to use SQLite as the database
+- SQL_DSN=oneapi:123456@tcp(db:3306)/one-api # modify this line, or comment it out to use SQLite as the database
- REDIS_CONN_STRING=redis://redis
- SESSION_SECRET=random_string # change to a random string
- TZ=Asia/Shanghai
# - NODE_TYPE=slave # uncomment on slave nodes in multi-node deployments
# - SYNC_FREQUENCY=60 # uncomment if data needs to be reloaded from the database periodically
# - FRONTEND_BASE_URL=https://openai.justsong.cn # uncomment on slave nodes in multi-node deployments
depends_on:
- redis
+- db
healthcheck:
-test: [ "CMD-SHELL", "curl -s http://localhost:3000/api/status | grep -o '\"success\":\\s*true' | awk '{print $2}' | grep 'true'" ]
+test: [ "CMD-SHELL", "wget -q -O - http://localhost:3000/api/status | grep -o '\"success\":\\s*true' | awk -F: '{print $2}'" ]
interval: 30s
timeout: 10s
retries: 3
@@ -32,3 +32,18 @@ services:
image: redis:latest
container_name: redis
restart: always
+db:
+image: mysql:8.2.0
+restart: always
+container_name: mysql
+volumes:
+- ./data/mysql:/var/lib/mysql # mounted directory for persistent storage
+ports:
+- '3306:3306'
+environment:
+TZ: Asia/Shanghai # set the timezone
+MYSQL_ROOT_PASSWORD: 'OneAPI@justsong' # root user password
+MYSQL_USER: oneapi # create a dedicated user
+MYSQL_PASSWORD: '123456' # dedicated user's password
+MYSQL_DATABASE: one-api # automatically create the database

View File

@@ -15,10 +15,17 @@ type Ability struct {
func GetRandomSatisfiedChannel(group string, model string) (*Channel, error) {
ability := Ability{}
+groupCol := "`group`"
+trueVal := "1"
+if common.UsingPostgreSQL {
+groupCol = `"group"`
+trueVal = "true"
+}
var err error = nil
-maxPrioritySubQuery := DB.Model(&Ability{}).Select("MAX(priority)").Where("`group` = ? and model = ? and enabled = 1", group, model)
+maxPrioritySubQuery := DB.Model(&Ability{}).Select("MAX(priority)").Where(groupCol+" = ? and model = ? and enabled = "+trueVal, group, model)
-channelQuery := DB.Where("`group` = ? and model = ? and enabled = 1 and priority = (?)", group, model, maxPrioritySubQuery)
+channelQuery := DB.Where(groupCol+" = ? and model = ? and enabled = "+trueVal+" and priority = (?)", group, model, maxPrioritySubQuery)
-if common.UsingSQLite {
+if common.UsingSQLite || common.UsingPostgreSQL {
err = channelQuery.Order("RANDOM()").First(&ability).Error
} else {
err = channelQuery.Order("RAND()").First(&ability).Error

View File

@@ -21,14 +21,18 @@ var (
)
func CacheGetTokenByKey(key string) (*Token, error) {
+keyCol := "`key`"
+if common.UsingPostgreSQL {
+keyCol = `"key"`
+}
var token Token
if !common.RedisEnabled {
-err := DB.Where("`key` = ?", key).First(&token).Error
+err := DB.Where(keyCol+" = ?", key).First(&token).Error
return &token, err
}
tokenObjectString, err := common.RedisGet(fmt.Sprintf("token:%s", key))
if err != nil {
-err := DB.Where("`key` = ?", key).First(&token).Error
+err := DB.Where(keyCol+" = ?", key).First(&token).Error
if err != nil {
return nil, err
}

View File

@@ -38,7 +38,11 @@ func GetAllChannels(startIdx int, num int, selectAll bool) ([]*Channel, error) {
}
func SearchChannels(keyword string) (channels []*Channel, err error) {
-err = DB.Omit("key").Where("id = ? or name LIKE ? or `key` = ?", keyword, keyword+"%", keyword).Find(&channels).Error
+keyCol := "`key`"
+if common.UsingPostgreSQL {
+keyCol = `"key"`
+}
+err = DB.Omit("key").Where("id = ? or name LIKE ? or "+keyCol+" = ?", common.String2Int(keyword), keyword+"%", keyword).Find(&channels).Error
return channels, err
}
@@ -53,17 +57,6 @@ func GetChannelById(id int, selectAll bool) (*Channel, error) {
return &channel, err
}
-func GetRandomChannel() (*Channel, error) {
-channel := Channel{}
-var err error = nil
-if common.UsingSQLite {
-err = DB.Where("status = ? and `group` = ?", common.ChannelStatusEnabled, "default").Order("RANDOM()").Limit(1).First(&channel).Error
-} else {
-err = DB.Where("status = ? and `group` = ?", common.ChannelStatusEnabled, "default").Order("RAND()").Limit(1).First(&channel).Error
-}
-return &channel, err
-}
func BatchInsertChannels(channels []Channel) error {
var err error
err = DB.Create(&channels).Error
@@ -181,3 +174,8 @@ func DeleteChannelByStatus(status int64) (int64, error) {
result := DB.Where("status = ?", status).Delete(&Channel{}) result := DB.Where("status = ?", status).Delete(&Channel{})
return result.RowsAffected, result.Error return result.RowsAffected, result.Error
} }
func DeleteDisabledChannel() (int64, error) {
result := DB.Where("status = ? or status = ?", common.ChannelStatusAutoDisabled, common.ChannelStatusManuallyDisabled).Delete(&Channel{})
return result.RowsAffected, result.Error
}

View File

@@ -94,7 +94,7 @@ func GetAllLogs(logType int, startTimestamp int64, endTimestamp int64, modelName
tx = tx.Where("created_at <= ?", endTimestamp) tx = tx.Where("created_at <= ?", endTimestamp)
} }
if channel != 0 { if channel != 0 {
tx = tx.Where("channel = ?", channel) tx = tx.Where("channel_id = ?", channel)
} }
err = tx.Order("id desc").Limit(num).Offset(startIdx).Find(&logs).Error err = tx.Order("id desc").Limit(num).Offset(startIdx).Find(&logs).Error
return logs, err return logs, err
@@ -151,7 +151,7 @@ func SumUsedQuota(logType int, startTimestamp int64, endTimestamp int64, modelNa
tx = tx.Where("model_name = ?", modelName) tx = tx.Where("model_name = ?", modelName)
} }
if channel != 0 { if channel != 0 {
tx = tx.Where("channel = ?", channel) tx = tx.Where("channel_id = ?", channel)
} }
tx.Where("type = ?", LogTypeConsume).Scan(&quota) tx.Where("type = ?", LogTypeConsume).Scan(&quota)
return quota return quota

View File

@@ -42,6 +42,7 @@ func chooseDB() (*gorm.DB, error) {
if strings.HasPrefix(dsn, "postgres://") {
// Use PostgreSQL
common.SysLog("using PostgreSQL as database")
+common.UsingPostgreSQL = true
return gorm.Open(postgres.New(postgres.Config{
DSN: dsn,
PreferSimpleProtocol: true, // disables implicit prepared statement usage

View File

@@ -50,8 +50,13 @@ func Redeem(key string, userId int) (quota int, err error) {
}
redemption := &Redemption{}
+keyCol := "`key`"
+if common.UsingPostgreSQL {
+keyCol = `"key"`
+}
err = DB.Transaction(func(tx *gorm.DB) error {
-err := tx.Set("gorm:query_option", "FOR UPDATE").Where("`key` = ?", key).First(redemption).Error
+err := tx.Set("gorm:query_option", "FOR UPDATE").Where(keyCol+" = ?", key).First(redemption).Error
if err != nil {
return errors.New("无效的兑换码")
}

View File

@@ -266,7 +266,12 @@ func GetUserEmail(id int) (email string, err error) {
}
func GetUserGroup(id int) (group string, err error) {
-err = DB.Model(&User{}).Where("id = ?", id).Select("`group`").Find(&group).Error
+groupCol := "`group`"
+if common.UsingPostgreSQL {
+groupCol = `"group"`
+}
+err = DB.Model(&User{}).Where("id = ?", id).Select(groupCol).Find(&group).Error
return group, err
}

View File

@@ -74,7 +74,7 @@ func SetApiRouter(router *gin.Engine) {
channelRoute.GET("/update_balance/:id", controller.UpdateChannelBalance) channelRoute.GET("/update_balance/:id", controller.UpdateChannelBalance)
channelRoute.POST("/", controller.AddChannel) channelRoute.POST("/", controller.AddChannel)
channelRoute.PUT("/", controller.UpdateChannel) channelRoute.PUT("/", controller.UpdateChannel)
channelRoute.DELETE("/manually_disabled", controller.DeleteManuallyDisabledChannel) channelRoute.DELETE("/disabled", controller.DeleteDisabledChannel)
channelRoute.DELETE("/:id", controller.DeleteChannel) channelRoute.DELETE("/:id", controller.DeleteChannel)
} }
tokenRoute := apiRouter.Group("/token") tokenRoute := apiRouter.Group("/token")

View File

@@ -240,11 +240,11 @@ const ChannelsTable = () => {
}
};
-const deleteAllManuallyDisabledChannels = async () => {
+const deleteAllDisabledChannels = async () => {
-const res = await API.delete(`/api/channel/manually_disabled`);
+const res = await API.delete(`/api/channel/disabled`);
const { success, message, data } = res.data;
if (success) {
-showSuccess(`已删除所有手动禁用渠道,共计 ${data}`);
+showSuccess(`已删除所有禁用渠道,共计 ${data}`);
await refresh();
} else {
showError(message);
@@ -286,17 +286,15 @@ const ChannelsTable = () => {
if (channels.length === 0) return;
setLoading(true);
let sortedChannels = [...channels];
-if (typeof sortedChannels[0][key] === 'string') {
-sortedChannels.sort((a, b) => {
-return ('' + a[key]).localeCompare(b[key]);
-});
-} else {
-sortedChannels.sort((a, b) => {
-if (a[key] === b[key]) return 0;
-if (a[key] > b[key]) return -1;
-if (a[key] < b[key]) return 1;
-});
-}
+sortedChannels.sort((a, b) => {
+if (!isNaN(a[key])) {
+// If the value is numeric, subtract to sort
+return a[key] - b[key];
+} else {
+// If the value is not numeric, sort as strings
+return ('' + a[key]).localeCompare(b[key]);
+}
+});
if (sortedChannels[0].id === channels[0].id) {
sortedChannels.reverse();
}
@@ -304,6 +302,7 @@ const ChannelsTable = () => {
setLoading(false);
};
return (
<>
<Form onSubmit={searchChannels}>
@@ -531,14 +530,14 @@ const ChannelsTable = () => {
<Popup
trigger={
<Button size='small' loading={loading}>
-删除所有手动禁用渠道
+删除禁用渠道
</Button>
}
on='click'
flowing
hoverable
>
-<Button size='small' loading={loading} negative onClick={deleteAllManuallyDisabledChannels}>
+<Button size='small' loading={loading} negative onClick={deleteAllDisabledChannels}>
确认删除
</Button>
</Popup>

View File

@@ -130,7 +130,13 @@ const RedemptionsTable = () => {
setLoading(true);
let sortedRedemptions = [...redemptions];
sortedRedemptions.sort((a, b) => {
-return ('' + a[key]).localeCompare(b[key]);
+if (!isNaN(a[key])) {
+// If the value is numeric, subtract to sort
+return a[key] - b[key];
+} else {
+// If the value is not numeric, sort as strings
+return ('' + a[key]).localeCompare(b[key]);
+}
});
if (sortedRedemptions[0].id === redemptions[0].id) {
sortedRedemptions.reverse();

View File

@@ -228,7 +228,13 @@ const TokensTable = () => {
setLoading(true);
let sortedTokens = [...tokens];
sortedTokens.sort((a, b) => {
-return ('' + a[key]).localeCompare(b[key]);
+if (!isNaN(a[key])) {
+// If the value is numeric, subtract to sort
+return a[key] - b[key];
+} else {
+// If the value is not numeric, sort as strings
+return ('' + a[key]).localeCompare(b[key]);
+}
});
if (sortedTokens[0].id === tokens[0].id) {
sortedTokens.reverse();

View File

@@ -133,7 +133,13 @@ const UsersTable = () => {
setLoading(true);
let sortedUsers = [...users];
sortedUsers.sort((a, b) => {
-return ('' + a[key]).localeCompare(b[key]);
+if (!isNaN(a[key])) {
+// If the value is numeric, subtract to sort
+return a[key] - b[key];
+} else {
+// If the value is not numeric, sort as strings
+return ('' + a[key]).localeCompare(b[key]);
+}
});
if (sortedUsers[0].id === users[0].id) {
sortedUsers.reverse();

View File

@@ -66,13 +66,13 @@ const EditChannel = () => {
localModels = ['PaLM-2'];
break;
case 15:
-localModels = ['ERNIE-Bot', 'ERNIE-Bot-turbo', 'Embedding-V1'];
+localModels = ['ERNIE-Bot', 'ERNIE-Bot-turbo', 'ERNIE-Bot-4', 'Embedding-V1'];
break;
case 17:
localModels = ['qwen-turbo', 'qwen-plus', 'text-embedding-v1'];
break;
case 16:
-localModels = ['chatglm_pro', 'chatglm_std', 'chatglm_lite'];
+localModels = ['chatglm_turbo', 'chatglm_pro', 'chatglm_std', 'chatglm_lite'];
break;
case 18:
localModels = ['SparkDesk'];