Compare commits

..

169 Commits

Author SHA1 Message Date
CaIon
2841669246 feat: improve function call billing 2024-04-23 23:01:06 +08:00
CaIon
89ebd85503 feat: update shouldRetry 2024-04-23 22:17:36 +08:00
CaIon
1a39ef74ce feat: automatically normalize malformed claude prompts 2024-04-23 13:08:37 +08:00
CaIon
53e8790024 fix: claude max_tokens 2024-04-23 12:19:23 +08:00
CaIon
9294127686 feat: support aws claude 2024-04-23 11:44:40 +08:00
CaIon
6b97842f78 feat: support array input for ollama embedding 2024-04-22 21:09:11 +08:00
CaIon
bdc65bdba2 feat: enable function call billing 2024-04-22 16:35:56 +08:00
CaIon
76dc7af8d1 feat: update gemini model 2024-04-21 12:51:08 +08:00
CaIon
892b7d1ad4 feat: polish the login page 2024-04-20 21:05:38 +08:00
CaIon
6b71db7ce2 feat: status code rewriting 2024-04-20 21:05:23 +08:00
CaIon
b8fb351fd8 feat: log retry information when retrying 2024-04-20 17:18:14 +08:00
CaIon
e6765ef32d feat: update cache 2024-04-18 20:30:17 +08:00
CaIon
4ef98ba7eb feat: update cache 2024-04-18 20:26:38 +08:00
CaIon
65b85377c6 feat: update cache #204 2024-04-18 20:23:44 +08:00
CaIon
c6e85d5b57 feat: improve data dashboard #190 2024-04-18 19:37:52 +08:00
CaIon
1162683b4d feat: allow configuring whether to forward upstream mj image URLs 2024-04-18 18:02:09 +08:00
CaIon
818bd824da feat: hide sensitive information in the frontend 2024-04-18 17:52:18 +08:00
CaIon
6e54f01435 update makefile 2024-04-17 20:47:35 +08:00
CaIon
505916b755 update makefile 2024-04-17 20:47:13 +08:00
CaIon
a4defe6ada fix: test all channel error (close #206) 2024-04-17 15:18:36 +08:00
Calcium-Ion
9dfd405ba9 Merge pull request #208 from kahosan/refactor_dark_mode
fix: the dark mode does not work for the `OperationSetting` and `SystemSetting` panels
2024-04-17 15:13:24 +08:00
Calcium-Ion
6c5b94ceb0 Merge pull request #194 from iszcz/pr
feat: add channel copy feature
2024-04-17 15:12:15 +08:00
Calcium-Ion
ac2984315a Merge pull request #205 from MapleEve/main
fix: Gemini new model name error and Support both v1 and v1beta models
2024-04-17 15:10:49 +08:00
kahosan
848358d876 fix: the dark mode does not work for the OperationSetting and SystemSetting panels 2024-04-16 17:12:54 +08:00
kahosan
e9abe5b705 refactor: dark mode 2024-04-16 17:11:39 +08:00
Maple Gao
d7e117acf5 fix: Gemini 1.5 name error 2024-04-15 14:27:18 +08:00
Maple Gao
1456992aae add: new Gemini model default ratio 2024-04-15 14:25:44 +08:00
Maple Gao
3b6ea51033 fix: rename the latest Gemini model name 2024-04-15 14:22:40 +08:00
Maple Gao
21250a46a6 feat: support google v1beta and Gemini Ultra 2024-04-15 14:19:19 +08:00
iszcz
b31fadd74f Merge branch 'Calcium-Ion:main' into pr 2024-04-11 18:15:54 +08:00
Calcium-Ion
300947f400 Merge pull request #197 from xqx333/main
Update model-ratio.go
2024-04-11 14:15:33 +08:00
xqx333
bf94893f6a Update model-ratio.go
Fix incorrect completion ratio for gpt-4-1106-preview and gpt-4-0125-preview
2024-04-11 14:03:51 +08:00
iszcz
97af77b26c Merge branch 'Calcium-Ion:main' into pr 2024-04-11 05:40:52 +08:00
1808837298@qq.com
4ef2422b97 update model-ratio 2024-04-10 20:12:56 +08:00
1808837298@qq.com
f188147680 feat: support gpt-4-turbo 2024-04-10 20:10:54 +08:00
iszcz
08e10df887 Add channel copy 2024-04-10 03:17:16 +08:00
Calcium-Ion
0a49715c3d Merge pull request #183 from iszcz/patch-1
Remove --mode from mj prompts
2024-04-09 00:46:47 +08:00
Calcium-Ion
89efed48fc Merge pull request #185 from h1xy/main
Fix: CompletionRatio is not working for openrouter.ai
2024-04-08 23:57:37 +08:00
Calcium-Ion
97e0aae0a7 Merge pull request #188 from Calcium-Ion/fix/many-model-error
fix: adding many models to a channel at once fails
2024-04-08 23:56:45 +08:00
Xyfacai
320da09f36 fix: adding many models to a channel at once fails
Adding many models to a channel with many groups would fail
with "too many SQL variables"
2024-04-08 23:51:51 +08:00
CaIon
2d849e0dd6 fix: local retry on 307 2024-04-08 14:10:09 +08:00
CaIon
60d7ed3fb5 fix: distributor panic 2024-04-08 13:48:36 +08:00
h1xy
c5f6d0e063 Fix: CompletionRatio is not working for openrouter.ai
https://openrouter.ai/docs#models
Model name of openrouter is prefix with company name, e.g. "model": "anthropic/claude-3-opus:beta", therefore, CompletionRatio will not working for it which is only work for prefix with claude-xxx
2024-04-08 02:12:47 +08:00
CaIon
a7cfce24d0 feat: automatically ban channels that exceeded quota 2024-04-07 22:22:27 +08:00
CaIon
34bf8f8945 fix: select channel 2024-04-07 22:08:11 +08:00
CaIon
2d1d1b4631 update go-epay 2024-04-07 14:42:03 +08:00
iszcz
5961de03e7 Remove --mode 2024-04-06 23:08:50 +08:00
CaIon
fbdb17022c update README.md 2024-04-06 20:49:34 +08:00
CaIon
497cc32634 update README.md 2024-04-06 20:47:16 +08:00
CaIon
462c328d4b feat: support local retry without cache enabled 2024-04-06 20:45:18 +08:00
CaIon
257cfc2390 fix: email whitelist check 2024-04-06 17:50:47 +08:00
CaIon
fed1a1d6a3 feat: do not retry on timeout status codes 2024-04-04 21:21:44 +08:00
CaIon
fc9f8c8e8a fix: add group tag 'unknown' 2024-04-04 21:20:54 +08:00
CaIon
f3f36dafbd chore: reduce database queries for per-call billing 2024-04-04 20:10:30 +08:00
CaIon
aaf3a1f07b fix: GetRandomSatisfiedChannel 2024-04-04 19:37:33 +08:00
CaIon
c040fa229d fix bug 2024-04-04 19:18:00 +08:00
CaIon
1cd1e54be4 feat: wallet supports displaying quota in non-currency form 2024-04-04 18:21:23 +08:00
CaIon
3db64afc7f feat: wallet supports displaying quota in non-currency form 2024-04-04 18:20:38 +08:00
CaIon
bc9cfa5da0 feat: wallet supports displaying quota in non-currency form 2024-04-04 18:18:18 +08:00
CaIon
660b9b3c99 feat: able to set default test model (#138) 2024-04-04 17:29:25 +08:00
CaIon
cdf2087952 update README.md 2024-04-04 16:48:28 +08:00
CaIon
4b60528c5f feat: local retry 2024-04-04 16:35:44 +08:00
1808837298@qq.com
9025756b56 fix: email whitelist check 2024-04-04 12:33:11 +08:00
CaIon
2ea6009954 fix: user update error 2024-04-04 11:10:41 +08:00
CaIon
a33f685f3c fix: log page type error (close #154) 2024-04-03 23:57:49 +08:00
CaIon
3d0f77ffb6 Merge remote-tracking branch 'origin/main' 2024-04-03 23:51:32 +08:00
CaIon
5ce8e6dab6 fix: update user quote (close #161) 2024-04-03 23:51:25 +08:00
Calcium-Ion
5a5b7d618d Merge pull request #171 from QuentinHsu/perf-setting-tab-navigation
perf(Setting): setting tab navigation
2024-04-03 23:32:19 +08:00
Calcium-Ion
ad8ce915ec Merge pull request #175 from ye4293/test
Fix users registering with disposable email addresses for verification
2024-04-03 23:31:50 +08:00
Calcium-Ion
456fb875de Merge pull request #176 from QuentinHsu/perf-helpers-renderGroup
refactor(helpers): renderGroup function
2024-04-03 23:31:02 +08:00
QuentinHsu
3e90b6d516 refactor(helpers): renderGroup function 2024-04-02 13:16:02 +08:00
QuentinHsu
d6e373fbe4 fix(helpers): add key prop to Tag components 2024-04-02 10:58:44 +08:00
Ghostz
224746b45a Update misc.go 2024-04-02 01:13:12 +08:00
Calcium-Ion
ac827b1862 Merge pull request #174 from AI-ASS/main 2024-04-01 19:51:02 +08:00
GAI Group
658bf2ad57 Rename .prettierrc.mjs to .prettierrc.mjs 2024-04-01 19:49:56 +08:00
Calcium-Ion
c25f48b7c5 Merge pull request #172 from MapleEve/main
Support Claude TopK
2024-04-01 18:15:45 +08:00
QuentinHsu
290dcf7587 perf(Setting): add useEffect and useNavigate hooks to Setting component 2024-04-01 16:59:07 +08:00
Maple Gao
278fd39195 feat: add Claude TopK 2024-04-01 14:33:58 +08:00
QuentinHsu
aa23c51a53 perf(Setting): add tabActiveKey state to Setting component 2024-04-01 13:33:57 +08:00
Calcium-Ion
87919b032d Merge pull request #167 from weikecloud/main
Add detection of upstream MJ generation failures
2024-03-30 16:27:03 +08:00
Calcium-Ion
f7a4f18aff Update midjourney.go 2024-03-30 16:26:39 +08:00
余生一个白恩
706449dede Add detection of upstream generation failures 2024-03-30 13:21:05 +08:00
CaIon
36d164be0e fix: SearchUsers (close #160) 2024-03-29 22:49:08 +08:00
CaIon
d80a7d3c97 Merge remote-tracking branch 'origin/main' 2024-03-29 22:28:10 +08:00
CaIon
44a8ade4ba fix: remove sensitive check on completion (close #157) 2024-03-29 22:20:14 +08:00
Xyfacai
2cca2a989e Merge pull request #165 from xyfacai/fork/mj-mode-path
fix: support the /mj-{mode} path
2024-03-29 17:45:23 +08:00
Xiangyuan Liu
3065bf92ae fix: support the /mj-{mode} path 2024-03-29 17:45:00 +08:00
Xiangyuan Liu
2e595bdafb fix: support the /mj-{mode} path 2024-03-29 16:58:19 +08:00
Xiangyuan Liu
49df4b6eed feat: support the /mj-{mode} path 2024-03-29 16:48:50 +08:00
CaIon
5c39f54040 feat: able to set smtp ssl 2024-03-28 12:18:11 +08:00
CaIon
786ccc7da0 feat: default SYNC_FREQUENCY to 60 when redis is enabled 2024-03-26 23:00:04 +08:00
CaIon
8eedad9470 feat: support ollama embedding 2024-03-26 19:53:53 +08:00
CaIon
319e97d677 fix: ollama channel test 2024-03-26 19:27:11 +08:00
CaIon
6114c9bb96 fix: CountTokenInput 2024-03-26 18:49:53 +08:00
CaIon
3cf2f0d5cb fix: CountTokenInput 2024-03-26 18:21:38 +08:00
CaIon
2a345ae070 ci: update ci 2024-03-25 22:55:33 +08:00
CaIon
d8c91fa448 feat: further prevent exposing upstream and database addresses 2024-03-25 22:54:15 +08:00
CaIon
cc8cc8b386 fix: try to fix 307 2024-03-25 22:51:31 +08:00
CaIon
1587ea565b feat: support gemini-1.5 2024-03-25 22:33:46 +08:00
CaIon
a7a1fc615d feat: remove azure model TrimPrefix 2024-03-25 22:33:33 +08:00
CaIon
b2a280c1ec fix: "unable to copy" dialog too small 2024-03-25 16:49:53 +08:00
CaIon
f1fb7b32a3 chore: update model ratio 2024-03-25 16:17:35 +08:00
CaIon
3800dc219e fix: Cannot read properties of undefined (reading 'map') (close #148) 2024-03-25 14:11:28 +08:00
CaIon
72962e988f Merge remote-tracking branch 'origin/main' 2024-03-25 13:52:37 +08:00
Calcium-Ion
01e3acfada Merge pull request #145 from QuentinHsu/fix-dev-error
fix(global): error in console under dev mode
2024-03-25 13:52:22 +08:00
QuentinHsu
f671176da0 fix(global): error in console under dev mode 2024-03-24 18:50:21 +08:00
CaIon
2d36dee17c fix: network errors in stream mode causing empty completions 2024-03-23 23:52:04 +08:00
CaIon
6eb30ec3e6 fix: model ratio and price could not be set 2024-03-23 23:24:17 +08:00
CaIon
0b3520e3c8 fix: default model ratio not displayed 2024-03-23 23:21:14 +08:00
CaIon
63304a5b2d Merge remote-tracking branch 'origin/main' 2024-03-23 21:49:13 +08:00
CaIon
66e30f4115 fix: ci 2024-03-23 21:47:51 +08:00
Calcium-Ion
0618f03c68 Merge pull request #141 from Calcium-Ion/vite-support
feat: vite
2024-03-23 21:42:56 +08:00
CaIon
962dc984f4 chore: lint fix 2024-03-23 21:24:39 +08:00
CaIon
15e7307320 feat: prettier 2024-03-23 21:23:39 +08:00
CaIon
951383c371 chore: delete useless file 2024-03-23 21:02:38 +08:00
CaIon
87b6210045 chore: delete useless dir 2024-03-23 21:00:23 +08:00
CaIon
525fc1b3b7 feat: load fonts locally (close #130) 2024-03-23 20:57:52 +08:00
CaIon
58f2cf3a79 feat: optimize homepage loading speed 2024-03-23 20:22:00 +08:00
CaIon
06c86397e1 chore: Chunking Strategy 2024-03-23 19:37:19 +08:00
CaIon
21f48b55e0 fix: embed 2024-03-23 19:27:18 +08:00
CaIon
f823b4d4d8 update Dockerfile 2024-03-23 19:18:28 +08:00
CaIon
93be61aaf3 feat: vite 2024-03-23 19:09:09 +08:00
Calcium-Ion
a500097b36 Merge pull request #137 from MapleEve/main
feat: support 01.AI
2024-03-23 17:38:53 +08:00
CaIon
67332bc8df fix: default price incorrectly used when the model's fixed price is empty 2024-03-23 17:19:29 +08:00
CaIon
d0acecb2ab fix: GLM-4V Vision compatibility issue (close #136) 2024-03-23 17:08:34 +08:00
Maple Gao
a825699e9a Merge branch 'Calcium-Ion:main' into main 2024-03-23 01:10:39 +08:00
CaIon
a70ca53449 fix: mj 2024-03-22 21:39:44 +08:00
CaIon
c33b1522cc fix: concurrent top-ups producing duplicate order numbers 2024-03-21 23:57:48 +08:00
CaIon
ff7da08bad fix: add missing created 2024-03-21 23:46:43 +08:00
CaIon
3e03c5a742 fix: add missing id,object,created 2024-03-21 23:44:39 +08:00
CaIon
d9344d79cf fix: try to fix curl: (18) 2024-03-21 23:25:07 +08:00
CaIon
c4b3d3a975 fix: fix embedding 2024-03-21 17:39:05 +08:00
CaIon
031957714a refactor: code structure cleanup 2024-03-21 17:19:21 +08:00
CaIon
3f808be254 fix: add missing version 2024-03-21 16:26:26 +08:00
CaIon
9b64f4a34a fix: fix mj panic 2024-03-21 15:04:04 +08:00
CaIon
222a55387d fix: fix SensitiveWords error 2024-03-21 14:29:56 +08:00
Maple Gao
492001a8b2 Merge branch 'Calcium-Ion:main' into main 2024-03-21 01:40:29 +08:00
CaIon
d7e25e1604 fix: fix SensitiveWords load error 2024-03-20 23:58:42 +08:00
Maple Gao
7d64f30f4d Add: 01AI in readme 2024-03-20 23:51:45 +08:00
Maple Gao
9e157ed802 fix empty url 2024-03-20 23:49:16 +08:00
Maple Gao
cfabf8a656 Add 01.AI relay 2024-03-20 23:44:03 +08:00
CaIon
c7658b70d1 fix: skip checking when the sensitive word list is empty 2024-03-20 22:33:22 +08:00
CaIon
d5e93e788d fix: midjourneys table 2024-03-20 21:49:54 +08:00
CaIon
dd71946047 fix: claude panic 2024-03-20 21:32:33 +08:00
CaIon
b736de7189 fix: claude panic 2024-03-20 21:28:45 +08:00
Calcium-Ion
5e7774cf08 Merge pull request #131 from Calcium-Ion/sensitive-words
feat: sensitive word filtering
2024-03-20 20:38:15 +08:00
CaIon
a232afe9fd feat: unify error messages 2024-03-20 20:36:55 +08:00
CaIon
eb6257a8d8 fix: fix SensitiveWordContains not working 2024-03-20 20:28:00 +08:00
CaIon
47b9f48544 fix: fix error 2024-03-20 20:26:34 +08:00
CaIon
2db4282666 feat: reserved feature 2024-03-20 20:15:32 +08:00
CaIon
64b9d3b58c feat: initial support for generated-content checking 2024-03-20 19:00:51 +08:00
CaIon
7a663d26ec feat: initial support for sensitive word filtering 2024-03-20 17:07:42 +08:00
CaIon
bec21ade9d Update README.md 2024-03-18 20:44:20 +08:00
CaIon
4c4e087060 fix: api type error 2024-03-18 19:55:19 +08:00
CaIon
beadb98a8c feat: support perplexity 2024-03-18 18:09:41 +08:00
CaIon
615d109d70 Merge remote-tracking branch 'origin/main' 2024-03-18 18:08:09 +08:00
Calcium-Ion
0da49fa446 Merge pull request #125 from wozulong/main
fix: openai_organization not working
2024-03-18 18:07:34 +08:00
CaIon
578b5f6536 feat: support perplexity 2024-03-18 18:00:19 +08:00
我秦始皇
f63ad9c03c fix: make the 'openai_organization' parameter actually work. 2024-03-18 02:19:59 -07:00
我秦始皇
0fb98e44a7 fix: make the 'openai_organization' parameter actually work. 2024-03-18 02:11:21 -07:00
CaIon
652bb4a53c Update README.md 2024-03-18 00:16:32 +08:00
CaIon
a26b9a9bff Update Midjourney.md 2024-03-18 00:13:37 +08:00
CaIon
f82aa956bd feat: add indexes to the midjourneys table 2024-03-17 22:56:17 +08:00
CaIon
b8c053c37f fix: chatglm top_p error (close #124) 2024-03-17 22:55:29 +08:00
CaIon
2581b37394 fix: sidebar focus issue (close #123) 2024-03-17 16:37:19 +08:00
CaIon
f65477d054 fix: log pagination issue (close #122) 2024-03-17 16:18:45 +08:00
CaIon
4b952b8582 feat: print midjourney-proxy request error 2024-03-16 17:25:10 +08:00
CaIon
a6ba1d01d9 feat: add a notice when the payment callback is not enabled 2024-03-15 22:15:16 +08:00
CaIon
2d2fec24d0 chore: update dependency 2024-03-15 18:35:15 +08:00
147 changed files with 7859 additions and 4241 deletions

View File

@@ -4,7 +4,6 @@ on:
   push:
     tags:
       - '*'
-      - '!*-alpha*'
   workflow_dispatch:
     inputs:
       name:

View File

@@ -4,7 +4,6 @@ on:
   push:
     tags:
       - '*'
-      - '!*-alpha*'
   workflow_dispatch:
     inputs:
       name:

View File

@@ -5,7 +5,7 @@
 COPY web/package.json .
 RUN npm install
 COPY ./web .
 COPY ./VERSION .
-RUN DISABLE_ESLINT_PLUGIN='true' REACT_APP_VERSION=$(cat VERSION) npm run build
+RUN DISABLE_ESLINT_PLUGIN='true' VITE_REACT_APP_VERSION=$(cat VERSION) npm run build

 FROM golang AS builder2

@@ -17,7 +17,7 @@ WORKDIR /build
 ADD go.mod go.sum ./
 RUN go mod download
 COPY . .
-COPY --from=builder /build/build ./web/build
+COPY --from=builder /build/dist ./web/dist
 RUN go build -ldflags "-s -w -X 'one-api/common.Version=$(cat VERSION)' -extldflags '-static'" -o one-api

 FROM alpine

View File

@@ -2,9 +2,7 @@
 **简介**:Midjourney Proxy API文档
-## 模型价格设置(在设置-运营设置-模型固定价格设置中设置)
-### 模型列表
+## 模型列表
 ### midjourney-proxy支持
@@ -27,6 +25,7 @@
 - mj_pan (平移)
 - swap_face (换脸)
+## 模型价格设置(在设置-运营设置-模型固定价格设置中设置)
 ```json
 {
   "mj_imagine": 0.1,
@@ -46,6 +45,7 @@
   "swap_face": 0.05
 }
 ```
+其中mj_inpaint和mj_custom_zoom的价格设置为0是因为这两个模型需要搭配mj_modal使用所以价格由mj_modal决定。
 ## 渠道设置
@@ -56,12 +56,12 @@
 部署Midjourney-Proxy并配置好midjourney账号等强烈建议设置密钥[项目地址](https://github.com/novicezk/midjourney-proxy)
-2. 在渠道管理中添加渠道,渠道类型选择**Midjourney Proxy**如果是plus版本选择**Midjourney Proxy Plus**,模型选择midjourney如果有换脸模型可以选择swap_face
+2. 在渠道管理中添加渠道,渠道类型选择**Midjourney Proxy**如果是plus版本选择**Midjourney Proxy Plus**,模型请参考上方模型列表
 3. 地址填写midjourney-proxy部署的地址例如http://localhost:8080
 4. 密钥填写midjourney-proxy的密钥如果没有设置密钥可以随便填
 ### 对接上游new api
-1. 在渠道管理中添加渠道,渠道类型选择**Midjourney Proxy Plus**,模型选择midjourney如果有换脸模型可以选择swap_face
+1. 在渠道管理中添加渠道,渠道类型选择**Midjourney Proxy Plus**,模型请参考上方模型列表
 2. 地址填写上游new api的地址例如http://localhost:3000
 3. 密钥填写上游new api的密钥

View File

@@ -18,7 +18,7 @@
 此分叉版本的主要变更如下:
 1. 全新的UI界面部分界面还待更新
-2. 添加[Midjourney-Proxy(Plus)](https://github.com/novicezk/midjourney-proxy)接口的支持
+2. 添加[Midjourney-Proxy(Plus)](https://github.com/novicezk/midjourney-proxy)接口的支持[对接文档](Midjourney.md),支持的接口如下:
   + [x] /mj/submit/imagine
   + [x] /mj/submit/change
   + [x] /mj/submit/blend
@@ -54,10 +54,21 @@
 2. 智谱glm-4vglm-4v识图
 3. Anthropic Claude 3 (claude-3-opus-20240229, claude-3-sonnet-20240229)
 4. [Ollama](https://github.com/ollama/ollama?tab=readme-ov-file),添加渠道时,密钥可以随便填写,默认的请求地址是[http://localhost:11434](http://localhost:11434),如果需要修改请在渠道中修改
-5. [Midjourney-Proxy(Plus)](https://github.com/novicezk/midjourney-proxy)接口
+5. [Midjourney-Proxy(Plus)](https://github.com/novicezk/midjourney-proxy)接口[对接文档](Midjourney.md)
+6. [零一万物](https://platform.lingyiwanwu.com/)
 您可以在渠道中添加自定义模型gpt-4-gizmo-*此模型并非OpenAI官方模型而是第三方模型使用官方key无法调用。
+
+## 渠道重试
+渠道重试功能已经实现,可以在`设置->运营设置->通用设置`设置重试次数,建议开启缓存功能。
+如果开启了重试功能,第一次重试使用同优先级,第二次重试使用下一个优先级,以此类推。
+### 缓存设置方法
+1. `REDIS_CONN_STRING`:设置之后将使用 Redis 作为缓存使用。
+   + 例子:`REDIS_CONN_STRING=redis://default:redispw@localhost:49153`
+2. `MEMORY_CACHE_ENABLED`:启用内存缓存(如果设置了`REDIS_CONN_STRING`,则无需手动设置),会导致用户额度的更新存在一定的延迟,可选值为 `true``false`,未设置则默认为 `false`
+   + 例子:`MEMORY_CACHE_ENABLED=true`
 ## 部署
 ### 基于 Docker 进行部署
 ```shell
@@ -98,6 +109,12 @@ docker run --name new-api -d --restart always -p 3000:3000 -e SQL_DSN="root:1234
 ![image](https://github.com/Calcium-Ion/new-api/assets/61247483/5b3228e8-2556-44f7-97d6-4f8d8ee6effa)
 ![image](https://github.com/Calcium-Ion/new-api/assets/61247483/af9a07ee-5101-4b3d-8bd9-ae21a4fd7e9e)
+
+## 相关项目
+- [One API](https://github.com/songquanpeng/one-api):原版项目
+- [Midjourney-Proxy](https://github.com/novicezk/midjourney-proxy)Midjourney接口支持
+- [chatnio](https://github.com/Deeptrain-Community/chatnio):下一代 AI 一站式 B/C 端解决方案
+- [neko-api-key-tool](https://github.com/Calcium-Ion/neko-api-key-tool)用key查询使用额度
 ## Star History
 [![Star History Chart](https://api.star-history.com/svg?repos=Calcium-Ion/new-api&type=Date)](https://star-history.com/#Calcium-Ion/new-api&Date)
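The retry and cache settings added to the README above are plain environment switches. As a rough illustration of how the two cache toggles interact (a minimal sketch under the README's naming, not the project's actual startup code):

```go
package main

import (
	"fmt"
	"os"
)

// pickCache decides which cache layer to use from the two settings documented
// in the README: REDIS_CONN_STRING and MEMORY_CACHE_ENABLED.
func pickCache() string {
	if os.Getenv("REDIS_CONN_STRING") != "" {
		// Redis acts as the cache; the memory cache does not need to be set explicitly.
		return "redis"
	}
	if os.Getenv("MEMORY_CACHE_ENABLED") == "true" {
		// In-memory cache: user quota updates may lag slightly.
		return "memory"
	}
	return "none" // default when neither variable is set
}

func main() {
	fmt.Println("cache layer:", pickCache())
}
```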

View File

@@ -9,15 +9,6 @@ import (
 	"github.com/google/uuid"
 )

-// Pay Settings
-var PayAddress = ""
-var CustomCallbackAddress = ""
-var EpayId = ""
-var EpayKey = ""
-var Price = 7.3
-var MinTopUp = 1
-
 var StartTime = time.Now().Unix() // unit: second
 var Version = "v0.0.0" // this hard coding will be replaced automatically when building, no need to manually change
 var SystemName = "New API"
@@ -55,7 +46,8 @@ var TelegramOAuthEnabled = false
 var TurnstileCheckEnabled = false
 var RegisterEnabled = true

-var EmailDomainRestrictionEnabled = false
+var EmailDomainRestrictionEnabled = false // 是否启用邮箱域名限制
+var EmailAliasRestrictionEnabled = false // 是否启用邮箱别名限制
 var EmailDomainWhitelist = []string{
 	"gmail.com",
 	"163.com",
@@ -75,6 +67,7 @@ var LogConsumeEnabled = true
 var SMTPServer = ""
 var SMTPPort = 587
+var SMTPSSLEnabled = false
 var SMTPAccount = ""
 var SMTPFrom = ""
 var SMTPToken = ""
@@ -110,7 +103,7 @@ var IsMasterNode = os.Getenv("NODE_TYPE") != "slave"
 var requestInterval, _ = strconv.Atoi(os.Getenv("POLLING_INTERVAL"))
 var RequestInterval = time.Duration(requestInterval) * time.Second

-var SyncFrequency = GetOrDefault("SYNC_FREQUENCY", 10*60) // unit is second
+var SyncFrequency = GetOrDefault("SYNC_FREQUENCY", 60) // unit is second
 var BatchUpdateEnabled = false
 var BatchUpdateInterval = GetOrDefault("BATCH_UPDATE_INTERVAL", 5)
@@ -211,6 +204,9 @@ const (
 	ChannelTypeGemini = 24
 	ChannelTypeMoonshot = 25
 	ChannelTypeZhipu_v4 = 26
+	ChannelTypePerplexity = 27
+	ChannelTypeLingYiWanWu = 31
+	ChannelTypeAws = 33
 )

 var ChannelBaseURLs = []string{
@@ -241,4 +237,12 @@ var ChannelBaseURLs = []string{
 	"https://generativelanguage.googleapis.com", //24
 	"https://api.moonshot.cn", //25
 	"https://open.bigmodel.cn", //26
+	"https://api.perplexity.ai", //27
+	"", //28
+	"", //29
+	"", //30
+	"https://api.lingyiwanwu.com", //31
+	"", //32
+	"", //33
 }

View File

@@ -24,7 +24,7 @@ func SendEmail(subject string, receiver string, content string) error {
 	addr := fmt.Sprintf("%s:%d", SMTPServer, SMTPPort)
 	to := strings.Split(receiver, ";")
 	var err error
-	if SMTPPort == 465 {
+	if SMTPPort == 465 || SMTPSSLEnabled {
 		tlsConfig := &tls.Config{
 			InsecureSkipVerify: true,
 			ServerName:         SMTPServer,

View File

@@ -5,18 +5,37 @@ import (
 	"encoding/json"
 	"github.com/gin-gonic/gin"
 	"io"
+	"strings"
 )

-func UnmarshalBodyReusable(c *gin.Context, v any) error {
+const KeyRequestBody = "key_request_body"
+
+func GetRequestBody(c *gin.Context) ([]byte, error) {
+	requestBody, _ := c.Get(KeyRequestBody)
+	if requestBody != nil {
+		return requestBody.([]byte), nil
+	}
 	requestBody, err := io.ReadAll(c.Request.Body)
 	if err != nil {
-		return err
+		return nil, err
 	}
-	err = c.Request.Body.Close()
+	_ = c.Request.Body.Close()
+	c.Set(KeyRequestBody, requestBody)
+	return requestBody.([]byte), nil
+}
+
+func UnmarshalBodyReusable(c *gin.Context, v any) error {
+	requestBody, err := GetRequestBody(c)
 	if err != nil {
 		return err
 	}
+	contentType := c.Request.Header.Get("Content-Type")
+	if strings.HasPrefix(contentType, "application/json") {
 	err = json.Unmarshal(requestBody, &v)
+	} else {
+		// skip for now
+		// TODO: someday non json request have variant model, we will need to implementation this
+	}
 	if err != nil {
 		return err
 	}
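The point of caching the body under `KeyRequestBody` is that a failed request can later be replayed against another channel: the retry loop in controller/relay.go (further down in this compare view) re-reads the cached bytes and rebuilds `c.Request.Body`. A minimal sketch of that pattern, with a hypothetical handler name and route, not the project's code:

```go
package main

import (
	"bytes"
	"io"

	"github.com/gin-gonic/gin"
	"one-api/common"
)

// retryable reads the body once via GetRequestBody, then replays it per attempt.
func retryable(c *gin.Context) {
	body, err := common.GetRequestBody(c) // cached in the gin context after the first read
	if err != nil {
		c.AbortWithStatus(400)
		return
	}
	for attempt := 0; attempt < 2; attempt++ {
		// Restore a fresh reader before each downstream attempt.
		c.Request.Body = io.NopCloser(bytes.NewBuffer(body))
		// ... forward the request to an upstream channel here ...
	}
	c.Status(200)
}

func main() {
	r := gin.Default()
	r.POST("/echo", retryable)
	_ = r.Run(":8080")
}
```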

View File

@@ -5,7 +5,7 @@ import (
 	"encoding/base64"
 	"errors"
 	"fmt"
-	"github.com/chai2010/webp"
+	"golang.org/x/image/webp"
 	"image"
 	"io"
 	"net/http"

View File

@@ -3,32 +3,33 @@ package common
 import (
 	"encoding/json"
 	"strings"
-	"time"
 )

-// ModelRatio
+// modelRatio
 // https://platform.openai.com/docs/models/model-endpoint-compatibility
 // https://cloud.baidu.com/doc/WENXINWORKSHOP/s/Blfmc9dlf
 // https://openai.com/pricing
 // TODO: when a new api is enabled, check the pricing here
 // 1 === $0.002 / 1K tokens
 // 1 === ¥0.014 / 1k tokens
 var DefaultModelRatio = map[string]float64{
 	//"midjourney": 50,
 	"gpt-4-gizmo-*": 15,
 	"gpt-4": 15,
-	"gpt-4-0314": 15,
+	//"gpt-4-0314": 15, //deprecated
 	"gpt-4-0613": 15,
 	"gpt-4-32k": 30,
-	"gpt-4-32k-0314": 30,
+	//"gpt-4-32k-0314": 30, //deprecated
 	"gpt-4-32k-0613": 30,
 	"gpt-4-1106-preview": 5,        // $0.01 / 1K tokens
 	"gpt-4-0125-preview": 5,        // $0.01 / 1K tokens
 	"gpt-4-turbo-preview": 5,       // $0.01 / 1K tokens
 	"gpt-4-vision-preview": 5,      // $0.01 / 1K tokens
 	"gpt-4-1106-vision-preview": 5, // $0.01 / 1K tokens
-	"gpt-3.5-turbo": 0.75, // $0.0015 / 1K tokens
-	"gpt-3.5-turbo-0301": 0.75,
+	"gpt-4-turbo": 5, // $0.01 / 1K tokens
+	"gpt-3.5-turbo": 0.25, // $0.0015 / 1K tokens
+	//"gpt-3.5-turbo-0301": 0.75, //deprecated
 	"gpt-3.5-turbo-0613": 0.75,
 	"gpt-3.5-turbo-16k": 1.5, // $0.003 / 1K tokens
 	"gpt-3.5-turbo-16k-0613": 1.5,
@@ -74,6 +75,12 @@ var DefaultModelRatio = map[string]float64{
 	"PaLM-2": 1,
 	"gemini-pro": 1,        // $0.00025 / 1k characters -> $0.001 / 1k tokens
 	"gemini-pro-vision": 1, // $0.00025 / 1k characters -> $0.001 / 1k tokens
+	"gemini-1.0-pro-vision-001": 1,
+	"gemini-1.0-pro-001": 1,
+	"gemini-1.5-pro-latest": 1,
+	"gemini-1.0-pro-latest": 1,
+	"gemini-1.0-pro-vision-latest": 1,
+	"gemini-ultra": 1,
 	"chatglm_turbo": 0.3572, // ¥0.005 / 1k tokens
 	"chatglm_pro": 0.7143,   // ¥0.01 / 1k tokens
 	"chatglm_std": 0.3572,   // ¥0.005 / 1k tokens
@@ -93,6 +100,11 @@ var DefaultModelRatio = map[string]float64{
 	"embedding_s1_v1": 0.0715,           // ¥0.001 / 1k tokens
 	"semantic_similarity_s1_v1": 0.0715, // ¥0.001 / 1k tokens
 	"hunyuan": 7.143, // ¥0.1 / 1k tokens // https://cloud.tencent.com/document/product/1729/97731#e0e6be58-60c8-469f-bdeb-6c264ce3b4d0
+	// https://platform.lingyiwanwu.com/docs#-计费单元
+	// 已经按照 7.2 来换算美元价格
+	"yi-34b-chat-0205": 0.018,
+	"yi-34b-chat-200k": 0.0864,
+	"yi-vl-plus": 0.0432,
 }

 var DefaultModelPrice = map[string]float64{
@@ -114,14 +126,14 @@ var DefaultModelPrice = map[string]float64{
 	"swap_face": 0.05,
 }

-var ModelPrice = map[string]float64{}
-var ModelRatio = map[string]float64{}
+var modelPrice map[string]float64 = nil
+var modelRatio map[string]float64 = nil

 func ModelPrice2JSONString() string {
-	if len(ModelPrice) == 0 {
-		ModelPrice = DefaultModelPrice
+	if modelPrice == nil {
+		modelPrice = DefaultModelPrice
 	}
-	jsonBytes, err := json.Marshal(ModelPrice)
+	jsonBytes, err := json.Marshal(modelPrice)
 	if err != nil {
 		SysError("error marshalling model price: " + err.Error())
 	}
@@ -129,18 +141,18 @@ func ModelPrice2JSONString() string {
 }

 func UpdateModelPriceByJSONString(jsonStr string) error {
-	ModelPrice = make(map[string]float64)
-	return json.Unmarshal([]byte(jsonStr), &ModelPrice)
+	modelPrice = make(map[string]float64)
+	return json.Unmarshal([]byte(jsonStr), &modelPrice)
 }

 func GetModelPrice(name string, printErr bool) float64 {
-	if len(ModelPrice) == 0 {
-		ModelPrice = DefaultModelPrice
+	if modelPrice == nil {
+		modelPrice = DefaultModelPrice
 	}
 	if strings.HasPrefix(name, "gpt-4-gizmo") {
 		name = "gpt-4-gizmo-*"
 	}
-	price, ok := ModelPrice[name]
+	price, ok := modelPrice[name]
 	if !ok {
 		if printErr {
 			SysError("model price not found: " + name)
@@ -151,10 +163,10 @@ func GetModelPrice(name string, printErr bool) float64 {
 }

 func ModelRatio2JSONString() string {
-	if len(ModelRatio) == 0 {
-		ModelRatio = DefaultModelRatio
+	if modelRatio == nil {
+		modelRatio = DefaultModelRatio
 	}
-	jsonBytes, err := json.Marshal(ModelRatio)
+	jsonBytes, err := json.Marshal(modelRatio)
 	if err != nil {
 		SysError("error marshalling model ratio: " + err.Error())
 	}
@@ -162,18 +174,18 @@ func ModelRatio2JSONString() string {
 }

 func UpdateModelRatioByJSONString(jsonStr string) error {
-	ModelRatio = make(map[string]float64)
-	return json.Unmarshal([]byte(jsonStr), &ModelRatio)
+	modelRatio = make(map[string]float64)
+	return json.Unmarshal([]byte(jsonStr), &modelRatio)
 }

 func GetModelRatio(name string) float64 {
-	if len(ModelRatio) == 0 {
-		ModelRatio = DefaultModelRatio
+	if modelRatio == nil {
+		modelRatio = DefaultModelRatio
 	}
 	if strings.HasPrefix(name, "gpt-4-gizmo") {
 		name = "gpt-4-gizmo-*"
 	}
-	ratio, ok := ModelRatio[name]
+	ratio, ok := modelRatio[name]
 	if !ok {
 		SysError("model ratio not found: " + name)
 		return 30
@@ -183,35 +195,38 @@ func GetModelRatio(name string) float64 {

 func GetCompletionRatio(name string) float64 {
 	if strings.HasPrefix(name, "gpt-3.5") {
-		if strings.HasSuffix(name, "0125") {
+		if name == "gpt-3.5-turbo" || strings.HasSuffix(name, "0125") {
+			// https://openai.com/blog/new-embedding-models-and-api-updates
+			// Updated GPT-3.5 Turbo model and lower pricing
 			return 3
 		}
 		if strings.HasSuffix(name, "1106") {
 			return 2
 		}
-		if name == "gpt-3.5-turbo" || name == "gpt-3.5-turbo-16k" {
-			// TODO: clear this after 2023-12-11
-			now := time.Now()
-			// https://platform.openai.com/docs/models/continuous-model-upgrades
-			// if after 2023-12-11, use 2
-			if now.After(time.Date(2023, 12, 11, 0, 0, 0, 0, time.UTC)) {
-				return 2
-			}
-		}
-		return 1.333333
+		return 4.0 / 3.0
 	}
 	if strings.HasPrefix(name, "gpt-4") {
-		if strings.HasSuffix(name, "preview") {
+		if strings.HasPrefix(name, "gpt-4-turbo") || strings.HasSuffix(name, "preview") {
 			return 3
 		}
 		return 2
 	}
-	if strings.HasPrefix(name, "claude-instant-1") {
+	if strings.Contains(name, "claude-instant-1") {
 		return 3
-	} else if strings.HasPrefix(name, "claude-2") {
+	} else if strings.Contains(name, "claude-2") {
 		return 3
-	} else if strings.HasPrefix(name, "claude-3") {
+	} else if strings.Contains(name, "claude-3") {
 		return 5
 	}
+	if strings.HasPrefix(name, "mistral-") {
+		return 3
+	}
+	if strings.HasPrefix(name, "gemini-") {
+		return 3
+	}
+	switch name {
+	case "llama2-70b-4096":
+		return 0.8 / 0.7
+	}
 	return 1
 }
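To make the ratio arithmetic above concrete: per the file's own convention (ratio 1 equals $0.002 per 1K prompt tokens), gpt-4-turbo at ratio 5 is $0.01 per 1K prompt tokens, and its completion ratio of 3 puts output tokens at $0.03 per 1K. A small worked sketch of that calculation (assumed helper, not the project's billing code):

```go
package main

import "fmt"

// costUSD applies the documented convention: model ratio 1 == $0.002 per 1K prompt tokens,
// and completion tokens are additionally multiplied by the completion ratio.
func costUSD(promptTokens, completionTokens int, modelRatio, completionRatio float64) float64 {
	base := 0.002 / 1000 // dollars per prompt token at ratio 1
	prompt := float64(promptTokens) * base * modelRatio
	completion := float64(completionTokens) * base * modelRatio * completionRatio
	return prompt + completion
}

func main() {
	// gpt-4-turbo: model ratio 5, completion ratio 3 (per the table and GetCompletionRatio above).
	fmt.Printf("%.4f USD\n", costUSD(1000, 1000, 5, 3)) // 0.0100 + 0.0300 = 0.0400
}
```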

View File

@@ -18,9 +18,8 @@ func InitRedisClient() (err error) {
 		return nil
 	}
 	if os.Getenv("SYNC_FREQUENCY") == "" {
-		RedisEnabled = false
-		SysLog("SYNC_FREQUENCY not set, Redis is disabled")
-		return nil
+		SysLog("SYNC_FREQUENCY not set, use default value 60")
+		SyncFrequency = 60
 	}
 	SysLog("Redis is enabled")
 	opt, err := redis.ParseURL(os.Getenv("REDIS_CONN_STRING"))

common/str.go (new file, 50 lines)
View File

@@ -0,0 +1,50 @@
package common

func SundaySearch(text string, pattern string) bool {
	// build the shift table
	offset := make(map[rune]int)
	for i, c := range pattern {
		offset[c] = len(pattern) - i
	}

	// lengths of the text and the pattern
	n, m := len(text), len(pattern)

	// main loop: i is the current alignment position in the text
	for i := 0; i <= n-m; {
		// compare the candidate substring
		j := 0
		for j < m && text[i+j] == pattern[j] {
			j++
		}
		// full match found
		if j == m {
			return true
		}

		// otherwise look up the character just past the window in the shift table
		if i+m < n {
			next := rune(text[i+m])
			if val, ok := offset[next]; ok {
				i += val // present in the shift table: jump by its offset
			} else {
				i += len(pattern) + 1 // absent: skip past the whole pattern
			}
		} else {
			break
		}
	}
	return false // no match found
}

func RemoveDuplicate(s []string) []string {
	result := make([]string, 0, len(s))
	temp := map[string]struct{}{}
	for _, item := range s {
		if _, ok := temp[item]; !ok {
			temp[item] = struct{}{}
			result = append(result, item)
		}
	}
	return result
}
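For reference, a minimal usage sketch of the two helpers added above (assumes the `one-api` module path declared in go.mod; not part of the diff):

```go
package main

import (
	"fmt"

	"one-api/common"
)

func main() {
	// Substring search via the Sunday algorithm in common/str.go.
	fmt.Println(common.SundaySearch("hello world", "world")) // true
	fmt.Println(common.SundaySearch("hello world", "mars"))  // false

	// Order-preserving de-duplication.
	fmt.Println(common.RemoveDuplicate([]string{"a", "b", "a", "c", "b"})) // [a b c]
}
```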

View File

@@ -236,3 +236,8 @@ func StringToByteSlice(s string) []byte {
 	tmp2 := [3]uintptr{tmp1[0], tmp1[1], tmp1[1]}
 	return *(*[]byte)(unsafe.Pointer(&tmp2))
 }
+
+func RandomSleep() {
+	// Sleep for 0-3000 ms
+	time.Sleep(time.Duration(rand.Intn(3000)) * time.Millisecond)
+}

View File

@@ -1,6 +1,8 @@
 package constant

 var MjNotifyEnabled = false
+var MjModeClearEnabled = false
+var MjForwardUrlEnabled = true

 const (
 	MjErrorUnknown = 5

constant/payment.go (new file, 8 lines)
View File

@@ -0,0 +1,8 @@
package constant
var PayAddress = ""
var CustomCallbackAddress = ""
var EpayId = ""
var EpayKey = ""
var Price = 7.3
var MinTopUp = 1

constant/sensitive.go (new file, 43 lines)
View File

@@ -0,0 +1,43 @@
package constant

import "strings"

var CheckSensitiveEnabled = true
var CheckSensitiveOnPromptEnabled = true

//var CheckSensitiveOnCompletionEnabled = true

// StopOnSensitiveEnabled: when a sensitive word is detected, stop generation immediately;
// otherwise the sensitive word is replaced.
var StopOnSensitiveEnabled = true

// StreamCacheQueueLength is the stream-mode cache queue length; 0 means no cache.
var StreamCacheQueueLength = 0

// SensitiveWords is the sensitive word list.
// var SensitiveWords []string
var SensitiveWords = []string{
	"test",
}

func SensitiveWordsToString() string {
	return strings.Join(SensitiveWords, "\n")
}

func SensitiveWordsFromString(s string) {
	SensitiveWords = []string{}
	sw := strings.Split(s, "\n")
	for _, w := range sw {
		w = strings.TrimSpace(w)
		if w != "" {
			SensitiveWords = append(SensitiveWords, w)
		}
	}
}

func ShouldCheckPromptSensitive() bool {
	return CheckSensitiveEnabled && CheckSensitiveOnPromptEnabled
}

//func ShouldCheckCompletionSensitive() bool {
//	return CheckSensitiveEnabled && CheckSensitiveOnCompletionEnabled
//}
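A small usage sketch of these switches (hypothetical driver program, using the exported names from the file above):

```go
package main

import (
	"fmt"

	"one-api/constant"
)

func main() {
	// Load an admin-provided, newline-separated word list; blanks are skipped.
	constant.SensitiveWordsFromString("foo\nbar\n\n  baz  ")
	fmt.Println(constant.SensitiveWords) // [foo bar baz]

	// Prompt checking runs only when both switches are on.
	constant.CheckSensitiveEnabled = true
	constant.CheckSensitiveOnPromptEnabled = false
	fmt.Println(constant.ShouldCheckPromptSensitive()) // false
}
```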

View File

@@ -27,7 +27,6 @@ func testChannel(channel *model.Channel, testModel string) (err error, openaiErr
 	if channel.Type == common.ChannelTypeMidjourney {
 		return errors.New("midjourney channel test is not supported"), nil
 	}
-	common.SysLog(fmt.Sprintf("testing channel %d with model %s", channel.Id, testModel))
 	w := httptest.NewRecorder()
 	c, _ := gin.CreateTestContext(w)
 	c.Request = &http.Request{
@@ -60,12 +59,16 @@
 		return fmt.Errorf("invalid api type: %d, adaptor is nil", apiType), nil
 	}
 	if testModel == "" {
-		testModel = adaptor.GetModelList()[0]
-		meta.UpstreamModelName = testModel
+		if channel.TestModel != nil && *channel.TestModel != "" {
+			testModel = *channel.TestModel
+		} else {
+			testModel = adaptor.GetModelList()[0]
+		}
 	}

 	request := buildTestRequest()
 	request.Model = testModel
+	meta.UpstreamModelName = testModel
+	common.SysLog(fmt.Sprintf("testing channel %d with model %s", channel.Id, testModel))

 	adaptor.Init(meta, *request)
@@ -83,7 +86,7 @@
 	if err != nil {
 		return err, nil
 	}
-	if resp.StatusCode != http.StatusOK {
+	if resp != nil && resp.StatusCode != http.StatusOK {
 		err := relaycommon.RelayErrorHandler(resp)
 		return fmt.Errorf("status code %d: %s", resp.StatusCode, err.Error.Message), &err.Error
 	}
@@ -108,6 +111,7 @@ func buildTestRequest() *dto.GeneralOpenAIRequest {
 	testRequest := &dto.GeneralOpenAIRequest{
 		Model:     "", // this will be set later
 		MaxTokens: 1,
+		Stream:    false,
 	}
 	content, _ := json.Marshal("hi")
 	testMessage := dto.Message{

View File

@@ -10,11 +10,11 @@ import (
 	"log"
 	"net/http"
 	"one-api/common"
+	"one-api/constant"
 	"one-api/dto"
 	"one-api/model"
 	"one-api/service"
 	"strconv"
-	"strings"
 	"time"
 )
@@ -147,7 +147,7 @@ func UpdateMidjourneyTaskBulk() {
 			task.Buttons = string(buttonStr)
 		}
-		if task.Progress != "100%" && responseItem.FailReason != "" {
+		if (task.Progress != "100%" && responseItem.FailReason != "") || (task.Progress == "100%" && task.Status == "FAILURE") {
 			common.LogInfo(ctx, task.MjId+" 构建失败,"+task.FailReason)
 			task.Progress = "100%"
 			err = model.CacheUpdateUserQuota(task.UserId)
@@ -233,6 +233,12 @@ func GetAllMidjourney(c *gin.Context) {
 	if logs == nil {
 		logs = make([]*model.Midjourney, 0)
 	}
+	if constant.MjForwardUrlEnabled {
+		for i, midjourney := range logs {
+			midjourney.ImageUrl = common.ServerAddress + "/mj/image/" + midjourney.MjId
+			logs[i] = midjourney
+		}
+	}
 	c.JSON(200, gin.H{
 		"success": true,
 		"message": "",
@@ -259,7 +265,7 @@ func GetUserMidjourney(c *gin.Context) {
 	if logs == nil {
 		logs = make([]*model.Midjourney, 0)
 	}
-	if !strings.Contains(common.ServerAddress, "localhost") {
+	if constant.MjForwardUrlEnabled {
 		for i, midjourney := range logs {
 			midjourney.ImageUrl = common.ServerAddress + "/mj/image/" + midjourney.MjId
 			logs[i] = midjourney

View File

@@ -5,6 +5,7 @@ import (
 	"fmt"
 	"net/http"
 	"one-api/common"
+	"one-api/constant"
 	"one-api/model"
 	"strings"
@@ -32,6 +33,7 @@ func GetStatus(c *gin.Context) {
 		"success": true,
 		"message": "",
 		"data": gin.H{
+			"version":            common.Version,
 			"start_time":         common.StartTime,
 			"email_verification": common.EmailVerificationEnabled,
 			"github_oauth":       common.GitHubOAuthEnabled,
@@ -44,8 +46,8 @@ func GetStatus(c *gin.Context) {
 			"wechat_qrcode":      common.WeChatAccountQRCodeImageURL,
 			"wechat_login":       common.WeChatAuthEnabled,
 			"server_address":     common.ServerAddress,
-			"price":              common.Price,
-			"min_topup":          common.MinTopUp,
+			"price":              constant.Price,
+			"min_topup":          constant.MinTopUp,
 			"turnstile_check":    common.TurnstileCheckEnabled,
 			"turnstile_site_key": common.TurnstileSiteKey,
 			"top_up_link":        common.TopUpLink,
@@ -58,7 +60,8 @@ func GetStatus(c *gin.Context) {
 			"enable_data_export":       common.DataExportEnabled,
 			"data_export_default_time": common.DataExportDefaultTime,
 			"default_collapse_sidebar": common.DefaultCollapseSidebar,
-			"enable_online_topup":      common.PayAddress != "" && common.EpayId != "" && common.EpayKey != "",
+			"enable_online_topup":      constant.PayAddress != "" && constant.EpayId != "" && constant.EpayKey != "",
+			"mj_notify_enabled":        constant.MjNotifyEnabled,
 		},
 	})
 	return
@@ -117,10 +120,20 @@ func SendEmailVerification(c *gin.Context) {
 		})
 		return
 	}
+	parts := strings.Split(email, "@")
+	if len(parts) != 2 {
+		c.JSON(http.StatusOK, gin.H{
+			"success": false,
+			"message": "无效的邮箱地址",
+		})
+		return
+	}
+	localPart := parts[0]
+	domainPart := parts[1]
 	if common.EmailDomainRestrictionEnabled {
 		allowed := false
 		for _, domain := range common.EmailDomainWhitelist {
-			if strings.HasSuffix(email, "@"+domain) {
+			if domainPart == domain {
 				allowed = true
 				break
 			}
@@ -128,11 +141,22 @@ func SendEmailVerification(c *gin.Context) {
 		if !allowed {
 			c.JSON(http.StatusOK, gin.H{
 				"success": false,
-				"message": "管理员启用了邮箱域名白名单,您的邮箱地址的域名不在白名单中",
+				"message": "The administrator has enabled the email domain name whitelist, and your email address is not allowed due to special symbols or it's not in the whitelist.",
 			})
 			return
 		}
 	}
+	if common.EmailAliasRestrictionEnabled {
+		containsSpecialSymbols := strings.Contains(localPart, "+") || strings.Count(localPart, ".") > 1
+		if containsSpecialSymbols {
+			c.JSON(http.StatusOK, gin.H{
+				"success": false,
+				"message": "管理员已启用邮箱地址别名限制,您的邮箱地址由于包含特殊符号而被拒绝。",
+			})
+			return
+		}
+	}
 	if model.IsEmailAlreadyTaken(email) {
 		c.JSON(http.StatusOK, gin.H{
 			"success": false,

View File

@@ -10,6 +10,7 @@ import (
 	"one-api/relay"
 	"one-api/relay/channel/ai360"
 	"one-api/relay/channel/moonshot"
+	"one-api/relay/channel/lingyiwanwu"
 	relayconstant "one-api/relay/constant"
 )
@@ -101,6 +102,17 @@ func init() {
 			Parent:     nil,
 		})
 	}
+	for _, modelName := range lingyiwanwu.ModelList {
+		openAIModels = append(openAIModels, OpenAIModels{
+			Id:         modelName,
+			Object:     "model",
+			Created:    1626777600,
+			OwnedBy:    "lingyiwanwu",
+			Permission: permission,
+			Root:       modelName,
+			Parent:     nil,
+		})
+	}
 	for modelName, _ := range constant.MidjourneyModel2Action {
 		openAIModels = append(openAIModels, OpenAIModels{
 			Id: modelName,

View File

@@ -14,7 +14,7 @@ func GetOptions(c *gin.Context) {
 	var options []*model.Option
 	common.OptionMapRWMutex.Lock()
 	for k, v := range common.OptionMap {
-		if strings.HasSuffix(k, "Token") || strings.HasSuffix(k, "Secret") {
+		if strings.HasSuffix(k, "Token") || strings.HasSuffix(k, "Secret") || strings.HasSuffix(k, "Key") {
 			continue
 		}
 		options = append(options, &model.Option{

View File

@@ -1,21 +1,24 @@
 package controller

 import (
+	"bytes"
 	"fmt"
 	"github.com/gin-gonic/gin"
+	"io"
 	"log"
 	"net/http"
 	"one-api/common"
 	"one-api/dto"
+	"one-api/middleware"
+	"one-api/model"
 	"one-api/relay"
 	"one-api/relay/constant"
 	relayconstant "one-api/relay/constant"
 	"one-api/service"
-	"strconv"
+	"strings"
 )

-func Relay(c *gin.Context) {
-	relayMode := constant.Path2RelayMode(c.Request.URL.Path)
+func relayHandler(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusCode {
 	var err *dto.OpenAIErrorWithStatusCode
 	switch relayMode {
 	case relayconstant.RelayModeImagesGenerations:
@@ -29,35 +32,104 @@
 	default:
 		err = relay.TextHelper(c)
 	}
-	if err != nil {
+	return err
+}
+
+func Relay(c *gin.Context) {
+	relayMode := constant.Path2RelayMode(c.Request.URL.Path)
+	retryTimes := common.RetryTimes
 	requestId := c.GetString(common.RequestIdKey)
-	retryTimesStr := c.Query("retry")
-	retryTimes, _ := strconv.Atoi(retryTimesStr)
-	if retryTimesStr == "" {
-		retryTimes = common.RetryTimes
-	}
-	if retryTimes > 0 {
-		c.Redirect(http.StatusTemporaryRedirect, fmt.Sprintf("%s?retry=%d", c.Request.URL.Path, retryTimes-1))
-	} else {
-		if err.StatusCode == http.StatusTooManyRequests {
-			//err.Error.Message = "当前分组上游负载已饱和,请稍后再试"
-		}
-		err.Error.Message = common.MessageWithRequestId(err.Error.Message, requestId)
-		c.JSON(err.StatusCode, gin.H{
-			"error": err.Error,
-		})
-	}
-	channelId := c.GetInt("channel_id")
+	channelId := c.GetInt("channel_id")
+	group := c.GetString("group")
+	originalModel := c.GetString("original_model")
+	openaiErr := relayHandler(c, relayMode)
+	useChannel := []int{channelId}
+	if openaiErr != nil {
+		go processChannelError(c, channelId, openaiErr)
+	} else {
+		retryTimes = 0
+	}
+	for i := 0; shouldRetry(c, channelId, openaiErr, retryTimes) && i < retryTimes; i++ {
+		channel, err := model.CacheGetRandomSatisfiedChannel(group, originalModel, i)
+		if err != nil {
+			common.LogError(c.Request.Context(), fmt.Sprintf("CacheGetRandomSatisfiedChannel failed: %s", err.Error()))
+			break
+		}
+		channelId = channel.Id
+		useChannel = append(useChannel, channelId)
+		common.LogInfo(c.Request.Context(), fmt.Sprintf("using channel #%d to retry (remain times %d)", channel.Id, i))
+		middleware.SetupContextForSelectedChannel(c, channel, originalModel)
+		requestBody, err := common.GetRequestBody(c)
+		c.Request.Body = io.NopCloser(bytes.NewBuffer(requestBody))
+		openaiErr = relayHandler(c, relayMode)
+		if openaiErr != nil {
+			go processChannelError(c, channelId, openaiErr)
+		}
+	}
+	if len(useChannel) > 1 {
+		retryLogStr := fmt.Sprintf("重试:%s", strings.Trim(strings.Join(strings.Fields(fmt.Sprint(useChannel)), "->"), "[]"))
+		common.LogInfo(c.Request.Context(), retryLogStr)
+	}
+	if openaiErr != nil {
+		if openaiErr.StatusCode == http.StatusTooManyRequests {
+			openaiErr.Error.Message = "当前分组上游负载已饱和,请稍后再试"
+		}
+		openaiErr.Error.Message = common.MessageWithRequestId(openaiErr.Error.Message, requestId)
+		c.JSON(openaiErr.StatusCode, gin.H{
+			"error": openaiErr.Error,
+		})
+	}
+}
+
+func shouldRetry(c *gin.Context, channelId int, openaiErr *dto.OpenAIErrorWithStatusCode, retryTimes int) bool {
+	if openaiErr == nil {
+		return false
+	}
+	if retryTimes <= 0 {
+		return false
+	}
+	if _, ok := c.Get("specific_channel_id"); ok {
+		return false
+	}
+	if openaiErr.StatusCode == http.StatusTooManyRequests {
+		return true
+	}
+	if openaiErr.StatusCode == 307 {
+		return true
+	}
+	if openaiErr.StatusCode/100 == 5 {
+		// 超时不重试
+		if openaiErr.StatusCode == 504 || openaiErr.StatusCode == 524 {
+			return false
+		}
+		return true
+	}
+	if openaiErr.StatusCode == http.StatusBadRequest {
+		return false
+	}
+	if openaiErr.StatusCode == 408 {
+		// azure处理超时不重试
+		return false
+	}
+	if openaiErr.LocalError {
+		return false
+	}
+	if openaiErr.StatusCode/100 == 2 {
+		return false
+	}
+	return true
+}
+
+func processChannelError(c *gin.Context, channelId int, err *dto.OpenAIErrorWithStatusCode) {
 	autoBan := c.GetBool("auto_ban")
 	common.LogError(c.Request.Context(), fmt.Sprintf("relay error (channel #%d): %s", channelId, err.Error.Message))
-	// https://platform.openai.com/docs/guides/error-codes/api-errors
 	if service.ShouldDisableChannel(&err.Error, err.StatusCode) && autoBan {
-		channelId := c.GetInt("channel_id")
 		channelName := c.GetString("channel_name")
 		service.DisableChannel(channelId, channelName, err.Error.Message)
 	}
 }
-}

 func RelayMidjourney(c *gin.Context) {
 	relayMode := c.GetInt("relay_mode")

View File

@@ -2,9 +2,11 @@ package controller

 import (
 	"fmt"
+	"github.com/Calcium-Ion/go-epay/epay"
 	"github.com/gin-gonic/gin"
 	"github.com/samber/lo"
-	epay "github.com/star-horizon/go-epay"
+	"one-api/constant"
 	"log"
 	"net/url"
 	"one-api/common"
@@ -27,44 +29,59 @@ type AmountRequest struct {
 }

 func GetEpayClient() *epay.Client {
-	if common.PayAddress == "" || common.EpayId == "" || common.EpayKey == "" {
+	if constant.PayAddress == "" || constant.EpayId == "" || constant.EpayKey == "" {
 		return nil
 	}
-	withUrl, err := epay.NewClientWithUrl(&epay.Config{
-		PartnerID: common.EpayId,
-		Key:       common.EpayKey,
-	}, common.PayAddress)
+	withUrl, err := epay.NewClient(&epay.Config{
+		PartnerID: constant.EpayId,
+		Key:       constant.EpayKey,
+	}, constant.PayAddress)
 	if err != nil {
 		return nil
 	}
 	return withUrl
 }

-func GetAmount(count float64, user model.User) float64 {
+func getPayMoney(amount float64, user model.User) float64 {
+	if !common.DisplayInCurrencyEnabled {
+		amount = amount / common.QuotaPerUnit
+	}
 	// 别问为什么用float64问就是这么点钱没必要
 	topupGroupRatio := common.GetTopupGroupRatio(user.Group)
 	if topupGroupRatio == 0 {
 		topupGroupRatio = 1
 	}
-	amount := count * common.Price * topupGroupRatio
-	return amount
+	payMoney := amount * constant.Price * topupGroupRatio
+	return payMoney
+}
+
+func getMinTopup() int {
+	minTopup := constant.MinTopUp
+	if !common.DisplayInCurrencyEnabled {
+		minTopup = minTopup * int(common.QuotaPerUnit)
+	}
+	return minTopup
 }

 func RequestEpay(c *gin.Context) {
 	var req EpayRequest
 	err := c.ShouldBindJSON(&req)
 	if err != nil {
-		c.JSON(200, gin.H{"message": err.Error(), "data": 10})
+		c.JSON(200, gin.H{"message": "error", "data": "参数错误"})
 		return
 	}
-	if req.Amount < common.MinTopUp {
-		c.JSON(200, gin.H{"message": fmt.Sprintf("充值数量不能小于 %d", common.MinTopUp), "data": 10})
+	if req.Amount < getMinTopup() {
+		c.JSON(200, gin.H{"message": "error", "data": fmt.Sprintf("充值数量不能小于 %d", getMinTopup())})
 		return
 	}
 	id := c.GetInt("id")
 	user, _ := model.GetUserById(id, false)
-	payMoney := GetAmount(float64(req.Amount), *user)
+	payMoney := getPayMoney(float64(req.Amount), *user)
+	if payMoney < 0.01 {
+		c.JSON(200, gin.H{"message": "error", "data": "充值金额过低"})
+		return
+	}

 	var payType epay.PurchaseType
 	if req.PaymentMethod == "zfb" {
@@ -77,7 +94,7 @@ func RequestEpay(c *gin.Context) {
 	callBackAddress := service.GetCallbackAddress()
 	returnUrl, _ := url.Parse(common.ServerAddress + "/log")
 	notifyUrl, _ := url.Parse(callBackAddress + "/api/user/epay/notify")
-	tradeNo := strconv.FormatInt(time.Now().Unix(), 10)
+	tradeNo := fmt.Sprintf("%s%d", common.GetRandomString(6), time.Now().Unix())
 	client := GetEpayClient()
 	if client == nil {
 		c.JSON(200, gin.H{"message": "error", "data": "当前管理员未配置支付信息"})
@@ -96,9 +113,13 @@ func RequestEpay(c *gin.Context) {
 		c.JSON(200, gin.H{"message": "error", "data": "拉起支付失败"})
 		return
 	}
+	amount := req.Amount
+	if !common.DisplayInCurrencyEnabled {
+		amount = amount / int(common.QuotaPerUnit)
+	}
 	topUp := &model.TopUp{
 		UserId:     id,
-		Amount:     req.Amount,
+		Amount:     amount,
 		Money:      payMoney,
 		TradeNo:    "A" + tradeNo,
 		CreateTime: time.Now().Unix(),
@@ -186,13 +207,13 @@ func EpayNotify(c *gin.Context) {
 		}
 		//user, _ := model.GetUserById(topUp.UserId, false)
 		//user.Quota += topUp.Amount * 500000
-		err = model.IncreaseUserQuota(topUp.UserId, topUp.Amount*500000)
+		err = model.IncreaseUserQuota(topUp.UserId, topUp.Amount*int(common.QuotaPerUnit))
 		if err != nil {
 			log.Printf("易支付回调更新用户失败: %v", topUp)
 			return
 		}
 		log.Printf("易支付回调更新用户成功 %v", topUp)
-		model.RecordLog(topUp.UserId, model.LogTypeTopup, fmt.Sprintf("使用在线充值成功,充值金额: %v支付金额%f", common.LogQuota(topUp.Amount*500000), topUp.Money))
+		model.RecordLog(topUp.UserId, model.LogTypeTopup, fmt.Sprintf("使用在线充值成功,充值金额: %v支付金额%f", common.LogQuota(topUp.Amount*int(common.QuotaPerUnit)), topUp.Money))
 	} else {
 		log.Printf("易支付异常回调: %v", verifyInfo)
@@ -206,12 +227,17 @@ func RequestAmount(c *gin.Context) {
 		c.JSON(200, gin.H{"message": "error", "data": "参数错误"})
 		return
 	}
-	if req.Amount < common.MinTopUp {
-		c.JSON(200, gin.H{"message": "error", "data": fmt.Sprintf("充值数量不能小于 %d", common.MinTopUp)})
+	if req.Amount < getMinTopup() {
+		c.JSON(200, gin.H{"message": "error", "data": fmt.Sprintf("充值数量不能小于 %d", getMinTopup())})
 		return
 	}
 	id := c.GetInt("id")
 	user, _ := model.GetUserById(id, false)
-	payMoney := GetAmount(float64(req.Amount), *user)
+	payMoney := getPayMoney(float64(req.Amount), *user)
+	if payMoney <= 0.01 {
+		c.JSON(200, gin.H{"message": "error", "data": "充值金额过低"})
+		return
+	}
 	c.JSON(200, gin.H{"message": "success", "data": strconv.FormatFloat(payMoney, 'f', 2, 64)})
 }

View File

@@ -7,6 +7,7 @@ import (
 	"one-api/common"
 	"one-api/model"
 	"strconv"
+	"sync"

 	"github.com/gin-contrib/sessions"
 	"github.com/gin-gonic/gin"
@@ -724,7 +725,7 @@ func ManageUser(c *gin.Context) {
 		user.Role = common.RoleCommonUser
 	}
-	if err := user.Update(false); err != nil {
+	if err := user.UpdateAll(false); err != nil {
 		c.JSON(http.StatusOK, gin.H{
 			"success": false,
 			"message": err.Error(),
@@ -789,7 +790,11 @@ type topUpRequest struct {
 	Key string `json:"key"`
 }

+var lock = sync.Mutex{}
+
 func TopUp(c *gin.Context) {
+	lock.Lock()
+	defer lock.Unlock()
 	req := topUpRequest{}
 	err := c.ShouldBindJSON(&req)
 	if err != nil {

View File

@@ -10,6 +10,7 @@ type OpenAIError struct {
 type OpenAIErrorWithStatusCode struct {
 	Error      OpenAIError `json:"error"`
 	StatusCode int         `json:"status_code"`
+	LocalError bool
 }

 type GeneralErrorResponse struct {

dto/sensitive.go (new file, 6 lines)
View File

@@ -0,0 +1,6 @@
package dto

type SensitiveResponse struct {
	SensitiveWords []string `json:"sensitive_words"`
	Content        string   `json:"content"`
}

View File

@@ -32,6 +32,17 @@ type GeneralOpenAIRequest struct {
TopLogProbs int `json:"top_logprobs,omitempty"` TopLogProbs int `json:"top_logprobs,omitempty"`
} }
type OpenAITools struct {
Type string `json:"type"`
Function OpenAIFunction `json:"function"`
}
type OpenAIFunction struct {
Description string `json:"description,omitempty"`
Name string `json:"name"`
Parameters any `json:"parameters,omitempty"`
}
func (r GeneralOpenAIRequest) ParseInput() []string { func (r GeneralOpenAIRequest) ParseInput() []string {
if r.Input == nil { if r.Input == nil {
return nil return nil
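
The added OpenAITools and OpenAIFunction types model the tools array of a chat completion request. A sketch of how such a payload could be built and serialized, reusing the field names from the structs above; the weather-lookup function and its parameter schema are invented for illustration:

package main

import (
	"encoding/json"
	"fmt"
)

type OpenAIFunction struct {
	Description string `json:"description,omitempty"`
	Name        string `json:"name"`
	Parameters  any    `json:"parameters,omitempty"`
}

type OpenAITools struct {
	Type     string         `json:"type"`
	Function OpenAIFunction `json:"function"`
}

func main() {
	// Build one function tool; the schema below is a made-up example.
	tools := []OpenAITools{{
		Type: "function",
		Function: OpenAIFunction{
			Name:        "get_weather",
			Description: "Look up the current weather for a city",
			Parameters: map[string]any{
				"type": "object",
				"properties": map[string]any{
					"city": map[string]any{"type": "string"},
				},
				"required": []string{"city"},
			},
		},
	}}
	out, _ := json.MarshalIndent(tools, "", "  ")
	fmt.Println(string(out))
}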


@@ -1,11 +1,31 @@
package dto package dto
type TextResponse struct { type TextResponseWithError struct {
Id string `json:"id"`
Object string `json:"object"`
Created int64 `json:"created"`
Choices []OpenAITextResponseChoice `json:"choices"` Choices []OpenAITextResponseChoice `json:"choices"`
Data []OpenAIEmbeddingResponseItem `json:"data"`
Model string `json:"model"`
Usage `json:"usage"` Usage `json:"usage"`
Error OpenAIError `json:"error"` Error OpenAIError `json:"error"`
} }
type SimpleResponse struct {
Usage `json:"usage"`
Error OpenAIError `json:"error"`
Choices []OpenAITextResponseChoice `json:"choices"`
}
type TextResponse struct {
Id string `json:"id"`
Object string `json:"object"`
Created int64 `json:"created"`
Model string `json:"model"`
Choices []OpenAITextResponseChoice `json:"choices"`
Usage `json:"usage"`
}
type OpenAITextResponseChoice struct { type OpenAITextResponseChoice struct {
Index int `json:"index"` Index int `json:"index"`
Message `json:"message"` Message `json:"message"`
@@ -34,15 +54,31 @@ type OpenAIEmbeddingResponse struct {
} }
type ChatCompletionsStreamResponseChoice struct { type ChatCompletionsStreamResponseChoice struct {
Delta struct { Delta ChatCompletionsStreamResponseChoiceDelta `json:"delta"`
Content string `json:"content"`
Role string `json:"role,omitempty"`
ToolCalls any `json:"tool_calls,omitempty"`
} `json:"delta"`
FinishReason *string `json:"finish_reason,omitempty"` FinishReason *string `json:"finish_reason,omitempty"`
Index int `json:"index,omitempty"` Index int `json:"index,omitempty"`
} }
type ChatCompletionsStreamResponseChoiceDelta struct {
Content string `json:"content"`
Role string `json:"role,omitempty"`
ToolCalls []ToolCall `json:"tool_calls,omitempty"`
}
type ToolCall struct {
// Index is not nil only in chat completion chunk object
Index *int `json:"index,omitempty"`
ID string `json:"id"`
Type any `json:"type"`
Function FunctionCall `json:"function"`
}
type FunctionCall struct {
Name string `json:"name,omitempty"`
// call function with arguments in JSON format
Arguments string `json:"arguments,omitempty"`
}
type ChatCompletionsStreamResponse struct { type ChatCompletionsStreamResponse struct {
Id string `json:"id"` Id string `json:"id"`
Object string `json:"object"` Object string `json:"object"`
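
Typing the stream delta as ChatCompletionsStreamResponseChoiceDelta with a []ToolCall lets callers stitch function-call arguments back together across chunks, since each chunk carries only a fragment keyed by Index. A sketch of that accumulation using the struct shapes above (the chunk contents are invented):

package main

import "fmt"

type FunctionCall struct {
	Name      string `json:"name,omitempty"`
	Arguments string `json:"arguments,omitempty"`
}

type ToolCall struct {
	Index    *int         `json:"index,omitempty"`
	ID       string       `json:"id"`
	Type     any          `json:"type"`
	Function FunctionCall `json:"function"`
}

// mergeToolCalls appends the argument fragments of streamed deltas onto the
// call with the same index, the way a client has to reassemble them.
func mergeToolCalls(acc map[int]*ToolCall, deltas []ToolCall) {
	for _, d := range deltas {
		if d.Index == nil {
			continue
		}
		if existing, ok := acc[*d.Index]; ok {
			existing.Function.Arguments += d.Function.Arguments
		} else {
			c := d
			acc[*d.Index] = &c
		}
	}
}

func main() {
	idx := 0
	acc := map[int]*ToolCall{}
	mergeToolCalls(acc, []ToolCall{{Index: &idx, ID: "call_1", Type: "function",
		Function: FunctionCall{Name: "get_weather", Arguments: `{"city":`}}})
	mergeToolCalls(acc, []ToolCall{{Index: &idx,
		Function: FunctionCall{Arguments: `"Paris"}`}}})
	fmt.Println(acc[0].Function.Name, acc[0].Function.Arguments)
}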

go.mod

@@ -4,22 +4,28 @@ module one-api
go 1.18 go 1.18
require ( require (
github.com/chai2010/webp v1.1.1 github.com/Calcium-Ion/go-epay v0.0.2
github.com/anknown/ahocorasick v0.0.0-20190904063843-d75dbd5169c0
github.com/aws/aws-sdk-go-v2 v1.26.1
github.com/aws/aws-sdk-go-v2/credentials v1.17.11
github.com/aws/aws-sdk-go-v2/service/bedrockruntime v1.7.4
github.com/gin-contrib/cors v1.4.0 github.com/gin-contrib/cors v1.4.0
github.com/gin-contrib/gzip v0.0.6 github.com/gin-contrib/gzip v0.0.6
github.com/gin-contrib/sessions v0.0.5 github.com/gin-contrib/sessions v0.0.5
github.com/gin-contrib/static v0.0.1 github.com/gin-contrib/static v0.0.1
github.com/gin-gonic/gin v1.9.1 github.com/gin-gonic/gin v1.9.1
github.com/go-playground/validator/v10 v10.16.0 github.com/go-playground/validator/v10 v10.19.0
github.com/go-redis/redis/v8 v8.11.5 github.com/go-redis/redis/v8 v8.11.5
github.com/golang-jwt/jwt v3.2.2+incompatible github.com/golang-jwt/jwt v3.2.2+incompatible
github.com/google/uuid v1.3.0 github.com/google/uuid v1.3.0
github.com/gorilla/websocket v1.5.0 github.com/gorilla/websocket v1.5.0
github.com/jinzhu/copier v0.4.0
github.com/pkg/errors v0.9.1
github.com/pkoukk/tiktoken-go v0.1.6 github.com/pkoukk/tiktoken-go v0.1.6
github.com/samber/lo v1.38.1 github.com/samber/lo v1.39.0
github.com/shirou/gopsutil v3.21.11+incompatible github.com/shirou/gopsutil v3.21.11+incompatible
github.com/star-horizon/go-epay v0.0.0-20230204124159-fa2e2293fdc2 golang.org/x/crypto v0.21.0
golang.org/x/crypto v0.17.0 golang.org/x/image v0.15.0
gorm.io/driver/mysql v1.4.3 gorm.io/driver/mysql v1.4.3
gorm.io/driver/postgres v1.5.2 gorm.io/driver/postgres v1.5.2
gorm.io/driver/sqlite v1.4.3 gorm.io/driver/sqlite v1.4.3
@@ -27,12 +33,17 @@ require (
) )
require ( require (
github.com/anknown/darts v0.0.0-20151216065714-83ff685239e6 // indirect
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.6.2 // indirect
github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.5 // indirect
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.5 // indirect
github.com/aws/smithy-go v1.20.2 // indirect
github.com/bytedance/sonic v1.9.1 // indirect github.com/bytedance/sonic v1.9.1 // indirect
github.com/cespare/xxhash/v2 v2.1.2 // indirect github.com/cespare/xxhash/v2 v2.1.2 // indirect
github.com/chenzhuoyu/base64x v0.0.0-20221115062448-fe3a3abad311 // indirect github.com/chenzhuoyu/base64x v0.0.0-20221115062448-fe3a3abad311 // indirect
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f // indirect github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f // indirect
github.com/dlclark/regexp2 v1.10.0 // indirect github.com/dlclark/regexp2 v1.10.0 // indirect
github.com/gabriel-vasile/mimetype v1.4.2 // indirect github.com/gabriel-vasile/mimetype v1.4.3 // indirect
github.com/gin-contrib/sse v0.1.0 // indirect github.com/gin-contrib/sse v0.1.0 // indirect
github.com/go-ole/go-ole v1.2.6 // indirect github.com/go-ole/go-ole v1.2.6 // indirect
github.com/go-playground/locales v0.14.1 // indirect github.com/go-playground/locales v0.14.1 // indirect
@@ -50,7 +61,7 @@ require (
github.com/jinzhu/now v1.1.5 // indirect github.com/jinzhu/now v1.1.5 // indirect
github.com/json-iterator/go v1.1.12 // indirect github.com/json-iterator/go v1.1.12 // indirect
github.com/klauspost/cpuid/v2 v2.2.4 // indirect github.com/klauspost/cpuid/v2 v2.2.4 // indirect
github.com/leodido/go-urn v1.2.4 // indirect github.com/leodido/go-urn v1.4.0 // indirect
github.com/mattn/go-isatty v0.0.20 // indirect github.com/mattn/go-isatty v0.0.20 // indirect
github.com/mattn/go-sqlite3 v2.0.3+incompatible // indirect github.com/mattn/go-sqlite3 v2.0.3+incompatible // indirect
github.com/mitchellh/mapstructure v1.5.0 // indirect github.com/mitchellh/mapstructure v1.5.0 // indirect
@@ -63,10 +74,10 @@ require (
github.com/ugorji/go/codec v1.2.11 // indirect github.com/ugorji/go/codec v1.2.11 // indirect
github.com/yusufpapurcu/wmi v1.2.3 // indirect github.com/yusufpapurcu/wmi v1.2.3 // indirect
golang.org/x/arch v0.3.0 // indirect golang.org/x/arch v0.3.0 // indirect
golang.org/x/exp v0.0.0-20220303212507-bbda1eaf7a17 // indirect golang.org/x/exp v0.0.0-20240404231335-c0f41cb1a7a0 // indirect
golang.org/x/net v0.17.0 // indirect golang.org/x/net v0.21.0 // indirect
golang.org/x/sync v0.1.0 // indirect golang.org/x/sync v0.7.0 // indirect
golang.org/x/sys v0.15.0 // indirect golang.org/x/sys v0.18.0 // indirect
golang.org/x/text v0.14.0 // indirect golang.org/x/text v0.14.0 // indirect
google.golang.org/protobuf v1.30.0 // indirect google.golang.org/protobuf v1.30.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect gopkg.in/yaml.v3 v3.0.1 // indirect

go.sum

@@ -1,10 +1,28 @@
github.com/Calcium-Ion/go-epay v0.0.2 h1:3knFBuaBFpHzsGeGQU/QxUqZSHh5s0+jGo0P62pJzWc=
github.com/Calcium-Ion/go-epay v0.0.2/go.mod h1:cxo/ZOg8ClvE3VAnCmEzbuyAZINSq7kFEN9oHj5WQ2U=
github.com/anknown/ahocorasick v0.0.0-20190904063843-d75dbd5169c0 h1:onfun1RA+KcxaMk1lfrRnwCd1UUuOjJM/lri5eM1qMs=
github.com/anknown/ahocorasick v0.0.0-20190904063843-d75dbd5169c0/go.mod h1:4yg+jNTYlDEzBjhGS96v+zjyA3lfXlFd5CiTLIkPBLI=
github.com/anknown/darts v0.0.0-20151216065714-83ff685239e6 h1:HblK3eJHq54yET63qPCTJnks3loDse5xRmmqHgHzwoI=
github.com/anknown/darts v0.0.0-20151216065714-83ff685239e6/go.mod h1:pbiaLIeYLUbgMY1kwEAdwO6UKD5ZNwdPGQlwokS9fe8=
github.com/aws/aws-sdk-go-v2 v1.26.1 h1:5554eUqIYVWpU0YmeeYZ0wU64H2VLBs8TlhRB2L+EkA=
github.com/aws/aws-sdk-go-v2 v1.26.1/go.mod h1:ffIFB97e2yNsv4aTSGkqtHnppsIJzw7G7BReUZ3jCXM=
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.6.2 h1:x6xsQXGSmW6frevwDA+vi/wqhp1ct18mVXYN08/93to=
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.6.2/go.mod h1:lPprDr1e6cJdyYeGXnRaJoP4Md+cDBvi2eOj00BlGmg=
github.com/aws/aws-sdk-go-v2/credentials v1.17.11 h1:YuIB1dJNf1Re822rriUOTxopaHHvIq0l/pX3fwO+Tzs=
github.com/aws/aws-sdk-go-v2/credentials v1.17.11/go.mod h1:AQtFPsDH9bI2O+71anW6EKL+NcD7LG3dpKGMV4SShgo=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.5 h1:aw39xVGeRWlWx9EzGVnhOR4yOjQDHPQ6o6NmBlscyQg=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.3.5/go.mod h1:FSaRudD0dXiMPK2UjknVwwTYyZMRsHv3TtkabsZih5I=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.5 h1:PG1F3OD1szkuQPzDw3CIQsRIrtTlUC3lP84taWzHlq0=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.5/go.mod h1:jU1li6RFryMz+so64PpKtudI+QzbKoIEivqdf6LNpOc=
github.com/aws/aws-sdk-go-v2/service/bedrockruntime v1.7.4 h1:JgHnonzbnA3pbqj76wYsSZIZZQYBxkmMEjvL6GHy8XU=
github.com/aws/aws-sdk-go-v2/service/bedrockruntime v1.7.4/go.mod h1:nZspkhg+9p8iApLFoyAqfyuMP0F38acy2Hm3r5r95Cg=
github.com/aws/smithy-go v1.20.2 h1:tbp628ireGtzcHDDmLT/6ADHidqnwgF57XOXZe6tp4Q=
github.com/aws/smithy-go v1.20.2/go.mod h1:krry+ya/rV9RDcV/Q16kpu6ypI4K2czasz0NC3qS14E=
github.com/bytedance/sonic v1.5.0/go.mod h1:ED5hyg4y6t3/9Ku1R6dU/4KyJ48DZ4jPhfY1O2AihPM= github.com/bytedance/sonic v1.5.0/go.mod h1:ED5hyg4y6t3/9Ku1R6dU/4KyJ48DZ4jPhfY1O2AihPM=
github.com/bytedance/sonic v1.9.1 h1:6iJ6NqdoxCDr6mbY8h18oSO+cShGSMRGCEo7F2h0x8s= github.com/bytedance/sonic v1.9.1 h1:6iJ6NqdoxCDr6mbY8h18oSO+cShGSMRGCEo7F2h0x8s=
github.com/bytedance/sonic v1.9.1/go.mod h1:i736AoUSYt75HyZLoJW9ERYxcy6eaN6h4BZXU064P/U= github.com/bytedance/sonic v1.9.1/go.mod h1:i736AoUSYt75HyZLoJW9ERYxcy6eaN6h4BZXU064P/U=
github.com/cespare/xxhash/v2 v2.1.2 h1:YRXhKfTDauu4ajMg1TPgFO5jnlC2HCbmLXMcTG5cbYE= github.com/cespare/xxhash/v2 v2.1.2 h1:YRXhKfTDauu4ajMg1TPgFO5jnlC2HCbmLXMcTG5cbYE=
github.com/cespare/xxhash/v2 v2.1.2/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs= github.com/cespare/xxhash/v2 v2.1.2/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/chai2010/webp v1.1.1 h1:jTRmEccAJ4MGrhFOrPMpNGIJ/eybIgwKpcACsrTEapk=
github.com/chai2010/webp v1.1.1/go.mod h1:0XVwvZWdjjdxpUEIf7b9g9VkHFnInUSYujwqTLEuldU=
github.com/chenzhuoyu/base64x v0.0.0-20211019084208-fb5309c8db06/go.mod h1:DH46F32mSOjUmXrMHnKwZdA8wcEefY7UVqBKYGjpdQY= github.com/chenzhuoyu/base64x v0.0.0-20211019084208-fb5309c8db06/go.mod h1:DH46F32mSOjUmXrMHnKwZdA8wcEefY7UVqBKYGjpdQY=
github.com/chenzhuoyu/base64x v0.0.0-20221115062448-fe3a3abad311 h1:qSGYFH7+jGhDF8vLC+iwCD4WpbV1EBDSzWkJODFLams= github.com/chenzhuoyu/base64x v0.0.0-20221115062448-fe3a3abad311 h1:qSGYFH7+jGhDF8vLC+iwCD4WpbV1EBDSzWkJODFLams=
github.com/chenzhuoyu/base64x v0.0.0-20221115062448-fe3a3abad311/go.mod h1:b583jCggY9gE99b6G5LEC39OIiVsWj+R97kbl5odCEk= github.com/chenzhuoyu/base64x v0.0.0-20221115062448-fe3a3abad311/go.mod h1:b583jCggY9gE99b6G5LEC39OIiVsWj+R97kbl5odCEk=
@@ -17,8 +35,8 @@ github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f/go.mod h1:cu
github.com/dlclark/regexp2 v1.10.0 h1:+/GIL799phkJqYW+3YbOd8LCcbHzT0Pbo8zl70MHsq0= github.com/dlclark/regexp2 v1.10.0 h1:+/GIL799phkJqYW+3YbOd8LCcbHzT0Pbo8zl70MHsq0=
github.com/dlclark/regexp2 v1.10.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8= github.com/dlclark/regexp2 v1.10.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/fsnotify/fsnotify v1.4.9 h1:hsms1Qyu0jgnwNXIxa+/V/PDsU6CfLf6CNO8H7IWoS4= github.com/fsnotify/fsnotify v1.4.9 h1:hsms1Qyu0jgnwNXIxa+/V/PDsU6CfLf6CNO8H7IWoS4=
github.com/gabriel-vasile/mimetype v1.4.2 h1:w5qFW6JKBz9Y393Y4q372O9A7cUSequkh1Q7OhCmWKU= github.com/gabriel-vasile/mimetype v1.4.3 h1:in2uUcidCuFcDKtdcBxlR0rJ1+fsokWf+uqxgUFjbI0=
github.com/gabriel-vasile/mimetype v1.4.2/go.mod h1:zApsH/mKG4w07erKIaJPFiX0Tsq9BFQgN3qGY5GnNgA= github.com/gabriel-vasile/mimetype v1.4.3/go.mod h1:d8uq/6HKRL6CGdk+aubisF/M5GcPfT7nKyLpA0lbSSk=
github.com/gin-contrib/cors v1.4.0 h1:oJ6gwtUl3lqV0WEIwM/LxPF1QZ5qe2lGWdY2+bz7y0g= github.com/gin-contrib/cors v1.4.0 h1:oJ6gwtUl3lqV0WEIwM/LxPF1QZ5qe2lGWdY2+bz7y0g=
github.com/gin-contrib/cors v1.4.0/go.mod h1:bs9pNM0x/UsmHPBWT2xZz9ROh8xYjYkiURUfmBoMlcs= github.com/gin-contrib/cors v1.4.0/go.mod h1:bs9pNM0x/UsmHPBWT2xZz9ROh8xYjYkiURUfmBoMlcs=
github.com/gin-contrib/gzip v0.0.6 h1:NjcunTcGAj5CO1gn4N8jHOSIeRFHIbn51z6K+xaN4d4= github.com/gin-contrib/gzip v0.0.6 h1:NjcunTcGAj5CO1gn4N8jHOSIeRFHIbn51z6K+xaN4d4=
@@ -47,10 +65,8 @@ github.com/go-playground/universal-translator v0.18.1 h1:Bcnm0ZwsGyWbCzImXv+pAJn
github.com/go-playground/universal-translator v0.18.1/go.mod h1:xekY+UJKNuX9WP91TpwSH2VMlDf28Uj24BCp08ZFTUY= github.com/go-playground/universal-translator v0.18.1/go.mod h1:xekY+UJKNuX9WP91TpwSH2VMlDf28Uj24BCp08ZFTUY=
github.com/go-playground/validator/v10 v10.2.0/go.mod h1:uOYAAleCW8F/7oMFd6aG0GOhaH6EGOAJShg8Id5JGkI= github.com/go-playground/validator/v10 v10.2.0/go.mod h1:uOYAAleCW8F/7oMFd6aG0GOhaH6EGOAJShg8Id5JGkI=
github.com/go-playground/validator/v10 v10.10.0/go.mod h1:74x4gJWsvQexRdW8Pn3dXSGrTK4nAUsbPlLADvpJkos= github.com/go-playground/validator/v10 v10.10.0/go.mod h1:74x4gJWsvQexRdW8Pn3dXSGrTK4nAUsbPlLADvpJkos=
github.com/go-playground/validator/v10 v10.14.0 h1:vgvQWe3XCz3gIeFDm/HnTIbj6UGmg/+t63MyGU2n5js= github.com/go-playground/validator/v10 v10.19.0 h1:ol+5Fu+cSq9JD7SoSqe04GMI92cbn0+wvQ3bZ8b/AU4=
github.com/go-playground/validator/v10 v10.14.0/go.mod h1:9iXMNT7sEkjXb0I+enO7QXmzG6QCsPWY4zveKFVRSyU= github.com/go-playground/validator/v10 v10.19.0/go.mod h1:dbuPbCMFw/DrkbEynArYaCwl3amGuJotoKCe95atGMM=
github.com/go-playground/validator/v10 v10.16.0 h1:x+plE831WK4vaKHO/jpgUGsvLKIqRRkz6M78GuJAfGE=
github.com/go-playground/validator/v10 v10.16.0/go.mod h1:9iXMNT7sEkjXb0I+enO7QXmzG6QCsPWY4zveKFVRSyU=
github.com/go-redis/redis/v8 v8.11.5 h1:AcZZR7igkdvfVmQTPnu9WE37LRrO/YrBH5zWyjDC0oI= github.com/go-redis/redis/v8 v8.11.5 h1:AcZZR7igkdvfVmQTPnu9WE37LRrO/YrBH5zWyjDC0oI=
github.com/go-redis/redis/v8 v8.11.5/go.mod h1:gREzHqY1hg6oD9ngVRbLStwAWKhA0FEgq8Jd4h5lpwo= github.com/go-redis/redis/v8 v8.11.5/go.mod h1:gREzHqY1hg6oD9ngVRbLStwAWKhA0FEgq8Jd4h5lpwo=
github.com/go-sql-driver/mysql v1.6.0 h1:BCTh4TKNUYmOmMUcQ3IipzF5prigylS7XXjEkfCHuOE= github.com/go-sql-driver/mysql v1.6.0 h1:BCTh4TKNUYmOmMUcQ3IipzF5prigylS7XXjEkfCHuOE=
@@ -62,8 +78,8 @@ github.com/golang-jwt/jwt v3.2.2+incompatible h1:IfV12K8xAKAnZqdXVzCZ+TOjboZ2keL
github.com/golang-jwt/jwt v3.2.2+incompatible/go.mod h1:8pz2t5EyA70fFQQSrl6XZXzqecmYZeUEB8OUGHkxJ+I= github.com/golang-jwt/jwt v3.2.2+incompatible/go.mod h1:8pz2t5EyA70fFQQSrl6XZXzqecmYZeUEB8OUGHkxJ+I=
github.com/golang/protobuf v1.3.3/go.mod h1:vzj43D7+SQXF/4pzW/hwtAqwc6iTitCiVSaWz5lYuqw= github.com/golang/protobuf v1.3.3/go.mod h1:vzj43D7+SQXF/4pzW/hwtAqwc6iTitCiVSaWz5lYuqw=
github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk= github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk=
github.com/google/go-cmp v0.5.5 h1:Khx7svrCpmxxtHBq5j2mp/xVjsi8hQMfNLvJFAlrGgU=
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE= github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.8 h1:e6P7q2lk1O+qJJb4BtCQXlK8vWEO8V1ZeuEdJNOqZyg=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg= github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/google/uuid v1.3.0 h1:t6JiXgmwXMjEs8VusXIJk2BXHsn+wx8BZdTaoZ5fu7I= github.com/google/uuid v1.3.0 h1:t6JiXgmwXMjEs8VusXIJk2BXHsn+wx8BZdTaoZ5fu7I=
github.com/google/uuid v1.3.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo= github.com/google/uuid v1.3.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
@@ -79,12 +95,12 @@ github.com/jackc/pgpassfile v1.0.0 h1:/6Hmqy13Ss2zCq62VdNG8tM1wchn8zjSGOBJ6icpsI
github.com/jackc/pgpassfile v1.0.0/go.mod h1:CEx0iS5ambNFdcRtxPj5JhEz+xB6uRky5eyVu/W2HEg= github.com/jackc/pgpassfile v1.0.0/go.mod h1:CEx0iS5ambNFdcRtxPj5JhEz+xB6uRky5eyVu/W2HEg=
github.com/jackc/pgservicefile v0.0.0-20221227161230-091c0ba34f0a h1:bbPeKD0xmW/Y25WS6cokEszi5g+S0QxI/d45PkRi7Nk= github.com/jackc/pgservicefile v0.0.0-20221227161230-091c0ba34f0a h1:bbPeKD0xmW/Y25WS6cokEszi5g+S0QxI/d45PkRi7Nk=
github.com/jackc/pgservicefile v0.0.0-20221227161230-091c0ba34f0a/go.mod h1:5TJZWKEWniPve33vlWYSoGYefn3gLQRzjfDlhSJ9ZKM= github.com/jackc/pgservicefile v0.0.0-20221227161230-091c0ba34f0a/go.mod h1:5TJZWKEWniPve33vlWYSoGYefn3gLQRzjfDlhSJ9ZKM=
github.com/jackc/pgx/v5 v5.3.1 h1:Fcr8QJ1ZeLi5zsPZqQeUZhNhxfkkKBOgJuYkJHoBOtU=
github.com/jackc/pgx/v5 v5.3.1/go.mod h1:t3JDKnCBlYIc0ewLF0Q7B8MXmoIaBOZj/ic7iHozM/8=
github.com/jackc/pgx/v5 v5.5.1 h1:5I9etrGkLrN+2XPCsi6XLlV5DITbSL/xBZdmAxFcXPI= github.com/jackc/pgx/v5 v5.5.1 h1:5I9etrGkLrN+2XPCsi6XLlV5DITbSL/xBZdmAxFcXPI=
github.com/jackc/pgx/v5 v5.5.1/go.mod h1:Ig06C2Vu0t5qXC60W8sqIthScaEnFvojjj9dSljmHRA= github.com/jackc/pgx/v5 v5.5.1/go.mod h1:Ig06C2Vu0t5qXC60W8sqIthScaEnFvojjj9dSljmHRA=
github.com/jackc/puddle/v2 v2.2.1 h1:RhxXJtFG022u4ibrCSMSiu5aOq1i77R3OHKNJj77OAk= github.com/jackc/puddle/v2 v2.2.1 h1:RhxXJtFG022u4ibrCSMSiu5aOq1i77R3OHKNJj77OAk=
github.com/jackc/puddle/v2 v2.2.1/go.mod h1:vriiEXHvEE654aYKXXjOvZM39qJ0q+azkZFrfEOc3H4= github.com/jackc/puddle/v2 v2.2.1/go.mod h1:vriiEXHvEE654aYKXXjOvZM39qJ0q+azkZFrfEOc3H4=
github.com/jinzhu/copier v0.4.0 h1:w3ciUoD19shMCRargcpm0cm91ytaBhDvuRpz1ODO/U8=
github.com/jinzhu/copier v0.4.0/go.mod h1:DfbEm0FYsaqBcKcFuvmOZb218JkPGtvSHsKg8S8hyyg=
github.com/jinzhu/inflection v1.0.0 h1:K317FqzuhWc8YvSVlFMCCUb36O/S9MCKRDI7QkRKD/E= github.com/jinzhu/inflection v1.0.0 h1:K317FqzuhWc8YvSVlFMCCUb36O/S9MCKRDI7QkRKD/E=
github.com/jinzhu/inflection v1.0.0/go.mod h1:h+uFLlag+Qp1Va5pdKtLDYj+kHp5pxUVkryuEj+Srlc= github.com/jinzhu/inflection v1.0.0/go.mod h1:h+uFLlag+Qp1Va5pdKtLDYj+kHp5pxUVkryuEj+Srlc=
github.com/jinzhu/now v1.1.4/go.mod h1:d3SSVoowX0Lcu0IBviAWJpolVfI5UJVZZ7cO71lE/z8= github.com/jinzhu/now v1.1.4/go.mod h1:d3SSVoowX0Lcu0IBviAWJpolVfI5UJVZZ7cO71lE/z8=
@@ -106,12 +122,10 @@ github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE= github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/leodido/go-urn v1.2.0/go.mod h1:+8+nEpDfqqsY+g338gtMEUOtuK+4dEMhiQEgxpxOKII= github.com/leodido/go-urn v1.2.0/go.mod h1:+8+nEpDfqqsY+g338gtMEUOtuK+4dEMhiQEgxpxOKII=
github.com/leodido/go-urn v1.2.1/go.mod h1:zt4jvISO2HfUBqxjfIshjdMTYS56ZS/qv49ictyFfxY= github.com/leodido/go-urn v1.2.1/go.mod h1:zt4jvISO2HfUBqxjfIshjdMTYS56ZS/qv49ictyFfxY=
github.com/leodido/go-urn v1.2.4 h1:XlAE/cm/ms7TE/VMVoduSpNBoyc2dOxHs5MZSwAN63Q= github.com/leodido/go-urn v1.4.0 h1:WT9HwE9SGECu3lg4d/dIA+jxlljEa1/ffXKmRjqdmIQ=
github.com/leodido/go-urn v1.2.4/go.mod h1:7ZrI8mTSeBSHl/UaRyKQW1qZeMgak41ANeCNaVckg+4= github.com/leodido/go-urn v1.4.0/go.mod h1:bvxc+MVxLKB4z00jd1z+Dvzr47oO32F/QSNjSBOlFxI=
github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU= github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
github.com/mattn/go-isatty v0.0.14/go.mod h1:7GGIvUiUoEMVVmxf/4nioHXj79iQHKdU27kJ6hsGG94= github.com/mattn/go-isatty v0.0.14/go.mod h1:7GGIvUiUoEMVVmxf/4nioHXj79iQHKdU27kJ6hsGG94=
github.com/mattn/go-isatty v0.0.19 h1:JITubQf0MOLdlGRuRq+jtsDlekdYPia9ZFsB8h/APPA=
github.com/mattn/go-isatty v0.0.19/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY= github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y= github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/mattn/go-sqlite3 v1.14.15/go.mod h1:2eHXhiwb8IkHr+BDWZGa96P6+rkvnG63S2DGjv9HUNg= github.com/mattn/go-sqlite3 v1.14.15/go.mod h1:2eHXhiwb8IkHr+BDWZGa96P6+rkvnG63S2DGjv9HUNg=
@@ -132,6 +146,8 @@ github.com/pelletier/go-toml/v2 v2.0.1/go.mod h1:r9LEWfGN8R5k0VXJ+0BkIe7MYkRdwZO
github.com/pelletier/go-toml/v2 v2.0.8 h1:0ctb6s9mE31h0/lhu+J6OPmVeDxJn+kYnJc2jZR9tGQ= github.com/pelletier/go-toml/v2 v2.0.8 h1:0ctb6s9mE31h0/lhu+J6OPmVeDxJn+kYnJc2jZR9tGQ=
github.com/pelletier/go-toml/v2 v2.0.8/go.mod h1:vuYfssBdrU2XDZ9bYydBu6t+6a6PYNcZljzZR9VXg+4= github.com/pelletier/go-toml/v2 v2.0.8/go.mod h1:vuYfssBdrU2XDZ9bYydBu6t+6a6PYNcZljzZR9VXg+4=
github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e/go.mod h1:pJLUxLENpZxwdsKMEsNbx1VGcRFpLqf3715MtcvvzbA= github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e/go.mod h1:pJLUxLENpZxwdsKMEsNbx1VGcRFpLqf3715MtcvvzbA=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkoukk/tiktoken-go v0.1.6 h1:JF0TlJzhTbrI30wCvFuiw6FzP2+/bR+FIxUdgEAcUsw= github.com/pkoukk/tiktoken-go v0.1.6 h1:JF0TlJzhTbrI30wCvFuiw6FzP2+/bR+FIxUdgEAcUsw=
github.com/pkoukk/tiktoken-go v0.1.6/go.mod h1:9NiV+i9mJKGj1rYOT+njbv+ZwA/zJxYdewGl6qVatpg= github.com/pkoukk/tiktoken-go v0.1.6/go.mod h1:9NiV+i9mJKGj1rYOT+njbv+ZwA/zJxYdewGl6qVatpg=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM= github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
@@ -139,12 +155,10 @@ github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZN
github.com/rogpeppe/go-internal v1.6.1/go.mod h1:xXDCJY+GAPziupqXw64V24skbSoqbTEfhy4qGm1nDQc= github.com/rogpeppe/go-internal v1.6.1/go.mod h1:xXDCJY+GAPziupqXw64V24skbSoqbTEfhy4qGm1nDQc=
github.com/rogpeppe/go-internal v1.8.0 h1:FCbCCtXNOY3UtUuHUYaghJg4y7Fd14rXifAYUAtL9R8= github.com/rogpeppe/go-internal v1.8.0 h1:FCbCCtXNOY3UtUuHUYaghJg4y7Fd14rXifAYUAtL9R8=
github.com/rogpeppe/go-internal v1.8.0/go.mod h1:WmiCO8CzOY8rg0OYDC4/i/2WRWAB6poM+XZ2dLUbcbE= github.com/rogpeppe/go-internal v1.8.0/go.mod h1:WmiCO8CzOY8rg0OYDC4/i/2WRWAB6poM+XZ2dLUbcbE=
github.com/samber/lo v1.38.1 h1:j2XEAqXKb09Am4ebOg31SpvzUTTs6EN3VfgeLUhPdXM= github.com/samber/lo v1.39.0 h1:4gTz1wUhNYLhFSKl6O+8peW0v2F4BCY034GRpU9WnuA=
github.com/samber/lo v1.38.1/go.mod h1:+m/ZKRl6ClXCE2Lgf3MsQlWfh4bn1bz6CXEOxnEXnEA= github.com/samber/lo v1.39.0/go.mod h1:+m/ZKRl6ClXCE2Lgf3MsQlWfh4bn1bz6CXEOxnEXnEA=
github.com/shirou/gopsutil v3.21.11+incompatible h1:+1+c1VGhc88SSonWP6foOcLhvnKlUeu/erjjvaPEYiI= github.com/shirou/gopsutil v3.21.11+incompatible h1:+1+c1VGhc88SSonWP6foOcLhvnKlUeu/erjjvaPEYiI=
github.com/shirou/gopsutil v3.21.11+incompatible/go.mod h1:5b4v6he4MtMOwMlS0TUMTu2PcXUg8+E1lC7eC3UO/RA= github.com/shirou/gopsutil v3.21.11+incompatible/go.mod h1:5b4v6he4MtMOwMlS0TUMTu2PcXUg8+E1lC7eC3UO/RA=
github.com/star-horizon/go-epay v0.0.0-20230204124159-fa2e2293fdc2 h1:avbt5a8F/zbYwFzTugrqWOBJe/K1cJj6+xpr+x1oVAI=
github.com/star-horizon/go-epay v0.0.0-20230204124159-fa2e2293fdc2/go.mod h1:SiffGCWGGMVwujne2dUQbJ5zUVD1V1Yj0hDuTfqFNEo=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME= github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw= github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo= github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
@@ -155,9 +169,8 @@ github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/
github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg= github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU= github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4= github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
github.com/stretchr/testify v1.8.2/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
github.com/stretchr/testify v1.8.3 h1:RP3t2pwF7cMEbC1dqtB6poj3niw/9gnV4Cjg5oW5gtY=
github.com/stretchr/testify v1.8.3/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo= github.com/stretchr/testify v1.8.3/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk=
github.com/tklauser/go-sysconf v0.3.12 h1:0QaGUFOdQaIVdPgfITYzaTegZvdCjmYO52cSFAEVmqU= github.com/tklauser/go-sysconf v0.3.12 h1:0QaGUFOdQaIVdPgfITYzaTegZvdCjmYO52cSFAEVmqU=
github.com/tklauser/go-sysconf v0.3.12/go.mod h1:Ho14jnntGE1fpdOqQEEaiKRpvIavV0hSfmBq8nJbHYI= github.com/tklauser/go-sysconf v0.3.12/go.mod h1:Ho14jnntGE1fpdOqQEEaiKRpvIavV0hSfmBq8nJbHYI=
github.com/tklauser/numcpus v0.6.1 h1:ng9scYS7az0Bk4OZLvrNXNSAO2Pxr1XXRAPyjhIx+Fk= github.com/tklauser/numcpus v0.6.1 h1:ng9scYS7az0Bk4OZLvrNXNSAO2Pxr1XXRAPyjhIx+Fk=
@@ -176,17 +189,17 @@ golang.org/x/arch v0.0.0-20210923205945-b76863e36670/go.mod h1:5om86z9Hs0C8fWVUu
golang.org/x/arch v0.3.0 h1:02VY4/ZcO/gBOH6PUaoiptASxtXU10jazRCP865E97k= golang.org/x/arch v0.3.0 h1:02VY4/ZcO/gBOH6PUaoiptASxtXU10jazRCP865E97k=
golang.org/x/arch v0.3.0/go.mod h1:5om86z9Hs0C8fWVUuoMHwpExlXzs5Tkyp9hOrfG7pp8= golang.org/x/arch v0.3.0/go.mod h1:5om86z9Hs0C8fWVUuoMHwpExlXzs5Tkyp9hOrfG7pp8=
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc= golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.14.0 h1:wBqGXzWJW6m1XrIKlAH0Hs1JJ7+9KBwnIO8v66Q9cHc= golang.org/x/crypto v0.21.0 h1:X31++rzVUdKhX5sWmSOFZxx8UW/ldWx55cbf08iNAMA=
golang.org/x/crypto v0.14.0/go.mod h1:MVFd36DqK4CsrnJYDkBA3VC4m2GkXAM0PvzMCn4JQf4= golang.org/x/crypto v0.21.0/go.mod h1:0BP7YvVV9gBbVKyeTG0Gyn+gZm94bibOW5BjDEYAOMs=
golang.org/x/crypto v0.17.0 h1:r8bRNjWL3GshPW3gkd+RpvzWrZAwPS49OmTGZ/uhM4k= golang.org/x/exp v0.0.0-20240404231335-c0f41cb1a7a0 h1:985EYyeCOxTpcgOTJpflJUwOeEz0CQOdPt73OzpE9F8=
golang.org/x/crypto v0.17.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4= golang.org/x/exp v0.0.0-20240404231335-c0f41cb1a7a0/go.mod h1:/lliqkxwWAhPjf5oSOIJup2XcqJaw8RGS6k3TGEc7GI=
golang.org/x/exp v0.0.0-20220303212507-bbda1eaf7a17 h1:3MTrJm4PyNL9NBqvYDSj3DHl46qQakyfqfWo4jgfaEM= golang.org/x/image v0.15.0 h1:kOELfmgrmJlw4Cdb7g/QGuB3CvDrXbqEIww/pNtNBm8=
golang.org/x/exp v0.0.0-20220303212507-bbda1eaf7a17/go.mod h1:lgLbSvA5ygNOMpwM/9anMpWVlVJ7Z+cHWq/eFuinpGE= golang.org/x/image v0.15.0/go.mod h1:HUYqC05R2ZcZ3ejNQsIHQDQiwWM4JBqmm6MKANTp4LE=
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg= golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
golang.org/x/net v0.17.0 h1:pVaXccu2ozPjCXewfr1S7xza/zcXTity9cCdXQYSjIM= golang.org/x/net v0.21.0 h1:AQyQV4dYCvJ7vGmJyKki9+PBdyvhkSd8EIx/qb0AYv4=
golang.org/x/net v0.17.0/go.mod h1:NxSsAGuq816PNPmqtQdLE42eU2Fs7NoRIZrHJAlaCOE= golang.org/x/net v0.21.0/go.mod h1:bIjVDfnllIU7BJ2DNgfnXvpSvtn8VRwhlsaeUTyUS44=
golang.org/x/sync v0.1.0 h1:wsuoTGHzEhffawBOhz5CYhcrV4IdKZbEyZjBMuTp12o= golang.org/x/sync v0.7.0 h1:YsImfSBoP9QPYL0xyKJPq0gcaJdG3rInoqxTWbfQu9M=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM= golang.org/x/sync v0.7.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sys v0.0.0-20190916202348-b4ddaad3f8a3/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs= golang.org/x/sys v0.0.0-20190916202348-b4ddaad3f8a3/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs= golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs= golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@@ -197,21 +210,16 @@ golang.org/x/sys v0.0.0-20220704084225-05e143d24a9e/go.mod h1:oPkhp1MJrh7nUepCBc
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.11.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= golang.org/x/sys v0.11.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.13.0 h1:Af8nKPmuFypiUBjVoU9V20FiaFXOcuZI21p0ycVYYGE= golang.org/x/sys v0.18.0 h1:DBdB3niSjOA/O0blCZBqDefyWNYveAYMNF1Wum0DYQ4=
golang.org/x/sys v0.13.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= golang.org/x/sys v0.18.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.15.0 h1:h48lPFYpsTvQJZF4EKyI4aLHaev3CxivZmv7yZig9pc=
golang.org/x/sys v0.15.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo= golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk= golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ= golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ= golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.13.0 h1:ablQoSUd0tRdKxZewP80B+BaqeKJuVhuRxj/dkrun3k=
golang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
golang.org/x/text v0.14.0 h1:ScX5w1eTa3QqT8oi6+ziP7dTV1S2+ALU0bI+0zXKWiQ= golang.org/x/text v0.14.0 h1:ScX5w1eTa3QqT8oi6+ziP7dTV1S2+ALU0bI+0zXKWiQ=
golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU= golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ= golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0= golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1 h1:go1bK/D/BFZV2I8cIQd1NKEZ+0owSTG1fDTci4IqFcE=
google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw= google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=
google.golang.org/protobuf v1.28.0/go.mod h1:HV8QOd/L58Z+nl8r43ehVNZIU/HEI6OcFqwMG9pJV4I= google.golang.org/protobuf v1.28.0/go.mod h1:HV8QOd/L58Z+nl8r43ehVNZIU/HEI6OcFqwMG9pJV4I=
google.golang.org/protobuf v1.30.0 h1:kPPoIgf3TsEvrm0PFe15JQ+570QVxYzEvvHqChK+cng= google.golang.org/protobuf v1.30.0 h1:kPPoIgf3TsEvrm0PFe15JQ+570QVxYzEvvHqChK+cng=


@@ -20,10 +20,10 @@ import (
_ "net/http/pprof" _ "net/http/pprof"
) )
//go:embed web/build //go:embed web/dist
var buildFS embed.FS var buildFS embed.FS
//go:embed web/build/index.html //go:embed web/dist/index.html
var indexPage []byte var indexPage []byte
func main() { func main() {


@@ -7,7 +7,7 @@ all: build-frontend start-backend
build-frontend: build-frontend:
@echo "Building frontend..." @echo "Building frontend..."
@cd $(FRONTEND_DIR) && npm install && DISABLE_ESLINT_PLUGIN='true' REACT_APP_VERSION=$(cat VERSION) npm run build npm run build @cd $(FRONTEND_DIR) && npm install && DISABLE_ESLINT_PLUGIN='true' VITE_REACT_APP_VERSION=$(cat VERSION) npm run build
start-backend: start-backend:
@echo "Starting backend dev server..." @echo "Starting backend dev server..."


@@ -127,7 +127,7 @@ func TokenAuth() func(c *gin.Context) {
} }
if len(parts) > 1 { if len(parts) > 1 {
if model.IsAdmin(token.UserId) { if model.IsAdmin(token.UserId) {
c.Set("channelId", parts[1]) c.Set("specific_channel_id", parts[1])
} else { } else {
abortWithOpenAiMessage(c, http.StatusForbidden, "普通用户不支持指定渠道") abortWithOpenAiMessage(c, http.StatusForbidden, "普通用户不支持指定渠道")
return return


@@ -23,7 +23,10 @@ func Distribute() func(c *gin.Context) {
return func(c *gin.Context) { return func(c *gin.Context) {
userId := c.GetInt("id") userId := c.GetInt("id")
var channel *model.Channel var channel *model.Channel
channelId, ok := c.Get("channelId") channelId, ok := c.Get("specific_channel_id")
modelRequest, shouldSelectChannel, err := getModelRequest(c)
userGroup, _ := model.CacheGetUserGroup(userId)
c.Set("group", userGroup)
if ok { if ok {
id, err := strconv.Atoi(channelId.(string)) id, err := strconv.Atoi(channelId.(string))
if err != nil { if err != nil {
@@ -40,11 +43,58 @@ func Distribute() func(c *gin.Context) {
return return
} }
} else { } else {
shouldSelectChannel := true
// Select a channel for the user // Select a channel for the user
// check token model mapping
modelLimitEnable := c.GetBool("token_model_limit_enabled")
if modelLimitEnable {
s, ok := c.Get("token_model_limit")
var tokenModelLimit map[string]bool
if ok {
tokenModelLimit = s.(map[string]bool)
} else {
tokenModelLimit = map[string]bool{}
}
if tokenModelLimit != nil {
if _, ok := tokenModelLimit[modelRequest.Model]; !ok {
abortWithOpenAiMessage(c, http.StatusForbidden, "该令牌无权访问模型 "+modelRequest.Model)
return
}
} else {
// token model limit is empty, all models are not allowed
abortWithOpenAiMessage(c, http.StatusForbidden, "该令牌无权访问任何模型")
return
}
}
if shouldSelectChannel {
channel, err = model.CacheGetRandomSatisfiedChannel(userGroup, modelRequest.Model, 0)
if err != nil {
message := fmt.Sprintf("当前分组 %s 下对于模型 %s 无可用渠道", userGroup, modelRequest.Model)
// 如果错误,但是渠道不为空,说明是数据库一致性问题
if channel != nil {
common.SysError(fmt.Sprintf("渠道不存在:%d", channel.Id))
message = "数据库一致性已被破坏,请联系管理员"
}
// 如果错误,而且渠道为空,说明是没有可用渠道
abortWithOpenAiMessage(c, http.StatusServiceUnavailable, message)
return
}
if channel == nil {
abortWithOpenAiMessage(c, http.StatusServiceUnavailable, fmt.Sprintf("当前分组 %s 下对于模型 %s 无可用渠道(数据库一致性已被破坏)", userGroup, modelRequest.Model))
return
}
}
}
SetupContextForSelectedChannel(c, channel, modelRequest.Model)
c.Next()
}
}
func getModelRequest(c *gin.Context) (*ModelRequest, bool, error) {
var modelRequest ModelRequest var modelRequest ModelRequest
shouldSelectChannel := true
var err error var err error
if strings.HasPrefix(c.Request.URL.Path, "/mj") { if strings.Contains(c.Request.URL.Path, "/mj/") {
relayMode := relayconstant.Path2RelayModeMidjourney(c.Request.URL.Path) relayMode := relayconstant.Path2RelayModeMidjourney(c.Request.URL.Path)
if relayMode == relayconstant.RelayModeMidjourneyTaskFetch || if relayMode == relayconstant.RelayModeMidjourneyTaskFetch ||
relayMode == relayconstant.RelayModeMidjourneyTaskFetchByCondition || relayMode == relayconstant.RelayModeMidjourneyTaskFetchByCondition ||
@@ -56,17 +106,17 @@ func Distribute() func(c *gin.Context) {
err = common.UnmarshalBodyReusable(c, &midjourneyRequest) err = common.UnmarshalBodyReusable(c, &midjourneyRequest)
if err != nil { if err != nil {
abortWithMidjourneyMessage(c, http.StatusBadRequest, constant.MjErrorUnknown, "无效的请求, "+err.Error()) abortWithMidjourneyMessage(c, http.StatusBadRequest, constant.MjErrorUnknown, "无效的请求, "+err.Error())
return return nil, false, err
} }
midjourneyModel, mjErr, success := service.GetMjRequestModel(relayMode, &midjourneyRequest) midjourneyModel, mjErr, success := service.GetMjRequestModel(relayMode, &midjourneyRequest)
if mjErr != nil { if mjErr != nil {
abortWithMidjourneyMessage(c, http.StatusBadRequest, mjErr.Code, mjErr.Description) abortWithMidjourneyMessage(c, http.StatusBadRequest, mjErr.Code, mjErr.Description)
return return nil, false, fmt.Errorf(mjErr.Description)
} }
if midjourneyModel == "" { if midjourneyModel == "" {
if !success { if !success {
abortWithMidjourneyMessage(c, http.StatusBadRequest, constant.MjErrorUnknown, "无效的请求, 无法解析模型") abortWithMidjourneyMessage(c, http.StatusBadRequest, constant.MjErrorUnknown, "无效的请求, 无法解析模型")
return return nil, false, fmt.Errorf("无效的请求, 无法解析模型")
} else { } else {
// task fetch, task fetch by condition, notify // task fetch, task fetch by condition, notify
shouldSelectChannel = false shouldSelectChannel = false
@@ -80,7 +130,7 @@ func Distribute() func(c *gin.Context) {
} }
if err != nil { if err != nil {
abortWithOpenAiMessage(c, http.StatusBadRequest, "无效的请求, "+err.Error()) abortWithOpenAiMessage(c, http.StatusBadRequest, "无效的请求, "+err.Error())
return return nil, false, err
} }
if strings.HasPrefix(c.Request.URL.Path, "/v1/moderations") { if strings.HasPrefix(c.Request.URL.Path, "/v1/moderations") {
if modelRequest.Model == "" { if modelRequest.Model == "" {
@@ -106,45 +156,12 @@ func Distribute() func(c *gin.Context) {
} }
} }
} }
// check token model mapping return &modelRequest, shouldSelectChannel, nil
modelLimitEnable := c.GetBool("token_model_limit_enabled")
if modelLimitEnable {
s, ok := c.Get("token_model_limit")
var tokenModelLimit map[string]bool
if ok {
tokenModelLimit = s.(map[string]bool)
} else {
tokenModelLimit = map[string]bool{}
}
if tokenModelLimit != nil {
if _, ok := tokenModelLimit[modelRequest.Model]; !ok {
abortWithOpenAiMessage(c, http.StatusForbidden, "该令牌无权访问模型 "+modelRequest.Model)
return
}
} else {
// token model limit is empty, all models are not allowed
abortWithOpenAiMessage(c, http.StatusForbidden, "该令牌无权访问任何模型")
return
}
} }
userGroup, _ := model.CacheGetUserGroup(userId) func SetupContextForSelectedChannel(c *gin.Context, channel *model.Channel, modelName string) {
c.Set("group", userGroup) c.Set("original_model", modelName) // for retry
if shouldSelectChannel {
channel, err = model.CacheGetRandomSatisfiedChannel(userGroup, modelRequest.Model)
if err != nil {
message := fmt.Sprintf("当前分组 %s 下对于模型 %s 无可用渠道", userGroup, modelRequest.Model)
// 如果错误,但是渠道不为空,说明是数据库一致性问题
if channel != nil {
common.SysError(fmt.Sprintf("渠道不存在:%d", channel.Id))
message = "数据库一致性已被破坏,请联系管理员"
}
// 如果错误,而且渠道为空,说明是没有可用渠道
abortWithOpenAiMessage(c, http.StatusServiceUnavailable, message)
return
}
if channel == nil { if channel == nil {
abortWithOpenAiMessage(c, http.StatusServiceUnavailable, fmt.Sprintf("当前分组 %s 下对于模型 %s 无可用渠道(数据库一致性已被破坏)", userGroup, modelRequest.Model))
return return
} }
c.Set("channel", channel.Type) c.Set("channel", channel.Type)
@@ -155,8 +172,12 @@ func Distribute() func(c *gin.Context) {
if channel.AutoBan != nil && *channel.AutoBan == 0 { if channel.AutoBan != nil && *channel.AutoBan == 0 {
ban = false ban = false
} }
if nil != channel.OpenAIOrganization && "" != *channel.OpenAIOrganization {
c.Set("channel_organization", *channel.OpenAIOrganization)
}
c.Set("auto_ban", ban) c.Set("auto_ban", ban)
c.Set("model_mapping", channel.GetModelMapping()) c.Set("model_mapping", channel.GetModelMapping())
c.Set("status_code_mapping", channel.GetStatusCodeMapping())
c.Request.Header.Set("Authorization", fmt.Sprintf("Bearer %s", channel.Key)) c.Request.Header.Set("Authorization", fmt.Sprintf("Bearer %s", channel.Key))
c.Set("base_url", channel.GetBaseURL()) c.Set("base_url", channel.GetBaseURL())
// TODO: api_version统一 // TODO: api_version统一
@@ -173,7 +194,3 @@ func Distribute() func(c *gin.Context) {
c.Set("plugin", channel.Other) c.Set("plugin", channel.Other)
} }
} }
}
c.Next()
}
}
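
The refactored Distribute above extracts request parsing into getModelRequest and gates every request on the token's model allow-list before a channel is chosen. A condensed sketch of that gate as a pure function, using the same map[string]bool shape the middleware reads from the context:

package main

import (
	"errors"
	"fmt"
)

// checkTokenModelLimit reproduces the gate in the middleware above: when the
// limit is enabled, an empty allow-list rejects every model, and a non-empty
// list only admits models that appear in it.
func checkTokenModelLimit(limitEnabled bool, allowed map[string]bool, model string) error {
	if !limitEnabled {
		return nil
	}
	if len(allowed) == 0 {
		return errors.New("token is not allowed to access any model")
	}
	if _, ok := allowed[model]; !ok {
		return fmt.Errorf("token is not allowed to access model %s", model)
	}
	return nil
}

func main() {
	allowed := map[string]bool{"gpt-4-turbo": true}
	fmt.Println(checkTokenModelLimit(true, allowed, "gpt-4-turbo"))   // <nil>
	fmt.Println(checkTokenModelLimit(true, allowed, "gpt-3.5-turbo")) // rejected
}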


@@ -3,6 +3,8 @@ package model
import ( import (
"errors" "errors"
"fmt" "fmt"
"github.com/samber/lo"
"gorm.io/gorm"
"one-api/common" "one-api/common"
"strings" "strings"
) )
@@ -27,8 +29,7 @@ func GetGroupModels(group string) []string {
return models return models
} }
func GetRandomSatisfiedChannel(group string, model string) (*Channel, error) { func getPriority(group string, model string, retry int) (int, error) {
var abilities []Ability
groupCol := "`group`" groupCol := "`group`"
trueVal := "1" trueVal := "1"
if common.UsingPostgreSQL { if common.UsingPostgreSQL {
@@ -36,9 +37,55 @@ func GetRandomSatisfiedChannel(group string, model string) (*Channel, error) {
trueVal = "true" trueVal = "true"
} }
var err error = nil var priorities []int
err := DB.Model(&Ability{}).
Select("DISTINCT(priority)").
Where(groupCol+" = ? and model = ? and enabled = "+trueVal, group, model).
Order("priority DESC"). // 按优先级降序排序
Pluck("priority", &priorities).Error // Pluck用于将查询的结果直接扫描到一个切片中
if err != nil {
// 处理错误
return 0, err
}
// 确定要使用的优先级
var priorityToUse int
if retry >= len(priorities) {
// 如果重试次数大于优先级数,则使用最小的优先级
priorityToUse = priorities[len(priorities)-1]
} else {
priorityToUse = priorities[retry]
}
return priorityToUse, nil
}
func getChannelQuery(group string, model string, retry int) *gorm.DB {
groupCol := "`group`"
trueVal := "1"
if common.UsingPostgreSQL {
groupCol = `"group"`
trueVal = "true"
}
maxPrioritySubQuery := DB.Model(&Ability{}).Select("MAX(priority)").Where(groupCol+" = ? and model = ? and enabled = "+trueVal, group, model) maxPrioritySubQuery := DB.Model(&Ability{}).Select("MAX(priority)").Where(groupCol+" = ? and model = ? and enabled = "+trueVal, group, model)
channelQuery := DB.Where(groupCol+" = ? and model = ? and enabled = "+trueVal+" and priority = (?)", group, model, maxPrioritySubQuery) channelQuery := DB.Where(groupCol+" = ? and model = ? and enabled = "+trueVal+" and priority = (?)", group, model, maxPrioritySubQuery)
if retry != 0 {
priority, err := getPriority(group, model, retry)
if err != nil {
common.SysError(fmt.Sprintf("Get priority failed: %s", err.Error()))
} else {
channelQuery = DB.Where(groupCol+" = ? and model = ? and enabled = "+trueVal+" and priority = ?", group, model, priority)
}
}
return channelQuery
}
func GetRandomSatisfiedChannel(group string, model string, retry int) (*Channel, error) {
var abilities []Ability
var err error = nil
channelQuery := getChannelQuery(group, model, retry)
if common.UsingSQLite || common.UsingPostgreSQL { if common.UsingSQLite || common.UsingPostgreSQL {
err = channelQuery.Order("weight DESC").Find(&abilities).Error err = channelQuery.Order("weight DESC").Find(&abilities).Error
} else { } else {
@@ -52,23 +99,18 @@ func GetRandomSatisfiedChannel(group string, model string) (*Channel, error) {
// Randomly choose one // Randomly choose one
weightSum := uint(0) weightSum := uint(0)
for _, ability_ := range abilities { for _, ability_ := range abilities {
weightSum += ability_.Weight weightSum += ability_.Weight + 10
} }
if weightSum == 0 {
// All weight is 0, randomly choose one
channel.Id = abilities[common.GetRandomInt(len(abilities))].ChannelId
} else {
// Randomly choose one // Randomly choose one
weight := common.GetRandomInt(int(weightSum)) weight := common.GetRandomInt(int(weightSum))
for _, ability_ := range abilities { for _, ability_ := range abilities {
weight -= int(ability_.Weight) weight -= int(ability_.Weight) + 10
//log.Printf("weight: %d, ability weight: %d", weight, *ability_.Weight) //log.Printf("weight: %d, ability weight: %d", weight, *ability_.Weight)
if weight <= 0 { if weight <= 0 {
channel.Id = ability_.ChannelId channel.Id = ability_.ChannelId
break break
} }
} }
}
} else { } else {
return nil, errors.New("channel not found") return nil, errors.New("channel not found")
} }
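
Both selection paths now add 10 to every weight, which gives zero-weight abilities a non-zero chance and removes the old special case for an all-zero weight sum. A minimal sketch of that smoothed weighted pick over plain values (the channel IDs and weights are invented):

package main

import (
	"fmt"
	"math/rand"
)

type ability struct {
	ChannelId int
	Weight    uint
}

const smoothing = 10 // matches the "+ 10" added to every weight above

// pickChannel does a weighted random choice; because every entry gets the
// smoothing constant added, a slice of all-zero weights still works.
func pickChannel(abilities []ability) int {
	total := 0
	for _, a := range abilities {
		total += int(a.Weight) + smoothing
	}
	n := rand.Intn(total)
	for _, a := range abilities {
		n -= int(a.Weight) + smoothing
		if n < 0 {
			return a.ChannelId
		}
	}
	return abilities[len(abilities)-1].ChannelId // unreachable when total > 0
}

func main() {
	abilities := []ability{{ChannelId: 1, Weight: 0}, {ChannelId: 2, Weight: 30}}
	counts := map[int]int{}
	for i := 0; i < 1000; i++ {
		counts[pickChannel(abilities)]++
	}
	fmt.Println(counts) // channel 2 wins roughly 4 times out of 5
}
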
@@ -93,7 +135,16 @@ func (channel *Channel) AddAbilities() error {
abilities = append(abilities, ability) abilities = append(abilities, ability)
} }
} }
return DB.Create(&abilities).Error if len(abilities) == 0 {
return nil
}
for _, chunk := range lo.Chunk(abilities, 50) {
err := DB.Create(&chunk).Error
if err != nil {
return err
}
}
return nil
} }
func (channel *Channel) DeleteAbilities() error { func (channel *Channel) DeleteAbilities() error {
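
Creating abilities in chunks of 50 keeps each generated INSERT's parameter count below database placeholder limits when a channel carries many models and groups. A sketch of the same batching with lo.Chunk, with a stub saveBatch in place of the GORM DB.Create call:

package main

import (
	"fmt"

	"github.com/samber/lo"
)

type Ability struct {
	Group   string
	Model   string
	Channel int
}

// saveBatch stands in for DB.Create(&chunk); the real call is a GORM insert.
func saveBatch(chunk []Ability) error {
	fmt.Printf("inserting %d rows\n", len(chunk))
	return nil
}

func insertAbilities(abilities []Ability) error {
	if len(abilities) == 0 {
		return nil
	}
	for _, chunk := range lo.Chunk(abilities, 50) {
		if err := saveBatch(chunk); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	rows := make([]Ability, 120)
	_ = insertAbilities(rows) // inserting 50, 50, 20
}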


@@ -25,9 +25,6 @@ var token2UserId = make(map[string]int)
var token2UserIdLock sync.RWMutex var token2UserIdLock sync.RWMutex
func cacheSetToken(token *Token) error { func cacheSetToken(token *Token) error {
if !common.RedisEnabled {
return token.SelectUpdate()
}
jsonBytes, err := json.Marshal(token) jsonBytes, err := json.Marshal(token)
if err != nil { if err != nil {
return err return err
@@ -168,7 +165,11 @@ func CacheUpdateUserQuota(id int) error {
if err != nil { if err != nil {
return err return err
} }
err = common.RedisSet(fmt.Sprintf("user_quota:%d", id), fmt.Sprintf("%d", quota), time.Duration(UserId2QuotaCacheSeconds)*time.Second) return cacheSetUserQuota(id, quota)
}
func cacheSetUserQuota(id int, quota int) error {
err := common.RedisSet(fmt.Sprintf("user_quota:%d", id), fmt.Sprintf("%d", quota), time.Duration(UserId2QuotaCacheSeconds)*time.Second)
return err return err
} }
@@ -265,14 +266,14 @@ func SyncChannelCache(frequency int) {
} }
} }
func CacheGetRandomSatisfiedChannel(group string, model string) (*Channel, error) { func CacheGetRandomSatisfiedChannel(group string, model string, retry int) (*Channel, error) {
if strings.HasPrefix(model, "gpt-4-gizmo") { if strings.HasPrefix(model, "gpt-4-gizmo") {
model = "gpt-4-gizmo-*" model = "gpt-4-gizmo-*"
} }
// if memory cache is disabled, get channel directly from database // if memory cache is disabled, get channel directly from database
if !common.MemoryCacheEnabled { if !common.MemoryCacheEnabled {
return GetRandomSatisfiedChannel(group, model) return GetRandomSatisfiedChannel(group, model, retry)
} }
channelSyncLock.RLock() channelSyncLock.RLock()
defer channelSyncLock.RUnlock() defer channelSyncLock.RUnlock()
@@ -280,15 +281,27 @@ func CacheGetRandomSatisfiedChannel(group string, model string) (*Channel, error
if len(channels) == 0 { if len(channels) == 0 {
return nil, errors.New("channel not found") return nil, errors.New("channel not found")
} }
endIdx := len(channels)
// choose by priority uniquePriorities := make(map[int]bool)
firstChannel := channels[0] for _, channel := range channels {
if firstChannel.GetPriority() > 0 { uniquePriorities[int(channel.GetPriority())] = true
for i := range channels {
if channels[i].GetPriority() != firstChannel.GetPriority() {
endIdx = i
break
} }
var sortedUniquePriorities []int
for priority := range uniquePriorities {
sortedUniquePriorities = append(sortedUniquePriorities, priority)
}
sort.Sort(sort.Reverse(sort.IntSlice(sortedUniquePriorities)))
if retry >= len(uniquePriorities) {
retry = len(uniquePriorities) - 1
}
targetPriority := int64(sortedUniquePriorities[retry])
// get the priority for the given retry number
var targetChannels []*Channel
for _, channel := range channels {
if channel.GetPriority() == targetPriority {
targetChannels = append(targetChannels, channel)
} }
} }
@@ -296,20 +309,14 @@ func CacheGetRandomSatisfiedChannel(group string, model string) (*Channel, error
smoothingFactor := 10 smoothingFactor := 10
// Calculate the total weight of all channels up to endIdx // Calculate the total weight of all channels up to endIdx
totalWeight := 0 totalWeight := 0
for _, channel := range channels[:endIdx] { for _, channel := range targetChannels {
totalWeight += channel.GetWeight() + smoothingFactor totalWeight += channel.GetWeight() + smoothingFactor
} }
//if totalWeight == 0 {
// // If all weights are 0, select a channel randomly
// return channels[rand.Intn(endIdx)], nil
//}
// Generate a random value in the range [0, totalWeight) // Generate a random value in the range [0, totalWeight)
randomWeight := rand.Intn(totalWeight) randomWeight := rand.Intn(totalWeight)
// Find a channel based on its weight // Find a channel based on its weight
for _, channel := range channels[:endIdx] { for _, channel := range targetChannels {
randomWeight -= channel.GetWeight() + smoothingFactor randomWeight -= channel.GetWeight() + smoothingFactor
if randomWeight < 0 { if randomWeight < 0 {
return channel, nil return channel, nil
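
In the cached path the retry count now indexes the distinct priorities in descending order: the first attempt uses the highest-priority channels and later retries fall through to lower tiers, clamped at the lowest one. A standalone sketch of that tier selection over plain priority values:

package main

import (
	"fmt"
	"sort"
)

// priorityForRetry returns the priority tier to use for the given retry
// number: retry 0 picks the highest priority, and retries past the last
// tier are clamped to the lowest one, as in the cache lookup above.
func priorityForRetry(priorities []int, retry int) int {
	unique := map[int]bool{}
	for _, p := range priorities {
		unique[p] = true
	}
	sorted := make([]int, 0, len(unique))
	for p := range unique {
		sorted = append(sorted, p)
	}
	sort.Sort(sort.Reverse(sort.IntSlice(sorted)))
	if retry >= len(sorted) {
		retry = len(sorted) - 1
	}
	return sorted[retry]
}

func main() {
	prio := []int{0, 0, 10, 10, 5}
	fmt.Println(priorityForRetry(prio, 0)) // 10
	fmt.Println(priorityForRetry(prio, 1)) // 5
	fmt.Println(priorityForRetry(prio, 9)) // 0
}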


@@ -10,6 +10,7 @@ type Channel struct {
Type int `json:"type" gorm:"default:0"` Type int `json:"type" gorm:"default:0"`
Key string `json:"key" gorm:"not null"` Key string `json:"key" gorm:"not null"`
OpenAIOrganization *string `json:"openai_organization"` OpenAIOrganization *string `json:"openai_organization"`
TestModel *string `json:"test_model"`
Status int `json:"status" gorm:"default:1"` Status int `json:"status" gorm:"default:1"`
Name string `json:"name" gorm:"index"` Name string `json:"name" gorm:"index"`
Weight *uint `json:"weight" gorm:"default:0"` Weight *uint `json:"weight" gorm:"default:0"`
@@ -24,6 +25,8 @@ type Channel struct {
Group string `json:"group" gorm:"type:varchar(64);default:'default'"` Group string `json:"group" gorm:"type:varchar(64);default:'default'"`
UsedQuota int64 `json:"used_quota" gorm:"bigint;default:0"` UsedQuota int64 `json:"used_quota" gorm:"bigint;default:0"`
ModelMapping *string `json:"model_mapping" gorm:"type:varchar(1024);default:''"` ModelMapping *string `json:"model_mapping" gorm:"type:varchar(1024);default:''"`
//MaxInputTokens *int `json:"max_input_tokens" gorm:"default:0"`
StatusCodeMapping *string `json:"status_code_mapping" gorm:"type:varchar(1024);default:''"`
Priority *int64 `json:"priority" gorm:"bigint;default:0"` Priority *int64 `json:"priority" gorm:"bigint;default:0"`
AutoBan *int `json:"auto_ban" gorm:"default:1"` AutoBan *int `json:"auto_ban" gorm:"default:1"`
} }
@@ -152,6 +155,13 @@ func (channel *Channel) GetModelMapping() string {
return *channel.ModelMapping return *channel.ModelMapping
} }
func (channel *Channel) GetStatusCodeMapping() string {
if channel.StatusCodeMapping == nil {
return ""
}
return *channel.StatusCodeMapping
}
func (channel *Channel) Insert() error { func (channel *Channel) Insert() error {
var err error var err error
err = DB.Create(channel).Error err = DB.Create(channel).Error
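
GetStatusCodeMapping exposes the new per-channel status_code_mapping column, presumably a JSON object mapping upstream status codes to the codes returned to clients. A hedged sketch of applying such a mapping; the string-keyed JSON format here is an assumption, not taken from this diff:

package main

import (
	"encoding/json"
	"fmt"
	"strconv"
)

// rewriteStatusCode applies a channel's status-code mapping, e.g. the JSON
// string {"429":"503"}. Unknown or unparseable entries leave the code as-is.
func rewriteStatusCode(mapping string, code int) int {
	if mapping == "" {
		return code
	}
	m := map[string]string{}
	if err := json.Unmarshal([]byte(mapping), &m); err != nil {
		return code
	}
	if target, ok := m[strconv.Itoa(code)]; ok {
		if n, err := strconv.Atoi(target); err == nil {
			return n
		}
	}
	return code
}

func main() {
	fmt.Println(rewriteStatusCode(`{"429":"503"}`, 429)) // 503
	fmt.Println(rewriteStatusCode(`{"429":"503"}`, 401)) // 401
}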


@@ -95,6 +95,9 @@ func InitDB() (err error) {
} }
if common.UsingMySQL { if common.UsingMySQL {
_, _ = sqlDB.Exec("DROP INDEX idx_channels_key ON channels;") // TODO: delete this line when most users have upgraded _, _ = sqlDB.Exec("DROP INDEX idx_channels_key ON channels;") // TODO: delete this line when most users have upgraded
_, _ = sqlDB.Exec("ALTER TABLE midjourneys MODIFY action VARCHAR(40);") // TODO: delete this line when most users have upgraded
_, _ = sqlDB.Exec("ALTER TABLE midjourneys MODIFY progress VARCHAR(30);") // TODO: delete this line when most users have upgraded
_, _ = sqlDB.Exec("ALTER TABLE midjourneys MODIFY status VARCHAR(20);") // TODO: delete this line when most users have upgraded
} }
common.SysLog("database migration started") common.SysLog("database migration started")
err = db.AutoMigrate(&Channel{}) err = db.AutoMigrate(&Channel{})


@@ -4,18 +4,18 @@ type Midjourney struct {
Id int `json:"id"` Id int `json:"id"`
Code int `json:"code"` Code int `json:"code"`
UserId int `json:"user_id" gorm:"index"` UserId int `json:"user_id" gorm:"index"`
Action string `json:"action"` Action string `json:"action" gorm:"type:varchar(40);index"`
MjId string `json:"mj_id" gorm:"index"` MjId string `json:"mj_id" gorm:"index"`
Prompt string `json:"prompt"` Prompt string `json:"prompt"`
PromptEn string `json:"prompt_en"` PromptEn string `json:"prompt_en"`
Description string `json:"description"` Description string `json:"description"`
State string `json:"state"` State string `json:"state"`
SubmitTime int64 `json:"submit_time"` SubmitTime int64 `json:"submit_time" gorm:"index"`
StartTime int64 `json:"start_time"` StartTime int64 `json:"start_time" gorm:"index"`
FinishTime int64 `json:"finish_time"` FinishTime int64 `json:"finish_time" gorm:"index"`
ImageUrl string `json:"image_url"` ImageUrl string `json:"image_url"`
Status string `json:"status"` Status string `json:"status" gorm:"type:varchar(20);index"`
Progress string `json:"progress"` Progress string `json:"progress" gorm:"type:varchar(30);index"`
FailReason string `json:"fail_reason"` FailReason string `json:"fail_reason"`
ChannelId int `json:"channel_id"` ChannelId int `json:"channel_id"`
Quota int `json:"quota"` Quota int `json:"quota"`


@@ -44,12 +44,14 @@ func InitOptionMap() {
common.OptionMap["DataExportEnabled"] = strconv.FormatBool(common.DataExportEnabled) common.OptionMap["DataExportEnabled"] = strconv.FormatBool(common.DataExportEnabled)
common.OptionMap["ChannelDisableThreshold"] = strconv.FormatFloat(common.ChannelDisableThreshold, 'f', -1, 64) common.OptionMap["ChannelDisableThreshold"] = strconv.FormatFloat(common.ChannelDisableThreshold, 'f', -1, 64)
common.OptionMap["EmailDomainRestrictionEnabled"] = strconv.FormatBool(common.EmailDomainRestrictionEnabled) common.OptionMap["EmailDomainRestrictionEnabled"] = strconv.FormatBool(common.EmailDomainRestrictionEnabled)
common.OptionMap["EmailAliasRestrictionEnabled"] = strconv.FormatBool(common.EmailAliasRestrictionEnabled)
common.OptionMap["EmailDomainWhitelist"] = strings.Join(common.EmailDomainWhitelist, ",") common.OptionMap["EmailDomainWhitelist"] = strings.Join(common.EmailDomainWhitelist, ",")
common.OptionMap["SMTPServer"] = "" common.OptionMap["SMTPServer"] = ""
common.OptionMap["SMTPFrom"] = "" common.OptionMap["SMTPFrom"] = ""
common.OptionMap["SMTPPort"] = strconv.Itoa(common.SMTPPort) common.OptionMap["SMTPPort"] = strconv.Itoa(common.SMTPPort)
common.OptionMap["SMTPAccount"] = "" common.OptionMap["SMTPAccount"] = ""
common.OptionMap["SMTPToken"] = "" common.OptionMap["SMTPToken"] = ""
common.OptionMap["SMTPSSLEnabled"] = strconv.FormatBool(common.SMTPSSLEnabled)
common.OptionMap["Notice"] = "" common.OptionMap["Notice"] = ""
common.OptionMap["About"] = "" common.OptionMap["About"] = ""
common.OptionMap["HomePageContent"] = "" common.OptionMap["HomePageContent"] = ""
@@ -61,8 +63,8 @@ func InitOptionMap() {
common.OptionMap["CustomCallbackAddress"] = "" common.OptionMap["CustomCallbackAddress"] = ""
common.OptionMap["EpayId"] = "" common.OptionMap["EpayId"] = ""
common.OptionMap["EpayKey"] = "" common.OptionMap["EpayKey"] = ""
common.OptionMap["Price"] = strconv.FormatFloat(common.Price, 'f', -1, 64) common.OptionMap["Price"] = strconv.FormatFloat(constant.Price, 'f', -1, 64)
common.OptionMap["MinTopUp"] = strconv.Itoa(common.MinTopUp) common.OptionMap["MinTopUp"] = strconv.Itoa(constant.MinTopUp)
common.OptionMap["TopupGroupRatio"] = common.TopupGroupRatio2JSONString() common.OptionMap["TopupGroupRatio"] = common.TopupGroupRatio2JSONString()
common.OptionMap["GitHubClientId"] = "" common.OptionMap["GitHubClientId"] = ""
common.OptionMap["GitHubClientSecret"] = "" common.OptionMap["GitHubClientSecret"] = ""
@@ -90,6 +92,14 @@ func InitOptionMap() {
common.OptionMap["DataExportDefaultTime"] = common.DataExportDefaultTime common.OptionMap["DataExportDefaultTime"] = common.DataExportDefaultTime
common.OptionMap["DefaultCollapseSidebar"] = strconv.FormatBool(common.DefaultCollapseSidebar) common.OptionMap["DefaultCollapseSidebar"] = strconv.FormatBool(common.DefaultCollapseSidebar)
common.OptionMap["MjNotifyEnabled"] = strconv.FormatBool(constant.MjNotifyEnabled) common.OptionMap["MjNotifyEnabled"] = strconv.FormatBool(constant.MjNotifyEnabled)
common.OptionMap["MjModeClearEnabled"] = strconv.FormatBool(constant.MjModeClearEnabled)
common.OptionMap["MjForwardUrlEnabled"] = strconv.FormatBool(constant.MjForwardUrlEnabled)
common.OptionMap["CheckSensitiveEnabled"] = strconv.FormatBool(constant.CheckSensitiveEnabled)
common.OptionMap["CheckSensitiveOnPromptEnabled"] = strconv.FormatBool(constant.CheckSensitiveOnPromptEnabled)
//common.OptionMap["CheckSensitiveOnCompletionEnabled"] = strconv.FormatBool(constant.CheckSensitiveOnCompletionEnabled)
common.OptionMap["StopOnSensitiveEnabled"] = strconv.FormatBool(constant.StopOnSensitiveEnabled)
common.OptionMap["SensitiveWords"] = constant.SensitiveWordsToString()
common.OptionMap["StreamCacheQueueLength"] = strconv.Itoa(constant.StreamCacheQueueLength)
common.OptionMapRWMutex.Unlock() common.OptionMapRWMutex.Unlock()
loadOptionsFromDatabase() loadOptionsFromDatabase()
@@ -167,6 +177,8 @@ func updateOptionMap(key string, value string) (err error) {
common.RegisterEnabled = boolValue common.RegisterEnabled = boolValue
case "EmailDomainRestrictionEnabled": case "EmailDomainRestrictionEnabled":
common.EmailDomainRestrictionEnabled = boolValue common.EmailDomainRestrictionEnabled = boolValue
case "EmailAliasRestrictionEnabled":
common.EmailAliasRestrictionEnabled = boolValue
case "AutomaticDisableChannelEnabled": case "AutomaticDisableChannelEnabled":
common.AutomaticDisableChannelEnabled = boolValue common.AutomaticDisableChannelEnabled = boolValue
case "AutomaticEnableChannelEnabled": case "AutomaticEnableChannelEnabled":
@@ -185,6 +197,20 @@ func updateOptionMap(key string, value string) (err error) {
common.DefaultCollapseSidebar = boolValue common.DefaultCollapseSidebar = boolValue
case "MjNotifyEnabled": case "MjNotifyEnabled":
constant.MjNotifyEnabled = boolValue constant.MjNotifyEnabled = boolValue
case "MjModeClearEnabled":
constant.MjModeClearEnabled = boolValue
case "MjForwardUrlEnabled":
constant.MjForwardUrlEnabled = boolValue
case "CheckSensitiveEnabled":
constant.CheckSensitiveEnabled = boolValue
case "CheckSensitiveOnPromptEnabled":
constant.CheckSensitiveOnPromptEnabled = boolValue
//case "CheckSensitiveOnCompletionEnabled":
// constant.CheckSensitiveOnCompletionEnabled = boolValue
case "StopOnSensitiveEnabled":
constant.StopOnSensitiveEnabled = boolValue
case "SMTPSSLEnabled":
common.SMTPSSLEnabled = boolValue
} }
} }
switch key { switch key {
@@ -204,17 +230,17 @@ func updateOptionMap(key string, value string) (err error) {
case "ServerAddress": case "ServerAddress":
common.ServerAddress = value common.ServerAddress = value
case "PayAddress": case "PayAddress":
common.PayAddress = value constant.PayAddress = value
case "CustomCallbackAddress": case "CustomCallbackAddress":
common.CustomCallbackAddress = value constant.CustomCallbackAddress = value
case "EpayId": case "EpayId":
common.EpayId = value constant.EpayId = value
case "EpayKey": case "EpayKey":
common.EpayKey = value constant.EpayKey = value
case "Price": case "Price":
common.Price, _ = strconv.ParseFloat(value, 64) constant.Price, _ = strconv.ParseFloat(value, 64)
case "MinTopUp": case "MinTopUp":
common.MinTopUp, _ = strconv.Atoi(value) constant.MinTopUp, _ = strconv.Atoi(value)
case "TopupGroupRatio": case "TopupGroupRatio":
err = common.UpdateTopupGroupRatioByJSONString(value) err = common.UpdateTopupGroupRatioByJSONString(value)
case "GitHubClientId": case "GitHubClientId":
@@ -273,6 +299,10 @@ func updateOptionMap(key string, value string) (err error) {
common.ChannelDisableThreshold, _ = strconv.ParseFloat(value, 64) common.ChannelDisableThreshold, _ = strconv.ParseFloat(value, 64)
case "QuotaPerUnit": case "QuotaPerUnit":
common.QuotaPerUnit, _ = strconv.ParseFloat(value, 64) common.QuotaPerUnit, _ = strconv.ParseFloat(value, 64)
case "SensitiveWords":
constant.SensitiveWordsFromString(value)
case "StreamCacheQueueLength":
constant.StreamCacheQueueLength, _ = strconv.Atoi(value)
} }
return err return err
} }
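For orientation, every option above follows the same pattern: it is serialized into common.OptionMap as a string at startup and written back through updateOptionMap, which parses the string and assigns the typed global. A minimal standalone sketch of that round trip for one boolean flag (the ParseBool step is assumed, since that parsing sits above the hunks shown here):

package main

import (
	"fmt"
	"strconv"
)

// Stand-ins for the typed globals kept in common/constant.
var checkSensitiveEnabled = true
var optionMap = map[string]string{}

func initOption() {
	// Serialize the typed value into the string map, as InitOptionMap does.
	optionMap["CheckSensitiveEnabled"] = strconv.FormatBool(checkSensitiveEnabled)
}

func updateOption(key, value string) {
	// Assumed: the incoming string is parsed before the key switch.
	boolValue, err := strconv.ParseBool(value)
	if err != nil {
		return
	}
	switch key {
	case "CheckSensitiveEnabled":
		checkSensitiveEnabled = boolValue
	}
}

func main() {
	initOption()
	updateOption("CheckSensitiveEnabled", "false")
	fmt.Println(optionMap["CheckSensitiveEnabled"], checkSensitiveEnabled) // true false
}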

View File

@@ -56,7 +56,7 @@ func Redeem(key string, userId int) (quota int, err error) {
if common.UsingPostgreSQL {
keyCol = `"key"`
}
common.RandomSleep()
err = DB.Transaction(func(tx *gorm.DB) error {
err := tx.Set("gorm:query_option", "FOR UPDATE").Where(keyCol+" = ?", key).First(redemption).Error
if err != nil {

View File

@@ -102,6 +102,11 @@ func GetTokenById(id int) (*Token, error) {
token := Token{Id: id}
var err error = nil
err = DB.First(&token, "id = ?", id).Error
if err != nil {
if common.RedisEnabled {
go cacheSetToken(&token)
}
}
return &token, err
}

View File

@@ -4,6 +4,7 @@ import (
"errors" "errors"
"fmt" "fmt"
"one-api/common" "one-api/common"
"strconv"
"strings" "strings"
"time" "time"
@@ -72,8 +73,26 @@ func GetAllUsers(startIdx int, num int) (users []*User, err error) {
return users, err
}
- func SearchUsers(keyword string) (users []*User, err error) {
+ func SearchUsers(keyword string) ([]*User, error) {
- err = DB.Omit("password").Where("id = ? or username LIKE ? or email LIKE ? or display_name LIKE ?", keyword, keyword+"%", keyword+"%", keyword+"%").Find(&users).Error
+ var users []*User
var err error
// Try to parse the keyword as an integer ID
keywordInt, err := strconv.Atoi(keyword)
if err == nil {
// If the conversion succeeds, look the user up by ID
err = DB.Unscoped().Omit("password").Where("id = ?", keywordInt).Find(&users).Error
if err != nil || len(users) > 0 {
// If a user was found by ID, or an error occurred, return the result or the error
return users, err
}
}
// If the keyword is not a valid ID, or no user was found by ID, fall back to a fuzzy search on the other fields
err = DB.Unscoped().Omit("password").
Where("username LIKE ? OR email LIKE ? OR display_name LIKE ?", keyword+"%", keyword+"%", keyword+"%").
Find(&users).Error
return users, err
}
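A stripped-down sketch of the new lookup order (numeric keywords hit the id path first, everything else falls through to the prefix match); the GORM queries are replaced with stand-ins:

package main

import (
	"fmt"
	"strconv"
)

// Stand-ins for the two GORM queries used by SearchUsers above.
func findByID(id int) []string             { return nil }
func findByPrefix(keyword string) []string { return []string{keyword + "_user"} }

func searchUsers(keyword string) []string {
	if id, err := strconv.Atoi(keyword); err == nil {
		if users := findByID(id); len(users) > 0 {
			return users
		}
	}
	return findByPrefix(keyword)
}

func main() {
	fmt.Println(searchUsers("42"))    // id lookup first, then the fallback
	fmt.Println(searchUsers("alice")) // goes straight to the prefix search
}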
@@ -210,6 +229,27 @@ func (user *User) Update(updatePassword bool) error {
if err == nil {
if common.RedisEnabled {
_ = common.RedisSet(fmt.Sprintf("user_group:%d", user.Id), user.Group, time.Duration(UserId2GroupCacheSeconds)*time.Second)
_ = common.RedisSet(fmt.Sprintf("user_quota:%d", user.Id), strconv.Itoa(user.Quota), time.Duration(UserId2QuotaCacheSeconds)*time.Second)
}
}
return err
}
func (user *User) UpdateAll(updatePassword bool) error {
var err error
if updatePassword {
user.Password, err = common.Password2Hash(user.Password)
if err != nil {
return err
}
}
newUser := *user
DB.First(&user, user.Id)
err = DB.Model(user).Select("*").Updates(newUser).Error
if err == nil {
if common.RedisEnabled {
_ = common.RedisSet(fmt.Sprintf("user_group:%d", user.Id), user.Group, time.Duration(UserId2GroupCacheSeconds)*time.Second)
_ = common.RedisSet(fmt.Sprintf("user_quota:%d", user.Id), strconv.Itoa(user.Quota), time.Duration(UserId2QuotaCacheSeconds)*time.Second)
}
}
return err
@@ -370,6 +410,11 @@ func ValidateAccessToken(token string) (user *User) {
func GetUserQuota(id int) (quota int, err error) {
err = DB.Model(&User{}).Where("id = ?", id).Select("quota").Find(&quota).Error
if err != nil {
if common.RedisEnabled {
go cacheSetUserQuota(id, quota)
}
}
return quota, err
}

View File

@@ -0,0 +1,79 @@
package aws
import (
"errors"
"github.com/gin-gonic/gin"
"io"
"net/http"
"one-api/dto"
"one-api/relay/channel/claude"
relaycommon "one-api/relay/common"
"strings"
)
const (
RequestModeCompletion = 1
RequestModeMessage = 2
)
type Adaptor struct {
RequestMode int
}
func (a *Adaptor) Init(info *relaycommon.RelayInfo, request dto.GeneralOpenAIRequest) {
if strings.HasPrefix(info.UpstreamModelName, "claude-3") {
a.RequestMode = RequestModeMessage
} else {
a.RequestMode = RequestModeCompletion
}
}
func (a *Adaptor) GetRequestURL(info *relaycommon.RelayInfo) (string, error) {
return "", nil
}
func (a *Adaptor) SetupRequestHeader(c *gin.Context, req *http.Request, info *relaycommon.RelayInfo) error {
return nil
}
func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *dto.GeneralOpenAIRequest) (any, error) {
if request == nil {
return nil, errors.New("request is nil")
}
var claudeReq *claude.ClaudeRequest
var err error
if a.RequestMode == RequestModeCompletion {
claudeReq = claude.RequestOpenAI2ClaudeComplete(*request)
} else {
claudeReq, err = claude.RequestOpenAI2ClaudeMessage(*request)
}
c.Set("request_model", request.Model)
c.Set("converted_request", claudeReq)
return claudeReq, err
}
func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, requestBody io.Reader) (*http.Response, error) {
return nil, nil
}
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
if info.IsStream {
err, usage = awsStreamHandler(c, info, a.RequestMode)
} else {
err, usage = awsHandler(c, info, a.RequestMode)
}
return
}
func (a *Adaptor) GetModelList() (models []string) {
for n := range awsModelIDMap {
models = append(models, n)
}
return
}
func (a *Adaptor) GetChannelName() string {
return ChannelName
}

View File

@@ -0,0 +1,12 @@
package aws
var awsModelIDMap = map[string]string{
"claude-instant-1.2": "anthropic.claude-instant-v1",
"claude-2.0": "anthropic.claude-v2",
"claude-2.1": "anthropic.claude-v2:1",
"claude-3-sonnet-20240229": "anthropic.claude-3-sonnet-20240229-v1:0",
"claude-3-opus-20240229": "anthropic.claude-3-opus-20240229-v1:0",
"claude-3-haiku-20240307": "anthropic.claude-3-haiku-20240307-v1:0",
}
var ChannelName = "aws"

relay/channel/aws/dto.go Normal file
View File

@@ -0,0 +1,14 @@
package aws
import "one-api/relay/channel/claude"
type AwsClaudeRequest struct {
// AnthropicVersion should be "bedrock-2023-05-31"
AnthropicVersion string `json:"anthropic_version"`
Messages []claude.ClaudeMessage `json:"messages"`
MaxTokens int `json:"max_tokens,omitempty"`
Temperature float64 `json:"temperature,omitempty"`
TopP float64 `json:"top_p,omitempty"`
TopK int `json:"top_k,omitempty"`
StopSequences []string `json:"stop_sequences,omitempty"`
}
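For reference, a struct shaped like AwsClaudeRequest marshals into the Bedrock-style body shown below; the field values here are illustrative only and the message type is simplified to plain strings instead of claude.ClaudeMessage.

package main

import (
	"encoding/json"
	"fmt"
)

// Simplified stand-ins for claude.ClaudeMessage and AwsClaudeRequest above.
type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type awsClaudeRequest struct {
	AnthropicVersion string    `json:"anthropic_version"`
	Messages         []message `json:"messages"`
	MaxTokens        int       `json:"max_tokens,omitempty"`
}

func main() {
	body, _ := json.Marshal(awsClaudeRequest{
		AnthropicVersion: "bedrock-2023-05-31",
		Messages:         []message{{Role: "user", Content: "Hello"}},
		MaxTokens:        4096,
	})
	fmt.Println(string(body))
	// {"anthropic_version":"bedrock-2023-05-31","messages":[{"role":"user","content":"Hello"}],"max_tokens":4096}
}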

View File

@@ -0,0 +1,211 @@
package aws
import (
"bytes"
"encoding/json"
"fmt"
"github.com/gin-gonic/gin"
"github.com/jinzhu/copier"
"github.com/pkg/errors"
"io"
"net/http"
"one-api/common"
relaymodel "one-api/dto"
"one-api/relay/channel/claude"
relaycommon "one-api/relay/common"
"strings"
"github.com/aws/aws-sdk-go-v2/aws"
"github.com/aws/aws-sdk-go-v2/credentials"
"github.com/aws/aws-sdk-go-v2/service/bedrockruntime"
"github.com/aws/aws-sdk-go-v2/service/bedrockruntime/types"
)
func newAwsClient(c *gin.Context, info *relaycommon.RelayInfo) (*bedrockruntime.Client, error) {
awsSecret := strings.Split(info.ApiKey, "|")
if len(awsSecret) != 3 {
return nil, errors.New("invalid aws secret key")
}
ak := awsSecret[0]
sk := awsSecret[1]
region := awsSecret[2]
client := bedrockruntime.New(bedrockruntime.Options{
Region: region,
Credentials: aws.NewCredentialsCache(credentials.NewStaticCredentialsProvider(ak, sk, "")),
})
return client, nil
}
func wrapErr(err error) *relaymodel.OpenAIErrorWithStatusCode {
return &relaymodel.OpenAIErrorWithStatusCode{
StatusCode: http.StatusInternalServerError,
Error: relaymodel.OpenAIError{
Message: fmt.Sprintf("%s", err.Error()),
},
}
}
func awsModelID(requestModel string) (string, error) {
if awsModelID, ok := awsModelIDMap[requestModel]; ok {
return awsModelID, nil
}
return "", errors.Errorf("model %s not found", requestModel)
}
func awsHandler(c *gin.Context, info *relaycommon.RelayInfo, requestMode int) (*relaymodel.OpenAIErrorWithStatusCode, *relaymodel.Usage) {
awsCli, err := newAwsClient(c, info)
if err != nil {
return wrapErr(errors.Wrap(err, "newAwsClient")), nil
}
awsModelId, err := awsModelID(c.GetString("request_model"))
if err != nil {
return wrapErr(errors.Wrap(err, "awsModelID")), nil
}
awsReq := &bedrockruntime.InvokeModelInput{
ModelId: aws.String(awsModelId),
Accept: aws.String("application/json"),
ContentType: aws.String("application/json"),
}
claudeReq_, ok := c.Get("converted_request")
if !ok {
return wrapErr(errors.New("request not found")), nil
}
claudeReq := claudeReq_.(*claude.ClaudeRequest)
awsClaudeReq := &AwsClaudeRequest{
AnthropicVersion: "bedrock-2023-05-31",
}
if err = copier.Copy(awsClaudeReq, claudeReq); err != nil {
return wrapErr(errors.Wrap(err, "copy request")), nil
}
awsReq.Body, err = json.Marshal(awsClaudeReq)
if err != nil {
return wrapErr(errors.Wrap(err, "marshal request")), nil
}
awsResp, err := awsCli.InvokeModel(c.Request.Context(), awsReq)
if err != nil {
return wrapErr(errors.Wrap(err, "InvokeModel")), nil
}
claudeResponse := new(claude.ClaudeResponse)
err = json.Unmarshal(awsResp.Body, claudeResponse)
if err != nil {
return wrapErr(errors.Wrap(err, "unmarshal response")), nil
}
openaiResp := claude.ResponseClaude2OpenAI(requestMode, claudeResponse)
usage := relaymodel.Usage{
PromptTokens: claudeResponse.Usage.InputTokens,
CompletionTokens: claudeResponse.Usage.OutputTokens,
TotalTokens: claudeResponse.Usage.InputTokens + claudeResponse.Usage.OutputTokens,
}
openaiResp.Usage = usage
c.JSON(http.StatusOK, openaiResp)
return nil, &usage
}
func awsStreamHandler(c *gin.Context, info *relaycommon.RelayInfo, requestMode int) (*relaymodel.OpenAIErrorWithStatusCode, *relaymodel.Usage) {
awsCli, err := newAwsClient(c, info)
if err != nil {
return wrapErr(errors.Wrap(err, "newAwsClient")), nil
}
awsModelId, err := awsModelID(c.GetString("request_model"))
if err != nil {
return wrapErr(errors.Wrap(err, "awsModelID")), nil
}
awsReq := &bedrockruntime.InvokeModelWithResponseStreamInput{
ModelId: aws.String(awsModelId),
Accept: aws.String("application/json"),
ContentType: aws.String("application/json"),
}
claudeReq_, ok := c.Get("converted_request")
if !ok {
return wrapErr(errors.New("request not found")), nil
}
claudeReq := claudeReq_.(*claude.ClaudeRequest)
awsClaudeReq := &AwsClaudeRequest{
AnthropicVersion: "bedrock-2023-05-31",
}
if err = copier.Copy(awsClaudeReq, claudeReq); err != nil {
return wrapErr(errors.Wrap(err, "copy request")), nil
}
awsReq.Body, err = json.Marshal(awsClaudeReq)
if err != nil {
return wrapErr(errors.Wrap(err, "marshal request")), nil
}
awsResp, err := awsCli.InvokeModelWithResponseStream(c.Request.Context(), awsReq)
if err != nil {
return wrapErr(errors.Wrap(err, "InvokeModelWithResponseStream")), nil
}
stream := awsResp.GetStream()
defer stream.Close()
c.Writer.Header().Set("Content-Type", "text/event-stream")
var usage relaymodel.Usage
var id string
var model string
c.Stream(func(w io.Writer) bool {
event, ok := <-stream.Events()
if !ok {
c.Render(-1, common.CustomEvent{Data: "data: [DONE]"})
return false
}
switch v := event.(type) {
case *types.ResponseStreamMemberChunk:
claudeResp := new(claude.ClaudeResponse)
err := json.NewDecoder(bytes.NewReader(v.Value.Bytes)).Decode(claudeResp)
if err != nil {
common.SysError("error unmarshalling stream response: " + err.Error())
return false
}
response, claudeUsage := claude.StreamResponseClaude2OpenAI(requestMode, claudeResp)
if claudeUsage != nil {
usage.PromptTokens += claudeUsage.InputTokens
usage.CompletionTokens += claudeUsage.OutputTokens
}
if response == nil {
return true
}
if response.Id != "" {
id = response.Id
}
if response.Model != "" {
model = response.Model
}
response.Id = id
response.Model = model
jsonStr, err := json.Marshal(response)
if err != nil {
common.SysError("error marshalling stream response: " + err.Error())
return true
}
c.Render(-1, common.CustomEvent{Data: "data: " + string(jsonStr)})
return true
case *types.UnknownUnionMember:
fmt.Println("unknown tag:", v.Tag)
return false
default:
fmt.Println("union is nil or unknown type")
return false
}
})
return nil, &usage
}
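newAwsClient above splits the channel key on "|" and expects exactly three fields: access key, secret key, and region. A minimal sketch of that key format, with placeholder credentials:

package main

import (
	"fmt"
	"strings"
)

// Mirrors the "ak|sk|region" parsing done by newAwsClient above.
func parseAwsKey(key string) (ak, sk, region string, err error) {
	parts := strings.Split(key, "|")
	if len(parts) != 3 {
		return "", "", "", fmt.Errorf("invalid aws secret key")
	}
	return parts[0], parts[1], parts[2], nil
}

func main() {
	// Placeholder credentials, not real values.
	ak, sk, region, err := parseAwsKey("AKIAEXAMPLE|secretExample|us-west-2")
	fmt.Println(ak, sk, region, err)
}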

View File

@@ -53,9 +53,9 @@ func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *dto.Gen
return nil, errors.New("request is nil")
}
if a.RequestMode == RequestModeCompletion {
- return requestOpenAI2ClaudeComplete(*request), nil
+ return RequestOpenAI2ClaudeComplete(*request), nil
} else {
- return requestOpenAI2ClaudeMessage(*request)
+ return RequestOpenAI2ClaudeMessage(*request)
}
}

View File

@@ -28,7 +28,6 @@ type ClaudeRequest struct {
Prompt string `json:"prompt,omitempty"`
System string `json:"system,omitempty"`
Messages []ClaudeMessage `json:"messages,omitempty"`
- MaxTokensToSample uint `json:"max_tokens_to_sample,omitempty"`
MaxTokens uint `json:"max_tokens,omitempty"`
StopSequences []string `json:"stop_sequences,omitempty"`
Temperature float64 `json:"temperature,omitempty"`

View File

@@ -26,18 +26,19 @@ func stopReasonClaude2OpenAI(reason string) string {
}
}
- func requestOpenAI2ClaudeComplete(textRequest dto.GeneralOpenAIRequest) *ClaudeRequest {
+ func RequestOpenAI2ClaudeComplete(textRequest dto.GeneralOpenAIRequest) *ClaudeRequest {
claudeRequest := ClaudeRequest{
Model: textRequest.Model,
Prompt: "",
- MaxTokensToSample: textRequest.MaxTokens,
+ MaxTokens: textRequest.MaxTokens,
StopSequences: nil,
Temperature: textRequest.Temperature,
TopP: textRequest.TopP,
TopK: textRequest.TopK,
Stream: textRequest.Stream,
}
- if claudeRequest.MaxTokensToSample == 0 {
+ if claudeRequest.MaxTokens == 0 {
- claudeRequest.MaxTokensToSample = 1000000
+ claudeRequest.MaxTokens = 4096
}
prompt := ""
for _, message := range textRequest.Messages {
@@ -56,20 +57,52 @@ func requestOpenAI2ClaudeComplete(textRequest dto.GeneralOpenAIRequest) *ClaudeR
return &claudeRequest
}
- func requestOpenAI2ClaudeMessage(textRequest dto.GeneralOpenAIRequest) (*ClaudeRequest, error) {
+ func RequestOpenAI2ClaudeMessage(textRequest dto.GeneralOpenAIRequest) (*ClaudeRequest, error) {
claudeRequest := ClaudeRequest{
Model: textRequest.Model,
MaxTokens: textRequest.MaxTokens,
StopSequences: nil,
Temperature: textRequest.Temperature,
TopP: textRequest.TopP,
TopK: textRequest.TopK,
Stream: textRequest.Stream,
}
if claudeRequest.MaxTokens == 0 {
claudeRequest.MaxTokens = 4096
}
formatMessages := make([]dto.Message, 0)
var lastMessage *dto.Message
for i, message := range textRequest.Messages {
if message.Role == "system" {
if i != 0 {
message.Role = "user"
}
}
if message.Role == "" {
message.Role = "user"
}
fmtMessage := dto.Message{
Role: message.Role,
Content: message.Content,
}
if lastMessage != nil && lastMessage.Role == message.Role {
if lastMessage.IsStringContent() && message.IsStringContent() {
content, _ := json.Marshal(strings.Trim(fmt.Sprintf("%s %s", lastMessage.StringContent(), message.StringContent()), "\""))
fmtMessage.Content = content
// delete last message
formatMessages = formatMessages[:len(formatMessages)-1]
}
}
if fmtMessage.Content == nil {
content, _ := json.Marshal("...")
fmtMessage.Content = content
}
formatMessages = append(formatMessages, fmtMessage)
lastMessage = &message
}
claudeMessages := make([]ClaudeMessage, 0)
- for _, message := range textRequest.Messages {
+ for _, message := range formatMessages {
if message.Role == "system" {
claudeRequest.System = message.StringContent()
} else {
@@ -120,7 +153,7 @@ func requestOpenAI2ClaudeMessage(textRequest dto.GeneralOpenAIRequest) (*ClaudeR
return &claudeRequest, nil
}
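The normalization loop above demotes any non-leading system message to user, fills empty roles, replaces missing content with "...", and merges consecutive messages that share a role. A stripped-down sketch of just the merge step, using plain strings instead of dto.Message:

package main

import "fmt"

type msg struct{ Role, Content string }

// Merge consecutive messages with the same role, as the loop above does for Claude.
func mergeSameRole(in []msg) []msg {
	out := make([]msg, 0, len(in))
	for _, m := range in {
		if n := len(out); n > 0 && out[n-1].Role == m.Role {
			out[n-1].Content = out[n-1].Content + " " + m.Content
			continue
		}
		out = append(out, m)
	}
	return out
}

func main() {
	fmt.Println(mergeSameRole([]msg{
		{"user", "hi"}, {"user", "there"}, {"assistant", "hello"},
	}))
	// [{user hi there} {assistant hello}]
}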
- func streamResponseClaude2OpenAI(reqMode int, claudeResponse *ClaudeResponse) (*dto.ChatCompletionsStreamResponse, *ClaudeUsage) {
+ func StreamResponseClaude2OpenAI(reqMode int, claudeResponse *ClaudeResponse) (*dto.ChatCompletionsStreamResponse, *ClaudeUsage) {
var response dto.ChatCompletionsStreamResponse
var claudeUsage *ClaudeUsage
response.Object = "chat.completion.chunk"
@@ -147,13 +180,18 @@ func streamResponseClaude2OpenAI(reqMode int, claudeResponse *ClaudeResponse) (*
choice.FinishReason = &finishReason
}
claudeUsage = &claudeResponse.Usage
} else if claudeResponse.Type == "message_stop" {
return nil, nil
}
}
if claudeUsage == nil {
claudeUsage = &ClaudeUsage{}
}
response.Choices = append(response.Choices, choice)
return &response, claudeUsage
}
- func responseClaude2OpenAI(reqMode int, claudeResponse *ClaudeResponse) *dto.OpenAITextResponse {
+ func ResponseClaude2OpenAI(reqMode int, claudeResponse *ClaudeResponse) *dto.OpenAITextResponse {
choices := make([]dto.OpenAITextResponseChoice, 0)
fullTextResponse := dto.OpenAITextResponse{
Id: fmt.Sprintf("chatcmpl-%s", common.GetUUID()),
@@ -194,7 +232,8 @@ func responseClaude2OpenAI(reqMode int, claudeResponse *ClaudeResponse) *dto.Ope
func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c *gin.Context, resp *http.Response) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
responseId := fmt.Sprintf("chatcmpl-%s", common.GetUUID())
- var usage dto.Usage
+ var usage *dto.Usage
usage = &dto.Usage{}
responseText := ""
createdTime := common.GetTimestamp()
scanner := bufio.NewScanner(resp.Body)
@@ -236,7 +275,10 @@ func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c
return true
}
- response, claudeUsage := streamResponseClaude2OpenAI(requestMode, &claudeResponse)
+ response, claudeUsage := StreamResponseClaude2OpenAI(requestMode, &claudeResponse)
if response == nil {
return true
}
if requestMode == RequestModeCompletion {
responseText += claudeResponse.Completion
responseId = response.Id
@@ -277,13 +319,13 @@ func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c
return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
}
if requestMode == RequestModeCompletion {
- usage = *service.ResponseText2Usage(responseText, modelName, promptTokens)
+ usage, _ = service.ResponseText2Usage(responseText, modelName, promptTokens)
} else {
if usage.CompletionTokens == 0 {
- usage = *service.ResponseText2Usage(responseText, modelName, usage.PromptTokens)
+ usage, _ = service.ResponseText2Usage(responseText, modelName, usage.PromptTokens)
}
}
- return nil, &usage
+ return nil, usage
}
func claudeHandler(requestMode int, c *gin.Context, resp *http.Response, promptTokens int, model string) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
@@ -311,8 +353,11 @@ func claudeHandler(requestMode int, c *gin.Context, resp *http.Response, promptT
StatusCode: resp.StatusCode,
}, nil
}
- fullTextResponse := responseClaude2OpenAI(requestMode, &claudeResponse)
+ fullTextResponse := ResponseClaude2OpenAI(requestMode, &claudeResponse)
- completionTokens := service.CountTokenText(claudeResponse.Completion, model)
+ completionTokens, err, _ := service.CountTokenText(claudeResponse.Completion, model, false)
if err != nil {
return service.OpenAIErrorWrapper(err, "count_token_text_failed", http.StatusInternalServerError), nil
}
usage := dto.Usage{}
if requestMode == RequestModeCompletion {
usage.PromptTokens = promptTokens

View File

@@ -18,11 +18,23 @@ type Adaptor struct {
func (a *Adaptor) Init(info *relaycommon.RelayInfo, request dto.GeneralOpenAIRequest) {
}
// Map of model names to the API version they require
var modelVersionMap = map[string]string{
"gemini-1.5-pro-latest": "v1beta",
"gemini-ultra": "v1beta",
}
func (a *Adaptor) GetRequestURL(info *relaycommon.RelayInfo) (string, error) {
- version := "v1"
+ // Look up the version for this model; if it is not in the map, use info.ApiVersion or fall back to the default "v1"
version, beta := modelVersionMap[info.UpstreamModelName]
if !beta {
if info.ApiVersion != "" {
version = info.ApiVersion
} else {
version = "v1"
}
}
action := "generateContent"
if info.IsStream {
action = "streamGenerateContent"
@@ -51,7 +63,7 @@ func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycom
if info.IsStream {
var responseText string
err, responseText = geminiChatStreamHandler(c, resp)
- usage = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
} else {
err, usage = geminiChatHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
}
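A small standalone sketch of the version resolution performed in GetRequestURL above (same precedence: pinned map entry, then the channel's ApiVersion, then "v1"):

package main

import "fmt"

var modelVersionMap = map[string]string{
	"gemini-1.5-pro-latest": "v1beta",
	"gemini-ultra":          "v1beta",
}

// resolveVersion mirrors the precedence used by GetRequestURL above.
func resolveVersion(model, apiVersion string) string {
	if v, ok := modelVersionMap[model]; ok {
		return v
	}
	if apiVersion != "" {
		return apiVersion
	}
	return "v1"
}

func main() {
	fmt.Println(resolveVersion("gemini-ultra", ""))     // v1beta
	fmt.Println(resolveVersion("gemini-pro", ""))       // v1
	fmt.Println(resolveVersion("gemini-pro", "v1beta")) // v1beta
}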

View File

@@ -5,8 +5,8 @@ const (
)
var ModelList = []string{
- "gemini-pro",
+ "gemini-1.0-pro-latest", "gemini-1.0-pro-001", "gemini-1.5-pro-latest", "gemini-ultra",
- "gemini-pro-vision",
+ "gemini-1.0-pro-vision-latest", "gemini-1.0-pro-vision-001",
}
var ChannelName = "google gemini"

View File

@@ -256,7 +256,7 @@ func geminiChatHandler(c *gin.Context, resp *http.Response, promptTokens int, mo
}, nil
}
fullTextResponse := responseGeminiChat2OpenAI(&geminiResponse)
- completionTokens := service.CountTokenText(geminiResponse.GetResponseText(), model)
+ completionTokens, _, _ := service.CountTokenText(geminiResponse.GetResponseText(), model, false)
usage := dto.Usage{
PromptTokens: promptTokens,
CompletionTokens: completionTokens,

View File

@@ -0,0 +1,9 @@
package lingyiwanwu
// https://platform.lingyiwanwu.com/docs
var ModelList = []string{
"yi-34b-chat-0205",
"yi-34b-chat-200k",
"yi-vl-plus",
}

View File

@@ -2,7 +2,6 @@ package ollama
import (
"errors"
- "fmt"
"github.com/gin-gonic/gin"
"io"
"net/http"
@@ -10,6 +9,7 @@ import (
"one-api/relay/channel"
"one-api/relay/channel/openai"
relaycommon "one-api/relay/common"
relayconstant "one-api/relay/constant"
"one-api/service"
)
@@ -20,7 +20,12 @@ func (a *Adaptor) Init(info *relaycommon.RelayInfo, request dto.GeneralOpenAIReq
}
func (a *Adaptor) GetRequestURL(info *relaycommon.RelayInfo) (string, error) {
- return fmt.Sprintf("%s/api/chat", info.BaseUrl), nil
+ switch info.RelayMode {
case relayconstant.RelayModeEmbeddings:
return info.BaseUrl + "/api/embeddings", nil
default:
return relaycommon.GetFullRequestURL(info.BaseUrl, info.RequestURLPath, info.ChannelType), nil
}
}
func (a *Adaptor) SetupRequestHeader(c *gin.Context, req *http.Request, info *relaycommon.RelayInfo) error {
@@ -32,8 +37,13 @@ func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *dto.Gen
if request == nil {
return nil, errors.New("request is nil")
}
switch relayMode {
case relayconstant.RelayModeEmbeddings:
return requestOpenAI2Embeddings(*request), nil
default:
return requestOpenAI2Ollama(*request), nil
}
}
func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, requestBody io.Reader) (*http.Response, error) {
return channel.DoApiRequest(a, c, info, requestBody)
@@ -42,11 +52,15 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
if info.IsStream {
var responseText string
- err, responseText = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
+ err, responseText, _ = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
- usage = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
} else {
if info.RelayMode == relayconstant.RelayModeEmbeddings {
err, usage = ollamaEmbeddingHandler(c, resp, info.PromptTokens, info.UpstreamModelName, info.RelayMode)
} else {
err, usage = openai.OpenaiHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
}
}
return
}

View File

@@ -6,13 +6,21 @@ type OllamaRequest struct {
Model string `json:"model,omitempty"`
Messages []dto.Message `json:"messages,omitempty"`
Stream bool `json:"stream,omitempty"`
Options *OllamaOptions `json:"options,omitempty"`
}
type OllamaOptions struct {
Temperature float64 `json:"temperature,omitempty"`
Seed float64 `json:"seed,omitempty"`
Topp float64 `json:"top_p,omitempty"`
TopK int `json:"top_k,omitempty"`
Stop any `json:"stop,omitempty"`
}
type OllamaEmbeddingRequest struct {
Model string `json:"model,omitempty"`
Prompt any `json:"prompt,omitempty"`
}
type OllamaEmbeddingResponse struct {
Embedding []float64 `json:"embedding,omitempty"`
}
- //type OllamaOptions struct {
- //}

View File

@@ -1,6 +1,16 @@
package ollama
- import "one-api/dto"
+ import (
"bytes"
"encoding/json"
"fmt"
"github.com/gin-gonic/gin"
"io"
"net/http"
"one-api/dto"
"one-api/service"
"strings"
)
func requestOpenAI2Ollama(request dto.GeneralOpenAIRequest) *OllamaRequest {
messages := make([]dto.Message, 0, len(request.Messages))
@@ -21,12 +31,79 @@ func requestOpenAI2Ollama(request dto.GeneralOpenAIRequest) *OllamaRequest {
Model: request.Model,
Messages: messages,
Stream: request.Stream,
Options: &OllamaOptions{
Temperature: request.Temperature,
Seed: request.Seed,
Topp: request.TopP,
TopK: request.TopK,
Stop: Stop,
},
}
}
func requestOpenAI2Embeddings(request dto.GeneralOpenAIRequest) *OllamaEmbeddingRequest {
return &OllamaEmbeddingRequest{
Model: request.Model,
Prompt: strings.Join(request.ParseInput(), " "),
}
}
func ollamaEmbeddingHandler(c *gin.Context, resp *http.Response, promptTokens int, model string, relayMode int) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
var ollamaEmbeddingResponse OllamaEmbeddingResponse
responseBody, err := io.ReadAll(resp.Body)
if err != nil {
return service.OpenAIErrorWrapper(err, "read_response_body_failed", http.StatusInternalServerError), nil
}
err = resp.Body.Close()
if err != nil {
return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
}
err = json.Unmarshal(responseBody, &ollamaEmbeddingResponse)
if err != nil {
return service.OpenAIErrorWrapper(err, "unmarshal_response_body_failed", http.StatusInternalServerError), nil
}
data := make([]dto.OpenAIEmbeddingResponseItem, 0, 1)
data = append(data, dto.OpenAIEmbeddingResponseItem{
Embedding: ollamaEmbeddingResponse.Embedding,
Object: "embedding",
})
usage := &dto.Usage{
TotalTokens: promptTokens,
CompletionTokens: 0,
PromptTokens: promptTokens,
}
embeddingResponse := &dto.OpenAIEmbeddingResponse{
Object: "list",
Data: data,
Model: model,
Usage: *usage,
}
doResponseBody, err := json.Marshal(embeddingResponse)
if err != nil {
return service.OpenAIErrorWrapper(err, "marshal_response_body_failed", http.StatusInternalServerError), nil
}
resp.Body = io.NopCloser(bytes.NewBuffer(doResponseBody))
// We shouldn't set the header before we parse the response body, because the parse part may fail.
// And then we will have to send an error response, but in this case, the header has already been set.
// So the httpClient will be confused by the response.
// For example, Postman will report error, and we cannot check the response at all.
// Copy headers
for k, v := range resp.Header {
// Delete any existing header with the same name to avoid duplicate headers
c.Writer.Header().Del(k)
for _, vv := range v {
c.Writer.Header().Add(k, vv)
}
}
// reset content length
c.Writer.Header().Del("Content-Length")
c.Writer.Header().Set("Content-Length", fmt.Sprintf("%d", len(doResponseBody)))
c.Writer.WriteHeader(resp.StatusCode)
_, err = io.Copy(c.Writer, resp.Body)
if err != nil {
return service.OpenAIErrorWrapper(err, "copy_response_body_failed", http.StatusInternalServerError), nil
}
err = resp.Body.Close()
if err != nil {
return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
}
return nil, usage
}
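requestOpenAI2Embeddings above joins the parsed input strings into a single prompt, and the handler then repackages Ollama's single embedding as an OpenAI-style list response. Illustrative only: the body shape sent to /api/embeddings (model name and input are made up):

package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// Mirrors OllamaEmbeddingRequest above; values are made up.
type ollamaEmbeddingRequest struct {
	Model  string `json:"model,omitempty"`
	Prompt any    `json:"prompt,omitempty"`
}

func main() {
	input := []string{"hello", "world"} // what request.ParseInput() is assumed to yield
	body, _ := json.Marshal(ollamaEmbeddingRequest{
		Model:  "nomic-embed-text",
		Prompt: strings.Join(input, " "),
	})
	fmt.Println(string(body)) // {"model":"nomic-embed-text","prompt":"hello world"}
}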

View File

@@ -10,6 +10,7 @@ import (
"one-api/dto" "one-api/dto"
"one-api/relay/channel" "one-api/relay/channel"
"one-api/relay/channel/ai360" "one-api/relay/channel/ai360"
"one-api/relay/channel/lingyiwanwu"
"one-api/relay/channel/moonshot" "one-api/relay/channel/moonshot"
relaycommon "one-api/relay/common" relaycommon "one-api/relay/common"
"one-api/service" "one-api/service"
@@ -33,9 +34,6 @@ func (a *Adaptor) GetRequestURL(info *relaycommon.RelayInfo) (string, error) {
model_ := info.UpstreamModelName model_ := info.UpstreamModelName
model_ = strings.Replace(model_, ".", "", -1) model_ = strings.Replace(model_, ".", "", -1)
// https://github.com/songquanpeng/one-api/issues/67 // https://github.com/songquanpeng/one-api/issues/67
model_ = strings.TrimSuffix(model_, "-0301")
model_ = strings.TrimSuffix(model_, "-0314")
model_ = strings.TrimSuffix(model_, "-0613")
requestURL = fmt.Sprintf("/openai/deployments/%s/%s", model_, task) requestURL = fmt.Sprintf("/openai/deployments/%s/%s", model_, task)
return relaycommon.GetFullRequestURL(info.BaseUrl, requestURL, info.ChannelType), nil return relaycommon.GetFullRequestURL(info.BaseUrl, requestURL, info.ChannelType), nil
@@ -49,6 +47,9 @@ func (a *Adaptor) SetupRequestHeader(c *gin.Context, req *http.Request, info *re
req.Header.Set("api-key", info.ApiKey) req.Header.Set("api-key", info.ApiKey)
return nil return nil
} }
if info.ChannelType == common.ChannelTypeOpenAI && "" != info.Organization {
req.Header.Set("OpenAI-Organization", info.Organization)
}
req.Header.Set("Authorization", "Bearer "+info.ApiKey) req.Header.Set("Authorization", "Bearer "+info.ApiKey)
//if info.ChannelType == common.ChannelTypeOpenRouter { //if info.ChannelType == common.ChannelTypeOpenRouter {
// req.Header.Set("HTTP-Referer", "https://github.com/songquanpeng/one-api") // req.Header.Set("HTTP-Referer", "https://github.com/songquanpeng/one-api")
@@ -71,8 +72,10 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
if info.IsStream {
var responseText string
- err, responseText = OpenaiStreamHandler(c, resp, info.RelayMode)
- usage = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ var toolCount int
+ err, responseText, toolCount = OpenaiStreamHandler(c, resp, info.RelayMode)
+ usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ usage.CompletionTokens += toolCount * 7
} else {
err, usage = OpenaiHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
}
@@ -85,6 +88,8 @@ func (a *Adaptor) GetModelList() []string {
return ai360.ModelList
case common.ChannelTypeMoonshot:
return moonshot.ModelList
case common.ChannelTypeLingYiWanWu:
return lingyiwanwu.ModelList
default:
return ModelList
}
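When streaming, the adaptor above estimates usage from the concatenated delta text and then adds a flat 7 completion tokens for each counted tool call (toolCount tracks the largest tool-call batch seen in the stream). A toy illustration of that accounting; the token counter here is a crude stand-in for the real tokenizer:

package main

import (
	"fmt"
	"strings"
)

// Crude stand-in for the tokenizer behind service.ResponseText2Usage.
func countTokens(s string) int { return len(strings.Fields(s)) }

func main() {
	responseText := "The weather in Paris is sunny"
	toolCount := 2 // two tool-call deltas were seen in the stream
	completionTokens := countTokens(responseText) + toolCount*7
	fmt.Println(completionTokens) // 6 + 2*7 = 20
}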

View File

@@ -16,8 +16,10 @@ import (
"time" "time"
) )
func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*dto.OpenAIErrorWithStatusCode, string) { func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*dto.OpenAIErrorWithStatusCode, string, int) {
//checkSensitive := constant.ShouldCheckCompletionSensitive()
var responseTextBuilder strings.Builder var responseTextBuilder strings.Builder
toolCount := 0
scanner := bufio.NewScanner(resp.Body) scanner := bufio.NewScanner(resp.Body)
scanner.Split(func(data []byte, atEOF bool) (advance int, token []byte, err error) { scanner.Split(func(data []byte, atEOF bool) (advance int, token []byte, err error) {
if atEOF && len(data) == 0 { if atEOF && len(data) == 0 {
@@ -36,11 +38,10 @@ func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*d
defer close(stopChan) defer close(stopChan)
defer close(dataChan) defer close(dataChan)
var wg sync.WaitGroup var wg sync.WaitGroup
go func() { go func() {
wg.Add(1) wg.Add(1)
defer wg.Done() defer wg.Done()
var streamItems []string var streamItems []string // store stream items
for scanner.Scan() { for scanner.Scan() {
data := scanner.Text() data := scanner.Text()
if len(data) < 6 { // ignore blank line or wrong format if len(data) < 6 { // ignore blank line or wrong format
@@ -62,11 +63,38 @@ func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*d
err := json.Unmarshal(common.StringToByteSlice(streamResp), &streamResponses)
if err != nil {
common.SysError("error unmarshalling stream response: " + err.Error())
- return // just ignore the error
+ for _, item := range streamItems {
var streamResponse dto.ChatCompletionsStreamResponseSimple
err := json.Unmarshal(common.StringToByteSlice(item), &streamResponse)
if err == nil {
for _, choice := range streamResponse.Choices {
responseTextBuilder.WriteString(choice.Delta.Content)
if choice.Delta.ToolCalls != nil {
if len(choice.Delta.ToolCalls) > toolCount {
toolCount = len(choice.Delta.ToolCalls)
}
for _, tool := range choice.Delta.ToolCalls {
responseTextBuilder.WriteString(tool.Function.Name)
responseTextBuilder.WriteString(tool.Function.Arguments)
}
}
}
}
}
} else {
for _, streamResponse := range streamResponses {
for _, choice := range streamResponse.Choices {
responseTextBuilder.WriteString(choice.Delta.Content)
if choice.Delta.ToolCalls != nil {
if len(choice.Delta.ToolCalls) > toolCount {
toolCount = len(choice.Delta.ToolCalls)
}
for _, tool := range choice.Delta.ToolCalls {
responseTextBuilder.WriteString(tool.Function.Name)
responseTextBuilder.WriteString(tool.Function.Arguments)
}
}
}
}
}
case relayconstant.RelayModeCompletions:
@@ -74,14 +102,23 @@ func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*d
err := json.Unmarshal(common.StringToByteSlice(streamResp), &streamResponses)
if err != nil {
common.SysError("error unmarshalling stream response: " + err.Error())
- return // just ignore the error
+ for _, item := range streamItems {
var streamResponse dto.CompletionsStreamResponse
err := json.Unmarshal(common.StringToByteSlice(item), &streamResponse)
if err == nil {
for _, choice := range streamResponse.Choices {
responseTextBuilder.WriteString(choice.Text)
}
}
}
} else {
for _, streamResponse := range streamResponses {
for _, choice := range streamResponse.Choices {
responseTextBuilder.WriteString(choice.Text)
}
}
}
}
if len(dataChan) > 0 {
// wait data out
time.Sleep(2 * time.Second)
@@ -105,14 +142,14 @@ func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*d
})
err := resp.Body.Close()
if err != nil {
- return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), ""
+ return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), "", toolCount
}
wg.Wait()
- return nil, responseTextBuilder.String()
+ return nil, responseTextBuilder.String(), toolCount
}
func OpenaiHandler(c *gin.Context, resp *http.Response, promptTokens int, model string) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
- var textResponse dto.TextResponse
+ var simpleResponse dto.SimpleResponse
responseBody, err := io.ReadAll(resp.Body)
if err != nil {
return service.OpenAIErrorWrapper(err, "read_response_body_failed", http.StatusInternalServerError), nil
@@ -121,13 +158,13 @@ func OpenaiHandler(c *gin.Context, resp *http.Response, promptTokens int, model
if err != nil {
return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
}
- err = json.Unmarshal(responseBody, &textResponse)
+ err = json.Unmarshal(responseBody, &simpleResponse)
if err != nil {
return service.OpenAIErrorWrapper(err, "unmarshal_response_body_failed", http.StatusInternalServerError), nil
}
- if textResponse.Error.Type != "" {
+ if simpleResponse.Error.Type != "" {
return &dto.OpenAIErrorWithStatusCode{
- Error: textResponse.Error,
+ Error: simpleResponse.Error,
StatusCode: resp.StatusCode,
}, nil
}
@@ -150,16 +187,17 @@ func OpenaiHandler(c *gin.Context, resp *http.Response, promptTokens int, model
return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
}
- if textResponse.Usage.TotalTokens == 0 {
+ if simpleResponse.Usage.TotalTokens == 0 {
completionTokens := 0
- for _, choice := range textResponse.Choices {
+ for _, choice := range simpleResponse.Choices {
- completionTokens += service.CountTokenText(string(choice.Message.Content), model)
+ ctkm, _, _ := service.CountTokenText(string(choice.Message.Content), model, false)
+ completionTokens += ctkm
}
- textResponse.Usage = dto.Usage{
+ simpleResponse.Usage = dto.Usage{
PromptTokens: promptTokens,
CompletionTokens: completionTokens,
TotalTokens: promptTokens + completionTokens,
}
}
- return nil, &textResponse.Usage
+ return nil, &simpleResponse.Usage
}

View File

@@ -43,7 +43,7 @@ func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycom
if info.IsStream {
var responseText string
err, responseText = palmStreamHandler(c, resp)
- usage = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
} else {
err, usage = palmHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
}

View File

@@ -156,7 +156,7 @@ func palmHandler(c *gin.Context, resp *http.Response, promptTokens int, model st
}, nil
}
fullTextResponse := responsePaLM2OpenAI(&palmResponse)
- completionTokens := service.CountTokenText(palmResponse.Candidates[0].Content, model)
+ completionTokens, _, _ := service.CountTokenText(palmResponse.Candidates[0].Content, model, false)
usage := dto.Usage{
PromptTokens: promptTokens,
CompletionTokens: completionTokens,

View File

@@ -0,0 +1,63 @@
package perplexity
import (
"errors"
"fmt"
"github.com/gin-gonic/gin"
"io"
"net/http"
"one-api/dto"
"one-api/relay/channel"
"one-api/relay/channel/openai"
relaycommon "one-api/relay/common"
"one-api/service"
)
type Adaptor struct {
}
func (a *Adaptor) Init(info *relaycommon.RelayInfo, request dto.GeneralOpenAIRequest) {
}
func (a *Adaptor) GetRequestURL(info *relaycommon.RelayInfo) (string, error) {
return fmt.Sprintf("%s/chat/completions", info.BaseUrl), nil
}
func (a *Adaptor) SetupRequestHeader(c *gin.Context, req *http.Request, info *relaycommon.RelayInfo) error {
channel.SetupApiRequestHeader(info, c, req)
req.Header.Set("Authorization", "Bearer "+info.ApiKey)
return nil
}
func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *dto.GeneralOpenAIRequest) (any, error) {
if request == nil {
return nil, errors.New("request is nil")
}
if request.TopP >= 1 {
request.TopP = 0.99
}
return requestOpenAI2Perplexity(*request), nil
}
func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, requestBody io.Reader) (*http.Response, error) {
return channel.DoApiRequest(a, c, info, requestBody)
}
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
if info.IsStream {
var responseText string
err, responseText, _ = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
} else {
err, usage = openai.OpenaiHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
}
return
}
func (a *Adaptor) GetModelList() []string {
return ModelList
}
func (a *Adaptor) GetChannelName() string {
return ChannelName
}
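ConvertRequest above clamps top_p to 0.99 whenever the client sends a value of 1 or more, presumably because the upstream rejects top_p values of exactly 1. A one-line sketch of that guard:

package main

import "fmt"

// Mirrors the TopP guard in ConvertRequest above.
func clampTopP(topP float64) float64 {
	if topP >= 1 {
		return 0.99
	}
	return topP
}

func main() {
	fmt.Println(clampTopP(1.0)) // 0.99
	fmt.Println(clampTopP(0.7)) // 0.7
}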

View File

@@ -0,0 +1,7 @@
package perplexity
var ModelList = []string{
"sonar-small-chat", "sonar-small-online", "sonar-medium-chat", "sonar-medium-online", "mistral-7b-instruct", "mixtral-8x7b-instruct",
}
var ChannelName = "perplexity"

View File

@@ -0,0 +1,21 @@
package perplexity
import "one-api/dto"
func requestOpenAI2Perplexity(request dto.GeneralOpenAIRequest) *dto.GeneralOpenAIRequest {
messages := make([]dto.Message, 0, len(request.Messages))
for _, message := range request.Messages {
messages = append(messages, dto.Message{
Role: message.Role,
Content: message.Content,
})
}
return &dto.GeneralOpenAIRequest{
Model: request.Model,
Stream: request.Stream,
Messages: messages,
Temperature: request.Temperature,
TopP: request.TopP,
MaxTokens: request.MaxTokens,
}
}

View File

@@ -57,7 +57,7 @@ func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycom
if info.IsStream {
var responseText string
err, responseText = tencentStreamHandler(c, resp)
- usage = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
} else {
err, usage = tencentHandler(c, resp)
}

View File

@@ -179,7 +179,13 @@ func xunfeiHandler(c *gin.Context, textRequest dto.GeneralOpenAIRequest, appId s
case stop = <-stopChan:
}
}
if len(xunfeiResponse.Payload.Choices.Text) == 0 {
xunfeiResponse.Payload.Choices.Text = []XunfeiChatResponseTextItem{
{
Content: "",
},
}
}
xunfeiResponse.Payload.Choices.Text[0].Content = content
response := responseXunfei2OpenAI(&xunfeiResponse)

View File

@@ -36,6 +36,9 @@ func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *dto.Gen
if request == nil {
return nil, errors.New("request is nil")
}
if request.TopP >= 1 {
request.TopP = 0.99
}
return requestOpenAI2Zhipu(*request), nil
}

View File

@@ -34,6 +34,9 @@ func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *dto.Gen
if request == nil {
return nil, errors.New("request is nil")
}
if request.TopP >= 1 {
request.TopP = 0.99
}
return requestOpenAI2Zhipu(*request), nil
}
@@ -44,8 +47,10 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
if info.IsStream {
var responseText string
- err, responseText = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
- usage = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ var toolCount int
+ err, responseText, toolCount = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
+ usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
+ usage.CompletionTokens += toolCount * 7
} else {
err, usage = openai.OpenaiHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
}

View File

@@ -74,6 +74,25 @@ func getZhipuToken(apikey string) string {
func requestOpenAI2Zhipu(request dto.GeneralOpenAIRequest) *dto.GeneralOpenAIRequest {
messages := make([]dto.Message, 0, len(request.Messages))
for _, message := range request.Messages {
if !message.IsStringContent() {
mediaMessages := message.ParseContent()
for j, mediaMessage := range mediaMessages {
if mediaMessage.Type == dto.ContentTypeImageURL {
imageUrl := mediaMessage.ImageUrl.(dto.MessageImageUrl)
// check if base64
if strings.HasPrefix(imageUrl.Url, "data:image/") {
// Strip the data-URL prefix from the base64 payload, if present
if idx := strings.Index(imageUrl.Url, ","); idx != -1 {
imageUrl.Url = imageUrl.Url[idx+1:]
}
}
mediaMessage.ImageUrl = imageUrl
mediaMessages[j] = mediaMessage
}
}
messageRaw, _ := json.Marshal(mediaMessages)
message.Content = messageRaw
}
messages = append(messages, dto.Message{
Role: message.Role,
Content: message.Content,
@@ -138,7 +157,7 @@ func streamResponseZhipu2OpenAI(zhipuResponse *ZhipuV4StreamResponse) *dto.ChatC
Id: zhipuResponse.Id,
Object: "chat.completion.chunk",
Created: zhipuResponse.Created,
- Model: "glm-4",
+ Model: "glm-4v",
Choices: []dto.ChatCompletionsStreamResponseChoice{choice},
}
return &response
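For glm-4v image content, the conversion above keeps only the raw base64 payload by dropping the data:image/...;base64, prefix when one is present. A minimal sketch of that trim:

package main

import (
	"fmt"
	"strings"
)

// Mirrors the data-URL prefix stripping done for glm-4v image URLs above.
func stripDataURLPrefix(url string) string {
	if strings.HasPrefix(url, "data:image/") {
		if idx := strings.Index(url, ","); idx != -1 {
			return url[idx+1:]
		}
	}
	return url
}

func main() {
	fmt.Println(stripDataURLPrefix("data:image/png;base64,iVBORw0KGgo=")) // iVBORw0KGgo=
	fmt.Println(stripDataURLPrefix("https://example.com/cat.png"))        // unchanged
}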

View File

@@ -24,12 +24,14 @@ type RelayInfo struct {
ApiVersion string
PromptTokens int
ApiKey string
Organization string
BaseUrl string
}
func GenRelayInfo(c *gin.Context) *RelayInfo {
channelType := c.GetInt("channel")
channelId := c.GetInt("channel_id")
tokenId := c.GetInt("token_id")
userId := c.GetInt("id")
group := c.GetString("group")
@@ -52,6 +54,7 @@ func GenRelayInfo(c *gin.Context) *RelayInfo {
ApiType: apiType,
ApiVersion: c.GetString("api_version"),
ApiKey: strings.TrimPrefix(c.Request.Header.Get("Authorization"), "Bearer "),
Organization: c.GetString("channel_organization"),
}
if info.BaseUrl == "" {
info.BaseUrl = common.ChannelBaseURLs[channelType]

View File

@@ -35,7 +35,7 @@ func RelayErrorHandler(resp *http.Response) (OpenAIErrorWithStatusCode *dto.Open
if err != nil {
return
}
- var textResponse dto.TextResponse
+ var textResponse dto.TextResponseWithError
err = json.Unmarshal(responseBody, &textResponse)
if err != nil {
return

View File

@@ -16,6 +16,9 @@ const (
APITypeTencent
APITypeGemini
APITypeZhipu_v4
APITypeOllama
APITypePerplexity
APITypeAws
APITypeDummy // this one is only for count, do not add any channel after this
)
@@ -43,6 +46,12 @@ func ChannelType2APIType(channelType int) int {
apiType = APITypeGemini
case common.ChannelTypeZhipu_v4:
apiType = APITypeZhipu_v4
case common.ChannelTypeOllama:
apiType = APITypeOllama
case common.ChannelTypePerplexity:
apiType = APITypePerplexity
case common.ChannelTypeAws:
apiType = APITypeAws
}
return apiType
}

View File

@@ -56,29 +56,29 @@ func Path2RelayMode(path string) int {
func Path2RelayModeMidjourney(path string) int { func Path2RelayModeMidjourney(path string) int {
relayMode := RelayModeUnknown relayMode := RelayModeUnknown
if strings.HasPrefix(path, "/mj/submit/action") { if strings.HasSuffix(path, "/mj/submit/action") {
// midjourney plus // midjourney plus
relayMode = RelayModeMidjourneyAction relayMode = RelayModeMidjourneyAction
} else if strings.HasPrefix(path, "/mj/submit/modal") { } else if strings.HasSuffix(path, "/mj/submit/modal") {
// midjourney plus // midjourney plus
relayMode = RelayModeMidjourneyModal relayMode = RelayModeMidjourneyModal
} else if strings.HasPrefix(path, "/mj/submit/shorten") { } else if strings.HasSuffix(path, "/mj/submit/shorten") {
// midjourney plus // midjourney plus
relayMode = RelayModeMidjourneyShorten relayMode = RelayModeMidjourneyShorten
} else if strings.HasPrefix(path, "/mj/insight-face/swap") { } else if strings.HasSuffix(path, "/mj/insight-face/swap") {
// midjourney plus // midjourney plus
relayMode = RelayModeSwapFace relayMode = RelayModeSwapFace
} else if strings.HasPrefix(path, "/mj/submit/imagine") { } else if strings.HasSuffix(path, "/mj/submit/imagine") {
relayMode = RelayModeMidjourneyImagine relayMode = RelayModeMidjourneyImagine
} else if strings.HasPrefix(path, "/mj/submit/blend") { } else if strings.HasSuffix(path, "/mj/submit/blend") {
relayMode = RelayModeMidjourneyBlend relayMode = RelayModeMidjourneyBlend
} else if strings.HasPrefix(path, "/mj/submit/describe") { } else if strings.HasSuffix(path, "/mj/submit/describe") {
relayMode = RelayModeMidjourneyDescribe relayMode = RelayModeMidjourneyDescribe
} else if strings.HasPrefix(path, "/mj/notify") { } else if strings.HasSuffix(path, "/mj/notify") {
relayMode = RelayModeMidjourneyNotify relayMode = RelayModeMidjourneyNotify
} else if strings.HasPrefix(path, "/mj/submit/change") { } else if strings.HasSuffix(path, "/mj/submit/change") {
relayMode = RelayModeMidjourneyChange relayMode = RelayModeMidjourneyChange
} else if strings.HasPrefix(path, "/mj/submit/simple-change") { } else if strings.HasSuffix(path, "/mj/submit/simple-change") {
relayMode = RelayModeMidjourneyChange relayMode = RelayModeMidjourneyChange
} else if strings.HasSuffix(path, "/fetch") { } else if strings.HasSuffix(path, "/fetch") {
relayMode = RelayModeMidjourneyTaskFetch relayMode = RelayModeMidjourneyTaskFetch
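Switching the submit-path checks from HasPrefix to HasSuffix lets the same matcher work for mode-prefixed paths registered by the new /:mode/mj route group later in this diff. A small, runnable illustration:

    package main

    import (
        "fmt"
        "strings"
    )

    func main() {
        // With the /:mode/mj route group, the incoming path may carry a mode
        // segment in front of /mj/..., e.g. /mj-fast/mj/submit/imagine.
        paths := []string{"/mj/submit/imagine", "/mj-fast/mj/submit/imagine"}
        for _, p := range paths {
            fmt.Println(p,
                "prefix:", strings.HasPrefix(p, "/mj/submit/imagine"),
                "suffix:", strings.HasSuffix(p, "/mj/submit/imagine"))
        }
        // Output:
        // /mj/submit/imagine prefix: true suffix: true
        // /mj-fast/mj/submit/imagine prefix: false suffix: true
    }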


@@ -10,6 +10,7 @@ import (
"io" "io"
"net/http" "net/http"
"one-api/common" "one-api/common"
"one-api/constant"
"one-api/dto" "one-api/dto"
"one-api/model" "one-api/model"
relaycommon "one-api/relay/common" relaycommon "one-api/relay/common"
@@ -62,8 +63,16 @@ func AudioHelper(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusCode {
return service.OpenAIErrorWrapper(errors.New("voice must be one of "+strings.Join(availableVoices, ", ")), "invalid_field_value", http.StatusBadRequest) return service.OpenAIErrorWrapper(errors.New("voice must be one of "+strings.Join(availableVoices, ", ")), "invalid_field_value", http.StatusBadRequest)
} }
} }
var err error
promptTokens := 0
preConsumedTokens := common.PreConsumedQuota preConsumedTokens := common.PreConsumedQuota
if strings.HasPrefix(audioRequest.Model, "tts-1") {
promptTokens, err, _ = service.CountAudioToken(audioRequest.Input, audioRequest.Model, constant.ShouldCheckPromptSensitive())
if err != nil {
return service.OpenAIErrorWrapper(err, "count_audio_token_failed", http.StatusInternalServerError)
}
preConsumedTokens = promptTokens
}
modelRatio := common.GetModelRatio(audioRequest.Model) modelRatio := common.GetModelRatio(audioRequest.Model)
groupRatio := common.GetGroupRatio(group) groupRatio := common.GetGroupRatio(group)
ratio := modelRatio * groupRatio ratio := modelRatio * groupRatio
@@ -161,12 +170,10 @@ func AudioHelper(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusCode {
go func() { go func() {
useTimeSeconds := time.Now().Unix() - startTime.Unix() useTimeSeconds := time.Now().Unix() - startTime.Unix()
quota := 0 quota := 0
var promptTokens = 0
if strings.HasPrefix(audioRequest.Model, "tts-1") { if strings.HasPrefix(audioRequest.Model, "tts-1") {
quota = service.CountAudioToken(audioRequest.Input, audioRequest.Model) quota = promptTokens
promptTokens = quota
} else { } else {
quota = service.CountAudioToken(audioResponse.Text, audioRequest.Model) quota, err, _ = service.CountAudioToken(audioResponse.Text, audioRequest.Model, false)
} }
quota = int(float64(quota) * ratio) quota = int(float64(quota) * ratio)
if ratio != 0 && quota <= 0 { if ratio != 0 && quota <= 0 {
@@ -208,6 +215,10 @@ func AudioHelper(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusCode {
if err != nil { if err != nil {
return service.OpenAIErrorWrapper(err, "unmarshal_response_body_failed", http.StatusInternalServerError) return service.OpenAIErrorWrapper(err, "unmarshal_response_body_failed", http.StatusInternalServerError)
} }
contains, words := service.SensitiveWordContains(audioResponse.Text)
if contains {
return service.OpenAIErrorWrapper(errors.New("response contains sensitive words: "+strings.Join(words, ", ")), "response_contains_sensitive_words", http.StatusBadRequest)
}
} }
resp.Body = io.NopCloser(bytes.NewBuffer(responseBody)) resp.Body = io.NopCloser(bytes.NewBuffer(responseBody))
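For tts-1 models the pre-consumed quota is now derived from the actual input length (CountAudioToken returns the rune count for tts models, see the tokenizer changes later in this diff) instead of the flat PreConsumedQuota, and the same count is reused for final billing. A rough sketch of the arithmetic with purely illustrative ratios:

    package main

    import (
        "fmt"
        "unicode/utf8"
    )

    func main() {
        input := "Hello, 世界"              // 9 runes, not 13 bytes
        modelRatio, groupRatio := 7.5, 1.0 // illustrative values only
        promptTokens := utf8.RuneCountInString(input)
        quota := int(float64(promptTokens) * modelRatio * groupRatio)
        fmt.Println(promptTokens, quota) // 9 67
    }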


@@ -110,11 +110,13 @@ func coverMidjourneyTaskDto(c *gin.Context, originTask *model.Midjourney) (midjo
midjourneyTask.StartTime = originTask.StartTime midjourneyTask.StartTime = originTask.StartTime
midjourneyTask.FinishTime = originTask.FinishTime midjourneyTask.FinishTime = originTask.FinishTime
midjourneyTask.ImageUrl = "" midjourneyTask.ImageUrl = ""
if originTask.ImageUrl != "" { if originTask.ImageUrl != "" && constant.MjForwardUrlEnabled {
midjourneyTask.ImageUrl = common.ServerAddress + "/mj/image/" + originTask.MjId midjourneyTask.ImageUrl = common.ServerAddress + "/mj/image/" + originTask.MjId
if originTask.Status != "SUCCESS" { if originTask.Status != "SUCCESS" {
midjourneyTask.ImageUrl += "?rand=" + strconv.FormatInt(time.Now().UnixNano(), 10) midjourneyTask.ImageUrl += "?rand=" + strconv.FormatInt(time.Now().UnixNano(), 10)
} }
} else {
midjourneyTask.ImageUrl = originTask.ImageUrl
} }
midjourneyTask.Status = originTask.Status midjourneyTask.Status = originTask.Status
midjourneyTask.FailReason = originTask.FailReason midjourneyTask.FailReason = originTask.FailReason
@@ -180,7 +182,7 @@ func RelaySwapFace(c *gin.Context) *dto.MidjourneyResponse {
Description: "quota_not_enough", Description: "quota_not_enough",
} }
} }
requestURL := c.Request.URL.String() requestURL := getMjRequestPath(c.Request.URL.String())
baseURL := c.GetString("base_url") baseURL := c.GetString("base_url")
fullRequestURL := fmt.Sprintf("%s%s", baseURL, requestURL) fullRequestURL := fmt.Sprintf("%s%s", baseURL, requestURL)
mjResp, _, err := service.DoMidjourneyHttpRequest(c, time.Second*60, fullRequestURL) mjResp, _, err := service.DoMidjourneyHttpRequest(c, time.Second*60, fullRequestURL)
@@ -260,7 +262,7 @@ func RelayMidjourneyTaskImageSeed(c *gin.Context) *dto.MidjourneyResponse {
c.Set("channel_id", originTask.ChannelId) c.Set("channel_id", originTask.ChannelId)
c.Request.Header.Set("Authorization", fmt.Sprintf("Bearer %s", channel.Key)) c.Request.Header.Set("Authorization", fmt.Sprintf("Bearer %s", channel.Key))
requestURL := c.Request.URL.String() requestURL := getMjRequestPath(c.Request.URL.String())
fullRequestURL := fmt.Sprintf("%s%s", channel.GetBaseURL(), requestURL) fullRequestURL := fmt.Sprintf("%s%s", channel.GetBaseURL(), requestURL)
midjResponseWithStatus, _, err := service.DoMidjourneyHttpRequest(c, time.Second*30, fullRequestURL) midjResponseWithStatus, _, err := service.DoMidjourneyHttpRequest(c, time.Second*30, fullRequestURL)
if err != nil { if err != nil {
@@ -440,7 +442,7 @@ func RelayMidjourneySubmit(c *gin.Context, relayMode int) *dto.MidjourneyRespons
} }
//baseURL := common.ChannelBaseURLs[channelType] //baseURL := common.ChannelBaseURLs[channelType]
requestURL := c.Request.URL.String() requestURL := getMjRequestPath(c.Request.URL.String())
baseURL := c.GetString("base_url") baseURL := c.GetString("base_url")
@@ -605,3 +607,15 @@ type taskChangeParams struct {
Action string Action string
Index int Index int
} }
func getMjRequestPath(path string) string {
requestURL := path
if strings.Contains(requestURL, "/mj-") {
urls := strings.Split(requestURL, "/mj/")
if len(urls) < 2 {
return requestURL
}
requestURL = "/mj/" + urls[1]
}
return requestURL
}
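getMjRequestPath strips the mode segment (e.g. /mj-fast, /mj-relax) that the new /:mode/mj route group puts in front of the real Midjourney path, so the URL forwarded upstream always starts with /mj/. A runnable demo using the function exactly as added above:

    package main

    import (
        "fmt"
        "strings"
    )

    // Copied from the diff above, for illustration.
    func getMjRequestPath(path string) string {
        requestURL := path
        if strings.Contains(requestURL, "/mj-") {
            urls := strings.Split(requestURL, "/mj/")
            if len(urls) < 2 {
                return requestURL
            }
            requestURL = "/mj/" + urls[1]
        }
        return requestURL
    }

    func main() {
        fmt.Println(getMjRequestPath("/mj-fast/mj/submit/imagine")) // /mj/submit/imagine
        fmt.Println(getMjRequestPath("/mj/submit/imagine"))         // unchanged
    }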


@@ -10,6 +10,7 @@ import (
"math" "math"
"net/http" "net/http"
"one-api/common" "one-api/common"
"one-api/constant"
"one-api/dto" "one-api/dto"
"one-api/model" "one-api/model"
relaycommon "one-api/relay/common" relaycommon "one-api/relay/common"
@@ -71,7 +72,7 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
textRequest, err := getAndValidateTextRequest(c, relayInfo) textRequest, err := getAndValidateTextRequest(c, relayInfo)
if err != nil { if err != nil {
common.LogError(c, fmt.Sprintf("getAndValidateTextRequest failed: %s", err.Error())) common.LogError(c, fmt.Sprintf("getAndValidateTextRequest failed: %s", err.Error()))
return service.OpenAIErrorWrapper(err, "invalid_text_request", http.StatusBadRequest) return service.OpenAIErrorWrapperLocal(err, "invalid_text_request", http.StatusBadRequest)
} }
// map model name // map model name
@@ -81,7 +82,7 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
modelMap := make(map[string]string) modelMap := make(map[string]string)
err := json.Unmarshal([]byte(modelMapping), &modelMap) err := json.Unmarshal([]byte(modelMapping), &modelMap)
if err != nil { if err != nil {
return service.OpenAIErrorWrapper(err, "unmarshal_model_mapping_failed", http.StatusInternalServerError) return service.OpenAIErrorWrapperLocal(err, "unmarshal_model_mapping_failed", http.StatusInternalServerError)
} }
if modelMap[textRequest.Model] != "" { if modelMap[textRequest.Model] != "" {
textRequest.Model = modelMap[textRequest.Model] textRequest.Model = modelMap[textRequest.Model]
@@ -96,10 +97,14 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
var preConsumedQuota int var preConsumedQuota int
var ratio float64 var ratio float64
var modelRatio float64 var modelRatio float64
promptTokens, err := getPromptTokens(textRequest, relayInfo) //err := service.SensitiveWordsCheck(textRequest)
promptTokens, err, sensitiveTrigger := getPromptTokens(textRequest, relayInfo)
// count messages token error 计算promptTokens错误 // count messages token error 计算promptTokens错误
if err != nil { if err != nil {
if sensitiveTrigger {
return service.OpenAIErrorWrapperLocal(err, "sensitive_words_detected", http.StatusBadRequest)
}
return service.OpenAIErrorWrapper(err, "count_token_messages_failed", http.StatusInternalServerError) return service.OpenAIErrorWrapper(err, "count_token_messages_failed", http.StatusInternalServerError)
} }
@@ -149,59 +154,68 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
requestBody = bytes.NewBuffer(jsonData) requestBody = bytes.NewBuffer(jsonData)
} }
statusCodeMappingStr := c.GetString("status_code_mapping")
resp, err := adaptor.DoRequest(c, relayInfo, requestBody) resp, err := adaptor.DoRequest(c, relayInfo, requestBody)
if err != nil { if err != nil {
return service.OpenAIErrorWrapper(err, "do_request_failed", http.StatusInternalServerError) return service.OpenAIErrorWrapper(err, "do_request_failed", http.StatusInternalServerError)
} }
relayInfo.IsStream = relayInfo.IsStream || strings.HasPrefix(resp.Header.Get("Content-Type"), "text/event-stream")
if resp != nil {
relayInfo.IsStream = relayInfo.IsStream || strings.HasPrefix(resp.Header.Get("Content-Type"), "text/event-stream")
if resp.StatusCode != http.StatusOK { if resp.StatusCode != http.StatusOK {
returnPreConsumedQuota(c, relayInfo.TokenId, userQuota, preConsumedQuota) returnPreConsumedQuota(c, relayInfo.TokenId, userQuota, preConsumedQuota)
return service.RelayErrorHandler(resp) openaiErr := service.RelayErrorHandler(resp)
// reset status code 重置状态码
service.ResetStatusCode(openaiErr, statusCodeMappingStr)
return openaiErr
}
} }
usage, openaiErr := adaptor.DoResponse(c, resp, relayInfo) usage, openaiErr := adaptor.DoResponse(c, resp, relayInfo)
if openaiErr != nil { if openaiErr != nil {
returnPreConsumedQuota(c, relayInfo.TokenId, userQuota, preConsumedQuota) returnPreConsumedQuota(c, relayInfo.TokenId, userQuota, preConsumedQuota)
// reset status code 重置状态码
service.ResetStatusCode(openaiErr, statusCodeMappingStr)
return openaiErr return openaiErr
} }
postConsumeQuota(c, relayInfo, *textRequest, usage, ratio, preConsumedQuota, userQuota, modelRatio, groupRatio, modelPrice) postConsumeQuota(c, relayInfo, *textRequest, usage, ratio, preConsumedQuota, userQuota, modelRatio, groupRatio, modelPrice)
return nil return nil
} }
func getPromptTokens(textRequest *dto.GeneralOpenAIRequest, info *relaycommon.RelayInfo) (int, error) { func getPromptTokens(textRequest *dto.GeneralOpenAIRequest, info *relaycommon.RelayInfo) (int, error, bool) {
var promptTokens int var promptTokens int
var err error var err error
var sensitiveTrigger bool
checkSensitive := constant.ShouldCheckPromptSensitive()
switch info.RelayMode { switch info.RelayMode {
case relayconstant.RelayModeChatCompletions: case relayconstant.RelayModeChatCompletions:
promptTokens, err = service.CountTokenMessages(textRequest.Messages, textRequest.Model) promptTokens, err, sensitiveTrigger = service.CountTokenChatRequest(*textRequest, textRequest.Model, checkSensitive)
case relayconstant.RelayModeCompletions: case relayconstant.RelayModeCompletions:
promptTokens, err = service.CountTokenInput(textRequest.Prompt, textRequest.Model), nil promptTokens, err, sensitiveTrigger = service.CountTokenInput(textRequest.Prompt, textRequest.Model, checkSensitive)
case relayconstant.RelayModeModerations: case relayconstant.RelayModeModerations:
promptTokens, err = service.CountTokenInput(textRequest.Input, textRequest.Model), nil promptTokens, err, sensitiveTrigger = service.CountTokenInput(textRequest.Input, textRequest.Model, checkSensitive)
case relayconstant.RelayModeEmbeddings: case relayconstant.RelayModeEmbeddings:
promptTokens, err = service.CountTokenInput(textRequest.Input, textRequest.Model), nil promptTokens, err, sensitiveTrigger = service.CountTokenInput(textRequest.Input, textRequest.Model, checkSensitive)
default: default:
err = errors.New("unknown relay mode") err = errors.New("unknown relay mode")
promptTokens = 0 promptTokens = 0
} }
info.PromptTokens = promptTokens info.PromptTokens = promptTokens
return promptTokens, err return promptTokens, err, sensitiveTrigger
} }
// 预扣费并返回用户剩余配额 // 预扣费并返回用户剩余配额
func preConsumeQuota(c *gin.Context, preConsumedQuota int, relayInfo *relaycommon.RelayInfo) (int, int, *dto.OpenAIErrorWithStatusCode) { func preConsumeQuota(c *gin.Context, preConsumedQuota int, relayInfo *relaycommon.RelayInfo) (int, int, *dto.OpenAIErrorWithStatusCode) {
userQuota, err := model.CacheGetUserQuota(relayInfo.UserId) userQuota, err := model.CacheGetUserQuota(relayInfo.UserId)
if err != nil { if err != nil {
return 0, 0, service.OpenAIErrorWrapper(err, "get_user_quota_failed", http.StatusInternalServerError) return 0, 0, service.OpenAIErrorWrapperLocal(err, "get_user_quota_failed", http.StatusInternalServerError)
} }
if userQuota <= 0 || userQuota-preConsumedQuota < 0 { if userQuota <= 0 || userQuota-preConsumedQuota < 0 {
return 0, 0, service.OpenAIErrorWrapper(errors.New("user quota is not enough"), "insufficient_user_quota", http.StatusForbidden) return 0, 0, service.OpenAIErrorWrapperLocal(errors.New("user quota is not enough"), "insufficient_user_quota", http.StatusForbidden)
} }
err = model.CacheDecreaseUserQuota(relayInfo.UserId, preConsumedQuota) err = model.CacheDecreaseUserQuota(relayInfo.UserId, preConsumedQuota)
if err != nil { if err != nil {
return 0, 0, service.OpenAIErrorWrapper(err, "decrease_user_quota_failed", http.StatusInternalServerError) return 0, 0, service.OpenAIErrorWrapperLocal(err, "decrease_user_quota_failed", http.StatusInternalServerError)
} }
if userQuota > 100*preConsumedQuota { if userQuota > 100*preConsumedQuota {
// 用户额度充足,判断令牌额度是否充足 // 用户额度充足,判断令牌额度是否充足
@@ -223,7 +237,7 @@ func preConsumeQuota(c *gin.Context, preConsumedQuota int, relayInfo *relaycommo
if preConsumedQuota > 0 { if preConsumedQuota > 0 {
userQuota, err = model.PreConsumeTokenQuota(relayInfo.TokenId, preConsumedQuota) userQuota, err = model.PreConsumeTokenQuota(relayInfo.TokenId, preConsumedQuota)
if err != nil { if err != nil {
return 0, 0, service.OpenAIErrorWrapper(err, "pre_consume_token_quota_failed", http.StatusForbidden) return 0, 0, service.OpenAIErrorWrapperLocal(err, "pre_consume_token_quota_failed", http.StatusForbidden)
} }
} }
return preConsumedQuota, userQuota, nil return preConsumedQuota, userQuota, nil
@@ -241,7 +255,10 @@ func returnPreConsumedQuota(c *gin.Context, tokenId int, userQuota int, preConsu
} }
} }
func postConsumeQuota(ctx *gin.Context, relayInfo *relaycommon.RelayInfo, textRequest dto.GeneralOpenAIRequest, usage *dto.Usage, ratio float64, preConsumedQuota int, userQuota int, modelRatio float64, groupRatio float64, modelPrice float64) { func postConsumeQuota(ctx *gin.Context, relayInfo *relaycommon.RelayInfo, textRequest dto.GeneralOpenAIRequest,
usage *dto.Usage, ratio float64, preConsumedQuota int, userQuota int, modelRatio float64, groupRatio float64,
modelPrice float64) {
useTimeSeconds := time.Now().Unix() - relayInfo.StartTime.Unix() useTimeSeconds := time.Now().Unix() - relayInfo.StartTime.Unix()
promptTokens := usage.PromptTokens promptTokens := usage.PromptTokens
completionTokens := usage.CompletionTokens completionTokens := usage.CompletionTokens
@@ -275,12 +292,17 @@ func postConsumeQuota(ctx *gin.Context, relayInfo *relaycommon.RelayInfo, textRe
logContent += fmt.Sprintf("(可能是上游超时)") logContent += fmt.Sprintf("(可能是上游超时)")
common.LogError(ctx, fmt.Sprintf("total tokens is 0, cannot consume quota, userId %d, channelId %d, tokenId %d, model %s pre-consumed quota %d", relayInfo.UserId, relayInfo.ChannelId, relayInfo.TokenId, textRequest.Model, preConsumedQuota)) common.LogError(ctx, fmt.Sprintf("total tokens is 0, cannot consume quota, userId %d, channelId %d, tokenId %d, model %s pre-consumed quota %d", relayInfo.UserId, relayInfo.ChannelId, relayInfo.TokenId, textRequest.Model, preConsumedQuota))
} else { } else {
//if sensitiveResp != nil {
// logContent += fmt.Sprintf(",敏感词:%s", strings.Join(sensitiveResp.SensitiveWords, ", "))
//}
quotaDelta := quota - preConsumedQuota quotaDelta := quota - preConsumedQuota
if quotaDelta != 0 {
err := model.PostConsumeTokenQuota(relayInfo.TokenId, userQuota, quotaDelta, preConsumedQuota, true) err := model.PostConsumeTokenQuota(relayInfo.TokenId, userQuota, quotaDelta, preConsumedQuota, true)
if err != nil { if err != nil {
common.LogError(ctx, "error consuming token remain quota: "+err.Error()) common.LogError(ctx, "error consuming token remain quota: "+err.Error())
} }
err = model.CacheUpdateUserQuota(relayInfo.UserId) }
err := model.CacheUpdateUserQuota(relayInfo.UserId)
if err != nil { if err != nil {
common.LogError(ctx, "error update user quota cache: "+err.Error()) common.LogError(ctx, "error update user quota cache: "+err.Error())
} }


@@ -3,11 +3,14 @@ package relay
import ( import (
"one-api/relay/channel" "one-api/relay/channel"
"one-api/relay/channel/ali" "one-api/relay/channel/ali"
"one-api/relay/channel/aws"
"one-api/relay/channel/baidu" "one-api/relay/channel/baidu"
"one-api/relay/channel/claude" "one-api/relay/channel/claude"
"one-api/relay/channel/gemini" "one-api/relay/channel/gemini"
"one-api/relay/channel/ollama"
"one-api/relay/channel/openai" "one-api/relay/channel/openai"
"one-api/relay/channel/palm" "one-api/relay/channel/palm"
"one-api/relay/channel/perplexity"
"one-api/relay/channel/tencent" "one-api/relay/channel/tencent"
"one-api/relay/channel/xunfei" "one-api/relay/channel/xunfei"
"one-api/relay/channel/zhipu" "one-api/relay/channel/zhipu"
@@ -39,6 +42,12 @@ func GetAdaptor(apiType int) channel.Adaptor {
return &zhipu.Adaptor{} return &zhipu.Adaptor{}
case constant.APITypeZhipu_v4: case constant.APITypeZhipu_v4:
return &zhipu_4v.Adaptor{} return &zhipu_4v.Adaptor{}
case constant.APITypeOllama:
return &ollama.Adaptor{}
case constant.APITypePerplexity:
return &perplexity.Adaptor{}
case constant.APITypeAws:
return &aws.Adaptor{}
} }
return nil return nil
} }


@@ -17,7 +17,7 @@ func SetApiRouter(router *gin.Engine) {
apiRouter.GET("/status/test", middleware.AdminAuth(), controller.TestStatus) apiRouter.GET("/status/test", middleware.AdminAuth(), controller.TestStatus)
apiRouter.GET("/notice", controller.GetNotice) apiRouter.GET("/notice", controller.GetNotice)
apiRouter.GET("/about", controller.GetAbout) apiRouter.GET("/about", controller.GetAbout)
apiRouter.GET("/midjourney", controller.GetMidjourney) //apiRouter.GET("/midjourney", controller.GetMidjourney)
apiRouter.GET("/home_page_content", controller.GetHomePageContent) apiRouter.GET("/home_page_content", controller.GetHomePageContent)
apiRouter.GET("/verification", middleware.CriticalRateLimit(), middleware.TurnstileCheck(), controller.SendEmailVerification) apiRouter.GET("/verification", middleware.CriticalRateLimit(), middleware.TurnstileCheck(), controller.SendEmailVerification)
apiRouter.GET("/reset_password", middleware.CriticalRateLimit(), middleware.TurnstileCheck(), controller.SendPasswordResetEmail) apiRouter.GET("/reset_password", middleware.CriticalRateLimit(), middleware.TurnstileCheck(), controller.SendPasswordResetEmail)


@@ -43,7 +43,16 @@ func SetRelayRouter(router *gin.Engine) {
relayV1Router.DELETE("/models/:model", controller.RelayNotImplemented) relayV1Router.DELETE("/models/:model", controller.RelayNotImplemented)
relayV1Router.POST("/moderations", controller.Relay) relayV1Router.POST("/moderations", controller.Relay)
} }
relayMjRouter := router.Group("/mj") relayMjRouter := router.Group("/mj")
registerMjRouterGroup(relayMjRouter)
relayMjModeRouter := router.Group("/:mode/mj")
registerMjRouterGroup(relayMjModeRouter)
//relayMjRouter.Use()
}
func registerMjRouterGroup(relayMjRouter *gin.RouterGroup) {
relayMjRouter.GET("/image/:id", relay.RelayMidjourneyImage) relayMjRouter.GET("/image/:id", relay.RelayMidjourneyImage)
relayMjRouter.Use(middleware.TokenAuth(), middleware.Distribute()) relayMjRouter.Use(middleware.TokenAuth(), middleware.Distribute())
{ {
@@ -61,5 +70,4 @@ func SetRelayRouter(router *gin.Engine) {
relayMjRouter.POST("/task/list-by-condition", controller.RelayMidjourney) relayMjRouter.POST("/task/list-by-condition", controller.RelayMidjourney)
relayMjRouter.POST("/insight-face/swap", controller.RelayMidjourney) relayMjRouter.POST("/insight-face/swap", controller.RelayMidjourney)
} }
//relayMjRouter.Use()
} }


@@ -16,9 +16,9 @@ func SetWebRouter(router *gin.Engine, buildFS embed.FS, indexPage []byte) {
router.Use(gzip.Gzip(gzip.DefaultCompression)) router.Use(gzip.Gzip(gzip.DefaultCompression))
router.Use(middleware.GlobalWebRateLimit()) router.Use(middleware.GlobalWebRateLimit())
router.Use(middleware.Cache()) router.Use(middleware.Cache())
router.Use(static.Serve("/", common.EmbedFolder(buildFS, "web/build"))) router.Use(static.Serve("/", common.EmbedFolder(buildFS, "web/dist")))
router.NoRoute(func(c *gin.Context) { router.NoRoute(func(c *gin.Context) {
if strings.HasPrefix(c.Request.RequestURI, "/v1") || strings.HasPrefix(c.Request.RequestURI, "/api") { if strings.HasPrefix(c.Request.RequestURI, "/v1") || strings.HasPrefix(c.Request.RequestURI, "/api") || strings.HasPrefix(c.Request.RequestURI, "/assets") {
controller.RelayNotFound(c) controller.RelayNotFound(c)
return return
} }


@@ -6,6 +6,7 @@ import (
"one-api/common" "one-api/common"
relaymodel "one-api/dto" relaymodel "one-api/dto"
"one-api/model" "one-api/model"
"strings"
) )
// disable & notify // disable & notify
@@ -33,7 +34,30 @@ func ShouldDisableChannel(err *relaymodel.OpenAIError, statusCode int) bool {
if statusCode == http.StatusUnauthorized { if statusCode == http.StatusUnauthorized {
return true return true
} }
if err.Type == "insufficient_quota" || err.Code == "invalid_api_key" || err.Code == "account_deactivated" || err.Code == "billing_not_active" { switch err.Code {
case "invalid_api_key":
return true
case "account_deactivated":
return true
case "billing_not_active":
return true
}
switch err.Type {
case "insufficient_quota":
return true
// https://docs.anthropic.com/claude/reference/errors
case "authentication_error":
return true
case "permission_error":
return true
case "forbidden":
return true
}
if strings.HasPrefix(err.Message, "Your credit balance is too low") { // anthropic
return true
} else if strings.HasPrefix(err.Message, "This organization has been disabled.") {
return true
} else if strings.HasPrefix(err.Message, "You exceeded your current quota") {
return true return true
} }
return false return false
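Channel auto-disabling now also reacts to Anthropic-style error types and a few well-known upstream messages. A reduced, self-contained version of the same check (dto.OpenAIError is trimmed to the fields the check reads, and two of the message prefixes are omitted for brevity):

    package main

    import (
        "fmt"
        "net/http"
        "strings"
    )

    // Reduced stand-in for dto.OpenAIError.
    type OpenAIError struct {
        Type    string
        Code    string
        Message string
    }

    func shouldDisable(err OpenAIError, statusCode int) bool {
        if statusCode == http.StatusUnauthorized {
            return true
        }
        switch err.Code {
        case "invalid_api_key", "account_deactivated", "billing_not_active":
            return true
        }
        switch err.Type {
        case "insufficient_quota", "authentication_error", "permission_error", "forbidden":
            return true
        }
        // (the real check also matches two more message prefixes)
        return strings.HasPrefix(err.Message, "Your credit balance is too low")
    }

    func main() {
        fmt.Println(shouldDisable(OpenAIError{Type: "authentication_error"}, http.StatusBadRequest))  // true
        fmt.Println(shouldDisable(OpenAIError{Type: "rate_limit_error"}, http.StatusTooManyRequests)) // false
    }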


@@ -1,10 +1,13 @@
package service package service
import "one-api/common" import (
"one-api/common"
"one-api/constant"
)
func GetCallbackAddress() string { func GetCallbackAddress() string {
if common.CustomCallbackAddress == "" { if constant.CustomCallbackAddress == "" {
return common.ServerAddress return common.ServerAddress
} }
return common.CustomCallbackAddress return constant.CustomCallbackAddress
} }


@@ -29,7 +29,7 @@ func MidjourneyErrorWithStatusCodeWrapper(code int, desc string, statusCode int)
func OpenAIErrorWrapper(err error, code string, statusCode int) *dto.OpenAIErrorWithStatusCode { func OpenAIErrorWrapper(err error, code string, statusCode int) *dto.OpenAIErrorWithStatusCode {
text := err.Error() text := err.Error()
// 定义一个正则表达式匹配URL // 定义一个正则表达式匹配URL
if strings.Contains(text, "Post") { if strings.Contains(text, "Post") || strings.Contains(text, "dial") {
common.SysLog(fmt.Sprintf("error: %s", text)) common.SysLog(fmt.Sprintf("error: %s", text))
text = "请求上游地址失败" text = "请求上游地址失败"
} }
@@ -46,6 +46,12 @@ func OpenAIErrorWrapper(err error, code string, statusCode int) *dto.OpenAIError
} }
} }
func OpenAIErrorWrapperLocal(err error, code string, statusCode int) *dto.OpenAIErrorWithStatusCode {
openaiErr := OpenAIErrorWrapper(err, code, statusCode)
openaiErr.LocalError = true
return openaiErr
}
func RelayErrorHandler(resp *http.Response) (errWithStatusCode *dto.OpenAIErrorWithStatusCode) { func RelayErrorHandler(resp *http.Response) (errWithStatusCode *dto.OpenAIErrorWithStatusCode) {
errWithStatusCode = &dto.OpenAIErrorWithStatusCode{ errWithStatusCode = &dto.OpenAIErrorWithStatusCode{
StatusCode: resp.StatusCode, StatusCode: resp.StatusCode,
@@ -80,3 +86,22 @@ func RelayErrorHandler(resp *http.Response) (errWithStatusCode *dto.OpenAIErrorW
} }
return return
} }
func ResetStatusCode(openaiErr *dto.OpenAIErrorWithStatusCode, statusCodeMappingStr string) {
if statusCodeMappingStr == "" || statusCodeMappingStr == "{}" {
return
}
statusCodeMapping := make(map[string]string)
err := json.Unmarshal([]byte(statusCodeMappingStr), &statusCodeMapping)
if err != nil {
return
}
if openaiErr.StatusCode == http.StatusOK {
return
}
codeStr := strconv.Itoa(openaiErr.StatusCode)
if _, ok := statusCodeMapping[codeStr]; ok {
intCode, _ := strconv.Atoi(statusCodeMapping[codeStr])
openaiErr.StatusCode = intCode
}
}
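status_code_mapping is a per-channel JSON string (read from the gin context in the TextHelper hunk above); ResetStatusCode rewrites the upstream status code before the error is returned to the client. A small self-contained illustration of the same mapping step:

    package main

    import (
        "encoding/json"
        "fmt"
        "strconv"
    )

    func resetStatusCode(statusCode int, mappingStr string) int {
        if mappingStr == "" || mappingStr == "{}" {
            return statusCode
        }
        mapping := make(map[string]string)
        if err := json.Unmarshal([]byte(mappingStr), &mapping); err != nil {
            return statusCode
        }
        // (the real helper also leaves 200 responses untouched)
        if target, ok := mapping[strconv.Itoa(statusCode)]; ok {
            if code, err := strconv.Atoi(target); err == nil {
                return code
            }
        }
        return statusCode
    }

    func main() {
        // e.g. surface upstream 429s to clients as 503
        fmt.Println(resetStatusCode(429, `{"429":"503"}`)) // 503
        fmt.Println(resetStatusCode(400, `{"429":"503"}`)) // 400
    }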


@@ -7,6 +7,7 @@ import (
"io" "io"
"log" "log"
"net/http" "net/http"
"one-api/common"
"one-api/constant" "one-api/constant"
"one-api/dto" "one-api/dto"
relayconstant "one-api/relay/constant" relayconstant "one-api/relay/constant"
@@ -171,6 +172,15 @@ func DoMidjourneyHttpRequest(c *gin.Context, timeout time.Duration, fullRequestU
//req, err := http.NewRequest(c.Request.Method, fullRequestURL, requestBody) //req, err := http.NewRequest(c.Request.Method, fullRequestURL, requestBody)
// make new request with mapResult // make new request with mapResult
} }
if constant.MjModeClearEnabled {
if prompt, ok := mapResult["prompt"].(string); ok {
prompt = strings.Replace(prompt, "--fast", "", -1)
prompt = strings.Replace(prompt, "--relax", "", -1)
prompt = strings.Replace(prompt, "--turbo", "", -1)
mapResult["prompt"] = prompt
}
}
reqBody, err := json.Marshal(mapResult) reqBody, err := json.Marshal(mapResult)
if err != nil { if err != nil {
return MidjourneyErrorWithStatusCodeWrapper(constant.MjErrorUnknown, "marshal_request_body_failed", http.StatusInternalServerError), nullBytes, err return MidjourneyErrorWithStatusCodeWrapper(constant.MjErrorUnknown, "marshal_request_body_failed", http.StatusInternalServerError), nullBytes, err
@@ -184,10 +194,15 @@ func DoMidjourneyHttpRequest(c *gin.Context, timeout time.Duration, fullRequestU
req = req.WithContext(ctx) req = req.WithContext(ctx)
req.Header.Set("Content-Type", c.Request.Header.Get("Content-Type")) req.Header.Set("Content-Type", c.Request.Header.Get("Content-Type"))
req.Header.Set("Accept", c.Request.Header.Get("Accept")) req.Header.Set("Accept", c.Request.Header.Get("Accept"))
req.Header.Set("mj-api-secret", strings.Split(c.Request.Header.Get("Authorization"), " ")[1]) auth := c.Request.Header.Get("Authorization")
if auth != "" {
auth = strings.TrimPrefix(auth, "Bearer ")
req.Header.Set("mj-api-secret", auth)
}
defer cancel() defer cancel()
resp, err := GetHttpClient().Do(req) resp, err := GetHttpClient().Do(req)
if err != nil { if err != nil {
common.SysError("do request failed: " + err.Error())
return MidjourneyErrorWithStatusCodeWrapper(constant.MjErrorUnknown, "do_request_failed", http.StatusInternalServerError), nullBytes, err return MidjourneyErrorWithStatusCodeWrapper(constant.MjErrorUnknown, "do_request_failed", http.StatusInternalServerError), nullBytes, err
} }
statusCode := resp.StatusCode statusCode := resp.StatusCode
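With MjModeClearEnabled the relay strips the --fast/--relax/--turbo switches from the prompt before forwarding, presumably because the speed mode is already selected by the mode-prefixed route. The replacement is a plain strings.Replace, so surrounding spaces are kept:

    package main

    import (
        "fmt"
        "strings"
    )

    func main() {
        prompt := "a cat in a spacesuit --fast --v 6"
        for _, mode := range []string{"--fast", "--relax", "--turbo"} {
            prompt = strings.Replace(prompt, mode, "", -1)
        }
        fmt.Printf("%q\n", prompt) // "a cat in a spacesuit  --v 6"
    }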

service/sensitive.go (new file, +71 lines)

@@ -0,0 +1,71 @@
package service
import (
"bytes"
"fmt"
"github.com/anknown/ahocorasick"
"one-api/constant"
"strings"
)
// SensitiveWordContains 是否包含敏感词,返回是否包含敏感词和敏感词列表
func SensitiveWordContains(text string) (bool, []string) {
if len(constant.SensitiveWords) == 0 {
return false, nil
}
checkText := strings.ToLower(text)
// 构建一个AC自动机
m := initAc()
hits := m.MultiPatternSearch([]rune(checkText), false)
if len(hits) > 0 {
words := make([]string, 0)
for _, hit := range hits {
words = append(words, string(hit.Word))
}
return true, words
}
return false, nil
}
// SensitiveWordReplace 敏感词替换,返回是否包含敏感词和替换后的文本
func SensitiveWordReplace(text string, returnImmediately bool) (bool, []string, string) {
if len(constant.SensitiveWords) == 0 {
return false, nil, text
}
checkText := strings.ToLower(text)
m := initAc()
hits := m.MultiPatternSearch([]rune(checkText), returnImmediately)
if len(hits) > 0 {
words := make([]string, 0)
for _, hit := range hits {
pos := hit.Pos
word := string(hit.Word)
text = text[:pos] + "**###**" + text[pos+len(word):]
words = append(words, word)
}
return true, words, text
}
return false, nil, text
}
func initAc() *goahocorasick.Machine {
m := new(goahocorasick.Machine)
dict := readRunes()
if err := m.Build(dict); err != nil {
fmt.Println(err)
return nil
}
return m
}
func readRunes() [][]rune {
var dict [][]rune
for _, word := range constant.SensitiveWords {
word = strings.ToLower(word)
l := bytes.TrimSpace([]byte(word))
dict = append(dict, bytes.Runes(l))
}
return dict
}
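A usage sketch for the new helpers, assuming constant.SensitiveWords is already populated. One detail worth verifying: SensitiveWordReplace takes hit.Pos, which is a rune index into the lower-cased copy, and uses it to slice the original string by bytes, so the replacement position can drift for non-ASCII input.

    package service

    import "fmt"

    // Usage sketch only; assumes constant.SensitiveWords already holds the word list.
    func sensitiveWordsUsageSketch(prompt string) {
        if contains, words := SensitiveWordContains(prompt); contains {
            fmt.Println("blocked words:", words)
        }
        if hit, words, cleaned := SensitiveWordReplace(prompt, false); hit {
            fmt.Println("masked:", words, cleaned)
        }
    }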


@@ -29,7 +29,7 @@ func InitTokenEncoders() {
if err != nil { if err != nil {
common.FatalLog(fmt.Sprintf("failed to get gpt-4 token encoder: %s", err.Error())) common.FatalLog(fmt.Sprintf("failed to get gpt-4 token encoder: %s", err.Error()))
} }
for model, _ := range common.ModelRatio { for model, _ := range common.DefaultModelRatio {
if strings.HasPrefix(model, "gpt-3.5") { if strings.HasPrefix(model, "gpt-3.5") {
tokenEncoderMap[model] = gpt35TokenEncoder tokenEncoderMap[model] = gpt35TokenEncoder
} else if strings.HasPrefix(model, "gpt-4") { } else if strings.HasPrefix(model, "gpt-4") {
@@ -116,7 +116,42 @@ func getImageToken(imageUrl *dto.MessageImageUrl) (int, error) {
return tiles*170 + 85, nil return tiles*170 + 85, nil
} }
func CountTokenMessages(messages []dto.Message, model string) (int, error) { func CountTokenChatRequest(request dto.GeneralOpenAIRequest, model string, checkSensitive bool) (int, error, bool) {
tkm := 0
msgTokens, err, b := CountTokenMessages(request.Messages, model, checkSensitive)
if err != nil {
return 0, err, b
}
tkm += msgTokens
if request.Tools != nil {
toolsData, _ := json.Marshal(request.Tools)
var openaiTools []dto.OpenAITools
err := json.Unmarshal(toolsData, &openaiTools)
if err != nil {
return 0, errors.New(fmt.Sprintf("count tools token fail: %s", err.Error())), false
}
countStr := ""
for _, tool := range openaiTools {
countStr = tool.Function.Name
if tool.Function.Description != "" {
countStr += tool.Function.Description
}
if tool.Function.Parameters != nil {
countStr += fmt.Sprintf("%v", tool.Function.Parameters)
}
}
toolTokens, err, _ := CountTokenInput(countStr, model, false)
if err != nil {
return 0, err, false
}
tkm += 8
tkm += toolTokens
}
return tkm, nil, false
}
func CountTokenMessages(messages []dto.Message, model string, checkSensitive bool) (int, error, bool) {
//recover when panic //recover when panic
tokenEncoder := getTokenEncoder(model) tokenEncoder := getTokenEncoder(model)
// Reference: // Reference:
@@ -142,8 +177,15 @@ func CountTokenMessages(messages []dto.Message, model string) (int, error) {
if err := json.Unmarshal(message.Content, &arrayContent); err != nil { if err := json.Unmarshal(message.Content, &arrayContent); err != nil {
var stringContent string var stringContent string
if err := json.Unmarshal(message.Content, &stringContent); err != nil { if err := json.Unmarshal(message.Content, &stringContent); err != nil {
return 0, err return 0, err, false
} else { } else {
if checkSensitive {
contains, words := SensitiveWordContains(stringContent)
if contains {
err := fmt.Errorf("message contains sensitive words: [%s]", strings.Join(words, ", "))
return 0, err, true
}
}
tokenNum += getTokenNum(tokenEncoder, stringContent) tokenNum += getTokenNum(tokenEncoder, stringContent)
if message.Name != nil { if message.Name != nil {
tokenNum += tokensPerName tokenNum += tokensPerName
@@ -174,7 +216,7 @@ func CountTokenMessages(messages []dto.Message, model string) (int, error) {
imageTokenNum, err = getImageToken(&imageUrl) imageTokenNum, err = getImageToken(&imageUrl)
} }
if err != nil { if err != nil {
return 0, err return 0, err, false
} }
} }
tokenNum += imageTokenNum tokenNum += imageTokenNum
@@ -187,32 +229,63 @@ func CountTokenMessages(messages []dto.Message, model string) (int, error) {
} }
} }
tokenNum += 3 // Every reply is primed with <|start|>assistant<|message|> tokenNum += 3 // Every reply is primed with <|start|>assistant<|message|>
return tokenNum, nil return tokenNum, nil, false
} }
func CountTokenInput(input any, model string) int { func CountTokenInput(input any, model string, check bool) (int, error, bool) {
switch v := input.(type) { switch v := input.(type) {
case string: case string:
return CountTokenText(v, model) return CountTokenText(v, model, check)
case []string: case []string:
text := "" text := ""
for _, s := range v { for _, s := range v {
text += s text += s
} }
return CountTokenText(text, model) return CountTokenText(text, model, check)
} }
return 0 return CountTokenInput(fmt.Sprintf("%v", input), model, check)
} }
func CountAudioToken(text string, model string) int { func CountTokenStreamChoices(messages []dto.ChatCompletionsStreamResponseChoice, model string) int {
tokens := 0
for _, message := range messages {
tkm, _, _ := CountTokenInput(message.Delta.Content, model, false)
tokens += tkm
if message.Delta.ToolCalls != nil {
for _, tool := range message.Delta.ToolCalls {
tkm, _, _ := CountTokenInput(tool.Function.Name, model, false)
tokens += tkm
tkm, _, _ = CountTokenInput(tool.Function.Arguments, model, false)
tokens += tkm
}
}
}
return tokens
}
func CountAudioToken(text string, model string, check bool) (int, error, bool) {
if strings.HasPrefix(model, "tts") { if strings.HasPrefix(model, "tts") {
return utf8.RuneCountInString(text) contains, words := SensitiveWordContains(text)
if contains {
return utf8.RuneCountInString(text), fmt.Errorf("input contains sensitive words: [%s]", strings.Join(words, ",")), true
}
return utf8.RuneCountInString(text), nil, false
} else { } else {
return CountTokenText(text, model) return CountTokenText(text, model, check)
} }
} }
func CountTokenText(text string, model string) int { // CountTokenText 统计文本的token数量仅当文本包含敏感词返回错误同时返回token数量
tokenEncoder := getTokenEncoder(model) func CountTokenText(text string, model string, check bool) (int, error, bool) {
return getTokenNum(tokenEncoder, text) var err error
var trigger bool
if check {
contains, words := SensitiveWordContains(text)
if contains {
err = fmt.Errorf("input contains sensitive words: [%s]", strings.Join(words, ","))
trigger = true
}
}
tokenEncoder := getTokenEncoder(model)
return getTokenNum(tokenEncoder, text), err, trigger
} }
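All the counting helpers now return (tokens, error, sensitiveTriggered), so callers can bill and filter in one pass. A caller-side sketch; the function names come from the diff, but the surrounding error handling is illustrative only:

    package service

    import "fmt"

    // Illustrative caller; mirrors how getPromptTokens consumes the new signature.
    func countPrompt(prompt string, model string, checkSensitive bool) (int, error) {
        tokens, err, sensitiveTrigger := CountTokenText(prompt, model, checkSensitive)
        if err != nil {
            if sensitiveTrigger {
                return 0, fmt.Errorf("prompt rejected: %w", err)
            }
            return 0, err
        }
        return tokens, nil
    }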


@@ -1,27 +1,26 @@
package service package service
import ( import (
"errors"
"one-api/dto" "one-api/dto"
"one-api/relay/constant"
) )
func GetPromptTokens(textRequest dto.GeneralOpenAIRequest, relayMode int) (int, error) { //func GetPromptTokens(textRequest dto.GeneralOpenAIRequest, relayMode int) (int, error) {
switch relayMode { // switch relayMode {
case constant.RelayModeChatCompletions: // case constant.RelayModeChatCompletions:
return CountTokenMessages(textRequest.Messages, textRequest.Model) // return CountTokenMessages(textRequest.Messages, textRequest.Model)
case constant.RelayModeCompletions: // case constant.RelayModeCompletions:
return CountTokenInput(textRequest.Prompt, textRequest.Model), nil // return CountTokenInput(textRequest.Prompt, textRequest.Model), nil
case constant.RelayModeModerations: // case constant.RelayModeModerations:
return CountTokenInput(textRequest.Input, textRequest.Model), nil // return CountTokenInput(textRequest.Input, textRequest.Model), nil
} // }
return 0, errors.New("unknown relay mode") // return 0, errors.New("unknown relay mode")
} //}
func ResponseText2Usage(responseText string, modeName string, promptTokens int) *dto.Usage { func ResponseText2Usage(responseText string, modeName string, promptTokens int) (*dto.Usage, error) {
usage := &dto.Usage{} usage := &dto.Usage{}
usage.PromptTokens = promptTokens usage.PromptTokens = promptTokens
usage.CompletionTokens = CountTokenText(responseText, modeName) ctkm, err, _ := CountTokenText(responseText, modeName, false)
usage.CompletionTokens = ctkm
usage.TotalTokens = usage.PromptTokens + usage.CompletionTokens usage.TotalTokens = usage.PromptTokens + usage.CompletionTokens
return usage return usage, err
} }

web/.prettierrc.mjs (new file, +1 line)

@@ -0,0 +1 @@
module.exports = require("@so1ve/prettier-config");

web/index.html (new file, +19 lines)

@@ -0,0 +1,19 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8" />
<link rel="icon" href="/logo.png" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="theme-color" content="#ffffff" />
<meta
name="description"
content="OpenAI 接口聚合管理,支持多种渠道包括 Azure可用于二次分发管理 key仅单可执行文件已打包好 Docker 镜像,一键部署,开箱即用"
/>
<title>New API</title>
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root"></div>
<script type="module" src="/src/index.js"></script>
</body>
</html>


@@ -2,6 +2,7 @@
"name": "react-template", "name": "react-template",
"version": "0.1.0", "version": "0.1.0",
"private": true, "private": true,
"type": "module",
"dependencies": { "dependencies": {
"@douyinfe/semi-icons": "^2.46.1", "@douyinfe/semi-icons": "^2.46.1",
"@douyinfe/semi-ui": "^2.46.1", "@douyinfe/semi-ui": "^2.46.1",
@@ -16,19 +17,18 @@
"react-dropzone": "^14.2.3", "react-dropzone": "^14.2.3",
"react-fireworks": "^1.0.4", "react-fireworks": "^1.0.4",
"react-router-dom": "^6.3.0", "react-router-dom": "^6.3.0",
"react-scripts": "5.0.1",
"react-telegram-login": "^1.1.2", "react-telegram-login": "^1.1.2",
"react-toastify": "^9.0.8", "react-toastify": "^9.0.8",
"react-turnstile": "^1.0.5", "react-turnstile": "^1.0.5",
"semantic-ui-css": "^2.5.0", "semantic-ui-offline": "^2.5.0",
"semantic-ui-react": "^2.1.3", "semantic-ui-react": "^2.1.3"
"usehooks-ts": "^2.9.1"
}, },
"scripts": { "scripts": {
"start": "react-scripts start", "dev": "vite",
"build": "react-scripts build", "build": "vite build",
"test": "react-scripts test", "lint": "prettier . --check",
"eject": "react-scripts eject" "lint:fix": "prettier . --write",
"preview": "vite preview"
}, },
"eslintConfig": { "eslintConfig": {
"extends": [ "extends": [
@@ -49,8 +49,11 @@
] ]
}, },
"devDependencies": { "devDependencies": {
"prettier": "2.8.8", "@so1ve/prettier-config": "^2.0.0",
"typescript": "4.4.2" "@vitejs/plugin-react": "^4.2.1",
"prettier": "^3.0.0",
"typescript": "4.4.2",
"vite": "^5.2.0"
}, },
"prettier": { "prettier": {
"singleQuote": true, "singleQuote": true,


@@ -1,18 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<link rel="icon" href="logo.png" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="theme-color" content="#ffffff" />
<meta
name="description"
content="OpenAI 接口聚合管理,支持多种渠道包括 Azure可用于二次分发管理 key仅单可执行文件已打包好 Docker 镜像,一键部署,开箱即用"
/>
<title>New API</title>
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root"></div>
</body>
</html>


@@ -22,9 +22,10 @@ import Log from './pages/Log';
import Chat from './pages/Chat'; import Chat from './pages/Chat';
import { Layout } from '@douyinfe/semi-ui'; import { Layout } from '@douyinfe/semi-ui';
import Midjourney from './pages/Midjourney'; import Midjourney from './pages/Midjourney';
import Detail from './pages/Detail'; // import Detail from './pages/Detail';
const Home = lazy(() => import('./pages/Home')); const Home = lazy(() => import('./pages/Home'));
const Detail = lazy(() => import('./pages/Detail'));
const About = lazy(() => import('./pages/About')); const About = lazy(() => import('./pages/About'));
function App() { function App() {
@@ -47,7 +48,7 @@ function App() {
} }
let logo = getLogo(); let logo = getLogo();
if (logo) { if (logo) {
let linkElement = document.querySelector('link[rel~=\'icon\']'); let linkElement = document.querySelector("link[rel~='icon']");
if (linkElement) { if (linkElement) {
linkElement.href = logo; linkElement.href = logo;
} }
@@ -59,7 +60,7 @@ function App() {
<Layout.Content> <Layout.Content>
<Routes> <Routes>
<Route <Route
path="/" path='/'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<Home /> <Home />
@@ -67,7 +68,7 @@ function App() {
} }
/> />
<Route <Route
path="/channel" path='/channel'
element={ element={
<PrivateRoute> <PrivateRoute>
<Channel /> <Channel />
@@ -75,7 +76,7 @@ function App() {
} }
/> />
<Route <Route
path="/channel/edit/:id" path='/channel/edit/:id'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<EditChannel /> <EditChannel />
@@ -83,7 +84,7 @@ function App() {
} }
/> />
<Route <Route
path="/channel/add" path='/channel/add'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<EditChannel /> <EditChannel />
@@ -91,7 +92,7 @@ function App() {
} }
/> />
<Route <Route
path="/token" path='/token'
element={ element={
<PrivateRoute> <PrivateRoute>
<Token /> <Token />
@@ -99,7 +100,7 @@ function App() {
} }
/> />
<Route <Route
path="/redemption" path='/redemption'
element={ element={
<PrivateRoute> <PrivateRoute>
<Redemption /> <Redemption />
@@ -107,7 +108,7 @@ function App() {
} }
/> />
<Route <Route
path="/user" path='/user'
element={ element={
<PrivateRoute> <PrivateRoute>
<User /> <User />
@@ -115,7 +116,7 @@ function App() {
} }
/> />
<Route <Route
path="/user/edit/:id" path='/user/edit/:id'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<EditUser /> <EditUser />
@@ -123,7 +124,7 @@ function App() {
} }
/> />
<Route <Route
path="/user/edit" path='/user/edit'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<EditUser /> <EditUser />
@@ -131,7 +132,7 @@ function App() {
} }
/> />
<Route <Route
path="/user/reset" path='/user/reset'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<PasswordResetConfirm /> <PasswordResetConfirm />
@@ -139,7 +140,7 @@ function App() {
} }
/> />
<Route <Route
path="/login" path='/login'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<LoginForm /> <LoginForm />
@@ -147,7 +148,7 @@ function App() {
} }
/> />
<Route <Route
path="/register" path='/register'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<RegisterForm /> <RegisterForm />
@@ -155,7 +156,7 @@ function App() {
} }
/> />
<Route <Route
path="/reset" path='/reset'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<PasswordResetForm /> <PasswordResetForm />
@@ -163,7 +164,7 @@ function App() {
} }
/> />
<Route <Route
path="/oauth/github" path='/oauth/github'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<GitHubOAuth /> <GitHubOAuth />
@@ -171,7 +172,7 @@ function App() {
} }
/> />
<Route <Route
path="/setting" path='/setting'
element={ element={
<PrivateRoute> <PrivateRoute>
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
@@ -181,7 +182,7 @@ function App() {
} }
/> />
<Route <Route
path="/topup" path='/topup'
element={ element={
<PrivateRoute> <PrivateRoute>
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
@@ -191,7 +192,7 @@ function App() {
} }
/> />
<Route <Route
path="/log" path='/log'
element={ element={
<PrivateRoute> <PrivateRoute>
<Log /> <Log />
@@ -199,23 +200,27 @@ function App() {
} }
/> />
<Route <Route
path="/detail" path='/detail'
element={ element={
<PrivateRoute> <PrivateRoute>
<Suspense fallback={<Loading></Loading>}>
<Detail /> <Detail />
</Suspense>
</PrivateRoute> </PrivateRoute>
} }
/> />
<Route <Route
path="/midjourney" path='/midjourney'
element={ element={
<PrivateRoute> <PrivateRoute>
<Suspense fallback={<Loading></Loading>}>
<Midjourney /> <Midjourney />
</Suspense>
</PrivateRoute> </PrivateRoute>
} }
/> />
<Route <Route
path="/about" path='/about'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<About /> <About />
@@ -223,16 +228,14 @@ function App() {
} }
/> />
<Route <Route
path="/chat" path='/chat'
element={ element={
<Suspense fallback={<Loading></Loading>}> <Suspense fallback={<Loading></Loading>}>
<Chat /> <Chat />
</Suspense> </Suspense>
} }
/> />
<Route path="*" element={ <Route path='*' element={<NotFound />} />
<NotFound />
} />
</Routes> </Routes>
</Layout.Content> </Layout.Content>
</Layout> </Layout>


@@ -1,8 +1,20 @@
import React, { useEffect, useState } from 'react'; import React, { useEffect, useState } from 'react';
import { API, isMobile, shouldShowPrompt, showError, showInfo, showSuccess, timestamp2string } from '../helpers'; import {
API,
isMobile,
shouldShowPrompt,
showError,
showInfo,
showSuccess,
timestamp2string,
} from '../helpers';
import { CHANNEL_OPTIONS, ITEMS_PER_PAGE } from '../constants'; import { CHANNEL_OPTIONS, ITEMS_PER_PAGE } from '../constants';
import { renderGroup, renderNumberWithPoint, renderQuota } from '../helpers/render'; import {
renderGroup,
renderNumberWithPoint,
renderQuota,
} from '../helpers/render';
import { import {
Button, Button,
Dropdown, Dropdown,
@@ -15,17 +27,13 @@ import {
Table, Table,
Tag, Tag,
Tooltip, Tooltip,
Typography Typography,
} from '@douyinfe/semi-ui'; } from '@douyinfe/semi-ui';
import EditChannel from '../pages/Channel/EditChannel'; import EditChannel from '../pages/Channel/EditChannel';
import { IconTreeTriangleDown } from '@douyinfe/semi-icons'; import { IconTreeTriangleDown } from '@douyinfe/semi-icons';
function renderTimestamp(timestamp) { function renderTimestamp(timestamp) {
return ( return <>{timestamp2string(timestamp)}</>;
<>
{timestamp2string(timestamp)}
</>
);
} }
let type2label = undefined; let type2label = undefined;
@@ -38,7 +46,11 @@ function renderType(type) {
} }
type2label[0] = { value: 0, text: '未知类型', color: 'grey' }; type2label[0] = { value: 0, text: '未知类型', color: 'grey' };
} }
return <Tag size="large" color={type2label[type]?.color}>{type2label[type]?.text}</Tag>; return (
<Tag size='large' color={type2label[type]?.color}>
{type2label[type]?.text}
</Tag>
);
} }
const ChannelsTable = () => { const ChannelsTable = () => {
@@ -50,11 +62,11 @@ const ChannelsTable = () => {
// }, // },
{ {
title: 'ID', title: 'ID',
dataIndex: 'id' dataIndex: 'id',
}, },
{ {
title: '名称', title: '名称',
dataIndex: 'name' dataIndex: 'name',
}, },
{ {
title: '分组', title: '分组',
@@ -63,48 +75,34 @@ const ChannelsTable = () => {
return ( return (
<div> <div>
<Space spacing={2}> <Space spacing={2}>
{ {text.split(',').map((item, index) => {
text.split(',').map((item, index) => { return renderGroup(item);
return (renderGroup(item)); })}
})
}
</Space> </Space>
</div> </div>
); );
} },
}, },
{ {
title: '类型', title: '类型',
dataIndex: 'type', dataIndex: 'type',
render: (text, record, index) => { render: (text, record, index) => {
return ( return <div>{renderType(text)}</div>;
<div> },
{renderType(text)}
</div>
);
}
}, },
{ {
title: '状态', title: '状态',
dataIndex: 'status', dataIndex: 'status',
render: (text, record, index) => { render: (text, record, index) => {
return ( return <div>{renderStatus(text)}</div>;
<div> },
{renderStatus(text)}
</div>
);
}
}, },
{ {
title: '响应时间', title: '响应时间',
dataIndex: 'response_time', dataIndex: 'response_time',
render: (text, record, index) => { render: (text, record, index) => {
return ( return <div>{renderResponseTime(text)}</div>;
<div> },
{renderResponseTime(text)}
</div>
);
}
}, },
{ {
title: '已用/剩余', title: '已用/剩余',
@@ -114,17 +112,26 @@ const ChannelsTable = () => {
<div> <div>
<Space spacing={1}> <Space spacing={1}>
<Tooltip content={'已用额度'}> <Tooltip content={'已用额度'}>
<Tag color="white" type="ghost" size="large">{renderQuota(record.used_quota)}</Tag> <Tag color='white' type='ghost' size='large'>
{renderQuota(record.used_quota)}
</Tag>
</Tooltip> </Tooltip>
<Tooltip content={'剩余额度' + record.balance + ',点击更新'}> <Tooltip content={'剩余额度' + record.balance + ',点击更新'}>
<Tag color="white" type="ghost" size="large" onClick={() => { <Tag
color='white'
type='ghost'
size='large'
onClick={() => {
updateChannelBalance(record); updateChannelBalance(record);
}}>${renderNumberWithPoint(record.balance)}</Tag> }}
>
${renderNumberWithPoint(record.balance)}
</Tag>
</Tooltip> </Tooltip>
</Space> </Space>
</div> </div>
); );
} },
}, },
{ {
title: '优先级', title: '优先级',
@@ -134,8 +141,8 @@ const ChannelsTable = () => {
<div> <div>
<InputNumber <InputNumber
style={{ width: 70 }} style={{ width: 70 }}
name="priority" name='priority'
onBlur={e => { onBlur={(e) => {
manageChannel(record.id, 'priority', record, e.target.value); manageChannel(record.id, 'priority', record, e.target.value);
}} }}
keepFocus={true} keepFocus={true}
@@ -145,7 +152,7 @@ const ChannelsTable = () => {
/> />
</div> </div>
); );
} },
}, },
{ {
title: '权重', title: '权重',
@@ -155,8 +162,8 @@ const ChannelsTable = () => {
<div> <div>
<InputNumber <InputNumber
style={{ width: 70 }} style={{ width: 70 }}
name="weight" name='weight'
onBlur={e => { onBlur={(e) => {
manageChannel(record.id, 'weight', record, e.target.value); manageChannel(record.id, 'weight', record, e.target.value);
}} }}
keepFocus={true} keepFocus={true}
@@ -166,68 +173,103 @@ const ChannelsTable = () => {
/> />
</div> </div>
); );
} },
}, },
{ {
title: '', title: '',
dataIndex: 'operate', dataIndex: 'operate',
render: (text, record, index) => ( render: (text, record, index) => (
<div> <div>
<SplitButtonGroup style={{ marginRight: 1 }} aria-label="测试操作项目组"> <SplitButtonGroup
<Button theme="light" onClick={() => { style={{ marginRight: 1 }}
testChannel(record, ''); aria-label='测试操作项目组'
}}>测试</Button>
<Dropdown trigger="click" position="bottomRight" menu={record.test_models}
> >
<Button style={{ padding: '8px 4px' }} type="primary" icon={<IconTreeTriangleDown />}></Button> <Button
theme='light'
onClick={() => {
testChannel(record, '');
}}
>
测试
</Button>
<Dropdown
trigger='click'
position='bottomRight'
menu={record.test_models}
>
<Button
style={{ padding: '8px 4px' }}
type='primary'
icon={<IconTreeTriangleDown />}
></Button>
</Dropdown> </Dropdown>
</SplitButtonGroup> </SplitButtonGroup>
{/*<Button theme='light' type='primary' style={{marginRight: 1}} onClick={()=>testChannel(record)}>测试</Button>*/} {/*<Button theme='light' type='primary' style={{marginRight: 1}} onClick={()=>testChannel(record)}>测试</Button>*/}
<Popconfirm <Popconfirm
title="确定是否要删除此渠道?" title='确定是否要删除此渠道?'
content="此修改将不可逆" content='此修改将不可逆'
okType={'danger'} okType={'danger'}
position={'left'} position={'left'}
onConfirm={() => { onConfirm={() => {
manageChannel(record.id, 'delete', record).then( manageChannel(record.id, 'delete', record).then(() => {
() => {
removeRecord(record.id); removeRecord(record.id);
} });
);
}} }}
> >
<Button theme="light" type="danger" style={{ marginRight: 1 }}>删除</Button> <Button theme='light' type='danger' style={{ marginRight: 1 }}>
删除
</Button>
</Popconfirm> </Popconfirm>
{ {record.status === 1 ? (
record.status === 1 ? <Button
<Button theme="light" type="warning" style={{ marginRight: 1 }} onClick={ theme='light'
async () => { type='warning'
manageChannel( style={{ marginRight: 1 }}
record.id, onClick={async () => {
'disable', manageChannel(record.id, 'disable', record);
record }}
); >
} 禁用
}>禁用</Button> : </Button>
<Button theme="light" type="secondary" style={{ marginRight: 1 }} onClick={ ) : (
async () => { <Button
manageChannel( theme='light'
record.id, type='secondary'
'enable', style={{ marginRight: 1 }}
record onClick={async () => {
); manageChannel(record.id, 'enable', record);
} }}
}>启用</Button> >
} 启用
<Button theme="light" type="tertiary" style={{ marginRight: 1 }} onClick={ </Button>
() => { )}
<Button
theme='light'
type='tertiary'
style={{ marginRight: 1 }}
onClick={() => {
setEditingChannel(record); setEditingChannel(record);
setShowEdit(true); setShowEdit(true);
} }}
}>编辑</Button> >
编辑
</Button>
<Popconfirm
title='确定是否要复制此渠道?'
content='复制渠道的所有信息'
okType={'danger'}
position={'left'}
onConfirm={async () => {
copySelectedChannel(record.id);
}}
>
<Button theme='light' type='primary' style={{ marginRight: 1 }}>
复制
</Button>
</Popconfirm>
</div> </div>
) ),
} },
]; ];
const [channels, setChannels] = useState([]); const [channels, setChannels] = useState([]);
@@ -240,20 +282,22 @@ const ChannelsTable = () => {
const [searching, setSearching] = useState(false); const [searching, setSearching] = useState(false);
const [updatingBalance, setUpdatingBalance] = useState(false); const [updatingBalance, setUpdatingBalance] = useState(false);
const [pageSize, setPageSize] = useState(ITEMS_PER_PAGE); const [pageSize, setPageSize] = useState(ITEMS_PER_PAGE);
const [showPrompt, setShowPrompt] = useState(shouldShowPrompt('channel-test')); const [showPrompt, setShowPrompt] = useState(
shouldShowPrompt('channel-test'),
);
const [channelCount, setChannelCount] = useState(pageSize); const [channelCount, setChannelCount] = useState(pageSize);
const [groupOptions, setGroupOptions] = useState([]); const [groupOptions, setGroupOptions] = useState([]);
const [showEdit, setShowEdit] = useState(false); const [showEdit, setShowEdit] = useState(false);
const [enableBatchDelete, setEnableBatchDelete] = useState(false); const [enableBatchDelete, setEnableBatchDelete] = useState(false);
const [editingChannel, setEditingChannel] = useState({ const [editingChannel, setEditingChannel] = useState({
id: undefined id: undefined,
}); });
const [selectedChannels, setSelectedChannels] = useState([]); const [selectedChannels, setSelectedChannels] = useState([]);
const removeRecord = id => { const removeRecord = (id) => {
let newDataSource = [...channels]; let newDataSource = [...channels];
if (id != null) { if (id != null) {
let idx = newDataSource.findIndex(data => data.id === id); let idx = newDataSource.findIndex((data) => data.id === id);
if (idx > -1) { if (idx > -1) {
newDataSource.splice(idx, 1); newDataSource.splice(idx, 1);
@@ -272,7 +316,7 @@ const ChannelsTable = () => {
name: item, name: item,
onClick: () => { onClick: () => {
testChannel(channels[i], item); testChannel(channels[i], item);
} },
}); });
}); });
channels[i].test_models = test_models; channels[i].test_models = test_models;
@@ -288,7 +332,12 @@ const ChannelsTable = () => {
  const loadChannels = async (startIdx, pageSize, idSort) => {
    setLoading(true);
    const res = await API.get(
      `/api/channel/?p=${startIdx}&page_size=${pageSize}&id_sort=${idSort}`,
    );
    if (res === undefined) {
      return;
    }
    const { success, message, data } = res.data;
    if (success) {
      if (startIdx === 0) {
@@ -304,6 +353,31 @@ const ChannelsTable = () => {
    setLoading(false);
  };

  const copySelectedChannel = async (id) => {
    const channelToCopy = channels.find(channel => String(channel.id) === String(id));
    console.log(channelToCopy)
    channelToCopy.name += '_复制';
    channelToCopy.created_time = null;
    channelToCopy.balance = 0;
    channelToCopy.used_quota = 0;
    if (!channelToCopy) {
      showError("渠道未找到,请刷新页面后重试。");
      return;
    }
    try {
      const newChannel = {...channelToCopy, id: undefined};
      const response = await API.post('/api/channel/', newChannel);
      if (response.data.success) {
        showSuccess("渠道复制成功");
        await refresh();
      } else {
        showError(response.data.message);
      }
    } catch (error) {
      showError("渠道复制失败: " + error.message);
    }
  };

  const refresh = async () => {
    await loadChannels(activePage - 1, pageSize, idSort);
  };
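One thing worth noting about copySelectedChannel above: it mutates channelToCopy (name, created_time, balance, used_quota) before the `if (!channelToCopy)` guard, so a stale or missing id would throw before the friendly 渠道未找到 toast ever appears. A minimal sketch of the same helper with the guard moved first — the names all come from the diff, the reordering is only a suggestion:

  const copySelectedChannel = async (id) => {
    const channelToCopy = channels.find(
      (channel) => String(channel.id) === String(id),
    );
    // guard before touching any property, so a missing record surfaces as a toast
    if (!channelToCopy) {
      showError('渠道未找到,请刷新页面后重试。');
      return;
    }
    try {
      // build a fresh object instead of mutating the row held in the channels state
      const newChannel = {
        ...channelToCopy,
        id: undefined,
        name: channelToCopy.name + '_复制',
        created_time: null,
        balance: 0,
        used_quota: 0,
      };
      const response = await API.post('/api/channel/', newChannel);
      if (response.data.success) {
        showSuccess('渠道复制成功');
        await refresh();
      } else {
        showError(response.data.message);
      }
    } catch (error) {
      showError('渠道复制失败: ' + error.message);
    }
  };

Building newChannel as a copy also keeps the original row in the channels state untouched, so the table does not show the renamed record until the refresh completes.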
@@ -311,7 +385,8 @@ const ChannelsTable = () => {
  (formatting only: the localPageSize assignment in the initial useEffect is wrapped across two lines; no functional change)
@@ -361,7 +436,6 @@ const ChannelsTable = () => {
  (one line is removed here; the surrounding manageChannel logic — copying channel.status onto the record for non-delete actions — is unchanged)
@@ -374,22 +448,26 @@ const ChannelsTable = () => {
  (formatting only: each single-line <Tag size="large" color="...">…</Tag> return in renderStatus is expanded into a multi-line JSX block with single quotes; labels 已启用 / 已禁用 / 自动禁用 / 未知状态 and colors are unchanged)
@@ -400,15 +478,35 @@ const ChannelsTable = () => {
  (formatting only: the same multi-line <Tag> expansion is applied to every branch of renderResponseTime; the 未测试 case, colors, and time thresholds are unchanged)
@@ -420,7 +518,9 @@ const ChannelsTable = () => {
  (formatting only: the channel-search API.get URL is wrapped onto its own line; keyword/group/model parameters are unchanged)
@@ -520,14 +620,16 @@ const ChannelsTable = () => {
  (formatting only: the pageData slice and the handlePageChange callback are re-wrapped; the extra-page load still calls loadChannels(page - 1, pageSize, idSort))
@@ -547,10 +649,15 @@ const ChannelsTable = () => {
  (fetchGroups gains the same `if (res === undefined) { return; }` guard before mapping res.data.data into { label, value } group options; the surrounding try/catch and showError call are unchanged)
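Both loadChannels and the group fetch now bail out when `res === undefined`. That pattern only makes sense if the shared API wrapper resolves to undefined instead of rejecting on failed requests — an assumption here, since the helper itself is not part of this changeset. A sketch of the kind of axios response interceptor that would produce that behaviour:

  // hypothetical helpers/api.js — not shown in this changeset
  import axios from 'axios';

  export const API = axios.create({ baseURL: '/' });

  API.interceptors.response.use(
    (response) => response,
    (error) => {
      // surface the failure once, then resolve with undefined so callers
      // can simply check `if (res === undefined) return;`
      console.error(error);
      return undefined;
    },
  );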
@@ -564,27 +671,34 @@ const ChannelsTable = () => {
  (formatting only: the handleRow disabled-row style object gains trailing commas, and the <EditChannel>, <Form>, and 搜索渠道关键词 <Form.Input> props are wrapped one per line with single quotes; no functional change)
@@ -592,21 +706,33 @@ const ChannelsTable = () => {
  (formatting only: the 模型 input, 分组 select, and 查询 button are re-wrapped in the same style; onChange handlers and the searchChannels call are unchanged)
@@ -614,7 +740,12 @@ const ChannelsTable = () => {
  (formatting only: the 使用ID排序 <Switch> props are wrapped one per line)
@@ -622,12 +753,18 @@ const ChannelsTable = () => {
  (formatting only: the id-sort Switch handler and the <Table> props — className, columns, dataSource, pagination — are re-wrapped; localStorage persistence of 'id-sort' is unchanged)
@@ -637,57 +774,84 @@ const ChannelsTable = () => {
  (formatting only: the pagination options, the conditional rowSelection (`enableBatchDelete ? { onChange: (selectedRowKeys, selectedRows) => { setSelectedChannels(selectedRows); } } : null`), the floating action bar — 添加渠道, 测试所有通道, 更新所有已启用通道余额, 删除禁用通道, 刷新 — and the 开启批量删除 switch with its 删除所选通道 and 修复数据库一致性 confirmations are all re-wrapped with single quotes and one prop per line; button types, confirm texts, and handlers are unchanged)
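The batch-delete switch works by swapping the Table's rowSelection prop between a config object and null; Semi Design's Table only adds the checkbox column when rowSelection is given an object, so null hides it entirely. A stripped-down sketch of that toggle, using only props that appear in the diff (the component wrapper and prop names other than those are illustrative):

  import React, { useState } from 'react';
  import { Switch, Table } from '@douyinfe/semi-ui';

  const BatchDeleteExample = ({ columns, pageData }) => {
    const [enableBatchDelete, setEnableBatchDelete] = useState(false);
    const [selectedChannels, setSelectedChannels] = useState([]);
    return (
      <>
        <Switch onChange={(v) => setEnableBatchDelete(v)} />
        <Table
          columns={columns}
          dataSource={pageData}
          rowSelection={
            enableBatchDelete
              ? {
                  // Semi passes (selectedRowKeys, selectedRows); only the rows are kept
                  onChange: (selectedRowKeys, selectedRows) =>
                    setSelectedChannels(selectedRows),
                }
              : null // null → no checkbox column is rendered
          }
        />
      </>
    );
  };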

View File

@@ -32,27 +32,36 @@ const Footer = () => {
  (formatting only except for the version source: the custom-footer links to Calcium-Ion/new-api, Calcium-Ion, One API v0.5.4, and the MIT 许可证 are re-wrapped with single quotes, and the displayed version now comes from the Vite environment instead of process.env:)
          <a
            href='https://github.com/Calcium-Ion/new-api'
            target='_blank'
            rel='noreferrer'
          >
            New API {import.meta.env.VITE_REACT_APP_VERSION}{' '}
          </a>
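The footer's version string switches from `process.env.REACT_APP_VERSION` to `import.meta.env.VITE_REACT_APP_VERSION`, which is the Vite convention: by default only variables prefixed with `VITE_` are statically replaced and exposed to client code. A small sketch of how the value would be supplied and read (the exact .env wiring is an assumption, not part of this diff):

  // .env at the project root (value here is a placeholder):
  //   VITE_REACT_APP_VERSION=v0.0.0
  // read anywhere in Vite-bundled client code:
  const version = import.meta.env.VITE_REACT_APP_VERSION;
  console.log('running New API', version);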

View File

@@ -49,7 +49,7 @@ const GitHubOAuth = () => {
  (formatting only: `<Loader size="large">` becomes `<Loader size='large'>`; the Segment/Dimmer/Loader structure is otherwise unchanged)

View File

@@ -1,6 +1,7 @@
import React, { useContext, useEffect, useState } from 'react';
import { Link, useNavigate } from 'react-router-dom';
import { UserContext } from '../context/User';
import { useSetTheme, useTheme } from '../context/Theme';
import { API, getLogo, getSystemName, showSuccess } from '../helpers';
import '../index.css';
@@ -17,15 +18,15 @@ let headerButtons = [
  (formatting only: trailing commas are added to the 关于 header button and the conditional 聊天 splice entry; no functional change)
@@ -34,13 +35,15 @@ const HeaderBar = () => {
  (the local `const [dark, setDark] = useState(false)` and the `themeMode` read from localStorage are removed — theme state now comes from the Theme context — and the isNewYear date check is re-wrapped across lines; the rest of the component setup is unchanged)
@@ -62,26 +65,19 @@ const HeaderBar = () => {
    }, 3000);
  };

  const theme = useTheme();
  const setTheme = useSetTheme();

  useEffect(() => {
    if (theme === 'dark') {
      document.body.setAttribute('theme-mode', 'dark');
    }
    if (isNewYear) {
      console.log('Happy New Year!');
    }
  }, []);

  (the old switchMode helper, which wrote theme-mode onto document.body and into localStorage directly, is removed)

  return (
    <>
      <Layout>
@@ -93,7 +89,7 @@ const HeaderBar = () => {
  (formatting only: a trailing comma is added after `register: '/register'` in routerMap)
@@ -106,52 +102,71 @@ const HeaderBar = () => {
  (formatting only for most of this hunk: the now-empty onSelect handler, the new-year Dropdown, the 关于 Nav.Item, the user Dropdown/Avatar, and the 登录 / 注册 Nav.Items are re-wrapped with single quotes, and the Nav footer now closes with `></Nav>`; the one functional change is the theme switch, which is now driven by the Theme context instead of local state:)
              <Switch
                checkedText='🌞'
                size={'large'}
                checked={theme === 'dark'}
                uncheckedText='🌙'
                onChange={(checked) => {
                  setTheme(checked);
                }}
              />
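HeaderBar no longer keeps its own dark flag: it reads theme from useTheme() and writes through useSetTheme(), and the Switch above is just a controlled view of that context. The context implementation in ../context/Theme is not included in this diff, so the following is only a guess at a minimal shape that would satisfy these two hooks and the boolean argument passed to setTheme:

  // hypothetical src/context/Theme.js — sketch only, the real file is not shown here
  import React, { createContext, useCallback, useContext, useState } from 'react';

  const ThemeContext = createContext(null);
  const SetThemeContext = createContext(() => {});

  export const ThemeProvider = ({ children }) => {
    const [theme, setThemeState] = useState(
      localStorage.getItem('theme-mode') === 'dark' ? 'dark' : 'light',
    );
    const setTheme = useCallback((isDark) => {
      const mode = isDark ? 'dark' : 'light';
      // keep the <body> attribute and localStorage in sync with the context value
      if (isDark) {
        document.body.setAttribute('theme-mode', 'dark');
      } else {
        document.body.removeAttribute('theme-mode');
      }
      localStorage.setItem('theme-mode', mode);
      setThemeState(mode);
    }, []);
    return (
      <SetThemeContext.Provider value={setTheme}>
        <ThemeContext.Provider value={theme}>{children}</ThemeContext.Provider>
      </SetThemeContext.Provider>
    );
  };

  export const useTheme = () => useContext(ThemeContext);
  export const useSetTheme = () => useContext(SetThemeContext);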

View File

@@ -1,13 +1,11 @@
import React from 'react';
import { Spin } from '@douyinfe/semi-ui';

const Loading = ({ prompt: name = 'page' }) => {
  return (
    <Spin style={{ height: 100 }} spinning={true}>
      加载{name}...
    </Spin>
  );
};
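The Loading placeholder drops the Semantic UI Dimmer/Loader pair for Semi's Spin, which wraps its children and shows the spinner while `spinning` is true. Usage stays a one-liner wherever something is still loading (the import path below is illustrative):

  import Loading from './components/Loading';

  // e.g. as a route fallback while a page chunk is fetched
  <Loading prompt='首页' />  // renders 加载首页... inside a Spin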

View File

@@ -4,7 +4,15 @@ import { UserContext } from '../context/User';
  (formatting only: the @douyinfe/semi-ui import is split to one named import per line; the set of imports — Button, Card, Divider, Form, Icon, Layout, Modal — is unchanged)
@@ -16,7 +24,7 @@ const LoginForm = () => {
  (formatting only: trailing comma after wechat_verification_code in the inputs state)
@@ -56,7 +64,7 @@ const LoginForm = () => {
  (formatting only: the WeChat OAuth API.get URL gains a trailing comma)
@@ -81,17 +89,24 @@ const LoginForm = () => {
  (formatting only: the login API.post call and the default-password Modal.error options are re-wrapped; the root/123456 warning logic is unchanged)
@@ -104,7 +119,16 @@ const LoginForm = () => {
  (formatting only: the Telegram login `fields` array is split to one entry per line)
@@ -126,10 +150,15 @@ const LoginForm = () => {
  (formatting only: the empty <Layout.Header> is collapsed to one line and the centering wrapper's style object is wrapped one property per line)
@@ -139,50 +168,72 @@ const LoginForm = () => {
  (formatting only: the 用户名 / 密码 inputs, the 登录 button, the 注册 / 重置 links, the third-party condition, and the GitHub / WeChat buttons are re-wrapped with single quotes; no functional change)
@@ -190,19 +241,31 @@ const LoginForm = () => {
  (the GitHub / WeChat button row closes, and the Telegram login button now sits in its own centered wrapper inside the third-party block:)
                  {status.telegram_oauth ? (
                    <>
                      <div
                        style={{
                          display: 'flex',
                          justifyContent: 'center',
                          marginTop: 5,
                        }}
                      >
                        <TelegramLoginButton
                          dataOnauth={onTelegramLoginClicked}
                          botName={status.telegram_bot_name}
                        />
                      </div>
                    </>
                  ) : (
                    <></>
                  )}
@@ -211,7 +274,13 @@ const LoginForm = () => {
  (formatting only: the WeChat modal's column wrapper style is wrapped one property per line)
@@ -219,19 +288,27 @@ const LoginForm = () => {
  (formatting only: the 验证码 input, the form size prop, and the Turnstile wrapper are re-wrapped; no functional change)

View File

@@ -1,7 +1,25 @@
  (formatting only: the '../helpers' and '@douyinfe/semi-ui' imports are split to one name per line; the imported set — including ITEMS_PER_PAGE, renderNumber / renderQuota / stringToColor, and Paragraph — is unchanged)
@@ -9,131 +27,285 @@ import Paragraph from '@douyinfe/semi-ui/lib/es/typography/paragraph';
  (formatting only: renderTimestamp, MODE_OPTIONS, the colors list, renderType, renderIsStream, renderUseTime, and the entire columns array — 时间 / 渠道 / 用户 / 令牌 / 类型 / 模型 / 用时 / 提示 / 补全 / 花费 / 详情 — are reformatted into multi-line JSX with single quotes and trailing commas; the admin-only visibility checks, the `record.type === 0 || record.type === 2` conditions, the copyText / showUserInfo handlers, and renderQuota(text, 6) are all unchanged)
@@ -154,12 +326,20 @@ const LogsTable = () => {
  (formatting only: the inputs initializer, its destructuring, and the stat initial state are re-wrapped with trailing commas; the default timestamps are unchanged)
@@ -169,7 +349,9 @@ const LogsTable = () => {
  (formatting only: the getLogSelfStat API.get URL moves onto its own line; query parameters are unchanged)
@@ -181,7 +363,9 @@ const LogsTable = () => {
  (formatting only: the getLogStat API.get URL moves onto its own line)
@@ -209,12 +393,16 @@ const LogsTable = () => {
  (formatting only: the 用户信息 Modal.info options are re-wrapped; the displayed 用户名 / 余额 / 已用额度 / 请求次数 fields are unchanged)
@@ -259,14 +447,16 @@ const LogsTable = () => {
    setLoading(false);
  };

  const pageData = logs.slice(
    (activePage - 1) * pageSize,
    activePage * pageSize,
  );

  const handlePageChange = (page) => {
    setActivePage(page);
    if (page === Math.ceil(logs.length / pageSize) + 1) {
      // In this case we have to load more data and then append them.
      loadLogs(page - 1, pageSize, logType).then((r) => {});
    }
  };
@@ -281,10 +471,10 @@ const LogsTable = () => {
  const refresh = async () => {
    // setLoading(true);
    setActivePage(1);
    await loadLogs(0, pageSize, logType);
  };
@@ -298,7 +488,8 @@ const LogsTable = () => {
  (formatting only: the localPageSize assignment in the initial useEffect is wrapped across two lines)
@@ -326,52 +517,108 @@ const LogsTable = () => {
    setSearching(false);
  };

  (the returned JSX is reformatted — the 使用明细 header with its click-to-reveal total quota, the 令牌名称 / 模型名称 / 起始时间 / 结束时间 inputs, the admin-only 渠道 ID / 用户名称 inputs, the 查询 button, and the <Table> pagination props are re-wrapped with single quotes and one prop per line; the one functional change is the log-type <Select>, which now reloads the logs itself with the newly selected value instead of passing it to refresh:)
        <Select
          defaultValue='0'
          style={{ width: 120 }}
          onChange={(value) => {
            setLogType(parseInt(value));
            loadLogs(0, pageSize, parseInt(value));
          }}
        >
          <Select.Option value='0'>全部</Select.Option>
          <Select.Option value='1'>充值</Select.Option>
          <Select.Option value='2'>消费</Select.Option>
          <Select.Option value='3'>管理</Select.Option>
          <Select.Option value='4'>系统</Select.Option>
        </Select>
      </Layout>
    </>
  );
};

export default LogsTable;
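The likely reason the Select above calls loadLogs(0, pageSize, parseInt(value)) itself instead of relying on refresh() is that setLogType is asynchronous: right after setLogType(parseInt(value)), the logType variable in scope still holds the previous value, so a refresh that reads logType would fetch the old tab once more. A small sketch of the pitfall and the pattern used here:

  // inside the component, assuming setLogType / loadLogs as in the table above
  const onTypeChange = (value) => {
    const next = parseInt(value);
    setLogType(next); // state update is scheduled, not applied yet
    // loadLogs(0, pageSize, logType); // would still see the old logType
    loadLogs(0, pageSize, next); // pass the fresh value explicitly
  };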

View File

@@ -1,86 +1,226 @@
  (the '../helpers' and '@douyinfe/semi-ui' imports are split to one name per line, and Banner is added to the semi-ui imports; the colors list and the renderType / renderCode / renderStatus helpers are reformatted into multi-line <Tag> blocks with single quotes — the type labels 绘图 / 放大 / 变换 / 强变换 / 弱变换 / 平移 / 图生文 / 图混合 / 缩词 / 重绘 / 局部重绘-提交 / 变焦 / 自定义变焦-提交 / 窗口处理 / 换脸, the submit codes 已提交 / 等待中 / 重复提交 / 未提交, and the status labels 成功 / 未启动 / 队列中 / 执行中 / 失败 / 窗口等待 are unchanged)
@@ -97,7 +237,6 @@ const renderTimestamp = (timestampInSeconds) => {
  return `${year}-${month}-${day} ${hours}:${minutes}:${seconds}`; // 格式化输出
};

const LogsTable = () => {
  const [isModalOpen, setIsModalOpen] = useState(false);
  const [modalContent, setModalContent] = useState('');
@@ -106,12 +245,8 @@ const LogsTable = () => {
  (formatting only: the 提交时间 column render collapses to `return <div>{renderTimestamp(text / 1000)}</div>;`; the column definition is otherwise unchanged)
@@ -119,61 +254,50 @@ const LogsTable = () => {
  (formatting only: the 渠道, 类型, 任务ID, 提交结果, and 任务状态 column renders are collapsed to single-expression returns — e.g. `return <div>{renderType(text)}</div>;` — and the channel <Tag> with its copyText onClick is re-wrapped; the admin-only visibility via isAdmin() is unchanged)
@@ -183,13 +307,20 @@ const LogsTable = () => {
  (formatting only: the 进度 column's <Progress> props are wrapped one per line — the FAILURE warning stroke and the percent parsing from values like '100%' are unchanged — and the 结果图片 column definition that follows is untouched)
@@ -208,7 +339,7 @@ const LogsTable = () => {
  (formatting only: trailing comma at the end of the 结果图片 / 查看图片 render)
@@ -231,7 +362,7 @@ const LogsTable = () => {
  (formatting only: trailing comma at the end of the Prompt column render)
@@ -254,7 +385,7 @@ const LogsTable = () => {
  (formatting only: trailing comma at the end of the PromptEn column render)
@@ -277,9 +408,8 @@ const LogsTable = () => {
  (formatting only: trailing commas close the 失败原因 column and the columns array; the logs state declaration that follows is unchanged)
@@ -289,6 +419,7 @@ const LogsTable = () => {
  const [logType, setLogType] = useState(0);
  const isAdminUser = isAdmin();
  const [isModalOpenurl, setIsModalOpenurl] = useState(false);
  const [showBanner, setShowBanner] = useState(false);
  // 定义模态框图片URL的状态和更新函数
  const [modalImageUrl, setModalImageUrl] = useState('');
@@ -298,20 +429,19 @@ const LogsTable = () => {
    channel_id: '',
    mj_id: '',
    start_timestamp: timestamp2string(now.getTime() / 1000 - 2592000),
    end_timestamp: timestamp2string(now.getTime() / 1000 + 3600),
  });
  const { channel_id, mj_id, start_timestamp, end_timestamp } = inputs;
  const [stat, setStat] = useState({
    quota: 0,
    token: 0,
  });

  const handleInputChange = (value, name) => {
    setInputs((inputs) => ({ ...inputs, [name]: value }));
  };

  const setLogsFormat = (logs) => {
    for (let i = 0; i < logs.length; i++) {
      logs[i].timestamp2string = timestamp2string(logs[i].created_at);
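The default query window above starts 2592000 seconds (30 days) before the current time and ends 3600 seconds (one hour) after it, both formatted through the project's timestamp2string helper. A rough sketch of what such a conversion amounts to, assuming second-resolution Unix timestamps; the real helper's formatting details may differ:

// Illustrative conversion from a Unix timestamp in seconds to 'YYYY-MM-DD HH:mm:ss'.
const timestamp2stringSketch = (ts) => {
  const d = new Date(ts * 1000);
  const pad = (n) => String(n).padStart(2, '0');
  return (
    `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())} ` +
    `${pad(d.getHours())}:${pad(d.getMinutes())}:${pad(d.getSeconds())}`
  );
};

const now = new Date();
const start = timestamp2stringSketch(now.getTime() / 1000 - 2592000); // 30 days ago
const end = timestamp2stringSketch(now.getTime() / 1000 + 3600); // one hour ahead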
@@ -350,14 +480,16 @@ const LogsTable = () => {
    setLoading(false);
  };

  const pageData = logs.slice(
    (activePage - 1) * ITEMS_PER_PAGE,
    activePage * ITEMS_PER_PAGE,
  );

  const handlePageChange = (page) => {
    setActivePage(page);
    if (page === Math.ceil(logs.length / ITEMS_PER_PAGE) + 1) {
      // In this case we have to load more data and then append them.
      loadLogs(page - 1).then((r) => {});
    }
  };
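pageData and handlePageChange above implement a common client-side pattern: the table renders a slice of whatever has been fetched so far, and stepping one page past the loaded data triggers another loadLogs call that appends to logs. A self-contained sketch of the same idea outside React; fetchBatch and the page size here are stand-ins for illustration, not the component's actual API:

// Client-side paging over an incrementally loaded list (illustrative only).
const ITEMS_PER_PAGE = 10;
let logs = [];

// Stand-in for the component's loadLogs(startIdx): returns the next batch of rows.
async function fetchBatch(startIdx) {
  return Array.from({ length: ITEMS_PER_PAGE }, (_, i) => ({ id: startIdx + i }));
}

async function goToPage(page) {
  // If the requested page is one past what is loaded, fetch and append first.
  if (page === Math.ceil(logs.length / ITEMS_PER_PAGE) + 1) {
    logs = logs.concat(await fetchBatch(logs.length));
  }
  // The table then renders just this slice.
  return logs.slice((page - 1) * ITEMS_PER_PAGE, page * ITEMS_PER_PAGE);
}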
@@ -380,44 +512,92 @@ const LogsTable = () => {
    refresh().then();
  }, [logType]);

  useEffect(() => {
    const mjNotifyEnabled = localStorage.getItem('mj_notify_enabled');
    if (mjNotifyEnabled !== 'true') {
      setShowBanner(true);
    }
  }, []);
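The new effect above decides whether to show the admin banner by reading an mj_notify_enabled flag from localStorage; how that flag gets written is outside this hunk. A hedged sketch of the read side as a small helper, where only the key name comes from the code above and everything else is illustrative:

// Returns true when the Midjourney callback option appears disabled,
// i.e. when the banner should be shown to admins. Illustrative helper only.
function shouldShowMjBanner() {
  // localStorage stores strings, so compare against the literal 'true'.
  return localStorage.getItem('mj_notify_enabled') !== 'true';
}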
  return (
    <>
      <Layout>
        {isAdminUser && showBanner ? (
          <Banner
            type='info'
            description='当前未开启Midjourney回调部分项目可能无法获得绘图结果可在运营设置中开启。'
          />
        ) : (
          <></>
        )}
        <Form layout='horizontal' style={{ marginTop: 10 }}>
          <>
            <Form.Input
              field='channel_id'
              label='渠道 ID'
              style={{ width: 176 }}
              value={channel_id}
              placeholder={'可选值'}
              name='channel_id'
              onChange={(value) => handleInputChange(value, 'channel_id')}
            />
            <Form.Input
              field='mj_id'
              label='任务 ID'
              style={{ width: 176 }}
              value={mj_id}
              placeholder='可选值'
              name='mj_id'
              onChange={(value) => handleInputChange(value, 'mj_id')}
            />
            <Form.DatePicker
              field='start_timestamp'
              label='起始时间'
              style={{ width: 272 }}
              initValue={start_timestamp}
              value={start_timestamp}
              type='dateTime'
              name='start_timestamp'
              onChange={(value) => handleInputChange(value, 'start_timestamp')}
            />
            <Form.DatePicker
              field='end_timestamp'
              fluid
              label='结束时间'
              style={{ width: 272 }}
              initValue={end_timestamp}
              value={end_timestamp}
              type='dateTime'
              name='end_timestamp'
              onChange={(value) => handleInputChange(value, 'end_timestamp')}
            />
            <Form.Section>
              <Button
                label='查询'
                type='primary'
                htmlType='submit'
                className='btn-margin-right'
                onClick={refresh}
              >
                查询
              </Button>
            </Form.Section>
          </>
        </Form>
        <Table
          style={{ marginTop: 5 }}
          columns={columns}
          dataSource={pageData}
          pagination={{
            currentPage: activePage,
            pageSize: ITEMS_PER_PAGE,
            total: logCount,
            pageSizeOpts: [10, 20, 50, 100],
            onPageChange: handlePageChange,
          }}
          loading={loading}
        />
        <Modal
          visible={isModalOpen}
          onOk={() => setIsModalOpen(false)}
@@ -433,7 +613,6 @@ const LogsTable = () => {
          visible={isModalOpenurl}
          onVisibleChange={(visible) => setIsModalOpenurl(visible)}
        />
      </Layout>
    </>
  );

Some files were not shown because too many files have changed in this diff.