Mirror of https://github.com/linux-do/new-api.git (synced 2025-11-17 19:13:42 +08:00)

Compare commits: v0.2.8-alp ... v0.2.8-alp — 76 commits
Commit SHAs:

895ee09b33, 584eefec3e, a7e3168c17, d767ae04ff, 402a415c79, 55c28b2f98, fc6ae6bf34, a9b978528e,
d1778bb20a, 37a0930db4, 1117112225, f2654692e8, c834289f2c, bc649ddaa7, c838beba3d, 1e9d64fd19,
79010dbfc5, 4d3b57e19b, 0df1df4fd4, f6fcb2fd5e, cadd8aa622, 11be36dafd, 61006bed9e, 6b07e6fb97,
23bfc4f655, 31f0cfb2cc, 4d348c0427, 1e2c1ee950, 77d14561ac, 1289be0484, 5abb0a9c4e, e3f66807ee,
e8845ce1de, b069056bda, 954fa879dc, 4eb6217bc0, eb79880502, b82582cb42, c993ab2746, a47111a031,
6f1bef66a7, 692455ef2a, c1040afed9, 58591b8c2a, 69b9950aa0, ecdcb379fe, 4dd5233f49, d2a0d9f73b,
8a27977284, 099068f543, bcf1b160d1, 1e0e40aa69, 74392efbed, a31247ecaa, 1291504fdc, 54f17d6002,
fcb8506679, fa902cca4c, 0c8696816d, 1e0053985a, 36fac2baa2, 7e26238231, bfbbe67fcd, 0867d36fc7,
24722a8ee2, c86bff38ac, 3cd25c7e53, 6aa1f2fcbe, e2663a5c66, 6fe643b1c1, 1deb935f1d, 0caa639df7,
afc2289bdf, 472145aed6, f956e4489f, b1019be733
LICENSE (214 changed lines)

@@ -1,21 +1,201 @@

The previous MIT License text was removed:

MIT License

Copyright (c) 2024 Calcium-Ion

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

It was replaced with the full standard text of the Apache License, Version 2.0,
January 2004 (http://www.apache.org/licenses/): the TERMS AND CONDITIONS FOR USE,
REPRODUCTION, AND DISTRIBUTION, Sections 1 (Definitions) through 9 (Accepting
Warranty or Additional Liability), END OF TERMS AND CONDITIONS, and the APPENDIX
describing how to apply the license, including the standard
"Copyright [yyyy] [name of copyright owner]" placeholder and the
"Licensed under the Apache License, Version 2.0 (the "License")" boilerplate notice.
@@ -57,11 +57,11 @@

2. In channel management, add a channel and set the channel type to **Midjourney Proxy** (choose **Midjourney Proxy Plus** for the Plus version); for the models, refer to the model list above.
3. For the address, fill in the URL where midjourney-proxy is deployed, e.g. http://localhost:8080
3. For **Proxy**, fill in the URL where midjourney-proxy is deployed, e.g. http://localhost:8080
4. For the key, fill in the midjourney-proxy key; if no key was configured, anything will do.

### Connecting to an upstream new-api instance

1. In channel management, add a channel with channel type **Midjourney Proxy Plus**; for the models, refer to the model list above.
2. For the address, fill in the upstream new-api address, e.g. http://localhost:3000
2. For **Proxy**, fill in the upstream new-api address, e.g. http://localhost:3000
3. For the key, fill in the upstream new-api key.
README.md (32 changed lines)

@@ -5,8 +5,6 @@

> This is an open-source project, developed on top of [One API](https://github.com/songquanpeng/one-api); thanks to the original author for the selfless contribution.
> Users must comply with OpenAI's [Terms of Use](https://openai.com/policies/terms-of-use) and **applicable laws and regulations**, and must not use it for illegal purposes.

> [!WARNING]
> This project is for personal learning purposes; stability is not guaranteed and no technical support is provided. Users must comply with OpenAI's terms of use and with applicable laws and regulations, and must not use it for illegal purposes.
> In accordance with the [Interim Measures for the Administration of Generative AI Services](http://www.cac.gov.cn/2023-07/13/c_1690898327029107.htm), do not provide any unregistered generative AI services to the public in mainland China.

@@ -47,6 +45,11 @@

2. Send the /setdomain command to [@Botfather](https://t.me/botfather)
3. Select your bot, then enter http(s)://your-website-address/login
4. The Telegram Bot name is the bot username with the leading @ removed
13. Added support for the [Suno API](https://github.com/Suno-API/Suno-API), [integration doc](Suno.md); the supported endpoints are:
    + [x] /suno/submit/music
    + [x] /suno/submit/lyrics
    + [x] /suno/fetch
    + [x] /suno/fetch/:id

## Model support
This version additionally supports the following models:

@@ -56,20 +59,37 @@

4. [Ollama](https://github.com/ollama/ollama?tab=readme-ov-file); when adding the channel, the key can be any value, and the default request address is [http://localhost:11434](http://localhost:11434) — change it in the channel if needed
5. The [Midjourney-Proxy(Plus)](https://github.com/novicezk/midjourney-proxy) interface, [integration doc](Midjourney.md)
6. [01.AI (零一万物)](https://platform.lingyiwanwu.com/)
7. Custom channels, with support for entering a full request URL
8. The [Suno API](https://github.com/Suno-API/Suno-API) interface, [integration doc](Suno.md)

You can add the custom models gpt-4-gizmo-* or g-* in a channel. These are not official OpenAI models but third-party models, and they cannot be called with an official key.

## Channel retry
Channel retry has been implemented; the retry count can be set in `Settings -> Operation Settings -> General Settings`, and enabling the cache is recommended.
Channel retry has been implemented; the retry count can be set in `Settings -> Operation Settings -> General Settings`, and **enabling the cache is recommended**.
If retry is enabled, the first retry uses the same priority, the second retry uses the next priority, and so on.
### How to configure the cache
1. `REDIS_CONN_STRING`: when set, Redis is used as the cache.
    + Example: `REDIS_CONN_STRING=redis://default:redispw@localhost:49153`
2. `MEMORY_CACHE_ENABLED`: enables the in-memory cache (no need to set it manually if `REDIS_CONN_STRING` is set); it causes user quota updates to lag slightly. Accepts `true` or `false`; defaults to `false` when unset.
    + Example: `MEMORY_CACHE_ENABLED=true`
### Why are some requests not retried
These status codes are not retried: 400, 504, 524
### I want 400 to be retried as well
In `Channel -> Edit`, set `Status Code Override` to
```json
{
  "400": "500"
}
```
This converts 400 errors into 500 errors, which are then retried.

## Configuration on top of the original One API
- `STREAMING_TIMEOUT`: timeout for a single streamed reply, 30 seconds by default

## Deployment
### Deployment requirements
- Local database (default): SQLite (Docker deployments use SQLite by default; the `/data` directory must be mounted to the host)
- Remote database: MySQL >= 5.7.8, PgSQL >= 9.6
### Deploying with Docker
```shell
# Deployment command using SQLite:

@@ -88,9 +108,15 @@ docker run --name new-api -d --restart always -p 3000:3000 -e TZ=Asia/Shanghai -

docker run --name new-api -d --restart always -p 3000:3000 -e SQL_DSN="root:123456@tcp(BT-panel server address:BT-panel database port)/BT-panel database name" -e TZ=Asia/Shanghai -v /www/wwwroot/new-api:/data calciumion/new-api:latest
# Note: the database must allow remote access, and access should be restricted to the server's IP only
```
### Default account and password
Default account: root, password: 123456

## Midjourney interface setup
[Integration doc](Midjourney.md)

## Suno interface setup
[Integration doc](Suno.md)

## Community group
<img src="https://github.com/Calcium-Ion/new-api/assets/61247483/de536a8a-0161-47a7-a0a2-66ef6de81266" width="300">
Suno.md (new file, 37 lines)

@@ -0,0 +1,37 @@

# Suno API documentation

**Overview**: Suno API documentation

## Model list

### Supported by the Suno API

- suno_music (custom mode, inspiration mode, continuation)
- suno_lyrics (lyrics generation)

## Model price settings (set under Settings - Operation Settings - Fixed Model Price)
```json
{
  "suno_music": 0.3,
  "suno_lyrics": 0.01
}
```

## Channel settings

### Connecting to the Suno API

1. Deploy the Suno API and configure the Suno account, etc. (setting a key is strongly recommended), [project page](https://github.com/Suno-API/Suno-API)
2. In channel management, add a channel and set the channel type to **Suno API**; for the models, refer to the model list above
3. For **Proxy**, fill in the URL where the Suno API is deployed, e.g. http://localhost:8080
4. For the key, fill in the Suno API key; if no key was configured, anything will do

### Connecting to an upstream new-api instance

1. In channel management, add a channel with channel type **Suno API**, or any type, as long as its models include the models from the list above
2. For **Proxy**, fill in the upstream new-api address, e.g. http://localhost:3000
3. For the key, fill in the upstream new-api key
@@ -22,6 +22,7 @@ var StartTime = time.Now().Unix() // unit: second

var Version = "v0.0.0" // this hard coding will be replaced automatically when building, no need to manually change
var SystemName = "New API"
var ServerAddress = "http://localhost:3000"
var OutProxyUrl = ""
var Footer = ""
var Logo = ""
var TopUpLink = ""

@@ -31,6 +32,7 @@ var QuotaPerUnit = 500 * 1000.0 // $0.002 / 1K tokens

var DisplayInCurrencyEnabled = true
var DisplayTokenStatEnabled = true
var DrawingEnabled = true
var TaskEnabled = true
var DataExportEnabled = true
var DataExportInterval = 5 // unit: minute
var DataExportDefaultTime = "hour" // unit: minute

@@ -118,14 +120,14 @@ var IsMasterNode = os.Getenv("NODE_TYPE") != "slave"

var requestInterval, _ = strconv.Atoi(os.Getenv("POLLING_INTERVAL"))
var RequestInterval = time.Duration(requestInterval) * time.Second

var SyncFrequency = GetOrDefault("SYNC_FREQUENCY", 60) // unit is second
var SyncFrequency = GetEnvOrDefault("SYNC_FREQUENCY", 60) // unit is second

var BatchUpdateEnabled = false
var BatchUpdateInterval = GetOrDefault("BATCH_UPDATE_INTERVAL", 5)
var BatchUpdateInterval = GetEnvOrDefault("BATCH_UPDATE_INTERVAL", 5)

var RelayTimeout = GetOrDefault("RELAY_TIMEOUT", 0) // unit is second
var RelayTimeout = GetEnvOrDefault("RELAY_TIMEOUT", 0) // unit is second

var GeminiSafetySetting = GetOrDefaultString("GEMINI_SAFETY_SETTING", "BLOCK_NONE")
var GeminiSafetySetting = GetEnvOrDefaultString("GEMINI_SAFETY_SETTING", "BLOCK_NONE")

const (
    RequestIdKey = "X-Oneapi-Request-Id"

@@ -148,10 +150,10 @@ var (

// All duration's unit is seconds
// Shouldn't larger then RateLimitKeyExpirationDuration
var (
    GlobalApiRateLimitNum = GetOrDefault("GLOBAL_API_RATE_LIMIT", 180)
    GlobalApiRateLimitNum = GetEnvOrDefault("GLOBAL_API_RATE_LIMIT", 180)
    GlobalApiRateLimitDuration int64 = 3 * 60

    GlobalWebRateLimitNum = GetOrDefault("GLOBAL_WEB_RATE_LIMIT", 60)
    GlobalWebRateLimitNum = GetEnvOrDefault("GLOBAL_WEB_RATE_LIMIT", 60)
    GlobalWebRateLimitDuration int64 = 3 * 60

    UploadRateLimitNum = 10

@@ -230,8 +232,10 @@ const (

    ChannelTypeAws = 33
    ChannelTypeCohere = 34
    ChannelTypeMiniMax = 35
    ChannelTypeSunoAPI = 36

    ChannelTypeDummy // this one is only for count, do not add any channel after this

)

var ChannelBaseURLs = []string{

@@ -271,4 +275,5 @@ var ChannelBaseURLs = []string{

    "", //33
    "https://api.cohere.ai", //34
    "https://api.minimax.chat", //35
    "", //36
}
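The new ChannelTypeSunoAPI = 36 constant lines up with the "" entry appended at index 36 of ChannelBaseURLs, so Suno API channels get no fixed upstream base URL; presumably the address configured on the channel itself is used, as Suno.md describes. A tiny standalone sketch of that index convention (a map stands in for the tail of the real slice):

```go
package main

import "fmt"

const ChannelTypeSunoAPI = 36 // mirrors the constant added above

// Stand-in for the tail of common.ChannelBaseURLs: the index matches the
// channel-type constant, and an empty string means "no default base URL".
var channelBaseURLs = map[int]string{
	34: "https://api.cohere.ai",
	35: "https://api.minimax.chat",
	36: "", // Suno API: the address configured on the channel is used instead
}

func main() {
	if channelBaseURLs[ChannelTypeSunoAPI] == "" {
		fmt.Println("Suno API channel: no default base URL, configure the proxy address on the channel")
	}
}
```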
common/env.go (new file, 26 lines)

@@ -0,0 +1,26 @@

package common

import (
    "fmt"
    "os"
    "strconv"
)

func GetEnvOrDefault(env string, defaultValue int) int {
    if env == "" || os.Getenv(env) == "" {
        return defaultValue
    }
    num, err := strconv.Atoi(os.Getenv(env))
    if err != nil {
        SysError(fmt.Sprintf("failed to parse %s: %s, using default value: %d", env, err.Error(), defaultValue))
        return defaultValue
    }
    return num
}

func GetEnvOrDefaultString(env string, defaultValue string) string {
    if env == "" || os.Getenv(env) == "" {
        return defaultValue
    }
    return os.Getenv(env)
}
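These two helpers take over from GetOrDefault/GetOrDefaultString, which a later hunk removes from common/utils.go. A minimal standalone sketch of the same lookup-with-fallback pattern (it uses the standard log package instead of the project's SysError, and the variable names are just examples):

```go
package main

import (
	"fmt"
	"log"
	"os"
	"strconv"
)

// getEnvOrDefault mirrors common.GetEnvOrDefault: an unset or empty variable,
// or a value that fails to parse as an int, falls back to the default.
func getEnvOrDefault(env string, defaultValue int) int {
	v := os.Getenv(env)
	if env == "" || v == "" {
		return defaultValue
	}
	num, err := strconv.Atoi(v)
	if err != nil {
		log.Printf("failed to parse %s: %s, using default value: %d", env, err.Error(), defaultValue)
		return defaultValue
	}
	return num
}

func main() {
	os.Setenv("SYNC_FREQUENCY", "120")
	fmt.Println(getEnvOrDefault("SYNC_FREQUENCY", 60))       // 120 (parsed from the environment)
	fmt.Println(getEnvOrDefault("BATCH_UPDATE_INTERVAL", 5)) // 5 (unset, default used)
	os.Setenv("RELAY_TIMEOUT", "not-a-number")
	fmt.Println(getEnvOrDefault("RELAY_TIMEOUT", 0)) // 0 (parse error, default used)
}
```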
@@ -3,6 +3,7 @@ package common

import (
    "fmt"
    "runtime/debug"
    "time"
)

func SafeGoroutine(f func()) {

@@ -45,3 +46,21 @@ func SafeSendString(ch chan string, value string) (closed bool) {

    // If the code reaches here, then the channel was not closed.
    return false
}

// SafeSendStringTimeout send, return true, else return false
func SafeSendStringTimeout(ch chan string, value string, timeout int) (closed bool) {
    defer func() {
        // Recover from panic if one occured. A panic would mean the channel was closed.
        if recover() != nil {
            closed = false
        }
    }()

    // This will panic if the channel is closed.
    select {
    case ch <- value:
        return true
    case <-time.After(time.Duration(timeout) * time.Second):
        return false
    }
}
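A standalone sketch of how a timeout-guarded send like SafeSendStringTimeout behaves. The streaming use case is only implied by this diff, so the example simply exercises the send and timeout paths with a local copy of the helper:

```go
package main

import (
	"fmt"
	"time"
)

// safeSendStringTimeout is a local copy of the helper above: it reports true
// when the value is received, and false when the send times out or the channel
// was already closed (a send on a closed channel panics and is recovered).
func safeSendStringTimeout(ch chan string, value string, timeout int) (ok bool) {
	defer func() {
		if recover() != nil {
			ok = false
		}
	}()
	select {
	case ch <- value:
		return true
	case <-time.After(time.Duration(timeout) * time.Second):
		return false
	}
}

func main() {
	ch := make(chan string) // unbuffered: a send only succeeds while someone is reading
	go func() { fmt.Println("received:", <-ch) }()

	fmt.Println(safeSendStringTimeout(ch, "data: chunk-1", 1)) // true: the goroutine reads it
	fmt.Println(safeSendStringTimeout(ch, "data: chunk-2", 1)) // false: nobody is reading, times out
}
```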
@@ -1,6 +1,8 @@

package common

import "encoding/json"
import (
    "encoding/json"
)

var GroupRatio = map[string]float64{
    "default": 1,
@@ -8,7 +8,6 @@

    "golang.org/x/image/webp"
    "image"
    "io"
    "net/http"
    "strings"
)

@@ -32,7 +31,7 @@ func DecodeBase64ImageData(base64String string) (image.Config, string, string, e

}

func IsImageUrl(url string) (bool, error) {
    resp, err := http.Head(url)
    resp, err := ProxiedHttpHead(url, OutProxyUrl)
    if err != nil {
        return false, err
    }

@@ -48,10 +47,13 @@ func GetImageFromUrl(url string) (mimeType string, data string, err error) {

    if !isImage {
        return
    }
    resp, err := http.Get(url)
    resp, err := ProxiedHttpGet(url, OutProxyUrl)
    if err != nil {
        return
    }
    if !strings.HasPrefix(resp.Header.Get("Content-Type"), "image/") {
        return
    }
    defer resp.Body.Close()
    buffer := bytes.NewBuffer(nil)
    _, err = buffer.ReadFrom(resp.Body)

@@ -64,13 +66,18 @@ func GetImageFromUrl(url string) (mimeType string, data string, err error) {

}

func DecodeUrlImageData(imageUrl string) (image.Config, string, error) {
    response, err := http.Get(imageUrl)
    response, err := ProxiedHttpGet(imageUrl, OutProxyUrl)
    if err != nil {
        SysLog(fmt.Sprintf("fail to get image from url: %s", err.Error()))
        return image.Config{}, "", err
    }
    defer response.Body.Close()

    if response.StatusCode != 200 {
        err = errors.New(fmt.Sprintf("fail to get image from url: %s", response.Status))
        return image.Config{}, "", err
    }

    var readData []byte
    for _, limit := range []int64{1024 * 8, 1024 * 24, 1024 * 64} {
        SysLog(fmt.Sprintf("try to decode image config with limit: %d", limit))
@@ -20,7 +20,7 @@

// 1 === $0.002 / 1K tokens
// 1 === ¥0.014 / 1k tokens

var DefaultModelRatio = map[string]float64{
var defaultModelRatio = map[string]float64{
    //"midjourney": 50,
    "gpt-4-gizmo-*": 15,
    "g-*": 15,

@@ -75,6 +75,7 @@ var DefaultModelRatio = map[string]float64{

    "claude-2.0": 4, // $8 / 1M tokens
    "claude-2.1": 4, // $8 / 1M tokens
    "claude-3-haiku-20240307": 0.125, // $0.25 / 1M tokens
    "claude-3-5-sonnet-20240620": 1.5, // $3 / 1M tokens
    "claude-3-sonnet-20240229": 1.5, // $3 / 1M tokens
    "claude-3-opus-20240229": 7.5, // $15 / 1M tokens
    "ERNIE-Bot": 0.8572, // ¥0.012 / 1k tokens //renamed to ERNIE-3.5-8K

@@ -113,6 +114,7 @@ var DefaultModelRatio = map[string]float64{

    "SparkDesk-v2.1": 1.2858, // ¥0.018 / 1k tokens
    "SparkDesk-v3.1": 1.2858, // ¥0.018 / 1k tokens
    "SparkDesk-v3.5": 1.2858, // ¥0.018 / 1k tokens
    "SparkDesk-v4.0": 1.2858,
    "360GPT_S2_V9": 0.8572, // ¥0.012 / 1k tokens
    "360gpt-turbo": 0.0858, // ¥0.0012 / 1k tokens
    "360gpt-turbo-responsibility-8k": 0.8572, // ¥0.012 / 1k tokens

@@ -150,7 +152,7 @@ var DefaultModelRatio = map[string]float64{

    "llama-3-sonar-large-32k-online": 1 / 1000 * USD,
}

var DefaultModelPrice = map[string]float64{
var defaultModelPrice = map[string]float64{
    "dall-e-2": 0.02,
    "dall-e-3": 0.04,
    "gpt-4-gizmo-*": 0.1,

@@ -176,15 +178,16 @@ var modelPrice map[string]float64 = nil

var modelRatio map[string]float64 = nil

var CompletionRatio map[string]float64 = nil
var DefaultCompletionRatio = map[string]float64{
var defaultCompletionRatio = map[string]float64{
    "gpt-4-gizmo-*": 2,
    "g-*": 2,
    "gpt-4-all": 2,
    "gpt-4o-all": 2,
}

func ModelPrice2JSONString() string {
    if modelPrice == nil {
        modelPrice = DefaultModelPrice
        modelPrice = defaultModelPrice
    }
    jsonBytes, err := json.Marshal(modelPrice)
    if err != nil {

@@ -201,7 +204,7 @@ func UpdateModelPriceByJSONString(jsonStr string) error {

// GetModelPrice returns the model's price; if the model does not exist it returns -1, false
func GetModelPrice(name string, printErr bool) (float64, bool) {
    if modelPrice == nil {
        modelPrice = DefaultModelPrice
        modelPrice = defaultModelPrice
    }
    if strings.HasPrefix(name, "gpt-4-gizmo") {
        name = "gpt-4-gizmo-*"

@@ -218,16 +221,16 @@ func GetModelPrice(name string, printErr bool) (float64, bool) {

    return price, true
}

func GetModelPrices() map[string]float64 {
func GetModelPriceMap() map[string]float64 {
    if modelPrice == nil {
        modelPrice = DefaultModelPrice
        modelPrice = defaultModelPrice
    }
    return modelPrice
}

func ModelRatio2JSONString() string {
    if modelRatio == nil {
        modelRatio = DefaultModelRatio
        modelRatio = defaultModelRatio
    }
    jsonBytes, err := json.Marshal(modelRatio)
    if err != nil {

@@ -243,7 +246,7 @@ func UpdateModelRatioByJSONString(jsonStr string) error {

func GetModelRatio(name string) float64 {
    if modelRatio == nil {
        modelRatio = DefaultModelRatio
        modelRatio = defaultModelRatio
    }
    if strings.HasPrefix(name, "gpt-4-gizmo") {
        name = "gpt-4-gizmo-*"

@@ -258,16 +261,21 @@ func GetModelRatio(name string) float64 {

    return ratio
}

func GetModelRatios() map[string]float64 {
    if modelRatio == nil {
        modelRatio = DefaultModelRatio
func DefaultModelRatio2JSONString() string {
    jsonBytes, err := json.Marshal(defaultModelRatio)
    if err != nil {
        SysError("error marshalling model ratio: " + err.Error())
    }
    return modelRatio
    return string(jsonBytes)
}

func GetDefaultModelRatioMap() map[string]float64 {
    return defaultModelRatio
}

func CompletionRatio2JSONString() string {
    if CompletionRatio == nil {
        CompletionRatio = DefaultCompletionRatio
        CompletionRatio = defaultCompletionRatio
    }
    jsonBytes, err := json.Marshal(CompletionRatio)
    if err != nil {

@@ -298,7 +306,7 @@ func GetCompletionRatio(name string) float64 {

        return 3
    }

    return 1.333333
    return 4.0 / 3.0
}
if strings.HasPrefix(name, "gpt-4") && name != "gpt-4-all" && name != "gpt-4-gizmo-*" {
    if strings.HasSuffix(name, "preview") || strings.HasPrefix(name, "gpt-4-turbo") || strings.HasPrefix(name, "gpt-4o") {

@@ -355,9 +363,9 @@ func GetCompletionRatio(name string) float64 {

    return 1
}

func GetCompletionRatios() map[string]float64 {
func GetCompletionRatioMap() map[string]float64 {
    if CompletionRatio == nil {
        CompletionRatio = DefaultCompletionRatio
        CompletionRatio = defaultCompletionRatio
    }
    return CompletionRatio
}
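For orientation, the header comments above equate a ratio of 1 with $0.002 per 1K tokens, and common/constants.go sets QuotaPerUnit = 500 * 1000.0 quota per dollar. The sketch below only illustrates that arithmetic; the prompt/completion split follows the upstream One API convention and is an assumption here, not code taken from this diff:

```go
package main

import "fmt"

const quotaPerUnit = 500 * 1000.0 // quota per US dollar (common.QuotaPerUnit)

// approxCostUSD: assuming quota = (prompt + completion*completionRatio) * modelRatio,
// a ratio of 1 works out to 1000 quota per 1K prompt tokens = $0.002, which matches
// the comment at the top of the file.
func approxCostUSD(promptTokens, completionTokens int, modelRatio, completionRatio float64) float64 {
	quota := (float64(promptTokens) + float64(completionTokens)*completionRatio) * modelRatio
	return quota / quotaPerUnit
}

func main() {
	// Illustrative numbers only.
	fmt.Printf("$%.4f\n", approxCostUSD(1000, 0, 1.0, 1.0))   // $0.0020
	fmt.Printf("$%.4f\n", approxCostUSD(1000, 500, 1.5, 2.0)) // $0.0060
}
```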
@@ -1,6 +1,8 @@

package common

import "encoding/json"
import (
    "encoding/json"
)

var TopupGroupRatio = map[string]float64{
    "default": 1,
@@ -1,14 +1,18 @@

package common

import (
    "context"
    "encoding/json"
    "errors"
    "fmt"
    "github.com/google/uuid"
    "golang.org/x/net/proxy"
    "html/template"
    "log"
    "math/rand"
    "net"
    "os"
    "net/http"
    "net/url"
    "os/exec"
    "runtime"
    "strconv"

@@ -191,25 +195,6 @@ func Max(a int, b int) int {

    }
}

func GetOrDefault(env string, defaultValue int) int {
    if env == "" || os.Getenv(env) == "" {
        return defaultValue
    }
    num, err := strconv.Atoi(os.Getenv(env))
    if err != nil {
        SysError(fmt.Sprintf("failed to parse %s: %s, using default value: %d", env, err.Error(), defaultValue))
        return defaultValue
    }
    return num
}

func GetOrDefaultString(env string, defaultValue string) string {
    if env == "" || os.Getenv(env) == "" {
        return defaultValue
    }
    return os.Getenv(env)
}

func MessageWithRequestId(message string, id string) string {
    return fmt.Sprintf("%s (request id: %s)", message, id)
}

@@ -267,3 +252,56 @@ func StrToMap(str string) map[string]interface{} {

    }
    return m
}

func GetProxiedHttpClient(proxyUrl string) (*http.Client, error) {
    if "" == proxyUrl {
        return &http.Client{}, nil
    }

    u, err := url.Parse(proxyUrl)
    if err != nil {
        return nil, err
    }

    if strings.HasPrefix(proxyUrl, "http") {

        return &http.Client{
            Transport: &http.Transport{
                Proxy: http.ProxyURL(u),
            },
        }, nil
    } else if strings.HasPrefix(proxyUrl, "socks") {
        dialer, err := proxy.FromURL(u, proxy.Direct)
        if err != nil {
            return nil, err
        }

        return &http.Client{
            Transport: &http.Transport{
                DialContext: func(ctx context.Context, network, addr string) (net.Conn, error) {
                    return dialer.(proxy.ContextDialer).DialContext(ctx, network, addr)
                },
            },
        }, nil
    }

    return nil, errors.New("unsupported proxy type")
}

func ProxiedHttpGet(url, proxyUrl string) (*http.Response, error) {
    client, err := GetProxiedHttpClient(proxyUrl)
    if err != nil {
        return nil, err
    }

    return client.Get(url)
}

func ProxiedHttpHead(url, proxyUrl string) (*http.Response, error) {
    client, err := GetProxiedHttpClient(proxyUrl)
    if err != nil {
        return nil, err
    }

    return client.Head(url)
}
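A hedged usage sketch of the new proxied HTTP helpers. The target URL is a placeholder, and reading the proxy from an environment variable is an assumption made for the example; the diff itself only introduces the OutProxyUrl variable in common/constants.go and passes it at the call sites in common/image.go:

```go
package main

import (
	"fmt"
	"io"
	"os"

	"one-api/common"
)

func main() {
	// An empty proxy string yields a plain http.Client; "http(s)://..." proxies go
	// through http.ProxyURL, and "socks..." proxies go through golang.org/x/net/proxy.
	proxyURL := os.Getenv("OUT_PROXY_URL") // placeholder: how OutProxyUrl gets populated is not shown in this diff

	resp, err := common.ProxiedHttpGet("https://example.com/", proxyURL)
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, len(body), "bytes")
}
```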
constant/env.go (new file, 7 lines)

@@ -0,0 +1,7 @@

package constant

import (
    "one-api/common"
)

var StreamingTimeout = common.GetEnvOrDefault("STREAMING_TIMEOUT", 30)
@@ -16,7 +16,7 @@

// SensitiveWords is the sensitive-word list
// var SensitiveWords []string
var SensitiveWords = []string{
    "test",
    "test_sensitive",
}

func SensitiveWordsToString() string {
constant/task.go (new file, 18 lines)

@@ -0,0 +1,18 @@

package constant

type TaskPlatform string

const (
    TaskPlatformSuno       TaskPlatform = "suno"
    TaskPlatformMidjourney              = "mj"
)

const (
    SunoActionMusic  = "MUSIC"
    SunoActionLyrics = "LYRICS"
)

var SunoModel2Action = map[string]string{
    "suno_music":  SunoActionMusic,
    "suno_lyrics": SunoActionLyrics,
}
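A standalone sketch of how the model-to-action map above can be consulted when deciding whether an incoming model name is a Suno task. The constants are mirrored locally so the snippet runs on its own; the dispatch itself is an assumption for illustration, not code from this diff:

```go
package main

import "fmt"

const (
	SunoActionMusic  = "MUSIC"
	SunoActionLyrics = "LYRICS"
)

// sunoModel2Action mirrors constant.SunoModel2Action defined above.
var sunoModel2Action = map[string]string{
	"suno_music":  SunoActionMusic,
	"suno_lyrics": SunoActionLyrics,
}

func main() {
	for _, model := range []string{"suno_music", "suno_lyrics", "gpt-4o"} {
		if action, ok := sunoModel2Action[model]; ok {
			fmt.Printf("%s -> Suno action %s\n", model, action)
		} else {
			fmt.Printf("%s -> not a Suno task model\n", model)
		}
	}
}
```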
@@ -27,6 +27,9 @@ func testChannel(channel *model.Channel, testModel string) (err error, openaiErr

    if channel.Type == common.ChannelTypeMidjourney {
        return errors.New("midjourney channel test is not supported"), nil
    }
    if channel.Type == common.ChannelTypeSunoAPI {
        return errors.New("suno channel test is not supported"), nil
    }
    w := httptest.NewRecorder()
    c, _ := gin.CreateTestContext(w)
    c.Request = &http.Request{

@@ -219,11 +222,18 @@ func testAllChannels(notify bool) error {

    if channel.AutoBan != nil && *channel.AutoBan == 0 {
        ban = false
    }
    if isChannelEnabled && service.ShouldDisableChannel(openaiErr, -1) && ban {
        service.DisableChannel(channel.Id, channel.Name, err.Error())
    }
    if !isChannelEnabled && service.ShouldEnableChannel(err, openaiErr, channel.Status) {
        service.EnableChannel(channel.Id, channel.Name)
    if openaiErr != nil {
        openAiErrWithStatus := dto.OpenAIErrorWithStatusCode{
            StatusCode: -1,
            Error:      *openaiErr,
            LocalError: false,
        }
        if isChannelEnabled && service.ShouldDisableChannel(&openAiErrWithStatus) && ban {
            service.DisableChannel(channel.Id, channel.Name, err.Error())
        }
        if !isChannelEnabled && service.ShouldEnableChannel(err, openaiErr, channel.Status) {
            service.EnableChannel(channel.Id, channel.Name)
        }
    }
    channel.UpdateResponseTime(milliseconds)
    time.Sleep(common.RequestInterval)
@@ -1,6 +1,8 @@

package controller

import (
    "encoding/json"
    "fmt"
    "github.com/gin-gonic/gin"
    "net/http"
    "one-api/common"

@@ -9,6 +11,34 @@ import (

    "strings"
)

type OpenAIModel struct {
    ID         string `json:"id"`
    Object     string `json:"object"`
    Created    int64  `json:"created"`
    OwnedBy    string `json:"owned_by"`
    Permission []struct {
        ID                 string `json:"id"`
        Object             string `json:"object"`
        Created            int64  `json:"created"`
        AllowCreateEngine  bool   `json:"allow_create_engine"`
        AllowSampling      bool   `json:"allow_sampling"`
        AllowLogprobs      bool   `json:"allow_logprobs"`
        AllowSearchIndices bool   `json:"allow_search_indices"`
        AllowView          bool   `json:"allow_view"`
        AllowFineTuning    bool   `json:"allow_fine_tuning"`
        Organization       string `json:"organization"`
        Group              string `json:"group"`
        IsBlocking         bool   `json:"is_blocking"`
    } `json:"permission"`
    Root   string `json:"root"`
    Parent string `json:"parent"`
}

type OpenAIModelsResponse struct {
    Data    []OpenAIModel `json:"data"`
    Success bool          `json:"success"`
}

func GetAllChannels(c *gin.Context) {
    p, _ := strconv.Atoi(c.Query("p"))
    pageSize, _ := strconv.Atoi(c.Query("page_size"))

@@ -35,6 +65,65 @@ func GetAllChannels(c *gin.Context) {

    return
}

func FetchUpstreamModels(c *gin.Context) {
    id, err := strconv.Atoi(c.Param("id"))
    if err != nil {
        c.JSON(http.StatusOK, gin.H{
            "success": false,
            "message": err.Error(),
        })
        return
    }
    channel, err := model.GetChannelById(id, true)
    if err != nil {
        c.JSON(http.StatusOK, gin.H{
            "success": false,
            "message": err.Error(),
        })
        return
    }
    if channel.Type != common.ChannelTypeOpenAI {
        c.JSON(http.StatusOK, gin.H{
            "success": false,
            "message": "仅支持 OpenAI 类型渠道",
        })
        return
    }
    url := fmt.Sprintf("%s/v1/models", *channel.BaseURL)
    body, err := GetResponseBody("GET", url, channel, GetAuthHeader(channel.Key))
    if err != nil {
        c.JSON(http.StatusOK, gin.H{
            "success": false,
            "message": err.Error(),
        })
    }
    result := OpenAIModelsResponse{}
    err = json.Unmarshal(body, &result)
    if err != nil {
        c.JSON(http.StatusOK, gin.H{
            "success": false,
            "message": err.Error(),
        })
    }
    if !result.Success {
        c.JSON(http.StatusOK, gin.H{
            "success": false,
            "message": "上游返回错误",
        })
    }

    var ids []string
    for _, model := range result.Data {
        ids = append(ids, model.ID)
    }

    c.JSON(http.StatusOK, gin.H{
        "success": true,
        "message": "",
        "data":    ids,
    })
}

func FixChannelsAbilities(c *gin.Context) {
    count, err := model.FixAbility()
    if err != nil {
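FetchUpstreamModels expects the upstream /v1/models body to unmarshal into OpenAIModelsResponse, including a success flag, and it returns only the model IDs. A standalone sketch of that decoding step with an invented sample payload:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Trimmed-down mirrors of the structs declared above; only the fields the
// example touches are kept.
type openAIModel struct {
	ID string `json:"id"`
}

type openAIModelsResponse struct {
	Data    []openAIModel `json:"data"`
	Success bool          `json:"success"`
}

func main() {
	// Illustrative upstream body; a real upstream carries more fields per model.
	body := []byte(`{"success": true, "data": [{"id": "gpt-4o"}, {"id": "gpt-4o-mini"}]}`)

	var result openAIModelsResponse
	if err := json.Unmarshal(body, &result); err != nil {
		panic(err)
	}
	var ids []string
	for _, m := range result.Data {
		ids = append(ids, m.ID)
	}
	fmt.Println(result.Success, ids) // true [gpt-4o gpt-4o-mini]
}
```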
@@ -59,6 +59,7 @@ func GetStatus(c *gin.Context) {

    "display_in_currency":      common.DisplayInCurrencyEnabled,
    "enable_batch_update":      common.BatchUpdateEnabled,
    "enable_drawing":           common.DrawingEnabled,
    "enable_task":              common.TaskEnabled,
    "enable_data_export":       common.DataExportEnabled,
    "data_export_default_time": common.DataExportDefaultTime,
    "default_collapse_sidebar": common.DefaultCollapseSidebar,
@@ -200,18 +200,3 @@ func RetrieveModel(c *gin.Context) {

        })
    }
}

func GetPricing(c *gin.Context) {
    userId := c.GetInt("id")
    group, err := model.CacheGetUserGroup(userId)
    groupRatio := common.GetGroupRatio("default")
    if err != nil {
        groupRatio = common.GetGroupRatio(group)
    }
    pricing := model.GetPricing(group)
    c.JSON(200, gin.H{
        "success":     true,
        "data":        pricing,
        "group_ratio": groupRatio,
    })
}
controller/pricing.go (new file, 47 lines)

@@ -0,0 +1,47 @@

package controller

import (
    "github.com/gin-gonic/gin"
    "one-api/common"
    "one-api/model"
)

func GetPricing(c *gin.Context) {
    userId := c.GetInt("id")
    // if no login, get default group ratio
    groupRatio := common.GetGroupRatio("default")
    group, err := model.CacheGetUserGroup(userId)
    if err == nil {
        groupRatio = common.GetGroupRatio(group)
    }
    pricing := model.GetPricing(group)
    c.JSON(200, gin.H{
        "success":     true,
        "data":        pricing,
        "group_ratio": groupRatio,
    })
}

func ResetModelRatio(c *gin.Context) {
    defaultStr := common.DefaultModelRatio2JSONString()
    err := model.UpdateOption("ModelRatio", defaultStr)
    if err != nil {
        c.JSON(200, gin.H{
            "success": false,
            "message": err.Error(),
        })
        return
    }
    err = common.UpdateModelRatioByJSONString(defaultStr)
    if err != nil {
        c.JSON(200, gin.H{
            "success": false,
            "message": err.Error(),
        })
        return
    }
    c.JSON(200, gin.H{
        "success": true,
        "message": "重置模型倍率成功",
    })
}
@@ -128,7 +128,7 @@ func shouldRetry(c *gin.Context, channelId int, openaiErr *dto.OpenAIErrorWithSt

func processChannelError(c *gin.Context, channelId int, err *dto.OpenAIErrorWithStatusCode) {
    autoBan := c.GetBool("auto_ban")
    common.LogError(c.Request.Context(), fmt.Sprintf("relay error (channel #%d, status code: %d): %s", channelId, err.StatusCode, err.Error.Message))
    if service.ShouldDisableChannel(&err.Error, err.StatusCode) && autoBan {
    if service.ShouldDisableChannel(err) && autoBan {
        channelName := c.GetString("channel_name")
        service.DisableChannel(channelId, channelName, err.Error.Message)
    }

@@ -190,3 +190,94 @@ func RelayNotFound(c *gin.Context) {

        "error": err,
    })
}

func RelayTask(c *gin.Context) {
    retryTimes := common.RetryTimes
    channelId := c.GetInt("channel_id")
    relayMode := c.GetInt("relay_mode")
    group := c.GetString("group")
    originalModel := c.GetString("original_model")
    c.Set("use_channel", []string{fmt.Sprintf("%d", channelId)})
    taskErr := taskRelayHandler(c, relayMode)
    if taskErr == nil {
        retryTimes = 0
    }
    for i := 0; shouldRetryTaskRelay(c, channelId, taskErr, retryTimes) && i < retryTimes; i++ {
        channel, err := model.CacheGetRandomSatisfiedChannel(group, originalModel, i)
        if err != nil {
            common.LogError(c.Request.Context(), fmt.Sprintf("CacheGetRandomSatisfiedChannel failed: %s", err.Error()))
            break
        }
        channelId = channel.Id
        useChannel := c.GetStringSlice("use_channel")
        useChannel = append(useChannel, fmt.Sprintf("%d", channelId))
        c.Set("use_channel", useChannel)
        common.LogInfo(c.Request.Context(), fmt.Sprintf("using channel #%d to retry (remain times %d)", channel.Id, i))
        middleware.SetupContextForSelectedChannel(c, channel, originalModel)

        requestBody, err := common.GetRequestBody(c)
        c.Request.Body = io.NopCloser(bytes.NewBuffer(requestBody))
        taskErr = taskRelayHandler(c, relayMode)
    }
    useChannel := c.GetStringSlice("use_channel")
    if len(useChannel) > 1 {
        retryLogStr := fmt.Sprintf("重试:%s", strings.Trim(strings.Join(strings.Fields(fmt.Sprint(useChannel)), "->"), "[]"))
        common.LogInfo(c.Request.Context(), retryLogStr)
    }
    if taskErr != nil {
        if taskErr.StatusCode == http.StatusTooManyRequests {
            taskErr.Message = "当前分组上游负载已饱和,请稍后再试"
        }
        c.JSON(taskErr.StatusCode, taskErr)
    }
}

func taskRelayHandler(c *gin.Context, relayMode int) *dto.TaskError {
    var err *dto.TaskError
    switch relayMode {
    case relayconstant.RelayModeSunoFetch, relayconstant.RelayModeSunoFetchByID:
        err = relay.RelayTaskFetch(c, relayMode)
    default:
        err = relay.RelayTaskSubmit(c, relayMode)
    }
    return err
}

func shouldRetryTaskRelay(c *gin.Context, channelId int, taskErr *dto.TaskError, retryTimes int) bool {
    if taskErr == nil {
        return false
    }
    if retryTimes <= 0 {
        return false
    }
    if _, ok := c.Get("specific_channel_id"); ok {
        return false
    }
    if taskErr.StatusCode == http.StatusTooManyRequests {
        return true
    }
    if taskErr.StatusCode == 307 {
        return true
    }
    if taskErr.StatusCode/100 == 5 {
        // do not retry on timeouts
        if taskErr.StatusCode == 504 || taskErr.StatusCode == 524 {
            return false
        }
        return true
    }
    if taskErr.StatusCode == http.StatusBadRequest {
        return false
    }
    if taskErr.StatusCode == 408 {
        // azure processing timeout: do not retry
        return false
    }
    if taskErr.LocalError {
        return false
    }
    if taskErr.StatusCode/100 == 2 {
        return false
    }
    return true
}
controller/task.go (new file, 284 lines)

@@ -0,0 +1,284 @@

package controller

import (
    "context"
    "encoding/json"
    "errors"
    "fmt"
    "github.com/gin-gonic/gin"
    "github.com/samber/lo"
    "io"
    "net/http"
    "one-api/common"
    "one-api/constant"
    "one-api/dto"
    "one-api/model"
    "one-api/relay"
    "sort"
    "strconv"
    "time"
)

func UpdateTaskBulk() {
    //recover
    //imageModel := "midjourney"
    for {
        time.Sleep(time.Duration(15) * time.Second)
        common.SysLog("任务进度轮询开始")
        ctx := context.TODO()
        allTasks := model.GetAllUnFinishSyncTasks(500)
        platformTask := make(map[constant.TaskPlatform][]*model.Task)
        for _, t := range allTasks {
            platformTask[t.Platform] = append(platformTask[t.Platform], t)
        }
        for platform, tasks := range platformTask {
            if len(tasks) == 0 {
                continue
            }
            taskChannelM := make(map[int][]string)
            taskM := make(map[string]*model.Task)
            nullTaskIds := make([]int64, 0)
            for _, task := range tasks {
                if task.TaskID == "" {
                    // collect unfinished tasks that never got a task id
                    nullTaskIds = append(nullTaskIds, task.ID)
                    continue
                }
                taskM[task.TaskID] = task
                taskChannelM[task.ChannelId] = append(taskChannelM[task.ChannelId], task.TaskID)
            }
            if len(nullTaskIds) > 0 {
                err := model.TaskBulkUpdateByID(nullTaskIds, map[string]any{
                    "status":   "FAILURE",
                    "progress": "100%",
                })
                if err != nil {
                    common.LogError(ctx, fmt.Sprintf("Fix null task_id task error: %v", err))
                } else {
                    common.LogInfo(ctx, fmt.Sprintf("Fix null task_id task success: %v", nullTaskIds))
                }
            }
            if len(taskChannelM) == 0 {
                continue
            }

            UpdateTaskByPlatform(platform, taskChannelM, taskM)
        }
        common.SysLog("任务进度轮询完成")
    }
}

func UpdateTaskByPlatform(platform constant.TaskPlatform, taskChannelM map[int][]string, taskM map[string]*model.Task) {
    switch platform {
    case constant.TaskPlatformMidjourney:
        //_ = UpdateMidjourneyTaskAll(context.Background(), tasks)
    case constant.TaskPlatformSuno:
        _ = UpdateSunoTaskAll(context.Background(), taskChannelM, taskM)
    default:
        common.SysLog("未知平台")
    }
}

func UpdateSunoTaskAll(ctx context.Context, taskChannelM map[int][]string, taskM map[string]*model.Task) error {
    for channelId, taskIds := range taskChannelM {
        err := updateSunoTaskAll(ctx, channelId, taskIds, taskM)
        if err != nil {
            common.LogError(ctx, fmt.Sprintf("渠道 #%d 更新异步任务失败: %d", channelId, err.Error()))
        }
    }
    return nil
}

func updateSunoTaskAll(ctx context.Context, channelId int, taskIds []string, taskM map[string]*model.Task) error {
    common.LogInfo(ctx, fmt.Sprintf("渠道 #%d 未完成的任务有: %d", channelId, len(taskIds)))
    if len(taskIds) == 0 {
        return nil
    }
    channel, err := model.CacheGetChannel(channelId)
    if err != nil {
        common.SysLog(fmt.Sprintf("CacheGetChannel: %v", err))
        err = model.TaskBulkUpdate(taskIds, map[string]any{
            "fail_reason": fmt.Sprintf("获取渠道信息失败,请联系管理员,渠道ID:%d", channelId),
            "status":      "FAILURE",
            "progress":    "100%",
        })
        if err != nil {
            common.SysError(fmt.Sprintf("UpdateMidjourneyTask error2: %v", err))
        }
        return err
    }
    adaptor := relay.GetTaskAdaptor(constant.TaskPlatformSuno)
    if adaptor == nil {
        return errors.New("adaptor not found")
    }
    resp, err := adaptor.FetchTask(*channel.BaseURL, channel.Key, map[string]any{
        "ids": taskIds,
    })
    if err != nil {
        common.SysError(fmt.Sprintf("Get Task Do req error: %v", err))
        return err
    }
    if resp.StatusCode != http.StatusOK {
        common.LogError(ctx, fmt.Sprintf("Get Task status code: %d", resp.StatusCode))
        return errors.New(fmt.Sprintf("Get Task status code: %d", resp.StatusCode))
    }
    defer resp.Body.Close()
    responseBody, err := io.ReadAll(resp.Body)
    if err != nil {
        common.SysError(fmt.Sprintf("Get Task parse body error: %v", err))
        return err
    }
    var responseItems dto.TaskResponse[[]dto.SunoDataResponse]
    err = json.Unmarshal(responseBody, &responseItems)
    if err != nil {
        common.LogError(ctx, fmt.Sprintf("Get Task parse body error2: %v, body: %s", err, string(responseBody)))
        return err
    }
    if !responseItems.IsSuccess() {
        common.SysLog(fmt.Sprintf("渠道 #%d 未完成的任务有: %d, 成功获取到任务数: %d", channelId, len(taskIds), string(responseBody)))
        return err
    }

    for _, responseItem := range responseItems.Data {
        task := taskM[responseItem.TaskID]
        if !checkTaskNeedUpdate(task, responseItem) {
            continue
        }

        task.Status = lo.If(model.TaskStatus(responseItem.Status) != "", model.TaskStatus(responseItem.Status)).Else(task.Status)
        task.FailReason = lo.If(responseItem.FailReason != "", responseItem.FailReason).Else(task.FailReason)
        task.SubmitTime = lo.If(responseItem.SubmitTime != 0, responseItem.SubmitTime).Else(task.SubmitTime)
        task.StartTime = lo.If(responseItem.StartTime != 0, responseItem.StartTime).Else(task.StartTime)
        task.FinishTime = lo.If(responseItem.FinishTime != 0, responseItem.FinishTime).Else(task.FinishTime)
        if responseItem.FailReason != "" || task.Status == model.TaskStatusFailure {
            common.LogInfo(ctx, task.TaskID+" 构建失败,"+task.FailReason)
            task.Progress = "100%"
            err = model.CacheUpdateUserQuota(task.UserId)
            if err != nil {
                common.LogError(ctx, "error update user quota cache: "+err.Error())
            } else {
                quota := task.Quota
                if quota != 0 {
                    err = model.IncreaseUserQuota(task.UserId, quota)
                    if err != nil {
                        common.LogError(ctx, "fail to increase user quota: "+err.Error())
                    }
                    logContent := fmt.Sprintf("异步任务执行失败 %s,补偿 %s", task.TaskID, common.LogQuota(quota))
                    model.RecordLog(task.UserId, model.LogTypeSystem, logContent)
                }
            }
        }
        if responseItem.Status == model.TaskStatusSuccess {
            task.Progress = "100%"
        }
        task.Data = responseItem.Data

        err = task.Update()
        if err != nil {
            common.SysError("UpdateMidjourneyTask task error: " + err.Error())
        }
    }
    return nil
}

func checkTaskNeedUpdate(oldTask *model.Task, newTask dto.SunoDataResponse) bool {

    if oldTask.SubmitTime != newTask.SubmitTime {
        return true
    }
    if oldTask.StartTime != newTask.StartTime {
        return true
    }
    if oldTask.FinishTime != newTask.FinishTime {
        return true
    }
    if string(oldTask.Status) != newTask.Status {
        return true
    }
    if oldTask.FailReason != newTask.FailReason {
        return true
    }
    if oldTask.FinishTime != newTask.FinishTime {
        return true
    }

    if (oldTask.Status == model.TaskStatusFailure || oldTask.Status == model.TaskStatusSuccess) && oldTask.Progress != "100%" {
        return true
    }

    oldData, _ := json.Marshal(oldTask.Data)
    newData, _ := json.Marshal(newTask.Data)

    sort.Slice(oldData, func(i, j int) bool {
        return oldData[i] < oldData[j]
    })
    sort.Slice(newData, func(i, j int) bool {
        return newData[i] < newData[j]
    })

    if string(oldData) != string(newData) {
        return true
    }
    return false
}

func GetAllTask(c *gin.Context) {
    p, _ := strconv.Atoi(c.Query("p"))
    if p < 0 {
        p = 0
    }
    startTimestamp, _ := strconv.ParseInt(c.Query("start_timestamp"), 10, 64)
    endTimestamp, _ := strconv.ParseInt(c.Query("end_timestamp"), 10, 64)
    // parse the remaining query parameters
    queryParams := model.SyncTaskQueryParams{
        Platform:       constant.TaskPlatform(c.Query("platform")),
        TaskID:         c.Query("task_id"),
        Status:         c.Query("status"),
        Action:         c.Query("action"),
        StartTimestamp: startTimestamp,
        EndTimestamp:   endTimestamp,
    }

    logs := model.TaskGetAllTasks(p*common.ItemsPerPage, common.ItemsPerPage, queryParams)
    if logs == nil {
        logs = make([]*model.Task, 0)
    }

    c.JSON(200, gin.H{
        "success": true,
        "message": "",
        "data":    logs,
    })
}

func GetUserTask(c *gin.Context) {
    p, _ := strconv.Atoi(c.Query("p"))
    if p < 0 {
        p = 0
    }

    userId := c.GetInt("id")

    startTimestamp, _ := strconv.ParseInt(c.Query("start_timestamp"), 10, 64)
    endTimestamp, _ := strconv.ParseInt(c.Query("end_timestamp"), 10, 64)

    queryParams := model.SyncTaskQueryParams{
        Platform:       constant.TaskPlatform(c.Query("platform")),
        TaskID:         c.Query("task_id"),
        Status:         c.Query("status"),
        Action:         c.Query("action"),
        StartTimestamp: startTimestamp,
        EndTimestamp:   endTimestamp,
    }

    logs := model.TaskGetAllUserTask(userId, p*common.ItemsPerPage, common.ItemsPerPage, queryParams)
    if logs == nil {
        logs = make([]*model.Task, 0)
    }

    c.JSON(200, gin.H{
        "success": true,
        "message": "",
        "data":    logs,
    })
}
|
||||
@@ -24,14 +24,3 @@ type OpenAIModels struct {
|
||||
Root string `json:"root"`
|
||||
Parent *string `json:"parent"`
|
||||
}
|
||||
|
||||
type ModelPricing struct {
|
||||
Available bool `json:"available"`
|
||||
ModelName string `json:"model_name"`
|
||||
QuotaType int `json:"quota_type"`
|
||||
ModelRatio float64 `json:"model_ratio"`
|
||||
ModelPrice float64 `json:"model_price"`
|
||||
OwnerBy string `json:"owner_by"`
|
||||
CompletionRatio float64 `json:"completion_ratio"`
|
||||
EnableGroup []string `json:"enable_group,omitempty"`
|
||||
}
|
||||
|
||||
dto/suno.go (new file, 129 lines)
@@ -0,0 +1,129 @@
package dto

import (
	"encoding/json"
)

type TaskData interface {
	SunoDataResponse | []SunoDataResponse | string | any
}

type SunoSubmitReq struct {
	GptDescriptionPrompt string  `json:"gpt_description_prompt,omitempty"`
	Prompt               string  `json:"prompt,omitempty"`
	Mv                   string  `json:"mv,omitempty"`
	Title                string  `json:"title,omitempty"`
	Tags                 string  `json:"tags,omitempty"`
	ContinueAt           float64 `json:"continue_at,omitempty"`
	TaskID               string  `json:"task_id,omitempty"`
	ContinueClipId       string  `json:"continue_clip_id,omitempty"`
	MakeInstrumental     bool    `json:"make_instrumental"`
}

type FetchReq struct {
	IDs []string `json:"ids"`
}
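A minimal sketch of how a client might fill in SunoSubmitReq for a music submission; the prompt, tags, and title are illustrative only, and "chirp-v3-0" is the default the suno adaptor later in this diff falls back to when Mv is empty:

```go
package main

import (
	"encoding/json"
	"fmt"

	"one-api/dto"
)

func main() {
	// Illustrative values only.
	req := dto.SunoSubmitReq{
		Prompt:           "[Verse]\nNeon rain on empty streets",
		Mv:               "chirp-v3-0",
		Title:            "Neon Rain",
		Tags:             "synthwave, dreamy",
		MakeInstrumental: false,
	}
	body, _ := json.Marshal(req)
	// The resulting JSON is what gets POSTed to the /suno/submit/music route
	// built by the adaptor further down in this diff.
	fmt.Println(string(body))
}
```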
type SunoDataResponse struct {
|
||||
TaskID string `json:"task_id" gorm:"type:varchar(50);index"`
|
||||
Action string `json:"action" gorm:"type:varchar(40);index"` // 任务类型, song, lyrics, description-mode
|
||||
Status string `json:"status" gorm:"type:varchar(20);index"` // 任务状态, submitted, queueing, processing, success, failed
|
||||
FailReason string `json:"fail_reason"`
|
||||
SubmitTime int64 `json:"submit_time" gorm:"index"`
|
||||
StartTime int64 `json:"start_time" gorm:"index"`
|
||||
FinishTime int64 `json:"finish_time" gorm:"index"`
|
||||
Data json.RawMessage `json:"data" gorm:"type:json"`
|
||||
}
|
||||
|
||||
type SunoSong struct {
|
||||
ID string `json:"id"`
|
||||
VideoURL string `json:"video_url"`
|
||||
AudioURL string `json:"audio_url"`
|
||||
ImageURL string `json:"image_url"`
|
||||
ImageLargeURL string `json:"image_large_url"`
|
||||
MajorModelVersion string `json:"major_model_version"`
|
||||
ModelName string `json:"model_name"`
|
||||
Status string `json:"status"`
|
||||
Title string `json:"title"`
|
||||
Text string `json:"text"`
|
||||
Metadata SunoMetadata `json:"metadata"`
|
||||
}
|
||||
|
||||
type SunoMetadata struct {
|
||||
Tags string `json:"tags"`
|
||||
Prompt string `json:"prompt"`
|
||||
GPTDescriptionPrompt interface{} `json:"gpt_description_prompt"`
|
||||
AudioPromptID interface{} `json:"audio_prompt_id"`
|
||||
Duration interface{} `json:"duration"`
|
||||
ErrorType interface{} `json:"error_type"`
|
||||
ErrorMessage interface{} `json:"error_message"`
|
||||
}
|
||||
|
||||
type SunoLyrics struct {
|
||||
ID string `json:"id"`
|
||||
Status string `json:"status"`
|
||||
Title string `json:"title"`
|
||||
Text string `json:"text"`
|
||||
}
|
||||
|
||||
const TaskSuccessCode = "success"

type TaskResponse[T TaskData] struct {
	Code    string `json:"code"`
	Message string `json:"message"`
	Data    T      `json:"data"`
}

func (t *TaskResponse[T]) IsSuccess() bool {
	return t.Code == TaskSuccessCode
}
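A short sketch of decoding an upstream reply into the generic TaskResponse, mirroring what the suno adaptor's DoResponse does later in this diff; the sample payload is made up for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"

	"one-api/dto"
)

// decodeTaskID unmarshals an upstream body and checks the success code before
// trusting the data field.
func decodeTaskID(raw []byte) (string, error) {
	var resp dto.TaskResponse[string]
	if err := json.Unmarshal(raw, &resp); err != nil {
		return "", err
	}
	if !resp.IsSuccess() {
		return "", fmt.Errorf("upstream error %s: %s", resp.Code, resp.Message)
	}
	return resp.Data, nil
}

func main() {
	id, _ := decodeTaskID([]byte(`{"code":"success","message":"","data":"task-123"}`))
	fmt.Println(id) // task-123
}
```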
type TaskDto struct {
|
||||
TaskID string `json:"task_id"` // 第三方id,不一定有/ song id\ Task id
|
||||
Action string `json:"action"` // 任务类型, song, lyrics, description-mode
|
||||
Status string `json:"status"` // 任务状态, submitted, queueing, processing, success, failed
|
||||
FailReason string `json:"fail_reason"`
|
||||
SubmitTime int64 `json:"submit_time"`
|
||||
StartTime int64 `json:"start_time"`
|
||||
FinishTime int64 `json:"finish_time"`
|
||||
Progress string `json:"progress"`
|
||||
Data json.RawMessage `json:"data"`
|
||||
}
|
||||
|
||||
type SunoGoAPISubmitReq struct {
|
||||
CustomMode bool `json:"custom_mode"`
|
||||
|
||||
Input SunoGoAPISubmitReqInput `json:"input"`
|
||||
|
||||
NotifyHook string `json:"notify_hook,omitempty"`
|
||||
}
|
||||
|
||||
type SunoGoAPISubmitReqInput struct {
|
||||
GptDescriptionPrompt string `json:"gpt_description_prompt"`
|
||||
Prompt string `json:"prompt"`
|
||||
Mv string `json:"mv"`
|
||||
Title string `json:"title"`
|
||||
Tags string `json:"tags"`
|
||||
ContinueAt float64 `json:"continue_at"`
|
||||
TaskID string `json:"task_id"`
|
||||
ContinueClipId string `json:"continue_clip_id"`
|
||||
MakeInstrumental bool `json:"make_instrumental"`
|
||||
}
|
||||
|
||||
type GoAPITaskResponse[T any] struct {
|
||||
Code int `json:"code"`
|
||||
Message string `json:"message"`
|
||||
Data T `json:"data"`
|
||||
ErrorMessage string `json:"error_message,omitempty"`
|
||||
}
|
||||
|
||||
type GoAPITaskResponseData struct {
|
||||
TaskID string `json:"task_id"`
|
||||
}
|
||||
|
||||
type GoAPIFetchResponseData struct {
|
||||
TaskID string `json:"task_id"`
|
||||
Status string `json:"status"`
|
||||
Input string `json:"input"`
|
||||
Clips map[string]SunoSong `json:"clips"`
|
||||
}
|
||||
dto/task.go (new file, 10 lines)
@@ -0,0 +1,10 @@
package dto

type TaskError struct {
	Code       string `json:"code"`
	Message    string `json:"message"`
	Data       any    `json:"data"`
	StatusCode int    `json:"-"`
	LocalError bool   `json:"-"`
	Error      error  `json:"-"`
}
go.mod (2 lines changed)
@@ -19,8 +19,8 @@ require (
	github.com/google/uuid v1.6.0
	github.com/gorilla/websocket v1.5.0
	github.com/jinzhu/copier v0.4.0
	github.com/linux-do/tiktoken-go v0.7.0
	github.com/pkg/errors v0.9.1
	github.com/pkoukk/tiktoken-go v0.1.7
	github.com/samber/lo v1.39.0
	github.com/shirou/gopsutil v3.21.11+incompatible
	github.com/stripe/stripe-go/v76 v76.21.0
go.sum (2 lines changed)
@@ -152,6 +152,8 @@ github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkoukk/tiktoken-go v0.1.6 h1:JF0TlJzhTbrI30wCvFuiw6FzP2+/bR+FIxUdgEAcUsw=
github.com/pkoukk/tiktoken-go v0.1.6/go.mod h1:9NiV+i9mJKGj1rYOT+njbv+ZwA/zJxYdewGl6qVatpg=
github.com/pkoukk/tiktoken-go v0.1.7 h1:qOBHXX4PHtvIvmOtyg1EeKlwFRiMKAcoMp4Q+bLQDmw=
github.com/pkoukk/tiktoken-go v0.1.7/go.mod h1:9NiV+i9mJKGj1rYOT+njbv+ZwA/zJxYdewGl6qVatpg=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/rogpeppe/go-internal v1.6.1/go.mod h1:xXDCJY+GAPziupqXw64V24skbSoqbTEfhy4qGm1nDQc=
main.go (11 lines changed)
@@ -89,9 +89,14 @@ func main() {
		}
		go controller.AutomaticallyTestChannels(frequency)
	}
	common.SafeGoroutine(func() {
		controller.UpdateMidjourneyTaskBulk()
	})
	if common.IsMasterNode {
		common.SafeGoroutine(func() {
			controller.UpdateMidjourneyTaskBulk()
		})
		common.SafeGoroutine(func() {
			controller.UpdateTaskBulk()
		})
	}
	if os.Getenv("BATCH_UPDATE_ENABLED") == "true" {
		common.BatchUpdateEnabled = true
		common.SysLog("batch update enabled with interval " + strconv.Itoa(common.BatchUpdateInterval) + "s")
@@ -125,6 +125,17 @@ func getModelRequest(c *gin.Context) (*ModelRequest, bool, error) {
			modelRequest.Model = midjourneyModel
		}
		c.Set("relay_mode", relayMode)
	} else if strings.Contains(c.Request.URL.Path, "/suno/") {
		relayMode := relayconstant.Path2RelaySuno(c.Request.Method, c.Request.URL.Path)
		if relayMode == relayconstant.RelayModeSunoFetch ||
			relayMode == relayconstant.RelayModeSunoFetchByID {
			shouldSelectChannel = false
		} else {
			modelName := service.CoverTaskActionToModelName(constant.TaskPlatformSuno, c.Param("action"))
			modelRequest.Model = modelName
		}
		c.Set("platform", string(constant.TaskPlatformSuno))
		c.Set("relay_mode", relayMode)
	} else if !strings.HasPrefix(c.Request.URL.Path, "/v1/audio/transcriptions") {
		err = common.UnmarshalBodyReusable(c, &modelRequest)
	}
@@ -56,6 +56,11 @@ func getPriority(group string, model string, retry int) (int, error) {
		return 0, err
	}

	if len(priorities) == 0 {
		// no priority rows found, return an error
		return 0, errors.New("数据库一致性被破坏")
	}

	// determine which priority to use
	var priorityToUse int
	if retry >= len(priorities) {
@@ -199,7 +204,7 @@ func FixAbility() (int, error) {

	// Use channelIds to find channel not in abilities table
	var abilityChannelIds []int
	err = DB.Model(&Ability{}).Pluck("channel_id", &abilityChannelIds).Error
	err = DB.Table("abilities").Distinct("channel_id").Pluck("channel_id", &abilityChannelIds).Error
	if err != nil {
		common.SysError(fmt.Sprintf("Get channel ids from abilities table failed: %s", err.Error()))
		return 0, err
@@ -87,7 +87,7 @@ func SyncTokenCache(frequency int) {
			}
		} else {
			// the token exists in the database, so check redis first
			_, err := common.RedisGet(fmt.Sprintf("token:%s", key))
			_, err = common.RedisGet(fmt.Sprintf("token:%s", key))
			if err != nil {
				// not found in redis, skip it
				continue
@@ -1,6 +1,7 @@
package model

import (
	"encoding/json"
	"gorm.io/gorm"
	"one-api/common"
)
@@ -29,6 +30,31 @@ type Channel struct {
	StatusCodeMapping *string `json:"status_code_mapping" gorm:"type:varchar(1024);default:''"`
	Priority          *int64  `json:"priority" gorm:"bigint;default:0"`
	AutoBan           *int    `json:"auto_ban" gorm:"default:1"`
	OtherInfo         string  `json:"other_info"`
}

func (channel *Channel) GetOtherInfo() map[string]interface{} {
	otherInfo := make(map[string]interface{})
	if channel.OtherInfo != "" {
		err := json.Unmarshal([]byte(channel.OtherInfo), &otherInfo)
		if err != nil {
			common.SysError("failed to unmarshal other info: " + err.Error())
		}
	}
	return otherInfo
}

func (channel *Channel) SetOtherInfo(otherInfo map[string]interface{}) {
	otherInfoBytes, err := json.Marshal(otherInfo)
	if err != nil {
		common.SysError("failed to marshal other info: " + err.Error())
		return
	}
	channel.OtherInfo = string(otherInfoBytes)
}

func (channel *Channel) Save() error {
	return DB.Save(channel).Error
}

func GetAllChannels(startIdx int, num int, selectAll bool, idSort bool) ([]*Channel, error) {
@@ -213,15 +239,31 @@ func (channel *Channel) Delete() error {
	return err
}

func UpdateChannelStatusById(id int, status int) {
func UpdateChannelStatusById(id int, status int, reason string) {
	err := UpdateAbilityStatus(id, status == common.ChannelStatusEnabled)
	if err != nil {
		common.SysError("failed to update ability status: " + err.Error())
	}
	err = DB.Model(&Channel{}).Where("id = ?", id).Update("status", status).Error
	channel, err := GetChannelById(id, true)
	if err != nil {
		common.SysError("failed to update channel status: " + err.Error())
		// find channel by id error, directly update status
		err = DB.Model(&Channel{}).Where("id = ?", id).Update("status", status).Error
		if err != nil {
			common.SysError("failed to update channel status: " + err.Error())
		}
	} else {
		// find channel by id success, update status and other info
		info := channel.GetOtherInfo()
		info["status_reason"] = reason
		info["status_time"] = common.GetTimestamp()
		channel.SetOtherInfo(info)
		channel.Status = status
		err = channel.Save()
		if err != nil {
			common.SysError("failed to update channel status: " + err.Error())
		}
	}

}

func UpdateChannelUsedQuota(id int, quota int) {
@@ -36,7 +36,7 @@ const (
)

func GetLogByKey(key string) (logs []*Log, err error) {
	err = DB.Joins("left join tokens on tokens.id = logs.token_id").Where("tokens.key = ?", strings.Split(key, "-")[1]).Find(&logs).Error
	err = DB.Joins("left join tokens on tokens.id = logs.token_id").Where("tokens.key = ?", strings.TrimPrefix(key, "sk-")).Find(&logs).Error
	return logs, err
}
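The switch from strings.Split to strings.TrimPrefix matters because Split(key, "-")[1] only keeps the segment between the first two hyphens, so a key whose body itself contains a hyphen gets truncated (and a key with no hyphen at all panics with an index error), while TrimPrefix just drops the leading "sk-" and keeps the rest intact. A small illustration with a made-up key:

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	key := "sk-abc-def123" // illustrative key with a hyphen inside its body

	// Old parse: only the first hyphen-delimited segment survives.
	fmt.Println(strings.Split(key, "-")[1]) // abc

	// New parse: strip the "sk-" prefix and keep the rest of the key.
	fmt.Println(strings.TrimPrefix(key, "sk-")) // abc-def123
}
```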
@@ -86,9 +86,9 @@ func InitDB() (err error) {
	if err != nil {
		return err
	}
	sqlDB.SetMaxIdleConns(common.GetOrDefault("SQL_MAX_IDLE_CONNS", 100))
	sqlDB.SetMaxOpenConns(common.GetOrDefault("SQL_MAX_OPEN_CONNS", 1000))
	sqlDB.SetConnMaxLifetime(time.Second * time.Duration(common.GetOrDefault("SQL_MAX_LIFETIME", 60)))
	sqlDB.SetMaxIdleConns(common.GetEnvOrDefault("SQL_MAX_IDLE_CONNS", 100))
	sqlDB.SetMaxOpenConns(common.GetEnvOrDefault("SQL_MAX_OPEN_CONNS", 1000))
	sqlDB.SetConnMaxLifetime(time.Second * time.Duration(common.GetEnvOrDefault("SQL_MAX_LIFETIME", 60)))

	if !common.IsMasterNode {
		return nil
@@ -140,6 +140,10 @@ func InitDB() (err error) {
	if err != nil {
		return err
	}
	err = db.AutoMigrate(&Task{})
	if err != nil {
		return err
	}
	common.SysLog("database migrated")
	err = createRootAccountIfNeed()
	return err
@@ -43,6 +43,7 @@ func InitOptionMap() {
	common.OptionMap["DisplayInCurrencyEnabled"] = strconv.FormatBool(common.DisplayInCurrencyEnabled)
	common.OptionMap["DisplayTokenStatEnabled"] = strconv.FormatBool(common.DisplayTokenStatEnabled)
	common.OptionMap["DrawingEnabled"] = strconv.FormatBool(common.DrawingEnabled)
	common.OptionMap["TaskEnabled"] = strconv.FormatBool(common.TaskEnabled)
	common.OptionMap["DataExportEnabled"] = strconv.FormatBool(common.DataExportEnabled)
	common.OptionMap["ChannelDisableThreshold"] = strconv.FormatFloat(common.ChannelDisableThreshold, 'f', -1, 64)
	common.OptionMap["EmailDomainRestrictionEnabled"] = strconv.FormatBool(common.EmailDomainRestrictionEnabled)
@@ -61,6 +62,7 @@ func InitOptionMap() {
	common.OptionMap["SystemName"] = common.SystemName
	common.OptionMap["Logo"] = common.Logo
	common.OptionMap["ServerAddress"] = ""
	common.OptionMap["OutProxyUrl"] = ""
	common.OptionMap["StripeApiSecret"] = common.StripeApiSecret
	common.OptionMap["StripeWebhookSecret"] = common.StripeWebhookSecret
	common.OptionMap["StripePriceId"] = common.StripePriceId
@@ -202,6 +204,8 @@ func updateOptionMap(key string, value string) (err error) {
		common.DisplayTokenStatEnabled = boolValue
	case "DrawingEnabled":
		common.DrawingEnabled = boolValue
	case "TaskEnabled":
		common.TaskEnabled = boolValue
	case "DataExportEnabled":
		common.DataExportEnabled = boolValue
	case "DefaultCollapseSidebar":
@@ -242,6 +246,8 @@ func updateOptionMap(key string, value string) (err error) {
		common.SMTPToken = value
	case "ServerAddress":
		common.ServerAddress = value
	case "OutProxyUrl":
		common.OutProxyUrl = value
	case "StripeApiSecret":
		common.StripeApiSecret = value
	case "StripeWebhookSecret":
@@ -2,18 +2,28 @@ package model

import (
	"one-api/common"
	"one-api/dto"
	"sync"
	"time"
)

type Pricing struct {
	Available       bool     `json:"available"`
	ModelName       string   `json:"model_name"`
	QuotaType       int      `json:"quota_type"`
	ModelRatio      float64  `json:"model_ratio"`
	ModelPrice      float64  `json:"model_price"`
	OwnerBy         string   `json:"owner_by"`
	CompletionRatio float64  `json:"completion_ratio"`
	EnableGroup     []string `json:"enable_group,omitempty"`
}

var (
	pricingMap         []dto.ModelPricing
	pricingMap         []Pricing
	lastGetPricingTime time.Time
	updatePricingLock  sync.Mutex
)

func GetPricing(group string) []dto.ModelPricing {
func GetPricing(group string) []Pricing {
	updatePricingLock.Lock()
	defer updatePricingLock.Unlock()

@@ -21,7 +31,7 @@ func GetPricing(group string) []dto.ModelPricing {
		updatePricing()
	}
	if group != "" {
		userPricingMap := make([]dto.ModelPricing, 0)
		userPricingMap := make([]Pricing, 0)
		models := GetGroupModels(group)
		for _, pricing := range pricingMap {
			if !common.StringsContains(models, pricing.ModelName) {
@@ -42,9 +52,9 @@ func updatePricing() {
		allModels[model] = i
	}

	pricingMap = make([]dto.ModelPricing, 0)
	pricingMap = make([]Pricing, 0)
	for model, _ := range allModels {
		pricing := dto.ModelPricing{
		pricing := Pricing{
			Available: true,
			ModelName: model,
		}
model/task.go (new file, 304 lines)
@@ -0,0 +1,304 @@
package model
|
||||
|
||||
import (
|
||||
"database/sql/driver"
|
||||
"encoding/json"
|
||||
"one-api/constant"
|
||||
commonRelay "one-api/relay/common"
|
||||
"time"
|
||||
)
|
||||
|
||||
type TaskStatus string
|
||||
|
||||
const (
|
||||
TaskStatusNotStart TaskStatus = "NOT_START"
|
||||
TaskStatusSubmitted = "SUBMITTED"
|
||||
TaskStatusQueued = "QUEUED"
|
||||
TaskStatusInProgress = "IN_PROGRESS"
|
||||
TaskStatusFailure = "FAILURE"
|
||||
TaskStatusSuccess = "SUCCESS"
|
||||
TaskStatusUnknown = "UNKNOWN"
|
||||
)
|
||||
|
||||
type Task struct {
|
||||
ID int64 `json:"id" gorm:"primary_key;AUTO_INCREMENT"`
|
||||
CreatedAt int64 `json:"created_at" gorm:"index"`
|
||||
UpdatedAt int64 `json:"updated_at"`
|
||||
TaskID string `json:"task_id" gorm:"type:varchar(50);index"` // 第三方id,不一定有/ song id\ Task id
|
||||
Platform constant.TaskPlatform `json:"platform" gorm:"type:varchar(30);index"` // 平台
|
||||
UserId int `json:"user_id" gorm:"index"`
|
||||
ChannelId int `json:"channel_id" gorm:"index"`
|
||||
Quota int `json:"quota"`
|
||||
Action string `json:"action" gorm:"type:varchar(40);index"` // 任务类型, song, lyrics, description-mode
|
||||
Status TaskStatus `json:"status" gorm:"type:varchar(20);index"` // 任务状态
|
||||
FailReason string `json:"fail_reason"`
|
||||
SubmitTime int64 `json:"submit_time" gorm:"index"`
|
||||
StartTime int64 `json:"start_time" gorm:"index"`
|
||||
FinishTime int64 `json:"finish_time" gorm:"index"`
|
||||
Progress string `json:"progress" gorm:"type:varchar(20);index"`
|
||||
Properties Properties `json:"properties" gorm:"type:json"`
|
||||
|
||||
Data json.RawMessage `json:"data" gorm:"type:json"`
|
||||
}
|
||||
|
||||
func (t *Task) SetData(data any) {
|
||||
b, _ := json.Marshal(data)
|
||||
t.Data = json.RawMessage(b)
|
||||
}
|
||||
|
||||
func (t *Task) GetData(v any) error {
|
||||
err := json.Unmarshal(t.Data, &v)
|
||||
return err
|
||||
}
|
||||
|
||||
type Properties struct {
|
||||
Input string `json:"input"`
|
||||
}
|
||||
|
||||
func (m *Properties) Scan(val interface{}) error {
|
||||
bytesValue, _ := val.([]byte)
|
||||
return json.Unmarshal(bytesValue, m)
|
||||
}
|
||||
|
||||
func (m Properties) Value() (driver.Value, error) {
|
||||
return json.Marshal(m)
|
||||
}
|
||||
|
||||
// SyncTaskQueryParams 用于包含所有搜索条件的结构体,可以根据需求添加更多字段
|
||||
type SyncTaskQueryParams struct {
|
||||
Platform constant.TaskPlatform
|
||||
ChannelID string
|
||||
TaskID string
|
||||
UserID string
|
||||
Action string
|
||||
Status string
|
||||
StartTimestamp int64
|
||||
EndTimestamp int64
|
||||
UserIDs []int
|
||||
}
|
||||
|
||||
func InitTask(platform constant.TaskPlatform, relayInfo *commonRelay.TaskRelayInfo) *Task {
|
||||
t := &Task{
|
||||
UserId: relayInfo.UserId,
|
||||
SubmitTime: time.Now().Unix(),
|
||||
Status: TaskStatusNotStart,
|
||||
Progress: "0%",
|
||||
ChannelId: relayInfo.ChannelId,
|
||||
Platform: platform,
|
||||
}
|
||||
return t
|
||||
}
|
||||
|
||||
func TaskGetAllUserTask(userId int, startIdx int, num int, queryParams SyncTaskQueryParams) []*Task {
|
||||
var tasks []*Task
|
||||
var err error
|
||||
|
||||
// 初始化查询构建器
|
||||
query := DB.Where("user_id = ?", userId)
|
||||
|
||||
if queryParams.TaskID != "" {
|
||||
query = query.Where("task_id = ?", queryParams.TaskID)
|
||||
}
|
||||
if queryParams.Action != "" {
|
||||
query = query.Where("action = ?", queryParams.Action)
|
||||
}
|
||||
if queryParams.Status != "" {
|
||||
query = query.Where("status = ?", queryParams.Status)
|
||||
}
|
||||
if queryParams.Platform != "" {
|
||||
query = query.Where("platform = ?", queryParams.Platform)
|
||||
}
|
||||
if queryParams.StartTimestamp != 0 {
|
||||
// 假设您已将前端传来的时间戳转换为数据库所需的时间格式,并处理了时间戳的验证和解析
|
||||
query = query.Where("submit_time >= ?", queryParams.StartTimestamp)
|
||||
}
|
||||
if queryParams.EndTimestamp != 0 {
|
||||
query = query.Where("submit_time <= ?", queryParams.EndTimestamp)
|
||||
}
|
||||
|
||||
// 获取数据
|
||||
err = query.Omit("channel_id").Order("id desc").Limit(num).Offset(startIdx).Find(&tasks).Error
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
|
||||
return tasks
|
||||
}
|
||||
|
||||
func TaskGetAllTasks(startIdx int, num int, queryParams SyncTaskQueryParams) []*Task {
|
||||
var tasks []*Task
|
||||
var err error
|
||||
|
||||
// 初始化查询构建器
|
||||
query := DB
|
||||
|
||||
// 添加过滤条件
|
||||
if queryParams.ChannelID != "" {
|
||||
query = query.Where("channel_id = ?", queryParams.ChannelID)
|
||||
}
|
||||
if queryParams.Platform != "" {
|
||||
query = query.Where("platform = ?", queryParams.Platform)
|
||||
}
|
||||
if queryParams.UserID != "" {
|
||||
query = query.Where("user_id = ?", queryParams.UserID)
|
||||
}
|
||||
if len(queryParams.UserIDs) != 0 {
|
||||
query = query.Where("user_id in (?)", queryParams.UserIDs)
|
||||
}
|
||||
if queryParams.TaskID != "" {
|
||||
query = query.Where("task_id = ?", queryParams.TaskID)
|
||||
}
|
||||
if queryParams.Action != "" {
|
||||
query = query.Where("action = ?", queryParams.Action)
|
||||
}
|
||||
if queryParams.Status != "" {
|
||||
query = query.Where("status = ?", queryParams.Status)
|
||||
}
|
||||
if queryParams.StartTimestamp != 0 {
|
||||
query = query.Where("submit_time >= ?", queryParams.StartTimestamp)
|
||||
}
|
||||
if queryParams.EndTimestamp != 0 {
|
||||
query = query.Where("submit_time <= ?", queryParams.EndTimestamp)
|
||||
}
|
||||
|
||||
// 获取数据
|
||||
err = query.Order("id desc").Limit(num).Offset(startIdx).Find(&tasks).Error
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
|
||||
return tasks
|
||||
}
|
||||
|
||||
func GetAllUnFinishSyncTasks(limit int) []*Task {
|
||||
var tasks []*Task
|
||||
var err error
|
||||
// get all tasks progress is not 100%
|
||||
err = DB.Where("progress != ?", "100%").Limit(limit).Order("id").Find(&tasks).Error
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
return tasks
|
||||
}
|
||||
|
||||
func GetByOnlyTaskId(taskId string) (*Task, bool, error) {
|
||||
if taskId == "" {
|
||||
return nil, false, nil
|
||||
}
|
||||
var task *Task
|
||||
var err error
|
||||
err = DB.Where("task_id = ?", taskId).First(&task).Error
|
||||
exist, err := RecordExist(err)
|
||||
if err != nil {
|
||||
return nil, false, err
|
||||
}
|
||||
return task, exist, err
|
||||
}
|
||||
|
||||
func GetByTaskId(userId int, taskId string) (*Task, bool, error) {
|
||||
if taskId == "" {
|
||||
return nil, false, nil
|
||||
}
|
||||
var task *Task
|
||||
var err error
|
||||
err = DB.Where("user_id = ? and task_id = ?", userId, taskId).
|
||||
First(&task).Error
|
||||
exist, err := RecordExist(err)
|
||||
if err != nil {
|
||||
return nil, false, err
|
||||
}
|
||||
return task, exist, err
|
||||
}
|
||||
|
||||
func GetByTaskIds(userId int, taskIds []any) ([]*Task, error) {
|
||||
if len(taskIds) == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
var task []*Task
|
||||
var err error
|
||||
err = DB.Where("user_id = ? and task_id in (?)", userId, taskIds).
|
||||
Find(&task).Error
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return task, nil
|
||||
}
|
||||
|
||||
func TaskUpdateProgress(id int64, progress string) error {
|
||||
return DB.Model(&Task{}).Where("id = ?", id).Update("progress", progress).Error
|
||||
}
|
||||
|
||||
func (Task *Task) Insert() error {
|
||||
var err error
|
||||
err = DB.Create(Task).Error
|
||||
return err
|
||||
}
|
||||
|
||||
func (Task *Task) Update() error {
|
||||
var err error
|
||||
err = DB.Save(Task).Error
|
||||
return err
|
||||
}
|
||||
|
||||
func TaskBulkUpdate(TaskIds []string, params map[string]any) error {
|
||||
if len(TaskIds) == 0 {
|
||||
return nil
|
||||
}
|
||||
return DB.Model(&Task{}).
|
||||
Where("task_id in (?)", TaskIds).
|
||||
Updates(params).Error
|
||||
}
|
||||
|
||||
func TaskBulkUpdateByTaskIds(taskIDs []int64, params map[string]any) error {
|
||||
if len(taskIDs) == 0 {
|
||||
return nil
|
||||
}
|
||||
return DB.Model(&Task{}).
|
||||
Where("id in (?)", taskIDs).
|
||||
Updates(params).Error
|
||||
}
|
||||
|
||||
func TaskBulkUpdateByID(ids []int64, params map[string]any) error {
|
||||
if len(ids) == 0 {
|
||||
return nil
|
||||
}
|
||||
return DB.Model(&Task{}).
|
||||
Where("id in (?)", ids).
|
||||
Updates(params).Error
|
||||
}
|
||||
|
||||
type TaskQuotaUsage struct {
|
||||
Mode string `json:"mode"`
|
||||
Count float64 `json:"count"`
|
||||
}
|
||||
|
||||
func SumUsedTaskQuota(queryParams SyncTaskQueryParams) (stat []TaskQuotaUsage, err error) {
|
||||
query := DB.Model(Task{})
|
||||
// 添加过滤条件
|
||||
if queryParams.ChannelID != "" {
|
||||
query = query.Where("channel_id = ?", queryParams.ChannelID)
|
||||
}
|
||||
if queryParams.UserID != "" {
|
||||
query = query.Where("user_id = ?", queryParams.UserID)
|
||||
}
|
||||
if len(queryParams.UserIDs) != 0 {
|
||||
query = query.Where("user_id in (?)", queryParams.UserIDs)
|
||||
}
|
||||
if queryParams.TaskID != "" {
|
||||
query = query.Where("task_id = ?", queryParams.TaskID)
|
||||
}
|
||||
if queryParams.Action != "" {
|
||||
query = query.Where("action = ?", queryParams.Action)
|
||||
}
|
||||
if queryParams.Status != "" {
|
||||
query = query.Where("status = ?", queryParams.Status)
|
||||
}
|
||||
if queryParams.StartTimestamp != 0 {
|
||||
query = query.Where("submit_time >= ?", queryParams.StartTimestamp)
|
||||
}
|
||||
if queryParams.EndTimestamp != 0 {
|
||||
query = query.Where("submit_time <= ?", queryParams.EndTimestamp)
|
||||
}
|
||||
err = query.Select("mode, sum(quota) as count").Group("mode").Find(&stat).Error
|
||||
return stat, err
|
||||
}
|
||||
@@ -1,6 +1,8 @@
package model

import (
	"errors"
	"gorm.io/gorm"
	"one-api/common"
	"sync"
	"time"
@@ -75,3 +77,13 @@ func batchUpdate() {
	}
	common.SysLog("batch update finished")
}

func RecordExist(err error) (bool, error) {
	if err == nil {
		return true, nil
	}
	if errors.Is(err, gorm.ErrRecordNotFound) {
		return false, nil
	}
	return false, err
}
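RecordExist turns GORM's "record not found" into a plain found/error pair, which is how GetByOnlyTaskId and GetByTaskId in model/task.go above consume it. A minimal sketch of that pattern; the helper and its query are illustrative:

```go
package model

// findTask is a sketch of how RecordExist is meant to be consumed, in the same
// style as GetByOnlyTaskId earlier in this diff; the query itself is illustrative.
func findTask(taskId string) (*Task, bool, error) {
	var task Task
	err := DB.Where("task_id = ?", taskId).First(&task).Error
	exist, err := RecordExist(err)
	if err != nil {
		return nil, false, err // a real database error
	}
	return &task, exist, nil // exist == false just means "no matching row"
}
```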
@@ -19,3 +19,22 @@ type Adaptor interface {
	GetModelList() []string
	GetChannelName() string
}

type TaskAdaptor interface {
	Init(info *relaycommon.TaskRelayInfo)

	ValidateRequestAndSetAction(c *gin.Context, info *relaycommon.TaskRelayInfo) *dto.TaskError

	BuildRequestURL(info *relaycommon.TaskRelayInfo) (string, error)
	BuildRequestHeader(c *gin.Context, req *http.Request, info *relaycommon.TaskRelayInfo) error
	BuildRequestBody(c *gin.Context, info *relaycommon.TaskRelayInfo) (io.Reader, error)

	DoRequest(c *gin.Context, info *relaycommon.TaskRelayInfo, requestBody io.Reader) (*http.Response, error)
	DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.TaskRelayInfo) (taskID string, taskData []byte, err *dto.TaskError)

	GetModelList() []string
	GetChannelName() string

	// FetchTask
	FetchTask(baseUrl, key string, body map[string]any) (*http.Response, error)
}
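A sketch of the call order a task relay handler would use to drive a TaskAdaptor; the handler function itself is illustrative and not part of this diff, only the interface methods and the TaskError helpers are:

```go
package channel

import (
	"net/http"

	"github.com/gin-gonic/gin"

	"one-api/dto"
	relaycommon "one-api/relay/common"
	"one-api/service"
)

// relayTask drives a TaskAdaptor end to end: validate, build the body, send,
// then interpret the upstream response.
func relayTask(c *gin.Context, a TaskAdaptor, info *relaycommon.TaskRelayInfo) *dto.TaskError {
	a.Init(info)
	if taskErr := a.ValidateRequestAndSetAction(c, info); taskErr != nil {
		return taskErr
	}
	body, err := a.BuildRequestBody(c, info)
	if err != nil {
		return service.TaskErrorWrapper(err, "build_request_body_failed", http.StatusInternalServerError)
	}
	// DoRequest delegates to DoTaskApiRequest below, which calls
	// BuildRequestURL and BuildRequestHeader before sending.
	resp, err := a.DoRequest(c, info, body)
	if err != nil {
		return service.TaskErrorWrapper(err, "do_request_failed", http.StatusInternalServerError)
	}
	taskID, _, taskErr := a.DoResponse(c, resp, info)
	if taskErr != nil {
		return taskErr
	}
	_ = taskID // the real relay persists this into model.Task
	return nil
}
```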
@@ -50,3 +50,27 @@ func doRequest(c *gin.Context, req *http.Request) (*http.Response, error) {
	_ = c.Request.Body.Close()
	return resp, nil
}

func DoTaskApiRequest(a TaskAdaptor, c *gin.Context, info *common.TaskRelayInfo, requestBody io.Reader) (*http.Response, error) {
	fullRequestURL, err := a.BuildRequestURL(info)
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(c.Request.Method, fullRequestURL, requestBody)
	if err != nil {
		return nil, fmt.Errorf("new request failed: %w", err)
	}
	req.GetBody = func() (io.ReadCloser, error) {
		return io.NopCloser(requestBody), nil
	}

	err = a.BuildRequestHeader(c, req, info)
	if err != nil {
		return nil, fmt.Errorf("setup request header failed: %w", err)
	}
	resp, err := doRequest(c, req)
	if err != nil {
		return nil, fmt.Errorf("do request failed: %w", err)
	}
	return resp, nil
}
@@ -7,6 +7,7 @@ var awsModelIDMap = map[string]string{
|
||||
"claude-3-sonnet-20240229": "anthropic.claude-3-sonnet-20240229-v1:0",
|
||||
"claude-3-opus-20240229": "anthropic.claude-3-opus-20240229-v1:0",
|
||||
"claude-3-haiku-20240307": "anthropic.claude-3-haiku-20240307-v1:0",
|
||||
"claude-3-5-sonnet-20240620": "anthropic.claude-3-5-sonnet-20240620-v1:0",
|
||||
}
|
||||
|
||||
var ChannelName = "aws"
|
||||
|
||||
@@ -14,6 +14,7 @@ import (
|
||||
"one-api/relay/channel/claude"
|
||||
relaycommon "one-api/relay/common"
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
"github.com/aws/aws-sdk-go-v2/aws"
|
||||
"github.com/aws/aws-sdk-go-v2/credentials"
|
||||
@@ -156,6 +157,7 @@ func awsStreamHandler(c *gin.Context, info *relaycommon.RelayInfo, requestMode i
|
||||
var usage relaymodel.Usage
|
||||
var id string
|
||||
var model string
|
||||
isFirst := true
|
||||
createdTime := common.GetTimestamp()
|
||||
c.Stream(func(w io.Writer) bool {
|
||||
event, ok := <-stream.Events()
|
||||
@@ -166,6 +168,10 @@ func awsStreamHandler(c *gin.Context, info *relaycommon.RelayInfo, requestMode i
|
||||
|
||||
switch v := event.(type) {
|
||||
case *types.ResponseStreamMemberChunk:
|
||||
if isFirst {
|
||||
isFirst = false
|
||||
info.FirstResponseTime = time.Now()
|
||||
}
|
||||
claudeResp := new(claude.ClaudeResponse)
|
||||
err := json.NewDecoder(bytes.NewReader(v.Value.Bytes)).Decode(claudeResp)
|
||||
if err != nil {
|
||||
|
||||
@@ -65,7 +65,7 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
|
||||
|
||||
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
|
||||
if info.IsStream {
|
||||
err, usage = claudeStreamHandler(a.RequestMode, info.UpstreamModelName, info.PromptTokens, c, resp)
|
||||
err, usage = claudeStreamHandler(c, resp, info, a.RequestMode)
|
||||
} else {
|
||||
err, usage = claudeHandler(a.RequestMode, c, resp, info.PromptTokens, info.UpstreamModelName)
|
||||
}
|
||||
|
||||
@@ -8,6 +8,7 @@ var ModelList = []string{
|
||||
"claude-3-sonnet-20240229",
|
||||
"claude-3-opus-20240229",
|
||||
"claude-3-haiku-20240307",
|
||||
"claude-3-5-sonnet-20240620",
|
||||
}
|
||||
|
||||
var ChannelName = "claude"
|
||||
|
||||
@@ -8,9 +8,12 @@ import (
|
||||
"io"
|
||||
"net/http"
|
||||
"one-api/common"
|
||||
"one-api/constant"
|
||||
"one-api/dto"
|
||||
relaycommon "one-api/relay/common"
|
||||
"one-api/service"
|
||||
"strings"
|
||||
"time"
|
||||
)
|
||||
|
||||
func stopReasonClaude2OpenAI(reason string) string {
|
||||
@@ -246,7 +249,7 @@ func ResponseClaude2OpenAI(reqMode int, claudeResponse *ClaudeResponse) *dto.Ope
|
||||
return &fullTextResponse
|
||||
}
|
||||
|
||||
func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c *gin.Context, resp *http.Response) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
|
||||
func claudeStreamHandler(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo, requestMode int) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
|
||||
responseId := fmt.Sprintf("chatcmpl-%s", common.GetUUID())
|
||||
var usage *dto.Usage
|
||||
usage = &dto.Usage{}
|
||||
@@ -265,8 +268,8 @@ func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c
|
||||
}
|
||||
return 0, nil, nil
|
||||
})
|
||||
dataChan := make(chan string)
|
||||
stopChan := make(chan bool)
|
||||
dataChan := make(chan string, 5)
|
||||
stopChan := make(chan bool, 2)
|
||||
go func() {
|
||||
for scanner.Scan() {
|
||||
data := scanner.Text()
|
||||
@@ -274,14 +277,23 @@ func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c
|
||||
continue
|
||||
}
|
||||
data = strings.TrimPrefix(data, "data: ")
|
||||
dataChan <- data
|
||||
if !common.SafeSendStringTimeout(dataChan, data, constant.StreamingTimeout) {
|
||||
// send data timeout, stop the stream
|
||||
common.LogError(c, "send data timeout, stop the stream")
|
||||
break
|
||||
}
|
||||
}
|
||||
stopChan <- true
|
||||
}()
|
||||
isFirst := true
|
||||
service.SetEventStreamHeaders(c)
|
||||
c.Stream(func(w io.Writer) bool {
|
||||
select {
|
||||
case data := <-dataChan:
|
||||
if isFirst {
|
||||
isFirst = false
|
||||
info.FirstResponseTime = time.Now()
|
||||
}
|
||||
// some implementations may add \r at the end of data
|
||||
data = strings.TrimSuffix(data, "\r")
|
||||
var claudeResponse ClaudeResponse
|
||||
@@ -302,7 +314,7 @@ func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c
|
||||
if claudeResponse.Type == "message_start" {
|
||||
// message_start, 获取usage
|
||||
responseId = claudeResponse.Message.Id
|
||||
modelName = claudeResponse.Message.Model
|
||||
info.UpstreamModelName = claudeResponse.Message.Model
|
||||
usage.PromptTokens = claudeUsage.InputTokens
|
||||
} else if claudeResponse.Type == "content_block_delta" {
|
||||
responseText += claudeResponse.Delta.Text
|
||||
@@ -316,7 +328,7 @@ func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c
|
||||
//response.Id = responseId
|
||||
response.Id = responseId
|
||||
response.Created = createdTime
|
||||
response.Model = modelName
|
||||
response.Model = info.UpstreamModelName
|
||||
|
||||
jsonStr, err := json.Marshal(response)
|
||||
if err != nil {
|
||||
@@ -335,10 +347,13 @@ func claudeStreamHandler(requestMode int, modelName string, promptTokens int, c
|
||||
return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
|
||||
}
|
||||
if requestMode == RequestModeCompletion {
|
||||
usage, _ = service.ResponseText2Usage(responseText, modelName, promptTokens)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
|
||||
} else {
|
||||
if usage.PromptTokens == 0 {
|
||||
usage.PromptTokens = info.PromptTokens
|
||||
}
|
||||
if usage.CompletionTokens == 0 {
|
||||
usage, _ = service.ResponseText2Usage(responseText, modelName, usage.PromptTokens)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, usage.PromptTokens)
|
||||
}
|
||||
}
|
||||
return nil, usage
|
||||
@@ -370,7 +385,7 @@ func claudeHandler(requestMode int, c *gin.Context, resp *http.Response, promptT
|
||||
}, nil
|
||||
}
|
||||
fullTextResponse := ResponseClaude2OpenAI(requestMode, &claudeResponse)
|
||||
completionTokens, err, _ := service.CountTokenText(claudeResponse.Completion, model, false)
|
||||
completionTokens, err := service.CountTokenText(claudeResponse.Completion, model)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "count_token_text_failed", http.StatusInternalServerError), nil
|
||||
}
|
||||
|
||||
@@ -36,7 +36,7 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
|
||||
|
||||
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
|
||||
if info.IsStream {
|
||||
err, usage = cohereStreamHandler(c, resp, info.UpstreamModelName, info.PromptTokens)
|
||||
err, usage = cohereStreamHandler(c, resp, info)
|
||||
} else {
|
||||
err, usage = cohereHandler(c, resp, info.UpstreamModelName, info.PromptTokens)
|
||||
}
|
||||
|
||||
@@ -9,8 +9,10 @@ import (
|
||||
"net/http"
|
||||
"one-api/common"
|
||||
"one-api/dto"
|
||||
relaycommon "one-api/relay/common"
|
||||
"one-api/service"
|
||||
"strings"
|
||||
"time"
|
||||
)
|
||||
|
||||
func requestOpenAI2Cohere(textRequest dto.GeneralOpenAIRequest) *CohereRequest {
|
||||
@@ -56,7 +58,7 @@ func stopReasonCohere2OpenAI(reason string) string {
|
||||
}
|
||||
}
|
||||
|
||||
func cohereStreamHandler(c *gin.Context, resp *http.Response, modelName string, promptTokens int) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
|
||||
func cohereStreamHandler(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (*dto.OpenAIErrorWithStatusCode, *dto.Usage) {
|
||||
responseId := fmt.Sprintf("chatcmpl-%s", common.GetUUID())
|
||||
createdTime := common.GetTimestamp()
|
||||
usage := &dto.Usage{}
|
||||
@@ -84,9 +86,14 @@ func cohereStreamHandler(c *gin.Context, resp *http.Response, modelName string,
|
||||
stopChan <- true
|
||||
}()
|
||||
service.SetEventStreamHeaders(c)
|
||||
isFirst := true
|
||||
c.Stream(func(w io.Writer) bool {
|
||||
select {
|
||||
case data := <-dataChan:
|
||||
if isFirst {
|
||||
isFirst = false
|
||||
info.FirstResponseTime = time.Now()
|
||||
}
|
||||
data = strings.TrimSuffix(data, "\r")
|
||||
var cohereResp CohereResponse
|
||||
err := json.Unmarshal([]byte(data), &cohereResp)
|
||||
@@ -98,7 +105,7 @@ func cohereStreamHandler(c *gin.Context, resp *http.Response, modelName string,
|
||||
openaiResp.Id = responseId
|
||||
openaiResp.Created = createdTime
|
||||
openaiResp.Object = "chat.completion.chunk"
|
||||
openaiResp.Model = modelName
|
||||
openaiResp.Model = info.UpstreamModelName
|
||||
if cohereResp.IsFinished {
|
||||
finishReason := stopReasonCohere2OpenAI(cohereResp.FinishReason)
|
||||
openaiResp.Choices = []dto.ChatCompletionsStreamResponseChoice{
|
||||
@@ -137,7 +144,7 @@ func cohereStreamHandler(c *gin.Context, resp *http.Response, modelName string,
|
||||
}
|
||||
})
|
||||
if usage.PromptTokens == 0 {
|
||||
usage, _ = service.ResponseText2Usage(responseText, modelName, promptTokens)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
|
||||
}
|
||||
return nil, usage
|
||||
}
|
||||
|
||||
@@ -20,27 +20,27 @@ func (a *Adaptor) Init(info *relaycommon.RelayInfo, request dto.GeneralOpenAIReq
|
||||
|
||||
// 定义一个映射,存储模型名称和对应的版本
|
||||
var modelVersionMap = map[string]string{
|
||||
"gemini-1.5-pro-latest": "v1beta",
|
||||
"gemini-1.5-flash-latest": "v1beta",
|
||||
"gemini-ultra": "v1beta",
|
||||
"gemini-1.5-pro-latest": "v1beta",
|
||||
"gemini-1.5-flash-latest": "v1beta",
|
||||
"gemini-ultra": "v1beta",
|
||||
}
|
||||
|
||||
func (a *Adaptor) GetRequestURL(info *relaycommon.RelayInfo) (string, error) {
|
||||
// 从映射中获取模型名称对应的版本,如果找不到就使用 info.ApiVersion 或默认的版本 "v1"
|
||||
version, beta := modelVersionMap[info.UpstreamModelName]
|
||||
if !beta {
|
||||
if info.ApiVersion != "" {
|
||||
version = info.ApiVersion
|
||||
} else {
|
||||
version = "v1"
|
||||
}
|
||||
}
|
||||
// 从映射中获取模型名称对应的版本,如果找不到就使用 info.ApiVersion 或默认的版本 "v1"
|
||||
version, beta := modelVersionMap[info.UpstreamModelName]
|
||||
if !beta {
|
||||
if info.ApiVersion != "" {
|
||||
version = info.ApiVersion
|
||||
} else {
|
||||
version = "v1"
|
||||
}
|
||||
}
|
||||
|
||||
action := "generateContent"
|
||||
if info.IsStream {
|
||||
action = "streamGenerateContent"
|
||||
}
|
||||
return fmt.Sprintf("%s/%s/models/%s:%s", info.BaseUrl, version, info.UpstreamModelName, action), nil
|
||||
action := "generateContent"
|
||||
if info.IsStream {
|
||||
action = "streamGenerateContent"
|
||||
}
|
||||
return fmt.Sprintf("%s/%s/models/%s:%s", info.BaseUrl, version, info.UpstreamModelName, action), nil
|
||||
}
|
||||
|
||||
func (a *Adaptor) SetupRequestHeader(c *gin.Context, req *http.Request, info *relaycommon.RelayInfo) error {
|
||||
@@ -63,7 +63,7 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
|
||||
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
|
||||
if info.IsStream {
|
||||
var responseText string
|
||||
err, responseText = geminiChatStreamHandler(c, resp)
|
||||
err, responseText = geminiChatStreamHandler(c, resp, info)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
|
||||
} else {
|
||||
err, usage = geminiChatHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
|
||||
|
||||
@@ -7,10 +7,12 @@ import (
|
||||
"io"
|
||||
"net/http"
|
||||
"one-api/common"
|
||||
"one-api/constant"
|
||||
"one-api/dto"
|
||||
relaycommon "one-api/relay/common"
|
||||
"one-api/service"
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
"github.com/gin-gonic/gin"
|
||||
)
|
||||
@@ -160,10 +162,10 @@ func streamResponseGeminiChat2OpenAI(geminiResponse *GeminiChatResponse) *dto.Ch
|
||||
return &response
|
||||
}
|
||||
|
||||
func geminiChatStreamHandler(c *gin.Context, resp *http.Response) (*dto.OpenAIErrorWithStatusCode, string) {
|
||||
func geminiChatStreamHandler(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (*dto.OpenAIErrorWithStatusCode, string) {
|
||||
responseText := ""
|
||||
dataChan := make(chan string)
|
||||
stopChan := make(chan bool)
|
||||
dataChan := make(chan string, 5)
|
||||
stopChan := make(chan bool, 2)
|
||||
scanner := bufio.NewScanner(resp.Body)
|
||||
scanner.Split(func(data []byte, atEOF bool) (advance int, token []byte, err error) {
|
||||
if atEOF && len(data) == 0 {
|
||||
@@ -186,14 +188,23 @@ func geminiChatStreamHandler(c *gin.Context, resp *http.Response) (*dto.OpenAIEr
|
||||
}
|
||||
data = strings.TrimPrefix(data, "\"text\": \"")
|
||||
data = strings.TrimSuffix(data, "\"")
|
||||
dataChan <- data
|
||||
if !common.SafeSendStringTimeout(dataChan, data, constant.StreamingTimeout) {
|
||||
// send data timeout, stop the stream
|
||||
common.LogError(c, "send data timeout, stop the stream")
|
||||
break
|
||||
}
|
||||
}
|
||||
stopChan <- true
|
||||
}()
|
||||
isFirst := true
|
||||
service.SetEventStreamHeaders(c)
|
||||
c.Stream(func(w io.Writer) bool {
|
||||
select {
|
||||
case data := <-dataChan:
|
||||
if isFirst {
|
||||
isFirst = false
|
||||
info.FirstResponseTime = time.Now()
|
||||
}
|
||||
// this is used to prevent annoying \ related format bug
|
||||
data = fmt.Sprintf("{\"content\": \"%s\"}", data)
|
||||
type dummyStruct struct {
|
||||
@@ -256,7 +267,7 @@ func geminiChatHandler(c *gin.Context, resp *http.Response, promptTokens int, mo
|
||||
}, nil
|
||||
}
|
||||
fullTextResponse := responseGeminiChat2OpenAI(&geminiResponse)
|
||||
completionTokens, _, _ := service.CountTokenText(geminiResponse.GetResponseText(), model, false)
|
||||
completionTokens, _ := service.CountTokenText(geminiResponse.GetResponseText(), model)
|
||||
usage := dto.Usage{
|
||||
PromptTokens: promptTokens,
|
||||
CompletionTokens: completionTokens,
|
||||
|
||||
@@ -52,7 +52,7 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
|
||||
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
|
||||
if info.IsStream {
|
||||
var responseText string
|
||||
err, responseText, _ = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
|
||||
err, responseText, _ = openai.OpenaiStreamHandler(c, resp, info)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
|
||||
} else {
|
||||
if info.RelayMode == relayconstant.RelayModeEmbeddings {
|
||||
|
||||
@@ -82,7 +82,7 @@ func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycom
|
||||
if info.IsStream {
|
||||
var responseText string
|
||||
var toolCount int
|
||||
err, responseText, toolCount = OpenaiStreamHandler(c, resp, info.RelayMode)
|
||||
err, responseText, toolCount = OpenaiStreamHandler(c, resp, info)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
|
||||
usage.CompletionTokens += toolCount * 7
|
||||
} else {
|
||||
|
||||
@@ -8,7 +8,9 @@ import (
|
||||
"io"
|
||||
"net/http"
|
||||
"one-api/common"
|
||||
"one-api/constant"
|
||||
"one-api/dto"
|
||||
relaycommon "one-api/relay/common"
|
||||
relayconstant "one-api/relay/constant"
|
||||
"one-api/service"
|
||||
"strings"
|
||||
@@ -16,7 +18,7 @@ import (
|
||||
"time"
|
||||
)
|
||||
|
||||
func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*dto.OpenAIErrorWithStatusCode, string, int) {
|
||||
func OpenaiStreamHandler(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (*dto.OpenAIErrorWithStatusCode, string, int) {
|
||||
//checkSensitive := constant.ShouldCheckCompletionSensitive()
|
||||
var responseTextBuilder strings.Builder
|
||||
toolCount := 0
|
||||
@@ -50,14 +52,18 @@ func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*d
|
||||
if data[:6] != "data: " && data[:6] != "[DONE]" {
|
||||
continue
|
||||
}
|
||||
common.SafeSendString(dataChan, data)
|
||||
if !common.SafeSendStringTimeout(dataChan, data, constant.StreamingTimeout) {
|
||||
// send data timeout, stop the stream
|
||||
common.LogError(c, "send data timeout, stop the stream")
|
||||
break
|
||||
}
|
||||
data = data[6:]
|
||||
if !strings.HasPrefix(data, "[DONE]") {
|
||||
streamItems = append(streamItems, data)
|
||||
}
|
||||
}
|
||||
streamResp := "[" + strings.Join(streamItems, ",") + "]"
|
||||
switch relayMode {
|
||||
switch info.RelayMode {
|
||||
case relayconstant.RelayModeChatCompletions:
|
||||
var streamResponses []dto.ChatCompletionsStreamResponseSimple
|
||||
err := json.Unmarshal(common.StringToByteSlice(streamResp), &streamResponses)
|
||||
@@ -126,9 +132,14 @@ func OpenaiStreamHandler(c *gin.Context, resp *http.Response, relayMode int) (*d
|
||||
common.SafeSendBool(stopChan, true)
|
||||
}()
|
||||
service.SetEventStreamHeaders(c)
|
||||
isFirst := true
|
||||
c.Stream(func(w io.Writer) bool {
|
||||
select {
|
||||
case data := <-dataChan:
|
||||
if isFirst {
|
||||
isFirst = false
|
||||
info.FirstResponseTime = time.Now()
|
||||
}
|
||||
if strings.HasPrefix(data, "data: [DONE]") {
|
||||
data = data[:12]
|
||||
}
|
||||
@@ -187,10 +198,10 @@ func OpenaiHandler(c *gin.Context, resp *http.Response, promptTokens int, model
|
||||
return service.OpenAIErrorWrapper(err, "close_response_body_failed", http.StatusInternalServerError), nil
|
||||
}
|
||||
|
||||
if simpleResponse.Usage.TotalTokens == 0 {
|
||||
if simpleResponse.Usage.TotalTokens == 0 || (simpleResponse.Usage.PromptTokens == 0 && simpleResponse.Usage.CompletionTokens == 0) {
|
||||
completionTokens := 0
|
||||
for _, choice := range simpleResponse.Choices {
|
||||
ctkm, _, _ := service.CountTokenText(string(choice.Message.Content), model, false)
|
||||
ctkm, _ := service.CountTokenText(string(choice.Message.Content), model)
|
||||
completionTokens += ctkm
|
||||
}
|
||||
simpleResponse.Usage = dto.Usage{
|
||||
|
||||
@@ -156,7 +156,7 @@ func palmHandler(c *gin.Context, resp *http.Response, promptTokens int, model st
|
||||
}, nil
|
||||
}
|
||||
fullTextResponse := responsePaLM2OpenAI(&palmResponse)
|
||||
completionTokens, _, _ := service.CountTokenText(palmResponse.Candidates[0].Content, model, false)
|
||||
completionTokens, _ := service.CountTokenText(palmResponse.Candidates[0].Content, model)
|
||||
usage := dto.Usage{
|
||||
PromptTokens: promptTokens,
|
||||
CompletionTokens: completionTokens,
|
||||
|
||||
@@ -46,7 +46,7 @@ func (a *Adaptor) DoRequest(c *gin.Context, info *relaycommon.RelayInfo, request
|
||||
func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.RelayInfo) (usage *dto.Usage, err *dto.OpenAIErrorWithStatusCode) {
|
||||
if info.IsStream {
|
||||
var responseText string
|
||||
err, responseText, _ = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
|
||||
err, responseText, _ = openai.OpenaiStreamHandler(c, resp, info)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
|
||||
} else {
|
||||
err, usage = openai.OpenaiHandler(c, resp, info.PromptTokens, info.UpstreamModelName)
|
||||
|
||||
relay/channel/task/suno/adaptor.go (new file, 172 lines)
@@ -0,0 +1,172 @@
package suno
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"context"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"github.com/gin-gonic/gin"
|
||||
"io"
|
||||
"net/http"
|
||||
"one-api/common"
|
||||
"one-api/constant"
|
||||
"one-api/dto"
|
||||
"one-api/relay/channel"
|
||||
relaycommon "one-api/relay/common"
|
||||
"one-api/service"
|
||||
"strings"
|
||||
"time"
|
||||
)
|
||||
|
||||
type TaskAdaptor struct {
|
||||
ChannelType int
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) Init(info *relaycommon.TaskRelayInfo) {
|
||||
a.ChannelType = info.ChannelType
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) ValidateRequestAndSetAction(c *gin.Context, info *relaycommon.TaskRelayInfo) (taskErr *dto.TaskError) {
|
||||
action := strings.ToUpper(c.Param("action"))
|
||||
|
||||
var sunoRequest *dto.SunoSubmitReq
|
||||
err := common.UnmarshalBodyReusable(c, &sunoRequest)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapperLocal(err, "invalid_request", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
err = actionValidate(c, sunoRequest, action)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapperLocal(err, "invalid_request", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if sunoRequest.ContinueClipId != "" {
|
||||
if sunoRequest.TaskID == "" {
|
||||
taskErr = service.TaskErrorWrapperLocal(fmt.Errorf("task id is empty"), "invalid_request", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
info.OriginTaskID = sunoRequest.TaskID
|
||||
}
|
||||
|
||||
info.Action = action
|
||||
c.Set("task_request", sunoRequest)
|
||||
return nil
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) BuildRequestURL(info *relaycommon.TaskRelayInfo) (string, error) {
|
||||
baseURL := info.BaseUrl
|
||||
fullRequestURL := fmt.Sprintf("%s%s", baseURL, "/suno/submit/"+info.Action)
|
||||
return fullRequestURL, nil
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) BuildRequestHeader(c *gin.Context, req *http.Request, info *relaycommon.TaskRelayInfo) error {
|
||||
req.Header.Set("Content-Type", c.Request.Header.Get("Content-Type"))
|
||||
req.Header.Set("Accept", c.Request.Header.Get("Accept"))
|
||||
req.Header.Set("Authorization", "Bearer "+info.ApiKey)
|
||||
return nil
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) BuildRequestBody(c *gin.Context, info *relaycommon.TaskRelayInfo) (io.Reader, error) {
|
||||
sunoRequest, ok := c.Get("task_request")
|
||||
if !ok {
|
||||
err := common.UnmarshalBodyReusable(c, &sunoRequest)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
data, err := json.Marshal(sunoRequest)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return bytes.NewReader(data), nil
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) DoRequest(c *gin.Context, info *relaycommon.TaskRelayInfo, requestBody io.Reader) (*http.Response, error) {
|
||||
return channel.DoTaskApiRequest(a, c, info, requestBody)
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycommon.TaskRelayInfo) (taskID string, taskData []byte, taskErr *dto.TaskError) {
|
||||
responseBody, err := io.ReadAll(resp.Body)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "read_response_body_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
var sunoResponse dto.TaskResponse[string]
|
||||
err = json.Unmarshal(responseBody, &sunoResponse)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "unmarshal_response_body_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
if !sunoResponse.IsSuccess() {
|
||||
taskErr = service.TaskErrorWrapper(fmt.Errorf(sunoResponse.Message), sunoResponse.Code, http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
|
||||
for k, v := range resp.Header {
|
||||
c.Writer.Header().Set(k, v[0])
|
||||
}
|
||||
c.Writer.Header().Set("Content-Type", "application/json")
|
||||
c.Writer.WriteHeader(resp.StatusCode)
|
||||
|
||||
_, err = io.Copy(c.Writer, bytes.NewBuffer(responseBody))
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "copy_response_body_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
|
||||
return sunoResponse.Data, nil, nil
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) GetModelList() []string {
|
||||
return ModelList
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) GetChannelName() string {
|
||||
return ChannelName
|
||||
}
|
||||
|
||||
func (a *TaskAdaptor) FetchTask(baseUrl, key string, body map[string]any) (*http.Response, error) {
|
||||
requestUrl := fmt.Sprintf("%s/suno/fetch", baseUrl)
|
||||
byteBody, err := json.Marshal(body)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
req, err := http.NewRequest("POST", requestUrl, bytes.NewBuffer(byteBody))
|
||||
if err != nil {
|
||||
common.SysError(fmt.Sprintf("Get Task error: %v", err))
|
||||
return nil, err
|
||||
}
|
||||
defer req.Body.Close()
|
||||
// 设置超时时间
|
||||
timeout := time.Second * 15
|
||||
ctx, cancel := context.WithTimeout(context.Background(), timeout)
|
||||
defer cancel()
|
||||
// 使用带有超时的 context 创建新的请求
|
||||
req = req.WithContext(ctx)
|
||||
req.Header.Set("Content-Type", "application/json")
|
||||
req.Header.Set("Authorization", "Bearer "+key)
|
||||
resp, err := service.GetHttpClient().Do(req)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return resp, nil
|
||||
}
|
||||
|
||||
func actionValidate(c *gin.Context, sunoRequest *dto.SunoSubmitReq, action string) (err error) {
|
||||
switch action {
|
||||
case constant.SunoActionMusic:
|
||||
if sunoRequest.Mv == "" {
|
||||
sunoRequest.Mv = "chirp-v3-0"
|
||||
}
|
||||
case constant.SunoActionLyrics:
|
||||
if sunoRequest.Prompt == "" {
|
||||
err = fmt.Errorf("prompt_empty")
|
||||
return
|
||||
}
|
||||
default:
|
||||
err = fmt.Errorf("invalid_action")
|
||||
}
|
||||
return
|
||||
}
|
||||
relay/channel/task/suno/models.go (new file, 7 lines)
@@ -0,0 +1,7 @@
package suno

var ModelList = []string{
	"suno_music", "suno_lyrics",
}

var ChannelName = "suno"
@@ -31,6 +31,7 @@ type TencentChatRequest struct {
|
||||
// Messages 会话内容, 长度最多为40, 按对话时间从旧到新在数组中排列
|
||||
// 输入 content 总数最大支持 3000 token。
|
||||
Messages []TencentMessage `json:"messages"`
|
||||
Model string `json:"model"` // 模型名称
|
||||
}
|
||||
|
||||
type TencentError struct {
|
||||
|
||||
@@ -54,6 +54,7 @@ func requestOpenAI2Tencent(request dto.GeneralOpenAIRequest) *TencentChatRequest
|
||||
TopP: request.TopP,
|
||||
Stream: stream,
|
||||
Messages: messages,
|
||||
Model: request.Model,
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -6,6 +6,7 @@ var ModelList = []string{
|
||||
"SparkDesk-v2.1",
|
||||
"SparkDesk-v3.1",
|
||||
"SparkDesk-v3.5",
|
||||
"SparkDesk-v4.0",
|
||||
}
|
||||
|
||||
var ChannelName = "xunfei"
|
||||
|
||||
@@ -252,6 +252,8 @@ func apiVersion2domain(apiVersion string) string {
|
||||
return "generalv3"
|
||||
case "v3.5":
|
||||
return "generalv3.5"
|
||||
case "v4.0":
|
||||
return "4.0Ultra"
|
||||
}
|
||||
return "general" + apiVersion
|
||||
}
|
||||
|
||||
@@ -48,7 +48,7 @@ func (a *Adaptor) DoResponse(c *gin.Context, resp *http.Response, info *relaycom
|
||||
if info.IsStream {
|
||||
var responseText string
|
||||
var toolCount int
|
||||
err, responseText, toolCount = openai.OpenaiStreamHandler(c, resp, info.RelayMode)
|
||||
err, responseText, toolCount = openai.OpenaiStreamHandler(c, resp, info)
|
||||
usage, _ = service.ResponseText2Usage(responseText, info.UpstreamModelName, info.PromptTokens)
|
||||
usage.CompletionTokens += toolCount * 7
|
||||
} else {
|
||||
|
||||
@@ -16,6 +16,7 @@ type RelayInfo struct {
|
||||
Group string
|
||||
TokenUnlimited bool
|
||||
StartTime time.Time
|
||||
FirstResponseTime time.Time
|
||||
ApiType int
|
||||
IsStream bool
|
||||
RelayMode int
|
||||
@@ -37,24 +38,26 @@ func GenRelayInfo(c *gin.Context) *RelayInfo {
|
||||
group := c.GetString("group")
|
||||
tokenUnlimited := c.GetBool("token_unlimited_quota")
|
||||
startTime := time.Now()
|
||||
// firstResponseTime = time.Now() - 1 second
|
||||
|
||||
apiType, _ := constant.ChannelType2APIType(channelType)
|
||||
|
||||
info := &RelayInfo{
|
||||
RelayMode: constant.Path2RelayMode(c.Request.URL.Path),
|
||||
BaseUrl: c.GetString("base_url"),
|
||||
RequestURLPath: c.Request.URL.String(),
|
||||
ChannelType: channelType,
|
||||
ChannelId: channelId,
|
||||
TokenId: tokenId,
|
||||
UserId: userId,
|
||||
Group: group,
|
||||
TokenUnlimited: tokenUnlimited,
|
||||
StartTime: startTime,
|
||||
ApiType: apiType,
|
||||
ApiVersion: c.GetString("api_version"),
|
||||
ApiKey: strings.TrimPrefix(c.Request.Header.Get("Authorization"), "Bearer "),
|
||||
Organization: c.GetString("channel_organization"),
|
||||
RelayMode: constant.Path2RelayMode(c.Request.URL.Path),
|
||||
BaseUrl: c.GetString("base_url"),
|
||||
RequestURLPath: c.Request.URL.String(),
|
||||
ChannelType: channelType,
|
||||
ChannelId: channelId,
|
||||
TokenId: tokenId,
|
||||
UserId: userId,
|
||||
Group: group,
|
||||
TokenUnlimited: tokenUnlimited,
|
||||
StartTime: startTime,
|
||||
FirstResponseTime: startTime.Add(-time.Second),
|
||||
ApiType: apiType,
|
||||
ApiVersion: c.GetString("api_version"),
|
||||
ApiKey: strings.TrimPrefix(c.Request.Header.Get("Authorization"), "Bearer "),
|
||||
Organization: c.GetString("channel_organization"),
|
||||
}
|
||||
if info.BaseUrl == "" {
|
||||
info.BaseUrl = common.ChannelBaseURLs[channelType]
|
||||
@@ -72,3 +75,53 @@ func (info *RelayInfo) SetPromptTokens(promptTokens int) {
|
||||
func (info *RelayInfo) SetIsStream(isStream bool) {
|
||||
info.IsStream = isStream
|
||||
}
|
||||
|
||||
type TaskRelayInfo struct {
|
||||
ChannelType int
|
||||
ChannelId int
|
||||
TokenId int
|
||||
UserId int
|
||||
Group string
|
||||
StartTime time.Time
|
||||
ApiType int
|
||||
RelayMode int
|
||||
UpstreamModelName string
|
||||
RequestURLPath string
|
||||
ApiKey string
|
||||
BaseUrl string
|
||||
|
||||
Action string
|
||||
OriginTaskID string
|
||||
|
||||
ConsumeQuota bool
|
||||
}
|
||||
|
||||
func GenTaskRelayInfo(c *gin.Context) *TaskRelayInfo {
|
||||
channelType := c.GetInt("channel")
|
||||
channelId := c.GetInt("channel_id")
|
||||
|
||||
tokenId := c.GetInt("token_id")
|
||||
userId := c.GetInt("id")
|
||||
group := c.GetString("group")
|
||||
startTime := time.Now()
|
||||
|
||||
apiType, _ := constant.ChannelType2APIType(channelType)
|
||||
|
||||
info := &TaskRelayInfo{
|
||||
RelayMode: constant.Path2RelayMode(c.Request.URL.Path),
|
||||
BaseUrl: c.GetString("base_url"),
|
||||
RequestURLPath: c.Request.URL.String(),
|
||||
ChannelType: channelType,
|
||||
ChannelId: channelId,
|
||||
TokenId: tokenId,
|
||||
UserId: userId,
|
||||
Group: group,
|
||||
StartTime: startTime,
|
||||
ApiType: apiType,
|
||||
ApiKey: strings.TrimPrefix(c.Request.Header.Get("Authorization"), "Bearer "),
|
||||
}
|
||||
if info.BaseUrl == "" {
|
||||
info.BaseUrl = common.ChannelBaseURLs[channelType]
|
||||
}
|
||||
return info
|
||||
}
|
||||
|
||||
@@ -1,6 +1,9 @@
package constant

import "strings"
import (
	"net/http"
	"strings"
)

const (
	RelayModeUnknown = iota
@@ -26,6 +29,9 @@ const (
	RelayModeMidjourneyModal
	RelayModeMidjourneyShorten
	RelayModeSwapFace
	RelayModeSunoFetch
	RelayModeSunoFetchByID
	RelayModeSunoSubmit
)

func Path2RelayMode(path string) int {
@@ -89,3 +95,15 @@ func Path2RelayModeMidjourney(path string) int {
	}
	return relayMode
}

func Path2RelaySuno(method, path string) int {
	relayMode := RelayModeUnknown
	if method == http.MethodPost && strings.HasSuffix(path, "/fetch") {
		relayMode = RelayModeSunoFetch
	} else if method == http.MethodGet && strings.Contains(path, "/fetch/") {
		relayMode = RelayModeSunoFetchByID
	} else if strings.Contains(path, "/submit/") {
		relayMode = RelayModeSunoSubmit
	}
	return relayMode
}
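To illustrate how the new Suno routes resolve, assuming the /suno route group registered later in this diff (submit/:action, fetch, fetch/:id); the task id is a placeholder:

	Path2RelaySuno(http.MethodPost, "/suno/submit/music") // RelayModeSunoSubmit
	Path2RelaySuno(http.MethodPost, "/suno/fetch")        // RelayModeSunoFetch
	Path2RelaySuno(http.MethodGet, "/suno/fetch/task-id") // RelayModeSunoFetchByID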
@@ -55,7 +55,13 @@ func AudioHelper(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusCode {
|
||||
promptTokens := 0
|
||||
preConsumedTokens := common.PreConsumedQuota
|
||||
if strings.HasPrefix(audioRequest.Model, "tts-1") {
|
||||
promptTokens, err, _ = service.CountAudioToken(audioRequest.Input, audioRequest.Model, constant.ShouldCheckPromptSensitive())
|
||||
if constant.ShouldCheckPromptSensitive() {
|
||||
err = service.CheckSensitiveInput(audioRequest.Input)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "sensitive_words_detected", http.StatusBadRequest)
|
||||
}
|
||||
}
|
||||
promptTokens, err = service.CountAudioToken(audioRequest.Input, audioRequest.Model)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "count_audio_token_failed", http.StatusInternalServerError)
|
||||
}
|
||||
@@ -178,7 +184,7 @@ func AudioHelper(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusCode {
|
||||
if strings.HasPrefix(audioRequest.Model, "tts-1") {
|
||||
quota = promptTokens
|
||||
} else {
|
||||
quota, err, _ = service.CountAudioToken(audioResponse.Text, audioRequest.Model, false)
|
||||
quota, err = service.CountAudioToken(audioResponse.Text, audioRequest.Model)
|
||||
}
|
||||
quota = int(float64(quota) * ratio)
|
||||
if ratio != 0 && quota <= 0 {
|
||||
|
||||
@@ -10,6 +10,7 @@ import (
|
||||
"io"
|
||||
"net/http"
|
||||
"one-api/common"
|
||||
"one-api/constant"
|
||||
"one-api/dto"
|
||||
"one-api/model"
|
||||
relaycommon "one-api/relay/common"
|
||||
@@ -47,6 +48,13 @@ func RelayImageHelper(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusC
|
||||
return service.OpenAIErrorWrapper(errors.New("prompt is required"), "required_field_missing", http.StatusBadRequest)
|
||||
}
|
||||
|
||||
if constant.ShouldCheckPromptSensitive() {
|
||||
err = service.CheckSensitiveInput(imageRequest.Prompt)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "sensitive_words_detected", http.StatusBadRequest)
|
||||
}
|
||||
}
|
||||
|
||||
if strings.Contains(imageRequest.Size, "×") {
|
||||
return service.OpenAIErrorWrapper(errors.New("size an unexpected error occurred in the parameter, please use 'x' instead of the multiplication sign '×'"), "invalid_field_value", http.StatusBadRequest)
|
||||
}
|
||||
@@ -131,7 +139,7 @@ func RelayImageHelper(c *gin.Context, relayMode int) *dto.OpenAIErrorWithStatusC
|
||||
qualityRatio := 1.0
|
||||
if imageRequest.Model == "dall-e-3" && imageRequest.Quality == "hd" {
|
||||
qualityRatio = 2.0
|
||||
if imageRequest.Size == "1024×1792" || imageRequest.Size == "1792×1024" {
|
||||
if imageRequest.Size == "1024x1792" || imageRequest.Size == "1792x1024" {
|
||||
qualityRatio = 1.5
|
||||
}
|
||||
}
|
||||
|
||||
@@ -30,7 +30,7 @@ func RelayMidjourneyImage(c *gin.Context) {
|
||||
})
|
||||
return
|
||||
}
|
||||
resp, err := http.Get(midjourneyTask.ImageUrl)
|
||||
resp, err := common.ProxiedHttpGet(midjourneyTask.ImageUrl, common.OutProxyUrl)
|
||||
if err != nil {
|
||||
c.JSON(http.StatusInternalServerError, gin.H{
|
||||
"error": "http_get_image_failed",
|
||||
@@ -158,7 +158,7 @@ func RelaySwapFace(c *gin.Context) *dto.MidjourneyResponse {
|
||||
modelPrice, success := common.GetModelPrice(modelName, true)
|
||||
// 如果没有配置价格,则使用默认价格
|
||||
if !success {
|
||||
defaultPrice, ok := common.DefaultModelPrice[modelName]
|
||||
defaultPrice, ok := common.GetDefaultModelRatioMap()[modelName]
|
||||
if !ok {
|
||||
modelPrice = 0.1
|
||||
} else {
|
||||
@@ -457,7 +457,7 @@ func RelayMidjourneySubmit(c *gin.Context, relayMode int) *dto.MidjourneyRespons
|
||||
modelPrice, success := common.GetModelPrice(modelName, true)
|
||||
// 如果没有配置价格,则使用默认价格
|
||||
if !success {
|
||||
defaultPrice, ok := common.DefaultModelPrice[modelName]
|
||||
defaultPrice, ok := common.GetDefaultModelRatioMap()[modelName]
|
||||
if !ok {
|
||||
modelPrice = 0.1
|
||||
} else {
|
||||
@@ -538,7 +538,16 @@ func RelayMidjourneySubmit(c *gin.Context, relayMode int) *dto.MidjourneyRespons
|
||||
ChannelId: c.GetInt("channel_id"),
|
||||
Quota: quota,
|
||||
}
|
||||
|
||||
if midjResponse.Code == 3 {
|
||||
//无实例账号自动禁用渠道(No available account instance)
|
||||
channel, err := model.GetChannelById(midjourneyTask.ChannelId, true)
|
||||
if err != nil {
|
||||
common.SysError("get_channel_null: " + err.Error())
|
||||
}
|
||||
if channel.AutoBan != nil && *channel.AutoBan == 1 {
|
||||
model.UpdateChannelStatusById(midjourneyTask.ChannelId, 2, "No available account instance")
|
||||
}
|
||||
}
|
||||
if midjResponse.Code != 1 && midjResponse.Code != 21 && midjResponse.Code != 22 {
|
||||
//非1-提交成功,21-任务已存在和22-排队中,则记录错误原因
|
||||
midjourneyTask.FailReason = midjResponse.Description
|
||||
|
||||
@@ -98,13 +98,17 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
|
||||
var ratio float64
|
||||
var modelRatio float64
|
||||
//err := service.SensitiveWordsCheck(textRequest)
|
||||
promptTokens, err, sensitiveTrigger := getPromptTokens(textRequest, relayInfo)
|
||||
|
||||
// count messages token error 计算promptTokens错误
|
||||
if err != nil {
|
||||
if sensitiveTrigger {
|
||||
if constant.ShouldCheckPromptSensitive() {
|
||||
err = checkRequestSensitive(textRequest, relayInfo)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapperLocal(err, "sensitive_words_detected", http.StatusBadRequest)
|
||||
}
|
||||
}
|
||||
|
||||
promptTokens, err := getPromptTokens(textRequest, relayInfo)
|
||||
// count messages token error 计算promptTokens错误
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "count_token_messages_failed", http.StatusInternalServerError)
|
||||
}
|
||||
|
||||
@@ -128,7 +132,7 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
|
||||
|
||||
adaptor := GetAdaptor(relayInfo.ApiType)
|
||||
if adaptor == nil {
|
||||
return service.OpenAIErrorWrapper(fmt.Errorf("invalid api type: %d", relayInfo.ApiType), "invalid_api_type", http.StatusBadRequest)
|
||||
return service.OpenAIErrorWrapperLocal(fmt.Errorf("invalid api type: %d", relayInfo.ApiType), "invalid_api_type", http.StatusBadRequest)
|
||||
}
|
||||
adaptor.Init(relayInfo, *textRequest)
|
||||
var requestBody io.Reader
|
||||
@@ -136,7 +140,7 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
|
||||
if isModelMapped {
|
||||
jsonStr, err := json.Marshal(textRequest)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "marshal_text_request_failed", http.StatusInternalServerError)
|
||||
return service.OpenAIErrorWrapperLocal(err, "marshal_text_request_failed", http.StatusInternalServerError)
|
||||
}
|
||||
requestBody = bytes.NewBuffer(jsonStr)
|
||||
} else {
|
||||
@@ -145,11 +149,11 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
|
||||
} else {
|
||||
convertedRequest, err := adaptor.ConvertRequest(c, relayInfo.RelayMode, textRequest)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "convert_request_failed", http.StatusInternalServerError)
|
||||
return service.OpenAIErrorWrapperLocal(err, "convert_request_failed", http.StatusInternalServerError)
|
||||
}
|
||||
jsonData, err := json.Marshal(convertedRequest)
|
||||
if err != nil {
|
||||
return service.OpenAIErrorWrapper(err, "json_marshal_failed", http.StatusInternalServerError)
|
||||
return service.OpenAIErrorWrapperLocal(err, "json_marshal_failed", http.StatusInternalServerError)
|
||||
}
|
||||
requestBody = bytes.NewBuffer(jsonData)
|
||||
}
|
||||
@@ -182,15 +186,14 @@ func TextHelper(c *gin.Context) *dto.OpenAIErrorWithStatusCode {
|
||||
return nil
|
||||
}
|
||||
|
||||
func getPromptTokens(textRequest *dto.GeneralOpenAIRequest, info *relaycommon.RelayInfo) (int, error, bool) {
|
||||
func getPromptTokens(textRequest *dto.GeneralOpenAIRequest, info *relaycommon.RelayInfo) (int, error) {
|
||||
var promptTokens int
|
||||
var err error
|
||||
var sensitiveTrigger bool
|
||||
checkSensitive := constant.ShouldCheckPromptSensitive()
|
||||
switch info.RelayMode {
|
||||
case relayconstant.RelayModeChatCompletions:
|
||||
promptTokens, err, sensitiveTrigger = service.CountTokenChatRequest(*textRequest, textRequest.Model, checkSensitive)
|
||||
promptTokens, err = service.CountTokenChatRequest(*textRequest, textRequest.Model)
|
||||
case relayconstant.RelayModeCompletions:
|
||||
promptTokens, err = service.CountTokenInput(textRequest.Prompt, textRequest.Model)
|
||||
prompts := textRequest.Prompt
|
||||
switch v := prompts.(type) {
|
||||
case string:
|
||||
@@ -199,17 +202,32 @@ func getPromptTokens(textRequest *dto.GeneralOpenAIRequest, info *relaycommon.Re
|
||||
prompts = append(v, textRequest.Suffix)
|
||||
}
|
||||
|
||||
promptTokens, err, sensitiveTrigger = service.CountTokenInput(prompts, textRequest.Model, checkSensitive)
|
||||
promptTokens, err = service.CountTokenInput(prompts, textRequest.Model)
|
||||
case relayconstant.RelayModeModerations:
|
||||
promptTokens, err, sensitiveTrigger = service.CountTokenInput(textRequest.Input, textRequest.Model, checkSensitive)
|
||||
promptTokens, err = service.CountTokenInput(textRequest.Input, textRequest.Model)
|
||||
case relayconstant.RelayModeEmbeddings:
|
||||
promptTokens, err, sensitiveTrigger = service.CountTokenInput(textRequest.Input, textRequest.Model, checkSensitive)
|
||||
promptTokens, err = service.CountTokenInput(textRequest.Input, textRequest.Model)
|
||||
default:
|
||||
err = errors.New("unknown relay mode")
|
||||
promptTokens = 0
|
||||
}
|
||||
info.PromptTokens = promptTokens
|
||||
return promptTokens, err, sensitiveTrigger
|
||||
return promptTokens, err
|
||||
}
|
||||
|
||||
func checkRequestSensitive(textRequest *dto.GeneralOpenAIRequest, info *relaycommon.RelayInfo) error {
|
||||
var err error
|
||||
switch info.RelayMode {
|
||||
case relayconstant.RelayModeChatCompletions:
|
||||
err = service.CheckSensitiveMessages(textRequest.Messages)
|
||||
case relayconstant.RelayModeCompletions:
|
||||
err = service.CheckSensitiveInput(textRequest.Prompt)
|
||||
case relayconstant.RelayModeModerations:
|
||||
err = service.CheckSensitiveInput(textRequest.Input)
|
||||
case relayconstant.RelayModeEmbeddings:
|
||||
err = service.CheckSensitiveInput(textRequest.Input)
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
// 预扣费并返回用户剩余配额
|
||||
@@ -326,14 +344,7 @@ func postConsumeQuota(ctx *gin.Context, relayInfo *relaycommon.RelayInfo, textRe
|
||||
logModel = "g-*"
|
||||
logContent += fmt.Sprintf(",模型 %s", textRequest.Model)
|
||||
}
|
||||
other := make(map[string]interface{})
|
||||
other["model_ratio"] = modelRatio
|
||||
other["group_ratio"] = groupRatio
|
||||
other["completion_ratio"] = completionRatio
|
||||
other["model_price"] = modelPrice
|
||||
adminInfo := make(map[string]interface{})
|
||||
adminInfo["use_channel"] = ctx.GetStringSlice("use_channel")
|
||||
other["admin_info"] = adminInfo
|
||||
other := service.GenerateTextOtherInfo(ctx, relayInfo, modelRatio, groupRatio, completionRatio, modelPrice)
|
||||
model.RecordConsumeLog(ctx, relayInfo.UserId, relayInfo.ChannelId, promptTokens, completionTokens, logModel, tokenName, quota, logContent, relayInfo.TokenId, userQuota, int(useTimeSeconds), relayInfo.IsStream, other)
|
||||
|
||||
//if quota != 0 {
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
package relay
|
||||
|
||||
import (
|
||||
commonconstant "one-api/constant"
|
||||
"one-api/relay/channel"
|
||||
"one-api/relay/channel/ali"
|
||||
"one-api/relay/channel/aws"
|
||||
@@ -12,6 +13,7 @@ import (
|
||||
"one-api/relay/channel/openai"
|
||||
"one-api/relay/channel/palm"
|
||||
"one-api/relay/channel/perplexity"
|
||||
"one-api/relay/channel/task/suno"
|
||||
"one-api/relay/channel/tencent"
|
||||
"one-api/relay/channel/xunfei"
|
||||
"one-api/relay/channel/zhipu"
|
||||
@@ -54,3 +56,13 @@ func GetAdaptor(apiType int) channel.Adaptor {
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func GetTaskAdaptor(platform commonconstant.TaskPlatform) channel.TaskAdaptor {
|
||||
switch platform {
|
||||
//case constant.APITypeAIProxyLibrary:
|
||||
// return &aiproxy.Adaptor{}
|
||||
case commonconstant.TaskPlatformSuno:
|
||||
return &suno.TaskAdaptor{}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
relay/relay_task.go (new file, +242)
@@ -0,0 +1,242 @@
|
||||
package relay
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"context"
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"fmt"
|
||||
"github.com/gin-gonic/gin"
|
||||
"io"
|
||||
"net/http"
|
||||
"one-api/common"
|
||||
"one-api/constant"
|
||||
"one-api/dto"
|
||||
"one-api/model"
|
||||
relaycommon "one-api/relay/common"
|
||||
relayconstant "one-api/relay/constant"
|
||||
"one-api/service"
|
||||
)
|
||||
|
||||
/*
|
||||
Task 任务通过平台、Action 区分任务
|
||||
*/
|
||||
func RelayTaskSubmit(c *gin.Context, relayMode int) (taskErr *dto.TaskError) {
|
||||
platform := constant.TaskPlatform(c.GetString("platform"))
|
||||
relayInfo := relaycommon.GenTaskRelayInfo(c)
|
||||
|
||||
adaptor := GetTaskAdaptor(platform)
|
||||
if adaptor == nil {
|
||||
return service.TaskErrorWrapperLocal(fmt.Errorf("invalid api platform: %s", platform), "invalid_api_platform", http.StatusBadRequest)
|
||||
}
|
||||
adaptor.Init(relayInfo)
|
||||
// get & validate taskRequest 获取并验证文本请求
|
||||
taskErr = adaptor.ValidateRequestAndSetAction(c, relayInfo)
|
||||
if taskErr != nil {
|
||||
return
|
||||
}
|
||||
|
||||
modelName := service.CoverTaskActionToModelName(platform, relayInfo.Action)
|
||||
modelPrice, success := common.GetModelPrice(modelName, true)
|
||||
if !success {
|
||||
defaultPrice, ok := common.GetDefaultModelRatioMap()[modelName]
|
||||
if !ok {
|
||||
modelPrice = 0.1
|
||||
} else {
|
||||
modelPrice = defaultPrice
|
||||
}
|
||||
}
|
||||
|
||||
// 预扣
|
||||
groupRatio := common.GetGroupRatio(relayInfo.Group)
|
||||
ratio := modelPrice * groupRatio
|
||||
userQuota, err := model.CacheGetUserQuota(relayInfo.UserId)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "get_user_quota_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
quota := int(ratio * common.QuotaPerUnit)
|
||||
if userQuota-quota < 0 {
|
||||
taskErr = service.TaskErrorWrapperLocal(errors.New("user quota is not enough"), "quota_not_enough", http.StatusForbidden)
|
||||
return
|
||||
}
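As a rough sanity check of the pre-deduction above, assuming the project's usual QuotaPerUnit of 500000 (the same 500000 divisor shown in the pricing banner later in this diff) and a group ratio of 1.0, a task billed at the 0.1 fallback price reserves:

	quota := int(0.1 * 1.0 * 500000) // 50000 quota, i.e. $0.10 withheld until the task is accepted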
|
||||
|
||||
if relayInfo.OriginTaskID != "" {
|
||||
originTask, exist, err := model.GetByTaskId(relayInfo.UserId, relayInfo.OriginTaskID)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "get_origin_task_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
if !exist {
|
||||
taskErr = service.TaskErrorWrapperLocal(errors.New("task_origin_not_exist"), "task_not_exist", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
if originTask.ChannelId != relayInfo.ChannelId {
|
||||
channel, err := model.GetChannelById(originTask.ChannelId, true)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapperLocal(err, "channel_not_found", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
if channel.Status != common.ChannelStatusEnabled {
|
||||
return service.TaskErrorWrapperLocal(errors.New("该任务所属渠道已被禁用"), "task_channel_disable", http.StatusBadRequest)
|
||||
}
|
||||
c.Set("base_url", channel.GetBaseURL())
|
||||
c.Set("channel_id", originTask.ChannelId)
|
||||
c.Request.Header.Set("Authorization", fmt.Sprintf("Bearer %s", channel.Key))
|
||||
|
||||
relayInfo.BaseUrl = channel.GetBaseURL()
|
||||
relayInfo.ChannelId = originTask.ChannelId
|
||||
}
|
||||
}
|
||||
|
||||
// build body
|
||||
requestBody, err := adaptor.BuildRequestBody(c, relayInfo)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "build_request_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
// do request
|
||||
resp, err := adaptor.DoRequest(c, relayInfo, requestBody)
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "do_request_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
// handle response
|
||||
if resp != nil && resp.StatusCode != http.StatusOK {
|
||||
responseBody, _ := io.ReadAll(resp.Body)
|
||||
taskErr = service.TaskErrorWrapper(fmt.Errorf(string(responseBody)), "fail_to_fetch_task", resp.StatusCode)
|
||||
return
|
||||
}
|
||||
|
||||
defer func(ctx context.Context) {
|
||||
// release quota
|
||||
if relayInfo.ConsumeQuota && taskErr == nil {
|
||||
err := model.PostConsumeTokenQuota(relayInfo.TokenId, userQuota, quota, 0, true)
|
||||
if err != nil {
|
||||
common.SysError("error consuming token remain quota: " + err.Error())
|
||||
}
|
||||
err = model.CacheUpdateUserQuota(relayInfo.UserId)
|
||||
if err != nil {
|
||||
common.SysError("error update user quota cache: " + err.Error())
|
||||
}
|
||||
if quota != 0 {
|
||||
tokenName := c.GetString("token_name")
|
||||
logContent := fmt.Sprintf("模型固定价格 %.2f,分组倍率 %.2f,操作 %s", modelPrice, groupRatio, relayInfo.Action)
|
||||
other := make(map[string]interface{})
|
||||
other["model_price"] = modelPrice
|
||||
other["group_ratio"] = groupRatio
|
||||
model.RecordConsumeLog(ctx, relayInfo.UserId, relayInfo.ChannelId, 0, 0, modelName, tokenName, quota, logContent, relayInfo.TokenId, userQuota, 0, false, other)
|
||||
model.UpdateUserUsedQuotaAndRequestCount(relayInfo.UserId, quota)
|
||||
model.UpdateChannelUsedQuota(relayInfo.ChannelId, quota)
|
||||
}
|
||||
}
|
||||
}(c.Request.Context())
|
||||
|
||||
taskID, taskData, taskErr := adaptor.DoResponse(c, resp, relayInfo)
|
||||
if taskErr != nil {
|
||||
return
|
||||
}
|
||||
relayInfo.ConsumeQuota = true
|
||||
// insert task
|
||||
task := model.InitTask(constant.TaskPlatformSuno, relayInfo)
|
||||
task.TaskID = taskID
|
||||
task.Quota = quota
|
||||
task.Data = taskData
|
||||
err = task.Insert()
|
||||
if err != nil {
|
||||
taskErr = service.TaskErrorWrapper(err, "insert_task_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
var fetchRespBuilders = map[int]func(c *gin.Context) (respBody []byte, taskResp *dto.TaskError){
|
||||
relayconstant.RelayModeSunoFetchByID: sunoFetchByIDRespBodyBuilder,
|
||||
relayconstant.RelayModeSunoFetch: sunoFetchRespBodyBuilder,
|
||||
}
|
||||
|
||||
func RelayTaskFetch(c *gin.Context, relayMode int) (taskResp *dto.TaskError) {
|
||||
respBuilder, ok := fetchRespBuilders[relayMode]
|
||||
if !ok {
|
||||
taskResp = service.TaskErrorWrapperLocal(errors.New("invalid_relay_mode"), "invalid_relay_mode", http.StatusBadRequest)
|
||||
}
|
||||
|
||||
respBody, taskErr := respBuilder(c)
|
||||
if taskErr != nil {
|
||||
return taskErr
|
||||
}
|
||||
|
||||
c.Writer.Header().Set("Content-Type", "application/json")
|
||||
_, err := io.Copy(c.Writer, bytes.NewBuffer(respBody))
|
||||
if err != nil {
|
||||
taskResp = service.TaskErrorWrapper(err, "copy_response_body_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
func sunoFetchRespBodyBuilder(c *gin.Context) (respBody []byte, taskResp *dto.TaskError) {
|
||||
userId := c.GetInt("id")
|
||||
var condition = struct {
|
||||
IDs []any `json:"ids"`
|
||||
Action string `json:"action"`
|
||||
}{}
|
||||
err := c.BindJSON(&condition)
|
||||
if err != nil {
|
||||
taskResp = service.TaskErrorWrapper(err, "invalid_request", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
var tasks []any
|
||||
if len(condition.IDs) > 0 {
|
||||
taskModels, err := model.GetByTaskIds(userId, condition.IDs)
|
||||
if err != nil {
|
||||
taskResp = service.TaskErrorWrapper(err, "get_tasks_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
for _, task := range taskModels {
|
||||
tasks = append(tasks, TaskModel2Dto(task))
|
||||
}
|
||||
} else {
|
||||
tasks = make([]any, 0)
|
||||
}
|
||||
respBody, err = json.Marshal(dto.TaskResponse[[]any]{
|
||||
Code: "success",
|
||||
Data: tasks,
|
||||
})
|
||||
return
|
||||
}
|
||||
|
||||
func sunoFetchByIDRespBodyBuilder(c *gin.Context) (respBody []byte, taskResp *dto.TaskError) {
|
||||
taskId := c.Param("id")
|
||||
userId := c.GetInt("id")
|
||||
|
||||
originTask, exist, err := model.GetByTaskId(userId, taskId)
|
||||
if err != nil {
|
||||
taskResp = service.TaskErrorWrapper(err, "get_task_failed", http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
if !exist {
|
||||
taskResp = service.TaskErrorWrapperLocal(errors.New("task_not_exist"), "task_not_exist", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
respBody, err = json.Marshal(dto.TaskResponse[any]{
|
||||
Code: "success",
|
||||
Data: TaskModel2Dto(originTask),
|
||||
})
|
||||
return
|
||||
}
|
||||
|
||||
func TaskModel2Dto(task *model.Task) *dto.TaskDto {
|
||||
return &dto.TaskDto{
|
||||
TaskID: task.TaskID,
|
||||
Action: task.Action,
|
||||
Status: string(task.Status),
|
||||
FailReason: task.FailReason,
|
||||
SubmitTime: task.SubmitTime,
|
||||
StartTime: task.StartTime,
|
||||
FinishTime: task.FinishTime,
|
||||
Progress: task.Progress,
|
||||
Data: task.Data,
|
||||
}
|
||||
}
|
||||
@@ -74,6 +74,7 @@ func SetApiRouter(router *gin.Engine) {
|
||||
{
|
||||
optionRoute.GET("/", controller.GetOptions)
|
||||
optionRoute.PUT("/", controller.UpdateOption)
|
||||
optionRoute.POST("/rest_model_ratio", controller.ResetModelRatio)
|
||||
}
|
||||
channelRoute := apiRouter.Group("/channel")
|
||||
channelRoute.Use(middleware.AdminAuth())
|
||||
@@ -92,6 +93,8 @@ func SetApiRouter(router *gin.Engine) {
|
||||
channelRoute.DELETE("/:id", controller.DeleteChannel)
|
||||
channelRoute.POST("/batch", controller.DeleteChannelBatch)
|
||||
channelRoute.POST("/fix", controller.FixChannelsAbilities)
|
||||
channelRoute.GET("/fetch_models/:id", controller.FetchUpstreamModels)
|
||||
|
||||
}
|
||||
tokenRoute := apiRouter.Group("/token")
|
||||
tokenRoute.Use(middleware.UserAuth())
|
||||
@@ -139,5 +142,11 @@ func SetApiRouter(router *gin.Engine) {
|
||||
mjRoute := apiRouter.Group("/mj")
|
||||
mjRoute.GET("/self", middleware.UserAuth(), controller.GetUserMidjourney)
|
||||
mjRoute.GET("/", middleware.AdminAuth(), controller.GetAllMidjourney)
|
||||
|
||||
taskRoute := apiRouter.Group("/task")
|
||||
{
|
||||
taskRoute.GET("/self", middleware.UserAuth(), controller.GetUserTask)
|
||||
taskRoute.GET("/", middleware.AdminAuth(), controller.GetAllTask)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -50,6 +50,15 @@ func SetRelayRouter(router *gin.Engine) {
|
||||
relayMjModeRouter := router.Group("/:mode/mj")
|
||||
registerMjRouterGroup(relayMjModeRouter)
|
||||
//relayMjRouter.Use()
|
||||
|
||||
relaySunoRouter := router.Group("/suno")
|
||||
relaySunoRouter.Use(middleware.TokenAuth(), middleware.Distribute())
|
||||
{
|
||||
relaySunoRouter.POST("/submit/:action", controller.RelayTask)
|
||||
relaySunoRouter.POST("/fetch", controller.RelayTask)
|
||||
relaySunoRouter.GET("/fetch/:id", controller.RelayTask)
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
func registerMjRouterGroup(relayMjRouter *gin.RouterGroup) {
|
||||
|
||||
@@ -11,30 +11,33 @@ import (
|
||||
|
||||
// disable & notify
|
||||
func DisableChannel(channelId int, channelName string, reason string) {
|
||||
model.UpdateChannelStatusById(channelId, common.ChannelStatusAutoDisabled)
|
||||
model.UpdateChannelStatusById(channelId, common.ChannelStatusAutoDisabled, reason)
|
||||
subject := fmt.Sprintf("通道「%s」(#%d)已被禁用", channelName, channelId)
|
||||
content := fmt.Sprintf("通道「%s」(#%d)已被禁用,原因:%s", channelName, channelId, reason)
|
||||
notifyRootUser(subject, content)
|
||||
}
|
||||
|
||||
func EnableChannel(channelId int, channelName string) {
|
||||
model.UpdateChannelStatusById(channelId, common.ChannelStatusEnabled)
|
||||
model.UpdateChannelStatusById(channelId, common.ChannelStatusEnabled, "")
|
||||
subject := fmt.Sprintf("通道「%s」(#%d)已被启用", channelName, channelId)
|
||||
content := fmt.Sprintf("通道「%s」(#%d)已被启用", channelName, channelId)
|
||||
notifyRootUser(subject, content)
|
||||
}
|
||||
|
||||
func ShouldDisableChannel(err *relaymodel.OpenAIError, statusCode int) bool {
|
||||
func ShouldDisableChannel(err *relaymodel.OpenAIErrorWithStatusCode) bool {
|
||||
if !common.AutomaticDisableChannelEnabled {
|
||||
return false
|
||||
}
|
||||
if err == nil {
|
||||
return false
|
||||
}
|
||||
if statusCode == http.StatusUnauthorized {
|
||||
if err.LocalError {
|
||||
return false
|
||||
}
|
||||
if err.StatusCode == http.StatusUnauthorized || err.StatusCode == http.StatusForbidden {
|
||||
return true
|
||||
}
|
||||
switch err.Code {
|
||||
switch err.Error.Code {
|
||||
case "invalid_api_key":
|
||||
return true
|
||||
case "account_deactivated":
|
||||
@@ -42,7 +45,7 @@ func ShouldDisableChannel(err *relaymodel.OpenAIError, statusCode int) bool {
|
||||
case "billing_not_active":
|
||||
return true
|
||||
}
|
||||
switch err.Type {
|
||||
switch err.Error.Type {
|
||||
case "insufficient_quota":
|
||||
return true
|
||||
// https://docs.anthropic.com/claude/reference/errors
|
||||
@@ -53,11 +56,13 @@ func ShouldDisableChannel(err *relaymodel.OpenAIError, statusCode int) bool {
|
||||
case "forbidden":
|
||||
return true
|
||||
}
|
||||
if strings.HasPrefix(err.Message, "Your credit balance is too low") { // anthropic
|
||||
if strings.HasPrefix(err.Error.Message, "Your credit balance is too low") { // anthropic
|
||||
return true
|
||||
} else if strings.HasPrefix(err.Message, "This organization has been disabled.") {
|
||||
} else if strings.HasPrefix(err.Error.Message, "This organization has been disabled.") {
|
||||
return true
|
||||
} else if strings.HasPrefix(err.Message, "You exceeded your current quota") {
|
||||
} else if strings.HasPrefix(err.Error.Message, "You exceeded your current quota") {
|
||||
return true
|
||||
} else if strings.HasPrefix(err.Error.Message, "Permission denied") {
|
||||
return true
|
||||
}
|
||||
return false
|
||||
|
||||
@@ -105,3 +105,29 @@ func ResetStatusCode(openaiErr *dto.OpenAIErrorWithStatusCode, statusCodeMapping
|
||||
openaiErr.StatusCode = intCode
|
||||
}
|
||||
}
|
||||
|
||||
func TaskErrorWrapperLocal(err error, code string, statusCode int) *dto.TaskError {
|
||||
openaiErr := TaskErrorWrapper(err, code, statusCode)
|
||||
openaiErr.LocalError = true
|
||||
return openaiErr
|
||||
}
|
||||
|
||||
func TaskErrorWrapper(err error, code string, statusCode int) *dto.TaskError {
|
||||
text := err.Error()
|
||||
|
||||
// 定义一个正则表达式匹配URL
|
||||
if strings.Contains(text, "Post") || strings.Contains(text, "dial") {
|
||||
common.SysLog(fmt.Sprintf("error: %s", text))
|
||||
text = "请求上游地址失败"
|
||||
}
|
||||
//避免暴露内部错误
|
||||
|
||||
taskError := &dto.TaskError{
|
||||
Code: code,
|
||||
Message: text,
|
||||
StatusCode: statusCode,
|
||||
Error: err,
|
||||
}
|
||||
|
||||
return taskError
|
||||
}
|
||||
|
||||
service/log.go (new file, +19)
@@ -0,0 +1,19 @@
package service

import (
	"github.com/gin-gonic/gin"
	relaycommon "one-api/relay/common"
)

func GenerateTextOtherInfo(ctx *gin.Context, relayInfo *relaycommon.RelayInfo, modelRatio, groupRatio, completionRatio, modelPrice float64) map[string]interface{} {
	other := make(map[string]interface{})
	other["model_ratio"] = modelRatio
	other["group_ratio"] = groupRatio
	other["completion_ratio"] = completionRatio
	other["model_price"] = modelPrice
	other["frt"] = float64(relayInfo.FirstResponseTime.UnixMilli() - relayInfo.StartTime.UnixMilli())
	adminInfo := make(map[string]interface{})
	adminInfo["use_channel"] = ctx.GetStringSlice("use_channel")
	other["admin_info"] = adminInfo
	return other
}
@@ -1,13 +1,59 @@
|
||||
package service
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"errors"
|
||||
"fmt"
|
||||
"github.com/anknown/ahocorasick"
|
||||
"one-api/constant"
|
||||
"one-api/dto"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func CheckSensitiveMessages(messages []dto.Message) error {
|
||||
for _, message := range messages {
|
||||
if len(message.Content) > 0 {
|
||||
if message.IsStringContent() {
|
||||
stringContent := message.StringContent()
|
||||
if ok, words := SensitiveWordContains(stringContent); ok {
|
||||
return errors.New("sensitive words: " + strings.Join(words, ","))
|
||||
}
|
||||
}
|
||||
} else {
|
||||
arrayContent := message.ParseContent()
|
||||
for _, m := range arrayContent {
|
||||
if m.Type == "image_url" {
|
||||
// TODO: check image url
|
||||
} else {
|
||||
if ok, words := SensitiveWordContains(m.Text); ok {
|
||||
return errors.New("sensitive words: " + strings.Join(words, ","))
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func CheckSensitiveText(text string) error {
|
||||
if ok, words := SensitiveWordContains(text); ok {
|
||||
return errors.New("sensitive words: " + strings.Join(words, ","))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func CheckSensitiveInput(input any) error {
|
||||
switch v := input.(type) {
|
||||
case string:
|
||||
return CheckSensitiveText(v)
|
||||
case []string:
|
||||
text := ""
|
||||
for _, s := range v {
|
||||
text += s
|
||||
}
|
||||
return CheckSensitiveText(text)
|
||||
}
|
||||
return CheckSensitiveText(fmt.Sprintf("%v", input))
|
||||
}
|
||||
|
||||
// SensitiveWordContains 是否包含敏感词,返回是否包含敏感词和敏感词列表
|
||||
func SensitiveWordContains(text string) (bool, []string) {
|
||||
if len(constant.SensitiveWords) == 0 {
|
||||
@@ -15,7 +61,7 @@ func SensitiveWordContains(text string) (bool, []string) {
|
||||
}
|
||||
checkText := strings.ToLower(text)
|
||||
// 构建一个AC自动机
|
||||
m := initAc()
|
||||
m := InitAc()
|
||||
hits := m.MultiPatternSearch([]rune(checkText), false)
|
||||
if len(hits) > 0 {
|
||||
words := make([]string, 0)
|
||||
@@ -33,7 +79,7 @@ func SensitiveWordReplace(text string, returnImmediately bool) (bool, []string,
|
||||
return false, nil, text
|
||||
}
|
||||
checkText := strings.ToLower(text)
|
||||
m := initAc()
|
||||
m := InitAc()
|
||||
hits := m.MultiPatternSearch([]rune(checkText), returnImmediately)
|
||||
if len(hits) > 0 {
|
||||
words := make([]string, 0)
|
||||
@@ -47,25 +93,3 @@ func SensitiveWordReplace(text string, returnImmediately bool) (bool, []string,
|
||||
}
|
||||
return false, nil, text
|
||||
}
|
||||
|
||||
func initAc() *goahocorasick.Machine {
|
||||
m := new(goahocorasick.Machine)
|
||||
dict := readRunes()
|
||||
if err := m.Build(dict); err != nil {
|
||||
fmt.Println(err)
|
||||
return nil
|
||||
}
|
||||
return m
|
||||
}
|
||||
|
||||
func readRunes() [][]rune {
|
||||
var dict [][]rune
|
||||
|
||||
for _, word := range constant.SensitiveWords {
|
||||
word = strings.ToLower(word)
|
||||
l := bytes.TrimSpace([]byte(word))
|
||||
dict = append(dict, bytes.Runes(l))
|
||||
}
|
||||
|
||||
return dict
|
||||
}
|
||||
|
||||
@@ -1,4 +1,12 @@
|
||||
package common
|
||||
package service
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
goahocorasick "github.com/anknown/ahocorasick"
|
||||
"one-api/constant"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func SundaySearch(text string, pattern string) bool {
|
||||
// 计算偏移表
|
||||
@@ -48,3 +56,25 @@ func RemoveDuplicate(s []string) []string {
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func InitAc() *goahocorasick.Machine {
|
||||
m := new(goahocorasick.Machine)
|
||||
dict := readRunes()
|
||||
if err := m.Build(dict); err != nil {
|
||||
fmt.Println(err)
|
||||
return nil
|
||||
}
|
||||
return m
|
||||
}
|
||||
|
||||
func readRunes() [][]rune {
|
||||
var dict [][]rune
|
||||
|
||||
for _, word := range constant.SensitiveWords {
|
||||
word = strings.ToLower(word)
|
||||
l := bytes.TrimSpace([]byte(word))
|
||||
dict = append(dict, bytes.Runes(l))
|
||||
}
|
||||
|
||||
return dict
|
||||
}
|
||||
service/task.go (new file, +10)
@@ -0,0 +1,10 @@
package service

import (
	"one-api/constant"
	"strings"
)

func CoverTaskActionToModelName(platform constant.TaskPlatform, action string) string {
	return strings.ToLower(string(platform)) + "_" + strings.ToLower(action)
}
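This helper is what maps a task platform and action onto the model name used for price lookup in RelayTaskSubmit above; assuming the Suno platform constant's string value is "suno", which matches the suno_music / suno_lyrics entries in models.go:

	CoverTaskActionToModelName(constant.TaskPlatformSuno, "MUSIC")  // "suno_music"
	CoverTaskActionToModelName(constant.TaskPlatformSuno, "LYRICS") // "suno_lyrics"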
@@ -4,7 +4,7 @@ import (
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"fmt"
|
||||
"github.com/linux-do/tiktoken-go"
|
||||
"github.com/pkoukk/tiktoken-go"
|
||||
"image"
|
||||
"log"
|
||||
"math"
|
||||
@@ -34,7 +34,7 @@ func InitTokenEncoders() {
|
||||
if err != nil {
|
||||
common.FatalLog(fmt.Sprintf("failed to get gpt-4o token encoder: %s", err.Error()))
|
||||
}
|
||||
for model, _ := range common.DefaultModelRatio {
|
||||
for model, _ := range common.GetDefaultModelRatioMap() {
|
||||
if strings.HasPrefix(model, "gpt-3.5") {
|
||||
tokenEncoderMap[model] = gpt35TokenEncoder
|
||||
} else if strings.HasPrefix(model, "gpt-4o") {
|
||||
@@ -53,6 +53,7 @@ func getTokenEncoder(model string) *tiktoken.Tiktoken {
|
||||
if ok && tokenEncoder != nil {
|
||||
return tokenEncoder
|
||||
}
|
||||
// 如果ok(即model在tokenEncoderMap中),但是tokenEncoder为nil,说明可能是自定义模型
|
||||
if ok {
|
||||
tokenEncoder, err := tiktoken.EncodingForModel(model)
|
||||
if err != nil {
|
||||
@@ -77,6 +78,10 @@ func getImageToken(imageUrl *dto.MessageImageUrl, model string, stream bool) (in
|
||||
if imageUrl.Detail == "low" {
|
||||
return 85, nil
|
||||
}
|
||||
// 同步One API的图片计费逻辑
|
||||
if imageUrl.Detail == "auto" || imageUrl.Detail == "" {
|
||||
imageUrl.Detail = "high"
|
||||
}
|
||||
var config image.Config
|
||||
var err error
|
||||
var format string
|
||||
@@ -94,14 +99,14 @@ func getImageToken(imageUrl *dto.MessageImageUrl, model string, stream bool) (in
|
||||
if config.Width == 0 || config.Height == 0 {
|
||||
return 0, errors.New(fmt.Sprintf("fail to decode image config: %s", imageUrl.Url))
|
||||
}
|
||||
// TODO: 适配官方auto计费
|
||||
if config.Width < 512 && config.Height < 512 {
|
||||
if imageUrl.Detail == "auto" || imageUrl.Detail == "" {
|
||||
// 如果图片尺寸小于512,强制使用low
|
||||
imageUrl.Detail = "low"
|
||||
return 85, nil
|
||||
}
|
||||
}
|
||||
//// TODO: 适配官方auto计费
|
||||
//if config.Width < 512 && config.Height < 512 {
|
||||
// if imageUrl.Detail == "auto" || imageUrl.Detail == "" {
|
||||
// // 如果图片尺寸小于512,强制使用low
|
||||
// imageUrl.Detail = "low"
|
||||
// return 85, nil
|
||||
// }
|
||||
//}
|
||||
|
||||
shortSide := config.Width
|
||||
otherSide := config.Height
|
||||
@@ -127,11 +132,11 @@ func getImageToken(imageUrl *dto.MessageImageUrl, model string, stream bool) (in
|
||||
return tiles*170 + 85, nil
|
||||
}
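For scale, a worked instance of the tile formula returned above; the resizing rules live in the elided middle of getImageToken, so the tile count here is an assumption for a 1024x1024 image at detail "high":

	tiles := (1024 / 512) * (1024 / 512) // 4 tiles of 512px
	tokens := tiles*170 + 85             // 765 tokens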
|
||||
|
||||
func CountTokenChatRequest(request dto.GeneralOpenAIRequest, model string, checkSensitive bool) (int, error, bool) {
|
||||
func CountTokenChatRequest(request dto.GeneralOpenAIRequest, model string) (int, error) {
|
||||
tkm := 0
|
||||
msgTokens, err, b := CountTokenMessages(request.Messages, model, request.Stream, checkSensitive)
|
||||
msgTokens, err := CountTokenMessages(request.Messages, model, request.Stream)
|
||||
if err != nil {
|
||||
return 0, err, b
|
||||
return 0, err
|
||||
}
|
||||
tkm += msgTokens
|
||||
if request.Tools != nil {
|
||||
@@ -139,7 +144,7 @@ func CountTokenChatRequest(request dto.GeneralOpenAIRequest, model string, check
|
||||
var openaiTools []dto.OpenAITools
|
||||
err := json.Unmarshal(toolsData, &openaiTools)
|
||||
if err != nil {
|
||||
return 0, errors.New(fmt.Sprintf("count_tools_token_fail: %s", err.Error())), false
|
||||
return 0, errors.New(fmt.Sprintf("count_tools_token_fail: %s", err.Error()))
|
||||
}
|
||||
countStr := ""
|
||||
for _, tool := range openaiTools {
|
||||
@@ -151,18 +156,18 @@ func CountTokenChatRequest(request dto.GeneralOpenAIRequest, model string, check
|
||||
countStr += fmt.Sprintf("%v", tool.Function.Parameters)
|
||||
}
|
||||
}
|
||||
toolTokens, err, _ := CountTokenInput(countStr, model, false)
|
||||
toolTokens, err := CountTokenInput(countStr, model)
|
||||
if err != nil {
|
||||
return 0, err, false
|
||||
return 0, err
|
||||
}
|
||||
tkm += 8
|
||||
tkm += toolTokens
|
||||
}
|
||||
|
||||
return tkm, nil, false
|
||||
return tkm, nil
|
||||
}
|
||||
|
||||
func CountTokenMessages(messages []dto.Message, model string, stream bool, checkSensitive bool) (int, error, bool) {
|
||||
func CountTokenMessages(messages []dto.Message, model string, stream bool) (int, error) {
|
||||
//recover when panic
|
||||
tokenEncoder := getTokenEncoder(model)
|
||||
// Reference:
|
||||
@@ -186,13 +191,6 @@ func CountTokenMessages(messages []dto.Message, model string, stream bool, check
|
||||
if len(message.Content) > 0 {
|
||||
if message.IsStringContent() {
|
||||
stringContent := message.StringContent()
|
||||
if checkSensitive {
|
||||
contains, words := SensitiveWordContains(stringContent)
|
||||
if contains {
|
||||
err := fmt.Errorf("message contains sensitive words: [%s]", strings.Join(words, ", "))
|
||||
return 0, err, true
|
||||
}
|
||||
}
|
||||
tokenNum += getTokenNum(tokenEncoder, stringContent)
|
||||
if message.Name != nil {
|
||||
tokenNum += tokensPerName
|
||||
@@ -205,7 +203,7 @@ func CountTokenMessages(messages []dto.Message, model string, stream bool, check
|
||||
imageUrl := m.ImageUrl.(dto.MessageImageUrl)
|
||||
imageTokenNum, err := getImageToken(&imageUrl, model, stream)
|
||||
if err != nil {
|
||||
return 0, err, false
|
||||
return 0, err
|
||||
}
|
||||
tokenNum += imageTokenNum
|
||||
log.Printf("image token num: %d", imageTokenNum)
|
||||
@@ -217,33 +215,33 @@ func CountTokenMessages(messages []dto.Message, model string, stream bool, check
|
||||
}
|
||||
}
|
||||
tokenNum += 3 // Every reply is primed with <|start|>assistant<|message|>
|
||||
return tokenNum, nil, false
|
||||
return tokenNum, nil
|
||||
}
|
||||
|
||||
func CountTokenInput(input any, model string, check bool) (int, error, bool) {
|
||||
func CountTokenInput(input any, model string) (int, error) {
|
||||
switch v := input.(type) {
|
||||
case string:
|
||||
return CountTokenText(v, model, check)
|
||||
return CountTokenText(v, model)
|
||||
case []string:
|
||||
text := ""
|
||||
for _, s := range v {
|
||||
text += s
|
||||
}
|
||||
return CountTokenText(text, model, check)
|
||||
return CountTokenText(text, model)
|
||||
}
|
||||
return CountTokenInput(fmt.Sprintf("%v", input), model, check)
|
||||
return CountTokenInput(fmt.Sprintf("%v", input), model)
|
||||
}
|
||||
|
||||
func CountTokenStreamChoices(messages []dto.ChatCompletionsStreamResponseChoice, model string) int {
|
||||
tokens := 0
|
||||
for _, message := range messages {
|
||||
tkm, _, _ := CountTokenInput(message.Delta.GetContentString(), model, false)
|
||||
tkm, _ := CountTokenInput(message.Delta.GetContentString(), model)
|
||||
tokens += tkm
|
||||
if message.Delta.ToolCalls != nil {
|
||||
for _, tool := range message.Delta.ToolCalls {
|
||||
tkm, _, _ := CountTokenInput(tool.Function.Name, model, false)
|
||||
tkm, _ := CountTokenInput(tool.Function.Name, model)
|
||||
tokens += tkm
|
||||
tkm, _, _ = CountTokenInput(tool.Function.Arguments, model, false)
|
||||
tkm, _ = CountTokenInput(tool.Function.Arguments, model)
|
||||
tokens += tkm
|
||||
}
|
||||
}
|
||||
@@ -251,29 +249,17 @@ func CountTokenStreamChoices(messages []dto.ChatCompletionsStreamResponseChoice,
|
||||
return tokens
|
||||
}
|
||||
|
||||
func CountAudioToken(text string, model string, check bool) (int, error, bool) {
|
||||
func CountAudioToken(text string, model string) (int, error) {
|
||||
if strings.HasPrefix(model, "tts") {
|
||||
contains, words := SensitiveWordContains(text)
|
||||
if contains {
|
||||
return utf8.RuneCountInString(text), fmt.Errorf("input contains sensitive words: [%s]", strings.Join(words, ",")), true
|
||||
}
|
||||
return utf8.RuneCountInString(text), nil, false
|
||||
return utf8.RuneCountInString(text), nil
|
||||
} else {
|
||||
return CountTokenText(text, model, check)
|
||||
return CountTokenText(text, model)
|
||||
}
|
||||
}
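A short illustration of the simplified signature: TTS input is billed per character (rune), anything else falls back to the text tokenizer; the whisper-1 call is a hypothetical example of a non-tts audio model:

	n, _ := CountAudioToken("hello world", "tts-1")          // 11 runes
	n, _ = CountAudioToken("some transcribed text", "whisper-1") // counted via CountTokenText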
|
||||
|
||||
// CountTokenText 统计文本的token数量,仅当文本包含敏感词,返回错误,同时返回token数量
|
||||
func CountTokenText(text string, model string, check bool) (int, error, bool) {
|
||||
func CountTokenText(text string, model string) (int, error) {
|
||||
var err error
|
||||
var trigger bool
|
||||
if check {
|
||||
contains, words := SensitiveWordContains(text)
|
||||
if contains {
|
||||
err = fmt.Errorf("input contains sensitive words: [%s]", strings.Join(words, ","))
|
||||
trigger = true
|
||||
}
|
||||
}
|
||||
tokenEncoder := getTokenEncoder(model)
|
||||
return getTokenNum(tokenEncoder, text), err, trigger
|
||||
return getTokenNum(tokenEncoder, text), err
|
||||
}
|
||||
|
||||
@@ -19,7 +19,7 @@ import (
|
||||
func ResponseText2Usage(responseText string, modeName string, promptTokens int) (*dto.Usage, error) {
|
||||
usage := &dto.Usage{}
|
||||
usage.PromptTokens = promptTokens
|
||||
ctkm, err, _ := CountTokenText(responseText, modeName, false)
|
||||
ctkm, err := CountTokenText(responseText, modeName)
|
||||
usage.CompletionTokens = ctkm
|
||||
usage.TotalTokens = usage.PromptTokens + usage.CompletionTokens
|
||||
return usage, err
|
||||
|
||||
web/pnpm-lock.yaml (generated, new file, +3801): file diff suppressed because it is too large
web/public/ratio.png (new binary file, 140 KiB): binary file not shown
@@ -24,6 +24,7 @@ import Chat from './pages/Chat';
|
||||
import { Layout } from '@douyinfe/semi-ui';
|
||||
import Midjourney from './pages/Midjourney';
|
||||
import Pricing from './pages/Pricing/index.js';
|
||||
import Task from './pages/Task/index.js';
|
||||
// import Detail from './pages/Detail';
|
||||
|
||||
const Home = lazy(() => import('./pages/Home'));
|
||||
@@ -229,6 +230,16 @@ function App() {
|
||||
</PrivateRoute>
|
||||
}
|
||||
/>
|
||||
<Route
|
||||
path='/task'
|
||||
element={
|
||||
<PrivateRoute>
|
||||
<Suspense fallback={<Loading></Loading>}>
|
||||
<Task />
|
||||
</Suspense>
|
||||
</PrivateRoute>
|
||||
}
|
||||
/>
|
||||
<Route
|
||||
path='/pricing'
|
||||
element={
|
||||
|
||||
@@ -96,7 +96,27 @@ const ChannelsTable = () => {
|
||||
title: '状态',
|
||||
dataIndex: 'status',
|
||||
render: (text, record, index) => {
|
||||
return <div>{renderStatus(text)}</div>;
|
||||
if (text === 3) {
|
||||
if (record.other_info === '') {
|
||||
record.other_info = '{}';
|
||||
}
|
||||
let otherInfo = JSON.parse(record.other_info);
|
||||
let reason = otherInfo['status_reason'];
|
||||
let time = otherInfo['status_time'];
|
||||
return (
|
||||
<div>
|
||||
<Tooltip
|
||||
content={
|
||||
'原因:' + reason + ',时间:' + timestamp2string(time)
|
||||
}
|
||||
>
|
||||
{renderStatus(text)}
|
||||
</Tooltip>
|
||||
</div>
|
||||
);
|
||||
} else {
|
||||
return renderStatus(text);
|
||||
}
|
||||
},
|
||||
},
|
||||
{
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
import React, { useEffect, useState } from 'react';
|
||||
|
||||
import { getFooterHTML, getSystemName } from '../helpers';
|
||||
import { Layout } from '@douyinfe/semi-ui';
|
||||
import { Layout, Tooltip } from '@douyinfe/semi-ui';
|
||||
|
||||
const Footer = () => {
|
||||
const systemName = getSystemName();
|
||||
@@ -15,6 +15,30 @@ const Footer = () => {
|
||||
}
|
||||
};
|
||||
|
||||
const defaultFooter = (
|
||||
<div className='custom-footer'>
|
||||
<a
|
||||
href='https://github.com/Calcium-Ion/new-api'
|
||||
target='_blank'
|
||||
rel='noreferrer'
|
||||
>
|
||||
New API {import.meta.env.VITE_REACT_APP_VERSION}{' '}
|
||||
</a>
|
||||
由{' '}
|
||||
<a href='https://github.com/Calcium-Ion' target='_blank' rel='noreferrer'>
|
||||
Calcium-Ion
|
||||
</a>{' '}
|
||||
开发,基于{' '}
|
||||
<a
|
||||
href='https://github.com/songquanpeng/one-api'
|
||||
target='_blank'
|
||||
rel='noreferrer'
|
||||
>
|
||||
One API
|
||||
</a>
|
||||
</div>
|
||||
);
|
||||
|
||||
useEffect(() => {
|
||||
const timer = setInterval(() => {
|
||||
if (remainCheckTimes <= 0) {
|
||||
@@ -31,41 +55,14 @@ const Footer = () => {
|
||||
<Layout>
|
||||
<Layout.Content style={{ textAlign: 'center' }}>
|
||||
{footer ? (
|
||||
<div
|
||||
className='custom-footer'
|
||||
dangerouslySetInnerHTML={{ __html: footer }}
|
||||
></div>
|
||||
<Tooltip content={defaultFooter}>
|
||||
<div
|
||||
className='custom-footer'
|
||||
dangerouslySetInnerHTML={{ __html: footer }}
|
||||
></div>
|
||||
</Tooltip>
|
||||
) : (
|
||||
<div className='custom-footer'>
|
||||
<a
|
||||
href='https://github.com/Calcium-Ion/new-api'
|
||||
target='_blank'
|
||||
rel='noreferrer'
|
||||
>
|
||||
New API {import.meta.env.VITE_REACT_APP_VERSION}{' '}
|
||||
</a>
|
||||
由{' '}
|
||||
<a
|
||||
href='https://github.com/Calcium-Ion'
|
||||
target='_blank'
|
||||
rel='noreferrer'
|
||||
>
|
||||
Calcium-Ion
|
||||
</a>{' '}
|
||||
开发,基于{' '}
|
||||
<a
|
||||
href='https://github.com/songquanpeng/one-api'
|
||||
target='_blank'
|
||||
rel='noreferrer'
|
||||
>
|
||||
One API v0.5.4
|
||||
</a>{' '}
|
||||
,本项目根据{' '}
|
||||
<a href='https://opensource.org/licenses/mit-license.php'>
|
||||
MIT 许可证
|
||||
</a>{' '}
|
||||
授权
|
||||
</div>
|
||||
defaultFooter
|
||||
)}
|
||||
</Layout.Content>
|
||||
</Layout>
|
||||
|
||||
@@ -209,6 +209,7 @@ const LoginForm = () => {
|
||||
</Text>
|
||||
</div>
|
||||
{status.github_oauth ||
|
||||
status.linuxdo_oauth ||
|
||||
status.wechat_login ||
|
||||
status.telegram_oauth ? (
|
||||
<>
|
||||
|
||||
@@ -29,6 +29,7 @@ import {
|
||||
stringToColor,
|
||||
} from '../helpers/render';
|
||||
import Paragraph from '@douyinfe/semi-ui/lib/es/typography/paragraph';
|
||||
import { getLogOther } from '../helpers/other.js';
|
||||
|
||||
const { Header } = Layout;
|
||||
|
||||
@@ -141,6 +142,33 @@ function renderUseTime(type) {
|
||||
}
|
||||
}
|
||||
|
||||
function renderFirstUseTime(type) {
|
||||
let time = parseFloat(type) / 1000.0;
|
||||
time = time.toFixed(1);
|
||||
if (time < 3) {
|
||||
return (
|
||||
<Tag color='green' size='large'>
|
||||
{' '}
|
||||
{time} s{' '}
|
||||
</Tag>
|
||||
);
|
||||
} else if (time < 10) {
|
||||
return (
|
||||
<Tag color='orange' size='large'>
|
||||
{' '}
|
||||
{time} s{' '}
|
||||
</Tag>
|
||||
);
|
||||
} else {
|
||||
return (
|
||||
<Tag color='red' size='large'>
|
||||
{' '}
|
||||
{time} s{' '}
|
||||
</Tag>
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
const LogsTable = () => {
|
||||
const columns = [
|
||||
{
|
||||
@@ -247,17 +275,30 @@ const LogsTable = () => {
|
||||
},
|
||||
},
|
||||
{
|
||||
title: '用时',
|
||||
title: '用时/首字',
|
||||
dataIndex: 'use_time',
|
||||
render: (text, record, index) => {
|
||||
return (
|
||||
<div>
|
||||
<Space>
|
||||
{renderUseTime(text)}
|
||||
{renderIsStream(record.is_stream)}
|
||||
</Space>
|
||||
</div>
|
||||
);
|
||||
if (record.is_stream) {
|
||||
let other = getLogOther(record.other);
|
||||
return (
|
||||
<div>
|
||||
<Space>
|
||||
{renderUseTime(text)}
|
||||
{renderFirstUseTime(other.frt)}
|
||||
{renderIsStream(record.is_stream)}
|
||||
</Space>
|
||||
</div>
|
||||
);
|
||||
} else {
|
||||
return (
|
||||
<div>
|
||||
<Space>
|
||||
{renderUseTime(text)}
|
||||
{renderIsStream(record.is_stream)}
|
||||
</Space>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
},
|
||||
},
|
||||
{
|
||||
@@ -325,10 +366,7 @@ const LogsTable = () => {
|
||||
title: '详情',
|
||||
dataIndex: 'content',
|
||||
render: (text, record, index) => {
|
||||
if (record.other === '') {
|
||||
record.other = '{}';
|
||||
}
|
||||
let other = JSON.parse(record.other);
|
||||
let other = getLogOther(record.other);
|
||||
if (other == null) {
|
||||
return (
|
||||
<Paragraph
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
import React, { useContext, useEffect, useRef, useState } from 'react';
|
||||
import React, { useContext, useEffect, useRef, useMemo, useState } from 'react';
|
||||
import { API, copy, showError, showSuccess } from '../helpers';
|
||||
|
||||
import {
|
||||
@@ -10,8 +10,16 @@ import {
|
||||
Table,
|
||||
Tag,
|
||||
Tooltip,
|
||||
Popover,
|
||||
ImagePreview,
|
||||
Button,
|
||||
} from '@douyinfe/semi-ui';
|
||||
import { stringToColor } from '../helpers/render.js';
|
||||
import {
|
||||
IconMore,
|
||||
IconVerify,
|
||||
IconUploadError,
|
||||
IconHelpCircle,
|
||||
} from '@douyinfe/semi-icons';
|
||||
import { UserContext } from '../context/User/index.js';
|
||||
import Text from '@douyinfe/semi-ui/lib/es/typography/text';
|
||||
|
||||
@@ -20,42 +28,70 @@ function renderQuotaType(type) {
|
||||
switch (type) {
|
||||
case 1:
|
||||
return (
|
||||
<Tag color='green' size='large'>
|
||||
<Tag color='teal' size='large'>
|
||||
按次计费
|
||||
</Tag>
|
||||
);
|
||||
case 0:
|
||||
return (
|
||||
<Tag color='blue' size='large'>
|
||||
<Tag color='violet' size='large'>
|
||||
按量计费
|
||||
</Tag>
|
||||
);
|
||||
default:
|
||||
return (
|
||||
<Tag color='white' size='large'>
|
||||
未知
|
||||
</Tag>
|
||||
);
|
||||
return '未知';
|
||||
}
|
||||
}
|
||||
|
||||
function renderAvailable(available) {
|
||||
return available ? (
|
||||
<Tag color='green' size='large'>
|
||||
可用
|
||||
</Tag>
|
||||
<Popover
|
||||
content={<div style={{ padding: 8 }}>您的分组可以使用该模型</div>}
|
||||
position='top'
|
||||
key={available}
|
||||
style={{
|
||||
backgroundColor: 'rgba(var(--semi-blue-4),1)',
|
||||
borderColor: 'rgba(var(--semi-blue-4),1)',
|
||||
color: 'var(--semi-color-white)',
|
||||
borderWidth: 1,
|
||||
borderStyle: 'solid',
|
||||
}}
|
||||
>
|
||||
<IconVerify style={{ color: 'green' }} size='large' />
|
||||
</Popover>
|
||||
) : (
|
||||
<Tooltip content='您所在的分组不可用'>
|
||||
<Tag color='red' size='large'>
|
||||
不可用
|
||||
</Tag>
|
||||
</Tooltip>
|
||||
<Popover
|
||||
content={<div style={{ padding: 8 }}>您的分组无权使用该模型</div>}
|
||||
position='top'
|
||||
key={available}
|
||||
style={{
|
||||
backgroundColor: 'rgba(var(--semi-blue-4),1)',
|
||||
borderColor: 'rgba(var(--semi-blue-4),1)',
|
||||
color: 'var(--semi-color-white)',
|
||||
borderWidth: 1,
|
||||
borderStyle: 'solid',
|
||||
}}
|
||||
>
|
||||
<IconUploadError style={{ color: '#FFA54F' }} size='large' />
|
||||
</Popover>
|
||||
);
|
||||
}
const ModelPricing = () => {
const [filteredValue, setFilteredValue] = useState([]);
const compositionRef = useRef({ isComposition: false });
const [selectedRowKeys, setSelectedRowKeys] = useState([]);
const [modalImageUrl, setModalImageUrl] = useState('');
const [isModalOpenurl, setIsModalOpenurl] = useState(false);

const rowSelection = useMemo(
() => ({
onChange: (selectedRowKeys, selectedRows) => {
setSelectedRowKeys(selectedRowKeys);
},
}),
[],
);

const handleChange = (value) => {
if (compositionRef.current.isComposition) {
@@ -103,7 +139,7 @@ const ModelPricing = () => {
return (
<>
<Tag
color={stringToColor(text)}
color='green'
size='large'
onClick={() => {
copyText(text);
@@ -114,7 +150,8 @@ const ModelPricing = () => {
</>
);
},
onFilter: (value, record) => record.model_name.includes(value),
onFilter: (value, record) =>
record.model_name.toLowerCase().includes(value.toLowerCase()),
filteredValue,
},
{
@@ -126,18 +163,49 @@ const ModelPricing = () => {
sorter: (a, b) => a.quota_type - b.quota_type,
},
{
title: '模型倍率',
title: () => (
<span style={{ display: 'flex', alignItems: 'center' }}>
倍率
<Popover
content={
<div style={{ padding: 8 }}>
倍率是为了方便换算不同价格的模型
<br />
点击查看倍率说明
</div>
}
position='top'
style={{
backgroundColor: 'rgba(var(--semi-blue-4),1)',
borderColor: 'rgba(var(--semi-blue-4),1)',
color: 'var(--semi-color-white)',
borderWidth: 1,
borderStyle: 'solid',
}}
>
<IconHelpCircle
onClick={() => {
setModalImageUrl('/ratio.png');
setIsModalOpenurl(true);
}}
/>
</Popover>
</span>
),
dataIndex: 'model_ratio',
render: (text, record, index) => {
return <div>{record.quota_type === 0 ? text : 'N/A'}</div>;
},
},
{
title: '补全倍率',
dataIndex: 'completion_ratio',
render: (text, record, index) => {
let ratio = parseFloat(text.toFixed(3));
return <div>{record.quota_type === 0 ? ratio : 'N/A'}</div>;
let content = text;
let completionRatio = parseFloat(record.completion_ratio.toFixed(3));
content = (
<>
<Text>模型:{record.quota_type === 0 ? text : '无'}</Text>
<br />
<Text>
补全:{record.quota_type === 0 ? completionRatio : '无'}
</Text>
</>
);
return <div>{content}</div>;
},
},
{
@@ -176,7 +244,7 @@ const ModelPricing = () => {

const setModelsFormat = (models, groupRatio) => {
for (let i = 0; i < models.length; i++) {
models[i].key = i;
models[i].key = models[i].model_name;
models[i].group_ratio = groupRatio;
}
// sort by quota_type
@@ -239,15 +307,43 @@ const ModelPricing = () => {
<Layout>
{userState.user ? (
<Banner
type='info'
type='success'
fullMode={false}
closeIcon='null'
description={`您的分组为:${userState.user.group},分组倍率为:${groupRatio}`}
/>
) : (
<Banner
type='warning'
fullMode={false}
closeIcon='null'
description={`您还未登陆,显示的价格为默认分组倍率: ${groupRatio}`}
/>
)}
<br />
<Banner
type='info'
fullMode={false}
description={
<div>
按量计费费用 = 分组倍率 × 模型倍率 × (提示token数 + 补全token数 ×
补全倍率)/ 500000 (单位:美元)
</div>
}
closeIcon='null'
/>
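The banner above spells out the pay-as-you-go formula in prose. As a quick sanity check, a minimal sketch of the same arithmetic (the ratios and token counts below are made-up illustration values, not values taken from this diff):

// cost (USD) = groupRatio × modelRatio × (promptTokens + completionTokens × completionRatio) / 500000
const quotaCost = (groupRatio, modelRatio, completionRatio, promptTokens, completionTokens) =>
(groupRatio * modelRatio * (promptTokens + completionTokens * completionRatio)) / 500000;

// e.g. groupRatio 1, modelRatio 15, completionRatio 3, 1000 prompt + 500 completion tokens:
// 1 * 15 * (1000 + 500 * 3) / 500000 = 0.075 USD
quotaCost(1, 15, 3, 1000, 500); // 0.075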
<br />
<Button
theme='light'
type='tertiary'
style={{ width: 150 }}
onClick={() => {
copyText(selectedRowKeys);
}}
disabled={selectedRowKeys == ''}
>
复制选中模型
</Button>
<Table
style={{ marginTop: 5 }}
columns={columns}
@@ -257,6 +353,12 @@ const ModelPricing = () => {
pageSize: models.length,
showSizeChanger: false,
}}
rowSelection={rowSelection}
/>
<ImagePreview
src={modalImageUrl}
visible={isModalOpenurl}
onVisibleChange={(visible) => setIsModalOpenurl(visible)}
/>
</Layout>
</>
@@ -15,6 +15,7 @@ import '../index.css';

import {
IconCalendarClock,
IconChecklistStroked,
IconComment,
IconCreditCard,
IconGift,
@@ -58,6 +59,7 @@ const SiderBar = () => {
chat: '/chat',
detail: '/detail',
pricing: '/pricing',
task: '/task',
};

const headerButtons = useMemo(
@@ -142,6 +144,16 @@ const SiderBar = () => {
? 'semi-navigation-item-normal'
: 'tableHiddle',
},
{
text: '异步任务',
itemKey: 'task',
to: '/task',
icon: <IconChecklistStroked />,
className:
localStorage.getItem('enable_task') === 'true'
? 'semi-navigation-item-normal'
: 'tableHiddle',
},
{
text: '设置',
itemKey: 'setting',
@@ -158,6 +170,7 @@ const SiderBar = () => {
[
localStorage.getItem('enable_data_export'),
localStorage.getItem('enable_drawing'),
localStorage.getItem('enable_task'),
localStorage.getItem('chat_link'),
isAdmin(),
],
@@ -31,6 +31,7 @@ const SystemSetting = () => {
SMTPFrom: '',
SMTPToken: '',
ServerAddress: '',
OutProxyUrl: '',
StripeApiSecret: '',
StripeWebhookSecret: '',
StripePriceId: '',
@@ -150,6 +151,7 @@ const SystemSetting = () => {
name === 'Notice' ||
(name.startsWith('SMTP') && name !== 'SMTPSSLEnabled') ||
name === 'ServerAddress' ||
name === 'OutProxyUrl' ||
name === 'StripeApiSecret' ||
name === 'StripeWebhookSecret' ||
name === 'StripePriceId' ||
@@ -181,6 +183,11 @@ const SystemSetting = () => {
await updateOption('ServerAddress', ServerAddress);
};

const submitOutProxyUrl = async () => {
let OutProxyUrl = removeTrailingSlash(inputs.OutProxyUrl);
await updateOption('OutProxyUrl', OutProxyUrl);
};

const submitPaymentConfig = async () => {
if (inputs.ServerAddress === '') {
showError('请先填写服务器地址');
@@ -368,6 +375,20 @@ const SystemSetting = () => {
更新服务器地址
</Form.Button>
<Divider />
<Header as='h3' inverted={isDark}>
代理设置
</Header>
<Form.Group widths='equal'>
<Form.Input
label='出口代理地址'
placeholder='例如:http://1.2.3.4:8888'
value={inputs.OutProxyUrl}
name='OutProxyUrl'
onChange={handleInputChange}
/>
</Form.Group>
<Form.Button onClick={submitOutProxyUrl}>更新代理设置</Form.Button>
<Divider />
<Header as='h3' inverted={isDark}>
支付设置(当前仅支持Stripe Checkout)
<Header.Subheader>
web/src/components/TaskLogsTable.js (new file, 512 lines)
@@ -0,0 +1,512 @@
import React, { useEffect, useState } from 'react';
import { Label } from 'semantic-ui-react';
import {
API,
copy,
isAdmin,
showError,
showSuccess,
timestamp2string,
} from '../helpers';

import {
Table,
Tag,
Form,
Button,
Layout,
Modal,
Typography,
Progress,
Card,
} from '@douyinfe/semi-ui';
import { ITEMS_PER_PAGE } from '../constants';

const colors = [
'amber',
'blue',
'cyan',
'green',
'grey',
'indigo',
'light-blue',
'lime',
'orange',
'pink',
'purple',
'red',
'teal',
'violet',
'yellow',
];
const renderTimestamp = (timestampInSeconds) => {
const date = new Date(timestampInSeconds * 1000); // convert seconds to milliseconds

const year = date.getFullYear(); // year
const month = ('0' + (date.getMonth() + 1)).slice(-2); // month (0-based, so +1), zero-padded to two digits
const day = ('0' + date.getDate()).slice(-2); // day of month, zero-padded
const hours = ('0' + date.getHours()).slice(-2); // hours, zero-padded
const minutes = ('0' + date.getMinutes()).slice(-2); // minutes, zero-padded
const seconds = ('0' + date.getSeconds()).slice(-2); // seconds, zero-padded

return `${year}-${month}-${day} ${hours}:${minutes}:${seconds}`; // formatted output
};

function renderDuration(submit_time, finishTime) {
// make sure submit_time and finishTime are both valid timestamps
if (!submit_time || !finishTime) return 'N/A';

// convert the timestamps to Date objects
const start = new Date(submit_time);
const finish = new Date(finishTime);

// difference in milliseconds
const durationMs = finish - start;

// convert to seconds, keeping one decimal place
const durationSec = (durationMs / 1000).toFixed(1);

// color: red if over 60 seconds, green otherwise
const color = durationSec > 60 ? 'red' : 'green';

// return a styled color tag
return (
<Tag color={color} size='large'>
{durationSec} 秒
</Tag>
);
}
const LogsTable = () => {
const [isModalOpen, setIsModalOpen] = useState(false);
const [modalContent, setModalContent] = useState('');
const isAdminUser = isAdmin();
const columns = [
{
title: '提交时间',
dataIndex: 'submit_time',
render: (text, record, index) => {
return <div>{text ? renderTimestamp(text) : '-'}</div>;
},
},
{
title: '结束时间',
dataIndex: 'finish_time',
render: (text, record, index) => {
return <div>{text ? renderTimestamp(text) : '-'}</div>;
},
},
{
title: '进度',
dataIndex: 'progress',
width: 50,
render: (text, record, index) => {
return (
<div>
{
// convert e.g. '100%' to the number 100; fall back to 0 if text is undefined
isNaN(text.replace('%', '')) ? (
text
) : (
<Progress
width={42}
type='circle'
showInfo={true}
percent={Number(text.replace('%', '') || 0)}
aria-label='drawing progress'
/>
)
}
</div>
);
},
},
{
title: '花费时间',
dataIndex: 'finish_time', // use finish_time as the dataIndex
key: 'finish_time',
render: (finish, record) => {
// assumes record.start_time exists and that finish is the completion timestamp
return <>{finish ? renderDuration(record.submit_time, finish) : '-'}</>;
},
},
{
title: '渠道',
dataIndex: 'channel_id',
className: isAdminUser ? 'tableShow' : 'tableHiddle',
render: (text, record, index) => {
return (
<div>
<Tag
color={colors[parseInt(text) % colors.length]}
size='large'
onClick={() => {
copyText(text); // copyText is assumed to be the text-copy helper
}}
>
{' '}
{text}{' '}
</Tag>
</div>
);
},
},
{
title: '平台',
dataIndex: 'platform',
render: (text, record, index) => {
return <div>{renderPlatform(text)}</div>;
},
},
{
title: '类型',
dataIndex: 'action',
render: (text, record, index) => {
return <div>{renderType(text)}</div>;
},
},
{
title: '任务ID(点击查看详情)',
dataIndex: 'task_id',
render: (text, record, index) => {
return (
<Typography.Text
ellipsis={{ showTooltip: true }}
//style={{width: 100}}
onClick={() => {
setModalContent(JSON.stringify(record, null, 2));
setIsModalOpen(true);
}}
>
<div>{text}</div>
</Typography.Text>
);
},
},
{
title: '任务状态',
dataIndex: 'status',
render: (text, record, index) => {
return <div>{renderStatus(text)}</div>;
},
},

{
title: '失败原因',
dataIndex: 'fail_reason',
render: (text, record, index) => {
// if text is undefined, return fallback text such as an empty string
if (!text) {
return '无';
}

return (
<Typography.Text
ellipsis={{ showTooltip: true }}
style={{ width: 100 }}
onClick={() => {
setModalContent(text);
setIsModalOpen(true);
}}
>
{text}
</Typography.Text>
);
},
},
];
const [logs, setLogs] = useState([]);
const [loading, setLoading] = useState(true);
const [activePage, setActivePage] = useState(1);
const [logCount, setLogCount] = useState(ITEMS_PER_PAGE);
const [logType] = useState(0);

let now = new Date();
// initialize start_timestamp to the previous day
let zeroNow = new Date(now.getFullYear(), now.getMonth(), now.getDate());
const [inputs, setInputs] = useState({
channel_id: '',
task_id: '',
start_timestamp: timestamp2string(zeroNow.getTime() / 1000),
end_timestamp: '',
});
const { channel_id, task_id, start_timestamp, end_timestamp } = inputs;

const handleInputChange = (value, name) => {
setInputs((inputs) => ({ ...inputs, [name]: value }));
};

const setLogsFormat = (logs) => {
for (let i = 0; i < logs.length; i++) {
logs[i].timestamp2string = timestamp2string(logs[i].created_at);
logs[i].key = '' + logs[i].id;
}
// data.key = '' + data.id
setLogs(logs);
setLogCount(logs.length + ITEMS_PER_PAGE);
// console.log(logCount);
};

const loadLogs = async (startIdx) => {
setLoading(true);

let url = '';
let localStartTimestamp = parseInt(Date.parse(start_timestamp) / 1000);
let localEndTimestamp = parseInt(Date.parse(end_timestamp) / 1000);
if (isAdminUser) {
url = `/api/task/?p=${startIdx}&channel_id=${channel_id}&task_id=${task_id}&start_timestamp=${localStartTimestamp}&end_timestamp=${localEndTimestamp}`;
} else {
url = `/api/task/self?p=${startIdx}&task_id=${task_id}&start_timestamp=${localStartTimestamp}&end_timestamp=${localEndTimestamp}`;
}
const res = await API.get(url);
let { success, message, data } = res.data;
if (success) {
if (startIdx === 0) {
setLogsFormat(data);
} else {
let newLogs = [...logs];
newLogs.splice(startIdx * ITEMS_PER_PAGE, data.length, ...data);
setLogsFormat(newLogs);
}
} else {
showError(message);
}
setLoading(false);
};

const pageData = logs.slice(
(activePage - 1) * ITEMS_PER_PAGE,
activePage * ITEMS_PER_PAGE,
);

const handlePageChange = (page) => {
setActivePage(page);
if (page === Math.ceil(logs.length / ITEMS_PER_PAGE) + 1) {
// In this case we have to load more data and then append them.
loadLogs(page - 1).then((r) => {});
}
};

const refresh = async () => {
// setLoading(true);
setActivePage(1);
await loadLogs(0);
};

const copyText = async (text) => {
if (await copy(text)) {
showSuccess('已复制:' + text);
} else {
// setSearchKeyword(text);
Modal.error({ title: '无法复制到剪贴板,请手动复制', content: text });
}
};
useEffect(() => {
refresh().then();
}, [logType]);

const renderType = (type) => {
switch (type) {
case 'MUSIC':
return (
<Label basic color='grey'>
{' '}
生成音乐{' '}
</Label>
);
case 'LYRICS':
return (
<Label basic color='pink'>
{' '}
生成歌词{' '}
</Label>
);

default:
return (
<Label basic color='black'>
{' '}
未知{' '}
</Label>
);
}
};

const renderPlatform = (type) => {
switch (type) {
case 'suno':
return (
<Label basic color='green'>
{' '}
Suno{' '}
</Label>
);
default:
return (
<Label basic color='black'>
{' '}
未知{' '}
</Label>
);
}
};

const renderStatus = (type) => {
switch (type) {
case 'SUCCESS':
return (
<Label basic color='green'>
{' '}
成功{' '}
</Label>
);
case 'NOT_START':
return (
<Label basic color='black'>
{' '}
未启动{' '}
</Label>
);
case 'SUBMITTED':
return (
<Label basic color='yellow'>
{' '}
队列中{' '}
</Label>
);
case 'IN_PROGRESS':
return (
<Label basic color='blue'>
{' '}
执行中{' '}
</Label>
);
case 'FAILURE':
return (
<Label basic color='red'>
{' '}
失败{' '}
</Label>
);
case 'QUEUED':
return (
<Label basic color='red'>
{' '}
排队中{' '}
</Label>
);
case 'UNKNOWN':
return (
<Label basic color='red'>
{' '}
未知{' '}
</Label>
);
case '':
return (
<Label basic color='black'>
{' '}
正在提交{' '}
</Label>
);
default:
return (
<Label basic color='black'>
{' '}
未知{' '}
</Label>
);
}
};
return (
<>
<Layout>
<Form layout='horizontal' labelPosition='inset'>
<>
{isAdminUser && (
<Form.Input
field='channel_id'
label='渠道 ID'
style={{ width: '236px', marginBottom: '10px' }}
value={channel_id}
placeholder={'可选值'}
name='channel_id'
onChange={(value) => handleInputChange(value, 'channel_id')}
/>
)}
<Form.Input
field='task_id'
label={'任务 ID'}
style={{ width: '236px', marginBottom: '10px' }}
value={task_id}
placeholder={'可选值'}
name='task_id'
onChange={(value) => handleInputChange(value, 'task_id')}
/>

<Form.DatePicker
field='start_timestamp'
label={'起始时间'}
style={{ width: '236px', marginBottom: '10px' }}
initValue={start_timestamp}
value={start_timestamp}
type='dateTime'
name='start_timestamp'
onChange={(value) => handleInputChange(value, 'start_timestamp')}
/>
<Form.DatePicker
field='end_timestamp'
fluid
label={'结束时间'}
style={{ width: '236px', marginBottom: '10px' }}
initValue={end_timestamp}
value={end_timestamp}
type='dateTime'
name='end_timestamp'
onChange={(value) => handleInputChange(value, 'end_timestamp')}
/>
<Button
label={'查询'}
type='primary'
htmlType='submit'
className='btn-margin-right'
onClick={refresh}
>
查询
</Button>
</>
</Form>
<Card>
<Table
columns={columns}
dataSource={pageData}
pagination={{
currentPage: activePage,
pageSize: ITEMS_PER_PAGE,
total: logCount,
pageSizeOpts: [10, 20, 50, 100],
onPageChange: handlePageChange,
}}
loading={loading}
/>
</Card>
<Modal
visible={isModalOpen}
onOk={() => setIsModalOpen(false)}
onCancel={() => setIsModalOpen(false)}
closable={null}
bodyStyle={{ height: '400px', overflow: 'auto' }} // style for the modal body area
width={800} // modal width
>
<p style={{ whiteSpace: 'pre-line' }}>{modalContent}</p>
</Modal>
</Layout>
</>
);
};

export default LogsTable;
@@ -225,6 +225,14 @@ const TokensTable = () => {
onOpenLink('next-mj', record.key);
},
},
{
node: 'item',
key: 'lobe',
name: 'Lobe Chat',
onClick: () => {
onOpenLink('lobe', record.key);
},
},
{
node: 'item',
key: 'ama',
@@ -377,51 +385,6 @@ const TokensTable = () => {
await loadTokens(activePage - 1);
};

const onCopy = async (type, key) => {
let status = localStorage.getItem('status');
let serverAddress = '';
if (status) {
status = JSON.parse(status);
serverAddress = status.server_address;
}
if (serverAddress === '') {
serverAddress = window.location.origin;
}
let encodedServerAddress = encodeURIComponent(serverAddress);
const nextLink = localStorage.getItem('chat_link');
const mjLink = localStorage.getItem('chat_link2');
let nextUrl;

if (nextLink) {
nextUrl =
nextLink + `/#/?settings={"key":"sk-${key}","url":"${serverAddress}"}`;
} else {
nextUrl = `https://chat.oneapi.pro/#/?settings={"key":"sk-${key}","url":"${serverAddress}"}`;
}

let url;
switch (type) {
case 'ama':
url =
mjLink + `/#/?settings={"key":"sk-${key}","url":"${serverAddress}"}`;
break;
case 'opencat':
url = `opencat://team/join?domain=${encodedServerAddress}&token=sk-${key}`;
break;
case 'next':
url = nextUrl;
break;
default:
url = `sk-${key}`;
}
// if (await copy(url)) {
// showSuccess('已复制到剪贴板!');
// } else {
// showWarning('无法复制到剪贴板,请手动复制,已将令牌填入搜索框。');
// setSearchKeyword(url);
// }
};

const copyText = async (text) => {
if (await copy(text)) {
showSuccess('已复制到剪贴板!');
@@ -461,6 +424,9 @@ const TokensTable = () => {
case 'opencat':
url = `opencat://team/join?domain=${encodedServerAddress}&token=sk-${key}`;
break;
case 'lobe':
url = `https://chat-preview.lobehub.com/?settings={"keyVaults":{"openai":{"apiKey":"sk-${key}","baseURL":"${encodedServerAddress}"}}}`;
break;
case 'next-mj':
url =
mjLink + `/#/?settings={"key":"sk-${key}","url":"${serverAddress}"}`;
@@ -208,41 +208,23 @@ const UsersTable = () => {
>
编辑
</Button>
<Popconfirm
title='确定是否要注销此用户?'
content='相当于删除用户,此修改将不可逆'
okType={'danger'}
position={'left'}
onConfirm={() => {
manageUser(record.username, 'delete', record).then(() => {
removeRecord(record.id);
});
}}
>
<Button theme='light' type='danger' style={{ marginRight: 1 }}>
注销
</Button>
</Popconfirm>
</>
)}
{record.DeletedAt !== null ? (
<Popconfirm
title='确定是否要删除此用户?'
content='硬删除,此修改将不可逆'
okType={'danger'}
position={'left'}
onConfirm={() => {
hardDeleteUser(record.id).then(() => {
removeRecord(record.id);
});
}}
>
<Button theme='light' type='danger' style={{ marginRight: 1 }}>
永久删除
</Button>
</Popconfirm>
) : (
<Popconfirm
title='确定是否要删除此用户?'
content='软删除,数据依然留底'
okType={'danger'}
position={'left'}
onConfirm={() => {
manageUser(record.username, 'delete', record).then(() => {
record.DeletedAt = new Date();
});
}}
>
<Button theme='light' type='danger' style={{ marginRight: 1 }}>
删除
</Button>
</Popconfirm>
)}
</div>
),
},
@@ -271,13 +253,13 @@ const UsersTable = () => {
};

const removeRecord = (key) => {
console.log(key);
let newDataSource = [...users];
if (key != null) {
let idx = newDataSource.findIndex((data) => data.id === key);

if (idx > -1) {
newDataSource.splice(idx, 1);
// update deletedAt
newDataSource[idx].DeletedAt = new Date();
setUsers(newDataSource);
}
}
@@ -339,18 +321,6 @@ const UsersTable = () => {
setUsers(newUsers);
} else {
showError(message);
throw new Error(message);
}
};

const hardDeleteUser = async (userId) => {
const res = await API.delete('/api/user/' + userId);
const { success, message } = res.data;
if (success) {
showSuccess('操作成功完成!');
} else {
showError(message);
throw new Error(message);
}
};

@@ -14,6 +14,13 @@ export const CHANNEL_OPTIONS = [
color: 'blue',
label: 'Midjourney Proxy Plus',
},
{
key: 36,
text: 'Suno API',
value: 36,
color: 'purple',
label: 'Suno API',
},
{ key: 4, text: 'Ollama', value: 4, color: 'grey', label: 'Ollama' },
{
key: 14,

@@ -6,6 +6,7 @@ export function setStatusData(data) {
localStorage.setItem('quota_per_unit', data.quota_per_unit);
localStorage.setItem('display_in_currency', data.display_in_currency);
localStorage.setItem('enable_drawing', data.enable_drawing);
localStorage.setItem('enable_task', data.enable_task);
localStorage.setItem('enable_data_export', data.enable_data_export);
localStorage.setItem(
'data_export_default_time',
web/src/helpers/other.js (new file, 7 lines)
@@ -0,0 +1,7 @@
export function getLogOther(otherStr) {
if (otherStr === undefined || otherStr === '') {
otherStr = '{}';
}
let other = JSON.parse(otherStr);
return other;
}
@@ -144,28 +144,42 @@ export function renderModelPrice(
) {
// 1 ratio = $0.002 / 1K tokens
if (modelPrice !== -1) {
return '模型价格:$' + modelPrice * groupRatio;
return (
'模型价格:$' +
modelPrice +
' * 分组倍率:' +
groupRatio +
' = $' +
modelPrice * groupRatio
);
} else {
if (completionRatio === undefined) {
completionRatio = 0;
}
// the *2 here is because 1 ratio = $0.002; do not remove
let inputRatioPrice = modelRatio * 2.0 * groupRatio;
let completionRatioPrice = modelRatio * 2.0 * completionRatio * groupRatio;
let inputRatioPrice = modelRatio * 2.0;
let completionRatioPrice = modelRatio * 2.0 * completionRatio;
let price =
(inputTokens / 1000000) * inputRatioPrice +
(completionTokens / 1000000) * completionRatioPrice;
return (
<>
<article>
<p>提示 ${inputRatioPrice} / 1M tokens</p>
<p>补全 ${completionRatioPrice} / 1M tokens</p>
<p>
提示:${inputRatioPrice} * {groupRatio} = $
{inputRatioPrice * groupRatio} / 1M tokens
</p>
<p>
补全:${completionRatioPrice} * {groupRatio} = $
{completionRatioPrice * groupRatio} / 1M tokens
</p>
<p></p>
<p>
提示 {inputTokens} tokens / 1M tokens * ${inputRatioPrice} + 补全{' '}
{completionTokens} tokens / 1M tokens * ${completionRatioPrice} = $
{price.toFixed(6)}
{completionTokens} tokens / 1M tokens * ${completionRatioPrice} *
分组 {groupRatio} = ${price.toFixed(6)}
</p>
<p>仅供参考,以实际扣费为准</p>
</article>
</>
);
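For reference, the per-request arithmetic used in the hunk above, pulled out as a standalone sketch (the ratio and token values are made-up illustration numbers; the real function also renders the group ratio into the displayed text):

// 1 ratio = $0.002 / 1K tokens, i.e. modelRatio * 2.0 dollars per 1M tokens
const modelRatio = 15; // hypothetical value
const completionRatio = 3; // hypothetical value
const inputTokens = 1000;
const completionTokens = 500;

const inputRatioPrice = modelRatio * 2.0; // $30 per 1M prompt tokens
const completionRatioPrice = modelRatio * 2.0 * completionRatio; // $90 per 1M completion tokens
const price =
(inputTokens / 1000000) * inputRatioPrice +
(completionTokens / 1000000) * completionRatioPrice; // 0.03 + 0.045 = $0.075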
@@ -216,6 +216,15 @@ export const verifyJSON = (str) => {
return true;
};

export function verifyJSONPromise(value) {
try {
JSON.parse(value);
return Promise.resolve();
} catch (e) {
return Promise.reject('不是合法的 JSON 字符串');
}
}

export function shouldShowPrompt(id) {
let prompt = localStorage.getItem(`prompt-${id}`);
return !prompt;
@@ -39,13 +39,16 @@ const About = () => {
</Layout.Header>
<Layout.Content>
<p>可在设置页面设置关于内容,支持 HTML & Markdown</p>
new-api项目仓库地址:
New-API项目仓库地址:
<a href='https://github.com/Calcium-Ion/new-api'>
https://github.com/Calcium-Ion/new-api
</a>
<p>
NewAPI © 2023 CalciumIon | 基于 One API v0.5.4 © 2023
JustSong。本项目根据MIT许可证授权。
JustSong。
</p>
<p>
本项目根据MIT许可证授权,需在遵守Apache-2.0协议的前提下使用。
</p>
</Layout.Content>
</Layout>
@@ -15,6 +15,7 @@ import {
Space,
Spin,
Button,
Tooltip,
Input,
Typography,
Select,
@@ -24,6 +25,7 @@ import {
} from '@douyinfe/semi-ui';
import { Divider } from 'semantic-ui-react';
import { getChannelModels, loadChannelModels } from '../../components/utils.js';
import axios from 'axios';

const MODEL_MAPPING_EXAMPLE = {
'gpt-3.5-turbo-0301': 'gpt-3.5-turbo',
@@ -35,6 +37,9 @@ const STATUS_CODE_MAPPING_EXAMPLE = {
400: '500',
};

const fetchButtonTips =
'1. 新建渠道时,请求通过当前浏览器发出;2. 编辑已有渠道,请求通过后端服务器发出';

function type2secretPrompt(type) {
// inputs.type === 15 ? '按照如下格式输入:APIKey|SecretKey' : (inputs.type === 18 ? '按照如下格式输入:APPID|APISecret|APIKey' : '请输入渠道对应的鉴权密钥')
switch (type) {
@@ -122,6 +127,9 @@ const EditChannel = (props) => {
'mj_uploads',
];
break;
case 36:
localModels = ['suno_music', 'suno_lyrics'];
break;
default:
localModels = getChannelModels(value);
break;
@@ -173,12 +181,58 @@ const EditChannel = (props) => {
setLoading(false);
};

const fetchUpstreamModelList = async (name) => {
if (inputs['type'] !== 1) {
showError('仅支持 OpenAI 接口格式');
return;
}
setLoading(true);
const models = inputs['models'] || [];
let err = false;
if (isEdit) {
const res = await API.get('/api/channel/fetch_models/' + channelId);
if (res.data && res.data?.success) {
models.push(...res.data.data);
} else {
err = true;
}
} else {
if (!inputs?.['key']) {
showError('请填写密钥');
err = true;
} else {
try {
const host = new URL(inputs['base_url'] || 'https://api.openai.com');

const url = `https://${host.hostname}/v1/models`;
const key = inputs['key'];
const res = await axios.get(url, {
headers: {
Authorization: `Bearer ${key}`,
},
});
if (res.data && res.data?.success) {
models.push(...res.data.data.map((model) => model.id));
} else {
err = true;
}
} catch (error) {
err = true;
}
}
}
if (!err) {
handleInputChange(name, Array.from(new Set(models)));
showSuccess('获取模型列表成功');
} else {
showError('获取模型列表失败');
}
setLoading(false);
};
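As the hunk above shows, for a new channel the model list is fetched straight from the browser against the upstream's OpenAI-compatible endpoint. A minimal sketch of that request, under the assumption of a standard OpenAI-style /v1/models response (the base URL and key below are placeholders, not values from this diff):

// GET {base_url}/v1/models with the channel key as a Bearer token
const listUpstreamModels = async (baseUrl, key) => {
const host = new URL(baseUrl || 'https://api.openai.com');
const res = await fetch(`https://${host.hostname}/v1/models`, {
headers: { Authorization: `Bearer ${key}` },
});
const body = await res.json();
// an OpenAI-style list response carries the models under body.data
return (body.data || []).map((model) => model.id);
};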

const fetchModels = async () => {
try {
let res = await API.get(`/api/channel/models`);
if (res === undefined) {
return;
}
let localModelOptions = res.data.data.map((model) => ({
label: model.id,
value: model.id,
@@ -432,6 +486,17 @@ const EditChannel = (props) => {
)}
{inputs.type === 8 && (
<>
<div style={{ marginTop: 10 }}>
<Banner
type={'warning'}
description={
<>
如果你对接的是上游One API或者New
API等转发项目,请使用OpenAI类型,不要使用此类型,除非你知道你在做什么。
</>
}
></Banner>
</div>
<div style={{ marginTop: 10 }}>
<Typography.Text strong>
完整的 Base URL,支持变量{'{model}'}:
@@ -450,6 +515,26 @@ const EditChannel = (props) => {
/>
</>
)}
{inputs.type === 36 && (
<>
<div style={{ marginTop: 10 }}>
<Typography.Text strong>
注意非Chat API,请务必填写正确的API地址,否则可能导致无法使用
</Typography.Text>
</div>
<Input
name='base_url'
placeholder={
'请输入到 /suno 前的路径,通常就是域名,例如:https://api.example.com '
}
onChange={(value) => {
handleInputChange('base_url', value);
}}
value={inputs.base_url}
autoComplete='new-password'
/>
</>
)}
<div style={{ marginTop: 10 }}>
<Typography.Text strong>名称:</Typography.Text>
</div>
@@ -550,6 +635,16 @@ const EditChannel = (props) => {
>
填入所有模型
</Button>
<Tooltip content={fetchButtonTips}>
<Button
type='tertiary'
onClick={() => {
fetchUpstreamModelList('models');
}}
>
获取模型列表
</Button>
</Tooltip>
<Button
type='warning'
onClick={() => {
@@ -685,23 +780,26 @@ const EditChannel = (props) => {
</Space>
</div>
)}
{inputs.type !== 3 && inputs.type !== 8 && inputs.type !== 22 && (
<>
<div style={{ marginTop: 10 }}>
<Typography.Text strong>代理:</Typography.Text>
</div>
<Input
label='代理'
name='base_url'
placeholder={'此项可选,用于通过代理站来进行 API 调用'}
onChange={(value) => {
handleInputChange('base_url', value);
}}
value={inputs.base_url}
autoComplete='new-password'
/>
</>
)}
{inputs.type !== 3 &&
inputs.type !== 8 &&
inputs.type !== 22 &&
inputs.type !== 36 && (
<>
<div style={{ marginTop: 10 }}>
<Typography.Text strong>代理:</Typography.Text>
</div>
<Input
label='代理'
name='base_url'
placeholder={'此项可选,用于通过代理站来进行 API 调用'}
onChange={(value) => {
handleInputChange('base_url', value);
}}
value={inputs.base_url}
autoComplete='new-password'
/>
</>
)}
{inputs.type === 22 && (
<>
<div style={{ marginTop: 10 }}>
@@ -86,11 +86,21 @@ const Home = () => {
<p>
源码:
<a
href='https://github.com/songquanpeng/one-api'
href='https://github.com/Calcium-Ion/new-api'
target='_blank'
rel='noreferrer'
>
https://github.com/songquanpeng/one-api
https://github.com/Calcium-Ion/new-api
</a>
</p>
<p>
协议:
<a
href='https://www.apache.org/licenses/LICENSE-2.0'
target='_blank'
rel='noreferrer'
>
Apache-2.0 License
</a>
</p>
<p>启动时间:{getStartTimeString()}</p>
Some files were not shown because too many files have changed in this diff.