Compare commits

618 Commits

Author SHA1 Message Date
quangdn-ght
b93bf2b018 Merge a043f039f0 into 673f907ea4 2025-06-26 04:54:15 +00:00
quangdn-ght
a043f039f0 handle building Vietnamese masks and update the assistant role 2025-06-26 11:54:08 +07:00
quangdn-ght
8af4d7a6d9 Handle user synchronization successfully 2025-06-26 00:47:54 +07:00
quangdn-ght
ee25e4ac82 update UI 2025-06-25 23:25:07 +07:00
quangdn-ght
b07760fbc9 update UI 2025-06-25 18:12:09 +07:00
quangdn-ght
cdeb27891b change chebichat UI 2025-06-25 14:57:12 +07:00
quangdn-ght
e3fc9eef8f switch to DashScope-compatible API - optimize qwen-vl vision 2025-06-24 12:00:46 +07:00
quangdn-ght
4aaa9db666 change default Alibaba module - chebichat 2025-06-24 10:25:53 +07:00
quangdn-ght
861d854e16 change default Alibaba module - chebichat 2025-06-24 09:18:27 +07:00
RiverRay
673f907ea4 Update README.md
2025-06-19 20:18:28 +08:00
RiverRay
fb3af2a08f Merge pull request #6515 from dupl/main
Removed deprecated Gemini models
2025-06-14 13:35:32 +08:00
dupl
eb193ac0ff Removed deprecated Gemini models 2025-06-12 15:34:03 +08:00
RiverRay
c30ddfbb07 Merge pull request #6425 from yunlingz/o_model_md_response
Fix: Encourage markdown inclusion in model responses for o1/o3
2025-06-12 11:19:24 +08:00
RiverRay
a2f0149786 Merge pull request #6460 from dreamsafari/main
Add Grok3 to the model list
2025-06-12 11:13:31 +08:00
GH Action - Upstream Sync
03d36f96ed Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web 2025-06-12 01:53:30 +00:00
RiverRay
705dffc664 Merge pull request #6514 from KevinShiCN/patch-1
Add gemini-2.5-pro-preview-06-05 into constant.ts
2025-06-11 16:14:09 +08:00
KevinShiCN
02f7e6de98 Add gemini-2.5-pro-preview-06-05 into constant.ts 2025-06-08 23:59:49 +08:00
dreamsafari
843dc52efa Add Grok3 to the model list 2025-04-22 13:06:54 +08:00
RiverRay
3809375694 Merge pull request #6457 from ACTOR-ALCHEMIST/main
Support OpenAI o3 and o4-mini
2025-04-19 16:00:41 +08:00
RiverRay
1b0de25986 Update README.md 2025-04-19 15:59:31 +08:00
RiverRay
865c45dd29 Update README.md 2025-04-19 15:56:53 +08:00
RiverRay
1f5d8e6d9c Merge pull request #6458 from ChatGPTNextWeb/Leizhenpeng-patch-7
Update README.md
2025-04-19 15:50:48 +08:00
RiverRay
c9ef6d58ed Update README.md 2025-04-19 15:50:17 +08:00
Jasper Hu
2d7229d2b8 feat: support new OpenAI models o3 and o4-mini, and adapt their new parameters 2025-04-18 20:36:07 +01:00
RiverRay
11b37c15bd Merge pull request #6450 from stephen-zeng/main
Add gpt-4.1 family & gpt-4.5-preview support
2025-04-17 08:29:19 +08:00
QwQwQ
1d0038f17d add gpt-4.5-preview support 2025-04-16 22:10:47 +08:00
QwQwQ
619fa519c0 add gpt-4.1 family support 2025-04-16 22:02:35 +08:00
Yunling Zhu
c261ebc82c use unshift to improve perf 2025-04-06 16:56:54 +08:00
Yunling Zhu
f7c747c65f encourage markdown inclusion for o1/o3 2025-04-03 22:11:59 +08:00
RiverRay
48469bd8ca Merge pull request #6392 from ChatGPTNextWeb/Leizhenpeng-patch-6
Update README.md
2025-03-20 17:52:02 +08:00
RiverRay
5a5e887f2b Update README.md 2025-03-20 17:51:47 +08:00
RiverRay
b6f5d75656 Merge pull request #6344 from vangie/fix/jest-setup-esm
test: fix unit test failures
2025-03-14 20:04:56 +08:00
Vangie Du
0d41a17ef6 test: fix unit test failures 2025-03-07 14:49:17 +08:00
RiverRay
f7cde17919 Merge pull request #6292 from Little-LittleProgrammer/feature/alibaba-omni-support
feat(alibaba): Added alibaba vision model and omni model support
2025-03-01 10:25:16 +08:00
RiverRay
570cbb34b6 Merge pull request #6310 from agi-dude/patch-1
Remove duplicate links
2025-03-01 10:24:38 +08:00
RiverRay
7aa9ae0a3e Merge pull request #6311 from ChatGPTNextWeb/6305-bugthe-first-message-except-the-system-message-of-deepseek-reasoner-must-be-a-user-message-but-an-assistant-message-detected
fix: enforce that the first message (excluding system messages) is a …
2025-02-28 19:48:09 +08:00
Kadxy
2d4180f5be fix: update request payload to use filtered messages in Deepseek API 2025-02-28 13:59:30 +08:00
Kadxy
9f0182b55e fix: enforce that the first message (excluding system messages) is a user message in the Deepseek API 2025-02-28 13:54:58 +08:00
Mr. AGI
ad6666eeaf Update README.md 2025-02-28 10:47:52 +05:00
EvanWu
a2c4e468a0 fix(app/utils/chat.ts): fix type error 2025-02-26 19:58:32 +08:00
RiverRay
2167076652 Merge pull request #6293 from hyiip/main
claude 3.7 support
2025-02-26 18:41:28 +08:00
RiverRay
e123076250 Merge pull request #6295 from rexkyng/patch-1
Fix: Improve Mistral icon detection and remove redundant code.
2025-02-26 18:39:59 +08:00
Rex Ng
ebcb4db245 Fix: Improve Mistral icon detection and remove redundant code.
- Added "codestral" to the list of acceptable names for the Mistral icon, ensuring proper detection.
- Removed duplicate `toLowerCase()` calls.
2025-02-25 14:30:18 +08:00
EvanWu
0a25a1a8cb refactor(app/utils/chat.ts): optimize function preProcessImageContentBase 2025-02-25 09:22:47 +08:00
hyiip
f3154b20a5 claude 3.7 support 2025-02-25 03:55:24 +08:00
EvanWu
b709ee3983 feat(alibaba): Added alibaba vision model and omni model support 2025-02-24 20:18:07 +08:00
RiverRay
f5f3ce94f6 Update README.md
2025-02-21 08:56:43 +08:00
RiverRay
2b5f600308 Update README.md 2025-02-21 08:55:40 +08:00
RiverRay
b966107117 Merge pull request #6235 from DBCDK/danish-locale
Translation to danish
2025-02-17 22:58:01 +08:00
river
377480b448 Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web
2025-02-16 10:50:07 +08:00
river
8bd0d6a1a7 chore: Update NextChatAI domain from nextchat.dev to nextchat.club 2025-02-16 10:48:54 +08:00
Rasmus Erik Voel Jensen
90827fc593 danish rewording / improved button label 2025-02-15 13:08:58 +01:00
Rasmus Erik Voel Jensen
008e339b6d danish locale 2025-02-15 12:52:44 +01:00
RiverRay
12863f5213 Merge pull request #6204 from bestsanmao/ali_bytedance_reasoning_content
add 3 types of reasoning_content support (deepseek-r1 @OpenAI, @Alibaba, @ByteDance), parse <think></think> from SSE
2025-02-13 14:53:47 +08:00
suruiqiang
cf140d4228 Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web into ali_bytedance_reasoning_content 2025-02-12 17:54:50 +08:00
suruiqiang
476d946f96 fix bug (trim eats space or \n mistakenly), optimize timeout by model 2025-02-12 17:49:54 +08:00
suruiqiang
9714258322 support deepseek-r1@OpenAI's reasoning_content, parse <think></think> from stream 2025-02-11 18:57:16 +08:00
RiverRay
48cd4b11b5 Merge pull request #6190 from siliconflow/refine-emoji-siliconflow
Fix model icon on SiliconFlow
2025-02-11 18:37:47 +08:00
RiverRay
77c78b230a Merge pull request #6193 from siliconflow/get-models-siliconflow
Model listing of SiliconFlow
2025-02-11 18:37:22 +08:00
RiverRay
b44686b887 Merge pull request #6189 from bestsanmao/bug_fix
fix avatar for export message preview and saved image
2025-02-11 18:36:50 +08:00
RiverRay
34bdd4b945 Merge pull request #6194 from siliconflow/vl-support-on-sf
Support VLM on SiliconFlow
2025-02-11 18:35:02 +08:00
suruiqiang
b0758cccde optimization 2025-02-11 16:08:30 +08:00
suruiqiang
98a11e56d2 support alibaba and bytedance's reasoning_content 2025-02-11 12:46:46 +08:00
Shenghang Tsai
86f86962fb Support VLM on SiliconFlow 2025-02-10 13:39:06 +08:00
Shenghang Tsai
2137aa65bf Model listing of SiliconFlow 2025-02-10 11:03:49 +08:00
Shenghang Tsai
18fa2cc30d fix model icon on siliconflow 2025-02-09 18:49:26 +08:00
Shenghang Tsai
0bfc648085 fix model icon on siliconflow 2025-02-09 18:47:57 +08:00
suruiqiang
9f91c2d05c fix avatar for export message preview and saved image 2025-02-09 16:52:46 +08:00
RiverRay
a029b4330b Merge pull request #6188 from ChatGPTNextWeb/Leizhenpeng-patch-4
Update LICENSE
2025-02-09 11:05:43 +08:00
RiverRay
2842b264e0 Update LICENSE 2025-02-09 11:05:32 +08:00
RiverRay
c2edfec16f Merge pull request #6172 from bestsanmao/bug_fix
fix several bugs
2025-02-09 11:03:44 +08:00
RiverRay
6406ac99a3 Merge pull request #6175 from itsevin/main
Add other Xai model
2025-02-09 11:02:13 +08:00
suruiqiang
97a4aafc92 Merge remote-tracking branch 'remotes/origin/main' into bug_fix 2025-02-09 09:46:07 +08:00
GH Action - Upstream Sync
d8f533e1f3 Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web 2025-02-09 01:22:47 +00:00
RiverRay
c6199dbf9f Merge pull request #6186 from siliconflow/fix-truc-of-reasoning-model
Fix formatting of reasoning model on SiliconFlow
2025-02-08 23:40:39 +08:00
RiverRay
4273aa0803 Merge pull request #6185 from siliconflow/larger_timeout_for_siliconflow
Larger timeout for SiliconFlow
2025-02-08 23:39:49 +08:00
Shenghang Tsai
acf75ce68f Remove unnecessary trimming 2025-02-08 16:34:17 +08:00
suruiqiang
1ae5fdbf01 mini optimizations 2025-02-08 16:15:10 +08:00
Shenghang Tsai
2a3996e0d6 Update siliconflow.ts 2025-02-08 14:38:12 +08:00
GH Action - Upstream Sync
fdbaddde37 Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web 2025-02-08 01:16:56 +00:00
suruiqiang
d74f79e9c5 Merge remote-tracking branch 'remotes/origin/HEAD' into bug_fix 2025-02-08 08:29:34 +08:00
itsevin
c4e9cb03a9 Add Xai model 2025-02-07 20:29:21 +08:00
RiverRay
bf265d3375 Merge pull request #6164 from ZhangYichi-ZYc/main
Fix: Set consistent fill color for OpenAI/MoonShot/Grok SVG to prevent color inversion in dark mode
2025-02-07 20:25:20 +08:00
RiverRay
17f391d929 Merge pull request #6158 from dupl/main
update the latest Gemini models
2025-02-07 20:23:47 +08:00
RiverRay
78186c27fb Merge pull request #6168 from xiexin12138/fix-env
Fix: add the SiliconFlow environment variables to env; add 2 more paid models supported by SiliconFlow
2025-02-07 20:23:01 +08:00
suruiqiang
a5a9768245 change request timeout for thinking mode 2025-02-07 16:34:14 +08:00
suruiqiang
3fe55b4f7f fix bug that gemini has multiple candidates part 2025-02-07 16:20:07 +08:00
suruiqiang
f156430cc5 fix emoji issue for doubao and glm's congview & congvideox 2025-02-07 16:18:15 +08:00
suruiqiang
f30c6a4348 fix doubao and grok not upload image 2025-02-07 16:14:19 +08:00
xiexin12138
a780b39c17 fix: add the paid DeepSeek models supported by SiliconFlow 2025-02-07 15:43:50 +08:00
xiexin12138
1010db834c fix: add the SiliconFlow env environment variables 2025-02-07 15:41:40 +08:00
ZhangYichi
51384ddc5f Fix: Set consistent fill color for OpenAI/MoonShot/Grok SVG to prevent color inversion in dark mode 2025-02-07 11:13:22 +08:00
dupl
e5e5fde924 update the latest Gemini models 2025-02-07 06:50:31 +08:00
RiverRay
add9ca200c Merge pull request #6144 from Eric-2369/add-more-llm-icons
feat: add more llm icons
2025-02-06 18:08:08 +08:00
Eric-2369
5225a6e192 feat: add more llm icons 2025-02-05 12:34:00 +08:00
RiverRay
28cbe56cec Merge pull request #6141 from siliconflow/provider_silicon
New provider SiliconFlow and Its Latest DeepSeek Models
2025-02-04 21:29:02 +08:00
Shenghang Tsai
ad9ab9d45a New provider SiliconFlow and Its Latest DeepSeek Models
Update README.md

Update constant.ts

Update README_CN.md
2025-02-04 16:59:26 +08:00
RiverRay
bb4832e6e7 Merge pull request #6129 from MonadMonAmi/update_knowledge_cutoff_date
chore: add knowledge cut off dates for o1 and o3
2025-02-04 09:38:04 +08:00
RiverRay
39b3487ea0 Merge branch 'main' into update_knowledge_cutoff_date 2025-02-04 09:37:55 +08:00
RiverRay
32b60909ae Merge pull request #6132 from RetiredQQ/main
temporary fix for o3-mini
2025-02-04 09:35:43 +08:00
RiverRay
5db6775cb8 Merge pull request #6134 from zcong1993/main
fix: fix isModelNotavailableInServer logic for bytedance models
2025-02-04 09:34:43 +08:00
RiverRay
b6881c7797 Merge pull request #6127 from dupl/main
add gemini-2.0-flash-thinking-exp, gemini-2.0-flash-thinking-exp-01-21
2025-02-04 09:33:13 +08:00
RiverRay
9943a52295 Update README.md 2025-02-04 09:31:16 +08:00
RiverRay
1db4d25370 Update README.md 2025-02-04 09:29:56 +08:00
zcong1993
92f57fb18f fix: fix isModelNotavailableInServer logic for bytedance models 2025-02-03 16:58:42 +08:00
Sky
4c4d44e2f8 fix 2025-02-02 21:45:30 +00:00
Sky
8f12beb8f0 support o3-mini 2025-02-02 21:43:30 +00:00
AndrewS
2e7cac3218 chore: add knowledge cut off dates for o1 and o3 2025-02-02 19:44:53 +01:00
dupl
60fa358010 typo: OpanAI -> OpenAI 2025-02-02 23:27:45 +08:00
dupl
034b7d4655 add gemini-2.0-flash-thinking-exp, gemini-2.0-flash-thinking-exp-01-21 2025-02-02 23:11:07 +08:00
RiverRay
1e20b64048 Merge pull request #6121 from ChatGPTNextWeb/feat/support-openai-o3-mini
feat(model): add support for OpenAI o3-mini model
2025-02-02 20:57:21 +08:00
Kadxy
4f28fca506 feat: Support OpenAI o3-mini 2025-02-01 15:02:06 +08:00
RiverRay
3ef5993085 Merge pull request #6119 from ChatGPTNextWeb/Leizhenpeng-patch-3
Update README.md
2025-01-31 08:18:47 +08:00
RiverRay
09ad7c1875 Update README.md 2025-01-31 08:18:13 +08:00
RiverRay
31e52cb47e Update README.md 2025-01-31 06:53:39 +08:00
RiverRay
9a69c5bd7c Merge pull request #6118 from ChatGPTNextWeb/feat/issue-6104-deepseek-reasoning-content 2025-01-31 06:48:00 +08:00
Kadxy
be645aab37 fix: revert unintended changes 2025-01-31 00:59:03 +08:00
RiverRay
c41e86faa6 Merge pull request #6116 from ChatGPTNextWeb/feat/issue-6104-deepseek-reasoning-content
Support DeepSeek API streaming reasoning content
2025-01-31 00:52:18 +08:00
river
143be69a7f chore: remove log 2025-01-31 00:50:03 +08:00
river
63b7626656 chore: change md 2025-01-31 00:49:09 +08:00
Kadxy
dabb7c70d5 feat: Remove reasoning_content for DeepSeek API messages 2025-01-31 00:30:08 +08:00
Kadxy
c449737127 feat: Support DeepSeek API streaming with thinking mode 2025-01-31 00:07:52 +08:00
RiverRay
553b8c9f28 Update .env.template
2025-01-27 13:05:17 +08:00
river
19314793b8 Merge branch 'bestsanmao-bug_fix' 2025-01-27 12:55:31 +08:00
river
8680182921 feat: Add DeepSeek API key and fix MCP environment variable parsing 2025-01-27 12:48:59 +08:00
suruiqiang
2173c82bb5 add deepseek-reasoner, and change deepseek's summary model to deepseek-chat 2025-01-23 18:47:22 +08:00
suruiqiang
0d5e66a9ae not insert mcpSystemPrompt if not ENABLE_MCP 2025-01-23 18:24:38 +08:00
RiverRay
2f9cb5a68f Merge pull request #6084 from ChatGPTNextWeb/temp-fix
fix: missing mcp_config.json files required for building
2025-01-22 21:40:37 +08:00
Kadxy
55cacfb7e2 fix: missing files required for building 2025-01-22 21:28:29 +08:00
RiverRay
6a862372f7 Merge pull request #6082 from ChatGPTNextWeb/Leizhenpeng-patch-2
Update README_CN.md
2025-01-22 13:11:11 +08:00
RiverRay
81bd83eb44 Update README_CN.md 2025-01-22 13:08:33 +08:00
RiverRay
b2b6fd81be Merge pull request #6075 from Kadxy/main
2025-01-20 10:44:46 +08:00
Kadxy
f22cfd7b33 Update chat.tsx 2025-01-20 10:10:52 +08:00
RiverRay
8111acff34 Update README.md
2025-01-20 00:17:47 +08:00
RiverRay
4cad55379d Merge pull request #5974 from ChatGPTNextWeb/feat-mcp
Support MCP (WIP)
2025-01-20 00:07:41 +08:00
Kadxy
a3d3ce3f4c Merge branch 'main' into feat-mcp 2025-01-19 23:28:12 +08:00
Kadxy
611e97e641 docs: update README.md 2025-01-19 23:20:58 +08:00
Kadxy
bfeea4ed49 fix: prevent MCP operations from blocking chat interface 2025-01-19 01:02:01 +08:00
Kadxy
bc71ae247b feat: add ENABLE_MCP env var to toggle MCP feature globally and in Docker 2025-01-18 21:19:01 +08:00
Kadxy
0112b54bc7 fix: missing en translation 2025-01-16 22:35:26 +08:00
Kadxy
65810d918b feat: improve async operations and UI feedback 2025-01-16 21:31:19 +08:00
river
4d535b1cd0 chore: enhance mcp prompt 2025-01-16 20:54:24 +08:00
Kadxy
588d81e8f1 feat: remove unused files 2025-01-16 09:17:08 +08:00
Kadxy
d4f499ee41 feat: adjust form style 2025-01-16 09:11:53 +08:00
Kadxy
4d63d73b2e feat: load MCP preset data from server 2025-01-16 09:00:57 +08:00
Kadxy
07c63497dc feat: support stop/start MCP servers 2025-01-16 08:52:54 +08:00
Kadxy
e440ff56c8 fix: env not work 2025-01-15 18:47:05 +08:00
river
c89e4883b2 chore: update icon 2025-01-15 17:31:18 +08:00
river
ac3d940de8 Merge branch 'feat-mcp' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web into feat-mcp 2025-01-15 17:29:43 +08:00
Kadxy
be59de56f0 feat: Display the number of clients instead of the number of available tools. 2025-01-15 17:24:04 +08:00
river
a70e9a3c01 chore: update mcp icon 2025-01-15 17:23:10 +08:00
Kadxy
8aa9a500fd feat: Optimize MCP configuration logic 2025-01-15 16:52:54 +08:00
RiverRay
93652db688 Update README.md
2025-01-13 16:57:50 +08:00
RiverRay
8421c483e8 Update README.md
2025-01-12 12:56:13 +08:00
Dogtiti
4ac27fdd4d Merge pull request #6033 from lvguanjun/fix_fork_session
fix: prevent message sync between forked sessions by generating unique IDs
2025-01-11 16:19:02 +08:00
Dogtiti
b6b2c501fd Merge pull request #6034 from dupl/main
Correct the typos in user-manual-cn.md
2025-01-11 16:17:32 +08:00
Kadxy
ce13cf61a7 feat: ignore mcp_config.json 2025-01-09 20:15:47 +08:00
Kadxy
a3af563e89 feat: Reset mcp_config.json to empty 2025-01-09 20:13:16 +08:00
Kadxy
e95c94d7be fix: inaccurate content 2025-01-09 20:10:10 +08:00
Kadxy
125a71fead fix: unnecessary initialization 2025-01-09 20:07:24 +08:00
Kadxy
b410ec399c feat: auto scroll to bottom when MCP response 2025-01-09 20:02:27 +08:00
Kadxy
7d51bfd42e feat: MCP market 2025-01-09 19:51:01 +08:00
Kadxy
0c14ce6417 fix: MCP execution content matching failed. 2025-01-09 13:41:17 +08:00
Kadxy
f2a2b40d2c feat: carry mcp primitives content as a system prompt 2025-01-09 10:20:56 +08:00
Kadxy
77be190d76 feat: carry mcp primitives content as a system prompt 2025-01-09 10:09:46 +08:00
dupl
c56587c438 Correct the typos in user-manual-cn.md 2025-01-05 20:34:18 +08:00
lvguanjun
840c151ab9 fix: prevent message sync between forked sessions by generating unique IDs 2025-01-05 11:22:53 +08:00
RiverRay
0af04e0f2f Merge pull request #5468 from DDMeaqua/feat-shortcutkey
feat: #5422 keyboard shortcut to clear context
2024-12-31 16:23:10 +08:00
DDMeaqua
d184eb6458 chore: cmd + shift+ backspace 2024-12-31 14:50:54 +08:00
DDMeaqua
c5d9b1131e fix: merge bug 2024-12-31 14:38:58 +08:00
DDMeaqua
e13408dd24 Merge branch 'main' into feat-shortcutkey 2024-12-31 14:30:09 +08:00
DDMeaqua
aba4baf384 chore: update 2024-12-31 14:25:43 +08:00
DDMeaqua
6d84f9d3ae chore: update 2024-12-31 13:27:15 +08:00
Dogtiti
63c5baaa80 Merge pull request #6010 from code-october/fix-visionModels
Fix VISION_MODELS not taking effect when running in Docker
2024-12-31 09:56:46 +08:00
Dogtiti
defefba925 Merge pull request #6016 from bestsanmao/add_deepseek
fix issue #6009  add setting items for deepseek
2024-12-30 19:27:20 +08:00
suruiqiang
90c531c224 fix issue #6009 add setting items for deepseek 2024-12-30 18:23:18 +08:00
code-october
266e9efd2e rename the function 2024-12-30 09:13:12 +00:00
code-october
57c88c0717 Fix VISION_MODELS not taking effect when running in Docker 2024-12-30 08:58:41 +00:00
DDMeaqua
5b5dea1c59 chore: change keyboard shortcut 2024-12-30 12:11:50 +08:00
Dogtiti
d56566cd73 Merge pull request #6001 from bestsanmao/add_deepseek
docs: add DEEPSEEK_API_KEY and DEEPSEEK_URL in README
2024-12-30 09:42:22 +08:00
suruiqiang
b5d104c908 Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web into add_deepseek 2024-12-30 09:04:40 +08:00
RiverRay
f9e9129d52 Update README.md
2024-12-29 19:57:27 +08:00
suruiqiang
2a8a18391e docs: add DEEPSEEK_API_KEY and DEEPSEEK_URL in README 2024-12-29 15:31:50 +08:00
Dogtiti
e1cb8e36fa Merge pull request #5989 from bestsanmao/add_deepseek
since #5984, add DeepSeek as a new ModelProvider (with deepseek-chat & deepseek-coder models), so that users can use OpenAI and DeepSeek at the same time with different API URLs & keys
2024-12-29 12:35:21 +08:00
suruiqiang
b948d6bf86 bug fix 2024-12-29 11:24:57 +08:00
Kadxy
fe67f79050 feat: MCP message type 2024-12-29 09:24:52 +08:00
suruiqiang
67338ff9b7 add KnowledgeCutOffDate for deepseek 2024-12-29 08:58:45 +08:00
suruiqiang
7380c8a2c1 Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web into add_deepseek 2024-12-29 08:43:25 +08:00
Kadxy
e1ba8f1b0f feat: Send MCP response as a user 2024-12-29 08:29:02 +08:00
Dogtiti
c0062ff280 Merge pull request #5998 from dupl/main
Use regular expressions to make the code more concise.
2024-12-29 00:22:13 +08:00
dupl
39e593da48 Use regular expressions to make the code more concise. 2024-12-28 23:49:28 +08:00
Dogtiti
f8b10ad8b1 Merge pull request #5997 from ChatGPTNextWeb/feature/glm-4v
feature: support glm-4v
2024-12-28 23:34:44 +08:00
Dogtiti
8a22c9d6db feature: support glm-4v 2024-12-28 23:33:06 +08:00
RiverRay
5f96804f3b Merge pull request #5920 from fishshi/i18n
Use i18n for DISCOVERY
2024-12-28 22:05:37 +08:00
RiverRay
13430ea3e2 Merge pull request #5965 from zmhuanf/temp
Fix issue #5964: Prevents character loss in gemini-2.0-flash-thinking-exp-1219 responses
2024-12-28 22:02:02 +08:00
Kadxy
664879b9df feat: Create all MCP Servers at startup 2024-12-28 21:06:26 +08:00
Dogtiti
9df24e568b Merge pull request #5996 from ChatGPTNextWeb/feature/cogview
Feature/cogview
2024-12-28 20:25:25 +08:00
Dogtiti
bc322be448 fix: type error 2024-12-28 20:24:08 +08:00
Dogtiti
a867adaf04 fix: size 2024-12-28 20:23:51 +08:00
Dogtiti
0cb186846a feature: support glm Cogview 2024-12-28 20:23:44 +08:00
Dogtiti
e467ce028d Merge pull request #5994 from ConnectAI-E/fix/failed-test
fix: failed unit test
2024-12-28 17:55:29 +08:00
Dogtiti
cdfe907fb5 fix: failed unit test 2024-12-28 17:54:21 +08:00
Dogtiti
d91af7f983 Merge pull request #5883 from code-october/fix/model-leak
fix model leak issue
2024-12-28 14:47:35 +08:00
Kadxy
c3108ad333 feat: simple MCP example 2024-12-28 14:31:43 +08:00
suruiqiang
081daf937e since #5984, add DeepSeek as a new ModelProvider (with deepseek-chat & deepseek-coder models), so that users can use OpenAI and DeepSeek at the same time with different API URLs & keys 2024-12-27 16:57:26 +08:00
RiverRay
0c3d4462ca Merge pull request #5976 from ChatGPTNextWeb/Leizhenpeng-patch-1
Update README.md
2024-12-23 22:47:59 +08:00
RiverRay
3c859fc29f Update README.md 2024-12-23 22:47:16 +08:00
river
e1c7c54dfa chore: change md 2024-12-23 22:32:36 +08:00
zmhuanf
87b5e3bf62 fix bug 2024-12-22 15:44:47 +08:00
Dogtiti
1d15666713 Merge pull request #5919 from Yiming3/feature/flexible-visual-model
feat: runtime configuration of vision-capable models
2024-12-22 10:37:57 +08:00
Yiming Zhang
a127ae1fb4 docs: add VISION_MODELS section to README files 2024-12-21 13:12:41 -05:00
Yiming Zhang
ea1329f73e fix: add optional chaining to prevent errors when accessing visionModels 2024-12-21 04:07:58 -05:00
Yiming Zhang
149d732cb7 Merge remote-tracking branch 'upstream/main' into feature/flexible-visual-model 2024-12-21 03:53:05 -05:00
Yiming Zhang
210b29bfbe refactor: remove NEXT_PUBLIC_ prefix from VISION_MODELS env var 2024-12-21 03:51:54 -05:00
Dogtiti
acc2e97aab Merge pull request #5959 from dupl/gemini
add gemini-exp-1206, gemini-2.0-flash-thinking-exp-1219
2024-12-21 16:30:09 +08:00
dupl
93ac0e5017 Reorganized the Gemini model 2024-12-21 15:26:33 +08:00
Yiming Zhang
ed8c3580c8 test: add unit tests for isVisionModel utility function 2024-12-20 19:07:00 -05:00
dupl
0a056a7c5c add gemini-exp-1206, gemini-2.0-flash-thinking-exp-1219 2024-12-21 08:00:37 +08:00
Yiming Zhang
74c4711cdd Merge remote-tracking branch 'upstream/main' into feature/flexible-visual-model 2024-12-20 18:34:07 -05:00
Dogtiti
eceec092cf Merge pull request #5932 from fengzai6/update-google-models
Update google models to add gemini-2.0
2024-12-21 00:43:02 +08:00
Dogtiti
42743410a8 Merge pull request #5940 from ChatGPTNextWeb/dependabot/npm_and_yarn/testing-library/react-16.1.0
chore(deps-dev): bump @testing-library/react from 16.0.1 to 16.1.0
2024-12-21 00:41:45 +08:00
Dogtiti
0f04756d4c Merge pull request #5936 from InitialXKO/main
Rename the mask "search images by text" to "AI text-to-image", and fine-tune the prompt so image generation is more stable and watermark-free
2024-12-21 00:40:45 +08:00
dependabot[bot]
acdded8161 chore(deps-dev): bump @testing-library/react from 16.0.1 to 16.1.0
Bumps [@testing-library/react](https://github.com/testing-library/react-testing-library) from 16.0.1 to 16.1.0.
- [Release notes](https://github.com/testing-library/react-testing-library/releases)
- [Changelog](https://github.com/testing-library/react-testing-library/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/react-testing-library/compare/v16.0.1...v16.1.0)

---
updated-dependencies:
- dependency-name: "@testing-library/react"
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-12-16 10:57:34 +00:00
InitialXKO
e939ce5a02 Rename the mask "search images by text" to "AI text-to-image", and fine-tune the prompt so image generation is more stable and watermark-free 2024-12-13 22:29:14 +08:00
Nacho.L
46a0b100f7 Update versionKeywords 2024-12-13 08:29:43 +08:00
Nacho.L
e27e8fb0e1 Update google models 2024-12-13 07:22:16 +08:00
fishshi
93c5320bf2 Use i18n for DISCOVERY 2024-12-10 15:56:04 +08:00
Yiming Zhang
a433d1606c feat: use regex patterns for vision models and allow adding capabilities to models through env var NEXT_PUBLIC_VISION_MODELS. 2024-12-10 00:22:45 -05:00
code-october
cc5e16b045 update unit test 2024-11-30 07:30:52 +00:00
code-october
54f6feb2d7 update unit test 2024-11-30 07:28:38 +00:00
code-october
e1ac0538b8 add unit test 2024-11-30 07:22:24 +00:00
code-october
1a678cb4d8 fix model leak issue 2024-11-29 15:47:28 +00:00
Dogtiti
83cea3a90d Merge pull request #5879 from frostime/textline-custom-model
🎨 style(setting): Place custom-model's input in a separate row.
2024-11-28 12:02:42 +08:00
frostime
759a09a76c 🎨 style(setting): Place custom-model's input in a separate row. 2024-11-27 13:11:18 +08:00
Dogtiti
2623a92763 Merge pull request #5850 from code-october/fix-o1
Fix o1
2024-11-25 12:31:36 +08:00
Dogtiti
3932c594c7 Merge pull request #5861 from code-october/update-model
update new model for gpt-4o and gemini-exp
2024-11-22 20:59:30 +08:00
code-october
b7acb89096 update new model for gpt-4o and gemini-exp 2024-11-22 09:48:50 +00:00
code-october
ef24d3e633 use stream when request o1 2024-11-21 03:46:10 +00:00
code-october
23350c842b fix o1 in disableGPT4 2024-11-21 03:45:07 +00:00
Dogtiti
a2adfbbd32 Merge pull request #5821 from Sherlocksuper/scroll
feat: support more user-friendly scrolling
2024-11-16 15:24:46 +08:00
Lloyd Zhou
f22cec1eb4 Merge pull request #5827 from ConnectAI-E/fix/markdown-embed-codeblock
fix: rendering error when a small code block is embedded inside a code block
2024-11-15 16:03:27 +08:00
opchips
e56216549e fix: rendering error when a small code block is embedded inside a code block 2024-11-15 11:56:26 +08:00
Sherlock
19facc7c85 feat: support more user-friendly scrolling 2024-11-14 21:31:45 +08:00
Lloyd Zhou
b08ce5630c Merge pull request #5819 from ConnectAI-E/fix-gemini-summary
Fix gemini summary
2024-11-13 15:17:44 +08:00
DDMeaqua
b41c012d27 chore: shouldStream 2024-11-13 15:12:46 +08:00
Lloyd Zhou
a392daab71 Merge pull request #5816 from ConnectAI-E/feature/artifacts-svg
artifacts support svg
2024-11-13 14:58:33 +08:00
DDMeaqua
0628ddfc6f chore: update 2024-11-13 14:27:41 +08:00
DDMeaqua
7eda14f138 fix: [#5308] Gemini conversation summary 2024-11-13 14:24:44 +08:00
opchips
9a86c42c95 update 2024-11-12 16:33:55 +08:00
Lloyd Zhou
819d249a09 Merge pull request #5815 from LovelyGuYiMeng/main
Update vision model matching keywords
2024-11-12 15:04:11 +08:00
LovelyGuYiMeng
8d66fedb1f Update visionKeywords 2024-11-12 14:28:11 +08:00
Lloyd Zhou
7cf89b53ce Merge pull request #5812 from ConnectAI-E/fix/rerender-chat
fix: use current session id to trigger rerender
2024-11-12 13:49:51 +08:00
Dogtiti
459c373f13 Merge pull request #5807 from ChatGPTNextWeb/dependabot/npm_and_yarn/testing-library/jest-dom-6.6.3
chore(deps-dev): bump @testing-library/jest-dom from 6.6.2 to 6.6.3
2024-11-11 20:59:56 +08:00
Dogtiti
1d14a991ee fix: use current session id to trigger rerender 2024-11-11 20:30:59 +08:00
dependabot[bot]
05ef5adfa7 chore(deps-dev): bump @testing-library/jest-dom from 6.6.2 to 6.6.3
Bumps [@testing-library/jest-dom](https://github.com/testing-library/jest-dom) from 6.6.2 to 6.6.3.
- [Release notes](https://github.com/testing-library/jest-dom/releases)
- [Changelog](https://github.com/testing-library/jest-dom/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/jest-dom/compare/v6.6.2...v6.6.3)

---
updated-dependencies:
- dependency-name: "@testing-library/jest-dom"
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-11-11 10:53:00 +00:00
lloydzhou
38fa3056df update version
2024-11-11 13:26:08 +08:00
Lloyd Zhou
289aeec8af Merge pull request #5786 from ConnectAI-E/feature/realtime-chat
Feature/realtime chat
2024-11-11 13:19:26 +08:00
lloydzhou
7d71da938f remove close-24 svg 2024-11-11 13:15:09 +08:00
Lloyd Zhou
f8f6954115 Merge pull request #5779 from ConnectAI-E/feature/model/claude35haiku
add claude35haiku, without vision support
2024-11-11 13:13:09 +08:00
Lloyd Zhou
6e03f32871 Merge pull request #5795 from JingSyue/main
fix: built-in plugin dalle3 error #5787
2024-11-11 13:10:00 +08:00
JingSyue
18a6571883 Update proxy.ts
Update proxy.ts
2024-11-11 12:59:29 +08:00
Dogtiti
14f444e1f0 doc: realtime chat 2024-11-11 11:47:41 +08:00
JingSyue
2b0f2e5f9d fix: built-in plugin dalle3 error #5787 2024-11-10 10:28:25 +08:00
Dogtiti
4629b39c29 chore: comment context history 2024-11-09 16:22:01 +08:00
Dogtiti
d33e772fa5 feat: voice print 2024-11-08 22:39:17 +08:00
Dogtiti
89136fba32 feat: voice print 2024-11-08 22:18:39 +08:00
Dogtiti
8b4ca133fd feat: voice print 2024-11-08 22:02:31 +08:00
lloydzhou
a4c9eaf6cd do not save empty audio file 2024-11-08 13:43:13 +08:00
lloydzhou
50e63109a3 merge code and get analyser data 2024-11-08 13:21:40 +08:00
Dogtiti
48a1e8a584 chore: i18n 2024-11-07 21:32:47 +08:00
Dogtiti
e44ebe3f0e feat: realtime config 2024-11-07 21:28:23 +08:00
Lloyd Zhou
108069a0c6 Merge pull request #5788 from ConnectAI-E/fix-o1-maxtokens
chore: use max_completion_tokens for o1 models
2024-11-07 20:06:30 +08:00
DDMeaqua
d5bda2904d chore: use max_completion_tokens for o1 models 2024-11-07 19:45:27 +08:00
lloydzhou
283caba8ce stop streaming play after get input audio. 2024-11-07 18:57:57 +08:00
lloydzhou
b78e5db817 add temperature config 2024-11-07 17:55:51 +08:00
lloydzhou
46c469b2d7 add voice config 2024-11-07 17:47:55 +08:00
lloydzhou
c00ebbea4f update 2024-11-07 17:40:03 +08:00
lloydzhou
c526ff80b5 update 2024-11-07 17:23:20 +08:00
lloydzhou
0037b0c944 ts error 2024-11-07 17:03:04 +08:00
lloydzhou
6f81bb3b8a add context after connected 2024-11-07 16:56:15 +08:00
lloydzhou
7bdc45ed3e connect realtime model when open panel 2024-11-07 16:41:24 +08:00
Dogtiti
88cd3ac122 fix: ts error 2024-11-07 12:16:11 +08:00
Dogtiti
4988d2ee26 fix: ts error 2024-11-07 11:56:58 +08:00
lloydzhou
8deb7a92ee hotfix for update target session 2024-11-07 11:53:01 +08:00
lloydzhou
db060d732a upload save record wav file 2024-11-07 11:45:38 +08:00
lloydzhou
522627820a upload save wav file logic 2024-11-07 09:36:22 +08:00
lloydzhou
cf46d5ad63 upload response audio, and update audio_url to session message 2024-11-07 01:12:08 +08:00
Dogtiti
a4941521d0 feat: audio to message 2024-11-06 22:30:02 +08:00
Dogtiti
f6e1f8398b wip 2024-11-06 22:07:33 +08:00
Dogtiti
d544eead38 feat: realtime chat ui 2024-11-06 21:14:45 +08:00
Lloyd Zhou
fbb9385f23 Merge pull request #5782 from ConnectAI-E/style/classname
style: improve classname by clsx
2024-11-06 20:33:51 +08:00
Dogtiti
18144c3d9c chore: clsx 2024-11-06 20:16:38 +08:00
opchips
64aa760e58 update claude rank 2024-11-06 19:18:05 +08:00
Dogtiti
e0bbb8bb68 style: improve classname by clsx 2024-11-06 16:58:26 +08:00
opchips
6667ee1c7f merge main 2024-11-06 15:08:18 +08:00
Lloyd Zhou
6ded4e96e7 Merge pull request #5778 from ConnectAI-E/fix/5436
fix: botMessage reply date
2024-11-06 15:04:46 +08:00
Dogtiti
85cdcab850 fix: botMessage reply date 2024-11-06 14:53:08 +08:00
Lloyd Zhou
f4c9410c29 Merge pull request #5776 from ConnectAI-E/feat-glm
fix: glm chatpath
2024-11-06 14:02:20 +08:00
DDMeaqua
adf7d8200b fix: glm chatpath 2024-11-06 13:55:57 +08:00
opchips
3086a2fa77 add claude35haiku not vision 2024-11-06 12:56:24 +08:00
Lloyd Zhou
f526d6f560 Merge pull request #5774 from ConnectAI-E/feature/update-target-session
fix: updateCurrentSession => updateTargetSession
2024-11-06 11:16:33 +08:00
Dogtiti
106461a1e7 Merge branch 'main' of https://github.com/ConnectAI-E/ChatGPT-Next-Web into feature/update-target-session 2024-11-06 11:08:41 +08:00
Dogtiti
c4e19dbc59 fix: updateCurrentSession => updateTargetSession 2024-11-06 11:06:18 +08:00
Dogtiti
f3603e59fa Merge pull request #5769 from ryanhex53/fix-model-multi@
Custom model names can include the `@` symbol by itself.
2024-11-06 10:49:28 +08:00
ryanhex53
8e2484fcdf Refactor: Replace all provider split occurrences with getModelProvider utility method 2024-11-05 13:52:54 +00:00
lloydzhou
00d6cb27f7 update version
2024-11-05 17:42:55 +08:00
ryanhex53
b844045d23 Custom model names can include the @ symbol by itself.
To specify the model's provider, append it after the model name using `@` as before.

This format supports cases like `google vertex ai` with a model name like `claude-3-5-sonnet@20240620`.

For instance, `claude-3-5-sonnet@20240620@vertex-ai` will be split by `split(/@(?!.*@)/)` into:

`[ 'claude-3-5-sonnet@20240620', 'vertex-ai' ]`, where the former is the model name and the latter is the custom provider.
2024-11-05 07:44:12 +00:00
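The last-`@` split described in the commit message above can be sketched as follows. This is a minimal illustration of the `split(/@(?!.*@)/)` behavior; the helper name `splitModelProvider` is hypothetical and not a function from this repository:

```typescript
// Split a custom model string on the LAST "@" only, so model names that
// themselves contain "@" (e.g. vertex-ai versioned models) stay intact.
// The negative lookahead (?!.*@) rejects any "@" that is followed by
// another "@" later in the string.
function splitModelProvider(custom: string): [string, string | undefined] {
  const [model, provider] = custom.split(/@(?!.*@)/);
  return [model, provider];
}

console.log(splitModelProvider("claude-3-5-sonnet@20240620@vertex-ai"));
// → [ 'claude-3-5-sonnet@20240620', 'vertex-ai' ]
console.log(splitModelProvider("gpt-4o"));
// → [ 'gpt-4o', undefined ] (no "@", so no custom provider)
```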
Lloyd Zhou
e49fe976d9 Merge pull request #5765 from ConnectAI-E/feature/onfinish
feat: update real 'currentSession'
2024-11-05 15:07:52 +08:00
Dogtiti
14f751965f Merge pull request #5767 from ConnectAI-E/feat-glm
chore: update readme
2024-11-05 11:07:52 +08:00
DDMeaqua
0ec423389f chore: update readme 2024-11-05 11:06:20 +08:00
Dogtiti
820ab54e2d Merge pull request #5766 from ConnectAI-E/feature/add-claude-haiku3.5
Feature/add claude haiku3.5
2024-11-05 10:54:52 +08:00
lloydzhou
a6c1eb27a8 add claude 3.5 haiku 2024-11-05 10:23:15 +08:00
Lloyd Zhou
0dc4071ccc Merge pull request #5464 from endless-learner/main
Added 1-click deployment link for Alibaba Cloud.
2024-11-05 01:10:06 +08:00
Lloyd Zhou
4d3949718a merge main 2024-11-05 01:09:27 +08:00
Dogtiti
aef535f1a7 Merge pull request #5753 from ChatGPTNextWeb/feat-bt-doc
Feat bt doc
2024-11-04 21:41:11 +08:00
Dogtiti
686a80e727 Merge pull request #5764 from ChatGPTNextWeb/dependabot/npm_and_yarn/testing-library/react-16.0.1
chore(deps-dev): bump @testing-library/react from 16.0.0 to 16.0.1
2024-11-04 21:37:34 +08:00
Dogtiti
e49466fa05 feat: update real 'currentSession' 2024-11-04 21:25:56 +08:00
dependabot[bot]
4b93370814 chore(deps-dev): bump @testing-library/react from 16.0.0 to 16.0.1
Bumps [@testing-library/react](https://github.com/testing-library/react-testing-library) from 16.0.0 to 16.0.1.
- [Release notes](https://github.com/testing-library/react-testing-library/releases)
- [Changelog](https://github.com/testing-library/react-testing-library/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/react-testing-library/compare/v16.0.0...v16.0.1)

---
updated-dependencies:
- dependency-name: "@testing-library/react"
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-11-04 10:24:30 +00:00
Dogtiti
5733e3c588 Merge pull request #5759 from ConnectAI-E/feature/onfinish
Feature/onfinish
2024-11-04 17:16:44 +08:00
Dogtiti
44fc5b5cbf fix: onfinish responseRes 2024-11-04 17:00:45 +08:00
Dogtiti
2d3f7c922f fix: vision model dalle3 2024-11-04 15:51:04 +08:00
GH Action - Upstream Sync
fe8cca3730 Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web 2024-11-02 01:12:09 +00:00
weige
fbb7a1e853 fix 2024-11-01 18:20:16 +08:00
weige
fb2c15567d fix 2024-11-01 17:45:50 +08:00
weige
c2c52a1f60 fix 2024-11-01 17:35:34 +08:00
weige
106ddc17cd fix 2024-11-01 17:35:09 +08:00
weige
17d5209738 add bt install doc 2024-11-01 17:28:20 +08:00
Dogtiti
d66bfc6352 Merge pull request #5752 from ConnectAI-E/feat-glm
fix: ts error
2024-11-01 14:16:50 +08:00
DDMeaqua
4d75b23ed1 fix: ts error 2024-11-01 14:15:12 +08:00
Dogtiti
36bfa2ef7c Merge pull request #5741 from ConnectAI-E/feat-glm
feat: [#5714] support GLM
2024-11-01 13:57:30 +08:00
DDMeaqua
afe12c212e chore: update 2024-11-01 13:53:43 +08:00
GH Action - Upstream Sync
adf97c6d8b Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web 2024-11-01 01:18:59 +00:00
DDMeaqua
7a8d557ea3 chore: enable plugins 2024-10-31 11:37:19 +08:00
DDMeaqua
d3f0a77830 chore: update Provider 2024-10-31 11:23:06 +08:00
Dogtiti
0581e37236 Merge pull request #5744 from mrcore/main
add claude-3-5-sonnet-latest and claude-3-opus-latest
2024-10-31 11:19:34 +08:00
Core
44383a8b33 add claude-3-5-sonnet-latest and claude-3-opus-latest
add claude-3-5-sonnet-latest and claude-3-opus-latest
2024-10-31 11:00:45 +08:00
GH Action - Upstream Sync
7c466c9b9c Merge branch 'main' of https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web 2024-10-31 01:14:28 +00:00
Dogtiti
a0fa4d7e72 Merge pull request #5737 from hyiip/claude3.5
add constant to claude 3.5 sonnet 20241022
2024-10-31 00:13:16 +08:00
DDMeaqua
d357b45e84 feat: [#5714] support GLM 2024-10-30 19:24:03 +08:00
Lloyd Zhou
d0bd1bf8fd Merge pull request #5740 from yuxuan-ctrl/main
feat: add code config for Alibaba-family models
2024-10-30 16:56:53 +08:00
yuxuan-ctrl
86ffa1e643 feat: add code config for Alibaba-family models 2024-10-30 16:30:01 +08:00
endless-learner
b0d28eb77e Merge branch 'main' into main 2024-10-29 14:38:49 -07:00
hyiip
736cbdbdd1 add constant to claude 3.5 sonnet 20241022 2024-10-30 02:18:41 +08:00
Dogtiti
613d67eada Merge pull request #5729 from ConnectAI-E/feature/jest
chore: improve jest
2024-10-29 19:39:59 +08:00
Dogtiti
89cea18955 Merge branch 'main' of https://github.com/ConnectAI-E/ChatGPT-Next-Web into feature/jest 2024-10-29 19:26:52 +08:00
Dogtiti
56bc77d20b Merge pull request #5731 from ChatGPTNextWeb/dependabot/npm_and_yarn/testing-library/jest-dom-6.6.2
Bump @testing-library/jest-dom from 6.4.8 to 6.6.2
2024-10-28 21:52:08 +08:00
Dogtiti
6d93d37963 Merge pull request #5732 from ChatGPTNextWeb/dependabot/npm_and_yarn/types/jest-29.5.14
Bump @types/jest from 29.5.13 to 29.5.14
2024-10-28 21:51:59 +08:00
dependabot[bot]
24df85cf9d Bump @types/jest from 29.5.13 to 29.5.14
Bumps [@types/jest](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/jest) from 29.5.13 to 29.5.14.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/jest)

---
updated-dependencies:
- dependency-name: "@types/jest"
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-28 10:31:34 +00:00
dependabot[bot]
a4d7a2c6e3 Bump @testing-library/jest-dom from 6.4.8 to 6.6.2
Bumps [@testing-library/jest-dom](https://github.com/testing-library/jest-dom) from 6.4.8 to 6.6.2.
- [Release notes](https://github.com/testing-library/jest-dom/releases)
- [Changelog](https://github.com/testing-library/jest-dom/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/jest-dom/compare/v6.4.8...v6.6.2)

---
updated-dependencies:
- dependency-name: "@testing-library/jest-dom"
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-28 10:31:27 +00:00
Dogtiti
49d42bb45d chore: improve jest 2024-10-28 16:47:05 +08:00
Lloyd Zhou
4f49626303 Merge pull request #5722 from ElricLiu/main
Update README.md
2024-10-26 12:09:09 +08:00
ElricLiu
45db20c1c3 Update README.md 2024-10-26 11:16:43 +08:00
Lloyd Zhou
82994843f5 Merge pull request #5719 from ConnectAI-E/hotfix/status_text_error
hotfix for non-ISO-8859-1 statusText #5717
2024-10-25 20:34:15 +08:00
Dogtiti
1110a087a0 Merge pull request #5720 from ConnectAI-E/hotfix/gemini_invald_argument
hotfix for gemini invalid argument #5715
2024-10-25 18:25:46 +08:00
lloydzhou
f0b3e10a6c hotfix for gemini invalid argument #5715 2024-10-25 18:19:22 +08:00
lloydzhou
f89872b833 hotfix for gemini invalid argument #5715 2024-10-25 18:12:09 +08:00
lloydzhou
90ced92876 update 2024-10-25 18:05:29 +08:00
lloydzhou
2c74559010 hotfix 2024-10-25 18:02:51 +08:00
lloydzhou
e3ca7e8b44 hotfix for non-ISO-8859-1 statusText #5717 2024-10-25 17:52:08 +08:00
lloydzhou
4745706c42 update version to v2.15.6
2024-10-24 15:32:27 +08:00
lloydzhou
801dc412f9 add claude-3.5-haiku 2024-10-24 15:28:05 +08:00
Dogtiti
c7c2c0211a Merge pull request #5704 from ConnectAI-E/feature/xai
xAI support
2024-10-23 14:13:17 +08:00
lloydzhou
65bb962fc0 hotfix 2024-10-23 12:00:59 +08:00
lloydzhou
e791cd441d add xai 2024-10-23 11:55:25 +08:00
lloydzhou
8455fefc8a add xai 2024-10-23 11:40:06 +08:00
Lloyd Zhou
06f897f32f Merge pull request #5679 from ConnectAI-E/fix/fetch
fix: use tauri fetch
2024-10-16 22:02:16 +08:00
Dogtiti
deb1e76c41 fix: use tauri fetch 2024-10-16 21:57:07 +08:00
lloydzhou
463fa743e9 update version
2024-10-15 16:10:44 +08:00
Dogtiti
cda4494cec Merge pull request #5632 from ConnectAI-E/feature/H0llyW00dzZ-updater
Feature/h0lly w00dz z updater
2024-10-15 14:31:49 +08:00
lloydzhou
87d85c10c3 update 2024-10-14 21:48:36 +08:00
Dogtiti
22f83c9e11 Merge pull request #5666 from ChatGPTNextWeb/dependabot/npm_and_yarn/types/jest-29.5.13
Bump @types/jest from 29.5.12 to 29.5.13
2024-10-14 20:36:53 +08:00
dependabot[bot]
7f454cbcec Bump @types/jest from 29.5.12 to 29.5.13
Bumps [@types/jest](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/jest) from 29.5.12 to 29.5.13.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/jest)

---
updated-dependencies:
- dependency-name: "@types/jest"
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-14 10:49:46 +00:00
lloydzhou
426269d795 Merge remote-tracking branch 'connectai/main' into feature/H0llyW00dzZ-updater 2024-10-14 17:12:08 +08:00
Lloyd Zhou
370f143157 Merge pull request #5661 from ChatGPTNextWeb/remove-pr-preview
update test run target
2024-10-14 17:11:26 +08:00
lloydzhou
103106bb93 update test run target 2024-10-14 17:10:02 +08:00
lloydzhou
2419083adf Merge remote-tracking branch 'connectai/main' into feature/H0llyW00dzZ-updater 2024-10-14 17:04:12 +08:00
Lloyd Zhou
c25903bfb4 Merge pull request #5658 from ccq18/main
fix: change the timeout for o1-series models to 4 minutes
2024-10-14 16:57:29 +08:00
Lloyd Zhou
e34c266438 Merge pull request #5660 from ChatGPTNextWeb/remove-pr-preview
update deploy_preview run target
2024-10-14 16:55:48 +08:00
lloydzhou
8c39a687b5 update deploy_preview run target 2024-10-14 16:53:46 +08:00
ccq18
592f62005b only change the o1 timeout to 4 minutes, reducing request failures for o1-series models 2024-10-14 16:31:17 +08:00
ccq18
12e7caa209 fix: change the default timeout to 3 minutes, support o1-mini 2024-10-14 16:03:01 +08:00
Lloyd Zhou
b016771555 Merge pull request #5599 from ConnectAI-E/feature/allow-send-image-only
feat: allow send image only
2024-10-14 15:11:28 +08:00
Dogtiti
a84383f919 Merge pull request #5647 from code-october/fix/setting-locale
rename the "compression model" and add a feature hint for "generate conversation title"
2024-10-13 01:49:51 +08:00
code-october
7f68fb1ff2 rename the "compression model" and add a feature hint for "generate conversation title" 2024-10-12 16:49:24 +00:00
Dogtiti
8d2003fe68 Merge pull request #5644 from ConnectAI-E/fix/siderbar-style
fix: sidebar style
2024-10-12 14:56:01 +08:00
Dogtiti
9961b513cc fix: sidebar style 2024-10-12 14:54:22 +08:00
Dogtiti
819238acaf fix: i18n 2024-10-11 20:49:43 +08:00
Dogtiti
ad49916b1c Merge pull request #5638 from ConnectAI-E/chore/test-action
chore: improve test
2024-10-11 20:44:20 +08:00
Dogtiti
d18bd8a48a Merge pull request #5640 from code-october/feature/enableCodeFold
support enabling/disabling code folding from the frontend
2024-10-11 20:43:43 +08:00
code-october
4a1319f2c0 code security improvements 2024-10-11 11:57:23 +00:00
code-october
8fd843d228 standardize code per coderabbitai suggestions 2024-10-11 11:38:52 +00:00
code-october
6792d6e475 support enabling/disabling code folding from the frontend 2024-10-11 11:19:36 +00:00
Lloyd Zhou
c139038e01 Merge pull request #5639 from code-october/fix/auth-ui
improve the access code input box
2024-10-11 19:11:35 +08:00
code-october
4a7fd3a380 improve the homepage api input box 2024-10-11 10:36:11 +00:00
code-october
c98dc31cdf improve the access code input box 2024-10-11 09:03:20 +00:00
Dogtiti
bd43af3a8d chore: cache node_modules 2024-10-11 15:41:46 +08:00
Dogtiti
be98aa2078 chore: improve test 2024-10-11 15:17:38 +08:00
lloydzhou
a0d4a04192 update 2024-10-11 11:52:24 +08:00
lloydzhou
bd9de4dc4d fix version compare 2024-10-11 11:42:36 +08:00
lloydzhou
2eebfcf6fe client app tauri updater #2966 2024-10-11 11:29:22 +08:00
Lloyd Zhou
c5074f0aa4 Merge pull request #5581 from ConnectAI-E/feature/gemini-functioncall
google gemini support function call
2024-10-10 21:02:36 +08:00
Lloyd Zhou
ba58018a15 Merge pull request #5211 from ConnectAI-E/feature/jest
feat: jest
2024-10-10 21:02:05 +08:00
Lloyd Zhou
63ab83c3c8 Merge pull request #5621 from ConnectAI-E/hotfix/plugin-result
hotfix plugin result is not string #5614
2024-10-10 12:48:55 +08:00
lloydzhou
268cf3b606 hotfix plugin result is not string #5614 2024-10-10 12:47:25 +08:00
Lloyd Zhou
fbc68fa776 Merge pull request #5602 from PeterDaveHello/ImproveTwLocale
i18n: improve tw Traditional Chinese locale
2024-10-09 19:38:06 +08:00
lloydzhou
4ae34ea3ee merge main 2024-10-09 18:27:23 +08:00
Lloyd Zhou
96273fd75e Merge pull request #5611 from ConnectAI-E/feature/tauri-fetch-update
make sure get request_id before body chunk
2024-10-09 16:18:37 +08:00
lloydzhou
3e63d405c1 update 2024-10-09 16:12:01 +08:00
Lloyd Zhou
19b42aac5d Merge pull request #5608 from ConnectAI-E/fix-readme
fix: [#5574] readme
2024-10-09 14:49:34 +08:00
Lloyd Zhou
b67a23200e Merge pull request #5610 from ChatGPTNextWeb/lloydzhou-patch-1
Update README.md
2024-10-09 14:48:55 +08:00
Lloyd Zhou
1dac02e4d6 Update README.md 2024-10-09 14:48:43 +08:00
Lloyd Zhou
acad5b1d08 Merge pull request #5609 from ElricLiu/main
Update README.md
2024-10-09 14:45:27 +08:00
ElricLiu
4e9bb51d2f Update README.md 2024-10-09 14:43:49 +08:00
DDMeaqua
c0c8cdbbf3 fix: [#5574] documentation error 2024-10-09 14:36:58 +08:00
Lloyd Zhou
cbdc611b54 Merge pull request #5607 from ConnectAI-E/hotfix/summarize-model
fix compressModel, related #5426, fix #5606 #5603 #5575
2024-10-09 14:08:13 +08:00
lloydzhou
93ca303b6c fix ts error 2024-10-09 13:49:33 +08:00
lloydzhou
a925b424a8 fix compressModel, related #5426, fix #5606 #5603 #5575 2024-10-09 13:42:25 +08:00
Lloyd Zhou
5b4d423b58 Merge pull request #5565 from ConnectAI-E/feature/using-tauri-fetch
Feat: using tauri fetch api in App
2024-10-09 13:03:01 +08:00
lloydzhou
6c1cbe120c update 2024-10-09 11:46:49 +08:00
Peter Dave Hello
77a58bc4b0 i18n: improve tw Traditional Chinese locale 2024-10-09 03:14:38 +08:00
Dogtiti
7d55a6d0e4 feat: allow send image only 2024-10-08 21:31:01 +08:00
Dogtiti
8ad63a6c25 Merge pull request #5586 from little-huang/patch-1
fix: correct typo in variable name from ALLOWD_PATH to ALLOWED_PATH
2024-10-08 15:26:41 +08:00
Dogtiti
acf9fa36f9 Merge branch 'main' of https://github.com/ConnectAI-E/ChatGPT-Next-Web into feature/jest 2024-10-08 10:30:47 +08:00
Dogtiti
461154bb03 fix: format package 2024-10-08 10:29:42 +08:00
little_huang
cd75461f9e fix: correct typo in variable name from ALLOWD_PATH to ALLOWED_PATH 2024-10-07 10:30:25 +08:00
Dogtiti
2bac174e6f Merge pull request #4393 from ChatGPTNextWeb/dean-delete-escapeDollarNumber
bugfix: Delete the escapeDollarNumber function, which causes errors i…
2024-10-06 12:41:03 +08:00
Lloyd Zhou
65f80f81ad Merge branch 'main' into dean-delete-escapeDollarNumber 2024-10-04 14:31:00 +08:00
lloydzhou
450766a44b google gemini support function call 2024-10-03 20:28:15 +08:00
Lloyd Zhou
05e6e4bffb Merge pull request #5578 from code-october/fix/safe-equal
use safe equal operation
2024-10-03 10:59:32 +08:00
code-october
fbb66a4a5d use safe equal operation 2024-10-03 02:08:10 +00:00
lloydzhou
d51d31a559 update 2024-10-01 14:40:23 +08:00
lloydzhou
919ee51dca hover show errorMsg when plugin run error 2024-10-01 13:58:50 +08:00
lloydzhou
9c577ad9d5 hotfix for plugin runtime 2024-10-01 12:55:57 +08:00
lloydzhou
953114041b add connect timeout 2024-10-01 12:02:29 +08:00
lloydzhou
d830c23dab hotfix for run plugin call post api 2024-09-30 15:38:13 +08:00
lloydzhou
fd3568c459 hotfix for run plugin call post api 2024-09-30 15:33:40 +08:00
lloydzhou
3029dcb2f6 hotfix for run plugin call post api 2024-09-30 15:32:47 +08:00
lloydzhou
35e03e1bca remove code 2024-09-30 13:44:01 +08:00
Lloyd Zhou
cea5b91f96 Merge pull request #5567 from ChatGPTNextWeb/fix-readme
update readme
2024-09-30 13:31:34 +08:00
lyf
d2984db6e7 fix readme 2024-09-30 13:28:14 +08:00
lyf
deb215ccd1 fix readme 2024-09-30 13:23:24 +08:00
lloydzhou
7173cf2184 update 2024-09-30 13:07:06 +08:00
Lloyd Zhou
0c697e123d Merge pull request #5564 from code-october/fix/html-code
fix quoteEnd extract regex
2024-09-30 13:06:52 +08:00
lloydzhou
edfa6d14ee update 2024-09-30 10:23:24 +08:00
lloydzhou
b6d9ba93fa update 2024-09-30 10:18:30 +08:00
lloydzhou
6293b95a3b update default api base url 2024-09-30 10:13:11 +08:00
lloydzhou
ef4665cd8b update 2024-09-30 02:57:51 +08:00
lloydzhou
8030e71a5a update 2024-09-30 02:33:02 +08:00
lloydzhou
f42488d4cb using stream fetch replace old tauri http fetch 2024-09-30 02:28:19 +08:00
lloydzhou
af49ed4fdc update 2024-09-30 01:51:14 +08:00
lloydzhou
b174a40634 update 2024-09-30 01:44:27 +08:00
lloydzhou
3c01738c29 update 2024-09-30 01:37:16 +08:00
lloydzhou
9be58f3eb4 fix ts error 2024-09-30 01:30:20 +08:00
lloydzhou
a50c282d01 remove DEFAULT_API_HOST 2024-09-30 01:19:20 +08:00
lloydzhou
5141145e4d revert plugin runtime using tarui/api/http, not using fetch_stream 2024-09-30 00:58:50 +08:00
lloydzhou
b5f6e5a598 update 2024-09-30 00:38:30 +08:00
lloydzhou
7df308d655 Merge remote-tracking branch 'connectai/main' into feature/using-tauri-fetch 2024-09-29 23:36:17 +08:00
code-october
f5ad51a35e fix quoteEnd extract regex 2024-09-29 14:29:42 +00:00
lloydzhou
f9d4105170 stash code 2024-09-29 21:47:38 +08:00
lloydzhou
9e6ee50fa6 using stream_fetch in App 2024-09-29 20:32:36 +08:00
lloydzhou
dd77ad5d74 Merge remote-tracking branch 'connectai/main' into feature/using-tauri-fetch 2024-09-29 19:44:28 +08:00
lloydzhou
3898c507c4 using stream_fetch in App 2024-09-29 19:44:09 +08:00
Lloyd Zhou
fcba50f041 Merge pull request #5547 from ConnectAI-E/hotfix/plugin-opration-id
Hotfix/plugin operation
2024-09-29 16:15:02 +08:00
Lloyd Zhou
452fc86ad1 Merge pull request #5562 from ChatGPTNextWeb/hotfix-google-api
hotfix for `x-goog-api-key`
2024-09-29 15:57:20 +08:00
lloydzhou
5bdf411399 hotfix for x-goog-api-key 2024-09-29 15:51:28 +08:00
lloydzhou
2d920f7ccc using stream: schema to fetch in App 2024-09-28 15:05:41 +08:00
lloydzhou
d84d51b475 using sse: schema to fetch in App 2024-09-28 01:19:39 +08:00
Dogtiti
f9d6f4f9da Merge pull request #5553 from ConnectAI-E/fix/default-model
fix: default model
2024-09-27 21:13:26 +08:00
Lloyd Zhou
a13bd624e8 Merge pull request #5552 from joetsuihk/hotfix/upstream-sync-doc
docs: Hotfix/upstream sync doc update
2024-09-27 20:36:16 +08:00
Joe
8fb019b2e2 revert, leave sync.yml untouched
revert commit 19c4ed4463
2024-09-27 17:34:38 +08:00
Joe
2f3457e73d Update correct links to manual code update section (JP) 2024-09-27 17:33:02 +08:00
Dogtiti
c6ebd6e73c fix: default model 2024-09-27 17:00:24 +08:00
Joe
2333a47c55 Update links in doc to manual code update section (CN) 2024-09-27 16:50:51 +08:00
Joe
b35895b551 Update correct links to manual code update section 2024-09-27 16:49:08 +08:00
Joe
19c4ed4463 docs links updated sync.yml
https://github.com/Yidadaa/ChatGPT-Next-Web is renamed to https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/
2024-09-27 16:43:50 +08:00
lloydzhou
22aa1698b4 try using method and path when operationId is undefined #5525 2024-09-27 13:31:49 +08:00
lloydzhou
07d089a2bd try using method and path when operationId is undefined #5525 2024-09-27 13:31:07 +08:00
Dogtiti
870ad913cc Merge pull request #5545 from ConnectAI-E/hotfix/google-auth-header
fix: build error
2024-09-27 11:43:43 +08:00
Dogtiti
3fb389551b fix: build error 2024-09-27 11:42:16 +08:00
Dogtiti
d12a4adfb5 Merge pull request #5541 from ConnectAI-E/hotfix/google-auth-header
google api using `x-google-api-key` header
2024-09-27 11:04:10 +08:00
lloydzhou
702e17c96b google api using x-google-api-key header 2024-09-26 23:21:42 +08:00
Lloyd Zhou
93ff7d26cc Merge pull request #5529 from Leizhenpeng/support-saas-readme
Support saas version in readme
2024-09-25 16:34:25 +08:00
river
13777786c4 chore: ja 2024-09-25 16:30:26 +08:00
river
6655c64e55 chore: cn 2024-09-25 16:29:59 +08:00
Dogtiti
7f3f6f1aaf Merge pull request #5528 from ChatGPTNextWeb/add_tip_top
fix url i18n
2024-09-25 16:21:33 +08:00
mayfwl
ea04595c5e Merge branch 'main' into add_tip_top 2024-09-25 16:19:55 +08:00
lyf
13c68bd810 fix url utm 2024-09-25 16:11:57 +08:00
lyf
1d2f44fba8 fix url 2024-09-25 16:00:48 +08:00
lloydzhou
68702bfb1f update version 2024-09-25 15:49:20 +08:00
lyf
e83f61e74d fix 2024-09-25 15:21:17 +08:00
lyf
10d472e79e fix 2024-09-25 14:41:41 +08:00
lyf
a6b920d9af fix 2024-09-25 14:35:09 +08:00
endless-learner
064e964d75 Updated the Alibaba Cloud deploy link so it is readable without logging in and allows choosing a region. 2024-09-24 23:05:32 -07:00
endless-learner
47fb40d572 Merge branch 'ChatGPTNextWeb:main' into main 2024-09-24 23:03:03 -07:00
river
b7892b58f5 chore: support saas 2024-09-25 13:34:04 +08:00
lyf
77f037a3c4 add analytics tracking (maidian) 2024-09-25 13:08:03 +08:00
lyf
248d27680d fix style 2024-09-25 11:15:00 +08:00
Lloyd Zhou
4c84182e7a Merge pull request #5522 from ConnectAI-E/fix/5494
fix: prevent title update on invalid message response
2024-09-25 10:53:00 +08:00
endless-learner
9e18cc260b Update README.md
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2024-09-24 13:55:00 -07:00
Dogtiti
e8581c8f3c fix: prevent title update on invalid message response 2024-09-25 00:37:37 +08:00
lyf
fe4cba8baf fix select button style 2024-09-24 22:25:36 +08:00
Dogtiti
9bbd7d3185 Merge pull request #5519 from ConnectAI-E/feature-play-audio-and-video
Feature play audio and video
2024-09-24 22:03:55 +08:00
lloydzhou
dbabb2c403 auto play video/audio 2024-09-24 18:52:54 +08:00
lloydzhou
6c37d04591 auto play video/audio 2024-09-24 18:52:48 +08:00
Dogtiti
649c5be64e Merge pull request #5508 from ConnectAI-E/feature-buildin-plugin
add buildin plugin
2024-09-24 17:28:48 +08:00
Dogtiti
fc0042a799 Merge pull request #5515 from DDMeaqua/config-artifacts
Config artifacts
2024-09-24 17:27:33 +08:00
DDMeaqua
269d064e0a fix: #5450 2024-09-24 15:21:27 +08:00
DDMeaqua
6c8143b7de feat: global setting to enable/disable artifacts 2024-09-24 15:15:08 +08:00
lloydzhou
f9f99639db update 2024-09-24 12:59:21 +08:00
lyf
6d5bf490ab fix media 2024-09-24 11:36:26 +08:00
Dogtiti
46fc2a5012 Merge pull request #5498 from DDMeaqua/fix-plugin-css
Fix plugin css
2024-09-24 10:23:06 +08:00
lloydzhou
90e7b5aecf try using openai api key for dalle-3 plugin 2024-09-23 20:20:20 +08:00
lloydzhou
ed20fd2962 1. add buildin plugin; 2. remove usingProxy 2024-09-23 20:00:07 +08:00
Lloyd Zhou
4c3fd55a75 Merge pull request #5495 from ConnectAI-E/Fix-code-duplication
Fix code duplication
2024-09-23 16:06:59 +08:00
lyf
d95d509046 fix 2024-09-23 15:43:36 +08:00
lyf
c15c852668 fix media 2024-09-23 15:12:45 +08:00
DDMeaqua
4a60512ae7 chore: css 2024-09-23 14:18:32 +08:00
DDMeaqua
0e210cf8de fix: #5486 improve plugin styles 2024-09-23 14:13:09 +08:00
lyf
35aa2c7270 Fix code duplication 2024-09-23 11:34:20 +08:00
lyf
518e0d90a5 fix url 2024-09-23 11:11:36 +08:00
lyf
51f7b02b27 fix en 2024-09-23 10:56:43 +08:00
Lloyd Zhou
23f2b6213c Merge pull request #5489 from ConnectAI-E/feature-fix-openai-function-call
Feature fix openai function call
2024-09-22 19:08:35 +08:00
lloydzhou
3a969054e3 hotfix openai function call tool_calls no index 2024-09-22 18:59:49 +08:00
lloydzhou
4d1f9e49d4 hotfix openai function call tool_calls no index 2024-09-22 18:53:51 +08:00
lyf
702f5bd362 fix setCookie 2024-09-20 10:22:22 +08:00
Dogtiti
2474d5b6d2 Merge pull request #5304 from dustookk/main
fix no max_tokens in payload when the vision model name does not cont…
2024-09-19 20:34:23 +08:00
lyf
9858d1f958 add top tip 2024-09-19 18:21:26 +08:00
lyf
62efab987b add fanyi add top tip 2024-09-19 16:58:51 +08:00
DDMeaqua
4c63ee23cd feat: #5422 keyboard shortcut to clear context 2024-09-19 15:13:33 +08:00
Dogtiti
c75d9e3de4 Merge pull request #5463 from yudshj/main
fixed typo: WHITE_WEBDEV_ENDPOINTS -> WHITE_WEBDAV_ENDPOINTS
2024-09-19 14:26:48 +08:00
Yudong
df222ded12 fixed typo: WebDev -> WebDav 2024-09-19 14:15:31 +08:00
river
7dc0f81d3f chore: change placeholder 2024-09-19 13:48:59 +08:00
river
23793e834d chore: change placeholder 2024-09-19 12:53:43 +08:00
river
f4f3c6ad5a chore: change placeholder 2024-09-19 12:47:09 +08:00
river
775794e0e4 chore: add setting 2024-09-19 12:38:46 +08:00
endless-learner
03268ce4d8 Added 1-click deployment link for Alibaba Cloud. 2024-09-18 20:38:20 -07:00
Yudong
212d15fdd0 fixed typo: WHITE_WEBDEV_ENDPOINTS -> WHITE_WEBDAV_ENDPOINTS 2024-09-19 11:20:18 +08:00
river
065f015f7b feat: add error tip 2024-09-19 09:53:00 +08:00
Dogtiti
b5ba05dd83 Merge pull request #5462 from JuliusMoehring/main
fix: Avoid fetching prompts.json serverside
2024-09-19 09:50:21 +08:00
river
e4fda6cacf feat: add auth tip 2024-09-19 08:41:09 +08:00
JuliusMoehring
accb526cd6 Avoid fetching prompts.json serverside 2024-09-18 18:07:10 +02:00
river
2f0d94a46b chore: add auth tip 2024-09-19 00:05:06 +08:00
river
8dc24403d8 chore: Update Chinese translation for API key placeholder 2024-09-18 22:05:51 +08:00
Dogtiti
a8c70d84a9 Merge pull request #5459 from DDMeaqua/tts
add tts
2024-09-18 15:42:16 +08:00
DDMeaqua
10d7a64f88 fix: error 2024-09-18 15:37:21 +08:00
Dogtiti
d51bbb4a81 Merge pull request #5444 from skymkmk/pr-fix-model-config-hydration
fix: config hydration and default model forced to set gpt-3.5-turbo
2024-09-18 15:26:47 +08:00
Dogtiti
848f794149 Merge pull request #5402 from DDMeaqua/fix-selector-css
fix: selector css
2024-09-18 15:19:08 +08:00
DDMeaqua
7f1b44befe fix: css 2024-09-18 15:04:41 +08:00
Dogtiti
9ddd5a0566 Merge pull request #5432 from ConnectAI-E/Feature-fork
feat fork
2024-09-18 15:04:27 +08:00
DDMeaqua
a3b664763e chore: default header 2024-09-18 14:57:43 +08:00
lyf
fd47bc1dc3 Add English copy 2024-09-18 13:56:44 +08:00
DDMeaqua
dfaafe3adb Merge branch 'main' into tts 2024-09-18 13:48:28 +08:00
Dogtiti
b4dc4d34eb Merge pull request #5112 from DDDDD12138/remove-unused-imports
Chore: Remove Unused Imports and Add ESLint Rules
2024-09-18 11:37:47 +08:00
DDMeaqua
3ae8ec1af6 feat: tts 2024-09-18 11:24:25 +08:00
Lloyd Zhou
5c34666334 Merge pull request #5454 from SukkaW/ci-bump
ci: bump `actions/cache` to v4
2024-09-18 11:13:59 +08:00
DDMeaqua
212605a7e3 Merge branch 'main' into tts-stt 2024-09-18 10:39:56 +08:00
SukkaW
4ddfa9af8d ci: bump actions/cache to v4 2024-09-17 22:28:13 +08:00
skymkmk
36a0c7b8a3 fix: default is forced to gpt-3.5-turbo if no server default model has been set 2024-09-16 02:07:22 +08:00
skymkmk
9e1e0a7252 fix: persisted available models are not updated after the source code has been updated 2024-09-16 02:06:17 +08:00
Dogtiti
027e5adf67 Merge pull request #5442 from DDDDD12138/fix-typo
chore: correct typo
2024-09-15 22:12:59 +08:00
DDDDD12138
e986088bec chore: correct typo 2024-09-15 21:59:21 +08:00
DDDDD12138
63ffd473d5 chore: remove unused imports 2024-09-15 20:17:02 +08:00
Dogtiti
9e5d92dc58 Merge pull request #5438 from DDMeaqua/fix-AnthropicCros
fix: ts error
2024-09-15 14:23:29 +08:00
Meaqua
8ac9141a29 fix: ts error 2024-09-15 14:21:27 +08:00
Dogtiti
313c942350 Merge pull request #5435 from DDMeaqua/fix-AnthropicCros
fix: #5429 Anthropic authentication_error CORS
2024-09-15 10:07:24 +08:00
Dogtiti
26c3edd023 Merge pull request #5430 from skymkmk/pr-manual-regen-title
feat: manual regen title
2024-09-14 18:10:32 +08:00
DDMeaqua
9a5a3d4ce4 fix: #5429 Anthropic authentication_error CORS 2024-09-14 16:06:18 +08:00
lyf
6e79b9a7a2 add fork 2024-09-14 14:19:11 +08:00
Dogtiti
b32d82e6c1 Merge pull request #5426 from skymkmk/pr-summarize-customization
feat: summarize model customization
2024-09-14 14:18:25 +08:00
wuzhiqing
a3585685df chore: add ESLint plugin and rules to remove unused imports
- Installed eslint-plugin-unused-imports
- Updated .eslintrc.json to include rules for detecting unused imports
2024-09-14 13:54:51 +08:00
Dogtiti
f379865e2c Merge pull request #5431 from tuanzisama/tuanzisama-patch-1
feat: Improve setting.model selector
2024-09-14 10:05:38 +08:00
Dogtiti
4eb4c31438 Merge pull request #5428 from Dakai/main
Add a space between model and provider in ModelSelector to improve readability.
2024-09-14 09:57:52 +08:00
evenwan
84a7afcd94 feat: Improve setting.model selector 2024-09-14 09:31:05 +08:00
skymkmk
fa48ace39b fix: prevent users from setting an extremely short history that results in no content being sent for the title summary 2024-09-14 07:49:26 +08:00
skymkmk
1b869d9305 translation: translations by claude for new writings 2024-09-14 07:42:06 +08:00
skymkmk
93bc2f5870 feat: now user can choose their own summarize model 2024-09-14 07:41:27 +08:00
skymkmk
37c0cfe1e9 translation: translations by claude for manual refresh 2024-09-14 07:28:07 +08:00
skymkmk
fc27441561 feat: manual refresh for title 2024-09-14 07:21:19 +08:00
MatrixDynamo
79cfbac11f Add a space between model and provider in ModelSelector to improve readability. 2024-09-14 03:19:24 +08:00
lloydzhou
df62736ff6 update version 2024-09-13 17:36:32 +08:00
Dogtiti
6a464b3e5f Merge pull request #5418 from ConnectAI-E/hotfix/artifact
fixed: html codeblock includes 2 newlines
2024-09-13 17:32:21 +08:00
Lloyd Zhou
57fcda80df Merge pull request #5419 from DDMeaqua/feat-shortcutkey
chore: hide shortcut key hints on mobile
2024-09-13 17:31:22 +08:00
DDMeaqua
db39fbc419 chore: hide shortcut key hints on mobile 2024-09-13 16:56:06 +08:00
lloydzhou
3dabe47c78 fixed: html codeblock includes 2 newlines 2024-09-13 16:27:02 +08:00
Dogtiti
affc194cde Merge pull request #5416 from skymkmk/pr-add-o1
feat: add o1 model
2024-09-13 16:26:42 +08:00
skymkmk
03fa580a55 fix: give o1 some time to think twice 2024-09-13 16:25:04 +08:00
skymkmk
d0dce654bf fix: shouldStream does not depend on isO1 2024-09-13 14:18:18 +08:00
Dogtiti
169323e238 Merge pull request #5415 from skymkmk/pr-fix-incorrect-vision-model-judgement
fix: remove the visual model judgment method that checks if the model…
2024-09-13 13:57:37 +08:00
skymkmk
71df415b14 feat: add o1 model 2024-09-13 13:34:49 +08:00
skymkmk
6bb01bc564 fix: remove the visual model judgment method that checks if the model name contains 'preview' from the openai api to prevent models like o1-preview from being classified as visual models 2024-09-13 12:56:28 +08:00
Dogtiti
07c6fe5975 Merge pull request #5406 from ChatGPTNextWeb/river
fix: #4240 remove tip when 0 context
2024-09-12 21:48:11 +08:00
Dogtiti
5964181cb2 Merge pull request #5405 from xmcp/patch-1
fix typo in UI
2024-09-12 21:47:39 +08:00
river
4b8288a2b2 fix: #4240 remove tip when 0 context 2024-09-12 21:05:02 +08:00
xmcp
88b1c1c6a5 fix typo 2024-09-12 19:54:16 +08:00
Dogtiti
1234deabfa Merge pull request #5396 from DDMeaqua/feat-shortcutkey
feat: add shortcut key
2024-09-12 13:57:24 +08:00
DDMeaqua
ebaeb5a0d5 fix: selector css 2024-09-11 17:54:48 +08:00
DDMeaqua
45306bbb6c fix: ts error 2024-09-10 14:37:58 +08:00
DDMeaqua
18e2403b01 chore: replace icon 2024-09-10 14:30:51 +08:00
DDMeaqua
e578c5f3ad chore: update styles 2024-09-10 12:01:51 +08:00
DDMeaqua
61245e3d7e fix: dark theme css 2024-09-09 19:29:10 +08:00
DDMeaqua
7804182d0d fix: type error 2024-09-09 19:18:12 +08:00
DDMeaqua
f2195154f6 feat: add shortcut key 2024-09-09 18:55:37 +08:00
Dogtiti
35f77f45a2 Merge pull request #5386 from ConnectAI-E/feature/safeLocalStorage
fix: safeLocalStorage
2024-09-09 16:48:25 +08:00
Dogtiti
992c3a5d3a fix: safeLocalStorage 2024-09-08 13:23:40 +08:00
mayfwl
d51d7b6797 Merge pull request #5376 from MrrDrr/add_chatgpt_4o_latest
add chatgpt-4o-latest
2024-09-08 10:15:41 +08:00
l.tingting
c1b74201e4 add chatgpt-4o-latest 2024-09-07 01:42:56 +08:00
DDMeaqua
c5168c2132 fix: i18n 2024-08-28 13:15:52 +08:00
DDMeaqua
318e0989a2 fix: transcription headers 2024-08-28 13:13:41 +08:00
DDMeaqua
d8b1781d7b Merge branch 'tts-stt' of https://github.com/DDMeaqua/ChatGPT-Next-Web into tts-stt 2024-08-28 12:40:20 +08:00
DDMeaqua
ed5aea0521 fix: bug 2024-08-28 12:37:19 +08:00
Meaqua
e9f90a4d82 fix: i18n 2024-08-27 21:49:00 +08:00
DDMeaqua
f86b220c92 feat: add voice action 2024-08-27 19:50:16 +08:00
DDMeaqua
93f1762e6c chore: wip 2024-08-27 17:02:44 +08:00
DDMeaqua
2f410fc09f feat: add tts stt 2024-08-27 16:21:02 +08:00
yihang3
56eb9d1430 fix no max_tokens in payload when the vision model name does not contain 'vision'. 2024-08-21 15:22:31 +08:00
Dogtiti
1287e39cc6 feat: run test before build 2024-08-06 19:24:47 +08:00
Dogtiti
1ef2aa35e9 feat: jest 2024-08-06 18:03:27 +08:00
butterfly
4d6b981a54 bugfix: Delete the escapeDollarNumber function, which causes errors in rendering a LaTeX string 2024-03-26 11:43:55 +08:00
223 changed files with 36048 additions and 4115 deletions

View File

@@ -1,12 +1,20 @@
# Your openai api key. (required)
OPENAI_API_KEY=sk-xxxx
# DeepSeek Api Key. (Optional)
DEEPSEEK_API_KEY=
# Access password, separated by comma. (optional)
CODE=your-password
# You can start service behind a proxy. (optional)
PROXY_URL=http://localhost:7890
# Enable MCP functionality (optional)
# Default: Empty (disabled)
# Set to "true" to enable MCP functionality
ENABLE_MCP=
# (optional)
# Default: Empty
# Google Gemini Pro API key, set if you want to use Google Gemini Pro API.
@@ -66,4 +74,10 @@ ANTHROPIC_API_VERSION=
ANTHROPIC_URL=
### (optional)
WHITE_WEBDEV_ENDPOINTS=
WHITE_WEBDAV_ENDPOINTS=
### siliconflow Api key (optional)
SILICONFLOW_API_KEY=
### siliconflow Api url (optional)
SILICONFLOW_URL=

View File

@@ -1 +1,3 @@
public/serviceWorker.js
public/serviceWorker.js
app/mcp/mcp_config.json
app/mcp/mcp_config.default.json

View File

@@ -1,4 +1,7 @@
{
"extends": "next/core-web-vitals",
"plugins": ["prettier"]
"plugins": ["prettier", "unused-imports"],
"rules": {
"unused-imports/no-unused-imports": "warn"
}
}

View File

@@ -3,9 +3,7 @@ name: VercelPreviewDeployment
on:
pull_request_target:
types:
- opened
- synchronize
- reopened
- review_requested
env:
VERCEL_TEAM: ${{ secrets.VERCEL_TEAM }}
@@ -49,7 +47,7 @@ jobs:
run: npm install --global vercel@latest
- name: Cache dependencies
uses: actions/cache@v2
uses: actions/cache@v4
id: cache-npm
with:
path: ~/.npm

.github/workflows/test.yml vendored Normal file
View File

@@ -0,0 +1,39 @@
name: Run Tests
on:
push:
branches:
- main
tags:
- "!*"
pull_request:
types:
- review_requested
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v3
with:
node-version: 18
cache: "yarn"
- name: Cache node_modules
uses: actions/cache@v4
with:
path: node_modules
key: ${{ runner.os }}-node_modules-${{ hashFiles('**/yarn.lock') }}
restore-keys: |
${{ runner.os }}-node_modules-
- name: Install dependencies
run: yarn install
- name: Run Jest tests
run: yarn test:ci

.gitignore vendored
View File

@@ -46,3 +46,6 @@ dev
*.key.pub
masks.json
# mcp config
app/mcp/mcp_config.json

Chebichat.md Normal file
View File

@@ -0,0 +1,2 @@
<!-- comment detail in vietnamese -->
app/store/config.ts

View File

@@ -34,12 +34,16 @@ ENV PROXY_URL=""
ENV OPENAI_API_KEY=""
ENV GOOGLE_API_KEY=""
ENV CODE=""
ENV ENABLE_MCP=""
COPY --from=builder /app/public ./public
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static
COPY --from=builder /app/.next/server ./.next/server
RUN mkdir -p /app/app/mcp && chmod 777 /app/app/mcp
COPY --from=builder /app/app/mcp/mcp_config.default.json /app/app/mcp/mcp_config.json
EXPOSE 3000
CMD if [ -n "$PROXY_URL" ]; then \

View File

@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2023-2024 Zhang Yifei
Copyright (c) 2023-2025 NextChat
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

README.md
View File

@@ -1,39 +1,59 @@
<div align="center">
<a href='#企业版'>
<img src="./docs/images/ent.svg" alt="icon"/>
<a href='https://nextchat.club'>
<img src="https://github.com/user-attachments/assets/83bdcc07-ae5e-4954-a53a-ac151ba6ccf3" width="1000" alt="icon"/>
</a>
<h1 align="center">NextChat (ChatGPT Next Web)</h1>
<h1 align="center">NextChat</h1>
English / [简体中文](./README_CN.md)
One-Click to get a well-designed cross-platform ChatGPT web UI, with GPT3, GPT4 & Gemini Pro support.
<a href="https://trendshift.io/repositories/5973" target="_blank"><img src="https://trendshift.io/api/badge/repositories/5973" alt="ChatGPTNextWeb%2FChatGPT-Next-Web | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
一键免费部署你的跨平台私人 ChatGPT 应用, 支持 GPT3, GPT4 & Gemini Pro 模型。
✨ Light and Fast AI Assistant, with Claude, DeepSeek, GPT4 & Gemini Pro support.
[![Saas][Saas-image]][saas-url]
[![Web][Web-image]][web-url]
[![Windows][Windows-image]][download-url]
[![MacOS][MacOS-image]][download-url]
[![Linux][Linux-image]][download-url]
[Web App](https://app.nextchat.dev/) / [Desktop App](https://github.com/Yidadaa/ChatGPT-Next-Web/releases) / [Discord](https://discord.gg/YCkeafCafC) / [Enterprise Edition](#enterprise-edition) / [Twitter](https://twitter.com/NextChatDev)
[NextChatAI](https://nextchat.club?utm_source=readme) / [iOS APP](https://apps.apple.com/us/app/nextchat-ai/id6743085599) / [Web App Demo](https://app.nextchat.club) / [Desktop App](https://github.com/Yidadaa/ChatGPT-Next-Web/releases) / [Enterprise Edition](#enterprise-edition)
[网页版](https://app.nextchat.dev/) / [客户端](https://github.com/Yidadaa/ChatGPT-Next-Web/releases) / [企业版](#%E4%BC%81%E4%B8%9A%E7%89%88) / [反馈](https://github.com/Yidadaa/ChatGPT-Next-Web/issues)
[web-url]: https://app.nextchat.dev/
[saas-url]: https://nextchat.club?utm_source=readme
[saas-image]: https://img.shields.io/badge/NextChat-Saas-green?logo=microsoftedge
[web-url]: https://app.nextchat.club/
[download-url]: https://github.com/Yidadaa/ChatGPT-Next-Web/releases
[Web-image]: https://img.shields.io/badge/Web-PWA-orange?logo=microsoftedge
[Windows-image]: https://img.shields.io/badge/-Windows-blue?logo=windows
[MacOS-image]: https://img.shields.io/badge/-MacOS-black?logo=apple
[Linux-image]: https://img.shields.io/badge/-Linux-333?logo=ubuntu
[<img src="https://vercel.com/button" alt="Deploy on Zeabur" height="30">](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FChatGPTNextWeb%2FChatGPT-Next-Web&env=OPENAI_API_KEY&env=CODE&project-name=nextchat&repository-name=NextChat) [<img src="https://zeabur.com/button.svg" alt="Deploy on Zeabur" height="30">](https://zeabur.com/templates/ZBUEFA) [<img src="https://gitpod.io/button/open-in-gitpod.svg" alt="Open in Gitpod" height="30">](https://gitpod.io/#https://github.com/Yidadaa/ChatGPT-Next-Web)
[<img src="https://zeabur.com/button.svg" alt="Deploy on Zeabur" height="30">](https://zeabur.com/templates/ZBUEFA) [<img src="https://vercel.com/button" alt="Deploy on Vercel" height="30">](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FChatGPTNextWeb%2FChatGPT-Next-Web&env=OPENAI_API_KEY&env=CODE&project-name=nextchat&repository-name=NextChat) [<img src="https://gitpod.io/button/open-in-gitpod.svg" alt="Open in Gitpod" height="30">](https://gitpod.io/#https://github.com/ChatGPTNextWeb/NextChat)
[<img src="https://github.com/user-attachments/assets/903482d4-3e87-4134-9af1-f2588fa90659" height="60" width="288" >](https://monica.im/?utm=nxcrp)
[<img src="https://github.com/user-attachments/assets/903482d4-3e87-4134-9af1-f2588fa90659" height="50" width="" >](https://monica.im/?utm=nxcrp)
</div>
## 🥳 Cheer for NextChat iOS Version Online!
> [👉 Click Here to Install Now](https://apps.apple.com/us/app/nextchat-ai/id6743085599)
> [❤️ Source Code Coming Soon](https://github.com/ChatGPTNextWeb/NextChat-iOS)
![Github iOS Image](https://github.com/user-attachments/assets/e0aa334f-4c13-4dc9-8310-e3b09fa4b9f3)
## 🫣 NextChat Support MCP !
> Before build, please set env ENABLE_MCP=true
<img src="https://github.com/user-attachments/assets/d8851f40-4e36-4335-b1a4-ec1e11488c7e"/>
## Enterprise Edition
Meeting Your Company's Privatization and Customization Deployment Requirements:
@@ -47,20 +67,12 @@ Meeting Your Company's Privatization and Customization Deployment Requirements:
For enterprise inquiries, please contact: **business@nextchat.dev**
## 企业版
## Screenshots
满足企业用户私有化部署和个性化定制需求:
- **品牌定制**:企业量身定制 VI/UI,与企业品牌形象无缝契合
- **资源集成**:由企业管理人员统一配置和管理数十种 AI 资源,团队成员开箱即用
- **权限管理**:成员权限、资源权限、知识库权限层级分明,企业级 Admin Panel 统一控制
- **知识接入**:企业内部知识库与 AI 能力相结合,比通用 AI 更贴近企业自身业务需求
- **安全审计**:自动拦截敏感提问,支持追溯全部历史对话记录,让 AI 也能遵循企业信息安全规范
- **私有部署**:企业级私有部署,支持各类主流私有云部署,确保数据安全和隐私保护
- **持续更新**:提供多模态、智能体等前沿能力持续更新升级服务,常用常新、持续先进
![Settings](./docs/images/settings.png)
企业版咨询: **business@nextchat.dev**
![More](./docs/images/more.png)
<img width="300" src="https://github.com/user-attachments/assets/3daeb7b6-ab63-4542-9141-2e4a12c80601">
## Features
@@ -93,10 +105,12 @@ For enterprise inquiries, please contact: **business@nextchat.dev**
- [x] Artifacts: Easily preview, copy and share generated content/webpages through a separate window [#5092](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/pull/5092)
- [x] Plugins: support network search, calculator, any other apis etc. [#165](https://github.com/Yidadaa/ChatGPT-Next-Web/issues/165) [#5353](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/issues/5353)
- [x] network search, calculator, any other apis etc. [#165](https://github.com/Yidadaa/ChatGPT-Next-Web/issues/165) [#5353](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/issues/5353)
- [x] Supports Realtime Chat [#5672](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/issues/5672)
- [ ] local knowledge base
## What's New
- 🚀 v2.15.8 Now supports Realtime Chat [#5672](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/issues/5672)
- 🚀 v2.15.4 The Application supports using Tauri fetch LLM API, MORE SECURITY! [#5379](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/issues/5379)
- 🚀 v2.15.0 Now supports Plugins! Read this: [NextChat-Awesome-Plugins](https://github.com/ChatGPTNextWeb/NextChat-Awesome-Plugins)
- 🚀 v2.14.0 Now supports Artifacts & SD
- 🚀 v2.10.1 support Google Gemini Pro model.
@@ -105,48 +119,8 @@ For enterprise inquiries, please contact: **business@nextchat.dev**
- 🚀 v2.7 let's share conversations as image, or share to ShareGPT!
- 🚀 v2.0 is released, now you can create prompt templates, turn your ideas into reality! Read this: [ChatGPT Prompt Engineering Tips: Zero, One and Few Shot Prompting](https://www.allabtai.com/prompt-engineering-tips-zero-one-and-few-shot-prompting/).
## 主要功能
- 在 1 分钟内使用 Vercel **免费一键部署**
- 提供体积极小(~5MB)的跨平台客户端(Linux/Windows/MacOS, [下载地址](https://github.com/Yidadaa/ChatGPT-Next-Web/releases))
- 完整的 Markdown 支持:LaTeX 公式、Mermaid 流程图、代码高亮等等
- 精心设计的 UI,响应式设计,支持深色模式,支持 PWA
- 极快的首屏加载速度(~100kb),支持流式响应
- 隐私安全,所有数据保存在用户浏览器本地
- 预制角色功能(面具),方便地创建、分享和调试你的个性化对话
- 海量的内置 prompt 列表,来自[中文](https://github.com/PlexPt/awesome-chatgpt-prompts-zh)和[英文](https://github.com/f/awesome-chatgpt-prompts)
- 自动压缩上下文聊天记录,在节省 Token 的同时支持超长对话
- 多国语言支持:English, 简体中文, 繁体中文, 日本語, Español, Italiano, Türkçe, Deutsch, Tiếng Việt, Русский, Čeština, 한국어, Indonesia
- 拥有自己的域名?好上加好,绑定后即可在任何地方**无障碍**快速访问
## 开发计划
- [x] 为每个对话设置系统 Prompt [#138](https://github.com/Yidadaa/ChatGPT-Next-Web/issues/138)
- [x] 允许用户自行编辑内置 Prompt 列表
- [x] 预制角色:使用预制角色快速定制新对话 [#993](https://github.com/Yidadaa/ChatGPT-Next-Web/issues/993)
- [x] 分享为图片,分享到 ShareGPT 链接 [#1741](https://github.com/Yidadaa/ChatGPT-Next-Web/pull/1741)
- [x] 使用 tauri 打包桌面应用
- [x] 支持自部署的大语言模型:开箱即用 [RWKV-Runner](https://github.com/josStorer/RWKV-Runner) ,服务端部署 [LocalAI 项目](https://github.com/go-skynet/LocalAI) llama / gpt4all / rwkv / vicuna / koala / gpt4all-j / cerebras / falcon / dolly 等等,或者使用 [api-for-open-llm](https://github.com/xusenlinzy/api-for-open-llm)
- [x] Artifacts: 通过独立窗口,轻松预览、复制和分享生成的内容/可交互网页 [#5092](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/pull/5092)
- [x] 插件机制,支持`联网搜索`、`计算器`、调用其他平台 api [#165](https://github.com/Yidadaa/ChatGPT-Next-Web/issues/165) [#5353](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/issues/5353)
- [x] 支持联网搜索、计算器、调用其他平台 api [#165](https://github.com/Yidadaa/ChatGPT-Next-Web/issues/165) [#5353](https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/issues/5353)
- [ ] 本地知识库
## 最新动态
- 🚀 v2.15.0 现在支持插件功能了!了解更多:[NextChat-Awesome-Plugins](https://github.com/ChatGPTNextWeb/NextChat-Awesome-Plugins)
- 🚀 v2.14.0 现在支持 Artifacts & SD 了。
- 🚀 v2.10.1 现在支持 Gemini Pro 模型。
- 🚀 v2.9.11 现在可以使用自定义 Azure 服务了。
- 🚀 v2.8 发布了横跨 Linux/Windows/MacOS 的体积极小的客户端。
- 🚀 v2.7 现在可以将会话分享为图片了,也可以分享到 ShareGPT 的在线链接。
- 🚀 v2.0 已经发布,现在你可以使用面具功能快速创建预制对话了! 了解更多: [ChatGPT 提示词高阶技能:零次、一次和少样本提示](https://github.com/Yidadaa/ChatGPT-Next-Web/issues/138)。
- 💡 想要更方便地随时随地使用本项目,可以试下这款桌面插件:https://github.com/mushan0x0/AI0x0.com
## Get Started
> [简体中文 > 如何开始使用](./README_CN.md#开始使用)
1. Get [OpenAI API Key](https://platform.openai.com/account/api-keys);
2. Click
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FYidadaa%2FChatGPT-Next-Web&env=OPENAI_API_KEY&env=CODE&project-name=chatgpt-next-web&repository-name=ChatGPT-Next-Web), remember that `CODE` is your page password;
@@ -154,14 +128,10 @@ For enterprise inquiries, please contact: **business@nextchat.dev**
## FAQ
[简体中文 > 常见问题](./docs/faq-cn.md)
[English > FAQ](./docs/faq-en.md)
## Keep Updated
> [简体中文 > 如何保持代码更新](./README_CN.md#保持更新)
If you have deployed your own project with just one click following the steps above, you may encounter the issue of "Updates Available" constantly showing up. This is because Vercel will create a new project for you by default instead of forking this project, resulting in the inability to detect updates correctly.
We recommend that you follow the steps below to re-deploy:
@@ -172,7 +142,7 @@ We recommend that you follow the steps below to re-deploy:
### Enable Automatic Updates
> If you encounter a failure of Upstream Sync execution, please manually sync fork once.
> If you encounter a failure of Upstream Sync execution, please [manually update code](./README.md#manually-updating-code).
After forking the project, due to the limitations imposed by GitHub, you need to manually enable Workflows and Upstream Sync Action on the Actions page of the forked project. Once enabled, automatic updates will be scheduled every hour:
@@ -188,8 +158,6 @@ You can star or watch this project or follow author to get release notifications
## Access Password
> [简体中文 > 如何增加访问密码](./README_CN.md#配置页面访问密码)
This project provides limited access control. Please add an environment variable named `CODE` on the vercel environment variables page. The value should be passwords separated by comma like this:
```
@@ -200,8 +168,6 @@ After adding or modifying this environment variable, please redeploy the project
## Environment Variables
> [简体中文 > 如何配置 api key、访问密码、接口代理](./README_CN.md#环境变量)
### `CODE` (optional)
Access password, separated by comma.
@@ -296,6 +262,22 @@ iflytek Api Key.
iflytek Api Secret.
### `CHATGLM_API_KEY` (optional)
ChatGLM Api Key.
### `CHATGLM_URL` (optional)
ChatGLM Api Url.
### `DEEPSEEK_API_KEY` (optional)
DeepSeek Api Key.
### `DEEPSEEK_URL` (optional)
DeepSeek Api Url.
### `HIDE_USER_API_KEY` (optional)
> Default: Empty
@@ -329,9 +311,9 @@ To control custom models, use `+` to add a custom model, use `-` to hide a model
User `-all` to disable all default models, `+all` to enable all default models.
For Azure: use `modelName@azure=deploymentName` to customize model name and deployment name.
> Example: `+gpt-3.5-turbo@azure=gpt35` will show option `gpt35(Azure)` in model list.
> If you can only use Azure models, `-all,+gpt-3.5-turbo@azure=gpt35` will make `gpt35(Azure)` the only option in the model list.
For Azure: use `modelName@Azure=deploymentName` to customize model name and deployment name.
> Example: `+gpt-3.5-turbo@Azure=gpt35` will show option `gpt35(Azure)` in model list.
> If you can only use Azure models, `-all,+gpt-3.5-turbo@Azure=gpt35` will make `gpt35(Azure)` the only option in the model list.
For ByteDance: use `modelName@bytedance=deploymentName` to customize model name and deployment name.
> Example: `+Doubao-lite-4k@bytedance=ep-xxxxx-xxx` will show option `Doubao-lite-4k(ByteDance)` in model list.
@@ -340,7 +322,14 @@ For ByteDance: use `modelName@bytedance=deploymentName` to customize model name
Change default model
### `WHITE_WEBDEV_ENDPOINTS` (optional)
### `VISION_MODELS` (optional)
> Default: Empty
> Example: `gpt-4-vision,claude-3-opus,my-custom-model` means add vision capabilities to these models in addition to the default pattern matches (which detect models containing keywords like "vision", "claude-3", "gemini-1.5", etc).
Add additional models to have vision capabilities, beyond the default pattern matching. Multiple models should be separated by commas.
### `WHITE_WEBDAV_ENDPOINTS` (optional)
You can use this option if you want to increase the number of webdav service addresses you are allowed to access, as required by the format
- Each address must be a complete endpoint
@@ -359,13 +348,25 @@ Stability API key.
Customize Stability API url.
### `ENABLE_MCP` (optional)
Enable MCP (Model Context Protocol) feature
### `SILICONFLOW_API_KEY` (optional)
SiliconFlow API Key.
### `SILICONFLOW_URL` (optional)
SiliconFlow API URL.
## Requirements
NodeJS >= 18, Docker >= 20
## Development
> [简体中文 > 如何进行二次开发](./README_CN.md#开发)
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/Yidadaa/ChatGPT-Next-Web)
@@ -390,7 +391,6 @@ yarn dev
## Deployment
> [简体中文 > 如何部署到私人服务器](./README_CN.md#部署)
### Docker (Recommended)
@@ -419,6 +419,16 @@ If your proxy needs password, use:
-e PROXY_URL="http://127.0.0.1:7890 user pass"
```
To enable MCP, use:
```
docker run -d -p 3000:3000 \
-e OPENAI_API_KEY=sk-xxxx \
-e CODE=your-password \
-e ENABLE_MCP=true \
yidadaa/chatgpt-next-web
```
### Shell
```shell
@@ -439,11 +449,7 @@ bash <(curl -s https://raw.githubusercontent.com/Yidadaa/ChatGPT-Next-Web/main/s
- [How to use Vercel (No English)](./docs/vercel-cn.md)
- [User Manual (Only Chinese, WIP)](./docs/user-manual-cn.md)
## Screenshots
![Settings](./docs/images/settings.png)
![More](./docs/images/more.png)
## Translation
@@ -455,37 +461,7 @@ If you want to add a new translation, read this [document](./docs/translation.md
## Special Thanks
### Sponsor
> 仅列出捐赠金额 >= 100RMB 的用户。
[@mushan0x0](https://github.com/mushan0x0)
[@ClarenceDan](https://github.com/ClarenceDan)
[@zhangjia](https://github.com/zhangjia)
[@hoochanlon](https://github.com/hoochanlon)
[@relativequantum](https://github.com/relativequantum)
[@desenmeng](https://github.com/desenmeng)
[@webees](https://github.com/webees)
[@chazzhou](https://github.com/chazzhou)
[@hauy](https://github.com/hauy)
[@Corwin006](https://github.com/Corwin006)
[@yankunsong](https://github.com/yankunsong)
[@ypwhs](https://github.com/ypwhs)
[@fxxxchao](https://github.com/fxxxchao)
[@hotic](https://github.com/hotic)
[@WingCH](https://github.com/WingCH)
[@jtung4](https://github.com/jtung4)
[@micozhu](https://github.com/micozhu)
[@jhansion](https://github.com/jhansion)
[@Sha1rholder](https://github.com/Sha1rholder)
[@AnsonHyq](https://github.com/AnsonHyq)
[@synwith](https://github.com/synwith)
[@piksonGit](https://github.com/piksonGit)
[@ouyangzhiping](https://github.com/ouyangzhiping)
[@wenjiavv](https://github.com/wenjiavv)
[@LeXwDeX](https://github.com/LeXwDeX)
[@Licoy](https://github.com/Licoy)
[@shangmin2009](https://github.com/shangmin2009)
### Contributors

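The `CUSTOM_MODELS` grammar documented in the README hunk above (`+model`, `-model`, `name=displayName`, `name@provider=deploymentName`, comma-separated) can be sketched as a tiny parser. This is an illustrative helper only — `parseCustomModels` and the `ModelRule` shape are assumptions, not the project's actual implementation:

```typescript
// Hypothetical parser for the CUSTOM_MODELS syntax described above.
// Not the project's real code -- a sketch of the documented grammar only.
type ModelRule = {
  action: "add" | "hide";
  name: string;
  displayName?: string; // from `name=displayName`
  provider?: string; // from `name@provider=deployment`, e.g. "azure"
};

function parseCustomModels(spec: string): ModelRule[] {
  return spec
    .split(",")
    .map((s) => s.trim())
    .filter(Boolean)
    .map((item) => {
      // A leading "-" hides a model; anything else (incl. "+") adds one.
      const action: "add" | "hide" = item.startsWith("-") ? "hide" : "add";
      const body = item.replace(/^[+-]/, "");
      const [left, displayName] = body.split("=");
      const [name, provider] = left.split("@");
      return { action, name, displayName, provider };
    });
}
```

So `-all,+gpt-3.5-turbo@azure=gpt35` yields one "hide all" rule plus an "add" rule whose provider is `azure` and whose deployment/display name is `gpt35`.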
View File

@@ -6,9 +6,9 @@
<h1 align="center">NextChat</h1>
一键免费部署你的私人 ChatGPT 网页应用,支持 GPT3, GPT4 & Gemini Pro 模型。
一键免费部署你的私人 ChatGPT 网页应用,支持 Claude, GPT4 & Gemini Pro 模型。
[企业版](#%E4%BC%81%E4%B8%9A%E7%89%88) /[演示 Demo](https://chat-gpt-next-web.vercel.app/) / [反馈 Issues](https://github.com/Yidadaa/ChatGPT-Next-Web/issues) / [加入 Discord](https://discord.gg/zrhvHCr79N)
[NextChatAI](https://nextchat.club?utm_source=readme) / [企业版](#%E4%BC%81%E4%B8%9A%E7%89%88) / [演示 Demo](https://chat-gpt-next-web.vercel.app/) / [反馈 Issues](https://github.com/Yidadaa/ChatGPT-Next-Web/issues) / [加入 Discord](https://discord.gg/zrhvHCr79N)
[<img src="https://vercel.com/button" alt="Deploy on Zeabur" height="30">](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FChatGPTNextWeb%2FChatGPT-Next-Web&env=OPENAI_API_KEY&env=CODE&project-name=nextchat&repository-name=NextChat) [<img src="https://zeabur.com/button.svg" alt="Deploy on Zeabur" height="30">](https://zeabur.com/templates/ZBUEFA) [<img src="https://gitpod.io/button/open-in-gitpod.svg" alt="Open in Gitpod" height="30">](https://gitpod.io/#https://github.com/Yidadaa/ChatGPT-Next-Web)
@@ -27,7 +27,8 @@
企业版咨询: **business@nextchat.dev**
<img width="300" src="https://github.com/user-attachments/assets/3daeb7b6-ab63-4542-9141-2e4a12c80601">
<img width="300" src="https://github.com/user-attachments/assets/bb29a11d-ff75-48a8-b1f8-d2d7238cf987">
## 开始使用
@@ -54,7 +55,7 @@
### 打开自动更新
> 如果你遇到了 Upstream Sync 执行错误,请手动 Sync Fork 一次!
> 如果你遇到了 Upstream Sync 执行错误,请[手动 Sync Fork 一次](./README_CN.md#手动更新代码)
当你 fork 项目之后,由于 Github 的限制,需要手动去你 fork 后的项目的 Actions 页面启用 Workflows,并启用 Upstream Sync Action,启用之后即可开启每小时定时自动更新:
@@ -88,7 +89,7 @@ code1,code2,code3
### `OPENAI_API_KEY` (必填项)
OpanAI 密钥,你在 openai 账户页面申请的 api key使用英文逗号隔开多个 key这样可以随机轮询这些 key。
OpenAI 密钥,你在 openai 账户页面申请的 api key使用英文逗号隔开多个 key这样可以随机轮询这些 key。
### `CODE` (可选)
@@ -184,6 +185,21 @@ ByteDance Api Url.
讯飞星火Api Secret.
### `CHATGLM_API_KEY` (可选)
ChatGLM Api Key.
### `CHATGLM_URL` (可选)
ChatGLM Api Url.
### `DEEPSEEK_API_KEY` (可选)
DeepSeek Api Key.
### `DEEPSEEK_URL` (可选)
DeepSeek Api Url.
### `HIDE_USER_API_KEY` (可选)
@@ -202,7 +218,7 @@ ByteDance Api Url.
如果你想禁用从链接解析预制设置,将此环境变量设置为 1 即可。
### `WHITE_WEBDEV_ENDPOINTS` (可选)
### `WHITE_WEBDAV_ENDPOINTS` (可选)
如果你想增加允许访问的 webdav 服务地址,可以使用该选项,格式要求:
- 每一个地址必须是一个完整的 endpoint
@@ -216,9 +232,9 @@ ByteDance Api Url.
用来控制模型列表,使用 `+` 增加一个模型,使用 `-` 来隐藏一个模型,使用 `模型名=展示名` 来自定义模型的展示名,用英文逗号隔开。
在Azure的模式下支持使用`modelName@azure=deploymentName`的方式配置模型名称和部署名称(deploy-name)
> 示例:`+gpt-3.5-turbo@azure=gpt35`这个配置会在模型列表显示一个`gpt35(Azure)`的选项。
> 如果你只能使用Azure模式那么设置 `-all,+gpt-3.5-turbo@azure=gpt35` 则可以让对话的默认使用 `gpt35(Azure)`
在Azure的模式下支持使用`modelName@Azure=deploymentName`的方式配置模型名称和部署名称(deploy-name)
> 示例:`+gpt-3.5-turbo@Azure=gpt35`这个配置会在模型列表显示一个`gpt35(Azure)`的选项。
> 如果你只能使用Azure模式那么设置 `-all,+gpt-3.5-turbo@Azure=gpt35` 则可以让对话的默认使用 `gpt35(Azure)`
在ByteDance的模式下支持使用`modelName@bytedance=deploymentName`的方式配置模型名称和部署名称(deploy-name)
> 示例: `+Doubao-lite-4k@bytedance=ep-xxxxx-xxx`这个配置会在模型列表显示一个`Doubao-lite-4k(ByteDance)`的选项
@@ -228,6 +244,13 @@ ByteDance Api Url.
更改默认模型
### `VISION_MODELS` (可选)
> 默认值:空
> 示例:`gpt-4-vision,claude-3-opus,my-custom-model` 表示为这些模型添加视觉能力,作为对默认模式匹配的补充(默认会检测包含"vision"、"claude-3"、"gemini-1.5"等关键词的模型)。
在默认模式匹配之外,添加更多具有视觉能力的模型。多个模型用逗号分隔。
### `DEFAULT_INPUT_TEMPLATE` (可选)
自定义默认的 template,用于初始化『设置』中的『用户输入预处理』配置项
@@ -240,6 +263,17 @@ Stability API密钥
自定义的Stability API请求地址
### `ENABLE_MCP` (optional)
启用 MCP(Model Context Protocol)功能
### `SILICONFLOW_API_KEY` (optional)
SiliconFlow API Key.
### `SILICONFLOW_URL` (optional)
SiliconFlow API URL.
## 开发
@@ -264,6 +298,9 @@ BASE_URL=https://b.nextweb.fun/api/proxy
## 部署
### 宝塔面板部署
> [简体中文 > 如何通过宝塔一键部署](./docs/bt-cn.md)
### 容器部署 (推荐)
> Docker 版本需要在 20 及其以上,否则会提示找不到镜像。
@@ -290,6 +327,16 @@ docker run -d -p 3000:3000 \
yidadaa/chatgpt-next-web
```
如需启用 MCP 功能,可以使用:
```shell
docker run -d -p 3000:3000 \
-e OPENAI_API_KEY=sk-xxxx \
-e CODE=页面访问密码 \
-e ENABLE_MCP=true \
yidadaa/chatgpt-next-web
```
如果你的本地代理需要账号密码,可以使用:
```shell

View File

@@ -5,7 +5,7 @@
ワンクリックで無料であなた専用の ChatGPT ウェブアプリをデプロイ。GPT3、GPT4 & Gemini Pro モデルをサポート。
[企業版](#企業版) / [デモ](https://chat-gpt-next-web.vercel.app/) / [フィードバック](https://github.com/Yidadaa/ChatGPT-Next-Web/issues) / [Discordに参加](https://discord.gg/zrhvHCr79N)
[NextChatAI](https://nextchat.club?utm_source=readme) / [企業版](#企業版) / [デモ](https://chat-gpt-next-web.vercel.app/) / [フィードバック](https://github.com/Yidadaa/ChatGPT-Next-Web/issues) / [Discordに参加](https://discord.gg/zrhvHCr79N)
[<img src="https://vercel.com/button" alt="Zeaburでデプロイ" height="30">](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FChatGPTNextWeb%2FChatGPT-Next-Web&env=OPENAI_API_KEY&env=CODE&project-name=nextchat&repository-name=NextChat) [<img src="https://zeabur.com/button.svg" alt="Zeaburでデプロイ" height="30">](https://zeabur.com/templates/ZBUEFA) [<img src="https://gitpod.io/button/open-in-gitpod.svg" alt="Gitpodで開く" height="30">](https://gitpod.io/#https://github.com/Yidadaa/ChatGPT-Next-Web)
@@ -54,7 +54,7 @@
### 自動更新を開く
> Upstream Sync の実行エラーが発生した場合は、手動で Sync Fork してください!
> Upstream Sync の実行エラーが発生した場合は、[手動で Sync Fork](./README_JA.md#手動でコードを更新する) してください!
プロジェクトを fork した後、GitHub の制限により、fork 後のプロジェクトの Actions ページで Workflows を手動で有効にし、Upstream Sync Action を有効にする必要があります。有効化後、毎時の定期自動更新が可能になります:
@@ -193,7 +193,7 @@ ByteDance API の URL。
リンクからのプリセット設定解析を無効にしたい場合は、この環境変数を 1 に設定します。
### `WHITE_WEBDEV_ENDPOINTS` (オプション)
### `WHITE_WEBDAV_ENDPOINTS` (オプション)
アクセス許可を与える WebDAV サービスのアドレスを追加したい場合、このオプションを使用します。フォーマット要件:
- 各アドレスは完全なエンドポイントでなければなりません。
@@ -207,8 +207,8 @@ ByteDance API の URL。
モデルリストを管理します。`+` でモデルを追加し、`-` でモデルを非表示にし、`モデル名=表示名` でモデルの表示名をカスタマイズし、カンマで区切ります。
Azure モードでは、`modelName@azure=deploymentName` 形式でモデル名とデプロイ名deploy-nameを設定できます。
> 例:`+gpt-3.5-turbo@azure=gpt35` この設定でモデルリストに `gpt35(Azure)` のオプションが表示されます。
Azure モードでは、`modelName@Azure=deploymentName` 形式でモデル名とデプロイ名deploy-nameを設定できます。
> 例:`+gpt-3.5-turbo@Azure=gpt35` この設定でモデルリストに `gpt35(Azure)` のオプションが表示されます。
ByteDance モードでは、`modelName@bytedance=deploymentName` 形式でモデル名とデプロイ名deploy-nameを設定できます。
> 例: `+Doubao-lite-4k@bytedance=ep-xxxxx-xxx` この設定でモデルリストに `Doubao-lite-4k(ByteDance)` のオプションが表示されます。
@@ -217,6 +217,13 @@ ByteDance モードでは、`modelName@bytedance=deploymentName` 形式でモデ
デフォルトのモデルを変更します。
### `VISION_MODELS` (オプション)
> デフォルト:空
> 例:`gpt-4-vision,claude-3-opus,my-custom-model` は、これらのモデルにビジョン機能を追加します。これはデフォルトのパターンマッチング("vision"、"claude-3"、"gemini-1.5"などのキーワードを含むモデルを検出)に加えて適用されます。
デフォルトのパターンマッチングに加えて、追加のモデルにビジョン機能を付与します。複数のモデルはカンマで区切ります。
### `DEFAULT_INPUT_TEMPLATE` (オプション)
『設定』の『ユーザー入力前処理』の初期設定に使用するテンプレートをカスタマイズします。

app/SyncOnFirstLoad.tsx Normal file
View File

@@ -0,0 +1,54 @@
"use client";
import { useEffect } from "react";
import { useSyncStore } from "./store/sync";
import { showToast } from "./components/ui-lib";
export default function SyncOnFirstLoad() {
const syncStore = useSyncStore();
useEffect(() => {
// Parse cookies using the cookie library
// const cookies = cookie.parse(document.cookie || "");
// const authToken = cookies["sb-zzgkylsbdgwoohcbompi-auth-token"] || null;
console.log("[Auth Check] Checking user authentication status");
fetch("/api/auth/check")
.then((res) => {
if (res.status === 401) {
console.log("[Auth Check] User is not authenticated");
// Handle unauthenticated user - redirect or show login modal
showToast("Please login first");
// setTimeout(() => {
// window.location.href = AUTHEN_PAGE;
// }, 500);
return;
}
return res.json();
})
.then((data) => {
if (data) {
console.log("[Auth Check] User is authenticated:", data.user);
// Assuming data.user contains the user information(user email)
const email = data.user.email || "";
// Only update upstash.username, keep other params
syncStore.update((config) => (config.upstash.username = email));
// You can now use the user data as needed
// syncStore.sync();
// console.log("[Sync] User data synced successfully");
}
})
.catch((e) => {
console.error("[Auth Check] Error checking authentication:", e);
// Handle error appropriately
});
}, []);
return null;
}
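The auth-check flow in `SyncOnFirstLoad` above reduces to one decision: a 401 means the user is unauthenticated (show the login toast), otherwise the user's email becomes the Upstash sync username, falling back to an empty string. A minimal sketch of that branch — `resolveSyncUsername` and the `AuthCheck` type are hypothetical names for illustration, not part of the codebase:

```typescript
// Hypothetical distillation of SyncOnFirstLoad's branching, for illustration.
type AuthCheck =
  | { status: 401 }
  | { status: 200; user: { email?: string } };

// Returns the username to store for Upstash sync, or null when the
// user is unauthenticated and a "please login" toast should be shown.
function resolveSyncUsername(res: AuthCheck): string | null {
  if (res.status === 401) return null;
  // Mirrors `data.user.email || ""` in the component above.
  return res.user.email || "";
}
```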

View File

@@ -1,5 +1,5 @@
import { ApiPath } from "@/app/constant";
import { NextRequest, NextResponse } from "next/server";
import { NextRequest } from "next/server";
import { handle as openaiHandler } from "../../openai";
import { handle as azureHandler } from "../../azure";
import { handle as googleHandler } from "../../google";
@@ -10,13 +10,21 @@ import { handle as alibabaHandler } from "../../alibaba";
import { handle as moonshotHandler } from "../../moonshot";
import { handle as stabilityHandler } from "../../stability";
import { handle as iflytekHandler } from "../../iflytek";
import { handle as deepseekHandler } from "../../deepseek";
import { handle as siliconflowHandler } from "../../siliconflow";
import { handle as xaiHandler } from "../../xai";
import { handle as chatglmHandler } from "../../glm";
import { handle as proxyHandler } from "../../proxy";
async function handle(
req: NextRequest,
{ params }: { params: { provider: string; path: string[] } },
) {
// Handle OPTIONS request for CORS preflight
// params.provider = MODEL_PROVIDER;
const apiPath = `/api/${params.provider}`;
console.log(`[${params.provider} Route] params `, params);
switch (apiPath) {
case ApiPath.Azure:
@@ -38,6 +46,14 @@ async function handle(
return stabilityHandler(req, { params });
case ApiPath.Iflytek:
return iflytekHandler(req, { params });
case ApiPath.DeepSeek:
return deepseekHandler(req, { params });
case ApiPath.XAI:
return xaiHandler(req, { params });
case ApiPath.ChatGLM:
return chatglmHandler(req, { params });
case ApiPath.SiliconFlow:
return siliconflowHandler(req, { params });
case ApiPath.OpenAI:
return openaiHandler(req, { params });
default:

View File

@@ -1,16 +1,8 @@
import { getServerSideConfig } from "@/app/config/server";
import {
Alibaba,
ALIBABA_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
} from "@/app/constant";
import { ALIBABA_BASE_URL, ApiPath, ModelProvider } from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelAvailableInServer } from "@/app/utils/model";
import type { RequestPayload } from "@/app/client/platforms/openai";
const serverConfig = getServerSideConfig();
@@ -18,7 +10,7 @@ export async function handle(
req: NextRequest,
{ params }: { params: { path: string[] } },
) {
console.log("[Alibaba Route] params ", params);
// console.log("[Alibaba Route] params ", params);
if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
@@ -44,7 +36,9 @@ async function request(req: NextRequest) {
const controller = new AbortController();
// Alibaba: use the base url, or just strip the API path prefix
let path = `${req.nextUrl.pathname}`.replaceAll(ApiPath.Alibaba, "");
let path = `${req.nextUrl.pathname}`
.replaceAll(ApiPath.Alibaba, "")
.replace("/api", "");
let baseUrl = serverConfig.alibabaUrl || ALIBABA_BASE_URL;
@@ -67,6 +61,9 @@ async function request(req: NextRequest) {
);
const fetchUrl = `${baseUrl}${path}`;
console.log("[Alibaba] fetchUrl", fetchUrl);
const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
@@ -85,28 +82,77 @@ async function request(req: NextRequest) {
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;
let jsonBody: any = {};
const jsonBody = JSON.parse(clonedBody) as { model?: string };
try {
jsonBody = JSON.parse(clonedBody);
// Move input.messages to messages at the root level if present
if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
jsonBody.messages = jsonBody.input.messages;
// Remove input.messages to avoid duplication
delete jsonBody.input;
jsonBody.stream = true;
}
const current_model = jsonBody?.model;
console.log("[Alibaba] current model", current_model);
// check whether the model is qwen-vl (a vision model)
if (current_model && current_model.startsWith("qwen-vl")) {
console.log("[Alibaba] current model is qwen-vl");
console.log("reformatting images in the messages");
// Reformat image objects in messages
if (Array.isArray(jsonBody.messages)) {
jsonBody.messages = jsonBody.messages.map((msg: any) => {
if (Array.isArray(msg.content)) {
msg.content = msg.content.map((item: any) => {
if (item && typeof item === "object" && "image" in item) {
return {
type: "image_url",
image_url: {
url: item.image,
},
};
}
return item;
});
}
return msg;
});
}
}
// console.log("[Alibaba] request body json", jsonBody);
fetchOptions.body = JSON.stringify(jsonBody);
} catch (e) {
fetchOptions.body = clonedBody; // fallback if not JSON
}
// console.log("[Alibaba] request body", fetchOptions.body);
// not undefined and is false
if (
isModelAvailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Alibaba as string,
)
) {
return NextResponse.json(
{
error: true,
message: `you are not allowed to use ${jsonBody?.model} model`,
},
{
status: 403,
},
);
}
// if (
// isModelNotavailableInServer(
// serverConfig.customModels,
// jsonBody?.model as string,
// ServiceProvider.Alibaba as string,
// )
// ) {
// return NextResponse.json(
// {
// error: true,
// message: `you are not allowed to use ${jsonBody?.model} model`,
// },
// {
// status: 403,
// },
// );
// }
} catch (e) {
console.error(`[Alibaba] filter`, e);
}
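The qwen-vl content rewrite in the hunk above can be shown as a standalone sketch: DashScope-style `{ image: url }` items become OpenAI-compatible `{ type: "image_url", image_url: { url } }` items (`reformatVisionMessage` is an illustrative name, not one used in the codebase):

```typescript
type ContentItem =
  | { image: string }
  | { type: "image_url"; image_url: { url: string } }
  | { type: "text"; text: string };

// Rewrite DashScope-native image items into the OpenAI-compatible shape,
// leaving text and already-converted items untouched.
function reformatVisionMessage(msg: { role: string; content: ContentItem[] }) {
  return {
    ...msg,
    content: msg.content.map((item) =>
      item && typeof item === "object" && "image" in item
        ? { type: "image_url" as const, image_url: { url: item.image } }
        : item,
    ),
  };
}

const msg = reformatVisionMessage({
  role: "user",
  content: [
    { image: "https://example.com/cat.png" },
    { type: "text", text: "describe this image" },
  ],
});
console.log(JSON.stringify(msg.content[0]));
```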

View File

@@ -3,14 +3,13 @@ import {
ANTHROPIC_BASE_URL,
Anthropic,
ApiPath,
DEFAULT_MODELS,
ServiceProvider,
ModelProvider,
} from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "./auth";
import { isModelAvailableInServer } from "@/app/utils/model";
import { isModelNotavailableInServer } from "@/app/utils/model";
import { cloudflareAIGatewayUrl } from "@/app/utils/cloudflare";
const ALLOWD_PATH = new Set([Anthropic.ChatPath, Anthropic.ChatPath1]);
@@ -98,6 +97,7 @@ async function request(req: NextRequest) {
headers: {
"Content-Type": "application/json",
"Cache-Control": "no-store",
"anthropic-dangerous-direct-browser-access": "true",
[authHeaderName]: authValue,
"anthropic-version":
req.headers.get("anthropic-version") ||
@@ -122,7 +122,7 @@ async function request(req: NextRequest) {
// not undefined and is false
if (
isModelAvailableInServer(
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Anthropic as string,

View File

@@ -92,6 +92,18 @@ export function auth(req: NextRequest, modelProvider: ModelProvider) {
systemApiKey =
serverConfig.iflytekApiKey + ":" + serverConfig.iflytekApiSecret;
break;
case ModelProvider.DeepSeek:
systemApiKey = serverConfig.deepseekApiKey;
break;
case ModelProvider.XAI:
systemApiKey = serverConfig.xaiApiKey;
break;
case ModelProvider.ChatGLM:
systemApiKey = serverConfig.chatglmApiKey;
break;
case ModelProvider.SiliconFlow:
systemApiKey = serverConfig.siliconFlowApiKey;
break;
case ModelProvider.GPT:
default:
if (req.nextUrl.pathname.includes("azure/deployments")) {

View File

@@ -0,0 +1,21 @@
// /app/api/auth/check/route.ts
import { NextRequest } from "next/server";
import { checkAuth } from "../../supabase";
export async function GET(req: NextRequest) {
const user = await checkAuth(req);
console.log("[Auth] user ", user);
if (!user) {
return new Response(JSON.stringify({ authenticated: false }), {
status: 401,
headers: { "Content-Type": "application/json" },
});
}
return new Response(JSON.stringify({ authenticated: true, user }), {
status: 200,
headers: { "Content-Type": "application/json" },
});
}

View File

@@ -1,4 +1,3 @@
import { getServerSideConfig } from "@/app/config/server";
import { ModelProvider } from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";

View File

@@ -3,13 +3,12 @@ import {
BAIDU_BASE_URL,
ApiPath,
ModelProvider,
BAIDU_OATUH_URL,
ServiceProvider,
} from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelAvailableInServer } from "@/app/utils/model";
import { isModelNotavailableInServer } from "@/app/utils/model";
import { getAccessToken } from "@/app/utils/baidu";
const serverConfig = getServerSideConfig();
@@ -105,7 +104,7 @@ async function request(req: NextRequest) {
// not undefined and is false
if (
isModelAvailableInServer(
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Baidu as string,

View File

@@ -8,7 +8,7 @@ import {
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelAvailableInServer } from "@/app/utils/model";
import { isModelNotavailableInServer } from "@/app/utils/model";
const serverConfig = getServerSideConfig();
@@ -88,7 +88,7 @@ async function request(req: NextRequest) {
// not undefined and is false
if (
isModelAvailableInServer(
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.ByteDance as string,

View File

@@ -1,24 +1,25 @@
import { NextRequest, NextResponse } from "next/server";
import { getServerSideConfig } from "../config/server";
import {
DEFAULT_MODELS,
OPENAI_BASE_URL,
GEMINI_BASE_URL,
ServiceProvider,
} from "../constant";
import { isModelAvailableInServer } from "../utils/model";
import { OPENAI_BASE_URL, ServiceProvider } from "../constant";
import { cloudflareAIGatewayUrl } from "../utils/cloudflare";
import { getModelProvider, isModelNotavailableInServer } from "../utils/model";
const serverConfig = getServerSideConfig();
// Proxies requests from the client to OpenAI or Azure OpenAI: handles auth, endpoint configuration, and model checks, then returns the appropriate response.
export async function requestOpenai(req: NextRequest) {
// Create a controller so the request can be aborted on timeout
const controller = new AbortController();
// Check whether the request targets Azure OpenAI
const isAzure = req.nextUrl.pathname.includes("azure/deployments");
// Variables holding the auth value and the auth header name
var authValue,
authHeaderName = "";
if (isAzure) {
// For Azure, take the api-key from the Authorization header
authValue =
req.headers
.get("Authorization")
@@ -28,26 +29,38 @@ export async function requestOpenai(req: NextRequest) {
authHeaderName = "api-key";
} else {
// For plain OpenAI, keep the Authorization header as-is
authValue = req.headers.get("Authorization") ?? "";
authHeaderName = "Authorization";
console.log("[Auth] ", authValue);
}
// Rewrite the endpoint path to suit OpenAI/Azure
let path = `${req.nextUrl.pathname}`.replaceAll("/api/openai/", "");
// console.log("[Proxy] default ", path);
// Get baseUrl from config, preferring the Azure URL for Azure requests
let baseUrl =
(isAzure ? serverConfig.azureUrl : serverConfig.baseUrl) || OPENAI_BASE_URL;
// Ensure baseUrl has an https prefix
if (!baseUrl.startsWith("http")) {
baseUrl = `https://${baseUrl}`;
}
// Strip a trailing "/" from baseUrl if present
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, -1);
}
// Log the path and baseUrl for debugging
console.log("[Proxy] ", path);
console.log("[Base Url]", baseUrl);
// Set a 10-minute request timeout; abort when exceeded
const timeoutId = setTimeout(
() => {
controller.abort();
@@ -55,6 +68,7 @@ export async function requestOpenai(req: NextRequest) {
10 * 60 * 1000,
);
// For Azure, rewrite the path and api-version to match Azure conventions
if (isAzure) {
const azureApiVersion =
req?.nextUrl?.searchParams?.get("api-version") ||
@@ -65,9 +79,7 @@ export async function requestOpenai(req: NextRequest) {
"",
)}?api-version=${azureApiVersion}`;
// Forward compatibility:
// if display_name(deployment_name) not set, and '{deploy-id}' in AZURE_URL
// then using default '{deploy-id}'
// If customModels and azureUrl are set, check and substitute the deployment id if needed
if (serverConfig.customModels && serverConfig.azureUrl) {
const modelName = path.split("/")[1];
let realDeployName = "";
@@ -76,7 +88,7 @@ export async function requestOpenai(req: NextRequest) {
.filter((v) => !!v && !v.startsWith("-") && v.includes(modelName))
.forEach((m) => {
const [fullName, displayName] = m.split("=");
const [_, providerName] = fullName.split("@");
const [_, providerName] = getModelProvider(fullName);
if (providerName === "azure" && !displayName) {
const [_, deployId] = (serverConfig?.azureUrl ?? "").split(
"deployments/",
@@ -93,8 +105,12 @@ export async function requestOpenai(req: NextRequest) {
}
}
// Build the final fetch URL, possibly via the Cloudflare Gateway if configured
const fetchUrl = cloudflareAIGatewayUrl(`${baseUrl}/${path}`);
console.log("fetchUrl", fetchUrl);
// Set up the fetch options: headers, method, body, etc.
const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
@@ -106,14 +122,14 @@ export async function requestOpenai(req: NextRequest) {
},
method: req.method,
body: req.body,
// to fix #2485: https://stackoverflow.com/questions/55920957/cloudflare-worker-typeerror-one-time-use-body
// Fix the one-time-use body error on Cloudflare Workers
redirect: "manual",
// @ts-ignore
duplex: "half",
signal: controller.signal,
};
// #1815 try to refuse gpt4 request
// Check whether the model is blocked (e.g. GPT-4 banned)
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
@@ -121,17 +137,16 @@ export async function requestOpenai(req: NextRequest) {
const jsonBody = JSON.parse(clonedBody) as { model?: string };
// not undefined and is false
// If the model is not allowed, return a 403 error
if (
isModelAvailableInServer(
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.OpenAI as string,
) ||
isModelAvailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Azure as string,
[
ServiceProvider.OpenAI,
ServiceProvider.Azure,
jsonBody?.model as string, // supports models with an unknown provider
],
)
) {
return NextResponse.json(
@@ -150,43 +165,40 @@ export async function requestOpenai(req: NextRequest) {
}
try {
// Send the request to OpenAI/Azure and receive the response
const res = await fetch(fetchUrl, fetchOptions);
// Extract the OpenAI-Organization header from the response
// Read the OpenAI-Organization header from the response (if any)
const openaiOrganizationHeader = res.headers.get("OpenAI-Organization");
// Check if serverConfig.openaiOrgId is defined and not an empty string
// If openaiOrgId is configured, log the value of this header
if (serverConfig.openaiOrgId && serverConfig.openaiOrgId.trim() !== "") {
// If openaiOrganizationHeader is present, log it; otherwise, log that the header is not present
console.log("[Org ID]", openaiOrganizationHeader);
} else {
console.log("[Org ID] is not set up.");
}
// to prevent browser prompt for credentials
// Post-process the headers returned to the client
const newHeaders = new Headers(res.headers);
newHeaders.delete("www-authenticate");
// to disable nginx buffering
newHeaders.set("X-Accel-Buffering", "no");
newHeaders.delete("www-authenticate"); // remove the auth-challenge header
newHeaders.set("X-Accel-Buffering", "no"); // disable nginx buffering
// Conditionally delete the OpenAI-Organization header from the response if [Org ID] is undefined or empty (not setup in ENV)
// Also, this is to prevent the header from being sent to the client
// If no Org ID is configured, remove this header from the response
if (!serverConfig.openaiOrgId || serverConfig.openaiOrgId.trim() === "") {
newHeaders.delete("OpenAI-Organization");
}
// The latest version of the OpenAI API forced the content-encoding to be "br" in json response
// So if the streaming is disabled, we need to remove the content-encoding header
// Because Vercel uses gzip to compress the response, if we don't remove the content-encoding header
// The browser will try to decode the response with brotli and fail
// Remove the content-encoding header to avoid decode errors in the browser
newHeaders.delete("content-encoding");
// Return the response to the client with the processed headers
return new Response(res.body, {
status: res.status,
statusText: res.statusText,
headers: newHeaders,
});
} finally {
// Clear the timeout whether the request succeeded or failed
clearTimeout(timeoutId);
}
}

View File

@@ -14,6 +14,7 @@ const DANGER_CONFIG = {
disableFastLink: serverConfig.disableFastLink,
customModels: serverConfig.customModels,
defaultModel: serverConfig.defaultModel,
visionModels: serverConfig.visionModels,
};
declare global {

app/api/deepseek.ts Normal file
View File

@@ -0,0 +1,128 @@
import { getServerSideConfig } from "@/app/config/server";
import {
DEEPSEEK_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
} from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelNotavailableInServer } from "@/app/utils/model";
const serverConfig = getServerSideConfig();
export async function handle(
req: NextRequest,
{ params }: { params: { path: string[] } },
) {
console.log("[DeepSeek Route] params ", params);
if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
}
const authResult = auth(req, ModelProvider.DeepSeek);
if (authResult.error) {
return NextResponse.json(authResult, {
status: 401,
});
}
try {
const response = await request(req);
return response;
} catch (e) {
console.error("[DeepSeek] ", e);
return NextResponse.json(prettyObject(e));
}
}
async function request(req: NextRequest) {
const controller = new AbortController();
// use the configured base url, or just strip the API path prefix
let path = `${req.nextUrl.pathname}`.replaceAll(ApiPath.DeepSeek, "");
let baseUrl = serverConfig.deepseekUrl || DEEPSEEK_BASE_URL;
if (!baseUrl.startsWith("http")) {
baseUrl = `https://${baseUrl}`;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, -1);
}
console.log("[Proxy] ", path);
console.log("[Base Url]", baseUrl);
const timeoutId = setTimeout(
() => {
controller.abort();
},
10 * 60 * 1000,
);
const fetchUrl = `${baseUrl}${path}`;
const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
Authorization: req.headers.get("Authorization") ?? "",
},
method: req.method,
body: req.body,
redirect: "manual",
// @ts-ignore
duplex: "half",
signal: controller.signal,
};
// #1815 try to refuse some request to some models
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;
const jsonBody = JSON.parse(clonedBody) as { model?: string };
// not undefined and is false
if (
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.DeepSeek as string,
)
) {
return NextResponse.json(
{
error: true,
message: `you are not allowed to use ${jsonBody?.model} model`,
},
{
status: 403,
},
);
}
} catch (e) {
console.error(`[DeepSeek] filter`, e);
}
}
try {
const res = await fetch(fetchUrl, fetchOptions);
// to prevent browser prompt for credentials
const newHeaders = new Headers(res.headers);
newHeaders.delete("www-authenticate");
// to disable nginx buffering
newHeaders.set("X-Accel-Buffering", "no");
return new Response(res.body, {
status: res.status,
statusText: res.statusText,
headers: newHeaders,
});
} finally {
clearTimeout(timeoutId);
}
}
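The base-URL normalization repeated in each provider handler above (default URL fallback, https prefix, trailing-slash trim) could be factored into a small helper. A sketch, with `normalizeBaseUrl` as an assumed name not present in the codebase:

```typescript
// Normalize a configured provider base URL: default to https when no
// scheme is given, and strip a single trailing slash.
function normalizeBaseUrl(raw: string): string {
  let baseUrl = raw;
  if (!baseUrl.startsWith("http")) {
    baseUrl = `https://${baseUrl}`;
  }
  if (baseUrl.endsWith("/")) {
    baseUrl = baseUrl.slice(0, -1);
  }
  return baseUrl;
}

console.log(normalizeBaseUrl("api.deepseek.com/")); // → https://api.deepseek.com
```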

app/api/glm.ts Normal file
View File

@@ -0,0 +1,129 @@
import { getServerSideConfig } from "@/app/config/server";
import {
CHATGLM_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
} from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelNotavailableInServer } from "@/app/utils/model";
const serverConfig = getServerSideConfig();
export async function handle(
req: NextRequest,
{ params }: { params: { path: string[] } },
) {
console.log("[GLM Route] params ", params);
if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
}
const authResult = auth(req, ModelProvider.ChatGLM);
if (authResult.error) {
return NextResponse.json(authResult, {
status: 401,
});
}
try {
const response = await request(req);
return response;
} catch (e) {
console.error("[GLM] ", e);
return NextResponse.json(prettyObject(e));
}
}
async function request(req: NextRequest) {
const controller = new AbortController();
// use the configured base url, or just strip the API path prefix
let path = `${req.nextUrl.pathname}`.replaceAll(ApiPath.ChatGLM, "");
let baseUrl = serverConfig.chatglmUrl || CHATGLM_BASE_URL;
if (!baseUrl.startsWith("http")) {
baseUrl = `https://${baseUrl}`;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, -1);
}
console.log("[Proxy] ", path);
console.log("[Base Url]", baseUrl);
const timeoutId = setTimeout(
() => {
controller.abort();
},
10 * 60 * 1000,
);
const fetchUrl = `${baseUrl}${path}`;
console.log("[Fetch Url] ", fetchUrl);
const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
Authorization: req.headers.get("Authorization") ?? "",
},
method: req.method,
body: req.body,
redirect: "manual",
// @ts-ignore
duplex: "half",
signal: controller.signal,
};
// #1815 try to refuse some request to some models
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;
const jsonBody = JSON.parse(clonedBody) as { model?: string };
// not undefined and is false
if (
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.ChatGLM as string,
)
) {
return NextResponse.json(
{
error: true,
message: `you are not allowed to use ${jsonBody?.model} model`,
},
{
status: 403,
},
);
}
} catch (e) {
console.error(`[GLM] filter`, e);
}
}
try {
const res = await fetch(fetchUrl, fetchOptions);
// to prevent browser prompt for credentials
const newHeaders = new Headers(res.headers);
newHeaders.delete("www-authenticate");
// to disable nginx buffering
newHeaders.set("X-Accel-Buffering", "no");
return new Response(res.body, {
status: res.status,
statusText: res.statusText,
headers: newHeaders,
});
} finally {
clearTimeout(timeoutId);
}
}

View File

@@ -1,12 +1,7 @@
import { NextRequest, NextResponse } from "next/server";
import { auth } from "./auth";
import { getServerSideConfig } from "@/app/config/server";
import {
ApiPath,
GEMINI_BASE_URL,
Google,
ModelProvider,
} from "@/app/constant";
import { ApiPath, GEMINI_BASE_URL, ModelProvider } from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
const serverConfig = getServerSideConfig();
@@ -28,7 +23,8 @@ export async function handle(
});
}
const bearToken = req.headers.get("Authorization") ?? "";
const bearToken =
req.headers.get("x-goog-api-key") || req.headers.get("Authorization") || "";
const token = bearToken.trim().replaceAll("Bearer ", "").trim();
const apiKey = token ? token : serverConfig.googleApiKey;
@@ -96,8 +92,8 @@ async function request(req: NextRequest, apiKey: string) {
},
10 * 60 * 1000,
);
const fetchUrl = `${baseUrl}${path}?key=${apiKey}${
req?.nextUrl?.searchParams?.get("alt") === "sse" ? "&alt=sse" : ""
const fetchUrl = `${baseUrl}${path}${
req?.nextUrl?.searchParams?.get("alt") === "sse" ? "?alt=sse" : ""
}`;
console.log("[Fetch Url] ", fetchUrl);
@@ -105,6 +101,9 @@ async function request(req: NextRequest, apiKey: string) {
headers: {
"Content-Type": "application/json",
"Cache-Control": "no-store",
"x-goog-api-key":
req.headers.get("x-goog-api-key") ||
(req.headers.get("Authorization") ?? "").replace("Bearer ", ""),
},
method: req.method,
body: req.body,

View File

@@ -1,6 +1,5 @@
import { getServerSideConfig } from "@/app/config/server";
import {
Iflytek,
IFLYTEK_BASE_URL,
ApiPath,
ModelProvider,
@@ -9,8 +8,7 @@ import {
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelAvailableInServer } from "@/app/utils/model";
import type { RequestPayload } from "@/app/client/platforms/openai";
import { isModelNotavailableInServer } from "@/app/utils/model";
// iflytek
const serverConfig = getServerSideConfig();
@@ -91,7 +89,7 @@ async function request(req: NextRequest) {
// not undefined and is false
if (
isModelAvailableInServer(
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Iflytek as string,

View File

@@ -1,6 +1,5 @@
import { getServerSideConfig } from "@/app/config/server";
import {
Moonshot,
MOONSHOT_BASE_URL,
ApiPath,
ModelProvider,
@@ -9,8 +8,7 @@ import {
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelAvailableInServer } from "@/app/utils/model";
import type { RequestPayload } from "@/app/client/platforms/openai";
import { isModelNotavailableInServer } from "@/app/utils/model";
const serverConfig = getServerSideConfig();
@@ -90,7 +88,7 @@ async function request(req: NextRequest) {
// not undefined and is false
if (
isModelAvailableInServer(
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Moonshot as string,

View File

@@ -6,14 +6,20 @@ import { NextRequest, NextResponse } from "next/server";
import { auth } from "./auth";
import { requestOpenai } from "./common";
const ALLOWD_PATH = new Set(Object.values(OpenaiPath));
const ALLOWED_PATH = new Set(Object.values(OpenaiPath));
function getModels(remoteModelRes: OpenAIListModelResponse) {
const config = getServerSideConfig();
if (config.disableGPT4) {
remoteModelRes.data = remoteModelRes.data.filter(
(m) => !m.id.startsWith("gpt-4") || m.id.startsWith("gpt-4o-mini"),
(m) =>
!(
m.id.startsWith("gpt-4") ||
m.id.startsWith("chatgpt-4o") ||
m.id.startsWith("o1") ||
m.id.startsWith("o3")
) || m.id.startsWith("gpt-4o-mini"),
);
}
@@ -32,7 +38,7 @@ export async function handle(
const subpath = params.path.join("/");
if (!ALLOWD_PATH.has(subpath)) {
if (!ALLOWED_PATH.has(subpath)) {
console.log("[OpenAI Route] forbidden path ", subpath);
return NextResponse.json(
{
@@ -53,6 +59,8 @@ export async function handle(
}
try {
console.log("defaulting to the OpenAI API");
console.log("[OpenAI Route] ", subpath);
const response = await requestOpenai(req);
// list models

View File

@@ -1,4 +1,5 @@
import { NextRequest, NextResponse } from "next/server";
import { getServerSideConfig } from "@/app/config/server";
export async function handle(
req: NextRequest,
@@ -9,6 +10,7 @@ export async function handle(
if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
}
const serverConfig = getServerSideConfig();
// remove path params from searchParams
req.nextUrl.searchParams.delete("path");
@@ -31,6 +33,18 @@ export async function handle(
return true;
}),
);
// if dalle3 use openai api key
const baseUrl = req.headers.get("x-base-url");
if (baseUrl?.includes("api.openai.com")) {
if (!serverConfig.apiKey) {
return NextResponse.json(
{ error: "OpenAI API key not configured" },
{ status: 500 },
);
}
headers.set("Authorization", `Bearer ${serverConfig.apiKey}`);
}
const controller = new AbortController();
const fetchOptions: RequestInit = {
headers,

app/api/siliconflow.ts Normal file
View File

@@ -0,0 +1,128 @@
import { getServerSideConfig } from "@/app/config/server";
import {
SILICONFLOW_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
} from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelNotavailableInServer } from "@/app/utils/model";
const serverConfig = getServerSideConfig();
export async function handle(
req: NextRequest,
{ params }: { params: { path: string[] } },
) {
console.log("[SiliconFlow Route] params ", params);
if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
}
const authResult = auth(req, ModelProvider.SiliconFlow);
if (authResult.error) {
return NextResponse.json(authResult, {
status: 401,
});
}
try {
const response = await request(req);
return response;
} catch (e) {
console.error("[SiliconFlow] ", e);
return NextResponse.json(prettyObject(e));
}
}
async function request(req: NextRequest) {
const controller = new AbortController();
// use the configured base url, or just strip the API path prefix
let path = `${req.nextUrl.pathname}`.replaceAll(ApiPath.SiliconFlow, "");
let baseUrl = serverConfig.siliconFlowUrl || SILICONFLOW_BASE_URL;
if (!baseUrl.startsWith("http")) {
baseUrl = `https://${baseUrl}`;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, -1);
}
console.log("[Proxy] ", path);
console.log("[Base Url]", baseUrl);
const timeoutId = setTimeout(
() => {
controller.abort();
},
10 * 60 * 1000,
);
const fetchUrl = `${baseUrl}${path}`;
const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
Authorization: req.headers.get("Authorization") ?? "",
},
method: req.method,
body: req.body,
redirect: "manual",
// @ts-ignore
duplex: "half",
signal: controller.signal,
};
// #1815 try to refuse some request to some models
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;
const jsonBody = JSON.parse(clonedBody) as { model?: string };
// not undefined and is false
if (
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.SiliconFlow as string,
)
) {
return NextResponse.json(
{
error: true,
message: `you are not allowed to use ${jsonBody?.model} model`,
},
{
status: 403,
},
);
}
} catch (e) {
console.error(`[SiliconFlow] filter`, e);
}
}
try {
const res = await fetch(fetchUrl, fetchOptions);
// to prevent browser prompt for credentials
const newHeaders = new Headers(res.headers);
newHeaders.delete("www-authenticate");
// to disable nginx buffering
newHeaders.set("X-Accel-Buffering", "no");
return new Response(res.body, {
status: res.status,
statusText: res.statusText,
headers: newHeaders,
});
} finally {
clearTimeout(timeoutId);
}
}

app/api/supabase.ts Normal file
View File

@@ -0,0 +1,45 @@
import { createClient } from "@supabase/supabase-js";
import { NextRequest } from "next/server";
const SUPABASE_URL = process.env.SUPABASE_URL!;
const SUPABASE_ANON_KEY = process.env.SUPABASE_ANON_KEY!;
export async function checkAuth(req: NextRequest) {
// Use NextRequest.cookies API
const authToken = req.cookies.get("sb-zzgkylsbdgwoohcbompi-auth-token")
?.value;
// console.log("[Supabase] authToken", authToken);
if (!authToken) {
// Auth token not found
console.log("[Supabase] No auth token found");
return null;
}
const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY, {
global: {
headers: {
Authorization: `Bearer ${authToken}`,
},
},
});
try {
const { data, error } = await supabase.auth.getUser();
if (error || !data?.user) {
// Error getting the user info
console.error("[Supabase] Error getting user:", error);
return null;
}
// Authenticated successfully; return the user info
console.log("[Supabase] Authenticated user:", data.user);
return data.user;
} catch (err) {
// Error fetching the user data
console.error("[Supabase] Error fetching user data:", err);
return null;
}
}

View File

@@ -1,15 +1,8 @@
import { getServerSideConfig } from "@/app/config/server";
import {
TENCENT_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
Tencent,
} from "@/app/constant";
import { TENCENT_BASE_URL, ModelProvider } from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelAvailableInServer } from "@/app/utils/model";
import { getHeader } from "@/app/utils/tencent";
const serverConfig = getServerSideConfig();

View File

@@ -6,7 +6,7 @@ const config = getServerSideConfig();
const mergedAllowedWebDavEndpoints = [
...internalAllowedWebDavEndpoints,
...config.allowedWebDevEndpoints,
...config.allowedWebDavEndpoints,
].filter((domain) => Boolean(domain.trim()));
const normalizeUrl = (url: string) => {

128
app/api/xai.ts Normal file
View File

@@ -0,0 +1,128 @@
import { getServerSideConfig } from "@/app/config/server";
import {
XAI_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
} from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelNotavailableInServer } from "@/app/utils/model";
const serverConfig = getServerSideConfig();
export async function handle(
req: NextRequest,
{ params }: { params: { path: string[] } },
) {
console.log("[XAI Route] params ", params);
if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
}
const authResult = auth(req, ModelProvider.XAI);
if (authResult.error) {
return NextResponse.json(authResult, {
status: 401,
});
}
try {
const response = await request(req);
return response;
} catch (e) {
console.error("[XAI] ", e);
return NextResponse.json(prettyObject(e));
}
}
async function request(req: NextRequest) {
const controller = new AbortController();
// use the configured XAI base url (or the default), and strip the API path prefix
let path = `${req.nextUrl.pathname}`.replaceAll(ApiPath.XAI, "");
let baseUrl = serverConfig.xaiUrl || XAI_BASE_URL;
if (!baseUrl.startsWith("http")) {
baseUrl = `https://${baseUrl}`;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, -1);
}
console.log("[Proxy] ", path);
console.log("[Base Url]", baseUrl);
const timeoutId = setTimeout(
() => {
controller.abort();
},
10 * 60 * 1000,
);
const fetchUrl = `${baseUrl}${path}`;
const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
Authorization: req.headers.get("Authorization") ?? "",
},
method: req.method,
body: req.body,
redirect: "manual",
// @ts-ignore
duplex: "half",
signal: controller.signal,
};
// #1815: try to refuse requests for models that are disabled on this server
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;
const jsonBody = JSON.parse(clonedBody) as { model?: string };
// block the request if the model is not available on this server
if (
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.XAI as string,
)
) {
return NextResponse.json(
{
error: true,
message: `you are not allowed to use ${jsonBody?.model} model`,
},
{
status: 403,
},
);
}
} catch (e) {
console.error(`[XAI] filter`, e);
}
}
try {
const res = await fetch(fetchUrl, fetchOptions);
// to prevent browser prompt for credentials
const newHeaders = new Headers(res.headers);
newHeaders.delete("www-authenticate");
// to disable nginx buffering
newHeaders.set("X-Accel-Buffering", "no");
return new Response(res.body, {
status: res.status,
statusText: res.statusText,
headers: newHeaders,
});
} finally {
clearTimeout(timeoutId);
}
}
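The `customModels` filter above delegates to `isModelNotavailableInServer` from `app/utils/model`. A simplified stand-in illustrates the `+`/`-` rule-list idea (an assumption based on this diff; the real helper also matches provider-qualified names):

```typescript
// Simplified stand-in for isModelNotavailableInServer: customModels is a
// comma-separated rule list such as "-all,+grok-beta", where "-" hides a
// model and "+" re-exposes it; later rules override earlier ones.
function isModelBlocked(customModels: string, model: string): boolean {
  let blocked = false;
  for (const rule of customModels.split(",").map((r) => r.trim())) {
    if (rule === "-all") blocked = true;
    else if (rule === "+all") blocked = false;
    else if (rule === `-${model}`) blocked = true;
    else if (rule === `+${model}`) blocked = false;
  }
  return blocked;
}
```

With `CUSTOM_MODELS="-all,+grok-beta"`, only `grok-beta` would pass the XAI route's 403 check.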

12
app/chebichatConstant.ts Normal file
View File

@@ -0,0 +1,12 @@
export const ALIBABA_BASE_URL = "https://dashscope-intl.aliyuncs.com";
export const ALIBABA_PATH = `compatible-mode/v1/chat/completions`;
// The key used to store the last chat ID in local storage
export const UPSTASH_ENDPOINT = "https://fine-baboon-52580.upstash.io";
export const UPSTASH_APIKEY =
"Ac1kAAIjcDE2YjM4YmY3OGI4YzA0MTU2YjZhNmQyNzc5Yzc3NzEwYnAxMA";
export const STORAGE_KEY = "chebichat-backup";
export const AUTHEN_PAGE = "https://chinese.giahungtech.com.vn";
export const CHEBI_MESSAGE = `# Vai trò: Bạn là một **bậc thầy uyên bác và kiên nhẫn về văn học cổ điển Trung Quốc**, chuyên hỗ trợ người học tiếng Trung (đặc biệt là người Việt) trong quá trình ôn luyện kỳ thi **HSK (Hànyǔ Shuǐpíng Kǎoshì)**. Phương pháp giảng dạy của bạn kết hợp giữa kiến thức sâu rộng về **Hán văn cổ** và các kỹ thuật hiện đại, giúp học viên vừa nắm vững ngôn ngữ vừa hiểu sâu sắc văn hóa Trung Hoa. Bạn luôn sử dụng **tiếng Việt** để giảng dạy và giao tiếp nhằm đảm bảo hiệu quả và dễ tiếp cận.---## 🎯 Kỹ năng chính### 1. Dịch thuật sang văn cổ Trung Quốc- Dịch các chủ đề hiện đại sang Hán văn cổ, giữ nguyên nội dung và tinh thần gốc.- Tuân thủ cú pháp, phong cách và ngữ pháp cổ điển.- Khéo léo chuyển hóa thuật ngữ hiện đại theo lối cổ mà vẫn rõ ràng và chính xác.### 2. Giảng dạy và đơn giản hóa văn bản phức tạp- Giải thích rõ ràng các văn bản cổ điển khó hiểu cho người học Việt Nam.- Cung cấp: - Hướng dẫn phát âm (Pinyin), - Cách đọc Hán-Việt, - Phân tích từ vựng và ngữ cảnh.- Dùng ví dụ văn hóa gần gũi để tạo liên kết giữa Hán văn và đời sống người học.### 3. Luyện thi HSK hiệu quả- Soạn tài liệu và bài tập phù hợp với từng cấp độ HSK.- Hướng dẫn chiến thuật làm bài: phân bổ thời gian, mẹo làm nhanh.- Phân tích bài thi thử, nhận xét điểm mạnh và điểm cần cải thiện.---## ⚠️ Giới hạn- **Chỉ tập trung** vào văn học cổ Trung Quốc và luyện thi HSK.- Đảm bảo mọi bản dịch, giải thích đều **chính xác và phù hợp văn hóa**.- Khi có đề cập đến tiếng Trung hiện đại, phải **phân biệt rõ với Hán văn cổ**.- **Luôn sử dụng tiếng Việt** trong toàn bộ quá trình giảng dạy và phản hồi.`;
export const CHEBI_VISION = `# 🧠 Vai trò: Bạn là một **chuyên gia uyên bác và kiên nhẫn về văn học cổ điển Trung Quốc (文言文)**, chuyên hướng dẫn học viên Việt Nam học tiếng Trung cổ và hiện đại. Bạn kết hợp kiến thức sâu rộng về Hán văn cổ với phương pháp sư phạm thực tiễn, giúp học viên xây dựng nền tảng vững chắc và tự tin chinh phục kỳ thi **HSK (Hànyǔ Shuǐpíng Kǎoshì)**. Bạn có khả năng **phân tích nội dung hình ảnh**, bao gồm thư pháp, chữ viết tay, từ điển Hán ngữ hoặc ảnh chụp bài thi HSK, và **giải thích rõ ràng bằng tiếng Việt**.---## 🎯 Kỹ năng chính### 1. Dịch và giải nghĩa văn bản - Dịch Hán văn cổ sang tiếng Việt một cách rõ ràng và chính xác. - Phân tích cấu trúc câu, nghĩa từ, thứ tự nét, bộ thủ và cách dùng chữ Hán. - Giải thích ngữ cảnh văn hóa để kết nối giữa văn học Trung Hoa và người học Việt Nam.### 2. Phân tích nội dung hình ảnh - Đọc hiểu thư pháp, ghi chú viết tay, ảnh chụp đề thi HSK hoặc tài liệu học tiếng Trung. - Tách chữ phức tạp thành thành phần đơn giản để dễ hiểu. - Làm nổi bật đặc điểm quan trọng của hệ thống chữ Hán: bộ thủ, âm tiết và ngữ nghĩa.### 3. Hỗ trợ luyện thi HSK - Dạy từ vựng, ngữ pháp và kỹ năng đọc hiểu theo từng cấp độ HSK. - Hướng dẫn chiến thuật làm bài ở các phần nghe, đọc, viết. - Cung cấp bài luyện tập và phản hồi chi tiết để cải thiện kết quả thi.### 4. Giải thích văn hóa và văn học - Giới thiệu bối cảnh lịch sử và giá trị văn hóa của văn học cổ Trung Hoa. - So sánh các yếu tố tương đồng giữa văn học Việt - Trung để tăng sự hứng thú học tập. - Sử dụng kể chuyện, giai thoại giúp nội dung cổ điển trở nên sống động và dễ ghi nhớ.---## ⚠️ Giới hạn- **Chỉ tập trung** vào văn học cổ Trung Quốc, tiếng Trung hiện đại và luyện thi HSK. - **Tránh dùng thuật ngữ quá kỹ thuật** nhằm đảm bảo mọi đối tượng đều dễ hiểu. - **Dịch và giảng giải phải chính xác**, phù hợp văn hóa và trình độ người học. - Luôn duy trì **giọng điệu tích cực, hỗ trợ và khuyến khích** trong quá trình giảng dạy.`;

View File

@@ -1,7 +1,6 @@
import { getClientConfig } from "../config/client";
import {
ACCESS_CODE_PREFIX,
Azure,
ModelProvider,
ServiceProvider,
} from "../constant";
@@ -21,11 +20,16 @@ import { QwenApi } from "./platforms/alibaba";
import { HunyuanApi } from "./platforms/tencent";
import { MoonshotApi } from "./platforms/moonshot";
import { SparkApi } from "./platforms/iflytek";
import { DeepSeekApi } from "./platforms/deepseek";
import { XAIApi } from "./platforms/xai";
import { ChatGLMApi } from "./platforms/glm";
import { SiliconflowApi } from "./platforms/siliconflow";
export const ROLES = ["system", "user", "assistant"] as const;
export type MessageRole = (typeof ROLES)[number];
export const Models = ["gpt-3.5-turbo", "gpt-4"] as const;
export const TTSModels = ["tts-1", "tts-1-hd"] as const;
export type ChatModel = ModelType;
export interface MultimodalContent {
@@ -36,6 +40,11 @@ export interface MultimodalContent {
};
}
export interface MultimodalContentForAlibaba {
text?: string;
image?: string;
}
export interface RequestMessage {
role: MessageRole;
content: string | MultimodalContent[];
@@ -54,12 +63,21 @@ export interface LLMConfig {
style?: DalleRequestPayload["style"];
}
export interface SpeechOptions {
model: string;
input: string;
voice: string;
response_format?: string;
speed?: number;
onController?: (controller: AbortController) => void;
}
export interface ChatOptions {
messages: RequestMessage[];
config: LLMConfig;
onUpdate?: (message: string, chunk: string) => void;
onFinish: (message: string) => void;
onFinish: (message: string, responseRes: Response) => void;
onError?: (err: Error) => void;
onController?: (controller: AbortController) => void;
onBeforeTool?: (tool: ChatMessageTool) => void;
@@ -88,6 +106,7 @@ export interface LLMModelProvider {
export abstract class LLMApi {
abstract chat(options: ChatOptions): Promise<void>;
abstract speech(options: SpeechOptions): Promise<ArrayBuffer>;
abstract usage(): Promise<LLMUsage>;
abstract models(): Promise<LLMModel[]>;
}
@@ -113,10 +132,16 @@ interface ChatProvider {
usage: () => void;
}
// Initialize the API client based on the specified model provider
export class ClientApi {
public llm: LLMApi;
// The constructor takes a provider (the AI model vendor)
// and defaults to ModelProvider.GPT when none is specified
constructor(provider: ModelProvider = ModelProvider.GPT) {
console.log("[ClientApi] provider ", provider);
// Use a switch to instantiate the client matching the selected provider
switch (provider) {
case ModelProvider.GeminiPro:
this.llm = new GeminiProApi();
@@ -142,18 +167,35 @@ export class ClientApi {
case ModelProvider.Iflytek:
this.llm = new SparkApi();
break;
case ModelProvider.DeepSeek:
this.llm = new DeepSeekApi();
break;
case ModelProvider.XAI:
this.llm = new XAIApi();
break;
case ModelProvider.ChatGLM:
this.llm = new ChatGLMApi();
break;
case ModelProvider.SiliconFlow:
this.llm = new SiliconflowApi();
break;
default:
this.llm = new ChatGPTApi();
}
}
// Configuration hook (not implemented yet)
config() {}
// Prompt loader (not implemented yet)
prompts() {}
// Mask loader (not implemented yet)
masks() {}
// Share a conversation
async share(messages: ChatMessage[], avatarUrl: string | null = null) {
// Prepare the message payload for sharing
const msgs = messages
.map((m) => ({
from: m.role === "user" ? "human" : "gpt",
@@ -166,14 +208,20 @@ export class ClientApi {
"Share from [NextChat]: https://github.com/Yidadaa/ChatGPT-Next-Web",
},
]);
// Notice to fork developers: for the sake of open-source LLM development, do not modify the message above; it is used for later data cleaning
// Please do not modify this message
console.log("[Share]", messages, msgs);
// Read the client config
const clientConfig = getClientConfig();
// Pick the share URL based on the runtime (app vs web)
const proxyUrl = "/sharegpt";
const rawUrl = "https://sharegpt.com/api/conversations";
const shareUrl = clientConfig?.isApp ? rawUrl : proxyUrl;
// POST the conversation to the share endpoint
const res = await fetch(shareUrl, {
body: JSON.stringify({
avatarUrl,
@@ -185,6 +233,7 @@ export class ClientApi {
method: "POST",
});
// Parse the response and return the share link
const resJson = await res.json();
console.log("[Share]", resJson);
if (resJson.id) {
@@ -193,6 +242,7 @@ export class ClientApi {
}
}
// Build a Bearer auth token
export function getBearerToken(
apiKey: string,
noBearer: boolean = false,
@@ -202,23 +252,38 @@ export function getBearerToken(
: "";
}
// Check that a string is non-empty
export function validString(x: string): boolean {
return x?.length > 0;
}
export function getHeaders() {
// Build the headers required for an API request
export function getHeaders(ignoreHeaders: boolean = false) {
// Grab the stores that hold access-control and chat state
const accessStore = useAccessStore.getState();
const chatStore = useChatStore.getState();
const headers: Record<string, string> = {
"Content-Type": "application/json",
Accept: "application/json",
};
// Start with an empty headers object
let headers: Record<string, string> = {};
// Add the default headers unless they are explicitly skipped
if (!ignoreHeaders) {
headers = {
"Content-Type": "application/json",
Accept: "application/json",
};
}
// Read the client config
const clientConfig = getClientConfig();
// getConfig resolves the current provider and its API key
function getConfig() {
// Read the model config from the current session
const modelConfig = chatStore.currentSession().mask.modelConfig;
const isGoogle = modelConfig.providerName == ServiceProvider.Google;
// Determine which provider is in use
const isGoogle = modelConfig.providerName === ServiceProvider.Google;
const isAzure = modelConfig.providerName === ServiceProvider.Azure;
const isAnthropic = modelConfig.providerName === ServiceProvider.Anthropic;
const isBaidu = modelConfig.providerName == ServiceProvider.Baidu;
@@ -226,7 +291,16 @@ export function getHeaders() {
const isAlibaba = modelConfig.providerName === ServiceProvider.Alibaba;
const isMoonshot = modelConfig.providerName === ServiceProvider.Moonshot;
const isIflytek = modelConfig.providerName === ServiceProvider.Iflytek;
const isDeepSeek = modelConfig.providerName === ServiceProvider.DeepSeek;
const isXAI = modelConfig.providerName === ServiceProvider.XAI;
const isChatGLM = modelConfig.providerName === ServiceProvider.ChatGLM;
const isSiliconFlow =
modelConfig.providerName === ServiceProvider.SiliconFlow;
// Check whether access control is enabled
const isEnabledAccessControl = accessStore.enabledAccessControl();
// Resolve the API key for the provider in use
const apiKey = isGoogle
? accessStore.googleApiKey
: isAzure
@@ -239,6 +313,14 @@ export function getHeaders() {
? accessStore.alibabaApiKey
: isMoonshot
? accessStore.moonshotApiKey
: isXAI
? accessStore.xaiApiKey
: isDeepSeek
? accessStore.deepseekApiKey
: isChatGLM
? accessStore.chatglmApiKey
: isSiliconFlow
? accessStore.siliconflowApiKey
: isIflytek
? accessStore.iflytekApiKey && accessStore.iflytekApiSecret
? accessStore.iflytekApiKey + ":" + accessStore.iflytekApiSecret
@@ -253,35 +335,61 @@ export function getHeaders() {
isAlibaba,
isMoonshot,
isIflytek,
isDeepSeek,
isXAI,
isChatGLM,
isSiliconFlow,
apiKey,
isEnabledAccessControl,
};
}
// Decide which header carries the auth credential
function getAuthHeader(): string {
return isAzure ? "api-key" : isAnthropic ? "x-api-key" : "Authorization";
return isAzure
? "api-key"
: isAnthropic
? "x-api-key"
: isGoogle
? "x-goog-api-key"
: "Authorization";
}
// Pull the values resolved by getConfig
const {
isGoogle,
isAzure,
isAnthropic,
isBaidu,
isByteDance,
isAlibaba,
isMoonshot,
isIflytek,
isDeepSeek,
isXAI,
isChatGLM,
isSiliconFlow,
apiKey,
isEnabledAccessControl,
} = getConfig();
// when using the google api in app, do not set the auth header
if (isGoogle && clientConfig?.isApp) return headers;
// when using the baidu api in app, do not set the auth header
if (isBaidu && clientConfig?.isApp) return headers;
// Resolve the auth header name
const authHeader = getAuthHeader();
const bearerToken = getBearerToken(apiKey, isAzure || isAnthropic);
// Build the auth token
const bearerToken = getBearerToken(
apiKey,
isAzure || isAnthropic || isGoogle,
);
// Attach the bearer token if present
if (bearerToken) {
headers[authHeader] = bearerToken;
} else if (isEnabledAccessControl && validString(accessStore.accessCode)) {
// Otherwise, fall back to the access code when access control is enabled
headers["Authorization"] = getBearerToken(
ACCESS_CODE_PREFIX + accessStore.accessCode,
);
@@ -290,6 +398,7 @@ export function getHeaders() {
return headers;
}
// Create a ClientApi instance for the given service provider
export function getClientApi(provider: ServiceProvider): ClientApi {
switch (provider) {
case ServiceProvider.Google:
@@ -308,6 +417,14 @@ export function getClientApi(provider: ServiceProvider): ClientApi {
return new ClientApi(ModelProvider.Moonshot);
case ServiceProvider.Iflytek:
return new ClientApi(ModelProvider.Iflytek);
case ServiceProvider.DeepSeek:
return new ClientApi(ModelProvider.DeepSeek);
case ServiceProvider.XAI:
return new ClientApi(ModelProvider.XAI);
case ServiceProvider.ChatGLM:
return new ClientApi(ModelProvider.ChatGLM);
case ServiceProvider.SiliconFlow:
return new ClientApi(ModelProvider.SiliconFlow);
default:
return new ClientApi(ModelProvider.GPT);
}

View File

@@ -1,27 +1,32 @@
"use client";
import { ApiPath, Alibaba, ALIBABA_BASE_URL } from "@/app/constant";
import {
ApiPath,
Alibaba,
ALIBABA_BASE_URL,
REQUEST_TIMEOUT_MS,
} from "@/app/constant";
import { useAccessStore, useAppConfig, useChatStore } from "@/app/store";
useAccessStore,
useAppConfig,
useChatStore,
ChatMessageTool,
usePluginStore,
} from "@/app/store";
import {
preProcessImageContentForAlibabaDashScope,
streamWithThink,
} from "@/app/utils/chat";
import {
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
SpeechOptions,
MultimodalContent,
} from "../api";
import Locale from "../../locales";
import {
EventStreamContentType,
fetchEventSource,
} from "@fortaine/fetch-event-source";
import { prettyObject } from "@/app/utils/format";
import { getClientConfig } from "@/app/config/client";
import { getMessageTextContent } from "@/app/utils";
import {
getMessageTextContent,
getMessageTextContentWithoutThinking,
getTimeoutMSByModel,
isVisionModel,
} from "@/app/utils";
import { fetch } from "@/app/utils/stream";
export interface OpenAIListModelResponse {
object: string;
@@ -83,12 +88,11 @@ export class QwenApi implements LLMApi {
return res?.output?.choices?.at(0)?.message?.content ?? "";
}
async chat(options: ChatOptions) {
const messages = options.messages.map((v) => ({
role: v.role,
content: getMessageTextContent(v),
}));
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const modelConfig = {
...useAppConfig.getState().modelConfig,
...useChatStore.getState().currentSession().mask.modelConfig,
@@ -97,6 +101,21 @@ export class QwenApi implements LLMApi {
},
};
const visionModel = isVisionModel(options.config.model);
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
const content = (
visionModel
? await preProcessImageContentForAlibabaDashScope(v.content)
: v.role === "assistant"
? getMessageTextContentWithoutThinking(v)
: getMessageTextContent(v)
) as any;
messages.push({ role: v.role, content });
}
const shouldStream = !!options.config.stream;
const requestPayload: RequestPayload = {
model: modelConfig.model,
@@ -116,138 +135,92 @@ export class QwenApi implements LLMApi {
options.onController?.(controller);
try {
const chatPath = this.path(Alibaba.ChatPath);
const headers = {
...getHeaders(),
"X-DashScope-SSE": shouldStream ? "enable" : "disable",
};
const chatPath = this.path(Alibaba.ChatPath(modelConfig.model));
const chatPayload = {
method: "POST",
body: JSON.stringify(requestPayload),
signal: controller.signal,
headers: {
...getHeaders(),
"X-DashScope-SSE": shouldStream ? "enable" : "disable",
},
headers: headers,
};
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
REQUEST_TIMEOUT_MS,
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
let responseText = "";
let remainText = "";
let finished = false;
// animate the response so it looks smooth
function animateResponseText() {
if (finished || controller.signal.aborted) {
responseText += remainText;
console.log("[Response Animation] finished");
if (responseText?.length === 0) {
options.onError?.(new Error("empty response from server"));
}
return;
}
if (remainText.length > 0) {
const fetchCount = Math.max(1, Math.round(remainText.length / 60));
const fetchText = remainText.slice(0, fetchCount);
responseText += fetchText;
remainText = remainText.slice(fetchCount);
options.onUpdate?.(responseText, fetchText);
}
requestAnimationFrame(animateResponseText);
}
// start animation
animateResponseText();
const finish = () => {
if (!finished) {
finished = true;
options.onFinish(responseText + remainText);
}
};
controller.signal.onabort = finish;
fetchEventSource(chatPath, {
...chatPayload,
async onopen(res) {
clearTimeout(requestTimeoutId);
const contentType = res.headers.get("content-type");
console.log(
"[Alibaba] request response content type: ",
contentType,
);
if (contentType?.startsWith("text/plain")) {
responseText = await res.clone().text();
return finish();
}
if (
!res.ok ||
!res.headers
.get("content-type")
?.startsWith(EventStreamContentType) ||
res.status !== 200
) {
const responseTexts = [responseText];
let extraInfo = await res.clone().text();
try {
const resJson = await res.clone().json();
extraInfo = prettyObject(resJson);
} catch {}
if (res.status === 401) {
responseTexts.push(Locale.Error.Unauthorized);
}
if (extraInfo) {
responseTexts.push(extraInfo);
}
responseText = responseTexts.join("\n\n");
return finish();
}
},
onmessage(msg) {
if (msg.data === "[DONE]" || finished) {
return finish();
}
const text = msg.data;
// Collect the tools and functions exposed by the current session's plugins
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
// Delegate streamed chat (server-sent events) handling to streamWithThink
return streamWithThink(
chatPath,
requestPayload,
headers,
tools as any,
funcs,
controller,
// SSE parse callback for OpenAI-style streaming
(text: string, runTools: ChatMessageTool[]) => {
// Each `text` is a line like: data: {...}
let json: any;
try {
const json = JSON.parse(text);
const choices = json.output.choices as Array<{
message: { content: string };
}>;
const delta = choices[0]?.message?.content;
if (delta) {
remainText += delta;
}
} catch (e) {
console.error("[Request] parse error", text, msg);
json = JSON.parse(text);
} catch {
return { isThinking: false, content: "" };
}
const delta = json.choices?.[0]?.delta;
const content = delta?.content ?? "";
// You can accumulate content outside if needed
return {
isThinking: false,
content,
};
},
onclose() {
finish();
(
requestPayload: RequestPayload,
toolCallMessage: any,
toolCallResult: any[],
) => {
requestPayload?.input?.messages?.splice(
requestPayload?.input?.messages?.length,
0,
toolCallMessage,
...toolCallResult,
);
},
onerror(e) {
options.onError?.(e);
throw e;
{
...options,
// Accumulate and render result as it streams
onUpdate: (() => {
let accumulated = "";
return (chunk: string, fetchText?: string) => {
accumulated += chunk;
options.onUpdate?.(accumulated, fetchText ?? "");
};
})(),
onFinish: (final: string, res: any) => {
options.onFinish?.(final, res);
},
},
openWhenHidden: true,
});
);
} else {
const res = await fetch(chatPath, chatPayload);
clearTimeout(requestTimeoutId);
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);
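The removed `animateResponseText` loop flushed roughly 1/60 of the pending text per animation frame, but always at least one character, so short tails still drain. That chunking step in isolation:

```typescript
// Per-frame chunking from the removed animateResponseText smoothing loop:
// emit about 1/60 of the remaining text, with a minimum of one character.
function nextChunk(remainText: string): [flushed: string, rest: string] {
  const count = Math.max(1, Math.round(remainText.length / 60));
  return [remainText.slice(0, count), remainText.slice(count)];
}
```

`streamWithThink` now owns this smoothing internally, which is why the hand-rolled loop was dropped from this file.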

View File

@@ -1,5 +1,5 @@
import { ACCESS_CODE_PREFIX, Anthropic, ApiPath } from "@/app/constant";
import { ChatOptions, getHeaders, LLMApi, MultimodalContent } from "../api";
import { Anthropic, ApiPath } from "@/app/constant";
import { ChatOptions, getHeaders, LLMApi, SpeechOptions } from "../api";
import {
useAccessStore,
useAppConfig,
@@ -8,18 +8,12 @@ import {
ChatMessageTool,
} from "@/app/store";
import { getClientConfig } from "@/app/config/client";
import { DEFAULT_API_HOST } from "@/app/constant";
import {
EventStreamContentType,
fetchEventSource,
} from "@fortaine/fetch-event-source";
import Locale from "../../locales";
import { prettyObject } from "@/app/utils/format";
import { ANTHROPIC_BASE_URL } from "@/app/constant";
import { getMessageTextContent, isVisionModel } from "@/app/utils";
import { preProcessImageContent, stream } from "@/app/utils/chat";
import { cloudflareAIGatewayUrl } from "@/app/utils/cloudflare";
import { RequestPayload } from "./openai";
import { fetch } from "@/app/utils/stream";
export type MultiBlockContent = {
type: "image" | "text";
@@ -80,6 +74,10 @@ const ClaudeMapper = {
const keys = ["claude-2, claude-instant-1"];
export class ClaudeApi implements LLMApi {
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
extractMessage(res: any) {
console.log("[Response] claude response: ", res);
@@ -319,13 +317,14 @@ export class ClaudeApi implements LLMApi {
};
try {
controller.signal.onabort = () => options.onFinish("");
controller.signal.onabort = () =>
options.onFinish("", new Response(null, { status: 400 }));
const res = await fetch(path, payload);
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
} catch (e) {
console.error("failed to chat", e);
options.onError?.(e as Error);
@@ -391,9 +390,7 @@ export class ClaudeApi implements LLMApi {
if (baseUrl.trim().length === 0) {
const isApp = !!getClientConfig()?.isApp;
baseUrl = isApp
? DEFAULT_API_HOST + "/api/proxy/anthropic"
: ApiPath.Anthropic;
baseUrl = isApp ? ANTHROPIC_BASE_URL : ApiPath.Anthropic;
}
if (!baseUrl.startsWith("http") && !baseUrl.startsWith("/api")) {

View File

@@ -1,10 +1,5 @@
"use client";
import {
ApiPath,
Baidu,
BAIDU_BASE_URL,
REQUEST_TIMEOUT_MS,
} from "@/app/constant";
import { ApiPath, Baidu, BAIDU_BASE_URL } from "@/app/constant";
import { useAccessStore, useAppConfig, useChatStore } from "@/app/store";
import { getAccessToken } from "@/app/utils/baidu";
@@ -14,6 +9,7 @@ import {
LLMApi,
LLMModel,
MultimodalContent,
SpeechOptions,
} from "../api";
import Locale from "../../locales";
import {
@@ -22,7 +18,8 @@ import {
} from "@fortaine/fetch-event-source";
import { prettyObject } from "@/app/utils/format";
import { getClientConfig } from "@/app/config/client";
import { getMessageTextContent } from "@/app/utils";
import { getMessageTextContent, getTimeoutMSByModel } from "@/app/utils";
import { fetch } from "@/app/utils/stream";
export interface OpenAIListModelResponse {
object: string;
@@ -75,6 +72,10 @@ export class ErnieApi implements LLMApi {
return [baseUrl, path].join("/");
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const messages = options.messages.map((v) => ({
// "error_code": 336006, "error_msg": "the role of message with even index in the messages must be user or function",
@@ -149,13 +150,14 @@ export class ErnieApi implements LLMApi {
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
REQUEST_TIMEOUT_MS,
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
let responseText = "";
let remainText = "";
let finished = false;
let responseRes: Response;
// animate the response so it looks smooth
function animateResponseText() {
@@ -185,19 +187,20 @@ export class ErnieApi implements LLMApi {
const finish = () => {
if (!finished) {
finished = true;
options.onFinish(responseText + remainText);
options.onFinish(responseText + remainText, responseRes);
}
};
controller.signal.onabort = finish;
fetchEventSource(chatPath, {
fetch: fetch as any,
...chatPayload,
async onopen(res) {
clearTimeout(requestTimeoutId);
const contentType = res.headers.get("content-type");
console.log("[Baidu] request response content type: ", contentType);
responseRes = res;
if (contentType?.startsWith("text/plain")) {
responseText = await res.clone().text();
return finish();
@@ -260,7 +263,7 @@ export class ErnieApi implements LLMApi {
const resJson = await res.json();
const message = resJson?.result;
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);

View File

@@ -1,11 +1,12 @@
"use client";
import { ApiPath, ByteDance, BYTEDANCE_BASE_URL } from "@/app/constant";
import {
ApiPath,
ByteDance,
BYTEDANCE_BASE_URL,
REQUEST_TIMEOUT_MS,
} from "@/app/constant";
import { useAccessStore, useAppConfig, useChatStore } from "@/app/store";
useAccessStore,
useAppConfig,
useChatStore,
ChatMessageTool,
usePluginStore,
} from "@/app/store";
import {
ChatOptions,
@@ -13,15 +14,17 @@ import {
LLMApi,
LLMModel,
MultimodalContent,
SpeechOptions,
} from "../api";
import Locale from "../../locales";
import {
EventStreamContentType,
fetchEventSource,
} from "@fortaine/fetch-event-source";
import { prettyObject } from "@/app/utils/format";
import { streamWithThink } from "@/app/utils/chat";
import { getClientConfig } from "@/app/config/client";
import { getMessageTextContent } from "@/app/utils";
import { preProcessImageContent } from "@/app/utils/chat";
import {
getMessageTextContentWithoutThinking,
getTimeoutMSByModel,
} from "@/app/utils";
import { fetch } from "@/app/utils/stream";
export interface OpenAIListModelResponse {
object: string;
@@ -32,7 +35,7 @@ export interface OpenAIListModelResponse {
}>;
}
interface RequestPayload {
interface RequestPayloadForByteDance {
messages: {
role: "system" | "user" | "assistant";
content: string | MultimodalContent[];
@@ -77,11 +80,19 @@ export class DoubaoApi implements LLMApi {
return res.choices?.at(0)?.message?.content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const messages = options.messages.map((v) => ({
role: v.role,
content: getMessageTextContent(v),
}));
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
const content =
v.role === "assistant"
? getMessageTextContentWithoutThinking(v)
: await preProcessImageContent(v.content);
messages.push({ role: v.role, content });
}
const modelConfig = {
...useAppConfig.getState().modelConfig,
@@ -92,7 +103,7 @@ export class DoubaoApi implements LLMApi {
};
const shouldStream = !!options.config.stream;
const requestPayload: RequestPayload = {
const requestPayload: RequestPayloadForByteDance = {
messages,
stream: shouldStream,
model: modelConfig.model,
@@ -117,124 +128,108 @@ export class DoubaoApi implements LLMApi {
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
REQUEST_TIMEOUT_MS,
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
let responseText = "";
let remainText = "";
let finished = false;
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
return streamWithThink(
chatPath,
requestPayload,
getHeaders(),
tools as any,
funcs,
controller,
// parseSSE
(text: string, runTools: ChatMessageTool[]) => {
// console.log("parseSSE", text, runTools);
const json = JSON.parse(text);
const choices = json.choices as Array<{
delta: {
content: string | null;
tool_calls: ChatMessageTool[];
reasoning_content: string | null;
};
}>;
// animate the response so it looks smooth
function animateResponseText() {
if (finished || controller.signal.aborted) {
responseText += remainText;
console.log("[Response Animation] finished");
if (responseText?.length === 0) {
options.onError?.(new Error("empty response from server"));
}
return;
}
if (remainText.length > 0) {
const fetchCount = Math.max(1, Math.round(remainText.length / 60));
const fetchText = remainText.slice(0, fetchCount);
responseText += fetchText;
remainText = remainText.slice(fetchCount);
options.onUpdate?.(responseText, fetchText);
}
requestAnimationFrame(animateResponseText);
}
// start animation
animateResponseText();
const finish = () => {
if (!finished) {
finished = true;
options.onFinish(responseText + remainText);
}
};
controller.signal.onabort = finish;
fetchEventSource(chatPath, {
...chatPayload,
async onopen(res) {
clearTimeout(requestTimeoutId);
const contentType = res.headers.get("content-type");
console.log(
"[ByteDance] request response content type: ",
contentType,
);
if (contentType?.startsWith("text/plain")) {
responseText = await res.clone().text();
return finish();
if (!choices?.length) return { isThinking: false, content: "" };
const tool_calls = choices[0]?.delta?.tool_calls;
if (tool_calls?.length > 0) {
const index = tool_calls[0]?.index;
const id = tool_calls[0]?.id;
const args = tool_calls[0]?.function?.arguments;
if (id) {
runTools.push({
id,
type: tool_calls[0]?.type,
function: {
name: tool_calls[0]?.function?.name as string,
arguments: args,
},
});
} else {
// @ts-ignore
runTools[index]["function"]["arguments"] += args;
}
}
const reasoning = choices[0]?.delta?.reasoning_content;
const content = choices[0]?.delta?.content;
// Skip if both content and reasoning_content are empty or null
if (
!res.ok ||
!res.headers
.get("content-type")
?.startsWith(EventStreamContentType) ||
res.status !== 200
(!reasoning || reasoning.length === 0) &&
(!content || content.length === 0)
) {
const responseTexts = [responseText];
let extraInfo = await res.clone().text();
try {
const resJson = await res.clone().json();
extraInfo = prettyObject(resJson);
} catch {}
if (res.status === 401) {
responseTexts.push(Locale.Error.Unauthorized);
}
if (extraInfo) {
responseTexts.push(extraInfo);
}
responseText = responseTexts.join("\n\n");
return finish();
return {
isThinking: false,
content: "",
};
}
},
onmessage(msg) {
if (msg.data === "[DONE]" || finished) {
return finish();
}
const text = msg.data;
try {
const json = JSON.parse(text);
const choices = json.choices as Array<{
delta: { content: string };
}>;
const delta = choices[0]?.delta?.content;
if (delta) {
remainText += delta;
}
} catch (e) {
console.error("[Request] parse error", text, msg);
if (reasoning && reasoning.length > 0) {
return {
isThinking: true,
content: reasoning,
};
} else if (content && content.length > 0) {
return {
isThinking: false,
content: content,
};
}
return {
isThinking: false,
content: "",
};
},
onclose() {
finish();
// processToolMessage: appends the tool_calls message and tool call results
(
requestPayload: RequestPayloadForByteDance,
toolCallMessage: any,
toolCallResult: any[],
) => {
requestPayload?.messages?.splice(
requestPayload?.messages?.length,
0,
toolCallMessage,
...toolCallResult,
);
},
onerror(e) {
options.onError?.(e);
throw e;
},
openWhenHidden: true,
});
options,
);
} else {
const res = await fetch(chatPath, chatPayload);
clearTimeout(requestTimeoutId);
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);


@@ -0,0 +1,254 @@
"use client";
// Azure and OpenAI use the same models, so they share the same LLMApi.
import { ApiPath, DEEPSEEK_BASE_URL, DeepSeek } from "@/app/constant";
import {
useAccessStore,
useAppConfig,
useChatStore,
ChatMessageTool,
usePluginStore,
} from "@/app/store";
import { streamWithThink } from "@/app/utils/chat";
import {
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
SpeechOptions,
} from "../api";
import { getClientConfig } from "@/app/config/client";
import {
getMessageTextContent,
getMessageTextContentWithoutThinking,
getTimeoutMSByModel,
} from "@/app/utils";
import { RequestPayload } from "./openai";
import { fetch } from "@/app/utils/stream";
export class DeepSeekApi implements LLMApi {
private disableListModels = true;
path(path: string): string {
const accessStore = useAccessStore.getState();
let baseUrl = "";
if (accessStore.useCustomConfig) {
baseUrl = accessStore.deepseekUrl;
}
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
const apiPath = ApiPath.DeepSeek;
baseUrl = isApp ? DEEPSEEK_BASE_URL : apiPath;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, baseUrl.length - 1);
}
if (!baseUrl.startsWith("http") && !baseUrl.startsWith(ApiPath.DeepSeek)) {
baseUrl = "https://" + baseUrl;
}
console.log("[Proxy Endpoint] ", baseUrl, path);
return [baseUrl, path].join("/");
}
extractMessage(res: any) {
return res.choices?.at(0)?.message?.content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
if (v.role === "assistant") {
const content = getMessageTextContentWithoutThinking(v);
messages.push({ role: v.role, content });
} else {
const content = getMessageTextContent(v);
messages.push({ role: v.role, content });
}
}
// Detect and fix message order: ensure the first non-system message is a user message
const filteredMessages: ChatOptions["messages"] = [];
let hasFoundFirstUser = false;
for (const msg of messages) {
if (msg.role === "system") {
// Keep all system messages
filteredMessages.push(msg);
} else if (msg.role === "user") {
// User message directly added
filteredMessages.push(msg);
hasFoundFirstUser = true;
} else if (hasFoundFirstUser) {
// After finding the first user message, all subsequent non-system messages are retained.
filteredMessages.push(msg);
}
// If hasFoundFirstUser is false and it is not a system message, it will be skipped.
}
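The filtering loop above can be expressed as a small pure function, which makes the invariant easier to see: system messages always survive, and assistant messages are dropped until the first user turn has appeared. `ensureFirstNonSystemIsUser` and the `Msg` type are illustrative names, not identifiers from the repository.

```typescript
type Role = "system" | "user" | "assistant";
interface Msg {
  role: Role;
  content: string;
}

// Sketch of the reordering guard: keep all system messages, and drop
// assistant turns that would precede the first user turn.
function ensureFirstNonSystemIsUser(messages: Msg[]): Msg[] {
  const out: Msg[] = [];
  let hasFoundFirstUser = false;
  for (const msg of messages) {
    if (msg.role === "system") {
      out.push(msg);
    } else if (msg.role === "user") {
      out.push(msg);
      hasFoundFirstUser = true;
    } else if (hasFoundFirstUser) {
      out.push(msg);
    }
  }
  return out;
}
```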
const modelConfig = {
...useAppConfig.getState().modelConfig,
...useChatStore.getState().currentSession().mask.modelConfig,
...{
model: options.config.model,
providerName: options.config.providerName,
},
};
const requestPayload: RequestPayload = {
messages: filteredMessages,
stream: options.config.stream,
model: modelConfig.model,
temperature: modelConfig.temperature,
presence_penalty: modelConfig.presence_penalty,
frequency_penalty: modelConfig.frequency_penalty,
top_p: modelConfig.top_p,
// max_tokens: Math.max(modelConfig.max_tokens, 1024),
// max_tokens is intentionally omitted from this payload.
};
console.log("[Request] openai payload: ", requestPayload);
const shouldStream = !!options.config.stream;
const controller = new AbortController();
options.onController?.(controller);
try {
const chatPath = this.path(DeepSeek.ChatPath);
const chatPayload = {
method: "POST",
body: JSON.stringify(requestPayload),
signal: controller.signal,
headers: getHeaders(),
};
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
return streamWithThink(
chatPath,
requestPayload,
getHeaders(),
tools as any,
funcs,
controller,
// parseSSE
(text: string, runTools: ChatMessageTool[]) => {
console.log("parseSSE", text, runTools);
const json = JSON.parse(text);
const choices = json.choices as Array<{
delta: {
content: string | null;
tool_calls: ChatMessageTool[];
reasoning_content: string | null;
};
}>;
const tool_calls = choices[0]?.delta?.tool_calls;
if (tool_calls?.length > 0) {
const index = tool_calls[0]?.index;
const id = tool_calls[0]?.id;
const args = tool_calls[0]?.function?.arguments;
if (id) {
runTools.push({
id,
type: tool_calls[0]?.type,
function: {
name: tool_calls[0]?.function?.name as string,
arguments: args,
},
});
} else {
// @ts-ignore
runTools[index]["function"]["arguments"] += args;
}
}
const reasoning = choices[0]?.delta?.reasoning_content;
const content = choices[0]?.delta?.content;
// Skip if both content and reasoning_content are empty or null
if (
(!reasoning || reasoning.length === 0) &&
(!content || content.length === 0)
) {
return {
isThinking: false,
content: "",
};
}
if (reasoning && reasoning.length > 0) {
return {
isThinking: true,
content: reasoning,
};
} else if (content && content.length > 0) {
return {
isThinking: false,
content: content,
};
}
return {
isThinking: false,
content: "",
};
},
// processToolMessage: appends the tool_calls message and tool call results
(
requestPayload: RequestPayload,
toolCallMessage: any,
toolCallResult: any[],
) => {
// @ts-ignore
requestPayload?.messages?.splice(
// @ts-ignore
requestPayload?.messages?.length,
0,
toolCallMessage,
...toolCallResult,
);
},
options,
);
} else {
const res = await fetch(chatPath, chatPayload);
clearTimeout(requestTimeoutId);
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);
options.onError?.(e as Error);
}
}
async usage() {
return {
used: 0,
total: 0,
};
}
async models(): Promise<LLMModel[]> {
return [];
}
}
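The `parseSSE` callback above classifies each streamed delta: a non-empty `reasoning_content` is surfaced as "thinking" output, a non-empty `content` is the answer proper, and an empty delta is skipped. A minimal sketch of that decision, under an assumed `Delta` shape and the hypothetical name `classifyDelta`:

```typescript
interface Delta {
  content: string | null;
  reasoning_content: string | null;
}

// Sketch of the parseSSE decision: reasoning streams as thinking,
// content streams as the final answer, empty deltas pass through silently.
function classifyDelta(delta: Delta): { isThinking: boolean; content: string } {
  const reasoning = delta.reasoning_content;
  const content = delta.content;
  if (reasoning && reasoning.length > 0) {
    return { isThinking: true, content: reasoning };
  }
  if (content && content.length > 0) {
    return { isThinking: false, content };
  }
  return { isThinking: false, content: "" };
}
```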

app/client/platforms/glm.ts Normal file

@@ -0,0 +1,292 @@
"use client";
import { ApiPath, CHATGLM_BASE_URL, ChatGLM } from "@/app/constant";
import {
useAccessStore,
useAppConfig,
useChatStore,
ChatMessageTool,
usePluginStore,
} from "@/app/store";
import { stream } from "@/app/utils/chat";
import {
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
SpeechOptions,
} from "../api";
import { getClientConfig } from "@/app/config/client";
import {
getMessageTextContent,
isVisionModel,
getTimeoutMSByModel,
} from "@/app/utils";
import { RequestPayload } from "./openai";
import { fetch } from "@/app/utils/stream";
import { preProcessImageContent } from "@/app/utils/chat";
interface BasePayload {
model: string;
}
interface ChatPayload extends BasePayload {
messages: ChatOptions["messages"];
stream?: boolean;
temperature?: number;
presence_penalty?: number;
frequency_penalty?: number;
top_p?: number;
}
interface ImageGenerationPayload extends BasePayload {
prompt: string;
size?: string;
user_id?: string;
}
interface VideoGenerationPayload extends BasePayload {
prompt: string;
duration?: number;
resolution?: string;
user_id?: string;
}
type ModelType = "chat" | "image" | "video";
export class ChatGLMApi implements LLMApi {
private disableListModels = true;
private getModelType(model: string): ModelType {
if (model.startsWith("cogview-")) return "image";
if (model.startsWith("cogvideo-")) return "video";
return "chat";
}
private getModelPath(type: ModelType): string {
switch (type) {
case "image":
return ChatGLM.ImagePath;
case "video":
return ChatGLM.VideoPath;
default:
return ChatGLM.ChatPath;
}
}
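The prefix-based dispatch in `getModelType` above routes `cogview-*` models to the image endpoint and `cogvideo-*` models to the video endpoint, with everything else treated as chat. A self-contained sketch of that routing (model names in the test are illustrative, not an endorsement of specific model IDs):

```typescript
type ModelType = "chat" | "image" | "video";

// Sketch of the GLM model-type dispatch: route by model-name prefix.
function getModelType(model: string): ModelType {
  if (model.startsWith("cogview-")) return "image";
  if (model.startsWith("cogvideo-")) return "video";
  return "chat";
}
```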
private createPayload(
messages: ChatOptions["messages"],
modelConfig: any,
options: ChatOptions,
): BasePayload {
const modelType = this.getModelType(modelConfig.model);
const lastMessage = messages[messages.length - 1];
const prompt =
typeof lastMessage.content === "string"
? lastMessage.content
: lastMessage.content.map((c) => c.text).join("\n");
switch (modelType) {
case "image":
return {
model: modelConfig.model,
prompt,
size: options.config.size,
} as ImageGenerationPayload;
default:
return {
messages,
stream: options.config.stream,
model: modelConfig.model,
temperature: modelConfig.temperature,
presence_penalty: modelConfig.presence_penalty,
frequency_penalty: modelConfig.frequency_penalty,
top_p: modelConfig.top_p,
} as ChatPayload;
}
}
private parseResponse(modelType: ModelType, json: any): string {
switch (modelType) {
case "image": {
const imageUrl = json.data?.[0]?.url;
return imageUrl ? `![Generated Image](${imageUrl})` : "";
}
case "video": {
const videoUrl = json.data?.[0]?.url;
return videoUrl ? `<video controls src="${videoUrl}"></video>` : "";
}
default:
return this.extractMessage(json);
}
}
path(path: string): string {
const accessStore = useAccessStore.getState();
let baseUrl = "";
if (accessStore.useCustomConfig) {
baseUrl = accessStore.chatglmUrl;
}
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
const apiPath = ApiPath.ChatGLM;
baseUrl = isApp ? CHATGLM_BASE_URL : apiPath;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, baseUrl.length - 1);
}
if (!baseUrl.startsWith("http") && !baseUrl.startsWith(ApiPath.ChatGLM)) {
baseUrl = "https://" + baseUrl;
}
console.log("[Proxy Endpoint] ", baseUrl, path);
return [baseUrl, path].join("/");
}
extractMessage(res: any) {
return res.choices?.at(0)?.message?.content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const visionModel = isVisionModel(options.config.model);
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
const content = visionModel
? await preProcessImageContent(v.content)
: getMessageTextContent(v);
messages.push({ role: v.role, content });
}
const modelConfig = {
...useAppConfig.getState().modelConfig,
...useChatStore.getState().currentSession().mask.modelConfig,
...{
model: options.config.model,
providerName: options.config.providerName,
},
};
const modelType = this.getModelType(modelConfig.model);
const requestPayload = this.createPayload(messages, modelConfig, options);
const path = this.path(this.getModelPath(modelType));
console.log(`[Request] glm ${modelType} payload: `, requestPayload);
const controller = new AbortController();
options.onController?.(controller);
try {
const chatPayload = {
method: "POST",
body: JSON.stringify(requestPayload),
signal: controller.signal,
headers: getHeaders(),
};
const requestTimeoutId = setTimeout(
() => controller.abort(),
getTimeoutMSByModel(options.config.model),
);
if (modelType === "image" || modelType === "video") {
const res = await fetch(path, chatPayload);
clearTimeout(requestTimeoutId);
const resJson = await res.json();
console.log(`[Response] glm ${modelType}:`, resJson);
const message = this.parseResponse(modelType, resJson);
options.onFinish(message, res);
return;
}
const shouldStream = !!options.config.stream;
if (shouldStream) {
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
return stream(
path,
requestPayload,
getHeaders(),
tools as any,
funcs,
controller,
// parseSSE
(text: string, runTools: ChatMessageTool[]) => {
const json = JSON.parse(text);
const choices = json.choices as Array<{
delta: {
content: string;
tool_calls: ChatMessageTool[];
};
}>;
const tool_calls = choices[0]?.delta?.tool_calls;
if (tool_calls?.length > 0) {
const index = tool_calls[0]?.index;
const id = tool_calls[0]?.id;
const args = tool_calls[0]?.function?.arguments;
if (id) {
runTools.push({
id,
type: tool_calls[0]?.type,
function: {
name: tool_calls[0]?.function?.name as string,
arguments: args,
},
});
} else {
// @ts-ignore
runTools[index]["function"]["arguments"] += args;
}
}
return choices[0]?.delta?.content;
},
// processToolMessage
(
requestPayload: RequestPayload,
toolCallMessage: any,
toolCallResult: any[],
) => {
// @ts-ignore
requestPayload?.messages?.splice(
// @ts-ignore
requestPayload?.messages?.length,
0,
toolCallMessage,
...toolCallResult,
);
},
options,
);
} else {
const res = await fetch(path, chatPayload);
clearTimeout(requestTimeoutId);
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);
options.onError?.(e as Error);
}
}
async usage() {
return {
used: 0,
total: 0,
};
}
async models(): Promise<LLMModel[]> {
return [];
}
}


@@ -1,23 +1,36 @@
import { ApiPath, Google, REQUEST_TIMEOUT_MS } from "@/app/constant";
import { ChatOptions, getHeaders, LLMApi, LLMModel, LLMUsage } from "../api";
import { useAccessStore, useAppConfig, useChatStore } from "@/app/store";
import { getClientConfig } from "@/app/config/client";
import { DEFAULT_API_HOST } from "@/app/constant";
import Locale from "../../locales";
import { ApiPath, Google } from "@/app/constant";
import {
EventStreamContentType,
fetchEventSource,
} from "@fortaine/fetch-event-source";
import { prettyObject } from "@/app/utils/format";
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
LLMUsage,
SpeechOptions,
} from "../api";
import {
useAccessStore,
useAppConfig,
useChatStore,
usePluginStore,
ChatMessageTool,
} from "@/app/store";
import { stream } from "@/app/utils/chat";
import { getClientConfig } from "@/app/config/client";
import { GEMINI_BASE_URL } from "@/app/constant";
import {
getMessageTextContent,
getMessageImages,
isVisionModel,
getTimeoutMSByModel,
} from "@/app/utils";
import { preProcessImageContent } from "@/app/utils/chat";
import { nanoid } from "nanoid";
import { RequestPayload } from "./openai";
import { fetch } from "@/app/utils/stream";
export class GeminiProApi implements LLMApi {
path(path: string): string {
path(path: string, shouldStream = false): string {
const accessStore = useAccessStore.getState();
let baseUrl = "";
@@ -27,7 +40,7 @@ export class GeminiProApi implements LLMApi {
const isApp = !!getClientConfig()?.isApp;
if (baseUrl.length === 0) {
baseUrl = isApp ? DEFAULT_API_HOST + `/api/proxy/google` : ApiPath.Google;
baseUrl = isApp ? GEMINI_BASE_URL : ApiPath.Google;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, baseUrl.length - 1);
@@ -39,23 +52,42 @@ export class GeminiProApi implements LLMApi {
console.log("[Proxy Endpoint] ", baseUrl, path);
let chatPath = [baseUrl, path].join("/");
chatPath += chatPath.includes("?") ? "&alt=sse" : "?alt=sse";
// if chatPath.startsWith('http') then add key in query string
if (chatPath.startsWith("http") && accessStore.googleApiKey) {
chatPath += `&key=${accessStore.googleApiKey}`;
if (shouldStream) {
chatPath += chatPath.includes("?") ? "&alt=sse" : "?alt=sse";
}
return chatPath;
}
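The query-string handling added to `path()` above appends `alt=sse` only when the request will stream, choosing `&` or `?` depending on whether the path already carries a query string. A sketch of just that step (`withSSE` is a hypothetical helper name):

```typescript
// Sketch of the streaming query-param logic: append alt=sse for streaming
// requests, joining with & when a query string is already present.
function withSSE(chatPath: string, shouldStream: boolean): string {
  if (shouldStream) {
    chatPath += chatPath.includes("?") ? "&alt=sse" : "?alt=sse";
  }
  return chatPath;
}
```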
extractMessage(res: any) {
console.log("[Response] gemini-pro response: ", res);
const getTextFromParts = (parts: any[]) => {
if (!Array.isArray(parts)) return "";
return parts
.map((part) => part?.text || "")
.filter((text) => text.trim() !== "")
.join("\n\n");
};
let content = "";
if (Array.isArray(res)) {
res.map((item) => {
content += getTextFromParts(item?.candidates?.at(0)?.content?.parts);
});
}
return (
res?.candidates?.at(0)?.content?.parts.at(0)?.text ||
getTextFromParts(res?.candidates?.at(0)?.content?.parts) ||
content || //getTextFromParts(res?.at(0)?.candidates?.at(0)?.content?.parts) ||
res?.error?.message ||
""
);
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions): Promise<void> {
const apiClient = this;
let multimodal = false;
@@ -154,7 +186,10 @@ export class GeminiProApi implements LLMApi {
options.onController?.(controller);
try {
// https://github.com/google-gemini/cookbook/blob/main/quickstarts/rest/Streaming_REST.ipynb
const chatPath = this.path(Google.ChatPath(modelConfig.model));
const chatPath = this.path(
Google.ChatPath(modelConfig.model),
shouldStream,
);
const chatPayload = {
method: "POST",
@@ -163,121 +198,95 @@ export class GeminiProApi implements LLMApi {
headers: getHeaders(),
};
const isThinking = options.config.model.includes("-thinking");
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
REQUEST_TIMEOUT_MS,
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
let responseText = "";
let remainText = "";
let finished = false;
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
return stream(
chatPath,
requestPayload,
getHeaders(),
// @ts-ignore
tools.length > 0
? // @ts-ignore
[{ functionDeclarations: tools.map((tool) => tool.function) }]
: [],
funcs,
controller,
// parseSSE
(text: string, runTools: ChatMessageTool[]) => {
// console.log("parseSSE", text, runTools);
const chunkJson = JSON.parse(text);
const finish = () => {
if (!finished) {
finished = true;
options.onFinish(responseText + remainText);
}
};
// animate the response to make it look smooth
function animateResponseText() {
if (finished || controller.signal.aborted) {
responseText += remainText;
finish();
return;
}
if (remainText.length > 0) {
const fetchCount = Math.max(1, Math.round(remainText.length / 60));
const fetchText = remainText.slice(0, fetchCount);
responseText += fetchText;
remainText = remainText.slice(fetchCount);
options.onUpdate?.(responseText, fetchText);
}
requestAnimationFrame(animateResponseText);
}
// start animation
animateResponseText();
controller.signal.onabort = finish;
fetchEventSource(chatPath, {
...chatPayload,
async onopen(res) {
clearTimeout(requestTimeoutId);
const contentType = res.headers.get("content-type");
console.log(
"[Gemini] request response content type: ",
contentType,
const functionCall = chunkJson?.candidates
?.at(0)
?.content.parts.at(0)?.functionCall;
if (functionCall) {
const { name, args } = functionCall;
runTools.push({
id: nanoid(),
type: "function",
function: {
name,
arguments: JSON.stringify(args), // utils.chat invokes the function after JSON.parse
},
});
}
return chunkJson?.candidates
?.at(0)
?.content.parts?.map((part: { text: string }) => part.text)
.join("\n\n");
},
// processToolMessage: appends the tool_calls message and tool call results
(
requestPayload: RequestPayload,
toolCallMessage: any,
toolCallResult: any[],
) => {
// @ts-ignore
requestPayload?.contents?.splice(
// @ts-ignore
requestPayload?.contents?.length,
0,
{
role: "model",
parts: toolCallMessage.tool_calls.map(
(tool: ChatMessageTool) => ({
functionCall: {
name: tool?.function?.name,
args: JSON.parse(tool?.function?.arguments as string),
},
}),
),
},
// @ts-ignore
...toolCallResult.map((result) => ({
role: "function",
parts: [
{
functionResponse: {
name: result.name,
response: {
name: result.name,
content: result.content, // TODO just text content...
},
},
},
],
})),
);
if (contentType?.startsWith("text/plain")) {
responseText = await res.clone().text();
return finish();
}
if (
!res.ok ||
!res.headers
.get("content-type")
?.startsWith(EventStreamContentType) ||
res.status !== 200
) {
const responseTexts = [responseText];
let extraInfo = await res.clone().text();
try {
const resJson = await res.clone().json();
extraInfo = prettyObject(resJson);
} catch {}
if (res.status === 401) {
responseTexts.push(Locale.Error.Unauthorized);
}
if (extraInfo) {
responseTexts.push(extraInfo);
}
responseText = responseTexts.join("\n\n");
return finish();
}
},
onmessage(msg) {
if (msg.data === "[DONE]" || finished) {
return finish();
}
const text = msg.data;
try {
const json = JSON.parse(text);
const delta = apiClient.extractMessage(json);
if (delta) {
remainText += delta;
}
const blockReason = json?.promptFeedback?.blockReason;
if (blockReason) {
// being blocked
console.log(`[Google] [Safety Ratings] result:`, blockReason);
}
} catch (e) {
console.error("[Request] parse error", text, msg);
}
},
onclose() {
finish();
},
onerror(e) {
options.onError?.(e);
throw e;
},
openWhenHidden: true,
});
options,
);
} else {
const res = await fetch(chatPath, chatPayload);
clearTimeout(requestTimeoutId);
@@ -292,7 +301,7 @@ export class GeminiProApi implements LLMApi {
);
}
const message = apiClient.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);


@@ -1,13 +1,19 @@
"use client";
import {
ApiPath,
DEFAULT_API_HOST,
IFLYTEK_BASE_URL,
Iflytek,
REQUEST_TIMEOUT_MS,
} from "@/app/constant";
import { useAccessStore, useAppConfig, useChatStore } from "@/app/store";
import { ChatOptions, getHeaders, LLMApi, LLMModel } from "../api";
import {
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
SpeechOptions,
} from "../api";
import Locale from "../../locales";
import {
EventStreamContentType,
@@ -16,8 +22,9 @@ import {
import { prettyObject } from "@/app/utils/format";
import { getClientConfig } from "@/app/config/client";
import { getMessageTextContent } from "@/app/utils";
import { fetch } from "@/app/utils/stream";
import { OpenAIListModelResponse, RequestPayload } from "./openai";
import { RequestPayload } from "./openai";
export class SparkApi implements LLMApi {
private disableListModels = true;
@@ -34,7 +41,7 @@ export class SparkApi implements LLMApi {
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
const apiPath = ApiPath.Iflytek;
baseUrl = isApp ? DEFAULT_API_HOST + "/proxy" + apiPath : apiPath;
baseUrl = isApp ? IFLYTEK_BASE_URL : apiPath;
}
if (baseUrl.endsWith("/")) {
@@ -53,6 +60,10 @@ export class SparkApi implements LLMApi {
return res.choices?.at(0)?.message?.content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
@@ -106,6 +117,7 @@ export class SparkApi implements LLMApi {
let responseText = "";
let remainText = "";
let finished = false;
let responseRes: Response;
// Animate response text to make it look smooth
function animateResponseText() {
@@ -132,19 +144,20 @@ export class SparkApi implements LLMApi {
const finish = () => {
if (!finished) {
finished = true;
options.onFinish(responseText + remainText);
options.onFinish(responseText + remainText, responseRes);
}
};
controller.signal.onabort = finish;
fetchEventSource(chatPath, {
fetch: fetch as any,
...chatPayload,
async onopen(res) {
clearTimeout(requestTimeoutId);
const contentType = res.headers.get("content-type");
console.log("[Spark] request response content type: ", contentType);
responseRes = res;
if (contentType?.startsWith("text/plain")) {
responseText = await res.clone().text();
return finish();
@@ -219,7 +232,7 @@ export class SparkApi implements LLMApi {
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);


@@ -2,11 +2,9 @@
// Azure and OpenAI use the same models, so they share the same LLMApi.
import {
ApiPath,
DEFAULT_API_HOST,
DEFAULT_MODELS,
MOONSHOT_BASE_URL,
Moonshot,
REQUEST_TIMEOUT_MS,
ServiceProvider,
} from "@/app/constant";
import {
useAccessStore,
@@ -15,28 +13,18 @@ import {
ChatMessageTool,
usePluginStore,
} from "@/app/store";
import { collectModelsWithDefaultModel } from "@/app/utils/model";
import { preProcessImageContent, stream } from "@/app/utils/chat";
import { cloudflareAIGatewayUrl } from "@/app/utils/cloudflare";
import { stream } from "@/app/utils/chat";
import {
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
LLMUsage,
MultimodalContent,
SpeechOptions,
} from "../api";
import Locale from "../../locales";
import {
EventStreamContentType,
fetchEventSource,
} from "@fortaine/fetch-event-source";
import { prettyObject } from "@/app/utils/format";
import { getClientConfig } from "@/app/config/client";
import { getMessageTextContent } from "@/app/utils";
import { OpenAIListModelResponse, RequestPayload } from "./openai";
import { RequestPayload } from "./openai";
import { fetch } from "@/app/utils/stream";
export class MoonshotApi implements LLMApi {
private disableListModels = true;
@@ -53,7 +41,7 @@ export class MoonshotApi implements LLMApi {
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
const apiPath = ApiPath.Moonshot;
baseUrl = isApp ? DEFAULT_API_HOST + "/proxy" + apiPath : apiPath;
baseUrl = isApp ? MOONSHOT_BASE_URL : apiPath;
}
if (baseUrl.endsWith("/")) {
@@ -72,6 +60,10 @@ export class MoonshotApi implements LLMApi {
return res.choices?.at(0)?.message?.content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
@@ -188,7 +180,7 @@ export class MoonshotApi implements LLMApi {
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);


@@ -2,7 +2,7 @@
// Azure and OpenAI use the same models, so they share the same LLMApi.
import {
ApiPath,
DEFAULT_API_HOST,
OPENAI_BASE_URL,
DEFAULT_MODELS,
OpenaiPath,
Azure,
@@ -21,10 +21,10 @@ import {
preProcessImageContent,
uploadImage,
base64Image2Blob,
stream,
streamWithThink,
} from "@/app/utils/chat";
import { cloudflareAIGatewayUrl } from "@/app/utils/cloudflare";
import { DalleSize, DalleQuality, DalleStyle } from "@/app/typing";
import { ModelSize, DalleQuality, DalleStyle } from "@/app/typing";
import {
ChatOptions,
@@ -33,20 +33,17 @@ import {
LLMModel,
LLMUsage,
MultimodalContent,
SpeechOptions,
} from "../api";
import Locale from "../../locales";
import {
EventStreamContentType,
fetchEventSource,
} from "@fortaine/fetch-event-source";
import { prettyObject } from "@/app/utils/format";
import { getClientConfig } from "@/app/config/client";
import {
getMessageTextContent,
getMessageImages,
isVisionModel,
isDalle3 as _isDalle3,
getTimeoutMSByModel,
} from "@/app/utils";
import { fetch } from "@/app/utils/stream";
export interface OpenAIListModelResponse {
object: string;
@@ -59,7 +56,7 @@ export interface OpenAIListModelResponse {
export interface RequestPayload {
messages: {
role: "system" | "user" | "assistant";
role: "developer" | "system" | "user" | "assistant";
content: string | MultimodalContent[];
}[];
stream?: boolean;
@@ -69,6 +66,7 @@ export interface RequestPayload {
frequency_penalty: number;
top_p: number;
max_tokens?: number;
max_completion_tokens?: number;
}
export interface DalleRequestPayload {
@@ -76,7 +74,7 @@ export interface DalleRequestPayload {
prompt: string;
response_format: "url" | "b64_json";
n: number;
size: DalleSize;
size: ModelSize;
quality: DalleQuality;
style: DalleStyle;
}
@@ -103,7 +101,7 @@ export class ChatGPTApi implements LLMApi {
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
const apiPath = isAzure ? ApiPath.Azure : ApiPath.OpenAI;
baseUrl = isApp ? DEFAULT_API_HOST + "/proxy" + apiPath : apiPath;
baseUrl = isApp ? OPENAI_BASE_URL : apiPath;
}
if (baseUrl.endsWith("/")) {
@@ -147,6 +145,44 @@ export class ChatGPTApi implements LLMApi {
return res.choices?.at(0)?.message?.content ?? res;
}
async speech(options: SpeechOptions): Promise<ArrayBuffer> {
const requestPayload = {
model: options.model,
input: options.input,
voice: options.voice,
response_format: options.response_format,
speed: options.speed,
};
console.log("[Request] openai speech payload: ", requestPayload);
const controller = new AbortController();
options.onController?.(controller);
try {
const speechPath = this.path(OpenaiPath.SpeechPath);
const speechPayload = {
method: "POST",
body: JSON.stringify(requestPayload),
signal: controller.signal,
headers: getHeaders(),
};
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
REQUEST_TIMEOUT_MS,
);
const res = await fetch(speechPath, speechPayload);
clearTimeout(requestTimeoutId);
return await res.arrayBuffer();
} catch (e) {
console.log("[Request] failed to make a speech request", e);
throw e;
}
}
async chat(options: ChatOptions) {
const modelConfig = {
...useAppConfig.getState().modelConfig,
@@ -160,6 +196,10 @@ export class ChatGPTApi implements LLMApi {
let requestPayload: RequestPayload | DalleRequestPayload;
const isDalle3 = _isDalle3(options.config.model);
const isO1OrO3 =
options.config.model.startsWith("o1") ||
options.config.model.startsWith("o3") ||
options.config.model.startsWith("o4-mini");
if (isDalle3) {
const prompt = getMessageTextContent(
options.messages.slice(-1)?.pop() as any,
@@ -181,23 +221,38 @@ export class ChatGPTApi implements LLMApi {
const content = visionModel
? await preProcessImageContent(v.content)
: getMessageTextContent(v);
messages.push({ role: v.role, content });
if (!(isO1OrO3 && v.role === "system"))
messages.push({ role: v.role, content });
}
// o1 does not yet support images, tools (plugins in ChatGPTNextWeb), system messages, stream, logprobs, temperature, top_p, n, presence_penalty, or frequency_penalty.
requestPayload = {
messages,
stream: options.config.stream,
model: modelConfig.model,
temperature: modelConfig.temperature,
presence_penalty: modelConfig.presence_penalty,
frequency_penalty: modelConfig.frequency_penalty,
top_p: modelConfig.top_p,
temperature: !isO1OrO3 ? modelConfig.temperature : 1,
presence_penalty: !isO1OrO3 ? modelConfig.presence_penalty : 0,
frequency_penalty: !isO1OrO3 ? modelConfig.frequency_penalty : 0,
top_p: !isO1OrO3 ? modelConfig.top_p : 1,
// max_tokens: Math.max(modelConfig.max_tokens, 1024),
// max_tokens is intentionally omitted from this payload.
};
if (isO1OrO3) {
// By default, the o1/o3 models do not produce markdown-formatted output.
// Manually prepend a "Formatting re-enabled" developer message to encourage markdown in responses.
// (https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/reasoning?tabs=python-secure#markdown-output)
requestPayload["messages"].unshift({
role: "developer",
content: "Formatting re-enabled",
});
// o1/o3 use max_completion_tokens to control the number of output tokens (https://platform.openai.com/docs/guides/reasoning#controlling-costs)
requestPayload["max_completion_tokens"] = modelConfig.max_tokens;
}
// add max_tokens to vision model
if (visionModel && modelConfig.model.includes("preview")) {
if (visionModel && !isO1OrO3) {
requestPayload["max_tokens"] = Math.max(modelConfig.max_tokens, 4000);
}
}
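The payload changes above pin the sampling parameters whenever a reasoning model is selected. That selection logic can be isolated into a small helper (a sketch; `isReasoningModel` and `effectiveSampling` are illustrative names, not part of the codebase):

```typescript
// Reasoning models (o1/o3/o4-mini) reject custom sampling parameters,
// so they are pinned to the API defaults: temperature 1, penalties 0, top_p 1.
interface SamplingConfig {
  temperature: number;
  presence_penalty: number;
  frequency_penalty: number;
  top_p: number;
}

function isReasoningModel(model: string): boolean {
  return ["o1", "o3", "o4-mini"].some((p) => model.startsWith(p));
}

function effectiveSampling(model: string, cfg: SamplingConfig): SamplingConfig {
  if (!isReasoningModel(model)) return cfg;
  return { temperature: 1, presence_penalty: 0, frequency_penalty: 0, top_p: 1 };
}
```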
@@ -241,13 +296,14 @@ export class ChatGPTApi implements LLMApi {
);
}
if (shouldStream) {
let index = -1;
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
// console.log("getAsTools", tools, funcs);
stream(
streamWithThink(
chatPath,
requestPayload,
getHeaders(),
@@ -262,14 +318,18 @@ export class ChatGPTApi implements LLMApi {
delta: {
content: string;
tool_calls: ChatMessageTool[];
reasoning_content: string | null;
};
}>;
if (!choices?.length) return { isThinking: false, content: "" };
const tool_calls = choices[0]?.delta?.tool_calls;
if (tool_calls?.length > 0) {
const index = tool_calls[0]?.index;
const id = tool_calls[0]?.id;
const args = tool_calls[0]?.function?.arguments;
if (id) {
index += 1;
runTools.push({
id,
type: tool_calls[0]?.type,
@@ -283,7 +343,37 @@ export class ChatGPTApi implements LLMApi {
runTools[index]["function"]["arguments"] += args;
}
}
return choices[0]?.delta?.content;
const reasoning = choices[0]?.delta?.reasoning_content;
const content = choices[0]?.delta?.content;
// Skip if both content and reasoning_content are empty or null
if (
(!reasoning || reasoning.length === 0) &&
(!content || content.length === 0)
) {
return {
isThinking: false,
content: "",
};
}
if (reasoning && reasoning.length > 0) {
return {
isThinking: true,
content: reasoning,
};
} else if (content && content.length > 0) {
return {
isThinking: false,
content: content,
};
}
return {
isThinking: false,
content: "",
};
},
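The branching above reduces every SSE delta to a `{ isThinking, content }` pair, preferring `reasoning_content` over `content`. Extracted as a pure function (a sketch, not code from the file):

```typescript
interface ThinkDelta {
  isThinking: boolean;
  content: string;
}

// Prefer reasoning text over normal content: while the model is still
// reasoning, surface that text as "thinking" output; otherwise emit the
// normal content; empty deltas collapse to empty non-thinking output.
function classifyDelta(
  reasoning: string | null | undefined,
  content: string | null | undefined,
): ThinkDelta {
  if (reasoning && reasoning.length > 0) {
    return { isThinking: true, content: reasoning };
  }
  if (content && content.length > 0) {
    return { isThinking: false, content };
  }
  return { isThinking: false, content: "" };
}
```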
// processToolMessage, include tool_calls message and tool call results
(
@@ -291,6 +381,8 @@ export class ChatGPTApi implements LLMApi {
toolCallMessage: any,
toolCallResult: any[],
) => {
// reset index value
index = -1;
// @ts-ignore
requestPayload?.messages?.splice(
// @ts-ignore
@@ -313,7 +405,7 @@ export class ChatGPTApi implements LLMApi {
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
isDalle3 ? REQUEST_TIMEOUT_MS * 2 : REQUEST_TIMEOUT_MS, // dalle3 using b64_json is slow.
getTimeoutMSByModel(options.config.model),
);
const res = await fetch(chatPath, chatPayload);
@@ -321,7 +413,7 @@ export class ChatGPTApi implements LLMApi {
const resJson = await res.json();
const message = await this.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);
@@ -407,7 +499,9 @@ export class ChatGPTApi implements LLMApi {
});
const resJson = (await res.json()) as OpenAIListModelResponse;
const chatModels = resJson.data?.filter((m) => m.id.startsWith("gpt-"));
const chatModels = resJson.data?.filter(
(m) => m.id.startsWith("gpt-") || m.id.startsWith("chatgpt-"),
);
console.log("[Models]", chatModels);
if (!chatModels) {


@@ -0,0 +1,241 @@
"use client";
// Azure and OpenAI expose the same models, so the same LLMApi is reused.
import {
ApiPath,
SILICONFLOW_BASE_URL,
SiliconFlow,
DEFAULT_MODELS,
} from "@/app/constant";
import {
useAccessStore,
useAppConfig,
useChatStore,
ChatMessageTool,
usePluginStore,
} from "@/app/store";
import { preProcessImageContent, streamWithThink } from "@/app/utils/chat";
import {
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
SpeechOptions,
} from "../api";
import { getClientConfig } from "@/app/config/client";
import {
getMessageTextContent,
getMessageTextContentWithoutThinking,
isVisionModel,
getTimeoutMSByModel,
} from "@/app/utils";
import { RequestPayload } from "./openai";
import { fetch } from "@/app/utils/stream";
export interface SiliconFlowListModelResponse {
object: string;
data: Array<{
id: string;
object: string;
root: string;
}>;
}
export class SiliconflowApi implements LLMApi {
private disableListModels = false;
path(path: string): string {
const accessStore = useAccessStore.getState();
let baseUrl = "";
if (accessStore.useCustomConfig) {
baseUrl = accessStore.siliconflowUrl;
}
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
const apiPath = ApiPath.SiliconFlow;
baseUrl = isApp ? SILICONFLOW_BASE_URL : apiPath;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, baseUrl.length - 1);
}
if (
!baseUrl.startsWith("http") &&
!baseUrl.startsWith(ApiPath.SiliconFlow)
) {
baseUrl = "https://" + baseUrl;
}
console.log("[Proxy Endpoint] ", baseUrl, path);
return [baseUrl, path].join("/");
}
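`path()` above normalizes the configured base URL in three steps: strip a trailing slash, prefix `https://` for bare hosts that are not the local proxy path, and join with the endpoint path. The same steps as a standalone sketch (`joinApiPath` is an illustrative name):

```typescript
// Normalize a user-configured base URL and join it with an endpoint path.
// `allowedPrefix` is the proxy path (e.g. "/api/siliconflow") that must
// not receive an https:// prefix.
function joinApiPath(
  baseUrl: string,
  allowedPrefix: string,
  path: string,
): string {
  let url = baseUrl;
  // Strip a single trailing slash so the join produces exactly one "/".
  if (url.endsWith("/")) {
    url = url.slice(0, url.length - 1);
  }
  // Bare hosts (no scheme, not the proxy path) default to https.
  if (!url.startsWith("http") && !url.startsWith(allowedPrefix)) {
    url = "https://" + url;
  }
  return [url, path].join("/");
}
```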
extractMessage(res: any) {
return res.choices?.at(0)?.message?.content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const visionModel = isVisionModel(options.config.model);
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
if (v.role === "assistant") {
const content = getMessageTextContentWithoutThinking(v);
messages.push({ role: v.role, content });
} else {
const content = visionModel
? await preProcessImageContent(v.content)
: getMessageTextContent(v);
messages.push({ role: v.role, content });
}
}
const modelConfig = {
...useAppConfig.getState().modelConfig,
...useChatStore.getState().currentSession().mask.modelConfig,
...{
model: options.config.model,
providerName: options.config.providerName,
},
};
const requestPayload: RequestPayload = {
messages,
stream: options.config.stream,
model: modelConfig.model,
temperature: modelConfig.temperature,
presence_penalty: modelConfig.presence_penalty,
frequency_penalty: modelConfig.frequency_penalty,
top_p: modelConfig.top_p,
// max_tokens: Math.max(modelConfig.max_tokens, 1024),
// max_tokens is deliberately omitted here; sending it has caused more problems than it solves.
};
console.log("[Request] openai payload: ", requestPayload);
const shouldStream = !!options.config.stream;
const controller = new AbortController();
options.onController?.(controller);
try {
const chatPath = this.path(SiliconFlow.ChatPath);
const chatPayload = {
method: "POST",
body: JSON.stringify(requestPayload),
signal: controller.signal,
headers: getHeaders(),
};
// console.log(chatPayload);
// Use extended timeout for thinking models as they typically require more processing time
const requestTimeoutId = setTimeout(
() => controller.abort(),
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
return streamWithThink(
chatPath,
requestPayload,
getHeaders(),
tools as any,
funcs,
controller,
// parseSSE adapted for the SiliconFlow response
(text: string, runTools: ChatMessageTool[]) => {
// Parse the returned JSON string into an object
const json = JSON.parse(text);
// Extract the reply content from output.text
const content = json?.output?.text ?? "";
// If there is no reply content, return the non-thinking state with empty content
if (!content || content.length === 0) {
return {
isThinking: false,
content: "",
};
}
// Return the non-thinking state with the reply content
return {
isThinking: false,
content: content,
};
},
// processToolMessage: SiliconFlow has no tool_call, so leave this as-is or empty
(
requestPayload: RequestPayload,
toolCallMessage: any,
toolCallResult: any[],
) => {
// No tool_call handling is needed; leave this empty, or keep it for compatibility
},
options,
);
} else {
const res = await fetch(chatPath, chatPayload);
clearTimeout(requestTimeoutId);
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);
options.onError?.(e as Error);
}
}
async usage() {
return {
used: 0,
total: 0,
};
}
async models(): Promise<LLMModel[]> {
if (this.disableListModels) {
return DEFAULT_MODELS.slice();
}
const res = await fetch(this.path(SiliconFlow.ListModelPath), {
method: "GET",
headers: {
...getHeaders(),
},
});
const resJson = (await res.json()) as SiliconFlowListModelResponse;
const chatModels = resJson.data;
console.log("[Models]", chatModels);
if (!chatModels) {
return [];
}
let seq = 1000; // keep the ordering consistent with Constant.ts
return chatModels.map((m) => ({
name: m.id,
available: true,
sorted: seq++,
provider: {
id: "siliconflow",
providerName: "SiliconFlow",
providerType: "siliconflow",
sorted: 14,
},
}));
}
}


@@ -1,5 +1,5 @@
"use client";
import { ApiPath, DEFAULT_API_HOST, REQUEST_TIMEOUT_MS } from "@/app/constant";
import { ApiPath, TENCENT_BASE_URL } from "@/app/constant";
import { useAccessStore, useAppConfig, useChatStore } from "@/app/store";
import {
@@ -8,6 +8,7 @@ import {
LLMApi,
LLMModel,
MultimodalContent,
SpeechOptions,
} from "../api";
import Locale from "../../locales";
import {
@@ -16,11 +17,16 @@ import {
} from "@fortaine/fetch-event-source";
import { prettyObject } from "@/app/utils/format";
import { getClientConfig } from "@/app/config/client";
import { getMessageTextContent, isVisionModel } from "@/app/utils";
import {
getMessageTextContent,
isVisionModel,
getTimeoutMSByModel,
} from "@/app/utils";
import mapKeys from "lodash-es/mapKeys";
import mapValues from "lodash-es/mapValues";
import isArray from "lodash-es/isArray";
import isObject from "lodash-es/isObject";
import { fetch } from "@/app/utils/stream";
export interface OpenAIListModelResponse {
object: string;
@@ -69,9 +75,7 @@ export class HunyuanApi implements LLMApi {
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
baseUrl = isApp
? DEFAULT_API_HOST + "/api/proxy/tencent"
: ApiPath.Tencent;
baseUrl = isApp ? TENCENT_BASE_URL : ApiPath.Tencent;
}
if (baseUrl.endsWith("/")) {
@@ -89,6 +93,10 @@ export class HunyuanApi implements LLMApi {
return res.Choices?.at(0)?.Message?.Content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const visionModel = isVisionModel(options.config.model);
const messages = options.messages.map((v, index) => ({
@@ -131,13 +139,14 @@ export class HunyuanApi implements LLMApi {
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
REQUEST_TIMEOUT_MS,
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
let responseText = "";
let remainText = "";
let finished = false;
let responseRes: Response;
// animate the response to make it look smooth
function animateResponseText() {
@@ -167,13 +176,14 @@ export class HunyuanApi implements LLMApi {
const finish = () => {
if (!finished) {
finished = true;
options.onFinish(responseText + remainText);
options.onFinish(responseText + remainText, responseRes);
}
};
controller.signal.onabort = finish;
fetchEventSource(chatPath, {
fetch: fetch as any,
...chatPayload,
async onopen(res) {
clearTimeout(requestTimeoutId);
@@ -182,7 +192,7 @@ export class HunyuanApi implements LLMApi {
"[Tencent] request response content type: ",
contentType,
);
responseRes = res;
if (contentType?.startsWith("text/plain")) {
responseText = await res.clone().text();
return finish();
@@ -248,7 +258,7 @@ export class HunyuanApi implements LLMApi {
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);

app/client/platforms/xai.ts Normal file

@@ -0,0 +1,194 @@
"use client";
// Azure and OpenAI expose the same models, so the same LLMApi is reused.
import { ApiPath, XAI_BASE_URL, XAI } from "@/app/constant";
import {
useAccessStore,
useAppConfig,
useChatStore,
ChatMessageTool,
usePluginStore,
} from "@/app/store";
import { stream } from "@/app/utils/chat";
import {
ChatOptions,
getHeaders,
LLMApi,
LLMModel,
SpeechOptions,
} from "../api";
import { getClientConfig } from "@/app/config/client";
import { getTimeoutMSByModel } from "@/app/utils";
import { preProcessImageContent } from "@/app/utils/chat";
import { RequestPayload } from "./openai";
import { fetch } from "@/app/utils/stream";
export class XAIApi implements LLMApi {
private disableListModels = true;
path(path: string): string {
const accessStore = useAccessStore.getState();
let baseUrl = "";
if (accessStore.useCustomConfig) {
baseUrl = accessStore.xaiUrl;
}
if (baseUrl.length === 0) {
const isApp = !!getClientConfig()?.isApp;
const apiPath = ApiPath.XAI;
baseUrl = isApp ? XAI_BASE_URL : apiPath;
}
if (baseUrl.endsWith("/")) {
baseUrl = baseUrl.slice(0, baseUrl.length - 1);
}
if (!baseUrl.startsWith("http") && !baseUrl.startsWith(ApiPath.XAI)) {
baseUrl = "https://" + baseUrl;
}
console.log("[Proxy Endpoint] ", baseUrl, path);
return [baseUrl, path].join("/");
}
extractMessage(res: any) {
return res.choices?.at(0)?.message?.content ?? "";
}
speech(options: SpeechOptions): Promise<ArrayBuffer> {
throw new Error("Method not implemented.");
}
async chat(options: ChatOptions) {
const messages: ChatOptions["messages"] = [];
for (const v of options.messages) {
const content = await preProcessImageContent(v.content);
messages.push({ role: v.role, content });
}
const modelConfig = {
...useAppConfig.getState().modelConfig,
...useChatStore.getState().currentSession().mask.modelConfig,
...{
model: options.config.model,
providerName: options.config.providerName,
},
};
const requestPayload: RequestPayload = {
messages,
stream: options.config.stream,
model: modelConfig.model,
temperature: modelConfig.temperature,
presence_penalty: modelConfig.presence_penalty,
frequency_penalty: modelConfig.frequency_penalty,
top_p: modelConfig.top_p,
};
console.log("[Request] xai payload: ", requestPayload);
const shouldStream = !!options.config.stream;
const controller = new AbortController();
options.onController?.(controller);
try {
const chatPath = this.path(XAI.ChatPath);
const chatPayload = {
method: "POST",
body: JSON.stringify(requestPayload),
signal: controller.signal,
headers: getHeaders(),
};
// make a fetch request
const requestTimeoutId = setTimeout(
() => controller.abort(),
getTimeoutMSByModel(options.config.model),
);
if (shouldStream) {
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
return stream(
chatPath,
requestPayload,
getHeaders(),
tools as any,
funcs,
controller,
// parseSSE
(text: string, runTools: ChatMessageTool[]) => {
// console.log("parseSSE", text, runTools);
const json = JSON.parse(text);
const choices = json.choices as Array<{
delta: {
content: string;
tool_calls: ChatMessageTool[];
};
}>;
const tool_calls = choices[0]?.delta?.tool_calls;
if (tool_calls?.length > 0) {
const index = tool_calls[0]?.index;
const id = tool_calls[0]?.id;
const args = tool_calls[0]?.function?.arguments;
if (id) {
runTools.push({
id,
type: tool_calls[0]?.type,
function: {
name: tool_calls[0]?.function?.name as string,
arguments: args,
},
});
} else {
// @ts-ignore
runTools[index]["function"]["arguments"] += args;
}
}
return choices[0]?.delta?.content;
},
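In the parseSSE callback above, streamed tool calls arrive in fragments: a chunk carrying an `id` opens a new call, while later chunks only append to an existing call's `arguments`. A self-contained sketch of that accumulation (types simplified; not the real `ChatMessageTool` shape):

```typescript
interface ToolCallAcc {
  id: string;
  name: string;
  arguments: string;
}

// Accumulate streamed tool-call fragments: a fragment with an id starts
// a new call; a fragment without one extends the arguments of the call
// at the given index.
function accumulateToolCall(
  calls: ToolCallAcc[],
  fragment: { index: number; id?: string; name?: string; arguments?: string },
): void {
  if (fragment.id) {
    calls.push({
      id: fragment.id,
      name: fragment.name ?? "",
      arguments: fragment.arguments ?? "",
    });
  } else {
    calls[fragment.index].arguments += fragment.arguments ?? "";
  }
}
```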
// processToolMessage, include tool_calls message and tool call results
(
requestPayload: RequestPayload,
toolCallMessage: any,
toolCallResult: any[],
) => {
// @ts-ignore
requestPayload?.messages?.splice(
// @ts-ignore
requestPayload?.messages?.length,
0,
toolCallMessage,
...toolCallResult,
);
},
options,
);
} else {
const res = await fetch(chatPath, chatPayload);
clearTimeout(requestTimeoutId);
const resJson = await res.json();
const message = this.extractMessage(resJson);
options.onFinish(message, res);
}
} catch (e) {
console.log("[Request] failed to make a chat request", e);
options.onError?.(e as Error);
}
}
async usage() {
return {
used: 0,
total: 0,
};
}
async models(): Promise<LLMModel[]> {
return [];
}
}


@@ -38,6 +38,7 @@ interface ChatCommands {
next?: Command;
prev?: Command;
clear?: Command;
fork?: Command;
del?: Command;
}


@@ -7,7 +7,6 @@ import {
useImperativeHandle,
} from "react";
import { useParams } from "react-router";
import { useWindowSize } from "@/app/utils";
import { IconButton } from "./button";
import { nanoid } from "nanoid";
import ExportIcon from "../icons/share.svg";
@@ -80,7 +79,7 @@ export const HTMLPreview = forwardRef<HTMLPreviewHander, HTMLPreviewProps>(
}, [props.autoHeight, props.height, iframeHeight]);
const srcDoc = useMemo(() => {
const script = `<script>new ResizeObserver((entries) => parent.postMessage({id: '${frameId}', height: entries[0].target.clientHeight}, '*')).observe(document.body)</script>`;
const script = `<script>window.addEventListener("DOMContentLoaded", () => new ResizeObserver((entries) => parent.postMessage({id: '${frameId}', height: entries[0].target.clientHeight}, '*')).observe(document.body))</script>`;
if (props.code.includes("<!DOCTYPE html>")) {
props.code.replace("<!DOCTYPE html>", "<!DOCTYPE html>" + script);
}


@@ -1,12 +1,70 @@
.auth-page {
display: flex;
justify-content: center;
justify-content: flex-start;
align-items: center;
height: 100%;
width: 100%;
flex-direction: column;
.top-banner {
position: relative;
width: 100%;
display: flex;
justify-content: center;
align-items: center;
padding: 12px 64px;
box-sizing: border-box;
background: var(--second);
.top-banner-inner {
display: flex;
justify-content: center;
align-items: center;
font-size: 14px;
line-height: 150%;
span {
gap: 8px;
a {
display: inline-flex;
align-items: center;
text-decoration: none;
margin-left: 8px;
color: var(--primary);
}
}
}
.top-banner-close {
cursor: pointer;
position: absolute;
top: 50%;
right: 48px;
transform: translateY(-50%);
}
}
@media (max-width: 600px) {
.top-banner {
padding: 12px 24px 12px 12px;
.top-banner-close {
right: 10px;
}
.top-banner-inner {
.top-banner-logo {
margin-right: 8px;
}
}
}
}
.auth-header {
display: flex;
justify-content: space-between;
width: 100%;
padding: 10px;
box-sizing: border-box;
animation: slide-in-from-top ease 0.3s;
}
.auth-logo {
margin-top: 10vh;
transform: scale(1.4);
}
@@ -14,6 +72,7 @@
font-size: 24px;
font-weight: bold;
line-height: 2;
margin-bottom: 1vh;
}
.auth-tips {
@@ -24,6 +83,10 @@
margin: 3vh 0;
}
.auth-input-second {
margin: 0 0 3vh 0;
}
.auth-actions {
display: flex;
justify-content: center;


@@ -1,21 +1,37 @@
import styles from "./auth.module.scss";
import { IconButton } from "./button";
import { useState, useEffect } from "react";
import { useNavigate } from "react-router-dom";
import { Path } from "../constant";
import { Path, SAAS_CHAT_URL } from "../constant";
import { useAccessStore } from "../store";
import Locale from "../locales";
import Delete from "../icons/close.svg";
import Arrow from "../icons/arrow.svg";
import Logo from "../icons/logo.svg";
import { useMobileScreen } from "@/app/utils";
import BotIcon from "../icons/bot.svg";
import { useEffect } from "react";
import { getClientConfig } from "../config/client";
import { PasswordInput } from "./ui-lib";
import LeftIcon from "@/app/icons/left.svg";
import { safeLocalStorage } from "@/app/utils";
import {
trackSettingsPageGuideToCPaymentClick,
trackAuthorizationPageButtonToCPaymentClick,
} from "../utils/auth-settings-events";
import clsx from "clsx";
const storage = safeLocalStorage();
export function AuthPage() {
const navigate = useNavigate();
const accessStore = useAccessStore();
const goHome = () => navigate(Path.Home);
const goChat = () => navigate(Path.Chat);
const goSaas = () => {
trackAuthorizationPageButtonToCPaymentClick();
window.location.href = SAAS_CHAT_URL;
};
const resetAccessCode = () => {
accessStore.update((access) => {
access.openaiApiKey = "";
@@ -32,43 +48,58 @@ export function AuthPage() {
return (
<div className={styles["auth-page"]}>
<div className={`no-dark ${styles["auth-logo"]}`}>
<TopBanner></TopBanner>
<div className={styles["auth-header"]}>
<IconButton
icon={<LeftIcon />}
text={Locale.Auth.Return}
onClick={() => navigate(Path.Home)}
></IconButton>
</div>
<div className={clsx("no-dark", styles["auth-logo"])}>
<BotIcon />
</div>
<div className={styles["auth-title"]}>{Locale.Auth.Title}</div>
<div className={styles["auth-tips"]}>{Locale.Auth.Tips}</div>
<input
className={styles["auth-input"]}
type="password"
placeholder={Locale.Auth.Input}
<PasswordInput
style={{ marginTop: "3vh", marginBottom: "3vh" }}
aria={Locale.Settings.ShowPassword}
aria-label={Locale.Auth.Input}
value={accessStore.accessCode}
type="text"
placeholder={Locale.Auth.Input}
onChange={(e) => {
accessStore.update(
(access) => (access.accessCode = e.currentTarget.value),
);
}}
/>
{!accessStore.hideUserApiKey ? (
<>
<div className={styles["auth-tips"]}>{Locale.Auth.SubTips}</div>
<input
className={styles["auth-input"]}
type="password"
placeholder={Locale.Settings.Access.OpenAI.ApiKey.Placeholder}
<PasswordInput
style={{ marginTop: "3vh", marginBottom: "3vh" }}
aria={Locale.Settings.ShowPassword}
aria-label={Locale.Settings.Access.OpenAI.ApiKey.Placeholder}
value={accessStore.openaiApiKey}
type="text"
placeholder={Locale.Settings.Access.OpenAI.ApiKey.Placeholder}
onChange={(e) => {
accessStore.update(
(access) => (access.openaiApiKey = e.currentTarget.value),
);
}}
/>
<input
className={styles["auth-input"]}
type="password"
placeholder={Locale.Settings.Access.Google.ApiKey.Placeholder}
<PasswordInput
style={{ marginTop: "3vh", marginBottom: "3vh" }}
aria={Locale.Settings.ShowPassword}
aria-label={Locale.Settings.Access.Google.ApiKey.Placeholder}
value={accessStore.googleApiKey}
type="text"
placeholder={Locale.Settings.Access.Google.ApiKey.Placeholder}
onChange={(e) => {
accessStore.update(
(access) => (access.googleApiKey = e.currentTarget.value),
@@ -85,13 +116,74 @@ export function AuthPage() {
onClick={goChat}
/>
<IconButton
text={Locale.Auth.Later}
text={Locale.Auth.SaasTips}
onClick={() => {
resetAccessCode();
goHome();
goSaas();
}}
/>
</div>
</div>
);
}
function TopBanner() {
const [isHovered, setIsHovered] = useState(false);
const [isVisible, setIsVisible] = useState(true);
const isMobile = useMobileScreen();
useEffect(() => {
// Check whether the dismissal flag exists in localStorage
const bannerDismissed = storage.getItem("bannerDismissed");
// If the flag is absent, store the default value and show the banner
if (!bannerDismissed) {
storage.setItem("bannerDismissed", "false");
setIsVisible(true); // show the banner
} else if (bannerDismissed === "true") {
// If the flag is "true", hide the banner
setIsVisible(false);
}
}, []);
const handleMouseEnter = () => {
setIsHovered(true);
};
const handleMouseLeave = () => {
setIsHovered(false);
};
const handleClose = () => {
setIsVisible(false);
storage.setItem("bannerDismissed", "true");
};
if (!isVisible) {
return null;
}
return (
<div
className={styles["top-banner"]}
onMouseEnter={handleMouseEnter}
onMouseLeave={handleMouseLeave}
>
<div className={clsx(styles["top-banner-inner"], "no-dark")}>
<Logo className={styles["top-banner-logo"]}></Logo>
<span>
{Locale.Auth.TopTips}
<a
href={SAAS_CHAT_URL}
rel="noopener noreferrer"
onClick={() => {
trackSettingsPageGuideToCPaymentClick();
}}
>
{Locale.Settings.Access.SaasStart.ChatNow}
<Arrow style={{ marginLeft: "4px" }} />
</a>
</span>
</div>
{(isHovered || isMobile) && (
<Delete className={styles["top-banner-close"]} onClick={handleClose} />
)}
</div>
);
}
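`TopBanner` persists dismissal through a `bannerDismissed` localStorage flag: a missing flag means first visit (store `"false"` and show the banner), and only an explicit `"true"` hides it. The visibility decision reduces to one small function (a sketch using a `Map` in place of localStorage):

```typescript
// Decide banner visibility from the stored flag and persist the default.
// A missing flag means first visit: store "false" and show the banner;
// only an explicit "true" hides it.
function bannerVisible(store: Map<string, string>): boolean {
  const dismissed = store.get("bannerDismissed");
  if (!dismissed) {
    store.set("bannerDismissed", "false");
    return true;
  }
  return dismissed !== "true";
}
```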


@@ -5,7 +5,6 @@
align-items: center;
justify-content: center;
padding: 10px;
cursor: pointer;
transition: all 0.3s ease;
overflow: hidden;


@@ -2,6 +2,7 @@ import * as React from "react";
import styles from "./button.module.scss";
import { CSSProperties } from "react";
import clsx from "clsx";
export type ButtonType = "primary" | "danger" | null;
@@ -22,12 +23,16 @@ export function IconButton(props: {
}) {
return (
<button
className={
styles["icon-button"] +
` ${props.bordered && styles.border} ${props.shadow && styles.shadow} ${
props.className ?? ""
} clickable ${styles[props.type ?? ""]}`
}
className={clsx(
"clickable",
styles["icon-button"],
{
[styles.border]: props.bordered,
[styles.shadow]: props.shadow,
},
styles[props.type ?? ""],
props.className,
)}
onClick={props.onClick}
title={props.title}
disabled={props.disabled}
@@ -40,10 +45,9 @@ export function IconButton(props: {
{props.icon && (
<div
aria-label={props.text || props.title}
className={
styles["icon-button-icon"] +
` ${props.type === "primary" && "no-dark"}`
}
className={clsx(styles["icon-button-icon"], {
"no-dark": props.type === "primary",
})}
>
{props.icon}
</div>


@@ -1,5 +1,4 @@
import DeleteIcon from "../icons/delete.svg";
import BotIcon from "../icons/bot.svg";
import styles from "./home.module.scss";
import {
@@ -12,13 +11,14 @@ import {
import { useChatStore } from "../store";
import Locale from "../locales";
import { Link, useLocation, useNavigate } from "react-router-dom";
import { useLocation, useNavigate } from "react-router-dom";
import { Path } from "../constant";
import { MaskAvatar } from "./mask";
import { Mask } from "../store/mask";
import { useRef, useEffect } from "react";
import { showConfirm } from "./ui-lib";
import { useMobileScreen } from "../utils";
import clsx from "clsx";
export function ChatItem(props: {
onClick?: () => void;
@@ -46,11 +46,11 @@ export function ChatItem(props: {
<Draggable draggableId={`${props.id}`} index={props.index}>
{(provided) => (
<div
className={`${styles["chat-item"]} ${
props.selected &&
(currentPath === Path.Chat || currentPath === Path.Home) &&
styles["chat-item-selected"]
}`}
className={clsx(styles["chat-item"], {
[styles["chat-item-selected"]]:
props.selected &&
(currentPath === Path.Chat || currentPath === Path.Home),
})}
onClick={props.onClick}
ref={(ele) => {
draggableRef.current = ele;
@@ -64,7 +64,7 @@ export function ChatItem(props: {
>
{props.narrow ? (
<div className={styles["chat-item-narrow"]}>
<div className={styles["chat-item-avatar"] + " no-dark"}>
<div className={clsx(styles["chat-item-avatar"], "no-dark")}>
<MaskAvatar
avatar={props.mask.avatar}
model={props.mask.modelConfig.model}
@@ -150,6 +150,8 @@ export function ChatList(props: { narrow?: boolean }) {
index={i}
selected={i === selectedIndex}
onClick={() => {
console.log(item.topic);
navigate(Path.Chat);
selectSession(i);
}}


@@ -45,6 +45,14 @@
.chat-input-actions {
display: flex;
flex-wrap: wrap;
justify-content: space-between;
gap: 5px;
&-end {
display: flex;
margin-left: auto;
gap: 5px;
}
.chat-input-action {
display: inline-flex;
@@ -62,10 +70,6 @@
width: var(--icon-width);
overflow: hidden;
&:not(:last-child) {
margin-right: 5px;
}
.text {
white-space: nowrap;
padding-left: 5px;
@@ -231,10 +235,12 @@
animation: slide-in ease 0.3s;
$linear: linear-gradient(to right,
rgba(0, 0, 0, 0),
rgba(0, 0, 0, 1),
rgba(0, 0, 0, 0));
$linear: linear-gradient(
to right,
rgba(0, 0, 0, 0),
rgba(0, 0, 0, 1),
rgba(0, 0, 0, 0)
);
mask-image: $linear;
@mixin show {
@@ -351,6 +357,7 @@
font-size: 12px;
color: var(--black);
margin-left: 6px;
display: none;
}
}
@@ -373,7 +380,7 @@
}
}
.chat-message-user>.chat-message-container {
.chat-message-user > .chat-message-container {
align-items: flex-end;
}
@@ -443,6 +450,25 @@
transition: all ease 0.3s;
}
.chat-message-audio {
display: flex;
align-items: center;
justify-content: space-between;
border-radius: 10px;
background-color: rgba(0, 0, 0, 0.05);
border: var(--border-in-light);
position: relative;
transition: all ease 0.3s;
margin-top: 10px;
font-size: 14px;
user-select: text;
word-break: break-word;
box-sizing: border-box;
audio {
height: 30px; /* adjust the height */
}
}
.chat-message-item-image {
width: 100%;
margin-top: 10px;
@@ -471,23 +497,27 @@
border: rgba($color: #888, $alpha: 0.2) 1px solid;
}
@media only screen and (max-width: 600px) {
$calc-image-width: calc(100vw/3*2/var(--image-count));
$calc-image-width: calc(100vw / 3 * 2 / var(--image-count));
.chat-message-item-image-multi {
width: $calc-image-width;
height: $calc-image-width;
}
.chat-message-item-image {
max-width: calc(100vw/3*2);
max-width: calc(100vw / 3 * 2);
}
}
@media screen and (min-width: 600px) {
$max-image-width: calc(calc(1200px - var(--sidebar-width))/3*2/var(--image-count));
$image-width: calc(calc(var(--window-width) - var(--sidebar-width))/3*2/var(--image-count));
$max-image-width: calc(
calc(1200px - var(--sidebar-width)) / 3 * 2 / var(--image-count)
);
$image-width: calc(
calc(var(--window-width) - var(--sidebar-width)) / 3 * 2 /
var(--image-count)
);
.chat-message-item-image-multi {
width: $image-width;
@@ -497,7 +527,7 @@
}
.chat-message-item-image {
max-width: calc(calc(1200px - var(--sidebar-width))/3*2);
max-width: calc(calc(1200px - var(--sidebar-width)) / 3 * 2);
}
}
@@ -515,7 +545,7 @@
z-index: 1;
}
.chat-message-user>.chat-message-container>.chat-message-item {
.chat-message-user > .chat-message-container > .chat-message-item {
background-color: var(--second);
&:hover {
@@ -626,7 +656,8 @@
min-height: 68px;
}
.chat-input:focus {}
.chat-input:focus {
}
.chat-input-send {
background-color: var(--primary);
@@ -646,3 +677,78 @@
bottom: 30px;
}
}
.shortcut-key-container {
padding: 10px;
overflow-y: auto;
display: flex;
flex-direction: column;
}
.shortcut-key-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(350px, 1fr));
gap: 16px;
}
.shortcut-key-item {
display: flex;
justify-content: space-between;
align-items: center;
overflow: hidden;
padding: 10px;
background-color: var(--white);
}
.shortcut-key-title {
font-size: 14px;
color: var(--black);
}
.shortcut-key-keys {
display: flex;
gap: 8px;
}
.shortcut-key {
display: flex;
align-items: center;
justify-content: center;
border: var(--border-in-light);
border-radius: 8px;
padding: 4px;
background-color: var(--gray);
min-width: 32px;
}
.shortcut-key span {
font-size: 12px;
color: var(--black);
}
.chat-main {
display: flex;
height: 100%;
width: 100%;
position: relative;
overflow: hidden;
.chat-body-container {
height: 100%;
display: flex;
flex-direction: column;
flex: 1;
width: 100%;
}
.chat-side-panel {
position: absolute;
inset: 0;
background: var(--white);
overflow: hidden;
z-index: 10;
transform: translateX(100%);
transition: all ease 0.3s;
&-show {
transform: translateX(0);
}
}
}

File diff suppressed because it is too large


@@ -6,13 +6,13 @@ import EmojiPicker, {
import { ModelType } from "../store";
import BotIcon from "../icons/bot.svg";
import BlackBotIcon from "../icons/black-bot.svg";
// import BotIconDefault from "../icons/llm-icons/chebichat.svg";
// replaced with the chebichat avatar
export function getEmojiUrl(unified: string, style: EmojiStyle) {
// Whoever owns this Content Delivery Network (CDN), I am using your CDN to serve emojis
// The old CDN broke, so I had to switch to this one
// Author: https://github.com/H0llyW00dzZ
// Returns the emoji image URL for the given unified code and style
// A new CDN is used to serve the emoji images
return `https://fastly.jsdelivr.net/npm/emoji-datasource-apple/img/${style}/64/${unified}.png`;
}
@@ -26,34 +26,109 @@ export function AvatarPicker(props: {
theme={EmojiTheme.AUTO}
getEmojiUrl={getEmojiUrl}
onEmojiClick={(e) => {
// Invoke onEmojiClick when the user clicks an emoji,
// passing the unified code of the selected emoji
props.onEmojiClick(e.unified);
}}
/>
);
}
export function Avatar(props: { model?: ModelType; avatar?: string }) {
if (props.model) {
return (
<div className="no-dark">
{props.model?.startsWith("gpt-4") ? (
<BlackBotIcon className="user-avatar" />
) : (
<BotIcon className="user-avatar" />
)}
</div>
);
}
// Function to render chebichat PNG avatar
function chebichatAvatar() {
return (
<div className="user-avatar">
{props.avatar && <EmojiAvatar avatar={props.avatar} />}
<img
src="/chebichat.png"
alt="chebichat avatar"
width={48}
height={48}
style={{ borderRadius: "50%" }}
/>
</div>
);
}
// Handle the avatar for Chebichat:
// if no avatar is set, fall back to the default Chebichat avatar;
// otherwise return the avatar matching the given avatar name
export function Avatar(props: { model?: ModelType; avatar?: string }) {
// console.log("Avatar props", props);
if (props.avatar === "chebi-user") {
// TODO: later replace with the avatar from the Chebichat Platform (Google avatar, ...)
// If the avatar is "chebi-user", use the default Chebichat avatar
return null;
}
return chebichatAvatar();
// let LlmIcon = BotIconDefault;
// Distinguish the model families and assign the matching icon
// if (props.model) {
// const modelName = props.model.toLowerCase();
// // Pick the appropriate icon based on the model name
// if (
// modelName.startsWith("gpt") ||
// modelName.startsWith("chatgpt") ||
// modelName.startsWith("dall-e") ||
// modelName.startsWith("dalle") ||
// modelName.startsWith("o1") ||
// modelName.startsWith("o3")
// ) {
// LlmIcon = BotIconOpenAI;
// } else if (modelName.startsWith("gemini")) {
// LlmIcon = BotIconGemini;
// } else if (modelName.startsWith("gemma")) {
// LlmIcon = BotIconGemma;
// } else if (modelName.startsWith("claude")) {
// LlmIcon = BotIconClaude;
// } else if (modelName.includes("llama")) {
// LlmIcon = BotIconMeta;
// } else if (
// modelName.startsWith("mixtral") ||
// modelName.startsWith("codestral")
// ) {
// LlmIcon = BotIconMistral;
// } else if (modelName.includes("deepseek")) {
// LlmIcon = BotIconDeepseek;
// } else if (modelName.startsWith("moonshot")) {
// LlmIcon = BotIconMoonshot;
// } else if (modelName.startsWith("qwen")) {
// LlmIcon = BotIconQwen;
// } else if (modelName.startsWith("ernie")) {
// LlmIcon = BotIconWenxin;
// } else if (modelName.startsWith("grok")) {
// LlmIcon = BotIconGrok;
// } else if (modelName.startsWith("hunyuan")) {
// LlmIcon = BotIconHunyuan;
// } else if (modelName.startsWith("doubao") || modelName.startsWith("ep-")) {
// LlmIcon = BotIconDoubao;
// } else if (
// modelName.includes("glm") ||
// modelName.startsWith("cogview-") ||
// modelName.startsWith("cogvideox-")
// ) {
// LlmIcon = BotIconChatglm;
// }
// return chebichatAvatar();
// }
// return (
// console.log("Avatar", props.avatar),
// <div className="user-avatar">
// {props.avatar && <EmojiAvatar avatar={props.avatar} size={48} />}
// </div>
// );
}
export function EmojiAvatar(props: { avatar: string; size?: number }) {
return (
// Render the emoji matching the avatar value passed in
<Emoji
unified={props.avatar}
size={props.size ?? 18}

View File

@@ -8,6 +8,7 @@ import { ISSUE_URL } from "../constant";
import Locale from "../locales";
import { showConfirm } from "./ui-lib";
import { useSyncStore } from "../store/sync";
import { useChatStore } from "../store/chat";
interface IErrorBoundaryState {
hasError: boolean;
@@ -30,8 +31,7 @@ export class ErrorBoundary extends React.Component<any, IErrorBoundaryState> {
try {
useSyncStore.getState().export();
} finally {
localStorage.clear();
location.reload();
useChatStore.getState().clearAllData();
}
}

View File

@@ -1,5 +1,5 @@
/* eslint-disable @next/next/no-img-element */
import { ChatMessage, ModelType, useAppConfig, useChatStore } from "../store";
import { ChatMessage, useAppConfig, useChatStore } from "../store";
import Locale from "../locales";
import styles from "./exporter.module.scss";
import {
@@ -23,7 +23,6 @@ import CopyIcon from "../icons/copy.svg";
import LoadingIcon from "../icons/three-dots.svg";
import ChatGptIcon from "../icons/chatgpt.png";
import ShareIcon from "../icons/share.svg";
import BotIcon from "../icons/bot.png";
import DownloadIcon from "../icons/download.svg";
import { useEffect, useMemo, useRef, useState } from "react";
@@ -33,13 +32,14 @@ import dynamic from "next/dynamic";
import NextImage from "next/image";
import { toBlob, toPng } from "html-to-image";
import { DEFAULT_MASK_AVATAR } from "../store/mask";
import { prettyObject } from "../utils/format";
import { EXPORT_MESSAGE_CLASS_NAME } from "../constant";
import { getClientConfig } from "../config/client";
import { type ClientApi, getClientApi } from "../client/api";
import { getMessageTextContent } from "../utils";
import { MaskAvatar } from "./mask";
import clsx from "clsx";
const Markdown = dynamic(async () => (await import("./markdown")).Markdown, {
loading: () => <LoadingIcon />,
@@ -118,9 +118,10 @@ function Steps<
return (
<div
key={i}
className={`${styles["step"]} ${
styles[i <= props.index ? "step-finished" : ""]
} ${i === props.index && styles["step-current"]} clickable`}
className={clsx("clickable", styles["step"], {
[styles["step-finished"]]: i <= props.index,
[styles["step-current"]]: i === props.index,
})}
onClick={() => {
props.onStepChange?.(i);
}}
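The hunk above swaps manual template-string class concatenation for `clsx`. For the argument shapes used throughout this diff, `clsx` keeps string arguments and object keys whose values are truthy, which avoids the stray empty strings the old ternary-in-template approach could emit. A simplified stand-in (not the real `clsx` implementation) showing that behavior:

```typescript
// Simplified stand-in for clsx, covering only the argument shapes
// used in this diff: plain strings and { className: condition } objects.
type ClassValue = string | Record<string, boolean>;

function clsxLite(...args: ClassValue[]): string {
  const out: string[] = [];
  for (const arg of args) {
    if (typeof arg === "string") {
      if (arg) out.push(arg); // keep non-empty strings
    } else {
      // keep object keys whose condition is truthy
      for (const [name, on] of Object.entries(arg)) {
        if (on) out.push(name);
      }
    }
  }
  return out.join(" ");
}

// clsxLite("clickable", "step", { "step-finished": true, "step-current": false })
// → "clickable step step-finished"
```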
@@ -405,22 +406,6 @@ export function PreviewActions(props: {
);
}
function ExportAvatar(props: { avatar: string }) {
if (props.avatar === DEFAULT_MASK_AVATAR) {
return (
<img
src={BotIcon.src}
width={30}
height={30}
alt="bot"
className="user-avatar"
/>
);
}
return <Avatar avatar={props.avatar} />;
}
export function ImagePreviewer(props: {
messages: ChatMessage[];
topic: string;
@@ -525,30 +510,34 @@ export function ImagePreviewer(props: {
messages={props.messages}
/>
<div
className={`${styles["preview-body"]} ${styles["default-theme"]}`}
className={clsx(styles["preview-body"], styles["default-theme"])}
ref={previewRef}
>
<div className={styles["chat-info"]}>
<div className={styles["logo"] + " no-dark"}>
<div className={clsx(styles["logo"], "no-dark")}>
<NextImage
src={ChatGptIcon.src}
alt="logo"
width={50}
height={50}
width={30}
height={30}
/>
</div>
<div>
{/* <div>
<div className={styles["main-title"]}>NextChat</div>
<div className={styles["sub-title"]}>
github.com/ChatGPTNextWeb/ChatGPT-Next-Web
</div>
<div className={styles["icons"]}>
<ExportAvatar avatar={config.avatar} />
<MaskAvatar avatar={config.avatar} />
<span className={styles["icon-space"]}>&</span>
<ExportAvatar avatar={mask.avatar} />
<MaskAvatar
avatar={mask.avatar}
model={session.mask.modelConfig.model}
/>
</div>
</div>
</div> */}
<div>
<div className={styles["chat-info-item"]}>
{Locale.Exporter.Model}: {mask.modelConfig.model}
@@ -570,13 +559,18 @@ export function ImagePreviewer(props: {
{props.messages.map((m, i) => {
return (
<div
className={styles["message"] + " " + styles["message-" + m.role]}
className={clsx(styles["message"], styles["message-" + m.role])}
key={i}
>
<div className={styles["avatar"]}>
<ExportAvatar
avatar={m.role === "user" ? config.avatar : mask.avatar}
/>
{m.role === "user" ? (
<Avatar avatar={config.avatar}></Avatar>
) : (
<MaskAvatar
avatar={session.mask.avatar}
model={m.model || session.mask.modelConfig.model}
/>
)}
</div>
<div className={styles["body"]}>

View File

@@ -140,6 +140,9 @@
display: flex;
justify-content: space-between;
align-items: center;
&-narrow {
justify-content: center;
}
}
.sidebar-logo {

View File

@@ -2,11 +2,11 @@
require("../polyfill");
import { useState, useEffect } from "react";
import { useEffect, useState } from "react";
import styles from "./home.module.scss";
import BotIcon from "../icons/bot.svg";
// chebichat logo icon
import BotIcon from "../icons/chebichat.svg";
import LoadingIcon from "../icons/three-dots.svg";
import { getCSSVar, useMobileScreen } from "../utils";
@@ -15,12 +15,12 @@ import dynamic from "next/dynamic";
import { Path, SlotID } from "../constant";
import { ErrorBoundary } from "./error";
import { getISOLang, getLang } from "../locales";
import { getISOLang } from "../locales";
import {
HashRouter as Router,
Routes,
Route,
Routes,
useLocation,
} from "react-router-dom";
import { SideBar } from "./sidebar";
@@ -29,10 +29,12 @@ import { AuthPage } from "./auth";
import { getClientConfig } from "../config/client";
import { type ClientApi, getClientApi } from "../client/api";
import { useAccessStore } from "../store";
import clsx from "clsx";
import { initializeMcpSystem, isMcpEnabled } from "../mcp/actions";
export function Loading(props: { noLogo?: boolean }) {
return (
<div className={styles["loading-content"] + " no-dark"}>
<div className={clsx("no-dark", styles["loading-content"])}>
{!props.noLogo && <BotIcon />}
<LoadingIcon />
</div>
@@ -74,6 +76,13 @@ const Sd = dynamic(async () => (await import("./sd")).Sd, {
loading: () => <Loading noLogo />,
});
const McpMarketPage = dynamic(
async () => (await import("./mcp-market")).McpMarketPage,
{
loading: () => <Loading noLogo />,
},
);
export function useSwitchTheme() {
const config = useAppConfig();
@@ -179,7 +188,11 @@ function Screen() {
if (isSdNew) return <Sd />;
return (
<>
<SideBar className={isHome ? styles["sidebar-show"] : ""} />
<SideBar
className={clsx({
[styles["sidebar-show"]]: isHome,
})}
/>
<WindowContent>
<Routes>
<Route path={Path.Home} element={<Chat />} />
@@ -189,6 +202,7 @@ function Screen() {
<Route path={Path.SearchChat} element={<SearchChat />} />
<Route path={Path.Chat} element={<Chat />} />
<Route path={Path.Settings} element={<Settings />} />
<Route path={Path.McpMarket} element={<McpMarketPage />} />
</Routes>
</WindowContent>
</>
@@ -197,9 +211,10 @@ function Screen() {
return (
<div
className={`${styles.container} ${
shouldTightBorder ? styles["tight-container"] : styles.container
} ${getLang() === "ar" ? styles["rtl-screen"] : ""}`}
className={clsx(styles.container, {
[styles["tight-container"]]: shouldTightBorder,
// [styles["rtl-screen"]]: getLang() === "ar", // Removed because "ar" is not a possible return value
})}
>
{renderContent()}
</div>
@@ -228,6 +243,20 @@ export function Home() {
useEffect(() => {
console.log("[Config] got config from build time", getClientConfig());
useAccessStore.getState().fetch();
const initMcp = async () => {
try {
const enabled = await isMcpEnabled();
if (enabled) {
console.log("[MCP] initializing...");
await initializeMcpSystem();
console.log("[MCP] initialized");
}
} catch (err) {
console.error("[MCP] failed to initialize:", err);
}
};
initMcp();
}, []);
if (!useHasHydrated()) {

View File

@@ -1,5 +1,6 @@
import * as React from "react";
import styles from "./input-range.module.scss";
import clsx from "clsx";
interface InputRangeProps {
onChange: React.ChangeEventHandler<HTMLInputElement>;
@@ -23,7 +24,7 @@ export function InputRange({
aria,
}: InputRangeProps) {
return (
<div className={styles["input-range"] + ` ${className ?? ""}`}>
<div className={clsx(styles["input-range"], className)}>
{title || value}
<input
aria-label={aria}

View File

@@ -22,6 +22,9 @@ import {
import { useChatStore } from "../store";
import { IconButton } from "./button";
import { useAppConfig } from "../store/config";
import clsx from "clsx";
export function Mermaid(props: { code: string }) {
const ref = useRef<HTMLDivElement>(null);
const [hasError, setHasError] = useState(false);
@@ -55,7 +58,7 @@ export function Mermaid(props: { code: string }) {
return (
<div
className="no-dark mermaid"
className={clsx("no-dark", "mermaid")}
style={{
cursor: "pointer",
overflow: "auto",
@@ -87,12 +90,18 @@ export function PreCode(props: { children: any }) {
const refText = ref.current.querySelector("code")?.innerText;
if (htmlDom) {
setHtmlCode((htmlDom as HTMLElement).innerText);
} else if (refText?.startsWith("<!DOCTYPE")) {
} else if (
refText?.startsWith("<!DOCTYPE") ||
refText?.startsWith("<svg") ||
refText?.startsWith("<?xml")
) {
setHtmlCode(refText);
}
}, 600);
const enableArtifacts = session.mask?.enableArtifacts !== false;
const config = useAppConfig();
const enableArtifacts =
session.mask?.enableArtifacts !== false && config.enableArtifacts;
// Wrap the paragraph for plain text
useEffect(() => {
@@ -128,8 +137,9 @@ export function PreCode(props: { children: any }) {
className="copy-code-button"
onClick={() => {
if (ref.current) {
const code = ref.current.innerText;
copyToClipboard(code);
copyToClipboard(
ref.current.querySelector("code")?.innerText ?? "",
);
}
}}
></span>
@@ -164,6 +174,12 @@ export function PreCode(props: { children: any }) {
}
function CustomCode(props: { children: any; className?: string }) {
const chatStore = useChatStore();
const session = chatStore.currentSession();
const config = useAppConfig();
const enableCodeFold =
session.mask?.enableCodeFold !== false && config.enableCodeFold;
const ref = useRef<HTMLPreElement>(null);
const [collapsed, setCollapsed] = useState(true);
const [showToggle, setShowToggle] = useState(false);
@@ -179,46 +195,39 @@ function CustomCode(props: { children: any; className?: string }) {
const toggleCollapsed = () => {
setCollapsed((collapsed) => !collapsed);
};
const renderShowMoreButton = () => {
if (showToggle && enableCodeFold && collapsed) {
return (
<div
className={clsx("show-hide-button", {
collapsed,
expanded: !collapsed,
})}
>
<button onClick={toggleCollapsed}>{Locale.NewChat.More}</button>
</div>
);
}
return null;
};
return (
<>
<code
className={props?.className}
className={clsx(props?.className)}
ref={ref}
style={{
maxHeight: collapsed ? "400px" : "none",
maxHeight: enableCodeFold && collapsed ? "400px" : "none",
overflowY: "hidden",
}}
>
{props.children}
</code>
{showToggle && collapsed && (
<div
className={`show-hide-button ${collapsed ? "collapsed" : "expanded"}`}
>
<button onClick={toggleCollapsed}>{Locale.NewChat.More}</button>
</div>
)}
{renderShowMoreButton()}
</>
);
}
function escapeDollarNumber(text: string) {
let escapedText = "";
for (let i = 0; i < text.length; i += 1) {
let char = text[i];
const nextChar = text[i + 1] || " ";
if (char === "$" && nextChar >= "0" && nextChar <= "9") {
char = "\\$";
}
escapedText += char;
}
return escapedText;
}
function escapeBrackets(text: string) {
const pattern =
/(```[\s\S]*?```|`.*?`)|\\\[([\s\S]*?[^\\])\\\]|\\\((.*?)\\\)/g;
@@ -237,9 +246,30 @@ function escapeBrackets(text: string) {
);
}
function tryWrapHtmlCode(text: string) {
// Try to wrap bare HTML in an html code block (fix: HTML code blocks that include 2 newlines)
// Ignore text that already embeds a code block
if (text.includes("```")) {
return text;
}
return text
.replace(
/([`]*?)(\w*?)([\n\r]*?)(<!DOCTYPE html>)/g,
(match, quoteStart, lang, newLine, doctype) => {
return !quoteStart ? "\n```html\n" + doctype : match;
},
)
.replace(
/(<\/body>)([\r\n\s]*?)(<\/html>)([\n\r]*)([`]*)([\n\r]*?)/g,
(match, bodyEnd, space, htmlEnd, newLine, quoteEnd) => {
return !quoteEnd ? bodyEnd + space + htmlEnd + "\n```\n" : match;
},
);
}
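A quick behavioral check of the function above: bare HTML documents get fenced as an `html` code block, while text that already contains a fence is returned untouched. This self-contained copy mirrors the diff (the fence string is built indirectly only so it can live inside this document):

```typescript
// Self-contained copy of tryWrapHtmlCode from the diff above.
const FENCE = "`".repeat(3); // a literal triple-backtick fence

function tryWrapHtmlCode(text: string): string {
  if (text.includes(FENCE)) {
    return text; // text already embeds a code block: leave it alone
  }
  return text
    .replace(
      /([`]*?)(\w*?)([\n\r]*?)(<!DOCTYPE html>)/g,
      (match, quoteStart, _lang, _newLine, doctype) =>
        // only wrap when the doctype is not already preceded by backticks
        !quoteStart ? "\n" + FENCE + "html\n" + doctype : match,
    )
    .replace(
      /(<\/body>)([\r\n\s]*?)(<\/html>)([\n\r]*)([`]*)([\n\r]*?)/g,
      (match, bodyEnd, space, htmlEnd, _newLine, quoteEnd) =>
        // close the fence after </html> when none follows already
        !quoteEnd ? bodyEnd + space + htmlEnd + "\n" + FENCE + "\n" : match,
    );
}
```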
function _MarkDownContent(props: { content: string }) {
const escapedContent = useMemo(() => {
return escapeBrackets(escapeDollarNumber(props.content));
return tryWrapHtmlCode(escapeBrackets(props.content));
}, [props.content]);
return (
@@ -261,6 +291,20 @@ function _MarkDownContent(props: { content: string }) {
p: (pProps) => <p {...pProps} dir="auto" />,
a: (aProps) => {
const href = aProps.href || "";
if (/\.(aac|mp3|opus|wav)$/.test(href)) {
return (
<figure>
<audio controls src={href}></audio>
</figure>
);
}
if (/\.(3gp|3g2|webm|ogv|mpeg|mp4|avi)$/.test(href)) {
return (
<video controls width="99.9%">
<source src={href} />
</video>
);
}
const isInternal = /^\/#/i.test(href);
const target = isInternal ? "_self" : aProps.target ?? "_blank";
return <a {...aProps} target={target} />;
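The link renderer above dispatches on the file extension of `href`: audio extensions render an `<audio>` element, video extensions a `<video>` element, and everything else a plain anchor. A quick check of the two extension regexes, copied from the diff:

```typescript
// Extension tests used by the markdown link renderer above to decide
// whether a plain link should render as <audio>, <video>, or <a>.
const isAudioLink = (href: string) => /\.(aac|mp3|opus|wav)$/.test(href);
const isVideoLink = (href: string) => /\.(3gp|3g2|webm|ogv|mpeg|mp4|avi)$/.test(href);

// isAudioLink("https://example.com/clip.mp3") → true
// isVideoLink("https://example.com/clip.mp4") → true
// isAudioLink("https://example.com/page.html") → false
```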

View File

@@ -37,7 +37,7 @@ import Locale, { AllLangs, ALL_LANG_OPTIONS, Lang } from "../locales";
import { useNavigate } from "react-router-dom";
import chatStyle from "./chat.module.scss";
import { useEffect, useState } from "react";
import { useState } from "react";
import {
copyToClipboard,
downloadAs,
@@ -45,10 +45,8 @@ import {
readFromFile,
} from "../utils";
import { Updater } from "../typing";
import { ModelConfigList } from "./model-config";
import { FileName, Path } from "../constant";
import { BUILTIN_MASK_STORE } from "../masks";
import { nanoid } from "nanoid";
import {
DragDropContext,
Droppable,
@@ -56,6 +54,7 @@ import {
OnDragEndResponder,
} from "@hello-pangea/dnd";
import { getMessageTextContent } from "../utils";
import clsx from "clsx";
// drag and drop helper function
function reorder<T>(list: T[], startIndex: number, endIndex: number): T[] {
@@ -167,21 +166,40 @@ export function MaskConfig(props: {
></input>
</ListItem>
<ListItem
title={Locale.Mask.Config.Artifacts.Title}
subTitle={Locale.Mask.Config.Artifacts.SubTitle}
>
<input
aria-label={Locale.Mask.Config.Artifacts.Title}
type="checkbox"
checked={props.mask.enableArtifacts !== false}
onChange={(e) => {
props.updateMask((mask) => {
mask.enableArtifacts = e.currentTarget.checked;
});
}}
></input>
</ListItem>
{globalConfig.enableArtifacts && (
<ListItem
title={Locale.Mask.Config.Artifacts.Title}
subTitle={Locale.Mask.Config.Artifacts.SubTitle}
>
<input
aria-label={Locale.Mask.Config.Artifacts.Title}
type="checkbox"
checked={props.mask.enableArtifacts !== false}
onChange={(e) => {
props.updateMask((mask) => {
mask.enableArtifacts = e.currentTarget.checked;
});
}}
></input>
</ListItem>
)}
{globalConfig.enableCodeFold && (
<ListItem
title={Locale.Mask.Config.CodeFold.Title}
subTitle={Locale.Mask.Config.CodeFold.SubTitle}
>
<input
aria-label={Locale.Mask.Config.CodeFold.Title}
type="checkbox"
checked={props.mask.enableCodeFold !== false}
onChange={(e) => {
props.updateMask((mask) => {
mask.enableCodeFold = e.currentTarget.checked;
});
}}
></input>
</ListItem>
)}
{!props.shouldSyncFromGlobal ? (
<ListItem
@@ -227,13 +245,14 @@ export function MaskConfig(props: {
) : null}
</List>
<List>
{/* MODEL CONFIG */}
{/* <List>
<ModelConfigList
modelConfig={{ ...props.mask.modelConfig }}
updateConfig={updateConfig}
/>
{props.extraListItems}
</List>
</List> */}
</>
);
}
@@ -426,16 +445,7 @@ export function MaskPage() {
const maskStore = useMaskStore();
const chatStore = useChatStore();
const [filterLang, setFilterLang] = useState<Lang | undefined>(
() => localStorage.getItem("Mask-language") as Lang | undefined,
);
useEffect(() => {
if (filterLang) {
localStorage.setItem("Mask-language", filterLang);
} else {
localStorage.removeItem("Mask-language");
}
}, [filterLang]);
const filterLang = maskStore.language;
const allMasks = maskStore
.getAll()
@@ -542,9 +552,9 @@ export function MaskPage() {
onChange={(e) => {
const value = e.currentTarget.value;
if (value === Locale.Settings.Lang.All) {
setFilterLang(undefined);
maskStore.setLanguage(undefined);
} else {
setFilterLang(value as Lang);
maskStore.setLanguage(value as Lang);
}
}}
>
@@ -579,7 +589,7 @@ export function MaskPage() {
</div>
<div className={styles["mask-title"]}>
<div className={styles["mask-name"]}>{m.name}</div>
<div className={styles["mask-info"] + " one-line"}>
<div className={clsx(styles["mask-info"], "one-line")}>
{`${Locale.Mask.Item.Info(m.context.length)} / ${
ALL_LANG_OPTIONS[m.lang]
} / ${m.modelConfig.model}`}

View File

@@ -0,0 +1,657 @@
@import "../styles/animation.scss";
.mcp-market-page {
height: 100%;
display: flex;
flex-direction: column;
.loading-indicator {
font-size: 12px;
color: var(--primary);
margin-left: 8px;
font-weight: normal;
opacity: 0.8;
}
.mcp-market-page-body {
padding: 20px;
overflow-y: auto;
.loading-container,
.empty-container {
display: flex;
justify-content: center;
align-items: center;
min-height: 200px;
width: 100%;
background-color: var(--white);
border: var(--border-in-light);
border-radius: 10px;
animation: slide-in ease 0.3s;
}
.loading-text,
.empty-text {
font-size: 14px;
color: var(--black);
opacity: 0.5;
text-align: center;
}
.mcp-market-filter {
width: 100%;
max-width: 100%;
margin-bottom: 20px;
animation: slide-in ease 0.3s;
height: 40px;
display: flex;
.search-bar {
flex-grow: 1;
max-width: 100%;
min-width: 0;
}
}
.server-list {
display: flex;
flex-direction: column;
gap: 1px;
}
.mcp-market-item {
padding: 20px;
border: var(--border-in-light);
animation: slide-in ease 0.3s;
background-color: var(--white);
transition: all 0.3s ease;
&.disabled {
opacity: 0.7;
pointer-events: none;
}
&:not(:last-child) {
border-bottom: 0;
}
&:first-child {
border-top-left-radius: 10px;
border-top-right-radius: 10px;
}
&:last-child {
border-bottom-left-radius: 10px;
border-bottom-right-radius: 10px;
}
&.loading {
position: relative;
&::after {
content: "";
position: absolute;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: linear-gradient(
90deg,
transparent,
rgba(255, 255, 255, 0.2),
transparent
);
background-size: 200% 100%;
animation: loading-pulse 1.5s infinite;
}
}
.operation-status {
display: inline-flex;
align-items: center;
margin-left: 10px;
padding: 2px 8px;
border-radius: 4px;
font-size: 12px;
background-color: #16a34a;
color: #fff;
animation: pulse 1.5s infinite;
&[data-status="stopping"] {
background-color: #9ca3af;
}
&[data-status="starting"] {
background-color: #4ade80;
}
&[data-status="error"] {
background-color: #f87171;
}
}
.mcp-market-header {
display: flex;
justify-content: space-between;
align-items: flex-start;
width: 100%;
.mcp-market-title {
flex-grow: 1;
margin-right: 20px;
max-width: calc(100% - 300px);
}
.mcp-market-name {
font-size: 14px;
font-weight: bold;
display: flex;
align-items: center;
gap: 8px;
margin-bottom: 8px;
.server-status {
display: inline-flex;
align-items: center;
margin-left: 10px;
padding: 2px 8px;
border-radius: 4px;
font-size: 12px;
background-color: #22c55e;
color: #fff;
&.error {
background-color: #ef4444;
}
&.stopped {
background-color: #6b7280;
}
&.initializing {
background-color: #f59e0b;
animation: pulse 1.5s infinite;
}
.error-message {
margin-left: 4px;
font-size: 12px;
}
}
}
.repo-link {
color: var(--primary);
font-size: 12px;
display: inline-flex;
align-items: center;
gap: 4px;
text-decoration: none;
opacity: 0.8;
transition: opacity 0.2s;
&:hover {
opacity: 1;
}
svg {
width: 14px;
height: 14px;
}
}
.tags-container {
display: flex;
gap: 4px;
flex-wrap: wrap;
margin-bottom: 8px;
}
.tag {
background: var(--gray);
color: var(--black);
padding: 2px 6px;
border-radius: 4px;
font-size: 10px;
opacity: 0.8;
}
.mcp-market-info {
color: var(--black);
font-size: 12px;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.mcp-market-actions {
display: flex;
gap: 12px;
align-items: flex-start;
flex-shrink: 0;
min-width: 180px;
justify-content: flex-end;
}
}
}
}
.array-input {
display: flex;
flex-direction: column;
gap: 12px;
width: 100%;
padding: 16px;
border: 1px solid var(--gray-200);
border-radius: 10px;
background-color: var(--white);
.array-input-item {
display: flex;
gap: 8px;
align-items: center;
width: 100%;
padding: 0;
input {
width: 100%;
padding: 8px 12px;
background-color: var(--gray-50);
border-radius: 6px;
transition: all 0.3s ease;
font-size: 13px;
border: 1px solid var(--gray-200);
&:hover {
background-color: var(--gray-100);
border-color: var(--gray-300);
}
&:focus {
background-color: var(--white);
border-color: var(--primary);
outline: none;
box-shadow: 0 0 0 2px var(--primary-10);
}
&::placeholder {
color: var(--gray-300);
}
}
}
:global(.icon-button.add-path-button) {
width: 100%;
background-color: var(--primary);
color: white;
padding: 8px 12px;
border-radius: 6px;
transition: all 0.3s ease;
margin-top: 8px;
display: flex;
align-items: center;
justify-content: center;
border: none;
height: 36px;
&:hover {
background-color: var(--primary-dark);
}
svg {
width: 16px;
height: 16px;
margin-right: 4px;
filter: brightness(2);
}
}
}
.path-list {
width: 100%;
display: flex;
flex-direction: column;
gap: 10px;
.path-item {
display: flex;
gap: 10px;
width: 100%;
input {
flex: 1;
width: 100%;
max-width: 100%;
padding: 10px;
border: var(--border-in-light);
border-radius: 10px;
box-sizing: border-box;
font-size: 14px;
background-color: var(--white);
color: var(--black);
&:hover {
border-color: var(--gray-300);
}
&:focus {
border-color: var(--primary);
outline: none;
box-shadow: 0 0 0 2px var(--primary-10);
}
}
.browse-button {
padding: 8px;
border: var(--border-in-light);
border-radius: 10px;
background-color: transparent;
color: var(--black-50);
&:hover {
border-color: var(--primary);
color: var(--primary);
background-color: transparent;
}
svg {
width: 16px;
height: 16px;
}
}
.delete-button {
padding: 8px;
border: var(--border-in-light);
border-radius: 10px;
background-color: transparent;
color: var(--black-50);
&:hover {
border-color: var(--danger);
color: var(--danger);
background-color: transparent;
}
svg {
width: 16px;
height: 16px;
}
}
.file-input {
display: none;
}
}
.add-button {
align-self: flex-start;
display: flex;
align-items: center;
gap: 5px;
padding: 8px 12px;
background-color: transparent;
border: var(--border-in-light);
border-radius: 10px;
color: var(--black);
font-size: 12px;
margin-top: 5px;
&:hover {
border-color: var(--primary);
color: var(--primary);
background-color: transparent;
}
svg {
width: 16px;
height: 16px;
}
}
}
.config-section {
width: 100%;
.config-header {
margin-bottom: 12px;
.config-title {
font-size: 14px;
font-weight: 600;
color: var(--black);
text-transform: capitalize;
}
.config-description {
font-size: 12px;
color: var(--gray-500);
margin-top: 4px;
}
}
.array-input {
display: flex;
flex-direction: column;
gap: 12px;
width: 100%;
padding: 16px;
border: 1px solid var(--gray-200);
border-radius: 10px;
background-color: var(--white);
.array-input-item {
display: flex;
gap: 8px;
align-items: center;
width: 100%;
padding: 0;
input {
width: 100%;
padding: 8px 12px;
background-color: var(--gray-50);
border-radius: 6px;
transition: all 0.3s ease;
font-size: 13px;
border: 1px solid var(--gray-200);
&:hover {
background-color: var(--gray-100);
border-color: var(--gray-300);
}
&:focus {
background-color: var(--white);
border-color: var(--primary);
outline: none;
box-shadow: 0 0 0 2px var(--primary-10);
}
&::placeholder {
color: var(--gray-300);
}
}
:global(.icon-button) {
width: 32px;
height: 32px;
padding: 0;
border-radius: 6px;
background-color: transparent;
border: 1px solid var(--gray-200);
flex-shrink: 0;
display: flex;
align-items: center;
justify-content: center;
&:hover {
background-color: var(--gray-100);
border-color: var(--gray-300);
}
svg {
width: 16px;
height: 16px;
opacity: 0.7;
}
}
}
:global(.icon-button.add-path-button) {
width: 100%;
background-color: var(--primary);
color: white;
padding: 8px 12px;
border-radius: 6px;
transition: all 0.3s ease;
margin-top: 8px;
display: flex;
align-items: center;
justify-content: center;
border: none;
height: 36px;
&:hover {
background-color: var(--primary-dark);
}
svg {
width: 16px;
height: 16px;
margin-right: 4px;
filter: brightness(2);
}
}
}
}
.input-item {
width: 100%;
input {
width: 100%;
padding: 10px;
border: var(--border-in-light);
border-radius: 10px;
box-sizing: border-box;
font-size: 14px;
background-color: var(--white);
color: var(--black);
&:hover {
border-color: var(--gray-300);
}
&:focus {
border-color: var(--primary);
outline: none;
box-shadow: 0 0 0 2px var(--primary-10);
}
&::placeholder {
color: var(--gray-300) !important;
opacity: 1;
}
}
}
.tools-list {
display: flex;
flex-direction: column;
gap: 16px;
width: 100%;
padding: 20px;
max-width: 100%;
overflow-x: hidden;
word-break: break-word;
box-sizing: border-box;
.tool-item {
width: 100%;
box-sizing: border-box;
.tool-name {
font-size: 14px;
font-weight: 600;
color: var(--black);
margin-bottom: 8px;
padding-left: 12px;
border-left: 3px solid var(--primary);
box-sizing: border-box;
width: 100%;
}
.tool-description {
font-size: 13px;
color: var(--gray-500);
line-height: 1.6;
padding-left: 15px;
box-sizing: border-box;
width: 100%;
}
}
}
:global {
.modal-content {
margin-top: 20px;
max-width: 100%;
overflow-x: hidden;
}
.list {
padding: 10px;
margin-bottom: 10px;
background-color: var(--white);
}
.list-item {
border: none;
background-color: transparent;
border-radius: 10px;
padding: 10px;
margin-bottom: 10px;
display: flex;
flex-direction: column;
gap: 10px;
.list-header {
margin-bottom: 0;
.list-title {
font-size: 14px;
font-weight: bold;
text-transform: capitalize;
color: var(--black);
}
.list-sub-title {
font-size: 12px;
color: var(--gray-500);
margin-top: 4px;
}
}
}
}
}
@keyframes loading-pulse {
0% {
background-position: 200% 0;
}
100% {
background-position: -200% 0;
}
}
@keyframes pulse {
0% {
opacity: 0.6;
}
50% {
opacity: 1;
}
100% {
opacity: 0.6;
}
}

View File

@@ -0,0 +1,755 @@
import { IconButton } from "./button";
import { ErrorBoundary } from "./error";
import styles from "./mcp-market.module.scss";
import EditIcon from "../icons/edit.svg";
import AddIcon from "../icons/add.svg";
import CloseIcon from "../icons/close.svg";
import DeleteIcon from "../icons/delete.svg";
import RestartIcon from "../icons/reload.svg";
import EyeIcon from "../icons/eye.svg";
import GithubIcon from "../icons/github.svg";
import { List, ListItem, Modal, showToast } from "./ui-lib";
import { useNavigate } from "react-router-dom";
import { useEffect, useState } from "react";
import {
addMcpServer,
getClientsStatus,
getClientTools,
getMcpConfigFromFile,
isMcpEnabled,
pauseMcpServer,
restartAllClients,
resumeMcpServer,
} from "../mcp/actions";
import {
ListToolsResponse,
McpConfigData,
PresetServer,
ServerConfig,
ServerStatusResponse,
} from "../mcp/types";
import clsx from "clsx";
import PlayIcon from "../icons/play.svg";
import StopIcon from "../icons/pause.svg";
import { Path } from "../constant";
interface ConfigProperty {
type: string;
description?: string;
required?: boolean;
minItems?: number;
}
export function McpMarketPage() {
const navigate = useNavigate();
const [mcpEnabled, setMcpEnabled] = useState(false);
const [searchText, setSearchText] = useState("");
const [userConfig, setUserConfig] = useState<Record<string, any>>({});
const [editingServerId, setEditingServerId] = useState<string | undefined>();
const [tools, setTools] = useState<ListToolsResponse["tools"] | null>(null);
const [viewingServerId, setViewingServerId] = useState<string | undefined>();
const [isLoading, setIsLoading] = useState(false);
const [config, setConfig] = useState<McpConfigData>();
const [clientStatuses, setClientStatuses] = useState<
Record<string, ServerStatusResponse>
>({});
const [loadingPresets, setLoadingPresets] = useState(true);
const [presetServers, setPresetServers] = useState<PresetServer[]>([]);
const [loadingStates, setLoadingStates] = useState<Record<string, string>>(
{},
);
// Check whether MCP is enabled
useEffect(() => {
const checkMcpStatus = async () => {
const enabled = await isMcpEnabled();
setMcpEnabled(enabled);
if (!enabled) {
navigate(Path.Home);
}
};
checkMcpStatus();
}, [navigate]);
// Poll server statuses
useEffect(() => {
if (!mcpEnabled || !config) return;
const updateStatuses = async () => {
const statuses = await getClientsStatus();
setClientStatuses(statuses);
};
// Run once immediately
updateStatuses();
// Poll every 1000ms
const timer = setInterval(updateStatuses, 1000);
return () => clearInterval(timer);
}, [mcpEnabled, config]);
// Load the preset servers
useEffect(() => {
const loadPresetServers = async () => {
if (!mcpEnabled) return;
try {
setLoadingPresets(true);
const response = await fetch("https://nextchat.club/mcp/list");
if (!response.ok) {
throw new Error("Failed to load preset servers");
}
const data = await response.json();
setPresetServers(data?.data ?? []);
} catch (error) {
console.error("Failed to load preset servers:", error);
showToast("Failed to load preset servers");
} finally {
setLoadingPresets(false);
}
};
loadPresetServers();
}, [mcpEnabled]);
// Load the initial state
useEffect(() => {
const loadInitialState = async () => {
if (!mcpEnabled) return;
try {
setIsLoading(true);
const config = await getMcpConfigFromFile();
setConfig(config);
// Fetch the status of every client
const statuses = await getClientsStatus();
setClientStatuses(statuses);
} catch (error) {
console.error("Failed to load initial state:", error);
showToast("Failed to load initial state");
} finally {
setIsLoading(false);
}
};
loadInitialState();
}, [mcpEnabled]);
// Load the config of the server currently being edited
useEffect(() => {
if (!editingServerId || !config) return;
const currentConfig = config.mcpServers[editingServerId];
if (currentConfig) {
// Extract the user config from the current server config
const preset = presetServers.find((s) => s.id === editingServerId);
if (preset?.configSchema) {
const userConfig: Record<string, any> = {};
Object.entries(preset.argsMapping || {}).forEach(([key, mapping]) => {
if (mapping.type === "spread") {
// For spread types, extract the array from args.
const startPos = mapping.position ?? 0;
userConfig[key] = currentConfig.args.slice(startPos);
} else if (mapping.type === "single") {
// For single types, get a single value
userConfig[key] = currentConfig.args[mapping.position ?? 0];
} else if (
mapping.type === "env" &&
mapping.key &&
currentConfig.env
) {
// For env types, get values from environment variables
userConfig[key] = currentConfig.env[mapping.key];
}
});
setUserConfig(userConfig);
}
} else {
setUserConfig({});
}
}, [editingServerId, config, presetServers]);
if (!mcpEnabled) {
return null;
}
// Check whether a server has already been added
const isServerAdded = (id: string) => {
return id in (config?.mcpServers ?? {});
};
// Save the server configuration
const saveServerConfig = async () => {
const preset = presetServers.find((s) => s.id === editingServerId);
if (!preset || !preset.configSchema || !editingServerId) return;
const savingServerId = editingServerId;
setEditingServerId(undefined);
try {
updateLoadingState(savingServerId, "Updating configuration...");
// Build the server configuration
const args = [...preset.baseArgs];
const env: Record<string, string> = {};
Object.entries(preset.argsMapping || {}).forEach(([key, mapping]) => {
const value = userConfig[key];
if (mapping.type === "spread" && Array.isArray(value)) {
const pos = mapping.position ?? 0;
args.splice(pos, 0, ...value);
} else if (
mapping.type === "single" &&
mapping.position !== undefined
) {
args[mapping.position] = value;
} else if (
mapping.type === "env" &&
mapping.key &&
typeof value === "string"
) {
env[mapping.key] = value;
}
});
const serverConfig: ServerConfig = {
command: preset.command,
args,
...(Object.keys(env).length > 0 ? { env } : {}),
};
const newConfig = await addMcpServer(savingServerId, serverConfig);
setConfig(newConfig);
showToast("Server configuration updated successfully");
} catch (error) {
showToast(
error instanceof Error ? error.message : "Failed to save configuration",
);
} finally {
updateLoadingState(savingServerId, null);
}
};
// Fetch the tools a server supports
const loadTools = async (id: string) => {
try {
const result = await getClientTools(id);
if (result) {
setTools(result);
} else {
throw new Error("Failed to load tools");
}
} catch (error) {
showToast("Failed to load tools");
console.error(error);
setTools(null);
}
};
// Helper for updating per-server loading state
const updateLoadingState = (id: string, message: string | null) => {
setLoadingStates((prev) => {
if (message === null) {
const { [id]: _, ...rest } = prev;
return rest;
}
return { ...prev, [id]: message };
});
};
// Add a preset server
const addServer = async (preset: PresetServer) => {
if (!preset.configurable) {
try {
const serverId = preset.id;
updateLoadingState(serverId, "Creating MCP client...");
const serverConfig: ServerConfig = {
command: preset.command,
args: [...preset.baseArgs],
};
const newConfig = await addMcpServer(preset.id, serverConfig);
setConfig(newConfig);
// Refresh statuses
const statuses = await getClientsStatus();
setClientStatuses(statuses);
} finally {
updateLoadingState(preset.id, null);
}
} else {
// If configuration is required, open the config dialog
setEditingServerId(preset.id);
setUserConfig({});
}
};
// Pause (stop) a server
const pauseServer = async (id: string) => {
try {
updateLoadingState(id, "Stopping server...");
const newConfig = await pauseMcpServer(id);
setConfig(newConfig);
showToast("Server stopped successfully");
} catch (error) {
showToast("Failed to stop server");
console.error(error);
} finally {
updateLoadingState(id, null);
}
};
// Restart server
const restartServer = async (id: string) => {
try {
updateLoadingState(id, "Starting server...");
await resumeMcpServer(id);
} catch (error) {
showToast(
error instanceof Error
? error.message
: "Failed to start server, please check logs",
);
console.error(error);
} finally {
updateLoadingState(id, null);
}
};
// Restart all clients
const handleRestartAll = async () => {
try {
updateLoadingState("all", "Restarting all servers...");
const newConfig = await restartAllClients();
setConfig(newConfig);
showToast("Restarting all clients");
} catch (error) {
showToast("Failed to restart clients");
console.error(error);
} finally {
updateLoadingState("all", null);
}
};
// Render configuration form
const renderConfigForm = () => {
const preset = presetServers.find((s) => s.id === editingServerId);
if (!preset?.configSchema) return null;
return Object.entries(preset.configSchema.properties).map(
([key, prop]: [string, ConfigProperty]) => {
if (prop.type === "array") {
const currentValue = userConfig[key as keyof typeof userConfig] || [];
const itemLabel = (prop as any).itemLabel || key;
const addButtonText =
(prop as any).addButtonText || `Add ${itemLabel}`;
return (
<ListItem
key={key}
title={key}
subTitle={prop.description}
vertical
>
<div className={styles["path-list"]}>
{(currentValue as string[]).map(
(value: string, index: number) => (
<div key={index} className={styles["path-item"]}>
<input
type="text"
value={value}
placeholder={`${itemLabel} ${index + 1}`}
onChange={(e) => {
const newValue = [...currentValue] as string[];
newValue[index] = e.target.value;
setUserConfig({ ...userConfig, [key]: newValue });
}}
/>
<IconButton
icon={<DeleteIcon />}
className={styles["delete-button"]}
onClick={() => {
const newValue = [...currentValue] as string[];
newValue.splice(index, 1);
setUserConfig({ ...userConfig, [key]: newValue });
}}
/>
</div>
),
)}
<IconButton
icon={<AddIcon />}
text={addButtonText}
className={styles["add-button"]}
bordered
onClick={() => {
const newValue = [...currentValue, ""] as string[];
setUserConfig({ ...userConfig, [key]: newValue });
}}
/>
</div>
</ListItem>
);
} else if (prop.type === "string") {
const currentValue = userConfig[key as keyof typeof userConfig] || "";
return (
<ListItem key={key} title={key} subTitle={prop.description}>
<input
aria-label={key}
type="text"
value={currentValue}
placeholder={`Enter ${key}`}
onChange={(e) => {
setUserConfig({ ...userConfig, [key]: e.target.value });
}}
/>
</ListItem>
);
}
return null;
},
);
};
const checkServerStatus = (clientId: string) => {
return clientStatuses[clientId] || { status: "undefined", errorMsg: null };
};
const getServerStatusDisplay = (clientId: string) => {
const status = checkServerStatus(clientId);
const statusMap = {
undefined: null, // not configured / not found: render nothing
// initializing state
initializing: (
<span className={clsx(styles["server-status"], styles["initializing"])}>
Initializing
</span>
),
paused: (
<span className={clsx(styles["server-status"], styles["stopped"])}>
Stopped
</span>
),
active: <span className={styles["server-status"]}>Running</span>,
error: (
<span className={clsx(styles["server-status"], styles["error"])}>
Error
<span className={styles["error-message"]}>: {status.errorMsg}</span>
</span>
),
};
return statusMap[status.status];
};
// Get the type of operation status
const getOperationStatusType = (message: string) => {
if (message.toLowerCase().includes("stopping")) return "stopping";
if (message.toLowerCase().includes("starting")) return "starting";
if (message.toLowerCase().includes("error")) return "error";
return "default";
};
// Render the server list
const renderServerList = () => {
if (loadingPresets) {
return (
<div className={styles["loading-container"]}>
<div className={styles["loading-text"]}>
Loading preset server list...
</div>
</div>
);
}
if (!Array.isArray(presetServers) || presetServers.length === 0) {
return (
<div className={styles["empty-container"]}>
<div className={styles["empty-text"]}>No servers available</div>
</div>
);
}
return presetServers
.filter((server) => {
if (searchText.length === 0) return true;
const searchLower = searchText.toLowerCase();
return (
server.name.toLowerCase().includes(searchLower) ||
server.description.toLowerCase().includes(searchLower) ||
server.tags.some((tag) => tag.toLowerCase().includes(searchLower))
);
})
.sort((a, b) => {
const aStatus = checkServerStatus(a.id).status;
const bStatus = checkServerStatus(b.id).status;
const aLoading = loadingStates[a.id];
const bLoading = loadingStates[b.id];
// Status priority for sorting
const statusPriority: Record<string, number> = {
error: 0, // Highest priority for error status
active: 1, // Second for active
initializing: 2, // Initializing
starting: 3, // Starting
stopping: 4, // Stopping
paused: 5, // Paused
undefined: 6, // Lowest priority for undefined
};
// Get actual status (including loading status)
const getEffectiveStatus = (status: string, loading?: string) => {
if (loading) {
const operationType = getOperationStatusType(loading);
return operationType === "default" ? status : operationType;
}
if (status === "initializing" && !loading) {
return "active";
}
return status;
};
const aEffectiveStatus = getEffectiveStatus(aStatus, aLoading);
const bEffectiveStatus = getEffectiveStatus(bStatus, bLoading);
// Sort by status first
if (aEffectiveStatus !== bEffectiveStatus) {
return (
(statusPriority[aEffectiveStatus] ?? 6) -
(statusPriority[bEffectiveStatus] ?? 6)
);
}
// Sort by name when statuses are the same
return a.name.localeCompare(b.name);
})
.map((server) => (
<div
className={clsx(styles["mcp-market-item"], {
[styles["loading"]]: loadingStates[server.id],
})}
key={server.id}
>
<div className={styles["mcp-market-header"]}>
<div className={styles["mcp-market-title"]}>
<div className={styles["mcp-market-name"]}>
{server.name}
{loadingStates[server.id] && (
<span
className={styles["operation-status"]}
data-status={getOperationStatusType(
loadingStates[server.id],
)}
>
{loadingStates[server.id]}
</span>
)}
{!loadingStates[server.id] && getServerStatusDisplay(server.id)}
{server.repo && (
<a
href={server.repo}
target="_blank"
rel="noopener noreferrer"
className={styles["repo-link"]}
title="Open repository"
>
<GithubIcon />
</a>
)}
</div>
<div className={styles["tags-container"]}>
{server.tags.map((tag, index) => (
<span key={index} className={styles["tag"]}>
{tag}
</span>
))}
</div>
<div
className={clsx(styles["mcp-market-info"], "one-line")}
title={server.description}
>
{server.description}
</div>
</div>
<div className={styles["mcp-market-actions"]}>
{isServerAdded(server.id) ? (
<>
{server.configurable && (
<IconButton
icon={<EditIcon />}
text="Configure"
onClick={() => setEditingServerId(server.id)}
disabled={isLoading}
/>
)}
{checkServerStatus(server.id).status === "paused" ? (
<>
<IconButton
icon={<PlayIcon />}
text="Start"
onClick={() => restartServer(server.id)}
disabled={isLoading}
/>
{/* <IconButton
icon={<DeleteIcon />}
text="Remove"
onClick={() => removeServer(server.id)}
disabled={isLoading}
/> */}
</>
) : (
<>
<IconButton
icon={<EyeIcon />}
text="Tools"
onClick={async () => {
setViewingServerId(server.id);
await loadTools(server.id);
}}
disabled={
isLoading ||
checkServerStatus(server.id).status === "error"
}
/>
<IconButton
icon={<StopIcon />}
text="Stop"
onClick={() => pauseServer(server.id)}
disabled={isLoading}
/>
</>
)}
</>
) : (
<IconButton
icon={<AddIcon />}
text="Add"
onClick={() => addServer(server)}
disabled={isLoading}
/>
)}
</div>
</div>
</div>
));
};
return (
<ErrorBoundary>
<div className={styles["mcp-market-page"]}>
<div className="window-header">
<div className="window-header-title">
<div className="window-header-main-title">
MCP Market
{loadingStates["all"] && (
<span className={styles["loading-indicator"]}>
{loadingStates["all"]}
</span>
)}
</div>
<div className="window-header-sub-title">
{Object.keys(config?.mcpServers ?? {}).length} servers configured
</div>
</div>
<div className="window-actions">
<div className="window-action-button">
<IconButton
icon={<RestartIcon />}
bordered
onClick={handleRestartAll}
text="Restart All"
disabled={isLoading}
/>
</div>
<div className="window-action-button">
<IconButton
icon={<CloseIcon />}
bordered
onClick={() => navigate(-1)}
disabled={isLoading}
/>
</div>
</div>
</div>
<div className={styles["mcp-market-page-body"]}>
<div className={styles["mcp-market-filter"]}>
<input
type="text"
className={styles["search-bar"]}
placeholder={"Search MCP Server"}
autoFocus
onInput={(e) => setSearchText(e.currentTarget.value)}
/>
</div>
<div className={styles["server-list"]}>{renderServerList()}</div>
</div>
{/* Edit server configuration */}
{editingServerId && (
<div className="modal-mask">
<Modal
title={`Configure Server - ${editingServerId}`}
onClose={() => !isLoading && setEditingServerId(undefined)}
actions={[
<IconButton
key="cancel"
text="Cancel"
onClick={() => setEditingServerId(undefined)}
bordered
disabled={isLoading}
/>,
<IconButton
key="confirm"
text="Save"
type="primary"
onClick={saveServerConfig}
bordered
disabled={isLoading}
/>,
]}
>
<List>{renderConfigForm()}</List>
</Modal>
</div>
)}
{viewingServerId && (
<div className="modal-mask">
<Modal
title={`Server Details - ${viewingServerId}`}
onClose={() => setViewingServerId(undefined)}
actions={[
<IconButton
key="close"
text="Close"
onClick={() => setViewingServerId(undefined)}
bordered
/>,
]}
>
<div className={styles["tools-list"]}>
{isLoading ? (
<div>Loading...</div>
) : tools?.tools ? (
tools.tools.map(
(tool: ListToolsResponse["tools"][number], index: number) => (
<div key={index} className={styles["tool-item"]}>
<div className={styles["tool-name"]}>{tool.name}</div>
<div className={styles["tool-description"]}>
{tool.description}
</div>
</div>
),
)
) : (
<div>No tools available</div>
)}
</div>
</Modal>
</div>
)}
</div>
</ErrorBoundary>
);
}
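The spread/single/env arg-mapping logic appears twice above (extraction in the editing effect, construction in `saveServerConfig`) and can be factored into a pure helper. A minimal sketch, assuming an `ArgsMapping` shape inferred from how `preset.argsMapping` is used — the repo's actual preset types may differ:

```typescript
// Hypothetical mapping shape, inferred from usage in the component above.
type ArgsMapping =
  | { type: "spread"; position?: number }
  | { type: "single"; position?: number }
  | { type: "env"; key?: string };

// Build command args and env from user config (mirrors saveServerConfig).
function buildArgs(
  baseArgs: string[],
  mappings: Record<string, ArgsMapping>,
  userConfig: Record<string, any>,
): { args: string[]; env: Record<string, string> } {
  const args = [...baseArgs];
  const env: Record<string, string> = {};
  for (const [key, mapping] of Object.entries(mappings)) {
    const value = userConfig[key];
    if (mapping.type === "spread" && Array.isArray(value)) {
      // Insert the whole array at the mapped position.
      args.splice(mapping.position ?? 0, 0, ...value);
    } else if (mapping.type === "single" && mapping.position !== undefined) {
      // Place a single value at a fixed position.
      args[mapping.position] = value;
    } else if (mapping.type === "env" && mapping.key && typeof value === "string") {
      // Route string values into environment variables.
      env[mapping.key] = value;
    }
  }
  return { args, env };
}
```

Extraction (the `args.slice(startPos)` path in the editing effect) is the inverse of the `spread` branch here, which is why both sides key off the same `position`.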

@@ -8,6 +8,7 @@ import Locale from "../locales";
import styles from "./message-selector.module.scss";
import { getMessageTextContent } from "../utils";
import clsx from "clsx";
function useShiftRange() {
const [startIndex, setStartIndex] = useState<number>();
@@ -71,6 +72,7 @@ export function MessageSelector(props: {
defaultSelectAll?: boolean;
onSelected?: (messages: ChatMessage[]) => void;
}) {
const LATEST_COUNT = 4;
const chatStore = useChatStore();
const session = chatStore.currentSession();
const isValid = (m: ChatMessage) => m.content && !m.isError && !m.streaming;
@@ -141,15 +143,13 @@ export function MessageSelector(props: {
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [startIndex, endIndex]);
const LATEST_COUNT = 4;
return (
<div className={styles["message-selector"]}>
<div className={styles["message-filter"]}>
<input
type="text"
placeholder={Locale.Select.Search}
className={styles["filter-item"] + " " + styles["search-bar"]}
className={clsx(styles["filter-item"], styles["search-bar"])}
value={searchInput}
onInput={(e) => {
setSearchInput(e.currentTarget.value);
@@ -196,9 +196,9 @@ export function MessageSelector(props: {
return (
<div
className={`${styles["message"]} ${
props.selection.has(m.id!) && styles["message-selected"]
}`}
className={clsx(styles["message"], {
[styles["message-selected"]]: props.selection.has(m.id!),
})}
key={i}
onClick={() => {
props.updateSelection((selection) => {
@@ -221,7 +221,7 @@ export function MessageSelector(props: {
<div className={styles["date"]}>
{new Date(m.date).toLocaleString()}
</div>
<div className={`${styles["content"]} one-line`}>
<div className={clsx(styles["content"], "one-line")}>
{getMessageTextContent(m)}
</div>
</div>

@@ -0,0 +1,7 @@
.select-compress-model {
width: 60%;
select {
max-width: 100%;
white-space: normal;
}
}

@@ -5,13 +5,21 @@ import Locale from "../locales";
import { InputRange } from "./input-range";
import { ListItem, Select } from "./ui-lib";
import { useAllModels } from "../utils/hooks";
import { groupBy } from "lodash-es";
import styles from "./model-config.module.scss";
import { getModelProvider } from "../utils/model";
export function ModelConfigList(props: {
modelConfig: ModelConfig;
updateConfig: (updater: (config: ModelConfig) => void) => void;
}) {
const allModels = useAllModels();
const groupModels = groupBy(
allModels.filter((v) => v.available),
"provider.providerName",
);
const value = `${props.modelConfig.model}@${props.modelConfig?.providerName}`;
const compressModelValue = `${props.modelConfig.compressModel}@${props.modelConfig?.compressProviderName}`;
return (
<>
@@ -19,21 +27,26 @@ export function ModelConfigList(props: {
<Select
aria-label={Locale.Settings.Model}
value={value}
align="left"
onChange={(e) => {
const [model, providerName] = e.currentTarget.value.split("@");
const [model, providerName] = getModelProvider(
e.currentTarget.value,
);
props.updateConfig((config) => {
config.model = ModalConfigValidator.model(model);
config.providerName = providerName as ServiceProvider;
});
}}
>
{allModels
.filter((v) => v.available)
.map((v, i) => (
<option value={`${v.name}@${v.provider?.providerName}`} key={i}>
{v.displayName}({v.provider?.providerName})
</option>
))}
{Object.keys(groupModels).map((providerName, index) => (
<optgroup label={providerName} key={index}>
{groupModels[providerName].map((v, i) => (
<option value={`${v.name}@${v.provider?.providerName}`} key={i}>
{v.displayName}
</option>
))}
</optgroup>
))}
</Select>
</ListItem>
<ListItem
@@ -228,6 +241,33 @@ export function ModelConfigList(props: {
}
></input>
</ListItem>
<ListItem
title={Locale.Settings.CompressModel.Title}
subTitle={Locale.Settings.CompressModel.SubTitle}
>
<Select
className={styles["select-compress-model"]}
aria-label={Locale.Settings.CompressModel.Title}
value={compressModelValue}
onChange={(e) => {
const [model, providerName] = getModelProvider(
e.currentTarget.value,
);
props.updateConfig((config) => {
config.compressModel = ModalConfigValidator.model(model);
config.compressProviderName = providerName as ServiceProvider;
});
}}
>
{allModels
.filter((v) => v.available)
.map((v, i) => (
<option value={`${v.name}@${v.provider?.providerName}`} key={i}>
{v.displayName}({v.provider?.providerName})
</option>
))}
</Select>
</ListItem>
</>
);
}
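The diff above swaps a bare `e.currentTarget.value.split("@")` for `getModelProvider`. One plausible motivation is that model names can themselves contain `@`, so the value must be split on the *last* `@` to keep the provider suffix intact. A hedged sketch of such a parser — the real `getModelProvider` in `app/utils/model` may be implemented differently:

```typescript
// Split a "model@provider" select value on the LAST "@", so model names
// containing "@" still parse correctly. Sketch only, not the repo's code.
function getModelProviderSketch(value: string): [string, string | undefined] {
  const idx = value.lastIndexOf("@");
  if (idx === -1) return [value, undefined]; // no provider suffix
  return [value.slice(0, idx), value.slice(idx + 1)];
}
```

With a naive `split("@")`, a value like `ft:gpt@org@Azure` would yield `"ft:gpt"` as the model and drop the rest; splitting on the last separator preserves it.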

@@ -16,6 +16,7 @@ import { MaskAvatar } from "./mask";
import { useCommand } from "../command";
import { showConfirm } from "./ui-lib";
import { BUILTIN_MASK_STORE } from "../masks";
import clsx from "clsx";
function MaskItem(props: { mask: Mask; onClick?: () => void }) {
return (
@@ -24,7 +25,9 @@ function MaskItem(props: { mask: Mask; onClick?: () => void }) {
avatar={props.mask.avatar}
model={props.mask.modelConfig.model}
/>
<div className={styles["mask-name"] + " one-line"}>{props.mask.name}</div>
<div className={clsx(styles["mask-name"], "one-line")}>
{props.mask.name}
</div>
</div>
);
}

@@ -10,7 +10,29 @@
max-height: 240px;
overflow-y: auto;
white-space: pre-wrap;
min-width: 300px;
min-width: 280px;
}
}
.plugin-schema {
display: flex;
justify-content: flex-end;
flex-direction: row;
input {
margin-right: 20px;
@media screen and (max-width: 600px) {
margin-right: 0px;
}
}
@media screen and (max-width: 600px) {
flex-direction: column;
gap: 5px;
button {
padding: 10px;
}
}
}

@@ -12,7 +12,6 @@ import EditIcon from "../icons/edit.svg";
import AddIcon from "../icons/add.svg";
import CloseIcon from "../icons/close.svg";
import DeleteIcon from "../icons/delete.svg";
import EyeIcon from "../icons/eye.svg";
import ConfirmIcon from "../icons/confirm.svg";
import ReloadIcon from "../icons/reload.svg";
import GithubIcon from "../icons/github.svg";
@@ -28,8 +27,8 @@ import {
} from "./ui-lib";
import Locale from "../locales";
import { useNavigate } from "react-router-dom";
import { useEffect, useState } from "react";
import { getClientConfig } from "../config/client";
import { useState } from "react";
import clsx from "clsx";
export function PluginPage() {
const navigate = useNavigate();
@@ -201,7 +200,7 @@ export function PluginPage() {
<div className={styles["mask-name"]}>
{m.title}@<small>{m.version}</small>
</div>
<div className={styles["mask-info"] + " one-line"}>
<div className={clsx(styles["mask-info"], "one-line")}>
{Locale.Plugin.Item.Info(
FunctionToolService.add(m).length,
)}
@@ -209,19 +208,11 @@ export function PluginPage() {
</div>
</div>
<div className={styles["mask-actions"]}>
{m.builtin ? (
<IconButton
icon={<EyeIcon />}
text={Locale.Plugin.Item.View}
onClick={() => setEditingPluginId(m.id)}
/>
) : (
<IconButton
icon={<EditIcon />}
text={Locale.Plugin.Item.Edit}
onClick={() => setEditingPluginId(m.id)}
/>
)}
<IconButton
icon={<EditIcon />}
text={Locale.Plugin.Item.Edit}
onClick={() => setEditingPluginId(m.id)}
/>
{!m.builtin && (
<IconButton
icon={<DeleteIcon />}
@@ -325,30 +316,13 @@ export function PluginPage() {
></PasswordInput>
</ListItem>
)}
{!getClientConfig()?.isApp && (
<ListItem
title={Locale.Plugin.Auth.Proxy}
subTitle={Locale.Plugin.Auth.ProxyDescription}
>
<input
type="checkbox"
checked={editingPlugin?.usingProxy}
style={{ minWidth: 16 }}
onChange={(e) => {
pluginStore.updatePlugin(editingPlugin.id, (plugin) => {
plugin.usingProxy = e.currentTarget.checked;
});
}}
></input>
</ListItem>
)}
</List>
<List>
<ListItem title={Locale.Plugin.EditModal.Content}>
<div style={{ display: "flex", justifyContent: "flex-end" }}>
<div className={pluginStyles["plugin-schema"]}>
<input
type="text"
style={{ minWidth: 200, marginRight: 20 }}
style={{ minWidth: 200 }}
onInput={(e) => setLoadUrl(e.currentTarget.value)}
></input>
<IconButton
@@ -362,7 +336,10 @@ export function PluginPage() {
<ListItem
subTitle={
<div
className={`markdown-body ${pluginStyles["plugin-content"]}`}
className={clsx(
"markdown-body",
pluginStyles["plugin-content"],
)}
dir="auto"
>
<pre>

@@ -0,0 +1 @@
export * from "./realtime-chat";

@@ -0,0 +1,74 @@
.realtime-chat {
width: 100%;
justify-content: center;
align-items: center;
position: relative;
display: flex;
flex-direction: column;
height: 100%;
padding: 20px;
box-sizing: border-box;
.circle-mic {
width: 150px;
height: 150px;
border-radius: 50%;
background: linear-gradient(to bottom right, #a0d8ef, #f0f8ff);
display: flex;
justify-content: center;
align-items: center;
}
.icon-center {
font-size: 24px;
}
.bottom-icons {
display: flex;
justify-content: space-between;
align-items: center;
width: 100%;
position: absolute;
bottom: 20px;
box-sizing: border-box;
padding: 0 20px;
}
.icon-left,
.icon-right {
width: 46px;
height: 46px;
font-size: 36px;
background: var(--second);
border-radius: 50%;
padding: 2px;
display: flex;
justify-content: center;
align-items: center;
cursor: pointer;
&:hover {
opacity: 0.8;
}
}
&.mobile {
display: none;
}
}
.pulse {
animation: pulse 1.5s infinite;
}
@keyframes pulse {
0% {
transform: scale(1);
opacity: 0.7;
}
50% {
transform: scale(1.1);
opacity: 1;
}
100% {
transform: scale(1);
opacity: 0.7;
}
}

@@ -0,0 +1,359 @@
import VoiceIcon from "@/app/icons/voice.svg";
import VoiceOffIcon from "@/app/icons/voice-off.svg";
import PowerIcon from "@/app/icons/power.svg";
import styles from "./realtime-chat.module.scss";
import clsx from "clsx";
import { useState, useRef, useEffect } from "react";
import { useChatStore, createMessage, useAppConfig } from "@/app/store";
import { IconButton } from "@/app/components/button";
import {
Modality,
RTClient,
RTInputAudioItem,
RTResponse,
TurnDetection,
} from "rt-client";
import { AudioHandler } from "@/app/lib/audio";
import { uploadImage } from "@/app/utils/chat";
import { VoicePrint } from "@/app/components/voice-print";
interface RealtimeChatProps {
onClose?: () => void;
onStartVoice?: () => void;
onPausedVoice?: () => void;
}
export function RealtimeChat({
onClose,
onStartVoice,
onPausedVoice,
}: RealtimeChatProps) {
const chatStore = useChatStore();
const session = chatStore.currentSession();
const config = useAppConfig();
const [status, setStatus] = useState("");
const [isRecording, setIsRecording] = useState(false);
const [isConnected, setIsConnected] = useState(false);
const [isConnecting, setIsConnecting] = useState(false);
const [modality, setModality] = useState("audio");
const [useVAD, setUseVAD] = useState(true);
const [frequencies, setFrequencies] = useState<Uint8Array | undefined>();
const clientRef = useRef<RTClient | null>(null);
const audioHandlerRef = useRef<AudioHandler | null>(null);
const initRef = useRef(false);
const temperature = config.realtimeConfig.temperature;
const apiKey = config.realtimeConfig.apiKey;
const model = config.realtimeConfig.model;
const azure = config.realtimeConfig.provider === "Azure";
const azureEndpoint = config.realtimeConfig.azure.endpoint;
const azureDeployment = config.realtimeConfig.azure.deployment;
const voice = config.realtimeConfig.voice;
const handleConnect = async () => {
if (isConnecting) return;
if (!isConnected) {
try {
setIsConnecting(true);
clientRef.current = azure
? new RTClient(
new URL(azureEndpoint),
{ key: apiKey },
{ deployment: azureDeployment },
)
: new RTClient({ key: apiKey }, { model });
const modalities: Modality[] =
modality === "audio" ? ["text", "audio"] : ["text"];
const turnDetection: TurnDetection = useVAD
? { type: "server_vad" }
: null;
await clientRef.current.configure({
instructions: "",
voice,
input_audio_transcription: { model: "whisper-1" },
turn_detection: turnDetection,
tools: [],
temperature,
modalities,
});
startResponseListener();
setIsConnected(true);
// TODO
// try {
// const recentMessages = chatStore.getMessagesWithMemory();
// for (const message of recentMessages) {
// const { role, content } = message;
// if (typeof content === "string") {
// await clientRef.current.sendItem({
// type: "message",
// role: role as any,
// content: [
// {
// type: (role === "assistant" ? "text" : "input_text") as any,
// text: content as string,
// },
// ],
// });
// }
// }
// // await clientRef.current.generateResponse();
// } catch (error) {
// console.error("Set message failed:", error);
// }
} catch (error) {
console.error("Connection failed:", error);
setStatus("Connection failed");
} finally {
setIsConnecting(false);
}
} else {
await disconnect();
}
};
const disconnect = async () => {
if (clientRef.current) {
try {
await clientRef.current.close();
clientRef.current = null;
setIsConnected(false);
} catch (error) {
console.error("Disconnect failed:", error);
}
}
};
const startResponseListener = async () => {
if (!clientRef.current) return;
try {
for await (const serverEvent of clientRef.current.events()) {
if (serverEvent.type === "response") {
await handleResponse(serverEvent);
} else if (serverEvent.type === "input_audio") {
await handleInputAudio(serverEvent);
}
}
} catch (error) {
if (clientRef.current) {
console.error("Response iteration error:", error);
}
}
};
const handleResponse = async (response: RTResponse) => {
for await (const item of response) {
if (item.type === "message" && item.role === "assistant") {
const botMessage = createMessage({
role: item.role,
content: "",
});
// add bot message first
chatStore.updateTargetSession(session, (session) => {
session.messages = session.messages.concat([botMessage]);
});
let hasAudio = false;
for await (const content of item) {
if (content.type === "text") {
for await (const text of content.textChunks()) {
botMessage.content += text;
}
} else if (content.type === "audio") {
const textTask = async () => {
for await (const text of content.transcriptChunks()) {
botMessage.content += text;
}
};
const audioTask = async () => {
audioHandlerRef.current?.startStreamingPlayback();
for await (const audio of content.audioChunks()) {
hasAudio = true;
audioHandlerRef.current?.playChunk(audio);
}
};
await Promise.all([textTask(), audioTask()]);
}
// update message.content
chatStore.updateTargetSession(session, (session) => {
session.messages = session.messages.concat();
});
}
if (hasAudio) {
// upload audio get audio_url
const blob = audioHandlerRef.current?.savePlayFile();
uploadImage(blob!).then((audio_url) => {
botMessage.audio_url = audio_url;
// update text and audio_url
chatStore.updateTargetSession(session, (session) => {
session.messages = session.messages.concat();
});
});
}
}
}
};
const handleInputAudio = async (item: RTInputAudioItem) => {
await item.waitForCompletion();
if (item.transcription) {
const userMessage = createMessage({
role: "user",
content: item.transcription,
});
chatStore.updateTargetSession(session, (session) => {
session.messages = session.messages.concat([userMessage]);
});
// save input audio_url, and update session
const { audioStartMillis, audioEndMillis } = item;
// upload audio get audio_url
const blob = audioHandlerRef.current?.saveRecordFile(
audioStartMillis,
audioEndMillis,
);
uploadImage(blob!).then((audio_url) => {
userMessage.audio_url = audio_url;
chatStore.updateTargetSession(session, (session) => {
session.messages = session.messages.concat();
});
});
}
// stop streaming play after get input audio.
audioHandlerRef.current?.stopStreamingPlayback();
};
const toggleRecording = async () => {
if (!isRecording && clientRef.current) {
try {
if (!audioHandlerRef.current) {
audioHandlerRef.current = new AudioHandler();
await audioHandlerRef.current.initialize();
}
await audioHandlerRef.current.startRecording(async (chunk) => {
await clientRef.current?.sendAudio(chunk);
});
setIsRecording(true);
} catch (error) {
console.error("Failed to start recording:", error);
}
} else if (audioHandlerRef.current) {
try {
audioHandlerRef.current.stopRecording();
if (!useVAD) {
const inputAudio = await clientRef.current?.commitAudio();
await handleInputAudio(inputAudio!);
await clientRef.current?.generateResponse();
}
setIsRecording(false);
} catch (error) {
console.error("Failed to stop recording:", error);
}
}
};
useEffect(() => {
// Prevent double initialization
if (initRef.current) return;
initRef.current = true;
const initAudioHandler = async () => {
const handler = new AudioHandler();
await handler.initialize();
audioHandlerRef.current = handler;
await handleConnect();
await toggleRecording();
};
initAudioHandler().catch((error) => {
setStatus(error);
console.error(error);
});
return () => {
if (isRecording) {
toggleRecording();
}
audioHandlerRef.current?.close().catch(console.error);
disconnect();
};
}, []);
useEffect(() => {
let animationFrameId: number;
if (isConnected && isRecording) {
const animationFrame = () => {
if (audioHandlerRef.current) {
const freqData = audioHandlerRef.current.getByteFrequencyData();
setFrequencies(freqData);
}
animationFrameId = requestAnimationFrame(animationFrame);
};
animationFrameId = requestAnimationFrame(animationFrame);
} else {
setFrequencies(undefined);
}
return () => {
if (animationFrameId) {
cancelAnimationFrame(animationFrameId);
}
};
}, [isConnected, isRecording]);
// update session params
useEffect(() => {
clientRef.current?.configure({ voice });
}, [voice]);
useEffect(() => {
clientRef.current?.configure({ temperature });
}, [temperature]);
const handleClose = async () => {
onClose?.();
if (isRecording) {
await toggleRecording();
}
disconnect().catch(console.error);
};
return (
<div className={styles["realtime-chat"]}>
<div
className={clsx(styles["circle-mic"], {
[styles["pulse"]]: isRecording,
})}
>
<VoicePrint frequencies={frequencies} isActive={isRecording} />
</div>
<div className={styles["bottom-icons"]}>
<div>
<IconButton
icon={isRecording ? <VoiceIcon /> : <VoiceOffIcon />}
onClick={toggleRecording}
disabled={!isConnected}
shadow
bordered
/>
</div>
<div className={styles["icon-center"]}>{status}</div>
<div>
<IconButton
icon={<PowerIcon />}
onClick={handleClose}
shadow
bordered
/>
</div>
</div>
</div>
);
}
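`handleResponse` above drains the text and audio chunk streams of one content item concurrently: two async tasks, one per stream, joined with `Promise.all` so transcript accumulation never blocks audio playback. The pattern in isolation, with stub async generators standing in for the `rt-client` chunk streams:

```typescript
// Stub chunk stream standing in for content.textChunks()/audioChunks().
async function* chunks<T>(items: T[]): AsyncGenerator<T> {
  for (const item of items) yield item;
}

// Drain two streams concurrently, as handleResponse does per content item.
async function consumeItem(
  textChunks: AsyncGenerator<string>,
  audioChunks: AsyncGenerator<Int16Array>,
): Promise<{ text: string; audioSamples: number }> {
  let text = "";
  let audioSamples = 0;
  const textTask = async () => {
    for await (const t of textChunks) text += t; // accumulate transcript
  };
  const audioTask = async () => {
    for await (const a of audioChunks) audioSamples += a.length; // play/count audio
  };
  // Neither stream waits on the other; both are fully drained before returning.
  await Promise.all([textTask(), audioTask()]);
  return { text, audioSamples };
}
```

Consuming the streams sequentially instead would stall audio until the transcript finished (or vice versa), which is why the component splits them into parallel tasks.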

@@ -0,0 +1,173 @@
import { RealtimeConfig } from "@/app/store";
import Locale from "@/app/locales";
import { ListItem, Select, PasswordInput } from "@/app/components/ui-lib";
import { InputRange } from "@/app/components/input-range";
import { Voice } from "rt-client";
import { ServiceProvider } from "@/app/constant";
const providers = [ServiceProvider.OpenAI, ServiceProvider.Azure];
const models = ["gpt-4o-realtime-preview-2024-10-01"];
const voice = ["alloy", "shimmer", "echo"];
export function RealtimeConfigList(props: {
realtimeConfig: RealtimeConfig;
updateConfig: (updater: (config: RealtimeConfig) => void) => void;
}) {
const azureConfigComponent = props.realtimeConfig.provider ===
ServiceProvider.Azure && (
<>
<ListItem
title={Locale.Settings.Realtime.Azure.Endpoint.Title}
subTitle={Locale.Settings.Realtime.Azure.Endpoint.SubTitle}
>
<input
value={props.realtimeConfig?.azure?.endpoint}
type="text"
placeholder={Locale.Settings.Realtime.Azure.Endpoint.Title}
onChange={(e) => {
props.updateConfig(
(config) => (config.azure.endpoint = e.currentTarget.value),
);
}}
/>
</ListItem>
<ListItem
title={Locale.Settings.Realtime.Azure.Deployment.Title}
subTitle={Locale.Settings.Realtime.Azure.Deployment.SubTitle}
>
<input
value={props.realtimeConfig?.azure?.deployment}
type="text"
placeholder={Locale.Settings.Realtime.Azure.Deployment.Title}
onChange={(e) => {
props.updateConfig(
(config) => (config.azure.deployment = e.currentTarget.value),
);
}}
/>
</ListItem>
</>
);
return (
<>
<ListItem
title={Locale.Settings.Realtime.Enable.Title}
subTitle={Locale.Settings.Realtime.Enable.SubTitle}
>
<input
type="checkbox"
checked={props.realtimeConfig.enable}
onChange={(e) =>
props.updateConfig(
(config) => (config.enable = e.currentTarget.checked),
)
}
></input>
</ListItem>
{props.realtimeConfig.enable && (
<>
<ListItem
title={Locale.Settings.Realtime.Provider.Title}
subTitle={Locale.Settings.Realtime.Provider.SubTitle}
>
<Select
aria-label={Locale.Settings.Realtime.Provider.Title}
value={props.realtimeConfig.provider}
onChange={(e) => {
props.updateConfig(
(config) =>
(config.provider = e.target.value as ServiceProvider),
);
}}
>
{providers.map((v, i) => (
<option value={v} key={i}>
{v}
</option>
))}
</Select>
</ListItem>
<ListItem
title={Locale.Settings.Realtime.Model.Title}
subTitle={Locale.Settings.Realtime.Model.SubTitle}
>
<Select
aria-label={Locale.Settings.Realtime.Model.Title}
value={props.realtimeConfig.model}
onChange={(e) => {
props.updateConfig((config) => (config.model = e.target.value));
}}
>
{models.map((v, i) => (
<option value={v} key={i}>
{v}
</option>
))}
</Select>
</ListItem>
<ListItem
title={Locale.Settings.Realtime.ApiKey.Title}
subTitle={Locale.Settings.Realtime.ApiKey.SubTitle}
>
<PasswordInput
aria={Locale.Settings.ShowPassword}
aria-label={Locale.Settings.Realtime.ApiKey.Title}
value={props.realtimeConfig.apiKey}
type="text"
placeholder={Locale.Settings.Realtime.ApiKey.Placeholder}
onChange={(e) => {
props.updateConfig(
(config) => (config.apiKey = e.currentTarget.value),
);
}}
/>
</ListItem>
{azureConfigComponent}
<ListItem
title={Locale.Settings.TTS.Voice.Title}
subTitle={Locale.Settings.TTS.Voice.SubTitle}
>
<Select
value={props.realtimeConfig.voice}
onChange={(e) => {
props.updateConfig(
(config) => (config.voice = e.currentTarget.value as Voice),
);
}}
>
{voice.map((v, i) => (
<option value={v} key={i}>
{v}
</option>
))}
</Select>
</ListItem>
<ListItem
title={Locale.Settings.Realtime.Temperature.Title}
subTitle={Locale.Settings.Realtime.Temperature.SubTitle}
>
<InputRange
aria={Locale.Settings.Temperature.Title}
value={props.realtimeConfig?.temperature?.toFixed(1)}
min="0.6"
max="1"
step="0.1"
onChange={(e) => {
props.updateConfig(
(config) =>
(config.temperature = e.currentTarget.valueAsNumber),
);
}}
></InputRange>
</ListItem>
</>
)}
</>
);
}

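The config screens above all commit changes through the same updater-callback pattern: clone the config object, let the caller mutate the clone, then write the clone back to the store. A minimal sketch of that flow (the `RealtimeConfig` shape here is a hypothetical subset, not the store's full type):

```typescript
// Hypothetical subset of the RealtimeConfig shape used above.
interface RealtimeConfig {
  enable: boolean;
  temperature: number;
}

// Mirrors the copy-then-mutate flow in the settings code: shallow-clone
// the config, let the updater mutate the clone, return the clone.
function applyUpdate(
  config: RealtimeConfig,
  updater: (config: RealtimeConfig) => void,
): RealtimeConfig {
  const next = { ...config };
  updater(next);
  return next;
}

const current: RealtimeConfig = { enable: false, temperature: 0.8 };
const next = applyUpdate(current, (c) => (c.enable = true));
console.log(next.enable, current.enable); // the original object is untouched
```

The shallow copy is what keeps the store's previous state intact until `config.update` commits the new object.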
View File

@@ -4,6 +4,7 @@ import { Select } from "@/app/components/ui-lib";
import { IconButton } from "@/app/components/button";
import Locale from "@/app/locales";
import { useSdStore } from "@/app/store/sd";
import clsx from "clsx";
export const params = [
{
@@ -136,7 +137,7 @@ export function ControlParamItem(props: {
className?: string;
}) {
return (
<div className={styles["ctrl-param-item"] + ` ${props.className || ""}`}>
<div className={clsx(styles["ctrl-param-item"], props.className)}>
<div className={styles["ctrl-param-item-header"]}>
<div className={styles["ctrl-param-item-title"]}>
<div>

View File

@@ -36,6 +36,7 @@ import { removeImage } from "@/app/utils/chat";
import { SideBar } from "./sd-sidebar";
import { WindowContent } from "@/app/components/home";
import { params } from "./sd-panel";
import clsx from "clsx";
function getSdTaskStatus(item: any) {
let s: string;
@@ -104,7 +105,7 @@ export function Sd() {
return (
<>
<SideBar className={isSd ? homeStyles["sidebar-show"] : ""} />
<SideBar className={clsx({ [homeStyles["sidebar-show"]]: isSd })} />
<WindowContent>
<div className={chatStyles.chat} key={"1"}>
<div className="window-header" data-tauri-drag-region>
@@ -121,7 +122,10 @@ export function Sd() {
</div>
)}
<div
className={`window-header-title ${chatStyles["chat-body-title"]}`}
className={clsx(
"window-header-title",
chatStyles["chat-body-title"],
)}
>
<div className={`window-header-main-title`}>Stability AI</div>
<div className="window-header-sub-title">

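Several diffs in this compare replace manual string concatenation with `clsx`. The semantics being relied on can be sketched with a tiny stand-in (a simplification for illustration, not the real library, which also handles arrays and nesting):

```typescript
// Minimal stand-in for clsx's string/object handling: string inputs are
// kept, object keys are kept only when their value is truthy, and falsy
// arguments are dropped entirely.
type ClassInput = string | false | null | undefined | Record<string, unknown>;

function miniClsx(...inputs: ClassInput[]): string {
  const parts: string[] = [];
  for (const input of inputs) {
    if (!input) continue;
    if (typeof input === "string") parts.push(input);
    else for (const key of Object.keys(input)) if (input[key]) parts.push(key);
  }
  return parts.join(" ");
}

// Unlike `styles.a + " " + (cond && styles.b)`, a false condition
// contributes nothing instead of the literal string "false".
console.log(miniClsx("sidebar", { "narrow-sidebar": false }, undefined)); // "sidebar"
```

This is why the migrated class names no longer risk emitting `"false"` or `"undefined"` into the DOM.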
View File

@@ -72,3 +72,9 @@
}
}
}
.subtitle-button {
button {
overflow: visible;
}
}

View File

@@ -9,6 +9,7 @@ import CopyIcon from "../icons/copy.svg";
import ClearIcon from "../icons/clear.svg";
import LoadingIcon from "../icons/three-dots.svg";
import EditIcon from "../icons/edit.svg";
import FireIcon from "../icons/fire.svg";
import EyeIcon from "../icons/eye.svg";
import DownloadIcon from "../icons/download.svg";
import UploadIcon from "../icons/upload.svg";
@@ -18,7 +19,7 @@ import ConfirmIcon from "../icons/confirm.svg";
import ConnectionIcon from "../icons/connection.svg";
import CloudSuccessIcon from "../icons/cloud-success.svg";
import CloudFailIcon from "../icons/cloud-fail.svg";
import { trackSettingsPageGuideToCPaymentClick } from "../utils/auth-settings-events";
import {
Input,
List,
@@ -30,7 +31,6 @@ import {
showConfirm,
showToast,
} from "./ui-lib";
import { ModelConfigList } from "./model-config";
import { IconButton } from "./button";
import {
@@ -48,7 +48,7 @@ import Locale, {
changeLang,
getLang,
} from "../locales";
import { copyToClipboard } from "../utils";
import { copyToClipboard, clientUpdate, semverCompare } from "../utils";
import Link from "next/link";
import {
Anthropic,
@@ -58,6 +58,7 @@ import {
ByteDance,
Alibaba,
Moonshot,
XAI,
Google,
GoogleSafetySettingsThreshold,
OPENAI_BASE_URL,
@@ -69,6 +70,10 @@ import {
UPDATE_URL,
Stability,
Iflytek,
SAAS_CHAT_URL,
ChatGLM,
DeepSeek,
SiliconFlow,
} from "../constant";
import { Prompt, SearchService, usePromptStore } from "../store/prompt";
import { ErrorBoundary } from "./error";
@@ -80,6 +85,8 @@ import { useSyncStore } from "../store/sync";
import { nanoid } from "nanoid";
import { useMaskStore } from "../store/mask";
import { ProviderType } from "../utils/cloud";
import { TTSConfigList } from "./tts-config";
import { RealtimeConfigList } from "./realtime-chat/realtime-config";
function EditPromptModal(props: { id: string; onClose: () => void }) {
const promptStore = usePromptStore();
@@ -522,6 +529,8 @@ function SyncItems() {
setShowSyncConfigModal(true);
}}
/>
{/* perform sync with the cloud */}
{couldSync && (
<IconButton
icon={<ResetIcon />}
@@ -582,7 +591,7 @@ export function Settings() {
const [checkingUpdate, setCheckingUpdate] = useState(false);
const currentVersion = updateStore.formatVersion(updateStore.version);
const remoteId = updateStore.formatVersion(updateStore.remoteVersion);
const hasNewVersion = currentVersion !== remoteId;
const hasNewVersion = semverCompare(currentVersion, remoteId) === -1;
const updateUrl = getClientConfig()?.isApp ? RELEASE_URL : UPDATE_URL;
function checkUpdate(force = false) {
@@ -685,6 +694,31 @@ export function Settings() {
</ListItem>
);
const saasStartComponent = (
<ListItem
className={styles["subtitle-button"]}
title={
Locale.Settings.Access.SaasStart.Title +
`${Locale.Settings.Access.SaasStart.Label}`
}
subTitle={Locale.Settings.Access.SaasStart.SubTitle}
>
<IconButton
aria={
Locale.Settings.Access.SaasStart.Title +
Locale.Settings.Access.SaasStart.ChatNow
}
icon={<FireIcon />}
type={"primary"}
text={Locale.Settings.Access.SaasStart.ChatNow}
onClick={() => {
trackSettingsPageGuideToCPaymentClick();
window.location.href = SAAS_CHAT_URL;
}}
/>
</ListItem>
);
const useCustomConfigComponent = // Conditionally render the following ListItem based on clientConfig.isApp
!clientConfig?.isApp && ( // only show if isApp is false
<ListItem
@@ -1166,6 +1200,167 @@ export function Settings() {
</>
);
const deepseekConfigComponent = accessStore.provider ===
ServiceProvider.DeepSeek && (
<>
<ListItem
title={Locale.Settings.Access.DeepSeek.Endpoint.Title}
subTitle={
Locale.Settings.Access.DeepSeek.Endpoint.SubTitle +
DeepSeek.ExampleEndpoint
}
>
<input
aria-label={Locale.Settings.Access.DeepSeek.Endpoint.Title}
type="text"
value={accessStore.deepseekUrl}
placeholder={DeepSeek.ExampleEndpoint}
onChange={(e) =>
accessStore.update(
(access) => (access.deepseekUrl = e.currentTarget.value),
)
}
></input>
</ListItem>
<ListItem
title={Locale.Settings.Access.DeepSeek.ApiKey.Title}
subTitle={Locale.Settings.Access.DeepSeek.ApiKey.SubTitle}
>
<PasswordInput
aria-label={Locale.Settings.Access.DeepSeek.ApiKey.Title}
value={accessStore.deepseekApiKey}
type="text"
placeholder={Locale.Settings.Access.DeepSeek.ApiKey.Placeholder}
onChange={(e) => {
accessStore.update(
(access) => (access.deepseekApiKey = e.currentTarget.value),
);
}}
/>
</ListItem>
</>
);
const XAIConfigComponent = accessStore.provider === ServiceProvider.XAI && (
<>
<ListItem
title={Locale.Settings.Access.XAI.Endpoint.Title}
subTitle={
Locale.Settings.Access.XAI.Endpoint.SubTitle + XAI.ExampleEndpoint
}
>
<input
aria-label={Locale.Settings.Access.XAI.Endpoint.Title}
type="text"
value={accessStore.xaiUrl}
placeholder={XAI.ExampleEndpoint}
onChange={(e) =>
accessStore.update(
(access) => (access.xaiUrl = e.currentTarget.value),
)
}
></input>
</ListItem>
<ListItem
title={Locale.Settings.Access.XAI.ApiKey.Title}
subTitle={Locale.Settings.Access.XAI.ApiKey.SubTitle}
>
<PasswordInput
aria-label={Locale.Settings.Access.XAI.ApiKey.Title}
value={accessStore.xaiApiKey}
type="text"
placeholder={Locale.Settings.Access.XAI.ApiKey.Placeholder}
onChange={(e) => {
accessStore.update(
(access) => (access.xaiApiKey = e.currentTarget.value),
);
}}
/>
</ListItem>
</>
);
const chatglmConfigComponent = accessStore.provider ===
ServiceProvider.ChatGLM && (
<>
<ListItem
title={Locale.Settings.Access.ChatGLM.Endpoint.Title}
subTitle={
Locale.Settings.Access.ChatGLM.Endpoint.SubTitle +
ChatGLM.ExampleEndpoint
}
>
<input
aria-label={Locale.Settings.Access.ChatGLM.Endpoint.Title}
type="text"
value={accessStore.chatglmUrl}
placeholder={ChatGLM.ExampleEndpoint}
onChange={(e) =>
accessStore.update(
(access) => (access.chatglmUrl = e.currentTarget.value),
)
}
></input>
</ListItem>
<ListItem
title={Locale.Settings.Access.ChatGLM.ApiKey.Title}
subTitle={Locale.Settings.Access.ChatGLM.ApiKey.SubTitle}
>
<PasswordInput
aria-label={Locale.Settings.Access.ChatGLM.ApiKey.Title}
value={accessStore.chatglmApiKey}
type="text"
placeholder={Locale.Settings.Access.ChatGLM.ApiKey.Placeholder}
onChange={(e) => {
accessStore.update(
(access) => (access.chatglmApiKey = e.currentTarget.value),
);
}}
/>
</ListItem>
</>
);
const siliconflowConfigComponent = accessStore.provider ===
ServiceProvider.SiliconFlow && (
<>
<ListItem
title={Locale.Settings.Access.SiliconFlow.Endpoint.Title}
subTitle={
Locale.Settings.Access.SiliconFlow.Endpoint.SubTitle +
SiliconFlow.ExampleEndpoint
}
>
<input
aria-label={Locale.Settings.Access.SiliconFlow.Endpoint.Title}
type="text"
value={accessStore.siliconflowUrl}
placeholder={SiliconFlow.ExampleEndpoint}
onChange={(e) =>
accessStore.update(
(access) => (access.siliconflowUrl = e.currentTarget.value),
)
}
></input>
</ListItem>
<ListItem
title={Locale.Settings.Access.SiliconFlow.ApiKey.Title}
subTitle={Locale.Settings.Access.SiliconFlow.ApiKey.SubTitle}
>
<PasswordInput
aria-label={Locale.Settings.Access.SiliconFlow.ApiKey.Title}
value={accessStore.siliconflowApiKey}
type="text"
placeholder={Locale.Settings.Access.SiliconFlow.ApiKey.Placeholder}
onChange={(e) => {
accessStore.update(
(access) => (access.siliconflowApiKey = e.currentTarget.value),
);
}}
/>
</ListItem>
</>
);
const stabilityConfigComponent = accessStore.provider ===
ServiceProvider.Stability && (
<>
@@ -1329,9 +1524,17 @@ export function Settings() {
{checkingUpdate ? (
<LoadingIcon />
) : hasNewVersion ? (
<Link href={updateUrl} target="_blank" className="link">
{Locale.Settings.Update.GoToUpdate}
</Link>
clientConfig?.isApp ? (
<IconButton
icon={<ResetIcon></ResetIcon>}
text={Locale.Settings.Update.GoToUpdate}
onClick={() => clientUpdate()}
/>
) : (
<Link href={updateUrl} target="_blank" className="link">
{Locale.Settings.Update.GoToUpdate}
</Link>
)
) : (
<IconButton
icon={<ResetIcon></ResetIcon>}
@@ -1464,6 +1667,39 @@ export function Settings() {
}
></input>
</ListItem>
<ListItem
title={Locale.Mask.Config.Artifacts.Title}
subTitle={Locale.Mask.Config.Artifacts.SubTitle}
>
<input
aria-label={Locale.Mask.Config.Artifacts.Title}
type="checkbox"
checked={config.enableArtifacts}
onChange={(e) =>
updateConfig(
(config) =>
(config.enableArtifacts = e.currentTarget.checked),
)
}
></input>
</ListItem>
<ListItem
title={Locale.Mask.Config.CodeFold.Title}
subTitle={Locale.Mask.Config.CodeFold.SubTitle}
>
<input
aria-label={Locale.Mask.Config.CodeFold.Title}
type="checkbox"
checked={config.enableCodeFold}
data-testid="enable-code-fold-checkbox"
onChange={(e) =>
updateConfig(
(config) => (config.enableCodeFold = e.currentTarget.checked),
)
}
></input>
</ListItem>
</List>
<SyncItems />
@@ -1540,6 +1776,7 @@ export function Settings() {
</List>
<List id={SlotID.CustomModel}>
{saasStartComponent}
{accessCodeComponent}
{!accessStore.hideUserApiKey && (
@@ -1580,8 +1817,12 @@ export function Settings() {
{alibabaConfigComponent}
{tencentConfigComponent}
{moonshotConfigComponent}
{deepseekConfigComponent}
{stabilityConfigComponent}
{lflytekConfigComponent}
{XAIConfigComponent}
{chatglmConfigComponent}
{siliconflowConfigComponent}
</>
)}
</>
@@ -1616,9 +1857,11 @@ export function Settings() {
<ListItem
title={Locale.Settings.Access.CustomModel.Title}
subTitle={Locale.Settings.Access.CustomModel.SubTitle}
vertical={true}
>
<input
aria-label={Locale.Settings.Access.CustomModel.Title}
style={{ width: "100%", maxWidth: "unset", textAlign: "left" }}
type="text"
value={config.customModels}
placeholder="model1,model2,model3"
@@ -1631,7 +1874,8 @@ export function Settings() {
</ListItem>
</List>
<List>
{/* custom model configuration */}
{/* <List>
<ModelConfigList
modelConfig={config.modelConfig}
updateConfig={(updater) => {
@@ -1640,11 +1884,33 @@ export function Settings() {
config.update((config) => (config.modelConfig = modelConfig));
}}
/>
</List>
</List> */}
{shouldShowPromptModal && (
<UserPromptModal onClose={() => setShowPromptModal(false)} />
)}
<List>
<RealtimeConfigList
realtimeConfig={config.realtimeConfig}
updateConfig={(updater) => {
const realtimeConfig = { ...config.realtimeConfig };
updater(realtimeConfig);
config.update(
(config) => (config.realtimeConfig = realtimeConfig),
);
}}
/>
</List>
<List>
<TTSConfigList
ttsConfig={config.ttsConfig}
updateConfig={(updater) => {
const ttsConfig = { ...config.ttsConfig };
updater(ttsConfig);
config.update((config) => (config.ttsConfig = ttsConfig));
}}
/>
</List>
<DangerItems />
</div>

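The update check in this file switches from plain string inequality to `semverCompare(...) === -1`. The motivation: lexicographic comparison mis-orders versions such as "2.10.0" vs "2.9.0". A sketch of a numeric segment-by-segment compare (the repo's actual `semverCompare` may differ, e.g. in pre-release handling):

```typescript
// Compare dotted numeric versions: -1 if a < b, 0 if equal, 1 if a > b.
// Covers only numeric segments; pre-release tags are out of scope here.
function compareVersions(a: string, b: string): number {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  const len = Math.max(pa.length, pb.length);
  for (let i = 0; i < len; i++) {
    const x = pa[i] ?? 0; // missing segments count as 0
    const y = pb[i] ?? 0;
    if (x !== y) return x < y ? -1 : 1;
  }
  return 0;
}

// String comparison would claim "2.10.0" < "2.9.0"; numeric compare does not.
console.log(compareVersions("2.10.0", "2.9.0")); // 1
```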
View File

@@ -1,15 +1,16 @@
import React, { useEffect, useRef, useMemo, useState, Fragment } from "react";
import React, { Fragment, useEffect, useMemo, useRef, useState } from "react";
import styles from "./home.module.scss";
import { IconButton } from "./button";
import SettingsIcon from "../icons/settings.svg";
import GithubIcon from "../icons/github.svg";
import ChatGptIcon from "../icons/chatgpt.svg";
import ChatGptIcon from "../icons/chebichat.svg";
import AddIcon from "../icons/add.svg";
import CloseIcon from "../icons/close.svg";
import DeleteIcon from "../icons/delete.svg";
import MaskIcon from "../icons/mask.svg";
import McpIcon from "../icons/mcp.svg";
import DragIcon from "../icons/drag.svg";
import DiscoveryIcon from "../icons/discovery.svg";
@@ -23,14 +24,22 @@ import {
MIN_SIDEBAR_WIDTH,
NARROW_SIDEBAR_WIDTH,
Path,
PLUGINS,
REPO_URL,
} from "../constant";
import { Link, useNavigate } from "react-router-dom";
import { isIOS, useMobileScreen } from "../utils";
import dynamic from "next/dynamic";
import { showConfirm, Selector } from "./ui-lib";
import { Selector, showConfirm, showToast } from "./ui-lib";
import clsx from "clsx";
import { isMcpEnabled } from "../mcp/actions";
import { useSyncStore } from "../store/sync";
const DISCOVERY = [
// { name: Locale.Plugin.Name, path: Path.Plugins },
// { name: "Stable Diffusion", path: Path.Sd },
{ name: Locale.UI.Sync, path: "/sync" },
{ name: Locale.SearchChat.Page.Title, path: Path.SearchChat },
];
const ChatList = dynamic(async () => (await import("./chat-list")).ChatList, {
loading: () => null,
@@ -128,6 +137,7 @@ export function useDragSideBar() {
shouldNarrow,
};
}
export function SideBarContainer(props: {
children: React.ReactNode;
onDragStart: (e: MouseEvent) => void;
@@ -142,9 +152,9 @@ export function SideBarContainer(props: {
const { children, className, onDragStart, shouldNarrow } = props;
return (
<div
className={`${styles.sidebar} ${className} ${
shouldNarrow && styles["narrow-sidebar"]
}`}
className={clsx(styles.sidebar, className, {
[styles["narrow-sidebar"]]: shouldNarrow,
})}
style={{
// #3016 disable transition on ios mobile screen
transition: isMobileScreen && isIOSMobile ? "none" : undefined,
@@ -166,18 +176,24 @@ export function SideBarHeader(props: {
subTitle?: string | React.ReactNode;
logo?: React.ReactNode;
children?: React.ReactNode;
shouldNarrow?: boolean;
}) {
const { title, subTitle, logo, children } = props;
const { title, subTitle, logo, children, shouldNarrow } = props;
return (
<Fragment>
<div className={styles["sidebar-header"]} data-tauri-drag-region>
<div
className={clsx(styles["sidebar-header"], {
[styles["sidebar-header-narrow"]]: shouldNarrow,
})}
data-tauri-drag-region
>
<div className={styles["sidebar-title-container"]}>
<div className={styles["sidebar-title"]} data-tauri-drag-region>
{title}
</div>
<div className={styles["sidebar-sub-title"]}>{subTitle}</div>
</div>
<div className={styles["sidebar-logo"] + " no-dark"}>{logo}</div>
<div className={clsx(styles["sidebar-logo"], "no-dark")}>{logo}</div>
</div>
{children}
</Fragment>
@@ -213,10 +229,23 @@ export function SideBarTail(props: {
export function SideBar(props: { className?: string }) {
useHotKey();
const { onDragStart, shouldNarrow } = useDragSideBar();
const [showPluginSelector, setShowPluginSelector] = useState(false);
const [showDiscoverySelector, setshowDiscoverySelector] = useState(false);
const navigate = useNavigate();
const config = useAppConfig();
const chatStore = useChatStore();
const [mcpEnabled, setMcpEnabled] = useState(false);
const syncStore = useSyncStore();
useEffect(() => {
// Check whether MCP is enabled
const checkMcpStatus = async () => {
const enabled = await isMcpEnabled();
setMcpEnabled(enabled);
console.log("[SideBar] MCP enabled:", enabled);
};
checkMcpStatus();
}, []);
return (
<SideBarContainer
@@ -225,16 +254,20 @@ export function SideBar(props: { className?: string }) {
{...props}
>
<SideBarHeader
title="NextChat"
subTitle="Build your own AI assistant."
logo={<ChatGptIcon />}
title="Chebi Chat" // Sidebar title
subTitle="Trợ lý AI học tiếng Trung" // Sidebar subtitle ("AI assistant for learning Chinese")
logo={<ChatGptIcon />} // Logo to display
shouldNarrow={shouldNarrow} // Whether the sidebar is narrowed
>
{/* Toolbar at the top of the sidebar */}
<div className={styles["sidebar-header-bar"]}>
{/* Button to open the new-chat screen or the mask list */}
<IconButton
icon={<MaskIcon />}
text={shouldNarrow ? undefined : Locale.Mask.Name}
className={styles["sidebar-bar-button"]}
onClick={() => {
// If the mask splash screen is still enabled, go to new chat; otherwise go to the mask list
if (config.dontShowMaskSplashScreen !== true) {
navigate(Path.NewChat, { state: { fromHome: true } });
} else {
@@ -243,32 +276,54 @@ export function SideBar(props: { className?: string }) {
}}
shadow
/>
{/* Show the MCP button when the MCP feature is enabled */}
{mcpEnabled && (
<IconButton
icon={<McpIcon />}
text={shouldNarrow ? undefined : Locale.Mcp.Name}
className={styles["sidebar-bar-button"]}
onClick={() => {
// Navigate to the MCP Market screen
navigate(Path.McpMarket, { state: { fromHome: true } });
}}
shadow
/>
)}
{/* Button to open the Discovery selector */}
<IconButton
icon={<DiscoveryIcon />}
text={shouldNarrow ? undefined : Locale.Discovery.Name}
className={styles["sidebar-bar-button"]}
onClick={() => setShowPluginSelector(true)}
onClick={() => setshowDiscoverySelector(true)}
shadow
/>
</div>
{showPluginSelector && (
{/* Show the selector when the user clicks Discovery */}
{showDiscoverySelector && (
<Selector
items={[
{
title: "👇 Please select the feature you need to use",
value: "-",
disable: true,
},
...PLUGINS.map((item) => {
...DISCOVERY.map((item) => {
return {
title: item.name,
value: item.path,
};
}),
]}
onClose={() => setShowPluginSelector(false)}
onSelection={(s) => {
navigate(s[0], { state: { fromHome: true } });
onClose={() => setshowDiscoverySelector(false)}
onSelection={async (s) => {
console.log(s[0]);
if (s[0] == "/sync") {
try {
await syncStore.sync();
console.log("[Sync] success");
showToast(Locale.Settings.Sync.Success);
} catch (e) {
showToast(Locale.Settings.Sync.Fail);
console.error("[Sync]", e);
}
} else {
navigate(s[0], { state: { fromHome: true } });
}
}}
/>
)}
@@ -285,7 +340,7 @@ export function SideBar(props: { className?: string }) {
<SideBarTail
primaryAction={
<>
<div className={styles["sidebar-action"] + " " + styles.mobile}>
<div className={clsx(styles["sidebar-action"], styles.mobile)}>
<IconButton
icon={<DeleteIcon />}
onClick={async () => {
@@ -304,7 +359,8 @@ export function SideBar(props: { className?: string }) {
/>
</Link>
</div>
<div className={styles["sidebar-action"]}>
{/* <div className={styles["sidebar-action"]}>
<a href={REPO_URL} target="_blank" rel="noopener noreferrer">
<IconButton
aria={Locale.Export.MessageFromChatGPT}
@@ -312,7 +368,7 @@ export function SideBar(props: { className?: string }) {
shadow
/>
</a>
</div>
</div> */}
</>
}
secondaryAction={

View File

@@ -0,0 +1,133 @@
import { TTSConfig, TTSConfigValidator } from "../store";
import Locale from "../locales";
import { ListItem, Select } from "./ui-lib";
import {
DEFAULT_TTS_ENGINE,
DEFAULT_TTS_ENGINES,
DEFAULT_TTS_MODELS,
DEFAULT_TTS_VOICES,
} from "../constant";
import { InputRange } from "./input-range";
export function TTSConfigList(props: {
ttsConfig: TTSConfig;
updateConfig: (updater: (config: TTSConfig) => void) => void;
}) {
return (
<>
<ListItem
title={Locale.Settings.TTS.Enable.Title}
subTitle={Locale.Settings.TTS.Enable.SubTitle}
>
<input
type="checkbox"
checked={props.ttsConfig.enable}
onChange={(e) =>
props.updateConfig(
(config) => (config.enable = e.currentTarget.checked),
)
}
></input>
</ListItem>
{/* <ListItem
title={Locale.Settings.TTS.Autoplay.Title}
subTitle={Locale.Settings.TTS.Autoplay.SubTitle}
>
<input
type="checkbox"
checked={props.ttsConfig.autoplay}
onChange={(e) =>
props.updateConfig(
(config) => (config.autoplay = e.currentTarget.checked),
)
}
></input>
</ListItem> */}
<ListItem title={Locale.Settings.TTS.Engine}>
<Select
value={props.ttsConfig.engine}
onChange={(e) => {
props.updateConfig(
(config) =>
(config.engine = TTSConfigValidator.engine(
e.currentTarget.value,
)),
);
}}
>
{DEFAULT_TTS_ENGINES.map((v, i) => (
<option value={v} key={i}>
{v}
</option>
))}
</Select>
</ListItem>
{props.ttsConfig.engine === DEFAULT_TTS_ENGINE && (
<>
<ListItem title={Locale.Settings.TTS.Model}>
<Select
value={props.ttsConfig.model}
onChange={(e) => {
props.updateConfig(
(config) =>
(config.model = TTSConfigValidator.model(
e.currentTarget.value,
)),
);
}}
>
{DEFAULT_TTS_MODELS.map((v, i) => (
<option value={v} key={i}>
{v}
</option>
))}
</Select>
</ListItem>
<ListItem
title={Locale.Settings.TTS.Voice.Title}
subTitle={Locale.Settings.TTS.Voice.SubTitle}
>
<Select
value={props.ttsConfig.voice}
onChange={(e) => {
props.updateConfig(
(config) =>
(config.voice = TTSConfigValidator.voice(
e.currentTarget.value,
)),
);
}}
>
{DEFAULT_TTS_VOICES.map((v, i) => (
<option value={v} key={i}>
{v}
</option>
))}
</Select>
</ListItem>
<ListItem
title={Locale.Settings.TTS.Speed.Title}
subTitle={Locale.Settings.TTS.Speed.SubTitle}
>
<InputRange
aria={Locale.Settings.TTS.Speed.Title}
value={props.ttsConfig.speed?.toFixed(1)}
min="0.3"
max="4.0"
step="0.1"
onChange={(e) => {
props.updateConfig(
(config) =>
(config.speed = TTSConfigValidator.speed(
e.currentTarget.valueAsNumber,
)),
);
}}
></InputRange>
</ListItem>
</>
)}
</>
);
}

View File

@@ -0,0 +1,119 @@
@import "../styles/animation.scss";
.plugin-page {
height: 100%;
display: flex;
flex-direction: column;
.plugin-page-body {
padding: 20px;
overflow-y: auto;
.plugin-filter {
width: 100%;
max-width: 100%;
margin-bottom: 20px;
animation: slide-in ease 0.3s;
height: 40px;
display: flex;
.search-bar {
flex-grow: 1;
max-width: 100%;
min-width: 0;
outline: none;
}
.search-bar:focus {
border: 1px solid var(--primary);
}
.plugin-filter-lang {
height: 100%;
margin-left: 10px;
}
.plugin-create {
height: 100%;
margin-left: 10px;
box-sizing: border-box;
min-width: 80px;
}
}
.plugin-item {
display: flex;
justify-content: space-between;
padding: 20px;
border: var(--border-in-light);
animation: slide-in ease 0.3s;
&:not(:last-child) {
border-bottom: 0;
}
&:first-child {
border-top-left-radius: 10px;
border-top-right-radius: 10px;
}
&:last-child {
border-bottom-left-radius: 10px;
border-bottom-right-radius: 10px;
}
.plugin-header {
display: flex;
align-items: center;
.plugin-icon {
display: flex;
align-items: center;
justify-content: center;
margin-right: 10px;
}
.plugin-title {
.plugin-name {
font-size: 14px;
font-weight: bold;
}
.plugin-info {
font-size: 12px;
}
.plugin-runtime-warning {
font-size: 12px;
color: #f86c6c;
}
}
}
.plugin-actions {
display: flex;
flex-wrap: nowrap;
transition: all ease 0.3s;
justify-content: center;
align-items: center;
}
@media screen and (max-width: 600px) {
display: flex;
flex-direction: column;
padding-bottom: 10px;
border-radius: 10px;
margin-bottom: 20px;
box-shadow: var(--card-shadow);
&:not(:last-child) {
border-bottom: var(--border-in-light);
}
.plugin-actions {
width: 100%;
justify-content: space-between;
padding-top: 10px;
}
}
}
}
}

View File

@@ -62,14 +62,14 @@
}
}
&.vertical{
&.vertical {
flex-direction: column;
align-items: start;
.list-header{
.list-item-title{
.list-header {
.list-item-title {
margin-bottom: 5px;
}
.list-item-sub-title{
.list-item-sub-title {
margin-bottom: 2px;
}
}
@@ -252,6 +252,12 @@
position: relative;
max-width: fit-content;
&.left-align-option {
option {
text-align: left;
}
}
.select-with-icon-select {
height: 100%;
border: var(--border-in-light);
@@ -304,7 +310,7 @@
justify-content: center;
z-index: 999;
.selector-item-disabled{
.selector-item-disabled {
opacity: 0.6;
}
@@ -330,3 +336,4 @@
}
}
}

View File

@@ -23,6 +23,8 @@ import React, {
useRef,
} from "react";
import { IconButton } from "./button";
import { Avatar } from "./emoji";
import clsx from "clsx";
export function Popover(props: {
children: JSX.Element;
@@ -45,7 +47,7 @@ export function Popover(props: {
export function Card(props: { children: JSX.Element[]; className?: string }) {
return (
<div className={styles.card + " " + props.className}>{props.children}</div>
<div className={clsx(styles.card, props.className)}>{props.children}</div>
);
}
@@ -60,11 +62,13 @@ export function ListItem(props: {
}) {
return (
<div
className={
styles["list-item"] +
` ${props.vertical ? styles["vertical"] : ""} ` +
` ${props.className || ""}`
}
className={clsx(
styles["list-item"],
{
[styles["vertical"]]: props.vertical,
},
props.className,
)}
onClick={props.onClick}
>
<div className={styles["list-header"]}>
@@ -135,9 +139,9 @@ export function Modal(props: ModalProps) {
return (
<div
className={
styles["modal-container"] + ` ${isMax && styles["modal-container-max"]}`
}
className={clsx(styles["modal-container"], {
[styles["modal-container-max"]]: isMax,
})}
>
<div className={styles["modal-header"]}>
<div className={styles["modal-title"]}>{props.title}</div>
@@ -260,7 +264,7 @@ export function Input(props: InputProps) {
return (
<textarea
{...props}
className={`${styles["input"]} ${props.className}`}
className={clsx(styles["input"], props.className)}
></textarea>
);
}
@@ -292,13 +296,23 @@ export function PasswordInput(
export function Select(
props: React.DetailedHTMLProps<
React.SelectHTMLAttributes<HTMLSelectElement>,
React.SelectHTMLAttributes<HTMLSelectElement> & {
align?: "left" | "center";
},
HTMLSelectElement
>,
) {
const { className, children, ...otherProps } = props;
const { className, children, align, ...otherProps } = props;
return (
<div className={`${styles["select-with-icon"]} ${className}`}>
<div
className={clsx(
styles["select-with-icon"],
{
[styles["left-align-option"]]: align === "left",
},
className,
)}
>
<select className={styles["select-with-icon-select"]} {...otherProps}>
{children}
</select>
@@ -503,12 +517,13 @@ export function Selector<T>(props: {
const selected = selectedValues.includes(item.value);
return (
<ListItem
className={`${styles["selector-item"]} ${
item.disable && styles["selector-item-disabled"]
}`}
className={clsx(styles["selector-item"], {
[styles["selector-item-disabled"]]: item.disable,
})}
key={i}
title={item.title}
subTitle={item.subTitle}
icon={<Avatar model={item.value as string} />}
onClick={(e) => {
if (item.disable) {
e.stopPropagation();

View File

@@ -0,0 +1 @@
export * from "./voice-print";

View File

@@ -0,0 +1,11 @@
.voice-print {
width: 100%;
height: 60px;
margin: 20px 0;
canvas {
width: 100%;
height: 100%;
filter: brightness(1.2); // increase overall brightness
}
}

View File

@@ -0,0 +1,180 @@
import { useEffect, useRef, useCallback } from "react";
import styles from "./voice-print.module.scss";
interface VoicePrintProps {
frequencies?: Uint8Array;
isActive?: boolean;
}
export function VoicePrint({ frequencies, isActive }: VoicePrintProps) {
// Canvas ref used to obtain the drawing context
const canvasRef = useRef<HTMLCanvasElement>(null);
// Stores historical frequency data for smoothing
const historyRef = useRef<number[][]>([]);
// Number of history frames to keep; affects smoothness
const historyLengthRef = useRef(10);
// Stores the animation frame ID for cleanup
const animationFrameRef = useRef<number>();
/**
* Update the frequency history data,
* maintained as a fixed-length FIFO queue.
*/
const updateHistory = useCallback((freqArray: number[]) => {
historyRef.current.push(freqArray);
if (historyRef.current.length > historyLengthRef.current) {
historyRef.current.shift();
}
}, []);
useEffect(() => {
const canvas = canvasRef.current;
if (!canvas) return;
const ctx = canvas.getContext("2d");
if (!ctx) return;
/**
* Handle high-DPI displays:
* scale the canvas's actual render resolution by the device pixel ratio.
*/
const dpr = window.devicePixelRatio || 1;
canvas.width = canvas.offsetWidth * dpr;
canvas.height = canvas.offsetHeight * dpr;
ctx.scale(dpr, dpr);
/**
* Main draw function.
* Uses requestAnimationFrame for smooth animation.
* Steps:
* 1. Clear the canvas
* 2. Update the history data
* 3. Compute the waveform points
* 4. Draw the vertically symmetric voice print
*/
const draw = () => {
// Clear the canvas
ctx.clearRect(0, 0, canvas.width, canvas.height);
if (!frequencies || !isActive) {
historyRef.current = [];
return;
}
const freqArray = Array.from(frequencies);
updateHistory(freqArray);
// Draw the voice print
const points: [number, number][] = [];
const centerY = canvas.height / 2;
const width = canvas.width;
const sliceWidth = width / (frequencies.length - 1);
// Draw the main waveform
ctx.beginPath();
ctx.moveTo(0, centerY);
/**
* Voice-print drawing algorithm:
* 1. Average with historical data for smooth transitions
* 2. Add natural motion with a sine function
* 3. Connect points with Bézier curves for a smoother line
* 4. Draw the mirrored half to complete the voice print
*/
for (let i = 0; i < frequencies.length; i++) {
const x = i * sliceWidth;
let avgFrequency = frequencies[i];
/**
* Waveform smoothing:
* 1. Collect the frequency values at this index from the history
* 2. Average the current value with the historical values
* 3. Derive the rendered height from that average
*/
if (historyRef.current.length > 0) {
const historicalValues = historyRef.current.map((h) => h[i] || 0);
avgFrequency =
(avgFrequency + historicalValues.reduce((a, b) => a + b, 0)) /
(historyRef.current.length + 1);
}
/**
* Waveform transform:
* 1. Normalize the frequency value to the 0-1 range
* 2. Apply a time-dependent sine offset
* 3. Connect points smoothly with Bézier curves
*/
const normalized = avgFrequency / 255.0;
const height = normalized * (canvas.height / 2);
const y = centerY + height * Math.sin(i * 0.2 + Date.now() * 0.002);
points.push([x, y]);
if (i === 0) {
ctx.moveTo(x, y);
} else {
// Use a Bézier curve to smooth the waveform
const prevPoint = points[i - 1];
const midX = (prevPoint[0] + x) / 2;
ctx.quadraticCurveTo(
prevPoint[0],
prevPoint[1],
midX,
(prevPoint[1] + y) / 2,
);
}
}
// Draw the symmetric lower half
for (let i = points.length - 1; i >= 0; i--) {
const [x, y] = points[i];
const symmetricY = centerY - (y - centerY);
if (i === points.length - 1) {
ctx.lineTo(x, symmetricY);
} else {
const nextPoint = points[i + 1];
const midX = (nextPoint[0] + x) / 2;
ctx.quadraticCurveTo(
nextPoint[0],
centerY - (nextPoint[1] - centerY),
midX,
centerY - ((nextPoint[1] + y) / 2 - centerY),
);
}
}
ctx.closePath();
/**
* Gradient fill:
* apply a three-stop left-to-right gradient with alpha,
* using a blue palette for a better look.
*/
const gradient = ctx.createLinearGradient(0, 0, canvas.width, 0);
gradient.addColorStop(0, "rgba(100, 180, 255, 0.95)");
gradient.addColorStop(0.5, "rgba(140, 200, 255, 0.9)");
gradient.addColorStop(1, "rgba(180, 220, 255, 0.95)");
ctx.fillStyle = gradient;
ctx.fill();
animationFrameRef.current = requestAnimationFrame(draw);
};
// Start the animation loop
draw();
// Cleanup: cancel the animation when the component unmounts
return () => {
if (animationFrameRef.current) {
cancelAnimationFrame(animationFrameRef.current);
}
};
}, [frequencies, isActive, updateHistory]);
return (
<div className={styles["voice-print"]}>
<canvas ref={canvasRef} />
</div>
);
}
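The smoothing step above (history average plus a time-based sine wobble) can be sketched in isolation. This is a minimal sketch, not the component's actual code: `smoothBin` is a name we introduce, and the 255 normalization assumes Web Audio `Uint8Array` frequency data, as the component does.

```typescript
// Smooth one frequency bin: average the current value with the same bin
// across every history frame, then normalize to the 0..1 range
// (Web Audio frequency data is a Uint8Array, so values span 0..255).
function smoothBin(current: number, history: number[][], i: number): number {
  if (history.length === 0) return current / 255;
  const sum = history.reduce((acc, frame) => acc + (frame[i] ?? 0), 0);
  return (current + sum) / (history.length + 1) / 255;
}

const fullLevel = smoothBin(255, [], 0); // no history: stays at 1.0
const damped = smoothBin(255, [[0], [0]], 0); // two silent frames pull it to 1/3
```

The longer `historyLengthRef` is, the heavier the damping, which is why the waveform settles smoothly instead of jittering frame to frame.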


@@ -1,5 +1,6 @@
import md5 from "spark-md5";
import { DEFAULT_MODELS, DEFAULT_GA_ID } from "../constant";
import { isGPT4Model } from "../utils/model";
declare global {
namespace NodeJS {
@@ -22,6 +23,7 @@ declare global {
DISABLE_FAST_LINK?: string; // disallow parse settings from url or not
CUSTOM_MODELS?: string; // to control custom models
DEFAULT_MODEL?: string; // to control default model in every new chat window
VISION_MODELS?: string; // to control vision models
// stability only
STABILITY_URL?: string;
@@ -56,6 +58,7 @@ declare global {
// alibaba only
ALIBABA_URL?: string;
ALIBABA_API_KEY?: string;
ALIBABA_APP_ID?: string; // alibaba app id, used for some models
// tencent only
TENCENT_URL?: string;
@@ -71,8 +74,25 @@ declare global {
IFLYTEK_API_KEY?: string;
IFLYTEK_API_SECRET?: string;
DEEPSEEK_URL?: string;
DEEPSEEK_API_KEY?: string;
// xai only
XAI_URL?: string;
XAI_API_KEY?: string;
// chatglm only
CHATGLM_URL?: string;
CHATGLM_API_KEY?: string;
// siliconflow only
SILICONFLOW_URL?: string;
SILICONFLOW_API_KEY?: string;
// custom template for preprocessing user input
DEFAULT_INPUT_TEMPLATE?: string;
ENABLE_MCP?: string; // enable mcp functionality
}
}
}
@@ -116,19 +136,16 @@ export const getServerSideConfig = () => {
const disableGPT4 = !!process.env.DISABLE_GPT4;
let customModels = process.env.CUSTOM_MODELS ?? "";
let defaultModel = process.env.DEFAULT_MODEL ?? "";
let visionModels = process.env.VISION_MODELS ?? "";
if (disableGPT4) {
if (customModels) customModels += ",";
customModels += DEFAULT_MODELS.filter(
(m) => m.name.startsWith("gpt-4") && !m.name.startsWith("gpt-4o-mini"),
)
customModels += DEFAULT_MODELS.filter((m) => isGPT4Model(m.name))
.map((m) => "-" + m.name)
.join(",");
if (
defaultModel.startsWith("gpt-4") &&
!defaultModel.startsWith("gpt-4o-mini")
)
if (defaultModel && isGPT4Model(defaultModel)) {
defaultModel = "";
}
}
const isStability = !!process.env.STABILITY_API_KEY;
@@ -143,6 +160,10 @@ export const getServerSideConfig = () => {
const isAlibaba = !!process.env.ALIBABA_API_KEY;
const isMoonshot = !!process.env.MOONSHOT_API_KEY;
const isIflytek = !!process.env.IFLYTEK_API_KEY;
const isDeepSeek = !!process.env.DEEPSEEK_API_KEY;
const isXAI = !!process.env.XAI_API_KEY;
const isChatGLM = !!process.env.CHATGLM_API_KEY;
const isSiliconFlow = !!process.env.SILICONFLOW_API_KEY;
// const apiKeyEnvVar = process.env.OPENAI_API_KEY ?? "";
// const apiKeys = apiKeyEnvVar.split(",").map((v) => v.trim());
// const randomIndex = Math.floor(Math.random() * apiKeys.length);
@@ -151,8 +172,8 @@ export const getServerSideConfig = () => {
// `[Server Config] using ${randomIndex + 1} of ${apiKeys.length} api key`,
// );
const allowedWebDevEndpoints = (
process.env.WHITE_WEBDEV_ENDPOINTS ?? ""
const allowedWebDavEndpoints = (
process.env.WHITE_WEBDAV_ENDPOINTS ?? ""
).split(",");
return {
@@ -190,6 +211,7 @@ export const getServerSideConfig = () => {
isAlibaba,
alibabaUrl: process.env.ALIBABA_URL,
alibabaApiKey: getApiKey(process.env.ALIBABA_API_KEY),
alibabaAppId: process.env.ALIBABA_APP_ID,
isTencent,
tencentUrl: process.env.TENCENT_URL,
@@ -205,11 +227,27 @@ export const getServerSideConfig = () => {
iflytekApiKey: process.env.IFLYTEK_API_KEY,
iflytekApiSecret: process.env.IFLYTEK_API_SECRET,
isDeepSeek,
deepseekUrl: process.env.DEEPSEEK_URL,
deepseekApiKey: getApiKey(process.env.DEEPSEEK_API_KEY),
isXAI,
xaiUrl: process.env.XAI_URL,
xaiApiKey: getApiKey(process.env.XAI_API_KEY),
isChatGLM,
chatglmUrl: process.env.CHATGLM_URL,
chatglmApiKey: getApiKey(process.env.CHATGLM_API_KEY),
cloudflareAccountId: process.env.CLOUDFLARE_ACCOUNT_ID,
cloudflareKVNamespaceId: process.env.CLOUDFLARE_KV_NAMESPACE_ID,
cloudflareKVApiKey: getApiKey(process.env.CLOUDFLARE_KV_API_KEY),
cloudflareKVTTL: process.env.CLOUDFLARE_KV_TTL,
isSiliconFlow,
siliconFlowUrl: process.env.SILICONFLOW_URL,
siliconFlowApiKey: getApiKey(process.env.SILICONFLOW_API_KEY),
gtmId: process.env.GTM_ID,
gaId: process.env.GA_ID || DEFAULT_GA_ID,
@@ -226,6 +264,8 @@ export const getServerSideConfig = () => {
disableFastLink: !!process.env.DISABLE_FAST_LINK,
customModels,
defaultModel,
allowedWebDevEndpoints,
visionModels,
allowedWebDavEndpoints,
enableMcp: process.env.ENABLE_MCP === "true",
};
};
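The `disableGPT4` branch above hides models by appending `-name` entries to `CUSTOM_MODELS`, the exclusion syntax that list understands. A hedged sketch of that mechanism; the `isGPT4Model` predicate here is a stand-in for the real helper imported from `app/utils/model`:

```typescript
// Stand-in for isGPT4Model from app/utils/model: the pre-refactor check
// matched gpt-4* but deliberately spared gpt-4o-mini.
const isGPT4Model = (name: string) =>
  name.startsWith("gpt-4") && !name.startsWith("gpt-4o-mini");

// Append "-model" exclusion entries for every matching model name.
function excludeModels(customModels: string, names: string[]): string {
  const exclusions = names
    .filter(isGPT4Model)
    .map((n) => "-" + n)
    .join(",");
  return customModels ? `${customModels},${exclusions}` : exclusions;
}

const result = excludeModels("", ["gpt-4o", "gpt-4o-mini", "gpt-3.5-turbo"]);
// only "gpt-4o" is excluded: "-gpt-4o"
```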


@@ -1,4 +1,6 @@
import path from "path";
import { ALIBABA_BASE_URL, ALIBABA_PATH } from "./chebichatConstant";
export * from "./chebichatConstant";
export const OWNER = "ChatGPTNextWeb";
export const REPO = "ChatGPT-Next-Web";
@@ -13,7 +15,6 @@ export const RUNTIME_CONFIG_DOM = "danger-runtime-config";
export const STABILITY_BASE_URL = "https://api.stability.ai";
export const DEFAULT_API_HOST = "https://api.nextchat.dev";
export const OPENAI_BASE_URL = "https://api.openai.com";
export const ANTHROPIC_BASE_URL = "https://api.anthropic.com";
@@ -24,16 +25,25 @@ export const BAIDU_OATUH_URL = `${BAIDU_BASE_URL}/oauth/2.0/token`;
export const BYTEDANCE_BASE_URL = "https://ark.cn-beijing.volces.com";
export const ALIBABA_BASE_URL = "https://dashscope.aliyuncs.com/api/";
export const TENCENT_BASE_URL = "https://hunyuan.tencentcloudapi.com";
export const MOONSHOT_BASE_URL = "https://api.moonshot.cn";
export const IFLYTEK_BASE_URL = "https://spark-api-open.xf-yun.com";
export const DEEPSEEK_BASE_URL = "https://api.deepseek.com";
export const XAI_BASE_URL = "https://api.x.ai";
export const CHATGLM_BASE_URL = "https://open.bigmodel.cn";
export const SILICONFLOW_BASE_URL = "https://api.siliconflow.cn";
export const CACHE_URL_PREFIX = "/api/cache";
export const UPLOAD_URL = `${CACHE_URL_PREFIX}/upload`;
export const ALIBABA_APP_ID = "95072bcf71bf4469a25c45c31e76f37a"; // default alibaba app id, used for some models
export const MODEL_PROVIDER = "alibaba";
export enum Path {
Home = "/",
Chat = "/chat",
@@ -46,12 +56,15 @@ export enum Path {
SdNew = "/sd-new",
Artifacts = "/artifacts",
SearchChat = "/search-chat",
McpMarket = "/mcp-market",
Sync = "/sync",
}
export enum ApiPath {
Cors = "",
Supabase = "/api/supabase",
Azure = "/api/azure",
OpenAI = "/api/openai",
OpenAI = "/api/alibaba", // Use Alibaba path for OpenAI API
Anthropic = "/api/anthropic",
Google = "/api/google",
Baidu = "/api/baidu",
@@ -62,6 +75,10 @@ export enum ApiPath {
Iflytek = "/api/iflytek",
Stability = "/api/stability",
Artifacts = "/api/artifacts",
XAI = "/api/xai",
ChatGLM = "/api/chatglm",
DeepSeek = "/api/deepseek",
SiliconFlow = "/api/siliconflow",
}
export enum SlotID {
@@ -84,6 +101,7 @@ export enum StoreKey {
Update = "chat-update",
Sync = "sync",
SdList = "sd-list",
Mcp = "mcp-store",
}
export const DEFAULT_SIDEBAR_WIDTH = 300;
@@ -96,9 +114,8 @@ export const ACCESS_CODE_PREFIX = "nk-";
export const LAST_INPUT_KEY = "last-input";
export const UNFINISHED_INPUT = (id: string) => "unfinished-input-" + id;
export const STORAGE_KEY = "chatgpt-next-web";
export const REQUEST_TIMEOUT_MS = 60000;
export const REQUEST_TIMEOUT_MS_FOR_THINKING = REQUEST_TIMEOUT_MS * 5;
export const EXPORT_MESSAGE_CLASS_NAME = "export-markdown";
@@ -114,6 +131,10 @@ export enum ServiceProvider {
Moonshot = "Moonshot",
Stability = "Stability",
Iflytek = "Iflytek",
XAI = "XAI",
ChatGLM = "ChatGLM",
DeepSeek = "DeepSeek",
SiliconFlow = "SiliconFlow",
}
// Google API safety settings, see https://ai.google.dev/gemini-api/docs/safety-settings
@@ -136,6 +157,10 @@ export enum ModelProvider {
Hunyuan = "Hunyuan",
Moonshot = "Moonshot",
Iflytek = "Iflytek",
XAI = "XAI",
ChatGLM = "ChatGLM",
DeepSeek = "DeepSeek",
SiliconFlow = "SiliconFlow",
}
export const Stability = {
@@ -151,7 +176,9 @@ export const Anthropic = {
};
export const OpenaiPath = {
ChatPath: "v1/chat/completions",
// ChatPath: "v1/chat/completions",
ChatPath: ALIBABA_PATH,
SpeechPath: "v1/audio/speech",
ImagePath: "v1/images/generations",
UsagePath: "dashboard/billing/usage",
SubsPath: "dashboard/billing/subscription",
@@ -200,7 +227,22 @@ export const ByteDance = {
export const Alibaba = {
ExampleEndpoint: ALIBABA_BASE_URL,
ChatPath: "v1/services/aigc/text-generation/generation",
ChatPath: (modelName: string) => {
// Dedicated path for the Alibaba App ID
// const URL = `api/v1/apps/${ALIBABA_APP_ID}/completion`;
console.log("[Alibaba] modelName", modelName);
// https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions
const URL = ALIBABA_PATH;
// if (modelName.includes("vl") || modelName.includes("omni")) {
// return "v1/services/aigc/multimodal-generation/generation";
// }
// return `v1/services/aigc/text-generation/generation`;
return URL;
},
};
export const Tencent = {
@@ -217,6 +259,29 @@ export const Iflytek = {
ChatPath: "v1/chat/completions",
};
export const DeepSeek = {
ExampleEndpoint: DEEPSEEK_BASE_URL,
ChatPath: "chat/completions",
};
export const XAI = {
ExampleEndpoint: XAI_BASE_URL,
ChatPath: "v1/chat/completions",
};
export const ChatGLM = {
ExampleEndpoint: CHATGLM_BASE_URL,
ChatPath: "api/paas/v4/chat/completions",
ImagePath: "api/paas/v4/images/generations",
VideoPath: "api/paas/v4/videos/generations",
};
export const SiliconFlow = {
ExampleEndpoint: SILICONFLOW_BASE_URL,
ChatPath: "v1/chat/completions",
ListModelPath: "v1/models?&sub_type=chat",
};
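Each provider block above pairs an absolute `ExampleEndpoint` with a relative `ChatPath`; building the request URL is a simple join. A sketch using the SiliconFlow values from this file (the `chatUrl` helper is ours, not part of the codebase):

```typescript
const SILICONFLOW_BASE_URL = "https://api.siliconflow.cn";
const SiliconFlow = {
  ExampleEndpoint: SILICONFLOW_BASE_URL,
  ChatPath: "v1/chat/completions",
};

// Join base and path, tolerating a trailing slash on the base URL.
function chatUrl(base: string, path: string): string {
  return `${base.replace(/\/+$/, "")}/${path}`;
}

const url = chatUrl(SiliconFlow.ExampleEndpoint, SiliconFlow.ChatPath);
// "https://api.siliconflow.cn/v1/chat/completions"
```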
export const DEFAULT_INPUT_TEMPLATE = `{{input}}`; // input / time / model / lang
// export const DEFAULT_SYSTEM_TEMPLATE = `
// You are ChatGPT, a large language model trained by {{ServiceProvider}}.
@@ -235,27 +300,209 @@ Latex inline: \\(x^2\\)
Latex block: $$e=mc^2$$
`;
export const MCP_TOOLS_TEMPLATE = `
[clientId]
{{ clientId }}
[tools]
{{ tools }}
`;
export const MCP_SYSTEM_TEMPLATE = `
You are an AI assistant with access to system tools. Your role is to help users by combining natural language understanding with tool operations when needed.
1. AVAILABLE TOOLS:
{{ MCP_TOOLS }}
2. WHEN TO USE TOOLS:
- ALWAYS USE TOOLS when they can help answer user questions
- DO NOT just describe what you could do - TAKE ACTION immediately
- If you're not sure whether to use a tool, USE IT
- Common triggers for tool use:
* Questions about files or directories
* Requests to check, list, or manipulate system resources
* Any query that can be answered with available tools
3. HOW TO USE TOOLS:
A. Tool Call Format:
- Use markdown code blocks with format: \`\`\`json:mcp:{clientId}\`\`\`
- Always include:
* method: "tools/call" (only this method is supported)
* params:
- name: must match an available primitive name
- arguments: required parameters for the primitive
B. Response Format:
- Tool responses will come as user messages
- Format: \`\`\`json:mcp-response:{clientId}\`\`\`
- Wait for response before making another tool call
C. Important Rules:
- Only use tools/call method
- Only ONE tool call per message
- ALWAYS TAKE ACTION instead of just describing what you could do
- Include the correct clientId in code block language tag
- Verify arguments match the primitive's requirements
4. INTERACTION FLOW:
A. When user makes a request:
- IMMEDIATELY use appropriate tool if available
- DO NOT ask if user wants you to use the tool
- DO NOT just describe what you could do
B. After receiving tool response:
- Explain results clearly
- Take next appropriate action if needed
C. If tools fail:
- Explain the error
- Try alternative approach immediately
5. EXAMPLE INTERACTION:
good example:
\`\`\`json:mcp:filesystem
{
"method": "tools/call",
"params": {
"name": "list_allowed_directories",
"arguments": {}
}
}
\`\`\`"
\`\`\`json:mcp-response:filesystem
{
"method": "tools/call",
"params": {
"name": "write_file",
"arguments": {
"path": "/Users/river/dev/nextchat/test/joke.txt",
"content": "为什么数学书总是感到忧伤?因为它有太多的问题。"
}
}
}
\`\`\`
The following is a WRONG mcp json example:
\`\`\`json:mcp:filesystem
{
"method": "write_file",
"params": {
"path": "NextChat_Information.txt",
"content": "1"
}
}
\`\`\`
This is wrong because the method is not tools/call.
\`\`\`{
"method": "search_repositories",
"params": {
"query": "2oeee"
}
}
\`\`\`
This is wrong because the method is not tools/call!
the right format is:
\`\`\`json:mcp:filesystem
{
"method": "tools/call",
"params": {
"name": "search_repositories",
"arguments": {
"query": "2oeee"
}
}
}
\`\`\`
Please follow the format strictly and ONLY use the tools/call method!
`;
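The MCP_SYSTEM_TEMPLATE above makes the model emit tool calls as fenced blocks whose info string is `json:mcp:{clientId}`. A hedged sketch of how such a block could be extracted on the client side; the regex and `parseMcpBlock` name are ours, not the project's actual parser:

```typescript
// A ```json:mcp:<clientId>``` fenced block carries one tool call; pull the
// clientId out of the fence info string and parse the JSON body.
const MCP_BLOCK = /```json:mcp:([\w-]+)\n([\s\S]*?)```/;

function parseMcpBlock(
  text: string,
): { clientId: string; payload: any } | null {
  const m = text.match(MCP_BLOCK);
  return m ? { clientId: m[1], payload: JSON.parse(m[2]) } : null;
}

const reply =
  "Sure.\n```json:mcp:filesystem\n" +
  '{"method":"tools/call","params":{"name":"list_allowed_directories","arguments":{}}}\n' +
  "```";
const call = parseMcpBlock(reply);
```

The template's "one tool call per message" rule means a single non-global match like this is sufficient.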
export const SUMMARIZE_MODEL = "gpt-4o-mini";
export const GEMINI_SUMMARIZE_MODEL = "gemini-pro";
export const DEEPSEEK_SUMMARIZE_MODEL = "deepseek-chat";
export const KnowledgeCutOffDate: Record<string, string> = {
default: "2021-09",
"gpt-4-turbo": "2023-12",
"gpt-4-turbo-2024-04-09": "2023-12",
"gpt-4-turbo-preview": "2023-12",
"gpt-4.1": "2024-06",
"gpt-4.1-2025-04-14": "2024-06",
"gpt-4.1-mini": "2024-06",
"gpt-4.1-mini-2025-04-14": "2024-06",
"gpt-4.1-nano": "2024-06",
"gpt-4.1-nano-2025-04-14": "2024-06",
"gpt-4.5-preview": "2023-10",
"gpt-4.5-preview-2025-02-27": "2023-10",
"gpt-4o": "2023-10",
"gpt-4o-2024-05-13": "2023-10",
"gpt-4o-2024-08-06": "2023-10",
"gpt-4o-2024-11-20": "2023-10",
"chatgpt-4o-latest": "2023-10",
"gpt-4o-mini": "2023-10",
"gpt-4o-mini-2024-07-18": "2023-10",
"gpt-4-vision-preview": "2023-04",
"o1-mini-2024-09-12": "2023-10",
"o1-mini": "2023-10",
"o1-preview-2024-09-12": "2023-10",
"o1-preview": "2023-10",
"o1-2024-12-17": "2023-10",
o1: "2023-10",
"o3-mini-2025-01-31": "2023-10",
"o3-mini": "2023-10",
// After improvements, it's now easier to add "KnowledgeCutOffDate" entries
// instead of hardcoding them, as was done previously.
"gemini-pro": "2023-12",
"gemini-pro-vision": "2023-12",
"deepseek-chat": "2024-07",
"deepseek-coder": "2024-07",
};
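The `KnowledgeCutOffDate` record above is keyed by exact model name with a `default` fallback, so the lookup is a single nullish coalesce. A sketch with a trimmed copy of the map (the `cutoff` helper name is ours):

```typescript
const KnowledgeCutOffDate: Record<string, string> = {
  default: "2021-09",
  "gpt-4o": "2023-10",
  "deepseek-chat": "2024-07",
};

// Fall back to the "default" entry for any model not listed explicitly.
const cutoff = (model: string) =>
  KnowledgeCutOffDate[model] ?? KnowledgeCutOffDate.default;

const known = cutoff("gpt-4o"); // "2023-10"
const unknown = cutoff("some-future-model"); // "2021-09"
```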
export const DEFAULT_TTS_ENGINE = "OpenAI-TTS";
export const DEFAULT_TTS_ENGINES = ["OpenAI-TTS", "Edge-TTS"];
export const DEFAULT_TTS_MODEL = "tts-1";
export const DEFAULT_TTS_VOICE = "alloy";
export const DEFAULT_TTS_MODELS = ["tts-1", "tts-1-hd"];
export const DEFAULT_TTS_VOICES = [
"alloy",
"echo",
"fable",
"onyx",
"nova",
"shimmer",
];
export const VISION_MODEL_REGEXES = [
/vision/,
/gpt-4o/,
/gpt-4\.1/,
/claude-3/,
/gemini-1\.5/,
/gemini-exp/,
/gemini-2\.0/,
/learnlm/,
/qwen-vl/,
/qwen2-vl/,
/gpt-4-turbo(?!.*preview)/, // Matches "gpt-4-turbo" but not "gpt-4-turbo-preview"
/^dall-e-3$/, // Matches exactly "dall-e-3"
/glm-4v/,
/vl/i,
/o3/,
/o4-mini/,
];
export const EXCLUDE_VISION_MODEL_REGEXES = [/claude-3-5-haiku-20241022/];
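The two lists above combine into a single predicate: a model has vision support if any include pattern matches and no exclude pattern does. A sketch with a trimmed copy of the patterns (the `isVisionModel` name mirrors the helper this config feeds, but this is not the app's actual implementation):

```typescript
const VISION_MODEL_REGEXES = [/vision/, /gpt-4o/, /qwen-vl/, /claude-3/];
const EXCLUDE_VISION_MODEL_REGEXES = [/claude-3-5-haiku-20241022/];

// Vision if some include regex matches and no exclude regex does.
function isVisionModel(model: string): boolean {
  return (
    VISION_MODEL_REGEXES.some((r) => r.test(model)) &&
    !EXCLUDE_VISION_MODEL_REGEXES.some((r) => r.test(model))
  );
}

const yes = isVisionModel("qwen-vl-plus");
const no = isVisionModel("claude-3-5-haiku-20241022"); // explicitly excluded
```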
const openaiModels = [
// As of July 2024, gpt-4o-mini should be used in place of gpt-3.5-turbo,
// as it is cheaper, more capable, multimodal, and just as fast. gpt-3.5-turbo is still available for use in the API.
"gpt-3.5-turbo",
"gpt-3.5-turbo-1106",
"gpt-3.5-turbo-0125",
@@ -265,22 +512,52 @@ const openaiModels = [
"gpt-4-32k-0613",
"gpt-4-turbo",
"gpt-4-turbo-preview",
"gpt-4.1",
"gpt-4.1-2025-04-14",
"gpt-4.1-mini",
"gpt-4.1-mini-2025-04-14",
"gpt-4.1-nano",
"gpt-4.1-nano-2025-04-14",
"gpt-4.5-preview",
"gpt-4.5-preview-2025-02-27",
"gpt-4o",
"gpt-4o-2024-05-13",
"gpt-4o-2024-08-06",
"gpt-4o-2024-11-20",
"chatgpt-4o-latest",
"gpt-4o-mini",
"gpt-4o-mini-2024-07-18",
"gpt-4-vision-preview",
"gpt-4-turbo-2024-04-09",
"gpt-4-1106-preview",
"dall-e-3",
"o1-mini",
"o1-preview",
"o3-mini",
"o3",
"o4-mini",
];
const googleModels = [
"gemini-1.0-pro",
"gemini-1.5-pro-latest",
"gemini-1.5-pro",
"gemini-1.5-pro-002",
"gemini-1.5-flash-latest",
"gemini-pro-vision",
"gemini-1.5-flash-8b-latest",
"gemini-1.5-flash",
"gemini-1.5-flash-8b",
"gemini-1.5-flash-002",
"learnlm-1.5-pro-experimental",
"gemini-exp-1206",
"gemini-2.0-flash",
"gemini-2.0-flash-exp",
"gemini-2.0-flash-lite-preview-02-05",
"gemini-2.0-flash-thinking-exp",
"gemini-2.0-flash-thinking-exp-1219",
"gemini-2.0-flash-thinking-exp-01-21",
"gemini-2.0-pro-exp",
"gemini-2.0-pro-exp-02-05",
"gemini-2.5-pro-preview-06-05",
];
const anthropicModels = [
@@ -289,8 +566,15 @@ const anthropicModels = [
"claude-2.1",
"claude-3-sonnet-20240229",
"claude-3-opus-20240229",
"claude-3-opus-latest",
"claude-3-haiku-20240307",
"claude-3-5-haiku-20241022",
"claude-3-5-haiku-latest",
"claude-3-5-sonnet-20240620",
"claude-3-5-sonnet-20241022",
"claude-3-5-sonnet-latest",
"claude-3-7-sonnet-20250219",
"claude-3-7-sonnet-latest",
];
const baiduModels = [
@@ -318,12 +602,15 @@ const bytedanceModels = [
const alibabaModes = [
"qwen-turbo",
"qwen-plus",
// "qwen-plus",
"qwen-max",
"qwen-max-0428",
"qwen-max-0403",
"qwen-max-0107",
"qwen-max-longcontext",
// "qwen-max-0428",
// "qwen-max-0403",
// "qwen-max-0107",
// "qwen-max-longcontext",
// "qwen-omni-turbo",
"qwen-vl-plus",
// "qwen-vl-max",
];
const tencentModels = [
@@ -346,22 +633,85 @@ const iflytekModels = [
"4.0Ultra",
];
const deepseekModels = ["deepseek-chat", "deepseek-coder", "deepseek-reasoner"];
const xAIModes = [
"grok-beta",
"grok-2",
"grok-2-1212",
"grok-2-latest",
"grok-vision-beta",
"grok-2-vision-1212",
"grok-2-vision",
"grok-2-vision-latest",
"grok-3-mini-fast-beta",
"grok-3-mini-fast",
"grok-3-mini-fast-latest",
"grok-3-mini-beta",
"grok-3-mini",
"grok-3-mini-latest",
"grok-3-fast-beta",
"grok-3-fast",
"grok-3-fast-latest",
"grok-3-beta",
"grok-3",
"grok-3-latest",
];
const chatglmModels = [
"glm-4-plus",
"glm-4-0520",
"glm-4",
"glm-4-air",
"glm-4-airx",
"glm-4-long",
"glm-4-flashx",
"glm-4-flash",
"glm-4v-plus",
"glm-4v",
"glm-4v-flash", // free
"cogview-3-plus",
"cogview-3",
"cogview-3-flash", // free
// polling tasks are not supported yet
// "cogvideox",
// "cogvideox-flash", // free
];
const siliconflowModels = [
"Qwen/Qwen2.5-7B-Instruct",
"Qwen/Qwen2.5-72B-Instruct",
"deepseek-ai/DeepSeek-R1",
"deepseek-ai/DeepSeek-R1-Distill-Llama-70B",
"deepseek-ai/DeepSeek-R1-Distill-Llama-8B",
"deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B",
"deepseek-ai/DeepSeek-R1-Distill-Qwen-14B",
"deepseek-ai/DeepSeek-R1-Distill-Qwen-32B",
"deepseek-ai/DeepSeek-R1-Distill-Qwen-7B",
"deepseek-ai/DeepSeek-V3",
"meta-llama/Llama-3.3-70B-Instruct",
"THUDM/glm-4-9b-chat",
"Pro/deepseek-ai/DeepSeek-R1",
"Pro/deepseek-ai/DeepSeek-V3",
];
let seq = 1000; // built-in model sequence generator starts from 1000
export const DEFAULT_MODELS = [
...openaiModels.map((name) => ({
...alibabaModes.map((name) => ({
name,
available: true,
sorted: seq++, // Global sequence sort(index)
available: true, // available by default
sorted: seq++,
provider: {
id: "openai",
providerName: "OpenAI",
providerType: "openai",
sorted: 1, // fixed here to keep ordering consistent with earlier built-in versions
id: "alibaba",
providerName: "Alibaba",
providerType: "alibaba",
sorted: 1,
},
})),
...openaiModels.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "azure",
@@ -372,7 +722,7 @@ export const DEFAULT_MODELS = [
})),
...googleModels.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "google",
@@ -383,7 +733,7 @@ export const DEFAULT_MODELS = [
})),
...anthropicModels.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "anthropic",
@@ -394,7 +744,7 @@ export const DEFAULT_MODELS = [
})),
...baiduModels.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "baidu",
@@ -405,7 +755,7 @@ export const DEFAULT_MODELS = [
})),
...bytedanceModels.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "bytedance",
@@ -414,20 +764,22 @@ export const DEFAULT_MODELS = [
sorted: 6,
},
})),
...alibabaModes.map((name) => ({
...openaiModels.map((name) => ({
name,
available: true,
sorted: seq++,
available: false,
sorted: seq++, // Global sequence sort(index)
provider: {
id: "alibaba",
providerName: "Alibaba",
providerType: "alibaba",
sorted: 7,
id: "openai",
providerName: "OpenAI",
providerType: "openai",
sorted: 7, // fixed here to keep ordering consistent with earlier built-in versions
},
})),
...tencentModels.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "tencent",
@@ -438,7 +790,7 @@ export const DEFAULT_MODELS = [
})),
...moonshotModes.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "moonshot",
@@ -449,7 +801,7 @@ export const DEFAULT_MODELS = [
})),
...iflytekModels.map((name) => ({
name,
available: true,
available: false,
sorted: seq++,
provider: {
id: "iflytek",
@@ -458,6 +810,50 @@ export const DEFAULT_MODELS = [
sorted: 10,
},
})),
...xAIModes.map((name) => ({
name,
available: false,
sorted: seq++,
provider: {
id: "xai",
providerName: "XAI",
providerType: "xai",
sorted: 11,
},
})),
...chatglmModels.map((name) => ({
name,
available: false,
sorted: seq++,
provider: {
id: "chatglm",
providerName: "ChatGLM",
providerType: "chatglm",
sorted: 12,
},
})),
...deepseekModels.map((name) => ({
name,
available: false,
sorted: seq++,
provider: {
id: "deepseek",
providerName: "DeepSeek",
providerType: "deepseek",
sorted: 13,
},
})),
...siliconflowModels.map((name) => ({
name,
available: false,
sorted: seq++,
provider: {
id: "siliconflow",
providerName: "SiliconFlow",
providerType: "siliconflow",
sorted: 14,
},
})),
] as const;
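With this change only the Alibaba entries ship as `available: true`; every other provider's models are present but disabled. Selecting the usable models from a list shaped like `DEFAULT_MODELS` is a filter plus a sort on the global `sorted` sequence. A sketch with a two-entry stand-in:

```typescript
type ModelEntry = {
  name: string;
  available: boolean;
  sorted: number;
  provider: {
    id: string;
    providerName: string;
    providerType: string;
    sorted: number;
  };
};

const models: ModelEntry[] = [
  {
    name: "qwen-max",
    available: true,
    sorted: 1000,
    provider: { id: "alibaba", providerName: "Alibaba", providerType: "alibaba", sorted: 1 },
  },
  {
    name: "gpt-4o",
    available: false,
    sorted: 1001,
    provider: { id: "openai", providerName: "OpenAI", providerType: "openai", sorted: 7 },
  },
];

// Keep only enabled models, ordered by their global sequence number.
const usable = models
  .filter((m) => m.available)
  .sort((a, b) => a.sorted - b.sorted)
  .map((m) => m.name);
```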
export const CHAT_PAGE_SIZE = 15;
@@ -477,8 +873,6 @@ export const internalAllowedWebDavEndpoints = [
];
export const DEFAULT_GA_ID = "G-89WN60ZK2E";
export const PLUGINS = [
{ name: "Plugins", path: Path.Plugins },
{ name: "Stable Diffusion", path: Path.Sd },
{ name: "Search Chat", path: Path.SearchChat },
];
export const SAAS_CHAT_URL = "https://nextchat.club";
export const SAAS_CHAT_UTM_URL = "https://nextchat.club?utm=github";

app/global.d.ts vendored

@@ -26,6 +26,13 @@ declare interface Window {
isPermissionGranted(): Promise<boolean>;
sendNotification(options: string | Options): void;
};
updater: {
checkUpdate(): Promise<UpdateResult>;
installUpdate(): Promise<void>;
onUpdaterEvent(
handler: (status: UpdateStatusResult) => void,
): Promise<UnlistenFn>;
};
http: {
fetch<T>(
url: string,

app/icons/arrow.svg Normal file

@@ -0,0 +1 @@
<svg class="icon--SJP_d" width="16" height="16" fill="none" viewBox="0 0 16 16" style="min-width: 16px; min-height: 16px;"><g><path data-follow-fill="currentColor" fill-rule="evenodd" clip-rule="evenodd" d="M5.248 14.444a.625.625 0 0 1-.005-.884l5.068-5.12a.625.625 0 0 0 0-.88L5.243 2.44a.625.625 0 1 1 .889-.88l5.067 5.121c.723.73.723 1.907 0 2.638l-5.067 5.12a.625.625 0 0 1-.884.005Z" fill="currentColor"></path></g></svg>


app/icons/chebichat-big.svg Normal file

File diff suppressed because one or more lines are too long


app/icons/chebichat.svg Normal file

@@ -0,0 +1,151 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="190px" height="182px" viewBox="0 0 95 91" version="1.1">
<g id="surface1">
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(16.078432%,14.901961%,14.509805%);fill-opacity:1;" d="M 85.03125 11.8125 C 87.40625 14.289062 87.675781 16.304688 87.640625 19.664062 C 87.5 21 87.5 21 86.65625 22.5625 C 85.878906 24.265625 86.046875 24.554688 86.65625 26.25 C 86.882812 26.820312 86.882812 26.820312 87.113281 27.40625 C 87.605469 28.800781 87.613281 30.035156 87.5 31.5 C 87.792969 31.609375 88.082031 31.714844 88.382812 31.828125 C 90.964844 33.386719 92.496094 36.917969 93.390625 39.71875 C 93.910156 43.003906 93.054688 46.371094 91.257812 49.148438 C 90.882812 49.570312 90.882812 49.570312 90.5 50 C 90.171875 50 89.839844 50 89.5 50 C 89.398438 50.421875 89.292969 50.84375 89.1875 51.28125 C 88.167969 53.828125 86.453125 55.578125 84.5 57.46875 C 83.59375 58.402344 82.925781 59.3125 82.1875 60.375 C 77.097656 67.183594 68.765625 69.902344 61 72.5 C 61.363281 72.578125 61.722656 72.660156 62.097656 72.742188 C 62.582031 72.855469 63.0625 72.972656 63.5625 73.09375 C 64.039062 73.203125 64.515625 73.3125 65.003906 73.429688 C 68.210938 74.65625 69.839844 76.71875 71.28125 79.75 C 71.425781 80.050781 71.570312 80.351562 71.71875 80.664062 C 75.5 88.699219 75.5 88.699219 75.5 91 C 62.960938 91 50.421875 91 37.5 91 C 37.5 90.339844 37.5 89.679688 37.5 89 C 37.179688 89.324219 36.859375 89.652344 36.53125 89.988281 C 34.457031 91.359375 32.917969 91.257812 30.492188 91.195312 C 29.847656 91.191406 29.847656 91.191406 29.1875 91.1875 C 27.824219 91.175781 26.460938 91.152344 25.09375 91.125 C 24.164062 91.113281 23.238281 91.105469 22.308594 91.097656 C 20.039062 91.074219 17.769531 91.042969 15.5 91 C 15.5 90.503906 15.5 90.011719 15.5 89.5 C 15.089844 89.425781 14.675781 89.351562 14.253906 89.277344 C 10.378906 88.5 7.164062 87.425781 4 85 C 3.273438 83.546875 3.3125 82.613281 3.5 81 C 4.078125 80.207031 4.078125 80.207031 4.78125 79.4375 C 6.125 78.195312 6.125 78.195312 5.925781 77.066406 C 5.222656 75.300781 4.132812 73.796875 3.089844 72.21875 
C 2.5 71 2.5 71 2.660156 69.84375 C 3 69 3 69 3.84375 68.621094 C 5 68.5 5 68.5 6.203125 69.347656 C 6.621094 69.75 7.039062 70.148438 7.46875 70.5625 C 7.90625 70.976562 8.34375 71.390625 8.792969 71.816406 C 9.191406 72.207031 9.589844 72.597656 10 73 C 10.425781 73.414062 10.425781 73.414062 10.863281 73.839844 C 11.414062 74.386719 11.960938 74.941406 12.5 75.5 C 12.664062 75.664062 12.828125 75.828125 13 76 C 13.277344 75.515625 13.558594 75.03125 13.84375 74.53125 C 15 73 15 73 16.375 72.5625 C 16.746094 72.542969 17.117188 72.519531 17.5 72.5 C 17.867188 70.53125 18.0625 68.664062 18.066406 66.664062 C 18.066406 66.128906 18.066406 65.59375 18.070312 65.046875 C 18.066406 64.484375 18.066406 63.921875 18.0625 63.34375 C 18.0625 62.761719 18.0625 62.179688 18.058594 61.582031 C 18.039062 57.703125 17.882812 53.859375 17.609375 49.988281 C 17.46875 47.441406 17.472656 44.902344 17.519531 42.351562 C 17.542969 40.335938 17.351562 38.484375 17 36.5 C 16.957031 35.574219 16.933594 34.644531 16.9375 33.71875 C 16.9375 33.265625 16.933594 32.808594 16.933594 32.34375 C 17 31.015625 17.207031 29.792969 17.5 28.5 C 17.179688 28.355469 16.859375 28.210938 16.53125 28.0625 C 15.5 27.5 15.5 27.5 15 26.5 C 14.574219 23.566406 14.953125 21.472656 16.65625 19.054688 C 19.1875 15.890625 21.832031 13.019531 25 10.5 C 25.371094 10.097656 25.742188 9.695312 26.125 9.28125 C 40.53125 -4.144531 71.25 -1.40625 85.03125 11.8125 Z M 28.5 67.5 C 28.355469 72.28125 28.769531 76.785156 29.5 81.5 C 31.386719 80.445312 32.914062 79.25 34.5 77.78125 C 39.320312 73.328125 39.320312 73.328125 42.15625 72.875 C 42.597656 72.800781 43.039062 72.726562 43.492188 72.648438 C 43.992188 72.574219 43.992188 72.574219 44.5 72.5 C 44.5 72.335938 44.5 72.171875 44.5 72 C 43.914062 71.878906 43.328125 71.757812 42.726562 71.632812 C 38.34375 70.707031 34.164062 69.675781 30 68 C 29.503906 67.835938 29.011719 67.671875 28.5 67.5 Z M 17.5 76.5 C 18 77.5 18 77.5 18 77.5 Z M 17.5 76.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(96.862745%,83.529413%,71.372551%);fill-opacity:1;" d="M 54.5 25.5 C 55.613281 26.136719 56.730469 26.769531 57.84375 27.40625 C 58.15625 27.585938 58.46875 27.761719 58.792969 27.945312 C 59.695312 28.460938 60.597656 28.980469 61.5 29.5 C 61.988281 29.78125 62.476562 30.0625 62.976562 30.351562 C 63.316406 30.5625 63.652344 30.777344 64 31 C 64 31.164062 64 31.328125 64 31.5 C 61.773438 31.746094 61.773438 31.746094 59.5 32 C 60.488281 32.027344 60.488281 32.027344 61.410156 31.859375 C 66.414062 31.382812 70.996094 34.277344 74.996094 37.03125 C 75.328125 37.351562 75.660156 37.671875 76 38 C 75.90625 39.65625 75.90625 39.65625 75.707031 40.703125 C 74.855469 46.046875 75.101562 51.628906 78.320312 56.148438 C 78.542969 56.429688 78.769531 56.710938 79 57 C 76.835938 59.59375 74.617188 61.863281 72 64 C 77.597656 60.613281 82.574219 55.566406 86 50 C 86.5 51.5 86.5 51.5 86.160156 52.472656 C 82.765625 58.660156 76.664062 63.988281 70.265625 66.90625 C 68.867188 67.5625 67.554688 68.320312 66.21875 69.09375 C 63.820312 70.421875 61.699219 71.058594 59 71.5 C 59.050781 72.035156 59.101562 72.574219 59.15625 73.125 C 59 75 59 75 57.90625 76.15625 C 55.300781 77.71875 53.078125 78.46875 50 78 C 48.25 77.25 48.25 77.25 47 76 C 46.835938 74.648438 46.945312 73.367188 47 72 C 46.667969 71.9375 46.335938 71.875 45.992188 71.808594 C 29.191406 68.597656 29.191406 68.597656 25.5 64 C 24.359375 61.851562 24.320312 59.695312 24.25 57.3125 C 24.167969 54.535156 23.839844 52.179688 23 49.5 C 22.28125 43.140625 26.898438 37.988281 30.464844 33.207031 C 31.640625 31.835938 32.910156 30.734375 34.3125 29.59375 C 34.792969 29.199219 35.273438 28.804688 35.769531 28.398438 C 37 27.5 37 27.5 38 27.5 C 38.21875 28.125 38.21875 28.125 38.441406 28.765625 C 40.390625 34.117188 42.484375 37.976562 47 41.5 C 47.246094 40.757812 47.246094 40.757812 47.5 40 C 47.765625 40.246094 48.03125 40.492188 48.304688 40.75 C 48.835938 41.230469 
48.835938 41.230469 49.375 41.71875 C 49.722656 42.039062 50.070312 42.355469 50.429688 42.6875 C 51.492188 43.496094 52.195312 43.792969 53.5 44 C 54.078125 40.179688 54.164062 36.359375 54.28125 32.5 C 54.304688 31.820312 54.324219 31.140625 54.347656 30.460938 C 54.398438 28.804688 54.449219 27.152344 54.5 25.5 Z M 54.5 25.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(12.156863%,2.352941%,5.882353%);fill-opacity:1;" d="M 85.03125 11.8125 C 87.40625 14.289062 87.675781 16.304688 87.640625 19.664062 C 87.5 21 87.5 21 86.65625 22.5625 C 85.878906 24.265625 86.046875 24.554688 86.65625 26.25 C 86.882812 26.820312 86.882812 26.820312 87.113281 27.40625 C 87.605469 28.800781 87.613281 30.035156 87.5 31.5 C 87.792969 31.609375 88.082031 31.714844 88.382812 31.828125 C 90.964844 33.386719 92.496094 36.917969 93.390625 39.71875 C 93.851562 42.625 93.1875 45.335938 92 48 C 90.90625 49.3125 90.90625 49.3125 90 50 C 90 48.5 90 48.5 90.5 47.5 C 89.839844 47.5 89.179688 47.5 88.5 47.5 C 88.335938 47.003906 88.171875 46.511719 88 46 C 87.339844 46.164062 86.679688 46.328125 86 46.5 C 85.835938 45.179688 85.671875 43.859375 85.5 42.5 C 85.335938 43.324219 85.171875 44.148438 85 45 C 84.503906 45 84.011719 45 83.5 45 C 83.5 43.515625 83.5 42.03125 83.5 40.5 C 83.171875 41.984375 82.839844 43.46875 82.5 45 C 81.34375 45 80.191406 45 79 45 C 79.171875 40.464844 79.878906 36.582031 82.71875 32.9375 C 84.183594 31.863281 84.746094 31.914062 86.5 32 C 85.59375 26.625 85.59375 26.625 83 22 C 79.148438 19.835938 74.722656 18.71875 70.5 17.5 C 70.828125 17.335938 71.160156 17.171875 71.5 17 C 72.707031 13.792969 72.613281 10.371094 72.5 7 C 71.511719 6.503906 70.519531 6.011719 69.5 5.5 C 69.5 5.996094 69.5 6.488281 69.5 7 C 69.054688 6.894531 68.609375 6.789062 68.152344 6.679688 C 63.640625 5.679688 59.382812 5.261719 54.769531 5.304688 C 53.71875 5.3125 52.667969 5.304688 51.613281 5.292969 C 46.214844 5.285156 41.015625 5.859375 36 8 C 36.488281 7.828125 36.488281 7.828125 36.988281 7.652344 C 39.507812 6.824219 41.363281 6.238281 44 7 C 44 6.671875 44 6.339844 44 6 C 51.941406 4.1875 61.78125 4.925781 69.5 7.5 C 69.617188 10.289062 69.289062 12.816406 68.5 15.5 C 68 16 68 16 67.058594 16.050781 C 66.648438 16.042969 66.234375 16.039062 65.8125 16.03125 C 65.34375 16.027344 64.878906 16.023438 
64.398438 16.019531 C 63.636719 16.007812 63.636719 16.007812 62.859375 15.996094 C 61.789062 15.988281 60.714844 15.980469 59.644531 15.972656 C 57.953125 15.957031 56.257812 15.9375 54.566406 15.914062 C 52.933594 15.894531 51.304688 15.882812 49.671875 15.871094 C 49.171875 15.863281 48.671875 15.851562 48.15625 15.84375 C 45.371094 15.832031 43.105469 16.117188 40.507812 17.144531 C 38.863281 17.726562 37.230469 17.84375 35.5 18 C 35.5 18.328125 35.5 18.660156 35.5 19 C 35.058594 19.144531 34.617188 19.292969 34.164062 19.441406 C 25.652344 22.109375 25.652344 22.109375 19.5 28 C 18.34375 27.90625 18.34375 27.90625 17 27.5 C 15.636719 25.792969 15 24.183594 15 22 C 16.886719 17.398438 21.183594 13.535156 25 10.5 C 25.371094 10.097656 25.742188 9.695312 26.125 9.28125 C 40.53125 -4.144531 71.25 -1.40625 85.03125 11.8125 Z M 85.03125 11.8125 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(63.137257%,14.901961%,30.19608%);fill-opacity:1;" d="M 4 70 C 6.300781 70.421875 7.453125 71.820312 9 73.46875 C 11.199219 75.890625 11.199219 75.890625 14 77.5 C 14.0625 77.027344 14.125 76.550781 14.1875 76.0625 C 14.5 74.5 14.5 74.5 15.5 73.5 C 15.996094 73.5 16.488281 73.5 17 73.5 C 16.996094 73.902344 16.988281 74.308594 16.984375 74.726562 C 16.976562 75.257812 16.972656 75.792969 16.96875 76.34375 C 16.964844 76.871094 16.957031 77.398438 16.953125 77.945312 C 16.996094 79.363281 17.175781 80.621094 17.5 82 C 17.855469 82.089844 18.210938 82.183594 18.578125 82.273438 C 21.964844 83.164062 25.0625 84.042969 28 86 C 28 85.339844 28 84.679688 28 84 C 29.082031 82.945312 29.082031 82.945312 30.53125 81.84375 C 32.257812 80.496094 33.890625 79.175781 35.40625 77.59375 C 37.878906 75.0625 39.902344 73.789062 43.5 73.5 C 43.5 74.65625 43.5 75.808594 43.5 77 C 43.996094 76.835938 44.488281 76.671875 45 76.5 C 45.867188 76.78125 45.867188 76.78125 46.875 77.21875 C 51.785156 78.957031 57.222656 78.386719 62 76.5 C 62.246094 77.738281 62.246094 77.738281 62.5 79 C 62.5 77.515625 62.5 76.03125 62.5 74.5 C 64.457031 74.5 65.828125 74.515625 67.4375 75.679688 C 71.40625 79.839844 72.890625 85.640625 74.5 91 C 62.289062 91 50.078125 91 37.5 91 C 37.5 90.339844 37.5 89.679688 37.5 89 C 37.160156 89.324219 36.820312 89.648438 36.472656 89.980469 C 33.988281 91.699219 31.167969 91.332031 28.25 91.316406 C 27.167969 91.3125 26.085938 91.335938 25.003906 91.359375 C 21.152344 91.394531 18.25 91.082031 15.054688 88.695312 C 13.84375 87.898438 12.894531 87.695312 11.46875 87.5 C 8.65625 87.082031 6.722656 85.707031 4.5 84 C 4.632812 81.769531 4.789062 80.742188 6.28125 79.03125 C 6.882812 78.519531 6.882812 78.519531 7.5 78 C 6.824219 75.976562 5.824219 74.519531 4.511719 72.875 C 4 72 4 72 4 70 Z M 4 70 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(23.529412%,23.137255%,23.137255%);fill-opacity:1;" d="M 51.019531 17.898438 C 51.589844 17.894531 51.589844 17.894531 52.175781 17.890625 C 52.980469 17.886719 53.78125 17.882812 54.585938 17.878906 C 55.796875 17.875 57.007812 17.859375 58.21875 17.84375 C 62.695312 17.8125 66.707031 18.085938 71 19.5 C 71.339844 19.605469 71.679688 19.714844 72.027344 19.824219 C 72.851562 20.101562 73.660156 20.40625 74.46875 20.71875 C 74.988281 20.914062 75.507812 21.109375 76.046875 21.3125 C 77.589844 22.042969 78.746094 22.835938 80 24 C 80.929688 26.789062 79.351562 29.742188 78.15625 32.3125 C 77.941406 32.703125 77.722656 33.097656 77.5 33.5 C 70.136719 33 60.214844 27.597656 54.5 23 C 53.855469 21.898438 53.855469 21.898438 53.5 21 C 53.171875 21.164062 52.839844 21.328125 52.5 21.5 C 52.21875 22.515625 52.21875 22.515625 52.074219 23.734375 C 52.007812 24.1875 51.945312 24.636719 51.878906 25.097656 C 51.785156 25.808594 51.785156 25.808594 51.6875 26.53125 C 51.621094 27.007812 51.554688 27.480469 51.484375 27.96875 C 51.320312 29.144531 51.15625 30.324219 51 31.5 C 50.671875 31.003906 50.339844 30.511719 50 30 C 50.347656 35.015625 50.347656 35.015625 51 40 C 49.304688 39.28125 48.320312 38.390625 47.125 37 C 46.664062 36.472656 46.664062 36.472656 46.195312 35.9375 C 45.5 35 45.5 35 45.5 34 C 45.171875 34 44.839844 34 44.5 34 C 44.613281 34.730469 44.742188 35.460938 44.875 36.1875 C 44.945312 36.59375 45.015625 37 45.085938 37.417969 C 45.410156 38.589844 45.410156 38.589844 46.539062 39.144531 C 46.855469 39.261719 47.171875 39.378906 47.5 39.5 C 47.335938 40.160156 47.171875 40.820312 47 41.5 C 42.921875 39.746094 40.660156 35.738281 38.953125 31.835938 C 38.464844 30.398438 38.210938 29.003906 38 27.5 C 30.90625 33.050781 24.855469 39.394531 23 48.5 C 22.503906 48.335938 22.011719 48.171875 21.5 48 C 19.398438 43.316406 17.308594 37.644531 18.5 32.5 C 19.414062 30.839844 20.578125 29.738281 22 28.5 C 22.332031 
28.160156 22.664062 27.824219 23.007812 27.476562 C 29.792969 20.8125 41.722656 17.953125 51.019531 17.898438 Z M 51.019531 17.898438 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(83.137256%,9.411765%,34.117648%);fill-opacity:1;" d="M 69.5 7.5 C 69.617188 10.289062 69.289062 12.816406 68.5 15.5 C 68 16 68 16 67.058594 16.050781 C 66.648438 16.042969 66.234375 16.039062 65.8125 16.03125 C 65.34375 16.027344 64.878906 16.023438 64.398438 16.019531 C 63.636719 16.007812 63.636719 16.007812 62.859375 15.996094 C 61.789062 15.988281 60.714844 15.980469 59.644531 15.96875 C 57.953125 15.957031 56.257812 15.9375 54.566406 15.914062 C 52.933594 15.894531 51.304688 15.882812 49.671875 15.871094 C 49.171875 15.863281 48.671875 15.851562 48.15625 15.84375 C 45.371094 15.832031 43.085938 16.121094 40.480469 17.125 C 38.875 17.738281 37.203125 18.082031 35.53125 18.46875 C 32.714844 19.136719 30.097656 19.960938 27.460938 21.179688 C 26.5 21.5 26.5 21.5 25 21 C 23.878906 18.761719 23.859375 16.382812 24.5 14 C 26.375 12.105469 28.589844 11.050781 31 10 C 31.339844 9.851562 31.679688 9.703125 32.027344 9.550781 C 35.921875 7.898438 39.710938 6.5 44 6.5 C 44 6.335938 44 6.171875 44 6 C 51.941406 4.1875 61.78125 4.925781 69.5 7.5 Z M 69.5 7.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(97.254902%,83.529413%,71.372551%);fill-opacity:1;" d="M 4 70 C 6.300781 70.421875 7.453125 71.820312 9 73.46875 C 11.199219 75.890625 11.199219 75.890625 14 77.5 C 14.0625 77.027344 14.125 76.550781 14.1875 76.0625 C 14.5 74.5 14.5 74.5 15.5 73.5 C 15.996094 73.5 16.488281 73.5 17 73.5 C 16.996094 73.902344 16.988281 74.308594 16.984375 74.726562 C 16.976562 75.257812 16.972656 75.792969 16.96875 76.34375 C 16.964844 76.871094 16.957031 77.398438 16.953125 77.945312 C 16.996094 79.363281 17.175781 80.621094 17.5 82 C 17.855469 82.089844 18.210938 82.183594 18.578125 82.273438 C 21.964844 83.164062 25.0625 84.042969 28 86 C 28 85.339844 28 84.679688 28 84 C 29.082031 82.945312 29.082031 82.945312 30.53125 81.84375 C 32.257812 80.496094 33.890625 79.175781 35.40625 77.59375 C 37.878906 75.0625 39.902344 73.789062 43.5 73.5 C 43.476562 79.425781 41.179688 84.15625 37.125 88.421875 C 33.78125 91.484375 30.671875 91.515625 26.28125 91.40625 C 25.652344 91.414062 25.023438 91.425781 24.375 91.433594 C 20.628906 91.398438 18.136719 90.914062 15.085938 88.691406 C 13.863281 87.910156 12.894531 87.699219 11.46875 87.5 C 8.65625 87.082031 6.722656 85.707031 4.5 84 C 4.632812 81.769531 4.789062 80.742188 6.28125 79.03125 C 6.882812 78.519531 6.882812 78.519531 7.5 78 C 6.824219 75.976562 5.824219 74.519531 4.511719 72.875 C 4 72 4 72 4 70 Z M 4 70 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(17.254902%,16.470589%,15.686275%);fill-opacity:1;" d="M 19 40.5 C 20.25 42.8125 21.5 45.351562 21.5 48 C 21.996094 48.164062 22.488281 48.328125 23 48.5 C 24.519531 51.769531 24.710938 54.886719 24.765625 58.441406 C 24.851562 61.398438 25.539062 63.457031 27.59375 65.6875 C 27.894531 65.957031 28.191406 66.222656 28.5 66.5 C 28.335938 66.664062 28.171875 66.828125 28 67 C 27.738281 72.332031 28.125 77.695312 28.453125 83.019531 C 28.585938 85.828125 28.585938 85.828125 28 87 C 27.75 86.699219 27.5 86.398438 27.242188 86.085938 C 25.6875 84.726562 24.066406 84.207031 22.125 83.625 C 21.601562 83.460938 21.601562 83.460938 21.070312 83.289062 C 20.214844 83.023438 19.355469 82.761719 18.5 82.5 C 18.535156 82.042969 18.570312 81.582031 18.609375 81.109375 C 19.328125 70.898438 19.355469 60.488281 18.613281 50.273438 C 18.101562 42.292969 18.101562 42.292969 19 40.5 Z M 19 40.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(65.098041%,15.686275%,31.37255%);fill-opacity:1;" d="M 50.429688 2.332031 C 51.464844 2.34375 52.5 2.34375 53.535156 2.339844 C 65.351562 2.363281 65.351562 2.363281 69 5 C 69.320312 6.050781 69.320312 6.050781 69.5 7 C 69.054688 6.894531 68.609375 6.789062 68.152344 6.679688 C 63.640625 5.679688 59.382812 5.261719 54.769531 5.304688 C 53.71875 5.3125 52.667969 5.304688 51.613281 5.292969 C 45.152344 5.285156 39.1875 6.71875 33 8.5 C 33 8.828125 33 9.160156 33 9.5 C 29.039062 11.726562 29.039062 11.726562 25 14 C 24.988281 14.96875 24.980469 15.9375 24.96875 16.9375 C 24.964844 17.484375 24.957031 18.027344 24.949219 18.589844 C 24.902344 19.976562 24.902344 19.976562 25.5 21 C 25.789062 20.902344 26.078125 20.804688 26.378906 20.707031 C 27.1875 20.4375 28 20.167969 28.808594 19.898438 C 29.753906 19.582031 30.691406 19.246094 31.625 18.90625 C 32.996094 18.5 34.082031 18.417969 35.5 18.5 C 33.949219 19.316406 32.351562 19.984375 30.734375 20.65625 C 24.335938 23.273438 24.335938 23.273438 19.5 28 C 18.34375 27.90625 18.34375 27.90625 17 27.5 C 15.636719 25.792969 15 24.183594 15 22 C 16.625 18.035156 20.425781 13.800781 24 11.5 C 24.335938 11.417969 24.675781 11.335938 25.019531 11.253906 C 26.277344 10.929688 26.753906 10.359375 27.65625 9.4375 C 33.855469 3.824219 42.347656 2.234375 50.429688 2.332031 Z M 50.429688 2.332031 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(1.960784%,1.568628%,1.568628%);fill-opacity:1;" d="M 87.5 41.5 C 87.664062 41.5 87.828125 41.5 88 41.5 C 88.246094 44.46875 88.246094 44.46875 88.5 47.5 C 89.738281 47.253906 89.738281 47.253906 91 47 C 89.832031 51.4375 87.511719 54.648438 84 57.5 C 83.5 58.214844 83.023438 58.945312 82.5625 59.6875 C 77.804688 67.046875 68.964844 69.925781 61 72.5 C 61.542969 72.617188 61.542969 72.617188 62.097656 72.742188 C 62.582031 72.855469 63.0625 72.972656 63.5625 73.09375 C 64.039062 73.203125 64.515625 73.3125 65.003906 73.429688 C 68.210938 74.65625 69.839844 76.71875 71.28125 79.75 C 71.425781 80.050781 71.570312 80.351562 71.71875 80.664062 C 75.5 88.699219 75.5 88.699219 75.5 91 C 75.171875 91 74.839844 91 74.5 91 C 74.136719 90.125 73.769531 89.25 73.40625 88.375 C 73.175781 87.828125 72.949219 87.28125 72.710938 86.71875 C 72.253906 85.617188 71.808594 84.503906 71.382812 83.390625 C 69.675781 78.945312 69.675781 78.945312 66.5 75.5 C 65.136719 74.957031 63.964844 74.726562 62.5 74.5 C 62.746094 77.71875 62.746094 77.71875 63 81 C 61.855469 79.28125 61.910156 78.507812 62 76.5 C 61.726562 76.644531 61.453125 76.789062 61.171875 76.9375 C 55.585938 79.550781 49.707031 78.992188 44 77 C 43.671875 77.660156 43.339844 78.320312 43 79 C 43.164062 77.183594 43.328125 75.371094 43.5 73.5 C 39.535156 74.359375 37.273438 76.390625 34.445312 79.128906 C 32.726562 80.761719 30.929688 82.128906 29 83.5 C 28.671875 83.335938 28.339844 83.171875 28 83 C 27.632812 77.65625 27.429688 72.355469 27.5 67 C 29.398438 66.367188 29.738281 66.738281 31.5 67.59375 C 34.238281 68.789062 36.917969 69.523438 39.84375 70.125 C 40.234375 70.207031 40.625 70.289062 41.03125 70.371094 C 43.679688 70.894531 46.316406 71.230469 49 71.5 C 48.011719 71.746094 48.011719 71.746094 47 72 C 47.035156 74.8125 47.035156 74.8125 48.53125 77 C 51.40625 77.976562 53.542969 77.847656 56.21875 76.53125 C 57.441406 75.875 57.441406 75.875 58.5 75 C 58.683594 
73.835938 58.851562 72.667969 59 71.5 C 59.617188 71.28125 59.617188 71.28125 60.246094 71.058594 C 63.476562 69.871094 66.410156 68.597656 69.292969 66.714844 C 70.5 66 70.5 66 72.171875 65.34375 C 74.285156 64.464844 75.839844 63.105469 77.53125 61.59375 C 77.824219 61.339844 78.117188 61.085938 78.417969 60.824219 C 82.359375 57.554688 82.359375 57.554688 85.09375 53.34375 C 85.523438 52.320312 85.523438 52.320312 86 51.5 C 86.496094 51.335938 86.988281 51.171875 87.5 51 C 87.253906 50.257812 87.253906 50.257812 87 49.5 C 86.671875 49.5 86.339844 49.5 86 49.5 C 85.753906 47.519531 85.753906 47.519531 85.5 45.5 C 85.746094 45.996094 85.746094 45.996094 86 46.5 C 86.496094 46.5 86.988281 46.5 87.5 46.5 C 87.5 44.851562 87.5 43.199219 87.5 41.5 Z M 28.5 67.5 C 28.355469 72.28125 28.769531 76.785156 29.5 81.5 C 31.386719 80.445312 32.914062 79.25 34.5 77.78125 C 39.320312 73.328125 39.320312 73.328125 42.15625 72.875 C 42.597656 72.800781 43.039062 72.726562 43.492188 72.648438 C 43.992188 72.574219 43.992188 72.574219 44.5 72.5 C 44.5 72.335938 44.5 72.171875 44.5 72 C 43.914062 71.878906 43.328125 71.757812 42.726562 71.632812 C 38.34375 70.707031 34.164062 69.675781 30 68 C 29.503906 67.835938 29.011719 67.671875 28.5 67.5 Z M 28.5 67.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(7.058824%,18.82353%,10.980392%);fill-opacity:1;" d="M 85.5 31 C 85.828125 31.328125 86.160156 31.660156 86.5 32 C 85.710938 32.09375 85.710938 32.09375 84.90625 32.1875 C 83.628906 32.5625 83.09375 32.878906 82.265625 33.933594 C 80.398438 37.410156 79.683594 41.144531 79 45 C 80.15625 45 81.308594 45 82.5 45 C 82.523438 44.667969 82.550781 44.335938 82.574219 43.992188 C 82.945312 39.609375 82.945312 39.609375 84 37.5 C 84.59375 39.285156 84.367188 40.253906 84.03125 42.09375 C 83.933594 42.636719 83.835938 43.179688 83.734375 43.742188 C 83.660156 44.15625 83.582031 44.570312 83.5 45 C 83.996094 45 84.488281 45 85 45 C 85.164062 43.515625 85.328125 42.03125 85.5 40.5 C 85.664062 40.5 85.828125 40.5 86 40.5 C 86 43.46875 86 46.441406 86 49.5 C 86.328125 49.5 86.660156 49.5 87 49.5 C 87.328125 50.160156 87.660156 50.820312 88 51.5 C 87.503906 51.5 87.011719 51.5 86.5 51.5 C 86.335938 51.171875 86.171875 50.839844 86 50.5 C 85.816406 50.792969 85.632812 51.082031 85.445312 51.382812 C 82.007812 56.589844 77.625 60.953125 72.5 64.5 C 72.171875 64.335938 71.839844 64.171875 71.5 64 C 71.921875 63.625 72.34375 63.253906 72.777344 62.867188 C 73.332031 62.371094 73.882812 61.871094 74.4375 61.375 C 74.855469 61.007812 74.855469 61.007812 75.28125 60.632812 C 76.679688 59.367188 77.828125 58.289062 78.5 56.5 C 78.050781 56.019531 78.050781 56.019531 77.59375 55.53125 C 73.949219 50.429688 74.53125 43.941406 75.5 38 C 73.851562 36.84375 72.199219 35.691406 70.5 34.5 C 70.828125 34.335938 71.160156 34.171875 71.5 34 C 72.691406 34.609375 72.691406 34.609375 74.0625 35.46875 C 74.515625 35.75 74.972656 36.035156 75.441406 36.328125 C 75.789062 36.546875 76.140625 36.769531 76.5 37 C 76.789062 36.539062 76.789062 36.539062 77.085938 36.070312 C 77.347656 35.664062 77.605469 35.261719 77.875 34.84375 C 78.128906 34.445312 78.386719 34.042969 78.648438 33.632812 C 80.699219 30.90625 82.246094 30.738281 85.5 31 Z M 85.5 31 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(80.784315%,10.196079%,34.117648%);fill-opacity:1;" d="M 44 7 C 45.277344 8.445312 45.488281 9.355469 45.65625 11.28125 C 45.515625 12.835938 45.328125 13.703125 44.5 15 C 44.171875 15 43.839844 15 43.5 15 C 43.5 15.164062 43.5 15.328125 43.5 15.5 C 40.1875 15.625 40.1875 15.625 38.5 14.5 C 38.664062 14.664062 38.828125 14.828125 39 15 C 38.5 16 38.5 16 37.617188 16.320312 C 37.242188 16.410156 36.867188 16.5 36.476562 16.59375 C 36.066406 16.695312 35.652344 16.796875 35.226562 16.902344 C 34.574219 17.058594 34.574219 17.058594 33.90625 17.21875 C 31.066406 17.902344 28.605469 18.644531 26 20 C 25.503906 20 25.011719 20 24.5 20 C 24.085938 17.882812 23.96875 16.101562 24.5 14 C 28.375 10.066406 38.5625 5.378906 44 7 Z M 44 7 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(92.941177%,80.000001%,68.235296%);fill-opacity:1;" d="M 4 70 C 6.300781 70.421875 7.453125 71.820312 9 73.46875 C 11.199219 75.890625 11.199219 75.890625 14 77.5 C 14.0625 77.027344 14.125 76.550781 14.1875 76.0625 C 14.5 74.5 14.5 74.5 15.5 73.5 C 15.996094 73.5 16.488281 73.5 17 73.5 C 16.996094 73.902344 16.988281 74.308594 16.984375 74.726562 C 16.976562 75.257812 16.972656 75.792969 16.96875 76.34375 C 16.964844 76.871094 16.957031 77.398438 16.953125 77.945312 C 16.996094 79.375 17.105469 80.625 17.5 82 C 17.828125 82.164062 18.160156 82.328125 18.5 82.5 C 18.171875 82.5 17.839844 82.5 17.5 82.5 C 17.390625 83.058594 17.390625 83.058594 17.28125 83.625 C 17.050781 84.761719 16.785156 85.878906 16.5 87 C 16.898438 87.246094 17.292969 87.492188 17.703125 87.75 C 18.214844 88.070312 18.722656 88.390625 19.25 88.71875 C 20.015625 89.199219 20.015625 89.199219 20.796875 89.6875 C 21.195312 89.953125 21.589844 90.222656 22 90.5 C 22 90.664062 22 90.828125 22 91 C 19.703125 91.363281 18.5 90.78125 16.625 89.5625 C 14.875 88.433594 13.53125 87.800781 11.46875 87.5 C 8.65625 87.082031 6.722656 85.707031 4.5 84 C 4.632812 81.769531 4.789062 80.742188 6.28125 79.03125 C 6.882812 78.519531 6.882812 78.519531 7.5 78 C 6.824219 75.976562 5.824219 74.519531 4.511719 72.875 C 4 72 4 72 4 70 Z M 4 70 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(6.666667%,54.901963%,48.235294%);fill-opacity:1;" d="M 85.71875 31.625 C 87.894531 32.082031 88.546875 32.769531 89.859375 34.492188 C 91.792969 37.523438 93 40.875 92.5 44.5 C 91.625 46.25 91.625 46.25 90.5 47.5 C 89.839844 47.5 89.179688 47.5 88.5 47.5 C 88.335938 47.003906 88.171875 46.511719 88 46 C 87.339844 46.164062 86.679688 46.328125 86 46.5 C 85.835938 45.179688 85.671875 43.859375 85.5 42.5 C 85.335938 43.324219 85.171875 44.148438 85 45 C 84.503906 45 84.011719 45 83.5 45 C 83.5 43.515625 83.5 42.03125 83.5 40.5 C 83.171875 41.984375 82.839844 43.46875 82.5 45 C 81.34375 45 80.191406 45 79 45 C 79.171875 40.460938 79.914062 36.640625 82.65625 32.9375 C 84 32 84 32 85.71875 31.625 Z M 85.71875 31.625 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(14.509805%,37.64706%,6.27451%);fill-opacity:1;" d="M 67 36 C 69.300781 36.886719 71.859375 38.21875 73 40.5 C 72.835938 40.828125 72.671875 41.160156 72.5 41.5 C 72.996094 41.664062 73.488281 41.828125 74 42 C 73.503906 42.246094 73.503906 42.246094 73 42.5 C 72.796875 43.660156 72.628906 44.828125 72.5 46 C 72.335938 46 72.171875 46 72 46 C 71.949219 45.664062 71.902344 45.324219 71.851562 44.976562 C 71.316406 41.519531 71.316406 41.519531 69 39 C 69.164062 39.496094 69.328125 39.988281 69.5 40.5 C 69.953125 47.03125 69.953125 47.03125 68 50 C 67.835938 50.496094 67.671875 50.988281 67.5 51.5 C 65.683594 51.5 63.871094 51.5 62 51.5 C 62.328125 51.171875 62.660156 50.839844 63 50.5 C 62.671875 50.148438 62.339844 49.796875 62 49.4375 C 60.261719 46.875 60.695312 43.449219 61 40.5 C 61.511719 39.285156 61.511719 39.285156 62 38.5 C 60.472656 39.367188 59.183594 40.191406 58 41.5 C 58.621094 39.019531 59.378906 37.929688 61.5 36.5 C 63.226562 35.636719 65.105469 35.820312 67 36 Z M 67 36 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(83.92157%,9.019608%,34.117648%);fill-opacity:1;" d="M 73.5 6.5 C 78.40625 8.304688 83.226562 10.355469 86.121094 14.996094 C 86.800781 16.800781 86.800781 18.605469 86.5 20.5 C 85.5 21.78125 85.5 21.78125 84.5 22.5 C 84.015625 22.148438 83.53125 21.796875 83.03125 21.4375 C 79.667969 19.308594 75.753906 18.242188 72 17 C 72.160156 16.414062 72.316406 15.824219 72.480469 15.21875 C 72.957031 13.3125 73.191406 11.464844 73.3125 9.5 C 73.351562 8.933594 73.386719 8.367188 73.425781 7.78125 C 73.449219 7.359375 73.476562 6.9375 73.5 6.5 Z M 73.5 6.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(52.156866%,50.196081%,48.235294%);fill-opacity:1;" d="M 86 50 C 86.5 51.5 86.5 51.5 86.160156 52.472656 C 82.765625 58.660156 76.664062 63.988281 70.265625 66.90625 C 68.867188 67.5625 67.554688 68.320312 66.21875 69.09375 C 63.820312 70.421875 61.699219 71.058594 59 71.5 C 59.050781 72.035156 59.101562 72.574219 59.15625 73.125 C 59 75 59 75 57.90625 76.15625 C 55.300781 77.71875 53.078125 78.46875 50 78 C 48.25 77.25 48.25 77.25 47 76 C 46.746094 74.574219 46.855469 73.460938 47 72 C 53.03125 69.988281 53.03125 69.988281 56.3125 69.6875 C 60.558594 69.199219 64.21875 67.699219 68.078125 65.917969 C 68.851562 65.566406 69.632812 65.242188 70.421875 64.925781 C 76.78125 62.183594 82.390625 55.777344 86 50 Z M 86 50 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(3.137255%,2.352941%,2.352941%);fill-opacity:1;" d="M 15 23 C 15.132812 23.277344 15.269531 23.558594 15.40625 23.84375 C 16.621094 26.539062 16.621094 26.539062 19 28 C 18.996094 28.578125 18.996094 29.152344 18.992188 29.746094 C 18.984375 31.929688 18.976562 34.109375 18.972656 36.292969 C 18.96875 37.226562 18.964844 38.164062 18.960938 39.101562 C 18.941406 44.0625 18.984375 48.988281 19.316406 53.9375 C 19.542969 57.394531 19.574219 60.84375 19.566406 64.304688 C 19.5625 65.347656 19.566406 66.390625 19.570312 67.433594 C 19.574219 78.777344 19.574219 78.777344 18.5 82 C 18.171875 82 17.839844 82 17.5 82 C 16.515625 79.976562 16.441406 78.566406 16.6875 76.34375 C 16.769531 75.542969 16.769531 75.542969 16.855469 74.726562 C 16.925781 74.117188 16.925781 74.117188 17 73.5 C 16.003906 73.894531 16.003906 73.894531 15 74.5 C 14.355469 76.019531 14.355469 76.019531 14 77.5 C 11.503906 76.765625 10.164062 75.308594 8.4375 73.46875 C 6.511719 71.324219 6.511719 71.324219 4 70 C 5.03125 72.65625 6.28125 74.722656 8 77 C 7.433594 78.273438 6.863281 79.15625 5.9375 80.21875 C 4.855469 81.695312 4.726562 82.214844 5 84 C 7.484375 86.339844 10.511719 87.21875 13.796875 87.71875 C 15.777344 88.183594 17.421875 89.234375 19 90.5 C 19 90.664062 19 90.828125 19 91 C 17.84375 91 16.691406 91 15.5 91 C 15.5 90.503906 15.5 90.011719 15.5 89.5 C 15.089844 89.425781 14.675781 89.351562 14.253906 89.277344 C 10.378906 88.5 7.164062 87.425781 4 85 C 3.273438 83.546875 3.3125 82.613281 3.5 81 C 4.078125 80.207031 4.078125 80.207031 4.78125 79.4375 C 6.125 78.195312 6.125 78.195312 5.921875 77.066406 C 5.222656 75.300781 4.132812 73.796875 3.089844 72.21875 C 2.5 71 2.5 71 2.660156 69.84375 C 3 69 3 69 3.84375 68.621094 C 5 68.5 5 68.5 6.203125 69.347656 C 6.621094 69.75 7.039062 70.148438 7.46875 70.5625 C 7.90625 70.976562 8.34375 71.390625 8.792969 71.816406 C 9.191406 72.207031 9.589844 72.597656 10 73 C 10.285156 73.277344 
10.570312 73.554688 10.863281 73.839844 C 11.414062 74.386719 11.960938 74.941406 12.5 75.5 C 12.746094 75.746094 12.746094 75.746094 13 76 C 13.277344 75.515625 13.558594 75.03125 13.84375 74.53125 C 15 73 15 73 16.375 72.5625 C 16.746094 72.542969 17.117188 72.519531 17.5 72.5 C 17.867188 70.53125 18.0625 68.664062 18.066406 66.664062 C 18.066406 66.128906 18.066406 65.59375 18.070312 65.046875 C 18.066406 64.484375 18.066406 63.921875 18.0625 63.34375 C 18.0625 62.761719 18.0625 62.179688 18.058594 61.582031 C 18.039062 57.703125 17.882812 53.859375 17.609375 49.988281 C 17.46875 47.441406 17.472656 44.902344 17.519531 42.351562 C 17.542969 40.335938 17.351562 38.484375 17 36.5 C 16.957031 35.574219 16.933594 34.644531 16.9375 33.71875 C 16.9375 33.265625 16.933594 32.808594 16.933594 32.34375 C 17 31.015625 17.207031 29.792969 17.5 28.5 C 17.179688 28.355469 16.859375 28.210938 16.53125 28.0625 C 15.5 27.5 15.5 27.5 15 26.5 C 14.980469 25.332031 14.976562 24.167969 15 23 Z M 17.5 76.5 C 18 77.5 18 77.5 18 77.5 Z M 17.5 76.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(23.529412%,23.137255%,22.745098%);fill-opacity:1;" d="M 19 43.5 C 21.367188 45.867188 21.191406 49.222656 21.296875 52.367188 C 21.328125 53.058594 21.328125 53.058594 21.359375 53.765625 C 21.421875 55.230469 21.476562 56.691406 21.53125 58.15625 C 21.589844 59.628906 21.648438 61.101562 21.710938 62.574219 C 21.746094 63.488281 21.785156 64.402344 21.816406 65.320312 C 21.703125 67.320312 21.703125 67.320312 22.5 69 C 22.746094 67.515625 22.746094 67.515625 23 66 C 23.164062 66 23.328125 66 23.5 66 C 23.929688 68.222656 24.058594 70.3125 24.046875 72.574219 C 24.046875 72.902344 24.046875 73.230469 24.046875 73.570312 C 24.042969 74.609375 24.039062 75.648438 24.03125 76.6875 C 24.027344 77.394531 24.027344 78.105469 24.023438 78.8125 C 24.019531 80.542969 24.011719 82.273438 24 84 C 23.269531 83.84375 22.542969 83.6875 21.8125 83.53125 C 21.40625 83.445312 21 83.355469 20.582031 83.265625 C 19.5 83 19.5 83 18.5 82.5 C 18.535156 82.03125 18.570312 81.5625 18.609375 81.078125 C 19.117188 73.714844 19.058594 66.351562 19.035156 58.976562 C 19.027344 57.226562 19.027344 55.476562 19.023438 53.726562 C 19.019531 50.316406 19.011719 46.90625 19 43.5 Z M 19 43.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(73.333335%,25.490198%,29.411766%);fill-opacity:1;" d="M 58.5 57.5 C 59 58 59 58 59.15625 59.84375 C 59.128906 62.0625 58.4375 63.285156 57 65 C 54.714844 67.070312 52.230469 67.210938 49.242188 67.171875 C 47.113281 66.878906 46.367188 66.148438 45 64.5 C 44.5625 62.59375 44.5625 62.59375 44.5 61 C 48.339844 57.816406 53.652344 57.175781 58.5 57.5 Z M 58.5 57.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(95.686275%,82.352942%,70.588237%);fill-opacity:1;" d="M 62.5 74.5 C 64.457031 74.5 65.828125 74.515625 67.4375 75.679688 C 71.359375 79.789062 73.035156 85.632812 74.5 91 C 72.023438 91 69.550781 91 67 91 C 66.800781 89.855469 66.800781 89.855469 66.59375 88.6875 C 66.078125 86.105469 65.300781 84.046875 63.6875 81.9375 C 62.472656 80.28125 62.4375 78.632812 62.46875 76.625 C 62.472656 76.226562 62.476562 75.824219 62.484375 75.414062 C 62.488281 75.113281 62.492188 74.8125 62.5 74.5 Z M 62.5 74.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(12.54902%,3.529412%,6.666667%);fill-opacity:1;" d="M 25 11.5 C 24.996094 12.101562 24.988281 12.703125 24.984375 13.324219 C 24.976562 14.113281 24.972656 14.898438 24.96875 15.6875 C 24.964844 16.085938 24.960938 16.480469 24.957031 16.890625 C 24.886719 19.011719 24.886719 19.011719 25.5 21 C 25.789062 20.902344 26.078125 20.804688 26.378906 20.707031 C 27.1875 20.4375 28 20.167969 28.808594 19.898438 C 29.753906 19.582031 30.691406 19.246094 31.625 18.90625 C 32.996094 18.5 34.082031 18.417969 35.5 18.5 C 33.949219 19.316406 32.351562 19.984375 30.734375 20.65625 C 24.335938 23.273438 24.335938 23.273438 19.5 28 C 18.34375 27.90625 18.34375 27.90625 17 27.5 C 15.636719 25.792969 15 24.183594 15 22 C 16.332031 18.757812 21.214844 11.5 25 11.5 Z M 25 11.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(33.725491%,34.117648%,33.725491%);fill-opacity:1;" d="M 69.5 20.5 C 70.375 20.714844 71.25 20.933594 72.125 21.15625 C 72.613281 21.277344 73.097656 21.398438 73.601562 21.523438 C 74.996094 22 75.886719 22.554688 77 23.5 C 76.835938 23.828125 76.671875 24.160156 76.5 24.5 C 76.003906 24.171875 75.511719 23.839844 75 23.5 C 76.238281 24.984375 76.238281 24.984375 77.5 26.5 C 77.335938 26.828125 77.171875 27.160156 77 27.5 C 75.761719 26.757812 75.761719 26.757812 74.5 26 C 74.828125 26.988281 75.160156 27.980469 75.5 29 C 73.851562 28.011719 72.199219 27.019531 70.5 26 C 70.664062 26.824219 70.828125 27.648438 71 28.5 C 69.84375 28.003906 68.691406 27.511719 67.5 27 C 67.335938 27.328125 67.171875 27.660156 67 28 C 65.933594 27.214844 64.871094 26.421875 63.8125 25.625 C 63.511719 25.402344 63.210938 25.179688 62.898438 24.953125 C 61.355469 23.785156 60.152344 22.859375 59.5 21 C 60.414062 21.230469 60.414062 21.230469 61.34375 21.46875 C 63.890625 22.191406 63.890625 22.191406 66.5 22.5 C 66.5 22.171875 66.5 21.839844 66.5 21.5 C 67.488281 21.5 68.480469 21.5 69.5 21.5 C 69.5 21.171875 69.5 20.839844 69.5 20.5 Z M 69.5 20.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(0.784314%,34.901962%,29.019609%);fill-opacity:1;" d="M 84 37.5 C 84.59375 39.285156 84.367188 40.253906 84.03125 42.09375 C 83.933594 42.636719 83.835938 43.179688 83.734375 43.742188 C 83.660156 44.15625 83.582031 44.570312 83.5 45 C 83.996094 45 84.488281 45 85 45 C 85.164062 43.515625 85.328125 42.03125 85.5 40.5 C 85.664062 40.5 85.828125 40.5 86 40.5 C 86 43.46875 86 46.441406 86 49.5 C 86.328125 49.5 86.660156 49.5 87 49.5 C 87.328125 50.160156 87.660156 50.820312 88 51.5 C 87.503906 51.5 87.011719 51.5 86.5 51.5 C 86.335938 51.171875 86.171875 50.839844 86 50.5 C 85.011719 51.984375 84.019531 53.46875 83 55 C 81.480469 54.261719 80.589844 53.625 79.59375 52.25 C 78.769531 49.816406 78.878906 47.546875 79 45 C 80.15625 45 81.308594 45 82.5 45 C 82.523438 44.667969 82.550781 44.335938 82.574219 43.992188 C 82.945312 39.609375 82.945312 39.609375 84 37.5 Z M 84 37.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(32.941177%,47.843137%,27.450982%);fill-opacity:1;" d="M 60.5 37 C 61.160156 37.328125 61.820312 37.660156 62.5 38 C 62.5 38.328125 62.5 38.660156 62.5 39 C 63.160156 39.164062 63.820312 39.328125 64.5 39.5 C 64.535156 41.007812 64.527344 42.363281 64.21875 43.84375 C 63.890625 45.074219 63.890625 45.074219 64.5 46.5 C 64.996094 46.335938 65.488281 46.171875 66 46 C 66.609375 44.746094 66.609375 44.746094 67 43.5 C 67.660156 43.664062 68.320312 43.828125 69 44 C 69.164062 46.773438 69.238281 48.683594 67.5 51 C 67.5 51.164062 67.5 51.328125 67.5 51.5 C 65.683594 51.5 63.871094 51.5 62 51.5 C 62.328125 51.171875 62.660156 50.839844 63 50.5 C 62.671875 50.148438 62.339844 49.796875 62 49.4375 C 60.261719 46.875 60.695312 43.449219 61 40.5 C 61.511719 39.285156 61.511719 39.285156 62 38.5 C 60.472656 39.367188 59.183594 40.191406 58 41.5 C 58.546875 39.613281 58.953125 38.265625 60.5 37 Z M 60.5 37 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(12.156863%,9.411765%,7.450981%);fill-opacity:1;" d="M 28 66.5 C 28.609375 66.707031 29.21875 66.914062 29.84375 67.125 C 30.261719 67.261719 30.675781 67.398438 31.105469 67.539062 C 32.039062 67.847656 32.96875 68.164062 33.894531 68.492188 C 38.832031 70.152344 43.847656 70.820312 49 71.5 C 48.339844 71.664062 47.679688 71.828125 47 72 C 46.835938 72.824219 46.671875 73.648438 46.5 74.5 C 46.253906 74.003906 46.253906 74.003906 46 73.5 C 39.910156 74.003906 36.929688 76.445312 32.820312 80.710938 C 31.648438 81.839844 30.386719 82.65625 29 83.5 C 28.671875 83.335938 28.339844 83.171875 28 83 C 27.253906 72.128906 27.253906 72.128906 27.5 67 C 27.664062 66.835938 27.828125 66.671875 28 66.5 Z M 28.5 67.5 C 28.355469 72.28125 28.769531 76.785156 29.5 81.5 C 31.386719 80.445312 32.914062 79.25 34.5 77.78125 C 39.320312 73.328125 39.320312 73.328125 42.15625 72.875 C 42.597656 72.800781 43.039062 72.726562 43.492188 72.648438 C 43.992188 72.574219 43.992188 72.574219 44.5 72.5 C 44.5 72.335938 44.5 72.171875 44.5 72 C 43.914062 71.878906 43.328125 71.757812 42.726562 71.632812 C 38.34375 70.707031 34.164062 69.675781 30 68 C 29.503906 67.835938 29.011719 67.671875 28.5 67.5 Z M 28.5 67.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(15.294118%,12.941177%,10.588235%);fill-opacity:1;" d="M 3.5 68.5 C 6.949219 68.945312 9.507812 72.417969 11.5 75 C 11.996094 75.328125 12.488281 75.660156 13 76 C 13.277344 75.515625 13.558594 75.03125 13.84375 74.53125 C 15 73 15 73 16.3125 72.5 C 16.703125 72.5 17.097656 72.5 17.5 72.5 C 17.828125 72.828125 18.160156 73.160156 18.5 73.5 C 18.171875 73.5 17.839844 73.5 17.5 73.5 C 17.519531 74.117188 17.542969 74.738281 17.5625 75.375 C 17.625 77.226562 17.402344 77.695312 16.5 79.5 C 16.664062 77.519531 16.828125 75.539062 17 73.5 C 14.886719 74.28125 14.886719 74.28125 14.375 76.0625 C 14.25 76.535156 14.128906 77.011719 14 77.5 C 11.503906 76.765625 10.164062 75.308594 8.4375 73.46875 C 6.511719 71.324219 6.511719 71.324219 4 70 C 5.03125 72.65625 6.28125 74.722656 8 77 C 7.433594 78.273438 6.863281 79.15625 5.9375 80.21875 C 4.855469 81.695312 4.726562 82.214844 5 84 C 7.484375 86.339844 10.511719 87.21875 13.796875 87.71875 C 15.777344 88.183594 17.421875 89.234375 19 90.5 C 19 90.664062 19 90.828125 19 91 C 17.84375 91 16.691406 91 15.5 91 C 15.5 90.503906 15.5 90.011719 15.5 89.5 C 15.089844 89.425781 14.675781 89.351562 14.253906 89.277344 C 10.378906 88.5 7.164062 87.425781 4 85 C 3.273438 83.546875 3.3125 82.613281 3.5 81 C 4.078125 80.207031 4.078125 80.207031 4.78125 79.4375 C 6.125 78.195312 6.125 78.195312 5.921875 77.066406 C 5.222656 75.300781 4.132812 73.796875 3.089844 72.21875 C 2.5 71 2.5 71 2.628906 69.875 C 3 69 3 69 3.5 68.5 Z M 3.5 68.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(85.09804%,61.56863%,46.27451%);fill-opacity:1;" d="M 8.5 78 C 8.769531 78.175781 9.035156 78.351562 9.3125 78.53125 C 10.867188 79.144531 11.898438 78.863281 13.5 78.5 C 13.335938 78.996094 13.171875 79.488281 13 80 C 12.964844 80.695312 12.957031 81.394531 12.96875 82.09375 C 12.980469 82.722656 12.988281 83.351562 13 84 C 14.199219 84.339844 14.199219 84.339844 15.5 84.5 C 16.679688 83.320312 16.746094 82.609375 17 81 C 17.164062 81.328125 17.328125 81.660156 17.5 82 C 17.828125 82.164062 18.160156 82.328125 18.5 82.5 C 18.171875 82.5 17.839844 82.5 17.5 82.5 C 17.390625 83.058594 17.390625 83.058594 17.28125 83.625 C 17.050781 84.761719 16.785156 85.878906 16.5 87 C 16.898438 87.246094 17.292969 87.492188 17.703125 87.75 C 18.214844 88.070312 18.722656 88.390625 19.25 88.71875 C 19.761719 89.039062 20.269531 89.355469 20.796875 89.6875 C 21.195312 89.953125 21.589844 90.222656 22 90.5 C 22 90.664062 22 90.828125 22 91 C 19.703125 91.363281 18.5 90.78125 16.625 89.5625 C 14.875 88.433594 13.53125 87.800781 11.46875 87.5 C 8.65625 87.082031 6.722656 85.707031 4.5 84 C 6.191406 83.511719 7.28125 83.570312 9 84 C 8.671875 83.175781 8.339844 82.351562 8 81.5 C 9.238281 81.996094 9.238281 81.996094 10.5 82.5 C 9.96875 80.796875 9.496094 79.492188 8.5 78 Z M 8.5 78 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(91.764706%,35.294119%,54.11765%);fill-opacity:1;" d="M 91.5 37 C 92.375 39.621094 92.765625 41.730469 92.5 44.5 C 91.625 46.25 91.625 46.25 90.5 47.5 C 89.839844 47.5 89.179688 47.5 88.5 47.5 C 88.335938 47.003906 88.171875 46.511719 88 46 C 87.339844 46.164062 86.679688 46.328125 86 46.5 C 85.777344 42.742188 85.679688 40.070312 88 37 C 89.332031 36.332031 90.082031 36.664062 91.5 37 Z M 91.5 37 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(89.411765%,63.529414%,46.666667%);fill-opacity:1;" d="M 47 72.5 C 50.078125 72.453125 50.078125 72.453125 51 72.5 C 51.164062 72.664062 51.328125 72.828125 51.5 73 C 53.890625 73.09375 56.140625 72.886719 58.5 72.5 C 58.664062 73.324219 58.828125 74.148438 59 75 C 56.433594 77.199219 54.464844 78.265625 51 78.15625 C 49.234375 77.882812 48.265625 77.234375 47 76 C 46.8125 74.125 46.8125 74.125 47 72.5 Z M 47 72.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(95.294118%,38.039216%,43.137255%);fill-opacity:1;" d="M 54.84375 60.96875 C 55.339844 60.972656 55.835938 60.976562 56.351562 60.984375 C 56.917969 60.992188 56.917969 60.992188 57.5 61 C 58 62.5 58 62.5 57.621094 63.523438 C 56.828125 64.761719 56.179688 65.605469 55 66.5 C 52.148438 67.027344 49.453125 67.132812 47 65.5 C 47.0625 64.625 47.0625 64.625 47.5 63.5 C 49.992188 61.605469 51.746094 60.933594 54.84375 60.96875 Z M 54.84375 60.96875 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(82.745099%,9.019608%,33.725491%);fill-opacity:1;" d="M 20 16.5 C 20.328125 16.5 20.660156 16.5 21 16.5 C 21 17.488281 21 18.480469 21 19.5 C 21.328125 19.5 21.660156 19.5 22 19.5 C 22.363281 21.019531 22.617188 22.433594 22.5 24 C 21.628906 25.15625 20.734375 25.6875 19.5 26.5 C 19.335938 26.828125 19.171875 27.160156 19 27.5 C 17.78125 27.3125 17.78125 27.3125 16.5 27 C 15.679688 25.355469 15.671875 23.769531 16 22 C 17.0625 19.945312 18.363281 18.136719 20 16.5 Z M 20 16.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(34.117648%,34.117648%,34.117648%);fill-opacity:1;" d="M 23 27.5 C 23 27.996094 23 28.488281 23 29 C 23.328125 29.164062 23.660156 29.328125 24 29.5 C 24.164062 29.996094 24.328125 30.488281 24.5 31 C 25.65625 30.175781 26.808594 29.351562 28 28.5 C 28 29.160156 28 29.820312 28 30.5 C 28.988281 30.003906 29.980469 29.511719 31 29 C 30.21875 30.898438 29.023438 32.078125 27.5625 33.5 C 27.128906 33.921875 26.699219 34.34375 26.253906 34.78125 C 25.839844 35.183594 25.425781 35.585938 25 36 C 24.332031 36.664062 23.664062 37.332031 23 38 C 23 37.011719 23 36.019531 23 35 C 22.503906 35.496094 22.011719 35.988281 21.5 36.5 C 21.582031 35.941406 21.664062 35.386719 21.75 34.8125 C 22.023438 33.140625 22.023438 33.140625 22 32 C 21.339844 32.824219 20.679688 33.648438 20 34.5 C 20.289062 31.535156 20.878906 29.621094 23 27.5 Z M 23 27.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(56.862748%,0.392157%,20.392157%);fill-opacity:1;" d="M 49.5 12.5 C 49.664062 12.828125 49.828125 13.160156 50 13.5 C 50.4375 13.496094 50.871094 13.488281 51.320312 13.484375 C 52.953125 13.46875 54.585938 13.453125 56.214844 13.445312 C 56.921875 13.441406 57.625 13.433594 58.328125 13.425781 C 59.34375 13.414062 60.359375 13.40625 61.375 13.402344 C 61.988281 13.398438 62.597656 13.390625 63.226562 13.386719 C 65.152344 13.507812 66.671875 13.90625 68.5 14.5 C 68.335938 14.996094 68.171875 15.488281 68 16 C 60.410156 16 52.820312 16 45 16 C 45.496094 15.671875 45.988281 15.339844 46.5 15 C 47.515625 14.183594 48.503906 13.339844 49.5 12.5 Z M 49.5 12.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(86.274511%,61.960787%,45.882353%);fill-opacity:1;" d="M 38 27.5 C 38.40625 28.542969 38.8125 29.582031 39.21875 30.625 C 39.558594 31.496094 39.558594 31.496094 39.90625 32.382812 C 40.351562 33.59375 40.730469 34.742188 41 36 C 40.671875 36 40.339844 36 40 36 C 39.246094 35.015625 39.246094 35.015625 38.4375 33.75 C 38.167969 33.332031 37.898438 32.914062 37.621094 32.484375 C 37.417969 32.160156 37.210938 31.835938 37 31.5 C 32.65625 33.488281 28.566406 37.792969 26.195312 41.886719 C 25.96875 42.253906 25.738281 42.621094 25.5 43 C 25.171875 43 24.839844 43 24.5 43 C 26.207031 38.355469 32.332031 27.5 38 27.5 Z M 38 27.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(41.176471%,12.54902%,19.215687%);fill-opacity:1;" d="M 43.5 73.5 C 46 73.5 46 73.5 47 74 C 47.144531 74.402344 47.289062 74.804688 47.4375 75.21875 C 47.886719 76.511719 47.886719 76.511719 48.8125 77.15625 C 51.972656 78.070312 54.339844 77.617188 57.15625 76.078125 C 58.1875 75.371094 58.78125 74.507812 59.5 73.5 C 60.160156 73.5 60.820312 73.5 61.5 73.5 C 62.886719 74.886719 62.699219 75.882812 62.8125 77.8125 C 62.851562 78.40625 62.886719 79 62.925781 79.613281 C 62.949219 80.070312 62.976562 80.527344 63 81 C 61.855469 79.28125 61.910156 78.507812 62 76.5 C 61.726562 76.644531 61.453125 76.789062 61.171875 76.9375 C 55.585938 79.550781 49.707031 78.992188 44 77 C 43.671875 77.660156 43.339844 78.320312 43 79 C 43.164062 77.183594 43.328125 75.371094 43.5 73.5 Z M 43.5 73.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(96.470588%,83.137256%,71.372551%);fill-opacity:1;" d="M 15.5 73.5 C 15.996094 73.5 16.488281 73.5 17 73.5 C 17.042969 75.136719 17.070312 76.769531 17.09375 78.40625 C 17.113281 79.101562 17.113281 79.101562 17.132812 79.8125 C 17.140625 80.484375 17.140625 80.484375 17.148438 81.164062 C 17.160156 81.78125 17.160156 81.78125 17.171875 82.410156 C 17 83.5 17 83.5 16.324219 84.402344 C 15.5 85 15.5 85 14.1875 84.875 C 13 84.5 13 84.5 12.5 84 C 12.320312 81.28125 12.28125 80.132812 14 78 C 14.179688 76.835938 14.347656 75.667969 14.5 74.5 C 14.828125 74.171875 15.160156 73.839844 15.5 73.5 Z M 15.5 73.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(4.313726%,1.176471%,2.352941%);fill-opacity:1;" d="M 87 17 C 87.164062 17 87.328125 17 87.5 17 C 87.691406 20.667969 87.691406 20.667969 86.65625 22.46875 C 85.78125 24.511719 86.324219 25.410156 87.113281 27.40625 C 87.605469 28.800781 87.613281 30.035156 87.5 31.5 C 87.792969 31.609375 88.082031 31.714844 88.382812 31.828125 C 90.964844 33.386719 92.496094 36.917969 93.390625 39.71875 C 93.851562 42.625 93.1875 45.335938 92 48 C 90.90625 49.3125 90.90625 49.3125 90 50 C 90 48.398438 90.222656 47.988281 91 46.65625 C 92.273438 44.175781 92.402344 42.199219 91.765625 39.5 C 90.890625 36.820312 89.371094 34.59375 87.5 32.5 C 87.171875 32.335938 86.839844 32.171875 86.5 32 C 86.234375 31 86.011719 29.984375 85.8125 28.96875 C 85.214844 26.179688 84.40625 23.972656 83 21.5 C 83.660156 21.664062 84.320312 21.828125 85 22 C 86.09375 20.332031 86.71875 18.980469 87 17 Z M 87 17 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(58.039218%,0.392157%,21.176471%);fill-opacity:1;" d="M 38 14 C 38.269531 14.164062 38.535156 14.328125 38.8125 14.5 C 40.269531 15.113281 41.433594 15.070312 43 15 C 43.164062 14.671875 43.328125 14.339844 43.5 14 C 43.996094 14.164062 44.488281 14.328125 45 14.5 C 45 14.828125 45 15.160156 45 15.5 C 41.527344 16.882812 38.058594 17.890625 34.433594 18.742188 C 31.988281 19.335938 29.738281 20.125 27.460938 21.179688 C 26.5 21.5 26.5 21.5 25 21 C 25 20.671875 25 20.339844 25 20 C 29.355469 17.746094 33.75 16.566406 38.5 15.5 C 38.335938 15.003906 38.171875 14.511719 38 14 Z M 38 14 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(9.019608%,8.627451%,6.666667%);fill-opacity:1;" d="M 65.859375 35.855469 C 68.542969 36.195312 70.523438 37.75 72.5 39.5 C 72.664062 39.828125 72.828125 40.160156 73 40.5 C 72.835938 40.828125 72.671875 41.160156 72.5 41.5 C 72.996094 41.664062 73.488281 41.828125 74 42 C 73.503906 42.246094 73.503906 42.246094 73 42.5 C 72.796875 43.660156 72.628906 44.828125 72.5 46 C 72.335938 46 72.171875 46 72 46 C 71.949219 45.664062 71.902344 45.324219 71.851562 44.976562 C 71.316406 41.519531 71.316406 41.519531 69 39 C 69 39.824219 69 40.648438 69 41.5 C 68.578125 41.085938 68.15625 40.675781 67.71875 40.25 C 66.335938 39.078125 66 39 64.0625 38.84375 C 63.546875 38.894531 63.03125 38.945312 62.5 39 C 62.5 38.671875 62.5 38.339844 62.5 38 C 62.003906 37.835938 61.511719 37.671875 61 37.5 C 62.496094 35.703125 63.628906 35.777344 65.859375 35.855469 Z M 65.859375 35.855469 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(32.549021%,7.843138%,16.078432%);fill-opacity:1;" d="M 52.71875 4.804688 C 53.78125 4.8125 54.84375 4.804688 55.90625 4.792969 C 60.457031 4.789062 65.351562 4.925781 69.5 7 C 69.664062 6.503906 69.828125 6.011719 70 5.5 C 70.105469 9.023438 70.039062 12.121094 69 15.5 C 68.835938 15.5 68.671875 15.5 68.5 15.5 C 68.785156 12.824219 69.105469 10.160156 69.5 7.5 C 68.574219 7.34375 67.644531 7.1875 66.71875 7.03125 C 66.203125 6.945312 65.6875 6.855469 65.152344 6.765625 C 64.335938 6.636719 64.335938 6.636719 63.5 6.5 C 62.996094 6.40625 62.496094 6.316406 61.976562 6.21875 C 59.035156 5.890625 56.113281 5.929688 53.15625 5.9375 C 52.542969 5.9375 51.933594 5.933594 51.304688 5.929688 C 50.71875 5.933594 50.128906 5.933594 49.523438 5.933594 C 48.726562 5.933594 48.726562 5.933594 47.910156 5.933594 C 46.675781 5.992188 45.660156 6.074219 44.5 6.5 C 44.335938 6.828125 44.171875 7.160156 44 7.5 C 43.671875 7.5 43.339844 7.5 43 7.5 C 43 7.171875 43 6.839844 43 6.5 C 42.402344 6.6875 41.808594 6.871094 41.191406 7.0625 C 40.410156 7.304688 39.628906 7.542969 38.84375 7.78125 C 38.449219 7.90625 38.058594 8.027344 37.652344 8.15625 C 37.273438 8.269531 36.894531 8.382812 36.503906 8.5 C 36.15625 8.609375 35.808594 8.714844 35.449219 8.828125 C 34.5 9 34.5 9 33 8.5 C 39.492188 5.890625 45.765625 4.742188 52.71875 4.804688 Z M 52.71875 4.804688 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(19.607843%,36.078432%,14.117648%);fill-opacity:1;" d="M 60.5 37 C 61.160156 37.328125 61.820312 37.660156 62.5 38 C 62.5 38.328125 62.5 38.660156 62.5 39 C 63.160156 39.164062 63.820312 39.328125 64.5 39.5 C 64.542969 41.421875 64.417969 43.125 64 45 C 64 45.328125 64 45.660156 64 46 C 63.671875 46 63.339844 46 63 46 C 62.503906 47.238281 62.503906 47.238281 62 48.5 C 61 47.5 61 47.5 60.867188 45.941406 C 60.871094 45.320312 60.871094 44.703125 60.875 44.0625 C 60.871094 43.136719 60.871094 43.136719 60.867188 42.191406 C 60.988281 40.640625 61.179688 39.792969 62 38.5 C 60.472656 39.367188 59.183594 40.191406 58 41.5 C 58.546875 39.613281 58.953125 38.265625 60.5 37 Z M 60.5 37 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(10.196079%,7.450981%,7.843138%);fill-opacity:1;" d="M 87.5 41.5 C 87.664062 41.5 87.828125 41.5 88 41.5 C 88.164062 43.480469 88.328125 45.460938 88.5 47.5 C 89.324219 47.335938 90.148438 47.171875 91 47 C 90.121094 50.339844 88.417969 54.054688 85.5 56 C 85.5 55.671875 85.5 55.339844 85.5 55 C 85.171875 54.671875 84.839844 54.339844 84.5 54 C 85.8125 51.5625 85.8125 51.5625 87.5 51 C 87.335938 50.503906 87.171875 50.011719 87 49.5 C 86.671875 49.5 86.339844 49.5 86 49.5 C 85.835938 48.179688 85.671875 46.859375 85.5 45.5 C 85.664062 45.828125 85.828125 46.160156 86 46.5 C 86.496094 46.5 86.988281 46.5 87.5 46.5 C 87.5 44.851562 87.5 43.199219 87.5 41.5 Z M 87.5 41.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(10.196079%,1.960784%,4.313726%);fill-opacity:1;" d="M 47 1.5 C 47.328125 1.828125 47.660156 2.160156 48 2.5 C 47.605469 2.550781 47.210938 2.597656 46.804688 2.648438 C 38.652344 3.769531 31.066406 6.214844 25 12 C 25 10.5 25 10.5 25.9375 9.34375 C 31.3125 4.328125 39.632812 0.984375 47 1.5 Z M 47 1.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(3.529412%,3.137255%,2.745098%);fill-opacity:1;" d="M 28 66.5 C 31.359375 67.59375 34.683594 68.777344 38 70 C 38 70.164062 38 70.328125 38 70.5 C 34.589844 70.285156 32.042969 69.546875 29 68 C 28.933594 71.796875 29.058594 75.546875 29.367188 79.328125 C 29.484375 80.90625 29.507812 81.980469 29 83.5 C 28.671875 83.335938 28.339844 83.171875 28 83 C 27.253906 72.128906 27.253906 72.128906 27.5 67 C 27.664062 66.835938 27.828125 66.671875 28 66.5 Z M 28 66.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(38.82353%,80.392158%,72.941178%);fill-opacity:1;" d="M 90 35 C 90.164062 35.496094 90.328125 35.988281 90.5 36.5 C 90.117188 36.601562 89.738281 36.707031 89.34375 36.8125 C 87.589844 37.710938 87.238281 38.714844 86.5 40.5 C 86.171875 40.5 85.839844 40.5 85.5 40.5 C 85.335938 41.984375 85.171875 43.46875 85 45 C 84.503906 45 84.011719 45 83.5 45 C 83.277344 41.507812 83.683594 39.023438 85.5 36 C 87.011719 34.34375 87.898438 34.359375 90 35 Z M 90 35 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(23.921569%,71.764708%,61.960787%);fill-opacity:1;" d="M 70 5.5 C 72.5 6.5 72.5 6.5 73 7 C 73.0625 7.875 73.089844 8.75 73.09375 9.625 C 73.101562 10.101562 73.109375 10.578125 73.117188 11.070312 C 72.996094 12.5625 72.625 13.648438 72 15 C 71.175781 15 70.351562 15 69.5 15 C 69.664062 11.863281 69.828125 8.730469 70 5.5 Z M 70 5.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(97.647059%,45.882353%,58.823532%);fill-opacity:1;" d="M 53.5 85 C 55.179688 87.136719 55.535156 88.210938 56 91 C 53.855469 91 51.710938 91 49.5 91 C 51.324219 85 51.324219 85 53.5 85 Z M 53.5 85 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(17.254902%,43.529412%,35.294119%);fill-opacity:1;" d="M 84 37.5 C 84.59375 39.285156 84.367188 40.253906 84.03125 42.09375 C 83.933594 42.636719 83.835938 43.179688 83.734375 43.742188 C 83.660156 44.15625 83.582031 44.570312 83.5 45 C 83.996094 45 84.488281 45 85 45 C 85.164062 43.515625 85.328125 42.03125 85.5 40.5 C 85.664062 40.5 85.828125 40.5 86 40.5 C 86 43.46875 86 46.441406 86 49.5 C 86.328125 49.5 86.660156 49.5 87 49.5 C 87.328125 50.160156 87.660156 50.820312 88 51.5 C 87.503906 51.5 87.011719 51.5 86.5 51.5 C 86.335938 51.171875 86.171875 50.839844 86 50.5 C 85.671875 50.828125 85.339844 51.160156 85 51.5 C 82.984375 49.082031 82.476562 46.378906 82.515625 43.261719 C 82.710938 41.191406 83.085938 39.386719 84 37.5 Z M 84 37.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(56.078434%,12.156863%,26.274511%);fill-opacity:1;" d="M 73.5 6.5 C 78.367188 8.289062 81.847656 10.214844 85.5 14 C 85.335938 14.328125 85.171875 14.660156 85 15 C 84.578125 14.730469 84.15625 14.457031 83.71875 14.179688 C 80.738281 12.289062 77.886719 10.5625 74.5 9.5 C 74.171875 9.828125 73.839844 10.160156 73.5 10.5 C 73.5 9.179688 73.5 7.859375 73.5 6.5 Z M 73.5 6.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(11.372549%,10.980392%,10.980392%);fill-opacity:1;" d="M 82 57.5 C 82 59.628906 81.023438 60.371094 79.628906 61.90625 C 75.699219 65.71875 71.050781 69.039062 65.5 69.5 C 66.117188 68.269531 66.554688 68.144531 67.78125 67.5625 C 68.261719 67.332031 68.261719 67.332031 68.75 67.097656 C 69.464844 66.757812 70.179688 66.417969 70.902344 66.09375 C 73.511719 64.875 75.550781 63.351562 77.6875 61.4375 C 77.996094 61.167969 78.300781 60.898438 78.617188 60.625 C 79.777344 59.605469 80.910156 58.589844 82 57.5 Z M 82 57.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(36.470589%,30.980393%,23.137255%);fill-opacity:1;" d="M 56.5 70 C 56.664062 70.496094 56.828125 70.988281 57 71.5 C 57.660156 71.5 58.320312 71.5 59 71.5 C 59 72.324219 59 73.148438 59 74 C 58.835938 73.503906 58.671875 73.011719 58.5 72.5 C 57.742188 72.746094 57.742188 72.746094 56.96875 73 C 55.117188 73.53125 53.429688 73.617188 51.5 73.5 C 51.335938 73.335938 51.171875 73.171875 51 73 C 50.328125 72.882812 49.648438 72.792969 48.96875 72.71875 C 48.320312 72.648438 47.667969 72.574219 47 72.5 C 50.136719 70.800781 52.917969 69.777344 56.5 70 Z M 56.5 70 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(10.980392%,10.196079%,8.627451%);fill-opacity:1;" d="M 35.5 48 C 39.183594 49.367188 39.183594 49.367188 40 51 C 39.671875 50.917969 39.339844 50.835938 39 50.75 C 35.335938 50.140625 32.847656 50.566406 29.5 52 C 29.003906 52.164062 28.511719 52.328125 28 52.5 C 28.101562 50.660156 28.390625 50.097656 29.78125 48.84375 C 31.777344 47.863281 33.335938 47.589844 35.5 48 Z M 35.5 48 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(15.294118%,12.54902%,9.803922%);fill-opacity:1;" d="M 68 75 C 70.625 77.851562 72.046875 81.113281 73.4375 84.6875 C 73.636719 85.1875 73.839844 85.683594 74.046875 86.195312 C 75.5 89.839844 75.5 89.839844 75.5 91 C 75.171875 91 74.839844 91 74.5 91 C 74.136719 90.125 73.769531 89.25 73.40625 88.375 C 73.175781 87.828125 72.949219 87.28125 72.710938 86.71875 C 72.253906 85.617188 71.808594 84.503906 71.382812 83.390625 C 69.691406 78.925781 69.691406 78.925781 66.5 75.5 C 65.011719 74.960938 63.570312 74.671875 62 74.5 C 61.835938 74.996094 61.671875 75.488281 61.5 76 C 61.5 75.339844 61.5 74.679688 61.5 74 C 64.222656 73.402344 65.691406 73.378906 68 75 Z M 68 75 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(56.078434%,0.392157%,20%);fill-opacity:1;" d="M 72.5 15 C 74.09375 15.621094 75.6875 16.246094 77.28125 16.875 C 77.730469 17.050781 78.175781 17.222656 78.636719 17.402344 C 81.230469 18.425781 83.617188 19.550781 86 21 C 85.503906 21.496094 85.011719 21.988281 84.5 22.5 C 84.015625 22.148438 83.53125 21.796875 83.03125 21.4375 C 79.667969 19.308594 75.753906 18.242188 72 17 C 72.164062 16.339844 72.328125 15.679688 72.5 15 Z M 72.5 15 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(76.078433%,54.901963%,40.392157%);fill-opacity:1;" d="M 54.5 25.5 C 55.601562 26.128906 56.707031 26.757812 57.84375 27.40625 C 58.1875 27.601562 58.527344 27.796875 58.882812 27.996094 C 59.757812 28.496094 60.628906 28.996094 61.5 29.5 C 61.988281 29.78125 62.476562 30.0625 62.976562 30.351562 C 63.316406 30.5625 63.652344 30.777344 64 31 C 64 31.164062 64 31.328125 64 31.5 C 60.398438 31.757812 58.125 30.167969 55 28.5 C 55 31.46875 55 34.441406 55 37.5 C 54.835938 37.5 54.671875 37.5 54.5 37.5 C 54.5 33.539062 54.5 29.578125 54.5 25.5 Z M 54.5 25.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(88.235295%,63.529414%,47.450981%);fill-opacity:1;" d="M 77 55 C 77.660156 55.660156 78.320312 56.320312 79 57 C 75.828125 60.8125 72.523438 64.328125 68 66.5 C 67.671875 66.335938 67.339844 66.171875 67 66 C 67.320312 65.757812 67.640625 65.515625 67.96875 65.265625 C 71.726562 62.355469 75.109375 59.445312 77 55 Z M 77 55 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(99.607843%,65.490198%,53.333336%);fill-opacity:1;" d="M 72.5 53 C 72.996094 53.328125 73.488281 53.660156 74 54 C 73.5 55.5 73.5 55.5 72.46875 56.09375 C 70.574219 56.617188 68.964844 56.566406 67 56.5 C 66.835938 55.839844 66.671875 55.179688 66.5 54.5 C 68.386719 52.613281 69.96875 52.800781 72.5 53 Z M 72.5 53 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(11.764706%,8.235294%,9.803922%);fill-opacity:1;" d="M 70.5 4.5 C 72.1875 4.773438 73.042969 5.027344 74.5 6 C 74.171875 6 73.839844 6 73.5 6 C 73.511719 6.464844 73.523438 6.929688 73.535156 7.40625 C 73.625 13.546875 73.625 13.546875 72.5 16.5 C 75.34375 17.757812 78.234375 18.839844 81.152344 19.90625 C 82.5 20.5 82.5 20.5 83 21.5 C 82.421875 21.304688 81.84375 21.109375 81.25 20.90625 C 77.683594 19.714844 74.089844 18.613281 70.5 17.5 C 70.828125 17.335938 71.160156 17.171875 71.5 17 C 72.707031 13.792969 72.613281 10.371094 72.5 7 C 71.839844 6.671875 71.179688 6.339844 70.5 6 C 70.5 5.503906 70.5 5.011719 70.5 4.5 Z M 70.5 4.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(92.156863%,76.862746%,92.941177%);fill-opacity:1;" d="M 43.5 7 C 44.984375 8.421875 45.476562 9.0625 45.59375 11.1875 C 45.507812 12.835938 45.371094 13.640625 44.5 15 C 42.074219 12.9375 42.074219 12.9375 41.75 11 C 42.023438 9.355469 42.542969 8.355469 43.5 7 Z M 43.5 7 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(52.941179%,52.549022%,52.156866%);fill-opacity:1;" d="M 86 50 C 86.5 51.5 86.5 51.5 86.160156 52.472656 C 84.109375 56.210938 81.164062 59.734375 77.5 62 C 77.003906 61.835938 76.511719 61.671875 76 61.5 C 76.46875 61.046875 76.941406 60.59375 77.425781 60.125 C 82.222656 55.480469 82.222656 55.480469 86 50 Z M 86 50 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(78.431374%,60.392159%,47.058824%);fill-opacity:1;" d="M 60.5 32 C 65.894531 31.066406 70.660156 34.101562 74.996094 37 C 76 38 76 38 76.132812 39.070312 C 75.921875 40.046875 75.710938 41.023438 75.5 42 C 74.839844 42 74.179688 42 73.5 42 C 73.996094 41.835938 74.488281 41.671875 75 41.5 C 74.851562 40.023438 74.742188 39.257812 74 38 C 72.28125 36.757812 70.414062 35.898438 68.5 35 C 67.800781 34.613281 67.105469 34.230469 66.40625 33.84375 C 64.453125 32.871094 62.664062 32.65625 60.5 32.5 C 60.5 32.335938 60.5 32.171875 60.5 32 Z M 60.5 32 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(80.392158%,58.823532%,85.09804%);fill-opacity:1;" d="M 51 8.5 C 50.875 11.074219 49.996094 12.449219 48.316406 14.292969 C 47.261719 15.207031 46.386719 15.460938 45 15.5 C 45.671875 12.746094 47.015625 10.957031 49 9 C 50 8.5 50 8.5 51 8.5 Z M 51 8.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(96.862745%,45.490196%,58.431375%);fill-opacity:1;" d="M 61 86 C 61 87.648438 61 89.300781 61 91 C 59.515625 91 58.03125 91 56.5 91 C 56.5 90.011719 56.5 89.019531 56.5 88 C 59.875 86 59.875 86 61 86 Z M 61 86 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(5.882353%,4.705882%,4.313726%);fill-opacity:1;" d="M 19.5 69 C 19.429688 70.886719 19.355469 72.769531 19.28125 74.65625 C 19.261719 75.191406 19.242188 75.726562 19.222656 76.28125 C 19.199219 76.792969 19.179688 77.308594 19.15625 77.835938 C 19.140625 78.3125 19.121094 78.785156 19.101562 79.273438 C 19 80.5 19 80.5 18.5 82 C 18.171875 82 17.839844 82 17.5 82 C 16.761719 80.523438 16.96875 79.128906 17 77.5 C 17.328125 77.5 17.660156 77.5 18 77.5 C 17.835938 77.171875 17.671875 76.839844 17.5 76.5 C 17.4375 74.96875 17.4375 74.96875 17.5 73.5 C 17.664062 73.335938 17.828125 73.171875 18 73 C 18.019531 71.832031 18.019531 70.667969 18 69.5 C 19 69 19 69 19.5 69 Z M 19.5 69 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(53.725493%,32.156864%,30.980393%);fill-opacity:1;" d="M 58.5 57.5 C 59 58 59 58 59.046875 58.921875 C 59.03125 59.949219 59.015625 60.976562 59 62 C 58.835938 62 58.671875 62 58.5 62 C 58.5 60.84375 58.5 59.691406 58.5 58.5 C 52.8125 58.671875 52.8125 58.671875 47.5 60.5 C 46.339844 61.910156 45.832031 63.214844 45.5 65 C 44.421875 63.382812 44.132812 62.878906 44.5 61 C 47.964844 57.632812 53.894531 57.191406 58.5 57.5 Z M 58.5 57.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(12.54902%,9.803922%,7.450981%);fill-opacity:1;" d="M 3.5 82 C 3.855469 82.542969 3.855469 82.542969 4.21875 83.09375 C 6.898438 86.035156 9.9375 87.082031 13.78125 87.703125 C 15.761719 88.1875 17.414062 89.226562 19 90.5 C 19 90.664062 19 90.828125 19 91 C 17.84375 91 16.691406 91 15.5 91 C 15.5 90.503906 15.5 90.011719 15.5 89.5 C 15.089844 89.425781 14.675781 89.351562 14.253906 89.277344 C 10.472656 88.519531 6.945312 87.613281 4 85 C 3.4375 83.3125 3.4375 83.3125 3.5 82 Z M 3.5 82 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(96.078432%,45.09804%,58.039218%);fill-opacity:1;" d="M 44.5 86 C 47.875 87.375 47.875 87.375 49 88.5 C 49.0625 89.8125 49.0625 89.8125 49 91 C 47.515625 91 46.03125 91 44.5 91 C 44.5 89.351562 44.5 87.699219 44.5 86 Z M 44.5 86 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(22.352941%,68.235296%,58.431375%);fill-opacity:1;" d="M 24 13.5 C 24 15.976562 24 18.449219 24 21 C 23.339844 21.164062 22.679688 21.328125 22 21.5 C 21.589844 20.050781 21.40625 18.722656 21.375 17.21875 C 21.363281 16.835938 21.347656 16.453125 21.335938 16.0625 C 21.585938 14.453125 22.324219 13.5 24 13.5 Z M 24 13.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(91.37255%,76.862746%,91.764706%);fill-opacity:1;" d="M 37 11 C 38.757812 11.035156 40.046875 11.101562 41.398438 12.296875 C 43.5 14.746094 43.5 14.746094 43.5 15.5 C 41.304688 15.714844 40.335938 15.761719 38.5625 14.375 C 37.621094 13.15625 37.167969 12.503906 37 11 Z M 37 11 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(47.058824%,12.54902%,22.352941%);fill-opacity:1;" d="M 59.5 73.5 C 60.160156 73.5 60.820312 73.5 61.5 73.5 C 62.886719 74.886719 62.699219 75.882812 62.8125 77.8125 C 62.851562 78.40625 62.886719 79 62.925781 79.613281 C 62.960938 80.300781 62.960938 80.300781 63 81 C 61.855469 79.28125 61.910156 78.507812 62 76.5 C 61.664062 76.648438 61.324219 76.800781 60.976562 76.953125 C 60.53125 77.144531 60.085938 77.335938 59.625 77.53125 C 59.183594 77.722656 58.742188 77.914062 58.289062 78.109375 C 56.921875 78.523438 56.316406 78.488281 55 78 C 55.464844 77.8125 55.929688 77.628906 56.40625 77.4375 C 58.261719 76.34375 58.714844 75.464844 59.5 73.5 Z M 59.5 73.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(96.078432%,60.784316%,74.901962%);fill-opacity:1;" d="M 90.5 36.5 C 89.992188 38.023438 89.613281 38.433594 88.5 39.5 C 87.824219 40.851562 87.832031 42.058594 87.71875 43.5625 C 87.675781 44.109375 87.636719 44.65625 87.59375 45.222656 C 87.5625 45.644531 87.53125 46.066406 87.5 46.5 C 87.003906 46.5 86.511719 46.5 86 46.5 C 85.777344 42.742188 85.679688 40.070312 88 37 C 89 36.5 89 36.5 90.5 36.5 Z M 90.5 36.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(5.490196%,6.27451%,4.313726%);fill-opacity:1;" d="M 85.5 31 C 85.5 31.164062 85.5 31.328125 85.5 31.5 C 84.890625 31.613281 84.28125 31.726562 83.65625 31.84375 C 81.445312 32.421875 80.394531 33.117188 79 35 C 78.070312 36.964844 77.261719 38.964844 76.5 41 C 76.335938 41 76.171875 41 76 41 C 75.929688 40.484375 75.855469 39.96875 75.78125 39.4375 C 75.144531 37.082031 73.445312 36.238281 71.449219 35.03125 C 71.136719 34.855469 70.824219 34.679688 70.5 34.5 C 70.828125 34.335938 71.160156 34.171875 71.5 34 C 72.691406 34.609375 72.691406 34.609375 74.0625 35.46875 C 74.515625 35.75 74.972656 36.035156 75.441406 36.328125 C 75.789062 36.546875 76.140625 36.769531 76.5 37 C 76.789062 36.539062 76.789062 36.539062 77.085938 36.070312 C 77.347656 35.664062 77.605469 35.261719 77.875 34.84375 C 78.128906 34.445312 78.386719 34.042969 78.648438 33.632812 C 80.71875 30.878906 82.238281 30.835938 85.5 31 Z M 85.5 31 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(94.509804%,92.54902%,89.803922%);fill-opacity:1;" d="M 60.5 42 C 60.519531 42.289062 60.539062 42.578125 60.554688 42.878906 C 60.769531 45.425781 61.230469 47.238281 62.5 49.5 C 62.5 49.828125 62.5 50.160156 62.5 50.5 C 61.625 50.46875 61.625 50.46875 60.5 50 C 58.800781 47.636719 58.089844 45.984375 58.5 43 C 60 42 60 42 60.5 42 Z M 60.5 42 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(99.607843%,63.921571%,52.156866%);fill-opacity:1;" d="M 29.082031 56.402344 C 29.691406 56.417969 29.691406 56.417969 30.3125 56.4375 C 30.925781 56.449219 30.925781 56.449219 31.550781 56.464844 C 32.019531 56.480469 32.019531 56.480469 32.5 56.5 C 32.664062 57.160156 32.828125 57.820312 33 58.5 C 31.183594 59.664062 29.613281 59.582031 27.5 59.5 C 27.1875 58.5625 27.1875 58.5625 27 57.5 C 28 56.5 28 56.5 29.082031 56.402344 Z M 29.082031 56.402344 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(85.882354%,62.7451%,47.058824%);fill-opacity:1;" d="M 39.5 83 C 39.828125 83.328125 40.160156 83.660156 40.5 84 C 38.664062 86.902344 36.527344 89.292969 33.5 91 C 32.109375 91.09375 32.109375 91.09375 31 91 C 32.335938 89.613281 33.707031 88.425781 35.25 87.28125 C 36.929688 85.996094 38.28125 84.75 39.5 83 Z M 39.5 83 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(49.803922%,49.019608%,49.019608%);fill-opacity:1;" d="M 75.5 61.5 C 75.996094 61.828125 76.488281 62.160156 77 62.5 C 70.519531 67.867188 70.519531 67.867188 66.5 67.5 C 66.5 67.171875 66.5 66.839844 66.5 66.5 C 68.449219 65.507812 70.375 64.535156 72.40625 63.71875 C 73.796875 63.09375 74.558594 62.664062 75.5 61.5 Z M 75.5 61.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(38.82353%,84.705883%,27.058825%);fill-opacity:1;" d="M 64.472656 46.902344 C 65.480469 46.933594 66.492188 46.96875 67.5 47 C 67.664062 47.660156 67.828125 48.320312 68 49 C 66.835938 50.078125 66.304688 50.507812 64.6875 50.46875 C 63.5 50 63.5 50 62.84375 48.96875 C 62.730469 48.648438 62.617188 48.328125 62.5 48 C 63.5 47 63.5 47 64.472656 46.902344 Z M 64.472656 46.902344 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(1.960784%,3.921569%,1.176471%);fill-opacity:1;" d="M 65 40.5 C 65.496094 40.664062 65.988281 40.828125 66.5 41 C 66.5 42.648438 66.5 44.300781 66.5 46 C 65.511719 46.496094 65.511719 46.496094 64.5 47 C 63.742188 45.796875 63.480469 45.183594 63.625 43.75 C 63.976562 42.574219 64.40625 41.570312 65 40.5 Z M 65 40.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(93.333334%,91.764706%,89.019608%);fill-opacity:1;" d="M 70 42 C 71.195312 43.246094 71.492188 43.949219 71.71875 45.6875 C 71.457031 47.839844 70.929688 48.878906 69.5 50.5 C 68.179688 50.832031 68.179688 50.832031 67 51 C 67.175781 50.726562 67.347656 50.453125 67.53125 50.171875 C 69.21875 47.375 69.84375 45.265625 70 42 Z M 70 42 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(33.725491%,34.117648%,33.725491%);fill-opacity:1;" d="M 42.5 27.5 C 43.832031 28.746094 44.777344 29.816406 45.5 31.5 C 45.664062 30.675781 45.828125 29.851562 46 29 C 47.25 30.085938 47.71875 30.863281 48 32.5 C 47.671875 32.5 47.339844 32.5 47 32.5 C 46.835938 32.828125 46.671875 33.160156 46.5 33.5 C 46.003906 33.335938 45.511719 33.171875 45 33 C 44.835938 32.671875 44.671875 32.339844 44.5 32 C 44.003906 32.742188 44.003906 32.742188 43.5 33.5 C 42.5 31.441406 42.410156 29.773438 42.5 27.5 Z M 42.5 27.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(12.941177%,10.588235%,9.019608%);fill-opacity:1;" d="M 39.5 73 C 40.160156 73.328125 40.820312 73.660156 41.5 74 C 41.054688 74.203125 40.609375 74.40625 40.152344 74.617188 C 38.523438 75.488281 37.394531 76.390625 36.0625 77.65625 C 33.972656 79.59375 31.820312 81.347656 29.5 83 C 29.5 82.503906 29.5 82.011719 29.5 81.5 C 30.546875 80.507812 30.546875 80.507812 31.96875 79.40625 C 34.5625 77.355469 37.050781 75.21875 39.5 73 Z M 39.5 73 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(17.254902%,2.745098%,7.450981%);fill-opacity:1;" d="M 21 14 C 21.328125 14 21.660156 14 22 14 C 21.835938 15.816406 21.671875 17.628906 21.5 19.5 C 21.335938 19.5 21.171875 19.5 21 19.5 C 21 18.511719 21 17.519531 21 16.5 C 18.261719 18.402344 16.832031 20.484375 16.125 23.71875 C 16.0625 24.351562 16.0625 24.351562 16 25 C 15.4375 24.1875 15.4375 24.1875 15 23 C 15.566406 20.703125 16.492188 19.179688 18 17.375 C 18.351562 16.949219 18.703125 16.523438 19.0625 16.085938 C 20 15 20 15 21 14 Z M 21 14 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(54.509807%,1.960784%,20.784314%);fill-opacity:1;" d="M 21.5 19.5 C 21.664062 19.5 21.828125 19.5 22 19.5 C 22.363281 21.019531 22.617188 22.433594 22.5 24 C 21.628906 25.15625 20.734375 25.6875 19.5 26.5 C 19.335938 26.828125 19.171875 27.160156 19 27.5 C 18.175781 27.335938 17.351562 27.171875 16.5 27 C 16.964844 26.433594 17.4375 25.875 17.90625 25.3125 C 18.167969 25 18.429688 24.6875 18.695312 24.363281 C 19.5 23.5 19.5 23.5 21 22.5 C 21.367188 20.96875 21.367188 20.96875 21.5 19.5 Z M 21.5 19.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(12.156863%,5.490196%,7.843138%);fill-opacity:1;" d="M 25 11.5 C 25 15.128906 25 18.761719 25 22.5 C 24.175781 22.664062 23.351562 22.828125 22.5 23 C 22.335938 22.503906 22.171875 22.011719 22 21.5 C 22.496094 21.335938 22.988281 21.171875 23.5 21 C 23.664062 18.523438 23.828125 16.050781 24 13.5 C 23.011719 13.996094 23.011719 13.996094 22 14.5 C 22 14.003906 22 13.511719 22 13 C 23.667969 11.5 23.667969 11.5 25 11.5 Z M 25 11.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(11.764706%,10.196079%,8.627451%);fill-opacity:1;" d="M 77.5 54.5 C 79.234375 55.488281 79.234375 55.488281 81 56.5 C 78.523438 59.644531 75.640625 62.035156 72.5 64.5 C 72.171875 64.335938 71.839844 64.171875 71.5 64 C 71.921875 63.625 72.34375 63.253906 72.777344 62.867188 C 73.332031 62.371094 73.882812 61.871094 74.4375 61.375 C 74.714844 61.128906 74.992188 60.882812 75.28125 60.632812 C 77.292969 58.847656 77.292969 58.847656 78.5 56.5 C 78.109375 55.382812 78.109375 55.382812 77.5 54.5 Z M 77.5 54.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(29.803923%,61.960787%,56.862748%);fill-opacity:1;" d="M 88.84375 34.6875 C 89.226562 34.789062 89.605469 34.894531 90 35 C 90.164062 35.496094 90.328125 35.988281 90.5 36.5 C 90.117188 36.601562 89.738281 36.707031 89.34375 36.8125 C 87.589844 37.710938 87.238281 38.714844 86.5 40.5 C 86 39 86 39 86.5 37.5 C 86.003906 37.5 85.511719 37.5 85 37.5 C 85.832031 35.582031 86.546875 34.367188 88.84375 34.6875 Z M 88.84375 34.6875 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(50.588238%,11.372549%,29.019609%);fill-opacity:1;" d="M 38 14 C 38.402344 14.246094 38.402344 14.246094 38.8125 14.5 C 40.269531 15.113281 41.433594 15.070312 43 15 C 43.164062 14.671875 43.328125 14.339844 43.5 14 C 43.996094 14.164062 44.488281 14.328125 45 14.5 C 45 14.828125 45 15.160156 45 15.5 C 39.238281 17.796875 39.238281 17.796875 36.5 17.5 C 37.160156 16.839844 37.820312 16.179688 38.5 15.5 C 38.335938 15.003906 38.171875 14.511719 38 14 Z M 38 14 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(10.588235%,9.411765%,9.019608%);fill-opacity:1;" d="M 66.5 70.5 C 64.683594 71.160156 62.871094 71.820312 61 72.5 C 62.320312 72.828125 63.640625 73.160156 65 73.5 C 65 73.664062 65 73.828125 65 74 C 63.019531 74 61.039062 74 59 74 C 59 73.175781 59 72.351562 59 71.5 C 59.871094 71.136719 60.746094 70.785156 61.625 70.4375 C 62.113281 70.242188 62.597656 70.042969 63.101562 69.839844 C 64.5 69.5 64.5 69.5 66.5 70.5 Z M 66.5 70.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(13.333334%,12.54902%,10.980392%);fill-opacity:1;" d="M 69 37.5 C 69.9375 37.90625 69.9375 37.90625 71 38.5 C 71.371094 38.6875 71.742188 38.871094 72.125 39.0625 C 72.414062 39.207031 72.703125 39.351562 73 39.5 C 72.835938 40.160156 72.671875 40.820312 72.5 41.5 C 72.996094 41.664062 73.488281 41.828125 74 42 C 73.503906 42.246094 73.503906 42.246094 73 42.5 C 72.796875 43.660156 72.628906 44.828125 72.5 46 C 72.335938 46 72.171875 46 72 46 C 71.925781 45.492188 71.925781 45.492188 71.851562 44.976562 C 71.398438 42.144531 71.398438 42.144531 70 39.6875 C 69.066406 38.90625 69.066406 38.90625 68 39 C 68 38.671875 68 38.339844 68 38 C 68.328125 37.835938 68.660156 37.671875 69 37.5 Z M 69 37.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(40.784314%,15.686275%,24.705882%);fill-opacity:1;" d="M 87.5 41.5 C 87.664062 41.5 87.828125 41.5 88 41.5 C 88.164062 43.480469 88.328125 45.460938 88.5 47.5 C 89.324219 47.335938 90.148438 47.171875 91 47 C 90.65625 47.96875 90.65625 47.96875 90 49 C 88.6875 49.375 88.6875 49.375 87.5 49.5 C 87.003906 48.675781 86.511719 47.851562 86 47 C 86.496094 46.835938 86.988281 46.671875 87.5 46.5 C 87.5 44.851562 87.5 43.199219 87.5 41.5 Z M 87.5 41.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(9.019608%,3.529412%,5.490196%);fill-opacity:1;" d="M 91.5 37.5 C 91.828125 37.5 92.160156 37.5 92.5 37.5 C 93.746094 40.941406 93.703125 43.777344 92.3125 47.1875 C 91.617188 48.308594 90.996094 49.144531 90 50 C 90 48.398438 90.222656 47.988281 91 46.65625 C 92.585938 43.605469 92.164062 40.824219 91.5 37.5 Z M 91.5 37.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(71.372551%,69.411767%,67.058825%);fill-opacity:1;" d="M 68.5 38.5 C 70.324219 39.460938 71.234375 40.082031 72 42 C 72.207031 43.179688 72.207031 43.179688 72.3125 44.375 C 72.351562 44.773438 72.386719 45.175781 72.425781 45.585938 C 72.460938 46.039062 72.460938 46.039062 72.5 46.5 C 72.171875 46.5 71.839844 46.5 71.5 46.5 C 71.171875 45.34375 70.839844 44.191406 70.5 43 C 70.335938 44.15625 70.171875 45.308594 70 46.5 C 69.835938 46.5 69.671875 46.5 69.5 46.5 C 69.46875 46.089844 69.4375 45.679688 69.40625 45.257812 C 69.367188 44.71875 69.324219 44.179688 69.28125 43.625 C 69.242188 43.089844 69.199219 42.558594 69.15625 42.007812 C 69.03125 40.78125 68.839844 39.679688 68.5 38.5 Z M 68.5 38.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(61.960787%,47.843137%,36.862746%);fill-opacity:1;" d="M 8.5 78 C 10.082031 78.683594 10.496094 78.996094 11.5 80.5 C 11.5625 81.84375 11.5625 81.84375 11.5 83 C 11.003906 83.164062 10.511719 83.328125 10 83.5 C 9.835938 83.996094 9.671875 84.488281 9.5 85 C 8.859375 85.03125 8.222656 85.0625 7.5625 85.09375 C 7.203125 85.109375 6.84375 85.128906 6.472656 85.148438 C 6.152344 85.097656 5.832031 85.050781 5.5 85 C 5.171875 84.503906 4.839844 84.011719 4.5 83.5 C 5.984375 83.664062 7.46875 83.828125 9 84 C 8.671875 83.175781 8.339844 82.351562 8 81.5 C 9.238281 81.996094 9.238281 81.996094 10.5 82.5 C 9.96875 80.796875 9.496094 79.492188 8.5 78 Z M 8.5 78 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(9.803922%,9.803922%,9.411765%);fill-opacity:1;" d="M 19 40.5 C 20.25 42.8125 21.5 45.351562 21.5 48 C 21.828125 48 22.160156 48 22.5 48 C 22.335938 48.660156 22.171875 49.320312 22 50 C 21.675781 49.363281 21.351562 48.730469 21.03125 48.09375 C 20.851562 47.738281 20.671875 47.386719 20.484375 47.023438 C 20 46 20 46 19.5 44.5 C 19.335938 45.820312 19.171875 47.140625 19 48.5 C 18.835938 48.5 18.671875 48.5 18.5 48.5 C 18.488281 47.355469 18.476562 46.207031 18.46875 45.0625 C 18.464844 44.425781 18.457031 43.785156 18.453125 43.128906 C 18.5 41.5 18.5 41.5 19 40.5 Z M 19 40.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(71.372551%,69.803923%,67.450982%);fill-opacity:1;" d="M 62 38.5 C 61.671875 41.140625 61.339844 43.78125 61 46.5 C 60.835938 46.5 60.671875 46.5 60.5 46.5 C 60.335938 45.179688 60.171875 43.859375 60 42.5 C 59.339844 42.335938 58.679688 42.171875 58 42 C 60.234375 38.5 60.234375 38.5 62 38.5 Z M 62 38.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(16.862746%,6.666667%,8.235294%);fill-opacity:1;" d="M 39.5 85 C 40.203125 86.410156 39.753906 87.085938 39.28125 88.5625 C 39.136719 89.015625 38.992188 89.472656 38.84375 89.941406 C 38.730469 90.289062 38.617188 90.640625 38.5 91 C 38.171875 91 37.839844 91 37.5 91 C 37.5 90.339844 37.5 89.679688 37.5 89 C 37.117188 89.339844 36.738281 89.679688 36.34375 90.03125 C 35.898438 90.351562 35.457031 90.671875 35 91 C 34.503906 90.835938 34.011719 90.671875 33.5 90.5 C 33.847656 90.230469 34.199219 89.957031 34.558594 89.679688 C 35.011719 89.320312 35.46875 88.960938 35.9375 88.59375 C 36.390625 88.238281 36.84375 87.886719 37.308594 87.523438 C 38.582031 86.523438 38.582031 86.523438 39.5 85 Z M 39.5 85 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(20%,14.901961%,12.941177%);fill-opacity:1;" d="M 68 75 C 68.921875 75.957031 69.695312 76.9375 70.5 78 C 70.171875 78.328125 69.839844 78.660156 69.5 79 C 69.335938 78.757812 69.171875 78.519531 69 78.269531 C 67.199219 75.878906 65.835938 75.058594 62.878906 74.59375 C 62.589844 74.5625 62.296875 74.53125 62 74.5 C 61.835938 74.996094 61.671875 75.488281 61.5 76 C 61.5 75.339844 61.5 74.679688 61.5 74 C 64.222656 73.402344 65.667969 73.425781 68 75 Z M 68 75 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(10.980392%,9.411765%,7.843138%);fill-opacity:1;" d="M 37.5 25.5 C 37.828125 25.5 38.160156 25.5 38.5 25.5 C 38.5 26.324219 38.5 27.148438 38.5 28 C 38.230469 28.035156 37.957031 28.070312 37.679688 28.109375 C 36.019531 28.660156 34.957031 29.65625 33.625 30.78125 C 33.132812 31.195312 32.640625 31.609375 32.132812 32.03125 C 31.757812 32.351562 31.386719 32.671875 31 33 C 31 32.503906 31 32.011719 31 31.5 C 31.96875 30.597656 32.925781 29.78125 33.96875 28.96875 C 36.015625 27.480469 36.015625 27.480469 37.5 25.5 Z M 37.5 25.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(31.764707%,32.156864%,31.764707%);fill-opacity:1;" d="M 23 27.5 C 23 27.996094 23 28.488281 23 29 C 23.328125 29.328125 23.660156 29.660156 24 30 C 23.519531 30.5625 23.042969 31.125 22.5625 31.6875 C 22.296875 32 22.027344 32.3125 21.753906 32.636719 C 21.191406 33.277344 20.601562 33.898438 20 34.5 C 20.289062 31.535156 20.878906 29.621094 23 27.5 Z M 23 27.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(1.568628%,3.921569%,3.529412%);fill-opacity:1;" d="M 86.5 28.5 C 86.828125 28.5 87.160156 28.5 87.5 28.5 C 87.5 29.488281 87.5 30.480469 87.5 31.5 C 87.800781 31.625 88.101562 31.746094 88.414062 31.875 C 89.675781 32.601562 90.152344 33.261719 90.90625 34.5 C 91.125 34.851562 91.347656 35.203125 91.570312 35.5625 C 91.714844 35.871094 91.855469 36.179688 92 36.5 C 91.835938 36.828125 91.671875 37.160156 91.5 37.5 C 91.222656 37.144531 90.945312 36.789062 90.660156 36.421875 C 90.296875 35.964844 89.9375 35.503906 89.5625 35.03125 C 89.203125 34.574219 88.84375 34.113281 88.472656 33.640625 C 87.601562 32.46875 87.601562 32.46875 86.5 32 C 86.5 30.84375 86.5 29.691406 86.5 28.5 Z M 86.5 28.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(27.450982%,70.588237%,14.117648%);fill-opacity:1;" d="M 67 43.5 C 67.660156 43.664062 68.320312 43.828125 69 44 C 69.0625 45.4375 69.0625 45.4375 69 47 C 68.671875 47.328125 68.339844 47.660156 68 48 C 66.761719 47.503906 66.761719 47.503906 65.5 47 C 65.996094 45.84375 66.488281 44.691406 67 43.5 Z M 67 43.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(7.058824%,19.607843%,2.745098%);fill-opacity:1;" d="M 62 37.5 C 66.523438 37.1875 66.523438 37.1875 68.34375 38.375 C 69 39.5 69 39.5 68.875 40.625 C 68.75 40.914062 68.628906 41.203125 68.5 41.5 C 68.160156 41.085938 67.820312 40.675781 67.46875 40.25 C 66.078125 38.820312 66.078125 38.820312 64.09375 38.84375 C 63.566406 38.894531 63.042969 38.945312 62.5 39 C 62.335938 38.503906 62.171875 38.011719 62 37.5 Z M 62 37.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(91.764706%,91.37255%,89.019608%);fill-opacity:1;" d="M 63 39 C 63.496094 39.164062 63.988281 39.328125 64.5 39.5 C 64.5 40.488281 64.5 41.480469 64.5 42.5 C 63.53125 42.78125 63.53125 42.78125 62.5 43 C 62 42.5 62 42.5 61.90625 41.28125 C 62 40 62 40 63 39 Z M 63 39 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(45.490196%,26.274511%,34.509805%);fill-opacity:1;" d="M 44.5 7.5 C 44.828125 7.5 45.160156 7.5 45.5 7.5 C 45.664062 8.324219 45.828125 9.148438 46 10 C 46.988281 9.671875 47.980469 9.339844 49 9 C 48.523438 9.730469 48.042969 10.460938 47.5625 11.1875 C 47.296875 11.59375 47.027344 12 46.753906 12.417969 C 46 13.5 46 13.5 45 14.5 C 45.003906 14.164062 45.011719 13.824219 45.015625 13.476562 C 45.023438 13.03125 45.027344 12.585938 45.03125 12.125 C 45.039062 11.464844 45.039062 11.464844 45.046875 10.789062 C 45.003906 9.609375 44.835938 8.628906 44.5 7.5 Z M 44.5 7.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(85.882354%,65.098041%,50.196081%);fill-opacity:1;" d="M 60.5 32 C 64.121094 31.734375 66.363281 32.140625 69.5 34 C 69.5 34.328125 69.5 34.660156 69.5 35 C 66.683594 34.105469 66.683594 34.105469 65.5625 33.46875 C 63.921875 32.746094 62.28125 32.679688 60.5 32.5 C 60.5 32.335938 60.5 32.171875 60.5 32 Z M 60.5 32 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(91.764706%,49.019608%,65.098041%);fill-opacity:1;" d="M 86 40.5 C 86.164062 40.996094 86.328125 41.488281 86.5 42 C 86.828125 42 87.160156 42 87.5 42 C 87.5 43.484375 87.5 44.96875 87.5 46.5 C 87.003906 46.5 86.511719 46.5 86 46.5 C 86 44.519531 86 42.539062 86 40.5 Z M 86 40.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(93.725491%,79.215688%,93.725491%);fill-opacity:1;" d="M 48.5 9.5 C 48.996094 9.664062 49.488281 9.828125 50 10 C 48.84375 11.484375 47.691406 12.96875 46.5 14.5 C 46 12.5 46 12.5 46.5 11.648438 C 47.164062 10.933594 47.832031 10.214844 48.5 9.5 Z M 48.5 9.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(5.882353%,1.568628%,3.137255%);fill-opacity:1;" d="M 64.5 2.5 C 66.660156 2.753906 68.515625 3.097656 70.5 4 C 70.5 4.496094 70.5 4.988281 70.5 5.5 C 68.992188 5.179688 67.742188 4.675781 66.375 3.96875 C 66.023438 3.789062 65.667969 3.605469 65.304688 3.421875 C 65.039062 3.28125 64.773438 3.144531 64.5 3 C 64.5 2.835938 64.5 2.671875 64.5 2.5 Z M 64.5 2.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(5.490196%,1.176471%,2.352941%);fill-opacity:1;" d="M 47 1.5 C 47.328125 1.828125 47.660156 2.160156 48 2.5 C 44.039062 2.746094 44.039062 2.746094 40 3 C 40.164062 2.671875 40.328125 2.339844 40.5 2 C 42.648438 1.347656 44.777344 1.390625 47 1.5 Z M 47 1.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(19.607843%,17.647059%,14.901961%);fill-opacity:1;" d="M 18.5 73 C 18.003906 73.246094 18.003906 73.246094 17.5 73.5 C 17.519531 74.117188 17.542969 74.738281 17.5625 75.375 C 17.625 77.226562 17.402344 77.695312 16.5 79.5 C 16.664062 77.519531 16.828125 75.539062 17 73.5 C 16.011719 73.996094 15.019531 74.488281 14 75 C 15.132812 72.492188 15.863281 72.375 18.5 73 Z M 18.5 73 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(7.058824%,5.490196%,4.705882%);fill-opacity:1;" d="M 30.5 67.5 C 32.976562 68.324219 35.449219 69.148438 38 70 C 38 70.164062 38 70.328125 38 70.5 C 35.023438 70.304688 32.6875 69.8125 30 68.5 C 30.164062 68.171875 30.328125 67.839844 30.5 67.5 Z M 30.5 67.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(58.431375%,57.647061%,57.647061%);fill-opacity:1;" d="M 86 50 C 86.5 51.5 86.5 51.5 85.6875 53.28125 C 84.777344 54.972656 83.800781 55.398438 82 56 C 82.570312 55.09375 83.144531 54.1875 83.71875 53.28125 C 84.199219 52.523438 84.199219 52.523438 84.6875 51.75 C 85.5 50.5 85.5 50.5 86 50 Z M 86 50 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(41.176471%,16.078432%,18.039216%);fill-opacity:1;" d="M 45.5 60 C 45.996094 60 46.488281 60 47 60 C 46.0625 63.875 46.0625 63.875 45.5 65 C 44.425781 63.386719 44.308594 62.867188 44.5 61 C 44.828125 60.671875 45.160156 60.339844 45.5 60 Z M 45.5 60 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(13.333334%,13.333334%,12.941177%);fill-opacity:1;" d="M 82 57.5 C 82 59.292969 81.667969 59.554688 80.5 60.84375 C 80.222656 61.15625 79.941406 61.464844 79.65625 61.789062 C 79.441406 62.023438 79.222656 62.257812 79 62.5 C 78.503906 62.171875 78.011719 61.839844 77.5 61.5 C 78.984375 60.179688 80.46875 58.859375 82 57.5 Z M 82 57.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(56.078434%,89.019608%,47.450981%);fill-opacity:1;" d="M 66.5 47 C 66.996094 47.164062 67.488281 47.328125 68 47.5 C 67.90625 48.4375 67.90625 48.4375 67.5 49.5 C 66.21875 50.125 66.21875 50.125 65 50.5 C 65 50.003906 65 49.511719 65 49 C 65.328125 49 65.660156 49 66 49 C 66.164062 48.339844 66.328125 47.679688 66.5 47 Z M 66.5 47 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(7.843138%,36.470589%,30.19608%);fill-opacity:1;" d="M 24 18.5 C 25.074219 20.113281 25.101562 20.640625 25 22.5 C 24.175781 22.664062 23.351562 22.828125 22.5 23 C 22.335938 22.503906 22.171875 22.011719 22 21.5 C 22.496094 21.335938 22.988281 21.171875 23.5 21 C 23.664062 20.175781 23.828125 19.351562 24 18.5 Z M 24 18.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(46.27451%,43.921569%,30.980393%);fill-opacity:1;" d="M 53 70 C 54.484375 70.246094 54.484375 70.246094 56 70.5 C 56 70.996094 56 71.488281 56 72 C 54.679688 72 53.359375 72 52 72 C 52.328125 71.339844 52.660156 70.679688 53 70 Z M 53 70 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(56.470591%,15.294118%,29.803923%);fill-opacity:1;" d="M 86 46.5 C 86.660156 46.664062 87.320312 46.828125 88 47 C 88 47.328125 88 47.660156 88 48 C 88.496094 48.164062 88.988281 48.328125 89.5 48.5 C 89.5 48.828125 89.5 49.160156 89.5 49.5 C 88.3125 49.40625 88.3125 49.40625 87 49 C 86.34375 47.71875 86.34375 47.71875 86 46.5 Z M 86 46.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(46.27451%,41.568628%,38.431373%);fill-opacity:1;" d="M 70.5 64.5 C 70.828125 64.664062 71.160156 64.828125 71.5 65 C 69 67.5 69 67.5 67.6875 67.5625 C 67.296875 67.542969 66.902344 67.519531 66.5 67.5 C 66.5 67.171875 66.5 66.839844 66.5 66.5 C 67.144531 66.152344 67.789062 65.808594 68.4375 65.46875 C 68.796875 65.277344 69.15625 65.085938 69.527344 64.890625 C 69.847656 64.761719 70.167969 64.632812 70.5 64.5 Z M 70.5 64.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(21.176471%,56.862748%,10.588235%);fill-opacity:1;" d="M 62.5 44.5 C 62.996094 44.664062 63.488281 44.828125 64 45 C 64 45.328125 64 45.660156 64 46 C 63.671875 46 63.339844 46 63 46 C 62.671875 46.824219 62.339844 47.648438 62 48.5 C 61.671875 48.003906 61.339844 47.511719 61 47 C 61.496094 46.175781 61.988281 45.351562 62.5 44.5 Z M 62.5 44.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(61.176473%,31.764707%,43.137255%);fill-opacity:1;" d="M 90.5 36.5 C 90.828125 36.664062 91.160156 36.828125 91.5 37 C 91.5 37.496094 91.5 37.988281 91.5 38.5 C 91.191406 38.519531 90.882812 38.542969 90.5625 38.5625 C 90.210938 38.707031 89.859375 38.851562 89.5 39 C 88.847656 40.5 88.847656 40.5 88.5 42 C 88.171875 41.835938 87.839844 41.671875 87.5 41.5 C 88.105469 39.285156 88.757812 38.007812 90.5 36.5 Z M 90.5 36.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(5.882353%,39.215687%,32.156864%);fill-opacity:1;" d="M 69 13.5 C 69.164062 13.996094 69.328125 14.488281 69.5 15 C 70.324219 15 71.148438 15 72 15 C 71.835938 15.496094 71.671875 15.988281 71.5 16.5 C 70.511719 16.5 69.519531 16.5 68.5 16.5 C 68.664062 15.511719 68.828125 14.519531 69 13.5 Z M 69 13.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(77.64706%,54.509807%,83.92157%);fill-opacity:1;" d="M 37 12 C 38.886719 12.90625 40.6875 13.953125 42.5 15 C 42.5 15.164062 42.5 15.328125 42.5 15.5 C 41.40625 15.65625 41.40625 15.65625 40 15.5 C 38.742188 14.414062 37.746094 13.496094 37 12 Z M 37 12 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(53.333336%,11.764706%,25.098041%);fill-opacity:1;" d="M 27.53125 10.96875 C 28.257812 10.984375 28.257812 10.984375 29 11 C 28.035156 12.011719 27.183594 12.710938 26 13.5 C 25.503906 13.5 25.011719 13.5 24.5 13.5 C 25.707031 11.007812 25.707031 11.007812 27.53125 10.96875 Z M 27.53125 10.96875 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(65.490198%,15.686275%,31.764707%);fill-opacity:1;" d="M 73.5 6.5 C 74.820312 6.996094 76.140625 7.488281 77.5 8 C 77.003906 8.742188 77.003906 8.742188 76.5 9.5 C 75.511719 9.171875 74.519531 8.839844 73.5 8.5 C 73.5 7.839844 73.5 7.179688 73.5 6.5 Z M 73.5 6.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(14.509805%,14.509805%,14.117648%);fill-opacity:1;" d="M 86 51.5 C 86.660156 51.5 87.320312 51.5 88 51.5 C 87.5 52.5 87 53.5 86.5 54.5 C 85.839844 54.335938 85.179688 54.171875 84.5 54 C 85.125 52.75 85.125 52.75 86 51.5 Z M 86 51.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(28.235295%,72.941178%,64.313728%);fill-opacity:1;" d="M 83.5 40.5 C 83.996094 41.242188 83.996094 41.242188 84.5 42 C 84.828125 42.164062 85.160156 42.328125 85.5 42.5 C 85.335938 43.324219 85.171875 44.148438 85 45 C 84.503906 45 84.011719 45 83.5 45 C 83.5 43.515625 83.5 42.03125 83.5 40.5 Z M 83.5 40.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(84.313726%,60.784316%,45.09804%);fill-opacity:1;" d="M 69.5 34 C 70.160156 34.296875 70.816406 34.597656 71.46875 34.90625 C 72.015625 35.160156 72.015625 35.160156 72.578125 35.414062 C 73.5 36 73.5 36 74 37.5 C 72.28125 36.878906 70.875 36.222656 69.5 35 C 69.5 34.671875 69.5 34.339844 69.5 34 Z M 69.5 34 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(41.960785%,9.803922%,24.705882%);fill-opacity:1;" d="M 49.5 12.5 C 49.664062 12.828125 49.828125 13.160156 50 13.5 C 49.1875 14.75 49.1875 14.75 48 16 C 46.375 16.15625 46.375 16.15625 45 16 C 45.496094 15.671875 45.988281 15.339844 46.5 15 C 47.515625 14.183594 48.503906 13.339844 49.5 12.5 Z M 49.5 12.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(12.156863%,8.235294%,6.27451%);fill-opacity:1;" d="M 38 87.5 C 37.378906 89.363281 36.613281 89.921875 35 91 C 34.503906 90.835938 34.011719 90.671875 33.5 90.5 C 34.058594 89.996094 34.621094 89.496094 35.1875 89 C 35.65625 88.582031 35.65625 88.582031 36.136719 88.15625 C 37 87.5 37 87.5 38 87.5 Z M 38 87.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(3.921569%,1.960784%,1.568628%);fill-opacity:1;" d="M 59 72 C 61.96875 72.742188 61.96875 72.742188 65 73.5 C 65 73.664062 65 73.828125 65 74 C 63.019531 74 61.039062 74 59 74 C 59 73.339844 59 72.679688 59 72 Z M 59 72 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(26.274511%,22.352941%,19.215687%);fill-opacity:1;" d="M 76 60 C 76.328125 60.328125 76.660156 60.660156 77 61 C 75.515625 62.15625 74.03125 63.308594 72.5 64.5 C 72.171875 64.335938 71.839844 64.171875 71.5 64 C 72.984375 62.679688 74.46875 61.359375 76 60 Z M 76 60 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(22.352941%,20.392157%,18.039216%);fill-opacity:1;" d="M 45.5 38.5 C 46.160156 38.828125 46.820312 39.160156 47.5 39.5 C 47.335938 40.160156 47.171875 40.820312 47 41.5 C 45.0625 40.625 45.0625 40.625 44.5 39.5 C 44.828125 39.5 45.160156 39.5 45.5 39.5 C 45.5 39.171875 45.5 38.839844 45.5 38.5 Z M 45.5 38.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(18.431373%,17.254902%,15.294118%);fill-opacity:1;" d="M 60.5 37 C 60.828125 37.164062 61.160156 37.328125 61.5 37.5 C 60.808594 39.570312 59.699219 40.160156 58 41.5 C 58.546875 39.613281 58.953125 38.265625 60.5 37 Z M 60.5 37 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(20.784314%,20.392157%,20%);fill-opacity:1;" d="M 23 66 C 23.164062 66 23.328125 66 23.5 66 C 23.664062 67.980469 23.828125 69.960938 24 72 C 23.503906 71.835938 23.011719 71.671875 22.5 71.5 C 22.664062 69.683594 22.828125 67.871094 23 66 Z M 23 66 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(5.490196%,5.490196%,5.098039%);fill-opacity:1;" d="M 84.5 54.5 C 84.996094 54.664062 85.488281 54.828125 86 55 C 85.5 56.5 85.5 56.5 83.96875 57.34375 C 83.484375 57.558594 83 57.777344 82.5 58 C 83.101562 56.753906 83.726562 55.660156 84.5 54.5 Z M 84.5 54.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(87.058824%,72.941178%,87.058824%);fill-opacity:1;" d="M 43.5 7 C 44.75 8.085938 45.21875 8.863281 45.5 10.5 C 45.003906 10.335938 44.511719 10.171875 44 10 C 43.835938 9.671875 43.671875 9.339844 43.5 9 C 43.003906 9.246094 43.003906 9.246094 42.5 9.5 C 42.828125 8.675781 43.160156 7.851562 43.5 7 Z M 43.5 7 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(21.568628%,18.82353%,15.294118%);fill-opacity:1;" d="M 39.5 73 C 40.160156 73.328125 40.820312 73.660156 41.5 74 C 38.6875 75.5 38.6875 75.5 37 75.5 C 37.824219 74.675781 38.648438 73.851562 39.5 73 Z M 39.5 73 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(40.392157%,7.450981%,18.431373%);fill-opacity:1;" d="M 80.5 11.5 C 82.296875 12.167969 83.878906 12.980469 85.5 14 C 85.335938 14.328125 85.171875 14.660156 85 15 C 84.25 14.511719 83.5 14.023438 82.75 13.53125 C 82.332031 13.257812 81.914062 12.984375 81.484375 12.703125 C 81.160156 12.472656 80.835938 12.238281 80.5 12 C 80.5 11.835938 80.5 11.671875 80.5 11.5 Z M 80.5 11.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(90.196079%,74.509805%,90.588236%);fill-opacity:1;" d="M 44 10 C 44.496094 10.164062 44.988281 10.328125 45.5 10.5 C 45.335938 11.820312 45.171875 13.140625 45 14.5 C 44.070312 12.933594 43.898438 11.8125 44 10 Z M 44 10 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(77.64706%,57.647061%,81.960785%);fill-opacity:1;" d="M 50 8.5 C 50.328125 8.5 50.660156 8.5 51 8.5 C 50.625 11.375 50.625 11.375 49.5 12.5 C 49 11 49 11 49 9.5 C 49.328125 9.171875 49.660156 8.839844 50 8.5 Z M 50 8.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(11.372549%,10.588235%,9.803922%);fill-opacity:1;" d="M 18.5 77 C 18.664062 77 18.828125 77 19 77 C 19.0625 80.3125 19.0625 80.3125 18.5 82 C 18.171875 82 17.839844 82 17.5 82 C 17.335938 81.339844 17.171875 80.679688 17 80 C 17.496094 80 17.988281 80 18.5 80 C 18.5 79.011719 18.5 78.019531 18.5 77 Z M 18.5 77 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(32.156864%,77.254903%,20%);fill-opacity:1;" d="M 66.5 45.5 C 67.160156 45.664062 67.820312 45.828125 68.5 46 C 68.335938 46.660156 68.171875 47.320312 68 48 C 66.761719 47.503906 66.761719 47.503906 65.5 47 C 65.828125 46.503906 66.160156 46.011719 66.5 45.5 Z M 66.5 45.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(4.313726%,3.921569%,3.137255%);fill-opacity:1;" d="M 61.5 36.5 C 62.820312 36.5 64.140625 36.5 65.5 36.5 C 64.320312 37.546875 63.726562 37.984375 62.125 38.09375 C 61.566406 38.046875 61.566406 38.046875 61 38 C 61.164062 37.503906 61.328125 37.011719 61.5 36.5 Z M 61.5 36.5 "/>
<path style=" stroke:none;fill-rule:nonzero;fill:rgb(37.254903%,63.137257%,59.215689%);fill-opacity:1;" d="M 86.5 36 C 86.828125 36.164062 87.160156 36.328125 87.5 36.5 C 87.125 39.375 87.125 39.375 86 40.5 C 86.164062 39.511719 86.328125 38.519531 86.5 37.5 C 86.171875 37.335938 85.839844 37.171875 85.5 37 C 85.828125 36.671875 86.160156 36.339844 86.5 36 Z M 86.5 36 "/>
</g>
</svg>

Size: 97 KiB

1
app/icons/fire.svg Normal file

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="32" height="32" viewBox="0 0 24 24"><path fill="currentColor" d="M12.832 21.801c3.126-.626 7.168-2.875 7.168-8.69c0-5.291-3.873-8.815-6.658-10.434c-.619-.36-1.342.113-1.342.828v1.828c0 1.442-.606 4.074-2.29 5.169c-.86.559-1.79-.278-1.894-1.298l-.086-.838c-.1-.974-1.092-1.565-1.87-.971C4.461 8.46 3 10.33 3 13.11C3 20.221 8.289 22 10.933 22q.232 0 .484-.015C10.111 21.874 8 21.064 8 18.444c0-2.05 1.495-3.435 2.631-4.11c.306-.18.663.055.663.41v.59c0 .45.175 1.155.59 1.637c.47.546 1.159-.026 1.214-.744c.018-.226.246-.37.442-.256c.641.375 1.46 1.175 1.46 2.473c0 2.048-1.129 2.99-2.168 3.357"/></svg>

Size: 648 B

11
app/icons/headphone.svg Normal file

@@ -0,0 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="16" height="16" viewBox="0 0 48 48" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M4 28C4 26.8954 4.89543 26 6 26H10V38H6C4.89543 38 4 37.1046 4 36V28Z" fill="none" />
<path d="M38 26H42C43.1046 26 44 26.8954 44 28V36C44 37.1046 43.1046 38 42 38H38V26Z"
fill="none" />
<path
d="M10 36V24C10 16.268 16.268 10 24 10C31.732 10 38 16.268 38 24V36M10 26H6C4.89543 26 4 26.8954 4 28V36C4 37.1046 4.89543 38 6 38H10V26ZM38 26H42C43.1046 26 44 26.8954 44 28V36C44 37.1046 43.1046 38 42 38H38V26Z"
stroke="#333" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" />
<path d="M16 32H20L22 26L26 38L28 32H32" stroke="#333" stroke-width="4" stroke-linecap="round"
stroke-linejoin="round" />
</svg>

Size: 808 B


@@ -0,0 +1,14 @@
<svg height="1em" style="flex:none;line-height:1" viewBox="0 0 30 30" width="1em" xmlns="http://www.w3.org/2000/svg">
<title>ChatGLM</title>
<rect width="30" height="30" fill="#E7F8FF" rx="6"/>
<g transform="translate(3, 3)">
<defs>
<linearGradient id="lobe-icons-chatglm-fill" x1="-18.756%" x2="70.894%" y1="49.371%" y2="90.944%">
<stop offset="0%" stop-color="#504AF4"></stop>
<stop offset="100%" stop-color="#3485FF"></stop>
</linearGradient>
</defs>
<path d="M9.917 2c4.906 0 10.178 3.947 8.93 10.58-.014.07-.037.14-.057.21l-.003-.277c-.083-3-1.534-8.934-8.87-8.934-3.393 0-8.137 3.054-7.93 8.158-.04 4.778 3.555 8.4 7.95 8.332l.073-.001c1.2-.033 2.763-.429 3.1-1.657.063-.031.26.534.268.598.048.256.112.369.192.34.981-.348 2.286-1.222 1.952-2.38-.176-.61-1.775-.147-1.921-.347.418-.979 2.234-.926 3.153-.716.443.102.657.38 1.012.442.29.052.981-.2.96.242-1.5 3.042-4.893 5.41-8.808 5.41C3.654 22 0 16.574 0 11.737 0 5.947 4.959 2 9.917 2zM9.9 5.3c.484 0 1.125.225 1.38.585 3.669.145 4.313 2.686 4.694 5.444.255 1.838.315 2.3.182 1.387l.083.59c.068.448.554.737.982.516.144-.075.254-.231.328-.47a.2.2 0 01.258-.13l.625.22a.2.2 0 01.124.238 2.172 2.172 0 01-.51.92c-.878.917-2.757.664-3.08-.62-.14-.554-.055-.626-.345-1.242-.292-.621-1.238-.709-1.69-.295-.345.315-.407.805-.406 1.282L12.6 15.9a.9.9 0 01-.9.9h-1.4a.9.9 0 01-.9-.9v-.65a1.15 1.15 0 10-2.3 0v.65a.9.9 0 01-.9.9H4.8a.9.9 0 01-.9-.9l.035-3.239c.012-1.884.356-3.658 2.47-4.134.2-.045.252.13.29.342.025.154.043.252.053.294.701 3.058 1.75 4.299 3.144 3.722l.66-.331.254-.13c.158-.082.25-.131.276-.15.012-.01-.165-.206-.407-.464l-1.012-1.067a8.925 8.925 0 01-.199-.216c-.047-.034-.116.068-.208.306-.074.157-.251.252-.272.326-.013.058.108.298.362.72.164.288.22.508-.31.343-1.04-.8-1.518-2.273-1.684-3.725-.004-.035-.162-1.913-.162-1.913a1.2 1.2 0 011.113-1.281L9.9 5.3zm12.994 8.68c.037.697-.403.704-1.213.591l-1.783-.276c-.265-.053-.385-.099-.313-.147.47-.315 3.268-.93 3.31-.168zm-.915-.083l-.926.042c-.85.077-1.452.24.338.336l.103.003c.815.012 1.264-.359.485-.381zm1.667-3.601h.01c.79.398.067 1.03-.65 1.393-.14.07-.491.176-1.052.315-.241.04-.457.092-.333.16l.01.005c1.952.958-3.123 1.534-2.495 1.285l.38-.148c.68-.266 1.614-.682 1.666-1.337.038-.48 1.253-.442 1.493-.968.048-.106 0-.236-.144-.389-.05-.047-.094-.094-.107-.148-.073-.305.7-.431 1.222-.168zm-2.568-.474c-.135 1.198-2.479 4.192-1.949 2.863l.017-.042c.298-.717.376-2.221 1.337-3.221.25-.26.636.035.595.4zm-7.976-.253c.02-.694 
1.002-.968 1.346-.347.01-1.274-1.941-.768-1.346.347z"
fill="url(#lobe-icons-chatglm-fill)" fill-rule="evenodd"></path>
</g>
</svg>

Size: 2.6 KiB

Some files were not shown because too many files have changed in this diff.