Compare commits


36 Commits

Author SHA1 Message Date
Soulter 30862bb82f perf: improve update speed and the update workflow 2024-02-06 19:18:53 +08:00
Soulter 6c0bda8feb Update README.md 2024-02-06 18:30:56 +08:00
Soulter e14dece206 perf: improve the project update logic 2024-02-06 17:45:02 +08:00
Soulter 680593d636 fix: fix the web command prefix not taking effect 2024-02-06 15:42:29 +08:00
Soulter 144440214f fix: fix an error in image drawing 2024-02-06 12:56:41 +08:00
Soulter 6667b58a3f fix: fix several issues in the container environment 2024-02-06 12:48:57 +08:00
Soulter b55d9533be chore: clean up some unused code 2024-02-05 23:46:46 +08:00
Soulter 3484fc60e6 fix: dashboard fake dead 2024-02-05 14:51:19 +08:00
Soulter eac0265522 fix: fix an error in channel DMs with isolated sessions 2024-02-05 14:45:32 +08:00
Soulter ac74431633 fix: fix the console not displaying properly when accessing the dashboard remotely 2024-02-05 14:12:38 +08:00
Soulter 4c098200be fix: fix a ws server error in the Docker environment 2024-02-05 13:54:45 +08:00
Soulter 2cf18972f3 fix: fix an error when saving config from the dashboard; fix a channel DM error; perf: improve logging 2024-02-05 13:18:34 +08:00
Soulter d522d2a6a9 Merge remote-tracking branch 'refs/remotes/origin/master' 2024-02-04 21:29:58 +08:00
Soulter 7079ce096f feat: support log display in the dashboard; chore: reduce some log output 2024-02-04 21:28:03 +08:00
Soulter 5e8c5067b1 Update README.md 2024-01-16 00:32:07 +08:00
Soulter 570ff4e8b6 perf: improve Bing web search 2024-01-10 16:48:46 +08:00
Soulter e2f1362a1f fix: fix the myid command being unavailable on the gocq platform 2024-01-09 22:25:52 +08:00
Soulter 3519e38211 perf: remove the default prompt 2024-01-07 14:51:43 +08:00
Soulter 08734250f7 feat: support text-to-image rendering in channels 2024-01-05 18:59:02 +08:00
Soulter e8407f6449 feat: add reverse-engineered LLM service settings to the dashboard 2024-01-05 17:13:52 +08:00
Soulter 04f3400f83 perf: improve the plugin collection process 2024-01-03 20:19:31 +08:00
Soulter 89c8b3e7fc fix: fix an error when @-mentioning the bot under gocq 2024-01-03 16:32:08 +08:00
Soulter 66294100ec fix: typo fix 2024-01-03 16:26:58 +08:00
Soulter 8ed8a23c8b fix: fix some message-response issues under gocq 2024-01-03 16:15:37 +08:00
Soulter 449b0b03b5 fix: fix the nick_qq error 2024-01-03 16:00:51 +08:00
Soulter d93754bf1d Update cmd_config.py 2024-01-03 15:46:11 +08:00
Soulter a007a61ecc Update docker-image.yml 2024-01-02 16:44:58 +08:00
Soulter e481377317 fix: fix some issues with update 2024-01-01 12:46:22 +08:00
Soulter 4c5831c7b4 remove: remove the simhei font asset 2024-01-01 12:03:21 +08:00
Soulter fc54b5237f feat: support configuring LLM wake words 2024-01-01 11:48:55 +08:00
Soulter f8f42678d1 fix: fix message send() not working properly 2024-01-01 11:34:56 +08:00
Soulter 38b1f4128c Merge pull request #145 from Soulter/dev_platform_refact (refactor the platform-related message code) 2024-01-01 11:07:13 +08:00
Soulter 04fb4f88ad feat: refactor code 2023-12-30 20:08:28 +08:00
Soulter 4675f5df08 Create stale.yml 2023-12-28 14:12:25 +08:00
Soulter 34ee358d40 Update README.md 2023-12-28 14:01:53 +08:00
Soulter c4cfd1a3e2 Update README.md 2023-12-28 13:18:47 +08:00
60 changed files with 1589 additions and 1820 deletions


@@ -4,7 +4,6 @@ on:
   push:
     branches:
       - master
-      - dev_dashboard
     paths-ignore:
       - '**/*.md'
   workflow_dispatch:
@@ -18,8 +17,8 @@ jobs:
       uses: actions/checkout@v2
     - name: Build image
       run: |
-        docker build -t ${{ secrets.DOCKER_HUB_USERNAME }}/astrbot:v1 .
+        docker build -t ${{ secrets.DOCKER_HUB_USERNAME }}/astrbot:latest .
     - name: Publish image
       run: |
         docker login -u ${{ secrets.DOCKER_HUB_USERNAME }} -p ${{ secrets.DOCKER_HUB_PASSWORD }}
-        docker push ${{ secrets.DOCKER_HUB_USERNAME }}/astrbot:v1
+        docker push ${{ secrets.DOCKER_HUB_USERNAME }}/astrbot:latest
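The hunk above retags the published image from `astrbot:v1` to `astrbot:latest`. As a minimal sketch of how the `${{ secrets.DOCKER_HUB_USERNAME }}` expression expands into a full image reference (the username `soulter` is only an illustrative placeholder, not the actual secret value):

```python
# Sketch: expansion of the workflow's templated image reference.
# "soulter" is an illustrative placeholder for the DOCKER_HUB_USERNAME secret.
def image_ref(username: str, tag: str, repo: str = "astrbot") -> str:
    """Compose the fully qualified Docker Hub image reference."""
    return f"{username}/{repo}:{tag}"

old = image_ref("soulter", "v1")      # what the workflow pushed before this change
new = image_ref("soulter", "latest")  # what it pushes after
print(old, "->", new)  # soulter/astrbot:v1 -> soulter/astrbot:latest
```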

.github/workflows/stale.yml vendored Normal file (+27 lines)

@@ -0,0 +1,27 @@
+# This workflow warns and then closes issues and PRs that have had no activity for a specified amount of time.
+#
+# You can adjust the behavior by modifying this file.
+# For more information, see:
+# https://github.com/actions/stale
+name: Mark stale issues and pull requests
+on:
+  schedule:
+    - cron: '21 23 * * *'
+jobs:
+  stale:
+    runs-on: ubuntu-latest
+    permissions:
+      issues: write
+      pull-requests: write
+    steps:
+    - uses: actions/stale@v5
+      with:
+        repo-token: ${{ secrets.GITHUB_TOKEN }}
+        stale-issue-message: 'Stale issue message'
+        stale-pr-message: 'Stale pull request message'
+        stale-issue-label: 'no-issue-activity'
+        stale-pr-label: 'no-pr-activity'
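The `cron: '21 23 * * *'` schedule above fires the stale job once per day at 23:21 UTC (GitHub Actions evaluates cron schedules in UTC). A minimal sketch of decoding the five cron fields, not a claim about how Actions parses them internally:

```python
# Sketch: split the stale workflow's cron expression into its five fields
# (minute, hour, day-of-month, month, day-of-week).
def parse_cron(expr: str) -> dict:
    minute, hour, dom, month, dow = expr.split()
    return {"minute": minute, "hour": hour, "day_of_month": dom,
            "month": month, "day_of_week": dow}

fields = parse_cron("21 23 * * *")
# '*' in the remaining fields means "every", so this runs daily.
print(f"runs daily at {fields['hour']}:{fields['minute']} UTC")  # runs daily at 23:21 UTC
```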

.gitignore vendored (+2 lines)

@@ -6,3 +6,5 @@ configs/session
 configs/config.yaml
 **/.DS_Store
 temp
+cmd_config.json
+addons/plugins/
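The two new entries keep the generated `cmd_config.json` and locally installed plugins under `addons/plugins/` out of version control. A rough illustration using Python's `fnmatch` (real gitignore semantics, e.g. trailing-slash directory matches and negation, are richer than this):

```python
import fnmatch

# Rough stand-ins for the two new .gitignore entries; the trailing-slash
# directory pattern is approximated as "addons/plugins/*" here.
IGNORE_PATTERNS = ["cmd_config.json", "addons/plugins/*"]

def is_ignored(path: str) -> bool:
    """Return True if the path matches any ignore pattern."""
    return any(fnmatch.fnmatch(path, pat) for pat in IGNORE_PATTERNS)

for p in ["cmd_config.json", "addons/plugins/llms", "main.py"]:
    print(p, "->", is_ignored(p))
```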


@@ -6,36 +6,31 @@
# AstrBot
*✨ 2024 - Hoping to become a cross-platform, easy-to-pick-up, stable and secure bot project. ✨*
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/Soulter/AstrBot)](https://github.com/Soulter/AstrBot/releases/latest)
<img src="https://wakatime.com/badge/user/915e5316-99c6-4563-a483-ef186cf000c9/project/34412545-2e37-400f-bedc-42348713ac1f.svg" alt="wakatime">
<img src="https://img.shields.io/badge/python-3.9+-blue.svg" alt="python">
<a href="https://hub.docker.com/r/soulter/astrbot"><img alt="Docker pull" src="https://img.shields.io/docker/pulls/soulter/astrbot.svg"/></a>
<a href="https://qm.qq.com/cgi-bin/qm/qr?k=EYGsuUTfe00_iOu9JTXS7_TEpMkXOvwv&jump_from=webapi&authKey=uUEMKCROfsseS+8IzqPjzV3y1tzy4AkykwTib2jNkOFdzezF9s9XknqnIaf3CDft">
<img alt="Static Badge" src="https://img.shields.io/badge/QQ群-322154837-purple">
</a>
<img alt="Static Badge" src="https://img.shields.io/badge/频道-x42d56aki2-purple">
<a href="https://astrbot.soulter.top/center">Project homepage (under development)</a>
<a href="https://github.com/Soulter/QQChannelChatGPT/wiki">Deployment docs</a>
<a href="https://astrbot.soulter.top/center">Deployment</a>
<a href="https://github.com/Soulter/QQChannelChatGPT/issues">Report issues</a>
<a href="https://astrbot.soulter.top/center/docs/%E5%BC%80%E5%8F%91/%E6%8F%92%E4%BB%B6%E5%BC%80%E5%8F%91">Plugin development (as few as 25 lines, it's really not hard!)</a>
<a href="https://astrbot.soulter.top/center/docs/%E5%BC%80%E5%8F%91/%E6%8F%92%E4%BB%B6%E5%BC%80%E5%8F%91">Plugin development (as few as 25 lines)</a>
</div>
## 🤔 What you may want to know
- **How to deploy?** [Help docs](https://github.com/Soulter/QQChannelChatGPT/wiki) (if deployment fails, feel free to join the QQ group and ask for help <3)
- **How to deploy?** [Help docs](https://astrbot.soulter.top/center/docs/%E9%83%A8%E7%BD%B2/%E9%80%9A%E8%BF%87Docker%E9%83%A8%E7%BD%B2) (if deployment fails, feel free to join the QQ group and ask for help <3)
- **go-cqhttp fails to start with a login error** [Search for solutions here](https://github.com/Mrs4s/go-cqhttp/issues)
- **The program crashes / the bot fails to start** [File an issue or report in the group](https://github.com/Soulter/QQChannelChatGPT/issues)
- **How to enable ChatGPT, Bard, Claude and other language models** [See the help](https://github.com/Soulter/QQChannelChatGPT/wiki/%E8%A1%A5%E5%85%85%EF%BC%9A%E5%A6%82%E4%BD%95%E5%BC%80%E5%90%AFChatGPT%E3%80%81Bard%E3%80%81Claude%E7%AD%89%E8%AF%AD%E8%A8%80%E6%A8%A1%E5%9E%8B%EF%BC%9F)
- **How to enable ChatGPT, Claude, HuggingChat and other language models** [See the help](https://astrbot.soulter.top/center/docs/%E4%BD%BF%E7%94%A8/%E5%A4%A7%E8%AF%AD%E8%A8%80%E6%A8%A1%E5%9E%8B)
## 🧩 Features
Recent features:
1. Branch switching is supported: run `/update checkout <branch>` to switch the code branch
2. The dashboard is in testing: run `/update checkout dev_dashboard` and follow the prompts to try it
1. Dashboard
2. One-click Docker deployment of the project: [link](https://astrbot.soulter.top/center/docs/%E9%83%A8%E7%BD%B2/%E9%80%9A%E8%BF%87Docker%E9%83%A8%E7%BD%B2)
🌍 Supported AI language models at a glance
@@ -48,10 +43,9 @@
- HuggingChat (free, supported by the [LLMs plugin](https://github.com/Soulter/llms))
**Image generation**
- OpenAI Dalle API
- NovelAI/Naifu (free, supported by the [AIDraw plugin](https://github.com/Soulter/aidraw))
🌍 Bot capabilities at a glance
- Dashboard (beta)
- Deploy the bot to QQ and QQ Channels simultaneously
@@ -129,8 +123,6 @@
> Use `plugin i <plugin GitHub link>` to install a plugin.
Plugin development tutorial: https://github.com/Soulter/QQChannelChatGPT/wiki/%E5%9B%9B%E3%80%81%E5%BC%80%E5%8F%91%E6%8F%92%E4%BB%B6
Some plugins:
- `LLMS`: https://github.com/Soulter/llms | Claude and HuggingChat language-model integration.
@@ -162,15 +154,13 @@
- `/key` Add keys dynamically
- `/set` Persona settings panel
- `/keyword nihao 你好` Set a keyword reply: nihao -> 你好
- `/bing` Switch to Bing
- `/revgpt` Switch to the reverse-engineered ChatGPT library
- `/画` Draw
#### Reverse-engineered ChatGPT library language model
- `/gpt` Switch to the official OpenAI API
- `/bing` Switch to Bing
* Model-switch commands also support one-off replies, e.g. `/bing 你好` will use the Bing model just once. -->
* Model-switch commands also support one-off replies, e.g. `/a 你好` will use the Bing model just once. -->
<!--
## 🙇 Thanks


@@ -1 +1 @@
import{x as i,o as l,c as _,w as s,a as e,f as a,J as m,V as r,b as t,t as u,ab as p,B as n,ac as o,j as f}from"./index-7c8bc001.js";const b={class:"text-h3"},h={class:"d-flex align-center"},g={class:"d-flex align-center"},V=i({__name:"BaseBreadcrumb",props:{title:String,breadcrumbs:Array,icon:String},setup(d){const c=d;return(x,B)=>(l(),_(r,{class:"page-breadcrumb mb-1 mt-1"},{default:s(()=>[e(a,{cols:"12",md:"12"},{default:s(()=>[e(m,{variant:"outlined",elevation:"0",class:"px-4 py-3 withbg"},{default:s(()=>[e(r,{"no-gutters":"",class:"align-center"},{default:s(()=>[e(a,{md:"5"},{default:s(()=>[t("h3",b,u(c.title),1)]),_:1}),e(a,{md:"7",sm:"12",cols:"12"},{default:s(()=>[e(p,{items:c.breadcrumbs,class:"text-h5 justify-md-end pa-1"},{divider:s(()=>[t("div",h,[e(n(o),{size:"17"})])]),prepend:s(()=>[e(f,{size:"small",icon:"mdi-home",class:"text-secondary mr-2"}),t("div",g,[e(n(o),{size:"17"})])]),_:1},8,["items"])]),_:1})]),_:1})]),_:1})]),_:1})]),_:1}))}});export{V as _};
import{x as i,o as l,c as _,w as s,a as e,f as a,J as m,V as r,b as t,t as u,ab as p,B as n,ac as o,j as f}from"./index-70ea897e.js";const b={class:"text-h3"},h={class:"d-flex align-center"},g={class:"d-flex align-center"},V=i({__name:"BaseBreadcrumb",props:{title:String,breadcrumbs:Array,icon:String},setup(d){const c=d;return(x,B)=>(l(),_(r,{class:"page-breadcrumb mb-1 mt-1"},{default:s(()=>[e(a,{cols:"12",md:"12"},{default:s(()=>[e(m,{variant:"outlined",elevation:"0",class:"px-4 py-3 withbg"},{default:s(()=>[e(r,{"no-gutters":"",class:"align-center"},{default:s(()=>[e(a,{md:"5"},{default:s(()=>[t("h3",b,u(c.title),1)]),_:1}),e(a,{md:"7",sm:"12",cols:"12"},{default:s(()=>[e(p,{items:c.breadcrumbs,class:"text-h5 justify-md-end pa-1"},{divider:s(()=>[t("div",h,[e(n(o),{size:"17"})])]),prepend:s(()=>[e(f,{size:"small",icon:"mdi-home",class:"text-secondary mr-2"}),t("div",g,[e(n(o),{size:"17"})])]),_:1},8,["items"])]),_:1})]),_:1})]),_:1})]),_:1})]),_:1}))}});export{V as _};


@@ -1 +1 @@
import{x as e,o as a,c as t,w as o,a as s,B as n,X as r,T as c}from"./index-7c8bc001.js";const f=e({__name:"BlankLayout",setup(p){return(u,_)=>(a(),t(c,null,{default:o(()=>[s(n(r))]),_:1}))}});export{f as default};
import{x as e,o as a,c as t,w as o,a as s,B as n,X as r,T as c}from"./index-70ea897e.js";const f=e({__name:"BlankLayout",setup(p){return(u,_)=>(a(),t(c,null,{default:o(()=>[s(n(r))]),_:1}))}});export{f as default};


@@ -1 +1 @@
import{_ as m}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-89ca5198.js";import{_}from"./UiParentCard.vue_vue_type_script_setup_true_lang-03a5c441.js";import{x as p,D as a,o as r,s,a as e,w as t,f as o,V as i,F as n,u as g,c as h,_ as b,e as x,t as y}from"./index-7c8bc001.js";const P=p({__name:"ColorPage",setup(C){const c=a({title:"Colors Page"}),d=a([{title:"Utilities",disabled:!1,href:"#"},{title:"Colors",disabled:!0,href:"#"}]),u=a(["primary","lightprimary","secondary","lightsecondary","info","success","accent","warning","error","darkText","lightText","borderLight","inputBorder","containerBg"]);return(V,k)=>(r(),s(n,null,[e(m,{title:c.value.title,breadcrumbs:d.value},null,8,["title","breadcrumbs"]),e(i,null,{default:t(()=>[e(o,{cols:"12",md:"12"},{default:t(()=>[e(_,{title:"Color Palette"},{default:t(()=>[e(i,null,{default:t(()=>[(r(!0),s(n,null,g(u.value,(l,f)=>(r(),h(o,{md:"3",cols:"12",key:f},{default:t(()=>[e(b,{rounded:"md",class:"align-center justify-center d-flex",height:"100",width:"100%",color:l},{default:t(()=>[x("class: "+y(l),1)]),_:2},1032,["color"])]),_:2},1024))),128))]),_:1})]),_:1})]),_:1})]),_:1})],64))}});export{P as default};
import{_ as m}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-79f7dd1a.js";import{_}from"./UiParentCard.vue_vue_type_script_setup_true_lang-13601505.js";import{x as p,D as a,o as r,s,a as e,w as t,f as o,V as i,F as n,u as g,c as h,_ as b,e as x,t as y}from"./index-70ea897e.js";const P=p({__name:"ColorPage",setup(C){const c=a({title:"Colors Page"}),d=a([{title:"Utilities",disabled:!1,href:"#"},{title:"Colors",disabled:!0,href:"#"}]),u=a(["primary","lightprimary","secondary","lightsecondary","info","success","accent","warning","error","darkText","lightText","borderLight","inputBorder","containerBg"]);return(V,k)=>(r(),s(n,null,[e(m,{title:c.value.title,breadcrumbs:d.value},null,8,["title","breadcrumbs"]),e(i,null,{default:t(()=>[e(o,{cols:"12",md:"12"},{default:t(()=>[e(_,{title:"Color Palette"},{default:t(()=>[e(i,null,{default:t(()=>[(r(!0),s(n,null,g(u.value,(l,f)=>(r(),h(o,{md:"3",cols:"12",key:f},{default:t(()=>[e(b,{rounded:"md",class:"align-center justify-center d-flex",height:"100",width:"100%",color:l},{default:t(()=>[x("class: "+y(l),1)]),_:2},1032,["color"])]),_:2},1024))),128))]),_:1})]),_:1})]),_:1})]),_:1})],64))}});export{P as default};


@@ -1 +1 @@
(ConfigPage bundle diff omitted: minified single-line asset; the visible changes are chunk-hash updates in the import paths, `UiParentCard.vue_vue_type_script_setup_true_lang-03a5c441.js` → `...-13601505.js` and `index-7c8bc001.js` → `index-70ea897e.js`)


@@ -0,0 +1,32 @@
/**
* Copyright (c) 2014 The xterm.js authors. All rights reserved.
* Copyright (c) 2012-2013, Christopher Jeffrey (MIT License)
* https://github.com/chjj/term.js
* @license MIT
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*
* Originally forked from (with the author's permission):
* Fabrice Bellard's javascript vt100 for jslinux:
* http://bellard.org/jslinux/
* Copyright (c) 2011 Fabrice Bellard
* The original design remains. The terminal itself
* has been extended to include xterm CSI codes, among
* other features.
*/.xterm{cursor:text;position:relative;user-select:none;-ms-user-select:none;-webkit-user-select:none}.xterm.focus,.xterm:focus{outline:none}.xterm .xterm-helpers{position:absolute;top:0;z-index:5}.xterm .xterm-helper-textarea{padding:0;border:0;margin:0;position:absolute;opacity:0;left:-9999em;top:0;width:0;height:0;z-index:-5;white-space:nowrap;overflow:hidden;resize:none}.xterm .composition-view{background:#000;color:#fff;display:none;position:absolute;white-space:nowrap;z-index:1}.xterm .composition-view.active{display:block}.xterm .xterm-viewport{background-color:#000;overflow-y:scroll;cursor:default;position:absolute;right:0;left:0;top:0;bottom:0}.xterm .xterm-screen{position:relative}.xterm .xterm-screen canvas{position:absolute;left:0;top:0}.xterm .xterm-scroll-area{visibility:hidden}.xterm-char-measure-element{display:inline-block;visibility:hidden;position:absolute;top:0;left:-9999em;line-height:normal}.xterm.enable-mouse-events{cursor:default}.xterm.xterm-cursor-pointer,.xterm .xterm-cursor-pointer{cursor:pointer}.xterm.column-select.focus{cursor:crosshair}.xterm .xterm-accessibility,.xterm .xterm-message{position:absolute;left:0;top:0;bottom:0;right:0;z-index:10;color:transparent;pointer-events:none}.xterm .live-region{position:absolute;left:-9999px;width:1px;height:1px;overflow:hidden}.xterm-dim{opacity:1!important}.xterm-underline-1{text-decoration:underline}.xterm-underline-2{text-decoration:double underline}.xterm-underline-3{text-decoration:wavy underline}.xterm-underline-4{text-decoration:dotted underline}.xterm-underline-5{text-decoration:dashed underline}.xterm-overline{text-decoration:overline}.xterm-overline.xterm-underline-1{text-decoration:overline underline}.xterm-overline.xterm-underline-2{text-decoration:overline double underline}.xterm-overline.xterm-underline-3{text-decoration:overline wavy underline}.xterm-overline.xterm-underline-4{text-decoration:overline dotted underline}.xterm-overline.xterm-underline-5{text-decoration:overline dashed underline}.xterm-strikethrough{text-decoration:line-through}.xterm-screen .xterm-decoration-container .xterm-decoration{z-index:6;position:absolute}.xterm-screen .xterm-decoration-container .xterm-decoration.xterm-decoration-top-layer{z-index:7}.xterm-decoration-overview-ruler{z-index:8;position:absolute;top:0;right:0;pointer-events:none}.xterm-decoration-top{z-index:2;position:relative}


@@ -1 +1 @@
import{_ as t}from"./_plugin-vue_export-helper-c27b6911.js";import{o,c,w as s,V as i,a as r,b as e,d as l,e as a,f as d}from"./index-7c8bc001.js";const n="/assets/img-error-bg-ab6474a0.svg",_="/assets/img-error-blue-2675a7a9.svg",m="/assets/img-error-text-a6aebfa0.svg",g="/assets/img-error-purple-edee3fbc.svg";const p={},u={class:"text-center"},f=e("div",{class:"CardMediaWrapper"},[e("img",{src:n,alt:"grid",class:"w-100"}),e("img",{src:_,alt:"grid",class:"CardMediaParts"}),e("img",{src:m,alt:"build",class:"CardMediaBuild"}),e("img",{src:g,alt:"build",class:"CardMediaBuild"})],-1),h=e("h1",{class:"text-h1"},"Something is wrong",-1),v=e("p",null,[e("small",null,[a("The page you are looking was moved, removed, "),e("br"),a("renamed, or might never exist! ")])],-1);function x(b,V){return o(),c(i,{"no-gutters":"",class:"h-100vh"},{default:s(()=>[r(d,{class:"d-flex align-center justify-center"},{default:s(()=>[e("div",u,[f,h,v,r(l,{variant:"flat",color:"primary",class:"mt-4",to:"/","prepend-icon":"mdi-home"},{default:s(()=>[a(" Home")]),_:1})])]),_:1})]),_:1})}const C=t(p,[["render",x]]);export{C as default};
import{_ as t}from"./_plugin-vue_export-helper-c27b6911.js";import{o,c,w as s,V as i,a as r,b as e,d as l,e as a,f as d}from"./index-70ea897e.js";const n="/assets/img-error-bg-ab6474a0.svg",_="/assets/img-error-blue-2675a7a9.svg",m="/assets/img-error-text-a6aebfa0.svg",g="/assets/img-error-purple-edee3fbc.svg";const p={},u={class:"text-center"},f=e("div",{class:"CardMediaWrapper"},[e("img",{src:n,alt:"grid",class:"w-100"}),e("img",{src:_,alt:"grid",class:"CardMediaParts"}),e("img",{src:m,alt:"build",class:"CardMediaBuild"}),e("img",{src:g,alt:"build",class:"CardMediaBuild"})],-1),h=e("h1",{class:"text-h1"},"Something is wrong",-1),v=e("p",null,[e("small",null,[a("The page you are looking was moved, removed, "),e("br"),a("renamed, or might never exist! ")])],-1);function x(b,V){return o(),c(i,{"no-gutters":"",class:"h-100vh"},{default:s(()=>[r(d,{class:"d-flex align-center justify-center"},{default:s(()=>[e("div",u,[f,h,v,r(l,{variant:"flat",color:"primary",class:"mt-4",to:"/","prepend-icon":"mdi-home"},{default:s(()=>[a(" Home")]),_:1})])]),_:1})]),_:1})}const C=t(p,[["render",x]]);export{C as default};


@@ -1 +1 @@
(ExtensionPage bundle diff omitted: minified single-line asset; the visible change is the chunk-hash update in the import path, `index-7c8bc001.js` → `index-70ea897e.js`)

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -1,4 +1,4 @@
import{_ as _t}from"./LogoDark.vue_vue_type_script_setup_true_lang-4faa128a.js";import{x as ke,ad as we,r as Ot,ae as Vt,D as A,af as Ne,a0 as P,B as I,ag as Q,ah as St,I as Be,ai as Ie,aj as Et,ak as jt,al as At,am as G,y as wt,o as Re,c as tt,w as C,a as j,O as qe,b as ge,an as Ft,d as Pt,e as Ge,s as Ct,ao as Tt,t as Nt,k as Bt,S as It,f as Fe,N as Rt,V as Pe,J as Ye,L as kt}from"./index-7c8bc001.js";import{a as Mt}from"./md5-c4ed0ff4.js";/**
import{_ as _t}from"./LogoDark.vue_vue_type_script_setup_true_lang-b8ef80c7.js";import{x as ke,ad as we,r as Ot,ae as Vt,D as A,af as Ne,a0 as P,B as I,ag as Q,ah as St,I as Be,ai as Ie,aj as Et,ak as jt,al as At,am as G,y as wt,o as Re,c as tt,w as C,a as j,O as qe,b as ge,an as Ft,d as Pt,e as Ge,s as Ct,ao as Tt,t as Nt,k as Bt,S as It,f as Fe,N as Rt,V as Pe,J as Ye,L as kt}from"./index-70ea897e.js";import{a as Mt}from"./md5-6a8a08e9.js";/**
* vee-validate v4.11.3
* (c) 2023 Abdelrahman Awad
* @license MIT


@@ -1 +1 @@
import{at as _,x as d,D as n,o as c,s as m,a as f,w as p,au as r,b as a,av as o,B as t,aw as h}from"./index-7c8bc001.js";const s={Sidebar_drawer:!0,Customizer_drawer:!1,mini_sidebar:!1,fontTheme:"Roboto",inputBg:!1},l=_({id:"customizer",state:()=>({Sidebar_drawer:s.Sidebar_drawer,Customizer_drawer:s.Customizer_drawer,mini_sidebar:s.mini_sidebar,fontTheme:"Poppins",inputBg:s.inputBg}),getters:{},actions:{SET_SIDEBAR_DRAWER(){this.Sidebar_drawer=!this.Sidebar_drawer},SET_MINI_SIDEBAR(e){this.mini_sidebar=e},SET_FONT(e){this.fontTheme=e}}}),u={class:"logo",style:{display:"flex","align-items":"center"}},b={style:{"font-size":"24px","font-weight":"1000"}},w={style:{"font-size":"20px","font-weight":"1000"}},S={style:{"font-size":"20px"}},z=d({__name:"LogoDark",setup(e){n("rgb(var(--v-theme-primary))"),n("rgb(var(--v-theme-secondary))");const i=l();return(g,B)=>(c(),m("div",u,[f(t(h),{to:"/",style:{"text-decoration":"none",color:"black"}},{default:p(()=>[r(a("span",b,"AstrBot 仪表盘",512),[[o,!t(i).mini_sidebar]]),r(a("span",w,"Astr",512),[[o,t(i).mini_sidebar]]),r(a("span",S,"Bot",512),[[o,t(i).mini_sidebar]])]),_:1})]))}});export{z as _,l as u};
import{at as _,x as d,D as n,o as c,s as m,a as f,w as p,au as r,b as a,av as o,B as t,aw as h}from"./index-70ea897e.js";const s={Sidebar_drawer:!0,Customizer_drawer:!1,mini_sidebar:!1,fontTheme:"Roboto",inputBg:!1},l=_({id:"customizer",state:()=>({Sidebar_drawer:s.Sidebar_drawer,Customizer_drawer:s.Customizer_drawer,mini_sidebar:s.mini_sidebar,fontTheme:"Poppins",inputBg:s.inputBg}),getters:{},actions:{SET_SIDEBAR_DRAWER(){this.Sidebar_drawer=!this.Sidebar_drawer},SET_MINI_SIDEBAR(e){this.mini_sidebar=e},SET_FONT(e){this.fontTheme=e}}}),u={class:"logo",style:{display:"flex","align-items":"center"}},b={style:{"font-size":"24px","font-weight":"1000"}},w={style:{"font-size":"20px","font-weight":"1000"}},S={style:{"font-size":"20px"}},z=d({__name:"LogoDark",setup(e){n("rgb(var(--v-theme-primary))"),n("rgb(var(--v-theme-secondary))");const i=l();return(g,B)=>(c(),m("div",u,[f(t(h),{to:"/",style:{"text-decoration":"none",color:"black"}},{default:p(()=>[r(a("span",b,"AstrBot 仪表盘",512),[[o,!t(i).mini_sidebar]]),r(a("span",w,"Astr",512),[[o,t(i).mini_sidebar]]),r(a("span",S,"Bot",512),[[o,t(i).mini_sidebar]])]),_:1})]))}});export{z as _,l as u};


@@ -1 +1 @@
import{_ as o}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-89ca5198.js";import{_ as i}from"./UiParentCard.vue_vue_type_script_setup_true_lang-03a5c441.js";import{x as n,D as a,o as c,s as m,a as e,w as t,f as d,b as f,V as _,F as u}from"./index-7c8bc001.js";const p=["innerHTML"],v=n({__name:"MaterialIcons",setup(b){const s=a({title:"Material Icons"}),r=a('<iframe src="https://materialdesignicons.com/" frameborder="0" width="100%" height="1000"></iframe>'),l=a([{title:"Icons",disabled:!1,href:"#"},{title:"Material Icons",disabled:!0,href:"#"}]);return(h,M)=>(c(),m(u,null,[e(o,{title:s.value.title,breadcrumbs:l.value},null,8,["title","breadcrumbs"]),e(_,null,{default:t(()=>[e(d,{cols:"12",md:"12"},{default:t(()=>[e(i,{title:"Material Icons"},{default:t(()=>[f("div",{innerHTML:r.value},null,8,p)]),_:1})]),_:1})]),_:1})],64))}});export{v as default};
import{_ as o}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-79f7dd1a.js";import{_ as i}from"./UiParentCard.vue_vue_type_script_setup_true_lang-13601505.js";import{x as n,D as a,o as c,s as m,a as e,w as t,f as d,b as f,V as _,F as u}from"./index-70ea897e.js";const p=["innerHTML"],v=n({__name:"MaterialIcons",setup(b){const s=a({title:"Material Icons"}),r=a('<iframe src="https://materialdesignicons.com/" frameborder="0" width="100%" height="1000"></iframe>'),l=a([{title:"Icons",disabled:!1,href:"#"},{title:"Material Icons",disabled:!0,href:"#"}]);return(h,M)=>(c(),m(u,null,[e(o,{title:s.value.title,breadcrumbs:l.value},null,8,["title","breadcrumbs"]),e(_,null,{default:t(()=>[e(d,{cols:"12",md:"12"},{default:t(()=>[e(i,{title:"Material Icons"},{default:t(()=>[f("div",{innerHTML:r.value},null,8,p)]),_:1})]),_:1})]),_:1})],64))}});export{v as default};


@@ -1 +1 @@
import{_ as B}from"./LogoDark.vue_vue_type_script_setup_true_lang-4faa128a.js";import{x as y,D as o,o as b,s as U,a as e,w as a,b as n,B as $,d as u,f as d,A as _,e as f,V as r,O as m,an as A,as as E,F,c as T,N as q,J as V,L as P}from"./index-7c8bc001.js";const z="/assets/social-google-a359a253.svg",N=["src"],S=n("span",{class:"ml-2"},"Sign up with Google",-1),D=n("h5",{class:"text-h5 text-center my-4 mb-8"},"Sign up with Email address",-1),G={class:"d-sm-inline-flex align-center mt-2 mb-7 mb-sm-0 font-weight-bold"},L=n("a",{href:"#",class:"ml-1 text-lightText"},"Terms and Condition",-1),O={class:"mt-5 text-right"},j=y({__name:"AuthRegister",setup(w){const c=o(!1),i=o(!1),p=o(""),v=o(""),g=o(),h=o(""),x=o(""),k=o([s=>!!s||"Password is required",s=>s&&s.length<=10||"Password must be less than 10 characters"]),C=o([s=>!!s||"E-mail is required",s=>/.+@.+\..+/.test(s)||"E-mail must be valid"]);function R(){g.value.validate()}return(s,l)=>(b(),U(F,null,[e(u,{block:"",color:"primary",variant:"outlined",class:"text-lightText googleBtn"},{default:a(()=>[n("img",{src:$(z),alt:"google"},null,8,N),S]),_:1}),e(r,null,{default:a(()=>[e(d,{class:"d-flex align-center"},{default:a(()=>[e(_,{class:"custom-devider"}),e(u,{variant:"outlined",class:"orbtn",rounded:"md",size:"small"},{default:a(()=>[f("OR")]),_:1}),e(_,{class:"custom-devider"})]),_:1})]),_:1}),D,e(E,{ref_key:"Regform",ref:g,"lazy-validation":"",action:"/dashboards/analytical",class:"mt-7 
loginForm"},{default:a(()=>[e(r,null,{default:a(()=>[e(d,{cols:"12",sm:"6"},{default:a(()=>[e(m,{modelValue:h.value,"onUpdate:modelValue":l[0]||(l[0]=t=>h.value=t),density:"comfortable","hide-details":"auto",variant:"outlined",color:"primary",label:"Firstname"},null,8,["modelValue"])]),_:1}),e(d,{cols:"12",sm:"6"},{default:a(()=>[e(m,{modelValue:x.value,"onUpdate:modelValue":l[1]||(l[1]=t=>x.value=t),density:"comfortable","hide-details":"auto",variant:"outlined",color:"primary",label:"Lastname"},null,8,["modelValue"])]),_:1})]),_:1}),e(m,{modelValue:v.value,"onUpdate:modelValue":l[2]||(l[2]=t=>v.value=t),rules:C.value,label:"Email Address / Username",class:"mt-4 mb-4",required:"",density:"comfortable","hide-details":"auto",variant:"outlined",color:"primary"},null,8,["modelValue","rules"]),e(m,{modelValue:p.value,"onUpdate:modelValue":l[3]||(l[3]=t=>p.value=t),rules:k.value,label:"Password",required:"",density:"comfortable",variant:"outlined",color:"primary","hide-details":"auto","append-icon":i.value?"mdi-eye":"mdi-eye-off",type:i.value?"text":"password","onClick:append":l[4]||(l[4]=t=>i.value=!i.value),class:"pwdInput"},null,8,["modelValue","rules","append-icon","type"]),n("div",G,[e(A,{modelValue:c.value,"onUpdate:modelValue":l[5]||(l[5]=t=>c.value=t),rules:[t=>!!t||"You must agree to continue!"],label:"Agree with?",required:"",color:"primary",class:"ms-n2","hide-details":""},null,8,["modelValue","rules"]),L]),e(u,{color:"secondary",block:"",class:"mt-2",variant:"flat",size:"large",onClick:l[6]||(l[6]=t=>R())},{default:a(()=>[f("Sign Up")]),_:1})]),_:1},512),n("div",O,[e(_),e(u,{variant:"plain",to:"/auth/login",class:"mt-2 text-capitalize mr-n2"},{default:a(()=>[f("Already have an account?")]),_:1})])],64))}});const I={class:"pa-7 pa-sm-12"},J=n("h2",{class:"text-secondary text-h2 mt-8"},"Sign up",-1),Y=n("h4",{class:"text-disabled text-h4 mt-3"},"Enter credentials to 
continue",-1),M=y({__name:"RegisterPage",setup(w){return(c,i)=>(b(),T(r,{class:"h-100vh","no-gutters":""},{default:a(()=>[e(d,{cols:"12",class:"d-flex align-center bg-lightprimary"},{default:a(()=>[e(q,null,{default:a(()=>[n("div",I,[e(r,{justify:"center"},{default:a(()=>[e(d,{cols:"12",lg:"10",xl:"6",md:"7"},{default:a(()=>[e(V,{elevation:"0",class:"loginBox"},{default:a(()=>[e(V,{variant:"outlined"},{default:a(()=>[e(P,{class:"pa-9"},{default:a(()=>[e(r,null,{default:a(()=>[e(d,{cols:"12",class:"text-center"},{default:a(()=>[e(B),J,Y]),_:1})]),_:1}),e(j)]),_:1})]),_:1})]),_:1})]),_:1})]),_:1})])]),_:1})]),_:1})]),_:1}))}});export{M as default};
import{_ as B}from"./LogoDark.vue_vue_type_script_setup_true_lang-b8ef80c7.js";import{x as y,D as o,o as b,s as U,a as e,w as a,b as n,B as $,d as u,f as d,A as _,e as f,V as r,O as m,an as A,as as E,F,c as T,N as q,J as V,L as P}from"./index-70ea897e.js";const z="/assets/social-google-a359a253.svg",N=["src"],S=n("span",{class:"ml-2"},"Sign up with Google",-1),D=n("h5",{class:"text-h5 text-center my-4 mb-8"},"Sign up with Email address",-1),G={class:"d-sm-inline-flex align-center mt-2 mb-7 mb-sm-0 font-weight-bold"},L=n("a",{href:"#",class:"ml-1 text-lightText"},"Terms and Condition",-1),O={class:"mt-5 text-right"},j=y({__name:"AuthRegister",setup(w){const c=o(!1),i=o(!1),p=o(""),v=o(""),g=o(),h=o(""),x=o(""),k=o([s=>!!s||"Password is required",s=>s&&s.length<=10||"Password must be less than 10 characters"]),C=o([s=>!!s||"E-mail is required",s=>/.+@.+\..+/.test(s)||"E-mail must be valid"]);function R(){g.value.validate()}return(s,l)=>(b(),U(F,null,[e(u,{block:"",color:"primary",variant:"outlined",class:"text-lightText googleBtn"},{default:a(()=>[n("img",{src:$(z),alt:"google"},null,8,N),S]),_:1}),e(r,null,{default:a(()=>[e(d,{class:"d-flex align-center"},{default:a(()=>[e(_,{class:"custom-devider"}),e(u,{variant:"outlined",class:"orbtn",rounded:"md",size:"small"},{default:a(()=>[f("OR")]),_:1}),e(_,{class:"custom-devider"})]),_:1})]),_:1}),D,e(E,{ref_key:"Regform",ref:g,"lazy-validation":"",action:"/dashboards/analytical",class:"mt-7 
loginForm"},{default:a(()=>[e(r,null,{default:a(()=>[e(d,{cols:"12",sm:"6"},{default:a(()=>[e(m,{modelValue:h.value,"onUpdate:modelValue":l[0]||(l[0]=t=>h.value=t),density:"comfortable","hide-details":"auto",variant:"outlined",color:"primary",label:"Firstname"},null,8,["modelValue"])]),_:1}),e(d,{cols:"12",sm:"6"},{default:a(()=>[e(m,{modelValue:x.value,"onUpdate:modelValue":l[1]||(l[1]=t=>x.value=t),density:"comfortable","hide-details":"auto",variant:"outlined",color:"primary",label:"Lastname"},null,8,["modelValue"])]),_:1})]),_:1}),e(m,{modelValue:v.value,"onUpdate:modelValue":l[2]||(l[2]=t=>v.value=t),rules:C.value,label:"Email Address / Username",class:"mt-4 mb-4",required:"",density:"comfortable","hide-details":"auto",variant:"outlined",color:"primary"},null,8,["modelValue","rules"]),e(m,{modelValue:p.value,"onUpdate:modelValue":l[3]||(l[3]=t=>p.value=t),rules:k.value,label:"Password",required:"",density:"comfortable",variant:"outlined",color:"primary","hide-details":"auto","append-icon":i.value?"mdi-eye":"mdi-eye-off",type:i.value?"text":"password","onClick:append":l[4]||(l[4]=t=>i.value=!i.value),class:"pwdInput"},null,8,["modelValue","rules","append-icon","type"]),n("div",G,[e(A,{modelValue:c.value,"onUpdate:modelValue":l[5]||(l[5]=t=>c.value=t),rules:[t=>!!t||"You must agree to continue!"],label:"Agree with?",required:"",color:"primary",class:"ms-n2","hide-details":""},null,8,["modelValue","rules"]),L]),e(u,{color:"secondary",block:"",class:"mt-2",variant:"flat",size:"large",onClick:l[6]||(l[6]=t=>R())},{default:a(()=>[f("Sign Up")]),_:1})]),_:1},512),n("div",O,[e(_),e(u,{variant:"plain",to:"/auth/login",class:"mt-2 text-capitalize mr-n2"},{default:a(()=>[f("Already have an account?")]),_:1})])],64))}});const I={class:"pa-7 pa-sm-12"},J=n("h2",{class:"text-secondary text-h2 mt-8"},"Sign up",-1),Y=n("h4",{class:"text-disabled text-h4 mt-3"},"Enter credentials to 
continue",-1),M=y({__name:"RegisterPage",setup(w){return(c,i)=>(b(),T(r,{class:"h-100vh","no-gutters":""},{default:a(()=>[e(d,{cols:"12",class:"d-flex align-center bg-lightprimary"},{default:a(()=>[e(q,null,{default:a(()=>[n("div",I,[e(r,{justify:"center"},{default:a(()=>[e(d,{cols:"12",lg:"10",xl:"6",md:"7"},{default:a(()=>[e(V,{elevation:"0",class:"loginBox"},{default:a(()=>[e(V,{variant:"outlined"},{default:a(()=>[e(P,{class:"pa-9"},{default:a(()=>[e(r,null,{default:a(()=>[e(d,{cols:"12",class:"text-center"},{default:a(()=>[e(B),J,Y]),_:1})]),_:1}),e(j)]),_:1})]),_:1})]),_:1})]),_:1})]),_:1})])]),_:1})]),_:1})]),_:1}))}});export{M as default};


@@ -1 +1 @@
import{_ as c}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-89ca5198.js";import{_ as f}from"./UiParentCard.vue_vue_type_script_setup_true_lang-03a5c441.js";import{x as m,D as s,o as l,s as r,a as e,w as a,f as i,V as o,F as d,u as _,J as p,U as b,b as h,t as g}from"./index-7c8bc001.js";const v=m({__name:"ShadowPage",setup(w){const n=s({title:"Shadow Page"}),u=s([{title:"Utilities",disabled:!1,href:"#"},{title:"Shadow",disabled:!0,href:"#"}]);return(V,x)=>(l(),r(d,null,[e(c,{title:n.value.title,breadcrumbs:u.value},null,8,["title","breadcrumbs"]),e(o,null,{default:a(()=>[e(i,{cols:"12",md:"12"},{default:a(()=>[e(f,{title:"Basic Shadow"},{default:a(()=>[e(o,{justify:"center"},{default:a(()=>[(l(),r(d,null,_(25,t=>e(i,{key:t,cols:"auto"},{default:a(()=>[e(p,{height:"100",width:"100",class:b(["mb-5",["d-flex justify-center align-center bg-primary",`elevation-${t}`]])},{default:a(()=>[h("div",null,g(t-1),1)]),_:2},1032,["class"])]),_:2},1024)),64))]),_:1})]),_:1})]),_:1})]),_:1})],64))}});export{v as default};
import{_ as c}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-79f7dd1a.js";import{_ as f}from"./UiParentCard.vue_vue_type_script_setup_true_lang-13601505.js";import{x as m,D as s,o as l,s as r,a as e,w as a,f as i,V as o,F as d,u as _,J as p,U as b,b as h,t as g}from"./index-70ea897e.js";const v=m({__name:"ShadowPage",setup(w){const n=s({title:"Shadow Page"}),u=s([{title:"Utilities",disabled:!1,href:"#"},{title:"Shadow",disabled:!0,href:"#"}]);return(V,x)=>(l(),r(d,null,[e(c,{title:n.value.title,breadcrumbs:u.value},null,8,["title","breadcrumbs"]),e(o,null,{default:a(()=>[e(i,{cols:"12",md:"12"},{default:a(()=>[e(f,{title:"Basic Shadow"},{default:a(()=>[e(o,{justify:"center"},{default:a(()=>[(l(),r(d,null,_(25,t=>e(i,{key:t,cols:"auto"},{default:a(()=>[e(p,{height:"100",width:"100",class:b(["mb-5",["d-flex justify-center align-center bg-primary",`elevation-${t}`]])},{default:a(()=>[h("div",null,g(t-1),1)]),_:2},1032,["class"])]),_:2},1024)),64))]),_:1})]),_:1})]),_:1})]),_:1})],64))}});export{v as default};


@@ -1 +1 @@
import{_ as o}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-89ca5198.js";import{_ as n}from"./UiParentCard.vue_vue_type_script_setup_true_lang-03a5c441.js";import{x as c,D as a,o as i,s as m,a as e,w as t,f as d,b as f,V as _,F as u}from"./index-7c8bc001.js";const b=["innerHTML"],w=c({__name:"TablerIcons",setup(p){const s=a({title:"Tabler Icons"}),r=a('<iframe src="https://tablericons.com/" frameborder="0" width="100%" height="600"></iframe>'),l=a([{title:"Icons",disabled:!1,href:"#"},{title:"Tabler Icons",disabled:!0,href:"#"}]);return(h,T)=>(i(),m(u,null,[e(o,{title:s.value.title,breadcrumbs:l.value},null,8,["title","breadcrumbs"]),e(_,null,{default:t(()=>[e(d,{cols:"12",md:"12"},{default:t(()=>[e(n,{title:"Tabler Icons"},{default:t(()=>[f("div",{innerHTML:r.value},null,8,b)]),_:1})]),_:1})]),_:1})],64))}});export{w as default};
import{_ as o}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-79f7dd1a.js";import{_ as n}from"./UiParentCard.vue_vue_type_script_setup_true_lang-13601505.js";import{x as c,D as a,o as i,s as m,a as e,w as t,f as d,b as f,V as _,F as u}from"./index-70ea897e.js";const b=["innerHTML"],w=c({__name:"TablerIcons",setup(p){const s=a({title:"Tabler Icons"}),r=a('<iframe src="https://tablericons.com/" frameborder="0" width="100%" height="600"></iframe>'),l=a([{title:"Icons",disabled:!1,href:"#"},{title:"Tabler Icons",disabled:!0,href:"#"}]);return(h,T)=>(i(),m(u,null,[e(o,{title:s.value.title,breadcrumbs:l.value},null,8,["title","breadcrumbs"]),e(_,null,{default:t(()=>[e(d,{cols:"12",md:"12"},{default:t(()=>[e(n,{title:"Tabler Icons"},{default:t(()=>[f("div",{innerHTML:r.value},null,8,b)]),_:1})]),_:1})]),_:1})],64))}});export{w as default};


@@ -1 +1 @@
import{_ as m}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-89ca5198.js";import{_ as v}from"./UiParentCard.vue_vue_type_script_setup_true_lang-03a5c441.js";import{x as f,o as i,c as g,w as e,a,a6 as y,K as b,e as w,t as d,A as C,L as V,a7 as L,J as _,D as o,s as h,f as k,b as t,F as x,u as B,U as H,V as T}from"./index-7c8bc001.js";const s=f({__name:"UiChildCard",props:{title:String},setup(r){const l=r;return(n,c)=>(i(),g(_,{variant:"outlined"},{default:e(()=>[a(y,{class:"py-3"},{default:e(()=>[a(b,{class:"text-h5"},{default:e(()=>[w(d(l.title),1)]),_:1})]),_:1}),a(C),a(V,null,{default:e(()=>[L(n.$slots,"default")]),_:3})]),_:3}))}}),D={class:"d-flex flex-column gap-1"},S={class:"text-caption pa-2 bg-lightprimary"},z=t("div",{class:"text-grey"},"Class",-1),N={class:"font-weight-medium"},U=t("div",null,[t("p",{class:"text-left"},"Left aligned on all viewport sizes."),t("p",{class:"text-center"},"Center aligned on all viewport sizes."),t("p",{class:"text-right"},"Right aligned on all viewport sizes."),t("p",{class:"text-sm-left"},"Left aligned on viewports SM (small) or wider."),t("p",{class:"text-right text-md-left"},"Left aligned on viewports MD (medium) or wider."),t("p",{class:"text-right text-lg-left"},"Left aligned on viewports LG (large) or wider."),t("p",{class:"text-right text-xl-left"},"Left aligned on viewports XL (extra-large) or wider.")],-1),$=t("div",{class:"d-flex justify-space-between flex-row"},[t("a",{href:"#",class:"text-decoration-none"},"Non-underlined link"),t("div",{class:"text-decoration-line-through"},"Line-through text"),t("div",{class:"text-decoration-overline"},"Overline text"),t("div",{class:"text-decoration-underline"},"Underline text")],-1),M=t("div",null,[t("p",{class:"text-high-emphasis"},"High-emphasis has an opacity of 87% in light theme and 100% in dark."),t("p",{class:"text-medium-emphasis"},"Medium-emphasis text and hint text have opacities of 60% in light theme and 70% in dark."),t("p",{class:"text-disabled"},"Disabled 
text has an opacity of 38% in light theme and 50% in dark.")],-1),A=f({__name:"TypographyPage",setup(r){const l=o({title:"Typography Page"}),n=o([["Heading 1","text-h1"],["Heading 2","text-h2"],["Heading 3","text-h3"],["Heading 4","text-h4"],["Heading 5","text-h5"],["Heading 6","text-h6"],["Subtitle 1","text-subtitle-1"],["Subtitle 2","text-subtitle-2"],["Body 1","text-body-1"],["Body 2","text-body-2"],["Button","text-button"],["Caption","text-caption"],["Overline","text-overline"]]),c=o([{title:"Utilities",disabled:!1,href:"#"},{title:"Typography",disabled:!0,href:"#"}]);return(O,F)=>(i(),h(x,null,[a(m,{title:l.value.title,breadcrumbs:c.value},null,8,["title","breadcrumbs"]),a(T,null,{default:e(()=>[a(k,{cols:"12",md:"12"},{default:e(()=>[a(v,{title:"Basic Typography"},{default:e(()=>[a(s,{title:"Heading"},{default:e(()=>[t("div",D,[(i(!0),h(x,null,B(n.value,([p,u])=>(i(),g(_,{variant:"outlined",key:p,class:"my-4"},{default:e(()=>[t("div",{class:H([u,"pa-2"])},d(p),3),t("div",S,[z,t("div",N,d(u),1)])]),_:2},1024))),128))])]),_:1}),a(s,{title:"Text-alignment",class:"mt-8"},{default:e(()=>[U]),_:1}),a(s,{title:"Decoration",class:"mt-8"},{default:e(()=>[$]),_:1}),a(s,{title:"Opacity",class:"mt-8"},{default:e(()=>[M]),_:1})]),_:1})]),_:1})]),_:1})],64))}});export{A as default};
import{_ as m}from"./BaseBreadcrumb.vue_vue_type_style_index_0_lang-79f7dd1a.js";import{_ as v}from"./UiParentCard.vue_vue_type_script_setup_true_lang-13601505.js";import{x as f,o as i,c as g,w as e,a,a6 as y,K as b,e as w,t as d,A as C,L as V,a7 as L,J as _,D as o,s as h,f as k,b as t,F as x,u as B,U as H,V as T}from"./index-70ea897e.js";const s=f({__name:"UiChildCard",props:{title:String},setup(r){const l=r;return(n,c)=>(i(),g(_,{variant:"outlined"},{default:e(()=>[a(y,{class:"py-3"},{default:e(()=>[a(b,{class:"text-h5"},{default:e(()=>[w(d(l.title),1)]),_:1})]),_:1}),a(C),a(V,null,{default:e(()=>[L(n.$slots,"default")]),_:3})]),_:3}))}}),D={class:"d-flex flex-column gap-1"},S={class:"text-caption pa-2 bg-lightprimary"},z=t("div",{class:"text-grey"},"Class",-1),N={class:"font-weight-medium"},U=t("div",null,[t("p",{class:"text-left"},"Left aligned on all viewport sizes."),t("p",{class:"text-center"},"Center aligned on all viewport sizes."),t("p",{class:"text-right"},"Right aligned on all viewport sizes."),t("p",{class:"text-sm-left"},"Left aligned on viewports SM (small) or wider."),t("p",{class:"text-right text-md-left"},"Left aligned on viewports MD (medium) or wider."),t("p",{class:"text-right text-lg-left"},"Left aligned on viewports LG (large) or wider."),t("p",{class:"text-right text-xl-left"},"Left aligned on viewports XL (extra-large) or wider.")],-1),$=t("div",{class:"d-flex justify-space-between flex-row"},[t("a",{href:"#",class:"text-decoration-none"},"Non-underlined link"),t("div",{class:"text-decoration-line-through"},"Line-through text"),t("div",{class:"text-decoration-overline"},"Overline text"),t("div",{class:"text-decoration-underline"},"Underline text")],-1),M=t("div",null,[t("p",{class:"text-high-emphasis"},"High-emphasis has an opacity of 87% in light theme and 100% in dark."),t("p",{class:"text-medium-emphasis"},"Medium-emphasis text and hint text have opacities of 60% in light theme and 70% in dark."),t("p",{class:"text-disabled"},"Disabled 
text has an opacity of 38% in light theme and 50% in dark.")],-1),A=f({__name:"TypographyPage",setup(r){const l=o({title:"Typography Page"}),n=o([["Heading 1","text-h1"],["Heading 2","text-h2"],["Heading 3","text-h3"],["Heading 4","text-h4"],["Heading 5","text-h5"],["Heading 6","text-h6"],["Subtitle 1","text-subtitle-1"],["Subtitle 2","text-subtitle-2"],["Body 1","text-body-1"],["Body 2","text-body-2"],["Button","text-button"],["Caption","text-caption"],["Overline","text-overline"]]),c=o([{title:"Utilities",disabled:!1,href:"#"},{title:"Typography",disabled:!0,href:"#"}]);return(O,F)=>(i(),h(x,null,[a(m,{title:l.value.title,breadcrumbs:c.value},null,8,["title","breadcrumbs"]),a(T,null,{default:e(()=>[a(k,{cols:"12",md:"12"},{default:e(()=>[a(v,{title:"Basic Typography"},{default:e(()=>[a(s,{title:"Heading"},{default:e(()=>[t("div",D,[(i(!0),h(x,null,B(n.value,([p,u])=>(i(),g(_,{variant:"outlined",key:p,class:"my-4"},{default:e(()=>[t("div",{class:H([u,"pa-2"])},d(p),3),t("div",S,[z,t("div",N,d(u),1)])]),_:2},1024))),128))])]),_:1}),a(s,{title:"Text-alignment",class:"mt-8"},{default:e(()=>[U]),_:1}),a(s,{title:"Decoration",class:"mt-8"},{default:e(()=>[$]),_:1}),a(s,{title:"Opacity",class:"mt-8"},{default:e(()=>[M]),_:1})]),_:1})]),_:1})]),_:1})],64))}});export{A as default};


@@ -1 +1 @@
import{x as n,o,c as i,w as e,a,a6 as d,b as c,K as u,e as p,t as _,a7 as s,A as f,L as V,J as m}from"./index-7c8bc001.js";const C={class:"d-sm-flex align-center justify-space-between"},h=n({__name:"UiParentCard",props:{title:String},setup(l){const r=l;return(t,x)=>(o(),i(m,{variant:"outlined",elevation:"0",class:"withbg"},{default:e(()=>[a(d,null,{default:e(()=>[c("div",C,[a(u,null,{default:e(()=>[p(_(r.title),1)]),_:1}),s(t.$slots,"action")])]),_:3}),a(f),a(V,null,{default:e(()=>[s(t.$slots,"default")]),_:3})]),_:3}))}});export{h as _};
import{x as n,o,c as i,w as e,a,a6 as d,b as c,K as u,e as p,t as _,a7 as s,A as f,L as V,J as m}from"./index-70ea897e.js";const C={class:"d-sm-flex align-center justify-space-between"},h=n({__name:"UiParentCard",props:{title:String},setup(l){const r=l;return(t,x)=>(o(),i(m,{variant:"outlined",elevation:"0",class:"withbg"},{default:e(()=>[a(d,null,{default:e(()=>[c("div",C,[a(u,null,{default:e(()=>[p(_(r.title),1)]),_:1}),s(t.$slots,"action")])]),_:3}),a(f),a(V,null,{default:e(()=>[s(t.$slots,"default")]),_:3})]),_:3}))}});export{h as _};

File diff suppressed because one or more lines are too long


@@ -1,4 +1,4 @@
import{ap as K,aq as Y,ar as V}from"./index-7c8bc001.js";var C={exports:{}};const $={},k=Object.freeze(Object.defineProperty({__proto__:null,default:$},Symbol.toStringTag,{value:"Module"})),z=K(k);/**
import{ap as K,aq as Y,ar as V}from"./index-70ea897e.js";var C={exports:{}};const $={},k=Object.freeze(Object.defineProperty({__proto__:null,default:$},Symbol.toStringTag,{value:"Module"})),z=K(k);/**
* [js-md5]{@link https://github.com/emn178/js-md5}
*
* @namespace md5


@@ -11,7 +11,7 @@
href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&family=Poppins:wght@400;500;600;700&family=Roboto:wght@400;500;700&display=swap"
/>
<title>AstrBot - 仪表盘</title>
<script type="module" crossorigin src="/assets/index-7c8bc001.js"></script>
<script type="module" crossorigin src="/assets/index-70ea897e.js"></script>
<link rel="stylesheet" href="/assets/index-0f1523f3.css">
</head>
<body>


@@ -9,6 +9,7 @@ import sys
import os
import threading
import time
import asyncio
def shutdown_bot(delay_s: int):
@@ -27,26 +28,30 @@ class DashBoardConfig():
val_type: Optional[str] = None # 仅 item 才需要
class DashBoardHelper():
def __init__(self, dashboard_data: DashBoardData, config: dict):
def __init__(self, global_object, config: dict):
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
self.logger = global_object.logger
dashboard_data = global_object.dashboard_data
dashboard_data.configs = {
"data": []
}
self.parse_default_config(dashboard_data, config)
self.dashboard_data: DashBoardData = dashboard_data
self.dashboard = AstrBotDashBoard(self.dashboard_data)
self.dashboard = AstrBotDashBoard(global_object)
self.key_map = {} # key: uuid, value: config key name
self.cc = CmdConfig()
@self.dashboard.register("post_configs")
def on_post_configs(post_configs: dict):
try:
gu.log(f"收到配置更新请求", gu.LEVEL_INFO, tag="可视化面板")
# self.logger.log(f"收到配置更新请求", gu.LEVEL_INFO, tag="可视化面板")
self.save_config(post_configs)
self.parse_default_config(self.dashboard_data, self.cc.get_all())
# 重启
threading.Thread(target=shutdown_bot, args=(2,), daemon=True).start()
except Exception as e:
gu.log(f"在保存配置时发生错误:{e}", gu.LEVEL_ERROR, tag="可视化面板")
# self.logger.log(f"在保存配置时发生错误:{e}", gu.LEVEL_ERROR, tag="可视化面板")
raise e
@@ -197,6 +202,14 @@ class DashBoardHelper():
value=config['direct_message_mode'],
path="direct_message_mode",
),
DashBoardConfig(
config_type="item",
val_type="string",
name="LLM 唤醒词",
description="如果不为空, 那么只有当消息以此词开头时,才会调用大语言模型进行回复。如设置为 /chat,那么只有当消息以 /chat 开头时,才会调用大语言模型进行回复。",
value=config['llm_wake_prefix'],
path="llm_wake_prefix",
)
]
)
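The `DashBoardConfig` entries above form a small tree: "group" nodes hold a `body` of child "item" nodes, and each item carries a dotted `path` back into the config dict. The sketch below is a hypothetical, minimal re-creation of that dataclass (field names are taken from the diff; the defaults are assumptions), shown building the `llm_wake_prefix` item:

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of the DashBoardConfig tree used above. Field names
# mirror the diff (config_type, val_type, name, description, value, path,
# body); the default values here are assumptions, not the project's own.
@dataclass
class DashBoardConfig:
    config_type: str                      # "group" or "item"
    name: str = ""
    description: str = ""
    val_type: Optional[str] = None        # only needed for "item" nodes
    value: object = None
    path: str = ""                        # dotted key path into the config dict
    body: Optional[List["DashBoardConfig"]] = None  # children of a "group"

config = {"llm_wake_prefix": "/chat"}
group = DashBoardConfig(
    config_type="group",
    name="LLM wake prefix demo",
    body=[
        DashBoardConfig(
            config_type="item",
            val_type="string",
            name="LLM 唤醒词",
            value=config["llm_wake_prefix"],
            path="llm_wake_prefix",
        )
    ],
)
```

The frontend can walk this tree generically, and `path` lets the backend write an edited `value` back to the right key when the panel posts the configs.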
@@ -274,14 +287,14 @@ class DashBoardHelper():
llm_group = DashBoardConfig(
config_type="group",
name="LLM 配置",
name="OpenAI 官方接口类设置",
description="",
body=[
DashBoardConfig(
config_type="item",
val_type="list",
name="OpenAI API KEY",
description="OpenAI API 的 KEY。支持使用非官方但是兼容的 API。",
description="OpenAI API 的 KEY。支持使用非官方但是兼容的 API(第三方中转 key)。",
value=config['openai']['key'],
path="openai.key",
),
@@ -392,6 +405,47 @@ class DashBoardHelper():
]
)
rev_chatgpt_accounts = config['rev_ChatGPT']['account']
new_accs = []
for i in rev_chatgpt_accounts:
if isinstance(i, dict) and 'access_token' in i:
new_accs.append(i['access_token'])
elif isinstance(i, str):
new_accs.append(i)
config['rev_ChatGPT']['account'] = new_accs
rev_chatgpt_group = DashBoardConfig(
config_type="group",
name="逆向语言模型服务设置",
description="",
body=[
DashBoardConfig(
config_type="item",
val_type="bool",
name="启用逆向语言模型服务",
description="",
value=config['rev_ChatGPT']['enable'],
path="rev_ChatGPT.enable",
),
DashBoardConfig(
config_type="item",
val_type="string",
name="终结点(Endpoint)地址",
description="逆向服务的终结点服务器的地址。",
value=config['CHATGPT_BASE_URL'],
path="CHATGPT_BASE_URL",
),
DashBoardConfig(
config_type="item",
val_type="list",
name="access_token",
description="access_token",
value=config['rev_ChatGPT']['account'],
path="rev_ChatGPT.account",
),
]
)
baidu_aip_group = DashBoardConfig(
config_type="group",
name="百度内容审核",
@@ -468,12 +522,13 @@ class DashBoardHelper():
gocq_platform_detail_group,
proxy_group,
llm_group,
rev_chatgpt_group,
other_group,
baidu_aip_group
]
except Exception as e:
gu.log(f"配置文件解析错误:{e}", gu.LEVEL_ERROR)
self.logger.log(f"配置文件解析错误:{e}", gu.LEVEL_ERROR)
raise e
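The `rev_ChatGPT` account migration in this diff accepts both the old format (a dict carrying an `access_token` key) and the new one (a plain token string), flattening both into a list of strings. Pulled out as a standalone helper, the same logic looks like this (the function name is mine, not the project's):

```python
# Sketch of the rev_ChatGPT account normalization shown above: entries may be
# dicts with an "access_token" key or plain token strings; anything else is
# silently dropped, matching the diff's if/elif with no else branch.
def normalize_accounts(accounts):
    new_accs = []
    for i in accounts:
        if isinstance(i, dict) and "access_token" in i:
            new_accs.append(i["access_token"])
        elif isinstance(i, str):
            new_accs.append(i)
    return new_accs
```

Normalizing at load time means the rest of the dashboard only ever sees `list[str]`, so old config files keep working without a manual migration.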


@@ -8,6 +8,11 @@ import logging
from cores.database.conn import dbConn
from util.cmd_config import CmdConfig
import util.plugin_util as putil
import websockets
import json
import threading
import asyncio
import time
@dataclass
class DashBoardData():
@@ -23,14 +28,19 @@ class Response():
data: dict
class AstrBotDashBoard():
def __init__(self, dashboard_data: DashBoardData):
self.dashboard_data = dashboard_data
def __init__(self, global_object):
self.loop = asyncio.get_event_loop()
asyncio.set_event_loop(self.loop)
self.dashboard_data = global_object.dashboard_data
self.dashboard_be = Flask(__name__, static_folder="dist", static_url_path="/")
log = logging.getLogger('werkzeug')
log.setLevel(logging.ERROR)
self.funcs = {}
self.cc = CmdConfig()
self.logger = global_object.logger
self.ws_clients = {} # remote_ip: ws
# 启动 websocket 服务器
self.ws_server = websockets.serve(self.__handle_msg, "0.0.0.0", 6186)
@self.dashboard_be.get("/")
def index():
@@ -132,15 +142,6 @@ class AstrBotDashBoard():
@self.dashboard_be.get("/api/extensions")
def get_plugins():
"""
{
"name": "GoodPlugins",
"repo": "https://gitee.com/soulter/goodplugins",
"author": "soulter",
"desc": "一些好用的插件",
"version": "1.0"
}
"""
_plugin_resp = []
for plugin in self.dashboard_data.plugins:
_p = self.dashboard_data.plugins[plugin]
@@ -163,9 +164,9 @@ class AstrBotDashBoard():
post_data = request.json
repo_url = post_data["url"]
try:
gu.log(f"正在安装插件 {repo_url}", tag="可视化面板")
self.logger.log(f"正在安装插件 {repo_url}", tag="可视化面板")
putil.install_plugin(repo_url, self.dashboard_data.plugins)
gu.log(f"安装插件 {repo_url} 成功", tag="可视化面板")
self.logger.log(f"安装插件 {repo_url} 成功", tag="可视化面板")
return Response(
status="success",
message="安装成功~",
@@ -183,9 +184,9 @@ class AstrBotDashBoard():
post_data = request.json
plugin_name = post_data["name"]
try:
gu.log(f"正在卸载插件 {plugin_name}", tag="可视化面板")
self.logger.log(f"正在卸载插件 {plugin_name}", tag="可视化面板")
putil.uninstall_plugin(plugin_name, self.dashboard_data.plugins)
gu.log(f"卸载插件 {plugin_name} 成功", tag="可视化面板")
self.logger.log(f"卸载插件 {plugin_name} 成功", tag="可视化面板")
return Response(
status="success",
message="卸载成功~",
@@ -203,9 +204,9 @@ class AstrBotDashBoard():
post_data = request.json
plugin_name = post_data["name"]
try:
gu.log(f"正在更新插件 {plugin_name}", tag="可视化面板")
self.logger.log(f"正在更新插件 {plugin_name}", tag="可视化面板")
putil.update_plugin(plugin_name, self.dashboard_data.plugins)
gu.log(f"更新插件 {plugin_name} 成功", tag="可视化面板")
self.logger.log(f"更新插件 {plugin_name} 成功", tag="可视化面板")
return Response(
status="success",
message="更新成功~",
@@ -217,6 +218,15 @@ class AstrBotDashBoard():
message=e.__str__(),
data=None
).__dict__
@self.dashboard_be.post("/api/log")
def log():
for item in self.ws_clients:
try:
asyncio.run_coroutine_threadsafe(self.ws_clients[item].send(request.data.decode()), self.loop)
except Exception as e:
pass
return 'ok'
def register(self, name: str):
def decorator(func):
@@ -224,10 +234,35 @@ class AstrBotDashBoard():
return func
return decorator
async def __handle_msg(self, websocket, path):
address = websocket.remote_address
# self.logger.log(f"和 {address} 建立了 websocket 连接", tag="可视化面板")
self.ws_clients[address] = websocket
data = ''.join(self.logger.history).replace('\n', '\r\n')
await websocket.send(data)
while True:
try:
msg = await websocket.recv()
except websockets.exceptions.ConnectionClosedError:
# self.logger.log(f"和 {address} 的 websocket 连接已断开", tag="可视化面板")
del self.ws_clients[address]
break
except Exception as e:
# self.logger.log(f"和 {path} 的 websocket 连接发生了错误: {e.__str__()}", tag="可视化面板")
del self.ws_clients[address]
break
def run_ws_server(self, loop):
asyncio.set_event_loop(loop)
loop.run_until_complete(self.ws_server)
loop.run_forever()
def run(self):
threading.Thread(target=self.run_ws_server, args=(self.loop,)).start()
self.logger.log("已启动 websocket 服务器", tag="可视化面板")
ip_address = gu.get_local_ip_addresses()
ip_str = f"http://{ip_address}:6185\n\thttp://localhost:6185"
gu.log(f"\n\n==================\n您可访问:\n\n\t{ip_str}\n\n来登录可视化面板。\n注意: 所有配置项现已全量迁移至 cmd_config.json 文件下。您可以登录可视化面板在线修改配置。\n==================\n\n", tag="可视化面板")
# self.dashboard_be.run(host="0.0.0.0", port=6185)
http_server = make_server('0.0.0.0', 6185, self.dashboard_be)
self.logger.log(f"\n==================\n您可访问:\n\n\t{ip_str}\n\n来登录可视化面板,默认账号密码为空\n注意: 所有配置项现已全量迁移至 cmd_config.json 文件下,可登录可视化面板在线修改配置。\n==================\n", tag="可视化面板")
http_server = make_server('0.0.0.0', 6185, self.dashboard_be, threaded=True)
http_server.serve_forever()
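The `/api/log` route and the background websocket server cooperate across threads: the Flask request thread cannot `await websocket.send(...)` directly, so it schedules the coroutine onto the websocket server's event loop with `asyncio.run_coroutine_threadsafe`. A stripped-down, stdlib-only sketch of that cross-thread pattern, with a stand-in coroutine in place of `websocket.send`:

```python
import asyncio
import threading

# Background event loop in its own thread, as the dashboard does for
# its websocket server (names here are illustrative).
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

received = []

async def deliver(line: str):
    # Stand-in for websocket.send(): just record the payload.
    received.append(line)

def broadcast(line: str):
    # Called from the Flask request thread: hand the coroutine to the
    # background loop instead of awaiting it here.
    fut = asyncio.run_coroutine_threadsafe(deliver(line), loop)
    fut.result(timeout=5)  # block until delivery completes

broadcast("hello from /api/log")
print(received)  # ['hello from /api/log']
loop.call_soon_threadsafe(loop.stop)
```

In the diff itself the `fut.result()` wait is omitted and send failures are swallowed, which keeps `/api/log` non-blocking at the cost of silently dropping lines to dead clients.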

View File

@@ -4,7 +4,6 @@ from nakuru import (
FriendMessage
)
from botpy.message import Message, DirectMessage
from model.platform.qq import QQ
from cores.qqbot.global_object import (
AstrMessageEvent,
CommandResult

addons/plugins/llms Submodule

Submodule addons/plugins/llms added at ec088771a3

View File


@@ -1,5 +1,3 @@
import botpy
from botpy.message import Message, DirectMessage
import re
import json
import threading
@@ -11,46 +9,31 @@ import os
import sys
from cores.qqbot.personality import personalities
from addons.baidu_aip_judge import BaiduJudge
from model.platform.qqchan import QQChan, NakuruGuildMember, NakuruGuildMessage
from model.platform.qq import QQ
from model.platform.qqgroup import (
UnofficialQQBotSDK,
Event as QQEvent,
Message as QQMessage,
MessageChain,
PlainText
)
from nakuru import (
CQHTTP,
GroupMessage,
GroupMemberIncrease,
FriendMessage,
GuildMessage,
Notify
)
from model.platform._nakuru_translation_layer import NakuruGuildMember, NakuruGuildMessage
from nakuru.entities.components import Plain,At,Image
from model.provider.provider import Provider
from model.command.command import Command
from util import general_utils as gu
from util.general_utils import Logger
from util.cmd_config import CmdConfig as cc
from util.cmd_config import init_astrbot_config_items
import util.function_calling.gplugin as gplugin
import util.plugin_util as putil
from PIL import Image as PILImage
import io
import traceback
from . global_object import GlobalObject
from typing import Union, Callable
from typing import Union
from addons.dashboard.helper import DashBoardHelper
from addons.dashboard.server import DashBoardData
from cores.monitor.perf import run_monitor
from cores.database.conn import dbConn
# 缓存的会话
session_dict = {}
# 统计信息
count = {}
# 统计信息
stat_file = ''
from model.platform._message_result import MessageResult
# 用户发言频率
user_frequency = {}
@@ -59,158 +42,69 @@ frequency_time = 60
# 计数默认值
frequency_count = 2
# 公告(可自定义):
announcement = ""
# 机器人私聊模式
direct_message_mode = True
# 版本
version = '3.1.0'
version = '3.1.2'
# 语言模型
REV_CHATGPT = 'rev_chatgpt'
OPENAI_OFFICIAL = 'openai_official'
REV_ERNIE = 'rev_ernie'
REV_EDGEGPT = 'rev_edgegpt'
NONE_LLM = 'none_llm'
chosen_provider = None
# 语言模型对象
llm_instance: dict[str, Provider] = {}
llm_command_instance: dict[str, Command] = {}
llm_wake_prefix = ""
# 百度内容审核实例
baidu_judge = None
# 关键词回复
keywords = {}
# QQ频道机器人
qqchannel_bot: QQChan = None
PLATFORM_QQCHAN = 'qqchan'
qqchan_loop = None
client = None
# QQ群机器人
PLATFROM_QQBOT = 'qqbot'
# CLI
PLATFORM_CLI = 'cli'
# 加载默认配置
cc.init_attributes("qq_forward_threshold", 200)
cc.init_attributes("qq_welcome", "欢迎加入本群!\n欢迎给https://github.com/Soulter/QQChannelChatGPT项目一个Star😊~\n输入help查看帮助~\n")
cc.init_attributes("bing_proxy", "")
cc.init_attributes("qq_pic_mode", False)
cc.init_attributes("rev_chatgpt_model", "")
cc.init_attributes("rev_chatgpt_plugin_ids", [])
cc.init_attributes("rev_chatgpt_PUID", "")
cc.init_attributes("rev_chatgpt_unverified_plugin_domains", [])
cc.init_attributes("gocq_host", "127.0.0.1")
cc.init_attributes("gocq_http_port", 5700)
cc.init_attributes("gocq_websocket_port", 6700)
cc.init_attributes("gocq_react_group", True)
cc.init_attributes("gocq_react_guild", True)
cc.init_attributes("gocq_react_friend", True)
cc.init_attributes("gocq_react_group_increase", True)
cc.init_attributes("gocq_qqchan_admin", "")
cc.init_attributes("other_admins", [])
cc.init_attributes("CHATGPT_BASE_URL", "")
cc.init_attributes("qqbot_appid", "")
cc.init_attributes("qqbot_secret", "")
cc.init_attributes("llm_env_prompt", "> hint: 末尾根据内容和心情添加 1-2 个emoji")
cc.init_attributes("default_personality_str", "")
cc.init_attributes("openai_image_generate", {
"model": "dall-e-3",
"size": "1024x1024",
"style": "vivid",
"quality": "standard",
})
cc.init_attributes("http_proxy", "")
cc.init_attributes("https_proxy", "")
cc.init_attributes("dashboard_username", "")
cc.init_attributes("dashboard_password", "")
# cc.init_attributes(["qq_forward_mode"], False)
# QQ机器人
gocq_bot = None
PLATFORM_GOCQ = 'gocq'
gocq_app = CQHTTP(
host=cc.get("gocq_host", "127.0.0.1"),
port=cc.get("gocq_websocket_port", 6700),
http_port=cc.get("gocq_http_port", 5700),
)
qq_bot: UnofficialQQBotSDK = UnofficialQQBotSDK(
cc.get("qqbot_appid", None),
cc.get("qqbot_secret", None)
)
gocq_loop: asyncio.AbstractEventLoop = None
qqbot_loop: asyncio.AbstractEventLoop = None
init_astrbot_config_items()
# 全局对象
_global_object: GlobalObject = None
def new_sub_thread(func, args=()):
thread = threading.Thread(target=_runner, args=(func, args), daemon=True)
thread.start()
def _runner(func: Callable, args: tuple):
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.run_until_complete(func(*args))
loop.close()
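The `new_sub_thread`/`_runner` helpers above run each async handler on a fresh event loop inside its own daemon thread. The pattern in isolation, as a self-contained sketch:

```python
import asyncio
import threading

results = []

async def handle(x):
    await asyncio.sleep(0)
    results.append(x * 2)

def _runner(func, args):
    # Each worker thread owns a fresh event loop, mirroring the helpers
    # above; close it when the coroutine finishes.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        loop.run_until_complete(func(*args))
    finally:
        loop.close()

t = threading.Thread(target=_runner, args=(handle, (21,)), daemon=True)
t.start()
t.join()
print(results)  # [42]
```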
logger: Logger = Logger()
# 统计消息数据
def upload():
global version, gocq_bot, qqchannel_bot
global version
while True:
addr = ''
addr_ip = ''
session_dict_dump = '{}'
try:
addr = requests.get('http://myip.ipip.net', timeout=5).text
addr_ip = re.findall(r'\d+.\d+.\d+.\d+', addr)[0]
except BaseException as e:
pass
try:
gocq_cnt = 0
qqchan_cnt = 0
if gocq_bot is not None:
gocq_cnt = gocq_bot.get_cnt()
if qqchannel_bot is not None:
qqchan_cnt = qqchannel_bot.get_cnt()
o = {"cnt_total": _global_object.cnt_total,"admin": _global_object.admin_qq,"addr": addr, 's': session_dict_dump}
o = {
"cnt_total": _global_object.cnt_total,
"admin": _global_object.admin_qq,
}
o_j = json.dumps(o)
res = {"version": version, "count": gocq_cnt+qqchan_cnt, "ip": addr_ip, "others": o_j, "cntqc": qqchan_cnt, "cntgc": gocq_cnt}
gu.log(res, gu.LEVEL_DEBUG, tag="Upload", fg = gu.FG_COLORS['yellow'], bg=gu.BG_COLORS['black'])
res = {
"version": version,
"count": _global_object.cnt_total,
"cntqc": -1,
"cntgc": -1,
"ip": addr_ip,
"others": o_j,
"sys": sys.platform,
}
logger.log(res, gu.LEVEL_DEBUG, tag="Uploader")
resp = requests.post('https://api.soulter.top/upload', data=json.dumps(res), timeout=5)
# print(resp.text)
if resp.status_code == 200:
ok = resp.json()
if ok['status'] == 'ok':
_global_object.cnt_total = 0
if gocq_bot is not None:
gocq_cnt = gocq_bot.set_cnt(0)
if qqchannel_bot is not None:
qqchan_cnt = qqchannel_bot.set_cnt(0)
except BaseException as e:
gu.log("上传统计信息时出现错误: " + str(e), gu.LEVEL_ERROR, tag="Upload")
pass
time.sleep(10*60)
# 语言模型选择
def privider_chooser(cfg):
l = []
if 'rev_ChatGPT' in cfg and cfg['rev_ChatGPT']['enable']:
l.append('rev_chatgpt')
if 'rev_ernie' in cfg and cfg['rev_ernie']['enable']:
l.append('rev_ernie')
if 'rev_edgegpt' in cfg and cfg['rev_edgegpt']['enable']:
l.append('rev_edgegpt')
if 'openai' in cfg and len(cfg['openai']['key']) > 0 and cfg['openai']['key'][0] is not None:
l.append('openai_official')
return l
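`privider_chooser` scans the config dict and returns the key of every enabled backend. A hedged, self-contained reimplementation of the same idea (dict shapes assumed from the surrounding code, not verified against the full source):

```python
def provider_chooser(cfg: dict) -> list:
    # Collect enabled LLM backends from the config, in priority order.
    enabled = []
    if cfg.get('rev_ChatGPT', {}).get('enable'):
        enabled.append('rev_chatgpt')
    keys = cfg.get('openai', {}).get('key') or []
    if keys and keys[0] is not None:
        enabled.append('openai_official')
    return enabled

print(provider_chooser({'openai': {'key': ['sk-test']}}))  # ['openai_official']
```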
@@ -221,8 +115,9 @@ def privider_chooser(cfg):
def initBot(cfg):
global llm_instance, llm_command_instance
global baidu_judge, chosen_provider
global frequency_count, frequency_time, announcement, direct_message_mode
global frequency_count, frequency_time
global keywords, _global_object
global logger
# 迁移旧配置
gu.try_migrate_config(cfg)
@@ -238,6 +133,8 @@ def initBot(cfg):
_global_object.stat['session'] = {}
_global_object.stat['message'] = {}
_global_object.stat['platform'] = {}
_global_object.logger = logger
logger.log("AstrBot v"+version, gu.LEVEL_INFO)
if 'reply_prefix' in cfg:
# 适配旧版配置
@@ -249,10 +146,10 @@ def initBot(cfg):
_global_object.reply_prefix = cfg['reply_prefix']
# 语言模型提供商
gu.log("--------加载语言模型--------", gu.LEVEL_INFO, fg=gu.FG_COLORS['yellow'])
logger.log("正在载入语言模型...", gu.LEVEL_INFO)
prov = privider_chooser(cfg)
if REV_CHATGPT in prov:
gu.log("- 逆向ChatGPT库 -", gu.LEVEL_INFO)
logger.log("初始化:逆向 ChatGPT", gu.LEVEL_INFO)
if cfg['rev_ChatGPT']['enable']:
if 'account' in cfg['rev_ChatGPT']:
from model.provider.rev_chatgpt import ProviderRevChatGPT
@@ -261,24 +158,9 @@ def initBot(cfg):
llm_command_instance[REV_CHATGPT] = CommandRevChatGPT(llm_instance[REV_CHATGPT], _global_object)
chosen_provider = REV_CHATGPT
else:
input("[System-err] 请退出本程序, 然后在配置文件中填写rev_ChatGPT相关配置")
if REV_EDGEGPT in prov:
gu.log("- New Bing -", gu.LEVEL_INFO)
if not os.path.exists('./cookies.json'):
input("[System-err] 导入Bing模型时发生错误, 没有找到cookies文件或者cookies文件放置位置错误。windows启动器启动的用户请把cookies.json文件放到和启动器相同的目录下。\n如何获取请看https://github.com/Soulter/QQChannelChatGPT仓库介绍。")
else:
if cfg['rev_edgegpt']['enable']:
try:
from model.provider.rev_edgegpt import ProviderRevEdgeGPT
from model.command.rev_edgegpt import CommandRevEdgeGPT
llm_instance[REV_EDGEGPT] = ProviderRevEdgeGPT()
llm_command_instance[REV_EDGEGPT] = CommandRevEdgeGPT(llm_instance[REV_EDGEGPT], _global_object)
chosen_provider = REV_EDGEGPT
except BaseException as e:
print(traceback.format_exc())
gu.log("加载Bing模型时发生错误, 请检查1. cookies文件是否正确放置 2. 是否设置了代理(梯子)。", gu.LEVEL_ERROR, max_len=60)
input("请退出本程序, 然后在配置文件中填写rev_ChatGPT相关配置")
if OPENAI_OFFICIAL in prov:
gu.log("- OpenAI官方 -", gu.LEVEL_INFO)
logger.log("初始化:OpenAI官方", gu.LEVEL_INFO)
if cfg['openai']['key'] is not None and cfg['openai']['key'] != [None]:
from model.provider.openai_official import ProviderOpenAIOfficial
from model.command.openai_official import CommandOpenAIOfficial
@@ -286,7 +168,6 @@ def initBot(cfg):
llm_command_instance[OPENAI_OFFICIAL] = CommandOpenAIOfficial(llm_instance[OPENAI_OFFICIAL], _global_object)
chosen_provider = OPENAI_OFFICIAL
gu.log("--------加载配置--------", gu.LEVEL_INFO, fg=gu.FG_COLORS['yellow'])
# 得到关键词
if os.path.exists("keyword.json"):
with open("keyword.json", 'r', encoding='utf-8') as f:
@@ -296,53 +177,31 @@ def initBot(cfg):
p = cc.get("chosen_provider", None)
if p is not None and p in llm_instance:
chosen_provider = p
gu.log(f"将使用 {chosen_provider} 语言模型。", gu.LEVEL_INFO)
# 百度内容审核
if 'baidu_aip' in cfg and 'enable' in cfg['baidu_aip'] and cfg['baidu_aip']['enable']:
try:
baidu_judge = BaiduJudge(cfg['baidu_aip'])
gu.log("百度内容审核初始化成功", gu.LEVEL_INFO)
logger.log("百度内容审核初始化成功", gu.LEVEL_INFO)
except BaseException as e:
gu.log("百度内容审核初始化失败", gu.LEVEL_ERROR)
logger.log("百度内容审核初始化失败", gu.LEVEL_ERROR)
threading.Thread(target=upload, daemon=True).start()
# 得到私聊模式配置
if 'direct_message_mode' in cfg:
direct_message_mode = cfg['direct_message_mode']
gu.log("私聊功能: "+str(direct_message_mode), gu.LEVEL_INFO)
# 得到发言频率配置
if 'limit' in cfg:
gu.log("发言频率配置: "+str(cfg['limit']), gu.LEVEL_INFO)
if 'count' in cfg['limit']:
frequency_count = cfg['limit']['count']
if 'time' in cfg['limit']:
frequency_time = cfg['limit']['time']
# 得到公告配置
if 'notice' in cfg:
if cc.get("qq_welcome", None) != None and cfg['notice'] == '此机器人由Github项目QQChannelChatGPT驱动。':
announcement = cc.get("qq_welcome", None)
else:
announcement = cfg['notice']
gu.log("公告配置: " + announcement, gu.LEVEL_INFO)
try:
if 'uniqueSessionMode' in cfg and cfg['uniqueSessionMode']:
_global_object.uniqueSession = True
else:
_global_object.uniqueSession = False
gu.log("独立会话: "+str(_global_object.uniqueSession), gu.LEVEL_INFO)
except BaseException as e:
gu.log("独立会话配置错误: "+str(e), gu.LEVEL_ERROR)
gu.log(f"QQ开放平台AppID: {cfg['qqbot']['appid']} 令牌: {cfg['qqbot']['token']}")
if chosen_provider is None:
gu.log("检测到没有启动任何语言模型。", gu.LEVEL_CRITICAL)
logger.log("独立会话配置错误: "+str(e), gu.LEVEL_ERROR)
nick_qq = cc.get("nick_qq", None)
if nick_qq == None:
@@ -353,73 +212,37 @@ def initBot(cfg):
nick_qq = tuple(nick_qq)
_global_object.nick = nick_qq
thread_inst = None
# 语言模型唤醒词
global llm_wake_prefix
llm_wake_prefix = cc.get("llm_wake_prefix", "")
gu.log("--------加载插件--------", gu.LEVEL_INFO, fg=gu.FG_COLORS['yellow'])
logger.log("正在载入插件...", gu.LEVEL_INFO)
# 加载插件
_command = Command(None, _global_object)
ok, err = putil.plugin_reload(_global_object.cached_plugins)
if ok:
gu.log("加载插件完毕", gu.LEVEL_INFO)
logger.log(f"成功载入 {len(_global_object.cached_plugins)} 个插件", gu.LEVEL_INFO)
else:
gu.log(err, gu.LEVEL_ERROR)
logger.log(err, gu.LEVEL_ERROR)
if chosen_provider is None:
llm_command_instance[NONE_LLM] = _command
chosen_provider = NONE_LLM
gu.log("--------加载机器人平台--------", gu.LEVEL_INFO, fg=gu.FG_COLORS['yellow'])
admin_qq = cc.get('admin_qq', None)
admin_qqchan = cc.get('admin_qqchan', None)
if admin_qq == None:
gu.log("未设置管理者QQ号(管理者才能使用update/plugin等指令),如需设置,请编辑 cmd_config.json 文件", gu.LEVEL_WARNING)
if admin_qqchan == None:
gu.log("未设置管理者QQ频道用户号(管理者才能使用update/plugin等指令),如需设置,请编辑 cmd_config.json 文件。可在频道发送指令 !myid 获取", gu.LEVEL_WARNING)
_global_object.admin_qq = admin_qq
_global_object.admin_qqchan = admin_qqchan
global qq_bot, qqbot_loop
qqbot_loop = asyncio.new_event_loop()
if cc.get("qqbot_appid", '') != '' and cc.get("qqbot_secret", '') != '':
gu.log("- 启用QQ群机器人 -", gu.LEVEL_INFO)
thread_inst = threading.Thread(target=run_qqbot, args=(qqbot_loop, qq_bot,), daemon=True)
thread_inst.start()
logger.log("正在载入机器人消息平台", gu.LEVEL_INFO)
# logger.log("提示:需要添加管理员 ID 才能使用 update/plugin 等指令),可在可视化面板添加。(如已添加可忽略)", gu.LEVEL_WARNING)
platform_str = ""
# GOCQ
global gocq_bot
if 'gocqbot' in cfg and cfg['gocqbot']['enable']:
gu.log("- 启用QQ机器人 -", gu.LEVEL_INFO)
global gocq_app, gocq_loop
gocq_loop = asyncio.new_event_loop()
gocq_bot = QQ(True, cc, gocq_loop)
thread_inst = threading.Thread(target=run_gocq_bot, args=(gocq_loop, gocq_bot, gocq_app), daemon=True)
thread_inst.start()
else:
gocq_bot = QQ(False)
_global_object.platform_qq = gocq_bot
gu.log("机器人部署教程: https://github.com/Soulter/QQChannelChatGPT/wiki/", gu.LEVEL_INFO, fg=gu.FG_COLORS['yellow'])
gu.log("如果有任何问题, 请在 https://github.com/Soulter/QQChannelChatGPT 上提交 issue 或加群 322154837", gu.LEVEL_INFO, fg=gu.FG_COLORS['yellow'])
gu.log("请给 https://github.com/Soulter/QQChannelChatGPT 点个 star!", gu.LEVEL_INFO, fg=gu.FG_COLORS['yellow'])
logger.log("启用 QQ_GOCQ 机器人消息平台", gu.LEVEL_INFO)
threading.Thread(target=run_gocq_bot, args=(cfg, _global_object), daemon=True).start()
platform_str += "QQ_GOCQ,"
# QQ频道
if 'qqbot' in cfg and cfg['qqbot']['enable']:
gu.log("- 启用QQ频道机器人 -", gu.LEVEL_INFO)
global qqchannel_bot, qqchan_loop
qqchannel_bot = QQChan()
qqchan_loop = asyncio.new_event_loop()
_global_object.platform_qqchan = qqchannel_bot
thread_inst = threading.Thread(target=run_qqchan_bot, args=(cfg, qqchan_loop, qqchannel_bot), daemon=True)
thread_inst.start()
# thread.join()
if thread_inst == None:
gu.log("没有启用/成功启用任何机器人平台", gu.LEVEL_CRITICAL)
if 'qqbot' in cfg and cfg['qqbot']['enable'] and cfg['qqbot']['appid'] != None:
logger.log("启用 QQ_OFFICIAL 机器人消息平台", gu.LEVEL_INFO)
threading.Thread(target=run_qqchan_bot, args=(cfg, _global_object), daemon=True).start()
platform_str += "QQ_OFFICIAL,"
default_personality_str = cc.get("default_personality_str", "")
if default_personality_str == "":
@@ -436,17 +259,19 @@ def initBot(cfg):
logs={},
plugins=_global_object.cached_plugins,
)
dashboard_helper = DashBoardHelper(_global_object.dashboard_data, config=cc.get_all())
dashboard_helper = DashBoardHelper(_global_object, config=cc.get_all())
dashboard_thread = threading.Thread(target=dashboard_helper.run, daemon=True)
dashboard_thread.start()
# 运行 monitor
threading.Thread(target=run_monitor, args=(_global_object,), daemon=False).start()
gu.log("🎉 项目启动完成。")
# asyncio.get_event_loop().run_until_complete(cli())
logger.log("如果有任何问题, 请在 https://github.com/Soulter/AstrBot 上提交 issue 或加群 322154837。", gu.LEVEL_INFO)
logger.log("请给 https://github.com/Soulter/AstrBot 点个 star。", gu.LEVEL_INFO)
if platform_str == '':
platform_str = "(未启动任何平台,请前往面板添加)"
logger.log(f"🎉 项目启动完成\n - 启动的LLM: {len(llm_instance)}\n - 启动的平台: {platform_str}\n - 启动的插件: {len(_global_object.cached_plugins)}")
dashboard_thread.join()
async def cli():
@@ -478,56 +303,42 @@ async def cli_pack_message(prompt: str) -> NakuruGuildMessage:
return ngm
'''
运行QQ频道机器人
运行 QQ_OFFICIAL 机器人
'''
def run_qqchan_bot(cfg, loop, qqchannel_bot: QQChan):
asyncio.set_event_loop(loop)
intents = botpy.Intents(public_guild_messages=True, direct_message=True)
global client
client = botClient(
intents=intents,
bot_log=False
)
def run_qqchan_bot(cfg: dict, global_object: GlobalObject):
try:
qqchannel_bot.run_bot(client, cfg['qqbot']['appid'], cfg['qqbot']['token'])
from model.platform.qq_official import QQOfficial
qqchannel_bot = QQOfficial(cfg=cfg, message_handler=oper_msg, global_object=global_object)
global_object.platform_qqchan = qqchannel_bot
qqchannel_bot.run()
except BaseException as e:
gu.log("启动QQ频道机器人时出现错误, 原因如下: " + str(e), gu.LEVEL_CRITICAL, tag="QQ频道")
gu.log(r"如果您是初次启动,请修改配置文件QQChannelChatGPT/config.yaml详情请看https://github.com/Soulter/QQChannelChatGPT/wiki" + str(e), gu.LEVEL_CRITICAL, tag="System")
i = input("按回车退出程序。\n")
logger.log("启动QQ频道机器人时出现错误, 原因如下: " + str(e), gu.LEVEL_CRITICAL, tag="QQ频道")
logger.log(r"如果您是初次启动,请前往可视化面板填写配置。详情请看https://astrbot.soulter.top/center/" + str(e), gu.LEVEL_CRITICAL)
'''
运行GOCQ机器人
运行 QQ_GOCQ 机器人
'''
def run_gocq_bot(loop, gocq_bot, gocq_app):
asyncio.set_event_loop(loop)
gu.log("正在检查本地GO-CQHTTP连接...端口5700, 6700", tag="QQ")
def run_gocq_bot(cfg: dict, _global_object: GlobalObject):
from model.platform.qq_gocq import QQGOCQ
logger.log("正在检查本地GO-CQHTTP连接...端口5700, 6700", tag="QQ")
noticed = False
while True:
if not gu.port_checker(5700, cc.get("gocq_host", "127.0.0.1")) or not gu.port_checker(6700, cc.get("gocq_host", "127.0.0.1")):
if not noticed:
noticed = True
gu.log("与GO-CQHTTP通信失败, 请检查GO-CQHTTP是否启动并正确配置。程序会每隔 5s 自动重试。", gu.LEVEL_CRITICAL, tag="QQ")
logger.log("与GO-CQHTTP通信失败, 请检查GO-CQHTTP是否启动并正确配置。程序会每隔 5s 自动重试。", gu.LEVEL_CRITICAL, tag="QQ")
time.sleep(5)
else:
gu.log("检查完毕,未发现问题。", tag="QQ")
logger.log("检查完毕,未发现问题。", tag="QQ")
break
global gocq_client
gocq_client = gocqClient()
try:
gocq_bot.run_bot(gocq_app)
qq_gocq = QQGOCQ(cfg=cfg, message_handler=oper_msg, global_object=_global_object)
_global_object.platform_qq = qq_gocq
qq_gocq.run()
except BaseException as e:
input("启动QQ机器人出现错误"+str(e))
'''
启动QQ群机器人(官方接口)
'''
def run_qqbot(loop: asyncio.AbstractEventLoop, qq_bot: UnofficialQQBotSDK):
asyncio.set_event_loop(loop)
QQBotClient()
qq_bot.run_bot()
'''
检查发言频率
@@ -550,133 +361,51 @@ def check_frequency(id) -> bool:
user_frequency[id] = t
return True
'''
通用消息回复
'''
async def send_message(platform, message, res, session_id = None):
global qqchannel_bot, gocq_loop, session_dict
# 统计会话信息
if session_id is not None:
if session_id not in session_dict:
session_dict[session_id] = {'cnt': 1}
else:
session_dict[session_id]['cnt'] += 1
else:
session_dict[session_id]['cnt'] += 1
async def record_message(platform: str, session_id: str):
# TODO: 这里会非常吃资源。然而 sqlite3 不支持多线程,所以暂时这样写。
curr_ts = int(time.time())
db_inst = dbConn()
db_inst.increment_stat_session(platform, session_id, 1)
db_inst.increment_stat_message(curr_ts, 1)
db_inst.increment_stat_platform(curr_ts, platform, 1)
if platform == PLATFORM_QQCHAN:
qqchannel_bot.send_qq_msg(message, res)
elif platform == PLATFORM_GOCQ:
await gocq_bot.send_qq_msg(message, res)
elif platform == PLATFROM_QQBOT:
message_chain = MessageChain()
message_chain.parse_from_nakuru(res)
await qq_bot.send(message, message_chain)
elif platform == PLATFORM_CLI:
print(res)
_global_object.cnt_total += 1
async def oper_msg(message: Union[GroupMessage, FriendMessage, GuildMessage, NakuruGuildMessage],
group: bool=False,
platform: str = None):
session_id: str,
role: str = 'member',
platform: str = None,
) -> MessageResult:
"""
处理消息。
group: 群聊模式,
message: 频道是频道的消息对象, QQ是nakuru-gocq的消息对象
msg_ref: 引用消息(频道)
message: 消息对象
session_id: 该消息源的唯一识别号
role: member | admin
platform: 平台(gocq, qqchan)
"""
global chosen_provider, keywords, qqchannel_bot, gocq_bot
global _global_object
qq_msg = ''
session_id = ''
user_id = ''
role = "member" # 角色, member或admin
global chosen_provider, keywords, _global_object
message_str = ''
session_id = session_id
role = role
hit = False # 是否命中指令
command_result = () # 调用指令返回的结果
_global_object.cnt_total += 1
# 统计数据,如频道消息量
record_message(platform, session_id)
with_tag = False # 是否带有昵称
if platform == PLATFORM_QQCHAN or platform == PLATFROM_QQBOT or platform == PLATFORM_CLI:
with_tag = True
_len = 0
for i in message.message:
if isinstance(i, Plain) or isinstance(i, PlainText):
qq_msg += str(i.text).strip()
if isinstance(i, At):
if message.type == "GuildMessage":
if i.qq == message.user_id or i.qq == message.self_tiny_id:
with_tag = True
if message.type == "FriendMessage":
if i.qq == message.self_id:
with_tag = True
if message.type == "GroupMessage":
if i.qq == message.self_id:
with_tag = True
for i in _global_object.nick:
if i != '' and qq_msg.startswith(i):
_len = len(i)
with_tag = True
break
qq_msg = qq_msg[_len:].strip()
gu.log(f"收到消息:{qq_msg}", gu.LEVEL_INFO, tag="QQ")
user_id = message.user_id
if group:
# 适配GO-CQHTTP的频道功能
if message.type == "GuildMessage":
session_id = message.channel_id
else:
session_id = message.group_id
else:
with_tag = True
session_id = message.user_id
if message.type == "GuildMessage":
sender_id = str(message.sender.tiny_id)
else:
sender_id = str(message.sender.user_id)
if sender_id == _global_object.admin_qq or \
sender_id == _global_object.admin_qqchan or \
sender_id in cc.get("other_admins", []) or \
sender_id == cc.get("gocq_qqchan_admin", "") or \
platform == PLATFORM_CLI:
role = "admin"
if _global_object.uniqueSession:
# 独立会话时,一个用户一个 session
session_id = sender_id
if qq_msg == "":
await send_message(platform, message, f"Hi~", session_id=session_id)
return
if isinstance(i, Plain):
message_str += i.text.strip()
if message_str == "":
return MessageResult("Hi~")
if with_tag:
# 检查发言频率
if not check_frequency(user_id):
await send_message(platform, message, f'你的发言超过频率限制(╯▔皿▔)╯。\n管理员设置{frequency_time}秒内只能提问{frequency_count}次。', session_id=session_id)
return
# logf.write("[GOCQBOT] "+ qq_msg+'\n')
# logf.flush()
# 检查发言频率
user_id = message.user_id
if not check_frequency(user_id):
return MessageResult(f'你的发言超过频率限制(╯▔皿▔)╯。\n管理员设置{frequency_time}秒内只能提问{frequency_count}次。')
# 关键词回复
for k in keywords:
if qq_msg == k:
if message_str == k:
plain_text = ""
if 'plain_text' in keywords[k]:
plain_text = keywords[k]['plain_text']
@@ -687,45 +416,32 @@ async def oper_msg(message: Union[GroupMessage, FriendMessage, GuildMessage, Nak
image_url = keywords[k]['image_url']
if image_url != "":
res = [Plain(plain_text), Image.fromURL(image_url)]
await send_message(platform, message, res, session_id=session_id)
else:
await send_message(platform, message, plain_text, session_id=session_id)
return
return MessageResult(res)
return MessageResult(plain_text)
# 检查是否是更换语言模型的请求
temp_switch = ""
if qq_msg.startswith('/bing') or qq_msg.startswith('/gpt') or qq_msg.startswith('/revgpt'):
if message_str.startswith('/gpt') or message_str.startswith('/revgpt'):
target = chosen_provider
if qq_msg.startswith('/bing'):
target = REV_EDGEGPT
elif qq_msg.startswith('/gpt'):
if message_str.startswith('/gpt'):
target = OPENAI_OFFICIAL
elif qq_msg.startswith('/revgpt'):
elif message_str.startswith('/revgpt'):
target = REV_CHATGPT
l = qq_msg.split(' ')
l = message_str.split(' ')
if len(l) > 1 and l[1] != "":
# 临时对话模式,先记录下之前的语言模型,回答完毕后再切回
temp_switch = chosen_provider
chosen_provider = target
qq_msg = l[1]
message_str = l[1]
else:
chosen_provider = target
cc.put("chosen_provider", chosen_provider)
await send_message(platform, message, f"已切换至【{chosen_provider}】", session_id=session_id)
return
chatgpt_res = ""
return MessageResult(f"已切换至【{chosen_provider}】")
# 如果是等待回复的消息
if platform == PLATFORM_GOCQ and session_id in gocq_bot.waiting and gocq_bot.waiting[session_id] == '':
gocq_bot.waiting[session_id] = message
return
if platform == PLATFORM_QQCHAN and session_id in qqchannel_bot.waiting and qqchannel_bot.waiting[session_id] == '':
qqchannel_bot.waiting[session_id] = message
return
llm_result_str = ""
hit, command_result = llm_command_instance[chosen_provider].check_command(
qq_msg,
message_str,
session_id,
role,
platform,
@@ -734,23 +450,20 @@ async def oper_msg(message: Union[GroupMessage, FriendMessage, GuildMessage, Nak
# 没触发指令
if not hit:
if not with_tag:
return
# 关键词拦截
for i in uw.unfit_words_q:
matches = re.match(i, qq_msg.strip(), re.I | re.M)
matches = re.match(i, message_str.strip(), re.I | re.M)
if matches:
await send_message(platform, message, f"你的提问得到的回复未通过【自有关键词拦截】服务, 不予回复。", session_id=session_id)
return
return MessageResult(f"你的提问得到的回复未通过【默认关键词拦截】服务, 不予回复。")
if baidu_judge != None:
check, msg = baidu_judge.judge(qq_msg)
check, msg = baidu_judge.judge(message_str)
if not check:
await send_message(platform, message, f"你的提问得到的回复未通过【百度AI内容审核】服务, 不予回复。\n\n{msg}", session_id=session_id)
return
if chosen_provider == None:
await send_message(platform, message, f"管理员未启动任何语言模型或者语言模型初始化时失败。", session_id=session_id)
return
return MessageResult(f"你的提问得到的回复未通过【百度AI内容审核】服务, 不予回复。\n\n{msg}")
if chosen_provider == NONE_LLM:
return MessageResult("没有启动任何 LLM 并且未触发任何指令。")
try:
if llm_wake_prefix != "" and not message_str.startswith(llm_wake_prefix):
return
# check image url
image_url = None
for comp in message.message:
@@ -763,35 +476,22 @@ async def oper_msg(message: Union[GroupMessage, FriendMessage, GuildMessage, Nak
break
# web search keyword
web_sch_flag = False
if qq_msg.startswith("ws ") and qq_msg != "ws ":
qq_msg = qq_msg[3:]
if message_str.startswith("ws ") and message_str != "ws ":
message_str = message_str[3:]
web_sch_flag = True
else:
qq_msg += " " + cc.get("llm_env_prompt", "")
message_str += " " + cc.get("llm_env_prompt", "")
if chosen_provider == REV_CHATGPT or chosen_provider == OPENAI_OFFICIAL:
if _global_object.web_search or web_sch_flag:
official_fc = chosen_provider == OPENAI_OFFICIAL
chatgpt_res = gplugin.web_search(qq_msg, llm_instance[chosen_provider], session_id, official_fc)
llm_result_str = gplugin.web_search(message_str, llm_instance[chosen_provider], session_id, official_fc)
else:
chatgpt_res = str(llm_instance[chosen_provider].text_chat(qq_msg, session_id, image_url, default_personality = _global_object.default_personality))
elif chosen_provider == REV_EDGEGPT:
res, res_code = await llm_instance[chosen_provider].text_chat(qq_msg, platform)
if res_code == 0: # bing不想继续话题重置会话后重试。
await send_message(platform, message, "Bing不想继续话题了, 正在自动重置会话并重试。", session_id=session_id)
await llm_instance[chosen_provider].forget()
res, res_code = await llm_instance[chosen_provider].text_chat(qq_msg, platform)
if res_code == 0: # bing还是不想继续话题大概率说明提问有问题。
await llm_instance[chosen_provider].forget()
await send_message(platform, message, "Bing仍然不想继续话题, 会话已重置, 请检查您的提问后重试。", session_id=session_id)
res = ""
chatgpt_res = str(res)
llm_result_str = str(llm_instance[chosen_provider].text_chat(message_str, session_id, image_url, default_personality = _global_object.default_personality))
chatgpt_res = _global_object.reply_prefix + chatgpt_res
llm_result_str = _global_object.reply_prefix + llm_result_str
except BaseException as e:
gu.log(f"调用异常:{traceback.format_exc()}", gu.LEVEL_ERROR, max_len=100000)
gu.log("调用语言模型例程时出现异常。原因: "+str(e), gu.LEVEL_ERROR)
await send_message(platform, message, "调用语言模型例程时出现异常。原因: "+str(e), session_id=session_id)
return
logger.log(f"调用异常:{traceback.format_exc()}", gu.LEVEL_ERROR)
return MessageResult(f"调用语言模型例程时出现异常。原因: {str(e)}")
# 切换回原来的语言模型
if temp_switch != "":
@@ -799,139 +499,58 @@ async def oper_msg(message: Union[GroupMessage, FriendMessage, GuildMessage, Nak
# 指令回复
if hit:
# 检查指令. command_result是一个元组(指令调用是否成功, 指令返回的文本结果, 指令类型)
# 检查指令command_result 是一个元组:(指令调用是否成功, 指令返回的文本结果, 指令类型)
if command_result == None:
return
command = command_result[2]
if command == "keyword":
if os.path.exists("keyword.json"):
with open("keyword.json", "r", encoding="utf-8") as f:
keywords = json.load(f)
else:
try:
await send_message(platform, message, command_result[1], session_id=session_id)
return MessageResult(command_result[1])
except BaseException as e:
await send_message(platform, message, f"回复消息出错: {str(e)}", session_id=session_id)
return MessageResult(f"回复消息出错: {str(e)}")
if command == "update latest r":
await send_message(platform, message, command_result[1] + "\n\n即将自动重启。", session_id=session_id)
py = sys.executable
os.execl(py, py, *sys.argv)
def update_restart():
py = sys.executable
os.execl(py, py, *sys.argv)
return MessageResult(command_result[1] + "\n\n即将自动重启。", callback=update_restart)
if not command_result[0]:
await send_message(platform, message, f"指令调用错误: \n{str(command_result[1])}", session_id=session_id)
return
return MessageResult(f"指令调用错误: \n{str(command_result[1])}")
# 画图指令
if isinstance(command_result[1], list) and len(command_result) == 3 and command == 'draw':
for i in command_result[1]:
# i is a link
# 保存到本地
pic_res = requests.get(i, stream = True)
if pic_res.status_code == 200:
image = PILImage.open(io.BytesIO(pic_res.content))
await send_message(platform, message, [Image.fromFileSystem(gu.save_temp_img(image))], session_id=session_id)
return MessageResult([Image.fromFileSystem(gu.save_temp_img(image))])
# 其他指令
else:
try:
await send_message(platform, message, command_result[1], session_id=session_id)
return MessageResult(command_result[1])
except BaseException as e:
await send_message(platform, message, f"回复消息出错: {str(e)}", session_id=session_id)
return MessageResult(f"回复消息出错: {str(e)}")
return
# 记录日志
# logf.write(f"{reply_prefix} {str(chatgpt_res)}\n")
# logf.flush()
# 敏感过滤
# 过滤不合适的词
for i in uw.unfit_words:
chatgpt_res = re.sub(i, "***", chatgpt_res)
llm_result_str = re.sub(i, "***", llm_result_str)
# 百度内容审核服务二次审核
if baidu_judge != None:
check, msg = baidu_judge.judge(chatgpt_res)
check, msg = baidu_judge.judge(llm_result_str)
if not check:
await send_message(platform, message, f"你的提问得到的回复【百度内容审核】未通过,不予回复。\n\n{msg}", session_id=session_id)
return
return MessageResult(f"你的提问得到的回复【百度内容审核】未通过,不予回复。\n\n{msg}")
# 发送信息
try:
await send_message(platform, message, chatgpt_res, session_id=session_id)
return MessageResult(llm_result_str)
except BaseException as e:
gu.log("回复消息错误: \n"+str(e), gu.LEVEL_ERROR)
# QQ频道机器人
class botClient(botpy.Client):
# 收到频道消息
async def on_at_message_create(self, message: Message):
gu.log(str(message), gu.LEVEL_DEBUG, max_len=9999)
# 转换层
nakuru_guild_message = qqchannel_bot.gocq_compatible_receive(message)
gu.log(f"转换后: {str(nakuru_guild_message)}", gu.LEVEL_DEBUG, max_len=9999)
new_sub_thread(oper_msg, (nakuru_guild_message, True, PLATFORM_QQCHAN))
# 收到私聊消息
async def on_direct_message_create(self, message: DirectMessage):
if direct_message_mode:
# 转换层
nakuru_guild_message = qqchannel_bot.gocq_compatible_receive(message)
gu.log(f"转换后: {str(nakuru_guild_message)}", gu.LEVEL_DEBUG, max_len=9999)
new_sub_thread(oper_msg, (nakuru_guild_message, False, PLATFORM_QQCHAN))
# QQ机器人
class gocqClient():
# 收到群聊消息
@gocq_app.receiver("GroupMessage")
async def _(app: CQHTTP, source: GroupMessage):
if cc.get("gocq_react_group", True):
if isinstance(source.message[0], Plain):
new_sub_thread(oper_msg, (source, True, PLATFORM_GOCQ))
if isinstance(source.message[0], At):
if source.message[0].qq == source.self_id:
new_sub_thread(oper_msg, (source, True, PLATFORM_GOCQ))
else:
return
@gocq_app.receiver("FriendMessage")
async def _(app: CQHTTP, source: FriendMessage):
if cc.get("gocq_react_friend", True):
if isinstance(source.message[0], Plain):
new_sub_thread(oper_msg, (source, False, PLATFORM_GOCQ))
else:
return
@gocq_app.receiver("GroupMemberIncrease")
async def _(app: CQHTTP, source: GroupMemberIncrease):
if cc.get("gocq_react_group_increase", True):
global announcement
await app.sendGroupMessage(source.group_id, [
Plain(text = announcement),
])
@gocq_app.receiver("Notify")
async def _(app: CQHTTP, source: Notify):
print(source)
if source.sub_type == "poke" and source.target_id == source.self_id:
new_sub_thread(oper_msg, (source, False, PLATFORM_GOCQ))
@gocq_app.receiver("GuildMessage")
async def _(app: CQHTTP, source: GuildMessage):
if cc.get("gocq_react_guild", True):
if isinstance(source.message[0], Plain):
new_sub_thread(oper_msg, (source, True, PLATFORM_GOCQ))
if isinstance(source.message[0], At):
if source.message[0].qq == source.self_tiny_id:
new_sub_thread(oper_msg, (source, True, PLATFORM_GOCQ))
else:
return
class QQBotClient():
@qq_bot.on('GroupMessage')
async def _(bot: UnofficialQQBotSDK, message: QQMessage):
print(message)
new_sub_thread(oper_msg, (message, True, PLATFROM_QQBOT))
logger.log("回复消息错误: \n"+str(e), gu.LEVEL_ERROR)
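The sensitive-word masking step above runs `re.sub` over every pattern in `uw.unfit_words` before the LLM reply is sent. A minimal self-contained sketch (the pattern list here is a stand-in, not the project's real `uw.unfit_words`):

```python
import re

# Stand-in for uw.unfit_words: each entry is a regex pattern to censor.
unfit_words = [r"foo", r"bar\d+"]

def mask(llm_result_str: str) -> str:
    # Replace every match of every pattern with "***", as in oper_msg above.
    for pattern in unfit_words:
        llm_result_str = re.sub(pattern, "***", llm_result_str)
    return llm_result_str

print(mask("foo says bar42"))  # -> *** says ***
```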


@@ -1,5 +1,5 @@
from model.platform.qqchan import QQChan, NakuruGuildMember, NakuruGuildMessage
from model.platform.qq import QQ
from model.platform.qq_official import QQOfficial, NakuruGuildMember, NakuruGuildMessage
from model.platform.qq_gocq import QQGOCQ
from model.provider.provider import Provider
from addons.dashboard.server import DashBoardData
from nakuru import (
@@ -17,7 +17,7 @@ class GlobalObject:
存放一些公用的数据,用于在不同模块(如core与command)之间传递
'''
nick: str # gocq 的昵称
base_config: dict # config.yaml
base_config: dict # config.json
cached_plugins: dict # 缓存的插件
web_search: bool # 是否开启了网页搜索
reply_prefix: str
@@ -25,11 +25,12 @@ class GlobalObject:
admin_qqchan: str
uniqueSession: bool
cnt_total: int
platform_qq: QQ
platform_qqchan: QQChan
platform_qq: QQGOCQ
platform_qqchan: QQOfficial
default_personality: dict
dashboard_data: DashBoardData
stat: dict
logger: None
def __init__(self):
self.nick = None # gocq 的昵称
@@ -46,35 +47,13 @@ class GlobalObject:
self.default_personality = None
self.dashboard_data = None
self.stat = {}
'''
{
"config": {},
"session": [
{
"platform": "qq",
"session_id": 123456,
"cnt": 0
},
{...}
],
"message": [
// 以一小时为单位
{
"ts": 1234567,
"cnt": 0
}
]
}
'''
class AstrMessageEvent():
message_str: str # 纯消息字符串
message_obj: Union[GroupMessage, FriendMessage, GuildMessage, NakuruGuildMessage] # 消息对象
gocq_platform: QQ
qq_sdk_platform: QQChan
gocq_platform: QQGOCQ
qq_sdk_platform: QQOfficial
platform: str # `gocq` 或 `qqchan`
role: str # `admin` 或 `member`
global_object: GlobalObject # 一些公用数据
@@ -82,8 +61,8 @@ class AstrMessageEvent():
def __init__(self, message_str: str,
message_obj: Union[GroupMessage, FriendMessage, GuildMessage, NakuruGuildMessage],
gocq_platform: QQ,
qq_sdk_platform: QQChan,
gocq_platform: QQGOCQ,
qq_sdk_platform: QQOfficial,
platform: str,
role: str,
global_object: GlobalObject,

main.py

@@ -3,42 +3,35 @@ from pip._internal import main as pipmain
import warnings
import traceback
import threading
import logging
warnings.filterwarnings("ignore")
abs_path = os.path.dirname(os.path.realpath(sys.argv[0])) + '/'
def main():
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
datefmt='%H:%M:%S',
)
# config.yaml 配置文件加载和环境确认
try:
import cores.qqbot.core as qqBot
import yaml
from yaml.scanner import ScannerError
import util.general_utils as gu
ymlfile = open(abs_path+"configs/config.yaml", 'r', encoding='utf-8')
cfg = yaml.safe_load(ymlfile)
except ImportError as import_error:
traceback.print_exc()
print(import_error)
input("第三方库未完全安装完毕,请退出程序重试。")
except FileNotFoundError as file_not_found:
print(file_not_found)
input("配置文件不存在,请检查是否已经下载配置文件。")
except ScannerError as e:
print(traceback.format_exc())
input("config.yaml 配置文件格式错误,请遵守 yaml 格式。")
except BaseException as e:
print(e)
# 设置代理
if 'http_proxy' in cfg and cfg['http_proxy'] != '':
os.environ['HTTP_PROXY'] = cfg['http_proxy']
if 'https_proxy' in cfg and cfg['https_proxy'] != '':
os.environ['HTTPS_PROXY'] = cfg['https_proxy']
os.environ['NO_PROXY'] = 'cn.bing.com,https://api.sgroup.qq.com'
os.environ['NO_PROXY'] = 'https://api.sgroup.qq.com'
# 检查并创建 temp 文件夹
if not os.path.exists(abs_path + "temp"):
@@ -57,13 +50,13 @@ def check_env(ch_mirror=False):
pth = 'requirements.txt'
else:
pth = 'QQChannelChatGPT'+ os.sep +'requirements.txt'
print("正在检查更新第三方库...")
print("正在检查或下载第三方库,请耐心等待...")
try:
if ch_mirror:
print("使用阿里云镜像")
pipmain(['install', '-r', pth, '-i', 'https://mirrors.aliyun.com/pypi/simple/', '--quiet'])
pipmain(['install', '-r', pth, '-i', 'https://mirrors.aliyun.com/pypi/simple/'])
else:
pipmain(['install', '-r', pth, '--quiet'])
pipmain(['install', '-r', pth])
except BaseException as e:
print(e)
while True:
@@ -86,18 +79,6 @@ def check_env(ch_mirror=False):
break
print("第三方库检查完毕。")
def get_platform():
import platform
sys_platform = platform.platform().lower()
if "windows" in sys_platform:
return "win"
elif "macos" in sys_platform:
return "mac"
elif "linux" in sys_platform:
return "linux"
else:
print("other")
if __name__ == "__main__":
args = sys.argv
@@ -105,15 +86,6 @@ if __name__ == "__main__":
check_env(True)
else:
check_env()
if '-replit' in args:
print("[System] 启动Replit Web保活服务...")
try:
from util.webapp_replit import keep_alive
keep_alive()
except BaseException as e:
print(e)
print(f"[System-err] Replit Web保活服务启动失败:{str(e)}")
t = threading.Thread(target=main, daemon=False)
t.start()
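The proxy setup in `main()` above copies `http_proxy`/`https_proxy` from the config into the process environment and exempts the QQ open-platform endpoint. A sketch with a stand-in config dict (the values here are hypothetical, not from the real config.json):

```python
import os

# Stand-in for the loaded config values.
cfg = {"http_proxy": "http://127.0.0.1:7890", "https_proxy": ""}

# Only set the variables when the config actually provides a proxy.
if cfg.get("http_proxy"):
    os.environ["HTTP_PROXY"] = cfg["http_proxy"]
if cfg.get("https_proxy"):
    os.environ["HTTPS_PROXY"] = cfg["https_proxy"]
# The QQ open-platform API must bypass the proxy.
os.environ["NO_PROXY"] = "https://api.sgroup.qq.com"
```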


@@ -5,7 +5,6 @@ try:
import git.exc
from git.repo import Repo
except BaseException as e:
gu.log("你正运行在无Git环境下,暂时将无法使用插件、热更新功能。")
has_git = False
import os
import sys
@@ -13,11 +12,9 @@ import requests
from model.provider.provider import Provider
import json
import util.plugin_util as putil
import shutil
import importlib
from util.cmd_config import CmdConfig as cc
from model.platform.qq import QQ
import stat
from util.general_utils import Logger
import util.updator
from nakuru.entities.components import (
Plain,
Image
@@ -35,6 +32,7 @@ class Command:
def __init__(self, provider: Provider, global_object: GlobalObject = None):
self.provider = provider
self.global_object = global_object
self.logger: Logger = global_object.logger
def check_command(self,
message,
@@ -60,7 +58,6 @@ class Command:
if isinstance(result, CommandResult):
hit = result.hit
res = result._result_tuple()
print(hit, res)
elif isinstance(result, tuple):
hit = result[0]
res = result[1]
@@ -75,16 +72,16 @@ class Command:
if hit:
return True, res
except BaseException as e:
gu.log(f"{k}插件异常,原因: {str(e)}\n已安装插件: {cached_plugins.keys}\n如果你没有相关装插件的想法, 请直接忽略此报错, 不影响其他功能的运行。", level=gu.LEVEL_WARNING)
self.logger.log(f"{k}插件异常,原因: {str(e)}\n已安装插件: {cached_plugins.keys}\n如果你没有相关装插件的想法, 请直接忽略此报错, 不影响其他功能的运行。", level=gu.LEVEL_WARNING)
except BaseException as e:
gu.log(f"{k} 插件异常,原因: {str(e)}\n已安装插件: {cached_plugins.keys}\n如果你没有相关装插件的想法, 请直接忽略此报错, 不影响其他功能的运行。", level=gu.LEVEL_WARNING)
self.logger.log(f"{k} 插件异常,原因: {str(e)}\n已安装插件: {cached_plugins.keys}\n如果你没有相关装插件的想法, 请直接忽略此报错, 不影响其他功能的运行。", level=gu.LEVEL_WARNING)
if self.command_start_with(message, "nick"):
return True, self.set_nick(message, platform, role)
if self.command_start_with(message, "plugin"):
return True, self.plugin_oper(message, role, cached_plugins, platform)
if self.command_start_with(message, "myid") or self.command_start_with(message, "!myid"):
return True, self.get_my_id(message_obj)
return True, self.get_my_id(message_obj, platform)
if self.command_start_with(message, "nconf") or self.command_start_with(message, "newconf"):
return True, self.get_new_conf(message, role)
if self.command_start_with(message, "web"): # 网页搜索
@@ -99,17 +96,25 @@ class Command:
return False, None
def web_search(self, message):
if message == "web on":
l = message.split(' ')
if len(l) == 1:
return True, f"网页搜索功能当前状态: {self.global_object.web_search}", "web"
elif l[1] == 'on':
self.global_object.web_search = True
return True, "已开启网页搜索", "web"
elif message == "web off":
elif l[1] == 'off':
self.global_object.web_search = False
return True, "已关闭网页搜索", "web"
return True, f"网页搜索功能当前状态: {self.global_object.web_search}", "web"
def get_my_id(self, message_obj):
return True, f"你的ID: {str(message_obj.sender.tiny_id)}", "plugin"
def get_my_id(self, message_obj, platform):
user_id = "Unknown"
if platform == PLATFORM_QQCHAN:
user_id = str(message_obj.sender.tiny_id)
elif platform == PLATFORM_GOCQ:
user_id = str(message_obj.user_id)
return True, f"你在此平台上的ID: {user_id}", "plugin"
def get_new_conf(self, message, role):
if role != "admin":
return False, f"你的身份组{role}没有权限使用此指令。", "newconf"
@@ -169,30 +174,6 @@ class Command:
return False, "未找到该插件", "plugin"
except BaseException as e:
return False, f"获取插件信息失败,原因: {str(e)}", "plugin"
# elif l[1] == "reload":
# if role != "admin":
# return False, f"你的身份组{role}没有权限重载插件", "plugin"
# for plugin in cached_plugins:
# try:
# print(f"更新插件 {plugin} 依赖...")
# plugin_path = os.path.join(ppath, cached_plugins[plugin]["root_dir_name"])
# if os.path.exists(os.path.join(plugin_path, "requirements.txt")):
# mm = pipmain(['install', '-r', os.path.join(plugin_path, "requirements.txt"), "--quiet"])
# if mm != 0:
# return False, "插件依赖安装失败需要您手动pip安装对应插件的依赖。", "plugin"
# except BaseException as e:
# print(f"插件{plugin}依赖安装失败,原因: {str(e)}")
# try:
# ok, err = self.plugin_reload(cached_plugins, all = True)
# if ok:
# return True, "\n重载插件成功~", "plugin"
# else:
# # if os.path.exists(plugin_path):
# # shutil.rmtree(plugin_path)
# return False, f"插件重载失败。\n跟踪: \n{err}", "plugin"
# except BaseException as e:
# return False, f"插件重载失败,原因: {str(e)}", "plugin"
elif l[1] == "dev":
if role != "admin":
return False, f"你的身份组{role}没有权限开发者模式", "plugin"
@@ -220,15 +201,12 @@ class Command:
return {
"help": "帮助",
"keyword": "设置关键词/关键指令回复",
"update": "更新面板",
"update latest": "更新到最新版本",
"update r": "重启机器人",
"reset": "重置会话",
"update": "更新项目",
"nick": "设置机器人昵称",
"plugin": "插件安装、卸载和重载",
"web on/off": "启动或关闭网页搜索能力",
"/bing": "切换到bing模型",
"/gpt": "切换到OpenAI ChatGPT API",
"web on/off": "LLM 网页搜索能力",
"reset": "重置 LLM 对话",
"/gpt": "切换到 OpenAI 官方接口",
"/revgpt": "切换到网页版ChatGPT",
}
@@ -255,12 +233,14 @@ class Command:
p = gu.create_markdown_image(msg)
return [Image.fromFileSystem(p)]
except BaseException as e:
gu.log(str(e))
self.logger.log(str(e))
finally:
return msg
# 接受可变参数
def command_start_with(self, message: str, *args):
'''
当消息以指定的指令开头时返回True
'''
for arg in args:
if message.startswith(arg) or message.startswith('/'+arg):
return True
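The `command_start_with` helper above matches a command name with or without a leading slash. A self-contained sketch of the matcher and how the dispatch code calls it:

```python
def command_start_with(message: str, *args) -> bool:
    # A message hits a command whether it starts with "cmd" or "/cmd".
    for arg in args:
        if message.startswith(arg) or message.startswith('/' + arg):
            return True
    return False

print(command_start_with("/web on", "web"))          # True
print(command_start_with("myid", "myid", "!myid"))   # True
print(command_start_with("hello", "web"))            # False
```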
@@ -286,8 +266,7 @@ class Command:
l = plain_text.split(" ")
if len(l) < 3 and image_url == "":
return True, """
【设置关键词回复】示例:
return True, """【设置关键词回复】示例:
1. keyword hi 你好
当发送hi的时候会回复你好
2. keyword /hi 你好
@@ -295,8 +274,7 @@ class Command:
3. keyword d hi
删除hi关键词的回复
4. keyword hi <图片>
当发送hi时会回复图片
""", "keyword"
当发送hi时会回复图片""", "keyword"
del_mode = False
if l[1] == "d":
@@ -343,63 +321,33 @@ class Command:
if role != "admin":
return True, "你没有权限使用该指令", "keyword"
l = message.split(" ")
try:
repo = Repo()
except git.exc.InvalidGitRepositoryError:
try:
repo = Repo(path="QQChannelChatGPT")
except git.exc.InvalidGitRepositoryError:
repo = Repo(path="AstrBot")
if len(l) == 1:
curr_branch = repo.active_branch.name
# 得到本地版本号和最新版本号
now_commit = repo.head.commit
# 得到远程3条commit列表, 包含commit信息
origin = repo.remotes.origin
origin.fetch()
commits = list(repo.iter_commits(curr_branch, max_count=3))
commits_log = ''
index = 1
for commit in commits:
if commit.message.endswith("\n"):
commits_log += f"[{index}] {commit.message}-----------\n"
else:
commits_log += f"[{index}] {commit.message}\n-----------\n"
index+=1
remote_commit_hash = origin.refs.master.commit.hexsha[:6]
return True, f"当前分支: {curr_branch}\n当前版本: {now_commit.hexsha[:6]}\n最新版本: {remote_commit_hash}\n\n3条commit(非最新):\n{str(commits_log)}\nTips:\n1. 使用 update latest 更新至最新版本;\n2. 使用 update checkout <分支名> 切换代码分支。", "update"
try:
update_info = util.updator.check_update()
update_info += "\nTips:\n输入「update latest」更新到最新版本\n输入「update <版本号如v3.1.3>」切换到指定版本\n输入「update r」重启机器人\n"
return True, update_info, "update"
except BaseException as e:
return False, "检查更新失败: "+str(e), "update"
else:
if l[1] == "latest":
try:
origin = repo.remotes.origin
origin.fetch()
commits = list(repo.iter_commits('master', max_count=1))
commit_log = commits[0].message
tag = "update"
if len(l) == 3 and l[2] == "r":
tag = "update latest r"
return True, f"更新成功。新版本内容: \n{commit_log}\nps: 重启后生效。输入 update r 重启(重启指令不返回任何确认信息)", tag
release_data = util.updator.request_release_info()
util.updator.update_project(release_data)
return True, "更新成功,重启生效。可输入「update r」重启", "update"
except BaseException as e:
return False, "更新失败: "+str(e), "update"
if l[1] == "r":
py = sys.executable
os.execl(py, py, *sys.argv)
if l[1] == 'checkout':
# 切换分支
if len(l) < 3:
return False, "请提供分支名,如 /update checkout dev_dashboard", "update"
try:
origin = repo.remotes.origin
origin.fetch()
repo.git.checkout(l[2])
# 获得最新的 commit
commits = list(repo.iter_commits(max_count=1))
commit_log = commits[0].message
return True, f"切换分支成功,机器人将在 5 秒内重新启动以应用新的功能。\n当前分支: {l[2]}\n此分支最近更新: \n{commit_log}", "update latest r"
except BaseException as e:
return False, f"切换分支失败。原因: {str(e)}", "update"
elif l[1] == "r":
util.updator._reboot()
else:
if l[1].lower().startswith('v'):
try:
release_data = util.updator.request_release_info(latest=False)
util.updator.update_project(release_data, latest=False, version=l[1])
return True, "更新成功,重启生效。可输入「update r」重启", "update"
except BaseException as e:
return False, "更新失败: "+str(e), "update"
else:
return False, "版本号格式错误", "update"
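The `update r` branch above restarts the bot by replacing the current process image with `os.execl`; `_reboot` in `util.updator` presumably wraps the same call. A sketch that builds the relaunch argv without actually executing it (executing `os.execl` never returns):

```python
import os
import sys

def restart() -> None:
    # Replace the running process with a fresh interpreter, as `update r` does.
    # os.execl does not return; nothing after this call executes.
    py = sys.executable
    os.execl(py, py, *sys.argv)

# For illustration only: the argv the relaunched process would receive.
relaunch_argv = [sys.executable] + sys.argv
```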
def reset(self):
return False
@@ -426,6 +374,4 @@ class Command:
return False
def draw(self):
return False


@@ -1,9 +1,6 @@
from model.command.command import Command
from model.provider.openai_official import ProviderOpenAIOfficial
from cores.qqbot.personality import personalities
from model.platform.qq import QQ
from util import general_utils as gu
from cores.qqbot.global_object import GlobalObject
class CommandOpenAIOfficial(Command):
@@ -41,8 +38,6 @@ class CommandOpenAIOfficial(Command):
return True, self.gpt()
elif self.command_start_with(message, "status"):
return True, self.status()
elif self.command_start_with(message, "count"):
return True, self.count()
elif self.command_start_with(message, "help", "帮助"):
return True, self.help()
elif self.command_start_with(message, "unset"):
@@ -73,7 +68,7 @@ class CommandOpenAIOfficial(Command):
def reset(self, session_id: str, message: str = "reset"):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "reset"
return False, "未启动 OpenAI 官方 API", "reset"
l = message.split(" ")
if len(l) == 1:
self.provider.forget(session_id)
@@ -86,7 +81,7 @@ class CommandOpenAIOfficial(Command):
def his(self, message: str, session_id: str):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "his"
return False, "未启动 OpenAI 官方 API", "his"
#分页每页5条
msg = ''
size_per_page = 3
@@ -104,17 +99,17 @@ class CommandOpenAIOfficial(Command):
def token(self, session_id: str):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "token"
return False, "未启动 OpenAI 官方 API", "token"
return True, f"会话的token数: {self.provider.get_user_usage_tokens(self.provider.session_dict[session_id])}\n系统最大缓存token数: {self.provider.max_tokens}", "token"
def gpt(self):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "gpt"
return False, "未启动 OpenAI 官方 API", "gpt"
return True, f"OpenAI GPT配置:\n {self.provider.chatGPT_configs}", "gpt"
def status(self):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "status"
return False, "未启动 OpenAI 官方 API", "status"
chatgpt_cfg_str = ""
key_stat = self.provider.get_key_stat()
index = 1
@@ -134,15 +129,9 @@ class CommandOpenAIOfficial(Command):
index += 1
return True, f"⭐使用情况({str(gg_count)}个已用):\n{chatgpt_cfg_str}", "status"
def count(self):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型。", "reset"
guild_count, guild_msg_count, guild_direct_msg_count, session_count = self.provider.get_stat()
return True, f"【本指令部分统计可能已经过时】\n当前会话数: {len(self.provider.session_dict)}\n共有频道数: {guild_count} \n共有消息数: {guild_msg_count}\n私信数: {guild_direct_msg_count}\n历史会话数: {session_count}", "count"
def key(self, message: str):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "reset"
return False, "未启动 OpenAI 官方 API", "reset"
l = message.split(" ")
if len(l) == 1:
msg = "感谢您赞助key,key为官方API使用,请以以下格式赞助:\n/key xxxxx"
@@ -188,14 +177,14 @@ class CommandOpenAIOfficial(Command):
def unset(self, session_id: str):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "unset"
return False, "未启动 OpenAI 官方 API", "unset"
self.provider.curr_personality = {}
self.provider.forget(session_id)
return True, "已清除人格并重置历史记录。", "unset"
def set(self, message: str, session_id: str):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "set"
return False, "未启动 OpenAI 官方 API", "set"
l = message.split(" ")
if len(l) == 1:
return True, f"【人格文本由PlexPt开源项目awesome-chatgpt-pr \
@@ -267,7 +256,7 @@ class CommandOpenAIOfficial(Command):
def draw(self, message):
if self.provider is None:
return False, "未启动OpenAI ChatGPT语言模型.", "draw"
return False, "未启动 OpenAI 官方 API", "draw"
if message.startswith("/画"):
message = message[2:]
elif message.startswith("画"):
@@ -279,9 +268,4 @@ class CommandOpenAIOfficial(Command):
except Exception as e:
if 'exceeded' in str(e):
return f"OpenAI API错误。原因:\n{str(e)} \n超额了。可自己搭建一个机器人(Github仓库:QQChannelChatGPT)"
return False, f"图片生成失败: {e}", "draw"


@@ -1,6 +1,5 @@
from model.command.command import Command
from model.provider.rev_chatgpt import ProviderRevChatGPT
from model.platform.qq import QQ
from cores.qqbot.personality import personalities
from cores.qqbot.global_object import GlobalObject


@@ -1,52 +0,0 @@
from model.command.command import Command
from model.provider.rev_edgegpt import ProviderRevEdgeGPT
import asyncio
from model.platform.qq import QQ
from cores.qqbot.global_object import GlobalObject
class CommandRevEdgeGPT(Command):
def __init__(self, provider: ProviderRevEdgeGPT, global_object: GlobalObject):
self.provider = provider
self.cached_plugins = {}
self.global_object = global_object
super().__init__(provider, global_object)
def check_command(self,
message: str,
session_id: str,
role: str,
platform: str,
message_obj):
self.platform = platform
hit, res = super().check_command(
message,
session_id,
role,
platform,
message_obj
)
if hit:
return True, res
if self.command_start_with(message, "reset"):
return True, self.reset()
elif self.command_start_with(message, "help"):
return True, self.help()
elif self.command_start_with(message, "update"):
return True, self.update(message, role)
return False, None
def reset(self, loop = None):
if self.provider is None:
return False, "未启动Bing语言模型.", "reset"
res = asyncio.run_coroutine_threadsafe(self.provider.forget(), loop).result()
print(res)
if res:
return res, "重置成功", "reset"
else:
return res, "重置失败", "reset"
def help(self):
return True, super().help_messager(super().general_commands(), self.platform, self.global_object.cached_plugins), "help"


@@ -0,0 +1,8 @@
from dataclasses import dataclass
from typing import Union, Optional
@dataclass
class MessageResult():
result_message: Union[str, list]
is_command_call: Optional[bool] = False
callback: Optional[callable] = None
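The new `MessageResult` dataclass decouples command handlers from message sending: a handler returns the reply plus an optional callback that the platform layer invokes after replying (used above to restart after `update latest r`). A sketch of that flow (the dataclass is re-declared here so the example is self-contained; `Callable` stands in for the original's lowercase `callable` annotation):

```python
from dataclasses import dataclass
from typing import Callable, Optional, Union

@dataclass
class MessageResult:
    result_message: Union[str, list]
    is_command_call: Optional[bool] = False
    callback: Optional[Callable] = None

ran = []
res = MessageResult("更新成功,即将重启。", is_command_call=True,
                    callback=lambda: ran.append("restart"))

# Platform layer: after sending res.result_message, run the callback if any.
if res.callback is not None:
    res.callback()
```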


@@ -0,0 +1,77 @@
from nakuru.entities.components import Plain, At, Image
from botpy.message import Message, DirectMessage
class NakuruGuildMember():
tiny_id: int # 发送者识别号
user_id: int # 发送者识别号
title: str
nickname: str # 昵称
role: int # 角色
icon_url: str # 头像url
class NakuruGuildMessage():
type: str = "GuildMessage"
self_id: int # bot的qq号
self_tiny_id: int # bot的qq号
sub_type: str # 消息类型
message_id: str # 消息id
guild_id: int # 频道号
channel_id: int # 子频道号
user_id: int # 发送者qq号
message: list # 消息内容
sender: NakuruGuildMember # 发送者信息
raw_message: Message
def __str__(self) -> str:
return str(self.__dict__)
# gocq-频道SDK兼容层
def gocq_compatible_send(gocq_message_chain: list):
plain_text = ""
image_path = None # only one img supported
for i in gocq_message_chain:
if isinstance(i, Plain):
plain_text += i.text
elif isinstance(i, Image) and image_path == None:
if i.path is not None:
image_path = i.path
else:
image_path = i.file
return plain_text, image_path
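`gocq_compatible_send` above flattens a gocq message chain into one text string plus at most one image path. A self-contained re-creation of that flattening, with stub `Plain`/`Image` classes standing in for nakuru's components:

```python
class Plain:
    def __init__(self, text):
        self.text = text

class Image:
    def __init__(self, path=None, file=None):
        self.path, self.file = path, file

def flatten(chain):
    # Concatenate all plain text; keep only the first image's path (or file).
    plain_text, image_path = "", None
    for comp in chain:
        if isinstance(comp, Plain):
            plain_text += comp.text
        elif isinstance(comp, Image) and image_path is None:
            image_path = comp.path if comp.path is not None else comp.file
    return plain_text, image_path

text, img = flatten([Plain("hi "), Plain("there"), Image(file="a.png")])
```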
# gocq-频道SDK兼容层
def gocq_compatible_receive(message: Message) -> NakuruGuildMessage:
ngm = NakuruGuildMessage()
try:
ngm.self_id = message.mentions[0].id
ngm.self_tiny_id = message.mentions[0].id
except:
ngm.self_id = 0
ngm.self_tiny_id = 0
ngm.sub_type = "normal"
ngm.message_id = message.id
ngm.guild_id = int(message.guild_id)
ngm.channel_id = int(message.channel_id)
ngm.user_id = int(message.author.id)
msg = []
plain_content = message.content.replace("<@!"+str(ngm.self_id)+">", "").strip()
msg.append(Plain(plain_content))
if message.attachments:
for i in message.attachments:
if i.content_type.startswith("image"):
url = i.url
if not url.startswith("http"):
url = "https://"+url
img = Image.fromURL(url)
msg.append(img)
ngm.message = msg
ngm.sender = NakuruGuildMember()
ngm.sender.tiny_id = int(message.author.id)
ngm.sender.user_id = int(message.author.id)
ngm.sender.title = ""
ngm.sender.nickname = message.author.username
ngm.sender.role = 0
ngm.sender.icon_url = message.author.avatar
ngm.raw_message = message
return ngm


@@ -0,0 +1,82 @@
import abc
import threading
import asyncio
from typing import Callable, Union
from nakuru import (
GuildMessage,
GroupMessage,
FriendMessage,
)
from ._nakuru_translation_layer import (
NakuruGuildMessage,
)
from nakuru.entities.components import Plain, At, Image, Node
class Platform():
def __init__(self, message_handler: callable) -> None:
'''
初始化平台的各种接口
'''
self.message_handler = message_handler
pass
@abc.abstractmethod
def handle_msg():
'''
处理到来的消息
'''
pass
@abc.abstractmethod
def reply_msg():
'''
回复消息(被动发送)
'''
pass
@abc.abstractmethod
def send_msg():
'''
发送消息(主动发送)
'''
pass
@abc.abstractmethod
def send():
'''
发送消息(主动发送)同 send_msg()
'''
pass
def parse_message_outline(self, message: Union[GuildMessage, GroupMessage, FriendMessage, str]) -> NakuruGuildMessage:
'''
将消息解析成大纲消息形式。
如: xxxxx[图片]xxxxx
'''
if isinstance(message, str):
return message
ret = ''
try:
for node in message.message:
if isinstance(node, Plain):
ret += node.text
elif isinstance(node, At):
ret += f'[At: {node.name}/{node.qq}]'
elif isinstance(node, Image):
ret += f'[图片]'
except Exception as e:
pass
ret = ret.replace('\n', '')
return ret
def new_sub_thread(self, func, args=()):
thread = threading.Thread(target=self._runner, args=(func, args), daemon=True)
thread.start()
def _runner(self, func: Callable, args: tuple):
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.run_until_complete(func(*args))
loop.close()
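`Platform.new_sub_thread`/`_runner` above give each incoming message its own worker thread with a fresh event loop, so a slow handler cannot block the receiver. A runnable sketch of the same pattern:

```python
import asyncio
import threading

def _runner(func, args):
    # Each worker thread owns a private event loop for its coroutine.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(func(*args))
    loop.close()

def new_sub_thread(func, args=()):
    thread = threading.Thread(target=_runner, args=(func, args), daemon=True)
    thread.start()
    return thread

handled = []

async def handle_msg(text):
    await asyncio.sleep(0)  # stand-in for real async message handling
    handled.append(text)

new_sub_thread(handle_msg, ("hello",)).join()
```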


@@ -1,190 +0,0 @@
from nakuru.entities.components import Plain, At, Image, Node
from util import general_utils as gu
from util.cmd_config import CmdConfig
import asyncio
from nakuru import (
CQHTTP,
GuildMessage,
GroupMessage,
FriendMessage
)
from typing import Union
import time
class FakeSource:
def __init__(self, type, group_id):
self.type = type
self.group_id = group_id
class QQ:
def __init__(self, is_start: bool, cc: CmdConfig = None, gocq_loop = None) -> None:
self.is_start = is_start
self.gocq_loop = gocq_loop
self.cc = cc
self.waiting = {}
self.gocq_cnt = 0
def run_bot(self, gocq):
self.client: CQHTTP = gocq
self.client.run()
def get_msg_loop(self):
return self.gocq_loop
def get_cnt(self):
return self.gocq_cnt
def set_cnt(self, cnt):
self.gocq_cnt = cnt
async def send_qq_msg(self,
source,
res,
image_mode=None):
self.gocq_cnt += 1
if not self.is_start:
raise Exception("管理员未启动GOCQ平台")
"""
res可以是一个数组, 也就是gocq的消息链。
插件开发者请使用send方法, 可以不用直接调用这个方法。
"""
gu.log("回复GOCQ消息: "+str(res), level=gu.LEVEL_INFO, tag="GOCQ", max_len=300)
if isinstance(source, int):
source = FakeSource("GroupMessage", source)
# str convert to CQ Message Chain
if isinstance(res, str):
res_str = res
res = []
if source.type == "GroupMessage" and not isinstance(source, FakeSource):
res.append(At(qq=source.user_id))
res.append(Plain(text=res_str))
# if image mode, put all Plain texts into a new picture.
if image_mode is None:
image_mode = self.cc.get('qq_pic_mode', False)
if image_mode and isinstance(res, list):
plains = []
news = []
for i in res:
if isinstance(i, Plain):
plains.append(i.text)
else:
news.append(i)
plains_str = "".join(plains).strip()
if plains_str != "" and len(plains_str) > 50:
p = gu.create_markdown_image("".join(plains))
news.append(Image.fromFileSystem(p))
res = news
# 回复消息链
if isinstance(res, list) and len(res) > 0:
if source.type == "GuildMessage":
await self.client.sendGuildChannelMessage(source.guild_id, source.channel_id, res)
return
elif source.type == "FriendMessage":
await self.client.sendFriendMessage(source.user_id, res)
return
elif source.type == "GroupMessage":
# 过长时forward发送
plain_text_len = 0
image_num = 0
for i in res:
if isinstance(i, Plain):
plain_text_len += len(i.text)
elif isinstance(i, Image):
image_num += 1
if plain_text_len > self.cc.get('qq_forward_threshold', 200):
# 删除At
for i in res:
if isinstance(i, At):
res.remove(i)
node = Node(res)
# node.content = res
node.uin = 123456
node.name = f"bot"
node.time = int(time.time())
# print(node)
nodes=[node]
await self.client.sendGroupForwardMessage(source.group_id, nodes)
return
await self.client.sendGroupMessage(source.group_id, res)
return
def send(self,
to,
res,
image_mode=False,
):
'''
提供给插件的发送QQ消息接口, 不用在外部await。
参数说明:第一个参数可以是消息对象,也可以是QQ群号。第二个参数是消息内容(消息内容可以是消息链列表,也可以是纯文字信息),
第三个参数是是否开启图片模式,如果开启,那么所有纯文字信息都会被合并成一张图片。
'''
try:
asyncio.run_coroutine_threadsafe(self.send_qq_msg(to, res, image_mode), self.gocq_loop).result()
except BaseException as e:
raise e
def send_guild(self,
message_obj,
res,
):
'''
提供给插件的发送GOCQ QQ频道消息接口, 不用在外部await。
参数说明:第一个参数必须是消息对象, 第二个参数是消息内容(消息内容可以是消息链列表,也可以是纯文字信息)。
'''
try:
asyncio.run_coroutine_threadsafe(self.send_qq_msg(message_obj, res), self.gocq_loop).result()
except BaseException as e:
raise e
def create_text_image(title: str, text: str, max_width=30, font_size=20):
'''
文本转图片。
title: 标题
text: 文本内容
max_width: 文本宽度最大值,默认30
font_size: 字体大小,默认20
返回:文件路径
'''
try:
img = gu.word2img(title, text, max_width, font_size)
p = gu.save_temp_img(img)
return p
except Exception as e:
raise e
def wait_for_message(self, group_id) -> Union[GroupMessage, FriendMessage, GuildMessage]:
'''
等待下一条消息,超时 300s 后抛出异常
'''
self.waiting[group_id] = ''
cnt = 0
while True:
if group_id in self.waiting and self.waiting[group_id] != '':
# 去掉
ret = self.waiting[group_id]
del self.waiting[group_id]
return ret
cnt += 1
if cnt > 300:
raise Exception("等待消息超时。")
time.sleep(1)
def get_client(self):
return self.client
def nakuru_method_invoker(self, func, *args, **kwargs):
"""
返回一个方法调用器,可以用来立即调用nakuru的方法。
"""
try:
ret = asyncio.run_coroutine_threadsafe(func(*args, **kwargs), self.gocq_loop).result()
return ret
except BaseException as e:
raise e
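`nakuru_method_invoker` and the `send` methods above all follow the same pattern: submit a coroutine to the gocq event loop running in another thread, then block on its result with `asyncio.run_coroutine_threadsafe`. A minimal runnable sketch (the coroutine here is a stand-in, not a real nakuru call):

```python
import asyncio
import threading

# Background loop, as the gocq client's loop runs in its own thread.
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

async def fake_api_call(x):
    # Stand-in for a nakuru coroutine such as sendGroupMessage.
    return x * 2

# Schedule on the background loop and block for the result.
result = asyncio.run_coroutine_threadsafe(fake_api_call(21), loop).result()
loop.call_soon_threadsafe(loop.stop)
```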

model/platform/qq_gocq.py

@@ -0,0 +1,324 @@
from nakuru.entities.components import Plain, At, Image, Node
from util import general_utils as gu
from util.cmd_config import CmdConfig
import asyncio
from nakuru import (
CQHTTP,
GuildMessage,
GroupMessage,
FriendMessage,
GroupMemberIncrease,
Notify,
Member
)
from typing import Union
import time
from ._platfrom import Platform
class FakeSource:
def __init__(self, type, group_id):
self.type = type
self.group_id = group_id
class QQGOCQ(Platform):
def __init__(self, cfg: dict, message_handler: callable, global_object) -> None:
super().__init__(message_handler)
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
self.waiting = {}
self.gocq_cnt = 0
self.cc = CmdConfig()
self.cfg = cfg
self.logger: gu.Logger = global_object.logger
try:
self.nick_qq = cfg['nick_qq']
except:
self.nick_qq = ["ai","!",""]
nick_qq = self.nick_qq
if isinstance(nick_qq, str):
nick_qq = [nick_qq]
self.unique_session = cfg['uniqueSessionMode']
self.pic_mode = cfg['qq_pic_mode']
self.client = CQHTTP(
host=self.cc.get("gocq_host", "127.0.0.1"),
port=self.cc.get("gocq_websocket_port", 6700),
http_port=self.cc.get("gocq_http_port", 5700),
)
gocq_app = self.client
self.announcement = self.cc.get("announcement", "欢迎新人!")
@gocq_app.receiver("GroupMessage")
async def _(app: CQHTTP, source: GroupMessage):
if self.cc.get("gocq_react_group", True):
if isinstance(source.message[0], Plain):
self.new_sub_thread(self.handle_msg, (source, True))
elif isinstance(source.message[0], At):
if source.message[0].qq == source.self_id:
self.new_sub_thread(self.handle_msg, (source, True))
else:
return
@gocq_app.receiver("FriendMessage")
async def _(app: CQHTTP, source: FriendMessage):
if self.cc.get("gocq_react_friend", True):
if isinstance(source.message[0], Plain):
self.new_sub_thread(self.handle_msg, (source, False))
else:
return
@gocq_app.receiver("GroupMemberIncrease")
async def _(app: CQHTTP, source: GroupMemberIncrease):
if self.cc.get("gocq_react_group_increase", True):
await app.sendGroupMessage(source.group_id, [
Plain(text = self.announcement)
])
@gocq_app.receiver("Notify")
async def _(app: CQHTTP, source: Notify):
print(source)
if source.sub_type == "poke" and source.target_id == source.self_id:
# await self.handle_msg(source, False)
self.new_sub_thread(self.handle_msg, (source, False))
@gocq_app.receiver("GuildMessage")
async def _(app: CQHTTP, source: GuildMessage):
if self.cc.get("gocq_react_guild", True):
if isinstance(source.message[0], Plain):
# await self.handle_msg(source, True)
self.new_sub_thread(self.handle_msg, (source, True))
elif isinstance(source.message[0], At):
if source.message[0].qq == source.self_tiny_id:
# await self.handle_msg(source, True)
self.new_sub_thread(self.handle_msg, (source, True))
else:
return
def run(self):
self.client.run()
async def handle_msg(self, message: Union[GroupMessage, FriendMessage, GuildMessage, Notify], is_group: bool):
self.logger.log(f"{message.user_id} -> {self.parse_message_outline(message)}", tag="QQ_GOCQ")
# Decide whether this message should be responded to
resp = False
if not is_group:
resp = True
else:
for i in message.message:
if isinstance(i, At):
if message.type == "GuildMessage":
if i.qq == message.user_id or i.qq == message.self_tiny_id:
resp = True
if message.type == "FriendMessage":
if i.qq == message.self_id:
resp = True
if message.type == "GroupMessage":
if i.qq == message.self_id:
resp = True
elif isinstance(i, Plain):
for nick in self.nick_qq:
if nick != '' and i.text.strip().startswith(nick):
resp = True
break
if not resp: return
# Resolve the session_id
if self.unique_session or not is_group:
session_id = message.user_id
elif message.type == "GroupMessage":
session_id = message.group_id
elif message.type == "GuildMessage":
session_id = message.channel_id
else:
session_id = message.user_id
# Resolve the sender's role
sender_id = str(message.user_id)
if sender_id == self.cc.get('admin_qq', '') or \
sender_id == self.cc.get('gocq_qqchan_admin', '') or \
sender_id in self.cc.get('other_admins', []):
role = 'admin'
else:
role = 'member'
message_result = await self.message_handler(
message=message,
session_id=session_id,
role=role,
platform='gocq'
)
if message_result is None:
return
self.reply_msg(message, message_result.result_message)
if message_result.callback is not None:
message_result.callback()
# If someone is waiting on this session via wait_for_message, hand the message over
if session_id in self.waiting and self.waiting[session_id] == '':
self.waiting[session_id] = message
def reply_msg(self,
message: Union[GroupMessage, FriendMessage, GuildMessage, Notify],
result_message: list):
"""
Plugin developers should use the send() method instead of calling this directly.
"""
source = message
res = result_message
self.gocq_cnt += 1
# Normalize an int source before logging, since logging reads source.user_id
if isinstance(source, int):
source = FakeSource("GroupMessage", source)
self.logger.log(f"{source.user_id} <- {self.parse_message_outline(res)}", tag="QQ_GOCQ")
# Convert a plain str into a CQ message chain
if isinstance(res, str):
res_str = res
res = []
if source.type == "GroupMessage" and not isinstance(source, FakeSource):
res.append(At(qq=source.user_id))
res.append(Plain(text=res_str))
# In picture mode, render all Plain text segments into a single image.
if self.pic_mode and isinstance(res, list):
plains = []
news = []
for i in res:
if isinstance(i, Plain):
plains.append(i.text)
else:
news.append(i)
plains_str = "".join(plains).strip()
if len(plains_str) > 50:
p = gu.create_markdown_image(plains_str)
news.append(Image.fromFileSystem(p))
res = news
# Send the reply message chain
if isinstance(res, list) and len(res) > 0:
if source.type == "GuildMessage":
# await self.client.sendGuildChannelMessage(source.guild_id, source.channel_id, res)
asyncio.run_coroutine_threadsafe(self.client.sendGuildChannelMessage(source.guild_id, source.channel_id, res), self.loop).result()
return
elif source.type == "FriendMessage":
# await self.client.sendFriendMessage(source.user_id, res)
asyncio.run_coroutine_threadsafe(self.client.sendFriendMessage(source.user_id, res), self.loop).result()
return
elif source.type == "GroupMessage":
# Send as a forward message when the text is too long
plain_text_len = 0
image_num = 0
for i in res:
if isinstance(i, Plain):
plain_text_len += len(i.text)
elif isinstance(i, Image):
image_num += 1
if plain_text_len > self.cc.get('qq_forward_threshold', 200):
# Strip At components; rebuild the list instead of removing items while iterating
res = [i for i in res if not isinstance(i, At)]
node = Node(res)
node.uin = 123456
node.name = "bot"
node.time = int(time.time())
nodes = [node]
# await self.client.sendGroupForwardMessage(source.group_id, nodes)
asyncio.run_coroutine_threadsafe(self.client.sendGroupForwardMessage(source.group_id, nodes), self.loop).result()
return
# await self.client.sendGroupMessage(source.group_id, res)
asyncio.run_coroutine_threadsafe(self.client.sendGroupMessage(source.group_id, res), self.loop).result()
return
def send_msg(self, message: Union[GroupMessage, FriendMessage, GuildMessage, Notify], result_message: list):
'''
Send a QQ message; this is the interface exposed to plugins.
The first argument may be a message object or a QQ group id; the second is the
message content, either a message-chain list or a plain string.
Synchronous (not a coroutine).
'''
try:
# await self.reply_msg(message, result_message)
self.reply_msg(message, result_message)
except BaseException as e:
raise e
def send(self,
to,
res):
'''
Same as send_msg(). Synchronous (not a coroutine).
'''
try:
# await self.send_msg(to, res)
self.reply_msg(to, res)
except BaseException as e:
raise e
def create_text_image(title: str, text: str, max_width=30, font_size=20):
'''
Render text to an image.
title: the title
text: the body text
max_width: maximum line width (default 30)
font_size: font size (default 20)
Returns: path of the generated image file.
'''
try:
img = gu.word2img(title, text, max_width, font_size)
p = gu.save_temp_img(img)
return p
except Exception as e:
raise e
def wait_for_message(self, group_id) -> Union[GroupMessage, FriendMessage, GuildMessage]:
'''
Wait for the next message; raises after a 300 s timeout.
'''
self.waiting[group_id] = ''
cnt = 0
while True:
if group_id in self.waiting and self.waiting[group_id] != '':
# Consume the pending entry
ret = self.waiting[group_id]
del self.waiting[group_id]
return ret
cnt += 1
if cnt > 300:
raise Exception("Timed out waiting for a message.")
time.sleep(1)
def get_client(self):
return self.client
def nakuru_method_invoker(self, func, *args, **kwargs):
"""
Synchronously invoke a nakuru coroutine method on the client's event loop and return its result.
"""
try:
ret = asyncio.run_coroutine_threadsafe(func(*args, **kwargs), self.loop).result()
return ret
except BaseException as e:
raise e
def get_cnt(self):
return self.gocq_cnt
def set_cnt(self, cnt):
self.gocq_cnt = cnt
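Throughout this adapter, message handlers run on worker threads (`new_sub_thread`) while the nakuru client owns the asyncio event loop, so every reply is bridged back with `asyncio.run_coroutine_threadsafe(...).result()`. A minimal, self-contained sketch of that pattern — the loop setup and the `send_message` coroutine below are illustrative stand-ins, not the project's API:

```python
import asyncio
import threading

results = []

async def send_message(target: int, text: str) -> str:
    # Stand-in for a client coroutine such as sendGroupMessage().
    results.append((target, text))
    return "ok"

# The client owns this loop; it runs forever on its own thread.
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

def reply_from_worker(target: int, text: str) -> str:
    # Handlers run on worker threads, so replies are scheduled onto the
    # client's loop and awaited synchronously, as reply_msg() does.
    fut = asyncio.run_coroutine_threadsafe(send_message(target, text), loop)
    return fut.result(timeout=5)

resp = reply_from_worker(123, "hello")
loop.call_soon_threadsafe(loop.stop)
```

Blocking on `fut.result()` keeps the plugin-facing API synchronous at the cost of tying up the worker thread while the send is in flight.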


@@ -0,0 +1,254 @@
import io
import botpy
from PIL import Image as PILImage
from botpy.message import Message, DirectMessage
import re
import asyncio
import requests
from util import general_utils as gu
from botpy.types.message import Reference
from botpy import Client
import time
from ._platfrom import Platform
from ._nakuru_translation_layer import (
NakuruGuildMessage,
NakuruGuildMember,
gocq_compatible_receive,
gocq_compatible_send
)
from typing import Union
# Official QQ bot framework (botpy)
class botClient(Client):
def set_platform(self, platform: 'QQOfficial'):
self.platform = platform
# Guild (channel) message received
async def on_at_message_create(self, message: Message):
# Translation layer: convert to the nakuru-style message
nakuru_guild_message = gocq_compatible_receive(message)
self.platform.new_sub_thread(self.platform.handle_msg, (nakuru_guild_message, True))
# Direct message received
async def on_direct_message_create(self, message: DirectMessage):
# Translation layer: convert to the nakuru-style message
nakuru_guild_message = gocq_compatible_receive(message)
self.platform.new_sub_thread(self.platform.handle_msg, (nakuru_guild_message, False))
class QQOfficial(Platform):
def __init__(self, cfg: dict, message_handler: callable, global_object) -> None:
super().__init__(message_handler)
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
self.qqchan_cnt = 0
self.waiting: dict = {}
self.cfg = cfg
self.appid = cfg['qqbot']['appid']
self.token = cfg['qqbot']['token']
self.secret = cfg['qqbot_secret']
self.unique_session = cfg['uniqueSessionMode']
self.logger: gu.Logger = global_object.logger
self.intents = botpy.Intents(
public_guild_messages=True,
direct_message=cfg['direct_message_mode']
)
self.client = botClient(
intents=self.intents,
bot_log=False
)
self.client.set_platform(self)
def run(self):
try:
self.loop.run_until_complete(self.client.run(
appid=self.appid,
secret=self.secret
))
except BaseException as e:
print(e)
self.client = botClient(
intents=self.intents,
bot_log=False
)
self.client.set_platform(self)
self.client.run(
appid=self.appid,
token=self.token
)
async def handle_msg(self, message: NakuruGuildMessage, is_group: bool):
_t = "" if is_group else "/DM"
self.logger.log(f"{message.sender.nickname}({message.sender.tiny_id}{_t}) -> {self.parse_message_outline(message)}", tag="QQ_OFFICIAL")
# Resolve the session_id
if self.unique_session or not is_group:
session_id = message.sender.user_id
else:
session_id = message.channel_id
# Resolve the sender's role
sender_id = str(message.sender.tiny_id)
if sender_id == self.cfg['admin_qqchan'] or \
sender_id in self.cfg['other_admins']:
role = 'admin'
else:
role = 'member'
message_result = await self.message_handler(
message=message,
session_id=session_id,
role=role,
platform='qqchan'
)
if message_result is None:
return
self.reply_msg(is_group, message, message_result.result_message)
if message_result.callback is not None:
message_result.callback()
# If someone is waiting on this session via wait_for_message, hand the message over
if session_id in self.waiting and self.waiting[session_id] == '':
self.waiting[session_id] = message
def reply_msg(self,
is_group: bool,
message: NakuruGuildMessage,
res: Union[str, list]):
'''
Reply to a guild (QQ channel) message.
'''
self.logger.log(f"{message.sender.nickname}({message.sender.tiny_id}) <- {self.parse_message_outline(res)}", tag="QQ_OFFICIAL")
self.qqchan_cnt += 1
plain_text = ''
image_path = ''
msg_ref = None
if isinstance(res, list):
plain_text, image_path = gocq_compatible_send(res)
elif isinstance(res, str):
plain_text = res
if self.cfg['qq_pic_mode']:
# Render the text to an image, appending the original image as markdown
if plain_text != '' or image_path != '':
if image_path is not None and image_path != '':
if image_path.startswith("http"):
plain_text += "\n\n" + "![](" + image_path + ")"
else:
plain_text += "\n\n" + "![](file:///" + image_path + ")"
image_path = gu.create_markdown_image("".join(plain_text))
plain_text = ""
else:
if image_path is not None and image_path != '':
msg_ref = None
if image_path.startswith("http"):
pic_res = requests.get(image_path, stream = True)
if pic_res.status_code == 200:
image = PILImage.open(io.BytesIO(pic_res.content))
image_path = gu.save_temp_img(image)
if message.raw_message is not None and image_path == '': # file_image and message_reference cannot be passed together
msg_ref = Reference(message_id=message.raw_message.id, ignore_get_message_error=False)
# At this point we have plain_text, image_path and msg_ref
data = {
'content': plain_text,
'msg_id': message.message_id,
'message_reference': msg_ref
}
if is_group:
data['channel_id'] = str(message.channel_id)
else:
data['guild_id'] = str(message.guild_id)
if image_path != '':
data['file_image'] = image_path
try:
# await self._send_wrapper(**data)
self._send_wrapper(**data)
except BaseException as e:
print(e)
# Split an over-length message in half and resend
if "msg over length" in str(e):
split_res = []
split_res.append(plain_text[:len(plain_text)//2])
split_res.append(plain_text[len(plain_text)//2:])
for i in split_res:
data['content'] = i
# await self._send_wrapper(**data)
self._send_wrapper(**data)
else:
# Send the QQ message
try:
# Pad dots so the QQ guild content filter does not drop the message
plain_text = plain_text.replace(".", " . ")
# await self._send_wrapper(**data)
self._send_wrapper(**data)
except BaseException as e:
try:
data['content'] = str.join(" ", plain_text)
# await self._send_wrapper(**data)
self._send_wrapper(**data)
except BaseException as e:
plain_text = re.sub(r'(https|http)?:\/\/(\w|\.|\/|\?|\=|\&|\%)*\b', '[被隐藏的链接]', str(e), flags=re.MULTILINE)
plain_text = plain_text.replace(".", "·")
data['content'] = plain_text
# await self._send_wrapper(**data)
self._send_wrapper(**data)
def _send_wrapper(self, **kwargs):
if 'channel_id' in kwargs:
asyncio.run_coroutine_threadsafe(self.client.api.post_message(**kwargs), self.loop).result()
else:
asyncio.run_coroutine_threadsafe(self.client.api.post_dms(**kwargs), self.loop).result()
def send_msg(self, channel_id: int, message_chain: list, message_id: int = None):
'''
Push a message; if message_id is given, it is sent as a reply. Synchronous (not a coroutine).
'''
_n = NakuruGuildMessage()
_n.channel_id = channel_id
_n.message_id = message_id
# reply_msg now takes is_group first; a channel push is a group-context message.
self.reply_msg(True, _n, message_chain)
def send(self, message_obj, message_chain: list):
'''
Send a message. Same semantics as reply_msg(). Synchronous (not a coroutine).
'''
# reply_msg now takes is_group first; a channel-context message is assumed here.
self.reply_msg(True, message_obj, message_chain)
def wait_for_message(self, channel_id: int) -> NakuruGuildMessage:
'''
Wait for the next message in the given channel_id; raises after a 300 s timeout.
'''
self.waiting[channel_id] = ''
cnt = 0
while True:
if channel_id in self.waiting and self.waiting[channel_id] != '':
# Consume the pending entry
ret = self.waiting[channel_id]
del self.waiting[channel_id]
return ret
cnt += 1
if cnt > 300:
raise Exception("Timed out waiting for a message.")
time.sleep(1)
def get_cnt(self):
return self.qqchan_cnt
def set_cnt(self, cnt):
self.qqchan_cnt = cnt
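reply_msg's fallback chain works around the guild content filter in escalating steps: first pad dots, then (in the final except branch) hide links and swap dots for middle dots. A condensed sketch of those two transforms, using the same regex as above; `[link hidden]` is an English rendering of the original placeholder text, and the helper names are illustrative:

```python
import re

def soften_for_filter(text: str) -> str:
    # First attempt: pad dots so the guild content filter is less likely
    # to drop the message (mirrors plain_text.replace(".", " . ")).
    return text.replace(".", " . ")

def mask_links(text: str) -> str:
    # Last-resort fallback: hide URLs and replace dots with middle dots,
    # using the same regex as reply_msg's final except branch.
    text = re.sub(r'(https|http)?:\/\/(\w|\.|\/|\?|\=|\&|\%)*\b',
                  '[link hidden]', text, flags=re.MULTILINE)
    return text.replace(".", "·")
```

These are lossy transforms, which is why they are applied only after a plain send has already been rejected.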


@@ -1,217 +0,0 @@
import io
import botpy
from PIL import Image as PILImage
from botpy.message import Message, DirectMessage
import re
import asyncio
import requests
from cores.qqbot.personality import personalities
from util import general_utils as gu
from nakuru.entities.components import Plain, At, Image
from botpy.types.message import Reference
from botpy import Client
import time
class NakuruGuildMember():
tiny_id: int # sender tiny id
user_id: int # sender user id
title: str
nickname: str # nickname
role: int # role
icon_url: str # avatar URL
class NakuruGuildMessage():
type: str = "GuildMessage"
self_id: int # bot QQ id
self_tiny_id: int # bot tiny id
sub_type: str # message sub-type
message_id: str # message id
guild_id: int # guild id
channel_id: int # sub-channel id
user_id: int # sender QQ id
message: list # message components
sender: NakuruGuildMember # sender info
raw_message: Message
def __str__(self) -> str:
return str(self.__dict__)
class QQChan():
def __init__(self, cnt: dict = None) -> None:
self.qqchan_cnt = 0
self.waiting: dict = {}
def get_cnt(self):
return self.qqchan_cnt
def set_cnt(self, cnt):
self.qqchan_cnt = cnt
def run_bot(self, botclient: Client, appid, token):
intents = botpy.Intents(public_guild_messages=True, direct_message=True)
self.client = botclient
self.client.run(appid=appid, token=token)
# Compatibility layer between gocq and the guild SDK
def gocq_compatible_send(self, gocq_message_chain: list):
plain_text = ""
image_path = None # only one img supported
for i in gocq_message_chain:
if isinstance(i, Plain):
plain_text += i.text
elif isinstance(i, Image) and image_path is None:
if i.path is not None:
image_path = i.path
else:
image_path = i.file
return plain_text, image_path
# Compatibility layer between gocq and the guild SDK
def gocq_compatible_receive(self, message: Message) -> NakuruGuildMessage:
ngm = NakuruGuildMessage()
try:
ngm.self_id = message.mentions[0].id
ngm.self_tiny_id = message.mentions[0].id
except:
ngm.self_id = 0
ngm.self_tiny_id = 0
ngm.sub_type = "normal"
ngm.message_id = message.id
ngm.guild_id = int(message.guild_id)
ngm.channel_id = int(message.channel_id)
ngm.user_id = int(message.author.id)
msg = []
plain_content = message.content.replace("<@!"+str(ngm.self_id)+">", "").strip()
msg.append(Plain(plain_content))
if message.attachments:
for i in message.attachments:
if i.content_type.startswith("image"):
url = i.url
if not url.startswith("http"):
url = "https://"+url
img = Image.fromURL(url)
msg.append(img)
ngm.message = msg
ngm.sender = NakuruGuildMember()
ngm.sender.tiny_id = int(message.author.id)
ngm.sender.user_id = int(message.author.id)
ngm.sender.title = ""
ngm.sender.nickname = message.author.username
ngm.sender.role = 0
ngm.sender.icon_url = message.author.avatar
ngm.raw_message = message
return ngm
def send_qq_msg(self,
message: NakuruGuildMessage,
res: list):
'''
Reply to a guild (QQ channel) message.
'''
gu.log("Replying to QQ guild message: " + str(res), level=gu.LEVEL_INFO, tag="QQChannel", max_len=500)
self.qqchan_cnt += 1
plain_text = ""
image_path = None
if isinstance(res, list):
# gocq compatibility
plain_text, image_path = self.gocq_compatible_send(res)
elif isinstance(res, str):
plain_text = res
# print(plain_text, image_path)
msg_ref = None
if message.raw_message is not None:
msg_ref = Reference(message_id=message.raw_message.id, ignore_get_message_error=False)
if image_path is not None:
msg_ref = None
if image_path.startswith("http"):
pic_res = requests.get(image_path, stream = True)
if pic_res.status_code == 200:
image = PILImage.open(io.BytesIO(pic_res.content))
image_path = gu.save_temp_img(image)
try:
# reply_res = asyncio.run_coroutine_threadsafe(message.raw_message.reply(content=str(plain_text), message_reference = msg_ref, file_image=image_path), self.client.loop)
reply_res = asyncio.run_coroutine_threadsafe(self.client.api.post_message(channel_id=str(message.channel_id),
content=str(plain_text),
msg_id=message.message_id,
file_image=image_path,
message_reference=msg_ref), self.client.loop)
reply_res.result()
except BaseException as e:
# Split an over-length message in half and resend
if "msg over length" in str(e):
split_res = []
split_res.append(plain_text[:len(plain_text)//2])
split_res.append(plain_text[len(plain_text)//2:])
for i in split_res:
reply_res = asyncio.run_coroutine_threadsafe(self.client.api.post_message(channel_id=str(message.channel_id),
content=str(i),
msg_id=message.message_id,
file_image=image_path,
message_reference=msg_ref), self.client.loop)
reply_res.result()
else:
# Send the QQ message
try:
# Pad dots so the QQ guild content filter does not drop the message
plain_text = plain_text.replace(".", " . ")
reply_res = asyncio.run_coroutine_threadsafe(self.client.api.post_message(channel_id=str(message.channel_id),
content=str(plain_text),
msg_id=message.message_id,
file_image=image_path,
message_reference=msg_ref), self.client.loop).result() # send the message
except BaseException as e:
print("QQ guild API error:\n"+str(e))
try:
# reply_res = asyncio.run_coroutine_threadsafe(message.raw_message.reply(content=str(str.join(" ", plain_text)), message_reference = msg_ref, file_image=image_path), self.client.loop)
reply_res = asyncio.run_coroutine_threadsafe(self.client.api.post_message(channel_id=str(message.channel_id),
content=str(str.join(" ", plain_text)),
msg_id=message.message_id,
file_image=image_path,
message_reference=msg_ref), self.client.loop).result()
except BaseException as e:
plain_text = re.sub(r'(https|http)?:\/\/(\w|\.|\/|\?|\=|\&|\%)*\b', '[被隐藏的链接]', str(e), flags=re.MULTILINE)
plain_text = plain_text.replace(".", "·")
reply_res = asyncio.run_coroutine_threadsafe(self.client.api.post_message(channel_id=str(message.channel_id),
content=plain_text,
msg_id=message.message_id,
file_image=image_path,
message_reference=msg_ref), self.client.loop).result()
def push_message(self, channel_id: int, message_chain: list, message_id: int = None):
'''
Push a message; if message_id is given, it is sent as a reply.
'''
_n = NakuruGuildMessage()
_n.channel_id = channel_id
_n.message_id = message_id
self.send_qq_msg(_n, message_chain)
def send(self, message_obj, message_chain: list):
'''
Send a message.
'''
self.send_qq_msg(message_obj, message_chain)
def wait_for_message(self, channel_id: int) -> NakuruGuildMessage:
'''
Wait for the next message in the given channel_id; raises after a 300 s timeout.
'''
self.waiting[channel_id] = ''
cnt = 0
while True:
if channel_id in self.waiting and self.waiting[channel_id] != '':
# Consume the pending entry
ret = self.waiting[channel_id]
del self.waiting[channel_id]
return ret
cnt += 1
if cnt > 300:
raise Exception("Timed out waiting for a message.")
time.sleep(1)
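All three adapters implement the same `wait_for_message` handshake: register an empty sentinel in a shared `waiting` dict, busy-wait for the handler to fill it in, and raise on timeout. A condensed, self-contained sketch of that handshake — the budget is shortened from the adapters' 300 s / 1 s poll for demonstration, and `deliver` stands in for the handler-side code that fills the slot:

```python
import time

waiting: dict = {}

def deliver(session_id, message) -> None:
    # Handler side: fill the slot only if someone registered interest.
    if session_id in waiting and waiting[session_id] == '':
        waiting[session_id] = message

def wait_for_message(session_id, timeout_s: float = 3.0, poll_s: float = 0.01):
    # Same busy-wait handshake as the adapters' wait_for_message
    # (which polls once per second with a 300 s budget).
    waiting[session_id] = ''
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if waiting.get(session_id, '') != '':
            return waiting.pop(session_id)
        time.sleep(poll_s)
    waiting.pop(session_id, None)
    raise TimeoutError("timed out waiting for a message")
```

A threading.Event or queue would avoid the polling, but the dict-sentinel version matches what the adapters actually do.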


@@ -1,188 +0,0 @@
import requests
import asyncio
import websockets
from websockets import WebSocketClientProtocol
import json
import inspect
from typing import Callable, Awaitable, Union
from pydantic import BaseModel
import datetime
class Event(BaseModel):
GroupMessage: str = "GuildMessage"
class Sender(BaseModel):
user_id: str
member_openid: str
class MessageComponent(BaseModel):
type: str
class PlainText(MessageComponent):
text: str
class Image(MessageComponent):
path: str
file: str
url: str
class MessageChain(list):
def append(self, __object: MessageComponent) -> None:
if not isinstance(__object, MessageComponent):
raise TypeError("Unsupported message-chain element type; elements must subclass MessageComponent.")
return super().append(__object)
def insert(self, __index: int, __object: MessageComponent) -> None:
if not isinstance(__object, MessageComponent):
raise TypeError("Unsupported message-chain element type; elements must subclass MessageComponent.")
return super().insert(__index, __object)
def parse_from_nakuru(self, nakuru_message_chain: Union[list, str]) -> None:
if isinstance(nakuru_message_chain, str):
self.append(PlainText(type='Plain', text=nakuru_message_chain))
else:
for i in nakuru_message_chain:
if i['type'] == 'Plain':
self.append(PlainText(type='Plain', text=i['text']))
elif i['type'] == 'Image':
self.append(Image(path=i['path'], file=i['file'], url=i['url']))
class Message(BaseModel):
type: str
user_id: str
member_openid: str
message_id: str
group_id: str
group_openid: str
content: str
message: MessageChain
time: int
sender: Sender
class UnofficialQQBotSDK:
GET_APP_ACCESS_TOKEN_URL = "https://bots.qq.com/app/getAppAccessToken"
OPENAPI_BASE_URL = "https://api.sgroup.qq.com"
def __init__(self, appid: str, client_secret: str) -> None:
self.appid = appid
self.client_secret = client_secret
self.events: dict[str, Awaitable] = {}
def run_bot(self) -> None:
self.__get_access_token()
self.__get_wss_endpoint()
asyncio.get_event_loop().run_until_complete(self.__ws_client())
def __get_access_token(self) -> None:
res = requests.post(self.GET_APP_ACCESS_TOKEN_URL, json={
"appId": self.appid,
"clientSecret": self.client_secret
}, headers={
"Content-Type": "application/json"
})
res = res.json()
if 'access_token' not in res:
raise Exception(f"Failed to get access_token. Reason: {res['message'] if 'message' in res else 'unknown'}")
self.access_token = 'QQBot ' + res['access_token']
def __auth_header(self) -> str:
return {
'Authorization': self.access_token,
'X-Union-Appid': self.appid,
}
def __get_wss_endpoint(self):
res = requests.get(self.OPENAPI_BASE_URL + "/gateway", headers=self.__auth_header())
self.wss_endpoint = res.json()['url']
# print("wss_endpoint: " + self.wss_endpoint)
async def __behav_heartbeat(self, ws: WebSocketClientProtocol, t: int):
while True:
await asyncio.sleep(t - 1)
try:
await ws.send(json.dumps({
"op": 1,
"d": self.s
}))
except:
print("heartbeat error.")
async def __handle_msg(self, ws: WebSocketClientProtocol, msg: dict):
if msg['op'] == 10:
asyncio.get_event_loop().create_task(self.__behav_heartbeat(ws, msg['d']['heartbeat_interval'] / 1000))
# Authenticate (IDENTIFY) to obtain a session
await ws.send(json.dumps({
"op": 2,
"d": {
"token": self.access_token,
"intents": 33554432,
"shard": [0, 1],
"properties": {
"$os": "linux",
"$browser": "my_library",
"$device": "my_library"
}
}
}))
if msg['op'] == 0:
# ready
data = msg['d']
event_typ: str = msg['t'] if 't' in msg else None
if event_typ == 'GROUP_AT_MESSAGE_CREATE':
if 'GroupMessage' in self.events:
coro = self.events['GroupMessage']
else:
return
message_chain = MessageChain()
message_chain.append(PlainText(type="Plain", text=data['content']))
group_message = Message(
type='GroupMessage',
user_id=data['author']['id'],
member_openid=data['author']['member_openid'],
message_id=data['id'],
group_id=data['group_id'],
group_openid=data['group_openid'],
content=data['content'],
# 2023-11-24T19:51:11+08:00
time=int(datetime.datetime.strptime(data['timestamp'], "%Y-%m-%dT%H:%M:%S%z").timestamp()),
sender=Sender(
user_id=data['author']['id'],
member_openid=data['author']['member_openid']
),
message=message_chain
)
await coro(self, group_message)
async def send(self, message: Message, message_chain: MessageChain) -> None:
# TODO: support more component types when converting the message chain.
plain_text = ""
for i in message_chain:
if isinstance(i, PlainText):
plain_text += i.text
requests.post(self.OPENAPI_BASE_URL + f"/v2/groups/{message.group_openid}/messages", headers=self.__auth_header(), json={
"content": plain_text,
"message_type": 0,
"msg_id": message.message_id
})
async def __ws_client(self):
self.s = 0
async with websockets.connect(self.wss_endpoint) as websocket:
while True:
msg = await websocket.recv()
msg = json.loads(msg)
if 's' in msg:
self.s = msg['s']
await self.__handle_msg(websocket, msg)
def on(self, event: str):
def wrapper(func: Awaitable):
if not inspect.iscoroutinefunction(func):
raise TypeError("func must be a coroutine function")
self.events[event] = func
return func
return wrapper


@@ -10,6 +10,7 @@ from model.provider.provider import Provider
import threading
from util import general_utils as gu
from util.cmd_config import CmdConfig
from util.general_utils import Logger
import traceback
import tiktoken
@@ -19,9 +20,10 @@ abs_path = os.path.dirname(os.path.realpath(sys.argv[0])) + '/'
class ProviderOpenAIOfficial(Provider):
def __init__(self, cfg):
self.cc = CmdConfig()
self.logger = Logger()
self.key_list = []
# If cfg['key'] contains a string of length 1, the config format is wrong; fail immediately
for key in cfg['key']:
if len(key) == 1:
input("检查到了长度为 1 的Key。配置文件中的 openai.key 处的格式错误 (符号 - 的后面要加空格),请退出程序并检查配置文件,按回车跳过。")
@@ -29,7 +31,7 @@ class ProviderOpenAIOfficial(Provider):
if cfg['key'] != '' and cfg['key'] != None:
self.key_list = cfg['key']
else:
input("[System] 请先去完善ChatGPT的Key。详情请前往https://beta.openai.com/account/api-keys")
input("[System] 请先填写 Key。详情请前往 https://beta.openai.com/account/api-keys 或使用中转 Key 方案。")
if len(self.key_list) == 0:
raise Exception("您打开了 OpenAI 模型服务,但是未填写 key。请前往填写。")
@@ -40,15 +42,16 @@ class ProviderOpenAIOfficial(Provider):
self.api_base = None
if 'api_base' in cfg and cfg['api_base'] != 'none' and cfg['api_base'] != '':
self.api_base = cfg['api_base']
gu.log(f"设置 api_base 为: {self.api_base}")
# openai client
self.logger.log(f"设置 api_base 为: {self.api_base}", tag="OpenAI")
# Create the OpenAI client
self.client = OpenAI(
api_key=self.key_list[0],
base_url=self.api_base
)
self.openai_model_configs: dict = cfg['chatGPTConfigs']
gu.log(f'加载 OpenAI Chat Configs: {self.openai_model_configs}')
self.logger.log(f'加载 OpenAI Chat Configs: {self.openai_model_configs}', tag="OpenAI")
self.openai_configs = cfg
# Session cache
self.session_dict = {}
@@ -59,30 +62,14 @@ class ProviderOpenAIOfficial(Provider):
self.enc = tiktoken.get_encoding("cl100k_base")
# Load chat history from the SQLite DB
try:
db1 = dbConn()
for session in db1.get_all_session():
self.session_dict[session[0]] = json.loads(session[1])['data']
gu.log("读取历史记录成功。")
self.logger.log("读取历史记录成功。", tag="OpenAI")
except BaseException as e:
gu.log("读取历史记录失败,但不影响使用。", level=gu.LEVEL_ERROR)
# Load usage statistics
if not os.path.exists(abs_path+"configs/stat"):
with open(abs_path+"configs/stat", 'w', encoding='utf-8') as f:
json.dump({}, f)
self.stat_file = open(abs_path+"configs/stat", 'r', encoding='utf-8')
global count
res = self.stat_file.read()
if res == '':
count = {}
else:
try:
count = json.loads(res)
except BaseException:
pass
self.logger.log("读取历史记录失败,但不影响使用。", level=gu.LEVEL_ERROR, tag="OpenAI")
# Start the history-dump timer thread
threading.Thread(target=self.dump_history, daemon=True).start()
@@ -90,7 +77,6 @@ class ProviderOpenAIOfficial(Provider):
# Personality
self.curr_personality = {}
# Dump chat history periodically
def dump_history(self):
time.sleep(10)
@@ -99,7 +85,6 @@ class ProviderOpenAIOfficial(Provider):
try:
# print("转储历史记录...")
for key in self.session_dict:
# print("TEST: "+str(db.get_session(key)))
data = self.session_dict[key]
data_json = {
'data': data
@@ -144,20 +129,6 @@ class ProviderOpenAIOfficial(Provider):
# Session bookkeeping
if session_id not in self.session_dict:
self.session_dict[session_id] = []
fjson = {}
try:
f = open(abs_path+"configs/session", "r", encoding="utf-8")
fjson = json.loads(f.read())
f.close()
except:
pass
finally:
fjson[session_id] = 'true'
f = open(abs_path+"configs/session", "w", encoding="utf-8")
f.write(json.dumps(fjson))
f.flush()
f.close()
if len(self.session_dict[session_id]) == 0:
# Apply the default personality
@@ -169,11 +140,11 @@ class ProviderOpenAIOfficial(Provider):
_encoded_prompt = self.enc.encode(prompt)
if self.openai_model_configs['max_tokens'] < len(_encoded_prompt):
prompt = self.enc.decode(_encoded_prompt[:int(self.openai_model_configs['max_tokens']*0.80)])
gu.log(f"注意,有一部分 prompt 文本由于超出 token 限制而被截断。", level=gu.LEVEL_WARNING, max_len=300)
self.logger.log(f"注意,有一部分 prompt 文本由于超出 token 限制而被截断。", level=gu.LEVEL_WARNING, tag="OpenAI")
cache_data_list, new_record, req = self.wrap(prompt, session_id, image_url)
gu.log(f"CACHE_DATA_: {str(cache_data_list)}", level=gu.LEVEL_DEBUG, max_len=99999)
gu.log(f"OPENAI REQUEST: {str(req)}", level=gu.LEVEL_DEBUG, max_len=9999)
self.logger.log(f"CACHE_DATA_: {str(cache_data_list)}", level=gu.LEVEL_DEBUG, tag="OpenAI")
self.logger.log(f"OPENAI REQUEST: {str(req)}", level=gu.LEVEL_DEBUG, tag="OpenAI")
retry = 0
response = None
err = ''
@@ -216,7 +187,7 @@ class ProviderOpenAIOfficial(Provider):
if 'Invalid content type. image_url is only supported by certain models.' in str(e):
raise e
if 'You exceeded' in str(e) or 'Billing hard limit has been reached' in str(e) or 'No API key provided' in str(e) or 'Incorrect API key provided' in str(e):
gu.log("当前Key已超额或异常, 正在切换", level=gu.LEVEL_WARNING)
self.logger.log("当前 Key 已超额或异常, 正在切换", level=gu.LEVEL_WARNING, tag="OpenAI")
self.key_stat[self.client.api_key]['exceed'] = True
is_switched = self.handle_switch_key()
if not is_switched:
@@ -224,7 +195,7 @@ class ProviderOpenAIOfficial(Provider):
raise e
retry -= 1
elif 'maximum context length' in str(e):
gu.log("token超限, 清空对应缓存,并进行消息截断")
self.logger.log("token 超限, 清空对应缓存,并进行消息截断", tag="OpenAI")
self.session_dict[session_id] = []
prompt = prompt[:int(len(prompt)*truncate_rate)]
truncate_rate -= 0.05
@@ -234,15 +205,15 @@ class ProviderOpenAIOfficial(Provider):
time.sleep(30)
continue
else:
gu.log(str(e), level=gu.LEVEL_ERROR)
self.logger.log(str(e), level=gu.LEVEL_ERROR, tag="OpenAI")
time.sleep(2)
err = str(e)
retry += 1
if retry >= 10:
gu.log(r"如果报错, 且您的机器在中国大陆内, 请确保您的电脑已经设置好代理软件(梯子), 并在配置文件设置了系统代理地址。详见https://github.com/Soulter/QQChannelChatGPT/wiki/%E4%BA%8C%E3%80%81%E9%A1%B9%E7%9B%AE%E9%85%8D%E7%BD%AE%E6%96%87%E4%BB%B6%E9%85%8D%E7%BD%AE", max_len=999)
self.logger.log(r"如果报错, 且您的机器在中国大陆内, 请确保您的电脑已经设置好代理软件(梯子), 并在配置文件设置了系统代理地址。详见 https://github.com/Soulter/QQChannelChatGPT/wiki", tag="OpenAI")
raise BaseException("连接出错: "+str(err))
assert isinstance(response, ChatCompletion)
gu.log(f"OPENAI RESPONSE: {response.usage}", level=gu.LEVEL_DEBUG, max_len=9999)
self.logger.log(f"OPENAI RESPONSE: {response.usage}", level=gu.LEVEL_DEBUG, tag="OpenAI")
# Classify the response
choice = response.choices[0]
@@ -307,17 +278,17 @@ class ProviderOpenAIOfficial(Provider):
image_url.append(response.data[i].url)
break
except Exception as e:
gu.log(str(e), level=gu.LEVEL_ERROR)
self.logger.log(str(e), level=gu.LEVEL_ERROR)
if 'You exceeded' in str(e) or 'Billing hard limit has been reached' in str(
e) or 'No API key provided' in str(e) or 'Incorrect API key provided' in str(e):
gu.log("当前 Key 已超额或者不正常, 正在切换", level=gu.LEVEL_WARNING)
self.logger.log("当前 Key 已超额或者不正常, 正在切换", level=gu.LEVEL_WARNING, tag="OpenAI")
self.key_stat[self.client.api_key]['exceed'] = True
is_switched = self.handle_switch_key()
if not is_switched:
# Every key is exceeded or invalid
raise e
elif 'Your request was rejected as a result of our safety system.' in str(e):
gu.log("您的请求被 OpenAI 安全系统拒绝, 请稍后再试", level=gu.LEVEL_WARNING)
self.logger.log("您的请求被 OpenAI 安全系统拒绝, 请稍后再试", level=gu.LEVEL_WARNING, tag="OpenAI")
raise e
else:
retry += 1
@@ -360,34 +331,6 @@ class ProviderOpenAIOfficial(Provider):
usage_tokens += int(item['single_tokens'])
return usage_tokens
'''
Get usage statistics.
'''
def get_stat(self):
try:
f = open(abs_path+"configs/stat", "r", encoding="utf-8")
fjson = json.loads(f.read())
f.close()
guild_count = 0
guild_msg_count = 0
guild_direct_msg_count = 0
for k,v in fjson.items():
guild_count += 1
guild_msg_count += v['count']
guild_direct_msg_count += v['direct_count']
session_count = 0
f = open(abs_path+"configs/session", "r", encoding="utf-8")
fjson = json.loads(f.read())
f.close()
for k,v in fjson.items():
session_count += 1
return guild_count, guild_msg_count, guild_direct_msg_count, session_count
except:
return -1, -1, -1, -1
# Wrap the request messages
def wrap(self, prompt, session_id, image_url = None):
if image_url is not None:
@@ -431,10 +374,10 @@ class ProviderOpenAIOfficial(Provider):
continue
is_all_exceed = False
self.client.api_key = key
gu.log(f"切换到Key: {key}, 已使用token: {self.key_stat[key]['used']}", level=gu.LEVEL_INFO)
self.logger.log(f"切换到 Key: {key}(已使用 token: {self.key_stat[key]['used']})", level=gu.LEVEL_INFO, tag="OpenAI")
break
if is_all_exceed:
gu.log("所有Key已超额", level=gu.LEVEL_CRITICAL)
self.logger.log("所有 Key 已超额", level=gu.LEVEL_CRITICAL, tag="OpenAI")
return False
return True
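handle_switch_key rotates to the next usable API key, skipping the active key and any key already flagged as exceeded, and reports failure only when every key is exhausted. A self-contained sketch of that selection logic — `pick_next_key` and the sample `key_stat` entries are illustrative, not the provider's actual method:

```python
def pick_next_key(key_stat: dict, current: str):
    # Sketch of handle_switch_key's rotation: skip the active key and any
    # key already flagged 'exceed'; None means every key is exhausted.
    for key, stat in key_stat.items():
        if key == current or stat.get('exceed'):
            continue
        return key
    return None

# Illustrative stats in the same shape as self.key_stat (exceed flag + used tokens).
stats = {
    'k1': {'exceed': True, 'used': 120},
    'k2': {'exceed': False, 'used': 30},
    'k3': {'exceed': True, 'used': 99},
}
```

In the provider, the chosen key is then assigned to `self.client.api_key`, and a `None` result maps to the "every key is exceeded" critical log.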


@@ -15,13 +15,19 @@ class ProviderRevChatGPT(Provider):
for i in range(0, len(config['account'])):
try:
gu.log(f"创建逆向ChatGPT负载{str(i+1)}中...", level=gu.LEVEL_INFO, tag="RevChatGPT")
if 'password' in config['account'][i]:
gu.log(f"创建逆向ChatGPT负载{str(i+1)}失败: 已不支持账号密码登录请使用access_token方式登录。", level=gu.LEVEL_ERROR, tag="RevChatGPT")
continue
rev_account_config = {
'access_token': config['account'][i]['access_token'],
}
if isinstance(config['account'][i], str):
# A bare string is treated as an access_token
rev_account_config = {
'access_token': config['account'][i],
}
else:
if 'password' in config['account'][i]:
gu.log(f"创建逆向ChatGPT负载{str(i+1)}失败: 已不支持账号密码登录请使用access_token方式登录。", level=gu.LEVEL_ERROR, tag="RevChatGPT")
continue
rev_account_config = {
'access_token': config['account'][i]['access_token'],
}
if self.cc.get("rev_chatgpt_model") != "":
rev_account_config['model'] = self.cc.get("rev_chatgpt_model")
if len(self.cc.get("rev_chatgpt_plugin_ids")) > 0:

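The new branch above accepts two shapes for each `config['account']` entry (a bare access_token string, or a dict) and skips password-based entries, which are no longer supported. A standalone sketch of that normalization; the function name and the skip message are illustrative, not from the source:

```python
def normalize_accounts(accounts):
    """Turn a mixed list of access-token strings and dicts into
    uniform config dicts, skipping password-based entries."""
    normalized = []
    for i, entry in enumerate(accounts):
        if isinstance(entry, str):
            # a bare string is treated as an access_token
            normalized.append({"access_token": entry})
        elif "password" in entry:
            # password login is no longer supported; skip this entry
            print(f"account {i + 1} skipped: password login unsupported")
        else:
            normalized.append({"access_token": entry["access_token"]})
    return normalized
```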

@@ -1,104 +0,0 @@
from model.provider.provider import Provider
from EdgeGPT import Chatbot, ConversationStyle
import json
import os
from util import general_utils as gu
from util.cmd_config import CmdConfig as cc
import time
class ProviderRevEdgeGPT(Provider):
def __init__(self):
raise Exception("Bing 逆向已停止维护,不可用,请使用 ChatGPT 官方 API。")
self.busy = False
self.wait_stack = []
with open('./cookies.json', 'r') as f:
cookies = json.load(f)
proxy = cc.get("bing_proxy", None)
if proxy == "":
proxy = None
# q = Query("Hello, bing!", cookie_files="./cookies.json")
# print(q)
self.bot = EdgeChatbot(cookies=cookies, proxy = "http://127.0.0.1:7890")
ret = self.bot.ask_stream("Hello, bing!", conversation_style=ConversationStyle.creative, wss_link="wss://ai.nothingnessvoid.tech/sydney/ChatHub")
# self.bot = Chatbot(cookies=cookies, proxy = proxy)
for i in ret:
print(i, flush=True)
def is_busy(self):
return self.busy
async def forget(self, session_id = None):
try:
await self.bot.reset()
return True
except BaseException:
return False
async def text_chat(self, prompt, platform = 'none', image_url=None, function_call=None):
while self.busy:
time.sleep(1)
self.busy = True
resp = 'err'
err_count = 0
retry_count = 5
while err_count < retry_count:
try:
resp = await self.bot.ask(prompt=prompt, conversation_style=ConversationStyle.creative)
# print("[RevEdgeGPT] "+str(resp))
if 'messages' not in resp['item']:
await self.bot.reset()
msj_obj = resp['item']['messages'][len(resp['item']['messages'])-1]
reply_msg = msj_obj['text']
if 'sourceAttributions' in msj_obj:
reply_source = msj_obj['sourceAttributions']
else:
reply_source = []
if 'throttling' in resp['item']:
throttling = resp['item']['throttling']
# print(throttling)
else:
throttling = None
if 'I\'m sorry but I prefer not to continue this conversation. I\'m still learning so I appreciate your understanding and patience.' in reply_msg:
self.busy = False
return '', 0
if reply_msg == prompt:
# resp += '\n\n如果你没有让我复述你的话那代表我可能不想和你继续这个话题了请输入reset重置会话😶'
await self.forget()
err_count += 1
continue
if reply_source is None:
# declined to reply
return '', 0
else:
if platform != 'qqchan':
index = 1
if len(reply_source) > 0:
reply_msg += "\n\n信息来源:\n"
for i in reply_source:
reply_msg += f"[{str(index)}]: {i['seeMoreUrl']} | {i['providerDisplayName']}\n"
index += 1
if throttling is not None:
if throttling['numUserMessagesInConversation'] == throttling['maxNumUserMessagesInConversation']:
# limit reached; reset the session
await self.forget()
if throttling['numUserMessagesInConversation'] > throttling['maxNumUserMessagesInConversation']:
await self.forget()
err_count += 1
continue
reply_msg += f"\n[{throttling['numUserMessagesInConversation']}/{throttling['maxNumUserMessagesInConversation']}]"
break
except BaseException as e:
gu.log(str(e), level=gu.LEVEL_WARNING, tag="RevEdgeGPT")
err_count += 1
if err_count >= retry_count:
gu.log(r"如果报错, 且您的机器在中国大陆内, 请确保您的电脑已经设置好代理软件(梯子), 并在配置文件设置了系统代理地址。详见https://github.com/Soulter/QQChannelChatGPT/wiki/%E4%BA%8C%E3%80%81%E9%A1%B9%E7%9B%AE%E9%85%8D%E7%BD%AE%E6%96%87%E4%BB%B6%E9%85%8D%E7%BD%AE", max_len=999)
self.busy = False
raise e
gu.log("请求出现了一些问题, 正在重试。次数"+str(err_count), level=gu.LEVEL_WARNING, tag="RevEdgeGPT")
self.busy = False
# print("[RevEdgeGPT] "+str(reply_msg))
return reply_msg, 1


@@ -1,7 +1,7 @@
pydantic~=1.10.4
requests~=2.28.1
openai~=1.2.3
qq-botpy==1.1.2
qq-botpy
chardet~=5.1.0
Pillow~=9.4.0
GitPython~=3.1.31

Binary file not shown.

test.py Normal file

@@ -0,0 +1,8 @@
from pip._internal.operations.freeze import freeze
# collect info about installed packages
installed_packages = freeze()
# print each installed package
for package in installed_packages:
print(package)
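The same listing can be produced with the standard library instead of pip internals, which are not a stable API; a sketch using `importlib.metadata` (Python 3.8+):

```python
from importlib.metadata import distributions

# enumerate installed distributions as name==version strings
packages = sorted(f"{d.metadata['Name']}=={d.version}" for d in distributions())
for package in packages:
    print(package)
```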


@@ -80,4 +80,43 @@ class CmdConfig():
if _tag:
with open(cpath, "w", encoding="utf-8-sig") as f:
json.dump(d, f, indent=4, ensure_ascii=False)
f.flush()
f.flush()
def init_astrbot_config_items():
# load default configuration
cc = CmdConfig()
cc.init_attributes("qq_forward_threshold", 200)
cc.init_attributes("qq_welcome", "欢迎加入本群!\n欢迎给https://github.com/Soulter/QQChannelChatGPT项目一个Star😊~\n输入help查看帮助~\n")
cc.init_attributes("qq_pic_mode", False)
cc.init_attributes("rev_chatgpt_model", "")
cc.init_attributes("rev_chatgpt_plugin_ids", [])
cc.init_attributes("rev_chatgpt_PUID", "")
cc.init_attributes("rev_chatgpt_unverified_plugin_domains", [])
cc.init_attributes("gocq_host", "127.0.0.1")
cc.init_attributes("gocq_http_port", 5700)
cc.init_attributes("gocq_websocket_port", 6700)
cc.init_attributes("gocq_react_group", True)
cc.init_attributes("gocq_react_guild", True)
cc.init_attributes("gocq_react_friend", True)
cc.init_attributes("gocq_react_group_increase", True)
cc.init_attributes("gocq_qqchan_admin", "")
cc.init_attributes("other_admins", [])
cc.init_attributes("CHATGPT_BASE_URL", "")
cc.init_attributes("qqbot_appid", "")
cc.init_attributes("qqbot_secret", "")
cc.init_attributes("admin_qq", "")
cc.init_attributes("nick_qq", ["!", "", "ai"])
cc.init_attributes("admin_qqchan", "")
cc.init_attributes("llm_env_prompt", "")
cc.init_attributes("llm_wake_prefix", "")
cc.init_attributes("default_personality_str", "")
cc.init_attributes("openai_image_generate", {
"model": "dall-e-3",
"size": "1024x1024",
"style": "vivid",
"quality": "standard",
})
cc.init_attributes("http_proxy", "")
cc.init_attributes("https_proxy", "")
cc.init_attributes("dashboard_username", "")
cc.init_attributes("dashboard_password", "")
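Judging by its use above, `init_attributes` seeds a default only when the key is absent, so values the user has already set survive. A minimal sketch under that assumption; the dict-backed store is invented for illustration:

```python
class ConfigSketch:
    """Illustrative stand-in for CmdConfig: defaults never
    overwrite values the user has already set."""
    def __init__(self, store=None):
        self.store = store if store is not None else {}

    def init_attributes(self, key, default):
        # only seed the default when the key is missing
        if key not in self.store:
            self.store[key] = default

    def get(self, key, default=None):
        return self.store.get(key, default)
```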


@@ -53,6 +53,7 @@ def google_web_search(keyword) -> str:
for i in ls:
desc = i.description
try:
gu.log(f"搜索网页: {i.url}", tag="网页搜索", level=gu.LEVEL_INFO)
desc = fetch_website_content(i.url)
except BaseException as e:
print(f"(google) fetch_website_content err: {str(e)}")
@@ -74,51 +75,54 @@ def web_keyword_search_via_bing(keyword) -> str:
}
url = "https://www.bing.com/search?q="+keyword
_cnt = 0
_detail_store = []
# _detail_store = []
while _cnt < 5:
try:
response = requests.get(url, headers=headers)
response.encoding = "utf-8"
gu.log(f"bing response: {response.text}", tag="bing", level=gu.LEVEL_DEBUG, max_len=9999)
soup = BeautifulSoup(response.text, "html.parser")
res = []
res = ""
result_cnt = 0
ols = soup.find(id="b_results")
for i in ols.find_all("li", class_="b_algo"):
try:
title = i.find("h2").text
desc = i.find("p").text
link = i.find("h2").find("a").get("href")
res.append({
"title": title,
"desc": desc,
"link": link,
})
if len(res) >= 5: # 限制5条
break
if len(_detail_store) >= 3:
continue
# res.append({
# "title": title,
# "desc": desc,
# "link": link,
# })
try:
gu.log(f"搜索网页: {link}", tag="网页搜索", level=gu.LEVEL_INFO)
desc = fetch_website_content(link)
except BaseException as e:
print(f"(bing) fetch_website_content err: {str(e)}")
# fetch page content for the first two results
if "zhihu.com" in link:
try:
_detail_store.append(special_fetch_zhihu(link))
except BaseException as e:
print(f"zhihu parse err: {str(e)}")
else:
try:
_detail_store.append(fetch_website_content(link))
except BaseException as e:
print(f"fetch_website_content err: {str(e)}")
res += f"# No.{str(result_cnt + 1)}\ntitle: {title}\nurl: {link}\ncontent: {desc}\n\n"
result_cnt += 1
if result_cnt >= 5: break  # cap at 5 results
# if len(_detail_store) >= 3:
# continue
# # 爬取前两条的网页内容
# if "zhihu.com" in link:
# try:
# _detail_store.append(special_fetch_zhihu(link))
# except BaseException as e:
# print(f"zhihu parse err: {str(e)}")
# else:
# try:
# _detail_store.append(fetch_website_content(link))
# except BaseException as e:
# print(f"fetch_website_content err: {str(e)}")
except Exception as e:
print(f"bing parse err: {str(e)}")
if len(res) == 0:
break
if len(_detail_store) > 0:
ret = f"{str(res)} \n具体网页内容: {str(_detail_store)}"
else:
ret = f"{str(res)}"
return str(ret)
if result_cnt == 0: break
return res
except Exception as e:
gu.log(f"bing fetch err: {str(e)}")
_cnt += 1
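The rewritten Bing branch returns one flat text block instead of a list of dicts, numbering each result as `# No.N`. The formatting convention in isolation (page fetching is omitted; the tuples stand in for parsed results):

```python
def format_results(results, limit=5):
    """Render (title, url, content) tuples into the numbered
    text block the LLM receives."""
    out = ""
    for n, (title, url, content) in enumerate(results[:limit]):
        out += f"# No.{n + 1}\ntitle: {title}\nurl: {url}\ncontent: {content}\n\n"
    return out
```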
@@ -152,6 +156,7 @@ def web_keyword_search_via_sougou(keyword) -> str:
if len(res) >= 5: # 限制5条
break
except Exception as e:
pass
gu.log(f"sougou parse err: {str(e)}", tag="web_keyword_search_via_sougou", level=gu.LEVEL_ERROR)
# fetch page content
_detail_store = []
@@ -175,26 +180,6 @@ def fetch_website_content(url):
}
response = requests.get(url, headers=headers, timeout=3)
response.encoding = "utf-8"
# soup = BeautifulSoup(response.text, "html.parser")
# # 如果有container / content / main等的话就只取这些部分
# has = False
# beleive_ls = ["container", "content", "main"]
# res = ""
# for cls in beleive_ls:
# for i in soup.find_all(class_=cls):
# has = True
# res += i.text
# if not has:
# res = soup.text
# res = res.replace("\n", "").replace(" ", " ").replace("\r", "").replace("\t", "")
# if not has:
# res = res[300:1100]
# else:
# res = res[100:800]
# # with open(f"temp_{time.time()}.html", "w", encoding="utf-8") as f:
# # f.write(res)
# gu.log(f"fetch_website_content: end", tag="fetch_website_content", level=gu.LEVEL_DEBUG)
# return res
doc = Document(response.content)
# print('title:', doc.title())
ret = doc.summary(html_partial=True)
@@ -213,7 +198,7 @@ def web_search(question, provider: Provider, session_id, official_fc=False):
"description": "google search query (分词,尽量保留所有信息)"
}],
"通过搜索引擎搜索。如果问题需要在网页上搜索(如天气、新闻或任何需要通过网页获取信息的问题),则调用此函数;如果没有,不要调用此函数。",
google_web_search
web_keyword_search_via_bing
)
new_func_call.add_func("fetch_website_content", [{
"type": "string",
@@ -259,13 +244,20 @@ def web_search(question, provider: Provider, session_id, official_fc=False):
if has_func:
provider.forget(session_id)
question3 = f"""请你用活泼的语气回答`{question}`问题。\n以下是相关材料,请直接拿此材料针对问题进行总结回答。在文章末尾加上各参考链接,如`[1] <title> <url>`不要提到任何函数调用的信息在总结的末尾加上1或2个相关的emoji。```\n{function_invoked_ret}\n```\n"""
question3 = f"""
以下是相关材料,你的任务是:
1. 根据材料对问题`{question}`做切题的总结回答;
2. 发表你对这个问题的看法.
你的总结末尾应当有对材料的引用, 如果有链接, 请在末尾附上引用网页链接。引用格式严格按照 `\n[1] title url \n`。
不要提到任何函数调用的信息。以下是相关材料:
"""
gu.log(f"web_search: {question3}", tag="web_search", level=gu.LEVEL_DEBUG, max_len=99999)
_c = 0
while _c < 3:
try:
print('text chat')
final_ret = provider.text_chat(question3)
final_ret = provider.text_chat(question3 + "```" + function_invoked_ret + "```", session_id)
return final_ret
except Exception as e:
print(e)
@@ -275,5 +267,4 @@ def web_search(question, provider: Provider, session_id, official_fc=False):
provider.forget(session_id)
function_invoked_ret = function_invoked_ret[:int(len(function_invoked_ret) / 2)]
time.sleep(3)
question3 = f"""请回答`{question}`问题。\n以下是相关材料,请直接拿此材料针对问题进行回答,再给参考链接, 参考链接首末有空格。```\n{function_invoked_ret}\n```\n"""
return function_invoked_ret


@@ -7,6 +7,10 @@ import re
import requests
from util.cmd_config import CmdConfig
import socket
from cores.qqbot.global_object import GlobalObject
import platform
import requests
import logging
PLATFORM_GOCQ = 'gocq'
PLATFORM_QQCHAN = 'qqchan'
@@ -37,84 +41,98 @@ BG_COLORS = {
LEVEL_DEBUG = "DEBUG"
LEVEL_INFO = "INFO"
LEVEL_WARNING = "WARNING"
LEVEL_WARNING = "WARN"
LEVEL_ERROR = "ERROR"
LEVEL_CRITICAL = "CRITICAL"
# kept for backward compatibility
level_codes = {
LEVEL_DEBUG: 0,
LEVEL_INFO: 1,
LEVEL_WARNING: 2,
LEVEL_ERROR: 3,
LEVEL_CRITICAL: 4
LEVEL_DEBUG: logging.DEBUG,
LEVEL_INFO: logging.INFO,
LEVEL_WARNING: logging.WARNING,
LEVEL_ERROR: logging.ERROR,
LEVEL_CRITICAL: logging.CRITICAL,
}
level_colors = {
"INFO": "green",
"WARNING": "yellow",
"WARN": "yellow",
"ERROR": "red",
"CRITICAL": "purple",
}
def log(
msg: str,
level: str = "INFO",
tag: str = "System",
fg: str = None,
bg: str = None,
max_len: int = 500,
err: Exception = None,):
"""
Log printing function.
"""
_set_level_code = level_codes[LEVEL_INFO]
if 'LOG_LEVEL' in os.environ and os.environ['LOG_LEVEL'] in level_codes:
_set_level_code = level_codes[os.environ['LOG_LEVEL']]
if level in level_codes and level_codes[level] < _set_level_code:
return
class Logger:
def __init__(self) -> None:
self.history = []
if err is not None:
msg += "\n异常原因: " + str(err)
level = LEVEL_ERROR
def log(
self,
msg: str,
level: str = "INFO",
tag: str = "System",
fg: str = None,
bg: str = None,
max_len: int = 50000,
err: Exception = None,):
"""
Log printing function.
"""
_set_level_code = level_codes[LEVEL_INFO]
if 'LOG_LEVEL' in os.environ and os.environ['LOG_LEVEL'] in level_codes:
_set_level_code = level_codes[os.environ['LOG_LEVEL']]
if len(msg) > max_len:
msg = msg[:max_len] + "..."
now = datetime.datetime.now().strftime("%m-%d %H:%M:%S")
pres = []
for line in msg.split("\n"):
if line == "\n":
pres.append("")
else:
pres.append(f"[{now}] [{level}] [{tag}]: {line}")
if level in level_codes and level_codes[level] < _set_level_code:
return
if err is not None:
msg += "\n异常原因: " + str(err)
level = LEVEL_ERROR
if level == "INFO":
if fg is None:
fg = FG_COLORS["green"]
if bg is None:
bg = BG_COLORS["default"]
elif level == "WARNING":
if fg is None:
fg = FG_COLORS["yellow"]
if bg is None:
bg = BG_COLORS["default"]
elif level == "ERROR":
if fg is None:
fg = FG_COLORS["red"]
if bg is None:
bg = BG_COLORS["default"]
elif level == "CRITICAL":
if fg is None:
fg = FG_COLORS["purple"]
if bg is None:
bg = BG_COLORS["default"]
ret = ""
for line in pres:
ret += f"\033[{fg};{bg}m{line}\033[0m\n"
print(ret[:-1])
if len(msg) > max_len:
msg = msg[:max_len] + "..."
now = datetime.datetime.now().strftime("%H:%M:%S")
pres = []
for line in msg.split("\n"):
if line == "\n":
pres.append("")
else:
pres.append(f"[{now}] [{tag}/{level}] {line}")
if level == "INFO":
if fg is None:
fg = FG_COLORS["green"]
if bg is None:
bg = BG_COLORS["default"]
elif level == "WARN":
if fg is None:
fg = FG_COLORS["yellow"]
if bg is None:
bg = BG_COLORS["default"]
elif level == "ERROR":
if fg is None:
fg = FG_COLORS["red"]
if bg is None:
bg = BG_COLORS["default"]
elif level == "CRITICAL":
if fg is None:
fg = FG_COLORS["purple"]
if bg is None:
bg = BG_COLORS["default"]
ret = ""
for line in pres:
ret += f"\033[{fg};{bg}m{line}\033[0m\n"
try:
requests.post("http://localhost:6185/api/log", data=ret[:-1].encode(), timeout=1)
except BaseException as e:
pass
self.history.append(ret)
if len(self.history) > 100:
self.history = self.history[-100:]
print(ret[:-1])
log = Logger()
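The new `Logger` keeps an in-memory history for the dashboard and trims it to the last 100 entries on every append. The trimming idiom on its own:

```python
class BoundedHistory:
    """Append-only log buffer retaining only the most recent entries."""
    def __init__(self, max_entries=100):
        self.max_entries = max_entries
        self.history = []

    def append(self, entry):
        self.history.append(entry)
        if len(self.history) > self.max_entries:
            # drop the oldest entries, keep the newest max_entries
            self.history = self.history[-self.max_entries:]
```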
def port_checker(port: int, host: str = "localhost"):
sk = socket.socket(socket.AF_INET,socket.SOCK_STREAM)
@@ -127,18 +145,23 @@ def port_checker(port: int, host: str = "localhost"):
sk.close()
return False
def word2img(title: str, text: str, max_width=30, font_size=20):
def get_font_path() -> str:
if os.path.exists("resources/fonts/syst.otf"):
font_path = "resources/fonts/syst.otf"
elif os.path.exists("QQChannelChatGPT/resources/fonts/syst.otf"):
font_path = "QQChannelChatGPT/resources/fonts/syst.otf"
elif os.path.exists("AstrBot/resources/fonts/syst.otf"):
font_path = "AstrBot/resources/fonts/syst.otf"
elif os.path.exists("C:/Windows/Fonts/simhei.ttf"):
font_path = "C:/Windows/Fonts/simhei.ttf"
elif os.path.exists("/usr/share/fonts/opentype/noto/NotoSansCJK-Regular.ttc"):
font_path = "/usr/share/fonts/opentype/noto/NotoSansCJK-Regular.ttc"
else:
raise Exception("找不到字体文件")
return font_path
def word2img(title: str, text: str, max_width=30, font_size=20):
font_path = get_font_path()
width_factor = 1.0
height_factor = 1.5
# wrap text to a maximum width of 30
@@ -147,8 +170,6 @@ def word2img(title: str, text: str, max_width=30, font_size=20):
length = len(lines)
for l in lines:
if len(l) > max_width:
# lines[i] = l[:max_width] + '\n' + l[max_width:]
# for
cp = l
for ii in range(len(l)):
if ii % max_width == 0:
@@ -212,24 +233,9 @@ def render_markdown(markdown_text, image_width=800, image_height=600, font_size=
# regex for matching image links
IMAGE_REGEX = r"!\s*\[.*?\]\s*\((.*?)\)"
if os.path.exists("resources/fonts/syst.otf"):
font_path = "resources/fonts/syst.otf"
elif os.path.exists("QQChannelChatGPT/resources/fonts/syst.otf"):
font_path = "QQChannelChatGPT/resources/fonts/syst.otf"
elif os.path.exists("C:/Windows/Fonts/simhei.ttf"):
font_path = "C:/Windows/Fonts/simhei.ttf"
elif os.path.exists("/usr/share/fonts/opentype/noto/NotoSansCJK-Regular.ttc"):
font_path = "/usr/share/fonts/opentype/noto/NotoSansCJK-Regular.ttc"
else:
raise Exception("找不到字体文件")
# backup
if os.path.exists("resources/fonts/simhei.ttf"):
font_path1 = "resources/fonts/simhei.ttf"
elif os.path.exists("QQChannelChatGPT/resources/fonts/simhei.ttf"):
font_path1 = "QQChannelChatGPT/resources/fonts/simhei.ttf"
else:
font_path1 = font_path
font_path = get_font_path()
font_path1 = font_path
# load the fonts
font = ImageFont.truetype(font_path, font_size)
@@ -476,7 +482,7 @@ def save_temp_img(img: Image) -> str:
if time.time() - ctime > 3600:
os.remove(path)
except Exception as e:
log(f"清除临时文件失败: {e}", level=LEVEL_WARNING, tag="GeneralUtils")
print(f"清除临时文件失败: {e}")  # print() takes no level/tag kwargs
# get the timestamp
timestamp = int(time.time())
@@ -524,11 +530,27 @@ def try_migrate_config(old_config: dict):
cc.put(k, old_config[k])
def get_local_ip_addresses():
ip = ''
try:
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(('8.8.8.8', 80))
ip = s.getsockname()[0]
except BaseException as e:
pass
finally:
s.close()
return ip
return ip
def get_sys_info(global_object: GlobalObject):
mem = None
stats = global_object.dashboard_data.stats
os_name = platform.system()
os_version = platform.version()
if 'sys_perf' in stats and 'memory' in stats['sys_perf']:
mem = stats['sys_perf']['memory']
return {
'mem': mem,
'os': os_name + '_' + os_version,
'py': platform.python_version(),
}


@@ -12,33 +12,40 @@ import shutil
from pip._internal import main as pipmain
import importlib
import stat
import traceback
from types import ModuleType
# find all class names in the module
def get_classes(p_name, arg):
def get_classes(p_name, arg: ModuleType):
classes = []
clsmembers = inspect.getmembers(arg, inspect.isclass)
for (name, _) in clsmembers:
# print(name, p_name)
if p_name.lower() == name.lower()[:-6] or name.lower() == "main":
if name.lower().endswith("plugin") or name.lower() == "main":
classes.append(name)
break
# if p_name.lower() == name.lower()[:-6] or name.lower() == "main":
return classes
# collect all modules under a directory (file name matches the directory name)
def get_modules(path):
modules = []
for root, dirs, files in os.walk(path):
# get the name of the containing directory
p_name = os.path.basename(root)
for file in files:
"""
A file counts as an entry module if it matches the folder name (case-insensitive) or is main.py.
"""
if file.endswith(".py") and not file.startswith("__") and (p_name.lower() == file[:-3].lower() or file[:-3].lower() == "main"):
# list all entries under the plugin directory
dirs = os.listdir(path)
# for each folder, look for main.py or a file named after the folder
for d in dirs:
if os.path.isdir(os.path.join(path, d)):
if os.path.exists(os.path.join(path, d, "main.py")):
module_str = 'main'
elif os.path.exists(os.path.join(path, d, d + ".py")):
module_str = d
else:
print(f"插件 {d} 未找到 main.py 或者 {d}.py跳过。")
continue
if os.path.exists(os.path.join(path, d, "main.py")) or os.path.exists(os.path.join(path, d, d + ".py")):
modules.append({
"pname": p_name,
"module": file[:-3],
"pname": d,
"module": module_str
})
return modules
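The rewritten `get_modules` scans only top-level plugin folders, preferring `main.py` over a module named after the folder. The selection rule, sketched without filesystem access (the directory listing is passed in rather than read from disk):

```python
def pick_entry_module(dirname, files):
    """Return the entry module name for a plugin folder, or None:
    main.py wins, then <dirname>.py."""
    if "main.py" in files:
        return "main"
    if dirname + ".py" in files:
        return dirname
    return None
```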
@@ -100,6 +107,7 @@ def plugin_reload(cached_plugins: dict, target: str = None, all: bool = False):
"root_dir_name": root_dir_name,
}
except BaseException as e:
traceback.print_exc()
fail_rec += f"加载{p}插件出现问题,原因 {str(e)}\n"
if fail_rec == "":
return True, None

util/updator.py Normal file

@@ -0,0 +1,145 @@
has_git = True
try:
import git.exc
from git.repo import Repo
except BaseException as e:
has_git = False
import sys, os
import requests
def _reboot():
py = sys.executable
os.execl(py, py, *sys.argv)
def find_repo() -> Repo:
if not has_git:
raise Exception("未安装 GitPython 库,无法进行更新。")
repo = None
# the project was renamed at some point, so several paths must be tried.
try:
repo = Repo()
except git.exc.InvalidGitRepositoryError:
try:
repo = Repo(path="QQChannelChatGPT")
except git.exc.InvalidGitRepositoryError:
repo = Repo(path="AstrBot")
if not repo:
raise Exception("在已知的目录下未找到项目位置。请联系项目维护者。")
return repo
def request_release_info(latest: bool = True) -> list:
'''
Request release information.
Returns a list; each element is a dict containing the version, publish time, release notes, commit hash, etc.
'''
api_url1 = "https://api.github.com/repos/Soulter/AstrBot/releases"
api_url2 = "https://api.soulter.top/releases" # cached for 0-10 minutes
try:
result = requests.get(api_url2).json()
except BaseException as e:
result = requests.get(api_url1).json()
try:
if latest:
ret = github_api_release_parser([result[0]])
else:
ret = github_api_release_parser(result)
except BaseException as e:
raise Exception(f"解析版本信息失败: {result}")
return ret
def github_api_release_parser(releases: list) -> list:
'''
Parse the releases info returned by the GitHub API.
Returns a list; each element is a dict containing the version, publish time, release notes, commit hash, etc.
'''
ret = []
for release in releases:
version = release['name']
commit_hash = ''
# convention: v3.0.7.xxxxxx, where xxxxxx is the commit hash
_t = version.split(".")
if len(_t) == 4:
commit_hash = _t[3]
ret.append({
"version": release['name'],
"published_at": release['published_at'],
"body": release['body'],
"commit_hash": commit_hash,
"tag_name": release['tag_name']
})
return ret
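`github_api_release_parser` relies on the release-name convention `v3.0.7.xxxxxx`, where the fourth dot-separated field is a short commit hash; `check_update` then compares it via `curr_commit.startswith(new_commit)` because local SHAs are full-length. The extraction step alone:

```python
def commit_hash_from_version(version):
    """Return the trailing commit hash from a v<maj>.<min>.<patch>.<hash>
    release name, or '' when the name has no hash field."""
    parts = version.split(".")
    if len(parts) == 4:
        return parts[3]
    return ""
```

A full local SHA like `abc123f...` then matches the release when `sha.startswith(commit_hash_from_version(name))`.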
def check_update() -> str:
repo = find_repo()
curr_commit = repo.commit().hexsha
update_data = request_release_info()
new_commit = update_data[0]['commit_hash']
print(f"当前版本: {curr_commit}")
print(f"最新版本: {new_commit}")
if curr_commit.startswith(new_commit):
return "当前已经是最新版本。"
else:
update_info = f"""有新版本可用。
=== 当前版本 ===
{curr_commit}
=== 新版本 ===
{update_data[0]['version']}
=== 发布时间 ===
{update_data[0]['published_at']}
=== 更新内容 ===
{update_data[0]['body']}"""
return update_info
def update_project(update_data: list,
reboot: bool = False,
latest: bool = True,
version: str = ''):
repo = find_repo()
# update_data = request_release_info(latest)
if latest:
# check whether the local commit matches the latest one
curr_commit = repo.head.commit.hexsha
new_commit = update_data[0]['commit_hash']
if curr_commit == '':
raise Exception("无法获取当前版本号对应的版本位置。请联系项目维护者。")
if curr_commit.startswith(new_commit):
raise Exception("当前已经是最新版本。")
else:
# check out the commit of the latest release
try:
repo.remotes.origin.fetch()
repo.git.checkout(update_data[0]['tag_name'])
if reboot: _reboot()
except BaseException as e:
raise e
else:
# update to the specified version
flag = False
for data in update_data:
if data['tag_name'] == version:
try:
repo.remotes.origin.fetch()
repo.git.checkout(data['tag_name'])
flag = True
if reboot: _reboot()
except BaseException as e:
raise e
else:
continue
if not flag:
raise Exception("未找到指定版本。")
def checkout_branch(branch_name: str):
repo = find_repo()
try:
origin = repo.remotes.origin
origin.fetch()
repo.git.checkout(branch_name)
repo.git.pull("origin", branch_name, "-f")
return True
except BaseException as e:
raise e


@@ -1,25 +0,0 @@
from flask import Flask
from threading import Thread
import datetime
app = Flask(__name__)
@app.route('/')
def main_func():
content = "<h1>QQChannelChatGPT Web APP</h1>"
content += "<p>" + "Online @ " + str(datetime.datetime.now()) + "</p>"
content += "<p>欢迎Star本项目</p>"
return content
def run():
app.run(host="0.0.0.0", port=8080)
def keep_alive():
server = Thread(target=run)
server.start()