Compare commits

..

26 Commits

Author SHA1 Message Date
copilot-swe-agent[bot]
e8fc3fb0a2 test: add tests for HTTP MCP compatibility fix
Co-authored-by: DeJeune <67425183+DeJeune@users.noreply.github.com>
2025-11-01 18:25:18 +00:00
copilot-swe-agent[bot]
4b254637b0 🐛 fix: remove strict mode from MCP tools to fix HTTP MCP errors
Co-authored-by: DeJeune <67425183+DeJeune@users.noreply.github.com>
2025-11-01 18:23:45 +00:00
copilot-swe-agent[bot]
456b709c29 Initial plan 2025-11-01 18:12:43 +00:00
SuYao
28bc89ac7c perf: optimize QR code generation and connection info for phone LAN export (#11086)
* Increase QR code margin for better scanning reliability

- Change QRCodeSVG marginSize from 2 to 4 pixels
- Maintains same QR code size (160px) and error correction level (Q)
- Improves readability and scanning success rate on mobile devices

* Optimize QR code generation and connection info for phone LAN export

- Increase QR code size to 180px and reduce error correction to 'L' for better mobile scanning
- Replace hardcoded logo path with AppLogo config and increase logo size to 60px
- Simplify connection info by removing candidates array and using only essential IP/port data

* Optimize QR code data structure for LAN connection

- Compress IP addresses to numeric format to reduce QR code complexity
- Use compact array format instead of verbose JSON object structure
- Remove debug logging to streamline connection flow
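The IP compression described above can be sketched as follows. This is a minimal illustration only; `ipv4ToInt` and `buildQrPayload` are hypothetical names, not the PR's actual code:

```typescript
// Pack a dotted-quad IPv4 address into a single integer so the QR
// payload encodes fewer characters than the verbose string form.
function ipv4ToInt(ip: string): number {
  const parts = ip.split(".").map(Number);
  if (parts.length !== 4 || parts.some((p) => Number.isNaN(p) || p < 0 || p > 255)) {
    throw new Error(`Invalid IPv4 address: ${ip}`);
  }
  // Unsigned shift keeps addresses above 127.255.255.255 positive.
  return ((parts[0] << 24) | (parts[1] << 16) | (parts[2] << 8) | parts[3]) >>> 0;
}

// Compact array form [version, packedIp, port] instead of a verbose
// JSON object with a candidates array.
function buildQrPayload(ip: string, port: number): string {
  return JSON.stringify([1, ipv4ToInt(ip), port]);
}
```

Fewer payload characters let the QR encoder emit a lower-density symbol, which is what makes the code easier to scan at the same physical size.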

* feat: update WebSocket status and candidate response types, improve connection info handling

* Increase QR code size and error correction for better scanning

- Increase QR code size from 180px to 300px for improved readability
- Change error correction level from L (low) to H (high) for better reliability
- Reduce logo size from 60px to 40px to accommodate larger QR data
- Increase margin size from 1 to 2 for better border clearance

* Adjust QR code and logo sizes to improve the scanning experience

* fix(i18n): Auto update translations for PR #11086

* fix(i18n): Auto update translations for PR #11086

* fix(i18n): Auto update translations for PR #11086

---------

Co-authored-by: GitHub Action <action@github.com>
2025-11-01 12:13:11 +08:00
fullex
dc06c103e0 chore[lint]: add import type lint (#11091)
chore: add import type lint
2025-11-01 10:40:02 +08:00
SuYao
1f0381aebe Fix/azure embedding (#11044)
* fix: update EmbeddingsFactory to use net.fetch and refactor KnowledgeService to use ModernAiProvider

* fix: remove deprecated @langchain/community dependency from package.json

* fix: add @langchain/community dependency to package.json and update yarn.lock
2025-11-01 01:52:16 +08:00
kangfenmao
fb02a61a48 feat: add AddAssistantOrAgentPopup component and update i18n translations
- Introduced a new AddAssistantOrAgentPopup component for selecting between assistant and agent options.
- Updated English, Simplified Chinese, and Traditional Chinese translations to include descriptions and titles for assistant and agent options.
- Refactored UnifiedAddButton to utilize the new popup for adding assistants or agents.
2025-10-31 23:46:51 +08:00
defi-failure
562fbb3ff7 fix: minor ui tweak of plugin installation interface (#11085)
* fix: use dropdown instead of chip filter

* fix: add padding to avoid scroll bar overlap

* fix: set max card grid col to 2

* fix: minor ui tweak for plugin card

* fix: remove redundant args

* fix(i18n): Auto update translations for PR #11085

* fix: cleanup comments

---------

Co-authored-by: GitHub Action <action@github.com>
2025-10-31 22:28:25 +08:00
Pleasure1234
1018ad87b8 fix: cancel debounced save on file path update (#11069)
Adds cancellation of the debounced save when the active file path is updated after moving a file or folder. This prevents saving to the old path and ensures lastFilePathRef is updated accordingly.
2025-10-31 14:17:06 +00:00
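The cancellation pattern this commit describes can be sketched as below. The debounce helper mimics the lodash `.cancel()` shape, and the editor state names (`lastFilePath`, `onFilePathUpdated`) are assumptions for illustration:

```typescript
// Minimal debounce with a cancel() method, similar to lodash.debounce.
function debounce<T extends unknown[]>(fn: (...args: T) => void, ms: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const run = (...args: T) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
  const cancel = () => {
    if (timer !== undefined) clearTimeout(timer);
    timer = undefined;
  };
  return Object.assign(run, { cancel });
}

// Hypothetical editor state: cancel any pending save before switching
// paths so stale content is never written to the old location.
let lastFilePath = "/notes/a.md";
const save = debounce((_path: string) => {
  /* write file contents at `_path` */
}, 500);

function onFilePathUpdated(newPath: string) {
  save.cancel();          // drop the save queued for the old path
  lastFilePath = newPath; // keep the ref in sync (cf. lastFilePathRef)
}
```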
kangfenmao
82ca35fc29 chore: update issue template names by removing language specification
- Modified the names of the issue templates for bug reports, feature requests, and other questions to remove the "(English)" suffix, simplifying the titles for better clarity.
2025-10-31 21:29:29 +08:00
kangfenmao
fe53b0914a feat: add enterprise section and remove license from AboutSettings
- Introduced an "Enterprise" section in the i18n files for English, Simplified Chinese, and Traditional Chinese.
- Removed the "License" section from the AboutSettings component, replacing it with a link to the enterprise website.
- Updated icons in the AboutSettings component to reflect the new structure.
2025-10-31 21:28:30 +08:00
defi-failure
67a379641f fix(agent): resolve edit modal loading race condition (#11084)
* fix(agent): resolve edit modal loading race condition

* fix(i18n): Auto update translations for PR #11084

---------

Co-authored-by: GitHub Action <action@github.com>
2025-10-31 21:00:35 +08:00
kangfenmao
9dbc6fbf67 Revert "feat: add route lazy-loading components to improve page load performance (#11042)"
This reverts commit dd8690b592.
2025-10-31 18:54:48 +08:00
kangfenmao
8da43ab794 chore: update release notes for v1.7.0-beta.3
- Added new features including an enhanced tool permission system, plugin management, and support for various AI models.
- Improved UI elements and agent creation processes.
- Fixed multiple bugs related to session models, assistant activation, and various API integrations.
- Updated version in package.json to v1.7.0-beta.3.
2025-10-31 17:20:41 +08:00
槑囿脑袋
2a06c606e1 feat: restore data to mobile App (#10108)
* feat: restore data to App

* fix: i18n check

* fix: lint

* Change WebSocket service port to 11451

- Update default port from 3000 to 11451 for WebSocket connections
- Maintain existing service structure and client connection handling

* Add local IP address to WebSocket server configuration

- Set server path using local IP address for improved network accessibility
- Maintain existing CORS policy with wildcard origin
- Keep backward compatibility with current connection handling

* Remove local IP path and enforce WebSocket transport

- Replace dynamic local IP path with static WebSocket transport configuration
- Maintain CORS policy with wildcard origin for cross-origin connections
- Ensure reliable WebSocket-only communication by disabling fallback transports

* Add detailed logging to WebSocket connection flow

- Enhance WebSocketService with verbose connection logging including transport type and client count
- Add comprehensive logging in ExportToPhoneLanPopup for WebSocket initialization and status tracking
- Improve error handling with null checks for main window before sending events

* Add engine-level WebSocket connection monitoring

- Add initial_headers event listener to log connection attempts with URL and headers
- Add engine connection event to log established connections with remote addresses
- Add startup logs for server binding and allowed transports

* chore: change to use 7017 port

* Improve local IP address selection with interface priority system

- Implement network interface priority ranking to prefer Ethernet/Wi-Fi over virtual/VPN interfaces
- Add detailed logging for interface discovery and selection process
- Remove websocket-only transport restriction for broader client compatibility
- Clean up unused parameter in initial_headers event handler

* Add VPN interface patterns for Tailscale and WireGuard

- Include Tailscale VPN interfaces in network interface filtering
- Add WireGuard VPN interfaces to low-priority network candidates
- Maintain existing VPN tunnel interface patterns for compatibility

* Add network interface prioritization for QR code generation

- Implement `getAllCandidates()` method to scan and prioritize network interfaces by type (Ethernet/Wi-Fi over VPN/virtual interfaces)
- Update QR code payload to include all candidate IPs with priority rankings instead of single host
- Add comprehensive interface pattern matching for macOS, Windows, and Linux systems
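The interface prioritization described above might look like the following sketch. The pattern lists and rank values are assumptions, not the actual patterns shipped in the PR:

```typescript
import os from "node:os";

// Rank VPN/virtual interfaces last, wired Ethernet and Wi-Fi first.
const LOW_PRIORITY = [/^tun/, /^tap/, /^utun/, /^tailscale/, /^wg/, /vEthernet/i, /VMware/i];

interface Candidate { name: string; address: string; priority: number }

function rankInterface(name: string): number {
  if (LOW_PRIORITY.some((re) => re.test(name))) return 100; // VPN/virtual: last
  if (/^(en|eth|wlan|Wi-Fi|Ethernet)/i.test(name)) return 0; // wired/Wi-Fi: first
  return 50; // anything else falls in between
}

// Scan all external IPv4 interfaces and sort them by priority so the
// QR payload can list every candidate, best first.
function getAllCandidates(): Candidate[] {
  const candidates: Candidate[] = [];
  for (const [name, infos] of Object.entries(os.networkInterfaces())) {
    for (const info of infos ?? []) {
      if (info.family === "IPv4" && !info.internal) {
        candidates.push({ name, address: info.address, priority: rankInterface(name) });
      }
    }
  }
  return candidates.sort((a, b) => a.priority - b.priority);
}
```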

* Add WebSocket getAllCandidates IPC channel

- Add new WebSocket_GetAllCandidates enum value to IpcChannel
- Register getAllCandidates handler in main process IPC
- Expose getAllCandidates method in preload script API

* Add WebSocket connection logging and temporary test button

- Add URL and method logging to WebSocket engine connection events
- Implement Socket.IO connect and connect_error event handlers with logging
- Add temporary test button to force connection status for debugging

* Clean up WebSocket logging and remove debug code

- Remove verbose debug logs from WebSocket service and connection handling
- Consolidate connection logging into single informative messages
- Remove temporary test button and force connection functionality from UI
- Add missing "sending" translation key for export button loading state

* Enhance file transfer with progress tracking and improved UI

- Add transfer speed monitoring and formatted file size display in WebSocket service
- Implement detailed connection and transfer state management in UI component
- Improve visual feedback with status indicators, progress bars, and error handling
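The formatted size and speed readouts mentioned above can be sketched like this; the helper names and unit thresholds are assumptions:

```typescript
// Human-readable byte count: 512 -> "512 B", 2048 -> "2.0 KB".
function formatBytes(bytes: number): string {
  const units = ["B", "KB", "MB", "GB"];
  let value = bytes;
  let i = 0;
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024;
    i++;
  }
  return `${value.toFixed(i === 0 ? 0 : 1)} ${units[i]}`;
}

// Transfer speed: bytes moved since the last sample over elapsed time.
function transferSpeed(deltaBytes: number, deltaMs: number): string {
  if (deltaMs <= 0) return "0 B/s";
  return `${formatBytes((deltaBytes * 1000) / deltaMs)}/s`;
}
```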

* Enhance WebSocket service and LAN export UI with improved logging and user experience

- Add detailed WebSocket server configuration with transports, CORS, and timeout settings
- Implement comprehensive connection logging at both Socket.IO and Engine.IO levels
- Refactor export popup with modular components, status indicators, and i18n support

* Remove redundant logging during WebSocket connection

* Remove dot indicator from connection status component

- Simplify status style map by removing unused dot color properties
- Delete dot indicator element from connection status display
- Maintain existing border and background color styling for status states

* Refactor ExportToPhoneLanPopup with dedicated UI components and improved UX

- Extract QR code display states into separate components (LoadingQRCode, ScanQRCode, ConnectingAnimation, ConnectedDisplay, ErrorQRCode)
- Add confirmation dialog when attempting to close during active file transfer
- Improve WebSocket cleanup and modal dismissal behavior with proper connection handling

* Remove close button hiding during QR code generation

- Eliminate `hideCloseButton={isSending}` prop to keep close button visible
- Maintain consistent modal behavior throughout export process
- Prevent user confusion by ensuring close option remains available

* auto close

* Extract auto-close countdown into separate component

- Move auto-close countdown logic from TransferProgress to dedicated AutoCloseCountdown component
- Update styling to use paddingTop instead of marginTop for better spacing
- Clean up TransferProgress dependencies by removing autoCloseCountdown

* Add translations for LAN transfer, including the auto-close hint and close-confirmation message

---------

Co-authored-by: suyao <sy20010504@gmail.com>
2025-10-31 16:48:09 +08:00
MyPrototypeWhat
b6dcf2f5fa Feat/add skill tool (#11051)
* feat: add SkillTool component and integrate into agent tools

- Introduced SkillTool component for rendering skill-related functionality.
- Updated MessageAgentTools to include SkillTool in the tool renderers.
- Enhanced MessageTool to recognize 'Skill' as a valid agent tool type.
- Modified handleUserMessage to conditionally handle text blocks based on skill inclusion.
- Added SkillToolInput and SkillToolOutput types for better type safety.

* feat: implement command tag filtering in message handling

- Added filterCommandTags function to remove command-* tags from text content, ensuring internal command messages do not appear in the user-facing UI.
- Updated handleUserMessage to utilize the new filtering logic, enhancing the handling of text blocks and improving user experience by preventing unwanted command messages from being displayed.
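A `filterCommandTags` helper like the one described might be sketched as follows; the exact tag grammar is an assumption based on the `command-*` naming in the commit message:

```typescript
// Strip <command-*>...</command-*> blocks and self-closing command
// tags so internal command messages never reach the user-facing UI.
function filterCommandTags(text: string): string {
  return text
    .replace(/<command-[\w-]+>[\s\S]*?<\/command-[\w-]+>/g, "")
    .replace(/<command-[\w-]+\s*\/>/g, "")
    .trim();
}
```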

* refactor: rename tool prefix constants for clarity

- Updated variable names for tool prefixes in MessageTool and SkillTool components to enhance code readability.
- Changed `prefix` to `builtinToolsPrefix` and `agentPrefix` to `agentMcpToolsPrefix` for better understanding of their purpose.
2025-10-31 16:31:50 +08:00
defi-failure
68e0d8b0f1 feat: add confirmation modal for activating protocol-installed MCP (#11070)
* feat: add confirmation modal for activating protocol-installed MCP

* fix: sync i18n

* fix(i18n): Auto update translations for PR #11070

* chore: verify ci is working

* Revert "chore: verify ci is working"

This reverts commit a2434a397d.

---------

Co-authored-by: GitHub Action <action@github.com>
2025-10-31 16:05:02 +08:00
LiuVaayne
7f1c234ac1 fix(ClaudeCodeService): update environment variable names for models (#11073) 2025-10-31 14:46:24 +08:00
Phantom
c1fd23742f fix: activate assistant/agent when creating new (#11009)
* refactor: remove unused SWITCH_ASSISTANT event and related code

Clean up unused event and associated listener in HomePage component

* feat(agents): improve agent handling and state management

- Return result from useUpdateAgent hook
- Update useActiveTopic to handle null assistantId
- Add state management for active agent and topic in Tabs
- Implement afterSubmit callback in AgentModal
- Refactor agent press handling in AssistantsTab
- Clean up HomePage state management logic
- Add afterCreate callback in UnifiedAddButton

* refactor(agent): update agent and session update functions to return entities

Modify update functions in useUpdateAgent and useUpdateSession hooks to return updated entities.
Update related components to handle the new return types and adjust type definitions accordingly.

* refactor(hooks): simplify active topic hook by using useAssistant

* refactor(agent): consolidate agent update types and functions

Move UpdateAgentBaseOptions and related function types from hooks/agents/types.ts to types/agent.ts
Update components to use new UpdateAgentFunctionUnion type
Simplify component props by removing redundant type definitions

* refactor(agent): update type for plugin settings update function

* refactor(AgentSettings): simplify tooling settings type definitions

Remove unused hooks and use direct type imports instead of ReturnType
2025-10-31 14:41:07 +08:00
LiuVaayne
d792bf7fe0 🐛 fix: resolve tool approval UI and shared workspace plugin inconsistency (#11043)
* fix(ToolPermissionRequestCard): simplify button rendering by removing suggestion handling

* feat: add CachedPluginsDataSchema for plugin cache file

- Add Zod schema for .claude/plugins.json cache file format
- Schema includes version, lastUpdated timestamp, and plugins array
- Reuses existing InstalledPluginSchema for type safety
- Cache will store metadata for all installed plugins

* feat: add cache management methods to PluginService

- Add readCacheFile() to read .claude/plugins.json
- Add writeCacheFile() for atomic cache writes (temp + rename)
- Add rebuildCache() to scan filesystem and rebuild cache
- Add listInstalledFromCache() to load plugins from cache with fallback
- Add updateCache() helper for transactional cache updates
- All methods handle missing/corrupt cache gracefully
- Cache auto-regenerates from filesystem if needed
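The temp-file-then-rename write and the graceful read fallback described above can be sketched like this (file layout follows the `.claude/plugins.json` path named in the commit; everything else is illustrative):

```typescript
import { promises as fs } from "node:fs";
import path from "node:path";

// Atomic write: write to a temp file, then rename over the target.
// rename() is atomic on the same filesystem, so readers never observe
// a half-written cache.
async function writeCacheFile(workdir: string, data: unknown): Promise<void> {
  const target = path.join(workdir, ".claude", "plugins.json");
  const tmp = `${target}.tmp`;
  await fs.mkdir(path.dirname(target), { recursive: true });
  await fs.writeFile(tmp, JSON.stringify(data, null, 2), "utf8");
  await fs.rename(tmp, target);
}

// Graceful read: missing or corrupt cache yields null so callers can
// rebuild from the filesystem instead of crashing.
async function readCacheFile(workdir: string): Promise<unknown | null> {
  try {
    const raw = await fs.readFile(path.join(workdir, ".claude", "plugins.json"), "utf8");
    return JSON.parse(raw);
  } catch {
    return null;
  }
}
```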

* feat: integrate cache loading in AgentService.getAgent()

- Add installed_plugins field to GetAgentResponseSchema
- Load plugins from cache via PluginService.listInstalledFromCache()
- Gracefully handle errors by returning empty array
- Use loggerService for error logging

* 🐛 fix: break circular dependency causing infinite loop in cache methods

- Change cache method signatures from agentId to workdir parameter
- Update listInstalledFromCache(workdir) to accept workdir directly
- Update rebuildCache(workdir) to accept workdir directly
- Update updateCache(workdir, updater) to accept workdir directly
- AgentService.getAgent() now passes accessible_paths[0] to cache methods
- Removes AgentService.getAgent() calls from PluginService methods
- Fixes infinite recursion bug where methods called each other endlessly

Breaking the circular dependency:
BEFORE: AgentService.getAgent() → PluginService.listInstalledFromCache(id)
        → AgentService.getAgent(id) [INFINITE LOOP]
AFTER:  AgentService.getAgent() → PluginService.listInstalledFromCache(workdir)
        [NO MORE RECURSION]

* 🐛 fix: update listInstalled() to use agent.installed_plugins

- Change from agent.configuration.installed_plugins (old DB location)
- To agent.installed_plugins (new top-level field from cache)
- Simplify validation logic to use existing plugin structure
- Fixes UI not showing installed plugins correctly

This was causing the UI to show empty plugin lists even though plugins
were correctly loaded in the cache by AgentService.getAgent().

* ♻️ refactor: remove unused updateCache helper

* ♻️ refactor: centralize plugin directory helpers

* feat: Implement Plugin Management System

- Added PluginCacheStore for managing plugin metadata and caching.
- Introduced PluginInstaller for handling installation and uninstallation of plugins.
- Created PluginService to manage plugin lifecycle, including installation, uninstallation, and listing of available plugins.
- Enhanced AgentService to integrate with PluginService for loading installed plugins.
- Implemented validation and sanitization for plugin file names and paths to prevent security issues.
- Added support for skills as a new plugin type, including installation and management.
- Introduced caching mechanism for available plugins to improve performance.

* ♻️ refactor: simplify PluginInstaller and PluginService by removing agent dependency and updating plugin handling
2025-10-31 14:30:50 +08:00
亢奋猫
f8a599322f feat(useAppInit): implement automatic update checks with interval sup… (#11063)
feat(useAppInit): implement automatic update checks with interval support

- Added a function to check for updates, which is called initially and set to run every 6 hours if the app is packaged and auto-update is enabled.
- Refactored the initial update check to utilize the new function for better code organization and clarity.
2025-10-31 13:35:27 +08:00
defi-failure
aa810a7ead fix: notify renderer when api server ready (#11049)
* fix: notify renderer when api server ready

* chore: minor comment update

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* fix: minor ui change to reflect server loading state

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-31 12:13:59 +08:00
Pleasure1234
b586e1796e fix: sort grouped items by saved tags order from Redux (#11065)
Updated useUnifiedGrouping to sort grouped items by the tag order saved in the Redux store, placing untagged items first. This improves consistency with user-defined tag ordering.
2025-10-31 11:21:10 +08:00
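The ordering rule can be sketched as follows; the group shape and the name `sortGroupsByTagOrder` are assumptions, with the saved order standing in for the Redux-persisted tag list:

```typescript
// Sort groups by the user's saved tag order; untagged groups come
// first, and tags missing from the saved order fall to the end.
function sortGroupsByTagOrder(
  groups: { tag: string | null; items: string[] }[],
  savedTagOrder: string[]
): { tag: string | null; items: string[] }[] {
  const rank = (tag: string | null): number => {
    if (tag === null) return -1; // untagged first
    const i = savedTagOrder.indexOf(tag);
    return i === -1 ? savedTagOrder.length : i; // unknown tags last
  };
  return [...groups].sort((a, b) => rank(a.tag) - rank(b.tag));
}
```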
George·Dong
fa2ec69fa9 fix(SettingsTab): Context slider inconsistent (#10943)
* fix(i18n): standardize "max" translation to indicate unlimited

* feat(SettingsTab): add current context

* feat(settings): show proper "max" label for context count

* fix(settings): simplify contextCount value expression

* feat(settings): make context count editable with number input
2025-10-30 20:15:35 +08:00
SuYao
dd8690b592 feat: add route lazy-loading components to improve page load performance (#11042) 2025-10-30 16:41:07 +08:00
MyPrototypeWhat
09e6b9741e fix: update GlobTool to count lines instead of files in output (#11036) 2025-10-30 16:14:04 +08:00
891 changed files with 6484 additions and 5581 deletions


@@ -1,4 +1,4 @@
-name: 🐛 Bug Report (English)
+name: 🐛 Bug Report
 description: Create a report to help us improve
 title: '[Bug]: '
 labels: ['BUG']


@@ -1,4 +1,4 @@
-name: 💡 Feature Request (English)
+name: 💡 Feature Request
 description: Suggest an idea for this project
 title: '[Feature]: '
 labels: ['feature']


@@ -1,4 +1,4 @@
-name: 🤔 Other Questions (English)
+name: 🤔 Other Questions
 description: Submit questions that don't fit into bug reports or feature requests
 title: '[Other]: '
 body:

.github/pr-modules.yml

@@ -1,160 +0,0 @@
# Module → path-matching globs and GitHub reviewer lists
# When multiple modules match, the highest-priority one becomes the primary category; the rest are shown as "related modules" in the card
categories:
ai_core:
name: "AI Core"
globs:
- "packages/aiCore/**"
- "src/renderer/src/aiCore/**"
github_reviewers: ["DeJeune", "MyPrototypeWhat", "Vaayne"]
agent:
name: "Agent"
globs:
- "packages/shared/agents/**"
- "resources/data/agents-*.json"
- "src/renderer/src/api/agent.ts"
- "src/renderer/src/types/agent.ts"
- "src/renderer/src/utils/agentSession.ts"
- "src/renderer/src/services/db/AgentMessageDataSource.ts"
- "src/renderer/src/hooks/agents/**"
- "src/renderer/src/components/Popups/agent/**"
- "src/renderer/src/pages/home/**/Agent*.tsx"
- "src/renderer/src/pages/settings/AgentSettings/**"
- "src/main/services/agents/**"
- "src/main/apiServer/routes/agents/**"
github_reviewers: ["EurFelux", "Vaayne", "DeJeune"]
provider:
name: "Provider"
globs:
- "src/renderer/src/config/providers.ts"
- "src/renderer/src/config/preprocessProviders.ts"
- "src/renderer/src/config/webSearchProviders.ts"
- "src/renderer/src/hooks/useWebSearchProviders.ts"
- "src/renderer/src/providers/**"
- "src/renderer/src/pages/settings/ProviderSettings/**"
- "src/renderer/src/pages/settings/WebSearchSettings/**"
- "src/renderer/src/pages/settings/DocProcessSettings/PreprocessProviderSettings.tsx"
- "src/renderer/src/pages/settings/MCPSettings/providers/**"
- "src/renderer/src/assets/images/providers/**"
github_reviewers: ["YinsenHo", "kangfenmao", "alephpiece"]
backend:
name: "后端/平台"
globs:
- "src/main/apiServer/**"
- "src/main/services/**"
- "src/main/*.ts"
- "src/preload/**"
- "src/main/mcpServers/**"
github_reviewers: ["beyondkmp", "Vaayne", "kangfenmao"]
knowledge:
name: "知识库"
globs:
- "src/main/knowledge/**"
- "src/renderer/src/pages/knowledge/**"
- "src/renderer/src/store/knowledge.ts"
- "src/renderer/src/queue/KnowledgeQueue.ts"
github_reviewers: ["eeee0717", "alephpiece", "GeorgeDong32"]
data_storage:
name: "数据与存储"
globs:
- "src/renderer/src/databases/**"
- "src/renderer/src/services/db/**"
- "src/main/services/agents/database/**"
- "resources/database/drizzle/**"
- "src/renderer/src/store/migrate.ts"
- "src/renderer/src/databases/upgrades.ts"
github_reviewers: ["0xfullex", "kangfenmao", "Vaayne", "DeJeune"]
backup_export:
name: "备份/导出"
globs:
- "src/renderer/src/components/*Backup*"
- "src/renderer/src/components/Webdav*"
- "src/renderer/src/components/ObsidianExportDialog.tsx"
- "src/renderer/src/components/S3*"
- "src/renderer/src/store/backup.ts"
- "src/renderer/src/store/nutstore.ts"
- "src/renderer/src/pages/settings/DataSettings/**"
github_reviewers: ["beyondkmp", "GeorgeDong32"]
minapps:
name: "小程序"
globs:
- "src/renderer/src/pages/minapps/**"
- "src/renderer/src/store/minapps.ts"
- "src/renderer/src/config/minapps.ts"
github_reviewers: ["GeorgeDong32", "beyondkmp"]
chat:
name: "对话"
globs:
- "src/renderer/src/pages/home/**"
- "src/renderer/src/store/newMessage.ts"
- "src/renderer/src/store/messageBlock.ts"
- "src/renderer/src/store/memory.ts"
- "src/renderer/src/store/llm.ts"
github_reviewers: ["kangfenmao", "alephpiece", "EurFelux"]
draw:
name: "绘图"
globs:
- "src/renderer/src/pages/paintings/**"
- "src/renderer/src/store/paintings.ts"
github_reviewers: ["EurFelux", "DeJeune"]
uiux:
name: "UI/UX"
globs:
- "src/renderer/src/components/**"
- "src/renderer/src/ui/**"
- "src/renderer/src/assets/styles/**"
- "src/renderer/src/windows/**"
github_reviewers: ["kangfenmao", "MyPrototypeWhat", "alephpiece"]
build-config:
name: "构建/配置"
globs:
- "package.json"
- "tsconfig*.json"
- "electron-builder.yml"
- "electron.vite.config.ts"
- "vitest.config.ts"
- "playwright.config.ts"
- ".github/workflows/**"
- "scripts/**"
github_reviewers: ["kangfenmao", "beyondkmp", "alephpiece"]
test:
name: "测试"
globs:
- "tests/**"
- "src/**/__tests__/**"
- "scripts/__tests__/**"
github_reviewers: ["alephpiece", "DeJeune", "EurFelux"]
docs:
name: "文档"
globs:
- "docs/**"
- "README*.md"
- "SECURITY.md"
- "CODE_OF_CONDUCT.md"
- "AGENTS.md"
github_reviewers: ["kangfenmao", "0xfullex", "EurFelux"]
rules:
vendor_added:
# Mandatory reviewers when a new vendor is added
github_reviewers: ["YinsenHo"]
large_change:
# Major-change threshold (triggered when changed file count > changed_files_gt)
changed_files_gt: 30
github_reviewers: ["kangfenmao"]


@@ -1,455 +0,0 @@
{
"generatedAt": "2025-10-29T06:19:19.098Z",
"suggestions": {
"ai_core": [
{
"github": "SuYao",
"name": "SuYao",
"email": "sy20010504@gmail.com",
"commits": 33
},
{
"github": "MyPrototypeWhat",
"name": "MyPrototypeWhat",
"email": "daoquqiexing@gmail.com",
"commits": 12
},
{
"github": "Vaayne",
"name": "Vaayne",
"email": "liu.vaayne@gmail.com",
"commits": 10
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 9
},
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 6
}
],
"agent": [
{
"github": "icarus",
"name": "icarus",
"email": "eurfelux@gmail.com",
"commits": 152
},
{
"github": "Vaayne",
"name": "Vaayne",
"email": "liu.vaayne@gmail.com",
"commits": 80
},
{
"github": "suyao",
"name": "suyao",
"email": "sy20010504@gmail.com",
"commits": 29
},
{
"github": "defi-failure",
"name": "defi-failure",
"email": "159208748+defi-failure@users.noreply.github.com",
"commits": 8
},
{
"github": "Phantom",
"name": "Phantom",
"email": "eurfelux@gmail.com",
"commits": 5
}
],
"provider": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 53
},
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 32
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 30
},
{
"github": "SuYao",
"name": "SuYao",
"email": "sy20010504@gmail.com",
"commits": 14
},
{
"github": "eeee0717",
"name": "Chen Tao",
"email": "70054568+eeee0717@users.noreply.github.com",
"commits": 10
}
],
"backend": [
{
"github": "beyondkmp",
"name": "beyondkmp",
"email": "beyondkmp@gmail.com",
"commits": 99
},
{
"github": "Vaayne",
"name": "Vaayne",
"email": "liu.vaayne@gmail.com",
"commits": 96
},
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 84
},
{
"github": "0xfullex",
"name": "fullex",
"email": "106392080+0xfullex@users.noreply.github.com",
"commits": 49
},
{
"github": "vaayne",
"name": "LiuVaayne",
"email": "10231735+vaayne@users.noreply.github.com",
"commits": 33
}
],
"knowledge": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 20
},
{
"github": "eeee0717",
"name": "Chen Tao",
"email": "70054568+eeee0717@users.noreply.github.com",
"commits": 13
},
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 8
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 6
},
{
"github": "beyondkmp",
"name": "beyondkmp",
"email": "beyondkmp@gmail.com",
"commits": 5
}
],
"data_storage": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 63
},
{
"github": "Vaayne",
"name": "Vaayne",
"email": "liu.vaayne@gmail.com",
"commits": 21
},
{
"github": "SuYao",
"name": "SuYao",
"email": "sy20010504@gmail.com",
"commits": 20
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 17
},
{
"github": "suyao",
"name": "suyao",
"email": "sy20010504@gmail.com",
"commits": 13
}
],
"backup_export": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 23
},
{
"github": "beyondkmp",
"name": "beyondkmp",
"email": "beyondkmp@gmail.com",
"commits": 12
},
{
"github": "GeorgeDong32",
"name": "George·Dong",
"email": "98630204+GeorgeDong32@users.noreply.github.com",
"commits": 9
},
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 5
},
{
"github": "0xfullex",
"name": "fullex",
"email": "106392080+0xfullex@users.noreply.github.com",
"commits": 5
}
],
"minapps": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 12
},
{
"github": "beyondkmp",
"name": "beyondkmp",
"email": "beyondkmp@gmail.com",
"commits": 5
},
{
"github": "GeorgeDong32",
"name": "George·Dong",
"email": "98630204+GeorgeDong32@users.noreply.github.com",
"commits": 4
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 4
},
{
"github": "0xfullex",
"name": "fullex",
"email": "106392080+0xfullex@users.noreply.github.com",
"commits": 3
}
],
"chat": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 189
},
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 86
},
{
"github": "icarus",
"name": "icarus",
"email": "eurfelux@gmail.com",
"commits": 85
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 52
},
{
"github": "SuYao",
"name": "SuYao",
"email": "sy20010504@gmail.com",
"commits": 48
}
],
"draw": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 8
},
{
"github": "jin-wang-c",
"name": "Caelan",
"email": "79105826+jin-wang-c@users.noreply.github.com",
"commits": 7
},
{
"github": "DDU1222",
"name": "chenxue",
"email": "DDU1222@users.noreply.github.com",
"commits": 6
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 5
},
{
"github": "0xfullex",
"name": "fullex",
"email": "106392080+0xfullex@users.noreply.github.com",
"commits": 4
}
],
"uiux": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 109
},
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 89
},
{
"github": "icarus",
"name": "icarus",
"email": "eurfelux@gmail.com",
"commits": 35
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 32
},
{
"github": "0xfullex",
"name": "fullex",
"email": "106392080+0xfullex@users.noreply.github.com",
"commits": 24
}
],
"build-config": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 170
},
{
"github": "beyondkmp",
"name": "beyondkmp",
"email": "beyondkmp@gmail.com",
"commits": 65
},
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 40
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 34
},
{
"github": "SuYao",
"name": "SuYao",
"email": "sy20010504@gmail.com",
"commits": 32
}
],
"test": [
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 45
},
{
"github": "SuYao",
"name": "SuYao",
"email": "sy20010504@gmail.com",
"commits": 27
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 20
},
{
"github": "Vaayne",
"name": "Vaayne",
"email": "liu.vaayne@gmail.com",
"commits": 19
},
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 18
}
],
"docs": [
{
"github": "kangfenmao",
"name": "kangfenmao",
"email": "kangfenmao@qq.com",
"commits": 18
},
{
"github": "0xfullex",
"name": "fullex",
"email": "106392080+0xfullex@users.noreply.github.com",
"commits": 7
},
{
"github": "EurFelux",
"name": "Phantom",
"email": "59059173+EurFelux@users.noreply.github.com",
"commits": 6
},
{
"github": "one",
"name": "one",
"email": "wangan.cs@gmail.com",
"commits": 6
},
{
"github": "sunrise0o0",
"name": "牡丹凤凰",
"email": "87239270+sunrise0o0@users.noreply.github.com",
"commits": 4
}
],
"vendor_added": [],
"large_change": []
}
}


@@ -1,285 +0,0 @@
name: GitHub PR Tracker with Feishu Notification
on:
pull_request:
types: [opened, ready_for_review, review_requested, reopened]
schedule:
# Run every day at 8:30 Beijing Time (00:30 UTC)
- cron: '30 0 * * *'
workflow_dispatch:
jobs:
process-new-pr:
if: github.event_name == 'pull_request'
runs-on: ubuntu-latest
permissions:
pull-requests: write
contents: read
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Check PR conditions
id: check_pr
uses: actions/github-script@v7
with:
script: |
const pr = context.payload.pull_request;
// Check if PR is draft
if (pr.draft) {
console.log('⏭️ PR is in draft state, skipping notification');
core.setOutput('should_notify', 'false');
core.setOutput('skip_reason', 'draft');
return;
}
// We will notify regardless of whether reviewers/assignees are set
console.log('✅ PR meets notification criteria');
core.setOutput('should_notify', 'true');
// Prepare reviewer and assignee lists
const reviewers = (pr.requested_reviewers || []).map(r => r.login).join(',');
const assignees = (pr.assignees || []).map(a => a.login).join(',');
core.setOutput('reviewers', reviewers);
core.setOutput('assignees', assignees);
- name: Check Beijing Time
if: steps.check_pr.outputs.should_notify == 'true'
id: check_time
run: |
# Get current time in Beijing timezone (UTC+8)
BEIJING_HOUR=$(TZ='Asia/Shanghai' date +%H)
BEIJING_MINUTE=$(TZ='Asia/Shanghai' date +%M)
echo "Beijing Time: ${BEIJING_HOUR}:${BEIJING_MINUTE}"
# Check if time is between 00:00 and 08:30
if [ $BEIJING_HOUR -lt 8 ] || ([ $BEIJING_HOUR -eq 8 ] && [ $BEIJING_MINUTE -le 30 ]); then
echo "should_delay=true" >> $GITHUB_OUTPUT
echo "⏰ PR created during quiet hours (00:00-08:30 Beijing Time)"
echo "Will schedule notification for 08:30"
else
echo "should_delay=false" >> $GITHUB_OUTPUT
echo "✅ PR created during active hours, will notify immediately"
fi
- name: Add pending label if in quiet hours
if: steps.check_pr.outputs.should_notify == 'true' && steps.check_time.outputs.should_delay == 'true'
uses: actions/github-script@v7
with:
script: |
const pr = context.payload.pull_request;
github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: pr.number,
labels: ['pending-feishu-pr-notification']
});
- name: Setup Node.js
if: steps.check_pr.outputs.should_notify == 'true' && steps.check_time.outputs.should_delay == 'false'
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Process PR with Claude
if: steps.check_pr.outputs.should_notify == 'true' && steps.check_time.outputs.should_delay == 'false'
uses: anthropics/claude-code-action@main
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
allowed_non_write_users: "*"
anthropic_api_key: ${{ secrets.CLAUDE_TRANSLATOR_APIKEY }}
claude_args: "--allowed-tools Bash(gh pr:*),Bash(node scripts/feishu-pr-notify.js)"
prompt: |
You are a GitHub Pull Request automation assistant. Complete the following tasks:
## Current PR info
- PR number: #${{ github.event.pull_request.number }}
- Title: ${{ github.event.pull_request.title }}
- Author: ${{ github.event.pull_request.user.login }}
- URL: ${{ github.event.pull_request.html_url }}
- Body: ${{ github.event.pull_request.body }}
- Labels: ${{ join(github.event.pull_request.labels.*.name, ', ') }}
- Changed files: ${{ github.event.pull_request.changed_files }}
- Additions: ${{ github.event.pull_request.additions }}
- Deletions: ${{ github.event.pull_request.deletions }}
- Reviewers: ${{ steps.check_pr.outputs.reviewers }}
- Assignees: ${{ steps.check_pr.outputs.assignees }}
## Task steps
1. **Analyze the PR changes**
First, get the PR's changed-file list with:
```bash
gh pr view ${{ github.event.pull_request.number }} --json files --jq '.files[].path'
```
2. **Categorize the PR**
Determine the PR type from the changed file paths and the PR title/description. If multiple modules match, output `multiple`; if none match, output `other`:
- chat (conversation): src/renderer/src/pages/home/**, src/renderer/src/store/(newMessage|messageBlock|memory).ts
- draw (painting): src/renderer/src/pages/paintings/**, src/renderer/src/store/paintings.ts
- uiux (UI/UX): src/renderer/src/components/**, src/renderer/src/ui/**, src/renderer/src/assets/styles/**, src/renderer/src/windows/**
- knowledge (knowledge base): src/main/knowledge/**, src/renderer/src/pages/knowledge/**, src/renderer/src/store/knowledge.ts, src/renderer/src/queue/KnowledgeQueue.ts
- minapps (mini apps): src/renderer/src/pages/minapps/**, src/renderer/src/store/minapps.ts, src/renderer/src/config/minapps.ts
- backup_export (backup/export): src/renderer/src/components/*Backup*, src/renderer/src/components/Webdav*, src/renderer/src/components/ObsidianExportDialog.tsx, src/renderer/src/components/S3*, src/renderer/src/store/(backup|nutstore).ts
- data_storage (data & storage): src/renderer/src/databases/**, src/renderer/src/services/db/**, src/main/services/agents/database/**, resources/database/drizzle/**, src/renderer/src/store/migrate.ts, src/renderer/src/databases/upgrades.ts
- ai_core (AI infrastructure): packages/aiCore/**, src/renderer/src/aiCore/**
- backend (backend/platform): src/main/apiServer/**, src/main/services/**, src/main/*.ts, src/preload/**, src/main/mcpServers/**
- agent (Agent): packages/shared/agents/**, resources/data/agents-*.json, src/renderer/src/api/agent.ts, src/renderer/src/types/agent.ts, src/renderer/src/utils/agentSession.ts, src/renderer/src/services/db/AgentMessageDataSource.ts, src/renderer/src/hooks/agents/**, src/renderer/src/components/Popups/agent/**, src/renderer/src/pages/home/**/Agent*.tsx, src/renderer/src/pages/settings/AgentSettings/**, src/main/services/agents/**, src/main/apiServer/routes/agents/**
- provider (Provider): src/renderer/src/config/(providers|preprocessProviders|webSearchProviders).ts, src/renderer/src/hooks/useWebSearchProviders.ts, src/renderer/src/providers/**, src/renderer/src/pages/settings/ProviderSettings/**, src/renderer/src/pages/settings/WebSearchSettings/**, src/renderer/src/pages/settings/DocProcessSettings/(OcrProviderSettings|PreprocessProviderSettings).tsx, src/renderer/src/pages/settings/MCPSettings/providers/**, src/renderer/src/assets/images/providers/**, src/main/services/urlschema/handle-providers.ts
- build-config (build/config): package.json, tsconfig*.json, electron-builder.yml, electron.vite.config.ts, vitest.config.ts, playwright.config.ts, .github/workflows/**, scripts/**
- test (testing): tests/**, src/**/__tests__/**, scripts/__tests__/**
- docs (documentation): docs/**, README*.md, SECURITY.md, CODE_OF_CONDUCT.md, AGENTS.md
2.1 **Detect whether the PR adds a provider**
If any of the following conditions holds, treat the PR as adding a provider and set PR_VENDOR_ADDED=true; otherwise set it to false:
- The changed files include:
- src/renderer/src/config/providers.ts
- src/renderer/src/providers/**
- packages/aiCore/**/provider/** or packages/aiCore/src/provider/**
- resources/data/agents-*.json
- Or the PR title/description contains one of the keywords "供应商", "厂商", "provider", "新增", "集成" (vendor, manufacturer, provider, add, integrate)
3. **Summarize the PR**
Provide a concise summary in Simplified Chinese (2-3 sentences), covering:
- The main changes in the PR
- The core feature or fix
- Important technical details or scope of impact
4. **Send the Feishu notification**
Send the Feishu notification with:
```bash
PR_URL="${{ github.event.pull_request.html_url }}" \
PR_NUMBER="${{ github.event.pull_request.number }}" \
PR_TITLE="${{ github.event.pull_request.title }}" \
PR_AUTHOR="${{ github.event.pull_request.user.login }}" \
PR_LABELS="${{ join(github.event.pull_request.labels.*.name, ',') }}" \
PR_SUMMARY="<your generated Chinese summary>" \
PR_REVIEWERS="${{ steps.check_pr.outputs.reviewers }}" \
PR_ASSIGNEES="${{ steps.check_pr.outputs.assignees }}" \
PR_CATEGORY="<the PR type you determined>" \
PR_VENDOR_ADDED="<true or false>" \
PR_CHANGED_FILES="${{ github.event.pull_request.changed_files }}" \
PR_ADDITIONS="${{ github.event.pull_request.additions }}" \
PR_DELETIONS="${{ github.event.pull_request.deletions }}" \
node scripts/feishu-pr-notify.js
```
## Notes
- The summary must be written in Simplified Chinese
- Escape special characters correctly when passing PR_SUMMARY and the other parameters
- PR_CATEGORY must be one of the types defined above
- If the PR body is empty, still provide a brief description
Now start the task!
env:
ANTHROPIC_BASE_URL: ${{ secrets.CLAUDE_TRANSLATOR_BASEURL }}
FEISHU_WEBHOOK_URL: ${{ secrets.FEISHU_WEBHOOK_URL }}
FEISHU_WEBHOOK_SECRET: ${{ secrets.FEISHU_WEBHOOK_SECRET }}
FEISHU_USER_MAPPING: ${{ secrets.FEISHU_USER_MAPPING }}
process-pending-prs:
if: github.event_name == 'schedule' || github.event_name == 'workflow_dispatch'
runs-on: ubuntu-latest
permissions:
pull-requests: write
contents: read
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Process pending PRs with Claude
uses: anthropics/claude-code-action@main
with:
anthropic_api_key: ${{ secrets.CLAUDE_TRANSLATOR_APIKEY }}
allowed_non_write_users: "*"
github_token: ${{ secrets.GITHUB_TOKEN }}
claude_args: "--allowed-tools Bash(gh pr:*),Bash(gh api:*),Bash(node scripts/feishu-pr-notify.js)"
prompt: |
You are a GitHub Pull Request automation assistant. Complete the following tasks:
## Task description
Process every GitHub PR waiting on a Feishu notification (PRs labeled `pending-feishu-pr-notification`).
## Steps
1. **Fetch the pending PRs**
Get all PRs carrying the `pending-feishu-pr-notification` label with:
```bash
gh api repos/${{ github.repository }}/pulls?state=open | jq '.[] | select(.labels[]?.name == "pending-feishu-pr-notification")'
```
2. **Validate PR conditions**
For each PR, check:
- It is still not in draft state
- It has reviewers or assignees
- If the conditions are not met, remove the label and skip it
3. **Analyze and categorize the PR**
Get the PR's file changes:
```bash
gh pr view <PR number> --json files --jq '.files[].path'
```
Determine the PR type from the file paths: chat/draw/uiux/knowledge/minapps/backup_export/data_storage/ai_core/backend/agent/provider/docs/build-config/test/multiple/other
4. **Summarize each PR**
Provide a concise summary in Chinese (2-3 sentences), covering:
- The main changes in the PR
- The core feature or fix
- Important technical details
5. **Send the Feishu notification**
Send the notification with:
```bash
PR_URL="<the PR's html_url>" \
PR_NUMBER="<PR number>" \
PR_TITLE="<PR title>" \
PR_AUTHOR="<PR author>" \
PR_LABELS="<comma-separated labels, excluding pending-feishu-pr-notification>" \
PR_SUMMARY="<your generated Chinese summary>" \
PR_REVIEWERS="<comma-separated reviewers>" \
PR_ASSIGNEES="<comma-separated assignees>" \
PR_CATEGORY="<PR type>" \
PR_VENDOR_ADDED="<true or false>" \
PR_CHANGED_FILES="<number of changed files>" \
PR_ADDITIONS="<lines added>" \
PR_DELETIONS="<lines deleted>" \
node scripts/feishu-pr-notify.js
```
6. **Remove the label**
After sending successfully, remove the label:
```bash
gh api -X DELETE repos/${{ github.repository }}/issues/<PR number>/labels/pending-feishu-pr-notification
```
## Environment variables
- Repository: ${{ github.repository }}
- The Feishu webhook URL and secret are already configured
## Notes
- If there are no pending PRs, print a notice and finish
- When processing multiple PRs, wait 2-3 seconds between each
- A failure on one PR must not abort the whole run
- Write all summaries in Simplified Chinese
Now start the task!
env:
ANTHROPIC_BASE_URL: ${{ secrets.CLAUDE_TRANSLATOR_BASEURL }}
FEISHU_WEBHOOK_URL: ${{ secrets.FEISHU_WEBHOOK_URL }}
FEISHU_WEBHOOK_SECRET: ${{ secrets.FEISHU_WEBHOOK_SECRET }}
FEISHU_USER_MAPPING: ${{ secrets.FEISHU_USER_MAPPING }}
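The quiet-hours gate in the workflow above hinges on one boundary condition: 08:30 Beijing time is still inside quiet hours, 08:31 is not. A minimal sketch of that check as a pure function (the function name is ours, for illustration only):

```typescript
// Sketch of the workflow's quiet-hours check (00:00-08:30 Beijing time,
// 08:30 inclusive), mirroring the shell test
// `[ $H -lt 8 ] || ([ $H -eq 8 ] && [ $M -le 30 ])`.
function shouldDelay(hour: number, minute: number): boolean {
  return hour < 8 || (hour === 8 && minute <= 30)
}

console.log(shouldDelay(8, 30)) // quiet hours: delay the notification
console.log(shouldDelay(8, 31)) // active hours: notify immediately
```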


@@ -140,7 +140,7 @@
"typescript/await-thenable": "warn",
// "typescript/ban-ts-comment": "error",
"typescript/no-array-constructor": "error",
// "typescript/consistent-type-imports": "error",
"typescript/consistent-type-imports": "error",
"typescript/no-array-delete": "warn",
"typescript/no-base-to-string": "warn",
"typescript/no-duplicate-enum-values": "error",
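Enabling `typescript/consistent-type-imports` above is what drives the long run of `import { X }` to `import type { X }` rewrites in the diffs further down. A minimal sketch of the distinction, using a local stand-in type rather than the real packages:

```typescript
// `import type` is fully erased from the emitted JavaScript, so a
// type-only dependency can never pull in runtime side effects or
// create require-time cycles. The rule flags value-style imports that
// are only used in type positions, e.g.:
//   flagged:   import { LanguageModelV2Middleware } from '@ai-sdk/provider'
//   rewritten: import type { LanguageModelV2Middleware } from '@ai-sdk/provider'

// Local stand-in showing the same type-position vs value-position split:
type Middleware = { name: string; wrapModelId: (id: string) => string }

const logging: Middleware = {
  name: 'logging',
  wrapModelId: (id) => `logging(${id})`
}

console.log(logging.wrapModelId('claude-3.5-sonnet'))
```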


@@ -1,71 +0,0 @@
diff --git a/dist/utils/tiktoken.cjs b/dist/utils/tiktoken.cjs
index 973b0d0e75aeaf8de579419af31b879b32975413..f23c7caa8b9dc8bd404132725346a4786f6b278b 100644
--- a/dist/utils/tiktoken.cjs
+++ b/dist/utils/tiktoken.cjs
@@ -1,25 +1,14 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.encodingForModel = exports.getEncoding = void 0;
-const lite_1 = require("js-tiktoken/lite");
const async_caller_js_1 = require("./async_caller.cjs");
const cache = {};
const caller = /* #__PURE__ */ new async_caller_js_1.AsyncCaller({});
async function getEncoding(encoding) {
- if (!(encoding in cache)) {
- cache[encoding] = caller
- .fetch(`https://tiktoken.pages.dev/js/${encoding}.json`)
- .then((res) => res.json())
- .then((data) => new lite_1.Tiktoken(data))
- .catch((e) => {
- delete cache[encoding];
- throw e;
- });
- }
- return await cache[encoding];
+ throw new Error("TikToken Not implemented");
}
exports.getEncoding = getEncoding;
async function encodingForModel(model) {
- return getEncoding((0, lite_1.getEncodingNameForModel)(model));
+ throw new Error("TikToken Not implemented");
}
exports.encodingForModel = encodingForModel;
diff --git a/dist/utils/tiktoken.js b/dist/utils/tiktoken.js
index 8e41ee6f00f2f9c7fa2c59fa2b2f4297634b97aa..aa5f314a6349ad0d1c5aea8631a56aad099176e0 100644
--- a/dist/utils/tiktoken.js
+++ b/dist/utils/tiktoken.js
@@ -1,20 +1,9 @@
-import { Tiktoken, getEncodingNameForModel, } from "js-tiktoken/lite";
import { AsyncCaller } from "./async_caller.js";
const cache = {};
const caller = /* #__PURE__ */ new AsyncCaller({});
export async function getEncoding(encoding) {
- if (!(encoding in cache)) {
- cache[encoding] = caller
- .fetch(`https://tiktoken.pages.dev/js/${encoding}.json`)
- .then((res) => res.json())
- .then((data) => new Tiktoken(data))
- .catch((e) => {
- delete cache[encoding];
- throw e;
- });
- }
- return await cache[encoding];
+ throw new Error("TikToken Not implemented");
}
export async function encodingForModel(model) {
- return getEncoding(getEncodingNameForModel(model));
+ throw new Error("TikToken Not implemented");
}
diff --git a/package.json b/package.json
index 36072aecf700fca1bc49832a19be832eca726103..90b8922fba1c3d1b26f78477c891b07816d6238a 100644
--- a/package.json
+++ b/package.json
@@ -37,7 +37,6 @@
"ansi-styles": "^5.0.0",
"camelcase": "6",
"decamelize": "1.2.0",
- "js-tiktoken": "^1.0.12",
"langsmith": ">=0.2.8 <0.4.0",
"mustache": "^4.2.0",
"p-queue": "^6.6.2",


@@ -0,0 +1,68 @@
diff --git a/dist/utils/tiktoken.cjs b/dist/utils/tiktoken.cjs
index c5b41f121d2e3d24c3a4969e31fa1acffdcad3b9..ec724489dcae79ee6c61acf2d4d84bd19daef036 100644
--- a/dist/utils/tiktoken.cjs
+++ b/dist/utils/tiktoken.cjs
@@ -1,6 +1,5 @@
const require_rolldown_runtime = require('../_virtual/rolldown_runtime.cjs');
const require_utils_async_caller = require('./async_caller.cjs');
-const js_tiktoken_lite = require_rolldown_runtime.__toESM(require("js-tiktoken/lite"));
//#region src/utils/tiktoken.ts
var tiktoken_exports = {};
@@ -11,14 +10,10 @@ require_rolldown_runtime.__export(tiktoken_exports, {
const cache = {};
const caller = /* @__PURE__ */ new require_utils_async_caller.AsyncCaller({});
async function getEncoding(encoding) {
- if (!(encoding in cache)) cache[encoding] = caller.fetch(`https://tiktoken.pages.dev/js/${encoding}.json`).then((res) => res.json()).then((data) => new js_tiktoken_lite.Tiktoken(data)).catch((e) => {
- delete cache[encoding];
- throw e;
- });
- return await cache[encoding];
+ throw new Error("TikToken Not implemented");
}
async function encodingForModel(model) {
- return getEncoding((0, js_tiktoken_lite.getEncodingNameForModel)(model));
+ throw new Error("TikToken Not implemented");
}
//#endregion
diff --git a/dist/utils/tiktoken.js b/dist/utils/tiktoken.js
index 641acca03cb92f04a6fa5c9c31f1880ce635572e..707389970ad957aa0ff20ef37fa8dd2875be737c 100644
--- a/dist/utils/tiktoken.js
+++ b/dist/utils/tiktoken.js
@@ -1,6 +1,5 @@
import { __export } from "../_virtual/rolldown_runtime.js";
import { AsyncCaller } from "./async_caller.js";
-import { Tiktoken, getEncodingNameForModel } from "js-tiktoken/lite";
//#region src/utils/tiktoken.ts
var tiktoken_exports = {};
@@ -11,14 +10,10 @@ __export(tiktoken_exports, {
const cache = {};
const caller = /* @__PURE__ */ new AsyncCaller({});
async function getEncoding(encoding) {
- if (!(encoding in cache)) cache[encoding] = caller.fetch(`https://tiktoken.pages.dev/js/${encoding}.json`).then((res) => res.json()).then((data) => new Tiktoken(data)).catch((e) => {
- delete cache[encoding];
- throw e;
- });
- return await cache[encoding];
+ throw new Error("TikToken Not implemented");
}
async function encodingForModel(model) {
- return getEncoding(getEncodingNameForModel(model));
+ throw new Error("TikToken Not implemented");
}
//#endregion
diff --git a/package.json b/package.json
index a24f8fc61de58526051999260f2ebee5f136354b..e885359e8966e7730c51772533ce37e01edb3046 100644
--- a/package.json
+++ b/package.json
@@ -20,7 +20,6 @@
"ansi-styles": "^5.0.0",
"camelcase": "6",
"decamelize": "1.2.0",
- "js-tiktoken": "^1.0.12",
"langsmith": "^0.3.64",
"mustache": "^4.2.0",
"p-queue": "^6.6.2",
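Both tiktoken patches replace the lazy network fetch of encoding data (from tiktoken.pages.dev) with a hard throw, so any remaining caller needs its own guard. A hedged caller-side sketch; the character-based fallback heuristic is our assumption for illustration, not something the patch provides:

```typescript
// Caller-side guard for the patched getEncoding, which now always throws.
type Encoder = { encode(text: string): number[] }

async function countTokens(
  text: string,
  getEncoding: (name: string) => Promise<Encoder>
): Promise<number> {
  try {
    const enc = await getEncoding('cl100k_base')
    return enc.encode(text).length
  } catch {
    // Hypothetical fallback: roughly 4 characters per token for English.
    return Math.ceil(text.length / 4)
  }
}

// Simulate the patched module: every call throws.
const patched = async (): Promise<Encoder> => {
  throw new Error('TikToken Not implemented')
}
countTokens('hello world', patched).then((n) => console.log(n)) // 3
```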


@@ -1,19 +0,0 @@
diff --git a/dist/embeddings.js b/dist/embeddings.js
index 1f8154be3e9c22442a915eb4b85fa6d2a21b0d0c..dc13ef4a30e6c282824a5357bcee9bd0ae222aab 100644
--- a/dist/embeddings.js
+++ b/dist/embeddings.js
@@ -214,10 +214,12 @@ export class OpenAIEmbeddings extends Embeddings {
* @returns Promise that resolves to an embedding for the document.
*/
async embedQuery(text) {
+ const isBaiduCloud = this.clientConfig.baseURL.includes('baidubce.com')
+ const input = this.stripNewLines ? text.replace(/\n/g, ' ') : text
const params = {
model: this.model,
- input: this.stripNewLines ? text.replace(/\n/g, " ") : text,
- };
+ input: isBaiduCloud ? [input] : input
+ }
if (this.dimensions) {
params.dimensions = this.dimensions;
}


@@ -0,0 +1,17 @@
diff --git a/dist/embeddings.js b/dist/embeddings.js
index 6f4b928d3e4717309382e1b5c2e31ab5bc6c5af0..bc79429c88a6d27d4997a2740c4d8ae0707f5991 100644
--- a/dist/embeddings.js
+++ b/dist/embeddings.js
@@ -94,9 +94,11 @@ var OpenAIEmbeddings = class extends Embeddings {
* @returns Promise that resolves to an embedding for the document.
*/
async embedQuery(text) {
+ const isBaiduCloud = this.clientConfig.baseURL.includes('baidubce.com');
+ const input = this.stripNewLines ? text.replace(/\n/g, " ") : text
const params = {
model: this.model,
- input: this.stripNewLines ? text.replace(/\n/g, " ") : text
+ input: isBaiduCloud ? [input] : input
};
if (this.dimensions) params.dimensions = this.dimensions;
if (this.encodingFormat) params.encoding_format = this.encodingFormat;
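The embeddings patch above special-cases Baidu Cloud endpoints, whose OpenAI-compatible embeddings API expects `input` as an array rather than a bare string. The core of the change, extracted into a standalone sketch (the function name and parameter defaults are ours):

```typescript
// Standalone version of the patched embedQuery parameter building.
// Hosts containing 'baidubce.com' get the input wrapped in an array;
// other OpenAI-compatible providers accept a plain string.
function buildEmbedInput(
  baseURL: string,
  text: string,
  stripNewLines = true
): string | string[] {
  const isBaiduCloud = baseURL.includes('baidubce.com')
  const input = stripNewLines ? text.replace(/\n/g, ' ') : text
  return isBaiduCloud ? [input] : input
}

console.log(buildEmbedInput('https://qianfan.baidubce.com/v2', 'a\nb')) // [ 'a b' ]
console.log(buildEmbedInput('https://api.openai.com/v1', 'a\nb'))      // 'a b'
```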


@@ -23,7 +23,7 @@
},
"files": {
"ignoreUnknown": false,
"includes": ["**"],
"includes": ["**", "!**/.claude/**"],
"maxSize": 2097152
},
"formatter": {


@@ -133,60 +133,116 @@ artifactBuildCompleted: scripts/artifact-build-completed.js
releaseInfo:
releaseNotes: |
<!--LANG:en-->
What's New in v1.7.0-beta.2
What's New in v1.7.0-beta.3
New Features:
- Session Settings: Manage session-specific settings and model configurations independently
- Notes Full-Text Search: Search across all notes with match highlighting
- Built-in DiDi MCP Server: Integration with DiDi ride-hailing services (China only)
- Intel OV OCR: Hardware-accelerated OCR using Intel NPU
- Auto-start API Server: Automatically starts when agents exist
- Enhanced Tool Permission System: Real-time tool approval interface with improved UX
- Plugin Management System: Support for Claude Agent plugins (agents, commands, skills)
- Skill Tool: Add skill execution capabilities for agents
- Mobile App Data Restore: Support restoring data to mobile applications
- OpenMinerU Preprocessor: Knowledge base now supports open-source MinerU for document processing
- HuggingFace Provider: Added HuggingFace as AI provider
- Claude Haiku 4.5: Support for the latest Claude Haiku 4.5 model
- Ling Series Models: Added support for Ling-1T and related models
- Intel OVMS Painting: New painting provider using Intel OpenVINO Model Server
- Automatic Update Checks: Implement periodic update checking with configurable intervals
- HuggingChat Mini App: New mini app for HuggingChat integration
Improvements:
- Agent model selection now requires explicit user choice
- Added Mistral AI provider support
- Added NewAPI generic provider support
- Improved navbar layout consistency across different modes
- Enhanced chat component responsiveness
- Better code block display on small screens
- Updated OVMS to 2025.3 official release
- Added Greek language support
- Agent Creation: New agents are now automatically activated upon creation
- Lazy Loading: Optimize page load performance with route lazy loading
- UI Enhancements: Improved agent item styling and layout consistency
- Navigation: Better navbar layout for fullscreen mode on macOS
- Settings Tab: Enhanced context slider consistency
- Backup Manager: Unified footer layout for local and S3 backup managers
- Menu System: Enhanced application menu with improved help section
- Proxy Rules: Comprehensive proxy bypass rule matching
- German Language: Added German language support
- MCP Confirmation: Added confirmation modal when activating protocol-installed MCP servers
- Translation: Enhanced translation script with concurrency and validation
- Electron & Vite: Updated to Electron 38 and Vite 4.0.1
Claude Code Tool Improvements:
- GlobTool: Now counts lines instead of files in output for better clarity
- ReadTool: Automatically removes system reminder tags from output
- TodoWriteTool: Improved rendering behavior
- Environment Variables: Updated model-related environment variable names
Bug Fixes:
- Fixed GitHub Copilot gpt-5-codex streaming issues
- Fixed assistant creation failures
- Fixed translate auto-copy functionality
- Fixed miniapps external link opening
- Fixed message layout and overflow issues
- Fixed API key parsing to preserve spaces
- Fixed agent display in different navbar layouts
- Fixed session model not being used when sending messages
- Fixed tool approval UI and shared workspace plugin inconsistencies
- Fixed API server readiness notification to renderer
- Fixed grouped items not respecting saved tag order
- Fixed assistant/agent activation when creating new ones
- Fixed Dashscope Anthropic API host and migrated old configs
- Fixed Qwen3 thinking mode control for Ollama
- Fixed disappeared MCP button
- Fixed create assistant causing blank screen
- Fixed up-down button visibility in some cases
- Fixed hooks preventing save on composing enter key
- Fixed Azure GPT-image-1 and OpenRouter Gemini-image
- Fixed Silicon reasoning issues
- Fixed topic branch incomplete copy with two-pass ID mapping
- Fixed deep research model search context restrictions
- Fixed model capability checking logic
- Fixed reranker API error response capture
- Fixed right-click paste file content into inputbar
- Fixed minimax-m2 support in aiCore
<!--LANG:zh-CN-->
v1.7.0-beta.2 新特性
v1.7.0-beta.3 新特性
新功能:
- 会话设置:独立管理会话特定的设置和模型配置
- 笔记全文搜索:跨所有笔记搜索并高亮匹配内容
- 内置滴滴 MCP 服务器:集成滴滴打车服务(仅限中国地区)
- Intel OV OCR使用 Intel NPU 的硬件加速 OCR
- 自动启动 API 服务器:当存在 Agent 时自动启动
- 增强工具权限系统:实时工具审批界面,改进用户体验
- 插件管理系统:支持 Claude Agent 插件agents、commands、skills
- 技能工具:为 Agent 添加技能执行能力
- 移动应用数据恢复:支持将数据恢复到移动应用程序
- OpenMinerU 预处理器:知识库现支持使用开源 MinerU 处理文档
- HuggingFace 提供商:添加 HuggingFace 作为 AI 提供商
- Claude Haiku 4.5:支持最新的 Claude Haiku 4.5 模型
- Ling 系列模型:添加 Ling-1T 及相关模型支持
- Intel OVMS 绘图:使用 Intel OpenVINO 模型服务器的新绘图提供商
- 自动更新检查:实现可配置间隔的定期更新检查
- HuggingChat 小程序:新增 HuggingChat 集成小程序
改进:
- Agent 模型选择现在需要用户显式选择
- 添加 Mistral AI 提供商支持
- 添加 NewAPI 通用提供商支持
- 改进不同模式下的导航栏布局一致性
- 增强聊天组件响应式设计
- 优化小屏幕代码块显示
- 更新 OVMS 至 2025.3 正式版
- 添加希腊语支持
- Agent 创建:新创建的 Agent 现在会自动激活
- 懒加载:通过路由懒加载优化页面加载性能
- UI 增强:改进 Agent 项目样式和布局一致性
- 导航:改进 macOS 全屏模式下的导航栏布局
- 设置选项卡:增强上下文滑块一致性
- 备份管理器:统一本地和 S3 备份管理器的页脚布局
- 菜单系统:增强应用菜单,改进帮助部分
- 代理规则:全面的代理绕过规则匹配
- 德语支持:添加德语语言支持
- MCP 确认:添加激活协议安装的 MCP 服务器时的确认模态框
- 翻译:增强翻译脚本的并发和验证功能
- Electron & Vite更新至 Electron 38 和 Vite 4.0.1
Claude Code 工具改进:
- GlobTool现在计算行数而不是文件数提供更清晰的输出
- ReadTool自动从输出中移除系统提醒标签
- TodoWriteTool改进渲染行为
- 环境变量:更新模型相关的环境变量名称
问题修复:
- 修复 GitHub Copilot gpt-5-codex 流式传输问题
- 修复助手创建失败
- 修复翻译自动复制功能
- 修复小程序外部链接打开
- 修复消息布局和溢出问题
- 修复 API 密钥解析以保留空格
- 修复不同导航栏布局中的 Agent 显示
- 修复发送消息时未使用会话模型
- 修复工具审批 UI 和共享工作区插件不一致
- 修复 API 服务器就绪通知到渲染器
- 修复分组项目不遵守已保存标签顺序
- 修复创建新的助手/Agent 时的激活问题
- 修复 Dashscope Anthropic API 主机并迁移旧配置
- 修复 Ollama 的 Qwen3 思考模式控制
- 修复 MCP 按钮消失
- 修复创建助手导致空白屏幕
- 修复某些情况下上下按钮可见性
- 修复钩子在输入法输入时阻止保存
- 修复 Azure GPT-image-1 和 OpenRouter Gemini-image
- 修复 Silicon 推理问题
- 修复主题分支不完整复制,采用两阶段 ID 映射
- 修复深度研究模型搜索上下文限制
- 修复模型能力检查逻辑
- 修复 reranker API 错误响应捕获
- 修复右键粘贴文件内容到输入栏
- 修复 aiCore 中的 minimax-m2 支持
<!--LANG:END-->


@@ -1,6 +1,6 @@
{
"name": "CherryStudio",
"version": "1.7.0-beta.2",
"version": "1.7.0-beta.3",
"private": true,
"description": "A powerful AI assistant for producer.",
"main": "./out/main/index.js",
@@ -92,8 +92,10 @@
"node-stream-zip": "^1.15.0",
"officeparser": "^4.2.0",
"os-proxy-config": "^1.1.2",
"qrcode.react": "^4.2.0",
"selection-hook": "^1.0.12",
"sharp": "^0.34.3",
"socket.io": "^4.8.1",
"swagger-jsdoc": "^6.2.8",
"swagger-ui-express": "^5.0.1",
"tesseract.js": "patch:tesseract.js@npm%3A6.0.1#~/.yarn/patches/tesseract.js-npm-6.0.1-2562a7e46d.patch",
@@ -146,7 +148,9 @@
"@hello-pangea/dnd": "^18.0.1",
"@heroui/react": "^2.8.3",
"@kangfenmao/keyv-storage": "^0.1.0",
"@langchain/community": "^0.3.50",
"@langchain/community": "^1.0.0",
"@langchain/core": "patch:@langchain/core@npm%3A1.0.2#~/.yarn/patches/@langchain-core-npm-1.0.2-183ef83fe4.patch",
"@langchain/openai": "patch:@langchain/openai@npm%3A1.0.0#~/.yarn/patches/@langchain-openai-npm-1.0.0-474d0ad9d4.patch",
"@mistralai/mistralai": "^1.7.5",
"@modelcontextprotocol/sdk": "^1.17.5",
"@mozilla/readability": "^0.6.0",
@@ -372,9 +376,7 @@
"@codemirror/language": "6.11.3",
"@codemirror/lint": "6.8.5",
"@codemirror/view": "6.38.1",
"@langchain/core@npm:^0.3.26": "patch:@langchain/core@npm%3A0.3.44#~/.yarn/patches/@langchain-core-npm-0.3.44-41d5c3cb0a.patch",
"@langchain/openai@npm:^0.3.16": "patch:@langchain/openai@npm%3A0.3.16#~/.yarn/patches/@langchain-openai-npm-0.3.16-e525b59526.patch",
"@langchain/openai@npm:>=0.1.0 <0.4.0": "patch:@langchain/openai@npm%3A0.3.16#~/.yarn/patches/@langchain-openai-npm-0.3.16-e525b59526.patch",
"@langchain/core@npm:^0.3.26": "patch:@langchain/core@npm%3A1.0.2#~/.yarn/patches/@langchain-core-npm-1.0.2-183ef83fe4.patch",
"app-builder-lib@npm:26.0.13": "patch:app-builder-lib@npm%3A26.0.13#~/.yarn/patches/app-builder-lib-npm-26.0.13-a064c9e1d0.patch",
"app-builder-lib@npm:26.0.15": "patch:app-builder-lib@npm%3A26.0.15#~/.yarn/patches/app-builder-lib-npm-26.0.15-360e5b0476.patch",
"atomically@npm:^1.7.0": "patch:atomically@npm%3A1.7.0#~/.yarn/patches/atomically-npm-1.7.0-e742e5293b.patch",
@@ -398,7 +400,10 @@
"@img/sharp-linux-arm64": "0.34.3",
"@img/sharp-linux-x64": "0.34.3",
"@img/sharp-win32-x64": "0.34.3",
"openai@npm:5.12.2": "npm:@cherrystudio/openai@6.5.0"
"openai@npm:5.12.2": "npm:@cherrystudio/openai@6.5.0",
"@langchain/openai@npm:>=0.1.0 <0.6.0": "patch:@langchain/openai@npm%3A1.0.0#~/.yarn/patches/@langchain-openai-npm-1.0.0-474d0ad9d4.patch",
"@langchain/openai@npm:^0.3.16": "patch:@langchain/openai@npm%3A1.0.0#~/.yarn/patches/@langchain-openai-npm-1.0.0-474d0ad9d4.patch",
"@langchain/openai@npm:>=0.2.0 <0.7.0": "patch:@langchain/openai@npm%3A1.0.0#~/.yarn/patches/@langchain-openai-npm-1.0.0-474d0ad9d4.patch"
},
"packageManager": "yarn@4.9.1",
"lint-staged": {


@@ -2,7 +2,7 @@
* Middleware manager
* Focused on managing AI SDK middlewares, kept separate from the plugin system
*/
import { LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { LanguageModelV2Middleware } from '@ai-sdk/provider'
/**
* 创建中间件列表


@@ -1,7 +1,7 @@
/**
* Type definitions for the middleware system
*/
import { LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { LanguageModelV2Middleware } from '@ai-sdk/provider'
/**
* 具名中间件接口


@@ -2,7 +2,7 @@
* Model wrapping utilities
* Used to apply middlewares to a LanguageModel
*/
import { LanguageModelV2, LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { LanguageModelV2, LanguageModelV2Middleware } from '@ai-sdk/provider'
import { wrapLanguageModel } from 'ai'
/**


@@ -5,7 +5,7 @@
* Integrates the special-case handling logic from ModelCreator
*/
import { EmbeddingModelV2, ImageModelV2, LanguageModelV2, LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { EmbeddingModelV2, ImageModelV2, LanguageModelV2, LanguageModelV2Middleware } from '@ai-sdk/provider'
import { wrapModelWithMiddlewares } from '../middleware/wrapper'
import { DEFAULT_SEPARATOR, globalRegistryManagement } from '../providers/RegistryManagement'


@@ -1,7 +1,7 @@
/**
* Type definitions for the Creation module
*/
import { LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { ProviderId, ProviderSettingsMap } from '../providers/types'


@@ -1,4 +1,4 @@
import { ExtractProviderOptions, ProviderOptionsMap, TypedProviderOptions } from './types'
import type { ExtractProviderOptions, ProviderOptionsMap, TypedProviderOptions } from './types'
/**
* Create provider-specific options


@@ -10,7 +10,7 @@ import type { AiRequestContext } from '../../types'
import { StreamEventManager } from './StreamEventManager'
import { type TagConfig, TagExtractor } from './tagExtraction'
import { ToolExecutor } from './ToolExecutor'
import { PromptToolUseConfig, ToolUseResult } from './type'
import type { PromptToolUseConfig, ToolUseResult } from './type'
/**
* Tool-use tag configuration


@@ -1,6 +1,6 @@
import { ToolSet } from 'ai'
import type { ToolSet } from 'ai'
import { AiRequestContext } from '../..'
import type { AiRequestContext } from '../..'
/**
* Parse result type


@@ -1,10 +1,11 @@
import { anthropic } from '@ai-sdk/anthropic'
import { google } from '@ai-sdk/google'
import { openai } from '@ai-sdk/openai'
import { InferToolInput, InferToolOutput, type Tool } from 'ai'
import type { anthropic } from '@ai-sdk/anthropic'
import type { google } from '@ai-sdk/google'
import type { openai } from '@ai-sdk/openai'
import type { InferToolInput, InferToolOutput } from 'ai'
import { type Tool } from 'ai'
import { ProviderOptionsMap } from '../../../options/types'
import { OpenRouterSearchConfig } from './openrouter'
import type { ProviderOptionsMap } from '../../../options/types'
import type { OpenRouterSearchConfig } from './openrouter'
/**
* Extract parameter types from the AI SDK tool helpers to ensure type safety.


@@ -9,7 +9,8 @@ import { openai } from '@ai-sdk/openai'
import { createOpenRouterOptions, createXaiOptions, mergeProviderOptions } from '../../../options'
import { definePlugin } from '../../'
import type { AiRequestContext } from '../../types'
import { DEFAULT_WEB_SEARCH_CONFIG, WebSearchPluginConfig } from './helper'
import type { WebSearchPluginConfig } from './helper'
import { DEFAULT_WEB_SEARCH_CONFIG } from './helper'
/**
* Web search plugin


@@ -1,4 +1,4 @@
import { AiPlugin, AiRequestContext } from './types'
import type { AiPlugin, AiRequestContext } from './types'
/**
* Plugin manager


@@ -5,7 +5,7 @@
* e.g. aihubmix:anthropic:claude-3.5-sonnet
*/
import { ProviderV2 } from '@ai-sdk/provider'
import type { ProviderV2 } from '@ai-sdk/provider'
import { customProvider } from 'ai'
import { globalRegistryManagement } from './RegistryManagement'


@@ -4,7 +4,7 @@
* Built on the AI SDK's native createProviderRegistry
*/
import { EmbeddingModelV2, ImageModelV2, LanguageModelV2, ProviderV2 } from '@ai-sdk/provider'
import type { EmbeddingModelV2, ImageModelV2, LanguageModelV2, ProviderV2 } from '@ai-sdk/provider'
import { createProviderRegistry, type ProviderRegistryProvider } from 'ai'
type PROVIDERS = Record<string, ProviderV2>


@@ -10,10 +10,11 @@ import { createGoogleGenerativeAI } from '@ai-sdk/google'
import { createHuggingFace } from '@ai-sdk/huggingface'
import { createOpenAI, type OpenAIProviderSettings } from '@ai-sdk/openai'
import { createOpenAICompatible } from '@ai-sdk/openai-compatible'
import { LanguageModelV2 } from '@ai-sdk/provider'
import type { LanguageModelV2 } from '@ai-sdk/provider'
import { createXai } from '@ai-sdk/xai'
import { createOpenRouter } from '@openrouter/ai-sdk-provider'
import { customProvider, Provider } from 'ai'
import type { Provider } from 'ai'
import { customProvider } from 'ai'
import * as z from 'zod'
/**


@@ -4,7 +4,7 @@ import { type DeepSeekProviderSettings } from '@ai-sdk/deepseek'
import { type GoogleGenerativeAIProviderSettings } from '@ai-sdk/google'
import { type OpenAIProviderSettings } from '@ai-sdk/openai'
import { type OpenAICompatibleProviderSettings } from '@ai-sdk/openai-compatible'
import {
import type {
EmbeddingModelV2 as EmbeddingModel,
ImageModelV2 as ImageModel,
LanguageModelV2 as LanguageModel,


@@ -1,4 +1,4 @@
import { ImageModelV2 } from '@ai-sdk/provider'
import type { ImageModelV2 } from '@ai-sdk/provider'
import { experimental_generateImage as aiGenerateImage, NoImageGeneratedError } from 'ai'
import { beforeEach, describe, expect, it, vi } from 'vitest'


@@ -2,12 +2,12 @@
* Runtime executor
* Focused on plugin-based AI call handling
*/
import { ImageModelV2, LanguageModelV2, LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { ImageModelV2, LanguageModelV2, LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { LanguageModel } from 'ai'
import {
experimental_generateImage as _generateImage,
generateObject as _generateObject,
generateText as _generateText,
LanguageModel,
streamObject as _streamObject,
streamText as _streamText
} from 'ai'


@@ -11,7 +11,7 @@ export type { RuntimeConfig } from './types'
// === Convenience factory functions ===
import { LanguageModelV2Middleware } from '@ai-sdk/provider'
import type { LanguageModelV2Middleware } from '@ai-sdk/provider'
import { type AiPlugin } from '../plugins'
import { type ProviderId, type ProviderSettingsMap } from '../providers/types'


@@ -1,6 +1,13 @@
/* eslint-disable @eslint-react/naming-convention/context-name */
-import { ImageModelV2 } from '@ai-sdk/provider'
-import { experimental_generateImage, generateObject, generateText, LanguageModel, streamObject, streamText } from 'ai'
+import type { ImageModelV2 } from '@ai-sdk/provider'
+import type {
+  experimental_generateImage,
+  generateObject,
+  generateText,
+  LanguageModel,
+  streamObject,
+  streamText
+} from 'ai'
import { type AiPlugin, createContext, PluginManager } from '../plugins'
import { type ProviderId } from '../providers/types'

View File

@@ -1,8 +1,8 @@
/**
* Runtime layer type definitions
*/
-import { ImageModelV2 } from '@ai-sdk/provider'
-import { experimental_generateImage, generateObject, generateText, streamObject, streamText } from 'ai'
+import type { ImageModelV2 } from '@ai-sdk/provider'
+import type { experimental_generateImage, generateObject, generateText, streamObject, streamText } from 'ai'
import { type ModelConfig } from '../models/types'
import { type AiPlugin } from '../plugins'

View File

@@ -1,4 +1,5 @@
-import { Extension, Node } from '@tiptap/core'
+import type { Node } from '@tiptap/core'
+import { Extension } from '@tiptap/core'
import type { TableCellOptions } from '../cell/index.js'
import { TableCell } from '../cell/index.js'

View File

@@ -1,7 +1,7 @@
import { SpanKind, SpanStatusCode } from '@opentelemetry/api'
-import { ReadableSpan } from '@opentelemetry/sdk-trace-base'
+import type { ReadableSpan } from '@opentelemetry/sdk-trace-base'
-import { SpanEntity } from '../types/config'
+import type { SpanEntity } from '../types/config'
/**
* convert ReadableSpan to SpanEntity

View File

@@ -1,4 +1,4 @@
-import { ReadableSpan } from '@opentelemetry/sdk-trace-base'
+import type { ReadableSpan } from '@opentelemetry/sdk-trace-base'
export interface TraceCache {
createSpan: (span: ReadableSpan) => void

View File

@@ -1,5 +1,6 @@
-import { ExportResult, ExportResultCode } from '@opentelemetry/core'
-import { ReadableSpan, SpanExporter } from '@opentelemetry/sdk-trace-base'
+import type { ExportResult } from '@opentelemetry/core'
+import { ExportResultCode } from '@opentelemetry/core'
+import type { ReadableSpan, SpanExporter } from '@opentelemetry/sdk-trace-base'
export type SaveFunction = (spans: ReadableSpan[]) => Promise<void>

View File

@@ -1,7 +1,9 @@
-import { Context, trace } from '@opentelemetry/api'
-import { BatchSpanProcessor, BufferConfig, ReadableSpan, Span, SpanExporter } from '@opentelemetry/sdk-trace-base'
+import type { Context } from '@opentelemetry/api'
+import { trace } from '@opentelemetry/api'
+import type { BufferConfig, ReadableSpan, Span, SpanExporter } from '@opentelemetry/sdk-trace-base'
+import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base'
-import { TraceCache } from '../core/traceCache'
+import type { TraceCache } from '../core/traceCache'
export class CacheBatchSpanProcessor extends BatchSpanProcessor {
private cache: TraceCache

View File

@@ -1,6 +1,7 @@
-import { Context } from '@opentelemetry/api'
-import { BatchSpanProcessor, BufferConfig, ReadableSpan, Span, SpanExporter } from '@opentelemetry/sdk-trace-base'
-import { EventEmitter } from 'stream'
+import type { Context } from '@opentelemetry/api'
+import type { BufferConfig, ReadableSpan, Span, SpanExporter } from '@opentelemetry/sdk-trace-base'
+import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base'
+import type { EventEmitter } from 'stream'
import { convertSpanToSpanEntity } from '../core/spanConvert'

View File

@@ -1,5 +1,7 @@
-import { Context, trace } from '@opentelemetry/api'
-import { BatchSpanProcessor, BufferConfig, ReadableSpan, Span, SpanExporter } from '@opentelemetry/sdk-trace-base'
+import type { Context } from '@opentelemetry/api'
+import { trace } from '@opentelemetry/api'
+import type { BufferConfig, ReadableSpan, Span, SpanExporter } from '@opentelemetry/sdk-trace-base'
+import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base'
export type SpanFunction = (span: ReadableSpan) => void

View File

@@ -1,5 +1,5 @@
-import { Link } from '@opentelemetry/api'
-import { TimedEvent } from '@opentelemetry/sdk-trace-base'
+import type { Link } from '@opentelemetry/api'
+import type { TimedEvent } from '@opentelemetry/sdk-trace-base'
export type AttributeValue =
| string

View File

@@ -1,11 +1,14 @@
-import { trace, Tracer } from '@opentelemetry/api'
+import type { Tracer } from '@opentelemetry/api'
+import { trace } from '@opentelemetry/api'
import { AsyncLocalStorageContextManager } from '@opentelemetry/context-async-hooks'
import { W3CTraceContextPropagator } from '@opentelemetry/core'
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http'
-import { BatchSpanProcessor, ConsoleSpanExporter, SpanProcessor } from '@opentelemetry/sdk-trace-base'
+import type { SpanProcessor } from '@opentelemetry/sdk-trace-base'
+import { BatchSpanProcessor, ConsoleSpanExporter } from '@opentelemetry/sdk-trace-base'
import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node'
-import { defaultConfig, TraceConfig } from '../trace-core/types/config'
+import type { TraceConfig } from '../trace-core/types/config'
+import { defaultConfig } from '../trace-core/types/config'
export class NodeTracer {
private static provider: NodeTracerProvider

View File

@@ -1,4 +1,5 @@
-import { Context, ContextManager, ROOT_CONTEXT } from '@opentelemetry/api'
+import type { Context, ContextManager } from '@opentelemetry/api'
+import { ROOT_CONTEXT } from '@opentelemetry/api'
export class TopicContextManager implements ContextManager {
private topicContextStack: Map<string, Context[]>

View File

@@ -1,4 +1,5 @@
-import { Context, context } from '@opentelemetry/api'
+import type { Context } from '@opentelemetry/api'
+import { context } from '@opentelemetry/api'
const originalPromise = globalThis.Promise

View File

@@ -1,9 +1,11 @@
import { W3CTraceContextPropagator } from '@opentelemetry/core'
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http'
-import { BatchSpanProcessor, ConsoleSpanExporter, SpanProcessor } from '@opentelemetry/sdk-trace-base'
+import type { SpanProcessor } from '@opentelemetry/sdk-trace-base'
+import { BatchSpanProcessor, ConsoleSpanExporter } from '@opentelemetry/sdk-trace-base'
import { WebTracerProvider } from '@opentelemetry/sdk-trace-web'
-import { defaultConfig, TraceConfig } from '../trace-core/types/config'
+import type { TraceConfig } from '../trace-core/types/config'
+import { defaultConfig } from '../trace-core/types/config'
import { TopicContextManager } from './TopicContextManager'
export const contextManager = new TopicContextManager()

View File

@@ -322,6 +322,7 @@ export enum IpcChannel {
ApiServer_Stop = 'api-server:stop',
ApiServer_Restart = 'api-server:restart',
ApiServer_GetStatus = 'api-server:get-status',
+ApiServer_Ready = 'api-server:ready',
// NOTE: This API is not used.
ApiServer_GetConfig = 'api-server:get-config',
@@ -363,5 +364,12 @@ export enum IpcChannel {
ClaudeCodePlugin_ListInstalled = 'claudeCodePlugin:list-installed',
ClaudeCodePlugin_InvalidateCache = 'claudeCodePlugin:invalidate-cache',
ClaudeCodePlugin_ReadContent = 'claudeCodePlugin:read-content',
-ClaudeCodePlugin_WriteContent = 'claudeCodePlugin:write-content'
+ClaudeCodePlugin_WriteContent = 'claudeCodePlugin:write-content',
+// WebSocket
+WebSocket_Start = 'webSocket:start',
+WebSocket_Stop = 'webSocket:stop',
+WebSocket_Status = 'webSocket:status',
+WebSocket_SendFile = 'webSocket:send-file',
+WebSocket_GetAllCandidates = 'webSocket:get-all-candidates'
}

View File

@@ -9,9 +9,9 @@
*/
import Anthropic from '@anthropic-ai/sdk'
-import { TextBlockParam } from '@anthropic-ai/sdk/resources'
+import type { TextBlockParam } from '@anthropic-ai/sdk/resources'
import { loggerService } from '@logger'
-import { Provider } from '@types'
+import type { Provider } from '@types'
import type { ModelMessage } from 'ai'
const logger = loggerService.withContext('anthropic-sdk')

View File

@@ -1,4 +1,4 @@
-import { ProcessingStatus } from '@types'
+import type { ProcessingStatus } from '@types'
export type LoaderReturn = {
entriesAdded: number
@@ -31,3 +31,16 @@ export type WebviewKeyEvent = {
shift: boolean
alt: boolean
}
+export interface WebSocketStatusResponse {
+  isRunning: boolean
+  port?: number
+  ip?: string
+  clientConnected: boolean
+}
+export interface WebSocketCandidatesResponse {
+  host: string
+  interface: string
+  priority: number
+}

View File

@@ -1,635 +0,0 @@
/**
* Feishu (Lark) Webhook Notification Script for Pull Requests
* Sends GitHub PR summaries to Feishu with @ mentions for reviewers and assignees
*/
const crypto = require('crypto')
const https = require('https')
const fs = require('fs')
const path = require('path')
/**
* Generate Feishu webhook signature
* @param {string} secret - Feishu webhook secret
* @param {number} timestamp - Unix timestamp in seconds
* @returns {string} Base64 encoded signature
*/
function generateSignature(secret, timestamp) {
const stringToSign = `${timestamp}\n${secret}`
const hmac = crypto.createHmac('sha256', stringToSign)
return hmac.digest('base64')
}
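The signer above follows Feishu's custom-bot scheme: the HMAC-SHA256 *key* is the string `"{timestamp}\n{secret}"` and the signed message is empty, with the digest sent base64-encoded. That key/message inversion looks like a bug but matches the code as written. A self-contained sketch with hypothetical inputs:

```typescript
import { createHmac } from 'crypto'

// Mirrors generateSignature above: key = "<timestamp>\n<secret>",
// message = empty string, output = base64 digest.
function generateSignature(secret: string, timestamp: number): string {
  const stringToSign = `${timestamp}\n${secret}`
  return createHmac('sha256', stringToSign).digest('base64')
}

// Hypothetical secret and timestamp, for illustration only.
const sig = generateSignature('demo-secret', 1700000000)
console.log(sig.length) // 44: base64 of a 32-byte SHA-256 digest
```

Feishu recomputes the same digest server-side from the `timestamp` field in the payload, which is why the timestamp must be sent alongside `sign`.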
/**
* Send message to Feishu webhook
* @param {string} webhookUrl - Feishu webhook URL
* @param {string} secret - Feishu webhook secret
* @param {object} content - Message content
* @returns {Promise<void>}
*/
function sendToFeishu(webhookUrl, secret, content) {
return new Promise((resolve, reject) => {
const timestamp = Math.floor(Date.now() / 1000)
const sign = generateSignature(secret, timestamp)
const payload = JSON.stringify({
timestamp: timestamp.toString(),
sign: sign,
msg_type: 'interactive',
card: content
})
const url = new URL(webhookUrl)
const options = {
hostname: url.hostname,
path: url.pathname + url.search,
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Content-Length': Buffer.byteLength(payload)
}
}
const req = https.request(options, (res) => {
let data = ''
res.on('data', (chunk) => {
data += chunk
})
res.on('end', () => {
if (res.statusCode >= 200 && res.statusCode < 300) {
console.log('✅ Successfully sent to Feishu:', data)
resolve()
} else {
reject(new Error(`Feishu API error: ${res.statusCode} - ${data}`))
}
})
})
req.on('error', (error) => {
reject(error)
})
req.write(payload)
req.end()
})
}
/**
* Parse user mapping from environment variable
* Expected format: "github_user1:feishu_id1,github_user2:feishu_id2"
* @param {string} mappingStr - User mapping string
* @returns {Map<string, string>} Map of GitHub username to Feishu user ID
*/
function parseUserMapping(mappingStr) {
const mapping = new Map()
if (!mappingStr) {
return mapping
}
const pairs = mappingStr.split(',')
for (const pair of pairs) {
const [github, feishu] = pair.split(':').map((s) => s.trim())
if (github && feishu) {
mapping.set(github, feishu)
}
}
return mapping
}
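The mapping format documented above (`github_user:feishu_id` pairs joined by commas) tolerates whitespace and silently drops malformed entries. A typed restatement of the same parser, with hypothetical IDs:

```typescript
// Mirrors parseUserMapping above: "gh1:id1,gh2:id2" → Map<github, feishu>.
function parseUserMapping(mappingStr: string): Map<string, string> {
  const mapping = new Map<string, string>()
  if (!mappingStr) return mapping
  for (const pair of mappingStr.split(',')) {
    const [github, feishu] = pair.split(':').map((s) => s.trim())
    // Entries missing either half are skipped rather than rejected.
    if (github && feishu) mapping.set(github, feishu)
  }
  return mapping
}

// Hypothetical Feishu open IDs, for illustration only.
const m = parseUserMapping('alice:ou_123, bob:ou_456, broken')
console.log(m.get('bob')) // "ou_456"; "broken" is dropped
```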
/**
* Get PR category display info
* @param {string} category - PR category
* @returns {object} Category display info
*/
function getCategoryInfo(category) {
const categoryMap = {
chat: { emoji: '💬', name: '对话', color: 'blue' },
draw: { emoji: '🖼️', name: '绘图', color: 'blue' },
uiux: { emoji: '🎨', name: 'UI/UX', color: 'blue' },
knowledge: { emoji: '🧠', name: '知识库', color: 'green' },
agent: { emoji: '🕹️', name: 'Agent', color: 'turquoise' },
provider: { emoji: '🔌', name: 'Provider', color: 'turquoise' },
minapps: { emoji: '🧩', name: '小程序', color: 'turquoise' },
backup_export: { emoji: '💾', name: '备份/导出', color: 'purple' },
data_storage: { emoji: '🗄️', name: '数据与存储', color: 'purple' },
ai_core: { emoji: '🤖', name: 'AI基础设施', color: 'purple' },
backend: { emoji: '⚙️', name: '后端/平台', color: 'green' },
docs: { emoji: '📚', name: '文档', color: 'grey' },
'build-config': { emoji: '🔧', name: '构建/配置', color: 'orange' },
test: { emoji: '🧪', name: '测试', color: 'yellow' },
multiple: { emoji: '🔀', name: '多模块', color: 'red' },
other: { emoji: '📝', name: '其他', color: 'blue' }
}
return categoryMap[category] || categoryMap.other
}
/**
* Load GitHub reviewers per category from .github/pr-modules.yml (optional)
* Supports inline array style: github_reviewers: ["user1","user2"] or []
* @returns {Map<string, string[]>}
*/
function loadConfigGithubReviewersByCategory() {
const result = new Map()
result.__rules = { vendor_added: [], large_change: { changed_files_gt: 30, reviewers: [] } }
try {
const candidates = [
path.join(process.cwd(), '.github', 'pr-modules.yml'),
path.join(process.cwd(), '.github', 'pr-modules.yaml')
]
let filePath = null
for (const p of candidates) {
if (fs.existsSync(p)) {
filePath = p
break
}
}
if (!filePath) return result
const content = fs.readFileSync(filePath, 'utf8')
const lines = content.split(/\r?\n/)
let inCategories = false
let inRules = false
let currentCategory = null
for (let i = 0; i < lines.length; i++) {
const line = lines[i]
if (!inCategories && !inRules) {
if (/^categories:\s*$/.test(line)) {
inCategories = true
continue
}
if (/^rules:\s*$/.test(line)) {
inRules = true
continue
}
continue
}
if (inCategories) {
const catMatch = /^\s{2}([a-zA-Z0-9_-]+):\s*$/.exec(line)
if (catMatch) {
currentCategory = catMatch[1]
continue
}
if (currentCategory) {
const reviewersMatch = /^\s{4}github_reviewers:\s*(.*)$/.exec(line)
if (reviewersMatch) {
let value = (reviewersMatch[1] || '').trim()
let users = []
if (value.startsWith('[') && value.endsWith(']')) {
const inner = value.slice(1, -1).trim()
if (inner.length > 0) {
users = inner
.split(',')
.map((s) => s.trim().replace(/^"|"$/g, '').replace(/^'|'$/g, ''))
.filter(Boolean)
}
} else if (value === '' || value === '[]') {
// try to parse dash list style
const collected = []
let j = i + 1
while (j < lines.length) {
const l = lines[j]
const dash = /^\s{6}-\s*(["']?)([^"']*)\1\s*$/.exec(l)
if (dash) {
const user = dash[2].trim()
if (user) collected.push(user)
j++
continue
}
break
}
users = collected
}
result.set(currentCategory, Array.from(new Set(users)))
}
}
} else if (inRules) {
// vendor_added block
if (/^\s{2}vendor_added:\s*$/.test(line)) {
// parse github_reviewers under vendor_added
let j = i + 1
const reviewers = []
while (j < lines.length) {
const l = lines[j]
const reviewersLine = /^\s{4}github_reviewers:\s*(.*)$/.exec(l)
if (reviewersLine) {
let value = (reviewersLine[1] || '').trim()
if (value.startsWith('[') && value.endsWith(']')) {
const inner = value.slice(1, -1).trim()
if (inner.length > 0) {
inner.split(',').forEach((s) => {
const u = s.trim().replace(/^"|"$/g, '').replace(/^'|'$/g, '')
if (u) reviewers.push(u)
})
}
}
j++
continue
}
const dash = /^\s{6}-\s*(["']?)([^"']*)\1\s*$/.exec(l)
if (dash) {
const u = dash[2].trim()
if (u) reviewers.push(u)
j++
continue
}
if (/^\s{2}[a-zA-Z0-9_-]+:\s*$/.test(l)) break
j++
}
result.__rules.vendor_added = Array.from(new Set(reviewers))
}
// large_change block
if (/^\s{2}large_change:\s*$/.test(line)) {
let j = i + 1
const rule = { changed_files_gt: 30, reviewers: [] }
while (j < lines.length) {
const l = lines[j]
const threshold = /^\s{4}changed_files_gt:\s*(\d+)\s*$/.exec(l)
if (threshold) {
rule.changed_files_gt = parseInt(threshold[1], 10)
j++
continue
}
const reviewersLine = /^\s{4}github_reviewers:\s*(.*)$/.exec(l)
if (reviewersLine) {
let value = (reviewersLine[1] || '').trim()
if (value.startsWith('[') && value.endsWith(']')) {
const inner = value.slice(1, -1).trim()
if (inner.length > 0) {
inner.split(',').forEach((s) => {
const u = s.trim().replace(/^"|"$/g, '').replace(/^'|'$/g, '')
if (u) rule.reviewers.push(u)
})
}
}
j++
continue
}
const dash = /^\s{6}-\s*(["']?)([^"']*)\1\s*$/.exec(l)
if (dash) {
const u = dash[2].trim()
if (u) rule.reviewers.push(u)
j++
continue
}
if (/^\s{2}[a-zA-Z0-9_-]+:\s*$/.test(l)) break
j++
}
rule.reviewers = Array.from(new Set(rule.reviewers))
result.__rules.large_change = rule
}
}
}
} catch (e) {
console.warn('⚠️ Failed to load .github/pr-modules.yml:', e.message)
}
return result
}
/**
* Get recommended reviewers based on PR category
* This is a helper for Claude to suggest appropriate reviewers
* @param {string} category - PR category
* @param {Map<string, string>} userMapping - GitHub to Feishu user mapping
* @returns {string[]} List of Feishu user IDs to notify
*/
function getRecommendedReviewersByCategory(category, userMapping, configGithubReviewersMap) {
// Fallback mapping when config not provided
const fallback = {
backend: ['kangfenmao'],
ai_core: ['kangfenmao'],
'build-config': ['kangfenmao'],
multiple: ['kangfenmao']
}
const configUsers = (configGithubReviewersMap && configGithubReviewersMap.get(category)) || []
const fallbackUsers = fallback[category] || []
const githubUsers = Array.from(new Set([...configUsers, ...fallbackUsers]))
return githubUsers.map((gh) => userMapping.get(gh)).filter(Boolean)
}
/**
* Create Feishu card message from PR data
* @param {object} prData - GitHub PR data
* @param {Map<string, string>} userMapping - GitHub to Feishu user mapping
* @returns {object} Feishu card content
*/
function createPRCard(prData, userMapping, configGithubReviewersMap) {
const {
prUrl,
prNumber,
prTitle,
prSummary,
prAuthor,
labels,
reviewers,
assignees,
category,
changedFiles,
additions,
deletions,
vendorAdded
} = prData
const categoryInfo = getCategoryInfo(category)
// Build labels section
const labelElements =
labels && labels.length > 0
? [
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**🏷️ Labels:** ${labels.map((l) => `\`${l}\``).join(' ')}`
}
}
]
: []
// Build stats section
const statsContent = [
`📁 ${changedFiles || 0} files`,
`<font color='green'>+${additions || 0}</font>`,
`<font color='red'>-${deletions || 0}</font>`
].join(' · ')
// Build mention content for reviewers and assignees
const mentions = []
const mentionedUsers = new Set()
// Add reviewers
if (reviewers && reviewers.length > 0) {
reviewers.forEach((reviewer) => {
const feishuId = userMapping.get(reviewer)
if (feishuId && !mentionedUsers.has(feishuId)) {
mentions.push(`<at id="${feishuId}"></at>`)
mentionedUsers.add(feishuId)
}
})
}
// Add assignees
if (assignees && assignees.length > 0) {
assignees.forEach((assignee) => {
const feishuId = userMapping.get(assignee)
if (feishuId && !mentionedUsers.has(feishuId)) {
mentions.push(`<at id="${feishuId}"></at>`)
mentionedUsers.add(feishuId)
}
})
}
// Add category-based experts (if not already mentioned)
const categoryExperts = getRecommendedReviewersByCategory(category, userMapping, configGithubReviewersMap)
categoryExperts.forEach((feishuId) => {
if (feishuId && !mentionedUsers.has(feishuId)) {
mentions.push(`<at id="${feishuId}"></at>`)
mentionedUsers.add(feishuId)
}
})
// Enforce mandatory reviewers based on rules
const mandatoryGithubUsers = []
const rules = configGithubReviewersMap.__rules || {
vendor_added: [],
large_change: { changed_files_gt: 30, reviewers: [] }
}
if (vendorAdded) {
mandatoryGithubUsers.push(...(rules.vendor_added || ['Yinsen-Ho']))
}
const changedFilesNum = Number(changedFiles) || 0
const threshold = (rules.large_change && rules.large_change.changed_files_gt) || 30
if (changedFilesNum > threshold) {
const reviewers = (rules.large_change && rules.large_change.reviewers) || ['kangfenmao']
mandatoryGithubUsers.push(...reviewers)
}
mandatoryGithubUsers.forEach((gh) => {
const feishuId = userMapping.get(gh)
if (feishuId && !mentionedUsers.has(feishuId)) {
mentions.push(`<at id="${feishuId}"></at>`)
mentionedUsers.add(feishuId)
}
})
// Build mentions section
const mentionElements =
mentions.length > 0
? [
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**👥 请关注:** ${mentions.join(' ')}`
}
}
]
: []
// Build reviewer and assignee info
const reviewerInfo = []
if (reviewers && reviewers.length > 0) {
reviewerInfo.push({
tag: 'div',
text: {
tag: 'lark_md',
content: `**👀 Reviewers:** ${reviewers.map((r) => `\`${r}\``).join(', ')}`
}
})
}
if (assignees && assignees.length > 0) {
reviewerInfo.push({
tag: 'div',
text: {
tag: 'lark_md',
content: `**👤 Assignees:** ${assignees.map((a) => `\`${a}\``).join(', ')}`
}
})
}
return {
elements: [
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**🔀 New Pull Request #${prNumber}**`
}
},
{
tag: 'hr'
},
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**${categoryInfo.emoji} 类型:** ${categoryInfo.name}`
}
},
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**📝 Title:** ${prTitle}`
}
},
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**👤 Author:** \`${prAuthor}\``
}
},
...reviewerInfo,
...labelElements,
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**📊 Changes:** ${statsContent}`
}
},
{
tag: 'hr'
},
{
tag: 'div',
text: {
tag: 'lark_md',
content: `**📋 Summary:**\n${prSummary}`
}
},
...mentionElements,
{
tag: 'hr'
},
{
tag: 'action',
actions: [
{
tag: 'button',
text: {
tag: 'plain_text',
content: '🔗 View PR'
},
type: 'primary',
url: prUrl
}
]
}
],
header: {
template: categoryInfo.color,
title: {
tag: 'plain_text',
content: `${categoryInfo.emoji} Cherry Studio - New PR [${categoryInfo.name}]`
}
}
}
}
/**
* Main function
*/
async function main() {
try {
// Get environment variables
const webhookUrl = process.env.FEISHU_WEBHOOK_URL
const secret = process.env.FEISHU_WEBHOOK_SECRET
const userMappingStr = process.env.FEISHU_USER_MAPPING || ''
const prUrl = process.env.PR_URL
const prNumber = process.env.PR_NUMBER
const prTitle = process.env.PR_TITLE
const prSummary = process.env.PR_SUMMARY
const prAuthor = process.env.PR_AUTHOR
const labelsStr = process.env.PR_LABELS || ''
const reviewersStr = process.env.PR_REVIEWERS || ''
const assigneesStr = process.env.PR_ASSIGNEES || ''
const category = process.env.PR_CATEGORY || 'multiple'
const vendorAdded = String(process.env.PR_VENDOR_ADDED || 'false').toLowerCase() === 'true'
const changedFiles = process.env.PR_CHANGED_FILES || '0'
const additions = process.env.PR_ADDITIONS || '0'
const deletions = process.env.PR_DELETIONS || '0'
// Validate required environment variables
if (!webhookUrl) {
throw new Error('FEISHU_WEBHOOK_URL environment variable is required')
}
if (!secret) {
throw new Error('FEISHU_WEBHOOK_SECRET environment variable is required')
}
if (!prUrl || !prNumber || !prTitle || !prSummary) {
throw new Error('PR data environment variables are required')
}
// Parse data
const userMapping = parseUserMapping(userMappingStr)
const configGithubReviewersMap = loadConfigGithubReviewersByCategory()
const labels = labelsStr
? labelsStr
.split(',')
.map((l) => l.trim())
.filter(Boolean)
: []
const reviewers = reviewersStr
? reviewersStr
.split(',')
.map((r) => r.trim())
.filter(Boolean)
: []
const assignees = assigneesStr
? assigneesStr
.split(',')
.map((a) => a.trim())
.filter(Boolean)
: []
// Create PR data object
const prData = {
prUrl,
prNumber,
prTitle,
prSummary,
prAuthor: prAuthor || 'Unknown',
labels,
reviewers,
assignees,
category,
vendorAdded,
changedFiles,
additions,
deletions
}
console.log('📤 Sending PR notification to Feishu...')
console.log(`PR #${prNumber}: ${prTitle}`)
console.log(`Category: ${category}`)
console.log(`Vendor added: ${vendorAdded}`)
console.log(`Reviewers: ${reviewers.join(', ') || 'None'}`)
console.log(`Assignees: ${assignees.join(', ') || 'None'}`)
console.log(`User mapping entries: ${userMapping.size}`)
// Create card content
const card = createPRCard(prData, userMapping, configGithubReviewersMap)
// Send to Feishu
await sendToFeishu(webhookUrl, secret, card)
console.log('✅ PR notification sent successfully!')
} catch (error) {
console.error('❌ Error:', error.message)
process.exit(1)
}
}
// Run main function
main()

View File

@@ -1,277 +0,0 @@
/**
* Stats major contributors per module based on .github/pr-modules.yml
* Output a markdown summary and write JSON to .github/reviewer-suggestions.json
*
* Usage:
* node scripts/stats-contributors.js [--top 3] [--since 1.year] [--mode auto|shortlog|log|blame] [--blame-sample 30]
*/
const { spawnSync } = require('child_process')
const fs = require('fs')
const path = require('path')
function readText(file) {
try {
return fs.readFileSync(file, 'utf8')
} catch {
return null
}
}
function parseArgs() {
const args = process.argv.slice(2)
const out = { top: 3, since: '', mode: 'auto', blameSample: 30 }
for (let i = 0; i < args.length; i++) {
if (args[i] === '--top' && i + 1 < args.length) {
out.top = parseInt(args[++i], 10) || 3
} else if (args[i] === '--since' && i + 1 < args.length) {
out.since = String(args[++i])
} else if (args[i] === '--mode' && i + 1 < args.length) {
out.mode = String(args[++i])
} else if (args[i] === '--blame-sample' && i + 1 < args.length) {
out.blameSample = parseInt(args[++i], 10) || 30
}
}
return out
}
// Minimal YAML parser for categories/globs in .github/pr-modules.yml
function parseModulesConfig(configPath) {
const text = readText(configPath)
if (!text) throw new Error(`Cannot read ${configPath}`)
const lines = text.split(/\r?\n/)
const categories = []
let inCategories = false
let current = null
for (let i = 0; i < lines.length; i++) {
const line = lines[i]
if (!inCategories) {
if (/^categories:\s*$/.test(line)) inCategories = true
continue
}
// New category key
const catMatch = /^\s{2}([a-zA-Z0-9_-]+):\s*$/.exec(line)
if (catMatch) {
if (current) categories.push(current)
current = { key: catMatch[1], name: '', globs: [] }
continue
}
if (!current) continue
const nameMatch = /^\s{4}name:\s*"?([^"]+)"?\s*$/.exec(line)
if (nameMatch) {
current.name = nameMatch[1].trim()
continue
}
// Enter globs list, then collect dash items
const globsHeader = /^\s{4}globs:\s*$/.exec(line)
if (globsHeader) {
let j = i + 1
while (j < lines.length) {
const l = lines[j]
const item = /^\s{6}-\s*"?([^"]+)"?\s*$/.exec(l)
if (!item) break
current.globs.push(item[1].trim())
j++
}
continue
}
}
if (current) categories.push(current)
return categories
}
function git(args, cwd) {
const res = spawnSync('git', args, { cwd, encoding: 'utf8' })
if (res.status !== 0) {
const msg = (res.stderr || '').trim() || `git ${args.join(' ')} failed`
throw new Error(msg)
}
return res.stdout
}
function buildPathspecs(globs) {
// Use pathspec magic :(glob)pattern so that ** works and we avoid shell expansion
return globs.map((g) => `:(glob)${g}`)
}
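The comment above is the key detail: the `:(glob)` pathspec magic tells git to expand `**` itself, so patterns from the YAML config never touch the shell. A restatement of the helper, with a hypothetical pattern:

```typescript
// Mirrors buildPathspecs above: each glob is wrapped in git's
// ":(glob)" pathspec magic so git, not the shell, expands "**".
function buildPathspecs(globs: string[]): string[] {
  return globs.map((g) => `:(glob)${g}`)
}

// Hypothetical glob, for illustration only.
console.log(buildPathspecs(['src/**/*.ts'])) // [':(glob)src/**/*.ts']
```

Passing these after a `--` separator (as `lsFilesForGlobs` does) also prevents a pattern from being mistaken for a git option or revision.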
function lsFilesForGlobs(globs, repoRoot) {
const pathspecs = buildPathspecs(globs)
if (pathspecs.length === 0) return []
try {
const stdout = git(['ls-files', '--', ...pathspecs], repoRoot)
return stdout
.split(/\r?\n/)
.map((l) => l.trim())
.filter(Boolean)
} catch (e) {
// No matched files or pathspec error → treat as empty
return []
}
}
function shortlogFor(globs, repoRoot, since) {
const files = lsFilesForGlobs(globs, repoRoot)
if (files.length === 0) return []
const base = ['shortlog', '-sne']
if (since) base.push(`--since=${since}`)
const stdout = git([...base, '--', ...files], repoRoot)
const lines = stdout
.split(/\r?\n/)
.map((l) => l.trim())
.filter(Boolean)
const rows = []
for (const l of lines) {
// e.g. " 42 John Doe <john@example.com>"
const m = /^(\d+)\s+(.+?)\s+<([^>]+)>$/.exec(l)
if (!m) continue
const commits = parseInt(m[1], 10)
const name = m[2]
const email = m[3]
const gh = extractGithubUsername(name, email)
rows.push({ commits, name, email, github: gh })
}
rows.sort((a, b) => b.commits - a.commits)
return rows
}
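The regex in `shortlogFor` assumes each `git shortlog -sne` row has the shape shown in the inline comment: a commit count, the author name, then the email in angle brackets. The lazy `(.+?)` is what lets multi-word names work, since the following `\s+<` anchors the match at the last whitespace before the email. A standalone restatement:

```typescript
// Parses one trimmed `git shortlog -sne` row, as shortlogFor does above,
// e.g. "    42  John Doe <john@example.com>".
function parseShortlogRow(line: string): { commits: number; name: string; email: string } | null {
  const m = /^(\d+)\s+(.+?)\s+<([^>]+)>$/.exec(line.trim())
  if (!m) return null
  return { commits: parseInt(m[1], 10), name: m[2], email: m[3] }
}

console.log(parseShortlogRow('    42  John Doe <john@example.com>'))
// → { commits: 42, name: 'John Doe', email: 'john@example.com' }
```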
function logAuthorsFor(globs, repoRoot, since) {
const files = lsFilesForGlobs(globs, repoRoot)
if (files.length === 0) return []
const base = ['log', '--format=%an <%ae>']
if (since) base.push(`--since=${since}`)
const stdout = git([...base, '--', ...files], repoRoot)
const lines = stdout
.split(/\r?\n/)
.map((l) => l.trim())
.filter(Boolean)
const map = new Map()
for (const l of lines) {
const m = /^(.+?)\s+<([^>]+)>$/.exec(l)
if (!m) continue
const name = m[1]
const email = m[2]
const gh = extractGithubUsername(name, email)
const key = `${name} <${email}>`
map.set(key, (map.get(key) || 0) + 1)
}
const out = []
for (const [key, commits] of map.entries()) {
const m = /^(.+?)\s+<([^>]+)>$/.exec(key)
out.push({ commits, name: m[1], email: m[2], github: extractGithubUsername(m[1], m[2]) })
}
out.sort((a, b) => b.commits - a.commits)
return out
}
function blameAuthorsSample(globs, repoRoot, sample) {
const files = lsFilesForGlobs(globs, repoRoot)
if (files.length === 0) return []
const pick = files.slice(0, Math.max(1, sample))
const map = new Map()
for (const f of pick) {
let stdout = ''
try {
stdout = git(['blame', '--line-porcelain', '--', f], repoRoot)
} catch (e) {
continue
}
const lines = stdout.split(/\r?\n/)
for (const line of lines) {
// author and author-mail lines
const am = /^author-mail\s+<([^>]+)>$/.exec(line)
if (am) {
const email = am[1]
// We do not rely on index; we just keep email-based identity
const gh = extractGithubUsername('', email)
const key = `${gh || ''}<${email}>`
map.set(key, (map.get(key) || 0) + 1)
}
}
}
const out = []
for (const [key, commits] of map.entries()) {
const m = /^(.*?)<([^>]+)>$/.exec(key)
const email = m ? m[2] : ''
const gh = extractGithubUsername('', email)
out.push({ commits, name: gh || email, email, github: gh })
}
out.sort((a, b) => b.commits - a.commits)
return out
}
function extractGithubUsername(name, email) {
// Try noreply forms: 12345+user@users.noreply.github.com or user@users.noreply.github.com
const noreply = /^(?:\d+\+)?([A-Za-z0-9-]+)@users\.noreply\.github\.com$/.exec(email)
if (noreply) return noreply[1]
// If name itself looks like a probable GitHub handle
if (/^[A-Za-z0-9-]{3,}$/.test(name)) return name
return ''
}
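The heuristic above leans on GitHub's noreply address format, where the handle is embedded as `<numeric-id>+<user>@users.noreply.github.com` (the `<id>+` prefix is optional on older accounts); falling back to treating a handle-shaped author name as the username is a best-effort guess. A standalone restatement:

```typescript
// Mirrors extractGithubUsername above: pull the handle out of a GitHub
// noreply email, else accept a handle-shaped author name, else give up.
function extractGithubUsername(name: string, email: string): string {
  const noreply = /^(?:\d+\+)?([A-Za-z0-9-]+)@users\.noreply\.github\.com$/.exec(email)
  if (noreply) return noreply[1]
  if (/^[A-Za-z0-9-]{3,}$/.test(name)) return name
  return ''
}

// Address format taken from a commit trailer in this compare.
console.log(extractGithubUsername('', '67425183+DeJeune@users.noreply.github.com')) // "DeJeune"
```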
function main() {
const repoRoot = process.cwd()
const { top, since, mode, blameSample } = parseArgs()
const configPath = path.join(repoRoot, '.github', 'pr-modules.yml')
const categories = parseModulesConfig(configPath)
const suggestions = {}
const markdownLines = []
markdownLines.push('| Module | Top Contributors (commits) |')
markdownLines.push('|---|---|')
for (const cat of categories) {
let rows = []
try {
if (mode === 'shortlog' || mode === 'auto') rows = shortlogFor(cat.globs, repoRoot, since)
if (rows.length === 0 && (mode === 'log' || mode === 'auto')) rows = logAuthorsFor(cat.globs, repoRoot, since)
if (rows.length === 0 && (mode === 'blame' || mode === 'auto'))
rows = blameAuthorsSample(cat.globs, repoRoot, blameSample)
} catch (e) {
// Fallback to next method if one fails
if (mode === 'auto') {
try {
rows = logAuthorsFor(cat.globs, repoRoot, since)
} catch (e2) {
// ignore and continue
}
if (rows.length === 0) {
try {
rows = blameAuthorsSample(cat.globs, repoRoot, blameSample)
} catch (e3) {
// ignore and continue
}
}
} else {
// Non-auto mode: report empty on error
rows = []
}
}
const topRows = rows.slice(0, top)
suggestions[cat.key] = topRows.map((r) => ({
github: r.github,
name: r.name,
email: r.email,
commits: r.commits
}))
const cell = topRows
.map((r) => {
const id = r.github ? `@${r.github}` : r.name
return `${id} (${r.commits})`
})
.join(', ')
markdownLines.push(`| ${cat.key} | ${cell || '-'} |`)
}
const outJsonPath = path.join(repoRoot, '.github', 'reviewer-suggestions.json')
fs.writeFileSync(outJsonPath, JSON.stringify({ generatedAt: new Date().toISOString(), suggestions }, null, 2))
console.log(markdownLines.join('\n'))
console.log(`\nSaved JSON: ${path.relative(repoRoot, outJsonPath)}`)
}
main()

View File

@@ -1,4 +1,4 @@
-import { ApiServerConfig } from '@types'
+import type { ApiServerConfig } from '@types'
import { v4 as uuidv4 } from 'uuid'
import { loggerService } from '../services/LoggerService'

View File

@@ -1,5 +1,5 @@
import crypto from 'crypto'
-import { NextFunction, Request, Response } from 'express'
+import type { NextFunction, Request, Response } from 'express'
import { config } from '../config'

View File

@@ -1,4 +1,4 @@
-import { NextFunction, Request, Response } from 'express'
+import type { NextFunction, Request, Response } from 'express'
import { loggerService } from '../../services/LoggerService'

View File

@@ -1,4 +1,4 @@
-import { Express } from 'express'
+import type { Express } from 'express'
import swaggerJSDoc from 'swagger-jsdoc'
import swaggerUi from 'swagger-ui-express'

View File

@@ -1,7 +1,8 @@
import { loggerService } from '@logger'
import { AgentModelValidationError, agentService, sessionService } from '@main/services/agents'
-import { ListAgentsResponse, type ReplaceAgentRequest, type UpdateAgentRequest } from '@types'
-import { Request, Response } from 'express'
+import type { ListAgentsResponse } from '@types'
+import { type ReplaceAgentRequest, type UpdateAgentRequest } from '@types'
+import type { Request, Response } from 'express'
import type { ValidationRequest } from '../validators/zodValidator'

View File

@@ -2,7 +2,7 @@ import { loggerService } from '@logger'
import { MESSAGE_STREAM_TIMEOUT_MS } from '@main/apiServer/config/timeouts'
import { createStreamAbortController, STREAM_TIMEOUT_REASON } from '@main/apiServer/utils/createStreamAbortController'
import { agentService, sessionMessageService, sessionService } from '@main/services/agents'
import { Request, Response } from 'express'
import type { Request, Response } from 'express'
const logger = loggerService.withContext('ApiServerMessagesHandlers')


@@ -1,7 +1,8 @@
import { loggerService } from '@logger'
import { AgentModelValidationError, sessionMessageService, sessionService } from '@main/services/agents'
import { ListAgentSessionsResponse, type ReplaceSessionRequest, UpdateSessionResponse } from '@types'
import { Request, Response } from 'express'
import type { ListAgentSessionsResponse, UpdateSessionResponse } from '@types'
import { type ReplaceSessionRequest } from '@types'
import type { Request, Response } from 'express'
import type { ValidationRequest } from '../validators/zodValidator'


@@ -1,4 +1,4 @@
import { Request, Response } from 'express'
import type { Request, Response } from 'express'
import { agentService } from '../../../../services/agents'
import { loggerService } from '../../../../services/LoggerService'


@@ -1,5 +1,6 @@
import { NextFunction, Request, Response } from 'express'
import { ZodError, ZodType } from 'zod'
import type { NextFunction, Request, Response } from 'express'
import type { ZodType } from 'zod'
import { ZodError } from 'zod'
export interface ValidationRequest extends Request {
  validatedBody?: any


@@ -1,5 +1,6 @@
import { ChatCompletionCreateParams } from '@cherrystudio/openai/resources'
import express, { Request, Response } from 'express'
import type { ChatCompletionCreateParams } from '@cherrystudio/openai/resources'
import type { Request, Response } from 'express'
import express from 'express'
import { loggerService } from '../../services/LoggerService'
import {


@@ -1,4 +1,5 @@
import express, { Request, Response } from 'express'
import type { Request, Response } from 'express'
import express from 'express'
import { loggerService } from '../../services/LoggerService'
import { mcpApiService } from '../services/mcp'


@@ -1,7 +1,8 @@
import { MessageCreateParams } from '@anthropic-ai/sdk/resources'
import type { MessageCreateParams } from '@anthropic-ai/sdk/resources'
import { loggerService } from '@logger'
import { Provider } from '@types'
import express, { Request, Response } from 'express'
import type { Provider } from '@types'
import type { Request, Response } from 'express'
import express from 'express'
import { messagesService } from '../services/messages'
import { getProviderById, validateModelId } from '../utils'


@@ -1,5 +1,7 @@
import { ApiModelsFilterSchema, ApiModelsResponse } from '@types'
import express, { Request, Response } from 'express'
import type { ApiModelsResponse } from '@types'
import { ApiModelsFilterSchema } from '@types'
import type { Request, Response } from 'express'
import express from 'express'
import { loggerService } from '../../services/LoggerService'
import { modelsService } from '../services/models'


@@ -1,8 +1,10 @@
import { createServer } from 'node:http'
import { loggerService } from '@logger'
import { IpcChannel } from '@shared/IpcChannel'
import { agentService } from '../services/agents'
import { windowService } from '../services/WindowService'
import { app } from './app'
import { config } from './config'
@@ -43,6 +45,13 @@ export class ApiServer {
    return new Promise((resolve, reject) => {
      this.server!.listen(port, host, () => {
        logger.info('API server started', { host, port })
        // Notify renderer that API server is ready
        const mainWindow = windowService.getMainWindow()
        if (mainWindow && !mainWindow.isDestroyed()) {
          mainWindow.webContents.send(IpcChannel.ApiServer_Ready)
        }
        resolve()
      })


@@ -1,9 +1,10 @@
import OpenAI from '@cherrystudio/openai'
import { ChatCompletionCreateParams, ChatCompletionCreateParamsStreaming } from '@cherrystudio/openai/resources'
import { Provider } from '@types'
import type { ChatCompletionCreateParams, ChatCompletionCreateParamsStreaming } from '@cherrystudio/openai/resources'
import type { Provider } from '@types'
import { loggerService } from '../../services/LoggerService'
import { ModelValidationError, validateModelId } from '../utils'
import type { ModelValidationError } from '../utils'
import { validateModelId } from '../utils'
const logger = loggerService.withContext('ChatCompletionService')


@@ -1,16 +1,12 @@
import mcpService from '@main/services/MCPService'
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp'
import {
  isJSONRPCRequest,
  JSONRPCMessage,
  JSONRPCMessageSchema,
  MessageExtraInfo
} from '@modelcontextprotocol/sdk/types'
import { MCPServer } from '@types'
import type { JSONRPCMessage, MessageExtraInfo } from '@modelcontextprotocol/sdk/types'
import { isJSONRPCRequest, JSONRPCMessageSchema } from '@modelcontextprotocol/sdk/types'
import type { MCPServer } from '@types'
import { randomUUID } from 'crypto'
import { EventEmitter } from 'events'
import { Request, Response } from 'express'
import { IncomingMessage, ServerResponse } from 'http'
import type { Request, Response } from 'express'
import type { IncomingMessage, ServerResponse } from 'http'
import { loggerService } from '../../services/LoggerService'
import { getMcpServerById, getMCPServersFromRedux } from '../utils/mcp'


@@ -1,10 +1,10 @@
import Anthropic from '@anthropic-ai/sdk'
import { MessageCreateParams, MessageStreamEvent } from '@anthropic-ai/sdk/resources'
import type Anthropic from '@anthropic-ai/sdk'
import type { MessageCreateParams, MessageStreamEvent } from '@anthropic-ai/sdk/resources'
import { loggerService } from '@logger'
import anthropicService from '@main/services/AnthropicService'
import { buildClaudeCodeSystemMessage, getSdkClient } from '@shared/anthropic'
import { Provider } from '@types'
import { Response } from 'express'
import type { Provider } from '@types'
import type { Response } from 'express'
const logger = loggerService.withContext('MessagesService')
const EXCLUDED_FORWARD_HEADERS: ReadonlySet<string> = new Set([


@@ -1,6 +1,6 @@
import { isEmpty } from 'lodash'
import { ApiModel, ApiModelsFilter, ApiModelsResponse } from '../../../renderer/src/types/apiModels'
import type { ApiModel, ApiModelsFilter, ApiModelsResponse } from '../../../renderer/src/types/apiModels'
import { loggerService } from '../../services/LoggerService'
import {
  getAvailableProviders,


@@ -1,7 +1,7 @@
import { CacheService } from '@main/services/CacheService'
import { loggerService } from '@main/services/LoggerService'
import { reduxService } from '@main/services/ReduxService'
import { ApiModel, Model, Provider } from '@types'
import type { ApiModel, Model, Provider } from '@types'
const logger = loggerService.withContext('ApiServerUtils')


@@ -1,8 +1,9 @@
import { CacheService } from '@main/services/CacheService'
import mcpService from '@main/services/MCPService'
import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { CallToolRequestSchema, ListToolsRequestSchema, ListToolsResult } from '@modelcontextprotocol/sdk/types.js'
import { MCPServer } from '@types'
import type { ListToolsResult } from '@modelcontextprotocol/sdk/types.js'
import { CallToolRequestSchema, ListToolsRequestSchema } from '@modelcontextprotocol/sdk/types.js'
import type { MCPServer } from '@types'
import { loggerService } from '../../services/LoggerService'
import { reduxService } from '../../services/ReduxService'


@@ -8,11 +8,12 @@ import { generateSignature } from '@main/integration/cherryai'
import anthropicService from '@main/services/AnthropicService'
import { getBinaryPath, isBinaryExists, runInstallScript } from '@main/utils/process'
import { handleZoomFactor } from '@main/utils/zoom'
import { SpanEntity, TokenUsage } from '@mcp-trace/trace-core'
import { MIN_WINDOW_HEIGHT, MIN_WINDOW_WIDTH, UpgradeChannel } from '@shared/config/constant'
import type { SpanEntity, TokenUsage } from '@mcp-trace/trace-core'
import type { UpgradeChannel } from '@shared/config/constant'
import { MIN_WINDOW_HEIGHT, MIN_WINDOW_WIDTH } from '@shared/config/constant'
import { IpcChannel } from '@shared/IpcChannel'
import type { PluginError } from '@types'
import {
import type {
  AgentPersistedMessage,
  FileMetadata,
  Notification,
@@ -23,10 +24,12 @@ import {
  ThemeMode
} from '@types'
import checkDiskSpace from 'check-disk-space'
import { BrowserWindow, dialog, ipcMain, ProxyConfig, session, shell, systemPreferences, webContents } from 'electron'
import type { ProxyConfig } from 'electron'
import { BrowserWindow, dialog, ipcMain, session, shell, systemPreferences, webContents } from 'electron'
import fontList from 'font-list'
import { agentMessageRepository } from './services/agents/database'
import { PluginService } from './services/agents/plugins/PluginService'
import { apiServerService } from './services/ApiServerService'
import appService from './services/AppService'
import AppUpdater from './services/AppUpdater'
@@ -47,7 +50,6 @@ import * as NutstoreService from './services/NutstoreService'
import ObsidianVaultService from './services/ObsidianVaultService'
import { ocrService } from './services/ocr/OcrService'
import OvmsManager from './services/OvmsManager'
import { PluginService } from './services/PluginService'
import { proxyManager } from './services/ProxyManager'
import { pythonService } from './services/PythonService'
import { FileServiceManager } from './services/remotefile/FileServiceManager'
@@ -70,6 +72,7 @@ import {
import storeSyncService from './services/StoreSyncService'
import { themeService } from './services/ThemeService'
import VertexAIService from './services/VertexAIService'
import WebSocketService from './services/WebSocketService'
import { setOpenLinkExternal } from './services/WebviewService'
import { windowService } from './services/WindowService'
import { calculateDirectorySize, getResourcePath } from './utils'
@@ -1017,4 +1020,11 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
      return { success: false, error }
    }
  })
  // WebSocket
  ipcMain.handle(IpcChannel.WebSocket_Start, WebSocketService.start)
  ipcMain.handle(IpcChannel.WebSocket_Stop, WebSocketService.stop)
  ipcMain.handle(IpcChannel.WebSocket_Status, WebSocketService.getStatus)
  ipcMain.handle(IpcChannel.WebSocket_SendFile, WebSocketService.sendFile)
  ipcMain.handle(IpcChannel.WebSocket_GetAllCandidates, WebSocketService.getAllCandidates)
}


@@ -1,6 +1,6 @@
import type { BaseEmbeddings } from '@cherrystudio/embedjs-interfaces'
import { TraceMethod } from '@mcp-trace/trace-core'
import { ApiClient } from '@types'
import type { ApiClient } from '@types'
import EmbeddingsFactory from './EmbeddingsFactory'


@@ -1,7 +1,8 @@
import type { BaseEmbeddings } from '@cherrystudio/embedjs-interfaces'
import { OllamaEmbeddings } from '@cherrystudio/embedjs-ollama'
import { OpenAiEmbeddings } from '@cherrystudio/embedjs-openai'
import { ApiClient } from '@types'
import type { ApiClient } from '@types'
import { net } from 'electron'
import { VoyageEmbeddings } from './VoyageEmbeddings'
@@ -43,7 +44,7 @@ export default class EmbeddingsFactory {
      apiKey,
      dimensions,
      batchSize,
      configuration: { baseURL }
      configuration: { baseURL, fetch: net.fetch as typeof fetch }
    })
  }
}


@@ -1,10 +1,11 @@
import { JsonLoader, LocalPathLoader, RAGApplication, TextLoader } from '@cherrystudio/embedjs'
import type { RAGApplication } from '@cherrystudio/embedjs'
import { JsonLoader, LocalPathLoader, TextLoader } from '@cherrystudio/embedjs'
import type { AddLoaderReturn } from '@cherrystudio/embedjs-interfaces'
import { WebLoader } from '@cherrystudio/embedjs-loader-web'
import { loggerService } from '@logger'
import { readTextFileWithAutoEncoding } from '@main/utils/file'
import { LoaderReturn } from '@shared/config/types'
import { FileMetadata, KnowledgeBaseParams } from '@types'
import type { LoaderReturn } from '@shared/config/types'
import type { FileMetadata, KnowledgeBaseParams } from '@types'
import { DraftsExportLoader } from './draftsExportLoader'
import { EpubLoader } from './epubLoader'


@@ -3,7 +3,8 @@ import { cleanString } from '@cherrystudio/embedjs-utils'
import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters'
import { loggerService } from '@logger'
import md5 from 'md5'
import { OfficeParserConfig, parseOfficeAsync } from 'officeparser'
import type { OfficeParserConfig } from 'officeparser'
import { parseOfficeAsync } from 'officeparser'
const logger = loggerService.withContext('OdLoader')


@@ -4,7 +4,7 @@ import path from 'node:path'
import { loggerService } from '@logger'
import { windowService } from '@main/services/WindowService'
import { getFileExt, getTempDir } from '@main/utils/file'
import { FileMetadata, PreprocessProvider } from '@types'
import type { FileMetadata, PreprocessProvider } from '@types'
import { PDFDocument } from 'pdf-lib'
const logger = loggerService.withContext('BasePreprocessProvider')


@@ -1,4 +1,4 @@
import { FileMetadata, PreprocessProvider } from '@types'
import type { FileMetadata, PreprocessProvider } from '@types'
import BasePreprocessProvider from './BasePreprocessProvider'


@@ -3,7 +3,7 @@ import path from 'node:path'
import { loggerService } from '@logger'
import { fileStorage } from '@main/services/FileStorage'
import { FileMetadata, PreprocessProvider } from '@types'
import type { FileMetadata, PreprocessProvider } from '@types'
import AdmZip from 'adm-zip'
import { net } from 'electron'


@@ -3,7 +3,7 @@ import path from 'node:path'
import { loggerService } from '@logger'
import { fileStorage } from '@main/services/FileStorage'
import { FileMetadata, PreprocessProvider } from '@types'
import type { FileMetadata, PreprocessProvider } from '@types'
import AdmZip from 'adm-zip'
import { net } from 'electron'


@@ -4,11 +4,12 @@ import { loggerService } from '@logger'
import { fileStorage } from '@main/services/FileStorage'
import { MistralClientManager } from '@main/services/MistralClientManager'
import { MistralService } from '@main/services/remotefile/MistralService'
import { Mistral } from '@mistralai/mistralai'
import { DocumentURLChunk } from '@mistralai/mistralai/models/components/documenturlchunk'
import { ImageURLChunk } from '@mistralai/mistralai/models/components/imageurlchunk'
import { OCRResponse } from '@mistralai/mistralai/models/components/ocrresponse'
import { FileMetadata, FileTypes, PreprocessProvider, Provider } from '@types'
import type { Mistral } from '@mistralai/mistralai'
import type { DocumentURLChunk } from '@mistralai/mistralai/models/components/documenturlchunk'
import type { ImageURLChunk } from '@mistralai/mistralai/models/components/imageurlchunk'
import type { OCRResponse } from '@mistralai/mistralai/models/components/ocrresponse'
import type { FileMetadata, PreprocessProvider, Provider } from '@types'
import { FileTypes } from '@types'
import path from 'path'
import BasePreprocessProvider from './BasePreprocessProvider'


@@ -3,7 +3,7 @@ import path from 'node:path'
import { loggerService } from '@logger'
import { fileStorage } from '@main/services/FileStorage'
import { FileMetadata, PreprocessProvider } from '@types'
import type { FileMetadata, PreprocessProvider } from '@types'
import AdmZip from 'adm-zip'
import { net } from 'electron'
import FormData from 'form-data'


@@ -1,6 +1,6 @@
import { FileMetadata, PreprocessProvider as Provider } from '@types'
import type { FileMetadata, PreprocessProvider as Provider } from '@types'
import BasePreprocessProvider from './BasePreprocessProvider'
import type BasePreprocessProvider from './BasePreprocessProvider'
import PreprocessProviderFactory from './PreprocessProviderFactory'
export default class PreprocessProvider {


@@ -1,6 +1,6 @@
import { PreprocessProvider } from '@types'
import type { PreprocessProvider } from '@types'
import BasePreprocessProvider from './BasePreprocessProvider'
import type BasePreprocessProvider from './BasePreprocessProvider'
import DefaultPreprocessProvider from './DefaultPreprocessProvider'
import Doc2xPreprocessProvider from './Doc2xPreprocessProvider'
import MineruPreprocessProvider from './MineruPreprocessProvider'


@@ -1,7 +1,7 @@
import { DEFAULT_DOCUMENT_COUNT, DEFAULT_RELEVANT_SCORE } from '@main/utils/knowledge'
import { KnowledgeBaseParams, KnowledgeSearchResult } from '@types'
import type { KnowledgeBaseParams, KnowledgeSearchResult } from '@types'
import { MultiModalDocument, RerankStrategy } from './strategies/RerankStrategy'
import type { MultiModalDocument, RerankStrategy } from './strategies/RerankStrategy'
import { StrategyFactory } from './strategies/StrategyFactory'
export default abstract class BaseReranker {


@@ -1,4 +1,4 @@
import { KnowledgeBaseParams, KnowledgeSearchResult } from '@types'
import type { KnowledgeBaseParams, KnowledgeSearchResult } from '@types'
import { net } from 'electron'
import BaseReranker from './BaseReranker'


@@ -1,4 +1,4 @@
import { KnowledgeBaseParams, KnowledgeSearchResult } from '@types'
import type { KnowledgeBaseParams, KnowledgeSearchResult } from '@types'
import GeneralReranker from './GeneralReranker'


@@ -1,4 +1,4 @@
import { MultiModalDocument, RerankStrategy } from './RerankStrategy'
import type { MultiModalDocument, RerankStrategy } from './RerankStrategy'
export class BailianStrategy implements RerankStrategy {
  buildUrl(): string {
    return 'https://dashscope.aliyuncs.com/api/v1/services/rerank/text-rerank/text-rerank'


@@ -1,4 +1,4 @@
import { MultiModalDocument, RerankStrategy } from './RerankStrategy'
import type { MultiModalDocument, RerankStrategy } from './RerankStrategy'
export class DefaultStrategy implements RerankStrategy {
  buildUrl(baseURL?: string): string {
    if (baseURL && baseURL.endsWith('/')) {


@@ -1,4 +1,4 @@
import { MultiModalDocument, RerankStrategy } from './RerankStrategy'
import type { MultiModalDocument, RerankStrategy } from './RerankStrategy'
export class JinaStrategy implements RerankStrategy {
  buildUrl(baseURL?: string): string {
    if (baseURL && baseURL.endsWith('/')) {


@@ -1,7 +1,7 @@
import { BailianStrategy } from './BailianStrategy'
import { DefaultStrategy } from './DefaultStrategy'
import { JinaStrategy } from './JinaStrategy'
import { RerankStrategy } from './RerankStrategy'
import type { RerankStrategy } from './RerankStrategy'
import { TEIStrategy } from './TeiStrategy'
import { isTEIProvider, RERANKER_PROVIDERS } from './types'
import { VoyageAIStrategy } from './VoyageStrategy'


@@ -1,4 +1,4 @@
import { MultiModalDocument, RerankStrategy } from './RerankStrategy'
import type { MultiModalDocument, RerankStrategy } from './RerankStrategy'
export class TEIStrategy implements RerankStrategy {
  buildUrl(baseURL?: string): string {
    if (baseURL && baseURL.endsWith('/')) {


@@ -1,4 +1,4 @@
import { MultiModalDocument, RerankStrategy } from './RerankStrategy'
import type { MultiModalDocument, RerankStrategy } from './RerankStrategy'
export class VoyageAIStrategy implements RerankStrategy {
  buildUrl(baseURL?: string): string {
    if (baseURL && baseURL.endsWith('/')) {


@@ -2,7 +2,8 @@
// port https://github.com/modelcontextprotocol/servers/blob/main/src/brave-search/index.ts
import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { CallToolRequestSchema, ListToolsRequestSchema, Tool } from '@modelcontextprotocol/sdk/types.js'
import type { Tool } from '@modelcontextprotocol/sdk/types.js'
import { CallToolRequestSchema, ListToolsRequestSchema } from '@modelcontextprotocol/sdk/types.js'
import { net } from 'electron'
const WEB_SEARCH_TOOL: Tool = {


@@ -1,6 +1,7 @@
import { loggerService } from '@logger'
import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { BuiltinMCPServerName, BuiltinMCPServerNames } from '@types'
import type { Server } from '@modelcontextprotocol/sdk/server/index.js'
import type { BuiltinMCPServerName } from '@types'
import { BuiltinMCPServerNames } from '@types'
import BraveSearchServer from './brave-search'
import DiDiMcpServer from './didi-mcp'


@@ -3,7 +3,8 @@
import { loggerService } from '@logger'
import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { CallToolRequestSchema, ListToolsRequestSchema, Tool } from '@modelcontextprotocol/sdk/types.js'
import type { Tool } from '@modelcontextprotocol/sdk/types.js'
import { CallToolRequestSchema, ListToolsRequestSchema } from '@modelcontextprotocol/sdk/types.js'
// Fixed chalk import for ESM
import chalk from 'chalk'


@@ -1,5 +1,5 @@
import { IpcChannel } from '@shared/IpcChannel'
import {
import type {
  ApiServerConfig,
  GetApiServerStatusResult,
  RestartApiServerStatusResult,


@@ -2,7 +2,8 @@ import { isMac } from '@main/constant'
import { windowService } from '@main/services/WindowService'
import { locales } from '@main/utils/locales'
import { IpcChannel } from '@shared/IpcChannel'
import { app, Menu, MenuItemConstructorOptions, shell } from 'electron'
import type { MenuItemConstructorOptions } from 'electron'
import { app, Menu, shell } from 'electron'
import { configManager } from './ConfigManager'
export class AppMenuService {


@@ -4,9 +4,11 @@ import { getIpCountry } from '@main/utils/ipService'
import { generateUserAgent } from '@main/utils/systemInfo'
import { FeedUrl, UpgradeChannel } from '@shared/config/constant'
import { IpcChannel } from '@shared/IpcChannel'
import { CancellationToken, UpdateInfo } from 'builder-util-runtime'
import type { UpdateInfo } from 'builder-util-runtime'
import { CancellationToken } from 'builder-util-runtime'
import { app, net } from 'electron'
import { AppUpdater as _AppUpdater, autoUpdater, Logger, NsisUpdater, UpdateCheckResult } from 'electron-updater'
import type { AppUpdater as _AppUpdater, Logger, NsisUpdater, UpdateCheckResult } from 'electron-updater'
import { autoUpdater } from 'electron-updater'
import path from 'path'
import semver from 'semver'


@@ -1,14 +1,14 @@
import { loggerService } from '@logger'
import { IpcChannel } from '@shared/IpcChannel'
import { WebDavConfig } from '@types'
import { S3Config } from '@types'
import type { WebDavConfig } from '@types'
import type { S3Config } from '@types'
import archiver from 'archiver'
import { exec } from 'child_process'
import { app } from 'electron'
import * as fs from 'fs-extra'
import StreamZip from 'node-stream-zip'
import * as path from 'path'
import { CreateDirectoryOptions, FileStat } from 'webdav'
import type { CreateDirectoryOptions, FileStat } from 'webdav'
import { getDataPath } from '../utils'
import S3Storage from './S3Storage'

Some files were not shown because too many files have changed in this diff.