refactor: Unified Logger / 统一日志管理 (#8207)
* Revert "feat: optimize minapp cache with LRU (#8160)"
This reverts commit f0043b4be5.
* feat: integrate logger service and enhance logging throughout the application
- Added a new LoggerService to standardize logging across the application.
- Replaced console.error and console.warn calls with logger methods for improved consistency and error tracking.
- Introduced a new IPC channel for logging messages to the main process.
- Updated various components and services to utilize the new logging system, enhancing error handling and debugging capabilities.
* refactor: enhance logging and error handling across various components
- Integrated the LoggerService for consistent logging throughout the application.
- Updated multiple components and services to utilize the new logging system, improving error tracking and debugging capabilities.
- Refactored file handling and error management in several services to enhance reliability and clarity.
- Improved the structure and readability of the codebase by removing redundant checks and simplifying logic.
* chore: update TypeScript configuration and enhance test setup
- Added test mock paths to tsconfig.web.json for improved test coverage.
- Configured Vitest to include a setup file for main tests, ensuring consistent test environment.
- Updated IPC logger context for better clarity in logging.
- Enhanced LoggerService to handle undefined values gracefully.
- Mocked LoggerService globally in renderer tests to streamline testing process.
* refactor: standardize logging across ProxyManager and ReduxService
- Replaced instances of Logger with logger for consistent logging implementation.
- Improved logging clarity in ProxyManager's configureProxy method and ReduxService's state handling.
- Enhanced error logging in ReduxService to align with the new logging system.
* refactor: reorganize LoggerService for improved clarity and consistency
- Moved the definition of SYSTEM_INFO, APP_VERSION, and DEFAULT_LEVEL to enhance code organization.
- Simplified the getIsDev function in the renderer LoggerService for better readability.
- Updated logging conditions to ensure messages are logged correctly based on context.
* docs: add usage instructions for LoggerService and clean up logging code
- Included important usage instructions for LoggerService in both English and Chinese.
- Commented out the console transport in LoggerService to streamline logging.
- Improved logging message formatting in MCPService for clarity.
- Removed redundant logging statements in SelectionService to enhance code cleanliness.
* refactor: update LoggerService documentation paths and enhance logging implementation
- Changed the documentation paths for LoggerService usage instructions to `docs/technical/how-to-use-logger-en.md` and `docs/technical/how-to-use-logger-zh.md`.
- Replaced console logging with the loggerService in various components, including `MCPSettings`, `BlockManager`, and multiple callback files, to ensure consistent logging practices across the application.
- Improved the clarity and context of log messages for better debugging and monitoring.
* docs: emphasize logger usage guidelines in documentation
- Added a note in both English and Chinese documentation to discourage the use of `console.xxx` for logging unless necessary, promoting consistent logging practices across the application.
docs/technical/how-to-use-logger-en.md (134 lines, new file)
@@ -0,0 +1,134 @@
# How to use the LoggerService

This is a developer document on how to use the logger.

CherryStudio uses a unified logging service to print and record logs. **Unless there is a special reason, do not use `console.xxx` to print logs.**

The following are detailed instructions.

## Usage in the `main` process

### Importing

```typescript
import { loggerService } from '@logger'
```

### Setting module information (Required by convention)

After the import statements, set it up as follows:

```typescript
const logger = loggerService.withContext('moduleName')
```

- `moduleName` is the name of the current file's module. It can be named after the filename, main class name, main function name, etc. The guiding principle is that it should be clear and understandable.
- `moduleName` is printed in the terminal and is also present in the file log, making it easier to filter.

### Setting `CONTEXT` information (Optional)

In `withContext`, you can also set other `CONTEXT` information:

```typescript
const logger = loggerService.withContext('moduleName', CONTEXT)
```

- `CONTEXT` is an object of the form `{ key: value, ... }`.
- `CONTEXT` information is not printed in the terminal, but it is recorded in the file log, making it easier to filter.

### Logging

In your code, you can call `logger` at any time to record logs. The supported methods are: `error`, `warn`, `info`, `verbose`, `debug`, and `silly`.
For the meaning of each level, please refer to the section below.

The following examples show how to use `logger.info` and `logger.error`. The other levels are used in the same way:

```typescript
logger.info('message', CONTEXT)
logger.info('message %s %d', 'hello', 123, CONTEXT)
logger.error('message', new Error('error message'), CONTEXT)
```

- `message` is a required string. All other arguments are optional.
- `CONTEXT`, an object of the form `{ key: value, ... }`, is optional and is recorded in the log file.
- If an `Error` object is passed, the error stack is recorded automatically.
### Log Levels

- In the development environment, all log levels are printed to the terminal and recorded in the file log.
- In the production environment, the default log level is `info`. Logs are only recorded to the file and are not printed to the terminal.

Changing the log level:

- You can change the log level with `logger.setLevel('newLevel')`.
- `logger.resetLevel()` resets it to the default level.
- `logger.getLevel()` returns the current log level.

**Note:** Changing the log level takes effect globally. Do not change it casually in your code unless you are very clear about what you are doing.
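To make the level threshold concrete, here is a minimal, hypothetical sketch of how level gating typically works (a stand-in, not the real LoggerService internals); the level ordering follows this document, with `error` highest and `silly` lowest:

```typescript
// Hypothetical sketch of level gating; the real LoggerService may differ.
type LogLevel = 'error' | 'warn' | 'info' | 'verbose' | 'debug' | 'silly'

// Lower number = higher priority, matching the ordering in this document.
const LEVEL_PRIORITY: Record<LogLevel, number> = {
  error: 0,
  warn: 1,
  info: 2,
  verbose: 3,
  debug: 4,
  silly: 5
}

// A message is emitted only when its priority is at or above the
// configured threshold set via setLevel.
function shouldLog(messageLevel: LogLevel, configuredLevel: LogLevel): boolean {
  return LEVEL_PRIORITY[messageLevel] <= LEVEL_PRIORITY[configuredLevel]
}
```

With the production default of `info`, a `debug` message is filtered out while `warn` and `error` still get through.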
## Usage in the `renderer` process

Usage in the `renderer` process for *importing*, *setting module information*, and *setting context information* is **exactly the same** as in the `main` process.
The following section focuses on the differences.

### `initWindowSource`

In the `renderer` process there are different `window`s. Before starting to use the `logger`, we must set the `window` information:

```typescript
loggerService.initWindowSource('windowName')
```

As a rule, we set this in the `window`'s `entryPoint.tsx`. This ensures that `windowName` is set before the logger is used.

- An error is thrown if `windowName` is not set, and the `logger` will not work.
- `windowName` can only be set once; subsequent attempts to set it have no effect.
- `windowName` is not printed in the `devTool`'s `console`, but it is recorded in the `main` process terminal and the file log.
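The set-once rule above can be sketched with a small stand-in class (this is an illustration of the documented behavior, not the real LoggerService):

```typescript
// Stand-in sketch of initWindowSource's set-once semantics.
class WindowSourceHolder {
  private windowSource?: string

  initWindowSource(name: string): void {
    // Subsequent calls have no effect; only the first name sticks.
    if (this.windowSource !== undefined) return
    this.windowSource = name
  }

  getWindowSource(): string {
    if (this.windowSource === undefined) {
      // Mirrors the documented rule: using the logger before
      // initWindowSource has been called is an error.
      throw new Error('initWindowSource must be called before logging')
    }
    return this.windowSource
  }
}
```

The window name `'mainWindow'` used in any example is hypothetical; use the actual window's name in its `entryPoint.tsx`.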
### Log Levels

- In the development environment, all log levels are printed to the `devTool`'s `console` by default.
- In the production environment, the default log level is `info`, and logs are printed to the `devTool`'s `console`.
- In both development and production environments, `warn` and `error` level logs are, by default, transmitted to the `main` process and recorded in the file log.
- In the development environment, the `main` process terminal also prints the logs transmitted from the renderer.

#### Changing the Log Level

As in the `main` process, you can manage the log level using `setLevel('level')`, `resetLevel()`, and `getLevel()`.
Likewise, changing the log level is a global adjustment.

#### Changing the Level Transmitted to `main`

Logs from the `renderer` are sent to `main` to be managed and recorded to a file centrally (according to `main`'s file logging level). By default, only `warn` and `error` level logs are transmitted to `main`.

There are two ways to change the log level for transmission to `main`:

##### Global Change

The following methods set, reset, and get the log level for transmission to `main`, respectively:

```typescript
logger.setLogToMainLevel('newLevel')
logger.resetLogToMainLevel()
logger.getLogToMainLevel()
```

**Note:** This setting takes effect globally. Do not change it casually in your code unless you are very clear about what you are doing.
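A renderer log therefore passes through two independent thresholds: one for the `devTool` console and one for forwarding to `main`. The following is a hypothetical sketch of that routing (not the real implementation):

```typescript
// Hypothetical routing sketch: a renderer message is checked against the
// console level and the logToMain level independently.
type LogLevel = 'error' | 'warn' | 'info' | 'verbose' | 'debug' | 'silly'

const PRIORITY: Record<LogLevel, number> = {
  error: 0, warn: 1, info: 2, verbose: 3, debug: 4, silly: 5
}

function route(
  messageLevel: LogLevel,
  consoleLevel: LogLevel,
  logToMainLevel: LogLevel
): { toConsole: boolean; toMain: boolean } {
  return {
    toConsole: PRIORITY[messageLevel] <= PRIORITY[consoleLevel],
    toMain: PRIORITY[messageLevel] <= PRIORITY[logToMainLevel]
  }
}
```

With the documented defaults (console at `info`, transmission to `main` at `warn`), an `info` message appears in the console but is not forwarded, while an `error` message does both.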
##### Per-log Change

By adding `{ logToMain: true }` at the end of a log call, you can force a single log entry to be transmitted to `main` (bypassing the global log level restriction), for example:

```typescript
logger.info('message', { logToMain: true })
```
## Log Level Usage Guidelines

There are many log levels. The following guidelines define when each level should be used in CherryStudio:
(arranged from highest to lowest log level)

| Log Level | Core Definition & Use Case | Example |
| :--- | :--- | :--- |
| **`error`** | **Critical error causing the program to crash or core functionality to become unusable.** <br> This is the highest-priority log, usually requiring immediate reporting or user notification. | - Main or renderer process crash. <br> - Failure to read/write critical user data files (e.g., database, configuration files), preventing the application from running. <br> - All unhandled exceptions. |
| **`warn`** | **Potential issue or unexpected situation that does not affect the program's core functionality.** <br> The program can recover or use a fallback. | - Configuration file `settings.json` is missing; started with default settings. <br> - Auto-update check failed, but does not affect the use of the current version. <br> - A non-essential plugin failed to load. |
| **`info`** | **Records application lifecycle events and key user actions.** <br> This is the default level that should be recorded in a production release, used to trace the user's main operational path. | - Application start, exit. <br> - User successfully opens/saves a file. <br> - Main window created/closed. <br> - Starting an important task (e.g., "Start video export"). |
| **`verbose`** | **More detailed flow information than `info`, used for tracing specific features.** <br> Enabled when diagnosing issues with a specific feature to help understand the internal execution flow. | - Loading `Toolbar` module. <br> - IPC message `open-file-dialog` sent from the renderer process. <br> - Applying filter 'Sepia' to the image. |
| **`debug`** | **Detailed diagnostic information used during development and debugging.** <br> **Must not be enabled by default in production releases**, as it may contain sensitive data and impact performance. | - Parameters for function `renderImage`: `{ width: 800, ... }`. <br> - Specific data content received by IPC message `save-file`. <br> - Details of Redux/Vuex state changes in the renderer process. |
| **`silly`** | **The most detailed, low-level information, used only for extreme debugging.** <br> Rarely used in regular development; only for solving very difficult problems. | - Real-time mouse coordinates `(x: 150, y: 320)`. <br> - Size of each data chunk when reading a file. <br> - Time taken for each rendered frame. |
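The guidelines above can be illustrated with a small example. The function and file names here are invented for illustration, and a recording stub stands in for the real logger:

```typescript
// Recording stub in place of the real logger, so the example is self-contained.
type Entry = { level: string; message: string }
const entries: Entry[] = []
const logger = {
  info: (message: string) => entries.push({ level: 'info', message }),
  warn: (message: string) => entries.push({ level: 'warn', message }),
  error: (message: string, _err?: Error) => entries.push({ level: 'error', message })
}

// Hypothetical settings loader showing one level per situation:
// lifecycle -> info, recoverable fallback -> warn, broken core path -> error.
function loadSettings(raw: string | undefined): Record<string, unknown> {
  logger.info('Loading settings') // lifecycle event
  if (raw === undefined) {
    // Recoverable: fall back to defaults.
    logger.warn('settings.json is missing; started with default settings')
    return {}
  }
  try {
    return JSON.parse(raw) as Record<string, unknown>
  } catch (err) {
    // Core functionality broken: log at error with the Error attached.
    logger.error('Failed to parse settings.json', err as Error)
    throw err
  }
}
```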
docs/technical/how-to-use-logger-zh.md (135 lines, new file)
@@ -0,0 +1,135 @@
# How to use the LoggerService

This is a developer document on how to use the logger.

CherryStudio uses a unified logging service to print and record logs. **Unless there is a special reason, do not use `console.xxx` to print logs.**

The following are detailed instructions.

## Usage in the `main` process

### Importing

```typescript
import { loggerService } from '@logger'
```

### Setting module information (Required by convention)

After the import statements, set:

```typescript
const logger = loggerService.withContext('moduleName')
```

- `moduleName` is the name of the current file's module. It can be named after the filename, main class name, main function name, etc.; the guiding principle is that it should be clear and understandable.
- `moduleName` is printed in the terminal and also recorded in the file log, making it easier to filter.

### Setting `CONTEXT` information (Optional)

In `withContext`, you can also set other `CONTEXT` information:

```typescript
const logger = loggerService.withContext('moduleName', CONTEXT)
```

- `CONTEXT` is an object of the form `{ key: value, ... }`.
- `CONTEXT` information is not printed in the terminal, but it is recorded in the file log, making it easier to filter.
### Logging

In your code, you can call `logger` at any time to record logs. The supported methods are: `error`, `warn`, `info`, `verbose`, `debug`, and `silly`.
For the meaning of each level, please refer to the section below.

The following examples use `logger.info` and `logger.error`; the other levels work the same way:

```typescript
logger.info('message', CONTEXT)
logger.info('message %s %d', 'hello', 123, CONTEXT)
logger.error('message', new Error('error message'), CONTEXT)
```

- `message` is required and of type `string`; all other arguments are optional.
- `CONTEXT`, an object of the form `{ key: value, ... }`, is optional and is recorded in the log file.
- If an `Error` object is passed, the error stack is recorded automatically.

### Log Levels

- In the development environment, all log levels are printed to the terminal and recorded in the file log.
- In the production environment, the default log level is `info`. Logs are only recorded to the file and are not printed to the terminal.

Changing the log level:

- You can change the log level with `logger.setLevel('newLevel')`.
- `logger.resetLevel()` resets it to the default level.
- `logger.getLevel()` returns the current log level.

**Note:** Changing the log level takes effect globally. Do not change it casually in your code unless you are very clear about what you are doing.

## Usage in the `renderer` process

Usage in the `renderer` process for *importing*, *setting `module` information*, and *setting `context` information* is **exactly the same** as in the `main` process.
The following section focuses on the differences.

### `initWindowSource`

In the `renderer` process there are different `window`s. Before starting to use the `logger`, we must set the `window` information:

```typescript
loggerService.initWindowSource('windowName')
```

As a rule, we set this in the `window`'s `entryPoint.tsx`, which ensures that `windowName` is set before the logger is used.

- An error is thrown if `windowName` is not set, and the `logger` will not work.
- `windowName` can only be set once; subsequent attempts to set it have no effect.
- `windowName` is not printed in the `devTool`'s `console`, but it is recorded in the `main` process terminal and the file log.
### Log Levels

- In the development environment, all log levels are printed to the `devTool`'s `console` by default.
- In the production environment, the default log level is `info`, and logs are printed to the `devTool`'s `console`.
- In both development and production environments, `warn` and `error` level logs are, by default, transmitted to the `main` process and recorded in the file log.
- In the development environment, the `main` process terminal also prints the logs transmitted from the renderer.

#### Changing the Log Level

As in the `main` process, you can manage the log level using `setLevel('level')`, `resetLevel()`, and `getLevel()`.
Likewise, changing the log level is a global adjustment.

#### Changing the Level Transmitted to `main`

Logs from the `renderer` are sent to `main`, which manages them and records them to a file centrally (according to `main`'s file logging level). By default, only `warn` and `error` level logs are transmitted to `main`.

There are two ways to change the log level for transmission to `main`:

##### Global Change

The following methods set, reset, and get the log level for transmission to `main`, respectively:

```typescript
logger.setLogToMainLevel('newLevel')
logger.resetLogToMainLevel()
logger.getLogToMainLevel()
```

**Note:** This setting takes effect globally. Do not change it casually in your code unless you are very clear about what you are doing.

##### Per-log Change

By adding `{ logToMain: true }` at the end of a log call, you can transmit that single log entry to `main` (bypassing the global log level restriction), for example:

```typescript
logger.info('message', { logToMain: true })
```

## Log Level Usage Guidelines

There are many log levels. The following guidelines define when each level should be used in CherryStudio:
(arranged from highest to lowest log level)

| Log Level | Core Definition & Use Case | Example |
| :--- | :--- | :--- |
| **`error`** | **Critical error causing the program to crash or core functionality to become unusable.** <br> This is the highest-priority log, usually requiring immediate reporting or user notification. | - Main or renderer process crash. <br> - Failure to read/write critical user data files (e.g., database, configuration files), preventing the application from running. <br> - All uncaught exceptions. |
| **`warn`** | **Potential issue or unexpected situation that does not affect the program's core functionality.** <br> The program can recover or use a fallback. | - Configuration file `settings.json` is missing; started with default settings. <br> - Auto-update check failed, but does not affect the use of the current version. <br> - A non-core plugin failed to load. |
| **`info`** | **Records application lifecycle events and key user actions.** <br> This is the default level that should be recorded in a production release, used to trace the user's main operational path. | - Application start, exit. <br> - User successfully opens/saves a file. <br> - Main window created/closed. <br> - Starting an important task (e.g., "Start video export"). |
| **`verbose`** | **More detailed flow information than `info`, used for tracing specific features.** <br> Enabled when diagnosing issues with a specific feature to help understand the internal execution flow. | - Loading the `Toolbar` module. <br> - IPC message `open-file-dialog` sent from the renderer process. <br> - Applying the 'Sepia' filter to the image. |
| **`debug`** | **Detailed diagnostic information used during development and debugging.** <br> **Must not be enabled by default in production releases**, as it may contain sensitive data and impact performance. | - Parameters for function `renderImage`: `{ width: 800, ... }`. <br> - Specific data content received by IPC message `save-file`. <br> - Details of Redux/Vuex state changes in the renderer process. |
| **`silly`** | **The most detailed, low-level information, used only for extreme debugging.** <br> Rarely used in regular development; only for solving very difficult problems. | - Real-time mouse coordinates `(x: 150, y: 320)`. <br> - Size of each data chunk when reading a file. <br> - Time taken for each rendered frame. |
@@ -18,7 +18,8 @@ export default defineConfig({
     alias: {
       '@main': resolve('src/main'),
       '@types': resolve('src/renderer/src/types'),
-      '@shared': resolve('packages/shared')
+      '@shared': resolve('packages/shared'),
+      '@logger': resolve('src/main/services/LoggerService')
     }
   },
   build: {
@@ -68,7 +69,8 @@ export default defineConfig({
   resolve: {
     alias: {
       '@renderer': resolve('src/renderer/src'),
-      '@shared': resolve('packages/shared')
+      '@shared': resolve('packages/shared'),
+      '@logger': resolve('src/renderer/src/services/LoggerService')
     }
   },
   optimizeDeps: {
@@ -163,7 +163,6 @@
     "electron": "35.6.0",
     "electron-builder": "26.0.15",
     "electron-devtools-installer": "^3.2.0",
-    "electron-log": "^5.1.5",
     "electron-store": "^8.2.0",
     "electron-updater": "6.6.4",
     "electron-vite": "^3.1.0",
@@ -236,6 +235,8 @@
     "vite": "6.2.6",
     "vitest": "^3.1.4",
     "webdav": "^5.8.0",
+    "winston": "^3.17.0",
+    "winston-daily-rotate-file": "^5.0.0",
     "word-extractor": "^1.0.4",
     "zipread": "^1.3.3"
   },
@@ -31,6 +31,7 @@ export enum IpcChannel {
   App_GetBinaryPath = 'app:get-binary-path',
   App_InstallUvBinary = 'app:install-uv-binary',
   App_InstallBunBinary = 'app:install-bun-binary',
+  App_LogToMain = 'app:log-to-main',

   App_MacIsProcessTrusted = 'app:mac-is-process-trusted',
   App_MacRequestProcessTrust = 'app:mac-request-process-trust',
@@ -9,3 +9,12 @@ export type LoaderReturn = {
   message?: string
   messageSource?: 'preprocess' | 'embedding'
 }
+
+export type LogSourceWithContext = {
+  process: 'main' | 'renderer'
+  window?: string // only for renderer process
+  module?: string
+  context?: Record<string, any>
+}
+
+export type LogLevel = 'error' | 'warn' | 'info' | 'debug' | 'verbose' | 'silly'
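The `LogSourceWithContext` shape added above can be exercised as follows. The types are re-stated locally so the snippet stands alone, and the window/module names and the `formatSource` helper are invented for illustration (they are not part of the real service):

```typescript
// Types re-stated from the diff above so this sketch is self-contained.
type LogSourceWithContext = {
  process: 'main' | 'renderer'
  window?: string // only for renderer process
  module?: string
  context?: Record<string, any>
}

// Example payload a renderer might attach when forwarding a log to main.
const source: LogSourceWithContext = {
  process: 'renderer',
  window: 'mainWindow',
  module: 'MCPSettings',
  context: { userAction: 'save' }
}

// Hypothetical formatter that renders the source as a log prefix such as
// "[renderer:mainWindow:MCPSettings]"; undefined fields are skipped.
function formatSource(s: LogSourceWithContext): string {
  return '[' + [s.process, s.window, s.module].filter(Boolean).join(':') + ']'
}
```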
@@ -5,13 +5,13 @@ import './bootstrap'

 import '@main/config'

+import { loggerService } from '@logger'
 import { electronApp, optimizer } from '@electron-toolkit/utils'
 import { replaceDevtoolsFont } from '@main/utils/windowUtil'
 import { app } from 'electron'
 import installExtension, { REACT_DEVELOPER_TOOLS, REDUX_DEVTOOLS } from 'electron-devtools-installer'
-import Logger from 'electron-log'

-import { isDev, isWin, isLinux } from './constant'
+import { isDev, isLinux, isWin } from './constant'
 import { registerIpc } from './ipc'
 import { configManager } from './services/ConfigManager'
 import mcpService from './services/MCPService'
@@ -26,7 +26,7 @@ import { registerShortcuts } from './services/ShortcutService'
 import { TrayService } from './services/TrayService'
 import { windowService } from './services/WindowService'

-Logger.initialize()
+const logger = loggerService.withContext('MainEntry')

 /**
  * Disable hardware acceleration if setting is enabled
@@ -68,9 +68,9 @@ app.on('web-contents-created', (_, webContents) => {

   webContents.on('unresponsive', async () => {
     // Interrupt execution and collect call stack from unresponsive renderer
-    Logger.error('Renderer unresponsive start')
+    logger.error('Renderer unresponsive start')
     const callStack = await webContents.mainFrame.collectJavaScriptCallStack()
-    Logger.error('Renderer unresponsive js call stack\n', callStack)
+    logger.error('Renderer unresponsive js call stack\n', callStack)
   })
 })
@@ -78,12 +78,12 @@ app.on('web-contents-created', (_, webContents) => {
 if (!isDev) {
   // handle uncaught exception
   process.on('uncaughtException', (error) => {
-    Logger.error('Uncaught Exception:', error)
+    logger.error('Uncaught Exception:', error)
   })

   // handle unhandled rejection
   process.on('unhandledRejection', (reason, promise) => {
-    Logger.error('Unhandled Rejection at:', promise, 'reason:', reason)
+    logger.error('Unhandled Rejection at:', promise, 'reason:', reason)
   })
 }
@@ -181,8 +181,11 @@ if (!app.requestSingleInstanceLock()) {
   try {
     await mcpService.cleanup()
   } catch (error) {
-    Logger.error('Error cleaning up MCP service:', error)
+    logger.error('Error cleaning up MCP service:', error)
   }
+
+  // finish the logger
+  logger.finish()
 })

 // In this file you can include the rest of your app"s specific main process
@@ -2,6 +2,7 @@ import fs from 'node:fs'
 import { arch } from 'node:os'
 import path from 'node:path'

+import { loggerService } from '@logger'
 import { isLinux, isMac, isWin } from '@main/constant'
 import { getBinaryPath, isBinaryExists, runInstallScript } from '@main/utils/process'
 import { handleZoomFactor } from '@main/utils/zoom'
@@ -9,7 +10,6 @@ import { UpgradeChannel } from '@shared/config/constant'
 import { IpcChannel } from '@shared/IpcChannel'
 import { FileMetadata, Provider, Shortcut, ThemeMode } from '@types'
 import { BrowserWindow, dialog, ipcMain, ProxyConfig, session, shell, systemPreferences, webContents } from 'electron'
-import log from 'electron-log'
 import { Notification } from 'src/renderer/src/types/notification'

 import appService from './services/AppService'
@@ -43,6 +43,8 @@ import { decrypt, encrypt } from './utils/aes'
 import { getCacheDir, getConfigDir, getFilesDir, hasWritePermission, updateAppDataConfig } from './utils/file'
 import { compress, decompress } from './utils/zip'

+const logger = loggerService.withContext('IPC')
+
 const fileManager = new FileStorage()
 const backupManager = new BackupManager()
 const exportService = new ExportService(fileManager)
@@ -66,7 +68,7 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
     configPath: getConfigDir(),
     appDataPath: app.getPath('userData'),
     resourcesPath: getResourcePath(),
-    logsPath: log.transports.file.getFile().path,
+    logsPath: logger.getLogsDir(),
     arch: arch(),
     isPortable: isWin && 'PORTABLE_EXECUTABLE_DIR' in process.env,
     installPath: path.dirname(app.getPath('exe'))
@@ -145,7 +147,7 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
   })

   ipcMain.handle(IpcChannel.App_SetTestPlan, async (_, isActive: boolean) => {
-    log.info('set test plan', isActive)
+    logger.info('set test plan', isActive)
     if (isActive !== configManager.getTestPlan()) {
       appUpdater.cancelDownload()
       configManager.setTestPlan(isActive)
@@ -153,7 +155,7 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
   })

   ipcMain.handle(IpcChannel.App_SetTestChannel, async (_, channel: UpgradeChannel) => {
-    log.info('set test channel', channel)
+    logger.info('set test channel', channel)
     if (channel !== configManager.getTestChannel()) {
       appUpdater.cancelDownload()
       configManager.setTestChannel(channel)
@@ -205,10 +207,12 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
         })
       )
       await fileManager.clearTemp()
-      await fs.writeFileSync(log.transports.file.getFile().path, '')
+      // do not clear logs for now
+      // TODO clear logs
+      // await fs.writeFileSync(log.transports.file.getFile().path, '')
       return { success: true }
     } catch (error: any) {
-      log.error('Failed to clear cache:', error)
+      logger.error('Failed to clear cache:', error)
       return { success: false, error: error.message }
     }
   })
@@ -216,14 +220,14 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
   // get cache size
   ipcMain.handle(IpcChannel.App_GetCacheSize, async () => {
     const cachePath = getCacheDir()
-    log.info(`Calculating cache size for path: ${cachePath}`)
+    logger.info(`Calculating cache size for path: ${cachePath}`)

     try {
       const sizeInBytes = await calculateDirectorySize(cachePath)
       const sizeInMB = (sizeInBytes / (1024 * 1024)).toFixed(2)
       return `${sizeInMB}`
     } catch (error: any) {
-      log.error(`Failed to calculate cache size for ${cachePath}: ${error.message}`)
+      logger.error(`Failed to calculate cache size for ${cachePath}: ${error.message}`)
       return '0'
     }
   })
@@ -260,7 +264,7 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
       }
       return filePaths[0]
     } catch (error: any) {
-      log.error('Failed to select app data path:', error)
+      logger.error('Failed to select app data path:', error)
       return null
     }
   })
@@ -313,7 +317,7 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
       })
       return { success: true }
     } catch (error: any) {
-      log.error('Failed to copy user data:', error)
+      logger.error('Failed to copy user data:', error)
       return { success: false, error: error.message }
     }
   })
@@ -322,7 +326,7 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
   ipcMain.handle(IpcChannel.App_RelaunchApp, (_, options?: Electron.RelaunchOptions) => {
     // Fix for .AppImage
     if (isLinux && process.env.APPIMAGE) {
-      log.info('Relaunching app with options:', process.env.APPIMAGE, options)
+      logger.info('Relaunching app with options:', process.env.APPIMAGE, options)
       // On Linux, we need to use the APPIMAGE environment variable to relaunch
       // https://github.com/electron-userland/electron-builder/issues/1727#issuecomment-769896927
       options = options || {}
@@ -554,7 +558,7 @@ export function registerIpc(mainWindow: BrowserWindow, app: Electron.App) {
       // Process DXT file using the temporary path
       return await dxtService.uploadDxt(event, tempPath)
     } catch (error) {
-      log.error('[IPC] DXT upload error:', error)
+      logger.error('DXT upload error:', error)
       return {
         success: false,
         error: error instanceof Error ? error.message : 'Failed to upload DXT file'
@@ -1,12 +1,14 @@
+import { loggerService } from '@logger'
 import { isMac } from '@main/constant'
 import { FileMetadata, OcrProvider } from '@types'
-import Logger from 'electron-log'
 import * as fs from 'fs'
 import * as path from 'path'
 import { TextItem } from 'pdfjs-dist/types/src/display/api'

 import BaseOcrProvider from './BaseOcrProvider'

+const logger = loggerService.withContext('MacSysOcrProvider')
+
 export default class MacSysOcrProvider extends BaseOcrProvider {
   private readonly MIN_TEXT_LENGTH = 1000
   private MacOCR: any
@@ -21,7 +23,7 @@ export default class MacSysOcrProvider extends BaseOcrProvider {
       const module = await import('@cherrystudio/mac-system-ocr')
       this.MacOCR = module.default
     } catch (error) {
-      Logger.error('[OCR] Failed to load mac-system-ocr:', error)
+      logger.error('Failed to load mac-system-ocr:', error)
       throw error
     }
   }
@@ -83,7 +85,7 @@ export default class MacSysOcrProvider extends BaseOcrProvider {
   }

   public async parseFile(sourceId: string, file: FileMetadata): Promise<{ processedFile: FileMetadata }> {
-    Logger.info(`[OCR] Starting OCR process for file: ${file.name}`)
+    logger.info(`Starting OCR process for file: ${file.name}`)
     if (file.ext === '.pdf') {
       try {
         const { pdf } = await import('@cherrystudio/pdf-to-img-napi')
@@ -103,7 +105,7 @@ export default class MacSysOcrProvider extends BaseOcrProvider {

         await new Promise<void>((resolve, reject) => {
           writeStream.end(() => {
-            Logger.info(`[OCR] OCR process completed successfully for ${file.origin_name}`)
+            logger.info(`OCR process completed successfully for ${file.origin_name}`)
             resolve()
           })
           writeStream.on('error', reject)
@@ -119,7 +121,7 @@ export default class MacSysOcrProvider extends BaseOcrProvider {
           }
         }
       } catch (error) {
-        Logger.error('[OCR] Error during OCR process:', error)
+        logger.error('Error during OCR process:', error)
         throw error
       }
     }
@@ -1,16 +1,19 @@
+import { loggerService } from '@logger'
 import { isMac } from '@main/constant'
 import { OcrProvider } from '@types'
-import Logger from 'electron-log'

 import BaseOcrProvider from './BaseOcrProvider'
 import DefaultOcrProvider from './DefaultOcrProvider'
 import MacSysOcrProvider from './MacSysOcrProvider'

+const logger = loggerService.withContext('OcrProviderFactory')
+
 export default class OcrProviderFactory {
   static create(provider: OcrProvider): BaseOcrProvider {
     switch (provider.id) {
       case 'system':
         if (!isMac) {
-          Logger.warn('[OCR] System OCR provider is only available on macOS')
+          logger.warn('System OCR provider is only available on macOS')
         }
         return new MacSysOcrProvider(provider)
       default:
@@ -1,13 +1,15 @@
 import fs from 'node:fs'
 import path from 'node:path'
 
+import { loggerService } from '@logger'
 import { FileMetadata, PreprocessProvider } from '@types'
 import AdmZip from 'adm-zip'
 import axios, { AxiosRequestConfig } from 'axios'
-import Logger from 'electron-log'
 
 import BasePreprocessProvider from './BasePreprocessProvider'
 
+const logger = loggerService.withContext('Doc2xPreprocessProvider')
+
 type ApiResponse<T> = {
   code: string
   data: T

@@ -52,11 +54,11 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
 
   public async parseFile(sourceId: string, file: FileMetadata): Promise<{ processedFile: FileMetadata }> {
     try {
-      Logger.info(`Preprocess processing started: ${file.path}`)
+      logger.info(`Preprocess processing started: ${file.path}`)
 
       // Step 1: prepare the upload
       const { uid, url } = await this.preupload()
-      Logger.info(`Preprocess preupload completed: uid=${uid}`)
+      logger.info(`Preprocess preupload completed: uid=${uid}`)
 
       await this.validateFile(file.path)
 
@@ -65,7 +67,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
 
       // Step 3: wait for processing to complete
       await this.waitForProcessing(sourceId, uid)
-      Logger.info(`Preprocess parsing completed successfully for: ${file.path}`)
+      logger.info(`Preprocess parsing completed successfully for: ${file.path}`)
 
       // Step 4: export the file
       const { path: outputPath } = await this.exportFile(file, uid)

@@ -75,7 +77,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
         processedFile: this.createProcessedFileInfo(file, outputPath)
       }
     } catch (error) {
-      Logger.error(
+      logger.error(
         `Preprocess processing failed for ${file.path}: ${error instanceof Error ? error.message : String(error)}`
       )
       throw error

@@ -100,11 +102,11 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
    * @returns Path of the exported file
    */
   public async exportFile(file: FileMetadata, uid: string): Promise<{ path: string }> {
-    Logger.info(`Exporting file: ${file.path}`)
+    logger.info(`Exporting file: ${file.path}`)
 
     // Step 1: convert the file
     await this.convertFile(uid, file.path)
-    Logger.info(`File conversion completed for: ${file.path}`)
+    logger.info(`File conversion completed for: ${file.path}`)
 
     // Step 2: wait for the export and get the URL
     const exportUrl = await this.waitForExport(uid)

@@ -123,7 +125,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
       await this.delay(1000)
       const { status, progress } = await this.getStatus(uid)
       await this.sendPreprocessProgress(sourceId, progress)
-      Logger.info(`Preprocess processing status: ${status}, progress: ${progress}%`)
+      logger.info(`Preprocess processing status: ${status}, progress: ${progress}%`)
 
       if (status === 'success') {
         return

@@ -142,7 +144,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
     while (true) {
       await this.delay(1000)
       const { status, url } = await this.getParsedFile(uid)
-      Logger.info(`Export status: ${status}`)
+      logger.info(`Export status: ${status}`)
 
       if (status === 'success' && url) {
         return url

@@ -169,7 +171,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
         throw new Error(`API returned error: ${data.message || JSON.stringify(data)}`)
       }
     } catch (error) {
-      Logger.error(`Failed to get preupload URL: ${error instanceof Error ? error.message : String(error)}`)
+      logger.error(`Failed to get preupload URL: ${error instanceof Error ? error.message : String(error)}`)
       throw new Error('Failed to get preupload URL')
     }
   }

@@ -188,7 +190,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
         throw new Error(`HTTP status ${response.status}: ${response.statusText}`)
       }
     } catch (error) {
-      Logger.error(`Failed to upload file ${filePath}: ${error instanceof Error ? error.message : String(error)}`)
+      logger.error(`Failed to upload file ${filePath}: ${error instanceof Error ? error.message : String(error)}`)
       throw new Error('Failed to upload file')
     }
   }

@@ -206,7 +208,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
         throw new Error(`API returned error: ${response.data.message || JSON.stringify(response.data)}`)
       }
     } catch (error) {
-      Logger.error(`Failed to get status for uid ${uid}: ${error instanceof Error ? error.message : String(error)}`)
+      logger.error(`Failed to get status for uid ${uid}: ${error instanceof Error ? error.message : String(error)}`)
       throw new Error('Failed to get processing status')
     }
   }

@@ -242,7 +244,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
         throw new Error(`API returned error: ${response.data.message || JSON.stringify(response.data)}`)
       }
     } catch (error) {
-      Logger.error(`Failed to convert file ${filePath}: ${error instanceof Error ? error.message : String(error)}`)
+      logger.error(`Failed to convert file ${filePath}: ${error instanceof Error ? error.message : String(error)}`)
       throw new Error('Failed to convert file')
     }
   }

@@ -265,7 +267,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
         throw new Error(`HTTP status ${response.status}: ${response.statusText}`)
       }
     } catch (error) {
-      Logger.error(
+      logger.error(
         `Failed to get parsed file for uid ${uid}: ${error instanceof Error ? error.message : String(error)}`
       )
       throw new Error('Failed to get parsed file information')

@@ -288,7 +290,7 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
     fs.mkdirSync(dirPath, { recursive: true })
     fs.mkdirSync(extractPath, { recursive: true })
 
-    Logger.info(`Downloading to export path: ${zipPath}`)
+    logger.info(`Downloading to export path: ${zipPath}`)
 
     try {
       // Download the file

@@ -303,14 +305,14 @@ export default class Doc2xPreprocessProvider extends BasePreprocessProvider {
       // Extract the files
       const zip = new AdmZip(zipPath)
       zip.extractAllTo(extractPath, true)
-      Logger.info(`Extracted files to: ${extractPath}`)
+      logger.info(`Extracted files to: ${extractPath}`)
 
       // Delete the temporary ZIP file
       fs.unlinkSync(zipPath)
 
       return { path: extractPath }
     } catch (error) {
-      Logger.error(`Failed to download and extract file: ${error instanceof Error ? error.message : String(error)}`)
+      logger.error(`Failed to download and extract file: ${error instanceof Error ? error.message : String(error)}`)
       throw new Error('Failed to download and extract file')
     }
   }

@@ -1,13 +1,15 @@
 import fs from 'node:fs'
 import path from 'node:path'
 
+import { loggerService } from '@logger'
 import { FileMetadata, PreprocessProvider } from '@types'
 import AdmZip from 'adm-zip'
 import axios from 'axios'
-import Logger from 'electron-log'
 
 import BasePreprocessProvider from './BasePreprocessProvider'
 
+const logger = loggerService.withContext('MineruPreprocessProvider')
+
 type ApiResponse<T> = {
   code: number
   data: T

@@ -61,16 +63,16 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
     file: FileMetadata
   ): Promise<{ processedFile: FileMetadata; quota: number }> {
     try {
-      Logger.info(`MinerU preprocess processing started: ${file.path}`)
+      logger.info(`MinerU preprocess processing started: ${file.path}`)
       await this.validateFile(file.path)
 
       // 1. Get the upload URL and upload the file
       const batchId = await this.uploadFile(file)
-      Logger.info(`MinerU file upload completed: batch_id=${batchId}`)
+      logger.info(`MinerU file upload completed: batch_id=${batchId}`)
 
       // 2. Wait for processing to finish and fetch the result
       const extractResult = await this.waitForCompletion(sourceId, batchId, file.origin_name)
-      Logger.info(`MinerU processing completed for batch: ${batchId}`)
+      logger.info(`MinerU processing completed for batch: ${batchId}`)
 
       // 3. Download and extract the file
       const { path: outputPath } = await this.downloadAndExtractFile(extractResult.full_zip_url!, file)

@@ -84,7 +86,7 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
         quota
       }
     } catch (error: any) {
-      Logger.error(`MinerU preprocess processing failed for ${file.path}: ${error.message}`)
+      logger.error(`MinerU preprocess processing failed for ${file.path}: ${error.message}`)
       throw new Error(error.message)
     }
   }

@@ -105,7 +107,7 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
       const response: QuotaResponse = await quota.json()
       return response.data.user_left_quota
     } catch (error) {
-      console.error('Error checking quota:', error)
+      logger.error('Error checking quota:', error)
       throw error
     }
   }

@@ -143,16 +145,16 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
         try {
           fs.renameSync(originalMdPath, newMdPath)
           finalPath = newMdPath
-          Logger.info(`Renamed markdown file from ${mdFile} to ${finalName}`)
+          logger.info(`Renamed markdown file from ${mdFile} to ${finalName}`)
         } catch (renameError) {
-          Logger.warn(`Failed to rename file ${mdFile} to ${finalName}: ${renameError}`)
+          logger.warn(`Failed to rename file ${mdFile} to ${finalName}: ${renameError}`)
           // If renaming fails, fall back to the original file
           finalPath = originalMdPath
           finalName = mdFile
         }
       }
     } catch (error) {
-      Logger.warn(`Failed to read output directory ${outputPath}: ${error}`)
+      logger.warn(`Failed to read output directory ${outputPath}: ${error}`)
       finalPath = path.join(outputPath, `${file.id}.md`)
     }

@@ -171,13 +173,13 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
     const zipPath = path.join(dirPath, `${file.id}.zip`)
     const extractPath = path.join(dirPath, `${file.id}`)
 
-    Logger.info(`Downloading MinerU result to: ${zipPath}`)
+    logger.info(`Downloading MinerU result to: ${zipPath}`)
 
     try {
       // Download the ZIP file
       const response = await axios.get(zipUrl, { responseType: 'arraybuffer' })
       fs.writeFileSync(zipPath, response.data)
-      Logger.info(`Downloaded ZIP file: ${zipPath}`)
+      logger.info(`Downloaded ZIP file: ${zipPath}`)
 
       // Make sure the extraction directory exists
       if (!fs.existsSync(extractPath)) {

@@ -187,14 +189,14 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
       // Extract the files
       const zip = new AdmZip(zipPath)
       zip.extractAllTo(extractPath, true)
-      Logger.info(`Extracted files to: ${extractPath}`)
+      logger.info(`Extracted files to: ${extractPath}`)
 
       // Delete the temporary ZIP file
       fs.unlinkSync(zipPath)
 
       return { path: extractPath }
     } catch (error: any) {
-      Logger.error(`Failed to download and extract file: ${error.message}`)
+      logger.error(`Failed to download and extract file: ${error.message}`)
       throw new Error(error.message)
     }
   }

@@ -203,16 +205,16 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
     try {
      // Step 1: get the upload URLs
      const { batchId, fileUrls } = await this.getBatchUploadUrls(file)
-      Logger.info(`Got upload URLs for batch: ${batchId}`)
+      logger.info(`Got upload URLs for batch: ${batchId}`)
 
      console.log('batchId:', batchId, 'fileurls:', fileUrls)
      // Step 2: upload the file to the returned URL
      await this.putFileToUrl(file.path, fileUrls[0])
-      Logger.info(`File uploaded successfully: ${file.path}`)
+      logger.info(`File uploaded successfully: ${file.path}`)
 
      return batchId
    } catch (error: any) {
-      Logger.error(`Failed to upload file ${file.path}: ${error.message}`)
+      logger.error(`Failed to upload file ${file.path}: ${error.message}`)
      throw new Error(error.message)
    }
  }

@@ -260,7 +262,7 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
         throw new Error(`HTTP ${response.status}: ${response.statusText}`)
       }
     } catch (error: any) {
-      Logger.error(`Failed to get batch upload URLs: ${error.message}`)
+      logger.error(`Failed to get batch upload URLs: ${error.message}`)
       throw new Error(error.message)
     }
   }

@@ -296,16 +298,16 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
           body: responseBody
         }
 
-        console.error('Response details:', errorInfo)
+        logger.error('Response details:', errorInfo)
         throw new Error(`Upload failed with status ${response.status}: ${responseBody}`)
       } catch (parseError) {
         throw new Error(`Upload failed with status ${response.status}. Could not parse response body.`)
       }
     }
 
-    Logger.info(`File uploaded successfully to: ${uploadUrl}`)
+    logger.info(`File uploaded successfully to: ${uploadUrl}`)
   } catch (error: any) {
-    Logger.error(`Failed to upload file to URL ${uploadUrl}: ${error}`)
+    logger.error(`Failed to upload file to URL ${uploadUrl}: ${error}`)
     throw new Error(error.message)
   }
 }

@@ -334,7 +336,7 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
       throw new Error(`HTTP ${response.status}: ${response.statusText}`)
     }
   } catch (error: any) {
-    Logger.error(`Failed to get extract results for batch ${batchId}: ${error.message}`)
+    logger.error(`Failed to get extract results for batch ${batchId}: ${error.message}`)
     throw new Error(error.message)
   }
 }

@@ -360,7 +362,7 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
 
       // Check the processing state
       if (fileResult.state === 'done' && fileResult.full_zip_url) {
-        Logger.info(`Processing completed for file: ${fileName}`)
+        logger.info(`Processing completed for file: ${fileName}`)
         return fileResult
       } else if (fileResult.state === 'failed') {
         throw new Error(`Processing failed for file: ${fileName}, error: ${fileResult.err_msg}`)

@@ -371,15 +373,15 @@ export default class MineruPreprocessProvider extends BasePreprocessProvider {
           (fileResult.extract_progress.extracted_pages / fileResult.extract_progress.total_pages) * 100
         )
         await this.sendPreprocessProgress(sourceId, progress)
-        Logger.info(`File ${fileName} processing progress: ${progress}%`)
+        logger.info(`File ${fileName} processing progress: ${progress}%`)
       } else {
         // Without detailed progress info, send a generic progress value
         await this.sendPreprocessProgress(sourceId, 50)
-        Logger.info(`File ${fileName} is still processing...`)
+        logger.info(`File ${fileName} is still processing...`)
       }
     }
   } catch (error) {
-    Logger.warn(`Failed to check status for batch ${batchId}, retry ${retries + 1}/${maxRetries}`)
+    logger.warn(`Failed to check status for batch ${batchId}, retry ${retries + 1}/${maxRetries}`)
     if (retries === maxRetries - 1) {
       throw error
     }

@@ -1,5 +1,6 @@
 import fs from 'node:fs'
 
+import { loggerService } from '@logger'
 import { MistralClientManager } from '@main/services/MistralClientManager'
 import { MistralService } from '@main/services/remotefile/MistralService'
 import { Mistral } from '@mistralai/mistralai'

@@ -7,13 +8,14 @@ import { DocumentURLChunk } from '@mistralai/mistralai/models/components/documen
 import { ImageURLChunk } from '@mistralai/mistralai/models/components/imageurlchunk'
 import { OCRResponse } from '@mistralai/mistralai/models/components/ocrresponse'
 import { FileMetadata, FileTypes, PreprocessProvider, Provider } from '@types'
-import Logger from 'electron-log'
 import path from 'path'
 
 import BasePreprocessProvider from './BasePreprocessProvider'
 
 type PreuploadResponse = DocumentURLChunk | ImageURLChunk
 
+const logger = loggerService.withContext('MistralPreprocessProvider')
+
 export default class MistralPreprocessProvider extends BasePreprocessProvider {
   private sdk: Mistral
   private fileService: MistralService

@@ -36,20 +38,20 @@ export default class MistralPreprocessProvider extends BasePreprocessProvider {
 
   private async preupload(file: FileMetadata): Promise<PreuploadResponse> {
     let document: PreuploadResponse
-    Logger.info(`preprocess preupload started for local file: ${file.path}`)
+    logger.info(`preprocess preupload started for local file: ${file.path}`)
 
     if (file.ext.toLowerCase() === '.pdf') {
       const uploadResponse = await this.fileService.uploadFile(file)
 
       if (uploadResponse.status === 'failed') {
-        Logger.error('File upload failed:', uploadResponse)
+        logger.error('File upload failed:', uploadResponse)
         throw new Error('Failed to upload file: ' + uploadResponse.displayName)
       }
       await this.sendPreprocessProgress(file.id, 15)
       const fileUrl = await this.sdk.files.getSignedUrl({
         fileId: uploadResponse.fileId
       })
-      Logger.info('Got signed URL:', fileUrl)
+      logger.info('Got signed URL:', fileUrl)
       await this.sendPreprocessProgress(file.id, 20)
       document = {
         type: 'document_url',

@@ -152,7 +154,7 @@ export default class MistralPreprocessProvider extends BasePreprocessProvider {
 
           counter++
         } catch (error) {
-          Logger.error(`Failed to save image ${imageFileName}:`, error)
+          logger.error(`Failed to save image ${imageFileName}:`, error)
         }
       }
     })

@@ -1,8 +1,11 @@
 import { BaseEmbeddings } from '@cherrystudio/embedjs-interfaces'
 import { VoyageEmbeddings as _VoyageEmbeddings } from '@langchain/community/embeddings/voyage'
+import { loggerService } from '@logger'
 
 import { VOYAGE_SUPPORTED_DIM_MODELS } from './utils'
 
+const logger = loggerService.withContext('VoyageEmbeddings')
+
 /**
  * Models that support setting the embedding dimension
  */

@@ -16,7 +19,7 @@ export class VoyageEmbeddings extends BaseEmbeddings {
     if (!this.configuration.modelName) this.configuration.modelName = 'voyage-3'
 
     if (!VOYAGE_SUPPORTED_DIM_MODELS.includes(this.configuration.modelName) && this.configuration.outputDimension) {
-      console.error(`VoyageEmbeddings only supports ${VOYAGE_SUPPORTED_DIM_MODELS.join(', ')} to set outputDimension.`)
+      logger.error(`VoyageEmbeddings only supports ${VOYAGE_SUPPORTED_DIM_MODELS.join(', ')} to set outputDimension.`)
       this.model = new _VoyageEmbeddings({ ...this.configuration, outputDimension: undefined })
     } else {
       this.model = new _VoyageEmbeddings(this.configuration)

@@ -1,12 +1,14 @@
 import { BaseLoader } from '@cherrystudio/embedjs-interfaces'
 import { cleanString } from '@cherrystudio/embedjs-utils'
 import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters'
+import { loggerService } from '@logger'
 import { getTempDir } from '@main/utils/file'
-import Logger from 'electron-log'
 import EPub from 'epub'
 import * as fs from 'fs'
 import path from 'path'
 
+const logger = loggerService.withContext('EpubLoader')
+
 /**
  * Configuration options for the epub loader
  */

@@ -183,7 +185,7 @@ export class EpubLoader extends BaseLoader<Record<string, string | number | bool
         writeStream.write(text + '\n\n')
       }
     } catch (error) {
-      Logger.error(`[EpubLoader] Error processing chapter ${chapter.id}:`, error)
+      logger.error(`[EpubLoader] Error processing chapter ${chapter.id}:`, error)
     }
   }

@@ -203,9 +205,9 @@ export class EpubLoader extends BaseLoader<Record<string, string | number | bool
     fs.unlinkSync(tempFilePath)
 
     // Add a single completion log entry only
-    Logger.info(`[EpubLoader] 电子书 ${this.metadata?.title || path.basename(this.filePath)} 处理完成`)
+    logger.info(`[EpubLoader] 电子书 ${this.metadata?.title || path.basename(this.filePath)} 处理完成`)
   } catch (error) {
-    Logger.error('[EpubLoader] Error in extractTextFromEpub:', error)
+    logger.error('[EpubLoader] Error in extractTextFromEpub:', error)
     throw error
   }
 }

@@ -221,7 +223,7 @@ export class EpubLoader extends BaseLoader<Record<string, string | number | bool
       await this.extractTextFromEpub()
     }
 
-    Logger.info('[EpubLoader] 书名:', this.metadata?.title || '未知书名', ' 文本大小:', this.extractedText.length)
+    logger.info('[EpubLoader] 书名:', this.metadata?.title || '未知书名', ' 文本大小:', this.extractedText.length)
 
     // Create the text chunker
     const chunker = new RecursiveCharacterTextSplitter({

@@ -1,15 +1,17 @@
 import { JsonLoader, LocalPathLoader, RAGApplication, TextLoader } from '@cherrystudio/embedjs'
 import type { AddLoaderReturn } from '@cherrystudio/embedjs-interfaces'
 import { WebLoader } from '@cherrystudio/embedjs-loader-web'
+import { loggerService } from '@logger'
 import { readTextFileWithAutoEncoding } from '@main/utils/file'
 import { LoaderReturn } from '@shared/config/types'
 import { FileMetadata, KnowledgeBaseParams } from '@types'
-import Logger from 'electron-log'
 
 import { DraftsExportLoader } from './draftsExportLoader'
 import { EpubLoader } from './epubLoader'
 import { OdLoader, OdType } from './odLoader'
 
+const logger = loggerService.withContext('KnowledgeLoader')
+
 // Map of file extensions to loader types
 const FILE_LOADER_MAP: Record<string, string> = {
   // Built-in types

@@ -75,7 +77,7 @@ export async function addFileLoader(
       // JSON type handling
       let jsonObject = {}
       let jsonParsed = true
-      Logger.info(`[KnowledgeBase] processing file ${file.path} as ${loaderType} type`)
+      logger.info(`[KnowledgeBase] processing file ${file.path} as ${loaderType} type`)
       switch (loaderType) {
         case 'common':
           // Built-in type handling

@@ -127,7 +129,7 @@ export async function addFileLoader(
         jsonObject = JSON.parse(await readTextFileWithAutoEncoding(file.path))
       } catch (error) {
         jsonParsed = false
-        Logger.warn('[KnowledgeBase] failed parsing json file, falling back to text processing:', file.path, error)
+        logger.warn('[KnowledgeBase] failed parsing json file, falling back to text processing:', file.path, error)
       }
 
       if (jsonParsed) {

@@ -1,9 +1,12 @@
 import { BaseLoader } from '@cherrystudio/embedjs-interfaces'
 import { cleanString } from '@cherrystudio/embedjs-utils'
 import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters'
+import { loggerService } from '@logger'
 import md5 from 'md5'
 import { OfficeParserConfig, parseOfficeAsync } from 'officeparser'
 
+const logger = loggerService.withContext('OdLoader')
+
 export enum OdType {
   OdtLoader = 'OdtLoader',
   OdsLoader = 'OdsLoader',

@@ -42,7 +45,7 @@ export class OdLoader<OdType> extends BaseLoader<{ type: string }> {
     try {
       this.extractedText = await parseOfficeAsync(this.filePath, this.config)
     } catch (err) {
-      console.error('odLoader error', err)
+      logger.error('odLoader error', err)
       throw err
     }
   }

@@ -1,9 +1,12 @@
 // inspired by https://dify.ai/blog/turn-your-dify-app-into-an-mcp-server
+import { loggerService } from '@logger'
 import { Server } from '@modelcontextprotocol/sdk/server/index.js'
 import { CallToolRequestSchema, ListToolsRequestSchema, ToolSchema } from '@modelcontextprotocol/sdk/types.js'
 import { z } from 'zod'
 import { zodToJsonSchema } from 'zod-to-json-schema'
 
+const logger = loggerService.withContext('DifyKnowledgeServer')
+
 interface DifyKnowledgeServerConfig {
   difyKey: string
   apiHost: string

@@ -168,7 +171,7 @@ class DifyKnowledgeServer {
         content: [{ type: 'text', text: formattedText }]
       }
     } catch (error) {
-      console.error('获取知识库列表时出错:', error)
+      logger.error('Error fetching knowledge list:', error)
       const errorMessage = error instanceof Error ? error.message : String(error)
       // Return an MCP response containing the error message
       return {

@@ -247,7 +250,7 @@ class DifyKnowledgeServer {
         content: [{ type: 'text', text: formattedText }]
       }
     } catch (error) {
-      console.error('搜索知识库时出错:', error)
+      logger.error('Error searching knowledge:', error)
       const errorMessage = error instanceof Error ? error.message : String(error)
       return {
         content: [{ type: 'text', text: `Search Knowledge Error: ${errorMessage}` }],

@@ -1,5 +1,5 @@
+import { loggerService } from '@logger'
 import { Server } from '@modelcontextprotocol/sdk/server/index.js'
-import Logger from 'electron-log'
 
 import BraveSearchServer from './brave-search'
 import DifyKnowledgeServer from './dify-knowledge'

@@ -9,8 +9,10 @@ import MemoryServer from './memory'
 import PythonServer from './python'
 import ThinkingServer from './sequentialthinking'
 
+const logger = loggerService.withContext('MCPFactory')
+
 export function createInMemoryMCPServer(name: string, args: string[] = [], envs: Record<string, string> = {}): Server {
-  Logger.info(`[MCP] Creating in-memory MCP server: ${name} with args: ${args} and envs: ${JSON.stringify(envs)}`)
+  logger.debug(`[MCP] Creating in-memory MCP server: ${name} with args: ${args} and envs: ${JSON.stringify(envs)}`)
   switch (name) {
     case '@cherry/memory': {
       const envPath = envs.MEMORY_FILE_PATH

@@ -1,5 +1,6 @@
 // port https://github.com/modelcontextprotocol/servers/blob/main/src/filesystem/index.ts
 
+import { loggerService } from '@logger'
 import { Server } from '@modelcontextprotocol/sdk/server/index.js'
 import { CallToolRequestSchema, ListToolsRequestSchema, ToolSchema } from '@modelcontextprotocol/sdk/types.js'
 import { createTwoFilesPatch } from 'diff'

@@ -10,6 +11,8 @@ import path from 'path'
 import { z } from 'zod'
 import { zodToJsonSchema } from 'zod-to-json-schema'
 
+const logger = loggerService.withContext('MCP:FileSystemServer')
+
 // Normalize all paths consistently
 function normalizePath(p: string): string {
   return path.normalize(p)

@@ -294,7 +297,7 @@ class FileSystemServer {
 
     // Validate that all directories exist and are accessible
     this.validateDirs().catch((error) => {
-      console.error('Error validating allowed directories:', error)
+      logger.error('Error validating allowed directories:', error)
       throw new Error(`Error validating allowed directories: ${error}`)
     })

@@ -319,11 +322,11 @@ class FileSystemServer {
       try {
         const stats = await fs.stat(expandHome(dir))
         if (!stats.isDirectory()) {
-          console.error(`Error: ${dir} is not a directory`)
+          logger.error(`Error: ${dir} is not a directory`)
           throw new Error(`Error: ${dir} is not a directory`)
         }
       } catch (error: any) {
-        console.error(`Error accessing directory ${dir}:`, error)
+        logger.error(`Error accessing directory ${dir}:`, error)
         throw new Error(`Error accessing directory ${dir}:`, error)
       }
     })

@@ -1,11 +1,13 @@
+import { loggerService } from '@logger'
 import { getConfigDir } from '@main/utils/file'
 import { Server } from '@modelcontextprotocol/sdk/server/index.js'
 import { CallToolRequestSchema, ErrorCode, ListToolsRequestSchema, McpError } from '@modelcontextprotocol/sdk/types.js'
 import { Mutex } from 'async-mutex' // import Mutex
-import Logger from 'electron-log'
 import { promises as fs } from 'fs'
 import path from 'path'
 
+const logger = loggerService.withContext('MCPServer:Memory')
+
 // Define memory file path
 const defaultMemoryPath = path.join(getConfigDir(), 'memory.json')

@@ -61,7 +63,7 @@ class KnowledgeGraphManager {
         await fs.writeFile(this.memoryPath, JSON.stringify({ entities: [], relations: [] }, null, 2))
       }
     } catch (error) {
-      console.error('Failed to ensure memory path exists:', error)
+      logger.error('Failed to ensure memory path exists:', error)
       // Propagate the error or handle it more gracefully depending on requirements
       throw new McpError(
         ErrorCode.InternalError,

@@ -94,13 +96,13 @@ class KnowledgeGraphManager {
         this.relations = new Set()
         await this._persistGraph() // Create the file with empty structure
       } else if (error instanceof SyntaxError) {
-        console.error('Failed to parse memory.json, initializing with empty graph:', error)
+        logger.error('Failed to parse memory.json, initializing with empty graph:', error)
         // If JSON is invalid, start fresh and overwrite the corrupted file
         this.entities = new Map()
         this.relations = new Set()
         await this._persistGraph()
       } else {
-        console.error('Failed to load knowledge graph from disk:', error)
+        logger.error('Failed to load knowledge graph from disk:', error)
         throw new McpError(
           ErrorCode.InternalError,
           `Failed to load graph: ${error instanceof Error ? error.message : String(error)}`

@@ -119,7 +121,7 @@ class KnowledgeGraphManager {
       }
       await fs.writeFile(this.memoryPath, JSON.stringify(graphData, null, 2))
     } catch (error) {
-      console.error('Failed to save knowledge graph:', error)
+      logger.error('Failed to save knowledge graph:', error)
       // Decide how to handle write errors - potentially retry or notify
       throw new McpError(
         ErrorCode.InternalError,

@@ -162,7 +164,7 @@ class KnowledgeGraphManager {
     relations.forEach((relation) => {
       // Ensure related entities exist before creating a relation
       if (!this.entities.has(relation.from) || !this.entities.has(relation.to)) {
-        console.warn(`Skipping relation creation: Entity not found for relation ${relation.from} -> ${relation.to}`)
+        logger.warn(`Skipping relation creation: Entity not found for relation ${relation.from} -> ${relation.to}`)
         return // Skip this relation
       }
       const relationStr = this._serializeRelation(relation)

@@ -188,7 +190,7 @@ class KnowledgeGraphManager {
         // Option 1: Throw error
         throw new McpError(ErrorCode.InvalidParams, `Entity with name ${o.entityName} not found`)
         // Option 2: Skip and warn
-        // console.warn(`Entity with name ${o.entityName} not found when adding observations. Skipping.`);
+        // logger.warn(`Entity with name ${o.entityName} not found when adding observations. Skipping.`);
         // return;
       }
       // Ensure observations array exists

@@ -356,9 +358,9 @@ class MemoryServer {
   private async _initializeManager(memoryPath: string): Promise<void> {
     try {
       this.knowledgeGraphManager = await KnowledgeGraphManager.create(memoryPath)
-      Logger.log('KnowledgeGraphManager initialized successfully.')
+      logger.debug('KnowledgeGraphManager initialized successfully.')
     } catch (error) {
-      Logger.error('Failed to initialize KnowledgeGraphManager:', error)
+      logger.error('Failed to initialize KnowledgeGraphManager:', error)
       // Server might be unusable, consider how to handle this state
       // Maybe set a flag and return errors for all tool calls?
       this.knowledgeGraphManager = null // Ensure it's null if init fails

@@ -385,7 +387,7 @@ class MemoryServer {
       await this._getManager() // Wait for initialization before confirming tools are available
     } catch (error) {
       // If manager failed to init, maybe return an empty tool list or throw?
-      console.error('Cannot list tools, manager initialization failed:', error)
+      logger.error('Cannot list tools, manager initialization failed:', error)
       return { tools: [] } // Return empty list if server is not ready
     }

@@ -687,7 +689,7 @@ class MemoryServer {
     if (error instanceof McpError) {
       throw error // Re-throw McpErrors directly
     }
-    console.error(`Error executing tool ${name}:`, error)
+    logger.error(`Error executing tool ${name}:`, error)
     // Throw a generic internal error for unexpected issues
     throw new McpError(
       ErrorCode.InternalError,

@@ -1,7 +1,9 @@
+import { loggerService } from '@logger'
import { pythonService } from '@main/services/PythonService'
import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { CallToolRequestSchema, ErrorCode, ListToolsRequestSchema, McpError } from '@modelcontextprotocol/sdk/types.js'
-import Logger from 'electron-log'

+const logger = loggerService.withContext('MCPServer:Python')

/**
* Python MCP Server for executing Python code using Pyodide
@@ -88,7 +90,7 @@ print('python code here')`,
throw new McpError(ErrorCode.InvalidParams, 'Code parameter is required and must be a string')
}

-Logger.info('Executing Python code via Pyodide')
+logger.debug('Executing Python code via Pyodide')

const result = await pythonService.executeScript(code, context, timeout)

@@ -102,7 +104,7 @@ print('python code here')`,
}
} catch (error) {
const errorMessage = error instanceof Error ? error.message : String(error)
-Logger.error('Python execution error:', errorMessage)
+logger.error('Python execution error:', errorMessage)

throw new McpError(ErrorCode.InternalError, `Python execution failed: ${errorMessage}`)
}
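Every file in this PR repeats one pattern: build a module-scoped logger with `loggerService.withContext(...)`, then log without hand-written `[ServiceName]` prefixes. A minimal sketch of that pattern — the real `LoggerService` in `@logger` also forwards renderer logs to the main process over IPC; the class below is illustrative only, not the actual implementation:

```typescript
// Illustrative sketch only — the real LoggerService lives in '@logger'.
type LogLevel = 'debug' | 'info' | 'warn' | 'error'

interface LogEntry {
  level: LogLevel
  context: string
  message: string
  data: unknown[]
}

class LoggerService {
  public readonly entries: LogEntry[] = []

  // Returns a logger whose records all carry the given context tag,
  // so call sites no longer need '[BackupManager]'-style prefixes.
  public withContext(context: string) {
    const write =
      (level: LogLevel) =>
      (message: string, ...data: unknown[]) => {
        this.entries.push({ level, context, message, data })
      }
    return {
      debug: write('debug'),
      info: write('info'),
      warn: write('warn'),
      error: write('error')
    }
  }
}

const loggerService = new LoggerService()

const logger = loggerService.withContext('MCPServer:Python')
logger.debug('Executing Python code via Pyodide')
```

With the context captured at construction time, the message strings in the hunks above can drop their `[MCPServer]`-style prefixes without losing attribution.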
@@ -1,11 +1,14 @@
// Sequential Thinking MCP Server
// port https://github.com/modelcontextprotocol/servers/blob/main/src/sequentialthinking/index.ts

+import { loggerService } from '@logger'
import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { CallToolRequestSchema, ListToolsRequestSchema, Tool } from '@modelcontextprotocol/sdk/types.js'
// Fixed chalk import for ESM
import chalk from 'chalk'

+const logger = loggerService.withContext('MCPServer:SequentialThinkingServer')

interface ThoughtData {
thought: string
thoughtNumber: number
@@ -98,7 +101,7 @@ class SequentialThinkingServer {
}

const formattedThought = this.formatThought(validatedInput)
-console.error(formattedThought)
+logger.error(formattedThought)

return {
content: [
@@ -1,10 +1,12 @@
+import { loggerService } from '@logger'
import { isDev, isLinux, isMac, isWin } from '@main/constant'
import { app } from 'electron'
-import log from 'electron-log'
import fs from 'fs'
import os from 'os'
import path from 'path'

+const logger = loggerService.withContext('AppService')

export class AppService {
private static instance: AppService

@@ -59,19 +61,19 @@ export class AppService {

// Write desktop file
await fs.promises.writeFile(desktopFile, desktopContent)
-log.info('Created autostart desktop file for Linux')
+logger.info('Created autostart desktop file for Linux')
} else {
// Remove desktop file
try {
await fs.promises.access(desktopFile)
await fs.promises.unlink(desktopFile)
-log.info('Removed autostart desktop file for Linux')
+logger.info('Removed autostart desktop file for Linux')
} catch {
// File doesn't exist, no need to remove
}
}
} catch (error) {
-log.error('Failed to set launch on boot for Linux:', error)
+logger.error('Failed to set launch on boot for Linux:', error)
}
}
}
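Across these hunks the migration follows a consistent level mapping: `console.error` / `Logger.error` / `log.error` stay at `error`, warnings stay at `warn`, AppService's lifecycle `log.info` calls stay at `info`, and chatty `Logger.log` / `Logger.info` step traces are demoted to `debug`. A sketch of that mapping as a function — a hypothetical helper inferred from the diff, not part of the codebase:

```typescript
type UnifiedLevel = 'debug' | 'info' | 'warn' | 'error'

// Hypothetical helper: maps a legacy call site to the unified-logger
// level chosen in this PR. Inferred from the hunks above.
function migratedLevel(legacyCall: string): UnifiedLevel {
  if (legacyCall.endsWith('.error')) return 'error' // console/Logger/log.error
  if (legacyCall.endsWith('.warn')) return 'warn'
  if (legacyCall === 'log.info') return 'info' // lifecycle events keep info
  return 'debug' // Logger.log / Logger.info step-by-step traces
}
```

The net effect is that routine progress chatter moves below the default level while warnings and errors keep their severity.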
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
import { isWin } from '@main/constant'
import { locales } from '@main/utils/locales'
import { generateUserAgent } from '@main/utils/systemInfo'
@@ -5,13 +6,14 @@ import { FeedUrl, UpgradeChannel } from '@shared/config/constant'
import { IpcChannel } from '@shared/IpcChannel'
import { CancellationToken, UpdateInfo } from 'builder-util-runtime'
import { app, BrowserWindow, dialog } from 'electron'
-import logger from 'electron-log'
import { AppUpdater as _AppUpdater, autoUpdater, NsisUpdater, UpdateCheckResult } from 'electron-updater'
import path from 'path'

import icon from '../../../build/icon.png?asset'
import { configManager } from './ConfigManager'

+const logger = loggerService.withContext('AppUpdater')

export default class AppUpdater {
autoUpdater: _AppUpdater = autoUpdater
private releaseInfo: UpdateInfo | undefined
@@ -19,8 +21,6 @@ export default class AppUpdater {
private updateCheckResult: UpdateCheckResult | null = null

constructor(mainWindow: BrowserWindow) {
-logger.transports.file.level = 'info'

-autoUpdater.logger = logger
autoUpdater.forceDevUpdateConfig = !app.isPackaged
autoUpdater.autoDownload = configManager.getAutoUpdate()
@@ -1,10 +1,10 @@
+import { loggerService } from '@logger'
import { IpcChannel } from '@shared/IpcChannel'
import { WebDavConfig } from '@types'
import { S3Config } from '@types'
import archiver from 'archiver'
import { exec } from 'child_process'
import { app } from 'electron'
-import Logger from 'electron-log'
import * as fs from 'fs-extra'
import StreamZip from 'node-stream-zip'
import * as path from 'path'
@@ -15,6 +15,8 @@ import S3Storage from './S3Storage'
import WebDav from './WebDav'
import { windowService } from './WindowService'

+const logger = loggerService.withContext('BackupManager')

class BackupManager {
private tempDir = path.join(app.getPath('temp'), 'cherry-studio', 'backup', 'temp')
private backupDir = path.join(app.getPath('temp'), 'cherry-studio', 'backup')
@@ -58,7 +60,7 @@ class BackupManager {
// Ensure root directory permissions
await this.forceSetWritable(dirPath)
} catch (error) {
-Logger.error(`权限设置失败:${dirPath}`, error)
+logger.error(`权限设置失败:${dirPath}`, error)
throw error
}
}
@@ -81,7 +83,7 @@ class BackupManager {
}
} catch (error) {
if ((error as NodeJS.ErrnoException).code !== 'ENOENT') {
-Logger.warn(`权限设置警告:${targetPath}`, error)
+logger.warn(`权限设置警告:${targetPath}`, error)
}
}
}
@@ -100,7 +102,7 @@ class BackupManager {
// Log only at key stages: start, end, and major stage transitions
const logStages = ['preparing', 'writing_data', 'preparing_compression', 'completed']
if (logStages.includes(processData.stage) || processData.progress === 100) {
-Logger.log('[BackupManager] backup progress', processData)
+logger.debug('backup progress', processData)
}
}

@@ -122,7 +124,7 @@ class BackupManager {

onProgress({ stage: 'writing_data', progress: 20, total: 100 })

-Logger.log('[BackupManager IPC] ', skipBackupFile)
+logger.debug('BackupManager IPC', skipBackupFile)

if (!skipBackupFile) {
// Copy the Data directory to the temp directory
@@ -143,7 +145,7 @@ class BackupManager {
await this.setWritableRecursive(tempDataDir)
onProgress({ stage: 'preparing_compression', progress: 50, total: 100 })
} else {
-Logger.log('[BackupManager] Skip the backup of the file')
+logger.debug('Skip the backup of the file')
await fs.promises.mkdir(path.join(this.tempDir, 'Data')) // restore fails without an empty Data directory
}

@@ -179,7 +181,7 @@ class BackupManager {
}
} catch (error) {
// Log only on error
-Logger.error('[BackupManager] Error calculating totals:', error)
+logger.error('[BackupManager] Error calculating totals:', error)
}
}

@@ -218,7 +220,7 @@ class BackupManager {
archive.on('error', reject)
archive.on('warning', (err: any) => {
if (err.code !== 'ENOENT') {
-Logger.warn('[BackupManager] Archive warning:', err)
+logger.warn('[BackupManager] Archive warning:', err)
}
})

@@ -236,10 +238,10 @@ class BackupManager {
await fs.remove(this.tempDir)
onProgress({ stage: 'completed', progress: 100, total: 100 })

-Logger.log('[BackupManager] Backup completed successfully')
+logger.debug('Backup completed successfully')
return backupedFilePath
} catch (error) {
-Logger.error('[BackupManager] Backup failed:', error)
+logger.error('[BackupManager] Backup failed:', error)
// Make sure the temp directory is cleaned up
await fs.remove(this.tempDir).catch(() => {})
throw error
@@ -254,7 +256,7 @@ class BackupManager {
// Log only at key stages
const logStages = ['preparing', 'extracting', 'extracted', 'reading_data', 'completed']
if (logStages.includes(processData.stage) || processData.progress === 100) {
-Logger.log('[BackupManager] restore progress', processData)
+logger.debug('restore progress', processData)
}
}

@@ -263,20 +265,20 @@ class BackupManager {
await fs.ensureDir(this.tempDir)
onProgress({ stage: 'preparing', progress: 0, total: 100 })

-Logger.log('[backup] step 1: unzip backup file', this.tempDir)
+logger.debug('step 1: unzip backup file', this.tempDir)

const zip = new StreamZip.async({ file: backupPath })
onProgress({ stage: 'extracting', progress: 15, total: 100 })
await zip.extract(null, this.tempDir)
onProgress({ stage: 'extracted', progress: 25, total: 100 })

-Logger.log('[backup] step 2: read data.json')
+logger.debug('step 2: read data.json')
// Read data.json
const dataPath = path.join(this.tempDir, 'data.json')
const data = await fs.readFile(dataPath, 'utf-8')
onProgress({ stage: 'reading_data', progress: 35, total: 100 })

-Logger.log('[backup] step 3: restore Data directory')
+logger.debug('step 3: restore Data directory')
// Restore the Data directory
const sourcePath = path.join(this.tempDir, 'Data')
const destPath = getDataPath()
@@ -299,20 +301,20 @@ class BackupManager {
onProgress({ stage: 'copying_files', progress, total: 100 })
})
} else {
-Logger.log('[backup] skipBackupFile is true, skip restoring Data directory')
+logger.debug('skipBackupFile is true, skip restoring Data directory')
}

-Logger.log('[backup] step 4: clean up temp directory')
+logger.debug('step 4: clean up temp directory')
// Clean up the temp directory
await this.setWritableRecursive(this.tempDir)
await fs.remove(this.tempDir)
onProgress({ stage: 'completed', progress: 100, total: 100 })

-Logger.log('[backup] step 5: Restore completed successfully')
+logger.debug('step 5: Restore completed successfully')

return data
} catch (error) {
-Logger.error('[backup] Restore failed:', error)
+logger.error('Restore failed:', error)
await fs.remove(this.tempDir).catch(() => {})
throw error
}
@@ -369,7 +371,7 @@ class BackupManager {

return await this.restore(_, backupedFilePath)
} catch (error: any) {
-Logger.error('[backup] Failed to restore from WebDAV:', error)
+logger.error('Failed to restore from WebDAV:', error)
throw new Error(error.message || 'Failed to restore backup file')
}
}
@@ -389,7 +391,7 @@ class BackupManager {
}))
.sort((a, b) => new Date(b.modifiedTime).getTime() - new Date(a.modifiedTime).getTime())
} catch (error: any) {
-Logger.error('Failed to list WebDAV files:', error)
+logger.error('Failed to list WebDAV files:', error)
throw new Error(error.message || 'Failed to list backup files')
}
}
@@ -485,7 +487,7 @@ class BackupManager {
const webdavClient = new WebDav(webdavConfig)
return await webdavClient.deleteFile(fileName)
} catch (error: any) {
-Logger.error('Failed to delete WebDAV file:', error)
+logger.error('Failed to delete WebDAV file:', error)
throw new Error(error.message || 'Failed to delete backup file')
}
}
@@ -507,7 +509,7 @@ class BackupManager {
const backupedFilePath = await this.backup(_, fileName, data, backupDir, localConfig.skipBackupFile)
return backupedFilePath
} catch (error) {
-Logger.error('[BackupManager] Local backup failed:', error)
+logger.error('[BackupManager] Local backup failed:', error)
throw error
}
}
@@ -521,7 +523,7 @@ class BackupManager {
.slice(0, 14)
const filename = s3Config.fileName || `cherry-studio.backup.${deviceName}.${timestamp}.zip`

-Logger.log(`[BackupManager] Starting S3 backup to $(unknown)`)
+logger.debug(`Starting S3 backup to $(unknown)`)

const backupedFilePath = await this.backup(_, filename, data, undefined, s3Config.skipBackupFile)
const s3Client = new S3Storage(s3Config)
@@ -530,10 +532,10 @@ class BackupManager {
const result = await s3Client.putFileContents(filename, fileBuffer)
await fs.remove(backupedFilePath)

-Logger.log(`[BackupManager] S3 backup completed successfully: $(unknown)`)
+logger.debug(`S3 backup completed successfully: $(unknown)`)
return result
} catch (error) {
-Logger.error(`[BackupManager] S3 backup failed:`, error)
+logger.error(`[BackupManager] S3 backup failed:`, error)
await fs.remove(backupedFilePath)
throw error
}
@@ -550,7 +552,7 @@ class BackupManager {

return await this.restore(_, backupPath)
} catch (error) {
-Logger.error('[BackupManager] Local restore failed:', error)
+logger.error('[BackupManager] Local restore failed:', error)
throw error
}
}
@@ -576,7 +578,7 @@ class BackupManager {
// Sort by modified time, newest first
return result.sort((a, b) => new Date(b.modifiedTime).getTime() - new Date(a.modifiedTime).getTime())
} catch (error) {
-Logger.error('[BackupManager] List local backup files failed:', error)
+logger.error('[BackupManager] List local backup files failed:', error)
throw error
}
}
@@ -592,7 +594,7 @@ class BackupManager {
await fs.remove(filePath)
return true
} catch (error) {
-Logger.error('[BackupManager] Delete local backup file failed:', error)
+logger.error('[BackupManager] Delete local backup file failed:', error)
throw error
}
}
@@ -603,7 +605,7 @@ class BackupManager {
await fs.ensureDir(dirPath)
return true
} catch (error) {
-Logger.error('[BackupManager] Set local backup directory failed:', error)
+logger.error('[BackupManager] Set local backup directory failed:', error)
throw error
}
}
@@ -611,7 +613,7 @@ class BackupManager {
async restoreFromS3(_: Electron.IpcMainInvokeEvent, s3Config: S3Config) {
const filename = s3Config.fileName || 'cherry-studio.backup.zip'

-Logger.log(`[BackupManager] Starting restore from S3: $(unknown)`)
+logger.debug(`Starting restore from S3: $(unknown)`)

const s3Client = new S3Storage(s3Config)
try {
@@ -628,10 +630,10 @@ class BackupManager {
writeStream.on('error', (error) => reject(error))
})

-Logger.log(`[BackupManager] S3 restore file downloaded successfully: $(unknown)`)
+logger.debug(`S3 restore file downloaded successfully: $(unknown)`)
return await this.restore(_, backupedFilePath)
} catch (error: any) {
-Logger.error('[BackupManager] Failed to restore from S3:', error)
+logger.error('[BackupManager] Failed to restore from S3:', error)
throw new Error(error.message || 'Failed to restore backup file')
}
}
@@ -655,7 +657,7 @@ class BackupManager {

return files.sort((a, b) => new Date(b.modifiedTime).getTime() - new Date(a.modifiedTime).getTime())
} catch (error: any) {
-Logger.error('Failed to list S3 files:', error)
+logger.error('Failed to list S3 files:', error)
throw new Error(error.message || 'Failed to list backup files')
}
}
@@ -665,7 +667,7 @@ class BackupManager {
const s3Client = new S3Storage(s3Config)
return await s3Client.deleteFile(fileName)
} catch (error: any) {
-Logger.error('Failed to delete S3 file:', error)
+logger.error('Failed to delete S3 file:', error)
throw new Error(error.message || 'Failed to delete backup file')
}
}
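BackupManager throttles its progress logging: only a whitelist of key stages, or a progress of exactly 100%, is logged. That predicate, extracted from the hunks above as a standalone sketch:

```typescript
interface ProcessData {
  stage: string
  progress: number
  total: number
}

// Log only at key stages — start, end, and major stage transitions —
// or when progress reaches 100%, mirroring BackupManager's check.
const logStages = ['preparing', 'writing_data', 'preparing_compression', 'completed']

function shouldLogProgress(processData: ProcessData): boolean {
  return logStages.includes(processData.stage) || processData.progress === 100
}
```

Intermediate stages such as `copying_files` stay silent until they hit 100%, which keeps the new `logger.debug` output readable during long backups.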
@@ -1,10 +1,12 @@
+import { loggerService } from '@logger'
import { AxiosRequestConfig } from 'axios'
import axios from 'axios'
import { app, safeStorage } from 'electron'
-import Logger from 'electron-log'
import fs from 'fs/promises'
import path from 'path'

+const logger = loggerService.withContext('CopilotService')

// Configuration constants, managed in one place
const CONFIG = {
GITHUB_CLIENT_ID: 'Iv1.b507a08c87ecfe98',
@@ -101,7 +103,7 @@ class CopilotService {
avatar: response.data.avatar_url
}
} catch (error) {
-console.error('Failed to get user information:', error)
+logger.error('Failed to get user information:', error)
throw new CopilotServiceError('无法获取GitHub用户信息', error)
}
}
@@ -127,7 +129,7 @@ class CopilotService {

return response.data
} catch (error) {
-console.error('Failed to get auth message:', error)
+logger.error('Failed to get auth message:', error)
throw new CopilotServiceError('无法获取GitHub授权信息', error)
}
}
@@ -169,7 +171,7 @@ class CopilotService {
// Log detailed errors only when the last attempt fails
const isLastAttempt = attempt === CONFIG.POLLING.MAX_ATTEMPTS - 1
if (isLastAttempt) {
-console.error(`Token polling failed after ${CONFIG.POLLING.MAX_ATTEMPTS} attempts:`, error)
+logger.error(`Token polling failed after ${CONFIG.POLLING.MAX_ATTEMPTS} attempts:`, error)
}
}
}
@@ -185,7 +187,7 @@ class CopilotService {
const encryptedToken = safeStorage.encryptString(token)
await fs.writeFile(this.tokenFilePath, encryptedToken)
} catch (error) {
-console.error('Failed to save token:', error)
+logger.error('Failed to save token:', error)
throw new CopilotServiceError('无法保存访问令牌', error)
}
}
@@ -214,7 +216,7 @@ class CopilotService {

return response.data
} catch (error) {
-console.error('Failed to get Copilot token:', error)
+logger.error('Failed to get Copilot token:', error)
throw new CopilotServiceError('无法获取Copilot令牌,请重新授权', error)
}
}
@@ -227,13 +229,13 @@ class CopilotService {
try {
await fs.access(this.tokenFilePath)
await fs.unlink(this.tokenFilePath)
-Logger.log('Successfully logged out from Copilot')
+logger.debug('Successfully logged out from Copilot')
} catch (error) {
// A missing file is not an error; just log it
-Logger.log('Token file not found, nothing to delete')
+logger.debug('Token file not found, nothing to delete')
}
} catch (error) {
-console.error('Failed to logout:', error)
+logger.error('Failed to logout:', error)
throw new CopilotServiceError('无法完成退出登录操作', error)
}
}
@@ -1,11 +1,13 @@
+import { loggerService } from '@logger'
import { getMcpDir, getTempDir } from '@main/utils/file'
-import logger from 'electron-log'
import * as fs from 'fs'
import StreamZip from 'node-stream-zip'
import * as os from 'os'
import * as path from 'path'
import { v4 as uuidv4 } from 'uuid'

+const logger = loggerService.withContext('DxtService')

// Type definitions
export interface DxtManifest {
dxt_version: string
@@ -174,7 +176,7 @@ class DxtService {
fs.mkdirSync(this.mcpDir, { recursive: true })
}
} catch (error) {
-logger.error('[DxtService] Failed to create directories:', error)
+logger.error('Failed to create directories:', error)
}
}

@@ -184,7 +186,7 @@ class DxtService {
fs.renameSync(source, destination)
} catch (error) {
// If rename fails (cross-filesystem), use copy + remove
-logger.info('[DxtService] Cross-filesystem move detected, using copy + remove')
+logger.debug('Cross-filesystem move detected, using copy + remove')

// Ensure parent directory exists
const parentDir = path.dirname(destination)
@@ -230,7 +232,7 @@ class DxtService {
}

// Extract the DXT file (which is a ZIP archive) to a temporary directory
-logger.info('[DxtService] Extracting DXT file:', filePath)
+logger.debug('Extracting DXT file:', filePath)

const zip = new StreamZip.async({ file: filePath })
await zip.extract(null, tempExtractDir)
@@ -276,14 +278,14 @@ class DxtService {

// Clean up any existing version of this server
if (fs.existsSync(finalExtractDir)) {
-logger.info('[DxtService] Removing existing server directory:', finalExtractDir)
+logger.debug('Removing existing server directory:', finalExtractDir)
fs.rmSync(finalExtractDir, { recursive: true, force: true })
}

// Move the temporary directory to the final location
// Use recursive copy + remove instead of rename to handle cross-filesystem moves
await this.moveDirectory(tempExtractDir, finalExtractDir)
-logger.info('[DxtService] DXT server extracted to:', finalExtractDir)
+logger.debug('DXT server extracted to:', finalExtractDir)

// Clean up the uploaded DXT file if it's in temp directory
if (filePath.startsWith(this.tempDir)) {
@@ -305,7 +307,7 @@ class DxtService {
}

const errorMessage = error instanceof Error ? error.message : 'Failed to process DXT file'
-logger.error('[DxtService] DXT upload error:', error)
+logger.error('DXT upload error:', error)

return {
success: false,
@@ -322,7 +324,7 @@ class DxtService {
// Read the manifest from the DXT server directory
const manifestPath = path.join(dxtPath, 'manifest.json')
if (!fs.existsSync(manifestPath)) {
-logger.error('[DxtService] Manifest not found:', manifestPath)
+logger.error('Manifest not found:', manifestPath)
return null
}

@@ -330,14 +332,14 @@ class DxtService {
const manifest: DxtManifest = JSON.parse(manifestContent)

if (!manifest.server?.mcp_config) {
-logger.error('[DxtService] No mcp_config found in manifest')
+logger.error('No mcp_config found in manifest')
return null
}

// Apply platform overrides and variable substitution
const resolvedConfig = applyPlatformOverrides(manifest.server.mcp_config, dxtPath, userConfig)

-logger.info('[DxtService] Resolved MCP config:', {
+logger.debug('Resolved MCP config:', {
command: resolvedConfig.command,
args: resolvedConfig.args,
env: resolvedConfig.env ? Object.keys(resolvedConfig.env) : undefined
@@ -345,7 +347,7 @@ class DxtService {

return resolvedConfig
} catch (error) {
-logger.error('[DxtService] Failed to resolve MCP config:', error)
+logger.error('Failed to resolve MCP config:', error)
return null
}
}
@@ -360,7 +362,7 @@ class DxtService {

// First try the sanitized path
if (fs.existsSync(serverDir)) {
-logger.info('[DxtService] Removing DXT server directory:', serverDir)
+logger.debug('Removing DXT server directory:', serverDir)
fs.rmSync(serverDir, { recursive: true, force: true })
return true
}
@@ -368,15 +370,15 @@ class DxtService {
// Fallback: try with original name in case it was stored differently
const originalServerDir = path.join(this.mcpDir, `server-${serverName}`)
if (fs.existsSync(originalServerDir)) {
-logger.info('[DxtService] Removing DXT server directory:', originalServerDir)
+logger.debug('Removing DXT server directory:', originalServerDir)
fs.rmSync(originalServerDir, { recursive: true, force: true })
return true
}

-logger.warn('[DxtService] Server directory not found:', serverDir)
+logger.warn('Server directory not found:', serverDir)
return false
} catch (error) {
-logger.error('[DxtService] Failed to cleanup DXT server:', error)
+logger.error('Failed to cleanup DXT server:', error)
return false
}
}
@@ -388,7 +390,7 @@ class DxtService {
fs.rmSync(this.tempDir, { recursive: true, force: true })
}
} catch (error) {
-logger.error('[DxtService] Cleanup error:', error)
+logger.error('Cleanup error:', error)
}
}
}
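`DxtService.moveDirectory`, partially shown above, tries `fs.renameSync` first and falls back to copy + remove when the rename fails, which typically happens for a cross-filesystem move (`EXDEV`). A self-contained synchronous sketch of that fallback — the real method is `async`, lives on the class, and logs through the context logger:

```typescript
import * as fs from 'fs'
import * as path from 'path'

// Move a directory; if renameSync fails (typically EXDEV for a
// cross-filesystem move), fall back to recursive copy + remove.
function moveDirectory(source: string, destination: string): void {
  try {
    fs.renameSync(source, destination)
  } catch {
    // Ensure the parent directory exists, then copy and delete the source
    fs.mkdirSync(path.dirname(destination), { recursive: true })
    fs.cpSync(source, destination, { recursive: true })
    fs.rmSync(source, { recursive: true, force: true })
  }
}
```

Either branch leaves the destination populated and the source gone, so callers do not need to know which path was taken.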
@@ -1,6 +1,7 @@
/* eslint-disable no-case-declarations */
// ExportService

+import { loggerService } from '@logger'
import {
AlignmentType,
BorderStyle,
@@ -18,11 +19,11 @@ import {
WidthType
} from 'docx'
import { dialog } from 'electron'
-import Logger from 'electron-log'
import MarkdownIt from 'markdown-it'

import FileStorage from './FileStorage'

+const logger = loggerService.withContext('ExportService')
export class ExportService {
private fileManager: FileStorage
private md: MarkdownIt
@@ -399,10 +400,10 @@ export class ExportService {

if (filePath) {
await this.fileManager.writeFile(_, filePath, buffer)
-Logger.info('[ExportService] Document exported successfully')
+logger.debug('Document exported successfully')
}
} catch (error) {
-Logger.error('[ExportService] Export to Word failed:', error)
+logger.error('Export to Word failed:', error)
throw error
}
}
@@ -1,3 +1,4 @@
|
||||
import { loggerService } from '@logger'
|
||||
import { getFilesDir, getFileType, getTempDir, readTextFileWithAutoEncoding } from '@main/utils/file'
|
||||
import { documentExts, imageExts, MB } from '@shared/config/constant'
|
||||
import { FileMetadata } from '@types'
|
||||
@@ -10,7 +11,6 @@ import {
|
||||
SaveDialogReturnValue,
|
||||
shell
|
||||
} from 'electron'
|
||||
import logger from 'electron-log'
|
||||
import * as fs from 'fs'
|
||||
import { writeFileSync } from 'fs'
|
||||
import { readFile } from 'fs/promises'
|
||||
@@ -21,6 +21,8 @@ import { chdir } from 'process'
|
||||
import { v4 as uuidv4 } from 'uuid'
|
||||
import WordExtractor from 'word-extractor'
|
||||
|
||||
const logger = loggerService.withContext('FileStorage')
|
||||
|
||||
class FileStorage {
|
||||
private storageDir = getFilesDir()
|
||||
private tempDir = getTempDir()
|
||||
@@ -38,7 +40,7 @@ class FileStorage {
|
||||
fs.mkdirSync(this.tempDir, { recursive: true })
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('[FileStorage] Failed to initialize storage directories:', error)
|
||||
logger.error('Failed to initialize storage directories:', error)
|
||||
throw error
|
||||
}
|
||||
}
|
||||
@@ -136,9 +138,9 @@ class FileStorage {
|
||||
if (fileSizeInMB > 1) {
|
||||
try {
|
||||
await fs.promises.copyFile(sourcePath, destPath)
|
||||
logger.info('[FileStorage] Image compressed successfully:', sourcePath)
|
||||
logger.debug('Image compressed successfully:', sourcePath)
|
||||
} catch (jimpError) {
|
||||
logger.error('[FileStorage] Image compression failed:', jimpError)
|
||||
logger.error('Image compression failed:', jimpError)
|
||||
await fs.promises.copyFile(sourcePath, destPath)
|
||||
}
|
||||
} else {
|
||||
@@ -146,7 +148,7 @@ class FileStorage {
|
||||
await fs.promises.copyFile(sourcePath, destPath)
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error('[FileStorage] Image handling failed:', error)
|
||||
logger.error('Image handling failed:', error)
|
||||
// 错误情况下直接复制原文件
|
||||
await fs.promises.copyFile(sourcePath, destPath)
|
||||
}
|
||||
@@ -188,7 +190,7 @@ class FileStorage {
|
||||
count: 1
|
||||
}
|
||||
|
||||
-      logger.info('[FileStorage] File uploaded:', fileMetadata)
+      logger.debug('File uploaded:', fileMetadata)

       return fileMetadata
     }
@@ -257,7 +259,7 @@ class FileStorage {
       return data
     } catch (error) {
       chdir(originalCwd)
-      logger.error(error)
+      logger.error('Failed to read file:', error)
       throw error
     }
   }
@@ -269,7 +271,7 @@ class FileStorage {
         return fs.readFileSync(filePath, 'utf-8')
       }
     } catch (error) {
-      logger.error(error)
+      logger.error('Failed to read file:', error)
       throw new Error(`Failed to read file: ${filePath}.`)
     }
   }
@@ -319,7 +321,7 @@ class FileStorage {
       const ext = '.png'
       const destPath = path.join(this.storageDir, uuid + ext)

-      logger.info('[FileStorage] Saving base64 image:', {
+      logger.debug('Saving base64 image:', {
         storageDir: this.storageDir,
         destPath,
         bufferSize: buffer.length
@@ -346,7 +348,7 @@ class FileStorage {

       return fileMetadata
     } catch (error) {
-      logger.error('[FileStorage] Failed to save base64 image:', error)
+      logger.error('Failed to save base64 image:', error)
       throw error
     }
   }
@@ -560,7 +562,7 @@ class FileStorage {

       return fileMetadata
     } catch (error) {
-      logger.error('[FileStorage] Download file error:', error)
+      logger.error('Download file error:', error)
       throw error
     }
   }
@@ -596,9 +598,9 @@ class FileStorage {

       // Copy the file
       await fs.promises.copyFile(sourcePath, destPath)
-      logger.info('[FileStorage] File copied successfully:', { from: sourcePath, to: destPath })
+      logger.debug('File copied successfully:', { from: sourcePath, to: destPath })
     } catch (error) {
-      logger.error('[FileStorage] Copy file failed:', error)
+      logger.error('Copy file failed:', error)
       throw error
     }
   }
@@ -606,18 +608,18 @@ class FileStorage {
   public writeFileWithId = async (_: Electron.IpcMainInvokeEvent, id: string, content: string): Promise<void> => {
     try {
       const filePath = path.join(this.storageDir, id)
-      logger.info('[FileStorage] Writing file:', filePath)
+      logger.debug('Writing file:', filePath)

       // Ensure the directory exists
       if (!fs.existsSync(this.storageDir)) {
-        logger.info('[FileStorage] Creating storage directory:', this.storageDir)
+        logger.debug('Creating storage directory:', this.storageDir)
         fs.mkdirSync(this.storageDir, { recursive: true })
       }

       await fs.promises.writeFile(filePath, content, 'utf8')
-      logger.info('[FileStorage] File written successfully:', filePath)
+      logger.debug('File written successfully:', filePath)
     } catch (error) {
-      logger.error('[FileStorage] Failed to write file:', error)
+      logger.error('Failed to write file:', error)
       throw error
     }
   }
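The FileStorage hunks above all follow one pattern: `logger.info('[FileStorage] …')` becomes `logger.debug('…')`, because the module tag now comes from the context logger instead of being hand-written into every message. A minimal sketch of that idea (the `withContext` facade here is a hypothetical stand-in, not the real LoggerService API):

```typescript
type LogFn = (message: string, ...meta: unknown[]) => string

// Hypothetical stand-in for loggerService.withContext('FileStorage'):
// the module tag is attached once by the facade, so call sites can drop
// the repeated '[FileStorage]' prefix from each message string.
function withContext(module: string): { debug: LogFn; error: LogFn } {
  const write =
    (level: string): LogFn =>
    (message, ...meta) =>
      `<${level.toUpperCase()}> [${module}] ${message}${meta.length ? ' ' + JSON.stringify(meta) : ''}`
  return { debug: write('debug'), error: write('error') }
}

const logger = withContext('FileStorage')
const line = logger.debug('File uploaded:', { id: 'abc' })
// line → '<DEBUG> [FileStorage] File uploaded: [{"id":"abc"}]'
```

Every call site then carries the same module tag without repeating it, which is exactly why the literal prefixes are deleted in this diff.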
@@ -21,6 +21,7 @@ import type { ExtractChunkData } from '@cherrystudio/embedjs-interfaces'
 import { LibSqlDb } from '@cherrystudio/embedjs-libsql'
 import { SitemapLoader } from '@cherrystudio/embedjs-loader-sitemap'
 import { WebLoader } from '@cherrystudio/embedjs-loader-web'
+import { loggerService } from '@logger'
 import OcrProvider from '@main/knowledage/ocr/OcrProvider'
 import PreprocessProvider from '@main/knowledage/preprocess/PreprocessProvider'
 import Embeddings from '@main/knowledge/embeddings/Embeddings'
@@ -34,9 +35,10 @@ import { MB } from '@shared/config/constant'
 import type { LoaderReturn } from '@shared/config/types'
 import { IpcChannel } from '@shared/IpcChannel'
 import { FileMetadata, KnowledgeBaseParams, KnowledgeItem } from '@types'
-import Logger from 'electron-log'
 import { v4 as uuidv4 } from 'uuid'

+const logger = loggerService.withContext('KnowledgeService')
+
 export interface KnowledgeBaseAddItemOptions {
   base: KnowledgeBaseParams
   item: KnowledgeItem
@@ -137,7 +139,7 @@ class KnowledgeService {
         .setSearchResultCount(documentCount || 30)
         .build()
     } catch (e) {
-      Logger.error(e)
+      logger.error('Failed to create RAGApplication:', e)
       throw new Error(`Failed to create RAGApplication: ${e}`)
     }

@@ -190,7 +192,7 @@ class KnowledgeService {
           return result
         })
         .catch((e) => {
-          Logger.error(`Error in addFileLoader for ${file.name}: ${e}`)
+          logger.error(`Error in addFileLoader for ${file.name}: ${e}`)
           const errorResult: LoaderReturn = {
             ...KnowledgeService.ERROR_LOADER_RETURN,
             message: e.message,
@@ -200,7 +202,7 @@ class KnowledgeService {
           return errorResult
         })
     } catch (e: any) {
-      Logger.error(`Preprocessing failed for ${file.name}: ${e}`)
+      logger.error(`Preprocessing failed for ${file.name}: ${e}`)
       const errorResult: LoaderReturn = {
         ...KnowledgeService.ERROR_LOADER_RETURN,
         message: e.message,
@@ -256,7 +258,7 @@ class KnowledgeService {
         return result
       })
       .catch((err) => {
-        Logger.error(err)
+        logger.error('Failed to add dir loader:', err)
         return {
           ...KnowledgeService.ERROR_LOADER_RETURN,
           message: `Failed to add dir loader: ${err.message}`,
@@ -306,7 +308,7 @@ class KnowledgeService {
         return result
       })
       .catch((err) => {
-        Logger.error(err)
+        logger.error('Failed to add url loader:', err)
         return {
           ...KnowledgeService.ERROR_LOADER_RETURN,
           message: `Failed to add url loader: ${err.message}`,
@@ -350,7 +352,7 @@ class KnowledgeService {
         return result
       })
       .catch((err) => {
-        Logger.error(err)
+        logger.error('Failed to add sitemap loader:', err)
         return {
           ...KnowledgeService.ERROR_LOADER_RETURN,
           message: `Failed to add sitemap loader: ${err.message}`,
@@ -400,7 +402,7 @@ class KnowledgeService {
         }
       })
       .catch((err) => {
-        Logger.error(err)
+        logger.error('Failed to add note loader:', err)
         return {
           ...KnowledgeService.ERROR_LOADER_RETURN,
           message: `Failed to add note loader: ${err.message}`,
@@ -508,7 +510,7 @@ class KnowledgeService {
         }
       })
       .catch((err) => {
-        Logger.error(err)
+        logger.error('Failed to add item:', err)
         resolve({
           ...KnowledgeService.ERROR_LOADER_RETURN,
           message: `Failed to add item: ${err.message}`,
@@ -523,7 +525,7 @@ class KnowledgeService {
     { uniqueId, uniqueIds, base }: { uniqueId: string; uniqueIds: string[]; base: KnowledgeBaseParams }
   ): Promise<void> => {
     const ragApplication = await this.getRagApplication(base)
-    Logger.log(`[ KnowledgeService Remove Item UniqueId: ${uniqueId}]`)
+    logger.debug(`Remove Item UniqueId: ${uniqueId}`)
     for (const id of uniqueIds) {
       await ragApplication.deleteLoader(id)
     }
@@ -569,12 +571,12 @@ class KnowledgeService {
       // First, check whether the file has already been preprocessed
       const alreadyProcessed = await provider.checkIfAlreadyProcessed(file)
       if (alreadyProcessed) {
-        Logger.info(`File already preprocess processed, using cached result: ${file.path}`)
+        logger.debug(`File already preprocess processed, using cached result: ${file.path}`)
         return alreadyProcessed
       }

       // Run preprocessing
-      Logger.info(`Starting preprocess processing for scanned PDF: ${file.path}`)
+      logger.debug(`Starting preprocess processing for scanned PDF: ${file.path}`)
       const { processedFile, quota } = await provider.parseFile(item.id, file)
       fileToProcess = processedFile
       const mainWindow = windowService.getMainWindow()
@@ -583,7 +585,7 @@ class KnowledgeService {
         quota: quota
       })
     } catch (err) {
-      Logger.error(`Preprocess processing failed: ${err}`)
+      logger.error(`Preprocess processing failed: ${err}`)
       // If preprocessing fails, fall back to the original file
       // fileToProcess = file
       throw new Error(`Preprocess processing failed: ${err}`)
@@ -605,7 +607,7 @@ class KnowledgeService {
       }
       throw new Error('No preprocess provider configured')
     } catch (err) {
-      Logger.error(`Failed to check quota: ${err}`)
+      logger.error(`Failed to check quota: ${err}`)
       throw new Error(`Failed to check quota: ${err}`)
     }
   }
src/main/services/LoggerService.ts (new file, 272 lines)
@@ -0,0 +1,272 @@
+import type { LogLevel, LogSourceWithContext } from '@shared/config/types'
+import { IpcChannel } from '@shared/IpcChannel'
+import { app, ipcMain } from 'electron'
+import os from 'os'
+import path from 'path'
+import winston from 'winston'
+import DailyRotateFile from 'winston-daily-rotate-file'
+
+import { isDev } from '../constant'
+
+const ANSICOLORS = {
+  RED: '\x1b[31m',
+  GREEN: '\x1b[32m',
+  YELLOW: '\x1b[33m',
+  BLUE: '\x1b[34m',
+  MAGENTA: '\x1b[35m',
+  CYAN: '\x1b[36m',
+  END: '\x1b[0m',
+  BOLD: '\x1b[1m',
+  ITALIC: '\x1b[3m',
+  UNDERLINE: '\x1b[4m'
+}
+
+function colorText(text: string, color: string) {
+  return ANSICOLORS[color] + text + ANSICOLORS.END
+}
+
+const SYSTEM_INFO = {
+  os: `${os.platform()}-${os.arch()} / ${os.version()}`,
+  hw: `${os.cpus()[0]?.model || 'Unknown CPU'} / ${(os.totalmem() / 1024 / 1024 / 1024).toFixed(2)}GB`
+}
+const APP_VERSION = `v${app?.getVersion?.() || 'unknown'}`
+
+const DEFAULT_LEVEL = isDev ? 'silly' : 'info'
+
+/**
+ * IMPORTANT: How to use LoggerService
+ * please refer to
+ * English: `docs/technical/how-to-use-logger-en.md`
+ * Chinese: `docs/technical/how-to-use-logger-zh.md`
+ */
+export class LoggerService {
+  private static instance: LoggerService
+  private logger: winston.Logger
+
+  private logsDir: string = ''
+
+  private module: string = ''
+  private context: Record<string, any> = {}
+
+  private constructor() {
+    // Create logs directory path
+    this.logsDir = path.join(app.getPath('userData'), 'logs')
+
+    // Configure transports based on environment
+    const transports: winston.transport[] = []
+
+    //TODO remove when debug is done
+    // transports.push(new winston.transports.Console())
+
+    // Daily rotate file transport for general logs
+    transports.push(
+      new DailyRotateFile({
+        filename: path.join(this.logsDir, 'app.%DATE%.log'),
+        datePattern: 'YYYY-MM-DD',
+        maxSize: '10m',
+        maxFiles: '30d'
+      })
+    )
+
+    // Daily rotate file transport for error logs
+    transports.push(
+      new DailyRotateFile({
+        level: 'warn',
+        filename: path.join(this.logsDir, 'app-error.%DATE%.log'),
+        datePattern: 'YYYY-MM-DD',
+        maxSize: '10m',
+        maxFiles: '60d'
+      })
+    )
+
+    // Configure Winston logger
+    this.logger = winston.createLogger({
+      level: DEFAULT_LEVEL, // Development: all levels, Production: info and above
+      format: winston.format.combine(
+        winston.format.splat(),
+        winston.format.timestamp({
+          format: 'YYYY-MM-DD HH:mm:ss'
+        }),
+        winston.format.errors({ stack: true }),
+        winston.format.json()
+      ),
+      exitOnError: false,
+      transports
+    })
+
+    // Handle transport events
+    this.logger.on('error', (error) => {
+      console.error('LoggerService fatal error:', error)
+    })
+
+    // Register the IPC handler so the renderer process can log to the main process
+    this.registerIpcHandler()
+  }
+
+  public static getInstance(): LoggerService {
+    if (!LoggerService.instance) {
+      LoggerService.instance = new LoggerService()
+    }
+    return LoggerService.instance
+  }
+
+  public withContext(module: string, context?: Record<string, any>): LoggerService {
+    const newLogger = Object.create(this)
+
+    // Copy all properties from the base logger
+    newLogger.logger = this.logger
+    newLogger.module = module
+    newLogger.context = { ...this.context, ...context }
+
+    return newLogger
+  }
+
+  public finish() {
+    this.logger.end()
+  }
+
+  private processLog(source: LogSourceWithContext, level: LogLevel, message: string, meta: any[]): void {
+    if (isDev) {
+      const datetimeColored = colorText(
+        new Date().toLocaleString('zh-CN', {
+          year: 'numeric',
+          month: '2-digit',
+          day: '2-digit',
+          hour: '2-digit',
+          minute: '2-digit',
+          second: '2-digit',
+          fractionalSecondDigits: 3,
+          hour12: false
+        }),
+        'CYAN'
+      )
+
+      console.log('processLog', source.process, this.module, this.context)
+
+      let moduleString = ''
+      if (source.process === 'main') {
+        moduleString = this.module ? ` [${colorText(this.module, 'UNDERLINE')}] ` : ' '
+      } else {
+        const combineString = `${source.window}:${source.module}`
+        moduleString = ` [${colorText(combineString, 'UNDERLINE')}] `
+      }
+
+      switch (level) {
+        case 'error':
+          console.error(
+            `${datetimeColored} ${colorText(colorText('<ERROR>', 'RED'), 'BOLD')}${moduleString}${message}`,
+            ...meta
+          )
+          break
+        case 'warn':
+          console.warn(
+            `${datetimeColored} ${colorText(colorText('<WARN>', 'YELLOW'), 'BOLD')}${moduleString}${message}`,
+            ...meta
+          )
+          break
+        case 'info':
+          console.info(
+            `${datetimeColored} ${colorText(colorText('<INFO>', 'GREEN'), 'BOLD')}${moduleString}${message}`,
+            ...meta
+          )
+          break
+        case 'debug':
+          console.debug(
+            `${datetimeColored} ${colorText(colorText('<DEBUG>', 'BLUE'), 'BOLD')}${moduleString}${message}`,
+            ...meta
+          )
+          break
+        case 'verbose':
+          console.log(`${datetimeColored} ${colorText('<VERBOSE>', 'BOLD')}${moduleString}${message}`, ...meta)
+          break
+        case 'silly':
+          console.log(`${datetimeColored} ${colorText('<SILLY>', 'BOLD')}${moduleString}${message}`, ...meta)
+          break
+      }
+    }
+
+    // Add source information to meta.
+    // The renderer process carries its own module and context; do not use this.module and this.context for it.
+    const sourceWithContext: LogSourceWithContext = source
+    if (source.process === 'main') {
+      sourceWithContext.module = this.module
+      if (Object.keys(this.context).length > 0) {
+        sourceWithContext.context = this.context
+      }
+    }
+    meta.push(sourceWithContext)
+
+    // Add extra system information for error and warn levels
+    if (level === 'error' || level === 'warn') {
+      const extra = {
+        sys: SYSTEM_INFO,
+        appver: APP_VERSION
+      }
+
+      meta.push(extra)
+    }
+
+    this.logger.log(level, message, ...meta)
+  }
+
+  public error(message: string, ...data: any[]): void {
+    this.processMainLog('error', message, data)
+  }
+  public warn(message: string, ...data: any[]): void {
+    this.processMainLog('warn', message, data)
+  }
+  public info(message: string, ...data: any[]): void {
+    this.processMainLog('info', message, data)
+  }
+  public verbose(message: string, ...data: any[]): void {
+    this.processMainLog('verbose', message, data)
+  }
+  public debug(message: string, ...data: any[]): void {
+    this.processMainLog('debug', message, data)
+  }
+  public silly(message: string, ...data: any[]): void {
+    this.processMainLog('silly', message, data)
+  }
+
+  private processMainLog(level: LogLevel, message: string, data: any[]): void {
+    this.processLog({ process: 'main' }, level, message, data)
+  }
+
+  // Defined as an arrow function so `this` stays bound when passed as a callback
+  private processRendererLog = (source: LogSourceWithContext, level: LogLevel, message: string, data: any[]): void => {
+    this.processLog(source, level, message, data)
+  }
+
+  // Additional utility methods
+  public setLevel(level: string): void {
+    this.logger.level = level
+  }
+
+  public getLevel(): string {
+    return this.logger.level
+  }
+
+  // Method to reset log level to environment default
+  public resetLevel(): void {
+    this.setLevel(DEFAULT_LEVEL)
+  }
+
+  // Method to get the underlying Winston logger instance
+  public getBaseLogger(): winston.Logger {
+    return this.logger
+  }
+
+  public getLogsDir(): string {
+    return this.logsDir
+  }
+
+  private registerIpcHandler(): void {
+    ipcMain.handle(
+      IpcChannel.App_LogToMain,
+      (_, source: LogSourceWithContext, level: LogLevel, message: string, data: any[]) => {
+        this.processRendererLog(source, level, message, data)
+      }
+    )
+  }
+}
+
+export const loggerService = LoggerService.getInstance()
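The `withContext` method above uses `Object.create(this)` so that every child logger delegates to the singleton through the prototype chain: children get their own `module` and `context`, while the Winston transport, level, and other state stay shared on the one base instance. A self-contained sketch of that prototype trick (`MiniLogger` and its members are illustrative names, not the real class):

```typescript
// Sketch of the Object.create-based withContext pattern: child loggers
// carry their own module/context but share the parent's sink.
class MiniLogger {
  protected module = ''
  protected context: Record<string, unknown> = {}
  public records: Array<{ module: string; message: string }> = []

  withContext(module: string, context?: Record<string, unknown>): MiniLogger {
    const child: MiniLogger = Object.create(this)
    child.module = module // own property, shadows the parent's
    child.context = { ...this.context, ...context }
    return child
  }

  info(message: string): void {
    // `records` is resolved through the prototype chain, so every child
    // appends to the single array owned by the root instance.
    this.records.push({ module: this.module, message })
  }
}

const base = new MiniLogger()
const knowledgeLogger = base.withContext('KnowledgeService')
const mcpLogger = base.withContext('MCPService')
knowledgeLogger.info('hello')
mcpLogger.info('world')
```

Creating a context logger is therefore cheap (no second Winston logger is built), which is why each service in this PR can call `loggerService.withContext('…')` once at module scope.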
@@ -2,6 +2,7 @@ import crypto from 'node:crypto'
 import os from 'node:os'
 import path from 'node:path'

+import { loggerService } from '@logger'
 import { createInMemoryMCPServer } from '@main/mcpServers/factory'
 import { makeSureDirExists } from '@main/utils'
 import { buildFunctionCallToolName } from '@main/utils/mcp'
@@ -35,7 +36,6 @@ import {
   MCPTool
 } from '@types'
 import { app } from 'electron'
-import Logger from 'electron-log'
 import { EventEmitter } from 'events'
 import { memoize } from 'lodash'
 import { v4 as uuidv4 } from 'uuid'
@@ -49,6 +49,8 @@ import getLoginShellEnvironment from './mcp/shell-env'
 // Generic type for caching wrapped functions
 type CachedFunction<T extends unknown[], R> = (...args: T) => Promise<R>

+const logger = loggerService.withContext('MCPService')
+
 /**
  * Higher-order function to add caching capability to any async function
  * @param fn The original function to be wrapped with caching
@@ -67,7 +69,7 @@ function withCache<T extends unknown[], R>(
     const cacheKey = getCacheKey(...args)

     if (CacheService.has(cacheKey)) {
-      Logger.info(`${logPrefix} loaded from cache`)
+      logger.debug(`${logPrefix} loaded from cache`)
       const cachedData = CacheService.get<R>(cacheKey)
       if (cachedData) {
         return cachedData
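The `withCache` hunk above only changes the log call, but the surrounding helper is a generic async-memoization wrapper. A minimal self-contained sketch of the technique (using a plain `Map` instead of the project's `CacheService`; `listTools` and the key format are illustrative):

```typescript
type CachedFunction<T extends unknown[], R> = (...args: T) => Promise<R>

// Wrap any async function with a key-derived cache: on a hit, return the
// cached value; on a miss, call through and store the result.
function withCache<T extends unknown[], R>(
  fn: (...args: T) => Promise<R>,
  getCacheKey: (...args: T) => string,
  cache: Map<string, R> = new Map()
): CachedFunction<T, R> {
  return async (...args: T): Promise<R> => {
    const cacheKey = getCacheKey(...args)
    const hit = cache.get(cacheKey)
    if (hit !== undefined) {
      // Where the real helper logs `${logPrefix} loaded from cache`
      return hit
    }
    const result = await fn(...args)
    cache.set(cacheKey, result)
    return result
  }
}

// Usage: repeated calls with the same key skip the underlying function.
let calls = 0
const listTools = withCache(
  async (server: string) => {
    calls++
    return [`tool-of-${server}`]
  },
  (server) => `mcp:list_tool:${server}`
)
```

Pairing the cache key with the server key (`mcp:list_tool:${serverKey}`) is what lets the notification handlers below invalidate exactly one server's entries with `CacheService.remove(...)`.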
@@ -130,7 +132,7 @@ class McpService {
       try {
         // Check if the existing client is still connected
         const pingResult = await existingClient.ping()
-        Logger.info(`[MCP] Ping result for ${server.name}:`, pingResult)
+        logger.debug(`Ping result for ${server.name}:`, pingResult)
         // If the ping fails, remove the client from the cache
         // and create a new one
         if (!pingResult) {
@@ -139,7 +141,7 @@ class McpService {
           return existingClient
         }
       } catch (error: any) {
-        Logger.error(`[MCP] Error pinging server ${server.name}:`, error?.message)
+        logger.error(`Error pinging server ${server.name}:`, error?.message)
         this.clients.delete(serverKey)
       }
     }
@@ -165,15 +167,15 @@ class McpService {
     > => {
       // Create appropriate transport based on configuration
       if (server.type === 'inMemory') {
-        Logger.info(`[MCP] Using in-memory transport for server: ${server.name}`)
+        logger.debug(`Using in-memory transport for server: ${server.name}`)
         const [clientTransport, serverTransport] = InMemoryTransport.createLinkedPair()
         // start the in-memory server with the given name and environment variables
         const inMemoryServer = createInMemoryMCPServer(server.name, args, server.env || {})
         try {
           await inMemoryServer.connect(serverTransport)
-          Logger.info(`[MCP] In-memory server started: ${server.name}`)
+          logger.debug(`In-memory server started: ${server.name}`)
         } catch (error: Error | any) {
-          Logger.error(`[MCP] Error starting in-memory server: ${error}`)
+          logger.error(`Error starting in-memory server: ${error}`)
           throw new Error(`Failed to start in-memory server: ${error.message}`)
         }
         // set the client transport to the client
@@ -201,7 +203,7 @@ class McpService {
             headers['Authorization'] = `Bearer ${tokens.access_token}`
           }
         } catch (error) {
-          Logger.error('Failed to fetch tokens:', error)
+          logger.error('Failed to fetch tokens:', error)
         }
       }

@@ -231,15 +233,15 @@ class McpService {
             ...server.env,
             ...resolvedConfig.env
           }
-          Logger.info(`[MCP] Using resolved DXT config - command: ${cmd}, args: ${args?.join(' ')}`)
+          logger.debug(`Using resolved DXT config - command: ${cmd}, args: ${args?.join(' ')}`)
         } else {
-          Logger.warn(`[MCP] Failed to resolve DXT config for ${server.name}, falling back to manifest values`)
+          logger.warn(`Failed to resolve DXT config for ${server.name}, falling back to manifest values`)
         }
       }

       if (server.command === 'npx') {
         cmd = await getBinaryPath('bun')
-        Logger.info(`[MCP] Using command: ${cmd}`)
+        logger.debug(`Using command: ${cmd}`)

         // add -x to args if args exist
         if (args && args.length > 0) {
@@ -274,7 +276,7 @@ class McpService {
         }
       }

-      Logger.info(`[MCP] Starting server with command: ${cmd} ${args ? args.join(' ') : ''}`)
+      logger.debug(`Starting server with command: ${cmd} ${args ? args.join(' ') : ''}`)
       // Logger.info(`[MCP] Environment variables for server:`, server.env)
       const loginShellEnv = await this.getLoginShellEnv()

@@ -296,12 +298,12 @@ class McpService {
       // For DXT servers, set the working directory to the extracted path
       if (server.dxtPath) {
         transportOptions.cwd = server.dxtPath
-        Logger.info(`[MCP] Setting working directory for DXT server: ${server.dxtPath}`)
+        logger.debug(`Setting working directory for DXT server: ${server.dxtPath}`)
       }

       const stdioTransport = new StdioClientTransport(transportOptions)
       stdioTransport.stderr?.on('data', (data) =>
-        Logger.info(`[MCP] Stdio stderr for server: ${server.name} `, data.toString())
+        logger.debug(`Stdio stderr for server: ${server.name}` + data.toString())
       )
       return stdioTransport
     } else {
@@ -310,7 +312,7 @@ class McpService {
     }

     const handleAuth = async (client: Client, transport: SSEClientTransport | StreamableHTTPClientTransport) => {
-      Logger.info(`[MCP] Starting OAuth flow for server: ${server.name}`)
+      logger.debug(`Starting OAuth flow for server: ${server.name}`)
       // Create an event emitter for the OAuth callback
       const events = new EventEmitter()

@@ -323,27 +325,27 @@ class McpService {

       // Set a timeout to close the callback server
       const timeoutId = setTimeout(() => {
-        Logger.warn(`[MCP] OAuth flow timed out for server: ${server.name}`)
+        logger.warn(`OAuth flow timed out for server: ${server.name}`)
         callbackServer.close()
       }, 300000) // 5 minutes timeout

       try {
         // Wait for the authorization code
         const authCode = await callbackServer.waitForAuthCode()
-        Logger.info(`[MCP] Received auth code: ${authCode}`)
+        logger.debug(`Received auth code: ${authCode}`)

         // Complete the OAuth flow
         await transport.finishAuth(authCode)

-        Logger.info(`[MCP] OAuth flow completed for server: ${server.name}`)
+        logger.debug(`OAuth flow completed for server: ${server.name}`)

         const newTransport = await initTransport()
         // Try to connect again
         await client.connect(newTransport)

-        Logger.info(`[MCP] Successfully authenticated with server: ${server.name}`)
+        logger.debug(`Successfully authenticated with server: ${server.name}`)
       } catch (oauthError) {
-        Logger.error(`[MCP] OAuth authentication failed for server ${server.name}:`, oauthError)
+        logger.error(`OAuth authentication failed for server ${server.name}:`, oauthError)
         throw new Error(
           `OAuth authentication failed: ${oauthError instanceof Error ? oauthError.message : String(oauthError)}`
         )
@@ -363,7 +365,7 @@ class McpService {
           error instanceof Error &&
           (error.name === 'UnauthorizedError' || error.message.includes('Unauthorized'))
         ) {
-          Logger.info(`[MCP] Authentication required for server: ${server.name}`)
+          logger.debug(`Authentication required for server: ${server.name}`)
           await handleAuth(client, transport as SSEClientTransport | StreamableHTTPClientTransport)
         } else {
           throw error
@@ -379,10 +381,16 @@ class McpService {
         // Clear existing cache to ensure fresh data
         this.clearServerCache(serverKey)

-        Logger.info(`[MCP] Activated server: ${server.name}`)
+        // Set up notification handlers
+        this.setupNotificationHandlers(client, server)
+
+        // Clear existing cache to ensure fresh data
+        this.clearServerCache(serverKey)
+
+        logger.debug(`Activated server: ${server.name}`)
         return client
       } catch (error: any) {
-        Logger.error(`[MCP] Error activating server ${server.name}:`, error?.message)
+        logger.error(`Error activating server ${server.name}:`, error?.message)
         throw new Error(`[MCP] Error activating server ${server.name}: ${error.message}`)
       }
     } finally {
@@ -406,50 +414,50 @@ class McpService {
     try {
       // Set up tools list changed notification handler
       client.setNotificationHandler(ToolListChangedNotificationSchema, async () => {
-        Logger.info(`[MCP] Tools list changed for server: ${server.name}`)
+        logger.debug(`Tools list changed for server: ${server.name}`)
         // Clear tools cache
         CacheService.remove(`mcp:list_tool:${serverKey}`)
       })

       // Set up resources list changed notification handler
       client.setNotificationHandler(ResourceListChangedNotificationSchema, async () => {
-        Logger.info(`[MCP] Resources list changed for server: ${server.name}`)
+        logger.debug(`Resources list changed for server: ${server.name}`)
         // Clear resources cache
         CacheService.remove(`mcp:list_resources:${serverKey}`)
       })

       // Set up prompts list changed notification handler
       client.setNotificationHandler(PromptListChangedNotificationSchema, async () => {
-        Logger.info(`[MCP] Prompts list changed for server: ${server.name}`)
+        logger.debug(`Prompts list changed for server: ${server.name}`)
         // Clear prompts cache
         CacheService.remove(`mcp:list_prompts:${serverKey}`)
       })

       // Set up resource updated notification handler
       client.setNotificationHandler(ResourceUpdatedNotificationSchema, async () => {
-        Logger.info(`[MCP] Resource updated for server: ${server.name}`)
+        logger.debug(`Resource updated for server: ${server.name}`)
         // Clear resource-specific caches
         this.clearResourceCaches(serverKey)
       })

       // Set up progress notification handler
       client.setNotificationHandler(ProgressNotificationSchema, async (notification) => {
-        Logger.info(`[MCP] Progress notification received for server: ${server.name}`, notification.params)
+        logger.debug(`Progress notification received for server: ${server.name}`, notification.params)
       })

       // Set up cancelled notification handler
       client.setNotificationHandler(CancelledNotificationSchema, async (notification) => {
-        Logger.info(`[MCP] Operation cancelled for server: ${server.name}`, notification.params)
+        logger.debug(`Operation cancelled for server: ${server.name}`, notification.params)
       })

       // Set up logging message notification handler
       client.setNotificationHandler(LoggingMessageNotificationSchema, async (notification) => {
-        Logger.info(`[MCP] Message from server ${server.name}:`, notification.params)
+        logger.debug(`Message from server ${server.name}:`, notification.params)
       })

-      Logger.info(`[MCP] Set up notification handlers for server: ${server.name}`)
+      logger.debug(`Set up notification handlers for server: ${server.name}`)
     } catch (error) {
-      Logger.error(`[MCP] Failed to set up notification handlers for server ${server.name}:`, error)
+      logger.error(`Failed to set up notification handlers for server ${server.name}:`, error)
     }
   }

@@ -467,7 +475,7 @@ class McpService {
     CacheService.remove(`mcp:list_tool:${serverKey}`)
     CacheService.remove(`mcp:list_prompts:${serverKey}`)
     CacheService.remove(`mcp:list_resources:${serverKey}`)
-    Logger.info(`[MCP] Cleared all caches for server: ${serverKey}`)
+    logger.debug(`Cleared all caches for server: ${serverKey}`)
   }

   async closeClient(serverKey: string) {
@@ -475,18 +483,18 @@ class McpService {
     if (client) {
       // Remove the client from the cache
       await client.close()
-      Logger.info(`[MCP] Closed server: ${serverKey}`)
+      logger.debug(`Closed server: ${serverKey}`)
       this.clients.delete(serverKey)
       // Clear all caches for this server
       this.clearServerCache(serverKey)
     } else {
-      Logger.warn(`[MCP] No client found for server: ${serverKey}`)
+      logger.warn(`No client found for server: ${serverKey}`)
     }
   }

   async stopServer(_: Electron.IpcMainInvokeEvent, server: MCPServer) {
     const serverKey = this.getServerKey(server)
-    Logger.info(`[MCP] Stopping server: ${server.name}`)
+    logger.debug(`Stopping server: ${server.name}`)
     await this.closeClient(serverKey)
   }

@@ -502,16 +510,16 @@ class McpService {
       try {
         const cleaned = this.dxtService.cleanupDxtServer(server.name)
         if (cleaned) {
-          Logger.info(`[MCP] Cleaned up DXT server directory for: ${server.name}`)
+          logger.debug(`Cleaned up DXT server directory for: ${server.name}`)
         }
       } catch (error) {
-        Logger.error(`[MCP] Failed to cleanup DXT server: ${server.name}`, error)
+        logger.error(`Failed to cleanup DXT server: ${server.name}`, error)
       }
     }
   }

   async restartServer(_: Electron.IpcMainInvokeEvent, server: MCPServer) {
-    Logger.info(`[MCP] Restarting server: ${server.name}`)
+    logger.debug(`Restarting server: ${server.name}`)
     const serverKey = this.getServerKey(server)
     await this.closeClient(serverKey)
     // Clear cache before restarting to ensure fresh data
@@ -524,7 +532,7 @@ class McpService {
       try {
         await this.closeClient(key)
       } catch (error: any) {
-        Logger.error(`[MCP] Failed to close client: ${error?.message}`)
+        logger.error(`Failed to close client: ${error?.message}`)
       }
     }
   }
@@ -533,9 +541,9 @@ class McpService {
    * Check connectivity for an MCP server
    */
   public async checkMcpConnectivity(_: Electron.IpcMainInvokeEvent, server: MCPServer): Promise<boolean> {
-    Logger.info(`[MCP] Checking connectivity for server: ${server.name}`)
+    logger.debug(`Checking connectivity for server: ${server.name}`)
     try {
-      Logger.info(`[MCP] About to call initClient for server: ${server.name}`, { hasInitClient: !!this.initClient })
+      logger.debug(`About to call initClient for server: ${server.name}`, { hasInitClient: !!this.initClient })

       if (!this.initClient) {
         throw new Error('initClient method is not available')
@@ -544,10 +552,10 @@ class McpService {
       const client = await this.initClient(server)
       // Attempt to list tools as a way to check connectivity
      await client.listTools()
-      Logger.info(`[MCP] Connectivity check successful for server: ${server.name}`)
+      logger.debug(`Connectivity check successful for server: ${server.name}`)
       return true
     } catch (error) {
-      Logger.error(`[MCP] Connectivity check failed for server: ${server.name}`, error)
+      logger.error(`Connectivity check failed for server: ${server.name}`, error)
       // Close the client if connectivity check fails to ensure a clean state for the next attempt
       const serverKey = this.getServerKey(server)
       await this.closeClient(serverKey)
@@ -556,7 +564,7 @@ class McpService {
   }

   private async listToolsImpl(server: MCPServer): Promise<MCPTool[]> {
-    Logger.info(`[MCP] Listing tools for server: ${server.name}`)
+    logger.debug(`Listing tools for server: ${server.name}`)
     const client = await this.initClient(server)
     try {
       const { tools } = await client.listTools()
@@ -572,7 +580,7 @@ class McpService {
       })
       return serverTools
     } catch (error: any) {
-      Logger.error(`[MCP] Failed to list tools for server: ${server.name}`, error?.message)
+      logger.error(`Failed to list tools for server: ${server.name}`, error?.message)
       return []
     }
   }
@@ -603,12 +611,12 @@ class McpService {
     this.activeToolCalls.set(toolCallId, abortController)

     try {
-      Logger.info('[MCP] Calling:', server.name, name, args, 'callId:', toolCallId)
+      logger.debug('Calling:', server.name, name, args, 'callId:', toolCallId)
       if (typeof args === 'string') {
         try {
           args = JSON.parse(args)
         } catch (e) {
-          Logger.error('[MCP] args parse error', args)
+          logger.error('args parse error', args)
         }
       }
       const client = await this.initClient(server)
@@ -622,7 +630,7 @@ class McpService {
       })
       return result as MCPCallToolResponse
     } catch (error) {
-      Logger.error(`[MCP] Error calling tool ${name} on ${server.name}:`, error)
+      logger.error(`Error calling tool ${name} on ${server.name}:`, error)
       throw error
     } finally {
       this.activeToolCalls.delete(toolCallId)
@@ -643,7 +651,7 @@ class McpService {
    */
   private async listPromptsImpl(server: MCPServer): Promise<MCPPrompt[]> {
     const client = await this.initClient(server)
-    Logger.info(`[MCP] Listing prompts for server: ${server.name}`)
+    logger.debug(`Listing prompts for server: ${server.name}`)
     try {
       const { prompts } = await client.listPrompts()
       return prompts.map((prompt: any) => ({
@@ -655,7 +663,7 @@ class McpService {
     } catch (error: any) {
       // -32601 is the code for the method not found
       if (error?.code !== -32601) {
-        Logger.error(`[MCP] Failed to list prompts for server: ${server.name}`, error?.message)
+        logger.error(`Failed to list prompts for server: ${server.name}`, error?.message)
       }
       return []
     }
@@ -685,7 +693,7 @@ class McpService {
     name: string,
     args?: Record<string, any>
   ): Promise<GetMCPPromptResponse> {
-    Logger.info(`[MCP] Getting prompt ${name} from server: ${server.name}`)
+    logger.debug(`Getting prompt ${name} from server: ${server.name}`)
|
||||
const client = await this.initClient(server)
|
||||
return await client.getPrompt({ name, arguments: args })
|
||||
}
|
||||
@@ -715,7 +723,7 @@ class McpService {
|
||||
*/
|
||||
private async listResourcesImpl(server: MCPServer): Promise<MCPResource[]> {
|
||||
const client = await this.initClient(server)
|
||||
Logger.info(`[MCP] Listing resources for server: ${server.name}`)
|
||||
logger.debug(`Listing resources for server: ${server.name}`)
|
||||
try {
|
||||
const result = await client.listResources()
|
||||
const resources = result.resources || []
|
||||
@@ -727,7 +735,7 @@ class McpService {
|
||||
} catch (error: any) {
|
||||
// -32601 is the code for the method not found
|
||||
if (error?.code !== -32601) {
|
||||
Logger.error(`[MCP] Failed to list resources for server: ${server.name}`, error?.message)
|
||||
logger.error(`Failed to list resources for server: ${server.name}`, error?.message)
|
||||
}
|
||||
return []
|
||||
}
|
||||
@@ -753,7 +761,7 @@ class McpService {
|
||||
* Get a specific resource from an MCP server (implementation)
|
||||
*/
|
||||
private async getResourceImpl(server: MCPServer, uri: string): Promise<GetResourceResponse> {
|
||||
Logger.info(`[MCP] Getting resource ${uri} from server: ${server.name}`)
|
||||
logger.debug(`Getting resource ${uri} from server: ${server.name}`)
|
||||
const client = await this.initClient(server)
|
||||
try {
|
||||
const result = await client.readResource({ uri: uri })
|
||||
@@ -771,7 +779,7 @@ class McpService {
|
||||
contents: contents
|
||||
}
|
||||
} catch (error: Error | any) {
|
||||
Logger.error(`[MCP] Failed to get resource ${uri} from server: ${server.name}`, error.message)
|
||||
logger.error(`Failed to get resource ${uri} from server: ${server.name}`, error.message)
|
||||
throw new Error(`Failed to get resource ${uri} from server: ${server.name}: ${error.message}`)
|
||||
}
|
||||
}
|
||||
@@ -801,10 +809,10 @@ class McpService {
|
||||
const pathSeparator = process.platform === 'win32' ? ';' : ':'
|
||||
const cherryBinPath = path.join(os.homedir(), '.cherrystudio', 'bin')
|
||||
loginEnv.PATH = `${loginEnv.PATH}${pathSeparator}${cherryBinPath}`
|
||||
Logger.info('[MCP] Successfully fetched login shell environment variables:')
|
||||
logger.debug('Successfully fetched login shell environment variables:')
|
||||
return loginEnv
|
||||
} catch (error) {
|
||||
Logger.error('[MCP] Failed to fetch login shell environment variables:', error)
|
||||
logger.error('Failed to fetch login shell environment variables:', error)
|
||||
return {}
|
||||
}
|
||||
})
|
||||
@@ -823,10 +831,10 @@ class McpService {
|
||||
if (activeToolCall) {
|
||||
activeToolCall.abort()
|
||||
this.activeToolCalls.delete(callId)
|
||||
Logger.info(`[MCP] Aborted tool call: ${callId}`)
|
||||
logger.debug(`Aborted tool call: ${callId}`)
|
||||
return true
|
||||
} else {
|
||||
Logger.warn(`[MCP] No active tool call found for callId: ${callId}`)
|
||||
logger.warn(`No active tool call found for callId: ${callId}`)
|
||||
return false
|
||||
}
|
||||
}
|
||||
@@ -836,22 +844,22 @@ class McpService {
|
||||
*/
|
||||
public async getServerVersion(_: Electron.IpcMainInvokeEvent, server: MCPServer): Promise<string | null> {
|
||||
try {
|
||||
Logger.info(`[MCP] Getting server version for: ${server.name}`)
|
||||
logger.debug(`Getting server version for: ${server.name}`)
|
||||
const client = await this.initClient(server)
|
||||
|
||||
// Try to get server information which may include version
|
||||
const serverInfo = client.getServerVersion()
|
||||
Logger.info(`[MCP] Server info for ${server.name}:`, serverInfo)
|
||||
logger.debug(`Server info for ${server.name}:`, serverInfo)
|
||||
|
||||
if (serverInfo && serverInfo.version) {
|
||||
Logger.info(`[MCP] Server version for ${server.name}: ${serverInfo.version}`)
|
||||
logger.debug(`Server version for ${server.name}: ${serverInfo.version}`)
|
||||
return serverInfo.version
|
||||
}
|
||||
|
||||
Logger.warn(`[MCP] No version information available for server: ${server.name}`)
|
||||
logger.warn(`No version information available for server: ${server.name}`)
|
||||
return null
|
||||
} catch (error: any) {
|
||||
Logger.error(`[MCP] Failed to get server version for ${server.name}:`, error?.message)
|
||||
logger.error(`Failed to get server version for ${server.name}:`, error?.message)
|
||||
return null
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,5 +1,6 @@
 import path from 'node:path'

+import { loggerService } from '@logger'
 import { NUTSTORE_HOST } from '@shared/config/nutstore'
 import { XMLParser } from 'fast-xml-parser'
 import { isNil, partial } from 'lodash'
@@ -7,6 +8,8 @@ import { type FileStat } from 'webdav'

 import { createOAuthUrl, decryptSecret } from '../integration/nutstore/sso/lib/index.mjs'

+const logger = loggerService.withContext('NutstoreService')
+
 interface OAuthResponse {
 username: string
 userid: string
@@ -45,7 +48,7 @@ export async function decryptToken(token: string) {
 })
 return JSON.parse(decrypted) as OAuthResponse
 } catch (error) {
-console.error('解密失败:', error)
+logger.error('Failed to decrypt token:', error)
 return null
 }
 }

@@ -1,8 +1,9 @@
+import { loggerService } from '@logger'
 import { app } from 'electron'
-import Logger from 'electron-log'
 import fs from 'fs'
 import path from 'path'

+const logger = loggerService.withContext('ObsidianVaultService')
 interface VaultInfo {
 path: string
 name: string
@@ -56,7 +57,7 @@ class ObsidianVaultService {
 name: vault.name || path.basename(vault.path)
 }))
 } catch (error) {
-console.error('获取Obsidian Vault失败:', error)
+logger.error('Failed to get Obsidian Vault:', error)
 return []
 }
 }
@@ -70,20 +71,20 @@ class ObsidianVaultService {
 try {
 // Check whether the vault path exists
 if (!fs.existsSync(vaultPath)) {
-console.error('Vault路径不存在:', vaultPath)
+logger.error('Vault path does not exist:', vaultPath)
 return []
 }

 // Check whether it is a directory
 const stats = fs.statSync(vaultPath)
 if (!stats.isDirectory()) {
-console.error('Vault路径不是一个目录:', vaultPath)
+logger.error('Vault path is not a directory:', vaultPath)
 return []
 }

 this.traverseDirectory(vaultPath, '', results)
 } catch (error) {
-console.error('读取Vault文件夹结构失败:', error)
+logger.error('Failed to read Vault folder structure:', error)
 }

 return results
@@ -105,7 +106,7 @@ class ObsidianVaultService {

 // Ensure the directory exists and is accessible
 if (!fs.existsSync(dirPath)) {
-console.error('目录不存在:', dirPath)
+logger.error('Directory does not exist:', dirPath)
 return
 }

@@ -113,7 +114,7 @@ class ObsidianVaultService {
 try {
 items = fs.readdirSync(dirPath, { withFileTypes: true })
 } catch (err) {
-console.error(`无法读取目录 ${dirPath}:`, err)
+logger.error(`Failed to read directory ${dirPath}:`, err)
 return
 }

@@ -138,7 +139,7 @@ class ObsidianVaultService {
 }
 }
 } catch (error) {
-console.error(`遍历目录出错 ${dirPath}:`, error)
+logger.error(`Failed to traverse directory ${dirPath}:`, error)
 }
 }

@@ -152,14 +153,14 @@ class ObsidianVaultService {
 const vault = vaults.find((v) => v.name === vaultName)

 if (!vault) {
-console.error('未找到指定名称的Vault:', vaultName)
+logger.error('Vault not found:', vaultName)
 return []
 }

-Logger.log('获取Vault文件结构:', vault.name, vault.path)
+logger.debug('Get Vault file structure:', vault.name, vault.path)
 return this.getVaultStructure(vault.path)
 } catch (error) {
-console.error('获取Vault文件结构时发生错误:', error)
+logger.error('Failed to get Vault file structure:', error)
 return []
 }
 }

@@ -3,13 +3,15 @@ import fs from 'node:fs/promises'
 import path from 'node:path'
 import { promisify } from 'node:util'

+import { loggerService } from '@logger'
 import { app } from 'electron'
-import Logger from 'electron-log'

 import { handleProvidersProtocolUrl } from './urlschema/handle-providers'
 import { handleMcpProtocolUrl } from './urlschema/mcp-install'
 import { windowService } from './WindowService'

+const logger = loggerService.withContext('ProtocolClient')
+
 export const CHERRY_STUDIO_PROTOCOL = 'cherrystudio'

 export function registerProtocolClient(app: Electron.App) {
@@ -65,12 +67,12 @@ export async function setupAppImageDeepLink(): Promise<void> {
 return
 }

-Logger.info('AppImage environment detected on Linux, setting up deep link.')
+logger.debug('AppImage environment detected on Linux, setting up deep link.')

 try {
 const appPath = app.getPath('exe')
 if (!appPath) {
-Logger.error('Could not determine App path.')
+logger.error('Could not determine App path.')
 return
 }

@@ -95,24 +97,24 @@ NoDisplay=true

 // Write the .desktop file (overwrite if exists)
 await fs.writeFile(desktopFilePath, desktopFileContent, 'utf-8')
-Logger.info(`Created/Updated desktop file: ${desktopFilePath}`)
+logger.debug(`Created/Updated desktop file: ${desktopFilePath}`)

 // Update the desktop database
 // It's important to update the database for the changes to take effect
 try {
 const { stdout, stderr } = await execAsync(`update-desktop-database ${escapePathForExec(applicationsDir)}`)
 if (stderr) {
-Logger.warn(`update-desktop-database stderr: ${stderr}`)
+logger.warn(`update-desktop-database stderr: ${stderr}`)
 }
-Logger.info(`update-desktop-database stdout: ${stdout}`)
-Logger.info('Desktop database updated successfully.')
+logger.debug(`update-desktop-database stdout: ${stdout}`)
+logger.debug('Desktop database updated successfully.')
 } catch (updateError) {
-Logger.error('Failed to update desktop database:', updateError)
+logger.error('Failed to update desktop database:', updateError)
 // Continue even if update fails, as the file is still created.
 }
 } catch (error) {
 // Log the error but don't prevent the app from starting
-Logger.error('Failed to setup AppImage deep link:', error)
+logger.error('Failed to setup AppImage deep link:', error)
 }
 }

@@ -1,6 +1,6 @@
+import { loggerService } from '@logger'
 import axios from 'axios'
 import { app, ProxyConfig, session } from 'electron'
-import Logger from 'electron-log'
 import { socksDispatcher } from 'fetch-socks'
 import http from 'http'
 import https from 'https'
@@ -8,6 +8,8 @@ import { getSystemProxy } from 'os-proxy-config'
 import { ProxyAgent } from 'proxy-agent'
 import { Dispatcher, EnvHttpProxyAgent, getGlobalDispatcher, setGlobalDispatcher } from 'undici'

+const logger = loggerService.withContext('ProxyManager')
+
 export class ProxyManager {
 private config: ProxyConfig = { mode: 'direct' }
 private systemProxyInterval: NodeJS.Timeout | null = null
@@ -59,7 +61,7 @@ export class ProxyManager {
 }

 async configureProxy(config: ProxyConfig): Promise<void> {
-Logger.info('configureProxy', config.mode, config.proxyRules)
+logger.info('configureProxy', config.mode, config.proxyRules)
 if (this.isSettingProxy) {
 return
 }
@@ -68,7 +70,7 @@ export class ProxyManager {

 try {
 if (config?.mode === this.config?.mode && config?.proxyRules === this.config?.proxyRules) {
-Logger.info('proxy config is the same, skip configure')
+logger.info('proxy config is the same, skip configure')
 return
 }

@@ -77,7 +79,7 @@ export class ProxyManager {
 if (config.mode === 'system') {
 const currentProxy = await getSystemProxy()
 if (currentProxy) {
-Logger.info('current system proxy', currentProxy.proxyUrl)
+logger.info('current system proxy', currentProxy.proxyUrl)
 this.config.proxyRules = currentProxy.proxyUrl.toLowerCase()
 this.monitorSystemProxy()
 } else {
@@ -88,7 +90,7 @@ export class ProxyManager {

 this.setGlobalProxy()
 } catch (error) {
-Logger.error('Failed to config proxy:', error)
+logger.error('Failed to config proxy:', error)
 throw error
 } finally {
 this.isSettingProxy = false

@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { IpcChannel } from '@shared/IpcChannel'
 import { ipcMain } from 'electron'
 import { EventEmitter } from 'events'
@@ -7,6 +8,8 @@ import { windowService } from './WindowService'
 type StoreValue = any
 type Unsubscribe = () => void

+const logger = loggerService.withContext('ReduxService')
+
 export class ReduxService extends EventEmitter {
 private stateCache: any = {}
 private isReady = false
@@ -65,7 +68,7 @@ export class ReduxService extends EventEmitter {
 const selectorFn = new Function('state', `return ${selector}`)
 return selectorFn(this.stateCache)
 } catch (error) {
-console.error('Failed to select from cache:', error)
+logger.error('Failed to select from cache:', error)
 return undefined
 }
 }
@@ -94,7 +97,7 @@ export class ReduxService extends EventEmitter {
 })()
 `)
 } catch (error) {
-console.error('Failed to select store value:', error)
+logger.error('Failed to select store value:', error)
 throw error
 }
 }
@@ -111,7 +114,7 @@ export class ReduxService extends EventEmitter {
 window.store.dispatch(${JSON.stringify(action)})
 `)
 } catch (error) {
-console.error('Failed to dispatch action:', error)
+logger.error('Failed to dispatch action:', error)
 throw error
 }
 }
@@ -149,7 +152,7 @@ export class ReduxService extends EventEmitter {
 const newValue = await this.select(selector)
 callback(newValue)
 } catch (error) {
-console.error('Error in subscription handler:', error)
+logger.error('Error in subscription handler:', error)
 }
 }

@@ -171,7 +174,7 @@ export class ReduxService extends EventEmitter {
 window.store.getState()
 `)
 } catch (error) {
-console.error('Failed to get state:', error)
+logger.error('Failed to get state:', error)
 throw error
 }
 }
@@ -191,7 +194,7 @@ export const reduxService = new ReduxService()
 try {
 // Read state
 const settings = await reduxService.select('state.settings')
-Logger.log('settings', settings)
+logger.log('settings', settings)

 // Dispatch an action
 await reduxService.dispatch({
@@ -201,7 +204,7 @@ export const reduxService = new ReduxService()

 // Subscribe to state changes
 const unsubscribe = await reduxService.subscribe('state.settings.apiKey', (newValue) => {
-Logger.log('API key changed:', newValue)
+logger.log('API key changed:', newValue)
 })

 // Run actions in batch
@@ -212,16 +215,16 @@ export const reduxService = new ReduxService()

 // The sync method may not return the latest data, but it responds faster
 const apiKey = reduxService.selectSync('state.settings.apiKey')
-Logger.log('apiKey', apiKey)
+logger.log('apiKey', apiKey)

 // Guaranteed to be the latest data
 const apiKey1 = await reduxService.select('state.settings.apiKey')
-Logger.log('apiKey1', apiKey1)
+logger.log('apiKey1', apiKey1)

 // Unsubscribe
 unsubscribe()
 } catch (error) {
-Logger.error('Error:', error)
+logger.error('Error:', error)
 }
 }
 */

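The commented-out usage block in the ReduxService diff above shows the intended select/subscribe/dispatch API. A runnable sketch of that pattern, using a minimal in-memory stand-in for the real IPC-backed service (the class name, action type, and state shape here are assumptions for illustration, not the actual implementation):

```typescript
// In-memory stand-in for the IPC-backed ReduxService shown above. Selector
// strings like 'state.settings.apiKey' are evaluated against a cached state
// object, mirroring the `new Function('state', ...)` trick in the diff.
type Listener = (value: unknown) => void

class ReduxServiceSketch {
  private stateCache: any = { settings: { apiKey: 'initial' } }
  private listeners = new Map<string, Set<Listener>>()

  selectSync(selector: string): unknown {
    const selectorFn = new Function('state', `return ${selector}`)
    return selectorFn(this.stateCache)
  }

  async select(selector: string): Promise<unknown> {
    // The real service asks the renderer for fresh state; here the cache is the truth.
    return this.selectSync(selector)
  }

  subscribe(selector: string, cb: Listener): () => void {
    const set = this.listeners.get(selector) ?? new Set<Listener>()
    set.add(cb)
    this.listeners.set(selector, set)
    return () => {
      set.delete(cb)
    }
  }

  dispatch(action: { type: string; payload?: unknown }): void {
    if (action.type === 'settings/setApiKey') {
      this.stateCache.settings.apiKey = action.payload
      this.listeners.get('state.settings.apiKey')?.forEach((cb) => cb(action.payload))
    }
  }
}

const redux = new ReduxServiceSketch()
const seen: unknown[] = []
const unsubscribe = redux.subscribe('state.settings.apiKey', (v) => seen.push(v))
redux.dispatch({ type: 'settings/setApiKey', payload: 'sk-123' })
unsubscribe()
```

The `new Function` evaluation is why the real service wraps selection in try/catch and logs failures: a malformed selector string throws at evaluation time.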
@@ -6,11 +6,13 @@ import {
 PutObjectCommand,
 S3Client
 } from '@aws-sdk/client-s3'
+import { loggerService } from '@logger'
 import type { S3Config } from '@types'
-import Logger from 'electron-log'
 import * as net from 'net'
 import { Readable } from 'stream'

+const logger = loggerService.withContext('S3Storage')
+
 /**
 * Convert a readable stream to a Buffer
 */
@@ -50,7 +52,7 @@ export default class S3Storage {
 const isInWhiteList = VIRTUAL_HOST_SUFFIXES.some((suffix) => hostname.endsWith(suffix))
 return !isInWhiteList
 } catch (e) {
-Logger.warn('[S3Storage] Failed to parse endpoint, fallback to Path-Style:', endpoint, e)
+logger.warn('[S3Storage] Failed to parse endpoint, fallback to Path-Style:', endpoint, e)
 return true
 }
 })()
@@ -96,7 +98,7 @@ export default class S3Storage {
 })
 )
 } catch (error) {
-Logger.error('[S3Storage] Error putting object:', error)
+logger.error('[S3Storage] Error putting object:', error)
 throw error
 }
 }
@@ -109,7 +111,7 @@ export default class S3Storage {
 }
 return await streamToBuffer(res.Body as Readable)
 } catch (error) {
-Logger.error('[S3Storage] Error getting object:', error)
+logger.error('[S3Storage] Error getting object:', error)
 throw error
 }
 }
@@ -126,7 +128,7 @@ export default class S3Storage {
 }
 }
 } catch (error) {
-Logger.error('[S3Storage] Error deleting object:', error)
+logger.error('[S3Storage] Error deleting object:', error)
 throw error
 }
 }
@@ -163,7 +165,7 @@ export default class S3Storage {

 return files
 } catch (error) {
-Logger.error('[S3Storage] Error listing objects:', error)
+logger.error('[S3Storage] Error listing objects:', error)
 throw error
 }
 }
@@ -176,7 +178,7 @@ export default class S3Storage {
 await this.client.send(new HeadBucketCommand({ Bucket: this.bucket }))
 return true
 } catch (error) {
-Logger.error('[S3Storage] Error checking connection:', error)
+logger.error('[S3Storage] Error checking connection:', error)
 throw error
 }
 }

@@ -1,8 +1,8 @@
+import { loggerService } from '@logger'
 import { SELECTION_FINETUNED_LIST, SELECTION_PREDEFINED_BLACKLIST } from '@main/configs/SelectionConfig'
 import { isDev, isMac, isWin } from '@main/constant'
 import { IpcChannel } from '@shared/IpcChannel'
 import { app, BrowserWindow, ipcMain, screen, systemPreferences } from 'electron'
-import Logger from 'electron-log'
 import { join } from 'path'
 import type {
 KeyboardEventData,
@@ -16,6 +16,8 @@ import type { ActionItem } from '../../renderer/src/types/selectionTypes'
 import { ConfigKeys, configManager } from './ConfigManager'
 import storeSyncService from './StoreSyncService'

+const logger = loggerService.withContext('SelectionService')
+
 const isSupportedOS = isWin || isMac

 let SelectionHook: SelectionHookConstructor | null = null
@@ -25,7 +27,7 @@ try {
 SelectionHook = require('selection-hook')
 }
 } catch (error) {
-Logger.error('Failed to load selection-hook:', error)
+logger.error('Failed to load selection-hook:', error)
 }

 // Type definitions
@@ -1504,12 +1506,12 @@ export class SelectionService {

 private logInfo(message: string, forceShow: boolean = false): void {
 if (isDev || forceShow) {
-Logger.info('[SelectionService] Info: ', message)
+logger.info(message)
 }
 }

 private logError(...args: [...string[], Error]): void {
-Logger.error('[SelectionService] Error: ', ...args)
+logger.error('[SelectionService] Error: ', ...args)
 }
 }

@@ -1525,7 +1527,7 @@ export function initSelectionService(): boolean {
 //avoid closure
 const ss = SelectionService.getInstance()
 if (!ss) {
-Logger.error('SelectionService not initialized: instance is null')
+logger.error('SelectionService not initialized: instance is null')
 return
 }

@@ -1540,7 +1542,7 @@ export function initSelectionService(): boolean {

 const ss = SelectionService.getInstance()
 if (!ss) {
-Logger.error('SelectionService not initialized: instance is null')
+logger.error('SelectionService not initialized: instance is null')
 return false
 }

@@ -1,12 +1,14 @@
+import { loggerService } from '@logger'
 import { handleZoomFactor } from '@main/utils/zoom'
 import { Shortcut } from '@types'
 import { BrowserWindow, globalShortcut } from 'electron'
-import Logger from 'electron-log'

 import { configManager } from './ConfigManager'
 import selectionService from './SelectionService'
 import { windowService } from './WindowService'

+const logger = loggerService.withContext('ShortcutService')
+
 let showAppAccelerator: string | null = null
 let showMiniWindowAccelerator: string | null = null
 let selectionAssistantToggleAccelerator: string | null = null
@@ -222,7 +224,7 @@ export function registerShortcuts(window: BrowserWindow) {

 globalShortcut.register(accelerator, () => handler(window))
 } catch (error) {
-Logger.error(`[ShortcutService] Failed to register shortcut ${shortcut.key}`)
+logger.warn(`Failed to register shortcut ${shortcut.key}`)
 }
 })
 }
@@ -257,7 +259,7 @@ export function registerShortcuts(window: BrowserWindow) {
 handler && globalShortcut.register(accelerator, () => handler(window))
 }
 } catch (error) {
-Logger.error('[ShortcutService] Failed to unregister shortcuts')
+logger.warn('Failed to unregister shortcuts')
 }
 }

@@ -290,6 +292,6 @@ export function unregisterAllShortcuts() {
 windowOnHandlers.clear()
 globalShortcut.unregisterAll()
 } catch (error) {
-Logger.error('[ShortcutService] Failed to unregister all shortcuts')
+logger.warn('Failed to unregister all shortcuts')
 }
 }

@@ -1,5 +1,5 @@
+import { loggerService } from '@logger'
 import { WebDavConfig } from '@types'
-import Logger from 'electron-log'
 import https from 'https'
 import path from 'path'
 import Stream from 'stream'
@@ -11,6 +11,9 @@ import {
 PutFileContentsOptions,
 WebDAVClient
 } from 'webdav'
+
+const logger = loggerService.withContext('WebDav')
+
 export default class WebDav {
 public instance: WebDAVClient | undefined
 private webdavPath: string
@@ -50,7 +53,7 @@ export default class WebDav {
 })
 }
 } catch (error) {
-Logger.error('[WebDAV] Error creating directory on WebDAV:', error)
+logger.error('Error creating directory on WebDAV:', error)
 throw error
 }

@@ -59,7 +62,7 @@ export default class WebDav {
 try {
 return await this.instance.putFileContents(remoteFilePath, data, options)
 } catch (error) {
-Logger.error('[WebDAV] Error putting file contents on WebDAV:', error)
+logger.error('Error putting file contents on WebDAV:', error)
 throw error
 }
 }
@@ -74,7 +77,7 @@ export default class WebDav {
 try {
 return await this.instance.getFileContents(remoteFilePath, options)
 } catch (error) {
-Logger.error('[WebDAV] Error getting file contents on WebDAV:', error)
+logger.error('Error getting file contents on WebDAV:', error)
 throw error
 }
 }
@@ -87,7 +90,7 @@ export default class WebDav {
 try {
 return await this.instance.getDirectoryContents(this.webdavPath)
 } catch (error) {
-Logger.error('[WebDAV] Error getting directory contents on WebDAV:', error)
+logger.error('Error getting directory contents on WebDAV:', error)
 throw error
 }
 }
@@ -100,7 +103,7 @@ export default class WebDav {
 try {
 return await this.instance.exists('/')
 } catch (error) {
-Logger.error('[WebDAV] Error checking connection:', error)
+logger.error('Error checking connection:', error)
 throw error
 }
 }
@@ -113,7 +116,7 @@ export default class WebDav {
 try {
 return await this.instance.createDirectory(path, options)
 } catch (error) {
-Logger.error('[WebDAV] Error creating directory on WebDAV:', error)
+logger.error('Error creating directory on WebDAV:', error)
 throw error
 }
 }
@@ -128,7 +131,7 @@ export default class WebDav {
 try {
 return await this.instance.deleteFile(remoteFilePath)
 } catch (error) {
-Logger.error('[WebDAV] Error deleting file on WebDAV:', error)
+logger.error('Error deleting file on WebDAV:', error)
 throw error
 }
 }

@@ -2,11 +2,11 @@
 import './ThemeService'

 import { is } from '@electron-toolkit/utils'
+import { loggerService } from '@logger'
 import { isDev, isLinux, isMac, isWin } from '@main/constant'
 import { getFilesDir } from '@main/utils/file'
 import { IpcChannel } from '@shared/IpcChannel'
 import { app, BrowserWindow, nativeTheme, screen, shell } from 'electron'
-import Logger from 'electron-log'
 import windowStateKeeper from 'electron-window-state'
 import { join } from 'path'

@@ -19,6 +19,9 @@ import { initSessionUserAgent } from './WebviewService'
 const DEFAULT_MINIWINDOW_WIDTH = 550
 const DEFAULT_MINIWINDOW_HEIGHT = 400

+// const logger = loggerService.withContext('WindowService')
+const logger = loggerService.withContext('WindowService')
+
 export class WindowService {
 private static instance: WindowService | null = null
 private mainWindow: BrowserWindow | null = null
@@ -118,14 +121,14 @@ export class WindowService {
 const spellCheckLanguages = configManager.get('spellCheckLanguages', []) as string[]
 spellCheckLanguages.length > 0 && mainWindow.webContents.session.setSpellCheckerLanguages(spellCheckLanguages)
 } catch (error) {
-Logger.error('Failed to set spell check languages:', error as Error)
+logger.error('Failed to set spell check languages:', error as Error)
 }
 }
 }

 private setupMainWindowMonitor(mainWindow: BrowserWindow) {
 mainWindow.webContents.on('render-process-gone', (_, details) => {
-Logger.error(`Renderer process crashed with: ${JSON.stringify(details)}`)
+logger.error(`Renderer process crashed with: ${JSON.stringify(details)}`)
 const currentTime = Date.now()
 const lastCrashTime = this.lastRendererProcessCrashTime
 this.lastRendererProcessCrashTime = currentTime
@@ -272,7 +275,7 @@ export class WindowService {
 const fileName = url.replace('http://file/', '')
 const storageDir = getFilesDir()
 const filePath = storageDir + '/' + fileName
-shell.openPath(filePath).catch((err) => Logger.error('Failed to open file:', err))
+shell.openPath(filePath).catch((err) => logger.error('Failed to open file:', err))
 } else {
 shell.openExternal(details.url)
 }
@@ -625,7 +628,7 @@ export class WindowService {
 }, 100)
 }
 } catch (error) {
-Logger.error('Failed to quote to main window:', error as Error)
+logger.error('Failed to quote to main window:', error as Error)
 }
 }
 }

@@ -1,10 +1,12 @@
-import Logger from 'electron-log'
+import { loggerService } from '@logger'
 import EventEmitter from 'events'
 import http from 'http'
 import { URL } from 'url'

 import { OAuthCallbackServerOptions } from './types'

+const logger = loggerService.withContext('MCP:OAuthCallbackServer')
+
 export class CallBackServer {
 private server: Promise<http.Server>
 private events: EventEmitter
@@ -28,7 +30,7 @@ export class CallBackServer {
 this.events.emit('auth-code-received', code)
 }
 } catch (error) {
-Logger.error('Error processing OAuth callback:', error)
+logger.error('Error processing OAuth callback:', error)
 res.writeHead(500, { 'Content-Type': 'text/plain' })
 res.end('Internal Server Error')
 }
@@ -41,12 +43,12 @@ export class CallBackServer {

 // Handle server errors
 server.on('error', (error) => {
-Logger.error('OAuth callback server error:', error)
+logger.error('OAuth callback server error:', error)
 })

 return new Promise<http.Server>((resolve, reject) => {
 server.listen(port, () => {
-Logger.info(`OAuth callback server listening on port ${port}`)
+logger.info(`OAuth callback server listening on port ${port}`)
 resolve(server)
 })

@@ -1,14 +1,16 @@
 import path from 'node:path'

+import { loggerService } from '@logger'
 import { getConfigDir } from '@main/utils/file'
 import { OAuthClientProvider } from '@modelcontextprotocol/sdk/client/auth'
 import { OAuthClientInformation, OAuthClientInformationFull, OAuthTokens } from '@modelcontextprotocol/sdk/shared/auth'
-import Logger from 'electron-log'
 import open from 'open'

 import { JsonFileStorage } from './storage'
 import { OAuthProviderOptions } from './types'

+const logger = loggerService.withContext('MCP:OAuthClientProvider')
+
 export class McpOAuthClientProvider implements OAuthClientProvider {
 private storage: JsonFileStorage
 public readonly config: Required<OAuthProviderOptions>
@@ -61,9 +63,9 @@ export class McpOAuthClientProvider implements OAuthClientProvider {
 try {
 // Open the browser to the authorization URL
 await open(authorizationUrl.toString())
-Logger.info('Browser opened automatically.')
+logger.debug('Browser opened automatically.')
 } catch (error) {
-Logger.error('Could not open browser automatically.')
+logger.error('Could not open browser automatically.')
 throw error // Let caller handle the error
 }
 }

@@ -1,14 +1,16 @@
+import { loggerService } from '@logger'
 import {
   OAuthClientInformation,
   OAuthClientInformationFull,
   OAuthTokens
 } from '@modelcontextprotocol/sdk/shared/auth.js'
-import Logger from 'electron-log'
 import fs from 'fs/promises'
 import path from 'path'

 import { IOAuthStorage, OAuthStorageData, OAuthStorageSchema } from './types'

+const logger = loggerService.withContext('MCP:OAuthStorage')
+
 export class JsonFileStorage implements IOAuthStorage {
   private readonly filePath: string
   private cache: OAuthStorageData | null = null
@@ -38,7 +40,7 @@ export class JsonFileStorage implements IOAuthStorage {
         await this.writeStorage(initial)
         return initial
       }
-      Logger.error('Error reading OAuth storage:', error)
+      logger.error('Error reading OAuth storage:', error)
       throw new Error(`Failed to read OAuth storage: ${error instanceof Error ? error.message : String(error)}`)
     }
   }
@@ -59,7 +61,7 @@ export class JsonFileStorage implements IOAuthStorage {
       // Update cache
       this.cache = data
     } catch (error) {
-      Logger.error('Error writing OAuth storage:', error)
+      logger.error('Error writing OAuth storage:', error)
       throw new Error(`Failed to write OAuth storage: ${error instanceof Error ? error.message : String(error)}`)
     }
   }
@@ -112,7 +114,7 @@ export class JsonFileStorage implements IOAuthStorage {
       this.cache = null
     } catch (error) {
       if (error instanceof Error && 'code' in error && error.code !== 'ENOENT') {
-        Logger.error('Error clearing OAuth storage:', error)
+        logger.error('Error clearing OAuth storage:', error)
         throw new Error(`Failed to clear OAuth storage: ${error instanceof Error ? error.message : String(error)}`)
       }
     }

@@ -1,7 +1,9 @@
+import { loggerService } from '@logger'
 import { spawn } from 'child_process'
-import Logger from 'electron-log'
 import os from 'os'

+const logger = loggerService.withContext('ShellEnv')
+
 /**
  * Spawns a login shell in the user's home directory to capture its environment variables.
  * @returns {Promise<Object>} A promise that resolves with an object containing
@@ -35,7 +37,7 @@ function getLoginShellEnvironment(): Promise<Record<string, string>> {
     // Defaulting to bash, but this might not be the user's actual login shell.
     // A more robust solution might involve checking /etc/passwd or similar,
     // but that's more complex and often requires higher privileges or native modules.
-    Logger.warn("process.env.SHELL is not set. Defaulting to /bin/bash. This might not be the user's login shell.")
+    logger.warn("process.env.SHELL is not set. Defaulting to /bin/bash. This might not be the user's login shell.")
     shellPath = '/bin/bash' // A common default
   }
   // -l: Make it a login shell. This sources profile files like .profile, .bash_profile, .zprofile etc.
@@ -47,7 +49,7 @@ function getLoginShellEnvironment(): Promise<Record<string, string>> {
     commandArgs = ['-ilc', shellCommandToGetEnv] // -i for interactive, -l for login, -c to execute command
   }

-  Logger.log(`[ShellEnv] Spawning shell: ${shellPath} with args: ${commandArgs.join(' ')} in ${homeDirectory}`)
+  logger.debug(`Spawning shell: ${shellPath} with args: ${commandArgs.join(' ')} in ${homeDirectory}`)

   const child = spawn(shellPath, commandArgs, {
     cwd: homeDirectory, // Run the command in the user's home directory
@@ -68,21 +70,21 @@ function getLoginShellEnvironment(): Promise<Record<string, string>> {
   })

   child.on('error', (error) => {
-    Logger.error(`Failed to start shell process: ${shellPath}`, error)
+    logger.error(`Failed to start shell process: ${shellPath}`, error)
     reject(new Error(`Failed to start shell: ${error.message}`))
   })

   child.on('close', (code) => {
     if (code !== 0) {
       const errorMessage = `Shell process exited with code ${code}. Shell: ${shellPath}. Args: ${commandArgs.join(' ')}. CWD: ${homeDirectory}. Stderr: ${errorOutput.trim()}`
-      Logger.error(errorMessage)
+      logger.error(errorMessage)
       return reject(new Error(errorMessage))
     }

     if (errorOutput.trim()) {
       // Some shells might output warnings or non-fatal errors to stderr
       // during profile loading. Log it, but proceed if exit code is 0.
-      Logger.warn(`Shell process stderr output (even with exit code 0):\n${errorOutput.trim()}`)
+      logger.warn(`Shell process stderr output (even with exit code 0):\n${errorOutput.trim()}`)
     }

     const env: Record<string, string> = {}
@@ -104,10 +106,10 @@ function getLoginShellEnvironment(): Promise<Record<string, string>> {
   if (Object.keys(env).length === 0 && output.length < 100) {
     // Arbitrary small length check
     // This might indicate an issue if no env vars were parsed or output was minimal
-    Logger.warn(
+    logger.warn(
       'Parsed environment is empty or output was very short. This might indicate an issue with shell execution or environment variable retrieval.'
     )
-    Logger.warn('Raw output from shell:\n', output)
+    logger.warn('Raw output from shell:\n', output)
   }

   env.PATH = env.Path || env.PATH || ''

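The hunks above only show how the captured output is logged; the capture command itself (`shellCommandToGetEnv`) sits outside this diff. Assuming a null-delimited `env -0`-style dump, the parsing step performed by the surrounding code might look roughly like this sketch (the function name and sample values are hypothetical, not from the PR):

```typescript
// Sketch: turn a null-delimited `env -0` dump into a Record<string, string>.
// Null delimiters are preferred because values may legally contain newlines.
function parseNullDelimitedEnv(output: string): Record<string, string> {
  const env: Record<string, string> = {}
  for (const entry of output.split('\0')) {
    if (!entry) continue // skip the trailing empty chunk after the last '\0'
    const eq = entry.indexOf('=')
    if (eq <= 0) continue // ignore malformed entries with no name part
    // Only split on the first '=': values may themselves contain '='.
    env[entry.slice(0, eq)] = entry.slice(eq + 1)
  }
  return env
}

const sample = 'PATH=/usr/bin:/bin\0SHELL=/bin/zsh\0MULTI=a=b\0'
const parsed = parseNullDelimitedEnv(sample)
```

Splitting only on the first `=` is the detail that matters here; a naive `split('=')` would corrupt values like `MULTI=a=b`.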
@@ -1,4 +1,5 @@
 import { Client, createClient } from '@libsql/client'
+import { loggerService } from '@logger'
 import Embeddings from '@main/knowledge/embeddings/Embeddings'
 import type {
   AddMemoryOptions,
@@ -11,11 +12,12 @@ import type {
 } from '@types'
 import crypto from 'crypto'
 import { app } from 'electron'
-import Logger from 'electron-log'
 import path from 'path'

 import { MemoryQueries } from './queries'

+const logger = loggerService.withContext('MemoryService')
+
 export interface EmbeddingOptions {
   model: string
   provider: string
@@ -88,9 +90,9 @@ export class MemoryService {
       // Create tables
       await this.createTables()
       this.isInitialized = true
-      Logger.info('Memory database initialized successfully')
+      logger.debug('Memory database initialized successfully')
     } catch (error) {
-      Logger.error('Failed to initialize memory database:', error)
+      logger.error('Failed to initialize memory database:', error)
       throw new Error(
         `Memory database initialization failed: ${error instanceof Error ? error.message : 'Unknown error'}`
       )
@@ -118,7 +120,7 @@ export class MemoryService {
       await this.db.execute(MemoryQueries.createIndexes.vector)
     } catch (error) {
       // Vector index might not be supported in all versions
-      Logger.warn('Failed to create vector index, falling back to non-indexed search:', error)
+      logger.warn('Failed to create vector index, falling back to non-indexed search:', error)
     }
   }
@@ -157,11 +159,11 @@ export class MemoryService {

         if (!isDeleted) {
           // Active record exists, skip insertion
-          Logger.info(`Memory already exists with hash: ${hash}`)
+          logger.debug(`Memory already exists with hash: ${hash}`)
           continue
         } else {
           // Deleted record exists, restore it instead of inserting new one
-          Logger.info(`Restoring deleted memory with hash: ${hash}`)
+          logger.debug(`Restoring deleted memory with hash: ${hash}`)

           // Generate embedding if model is configured
           let embedding: number[] | null = null
@@ -169,11 +171,11 @@ export class MemoryService {
           if (embedderApiClient) {
             try {
               embedding = await this.generateEmbedding(trimmedMemory)
-              Logger.info(
+              logger.debug(
                 `Generated embedding for restored memory with dimension: ${embedding.length} (target: ${this.config?.embedderDimensions || MemoryService.UNIFIED_DIMENSION})`
               )
             } catch (error) {
-              Logger.error('Failed to generate embedding for restored memory:', error)
+              logger.error('Failed to generate embedding for restored memory:', error)
             }
           }
@@ -211,7 +213,7 @@ export class MemoryService {
       if (this.config?.embedderApiClient) {
         try {
           embedding = await this.generateEmbedding(trimmedMemory)
-          Logger.info(
+          logger.debug(
             `Generated embedding with dimension: ${embedding.length} (target: ${this.config?.embedderDimensions || MemoryService.UNIFIED_DIMENSION})`
           )

@@ -227,15 +229,15 @@ export class MemoryService {
           if (similarMemories.memories.length > 0) {
             const highestSimilarity = Math.max(...similarMemories.memories.map((m) => m.score || 0))
             if (highestSimilarity >= MemoryService.SIMILARITY_THRESHOLD) {
-              Logger.info(
+              logger.debug(
                 `Skipping memory addition due to high similarity: ${highestSimilarity.toFixed(3)} >= ${MemoryService.SIMILARITY_THRESHOLD}`
               )
-              Logger.info(`Similar memory found: "${similarMemories.memories[0].memory}"`)
+              logger.debug(`Similar memory found: "${similarMemories.memories[0].memory}"`)
               continue
             }
           }
         } catch (error) {
-          Logger.error('Failed to generate embedding:', error)
+          logger.error('Failed to generate embedding:', error)
         }
       }
@@ -277,7 +279,7 @@ export class MemoryService {
         count: addedMemories.length
       }
     } catch (error) {
-      Logger.error('Failed to add memories:', error)
+      logger.error('Failed to add memories:', error)
       return {
         memories: [],
         count: 0,
@@ -302,7 +304,7 @@ export class MemoryService {
         const queryEmbedding = await this.generateEmbedding(query)
         return await this.hybridSearch(query, queryEmbedding, { limit, userId, agentId, filters })
       } catch (error) {
-        Logger.error('Vector search failed, falling back to text search:', error)
+        logger.error('Vector search failed, falling back to text search:', error)
       }
     }
@@ -357,7 +359,7 @@ export class MemoryService {
         count: memories.length
       }
     } catch (error) {
-      Logger.error('Search failed:', error)
+      logger.error('Search failed:', error)
       return {
         memories: [],
         count: 0,
@@ -422,7 +424,7 @@ export class MemoryService {
         count: totalCount
       }
     } catch (error) {
-      Logger.error('List failed:', error)
+      logger.error('List failed:', error)
       return {
         memories: [],
         count: 0,
@@ -460,9 +462,9 @@ export class MemoryService {
       // Add to history
       await this.addHistory(id, currentMemory, null, 'DELETE')

-      Logger.info(`Memory deleted: ${id}`)
+      logger.debug(`Memory deleted: ${id}`)
     } catch (error) {
-      Logger.error('Delete failed:', error)
+      logger.error('Delete failed:', error)
       throw new Error(`Failed to delete memory: ${error instanceof Error ? error.message : 'Unknown error'}`)
     }
   }
@@ -497,11 +499,11 @@ export class MemoryService {
     if (this.config?.embedderApiClient) {
       try {
         embedding = await this.generateEmbedding(memory)
-        Logger.info(
+        logger.debug(
           `Updated embedding with dimension: ${embedding.length} (target: ${this.config?.embedderDimensions || MemoryService.UNIFIED_DIMENSION})`
         )
       } catch (error) {
-        Logger.error('Failed to generate embedding for update:', error)
+        logger.error('Failed to generate embedding for update:', error)
       }
     }
@@ -524,9 +526,9 @@ export class MemoryService {
       // Add to history
       await this.addHistory(id, previousMemory, memory, 'UPDATE')

-      Logger.info(`Memory updated: ${id}`)
+      logger.debug(`Memory updated: ${id}`)
     } catch (error) {
-      Logger.error('Update failed:', error)
+      logger.error('Update failed:', error)
       throw new Error(`Failed to update memory: ${error instanceof Error ? error.message : 'Unknown error'}`)
     }
   }
@@ -555,7 +557,7 @@ export class MemoryService {
       isDeleted: row.is_deleted === 1
     }))
   } catch (error) {
-    Logger.error('Get history failed:', error)
+    logger.error('Get history failed:', error)
     throw new Error(`Failed to get memory history: ${error instanceof Error ? error.message : 'Unknown error'}`)
   }
 }
@@ -591,9 +593,9 @@ export class MemoryService {
       args: [userId]
     })

-    Logger.info(`Reset all memories for user ${userId} (${totalCount} memories deleted)`)
+    logger.debug(`Reset all memories for user ${userId} (${totalCount} memories deleted)`)
   } catch (error) {
-    Logger.error('Reset user memories failed:', error)
+    logger.error('Reset user memories failed:', error)
     throw new Error(`Failed to reset user memories: ${error instanceof Error ? error.message : 'Unknown error'}`)
   }
 }
@@ -633,9 +635,9 @@ export class MemoryService {
       args: [userId]
     })

-    Logger.info(`Deleted user ${userId} and ${totalCount} memories`)
+    logger.debug(`Deleted user ${userId} and ${totalCount} memories`)
   } catch (error) {
-    Logger.error('Delete user failed:', error)
+    logger.error('Delete user failed:', error)
     throw new Error(`Failed to delete user: ${error instanceof Error ? error.message : 'Unknown error'}`)
   }
 }
@@ -659,7 +661,7 @@ export class MemoryService {
     lastMemoryDate: row.last_memory_date as string
   }))
 } catch (error) {
-  Logger.error('Get users list failed:', error)
+  logger.error('Get users list failed:', error)
   throw new Error(`Failed to get users list: ${error instanceof Error ? error.message : 'Unknown error'}`)
 }
@@ -730,7 +732,7 @@ export class MemoryService {
     // Normalize to unified dimension
     return this.normalizeEmbedding(embedding)
   } catch (error) {
-    Logger.error('Embedding generation failed:', error)
+    logger.error('Embedding generation failed:', error)
     throw new Error(`Failed to generate embedding: ${error instanceof Error ? error.message : 'Unknown error'}`)
   }
 }
@@ -800,7 +802,7 @@ export class MemoryService {
     count: memories.length
   }
 } catch (error) {
-  Logger.error('Hybrid search failed:', error)
+  logger.error('Hybrid search failed:', error)
   throw new Error(`Hybrid search failed: ${error instanceof Error ? error.message : 'Unknown error'}`)
 }

@@ -1,11 +1,13 @@
 import { File, Files, FileState, GoogleGenAI } from '@google/genai'
+import { loggerService } from '@logger'
 import { FileListResponse, FileMetadata, FileUploadResponse, Provider } from '@types'
-import Logger from 'electron-log'
 import { v4 as uuidv4 } from 'uuid'

 import { CacheService } from '../CacheService'
 import { BaseFileService } from './BaseFileService'

+const logger = loggerService.withContext('GeminiService')
+
 export class GeminiService extends BaseFileService {
   private static readonly FILE_LIST_CACHE_KEY = 'gemini_file_list'
   private static readonly FILE_CACHE_DURATION = 48 * 60 * 60 * 1000
@@ -69,7 +71,7 @@ export class GeminiService extends BaseFileService {

       return response
     } catch (error) {
-      Logger.error('Error uploading file to Gemini:', error)
+      logger.error('Error uploading file to Gemini:', error)
       return {
         fileId: '',
         displayName: file.origin_name,
@@ -82,7 +84,7 @@ export class GeminiService extends BaseFileService {
   async retrieveFile(fileId: string): Promise<FileUploadResponse> {
     try {
       const cachedResponse = CacheService.get<FileUploadResponse>(`${GeminiService.FILE_LIST_CACHE_KEY}_${fileId}`)
-      Logger.info('[GeminiService] cachedResponse', cachedResponse)
+      logger.debug('[GeminiService] cachedResponse', cachedResponse)
       if (cachedResponse) {
         return cachedResponse
       }
@@ -91,11 +93,11 @@ export class GeminiService extends BaseFileService {
       for await (const f of await this.fileManager.list()) {
         files.push(f)
       }
-      Logger.info('[GeminiService] files', files)
+      logger.debug('files', files)
       const file = files
         .filter((file) => file.state === FileState.ACTIVE)
         .find((file) => file.name?.substring(6) === fileId) // strip the "files/" prefix
-      Logger.info('[GeminiService] file', file)
+      logger.debug('file', file)
       if (file) {
         return {
           fileId: fileId,
@@ -115,7 +117,7 @@ export class GeminiService extends BaseFileService {
         originalFile: undefined
       }
     } catch (error) {
-      Logger.error('Error retrieving file from Gemini:', error)
+      logger.error('Error retrieving file from Gemini:', error)
       return {
         fileId: fileId,
         displayName: '',
@@ -173,7 +175,7 @@ export class GeminiService extends BaseFileService {
       CacheService.set(GeminiService.FILE_LIST_CACHE_KEY, fileList, GeminiService.LIST_CACHE_DURATION)
       return fileList
     } catch (error) {
-      Logger.error('Error listing files from Gemini:', error)
+      logger.error('Error listing files from Gemini:', error)
       return { files: [] }
     }
   }
@@ -181,9 +183,9 @@ export class GeminiService extends BaseFileService {
   async deleteFile(fileId: string): Promise<void> {
     try {
       await this.fileManager.delete({ name: fileId })
-      Logger.info(`File ${fileId} deleted from Gemini`)
+      logger.debug(`File ${fileId} deleted from Gemini`)
     } catch (error) {
-      Logger.error('Error deleting file from Gemini:', error)
+      logger.error('Error deleting file from Gemini:', error)
       throw error
     }
   }

@@ -1,12 +1,14 @@
 import fs from 'node:fs/promises'

+import { loggerService } from '@logger'
 import { Mistral } from '@mistralai/mistralai'
 import { FileListResponse, FileMetadata, FileUploadResponse, Provider } from '@types'
-import Logger from 'electron-log'

 import { MistralClientManager } from '../MistralClientManager'
 import { BaseFileService } from './BaseFileService'

+const logger = loggerService.withContext('MistralService')
+
 export class MistralService extends BaseFileService {
   private readonly client: Mistral

@@ -38,7 +40,7 @@ export class MistralService extends BaseFileService {
         }
       }
     } catch (error) {
-      Logger.error('Error uploading file:', error)
+      logger.error('Error uploading file:', error)
       return {
         fileId: '',
         displayName: file.origin_name,
@@ -63,7 +65,7 @@ export class MistralService extends BaseFileService {
         }))
       }
     } catch (error) {
-      Logger.error('Error listing files:', error)
+      logger.error('Error listing files:', error)
       return { files: [] }
     }
   }
@@ -73,9 +75,9 @@ export class MistralService extends BaseFileService {
       await this.client.files.delete({
         fileId
       })
-      Logger.info(`File ${fileId} deleted`)
+      logger.debug(`File ${fileId} deleted`)
     } catch (error) {
-      Logger.error('Error deleting file:', error)
+      logger.error('Error deleting file:', error)
       throw error
     }
   }
@@ -92,7 +94,7 @@ export class MistralService extends BaseFileService {
         status: 'success' // Retrieved files are always processed
       }
     } catch (error) {
-      Logger.error('Error retrieving file:', error)
+      logger.error('Error retrieving file:', error)
       return {
         fileId: fileId,
         displayName: '',

@@ -1,7 +1,8 @@
+import { loggerService } from '@logger'
 import { isMac } from '@main/constant'
-import Logger from 'electron-log'

 import { windowService } from '../WindowService'
+const logger = loggerService.withContext('URLSchema:handleProvidersProtocolUrl')

 function ParseData(data: string) {
   try {
@@ -9,7 +10,7 @@ function ParseData(data: string) {

     return JSON.stringify(result)
   } catch (error) {
-    Logger.error('ParseData error:', { error })
+    logger.error('ParseData error:', error)
     return null
   }
 }
@@ -33,7 +34,7 @@ export async function handleProvidersProtocolUrl(url: URL) {
   const data = ParseData(params.get('data')?.replaceAll('_', '+').replaceAll('-', '/') || '')

   if (!data) {
-    Logger.error('handleProvidersProtocolUrl data is null or invalid')
+    logger.error('handleProvidersProtocolUrl data is null or invalid')
     return
   }
@@ -41,7 +42,7 @@ export async function handleProvidersProtocolUrl(url: URL) {
   const version = params.get('v')
   if (version == '1') {
     // TODO: handle different version
-    Logger.info('handleProvidersProtocolUrl', { data, version })
+    logger.debug('handleProvidersProtocolUrl', { data, version })
   }

   // add check there is window.navigate function in mainWindow
@@ -59,14 +60,14 @@ export async function handleProvidersProtocolUrl(url: URL) {
         }
       } else {
         setTimeout(() => {
-          Logger.info('handleProvidersProtocolUrl timeout', { data, version })
+          logger.debug('handleProvidersProtocolUrl timeout', { data, version })
           handleProvidersProtocolUrl(url)
         }, 1000)
       }
       break
     }
     default:
-      Logger.error(`Unknown MCP protocol URL: ${url}`)
+      logger.error(`Unknown MCP protocol URL: ${url}`)
       break
   }
 }

@@ -1,10 +1,12 @@
+import { loggerService } from '@logger'
 import { nanoid } from '@reduxjs/toolkit'
 import { IpcChannel } from '@shared/IpcChannel'
 import { MCPServer } from '@types'
-import Logger from 'electron-log'

 import { windowService } from '../WindowService'

+const logger = loggerService.withContext('URLSchema:handleMcpProtocolUrl')
+
 function installMCPServer(server: MCPServer) {
   const mainWindow = windowService.getMainWindow()

@@ -49,9 +51,9 @@ export function handleMcpProtocolUrl(url: URL) {

       if (data) {
         const stringify = Buffer.from(data, 'base64').toString('utf8')
-        Logger.info('install MCP servers from urlschema: ', stringify)
+        logger.debug('install MCP servers from urlschema: ', stringify)
         const jsonConfig = JSON.parse(stringify)
-        Logger.info('install MCP servers from urlschema: ', jsonConfig)
+        logger.debug('install MCP servers from urlschema: ', jsonConfig)

         // support both {mcpServers: [servers]}, [servers] and {server}
         if (jsonConfig.mcpServers) {
@@ -70,7 +72,7 @@ export function handleMcpProtocolUrl(url: URL) {
       break
     }
     default:
-      console.error(`Unknown MCP protocol URL: ${url}`)
+      logger.error(`Unknown MCP protocol URL: ${url}`)
       break
   }
 }

@@ -3,15 +3,17 @@ import { open, readFile } from 'node:fs/promises'
 import os from 'node:os'
 import path from 'node:path'

+import { loggerService } from '@logger'
 import { isLinux, isPortable } from '@main/constant'
 import { audioExts, documentExts, imageExts, MB, textExts, videoExts } from '@shared/config/constant'
 import { FileMetadata, FileTypes } from '@types'
 import { app } from 'electron'
-import Logger from 'electron-log'
 import iconv from 'iconv-lite'
 import * as jschardet from 'jschardet'
 import { v4 as uuidv4 } from 'uuid'

+const logger = loggerService.withContext('Utils:File')
+
 export function initAppDataDir() {
   const appDataPath = getAppDataPathFromConfig()
   if (appDataPath) {
@@ -234,7 +236,7 @@ export async function readTextFileWithAutoEncoding(filePath: string): Promise<st
     .slice(0, 2)

   if (encodings.length === 0) {
-    Logger.error('Failed to detect encoding. Use utf-8 to decode.')
+    logger.error('Failed to detect encoding. Use utf-8 to decode.')
     const data = await readFile(filePath)
     return iconv.decode(data, 'UTF-8')
   }
@@ -245,7 +247,7 @@ export async function readTextFileWithAutoEncoding(filePath: string): Promise<st
     const encoding = item.encoding
     const content = iconv.decode(data, encoding)
     if (content.includes('\uFFFD')) {
-      Logger.error(
+      logger.error(
         `File ${filePath} was auto-detected as ${encoding} encoding, but contains invalid characters. Trying other encodings`
       )
     } else {
@@ -253,6 +255,6 @@ export async function readTextFileWithAutoEncoding(filePath: string): Promise<st
     }
   }

-  Logger.error(`File ${filePath} failed to decode with all possible encodings, trying UTF-8 encoding`)
+  logger.error(`File ${filePath} failed to decode with all possible encodings, trying UTF-8 encoding`)
   return iconv.decode(data, 'UTF-8')
 }

@@ -1,34 +1,36 @@
+import { loggerService } from '@logger'
 import { spawn } from 'child_process'
-import log from 'electron-log'
 import fs from 'fs'
 import os from 'os'
 import path from 'path'

 import { getResourcePath } from '.'

+const logger = loggerService.withContext('Utils:Process')
+
 export function runInstallScript(scriptPath: string): Promise<void> {
   return new Promise<void>((resolve, reject) => {
     const installScriptPath = path.join(getResourcePath(), 'scripts', scriptPath)
-    log.info(`Running script at: ${installScriptPath}`)
+    logger.info(`Running script at: ${installScriptPath}`)

     const nodeProcess = spawn(process.execPath, [installScriptPath], {
       env: { ...process.env, ELECTRON_RUN_AS_NODE: '1' }
     })

     nodeProcess.stdout.on('data', (data) => {
-      log.info(`Script output: ${data}`)
+      logger.debug(`Script output: ${data}`)
     })

     nodeProcess.stderr.on('data', (data) => {
-      log.error(`Script error: ${data}`)
+      logger.error(`Script error: ${data}`)
     })

     nodeProcess.on('close', (code) => {
       if (code === 0) {
-        log.info('Script completed successfully')
+        logger.debug('Script completed successfully')
         resolve()
       } else {
-        log.error(`Script exited with code ${code}`)
+        logger.warn(`Script exited with code ${code}`)
         reject(new Error(`Process exited with code ${code}`))
       }
     })

@@ -1,7 +1,9 @@
 import util from 'node:util'
 import zlib from 'node:zlib'

-import logger from 'electron-log'
+import { loggerService } from '@logger'

+const logger = loggerService.withContext('Utils:Zip')
+
 // Convert zlib's gzip and gunzip methods to Promise versions
 const gzipPromise = util.promisify(zlib.gzip)

@@ -1,6 +1,7 @@
 import type { ExtractChunkData } from '@cherrystudio/embedjs-interfaces'
 import { electronAPI } from '@electron-toolkit/preload'
 import { UpgradeChannel } from '@shared/config/constant'
+import type { LogLevel, LogSourceWithContext } from '@shared/config/types'
 import { IpcChannel } from '@shared/IpcChannel'
 import {
   AddMemoryOptions,
@@ -59,6 +60,8 @@ const api = {
   openWebsite: (url: string) => ipcRenderer.invoke(IpcChannel.Open_Website, url),
   getCacheSize: () => ipcRenderer.invoke(IpcChannel.App_GetCacheSize),
   clearCache: () => ipcRenderer.invoke(IpcChannel.App_ClearCache),
+  logToMain: (source: LogSourceWithContext, level: LogLevel, message: string, data: any[]) =>
+    ipcRenderer.invoke(IpcChannel.App_LogToMain, source, level, message, data),
   mac: {
     isProcessTrusted: (): Promise<boolean> => ipcRenderer.invoke(IpcChannel.App_MacIsProcessTrusted),
     requestProcessTrust: (): Promise<boolean> => ipcRenderer.invoke(IpcChannel.App_MacRequestProcessTrust)

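The `logToMain` bridge above is the piece that lets renderer-side loggers reach the main process over IPC. A minimal sketch of a renderer transport built on such a bridge follows; only the `logToMain` signature comes from the diff, and the `window.api` wiring is replaced by a local stand-in so the sketch runs on its own:

```typescript
// Hypothetical renderer-side transport forwarding records over an IPC bridge
// shaped like the preload's logToMain(source, level, message, data).
type LogLevel = 'error' | 'warn' | 'info' | 'debug'
interface LogSourceWithContext {
  process: 'renderer'
  context?: string
}
interface LogRecord {
  source: LogSourceWithContext
  level: LogLevel
  message: string
  data: any[]
}

// Stand-in for window.api.logToMain: collect records instead of invoking IPC.
const forwarded: LogRecord[] = []
const logToMain = (source: LogSourceWithContext, level: LogLevel, message: string, data: any[]) => {
  forwarded.push({ source, level, message, data })
}

// Each module would create one of these via something like withContext(...).
function makeRendererLogger(context: string) {
  const send = (level: LogLevel, message: string, ...data: any[]) =>
    logToMain({ process: 'renderer', context }, level, message, data)
  return {
    error: (m: string, ...d: any[]) => send('error', m, ...d),
    warn: (m: string, ...d: any[]) => send('warn', m, ...d),
    info: (m: string, ...d: any[]) => send('info', m, ...d),
    debug: (m: string, ...d: any[]) => send('debug', m, ...d)
  }
}

const logger = makeRendererLogger('App.tsx')
logger.info('renderer ready')
```

The design point is that the context is attached once, at logger creation, so every forwarded record carries its module name without call sites repeating it.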
@@ -1,5 +1,6 @@
 import '@renderer/databases'

+import { loggerService } from '@logger'
 import store, { persistor } from '@renderer/store'
 import { Provider } from 'react-redux'
 import { HashRouter, Route, Routes } from 'react-router-dom'
@@ -22,7 +23,11 @@ import PaintingsRoutePage from './pages/paintings/PaintingsRoutePage'
 import SettingsPage from './pages/settings/SettingsPage'
 import TranslatePage from './pages/translate/TranslatePage'

+const logger = loggerService.withContext('App.tsx')
+
 function App(): React.ReactElement {
+  logger.error('App initialized')
+
   return (
     <Provider store={store}>
       <StyleSheetManager>

@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import {
   isFunctionCallingModel,
   isNotSupportTemperatureAndTopP,
@@ -40,12 +41,13 @@ import { isJSON, parseJSON } from '@renderer/utils'
 import { addAbortController, removeAbortController } from '@renderer/utils/abortController'
 import { findFileBlocks, getMainTextContent } from '@renderer/utils/messageUtils/find'
 import { defaultTimeout } from '@shared/config/constant'
-import Logger from 'electron-log/renderer'
 import { isEmpty } from 'lodash'

 import { CompletionsContext } from '../middleware/types'
 import { ApiClient, RequestTransformer, ResponseChunkTransformer } from './types'

+const logger = loggerService.withContext('BaseApiClient')
+
 /**
  * Abstract base class for API clients.
  * Provides common functionality and structure for specific client implementations.
@@ -228,7 +230,7 @@ export abstract class BaseApiClient<

     const allReferences = [...webSearchReferences, ...reindexedKnowledgeReferences, ...memoryReferences]

-    Logger.log(`Found ${allReferences.length} references for ID: ${message.id}`, allReferences)
+    logger.debug(`Found ${allReferences.length} references for ID: ${message.id}`, allReferences)

     if (!isEmpty(allReferences)) {
       const referenceContent = `\`\`\`json\n${JSON.stringify(allReferences, null, 2)}\n\`\`\``
@@ -317,10 +319,10 @@ export abstract class BaseApiClient<

     if (!isEmpty(knowledgeReferences)) {
       window.keyv.remove(`knowledge-search-${message.id}`)
-      // Logger.log(`Found ${knowledgeReferences.length} knowledge base references in cache for ID: ${message.id}`)
+      logger.debug(`Found ${knowledgeReferences.length} knowledge base references in cache for ID: ${message.id}`)
       return knowledgeReferences
     }
-    // Logger.log(`No knowledge base references found in cache for ID: ${message.id}`)
+    logger.debug(`No knowledge base references found in cache for ID: ${message.id}`)
     return []
   }

@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { isSupportedModel } from '@renderer/config/models'
 import {
   GenerateImageParams,
@@ -28,6 +29,8 @@ import { OpenAIAPIClient } from './openai/OpenAIApiClient'
 import { OpenAIResponseAPIClient } from './openai/OpenAIResponseAPIClient'
 import { RequestTransformer, ResponseChunkTransformer } from './types'
 
+const logger = loggerService.withContext('NewAPIClient')
+
 export class NewAPIClient extends BaseApiClient {
   // Use a union type instead of any to preserve type safety
   private clients: Map<string, AnthropicAPIClient | GeminiAPIClient | OpenAIResponseAPIClient | OpenAIAPIClient> =
@@ -176,7 +179,7 @@ export class NewAPIClient extends BaseApiClient {
 
    return models.filter(isSupportedModel)
  } catch (error) {
-    console.error('Error listing models:', error)
+    logger.error('Error listing models:', error)
    return []
  }
 }
@@ -24,9 +24,9 @@ import {
  WebSearchToolResultError
 } from '@anthropic-ai/sdk/resources/messages'
 import { MessageStream } from '@anthropic-ai/sdk/resources/messages/messages'
+import { loggerService } from '@logger'
 import { GenericChunk } from '@renderer/aiCore/middleware/schemas'
 import { DEFAULT_MAX_TOKENS } from '@renderer/config/constant'
-import Logger from '@renderer/config/logger'
 import { findTokenLimit, isClaudeReasoningModel, isReasoningModel, isWebSearchModel } from '@renderer/config/models'
 import { getAssistantSettings } from '@renderer/services/AssistantService'
 import FileManager from '@renderer/services/FileManager'
@@ -74,6 +74,8 @@ import { buildSystemPrompt } from '@renderer/utils/prompt'
 import { BaseApiClient } from '../BaseApiClient'
 import { AnthropicStreamListener, RawStreamListener, RequestTransformer, ResponseChunkTransformer } from '../types'
 
+const logger = loggerService.withContext('AnthropicAPIClient')
+
 export class AnthropicAPIClient extends BaseApiClient<
   Anthropic,
   AnthropicSdkParams,
@@ -374,12 +376,12 @@ export class AnthropicAPIClient extends BaseApiClient<
    rawOutput: AnthropicSdkRawOutput,
    listener: RawStreamListener<AnthropicSdkRawChunk>
  ): AnthropicSdkRawOutput {
-    console.log(`[AnthropicApiClient] Attaching stream listener to raw output`)
+    logger.debug(`Attaching stream listener to raw output`)
    // Anthropic-specific event handling
    const anthropicListener = listener as AnthropicStreamListener
    // Check whether this is a MessageStream
    if (rawOutput instanceof MessageStream) {
-      console.log(`[AnthropicApiClient] Detected Anthropic MessageStream, attaching specialized listener`)
+      logger.debug(`Detected Anthropic MessageStream, attaching specialized listener`)
 
      if (listener.onStart) {
        listener.onStart()
@@ -679,13 +681,13 @@ export class AnthropicAPIClient extends BaseApiClient<
      if (toolCall) {
        try {
          toolCall.input = JSON.parse(accumulatedJson)
-          Logger.debug(`Tool call id: ${toolCall.id}, accumulated json: ${accumulatedJson}`)
+          logger.debug(`Tool call id: ${toolCall.id}, accumulated json: ${accumulatedJson}`)
          controller.enqueue({
            type: ChunkType.MCP_TOOL_CREATED,
            tool_calls: [toolCall]
          } as MCPToolCreatedChunk)
        } catch (error) {
-          Logger.error(`Error parsing tool call input: ${error}`)
+          logger.error(`Error parsing tool call input: ${error}`)
        }
      }
      break
@@ -16,6 +16,7 @@ import {
  ThinkingConfig,
  Tool
 } from '@google/genai'
+import { loggerService } from '@logger'
 import { nanoid } from '@reduxjs/toolkit'
 import { GenericChunk } from '@renderer/aiCore/middleware/schemas'
 import {
@@ -64,6 +65,8 @@ import { defaultTimeout, MB } from '@shared/config/constant'
 import { BaseApiClient } from '../BaseApiClient'
 import { RequestTransformer, ResponseChunkTransformer } from '../types'
 
+const logger = loggerService.withContext('GeminiAPIClient')
+
 export class GeminiAPIClient extends BaseApiClient<
   GoogleGenAI,
   GeminiSdkParams,
@@ -139,7 +142,7 @@ export class GeminiAPIClient extends BaseApiClient<
      // console.log(response?.generatedImages?.[0]?.image?.imageBytes);
      return images
    } catch (error) {
-      console.error('[generateImage] error:', error)
+      logger.error('[generateImage] error:', error)
      throw error
    }
  }
@@ -1,9 +1,11 @@
 import { GoogleGenAI } from '@google/genai'
+import { loggerService } from '@logger'
 import { getVertexAILocation, getVertexAIProjectId, getVertexAIServiceAccount } from '@renderer/hooks/useVertexAI'
 import { Provider } from '@renderer/types'
 
 import { GeminiAPIClient } from './GeminiAPIClient'
 
+const logger = loggerService.withContext('VertexAPIClient')
 export class VertexAPIClient extends GeminiAPIClient {
   private authHeaders?: Record<string, string>
   private authHeadersExpiry?: number
@@ -73,7 +75,7 @@ export class VertexAPIClient extends GeminiAPIClient {
 
    return this.authHeaders
  } catch (error: any) {
-    console.error('Failed to get auth headers:', error)
+    logger.error('Failed to get auth headers:', error)
    throw new Error(`Service Account authentication failed: ${error.message}`)
  }
 }
@@ -1,5 +1,5 @@
+import { loggerService } from '@logger'
 import { DEFAULT_MAX_TOKENS } from '@renderer/config/constant'
-import Logger from '@renderer/config/logger'
 import {
   findTokenLimit,
   GEMINI_FLASH_MODEL_REGEX,
@@ -58,6 +58,8 @@ import { GenericChunk } from '../../middleware/schemas'
 import { RequestTransformer, ResponseChunkTransformer, ResponseChunkTransformerContext } from '../types'
 import { OpenAIBaseClient } from './OpenAIBaseClient'
 
+const logger = loggerService.withContext('OpenAIApiClient')
+
 export class OpenAIAPIClient extends OpenAIBaseClient<
   OpenAI | AzureOpenAI,
   OpenAISdkParams,
@@ -790,7 +792,7 @@ export class OpenAIAPIClient extends OpenAIBaseClient<
 
      // Handle finish_reason and emit the end-of-stream signal
      if ('finish_reason' in choice && choice.finish_reason) {
-        Logger.debug(`[OpenAIApiClient] Stream finished with reason: ${choice.finish_reason}`)
+        logger.debug(`Stream finished with reason: ${choice.finish_reason}`)
        const webSearchData = collectWebSearchData(chunk, contentSource, context)
        if (webSearchData) {
          controller.enqueue({
@@ -808,7 +810,7 @@ export class OpenAIAPIClient extends OpenAIBaseClient<
    flush(controller) {
      if (isFinished) return
 
-      Logger.debug('[OpenAIApiClient] Stream ended without finish_reason, emitting fallback completion signals')
+      logger.debug('Stream ended without finish_reason, emitting fallback completion signals')
      emitCompletionSignals(controller)
    }
  })
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import {
   isClaudeReasoningModel,
   isNotSupportTemperatureAndTopP,
@@ -28,6 +29,8 @@ import OpenAI, { AzureOpenAI } from 'openai'
 
 import { BaseApiClient } from '../BaseApiClient'
 
+const logger = loggerService.withContext('OpenAIBaseClient')
+
 /**
  * Abstract OpenAI base client class containing the functionality shared by the two OpenAI clients
  */
@@ -125,7 +128,7 @@ export abstract class OpenAIBaseClient<
 
    return models.filter(isSupportedModel)
  } catch (error) {
-    console.error('Error listing models:', error)
+    logger.error('Error listing models:', error)
    return []
  }
 }
@@ -1,9 +1,11 @@
+import { loggerService } from '@logger'
 import { isSupportedModel } from '@renderer/config/models'
 import { Provider } from '@renderer/types'
 import OpenAI from 'openai'
 
 import { OpenAIAPIClient } from '../openai/OpenAIApiClient'
 
+const logger = loggerService.withContext('PPIOAPIClient')
 export class PPIOAPIClient extends OpenAIAPIClient {
   constructor(provider: Provider) {
     super(provider)
@@ -58,7 +60,7 @@ export class PPIOAPIClient extends OpenAIAPIClient {
 
    return processedModels.filter(isSupportedModel)
  } catch (error) {
-    console.error('Error listing PPIO models:', error)
+    logger.error('Error listing PPIO models:', error)
    return []
  }
 }
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { ApiClientFactory } from '@renderer/aiCore/clients/ApiClientFactory'
 import { BaseApiClient } from '@renderer/aiCore/clients/BaseApiClient'
 import { isDedicatedImageGenerationModel, isFunctionCallingModel } from '@renderer/config/models'
@@ -25,6 +26,8 @@ import { MIDDLEWARE_NAME as ToolUseExtractionMiddlewareName } from './middleware
 import { MiddlewareRegistry } from './middleware/register'
 import { CompletionsParams, CompletionsResult } from './middleware/schemas'
 
+const logger = loggerService.withContext('AiProvider')
+
 export default class AiProvider {
   private apiClient: BaseApiClient
 
@@ -124,7 +127,7 @@ export default class AiProvider {
      const dimensions = await this.apiClient.getEmbeddingDimensions(model)
      return dimensions
    } catch (error) {
-      console.error('Error getting embedding dimensions:', error)
+      logger.error('Error getting embedding dimensions:', error)
      throw error
    }
  }
@@ -1,6 +1,10 @@
+import { loggerService } from '@logger'
+
 import { DefaultCompletionsNamedMiddlewares } from './register'
 import { BaseContext, CompletionsMiddleware, MethodMiddleware } from './types'
 
+const logger = loggerService.withContext('aiCore:MiddlewareBuilder')
+
 /**
  * Interface for a middleware carrying a name identifier
  */
@@ -66,7 +70,7 @@ export class MiddlewareBuilder<TMiddleware = any> {
    if (index !== -1) {
      this.middlewares.splice(index + 1, 0, middlewareToInsert)
    } else {
-      console.warn(`MiddlewareBuilder: middleware named '${targetName}' not found, cannot insert`)
+      logger.warn(`Middleware named '${targetName}' not found, cannot insert`)
    }
    return this
  }
@@ -82,7 +86,7 @@ export class MiddlewareBuilder<TMiddleware = any> {
    if (index !== -1) {
      this.middlewares.splice(index, 0, middlewareToInsert)
    } else {
-      console.warn(`MiddlewareBuilder: middleware named '${targetName}' not found, cannot insert`)
+      logger.warn(`Middleware named '${targetName}' not found, cannot insert`)
    }
    return this
  }
@@ -98,7 +102,7 @@ export class MiddlewareBuilder<TMiddleware = any> {
    if (index !== -1) {
      this.middlewares[index] = newMiddleware
    } else {
-      console.warn(`MiddlewareBuilder: middleware named '${targetName}' not found, cannot replace`)
+      logger.warn(`Middleware named '${targetName}' not found, cannot replace`)
    }
    return this
  }
@@ -1,9 +1,12 @@
+import { loggerService } from '@logger'
 import { Chunk, ChunkType, ErrorChunk } from '@renderer/types/chunk'
 import { addAbortController, removeAbortController } from '@renderer/utils/abortController'
 
 import { CompletionsParams, CompletionsResult } from '../schemas'
 import type { CompletionsContext, CompletionsMiddleware } from '../types'
 
+const logger = loggerService.withContext('aiCore:AbortHandlerMiddleware')
+
 export const MIDDLEWARE_NAME = 'AbortHandlerMiddleware'
 
 export const AbortHandlerMiddleware: CompletionsMiddleware =
@@ -31,7 +34,7 @@ export const AbortHandlerMiddleware: CompletionsMiddleware =
  }
 
  if (!messageId) {
-    console.warn(`[${MIDDLEWARE_NAME}] No messageId found, abort functionality will not be available.`)
+    logger.warn(`No messageId found, abort functionality will not be available.`)
    return next(ctx, params)
  }
 
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { Usage } from '@renderer/types'
 import type { Chunk } from '@renderer/types/chunk'
 import { ChunkType } from '@renderer/types/chunk'
@@ -8,6 +8,8 @@ import { CompletionsContext, CompletionsMiddleware } from '../types'
 
 export const MIDDLEWARE_NAME = 'FinalChunkConsumerAndNotifierMiddleware'
 
+const logger = loggerService.withContext('FinalChunkConsumerMiddleware')
+
 /**
  * Middleware that consumes the final chunks and sends notifications
  *
@@ -63,7 +65,7 @@ const FinalChunkConsumerMiddleware: CompletionsMiddleware =
      while (true) {
        const { done, value: chunk } = await reader.read()
        if (done) {
-          Logger.debug(`[${MIDDLEWARE_NAME}] Input stream finished.`)
+          logger.debug(`Input stream finished.`)
          break
        }
 
@@ -79,11 +81,11 @@ const FinalChunkConsumerMiddleware: CompletionsMiddleware =
 
        if (!shouldSkipChunk) params.onChunk?.(genericChunk)
      } else {
-        Logger.warn(`[${MIDDLEWARE_NAME}] Received undefined chunk before stream was done.`)
+        logger.warn(`Received undefined chunk before stream was done.`)
      }
    }
  } catch (error) {
-    Logger.error(`[${MIDDLEWARE_NAME}] Error consuming stream:`, error)
+    logger.error(`Error consuming stream:`, error)
    throw error
  } finally {
    if (params.onChunk && !isRecursiveCall) {
@@ -115,7 +117,7 @@ const FinalChunkConsumerMiddleware: CompletionsMiddleware =
 
      return modifiedResult
    } else {
-      Logger.debug(`[${MIDDLEWARE_NAME}] No GenericChunk stream to process.`)
+      logger.debug(`No GenericChunk stream to process.`)
    }
  }
 
@@ -133,7 +135,7 @@ function extractAndAccumulateUsageMetrics(ctx: CompletionsContext, chunk: Generi
  try {
    if (ctx._internal.customState && !ctx._internal.customState?.firstTokenTimestamp) {
      ctx._internal.customState.firstTokenTimestamp = Date.now()
-      Logger.debug(`[${MIDDLEWARE_NAME}] First token timestamp: ${ctx._internal.customState.firstTokenTimestamp}`)
+      logger.debug(`First token timestamp: ${ctx._internal.customState.firstTokenTimestamp}`)
    }
    if (chunk.type === ChunkType.LLM_RESPONSE_COMPLETE) {
      // Extract usage data from the LLM_RESPONSE_COMPLETE chunk
@@ -157,7 +159,7 @@ function extractAndAccumulateUsageMetrics(ctx: CompletionsContext, chunk: Generi
      )
    }
  } catch (error) {
-    console.error(`[${MIDDLEWARE_NAME}] Error extracting usage/metrics from chunk:`, error)
+    logger.error(`Error extracting usage/metrics from chunk:`, error)
  }
 }
 
@@ -1,5 +1,9 @@
+import { loggerService } from '@logger'
+
 import { BaseContext, MethodMiddleware, MiddlewareAPI } from '../types'
 
+const logger = loggerService.withContext('LoggingMiddleware')
+
 export const MIDDLEWARE_NAME = 'GenericLoggingMiddlewares'
 
 /**
@@ -44,20 +48,20 @@ export const createGenericLoggingMiddleware: () => MethodMiddleware = () => {
  return (_: MiddlewareAPI<BaseContext, any[]>) => (next) => async (ctx, args) => {
    const methodName = ctx.methodName
    const logPrefix = `[${middlewareName} (${methodName})]`
-    console.log(`${logPrefix} Initiating. Args:`, stringifyArgsForLogging(args))
+    logger.debug(`${logPrefix} Initiating. Args:`, stringifyArgsForLogging(args))
    const startTime = Date.now()
    try {
      const result = await next(ctx, args)
      const duration = Date.now() - startTime
      // Log successful completion of the method call with duration. /
      // 记录方法调用成功完成及其持续时间。
-      console.log(`${logPrefix} Successful. Duration: ${duration}ms`)
+      logger.debug(`${logPrefix} Successful. Duration: ${duration}ms`)
      return result
    } catch (error) {
      const duration = Date.now() - startTime
      // Log failure of the method call with duration and error information. /
      // 记录方法调用失败及其持续时间和错误信息。
-      console.error(`${logPrefix} Failed. Duration: ${duration}ms`, error)
+      logger.error(`${logPrefix} Failed. Duration: ${duration}ms`, error)
      throw error // Re-throw the error to be handled by subsequent layers or the caller / 重新抛出错误,由后续层或调用者处理
    }
  }
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { MCPTool, MCPToolResponse, Model, ToolCallResponse } from '@renderer/types'
 import { ChunkType, MCPToolCreatedChunk } from '@renderer/types/chunk'
 import { SdkMessageParam, SdkRawOutput, SdkToolCall } from '@renderer/types/sdk'
@@ -10,6 +10,8 @@ import { CompletionsContext, CompletionsMiddleware } from '../types'
 export const MIDDLEWARE_NAME = 'McpToolChunkMiddleware'
 const MAX_TOOL_RECURSION_DEPTH = 20 // Prevent infinite recursion
 
+const logger = loggerService.withContext('McpToolChunkMiddleware')
+
 /**
  * MCP tool handling middleware
  *
@@ -32,7 +34,7 @@ export const McpToolChunkMiddleware: CompletionsMiddleware =
 
  const executeWithToolHandling = async (currentParams: CompletionsParams, depth = 0): Promise<CompletionsResult> => {
    if (depth >= MAX_TOOL_RECURSION_DEPTH) {
-      Logger.error(`🔧 [${MIDDLEWARE_NAME}] Maximum recursion depth ${MAX_TOOL_RECURSION_DEPTH} exceeded`)
+      logger.error(`Maximum recursion depth ${MAX_TOOL_RECURSION_DEPTH} exceeded`)
      throw new Error(`Maximum tool recursion depth ${MAX_TOOL_RECURSION_DEPTH} exceeded`)
    }
 
@@ -43,7 +45,7 @@ export const McpToolChunkMiddleware: CompletionsMiddleware =
    } else {
      const enhancedCompletions = ctx._internal.enhancedDispatch
      if (!enhancedCompletions) {
-        Logger.error(`🔧 [${MIDDLEWARE_NAME}] Enhanced completions method not found, cannot perform recursive call`)
+        logger.error(`Enhanced completions method not found, cannot perform recursive call`)
        throw new Error('Enhanced completions method not found')
      }
 
@@ -54,7 +56,7 @@ export const McpToolChunkMiddleware: CompletionsMiddleware =
    }
 
    if (!result.stream) {
-      Logger.error(`🔧 [${MIDDLEWARE_NAME}] No stream returned from enhanced completions`)
+      logger.error(`No stream returned from enhanced completions`)
      throw new Error('No stream returned from enhanced completions')
    }
 
@@ -123,7 +125,7 @@ function createToolHandlingTransform(
          executedToolResults.push(...result.toolResults)
          executedToolCalls.push(...result.confirmedToolCalls)
        } catch (error) {
-          console.error(`🔧 [${MIDDLEWARE_NAME}] Error executing tool call asynchronously:`, error)
+          logger.error(`Error executing tool call asynchronously:`, error)
        }
      })()
 
@@ -150,7 +152,7 @@ function createToolHandlingTransform(
          // Cache the execution results
          executedToolResults.push(...result.toolResults)
        } catch (error) {
-          console.error(`🔧 [${MIDDLEWARE_NAME}] Error executing tool use response asynchronously:`, error)
+          logger.error(`Error executing tool use response asynchronously:`, error)
          // Errors here do not affect the execution of other tools
        }
      })()
@@ -162,7 +164,7 @@ function createToolHandlingTransform(
        controller.enqueue(chunk)
      }
    } catch (error) {
-      console.error(`🔧 [${MIDDLEWARE_NAME}] Error processing chunk:`, error)
+      logger.error(`Error processing chunk:`, error)
      controller.error(error)
    }
  },
@@ -194,7 +196,7 @@ function createToolHandlingTransform(
        await executeWithToolHandling(newParams, depth + 1)
      }
    } catch (error) {
-      Logger.error(`🔧 [${MIDDLEWARE_NAME}] Error in tool processing:`, error)
+      logger.error(`Error in tool processing:`, error)
      controller.error(error)
    } finally {
      hasToolCalls = false
@@ -227,7 +229,7 @@ async function executeToolCalls(
    .filter((t): t is ToolCallResponse => typeof t !== 'undefined')
 
  if (mcpToolResponses.length === 0) {
-    console.warn(`🔧 [${MIDDLEWARE_NAME}] No valid MCP tool responses to execute`)
+    logger.warn(`No valid MCP tool responses to execute`)
    return { toolResults: [], confirmedToolCalls: [] }
  }
 
@@ -325,7 +327,7 @@ function buildParamsWithToolResults(
      ctx._internal.observer.usage.total_tokens += additionalTokens
    }
  } catch (error) {
-    Logger.error(`🔧 [${MIDDLEWARE_NAME}] Error estimating token usage for new messages:`, error)
+    logger.error(`Error estimating token usage for new messages:`, error)
  }
 }
 
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { SdkRawChunk } from '@renderer/types/sdk'
 
 import { ResponseChunkTransformerContext } from '../../clients/types'
@@ -7,6 +7,8 @@ import { CompletionsContext, CompletionsMiddleware } from '../types'
 
 export const MIDDLEWARE_NAME = 'ResponseTransformMiddleware'
 
+const logger = loggerService.withContext('ResponseTransformMiddleware')
+
 /**
  * Response transformation middleware
  *
@@ -32,14 +34,14 @@ export const ResponseTransformMiddleware: CompletionsMiddleware =
    if (adaptedStream instanceof ReadableStream) {
      const apiClient = ctx.apiClientInstance
      if (!apiClient) {
-        console.error(`[${MIDDLEWARE_NAME}] ApiClient instance not found in context`)
+        logger.error(`ApiClient instance not found in context`)
        throw new Error('ApiClient instance not found in context')
      }
 
      // Obtain the response chunk transformer
      const responseChunkTransformer = apiClient.getResponseChunkTransformer(ctx)
      if (!responseChunkTransformer) {
-        Logger.warn(`[${MIDDLEWARE_NAME}] No ResponseChunkTransformer available, skipping transformation`)
+        logger.warn(`No ResponseChunkTransformer available, skipping transformation`)
        return result
      }
 
@@ -47,7 +49,7 @@ export const ResponseTransformMiddleware: CompletionsMiddleware =
      const model = assistant?.model
 
      if (!assistant || !model) {
-        console.error(`[${MIDDLEWARE_NAME}] Assistant or Model not found for transformation`)
+        logger.error(`Assistant or Model not found for transformation`)
        throw new Error('Assistant or Model not found for transformation')
      }
 
@@ -61,7 +63,7 @@ export const ResponseTransformMiddleware: CompletionsMiddleware =
        provider: ctx.apiClientInstance?.provider
      }
 
-      console.log(`[${MIDDLEWARE_NAME}] Transforming raw SDK chunks with context:`, transformerContext)
+      logger.debug(`Transforming raw SDK chunks with context:`, transformerContext)
 
      try {
        // Create the transformed stream
@@ -75,7 +77,7 @@ export const ResponseTransformMiddleware: CompletionsMiddleware =
        stream: genericChunkTransformStream
      }
    } catch (error) {
-      Logger.error(`[${MIDDLEWARE_NAME}] Error during chunk transformation:`, error)
+      logger.error(`Error during chunk transformation:`, error)
      throw error
    }
  }
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { ChunkType } from '@renderer/types/chunk'
 
 import { CompletionsParams, CompletionsResult, GenericChunk } from '../schemas'
@@ -6,6 +6,8 @@ import { CompletionsContext, CompletionsMiddleware } from '../types'
 
 export const MIDDLEWARE_NAME = 'TextChunkMiddleware'
 
+const logger = loggerService.withContext('TextChunkMiddleware')
+
 /**
  * Text chunk handling middleware
  *
@@ -32,7 +34,7 @@ export const TextChunkMiddleware: CompletionsMiddleware =
    const model = params.assistant?.model
 
    if (!assistant || !model) {
-      Logger.warn(`[${MIDDLEWARE_NAME}] Missing assistant or model information, skipping text processing`)
+      logger.warn(`Missing assistant or model information, skipping text processing`)
      return result
    }
 
@@ -92,7 +94,7 @@ export const TextChunkMiddleware: CompletionsMiddleware =
        stream: enhancedTextStream
      }
    } else {
-      Logger.warn(`[${MIDDLEWARE_NAME}] No stream to process or not a ReadableStream. Returning original result.`)
+      logger.warn(`No stream to process or not a ReadableStream. Returning original result.`)
    }
  }
 
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { ChunkType, ThinkingCompleteChunk, ThinkingDeltaChunk } from '@renderer/types/chunk'
 
 import { CompletionsParams, CompletionsResult, GenericChunk } from '../schemas'
@@ -6,6 +6,8 @@ import { CompletionsContext, CompletionsMiddleware } from '../types'
 
 export const MIDDLEWARE_NAME = 'ThinkChunkMiddleware'
 
+const logger = loggerService.withContext('ThinkChunkMiddleware')
+
 /**
  * Middleware that processes thinking content
  *
@@ -94,7 +96,7 @@ export const ThinkChunkMiddleware: CompletionsMiddleware =
        stream: processedStream
      }
    } else {
-      Logger.warn(`[${MIDDLEWARE_NAME}] No generic chunk stream to process or not a ReadableStream.`)
+      logger.warn(`No generic chunk stream to process or not a ReadableStream.`)
    }
  }
 
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { ChunkType } from '@renderer/types/chunk'
 
 import { CompletionsParams, CompletionsResult } from '../schemas'
@@ -6,6 +6,8 @@ import { CompletionsContext, CompletionsMiddleware } from '../types'
 
 export const MIDDLEWARE_NAME = 'TransformCoreToSdkParamsMiddleware'
 
+const logger = loggerService.withContext('TransformCoreToSdkParamsMiddleware')
+
 /**
  * Middleware: converts a CoreCompletionsRequest into SDK-specific parameters
  * Uses the requestTransformer of the ApiClient instance from the context to perform the conversion
@@ -23,16 +25,14 @@ export const TransformCoreToSdkParamsMiddleware: CompletionsMiddleware =
    const apiClient = ctx.apiClientInstance
 
    if (!apiClient) {
-      Logger.error(`🔄 [${MIDDLEWARE_NAME}] ApiClient instance not found in context.`)
+      logger.error(`ApiClient instance not found in context.`)
      throw new Error('ApiClient instance not found in context')
    }
 
    // Check whether a requestTransformer is available
    const requestTransformer = apiClient.getRequestTransformer()
    if (!requestTransformer) {
-      Logger.warn(
-        `🔄 [${MIDDLEWARE_NAME}] ApiClient does not have getRequestTransformer method, skipping transformation`
-      )
+      logger.warn(`ApiClient does not have getRequestTransformer method, skipping transformation`)
      const result = await next(ctx, params)
      return result
    }
@@ -42,7 +42,7 @@ export const TransformCoreToSdkParamsMiddleware: CompletionsMiddleware =
    const model = params.assistant.model
 
    if (!assistant || !model) {
-      console.error(`🔄 [${MIDDLEWARE_NAME}] Assistant or Model not found for transformation.`)
+      logger.error(`Assistant or Model not found for transformation.`)
      throw new Error('Assistant or Model not found for transformation')
    }
 
@@ -74,7 +74,7 @@ export const TransformCoreToSdkParamsMiddleware: CompletionsMiddleware =
    }
    return next(ctx, params)
  } catch (error) {
-    Logger.error(`🔄 [${MIDDLEWARE_NAME}] Error during request transformation:`, error)
+    logger.error(`Error during request transformation:`, error)
    // Let the error propagate upward; specific error handling could be done here instead
    throw error
  }
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { Model } from '@renderer/types'
 import {
   ChunkType,
@@ -7,11 +8,12 @@ import {
  ThinkingStartChunk
 } from '@renderer/types/chunk'
 import { TagConfig, TagExtractor } from '@renderer/utils/tagExtraction'
-import Logger from 'electron-log/renderer'
 
 import { CompletionsParams, CompletionsResult, GenericChunk } from '../schemas'
 import { CompletionsContext, CompletionsMiddleware } from '../types'
 
+const logger = loggerService.withContext('ThinkingTagExtractionMiddleware')
+
 export const MIDDLEWARE_NAME = 'ThinkingTagExtractionMiddleware'
 
 // Thinking-tag configurations for different models
@@ -151,7 +153,7 @@ export const ThinkingTagExtractionMiddleware: CompletionsMiddleware =
        stream: processedStream
      }
    } else {
-      Logger.warn(`[${MIDDLEWARE_NAME}] No generic chunk stream to process or not a ReadableStream.`)
+      logger.warn(`[${MIDDLEWARE_NAME}] No generic chunk stream to process or not a ReadableStream.`)
    }
  }
  return result
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { MCPTool } from '@renderer/types'
 import { ChunkType, MCPToolCreatedChunk, TextDeltaChunk } from '@renderer/types/chunk'
 import { parseToolUse } from '@renderer/utils/mcp-tools'
@@ -8,6 +9,8 @@ import { CompletionsContext, CompletionsMiddleware } from '../types'
 
 export const MIDDLEWARE_NAME = 'ToolUseExtractionMiddleware'
 
+const logger = loggerService.withContext('ToolUseExtractionMiddleware')
+
 // Tool-use tag configuration
 const TOOL_USE_TAG_CONFIG: TagConfig = {
   openingTag: '<tool_use>',
@@ -106,7 +109,7 @@ function createToolUseExtractionTransform(
        // Forward all other chunks
        controller.enqueue(chunk)
      } catch (error) {
-        console.error(`🔧 [${MIDDLEWARE_NAME}] Error processing chunk:`, error)
+        logger.error(`Error processing chunk:`, error)
        controller.error(error)
      }
    },
@@ -1,4 +1,5 @@
 import { CodeOutlined, LinkOutlined } from '@ant-design/icons'
+import { loggerService } from '@logger'
 import { useTheme } from '@renderer/context/ThemeProvider'
 import { ThemeMode } from '@renderer/types'
 import { extractTitle } from '@renderer/utils/formats'
@@ -11,6 +12,8 @@ import styled, { keyframes } from 'styled-components'
 
 import HtmlArtifactsPopup from './HtmlArtifactsPopup'
 
+const logger = loggerService.withContext('HtmlArtifactsCard')
+
 const HTML_VOID_ELEMENTS = new Set([
   'area',
   'base',
@@ -123,7 +126,7 @@ const HtmlArtifactsCard: FC<Props> = ({ html }) => {
    if (window.api.shell?.openExternal) {
      window.api.shell.openExternal(filePath)
    } else {
-      console.error(t('artifacts.preview.openExternal.error.content'))
+      logger.error(t('artifacts.preview.openExternal.error.content'))
    }
  }
 
@@ -1,4 +1,5 @@
 import { LoadingOutlined } from '@ant-design/icons'
+import { loggerService } from '@logger'
 import CodeEditor from '@renderer/components/CodeEditor'
 import { CodeTool, CodeToolbar, TOOL_SPECS, useCodeTool } from '@renderer/components/CodeToolbar'
 import { useSettings } from '@renderer/hooks/useSettings'
@@ -17,6 +18,8 @@ import HtmlArtifactsCard from './HtmlArtifactsCard'
 import StatusBar from './StatusBar'
 import { ViewMode } from './types'

+const logger = loggerService.withContext('CodeBlockView')
+
 interface Props {
   children: string
   language: string
@@ -92,7 +95,7 @@ export const CodeBlockView: React.FC<Props> = memo(({ children, language, onSave
         setOutput(formattedOutput)
       })
       .catch((error) => {
-        console.error('Unexpected error:', error)
+        logger.error('Unexpected error:', error)
         setOutput(`Unexpected error: ${error.message || 'Unknown error'}`)
       })
       .finally(() => {
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { download } from '@renderer/utils/download'
 import { FileImage, ZoomIn, ZoomOut } from 'lucide-react'
 import { RefObject, useCallback, useEffect, useRef, useState } from 'react'
@@ -8,6 +9,8 @@ import { TOOL_SPECS } from './constants'
 import { useCodeTool } from './hook'
 import { CodeTool } from './types'

+const logger = loggerService.withContext('usePreviewToolHandlers')
+
 // Precompiled regular expression for querying the position
 const TRANSFORM_REGEX = /translate\((-?\d+\.?\d*)px,\s*(-?\d+\.?\d*)px\)/

@@ -205,7 +208,7 @@ export const usePreviewToolHandlers = (
       }
       img.src = svgBase64
     } catch (error) {
-      console.error('Copy failed:', error)
+      logger.error('Copy failed:', error)
       window.message.error(t('message.copy.failed'))
     }
   }, [getImgElement, t])
@@ -265,7 +268,7 @@ export const usePreviewToolHandlers = (
         img.src = svgBase64
       }
     } catch (error) {
-      console.error('Download failed:', error)
+      logger.error('Download failed:', error)
     }
   },
   [getImgElement, prefix, customDownloader]
@@ -1,7 +1,9 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { useEffect, useState } from 'react'
 import styled from 'styled-components'

+const logger = loggerService.withContext('FallbackFavicon')
+
 // Cache-key prefix for recording failed URLs
 const FAILED_FAVICON_CACHE_PREFIX = 'failed_favicon_'
 // Cache duration for failed URLs (24 hours)
@@ -121,7 +123,7 @@ const FallbackFavicon: React.FC<FallbackFaviconProps> = ({ hostname, alt }) => {
         setFaviconState({ status: 'loaded', src: url })
       })
       .catch((error) => {
-        Logger.log('All favicon requests failed:', error)
+        logger.error('All favicon requests failed:', error)
         setFaviconState({ status: 'loaded', src: faviconUrls[0] })
       })
@@ -9,6 +9,7 @@ import {
   ZoomInOutlined,
   ZoomOutOutlined
 } from '@ant-design/icons'
+import { loggerService } from '@logger'
 import { download } from '@renderer/utils/download'
 import { Dropdown, Image as AntImage, ImageProps as AntImageProps, Space } from 'antd'
 import { Base64 } from 'js-base64'
@@ -21,6 +22,8 @@ interface ImageViewerProps extends AntImageProps {
   src: string
 }

+const logger = loggerService.withContext('ImageViewer')
+
 const ImageViewer: React.FC<ImageViewerProps> = ({ src, style, ...props }) => {
   const { t } = useTranslation()

@@ -59,7 +62,7 @@ const ImageViewer: React.FC<ImageViewerProps> = ({ src, style, ...props }) => {

     window.message.success(t('message.copy.success'))
   } catch (error) {
-    console.error('复制图片失败:', error)
+    logger.error('复制图片失败:', error)
     window.message.error(t('message.copy.failed'))
   }
 }
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { backupToLocal } from '@renderer/services/BackupService'
 import { Button, Input, Modal } from 'antd'
 import dayjs from 'dayjs'
@@ -13,6 +14,8 @@ interface LocalBackupModalProps {
   setCustomFileName: (value: string) => void
 }

+const logger = loggerService.withContext('LocalBackupModal')
+
 export function LocalBackupModal({
   isModalVisible,
   handleBackup,
@@ -80,7 +83,7 @@ export function useLocalBackupModal(localBackupDir: string | undefined) {
       })
       setIsModalVisible(false)
     } catch (error) {
-      console.error('[LocalBackupModal] Backup failed:', error)
+      logger.error('Backup failed:', error)
     } finally {
       setBackuping(false)
     }
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { FolderIcon as NutstoreFolderIcon } from '@renderer/components/Icons/NutstoreIcons'
 import { Button, Input } from 'antd'
 import { useCallback, useEffect, useState } from 'react'
@@ -12,6 +13,8 @@ interface NewFolderProps {
   className?: string
 }

+const logger = loggerService.withContext('NutstorePathSelector')
+
 const NewFolderContainer = styled.div`
   display: flex;
   align-items: center;
@@ -95,7 +98,7 @@ function FileList(props: FileListProps) {
     setFiles(items)
   } catch (error) {
     if (error instanceof Error) {
-      console.error(error)
+      logger.error('Error fetching files:', error)
       window.modal.error({
         content: error.message,
         centered: true
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import i18n from '@renderer/i18n'
 import store from '@renderer/store'
 import type { Topic } from '@renderer/types'
@@ -12,6 +13,8 @@ import {
 import { Alert, Empty, Form, Input, Modal, Select, Spin, Switch, TreeSelect } from 'antd'
 import React, { useEffect, useState } from 'react'

+const logger = loggerService.withContext('ObsidianExportDialog')
+
 const { Option } = Select

 interface FileInfo {
@@ -192,7 +195,7 @@ const PopupContainer: React.FC<PopupContainerProps> = ({
       setFiles(filesData)
     }
   } catch (error) {
-    console.error('获取Obsidian Vault失败:', error)
+    logger.error('获取Obsidian Vault失败:', error)
     setError(i18n.t('chat.topics.export.obsidian_fetch_error'))
   } finally {
     setLoading(false)
@@ -210,7 +213,7 @@ const PopupContainer: React.FC<PopupContainerProps> = ({
     const filesData = await window.obsidian.getFiles(selectedVault)
     setFiles(filesData)
   } catch (error) {
-    console.error('获取Obsidian文件失败:', error)
+    logger.error('获取Obsidian文件失败:', error)
     setError(i18n.t('chat.topics.export.obsidian_fetch_folders_error'))
   } finally {
     setLoading(false)
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { isEmbeddingModel, isRerankModel } from '@renderer/config/models'
 import SelectProviderModelPopup from '@renderer/pages/settings/ProviderSettings/SelectProviderModelPopup'
 import { checkApi } from '@renderer/services/ApiService'
@@ -19,6 +19,8 @@ interface UseApiKeysProps {
   providerKind: ApiProviderKind
 }

+const logger = loggerService.withContext('ApiKeyListPopup')
+
 /**
  * API Keys management hook
  */
@@ -116,7 +118,7 @@ export function useApiKeys({ provider, updateProvider, providerKind }: UseApiKey
   const updateKey = useCallback(
     (index: number, key: string): ApiKeyValidity => {
       if (index < 0 || index >= keys.length) {
-        Logger.error('[ApiKeyList] invalid key index', { index })
+        logger.error('invalid key index', { index })
         return { isValid: false, error: 'Invalid index' }
       }
@@ -220,7 +222,7 @@ export function useApiKeys({ provider, updateProvider, providerKind }: UseApiKey
         latency: undefined
       })

-      Logger.error('[ApiKeyList] failed to validate the connectivity of the api key', error)
+      logger.error('failed to validate the connectivity of the api key', error)
     }
   },
   [keys, connectivityStates, updateConnectivityState, provider, providerKind]
@@ -301,7 +303,7 @@ async function getModelForCheck(provider: Provider, t: TFunction): Promise<Model
     if (!selectedModel) return null
     return selectedModel
   } catch (error) {
-    Logger.error('[ApiKeyList] failed to select model', error)
+    logger.error('failed to select model', error)
     return null
   }
 }
@@ -1,13 +1,15 @@
+import { loggerService } from '@logger'
 import { backup } from '@renderer/services/BackupService'
 import store from '@renderer/store'
 import { IpcChannel } from '@shared/IpcChannel'
 import { Modal, Progress } from 'antd'
-import Logger from 'electron-log'
 import { useEffect, useState } from 'react'
 import { useTranslation } from 'react-i18next'

 import { TopView } from '../TopView'

+const logger = loggerService.withContext('BackupPopup')
+
 interface Props {
   resolve: (data: any) => void
 }
@@ -35,7 +37,7 @@ const PopupContainer: React.FC<Props> = ({ resolve }) => {
 }, [])

 const onOk = async () => {
-  Logger.log('[BackupManager] ', skipBackupFile)
+  logger.debug('skipBackupFile', skipBackupFile)
   await backup(skipBackupFile)
   setOpen(false)
 }
@@ -1,6 +1,6 @@
+import { loggerService } from '@logger'
 import CustomTag from '@renderer/components/CustomTag'
 import { TopView } from '@renderer/components/TopView'
-import Logger from '@renderer/config/logger'
 import { useKnowledge, useKnowledgeBases } from '@renderer/hooks/useKnowledge'
 import { Message } from '@renderer/types/newMessage'
 import {
@@ -16,6 +16,8 @@ import { useEffect, useMemo, useState } from 'react'
 import { useTranslation } from 'react-i18next'
 import styled from 'styled-components'

+const logger = loggerService.withContext('SaveToKnowledgePopup')
+
 const { Text } = Typography

 // Content-type configuration
@@ -201,7 +203,7 @@ const PopupContainer: React.FC<Props> = ({ message, title, resolve }) => {
     setOpen(false)
     resolve({ success: true, savedCount })
   } catch (error) {
-    Logger.error('[SaveToKnowledgePopup] save failed:', error)
+    logger.error('save failed:', error)
     window.message.error(t('chat.save.knowledge.error.save_failed'))
     setLoading(false)
   }
@@ -1,4 +1,5 @@
 import { LoadingOutlined } from '@ant-design/icons'
+import { loggerService } from '@logger'
 import { useDefaultModel } from '@renderer/hooks/useAssistant'
 import { useSettings } from '@renderer/hooks/useSettings'
 import { fetchTranslate } from '@renderer/services/ApiService'
@@ -15,6 +16,8 @@ import styled from 'styled-components'

 import { TopView } from '../TopView'

+const logger = loggerService.withContext('TextEditPopup')
+
 interface ShowParams {
   text: string
   textareaProps?: TextAreaProps
@@ -118,7 +121,7 @@ const PopupContainer: React.FC<Props> = ({
       setTextValue(translatedText)
     }
   } catch (error) {
-    console.error('Translation failed:', error)
+    logger.error('Translation failed:', error)
     window.message.error({
       content: t('translate.error.failed'),
       key: 'translate-message'
@@ -1,4 +1,5 @@
 import { LoadingOutlined } from '@ant-design/icons'
+import { loggerService } from '@logger'
 import { useDefaultModel } from '@renderer/hooks/useAssistant'
 import { useSettings } from '@renderer/hooks/useSettings'
 import { fetchTranslate } from '@renderer/services/ApiService'
@@ -18,6 +19,8 @@ interface Props {
   isLoading?: boolean
 }

+const logger = loggerService.withContext('TranslateButton')
+
 const TranslateButton: FC<Props> = ({ text, onTranslated, disabled, style, isLoading }) => {
   const { t } = useTranslation()
   const { translateModel } = useDefaultModel()
@@ -59,7 +62,7 @@ const TranslateButton: FC<Props> = ({ text, onTranslated, disabled, style, isLoa
     const translatedText = await fetchTranslate({ content: text, assistant })
     onTranslated(translatedText)
   } catch (error) {
-    console.error('Translation failed:', error)
+    logger.error('Translation failed:', error)
     window.message.error({
       content: t('translate.error.failed'),
       key: 'translate-message'
@@ -1,6 +0,0 @@
-import Logger from 'electron-log/renderer'
-
-// Set the log level for the renderer process
-Logger.transports.console.level = 'info'
-
-export default Logger
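The hunk above deletes the old shim, which simply re-exported `electron-log` for the renderer. Its replacement is the new LoggerService, used throughout this diff as `loggerService.withContext(...)` followed by `debug`/`info`/`warn`/`error` calls. The sketch below is illustrative only: the method names and call shapes are taken from the hunks in this diff, but the internals (prefix format, console delegation) are assumptions, not the project's actual implementation, which also forwards records to the main process over IPC.

```typescript
// Illustrative sketch of a context-scoped logger matching the call sites
// in this diff; NOT the real LoggerService implementation.
type LogLevel = 'debug' | 'info' | 'warn' | 'error'

class LoggerService {
  private windowSource?: string

  // Called once per renderer entry point, per the init.ts hunk in this diff.
  initWindowSource(source: string): void {
    this.windowSource = source
  }

  // Returns a logger whose messages carry the window source and context,
  // so call sites no longer hand-roll `[ModuleName]` prefixes.
  withContext(context: string) {
    const emit = (level: LogLevel, message: string, ...args: unknown[]): void => {
      const prefix = `[${this.windowSource ?? 'main'}] [${context}]`
      // The real service presumably also ships this over IPC; here we
      // just delegate to the console.
      console[level](`${prefix} ${message}`, ...args)
    }
    return {
      debug: (msg: string, ...a: unknown[]) => emit('debug', msg, ...a),
      info: (msg: string, ...a: unknown[]) => emit('info', msg, ...a),
      warn: (msg: string, ...a: unknown[]) => emit('warn', msg, ...a),
      error: (msg: string, ...a: unknown[]) => emit('error', msg, ...a)
    }
  }
}

export const loggerService = new LoggerService()

// Usage mirroring the hunks in this diff:
loggerService.initWindowSource('mainWindow')
const logger = loggerService.withContext('FallbackFavicon')
logger.error('All favicon requests failed:', new Error('timeout'))
```

The payoff visible across the hunks: call sites drop their manual `[ModuleName]` prefixes, and the level (`debug`/`info`/`warn`/`error`) becomes explicit instead of everything funneling through `console.error` or `Logger.log`.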
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import ThreeMinTopAppLogo from '@renderer/assets/images/apps/3mintop.png?url'
 import AbacusLogo from '@renderer/assets/images/apps/abacus.webp?url'
 import AIStudioLogo from '@renderer/assets/images/apps/aistudio.svg?url'
@@ -57,6 +58,8 @@ import OpenAiProviderLogo from '@renderer/assets/images/providers/openai.png?url
 import SiliconFlowProviderLogo from '@renderer/assets/images/providers/silicon.png?url'
 import { MinAppType } from '@renderer/types'

+const logger = loggerService.withContext('Config:minapps')
+
 // Load custom mini apps
 const loadCustomMiniApp = async (): Promise<MinAppType[]> => {
   try {
@@ -79,7 +82,7 @@ const loadCustomMiniApp = async (): Promise<MinAppType[]> => {
     addTime: app.addTime || now
   }))
 } catch (error) {
-  console.error('Failed to load custom mini apps:', error)
+  logger.error('Failed to load custom mini apps:', error)
   return []
 }
}
@@ -1,4 +1,4 @@
-import Logger from '@renderer/config/logger'
+import { loggerService } from '@logger'
 import { LanguagesEnum } from '@renderer/config/translate'
 import type { LanguageCode, LegacyMessage as OldMessage, Topic } from '@renderer/types'
 import { FileTypes, WebSearchSource } from '@renderer/types' // Import FileTypes enum
@@ -23,6 +23,8 @@ import {
   createTranslationBlock
 } from '../utils/messageUtils/create'

+const logger = loggerService.withContext('Database:Upgrades')
+
 export async function upgradeToV5(tx: Transaction): Promise<void> {
   const topics = await tx.table('topics').toArray()
   const files = await tx.table('files').toArray()
@@ -91,7 +93,7 @@ function mapOldStatusToNewMessageStatus(oldStatus: OldMessage['status']): NewMes

 // --- UPDATED UPGRADE FUNCTION for Version 7 ---
 export async function upgradeToV7(tx: Transaction): Promise<void> {
-  Logger.info('Starting DB migration to version 7: Normalizing messages and blocks...')
+  logger.info('Starting DB migration to version 7: Normalizing messages and blocks...')

   const oldTopicsTable = tx.table('topics')
   const newBlocksTable = tx.table('message_blocks')
@@ -102,7 +104,7 @@ export async function upgradeToV7(tx: Transaction): Promise<void> {
     const blocksToCreate: MessageBlock[] = []

     if (!oldTopic.messages || !Array.isArray(oldTopic.messages)) {
-      console.warn(`Topic ${oldTopic.id} has no valid messages array, skipping.`)
+      logger.warn(`Topic ${oldTopic.id} has no valid messages array, skipping.`)
       topicUpdates[oldTopic.id] = { messages: [] }
       return
     }
@@ -303,14 +305,14 @@ export async function upgradeToV7(tx: Transaction): Promise<void> {
   const updateOperations = Object.entries(topicUpdates).map(([id, data]) => ({ key: id, changes: data }))
   if (updateOperations.length > 0) {
     await oldTopicsTable.bulkUpdate(updateOperations)
-    Logger.log(`Updated message references for ${updateOperations.length} topics.`)
+    logger.info(`Updated message references for ${updateOperations.length} topics.`)
   }

-  Logger.log('DB migration to version 7 finished successfully.')
+  logger.info('DB migration to version 7 finished successfully.')
 }

 export async function upgradeToV8(tx: Transaction): Promise<void> {
-  Logger.log('DB migration to version 8 started')
+  logger.info('DB migration to version 8 started')

   const langMap: Record<string, LanguageCode> = {
     english: 'en-us',
@@ -340,7 +342,7 @@ export async function upgradeToV8(tx: Transaction): Promise<void> {
   const originTarget = (await settingsTable.get('translate:target:language'))?.value
   const originPair = (await settingsTable.get('translate:bidirectional:pair'))?.value
   let newSource, newTarget, newPair
-  Logger.log('originSource: %o', originSource)
+  logger.info('originSource: %o', originSource)
   if (originSource === 'auto') {
     newSource = 'auto'
   } else {
@@ -350,20 +352,20 @@ export async function upgradeToV8(tx: Transaction): Promise<void> {
     }
   }

-  Logger.log('originTarget: %o', originTarget)
+  logger.info('originTarget: %o', originTarget)
   newTarget = langMap[originTarget]
   if (!newTarget) {
     newTarget = LanguagesEnum.zhCN.langCode
   }

-  Logger.log('originPair: %o', originPair)
+  logger.info('originPair: %o', originPair)
   if (!originPair || !originPair[0] || !originPair[1]) {
     newPair = defaultPair
   } else {
     newPair = [langMap[originPair[0]], langMap[originPair[1]]]
   }

-  Logger.log('DB migration to version 8: %o', { newSource, newTarget, newPair })
+  logger.info('DB migration to version 8: %o', { newSource, newTarget, newPair })

   await settingsTable.put({ id: 'translate:bidirectional:pair', value: newPair })
   await settingsTable.put({ id: 'translate:source:language', value: newSource })
@@ -379,8 +381,8 @@ export async function upgradeToV8(tx: Transaction): Promise<void> {
       targetLanguage: langMap[history.targetLanguage]
     })
   } catch (error) {
-    console.error('Error upgrading history:', error)
+    logger.error('Error upgrading history:', error)
   }
 }
-  Logger.log('DB migration to version 8 finished.')
+  logger.info('DB migration to version 8 finished.')
 }
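The migrated calls in the hunks above keep electron-log's printf-style placeholders (e.g. `logger.info('originSource: %o', originSource)`), so the new logger must expand `%s`/`%d`/`%o` the same way. How LoggerService actually renders these is not shown in this diff; as an assumption, the behavior matches Node's `util.format`, which the following sketch demonstrates:

```typescript
import { format } from 'node:util'

// Illustrative only: expands printf-style placeholders the way the
// migrated log calls expect. Whether LoggerService does exactly this
// internally is an assumption.
function render(message: string, ...args: unknown[]): string {
  return format(message, ...args)
}

console.log(render('originSource: %o', { lang: 'english' }))
console.log(render('Updated message references for %d topics.', 42))
```

If the real service instead concatenates arguments (the common `console.log(msg, arg)` behavior), placeholder strings like `'originSource: %o'` would pass through literally, so this is worth verifying against LoggerService itself.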
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { isMac } from '@renderer/config/constant'
 import { isLocalAi } from '@renderer/config/env'
 import { useTheme } from '@renderer/context/ThemeProvider'
@@ -20,6 +21,8 @@ import { useRuntime } from './useRuntime'
 import { useSettings } from './useSettings'
 import useUpdateHandler from './useUpdateHandler'

+const logger = loggerService.withContext('useAppInit')
+
 export function useAppInit() {
   const dispatch = useAppDispatch()
   const { proxyUrl, language, windowStyle, autoCheckUpdate, proxyMode, customCss, enableDataCollection } = useSettings()
@@ -133,7 +136,7 @@ export function useAppInit() {
   useEffect(() => {
     const memoryService = MemoryService.getInstance()
     memoryService.updateConfig().catch((error) => {
-      console.error('Failed to update memory config:', error)
+      logger.error('Failed to update memory config:', error)
     })
   }, [memoryConfig])
 }
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import { useMessageOperations } from '@renderer/hooks/useMessageOperations'
 import { EVENT_NAMES, EventEmitter } from '@renderer/services/EventService'
 import { RootState } from '@renderer/store'
@@ -9,6 +10,8 @@ import { useCallback, useEffect, useState } from 'react'
 import { useTranslation } from 'react-i18next'
 import { useDispatch, useSelector, useStore } from 'react-redux'

+const logger = loggerService.withContext('useChatContext')
+
 export const useChatContext = (activeTopic: Topic) => {
   const { t } = useTranslation()
   const dispatch = useDispatch()
@@ -115,7 +118,7 @@ export const useChatContext = (activeTopic: Topic) => {
     window.message.success(t('message.delete.success'))
     handleToggleMultiSelectMode(false)
   } catch (error) {
-    console.error('Failed to delete messages:', error)
+    logger.error('Failed to delete messages:', error)
     window.message.error(t('message.delete.failed'))
   }
 }
@@ -1,5 +1,5 @@
+import { loggerService } from '@logger'
 import { createSelector } from '@reduxjs/toolkit'
-import Logger from '@renderer/config/logger'
 import { EVENT_NAMES, EventEmitter } from '@renderer/services/EventService'
 import { estimateUserPromptUsage } from '@renderer/services/TokenService'
 import store, { type RootState, useAppDispatch, useAppSelector } from '@renderer/store'
@@ -26,6 +26,8 @@ import { abortCompletion } from '@renderer/utils/abortController'
 import { throttle } from 'lodash'
 import { useCallback } from 'react'

+const logger = loggerService.withContext('UseMessageOperations')
+
 const selectMessagesState = (state: RootState) => state.messages

 export const selectNewTopicLoading = createSelector(
@@ -75,7 +77,7 @@ export function useMessageOperations(topic: Topic) {
   const editMessage = useCallback(
     async (messageId: string, updates: Partial<Omit<Message, 'id' | 'topicId' | 'blocks'>>) => {
       if (!topic?.id) {
-        console.error('[editMessage] Topic prop is not valid.')
+        logger.error('[editMessage] Topic prop is not valid.')
         return
       }
@@ -157,7 +159,7 @@ export function useMessageOperations(topic: Topic) {
   const regenerateAssistantMessage = useCallback(
     async (message: Message, assistant: Assistant) => {
       if (message.role !== 'assistant') {
-        console.warn('regenerateAssistantMessage should only be called for assistant messages.')
+        logger.warn('regenerateAssistantMessage should only be called for assistant messages.')
         return
       }
       await dispatch(regenerateAssistantResponseThunk(topic.id, message, assistant))
@@ -172,11 +174,11 @@ export function useMessageOperations(topic: Topic) {
   const appendAssistantResponse = useCallback(
     async (existingAssistantMessage: Message, newModel: Model, assistant: Assistant) => {
       if (existingAssistantMessage.role !== 'assistant') {
-        console.error('appendAssistantResponse should only be called for an existing assistant message.')
+        logger.error('appendAssistantResponse should only be called for an existing assistant message.')
         return
       }
       if (!existingAssistantMessage.askId) {
-        console.error('Cannot append response: The existing assistant message is missing its askId.')
+        logger.error('Cannot append response: The existing assistant message is missing its askId.')
         return
       }
       await dispatch(appendAssistantResponseThunk(topic.id, existingAssistantMessage.id, newModel, assistant))
@@ -204,7 +206,7 @@ export function useMessageOperations(topic: Topic) {
       const state = store.getState()
       const message = state.messages.entities[messageId]
       if (!message) {
-        console.error('[getTranslationUpdater] cannot find message:', messageId)
+        logger.error('[getTranslationUpdater] cannot find message:', messageId)
         return null
       }
@@ -240,7 +242,7 @@ export function useMessageOperations(topic: Topic) {
       }

       if (!blockId) {
-        console.error('[getTranslationUpdater] Failed to create translation block.')
+        logger.error('[getTranslationUpdater] Failed to create translation block.')
         return null
       }
@@ -265,7 +267,7 @@ export function useMessageOperations(topic: Topic) {
    */
   const createTopicBranch = useCallback(
     (sourceTopicId: string, branchPointIndex: number, newTopic: Topic) => {
-      Logger.log(`Cloning messages from topic ${sourceTopicId} to new topic ${newTopic.id}`)
+      logger.info(`Cloning messages from topic ${sourceTopicId} to new topic ${newTopic.id}`)
       return dispatch(cloneMessagesToNewTopicThunk(sourceTopicId, branchPointIndex, newTopic))
     },
     [dispatch]
@@ -280,7 +282,7 @@ export function useMessageOperations(topic: Topic) {
   const editMessageBlocks = useCallback(
     async (messageId: string, editedBlocks: MessageBlock[]) => {
       if (!topic?.id) {
-        console.error('[editMessageBlocks] Topic prop is not valid.')
+        logger.error('[editMessageBlocks] Topic prop is not valid.')
         return
       }
@@ -289,7 +291,7 @@ export function useMessageOperations(topic: Topic) {
       const state = store.getState()
       const message = state.messages.entities[messageId]
       if (!message) {
-        console.error('[editMessageBlocks] Message not found:', messageId)
+        logger.error('[editMessageBlocks] Message not found:', messageId)
         return
       }
@@ -353,7 +355,7 @@ export function useMessageOperations(topic: Topic) {
         await dispatch(removeBlocksThunk(topic.id, messageId, blockIdsToRemove))
       }
     } catch (error) {
-      console.error('[editMessageBlocks] Failed to update message blocks:', error)
+      logger.error('[editMessageBlocks] Failed to update message blocks:', error)
     }
   },
   [dispatch, topic?.id]
@@ -369,7 +371,7 @@ export function useMessageOperations(topic: Topic) {

   const mainTextBlock = editedBlocks.find((block) => block.type === MessageBlockType.MAIN_TEXT)
   if (!mainTextBlock) {
-    console.error('[resendUserMessageWithEdit] Main text block not found in edited blocks')
+    logger.error('[resendUserMessageWithEdit] Main text block not found in edited blocks')
     return
   }
@@ -401,14 +403,14 @@ export function useMessageOperations(topic: Topic) {
   const removeMessageBlock = useCallback(
     async (messageId: string, blockIdToRemove: string) => {
       if (!topic?.id) {
-        console.error('[removeMessageBlock] Topic prop is not valid.')
+        logger.error('[removeMessageBlock] Topic prop is not valid.')
         return
       }

       const state = store.getState()
       const message = state.messages.entities[messageId]
       if (!message || !message.blocks) {
-        console.error('[removeMessageBlock] Message not found or has no blocks:', messageId)
+        logger.error('[removeMessageBlock] Message not found or has no blocks:', messageId)
         return
       }
@@ -1,5 +1,8 @@
+import { loggerService } from '@logger'
 import { useCallback } from 'react'

+const logger = loggerService.withContext('useNutstoreSSO')
+
 export function useNutstoreSSO() {
   const nutstoreSSOHandler = useCallback(() => {
     return new Promise<string>((resolve, reject) => {
@@ -11,7 +14,7 @@ export function useNutstoreSSO() {
       if (!encryptedToken) return reject(null)
       resolve(encryptedToken)
     } catch (error) {
-      console.error('解析URL失败:', error)
+      logger.error('解析URL失败:', error)
       reject(null)
     } finally {
       removeListener()
@@ -1,3 +1,4 @@
+import { loggerService } from '@logger'
 import db from '@renderer/databases'
 import { getModelUniqId } from '@renderer/services/ModelService'
 import { sortBy } from 'lodash'
@@ -5,6 +6,8 @@ import { useCallback, useEffect, useState } from 'react'

 import { useProviders } from './useProvider'

+const logger = loggerService.withContext('usePinnedModels')
+
 export const usePinnedModels = () => {
   const [pinnedModels, setPinnedModels] = useState<string[]>([])
   const [loading, setLoading] = useState(true)
@@ -30,7 +33,7 @@ export const usePinnedModels = () => {
   }

   loadPinnedModels().catch((error) => {
-    console.error('Failed to load pinned models', error)
+    logger.error('Failed to load pinned models', error)
     setPinnedModels([])
     setLoading(false)
   })
@@ -53,7 +56,7 @@ export const usePinnedModels = () => {
       : [...pinnedModels, modelId]
     await updatePinnedModels(newPinnedModels)
   } catch (error) {
-    console.error('Failed to toggle pinned model', error)
+    logger.error('Failed to toggle pinned model', error)
   }
 },
 [pinnedModels, updatePinnedModels]
@@ -1,10 +1,13 @@
 import KeyvStorage from '@kangfenmao/keyv-storage'
+import { loggerService } from '@logger'

 import { startAutoSync } from './services/BackupService'
 import { startNutstoreAutoSync } from './services/NutstoreService'
 import storeSyncService from './services/StoreSyncService'
 import store from './store'

+loggerService.initWindowSource('mainWindow')
+
 function initKeyv() {
   window.keyv = new KeyvStorage()
   window.keyv.init()
@@ -1,6 +1,7 @@
 import 'emoji-picker-element'

 import { CheckOutlined, LoadingOutlined, RollbackOutlined, ThunderboltOutlined } from '@ant-design/icons'
+import { loggerService } from '@logger'
 import EmojiPicker from '@renderer/components/EmojiPicker'
 import { TopView } from '@renderer/components/TopView'
 import { AGENT_PROMPT } from '@renderer/config/prompts'
@@ -30,6 +31,8 @@ type FieldType = {
   knowledge_base_ids: string[]
 }

+const logger = loggerService.withContext('AddAgentPopup')
+
 const PopupContainer: React.FC<Props> = ({ resolve }) => {
   const [open, setOpen] = useState(true)
   const [form] = Form.useForm()
@@ -140,7 +143,7 @@ const PopupContainer: React.FC<Props> = ({ resolve }) => {
     setOriginalPrompt(content)
     setHasUnsavedChanges(true)
   } catch (error) {
-    console.error('Error fetching data:', error)
+    logger.error('Error fetching data:', error)
   }

   setLoading(false)