- Introduced `ConversationSidebar.vue` for improved conversation management and sidebar functionality.
- Enhanced `MessageList.vue` to handle loading states and improve message rendering.
- Created new composables (`useConversations`, `useMessages`, `useMediaHandling`, and `useRecording`) for better code organization and reusability.
- Added loading indicators and improved user experience during message processing.
- Ensured backward compatibility and maintained existing functionality.
- Added WebChatSession model for managing user sessions.
- Introduced methods for creating, retrieving, updating, and deleting WebChat sessions in the database (see the sketch after this list).
- Updated core lifecycle to include migration from version 4.6 to 4.7, creating WebChat sessions from existing platform message history.
- Refactored chat routes to support the new session-based architecture, replacing conversation-related endpoints with session endpoints.
- Updated frontend components to handle sessions instead of conversations, including session creation and management.
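The changelog doesn't show the model itself, so here is a minimal sketch of what the `WebChatSession` model and its CRUD layer could look like; only the class name comes from the entries above, and every field, method, and the in-memory store are illustrative assumptions rather than the project's actual schema.

```python
# Hedged sketch: a session record plus an in-memory stand-in for the
# database-backed create/retrieve/update/delete methods described above.
import time
import uuid
from dataclasses import dataclass, field


@dataclass
class WebChatSession:
    user_id: str
    session_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    created_at: float = field(default_factory=time.time)
    updated_at: float = field(default_factory=time.time)
    history: list[dict] = field(default_factory=list)  # platform message history


class WebChatSessionStore:
    """Illustrative CRUD surface; the real code presumably talks to the DB."""

    def __init__(self) -> None:
        self._sessions: dict[str, WebChatSession] = {}

    def create(self, user_id: str) -> WebChatSession:
        session = WebChatSession(user_id=user_id)
        self._sessions[session.session_id] = session
        return session

    def get(self, session_id: str) -> WebChatSession | None:
        return self._sessions.get(session_id)

    def update(self, session_id: str, history: list[dict]) -> None:
        session = self._sessions[session_id]
        session.history = history
        session.updated_at = time.time()

    def delete(self, session_id: str) -> None:
        self._sessions.pop(session_id, None)
```

Under this shape, the 4.6 → 4.7 migration mentioned above would amount to iterating existing platform message history and calling `create`/`update` once per conversation.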
* ci: build test image on master pushes
* ci: split workflows for master test and release builds
* test ci
* test ci
* Update docker-image.yml
* test ci
Updated README to enhance deployment instructions.
* Make GHCR publishing optional in Docker workflow
* chore: Update DockerHub password secret in workflow
* Update .github/workflows/docker-image.yml
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
* chore: rename job to build nightly image in workflow
* feat: schedule the nightly build at 0:00 AM every day, if there are new commits
* fix: update build-nightly-image job to trigger only on schedule events
* Update fetch-depth and enable fetch-tag in workflows
---------
Co-authored-by: Soulter <37870767+Soulter@users.noreply.github.com>
Co-authored-by: LIghtJUNction <lightjunction.me@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Soulter <905617992@qq.com>
* feat: enhance parameter type handling in LLM tool registration with JSON schema support
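The registration API isn't shown in this log; the sketch below only illustrates the general idea of describing tool parameters with a JSON schema in the OpenAI function-calling style. The tool itself and the surrounding dictionary keys are hypothetical examples, not AstrBot's actual interface.

```python
# Illustrative only: a tool whose parameters are declared as a JSON schema
# rather than bare Python type hints. An OpenAI-compatible provider would
# receive it wrapped as {"type": "function", "function": weather_tool}.
weather_tool = {
    "name": "get_weather",
    "description": "Look up the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
            "unit": {
                "type": "string",
                "enum": ["celsius", "fahrenheit"],
                "default": "celsius",
            },
        },
        "required": ["city"],
    },
}
```

Schema-level metadata such as `enum` and `default` is exactly what plain Python type annotations cannot express, which is presumably the motivation for this change.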
* refactor: remove debug print statement from FunctionToolManager
* refactor: LLM response handling with reasoning content
- Added a `show_reasoning` parameter to `run_agent` to control the display of reasoning content.
- Updated `LLMResponse` to include a `reasoning_content` field for storing reasoning text (a sketch follows this list).
- Modified `WebChatMessageEvent` to handle and send reasoning content in streaming responses.
- Implemented reasoning extraction in various provider sources (e.g., OpenAI, Gemini).
- Updated the chat interface to display reasoning content in a collapsible format.
- Removed the deprecated `thinking_filter` package and its associated logic.
- Updated localization files to include new reasoning-related strings.
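None of the implementation appears in this log; the following is a hedged sketch of the pattern the entries above describe: carrying reasoning text on the response object and stripping inline `<think>…</think>` blocks from providers that emit them. `LLMResponse`, `reasoning_content`, and `show_reasoning` are named above; everything else (fields, regex, helper names) is assumed for illustration.

```python
# Sketch of separating reasoning text from the visible completion.
import re
from dataclasses import dataclass

_THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)


@dataclass
class LLMResponse:
    completion_text: str
    reasoning_content: str = ""  # reasoning text, kept separate from the reply


def split_reasoning(raw_text: str) -> LLMResponse:
    """Pull inline <think> blocks out of a provider reply, if present."""
    reasoning = "\n".join(m.strip() for m in _THINK_RE.findall(raw_text))
    visible = _THINK_RE.sub("", raw_text).strip()
    return LLMResponse(completion_text=visible, reasoning_content=reasoning)


def render(resp: LLMResponse, show_reasoning: bool = False) -> str:
    """Only surface reasoning when the caller asks for it."""
    if show_reasoning and resp.reasoning_content:
        return f"[reasoning]\n{resp.reasoning_content}\n\n{resp.completion_text}"
    return resp.completion_text
```

Providers that return reasoning as a dedicated field (rather than inline tags) would simply assign it to `reasoning_content` directly instead of going through the regex.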
* feat: add Groq chat completion provider and associated configurations
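Groq serves an OpenAI-compatible API at `https://api.groq.com/openai/v1`, so the new provider most likely reuses the existing OpenAI-style client. The snippet below is a rough sketch under that assumption; the config keys and the example model name are illustrative, not the project's actual provider schema.

```python
# Hedged sketch: reaching Groq through the standard openai client.
import os

from openai import OpenAI

groq_config = {
    "type": "groq_chat_completion",               # assumed provider key
    "api_base": "https://api.groq.com/openai/v1",
    "api_key": os.environ.get("GROQ_API_KEY", ""),
    "model": "llama-3.1-8b-instant",              # example model, may be outdated
}

client = OpenAI(base_url=groq_config["api_base"], api_key=groq_config["api_key"])
reply = client.chat.completions.create(
    model=groq_config["model"],
    messages=[{"role": "user", "content": "ping"}],
)
print(reply.choices[0].message.content)
```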
* Update astrbot/core/provider/sources/gemini_source.py
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
---------
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>