* refactor: rework LLM response handling to support reasoning content
- Added a `show_reasoning` parameter to `run_agent` to control whether reasoning content is displayed.
- Updated `LLMResponse` with a `reasoning_content` field that stores the model's reasoning text separately from the completion (both sketched after this list).
- Modified `WebChatMessageEvent` to handle reasoning content and send it alongside regular completion chunks in streaming responses.
- Implemented reasoning extraction in the individual provider sources (e.g., OpenAI, Gemini); see the streaming sketch after this list.
- Updated the chat interface to display reasoning content in a collapsible section.
- Removed the deprecated `thinking_filter` package and its associated logic.
- Updated localization files to include new reasoning-related strings.
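
A minimal sketch of the new surface, assuming dataclass-style models; everything here except the `reasoning_content` field and the `show_reasoning` flag (which come from the changes above) is illustrative:

```python
import asyncio
from dataclasses import dataclass


@dataclass
class LLMResponse:
    """Completion returned by a provider source."""
    completion_text: str = ""
    # The model's reasoning ("thinking") text, kept separate from the
    # answer so callers can decide whether to surface it.
    reasoning_content: str = ""


async def run_agent(prompt: str, show_reasoning: bool = False) -> str:
    """Hypothetical signature: the real run_agent takes more arguments."""
    resp = await call_provider(prompt)
    if show_reasoning and resp.reasoning_content:
        return f"[reasoning]\n{resp.reasoning_content}\n\n{resp.completion_text}"
    return resp.completion_text


async def call_provider(prompt: str) -> LLMResponse:
    # Stand-in for a real provider call.
    return LLMResponse(completion_text="42", reasoning_content="6 * 7 = 42")


if __name__ == "__main__":
    print(asyncio.run(run_agent("6 * 7?", show_reasoning=True)))
```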
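On the provider side, OpenAI-compatible sources that emit thinking tokens (DeepSeek-style APIs) place them in a non-standard `reasoning_content` field on the streaming delta. A sketch of the extraction loop, not the project's actual implementation; the tagged chunks are what an event like `WebChatMessageEvent` could forward to the collapsible UI:

```python
from openai import AsyncOpenAI


async def stream_with_reasoning(client: AsyncOpenAI, model: str, prompt: str):
    """Yield ("reasoning" | "content", text) pairs from a streamed completion."""
    stream = await client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    async for chunk in stream:
        if not chunk.choices:  # skip e.g. trailing usage-only chunks
            continue
        delta = chunk.choices[0].delta
        # Non-standard field used by reasoning-capable OpenAI-compatible APIs.
        reasoning = getattr(delta, "reasoning_content", None)
        if reasoning:
            yield "reasoning", reasoning
        if delta.content:
            yield "content", delta.content
```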
* feat: add Groq chat completion provider and associated configurations
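Groq serves an OpenAI-compatible chat-completions API, so the new provider can be a thin wrapper over the OpenAI client pointed at Groq's base URL; a standalone sketch, with an illustrative model id:

```python
from openai import AsyncOpenAI

# Groq speaks the OpenAI chat-completions protocol at this base URL.
client = AsyncOpenAI(
    api_key="gsk_...",  # placeholder key
    base_url="https://api.groq.com/openai/v1",
)


async def groq_chat(prompt: str) -> str:
    resp = await client.chat.completions.create(
        model="llama-3.1-8b-instant",  # illustrative model id
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content or ""
```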
* Update astrbot/core/provider/sources/gemini_source.py
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
---------
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>