* refactor: llm message schema
* feat: implement MCPTool and local LLM tools with enhanced context handling
* refactor: reorganize imports and enhance docstrings for clarity
* refactor: enhance ContentPart validation and add message pair handling in ConversationManager
* chore: ruff format
* refactor: remove debug print statement from payloads in ProviderOpenAIOfficial
* Update astrbot/core/agent/tool.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Update astrbot/core/agent/message.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Update astrbot/core/agent/message.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Update astrbot/core/agent/tool.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Update astrbot/core/pipeline/process_stage/method/llm_request.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Update astrbot/core/agent/message.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* refactor: enhance documentation and import mcp in tool.py; update call method return type
* fix: fix the plugin reload mechanism when a tool is registered as a dataclass
* refactor: change role attributes to use Literal types for message segments
* fix: add support for 'decorator_handler' method in call_local_llm_tool
* fix: handle None prompt in text_chat method and ensure context is properly formatted

Co-authored-by: LIghtJUNction <lightjunction.me@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
31 lines
830 B
Python
from typing import Generic

import mcp

from astrbot.core.agent.tool import FunctionTool
from astrbot.core.provider.entities import LLMResponse

from .run_context import ContextWrapper, TContext


class BaseAgentRunHooks(Generic[TContext]):
    async def on_agent_begin(self, run_context: ContextWrapper[TContext]): ...

    async def on_tool_start(
        self,
        run_context: ContextWrapper[TContext],
        tool: FunctionTool,
        tool_args: dict | None,
    ): ...

    async def on_tool_end(
        self,
        run_context: ContextWrapper[TContext],
        tool: FunctionTool,
        tool_args: dict | None,
        tool_result: mcp.types.CallToolResult | None,
    ): ...

    async def on_agent_done(
        self,
        run_context: ContextWrapper[TContext],
        llm_response: LLMResponse,
    ): ...