ahvn.agent.base module¶

class ahvn.agent.base.BaseAgentSpec(tools=None, llm_args=None, max_steps=None, **kwargs)[source]¶

Bases: ABC

__init__(tools=None, llm_args=None, max_steps=None, **kwargs)[source]¶
abstractmethod encode(**inputs)[source]¶

Convert input arguments into Messages for the agent.

Parameters:

**inputs – Arbitrary input arguments.

Returns:

The encoded messages for the agent.

Return type:

Messages
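A minimal sketch of what a subclass's encode() might look like, assuming Messages is a list of role/content dicts (the actual Messages type is defined elsewhere in ahvn, and the `question` argument here is purely illustrative):

```python
# Hypothetical encode(): build a message list from keyword arguments.
# Assumes Messages is a list of role/content dicts.
def encode(**inputs):
    question = inputs["question"]
    return [{"role": "user", "content": question}]

msgs = encode(question="What is 2 + 2?")
```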

step(messages, include=None)[source]¶

Execute a single LLM call with streaming.

Parameters:
  • messages (Union[str, Dict[str, Any], Any, List[Union[str, Dict[str, Any], Any]]]) – Current conversation messages.

  • include (Optional[List[Literal['text', 'think', 'tool_calls', 'content', 'message', 'structured', 'tool_messages', 'tool_results', 'delta_messages', 'messages']]]) – Fields to include in the stream chunks.

Yields:

Stream chunks from the LLM.

Return type:

Generator[Dict[str, Any], None, None]

abstractmethod is_done(messages, delta_messages)[source]¶

Determine whether the agent has finished and, if so, with what finish state.

Return type:

Tuple[bool, Dict[str, Any]]

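A sketch of one possible is_done() implementation, assuming messages are dicts and that the absence of tool calls in the latest delta signals completion (a common heuristic; the real criterion is up to the subclass):

```python
def is_done(messages, delta_messages):
    # Hypothetical heuristic: the agent is done when the newest message
    # issued no tool calls. The dict is the finish state.
    last = delta_messages[-1] if delta_messages else {}
    done = not last.get("tool_calls")
    return done, {"reason": "no_tool_calls" if done else "pending_tools"}
```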
user_proxy(messages, delta_messages, finish_state=None)[source]¶

Add a user proxy message to prompt the agent to continue.

This is called when the agent is not done after a step, to encourage it to keep going.

Returns:

A list of messages to append.

Return type:

Messages
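One possible user_proxy() sketch, assuming a plain continuation prompt is enough (real subclasses may inspect delta_messages or finish_state to tailor the nudge):

```python
def user_proxy(messages, delta_messages, finish_state=None):
    # Hypothetical: append a generic user message urging the agent on.
    return [{"role": "user", "content": "Please continue."}]
```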

abstractmethod decode(messages, finish_state=None)[source]¶

Convert the final messages (and optional finish state) into the agent's output.

Return type:

Any

stream(messages, include=None)[source]¶

Stream the agent execution, yielding chunks as they are generated.

This is the core streaming interface. Each chunk contains:

  • Standard LLM fields: text, think, tool_calls, tool_messages, etc.

  • Agent control fields: step, done, finish_state, messages

Parameters:
  • messages (Union[str, Dict[str, Any], Any, List[Union[str, Dict[str, Any], Any]]]) – Initial messages to start the agent.

  • include (Optional[List[Literal['text', 'think', 'tool_calls', 'content', 'message', 'structured', 'tool_messages', 'tool_results', 'delta_messages', 'messages']]]) – Fields to include in the stream. Defaults to common fields.

Yields:

AgentStreamChunk – Stream chunks with LLM output and agent state.

Return type:

Generator[Dict[str, Any], None, None]
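Consuming the stream might look like the following sketch; the chunk dicts here are stand-ins shaped like the documented fields (text, step, done, finish_state), not real LLM output:

```python
def consume(stream):
    # Accumulate text and capture the finish state from the final chunk.
    text, finish_state = [], None
    for chunk in stream:
        if "text" in chunk:
            text.append(chunk["text"])
        if chunk.get("done"):
            finish_state = chunk.get("finish_state")
    return "".join(text), finish_state

# Stand-in chunks mimicking AgentStreamChunk dicts.
chunks = [
    {"step": 0, "text": "Hel", "done": False},
    {"step": 0, "text": "lo", "done": True, "finish_state": {"steps": 1}},
]
```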

run(messages, include=None)[source]¶

Run the agent to completion, collecting all stream output.

This is a convenience wrapper around stream() that blocks until completion.

Parameters:
  • messages (Union[str, Dict[str, Any], Any, List[Union[str, Dict[str, Any], Any]]]) – Initial messages to start the agent.

  • include (Optional[List[Literal['text', 'think', 'tool_calls', 'content', 'message', 'structured', 'tool_messages', 'tool_results', 'delta_messages', 'messages']]]) – Fields to include (passed to stream).

Return type:

Tuple[Union[str, Dict[str, Any], Any, List[Union[str, Dict[str, Any], Any]]], Dict[str, Any]]

Returns:

Tuple of (final_messages, finish_state).

__call__(**inputs)[source]¶

Convenience method to encode, run, and decode in one call.

Parameters:

**inputs – Input arguments passed to encode().

Return type:

Any

Returns:

Decoded output from the agent.
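The encode → run → decode pipeline that __call__ wires together can be sketched with stand-in functions (run() here fakes a completed conversation rather than calling a real LLM):

```python
def encode(**inputs):
    return [{"role": "user", "content": inputs["question"]}]

def run(messages):
    # Stand-in for run(): pretend the LLM answered and finished.
    final = messages + [{"role": "assistant", "content": "4"}]
    return final, {"steps": 1}

def decode(messages, finish_state=None):
    return messages[-1]["content"]

def call(**inputs):
    # Mirrors __call__: encode inputs, run to completion, decode output.
    final_messages, finish_state = run(encode(**inputs))
    return decode(final_messages, finish_state)
```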

class ahvn.agent.base.BasePromptAgentSpec(prompt, tools=None, llm_args=None, max_steps=None, **kwargs)[source]¶

Bases: BaseAgentSpec

__init__(prompt, tools=None, llm_args=None, max_steps=None, **kwargs)[source]¶
encode(**inputs)[source]¶

Convert input arguments into Messages for the agent.

Parameters:

**inputs – Arbitrary input arguments.

Returns:

The encoded messages for the agent.

Return type:

Messages
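A sketch of how a prompt-based encode() might work, assuming the prompt is a format string filled from the input arguments (the actual BasePromptAgentSpec templating may differ):

```python
# Hypothetical prompt template; {text} is an illustrative placeholder.
prompt = "Summarize the following text:\n{text}"

def encode(**inputs):
    return [{"role": "user", "content": prompt.format(**inputs)}]

msgs = encode(text="Agents loop until done.")
```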