ahvn.utils.exts.autocode module

autocode utilities for AgentHeaven.

This module provides the autocode function, which creates static callable functions that are automatically implemented by Large Language Models (LLMs) from function specifications and examples.

The function generates a complete Python implementation from the specification and examples, then executes that code to produce a static callable function (no LLM is involved at call time).

ahvn.utils.exts.autocode.autocode(func_spec=None, prompt=None, system=None, descriptions=None, examples=None, instructions=None, env=None, composer='autocode', lang=None, llm_args=None, search_args=None, capture=None, **kwargs)[source]

Create a static function that is automatically generated using LLM code generation.

This function takes a function specification and examples, then uses an LLM to generate a complete implementation. The generated code is executed to return a static callable function (not LLM-based).

Can be used as a decorator or as a regular function call.

Parameters:
  • func_spec (Union[Callable, ToolSpec], optional) – The function specification.

  • prompt (Optional[PromptUKFT]) – A pre-defined PromptUKFT template to use for code generation. If None, a default prompt will be constructed using the provided func_spec and other parameters. If not None, the prompt will be used directly and other parameters (func_spec, descriptions, system, examples, instructions) will be ignored. (TODO: behavior of other parameters -> update prompt)

  • system (str, optional) – System prompt to guide the LLM’s behavior.

  • descriptions (Union[str, List[str]], optional) – Additional descriptions for the task.

  • examples (Iterable[Union[Dict[str, Any], CacheEntry]], optional) – Examples demonstrating the desired input-output behavior.

  • instructions (Union[str, List[str]], optional) – Additional instructions for the LLM.

  • env (Optional[Dict], optional) – The environment in which to execute the code. Defaults to None.

  • composer (str, optional) – The prompt composer to use. Defaults to “autocode”.

  • lang (str, optional) – Language code for localization.

  • llm_args (Dict, optional) – Arguments for the LLM model. Note that code generation should be called once and then reused. It is therefore strongly recommended to use a high-quality LLM and to enable caching, so that repeated code-generation calls are avoided.

  • search_args (Dict, optional) – Arguments for searching examples from example sources. Used only when examples is a KL example source (KLStore, KLEngine, KLBase).

  • capture (Dict, optional) – Capture settings for logging or debugging. If provided, it is used to capture execution details:

    - ‘prompt’: The constructed prompt object.

  • kwargs – Additional keyword arguments.
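
The core mechanism described above (executing LLM-generated source to obtain a static callable) can be sketched as follows. This is a hypothetical, self-contained illustration and not the actual ahvn implementation; the helper name build_static_function is invented for this sketch, and the env dict plays the role of the env parameter above:

```python
def build_static_function(generated_source: str, func_name: str, env: dict = None):
    """Execute generated source in an environment dict and return the named function."""
    namespace = dict(env or {})          # copy so the caller's env is not mutated
    exec(generated_source, namespace)    # run the generated module-level code
    return namespace[func_name]          # a plain static callable; no LLM at call time

# Pretend this string came back from the code-generation LLM:
source = "def square(x: int) -> int:\n    return x * x\n"
square = build_static_function(source, "square")
```

Once built, the function behaves like any hand-written Python function, which is why the documentation stresses generating once and reusing the result.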

Returns:

A static callable function generated from the LLM-generated code.

Return type:

Callable

Raises:

AutoFuncError – If the LLM fails to generate valid code or execution fails.

Examples

>>> @autocode(examples=[{"inputs": {"x": 5}, "output": 25}])
... def square(x: int) -> int:
...     pass
>>> square(x=4)
16
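
The dict entries passed to examples follow the shape {"inputs": {...}, "output": ...} shown above. As a hedged sketch (the helper passes_examples is hypothetical, not part of ahvn), checking a candidate implementation against such examples amounts to:

```python
def passes_examples(func, examples) -> bool:
    """Return True if func reproduces every {"inputs": ..., "output": ...} example."""
    return all(func(**ex["inputs"]) == ex["output"] for ex in examples)

def square(x: int) -> int:
    return x * x

passes_examples(square, [{"inputs": {"x": 5}, "output": 25}])  # True
```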
ahvn.utils.exts.autocode.autocode_prompt_composer(kl, func_spec, system=None, descriptions=None, examples=None, instructions=None, search_args=None, **kwargs)[source]

Return type:

str

ahvn.utils.exts.autocode.build_autocode_base_prompt()[source]

Return type:

PromptUKFT