ahvn.utils.exts.autofunc module

autofunc utilities for AgentHeaven.

This module provides the autofunc function, which creates callables whose implementations are produced automatically by Large Language Models (LLMs) from a function specification and the call inputs.

The function prompts the LLM to act as a skillful Python expert and produce output for the given function specification and inputs. Function specifications may be supplied as either a Callable or a ToolSpec.

ahvn.utils.exts.autofunc.autofunc(func_spec=None, prompt=None, system=None, descriptions=None, examples=None, instructions=None, composer='autofunc', lang=None, llm_args=None, search_args=None, capture=None, **kwargs)[source]

Create a function that is automatically implemented using LLM inference.

This function prompts the LLM to act as a skillful Python expert and produce output for the given function specification and inputs. It uses PromptUKFT for template rendering and structured prompt generation.

Can be used as a decorator or as a regular function call.
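Supporting both the bare-decorator and decorator-with-arguments forms requires a small dispatch on whether the first positional argument is the decorated function. The library's real implementation is not shown here; the following is a minimal, generic sketch of how such a dual-use decorator is typically structured (the name autofunc_like and the echoed return value are illustrative stand-ins for the LLM call):

```python
import functools

def autofunc_like(func=None, **options):
    """Illustrative only: accept both @autofunc_like and @autofunc_like(...)."""
    def wrap(f):
        @functools.wraps(f)
        def inner(**kwargs):
            # A real implementation would build a prompt and call an LLM here;
            # this stub just echoes its inputs to show the call path.
            return {"func": f.__name__, "inputs": kwargs, "options": options}
        return inner
    if func is not None:
        # Used as @autofunc_like with no parentheses: func is the decorated
        # function, so wrap it immediately.
        return wrap(func)
    # Used as @autofunc_like(**options): return the decorator itself.
    return wrap
```

The `func is not None` check is what lets a single entry point serve both usages shown in the Examples section below.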

Parameters:
  • func_spec (Union[Callable, ToolSpec], optional) – The function specification, given as a plain callable or a ToolSpec.

  • prompt (Optional[PromptUKFT]) – A pre-defined PromptUKFT template to use for the function. If None, a default prompt is constructed from func_spec and the other parameters. If not None, the prompt is used directly and the other prompt-building parameters (func_spec, descriptions, system, examples, instructions) are ignored. (TODO: behavior of other parameters -> update prompt)

  • system (str, optional) – System prompt to guide the LLM’s behavior.

  • descriptions (Union[str, List[str]], optional) – Additional descriptions for the task.

  • examples (Iterable[Union[Dict[str, Any], CacheEntry]], optional) – Examples demonstrating the desired input-output behavior.

  • instructions (Union[str, List[str]], optional) – Additional instructions for the LLM.

  • composer (str, optional) – The prompt composer to use. Defaults to “autofunc”.

  • lang (str, optional) – Language code for localization.

  • llm_args (Dict, optional) – Arguments for the LLM model.

  • search_args (Dict, optional) – Arguments for searching examples from example sources. It is used only when examples is a KL example source (KLStore, KLEngine, KLBase).

  • capture (Dict, optional) – Capture settings for logging or debugging. If provided, execution details are stored in it under the following keys:

    - ‘prompt’: the constructed prompt object.

  • kwargs – Additional keyword arguments.

Returns:

A function that takes keyword arguments matching the function specification and returns the LLM-inferred output.

Return type:

Callable

Raises:

AutoFuncError – If the LLM fails to generate valid output or execution fails.

Examples

>>> # Usage 1: Direct function call
>>> def square(x: int) -> int:
...     '''Return the square of x.'''
...     pass
>>> f = autofunc(square, llm_args={"preset": "tiny"})
>>> f(x=5)
25
>>> # Usage 2: As a decorator with arguments
>>> @autofunc(examples=[{"inputs": {"x": 5}, "output": 25}], llm_args={"preset": "tiny"})
... def square(x: int) -> int:
...     '''Return the square of x.'''
...     pass
>>> square(x=4)
16
>>> # Usage 3: As a decorator without arguments
>>> @autofunc
... def add(x: int, y: int) -> int:
...     '''Add two numbers.'''
...     pass
>>> add(x=3, y=4)
7
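The library's actual prompt construction goes through PromptUKFT and the “autofunc” composer, which are not reproduced here. As an illustration of the underlying mechanism only, a hypothetical spec_to_prompt helper shows how a function's signature, docstring, and call inputs can be introspected and rendered into a plain-text prompt:

```python
import inspect

def spec_to_prompt(func, inputs):
    """Render a function's signature, docstring, and call inputs as a
    plain-text prompt (illustrative; not the library's PromptUKFT pipeline)."""
    sig = inspect.signature(func)          # e.g. "(x: int) -> int"
    doc = inspect.getdoc(func) or ""
    args = ", ".join(f"{k}={v!r}" for k, v in inputs.items())
    return (
        "You are a skillful Python expert.\n"
        f"Function: {func.__name__}{sig}\n"
        f"Docstring: {doc}\n"
        f"Call: {func.__name__}({args})\n"
        "Return only the function's output."
    )

def square(x: int) -> int:
    """Return the square of x."""

print(spec_to_prompt(square, {"x": 5}))
```

A prompt of this shape, combined with the descriptions, examples, and instructions parameters above, is the kind of input an LLM would complete to produce the function's return value.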
ahvn.utils.exts.autofunc.autofunc_prompt_composer(kl, func_spec, system=None, descriptions=None, examples=None, instructions=None, instance=None, search_args=None, **kwargs)[source]
Return type:

str

ahvn.utils.exts.autofunc.build_autofunc_base_prompt()[source]
Return type:

PromptUKFT