ahvn.llm.llm_utils module¶
- ahvn.llm.llm_utils.get_litellm_retryable_exceptions()[source]¶
Get retryable exceptions from litellm.
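The returned exception types are typically fed to a retry loop. A minimal sketch of such a loop, assuming the function returns a tuple of exception classes (the `FakeTimeout` class and `call_with_retries` helper here are hypothetical stand-ins, not part of this module):

```python
import time

def call_with_retries(fn, retryable, max_attempts=3, base_delay=0.1):
    """Retry fn() on the given retryable exception types with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Stand-in exception; in practice the tuple would come from
# get_litellm_retryable_exceptions().
class FakeTimeout(Exception):
    pass

attempts = []
def flaky():
    # Fails twice, then succeeds on the third attempt.
    attempts.append(1)
    if len(attempts) < 3:
        raise FakeTimeout()
    return "ok"

result = call_with_retries(flaky, (FakeTimeout,))
```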
- ahvn.llm.llm_utils.resolve_llm_config(preset=None, model=None, provider=None, **kwargs)[source]¶
Compile an LLM configuration dictionary based on the following order of priority:

1. kwargs
2. preset
3. provider
4. model
5. global configuration

When a parameter is specified in multiple places, the one with the highest priority is used. For example, if a parameter is specified in both kwargs and preset, the value from kwargs is used. When missing, the preset falls back to the default preset, the model falls back to the default model, and the provider falls back to the default provider of the model.
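The layered override can be sketched as a series of dict merges, applied lowest priority first so higher layers win. This is an illustrative re-implementation of the resolution order described above, not the module's actual code; the layer names and example values are assumptions:

```python
def resolve_config_sketch(global_cfg, model_cfg, provider_cfg, preset_cfg, kwargs):
    """Merge config layers so higher-priority layers override lower ones."""
    merged = {}
    # Lowest priority first: global < model < provider < preset < kwargs.
    for layer in (global_cfg, model_cfg, provider_cfg, preset_cfg, kwargs):
        merged.update({k: v for k, v in layer.items() if v is not None})
    return merged

cfg = resolve_config_sketch(
    global_cfg={"temperature": 0.0, "max_tokens": 1024},
    model_cfg={"model": "gpt-4o"},
    provider_cfg={"api_base": "https://api.example.com"},
    preset_cfg={"temperature": 0.7},
    kwargs={"temperature": 0.2},
)
# "temperature" from kwargs wins over preset and the global default,
# while unconflicting keys from every layer survive in the result.
```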
- ahvn.llm.llm_utils.format_messages(messages)[source]¶
Unify messages for LLM in diverse formats to OpenAI message format.
If messages is a single string, it is treated as a single user message.
If messages is a list, each item is processed as follows:
If the item is a litellm.Message object, it is converted to dict using its json() method.
If the item is a string, it is treated as a user message.
If the item is a dict, it is used as is, but must contain a “role” field.
If the item is of any other type, a TypeError is raised.
If a message dict contains “tool_calls”, its “function.arguments” field is converted to a JSON string if it is not already a string.
- Parameters:
messages (Union[str, Dict[str, Any], Any, List[Union[str, Dict[str, Any], Any]]]) – Messages in any supported format: a single string, a single dict or Message object, or a list of these
- Returns:
List of formatted messages in OpenAI format
- Return type:
List[dict]
- Raises:
ValueError – If messages are invalid or missing required fields
TypeError – If an unsupported message type is encountered
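The normalization rules above can be sketched as follows. This is an illustrative re-implementation, not the module's source; the litellm.Message branch is omitted, and the dict input is assumed to be handled like a one-element list:

```python
import json

def format_messages_sketch(messages):
    """Normalize messages to a list of OpenAI-style dicts (illustrative)."""
    if isinstance(messages, (str, dict)):
        messages = [messages]
    formatted = []
    for item in messages:
        if isinstance(item, str):
            # A bare string becomes a user message.
            msg = {"role": "user", "content": item}
        elif isinstance(item, dict):
            if "role" not in item:
                raise ValueError("message dict must contain a 'role' field")
            msg = dict(item)
        else:
            raise TypeError(f"unsupported message type: {type(item).__name__}")
        # Ensure tool-call arguments are JSON strings, as the OpenAI format expects.
        for call in msg.get("tool_calls") or []:
            args = call.get("function", {}).get("arguments")
            if args is not None and not isinstance(args, str):
                call["function"]["arguments"] = json.dumps(args)
        formatted.append(msg)
    return formatted

out = format_messages_sketch(
    ["hi", {"role": "assistant", "content": None,
            "tool_calls": [{"function": {"name": "f", "arguments": {"x": 1}}}]}]
)
```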