Configuration data
LiveHub AI Framework automatically populates the following variables from the agent’s configuration:
- `llm_model` – the LLM model, for example, `gpt-4o-mini`
- `temperature` – the LLM temperature
- `max_tokens` – the maximum number of tokens the LLM can output
- `max_turns` – the maximum number of User/LLM interactions
- `documents` – True if the agent has access to documents
- `tools` – True if the agent is equipped with tools
- `subagents` – True if the agent has sub-agents defined
- `orchestration_mode` – the orchestration mode for multi-agent topologies (`consult`/`delegate`/`delegate_with_history`)
- `document_mode` – the document mode (`rag`/`doc_search`/`prompt`/`doc_get`)
Configuration data may be used in the agent’s prompt, welcome message, or Tools parameters, in the same way as agent variables. For example:
```
{{#if documents}}
If you receive extracts from the documents, use them to answer the questions.
Prioritize data from the extracts over prior knowledge.
{{/if}}
```
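Several configuration variables can also be combined in a single prompt fragment. The sketch below is a hypothetical example (the wording inside the conditionals is illustrative, not part of the framework) that uses the same template syntax as above:

```
{{#if tools}}
Use the available tools when they help answer the question.
{{/if}}
{{#if subagents}}
Route specialized requests to your sub-agents where appropriate.
{{/if}}
You are running on model {{llm_model}}, limited to {{max_tokens}} output tokens
and {{max_turns}} conversation turns.
```

Boolean variables such as `tools` and `subagents` are typically tested with `{{#if}}` blocks, while scalar variables such as `llm_model` and `max_tokens` are interpolated directly.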