Enhance tool calling to support multi-step orchestration
This commit improves the infinite-loop fix for Meta Llama tool calling by
replacing the overly restrictive single-tool-call limitation with
multi-step support that allows up to 8 sequential tool calls.
Changes:
- Add max_sequential_tool_calls parameter (default: 8) to OCIGenAIBase
- Implement a loop-detection algorithm that identifies when the same tool
is called repeatedly with identical arguments
- Replace the unconditional tool_choice="none" with conditional logic that
only forces a stop when the call limit is exceeded or a loop is detected
(see the sketch after this list)
- Otherwise allow the model's default behavior for continued tool calling
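
As a rough illustration, the conditional logic could look like the Python sketch below. The helper names (ToolCall, detect_tool_loop, resolve_tool_choice) are hypothetical and not the actual OCIGenAIBase internals; only the max_sequential_tool_calls parameter and the tool_choice="none" value come from this change.

```python
from dataclasses import dataclass
from typing import Any, Optional
import json


@dataclass
class ToolCall:
    """A single tool invocation requested by the model."""
    name: str
    arguments: dict[str, Any]


def detect_tool_loop(history: list[ToolCall], window: int = 2) -> bool:
    """Return True when the last `window + 1` calls all hit the same tool
    with identical arguments, i.e. the model appears to be looping."""
    if len(history) < window + 1:
        return False
    recent = history[-(window + 1):]
    first_key = (recent[0].name, json.dumps(recent[0].arguments, sort_keys=True))
    return all(
        (call.name, json.dumps(call.arguments, sort_keys=True)) == first_key
        for call in recent[1:]
    )


def resolve_tool_choice(
    history: list[ToolCall],
    max_sequential_tool_calls: int = 8,
) -> Optional[str]:
    """Force tool_choice="none" only when the sequential-call limit is
    reached or a loop is detected; otherwise return None so the model's
    default tool-calling behavior continues."""
    if len(history) >= max_sequential_tool_calls or detect_tool_loop(history):
        return "none"
    return None


if __name__ == "__main__":
    # Example: the same tool called three times with identical arguments
    # triggers the loop detector and forces tool_choice="none".
    calls = [ToolCall("get_status", {"host": "db-1"}) for _ in range(3)]
    print(resolve_tool_choice(calls))  # -> "none"
```

Returning None in the non-looping case leaves the provider's default tool-calling behavior untouched, which is what allows multi-step orchestration to continue.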
Benefits:
- Prevents infinite loops (original goal maintained)
- Enables multi-step tool orchestration (new capability)
- Fully backward compatible via the parameter's default value
- Configurable per use case
- Domain-agnostic implementation
Tested with integration tests showing successful multi-step workflows
for diagnostic and remediation scenarios.