
Conversation

@fede-kamel (Contributor) commented on Oct 31, 2025

Summary

Add support for parallel tool calling to enable models to execute multiple tools simultaneously, improving performance for multi-tool workflows.

Problem

The langchain-oracle SDK did not expose the OCI API's is_parallel_tool_calls parameter, forcing sequential tool execution even when tools could run in parallel.

Solution

Implemented a hybrid approach that allows both class-level defaults and per-binding overrides:

# Option 1: Class-level default
llm = ChatOCIGenAI(
    model_id="meta.llama-3.3-70b-instruct",  # Works with Meta, Llama, Grok, OpenAI, Mistral
    parallel_tool_calls=True
)

# Option 2: Per-binding override
llm_with_tools = llm.bind_tools(
    [tool1, tool2, tool3],
    parallel_tool_calls=True
)
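
Continuing the example above with a hedged usage sketch: the question string is illustrative, tool1–tool3 stand for real @tool functions, and response.tool_calls is the standard LangChain AIMessage field for tool calls.

# With parallel tool calling enabled, a single model turn may return
# several tool calls at once instead of one per round trip.
response = llm_with_tools.invoke(
    "What is the weather in Paris and the population of Tokyo?"
)
for call in response.tool_calls:  # possibly more than one entry per turn
    print(call["name"], call["args"])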

Changes

  • Add parallel_tool_calls parameter to OCIGenAIBase (default: False)
  • Update bind_tools() method to accept parallel_tool_calls parameter
  • Update GenericProvider to pass is_parallel_tool_calls to the OCI API (see the sketch after this list)
  • Add validation for Cohere models (raises a clear error)
  • Add comprehensive documentation and examples
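
For illustration, a minimal sketch of how the flag is intended to flow from the class default through bind_tools() into the request payload. The class and attribute names below are hypothetical stand-ins, not the actual langchain-oracle internals; only the is_parallel_tool_calls field name comes from the OCI API.

from typing import Any, Optional, Sequence

class ParallelToolCallsSketch:
    """Hypothetical stand-in for ChatOCIGenAI's parameter handling."""

    def __init__(self, parallel_tool_calls: bool = False) -> None:
        # Class-level default; False preserves the existing sequential behavior.
        self.parallel_tool_calls = parallel_tool_calls

    def bind_tools(
        self,
        tools: Sequence[Any],
        parallel_tool_calls: Optional[bool] = None,
    ) -> dict:
        # A per-binding value overrides the class default; None means "inherit".
        effective = (
            self.parallel_tool_calls
            if parallel_tool_calls is None
            else parallel_tool_calls
        )
        # The real provider would place this in the GenericChatRequest payload
        # as is_parallel_tool_calls (and reject it for Cohere models).
        return {"tools": list(tools), "is_parallel_tool_calls": effective}

llm = ParallelToolCallsSketch(parallel_tool_calls=True)
assert llm.bind_tools(["tool1"])["is_parallel_tool_calls"] is True
assert llm.bind_tools(["tool1"], parallel_tool_calls=False)["is_parallel_tool_calls"] is False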

Testing

Unit Tests (9/9 passing)

  • Class-level parameter setting
  • Default behavior verification
  • Explicit True/False in bind_tools
  • Class default usage and override (see the sketch after this list)
  • Parameter passed to OCI API
  • Cohere model validation
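
As an illustration of the precedence checks above, a pytest-style sketch; resolve_parallel_tool_calls is a hypothetical helper that mirrors the described behavior, not a function exported by langchain-oracle.

from typing import Optional

def resolve_parallel_tool_calls(class_default: bool, override: Optional[bool]) -> bool:
    """Per-binding override wins; otherwise fall back to the class-level default."""
    return class_default if override is None else override

def test_default_is_sequential():
    assert resolve_parallel_tool_calls(class_default=False, override=None) is False

def test_bind_tools_override_beats_class_default():
    assert resolve_parallel_tool_calls(class_default=False, override=True) is True
    assert resolve_parallel_tool_calls(class_default=True, override=False) is False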

Integration Tests (4/4 passing)

  • Parallel tool calling enabled
  • Sequential tool calling (baseline)
  • bind_tools override functionality
  • Cohere model error handling

All tests verified with live OCI GenAI API.

Backward Compatibility

Fully backward compatible

  • Default value is False (existing behavior)
  • Opt-in feature
  • No changes required to existing code

Benefits

  • Performance: Faster execution for multi-tool workflows
  • Flexibility: Both global defaults and per-binding control
  • Safety: Clear validation and error messages
  • Consistency: Follows existing parameter patterns

Model Support

Supported (GenericChatRequest models):

  • Meta Llama 3.1, 3.2, 3.3, 4.x
  • xAI Grok 3, 3 Mini, 4, 4 Fast
  • OpenAI gpt-oss models
  • Mistral models
  • Any model using GenericChatRequest

Unsupported:

  • Cohere models (CohereChatRequest - clear error message provided)

Add support for the parallel_tool_calls parameter to enable parallel
function calling in Meta/Llama models, improving performance for
multi-tool workflows.

## Changes

- Add parallel_tool_calls class parameter to OCIGenAIBase (default: False)
- Add parallel_tool_calls parameter to bind_tools() method
- Support hybrid approach: class-level default + per-binding override
- Pass is_parallel_tool_calls to OCI API in MetaProvider
- Add validation for Cohere models (raises error if attempted)

## Testing

- 9 comprehensive unit tests (all passing)
- 4 integration tests with live OCI API (all passing)
- No regression in existing tests

## Usage

Class-level default:
  llm = ChatOCIGenAI(
      model_id="meta.llama-3.3-70b-instruct",
      parallel_tool_calls=True
  )

Per-binding override:
  llm_with_tools = llm.bind_tools(
      [tool1, tool2, tool3],
      parallel_tool_calls=True
  )

## Benefits

- Up to N× speedup for N independent tool calls
- Backward compatible (default: False)
- Clear error messages for unsupported models
- Follows existing parameter patterns
The oracle-contributor-agreement bot added the OCA Verified label (all contributors have signed the Oracle Contributor Agreement) on Oct 31, 2025.
@fede-kamel (Contributor, Author) commented:
🔍 Verification: is_parallel_tool_calls is Meta/Llama Only

Verified through OCI API documentation that is_parallel_tool_calls is only available for Meta/Llama models, not Cohere.

API Documentation Findings

GenericChatRequest (Meta/Llama models): includes the is_parallel_tool_calls field.

CohereChatRequest (Cohere models): has no is_parallel_tool_calls field.

Conclusion

The implementation correctly restricts parallel_tool_calls to Meta/Llama models because:

  1. OCI API limitation - Parameter doesn't exist in CohereChatRequest
  2. Clear error handling - Our code raises a ValueError when attempted with Cohere (see the sketch below)
  3. Accurate documentation - README and docstrings note "Meta/Llama models only"

This is an OCI platform limitation, not a langchain-oracle implementation choice.
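
A hedged sketch of the kind of guard described in point 2; the helper name and the model_id prefix check are illustrative, not the exact langchain-oracle code.

def check_parallel_tool_calls_supported(model_id: str, parallel_tool_calls: bool) -> None:
    """Reject the flag for Cohere models, whose CohereChatRequest has no such field."""
    if parallel_tool_calls and model_id.startswith("cohere."):
        raise ValueError(
            "parallel_tool_calls is not supported for Cohere models: the OCI "
            "CohereChatRequest API has no is_parallel_tool_calls field. Use a "
            "GenericChatRequest model (e.g. Meta Llama, Grok, Mistral) instead."
        )

# Passes for a GenericChatRequest model; would raise for e.g. "cohere.command-r-plus".
check_parallel_tool_calls_supported("meta.llama-3.3-70b-instruct", True)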

Future Support

If OCI adds is_parallel_tool_calls to CohereChatRequest in the future, we can extend support by:

  1. Removing the Cohere validation check
  2. Adding parameter passing in CohereProvider
  3. Updating documentation

For now, supporting Meta/Llama models only is correct and properly documented.

…ol calling

- Update README to include all GenericChatRequest models (Grok, OpenAI, Mistral)
- Update code comments and docstrings
- Update error messages with complete model list
- Clarify that feature works with GenericChatRequest, not just Meta/Llama