This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Muxing issues with FIM in Continue #1005

Closed
@danbarr

Description

Describe the issue

Possibly related to #980, but I'm seeing problems with muxing and FIM in Continue beyond just OpenRouter.

With OpenRouter, I see the same problem via muxing as in #980: CodeGate appears to process the FIM event, but nothing shows up in the extension.

With Ollama (via mux), there is no error but also no output, and the logs are suspiciously thin: the prompt isn't logged at all and doesn't show up in the dashboard history.

2025-02-10T20:31:55.027709Z [info     ] No particilar client detected, using generic client lineno=244 module=detector pathname=/app/src/codegate/clients/detector.py
2025-02-10T20:31:55.028218Z [info     ] Catch all rule matched         lineno=90 module=rulematcher pathname=/app/src/codegate/muxing/rulematcher.py
2025-02-10T20:31:55.028354Z [info     ] Muxing request routed to destination provider lineno=91 model=qwen2.5-coder:7b module=router pathname=/app/src/codegate/muxing/router.py provider_name=Ollama provider_type=ollama
2025-02-10T20:31:55.028802Z [info     ] FIM pipeline selected for execution. lineno=154 module=base pathname=/app/src/codegate/providers/base.py
2025-02-10T20:31:55.037993Z [info     ] Total secrets redacted since last assistant message: 0 lineno=361 module=secrets pathname=/app/src/codegate/pipeline/secrets/secrets.py
2025-02-10T20:31:55.075064Z [info     ] Total PII instances redacted: 0 lineno=96 module=pii pathname=/app/src/codegate/pipeline/pii/pii.py
2025-02-10T20:31:55.075440Z [info     ] FIM pipeline selected for output. lineno=109 module=base pathname=/app/src/codegate/providers/base.py
2025-02-10T20:31:55.075511Z [info     ] No output pipeline steps configured, passing through lineno=123 module=base pathname=/app/src/codegate/providers/base.py

With OpenAI via the mux endpoint, I get an error:

HTTP 400 Bad Request from http://127.0.0.1:8989/v1/mux/completions {"detail":"litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': \"Invalid 'messages': empty array. Expected an array with minimum length 1, but got an empty array instead.\", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'empty_array'}}"}
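
The empty messages array suggests the mux is rewriting the FIM request into a chat-style payload and losing the content along the way. Here is a minimal sketch to reproduce this outside the IDE, assuming Continue sends FIM requests in OpenAI's legacy completions format with prompt and suffix fields (the endpoint comes from the error above; the placeholder model and key mirror the config under Steps to Reproduce):

import json
import urllib.error
import urllib.request

# FIM-style legacy completion request. The prompt/suffix shape is an
# assumption based on OpenAI's legacy /v1/completions API; the endpoint
# and placeholder credentials come from this issue.
payload = {
    "model": "fake-value-not-used",
    "prompt": "def add(a, b):\n    return ",  # text before the cursor
    "suffix": "\n",                           # text after the cursor
    "max_tokens": 64,
    "stream": False,
}

req = urllib.request.Request(
    "http://127.0.0.1:8989/v1/mux/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer fake-value-not-used",
    },
)

try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())
except urllib.error.HTTPError as e:
    # With an OpenAI mux rule this should surface the 400 above.
    print(e.code, e.read().decode())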

I don't have an Anthropic account to test with.

Steps to Reproduce

Continue autocomplete config (the tabAutocompleteModel entry in Continue's config.json):

  "tabAutocompleteModel": {
    "title": "CodeGate-Mux-Autocomplete",
    "provider": "openai",
    "model": "fake-value-not-used",
    "apiKey": "fake-value-not-used",
    "apiBase": "http://localhost:8989/v1/mux"
  }
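
As a companion check, a plain chat request through the same mux can confirm whether routing itself works, which would isolate the failure to the FIM path. This is a sketch only; the /v1/mux/chat/completions path is an assumption based on the OpenAI-compatible layout of the apiBase above:

import json
import urllib.error
import urllib.request

# Ordinary chat request through the mux. If this succeeds while the FIM
# sketch above fails, the regression is specific to how the mux
# translates FIM/completions requests.
payload = {
    "model": "fake-value-not-used",  # placeholder; the mux rule picks the model
    "messages": [{"role": "user", "content": "Say hello."}],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:8989/v1/mux/chat/completions",  # assumed chat path
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer fake-value-not-used",
    },
)

try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())
except urllib.error.HTTPError as e:
    print(e.code, e.read().decode())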

Operating System

macOS (Arm)

IDE and Version

VS Code 1.97.0

Extension and Version

Continue 0.8.68

Provider

Other

Model

Multiple

Codegate version

v0.1.18

Logs

No response

Additional Context

No response
