
Commit 4290483

yrobla authored and lukehinds committed
feat: expose lm studio configuration endpoint (#791)
When running from Docker, the LM Studio environment variables need to be exposed and parameterized so they can point at host.docker.internal.
1 parent b4e46cf · commit 4290483

7 files changed: +31 −1 lines changed


Dockerfile (+1)

```diff
@@ -112,6 +112,7 @@ ENV CODEGATE_VLLM_URL=
 ENV CODEGATE_OPENAI_URL=
 ENV CODEGATE_ANTHROPIC_URL=
 ENV CODEGATE_OLLAMA_URL=http://host.docker.internal:11434
+ENV CODEGATE_LM_STUDIO_URL=http://host.docker.internal:1234
 ENV CODEGATE_APP_LOG_LEVEL=WARNING
 ENV CODEGATE_LOG_FORMAT=TEXT
 
```
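
This default only takes effect inside the container. A minimal sketch of overriding it at run time, per the commit message's host.docker.internal rationale (the image name and port mapping are assumptions, not part of this commit):

```bash
# Run CodeGate in Docker and point it at an LM Studio server on the host.
# host.docker.internal resolves to the Docker host from inside the container;
# on Linux, pass --add-host=host.docker.internal:host-gateway to enable it.
docker run --name codegate -d -p 8989:8989 \
  -e CODEGATE_LM_STUDIO_URL=http://host.docker.internal:1234 \
  ghcr.io/stacklok/codegate:latest
```

Leaving the variable unset falls back to the `ENV` default baked in above.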

config.yaml.example (+1)

```diff
@@ -26,6 +26,7 @@ provider_urls:
 # --vllm-url
 # --openai-url
 # --anthropic-url
+# --lm-studio-url
 
 # Certificate configuration
 certs_dir: "./certs" # Directory for certificate files
```
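
The example file only documents the new flag name, but the same provider URL can live in a real config file. A minimal sketch, assuming the server accepts a `--config` path (that flag is not part of this diff):

```bash
# Hypothetical: append an LM Studio entry to a local config.yaml and use it.
# The lm_studio key matches the one registered in src/codegate/cli.py below.
cat >> config.yaml <<'EOF'
provider_urls:
  lm_studio: "http://localhost:1234"
EOF
codegate serve --config config.yaml
```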

docs/cli.md (+11)

````diff
@@ -71,6 +71,11 @@ codegate serve [OPTIONS]
   - Base URL for Ollama provider (/api path is added automatically)
   - Overrides configuration file and environment variables
 
+- `--lm-studio-url TEXT`: LM Studio provider URL (default: `http://localhost:1234`)
+  - Optional
+  - Base URL for LM Studio provider (/v1 path is added automatically)
+  - Overrides configuration file and environment variables
+
 - `--model-base-path TEXT`: Base path for loading models needed for the system
   - Optional
 
@@ -199,6 +204,12 @@ Start server with custom Ollama endpoint:
 codegate serve --ollama-url http://localhost:11434
 ```
 
+Start server with custom LM Studio endpoint:
+
+```bash
+codegate serve --lm-studio-url https://lmstudio.example.com
+```
+
 Show default system prompts:
 
 ```bash
````
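
Since `/v1` is appended automatically, the value given to `--lm-studio-url` should be the bare base URL. One way to sanity-check that URL first, relying on LM Studio's OpenAI-compatible API (an assumption of this sketch, not shown in the diff):

```bash
# A JSON model list in the response means the base URL is reachable
# and suitable for --lm-studio-url.
curl http://localhost:1234/v1/models
```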

docs/configuration.md (+4)

````diff
@@ -29,6 +29,7 @@ Values from higher-priority sources take precedence over lower-priority values.
   - OpenAI: `"https://api.openai.com/v1"`
   - Anthropic: `"https://api.anthropic.com/v1"`
   - Ollama: `"http://localhost:11434"`
+  - LM Studio: `"http://localhost:1234"`
 - Certificate configuration:
   - Certs directory: `"./certs"`
   - CA certificate: `"ca.crt"`
@@ -59,6 +60,7 @@ provider_urls:
   openai: "https://api.openai.com/v1"
   anthropic: "https://api.anthropic.com/v1"
   ollama: "http://localhost:11434"
+  lm_studio: "http://localhost:1234"
 certs_dir: "./certs"
 ca_cert: "ca.crt"
 ca_key: "ca.key"
@@ -80,6 +82,7 @@ Environment variables are automatically loaded with these mappings:
 - `CODEGATE_PROVIDER_OPENAI_URL`: OpenAI provider URL
 - `CODEGATE_PROVIDER_ANTHROPIC_URL`: Anthropic provider URL
 - `CODEGATE_PROVIDER_OLLAMA_URL`: Ollama provider URL
+- `CODEGATE_PROVIDER_LM_STUDIO_URL`: LM Studio provider URL
 - `CODEGATE_CERTS_DIR`: directory for certificate files
 - `CODEGATE_CA_CERT`: CA certificate file name
 - `CODEGATE_CA_KEY`: CA key file name
@@ -139,6 +142,7 @@ Provider URLs can be configured in several ways:
 export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
 export CODEGATE_PROVIDER_ANTHROPIC_URL=https://api.anthropic.com/v1
 export CODEGATE_PROVIDER_OLLAMA_URL=http://localhost:11434
+export CODEGATE_PROVIDER_LM_STUDIO_URL=http://localhost:1234
 ```
 
 3. CLI flags:
````
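
The new variable follows the existing `CODEGATE_PROVIDER_<NAME>_URL` convention, with the `lm_studio` key upper-cased. For example, configuring the provider purely through the environment (values illustrative):

```bash
# No CLI flag needed: the env var alone selects the LM Studio endpoint.
export CODEGATE_PROVIDER_LM_STUDIO_URL=http://host.docker.internal:1234
codegate serve
```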

docs/development.md (+3)

````diff
@@ -232,6 +232,8 @@ docker run --name codegate -d -v /path/to/volume:/app/codegate_volume -p 8989:89
   [https://api.anthropic.com/v1](https://api.anthropic.com/v1))
 - CODEGATE_OLLAMA_URL: URL for OLlama inference engine (defaults to
   [http://localhost:11434/api](http://localhost:11434/api))
+- CODEGATE_LM_STUDIO_URL: URL for LM Studio inference engine (defaults to
+  [http://localhost:1234/api](http://localhost:1234/api))
 - CODEGATE_APP_LOG_LEVEL: Level of debug desired when running the codegate
   server (defaults to WARNING, can be ERROR/WARNING/INFO/DEBUG)
 - CODEGATE_LOG_FORMAT: Type of log formatting desired when running the codegate
@@ -312,6 +314,7 @@ Provider URLs can be configured through:
 export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
 export CODEGATE_PROVIDER_ANTHROPIC_URL=https://api.anthropic.com/v1
 export CODEGATE_PROVIDER_OLLAMA_URL=http://localhost:11434
+export CODEGATE_PROVIDER_LM_STUDIO_URL=http://localhost:1234
 ```
 
 3. CLI flags:
````

scripts/entrypoint.sh (+1)

```diff
@@ -44,6 +44,7 @@ start_application() {
     [ -n "$CODEGATE_ANTHROPIC_URL" ] && CMD_ARGS+=" --anthropic-url $CODEGATE_ANTHROPIC_URL"
     [ -n "$CODEGATE_OLLAMA_URL" ] && CMD_ARGS+=" --ollama-url $CODEGATE_OLLAMA_URL"
     [ -n "$CODEGATE_VLLM_URL" ] && CMD_ARGS+=" --vllm-url $CODEGATE_VLLM_URL"
+    [ -n "$CODEGATE_LM_STUDIO_URL" ] && CMD_ARGS+=" --lm-studio-url $CODEGATE_LM_STUDIO_URL"
 
     # Check and append debug level if set
     [ -n "$CODEGATE_APP_LOG_LEVEL" ] && CMD_ARGS+=" --log-level $CODEGATE_APP_LOG_LEVEL"
```

src/codegate/cli.py (+10 −1)

```diff
@@ -192,6 +192,12 @@ def show_prompts(prompts: Optional[Path]) -> None:
     default=None,
     help="Ollama provider URL (default: http://localhost:11434/)",
 )
+@click.option(
+    "--lm-studio-url",
+    type=str,
+    default=None,
+    help="LM Studio provider URL (default: http://localhost:1234/)",
+)
 @click.option(
     "--model-base-path",
     type=str,
@@ -246,7 +252,7 @@ def show_prompts(prompts: Optional[Path]) -> None:
     default=None,
     help="Path to the vector SQLite database file (default: ./sqlite_data/vectordb.db)",
 )
-def serve(
+def serve(  # noqa: C901
     port: Optional[int],
     proxy_port: Optional[int],
     host: Optional[str],
@@ -258,6 +264,7 @@ def serve(
     openai_url: Optional[str],
     anthropic_url: Optional[str],
     ollama_url: Optional[str],
+    lm_studio_url: Optional[str],
     model_base_path: Optional[str],
     embedding_model: Optional[str],
     db_path: Optional[str],
@@ -280,6 +287,8 @@ def serve(
         cli_provider_urls["anthropic"] = anthropic_url
     if ollama_url:
         cli_provider_urls["ollama"] = ollama_url
+    if lm_studio_url:
+        cli_provider_urls["lm_studio"] = lm_studio_url
 
     # Load configuration with priority resolution
     cfg = Config.load(
```
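
Because `cli_provider_urls` is populated only for flags that were actually passed, a CLI value cleanly overrides anything picked up from the config file or environment during `Config.load`. A hedged illustration of that precedence (URLs are examples):

```bash
# The env var sets a baseline; the explicit flag should win during
# priority resolution, so the host.docker.internal URL is the one used.
export CODEGATE_PROVIDER_LM_STUDIO_URL=http://localhost:1234
codegate serve --lm-studio-url http://host.docker.internal:1234
```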
