diff --git a/docs/about/changelog.md b/docs/about/changelog.md
index aaa03ab..eea0445 100644
--- a/docs/about/changelog.md
+++ b/docs/about/changelog.md
@@ -13,6 +13,11 @@ Major features and changes are noted here. To review all updates, see the
Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)
+- **Cline support** - 28 Jan, 2025\
+ CodeGate version 0.1.14 adds support for
+ [Cline](https://github.com/cline/cline) with Anthropic, OpenAI, Ollama, and LM
+ Studio. See the [how-to guide](../how-to/use-with-cline.mdx) to learn more.
+
- **Workspaces** - 22 Jan, 2025\
Now available in CodeGate v0.1.12, workspaces help you organize and customize
your AI-assisted development. Learn more in
diff --git a/docs/how-to/configure.md b/docs/how-to/configure.md
index fab10c0..03b2cd5 100644
--- a/docs/how-to/configure.md
+++ b/docs/how-to/configure.md
@@ -20,14 +20,15 @@ docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
CodeGate supports the following parameters:
-| Parameter | Default value | Description |
-| :----------------------- | :---------------------------------- | :------------------------------------------------------------------------------------------------------------ |
-| `CODEGATE_OLLAMA_URL` | `http://host.docker.internal:11434` | Specifies the URL of an Ollama instance. Used when the provider in your plugin config is `ollama`. |
-| `CODEGATE_VLLM_URL` | `http://localhost:8000` | Specifies the URL of a model hosted by a vLLM server. Used when the provider in your plugin config is `vllm`. |
-| `CODEGATE_ANTHROPIC_URL` | `https://api.anthropic.com/v1` | Specifies the Anthropic engine API endpoint URL. |
-| `CODEGATE_OPENAI_URL` | `https://api.openai.com/v1` | Specifies the OpenAI engine API endpoint URL. |
-| `CODEGATE_APP_LOG_LEVEL` | `WARNING` | Sets the logging level. Valid values: `ERROR`, `WARNING`, `INFO`, `DEBUG` (case sensitive) |
-| `CODEGATE_LOG_FORMAT` | `TEXT` | Type of log formatting. Valid values: `TEXT`, `JSON` (case sensitive) |
+| Parameter | Default value | Description |
+| :----------------------- | :---------------------------------- | :----------------------------------------------------------------------------------------- |
+| `CODEGATE_APP_LOG_LEVEL` | `WARNING`                            | Sets the logging level. Valid values: `ERROR`, `WARNING`, `INFO`, `DEBUG` (case sensitive). |
+| `CODEGATE_LOG_FORMAT`    | `TEXT`                               | Type of log formatting. Valid values: `TEXT`, `JSON` (case sensitive).                      |
+| `CODEGATE_ANTHROPIC_URL` | `https://api.anthropic.com/v1` | Specifies the Anthropic engine API endpoint URL. |
+| `CODEGATE_LM_STUDIO_URL` | `http://host.docker.internal:1234` | Specifies the URL of your LM Studio server. |
+| `CODEGATE_OLLAMA_URL` | `http://host.docker.internal:11434` | Specifies the URL of your Ollama instance. |
+| `CODEGATE_OPENAI_URL` | `https://api.openai.com/v1` | Specifies the OpenAI engine API endpoint URL. |
+| `CODEGATE_VLLM_URL` | `http://localhost:8000` | Specifies the URL of the vLLM server to use. |
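+
+For example, the following command launches CodeGate with debug logging and a
+custom Ollama endpoint. The hostname `my-ollama-host` is a placeholder; adjust
+the URL for your environment:
+
+```bash
+docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
+  -e CODEGATE_APP_LOG_LEVEL=DEBUG \
+  -e CODEGATE_OLLAMA_URL=http://my-ollama-host:11434 \
+  --restart unless-stopped ghcr.io/stacklok/codegate:latest
+```
+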
## Example: Use CodeGate with OpenRouter
diff --git a/docs/how-to/install.md b/docs/how-to/install.md
index 50f65cd..ad933c0 100644
--- a/docs/how-to/install.md
+++ b/docs/how-to/install.md
@@ -42,7 +42,8 @@ application settings, see [Configure CodeGate](./configure.md)
### Alternative run commands {#examples}
-Run with minimal functionality for use with **Continue** or **aider**:
+Run with minimal functionality for use with **Continue**, **aider**, or
+**Cline**:
```bash
docker run -d -p 8989:8989 -p 9090:9090 --restart unless-stopped ghcr.io/stacklok/codegate:latest
@@ -150,15 +151,17 @@ persistent volume.
Now that CodeGate is running, proceed to configure your IDE integration.
-- [Use CodeGate with GitHub Copilot](./use-with-copilot.mdx)
- [Use CodeGate with aider](./use-with-aider.mdx)
+- [Use CodeGate with Cline](./use-with-cline.mdx)
- [Use CodeGate with Continue](./use-with-continue.mdx)
+- [Use CodeGate with GitHub Copilot](./use-with-copilot.mdx)
## Remove CodeGate
If you decide to stop using CodeGate, follow the removal steps for your IDE
integration:
-- [Remove CodeGate - GitHub Copilot](./use-with-copilot.mdx#remove-codegate)
- [Remove CodeGate - aider](./use-with-aider.mdx#remove-codegate)
+- [Remove CodeGate - Cline](./use-with-cline.mdx#remove-codegate)
- [Remove CodeGate - Continue](./use-with-continue.mdx#remove-codegate)
+- [Remove CodeGate - GitHub Copilot](./use-with-copilot.mdx#remove-codegate)
diff --git a/docs/how-to/use-with-aider.mdx b/docs/how-to/use-with-aider.mdx
index 517b947..7fd7137 100644
--- a/docs/how-to/use-with-aider.mdx
+++ b/docs/how-to/use-with-aider.mdx
@@ -15,7 +15,7 @@ CodeGate works with the following AI model providers through aider:
- Local / self-managed:
- [Ollama](https://ollama.com/)
- Hosted:
- - [OpenAI](https://openai.com/api/)
+ - [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs
:::note
diff --git a/docs/how-to/use-with-cline.mdx b/docs/how-to/use-with-cline.mdx
new file mode 100644
index 0000000..d95344b
--- /dev/null
+++ b/docs/how-to/use-with-cline.mdx
@@ -0,0 +1,128 @@
+---
+title: Use CodeGate with Cline
+description: Configure the Cline IDE extension
+sidebar_label: Use with Cline
+sidebar_position: 90
+---
+
+import useBaseUrl from '@docusaurus/useBaseUrl';
+import ThemedImage from '@theme/ThemedImage';
+
+[Cline](https://github.com/cline/cline) is an autonomous coding agent for Visual
+Studio Code that supports numerous API providers and models.
+
+CodeGate works with the following AI model providers through Cline:
+
+- Local / self-managed:
+ - [Ollama](https://ollama.com/)
+ - [LM Studio](https://lmstudio.ai/)
+- Hosted:
+ - [Anthropic](https://www.anthropic.com/api)
+ - [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs
+
+## Install the Cline extension
+
+The Cline extension is available in the
+[Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=saoudrizwan.claude-dev).
+
+Install the extension using the **Install** link on the Marketplace page or
+search for "Cline" in the Extensions panel within VS Code.
+
+You can also install from the CLI:
+
+```bash
+code --install-extension saoudrizwan.claude-dev
+```
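+
+To confirm the extension is installed, list your extensions and look for
+Cline's identifier:
+
+```bash
+code --list-extensions | grep saoudrizwan.claude-dev
+```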
+
+If you need help, see
+[Managing Extensions](https://code.visualstudio.com/docs/editor/extension-marketplace)
+in the VS Code documentation.
+
+## Configure Cline to use CodeGate
+
+import ClineProviders from '../partials/_cline-providers.mdx';
+
+To configure Cline to send requests through CodeGate:
+
+1. Open the Cline extension sidebar from the VS Code Activity Bar, then click
+   the gear icon to open its settings.
+
+
+
+1. Select your provider and configure it as described below:
+
+<ClineProviders />
+
+1. Click **Done** to save the settings.
+
+## Verify configuration
+
+To verify that you've successfully connected Cline to CodeGate, open the Cline
+sidebar and type `codegate version`. You should receive a response like
+"CodeGate version 0.1.14":
+
+
+
+Try asking CodeGate about a known malicious Python package:
+
+```plain title="Cline chat"
+Tell me how to use the invokehttp package from PyPI
+```
+
+CodeGate responds with a warning and a link to the Stacklok Insight report about
+this package:
+
+```plain title="Cline chat"
+Warning: CodeGate detected one or more malicious, deprecated or archived packages.
+
+ • invokehttp: https://www.insight.stacklok.com/report/pypi/invokehttp
+
+The `invokehttp` package from PyPI has been identified as malicious and should
+not be used. Please avoid using this package and consider using a trusted
+alternative such as `requests` for making HTTP requests in Python.
+
+Here is an example of how to use the `requests` package:
+
+...
+```
+
+## Next steps
+
+Learn more about CodeGate's features and how to use them:
+
+- [Access the dashboard](./dashboard.md)
+- [CodeGate features](../features/index.mdx)
+
+## Remove CodeGate
+
+If you decide to stop using CodeGate, follow these steps to remove it and revert
+your environment.
+
+1. Remove the custom base URL from your Cline provider settings.
+
+1. Stop and remove the CodeGate container:
+
+ ```bash
+ docker stop codegate && docker rm codegate
+ ```
+
+1. If you launched CodeGate with a persistent volume, delete it to remove the
+ CodeGate database and other files:
+
+ ```bash
+ docker volume rm codegate_volume
+ ```
diff --git a/docs/index.md b/docs/index.md
index 0049c08..b2b3abd 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -54,7 +54,13 @@ AI coding assistants / IDEs:
- Anthropic
- OpenAI
-- **[Aider](./how-to/use-with-aider.mdx)** with Ollama and OpenAI
+- **[Aider](./how-to/use-with-aider.mdx)** with Ollama and OpenAI-compatible
+ APIs
+
+- **[Cline](./how-to/use-with-cline.mdx)** in Visual Studio Code
+
+ CodeGate supports Ollama, Anthropic, OpenAI-compatible APIs, and LM Studio
+ with Cline.
As the project evolves, we plan to add support for more IDE assistants and AI
model providers.
diff --git a/docs/partials/.markdownlint.json b/docs/partials/.markdownlint.json
new file mode 100644
index 0000000..62baab5
--- /dev/null
+++ b/docs/partials/.markdownlint.json
@@ -0,0 +1,3 @@
+{
+ "first-line-h1": false
+}
diff --git a/docs/partials/_aider-providers.mdx b/docs/partials/_aider-providers.mdx
index 064c059..9b2308d 100644
--- a/docs/partials/_aider-providers.mdx
+++ b/docs/partials/_aider-providers.mdx
@@ -1,10 +1,14 @@
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
+import LocalModelRecommendation from './_local-model-recommendation.md';
+
You need an [OpenAI API](https://openai.com/api/) account to use this provider.
+To use a different OpenAI-compatible endpoint, set the `CODEGATE_OPENAI_URL`
+[configuration parameter](../how-to/configure.md#config-parameters).
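+
+For example, to route requests through OpenRouter's OpenAI-compatible API
+(shown as an illustration; any compatible endpoint works the same way):
+
+```bash
+docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
+  -e CODEGATE_OPENAI_URL=https://openrouter.ai/api/v1 \
+  --restart unless-stopped ghcr.io/stacklok/codegate:latest
+```
+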
Before you run aider, set environment variables for your API key and the API
base URL, pointing the base URL to CodeGate's API port. Alternately, use one of
aider's other
@@ -58,7 +62,7 @@ You need Ollama installed on your local system with the server running
CodeGate connects to `http://host.docker.internal:11434` by default. If you
changed the default Ollama server port or want to connect to a remote Ollama
instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
-set to the correct URL. See [Configure CodeGate](/how-to/configure.md).
+set to the correct URL. See [Configure CodeGate](../how-to/configure.md).
Before you run aider, set the Ollama base URL to CodeGate's API port using an
environment variable. Alternately, use one of aider's other
@@ -105,13 +109,7 @@ aider --model ollama_chat/<model name>
Replace `<model name>` with the name of a coding model you have installed
locally using `ollama pull`.
-We recommend the [Qwen2.5-Coder](https://ollama.com/library/qwen2.5-coder)
-series of models. Our minimum recommendation for quality results is the 7
-billion parameter (7B) version, `qwen2.5-coder:7b`.
-
-This model balances performance and quality for typical systems with at least 4
-CPU cores and 16GB of RAM. If you have more compute resources available, our
-experimentation shows that larger models do yield better results.
+<LocalModelRecommendation />
For more information, see the
[aider docs for connecting to Ollama](https://aider.chat/docs/llms/ollama.html).
diff --git a/docs/partials/_cline-providers.mdx b/docs/partials/_cline-providers.mdx
new file mode 100644
index 0000000..071d3d3
--- /dev/null
+++ b/docs/partials/_cline-providers.mdx
@@ -0,0 +1,124 @@
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+import useBaseUrl from '@docusaurus/useBaseUrl';
+import ThemedImage from '@theme/ThemedImage';
+
+import LocalModelRecommendation from './_local-model-recommendation.md';
+
+
+
+
+You need an [Anthropic API](https://www.anthropic.com/api) account to use this
+provider.
+
+In the Cline settings, choose **Anthropic** as your provider, enter your
+Anthropic API key, and choose your preferred model (we recommend
+`claude-3-5-sonnet-`).
+
+To enable CodeGate, enable **Use custom base URL** and enter
+`http://localhost:8989/anthropic`.
+
+
+
+
+
+
+You need an [OpenAI API](https://openai.com/api/) account to use this provider.
+To use a different OpenAI-compatible endpoint, set the `CODEGATE_OPENAI_URL`
+[configuration parameter](../how-to/configure.md#config-parameters) when you
+launch CodeGate.
+
+In the Cline settings, choose **OpenAI Compatible** as your provider, enter your
+OpenAI API key, and set your preferred model (example: `gpt-4o-mini`).
+
+To enable CodeGate, set the **Base URL** to `http://localhost:8989/openai`.
+
+
+
+
+
+
+You need Ollama installed on your local system with the server running
+(`ollama serve`) to use this provider.
+
+CodeGate connects to `http://host.docker.internal:11434` by default. If you
+changed the default Ollama server port or want to connect to a remote Ollama
+instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
+set to the correct URL. See [Configure CodeGate](../how-to/configure.md).
+
+In the Cline settings, choose **Ollama** as your provider and set the **Base
+URL** to `http://localhost:8989/ollama`.
+
+For the **Model ID**, provide the name of a coding model you have installed
+locally using `ollama pull`.
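+
+For example, to pull the model recommended below:
+
+```bash
+ollama pull qwen2.5-coder:7b-instruct
+```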
+
+
+
+
+
+
+
+
+You need LM Studio installed on your local system with a server running from LM
+Studio's **Developer** tab to use this provider. See the
+[LM Studio docs](https://lmstudio.ai/docs/api/server) for more information.
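+
+If you prefer the command line, LM Studio's `lms` CLI can also start the local
+server (this assumes you've bootstrapped the CLI as described in the LM Studio
+docs):
+
+```bash
+lms server start
+```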
+
+Cline uses large prompts, so you will likely need to increase the context length
+for the model you've loaded in LM Studio. In the Developer tab, select the model
+you'll use with CodeGate, open the **Load** tab on the right and increase the
+**Context Length** to _at least_ 18k (18,432) tokens, then reload the model.
+
+
+
+CodeGate connects to `http://host.docker.internal:1234` by default. If you
+changed the default LM Studio server port, launch CodeGate with the
+`CODEGATE_LM_STUDIO_URL` environment variable set to the correct URL. See
+[Configure CodeGate](../how-to/configure.md).
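+
+For example, if you moved the LM Studio server to port 1235 (a hypothetical
+port for illustration):
+
+```bash
+docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
+  -e CODEGATE_LM_STUDIO_URL=http://host.docker.internal:1235 \
+  --restart unless-stopped ghcr.io/stacklok/codegate:latest
+```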
+
+In the Cline settings, choose **LM Studio** as your provider and set the **Base
+URL** to `http://localhost:8989/openai`.
+
+Set the **Model ID** to `lm_studio/<model_name>`, where `<model_name>` is the
+name of the model you're serving through LM Studio (shown in the Developer
+tab), for example `lm_studio/qwen2.5-coder-7b-instruct`.
+
+
+
+
+
+
+
diff --git a/docs/partials/_local-model-recommendation.md b/docs/partials/_local-model-recommendation.md
new file mode 100644
index 0000000..a16f710
--- /dev/null
+++ b/docs/partials/_local-model-recommendation.md
@@ -0,0 +1,6 @@
+We recommend the [Qwen2.5-Coder](https://ollama.com/library/qwen2.5-coder)
+series of models. Our minimum recommendation for quality results is the 7
+billion parameter (7B) version, `qwen2.5-coder:7b-instruct`. This model balances
+performance and quality for systems with at least 4 CPU cores and 16GB of RAM.
+If you have more compute resources available, our experimentation shows that
+larger models do yield better results.
diff --git a/static/img/how-to/cline-codegate-version-dark.webp b/static/img/how-to/cline-codegate-version-dark.webp
new file mode 100644
index 0000000..74fd913
Binary files /dev/null and b/static/img/how-to/cline-codegate-version-dark.webp differ
diff --git a/static/img/how-to/cline-codegate-version-light.webp b/static/img/how-to/cline-codegate-version-light.webp
new file mode 100644
index 0000000..f17062c
Binary files /dev/null and b/static/img/how-to/cline-codegate-version-light.webp differ
diff --git a/static/img/how-to/cline-provider-anthropic-dark.webp b/static/img/how-to/cline-provider-anthropic-dark.webp
new file mode 100644
index 0000000..bfb43c4
Binary files /dev/null and b/static/img/how-to/cline-provider-anthropic-dark.webp differ
diff --git a/static/img/how-to/cline-provider-anthropic-light.webp b/static/img/how-to/cline-provider-anthropic-light.webp
new file mode 100644
index 0000000..89a8d8a
Binary files /dev/null and b/static/img/how-to/cline-provider-anthropic-light.webp differ
diff --git a/static/img/how-to/cline-provider-lmstudio-dark.webp b/static/img/how-to/cline-provider-lmstudio-dark.webp
new file mode 100644
index 0000000..ef683ee
Binary files /dev/null and b/static/img/how-to/cline-provider-lmstudio-dark.webp differ
diff --git a/static/img/how-to/cline-provider-lmstudio-light.webp b/static/img/how-to/cline-provider-lmstudio-light.webp
new file mode 100644
index 0000000..79b9c9e
Binary files /dev/null and b/static/img/how-to/cline-provider-lmstudio-light.webp differ
diff --git a/static/img/how-to/cline-provider-ollama-dark.webp b/static/img/how-to/cline-provider-ollama-dark.webp
new file mode 100644
index 0000000..dddcd3c
Binary files /dev/null and b/static/img/how-to/cline-provider-ollama-dark.webp differ
diff --git a/static/img/how-to/cline-provider-ollama-light.webp b/static/img/how-to/cline-provider-ollama-light.webp
new file mode 100644
index 0000000..dabc03d
Binary files /dev/null and b/static/img/how-to/cline-provider-ollama-light.webp differ
diff --git a/static/img/how-to/cline-provider-openai-dark.webp b/static/img/how-to/cline-provider-openai-dark.webp
new file mode 100644
index 0000000..4670021
Binary files /dev/null and b/static/img/how-to/cline-provider-openai-dark.webp differ
diff --git a/static/img/how-to/cline-provider-openai-light.webp b/static/img/how-to/cline-provider-openai-light.webp
new file mode 100644
index 0000000..dceb1c6
Binary files /dev/null and b/static/img/how-to/cline-provider-openai-light.webp differ
diff --git a/static/img/how-to/cline-settings-dark.webp b/static/img/how-to/cline-settings-dark.webp
new file mode 100644
index 0000000..ffddce5
Binary files /dev/null and b/static/img/how-to/cline-settings-dark.webp differ
diff --git a/static/img/how-to/cline-settings-light.webp b/static/img/how-to/cline-settings-light.webp
new file mode 100644
index 0000000..e0b3c39
Binary files /dev/null and b/static/img/how-to/cline-settings-light.webp differ
diff --git a/static/img/how-to/lmstudio-server-dark.webp b/static/img/how-to/lmstudio-server-dark.webp
new file mode 100644
index 0000000..4f396ac
Binary files /dev/null and b/static/img/how-to/lmstudio-server-dark.webp differ
diff --git a/static/img/how-to/lmstudio-server-light.webp b/static/img/how-to/lmstudio-server-light.webp
new file mode 100644
index 0000000..3a72b94
Binary files /dev/null and b/static/img/how-to/lmstudio-server-light.webp differ