diff --git a/docs/about/changelog.md b/docs/about/changelog.md
index 0d4b86e..697d039 100644
--- a/docs/about/changelog.md
+++ b/docs/about/changelog.md
@@ -11,6 +11,13 @@ Major features and changes are noted here. To review all updates, see the
:::
+Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)
+
+- **Aider support** - 13 Jan, 2025\
+ CodeGate version 0.1.6 adds support for [Aider](https://aider.chat/), an LLM
+ pair programmer in your terminal. See the
+ [how-to guide](../how-to/use-with-aider.mdx) to learn more.
+
- **Semantic versioning for container image** - 8 Jan, 2025\
Starting with v0.1.4, the CodeGate container image is published with semantic
version tags corresponding to
diff --git a/docs/about/faq.md b/docs/about/faq.md
index f09d44d..cf15f5b 100644
--- a/docs/about/faq.md
+++ b/docs/about/faq.md
@@ -10,7 +10,8 @@ sidebar_position: 10
No, CodeGate works _with_ your AI code assistant, as a local intermediary
between your client and the LLM it's communicating with.
-### Does CodeGate work with any plugins other than Copilot and Continue?
+### Does CodeGate work with any other IDE plugins or coding assistants?
-Currently, CodeGate works with GitHub Copilot and Continue. We are actively
-exploring additional integrations based on user feedback.
+We are actively exploring additional integrations based on user feedback.
+[Join the community on Discord](https://discord.gg/stacklok) to let us know
+about your favorite AI coding tool!
diff --git a/docs/how-to/install.md b/docs/how-to/install.md
index 6a2dac0..554d16f 100644
--- a/docs/how-to/install.md
+++ b/docs/how-to/install.md
@@ -42,7 +42,7 @@ application settings, see [Configure CodeGate](./configure.md)
### Alternative run commands {#examples}
-Run with minimal functionality for use with **Continue**:
+Run with minimal functionality for use with **Continue** or **Aider**:
```bash
docker run -d -p 8989:8989 -p 9090:9090 --restart unless-stopped ghcr.io/stacklok/codegate:latest
@@ -152,6 +152,7 @@ Now that CodeGate is running, proceed to configure your IDE integration.
- [Use CodeGate with GitHub Copilot](./use-with-copilot.mdx)
- [Use CodeGate with Continue](./use-with-continue.mdx)
+- [Use CodeGate with Aider](./use-with-aider.mdx)
## Remove CodeGate
@@ -160,3 +161,4 @@ integration:
- [Remove CodeGate - GitHub Copilot](./use-with-copilot.mdx#remove-codegate)
- [Remove CodeGate - Continue](./use-with-continue.mdx#remove-codegate)
+- [Remove CodeGate - Aider](./use-with-aider.mdx#remove-codegate)
diff --git a/docs/how-to/use-with-aider.mdx b/docs/how-to/use-with-aider.mdx
new file mode 100644
index 0000000..f6ff417
--- /dev/null
+++ b/docs/how-to/use-with-aider.mdx
@@ -0,0 +1,73 @@
+---
+title: Use CodeGate with Aider
+description: Configure Aider to send requests through CodeGate
+sidebar_label: Use with Aider
+sidebar_position: 90
+---
+
+import AiderProviders from '../partials/_aider-providers.mdx';
+
+[Aider](https://aider.chat/) is an open source AI coding assistant that lets you
+pair program with LLMs in your terminal.
+
+CodeGate works with the following AI model providers through Aider:
+
+- Local / self-managed:
+ - [Ollama](https://ollama.com/)
+- Hosted:
+ - [OpenAI](https://openai.com/api/)
+
+:::note
+
+This guide assumes you have already installed Aider using their
+[installation instructions](https://aider.chat/docs/install.html).
+
+:::
+
+## Configure Aider to use CodeGate
+
+To configure Aider to send requests through CodeGate:
+
+<AiderProviders />
+
+## Verify configuration
+
+To verify that you've successfully connected Aider to CodeGate, type
+`/ask codegate-version` into the Aider chat in your terminal. You should receive
+a response like "CodeGate version 0.1.0".
+
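Before launching Aider, you can also sanity-check your shell environment (a
quick sketch; assumes the OpenAI provider setup from the configuration section
above):

```shell
# Print the base URL Aider will use; it should point at CodeGate's API port
echo "${OPENAI_API_BASE:-not set}"
```

If this prints `http://localhost:8989/openai`, Aider's requests will be routed
through CodeGate.
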
+## Next steps
+
+Learn more about CodeGate's features:
+
+- [Access the dashboard](./dashboard.md)
+- [CodeGate features](../features/index.mdx)
+
+## Remove CodeGate
+
+If you decide to stop using CodeGate, follow these steps to remove it and revert
+your environment.
+
+1. Stop Aider and unset the environment variables you set during the
+ configuration process:
+
+ **OpenAI:** `unset OPENAI_API_BASE` (macOS/Linux) or
+ `setx OPENAI_API_BASE ""` (Windows)
+
+ **Ollama:** `unset OLLAMA_API_BASE` (macOS/Linux) or
+ `setx OLLAMA_API_BASE ""` (Windows)
+
+1. Re-launch Aider.
+
+1. Stop and remove the CodeGate container:
+
+ ```bash
+ docker stop codegate && docker rm codegate
+ ```
+
+1. If you launched CodeGate with a persistent volume, delete it to remove the
+ CodeGate database and other files:
+
+ ```bash
+ docker volume rm codegate_volume
+ ```
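As a sketch of the cleanup in step 1 (macOS/Linux): `unset` removes the
variable from the current shell session, so subsequent lookups fall back to
being undefined.

```shell
# Set the variable as in the configuration guide, then remove it
export OPENAI_API_BASE=http://localhost:8989/openai
unset OPENAI_API_BASE
# After unset, the fallback text prints because the variable is gone
echo "${OPENAI_API_BASE:-unset}"
```

If you persisted the variable in a shell profile, also remove that line so new
sessions don't re-set it.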
diff --git a/docs/index.md b/docs/index.md
index f808c77..06dd537 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -37,25 +37,27 @@ CodeGate supports several development environments and AI providers.
AI coding assistants / IDEs:
-- **[GitHub Copilot](https://github.com/features/copilot)** with Visual Studio
- Code
+- **[GitHub Copilot](./how-to/use-with-copilot.mdx)** with Visual Studio Code
+ (JetBrains coming soon!)
-- **[Continue](https://www.continue.dev/)** with Visual Studio Code and
+- **[Continue](./how-to/use-with-continue.mdx)** with Visual Studio Code and
JetBrains IDEs
CodeGate supports the following AI model providers with Continue:
- Local / self-managed:
- - [Ollama](https://ollama.com/)
- - [llama.cpp](https://github.com/ggerganov/llama.cpp)
- - [vLLM](https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html)
+ - Ollama
+ - llama.cpp
+ - vLLM
- Hosted:
- - [OpenRouter](https://openrouter.ai/)
- - [Anthropic](https://www.anthropic.com/api)
- - [OpenAI](https://openai.com/api/)
+ - OpenRouter
+ - Anthropic
+ - OpenAI
+
+- **[Aider](./how-to/use-with-aider.mdx)** with Ollama and OpenAI
As the project evolves, we plan to add support for more IDE assistants and AI
-models.
+model providers.
## How to get involved
diff --git a/docs/partials/_aider-providers.mdx b/docs/partials/_aider-providers.mdx
new file mode 100644
index 0000000..ad7d3db
--- /dev/null
+++ b/docs/partials/_aider-providers.mdx
@@ -0,0 +1,120 @@
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
+<Tabs groupId="aider-provider">
+<TabItem value="openai" label="OpenAI" default>
+
+You need an [OpenAI API](https://openai.com/api/) account to use this provider.
+
+Before you run Aider, set environment variables for your API key and for the
+API base URL, pointing it to CodeGate's API port. Alternately, use one of
+Aider's other
+[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
+to set the corresponding values.
+
+<Tabs groupId="os">
+<TabItem value="macos" label="macOS / Linux" default>
+
+```bash
+export OPENAI_API_KEY=<YOUR_API_KEY>
+export OPENAI_API_BASE=http://localhost:8989/openai
+```
+
+:::note
+
+To persist these variables, add them to your shell profile (e.g., `~/.bashrc` or
+`~/.zshrc`).
+
+:::
+
+</TabItem>
+<TabItem value="windows" label="Windows">
+
+```bash
+setx OPENAI_API_KEY <YOUR_API_KEY>
+setx OPENAI_API_BASE http://localhost:8989/openai
+```
+
+:::note
+
+Restart your shell after running `setx`.
+
+:::
+
+</TabItem>
+</Tabs>
+
+Replace `<YOUR_API_KEY>` with your
+[OpenAI API key](https://platform.openai.com/api-keys).
+
+Then run `aider` as normal. For more information, see the
+[Aider docs for connecting to OpenAI](https://aider.chat/docs/llms/openai.html).
+
+</TabItem>
+<TabItem value="ollama" label="Ollama">
+
+You need Ollama installed on your local system with the server running
+(`ollama serve`) to use this provider.
+
+CodeGate connects to `http://host.docker.internal:11434` by default. If you
+changed the default Ollama server port or want to connect to a remote Ollama
+instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
+set to the correct URL. See [Configure CodeGate](/how-to/configure.md).
+
+Before you run Aider, set the Ollama base URL to CodeGate's API port using an
+environment variable. Alternately, use one of Aider's other
+[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
+to set the corresponding values.
+
+<Tabs groupId="os">
+<TabItem value="macos" label="macOS / Linux" default>
+
+```bash
+export OLLAMA_API_BASE=http://localhost:8989/ollama
+```
+
+:::note
+
+To persist this setting, add it to your shell profile (e.g., `~/.bashrc` or
+`~/.zshrc`).
+
+:::
+
+</TabItem>
+<TabItem value="windows" label="Windows">
+
+```bash
+setx OLLAMA_API_BASE http://localhost:8989/ollama
+```
+
+:::note
+
+Restart your shell after running `setx`.
+
+:::
+
+</TabItem>
+</Tabs>
+
+Then run Aider:
+
+```bash
+aider --model ollama/<MODEL_NAME>
+```
+
+Replace `<MODEL_NAME>` with the name of a coding model you have installed
+locally using `ollama pull`.
+
+We recommend the [Qwen2.5-Coder](https://ollama.com/library/qwen2.5-coder)
+series of models. Our minimum recommendation for quality results is the 7
+billion parameter (7B) version, `qwen2.5-coder:7b`.
+
+This model balances performance and quality for typical systems with at least 4
+CPU cores and 16GB of RAM. If you have more compute resources available, our
+experimentation shows that larger models do yield better results.
+
+For more information, see the
+[Aider docs for connecting to Ollama](https://aider.chat/docs/llms/ollama.html).
+
+</TabItem>
+</Tabs>
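Aider's `--model` flag expects the provider prefix joined to the model name
with a slash. As a quick sketch, the argument for the recommended model is
assembled like this:

```shell
# Build the Aider model argument: provider prefix + "/" + model name
provider="ollama"
model="qwen2.5-coder:7b"
echo "${provider}/${model}"
```

This prints `ollama/qwen2.5-coder:7b`, which you can pass directly to
`aider --model`.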