This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Harmonize readme and dev guide (fixes https://github.com/stacklok/codegate/issues/700) #701

Merged 5 commits on Jan 23, 2025
README.md: 81 changes (0 additions, 81 deletions)

@@ -135,87 +135,6 @@ Check out the developer reference guides:
- [Configuration system](./docs/configuration.md)
- [Logging system](./docs/logging.md)

### Local setup

```bash
# Get the code
git clone https://github.com/stacklok/codegate.git
cd codegate

# Set up virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate

# Install dev dependencies
pip install -e ".[dev]"
```

### Testing

To run the unit tests, execute this command:

```bash
pytest
```

To run the integration tests, create a `.env` file in the repo root directory
and add the following properties to it:

```plain
ENV_OPENAI_KEY=<YOUR_KEY>
ENV_VLLM_KEY=<YOUR_KEY>
ENV_ANTHROPIC_KEY=<YOUR_KEY>
```
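If you also need these keys exported in your current shell (for ad-hoc runs),
a standard shell idiom for exporting everything in `.env`, not specific to
CodeGate, is:

```bash
# Export every variable defined in .env into the current shell:
# `set -a` marks any variable assigned afterwards for export,
# and `set +a` switches that behavior off again.
set -a
. ./.env
set +a
```

Docker offers the equivalent `--env-file .env` flag for containers.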

Then the integration tests can be executed by running:

```bash
python tests/integration/integration_tests.py
```

## 🐳 Docker deployment

### Build the image

```bash
make image-build
```

### Run the container

```bash
# Basic usage with local image
docker run -p 8989:8989 -p 9090:9090 codegate:latest

# With a pre-built image pulled from GHCR
docker pull ghcr.io/stacklok/codegate:latest
docker run --name codegate -d -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest

# Mount a host directory as a volume at /app/codegate_volume
# Llama CPP models can be stored under its /models subdirectory
# A SQLite DB with messages and alerts is stored under its /db subdirectory
docker run --name codegate -d -v /path/to/volume:/app/codegate_volume -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest
```

### Exposed parameters

- `CODEGATE_VLLM_URL`: URL for the vLLM inference engine (defaults to
  [https://inference.codegate.ai](https://inference.codegate.ai))
- `CODEGATE_OPENAI_URL`: URL for the OpenAI inference engine (defaults to
  [https://api.openai.com/v1](https://api.openai.com/v1))
- `CODEGATE_ANTHROPIC_URL`: URL for the Anthropic inference engine (defaults to
  [https://api.anthropic.com/v1](https://api.anthropic.com/v1))
- `CODEGATE_OLLAMA_URL`: URL for the Ollama inference engine (defaults to
  [http://localhost:11434/api](http://localhost:11434/api))
- `CODEGATE_APP_LOG_LEVEL`: log level for the CodeGate server (defaults to
  WARNING; can be ERROR, WARNING, INFO, or DEBUG)
- `CODEGATE_LOG_FORMAT`: log format for the CodeGate server (defaults to TEXT;
  can be JSON or TEXT)

For example, to override the Ollama endpoint:

```bash
docker run -p 8989:8989 -p 9090:9090 -e CODEGATE_OLLAMA_URL=http://1.2.3.4:11434/api ghcr.io/stacklok/codegate:latest
```

## 🤝 Contributing

We welcome contributions! Whether it's bug reports, feature requests, or code
docs/development.md: 75 changes (74 additions, 1 deletion)

@@ -147,7 +147,8 @@ The project uses several tools to maintain code quality:

### 3. Testing

Run the test suite with coverage:
#### Unit Tests

To run the unit test suite with coverage:

```bash
poetry run pytest
```

@@ -156,6 +157,35 @@
Tests are located in the `tests/` directory and follow the same structure as the
source code.

#### Integration Tests

To run the integration tests, create a `.env` file in the repo root directory
and add the following properties to it:

```plain
ENV_OPENAI_KEY=<YOUR_KEY>
ENV_VLLM_KEY=<YOUR_KEY>
ENV_ANTHROPIC_KEY=<YOUR_KEY>
```
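For scripted setups, the same `.env` file can be generated with a heredoc; the
placeholder values below are hypothetical and must be replaced with real keys:

```bash
# Write the integration-test keys to .env in the repo root.
# The values here are placeholders, not real keys.
cat > .env <<'EOF'
ENV_OPENAI_KEY=sk-replace-me
ENV_VLLM_KEY=replace-me
ENV_ANTHROPIC_KEY=replace-me
EOF
```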

Next, run the `import_packages` script to generate the integration test data:

```bash
poetry run python scripts/import_packages.py
```

Then start the CodeGate server:

```bash
poetry run codegate serve --log-level DEBUG --log-format TEXT
```

Finally, run the integration tests:

```bash
poetry run python tests/integration/integration_tests.py
```

You can set additional environment variables to control the test scope and
other options. For instance, to execute the tests for the Copilot provider,
run:

```bash
CODEGATE_PROVIDERS=copilot CA_CERT_FILE=./codegate_volume/certs/ca.crt poetry run python tests/integration/integration_tests.py
```

### 4. Make commands

The project includes a Makefile for common development tasks:
- `make build`: build distribution packages
- `make all`: run all checks and build (recommended before committing)

## 🐳 Docker deployment

### Build the image

```bash
make image-build
```

### Run the container

```bash
# Basic usage with local image
docker run -p 8989:8989 -p 9090:9090 codegate:latest

# With a pre-built image pulled from GHCR
docker pull ghcr.io/stacklok/codegate:latest
docker run --name codegate -d -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest

# Mount a host directory as a volume at /app/codegate_volume
# Llama CPP models can be stored under its /models subdirectory
# A SQLite DB with messages and alerts is stored under its /db subdirectory
docker run --name codegate -d -v /path/to/volume:/app/codegate_volume -p 8989:8989 -p 9090:9090 ghcr.io/stacklok/codegate:latest
```
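When mounting a host directory as the volume, it can help to create the
expected layout up front. A minimal sketch based on the subdirectories
described above:

```bash
# Create the host-side layout for the CodeGate volume:
# models/ for Llama CPP model files, db/ for the SQLite database.
mkdir -p ./codegate_volume/models ./codegate_volume/db
```

Pass the directory to the container with
`-v "$(pwd)/codegate_volume:/app/codegate_volume"` as in the example above.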

### Exposed parameters

- `CODEGATE_VLLM_URL`: URL for the vLLM inference engine (defaults to
  [https://inference.codegate.ai](https://inference.codegate.ai))
- `CODEGATE_OPENAI_URL`: URL for the OpenAI inference engine (defaults to
  [https://api.openai.com/v1](https://api.openai.com/v1))
- `CODEGATE_ANTHROPIC_URL`: URL for the Anthropic inference engine (defaults to
  [https://api.anthropic.com/v1](https://api.anthropic.com/v1))
- `CODEGATE_OLLAMA_URL`: URL for the Ollama inference engine (defaults to
  [http://localhost:11434/api](http://localhost:11434/api))
- `CODEGATE_APP_LOG_LEVEL`: log level for the CodeGate server (defaults to
  WARNING; can be ERROR, WARNING, INFO, or DEBUG)
- `CODEGATE_LOG_FORMAT`: log format for the CodeGate server (defaults to TEXT;
  can be JSON or TEXT)

For example, to override the Ollama endpoint:

```bash
docker run -p 8989:8989 -p 9090:9090 -e CODEGATE_OLLAMA_URL=http://1.2.3.4:11434/api ghcr.io/stacklok/codegate:latest
```
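To avoid repeating `-e` flags, several overrides can also be collected in a
file and passed with Docker's standard `--env-file` flag. A sketch (the file
name `codegate.env` is arbitrary):

```bash
# Collect CodeGate overrides in one file rather than repeating -e flags.
cat > codegate.env <<'EOF'
CODEGATE_APP_LOG_LEVEL=DEBUG
CODEGATE_LOG_FORMAT=JSON
EOF
```

Then run
`docker run -p 8989:8989 -p 9090:9090 --env-file codegate.env ghcr.io/stacklok/codegate:latest`.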

## Configuration system

CodeGate uses a hierarchical configuration system with the following priority