
Configure CodeGate

The CodeGate container runs with default settings that support the Ollama, Anthropic, and OpenAI APIs. To customize its behavior, supply extra configuration parameters to the container as environment variables:

docker run --name codegate -d -p 8989:8989 -p 9090:80 \
[-e KEY=VALUE ...] \
--restart unless-stopped ghcr.io/stacklok/codegate

Config parameters

CodeGate supports the following parameters:

| Parameter | Default value | Description |
| --- | --- | --- |
| CODEGATE_OLLAMA_URL | http://localhost:11434/api | Specifies the URL of an Ollama instance. Used when the provider in your plugin config is ollama. |
| CODEGATE_VLLM_URL | https://inference.codegate.ai | Specifies the URL of a model hosted by a vLLM endpoint. Used when the provider in your plugin config is vllm. |
| CODEGATE_ANTHROPIC_URL | https://api.anthropic.com/v1 | Specifies the Anthropic engine API endpoint URL. |
| CODEGATE_OPENAI_URL | https://api.openai.com/v1 | Specifies the OpenAI engine API endpoint URL. |
| CODEGATE_APP_LOG_LEVEL | WARNING | Sets the logging level. Valid values: ERROR, WARNING, INFO, DEBUG |
| CODEGATE_LOG_FORMAT | TEXT | Type of log formatting. Valid values: TEXT, JSON |
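For example, to point CodeGate at an Ollama server running on another machine and switch to debug-level JSON logging, you could pass the corresponding variables when starting the container. The Ollama host address below is a placeholder; substitute your own:

docker run --name codegate -d -p 8989:8989 -p 9090:80 \
-e CODEGATE_OLLAMA_URL=http://192.168.1.50:11434/api \
-e CODEGATE_APP_LOG_LEVEL=DEBUG \
-e CODEGATE_LOG_FORMAT=JSON \
--restart unless-stopped ghcr.io/stacklok/codegate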

Example: Use CodeGate with OpenRouter

OpenRouter is an interface to many large language models. CodeGate's vLLM provider works with OpenRouter's API when used with the Continue IDE plugin.

To use OpenRouter, set the vLLM URL when you launch CodeGate:

docker run --name codegate -d -p 8989:8989 -p 9090:80 \
-e CODEGATE_VLLM_URL=https://openrouter.ai/api \
--restart unless-stopped ghcr.io/stacklok/codegate

Then, configure the Continue IDE plugin to use the vLLM endpoint (http://localhost:8989/vllm/), specifying the name of the model you'd like to use and your OpenRouter API key.
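
As a rough sketch, a model entry in the Continue plugin's config.json might look like the following. This assumes Continue's OpenAI-compatible provider settings (apiBase, apiKey); the model name is illustrative, and the apiKey placeholder should be replaced with your OpenRouter API key. Check the Continue documentation for the exact schema your plugin version expects.

{
  "title": "CodeGate via OpenRouter",
  "provider": "openai",
  "model": "anthropic/claude-3.5-sonnet",
  "apiBase": "http://localhost:8989/vllm/",
  "apiKey": "YOUR_OPENROUTER_API_KEY"
}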