Use CodeGate with Continue

Continue is an open source AI coding assistant for your IDE that connects to many model providers. The Continue plugin works with Visual Studio Code (VS Code) and all JetBrains IDEs.

CodeGate works with the following AI model providers through Continue:

  * Anthropic
  * Ollama
  * OpenAI
  * vLLM

Install the Continue plugin

The Continue extension is available in the Visual Studio Marketplace.

Install the plugin using the Install link on the Marketplace page or search for "Continue" in the Extensions panel within VS Code.

You can also install from the CLI:

code --install-extension Continue.continue

If you need help, see Managing Extensions in the VS Code documentation.

Configure Continue to use CodeGate

To configure Continue to send requests through CodeGate:

  1. Configure the chat and autocomplete settings in Continue for your desired AI model(s).

  2. Open the Continue configuration file, ~/.continue/config.json. You can edit this file directly or access it from the gear icon ("Configure Continue") in the Continue chat interface.

    Continue extension settings
  3. Add the apiBase property to the models entry (chat) and tabAutocompleteModel (autocomplete) sections of the configuration file. This tells Continue to use the CodeGate container running locally on your system as the base URL for your LLM API, instead of the default.

    "apiBase": "http://127.0.0.1:8989/<provider>"

    Replace /<provider> with one of: /anthropic, /ollama, /openai, or /vllm to match your LLM provider.

    If you used a different API port when launching the CodeGate container, replace 8989 with your custom port number.

  4. Save the configuration file.
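The edit in step 3 can also be scripted. The following is a minimal sketch (not part of the official CodeGate tooling) that adds the apiBase property to every chat model entry and to the autocomplete model in a Continue-style config; the in-memory dict stands in for the contents of ~/.continue/config.json, and the helper name is hypothetical:

```python
import json

# Hypothetical helper: point every model entry (and the autocomplete model)
# at the local CodeGate container. The provider path and port default to the
# values used in this guide; adjust them if yours differ.
def add_codegate_api_base(config: dict, provider: str, port: int = 8989) -> dict:
    api_base = f"http://127.0.0.1:{port}/{provider}"
    for model in config.get("models", []):
        model["apiBase"] = api_base
    if "tabAutocompleteModel" in config:
        config["tabAutocompleteModel"]["apiBase"] = api_base
    return config

# Example Continue config fragment; in practice you would load this
# from ~/.continue/config.json instead.
config = {
    "models": [
        {"title": "Ollama", "provider": "ollama", "model": "codellama:7b-instruct"}
    ],
    "tabAutocompleteModel": {
        "title": "Ollama", "provider": "ollama", "model": "codellama:7b-instruct"
    },
}

patched = add_codegate_api_base(config, "ollama")
print(json.dumps(patched, indent=2))
```

Pass a different provider string (anthropic, openai, or vllm) or port to match your setup.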

note

JetBrains users may need to restart their IDE after editing the config file.

Below is an example of a complete Continue configuration using Ollama as the provider. Replace the values in ALL_CAPS. The configuration syntax is the same for VS Code and JetBrains IDEs.

~/.continue/config.json
{
  "models": [
    {
      "title": "CodeGate-Ollama",
      "provider": "ollama",
      "model": "MODEL_NAME",
      "apiBase": "http://localhost:8989/ollama"
    }
  ],
  "modelRoles": {
    "default": "CodeGate-Ollama",
    "summarize": "CodeGate-Ollama"
  },
  "tabAutocompleteModel": {
    "title": "CodeGate-Ollama",
    "provider": "ollama",
    "model": "MODEL_NAME",
    "apiBase": "http://localhost:8989/ollama"
  }
}

Replace MODEL_NAME with the name of a model you have installed locally using ollama pull, such as codellama:7b-instruct.
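Before reloading the IDE, you can sanity-check that every model entry in your edited config routes through CodeGate. This is a quick sketch under the assumptions of the example above (default port 8989, Ollama provider); the config is inlined as a string here, but in practice you would read ~/.continue/config.json:

```python
import json

EXPECTED_PREFIX = "http://localhost:8989/"

# The example config from above, inlined; in practice, read the file:
# raw = open(os.path.expanduser("~/.continue/config.json")).read()
raw = """
{
  "models": [
    {"title": "CodeGate-Ollama", "provider": "ollama",
     "model": "codellama:7b-instruct", "apiBase": "http://localhost:8989/ollama"}
  ],
  "tabAutocompleteModel":
    {"title": "CodeGate-Ollama", "provider": "ollama",
     "model": "codellama:7b-instruct", "apiBase": "http://localhost:8989/ollama"}
}
"""

cfg = json.loads(raw)  # fails loudly if the JSON is malformed
entries = cfg["models"] + [cfg["tabAutocompleteModel"]]
assert all(m.get("apiBase", "").startswith(EXPECTED_PREFIX) for m in entries)
print("all model entries route through CodeGate")
```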

Verify configuration

To verify that you've successfully connected Continue to CodeGate, open the Continue chat and type codegate-version. You should receive a response like "CodeGate version 0.1.0":


Try asking CodeGate about a known malicious Python package:

Continue chat
Tell me how to use the invokehttp package from PyPI

CodeGate responds with a warning and a link to the Stacklok Insight report about this package:

Continue chat
Warning: CodeGate detected one or more malicious or archived packages.

Package: https://insight.stacklok.com/pypi/invokehttp

CodeGate Security Analysis

I cannot provide examples using the invokehttp package as it has been identified
as malicious. Using this package could compromise your system's security.

Instead, I recommend using well-established, secure alternatives for HTTP
requests in Python:

...

Next steps

Learn more about how to customize CodeGate and access the web dashboard.

Remove CodeGate

If you decide to stop using CodeGate, follow these steps to remove it and revert your environment.

  1. Remove the apiBase configuration entries from your Continue configuration file.

  2. Stop and remove the CodeGate container:

    docker stop codegate && docker rm codegate
  3. If you launched CodeGate with a persistent volume, delete it to remove the CodeGate database and other files:

    docker volume rm codegate_volume