Select providers

Continue makes it easy to use different providers for serving your chat, autocomplete, and embeddings models.

To select the ones you want to use, add them to your config.json.
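As an illustrative sketch (the model names and API key are placeholders, not recommendations), a config.json can assign a different provider to each role — chat models under "models", autocomplete under "tabAutocompleteModel", and embeddings under "embeddingsProvider":

```json
{
  "models": [
    {
      "title": "GPT-4o",
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "YOUR_OPENAI_API_KEY"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "starcoder2:3b"
  },
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```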

Self-hosted

Local

You can run a model on your local computer using a local provider such as Ollama, LM Studio, or llama.cpp.
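For example, assuming you have Ollama running locally with a model already pulled, a minimal config.json entry might look like (the model name is a placeholder):

```json
{
  "models": [
    {
      "title": "Llama 3",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```

No API key is needed here, since the model is served from your own machine.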

Remote

You can deploy a model in your AWS, GCP, Azure, or other cloud and connect to it, for example through an OpenAI-compatible inference server such as vLLM or Text Generation Inference.
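One common pattern is pointing the "openai" provider at a self-hosted, OpenAI-compatible endpoint via "apiBase". This is a sketch: the host, port, and model name below are hypothetical placeholders for your own deployment.

```json
{
  "models": [
    {
      "title": "Self-hosted Llama 3",
      "provider": "openai",
      "model": "meta-llama/Meta-Llama-3-8B-Instruct",
      "apiBase": "http://my-inference-server:8000/v1"
    }
  ]
}
```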

SaaS

You can access both open-source and commercial LLMs via:

Open-source models

You can run open-source LLMs with cloud services such as Together or Replicate.
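As a sketch of a hosted open-source model, assuming a Together account (the model name and API key are placeholders):

```json
{
  "models": [
    {
      "title": "Mixtral",
      "provider": "together",
      "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
      "apiKey": "YOUR_TOGETHER_API_KEY"
    }
  ]
}
```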

Commercial models

You can use commercial LLMs via APIs from providers such as OpenAI, Anthropic, or Mistral.
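A commercial provider entry typically needs only a provider name, a model, and an API key. For example, with Anthropic (the model identifier and key below are placeholders):

```json
{
  "models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20240620",
      "apiKey": "YOUR_ANTHROPIC_API_KEY"
    }
  ]
}
```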

In addition to selecting providers, you will also need to decide which models to use for each role.