# Continue Guides

## Model & Setup Guides
- Using Ollama with Continue - Local AI development with Ollama-hosted models
- Setting up Codestral - Configure Mistral's Codestral model
- How to Self-Host a Model - Self-hosting AI models
- Running Continue Without Internet - Offline development setup
- Llama 3.1 Setup - Getting started with Llama 3.1
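For local setups like the Ollama and Llama 3.1 guides above, configuration lives in Continue's config file. A minimal sketch is shown below; the file location, field names, and model tag are assumptions from a typical Ollama setup, so check the linked guides for the current schema.

```yaml
# ~/.continue/config.yaml (location and schema may differ by Continue version)
models:
  - name: Llama 3.1 8B (local)   # display name shown in the Continue UI
    provider: ollama             # route requests to a local Ollama server
    model: llama3.1:8b           # must match a model pulled via `ollama pull`
```

With a config like this, Continue sends completions to the local Ollama server instead of a hosted API, which is also the basis for the offline setup described in "Running Continue Without Internet."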
## Advanced Tutorials
- Build Your Own Context Provider - Create custom context providers
- Custom Code RAG - Implement custom retrieval-augmented generation
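The custom code RAG guide covers retrieval over your own codebase. The core loop, embed the query, score each code chunk, return the best matches, can be sketched with a toy retriever. This is illustrative only: bag-of-words cosine similarity stands in for a real embedding model, and the function names are ours, not Continue's API.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a term-frequency vector over alphanumeric tokens.
    A real pipeline would use a code-aware embedding model here."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank code chunks by similarity to the query and return the top k."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = [
    "def parse_config(path): ...",
    "def connect_db(url): ...",
    "def render_template(name, ctx): ...",
]
print(retrieve("how do I parse the config file", chunks, k=1))
# -> ['def parse_config(path): ...']
```

In a real implementation the chunks would come from splitting repository files, the scores from a vector index, and the top results would be injected into the model's prompt as context, which is exactly what the two tutorials above walk through.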
## Contributing
Have an idea for a guide, or found an issue? We welcome contributions! Check our GitHub repository to get involved.