Model setup
This page recommends models and providers for Chat. Read more about how to set up your config.json here.
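Each "models" snippet below belongs at the top level of config.json. As a rough sketch (using the Anthropic example from this page), a minimal file wrapping one snippet might look like:

```json
{
  "models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20240620",
      "apiKey": "[ANTHROPIC_API_KEY]"
    }
  ]
}
```

Replace the bracketed placeholder with your own API key; your config.json may also contain other top-level fields not shown here.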
Best overall experience
For the best overall Chat experience, you will want to use a 400B+ parameter model or one of the frontier models.
Claude 3.5 Sonnet from Anthropic
Our current top recommendation is Claude 3.5 Sonnet from Anthropic.
"models": [
  {
    "title": "Claude 3.5 Sonnet",
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-20240620",
    "apiKey": "[ANTHROPIC_API_KEY]"
  }
]
Llama 3.1 405B from Meta
If you prefer to use an open-weight model, then Llama 3.1 405B from Meta is your best option right now. You will need to decide whether to use it through a SaaS model provider (e.g. Together or Groq) or to self-host it (e.g. using vLLM or Ollama).
Together:

"models": [
  {
    "title": "Llama 3.1 405B",
    "provider": "together",
    "model": "llama3.1-405b",
    "apiKey": "[TOGETHER_API_KEY]"
  }
]

Groq:

"models": [
  {
    "title": "Llama 3.1 405B",
    "provider": "groq",
    "model": "llama3.1-405b",
    "apiKey": "[GROQ_API_KEY]"
  }
]

vLLM:

"models": [
  {
    "title": "Llama 3.1 405B",
    "provider": "vllm",
    "model": "llama3.1-405b"
  }
]

Ollama:

"models": [
  {
    "title": "Llama 3.1 405B",
    "provider": "ollama",
    "model": "llama3.1-405b"
  }
]
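When self-hosting, Continue also needs to know where your server is listening. As a sketch, assuming vLLM's default port of 8000 on your local machine, you may need to add an apiBase field to the entry:

```json
"models": [
  {
    "title": "Llama 3.1 405B",
    "provider": "vllm",
    "model": "llama3.1-405b",
    "apiBase": "http://localhost:8000"
  }
]
```

Adjust the host and port to match wherever your server is actually running.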
GPT-4o from OpenAI
If you prefer to use a model from OpenAI, then we recommend GPT-4o.
"models": [
  {
    "title": "GPT-4o",
    "provider": "openai",
    "model": "gpt-4o",
    "apiKey": "[OPENAI_API_KEY]"
  }
]
Gemini 1.5 Pro from Google
If you prefer to use a model from Google, then we recommend Gemini 1.5 Pro.
"models": [
  {
    "title": "Gemini 1.5 Pro",
    "provider": "gemini",
    "model": "gemini-1.5-pro-latest",
    "apiKey": "[GEMINI_API_KEY]"
  }
]
Local, offline experience
For the best local, offline Chat experience, you will want to use a model that is large but fast enough on your machine.
Llama 3.1 8B
If your local machine can run an 8B parameter model, then we recommend running Llama 3.1 8B on your machine (e.g. using Ollama or LM Studio).
Ollama:

"models": [
  {
    "title": "Llama 3.1 8B",
    "provider": "ollama",
    "model": "llama3.1-8b"
  }
]

LM Studio:

"models": [
  {
    "title": "Llama 3.1 8B",
    "provider": "lmstudio",
    "model": "llama3.1-8b"
  }
]
DeepSeek Coder 2 16B
If your local machine can run a 16B parameter model, then we recommend running DeepSeek Coder 2 16B (e.g. using Ollama or LM Studio).
Ollama:

"models": [
  {
    "title": "DeepSeek Coder 2 16B",
    "provider": "ollama",
    "model": "deepseek-coder-v2:16b"
  }
]

LM Studio:

"models": [
  {
    "title": "DeepSeek Coder 2 16B",
    "provider": "lmstudio",
    "model": "deepseek-coder-v2:16b"
  }
]
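Note that "models" is an array, so you can configure several of the options on this page at once and switch between them in Chat. As a sketch, combining the two local Ollama setups above:

```json
"models": [
  {
    "title": "Llama 3.1 8B",
    "provider": "ollama",
    "model": "llama3.1-8b"
  },
  {
    "title": "DeepSeek Coder 2 16B",
    "provider": "ollama",
    "model": "deepseek-coder-v2:16b"
  }
]
```

Each entry appears as a separate option in the Chat model selector.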
Other experiences
There are many more models and providers you can use with Chat beyond those mentioned above. Read more here.