# Groq
Groq provides ultra-fast inference on open-source models (Llama, Gemma, Mistral, and more) using custom LPU hardware. OpenClaw connects to Groq through its OpenAI-compatible API.
| Property | Value |
|---|---|
| Provider | `groq` |
| Auth | `GROQ_API_KEY` |
| API | OpenAI-compatible |
## Getting started
- **Get an API key** — create one at [console.groq.com/keys](https://console.groq.com/keys).
- **Set the API key:**

  ```bash
  export GROQ_API_KEY="gsk_..."
  ```

- **Set a default model:**

  ```json5
  {
    agents: {
      defaults: {
        model: { primary: "groq/llama-3.3-70b-versatile" },
      },
    },
  }
  ```
## Config file example
```json5
{
  env: { GROQ_API_KEY: "gsk_..." },
  agents: {
    defaults: {
      model: { primary: "groq/llama-3.3-70b-versatile" },
    },
  },
}
```
## Available models
Groq's model catalog changes frequently. Run `openclaw models list | grep groq` to see currently available models, or check [console.groq.com/docs/models](https://console.groq.com/docs/models).
| Model | Notes |
|---|---|
| Llama 3.3 70B Versatile | General-purpose, large context |
| Llama 3.1 8B Instant | Fast, lightweight |
| Gemma 2 9B | Compact, efficient |
| Mixtral 8x7B | MoE architecture, strong reasoning |
Use `openclaw models list --provider groq` for the most up-to-date list of models available on your account.
## Audio transcription
Groq also provides fast Whisper-based audio transcription. When configured as a media-understanding provider, OpenClaw uses Groq's `whisper-large-v3-turbo` model to transcribe voice messages through the shared `tools.media.audio` surface.
```json5
{
  tools: {
    media: {
      audio: {
        models: [{ provider: "groq" }],
      },
    },
  },
}
```
### Audio transcription details
| Property | Value |
|---|---|
| Shared config path | `tools.media.audio` |
| Default base URL | `https://api.groq.com/openai/v1` |
| Default model | `whisper-large-v3-turbo` |
| API endpoint | OpenAI-compatible `/audio/transcriptions` |
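As a sanity check outside OpenClaw, the same endpoint can be exercised directly with `curl`. This is a sketch: the base URL and model name come from the table above, and `voice.ogg` is a placeholder for a local audio file.

```bash
# Transcribe a local audio file via Groq's OpenAI-compatible endpoint.
# Requires GROQ_API_KEY in the environment; "voice.ogg" is a placeholder.
curl -s https://api.groq.com/openai/v1/audio/transcriptions \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -F model=whisper-large-v3-turbo \
  -F file=@voice.ogg
```

The response is JSON with the transcribed text, matching the shape of OpenAI's transcription API.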
## Environment note
If the Gateway runs as a daemon (launchd/systemd), keys set only in your interactive shell are not visible to that process. Make sure `GROQ_API_KEY` is available to the daemon persistently, for example via `~/.openclaw/.env` or the `env.shellEnv` config setting.
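A minimal sketch of the env-file approach, assuming `~/.openclaw/.env` uses plain dotenv-style `KEY=value` lines:

```bash
# ~/.openclaw/.env — read by the daemon-managed Gateway at startup
GROQ_API_KEY=gsk_...
```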