Groq

The Groq API supports the OpenAI format, which makes it easy to integrate with promptfoo as a variation of the openai provider.

Prerequisites

Before you begin, make sure you have a Groq API key. You can obtain one from the Groq Console and set it as the GROQ_API_KEY environment variable.
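For example, on Linux or macOS you can export the key in your shell before running promptfoo (the key value below is a placeholder):

```shell
# Set the Groq API key for the current shell session.
# Replace the placeholder with your actual key from the Groq Console.
export GROQ_API_KEY="gsk_your_api_key_here"

echo "GROQ_API_KEY is set"
```

To make the key persist across sessions, add the `export` line to your shell profile (e.g. `~/.bashrc` or `~/.zshrc`).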

Configuration

To use the Groq API with promptfoo, you need to configure the provider in your promptfoo configuration file.

Here's an example configuration:

providers:
  - id: openai:chat:mixtral-8x7b-32768
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
  - id: openai:chat:llama2-70b-4096
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY

In this configuration:

  • id specifies the provider ID in the format openai:chat:<model>. Replace <model> with the desired Groq model.
  • config.apiBaseUrl points the provider to Groq's OpenAI-compatible API endpoint.
  • config.apiKeyEnvar specifies the environment variable that holds your Groq API key.

You can also specify the API key directly in the configuration using the apiKey field instead of apiKeyEnvar, though using an environment variable is preferred so that secrets are not committed to version control:

providers:
  - id: openai:chat:mixtral-8x7b-32768
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKey: gsk_abc123_your_groq_api_key

Supported Models

As of the time of writing, the Groq API supports the following models:

  • mixtral-8x7b-32768
  • llama2-70b-4096

You can find the latest list of supported models in the Groq Console documentation.

Using the Provider

Once you have configured the Groq provider, you can use it in your promptfoo tests just like any other OpenAI-compatible provider. Specify the provider ID in your test configuration, and promptfoo will send the requests to the Groq API.

Here's an example test configuration:

prompts:
  - 'Answer this as concisely as possible: {{question}}'

providers:
  - id: openai:chat:mixtral-8x7b-32768
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
  - id: openai:chat:llama2-70b-4096
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY

tests:
  - vars:
      question: What is the capital of France?
    assert:
      - type: equals
        value: Paris

In this example, the test will be run against both the mixtral-8x7b-32768 and llama2-70b-4096 models using the Groq API.
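With the configuration saved as promptfooconfig.yaml, the evaluation can be run from the command line. This is a sketch assuming promptfoo is installed via npm (here `npx` fetches it on demand):

```shell
# Run the evaluation against both Groq models defined in the config.
npx promptfoo@latest eval -c promptfooconfig.yaml

# Open the web viewer to inspect the results side by side.
npx promptfoo@latest view
```

The GROQ_API_KEY environment variable must be set in the shell where you run these commands.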

Additional Configuration

The Groq provider supports additional configuration options, such as temperature and max_tokens. You can specify these options under the config field for each provider.

For example:

providers:
  - id: openai:chat:mixtral-8x7b-32768
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
      temperature: 0.7
      max_tokens: 100

Refer to the Groq OpenAI compatibility docs as well as the OpenAI documentation for the full list of supported configuration options.