lucianotonet/groq-php
PHP client for the Groq API. Provides a simple, lightweight way to call Groq LLM endpoints from PHP apps, with support for common chat/completions workflows and easy integration into existing projects.

High-performance PHP client for GroqCloud API
A comprehensive PHP SDK that simplifies interaction with the world's fastest LLM inference platform, allowing PHP developers to easily integrate high-performance models (DeepSeek r1, Llama 3.3, Mixtral, Gemma, and more) into any PHP application.
Using Laravel? Check out GroqLaravel.
Install via Composer:
composer require lucianotonet/groq-php
Get your API Key:
Configure your API Key:
export GROQ_API_KEY=your_key_here
Or in your .env file:
GROQ_API_KEY=your_key_here
GROQ_API_BASE=https://api.groq.com/openai/v1 # (Optional, if different from default)
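With the key in the environment, the client can be constructed as in the examples below. A minimal setup sketch (the missing-key guard is our addition, not part of the SDK):

```php
<?php
require 'vendor/autoload.php';

use LucianoTonet\GroqPHP\Groq;

// Read the key from the environment so it never lives in source control.
$apiKey = getenv('GROQ_API_KEY');
if ($apiKey === false || $apiKey === '') {
    throw new RuntimeException('GROQ_API_KEY is not set');
}

$groq = new Groq($apiKey);
```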
List available models.
$models = $groq->models()->list();
print_r($models['data']);
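If you only need the identifiers (for example, to populate a dropdown), the listing above can be reduced to a sorted array of ids. A small sketch, assuming each entry carries an 'id' field as in OpenAI-style listings:

```php
// Extract just the model ids from the listing response.
$ids = array_map(fn($model) => $model['id'], $models['data']);
sort($ids);
print_r($ids);
```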
Generate interactive chat responses.
<?php
use LucianoTonet\GroqPHP\Groq;
$groq = new Groq(getenv('GROQ_API_KEY'));
try {
    $response = $groq->chat()->completions()->create([
        'model' => 'llama3-8b-8192', // Or another supported model
        'messages' => [
            ['role' => 'user', 'content' => 'Explain the importance of low latency in LLMs'],
        ],
    ]);

    echo $response['choices'][0]['message']['content'];
} catch (\LucianoTonet\GroqPHP\GroqException $e) {
    echo 'Error: ' . $e->getMessage();
}
Streaming:
$response = $groq->chat()->completions()->create([
    'model' => 'llama3-8b-8192',
    'messages' => [
        ['role' => 'user', 'content' => 'Tell me a short story'],
    ],
    'stream' => true
]);

foreach ($response->chunks() as $chunk) {
    if (isset($chunk['choices'][0]['delta']['content'])) {
        echo $chunk['choices'][0]['delta']['content'];
        ob_flush(); // Important for real streaming
        flush();
    }
}
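If you also need the complete reply after streaming it, accumulate the deltas while echoing them. A sketch based on the loop above:

```php
$fullText = '';
foreach ($response->chunks() as $chunk) {
    // Deltas may omit 'content' (e.g., the final chunk), so default to ''.
    $delta = $chunk['choices'][0]['delta']['content'] ?? '';
    $fullText .= $delta;
    echo $delta;
    ob_flush();
    flush();
}
// $fullText now holds the entire streamed response.
```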
JSON Mode:
$response = $groq->chat()->completions()->create([
    'model' => 'llama3-70b-8192',
    'messages' => [
        ['role' => 'system', 'content' => 'You are an API and must respond only with valid JSON.'],
        ['role' => 'user', 'content' => 'Give me information about the current weather in London'],
    ],
    'response_format' => ['type' => 'json_object']
]);

$content = $response['choices'][0]['message']['content'];
echo json_encode(json_decode($content), JSON_PRETTY_PRINT); // Display formatted JSON
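JSON mode makes invalid output unlikely, but decoding defensively keeps downstream code safe. A sketch; note the model decides which keys to return, so 'location' here is purely illustrative:

```php
$data = json_decode($content, true);
if (json_last_error() !== JSON_ERROR_NONE) {
    throw new RuntimeException('Model returned invalid JSON: ' . json_last_error_msg());
}

// Keys depend on what the model chose to emit; 'location' is illustrative.
echo $data['location'] ?? '(no location field)';
```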
Additional Parameters (Chat Completions):
- temperature: Controls randomness (0.0 - 2.0)
- max_completion_tokens: Maximum tokens in the response
- top_p: Nucleus sampling
- frequency_penalty: Penalty for repeated tokens (-2.0 - 2.0)
- presence_penalty: Penalty for repeated topics (-2.0 - 2.0)
- stop: Stop sequences
- seed: For reproducibility

Tool Calling:
Allows the model to call external functions/tools.
// Example function (simulated)
function getNbaScore($teamName) {
    // ... (simulated logic to return score) ...
    return json_encode(['team' => $teamName, 'score' => 100]); // Example
}

$messages = [
    ['role' => 'system', 'content' => "You must call the 'getNbaScore' function to answer questions about NBA game scores."],
    ['role' => 'user', 'content' => 'What is the Lakers score?']
];

$tools = [
    [
        'type' => 'function',
        'function' => [
            'name' => 'getNbaScore',
            'description' => 'Get the score for an NBA game',
            'parameters' => [
                'type' => 'object',
                'properties' => [
                    'team_name' => ['type' => 'string', 'description' => 'NBA team name'],
                ],
                'required' => ['team_name'],
            ],
        ],
    ]
];

$response = $groq->chat()->completions()->create([
    'model' => 'llama3-groq-70b-8192-tool-use-preview', // Model that supports tool calling
    'messages' => $messages,
    'tool_choice' => 'auto',
    'tools' => $tools
]);

if (isset($response['choices'][0]['message']['tool_calls'])) {
    // Process the tool call, run the function, and send the result back
    $tool_call = $response['choices'][0]['message']['tool_calls'][0];
    $function_args = json_decode($tool_call['function']['arguments'], true);
    $function_response = getNbaScore($function_args['team_name']);

    // Append the assistant message containing the tool call, then the tool result
    $messages[] = $response['choices'][0]['message'];
    $messages[] = [
        'tool_call_id' => $tool_call['id'],
        'role' => 'tool',
        'name' => 'getNbaScore',
        'content' => $function_response,
    ];

    // Second call to the model with the tool response:
    $response = $groq->chat()->completions()->create([
        'model' => 'llama3-groq-70b-8192-tool-use-preview',
        'messages' => $messages
    ]);

    echo $response['choices'][0]['message']['content'];
} else {
    // Direct response, no tool_calls
    echo $response['choices'][0]['message']['content'];
}
Advanced Tool Calling (with multiple tools and parallel calls):
See examples/tool-calling-advanced.php for a more complete example, including:
- Multiple tool definitions (getCurrentDateTimeTool, getCurrentWeatherTool)
- parallel_tool_calls: Controls whether tool calls can be made in parallel (currently must be forced to false in code)

Audio Transcription:
Transcribe audio files using Whisper models.

use LucianoTonet\GroqPHP\Groq;
$groq = new Groq(getenv('GROQ_API_KEY'));
try {
    $transcription = $groq->audio()->transcriptions()->create([
        'file' => 'audio.mp3', /* Your audio file */
        'model' => 'whisper-large-v3',
        'response_format' => 'verbose_json', /* Or 'text', 'json' */
        'language' => 'en', /* ISO 639-1 code (optional but recommended) */
        'prompt' => 'Audio transcription...' /* (optional) */
    ]);

    echo json_encode($transcription, JSON_PRETTY_PRINT | JSON_UNESCAPED_UNICODE);
} catch (\LucianoTonet\GroqPHP\GroqException $e) {
    echo "Error: " . $e->getMessage();
}
Audio Translation:
// Similar to transcription, but uses ->translations()->create(); the target language is always English.
$translation = $groq->audio()->translations()->create([
    'file' => 'audio_in_spanish.mp3',
    'model' => 'whisper-large-v3'
]);
Transcription/Translation Options:
- response_format: 'json', 'verbose_json', or 'text'. The vtt and srt formats are not supported.
- language: ISO 639-1 code of the source language (optional but recommended for better accuracy). See examples/audio-transcriptions.php for a complete list of supported languages.
- temperature: Controls variability.

Text to Speech:
Convert text to speech using GroqCloud's Text-to-Speech API.
use LucianoTonet\GroqPHP\Groq;
$groq = new Groq(getenv('GROQ_API_KEY'));
try {
    // Method 1: Save to file
    $result = $groq->audio()->speech()
        ->model('playai-tts') // 'playai-tts' for English, 'playai-tts-arabic' for Arabic
        ->input('Hello, this text will be converted to speech')
        ->voice('Bryan-PlayAI') // Voice identifier
        ->responseFormat('wav') // Output format
        ->save('output.wav');

    if ($result) {
        echo "Audio file saved successfully!";
    }

    // Method 2: Get as stream
    $audioStream = $groq->audio()->speech()
        ->model('playai-tts')
        ->input('This is another example text')
        ->voice('Bryan-PlayAI')
        ->create();

    // Use the stream (e.g., send to browser)
    header('Content-Type: audio/wav');
    header('Content-Disposition: inline; filename="speech.wav"');
    echo $audioStream;
} catch (\LucianoTonet\GroqPHP\GroqException $e) {
    echo "Error: " . $e->getMessage();
}
Available models: 'playai-tts' (English), 'playai-tts-arabic' (Arabic)
- model(): The TTS model to use
- input(): Text to convert to speech
- voice(): Voice identifier (e.g., "Bryan-PlayAI")
- responseFormat(): Output format (default: "wav")
- create(): Returns the audio content as a stream
- save($filePath): Saves the audio to a file and returns a success boolean

Vision:
Analyze images using Groq's vision models.
use LucianoTonet\GroqPHP\Groq;
$groq = new Groq(getenv('GROQ_API_KEY'));
try {
    // Analyze a local image
    $response = $groq->vision()->analyze('path/to/image.jpg', 'What do you see in this image?');

    // Analyze an image from a URL
    $response = $groq->vision()->analyze('https://example.com/image.jpg', 'Describe this image');

    // Custom options
    $response = $groq->vision()->analyze('path/to/image.jpg', 'What colors do you see?', [
        'temperature' => 0.7,
        'max_completion_tokens' => 100
    ]);
} catch (\LucianoTonet\GroqPHP\GroqException $e) {
    echo 'Error: ' . $e->getMessage();
}
Vision Model:
The vision functionality uses the meta-llama/llama-4-scout-17b-16e-instruct model by default.
Reasoning:
Enables step-by-step reasoning tasks.
use LucianoTonet\GroqPHP\Groq;
$groq = new Groq(getenv('GROQ_API_KEY'));
try {
    $response = $groq->reasoning()->analyze(
        'Explain the process of photosynthesis.',
        [
            'model' => 'deepseek-r1-distill-llama-70b',
            'reasoning_format' => 'raw', // 'raw' (default), 'parsed', 'hidden'
            'temperature' => 0.6,
            'max_completion_tokens' => 10240
        ]
    );

    echo $response['choices'][0]['message']['content'];
} catch (\LucianoTonet\GroqPHP\GroqException $e) {
    echo "Error: " . $e->getMessage();
}
Options:
- analyze(): Takes the prompt (question/problem) and an options array.
- reasoning_format:
  - 'raw': Includes reasoning with <think> tags in the content (default)
  - 'parsed': Returns reasoning in a separate reasoning field
  - 'hidden': Returns only the final answer
- system_prompt: Additional instructions for the model (optional). Added as a system message before the user message.
- Use the 'parsed' or 'hidden' format when using JSON mode.
- Standard parameters are also accepted: temperature, max_completion_tokens, top_p, frequency_penalty, etc.

The reasoning feature supports three output formats:
Raw Format (Default)
Includes reasoning within <think> tags in the content.

$response = $groq->reasoning()->analyze(
    "Explain quantum entanglement.",
    [
        'model' => 'deepseek-r1-distill-llama-70b',
        'reasoning_format' => 'raw'
    ]
);
// Response includes: <think>First, let's understand...</think>
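When using the raw format, you may want to separate the reasoning from the final answer yourself. A small helper sketch (splitRawReasoning is our own name, not an SDK function):

```php
// Split raw-format content into the <think> reasoning and the final answer.
function splitRawReasoning(string $content): array
{
    if (preg_match('/<think>(.*?)<\/think>\s*(.*)/s', $content, $matches)) {
        return ['reasoning' => trim($matches[1]), 'answer' => trim($matches[2])];
    }
    // No <think> block found: treat everything as the answer.
    return ['reasoning' => '', 'answer' => trim($content)];
}
```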
Parsed Format
$response = $groq->reasoning()->analyze(
    "Solve this math problem: 3x + 7 = 22",
    [
        'model' => 'deepseek-r1-distill-llama-70b',
        'reasoning_format' => 'parsed'
    ]
);
// Response structure:
// {
// "reasoning": "Step 1: Subtract 7 from both sides...",
// "content": "x = 5"
// }
Hidden Format
$response = $groq->reasoning()->analyze(
    "What is the capital of France?",
    [
        'model' => 'deepseek-r1-distill-llama-70b',
        'reasoning_format' => 'hidden'
    ]
);
// Response includes only: "The capital of France is Paris."
Process large volumes of data asynchronously using Groq's Files and Batch Processing API.
use LucianoTonet\GroqPHP\Groq;
$groq = new Groq(getenv('GROQ_API_KEY'));
$fileManager = $groq->files();
// Upload a file
$file = $fileManager->upload('path/to/your/file.jsonl', 'batch');
// List files
$files = $fileManager->list('batch', [
    'limit' => 10,
    'order' => 'desc'
]);
// Retrieve file info
$file = $fileManager->retrieve('file_id');
// Download file content
$content = $fileManager->download('file_id');
// Delete file
$fileManager->delete('file_id');
$batchManager = $groq->batches();
// Create a batch
$batch = $batchManager->create([
    'input_file_id' => 'file_id',
    'endpoint' => '/v1/chat/completions',
    'completion_window' => '24h',
    'metadata' => [
        'description' => 'Processing customer queries'
    ]
]);
// List batches
$batches = $batchManager->list([
    'limit' => 10,
    'order' => 'desc',
    'status' => 'completed'
]);
// Get batch status
$batch = $batchManager->retrieve('batch_id');
$summary = $batch->getSummary();
// Cancel batch
$batch = $batchManager->cancel('batch_id');
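Since batches complete asynchronously, a common pattern is to poll retrieve() until a terminal status is reached. A sketch; the status names, the array-style field access, and the output_file_id field are assumptions about the response shape, so adapt them to what the API actually returns:

```php
// Poll until the batch reaches a terminal state (status names assumed).
$terminal = ['completed', 'failed', 'cancelled', 'expired'];

do {
    sleep(30); // batches are asynchronous; poll sparingly
    $batch = $batchManager->retrieve('batch_id');
    $status = $batch['status'] ?? null;
} while (!in_array($status, $terminal, true));

if ($status === 'completed') {
    // Results are typically exposed via an output file id (assumed field name).
    $results = $groq->files()->download($batch['output_file_id']);
}
```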
File Requirements:
Each line of the JSONL input file must be a JSON object with these fields:
- custom_id: Your unique identifier for tracking the batch request
- method: The HTTP method (currently POST only)
- url: The API endpoint to call (one of: /v1/chat/completions, /v1/audio/transcriptions, or /v1/audio/translations)
- body: The parameters of your request, matching any synchronous API format (messages for chat, url for audio, etc.)

Example JSONL file:
{"custom_id": "chat-request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "llama-3.1-8b-instant", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "What is quantum computing?"}]}}
{"custom_id": "audio-request-1", "method": "POST", "url": "/v1/audio/transcriptions", "body": {"model": "whisper-large-v3", "language": "en", "url": "https://github.com/voxserv/audio_quality_testing_samples/raw/refs/heads/master/testaudio/8000/test01_20s.wav", "response_format": "verbose_json", "timestamp_granularities": ["segment"]}}
{"custom_id": "chat-request-2", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "llama-3.3-70b-versatile", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Explain machine learning in simple terms."}]}}
{"custom_id":"audio-request-2","method":"POST","url":"/v1/audio/translations","body":{"model":"whisper-large-v3","language":"en","url":"https://console.groq.com/audio/batch/sample-zh.wav","response_format":"verbose_json","timestamp_granularities":["segment"]}}
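An input file like the one above can also be generated programmatically. A sketch where each request array mirrors the fields described earlier:

```php
// Build a batch input file: one JSON object per line (JSONL).
$requests = [
    [
        'custom_id' => 'chat-request-1',
        'method'    => 'POST',
        'url'       => '/v1/chat/completions',
        'body'      => [
            'model'    => 'llama-3.1-8b-instant',
            'messages' => [
                ['role' => 'user', 'content' => 'What is quantum computing?'],
            ],
        ],
    ],
];

$jsonl = implode("\n", array_map(
    fn($request) => json_encode($request, JSON_UNESCAPED_SLASHES),
    $requests
)) . "\n";

file_put_contents('batch_input.jsonl', $jsonl);
```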
Supported Features:
Completion Windows:
Batch Statuses:
The library throws GroqException for API errors. The exception contains:
- getMessage(): Descriptive error message
- getCode(): HTTP status code (or 0 for an invalid API key)
- getType(): Error type (see GroqException::ERROR_TYPES for possible types)
- getHeaders(): HTTP response headers
- getResponseBody(): Response body (as an object if JSON)
- getError(): Returns an array with error details (message, type, code)
- getFailedGeneration(): If the error type is failed_generation, returns the invalid JSON that caused the issue

try {
    // ... API call ...
} catch (\LucianoTonet\GroqPHP\GroqException $e) {
    echo "Groq Error: " . $e->getMessage() . "\n";
    echo "Type: " . $e->getType() . "\n";
    echo "Code: " . $e->getCode() . "\n";

    if ($e->getFailedGeneration()) {
        echo "Invalid JSON: " . $e->getFailedGeneration();
    }
}
The GroqException class provides static methods for creating specific exceptions like invalidRequest(), authenticationError(), etc., following a factory pattern.
The examples/ folder contains complete, working PHP scripts demonstrating each library feature. You can run them directly to see the library in action and interact with them in your browser.
First, copy the .env file from the project root into the examples folder:
cp .env examples/.env
Then, in the examples folder, you need to install the dependencies with:
cd examples
composer install
Now, you can start the server with:
php -S 127.0.0.1:8000
Finally, you can access the examples in your browser at http://127.0.0.1:8000.
The tests/ folder contains unit tests. Run them with composer test. Tests require the GROQ_API_KEY environment variable to be set.
Note: Tests make real API calls to Groq and consume API credits. For this reason, our CI pipeline runs tests only on PHP 8.2. If you need to test with different PHP versions, please do so locally and be mindful of API usage.
Requirements:
- fileinfo extension
- guzzlehttp/guzzle

Contributions are welcome! If you find a bug, have a suggestion, or want to add functionality, please open an issue or submit a pull request.
See CHANGELOG.md for the full changelog.
This package follows SemVer conventions. However, breaking changes may occasionally be released in minor versions; review CHANGELOG.md before upgrading.