AI Laravel Package

laravel/ai

Laravel AI SDK offers a unified, Laravel-friendly API for OpenAI, Anthropic, Gemini, and more. Build agents with tools and structured output, generate images, synthesize/transcribe audio, and create embeddings—all through one consistent interface.

View on GitHub
Deep Wiki
Context7

Getting Started

Minimal Steps

  1. Installation:

    composer require laravel/ai
    

    Publish the config file:

    php artisan vendor:publish --provider="Laravel\AI\AIServiceProvider" --tag="ai-config"
    
  2. Configure Providers: Edit config/ai.php and add your preferred AI provider (e.g., OpenAI, Anthropic, Gemini) with API keys:

    'providers' => [
        'openai' => [
            'key' => env('OPENAI_KEY'),
            'model' => 'gpt-4o',
        ],
    ],
    
  3. First Use Case: Generate a simple chat response:

    use Laravel\AI\Facades\AI;
    
    $response = AI::send('What is Laravel?', 'openai');
    echo $response->content;
    
  4. Check Documentation: Review the official Laravel AI SDK docs for provider-specific configurations and advanced features.


Implementation Patterns

Core Workflows

1. Chat Completions

Use the send() method for synchronous responses or stream() for real-time output:

// Synchronous
$response = AI::send('Summarize this: ' . $longText, 'openai');

// Streaming
AI::stream('Explain Laravel AI', 'openai')
    ->then(function ($chunk) {
        echo $chunk->content;
    });

2. Tools and Agents

Define tools and create agents for multi-step workflows:

use Laravel\AI\Agents\Agent;
use Laravel\AI\Tools\Tool;

$agent = new Agent('openai', 'gpt-4o');
$agent->addTools([
    new Tool('get_user', fn($userId) => User::findOrFail($userId)),
]);

$response = $agent->call('Find user 1 and return their name');

Middleware Integration: Use the make:agent-middleware Artisan command to create middleware for pre/post-processing:

php artisan make:agent-middleware ValidateUser

3. Embeddings

Generate vector embeddings for semantic search or similarity:

$embedding = AI::embeddings(['Your text here'], 'openai');
$vector = $embedding->first()->embedding;
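Once you have vectors, ranking them by cosine similarity is the usual next step for semantic search. A self-contained sketch in plain PHP (this helper is not part of the SDK; it assumes non-zero vectors of equal length):

```php
<?php

// Cosine similarity between two embedding vectors: the dot product
// divided by the product of the vector magnitudes.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;
    foreach ($a as $i => $value) {
        $dot += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }
    return $dot / (sqrt($normA) * sqrt($normB));
}

// Identical vectors score 1.0; orthogonal vectors score 0.0.
var_dump(cosineSimilarity([1.0, 0.0], [1.0, 0.0])); // float(1)
var_dump(cosineSimilarity([1.0, 0.0], [0.0, 1.0])); // float(0)
```

Scores closer to 1.0 mean the texts behind the vectors are more semantically similar.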

4. Audio and Image Generation

Convert text to speech or generate images:

// Text-to-speech
AI::speech('Hello world', 'openai')
    ->save('output.mp3');

// Image generation
AI::image('A sunset over mountains', 'openai')
    ->save('sunset.png');

5. Conversations

Maintain stateful conversations with users:

$conversation = AI::conversation('openai')
    ->remember()
    ->user('user123')
    ->speak('Hello!');

$response = $conversation->reply('Hi there!');

Integration Tips

Service Providers

Bind custom providers or extend existing ones:

AI::extend('custom-provider', function () {
    return new CustomAIProvider();
});

Failover and Fallbacks

Configure failover for providers (e.g., switch to a backup if the primary fails):

AI::withFailover(['openai', 'anthropic'])
    ->send('Your prompt');

Broadcasting Streams

Broadcast AI responses in real-time using Laravel Echo:

AI::stream('Your prompt')
    ->broadcast();

Middleware

Apply middleware to agents for validation, logging, or rate-limiting:

$agent->pipe(new ValidateUserMiddleware());

Testing

Use fakes or facade mocks in unit tests (pick one approach per test, not both):

AI::fake();

// Or, with a Mockery expectation on the facade:
AI::shouldReceive('send')->andReturn(new Response('Mocked reply'));

Gotchas and Tips

Pitfalls

  1. Provider-Specific Quirks:

    • OpenAI: Ensure providerOptions are correctly passed for tools (e.g., strict: true).
    • Anthropic: Use the direct anthropic gateway for Messages API to avoid compatibility issues.
    • Gemini: Note that image_size values are case-sensitive (e.g., HD vs. hd).
  2. Streaming Issues:

    • Under Octane, ensure streaming generators are properly configured. Use ->toResponse() for HTTP responses.
    • Avoid memory leaks in long-running streams by closing resources explicitly.
  3. Tool Call Failures:

    • Validate tool schemas strictly. Missing additionalProperties or incorrect types (e.g., nested objects) may cause silent failures.
    • Use ->strict(true) on tools to enforce schema validation.
  4. Conversations and State:

    • Conversation leakage: Ensure user scoping (e.g., ->user()) is applied to isolate user-specific conversations.
    • Tool call history: Verify tool calls are round-tripped correctly in the conversation context.
  5. File Handling:

    • Audio/Transcriptions: Specify correct file extensions (e.g., .mp3 for speech, .wav for transcription inputs).
    • Image generation: Use ->size('1024x1024') for explicit dimensions.
  6. Rate Limiting:

    • Implement HandlesRateLimiting middleware for providers with strict quotas.
    • Monitor usage in responses to avoid unexpected failures.
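The client-side throttling described in the rate-limiting pitfall can be sketched as a token bucket in plain PHP. This is a generic illustration, not the HandlesRateLimiting middleware itself; the class name and API are hypothetical:

```php
<?php

// A minimal client-side token bucket for throttling calls to providers
// with strict quotas. Tokens refill continuously at a fixed rate, up to
// a burst capacity; each call consumes one token.
final class TokenBucket
{
    private float $tokens;
    private float $lastRefill;

    public function __construct(
        private readonly int $capacity,      // max burst size
        private readonly float $refillRate   // tokens added per second
    ) {
        $this->tokens = $capacity;
        $this->lastRefill = microtime(true);
    }

    public function allow(): bool
    {
        $now = microtime(true);
        // Refill proportionally to elapsed time, capped at capacity.
        $this->tokens = min(
            $this->capacity,
            $this->tokens + ($now - $this->lastRefill) * $this->refillRate
        );
        $this->lastRefill = $now;

        if ($this->tokens >= 1.0) {
            $this->tokens -= 1.0;
            return true;
        }
        return false;
    }
}

$bucket = new TokenBucket(capacity: 3, refillRate: 1.0);
// The first three calls fit the burst; the fourth is throttled.
var_dump($bucket->allow(), $bucket->allow(), $bucket->allow(), $bucket->allow());
```

Before sending a request, call allow() and back off (or queue the prompt) when it returns false, rather than letting the provider reject the call.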

Debugging Tips

  1. Enable Debugging: Set AI_DEBUG=true in .env to log raw API responses and errors.

  2. Log Provider Responses: Use AI::debug() to inspect the last request/response:

    AI::debug()->send('Your prompt');
    
  3. Common Errors:

    • PrismException: Check for malformed tool schemas or missing API keys.
    • PendingResponse failures: Ensure failover providers are correctly configured.
    • Streaming timeouts: Increase timeout in provider options (e.g., ->timeout(30)).
  4. Schema Validation: For structured outputs, validate schemas with:

    AI::validateSchema([
        'property' => 'string',
    ]);
    

Extension Points

  1. Custom Providers: Implement the Provider contract to support new AI services:

    class CustomProvider implements Provider {
        public function send($message, $model = null) {
            // Implement custom logic
        }
    }
    
  2. Middleware: Create reusable middleware for agents:

    class LogAgentRequests {
        public function handle($request, $next) {
            // Log::info() expects an array of context values, not a string.
            Log::info('Agent request:', ['prompt' => $request->prompt]);
            return $next($request);
        }
    }
    
  3. Macros: Add helper methods to the AI facade:

    AI::macro('summarize', function ($text) {
        return AI::send("Summarize this: $text", 'openai');
    });
    
  4. Events: Listen to AI events for analytics or logging:

    AI::on('agent.prompted', function ($event) {
        // Log or process the event
    });
    
  5. Testing Helpers: Use AI::fake() to mock responses in tests:

    AI::fake([
        'openai' => 'I am a fake response',
    ]);
    

Configuration Quirks

  1. Provider Options:

    • Pass provider-specific options via ->options():
      AI::send('Prompt', 'openai')
          ->options(['temperature' => 0.5]);
      
    • For tools, use ->providerOptions():
      $tool->providerOptions(['strict' => true]);
      
  2. Base URLs: Override default provider URLs (e.g., for self-hosted OpenAI):

    'openai' => [
        'key' => env('OPENAI_KEY'),
        'url' => 'https://your-openai-instance.com/v1',
    ],
    
  3. Default Models: Configure default models per provider in config/ai.php:

    'defaults' => [
        'openai' => 'gpt-4o',
        'anthropic' => 'claude-3-opus',
    ],
    
  4. Timeouts: Set timeouts for long-running requests (e.g., audio processing):

    AI::speech('Text', 'openai')
        ->timeout(60); // 60 seconds
    

Performance Tips

  1. Caching: Cache embeddings or frequent responses:

    $embedding = Cache::remember("embedding:$text", now()->addHours(1), function () {
        return AI::embeddings([$text], 'openai')->first()->embedding;
    });
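One caveat with keys like "embedding:$text": raw text can be long and may contain characters that some cache stores reject. Hashing the text first gives a stable, fixed-length key. A plain-PHP sketch (the "embedding:" prefix and helper name are illustrative):

```php
<?php

// Build a cache key from arbitrary text by hashing it; sha256 in hex
// always yields 64 characters, so the key length is predictable.
function embeddingCacheKey(string $text): string
{
    return 'embedding:' . hash('sha256', $text);
}

echo embeddingCacheKey('Your text here'), PHP_EOL;
```

Use the helper in place of the raw interpolated key, e.g. Cache::remember(embeddingCacheKey($text), ...).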
    