Code Weaver
Helps Laravel developers discover, compare, and choose open-source packages. See popularity, security, maintainers, and scores at a glance to make better decisions.
Laravel AI Package

laravel/ai

Laravel AI SDK offers a unified, Laravel-friendly API for OpenAI, Anthropic, Gemini, and more. Build agents with tools and structured output, generate images, synthesize/transcribe audio, and create embeddings—all through one consistent interface.

View on GitHub
Deep Wiki
Context7

Laravel AI is a Laravel-native SDK that provides a unified, expressive API for working with multiple AI providers (OpenAI, Anthropic, Gemini, and more). It lets you build AI-driven features using a consistent interface that fits naturally into Laravel applications.

Use it to create intelligent workflows—from agents with tools and structured output to media generation and embeddings—without rewriting integrations per provider.

  • Multi-provider abstraction with a consistent Laravel-friendly API
  • Build agents, tool calling, and structured responses
  • Generate images and work with audio (synthesis/transcription)
  • Create vector embeddings for search and retrieval use cases
  • Designed for clean integration into modern Laravel projects
Frequently asked questions about Laravel AI
How do I install Laravel AI in a Laravel 10+ project?
Run `composer require laravel/ai` in your project directory. The package's service provider is registered automatically via Laravel's package auto-discovery. Ensure PHP 8.3+ and Laravel 10+ are installed, as these are strict requirements. After installation, configure your AI provider keys in `config/ai.php`.
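The exact contents of `config/ai.php` depend on the version you install; the sketch below only illustrates the shape a provider-key section might take, and every key name in it is an assumption rather than a documented option:

```php
<?php

// config/ai.php — illustrative sketch only; the key names shipped
// by laravel/ai may differ from the ones assumed here.
return [
    // Provider to use when none is specified explicitly.
    'default' => env('AI_PROVIDER', 'openai'),

    'providers' => [
        'openai' => [
            'api_key' => env('OPENAI_API_KEY'),
        ],
        'anthropic' => [
            'api_key' => env('ANTHROPIC_API_KEY'),
        ],
        'gemini' => [
            'api_key' => env('GEMINI_API_KEY'),
        ],
    ],
];
```

Whatever the real keys turn out to be, keep the secrets themselves in `.env` and out of version control.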
Which AI providers are supported out of the box, and can I add custom ones?
Laravel AI natively supports OpenAI, Anthropic, and Gemini. You can extend it with custom providers by implementing the `Provider` contract and registering them in the service container. The package includes a `Failover` mechanism to switch between providers if one fails.
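Concretely, wiring up a custom provider might look like the sketch below. The `Provider` contract name comes from the answer above; its namespace, method list, and the binding key are guesses for illustration only — check the package source for the actual interface before implementing:

```php
<?php

use Illuminate\Support\ServiceProvider;
// Contract name taken from the FAQ answer; the namespace is assumed.
use Laravel\Ai\Contracts\Provider;

class LocalModelProvider implements Provider
{
    // Implement whatever methods the real contract requires,
    // e.g. sending chat/completion requests to your endpoint.
}

class LocalModelServiceProvider extends ServiceProvider
{
    public function register(): void
    {
        // Bind the custom provider into the container so it can be
        // resolved alongside the built-in ones. The binding key used
        // here is hypothetical.
        $this->app->bind('ai.providers.local', LocalModelProvider::class);
    }
}
```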
How do I create an AI agent with tools in Laravel AI?
Define an agent using the `Agent` facade, then register tools with `tools()` and specify their schemas. For example, `Agent::create()->tools([new MyTool()])->call()`. Tools can interact with databases, APIs, or other services. Structured output schemas ensure responses match expected formats.
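Expanding the one-liner from the answer into a fuller sketch: the `Agent::create()->tools([...])->call()` chain is quoted from the answer above, while the facade namespace and the tool class shape are assumptions — consult the package docs for the real tool definition format:

```php
<?php

use Laravel\Ai\Facades\Agent; // namespace assumed for illustration

// A hypothetical tool the agent may invoke.
class WeatherTool
{
    public function name(): string
    {
        return 'get_weather';
    }

    public function handle(string $city): string
    {
        // Call an internal service or external API here.
        return "Sunny in {$city}";
    }
}

// Pattern quoted from the FAQ answer: create an agent, register
// tools, then call it with a prompt.
$response = Agent::create()
    ->tools([new WeatherTool()])
    ->call('What is the weather in Oslo?');
```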
Can Laravel AI handle streaming responses for chatbots or real-time applications?
Yes, Laravel AI supports streaming responses via the `stream()` method on agents or chat models. For Laravel Octane (Swoole/RoadRunner), ensure proper buffering to avoid timeouts. Streaming is ideal for chat interfaces or live AI interactions where partial responses are useful.
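In a controller, a streamed agent response could be forwarded to the browser with Laravel's standard streamed-response helper. `response()->stream()` is core Laravel; the `stream()` iterator yielding text chunks is an assumption based on the answer above:

```php
<?php

use Illuminate\Support\Facades\Route;
use Laravel\Ai\Facades\Agent; // namespace assumed

Route::get('/chat', function () {
    return response()->stream(function () {
        // Adapt this loop to the package's real streaming API.
        foreach (Agent::create()->stream('Tell me a story') as $chunk) {
            echo $chunk;
            // Flush each chunk immediately so the client sees
            // partial output instead of one buffered payload.
            ob_flush();
            flush();
        }
    }, 200, [
        'Content-Type'      => 'text/event-stream',
        'X-Accel-Buffering' => 'no', // disable proxy buffering
    ]);
});
```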
How do I manage API rate limits and avoid throttling with multiple providers?
Use the `withFailover()` method to define fallback providers if one is throttled. Implement retry logic with `retry()` or leverage Laravel’s queue system for delayed retries. Monitor token usage and set provider-specific rate limits in `config/ai.php` to prevent cost overruns.
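Combining the calls mentioned in the answer into one sketch: `withFailover()` and `retry()` are the method names quoted above, but their argument shapes here are illustrative guesses, not documented signatures:

```php
<?php

use Laravel\Ai\Facades\Agent; // namespace assumed

$response = Agent::create()
    // Fall back to these providers if the default one is throttled.
    ->withFailover(['anthropic', 'gemini'])
    // Retry transient failures before giving up; argument names
    // are hypothetical.
    ->retry(times: 3, sleepMilliseconds: 500)
    ->call('Summarise this support ticket.');
```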
Does Laravel AI support vector embeddings for semantic search?
Yes, generate embeddings with `Embedding::create()->run()`. Store vectors in Laravel Scout or a dedicated vector database like Pinecone or Weaviate. The package integrates with Laravel's query builder for similarity searches, making it well suited to AI-powered search or recommendation systems.
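A sketch of embedding a query and ranking stored vectors by cosine similarity in plain PHP: the `Embedding::create()->run()` call shape is quoted from the answer above (passing the text to `create()` is an assumption), and `$documents` is a hypothetical pre-computed array of `id => ['vector' => [...]]` entries:

```php
<?php

use Laravel\Ai\Facades\Embedding; // namespace assumed

// Cosine similarity between two equal-length vectors.
function cosine(array $a, array $b): float
{
    $dot = $na = $nb = 0.0;
    foreach ($a as $i => $v) {
        $dot += $v * $b[$i];
        $na  += $v * $v;
        $nb  += $b[$i] * $b[$i];
    }

    return $dot / (sqrt($na) * sqrt($nb));
}

// Embed the query, then rank pre-computed document vectors.
$query = Embedding::create('How do I reset my password?')->run();

$topMatches = collect($documents)
    ->sortByDesc(fn (array $doc) => cosine($query, $doc['vector']))
    ->take(5);
```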
How do I persist AI conversations or tool calls to a database?
Use the `Conversation` and `ToolCall` Eloquent models to log interactions. Enable persistence by setting `enable_conversation_logging` to `true` in `config/ai.php`. This is useful for auditing, debugging, or rebuilding stateful agents later.
What’s the best way to handle sensitive data in AI prompts or responses?
Avoid logging or storing sensitive prompts/responses directly. Use middleware to redact PII or encrypt data before sending to providers. For compliance (e.g., GDPR), disable conversation logging or use a dedicated secure storage system for AI interactions.
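A redaction middleware of the kind this answer describes could look like the following. The middleware signature is standard Laravel; the `prompt` field name and the email regex are placeholder examples, not a complete PII strategy:

```php
<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;

// Strips obvious PII (here: email addresses) from the prompt field
// before it reaches any AI provider call further down the stack.
// Extend the patterns to cover your own compliance requirements.
class RedactPii
{
    public function handle(Request $request, Closure $next)
    {
        if ($request->has('prompt')) {
            $redacted = preg_replace(
                '/[\w.+-]+@[\w-]+\.[\w.]+/',
                '[REDACTED_EMAIL]',
                $request->input('prompt')
            );

            $request->merge(['prompt' => $redacted]);
        }

        return $next($request);
    }
}
```

Register it on the routes that forward user input to AI providers, so redaction happens before any logging or outbound request.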
Can I use Laravel AI with local AI models like Ollama or LM Studio?
Currently, Laravel AI is optimized for cloud providers (OpenAI, Anthropic, etc.). For local models, consider wrapping their APIs in a custom `Provider` implementation. The package’s contract-first design makes this feasible, though you’ll need to handle offline mode and latency manually.
Are there alternatives to Laravel AI for multi-provider AI integrations?
Alternatives include direct provider SDKs (e.g., OpenAI’s PHP SDK) or frameworks like LangChain or Haystack. However, Laravel AI offers a Laravel-native, unified interface with built-in tooling (agents, embeddings, streaming) that reduces boilerplate. It’s ideal if you want deep Laravel integration without vendor lock-in.