- How do I install Laravel AI in a Laravel 10+ project?
- Run `composer require laravel/ai` in your project directory. The package registers itself through Laravel's package auto-discovery, so no manual service-provider registration is needed. Ensure PHP 8.3+ and Laravel 10+ are installed, as these are strict requirements. After installation, configure your AI provider keys in `config/ai.php`.
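  A minimal `config/ai.php` might look like the sketch below. The key names and structure here are illustrative assumptions, not the package's published config; check the actual file after installation:

  ```php
  <?php

  // config/ai.php — illustrative layout; the published config may differ.
  return [
      // Default provider used when none is specified explicitly.
      'default' => env('AI_PROVIDER', 'openai'),

      // Per-provider credentials, read from the environment so keys
      // never live in version control.
      'providers' => [
          'openai'    => ['key' => env('OPENAI_API_KEY')],
          'anthropic' => ['key' => env('ANTHROPIC_API_KEY')],
          'gemini'    => ['key' => env('GEMINI_API_KEY')],
      ],
  ];
  ```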
- Which AI providers are supported out of the box, and can I add custom ones?
- Laravel AI natively supports OpenAI, Anthropic, and Gemini. You can extend it with custom providers by implementing the `Provider` contract and registering them in the service container. The package includes a `Failover` mechanism to switch between providers if one fails.
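  A custom provider might be sketched as follows. The contract's namespace and method signature are assumptions for illustration; consult the package's actual `Provider` contract before implementing:

  ```php
  <?php

  use Illuminate\Support\Facades\Http;
  use Laravel\Ai\Contracts\Provider; // assumed namespace

  // Hypothetical provider wrapping a local HTTP model endpoint.
  class LocalModelProvider implements Provider
  {
      public function complete(string $prompt): string
      {
          // Forward the prompt to the model's HTTP API.
          $response = Http::post('http://localhost:11434/api/generate', [
              'model'  => 'llama3',
              'prompt' => $prompt,
          ]);

          return $response->json('response');
      }
  }

  // In a service provider's register() method, bind the contract:
  // $this->app->bind(Provider::class, LocalModelProvider::class);
  ```

  The same pattern applies to any provider the package doesn't ship with: implement the contract, bind it in the container, and the rest of the package's tooling can resolve it.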
- How do I create an AI agent with tools in Laravel AI?
- Define an agent using the `Agent` facade, then register tools with `tools()` and specify their schemas. For example, `Agent::create()->tools([new MyTool()])->call()`. Tools can interact with databases, APIs, or other services. Structured output schemas ensure responses match expected formats.
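  Expanding the one-liner above into a small sketch: the `Agent::create()->tools()->call()` chain is taken from the answer, while the tool class's method names (`name()`, `description()`, `handle()`) are assumptions about how a tool defines its schema:

  ```php
  <?php

  use Laravel\Ai\Facades\Agent; // assumed namespace

  // Hypothetical tool: the agent can call this to look up weather.
  class WeatherTool
  {
      public function name(): string
      {
          return 'get_weather';
      }

      public function description(): string
      {
          return 'Look up the current weather for a city.';
      }

      public function handle(string $city): string
      {
          // A real tool would query a database, API, or other service.
          return "Sunny in {$city}";
      }
  }

  $response = Agent::create()
      ->tools([new WeatherTool()])
      ->call('What is the weather in Lisbon?');
  ```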
- Can Laravel AI handle streaming responses for chatbots or real-time applications?
- Yes, Laravel AI supports streaming responses via the `stream()` method on agents or chat models. For Laravel Octane (Swoole/RoadRunner), ensure proper buffering to avoid timeouts. Streaming is ideal for chat interfaces or live AI interactions where partial responses are useful.
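  A streamed chat endpoint could look like this sketch. It assumes `stream()` returns an iterable of text chunks; adapt to the package's actual streaming interface:

  ```php
  <?php

  use Illuminate\Support\Facades\Route;
  use Laravel\Ai\Facades\Agent; // assumed namespace

  Route::get('/chat', function () {
      return response()->stream(function () {
          foreach (Agent::create()->stream('Tell me a story') as $chunk) {
              echo $chunk;

              if (ob_get_level() > 0) {
                  ob_flush(); // drain any PHP output buffer first
              }
              flush(); // push the partial chunk to the client immediately
          }
      }, 200, ['Content-Type' => 'text/event-stream']);
  });
  ```

  The explicit `flush()` calls are what make partial responses reach the browser as they arrive rather than all at once when the closure finishes.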
- How do I manage API rate limits and avoid throttling with multiple providers?
- Use the `withFailover()` method to define fallback providers when one is throttled. Implement retry logic with Laravel's `retry()` helper, or push requests onto Laravel's queue system for delayed retries. Monitor token usage, and set provider-specific rate limits in `config/ai.php` to stay under quota and prevent cost overruns.
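  Combining the two approaches: `withFailover()` is the package method described above, while `retry()` is Laravel's built-in helper. The provider names passed to `withFailover()` are illustrative:

  ```php
  <?php

  use Laravel\Ai\Facades\Agent; // assumed namespace

  // Retry up to 3 times, waiting 500 ms between attempts; on each
  // attempt, fall back from OpenAI to Anthropic if the first is
  // throttled or unavailable.
  $response = retry(3, function () {
      return Agent::create()
          ->withFailover(['openai', 'anthropic'])
          ->call('Summarize this document.');
  }, sleepMilliseconds: 500);
  ```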
- Does Laravel AI support vector embeddings for semantic search?
- Yes, generate embeddings with `Embedding::create()->run()`. Store vectors via Laravel Scout or in a dedicated vector database such as Pinecone or Weaviate. The package integrates with Laravel's query builder for similarity searches, making it well suited to AI-powered search or recommendation systems.
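  A minimal semantic-search sketch: the `Embedding::create()->run()` call is taken from the answer above, the cosine-similarity helper is plain PHP, and the `Document` model (with an array `embedding` attribute) is a hypothetical example:

  ```php
  <?php

  use Laravel\Ai\Facades\Embedding; // assumed namespace

  // Embed the user's query.
  $vector = Embedding::create()->run('How do I reset my password?');

  // Standard cosine similarity between two equal-length vectors.
  function cosineSimilarity(array $a, array $b): float
  {
      $dot = $normA = $normB = 0.0;

      foreach ($a as $i => $v) {
          $dot   += $v * $b[$i];
          $normA += $v ** 2;
          $normB += $b[$i] ** 2;
      }

      return $dot / (sqrt($normA) * sqrt($normB));
  }

  // Rank stored documents by similarity to the query vector.
  // In production, a vector database does this ranking server-side
  // instead of loading every row into memory.
  $results = Document::all()
      ->sortByDesc(fn ($doc) => cosineSimilarity($vector, $doc->embedding))
      ->take(5);
  ```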
- How do I persist AI conversations or tool calls to a database?
- Use the `Conversation` and `ToolCall` Eloquent models to log interactions. Enable persistence by setting `enable_conversation_logging` to `true` in `config/ai.php`. This is useful for auditing, debugging, or rebuilding stateful agents later.
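  Putting the pieces together: the config key and the `Conversation`/`ToolCall` model names come from the answer above, while the relationship and column names (`toolCalls`, `messages`, `role`, `content`) are assumptions for illustration:

  ```php
  <?php

  // In config/ai.php, turn persistence on:
  // 'enable_conversation_logging' => true,

  use Laravel\Ai\Models\Conversation; // assumed namespace

  // Later, replay or audit the most recent conversation.
  $conversation = Conversation::with('toolCalls')->latest()->first();

  foreach ($conversation->messages as $message) {
      logger()->info("[{$message->role}] {$message->content}");
  }
  ```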
- What’s the best way to handle sensitive data in AI prompts or responses?
- Avoid logging or storing sensitive prompts/responses directly. Use middleware to redact PII or encrypt data before sending to providers. For compliance (e.g., GDPR), disable conversation logging or use a dedicated secure storage system for AI interactions.
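  As a concrete example of the redaction step, here is a plain-PHP sketch that strips email addresses from a prompt before it leaves your application. The regex is deliberately naive and the function is not part of the package; real PII redaction needs broader patterns (phone numbers, IDs, names) or a dedicated service:

  ```php
  <?php

  // Hypothetical helper: replace email addresses with a placeholder
  // before the prompt is sent to any provider.
  function redactPii(string $prompt): string
  {
      return preg_replace(
          '/[\w.+-]+@[\w-]+\.[\w.]+/', // naive email pattern
          '[REDACTED_EMAIL]',
          $prompt
      );
  }

  $safePrompt = redactPii('Contact jane@example.com about the invoice.');
  // $safePrompt no longer contains the address
  ```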
- Can I use Laravel AI with local AI models like Ollama or LM Studio?
- Currently, Laravel AI is optimized for cloud providers (OpenAI, Anthropic, etc.). For local models, consider wrapping their APIs in a custom `Provider` implementation. The package’s contract-first design makes this feasible, though you’ll need to handle offline mode and latency manually.
- Are there alternatives to Laravel AI for multi-provider AI integrations?
- Alternatives include direct provider SDKs (e.g., OpenAI’s PHP SDK) or frameworks like LangChain or Haystack. However, Laravel AI offers a Laravel-native, unified interface with built-in tooling (agents, embeddings, streaming) that reduces boilerplate. It’s ideal if you want deep Laravel integration without vendor lock-in.