Weave Code
Helps Laravel developers discover, compare, and choose open-source packages. See popularity, security, maintainers, and scores at a glance to make better decisions.
Prism Laravel Package

prism-php/prism

Prism is a Laravel package for integrating LLMs with a fluent API for text generation, multi-step conversations, and tool usage across multiple AI providers—letting you build AI features without dealing with low-level provider details.

View on GitHub
Deep Wiki
Context7

Prism is a Laravel package for integrating Large Language Models (LLMs) into your apps with a clean, fluent API. It helps you build AI-powered features—like chat, assistants, and automation—while abstracting away provider-specific complexity.

Designed for production Laravel workflows, Prism supports multi-step interactions and tool usage so you can focus on shipping great experiences instead of wiring APIs.

  • Fluent interface for text generation and structured prompting
  • Multi-step conversations with context handling
  • Tooling support to let models call app functions/actions
  • Works with multiple AI providers via a unified API
  • Laravel-friendly DX for building AI features quickly
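As a sketch of the fluent style (exact namespaces and method names vary between Prism versions, so treat this as illustrative), a minimal text-generation call looks like:

```php
<?php

use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;

// Generate text from a single prompt. The provider/model pair is chosen
// per call, so switching providers is a one-line change.
$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-5-sonnet-latest')
    ->withPrompt('Summarize the latest order activity in one sentence.')
    ->asText();

echo $response->text;
```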
Frequently asked questions about Prism
How do I install Prism in a Laravel project?
Run `composer require prism-php/prism` in your project directory. Prism supports Laravel 11+ and PHP 8.2+. After installation, publish the config file with `php artisan vendor:publish --tag=prism-config`, then set your provider credentials via `.env`. The package registers with Laravel’s service container and exposes a `Prism` facade.
Which AI providers does Prism support out of the box?
Prism ships with support for multiple providers, including OpenAI, Anthropic, Ollama, Mistral, and Groq, all behind a unified API so you can switch providers without changing your calling code. You can also add your own provider by implementing Prism’s provider contract and registering it in `config/prism.php`.
Can Prism handle multi-turn conversations with context?
Yes, Prism includes built-in support for multi-turn conversations: pass the full message history to `Prism::text()` via `withMessages()` to maintain context across turns. Conversation state can be stored in Laravel’s cache (e.g., Redis) or a database for persistence between requests.
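A hedged sketch of a multi-turn call, assuming Prism’s message value objects (class names and namespaces may differ by version):

```php
<?php

use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\ValueObjects\Messages\UserMessage;
use Prism\Prism\ValueObjects\Messages\AssistantMessage;

// Replay the prior turns, then append the new user message.
// Persist $history (e.g. in cache or a conversations table) between requests.
$history = [
    new UserMessage('What does spatie/laravel-permission do?'),
    new AssistantMessage('It adds roles and permissions to Eloquent models.'),
    new UserMessage('How do I assign a role to a user?'),
];

$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4o')
    ->withMessages($history)
    ->asText();
```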
How do I use tools or function calling with Prism?
Define tools with Prism’s `Tool` builder, giving each a name, description, typed parameters, and a closure to execute, then register them with `withTools()`. When the LLM requests a tool call, Prism invokes your closure and feeds the result back to the model. Tools are useful for integrating database queries, external APIs, or custom logic into your AI workflows.
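A sketch of tool registration, assuming Prism’s `Tool` builder (exact builder methods are version-dependent); the `search_packages` tool and its lookup logic are hypothetical:

```php
<?php

use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Facades\Tool;

// Hypothetical tool the model can call to look up package info.
$searchTool = Tool::as('search_packages')
    ->for('Search Packagist for a Laravel package by name')
    ->withStringParameter('name', 'The package name to search for')
    ->using(function (string $name): string {
        // In a real app: query Packagist or your own index here.
        return "Top match for '{$name}': spatie/laravel-permission";
    });

$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4o')
    ->withTools([$searchTool])
    ->withMaxSteps(3) // allow the tool call plus the follow-up answer
    ->withPrompt('Find me a package for roles and permissions.')
    ->asText();
```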
What Laravel versions does Prism officially support?
Prism is designed for Laravel 11 and later. It leverages Laravel’s modern features like facades, Artisan commands, and dependency injection, so it is not intended for non-Laravel PHP projects; for plain PHP, use a provider’s official SDK directly.
How do I handle rate limits or token costs in production?
Prism surfaces provider rate-limit errors as exceptions, but monitoring and fallback logic are up to you. Use Laravel queues to offload LLM calls and avoid request timeouts. For cost management, track token usage (reported on each response) with middleware or a custom service and set budget alerts.
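Offloading a Prism call to a queued job is plain Laravel; this sketch assumes a hypothetical `SummarizeReadme` job, with retries and backoff to absorb transient provider rate limits:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;

// Hypothetical job: runs the LLM call off the request cycle.
class SummarizeReadme implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $tries = 3;
    public array $backoff = [10, 30, 60]; // seconds between retries

    public function __construct(private string $readme) {}

    public function handle(): void
    {
        $response = Prism::text()
            ->using(Provider::OpenAI, 'gpt-4o-mini')
            ->withPrompt("Summarize this README:\n\n{$this->readme}")
            ->asText();

        // Persist the summary, and log token usage for cost tracking.
    }
}
```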
Can I use Prism with self-hosted LLMs or local models?
Prism ships with an Ollama provider for local models; point it at your local endpoint in `config/prism.php`. Self-hosted stacks that expose an OpenAI-compatible API (such as vLLM) can often be used by configuring the OpenAI provider with a custom base URL, and Prism’s provider contract lets you implement a custom provider for anything else. Prism Server can additionally expose your Prism setup over OpenAI-compatible endpoints.
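A config fragment for a local endpoint; the key layout here is an assumption, so check the published `config/prism.php` for the exact structure in your version:

```php
<?php

// config/prism.php (fragment); key names may differ between versions.
return [
    'providers' => [
        'ollama' => [
            // Point Prism at a locally running Ollama server.
            'url' => env('OLLAMA_URL', 'http://localhost:11434'),
        ],
    ],
];
```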
What are the alternatives to Prism for Laravel LLM integration?
Alternatives include direct SDK wrappers like `openai-php/laravel` for simpler use cases, or PHP LLM frameworks such as LLPhant for more elaborate pipelines. Prism stands out with its fluent API, tool integration, and Laravel-native DX, making it a good fit for complex AI features like assistants or automation.
How do I test Prism in my Laravel application?
Mock LLM responses with Prism’s built-in testing utilities: `Prism::fake()` lets you queue canned responses so no real API calls are made, and you can assert against the recorded requests. Test multi-step conversations by faking a sequence of responses and asserting on each turn in your PHPUnit or Pest tests.
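A hedged Pest-style sketch using `Prism::fake()`; the `TextResponseFake` helper and `assertCallCount()` assertion are assumed from recent Prism versions and may differ in yours:

```php
<?php

use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\Testing\TextResponseFake;

it('summarizes a package without hitting the provider', function () {
    // Queue a canned response; no real API call is made.
    $fake = Prism::fake([
        TextResponseFake::make()->withText('A roles and permissions package.'),
    ]);

    $response = Prism::text()
        ->using(Provider::OpenAI, 'gpt-4o')
        ->withPrompt('Describe spatie/laravel-permission briefly.')
        ->asText();

    expect($response->text)->toBe('A roles and permissions package.');
    $fake->assertCallCount(1);
});
```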
Is Prism secure for handling sensitive data in prompts?
Prism sends your prompts to the configured provider verbatim, so redact or mask sensitive data before it reaches a prompt. In production, combine that with Laravel policies to restrict who can trigger LLM calls, and keep provider credentials in `.env` rather than in code. Always sanitize dynamic data before interpolating it into prompts.
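Prompt redaction itself is plain PHP; a minimal sketch (the patterns and placeholders are illustrative, not an exhaustive PII filter) that scrubs emails and long digit runs before they reach a prompt:

```php
<?php

// Strip obvious PII before interpolating user data into a prompt.
// Patterns here are illustrative, not an exhaustive PII filter.
function redactForPrompt(string $text): string
{
    // Email addresses.
    $text = preg_replace('/[\w.+-]+@[\w-]+\.[\w.]+/', '[redacted-email]', $text);
    // Long digit runs (card numbers, phone numbers, IDs).
    $text = preg_replace('/\d{7,}/', '[redacted-number]', $text);
    return $text;
}

echo redactForPrompt('Contact jane@example.com, card 4111111111111111');
// -> Contact [redacted-email], card [redacted-number]
```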