- How do I publish a Kafka message in Laravel using this package?
- Use the fluent facade method `Kafka::publish()->onTopic('your-topic')->withBody(['key' => 'value'])->send()`. This follows Laravel’s idiomatic style and handles serialization automatically. For structured data, pair it with a schema registry and a format such as Avro or Protobuf.
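  A fuller publish call might look like the sketch below. The facade namespace is taken from the package’s docs, and the topic name and payload are illustrative; check your installed version for the exact builder methods available:

  ```php
  <?php

  use Junges\Kafka\Facades\Kafka; // namespace assumed from the package docs

  // Publish an order-created event (topic and payload are illustrative).
  Kafka::publish()
      ->onTopic('orders')
      ->withBody([
          'order_id' => 123,
          'status'   => 'created',
      ])
      ->send();
  ```

  The array passed to `withBody()` is serialized for you (JSON by default), so no manual encoding is needed before `send()`.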
- Does Laravel Kafka support consuming messages from Kafka topics?
- Yes. Build a consumer with the fluent `Kafka::consumer()` API, attach a handler, and run it inside a long-running Artisan command or supervised worker. Note that Kafka consumption is pull-based — the client polls the broker — so plan for long-lived consumer processes rather than push delivery.
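  A minimal consumer sketch, typically placed in an Artisan command’s `handle()` method. The namespaces and the `ConsumerMessage` contract are assumptions based on the package’s docs and vary between major versions, so verify against your installed release:

  ```php
  <?php

  use Junges\Kafka\Facades\Kafka;              // assumed facade namespace
  use Junges\Kafka\Contracts\ConsumerMessage;  // contract name varies by version

  // Build a consumer for the 'orders' topic with an inline handler.
  $consumer = Kafka::consumer(['orders'])
      ->withHandler(function (ConsumerMessage $message) {
          // getBody() returns the deserialized payload.
          logger()->info('Received order', (array) $message->getBody());
      })
      ->build();

  // Blocks and polls the broker until the process is stopped.
  $consumer->consume();
  ```

  Because `consume()` blocks, run this under a process supervisor (e.g., Supervisor or systemd) rather than inside a web request.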
- What Laravel versions does this package support?
- Laravel Kafka is designed for Laravel 10+ and requires PHP 8.1+. Check the [documentation](https://laravelkafka.com/) for version-specific setup instructions. The package aligns with Laravel’s latest conventions and testing tools.
- Can I mock Kafka in unit tests without a real broker?
- Absolutely. Use `Kafka::fake()` to simulate publishes and consumes in tests. This eliminates the need for Dockerized Kafka clusters during development, making unit testing as simple as mocking Laravel queues.
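  A hedged test sketch: `Kafka::fake()` is confirmed above, while the `assertPublishedOn()` helper is an assumption from the package’s testing docs; the test class and payload are illustrative:

  ```php
  <?php

  use Junges\Kafka\Facades\Kafka;
  use Tests\TestCase;

  class PublishTest extends TestCase
  {
      public function test_order_event_is_published(): void
      {
          // Swap the real driver for an in-memory fake; no broker needed.
          Kafka::fake();

          // The code under test would publish here, e.g.:
          Kafka::publish()->onTopic('orders')->withBody(['id' => 1])->send();

          // Assertion helper assumed from the package's testing docs.
          Kafka::assertPublishedOn('orders');
      }
  }
  ```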
- How does Laravel Kafka handle serialization? Can I use Avro or Protobuf?
- By default, it uses JSON serialization. For Avro or Protobuf, run a Confluent Schema Registry (for example via the `confluentinc/cp-schema-registry` Docker image) and use a PHP client such as `flix-tech/confluent-schema-registry-api`, wiring a custom serializer through the package’s config. The package itself remains agnostic to schema formats.
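  One way this could look in practice — a hedged sketch in which `usingSerializer()` and the `$avroSerializer` instance (built on the `flix-tech` client) are assumptions to adapt to your setup:

  ```php
  <?php

  use Junges\Kafka\Facades\Kafka;

  // $avroSerializer is assumed to be a serializer you construct around a
  // schema-registry client (e.g. flix-tech/confluent-schema-registry-api).
  Kafka::publish()
      ->onTopic('orders')
      ->withBody($payload)
      ->usingSerializer($avroSerializer) // swap in your Avro/Protobuf serializer
      ->send();
  ```

  The same override point lets you plug in any serializer that matches the package’s serializer contract, which is what keeps it schema-format agnostic.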
- Is this package suitable for high-throughput event processing (e.g., 1M+ messages/hour)?
- Yes, but tuning may be required. Configure consumer groups, partitioning, and librdkafka settings such as `fetch.max.bytes` through the package’s config (which can read values from your `.env`) for optimal performance. For extreme scale, offload stream processing to Kafka Streams or ksqlDB.
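  A minimal `.env` sketch for this kind of tuning. The broker and group-ID variable names follow the package’s published config conventions, while the values are illustrative; lower-level librdkafka options (like `fetch.max.bytes`) are usually passed per consumer or in `config/kafka.php` rather than as dedicated env vars:

  ```dotenv
  # Broker list and consumer group (values are illustrative).
  KAFKA_BROKERS=kafka1:9092,kafka2:9092
  KAFKA_CONSUMER_GROUP_ID=orders-workers
  ```

  Scaling out then means adding more consumer processes in the same group, with Kafka distributing partitions among them.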
- How do I handle failed message production or consumption?
- The package uses exponential backoff for retries by default. For custom logic, implement `KafkaProducer::failed()` or `KafkaConsumer::failed()` callbacks. Failed messages can also be logged or dead-lettered via middleware.
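  For the dead-lettering case, a generic pattern (not the package’s built-in `failed()` callbacks) is to catch failures in the handler and re-publish to a dead-letter topic. The topic names and `processOrder()` helper below are illustrative:

  ```php
  <?php

  use Junges\Kafka\Facades\Kafka;              // assumed facade namespace
  use Junges\Kafka\Contracts\ConsumerMessage;  // contract name varies by version

  $consumer = Kafka::consumer(['orders'])
      ->withHandler(function (ConsumerMessage $message) {
          try {
              processOrder($message->getBody()); // your domain logic
          } catch (\Throwable $e) {
              // Route the failed payload to a dead-letter topic for later review.
              Kafka::publish()
                  ->onTopic('orders-dlq')
                  ->withBody($message->getBody())
                  ->send();
          }
      })
      ->build();
  ```

  Keeping the dead-letter publish inside the handler means the consumer can commit the offset and move on instead of blocking on a poison message.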
- Can I use Laravel Kafka alongside Laravel’s built-in queues?
- Yes, it’s designed to complement or replace Laravel queues. Use Kafka for high-throughput async event streams (e.g., the million-plus messages/hour scale discussed above) while keeping queues for simpler background jobs like emails and notifications. Both can coexist in the same app.
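  Side by side, the split might look like this; `SendWelcomeEmail` is a hypothetical queued job and the Kafka topic and payload are illustrative:

  ```php
  <?php

  use App\Jobs\SendWelcomeEmail;   // hypothetical queued job
  use Junges\Kafka\Facades\Kafka;  // assumed facade namespace

  // Simple one-off background work stays on the Laravel queue...
  SendWelcomeEmail::dispatch($user);

  // ...while high-volume analytics events stream through Kafka.
  Kafka::publish()
      ->onTopic('user-events')
      ->withBody(['event' => 'registered', 'user_id' => $user->id])
      ->send();
  ```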
- What are the infrastructure requirements for running this package?
- You’ll need a Kafka broker (self-hosted, Confluent Cloud, or AWS MSK) and the `rdkafka` PHP extension (PECL) for production. For local development, Dockerized Kafka clusters (e.g., Bitnami or Confluent images) simplify setup.
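  For local development, a single-node Compose file like the sketch below is often enough. It uses the Bitnami image mentioned above in KRaft mode (no ZooKeeper); the listener values are a minimal local setup, so adjust ports and hostnames to your environment:

  ```yaml
  # docker-compose.yml — single-node Kafka for local development (illustrative).
  services:
    kafka:
      image: bitnami/kafka:latest
      ports:
        - "9092:9092"
      environment:
        KAFKA_CFG_NODE_ID: "0"
        KAFKA_CFG_PROCESS_ROLES: "controller,broker"
        KAFKA_CFG_CONTROLLER_QUORUM_VOTERS: "0@kafka:9093"
        KAFKA_CFG_LISTENERS: "PLAINTEXT://:9092,CONTROLLER://:9093"
        KAFKA_CFG_ADVERTISED_LISTENERS: "PLAINTEXT://localhost:9092"
        KAFKA_CFG_CONTROLLER_LISTENER_NAMES: "CONTROLLER"
  ```

  Point `KAFKA_BROKERS` at `localhost:9092` and the package can talk to this container directly.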
- Are there alternatives to Laravel Kafka for Kafka in Laravel?
- Other options include raw `rdkafka` PHP bindings or packages like `php-kafka/kafka`. However, Laravel Kafka stands out for its Laravel-native syntax, built-in testing support (`Kafka::fake()`), and seamless integration with Laravel’s service container and queues.