jaybizzle/crawler-detect
Detect bots/crawlers/spiders in PHP by matching User-Agent and HTTP_FROM headers. CrawlerDetect recognizes thousands of known crawlers, lets you check the current request or a provided UA string, and returns the matched bot name.
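The core API described above is small: instantiate `CrawlerDetect`, then call `isCrawler()` against the current request's headers or a user-agent string you pass in, and `getMatches()` for the matched bot name. A minimal sketch (the UA string below is an illustrative Googlebot example):

```php
<?php

require 'vendor/autoload.php';

use Jaybizzle\CrawlerDetect\CrawlerDetect;

$detect = new CrawlerDetect;

// Check the current request's headers (when running under a web server).
if ($detect->isCrawler()) {
    echo 'Matched bot: ' . $detect->getMatches();
}

// Or check an arbitrary user-agent string explicitly.
$ua = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';
if ($detect->isCrawler($ua)) {
    echo $detect->getMatches(); // name of the matched crawler pattern
}
```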
Cost Efficiency & Time-to-Market:
Feature Expansion Roadmap:
- Block SEO audit crawlers (e.g., AhrefsBot) from sensitive endpoints (e.g., /api/v1/pricing).
- Allow search engine bots (e.g., Googlebot) to fetch /sitemap.xml while throttling others.
- Apply tiered rate limits (e.g., 100 RPS for bots, 10,000 RPS for humans).
- Encapsulate detection in reusable route middleware (e.g., a CrawlerDetectFilter).
- Serve lightweight responses to crawlers (e.g., Bingbot) via Blade middleware or response modifiers.
- Mitigate headless browsers (e.g., headless Chrome) by integrating with Laravel's fail2ban or IP blocking middleware.
- Attach CrawlerDetect to Laravel's auth middleware (e.g., CrawlerDetectMiddleware::class).
- Skip non-essential scripts for bots in Blade templates (e.g., @if(!crawler()) <script defer>...</script> @endif).
- Cache responses for known crawlers (e.g., Googlebot) to reduce API costs (e.g., AWS Lambda savings of ~30%).

Use Cases:
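The middleware idea above can be sketched as a small Laravel route middleware. The class name `CrawlerDetectFilter` and the route wiring are illustrative assumptions, not part of the package; only the `CrawlerDetect` constructor and `isCrawler()` call come from the library itself:

```php
<?php
// Hypothetical Laravel middleware; class name and 403 policy are illustrative.

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Jaybizzle\CrawlerDetect\CrawlerDetect;

class CrawlerDetectFilter
{
    public function handle(Request $request, Closure $next)
    {
        // Feed the request's own headers and UA string to the detector.
        $detect = new CrawlerDetect($request->headers->all(), $request->userAgent());

        // Reject any detected crawler before it reaches the controller.
        if ($detect->isCrawler()) {
            abort(403, 'Automated clients are not permitted on this endpoint.');
        }

        return $next($request);
    }
}
```

It could then be attached to a sensitive route, e.g. `Route::middleware(CrawlerDetectFilter::class)->get('/api/v1/pricing', ...)`.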
- Block abusive scrapers (e.g., caam crwlr) while allowing SEO crawlers to maintain search rankings.
- Identify synthetic monitoring traffic (e.g., RuxitSynthetic) using Laravel validation middleware.
- Filter uptime monitors (e.g., 1Pilot) and scrapers to preserve bandwidth.

Adopt if:
Look Elsewhere if:
For Executives: "CrawlerDetect is a turnkey solution to eliminate bot traffic waste, reducing API costs by ~30% and protecting high-value endpoints from scrapers. With zero development overhead, it integrates seamlessly into Laravel, enabling immediate security and performance gains—like blocking price scrapers while allowing SEO crawlers. This aligns with our goals to cut operational costs and improve reliability without disrupting existing workflows."
For Engineering Teams: *"This package provides a maintained, battle-tested crawler detection library with 1,000+ pre-configured patterns, saving us months of development time. We can deploy it as Laravel middleware to:
- Block scrapers (e.g., AhrefsBot) from sensitive APIs."*

For Security Teams: *"CrawlerDetect hardens our Laravel app against automated abuse by identifying and mitigating scrapers, headless browsers, and monitoring tools. We can:
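The `@if(!crawler())` Blade check mentioned earlier presumes a small global helper; the package does not ship one, so a sketch of such a helper (the `crawler()` name is an assumption for illustration) might look like:

```php
<?php
// Hypothetical global helper backing an @if(!crawler()) Blade check.
// The function name is illustrative and not part of the package.

use Jaybizzle\CrawlerDetect\CrawlerDetect;

if (! function_exists('crawler')) {
    function crawler(): bool
    {
        // Reuse one detector instance per request to avoid re-parsing patterns.
        static $detect;
        $detect ??= new CrawlerDetect;

        return $detect->isCrawler();
    }
}
```

In a Blade template this allows e.g. `@if(!crawler()) <script defer src="/js/analytics.js"></script> @endif`, so bots skip non-essential assets.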