spatie/laravel-robots-middleware
Laravel middleware that controls search engine indexing by setting the X-Robots-Tag response header per request. Extend the base middleware and implement shouldIndex() to allow or block indexing for specific requests (e.g., disable indexing on admin routes) without changing your views.
The package injects robots.txt-like directives as response headers without modifying core routing logic. Integration is a one-line registration in app/Http/Kernel.php, requiring minimal deviation from standard Laravel practices. Out of the box the behavior is coarse (all/none), and robots directives are hardcoded unless you add dynamic logic. Questions to settle before adopting it: should rules be static (e.g., none for all /admin routes) or dynamic (e.g., per-user or per-request)? Should it complement or replace robots `<meta>` tags (an alternative approach)? Do any proxies or CDNs rewrite robots headers? Audit existing robots.txt and `<meta>` tag implementations (if any) for conflicts. Install with composer require spatie/laravel-robots-middleware, then register the middleware in app/Http/Kernel.php:
```php
protected $middleware = [
    \Spatie\RobotsMiddleware\RobotsMiddleware::class,
];
```
Run php artisan route:list to verify the middleware is applied to your routes. To allow indexing everywhere except specific paths (e.g., /admin/*), extend RobotsMiddleware and implement shouldIndex().
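A minimal sketch of the extension point described above, assuming the shouldIndex() contract from the package summary; the class name MyRobotsMiddleware and the /admin path rule are illustrative:

```php
<?php

namespace App\Http\Middleware;

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\RobotsMiddleware;

// Illustrative subclass: blocks indexing for /admin/* while allowing
// everything else. Register this class in place of the base middleware.
class MyRobotsMiddleware extends RobotsMiddleware
{
    /**
     * Returning false emits "X-Robots-Tag: none", true emits "all";
     * a string return value is used verbatim as the header value.
     *
     * @return string|bool
     */
    protected function shouldIndex(Request $request)
    {
        return $request->segment(1) !== 'admin';
    }
}
```

Because the rule receives the full Request, the same hook supports dynamic decisions (per-user, per-locale) as well as static path checks.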
Risks to watch: version conflicts (check composer.json constraints); reverse proxies stripping the header (e.g., an nginx proxy_hide_header X-Robots-Tag rule removes X-Robots-Tag before crawlers ever see it); overly broad rules (e.g., none for all /api routes); and stale cached responses (missing Vary headers) or misconfigured proxy rules. Use curl -I https://example.com to inspect the headers actually served.

| Failure Scenario | Impact | Mitigation |
|---|---|---|
| Middleware misconfiguration | Incorrect indexing (e.g., none for public routes) | Unit tests for route-specific rules. |
| Proxy/CDN stripping headers | Crawlers ignore directives | Configure proxy to preserve headers. |
| Race conditions in dynamic rules | Inconsistent headers per request | Use request-scoped logic (e.g., closures). |
| Package abandonment | Unmaintained code | Monitor GitHub activity; fork if needed. |
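The "unit tests for route-specific rules" mitigation above can be sketched with Laravel's HTTP testing helpers. This is a hypothetical feature test: the routes ("/", "/admin") and expected header values are assumptions about the application under test, not part of the package:

```php
<?php

namespace Tests\Feature;

use Tests\TestCase;

class RobotsHeaderTest extends TestCase
{
    public function test_public_pages_are_indexable(): void
    {
        // Assumes the app serves a "/" route through the robots middleware.
        $this->get('/')->assertHeader('X-Robots-Tag', 'all');
    }

    public function test_admin_pages_are_not_indexable(): void
    {
        // Assumes an "/admin" route exists and is blocked by shouldIndex().
        $this->get('/admin')->assertHeader('X-Robots-Tag', 'none');
    }
}
```

Running these in CI catches the "none on public routes" failure mode before a deploy deindexes the site.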
For debugging, dump the headers inside the middleware (e.g., dd($request->headers)) and confirm which directive (all vs. none) each route actually receives.