spatie/laravel-robots-middleware
Laravel middleware to control search engine indexing via X-Robots-Tag/robots meta behavior. Extend the base middleware and implement shouldIndex() to allow or block indexing per request (e.g., disable for admin routes), without changing your views.
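A minimal sketch of that pattern (the subclass name and route rule are illustrative; the exact return type of shouldIndex() can vary by package version, so check the README for your installed release):

```php
<?php

use Illuminate\Http\Request;
use Spatie\RobotsMiddleware\RobotsMiddleware;

// Hypothetical subclass: block indexing for admin routes, allow everything else.
class SearchEngineIndexingMiddleware extends RobotsMiddleware
{
    // Returning false makes the middleware emit a blocking X-Robots-Tag
    // header; returning true leaves the page indexable.
    protected function shouldIndex(Request $request): bool
    {
        return ! $request->is('admin/*');
    }
}
```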
Adopt if:
- You need per-request control over indexing (e.g., to keep /admin, /checkout, or /api from appearing in search results).
- You want centralized noindex/index rules instead of scattering meta tags across views.

Look elsewhere if:
- You only need a static robots.txt file (use spatie/laravel-robots-txt instead).

For Executives: "This package lets us centrally control which parts of our site are search-engine-indexable with a single middleware toggle, with no manual tagging per page. For example, we can instantly block admin panels or GDPR-sensitive routes from appearing in search results, improving compliance. It's roughly a 10-minute setup that saves dev time and reduces crawl-budget waste by search engines. Given its MIT license and Spatie's track record, it's a low-risk, high-reward choice for SEO-sensitive features."
For Engineering:
*"Spatie’s laravel-robots-middleware provides a dead-simple way to add X-Robots-Tag: noindex or index headers via middleware. Key benefits:
- Register the middleware once in app/Http/Kernel.php; no per-view meta tags to maintain.
- All indexing decisions live in a single shouldIndex() method, so rules can be dynamic (e.g., deny indexing when auth()->check()).

Use case example: block every /admin/* route globally by returning false from shouldIndex() when the request path matches admin/*.
Perfect for projects where SEO compliance is critical but we don’t want to clutter views with meta tags."*
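A sketch of the registration step (the middleware class name is illustrative; on Laravel 11+ global middleware is registered in bootstrap/app.php rather than the HTTP kernel):

```php
<?php

// app/Http/Kernel.php (Laravel 10 and earlier): add the middleware to the
// global stack so every response carries the X-Robots-Tag header.
protected $middleware = [
    // ...existing global middleware...
    \App\Http\Middleware\SearchEngineIndexingMiddleware::class,
];
```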