spatie/robots-txt
Generate and serve a correct robots.txt in Laravel with an expressive API. Add user agents, allow/disallow rules, sitemap and host directives, then publish the result via a route or controller, making it easy to manage crawler access per environment.
Executives: "Prevents accidental indexing of staging environments, safeguarding SEO and avoiding revenue loss from uncontrolled pre-release content exposure—without manual oversight."
Engineering: "Integrates cleanly with Laravel’s config system, centralizing robots.txt rules in code under version control. Eliminates deployment steps for static files and supports environment-aware rules out of the box with minimal setup."
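The environment-aware pattern above can be sketched as a Laravel route. This is a minimal illustration only, not the package's actual API: the response body is assembled by hand here using core Laravel helpers, and the package's own builder (whose method names are not shown in this summary) should be preferred in practice.

```php
<?php

// routes/web.php — minimal sketch of serving an environment-aware
// robots.txt from a route. Hand-rolled body; the package's fluent
// builder API may differ and would replace the array below.

use Illuminate\Support\Facades\Route;

Route::get('/robots.txt', function () {
    // Disallow everything outside production so staging is never indexed.
    $lines = app()->environment('production')
        ? [
            'User-agent: *',
            'Disallow: /admin',
            'Sitemap: ' . url('/sitemap.xml'),
        ]
        : [
            'User-agent: *',
            'Disallow: /',
        ];

    return response(implode("\n", $lines), 200)
        ->header('Content-Type', 'text/plain');
});
```

Because the rules live in code, the staging/production split is versioned alongside the application rather than maintained as a static file per environment.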
How can I help you explore Laravel packages today?