Code Weaver
Helps Laravel developers discover, compare, and choose open-source packages. See popularity, security, maintainers, and scores at a glance to make better decisions.
Robots Txt Laravel Package

spatie/robots-txt

Generate and serve a correct robots.txt in Laravel with an expressive API. Add user-agents, allow/disallow rules, sitemaps and host directives, then publish it via a route or controller—perfect for managing crawler access per environment.


Getting Started

Install the package via Composer: composer require spatie/robots-txt. Publish the config file with php artisan vendor:publish --provider="Spatie\RobotsTxt\RobotsTxtServiceProvider". The config file (config/robots-txt.php) defines default rules and allows environment-specific overrides. By default, a GET /robots.txt route is auto-registered and returns a dynamic response. The service provider is registered through Laravel's package auto-discovery, so no manual entry in config/app.php is needed. For first use, update config/robots-txt.php to add your production sitemap and disallow paths such as /admin.
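As a sketch of the setup described above, a config/robots-txt.php might look like the following. The key names here are assumptions for illustration only; check the published config file for the options your package version actually supports.

```php
<?php

// config/robots-txt.php -- hypothetical layout for illustration; the
// real published config may use different keys.
return [
    // Default rules applied in every environment.
    'default' => [
        '*' => [
            'disallow' => ['/admin'],
            'allow'    => ['/'],
        ],
    ],

    // Environment-specific overrides, e.g. block all crawlers locally.
    'environments' => [
        'local' => [
            '*' => ['disallow' => ['/']],
        ],
    ],

    // Sitemap advertised in the generated robots.txt.
    'sitemap' => env('APP_URL', 'https://example.com') . '/sitemap.xml',
];
```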

Implementation Patterns

  • Environment-aware control: Use RobotsTxt::configureUsing() in AppServiceProvider@boot to apply conditional rules (e.g., if (app()->isLocal()) { $robots->disallow('/'); }).
  • Programmatic rule building: In config or service providers, chain methods fluently:
    return RobotsTxt::create()
        ->userAgent('*', fn ($userAgent) => $userAgent
            ->disallow('/admin/')
            ->allow('/public/')
            ->sitemapUrl(config('app.url') . '/sitemap.xml')
        );
    
  • Dynamic sitemaps: Inject environment variables or config keys for sitemap URLs to support multi-environment deployments without code changes.
  • Integration with Horizon/Queues: Temporarily disable indexing during maintenance or high-load periods by toggling a config flag and reusing the same RobotsTxt builder.
  • Testing: Write feature tests asserting that GET /robots.txt returns the expected status and directives, using the HTTP test helper assertSee() (available in both Pest and PHPUnit) or a snapshot assertion against a known-good robots.txt output.
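    Combining the first two patterns above, an AppServiceProvider sketch might look like the following. It reuses configureUsing() and the fluent builder exactly as shown in the bullets, so treat the method signatures as assumptions rather than verified package API.

    ```php
    <?php

    namespace App\Providers;

    use Illuminate\Support\ServiceProvider;
    use Spatie\RobotsTxt\RobotsTxt; // namespace assumed from the provider name above

    class AppServiceProvider extends ServiceProvider
    {
        public function boot(): void
        {
            // Outside production, disallow everything so staging and
            // local environments are never indexed by crawlers.
            RobotsTxt::configureUsing(function ($robots) {
                if (! app()->environment('production')) {
                    $robots->userAgent('*', fn ($agent) => $agent->disallow('/'));
                }
            });
        }
    }
    ```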

Gotchas and Tips

  • Route collisions: If you have a /robots.txt file in public/, Laravel will serve it instead of the route—delete the static file to avoid confusion.
  • Caching pitfalls: The response is rebuilt on every request unless you add custom caching; for high-traffic apps, wrap the builder call in Cache::remember() or use Laravel’s response caching middleware.
  • Case sensitivity: User-agent matching is case-insensitive per the spec, but your code must match against the exact string (e.g., 'googlebot' won’t match Googlebot’s real user-agent string 'Googlebot/2.1'). Prefer wildcard '*' unless targeting specific bots.
  • Config fallbacks: Rules from the config file are applied before configureUsing() callbacks—combine both carefully to avoid unintentional overrides.
  • Debugging: Run php artisan route:list | grep robots to confirm the route is registered. Use dd() inside configureUsing() to inspect runtime context (e.g., config('app.env')).
  • SEO impact: Missing or malformed robots.txt can block indexing—always validate responses in tools like Google Search Console’s robots.txt Tester.
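Tying the caching tip above to a concrete route, a cached robots.txt sketch might look like this. It assumes the RobotsTxt::create() builder from the Implementation Patterns section, and the one-hour cache window is an arbitrary choice.

```php
<?php

// routes/web.php -- hypothetical cached robots.txt route; the builder
// API is assumed from the examples above, not verified.

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Route;
use Spatie\RobotsTxt\RobotsTxt;

Route::get('/robots.txt', function () {
    // Rebuild the output at most once per hour instead of per request.
    $content = Cache::remember('robots-txt', now()->addHour(), function () {
        return (string) RobotsTxt::create()
            ->userAgent('*', fn ($agent) => $agent
                ->disallow('/admin/')
                ->allow('/public/')
                ->sitemapUrl(config('app.url') . '/sitemap.xml'));
    });

    return response($content)->header('Content-Type', 'text/plain');
});
```

Remember the route-collision gotcha above: a static public/robots.txt would shadow this route, so remove it first.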