Laravel Robots Middleware — Laravel Package

spatie/laravel-robots-middleware

Laravel middleware to control search engine indexing via the X-Robots-Tag HTTP header. Extend the base middleware and implement shouldIndex() to allow or block indexing per request (e.g., disable it for admin routes) without changing your views.

View on GitHub

Getting Started

Minimal Setup

  1. Installation:

    composer require spatie/laravel-robots-middleware
    

    The package ships no config file to publish; all behavior lives in a middleware class you write yourself.

  2. First Use Case: Create a middleware that extends the package's base class and implements shouldIndex():

    // app/Http/Middleware/MyRobotsMiddleware.php
    namespace App\Http\Middleware;

    use Illuminate\Http\Request;
    use Spatie\RobotsMiddleware\RobotsMiddleware;

    class MyRobotsMiddleware extends RobotsMiddleware
    {
        protected function shouldIndex(Request $request): bool
        {
            // Block indexing for the admin area only
            return $request->segment(1) !== 'admin';
        }
    }
    

    Register it in app/Http/Kernel.php, e.g. under $middlewareGroups['web']:

    \App\Http\Middleware\MyRobotsMiddleware::class,
    

    When shouldIndex() returns false, the middleware adds an x-robots-tag: none header to the response; when it returns true, no header is added and the page stays indexable.

  3. Quick Test: Visit a blocked page (e.g. /admin) and inspect the response headers. You should see x-robots-tag: none; pages where shouldIndex() returns true should carry no x-robots-tag header at all.
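The mapping from shouldIndex() return values to the response header can be sketched as a pure function. This is an assumption drawn from the package's documented behavior, not code from the package itself, and robotsHeaderFor is a made-up name:

```php
<?php

// Sketch: how the middleware's handle() maps a shouldIndex() result
// onto the x-robots-tag response header (assumed from the package's
// documented behavior; robotsHeaderFor is a hypothetical helper).
function robotsHeaderFor(bool|string $shouldIndex): ?string
{
    if (is_string($shouldIndex)) {
        return $shouldIndex;                 // custom directive, used verbatim
    }
    return $shouldIndex ? null : 'none';     // true => no header, false => 'none'
}
```

Note that returning false and returning 'noindex, nofollow' both block indexing, but produce different header values ('none' vs. the literal string you returned).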


Implementation Patterns

Core Workflows

  1. Global Control: Register your middleware subclass globally (in $middleware, or in a middleware group as above) to enforce a site-wide policy. Ideal for:

    • Development environments (return false to block indexing).
    • Staging/QA sites (return false).
    • Public-facing production sites (return true).
  2. Path-Specific Rules: The package exposes no per-route helper; instead, put the routing logic inside shouldIndex(), where the full request is available:

    protected function shouldIndex(Request $request): bool
    {
        // Block search engines for the admin area only
        return ! $request->is('admin/*');
    }
    
  3. Custom Directives: Return a string instead of a boolean to set the x-robots-tag header to an arbitrary directive:

    protected function shouldIndex(Request $request): string|bool
    {
        if ($request->is('search*')) {
            return 'noindex, follow'; // crawl links, but don't index results
        }
        return true;
    }
    
  4. Middleware Stack Integration: Apply your subclass to a route group alongside other middleware (e.g., auth) for granular control:

    Route::middleware(['auth', \App\Http\Middleware\MyRobotsMiddleware::class])->group(function () {
        // Indexability of these routes is decided by shouldIndex()
    });
    

Integration Tips

  • API Routes: Exclude API routes from the middleware group in Kernel.php to avoid unnecessary headers:

    protected $middlewareGroups = [
        'web' => [
            // ... other middleware
        ],
        'api' => [
            // No robots middleware here
        ],
    ];
    
  • Caching: The middleware is lightweight and cache-friendly. No additional configuration is needed for caching headers.

  • Testing: Assert on the response header in feature tests to verify behavior:

    $response = $this->get('/admin');
    $response->assertHeader('x-robots-tag', 'none');
    

Gotchas and Tips

Pitfalls

  1. Header vs. Meta Tag: The package sets the X-Robots-Tag HTTP header; it does not inject a <meta name="robots"> tag into your HTML. Major search engines honor the header just like the meta tag, but SEO checkers that only scan HTML may wrongly report pages as unprotected.

  2. Caching Headers: If using a CDN or reverse proxy (e.g., Varnish, Cloudflare), ensure the X-Robots-Tag header is forwarded to the client. Some proxies may strip custom headers by default.

  3. Rule Ordering: All decisions flow through your shouldIndex() implementation, so overlapping conditions are resolved in the order you write them. Put the most restrictive checks first, and test thoroughly to avoid unintended exposure of sensitive routes.

  4. Case Sensitivity: Directive values (all, none, noindex, etc.) are case-insensitive to crawlers, but stick to lowercase for consistency.

Debugging

  • Missing Headers: Verify your middleware subclass is registered in Kernel.php and that no other middleware or proxy layer downstream is stripping the header.

  • Incorrect Directives: Check your shouldIndex() logic and the order of its conditions. The header is set on the response, not the request, so inspect it with curl -I or your browser's network panel rather than via $request->header().

  • Performance: The middleware itself adds negligible overhead. If profiling shows delays, look for expensive work (database queries, HTTP calls) inside your shouldIndex() implementation.

Extension Points

  1. Custom Directives: Return any directive string (e.g., noarchive, nosnippet) from shouldIndex():

    protected function shouldIndex(Request $request): string|bool
    {
        if ($request->is('archive/*')) {
            return 'noarchive, nosnippet';
        }
        return true;
    }
    
  2. Bot-Specific Rules: Inspect the user agent inside shouldIndex() to target specific crawlers:

    protected function shouldIndex(Request $request): string|bool
    {
        if (str_contains((string) $request->userAgent(), 'Googlebot')) {
            return 'all';
        }
        return 'noindex';
    }
    
  3. Environment-Specific Rules: Block indexing everywhere except production:

    protected function shouldIndex(Request $request): bool
    {
        return app()->environment('production');
    }
    
  4. Custom handle() Logic: For anything shouldIndex() can't express, override handle() in your subclass. It receives the request and the Closure $next pipeline like any Laravel middleware, so you can set the x-robots-tag header on the response yourself.
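These patterns compose naturally. As a sketch, a combined policy can be extracted into a pure function that is easy to unit-test; robotsPolicy is a hypothetical helper, and in a real app this logic would sit directly inside shouldIndex() on your RobotsMiddleware subclass, called as robotsPolicy($request->path(), app()->environment()):

```php
<?php

// Hypothetical combined policy: block everything outside production,
// never index the admin area, and keep search results crawlable but
// unindexed. Pure function, so it needs no framework to test.
function robotsPolicy(string $path, string $environment): string|bool
{
    if ($environment !== 'production') {
        return false;                    // x-robots-tag: none outside production
    }
    if (str_starts_with($path, 'admin')) {
        return false;                    // never index the admin area
    }
    if (str_starts_with($path, 'search')) {
        return 'noindex, follow';        // crawl links, don't index results
    }
    return true;                         // indexable: no header added
}
```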
    