spatie/laravel-sitemap
Generate XML sitemaps for Laravel automatically by crawling your site or building them manually. Add URLs, models, lastmod/changefreq/priority, images and alternates, then write to file or disk. Supports sitemap index and large sites.
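The manual-build API described above can be sketched as follows (the paths, dates, and priorities are illustrative):

```php
use Carbon\Carbon;
use Spatie\Sitemap\Sitemap;
use Spatie\Sitemap\Tags\Url;

// Build a sitemap by hand: add URLs with lastmod/changefreq/priority,
// then write the XML out to a file.
Sitemap::create()
    ->add(Url::create('/blog')
        ->setLastModificationDate(Carbon::yesterday())
        ->setChangeFrequency(Url::CHANGE_FREQUENCY_DAILY)
        ->setPriority(0.8))
    ->add(Url::create('/about'))
    ->writeToFile(public_path('sitemap.xml'));
```

`writeToDisk('public', 'sitemap.xml')` is an alternative when the target is a configured Laravel filesystem disk.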
**Extensibility.** Models can implement the package's `Sitemapable` interface, enabling tailored implementations for complex architectures.

**Integration.** Installed with a single `composer require`; leverages Laravel's filesystem, queues, and scheduling systems.

**Dependencies.** Relies on `spatie/crawler` (included) and optionally `spatie/browsershot` (for JS rendering). Minimal external dependencies beyond core Laravel.

**Risks.** JS rendering (via `browsershot`) adds complexity and latency; it requires a Chrome installation and may fail in headless environments (e.g., CI/CD). Crawl coverage depends on correct `shouldCrawl` logic.

**Open question.** Is `browsershot` viable in your environment, or should URLs be pre-defined?

**Recommendations:**
- Use Laravel's `Schedule` facade for periodic regeneration.
- Add URLs manually (`Sitemap::create()->add(Url::create(...))`) for critical paths (e.g., blog posts, product pages).
- Crawl the remainder of the site with `SitemapGenerator::create()`.
- Customize the crawl profile (`shouldCrawl`, `hasCrawled`) to exclude non-essential pages (e.g., admin, login).
- If JS rendering is required, install `spatie/browsershot` and test in a staging environment with `execute_javascript = true`.
- Schedule generation (e.g., a `sitemap:generate` command) with incremental updates (e.g., daily or post-deployment).

**Maintenance.** `spatie/crawler` and `browsershot` are well-maintained.

**Installation:**

```bash
composer require spatie/laravel-sitemap
php artisan vendor:publish --provider="Spatie\Sitemap\SitemapServiceProvider" --tag=sitemap-config
```
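With the package installed, the crawl-based generation recommended above might look like this (the domain and excluded paths are placeholders):

```php
use Psr\Http\Message\UriInterface;
use Spatie\Sitemap\SitemapGenerator;

// Crawl the site, skipping non-essential sections such as admin/login,
// and write the result to public/sitemap.xml.
SitemapGenerator::create('https://example.com')
    ->shouldCrawl(function (UriInterface $url) {
        return ! str_starts_with($url->getPath(), '/admin')
            && ! str_starts_with($url->getPath(), '/login');
    })
    ->writeToFile(public_path('sitemap.xml'));
```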
**Setup and operations:**
- Edit `config/sitemap.php` for crawler settings (e.g., `execute_javascript`, `chrome_binary_path`).
- Choose a filesystem disk (e.g., `public`, `s3`) for sitemap output.
- Generate with `SitemapGenerator::create()->writeToFile()`.
- Schedule regeneration (e.g., `sitemap:generate`).
- Tune crawler `concurrency` and depth.
- Implement the `Sitemapable` interface for dynamic models.
- Customize `shouldCrawl`/`hasCrawled` logic.

**Caveats:**
- Watch `spatie/crawler` and `browsershot` for breaking changes (e.g., Chrome version requirements).
- Wrap `SitemapGenerator` in a service with custom logging.
- `browsershot` issues (e.g., Chrome crashes) require system-level debugging.
- Scale `concurrency` or server resources for large sites.

| Failure Scenario | Impact | Mitigation |
|---|---|---|
| Crawler times out on large site | Incomplete sitemap | Set `setMaximumCrawlCount`; monitor duration. |
| JS rendering fails (browsershot) | Missed dynamic URLs | Fall back to manual URLs or disable JS crawling. |
| Storage write permissions denied | Sitemap not published | Verify disk visibility settings. |
| Scheduled job is skipped | Stale sitemap | Use Laravel’s `withoutOverlapping` for reliability. |
| External links return 404s | Broken URLs in sitemap | Filter out unwanted URLs in `hasCrawled`. |
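The `hasCrawled` mitigation in the table can be sketched as a post-crawl filter: the callback receives each crawled `Url` tag, and returning `null` excludes it (the `/contact` segment is just an example). Note the callback sees the `Url` tag rather than the raw HTTP response, so filtering is path-based here.

```php
use Spatie\Sitemap\SitemapGenerator;
use Spatie\Sitemap\Tags\Url;

SitemapGenerator::create('https://example.com')
    ->hasCrawled(function (Url $url) {
        // Drop URLs we don't want in the sitemap; returning null excludes them.
        if ($url->segment(1) === 'contact') {
            return;
        }

        return $url;
    })
    ->writeToFile(public_path('sitemap.xml'));
```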
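For the `Sitemapable` route mentioned above, a model returns its own sitemap tag, and collections of such models can then be added directly (the model and route names are illustrative):

```php
use Illuminate\Database\Eloquent\Model;
use Spatie\Sitemap\Contracts\Sitemapable;
use Spatie\Sitemap\Tags\Url;

class Post extends Model implements Sitemapable
{
    public function toSitemapTag(): Url|string|array
    {
        // Tag the post with its URL and last-modified date.
        return Url::create(route('posts.show', $this))
            ->setLastModificationDate($this->updated_at);
    }
}

// Whole collections of Sitemapable models can be added at once:
// Sitemap::create()->add(Post::all())->writeToFile(public_path('sitemap.xml'));
```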