SEO Health Checker
Validate your robots.txt and sitemap.xml files to ensure search engines can properly crawl and index your website. Get instant feedback on syntax errors, structure issues, and SEO best practices.
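The checks are easy to approximate locally. As a rough illustration (not this tool's actual implementation), the sketch below uses only Python's standard library to confirm that a sitemap.xml document is well-formed XML, uses the sitemap namespace, and lists only absolute URLs; the sample sitemap content is invented for the example.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def check_sitemap(xml_text: str) -> list[str]:
    """Return a list of problems found in a sitemap.xml document."""
    try:
        root = ET.fromstring(xml_text)  # raises ParseError on broken XML
    except ET.ParseError as err:
        return [f"XML syntax error: {err}"]

    problems = []
    # The root element of a URL sitemap should be <urlset> in the sitemap namespace.
    if root.tag != f"{{{SITEMAP_NS}}}urlset":
        problems.append(f"unexpected root element: {root.tag}")

    # Every <loc> entry should be an absolute URL.
    for loc in root.iter(f"{{{SITEMAP_NS}}}loc"):
        url = (loc.text or "").strip()
        if not url.startswith(("http://", "https://")):
            problems.append(f"not an absolute URL: {url!r}")
    return problems

if __name__ == "__main__":
    sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>/relative/path</loc></url>
</urlset>"""
    print(check_sitemap(sample))  # flags the relative <loc> entry
```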
Best Practices
- Always include at least one User-agent directive
- Use absolute URLs for Sitemap directives
- Be careful with Disallow rules; they can block important pages (see the sketch after this list)
- Test your robots.txt in Google Search Console
- Place robots.txt at the root of your domain
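To see why a stray Disallow rule matters, this minimal sketch uses Python's standard urllib.robotparser to spot-check a robots.txt against specific URLs; the sample rules and URLs are invented for illustration, not produced by this tool.

```python
import urllib.robotparser

# A small robots.txt illustrating the practices above: a User-agent
# directive, a Disallow rule, and an absolute Sitemap URL.
robots_txt = """\
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Spot-check that the Disallow rule blocks only what you intended.
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/products"))     # True
```

Running a few important URLs through checks like these before deploying a new robots.txt is a cheap way to catch rules that accidentally block pages you want indexed.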