SEO Checker
Score any URL on 30+ SEO signals — titles, meta tags, Open Graph, structured data, hreflang, sitemap, robots.txt, llms.txt, security headers, and more.
What this SEO checker does
Enter any public URL and we fetch the raw HTML server-side, parse it, and run it against a battery of SEO and crawlability checks. We also probe the origin's well-known files — robots.txt, sitemap.xml, llms.txt, humans.txt, ads.txt, security.txt, and the web app manifest — and parse them for common issues.
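The well-known-file probe can be sketched roughly like this. This is an illustrative sketch, not the tool's actual code: the file paths come from the description above, while the HEAD-request approach and the `/.well-known/security.txt` and `/manifest.json` locations are assumptions (manifest paths in particular vary per site).

```python
from urllib.parse import urljoin
from urllib.request import urlopen, Request

# Discovery files probed at the origin root (paths are assumptions;
# security.txt conventionally lives under /.well-known/).
WELL_KNOWN = [
    "/robots.txt", "/sitemap.xml", "/llms.txt", "/humans.txt",
    "/ads.txt", "/.well-known/security.txt", "/manifest.json",
]

def probe_discovery_files(origin: str, timeout: float = 5.0) -> dict[str, bool]:
    """Return which well-known files respond with HTTP 200 at the origin."""
    found = {}
    for path in WELL_KNOWN:
        url = urljoin(origin, path)
        try:
            with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
                found[path] = resp.status == 200
        except OSError:  # covers URLError, connection errors, timeouts
            found[path] = False
    return found
```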
What gets checked
- SEO basics: title presence and length, meta description, canonical URL, viewport, <html lang>, charset, doctype.
- Document structure: exactly one H1, heading hierarchy, duplicate IDs, deprecated tags.
- Open Graph & Twitter: required OG fields, og:image, twitter:card.
- Structured data: JSON-LD presence, validity, and types detected.
- Internationalization: hreflang variants and x-default coverage.
- Crawlability: robots meta, robots.txt content, sitemap discoverability.
- Images: alt-text coverage and width/height attributes.
- Performance hints: render-blocking scripts in the head, inline style usage.
- Security headers: HSTS, CSP, X-Content-Type-Options, Referrer-Policy.
- Discovery files: llms.txt, humans.txt, ads.txt, security.txt, manifest, favicon.
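To make the pass / partial / fail idea concrete, here is a minimal sketch of one rule, a title-length check, using only the standard library. The thresholds (10 and 60 characters) and the function name are illustrative assumptions, not the checker's actual values.

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text content of the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.done = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.done:
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title" and self.in_title:
            self.in_title = False
            self.done = True
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check_title(html: str, min_len: int = 10, max_len: int = 60) -> float:
    """Score the title rule: 1.0 pass, 0.5 partial (bad length), 0.0 fail (missing).
    Length bounds are illustrative, not the tool's real thresholds."""
    p = TitleParser()
    p.feed(html)
    title = p.title.strip()
    if not title:
        return 0.0
    return 1.0 if min_len <= len(title) <= max_len else 0.5
```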
How the score works
Each rule is weighted by importance and contributes a fractional score from 0 to 1 (pass / partial / fail). The final score is the weighted average across every applicable rule, rounded to a whole number, then mapped to a letter grade: A 90+, B 80+, C 70+, D 60+, F below 60.
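The scoring formula described above can be expressed directly. This is a sketch of the stated math (weighted average of 0-to-1 rule scores, scaled to 100, rounded, then mapped to a grade), assuming rule results arrive as (score, weight) pairs.

```python
def overall_score(results: list[tuple[float, float]]) -> tuple[int, str]:
    """results: (rule_score, weight) pairs, rule_score in [0, 1].
    Returns the rounded 0-100 score and its letter grade."""
    total_weight = sum(w for _, w in results)
    if total_weight == 0:
        return 0, "F"  # no applicable rules
    score = round(100 * sum(s * w for s, w in results) / total_weight)
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return score, grade
    return score, "F"
```

For example, one full pass at weight 3, one partial at weight 2, and one fail at weight 1 average to (3 + 1 + 0) / 6 ≈ 66.7, which rounds to 67 and grades as D.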
Frequently asked questions
Does this run JavaScript on the page?
No. The checker analyzes the raw HTML returned by the server, before any client-side JavaScript runs. For single-page apps that render critical content on the client, the score reflects what crawlers see in the initial response, which is the baseline view most crawlers index first.
Why probe llms.txt and humans.txt?
They're informational. llms.txt is an emerging convention for telling LLMs what they should and shouldn't read on your site. humans.txt credits the people behind the site. Neither affects ranking, but their presence is a small quality signal we surface.
How is the score weighted?
Critical SEO basics (title, meta description, viewport, robots, indexability, HTTPS, sitemap) carry the most weight. Open Graph, structured data, hreflang, image alt coverage, and security headers carry medium weight. Optional discovery files like llms.txt, humans.txt, and ads.txt carry minimal weight and never fail the score.
Is the data cached?
Yes. Reports are cached for two hours per URL to keep the tool fast and avoid hammering the target site. Cache hits are clearly labeled in the response.
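A per-URL cache with a two-hour TTL and a labeled hit flag can be sketched as follows. The two-hour TTL matches the description; the class shape and in-memory storage are assumptions for illustration.

```python
import time

class ReportCache:
    """In-memory per-URL report cache with a TTL (default two hours)."""
    def __init__(self, ttl_seconds: float = 2 * 3600):
        self.ttl = ttl_seconds
        self._store: dict[str, tuple[float, object]] = {}

    def get(self, url: str):
        """Return (report, cache_hit) so hits can be labeled in the response."""
        entry = self._store.get(url)
        if entry is not None and time.monotonic() - entry[0] < self.ttl:
            return entry[1], True
        return None, False

    def put(self, url: str, report: object) -> None:
        self._store[url] = (time.monotonic(), report)
```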