SEO Issue Detail

406 ModSecurity inconsistency

Use Refresh / Scan to re-check the live findings for this exact issue and confirm whether it is still active.

Needs verification

A 406 on crawl-critical endpoints can mean either a real access problem or a WAF blocking low-trust request signatures. Treat it as critical only if a full browser UA or a recognized crawler UA fails too. If only bare/generic scripted UAs are blocked, this is usually expected WAF behavior rather than a customer-facing outage.

What this is

WAF / ModSecurity behavior can block some request signatures on endpoints such as homepage, robots.txt, and sitemap URLs. The important distinction is whether the block hits real browser traffic / recognized crawlers or only generic scripted UAs.

Why it matters

  • If full browser UAs fail, it can be user-impacting
  • If recognized crawler UAs fail, it can become an SEO / crawlability issue
  • If only bare or generic scripted UAs fail, it is usually a monitoring / WAF-classification issue
  • Bad classification here can create noisy false alarms inside the dashboard

Evidence

  • Current verification shows a full Chrome UA returning 200 on /, /robots.txt, /sitemap_index.xml, and /product-sitemap.xml
  • Googlebot and Bingbot UAs also return 200 on those same endpoints
  • Bare Mozilla/5.0 and python-requests style UAs return 406, which fits expected WAF / ModSecurity filtering of low-trust requests
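The verification above can be reproduced with a small probe. This is a sketch using only the standard library; the hostname (example.com) is a placeholder since the report does not name the site, and the UA strings are representative examples, not the exact ones used in verification. The fetcher is injectable so the probe can be exercised without network access.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

BASE = "https://example.com"  # placeholder hostname, not from the report

PATHS = ["/", "/robots.txt", "/sitemap_index.xml", "/product-sitemap.xml"]

# One UA per traffic class; Chrome and Googlebot strings are illustrative.
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
               "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36",
    "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; "
               "+http://www.google.com/bot.html)",
    "generic": "Mozilla/5.0",
}

def status_for(url, ua, fetch=None):
    """Return the HTTP status for url when requested with User-Agent ua."""
    if fetch is not None:               # injectable stub for offline testing
        return fetch(url, ua)
    try:
        with urlopen(Request(url, headers={"User-Agent": ua})) as resp:
            return resp.status
    except HTTPError as err:            # 4xx/5xx (e.g. 406) raise HTTPError
        return err.code

def probe(fetch=None):
    """Map (ua_class, path) -> status for every UA/endpoint combination."""
    return {(cls, path): status_for(BASE + path, ua, fetch)
            for cls, ua in USER_AGENTS.items() for path in PATHS}
```

With a real run, the pattern reported above would show 200 for the browser and crawler classes and 406 only for the generic class.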

Owner

Kai / Stuart / hosting admin — WAF / HostGator changes are only needed if full browser or recognized crawler checks start failing

Exact fix needed

  1. First classify the behavior: test with a full browser UA, a recognized crawler UA, and a generic scripted UA
  2. If full browser + recognized crawler UAs pass, downgrade this to WAF filtering of generic UAs rather than a critical crawl blocker
  3. Only review / relax ModSecurity rules if legitimate traffic classes are being blocked too
  4. Keep monitoring /, /robots.txt, /sitemap_index.xml, and /product-sitemap.xml with that classification logic built in
  5. Alert critically only when the failure reaches real browsers or recognized crawlers
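The alerting rule in steps 2 and 5 can be sketched as a pure function over per-class probe results. The function and severity names here are illustrative, not from any real monitoring config.

```python
def classify(results):
    """Map {ua_class: status} to an alert severity.

    Alert critically only when real browsers or recognized crawlers fail;
    a generic-UA-only 406 is expected WAF filtering and stays informational.
    """
    browser_ok = results.get("browser", 0) == 200
    crawler_ok = results.get("crawler", 0) == 200
    generic_ok = results.get("generic", 0) == 200

    if not browser_ok:
        return "critical"   # potentially user-impacting outage
    if not crawler_ok:
        return "critical"   # SEO / crawlability blocker
    if not generic_ok:
        return "info"       # expected ModSecurity filtering of low-trust UAs
    return "ok"
```

Applied to the current evidence (browser 200, crawler 200, generic 406), this logic downgrades the finding to informational rather than critical.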


Status

Open for verification — treat this as a monitoring-classification issue unless full browser or recognized crawler checks fail.
