
Audit not completing

6 min read
Last updated: March 2025

Quick resolution guide

Having trouble with stuck or incomplete audits? This guide covers the most common issues and their solutions. Most problems can be resolved in just a few minutes with the right approach.

🔍 Quick diagnostics checklist

Is your website publicly accessible?
Is your internet connection stable?
Have you waited at least 5 minutes?
Does your site respond quickly (<10s)?
Are you using the correct URL format?
Have you tried refreshing the page?

Common issues & solutions

🔄 Audit stuck at "Analyzing..."

This is the most common issue, usually caused by network delays or the size and complexity of the site being audited.

Step-by-step solution:

  1. Wait 5-10 minutes (large sites take longer)
  2. Hard-refresh your browser page (Ctrl+F5 on Windows, Cmd+Shift+R on Mac)
  3. Check if the audit status changed
  4. If still stuck, cancel and restart the audit
  5. Try auditing during off-peak hours (early morning/late evening)

💡 Pro tip

Enable audit notifications in your settings to get alerts when audits complete, even if you close the browser.

⏰ Audit timeout errors

Timeout errors occur when your website takes too long to respond or when auditing very large sites.

Immediate solutions:

  • Limit audit scope: Focus on specific pages instead of the entire site
  • Check server performance: Ensure your website loads quickly (see the timing check after this list)
  • Optimize before auditing: Enable caching and CDN if available
  • Contact hosting provider: Ask about server resource limits
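
As a quick sanity check, you can time your site's response from the command line; a minimal sketch using curl (replace yoursite.com with your actual audit URL):

# Measure time to first byte and total load time for a page
curl -o /dev/null -s -w "Total: %{time_total}s  TTFB: %{time_starttransfer}s\n" https://yoursite.com/
# Times consistently above a few seconds make audit timeouts much more likely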

⚠️ For large sites (>1000 pages)

Consider upgrading to our Pro plan for extended timeout limits and priority processing, or contact support for enterprise solutions.

🚫 "Access denied" or "Site not reachable"

These errors indicate that our crawlers cannot access your website due to security restrictions or misconfigurations.

Common causes & fixes:

  • Password protection: Remove password protection or whitelist our crawler IPs
  • Firewall blocking: Configure your firewall to allow AISEOTurbo crawler access
  • robots.txt restrictions: Check whether robots.txt is blocking legitimate crawlers (see the sample entry after this list)
  • Server downtime: Verify your website is online and accessible
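
For the robots.txt case, a permissive entry might look like the following sketch. Note that the AISEOTurboBot user-agent token is illustrative, not confirmed; check our documentation for the exact crawler name:

# Allow the audit crawler while keeping other rules intact
User-agent: AISEOTurboBot
Allow: /

User-agent: *
Disallow: /admin/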

🔧 Testing your site accessibility

  1. Open an incognito/private browser window
  2. Try accessing your site from the exact URL you provided
  3. Check if any login prompts or error pages appear
  4. Test from different devices or networks (or from the command line, as shown below)
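
You can also approximate what our crawler sees from the command line; a minimal check, assuming curl is available:

# Fetch only the status line and headers for your exact audit URL, following redirects
curl -I -L https://yoursite.com/
# A 200 status is what you want; 401/403 suggests access restrictions,
# and 5xx indicates a server-side problem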

📊 Incomplete or partial results

Sometimes audits complete but show incomplete data or missing sections.

Why this happens:

  • Some pages were inaccessible during crawling
  • JavaScript-heavy content couldn't be fully analyzed
  • Rate limiting or server restrictions kicked in
  • Network interruptions during the audit process

✅ Solutions

  • Re-run the audit during low-traffic periods
  • Enable server-side rendering for JavaScript content
  • Increase server resources temporarily during audits
  • Check your sitemap.xml for accuracy and completeness (a quick check follows this list)
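
For the sitemap check, a quick way to confirm your sitemap is reachable and lists the pages you expect (assuming the standard /sitemap.xml location):

# Count the URLs declared in your sitemap
curl -s https://yoursite.com/sitemap.xml | grep -c "<loc>"
# Spot-check a few entries against pages missing from the audit
curl -s https://yoursite.com/sitemap.xml | grep "<loc>" | head -5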

Advanced troubleshooting

For technical users

Check server logs

Look for crawler requests from AISEOTurbo IP ranges:

grep "AISEOTurbo" /var/log/apache2/access.log
# or
grep "185.199.108.0/22" /var/log/nginx/access.log

Test robots.txt compliance

Verify our crawler is allowed:

curl -s https://yoursite.com/robots.txt
# Look for "Disallow: /" or rules targeting our crawler's user-agent

Validate SSL certificate

Ensure HTTPS is properly configured:

openssl s_client -connect yoursite.com:443 -servername yoursite.com
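# In the output, look for "Verify return code: 0 (ok)"; any other return
# code points to a certificate problem that can block our crawler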

When to contact support

If you've tried the solutions above and are still experiencing issues, contact our support team with:

Your website URL
Exact error messages
Time when the issue occurred
Steps you've already tried
Your browser and device info
Screenshots of error screens

Prevention is better than cure

Avoid future audit issues by following these best practices:

  • Ensure your website loads consistently in under 5 seconds
  • Use a reliable hosting provider with good uptime guarantees
  • Keep your robots.txt file simple and permissive for legitimate crawlers
  • Implement proper error handling for 404s and server errors
  • Schedule audits during low-traffic periods
  • Monitor your site's performance regularly (see the sketch after this list)
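
For ongoing monitoring, even a simple scheduled check helps you catch slowdowns before an audit fails. Here is a minimal sketch using curl and cron; the script path, log path, and interval are illustrative, not a feature of our product:

#!/bin/sh
# check-site.sh: append the site's total response time to a log
curl -o /dev/null -s -w "%{time_total}\n" https://yoursite.com/ >> /var/log/site-timing.log

# crontab entry to run the check every 15 minutes
# (using a script avoids cron's special handling of the % character)
*/15 * * * * /usr/local/bin/check-site.sh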
