I'm trying to solve a puzzle. I have a WP site with 13 pages in the sitemap, and all of them return status 200 when I upload the list of URLs to Screaming Frog in list mode. But when I have Screaming Frog crawl the site from the home page, it only finds one page. Any help or suggestions would be appreciated.
Best I can tell, robots.txt isn't blocking access, and I don't think I have SF configured wrong, because it crawls other sites correctly.
The pages are things like:
/location-hours/
/pay-bills/
/forms-policies/
and robots.txt looks like this:
# Google Image
User-agent: Googlebot-Image
Disallow:
Allow: /*
# Google AdSense
User-agent: Mediapartners-Google*
Disallow:
# digg mirror
User-agent: duggmirror
Disallow: /
# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category//
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?
Allow: /wp-content/uploads/
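One thing I'm trying to rule out is a JavaScript-built menu, since as far as I know Screaming Frog's default mode reads the raw HTML and doesn't execute JavaScript. Here's a rough sketch of the check I had in mind, using Python's built-in html.parser on a sample snippet standing in for my real page source (the nav markup here is made up for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags in raw (un-rendered) HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in for the real homepage source; in practice you'd fetch it
# with "View Source" in the browser or urllib.request.
sample_html = """
<nav>
  <a href="/location-hours/">Location &amp; Hours</a>
  <a href="/pay-bills/">Pay Bills</a>
</nav>
<div id="menu"></div>  <!-- stays empty if a script builds the menu -->
"""

parser = LinkCollector()
parser.feed(sample_html)
print(parser.links)
```

If the list comes back empty on the real page source, a plain HTML crawl would have no internal links to follow, which would match what I'm seeing.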