How can I fix the "Googlebot can't access your site" issue?
I keep getting this message:

"Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%. You can see more details about these errors in Webmaster Tools."

I searched around, and what I found told me to add a robots.txt file to the site.
But when I test the robots.txt in Google Webmaster Tools, it says the robots.txt cannot be fetched.
I thought that maybe the robots.txt was blocking the site, but when I test it in GWT it says the URL is allowed.
The file is at http://momentcamofficial.com/robots.txt and here is its content:

User-agent: *
Disallow:

So why can't Google fetch the robots.txt? What did I miss? Can anyone help me?
Before Googlebot crawls your site, it accesses your robots.txt file to determine whether your site is blocking Google from crawling any pages or URLs. If your robots.txt file exists but is unreachable (in other words, if it doesn't return a 200 or 404 HTTP status code), we'll postpone our crawl rather than risk crawling URLs you don't want crawled. When this happens, Googlebot will return to your site and crawl it as soon as it can access your robots.txt file.
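To illustrate that mechanism, here is a minimal sketch (assuming Python 3 and its standard urllib.robotparser module; this is a generic crawler-side check, not Googlebot's actual code) of how a crawler fetches a robots.txt and then decides whether a URL may be crawled:

# Sketch only: fetch and evaluate a robots.txt the way a well-behaved crawler would.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://momentcamofficial.com/robots.txt")  # URL from the question
rp.read()  # fetch and parse the file; this is the step that fails when the host misbehaves

# With "User-agent: *" and an empty "Disallow:", every URL should be allowed:
print(rp.can_fetch("Googlebot", "http://momentcamofficial.com/"))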
As you know, having a robots.txt is optional, so you don't need to make one; just make sure your host returns only a 200 or 404 HTTP status code for that URL.
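Since the fix comes down to the HTTP status code, here is a minimal sketch (assuming Python 3 with the third-party requests library; the URL is the one from the question) for checking what your host actually returns for robots.txt:

# Sketch only: verify the status code the host returns for robots.txt.
# Googlebot postpones the crawl unless this is 200 (file served) or 404 (no file).
import requests

resp = requests.get("http://momentcamofficial.com/robots.txt", timeout=10)
print(resp.status_code)                    # should be 200 or 404
print(resp.headers.get("Content-Type"))    # ideally text/plain for a robots.txt

If this prints a 5xx status, or raises a connection error or timeout, that matches the 100.0% error rate reported in the Webmaster Tools message, and the hosting configuration is what needs fixing.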