search - How can I fix the "Googlebot can't access your site" issue? -


I keep getting this message:

"Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%. You can see more details about these errors in Webmaster Tools."

I searched around, and was told to add a robots.txt to the site.

But when I test the robots.txt in Google Webmaster Tools, it says the robots.txt cannot be fetched.

I thought maybe the robots.txt was blocking the site, but when I test it, GWT says it is allowed.


The file is at 'http://momentcamofficial.com/robots.txt', and here is its content:

User-agent: *
Disallow:

So why can't the robots.txt be fetched by Google? What did I miss? Can anyone help me?

Before Googlebot crawls your site, it accesses your robots.txt file to determine if your site is blocking Google from crawling any pages or URLs. If your robots.txt file exists but is unreachable (in other words, if it doesn't return a 200 or 404 HTTP status code), we'll postpone our crawl rather than risk crawling URLs that you don't want crawled. When that happens, Googlebot will return to your site and crawl it as soon as it can access your robots.txt file.

As you know, having a robots.txt is optional, so you don't need to make one; just make sure your host sends only a 200 or 404 HTTP status for that URL.
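To check what your host is actually returning, you can fetch the robots.txt URL yourself and look at the status code. A minimal sketch in Python (the helper names are my own, and the URL is the one from the question):

```python
import urllib.request
import urllib.error

def robots_txt_status(url):
    """Fetch a robots.txt URL and return its HTTP status code (requires network)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # Error responses (404, 500, ...) still carry a usable status code.
        return e.code

def googlebot_will_crawl(status):
    """Per the answer above: Googlebot proceeds only on 200 (rules read)
    or 404 (no file, so no restrictions); anything else postpones the crawl."""
    return status in (200, 404)
```

For example, `googlebot_will_crawl(robots_txt_status("http://momentcamofficial.com/robots.txt"))` should be True once the host is configured correctly; a quick alternative is `curl -I http://momentcamofficial.com/robots.txt` and reading the status line.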

