My googlebot stopped working?

ChancyRat Registered Users Posts: 2,141 Major grins
edited June 11, 2013 in SmugMug Support
I'm clueless about how to fix this, but Google is sending me messages saying:

"Over the last 24 hours, Googlebot encountered 2 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%"

My site has been fine for years... Did SmugMug do something, or am I supposed to do something on SmugMug or my site to fix this?

I viewed the Google help pages, but since I'm completely illiterate in this subject, I have no idea what to do.

Thanks.

Comments

  • bobbyhero Registered Users Posts: 207 Major grins
    edited June 11, 2013
    For the past couple of months, Google's web-crawling robot has been
    sending SmugMug customers over 3x the normal level of traffic.
    SmugMug caps bot traffic to prioritize actual human usage, and when
    bots exceed a certain level we respond with HTTP status code 503
    ("Service Unavailable"), which advises Google's robot to come back
    later. This follows Google's own recommendations for handling
    overaggressive bot traffic. The downside is that Google Webmaster
    Tools needlessly alerts users that their site is responding with
    "come back later." That would be a concern if it were happening with
    human traffic, but with bot traffic it's not a problem.
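
    SmugMug's actual throttling logic isn't public, but as a rough
    illustration of the pattern described above, here is a minimal
    sketch in Python of a server that answers over-quota bot requests
    with HTTP 503 plus a Retry-After header (the names BOT_RATE_LIMIT
    and WINDOW_SECONDS, and the specific limits, are hypothetical):

        from collections import deque
        from http.server import BaseHTTPRequestHandler, HTTPServer
        import time

        # Hypothetical policy: allow at most this many bot requests
        # per sliding 60-second window before backing them off.
        BOT_RATE_LIMIT = 100
        WINDOW_SECONDS = 60

        bot_hits = deque()  # timestamps of recent bot requests

        class ThrottlingHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                ua = self.headers.get("User-Agent", "")
                if "Googlebot" in ua and self._bot_over_limit():
                    # 503 + Retry-After is the standard way to say
                    # "temporarily unavailable, come back later"
                    # without the page being treated as gone.
                    self.send_response(503)
                    self.send_header("Retry-After", "120")  # seconds
                    self.end_headers()
                    return
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(b"regular response\n")

            def _bot_over_limit(self):
                now = time.time()
                bot_hits.append(now)
                # Drop timestamps outside the sliding window.
                while bot_hits and now - bot_hits[0] > WINDOW_SECONDS:
                    bot_hits.popleft()
                return len(bot_hits) > BOT_RATE_LIMIT

        if __name__ == "__main__":
            HTTPServer(("", 8000), ThrottlingHandler).serve_forever()

    The point of the 503/Retry-After pair is that Google treats it as a
    temporary condition: the crawler slows down and retries later rather
    than dropping the pages from its index.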