My googlebot stopped working?
ChancyRat
Registered Users Posts: 2,141 Major grins
I'm clueless about how to fix this but google is sending me messages saying:
"Over the last 24 hours, Googlebot encountered 2 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%"
My site has been fine for years... Did SmugMug do something, or am I supposed to do something on SmugMug or my site to fix this?
I viewed the Google help pages, but since I'm completely illiterate in this subject, I have no idea what to do.
Thanks.
"Over the last 24 hours, Googlebot encountered 2 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%"
My site has been fine for years... Did Smugmug do something, or am I supposed to do something on Smugmug or my site, to fix this?
I viewed the google help pages but since I'm completely illiterate in this subject, I have no idea what to do.cry:cry
thanks.
0
Comments
Googlebot has recently been hitting SmugMug customers at over 3x the normal levels of traffic. SmugMug restricts the level of bot traffic to prioritize actual usage, and when bots go over a certain level of traffic we respond with an HTTP "503" code, which advises Google's robot to come back later. This tracks Google's recommendations for responding to overaggressive bot traffic. The downside is that Google's webmaster tools needlessly alert users that their site is responding with "come back later". This would be a concern if it were happening with human traffic, but with bot traffic it's not a problem.
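For anyone curious what that throttling looks like in practice, here is a minimal sketch of the general pattern described above. It is not SmugMug's actual implementation; the bot list, request limit, and window size are all made-up values for illustration, using only Python's standard library.

```python
# Sketch of throttling bot traffic with HTTP 503 + Retry-After.
# BOT_TOKENS, MAX_BOT_REQUESTS, and WINDOW_SECONDS are hypothetical
# values, not SmugMug's real configuration.
import time
from collections import deque
from http.server import BaseHTTPRequestHandler, HTTPServer

BOT_TOKENS = ("googlebot", "bingbot")   # assumed crawler signatures
MAX_BOT_REQUESTS = 100                  # hypothetical per-window limit
WINDOW_SECONDS = 60                     # hypothetical sliding window

bot_hits = deque()                      # timestamps of recent bot requests

class ThrottlingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "").lower()
        if any(tok in ua for tok in BOT_TOKENS):
            now = time.time()
            # Drop hits that have aged out of the sliding window.
            while bot_hits and now - bot_hits[0] > WINDOW_SECONDS:
                bot_hits.popleft()
            bot_hits.append(now)
            if len(bot_hits) > MAX_BOT_REQUESTS:
                # Over the limit: tell the crawler to come back later.
                self.send_response(503)
                self.send_header("Retry-After", "120")
                self.end_headers()
                return
        # Normal response for human traffic and in-limit bots.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    HTTPServer(("", 8000), ThrottlingHandler).serve_forever()
```

You can see the status code a crawler gets from your own site with something like `curl -I https://yoursite.smugmug.com/robots.txt` — a 200 means normal service, while a 503 is the temporary "come back later" signal described above.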
SmugMug Hero & CSS Monkey
https://help.smugmug.com/get-started-with-customization-SkgwJ4rM