Over the last 24 hours, Googlebot encountered 1 error while attempting to access your robots.txt file. To ensure that we did not crawl any pages listed in that file, we postponed the crawl. Your site's overall robots.txt error rate is 50.0%.
You can see more details about these errors in Webmaster Tools.
If your site's error rate is 100%:
- Use a web browser to try to access http://www.festive-commercial.biz/robots.txt. If you can access it from your browser, your site may be configured to deny access to Googlebot. Check your firewall and site configuration to ensure that you are not denying access to Googlebot.
- If your robots.txt is a static page, verify that your web server has the appropriate permissions to access the file.
- If your robots.txt is dynamically generated, verify that the script that generates it is configured correctly and has permission to run. Examine your site's logs to see whether your script is failing, and if so, try to diagnose the cause of the failure.
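The checks above can be sketched as a small script using only Python's standard library. It fetches robots.txt with a Googlebot-like User-Agent and classifies the result; the status thresholds and the helper names are illustrative assumptions, not part of any official tool.

```python
import urllib.request
import urllib.error

def classify_robots_status(status):
    """Map an HTTP status code to a rough diagnosis (assumed rules:
    2xx = reachable, 5xx = server-side failure, anything else = blocked)."""
    if 200 <= status < 300:
        return "reachable"
    if 500 <= status < 600:
        return "server error - check script permissions and error logs"
    return "blocked or misconfigured - check firewall and server config"

def fetch_robots(url, user_agent="Googlebot/2.1"):
    """Request robots.txt with a Googlebot-like User-Agent and return
    the HTTP status code (error responses included)."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

# Example usage (requires network access):
# status = fetch_robots("http://www.festive-commercial.biz/robots.txt")
# print(status, classify_robots_status(status))
```

If the browser can fetch the file but this script (with the Googlebot User-Agent) cannot, that points to User-Agent-based blocking in your firewall or server configuration.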
If your site's error rate is less than 100%:
- Use Webmaster Tools to find a day with a high error rate and examine your web server's logs for that day. Find the errors logged for robots.txt requests on that day and fix the causes of those errors.
- The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your site.
- If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname whose robots.txt file exhibits one or more of these problems.
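For the redirect case, the robots.txt that matters is the one on the destination host. The sketch below (standard library only; the function names are hypothetical) follows redirects from a starting URL and derives the robots.txt URL on the host that finally serves the content, so you can check that file for the same problems.

```python
import urllib.request
from urllib.parse import urlsplit, urlunsplit

def robots_url_for(page_url):
    """Build the robots.txt URL on page_url's scheme and host."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

def final_robots_url(start_url):
    """Follow any redirects from start_url and return the robots.txt
    URL on the host that ultimately serves the response."""
    with urllib.request.urlopen(start_url, timeout=10) as resp:
        # geturl() reports the final URL after redirects.
        return robots_url_for(resp.geturl())

# Example usage (requires network access):
# print(final_robots_url("http://www.festive-commercial.biz/"))
```

If the printed host differs from your own, fetch that host's robots.txt and apply the same checks described above.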