Many of our clients started receiving warnings in Google Search Console this week about restrictive robots.txt files.
On a website you can use a file called “robots.txt” to tell search engine bots (e.g. Googlebot) which parts of your site they are allowed to crawl and index. If this file is too restrictive, Google will now start lowering your site in its search rankings. Google wants what Googlebot sees when it visits your site to match what a real visitor sees, and since a visitor’s browser pays no attention to robots.txt at all, blocking resources such as CSS and JavaScript files from Googlebot makes those two views diverge.
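As an illustration, a robots.txt that keeps a private area off-limits while still letting crawlers fetch the stylesheets and scripts Google needs to render your pages might look like this (the "/admin/" path is a hypothetical example, not a rule every site needs):

```
# Apply these rules to all crawlers
User-agent: *

# Keep the private admin area out of search results
Disallow: /admin/

# Explicitly allow stylesheets and scripts anywhere on the site.
# "*" matches any characters and "$" anchors the end of the URL,
# so these rules win over the Disallow above for files like
# /admin/style.css, because Google favors the more specific rule.
Allow: /*.css$
Allow: /*.js$
```

You can check how Googlebot interprets your own file with the robots.txt testing tool inside Google Search Console before publishing any changes.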
If you get a warning in your Google Search Console but don’t know how to remedy the problem, please don’t hesitate to call or email us and we can help you get it fixed before your search rankings drop.