We’ve gained a couple of clients recently because their previous web design agencies had, either deliberately or negligently, dropped them out of Google’s index.
In both cases it was an easy fix (we didn’t even charge for our time), and the clients noticed results very quickly, so we thought we’d share.
Their robots.txt file had been set to this:
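```
User-agent: *
Disallow: /
```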
This roughly translates to “No matter what search engine you are (User-agent: *), we don’t want you to crawl, and therefore index, any part of our site (Disallow: /)”.
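If you want to see the effect of those two lines for yourself, Python’s standard library includes a robots.txt parser. A minimal sketch (the domain is just a placeholder, and we feed the rules in directly rather than fetching them):

```python
from urllib.robotparser import RobotFileParser

# Parse the same two rules described above.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Ask whether Googlebot is allowed to crawl the homepage.
print(rp.can_fetch("Googlebot", "https://www.yoursite.co.uk/"))  # False
```

The wildcard `User-agent: *` matches Googlebot (and every other crawler), so `can_fetch` reports that the entire site is off limits.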
On WordPress-based websites there is even an option for this, under ‘Settings’ / ‘Reading’ / ‘Discourage search engines from indexing this site’.
If you want to see what yours is set to, go to www.yoursite.co.uk/robots.txt and have a look.
In both cases, the web agency couldn’t, or wouldn’t, explain to the client why their site wasn’t ranking in Google. In our opinion, there is no excuse for a professional web agency not to understand this, or to deliberately switch indexing off. We’ve also known this to happen by accident because of security plugins (see here), but that wasn’t the case in these instances.
Luckily it’s an easy fix, but unfortunately it’ll take a while for the SEO to ‘catch up’.
This is not the only situation like this we’ve seen; one was far worse, where the previous web agency deliberately ran a (bad) link-building campaign in order to land the client a penalty. That was reversible (see the results), but it wasn’t a quick fix.
We are Webbed Feet, we understand SEO