Tag Archives: SERPs

Google now indexes around 30-40 billion web pages, and this figure is growing by the day. How could any company, even one the size of Google, police a database of that size? Google’s algorithm obviously has measures in place to recognise certain types of spam and unusual activity, but some spammy pages and blackhat SEO activity will always slip through the net. Manual in-house policing of indexed web pages isn’t even worth considering, given the enormous resources required to run such an operation.

So what’s the solution?

I think that, more and more, Google leaves it to us to police the SERPs for them. SEO is huge business, and one of the first things any SEO consultant or agency will do upon accepting a new project is to check out the competition. Any dubious or spammy listings get reported to Google, which then takes appropriate action. It’s a mutually beneficial relationship: Google keeps its results pages spam-free and relevant, and we remove any ill-gotten obstacles to our own ranking success.

This is just my opinion, but it would go some way towards explaining why some previously discredited SEO techniques can still generate good rankings.