In a recent blog entry, Matt Cutts discusses a common response from sites that have been delisted or have seen their SERP position drop. Webmasters say there’s nothing wrong with their site, that they haven’t engaged in any shady link-building strategies, and that Google is unfairly punishing them. Cutts responds that in many of these cases the penalty stems from the site having been hacked and infected with malicious software without the webmaster being aware.
Hacking a site is one of a number of Negative SEO strategies that a site’s competitors can engage in to damage search rankings and reputations. Today we’ll be having a look at hacking and a couple of other Negative SEO tactics, so that you can be aware of possible vectors of attack for your sites, and what you can do about them.
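One practical first step is simply checking your own pages for the telltale signs of an SEO spam injection. As a minimal sketch (not a substitute for Google Search Console alerts or a dedicated malware scanner), the patterns below are illustrative assumptions about common injection tricks such as hidden link containers, zero-height iframes, and obfuscated scripts:

```python
import re

# Hypothetical patterns for common SEO spam injection tricks.
# These are illustrative, not exhaustive.
SUSPICIOUS_PATTERNS = [
    r'style\s*=\s*["\'][^"\']*display\s*:\s*none',  # hidden elements, often stuffed with spam links
    r'<iframe[^>]+height\s*=\s*["\']?0',            # zero-height iframes loading third-party content
    r'eval\s*\(\s*unescape\s*\(',                   # a classic obfuscated-script signature
]

def find_injection_signs(html: str) -> list:
    """Return the patterns that match the given HTML source."""
    return [p for p in SUSPICIOUS_PATTERNS if re.search(p, html, re.IGNORECASE)]

sample = '<div style="display:none"><a href="http://spam.example">cheap pills</a></div>'
print(find_injection_signs(sample))  # flags the hidden-element pattern
```

Running something like this periodically over your rendered pages (not just your source templates, since many injections happen server-side) can surface a compromise long before a ranking penalty does.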
As we noted in our previous article, personalization in the major search engines is changing the SERP landscape. Indeed, each individual is getting a search landscape uniquely shaped to meet their needs. Google is spending incredible amounts of money on research and development in order to bring relevant search results to its users. They have more or less bet the farm on socially directed, personalized search results. Some people find this to be a wonderful advance, and Google’s recent iteration for search on mobile — Google Now — is the logical extension of this approach, and has found favor in many corners.
Many sites around the web have taken a hit with their traffic numbers because of Google’s recent Panda and Penguin updates, and it would be hard to argue that this is in any way positive. But, if we put these causes for traffic loss aside, there is another factor that is of concern to SEOs and webmasters. Even if sites haven’t seen their traffic dip because of algorithm changes, Google’s recent drive to personalize search results has made it nigh on impossible to predict with any certainty what a particular user’s SERP is going to look like. All the SEO in the world won’t drive traffic to your site if Google decides, based on a user’s browsing, search, and social media data, that the user is actually looking for something unrelated to your site, despite entering apparently relevant search terms.
However, in theory at least, personalization of SERP results is reducing traffic only by reducing false positives. By taking note of web history, search history, and social media signals, Google is attempting to reduce the ambiguity of search keywords and refer only those visitors who are actually going to be interested in the content of a particular site. This means that even though overall traffic is reduced, conversion rates should remain steady, or even rise as a percentage of visits.
Ever since Eric Schmidt confirmed at LeWeb last year that social signals were being taken into consideration for SERPs, the SEO world has been rife with speculation about the relative merits of social signals as compared to links. Some of the more excitable members of the industry even went so far as to declare the death of link-building as an SEO tool. In a recent interview, Google’s chief spam slayer, Matt Cutts, attempted to dampen the more extreme prognostications about the ascendancy of social signals. Yes, social signals are being used to some extent, but the humble link still leads the way when it comes to determining SERP rankings, and it will for some years to come.