Many websites have long been the target of link spamming, whether through bad SEO (black hat techniques) or through malicious attacks by competitors across the internet. Unscrupulous competitors can drag down a site's search ranking by pointing links at it from sources Google does not trust: dubious online pharmacies, casinos, link farms, low-quality directories, lottery sites, file-sharing sites, or blog/forum spam. Simply put, anything that would look shady in the real world will likely look shady to Google.

A major Google algorithm update in February 2011 (Panda) affected around 6% of websites, including big names like J.C. Penney, whose SEO agency had used black hat techniques to make the retailer rank highly for just about everything it sold. After being exposed by an article in the New York Times, J.C. Penney was penalized, and its pages were nowhere to be found for the queries it had previously ranked number one for. Websites that did not use link spamming techniques, on the other hand, were not affected by the update.
Major post-Panda algorithm updates:
• Page Layout Algorithm Update in January 2012
• Penguin 1 on April 24, 2012
• Penguin 2 on May 26, 2012
• Penguin 3 on October 5, 2012
• Penguin 4 on May 22, 2013
• Penguin 5 on October 4, 2013
Google watches over everything; it is omnipresent and nearly omniscient on the internet. Its goal is to keep delivering relevant, unique content to the end user, hence its crusade against excessive spammers. Good SEO services are built on the notion that websites need to earn links from high-ranking sites organically, not by exploiting flaws in Google's internal mechanisms. After Google released these updates, webmasters started receiving notices explaining why their websites were considered to be using black hat SEO techniques for link building, providing a bad user experience, having a faulty navigation structure, or other issues. Google also added a reconsideration request feature to Google Webmaster Tools for those who believed their site had been flagged as a false positive by the algorithm. The next image shows that “request review” action:
In this case, the best you can do to recover your previous ranking, or at least keep from dropping so low, is to download the list of links pointing to your website from Google Webmaster Tools and identify those that look spammy, based on Google’s Quality Guidelines for link schemes. Then email each webmaster on the list with a message that looks something like this:
As of (date), Google has notified me that a link from your website pointing to this page of my site (Add URL) is considered spam under their quality guidelines. I kindly request that you remove this link, or add a nofollow attribute to it, as soon as possible.
(Additional contact info)
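If you have more than a handful of bad links, it is easier to script this step. The sketch below, in Python, flags likely spammy backlinks from an exported link list and drafts the removal-request email for each one. The keyword list, sample URLs, and page address are illustrative assumptions, not an official Google blacklist; tune them to what you actually see in your Webmaster Tools export.

```python
# Sketch: flag potentially spammy backlinks from a Webmaster Tools export
# and draft a removal-request email for each. Keywords and sample data are
# assumptions for illustration only.

SPAM_KEYWORDS = ["pharmacy", "casino", "lottery", "linkfarm", "directory"]

def looks_spammy(url):
    """Return True if the linking URL contains a suspicious keyword."""
    url = url.lower()
    return any(keyword in url for keyword in SPAM_KEYWORDS)

def removal_request(link, my_page, date):
    """Draft a removal-request email for one spammy backlink."""
    return (
        f"As of {date}, Google has notified me that a link from your site "
        f"({link}) pointing to my page ({my_page}) is considered spam "
        f"under their quality guidelines. Please remove or nofollow it."
    )

# Example: links exported from the "Links to Your Site" report.
backlinks = [
    "http://cheap-pharmacy-pills.example/top-deals",
    "http://respected-blog.example/honest-review",
    "http://best-casino-bonus.example/partners",
]

spammy = [link for link in backlinks if looks_spammy(link)]
for link in spammy:
    print(removal_request(link, "http://mysite.example/page", "2013-11-01"))
```

Keyword matching is crude, so review the flagged list by hand before emailing anyone: a legitimate blog post that merely mentions “casino” would be flagged too.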
After that, there is another way to clean up malicious links that the previous effort did not remove: Google’s Disavow Links tool in Webmaster Tools. This is an advanced feature that tells Google not to take specific inbound URLs or domains into account, and it should be used with caution, because links you might want to keep counting toward your site can be accidentally disavowed. Following the steps in the next screenshot will get rid of those spammy links:
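The disavow file itself is a plain UTF-8 text file with one URL or domain per line; lines starting with # are comments, and the domain: prefix disavows every link from that domain. A minimal example (the domains here are made up):

```text
# Requested removal from these sites on 2013-11-01; no response received.
http://spam.example.com/stuff/comments.html
# Disavow all links from this entire domain.
domain:shadyseo.example.com
```

Prefer the domain: form when a whole site is clearly spammy, since listing individual URLs is easy to get wrong and new spam pages keep appearing.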
Once you have followed the steps described above, you may request reconsideration for your site. Add documentation about the links that were removed or disavowed, then wait while Google does its magic, keeping an eye on your Google Webmaster Tools account for any notifications. Hope this helps you kill your panda!