Search Marketing

My Thoughts on Google’s “Panda” Algorithm Change

I have mixed feelings about Google making significant changes to its algorithm. On one hand, I appreciate the fact that they are trying to improve… since I’m typically unsatisfied with Google search results. Earlier today I was searching for some statistics on blogging… and the results were terrible:

If you take a look at the results… and go from page to page, it appears that Google is paying less attention to larger sites and more attention to smaller sites. The problem is that the results I’m looking for are exactly the opposite. Some might argue that Google can’t possibly understand my intent… not true. Google has years’ worth of history on my search patterns. That history would provide input into the topics I’m interested in pursuing.

The recent Google update, otherwise known as the Panda update (named after a developer), was supposed to improve quality. The problem, as described by many SEO folks, was that they were having a hard time competing with content farms. In all honesty, I didn’t actually see a lot of complaints from users… but Google appeared to have caved under the pressure of the industry.

If small content sites were truly unable to compete with large sites, I absolutely understand. Anything impeding the democratization of the web should be corrected. I don’t believe Google actually fixed the problem, though. It seems to me they just did a lateral shift… plugging one hole while more leaks started. The algorithm change did address a major flaw – large sites with large quantities of high-ranking pages seemed to get new pages ranked more easily.

The next issue, of course, is that large sites that actually do have a great quantity of ranking pages… but a small percentage of crappy pages… dropped rank across the board. Imagine investing in a site and building out thousands of pages of great content, only to find that overnight your site’s ranking dropped because you also have some pages that suffer. The resulting drop is already costing some companies dearly.

This blog has over 2,500 blog posts. Surely not all of them are class “A” material. Granted, the size of this blog doesn’t compare to many content farms that have hundreds of thousands or millions of pages. However, I’m still farming… trying to build rank for a variety of topics relevant to search, social, mobile and other marketing efforts. I’m not sure how much content I have to create before I’m seen as a content farm… and punished accordingly… but I’m not really too happy about it.

The old secrets to SEO weren’t so secret. Write relevant content, utilize keywords effectively, structure your pages properly, design your site to leverage that content… and promote the heck out of it. Effective keyword usage and placement would get you in the right results… and promotion of that content off-site would get you ranked better. The new secret isn’t really known. Those of us in the industry are still scrambling to understand just what needs to be done. Google is hush-hush on it, too, so we’re on our own.
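Those on-page basics can even be checked mechanically. As a rough illustration (this is my own sketch, not anything Google publishes), a short Python script using only the standard library’s html.parser can flag pages missing the fundamentals: a title tag, a meta description, and a single h1 heading.

```python
from html.parser import HTMLParser


class OnPageAudit(HTMLParser):
    """Collects basic on-page signals: <title> text, meta description, <h1> count."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def audit(html):
    """Return a list of basic on-page problems found in the markup."""
    parser = OnPageAudit()
    parser.feed(html)
    problems = []
    if not parser.title.strip():
        problems.append("missing <title>")
    if not parser.meta_description.strip():
        problems.append("missing meta description")
    if parser.h1_count != 1:
        problems.append(f"expected one <h1>, found {parser.h1_count}")
    return problems


# Hypothetical sample page: it has a title and one h1, but no meta description.
page = """<html><head><title>Blogging Statistics for 2011</title></head>
<body><h1>Blogging Statistics</h1><p>Relevant content here.</p></body></html>"""
print(audit(page))  # flags only the missing meta description
```

None of this guarantees ranking, of course — it just catches the kind of structural omissions that the old advice told us to avoid.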

Truth be told, I’m disappointed that Google would think it was a good idea to impact 12% of all search results overnight. There are victims in this mess – some of them hardworking consultants that are looking out for their clients to provide them with the best advice possible. Google has even had to backpedal and retweak the changes.

Google aggressively launched the SEO industry and even promoted optimization to try to improve the quality of their results. We didn’t game it, as CNN suggests… we all studied, responded and acted on the advice provided. We worked hard to implement the recommendations that Google asked us to. We paid for and attended the events that folks like Matt Cutts continue to promote. We worked with large clients and helped them to fully leverage their content… only now to get the carpet pulled out from under us. Google points to sites like Wikipedia as quality sites… but has penalized sites where the content is actually purchased and people are employed to write. Go figure.

Google did need to change. However, the drastic nature of the change and the lack of any warning on Google’s part was unnecessary. Why couldn’t Google simply warn large publishers that there was an algorithm that was going to be implemented in 30 days that rewarded large publishers for developing their pages in greater detail and quality? Why not preview the change utilizing a special search or sandbox environment? At least companies could have prepared for a large drop in traffic, diversified their online marketing efforts more, and made some (much needed) improvements.

One specific example is a client that I’m working with. We were already building out better email, mobile and social integrations – and a feedback loop where readers could indicate the quality of the content they were reading so that it could be improved. Had we known there was going to be an algorithm update that would drop 40% of the site’s traffic, we would have worked hard to get those strategies live rather than continuing to tweak the site. Now we’re struggling to catch up.

Douglas Karr

Douglas Karr is the founder of the Martech Zone and a recognized expert on digital transformation. Douglas has helped start several successful MarTech startups, has assisted in the due diligence of over $5 billion in MarTech acquisitions and investments, and continues to launch his own platforms and services. He's a co-founder of Highbridge, a digital transformation consulting firm. Douglas is also a published author of a Dummies guide and a business leadership book.

  1. None of my sites were harmed in the Farmer update. None of my clients’ either. I believe this is because quality of content is important, but quality and authority of links to that content still rules the day. It’s also more important than ever how the two relate directly.
    Show me a site that was bumped downward, and I will show you holes in the link profile as compared to others in the niche who gained. It happens like this with every update, no matter what name is given to it. “Big” sites did not necessarily lose out… just authority sites that did not improve their standing. “Big” sites get scraped a lot as well, and that doesn’t help.
    In addition, I believe algo indicators of quality content have been elevated and are looking for far more than just “good and original” words. There are several factors, including the accepted language of the niche, AP Stylebook considerations, the platform (blog or static, for example) and readability score.

    The high profile sites that got slapped (ezinearticles, mahalo) were victims of manual penalty. They were made examples on purpose to stir the buzz. These were human reviewed and humans are biased… that’s why some of the “good” sites like Cult of Mac got nailed too… I think Mac people are arrogant and haughty and I’d slap a site about Mac too. LOL j/k
