How to Beat A Panda


Run for your lives, it’s another post about the Panda update! I don’t usually blog about SEO news and algorithm shifts, for a couple of reasons: firstly, it’s boring and other sites will always be better for that sort of thing; secondly, because of the approach I take to SEO I rarely get affected by changes at Google.

Actually, most of my sites and my clients’ sites have either gained slightly or been unaffected by the Panda update. I do, however, think it’s pretty significant, and I’ve spoken to a few site owners who have been smashed by it, although not nearly as badly as some are making out (and no, I won’t be linking to Searchmetrics here!).

Before I talk about how I’d go about beating the Panda, I have to point out that this update seems specifically designed to devalue sites in markets which Google is looking to enter. Airline aggregation, voucher codes, reviews and local directories are all businesses Google wants to muscle in on, and reducing these sites’ visibility (and therefore traffic) is a handy way to bolster the visibility of Google’s own offerings in those markets. From that point of view, if you’re in a business Google wants a piece of, you might have more problems on your plate than this one update.

Anyway, on with the post…

Most discussion around the update so far seems to be concerned with content and the devaluation of so-called “low quality content”. For me that’s a bit of a red herring. While that may be the goal, explaining the update in these terms suggests that Google is making a qualitative assessment of content in the way a human reader would. Thinking in those terms probably isn’t that helpful: Google still has pretty much the same data to work from when it comes to assessing a page, and the update is simply a different interpretation of that data, namely:

  • Internal links
  • External links
  • Onpage factors
  • Duplication checking
  • Click stream data (maybe)

If you look at those factors, actually very little has changed, and sites which do well on all of them (think Wikipedia) don’t appear to have been affected.

Part of the issue here seems to be that for a while now Google has been ranking pages with few links, based on domain “trust” and strong onpage signals; hence the eHow effect. Now, unless your articles are directly (or closely) linked to the source of that trust through internal links, or have links coming into them from external sources, they might not rank like they used to.

So what the f**k do I do?

My first bit of advice to sites which have taken a knock is: don’t panic and jump on the first advice you read on SEO blogs. One of the knee-jerk reactions I’ve heard bandied around so far is to remove “low quality content” from your site. Last week I received a few of these emails from Bright Hub, where I’m an author:

[Screenshot: the Bright Hub email]

Now, I’m not going to tell you that every article I’ve ever published on article sites is top quality, but Bright Hub are really throwing the baby out with the bathwater here. Some of the articles they’re deleting were really pretty decent, and now that they’ve returned them to their authors they’ll never get that content back. The first thing I did when I got these emails was to take that content and publish it on other sites instead, as I’m sure most of the affected authors did.

Other article sites also seem to be taking similar steps, banning user accounts and getting rid of articles. If it were me I’d keep publishing for now and ride the storm.

Your content is not the problem

In most cases, at least. For years Google have preached about content and how site owners should create great content and wait for the traffic that would magically appear. I really don’t think the point of this update was to make site owners start thinking of their content as a liability, or to penalise sites for having too much content.

If your content is copied or scraped from other sites then yeah, it’s sucking PageRank from your good pages and could do you more harm than good. But I don’t see any cases where it would be a good idea to delete unique content, whatever its quality. Instead, I would first look at ways to deliver more links to your deep content pages, either by improving your internal link architecture or by boosting your inbound links from external sources.
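
If you want a quick way to see which deep pages your architecture is starving of links, here’s a rough sketch of the sort of crawl I mean. It’s Python, it leans on the requests and BeautifulSoup libraries, and the start URL and crawl limit are placeholders, so treat it as a starting point rather than a finished tool:

```python
# Rough sketch: crawl your own site and count internal inlinks per page,
# to find deep content pages your link architecture barely reaches.
# Assumes the requests and beautifulsoup4 packages; START_URL is a placeholder.
from collections import deque, Counter
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "http://www.example.com/"   # your homepage
MAX_PAGES = 500                          # keep the crawl polite and small

domain = urlparse(START_URL).netloc
inlinks = Counter()
seen, queue = {START_URL}, deque([START_URL])

while queue and len(seen) <= MAX_PAGES:
    url = queue.popleft()
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link, _ = urldefrag(urljoin(url, a["href"]))  # resolve + strip #fragments
        if urlparse(link).netloc != domain:
            continue                                  # external link, ignore here
        inlinks[link] += 1
        if link not in seen:
            seen.add(link)
            queue.append(link)

# Pages with the fewest internal links pointing at them are your weak spots.
for page, count in sorted(inlinks.items(), key=lambda x: x[1])[:20]:
    print(count, page)
```

Pages at the top of that output are the first candidates for new links from your hub pages.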

Thin content pages

Now, at the other end of the spectrum, we have thin content pages: those voucher code and review sites which seem to have taken a hit. From what I’ve seen (and obviously this is just theory at this stage) this isn’t as simple as the number of words on your pages, so all those site owners who are desperately ordering Textbroker content to thicken up their deep pages might want to hold fire.

Adding content is generally a good idea, and if you add 400 extra words to a page you should get more search traffic, but this doesn’t work the way most people understand it. If your long tail traffic has taken a hit after Panda, one way to regain some of it is to add more content. But that isn’t going to fix the systemic issues with your site or win back the traffic you lost to Panda; it’s just going to bump up your long tail search traffic elsewhere, by adding new keywords to your pages which you weren’t ranking for before.

Again, here I would personally look to address the other issues first before adding content en masse.

Action plan

So here’s the stuff I’d look at:

  • Identify the pages on your site which have lost traffic since the update: is it across the board or just certain sections? (There’s a rough analytics-comparison sketch after this list.)
  • Identify your internal link hubs: the parts of your site with the highest PageRank or the most internal/external links (the crawl sketch above can help here).
  • Internal linking: instead of deleting pages which have been affected by Panda, create new links to those pages from your internal link hubs (or from pages closely linked from your link hubs).
  • Run an external link building campaign for a select number of affected pages to see if a boost in external links helps. Be systematic: use one tactic to link to one group of pages and another tactic for another group, so you can see what works and what doesn’t. Going forward, try to ensure that every page on your site has at least one external link pointing to it. Don’t be afraid to link out if you have to in order to get links back.
  • Audit your outbound links. Linking out is probably a good thing in general, but not if you’re linking to spam; that’ll kill you. Use search queries against your own site to identify and delete these links (there’s a quick outbound-link audit sketch after this list).
  • Identify duplicate content on your site using Copyscape. If nobody ever visits those pages then you can think about deleting them; otherwise, maybe just noindex them for the time being (see the duplicate-check sketch after this list).
  • Build trust in the root of your domain. It never hurts to do this anyway, but I’ve noticed that while a few of the sites I’ve seen affected by the update have lots of links/trust at their root (homepage), their link velocity has plateaued in recent years. Start by checking your site for links to broken pages and 301 redirecting these back to your homepage (the Link Juice Recovery Tool is good for this, and there’s a simple 404-checking sketch after this list). Maybe even think about Yahoo! Directory listings if you don’t already have them, or other means to quickly boost your quality link numbers and get your homepage link velocity moving again.

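To make the first item concrete, here’s a minimal sketch comparing two analytics exports, one from before the update and one from after. The file names and the “Page”/“Visits” column names are assumptions; adjust them to whatever your analytics package actually exports:

```python
# Rough sketch: compare two analytics exports (e.g. a pages report exported
# as CSV from before and after the update) to see which pages and which
# sections of the site lost traffic. Column names are assumptions.
import csv
from collections import defaultdict

def load(path):
    visits = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            visits[row["Page"]] = int(row["Visits"])  # assumed column names
    return visits

before = load("before_panda.csv")   # placeholder file names
after = load("after_panda.csv")

# Pages that only appear in the newer export are ignored in this sketch.
sections = defaultdict(lambda: [0, 0])
for page, old in sorted(before.items(), key=lambda x: -x[1]):
    new = after.get(page, 0)
    section = "/" + page.strip("/").split("/")[0]     # first path segment
    sections[section][0] += old
    sections[section][1] += new
    if old - new > 100:                               # arbitrary threshold
        print(f"{page}: {old} -> {new}")

print("\nBy section:")
for section, (old, new) in sections.items():
    print(f"{section}: {old} -> {new}")
```

If the losses cluster in one section rather than across the board, that’s the section to focus the rest of the plan on.
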
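For the outbound link audit, something like this quick-and-dirty sketch will pull the external links off a set of pages and flag anything spammy-looking. The page list and keyword list are placeholders, not a definitive blacklist:

```python
# Rough sketch: list the external links on a set of your pages and flag
# anything with spammy-looking anchor text or target URLs.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGES = ["http://www.example.com/"]                    # pages to audit
SPAM_WORDS = {"viagra", "casino", "payday", "pills"}   # assumed patterns

for page in PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"])
        if urlparse(link).netloc == urlparse(page).netloc:
            continue                                   # internal link, skip
        text = (a.get_text() + " " + link).lower()
        if any(word in text for word in SPAM_WORDS):
            print(f"Suspicious outbound link on {page}: {link}")
```
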
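Copyscape covers copies out on the wider web; for near-duplicates within your own site, a crude shingle comparison gets you most of the way. A sketch, with placeholder page text standing in for whatever body text your crawler extracts:

```python
# Rough sketch: flag near-duplicate pages *within* your own site by
# comparing word 5-gram "shingles". The page texts below are placeholders.
from itertools import combinations

def shingles(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

# In practice, feed this the extracted body text of each crawled page.
pages = {
    "/article-a": "some body text extracted from the first page goes here",
    "/article-b": "some body text extracted from the second page goes here",
}

for (url1, t1), (url2, t2) in combinations(pages.items(), 2):
    s1, s2 = shingles(t1), shingles(t2)
    if not s1 or not s2:
        continue
    overlap = len(s1 & s2) / min(len(s1), len(s2))     # overlap ratio
    if overlap > 0.5:                                  # arbitrary threshold
        print(f"{url1} and {url2} look {overlap:.0%} similar")
```
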
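And for the broken-page cleanup, a sketch like this takes a list of URLs that have inbound links (from a backlink export, say), finds the ones now returning 404, and prints Apache Redirect 301 rules pointing them back at your homepage. The input file name and homepage URL are placeholders:

```python
# Rough sketch: check a list of backlinked URLs for 404s and emit Apache
# "Redirect 301" rules sending them back to the homepage.
from urllib.parse import urlparse

import requests

HOMEPAGE = "http://www.example.com/"                   # placeholder

with open("backlinked_urls.txt") as f:                 # one URL per line
    for line in f:
        url = line.strip()
        if not url:
            continue
        try:
            status = requests.head(url, timeout=10,
                                   allow_redirects=True).status_code
        except requests.RequestException:
            status = None
        if status == 404:
            print(f"Redirect 301 {urlparse(url).path} {HOMEPAGE}")
```
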
Bear in mind this might not be good advice; they’re just ideas. But the point is it’s definitely not bad advice: all of this stuff is actionable and should only help your site overall, even if it doesn’t address your Panda-related issues.

7 Comments

  1. Susan

I wrote over 200 articles and posted them to Ezine, Buzzle and other such sites, which the Panda discovered. I used to rank #2 for a local popular real estate term and now I sit at #8. It’s not the worst, but it is frustrating, and a lot of work lost. I like your advice about not deleting unique content that already exists. Even if it’s worth less, it’s still worth something.

    Reply
    1. Ando Hiroshige

Jesus. I’m sorry to hear that. Can you please tell me if article marketing is the only type of marketing you did for this website, or was there a mix of other link building methods?

      Reply
  2. Elizabeth Kaylene

    Do you really think that Google is purposely removing sites in the same markets that they want to pursue? I’ve always perceived Google as trustworthy, so I really don’t want to believe this. Can you point me in the direction of more information? I have a book/comic review site that I’m trying to monetize, and really don’t want to be up against Google.

    Which SEO methods do you use? (You know, if you don’t mind sharing a basic overview.) I know of a lot of businesses that optimize only for Google, which hurts them in the long run. I tend to just focus on keyword optimization. For blogs, I concentrate on the keywords in my post titles, and the tags I put in my posts. For non-blog websites, I make sure they all have good meta tags. I don’t at all claim to be an SEO. I have a very basic knowledge and it’s always seemed to do me good. The more I learn, the better my sites perform. My clients’ sites — when I was a freelance web designer and social marketing consultant — always performed fairly well, too.

    I’ve seen other sites throwing out the baby with the bathwater, too. I just read an article written by a webmaster who no-indexed all of his site’s articles because the sites publishing his syndicated articles were showing up higher in SERPs. This seems completely counterproductive to me. He even noted that the no-index made things worse, and I don’t understand at all why he thought it would make things better. If you’re ranking lower than your partners, you wouldn’t want to pull your articles completely from SERPs!

    I’ve found sites scraping my content in the past, and have yet to check and see how they rank now compared to mine (they didn’t outrank me pre-Panda). There wasn’t really anything I could do about it at the time. It still made me really mad, though — especially since from what I’ve seen, scrapers replace all links in your original content with questionable links of their own.

    I’m hoping that this Panda update is going to rectify itself, and that it’s not some sinister move on Google’s part. I honestly think it’s a case of things having to settle on their own. Realistically, webmasters should be giving it several weeks before they start panicking… but I can definitely understand the concern.

    Reply
  3. homebaseincome101

    I enjoyed your update. I’ve read so many different points of view and it can become confusing and frustrating.

    I personally did not panic and continued down my path as usual and feel that was the best thing I could have done.

I did notice a drop in traffic to my articles, blogs and websites, but it will come back. With so many changes going on we all need to chill and see what happens.

    Reply
  4. Jonathan

Yeah, a good read, as I am looking to improve a client’s long tail keywords now, and the very first thing that I know the site has is a poor link structure to the long tail content, which is the first thing I am going to address. Thanks for the post; it’s very good work.

    Reply
  5. Jason OConnor

    This is the best article I have read on Panda yet, and I think I have read up to a hundred at this point! Red herring indeed! The Google algo has always been primarily about links, and it still is. And to think Google would be so ‘evil’ is not that difficult. And thanks for the links to tools and queries, very helpful!

    Reply
  6. Robert Denton

    Solid article John. I appreciate how you admit that these are ideas, not absolutes. I think a lot of people try to spout SEO absolutes when we all know that Google is playing “hide the ball” with their algorithm and will never tell any of us what it actually is.

My concern is whether or not the Panda updates have allowed new sites that have plenty of original, relevant content to actually rank higher faster. In the past it has seemed like Google took quite a while to rank up a well done website with lots of helpful content. With the instant nature of information today, and how quickly businesses pop up, it would be a shame to think that Google is still heavily weighting a score by length of time on the web. I’m especially bothered by this because when I see what the “old reliable” sites look like, I feel like I’ve stepped back 15 years.

    I guess we’ll see.

    Reply
