How to Beat A Panda


Run for your lives, it's another post about the Panda update! I don't usually blog about SEO news and algorithm shifts, for a couple of reasons: firstly, it's boring and other sites will always be better for that sort of thing; secondly, because of the approach I take to SEO I rarely get affected by changes at Google.

Actually, most of my sites and my clients' sites have either gained slightly or been unaffected by the Panda update. I do however think it's pretty significant, and I've spoken to a few site owners who have been smashed by it, although not nearly as badly as some are making out (and no, I won't be linking to Searchmetrics here!).

Before I talk about how I would go about beating the Panda, I have to point out that this update seems specifically designed to devalue sites in markets which Google is looking to enter. Airline aggregation, voucher codes, reviews and local directories are all businesses Google wants to muscle in on, and reducing these sites' visibility, and therefore their traffic, is a handy way to bolster the visibility of Google's own offerings in those markets. From that point of view, if you're in a business which Google wants a piece of, you might have more problems on your plate than this one update.

Anyway on with the post…

Most discussion around the update so far seems to be concerned with content and the devaluation of so-called "low quality content". For me that's a bit of a red herring. While that may be the goal, explaining the update in these terms suggests that Google is making a qualitative assessment of content in the way a human reader would. Thinking in those terms probably isn't that helpful: Google still has pretty much the same data to work from when it comes to assessing a page, and the update is simply a different interpretation of that data, namely:

  • Internal links
  • External links
  • Onpage factors
  • Duplication checking
  • Click stream data (maybe)

If you look at those factors, actually very little has changed, and sites which do well on all of them (think Wikipedia) don't appear to have been affected.

Part of the issue here seems to be that for a while now Google has been ranking pages with few links based on domain "trust" and strong onpage signals – hence the eHow effect. Now, unless your articles are directly (or closely) linked to the source of that trust through internal links, or have links coming into them from external sources, they might not rank like they used to.

So what the f**k do I do?

My first bit of advice to sites which have taken a knock is: don't panic and jump on the first advice you read on SEO blogs. One of the knee-jerk reactions I've heard bandied around so far is to remove "low quality content" from your site. Last week I received a few of these emails from Bright Hub, where I'm an author:

[Screenshot: Bright Hub email]
Now, I'm not going to tell you that every article I've ever published on article sites is top quality, but Bright Hub are really throwing the baby out with the bathwater here. Some of the articles they're deleting were really pretty decent, and now that they've returned them to their authors they'll never get that content back. The first thing I did when I got these emails was to take that content and publish it on other sites instead, as I'm sure most of the affected authors did.

Other article sites also seem to be taking similar steps, banning user accounts and getting rid of articles. If it were me I'd keep publishing for now and ride out the storm.

Your content is not the problem

In most cases, at least. For years Google has preached about content and how site owners should create great content and wait for the traffic that would magically appear. I really don't think the point of this update was to make site owners start thinking of their content as a liability, or to penalise sites for having too much content.

If your content is copied or scraped from other sites then, yes, it's sucking PageRank from your good pages and could do you more harm than good. But I don't see any case where it would be a good idea to delete unique content, whatever its quality. Instead I would first look at ways to deliver more links to your deep content pages, either by improving your internal link architecture or by boosting inbound links from external sources.
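
To make that concrete, here's a rough sketch of how you might spot under-linked deep pages on a smallish site. This is just an illustration, not a recommendation of any particular tool: the start URL and page limit are made up, and it assumes you have Python with the requests and beautifulsoup4 packages installed. It crawls the site and counts how many internal links point at each page, so the pages at the top of the output are the first candidates for new internal links.

    # Rough sketch: crawl a small site and count the internal links pointing at
    # each page, to find deep content pages that aren't linked from anywhere important.
    from collections import Counter, deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START_URL = "https://www.example.com/"   # hypothetical site root
    DOMAIN = urlparse(START_URL).netloc
    MAX_PAGES = 500                          # keep the crawl small

    inbound = Counter()                      # page URL -> count of internal links pointing at it
    seen, queue = {START_URL}, deque([START_URL])

    while queue and len(seen) <= MAX_PAGES:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target = urljoin(url, a["href"]).split("#")[0]
            if urlparse(target).netloc != DOMAIN:
                continue                     # external link - not counted here
            inbound[target] += 1
            if target not in seen:
                seen.add(target)
                queue.append(target)

    # Pages with the fewest internal links pointing at them are the first
    # candidates for new links from your internal link hubs.
    for url, count in sorted(inbound.items(), key=lambda item: item[1])[:20]:
        print(count, url)
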

Thin content pages

Now, at the other end of the spectrum, we have thin content pages: the voucher code and review sites which seem to have taken a hit. From what I've seen, and obviously this is just theory at this stage, this isn't as simple as the number of words on your pages, so all those site owners who are desperately ordering Textbroker content to thicken out their deep pages might want to hold fire.

Adding content is always a good idea, and if you add 400 extra words to any page you should get more search traffic, but I don't think this works the way most people understand it. If you've seen your long tail traffic take a hit after Panda, one way to regain some of that traffic is to add more content, but that isn't going to fix the systemic issues with your site or win back the traffic you lost to Panda; it's just going to bump up your long tail search traffic elsewhere by adding new keywords to your pages which you weren't ranking for before.

Again, here I would personally look to address other issues before adding content en masse.

Action plan

So here’s the stuff I’d look at:

  • Identify the pages on your site which have lost traffic after the update – is it across the board, or just certain sections?
  • Identify your internal link hubs – the parts of your site with the highest PageRank or the largest number of internal/external links.
  • Internal linking – instead of deleting pages which have been affected by Panda, create new links to those pages from your internal link hubs (or from pages closely linked from your link hubs)
  • Run an external link building campaign for a select number of affected pages to see if a boost in external links helps. Be systematic: use one tactic to link to one group of pages and another tactic for a different group, so you can see what works and what doesn't. Going forward, try to ensure that every page on your site has at least one external link pointing to it. Don't be afraid to link out if that's what it takes to get links back.
  • Audit your outbound links. Linking out is probably a good thing in general, but not if you're linking to spam – that'll kill you. Use queries like this to identify and delete these links (there's a sketch of a scripted outbound link audit after this list).
  • Identify duplicate content on your site using Copyscape. If nobody ever visits those pages then you can think about deleting them; otherwise maybe just noindex them for the time being (there's also a quick internal duplicate check after this list).
  • Build trust in the root of your domain. It never hurts to do this anyway, but I've noticed that while a few of the sites I've seen affected by the update have lots of links/trust at their root (homepage), their link velocity has plateaued in recent years. Start by checking your site for links to broken pages and 301 redirecting these back to your homepage (the Link Juice Recovery Tool is good for this). Maybe even think about Yahoo Directory listings if you don't already have them, or other ways to quickly boost your quality link numbers and get your homepage link velocity moving again.
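
As promised, here's a rough sketch of the outbound link audit. Again this is just an illustration: the domain and page list are hypothetical (in practice you'd pull your page URLs from your sitemap), and it needs the same requests and beautifulsoup4 packages. It simply lists every external domain your pages link to, with a count, so you can eyeball the list for anything spammy.

    # Rough sketch of an outbound link audit: list every external domain your
    # pages link to, with a count, so you can review the list for spam by hand.
    from collections import Counter
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    MY_DOMAIN = "www.example.com"            # hypothetical - your own domain
    PAGES = [                                # hypothetical - e.g. pulled from your sitemap
        "https://www.example.com/",
        "https://www.example.com/some-article/",
    ]

    external = Counter()                     # external domain -> number of links to it
    for page in PAGES:
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            host = urlparse(urljoin(page, a["href"])).netloc
            if host and host != MY_DOMAIN:
                external[host] += 1

    # Review this list by hand: anything you don't recognise, or wouldn't want
    # to vouch for, is a candidate for removal or nofollow.
    for host, count in external.most_common():
        print(count, host)
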
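And a quick-and-dirty companion to the Copyscape check above. Copyscape covers copies of your content elsewhere on the web; this sketch only catches pages within your own site whose visible text is exactly identical, which is often enough to flag obvious duplicates. The page list is hypothetical again, and the same package assumptions apply.

    # Quick internal duplicate check: hash the visible text of each page and
    # group pages whose text comes out identical.
    import hashlib
    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup

    PAGES = [                                # hypothetical - your own page URLs
        "https://www.example.com/page-a/",
        "https://www.example.com/page-b/",
    ]

    by_hash = defaultdict(list)              # hash of page text -> pages with that text
    for page in PAGES:
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        text = " ".join(BeautifulSoup(html, "html.parser").get_text().split())
        by_hash[hashlib.md5(text.encode("utf-8")).hexdigest()].append(page)

    for pages in by_hash.values():
        if len(pages) > 1:
            print("Possible duplicates:", pages)
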

Bear in mind that this might not be good advice; these are just ideas. But the point is it's definitely not bad advice: all of this stuff is actionable and should only help your site overall, even if it doesn't address your Panda-related issues.
