If you’ve read my posts before you might have noticed I’m a little paranoid about competitors mining my links. In part that’s because I don’t want them to see some of the stuff I’m up to! But more importantly, when you’re working in competitive sectors like gambling or travel, every link counts: the difference between ranking 1st or 2nd can come down to a couple of decent links, and that can be worth hundreds of thousands of pounds.
I’m always playing around with ways to throw competitors off the scent, and here’s one I’ve been testing lately. My testing is never as rigorous as it could be, but hopefully it’s a bit of insight you can go away and play around with yourself.
If I put a link on a page which isn’t indexed (using the robots meta tag) will it:
a) Pass link weight
b) Show up in backlink analysis tools like Yahoo Site Explorer (YSE) or Open Site Explorer (OSE)
What I did was find 3 pages at random on decent domains. I didn’t want pages with thousands of links because it would be harder to see whether my links were showing up in the backlink reports. I’ve given the original link numbers for each page in brackets below…
I then linked to each page using unique ID numbers as the anchor text as you can see below.
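To make that concrete, here’s a rough sketch of the sort of thing I mean. The URLs here are placeholders and the IDs are generated on the fly – they’re for illustration only, not the actual test pages or anchors:

```python
import uuid

# Placeholder target pages (not the real pages used in the test).
targets = [
    "https://example.com/page-a",
    "https://example.com/page-b",
    "https://example.com/page-c",
]

links = []
for url in targets:
    # A random hex string is very unlikely to appear anywhere else on the
    # web, so an exact-match Google search for it should only ever return
    # the pages Google associates with that anchor text.
    anchor_id = uuid.uuid4().hex[:12]
    links.append(f'<a href="{url}">{anchor_id}</a>')

for link in links:
    print(link)
```

The point of the unique IDs is simply that they give you a clean, unambiguous search query later on.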
This page sat on a little microsite I own. It’s not the best site in the world, but it’s well indexed, with a PR 4 homepage, a few links, yada yada – that’s not really important. What’s important is that the page used a robots meta tag set to noindex, follow:
<meta name="robots" content="noindex,follow" />
For anyone who doesn’t know, this setting should let Google (and other crawlers like SEOmoz’s) see the page and follow any links on it, but not return the page itself in any search results.
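If you want to see roughly what “obeying” that tag involves, here’s a quick Python sketch of how a well-behaved crawler might read it. This is just my own illustration of the logic, not the actual code of Googlebot or any of the link tools:

```python
from html.parser import HTMLParser

# Illustrative only: parse a page's robots meta tag and derive the two
# decisions a crawler has to make (index the page? follow its links?).
class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives |= {d.strip().lower() for d in content.split(",")}

page = (
    '<html><head><meta name="robots" content="noindex,follow" /></head>'
    '<body><a href="/target">unique-id-anchor</a></body></html>'
)

parser = RobotsMetaParser()
parser.feed(page)

index_page = "noindex" not in parser.directives    # keep the page out of results
follow_links = "nofollow" not in parser.directives  # but still crawl its links
print("index:", index_page, "follow:", follow_links)
```

So for a noindex, follow page the crawler should come away with index: False, follow: True – it passes through the links but never shows the page itself.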
I then linked to the page sitewide so it got plenty of internal link juice.
I then gave it about 3 months. Primarily that was because I completely forgot to write this up until now, but it also gave the link data sources plenty of opportunity to update themselves. The Linkscape index in particular takes quite a while to refresh, but there have been a couple of index updates since then.
So there were two elements to this. The first was whether those links would ‘count’ to Google. We can tell these links are being counted because searching for the unique ID numbers in Google returns the target page for each link, as you can see on the links below:
Nothing too surprising here. This is exactly how Google should treat the robots tag, but it’s still reassuring to know it works.
The second and more important part of this test is checking the backlinks of those pages in our favourite link analysis tools.
I checked each of the 3 pages in Yahoo Site Explorer, SEOmoz’s Open Site Explorer and Majestic SEO. My page didn’t show up in the backlink reports of any of them. I’ll show you the link reports on the BBC page below…
Again, this is basically what we know should happen. The crawlers which provide this data should be obeying the robots tags. Still, it’s worth testing these things rather than relying on blind faith.
So how can we use this in our link building?
Well, for starters, any time you’re paying for a link you’d rather your competitors didn’t see, such as an advertorial, sponsored review or presell page, you could get the webmaster to apply a robots noindex to the page carrying the link. If you’re buying up high value links this should help reduce the chance of that kind of unwanted attention.
Second, if you’re running a few (or a few hundred) of your own sites or blogs, you’d probably rather your whole network wasn’t exposed in your backlink profile, so only put your links on certain pages and noindex, follow those.
Basically, it’s going to be far harder for competitors to map your link networks and copy your tactics (or spam report you) if those pages aren’t in the web index. Who knows, you might even be able to throw Google’s manual reviewers off the scent if you were worried about that sort of thing.
What we don’t know is whether a page which isn’t indexed will pass the same amount of link weight as a link on an indexed page. That’s for another day…