BBC reminds me of two elements of consumer satisfaction

I’m a big fan of much of the work the BBC does online, and in general it does a very good job of providing a massive amount of content in a fairly logical manner.

But using the site as a consumer with a couple of urgent needs highlighted two things which I think are good lessons for any website:

Multi-channel delivery:

I’m a huge fan of the BBC iPlayer, and the fact it allows me to watch good quality online and on-demand television. So on Sunday morning, I rushed to watch Match of the Day, having missed it on Saturday night (and with the Absolute Radio Fantasy Football game meaning I need to pay extra attention to every team this year!).

But the listing was greyed out – and with no reason given, I had to presume it was down to the licensing rights for the Premier League.

So it was a bit weird to be looking for something else a bit later, and to stumble across it in the Sport section! (My flaw here was attempting to browse my way to it, rather than using the site search or Google.)

The lesson: If you’re putting out content through two different channels for whatever reason, then link between them! And always try to explain why someone can’t access something if they might logically think they should.


Live event coverage:

The BBC carries a lot of event coverage, particularly in areas such as music, and especially sport. For example, it’s great to be able to watch the MotoGP series via the BBC, and also great to be able to see the full list of races (125 and 250cc) online, as my TV set-up seems to struggle with the Red Button Freeview channels.

But although it’s nice to see everything go live at the same time, as if a single switch somewhere brings it all to life, it isn’t that impressive unless you’ve got Freeview and the website running side by side. And because the online feed wasn’t listed on the MotoGP page of the Sport section until the video went live, two minutes after the scheduled time, I probably wasn’t the only one frantically refreshing the page to see whether it would appear or whether there was a problem.

The lesson: If you’re covering an event that starts at a specific time, why not have a page and link ready and live in advance, which can provide a bit of reassurance for internet users? That way, we can relax knowing that everything will go live at noon, for example, rather than worrying that there’s a technical fault with 1 minute to go. Whatever happens afterwards, we’re already stressed and less likely to enjoy and appreciate your hard work!


I’m still a huge fan of the BBC, and there are hundreds of sites which could have been used for the same points – the reason it stood out for me was that I was a completely powerless consumer. Reinforcing the final lesson – always look at your website as a consumer trying to achieve something.

Should you stop linking to Wikipedia? (Black Hole SEO)

I’m not a huge fan of ‘Black Hat’ SEO (i.e. bending or breaking the rules to game search rankings), but I do like to be aware of what goes on. And a recent discussion on ‘Black Hole SEO’ struck a chord with me beyond simple search engine optimisation, so I thought it was worth flagging to the wider world (to readers of TheWayoftheWeb, anyway!).

Basically it refers to sites which are large enough to have authority across topics, and which then ensure all links are either internal or ‘no-follow’ links (meaning they pass no authority in Google rankings). There’s been discussion about ‘no-follow’ since its introduction, mainly around whether a blog comment should result in a legitimate link, or whether making comment links no-follow discourages spammers.
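For anyone unfamiliar with the mechanics: ‘no-follow’ is just an HTML attribute, rel="nofollow", on a link. As a rough illustration (a minimal sketch using only the Python standard library, with an invented sample page), here’s how you might audit a page to separate its ‘follow’ links from its ‘no-follow’ ones:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collects a page's links, split by whether they carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.follow = []
        self.nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "external nofollow"
        rel = (attrs.get("rel") or "").split()
        (self.nofollow if "nofollow" in rel else self.follow).append(href)

# Invented sample page for illustration
page = """
<p>See <a href="https://en.wikipedia.org/wiki/SEO">Wikipedia</a> and
<a rel="nofollow" href="https://example.com/blog">a small blog</a>.</p>
"""

audit = LinkAudit()
audit.feed(page)
print("follow:  ", audit.follow)
print("nofollow:", audit.nofollow)
```

Both links still work for a human reader and both can send traffic; the difference is invisible on the page itself, which is part of why the pattern goes largely unnoticed.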

But this is far more worrying, as it essentially means large sites are following the example of Wikipedia. Because Wikipedia has so much content and authority, we all boost its rankings by linking to it. But when Wikipedia needs external information, it rewrites that information and links to it internally, or links out with a ‘no-follow’. The external site still gets a traffic boost, but no ranking advantage.

There’s more on SEOBlackHat, which uses examples from mainstream media, including the New York Times and Business Week. Daily Blog Tips has an open discussion on whether to boost your own sites with this method, while SEOBlackHat gives a ‘how-to’ guide.

But no one has looked at the ethical debate around this, as far as I’m aware, which is what I’d like to do. I have some sympathy for Wikipedia, as a reference work, limiting external links in this way, although I do question whether it’s the correct approach, as it essentially limits the reward for any site putting time and effort into creating something valuable on a subject.

But I seriously question mainstream media (MSM) outlets or sites like Digg for doing it – these are organisations which make a profit from the content they display, and from the position they occupy within search rankings. As ‘link journalism‘ begins to rise, and more people recognise smaller blogs and websites as relevant within their field, it’s only right that those sites should receive the reward for their efforts, whether that reward is recognition or money.

And in the long term it has serious implications for these sites – if they rely on people providing content to enable a wide range of topics, internal rankings, and high search results, then they depend on that content continuing to arrive. If the hundreds or thousands of people who provide this content become disillusioned because they aren’t getting recognition or reward for their efforts, will they start to rebel by removing content, embedding code, or copyrighting their work and charging the MSM for it?

Will we end up with an internet which is based around paying to be able to link to someone, rather than rewarding them by sending them PR and traffic?

I can understand why large sites make content available for internal linking, and this is to be expected. But as I write this, I’m becoming more and more convinced that not rewarding external sites when they are linked to is akin to stealing.

(Disclosure: I work across various titles for Bauer Media, and as far as I’m involved, and aware of, external links are encouraged, and are ‘do-follow’.)

Do you agree? Or do you think it’s nothing to worry about? And if you’re a ‘do-follow’ advocate, what action would you suggest to counteract these SEO black holes?