Rupert Murdoch Threatens to Pull WSJ, Other News Corp. Sites From Google

When news sites complain that Google is “ripping them off” by indexing their content, the question is why they don’t just block Google’s spiders via robots.txt. The answer, of course, is that they don’t want to give up the traffic Google sends their way — they want the traffic and they want Google to pay them for it. Murdoch, though, now says they just might block Google’s spiders after all.
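For reference, blocking Google’s crawler via robots.txt really is that simple. A minimal sketch (using `Googlebot`, Google’s documented crawler token) placed at the site root would look like:

```
# Block Google's web crawler from the entire site
User-agent: Googlebot
Disallow: /

# Leave all other crawlers unrestricted
User-agent: *
Disallow:
```

With that file served at `/robots.txt`, a compliant Googlebot would stop indexing the site — which is exactly the traffic-killing step publishers have so far been unwilling to take.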

Monday, 9 November 2009