I read that Steve Rubel over at Micro Persuasion is attempting another test of engines. This time it is a test of memetrackers; he is after a post on Breememe.
This is a follow-up to Robert Scoble's test of search engines with the Brrreeeport test.
Now this is fine, and I have done my bit, BUT the thing that concerns me is the lack of analytical rigor being applied to these tests. Other than a gut feeling from the main player about who comes out best, there does not seem to be any methodology behind the investigation.
It seems to me that what is needed is an experiment backed by a statistically sound methodology. That way the merits of different engines could be judged not just on who has the most links at random moments, but on how engines performed over time and what groupings, if any, were discovered. For example: does one engine favor certain blog hosts, or does one have a greater lag but better life-cycle accuracy?
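To make the idea concrete, here is a minimal sketch of the kind of measurement I mean, assuming one polled each engine at fixed intervals and recorded which linking hosts it had found for the test post. Everything here is hypothetical — the engine names, the host names, and the helper functions (`Observation`, `detection_lag`, `mean_lag`) are illustrative, not any real tracker's API:

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical observation: at a given time (minutes after the test post
# was published), an engine reported a set of hosts linking to the post.
@dataclass
class Observation:
    engine: str
    minutes_after_post: float
    hosts_seen: frozenset

def detection_lag(observations, host):
    """Earliest time each engine first reported `host` linking to the post."""
    lags = {}
    for obs in sorted(observations, key=lambda o: o.minutes_after_post):
        if host in obs.hosts_seen and obs.engine not in lags:
            lags[obs.engine] = obs.minutes_after_post
    return lags

def mean_lag(observations, hosts):
    """Average detection lag per engine over all ground-truth linking hosts."""
    per_engine = {}
    for host in hosts:
        for engine, lag in detection_lag(observations, host).items():
            per_engine.setdefault(engine, []).append(lag)
    return {engine: mean(lags) for engine, lags in per_engine.items()}

# Toy data: two made-up engines polled at 10-minute intervals.
obs = [
    Observation("EngineA", 10, frozenset({"blogs.example.com"})),
    Observation("EngineB", 10, frozenset()),
    Observation("EngineA", 20, frozenset({"blogs.example.com", "typepad.example.net"})),
    Observation("EngineB", 20, frozenset({"blogs.example.com"})),
    Observation("EngineB", 30, frozenset({"blogs.example.com", "typepad.example.net"})),
]
hosts = {"blogs.example.com", "typepad.example.net"}
print(mean_lag(obs, hosts))  # EngineA averages 15.0 minutes, EngineB 25.0
```

Repeating this over many posts and breaking the lags down by blog host would answer exactly the grouping questions above — whether an engine systematically finds some hosts faster than others.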
I just wish I had the time and brains to write such software and the traffic to get such a test underway.
EDIT: Another thing I fear will distort the accuracy of the results is that so many people have their comments and trackbacks moderated. This will introduce a massive lag in any engine that examines the linkage for a post, with major implications for any engine's responsiveness.