Talk:Interaction information

From Wikipedia, the free encyclopedia

This article was nominated for deletion in the past. The result of the discussion was keep.

Sign convention

The sign convention used for interaction information as described in the article is not the same as that used in the measure-theoretic generalization of mutual information to the multivariate case. (See Information theory and measure theory.) For an even number of random variables the signs are the same, whereas for an odd number of random variables the signs are opposite. This would seem to mean that the interaction information of a single random variable, if it were defined, would be negative (or at least non-positive). Or do I have it all wrong? 198.145.196.71 (talk) 06:20, 20 December 2007 (UTC)
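One way to state the suspected discrepancy in symbols, assuming the article's definition has the usual alternating-sum form over subsets (this is my own summary, not a formula from either source):

```latex
I(X_1;\ldots;X_n)
  \;=\; -\sum_{\varnothing \neq T \subseteq \{1,\ldots,n\}} (-1)^{\,n-|T|}\, H(X_T)
  \;=\; (-1)^{n}\,\mu\bigl(\tilde{X}_1 \cap \cdots \cap \tilde{X}_n\bigr),
\qquad
\mu\bigl(\tilde{X}_1 \cap \cdots \cap \tilde{X}_n\bigr)
  \;=\; \sum_{\varnothing \neq T \subseteq \{1,\ldots,n\}} (-1)^{\,|T|+1}\, H(X_T).
```

If this is right, the two conventions agree for even n and differ in sign for odd n, and for n = 1 one would get I(X) = -H(X) &le; 0, consistent with the "non-positive" observation above.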

You may have it right. I would email one of the authors of the Interaction papers (Jakulin). They can tell you definitively. —Dfass (talk) 14:33, 23 December 2007 (UTC)
I looked very carefully, for a second time, at the general formula given in their paper (which is duplicated in this article), and I do believe I am right. Towards the end of their paper they mention (but do not describe) a slightly different formulation of interaction information that can be derived from the set-theoretic principle of inclusion-exclusion, and they acknowledge that it is this formulation which is the more popular in recent literature (Yeung, for example).
Consider the case where all the variables but one are independent unbiased bits and the last is a parity bit. Here Jakulin and Bratko's formulation gives an interaction information of +1 in all cases, but the formulation based on measure theory, or rather on the set-theoretic principle of inclusion-exclusion, gives -1 when the number of random variables is odd and +1 when it is even. 198.145.196.71 (talk) 19:34, 24 December 2007 (UTC)
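The parity example above can be checked numerically. A minimal sketch (function names are my own, and it assumes the alternating-sum forms of the two conventions as I understand them, i.e. -&Sigma;(-1)^(n-|T|)H(X_T) for Jakulin/Bratko and &Sigma;(-1)^(|T|+1)H(X_T) for the inclusion-exclusion signed measure):

```python
from itertools import combinations, product
from math import log2

def parity_distribution(n):
    """Joint distribution of n-1 independent unbiased bits plus their parity bit."""
    outcomes = [bits + (sum(bits) % 2,) for bits in product([0, 1], repeat=n - 1)]
    p = 1.0 / len(outcomes)
    return {o: p for o in outcomes}

def subset_entropy(dist, idx):
    """Marginal entropy H(X_T) of the variables at positions idx."""
    marginal = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

def mcgill_interaction(dist, n):
    """McGill-style convention (as I read Jakulin & Bratko):
    -sum over nonempty T of (-1)^(n-|T|) H(X_T)."""
    return -sum((-1) ** (n - k) * subset_entropy(dist, idx)
                for k in range(1, n + 1)
                for idx in combinations(range(n), k))

def signed_measure(dist, n):
    """Inclusion-exclusion (Yeung-style) convention:
    sum over nonempty T of (-1)^(|T|+1) H(X_T)."""
    return sum((-1) ** (k + 1) * subset_entropy(dist, idx)
               for k in range(1, n + 1)
               for idx in combinations(range(n), k))

for n in (3, 4):
    d = parity_distribution(n)
    print(n, mcgill_interaction(d, n), signed_measure(d, n))
    # n=3 (odd):  McGill gives +1, signed measure gives -1
    # n=4 (even): both give +1
```

Running this for n = 3 and n = 4 reproduces exactly the pattern claimed above: +1 in all cases under the first convention, and -1 for odd n versus +1 for even n under the second.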