-.- Renyi-Kahre disinformation: their information can become negative

Copyright (C) 2002 + 2004, Jan Hajek, NL. NO part of this document may be
published, implemented, programmed, copied or communicated by any means
without an explicit & FULL reference to this author together with the FULL
title and the website www.humintel.com/hajek for the freshest version, or
www.matheory.info , plus the copyright note in texts and in ALL references
to this. An implicit, incomplete, indirect, disconnected or unlinked
reference (in your text and/or on www) does NOT suffice. All based on
1st-hand experience. ALL rights reserved.

Version 1.07a, 2004-9-9, lines of < 79+CrLf chars in plain ASCII, unlikely
to be updated soon; written in plain ASCII for easy viewing, submitted to
the webmaster of http://www.humintel.com/hajek (= the freshest version).
This epaper may read better (more left margin, PgDn/PgUp) outside your
email. This epaper has new facilities for fast finding and browsing.
Please save the last copy and use a file differencer to see only where the
versions differ: download Visual Compare VC154.ZIP and run
VCOMP vers1 vers2 /k /i , the best & brightest colorful line-by-line
comparer for plain .txt files. Your comments are welcome, preferably
interlaced into the lines of this .txt file. Search for the markers !

-.- Renyi-Kahre information is asymmetrical and can become negative:

Renyi entropy and information have been known since 1960 and were
published in { Renyi 1961 } and later. Shannon's mutual information
I(A;B) is symmetrical with respect to both random variables A, B, as it
holds:

  H(A) - H(A|B) = I(A;B) = I(B;A) = H(B) - H(B|A)

Due to this symmetry, Shannon's I(A;B) is provably suboptimal for tasks
like pattern classification, identification, decision making, feature
selection, diagnosing and pattern recognition, where we want high
hit-rates, i.e. low miss-rates, i.e. a low probability of (a
classification) error.
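The symmetry H(A) - H(A|B) = H(B) - H(B|A) is easy to check numerically.
The following sketch (mine, not from this epaper; all names are
illustrative) computes both directions of Shannon's mutual information
for a small joint distribution of two binary variables:

```python
# Checking the symmetry H(A) - H(A|B) = I(A;B) = I(B;A) = H(B) - H(B|A)
# on a concrete joint distribution of two binary random variables A, B.
from math import log2

# joint[a][b] = P(A=a, B=b)
joint = [[0.30, 0.10],
         [0.20, 0.40]]

def H(dist):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(p * log2(p) for p in dist if p > 0.0)

pA = [sum(row) for row in joint]                              # marginal of A
pB = [sum(joint[a][b] for a in range(2)) for b in range(2)]   # marginal of B

def H_cond(joint, pZ, axis):
    """H(X|Z): p(z)-weighted average entropy of X given each value of Z."""
    h = 0.0
    for z, pz in enumerate(pZ):
        cond = ([joint[x][z] / pz for x in range(2)] if axis == 0
                else [joint[z][x] / pz for x in range(2)])
        h += pz * H(cond)
    return h

I_AB = H(pA) - H_cond(joint, pB, axis=0)   # H(A) - H(A|B)
I_BA = H(pB) - H_cond(joint, pA, axis=1)   # H(B) - H(B|A)
print(I_AB, I_BA)                          # the two directions agree
```

Whatever joint distribution is plugged in, the two directions coincide,
which is exactly the symmetry the text above objects to.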
Analogously re-formulated Renyi information { Kahre 2002, p.106 } is not
only asymmetrical (I do not know if its asymmetry has been pointed out
elsewhere), i.e. R(A@B) <> R(B@A), which should be good for higher
hit-rates, but it has also been believed to be non-negative, like
Shannon's information. I consider such asymmetry a highly desirable
property for tasks like pattern classification, identification, decision
making, feature selection, diagnosing and pattern recognition. But
negativity disqualifies an information measure as such, unless its
negative values can be meaningfully interpreted as disinformation.

In the days prior to August 12, 2002, I disproved the ruling conjecture
of the non-negativity of R(A@B). The disproof is easily done by
counterexamples generated by systematically trying out many values of the
probabilities needed to specify R(A@B) for a pair of 2-valued, i.e.
binary, random variables A, B. The results obtained so far with my
program ARenyi.pas for varying inputs, including the parameter alpha:

! The lowest value of alpha for which R(A@B) was negative, taking into
  account the finite precision of even extended arithmetic, was 2.010.
! My other numerical & graphical investigations with XPL have confirmed
  that R(A@B) < 0.0 is possible for alpha > 2.00.
! A lowest value of R(A@B) = -0.048 approx. was found for alpha = 7.05.
! A lowest value of R(A@B) = -0.076 approx. was found for alpha = 29.93.

Conclusion: Kahre's Renyi information is no information for those alphas
for which it can become negative. Quiz: Is it then disinformation?

-.- References { refs } :

Q: Which refs are books and which are papers?
A: Unlike in the titles of the (e)papers listed here, in the titles of
books and periodicals all words start with a CAP, except for
insignificant words like e.g. a, and, der, die, das, for, from, in, of,
or, to, the, with, etc.

Alfred Renyi: On measures of entropy and information, Proc.
Fourth Berkeley Symposium on Mathematical Statistics and Probability,
1960, published 1961, vol.1, pp. 547-561.

Jan Kahre: The Mathematical Theory of Information, Kluwer Academic
Publishers, 2002.

-.-
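The brute-force counterexample search described above can be sketched as
follows. This is my own reconstruction, not the author's ARenyi.pas, and
it assumes one natural reading of the asymmetric quantity, namely
R(A@B) = H_a(A) - sum_b p(b)*H_a(A|B=b) with H_a the Renyi entropy of
order a; Kahre's exact formulation on p.106 may differ in detail. Under
this assumption the qualitative finding is reproducible: for alpha well
above 2 the binary Renyi entropy has convex regions, so mixing can raise
the average conditional entropy above the marginal one, driving R(A@B)
below zero.

```python
# Sketch (assumed definition, see lead-in): R(A@B) for binary A, B.
from math import log

def renyi_H(dist, a):
    """Renyi entropy of order a (natural log base), a != 1."""
    return log(sum(p ** a for p in dist if p > 0.0)) / (1.0 - a)

def renyi_info(p_b, p_a_given_b, a):
    """R(A@B) = H_a(A) - sum_b P(B=b) * H_a(A|B=b) for binary A, B.

    p_b         : P(B=1)
    p_a_given_b : (P(A=1|B=0), P(A=1|B=1))
    """
    pB = (1.0 - p_b, p_b)
    # marginal P(A=1) is a mixture of the two conditionals
    pa = pB[0] * p_a_given_b[0] + pB[1] * p_a_given_b[1]
    h_marg = renyi_H((pa, 1.0 - pa), a)
    h_cond = sum(pB[b] * renyi_H((p_a_given_b[b], 1.0 - p_a_given_b[b]), a)
                 for b in (0, 1))
    return h_marg - h_cond

# One concrete counterexample: equiprobable B, nearby conditionals.
print(renyi_info(0.5, (0.05, 0.15), a=10.0))   # negative for alpha = 10
print(renyi_info(0.5, (0.05, 0.15), a=1.5))    # positive for alpha = 1.5
```

Scanning p_b, the conditionals and alpha over a fine grid, as the epaper
describes, is then a triple loop over renyi_info; with this assumed
definition negatives appear only for alpha above roughly 2, consistent
with the alpha > 2.00 boundary reported above.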