## Info Theory

1.

We have 14 discrete symbols s1, s2, …, s14. Each symbol has probability p1, p2, …, p14 respectively of being transmitted. Is the average transmitted information in bits (p1\*log2(1) + p2\*log2(2) + … + p14\*log2(14)) / 14, or (p1\*log2(14) + p2\*log2(14) + … + p14\*log2(14)) / 14, or something else? Is the information for each symbol in bits log2(14), or something else? (I am not a student, and sorry, it's not a subject of C.)
Thanks.

2. So if it's not a C topic, why did you feel the need to post it on the C forum, then feign apology?

Quzah.

3. Moved to tech.

4. Originally Posted by vangmor
> We have 14 discrete symbols s1, s2, …, s14. Each symbol has probability p1, p2, …, p14 respectively of being transmitted. Is the average transmitted information in bits (p1\*log2(1) + p2\*log2(2) + … + p14\*log2(14)) / 14, or (p1\*log2(14) + p2\*log2(14) + … + p14\*log2(14)) / 14, or something else? Is the information for each symbol in bits log2(14), or something else? (I am not a student, and sorry, it's not a subject of C.)
> Thanks.
The entropy is p1\*log2(1/p1) + p2\*log2(1/p2) + p3\*log2(1/p3) + … + p14\*log2(1/p14). The entropy is maximized when all the symbols are equally likely, so each probability is 1/14, in which case the entropy is log2(14).

5. Thanks SilentStrike!