1. The Science of Information

2. Information Measures
   2.1 Independence and Markov Chains
   2.2 Shannon's Information Measures
   2.3 Continuity of Shannon's Information Measures
   2.4 Chain Rules
   2.5 Informational Divergence
   2.6 The Basic Inequalities
   2.7 Some Useful Information Inequalities
   2.8 Fano's Inequality
   2.9 Entropy Rate of Stationary Source
   Problems
   Historical Notes

3. Zero-Error Data Compression
   3.1 The Entropy Bound
   3.2 Prefix Codes
      3.2.1 Definition and Existence
      3.2.2 Huffman Codes
   3.3 Redundancy of Prefix Codes
   Problems
   Historical Notes

4. Weak Typicality
   4.1 The Weak AEP
   4.2 The Source Coding Theorem
   4.3 Efficient Source Coding
   4.4 The Shannon-McMillan-Breiman Theorem
   Problems
   Historical Notes

5. Strong Typicality
   5.1 Strong AEP
   5.2 Strong Typicality Versus Weak Typicality
   5.3 Joint Typicality
   5.4 An Interpretation of the Basic Inequalities
   Problems
   Historical Notes

6. The I-Measure
   6.1 Preliminaries
   6.2 The I-Measure for Two Random Variables
   6.3 Construction of the I-Measure μ*
   6.4 μ* Can Be Negative
   6.5 Information Diagrams
   6.6 Examples of Applications
   Appendix 6.A: A Variation of the Inclusion-Exclusion Formula
   Problems
   Historical Notes

7. Markov Structures
   7.1 Conditional Mutual Independence
   7.2 Full Conditional Mutual Independence
   7.3 Markov Random Field
   7.4 Markov Chain
   Problems
   Historical Notes

8. Channel Capacity
   8.1 Discrete Memoryless Channels
   8.2 The Channel Coding Theorem
   8.3 The Converse
   8.4 Achievability of the Channel Capacity
   8.5 A Discussion
   8.6 Feedback Capacity
   8.7 Separation of Source and Channel Coding
   Problems
   Historical Notes

9. Rate-Distortion Theory
   9.1 Single-Letter Distortion Measures
   9.2 The Rate-Distortion Function R(D)
   9.3 The Rate-Distortion Theorem
   9.4 The Converse
   9.5 Achievability of R_I(D)
   Problems
   Historical Notes

10. The Blahut-Arimoto Algorithms
   10.1 Alternating Optimization
   10.2 The Algorithms
      10.2.1 Channel Capacity
      10.2.2 The Rate-Distortion Function
   10.3 Convergence
      10.3.1 A Sufficient Condition
      10.3.2 Convergence to the Channel Capacity
   Problems
   Historical Notes

11. Single-Source Network Coding
   11.1 A Point-to-Point Network
   11.2 What Is Network Coding?
   11.3 A Network Code
   11.4 The Max-Flow Bound
   11.5 Achievability of the Max-Flow Bound
      11.5.1 Acyclic Networks
      11.5.2 Cyclic Networks
   Problems
   Historical Notes

12. Information Inequalities
   12.1 The Region Γ*_n
   12.2 Information Expressions in Canonical Form
   12.3 A Geometrical Framework
      12.3.1 Unconstrained Inequalities
      12.3.2 Constrained Inequalities
      12.3.3 Constrained Identities
   12.4 Equivalence of Constrained Inequalities
   12.5 The Implication Problem of Conditional Independence
   Problems
   Historical Notes

13. Shannon-Type Inequalities
   13.1 The Elemental Inequalities
   13.2 A Linear Programming Approach
      13.2.1 Unconstrained Inequalities
      13.2.2 Constrained Inequalities and Identities
   13.3 A Duality
   13.4 Machine Proving - ITIP
   13.5 Tackling the Implication Problem
   13.6 Minimality of the Elemental Inequalities
   Appendix 13.A: The Basic Inequalities and the Polymatroidal Axioms
   Problems
   Historical Notes

14. Beyond Shannon-Type Inequalities
   14.1 Characterizations of Γ*_n
   14.2 A Non-Shannon-Type Unconstrained Inequality
   14.3 A Non-Shannon-Type Constrained Inequality
   14.4 Applications
   Problems
   Historical Notes

15. Multi-Source Network Coding
   15.1 Two Characteristics
      15.1.1 The Max-Flow Bounds
      15.1.2 Superposition Coding
   15.2 Examples of Application
      15.2.1 Multilevel Diversity Coding
      15.2.2 Satellite Communication Network
   15.3 A Network Code for Acyclic Networks
   15.4 An Inner Bound
   15.5 An Outer Bound
   15.6 The LP Bound and Its Tightness
   15.7 Achievability of R_in
   Appendix 15.A: Approximation of Random Variables with Infinite Alphabets
   Problems
   Historical Notes

16. Entropy and Groups
   16.1 Group Preliminaries
   16.2 Group-Characterizable Entropy Functions
   16.3 A Group Characterization of Γ*_n
   16.4 Information Inequalities and Group Inequalities
   Problems
   Historical Notes

Bibliography

Index