What is information?
Entropy of discrete variables
Source coding theorem
Noisy channel coding theorem
Entropy of continuous variables
Mutual information: continuous
Channel capacity: continuous
Thermodynamic entropy and information
Information as Nature's currency
Further reading
A. Glossary
B. Mathematical symbols
C. Logarithms
D. Probability density functions
E. Averages from distributions
F. The rules of probability
G. The Gaussian distribution
H. Key equations
Information Theory: A Tutorial Introduction by James V. Stone. ISBN 9780956372857. Published by Sebtel Press, 2015.