By unifying theories of information, optimization, and statistical physics, the value-of-information theory has gained recognition in data science, machine learning, and artificial intelligence.
Foreword.- Preface.- 1 Definition of information and entropy in the absence of noise.- 2 Encoding of discrete information in the absence of noise and penalties.- 3 Encoding in the presence of penalties. The first variational problem.- 4 The first asymptotic theorem and related results.- 5 Computation of entropy for special cases. Entropy of stochastic processes.- 6 Information in the presence of noise. Shannon's amount of information.- 7 Message transmission in the presence of noise. The second asymptotic theorem and its various formulations.- 8 Channel capacity. Important particular cases of channels.- 9 Definition of the value of information.- 10 The value of Shannon information for the most important Bayesian systems.- 11 Asymptotic results related to the value of information. The third asymptotic theorem.- 12 Information theory and the second law of thermodynamics.- Appendix: Some matrix (operator) identities.- Index.
“The book could be useful in advanced graduate courses with students, who are not afraid of integrals and probabilities.” (Jaak Henno, zbMATH 1454.94002, 2021)
Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, Walid V. Soussou, Dawn M. Taylor, Patrick A. Tresco
Danilo Comminiello (Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Italy); Jose C. Principe (Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL, USA)