
Information Measures: Information and its Description in Science and Engineering - Softcover

 
ISBN 13: 9783642566707

This specific ISBN edition is currently not available.

Synopsis

Abstract
Structure and Structuring
1 Introduction
  Science and information
  Man as control loop
  Information, complexity and typical sequences
  Concepts of information
  Information, its technical dimension and the meaning of a message
  Information as a central concept
2 Basic considerations
  2.1 Formal derivation of information
    2.1.1 Unit and reference scale
    2.1.2 Information and the unit element
  2.2 Application of the information measure (Shannon's information)
    2.2.1 Summary
  2.3 The law of Weber and Fechner
  2.4 Information of discrete random variables
3 Historic development of information theory
  3.1 Development of information transmission
    3.1.1 Samuel F. B. Morse 1837
    3.1.2 Thomas Edison 1874
    3.1.3 Nyquist 1924
    3.1.4 Optimal number of characters of the alphabet used for the coding
  3.2 Development of information functions
    3.2.1 Hartley 1928
    3.2.2 Dennis Gabor 1946
    3.2.3 Shannon 1948
      3.2.3.1 Validity of the postulates for Shannon's information
      3.2.3.2 Shannon's information (another possibility of a derivation)
      3.2.3.3 Properties of Shannon's information, entropy
      3.2.3.4 Shannon's entropy or Shannon's information
      3.2.3.5 The Kraft inequality
        Kraft's inequality
        Proof of Kraft's inequality
      3.2.3.6 Limits of the optimal length of codewords
        3.2.3.6.1 Shannon's coding theorem
        3.2.3.6.2 A sequence of n symbols (elements)
        3.2.3.6.3 Application of the previous results
      3.2.3.7 Information and utility (coding, portfolio analysis)
4 The concept of entropy in physics
  The laws of thermodynamics
  4.1 Macroscopic entropy
    4.1.1 Sadi Carnot 1824
    4.1.2 Clausius's entropy 1850
    4.1.3 Increase of entropy in a closed system
    4.1.4 Prigogine's entropy
    4.1.5 Entropy balance equation
    4.1.6 Gibbs's free energy and the quality of the energy
    4.1.7 Considerations on the macroscopic entropy
      4.1.7.1 Irreversible transformations
      4.1.7.2 Perpetuum mobile and transfer of heat
  4.2 Statistical entropy
    4.2.1 Boltzmann's entropy
    4.2.2 Derivation of Boltzmann's entropy
      4.2.2.1 Variation, permutation and the formula of Stirling
      4.2.2.2 Special case: Two states
      4.2.2.3 Example: Lottery
    4.2.3 The Boltzmann factor
    4.2.4 Maximum entropy in equilibrium
    4.2.5 Statistical interpretation of entropy
    4.2.6 Examples regarding statistical entropy
      4.2.6.1 Energy and fluctuation
      4.2.6.2 Quantized oscillator
    4.2.7 Brillouin-Schrödinger negentropy
      4.2.7.1 Brillouin: Precise definition of information
      4.2.7.2 Negentropy as a generalization of Carnot's principle
        Maxwell's demon
    4.2.8 Information measures of Hartley and Boltzmann
      4.2.8.1 Examples
    4.2.9 Shannon's entropy
  4.3 Dynamic entropy
    4.3.1 Eddington and the arrow of time
    4.3.2 Kolmogorov's entropy
    4.3.3 Rényi's entropy
5 Extension of Shannon's information
  5.1 Rényi's information 1960
    5.1.1 Properties of Rényi's entropy
    5.1.2 Limits in the interval 0 ≤ α < ∞
    5.1.3 Nonnegativity for discrete events
    5.1.4 Additivity and a connection to Minkowski's norm
    5.1.5 The meaning of Sα(A) for α → 1
    5.1.6 Graphical presentations of Rényi's information
  5.2 Another generalized entropy (logical expansion)
  5.3 Gain of information via conditional probabilities
  5.4 Other entropy or information measures
    5.4.1 Daróczy's entropy
    5.4.2 Quadratic entropy
    5.4.3 R-norm entropy
6 Generalized entropy measures
  6.1 The corresponding measures of divergence
  6.2 Weighted entropies and expectation values of entropies
7 Information functions and Gaussian distributions
  7.1 Rényi's information of a Gaussian distributed random variable
    7.1.1 Rényi's α-information
    7.1.2 Rényi's G-divergence
  7.2 Shannon's information
8 Shannon's information of discrete probability distributions
  8.1 Continuous and discrete random variables
    8.1.1 Summary
  8.2 Shannon's information of a Gaussian distribution
  8.3 Shannon's information as the possible gain of information ...
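
For orientation, the central measures named in the table of contents have the following standard textbook forms (a reference sketch only; the book's own notation and normalization may differ):

Shannon's information (entropy) of a discrete distribution $p = (p_1, \dots, p_n)$:

$$H(p) = -\sum_{i=1}^{n} p_i \log_2 p_i$$

Rényi's entropy of order $\alpha$ (for $\alpha > 0$, $\alpha \neq 1$), which tends to $H(p)$ as $\alpha \to 1$:

$$S_\alpha(p) = \frac{1}{1-\alpha} \log_2 \sum_{i=1}^{n} p_i^{\alpha}$$

Kraft's inequality, satisfied by the codeword lengths $l_1, \dots, l_n$ of any binary prefix code:

$$\sum_{i=1}^{n} 2^{-l_i} \le 1$$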

"synopsis" may belong to another edition of this title.


Other Popular Editions of the Same Title

9783540416333: Information Measures: Information and its Description in Science and Engineering

Featured Edition

ISBN 10: 3540416331, ISBN 13: 9783540416333
Publisher: Springer, 2001
Hardcover