The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:

1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable as long as it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical' information theory is one of the measures conforming to the Law of Diminishing Information, but properties such as its symmetry make it unsuitable for some applications. The measure reliability is found to be a universal information measure.
2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra.
3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. And so on.

The Mathematical Theory of Information supports colligation, i.e. the ability to bind facts together, making 'two plus two greater than four'. Colligation is essential when the information carries knowledge or serves as a basis for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
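The book's own formalism is not reproduced in this synopsis, but the flavour of the law for discrete, finite signals can be sketched with one conforming measure, Shannon's mutual information: a channel is a row-stochastic matrix, a cascade of channels is a matrix product, and passing a signal through a further channel never increases the information about the source. A minimal NumPy sketch (the function names, and the choice of mutual information rather than reliability as the measure, are illustrative assumptions, not taken from the book):

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits, for an input distribution p_x and a channel
    given as a row-stochastic matrix with entries P(y|x)."""
    p_xy = p_x[:, None] * channel            # joint P(x, y)
    p_y = p_xy.sum(axis=0)                   # marginal P(y)
    indep = p_x[:, None] * p_y[None, :]      # product of marginals
    mask = p_xy > 0                          # avoid log(0) terms
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / indep[mask])))

def bsc(e):
    """Binary symmetric channel with crossover probability e."""
    return np.array([[1 - e, e], [e, 1 - e]])

p_x = np.array([0.5, 0.5])                   # a fair binary source

i_direct = mutual_information(p_x, bsc(0.1))               # X -> Y
i_cascaded = mutual_information(p_x, bsc(0.1) @ bsc(0.2))  # X -> Y -> Z

# Diminishing information: a further noisy channel never gains information.
assert i_cascaded <= i_direct
# Preservation: a reversible (here, identity) channel loses nothing.
assert abs(mutual_information(p_x, np.eye(2)) - 1.0) < 1e-12
```

The two assertions mirror two claims in the synopsis: information can only decrease through a channel cascade, and a reversible channel preserves it exactly.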


Published by
Springer-Verlag New York Inc.
(2013)

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Quantity Available: > 20

**Book Description **Springer-Verlag New York Inc., 2013. PAP. Condition: New. New Book. Shipped from US within 10 to 14 business days. THIS BOOK IS PRINTED ON DEMAND. Established seller since 2000. Seller Inventory # IQ-9781461353324

Published by
Springer-Verlag New York Inc., United States
(2013)

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Paperback
Quantity Available: 10

**Book Description **Springer-Verlag New York Inc., United States, 2013. Paperback. Condition: New. Language: English. Brand New Book. Print on Demand. Softcover reprint of the original 1st ed. 2002. Seller Inventory # AAV9781461353324

Published by
Springer-Verlag New York Inc., United States
(2013)

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Paperback
Quantity Available: 10

**Book Description **Springer-Verlag New York Inc., United States, 2013. Paperback. Condition: New. Language: English. This book usually ships within 10-15 business days and we will endeavor to dispatch orders quicker than this where possible. Brand New Book. Softcover reprint of the original 1st ed. 2002. Seller Inventory # LIE9781461353324

Published by
Springer-Verlag New York Inc.
(2013)

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Quantity Available: > 20

**Book Description **Springer-Verlag New York Inc., 2013. PAP. Condition: New. New Book. Delivered from our UK warehouse in 4 to 14 business days. THIS BOOK IS PRINTED ON DEMAND. Established seller since 2000. Seller Inventory # LQ-9781461353324

Published by
Springer
(2013)

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Paperback
Quantity Available: 1

**Book Description **Springer, 2013. Paperback. Condition: NEW. 9781461353324 This listing is a new book, a title currently in print, which we order directly and immediately from the publisher. For all enquiries, please contact Herb Tandree Philosophy Books directly - customer service is our primary goal. Seller Inventory # HTANDREE0406954

Published by
Springer
(2018)

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Paperback
Quantity Available: > 20

**Book Description **Springer, 2018. Paperback. Condition: New. Never used! This item is printed on demand. Seller Inventory # 1461353327

Published by
Springer
(2013)

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Softcover
Quantity Available: 15

**Book Description **Springer, 2013. Condition: New. This item is printed on demand for shipment within 3 working days. Seller Inventory # LP9781461353324

Published by
Springer

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
Paperback
Quantity Available: > 20

**Book Description **Springer. Paperback. Condition: New. 502 pages. Dimensions: 9.2in. x 6.1in. x 1.2in. This item ships from multiple locations. Your book may arrive from Roseburg, OR, or La Vergne, TN. Seller Inventory # 9781461353324

Published by
Springer

ISBN 10: 1461353327
ISBN 13: 9781461353324

New
PAPERBACK
Quantity Available: > 20

**Book Description **Springer. PAPERBACK. Condition: New. 1461353327 Special order direct from the distributor. Seller Inventory # ING9781461353324