The Mathematical Theory of Information
Best Price (Coupon Required):
Buy The Mathematical Theory of Information for $36.00 at Link.springer.com when you apply the 10% OFF coupon at checkout.
Single Product Purchase
Price Comparison
Seller | Contact Seller | List Price | On Sale | Shipping | Best Promo | Final Price | Volume Discount | Financing | Availability | Seller's Page
---|---|---|---|---|---|---|---|---|---|---
BEST PRICE: Link.springer.com | | $39.99 | $39.99 | | 10% OFF (this deal requires a coupon) | $36.00 | | See Site | In stock | Visit Store
Product Details
The general concept of information is here, for the first time, defined mathematically, by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:

1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as being symmetric, that make it unsuitable for some applications. The measure 'reliability' is found to be a universal information measure.

2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra (a sketch of this setup follows below).

3. The Law of Diminishing Information is used as an axiom to derive essential properties of information, among them Byron's law (there is more information in a lie than in gibberish) and preservation (no information is lost in a reversible channel).

The Mathematical Theory of Information supports colligation, i.e. the property of binding facts together, making 'two plus two greater than four'. Colligation is a must when the information carries knowledge, or is a basis for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
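For readers who want item 1 in familiar notation: the book's own axiom is not reproduced in this blurb, so what follows is the classical Shannon form the text alludes to, not necessarily Kåhre's exact statement. The claim that processing can never increase information corresponds to the data processing inequality, and the symmetry the text objects to is that of mutual information:

$$
X \to Y \to Z \ \text{(a Markov chain)} \;\Longrightarrow\; I(X;Z) \le I(X;Y), \qquad I(X;Y) = I(Y;X).
$$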
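Below is a minimal runnable sketch of the chapter 2 setup: discrete, finite signals with channels as row-stochastic matrices, using Shannon mutual information as the measure (the blurb names entropy as one measure conforming to the law; the book's reliability measure is not specified here, so it is not implemented). All names (`mutual_information`, `K1`, `K2`) are illustrative, not from the book.

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits, for input distribution p_x and a row-stochastic
    channel matrix with channel[i, j] = P(Y = j | X = i)."""
    joint = p_x[:, None] * channel           # joint distribution p(x, y)
    p_y = joint.sum(axis=0)                  # output marginal p(y)
    indep = p_x[:, None] * p_y[None, :]      # product of the marginals
    mask = joint > 0                         # skip zero-probability cells
    return float(np.sum(joint[mask] * np.log2(joint[mask] / indep[mask])))

p_x = np.array([0.5, 0.5])                   # a binary source

K1 = np.array([[0.9, 0.1],                   # first noisy channel, X -> Y
               [0.2, 0.8]])
K2 = np.array([[0.85, 0.15],                 # second noisy channel, Y -> Z
               [0.30, 0.70]])

# Cascading two channels is multiplication of their stochastic matrices.
K12 = K1 @ K2                                # combined channel X -> Z

print(f"I(X;Y) = {mutual_information(p_x, K1):.4f} bits")
print(f"I(X;Z) = {mutual_information(p_x, K12):.4f} bits")  # <= I(X;Y)

# A reversible channel (here, a permutation of the outputs) loses nothing.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(f"I after permutation = {mutual_information(p_x, K1 @ P):.4f} bits")
```

With these numbers, I(X;Z) should come out strictly below I(X;Y), illustrating information diminishing through a cascade, while the permutation leaves the value unchanged, matching the preservation property in item 3.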