YEO description:
On YEO you can find Mathematical Foundations of Information Theory by Alexander I. Khinchin, in the Science category.
Whatever your needs, Mathematical Foundations of Information Theory by Alexander I. Khinchin, from the Science category, offers a solid balance of quality and price, with practical, modern advantages.
Price: 61.1 Lei
Product characteristics for Mathematical Foundations of Information Theory
- Brand: Alexander I. Khinchin
- Category: Science
- Store: libris.ro
- Last updated: 28-10-2025 01:22:05
Order Mathematical Foundations of Information Theory Online, Simply and Quickly
Through the YEO platform, you can order Mathematical Foundations of Information Theory from libris.ro quickly and securely. Enjoy an optimized online shopping experience and discover the best offers, updated constantly.
Store description:
The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.

In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite scheme, and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts to give a complete, detailed proof of both ... Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory.

Partial Contents: I. The Entropy Concept in Probability Theory -- Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory -- Two generalizations of Shannon's inequality. Three inequalities of Feinstein. Concept of a source. Stationarity. Entropy. Ergodic sources. The E property. The martingale concept. Noise. Anticipation and memory. Connection of the channel to the source. Feinstein's Fundamental Lemma. Coding. The first Shannon theorem. The second Shannon theorem.