Search Results for topics-in-statistical-information-theory

IEEE Transactions on Information Theory, IT-13, 126–127. ... "Minimum Discrimination Information Estimation and Application: Invited Paper Presented to Sixteenth Conference on the Design of Experiments in ... Handbook of Statistics, Vol.

Author: Solomon Kullback

Publisher: Springer Science & Business Media

ISBN: 9781461580805

Category: Mathematics

Page: 159

The relevance of information theory to statistical theory and its applications to stochastic processes is a unifying influence in these Topics. The integral representation of discrimination information is presented in these Topics, reviewing various approaches used in the literature, and is also developed herein using intrinsically information-theoretic methods. Log likelihood ratios associated with various stochastic processes are computed by an application of minimum discrimination information estimates. Linear discriminant functionals are used in the information-theoretic analysis of a variety of stochastic processes. Sections are numbered serially within each chapter, with a decimal notation for subsections. Equations, examples, theorems and lemmas are numbered serially within each section with a decimal notation. The digits to the left of the decimal point represent the section and the digits to the right of the decimal point the serial number within the section. When reference is made to a section, equation, example, theorem or lemma within the same chapter, only the section number or equation number, etc., is given. When the reference is to a section, equation, etc., in a different chapter, then in addition to the section or equation number, the chapter number is also given. References to the bibliography are by the author's name followed by the year of publication in parentheses. The transpose of a matrix is denoted by a prime; thus one-row matrices are denoted by primes as the transposes of one-column matrices (vectors).
2013-12-01 By Solomon Kullback
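For orientation, "discrimination information" in Kullback's terminology is what is now commonly called the Kullback-Leibler divergence; its integral representation, stated here from general knowledge rather than quoted from the book, is

$$ I(1{:}2) = \int f_1(x)\,\log\frac{f_1(x)}{f_2(x)}\,d\lambda(x), $$

where $f_1$ and $f_2$ are the densities of the two hypotheses with respect to a common dominating measure $\lambda$. The minimum discrimination information estimates mentioned in the blurb minimize this quantity subject to given constraints.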

The relevance of information theory to statistical theory and its applications to stochastic processes is a unifying influence in these Topics.

Author: Solomon Kullback

Publisher: Springer

ISBN: 0387965122

Category: Mathematics

Page: 159

1987-07-28 By Solomon Kullback

Statistical Theory of Signal Detection. 2nd ed. ... Papers on Probability, Statistics and Statistical Physics. Dordrecht: Reidel, 1982. F. Jelinek. ... Topics in Statistical Information Theory. Berlin: Springer, 1987.

Author: Mark Kelbert

Publisher: Cambridge University Press

ISBN: 9780521769358

Category: Computers

Page: 526

A valuable teaching aid. Provides relevant background material, many examples and clear solutions to problems taken from real exam papers.
2013-09-12 By Mark Kelbert

G. A. Jones and J. M. Jones, Information and Coding Theory, Springer, London, 2000. Y. Kakihara, Abstract Methods ... S. Kullback, Topics in Statistical Information Theory, Springer-Verlag, Berlin, 1987. S. Kullback and R. A. Leibler, ...

Author: Raymond W. Yeung

Publisher: Springer Science & Business Media

ISBN: 9780387792347

Category: Computers

Page: 580

This book is an evolution from my book A First Course in Information Theory published in 2002 when network coding was still in its infancy. The last few years have witnessed the rapid development of network coding into a research field of its own in information science. With its root in information theory, network coding has not only brought about a paradigm shift in network communications at large, but also had significant influence on such specific research fields as coding theory, networking, switching, wireless communications, distributed data storage, cryptography, and optimization theory. While new applications of network coding keep emerging, the fundamental results that lay the foundation of the subject are more or less mature. One of the main goals of this book therefore is to present these results in a unifying and coherent manner. While the previous book focused only on information theory for discrete random variables, the current book contains two new chapters on information theory for continuous random variables, namely the chapter on differential entropy and the chapter on continuous-valued channels. With these topics included, the book becomes more comprehensive and is more suitable to be used as a textbook for a course in an electrical engineering department.
2008-08-28 By Raymond W. Yeung

S. Kullback, Information Theory and Statistics, Wiley, New York, 1959. S. Kullback, Topics in Statistical Information Theory, Springer-Verlag, Berlin, 1987. S. Kullback and R. A. Leibler, “On information and sufficiency,” Ann. Math.

Author: Raymond W. Yeung

Publisher: Springer Science & Business Media

ISBN: 9781441986085

Category: Technology & Engineering

Page: 412

This book provides an up-to-date introduction to information theory. In addition to the classical topics discussed, it provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon and non-Shannon type information inequalities, and a relation between entropy and group theory. ITIP, a software package for proving information inequalities, is also included. With a large number of examples, illustrations, and original problems, this book is excellent as a textbook or reference book for a senior or graduate level course on the subject, as well as a reference for researchers in related fields.
2012-12-06 By Raymond W. Yeung

Inf. Theory, IT-13:126–127, 1967. [337] S. Kullback, J. C. Keegel, and J. H. Kullback. Topics in Statistical Information Theory. Springer-Verlag, Berlin, 1987. [338] S. Kullback and M. A. Khairat. A note on minimum discrimination ...

Author: Thomas M. Cover

Publisher: John Wiley & Sons

ISBN: 9781118585771

Category: Computers

Page: 776

The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
2012-11-28 By Thomas M. Cover

$D_{\mathrm{KL}}(p \,\|\, q) = \int p \log (p/q)$ (2). In statistics, the minimization of the KLD measure produces the most likely approximation as given by the ... in turn, has a direct equivalence to the (Shannon) entropy maximization criterion in information theory.

Author: Leandro Pardo

Publisher: MDPI

ISBN: 9783038979364

Category: Social Science

Page: 344

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics, based on maximum likelihood estimators, as well as Wald’s statistics, likelihood ratio statistics and Rao’s score statistics, share several optimum asymptotic properties, but are highly non-robust in cases of model misspecification in the presence of outlying observations. It is well known that a small deviation from the underlying assumptions on the model can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
2019-05-20 By Leandro Pardo
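The equivalence asserted in the snippet that opens this entry (equation (2)) can be checked numerically: on a finite alphabet of size n, D_KL(p ‖ u) = log n − H(p) for the uniform reference u, so minimizing the divergence from u is the same as maximizing the Shannon entropy. A minimal sketch, with illustrative values not drawn from the book:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D(p || q) in nats; assumes q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by the usual convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def entropy(p):
    """Shannon entropy H(p) in nats."""
    p = np.asarray(p, dtype=float)
    mask = p > 0
    return float(-np.sum(p[mask] * np.log(p[mask])))

p = np.array([0.5, 0.25, 0.125, 0.125])  # illustrative distribution
u = np.full(4, 0.25)                     # uniform reference on 4 symbols

# D(p || u) = log(n) - H(p): minimizing the divergence from the uniform
# distribution is the same as maximizing the entropy of p.
assert abs(kl_divergence(p, u) - (np.log(4) - entropy(p))) < 1e-12
print(kl_divergence(p, u))  # ≈ 0.1733 nats
```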

Information Theory and Statistics: A Tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting.

Author: Imre Csiszár

Publisher: Now Publishers Inc

ISBN: 1933019050

Category: Computers

Page: 115

Information Theory and Statistics: A Tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an "information geometry" background. Also, an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory. The tutorial does not assume the reader has an in-depth knowledge of Information Theory or statistics. As such, Information Theory and Statistics: A Tutorial is an excellent introductory text to this highly important topic in mathematics, computer science and electrical engineering. It provides both students and researchers with an invaluable resource to quickly get up to speed in the field.
2004 By Imre Csiszár
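Among the "iterative algorithms with an 'information geometry' background" that the description mentions, a classical example (named here from general knowledge, not quoted from the tutorial) is iterative proportional fitting for contingency tables, which alternates KL-divergence-minimizing projections onto the row and column marginal constraints. A minimal sketch, with illustrative data:

```python
import numpy as np

def ipf(table, row_targets, col_targets, sweeps=200):
    """Iterative proportional fitting: rescale a positive table so its row and
    column sums match the target marginals. Each half-step is an I-projection,
    i.e. a minimum-KL-divergence adjustment onto one marginal constraint."""
    x = np.asarray(table, dtype=float).copy()
    for _ in range(sweeps):
        x *= (row_targets / x.sum(axis=1))[:, np.newaxis]  # match row sums
        x *= col_targets / x.sum(axis=0)                   # match column sums
    return x

seed = np.array([[1.0, 2.0],
                 [3.0, 4.0]])
fitted = ipf(seed, row_targets=np.array([0.4, 0.6]),
             col_targets=np.array([0.5, 0.5]))
print(fitted.sum(axis=1), fitted.sum(axis=0))  # both ≈ the requested marginals
```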

Vol. 27: A. Janssen, H. Milbrodt, H. Strasser, Infinitely Divisible Statistical Experiments. VI, 163 pages. 1985. Vol. 28: ... Vol. 42: S. Kullback, J.C. Keegel, J.H. Kullback, Topics in Statistical Information Theory. IX, 158 pages.

Author: D. Basu

Publisher: Springer Science & Business Media

ISBN: 9781461238942

Category: Mathematics

Page: 365

It is an honor to be asked to write a foreword to this book, for I believe that it and other books to follow will eventually lead to a dramatic change in the current statistics curriculum in our universities. I spent the 1975-76 academic year at Florida State University in Tallahassee. My purpose was to complete a book on Statistical Reliability Theory with Frank Proschan. At the time, I was working on total time on test processes. At the same time, I started attending lectures by Dev Basu on statistical inference. It was Lehmann's hypothesis testing course and Lehmann's book was the text. However, I noticed something strange - Basu never opened the book. He was obviously not following it. Instead, he was giving a very elegant, measure theoretic treatment of the concepts of sufficiency, ancillarity, and invariance. He was interested in the concept of information: what it meant and how it fitted in with contemporary statistics. As he looked at the fundamental ideas, the logic behind their use seemed to evaporate. I was shocked. I didn't like priors. I didn't like Bayesian statistics. But after the smoke had cleared, that was all that was left. Basu loves counterexamples. He is like an art critic in the field of statistical inference. He would find a counterexample to the Bayesian approach if he could. So far, he has failed in this respect.
2012-12-06 By D. Basu

Information theory is based on mathematics, especially on probability theory and mathematical statistics. Information transmission is, more or ... mathematical statistics is presented. Links between information theory and topics ...

Author: Shunsuke Ihara

Publisher: World Scientific

ISBN: 9810209851

Category: Computers

Page: 308

This book provides a systematic mathematical analysis of entropy and stochastic processes, especially Gaussian processes, and its applications to information theory. The contents fall roughly into two parts. In the first part a unified treatment of entropy in information theory, probability theory and mathematical statistics is presented. The second part deals mostly with information theory for continuous communication systems. Particular emphasis is placed on the Gaussian channel. An advantage of this book is that, unlike most books on information theory, it places emphasis on continuous communication systems, rather than discrete ones.
1993 By Shunsuke Ihara