Information theory is a branch of the mathematical theory of probability and mathematical statistics with wide applications in a variety of fields. This book studies the logarithmic measures of information and their application to testing statistical hypotheses. Unification of statistical procedures scattered through the literature is achieved by a consistent application of the concepts and properties of information theory. Applications are limited to the analysis of samples of fixed size.

In Chapter 1, the measures of information are introduced and defined. Chapter 2 develops the properties of the information measures and examines their relationship with Fisher's information measure and sufficiency. In Chapter 3, certain fundamental inequalities of information theory are derived, and their relation with the now classic inequality associated with the names of Fréchet, Darmois, Cramér, and Rao is examined. In Chapter 4, some limiting properties are derived following the weak law of large numbers. In Chapter 5, the asymptotic distribution theory of estimates of the information measures is examined.

The remainder of the book consists of applications. Topics include the analysis of multinomial samples and samples from Poisson populations, the analysis of contingency tables, ideas associated with multivariate normal populations, the analysis of samples from univariate normal populations under the linear hypothesis, the multivariate linear hypothesis, linear discriminant functions, and more.

The book contains numerous worked examples to help clarify the discussion and provide simple illustrations, while problems at the end of each chapter and in the text provide a means for the reader to expand and apply the theory and to anticipate and develop some of the needed background. The reader is assumed to have some familiarity with mathematical probability and mathematical statistics.
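The logarithmic measure of information studied in the book is the directed divergence (Kullback–Leibler information) between two distributions. A minimal sketch for the discrete case follows; the function name and the choice of natural logarithms are illustrative assumptions, not the book's notation:

```python
import math

def directed_divergence(p, q):
    """Directed divergence I(p:q) = sum_i p_i * log(p_i / q_i), in nats.

    p and q are sequences of probabilities over the same finite sample
    space; terms with p_i == 0 contribute nothing by convention.
    (Illustrative sketch, not the book's own code or notation.)
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

# The directed divergence is nonnegative, zero only when p == q,
# and asymmetric in its arguments.
d_pq = directed_divergence(p, q)
d_qp = directed_divergence(q, p)

# A symmetric divergence can be formed as the sum of the two
# directed divergences.
j_divergence = d_pq + d_qp
```

The asymmetry is the reason the two directed divergences are often combined into the symmetric sum shown last, which plays a role in two-sided testing problems.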