Author Rohde, Charles A.

Title Introductory statistical inference with the likelihood function / Charles A. Rohde
Published Cham : Springer, 2014

Description 1 online resource (xvi, 332 pages) : illustrations
Contents Note continued: 4.2. Frequentist Approach -- 4.2.1. Importance of the Long Run -- 4.2.2. Application to the Normal Distribution -- 4.3. Pivots -- 4.4. Likelihood Intervals -- 4.5. Bayesian Approach -- 4.6. Objective Bayes -- 4.7. Comparing the Intervals -- 4.8. Interval Estimation Example -- 4.9. Exercises -- 5. Hypothesis Testing -- 5.1. Law of Likelihood -- 5.2. Neyman-Pearson Theory -- 5.2.1. Introduction -- 5.2.2. Neyman-Pearson Lemma -- 5.2.3. Using the Neyman-Pearson Lemma -- 5.2.4. Uniformly Most Powerful Tests -- 5.2.5. A Complication -- 5.2.6. Comments and Examples -- 5.2.7. A Different Criterion -- 5.2.8. Inductive Behavior -- 5.2.9. Still Another Criterion -- 5.3. p-Values -- 5.4. Duality of Confidence Intervals and Tests -- 5.5. Composite Hypotheses -- 5.5.1. One-Way Analysis of Variance -- 5.6. The Multiple Testing Problem -- 5.7. Exercises -- 6. Standard Practice of Statistics -- 6.1. Introduction -- 6.2. Frequentist Statistical Procedures -- 6.2.1. Estimation
Note continued: 6.2.2. Confidence Intervals -- 6.2.3. Hypothesis Testing -- 6.2.4. Significance Tests -- 7. Maximum Likelihood: Basic Results -- 7.1. Basic Properties -- 7.2. Consistency of Maximum Likelihood -- 7.3. General Results on the Score Function -- 7.4. General Maximum Likelihood -- 7.5. Cramér-Rao Inequality -- 7.6. Summary Properties of Maximum Likelihood -- 7.7. Multiparameter Case -- 7.8. Maximum Likelihood in the Multivariate Normal -- 7.9. Multinomial -- 8. Linear Models -- 8.1. Introduction -- 8.2. Basic Results -- 8.2.1. The Fitted Values and the Residuals -- 8.3. The Basic "Regression" Model -- 8.3.1. Adding Covariates -- 8.3.2. Interpretation of Regression Coefficients -- 8.3.3. Added Sum of Squares -- 8.3.4. Identity of Regression Coefficients -- 8.3.5. Likelihood and Bayesian Results -- 8.4. Interpretation of the Coefficients -- 8.5. Factors as Covariates -- 8.6. Exercises -- 9. Other Estimation Methods -- 9.1. Estimation Using Empirical Distributions
Note continued: 9.1.1. Empirical Distribution Functions -- 9.1.2. Statistical Functionals -- 9.1.3. Linear Statistical Functionals -- 9.1.4. Quantiles -- 9.1.5. Confidence Intervals for Quantiles -- 9.2. Method of Moments -- 9.2.1. Technical Details of the Method of Moments -- 9.2.2. Application to the Normal Distribution -- 9.3. Estimating Functions -- 9.3.1. General Linear Model -- 9.3.2. Maximum Likelihood -- 9.3.3. Method of Moments -- 9.3.4. Generalized Linear Models -- 9.3.5. Quasi-Likelihood -- 9.3.6. Generalized Estimating Equations -- 9.4. Generalized Method of Moments -- 9.5. The Bootstrap -- 9.5.1. Basic Ideas -- 9.5.2. Simulation Background -- 9.5.3. Variance Estimation Using the Bootstrap -- 9.6. Confidence Intervals Using the Bootstrap -- 9.6.1. Normal Interval -- 9.6.2. Pivotal Interval -- 9.6.3. Percentile Interval -- 9.6.4. Parametric Version -- 9.6.5. Dangers of the Bootstrap -- 9.6.6. The Number of Possible Bootstrap Samples -- 10. Decision Theory -- 10.1. Introduction
Note continued: 10.1.1. Actions, Losses, and Risks -- 10.2. Admissibility -- 10.3. Bayes Risk and Bayes Rules -- 10.4. Examples of Bayes Rules -- 10.5. Stein's Result -- 10.6. Exercises -- 11. Sufficiency -- 11.1. Families of Distributions -- 11.1.1. Introduction to Sufficiency -- 11.1.2. Rationale for Sufficiency -- 11.1.3. Factorization Criterion -- 11.1.4. Sketch of Proof of the Factorization Criterion -- 11.1.5. Properties of Sufficient Statistics -- 11.1.6. Minimal Sufficient Statistics -- 11.2. Importance of Sufficient Statistics in Inference -- 11.2.1. Frequentist Statistics -- 11.2.2. Bayesian Inference -- 11.2.3. Likelihood Inference -- 11.3. Alternative Proof of Factorization Theorem -- 11.4. Exercises -- 12. Conditionality -- 12.1. Ancillarity -- 12.2. Problems with Conditioning -- 13. Statistical Principles -- 13.1. Introduction -- 13.1.1. Birnbaum's Formulation -- 13.1.2. Framework and Notation -- 13.1.3. Mathematical Equivalence -- 13.1.4. Irrelevant Noise
Note continued: 13.2. Likelihood Principle -- 13.3. Equivalence of Likelihood and Irrelevant Noise Plus Mathematical Equivalence -- 13.4. Sufficiency, Conditionality, and Likelihood Principles -- 13.5. Fundamental Result -- 13.6. Stopping Rules -- 13.6.1. Comments -- 13.6.2. Jeffreys/Lindley Paradox -- 13.6.3. Randomization -- 13.6.4. Permutation or Randomization Tests -- 14. Bayesian Inference -- 14.1. Frequentist vs Bayesian -- 14.2. The Bayesian Model for Inference -- 14.3. Why Bayesian? Exchangeability -- 14.4. Stable Estimation -- 14.5. Bayesian Consistency -- 14.6. Relation to Maximum Likelihood -- 14.7. Priors -- 14.7.1. Different Types of Priors -- 14.7.2. Vague Priors -- 15. Bayesian Statistics: Computation -- 15.1. Computation -- 15.1.1. Monte Carlo Integration -- 15.1.2. Importance Sampling -- 15.1.3. Markov Chain Monte Carlo -- 15.1.4. The Gibbs Sampler -- 15.1.5. Software -- 16. Bayesian Inference: Miscellaneous -- 16.1. Bayesian Updating -- 16.2. Bayesian Prediction
Note continued: 16.3. Stopping Rules in Bayesian Inference -- 16.4. Nuisance Parameters -- 16.5. Summing Up -- 17. Pure Likelihood Methods -- 17.1. Introduction -- 17.2. Misleading Statistical Evidence -- 17.2.1. Weak Statistical Evidence -- 17.2.2. Sample Size -- 17.3. Birnbaum's Confidence Concept -- 17.4. Combining Evidence -- 17.5. Exercises -- 18. Pure Likelihood Methods and Nuisance Parameters -- 18.1. Nuisance Parameters -- 18.1.1. Introduction -- 18.1.2. Neyman-Scott Problem -- 18.2. Elimination Methods -- 18.3. Evidence in the Presence of Nuisance Parameters -- 18.3.1. Orthogonal Parameters -- 18.3.2. Orthogonal Reparametrizations -- 18.4. Varieties of Likelihood -- 18.5. Information Loss -- 18.6. Marginal Likelihood -- 18.7. Conditional Likelihood -- 18.7.1. Estimated Likelihoods -- 18.8. Profile Likelihood -- 18.8.1. Introduction -- 18.8.2. Misleading Evidence Using Profile Likelihoods -- 18.8.3. General Linear Model Likelihood Functions -- 18.8.4. Using Profile Likelihoods
Note continued: 18.8.5. Profile Likelihoods for Unknown Variance -- 18.9. Computation of Profile Likelihoods -- 18.10. Summary -- 19. Other Inference Methods and Concepts -- 19.1. Fiducial Probability and Inference -- 19.1.1. Good's Example -- 19.1.2. Edwards' Example -- 19.2. Confidence Distributions -- 19.2.1. Bootstrap Connections -- 19.2.2. Likelihood Connections -- 19.2.3. Confidence Curves -- 19.3. P-Values Again -- 19.3.1. Sampling Distribution of P-Values -- 19.4. Severe Testing -- 19.5. Cornfield on Testing and Confidence Intervals -- 20. Finite Population Sampling -- 20.1. Introduction -- 20.2. Populations and Samples -- 20.3. Principal Types of Sampling Methods -- 20.4. Simple Random Sampling -- 20.5. Horvitz-Thompson Estimator -- 20.5.1. Basu's Elephant -- 20.5.2. An Unmentioned Assumption -- 20.6. Prediction Approach -- 20.6.1. Proof of the Prediction Result -- 20.7. Stratified Sampling -- 20.7.1. Basic Results -- 20.8. Cluster Sampling -- 20.9. Practical Considerations
Note continued: 20.9.1. Sampling Frame Problems -- 20.9.2. Nonresponse -- 20.9.3. Sampling Errors -- 20.9.4. Non-sampling Errors -- 20.10. Role of Finite Population Sampling in Modern Statistics -- 21. Appendix: Probability and Mathematical Concepts -- 21.1. Probability Models -- 21.1.1. Definitions -- 21.1.2. Properties of Probability -- 21.1.3. Continuity Properties of Probability Measures -- 21.1.4. Conditional Probability -- 21.1.5. Properties of Conditional Probability -- 21.1.6. Finite and Denumerable Sample Spaces -- 21.1.7. Independence -- 21.2. Random Variables and Probability Distributions -- 21.2.1. Measurable Functions -- 21.2.2. Random Variables: Definitions -- 21.2.3. Distribution Functions -- 21.2.4. Discrete Random Variables -- 21.2.5. Continuous Random Variables -- 21.2.6. Functions of Random Variables -- 21.3. Random Vectors -- 21.3.1. Definitions -- 21.3.2. Discrete and Continuous Random Vectors -- 21.3.3. Marginal Distributions -- 21.3.4. The Multinomial Distribution
Note continued: 21.3.5. Independence of Random Variables -- 21.3.6. Conditional Distributions -- 21.3.7. Functions of a Random Vector -- 21.3.8. The Multivariate Normal Distribution -- 21.4. Expected Values -- 21.4.1. Expected Value of a Random Variable -- 21.4.2. Distributions and Expected Values -- 21.4.3. Moments -- 21.4.4. Properties and Results on Expected Values and Moments -- 21.4.5. Other Functionals of a Distribution -- 21.4.6. Probability Generating Functions -- 21.4.7. Moment-Generating Functions -- 21.4.8. Cumulant Generating Functions and Cumulants -- 21.4.9. Expected Values of Functions of Random Vectors -- 21.4.10. Conditional Expectations and Variances -- 21.5. What Is Probability? -- 21.5.1. Frequency Interpretation -- 21.5.2. Belief Interpretations -- 21.5.3. Rational Belief Interpretation -- 21.5.4. Countable Additivity Assumption -- 21.5.5. Lindley's Wheel -- 21.5.6. Probability via Expectations -- 21.6. Final Comments -- 21.7. Introduction to Stochastic Processes
Note continued: 21.11.5. O, o Definitions and Results -- 21.11.6. Results and Special Cases -- 21.11.7. Extension to Functions -- 21.11.8. Extension to Vectors -- 21.11.9. Extension to Vector-Valued Functions
Summary This textbook covers the fundamentals of statistical inference and statistical theory, including Bayesian and frequentist approaches, without excessive emphasis on the underlying mathematics. It presents the basic principles of statistics needed to understand and evaluate methods for analyzing complex data sets. The likelihood function is used for pure likelihood inference throughout the book, which also covers severity and finite population sampling. The material was developed from an introductory statistical theory course taught by the author at the Johns Hopkins University's Department of Biostatistics. Students and instructors in public health programs will benefit from the likelihood modeling approach used throughout the text, which will also appeal to epidemiologists and psychometricians. After a brief introduction, there are chapters on estimation, hypothesis testing, and maximum likelihood modeling. The book concludes with sections on Bayesian computation and inference. An appendix contains unique coverage of the interpretation of probability, along with probability and mathematical concepts
Bibliography Includes bibliographical references and index
Notes Online resource; title from PDF title page (SpringerLink, viewed November 19, 2014)
Subject Mathematical statistics.
Form Electronic book
ISBN 9783319104614
3319104616
3319104608
9783319104607