Methods and Applications of Linear Models: Regression and the Analysis of Variance, Third Edition
"An essential desktop reference book . . . it should definitely be on your bookshelf."
—Technometrics
Thoroughly updated, Methods and Applications of Linear Models: Regression and the Analysis of Variance, Third Edition features innovative approaches to understanding and working with models and the theory of linear regression. The Third Edition presents the necessary theoretical concepts using intuitive ideas rather than complicated proofs, describing the inference appropriate for the methods discussed.
The book offers a unique discussion that combines coverage of the mathematical theory of linear models with analysis of variance models, giving readers a comprehensive understanding of both the theoretical and technical aspects of linear models. With a new focus on fixed-effects models, Methods and Applications of Linear Models: Regression and the Analysis of Variance, Third Edition also features:
- Newly added topics including least squares, the cell means model, and graphical inspection of data in the AVE method
- Frequent conceptual and numerical examples for clarifying the statistical analyses and demonstrating potential pitfalls
- Graphics and computations developed using JMP® software to accompany the concepts and techniques presented
- Numerous exercises presented to test readers and deepen their understanding of the material
An ideal book for courses on linear models and linear regression at the undergraduate and graduate levels, the Third Edition of Methods and Applications of Linear Models: Regression and the Analysis of Variance is also a valuable reference for applied statisticians and researchers who utilize linear model methodology.
RONALD R. HOCKING, PhD, is Professor Emeritus in the Department of Statistics and Founder of the Ronald R. Hocking Lecture Series at Texas A&M University. A Fellow of the American Statistical Association, Dr. Hocking is the recipient of numerous honors in the statistical community including the Shewell Award, the Youden Award, the Wilcoxon Award, the Snedecor Award, and the Owen Award.
Preface to the Third Edition xvii
Preface to the Second Edition xix
Preface to the First Edition xxi
PART I REGRESSION 1
1 Introduction to Linear Models 3
1.1 Background Information, 3
1.2 Mathematical and Statistical Models, 5
1.3 Definition of the Linear Model, 8
1.4 Examples of Regression Models, 13
1.5 Concluding Comments, 21
Exercises, 21
2 Regression on Functions of One Variable 23
2.1 The Simple Linear Regression Model, 23
2.2 Parameter Estimation, 25
2.3 Properties of the Estimators and Test Statistics, 34
2.4 The Analysis of Simple Linear Regression Models, 39
2.5 Examining the Data and the Model, 50
2.6 Polynomial Regression Models, 63
Exercises, 72
3 Transforming the Data 81
3.1 The Need for Transformations, 81
3.2 Weighted Least Squares, 82
3.3 Variance Stabilizing Transformations, 85
3.4 Transformations to Achieve a Linear Model, 86
3.5 Analysis of the Transformed Model, 92
Exercises, 95
4 Regression on Functions of Several Variables 99
4.1 The Multiple Linear Regression Model, 99
4.2 Preliminary Data Analysis, 100
4.3 Analysis of the Multiple Linear Regression Model, 103
4.4 Partial Correlation and Added-Variable Plots, 113
4.5 Variable Selection, 119
4.6 Model Specification, 130
Exercises, 137
5 Collinearity in Multiple Linear Regression 142
5.1 The Collinearity Problem, 142
5.2 An Example with Collinearity, 150
5.3 Collinearity Diagnostics, 156
5.4 Remedial Solutions: Biased Estimators, 166
Exercises, 178
6 Influential Observations in Multiple Linear Regression 182
6.1 The Influential Data Problem, 182
6.2 The Hat Matrix, 183
6.3 The Effects of Deleting Observations, 188
6.4 Numerical Measures of Influence, 192
6.5 The Dilemma Data, 197
6.6 Plots for Identifying Unusual Cases, 201
6.7 Robust/Resistant Methods in Regression Analysis, 209
Exercises, 213
7 Polynomial Models and Qualitative Predictors 216
7.1 Polynomial Models, 216
7.2 The Analysis of Response Surfaces, 220
7.3 Models with Qualitative Predictors, 225
Exercises, 247
8 Additional Topics 254
8.1 Nonlinear Regression Models, 254
8.2 Nonparametric Model-Fitting Methods, 260
8.3 Generalized Linear Models, 265
8.4 Random Input Variables, 274
8.5 Errors in the Inputs, 276
8.6 Calibration, 277
Exercises, 278
PART II THE ANALYSIS OF VARIANCE 283
9 Classification Models I: Introduction 285
9.1 Background Information, 285
9.2 The One-Way Classification Model, 286
9.3 The Two-Way Classification Model: Balanced Data, 304
9.4 The Two-Way Classification Model: Unbalanced Data, 322
9.5 The Two-Way Classification Model: No Interaction, 334
9.6 Concluding Comments, 347
Exercises, 347
10 The Mathematical Theory of Linear Models 359
10.1 The Distribution of Linear and Quadratic Forms, 359
10.2 Estimation and Inference for Linear Models, 368
10.3 Tests of Linear Hypotheses on β, 380
10.4 Confidence Regions and Intervals, 392
Exercises, 395
11 Classification Models II: Multiple Crossed and Nested Factors 405
11.1 The Three-Factor Cross-Classified Model, 406
11.2 A General Structure for Balanced, Factorial Models, 412
11.3 The Twofold Nested Model, 417
11.4 A General Structure for Balanced, Nested Models, 426
11.5 A Three-Factor, Nested-Factorial Model, 429
11.6 A General Structure for Balanced, Nested-Factorial Models, 434
Exercises, 438
12 Mixed Models I: The AOV Method with Balanced Data 443
12.1 Introduction, 443
12.2 Examples of the Analysis of Mixed Models, 444
12.3 The General Analysis for Balanced, Mixed Models, 464
12.4 Additional Examples, 479
12.5 Alternative Developments of Mixed Models, 487
Exercises, 493
13 Mixed Models II: The AVE Method with Balanced Data 499
13.1 Introduction, 499
13.2 The Two-Way Cross-Classification Model, 500
13.3 The Three-Factor, Cross-Classification Model, 511
13.4 Nested Models, 515
13.5 Nested-Factorial Models, 518
13.6 A General Description of the AVE Table, 524
13.7 Additional Examples, 531
13.8 The Computational Procedure for the AVE Method, 537
Exercises, 537
14 Mixed Models III: Unbalanced Data 543
14.1 Introduction, 543
14.2 Parameter Estimation: Likelihood Methods, 545
14.3 ML and REML Estimates with Balanced Data, 554
14.4 The EM Algorithm for REML Estimation, 558
14.5 Diagnostic Analysis with the EM Algorithm, 572
14.6 Models with Covariates, 581
14.7 Summary, 585
Exercises, 585
15 Simultaneous Inference: Tests and Confidence Intervals 591
15.1 Simultaneous Tests, 591
15.2 Simultaneous Confidence Intervals, 610
Exercises, 612
Appendix A Mathematics 615
A.I Matrix Algebra, 615
A.I.1 Notation, 615
A.I.2 The Rank of a Matrix, 616
A.I.3 The Trace of a Matrix, 617
A.I.4 Eigenvalues and Eigenvectors, 617
A.I.5 Quadratic Forms and Definite Matrices, 618
A.I.6 Special Matrices, 619
A.I.7 The Diagonalization of Matrices, 620
A.I.8 Kronecker Products of Matrices, 620
A.I.9 Factorization of Matrices, 621
A.I.10 Matrix Inversion, 622
A.I.11 The Solution of Linear Equations, 624
A.I.12 Generalized Inverses, 627
A.I.13 Cauchy–Schwarz Inequalities, 630
A.II Optimization, 630
A.II.1 The Differentiation of Matrices and Determinants, 630
A.II.2 The Differentiation of a Function with Respect to a Vector, 631
A.II.3 The Optimization of a Function, 632
Appendix B Statistics 634
B.I Distributions, 634
B.I.1 The Normal Distribution, 634
B.I.2 The χ²-Distribution, 637
B.I.3 The t-Distribution, 638
B.I.4 The F-Distribution, 639
B.II The Distribution of Quadratic Forms, 639
B.III Estimation, 642
B.III.1 Maximum Likelihood Estimation, 642
B.III.2 Constrained Maximum Likelihood Estimation, 642
B.III.3 Complete, Sufficient Statistics, 643
B.IV Tests of Hypotheses and Confidence Regions, 643
B.IV.1 Tests of Hypotheses, 643
B.IV.2 Confidence Intervals and Regions, 644
Appendix C Data Tables 645
C.I Downloading Data Files from FTP Server, 645
C.II Listing of Data Set Files, 645
Appendix D Statistical Tables 660
References 669
Index 677