Linear regression analysis / George A.F. Seber, Alan J. Lee.
Holman Biotech Commons QA278.2 .S4 2003
- Format: Book
- Author/Creator: Seber, G. A. F. (George Arthur Frederick), 1938-
- Series: Wiley series in probability and statistics
- Language: English
- Subjects (All): Regression analysis.
- Physical Description: xvi, 557 pages : illustrations ; 24 cm.
- Edition: Second edition.
- Place of Publication: Hoboken, N.J. : Wiley-Interscience, [2003]
- Summary: For more than two decades, the First Edition of Linear Regression Analysis has been an authoritative resource for one of the most common methods of handling statistical data. There have been many advances in the field over the last twenty years, including the development of more efficient and accurate regression computer programs, new ways of fitting regressions, and new methods of model selection and prediction. Linear Regression Analysis, Second Edition, revises and expands this standard text, providing extensive coverage of state-of-the-art theory and applications of linear regression analysis. It requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models. Concise, mathematically clear, and comprehensive, Linear Regression Analysis, Second Edition, serves as both a reliable reference for the practitioner and a valuable textbook for the student.
- Contents:
- 1 Vectors of Random Variables 1
- 1.2 Statistical Models 2
- 1.3 Linear Regression Models 4
- 1.4 Expectation and Covariance Operators 5
- 1.5 Mean and Variance of Quadratic Forms 9
- 1.6 Moment Generating Functions and Independence 13
- 2 Multivariate Normal Distribution 17
- 2.1 Density Function 17
- 2.2 Moment Generating Functions 20
- 2.3 Statistical Independence 24
- 2.4 Distribution of Quadratic Forms 27
- 3 Linear Regression: Estimation and Distribution Theory 35
- 3.1 Least Squares Estimation 35
- 3.2 Properties of Least Squares Estimates 42
- 3.3 Unbiased Estimation of σ² 44
- 3.4 Distribution Theory 47
- 3.5 Maximum Likelihood Estimation 49
- 3.6 Orthogonal Columns in the Regression Matrix 51
- 3.7 Introducing Further Explanatory Variables 54
- 3.7.1 General Theory 54
- 3.7.2 One Extra Variable 57
- 3.8 Estimation with Linear Restrictions 59
- 3.8.1 Method of Lagrange Multipliers 60
- 3.8.2 Method of Orthogonal Projections 61
- 3.9 Design Matrix of Less Than Full Rank 62
- 3.9.1 Least Squares Estimation 62
- 3.9.2 Estimable Functions 64
- 3.9.3 Introducing Further Explanatory Variables 65
- 3.9.4 Introducing Linear Restrictions 65
- 3.10 Generalized Least Squares 66
- 3.11 Centering and Scaling the Explanatory Variables 69
- 3.11.1 Centering 70
- 3.11.2 Scaling 71
- 3.12 Bayesian Estimation 73
- 3.13 Robust Regression 77
- 3.13.1 M-Estimates 78
- 3.13.2 Estimates Based on Robust Location and Scale Measures 80
- 3.13.3 Measuring Robustness 82
- 3.13.4 Other Robust Estimates 88
- 4 Hypothesis Testing 97
- 4.2 Likelihood Ratio Test 98
- 4.3 F-Test 99
- 4.3.1 Motivation 99
- 4.3.2 Derivation 99
- 4.3.3 Some Examples 103
- 4.3.4 The Straight Line 107
- 4.4 Multiple Correlation Coefficient 110
- 4.5 Canonical Form for H 113
- 4.6 Goodness-of-Fit Test 115
- 4.7 F-Test and Projection Matrices 116
- 5 Confidence Intervals and Regions 119
- 5.1 Simultaneous Interval Estimation 119
- 5.1.1 Simultaneous Inferences 119
- 5.1.2 Comparison of Methods 124
- 5.1.3 Confidence Regions 125
- 5.1.4 Hypothesis Testing and Confidence Intervals 127
- 5.2 Confidence Bands for the Regression Surface 129
- 5.2.1 Confidence Intervals 129
- 5.2.2 Confidence Bands 129
- 5.3 Prediction Intervals and Bands for the Response 131
- 5.3.1 Prediction Intervals 131
- 5.3.2 Simultaneous Prediction Bands 133
- 5.4 Enlarging the Regression Matrix 135
- 6 Straight-Line Regression 139
- 6.1 The Straight Line 139
- 6.1.1 Confidence Intervals for the Slope and Intercept 139
- 6.1.2 Confidence Interval for the x-Intercept 140
- 6.1.3 Prediction Intervals and Bands 141
- 6.1.4 Prediction Intervals for the Response 145
- 6.1.5 Inverse Prediction (Calibration) 145
- 6.2 Straight Line through the Origin 149
- 6.3 Weighted Least Squares for the Straight Line 150
- 6.3.1 Known Weights 150
- 6.3.2 Unknown Weights 151
- 6.4 Comparing Straight Lines 154
- 6.4.1 General Model 154
- 6.4.2 Use of Dummy Explanatory Variables 156
- 6.5 Two-Phase Linear Regression 159
- 6.6 Local Linear Regression 162
- 7 Polynomial Regression 165
- 7.1 Polynomials in One Variable 165
- 7.1.1 Problem of Ill-Conditioning 165
- 7.1.2 Using Orthogonal Polynomials 166
- 7.1.3 Controlled Calibration 172
- 7.2 Piecewise Polynomial Fitting 172
- 7.2.1 Unsatisfactory Fit 172
- 7.2.2 Spline Functions 173
- 7.2.3 Smoothing Splines 176
- 7.3 Polynomial Regression in Several Variables 180
- 7.3.1 Response Surfaces 180
- 7.3.2 Multidimensional Smoothing 184
- 8 Analysis of Variance 187
- 8.2 One-Way Classification 188
- 8.2.1 General Theory 188
- 8.2.2 Confidence Intervals 192
- 8.2.3 Underlying Assumptions 195
- 8.3 Two-Way Classification (Unbalanced) 197
- 8.3.1 Representation as a Regression Model 197
- 8.3.2 Hypothesis Testing 197
- 8.3.3 Procedures for Testing the Hypotheses 201
- 8.3.4 Confidence Intervals 204
- 8.4 Two-Way Classification (Balanced) 206
- 8.5 Two-Way Classification (One Observation per Mean) 211
- 8.5.1 Underlying Assumptions 212
- 8.6 Higher-Way Classifications with Equal Numbers per Mean 216
- 8.6.1 Definition of Interactions 216
- 8.6.2 Hypothesis Testing 217
- 8.6.3 Missing Observations 220
- 8.7 Designs with Simple Block Structure 221
- 8.8 Analysis of Covariance 222
- 9 Departures from Underlying Assumptions 227
- 9.2 Bias 228
- 9.2.1 Bias Due to Underfitting 228
- 9.2.2 Bias Due to Overfitting 230
- 9.3 Incorrect Variance Matrix 231
- 9.4 Effect of Outliers 233
- 9.5 Robustness of the F-Test to Nonnormality 235
- 9.5.1 Effect of the Regressor Variables 235
- 9.5.2 Quadratically Balanced F-Tests 236
- 9.6 Effect of Random Explanatory Variables 240
- 9.6.1 Random Explanatory Variables Measured without Error 240
- 9.6.2 Fixed Explanatory Variables Measured with Error 241
- 9.6.3 Round-off Errors 245
- 9.6.4 Some Working Rules 245
- 9.6.5 Random Explanatory Variables Measured with Error 246
- 9.6.6 Controlled Variables Model 248
- 9.7 Collinearity 249
- 9.7.1 Effect on the Variances of the Estimated Coefficients 249
- 9.7.2 Variance Inflation Factors 254
- 9.7.3 Variances and Eigenvalues 255
- 9.7.4 Perturbation Theory 255
- 9.7.5 Collinearity and Prediction 261
- 10 Departures from Assumptions: Diagnosis and Remedies 265
- 10.2 Residuals and Hat Matrix Diagonals 266
- 10.3 Dealing with Curvature 271
- 10.3.1 Visualizing Regression Surfaces 271
- 10.3.2 Transforming to Remove Curvature 275
- 10.3.3 Adding and Deleting Variables 277
- 10.4 Nonconstant Variance and Serial Correlation 281
- 10.4.1 Detecting Nonconstant Variance 281
- 10.4.2 Estimating Variance Functions 288
- 10.4.3 Transforming to Equalize Variances 291
- 10.4.4 Serial Correlation and the Durbin-Watson Test 292
- 10.5 Departures from Normality 295
- 10.5.1 Normal Plotting 295
- 10.5.2 Transforming the Response 297
- 10.5.3 Transforming Both Sides 299
- 10.6 Detecting and Dealing with Outliers 301
- 10.6.1 Types of Outliers 301
- 10.6.2 Identifying High-Leverage Points 304
- 10.6.3 Leave-One-Out Case Diagnostics 306
- 10.6.4 Test for Outliers 310
- 10.6.5 Other Methods 311
- 10.7 Diagnosing Collinearity 315
- 10.7.1 Drawbacks of Centering 316
- 10.7.2 Detection of Points Influencing Collinearity 319
- 10.7.3 Remedies for Collinearity 320
- 11 Computational Algorithms for Fitting a Regression 329
- 11.1.1 Basic Methods 329
- 11.2 Direct Solution of the Normal Equations 330
- 11.2.1 Calculation of the Matrix X'X 330
- 11.2.2 Solving the Normal Equations 331
- 11.3 QR Decomposition 338
- 11.3.1 Calculation of Regression Quantities 340
- 11.3.2 Algorithms for the QR and WU Decompositions 341
- 11.4 Singular Value Decomposition 353
- 11.4.1 Regression Calculations Using the SVD 353
- 11.4.2 Computing the SVD 354
- 11.5 Weighted Least Squares 355
- 11.6 Adding and Deleting Cases and Variables 356
- 11.6.1 Updating Formulas 356
- 11.6.2 Connection with the Sweep Operator 357
- 11.6.3 Adding and Deleting Cases and Variables Using QR 360
- 11.7 Centering the Data 363
- 11.8 Comparing Methods 365
- 11.8.2 Efficiency 366
- 11.8.3 Accuracy 369
- 11.9 Rank-Deficient Case 376
- 11.9.1 Modifying the QR Decomposition 376
- 11.9.2 Solving the Least Squares Problem 378
- 11.9.3 Calculating Rank in the Presence of Round-off Error 378
- 11.9.4 Using the Singular Value Decomposition 379
- 11.10 Computing the Hat Matrix Diagonals 379
- 11.10.1 Using the Cholesky Factorization 380
- 11.10.2 Using the Thin QR Decomposition 380
- 11.11 Calculating Test Statistics 380
- 11.12 Robust Regression Calculations 382
- 11.12.1 Algorithms for L₁ Regression 382
- 11.12.2 Algorithms for M- and GM-Estimation 384
- 11.12.3 Elemental Regressions 385
- 11.12.4 Algorithms for High-Breakdown Methods 385
- 12 Prediction and Model Selection 391
- 12.2 Why Select? 393
- 12.3 Choosing the Best Subset 399
- 12.3.1 Goodness-of-Fit Criteria 400
- 12.3.2 Criteria Based on Prediction Error 401
- 12.3.3 Estimating Distributional Discrepancies 407
- 12.3.4 Approximating Posterior Probabilities 410
- 12.4 Stepwise Methods 413
- 12.4.1 Forward Selection 414
- 12.4.2 Backward Elimination 416
- 12.4.3 Stepwise Regression 418
- 12.5 Shrinkage Methods 420
- 12.5.1 Stein Shrinkage 420
- 12.5.2 Ridge Regression 423
- 12.5.3 Garrote and Lasso Estimates 425
- 12.6 Bayesian Methods 428
- 12.6.1 Predictive Densities 428
- 12.6.2 Bayesian Prediction 431
- 12.6.3 Bayesian Model Averaging 433
- 12.7 Effect of Model Selection on Inference 434
- 12.7.1 Conditional and Unconditional Distributions 434
- 12.7.2 Bias 436
- 12.7.3 Conditional Means and Variances 437
- 12.7.4 Estimating Coefficients Using Conditional Likelihood 437
- 12.7.5 Other Effects of Model Selection 438
- 12.8 Computational Considerations 439
- 12.8.1 Methods for All Possible Subsets 439
- 12.8.2 Generating the Best Regressions 442
- 12.8.3 All Possible Regressions Using QR Decompositions 446
- 12.9 Comparison of Methods 447
- 12.9.1 Identifying the Correct Subset 447
- 12.9.2 Using Prediction Error as a Criterion 448
- Appendix A Some Matrix Algebra 457
- A.1 Trace and Eigenvalues 457
- A.2 Rank 458
- A.3 Positive-Semidefinite Matrices 460
- A.4 Positive-Definite Matrices 461
- A.5 Permutation Matrices 464
- A.6 Idempotent Matrices 464
- A.7 Eigenvalue Applications 465
- A.8 Vector Differentiation 466
- A.9 Patterned Matrices 466
- A.10 Generalized Inverse 469
- A.11 Some Useful Results 471
- A.12 Singular Value Decomposition 471
- A.13 Some Miscellaneous Statistical Results 472
- A.14 Fisher Scoring 473
- Appendix B Orthogonal Projections 475
- B.1 Orthogonal Decomposition of Vectors 475
- B.2 Orthogonal Complements 477
- B.3 Projections on Subspaces 477
- C.1 Percentage Points of the Bonferroni t-Statistic 480
- C.2 Distribution of the Largest Absolute Value of k Student t Variables 482
- C.3 Working-Hotelling Confidence Bands for Finite Intervals 489
- Notes: Includes bibliographical references (pages 531-548) and index.
- Local Notes: Acquired for the Penn Libraries with assistance from the Albert E. Visk, W'28, Memorial Book Fund.
- ISBN: 0471415405
- OCLC: 52381258
- Publisher Number: 99942973679