Classical and Modern Regression with Applications (English-language reprint edition)

Many volumes have been written by statisticians and scientists, with the result being that the arsenal of effective regression methods has increased manyfold. My intent for this second edition is to provide a rather substantial increase in material related to classical regression while continuing to introduce relevant new and modern techniques. I have included major supplements in simple linear regression that deal with simultaneous inference, maximum likelihood estimation of parameters, and the plotting of residuals. In multiple regression, new and substantial sections on the use of the general linear hypothesis, indicator variables, the geometry of least squares, and the relationship to ANOVA models have been added.
About the Author
No author biography is currently available for this English-language edition of Classical and Modern Regression with Applications.
Table of Contents
CHAPTER 1  INTRODUCTION: REGRESSION ANALYSIS
    Regression models
    Formal uses of regression analysis
    The data base
    References
CHAPTER 2  THE SIMPLE LINEAR REGRESSION MODEL
    The model description
    Assumptions and interpretation of model parameters
    Least squares formulation
    Maximum likelihood estimation
    Partitioning total variability
    Tests of hypotheses on slope and intercept
    Simple regression through the origin (fixed intercept)
    Quality of fitted model
    Confidence intervals on mean response and prediction intervals
    Simultaneous inference in simple linear regression
    A complete annotated computer printout
    A look at residuals
    Both x and y random
    Exercises
    References
CHAPTER 3  THE MULTIPLE LINEAR REGRESSION MODEL
    Model description and assumptions
    The general linear model and the least squares procedure
    Properties of least squares estimators under ideal conditions
    Hypothesis testing in multiple linear regression
    Confidence intervals and prediction intervals in multiple regression
    Data with repeated observations
    Simultaneous inference in multiple regression
    Multicollinearity in multiple regression data
    Quality fit, quality prediction, and the HAT matrix
    Categorical or indicator variables (regression models and ANOVA models)
    Exercises
    References
CHAPTER 4  CRITERIA FOR CHOICE OF BEST MODEL
    Standard criteria for comparing models
    Cross validation for model selection and determination of model performance
    Conceptual predictive criteria (the Cp statistic)
    Sequential variable selection procedures
    Further comments and all possible regressions
    Exercises
    References
CHAPTER 5  ANALYSIS OF RESIDUALS
    Information retrieved from residuals
    Plotting of residuals
    Studentized residuals
    Relation to standardized PRESS residuals
    Detection of outliers
    Diagnostic plots
    Normal residual plots
    Further comments on analysis of residuals
    Exercises
    References
CHAPTER 6  INFLUENCE DIAGNOSTICS
    Sources of influence
    Diagnostics: residuals and the HAT matrix
    Diagnostics that determine extent of influence
    Influence on performance
    What do we do with high influence points?
    Exercises
    References
CHAPTER 7  NONSTANDARD CONDITIONS, VIOLATIONS OF ASSUMPTIONS, AND TRANSFORMATIONS
    Heterogeneous variance: weighted least squares
    Problem with correlated errors (autocorrelation)
    Transformations to improve fit and prediction
    Regression with a binary response
    Further developments in models with a discrete response (Poisson regression)
    Generalized linear models
    Failure of normality assumption: presence of outliers
    Measurement errors in the regressor variables
    Exercises
    References
CHAPTER 8  DETECTING AND COMBATING MULTICOLLINEARITY
    Multicollinearity diagnostics
    Variance proportions
    Further topics concerning multicollinearity
    Alternatives to least squares in cases of multicollinearity
    Exercises
    References
CHAPTER 9  NONLINEAR REGRESSION
    Nonlinear least squares
    Properties of the least squares estimators
    The Gauss-Newton procedure for finding estimates
    Other modifications of the Gauss-Newton procedure
    Some special classes of nonlinear models
    Further considerations in nonlinear regression
    Why not transform data to linearize?
    Exercises
    References
APPENDIX A  SOME SPECIAL CONCEPTS IN MATRIX ALGEBRA
    Solutions to simultaneous linear equations
    Quadratic forms
    Eigenvalues and eigenvectors
    The inverse of a partitioned matrix
    Sherman-Morrison-Woodbury theorem
    References
APPENDIX B  SOME SPECIAL MANIPULATIONS
    Unbiasedness of the residual mean square
    Expected value of residual sum of squares and mean square for an underspecified model
    The maximum likelihood estimator
    Development of the PRESS statistic
    Computation of s_(-i)
    Dominance of a residual by the corresponding model error
    Computation of influence diagnostics
    Maximum likelihood estimator in the nonlinear model
    Taylor series
    Development of the Cp statistic
    References
APPENDIX C  STATISTICAL TABLES
INDEX