Intro -- Prefaces -- Contents -- 1 Nonparametric Regression -- 1.1 Basic Notation -- 1.2 Linear Approximations -- 1.3 Simple Nonparametric Regression -- 1.4 Estimation -- 1.4.1 Polynomials -- 1.4.2 Cosines -- 1.4.3 Haar Wavelets -- 1.4.4 Cubic Splines -- 1.4.5 Orthonormal Series Estimation -- 1.5 Variable Selection -- 1.6 Heteroscedastic Simple Nonparametric Regression -- 1.7 Approximating-Functions with Small Support -- 1.7.1 Polynomial Splines -- 1.7.1.1 B-Splines -- 1.7.1.2 Equivalence of Spline Methods* -- 1.7.2 Fitting Local Functions -- 1.7.3 Local Regression
2.3 Lasso Regression -- 2.4 Bayesian Connections -- 2.5 Another Approach -- 2.5.1 Geometry -- 2.5.1.1 More Lasso Geometry -- 2.5.2 Equivalence of Approaches -- 2.6 Two Other Penalty Functions -- References -- 3 Reproducing Kernel Hilbert Spaces -- 3.1 Introduction -- 3.1.1 Interpolating Splines -- 3.2 Banach and Hilbert Spaces -- 3.2.1 Banach Spaces -- 3.2.2 Hilbert Spaces -- 3.3 Reproducing Kernel Hilbert Spaces -- 3.3.1 The Projection Principle for an RKHS -- 3.4 Two Approaches -- 3.4.1 Testing Lack of Fit -- 3.5 Penalized Regression with RKHSs -- 3.5.1 Ridge and Lasso Regression
4.4 Linear Covariance Structures -- 4.5 MINQUE -- 4.5.1 Deriving the MINQUE Equations -- 4.6 MIVQUE -- 4.7 The Effect of Estimated Covariances -- 4.7.1 Mathematical Results* -- References -- 5 Mixed Models and Variance Components -- 5.1 Mixed Models -- 5.2 Mixed Model Equations -- 5.3 Equivalence of Random Effects and Ridge Regression -- 5.4 Partitioning and Linear Covariance Structures -- 5.5 Variance Component Models -- 5.5.1 Variance Component Estimation -- 5.6 A Longitudinal Model -- 5.7 Henderson's Method 3 -- 5.7.1 Additional Estimates -- 5.8 Exact F Tests for Variance Components
SUMMARY OR ABSTRACT
Text of Note
Now in its third edition, this companion volume to Ronald Christensen's Plane Answers to Complex Questions uses three fundamental concepts from standard linear model theory -- best linear prediction, projections, and Mahalanobis distance -- to extend standard linear modeling into the realms of Statistical Learning and Dependent Data. This new edition features a wealth of new and revised content. In Statistical Learning it delves into nonparametric regression, penalized estimation (regularization), reproducing kernel Hilbert spaces, the kernel trick, and support vector machines. For Dependent Data it uses linear model theory to examine general linear models, linear mixed models, time series, spatial data, (generalized) multivariate linear models, discrimination, and dimension reduction. While numerous references to Plane Answers are made throughout the volume, Advanced Linear Modeling can be used on its own given a solid background in linear models. Accompanying R code for the analyses is available online.
ACQUISITION INFORMATION NOTE
Source for Acquisition/Subscription Address
Springer Nature
Stock Number
com.springer.onix.9783030291648
OTHER EDITION IN ANOTHER MEDIUM
Title
Advanced Linear Modeling : Statistical Learning and Dependent Data.