Syllabus
Module 1. Classical Linear Regression
Ordinary Least Squares (OLS) Estimation
BLUE, Gauss-Markov Theorem
Confidence and Prediction Intervals
Confidence Interval for a single Linear Parametric Function (LPF)
Confidence Regions for multiple LPFs
Simultaneous Confidence Intervals for multiple LPFs
Prediction Interval
Hypothesis Testing
Testing for the significance of a single LPF
Testing for the significance of individual predictors: t-test
ANOVA Table and testing hypotheses involving several LPFs
Testing for the significance of the entire model/testing for lack of fit: F-test
Categorical Predictors, Interaction models
Least Squares in Heteroskedastic Models
Generalized Least Squares
Weighted Least Squares
Residual Diagnostics
Additional Readings:
Read chapter 12 of PRA by J. Faraway for a complete linear regression example
The tidyverse chapters from ModernDive for reducing unstructured data to a regression framework.
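As a companion to the Module 1 topics above, here is a minimal R sketch of the classical workflow (OLS estimation, confidence and prediction intervals, t-tests, the ANOVA table, weighted least squares, and residual diagnostics). It uses the built-in mtcars data purely for illustration; the course's own data sets and code may differ.

# OLS estimation
fit <- lm(mpg ~ wt + hp, data = mtcars)
summary(fit)   # t-tests for individual predictors; overall F-test for the model
confint(fit)   # confidence intervals for single coefficients
anova(fit)     # ANOVA table

# Prediction interval for a new observation
predict(fit, newdata = data.frame(wt = 3, hp = 150), interval = "prediction")

# Weighted least squares with illustrative (assumed) weights
fit_wls <- lm(mpg ~ wt + hp, data = mtcars, weights = 1 / mtcars$hp)

# Residual diagnostics
plot(fit)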
Module 2. Beyond Least Squares
Non-Linear Regression
Transforming the Response: Box-Cox method
Transforming the Predictors
Polynomial Regression
Regression Splines*
Local Regression
Generalized Additive Models (GAM)
Bootstrapping methods
Additional Readings:
Ch 6 slides from ISLR for more information on the above methods
Ch 5 slides from ISLR for cross validation and bootstrap
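The transformation and smoothing topics above can be sketched in a few lines of R. This is an illustrative outline only, using mtcars and standard packages (MASS, splines, mgcv, boot); it is not the course's own code.

library(MASS)      # boxcox()
library(splines)   # ns() for natural regression splines
library(mgcv)      # gam()
library(boot)      # boot() for bootstrapping

# Box-Cox: profile log-likelihood for the transformation parameter lambda
boxcox(lm(mpg ~ wt, data = mtcars))

fit_poly   <- lm(mpg ~ poly(wt, 2), data = mtcars)       # polynomial regression
fit_spline <- lm(mpg ~ ns(wt, df = 4), data = mtcars)    # regression spline
fit_loess  <- loess(mpg ~ wt, data = mtcars)             # local regression
fit_gam    <- gam(mpg ~ s(wt) + s(hp), data = mtcars)    # generalized additive model

# Bootstrap the slope of a simple linear fit
slope_fn <- function(d, idx) coef(lm(mpg ~ wt, data = d[idx, ]))[2]
boot(mtcars, slope_fn, R = 1000)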
Robust Regression
Quantile Regression
M-Estimation
Huber loss, robustness, and the rlm function in R's MASS package
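A minimal R sketch of these robust methods, again on mtcars for illustration only: rlm() from MASS performs M-estimation (Huber loss by default), and rq() from the quantreg package fits quantile regression.

library(MASS)      # rlm(): M-estimation, Huber loss by default
library(quantreg)  # rq(): quantile regression

fit_huber  <- rlm(mpg ~ wt + hp, data = mtcars)             # Huber M-estimator
fit_median <- rq(mpg ~ wt + hp, tau = 0.5, data = mtcars)   # median (LAD) regression

summary(fit_huber)
summary(fit_median)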