OLS Regression in R

1. Objective

In this R tutorial, we will discuss OLS regression in R in detail. We will also cover the OLS model, useful commands, and diagnostics in R.

2. Introduction to OLS Regression in R

OLS (ordinary least squares) regression is a statistical technique used for modeling and for analyzing linear relationships between a response variable and one or more predictors. If the relationship between two variables appears to be linear, then a straight line can be fit to the data to model the relationship. The linear equation for a bivariate regression takes the following form:
y = mx + c
Where y = response (dependent) variable
m = gradient (slope)
x = predictor (independent) variable
c = intercept
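
For instance, the base R function lm() fits this equation by ordinary least squares. The snippet below is a minimal sketch using simulated data (the variable names and values are illustrative, not from a real data set):

set.seed(42)
x <- 1:50
y <- 3 + 2 * x + rnorm(50, sd = 5)   # simulated data with intercept c = 3, slope m = 2
fit <- lm(y ~ x)                     # ordinary least squares fit of y = mx + c
coef(fit)                            # estimated intercept and slope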

3. OLS (Linear Model Estimation Using Ordinary Least Squares) in R

i. Keywords

Models, regression

ii. Usage

ols(formula, data, weights, subset, na.action=na.delete,
    method="qr", model=FALSE,
    x=FALSE, y=FALSE, se.fit=FALSE, linear.predictors=TRUE,
    penalty=0, penalty.matrix, tol=1e-7, sigma,
    var.penalty=c('simple','sandwich'), ...)

iii. Arguments

a. formula
An S formula object, e.g.
Y ~ rcs(x1,5)*lsp(x2,c(10,20))
b. data
The name of an S data frame containing all needed variables.
c. weights
Weights to use in the fitting process.
d. subset
An expression defining a subset of the observations to use in the fit. The default is to use all observations.
e. na.action
Specifies an S function to handle missing data.
f. method
Specifies a particular fitting method, or "model.frame".
g. model
The default is FALSE. Set to TRUE to return the model frame as element model of the fit object.
h. x
The default is FALSE. Set to TRUE to return the expanded design matrix as element x of the returned fit object. Set both x=TRUE and y=TRUE if you are going to use the residuals function.
i. y
The default is FALSE. Set to TRUE to return the vector of response values as element y of the fit.
j. se.fit
The default is FALSE. Set to TRUE to compute the estimated standard errors of the estimate of Xβ and store them in element se.fit of the fit.
k. linear.predictors
The default is TRUE. Set to FALSE to cause predicted values not to be stored.
l. penalty, penalty.matrix
See lrm.
m. tol
Tolerance for information matrix singularity.
n. sigma
If sigma is given, it is used as the actual root mean squared error parameter for the model. Otherwise, sigma is estimated from the data using the usual formulas.
o. var.penalty
The type of variance-covariance matrix that is stored in the var component of the fit when penalization is used.
p. ...
Arguments to pass to lm.wfit or lm.fit.
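
Putting these arguments together, here is a hedged sketch of an ols() fit; it assumes the rms package is installed, and the data frame d and its columns are illustrative names rather than part of the original tutorial:

library(rms)                                          # ols() comes from the rms package
set.seed(1)
d <- data.frame(x1 = rnorm(100), x2 = rnorm(100))     # illustrative data frame
d$y <- 1 + 2 * d$x1 - 0.5 * d$x2 + rnorm(100)
f <- ols(y ~ x1 + x2, data = d, x = TRUE, y = TRUE)   # keep design matrix and response
f                                                     # prints coefficients and model statistics
head(residuals(f))                                    # works because x=TRUE and y=TRUE were requested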

4. OLS Data Analysis: Descriptive Stats

  • Several built-in commands for describing data are present in R.
  • We use the list() command to output all elements of an object.
  • We use the summary() command to describe all variables contained within a data frame.
  • The summary() command can also be used with individual variables.
  • Simple plots can also provide familiarity with the data.
  • We use the hist() command to produce a histogram for any given data values.
  • We use the plot() command to produce both univariate and bivariate plots for any given objects (sketched in the example below).
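
Here is a minimal sketch of these commands applied to the built-in mtcars data frame, which is used here only for illustration:

data(mtcars)                       # a data frame that ships with R
summary(mtcars)                    # describe all variables in the data frame
summary(mtcars$mpg)                # describe an individual variable
hist(mtcars$mpg)                   # histogram of one variable
plot(mtcars$mpg)                   # univariate plot
plot(mtcars$wt, mtcars$mpg)        # bivariate scatterplot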

i. Other Useful Commands

  • sum
  • min
  • max
  • mean
  • median
  • var
  • sd
  • cor
  • range
  • summary
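
As a quick sketch, these commands can be applied to a numeric variable such as mtcars$mpg (again chosen only for illustration):

x <- mtcars$mpg                    # example numeric variable
sum(x); min(x); max(x)             # total, minimum, maximum
mean(x); median(x)                 # measures of central tendency
var(x); sd(x)                      # spread
range(x)                           # minimum and maximum together
cor(mtcars$mpg, mtcars$wt)         # correlation between two variables
summary(x)                         # five-number summary plus mean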

5. Data Analysis: Useful Commands for OLS Regression in R

  • lm – Linear model.
  • lme – Mixed effects (nlme package).
  • glm – Generalized linear model.
  • multinom – Multinomial logit (nnet package).
  • optim – General optimizer (see the sketch after this list).
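
The calls below sketch how each of these is typically invoked; the formulas and the mtcars data are illustrative only, and lme() and multinom() require the nlme and nnet packages respectively:

fit_lm  <- lm(mpg ~ wt + hp, data = mtcars)                        # linear model
fit_glm <- glm(am ~ wt + hp, data = mtcars, family = binomial)     # generalized linear model
# library(nlme); fit_lme <- lme(mpg ~ wt, random = ~ 1 | cyl, data = mtcars)   # mixed effects
# library(nnet); fit_mnl <- multinom(factor(cyl) ~ wt + hp, data = mtcars)     # multinomial logit
rss <- function(b) sum((mtcars$mpg - b[1] - b[2] * mtcars$wt)^2)   # residual sum of squares
optim(par = c(0, 0), fn = rss)                                     # general optimizer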

6. OLS Diagnostics in R

  • Post-estimation diagnostics are key to data analysis.
  • Furthermore, diagnostics give us the opportunity to show off some of R's graphs and to explore what could be driving our data.

  • Outlier: an unusual observation.
  • Leverage: the ability to change the slope of the regression line.
  • Influence: the combined impact of strong leverage and outlier status (see the sketch below).
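
As a sketch, the standard lm() diagnostic tools can be used to examine each of these; mtcars is again used only for illustration:

fit <- lm(mpg ~ wt + hp, data = mtcars)   # illustrative fit
par(mfrow = c(2, 2))
plot(fit)                                 # residual, Q-Q, scale-location and leverage plots
rstudent(fit)                             # studentized residuals: large values flag outliers
hatvalues(fit)                            # hat values: large values flag high-leverage points
cooks.distance(fit)                       # Cook's distance: influence (leverage plus outlier status)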

So, this was all about OLS regression in R. Hope you like our explanation.

7. Conclusion – OLS Regression in R

Hence, we have seen how OLS regression in R works using ordinary least squares. Also, we have learned its usage as well as its commands. Moreover, we have studied diagnostics in R, which help in examining a fit graphically. Still, if you have any query regarding OLS regression in R, ask in the comment tab.

