Ordinary Least Squares is a kind of linear regression model. It is simple and easy to understand. In this tutorial, we will explain it to help you understand how it works.

Ordinary Least Squares is defined as:

**ŷ(w, x) = w_0 + w_1x_1 + w_2x_2 + … + w_nx_n**

where **ŷ** is the predicted target, **x = (x_1, x_2, …, x_n)** is the feature vector of a sample, and **x_n** is its *n*-th feature. **w = (w_1, w_2, …, w_n)** is called the coefficients, **w_0** is called the intercept, and both **w** and **w_0** will be estimated by the algorithm.
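The definition above says a prediction is just the intercept plus the dot product of the coefficients and the features. A minimal NumPy sketch (the coefficient, intercept, and sample values here are invented for illustration):

```python
import numpy as np

# Hypothetical values: w = (w_1, w_2, w_3), intercept w_0, one sample x
w = np.array([0.5, -1.2, 2.0])
w_0 = 0.3
x = np.array([1.0, 2.0, 0.5])

# ŷ = w_0 + w_1*x_1 + w_2*x_2 + ... + w_n*x_n
y_hat = w_0 + np.dot(w, x)
print(y_hat)
```

The same dot-product form is what `LinearRegression.predict` evaluates once `w` and `w_0` have been estimated.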

**How to estimate w and w_0**

This is the core of Ordinary Least Squares: our goal is to make *y* and *ŷ* as close as possible. To estimate the discrepancy between them, we define a loss function as:

**min_w ||Xw − y||_2^2**

Notice: we do not use *w_0* in this loss function.

The smaller the value of the loss function, the smaller the discrepancy between *y* and *ŷ*. This is Ordinary Least Squares.
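In practice, scikit-learn's `LinearRegression` minimizes this loss for us. A short sketch with toy data (the data values are made up, chosen so y = 1 + 2x exactly and the fit is perfect):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data following y = 1 + 2*x exactly
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# fit_intercept=True by default, so w_0 is estimated alongside w
model = LinearRegression()
model.fit(X, y)

print(model.coef_)       # estimated w
print(model.intercept_)  # estimated w_0

# The minimized training loss ||Xw - y||_2^2
residuals = model.predict(X) - y
print(np.sum(residuals ** 2))
```

With this exact linear data the estimated coefficients recover the true slope and intercept, and the loss is essentially zero.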

**How to measure the quality of w and w_0**

We can compute the R2 coefficient between *y* and *ŷ*.
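One way to sketch this, using scikit-learn's `r2_score` on a fitted model (the noisy toy data here is invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Invented data, roughly y = 1 + 2*x with small noise
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

model = LinearRegression().fit(X, y)
y_hat = model.predict(X)

# R2 = 1 - sum((y - ŷ)^2) / sum((y - mean(y))^2); close to 1 means a good fit
print(r2_score(y, y_hat))
print(model.score(X, y))  # LinearRegression.score computes the same R2
```

An R2 near 1 means *ŷ* tracks *y* closely; an R2 near 0 means the model does no better than predicting the mean of *y*.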

An Introduction to R2 Coefficient for Beginners – Scikit-Learn Tutorial