Regression is not hard to understand if you have learned the math before.
But using it in ML is a new insight for me.
Steps of model building
1. model hypothesis----linear model
2. model evaluation----loss function
3. model optimization----gradient descent
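A minimal sketch of step 2 above, assuming the usual squared-error (MSE) loss; the function name `mse_loss` is my own, not from the course:

```python
import numpy as np

def mse_loss(y_pred, y_true):
    """Mean squared error: average of squared differences."""
    return np.mean((y_pred - y_true) ** 2)

# Example: errors are 0 and 2, so the loss is (0 + 4) / 2 = 2.0
loss = mse_loss(np.array([1.0, 2.0]), np.array([1.0, 4.0]))
print(loss)  # 2.0
```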
y = w^T x + b
w : weight
b : bias
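The linear hypothesis can be sketched in a few lines of numpy; the values of `w`, `b`, and `x` below are made up for illustration:

```python
import numpy as np

def predict(x, w, b):
    """Linear model y = w^T x + b: inner product of weights and input, plus bias."""
    return np.dot(w, x) + b

w = np.array([2.0, -1.0])  # weight vector
b = 0.5                    # bias
x = np.array([3.0, 4.0])   # one input example

y = predict(x, w, b)
print(y)  # 2*3 + (-1)*4 + 0.5 = 2.5
```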
training set: used to train the model
testing set: used to test how well the model generalizes (what we really care about is the error on new data, i.e. the testing data).
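A toy sketch of this split, on synthetic data I made up (true relation y = 3x + 1 plus noise): fit on the training portion only, then report the error on the held-out testing portion:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: y = 3x + 1 with a little noise (assumed for illustration).
x = rng.uniform(-1, 1, size=100)
y = 3 * x + 1 + rng.normal(scale=0.1, size=100)

# Split: first 80 examples train the model, last 20 are held out for testing.
x_train, y_train = x[:80], y[:80]
x_test, y_test = x[80:], y[80:]

# Fit w and b by least squares on the training set only.
w, b = np.polyfit(x_train, y_train, 1)

train_err = np.mean((w * x_train + b - y_train) ** 2)
test_err = np.mean((w * x_test + b - y_test) ** 2)
print(train_err, test_err)  # the testing error is what we actually care about
```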
A more complex model does not always lead to better performance on testing data.
It may instead lead to overfitting.
So we should select a suitable model.
Likewise, more inputs and more parameters are not always useful for the model.
A third way to improve the model is to add a regularization term, which smooths the learned function.
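The two remaining steps can be combined in one sketch: gradient descent on a squared loss with an L2 (weight) regularization term. The data, learning rate, and regularization strength `lam` below are assumed values, not from the course:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 1-D data: y = 2x plus noise (assumed for illustration).
x = rng.uniform(-1, 1, size=50)
y = 2 * x + rng.normal(scale=0.1, size=50)

w, b = 0.0, 0.0
lr = 0.1    # learning rate
lam = 0.01  # regularization strength (lambda)

for _ in range(500):
    err = w * x + b - y
    # Gradients of mean squared error + lam * w^2.
    # The penalty pulls w toward 0, smoothing the function;
    # by convention the bias b is not regularized.
    grad_w = 2 * np.mean(err * x) + 2 * lam * w
    grad_b = 2 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # w ends up close to 2, shrunk slightly by the penalty
```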
More about gradient descent and regularization will be shown in later videos.
(It’s difficult to type equations, so I omit them here. I take notes in English in order to improve my English; it is useful practice to write output in English.)
By xijiu in Matrix Team. 2022.06.14