Regularization
Cost Function of Logistic Regression
$$ \begin{equation}J(\theta)=-{1\over m}\sum_{i=1}^m{\left(y^{(i)}\log(h_\theta(x^{(i)}))+(1-y^{(i)})\log(1-h_\theta(x^{(i)}))\right)} + {\lambda \over 2m}\sum_{j=1}^n\theta_j^2\end{equation} $$
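A minimal NumPy sketch of this cost, assuming `X` is an $(m, n+1)$ design matrix whose first column is all ones and `y` is a 0/1 label vector (the helper names `sigmoid` and `regularized_cost` are illustrative, not from the original):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def regularized_cost(theta, X, y, lam):
    """Regularized logistic-regression cost J(theta)."""
    m = len(y)
    h = sigmoid(X @ theta)                       # h_theta(x^(i)) for all i
    # cross-entropy term, averaged over the m examples
    cross_entropy = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    # penalty term skips theta_0: the sum runs from j = 1 to n
    penalty = lam / (2 * m) * np.sum(theta[1:] ** 2)
    return cross_entropy + penalty
```

Note that the penalty sums $\theta_1^2$ through $\theta_n^2$ only; $\theta_0$ is conventionally not regularized, which is why the gradient descent update below splits into two cases.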
Gradient Descent after Regularization
if j == 0:
$$ \begin{equation}\theta_0 := \theta_0 - \alpha{1\over m}\sum_{i=1}^m(h_\theta(x^{(i)})-y^{(i)})x_0^{(i)}\end{equation} $$
else:
$$ \begin{equation}\theta_j := \theta_j - \alpha{1\over m}\left[\sum_{i=1}^m(h_\theta(x^{(i)})-y^{(i)})x_j^{(i)} + \lambda\theta_j\right]\end{equation} $$
or, equivalently:
$$ \begin{equation}\theta_j := \theta_j\left(1-\alpha{\lambda\over m}\right) - \alpha{1\over m}\sum_{i=1}^m(h_\theta(x^{(i)})-y^{(i)})x^{(i)}_j\end{equation} $$
Use the formulas above to update $\theta_0,\theta_1,\cdots,\theta_n$ simultaneously.
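A minimal sketch of one simultaneous update step under the same conventions as the cost sketch above (the function name `gradient_descent_step` and the parameter names are assumptions for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(theta, X, y, alpha, lam):
    """One simultaneous regularized update of all theta_j."""
    m = len(y)
    error = sigmoid(X @ theta) - y                   # h_theta(x^(i)) - y^(i), shape (m,)
    grad = (X.T @ error) / m                         # (1/m) * sum of error * x_j^(i), shape (n+1,)
    new_theta = theta - alpha * grad                 # the j == 0 rule, applied to every j
    new_theta[1:] -= alpha * (lam / m) * theta[1:]   # add the lambda term for j >= 1 only
    return new_theta
```

Because `new_theta` is computed entirely from the old `theta` before being returned, every component is updated simultaneously, matching the requirement stated above. The second form of the update also makes the effect of regularization visible: each step first shrinks $\theta_j$ by the factor $(1-\alpha{\lambda\over m})$, which is slightly less than 1, before applying the ordinary gradient step.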