## Key Sequence

Review Linear Regression even more and give some intuition; discuss logistic regression and give an optimization method for it.

## Notation

Recall the notation:

- $\left(x^{(i)}, y^{(i)}\right)$ — the $i$th example
- $x^{(i)} \in \mathbb{R}^{m+1}$, where $x_0^{(i)} = 1, \forall i$
- $y^{(i)} \in \mathbb{R}$
- $n$ — number of examples; $m$ — number of features

## New Concepts

- Locally-Weighted Regression
- Logistic Regression
- Newton's Method
- parametricity of learning algorithms
    - Non-Parametric Learning Algorithms
    - Parametric Learning Algorithms

## Important Results / Claims

- probabilistic intuition for least-squares error

## Questions

## Interesting Factoids

## Scratch
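Locally-weighted regression, listed among the new concepts, is the non-parametric example: instead of fitting one global $\theta$, it solves a weighted least-squares problem per query point, with weights $w^{(i)} = \exp\left(-\frac{\|x^{(i)} - x\|^2}{2\tau^2}\right)$. A minimal sketch (the function name, data, and bandwidth `tau` here are illustrative, not from the notes):

```python
import numpy as np

def lwr_predict(X, y, x_query, tau=1.0):
    """Predict y at x_query via locally-weighted linear regression.

    X: (n, m+1) design matrix with X[:, 0] == 1 (intercept term x_0).
    y: (n,) targets.
    tau: bandwidth controlling how fast weights fall off with distance.
    """
    # Gaussian-bump weight for each training example, centered at the query.
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * tau ** 2))
    W = np.diag(w)
    # Solve the weighted normal equations (X^T W X) theta = X^T W y.
    theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return x_query @ theta
```

Note that $\theta$ is recomputed for every query, which is why the method is non-parametric: the whole training set must be kept around at prediction time.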
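The optimization method given for logistic regression is Newton's method: repeatedly update $\theta \leftarrow \theta - H^{-1}\nabla_\theta \ell$, using the gradient and Hessian of the negative log-likelihood. A minimal sketch, assuming $y^{(i)} \in \{0, 1\}$ and the design-matrix convention above (the function names and data are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def newtons_method_logistic(X, y, n_iters=10):
    """Fit logistic regression by Newton's method.

    X: (n, m+1) design matrix with X[:, 0] == 1 (intercept term x_0).
    y: (n,) labels in {0, 1}.
    """
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(n_iters):
        h = sigmoid(X @ theta)              # predicted probabilities h_theta(x)
        grad = X.T @ (h - y)                # gradient of the negative log-likelihood
        S = np.diag(h * (1 - h))            # diagonal weights h(1 - h)
        H = X.T @ S @ X                     # Hessian of the negative log-likelihood
        theta -= np.linalg.solve(H, grad)   # Newton step: theta -= H^{-1} grad
    return theta
```

Each iteration costs a $d \times d$ linear solve, but convergence is quadratic near the optimum, so a handful of iterations typically suffices where gradient descent would need many more.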
