Perceptron

  • It's the foundation of the MLP (multi-layer perceptron) and Deep Learning

  • Each input value has an associated weight, and a bias term is added:

$$z^{[i]} := b + \sum_{j=1}^{n} x_j^{[i]} \, w_j \qquad \hat{y}^{[i]} = \begin{cases} 1 & \text{if } z^{[i]} > 0 \\ 0 & \text{otherwise} \end{cases}$$
  • To update the weights iteratively, we compute the error on each iteration and use it to adjust each weight (see the sketch after this list):

$$\begin{aligned} \text{error} &:= y^{[i]} - \hat{y}^{[i]} \\ w_j &:= w_j + \text{error} \cdot x_j^{[i]} \end{aligned}$$
  • Only capable of classifying with a linear decision boundary, so it cannot learn problems that are not linearly separable (e.g., XOR)
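
As a concrete sketch of the two rules above, here is a minimal NumPy implementation; the `Perceptron` class, the `epochs` parameter, and the extra bias update `b := b + error` are illustrative assumptions rather than part of the notes.

```python
import numpy as np

class Perceptron:
    """Minimal perceptron following the threshold and update rules above."""

    def __init__(self, n_features):
        self.w = np.zeros(n_features)  # one weight per input value
        self.b = 0.0                   # bias term

    def predict(self, x):
        # z = b + sum_j x_j * w_j, then threshold: 1 if z > 0, else 0
        z = self.b + np.dot(x, self.w)
        return 1 if z > 0 else 0

    def fit(self, X, y, epochs=10):
        for _ in range(epochs):
            for x_i, y_i in zip(X, y):
                error = y_i - self.predict(x_i)  # error := y - y_hat
                self.w += error * x_i            # w_j := w_j + error * x_j
                self.b += error                  # bias update (assumed; the notes only give the weight rule)


# Example: learn AND, which is linearly separable
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
p = Perceptron(n_features=2)
p.fit(X, y, epochs=10)
print([p.predict(x) for x in X])  # [0, 0, 0, 1]
```

On linearly separable data such as AND, the loop converges after a few epochs; on XOR it never does, which is the limitation noted in the last bullet.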
