Activation Functions

Linear

The output equals the input: f(z) = z. The output is continuous and unbounded.

Sigmoid or Logistic function

It transforms each input into the continuous range (0, 1).

f(s_i) = \frac{1}{1 + e^{-s_i}}
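The formula above can be sketched directly in Python (a minimal example, not tied to any particular framework):

```python
import math

def sigmoid(s: float) -> float:
    # 1 / (1 + e^{-s}); the result always lies in (0, 1)
    return 1.0 / (1.0 + math.exp(-s))

print(sigmoid(0.0))  # 0.5: the midpoint of the range
```

Large positive inputs approach 1 and large negative inputs approach 0, which is why sigmoid outputs are often read as probabilities for binary classification.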

Softmax

Transforms raw scores into probabilities in multi-class classification. The outputs are normalized so that they all sum to 1.

f(s_i) = \frac{e^{s_i}}{\sum_j e^{s_j}}
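A minimal sketch of this formula in plain Python (subtracting the maximum score is a standard trick to avoid overflow; it does not change the result):

```python
import math

def softmax(scores):
    # shift by the max score for numerical stability
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    # each output is non-negative and the outputs sum to 1
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
```

The largest raw score always gets the largest probability, and the outputs sum to 1 regardless of the scale of the inputs.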

ReLU

Also called the Rectified Linear Unit, f(z) = max(0, z):

  • if the input value is ≥ 0, it returns the input (acts as a linear function)

  • if the input value is < 0, it returns 0
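The two cases above reduce to a one-line function (a minimal sketch in plain Python):

```python
def relu(z: float) -> float:
    # identity for non-negative inputs, zero otherwise
    return z if z >= 0 else 0.0

print(relu(3.0))   # 3.0
print(relu(-2.0))  # 0.0
```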
