Activation Functions
Linear
The output equals the input (the identity function), so the output is continuous and unbounded.
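As a minimal sketch, TensorFlow's built-in `tf.keras.activations.linear` (the default activation for `Dense`) simply returns its input unchanged; the tensor values below are illustrative:

```python
import tensorflow as tf

# A sample tensor; 'linear' activation is the identity function
x = tf.constant([[-2.0, 0.0, 3.5]])
out = tf.keras.activations.linear(x)  # output equals the input, element-wise
print(out.numpy())
```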

Sigmoid or Logistic function
It squashes each input value into the continuous range (0, 1).

A sigmoid activation can be used inside a Keras Dense layer:
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model

input_layer = Input(shape=(16,))  # previous layer; the shape is illustrative
outputs = Dense(
    units=32,
    activation='sigmoid'
)(input_layer)
model = Model(inputs=input_layer, outputs=outputs)

TensorFlow also exposes the operation directly: tf.math.sigmoid computes the sigmoid of x element-wise.
import tensorflow as tf
tf.math.sigmoid(
    x,         # a tensor of floating-point type
    name=None  # optional name for the operation
)

Softmax
Converts its input values into a probability distribution among N classes.
Transforms raw scores (logits) into probabilities for multi-class classification; the resulting probabilities are normalized so they sum to 1.
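A short sketch with tf.nn.softmax, using arbitrary example logits for three classes, shows the normalization:

```python
import tensorflow as tf

# Raw scores (logits) for 3 classes; the values are illustrative
logits = tf.constant([2.0, 1.0, 0.1])
probs = tf.nn.softmax(logits)

print(probs.numpy())        # ≈ [0.659 0.242 0.099]
print(probs.numpy().sum())  # the probabilities sum to 1
```

Note that softmax preserves the ranking of the logits: the largest score gets the largest probability.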
ReLU
Also called Rectified Linear Unit:
if the input value is ≥ 0, it returns the input unchanged (acts as a linear function)
if the input value is < 0, it returns 0
It can be summarized as f(x) = max(0, x).
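This behavior can be sketched with tf.nn.relu on a small example tensor (the values are illustrative):

```python
import tensorflow as tf

# Mix of negative, zero, and positive inputs
x = tf.constant([-3.0, -1.0, 0.0, 2.0, 5.0])
y = tf.nn.relu(x)  # max(0, x) element-wise

print(y.numpy())  # → [0. 0. 0. 2. 5.]
```

Negative inputs are clipped to 0 while non-negative inputs pass through unchanged.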
