Which of the following functions are classified as activation functions?

Activation functions are crucial in neural networks because they introduce non-linearity into the model, allowing it to learn complex patterns in the data. The correct answer includes several functions that are widely recognized as activation functions.

The ReLU (Rectified Linear Unit) function is an activation function defined as \( f(x) = \max(0, x) \), which allows positive values to pass through while zeroing out negative values. It helps mitigate the vanishing gradient problem, making it popular for training deep networks.
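As a quick illustration (not part of the exam material), here is a minimal NumPy sketch of ReLU; the function name and sample inputs are chosen only for demonstration:

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): positive inputs pass through, negatives become 0
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # -> [0.  0.  0.  1.5]
```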

The SoftPlus function, defined as \( f(x) = \log(1 + e^x) \), serves as a smooth approximation to the ReLU function, ensuring that it is always differentiable.
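A minimal sketch of SoftPlus, again assuming NumPy: here np.logaddexp(0, x) computes \( \log(e^0 + e^x) = \log(1 + e^x) \) directly, which avoids overflow for large x that a naive np.log(1 + np.exp(x)) would hit:

```python
import numpy as np

def softplus(x):
    # log(1 + e^x), computed via logaddexp(0, x) for numerical stability
    return np.logaddexp(0.0, x)

x = np.array([-5.0, 0.0, 5.0])
print(softplus(x))  # smooth, always-positive approximation of ReLU
```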

The tanh (hyperbolic tangent) function, given by \( f(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}} \), maps input values to a range between -1 and 1. It's commonly used in hidden layers because its zero-centered output helps center the data, aiding convergence.
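To make the formula concrete, here is a sketch that spells out the definition and checks it against NumPy's built-in np.tanh (the built-in is the numerically safer choice; the explicit form is for illustration only):

```python
import numpy as np

def tanh_manual(x):
    # (e^x - e^-x) / (e^x + e^-x); overflows for large |x|, unlike np.tanh
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

x = np.array([-2.0, 0.0, 2.0])
print(np.allclose(tanh_manual(x), np.tanh(x)))  # True; outputs lie in (-1, 1)
```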

The sigmoid function, defined as \( f(x) = \frac{1}{1 + e^{-x}} \), squashes input values into a range between 0 and 1, which makes it well suited to output layers where the result is interpreted as a probability, as in binary classification.
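And a matching NumPy sketch of the sigmoid, with sample inputs chosen only for illustration:

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^-x): maps any real input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-4.0, 0.0, 4.0])
print(sigmoid(x))  # ~0.018, exactly 0.5, ~0.982
```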