Simd Library Documentation.

Function Struct Reference

Activation Function structure.

#include <SimdNeural.hpp>

Public Types

enum Type {
  Identity,
  Tanh,
  Sigmoid,
  Relu,
  LeakyRelu,
  Softmax
}

Detailed Description

Activation Function structure.

Provides activation functions and their derivatives.

Member Enumeration Documentation

◆ Type

enum Type

Describes the types of activation function. It is used to create a Layer in a Network.
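
A minimal usage sketch (assuming the FullyConnectedLayer constructor takes an activation type, input size, and output size, as in the library's examples; treat the exact layer signatures as assumptions to verify against SimdNeural.hpp):

        #include <SimdNeural.hpp>

        int main()
        {
            using namespace Simd::Neural;

            Network net;
            // Each layer is constructed with the Function::Type it should apply.
            // The layer sizes (784 -> 128 -> 10) are illustrative only.
            net.Add(new FullyConnectedLayer(Function::Relu, 784, 128));
            net.Add(new FullyConnectedLayer(Function::Sigmoid, 128, 10));
            return 0;
        }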

Enumerator
Identity 

Identity:

                f(x) = x;
                df(y) = 1;
Tanh 

Hyperbolic Tangent:

                f(x) = (exp(x) - exp(-x))/(exp(x) + exp(-x));
                df(y) = 1 - y*y;

See implementation details: SimdSynetTanh32f and SimdNeuralDerivativeTanh.
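
Note that df takes the activation output y = f(x), not the input x; this lets back-propagation reuse the forward result. A standalone sketch of the documented formulas in plain C++ (not the library's SIMD implementation):

        #include <cmath>
        #include <cstdio>

        // Documented formulas: f(x) = tanh(x); df(y) = 1 - y*y, where y = f(x).
        static float TanhForward(float x) { return std::tanh(x); }
        static float TanhDerivative(float y) { return 1.0f - y * y; }

        int main()
        {
            float x = 0.5f;
            float y = TanhForward(x);
            // d/dx tanh(x) = 1 - tanh(x)^2, so the derivative needs only y.
            std::printf("f(%.2f) = %.6f, df = %.6f\n", x, y, TanhDerivative(y));
            return 0;
        }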

Sigmoid 

Sigmoid:

                f(x) = 1/(1 + exp(-x));
                df(y) = (1 - y)*y;

See implementation details: SimdSynetSigmoid32f and SimdNeuralDerivativeSigmoid.
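
The same convention holds: df is evaluated on the output y = f(x). A standalone sketch of the documented formulas:

        #include <cmath>
        #include <cstdio>

        // Documented formulas: f(x) = 1/(1 + exp(-x)); df(y) = (1 - y)*y.
        static float SigmoidForward(float x) { return 1.0f / (1.0f + std::exp(-x)); }
        static float SigmoidDerivative(float y) { return (1.0f - y) * y; }

        int main()
        {
            float x = 0.5f;
            float y = SigmoidForward(x);
            std::printf("f(%.2f) = %.6f, df = %.6f\n", x, y, SigmoidDerivative(y));
            return 0;
        }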

Relu 

ReLU (Rectified Linear Unit):

                f(x) = max(0, x);
                df(y) = y > 0 ? 1 : 0;

See implementation details: SimdSynetRelu32f and SimdNeuralDerivativeRelu. A combined sketch of this and the LeakyRelu variant follows the LeakyRelu entry below.

LeakyRelu 

Leaky ReLU (Rectified Linear Unit):

                f(x) = x > 0 ? x : 0.01*x;
                df(y) = y > 0 ? 1 : 0.01;

See implementation details: SimdSynetRelu32f and SimdNeuralDerivativeRelu.
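
Since Relu and LeakyRelu reference the same implementation functions, both can be written as one slope-parameterized form (slope 0 gives Relu, slope 0.01 gives LeakyRelu). A standalone sketch, not the library's SIMD code:

        #include <cstdio>

        // One slope-parameterized form covering both documented variants:
        //   slope = 0.00 -> Relu:      f(x) = max(0, x)
        //   slope = 0.01 -> LeakyRelu: f(x) = x > 0 ? x : 0.01*x
        static float ReluForward(float x, float slope) { return x > 0 ? x : slope * x; }
        static float ReluDerivative(float y, float slope) { return y > 0 ? 1.0f : slope; }

        int main()
        {
            std::printf("Relu(-2) = %.2f, LeakyRelu(-2) = %.2f\n",
                ReluForward(-2.0f, 0.0f), ReluForward(-2.0f, 0.01f));
            return 0;
        }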

Softmax 

Softmax (normalized exponential function):

                f(x[i]) = exp(x[i])/sum(exp(x[j]));
                df(y) = y*(1 - y);
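
Because exp overflows for large inputs, practical implementations subtract the maximum element before exponentiating; shifting all inputs by a constant leaves the result unchanged. A standalone sketch of the documented formula with this standard stabilization (an assumption about implementation detail, not taken from the library's code):

        #include <algorithm>
        #include <cmath>
        #include <cstdio>
        #include <vector>

        // f(x[i]) = exp(x[i])/sum(exp(x[j])), computed as exp(x[i] - max(x))/sum
        // for numerical stability.
        static std::vector<float> Softmax(const std::vector<float> & x)
        {
            float max = *std::max_element(x.begin(), x.end());
            std::vector<float> y(x.size());
            float sum = 0.0f;
            for (size_t i = 0; i < x.size(); ++i)
                sum += (y[i] = std::exp(x[i] - max));
            for (size_t i = 0; i < y.size(); ++i)
                y[i] /= sum;
            return y;
        }

        int main()
        {
            std::vector<float> y = Softmax({ 1.0f, 2.0f, 3.0f });
            std::printf("%.4f %.4f %.4f\n", y[0], y[1], y[2]); // sums to 1
            return 0;
        }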