Super Kai (Kazuya Ito) @superkai_kazuya

Tanh And Softsign Activation Functions Explained

Tanh() and Softsign() explained: computing zero or more activation values from input tensors using PyTorch nn modules.
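Both functions squash each input element into the range (-1, 1). Their standard definitions are:

```math
\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \qquad \operatorname{Softsign}(x) = \frac{x}{1 + |x|}
```

Tanh saturates exponentially fast as |x| grows, while Softsign approaches ±1 only polynomially, so it saturates more gently.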

Buy Me a Coffee☕
*Memos:

My post explains Tanh, Softsign, Sigmoid and Softmax.
My post explains heaviside() and Identity().
My post explains ReLU() and LeakyReLU().
My post explains PReLU() and ELU().
My post explains SELU() and CELU().
My post explains GELU() and Mish().
My post explains SiLU() and Softplus().
My post explains Sigmoid() and Softmax().

Tanh() can produce a 0D or higher-dimensional tensor of zero or more values computed by the Tanh function from a 0D or higher-dimensional tensor of zero or more elements, as shown in the sketch after the memos below:
*Memos:

The 1st argument is input (Required-Type: tensor of int, float, complex or bool).
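Here is a minimal sketch of both modules in action; the sample tensor values are illustrative, not taken from the original post:

```python
import torch
from torch import nn

# 0D (scalar) tensor input
tanh = nn.Tanh()
print(tanh(torch.tensor(8.)))
# tensor(1.0000)

# 1D tensor input: Tanh squashes each element into (-1, 1)
my_tensor = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])
print(tanh(my_tensor))
# tensor([ 1.0000, -0.9951,  0.0000,  0.7616,  0.9999, -0.9640, -0.7616,  0.9993])

# Softsign also squashes into (-1, 1), but saturates more gently
softsign = nn.Softsign()
print(softsign(my_tensor))
# tensor([ 0.8889, -0.7500,  0.0000,  0.5000,  0.8333, -0.6667, -0.5000,  0.8000])
```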