Super Kai (Kazuya Ito) @superkai_kazuya

Software Engineering And Web Development: Recurrent Layers Explained

Exploring popular neural network layers: Recurrent, LSTM, GRU, Transformer, activation functions, loss functions, optimizers & more!

Buy Me a Coffee☕
*Memos:

My post explains Recurrent Layer, LSTM, GRU and Transformer.
My post explains activation functions in PyTorch.
My post explains loss functions in PyTorch.
My post explains optimizers in PyTorch.
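
Since those posts cover these layers in PyTorch, here is a minimal sketch of how the recurrent layers named above can be instantiated and called; the sizes, the `batch_first` argument and the random input are assumptions chosen only for illustration, not taken from the post.

```python
import torch
import torch.nn as nn

# Hypothetical sizes, chosen only for illustration.
input_size, hidden_size, seq_len, batch_size = 10, 20, 5, 3

rnn = nn.RNN(input_size=input_size, hidden_size=hidden_size, batch_first=True)    # Recurrent Layer
lstm = nn.LSTM(input_size=input_size, hidden_size=hidden_size, batch_first=True)  # LSTM
gru = nn.GRU(input_size=input_size, hidden_size=hidden_size, batch_first=True)    # GRU

x = torch.randn(batch_size, seq_len, input_size)  # (batch, sequence, features)

out_rnn, h_rnn = rnn(x)               # output and final hidden state
out_lstm, (h_lstm, c_lstm) = lstm(x)  # LSTM also returns a cell state
out_gru, h_gru = gru(x)

print(out_rnn.shape, out_lstm.shape, out_gru.shape)  # torch.Size([3, 5, 20]) each
```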

A layer is a collection of nodes (neurons) that performs a specific task.
Basically, a Neural Network (NN) consists of the 3 kinds of layers shown below (a minimal code sketch follows the list):

Input Layer:

is the 1st layer, which accepts data and passes it to a hidden layer.



Hidden Layer:

is a layer between the input and output layers.
There can be zero or more hidden layers in a neural network.



Output Layer:

is the last layer which holds a...
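
To make the input/hidden/output structure above concrete, here is a minimal sketch of such a network in PyTorch; the layer sizes (4, 8, 2) and the choice of nn.Linear with ReLU are assumptions for illustration, not taken from the post.

```python
import torch
import torch.nn as nn

# A minimal sketch of the input -> hidden -> output structure described above.
# The layer sizes (4 inputs, 8 hidden nodes, 2 outputs) are made up for illustration.
class SimpleNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.input_to_hidden = nn.Linear(4, 8)   # Input Layer -> Hidden Layer
        self.hidden_to_output = nn.Linear(8, 2)  # Hidden Layer -> Output Layer
        self.activation = nn.ReLU()

    def forward(self, x):
        x = self.activation(self.input_to_hidden(x))  # hidden layer computation
        return self.hidden_to_output(x)               # output layer holds the result

x = torch.randn(3, 4)       # a batch of 3 samples with 4 features each
print(SimpleNN()(x).shape)  # torch.Size([3, 2])
```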