Backpropagation is arguably the most fundamental algorithm in neural network training. It was popularized by Rumelhart et al. in the 1986 paper "Learning representations by back-propagating errors" [1].

Backpropagation is the process of computing the gradients of a model's weights with respect to a loss, given a batch of input data. It is followed by the optimization step, in which those gradients are used to update the weights.
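The two steps above can be sketched with a toy model: one weight, a squared-error loss, and a single gradient-descent update. All names here (`w`, `lr`, the example values) are illustrative, not from the article; backpropagation in this one-parameter case reduces to applying the chain rule by hand.

```python
# Toy example (illustrative): model y_hat = w * x with squared-error loss.
x, y = 2.0, 10.0   # one training example: input and target
w = 1.0            # initial weight

# Forward pass
y_hat = w * x
loss = (y_hat - y) ** 2          # (2 - 10)^2 = 64.0

# Backward pass (backpropagation): dLoss/dw via the chain rule
grad_y_hat = 2 * (y_hat - y)     # dLoss/dy_hat = -16.0
grad_w = grad_y_hat * x          # dLoss/dw     = -32.0

# Optimization step: gradient descent update with learning rate lr
lr = 0.1
w -= lr * grad_w                 # w = 1.0 - 0.1 * (-32.0) = 4.2

new_loss = (w * x - y) ** 2      # (8.4 - 10)^2 = 2.56, lower than before
```

In a real network the backward pass repeats this chain-rule step layer by layer, reusing each layer's intermediate gradient, but the gradient-then-update pattern is the same.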
