What is the difference between reconstruction vs backpropagation?

I was following a tutorial on understanding Restricted Boltzmann Machines (RBMs) and I noticed that they used both the terms reconstruction and backpropagation to describe the process of updating weights. They seemed to use reconstruction when referring to the links between the input and the first hidden layer and then backpropagation when referring to the links to the output layer.

Are these terms used interchangeably or are they different concepts?

Topic backpropagation rbm neural-network

Category Data Science


Reconstruction is a concept specific to the Restricted Boltzmann Machine (RBM): it describes the phase in which the model generates (reconstructs) visible samples from the states of the hidden layer. For more detail, you can refer to: https://stackoverflow.com/questions/4105538/restricted-boltzmann-machine-reconstruction
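
A minimal NumPy sketch of that reconstruction step (the layer sizes, weights, and sample here are made up purely for illustration): a visible sample is pushed up to hidden probabilities, then pushed back down through the transposed weights to produce a reconstructed visible vector.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy RBM: 6 visible units, 3 hidden units, small random weights (illustrative only).
n_visible, n_hidden = 6, 3
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
b_visible = np.zeros(n_visible)
b_hidden = np.zeros(n_hidden)

# One binary visible sample.
v = rng.integers(0, 2, size=n_visible).astype(float)

# Infer hidden activation probabilities from the visible sample.
h_prob = sigmoid(v @ W + b_hidden)

# Reconstruction: generate visible activations back from the hidden state.
v_reconstructed = sigmoid(h_prob @ W.T + b_visible)

print(v)
print(v_reconstructed)  # one probability per visible unit
```

Note that the same weight matrix is used in both directions (transposed on the way down), which is what makes the RBM's connections undirected.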

Backpropagation is something different entirely; you can find backpropagation in deep neural nets, convolutional nets, and (in a modified sense) RBMs. Assume a deep neural net with N hidden layers. During training, we feed the input forward through the randomly initialized weights up to the last neuron. From the output of the last neuron, we calculate our loss using a cost function that measures the error between the predicted and the true output. This is forward propagation.
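
A quick sketch of that forward pass (the layer sizes, random weights, and squared-error cost are arbitrary choices for illustration): the input flows through each layer in turn, and the loss is computed at the end.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(0.0, x)

# Two hidden layers with randomly initialized weights, one scalar output (toy sizes).
W1, b1 = rng.normal(0, 0.5, (4, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 8)), np.zeros(8)
W3, b3 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)

x = rng.normal(size=4)   # one input sample
y_true = 1.0             # its true target

# Forward propagation: feed the input through every layer to the output.
h1 = relu(x @ W1 + b1)
h2 = relu(h1 @ W2 + b2)
y_pred = (h2 @ W3 + b3)[0]

# Cost function: squared error between the predicted and the true output.
loss = (y_pred - y_true) ** 2
print(loss)
```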

Knowing the loss and the loss function, we take derivatives backwards, using the chain rule, to find the gradients of the weights and biases of each layer, all the way back to the input side. This is called backpropagation. After backpropagation, we update all of the weights and biases of the N layers with the gradients we calculated for each layer. Then we do forward propagation again, backpropagation again, update again, and so on until we achieve our goal (usually minimizing the error, hopefully as a convex optimization problem). You can also refer to:

https://www.youtube.com/watch?v=x_Eamf8MHwU for backpropagation
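
The forward / backward / update loop described above can be sketched end to end on a tiny network (the architecture, XOR data, and learning rate are my own illustrative choices, with the chain-rule gradients written out by hand):

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny network: 2 inputs -> 3 hidden units (sigmoid) -> 1 linear output.
W1, b1 = rng.normal(0, 1.0, (2, 3)), np.zeros(3)
W2, b2 = rng.normal(0, 1.0, (3, 1)), np.zeros(1)

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[0.0], [1.0], [1.0], [0.0]])  # XOR targets
lr = 0.5
losses = []

for step in range(2000):
    # Forward propagation.
    h = sigmoid(X @ W1 + b1)
    y_pred = h @ W2 + b2
    losses.append(np.mean((y_pred - y) ** 2))

    # Backpropagation: chain rule from the loss back through each layer.
    d_pred = 2.0 * (y_pred - y) / len(X)  # dL/dy_pred
    dW2 = h.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T                   # push the gradient through W2
    d_z1 = d_h * h * (1.0 - h)            # sigmoid derivative
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)

    # Update every layer's weights and biases with its gradients.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])  # the loss should have dropped over training
```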

One final note: the gradients of an RBM cannot be computed via classic backpropagation; a method called contrastive divergence is used for the calculation instead. You can also have a look at Geoffrey Hinton's guide to training RBMs on this matter and more:

https://www.cs.toronto.edu/~hinton/absps/guideTR.pdf
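
A rough sketch of one CD-1 update, assuming toy sizes, random data, and mean-field probabilities instead of sampled binary states (all illustrative simplifications): the weight update is the difference between the data-driven and reconstruction-driven correlations.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 6, 3, 0.1
W = rng.normal(0, 0.1, (n_visible, n_hidden))

# One binary visible sample (a row vector).
v0 = rng.integers(0, 2, size=(1, n_visible)).astype(float)

# Positive phase: hidden probabilities given the data.
h0 = sigmoid(v0 @ W)

# Negative phase (one Gibbs step): reconstruct the visible units,
# then re-infer the hidden units from the reconstruction.
v1 = sigmoid(h0 @ W.T)
h1 = sigmoid(v1 @ W)

# CD-1 update: difference of the two correlation estimates.
W += lr * (v0.T @ h0 - v1.T @ h1)
print(W)
```

So the "reconstruction" from the question is exactly the negative phase here, and it takes the place of the backward gradient pass that a feed-forward net would use.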

Hope I could help, good luck!
