If you, like me, belong to the skeptics club, you might also have wondered what all the fuss over deep learning is about. Neural networks (NNs) are not a new concept: the multilayer perceptron was introduced in 1961, which is hardly yesterday.
Today's neural networks, however, are more complex than a single multilayer perceptron: they can have many more hidden layers and even recurrent connections. But hold on, don't they still use the backpropagation algorithm for training?
Yes! But today's computational power is incomparable to what was available in the '60s or even the '80s, which means much more complex neural architectures can be trained in a reasonable time.
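The algorithm in question is compact enough to show. Here is a minimal sketch of a one-hidden-layer perceptron trained with backpropagation in plain NumPy; the architecture, random seed, learning rate, and XOR task are illustrative assumptions, not something taken from the article:

```python
import numpy as np

# Illustrative setup (assumed, not from the article): learn XOR with
# one hidden layer of 8 sigmoid units, trained by backpropagation.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # network output

    # Backward pass: gradients of the squared error,
    # using sigmoid'(z) = s * (1 - s)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(preds.ravel())
```

The point of the sketch is that nothing here has changed since the '80s: it is the same forward pass, chain-rule gradients, and weight updates. What has changed is that modern hardware lets the same loop run over millions of weights instead of a handful.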