Why do we need shortcut connections to build residual networks?

Because otherwise the network would not be a residual network.

How do residual connections help to train neural networks for classification and detection?

They add shortcuts for the gradient. This way the first few layers get updates very quickly, and the vanishing gradient problem is no longer an issue, so you can train networks with 1000 layers. See the Residual Networks paper (He et al., "Deep Residual Learning for Image Recognition").

There is nothing specific about classification or detection here. It's about very deep networks in general.


The short answer is that when a net is very deep, it becomes very difficult for gradients to propagate backwards all the way. Skip connections offer "shortcuts" for gradients to propagate further and allow for efficient training of very deep nets.
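To make the gradient-shortcut argument concrete, here is a minimal sketch (my own illustration, not from the original answers) of a single residual block `y = x + relu(Wx)` in numpy. The key point is in the backward pass: the identity path contributes the upstream gradient `dy` directly, so even if the weighted branch contributes nothing, the gradient does not vanish.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W):
    """Forward pass: y = x + relu(W @ x).
    The identity shortcut adds the input x directly to the output."""
    return x + relu(W @ x)

def residual_block_grad(x, W, dy):
    """Backward pass: dL/dx = dy + W.T @ (relu'(W @ x) * dy).
    The first term is the identity path: the upstream gradient dy
    flows through unchanged, so it cannot vanish even when the
    weighted branch contributes (almost) nothing."""
    pre = W @ x
    return dy + W.T @ ((pre > 0).astype(float) * dy)

# Degenerate weights: a plain layer would kill the gradient here,
# but the shortcut alone still carries it through.
x = np.array([1.0, -2.0])
W = np.zeros((2, 2))
dy = np.array([1.0, 1.0])
dx = residual_block_grad(x, W, dy)
# dx equals dy: the gradient survived via the identity path
```

Stacking many such blocks multiplies terms of the form `(I + J)` rather than bare Jacobians `J`, which is why gradients still reach the first layers of a 1000-layer net.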
