Calculation of VC dimension of simple neural network
Suppose I have a perceptron with one hidden layer, where the input is a single real number $x \in \mathbb{R}$ and the activation function of the output layer is the threshold function:
$$
\theta(x) =
\begin{cases}
0, & x \leq 0 \\
1, & x > 0
\end{cases}
$$
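As a quick sanity check, the threshold activation above is straightforward to write down in code (a minimal sketch, using the convention $\theta(0) = 0$ from the definition):

```python
def theta(z):
    """Threshold activation from the question: 0 for z <= 0, 1 for z > 0."""
    return 1 if z > 0 else 0
```
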
The hidden layer may contain $k$ units, and I would like to calculate the VC dimension of this feed-forward neural network. The VC dimension is defined as the cardinality of the largest set that can be shattered by properly adjusting the weights of the neural network.
Threshold units have a VC dimension of $n+1$, where $n$ is the number of input neurons, because an $(n-1)$-dimensional hyperplane can split $n+1$ points in general position in any way. So, considering the first layer, each gate has a VC dimension of $2$ (one real input), and the total number of points that can be separated by the hidden activations is $2k$. The hidden layer then produces a vector in $\mathbb{R}^k$ to be processed by the output unit, which has a VC dimension of $k + 1$.
Do I understand correctly that the resulting VC dimension of this simple neural network is
$$
2k + k + 1 = 3k + 1\,?
$$
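For small $k$ and small point sets, the shattering definition can be tested empirically. The sketch below (my own illustration, not from any library) does a randomized search over the network weights for every labeling of a candidate point set; a "True" result is a genuine lower-bound witness that the set is shattered, while a "False" result may be a false negative, since random search can miss a realizable labeling:

```python
import itertools
import random

def theta(z):
    """Threshold activation: 0 for z <= 0, 1 for z > 0."""
    return 1 if z > 0 else 0

def net(x, w, b, v, c):
    """One-hidden-layer threshold network with k = len(w) hidden units:
    output = theta(sum_j v_j * theta(w_j * x + b_j) + c)."""
    hidden = [theta(wj * x + bj) for wj, bj in zip(w, b)]
    return theta(sum(vj * hj for vj, hj in zip(v, hidden)) + c)

def can_shatter(points, k, trials=5000, rng=None):
    """Randomized check: returns True only if every 0/1 labeling of
    `points` was realized by some random parameter draw."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    for labels in itertools.product([0, 1], repeat=len(points)):
        found = False
        for _ in range(trials):
            w = [rng.uniform(-2, 2) for _ in range(k)]
            b = [rng.uniform(-2, 2) for _ in range(k)]
            v = [rng.uniform(-2, 2) for _ in range(k)]
            c = rng.uniform(-2, 2)
            if all(net(x, w, b, v, c) == y for x, y in zip(points, labels)):
                found = True
                break
        if not found:
            return False
    return True
```

For example, with $k = 1$ the two points $\{0, 1\}$ are easily shattered, while the labeling $(0, 1, 0)$ of three ordered points is impossible for a single hidden unit (the output is a monotone step in $x$), so `can_shatter([0.0, 1.0, 2.0], k=1)` is False.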
Tags: vc-theory, perceptron, neural-network
Category: Data Science