What is the difference between a "cell" and a "layer" within neural networks?
So I understand what layers are. If you have 5 layers in your model, your data basically gets transformed 5 times, via 5 activation functions. The number of neurons within a layer dictates how many outputs that layer produces.
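To make concrete what I mean, here is a rough sketch of a plain 5-layer feed-forward model (I'm using Keras purely as an example, and the specific layer sizes and activations are just placeholders):

```python
from tensorflow import keras
from tensorflow.keras import layers

# 5 Dense layers: the input gets transformed 5 times, once per layer,
# each time through that layer's weights and activation function.
# The first argument (units) is the "number of neurons", i.e. how many
# outputs that layer produces.
model = keras.Sequential([
    keras.Input(shape=(10,)),
    layers.Dense(64, activation="relu"),     # layer 1: 64 neurons -> 64 outputs
    layers.Dense(32, activation="relu"),     # layer 2
    layers.Dense(16, activation="relu"),     # layer 3
    layers.Dense(8, activation="relu"),      # layer 4
    layers.Dense(1, activation="sigmoid"),   # layer 5: single output
])
```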
So what are cells? I never understood where cells come into play. Are they a collection of layers?
Per the Wikipedia article on LSTMs (see the diagram of an LSTM cell there): https://en.wikipedia.org/wiki/Long_short-term_memory
If the orange blocks in that diagram are layers, then I would imagine each one has a bunch of neurons. So is a cell a collection of layers plus the yellow elements? I'm having trouble understanding where this cell fits into an overall NN architecture. I'm used to the pictures with input layer - hidden layer - output layer. So where would the cell occur? I've put a rough sketch of my current guess below.
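Here is roughly how I would guess the cell fits in, again sketched in Keras just to have something concrete (the LSTMCell / RNN split is simply how that library names things, so this is my guess at the mapping, not a statement of fact):

```python
from tensorflow import keras
from tensorflow.keras import layers

# My current guess: the "cell" defines the computation for a single time
# step, and the recurrent "layer" applies that cell across the whole
# sequence, sitting between input and output like any other hidden layer.
cell = layers.LSTMCell(32)            # the cell: one step of the recurrence
model = keras.Sequential([
    keras.Input(shape=(20, 10)),      # 20 time steps, 10 features each
    layers.RNN(cell),                 # the layer: the cell unrolled over time
    layers.Dense(1, activation="sigmoid"),
])
```

Is that the right way to think about it, or is a cell something else entirely?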
Topic: terminology, deep-learning, neural-network
Category: Data Science