Categorical Variable Embedding

I have a categorical variable in my labeled dataset. Using a larger labeled dataset, I trained another neural network that takes the one-hot-encoded version of this variable through an embedding layer, and I have extracted that embedding layer's weights. Is it possible to use these embedding weights as the representation of the categorical variable (in place of one-hot encoding) in a different network that has no embedding layer? For example,

The one-hot-encoded variable:

  A B C D
D 0 0 0 1       
B 0 1 0 0
A 1 0 0 0
C 0 0 1 0
B 0 1 0 0
C 0 0 1 0
...

The embedding layer weights learned during training:

   X1  X2   
A 0.2 -0.1
B 0.3 0.1
C -0.2 0.5
D 0.5 0.6

The representation of the variable in the dataset, using the embedding weights above:

   X1  X2   
D 0.5 0.6
B 0.3 0.1
A 0.2 -0.1
C -0.2 0.5
B 0.3 0.1
C -0.2 0.5
...
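The mapping above is a plain table lookup: each category is replaced by its learned row of embedding weights. A minimal NumPy sketch of that substitution (the values are taken from the tables above; the variable names are illustrative):

```python
import numpy as np

# Embedding weights learned in the larger network, one row per category.
embedding = {
    "A": np.array([0.2, -0.1]),
    "B": np.array([0.3, 0.1]),
    "C": np.array([-0.2, 0.5]),
    "D": np.array([0.5, 0.6]),
}

# The categorical column from the smaller dataset.
column = ["D", "B", "A", "C", "B", "C"]

# Replace each category with its 2-d vector (X1, X2).
features = np.stack([embedding[c] for c in column])
print(features)
```

The resulting `features` array can be concatenated with the other numeric columns and fed directly into a network with no embedding layer.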

Topic categorical-encoding embeddings

Category Data Science


Yes, it's completely possible. It's analogous to using pretrained Word2vec embeddings (or any other pretrained embedding) as fixed input features in a downstream model.
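One way to see why this works: an embedding layer is mathematically equivalent to multiplying a one-hot vector by the weight matrix, which just selects the corresponding row. So feeding the looked-up rows to the new network gives exactly what the embedding layer would have produced. A quick NumPy check (matrix values from the question's tables):

```python
import numpy as np

# Learned embedding matrix W, one row per category (A, B, C, D).
W = np.array([[ 0.2, -0.1],
              [ 0.3,  0.1],
              [-0.2,  0.5],
              [ 0.5,  0.6]])

# One-hot vector for category D.
one_hot_D = np.array([0, 0, 0, 1])

# Multiplying the one-hot vector by W selects row 3 of W,
# which is exactly the lookup an embedding layer performs.
print(one_hot_D @ W)  # same as W[3]
```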
