Keras: Softmax output into embedding layer
I'm trying to build an encoder-decoder network in Keras to generate a sentence in a particular style. Since my problem is unsupervised, i.e. I don't have ground truths for the generated sentences, I use a classifier to help during training: I pass the decoder's output into the classifier, which tells me what style the decoded sentence is.
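For concreteness, a rough sketch of what I have in mind (layer sizes and names are just illustrative, not my actual model):

    from tensorflow import keras
    from tensorflow.keras import layers

    vocab_size, embed_dim, latent_dim, max_len = 5000, 128, 256, 20

    # Encoder: reads the source sentence and summarises it into LSTM states.
    enc_in = keras.Input(shape=(max_len,), dtype="int32")
    enc_emb = layers.Embedding(vocab_size, embed_dim)(enc_in)
    _, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(enc_emb)

    # Decoder: generates the styled sentence as a softmax over the vocabulary
    # at every timestep, i.e. a tensor of shape (batch, max_len, vocab_size).
    dec_in = keras.Input(shape=(max_len,), dtype="int32")
    dec_emb = layers.Embedding(vocab_size, embed_dim)(dec_in)
    dec_seq = layers.LSTM(latent_dim, return_sequences=True)(
        dec_emb, initial_state=[state_h, state_c])
    dec_out = layers.Dense(vocab_size, activation="softmax")(dec_seq)

    # The plan: feed dec_out into a pretrained style classifier and use its
    # prediction as the training signal, since I have no ground-truth sentences.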
The decoder outputs a softmax distribution, which I was intending to feed straight into the classifier, but I realised that the classifier has an embedding layer, which in Keras accepts only integer sequences, not softmax or one-hot vectors. Does anybody know a way to remedy this?
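To show where it breaks, the classifier looks roughly like this (again, only a sketch with illustrative names and sizes):

    from tensorflow import keras
    from tensorflow.keras import layers

    vocab_size, embed_dim, latent_dim, max_len = 5000, 128, 256, 20

    # Style classifier: its first layer is an Embedding, which performs an
    # integer lookup, so it expects token ids of shape (batch, max_len).
    cls_in = keras.Input(shape=(max_len,), dtype="int32")
    cls_emb = layers.Embedding(vocab_size, embed_dim)(cls_in)
    cls_feat = layers.LSTM(latent_dim)(cls_emb)
    cls_out = layers.Dense(1, activation="sigmoid")(cls_feat)
    classifier = keras.Model(cls_in, cls_out)

    # This is the step I can't do: dec_out is a float tensor of softmax
    # probabilities with shape (batch, max_len, vocab_size), not integer ids,
    # so it can't go through the Embedding lookup.
    # style_pred = classifier(dec_out)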
Thanks
Topic sequence-to-sequence embeddings keras
Category Data Science