Incorporating structural information in a Transformer?
For a Neural Machine Translation (NMT) task, my input data carries relational information, and this relation could be modelled as a graph structure.
So one approach could be to use a Graph Neural Network (GNN) with a Graph2Seq model. But I can't find a good generative model for GNNs.
Instead, I want to use a Transformer. But then the challenge is: how can I embed the structural information there? Is there any open-source artefact for a Relational Transformer that I can use out of the box?
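One common way to inject graph structure into a Transformer is to add a relation-dependent bias to the attention scores, so that tokens connected in the graph attend to each other more strongly (this is the idea behind relative-position and Graphormer-style encodings). Below is a minimal NumPy sketch of a single attention head with such a bias; the function name, the fixed `bias_scale`, and the toy adjacency matrix are my own illustrative choices, not from any particular library.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_biased_attention(X, Wq, Wk, Wv, adj, bias_scale=1.0):
    """Single-head self-attention whose pre-softmax scores are
    shifted by a bias derived from the graph adjacency matrix,
    so edges in the input graph raise the attention weight
    between the corresponding token pair."""
    d = Wq.shape[1]
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)          # standard scaled dot-product
    scores = scores + bias_scale * adj     # structural bias (hypothetical fixed scale)
    return softmax(scores) @ V

# Toy example: 4 tokens forming a chain graph 0-1-2-3
rng = np.random.default_rng(0)
n, d_model, d_head = 4, 8, 8
X = rng.normal(size=(n, d_model))
Wq = rng.normal(size=(d_model, d_head))
Wk = rng.normal(size=(d_model, d_head))
Wv = rng.normal(size=(d_model, d_head))
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

out = graph_biased_attention(X, Wq, Wk, Wv, adj)
print(out.shape)  # (4, 8)
```

In a trained model the bias would typically be a learned embedding indexed by edge type or shortest-path distance rather than the raw adjacency matrix, but the mechanism (adding the bias before the softmax) is the same.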
Topic graph-neural-network transformer deep-learning machine-learning
Category Data Science