Incorporating structural information in a Transformer?

For a Neural Machine Translation (NMT) task, my input data carries relational information. This relation could be modelled as a graph.

So one approach could be to use a Graph Neural Network (GNN) with a Graph2Seq model. But I can't find a good generation model for GNNs.

Instead, I want to use a Transformer. But then the challenge is: how can I embed structural information there? Is there any open-source artefact for a Relational Transformer that I can use out of the box?

Topic graph-neural-network transformer deep-learning machine-learning

Category Data Science


Some researchers have tried to adapt Transformers to graph data. Please read this paper for more details.

The implementation has been open-sourced on GitHub, with an explanation available as a YouTube video.
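To make the idea concrete, here is a minimal NumPy sketch of one common way such models inject graph structure into a Transformer: adding an adjacency-based bias to the attention logits, so that tokens connected in the input graph attend to each other more strongly. This is an illustrative simplification under my own assumptions (the function name, the single-head setup, and the scalar `bias_strength` are mine), not the exact method of the paper linked above.

```python
import numpy as np

def structure_biased_attention(Q, K, V, adj, bias_strength=1.0):
    """Single-head scaled dot-product attention with an additive graph bias.

    adj is an (n, n) adjacency matrix over the input tokens; connected
    pairs get their attention logit boosted by bias_strength, which is
    one simple way to feed relational structure into a Transformer layer.
    """
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d)            # (n, n) raw attention logits
    logits = logits + bias_strength * adj    # inject the structural bias
    logits -= logits.max(axis=-1, keepdims=True)  # softmax stability
    weights = np.exp(logits)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                       # (n, d) attended values

# Toy example: 3 tokens with 4-dim representations, chain graph 0-1-2.
rng = np.random.default_rng(0)
n, d = 3, 4
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
out = structure_biased_attention(Q, K, V, adj)
print(out.shape)  # (3, 4)
```

In a full model this bias would be added per head inside each self-attention layer (or replaced with learned per-relation embeddings if edges are typed), but the core mechanism is the same additive term on the logits.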
