What are the differences between Knowledge Graph Embeddings (KGE) and Graph Neural Network (GNN)

From page 3 of this paper Knowledge Graph Embeddings and Explainable AI, they mentioned as below:

Note that knowledge graph embeddings are different from Graph Neural Networks (GNNs). KG embedding models are in general shallow and linear models and should be distinguished from GNNs [78], which are neural networks that take relational structures as inputs

However, it's still vague to me. It seems that we can get embeddings from both of them. What are the differences? How should we choose which approach if we want to get embeddings?

Topic graph-neural-network embeddings deep-learning

Category Data Science


A knowledge graph (KG) is a different kind of object than a Graph Neural Network (GNN). Both involve graphs, but a KG is not a machine learning (ML) model: it is just a way to represent relations between entities, with each edge (predicate) linking two nodes. A GNN, by contrast, is an ML model that learns from the structure of the graph (neighbors, and neighbors of neighbors) during training. This is why KG embeddings are "shallow": they are typically just lookup tables of node and relation vectors learned by linear models such as TransE (or even taken from word embeddings such as Word2Vec), and they do not account for much multi-hop context. A GNN, on the other hand, is explicitly designed to encode multi-hop context (information about neighbors and neighbors of neighbors) into its node embeddings, which makes its representations richer than KG embeddings.
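To make the "shallow and linear" point concrete, here is a minimal sketch of TransE-style scoring in NumPy. The toy entities, relation names, and random vectors are made up for illustration; in a real system the vectors would be learned by gradient descent, but the key structural point holds either way: the model is just an embedding lookup plus a distance, with no neighborhood aggregation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical toy vocabulary. A KG embedding model is essentially a
# lookup table of vectors per entity and per relation -- no layers.
entities = {name: rng.normal(size=dim) for name in ["cat", "feral_cat", "dog"]}
relations = {name: rng.normal(size=dim) for name in ["is_a"]}

def transe_score(head, relation, tail):
    """TransE plausibility: the distance ||h + r - t||. Lower means a
    more plausible triple. Note the model is linear in the embeddings."""
    h, r, t = entities[head], relations[relation], entities[tail]
    return float(np.linalg.norm(h + r - t))

# Training would push h + r toward t for true triples and away for
# corrupted ones; at inference time we only compare these distances.
s = transe_score("feral_cat", "is_a", "cat")
```

Notice that scoring a triple never looks at any node's neighbors, which is exactly why such embeddings capture little multi-hop context.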

How should we choose which approach if we want to get embeddings?

As mentioned above, a KG merely represents categorical entities, such as "cat", connected by relations, such as "is feral", while a GNN is an ML model (not just a representation of relations between entities). So the choice depends on the problem: if you only need cheap, scalable vectors for triples (e.g. link prediction over a large KG), a KG embedding method is usually enough; if you need embeddings that incorporate neighborhood structure or node features, use a GNN.
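For contrast with the lookup-table picture above, here is a minimal sketch of one mean-aggregation GNN layer in NumPy. The toy graph, one-hot features, and random weight matrices are hypothetical; the point is that each layer mixes in neighbors' features, so stacking two layers bakes two-hop context into every node's embedding.

```python
import numpy as np

# Hypothetical toy graph as an adjacency list of node neighbors.
neighbors = {0: [1, 2], 1: [0], 2: [0]}
features = np.eye(3)  # one-hot input feature per node

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))  # weight matrix (random here, learned in practice)

def gnn_layer(feats, neighbors, W):
    """One mean-aggregation GNN layer: each node's new embedding averages
    its own and its neighbors' features, then applies a linear map + ReLU."""
    out = np.zeros((len(neighbors), W.shape[1]))
    for node, nbrs in neighbors.items():
        agg = (feats[node] + feats[nbrs].sum(axis=0)) / (1 + len(nbrs))
        out[node] = np.maximum(agg @ W, 0.0)  # nonlinearity -- not a linear model
    return out

h1 = gnn_layer(features, neighbors, W)                   # 1-hop context
h2 = gnn_layer(h1, neighbors, rng.normal(size=(4, 4)))   # 2-hop context
```

Unlike the TransE lookup, the output for node 0 here genuinely depends on nodes 1 and 2, and after two layers on their neighbors as well.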
