To facilitate the use of knowledge graph representations in semantic tasks, we provide pre-trained embeddings for several common datasets.


Wikidata5m is a large-scale knowledge graph dataset constructed from Wikidata and Wikipedia. It contains about 5 million entities from the general domain, such as celebrities, events, concepts, and things.

- Download pre-trained models
- Performance benchmark results


Here are the pre-trained entity embeddings of TransE and RotatE, visualized by GraphVite. You can zoom in on the visualization to see more details.
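As a minimal sketch of how such embeddings can be used in a semantic task, the example below compares entities by cosine similarity. The tiny hand-written vectors and entity names are purely illustrative stand-ins for the real pre-trained Wikidata5m embeddings, which are far larger.

```python
import numpy as np

# Hypothetical toy embeddings standing in for pre-trained entity vectors;
# a real checkpoint maps millions of entity names to high-dimensional arrays.
entity_embeddings = {
    "machine learning": np.array([0.9, 0.1, 0.2]),
    "deep learning": np.array([0.8, 0.2, 0.3]),
    "france": np.array([-0.7, 0.9, -0.1]),
}

def cosine(u, v):
    # Cosine similarity: dot product of the two vectors
    # divided by the product of their norms.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_related = cosine(entity_embeddings["machine learning"],
                     entity_embeddings["deep learning"])
sim_unrelated = cosine(entity_embeddings["machine learning"],
                       entity_embeddings["france"])
print(sim_related > sim_unrelated)  # semantically related entities score higher
```

With well-trained embeddings, semantically related entities lie closer together in the vector space, so similarity scores like these can drive tasks such as entity retrieval or nearest-neighbor search.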