Poster in Affinity Event: Muslims in ML
A Contextualized BERT model for Knowledge Graph Completion
Haji Gul · Abdul Naim · Ajaz A Bhat
Keywords: [ Knowledge Graphs ] [ Tail entity prediction ] [ Knowledge Graph Completion ] [ Link prediction ]
Knowledge graphs (KGs) are valuable for representing structured, interconnected information across domains, enabling tasks like semantic search, recommendation systems, and inference. A pertinent challenge with KGs, however, is that many entities (i.e., heads, tails) or relationships are unknown. Knowledge Graph Completion (KGC) addresses this by predicting these missing nodes or links, enhancing the graph's informational depth and utility. Traditional embedding methods like TransE and ComplEx predict tail entities but struggle with unseen entities. Text-based models leverage additional semantics but come with high computational costs, semantic inconsistencies, and data imbalance issues. Recent LLM-based models show improvement but overlook contextual information and rely heavily on entity descriptions. In this study, we introduce a contextualized BERT model for KGC that overcomes these limitations by utilizing the contextual information from neighbouring entities and relationships to predict tail entities. Our model eliminates the need for entity descriptions and negative triplet sampling, reducing computational demands while improving performance. Our model outperforms state-of-the-art methods on standard datasets, improving Hit@1 by 5.3% and 4.88% on FB15k-237 and WN18RR respectively, setting a new benchmark in KGC.
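To make the input formulation concrete, the sketch below shows one plausible way to serialize a query (head, relation) together with its graph neighbourhood into a single BERT-style sequence, so that tail prediction becomes a classification over all entities with no entity descriptions and no negative sampling. This is an illustrative reconstruction, not the authors' code; the function name, separator layout, and example triples are assumptions.

```python
# Hypothetical sketch of the contextualized input described in the abstract:
# the query triple's head and relation are concatenated with the head's
# neighbouring (relation, entity) pairs, giving the encoder graph context
# without needing textual entity descriptions.

def serialize_query(head, relation, neighbours, max_neighbours=3):
    """Build a [CLS]/[SEP]-delimited input string for a BERT-style encoder.

    head:       name of the query head entity
    relation:   name of the query relation
    neighbours: list of (relation, entity) pairs adjacent to the head,
                supplying the contextual information the model uses
    """
    parts = ["[CLS]", head, "[SEP]", relation]
    for rel, ent in neighbours[:max_neighbours]:
        parts += ["[SEP]", f"{rel} {ent}"]
    parts.append("[SEP]")
    return " ".join(parts)

# Example query: predict the tail of (Barack Obama, born_in, ?)
query = serialize_query(
    head="Barack Obama",
    relation="born_in",
    neighbours=[("profession", "Politician"), ("spouse", "Michelle Obama")],
)
print(query)
# [CLS] Barack Obama [SEP] born_in [SEP] profession Politician [SEP] spouse Michelle Obama [SEP]
```

Because the encoder's output for such a sequence is scored against the full entity vocabulary with a standard cross-entropy loss, every non-target entity acts as an implicit negative, which is why explicit negative triplet sampling can be dropped.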