Pavlovic, A., & Sallinger, E. (2023). Building Bridges: Knowledge Graph Embeddings Respecting Logical Rules (short paper). In B. Kimelfeld, M. V. Martinez, & R. Angles (Eds.), Proceedings of the 15th Alberto Mendelzon International Workshop on Foundations of Data Management (AMW 2023).
Knowledge graphs (KGs) are typically highly incomplete. Therefore, substantial research has been directed toward (typically machine-learning-based) approaches for knowledge graph completion (KGC), i.e., predicting missing triples from the data stored in the KG. KG embedding models (KGEs) have yielded promising results for KGC. In practice, the data management community typically represents major properties of data through constraints, axioms, or dependencies expressed as logical rules. However, no current KGE can capture vital logical rules, i.e., infer missing triples while adhering to such rules. For instance, correctly capturing general composition and jointly capturing composition and hierarchy rules are still open problems. This work introduces the ExpressivE model, which bridges this gap between the data management and machine learning communities. ExpressivE embeds pairs of entities as points and relations as hyper-parallelograms in the virtual triple space ℝ^(2d). This design allows ExpressivE to capture a rich set of logical rules jointly and to display any supported rule through the spatial relationship of hyper-parallelograms, additionally offering an intuitive and consistent geometric interpretation of ExpressivE embeddings and the captured rules. Experimental results on standard KGC benchmarks reveal that ExpressivE is competitive with state-of-the-art KGEs and even significantly outperforms them on WN18RR. This short paper is based on our recently published ICLR 2023 paper [1].
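To make the geometric idea concrete, the following is a minimal, illustrative sketch and not the authors' exact parameterization: a triple (h, r, t) is treated as plausible if the concatenated point (e_h, e_t) in ℝ^(2d) lies inside the relation's hyper-parallelogram, approximated here by two slanted bands with hypothetical center, slope, and width parameters.

```python
import numpy as np

d = 4  # embedding dimension (illustrative)

# Hypothetical entity embeddings (points in R^d).
e_h = np.random.randn(d)   # head entity
e_t = np.random.randn(d)   # tail entity

# Hypothetical relation parameters sketching a hyper-parallelogram in R^(2d):
# centers c, slopes s, and widths w for head- and tail-conditioned bands.
c_h, s_h, w_h = np.zeros(d), np.ones(d), np.full(d, 0.5)
c_t, s_t, w_t = np.zeros(d), np.ones(d), np.full(d, 0.5)

def inside_hyper_parallelogram(e_h, e_t):
    """Check whether the point (e_h, e_t) in R^(2d) lies inside the
    relation's region, modeled here as the intersection of two bands."""
    in_head_band = np.all(np.abs(e_h - (c_h + s_h * e_t)) <= w_h)
    in_tail_band = np.all(np.abs(e_t - (c_t + s_t * e_h)) <= w_t)
    return in_head_band and in_tail_band

print(inside_hyper_parallelogram(e_h, e_t))  # True => triple predicted as plausible
```

Because rules such as composition or hierarchy then correspond to spatial relationships (e.g., containment) between such regions, the captured rules can be read off the embedding geometry directly.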
Project title:
Scalable Reasoning in Knowledge Graphs: VRG18-013 (WWTF Wiener Wissenschafts-, Forschungs- und Technologiefonds)